<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Eunha</title><description>Portfolio</description><link>https://eun346.github.io/eunha_choi/</link><language>en</language><item><title>XR Interaction Toolkit (Unity VR Pathway)</title><link>https://eun346.github.io/eunha_choi/posts/8unity-studying/</link><guid isPermaLink="true">https://eun346.github.io/eunha_choi/posts/8unity-studying/</guid><description>Unity Basics</description><pubDate>Sat, 20 Dec 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;https://learn.unity.com/pathway/vr-development&lt;/p&gt;
&lt;h2&gt;Materials&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;XR HMD
&lt;ul&gt;
&lt;li&gt;If you use a Meta Quest, install Meta Horizon Link in advance so Unity can connect to your HMD.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Unity (an LTS version is recommended)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Basic Setting&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Install the required packages
&lt;img src=&quot;https://connect-mediagw.unity.com/h1/20250911/learn/images/5fb17ebd-4455-4a14-9925-ea2e6e6ab0e3_Configured_Packages.png&quot; alt=&quot;packages&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;From the main menu, select &lt;strong&gt;Window &amp;gt; Package Management &amp;gt; Package Manager&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;a href=&quot;https://docs.unity3d.com/Packages/com.unity.xr.management@4.5/manual/index.html&quot;&gt;XR Plugin Management&lt;/a&gt;
&lt;a href=&quot;https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@3.1/manual/index.html&quot;&gt;XR Interaction Toolkit&lt;/a&gt;
&lt;a href=&quot;https://docs.unity3d.com/Packages/com.unity.xr.openxr@1.14/manual/index.html&quot;&gt;OpenXR Plugin&lt;/a&gt;
&lt;a href=&quot;https://docs.unity3d.com/6000.1/Documentation/Manual/urp/urp-introduction.html&quot;&gt;Universal Render Pipeline&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Create an XR Origin
&lt;img src=&quot;./images/1.png&quot; alt=&quot;XR_Origin&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;In the Hierarchy, right-click and select &lt;strong&gt;XR &amp;gt; XR Origin&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Run the app&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;With the Device Simulator&lt;/li&gt;
&lt;/ol&gt;
&lt;/li&gt;
&lt;/ol&gt;
</content:encoded></item><item><title>C# Studying</title><link>https://eun346.github.io/eunha_choi/posts/7csharp-studying/</link><guid isPermaLink="true">https://eun346.github.io/eunha_choi/posts/7csharp-studying/</guid><description>C# Basics</description><pubDate>Tue, 16 Dec 2025 00:00:00 GMT</pubDate><content:encoded>&lt;h1&gt;C# Studying&lt;/h1&gt;
&lt;p&gt;This post is for my C# studying.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;1. Variables and Data Types&lt;/h2&gt;
&lt;h3&gt;What is a variable?&lt;/h3&gt;
&lt;p&gt;A variable is a named space in memory that stores a value.
In C#, every variable must have a &lt;strong&gt;type&lt;/strong&gt;, which determines:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;how much memory it uses&lt;/li&gt;
&lt;li&gt;what kind of values it can store&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h3&gt;Value Types&lt;/h3&gt;
&lt;p&gt;Common value types include:
&lt;img src=&quot;./images/1valType.png&quot; alt=&quot;valType&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Example:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int level = 100;
float speed = 3.14f;
bool isActive = true;
char grade = &apos;A&apos;;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Key point:
&lt;strong&gt;The type decides what operations are allowed.&lt;/strong&gt;
You cannot treat an &lt;code&gt;int&lt;/code&gt; like a &lt;code&gt;bool&lt;/code&gt;, even if the value “looks similar”.&lt;/p&gt;
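&lt;p&gt;For example, C# will not implicitly convert an &lt;code&gt;int&lt;/code&gt; to &lt;code&gt;bool&lt;/code&gt;, so the comparison has to be written out (a minimal sketch):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int level = 1;

// if (level) { ... }    // compile error: cannot convert int to bool
if (level != 0)          // write the comparison explicitly
{
    Debug.Log(&quot;level is non-zero&quot;);
}
&lt;/code&gt;&lt;/pre&gt;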
&lt;hr /&gt;
&lt;h3&gt;Variable Initialization&lt;/h3&gt;
&lt;p&gt;Variables can be:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;initialized immediately&lt;/li&gt;
&lt;li&gt;declared first, assigned later&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Example:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int a1, a2, a3;
a1 = 10;

int b = 20;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;If you try to read a local variable before assigning it, the compiler stops you.
This is intentional: C# avoids “garbage values”.&lt;/p&gt;
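&lt;p&gt;A small sketch of what the compiler rejects:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int x;
// Debug.Log(x);   // compile error: use of unassigned local variable &apos;x&apos;
x = 5;
Debug.Log(x);      // 5
&lt;/code&gt;&lt;/pre&gt;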
&lt;hr /&gt;
&lt;h2&gt;2. Operators&lt;/h2&gt;
&lt;h3&gt;Arithmetic Operators&lt;/h3&gt;
&lt;p&gt;From the operator lecture :&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;+&lt;/code&gt; addition&lt;/li&gt;
&lt;li&gt;&lt;code&gt;-&lt;/code&gt; subtraction&lt;/li&gt;
&lt;li&gt;&lt;code&gt;*&lt;/code&gt; multiplication&lt;/li&gt;
&lt;li&gt;&lt;code&gt;/&lt;/code&gt; division&lt;/li&gt;
&lt;li&gt;&lt;code&gt;%&lt;/code&gt; remainder&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Example:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int num1 = 5;
int num2 = 2;

int sum = num1 + num2;   // 7
int mod = num1 % num2;   // 1
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h3&gt;Division and Casting&lt;/h3&gt;
&lt;p&gt;If both operands are integers, &lt;strong&gt;integer division happens&lt;/strong&gt;.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int a = 5;
int b = 2;
float result = a / b;   // result = 2.0f (integer division happens first)
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Correct way:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;float result = (float)a / b; // 2.5
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h2&gt;3. Conditional Statements&lt;/h2&gt;
&lt;h3&gt;if / else&lt;/h3&gt;
&lt;p&gt;Conditionals allow the program to choose a path.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int num = 11;

if (num &amp;gt; 10)
{
    Debug.Log(&quot;num is greater than 10&quot;);
}
else
{
    Debug.Log(&quot;num is 10 or less&quot;);
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Conditions must evaluate to &lt;code&gt;bool&lt;/code&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Logical Operators&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;&amp;amp;&amp;amp;&lt;/code&gt; AND&lt;/li&gt;
&lt;li&gt;&lt;code&gt;||&lt;/code&gt; OR&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Example:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;if (num &amp;gt; 0 &amp;amp;&amp;amp; num &amp;lt; 20)
{
    Debug.Log(&quot;num is between 1 and 19&quot;);
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Important detail:
C# uses &lt;strong&gt;short-circuit evaluation&lt;/strong&gt;.
The second condition may not be evaluated if the first already determines the result.&lt;/p&gt;
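&lt;p&gt;A minimal sketch of short-circuiting: the right-hand check never runs when the left side is already &lt;code&gt;false&lt;/code&gt;.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int[] scores = { };

// scores[0] would throw, but &amp;amp;&amp;amp; stops at the first false condition.
if (scores.Length &amp;gt; 0 &amp;amp;&amp;amp; scores[0] &amp;gt; 50)
{
    Debug.Log(&quot;first score is high&quot;);
}
&lt;/code&gt;&lt;/pre&gt;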
&lt;hr /&gt;
&lt;h3&gt;switch and enum&lt;/h3&gt;
&lt;p&gt;Used when states are discrete and named.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;enum STATE
{
    NONE = 0,
    INIT,        // 1
    PLAY = 100,  // 100
    OVER,        // 101
}

STATE currentState = STATE.PLAY;

switch (currentState)
{
    case STATE.PLAY:
        Debug.Log(&quot;switch PLAY&quot;);
        break;
    case STATE.INIT:
        Debug.Log(&quot;switch INIT&quot;);
        break;
    default:
        Debug.Log(&quot;switch default&quot;);
        break;
}
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h2&gt;4. Loops&lt;/h2&gt;
&lt;h3&gt;for loop&lt;/h3&gt;
&lt;p&gt;Used when the number of iterations is known.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;for (int i = 0; i &amp;lt; 5; i++)
{
    Debug.Log(i); // 0 1 2 3 4
}
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h3&gt;while / do-while&lt;/h3&gt;
&lt;p&gt;Used when the stopping condition is more important than the count.&lt;/p&gt;
&lt;p&gt;Key difference:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;while&lt;/code&gt; checks first&lt;/li&gt;
&lt;li&gt;&lt;code&gt;do-while&lt;/code&gt; runs at least once&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;int num1 = 0;
while (num1 &amp;lt; 10)
{
    Console.WriteLine(num1);
    num1++;
}

int num2 = 0;
do
{
    Console.WriteLine(num2);
    num2++;
}
while (num2 &amp;lt; 5);
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h3&gt;break and continue&lt;/h3&gt;
&lt;pre&gt;&lt;code&gt;for (int i = 0; i &amp;lt; 10; i++)
{
    if (i % 2 == 0) continue;
    if (i &amp;gt; 8) break;
    Debug.Log(i); // 1 3 5 7
}
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h2&gt;5. Arrays&lt;/h2&gt;
&lt;h3&gt;Why arrays exist&lt;/h3&gt;
&lt;p&gt;Arrays store &lt;strong&gt;multiple values of the same type&lt;/strong&gt; in one structure.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int[] scores = { 90, 70, 50 };
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Access by index:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int first = scores[0]; // 90
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;To process every element, loop over the array:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;for (int i = 0; i &amp;lt; scores.Length; i++)
{
    Debug.Log(scores[i]);
}
&lt;/code&gt;&lt;/pre&gt;
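&lt;p&gt;When the index itself is not needed, &lt;code&gt;foreach&lt;/code&gt; is the more idiomatic loop:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;foreach (int score in scores)
{
    Debug.Log(score); // 90 70 50
}
&lt;/code&gt;&lt;/pre&gt;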
&lt;hr /&gt;
&lt;h3&gt;2D Arrays&lt;/h3&gt;
&lt;p&gt;Used when data has rows and columns.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int[,] grid = new int[2, 3];
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Access:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;int cell = grid[0, 1];
&lt;/code&gt;&lt;/pre&gt;
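&lt;p&gt;Iterating a 2D array takes one loop per dimension; &lt;code&gt;GetLength(n)&lt;/code&gt; returns the size of dimension &lt;code&gt;n&lt;/code&gt;:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;for (int row = 0; row &amp;lt; grid.GetLength(0); row++)
{
    for (int col = 0; col &amp;lt; grid.GetLength(1); col++)
    {
        Debug.Log(grid[row, col]);
    }
}
&lt;/code&gt;&lt;/pre&gt;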
&lt;hr /&gt;
&lt;h2&gt;6. Methods (Functions)&lt;/h2&gt;
&lt;h3&gt;Why methods matter&lt;/h3&gt;
&lt;p&gt;Methods:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;reduce duplication&lt;/li&gt;
&lt;li&gt;make code readable&lt;/li&gt;
&lt;li&gt;isolate logic&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Important rule:
&lt;strong&gt;A method should do one thing.&lt;/strong&gt;
If it does more, it probably should be split.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Example&lt;/h3&gt;
&lt;pre&gt;&lt;code&gt;void Print()
{
    Debug.Log(&quot;Hello World&quot;);
}

int MaxInt()
{
    return int.MaxValue;
}

long Sum(int a, int b)
{
    return a + b;
}
&lt;/code&gt;&lt;/pre&gt;
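&lt;p&gt;Calling the methods above (note that the &lt;code&gt;int&lt;/code&gt; result of &lt;code&gt;a + b&lt;/code&gt; is implicitly converted to &lt;code&gt;long&lt;/code&gt; on return):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;Print();                 // Hello World
int max = MaxInt();      // 2147483647
long total = Sum(3, 4);  // 7
&lt;/code&gt;&lt;/pre&gt;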
&lt;hr /&gt;
</content:encoded></item><item><title>MCP with Blender &amp; Unity</title><link>https://eun346.github.io/eunha_choi/posts/6mcp-unity-blender/</link><guid isPermaLink="true">https://eun346.github.io/eunha_choi/posts/6mcp-unity-blender/</guid><description>MCP with Blender &amp; Unity</description><pubDate>Mon, 11 Aug 2025 00:00:00 GMT</pubDate><content:encoded>&lt;h1&gt;MCP in Blender &amp;amp; Unity with Claude AI&lt;/h1&gt;
&lt;p&gt;Recently, I experimented with &lt;strong&gt;MCP (Model Context Protocol)&lt;/strong&gt; to connect 3D tools like Blender and Unity with Claude AI. My goal was straightforward: see if AI could directly manipulate or inspect my 3D environments in real time. This post is less of a polished how-to and more of a developer log — what worked, what broke, and what I learned.&lt;/p&gt;
&lt;h1&gt;Understanding MCP&lt;/h1&gt;
&lt;p&gt;&lt;strong&gt;MCP (Model Context Protocol)&lt;/strong&gt; is an open protocol that standardizes how applications talk to large language models (LLMs).&lt;/p&gt;
&lt;p&gt;Key ideas I had to wrap my head around:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Two-way communication between AI and programs.&lt;/li&gt;
&lt;li&gt;Ability to &lt;strong&gt;create, modify, or delete 3D objects&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Editing materials, inspecting entire scenes, and even executing code through the AI.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 &lt;em&gt;Why this matters&lt;/em&gt;: Instead of AI being a “detached helper,” MCP turns it into an active collaborator inside your workflow. That was the hook for me.&lt;/p&gt;
&lt;h1&gt;Tools &amp;amp; Sources&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Blender&lt;/strong&gt; 3.0+&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Unity&lt;/strong&gt; 6+&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Python&lt;/strong&gt; 3.10+ (Blender MCP), 3.12+ (Unity MCP)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/ahujasid/blender-mcp&quot;&gt;Blender MCP GitHub&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/CoplayDev/unity-mcp&quot;&gt;Unity MCP GitHub&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Claude AI App&lt;/li&gt;
&lt;li&gt;UV Package Manager&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Step-by-Step Setup&lt;/h1&gt;
&lt;h2&gt;Setup&lt;/h2&gt;
&lt;h3&gt;1. UV Package Manager&lt;/h3&gt;
&lt;p&gt;This is required for both Blender and Unity MCP.&lt;br /&gt;
&lt;img src=&quot;./images/1setup.png&quot; alt=&quot;uv&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Open a terminal and install the UV Package Manager.&lt;/li&gt;
&lt;li&gt;If the first command fails, switch to the alternative one provided in the docs.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 &lt;em&gt;Why&lt;/em&gt;:
MCP depends on the package manager for dependency handling. Without this, nothing connects properly.&lt;/p&gt;
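&lt;p&gt;For reference, these are the standalone install commands from the uv documentation at the time of writing (check the current docs before running them, as they may change):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell)
powershell -ExecutionPolicy ByPass -c &quot;irm https://astral.sh/uv/install.ps1 | iex&quot;
&lt;/code&gt;&lt;/pre&gt;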
&lt;h2&gt;Blender MCP Setup&lt;/h2&gt;
&lt;h3&gt;1. Addon Installation&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/2blender1.png&quot; alt=&quot;addon&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Download &lt;strong&gt;addon.py&lt;/strong&gt; from &lt;a href=&quot;https://github.com/ahujasid/blender-mcp&quot;&gt;Blender-MCP GitHub&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;In Blender, go to &lt;code&gt;Edit → Preferences → Add-ons&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Install from Disk&lt;/strong&gt; and select the addon.py file.&lt;/li&gt;
&lt;li&gt;Enable the addon by checking the box next to &quot;Interface: Blender MCP&quot;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 &lt;em&gt;Why&lt;/em&gt;:
This gives Blender a sidebar tab for MCP. Without the addon, Blender doesn’t know how to “talk” to Claude.&lt;/p&gt;
&lt;h3&gt;2. Blender MCP Tab&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/3blender2.png&quot; alt=&quot;blender_mcp&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Open the &lt;strong&gt;3D View sidebar&lt;/strong&gt; (press &lt;code&gt;N&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;Find the &lt;strong&gt;BlenderMCP&lt;/strong&gt; tab.&lt;/li&gt;
&lt;li&gt;Check &lt;em&gt;Poly Haven&lt;/em&gt; if you want asset streaming.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Connect to Claude&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 &lt;em&gt;Why&lt;/em&gt;:
This is where you actually link Blender’s context to Claude. It felt surreal seeing AI recognize what was in my scene.&lt;/p&gt;
&lt;h3&gt;3. Claude AI Config&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/4blender3.png&quot; alt=&quot;claude-blender&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Open &lt;strong&gt;Claude AI App&lt;/strong&gt; and go &lt;code&gt;Settings → Developer → Edit Config&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Copy and paste the code below into the &lt;strong&gt;claude_desktop_config&lt;/strong&gt; file.&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;{
  &quot;mcpServers&quot;: {
    &quot;blender&quot;: {
      &quot;command&quot;: &quot;uvx&quot;,
      &quot;args&quot;: [&quot;blender-mcp&quot;]
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;Restart the computer.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 &lt;em&gt;Why&lt;/em&gt;:
This step gives Claude the bridge it needs to process MCP requests. Took me a few tries before it clicked.&lt;/p&gt;
&lt;h3&gt;Video&lt;/h3&gt;
&lt;p&gt;&lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/G4hbDDk09cM&quot; title=&quot;blender mcp Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Unity MCP Setup&lt;/h2&gt;
&lt;h3&gt;1. Python Path Setup (Windows)&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/5unity1.png&quot; alt=&quot;windowpath&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Locate your Python installation and click &quot;Open File Location&quot;.&lt;/li&gt;
&lt;li&gt;Copy the file path.
&lt;img src=&quot;./images/6unity2.png&quot; alt=&quot;environ&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Open “Edit the system environment variables”.
&lt;img src=&quot;./images/6unity2.png&quot; alt=&quot;environment setting&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Click Environment Variables.&lt;/li&gt;
&lt;li&gt;Click Path in System variables and Edit.&lt;/li&gt;
&lt;li&gt;Paste the file path.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;2. Package Installation&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/7unity4.png&quot; alt=&quot;packm&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;In Unity, open &lt;code&gt;Window → Package Manager&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Click &lt;code&gt;+ → Add package from Git URL...&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Paste the link: &lt;code&gt;https://github.com/CoplayDev/unity-mcp.git?path=/UnityMcpBridge&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Click Add.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 &lt;em&gt;Why&lt;/em&gt;: MCP for Unity isn’t built-in — GitHub package installation is the only way to get it.&lt;/p&gt;
&lt;h2&gt;3. MCP Auto-Setup&lt;/h2&gt;
&lt;p&gt;&lt;img src=&quot;./images/8unity5.png&quot; alt=&quot;mcp_auto&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Go &lt;code&gt;Window → Unity MCP&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Click Auto-Setup.&lt;/li&gt;
&lt;li&gt;Look for a green status indicator🟢 and &quot;Connected ✓&quot;. (This attempts to modify the MCP Client&apos;s config file automatically).&lt;/li&gt;
&lt;li&gt;Restart the computer.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 &lt;em&gt;Why&lt;/em&gt;: Auto-setup edits the MCP client config automatically. Saves time and avoids manually editing JSON files.&lt;/p&gt;
&lt;h2&gt;Video&lt;/h2&gt;
&lt;p&gt;&lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/Kndy2dcEQU4&quot; title=&quot;unity mcp Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Discussion &amp;amp; Reflections&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Challenges&lt;/strong&gt;: MCP requires detailed context to be useful. If you don’t feed enough scene info, the AI’s edits are random or incomplete.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What worked&lt;/strong&gt;: Once connected, I could spawn and modify objects in Blender directly through Claude — faster than manual modeling tweaks.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Limitations&lt;/strong&gt;: MCP speeds up repetitive tasks, but it doesn&apos;t replace a human designer; you still need to think critically about scene composition.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Future Outlook&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Game Development&lt;/strong&gt;: AI-assisted real-time prototyping inside Unity.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Simulation&lt;/strong&gt;: Rapid environment adjustments for training data.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;VR/AR Content Creation&lt;/strong&gt;: AI could manage assets or materials while you focus on immersion.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Labor Impact&lt;/strong&gt;: Might reduce grunt work, but creative direction still belongs to humans.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
</content:encoded></item><item><title>Blender Tutorial</title><link>https://eun346.github.io/eunha_choi/posts/5blender/</link><guid isPermaLink="true">https://eun346.github.io/eunha_choi/posts/5blender/</guid><description>Blender Basics &amp; Floor Plan</description><pubDate>Fri, 08 Aug 2025 00:00:00 GMT</pubDate><content:encoded>&lt;h1&gt;My Blender Tutorial Experience&lt;/h1&gt;
&lt;p&gt;I recently dove into 3D modeling with &lt;strong&gt;Blender&lt;/strong&gt;, a free yet insanely powerful tool. My goal? Prepare for future &lt;strong&gt;Unity&lt;/strong&gt; projects and finally get comfortable creating things from scratch in 3D. To kick things off, I followed two YouTube tutorials:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=Ci3Has4L5W4&quot;&gt;Blender Beginner Tutorial&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=94kAIpRnhcY&quot;&gt;Blender Floor Plan Tutorial&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Understanding the Basics&lt;/h1&gt;
&lt;p&gt;Before starting, I clarified essential 3D modeling concepts.&lt;/p&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Term&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Mesh&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;A combination of vertices (points), edges (lines), faces (polygons), or any combination of these.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Geometry&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Any 3D object with position data, including meshes, curves, instances, volumes, etc.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Instance&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;A duplicate of an object sharing the same underlying data as the original.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Shader&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;A program controlling how light interacts with a surface, affecting color, reflection, transparency, and emission. In Blender, it’s often used interchangeably with &lt;code&gt;material&lt;/code&gt;.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;💡 &lt;em&gt;Why it matters&lt;/em&gt;: Blender can feel like a maze at first. Understanding meshes, modifiers, and basic geometry gave me a clear roadmap—I suddenly knew what to click, move, and tweak.&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Tools &amp;amp; Resources&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Blender&lt;/strong&gt;: &lt;a href=&quot;https://www.blender.org/download/releases/4-4/&quot;&gt;Download v4.4&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;YouTube Tutorials:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=Ci3Has4L5W4&quot;&gt;Beginner Tutorial&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=94kAIpRnhcY&quot;&gt;Floor Plan Tutorial&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Essential Shortcuts&lt;/h1&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Shortcut&lt;/th&gt;
&lt;th&gt;Action&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Middle Mouse&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Orbit around scene&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Shift + Middle Mouse&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Pan view&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Mouse Wheel Scroll&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Zoom in/out&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Tab&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Switch between Object/Edit Mode&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;N&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Toggle Sidebar&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;For &lt;strong&gt;Object Mode&lt;/strong&gt;:&lt;/p&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Shortcut&lt;/th&gt;
&lt;th&gt;Action&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;T&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Toggle Toolbar&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;G&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Move objects&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;R&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Rotate objects&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;S&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Scale (resize) objects&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Shift + A&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Add object&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Alt&lt;/code&gt; (when adding)&lt;/td&gt;
&lt;td&gt;Add object in perfect form (e.g., perfect cube or triangle)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;Ctrl + L&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Link properties between objects&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;💡 &lt;em&gt;Tip:&lt;/em&gt; For more shortcuts, check Blender’s &lt;a href=&quot;https://docs.blender.org/manual/en/latest/interface/keymap/introduction.html&quot;&gt;Keymap Documentation&lt;/a&gt;.&lt;/p&gt;
&lt;h3&gt;Recommended Setup&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Go to &lt;code&gt;Edit &amp;gt; Preferences &amp;gt; Interface&lt;/code&gt; and adjust the &lt;strong&gt;Resolution Scale&lt;/strong&gt;&lt;br /&gt;
💡 &lt;em&gt;Why:&lt;/em&gt; Makes Blender easier to work with depending on your screen size.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Check your GPU under &lt;code&gt;System &amp;gt; Cycles Render Devices &amp;gt; CUDA&lt;/code&gt;&lt;br /&gt;
💡 &lt;em&gt;Why:&lt;/em&gt; Ensures hardware acceleration for faster rendering.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Increase &lt;strong&gt;Undo Steps&lt;/strong&gt; to &lt;strong&gt;100&lt;/strong&gt; in &lt;code&gt;System &amp;gt; Memory &amp;amp; Limits&lt;/code&gt;&lt;br /&gt;
💡 &lt;em&gt;Why:&lt;/em&gt; Allows more flexibility when correcting mistakes.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Step-by-Step Experience&lt;/h1&gt;
&lt;h2&gt;Tutorial 1: &lt;a href=&quot;https://www.youtube.com/watch?v=Ci3Has4L5W4&quot;&gt;Blender Beginner Tutorial&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;This tutorial covered the fundamentals:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Navigating Blender’s viewport&lt;/li&gt;
&lt;li&gt;Differences between &lt;strong&gt;Edit Mode&lt;/strong&gt; and &lt;strong&gt;Object Mode&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Building a simple object step by step&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h4&gt;1. Cookie Base&lt;/h4&gt;
&lt;p&gt;&lt;img src=&quot;./images/1Blender1-1.png&quot; alt=&quot;cookieBase&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Add a &lt;strong&gt;Cylinder&lt;/strong&gt; (&lt;code&gt;Shift + A &amp;gt; Mesh &amp;gt; Cylinder&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;Scale (&lt;code&gt;S&lt;/code&gt;) to flatten into a cookie shape.&lt;/li&gt;
&lt;li&gt;Right-click → &lt;code&gt;Shade Smooth&lt;/code&gt; for a softer look.
&lt;img src=&quot;./images/2Blender1-2.png&quot; alt=&quot;rename&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Rename it Cookie in the Outliner to stay organized.&lt;/li&gt;
&lt;/ul&gt;
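&lt;p&gt;The same steps can also be scripted in Blender&apos;s Python console. This is a rough sketch using the &lt;code&gt;bpy&lt;/code&gt; API (the scale values are illustrative; run it inside Blender):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;import bpy

# Add a cylinder and flatten it into a cookie shape.
bpy.ops.mesh.primitive_cylinder_add()
cookie = bpy.context.active_object
cookie.name = &quot;Cookie&quot;
cookie.scale = (1.0, 1.0, 0.25)

# Soften the shading, like Right-click -&amp;gt; Shade Smooth.
bpy.ops.object.shade_smooth()
&lt;/code&gt;&lt;/pre&gt;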
&lt;h4&gt;2. Chocolate Chips&lt;/h4&gt;
&lt;p&gt;&lt;img src=&quot;./images/3Blender1-3.png&quot; alt=&quot;uvSphere&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Add a &lt;strong&gt;UV Sphere&lt;/strong&gt; (&lt;code&gt;Shift + A &amp;gt; Mesh &amp;gt; UV Sphere&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Scale down (&lt;code&gt;S&lt;/code&gt;) to chip size&lt;/li&gt;
&lt;li&gt;Position using &lt;strong&gt;Move&lt;/strong&gt; (&lt;code&gt;G&lt;/code&gt;)&lt;br /&gt;
&lt;img src=&quot;./images/4Blender1-4.png&quot; alt=&quot;duplicate&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Duplicate chips (&lt;code&gt;Shift + D&lt;/code&gt;) and place around the cookie&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;3. Tray&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;Add a &lt;strong&gt;Cube&lt;/strong&gt; (&lt;code&gt;Shift + A &amp;gt; Mesh &amp;gt; Cube&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Scale and move it to sit under the cookie
&lt;img src=&quot;./images/5Blender1-5.png&quot; alt=&quot;insetFace&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Enter &lt;strong&gt;Edit Mode&lt;/strong&gt; (&lt;code&gt;Tab&lt;/code&gt;), select top face, use &lt;strong&gt;Inset Faces&lt;/strong&gt; (&lt;code&gt;I&lt;/code&gt;)&lt;br /&gt;
&lt;img src=&quot;./images/6Blender1-6.png&quot; alt=&quot;extrude&quot; /&gt;
&lt;img src=&quot;./images/7Blender1-7.png&quot; alt=&quot;extrude2&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Extrude inner face down (&lt;code&gt;E&lt;/code&gt;) to create tray depth
&lt;img src=&quot;./images/8Blender1-8.png&quot; alt=&quot;move&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Move the tray under the cookie.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;4. Materials &amp;amp; Colors&lt;/h4&gt;
&lt;p&gt;&lt;img src=&quot;./images/9Blender1-9.png&quot; alt=&quot;color&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Add material in &lt;strong&gt;Material Properties&lt;/strong&gt; → &lt;code&gt;New&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Choose &lt;strong&gt;Base Color&lt;/strong&gt; for cookie, chips, and tray&lt;br /&gt;
&lt;img src=&quot;./images/10Blender1-10.png&quot; alt=&quot;link&quot; /&gt;&lt;/li&gt;
&lt;li&gt;To quickly assign the same material, select chips → last chip with material → &lt;code&gt;Ctrl + L &amp;gt; Materials&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;5. Lighting &amp;amp; Rendering&lt;/h4&gt;
&lt;p&gt;&lt;img src=&quot;./images/11Blender1-11.png&quot; alt=&quot;renderview&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Click &lt;strong&gt;Viewport Shading: Rendered&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Delete default light → Add &lt;strong&gt;Area Light&lt;/strong&gt; (&lt;code&gt;Shift + A &amp;gt; Light &amp;gt; Area&lt;/code&gt;)&lt;br /&gt;
&lt;img src=&quot;./images/12Blender1-12.png&quot; alt=&quot;camera&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Position camera (&lt;code&gt;0&lt;/code&gt; on numpad, enable &lt;strong&gt;Camera to View&lt;/strong&gt;)&lt;/li&gt;
&lt;li&gt;Set render engine to &lt;strong&gt;Cycles&lt;/strong&gt; for realism&lt;br /&gt;
&lt;img src=&quot;./images/13Blender1-13.png&quot; alt=&quot;image&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Render with &lt;code&gt;F12&lt;/code&gt; → Save via &lt;code&gt;Image &amp;gt; Save As&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2&gt;Tutorial 2: &lt;a href=&quot;https://www.youtube.com/watch?v=94kAIpRnhcY&quot;&gt;Blender Floor Plan Tutorial&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;This tutorial focused on combining objects and creating a full scene:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Importing and scaling floor plan images&lt;/li&gt;
&lt;li&gt;Modeling floors and walls&lt;/li&gt;
&lt;li&gt;Applying materials, lighting, and render settings&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;1. Setup&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;Download a floor plan image&lt;/li&gt;
&lt;li&gt;Open Blender, select all (&lt;code&gt;A&lt;/code&gt;) and delete (&lt;code&gt;X&lt;/code&gt;) default objects&lt;/li&gt;
&lt;li&gt;Import the floor plan image (this creates an &lt;strong&gt;Empty&lt;/strong&gt; object)&lt;br /&gt;
&lt;img src=&quot;./images/15Blender2-1.png&quot; alt=&quot;img&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Reset location &amp;amp; rotation: &lt;code&gt;Alt + G&lt;/code&gt;, &lt;code&gt;Alt + R&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Switch to Top Orthographic View (&lt;code&gt;7&lt;/code&gt; on the numpad, or click &lt;strong&gt;Z&lt;/strong&gt; on the navigation gizmo)&lt;/li&gt;
&lt;li&gt;Set &lt;strong&gt;Units&lt;/strong&gt; to Metric (Scene Properties → Units → Metric → Meters)&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;2. Floor &amp;amp; Walls&lt;/h4&gt;
&lt;p&gt;&lt;img src=&quot;./images/16Blender2-2.png&quot; alt=&quot;plane&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Add &lt;strong&gt;Plane&lt;/strong&gt; (&lt;code&gt;Shift + A &amp;gt; Mesh &amp;gt; Plane&lt;/code&gt;)
&lt;img src=&quot;./images/17Blender2-3.png&quot; alt=&quot;scale&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Rename to &lt;strong&gt;Walls&lt;/strong&gt; and set dimensions in Item Sidebar → Apply scale (&lt;code&gt;Ctrl + A &amp;gt; Scale&lt;/code&gt;)
&lt;img src=&quot;./images/18Blender2-4.png&quot; alt=&quot;wallsOrigin&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Click &lt;code&gt;walls&lt;/code&gt; and set origin (&lt;code&gt;Shift + S &amp;gt; Cursor to Selected&lt;/code&gt; &amp;amp; &lt;code&gt;Right Click &amp;gt; Set Origin &amp;gt; Origin to 3D Cursor&lt;/code&gt;) and clear any translation by &lt;code&gt;Alt + G&lt;/code&gt;
&lt;img src=&quot;./images/19Blender2-5.png&quot; alt=&quot;onlyface&quot; /&gt;&lt;/li&gt;
&lt;li&gt;In &lt;strong&gt;Edit Mode&lt;/strong&gt;, delete face (&lt;code&gt;X &amp;gt; Only Faces&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;In &lt;strong&gt;Object Mode&lt;/strong&gt;, move the &lt;strong&gt;floor plan image&lt;/strong&gt; so it fits the &lt;strong&gt;Walls&lt;/strong&gt; outline&lt;/li&gt;
&lt;li&gt;Delete three vertices in Walls (&lt;code&gt;X &amp;gt; Vertices&lt;/code&gt;)
&lt;img src=&quot;./images/20Blender2-6.png&quot; alt=&quot;vertices&quot; /&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;In Edit Mode&lt;/strong&gt;, extrude vertices (&lt;code&gt;E&lt;/code&gt;) along X/Y to match outer walls&lt;/li&gt;
&lt;li&gt;Extrude inner walls for rooms too
&lt;img src=&quot;./images/21Blender2-7.png&quot; alt=&quot;floor&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Duplicate Walls and rename the duplicated one as &lt;strong&gt;Floor&lt;/strong&gt; and hide the &lt;strong&gt;Walls&lt;/strong&gt; by clicking eye icon&lt;/li&gt;
&lt;li&gt;On the Floor, delete the overlapping, unnecessary vertices&lt;/li&gt;
&lt;li&gt;For each room, select all of its vertices and make a face (&lt;code&gt;F&lt;/code&gt;).&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;3. Finalizing &amp;amp; Rendering&lt;/h4&gt;
&lt;p&gt;&lt;img src=&quot;./images/22Blender2-8.png&quot; alt=&quot;background&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Duplicate the floor (&lt;code&gt;Shift + D &amp;gt; Right Click&lt;/code&gt;) and rename it &lt;strong&gt;Background&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Hide Floor and Walls&lt;/li&gt;
&lt;li&gt;In Background, delete all inner vertices&lt;/li&gt;
&lt;li&gt;Extrude everything and scale (&lt;code&gt;A &amp;gt; E &amp;gt; S&lt;/code&gt;)
&lt;img src=&quot;./images/23Blender2-10.png&quot; alt=&quot;zextrude&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Hide Floor and Background. Select Walls → Extrude upwards along Z-axis (&lt;code&gt;E &amp;gt; Z&lt;/code&gt;)&lt;br /&gt;
&lt;img src=&quot;./images/24Blender2-11.png&quot; alt=&quot;ori&quot; /&gt;&lt;/li&gt;
&lt;li&gt;See walls&apos; orientation (&lt;code&gt;Overlays menu &amp;gt; Face Orientation&lt;/code&gt;)
&lt;img src=&quot;./images/25Blender2-12.png&quot; alt=&quot;sol&quot; /&gt;
&lt;img src=&quot;./images/26Blender2-13.png&quot; alt=&quot;mod&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Add a &lt;strong&gt;Solidify Modifier&lt;/strong&gt; (&lt;code&gt;Modifier &amp;gt; Add Modifier &amp;gt; Solidify&lt;/code&gt;), set Mode to &lt;strong&gt;Complex&lt;/strong&gt;, and adjust the Thickness&lt;/li&gt;
&lt;li&gt;Click the monitor icon in the Modifier panel
&lt;img src=&quot;./images/27Blender2-14.png&quot; alt=&quot;flip&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Fix flipped faces (&lt;code&gt;Shift + N&lt;/code&gt;) (blue = outer, red = inner)&lt;/li&gt;
&lt;li&gt;Click monitor icon again&lt;/li&gt;
&lt;li&gt;Turn the Face Orientation overlay off (&lt;code&gt;Overlays menu &amp;gt; Face Orientation&lt;/code&gt;)
&lt;img src=&quot;./images/28Blender2-15.png&quot; alt=&quot;end&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Save your work (&lt;code&gt;Ctrl + S&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Summary&lt;/h1&gt;
&lt;p&gt;At the end of the day, it wasn’t just about a cookie or a floor plan. These tutorials taught me how to bring any idea to life in 3D. Modeling, texturing, lighting—once you get the hang of these, you’re ready for bigger projects, whether it’s games, architectural visualization, or AR. Honestly, seeing something I imagined appear on screen made me even more excited to dive into Unity. In 3D, the only limit really is your imagination.&lt;/p&gt;
&lt;p&gt;💡 &lt;em&gt;Final Tip:&lt;/em&gt; Always organize objects, name them clearly, and save iterations frequently. This keeps projects manageable as scenes get complex.&lt;/p&gt;
&lt;hr /&gt;
</content:encoded></item><item><title>MR Meta Quest Step-by-Step Tutorial</title><link>https://eun346.github.io/eunha_choi/posts/4mr-tutorial/</link><guid isPermaLink="true">https://eun346.github.io/eunha_choi/posts/4mr-tutorial/</guid><description>MR with Meta Quest 3</description><pubDate>Wed, 06 Aug 2025 00:00:00 GMT</pubDate><content:encoded>&lt;h1&gt;Mixed Reality on Meta Quest 3&lt;/h1&gt;
&lt;p&gt;I&apos;ve been diving into Mixed Reality (MR) recently, and I wanted to experiment with a simple Passthrough prototype on my Meta Quest 3. The goal was simple but oddly satisfying: see my real room through the headset, spawn virtual balls, and watch them bounce off walls and floors. Honestly, it felt a bit like playing catch with my own furniture. This post is my personal developer log — not a polished tutorial. I’ll walk through what I did, why I did it, what broke (and it did), and what I learned along the way.&lt;/p&gt;
&lt;h1&gt;Understanding the Basics&lt;/h1&gt;
&lt;p&gt;Before jumping into Unity, I clarified the essential concepts.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Mixed Reality (MR)&lt;/strong&gt;: blends virtual content with the real world.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Extended Reality (XR)&lt;/strong&gt;: umbrella term for VR, AR, and MR.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Passthrough&lt;/strong&gt;: lets you see the real world through the Quest&apos;s cameras, layered with virtual objects.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡&lt;em&gt;Why this matters&lt;/em&gt;: MR is not just &quot;turn on a camera.&quot; Understanding this early saves time when configuring cameras, layers, and interactions in Unity.&lt;/p&gt;
&lt;h1&gt;Tools&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;Unity: v2020.3+&lt;/li&gt;
&lt;li&gt;Meta Quest 3&lt;/li&gt;
&lt;li&gt;USB-C cable (for building)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://assetstore.unity.com/packages/tools/integration/oculus-integration-deprecated-82022?srsltid=AfmBOoqs3VykViopb9qVxMb3gFcYp88tIxOFRBEoxyUs_zHPXRYparKT&quot;&gt;Oculus Integration&lt;/a&gt; Package from Unity Asset Store&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I used Unity version 2020.3.1f. Passthrough and OVRManager behave slightly differently depending on the Unity version, so specifying it ensures anyone trying to replicate my steps hits the same quirks and solutions I did.&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Step-by-Step Tutorials&lt;/h1&gt;
&lt;h2&gt;Setup&lt;/h2&gt;
&lt;h3&gt;1. Enable Developer Mode&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/1setup1.png&quot; alt=&quot;Developer Mode on Meta Horizon&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Create a &lt;strong&gt;Meta Developer Account&lt;/strong&gt; in &lt;a href=&quot;https://developers.meta.com/horizon/sign-up/&quot;&gt;Meta&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Turn on &lt;strong&gt;Developer Mode&lt;/strong&gt; in the Meta Horizon app&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;2. Setting Up Unity for Oculus Development&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/2setup2.png&quot; alt=&quot;Oculus Development&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Go to &lt;code&gt;Edit → Project Settings → XR Plug-in Management&lt;/code&gt;, enable &lt;strong&gt;Oculus&lt;/strong&gt; for both Windows and Android.&lt;/li&gt;
&lt;li&gt;Install &lt;a href=&quot;https://assetstore.unity.com/packages/tools/integration/oculus-integration-deprecated-82022?srsltid=AfmBOoqs3VykViopb9qVxMb3gFcYp88tIxOFRBEoxyUs_zHPXRYparKT&quot;&gt;&lt;strong&gt;Oculus Integration&lt;/strong&gt;&lt;/a&gt; from the Unity Asset Store.
&lt;img src=&quot;./images/3setup3.png&quot; alt=&quot;Oculus Fixall&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Run &lt;code&gt;Tools → Project Setup Tool → Fix All &amp;amp; Apply All&lt;/code&gt; (this actually fixed more problems than I expected).
&lt;img src=&quot;./images/4setup4.png&quot; alt=&quot;Oculus Android platform&quot; /&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Switch platform&lt;/strong&gt; in &lt;code&gt;File → Build Settings → Android&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 Why:
Quest 3 runs Android. Without the XR plugin + Oculus package, Unity can’t talk to the headset — I learned this the hard way when nothing showed up in my first test build.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Connecting the Headset &amp;amp; Building&lt;/h2&gt;
&lt;p&gt;&lt;img src=&quot;./images/5method1.png&quot; alt=&quot;method1&quot; /&gt;&lt;/p&gt;
&lt;h3&gt;Method 1: &lt;strong&gt;AirLink (Wireless)&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Convenient, but the visuals are slightly blurry and builds take longer, which is frustrating when you&apos;re in the zone.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Turn on Developer settings in Quest (Settings → Beta)&lt;/li&gt;
&lt;li&gt;Connect Quest to PC via AirLink&lt;/li&gt;
&lt;li&gt;Run from Unity. (May cause longer load times)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;img src=&quot;./images/6method2.png&quot; alt=&quot;method2&quot; /&gt;&lt;/p&gt;
&lt;h3&gt;Method 2 (Recommended): &lt;strong&gt;USB-C (Wired)&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;More stable, with faster builds.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Connect Quest via USB-C cable&lt;/li&gt;
&lt;li&gt;In Unity, go to &lt;code&gt;File → Build Settings&lt;/code&gt; and click &lt;strong&gt;Build and Run&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2&gt;Passthrough&lt;/h2&gt;
&lt;h3&gt;1. Camera Configuration&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/7passthrough1.png&quot; alt=&quot;camera&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Delete the default &lt;code&gt;Main Camera&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Add &lt;code&gt;OVRCameraRig&lt;/code&gt; prefab.&lt;/li&gt;
&lt;li&gt;In &lt;code&gt;OVRManager&lt;/code&gt; (inside OVRCameraRig), set:
&lt;ul&gt;
&lt;li&gt;Hand Tracking Support → “Controllers and Hands”&lt;/li&gt;
&lt;li&gt;Passthrough Support → “Supported”&lt;/li&gt;
&lt;li&gt;Enable Passthrough → Checked
&lt;img src=&quot;./images/8passthrough2.png&quot; alt=&quot;controller&quot; /&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Add &lt;code&gt;OVRControllerPrefab&lt;/code&gt; and &lt;code&gt;OVRHandPrefab&lt;/code&gt; to Right/LeftHandAnchor.&lt;/li&gt;
&lt;li&gt;For the right-hand &lt;code&gt;OVRHandPrefab&lt;/code&gt;, set the hand type to &lt;strong&gt;Hand Right&lt;/strong&gt; in the &lt;strong&gt;OVR Hand&lt;/strong&gt;, &lt;strong&gt;OVR Skeleton&lt;/strong&gt;, and &lt;strong&gt;OVR Mesh&lt;/strong&gt; components.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 Why:
The default &lt;code&gt;Main Camera&lt;/code&gt; does not support VR tracking or Passthrough, so we replace it with &lt;code&gt;OVRCameraRig&lt;/code&gt;, which handles stereo rendering, tracking, and device input. Changing the settings in &lt;code&gt;OVRManager&lt;/code&gt; enables both controllers and hands in MR while turning on Passthrough. &lt;code&gt;OVRControllerPrefab&lt;/code&gt; and &lt;code&gt;OVRHandPrefab&lt;/code&gt; provide controller/hand models so you can actually see and interact with them in MR.&lt;/p&gt;
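&lt;p&gt;If you prefer configuring this from code, a minimal sketch (assuming the Oculus Integration SDK; check the &lt;code&gt;OVRManager&lt;/code&gt; API in your SDK version) looks like:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;using UnityEngine;

// Sketch: turn on Passthrough from a startup script instead of the Inspector.
// Assumes the Oculus Integration SDK is installed.
public class PassthroughBootstrap : MonoBehaviour
{
    void Start()
    {
        // Equivalent of ticking &quot;Enable Passthrough&quot; on OVRManager
        OVRManager.instance.isInsightPassthroughEnabled = true;
    }
}
&lt;/code&gt;&lt;/pre&gt;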
&lt;h3&gt;2. Adding the Passthrough Layer&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/9passthrough3.png&quot; alt=&quot;passthrough layer&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Create an empty object &lt;code&gt;Passthrough&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Reset its Transform.&lt;/li&gt;
&lt;li&gt;Add &lt;strong&gt;OVR Passthrough Layer&lt;/strong&gt; component inside Passthrough.&lt;/li&gt;
&lt;li&gt;Change &lt;strong&gt;Placement → Underlay&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 Why:
Underlay makes the real-world view sit behind virtual objects, avoiding weird overlaps.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;./images/10passthrough4.png&quot; alt=&quot;centereyeanchor&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;In &lt;code&gt;CenterEyeAnchor&lt;/code&gt;, set Clear Flags to &lt;strong&gt;Solid Color&lt;/strong&gt; and the background to &lt;strong&gt;black&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Create 3D object &lt;code&gt;Cube&lt;/code&gt; in Hierarchy and adjust the position as needed.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 Why:
Prevents Unity’s default skybox or other background from showing through. A quick test object to confirm Passthrough + rendering are working.&lt;/p&gt;
&lt;h2&gt;Video&lt;/h2&gt;
&lt;p&gt;&amp;lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/DnNl0YZpAPk&quot; title=&quot;Passthrough Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Scan Space&lt;/h2&gt;
&lt;h3&gt;1. OVR Settings&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/11scan1.png&quot; alt=&quot;ovrmanager&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Add &lt;code&gt;OVRSceneManager&lt;/code&gt;.
&lt;ol&gt;
&lt;li&gt;Assign &lt;strong&gt;Plane (OVR Scene Anchor)&lt;/strong&gt; to &lt;em&gt;Plane Prefab&lt;/em&gt; and &lt;strong&gt;Volume (OVR Scene Anchor)&lt;/strong&gt; to &lt;em&gt;Volume Prefab&lt;/em&gt; (plane mesh &amp;amp; volume).&lt;/li&gt;
&lt;li&gt;Alternatively (recommended), assign &lt;strong&gt;Invisible Plane (OVR Scene Anchor)&lt;/strong&gt; to &lt;em&gt;Plane Prefab&lt;/em&gt; and &lt;strong&gt;Invisible Volume (OVR Scene Anchor)&lt;/strong&gt; to &lt;em&gt;Volume Prefab&lt;/em&gt;.
&lt;img src=&quot;./images/12scan2.png&quot; alt=&quot;ovrcamerarig&quot; /&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;/li&gt;
&lt;li&gt;In &lt;code&gt;OVRCameraRig&lt;/code&gt;, set &lt;strong&gt;Scene Support&lt;/strong&gt; to &lt;strong&gt;Supported&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 Why:
The Plane and Volume prefabs in &lt;code&gt;OVRSceneManager&lt;/code&gt; enable scene understanding so your virtual objects can collide with real-world walls and floors. Invisible prefabs keep the real-world mesh hidden but functional. Without enabling scene support in &lt;code&gt;OVRCameraRig&lt;/code&gt;, the headset won’t actually run scene scanning.&lt;/p&gt;
&lt;h2&gt;Video&lt;/h2&gt;
&lt;p&gt;&amp;lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/sJj2BqZMWRE&quot; title=&quot;Scan Space Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Ball Interaction&lt;/h2&gt;
&lt;h3&gt;1. Ball Interaction Script&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/13ball1.png&quot; alt=&quot;ballcode&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Go &lt;code&gt;RightHandAnchor → Add Component&lt;/code&gt; and create a &lt;strong&gt;BallInteraction&lt;/strong&gt; component.&lt;/li&gt;
&lt;li&gt;Copy and paste the below script.&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class BallInteraction : MonoBehaviour
{
    public GameObject prefab;
    public float spawnSpeed = 5;

    void Update()
    {
        // Spawn a ball when the right-hand grip trigger is pressed
        if (OVRInput.GetDown(OVRInput.Button.SecondaryHandTrigger))
        {
            // Instantiate the prefab at the hand&apos;s current position...
            GameObject ball = Instantiate(prefab, transform.position, Quaternion.identity);
            // ...and launch it forward with the configured speed
            Rigidbody ballRB = ball.GetComponent&amp;lt;Rigidbody&amp;gt;();
            ballRB.velocity = transform.forward * spawnSpeed;
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;💡 Why:
This script lets you spawn and throw balls from your right hand trigger.&lt;/p&gt;
&lt;h3&gt;2. Physics&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/14ball2.png&quot; alt=&quot;bounce&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Create a Physic Material &lt;code&gt;Bounce&lt;/code&gt; (Bounciness=1) in Assets folder.
&lt;img src=&quot;./images/15ball3.png&quot; alt=&quot;bounceSphere&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Create a 3D object sphere &lt;code&gt;Bounce Sphere&lt;/code&gt; (scale 0.1, 0.1, 0.1) and add &lt;strong&gt;Rigidbody&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Go &lt;code&gt;Bounce Sphere → Sphere Collider → Material&lt;/code&gt; and assign &lt;code&gt;Bounce&lt;/code&gt;.
&lt;img src=&quot;./images/16ball4.png&quot; alt=&quot;removesphere&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Drag &lt;code&gt;Bounce Sphere&lt;/code&gt; into the Assets folder to make it a prefab, then remove it from the Sample Scene.
&lt;img src=&quot;./images/17ball5.png&quot; alt=&quot;right&quot; /&gt;&lt;/li&gt;
&lt;li&gt;Go &lt;code&gt;RightHandAnchor → Ball Interaction → Prefab&lt;/code&gt; and assign &lt;code&gt;Bounce Sphere&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 Why:
The Rigidbody and Physics Material make the ball actually bounce instead of rolling lifelessly, and saving it as a prefab keeps spawning clean and reusable.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;./images/18ball6.png&quot; alt=&quot;time&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Go &lt;code&gt;Edit → Project Settings → Time&lt;/code&gt; and set &lt;strong&gt;Fixed Timestep = 0.0083333&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;💡 Why:
A fixed timestep of 1/120 s (≈0.0083333) runs physics at 120 Hz, matching the Quest&apos;s higher refresh rates and reducing jitter in MR, and linking the prefab ensures that pressing the trigger spawns bouncy balls.&lt;/p&gt;
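&lt;p&gt;The same setting can be applied from code at startup (a sketch; &lt;code&gt;Time.fixedDeltaTime&lt;/code&gt; is the standard Unity script equivalent of the Fixed Timestep field):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;using UnityEngine;

// Sketch: set the physics timestep in code instead of Project Settings.
public class PhysicsRate : MonoBehaviour
{
    void Awake()
    {
        // 1/120 s ≈ 0.0083333, i.e. a 120 Hz physics update
        Time.fixedDeltaTime = 1f / 120f;
    }
}
&lt;/code&gt;&lt;/pre&gt;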
&lt;h2&gt;Video&lt;/h2&gt;
&lt;p&gt;&amp;lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/-LsH-tnGQBY&quot; title=&quot;Ball Interaction Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Errors&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;AirLink → blurry visuals, slower loads.&lt;/li&gt;
&lt;li&gt;USB builds → sometimes took more than 10 minutes to load.&lt;/li&gt;
&lt;li&gt;Unity sometimes didn’t detect the Quest if Wi-Fi was off or the headset was on a different network from the computer.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Future Usage&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Interactive Navigation Cues&lt;/strong&gt;: Seeing balls collide with scanned walls made me think: if objects can interact reliably with meshes, you could use the same logic to overlay arrows indoors.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Spatial Data Visualization&lt;/strong&gt;: Aligning balls with real surfaces highlighted the importance of accurate mesh placement. This directly extends to real-time overlays on equipment.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Room-Aware Game Mechanics&lt;/strong&gt;: Physics interactions with walls suggest gameplay mechanics: your real room can become the level itself.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Furniture / Object Placement&lt;/strong&gt;: Adjusting spawn positions to avoid clipping with walls/furniture revealed the potential to preview objects realistically in your own space.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
</content:encoded></item><item><title>ROS2 &amp; Unity TCP Tutorial</title><link>https://eun346.github.io/eunha_choi/posts/3tcp-ro2-unity/</link><guid isPermaLink="true">https://eun346.github.io/eunha_choi/posts/3tcp-ro2-unity/</guid><description>ROS2 and Unity TCP connect Tutorial</description><pubDate>Thu, 24 Jul 2025 00:00:00 GMT</pubDate><content:encoded>&lt;h1&gt;ROS TCP&lt;/h1&gt;
&lt;p&gt;ROS2 is often seen as the robot’s &lt;strong&gt;brain&lt;/strong&gt;, while Unity serves as the &lt;strong&gt;eyes and environment&lt;/strong&gt;. To simulate realistic robotics behavior, we need a bridge between them. That bridge is the &lt;strong&gt;ROS-TCP Endpoint/Connector&lt;/strong&gt;. By connecting both, we can test navigation, visualization, and SLAM in a controlled environment before deploying to real hardware.&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Understanding the Basics&lt;/h1&gt;
&lt;p&gt;&lt;strong&gt;ROS TCP Endpoint/Connector&lt;/strong&gt;
&lt;img src=&quot;./images/1rostcp.png&quot; alt=&quot;rostcp&quot; /&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Def&lt;/strong&gt;: A ROS node that enables message passing over TCP between ROS2 and external systems like Unity&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: Facilitates real-time communication for robotics simulations and visualizations&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Example&lt;/strong&gt;: A ROS node sends robot position data to Unity, which renders the robot in a 3D environment&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;ROS-Unity Communication&lt;/strong&gt;
&lt;img src=&quot;./images/2rosUnity.png&quot; alt=&quot;rosUnityComm&quot; /&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Image Source: &lt;a href=&quot;https://www.mdpi.com/1424-8220/24/17/5680&quot;&gt;MDPI&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Def&lt;/strong&gt;: The broader process of how ROS 2 and Unity interact, using the TCP Endpoint/Connector to exchange data via topics, services, or actions.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Example&lt;/strong&gt;: Unity subscribes to a ROS topic (/robot_color) to change a robot’s color, or ROS responds to a Unity service request with a robot’s pose.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;SLAM (Simultaneous Localization &amp;amp; Mapping)&lt;/strong&gt;
&lt;img src=&quot;./images/3slam.png&quot; alt=&quot;slam&quot; /&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Image Source: &lt;a href=&quot;https://www.laserscanning-europe.com/en/what-slam&quot;&gt;LASER_Scanning&lt;/a&gt; and &lt;a href=&quot;https://kodifly.com/what-is-slam-a-beginner-to-expert-guide&quot;&gt;KODIFLY&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Def&lt;/strong&gt;: A method for robots to build a map of an unknown environment while tracking their location within it.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Applications&lt;/strong&gt;: Autonomous navigation, 3D map reconstruction, obstacle avoidance.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Example&lt;/strong&gt;: A robot uses LIDAR to create a 2D occupancy grid in RViz, visualized in Unity.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Materials&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Github&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;a href=&quot;https://github.com/Unity-Technologies/ROS-TCP-Endpoint?tab=readme-ov-file&quot;&gt;ROS TCP Endpoint&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;a href=&quot;https://github.com/Unity-Technologies/ROS-TCP-Connector&quot;&gt;ROS TCP Connector&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;ROS (I used Ubuntu 22.04 with ROS2 Humble)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Unity v.2020.2+&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Visual Studio (for C#)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;RViz (for visualization in ROS)&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Tutorial&lt;/h1&gt;
&lt;p&gt;Since I used ROS2, this tutorial is written based on ROS2 (especially ROS2 Humble).&lt;/p&gt;
&lt;h2&gt;Setup&lt;/h2&gt;
&lt;h3&gt;ROS&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/4setup1.png&quot; alt=&quot;download&quot; /&gt;&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Download &lt;a href=&quot;https://github.com/Unity-Technologies/ROS-TCP-Endpoint?tab=readme-ov-file&quot;&gt;ROS TCP Endpoint&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Place the ROS TCP Endpoint package in the &lt;code&gt;src&lt;/code&gt; folder of your colcon workspace, then from the workspace root run the following commands:&lt;/li&gt;
&lt;/ol&gt;
&lt;pre&gt;&lt;code&gt;source /opt/ros/humble/setup.bash
colcon build
source install/setup.bash
&lt;/code&gt;&lt;/pre&gt;
&lt;ol start=&quot;3&quot;&gt;
&lt;li&gt;In your colcon workspace, run the following command, replacing the example IP (&lt;code&gt;192.168.0.5&lt;/code&gt;) with your ROS machine&apos;s IP address or hostname.&lt;/li&gt;
&lt;/ol&gt;
&lt;pre&gt;&lt;code&gt;ros2 run ros_tcp_endpoint default_server_endpoint --ros-args -p ROS_IP:=192.168.0.5
&lt;/code&gt;&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;If you&apos;re running ROS in a Docker container, 0.0.0.0 is a valid incoming address, so you can write
&lt;code&gt;ros2 run ros_tcp_endpoint default_server_endpoint --ros-args -p ROS_IP:=0.0.0.0&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;On Linux you can find out your IP address with the command &lt;code&gt;hostname -I&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;On MacOS you can find out your IP address with &lt;code&gt;ipconfig getifaddr en0&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;p&gt;Once the server_endpoint has started, it will print something similar to &lt;code&gt;[INFO] [1603488341.950794]: Starting server on 192.168.50.149:10000&lt;/code&gt;.
(Alternative) If you need the server to listen on a port that&apos;s different from the default 10000, here&apos;s the command line to also set the ROS_TCP_PORT parameter:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;ros2 run ros_tcp_endpoint default_server_endpoint --ros-args -p ROS_IP:=127.0.0.1 -p ROS_TCP_PORT:=10000
&lt;/code&gt;&lt;/pre&gt;
&lt;h3&gt;Unity&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;Open the Package Manager (&lt;code&gt;Window &amp;gt; Package Manager&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Click the &lt;strong&gt;+&lt;/strong&gt; button in the upper left-hand corner of the window and select &lt;code&gt;Add package from git URL...&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Enter the git URL for the package.&lt;/li&gt;
&lt;li&gt;For the ROS-TCP-Connector, enter &lt;code&gt;https://github.com/Unity-Technologies/ROS-TCP-Connector.git?path=/com.unity.robotics.ros-tcp-connector&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;
&lt;ul&gt;
&lt;li&gt;(Optional) For Visualizations, enter &lt;code&gt;https://github.com/Unity-Technologies/ROS-TCP-Connector.git?path=/com.unity.robotics.visualizations&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;ol start=&quot;5&quot;&gt;
&lt;li&gt;Click &lt;code&gt;Add&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Open &lt;code&gt;Robotics → ROS Settings&lt;/code&gt; from the Unity menu bar and set the ROS IP Address variable to the IP you set earlier. (If you&apos;re using Docker, leave it as the default 127.0.0.1.)&lt;/li&gt;
&lt;li&gt;In the ROS Settings window, ROS2 users should switch the protocol to ROS2 now.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;ROS Unity Integration&lt;/h2&gt;
&lt;h3&gt;Publisher&lt;/h3&gt;
&lt;p&gt;&amp;lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/oeHS8G2DeYs&quot; title=&quot;Publisher Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Unity publishes data (e.g., robot position coordinates) to a ROS 2 topic, which a ROS subscriber receives.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;strong&gt;Concept&lt;/strong&gt;: Unity can act as a Publisher, sending data continuously to a ROS topic.
&lt;strong&gt;Typical Use Case&lt;/strong&gt;: Streaming an object’s position, velocity, or any sensor-like data from Unity to ROS.
&lt;strong&gt;Unity Side&lt;/strong&gt;: You use the ROSConnection component to define the topic and message type, then call Publish() inside a Unity script.
&lt;strong&gt;ROS Side&lt;/strong&gt;: A ROS node subscribes to the same topic to consume Unity’s data.&lt;/p&gt;
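&lt;p&gt;As a sketch of the Unity side (based on the ROS-TCP-Connector API; the topic name, tracked object, and publish period here are illustrative):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;using RosMessageTypes.Geometry;              // Generated PointMsg class
using Unity.Robotics.ROSTCPConnector;        // ROSConnection
using UnityEngine;

// Sketch: publish an object&apos;s position to a ROS2 topic twice per second.
public class CubePosePublisher : MonoBehaviour
{
    public GameObject cube;                  // object to track (illustrative)
    public string topicName = &quot;cube_pos&quot;;    // illustrative topic name
    public float publishPeriod = 0.5f;

    ROSConnection m_Ros;
    float m_Elapsed;

    void Start()
    {
        m_Ros = ROSConnection.GetOrCreateInstance();
        m_Ros.RegisterPublisher&amp;lt;PointMsg&amp;gt;(topicName);
    }

    void Update()
    {
        m_Elapsed += Time.deltaTime;
        if (m_Elapsed &amp;gt; publishPeriod)
        {
            var p = cube.transform.position;
            m_Ros.Publish(topicName, new PointMsg(p.x, p.y, p.z));
            m_Elapsed = 0;
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;On the ROS side, &lt;code&gt;ros2 topic echo /cube_pos&lt;/code&gt; confirms the messages arrive.&lt;/p&gt;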
&lt;h3&gt;Subscriber&lt;/h3&gt;
&lt;p&gt;&amp;lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/TqKIByLq1NI&quot; title=&quot;Subscriber Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;ROS2 publishes a topic to change the color, and Unity subscribes to that topic to update it.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;strong&gt;Concept&lt;/strong&gt;: Unity can also subscribe to ROS topics and react when messages arrive.
&lt;strong&gt;Typical Use Case&lt;/strong&gt;: ROS publishes sensor data or control commands, and Unity updates its simulation accordingly (e.g., changing color, triggering animations).
&lt;strong&gt;Unity Side&lt;/strong&gt;: In Unity, you register a callback for a topic using Subscribe&amp;lt;T&amp;gt;().
&lt;strong&gt;ROS Side&lt;/strong&gt;: A ROS node publishes messages to that topic.&lt;/p&gt;
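&lt;p&gt;A minimal sketch of the Unity side, assuming a &lt;code&gt;std_msgs/ColorRGBA&lt;/code&gt; message on a &lt;code&gt;robot_color&lt;/code&gt; topic (both names illustrative):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;using RosMessageTypes.Std;                   // Generated ColorRGBAMsg class
using Unity.Robotics.ROSTCPConnector;
using UnityEngine;

// Sketch: recolor an object whenever a ROS2 message arrives.
public class ColorSubscriber : MonoBehaviour
{
    public GameObject target;                // object to recolor (illustrative)

    void Start()
    {
        // Register a callback for the topic
        ROSConnection.GetOrCreateInstance()
            .Subscribe&amp;lt;ColorRGBAMsg&amp;gt;(&quot;robot_color&quot;, OnColor);
    }

    void OnColor(ColorRGBAMsg msg)
    {
        target.GetComponent&amp;lt;Renderer&amp;gt;().material.color =
            new Color(msg.r, msg.g, msg.b, msg.a);
    }
}
&lt;/code&gt;&lt;/pre&gt;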
&lt;h3&gt;Service&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;./images/5service.png&quot; alt=&quot;service&quot; /&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Unity (Service Client) requests an object’s pose.
ROS (Service Server) calculates its pose and responds to Unity.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;strong&gt;Concept&lt;/strong&gt;: Unlike Publishers/Subscribers, a Service works as a request–response pattern.
&lt;strong&gt;Typical Use Case&lt;/strong&gt;: When Unity needs a precise answer at a specific moment.
&lt;strong&gt;Unity as Service Client&lt;/strong&gt;: Unity sends a request, e.g., “What’s the pose of this object?”
&lt;strong&gt;ROS as Service Server&lt;/strong&gt;: ROS computes the answer and returns it.&lt;/p&gt;
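&lt;p&gt;On the Unity side, a service call looks roughly like this (a sketch; the &lt;code&gt;ObjectPoseService&lt;/code&gt; request/response types, service name, and fields are illustrative, generated from your own &lt;code&gt;.srv&lt;/code&gt; file):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;using RosMessageTypes.UnityRoboticsDemo;     // illustrative generated srv types
using Unity.Robotics.ROSTCPConnector;
using UnityEngine;

// Sketch: ask a ROS2 service for an object&apos;s pose and log the answer.
public class PoseServiceClient : MonoBehaviour
{
    ROSConnection m_Ros;

    void Start()
    {
        m_Ros = ROSConnection.GetOrCreateInstance();
        m_Ros.RegisterRosService&amp;lt;ObjectPoseServiceRequest, ObjectPoseServiceResponse&amp;gt;(&quot;obj_pose_srv&quot;);

        // Ask ROS for the pose of an object by name (illustrative field)
        var request = new ObjectPoseServiceRequest(&quot;Cube&quot;);
        m_Ros.SendServiceMessage&amp;lt;ObjectPoseServiceResponse&amp;gt;(&quot;obj_pose_srv&quot;, request, OnResponse);
    }

    void OnResponse(ObjectPoseServiceResponse response)
    {
        Debug.Log(response.object_pose);     // illustrative response field
    }
}
&lt;/code&gt;&lt;/pre&gt;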
&lt;h3&gt;Service Call&lt;/h3&gt;
&lt;p&gt;&amp;lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/Jf0TovLSnvA&quot; title=&quot;Service Call Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Start at current position&lt;/li&gt;
&lt;li&gt;Move toward destination&lt;/li&gt;
&lt;li&gt;Near target → Request update from ROS&lt;/li&gt;
&lt;li&gt;Get new destination&lt;/li&gt;
&lt;li&gt;Repeat&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;&lt;strong&gt;Concept&lt;/strong&gt;: A Service Call can be repeated in sequence to drive a process step by step.
&lt;strong&gt;Typical Use Case&lt;/strong&gt;: Unity moves an object toward a goal. Once near the target, Unity requests a new goal from ROS, receives it, and continues.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Note: While Publisher and Subscriber cover most real-time data flows, Services shine when you need precise, one-off information. The Service Call tutorial illustrates how even a simple request–response can orchestrate dynamic behavior when chained together.&lt;/p&gt;
&lt;p&gt;In practice, you’ll often combine these: Publishers for streaming state, Subscribers for reacting to commands, and Services for exact queries. The key is to choose the right communication model for the task at hand.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2&gt;Robotics Nav2 SLAM Example&lt;/h2&gt;
&lt;p&gt;&amp;lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/atLGOWf3JpI&quot; title=&quot;Nav2 SLAM Example Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;
&lt;p&gt;&amp;lt;iframe width=&quot;100%&quot; height=&quot;468&quot; src=&quot;https://www.youtube.com/embed/IOTopMyFSew&quot; title=&quot;Nav2 SLAM Example Video&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;
&lt;h3&gt;Running&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Clone the &lt;a href=&quot;https://github.com/Unity-Technologies/Robotics-Nav2-SLAM-Example/tree/main&quot;&gt;Nav2 SLAM Example&lt;/a&gt; and run:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;git clone https://github.com/Unity-Technologies/Robotics-Nav2-SLAM-Example.git
cd Robotics-Nav2-SLAM-Example/ros2_docker/colcon_ws
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;➡️ Expected Output: A 2D occupancy grid in RViz showing the robot’s map, with real-time updates as it navigates. You can extend this with custom visualization (e.g., battery status, camera feeds).&lt;/p&gt;
&lt;h3&gt;Visualization &amp;amp; Custom Visualizer&lt;/h3&gt;
&lt;p&gt;DefaultVisualizationSuite includes:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;GoalPose → Robot position (topic-based).&lt;/li&gt;
&lt;li&gt;OccupancyGridVisualizer → SLAM map.&lt;/li&gt;
&lt;li&gt;LaserScanSensor → LiDAR scan.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You can also add a custom visualizer script like the following to your Unity project.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;using System;
using System.Collections.Generic;
using RosMessageTypes.Geometry;                     // Generated message classes
using Unity.Robotics.Visualizations;                // Visualizations
using Unity.Robotics.ROSTCPConnector.ROSGeometry;   // Coordinate space utilities
using UnityEngine;

public class PoseTrailVisualizer : HistoryDrawingVisualizer&amp;lt;PoseStampedMsg&amp;gt;
{
    [SerializeField]
    Color m_Color = Color.white;
    [SerializeField]
    float m_Thickness = 0.1f;
    [SerializeField]
    string m_Label = &quot;&quot;;

    public override Action CreateGUI(IEnumerable&amp;lt;Tuple&amp;lt;PoseStampedMsg, MessageMetadata&amp;gt;&amp;gt; messages)
    {
        return () =&amp;gt;
        {
            var count = 0;
            foreach (var (message, meta) in messages)
            {
                GUILayout.Label($&quot;Goal #{count}:&quot;);
                message.pose.GUI();
                count++;
            }
        };
    }

    public override void Draw(Drawing3d drawing, IEnumerable&amp;lt;Tuple&amp;lt;PoseStampedMsg, MessageMetadata&amp;gt;&amp;gt; messages)
    {
        var firstPass = true;
        var prevPoint = Vector3.zero;
        var color = Color.white;
        var label = &quot;&quot;;

        foreach (var (msg, meta) in messages)
        {
            var point = msg.pose.position.From&amp;lt;FLU&amp;gt;();
            if (firstPass)
            {
                color = VisualizationUtils.SelectColor(m_Color, meta);
                label = VisualizationUtils.SelectLabel(m_Label, meta);
                firstPass = false;
            }
            else
            {
                drawing.DrawLine(prevPoint, point, color, m_Thickness);
            }

            prevPoint = point;
        }

        drawing.DrawLabel(label, prevPoint, color);
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h1&gt;Errors&lt;/h1&gt;
&lt;h2&gt;ROS Error&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Colcon Build&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Wrong package name: ROS-TCP-Endpoint-main → ros_tcp_endpoint&lt;/li&gt;
&lt;li&gt;Wrong build location: colcon build inside Robotics-Nav2-SLAM-Example → Robotics-Nav2-SLAM-Example/ros2_docker/colcon_ws&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Not Connected to Unity&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Wrong Wi-Fi&lt;/li&gt;
&lt;li&gt;Wrong ROS IP Address&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Unity Error&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;&lt;code&gt;DeserializationException: Cannot deserialize message&lt;/code&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Rebuild the msg files so the generated C# message classes match the ROS message definitions&lt;/li&gt;
&lt;/ul&gt;
</content:encoded></item><item><title>ROS2 Tutorial</title><link>https://eun346.github.io/eunha_choi/posts/2ros2-tutorial/</link><guid isPermaLink="true">https://eun346.github.io/eunha_choi/posts/2ros2-tutorial/</guid><description>ROS2 Beginner -&gt; Intermediate Tutorial</description><pubDate>Mon, 14 Jul 2025 00:00:00 GMT</pubDate><content:encoded>&lt;h1&gt;Understanding the Basics&lt;/h1&gt;
&lt;h2&gt;ROS2 (Robot Operating System 2)&lt;/h2&gt;
&lt;p&gt;ROS2 is essentially a collection of software libraries and tools designed to streamline robot development. It provides the building blocks for communication, control, and simulation so that each component of a robot can remain modular and independent. While ROS1 already established this architecture, ROS2 pushes things further by adopting DDS (Data Distribution Service), resulting in better performance, improved reliability, and true distributed communication across a network.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;./images/1rosvsros2.png&quot; alt=&quot;roscompare&quot; /&gt;&lt;/p&gt;
&lt;h2&gt;Node&lt;/h2&gt;
&lt;p&gt;&lt;img src=&quot;https://docs.ros.org/en/foxy/_images/Nodes-TopicandService.gif&quot; alt=&quot;NODE&quot; /&gt;
A node is the smallest unit of computation in ROS2. Each node has a clear, single responsibility—whether that’s reading sensor data, processing images, or issuing motor commands. Nodes communicate with each other through standardized interfaces, which is what allows us to scale from a toy simulation to a complex robot without rewriting the entire system.&lt;/p&gt;
&lt;h2&gt;Topic&lt;/h2&gt;
&lt;p&gt;&lt;img src=&quot;https://docs.ros.org/en/foxy/_images/Topic-MultiplePublisherandMultipleSubscriber.gif&quot; alt=&quot;Topic&quot; /&gt;
Topics are the primary data channels between nodes. They support 1:N and N:N communication, making them ideal for streaming sensor data or publishing continuous control commands. If multiple subscribers are listening, ROS2 handles the distribution—something that becomes essential once a system grows beyond a few components.&lt;/p&gt;
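&lt;p&gt;To make this concrete, here is a minimal plain-Python sketch of topic-style fan-out. It is illustrative only: real ROS2 topics go through rclpy and DDS, and the topic and node names below are made up.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Illustrative topic bus: every callback subscribed to a topic
# receives every message published on it (1:N / N:N fan-out).
class TopicBus:
    def __init__(self):
        self.subscribers = {}  # topic name -&amp;gt; list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        for callback in self.subscribers.get(topic, []):
            callback(msg)

bus = TopicBus()
received = []
bus.subscribe(&quot;/scan&quot;, lambda m: received.append((&quot;node_a&quot;, m)))
bus.subscribe(&quot;/scan&quot;, lambda m: received.append((&quot;node_b&quot;, m)))
bus.publish(&quot;/scan&quot;, {&quot;ranges&quot;: [1.0, 2.0]})
# Both subscribers received the same message.
&lt;/code&gt;&lt;/pre&gt;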
&lt;h2&gt;Parameter&lt;/h2&gt;
&lt;p&gt;Parameters are configurable values stored inside a node. They allow runtime tuning without code changes—for example, adjusting a robot’s speed limit or sensor thresholds. While they seem simple, parameters are surprisingly powerful when combined with launch files and dynamic updates.&lt;/p&gt;
&lt;h2&gt;Service&lt;/h2&gt;
&lt;p&gt;&lt;img src=&quot;https://docs.ros.org/en/foxy/_images/Service-MultipleServiceClient.gif&quot; alt=&quot;Service&quot; /&gt;
A service provides a synchronous, request–response communication pattern. Since calls are deterministic and 1:1, services work best for short operations such as resetting a simulation, querying a map, or changing a mode. Services use &lt;code&gt;.srv&lt;/code&gt; files to define their request and response format.&lt;/p&gt;
&lt;h2&gt;Action&lt;/h2&gt;
&lt;p&gt;&lt;img src=&quot;https://docs.ros.org/en/foxy/_images/Action-SingleActionClient.gif&quot; alt=&quot;Action&quot; /&gt;
Actions extend the idea of services by supporting long-running tasks with continuous feedback. Navigation is a typical example: you send a goal, receive progress updates, and eventually get a result. The &lt;code&gt;.action&lt;/code&gt; file describes this three-part structure.&lt;/p&gt;
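&lt;p&gt;The goal/feedback/result flow can be sketched in plain Python with a generator (names are illustrative; real ROS2 actions are implemented with rclpy action servers and &lt;code&gt;.action&lt;/code&gt; interfaces):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;def navigate_action(goal_distance):
    # Illustrative action: emits feedback while working, then a result.
    progress = 0.0
    for _ in range(int(goal_distance)):
        progress += 1.0
        yield (&quot;feedback&quot;, progress)   # continuous progress updates
    yield (&quot;result&quot;, progress)         # final result when done

events = list(navigate_action(3))
feedback = [e for e in events if e[0] == &quot;feedback&quot;]
result = events[-1]
# Three feedback updates arrive before the final result.
&lt;/code&gt;&lt;/pre&gt;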
&lt;h2&gt;Interface&lt;/h2&gt;
&lt;p&gt;ROS2 defines three file types for communication:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;.msg&lt;/code&gt; → basic message structures&lt;/li&gt;
&lt;li&gt;&lt;code&gt;.srv&lt;/code&gt; → service (request/response)&lt;/li&gt;
&lt;li&gt;&lt;code&gt;.action&lt;/code&gt; → long-running actions with feedback&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These interfaces ensure consistent communication across nodes, packages, and even different programming languages.&lt;/p&gt;
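&lt;p&gt;For example, a small custom interface set might look like this (field and file names are illustrative; a &lt;code&gt;.srv&lt;/code&gt; separates request and response with &lt;code&gt;---&lt;/code&gt;, and a &lt;code&gt;.action&lt;/code&gt; has goal, result, and feedback sections):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Position.msg
float64 x
float64 y

# SetSpeed.srv (request above ---, response below)
float64 speed
---
bool success

# Navigate.action (goal / result / feedback)
geometry_msgs/PoseStamped goal_pose
---
bool reached
---
float32 distance_remaining
&lt;/code&gt;&lt;/pre&gt;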
&lt;h2&gt;Launch&lt;/h2&gt;
&lt;p&gt;Launch files are one of the major conveniences in ROS2. Instead of manually running dozens of commands, you can start entire systems—nodes, parameters, remappings, and environment configuration—through a single Python-based launch file. As projects grow, launch files become essential for keeping everything organized.&lt;/p&gt;
&lt;h2&gt;Composition&lt;/h2&gt;
&lt;p&gt;Node composition lets multiple nodes run inside a single process. This significantly reduces overhead and improves performance, especially for systems with high-frequency communication. It’s one of those features that doesn’t seem necessary at first, but becomes valuable when optimizing for real robots.&lt;/p&gt;
&lt;h2&gt;Executor&lt;/h2&gt;
&lt;p&gt;Executors determine how callbacks are processed:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Single-threaded executor&lt;/strong&gt; — each callback runs sequentially&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Multi-threaded executor&lt;/strong&gt; — callbacks may run in parallel&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Understanding executors is important for writing predictable, real-time-safe behavior.&lt;/p&gt;
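&lt;p&gt;The difference can be sketched in plain Python (illustrative only; real executors live in rclpy and rclcpp):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;from concurrent.futures import ThreadPoolExecutor

def run_single_threaded(callbacks):
    # Single-threaded executor: callbacks run strictly one after another.
    return [cb() for cb in callbacks]

def run_multi_threaded(callbacks):
    # Multi-threaded executor: callbacks may overlap in time,
    # so any shared state must be thread-safe.
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(cb) for cb in callbacks]
        return [f.result() for f in futures]

callbacks = [lambda i=i: i * i for i in range(4)]
sequential = run_single_threaded(callbacks)  # [0, 1, 4, 9]
parallel = run_multi_threaded(callbacks)     # same values, concurrent execution
&lt;/code&gt;&lt;/pre&gt;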
&lt;hr /&gt;
&lt;h1&gt;Materials&lt;/h1&gt;
&lt;p&gt;To follow most ROS2 tutorials comfortably, the following setup works best:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Ubuntu 22.04&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://docs.ros.org/en/humble/index.html&quot;&gt;Official ROS2 Humble Docs&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;YouTube video tutorials&lt;/li&gt;
&lt;li&gt;Visual Studio Code (C++ / Python)&lt;/li&gt;
&lt;li&gt;Additional tools:
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Turtlesim&lt;/strong&gt; for basic concepts&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Gazebo&lt;/strong&gt; for simulation&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;RViz 2&lt;/strong&gt; for visualization&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h1&gt;Tutorials&lt;/h1&gt;
&lt;h2&gt;Beginner: CLI Tools&lt;/h2&gt;
&lt;p&gt;Before writing any code, ROS2 encourages you to explore the ecosystem through the command line. This stage is surprisingly valuable because it forces you to understand how nodes, topics, services, and actions behave in a real running system.&lt;/p&gt;
&lt;h3&gt;What you learn here&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Inspecting active nodes with &lt;code&gt;ros2 node list&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Checking published and subscribed topics with &lt;code&gt;ros2 topic list&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Echoing data streams using &lt;code&gt;ros2 topic echo&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Calling services directly through CLI&lt;/li&gt;
&lt;li&gt;Visualizing architecture using &lt;code&gt;rqt_graph&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Basic debugging with &lt;code&gt;ros2 doctor&lt;/code&gt;, &lt;code&gt;ros2 node info&lt;/code&gt;, and &lt;code&gt;ros2 topic info&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Most developers (including me) underestimate this step, but these CLI tools become indispensable once you’re debugging large multi-node systems.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Beginner: Client Libraries (Python &amp;amp; C++)&lt;/h2&gt;
&lt;p&gt;Once you get familiar with how ROS2 works under the hood, the next step is building your own packages. This is where &lt;code&gt;colcon&lt;/code&gt; and the workspace structure start making sense. You create a &lt;code&gt;src/&lt;/code&gt; folder, add packages, write nodes, and build everything in a clean, modular layout.&lt;/p&gt;
&lt;h3&gt;Core Topics&lt;/h3&gt;
&lt;h3&gt;1. Creating Packages&lt;/h3&gt;
&lt;p&gt;You learn to generate packages using:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;ros2 pkg create my_pkg --build-type ament_cmake
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;or Python packages via:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;ros2 pkg create my_pkg --build-type ament_python
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This automatically sets up the directory structure, &lt;code&gt;package.xml&lt;/code&gt;, and build files.&lt;/p&gt;
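&lt;p&gt;For an &lt;code&gt;ament_python&lt;/code&gt; package, the generated layout looks roughly like this (using the package name from the command above):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;my_pkg/
├── package.xml        # metadata and dependencies
├── setup.py           # entry points for your executables
├── setup.cfg
├── resource/my_pkg
├── test/
└── my_pkg/
    └── __init__.py
&lt;/code&gt;&lt;/pre&gt;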
&lt;h3&gt;2. Publisher &amp;amp; Subscriber&lt;/h3&gt;
&lt;p&gt;This is the “Hello World” of ROS2 development. Writing a minimal pub/sub pair teaches you:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;message imports&lt;/li&gt;
&lt;li&gt;QoS (Quality of Service) basics&lt;/li&gt;
&lt;li&gt;callback structures&lt;/li&gt;
&lt;li&gt;timers&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;It’s simple, but it becomes the foundation for almost every robotics system.&lt;/p&gt;
&lt;h3&gt;3. Services &amp;amp; Clients&lt;/h3&gt;
&lt;p&gt;Here you write synchronous communication in both languages. You define &lt;code&gt;.srv&lt;/code&gt; files, build interfaces, and implement request–response handlers.&lt;/p&gt;
&lt;h3&gt;4. Custom Interfaces&lt;/h3&gt;
&lt;p&gt;You define your own &lt;code&gt;.msg&lt;/code&gt; and &lt;code&gt;.srv&lt;/code&gt; structures.
This part forces you to think about architecture instead of dumping arbitrary data.&lt;/p&gt;
&lt;h3&gt;5. Parameters&lt;/h3&gt;
&lt;p&gt;ROS2’s parameter system allows you to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;load parameters from YAML&lt;/li&gt;
&lt;li&gt;override them via CLI&lt;/li&gt;
&lt;li&gt;declare dynamic parameters&lt;/li&gt;
&lt;li&gt;handle parameter callbacks&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This becomes extremely useful in launch files and tuning algorithms.&lt;/p&gt;
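&lt;p&gt;A parameter YAML file puts the node name at the top level with the values nested under &lt;code&gt;ros__parameters&lt;/code&gt; (the node and parameter names below are made up):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;my_robot_node:
  ros__parameters:
    speed_limit: 0.5
    sensor_threshold: 10
    use_sim_time: false
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;You can load it at startup with &lt;code&gt;ros2 run my_pkg my_node --ros-args --params-file params.yaml&lt;/code&gt;.&lt;/p&gt;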
&lt;hr /&gt;
&lt;h2&gt;Intermediate: Advancing with ROS2&lt;/h2&gt;
&lt;p&gt;This stage covers the actual building blocks that real robots rely on. It’s where everything you learned starts connecting, and where the architecture begins to feel more intentional.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Working With Dependencies&lt;/h3&gt;
&lt;p&gt;Before building any ROS2 package, you need dependency resolution.
&lt;code&gt;rosdep&lt;/code&gt; handles this by scanning &lt;code&gt;package.xml&lt;/code&gt; and installing missing system libs.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;rosdep install --from-paths src -y --ignore-src
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;It looks like a minor detail, but skipping this step is behind a large share of beginner build failures.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Actions&lt;/h3&gt;
&lt;p&gt;Actions take services to the next level. You define &lt;code&gt;.action&lt;/code&gt; files with:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Goal&lt;/li&gt;
&lt;li&gt;Result&lt;/li&gt;
&lt;li&gt;Feedback&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Then you implement action servers and clients in both Python and C++.
Navigation stacks and manipulation frameworks rely heavily on actions.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Composable Nodes&lt;/h3&gt;
&lt;p&gt;One of the most powerful features in ROS2.
Instead of launching each node as a separate process, you load multiple nodes into a single shared process.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Why it matters:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;lower latency (no process-to-process overhead)&lt;/li&gt;
&lt;li&gt;reduced memory consumption&lt;/li&gt;
&lt;li&gt;faster communication&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Composable nodes make a huge difference in real robot performance.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Node Interfaces &amp;amp; Dynamic Parameters&lt;/h3&gt;
&lt;p&gt;This covers more advanced techniques like:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;introspecting node interfaces&lt;/li&gt;
&lt;li&gt;reacting to parameter updates&lt;/li&gt;
&lt;li&gt;using templates and advanced C++ patterns&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;It helps you understand how nodes should behave in large-scale systems.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Launch Files (Python / XML)&lt;/h3&gt;
&lt;p&gt;Launch files become more complex here:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;multiple nodes&lt;/li&gt;
&lt;li&gt;namespaces&lt;/li&gt;
&lt;li&gt;remapping&lt;/li&gt;
&lt;li&gt;events and lifecycle management&lt;/li&gt;
&lt;li&gt;conditional execution&lt;/li&gt;
&lt;li&gt;passing parameters to nodes automatically&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You essentially design how your entire robotics system boots up.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;TF2 (Transforms)&lt;/h3&gt;
&lt;p&gt;TF2 is the coordinate transform system of ROS2. Every robot with physical movement uses it.&lt;/p&gt;
&lt;p&gt;You learn:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;broadcasting transforms&lt;/li&gt;
&lt;li&gt;listening and chaining transforms&lt;/li&gt;
&lt;li&gt;using TF buffers&lt;/li&gt;
&lt;li&gt;visualizing frames in RViz&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Understanding TF2 is crucial for anything involving sensors or movement.&lt;/p&gt;
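&lt;p&gt;Chaining transforms is just composing them in order. Here is a plain-Python 2D sketch of what a TF lookup does internally (illustrative only; real code uses tf2_ros buffers and listeners):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;import math

def compose(a, b):
    # Each transform is (x, y, theta): translation plus planar rotation.
    x1, y1, t1 = a
    x2, y2, t2 = b
    # Rotate b&apos;s translation into a&apos;s frame, then add translations.
    x = x1 + x2 * math.cos(t1) - y2 * math.sin(t1)
    y = y1 + x2 * math.sin(t1) + y2 * math.cos(t1)
    return (x, y, t1 + t2)

# map -&amp;gt; odom -&amp;gt; base_link, chained like a TF tree lookup
map_to_odom = (1.0, 0.0, 0.0)
odom_to_base = (2.0, 1.0, math.pi / 2)
map_to_base = compose(map_to_odom, odom_to_base)  # (3.0, 1.0, pi/2)
&lt;/code&gt;&lt;/pre&gt;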
&lt;hr /&gt;
&lt;h3&gt;Testing (Unit &amp;amp; Integration)&lt;/h3&gt;
&lt;p&gt;ROS2 has built-in testing through:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;GTest for C++&lt;/li&gt;
&lt;li&gt;pytest for Python&lt;/li&gt;
&lt;li&gt;&lt;code&gt;launch_testing&lt;/code&gt; for integration tests&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Testing robotic systems might feel optional, but once you run a multi-node stack, automated tests save hours of debugging.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Robot Modeling (URDF) &amp;amp; Visualization&lt;/h3&gt;
&lt;p&gt;URDF (Unified Robot Description Format) is used to model robot links and joints.
You visualize it in RViz and integrate it with Gazebo simulation.&lt;/p&gt;
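&lt;p&gt;A minimal URDF with two links and one fixed joint looks like this (names are illustrative):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&amp;lt;?xml version=&quot;1.0&quot;?&amp;gt;
&amp;lt;robot name=&quot;my_robot&quot;&amp;gt;
  &amp;lt;link name=&quot;base_link&quot;/&amp;gt;
  &amp;lt;link name=&quot;lidar_link&quot;/&amp;gt;
  &amp;lt;joint name=&quot;lidar_mount&quot; type=&quot;fixed&quot;&amp;gt;
    &amp;lt;parent link=&quot;base_link&quot;/&amp;gt;
    &amp;lt;child link=&quot;lidar_link&quot;/&amp;gt;
    &amp;lt;origin xyz=&quot;0 0 0.2&quot; rpy=&quot;0 0 0&quot;/&amp;gt;
  &amp;lt;/joint&amp;gt;
&amp;lt;/robot&amp;gt;
&lt;/code&gt;&lt;/pre&gt;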
&lt;p&gt;This is the point where your ROS2 knowledge becomes tangible—you can visualize your own robot moving, even before a physical prototype exists.&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Debugging Notes&lt;/h1&gt;
&lt;p&gt;Some common issues that appear while building ROS2 projects:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Missing dependencies (fix with &lt;code&gt;rosdep install&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Incorrect package paths&lt;/li&gt;
&lt;li&gt;Misconfigured CMakeLists or package.xml&lt;/li&gt;
&lt;li&gt;Build failures due to unsaved files (surprisingly common)&lt;/li&gt;
&lt;li&gt;Gazebo version conflicts&lt;/li&gt;
&lt;li&gt;RViz plugin issues&lt;/li&gt;
&lt;li&gt;Wrong QoS causing subscriber to receive no messages&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Debugging ROS2 is an ongoing skill, not a one-time lesson.&lt;/p&gt;
</content:encoded></item><item><title>VR Game Tutorial</title><link>https://eun346.github.io/eunha_choi/posts/1vr-tutorial/</link><guid isPermaLink="true">https://eun346.github.io/eunha_choi/posts/1vr-tutorial/</guid><description>This post deals with Tutorial for VR Game.</description><pubDate>Mon, 07 Jul 2025 00:00:00 GMT</pubDate><content:encoded>&lt;h1&gt;Introduction&lt;/h1&gt;
&lt;p&gt;I recently followed Valem’s “How to Make a VR Game in Unity” series (13 videos) to understand the fundamentals of VR interaction using the Unity XR Interaction Toolkit. The videos cover everything from basic VR setup to locomotion, grabbing, haptics, UI interaction, and comfort features. While the tutorial itself is straightforward, implementing it in a real Unity project exposed a lot of small details that matter more than expected. This post is a structured summary of everything I learned along the way—both from the guide and from debugging my own mistakes.&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Tutorial Overview&lt;/h1&gt;
&lt;p&gt;&lt;strong&gt;Source:&lt;/strong&gt; “How to make a VR game in Unity” by Valem (13-video series)&lt;br /&gt;
&lt;strong&gt;Hardware:&lt;/strong&gt; Meta Quest 2 / Quest Link&lt;br /&gt;
&lt;strong&gt;Software:&lt;/strong&gt; Unity 2023.x + XR Interaction Toolkit&lt;/p&gt;
&lt;h3&gt;Video Breakdown&lt;/h3&gt;
&lt;p&gt;1–2. VR Overview &amp;amp; Unity Setup&lt;br /&gt;
3. Hand Models &amp;amp; Input&lt;br /&gt;
4. Continuous Move &amp;amp; Turn&lt;br /&gt;
5. Teleportation&lt;br /&gt;
6. Hover / Grab / Interactables&lt;br /&gt;
7. Offset Grab &amp;amp; Distance Grab&lt;br /&gt;
8. VR UI&lt;br /&gt;
9. Haptics&lt;br /&gt;
10–11. Grab Poses&lt;br /&gt;
12. Tunneling Vignette (comfort)&lt;br /&gt;
13. Example Project Overview&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Development Steps&lt;/h1&gt;
&lt;h2&gt;Programming (C#)&lt;/h2&gt;
&lt;p&gt;Throughout the tutorial I implemented several core interaction scripts:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;custom input detection (pinch, grip, trigger) using the Input System&lt;/li&gt;
&lt;li&gt;projectile / bullet firing logic&lt;/li&gt;
&lt;li&gt;vibrating controllers via XR haptics&lt;/li&gt;
&lt;li&gt;enforcing custom grab poses&lt;/li&gt;
&lt;li&gt;automatically disabling teleport rays while interacting with UI&lt;/li&gt;
&lt;li&gt;managing interaction layers for direct vs. ray interactors&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Implementing these by hand gave me a much better understanding of how XR Interaction Toolkit actually processes events behind the scenes.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Unity &amp;amp; XR Interaction Toolkit&lt;/h2&gt;
&lt;h3&gt;Core Unity steps included:&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;enabling OpenXR in XR Plugin Management&lt;/li&gt;
&lt;li&gt;setting up the XR Origin (camera + hands)&lt;/li&gt;
&lt;li&gt;using XR Ray Interactor for teleportation and distance grabbing&lt;/li&gt;
&lt;li&gt;building interactive World Space UIs&lt;/li&gt;
&lt;li&gt;adding continuous locomotion and snap turning&lt;/li&gt;
&lt;li&gt;configuring Tunneling Vignette to reduce motion sickness&lt;/li&gt;
&lt;li&gt;mirroring hand poses between left and right hands&lt;/li&gt;
&lt;li&gt;organizing interaction layers&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Even though Unity automates a lot of VR setup today, the small configuration details still matter—especially when multiple systems like locomotion, UI, grabbing, and physics overlap.&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Step-by-Step Implementation&lt;/h1&gt;
&lt;h2&gt;Video 1–2: VR Overview &amp;amp; Setup&lt;/h2&gt;
&lt;h3&gt;1. Project Setup&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Created a new Unity project&lt;/li&gt;
&lt;li&gt;Adjusted project settings (rendering, physics, XR defaults)&lt;/li&gt;
&lt;li&gt;Added a main VR scene with a plane and basic environment&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;2. XR Setup&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Installed &lt;strong&gt;XR Interaction Toolkit&lt;/strong&gt; via Package Manager&lt;/li&gt;
&lt;li&gt;Added &lt;strong&gt;XR Origin&lt;/strong&gt; containing camera and both hand controllers&lt;/li&gt;
&lt;li&gt;Configured Input Actions for both controllers&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;3. XR Plugin&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Enabled &lt;strong&gt;OpenXR&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Activated required interaction profiles&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This part seems simple, but OpenXR configuration must match the headset—otherwise nothing responds.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 3: Hand Models &amp;amp; VR Input&lt;/h2&gt;
&lt;h3&gt;Hand Setup&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Imported custom hand models&lt;/li&gt;
&lt;li&gt;Placed them under LeftHand / RightHand&lt;/li&gt;
&lt;li&gt;(Optional) Added Animator controllers for finger motions&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Input Detection&lt;/h3&gt;
&lt;p&gt;Coded detection for:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;trigger&lt;/li&gt;
&lt;li&gt;grip&lt;/li&gt;
&lt;li&gt;pinch&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This made it easier to connect interactions like firing bullets or grabbing interactables.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 4: Continuous Movement &amp;amp; Turning&lt;/h2&gt;
&lt;h3&gt;Continuous Locomotion&lt;/h3&gt;
&lt;p&gt;Implemented using:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Continuous Move Provider&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Character Controller&lt;/li&gt;
&lt;li&gt;Gravity + center adjustments&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Turn Provider&lt;/h3&gt;
&lt;p&gt;Two modes:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Continuous Turn (smooth rotation)&lt;/li&gt;
&lt;li&gt;Snap Turn (fixed-angle rotation for comfort)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Movement feels simple, but balancing gravity, character height, and collision takes trial and error.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 5: Teleportation&lt;/h2&gt;
&lt;h3&gt;Teleportation Setup&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Added &lt;strong&gt;XR Ray Interactor&lt;/strong&gt; to each controller&lt;/li&gt;
&lt;li&gt;Configured straight or curved ray&lt;/li&gt;
&lt;li&gt;Added &lt;strong&gt;Teleportation Provider&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Defined teleportation areas (floors, platforms)&lt;/li&gt;
&lt;li&gt;Added a hover reticle for valid targets&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Teleport is still the most comfortable VR locomotion option, so getting this right matters.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 6: Hover, Grab, and Interactables&lt;/h2&gt;
&lt;h3&gt;Hover Detection&lt;/h3&gt;
&lt;p&gt;Used &lt;strong&gt;XR Direct Interactor&lt;/strong&gt; + Collider.&lt;/p&gt;
&lt;h3&gt;Grab Interactables&lt;/h3&gt;
&lt;p&gt;Configured:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Rigidbody&lt;/li&gt;
&lt;li&gt;Collider&lt;/li&gt;
&lt;li&gt;Interaction settings (instantaneous / velocity-based / kinematic)&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Projectile Firing&lt;/h3&gt;
&lt;p&gt;Imported a pistol model and implemented forward bullet firing.&lt;/p&gt;
&lt;p&gt;This was the first part that made the project feel like an actual VR game.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 7: Offset Grab &amp;amp; Distance Grab&lt;/h2&gt;
&lt;h3&gt;Offset Grab&lt;/h3&gt;
&lt;p&gt;Created attachment transforms for left and right hands so each object is held naturally.&lt;/p&gt;
&lt;h3&gt;Distance Grab&lt;/h3&gt;
&lt;p&gt;Enabled remote grabbing using:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;XR Ray Interactor&lt;/li&gt;
&lt;li&gt;Interaction layers&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Layer management is surprisingly easy to overlook, but it&apos;s crucial for preventing your rays from grabbing UI or teleport surfaces unintentionally.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 8: VR UI&lt;/h2&gt;
&lt;h3&gt;World-Space UI&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Added a World Space canvas&lt;/li&gt;
&lt;li&gt;Attached buttons, sliders, toggles&lt;/li&gt;
&lt;li&gt;Set UI layer + interaction components&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Handling Interaction Conflicts&lt;/h3&gt;
&lt;p&gt;Disabled teleport rays whenever hands interacted with UI—preventing accidental teleports.&lt;/p&gt;
&lt;h3&gt;Dynamic UI&lt;/h3&gt;
&lt;p&gt;Added a toggle panel that appears in front of the user.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 9: Haptic Feedback&lt;/h2&gt;
&lt;h3&gt;Haptic Setup&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Attached custom haptic scripts&lt;/li&gt;
&lt;li&gt;Triggered haptics on grab/release&lt;/li&gt;
&lt;li&gt;Controlled intensity + duration&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Haptics are a small detail but dramatically improve interaction feel.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 10–11: Grab Poses&lt;/h2&gt;
&lt;h3&gt;Custom Grab Poses&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Captured hand joint data for right hand&lt;/li&gt;
&lt;li&gt;Applied pose to interactable via attach transform&lt;/li&gt;
&lt;li&gt;Mirrored pose to left hand&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Custom grab poses make objects feel much more realistic, especially tools or weapons.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 12: Tunneling Vignette&lt;/h2&gt;
&lt;h3&gt;Comfort Feature&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Imported &lt;strong&gt;Tunneling Vignette&lt;/strong&gt; sample&lt;/li&gt;
&lt;li&gt;Configured feathering, fade times, colors&lt;/li&gt;
&lt;li&gt;Attached component to main camera&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This reduces VR motion sickness by limiting peripheral vision during movement.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video 13: Example Project&lt;/h2&gt;
&lt;p&gt;Explored Unity’s built-in XR samples:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;interactables&lt;/li&gt;
&lt;li&gt;physics interaction&lt;/li&gt;
&lt;li&gt;socket interactables&lt;/li&gt;
&lt;li&gt;climbing&lt;/li&gt;
&lt;li&gt;gaze input&lt;/li&gt;
&lt;li&gt;locomotion&lt;/li&gt;
&lt;li&gt;both 2D &amp;amp; 3D UI&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This was helpful to understand more complex, real-world setups.&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Problems I Encountered&lt;/h1&gt;
&lt;h2&gt;1. Unity Version Mismatch&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Issue:&lt;/strong&gt; Certain XR features didn&apos;t match the tutorial.&lt;br /&gt;
&lt;strong&gt;Fix:&lt;/strong&gt; Downgraded Unity to &lt;strong&gt;2023.3.16f1&lt;/strong&gt; (matching the tutorial’s version).&lt;/p&gt;
&lt;h2&gt;2. Meta Quest Link Issues&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Issue:&lt;/strong&gt; Quest Link displayed in low resolution or failed to connect.&lt;br /&gt;
&lt;strong&gt;Fix:&lt;/strong&gt; Switched PCs + updated GPU drivers &amp;amp; Oculus app.&lt;/p&gt;
&lt;h2&gt;3. Input Misconfiguration&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Issue:&lt;/strong&gt; Trigger / grip inputs not firing correctly.&lt;br /&gt;
&lt;strong&gt;Fix:&lt;/strong&gt; Rewrote entire Input Action asset and fixed bindings.&lt;/p&gt;
&lt;p&gt;Debugging VR setups often feels like fighting invisible settings—especially with the Input System.&lt;/p&gt;
&lt;hr /&gt;
&lt;h1&gt;Questions Going Forward&lt;/h1&gt;
&lt;p&gt;Following the tutorial clarified the basics, but it also left me with a few questions that matter for real development:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Should I rely on Unity’s default grab system or fully customize grab poses per object?&lt;/li&gt;
&lt;li&gt;Are there recommended architecture patterns for complex VR interactions?&lt;/li&gt;
&lt;li&gt;How can I simulate and debug VR features quickly without wearing the headset every time?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These are areas I want to explore more deeply as I start developing my own VR mechanics.&lt;/p&gt;
</content:encoded></item></channel></rss>