MSOE Winter Game Jam: Mixed Reality Tech Demo for HoloLens 2 (1st Place)

Problem Statement: For an MSOE game development club winter game jam, the goal was to showcase interactive Mixed Reality by combining real-world spatial understanding with autonomous agents and user-controlled environmental effects on the HoloLens 2, demonstrating technical depth within a short build window.

Solution: I built a HoloLens 2 tech demo using MRTK2 and Microsoft Scene Understanding to generate a real‑world floor mesh, which was then converted into a runtime nav mesh. Users spawn winter‑themed wolves that obey gravity and roam the mesh at random. A hand menu toggles snowfall particle systems, adjusts a wind strength that influences snow drift, and spawns additional wolves. The project won 1st place for its innovative use of spatial understanding and intuitive hand‑tracked controls that require no peripherals.

Skills Used:

  • C#
  • Unity
  • MRTK
  • HoloLens 2
  • OpenXR
  • XR
  • MR
  • VR
  • Game Development

Development Process

Prototyping Spatial Mesh Interactions and Hand Interactions

Created an initial prototype with a placeholder low-poly wolf model: the wolf was kinematic and could collide with, and be occluded by, the spatial mesh. Implemented a basic MRTK2 hand menu to spawn wolves.
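A minimal sketch of how such a spawner could look, assuming a `WolfSpawner` component and a `wolfPrefab` field wired to the hand menu's button event (both names are hypothetical, not the project's actual code):

```csharp
using UnityEngine;

// Hypothetical sketch: spawns a kinematic wolf prefab in front of the user.
public class WolfSpawner : MonoBehaviour
{
    [SerializeField] private GameObject wolfPrefab;   // placeholder low-poly wolf
    [SerializeField] private float spawnDistance = 1.5f;

    // Wired to the hand menu's "Spawn Wolf" button (e.g. an MRTK Interactable OnClick event).
    public void SpawnWolf()
    {
        var head = Camera.main.transform;
        var position = head.position + head.forward * spawnDistance;
        var wolf = Instantiate(wolfPrefab, position, Quaternion.LookRotation(-head.forward));

        // Kinematic so the wolf rests against the spatial mesh without physics jitter;
        // the spatial mesh's colliders still block and occlude it.
        var rb = wolf.GetComponent<Rigidbody>();
        if (rb != null) rb.isKinematic = true;
    }
}
```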

Hand Menu & Controls

Built MRTK2 hand menu to spawn wolves, toggle snow, and adjust wind force affecting particle direction.
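The snow toggle and wind slider could be bound to callbacks like the following sketch; `SnowController` and its fields are assumptions, and the wind effect is modeled here as a horizontal force-over-lifetime on the particle system:

```csharp
using UnityEngine;

// Hypothetical hand-menu callbacks for the snowfall particle system.
public class SnowController : MonoBehaviour
{
    [SerializeField] private ParticleSystem snow;

    // Bound to the hand menu's snow toggle.
    public void SetSnowActive(bool active)
    {
        if (active) snow.Play();
        else snow.Stop(true, ParticleSystemStopBehavior.StopEmitting);
    }

    // Bound to a wind slider (e.g. an MRTK PinchSlider's OnValueUpdated).
    // A sideways force-over-lifetime pushes the flakes to simulate drift.
    public void SetWindStrength(float wind)
    {
        var force = snow.forceOverLifetime;
        force.enabled = true;
        force.x = new ParticleSystem.MinMaxCurve(wind);
    }
}
```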

Scene Understanding + Runtime Nav Mesh


Detected floor surfaces via Microsoft Scene Understanding, filtered the planes to build a clean floor mesh, and generated a nav mesh from it at runtime. Wolves wander to random points on the mesh while gravity keeps them grounded; there is no state machine, just a repeated random-destination pick.
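The roaming behavior could be sketched as below, assuming the floor mesh has already been baked into a runtime nav mesh (for example with the AI Navigation package's `NavMeshSurface.BuildNavMesh()`); the component name and radius are illustrative:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical roaming sketch: each wolf repeatedly picks a random reachable
// point on the runtime nav mesh. No state machine, just a wander loop.
[RequireComponent(typeof(NavMeshAgent))]
public class WolfRoamer : MonoBehaviour
{
    [SerializeField] private float roamRadius = 3f;
    private NavMeshAgent agent;

    private void Start() => agent = GetComponent<NavMeshAgent>();

    private void Update()
    {
        // When the current path is finished, choose a new destination.
        if (!agent.pathPending && agent.remainingDistance <= agent.stoppingDistance)
        {
            var candidate = transform.position + Random.insideUnitSphere * roamRadius;

            // Snap the candidate onto the floor nav mesh so every target is walkable.
            if (NavMesh.SamplePosition(candidate, out var hit, roamRadius, NavMesh.AllAreas))
                agent.SetDestination(hit.position);
        }
    }
}
```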

Near Interaction and Constraints

Added near-interaction handlers and manipulation constraints so users can pick up and move wolves around the environment naturally with their hands, rather than with the far-interaction ray.
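With MRTK2's standard components, this setup could look roughly like the sketch below; the specific constraint choice (yaw-only rotation to keep the wolf upright) is an assumption about the project:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Hypothetical setup sketch using MRTK2's built-in manipulation components.
public class GrabbableWolfSetup : MonoBehaviour
{
    private void Awake()
    {
        // Lets articulated-hand grabs (near interaction) target this collider.
        gameObject.AddComponent<NearInteractionGrabbable>();

        // Enables one- or two-handed manipulation with direct hand input.
        gameObject.AddComponent<ObjectManipulator>();

        // Keep the wolf upright while carried: constrain pitch and roll, allow yaw.
        var rotation = gameObject.AddComponent<RotationAxisConstraint>();
        rotation.ConstraintOnRotation = AxisFlags.XAxis | AxisFlags.ZAxis;
    }
}
```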

Recognition

Received the 1st place award for the innovative integration of spatial mapping, procedural navigation, and intuitive MR interaction.