unity-game-engine, hololens, windows-mixed-reality, mrtk

Procedurally generating objects on surfaces in spatial mapping to improve immersion


I'm developing a game in which, to help with immersion, I'd like to place tufts of grass randomly on some flat horizontal surfaces. I'm currently struggling to work out whether there is an easy way to do this using the Mixed Reality Toolkit in Unity, or whether I'll need to go lower level and work with the HoloLens's built-in surface mapping that generates the triangles, which will obviously take significantly longer to implement.

I've taken a close look at the Spatial Mapping component of the MRTK, as that appears to be the part I want, but by the looks of it it is just watching a mesh provided by the HoloLens for updates, much like importing a room model into an Object Surface Observer in Unity. There doesn't appear to be any iterative generation of triangles or interpretation of points, so I assume I'm looking in the wrong place. I've also considered using Spatial Understanding to create a floor surface, but that misses out on being able to spawn objects on tables or other higher-up surfaces.

For more clarity on the desired outcome in case anyone has a workaround (I don't have enough reputation to post the image inline): https://i.sstatic.net/H0ZoJ.png

Any guidance would be greatly appreciated!


Solution

  • You could take the normals of the spatial mapping mesh and see which ones point up.

    The spatial mapping surface is just a mesh generated at runtime. There's nothing special about it; it's a mesh like any other. It is also always placed on layer 31, which is named "Spatial Mapping". A minimal sketch of this approach follows below.
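
    Here is a rough sketch of that idea in Unity C#, assuming the spatial mapping meshes already exist in the scene on the "Spatial Mapping" layer and you have a grass prefab to assign in the inspector. The grassPrefab field, the dot-product threshold and the spawn chance are placeholder names and values for illustration, not part of the MRTK API; you would tune them (and probably limit how often this runs) for your own scene.

        using UnityEngine;

        // Sketch: scan spatial mapping meshes for upward-facing vertices and
        // scatter grass prefabs on a random subset of them.
        public class GrassScatterer : MonoBehaviour
        {
            public GameObject grassPrefab;        // assumed: your tuft-of-grass prefab
            public float upDotThreshold = 0.95f;  // how close to vertical a normal must be
            public float spawnChance = 0.02f;     // fraction of candidate vertices that get grass

            void Start()
            {
                // The spatial mapping mesh lives on layer 31, named "Spatial Mapping".
                int spatialLayer = LayerMask.NameToLayer("Spatial Mapping");

                foreach (MeshFilter filter in FindObjectsOfType<MeshFilter>())
                {
                    if (filter.gameObject.layer != spatialLayer)
                        continue;

                    Mesh mesh = filter.sharedMesh;
                    if (mesh == null)
                        continue;

                    Vector3[] vertices = mesh.vertices;
                    Vector3[] normals = mesh.normals;

                    for (int i = 0; i < vertices.Length && i < normals.Length; i++)
                    {
                        // Transform the normal into world space and keep only
                        // vertices whose normal points (nearly) straight up.
                        Vector3 worldNormal = filter.transform.TransformDirection(normals[i]).normalized;
                        if (Vector3.Dot(worldNormal, Vector3.up) < upDotThreshold)
                            continue;

                        // Thin out the spawns so the surface isn't carpeted.
                        if (Random.value > spawnChance)
                            continue;

                        Vector3 worldPos = filter.transform.TransformPoint(vertices[i]);
                        Instantiate(grassPrefab, worldPos, Quaternion.identity);
                    }
                }
            }
        }

    Vertex positions only approximate the surface, so for denser or more even coverage you could instead pick random points on the upward-facing triangles, or raycast downwards against the spatial mapping layer to find valid spots; the vertex-based version above is just the simplest starting point.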