AR Prototyping 3 – World Editing, Scaling, Mesh Scanning, Virtual Window, Vive Tracker

I’ve been continuing my AR prototyping with some more quick Twitter videos. Sometimes it’s to show how Modbox could be used to build in AR; other times it’s just showing a specific UI idea or feature.

Room Scaling

I found when building in AR I often wanted to edit things in my house without having to physically move over there (particularly since my VR tracked space is currently limited). So it was essential to be able to translate the entire space, similar to how most VR editors work (using the controllers’ grips to zoom/pan). I had this working before when I made my initial Drone AR game in Modbox, but it was difficult to keep a spatial reference of where the editing was taking place (for that video I just displayed where the Vive’s ‘play area’ was in relation). To keep a spatial reference when AR editing, Modbox can now show a scanned mesh of the room:


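The grip interaction itself is simple: while both grips are held, pan the world by the hands’ midpoint delta and scale it by the change in hand separation. A minimal Unity sketch of the idea (the worldRoot transform and the input helpers are my placeholders here, not Modbox’s actual code):

```csharp
using UnityEngine;

// Sketch only: 'worldRoot' is a placeholder parent transform for all
// editable content, and the Hand/Grip input helpers are stubs to be
// replaced with real SteamVR/OpenXR queries.
public class WorldGrabManipulator : MonoBehaviour
{
    public Transform worldRoot;

    Vector3 lastMidpoint;
    float lastSeparation;

    void Update()
    {
        if (GripHeld(Hand.Left) && GripHeld(Hand.Right))
        {
            Vector3 left = HandPosition(Hand.Left);
            Vector3 right = HandPosition(Hand.Right);
            Vector3 midpoint = (left + right) * 0.5f;
            float separation = Vector3.Distance(left, right);

            if (lastSeparation > 0f)
            {
                // Pan: drag the world along with the hands' midpoint.
                worldRoot.position += midpoint - lastMidpoint;

                // Zoom: scale the world around the midpoint by the change in
                // hand separation, so the point between the hands stays put.
                ScaleAround(worldRoot, midpoint, separation / lastSeparation);
            }
            lastMidpoint = midpoint;
            lastSeparation = separation;
        }
        else
        {
            lastSeparation = 0f; // reset when either grip is released
        }
    }

    static void ScaleAround(Transform t, Vector3 pivot, float scale)
    {
        t.localScale *= scale;
        t.position = pivot + (t.position - pivot) * scale;
    }

    enum Hand { Left, Right }
    bool GripHeld(Hand hand) { return false; /* stub: query the VR SDK */ }
    Vector3 HandPosition(Hand hand) { return Vector3.zero; /* stub */ }
}
```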
The mesh in the clip above was generated in the headset before playing. You can see a lot of issues with the result (spatial mapping on the ZED Mini isn’t nearly as accurate as taking the time for a photogrammetry scan, but it is quick to do). I think the visuals of the scanned mesh could be more abstract in the future (faded or textureless).
Mesh scanning of the environment is going to be essential for AR headsets (for determining occlusion and collision, and maybe for generating environmental cubemaps), so I think having a 3D representation of the environment like this should be easy to do and incredibly useful for AR building apps.
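To illustrate the occlusion and collision uses: once a scan is in the scene as a regular Unity mesh, wiring it up could be as simple as the sketch below (the depth-only occlusion material is an assumed shader that writes depth but no color, e.g. a pass with ColorMask 0):

```csharp
using UnityEngine;

// Rough sketch of putting a scanned room mesh to work: a MeshCollider so
// virtual objects collide with real furniture, plus a depth-only material
// so real geometry occludes virtual objects behind it.
public static class ScannedMeshSetup
{
    public static void Apply(GameObject scannedRoom, Material occlusionMaterial)
    {
        foreach (var filter in scannedRoom.GetComponentsInChildren<MeshFilter>())
        {
            // Collision: let physics objects rest on / bounce off the scan.
            var collider = filter.gameObject.AddComponent<MeshCollider>();
            collider.sharedMesh = filter.sharedMesh;

            // Occlusion: render the scan into the depth buffer only, hiding
            // virtual objects that sit behind real-world geometry.
            var renderer = filter.GetComponent<MeshRenderer>();
            if (renderer != null)
                renderer.sharedMaterial = occlusionMaterial;
        }
    }
}
```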

Some more experimenting with real-time mesh scanning:


And into VR:


It’s likely the next generation of VR headsets will have this sort of environmental awareness, which will be useful for a lot of applications.
For Modbox I’d love to allow VR players to scan their environments to build off of in-game, or scan a real-life object to use as an in-game entity.
But I think the best use case is just making the VR player feel less separated from the real world. If someone approaching me could be shown in abstracted visuals while I’m playing, I wouldn’t be so concerned about flailing my arms around wildly in VR games. I also wouldn’t feel so disconnected from reality, which is partly why I am more interested in AR than VR at the moment.
Proper scanning of the environment from the headset would also be a great way to generate the VR play area / boundaries, replacing the current awkward setup process for the Vive Chaperone / Oculus Guardian systems.
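As a rough, speculative illustration of how that could work: flatten the scanned mesh onto the floor plane and take the convex hull of the occupied area as the boundary outline. A real implementation would need to be much smarter about furniture, noise, and concave rooms:

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Speculative sketch: derive a play boundary outline from a room scan by
// flattening the mesh's vertices (assumed to already be in world space)
// onto the floor and taking their 2D convex hull.
public static class BoundaryFromScan
{
    public static List<Vector2> FloorOutline(Mesh scan, float floorY, float maxHeight = 2f)
    {
        // Keep vertices between floor height and roughly head height,
        // flattened to the XZ plane.
        var points = scan.vertices
            .Where(v => v.y > floorY && v.y < floorY + maxHeight)
            .Select(v => new Vector2(v.x, v.z))
            .ToList();
        return ConvexHull(points);
    }

    // Andrew's monotone-chain convex hull, counter-clockwise output.
    static List<Vector2> ConvexHull(List<Vector2> pts)
    {
        if (pts.Count < 3) return pts;
        pts = pts.OrderBy(p => p.x).ThenBy(p => p.y).ToList();
        var hull = new List<Vector2>();
        for (int pass = 0; pass < 2; pass++)
        {
            int start = hull.Count;
            foreach (var p in pts)
            {
                // Pop points that would make a clockwise turn.
                while (hull.Count >= start + 2 &&
                       Cross(hull[hull.Count - 2], hull[hull.Count - 1], p) <= 0f)
                    hull.RemoveAt(hull.Count - 1);
                hull.Add(p);
            }
            hull.RemoveAt(hull.Count - 1); // endpoint repeats as the next pass's start
            pts.Reverse();                 // second pass walks back for the upper hull
        }
        return hull;
    }

    static float Cross(Vector2 o, Vector2 a, Vector2 b) =>
        (a.x - o.x) * (b.y - o.y) - (a.y - o.y) * (b.x - o.x);
}
```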

Virtual Window with tracking

I really enjoyed messing around with the virtual window test I did before, but it was pretty inaccurate and required manual camera placement (just positioning a Unity camera as close as possible to my monitor’s position).
Using a project called PictureWindow I was able to set the monitor position exactly in-game, and also have the perspective change to match the headset:


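The core of the effect is an off-axis projection: treat the monitor as a window into the scene and rebuild the camera frustum every frame from the tracked head position through the screen’s physical corners, following Kooima’s ‘Generalized Perspective Projection’. A sketch of that math in Unity (the corner transforms are hypothetical markers you’d align to the real monitor, not PictureWindow’s actual API):

```csharp
using UnityEngine;

// Head-tracked "virtual window" sketch: each frame, render the scene from
// the tracked eye position with a skewed frustum whose near plane passes
// through the physical screen's corners.
[RequireComponent(typeof(Camera))]
public class VirtualWindowCamera : MonoBehaviour
{
    public Transform lowerLeft, lowerRight, upperLeft; // physical screen corners
    public Transform head;                             // tracked head/eye (e.g. the HMD)

    void LateUpdate()
    {
        Camera cam = GetComponent<Camera>();
        Vector3 pa = lowerLeft.position, pb = lowerRight.position, pc = upperLeft.position;
        Vector3 pe = head.position;

        // Screen-space basis: right, up, and normal pointing back at the viewer.
        Vector3 vr = (pb - pa).normalized;
        Vector3 vu = (pc - pa).normalized;
        Vector3 vn = Vector3.Cross(vu, vr).normalized;

        float d = -Vector3.Dot(pa - pe, vn);   // eye-to-screen-plane distance
        float n = cam.nearClipPlane, f = cam.farClipPlane;

        // Off-axis frustum extents, projected onto the near plane.
        float l = Vector3.Dot(vr, pa - pe) * n / d;
        float r = Vector3.Dot(vr, pb - pe) * n / d;
        float b = Vector3.Dot(vu, pa - pe) * n / d;
        float t = Vector3.Dot(vu, pc - pe) * n / d;

        // Render from the eye, facing the screen, with the skewed frustum.
        cam.transform.position = pe;
        cam.transform.rotation = Quaternion.LookRotation(-vn, vu);
        cam.projectionMatrix = Matrix4x4.Frustum(l, r, b, t, n, f);
    }
}
```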
I loved this effect, and have a lot of ideas to try out with it in the future. It’s especially useful right now given the limited resolution/FOV of AR headsets (your monitor becomes a nice high-resolution preview of the virtual space). There are also a few game modes I’d like to try with this that don’t require a headset (but would still need the player’s head/eye position to be tracked).

Modbox AR Editing

Incorporating Vive Tracker into Modbox:


Modbox has a ‘Vive Tracker’ entity that you can connect to other entities in edit mode and then use to control them in play mode, which is a great/easy way to create custom toys/objects to control.
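Mechanically, connecting is just capturing the entity’s pose relative to the tracker and re-applying it every frame. A minimal sketch of that idea, assuming the tracker transform is already driven by the VR SDK (e.g. SteamVR’s SteamVR_TrackedObject component); this isn’t Modbox’s actual entity code:

```csharp
using UnityEngine;

// Sketch of a tracker-attached entity: record the entity's pose relative
// to the tracker when connected in edit mode, then follow the physical
// tracker with that same offset in play mode.
public class TrackerAttachedEntity : MonoBehaviour
{
    public Transform trackerTransform; // driven by the VR SDK

    Vector3 localOffsetPos;
    Quaternion localOffsetRot;

    // Called once when the entity is connected in edit mode.
    public void Connect()
    {
        localOffsetPos = trackerTransform.InverseTransformPoint(transform.position);
        localOffsetRot = Quaternion.Inverse(trackerTransform.rotation) * transform.rotation;
    }

    // In play mode, follow the tracker while keeping the captured offset.
    void Update()
    {
        transform.position = trackerTransform.TransformPoint(localOffsetPos);
        transform.rotation = trackerTransform.rotation * localOffsetRot;
    }
}
```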

And a quick vid to show how mesh editing works in Modbox AR mode:


I’ll be looking into how this radial interface could translate to hands (rather than using the controller trackpad), but I’m not sure if Leap Motion or its alternatives have accurate enough tracking for that yet.
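For reference, the trackpad side of the radial interface is just bucketing the touch position by angle, and the same math would work for a fingertip projected onto a plane if hand tracking gets accurate enough. A minimal sketch (the names are mine, not Modbox’s):

```csharp
using UnityEngine;

// Radial menu selection: convert a 2D touch position to an angle and
// bucket it into one of N slices. 'touch' is the trackpad coordinate
// in [-1, 1] from whatever input system is in use.
public static class RadialMenu
{
    public static int SelectedSlice(Vector2 touch, int sliceCount, float deadzone = 0.3f)
    {
        if (touch.magnitude < deadzone)
            return -1; // center touch: nothing selected

        // Angle in degrees, 0 at 'up', increasing clockwise.
        float angle = Mathf.Atan2(touch.x, touch.y) * Mathf.Rad2Deg;
        if (angle < 0f) angle += 360f;

        return Mathf.FloorToInt(angle / (360f / sliceCount)) % sliceCount;
    }
}
```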