Prototyping future AR in VR

My goal with my VR game creation app Modbox is to eventually let it build AR experiences. The idea is that rather than making an AR game that procedurally fits into any environment (as most AR experiences currently attempt to do), Modbox would give you the tools to create AR experiences completely tailored to your environment: building AR inside AR. Unfortunately no AR devices are ready for this yet – mobile AR is terrible and pointless, while neither HoloLens nor Magic Leap has properly tracked 6DoF input.
So to begin prototyping AR I am using the ZED Mini by Stereolabs, attached to an HTC Vive headset:

[Image: ZED Mini mounted on an HTC Vive headset]

It has two cameras positioned roughly where my eyes are to allow for AR passthrough while in VR, with a latency of around 60ms. This means I can still use SteamVR controllers and their excellent tracking – and I don't have to run on limited mobile hardware. The ZED Mini Unity SDK comes with depth sensing to provide occlusion with the real world, and spatial mapping to mesh the environment (which I use for wall collision).
While the actual in-headset experience isn't really consumer-ready (due to the latency and the limited FOV), it does allow for making good proof-of-concept videos.

IoT Control

The first idea I had was to connect IoT smart home devices, to allow controlling them from inside AR and to have them respond to game events.
In Modbox I connected to the smart lights in my place through the Philips Hue API, at first just changing the color of the lights with a virtual color selector:

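For reference, here's a minimal sketch of the kind of Hue call involved, in Unity C#. The bridge IP and API username are placeholders (they come from the standard Hue bridge pairing step), and this is just the raw REST call – in Modbox the control is wired up through entities and visual scripting:

```csharp
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

// Minimal wrapper around the Philips Hue v1 REST API.
// BridgeIp and Username are placeholders from the usual bridge pairing step.
public class HueLight : MonoBehaviour
{
    const string BridgeIp = "192.168.1.2";
    const string Username = "YOUR-HUE-API-USERNAME";

    // Turn a light on or off.
    public void SetOn(int lightId, bool on) =>
        Send(lightId, $"{{\"on\":{(on ? "true" : "false")}}}");

    // Set color: hue is 0-65535, saturation and brightness are 0-254.
    public void SetColor(int lightId, int hue, int sat, int bri) =>
        Send(lightId, $"{{\"on\":true,\"hue\":{hue},\"sat\":{sat},\"bri\":{bri}}}");

    // PUT the JSON state change to the bridge (fire-and-forget).
    void Send(int lightId, string body)
    {
        string url = $"http://{BridgeIp}/api/{Username}/lights/{lightId}/state";
        var request = UnityWebRequest.Put(url, Encoding.UTF8.GetBytes(body));
        request.SetRequestHeader("Content-Type", "application/json");
        request.SendWebRequest();
    }
}
```
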
Then I wanted to test the idea of giving the lights an actual spatial location. My original plan was just to add some ‘light entities’ and point at them to turn them off and on, but then I thought it’d be fun to ‘shoot them out’ with a bow and arrow.
To do this, breakable wooden cubes are positioned in Modbox, and then, using the in-game visual scripting, each light is set to turn off when its cube breaks.
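
In code terms, that visual-script graph boils down to an event subscription. A hypothetical sketch of the same wiring (BreakableCube and onBroken are invented names – in Modbox this is done with visual scripting nodes, not C#; HueLight is the sketch above):

```csharp
using System;
using UnityEngine;

// Hypothetical stand-in for a breakable prop that raises an event when destroyed.
public class BreakableCube : MonoBehaviour
{
    public event Action onBroken;
    public void Break() { onBroken?.Invoke(); Destroy(gameObject); }
}

// When the cube standing in for a light breaks, switch the real light off.
public class LightOffOnBreak : MonoBehaviour
{
    public BreakableCube cube;
    public HueLight hue;     // Hue wrapper from the earlier sketch
    public int lightId = 1;

    void OnEnable()  { cube.onBroken += TurnOff; }
    void OnDisable() { cube.onBroken -= TurnOff; }

    void TurnOff() => hue.SetOn(lightId, false);
}
```
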
The obvious use of this for future headsets would be controlling the lights in your home by just pointing at them – but I can see a lot of applications for this in AR games, such as blinking the lights red when you're hit by an enemy, or fading all of the lights in/out to do a scene change.

Game Creation / Playing

The next thing I tried was an example of using Modbox to create a game in your environment – in this case a Space Pirate Trainer-style drone shooting game:

The room has been mesh-scanned, so the drones collide with the walls. Here I am setting up Modbox ‘spawner entities’ with drones and adding some basic wave controller scripts.
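
A basic wave controller like this is essentially a timed spawn loop. A rough sketch of the logic, with all names as placeholders (in Modbox this is built from spawner entities and visual scripting rather than written directly):

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Rough sketch of a wave controller: spawn a wave of drones, wait until
// the player has destroyed them all, then start a bigger wave.
public class WaveController : MonoBehaviour
{
    public GameObject dronePrefab;
    public Transform[] spawnPoints;
    public int dronesPerWave = 3;
    public float spawnInterval = 1.5f;

    readonly List<GameObject> alive = new List<GameObject>();

    IEnumerator Start()
    {
        while (true)
        {
            for (int i = 0; i < dronesPerWave; i++)
            {
                var point = spawnPoints[i % spawnPoints.Length];
                alive.Add(Instantiate(dronePrefab, point.position, point.rotation));
                yield return new WaitForSeconds(spawnInterval);
            }
            // Destroyed drones become null references in the list.
            yield return new WaitUntil(() => alive.TrueForAll(d => d == null));
            alive.Clear();
            dronesPerWave++; // ramp difficulty each wave
        }
    }
}
```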

You can see use of the ‘selector’ tool to position things out of reach. But there is also the option of using the grip controllers to translate the entire virtual space (seen in the second Twitter video).

This is something I think will be really useful for AR building – rather than having to move all around your house, you can just scale the world down to edit it. What’s needed for this is a better visualization of the environment’s scale (for now it just shows you where the initial play area was) – maybe an abstracted version of the scanned room.
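
For reference, the usual math behind grip-based world grabbing: treat the two controllers as handles and translate/scale the world root so it stays pinned to them. A minimal sketch of that technique (not Modbox’s actual implementation – gripsHeld would be set from controller input):

```csharp
using UnityEngine;

// Sketch of two-handed "grab the world" editing: while both grips are
// held, translate and scale the virtual-world root so the point between
// the hands stays fixed relative to them.
public class WorldGrab : MonoBehaviour
{
    public Transform worldRoot;        // parent of all virtual content
    public Transform leftHand, rightHand;
    public bool gripsHeld;             // set from controller input elsewhere

    Vector3 lastMid;
    float lastDist;

    void Update()
    {
        Vector3 mid = (leftHand.position + rightHand.position) * 0.5f;
        float dist = Vector3.Distance(leftHand.position, rightHand.position);

        if (gripsHeld && lastDist > 0f)
        {
            float scale = dist / lastDist;
            // Scale the world around the midpoint of the hands...
            worldRoot.position = mid + (worldRoot.position - mid) * scale;
            worldRoot.localScale *= scale;
            // ...then follow the hands' translation.
            worldRoot.position += mid - lastMid;
        }
        lastMid = mid;
        lastDist = dist;
    }
}
```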

As another test of game creation I added an NPC character and an RC car + controller:

And to try out multiplayer I wanted to test AR-to-VR play:

The VR player is in a completely virtual world, but can still play games with someone in an AR headset. Ideally Modbox will allow for complete cross-platform play, with online games between desktop/VR/AR/mobile players.

Virtual Window

For this test the virtual camera had to be manually placed to match the real-world monitor’s position. Ideally the headset could track a monitor’s position and size just by looking at it, and then render a camera’s view to display on that monitor. This could be a good way for headsets to display part of the virtual world in crisp high resolution before headset screens themselves are capable of that.
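
One standard way to make a monitor behave like a window is an off-axis (asymmetric-frustum) projection computed from the viewer’s head position and the monitor’s corners – the generalized perspective projection often attributed to Kooima. A sketch in Unity terms, where the corner transforms are placeholders that would come from calibration or, ideally, from the headset tracking the monitor itself:

```csharp
using UnityEngine;

// Sketch of a "virtual window" camera: an off-axis projection computed
// from the viewer's head position and the monitor's corners, so the
// monitor shows the virtual scene as if it were a window into it.
public class VirtualWindow : MonoBehaviour
{
    public Camera cam;          // renders the view shown on the monitor
    public Transform head;      // tracked HMD position
    public Transform bottomLeft, bottomRight, topLeft;  // monitor corners

    void LateUpdate()
    {
        Vector3 pa = bottomLeft.position, pb = bottomRight.position, pc = topLeft.position;
        Vector3 pe = head.position;

        Vector3 vr = (pb - pa).normalized;   // screen right
        Vector3 vu = (pc - pa).normalized;   // screen up
        Vector3 vn = Vector3.Cross(vr, vu);  // screen normal = camera forward (Unity is left-handed)

        Vector3 va = pa - pe, vb = pb - pe, vc = pc - pe;
        float d = Vector3.Dot(va, vn);       // eye-to-screen distance
        float n = cam.nearClipPlane, f = cam.farClipPlane;

        // Near-plane extents of the asymmetric frustum through the monitor.
        float l = Vector3.Dot(vr, va) * n / d;
        float r = Vector3.Dot(vr, vb) * n / d;
        float b = Vector3.Dot(vu, va) * n / d;
        float t = Vector3.Dot(vu, vc) * n / d;

        cam.transform.SetPositionAndRotation(pe, Quaternion.LookRotation(vn, vu));
        cam.projectionMatrix = Matrix4x4.Frustum(l, r, b, t, n, f);
    }
}
```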

Editing AR on Desktop

The best way to edit AR and create AR content is inside AR – otherwise, positioning things in space can be pretty difficult and unintuitive. But there are definitely tasks that are better done on a desktop screen than in AR – specifically any text reading or writing (headset resolution is too low for comfortable reading, and there’s no good input device for writing).

Here is an example of using a desktop interface to edit an AR environment, and seeing the resulting changes around you:

In the video I can switch between the keyboard and the AR input at any time. For Modbox I can see all the scripting being done on the desktop interface, while the level design is done in VR/AR.