Archive for the ‘Modbox’ Category

VR Chatbots and interactable NPCs

February 27th, 2017

The latest Modbox update includes the start of the NPC system – allowing Modbox creators to add and edit human characters.

One idea I wanted to try was having the user speak to the NPCs and have them respond. I had already added voice commands to edit mode (and found speaking in VR felt pretty natural), so it seemed easy to add a few voice commands you could give to NPCs and have them answer with preset responses. Then I decided that rather than specific commands I would let the user talk freely, using voice dictation APIs instead of voice commands – and rather than preset replies, I could hook the NPC AI up to a chatbot service and use text-to-speech for the responses.

Surprisingly, the hardest part of this was the speech-to-text. Unity has a DictationRecognizer, which uses the Windows speech tools, but due to a Windows update it has been broken on 64-bit for half a year, and based on the Unity forums it apparently isn’t going to be fixed. So I had to have Modbox create a new 32-bit process, then use the clipboard to send the text data back to the 64-bit application (using the clipboard for this might be the hackiest thing I’ve done in Modbox, but after spending two hours trying to get IPC working with Unity I opted for the quick solution).
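The clipboard handoff pattern is roughly this: the 32-bit helper tags each recognized phrase with a sequence number before writing it to the clipboard, and the main app polls and ignores anything it has already seen. Here is a minimal sketch in Python with the actual clipboard read/write calls injected as functions (a real Windows build would use the Win32 clipboard API; the `STT::` prefix and message format are my own illustrative choices, not Modbox’s actual protocol):

```python
class ClipboardBridge:
    """One-way text channel from a helper process to the main app via the
    system clipboard. read_fn/write_fn stand in for real clipboard calls;
    they are injected so the pattern can run anywhere. Hypothetical sketch."""

    PREFIX = "STT::"  # tag so unrelated clipboard contents are ignored

    def __init__(self, read_fn, write_fn):
        self._read = read_fn
        self._write = write_fn
        self._seq_seen = -1

    def send(self, seq, text):
        # Helper (32-bit) side: publish recognized text with a sequence
        # number so the reader can tell a new result from a stale one.
        self._write(f"{self.PREFIX}{seq}::{text}")

    def poll(self):
        # Main (64-bit) side: return newly dictated text, or None.
        data = self._read()
        if not data or not data.startswith(self.PREFIX):
            return None
        _, seq_str, text = data.split("::", 2)
        seq = int(seq_str)
        if seq <= self._seq_seen:
            return None
        self._seq_seen = seq
        return text
```

The sequence number matters because the clipboard has no change notification in this scheme – without it, a polling loop would keep re-delivering the same phrase.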
For the text-to-speech I was expecting to just use the old Windows tools – the horrible robot voice everyone played with 15 years ago. I ended up trying out Amazon Polly – and while getting the API to work with Unity was a giant timesink, the results were amazing. I’m really hopeful these voice APIs will expand with more options, like emotion selection. To make the lip sync work I used Oculus’ lip sync tools – I just needed to manually set which phonemes map to which mouth blendshapes on the Morph3d models.
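The Polly request itself is simple – you send text and get back an audio stream to play. A minimal Python sketch of the request shape (the Unity side would make the same call through the AWS .NET SDK or REST API; the client is passed in so this can be tested without AWS credentials, and the “Joanna” voice id is just one of Polly’s stock voices, not necessarily what Modbox uses):

```python
def speak(polly_client, text, voice_id="Joanna"):
    """Request speech audio from Amazon Polly and return the raw bytes.
    polly_client would normally be boto3.client("polly"); the voice id
    is an illustrative default. Sketch only, not Modbox's actual code."""
    resp = polly_client.synthesize_speech(
        Text=text,
        OutputFormat="mp3",   # the game would decode/stream this clip
        VoiceId=voice_id,
    )
    # AudioStream is a streaming body; read it fully for a one-shot clip.
    return resp["AudioStream"].read()
```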

For multiple characters I then added a start-up command: you say ‘Hello [Name]’ to a character to have them start listening. This is shown in the Modbox update video.
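Routing the wake phrase to the right character amounts to parsing the dictated text for ‘Hello’ plus a known NPC name. A sketch of that matching, with the pattern and name handling as my own assumptions rather than Modbox’s real logic:

```python
import re

def wake_target(utterance, npc_names):
    """Return the NPC addressed by a 'Hello <name>' utterance, or None.
    Matching rules here are illustrative, not Modbox's actual ones."""
    m = re.match(r"\s*hello[,!]?\s+(\w+)", utterance, re.IGNORECASE)
    if not m:
        return None
    spoken = m.group(1).lower()
    for name in npc_names:
        if name.lower() == spoken:
            return name  # this character starts listening
    return None
```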

I have no idea how I’ll end up using this for the future of Modbox, if at all, but it was a fun experiment. The problem with how it currently uses chatbots is that there’s zero connection to gameplay – the current API just gives a response based on what other people have said to it (I used Cleverbot, which can give pretty hilarious responses). I imagine there are chatbot APIs out now that I could program to understand the user’s intent (mostly used by horrible AI help systems on websites, and by messaging startups), but that would be a lot more effort than I was willing to put in for now – plus I didn’t have gameplay in mind yet for what I would do with these guys. Just giving them commands to pick up objects and interact with the world would be great, but that’ll have to come after I give the NPC AI a much better understanding of the environment.
Another, easier approach that I hope future VR games take is to just show the user a few voice command options: rather than the traditional dialogue selection system of RPGs, have the user speak the lines. Recognizing preset voice commands is a lot easier and less error-prone than doing full speech-to-text.
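Even with full dictation, mapping a (possibly misheard) transcript onto a small set of on-screen options is forgiving, because you only need the closest match above some similarity threshold. A sketch using Python’s standard-library fuzzy matcher (the command list and 0.6 cutoff are illustrative assumptions):

```python
import difflib

def match_command(transcript, commands, cutoff=0.6):
    """Map a dictated transcript onto one of a few preset voice commands;
    return None if nothing is close enough. Illustrative sketch."""
    hits = difflib.get_close_matches(
        transcript.lower().strip(),
        [c.lower() for c in commands],
        n=1, cutoff=cutoff,
    )
    if not hits:
        return None
    # Return the command in its original casing.
    for c in commands:
        if c.lower() == hits[0]:
            return c
```

A dedicated command-grammar recognizer (like Unity’s KeywordRecognizer) does this at the audio level and is more robust still; the sketch just shows why a small option set makes recognition errors cheap to absorb.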

Categories: Modbox, VR

VR Experiments

April 27th, 2016

While giving demos of MaximumVR, I had a few people mention that to really feel like a Godzilla monster they wanted to stomp around the city.
So to add feet, Kate took some old Vive DK1 controllers and attached them to rubber boots:

We had to use the old Vive DK1 controllers rather than the Pre/consumer versions, since those are limited to two tracked controllers (they connect through the headset, while the old ones each have a separate connection).
We originally tried strapping the controllers directly onto our feet, but then we couldn’t tell exactly where the ground was – the reported position changed depending on where on the foot each controller was strapped, and the strap angle would have needed to be perfect. The giant rubber boots also worked well because players just naturally felt ridiculous wearing them.

There’s no way for players to actually try this yet – but hopefully full body tracking will eventually be a thing!

One of the main Vive experiments I’d wanted to try since getting the first dev kit was interacting with fluids and soft bodies. I wasn’t sure of the general game idea, but I knew that just interacting with non-rigid bodies would be incredibly satisfying.

To do this I had to grab the Flex branch of Unreal and compile it, then add standard motion controllers. My plan was to make a pancake-simulator game (since the most satisfying liquid was one with a lot of viscosity, and pouring it and then moving it around in a pan was fun). I knew the Flex liquid simulation was pretty limiting game-wise (no collision event data, no way to serialize, and properties can only be changed for the entire particle group), but just messing with the liquid in any way would be enough of a game for me.

But I got tired of dealing with the Unreal engine. I’m glad I took a weekend to learn Blueprints and get a C++ refresher, but the time it would take to go further with this project wouldn’t be worth it.

Categories: Modbox, VR