GPT3 NPCs

I got access to OpenAI’s GPT3 API in February and decided to use it to expand on the VR chatbot experiment I made a few years ago. I made a video showing how it could be used in Modbox to let players make games with their own created AI:

Some great coverage from Rock Paper Shotgun: “A glimpse into the future of being yelled at by videogame AI.”

I also did an interview with TechRaptor about the possibilities of AI language models in games (including AAA games).

For this test I just used the GPT3 ‘davinci’ model, which is trained on a huge range of internet sources. The problem with that is if I ask the chatbot who ‘George Bush’ is, it responds accurately, which wouldn’t make sense in a fantasy game setting. Ideally the AI would instead be trained on material specific to the game and its setting. For the next Elder Scrolls, Bethesda could hire writers to produce massive amounts of backstory / lore, all used to train the game’s NPCs, letting players ask questions and learn about the world (provided they ask the right questions).
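To make that concrete, here’s a minimal sketch of the kind of prompt priming I’m describing, using the older OpenAI Python completions API with the ‘davinci’ engine. The NPC name, town, example dialogue, and sampling parameters are all invented for illustration, not what was actually used in the video:

```python
import openai

openai.api_key = "YOUR_API_KEY"

# Prime the completion with an in-world persona and a bit of example
# dialogue so 'davinci' stays in character instead of answering from
# its general internet training. All names here are made up.
PROMPT_TEMPLATE = """The following is a conversation with Mira, a blacksmith
in the fantasy town of Emberfall. Mira knows nothing about the real world
and only talks about her town, her forge, and local rumors.

Player: Who rules Emberfall?
Mira: Lord Cassian has held the keep for twenty years, though folk grumble about his taxes.
Player: {question}
Mira:"""

def ask_npc(question: str) -> str:
    response = openai.Completion.create(
        engine="davinci",          # base GPT3 model, no fine-tuning
        prompt=PROMPT_TEMPLATE.format(question=question),
        max_tokens=60,
        temperature=0.7,
        stop=["Player:", "\n\n"],  # stop before the model writes the player's next line
    )
    return response.choices[0].text.strip()

# Asked about a real-world figure, the priming nudges the model to
# answer in character rather than recite facts from its training data.
print(ask_npc("Who is George Bush?"))
```

With only a prompt like this there’s no guarantee the model won’t fall back on real-world knowledge; actually training on game-specific lore, as described above, is what would make the NPC reliably in-world.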

Unfortunately I am still in the process of figuring out how I can release this, if I can at all. After the video was made, OpenAI added GPT3 chatbot guidelines disallowing basically anything like this. I was in communication with them beforehand, but they later decided that any sort of free-form chatbot was ‘dangerous’.
They only allow GPT3 chatbots that speak about specific subjects (I guess by having a predictable prompt) and never say anything bad/negative (and we have to keep logs to prove this is the case). Their reasoning was literally a ‘what if’: what if the chatbot “advised on who to vote for in the election”? As if a chatbot in the context of a video game saying who to vote for were somehow dangerous.
I can understand reasons to keep GPT3 private, like its possible use in deception or spam. But they are so scared of their chatbot saying something bad, and of the PR fallout from that, that they’ve removed the possibility of doing anything useful with it. Context should be taken into account: a clearly labeled chatbot in a video game is different from a Twitter bot designed to spam or deceive. It’ll only be a matter of time before competitors can do what GPT3 can anyway.