Yesterday I kicked off a new AI project. For the last few weeks we've been getting a mimic bot up and running in SL. A mimic bot is one that learns purely by hearing what people say to it, re-saying those things later, and remembering the responses it gets. You end up with a very natural chatbot, but one that hasn't got the faintest idea what it is talking about. We think there'll be a great market for them in SL as drunks at the end of the bar! We also added some explicit learning, and the aim is to make them a simpler alternative to our Discourse-based chatbot.
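To make the idea concrete, here's a minimal sketch of that learning loop in Python. This is purely illustrative (the class and method names are my own, not our actual bot's code): the bot stores every utterance it hears, re-says them when it has nothing better, and remembers what people say back to it so that a familiar line can later be answered with a previously observed response.

```python
import random
from collections import defaultdict

class MimicBot:
    """Illustrative mimic bot: repeats things it has heard, and
    remembers the replies people give, so that next time someone
    says something familiar it can answer with a remembered reply."""

    def __init__(self):
        self.heard = []                     # every utterance overheard
        self.responses = defaultdict(list)  # utterance -> replies observed
        self.last_said = None               # what the bot said last

    def reply(self, utterance):
        # Whatever was just said to us is a response to our last line;
        # remember that pairing for future conversations.
        if self.last_said is not None:
            self.responses[self.last_said].append(utterance)
        self.heard.append(utterance)
        # If we've seen responses to this utterance before, reuse one;
        # otherwise just re-say something we've overheard.
        learned = self.responses.get(utterance)
        if learned:
            said = random.choice(learned)
        else:
            said = random.choice(self.heard)
        self.last_said = said
        return said
```

With nothing learned yet the bot can only parrot; after a few conversations it starts producing plausible-sounding turn-taking, despite understanding nothing, which is exactly the drunk-at-the-bar effect.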
But walking back from the BCS/RSI event with Rod Brooks (Director of CSAIL at MIT), I had a sudden brainwave about how this mimic technology could be used to do something I've been thinking about for ages, and do it incredibly elegantly, in a way that for the moment is only possible in a virtual world.
So yesterday I found the time to cut the code and put it into operation. It's a real struggle not to say more about it now, but I want to get a few months' data before making it public. I'm not saying it will revolutionise AI (it won't), but it might really open people's eyes to the possibilities.
***Imported from old blog***