(to repurpose a phrase from Tracy Kidder). A number of years ago I was the dev lead at a groundbreaking company called LifeF/X. We made fairly believable avatar heads – ones pretty far down into the uncanny valley – believable enough that you thought you were talking to video of a real person, albeit a person with some issues. We had many things in our favor: a low render target size (about 300×300 pixels for the face) plus some really advanced facial animation software – nothing like it exists today, unfortunately.
The lead scientist and facial animator behind that software was Dr. Mark Sagar. After LifeF/X spent all the VC money and folded, he went on to Sony and Weta to become their expert in facial animation. If you’ve seen Spiderman, King Kong, or Avatar, you’ve seen Mark’s work close up. He then returned to university research, where he’s been working on an extremely creepy (sorry, Mark!) AI/avatar project called Baby X.
I find it creepy because he’s creating a neural network that starts off with roughly a human baby’s level of understanding and learns through interaction. It’s all good, sound tech, but the fact that the model is based on a real baby’s interactions – his daughter’s, in fact – is a bit unnerving to me. The animation isn’t quite out of the uncanny valley, so it’s a bit creepy to watch, but the progress is real. Baby X is powered by an artificial brain with inputs layered in through an artificial nervous system, and it’s designed to plug in to other AI systems that may deliver higher-level thought.
I bring this up because he’s gotten the tech far enough along to attract US$7.5M in VC money for a spin-off company, Soul Machines. From the press release:
Soul Machines is a developer of intelligent, emotionally responsive avatars that augment and enrich the user experience for Artificial Intelligence (AI) platforms.
So here we see the first VC investment in a company creating AI designed for human interaction – think of your personal assistant (à la Siri, Cortana, Alexa, Google Now, etc.), but one that understands human emotions, has an emotional state of her own, and shows up as a human on your PDA, computer, or AR glasses – whenever you need her (or she needs you). You talk; she listens, understands, and responds. The future is getting closer.