One of the key characteristics of the AMEE is its ability to read individual people, including their emotions. Early in this blog's history, I mentioned the epidemic of loneliness and depression afflicting much of the industrialized world. This is something I hope the robot will, if not cure, then at least reduce.
So, what do I mean by recognize people’s emotions?
When you watch 'Red Planet', you can see how AMEE recognizes its owners and is friendly and submissive. Later, after being damaged, it shows its aggressive/protective side when threatened. Of course, it was designed as a military robot, so we won't go that far 😉 but still, how cool is that robot?
For our purposes, it starts with basic emotions. Certain mannerisms and facial expressions are associated with laughter, anger, and so on. The robot should constantly analyze its owner's face to determine their emotional state. If it determines the owner has been sad for too long, it can try to nudge them to change their state of mind. It can also mimic the owner when they are happy, or become more protective when the owner is frightened by an outsider. Additionally, the robot will be smart enough to know that keeping its owner moving is key to mental and physical health, so it will nudge them periodically to get physical, like going for a walk.
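As a rough sketch of that decision layer, here is what the nudging logic might look like. It assumes a separate face-analysis model (not shown, and entirely hypothetical) is supplying emotion labels like `"sad"` or `"happy"` at regular intervals; the class names and thresholds below are my own illustrative choices, not part of any existing library:

```python
class EmotionMonitor:
    """Tracks the owner's detected emotional state over time and decides
    when the robot should act. Emotion labels are assumed to come from a
    separate face-analysis model (hypothetical here)."""

    def __init__(self, sad_threshold_s=600, activity_interval_s=3600):
        self.sad_threshold_s = sad_threshold_s          # how long "sad" persists before a nudge
        self.activity_interval_s = activity_interval_s  # how often to suggest movement
        self._sad_since = None
        self._last_activity_nudge = 0.0

    def update(self, emotion, now):
        """Feed one emotion reading (e.g. 'sad', 'happy', 'afraid') with a
        timestamp in seconds. Returns a list of actions for the robot."""
        actions = []
        if emotion == "sad":
            if self._sad_since is None:
                self._sad_since = now                   # start the "sad" clock
            elif now - self._sad_since >= self.sad_threshold_s:
                actions.append("nudge_mood")            # try to change the owner's state of mind
                self._sad_since = now                   # reset so we don't nag continuously
        else:
            self._sad_since = None
        if emotion == "happy":
            actions.append("mimic_owner")               # mirror the owner's happiness
        if emotion == "afraid":
            actions.append("protective_stance")         # become more protective
        if now - self._last_activity_nudge >= self.activity_interval_s:
            actions.append("suggest_walk")              # keep the owner moving
            self._last_activity_nudge = now
        return actions
```

The point of separating this from the perception side is that the personality — how patient, how protective, how nagging the robot is — lives entirely in a few thresholds we control, regardless of which emotion-recognition model ends up underneath.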
There is currently no open-source software for this, and purchase options are limited, which may make custom-designed software mandatory. The upside of building our own emotion software is that we can shape AMEE's personality however we like. Many labs are currently experimenting in this area. Obviously, it doesn't need to be as good as the AMEE in the movie right off the bat; my target for version 1 is 50% of that.