Offloading cognitive load for expressive behaviour: small-scale HMMM with the help of smart sensors

Social robotics is a field of robotics in which the robot interacts and communicates with physical agents by following social cues and norms. Several robots currently achieve this using HMMM (Heterogeneous Multi-Modal Mixing), which combines multiple external inputs into one smooth behaviour. One of those robots is the EyePi. However, it remains very complex and expensive, making it unsuitable for rapid prototyping of simple social robots. Therefore, instead of the usual configuration in which a small computer such as a Raspberry Pi controls simple sensors, the opposite is attempted: smart sensors are tied together by a simple micro-controller.
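
To illustrate this inverted architecture, the following minimal Arduino-style sketch assumes a smart sensor that performs its own processing and pushes one-byte event codes over a serial line; the baud rate and event codes are illustrative assumptions, not the toolkit's actual protocol.

    // Minimal sketch of the inverted architecture: the micro-controller is a
    // thin coordinator reacting to events pre-processed by a smart sensor.
    // Event codes and wiring are illustrative assumptions.
    const byte EVENT_FACE_SEEN = 1;
    const byte EVENT_KEYWORD   = 2;

    void setup() {
      Serial.begin(9600);  // serial link to the smart sensor
    }

    void loop() {
      if (Serial.available() > 0) {
        byte event = Serial.read();  // heavy lifting done on the sensor side
        switch (event) {
          case EVENT_FACE_SEEN: /* start face-following behaviour  */ break;
          case EVENT_KEYWORD:   /* trigger speech-driven behaviour */ break;
          default:              /* ignore unknown events           */ break;
        }
      }
    }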

Here, a toolkit is created that consists of a collection of smart building blocks for speech recognition, computer vision and visual display, complemented by a set of programming scripts written in the Arduino IDE and the OpenMV IDE. The toolkit is tested by building a prototype of a minimal social robot based on the capabilities of one of the first social robots, Kismet, and one of the first chatbots, Eliza.
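
As an impression of what one such building block could look like, the sketch below drives an eye frame on a single 8x8 dot matrix, assuming a MAX7219 driver and the widely used LedControl Arduino library; the pin numbers and bitmap are illustrative assumptions rather than the toolkit's actual wiring.

    #include <LedControl.h>  // common Arduino library for MAX7219 matrices

    // Illustrative wiring: DIN=12, CLK=11, CS=10, one matrix on the chain.
    LedControl lc = LedControl(12, 11, 10, 1);

    // One 8x8 eye bitmap, one byte per row: an open eye with a centred pupil.
    const byte EYE_OPEN[8] = {
      B00111100, B01000010, B10000001, B10011001,
      B10011001, B10000001, B01000010, B00111100
    };

    void setup() {
      lc.shutdown(0, false);  // wake the driver from power-saving mode
      lc.setIntensity(0, 8);  // medium brightness (range 0-15)
      lc.clearDisplay(0);
      for (int row = 0; row < 8; row++) {
        lc.setRow(0, row, EYE_OPEN[row]);  // write the bitmap row by row
      }
    }

    void loop() {
      // A full building block would cycle through blink/emotion frames here.
    }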

The resulting prototype is capable of understanding 34 different expressions of emotions and feelings, divided into 6 categories. Furthermore, gazing, blinking and breathing motions are integrated, calling for HMMM traits to mix these effectively with the pan-tilt face-following algorithm. Each emotion is connected to a behaviour that affects the frequency and amplitude of the breathing movement. The emotions are then shown by displaying eye animations on two 8x8 dot matrices. The final result is a toolkit that allows simple and rapid prototyping of minimal social robots, whose building blocks can be re-used, swapped, omitted or extended with new ones.
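
One way such mixing can be realised is by additively blending an emotion-dependent breathing sinusoid with the face-following servo target, as in the minimal sketch below; it uses the standard Arduino Servo library, while the pin, gains and per-emotion parameters are illustrative assumptions.

    #include <Servo.h>

    Servo tiltServo;          // tilt axis of the pan-tilt unit

    // Illustrative per-emotion breathing parameters: amplitude in degrees,
    // frequency in Hz. In the prototype, each of the 6 emotion categories
    // would select its own pair of values.
    float breathAmp  = 3.0;   // e.g. calm: small, slow breaths
    float breathFreq = 0.25;

    float faceTarget = 90.0;  // tilt angle from face following; in a full
                              // build this is updated from the smart camera

    void setup() {
      tiltServo.attach(9);    // illustrative servo pin
    }

    void loop() {
      // HMMM-style additive mixing: the breathing oscillation rides on top
      // of the face-following target, so neither motion suppresses the other.
      float t = millis() / 1000.0;
      float breathing = breathAmp * sin(2.0 * PI * breathFreq * t);
      tiltServo.write(constrain(faceTarget + breathing, 0, 180));
      delay(20);              // roughly 50 Hz update rate
    }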