When designing robots for social settings, it is hard to anticipate the social challenges a robot will encounter once deployed in real settings. Testing these robots in enacted social situations helps us explore preferred social robot behaviour. However, instead of using pre-programmed movements and expressions, we will use human-driven robots.
The human actor puppeteering these 'improvotype' robots needs full, intuitive control over the robot to act expressively. For this purpose, we need to design a system/interface that lets users choose their preferred controller and maps its input to the robot's (possibly many) actuators to create expressive movements.
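To make the mapping idea concrete, here is a minimal sketch of such a modular input-to-actuator mapping. It assumes a controller that yields normalized axis values in [-1, 1] and actuators that accept joint angles within per-joint limits; all names (`Mapper`, `ActuatorSpec`, the axis and joint labels) are hypothetical, not part of any existing system.

```python
from dataclasses import dataclass
from typing import Dict


@dataclass
class ActuatorSpec:
    """Joint limits for one actuator, in degrees (assumed interface)."""
    min_deg: float
    max_deg: float


def linear_map(value: float, spec: ActuatorSpec) -> float:
    """Map a normalized input in [-1, 1] linearly onto the joint range."""
    t = (value + 1.0) / 2.0  # rescale to [0, 1]
    return spec.min_deg + t * (spec.max_deg - spec.min_deg)


class Mapper:
    """Holds a re-mappable table from controller axes to actuators."""

    def __init__(self, actuators: Dict[str, ActuatorSpec]):
        self.actuators = actuators
        self.bindings: Dict[str, str] = {}  # axis name -> actuator name

    def bind(self, axis: str, actuator: str) -> None:
        # A remapping GUI would call this when the user changes a binding.
        self.bindings[axis] = actuator

    def apply(self, inputs: Dict[str, float]) -> Dict[str, float]:
        """Turn one frame of controller input into actuator targets."""
        return {
            self.bindings[axis]: linear_map(v, self.actuators[self.bindings[axis]])
            for axis, v in inputs.items()
            if axis in self.bindings
        }


# Example: two joints of a desk-lamp-style arm driven by two stick axes.
mapper = Mapper({"base_pan": ActuatorSpec(-90, 90),
                 "head_tilt": ActuatorSpec(0, 120)})
mapper.bind("left_stick_x", "base_pan")
mapper.bind("left_stick_y", "head_tilt")
print(mapper.apply({"left_stick_x": 0.0, "left_stick_y": 1.0}))
# -> {'base_pan': 0.0, 'head_tilt': 120.0}
```

Because the bindings table and the mapping function are both replaceable at runtime, the same core could serve different controllers (gamepad, MIDI board, motion capture) without touching the robot-side code.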
This system can then also be used to teach students robotic puppeteering, showing how a puppeteering control method can be applied to robots in a collaborative performance project. The course would include: identifying actuators, mapping inputs to actuators, and a performance project with a desk lamp (a contest for the 'most expressive desk lamp').
This creates the following challenge: how can we design a modular mapping system that allows non-technical students to intuitively and expressively control a desk-lamp robot using various control methods?
Objectives:
- Review current advanced control techniques used in expressive robotics
- Identify and develop suitable controllers with their mapping functions
- Develop and implement a GUI for users to (re-)map controllers to a simple 6-DOF robot
- Develop a short project to teach students robotic puppeteering and to evaluate usability and expressive capabilities
- Explore the ethics of attributing a human emotional range to a robot