The goal of this project is to (re)develop control software for a 6-DOF Kinova arm so that it supports at least two adaptive control use cases: control via an eye-tracker for a client with severe spasms, and intuitive real-time remote puppeteering of a care-robot prototype. Both cases require a well-thought-out control mapping (Cartesian space or joint space?), real-time performance, and a strong safety layer (internal/external collision detection).
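The Cartesian-vs-joint-space question can be illustrated with a minimal differential inverse-kinematics sketch. Everything here is an assumption for illustration: the Jacobian is for a hypothetical planar 2-link arm (not the Kinova's real 6-DOF kinematics), and the velocity clamp is only a crude stand-in for a real safety layer.

```python
import numpy as np

def jacobian_2link(q, l1=1.0, l2=1.0):
    # Analytic Jacobian of a hypothetical planar 2-link arm; an
    # illustrative stand-in for the real 6-DOF Kinova Jacobian.
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def cartesian_to_joint_vel(q, xdot, vmax=0.5):
    # Map a desired end-effector (Cartesian) velocity to joint
    # velocities via the pseudoinverse; clamp joint speeds as a
    # placeholder for a proper safety/collision layer.
    qdot = np.linalg.pinv(jacobian_2link(q)) @ xdot
    return np.clip(qdot, -vmax, vmax)

q = np.array([0.3, 0.5])      # current joint angles [rad]
xdot = np.array([0.1, 0.0])   # desired tip velocity [m/s]
qdot = cartesian_to_joint_vel(q, xdot)
```

The appeal of a Cartesian mapping for both use cases is that the user (or puppeteer) thinks in "move the hand left/forward" terms, while the joint-space solution stays hidden behind the mapping.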
The project is part of the Ability Technology Lab's endeavour to develop new smart control/interface mapping tools for existing technology (such as wheelchairs or, indeed, this robot arm) - see https://abilitytech.nl.
However, before the robot arm is tested with real users in a real use case, we will subject it to a phase of theatrical prototyping - a method being developed in the recently granted NWO project 'Dramaturgy for Devices': https://performingrobots.sites.uu.nl/2023/07/17/announcement-dramaturgy-for-devices/