SECURE is a Marie Skłodowska-Curie Action funded by the European Commission. Its aim is to train roboticists and research fellows on the cognitive and interaction levels of robot safety. These fellows should then be able to cope with the new safety challenges that come with the increased complexity of human work and living spaces. They also need to be familiar with safety concepts and solutions for a multitude of robotic platforms. The SECURE network therefore aims to train fellows on innovative scientific and technological requirements for safe human-robot interaction and will employ several of the most advanced robot platforms currently available in Europe. The fellows are trained at six partner institutions in Europe and are supported by another five associated partners, ranging from large international industrial companies to small enterprises, thus providing an optimal training environment for young researchers.
Our main goal is to create robots that analyze and track human behavior over time in the context of their surroundings (situational awareness), using audio-visual monitoring, in order to establish common ground and intention-reading capabilities. In BabyRobot we focus on two user populations: typically developing children and children on the autistic spectrum. Children have unique communication skills, are quick and adaptive learners, and are eager to embrace new robotic technologies. This is especially relevant for special education, where social skills develop late or never fully develop without intervention or therapy.
HRI-BioPsy is an abbreviation for the above project title; the project is conducted at the Adaptive Systems Research Group, part of the University of Hertfordshire. HRI-BioPsy aims to shed more light on the processes and factors that affect the "quality" or rapport of human-robot interaction. In the long run, insights from this kind of research may contribute to a more principled understanding of what makes human-centric interaction work and, conversely, which kinds or attributes of (robot) behaviour will most likely lead to an interactional breakdown. This project is a follow-up of the Motor Interference and Motor Coordination in Human-Human Interaction project.
KASPAR (Kinesics and Synchronisation in Personal Assistant Robotics) is a child-sized, minimally expressive robot developed by a team led by Prof. Kerstin Dautenhahn within the Adaptive Systems research group. Since 2005 the robot has been used extensively in different research projects, including Robotcub (http://www.robotcub.org/) and ROBOSKIN (http://www.roboskin.eu/), and in other research on cognitive and developmental robotics. A key use of the robot is robot-assisted play for children with autism. We found that children with autism generally respond very positively to interaction with the robot as a non-threatening, enjoyable, interactive toy that can be programmed to suit the therapeutic needs of different children on the autistic spectrum. Links to YouTube videos of KASPAR:
The Aurora project is a long-term project founded by Prof. Kerstin Dautenhahn with the aim of investigating the use of robots and other interactive technology in autism therapy. Our aim is not to replace human contact but to provide tools that can mediate between children and their social environment, as a stepping stone towards the development of communication and social interaction skills. The project involves PhD students and research staff and is both internally and externally funded. We have used and evaluated different robots in the past, including mobile as well as humanoid robots. We currently focus our research on the humanoid KASPAR robot, a minimally expressive robot that seems very promising for use in autism therapy (http://kaspar.stca.herts.ac.uk/). We also developed interactive software called TouchStory to teach children with autism about narrative. Current research in the project includes how to use KASPAR to teach children with autism collaborative skills (Josh Wainer's PhD project), investigating communication with and through KASPAR (Luke Wood's PhD project), and Dr. Ben Robins' work on KASPAR as a social mediator (part of the ROBOSKIN project, http://www.roboskin.eu/).
As part of the LIREC project (http://www.lirec.org/) a new mechanoid robot called Sunflower was developed, designed by Dr. Kheng Lee Koay. Sunflower is based on the “Pioneer” platform with the addition of a touch-screen user interface and diffuse LED display panels that provide expressive multi-coloured light signals to the user. Other expressive behaviours include sound, base movement, and movements of the robot’s neck. The non-verbal expressive behaviours were inspired by the expressive behaviour that dogs display in human-dog interaction in scenarios similar to those used in the Robot House, in collaboration with ELTE in Hungary (Dr. Ádám Miklósi’s group). The robot possesses some human-like features (a head, arms) but its overall design is non-humanoid. This design follows our previous research results showing that mechanoid (mechanical-looking) robots are well accepted by users with different individual preferences. For more pictures of Sunflower in the Robot House, see the picture gallery.
Within a universal agent-world interaction framework based on Information Theory and Causal Bayesian Networks, we demonstrate how every agent that needs to acquire information relevant to its strategy selection will automatically inject part of this information back into the environment. We introduce the concept of “Digested Information”, which both quantifies and explains this phenomenon. Based on the properties of digested information, especially the high density of relevant information in other agents’ actions, we outline how this could motivate the development of low-level social interaction mechanisms, such as the ability to detect other agents.
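The core effect can be illustrated with a toy simulation (purely illustrative; the scenario, the 10% sensor-noise level, and the helper function are our own assumptions, not the model from the project): an agent that acts on a relevant hidden variable makes that variable partly readable from its actions, which an observer can quantify as mutual information.

```python
import math
import random
from collections import Counter

random.seed(0)

def mutual_information(pairs):
    """Estimate I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) )
        mi += p * math.log2(c * n / (px[x] * py[y]))
    return mi

samples = []
for _ in range(100_000):
    world = random.randint(0, 1)                       # hidden relevant variable: food left/right
    sensed = world if random.random() < 0.9 else 1 - world  # noisy sensor (10% flip)
    action = sensed                                    # agent moves toward sensed food
    samples.append((world, action))

# Information about the world "digested" into the agent's actions,
# close to 1 - H(0.1) ≈ 0.531 bits (up to sampling noise).
print(round(mutual_information(samples), 3))
```

With a noise-free sensor the actions would carry the full 1 bit about the hidden variable; the observer never senses the world directly, only the other agent's behaviour.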
This project studies the information-theoretic properties of the perception-action loops of multiagent systems. In particular, information-theoretic quantities such as empowerment (the amount of perceivable control an agent has over its environment) and related measures are used to study and shape the behaviour of an agent collective.
This involves, especially in the context of multi-agent systems, identifying the conditions under which multiple agents behaving as a group have more abilities than the sum of their individual behaviours, but also how they need to interact to achieve self-organization in larger collectives. A main assumption behind this is that informational quantities offer a universal, less architecturally biased approach to identifying the constraints on the organization of behaviour that are necessary to bring about self-organization.
Information theory has proven to be a useful tool to identify computational requirements on a decision-making mechanism in perception-action loops of agents. Under the biologically plausible assumption of informational parsimony of decision processes, this project studies the constraints that particular tasks and agent embodiments impose on the structure of the decision-making.
“Empowerment”, the informational capacity of an agent to modify its environment, has in recent years been shown to be an effective driver for the intrinsic behaviour of agents “embodied” in a world, whether simulated or real. Its properties and the reasons for its success are increasingly well understood.
However, this comes at a price: the effort required to accurately compute empowerment k steps into the future grows exponentially with the action horizon k, so only short horizons could be utilized. Although empowerment is to some extent “prescient” in that, even in its local form, it identifies directions of interest to an agent, this strong bound on achievable horizons restricts its usefulness.
With the challenge comes the cure: it turns out that information theory can again be used to address this issue in a consistent and coherent way by understanding what particular space structures “do” to their informational landscape and making use of this to extend the empowerment horizon far beyond what was possible before.
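For readers unfamiliar with the quantity: k-step empowerment is the channel capacity between an agent's k-step action sequences and its resulting sensor states, and is commonly computed with the Blahut-Arimoto algorithm. The following sketch (a toy deterministic one-dimensional world of our own devising, not the project's actual setup) shows both the computation and the exponential growth in the number of action sequences mentioned above:

```python
import itertools
import numpy as np

def empowerment_bits(channel, iters=100):
    """Channel capacity max_{p(a)} I(A; S') via Blahut-Arimoto.

    channel: array of shape (n_actions, n_states); row a is p(s' | a).
    Returns the capacity, i.e. empowerment, in bits."""
    n_a, _ = channel.shape
    p_a = np.full(n_a, 1.0 / n_a)            # start from a uniform action distribution
    for _ in range(iters):
        p_s = p_a @ channel                  # marginal over successor states
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(channel > 0, np.log(channel / p_s), 0.0)
        p_a *= np.exp((channel * log_ratio).sum(axis=1))  # reweight by exp(KL divergence)
        p_a /= p_a.sum()
    p_s = p_a @ channel
    with np.errstate(divide="ignore", invalid="ignore"):
        log2_ratio = np.where(channel > 0, np.log2(channel / p_s), 0.0)
    return float((p_a[:, None] * channel * log2_ratio).sum())

def k_step_channel(k, n_pos=5, start=2):
    """Deterministic 1-D world: positions 0..n_pos-1, moves of -1/0/+1, walls clip.
    Builds p(final position | k-step action sequence) -- note the 3**k rows,
    the exponential growth in the horizon k."""
    seqs = list(itertools.product((-1, 0, 1), repeat=k))
    channel = np.zeros((len(seqs), n_pos))
    for i, seq in enumerate(seqs):
        pos = start
        for a in seq:
            pos = min(max(pos + a, 0), n_pos - 1)
        channel[i, pos] = 1.0                # deterministic transition
    return channel

for k in (1, 2, 3):
    print(k, round(empowerment_bits(k_step_channel(k)), 3))
# 1 step: 3 reachable states -> log2(3) ≈ 1.585 bits;
# 2 steps and beyond: all 5 states reachable -> log2(5) ≈ 2.322 bits.
```

In a deterministic world the capacity reduces to the log of the number of reachable states, but the channel matrix itself already has 3**k rows, which is exactly the exponential cost that the structural results described here aim to circumvent.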
The use of robotic devices for providing physiotherapy is a relatively new field within the area of robotics in health care, and it emerged from the idea of using robots to assist people with disabilities. It is rapidly advancing based on recent developments in robotics, haptic interfaces and virtual reality. The idea of using robots to assist a therapist with rehabilitation exercises has led to the development of several rehabilitation robotic devices. Considering the robot as an advanced tool under the therapist's supervision, the key challenge in the area of rehabilitation robotics is how the therapist's skills can best be enhanced with robot technology.
The work to be carried out in this research is based on the Gentle/S rehabilitation system. Gentle/S utilised haptic and virtual reality technologies to deliver challenging and meaningful therapies to stroke subjects with upper-limb impairment. The clinical trial results with the Gentle/S system, together with other systematic reviews, highlighted the need for robotic therapy to be highly 'adaptable' to the specific needs and performance of the patient. Research also highlights the opportunity to use robotic technology to quantitatively 'assess' the underlying recovery process. The current research, termed Gentle/A (A for adaptability), therefore aims to design a better therapeutic human-adaptive interface with improved assessment capability. Motivating visual interfaces combined with an adaptive sense of touch can bring the user very close to reality while making the exercise more fun. Hence ‘Haptics + Virtual Reality’ is gaining widespread acceptance among user and research communities.
The principal purpose of this empirically based thesis study is to define an interactive interface which, when combined with a haptic (touch-based) digital repository of differing ‘virtual gratings’, will open up touch-centric three-dimensional landscapes to a specific user group of blind and visually impaired people who wish to engage with the digital tactile processes of the creative arts, surface pattern/fashion, design practice, or indeed the engineering industries.
The primary points of discourse within the study as a whole are 1) to offer multi-modal, fully interactive haptic “gratings” which can be easily mapped to the blind/VI user's own working tactile knowledge; 2) to clearly define an overview of a remote repository that offers kinaesthetic tactile interactions with varied surfaces mimicked from real life; and 3) to show the potential for effective links into widening participation (WP) through the collaborative working process of researcher and user group.
This study will follow the hypothesis that an increase in virtual ‘realness’ and an enhancement of accessible, multi-modal mapped data will allow visually impaired participants to commit surface patterning to memory and therefore engage with digital surface design on a more accessible level. An initial feasibility study has been undertaken; it was designed to interrogate the structures and boundaries of current haptic technologies (the state of the art, SoA) in the exemplar form of the PHANToM haptic probe, whilst also observing tactile user interactions with analogue surface processes. The feasibility study aimed to understand the SoA whilst exploring users' fine motor skills and orientation techniques observed in ‘typical’ creative processes, and then to diagnostically scope the findings for points to consider in future test studies.
Future work and disseminations will offer empirical studies working with the chosen haptic device in collaboration with the focus user group. The testing phase will initially follow the Nine Hole Peg Test (NHPT, after Wade) by means of virtual and analogue testing rigs. Further tests will be adapted to suit specific user needs, and additions will then be created and tested. Finally, the third phase will be set around a repository of textures offering users a plethora of varied surfaces and shapes to use as part of the standard surface pattern-making process; results will then be analysed and disseminated.
Haptic technologies present opportunities for rehabilitation in many scenarios: stroke rehabilitation, MS patients, motor-cognitive impairments and HCI for visually impaired users. Many new, relatively low-cost haptic technologies are now being produced with a small form factor that allows these devices to be used in subjects’ homes, e.g. SensAble’s PHANTOM Omni. From a rehabilitation perspective this allows for a close personal interaction that is not viable in a clinic setting, and it removes the assumption that a user needs to be in the clinic to perform all of their rehabilitation.
Social mediators enhance communication and interaction between participants who may or may not be in the same location. In the context of this PhD, they will allow for enriched communication in a rehabilitation scenario, allowing for caregivers and therapists to monitor and apply rehabilitation more effectively. This PhD will incorporate currently established medical assessment techniques and aim to improve upon them in terms of data analysis and acquisition by the use of haptic technologies and haptic tasks specifically designed for the purpose of assessing stroke survivors’ performance and recovery while undergoing rehabilitation. Interaction will be further enhanced by establishing a telecommunication protocol to deliver haptic force feedback between remotely located partners.
Finally, in designing a comprehensive physical interaction system for the purpose of rehabilitation there should be a level of adaptability. Every user is different, moves differently, thinks differently - each user's experience of the system will be different: the system should incorporate algorithmic learning techniques to adapt to the user's abilities and then 'improve' along with the user. From this standpoint, the haptic environments developed will autonomously adapt to users’ specific requirements whilst undergoing haptic rehabilitation therapy. Further advice will be sought from experts in rehabilitation of neurological disorders.
There is already a wealth of research within the field of haptic technologies, specifically in the area of rehabilitation robotics. This research aims to enhance the value of what has previously been researched by showing that, through the use of haptic devices as social mediators, current rehabilitation techniques can be improved, and that a viable platform for rehabilitation can be integrated into users’ homes after clinical rehabilitation.
Initial studies will primarily focus on collecting haptic data, using the Phantom Omni haptic device, from healthy users performing assessment tasks typically performed by those with upper limb impairments. Further studies will explore the ‘Social Mediator’ where the data collected will be used to identify key areas of difficulty for individual users, as would be the case in an interaction between therapist and patient.
M. Bowler, F. Amirabdollahian, and K. Dautenhahn, “Using an Embedded Reality Approach to Improve Test Reliability for NHPT Tasks”, ICORR 2011 (accepted March 2011).
M. Bowler, “The use of haptic force-feedback devices as assistive technology and assessment tools for the rehabilitation of upper limb impairment”, abstract, RAEng Young Researchers Meeting, presented September 2010.
SCRIPT is a 3-year project which started in November 2011, partially funded by the European Commission under the 7th Framework Programme. The SCRIPT project aims to produce two prototype robotic devices, one passively and one actively actuated, focusing on hand and wrist rehabilitation after stroke. Both devices will be used in the stroke patient’s home to enable better management and delivery of therapies to stroke patients. The devices also aim to provide motivating and challenging therapeutic activities through interactive games. It is thought that frequent interaction between patient and device will further aid recovery in the chronic phase of stroke rehabilitation.
ACCOMPANY is a 3 year project, which started in October 2011, partially funded by the European Commission under the 7th Framework Programme. The ACCOMPANY project aims to develop a companion robot which will assist elderly people to maintain their independence for longer within their home environment. The project team will assess user requirements and acceptance of the robot in their design of a physical, cognitive and social assistant for everyday home tasks. It is hoped the robot will contribute to the re-ablement of the user, delivering services through socially interactive, acceptable and empathic interaction. The project aims to establish a co-learner relationship, where both the robot and user can provide mutual assistance.