Bush House exhibition showcases future of robotics
Posted on 09/07/2018
Researchers from the Department of Informatics
showcased the evolution of robotics in a recent public exhibition at Bush House, held to celebrate UK Robotics Week from 25-29 June 2018.

[Image: Robot 'Baxter' at the Robotics Week exhibition in Bush House]
A range of robots illustrated developments in this area, from King's former robotic receptionist Kinba, which recognises faces and interacts with humans, to a worm-like medical robot that could one day feed back sensory information to surgeons about the feel and texture of tissue as it travels through the body.
The exhibition focused on King's cross-Faculty expertise in:

Sensing: Giving robots the ability to see, touch, hear and move around by interpreting environmental feedback. While robotic sensors may mimic human senses, it may also be possible to develop technologies that allow robots to sense things that humans cannot.
Sensing can be 'local', where robots are in the same place as their human controllers, or 'remote', where they are located away from humans. Forms of remote sensing include chemical sensors that are attached to mobile robots and employed to detect explosives.
Motion: Walking across a room without crashing into furniture, crossing a crowded street without bumping into others or lifting a cup to one's lips are tasks most people perform without conscious thought, but for robots such movements must be choreographed.
Motion can be engineered, using kinematic equations that design trajectories for leg, arm or wheel actions, or planned, using artificial intelligence to determine sets of actions and to adjust in real time if feedback indicates that actions were not executed as planned.
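To illustrate the engineered, kinematic side, the sketch below computes the hand position of a hypothetical two-link planar robot arm from its joint angles. The function name, link lengths and example angles are all illustrative assumptions, not part of any exhibit described above.

```python
import math

def two_link_forward_kinematics(theta1, theta2, l1=1.0, l2=1.0):
    """Forward kinematics for a two-link planar arm: map the joint
    angles (in radians) and link lengths to the end-effector's
    (x, y) position."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# With both joints at zero, the arm lies stretched along the x-axis:
print(two_link_forward_kinematics(0.0, 0.0))  # (2.0, 0.0)
```

Trajectory design then amounts to inverting equations like these: choosing joint angles over time so the end effector follows a desired path.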
Human-robot interaction (HRI): Research in this area studies the expectations and reactions of humans with respect to robots so as to ensure effective interactions. This can range from how workers behave when controlling robots remotely, to how humans trapped in a collapsed building might respond to a rescue robot, to how people can make decisions jointly with intelligent robots.
Learning: This is how robots can acquire skills or adapt to environments beyond their pre-programmed behaviours.
Robots can learn in a variety of ways: by reinforcement, by imitation and autonomously. Reinforcement learning techniques associate a reward with specific outcomes of robot actions and involve an iterative process in which a robot learns to maximise its reward through sequences of trial-and-error actions.
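The trial-and-error loop described above can be sketched with tabular Q-learning, one standard reinforcement learning technique. The scenario is a made-up one: a robot on a short corridor that earns a reward only on reaching the final cell; the parameters and function name are illustrative assumptions.

```python
import random

def q_learning_line(n_states=5, episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning on a 1-D corridor: the robot starts at state 0
    and receives a reward of 1 only when it reaches the final state."""
    random.seed(0)  # fixed seed so the sketch is reproducible
    actions = (-1, +1)  # step left or step right
    q = {(s, a): 0.0 for s in range(n_states) for a in actions}
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # Epsilon-greedy: mostly exploit the best-known action,
            # occasionally explore a random one.
            if random.random() < epsilon:
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda act: q[(s, act)])
            s2 = min(max(s + a, 0), n_states - 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Update the estimate towards reward plus discounted future value.
            best_next = max(q[(s2, a2)] for a2 in actions)
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
    return q

q = q_learning_line()
# After training, stepping towards the goal looks better than stepping away:
assert q[(0, +1)] > q[(0, -1)]
```

Each update nudges the value of an action towards the reward it produced plus the discounted value of what follows, which is exactly the iterative reward-maximisation loop the paragraph describes.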
'Learning from demonstration' techniques involve a robot learning to imitate the actions of a human demonstrator, while 'statistical machine learning' techniques, such as artificial neural networks, allow a robot to detect and recognise patterns in its environment and develop appropriate responses without guidance from a human.
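A minimal sketch of learning from demonstration, assuming a hypothetical robot that records (observation, action) pairs from a human driver and then, at run time, copies the action whose recorded observation most resembles the current one (a nearest-neighbour policy; all names and numbers are invented for illustration):

```python
def imitate(demonstrations, observation):
    """Nearest-neighbour learning from demonstration: given recorded
    (observation, action) pairs, return the action whose observation
    is closest to the current one."""
    _, action = min(
        demonstrations,
        key=lambda pair: abs(pair[0] - observation),
    )
    return action

# Demonstrations as (distance to obstacle, steering command) pairs:
demos = [(0.2, "turn"), (1.5, "straight"), (0.4, "turn"), (3.0, "straight")]
print(imitate(demos, 0.3))  # 'turn'
print(imitate(demos, 2.0))  # 'straight'
```

Real systems generalise with richer models than a lookup, but the principle is the same: behaviour is derived from demonstrated examples rather than hand-written rules.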
Multi-Robot Systems: This involves robots coordinating with each other to perform complex tasks that may be difficult or inefficient for a single robot.
Small sub-problems are distributed to individual robots, which then interact with each other to find solutions. Multi-robot systems have a wide range of applications, from rescue missions to the delivery of stock in a warehouse.
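One common way to distribute sub-problems is market-style task allocation, where each free robot "bids" its cost for a task and the cheapest bidder wins. The toy sketch below, with made-up robot names, warehouse shelf positions and a straight-line distance cost, shows the idea:

```python
def assign_tasks(robots, tasks):
    """Greedily assign each task to the closest currently-free robot:
    every free robot bids its squared straight-line distance to the
    task, and the lowest bidder wins."""
    assignments = {}
    free = dict(robots)  # robot name -> (x, y) position
    for task_name, (tx, ty) in tasks.items():
        if not free:
            break  # more tasks than robots; leftover tasks wait
        winner = min(
            free,
            key=lambda r: (free[r][0] - tx) ** 2 + (free[r][1] - ty) ** 2,
        )
        assignments[task_name] = winner
        del free[winner]  # each robot takes one task at a time
    return assignments

robots = {"r1": (0, 0), "r2": (10, 0)}
tasks = {"shelf_a": (1, 1), "shelf_b": (9, 2)}
print(assign_tasks(robots, tasks))  # {'shelf_a': 'r1', 'shelf_b': 'r2'}
```

Greedy allocation is only one strategy; practical multi-robot systems may negotiate, re-auction tasks as conditions change, or plan jointly, but the division of a large job into per-robot sub-problems is the common thread.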
Speaking about the exhibition, Professor Elizabeth Sklar, Reader in Computer Science and member of the Centre for Robotics Research, said:
'The sheer breadth of expertise in robotics that King’s possesses is truly remarkable, and this exhibition provides a glimpse of a future in which people and robots come together harmoniously.
'The potential for significant advancement in this area is boundless and the work that King’s is doing is opening up new boundaries and possibilities, and that’s incredibly exciting.'