
25 June 2018

Exhibition provides a glimpse into the future of robotics

Robots took centre stage at Bush House as an interdisciplinary team of experts from across King’s provided a tantalising glimpse into the future of robotics.


A range of robots were on hand to illustrate the evolution of robotics, from Kinba, a robot that recognises faces and interacts with humans, to a worm-like medical robot that could one day feed back sensory information to surgeons about the feel and texture of tissue as it travels through the body.

The exhibition, which ran from 25 to 29 June, highlighted King’s cross-faculty expertise in:

Sensing – giving robots the ability to see, touch, hear and move around by interpreting environmental feedback. While robotic sensors may be similar to human senses, it may also be possible to develop technologies that allow robots to sense things that humans cannot.

Sensing can be ‘local’, where robots are in the same place as their human controllers, or ‘remote’, where they are located away from humans. Forms of remote sensing include chemical sensors that are attached to mobile robots and used to detect explosives.
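To make the remote-sensing idea concrete, the short Python sketch below (not drawn from any specific King’s exhibit) smooths noisy readings from a hypothetical chemical sensor on a mobile robot and raises an alert for a remote operator when the smoothed level crosses an invented threshold.

```python
from collections import deque

WINDOW = 5          # number of recent readings to average (assumed)
ALERT_LEVEL = 0.7   # hypothetical normalised concentration threshold

recent = deque(maxlen=WINDOW)

def process_reading(raw_value: float) -> bool:
    """Smooth the latest sensor reading and report whether it warrants an alert."""
    recent.append(raw_value)
    smoothed = sum(recent) / len(recent)
    return smoothed >= ALERT_LEVEL

# Simulated stream of readings as the robot moves through an area.
for value in [0.2, 0.3, 0.9, 0.8, 0.85, 0.9, 0.95]:
    if process_reading(value):
        print(f"Alert relayed to remote operator (latest reading {value:.2f})")
```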

Motion – walking across a room without crashing into furniture, crossing a crowded street without bumping into others, or lifting a cup to one’s lips are tasks most people perform without conscious thought, but for robots such movements need to be choreographed.

Motion can be engineered, using kinematic equations to design trajectories for leg, arm or wheel actions, or planned, using artificial intelligence to determine sets of actions and adjust in real time if feedback indicates that actions were not executed as planned.
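As an illustration of the ‘engineered’ approach, the Python sketch below uses the standard forward-kinematic equations of a two-link planar arm to compute where the end effector ends up for given joint angles; the link lengths and angles are illustrative values only, not taken from any robot in the exhibition.

```python
import math

def forward_kinematics(theta1: float, theta2: float,
                       l1: float = 0.3, l2: float = 0.25) -> tuple[float, float]:
    """Return the (x, y) position of a two-link planar arm's end effector, in metres."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Example: shoulder at 45 degrees, elbow bent a further 90 degrees.
print(forward_kinematics(math.radians(45), math.radians(90)))
```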

Human-robot interaction (HRI) – research that studies the expectations and reactions of humans with respect to robots so as to ensure effective interactions. This can range from how workers behave when controlling robots remotely, to how humans trapped in a collapsed building might respond to a rescue robot, to how people can make decisions jointly with intelligent robots.

Learning – how robots can acquire skills or adapt to their environments beyond their pre-programmed behaviours.

Robots can learn in a variety of ways: by reinforcement, by imitation and autonomously. Reinforcement learning techniques associate a reward with specific outcomes of a robot’s actions and involve an iterative process in which the robot learns to maximise its reward through sequences of trial-and-error actions.
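The trial-and-error loop can be sketched in a few lines of Python: in the tabular Q-learning example below, a simulated robot learns to move down a toy corridor towards a reward. The environment, reward values and learning parameters are illustrative assumptions rather than anything shown at the exhibition.

```python
import random

N_STATES = 5            # corridor positions 0..4; reaching position 4 earns the reward
ACTIONS = [-1, +1]      # step left or step right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate

q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for episode in range(500):
    state = 0
    while state != N_STATES - 1:
        # Explore occasionally, otherwise exploit the best known action.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        next_state = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if next_state == N_STATES - 1 else 0.0
        # Nudge the estimate for this state-action pair towards the observed reward.
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])
        state = next_state

# Learned policy: the preferred action in each non-terminal state.
print({s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)})
```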

‘Learning from demonstration’ techniques involve a robot learning to imitate the actions of a human demonstrator, while ‘statistical machine learning’ techniques, such as artificial neural networks, allow a robot to detect and recognise patterns in its environment and develop appropriate responses without guidance from a human.
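A much-simplified sketch of learning from demonstration is shown below: the robot records state-action pairs from a demonstrator and, in a new situation, copies the action taken in the most similar recorded state. Real systems typically fit statistical models such as neural networks; the nearest-neighbour rule and the data here are invented for illustration.

```python
demonstrations = [
    # (distance_to_obstacle_m, action) pairs recorded from a human demonstrator
    (2.0, "forward"),
    (1.0, "forward"),
    (0.4, "turn_left"),
    (0.2, "stop"),
]

def imitate(distance_to_obstacle: float) -> str:
    """Return the demonstrator's action from the most similar recorded state."""
    closest_state, action = min(
        demonstrations, key=lambda pair: abs(pair[0] - distance_to_obstacle)
    )
    return action

print(imitate(0.35))  # -> "turn_left"
```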

Multi-robot systems – robots coordinating with each other to perform complex tasks that may be difficult or inefficient for a single robot.

Small sub-problems are distributed to individual robots, which then interact with each other to find solutions. Multi-robot systems have a wide range of applications, from rescue missions to the delivery of stock in a warehouse.
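The warehouse example can be sketched as a simple allocation problem: in the hypothetical Python snippet below, each delivery task is handed to the closest robot that is still free. Real multi-robot systems coordinate this through auctions or other negotiation protocols; the robot positions and task list are invented.

```python
# Hypothetical robot and task locations on a warehouse floor, in metres.
robots = {"r1": (0.0, 0.0), "r2": (10.0, 0.0), "r3": (5.0, 8.0)}
tasks = {"shelf_A": (1.0, 1.0), "shelf_B": (9.0, 2.0), "shelf_C": (5.0, 7.0)}

def distance(p: tuple[float, float], q: tuple[float, float]) -> float:
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

assignments = {}
free_robots = dict(robots)
for task, location in tasks.items():
    # Give the sub-problem to the nearest robot that has not yet been assigned one.
    best = min(free_robots, key=lambda name: distance(free_robots[name], location))
    assignments[task] = best
    del free_robots[best]

print(assignments)  # e.g. {'shelf_A': 'r1', 'shelf_B': 'r2', 'shelf_C': 'r3'}
```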

Speaking about the exhibition, Professor Elizabeth Sklar from the Department of Informatics said: ‘The sheer breadth of expertise in robotics that King’s possesses is truly remarkable, and this exhibition provides a glimpse of a future in which people and robots come together harmoniously.

‘The potential for significant advancement in this area is boundless and the work that King’s is doing is opening up new boundaries and possibilities, and that’s incredibly exciting.’