
Centre for Robotics Research


Robotics Research at King’s College London is on a mission to develop creative robotic approaches to society’s most critical challenges. The cross-faculty team brings together expertise in high-fidelity sensing, high-precision manipulation, machine learning, intelligent control and human-robot interaction to address a broad range of problems in the medical, manufacturing, agricultural and emergency response domains. The overarching aim of Robotics@KING’S is to overcome the technical, practical and social barriers to wider deployment of effective robotic solutions.





Human-robot interaction (HRI) research studies the expectations and reactions of humans to robots in order to ensure effective interactions. This can range from how workers behave when controlling robots remotely, to how humans trapped in a collapsed building might respond to a rescue robot, to how people can make decisions jointly with intelligent robots. HRI research is integrated within much of our robotics research: if we know how humans are likely to respond, we can build robots that fit in with that behaviour.

The Prosthetics Lab

Led by Dr Ernest Kamavuako, Department of Informatics

How can invasive recordings (mostly intramuscular electromyography, EMG) provide reliable myoelectric control systems? The lab investigates methods for the acquisition and processing of surface and intramuscular electromyography signals for robust control of upper limb prostheses. The methods include acute and chronic offline investigations and real-time control experiments, with a focus on clinical translation.

The Interaction Lab

Led by Professor Elizabeth Sklar, Department of Informatics

Investigates the notion of shared decision making in human-robot teams.

The Robot Learning Lab

Led by Dr Matthew Howard, Department of Informatics

Investigates human-in-the-loop techniques for teaching robots to perform tasks.

The Work, Interaction & Technology Research Centre

Led by Professor Christian Heath, King’s Business School

Investigates social interaction between people and robots in a wide range of settings, from surgical suites to workplaces and museums.

The KCL-Vattikuti Institute of Robotic Surgery

Led by Professor Prokar Dasgupta, Guy’s Hospital, Faculty of Life Sciences and Medicine

A pioneer in robotic surgery in the UK for over 10 years. The team has conducted a number of seminal randomised trials in robotic surgery, developed a flexible octopus-inspired robot and established the first international curriculum for safe surgical training.



Learning is how robots can acquire skills or adapt to environments beyond their pre-programmed behaviours. Robots can learn in many ways – by reinforcement, by imitation, and autonomously, for example. As such methods develop, new applications for robotics are emerging that aim to bring AI (artificial intelligence) into the ‘real world’.

‘Reinforcement Learning’ techniques associate a reward with specific outcomes of robot actions and involve an iterative process in which a robot learns to maximise its reward through sequences of trial-and-error actions. ‘Learning from Demonstration’ techniques offer human-in-the-loop training, where a robot learns to imitate the actions of a human demonstrator. ‘Statistical Machine Learning’ techniques, such as artificial neural networks, allow a robot to detect and recognise patterns in its environment and develop appropriate responses without guidance from a human teacher.
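The trial-and-error loop described above can be sketched in a few lines of Python. This is a generic illustration of Q-learning on a toy one-dimensional world, not code from any of the labs; all names, parameter values and the environment itself are our own.

```python
import random

# Toy Q-learning sketch (illustrative only): a robot on a 1-D track of
# 5 cells learns to reach the goal at cell 4. Actions: 0 = left, 1 = right.
N_STATES, GOAL = 5, 4
ACTIONS = (0, 1)

def step(state, action):
    """Apply an action, returning (next_state, reward)."""
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    return nxt, (1.0 if nxt == GOAL else 0.0)

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q-table: q[state][action]
    for _ in range(episodes):
        s = 0
        while s != GOAL:
            # epsilon-greedy: mostly exploit, occasionally explore
            a = rng.choice(ACTIONS) if rng.random() < epsilon \
                else max(ACTIONS, key=lambda act: q[s][act])
            s2, r = step(s, a)
            # update the estimate toward reward plus discounted future value
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train()
policy = [max(ACTIONS, key=lambda act: q[s][act]) for s in range(N_STATES)]
print(policy)  # the learned policy prefers moving right toward the goal
```

After a few hundred episodes the reward signal, which is only ever received at the goal, has propagated back through the Q-table, so every non-goal state prefers the rightward action.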

The Robot Learning Lab

Led by Dr Matthew Howard, Department of Informatics

How can advanced robotic systems learn skills from human behaviour? The lab’s primary research interest is machine learning for control, especially as applied to robotics. This includes learning kinematics and dynamics, approximate optimal control and reinforcement learning. A central theme in the lab is generalisation of control across systems with different dynamics, where those differences may lie in the constraints, the actuation or the dynamics themselves. This has implications for adaptive control and imitation learning (robots imitating humans or other robots) for a wide class of problems. It also relates to the control of redundancy in the kinematics, dynamics and actuation.

The Intelligent Systems Lab

Led by Dr HK Lam, Department of Informatics

Employs Machine Learning techniques for control and classification problems, applied to a number of areas including the biomedical domain.



Walking across a room without crashing into furniture, crossing a crowded street without bumping into others or lifting a coffee cup from a table to one’s lips are tasks performed by most people without much conscious thought. But a robot has to choreograph its every move. Motion can be engineered, using kinematic equations that design trajectories for leg, arm or wheel actions, or planned, using artificial intelligence to determine sets of actions and adjust in real time if feedback indicates that actions were not executed as planned. The path or trajectory of such motion should be designed effectively, so that a robot achieves its goal, and efficiently, so that it does not waste energy, a precious resource for any robot.
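The ‘engineered’ approach can be illustrated with a classic kinematic recipe: a cubic polynomial joint trajectory that starts and ends at rest, giving smooth, energy-conscious motion between two joint angles. This is a generic textbook sketch, not code from the labs; the function name and values are illustrative.

```python
# Cubic trajectory q(t) with q(0)=q0, q(T)=qf and zero start/end velocity.
# Solving the boundary conditions gives the two nonzero coefficients below.
def cubic_trajectory(q0, qf, T):
    """Return a function q(t) interpolating q0 -> qf over duration T."""
    a2 = 3.0 * (qf - q0) / T**2   # quadratic coefficient
    a3 = -2.0 * (qf - q0) / T**3  # cubic coefficient
    def q(t):
        return q0 + a2 * t**2 + a3 * t**3
    return q

# Move a joint from 0 rad to 1 rad over 2 seconds.
q = cubic_trajectory(q0=0.0, qf=1.0, T=2.0)
print(q(0.0), q(1.0), q(2.0))  # 0.0 at start, 0.5 at midpoint, 1.0 at end
```

Because the velocity q′(t) = 2a₂t + 3a₃t² vanishes at t = 0 and t = T, the joint accelerates gently out of rest and decelerates gently into the target, avoiding the jerky stops that waste energy and stress actuators.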

The Intelligent Systems Lab

Led by Dr HK Lam, Department of Informatics

How can computational intelligence and machine learning techniques provide advanced control for robotic systems? The lab investigates theoretical approaches and practical applications, with a focus on ‘fuzzy control’, where a robot’s conditions are approximated and appropriate actions are selected based on imprecise estimates – a more practical and realistic approach than many closely-engineered solutions. The fuzzy-model-based control problem aims to develop mathematical tools with superior stability, robustness and performance that can support nonlinear control applications.
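The idea of selecting actions from imprecise estimates can be sketched with a toy fuzzy controller: triangular membership functions grade how ‘negative’, ‘near zero’ or ‘positive’ a tracking error is, and the rule outputs are blended by a weighted average. This is a generic illustration of fuzzy control, not the lab’s fuzzy-model-based framework; all set boundaries and rule outputs are made up.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error):
    """Blend three rules into one control output (weighted average)."""
    rules = [
        (tri(error, -2.0, -1.0, 0.0), +1.0),  # error negative -> push up
        (tri(error, -1.0,  0.0, 1.0),  0.0),  # error near zero -> do nothing
        (tri(error,  0.0,  1.0, 2.0), -1.0),  # error positive -> push down
    ]
    num = sum(w * u for w, u in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(fuzzy_control(0.0))   # zero error -> no correction
print(fuzzy_control(-1.0))  # fully negative error -> full positive push
print(fuzzy_control(-0.5))  # partial membership -> a blended, gentle push
```

The appeal of the approach is visible even at this scale: the control surface varies smoothly with the error, without requiring an exact model of the plant.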

The Advanced Kinematics and Reconfigurable Robotics Lab

Led by Professor Jian Dai, Department of Informatics

The Haptics Lab (led by Dr Hongbin Liu) and the Interaction Lab (led by Professor Elizabeth Sklar) each investigate different aspects of single and multi-robot motion.



In multi-robot systems, robots coordinate with each other to perform complex tasks that might be difficult or inefficient for a single robot. This often involves dispatching small sub-problems to individual robots and allowing them to interact with each other to find solutions. In heterogeneous robot teams, different robots have different capabilities, so the coordination becomes more constrained and hence, more complex. Multi-robot systems have a wide set of applications, from rescue missions to delivery of payloads in a warehouse.

The Interaction Lab

Led by Professor Elizabeth Sklar, Department of Informatics

How can robots interact effectively with each other and with people? The lab investigates situations where multiple robots cooperate to achieve common goals. Different coordination strategies are compared experimentally, including market-based strategies, where robots ‘bid’ for permission to perform tasks that lead to goal completion, and biologically inspired strategies, where robots imitate groups of birds or insects and dynamically respond to sensed properties of the environment, including other robots.
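A single round of the market-based strategy can be sketched as follows: each robot’s ‘bid’ for a task is its travel distance, and each task goes to the cheapest free bidder. This is a generic greedy-auction illustration of the idea described above, not the Interaction Lab’s system; the robot names, positions and Manhattan-distance bid are our own assumptions.

```python
# One greedy auction round: tasks are offered in order, each robot bids its
# Manhattan distance to the task, and the lowest unassigned bidder wins.
def auction_round(robot_positions, task_positions):
    """Assign each task to the cheapest still-free robot; return {task: robot}."""
    assignments = {}
    free_robots = set(robot_positions)
    for task, (tx, ty) in task_positions.items():
        bids = {r: abs(rx - tx) + abs(ry - ty)  # bid = travel cost
                for r, (rx, ry) in robot_positions.items() if r in free_robots}
        if not bids:
            break  # more tasks than robots: leftovers wait for the next round
        winner = min(bids, key=bids.get)
        assignments[task] = winner
        free_robots.remove(winner)
    return assignments

robots = {"r1": (0, 0), "r2": (5, 5)}
tasks = {"t1": (1, 0), "t2": (5, 4)}
print(auction_round(robots, tasks))  # {'t1': 'r1', 't2': 'r2'}
```

Real market-based allocation is richer than this greedy pass – bids can encode battery level or capability, and tasks can be re-auctioned as conditions change – but the core mechanism of cost-as-bid is the same.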

The Intelligent Systems Lab

Led by Dr HK Lam, Department of Informatics 

Investigates control theory and fuzzy-logic decision making for groups of robots.



If robots are going to be autonomous, they have to be able to decide what to do for themselves. Some of these decisions will involve a robot working on its own, some will involve teams of robots, and some will involve robots making joint decisions with humans. Research on reasoning and planning stems from Artificial Intelligence and focuses on providing methods to make these decisions. Research on these topics at King’s looks at decisions in complex dynamic environments, and we are especially interested in being able to explain the resulting decisions to people.

Decision Making

Led by Dr Elizabeth Black, Dr Sanjay Modgil and Professor Simon Parsons, Department of Informatics

Investigates computational argumentation, creating methods by which robots – physical and virtual – can justify their decisions. Recent work applied to the medical domain has developed argumentation methods to support patients’ self-management of chronic conditions, such as secondary stroke prevention and diabetic foot ulcers.

Explainable Planning

Led by Dr Daniele Magazzeni and Professor Simon Parsons, Department of Informatics

Combines aspects of AI planning and computational argumentation, and considers strategies where robots can explain the reasons behind the choice of particular actions and can justify why certain resources are used.

AI Planning

Led by Dr Amanda Coles and Dr Andrew Coles, Department of Informatics

Develops methods for generating efficient plans in complex dynamic environments. For example, recent research has focused on creating planning techniques for guiding Mars rover robots and applying planning methods for managing ship movements.

Motion Planning

Led by Professor Stefan Edelkamp, Department of Informatics

Studies intelligent navigation for robots, helping them to decide how to move around the world.



Robotic sensing gives robots the ability to see, touch, hear and move around with the help of environmental feedback. Robotic sensors may be analogous to human sensors, or may allow robots to sense things that humans cannot. Sensing can be ‘local’, where robots are in the same physical space as human controllers or collaborators. ‘Haptic’ sensing is one type of local sensing. Haptic data can enable a robot to interact successfully with the physical world. Researchers in this area try to understand how people make use of touch and apply this understanding to develop robots that perform more accurately – possibly saving lives in the process! Sensing can also be ‘remote’, where robots are located away from humans. Remote sensing includes medical imaging sensors that are attached to probes and employed for surgical procedures, chemical sensors that are attached to mobile robots and employed for detecting explosives, and environment sensors that are attached to drones (unmanned aerial vehicles, or UAVs) and are employed for observing and modelling the earth.

The Haptics Lab

Led by Dr Hongbin Liu, Department of Informatics

How can biological-level haptic capabilities be implemented for robots? The Haptics Lab designs robots with advanced perception and interaction capabilities to address unmet needs in medicine, enabling safer and more effective diagnosis and treatment, as well as other applications such as guiding blind people or those performing tasks in low light (as in crisis situations).


The Medical Robotics Imaging Lab

Led by Professor Kawal Rhode, Department of Biomedical Informatics

How can custom-made robots and image processing techniques be integrated to improve conventional medical procedures? Examples include a trans-esophageal ultrasound robot used for guiding cardiac interventions and single/multiple arm extra-corporeal ultrasound robots for performing fetal ultrasound examinations.


The QUEST Lab (Quadrupole Resonance Sensors Team)

Led by Dr Jamie Barras, Department of Informatics

How can technology help humanitarian action and peacebuilding? The lab develops chemical sensors designed, for example, to help detect buried landmines, or to analyse the quality of a medicine without the need to remove it from its packaging.


The Environment Lab

Led by Dr Mark Mulligan, Department of Geography

Can we develop cheap, robust DIY environmental monitoring capacity that can be scaled and grid-connected? The lab’s FreeStation initiative uses open-source hardware, software and 3D-printing technologies to build and deploy reliable, low-cost automated weather stations, and includes a variety of open-source environmental monitoring sensors, from soil moisture probes and meteorological sensors to wildlife cameras.

The Robot Learning Lab

Led by Dr Matthew Howard, Department of Informatics

Investigates the use of innovative sensors embedded in clothing, resulting in ‘smart’ textiles that can measure biometric properties such as muscle fatigue.

The Interaction Lab

Led by Professor Elizabeth Sklar, Department of Informatics

Investigates the use of wellness sensors to help patients make data-backed decisions about self-managing chronic conditions.



The design of the physical structure comprising a robot’s body is key to enabling motion, balance and manipulation. Robots that can change body shape, such as ‘metamorphic’ and ‘soft body’ robots, provide unique flexibility to navigate uneven surfaces and constrained spaces, while manipulators that can change shape can grasp irregular objects and perform dexterous tasks.

Why is it hard for a robot to walk? Robots need to carry a processor and power supply, as well as sensors, and these components can be heavy and awkward. Metamorphic or ‘origami’ designs allow a robot’s structure to rebalance as it moves, making walking more tractable.

Why is it difficult for a robot to grasp? Picking up objects requires a complex combination of vision and touch. Creating robots that can grasp, hold, tilt and push objects with just the right amount of strength brings robots closer to human-like dexterity, and greatly increases their usefulness.

The Advanced Kinematics and Reconfigurable Robotics Lab

Led by Professor Jian Dai, Department of Informatics

How can reconfigurable and metamorphic robots, such as origami robots, support complex tasks in dynamic environments? Key application areas include hazardous environments, industrial and manufacturing settings, agriculture and healthcare.

The Haptics Lab

Led by Dr Hongbin Liu, Department of Informatics

Investigates the use of soft-body materials to construct flexible robotic devices for minimally invasive surgical procedures.


© 2019 King's College London | Strand | London WC2R 2LS | England | United Kingdom | Tel +44 (0)20 7836 5454