
28 February 2017

The Internet of Silicon Retinas (IoSiRe)

King’s College London, in collaboration with UCL and Kingston University, has recently been awarded £1.4 million in funding from the Engineering and Physical Sciences Research Council (EPSRC) for the three-year project IoSiRe, “Internet of Silicon Retinas: Machine to machine (M2M) communications for neuromorphic vision sensing data”. IoSiRe, which is led by Dr Mohammad Shikh-Bahaei (Department of Informatics at King’s), is scheduled to start on 1 July 2017 and will explore a novel technology for the layered representation and transmission of silicon retina data over the Internet of Things (IoT) for cloud-based analytics.


In the next ten years, most of the envisaged services for analysing sensor data for event, action, object or person recognition will rely on advanced communications to transport data to cloud storage and computing servers. These communications infrastructures are today collectively termed machine-to-machine (M2M) communications. The wider context of connecting devices and enabling their interaction over the internet is known as the Internet of Things (IoT). A familiar example is the smart fridge: internal cameras detect when you are running low on milk and inform you via text message. The future of the IoT involves achieving high-frame-rate visual sensing and processing for surveillance and monitoring (e.g. vehicles and drones) at very low power.

Current high-speed visual sensing relies on very high frame rates, but new concepts are challenging this approach. Human vision does not use frames: sight is based on detecting details of reflectance and movement (predominantly via cone and rod photoreceptor cells in mammals) in an asynchronous manner, while the visual cortex fills in the remaining information. Inspired by this observation, hardware designs for neuromorphic sensors, also known as dynamic vision sensors (DVS), have recently been proposed. A DVS works like a human retina: local pixel-level changes caused by movement are transmitted at the time they occur. A DVS camera can therefore capture fast motion using far less power (10-20 milliwatts) and far faster than a regular shutter-based camera; when the events are rendered as video frames, 700-2,000 frames per second can be achieved. DVS cameras are now commercially available and their applications have begun to emerge, from implants for the visually impaired to visual surveillance capable of detecting very high-speed events.
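As a rough illustration of the event-based representation described above, the Python sketch below accumulates asynchronous DVS events into fixed-rate frames. It is only a minimal sketch: the DVSEvent fields and the events_to_frames helper are hypothetical names chosen for this example, not part of any particular DVS library or of the IoSiRe project itself.

```python
from dataclasses import dataclass
from typing import List
import numpy as np

@dataclass
class DVSEvent:
    x: int          # pixel column
    y: int          # pixel row
    t_us: int       # timestamp in microseconds
    polarity: int   # +1 brightness increase, -1 brightness decrease

def events_to_frames(events: List[DVSEvent], width: int, height: int,
                     frame_rate_hz: int = 1000) -> List[np.ndarray]:
    """Accumulate time-ordered, asynchronous events into fixed-rate frames.

    Each frame sums the event polarities that fall within its time window,
    so a 1 kHz frame rate groups events into 1 ms bins.
    """
    if not events:
        return []
    window_us = 1_000_000 // frame_rate_hz
    t0 = events[0].t_us
    n_frames = (events[-1].t_us - t0) // window_us + 1
    frames = [np.zeros((height, width), dtype=np.int16) for _ in range(n_frames)]
    for ev in events:
        idx = (ev.t_us - t0) // window_us
        frames[idx][ev.y, ev.x] += ev.polarity
    return frames
```

At a 1 kHz frame rate each frame aggregates only 1 ms of events, which is how event streams can be rendered at the hundreds-to-thousands of frames per second mentioned above without the sensor ever capturing conventional frames.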

The IoSiRe project aims to explore photorealistic video rendering at 100-700 frames per second (the average HD film runs at 24 frames per second) by linking DVS data to large visual libraries, to use DVS representations for face and object detection and classification, and to develop visual scene classification and retrieval for surveillance systems that will outperform current high-resolution video capture at a fraction of the bandwidth, power consumption and cost. For these three applications to work, the information captured by the DVS will be transmitted to a powerful cloud service via an M2M communications framework.
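The project itself will define the actual representation, coding and transport schemes. Purely as a hedged illustration of the general idea of shipping compact event data to a cloud service over an M2M link, the sketch below packs (x, y, timestamp, polarity) event tuples into a small binary payload and POSTs it to a placeholder endpoint; the 10-byte packing format and the URL are assumptions made up for this example, not the project's design.

```python
import struct
from urllib import request

def pack_events(events) -> bytes:
    """Serialise (x, y, t_us, polarity) event tuples into a 10-byte-per-event payload."""
    payload = bytearray()
    for x, y, t_us, polarity in events:
        # unsigned 16-bit x and y, unsigned 32-bit timestamp, signed 8-bit polarity, 1 pad byte
        payload += struct.pack("<HHIbx", x, y, t_us, polarity)
    return bytes(payload)

def upload_to_cloud(events, url="https://cloud.example.org/iosire/events"):
    """POST one batch of events to a hypothetical cloud analytics endpoint."""
    body = pack_events(events)
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/octet-stream"})
    with request.urlopen(req) as resp:
        return resp.status, resp.read()
```

Because only pixel-level changes are transmitted, a quiet scene produces very few events and therefore a very small payload, which is one way event-based sensing can reduce bandwidth relative to streaming full high-resolution frames.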

 

Fig 1. Envisaged IoSiRe M2M communications, processing and adaptation.

Dr Shikh-Bahaei (King’s College London), the consortium lead, Dr Andreopoulos (UCL) and Professor Martini (Kingston University London) are the Principal Investigators on this project. The IoSiRe project will also benefit from the wide experience and knowledge of visual data processing and M2M communications available among its high-calibre industrial collaborators: Mediatek, Samsung, Ericsson, Thales, iniLabs, and Keysight Technologies. Shikh-Bahaei says:

“After securing the EPSRC funding for our ‘SENSE’ consortium project in September 2016, I am very pleased to succeed with IoSiRe, another collaborative EPSRC project on advanced 5G and IoT technologies. In addition to my co-investigators within King’s, I am excited to work in the IoSiRe project with great collaborators from UCL, Kingston University, and leading industries. I am confident that the IoSiRe project can deliver massive advances to IoT, vision sensing and cloud computing technologies, and will pave the way to realising visual data transmission over the machine-to-machine networks of the future.”
