SLAM: the main event

Greg Blackman reports from a KTN-organised image processing conference, where event cameras and the future of robotic vision were discussed

Autonomous cars, package-delivery drones, and virtual reality headsets have all been shown to work, but none has yet made it into everyday life. Part of the reason, according to Owen Nicholson, CEO and co-founder of Imperial College London spin-off Slamcore, lies with the simultaneous localisation and mapping (SLAM) algorithms on which much of this technology depends. ‘We need to get SLAM algorithms working with affordable hardware, and we’re still not there yet,’ he commented during an intelligent imaging event, jointly organised by the UK Knowledge Transfer Network (KTN) and the Institution of Engineering and Technology (IET), held in London on 1 March.

Nicholson pointed out that self-driving cars are not ready for mass deployment, that drones crash when they are not under manual control, and that half of VR users suffer from motion sickness because of latency issues.

Within the field of robotics, SLAM algorithms have been developed and refined since the early 1990s. They are designed to construct a map of an unknown environment while simultaneously pinpointing where the robot is within its surroundings. Sensors identify features in the scene that can be recognised from different positions and used to triangulate the robot’s location.
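The localisation half of that loop can be illustrated with classic trilateration: given the map positions of a few recognised features and the measured distances to them, the robot's position drops out of a small linear system. The sketch below is purely illustrative, not Slamcore's method; the landmark coordinates and ranges are invented for the example.

```python
import math

def locate(landmarks, ranges):
    """Estimate a 2D position from ranges to three known landmarks.

    Subtracting the first circle equation (x - xi)^2 + (y - yi)^2 = ri^2
    from the other two cancels the quadratic terms, leaving two linear
    equations in (x, y) that are solved directly.
    """
    (x0, y0), (x1, y1), (x2, y2) = landmarks
    r0, r1, r2 = ranges
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21  # non-zero when landmarks are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# A robot at (2, 3) ranging off three mapped landmarks
landmarks = [(0, 0), (10, 0), (0, 10)]
ranges = [math.hypot(2.0 - x, 3.0 - y) for x, y in landmarks]
print(locate(landmarks, ranges))  # ~ (2.0, 3.0)
```

A real SLAM system does this jointly over thousands of features and all past poses, typically with a filter or non-linear least-squares back end, but the geometric core is the same.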

In 2003, SLAM was shown to work with a single camera, and since then other sensor data, including that from depth sensors, has been used for robot guidance. Slamcore, which has had investment from Amadeus Capital, among other investors, is developing SLAM solutions fusing different sensor data.

The company is also writing algorithms for event cameras, technology that has been around for 10 years but has not made it out of the laboratory, and which Nicholson feels could offer real benefits for robotics.

Event cameras have no concept of frames; instead, each pixel reports an event only when the brightness it sees changes, producing a sparse, asynchronous stream. Because there are no frames, most conventional vision algorithms cannot work with the data directly.
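The data model can be sketched by simulating events from ordinary frames: a pixel fires a (time, x, y, polarity) event whenever its log-intensity has moved past a threshold since its last event. This is an illustrative approximation only; real event sensors do this asynchronously in analogue circuitry at each pixel, and the threshold value here is arbitrary.

```python
import math

def frames_to_events(frames, threshold=0.2, eps=1e-6):
    """Emit (t, x, y, polarity) events where log-intensity has changed
    by at least `threshold` since the pixel's last event."""
    ref = [[math.log(v + eps) for v in row] for row in frames[0]]
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                delta = math.log(v + eps) - ref[y][x]
                if abs(delta) >= threshold:
                    events.append((t, x, y, 1 if delta > 0 else -1))
                    ref[y][x] = math.log(v + eps)  # reset reference level
    return events

# A 2x2 scene in which only one pixel brightens at t=1
frames = [[[0.5, 0.5], [0.5, 0.5]],
          [[0.5, 0.5], [0.5, 1.0]]]
print(frames_to_events(frames))  # only the changed pixel fires: [(1, 1, 1, 1)]
```

Static parts of the scene generate no data at all, which is what breaks the assumptions of frame-based algorithms and what gives event cameras their efficiency.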

Chronocam is one firm that has raised investment, most notably from Renault, for its event camera-based vision sensors, for which it won best start-up at the 2016 Inpho Venture Summit, an investment conference in Bordeaux, France.

On the software side, Slamcore co-founder Hanme Kim and colleagues at Imperial College London won the best paper award at the 2014 British Machine Vision Conference for work on simultaneous mosaicing and tracking with an event camera.

The benefits of event cameras are that they give high dynamic range and are able to cope with fast movement in the scene, but Nicholson said that the ‘real future of event cameras lies in their low power consumption’. He said there is an order of magnitude improvement in the data rate and power consumption of event cameras compared to standard cameras, because event sensors only report information when something in the scene changes.
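A back-of-envelope calculation gives a sense of scale. All figures below are assumptions for illustration, not numbers quoted at the event: a VGA frame camera streaming constantly, versus an event camera watching a mostly static scene in which a small fraction of pixels change each second.

```python
# Frame camera: every pixel, every frame, whether or not anything changed
width, height, fps, bytes_per_px = 640, 480, 30, 1
frame_rate_bytes = width * height * fps * bytes_per_px  # ~9.2 MB/s

# Event camera: assume 5% of pixels fire one event per second,
# each event packed into 8 bytes (x, y, timestamp, polarity)
active_fraction = 0.05
bytes_per_event = 8
event_rate_bytes = int(width * height * active_fraction * bytes_per_event)

print(frame_rate_bytes, event_rate_bytes,
      frame_rate_bytes // event_rate_bytes)  # 9216000 122880 75
```

Under these assumed figures the event stream is tens of times smaller; in a busier scene the gap narrows, which is why the advantage depends so strongly on how much of the scene is changing.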

Nicholson commented during his presentation that there is ‘still lots to do on event camera hardware’, and that ‘algorithms and hardware need to be built hand in hand’.

Chronocam describes its event camera-based sensors as 'bio-inspired vision technology'. In a similar vein, Andrew Schofield, a senior lecturer in the school of psychology at the University of Birmingham, described work undertaken by the Visual Image Interpretation in Humans and Machines (ViiHM) computer vision network, which aims to transfer understanding of biological vision to help solve problems in computer vision. ViiHM, funded by the UK Engineering and Physical Sciences Research Council (EPSRC), has set out grand challenges - theoretical, technical and application - for the computer vision and biological vision communities, with the goal of developing a general-purpose vision system for robotics.

The Intelligent Imaging event brought together academia and industry, with presentations on image processing in art investigation, defence applications, super-resolution microscopy, and space imaging. In his introduction, Nigel Rix, head of enabling technologies at KTN, commented that the UK has a good science and innovation base, but is less good at commercialising those innovations. The KTN aims to act as a bridge between academia and industry, providing funding for technology readiness levels of four to six.
