
SLAM algorithm developer raises $5m


SLAMcore, a UK developer of Simultaneous Localisation and Mapping (SLAM) algorithms for robots and drones, has raised $5m in funding to help bring its technology to market.

The funding was led by global technology investor Amadeus Capital Partners, with existing investors Toyota AI Ventures and the Mirai Creation Fund also joining the funding round, in addition to newcomers MMC Ventures and Octopus Ventures.

SLAM algorithms are used by robots and drones to acquire spatial intelligence, enabling them to accurately calculate their position, understand unfamiliar surroundings, and navigate with consistent reliability.
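
For readers unfamiliar with the technique, the following is a minimal, hypothetical Python sketch of the idea behind SLAM: the system fuses noisy odometry (prediction) with observations of landmarks (correction) to estimate the robot's own position and a map at the same time. It is purely illustrative and not based on SLAMcore's software; all names, noise levels and the fixed 0.5 'gain' are assumptions standing in for a real filter- or optimisation-based SLAM pipeline.

# Illustrative SLAM-style prediction/correction loop (hypothetical example,
# not SLAMcore code). Requires only numpy.
import numpy as np

rng = np.random.default_rng(0)

true_pose = np.array([0.0, 0.0])                  # robot's true (x, y) position
est_pose = np.array([0.0, 0.0])                   # robot's estimate of its position
landmark = np.array([5.0, 3.0])                   # a fixed landmark in the world
est_landmark = landmark + rng.normal(0, 0.5, 2)   # noisy initial map entry

for step in range(20):
    # Prediction: apply a motion command, corrupted by odometry noise.
    motion = np.array([0.5, 0.2])
    true_pose = true_pose + motion
    est_pose = est_pose + motion + rng.normal(0, 0.05, 2)

    # Correction: observe the landmark relative to the robot.
    observation = (landmark - true_pose) + rng.normal(0, 0.02, 2)

    # Innovation: mismatch between what was seen and what the map predicts.
    innovation = observation - (est_landmark - est_pose)

    # Split the correction between pose and map (a crude stand-in for the
    # gain a real filter- or graph-based SLAM system would compute).
    est_pose = est_pose - 0.5 * innovation
    est_landmark = est_landmark + 0.5 * innovation

print("true pose:     ", np.round(true_pose, 2))
print("estimated pose:", np.round(est_pose, 2))
print("true landmark: ", landmark)
print("est. landmark: ", np.round(est_landmark, 2))

Real systems replace the fixed gain with a properly weighted estimator (for example an extended Kalman filter or pose-graph optimisation) and handle many landmarks, camera geometry and loop closure.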

‘BIS Research estimates the global SLAM technology market to be worth over $8 billion by 2027,’ commented Amelia Armour, principal of Amadeus Capital Partners. ‘This funding round will enable SLAMcore to take its spatial AI solution to that growing market and we expect demand for its affordable and flexible system to be high. Having backed SLAMcore at the start, we’re excited to be investing again at this critical stage for the company.’

SLAMcore spun out from the Department of Computing at Imperial College London in early 2016 and closed its first funding round in March 2017. Headquartered in London, UK, the company has grown to a team of 15 with a range of expertise in designing and deploying spatial algorithms for robots.

‘The robotics revolution may seem just around the corner but there is still a big gap between the videos we see on the internet and real-world robots,’ said SLAMcore CEO Owen Nicholson. ‘SLAMcore is helping robot and drone creators to bridge the gap between demos and commercially-viable systems.’

‘It is a really exciting time for robotics,’ added SLAMcore co-founder Dr Stefan Leutenegger. ‘We are seeing a convergence of geometric computer vision algorithms, availability of high-performance computational hardware, and deep learning. We are embracing this new world and will move quickly towards offering solutions for robots requiring an advanced level of understanding of their environment.’

‘Our initial product will calculate an accurate and reliable position, without the need for GPS or any other external infrastructure, but that is just the start,’ Nicholson continued. ‘With this funding, we will also develop detailed mapping solutions capable of creating geometrically accurate reconstructions of a robot’s surroundings in real time, and understanding the objects within, utilising the latest developments in machine learning.’
