
SLAMcore debuts full-stack spatial AI SDK in industry competition

The three most challenging questions in autonomous robotics are: where am I? How far away are the objects around me? And what are those objects? The vast majority of robot failures stem from an inability to answer these basic questions. As a result, developers and designers across the industry dedicate significant time, effort and resources as they struggle to build subsystems that provide these answers consistently, accurately and reliably.

A universal solution to a complex problem

Although the problem is complicated, we believe that the solution is universal and that it makes no sense for every robot developer to have an in-house team dedicated to answering these questions for each and every individual robot design. Our goal is to solve the three challenges as one, and in a way that can be simply integrated into virtually any autonomous robot. Our entry to the 2020 Embedded Vision Summit ‘Vision Tank’ competition demonstrates how we have done so.

https://youtu.be/7DEJzZWteD4

Vision is the key to simultaneous localization and mapping (SLAM) in robots, and to semantic understanding of the physical space a robot occupies. Cameras are at the heart of the solution, but other sensors including wheel odometry, inertial sensors and LIDAR are often needed. Fusing, calibrating and processing data from the right combination of sensors without overloading processor or memory capacity is crucial to overcoming SLAM challenges effectively. Until now, developers have been forced to rely on trial and error, adding sensors and creating new algorithms for every combination. Every new sensor added increases cost, complexity and development time. Other pressing issues are put on hold whilst teams wrestle with getting SLAM right.
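To make the idea of sensor fusion concrete, here is a minimal sketch (not the SLAMcore SDK, whose API is not described in this article): blending a visual position estimate with wheel odometry using a fixed-gain complementary filter. Production SLAM systems use probabilistic filters or factor graphs, but the underlying idea of weighting each sensor by how much you trust it is the same.

```python
def fuse_pose(visual_xy, odom_xy, visual_weight=0.8):
    """Blend two (x, y) position estimates from different sensors.

    A higher visual_weight trusts the camera-based estimate more than
    the (drift-prone) wheel-odometry estimate. Illustrative only.
    """
    w = visual_weight
    return tuple(w * v + (1.0 - w) * o for v, o in zip(visual_xy, odom_xy))

# Vision reports (1.0, 2.0); wheel odometry has drifted to (1.2, 2.4).
fused = fuse_pose((1.0, 2.0), (1.2, 2.4))
```

The fixed gain here stands in for what a real system derives from each sensor's noise model; the point is simply that neither sensor is used alone.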

All the algorithms you need

No longer. SLAMcore is debuting a new approach with our software development kit (SDK). This tool gives developers everything they need to build, test and deploy SLAM solutions using our algorithms and low-cost, easily available off-the-shelf sensors. The SLAMcore SDK is not a ‘plug and play’ SLAM solution, but a toolkit. It allows roboticists to combine proprietary visual positioning algorithms with state-of-the-art calibration and fusion algorithms and complementary tools which use data from low-cost sensors and run on low-power processors such as Raspberry Pi.

Building on decades of research, the SLAMcore algorithms outperform other solutions by an order of magnitude in terms of accuracy, robustness and computational efficiency. The SDK includes algorithms that deliver location, mapping and perception capabilities. Each aspect builds on and improves the others; more accurate positioning improves the quality of 3D mapping; panoptic segmentation facilitates semantic understanding and the removal of dynamic objects from maps, improving localization capabilities and reducing computational demands. We believe that we are alone in addressing these three challenges as one. This full-stack approach to spatial AI provides a single, universal solution that any developer can integrate into their designs.
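The interplay described above — segmentation feeding back into mapping — can be illustrated with a short sketch. The class names and data layout below are assumptions for illustration, not part of the SLAMcore SDK: landmarks that panoptic segmentation labels as dynamic (people, vehicles) are excluded before the map is updated, so transient objects never corrupt localization.

```python
# Assumed label set for this sketch; real segmentation models define their own.
DYNAMIC_CLASSES = {"person", "car", "bicycle"}

def filter_static_landmarks(landmarks):
    """Keep only landmarks whose semantic class is static (map-worthy)."""
    return [lm for lm in landmarks if lm["cls"] not in DYNAMIC_CLASSES]

observations = [
    {"id": 1, "cls": "wall"},
    {"id": 2, "cls": "person"},  # dynamic: excluded from the map
    {"id": 3, "cls": "shelf"},
]
static_map = filter_static_landmarks(observations)
```

Dropping dynamic landmarks both improves map quality and reduces the number of features the localizer has to match against, which is the computational saving the paragraph above refers to.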

An SDK to stimulate the industry

The SLAMcore SDK puts all of these advantages into the hands of anyone looking to build robots to help society. We have solved complex SLAM challenges for them so they can focus on the specific needs of their robotic solutions. In-house resources are freed up, decisions around combinations of sensors can be made and tested quickly, and overall development time and cost are reduced. Our commercial model will include a per-seat monthly fee to use the SDK for development, and a monthly per-robot fee for deployment. In this way we believe we can open the potential of this technology to as many developers as possible, irrespective of their size or stage of business.

SLAMcore’s mission is to make quality spatial AI available to all and to stimulate the development, production and deployment of robotic solutions to the world’s most challenging problems. The SDK demonstrates our commitment to creating and maintaining a robotics ecosystem that shares specialist knowledge and capabilities, creates re-usable solutions and establishes a supply chain approach that will deliver better robots, faster and at lower cost to benefit everyone.

The SDK is currently in alpha. If you'd like to be considered as a participant in our beta trials, please register your interest here.
