Intel releases SLAM camera for warehouse navigation


Intel has launched a camera with simultaneous localisation and mapping (SLAM) built in, designed for autonomous devices.

The RealSense T265 tracking camera is based on visual and inertial sensor fusion. It uses inputs from dual fisheye cameras and a Bosch inertial measurement unit (IMU), along with its own onboard processor: a Movidius Myriad 2 MA215x ASIC that provides edge processing capabilities for real-time six degrees-of-freedom (6DoF) poses.

These 6DoF measurements combine translation and rotation data, which is crucial for estimating the pose of autonomous devices and for sensing the environment so they can move without collisions or other errors.
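To illustrate what these poses look like on the host side, the sketch below reads the 6DoF pose stream using Intel's open-source librealsense SDK (the pyrealsense2 Python bindings). The stream type and field names follow the SDK's published T265 examples; treat it as a minimal sketch rather than a complete application.

```python
# Minimal sketch: read 6DoF poses from a RealSense T265 via pyrealsense2.
# Assumes librealsense with Python bindings is installed and a T265 is connected.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.pose)  # 6DoF pose stream computed on the camera itself

pipeline.start(config)
try:
    for _ in range(100):
        frames = pipeline.wait_for_frames()
        pose_frame = frames.get_pose_frame()
        if pose_frame:
            pose = pose_frame.get_pose_data()
            # Translation in metres, rotation as a quaternion (x, y, z, w)
            print("translation:", pose.translation, "rotation:", pose.rotation)
finally:
    pipeline.stop()
```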

The camera’s SLAM system helps machines and devices navigate in warehouses, logistics centres and other environments without GPS.

Intel RealSense devices are available from Framos. Darren Bessette, category manager devices at Framos, said: ‘Until now, developers were heavily challenged to use a single IMU or vision sensor to measure both orientation and translation in space. Intel’s T265 hybrid approach improves the precision of pose estimation for drones, robots and the immersive experience in AR/VR applications based on the paired strengths of both measuring methods.’

The camera includes all components on a single board. Middleware processing runs directly on the Movidius Myriad 2 MA215x ASIC, which frees host CPU resources and supplies the host system with 6DoF poses at low latency.

The Bosch BMI055 IMU provides an accelerometer and gyroscope with high sample rates in a single package, while each OV9282 fisheye camera uses a monochrome global-shutter image sensor with a 160° field of view, complemented by an infrared cut filter. The T265 supports USB 3.0, although USB 2.0 video streaming is sufficient to run the system.
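The monochrome fisheye images can also be streamed to the host alongside the pose data. The sketch below again assumes the pyrealsense2 bindings; the stream indices (1 and 2 for the left and right imagers) follow the SDK's T265 stereo example.

```python
# Minimal sketch: grab the two monochrome fisheye images from a T265.
# Assumes pyrealsense2 and numpy are installed and a T265 is connected.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.fisheye, 1)  # left imager
config.enable_stream(rs.stream.fisheye, 2)  # right imager

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    left = np.asanyarray(frames.get_fisheye_frame(1).get_data())
    right = np.asanyarray(frames.get_fisheye_frame(2).get_data())
    # 8-bit monochrome frames, one per imager
    print("left image:", left.shape, "right image:", right.shape)
finally:
    pipeline.stop()
```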
