Lidar working distance improved


Using MEMS tuneable VCSELs, a team has extended the operable distance of a lidar system to 10 metres. The University of California, Berkeley team say that this could one day enable a self-driving car to spot a child in the street half a block away, answer a smartphone from across the room with a wave of a hand, or play virtual tennis in real space.

The lidar system uses light to provide feedback about the world around it. Lidar systems of this type emit laser light that hits an object, and can tell how far away that object is by measuring changes in the frequency of the light reflected back.

The system can remotely sense objects at distances of up to 10 metres, 10 times farther than comparable current low-power laser systems.

‘While metre-level operating distance is adequate for many traditional metrology instruments, the sweet spot for emerging consumer and robotics applications is around 10 metres,’ said UC Berkeley’s Behnam Behroozpour, who will present the team’s work at CLEO 2014, being held 8-13 June in San Jose, California, USA. ‘This range covers the size of typical living spaces while avoiding excessive power dissipation and possible eye safety concerns.’

The researchers sought to shrink the size and power consumption of lidar systems without compromising their performance in terms of distance. By driving the MEMS device at its mechanical resonance, the researchers were able to amplify the system's signal without a large expenditure of power.
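The power saving from resonant operation can be illustrated with the standard damped-oscillator response: at its resonant frequency, a lightly damped MEMS structure deflects roughly Q times farther than it would under the same drive force applied statically. The sketch below is illustrative only; the resonant frequency and quality factor are assumed values, not figures from the Berkeley system.

```python
import math

def drive_response(omega, omega0, q_factor):
    """Deflection of a damped harmonic oscillator at drive frequency
    omega, relative to its static (DC) deflection for the same force."""
    return omega0**2 / math.sqrt(
        (omega0**2 - omega**2)**2 + (omega0 * omega / q_factor)**2
    )

omega0 = 2 * math.pi * 500e3  # assumed 500 kHz mechanical resonance

# At resonance the deflection is amplified by the quality factor Q.
print(round(drive_response(omega0, omega0, 100)))           # 100
# Well below resonance the response is essentially the static one.
print(round(drive_response(0.01 * omega0, omega0, 100), 2))  # 1.0
```

The Q-fold amplification at resonance is why the mirror can sweep the laser wavelength over a wide range with only a small drive power.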

In their new system, the team used a type of lidar called frequency-modulated continuous-wave (FMCW) lidar, which would ensure their imager had good resolution with lower power consumption, Behroozpour said. This type of system emits ‘frequency-chirped’ laser light – light whose frequency either increases or decreases over time – onto an object, and then measures changes in the frequency of the light reflected back.
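In an FMCW system, the reflected light is mixed with the outgoing chirp, and the round-trip delay shows up as a beat frequency proportional to range. A minimal sketch of that relationship, using the textbook linear-chirp formula R = c·f_beat·T / (2B); the chirp bandwidth and period are assumed example values, not parameters of the Berkeley system:

```python
C = 3.0e8  # speed of light, m/s

def fmcw_range(f_beat_hz, chirp_bandwidth_hz, chirp_period_s):
    """Target range from the measured beat frequency of a linear
    FMCW chirp: R = c * f_beat * T / (2 * B)."""
    return C * f_beat_hz * chirp_period_s / (2.0 * chirp_bandwidth_hz)

# Example: a 1 GHz chirp swept over 1 ms. A target at 10 m produces a
# beat frequency of 2*B*R/(c*T) ≈ 66.7 kHz.
B = 1.0e9   # chirp bandwidth, Hz (assumed)
T = 1.0e-3  # chirp period, s (assumed)
f_beat = 2 * B * 10.0 / (C * T)
print(round(fmcw_range(f_beat, B, T), 2))  # 10.0
```

Because range is read out as a frequency rather than a picosecond-scale time of flight, the detection electronics can run at modest bandwidths, which is what keeps the power consumption low.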

With further development, the team say the technology could be used to make smaller, cheaper 3D imaging systems that offer exceptional range for potential use in self-driving cars, smartphones and interactive video games like Microsoft’s Kinect, without the need for large boxes of electronics or optics.

The team’s next plans include integrating the VCSEL, photonics and electronics into a chip-scale package. Consolidating these parts should open up possibilities for ‘a host of new applications that have not even been invented yet’, Behroozpour concluded.
