
Lidar working distance improved


Using MEMS tuneable VCSELs, a team has extended the working range of a lidar system to 10 metres. The University of California, Berkeley team say that this could one day enable a self-driving car to spot a child in the street half a block away, answer a smartphone from across the room with a wave of a hand, or play virtual tennis in real space.

Lidar uses laser light to sense the surrounding environment: the system emits light that reflects off an object, and it determines how far away that object is by measuring the change in frequency of the light reflected back.

The system can remotely sense objects at distances of up to 10 metres, 10 times farther than comparable current low-power laser systems.

‘While metre-level operating distance is adequate for many traditional metrology instruments, the sweet spot for emerging consumer and robotics applications is around 10 metres,’ said UC Berkeley’s Behnam Behroozpour, who will present the team’s work at CLEO 2014, being held 8-13 June in San Jose, California, USA. ‘This range covers the size of typical living spaces while avoiding excessive power dissipation and possible eye safety concerns.’

The researchers sought to shrink the size and power consumption of lidar systems without compromising their range. By driving the MEMS device at its mechanical resonance, the researchers were able to amplify the system’s signal without a large power penalty.
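The benefit of driving a mechanical resonator at resonance can be sketched with the textbook damped-harmonic-oscillator response; the values below are illustrative only, not parameters of the Berkeley device:

```python
import math

def steady_state_amplitude(drive_freq, resonant_freq, q_factor, static_deflection=1.0):
    """Steady-state amplitude of a driven damped harmonic oscillator,
    normalised to the static (DC) deflection produced by the same drive force.
    Frequencies may be given in any consistent units."""
    r = drive_freq / resonant_freq
    return static_deflection / math.sqrt((1 - r**2)**2 + (r / q_factor)**2)

# Far below resonance the response is roughly the static deflection (~1x);
# exactly at resonance it is amplified by the quality factor Q.
print(steady_state_amplitude(0.1, 1.0, q_factor=100))  # close to 1
print(steady_state_amplitude(1.0, 1.0, q_factor=100))  # 100.0, i.e. a gain of Q
```

This is why resonant operation yields a large tuning amplitude for the same drive power: the motion is amplified by the resonator's quality factor rather than by spending more electrical power.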

In their new system, the team used a type of lidar called frequency-modulated continuous-wave (FMCW) lidar, which would ensure their imager had good resolution with lower power consumption, Behroozpour said. This type of system emits ‘frequency-chirped’ laser light – light that has either increasing or decreasing frequency – on an object and then measures changes in the light frequency that is reflected back.
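The FMCW ranging principle described above can be sketched as follows: for a linear chirp, the beat frequency between the outgoing and reflected light is proportional to the round-trip delay, and hence to distance. The chirp bandwidth and period below are illustrative assumptions, not the team's actual parameters:

```python
C = 3.0e8  # speed of light, m/s

def fmcw_distance(beat_freq_hz, chirp_bandwidth_hz, chirp_period_s):
    """Distance to a target from the measured beat frequency of a
    linear FMCW chirp.

    chirp_rate = bandwidth / period        (Hz per second)
    round-trip delay tau = beat / chirp_rate
    distance = c * tau / 2                 (halved: out and back)
    """
    chirp_rate = chirp_bandwidth_hz / chirp_period_s
    tau = beat_freq_hz / chirp_rate
    return C * tau / 2.0

# With an assumed 1 GHz chirp swept over 1 ms, a target at about
# 10 m produces a beat frequency near 67 kHz:
print(fmcw_distance(66_667, 1e9, 1e-3))  # ~10.0 (metres)
```

Because the range is encoded in a beat frequency rather than a raw time-of-flight pulse, FMCW systems can achieve fine resolution with modest detector bandwidth, which is the low-power advantage Behroozpour refers to.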

With further development, the team say the technology could be used to make smaller, cheaper 3D imaging systems that offer exceptional range for potential use in self-driving cars, smartphones and interactive video games like Microsoft’s Kinect, without the need for large boxes of electronics or optics.

The team’s next plans include integrating the VCSEL, photonics and electronics into a chip-scale package. Consolidating these parts should open up possibilities for ‘a host of new applications that have not even been invented yet’, Behroozpour concluded.
