Driver assistance with time-of-flight at IS2013

Greg Blackman reports from the Image Sensors 2013 conference, including the latest from Toyota Central R&D Labs on its time-of-flight sensor for driver assistance

Toyota's Central R&D Labs in Japan has developed a prototype high-definition imaging lidar sensor for its Advanced Driver Assistance Systems (ADAS). The time-of-flight 3D sensor, which has a 100m range, could potentially be used in future Toyota vehicles for functions such as obstacle avoidance, pedestrian detection and lane detection.

Speaking at the Image Sensors conference (IS2013) in London (19-21 March), Dr Cristiano Niclass of Toyota Central R&D Labs said that the imaging lidar sensor provides more comprehensive coverage of the road for driver assistance than millimetre-wave radar or stereovision, both of which have drawbacks: millimetre-wave radar has excellent range but low angular resolution and a narrow field of view, whereas stereovision offers a wider field of view but only operates over shorter distances.

The lidar sensor has a range of 100m and operates under bright sunlight, as well as in rain and other adverse weather conditions, Dr Niclass said. The system uses a polygon mirror to steer the laser beam in six different directions and a time-of-flight sensor with a resolution of 202 x 96 pixels running at 10fps. It is eye-safe, with a laser output power of 21mW.

The system was tested under a background illuminance of 70klux and a sky illuminance of greater than 100klux, and recorded an error of less than 15cm at 100m distance at 10fps.
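The figures reported above fit together via the standard time-of-flight relation, distance = (speed of light x round-trip time) / 2 — the general principle behind such lidar sensors, not a description of Toyota's specific implementation. A minimal sketch (function name is illustrative) shows that the quoted sub-15cm error at 100m implies resolving the pulse round trip to roughly a nanosecond:

```python
# Illustrative time-of-flight arithmetic; not Toyota's implementation.
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(round_trip_s: float) -> float:
    """Convert a measured laser pulse round-trip time to a one-way distance."""
    return C * round_trip_s / 2.0

# A target at 100 m returns the pulse after roughly 667 ns.
round_trip = 2 * 100.0 / C
print(f"round trip for 100 m: {round_trip * 1e9:.1f} ns")

# A distance error of 15 cm corresponds to timing the round trip
# to about a nanosecond.
timing_error = 2 * 0.15 / C
print(f"timing precision for 15 cm: {timing_error * 1e9:.2f} ns")
```

This back-of-the-envelope conversion is why single-photon detectors with sub-nanosecond timing are attractive for automotive lidar at these ranges.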

Dr Niclass said the sensor would recognise most objects based on the 3D data. He said that at 10fps there would be some motion artefacts when driving at speed on a motorway, for instance, but added that the system is designed for driver assistance rather than autonomous driving, which is a lot more challenging.

The IS2013 conference covers all aspects of image sensor technology for consumer, broadcast, industrial, and scientific imaging. In his presentation, Jim Lewis, CEO of Swiss 3D imaging company Mesa Imaging, challenged what he considered to be myths surrounding the capabilities of time-of-flight (TOF) imaging. He suggested that TOF technology needed improvement for outdoor operation, where direct sunlight, longer operating ranges, variable scenes, and bad weather all make acquiring accurate 3D data difficult. He also said there was more to building an industrial-grade TOF camera than perfecting the pixel: LEDs degrade over time and the camera has an inherent temperature variation, all of which complicate engineering a robust industrial camera.

Lewis said that to widen the adoption of TOF imaging in industry, depth map stability has to be improved, the technology has to track fast-moving objects better, and the dynamic range has to increase. Mesa Imaging’s latest TOF sensor is the Onyx, a 25k-pixel analogue sensor.

As well as 3D imaging, the conference had presentations on multi-aperture imaging, broadcast image sensor technology, CMOS RGB Clear technology, digital cameras, and medical applications. The latest market data was presented by Paul Danini, a market analyst at French market research firm Yole Développement. He said the CMOS image sensor market was worth $6.6 billion in 2012, 60 per cent of which was accounted for by the mobile phone sector, and that tablets and automotive were two areas expected to grow significantly in the future.
