Global shutter for ADAS

Cliff Cheng, senior director of automotive marketing at OmniVision Technologies, details the company's latest global shutter sensor for industrial and automotive imaging

OmniVision Technologies has for years been developing CMOS image sensors with global shutter operation for a wide variety of markets, including surveillance, industrial machine vision and, more recently, automotive-grade machine vision and viewing applications.

The OV2311 image sensor is among OmniVision's latest offerings for advanced driver assistance system (ADAS) applications such as driver monitoring. It features OmniPixel3-GS global shutter technology with a 3µm pixel pitch, and comes in a compact 7.219 x 6.157mm automotive chip-scale package that is Automotive Safety Integrity Level (ASIL) B and AEC-Q100 qualified.

The sensor offers a maximum resolution of 2 megapixels, providing 1,600 x 1,300 at 60fps, 1,280 x 720 at 90fps or 640 x 480 at 180fps, depending on the configuration. The OV2311 also has a PWM output for controlling and synchronising an external IR LED, which enables in-cabin monitoring in dark operating conditions. Additionally, the image sensor is equipped with safety features such as a watchdog pulse and a calibrated temperature sensor.
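To put those figures in context, the frame-time budget and pixel throughput implied by each listed output mode follow directly from the numbers above. The short Python sketch below works them out; it is purely illustrative arithmetic, not a configuration interface for the sensor.

# Illustrative only: frame period and pixel throughput for the
# OV2311 output modes listed above.
MODES = [
    (1600, 1300, 60),   # full 2-megapixel mode
    (1280, 720, 90),    # 720p mode
    (640, 480, 180),    # VGA mode
]

for width, height, fps in MODES:
    frame_period_ms = 1000.0 / fps                 # time budget per frame
    pixel_rate_mpps = width * height * fps / 1e6   # megapixels per second
    print(f"{width}x{height} @ {fps}fps: "
          f"{frame_period_ms:.2f}ms/frame, {pixel_rate_mpps:.1f} Mpix/s")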

Cameras that operate using a global shutter sensor do not suffer from motion distortion – as can be the case when capturing live video of moving objects using rolling shutter sensors – because all of the pixels for the entire image frame are exposed at the same time.

Rolling shutter artefacts result from the nature of rolling shutter operation, in which each line of a frame is exposed sequentially at a different time, causing a temporal shift between lines. For applications that require images free of shape and pattern distortion, the highly visible skew in rolling shutter images of moving subjects is unacceptable. Machine vision applications such as barcode scanning, package inspection, face recognition for automotive driver monitoring systems, and augmented and virtual reality devices, to name a few, have zero tolerance for motion distortion artefacts.
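To give a feel for the scale of the effect, the sketch below estimates the top-to-bottom readout skew of a rolling shutter and the resulting shear on a moving subject. The line readout time and object speed are assumed example values, not OV2311 specifications; a global shutter has zero skew by definition, because every row is exposed simultaneously.

# Rough estimate of rolling-shutter skew versus a global shutter.
# The line readout time and object speed are assumed example values.
LINE_READOUT_US = 15.0       # assumed readout time per row (microseconds)
NUM_ROWS = 1300              # rows in a full-resolution frame
OBJECT_SPEED_PX_S = 2000     # assumed horizontal speed of a moving object

# With a rolling shutter, the last row starts exposing this much later
# than the first row:
skew_s = LINE_READOUT_US * 1e-6 * (NUM_ROWS - 1)

# Over that interval the subject drifts sideways, shearing vertical edges:
shear_px = OBJECT_SPEED_PX_S * skew_s

print(f"Rolling shutter skew: {skew_s * 1000:.1f}ms "
      f"-> ~{shear_px:.0f}px of shear on vertical edges")
print("Global shutter skew: 0ms -> no shear (all rows exposed together)")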
