Imaging on impact


Greg Blackman assesses the equipment characteristics required to provide vision solutions in harsh environments

Even cameras for the classic machine vision application of the factory floor need to be fairly robust, as many manufacturing environments expose equipment to high temperatures and dusty atmospheres.

With applications of vision growing all the time, the term ‘harsh environments’ has to encompass much more than manufacturing. For example, the destructive forces involved in road traffic accidents can be immense and the consequences for the occupants severe. Wreckage from a high-speed collision on a busy stretch of dual carriageway is a sharp reminder of just how destructive these forces can be. According to the European New Car Assessment Programme (Euro NCAP), 42,000 fatalities occur on European roads every year.

The advent of crash test dummies in the 1950s allowed the effects of high-speed impacts on humans to be simulated. Crash tests have led to the development and implementation of various car safety features, such as seatbelts, airbags, crumple zones, and side impact bars.

Vehicle manufacturers are paying increasing attention to designing their cars with safety in mind and there is a wealth of legislation to which modern cars must adhere in order to be deemed roadworthy. Simulating the forces involved in road accidents and the effects on the occupants in crash test scenarios is an important part of assessing how safe a vehicle is. Vision systems play their part in these tests, relaying images of the crash for analysis, and as such, they themselves must be able to withstand the levels of vibration and shock involved in high velocity impacts.

Photron, a manufacturer of high-speed digital video cameras, provides vision systems for automotive safety testing. Its Fastcam MH4 and SA3 models are both designed to withstand 100G at 10ms in six axes.

There are two types of crash test commonly carried out by car manufacturers: full body testing, where a vehicle is propelled into a barrier and the effects of the impact are monitored, and ‘sled testing’, in which a vehicle interior is placed on a high G testing rig that simulates the forces generated by a high velocity impact and tests the effects of such forces on individual components within the car. In some cases, tests are repeated at a rate of 20 per day.

Photron’s high-speed cameras have to withstand extreme pressure conditions when used to monitor crash test scenarios.

‘The legislative requirements being placed on the automotive industry are becoming more rigorous. More tests are required and in greater detail,’ explains Andrew Hilton, managing director of Photron in the UK. ‘Companies are increasingly asking for more detailed analysis of components within a crash test event.’

‘One of the obvious design challenges for creating these cameras is making them rugged enough to withstand high G. The other challenge is to make them small and light,’ says Hilton. He adds that, traditionally, cameras able to stand up to the forces of a crash test have been large, heavy objects. ‘To make a camera rugged is to make it heavy,’ he says. Furthermore, a large camera requires a strong, solid mount to fix it in place, which adds further weight to the assembly.

Automotive safety testing is meant to provide an accurate representation of the damage caused by a high velocity impact on a car unencumbered by heavy cameras and mountings. ‘If you start putting heavy camera assemblies in a car it invalidates the test,’ Hilton says. Small cameras also have the advantage that they can be placed in awkward positions inside the vehicle to monitor individual components in detail. Aspects such as the impact of the head on the headrest, or the effect of the impact on seatbelt fastenings or the driver’s pedals, can be closely examined with small cameras.

Photron’s Fastcam MH4 allows four separate camera heads to be positioned within the vehicle, each a 35mm cube weighing 100g, thereby fulfilling the criteria of ‘small and light’.

While Photron’s cameras are designed to withstand shock in particular, Edmund Optics, a manufacturer of optical components, has developed its harsh environment optics (HEO) lenses that are hermetically sealed to keep out the elements.

Edmund Optics prototyped and produced its HEO lenses for use in car camera sensors to aid parking. Lexus now produces a model that can parallel park itself using strategically placed cameras. Other car models employ front-mounted sensors that provide adaptive cruise control by measuring the distance to the vehicle in front and adjusting the car’s speed accordingly.

Edmund Optics’ HEO lenses are tested with a high-pressure spray to ensure they can cope with harsh environments.

The cameras need to be waterproof and able to withstand high speeds. The lenses were developed as a small, easily integrated package that could be used as part of a camera body. ‘We wanted to make the lenses small so that a custom camera solution wasn’t necessary,’ says Joel Bagwell, optical engineer at Edmund Optics.

Automotive manufacturers are able to integrate the lenses into vision systems of their choice. Testing was carried out to IPX9K and IPX7 standards, the former involving a high-pressure spray that forces water onto the lens at 15l/min and 10,000kPa of pressure. ‘The lenses are sprayed at 0°, 30°, 60°, and 90°,’ explains Bagwell, ensuring that the seals remain impervious to water even at high pressures. This is vital to prevent moisture entering the lens apparatus, which causes ‘fogging’ of the lens and obscures the view of the sensor. The IPX7 standard guarantees the lenses are waterproof: lens systems are immersed in water at a depth of 1m for 30 minutes. Assembly of the lens apparatus takes place in a controlled environment free from moisture.

‘Sealing is key,’ says Bagwell. ‘If you have a good sealing method then you can avoid fogging of the lens. Every single lens of ours undergoes a full set of testing so we can be sure that each one is going to stand up to the conditions we specify.’

Housing equipment in sealed containers to keep out water and dust is an important aspect of National Instruments’ (NI) approach to producing vision systems for use in hostile environments. NI produces a rugged Compact Vision System (CVS) that is tested to 50G shock, 5G vibration and can withstand temperatures of up to 55°C.

National Instruments’ CVS is used to measure scallop height, and has to be able to cope with extreme variations in temperature and an atmosphere with high saltwater moisture levels – particularly when used on board trawlers.

NI’s CVS was utilised by the National Oceanic and Atmospheric Administration (NOAA) Fisheries Service, a division of the US government’s Department of Commerce, to measure the size and record the abundance of deep-sea scallops along the coast of North Carolina, USA, as part of its annual survey.

The vision system analysed approximately 125,000 scallops dredged from the sea floor. Scallops were passed along a conveyor belt onboard the trawler and recorded by the CVS. ‘The equipment is exposed to a lot of salt water spray, so it is important to seal everything inside an enclosure that can be easily washed down,’ explains Matthew Slaughter, vision product marketing engineer at NI. ‘Maintenance is also important,’ continues Slaughter, explaining that checks must be made to ensure that the rubber seals and gaskets have not perished and that the electronics remain dry. The cabinet encasing the CVS’s hardware is engineered to meet the IP67 standard, which ensures the system is sealed against dust and water. Temperatures onboard the trawler can vary immensely. ‘Image sensors are sensitive to extremes of temperature,’ says Slaughter.

Typically, 45-55°C is the top end of the temperature range to which image sensors can be exposed before damage occurs. NI’s vision system contains no moving parts and is passively cooled, using heat sinks within the system’s architecture to draw heat away from sensitive components. The design of the vision system was fed into computer software that models where heat will accumulate, and the size, shape, and position of heat sinks needed to provide the maximum cooling effect.
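The passive-cooling constraint described above can be expressed as a simple back-of-the-envelope calculation: the heat sink's case-to-ambient thermal resistance must be low enough that the dissipated power does not push the sensor past its temperature limit. This is an illustrative sketch, not NI's actual thermal model; all figures in the example are assumptions.

```python
# Back-of-the-envelope passive-cooling check (illustrative, not NI's
# model). Steady-state: T_component = T_ambient + theta * P, so the
# heat sink's thermal resistance theta (°C/W) must satisfy
# theta <= (T_max - T_ambient) / P.

def required_thermal_resistance(t_max_c: float, t_ambient_c: float,
                                power_w: float) -> float:
    """Maximum allowable case-to-ambient thermal resistance in °C/W."""
    return (t_max_c - t_ambient_c) / power_w

# Example (hypothetical numbers): keep a sensor below 50°C in a 35°C
# trawler hold while the electronics dissipate 5 W.
theta = required_thermal_resistance(t_max_c=50, t_ambient_c=35, power_w=5)
print(f"Heat sink must achieve better than {theta:.1f} °C/W")  # 3.0 °C/W
```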

A level of resistance to vibration and shock is also important for vision systems. Cameras mounted in areas subjected to continuous heavy vibration require strong soldering to ensure components remain fixed in place. In addition, the camera needs to record a clear image. ‘This can be achieved using a low exposure time,’ suggests Slaughter, so that even if the camera or object moves, a sharp image is still captured. Software running on vision systems in high-vibration environments should also be designed to correct for variations in the image. Particularly rugged vision systems, such as NI’s CVS, are designed to withstand specified conditions, usually by encasing and sealing delicate components within a protective housing.

The more hostile an environment becomes, however, the greater the need for a more specialised approach to protecting vision systems – and with the range of vision applications diversifying all the time, that need will only increase.


Space may well be ‘the final frontier’ according to Star Trek, and it is certainly one of the harshest working environments known. A number of UKIVA members report involvement in vision applications in the space industry.

James Webb Space Telescope

While camera systems have always been an integral part of space exploration, vision systems are now being employed for complex and varied tasks. These range from actual mission applications to inspection of components to be used in the space environment.

Assessment of damage to the heat shield tiles on the Space Shuttle is critical in determining the risk for re-entry. Cameras from Adimec Advanced Image Systems have been used on a number of Space Shuttle missions. On the Endeavour mission in August 2007, the astronauts took more than 1,500 high-resolution images to determine the threat posed by a well-publicised gouge in the heat shield, as well as a few other damaged areas, and to decide whether a space walk was necessary. The imaging system played an important role in helping the shuttle and crew fly home safely. It is mounted at the end of the shuttle’s 50-foot robotic arm and is controlled by the crew. Multiple high-performance cameras are coupled to a Pleora iPort IP engine, which enables high-resolution images to be streamed to a laptop inside the shuttle over a standard Ethernet link.

Damage inspection has not been restricted to conventional imaging in the visible region of the spectrum. Infrared cameras from Flir Systems, attached to an astronaut’s spacewalk tool belt, have been used to image 10 pre-damaged reinforced carbon-carbon samples as part of an experiment to test orbiter heat shield repair techniques. The astronauts also used the camera to scan the wing leading edge of the shuttle and the radiators of the International Space Station. The images and videos were saved to a flash card and then transferred to a space-qualified laptop computer.

Boeing’s Orbital Express is an unmanned spacecraft capable of docking to, inspecting, servicing, deorbiting or relocating satellites. In July 2007, for the first time, Orbital Express successfully transferred propellant and a battery to a client satellite using a vision system on a robotic arm featuring a camera acquisition system by Active Silicon.

Such unmanned operations in orbit will significantly extend the life, operation and cost-effectiveness of various types of spacecraft. Vision systems also have a role to play in research in space. A thermographic camera has been used in European Space Agency experiments to create pure crystals in space. Although the process of crystallisation under zero-gravity conditions did not produce crystals of the quality hoped for, the camera system demonstrated why many similar experiments in space were unsuccessful, ultimately saving the considerable costs of pursuing this particular line of research.

Vision inspection systems are used on a wide range of components designed for use in space. For example, the James Webb Space Telescope is the next stepping-stone toward understanding the universe and studying the Big Bang theory. Within the telescope, programmable aperture masks, or microshutter arrays, in the near-infrared spectrometer are used for sensing hundreds of simultaneous spectra – a first in space-based astronomy. National Instruments’ LabVIEW graphical development environment is used to test the microshutter arrays, detecting and diagnosing problems early in development and greatly reducing overall cost. A microshutter array consists of tens of thousands of 100μm x 200μm shutter elements that can be individually held open or closed to create a custom image mask based on the sensor field of view.

In operation, elements are opened where objects exist and closed where interference between objects would occur on the detector. Because the testing environment must mimic the conditions of space, the software delivers system monitoring and control of temperature and pressure using General Purpose Interface Bus instrumentation. With this control, users can subject units to various temperature profiles that simulate the effects of solar warming.
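The open-where-objects, closed-where-interference logic above can be sketched as a simple mask-building routine. This is a hypothetical illustration of the principle, not the actual JWST software: the array dimensions, the row-wise dispersion direction, and the overlap rule are all assumptions.

```python
# Hypothetical sketch of microshutter mask logic (not the actual JWST
# code). Assumption: each open shutter disperses a spectrum along its
# row, so two open shutters in the same row closer than spectrum_len
# columns would overlap on the detector; the later one stays closed.
import numpy as np

def build_mask(shape, objects, spectrum_len):
    """Return a boolean mask: True = shutter open.

    objects: list of (row, col) target positions on the array.
    """
    mask = np.zeros(shape, dtype=bool)
    last_open = {}  # row -> column of last opened shutter in that row
    for row, col in sorted(objects):
        if row in last_open and col - last_open[row] < spectrum_len:
            continue  # spectra would collide on the detector
        mask[row, col] = True
        last_open[row] = col
    return mask

# Example: three targets in one row; the middle one is too close to the
# first, so only two shutters open.
m = build_mask((171, 365), [(10, 5), (10, 8), (10, 40)], spectrum_len=20)
print(int(m.sum()))  # 2
```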
