
Lighting the way

Illumination is a key part of any machine vision system, and while simply switching on the lights might work, there is a lot to be said for more considered lighting design. Peter Bhagat, CEO of illumination company Gardasoft Vision, remarked that expert lighting control is an ‘untapped benefit’ for many machine vision applications. He noted: ‘It’s striking that, of all the lights sold, only 10 per cent use expert light control. This doesn’t match with camera sophistication, where engineers generally use more of the features available.’

His remarks chime with those of Simon Stanley, managing director at ProPhotonix, who commented that the company is receiving more frequent requests to develop custom lighting solutions for complex and diverse machine vision tasks.

Inspecting a product from two different angles, for example, and dividing the inspection into two closely spaced time intervals requires precise control of the light. ‘The general trend is high-power lights that are precisely controlled electronically, so that the light and camera acquisition are synchronised,’ said Stanley.

Advanced lighting control can yield some extremely useful benefits for a machine vision setup – techniques like pulsing or overdriving the LED can make inspection more accurate and increase the lifetime of the vision system. However, engineers might not have the expertise or time to learn about these more complex illumination techniques, which is why products are needed that offer this functionality in an easy-to-use way.

‘Engineers recognise the importance of lighting and they’ve got a reasonable idea of the geometries that can be used and how to highlight the part to get the image they need,’ said Bhagat. ‘It’s when it comes to the more sophisticated aspects that engineers might not have the knowledge or the time to learn about more advanced lighting techniques.’ He added that most machine vision setups use the light in continuous mode, rather than employing more advanced settings.

A good example is pulsed mode, which can increase the lifetime of the light. The end of life for LED lighting according to the industry standard is the time when the LED reaches 70 per cent of its original brightness. ‘Seventy per cent of the original brightness is a huge change for a machine vision system,’ said Bhagat. ‘A 30 per cent reduction in brightness over time will cause any thresholds for identifying defects – black spots in a web of material, for instance – to be inaccurate. Even dimensional measurements may not work properly. A system might work for six months and then stop finding the defects because the threshold points need to change. This is potentially a huge issue for machine vision systems; firstly, in my experience, many systems start with only just enough light. Therefore, if it works initially, it might not do in six months’ time because of the degradation in illumination brightness.’
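A rough, hypothetical illustration of the threshold problem Bhagat describes (the grey-level values below are made up, not taken from the article): if pixel values scale roughly linearly with illumination, a defect threshold tuned at full brightness no longer sits between defect and background once the light has dimmed by 30 per cent.

```python
# Hypothetical illustration: a fixed grey-level threshold chosen at full
# brightness drifts out of place as the LED dims (all values are made up).
defect_at_full = 50        # typical defect pixel value at 100 per cent light
background_at_full = 120   # typical background pixel value at 100 per cent light
threshold = 85             # threshold chosen at commissioning

for brightness in (1.0, 0.85, 0.7):          # light output over the LED's life
    defect = defect_at_full * brightness
    background = background_at_full * brightness
    still_valid = defect < threshold < background
    print(f"light at {brightness:.0%}: defect={defect:.0f}, "
          f"background={background:.0f}, threshold still works: {still_valid}")
```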

Gardasoft is releasing Triniti, a lighting controller that provides advanced control of illumination, and a product that has been shortlisted for the Vision Award at this year’s Vision show in Stuttgart, Germany (see page 4 for more on the awards). With regard to the degradation in brightness as LEDs age, Triniti provides a mechanism for regulating the drive level over time in order to maintain constant brightness to the end of the light’s life; the starting point is less than 100 per cent brightness, and the illumination is then adjusted upwards to compensate for the drop in intensity.
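A minimal sketch of the kind of compensation described here (hypothetical code, not the Triniti implementation): the light is commissioned below full drive, and the drive level is raised over time so that the delivered output stays constant until the headroom runs out.

```python
def drive_for_constant_output(target_output: float, degradation: float,
                              max_drive: float = 1.0) -> float:
    """Return the drive level needed to hold the delivered output constant.

    `degradation` is the fraction of its original brightness the LED still
    produces at full drive (1.0 when new, about 0.7 at end of life).
    A hypothetical sketch, not Gardasoft's actual algorithm.
    """
    return min(target_output / degradation, max_drive)

# Commission the system at 70 per cent of the light's nominal output:
target = 0.7
for degradation in (1.0, 0.9, 0.8, 0.7):
    drive = drive_for_constant_output(target, degradation)
    print(f"LED at {degradation:.0%} of original brightness: drive at {drive:.0%}, "
          f"delivered output {drive * degradation:.0%}")
```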

‘The reliability and repeatability of the [machine vision] system can be dramatically improved by having a constant light output over a number of years,’ commented Bhagat.

Pulsing the light not only improves its lifetime, but also means the LED can be overdriven – that is, pulsed for a short period of time at more than 100 per cent brightness. The higher brightness can be a big advantage, as it means the vision system can run faster, with the strobe freezing any motion.

Overdriving is common practice in machine vision and all Gardasoft’s controllers have built-in limits for it – the controller will allow the light to be driven at more than 100 per cent brightness, but will limit the pulse width to prevent damage to the light.
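The principle behind such a limit can be sketched with hypothetical figures (these are not Gardasoft’s actual limits): the harder the light is overdriven, the shorter the pulse and the lower the duty cycle must be to keep the average power, and so the LED junction temperature, within what the light can tolerate.

```python
def max_pulse_settings(overdrive: float,
                       max_pulse_ms_at_2x: float = 2.0) -> tuple[float, float]:
    """Return (maximum pulse width in ms, maximum duty cycle) for an overdrive factor.

    Hypothetical limits for illustration only: average power is capped at the
    continuous rating, and the allowed pulse width shrinks as the overdrive
    factor rises. Real controllers store per-model limits for each light.
    """
    if overdrive <= 1.0:
        return float("inf"), 1.0              # continuous operation is allowed
    max_duty = 1.0 / overdrive                # keeps average power at the rating
    max_pulse_ms = max_pulse_ms_at_2x * 2.0 / overdrive
    return max_pulse_ms, max_duty

for factor in (1.0, 2.0, 5.0, 10.0):
    width, duty = max_pulse_settings(factor)
    print(f"{factor:>4.1f}x overdrive: pulse <= {width:.2f} ms, duty cycle <= {duty:.0%}")
```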

The downside of pulsed light is that it takes a little longer to synchronise the illumination and camera. ‘This [synchronisation] is not difficult to do, but understandably, for a busy engineer, they might not want to work out how to do it,’ Bhagat remarked. ‘Once the time is invested, they will be very pleased with the increased life, extra brightness and improved reliability of the system.’ Triniti presents the timing of the light and camera diagrammatically, which helps with synchronising the two.
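One simple way to picture the timing involved (a sketch with made-up numbers): the light pulse has to bracket the camera’s exposure, so the controller fires the strobe slightly before the exposure starts and holds it until just after the exposure ends.

```python
def pulse_timing(exposure_start_us: float, exposure_us: float,
                 margin_us: float = 20.0) -> tuple[float, float]:
    """Return (pulse start, pulse width) in microseconds so that the strobe
    brackets the camera exposure with a small safety margin on either side.
    Illustrative only -- a real setup also accounts for trigger latency."""
    return exposure_start_us - margin_us, exposure_us + 2 * margin_us

# A 100 us camera exposure starting 500 us after the frame trigger:
start, width = pulse_timing(exposure_start_us=500.0, exposure_us=100.0)
print(f"fire the strobe at t = {start:.0f} us for {width:.0f} us")
```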

The Triniti chip is a small device built inside the light. The chip holds some fixed data about the light, such as its electrical characteristics, and some variable data, such as how the light is being used. The Triniti controller can then read the data from the chip and drive the light based on that information. The chip also holds factors like overdriving characteristics, which can be more aggressive than standard limits because they’re specific to each light model, noted Bhagat. In addition, lighting manufacturers CCS and Smart Vision Lights have both adopted Triniti for their LED lighting.
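The split between fixed and variable data can be pictured with a simple record (a hypothetical layout for illustration; it is not the actual Triniti chip contents):

```python
from dataclasses import dataclass

@dataclass
class LightRecord:
    """Hypothetical sketch of the per-light data a controller might read."""
    # Fixed data: the light's electrical characteristics
    model: str
    rated_current_ma: float
    rated_voltage_v: float
    max_overdrive: float            # model-specific, so it can exceed generic limits
    max_overdrive_pulse_us: float
    # Variable data: how the light has been used
    hours_run: float = 0.0
    pulses_fired: int = 0

light = LightRecord(model="example-ring-light", rated_current_ma=500.0,
                    rated_voltage_v=24.0, max_overdrive=8.0,
                    max_overdrive_pulse_us=200.0)

# The controller clamps its drive settings to the per-model limits it reads back:
requested_overdrive = 10.0
applied = min(requested_overdrive, light.max_overdrive)
print(f"driving {light.model} at {applied:.0f}x overdrive")
```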

Design is everything

A well-designed illumination system will make subsequent image capture and processing easier; it will make the overall system more reliable and accurate. ‘The important thing is to look at the product and the task at hand, and ask: if we illuminate this product in a different way, or with a more sophisticated light, will that make the imaging task any easier? The answer to that is usually yes,’ stated Stanley at ProPhotonix.

Stanley commented that the fundamental LED technology is changing all the time. ‘The LEDs are getting more efficient; we’re [ProPhotonix] positioned to take advantage of that, because we’re working with the LED chips. We’re also seeing advances in the UV range, both in terms of efficiencies and pricing.’ The company has the capability to pack a number of different kinds of chip into a light, with tight control of the electronics to provide lights emitting multiple wavelengths, for instance.

One of the challenges to overcome when working with LEDs, according to Sophie Perrot, product manager at Stemmer Imaging, is thermal management of the system, as the lifetime of LEDs decreases dramatically when the temperature rises. Perrot noted that good system design has to include heat management within the light.

However, Perrot added that the biggest difficulty when installing lighting is to understand why an illumination system composed of several components doesn’t work.

‘For me, an illumination controller should not only deliver the right amount of current at the right time, but also survey the light, e.g. check the voltage applied, measure the current going through the circuit,’ she commented.

At the same time, sensors could survey the temperature of the housing or the junction of the LEDs to ensure it doesn’t get too high.
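The kind of supervision Perrot describes can be sketched as a simple set of checks (the thresholds and readings below are hypothetical, not a specific controller’s firmware):

```python
def check_light_health(voltage_v: float, current_ma: float, junction_temp_c: float,
                       expected_voltage_v: float = 24.0,
                       expected_current_ma: float = 500.0,
                       max_junction_temp_c: float = 85.0) -> list[str]:
    """Return a list of warnings from measured operating values.

    The thresholds and readings here are illustrative, not taken from any
    particular controller."""
    warnings = []
    if abs(voltage_v - expected_voltage_v) > 0.1 * expected_voltage_v:
        warnings.append(f"supply voltage {voltage_v:.1f} V is outside the expected range")
    if current_ma < 0.8 * expected_current_ma:
        warnings.append(f"current {current_ma:.0f} mA is low -- possible failed LED string")
    if junction_temp_c > max_junction_temp_c:
        warnings.append(f"junction at {junction_temp_c:.0f} C -- LED lifetime will suffer")
    return warnings

print(check_light_health(voltage_v=23.8, current_ma=350.0, junction_temp_c=92.0))
```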

Laser light illumination

Along with advances being made in LED chips, progress has also been made in laser diodes, particularly blue and green diodes, commented Stanley. ProPhotonix’s site in Cork, Ireland, manufactures LEDs using chip-on-board methods, while its London, UK site focuses on laser modules.

Wallace Latimer at laser company Coherent commented that the core laser diodes have improved in terms of thermal stability, package size and lifetime, especially at visible wavelengths. In addition, green direct laser diodes are now available commercially, which opens up applications that were difficult with previous illumination sources, he noted – green would provide much higher contrast on a black object than a red laser, for instance.

‘The performance of the laser in triangulation is just as important as any illumination in machine vision,’ he noted. ‘The core goal is to create contrast, and the quality of the illumination profile speeds up the processing and improves accuracy. If you have a perfect plane illumination front, then everything reflected back is from the object under inspection. The inspection system benefits if the light is always in the same place, with the same spectral brightness and spectral purity.’

Another important characteristic is line uniformity, remarked James Saxon, technical sales engineer at Laser Components UK, which supplies laser modules for machine vision. Machine vision lasers commonly use a Powell lens to generate a uniform line over around 80 per cent of the line’s length – the more uniform the laser, the easier it will be to configure the system.

Lasers can also be modulated in the megahertz region, just as LEDs can be, with the pulses synchronised with the camera shutter. ‘The advantage of modulating the laser is that really high accuracy can be achieved, and high contrast images, because it’s [modulation] a very effective way of filtering out background noise,’ said Saxon. ‘By having the camera and laser synchronised, the camera will only pick out the laser line and be less susceptible to background noise.’ Laser Components can produce thin laser lines for small objects – its MVMicroline series of laser modules generates a 5µm thick line – or powerful lines for applications where there is lots of ambient light.
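One way to see why synchronising the pulsed laser with the shutter suppresses background, sketched with made-up numbers: the laser delivers its pulse energy entirely within the exposure, while ambient light accumulates for as long as the shutter is open, so shortening the exposure around the pulse raises the ratio of laser signal to background.

```python
def laser_to_ambient_ratio(pulse_energy: float, ambient_power: float,
                           exposure_us: float) -> float:
    """Ratio of laser line energy to ambient light energy collected in one exposure.

    The pulsed laser delivers all of `pulse_energy` inside the exposure window,
    while ambient light contributes continuously, so a shorter, synchronised
    exposure improves the ratio. Arbitrary units; an illustration, not a
    radiometric model."""
    return pulse_energy / (ambient_power * exposure_us)

for exposure_us in (10_000.0, 1_000.0, 100.0):
    ratio = laser_to_ambient_ratio(pulse_energy=1.0, ambient_power=1e-4,
                                   exposure_us=exposure_us)
    print(f"{exposure_us:>8.0f} us exposure: laser/ambient energy = {ratio:.0f}")
```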

Using high-power lasers and LEDs on the factory floor, however, raises the issue of lighting safety. Latimer at Coherent chairs the safety section of an updated machine vision lighting standard currently being developed by a group within the AIA, in cooperation with the Japan Industrial Imaging Association (JIIA), which wrote the original lighting standard. The standards committee has been writing an addendum to the JIIA standard, adding multiple sections on performance, documentation, communication protocols, connectors, and safety.

The engineers building and integrating vision systems have, for the most part, a good sense of how to configure illumination to give them contrast, Latimer said, but he added that the illumination hardware itself still has some mystery to it. ‘LED manufacturers all have their own nuances, their own interpretation of their products,’ he remarked. ‘If you look at all the work that’s been done on the sensor side with standards like GigE Vision and USB3 Vision, it’s really created a central core that makes the cameras easy to integrate and use. We need to offer this for illumination. That’s why the lighting standard is so important.

‘We feel that the lack of information and understanding in terms of laser safety prevents people wanting to use lasers,’ Latimer continued. ‘Standards documents are about education, but also about giving engineers an information source for LEDs and lasers that summarises how to test and document lighting.’

There will be a follow-on meeting for the lighting standard at the Vision trade fair in Stuttgart in November.

Latimer and Bhagat commented on the need to make lighting for machine vision easier to deploy. Latimer said: ‘I think as we create more intelligence in illumination systems and tie them into an integral component we get a much smarter, reliable and easier to deploy system. We’ve reduced the time it takes to install and configure a vision system from months to weeks now, and with very sophisticated functionality. That builds the confidence of the user base in the ability to deploy these systems.’

Bhagat said that one way to help engineers make better use of lighting is to provide access to the illumination in high-level software packages. The Triniti system offers APIs for integration into image processing packages, such as Common Vision Blox from Stemmer Imaging, LabVIEW from National Instruments, and Cognex’s VisionPro software. Gardasoft provides a user control package, which sits within any of these software suites or any C#, C++ or VB.NET application. ‘It makes it so much easier for them [engineers] to make use of these advanced techniques if lighting control is included in the machine vision software,’ he said. ‘Now, it’s a 10-minute job rather than a two-day job to provide programmatic remote control of the light.’
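As an idea of what programmatic lighting control looks like from the application side, here is a hypothetical wrapper; the class and method names are invented for illustration and are not Gardasoft’s actual API. The vision application simply sets intensity, pulse width and trigger mode on the controller before acquisition starts.

```python
class LightController:
    """Hypothetical network light controller client.

    The class and method names here are invented for illustration; they are
    not Gardasoft's actual API."""

    def __init__(self, address: str):
        self.address = address            # a real client would open a connection here

    def set_channel(self, channel: int, intensity_pct: float,
                    pulse_width_us: float, triggered: bool = True) -> None:
        # A real implementation would send the vendor's command protocol.
        print(f"[{self.address}] channel {channel}: {intensity_pct:.0f}% for "
              f"{pulse_width_us:.0f} us, triggered={triggered}")

# Inside the vision application, before acquisition starts:
controller = LightController("192.168.1.50")
controller.set_channel(channel=1, intensity_pct=300.0, pulse_width_us=50.0)
```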


