Is flexibility the key to a smart future?

The challenges of either building a full system or integrating machine vision on the factory floor are numerous. How best to work around legacy systems in constrained spaces, and feed accurate information back to the computers that guide automated robots? Rob Ashwell investigates

The aerospace sector is, at last, beginning to adopt automation and imaging technologies to manufacture commercial and military aircraft. The reasons for the uptake are simple – they’re financial. But the challenges of retrofitting such systems into any facility are many, and on the scale of an aircraft component manufacturing site, more so still.

Why now?

Aerospace has traditionally shunned automated manufacturing, partly because of the low volumes shipped, but also because the cost of getting it wrong is high and, while a pool of expert builders is expensive, they are also tried and tested by the industry.

There are few companies (or industries, for that matter) that wouldn’t be envious of the aerospace sector’s two leading players, Airbus and Boeing – Apple being one of the rare exceptions.

Boeing’s quarterly statements, for example, show the company has nearly 5,700 commercial aircraft on order, giving it a $472 billion backlog. Similarly, in its H1 2016 statements, Airbus reported that its order book value totalled €978 billion. But these two companies’ statements also reveal an issue.

As Boeing’s financial statement put it: ‘During the quarter, the 787 programme reached a 12 per month delivery rate... The 737 programme rolled out the first two 737 MAX production airplanes and has captured over 3,200 orders for the 737 MAX since launch.’

In total, the manufacturer shipped 762 commercial aircraft last year, and increased its shipments year on year – it shipped 723 in 2014, 648 in 2013 and 601 in 2012.

The extreme factory build

Indeed, both aircraft manufacturers, which have not adopted automation and machine vision systems at the same rate as, say, the automotive sector, are planning production ramp-ups and looking to machine vision to achieve them.

Inside Airbus' Mobile, Alabama manufacturing facility. While the facility is large, space is limited and installation of automation and vision systems is challenging

As such, Boeing is reported to be planning investments worth more than $1 billion over the next two-to-three years in automation systems to boost production. It has also partnered with the University of Washington to develop advanced automation equipment. A major driver of this is likely to be support for its facility in North Charleston (South Carolina), which, unlike its Washington State plant, has no existing skilled workforce in the area to draw on.

Boeing’s Gerould Young is the director of materials and manufacturing technologies at Boeing Research and Technology. Speaking at the 2014 AeroDef Manufacturing Summit, Young said Boeing was heading towards automated factories, and ‘learning from the [significantly more advanced] auto industry... [but] scale, complexity and structural integration make automation challenging for fabrication processes.’

Flexible automation and machine vision architectures

The traditional problems that a system integrator faces are lighting, modelling systems with multiple unknowns, communication systems, the need to minimise system downtime and, once it is up and running perfectly, the need to persuade the operators of its requirements.

Based in the opposite corner of South Carolina, just a short (in US geographical terms) drive away from the new Boeing facility, is the systems integrator Integro Technologies. The company has been developing machine vision systems for legacy manufacturing equipment across a broad range of sectors for more than a decade.

According to its senior sales engineer, Starke Farley, lighting is one of the major challenges faced in any factory installation of an imaging system – especially if there is a skylight or roll-up door, which lets in ambient sunlight and renders bandpass filters useless. After this, one of the biggest problems encountered comes from unspecified elements in legacy equipment. ‘When you’re installing into other people’s equipment, they send you the design [in either 2D or 3D]. And if, for some reason, there was a change to the machine after the design, then your camera [or] your light… won’t fit and we have to modify it on the fly.

The Electroimpact E7000 automated riveting machine uses laser guidance to automate the fastening of composite skins to the aircraft's frame

‘A lot of time you also have moving components, where one machine is handing off to another,’ Farley continued. ‘So the drawings for machine A [which is being designed for] do not factor in machine B. And then you’ve got a robotic arm that comes in and completely wipes out your camera system.’

The company will, almost universally, develop its systems at its own factory and present them to the customer as both a designer view and a modelled factory acceptance test.

But the scale of the products being manufactured adds another level of complication. Aerospace is a unique sector: the size of the aircraft being produced is mind-blowing. At the extremes, an Airbus A380 is more than 72 metres in length with a wingspan of more than 79 metres. Its height is 24 metres.

This means a flexible system is needed, with robots and imaging systems moving along gantries, often performing multiple functions.

As Comau’s Robert Zerwick – reported by Boston Commons earlier this year – put it, ‘widespread use of automation in the aerospace industry won’t take off until systems integrators offer more flexible solutions.’

Already Comau has developed robots capable of being mounted onto a diverse range of platforms, from gantries to self-guided cars.

Gantry mounted systems

One of the big pressures for automation comes from the mechanical fastening process. According to JR Automation’s David MacPhail, the major spend on automation projects in the aerospace sector is specifically for ‘drilling and filling.’ MacPhail cites data from aerospace consultant Nick Bullen, who suggests that mechanical fasteners account for 60 per cent of the cost of airframe assembly, 80 per cent of lost-time injuries and 80 per cent of defects.

Part of the reason for this cost is the industry’s use of composite materials, which cuts weight and saves fuel, but also creates a risk of fibres pulling and tearing. And problems are caused when debris from the drilling process ends up between the composite skin and the metal substructure. The two pieces must then be separated and inspected, which is costly and time consuming.

Among the industry’s most advanced automated systems are those from Electroimpact, an aviation automation firm based in Washington State with development centres in the UK and Australia. The company has worked with both Airbus and Boeing, most recently on the wings of Boeing’s 777X.

Electroimpact’s newest automation products are the E7000 and E7100 riveting machines. While the larger of the two is 11.5 metres wide and built to a custom length – making it a challenge simply to fit into an existing facility – the makers say it offers the most efficient use of factory floor space. For inspection, the system uses a Keyence IG-28 laser array to automate the riveting process, running at rates of up to 20 rivets per minute, which Electroimpact claims is the world’s fastest.

The company is also researching systems to identify and rectify issues in real time. These would use machine vision, infrared sensing and laser projection.

CMOS and wide dynamic range

The adoption of a flexible, moving inspection system brings with it new challenges, especially when it comes to lighting. How do you illuminate the key elements with a high contrast if the position of the robot and camera moves? The industry’s shift from CCD to CMOS will prove a key element here.

The scale of Boeing's South Carolina manufacturing facility is immense, but this brings its own unique challenges when implementing automation systems

Arnaud Destruels, from Sony’s Image Sensing Solutions division, commented that developing systems with both speed and accuracy requires a holistic design approach, taking several factors into account.

‘Lighting plays a key role in effective industrial image processing… The light’s direction must also be well controlled to create a high contrast, and to allow key features to be recognised and read against their background. Uniformity is [also] vital in cutting the length of post-processing time required by the software.’

According to Destruels, some of the main challenges come when parts of the sub-assembly being captured lie in the shadow of larger components, or when the bright lighting needed to capture parts that have low contrast against the substrate leads to glare in other parts of the image.

‘In normal factory settings, these issues could to some extent be dealt with through image processing run on the back-end computer system. But the compromises needed to ensure [everything] can be captured in one image can mean accuracy is sacrificed.’

And for a part as large as a wing, this would also be impossible.

However, he continued: ‘The ability of CMOS imagers to deliver high frame rates means it is possible to overcome this problem and get illumination consistency across the board using wide dynamic range.’ Here, multiple shots are taken in sequence, each with a different exposure time. By combining these, it is possible to produce a composite image with a much higher bit depth than any single shot, correcting shading on parts of the image without the loss of effective bit depth that a single exposure would suffer.
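The multi-exposure technique Destruels describes can be sketched in a few lines. This is an illustrative assumption of how such fusion might work, not Sony's implementation: it assumes a linear sensor response, scales each 8-bit frame by its exposure time to estimate radiance, and down-weights pixels that are clipped dark or bright.

```python
import numpy as np

def fuse_exposures(frames, exposure_times):
    """Combine 8-bit frames taken at different exposure times into
    one wide-dynamic-range radiance estimate.

    Assumes a linear sensor: a pixel's radiance is value / exposure
    time, averaged over the frames where that pixel is well exposed.
    The thresholds below (10, 245) are illustrative choices."""
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    radiance = np.zeros_like(frames[0])
    weight_sum = np.zeros_like(frames[0])
    for frame, t in zip(frames, exposure_times):
        # Trust mid-range pixels; nearly ignore clipped ones.
        w = np.where((frame > 10) & (frame < 245), 1.0, 0.01)
        radiance += w * frame / t
        weight_sum += w
    return radiance / weight_sum
```

A long exposure resolves the shadowed sub-assembly while a short one resolves the glare-prone bright regions; the weighted average recovers both in a single composite, which is what allows one image to cover a part as large as a wing section.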

IoT communications become even more crucial

One interesting way of solving the problem is instead to go smaller. Boeing, for example, has partnered with the University of Washington to look at ways to automate the aircraft build.

According to its press release, ‘the initial research focus is on automation, robotics, mechatronics and metrology’, with projects including the ‘development of in-wing crawlers’ and ‘inside fuselage automation for percussive rivet forming, and sensor fusion.’

In traditional machine vision systems, high frame rates have meant there was a pressing need for strong communication systems. As Destruels put it: ‘There is now a need to synchronise systems on the production line using the IEEE1588 PTP [Precision Time Protocol]. This synchronisation of systems on a network to a common clock through PTP allows an object in any given frame to be identified for removal or remedial processing easily, precisely and quickly further downstream by the production line’s robotic systems.’
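The payoff of the common clock Destruels mentions can be illustrated with a small sketch. The names here are hypothetical and this is not a PTP implementation; it only shows why shared timestamps matter: once camera frames and robot actuators reference the same clock, a defect seen in a frame converts directly into a precise downstream actuation time.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A defect found in one frame, stamped on the shared clock."""
    timestamp_s: float   # frame capture time (common PTP clock)
    position_m: float    # object position along the line at capture

def ejection_time(det: Detection, line_speed_mps: float,
                  ejector_position_m: float) -> float:
    """Time, on the same clock, at which a downstream ejector or
    robot must act on the object, given a constant line speed."""
    travel = ejector_position_m - det.position_m
    return det.timestamp_s + travel / line_speed_mps
```

Without a shared clock, the camera's and robot's notions of "now" drift apart, and at 20 rivets per minute even a few milliseconds of skew puts the actuator on the wrong part; PTP keeps that skew in the sub-microsecond range.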

However, as machine vision-equipped robots become truly mobile – be it inside the wing, or on a self-driving vehicle – the need for strong communications strengthens further, and wireless standards will be required for these systems to communicate with the network.

Similarly, the rise of Industry 4.0 means that these vision-based automated systems will be joined by Internet of Things-connected devices; factories could, for example, eventually see smart rivets that know when they’ve been installed correctly. With IoT technologies complementing and working alongside machine vision, system integrators will be able to offer enhanced analysis and greater accuracy to their customers. The incorporation of smart components into the manufacturing process will also make installing vision-based automation systems on the factory floor simpler and less disruptive for both the integrator and the end customer.
