
Bridging the machine vision data gap for Manufacturing 4.0

Mathew Daniel at data management software provider Sciemetric Instruments explains the importance of centralised data collection in factory automation, and why image data is a key part of that

Ever-growing requirements for traceability, root cause identification, and quality improvement are driving the adoption of machine vision technologies across the manufacturing industry. Yet managing all that machine vision data remains a challenge for vendors throughout the supply chain.

Take a dispensing operation for a sealant between two parts on an automotive powertrain line. Hundreds of images with related data points could be collected every minute. Multiply that by the 480 minutes of an eight-hour shift, then again by the dozens of stations on the line that might also use vision for quality inspection.
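To put that volume in perspective, here is a rough back-of-the-envelope sketch in Python. The figures used (200 images per minute per station, 25 vision stations, 0.5 MB per image) are illustrative assumptions, not measurements from the dispensing line described above.

```python
# Back-of-the-envelope estimate of vision data volume on one production line.
# All figures below are illustrative assumptions.
images_per_minute_per_station = 200      # "hundreds of images ... every minute"
minutes_per_shift = 480                  # one eight-hour shift
vision_stations = 25                     # "dozens of stations on the line"
avg_image_size_mb = 0.5                  # assumed average stored image size

images_per_shift = images_per_minute_per_station * minutes_per_shift * vision_stations
data_per_shift_gb = images_per_shift * avg_image_size_mb / 1024

print(f"{images_per_shift:,} images per shift (~{data_per_shift_gb:,.0f} GB)")
# -> 2,400,000 images per shift (~1,172 GB)
```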

That’s a monstrous amount of data from multiple sources that must be brought to heel. In most scenarios, the data resides across the plant floor on multiple PCs – silos that contain images and only the most basic of image data (such as pass/fail status). Access and retrieval are difficult and only basic traceability is possible.

Vision data still left on the sidelines

The demands of Manufacturing 4.0 are putting manufacturers under increased pressure to achieve a higher standard of quality and efficiency, to reduce costs both on the production line and from warranty claims, and to provide proof of compliance with specific legislation or customer requirements. They want solutions that are complete, purpose-built and low cost, and that are easy to use from initial setup through to ongoing management.

Machine vision vendors understand the value of making better use of image data, and manufacturers have clear needs that can be addressed by doing so. Why, then, does the gap between Manufacturing 4.0 and machine vision remain too broad to jump easily? Because few machine vision vendors are considering how, or have the capability, to harness images and their data in the context of all the other data in the plant.

Vision must be managed as part of a whole

Many plants are already using centralised data collection, management and analysis tools for other datasets (e.g. digital process signatures or scalars) from processes and test stations on the line. Vision is seldom included in this equation, but doing so is the next logical step. As insightful as digital process signatures can be for spotting trends and anomalies that require corrective action, the adage still holds true – an image is worth a thousand words. Coupling an image with a part or assembly’s other datasets opens a whole new horizon for tracing root cause and closing the loop quickly on a quality issue.

For plants that are looking at a system for centralised data collection, management and analysis for the first time, it only makes sense to get it right from the start and ensure machine vision isn’t a forgotten part of the equation. How?

Create a single history for each part

Start by indexing every image and its related dataset by the serial number of the related part or assembly. This makes it easy to correlate with all the other data relevant to that part or assembly. We have worked in plants with DIY and vendor-specific databases that stored some data by date and time stamp in one silo and other data by serial number in a different silo. Any effort at data retrieval and correlation to troubleshoot a quality issue took days with custom query tools, when it should only take hours.
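As a concrete illustration, the sketch below uses SQLite from Python to key every inspection record to the part serial number. The table and column names are hypothetical, chosen for the example rather than taken from any vendor’s actual schema.

```python
import sqlite3

# Hypothetical central store: every image and measurement record is keyed
# to the serial number of the part or assembly it belongs to.
conn = sqlite3.connect("plant_history.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS inspection_records (
    serial_number TEXT NOT NULL,   -- part/assembly serial number
    station_id    TEXT NOT NULL,   -- e.g. 'dispense_03', 'leak_test_01'
    recorded_at   TEXT NOT NULL,   -- ISO 8601 timestamp
    result        TEXT NOT NULL,   -- 'pass' / 'fail'
    image_path    TEXT,            -- location of the stored image
    features      TEXT             -- JSON blob of measured features
);
-- Indexing by serial number keeps per-part retrieval and correlation cheap.
CREATE INDEX IF NOT EXISTS idx_records_serial
    ON inspection_records (serial_number);
""")
conn.commit()
```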

Next, collect these serialised images and their datasets over a network architecture into a central database in production real time (within one cycle of completion). Because they are indexed by serial number, they can be consolidated with all the other data for a specific part or assembly. The result is a comprehensive birth history record that can be quickly searched and cross-referenced.
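Continuing with the hypothetical schema sketched above, each station controller could push its serialised result to the central store as soon as a cycle completes, and a part’s birth history then becomes a single indexed query. The function names, serial numbers and feature values here are assumptions carried over from the previous sketch.

```python
import json
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("plant_history.db")  # central store from the sketch above

def record_result(conn, serial, station, result, image_path=None, features=None):
    """Push one station's serialised result to the central store after the cycle ends."""
    conn.execute(
        "INSERT INTO inspection_records VALUES (?, ?, ?, ?, ?, ?)",
        (serial, station, datetime.now(timezone.utc).isoformat(),
         result, image_path, json.dumps(features or {})),
    )
    conn.commit()

def birth_history(conn, serial):
    """Return every station record for one part, in the order it moved down the line."""
    return conn.execute(
        "SELECT station_id, recorded_at, result, image_path, features "
        "FROM inspection_records WHERE serial_number = ? ORDER BY recorded_at",
        (serial,),
    ).fetchall()

# Example: a dispense station logs a bead image and its measured features.
record_result(conn, "PT-000123", "dispense_03", "pass",
              image_path="/images/PT-000123/dispense_03.png",
              features={"bead_width_mm": 2.1, "gap_count": 0})
```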

Then take action

What can you do with a birth history record that includes vision data?

  • Greater insight into station performance: Quickly highlight stations that are falling behind in part count or in yield. This can be particularly useful for parallel station comparisons.
  • Feature trend analysis: Set limits fast and track time-of-day or product variances by model.
  • Part failure process analysis: Review data from multiple processes to trace root cause of a failed part. This is critical to address the issue quickly and mitigate its negative impact. Take, for example, the failure of a seal in a joint. In a true Manufacturing 4.0 setting, the engineer can take a comprehensive look at the joint’s leak test results, fastening data, dispense data, real-time video image and real-time video bead data to identify and address the issue quickly, as well as determine if any other parts are at risk of the same failure.
  • Selective recall: This correlated data analysis then allows for selective and targeted quarantine or recall, as illustrated in the sketch after this list. Why recall thousands of units and suffer the consequential impact to your bottom line and brand reputation if only a few are at risk?
  • Proof of compliance: Images and image data that can be recalled by a part’s serial number make it easy to provide evidence that the part was built to specification and that manufacturing and test processes were under control.
  • Tracking limit changes and playing the numbers: Generate feature trend reports to analyse the impact of new limit settings. The historic data can also be used to play 'what if?' – run simulations to understand the impact of new limit settings.
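As a final illustration of the selective recall point above, and still using the hypothetical schema from the earlier sketches, the queries below pull the full birth history of a failed part and then narrow a quarantine to the serial numbers that share the suspect dispense signature. The bead-width threshold is an illustrative assumption.

```python
import json
import sqlite3

conn = sqlite3.connect("plant_history.db")

# 1. Root cause: review every station result recorded for the failed part.
failed_serial = "PT-000123"
for station, ts, result, image, features in conn.execute(
        "SELECT station_id, recorded_at, result, image_path, features "
        "FROM inspection_records WHERE serial_number = ? ORDER BY recorded_at",
        (failed_serial,)):
    print(station, ts, result, image, json.loads(features))

# 2. Selective recall: flag only parts whose dispense bead width falls in
#    the suspect range, instead of quarantining every unit built that week.
suspect = [
    serial for (serial, features) in conn.execute(
        "SELECT serial_number, features FROM inspection_records "
        "WHERE station_id = 'dispense_03'")
    if json.loads(features).get("bead_width_mm", 99.0) < 1.5
]
print(f"{len(suspect)} parts flagged for quarantine instead of a full recall")
```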

A new competitive edge

Vendors and manufacturers acknowledge the need to do more with machine vision data, but this must be done in a holistic way that integrates the collection, management and analysis of all the production data generated on the plant floor. Creating a single, serialised birth history record for each part or assembly is the launch point for manufacturers to meet the competitive challenge posed by Manufacturing 4.0.

--

Mathew Daniel is vice president of operations at Sciemetric Instruments, where he manages service and installation, product development, and manufacturing and quality. Daniel oversees many of the manufacturing data and analytics implementations provided to large manufacturers, helping them to organise and maximise a return from their production data.
