Robot uses vision to infer human actions for better collaboration


Engineers at the Politecnico di Milano in Italy are developing a vision system that can infer what a human is going to do, so that a robot can work more effectively alongside its human co-workers.

Professor Paolo Rocco at the Politecnico di Milano gave a presentation at the European Machine Vision Forum in Bologna, Italy, about the research his group is conducting on cobots, robots designed to work alongside humans. The conference took place from 5 to 7 September.

One of the projects Rocco spoke about is a vision system that can predict a human’s intention by looking at how they're working. In an assembly process, for instance, once the robot knows what part the human will pick up, it can move to another task or potentially aid the human.

The system uses a deep learning algorithm trained, in the example Rocco showed, by tracking the motion of the human's right hand. In this example the engineer was assembling a box by attaching a lid to it. The human and robot face one another with two trays between them: one holding the box, the other the lid. Whenever the human reaches for a box or a lid, the robot recognises the action and performs the complementary one: when the human takes the lid, the robot picks up the box and positions it for the human to attach the lid, and vice versa.
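The article does not detail how the group's network is built, but a rough illustration of the idea is a small recurrent classifier that maps a short window of tracked 3D hand positions to the part the human is reaching for, so the robot can grasp the complement. The sketch below is purely illustrative, not the Politecnico di Milano code; the action names, network sizes, confidence threshold and two-class setup are all assumptions.

```python
# Illustrative sketch only (not the research group's implementation):
# classify a window of right-hand positions into the part being reached for,
# then return the complementary robot action.
import torch
import torch.nn as nn

ACTIONS = ["pick_lid", "pick_box"]          # hypothetical classes the network predicts
COMPLEMENT = {"pick_lid": "present_box",    # robot response to each predicted human action
              "pick_box": "present_lid"}

class IntentionNet(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=3, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, len(ACTIONS))

    def forward(self, hand_window):          # shape: (batch, timesteps, xyz)
        _, h = self.rnn(hand_window)
        return self.head(h[-1])              # class logits

def choose_robot_action(model, hand_window, threshold=0.8):
    """Return the complementary robot action once the prediction is confident."""
    with torch.no_grad():
        probs = torch.softmax(model(hand_window), dim=-1)[0]
    conf, idx = probs.max(dim=0)
    if conf.item() < threshold:
        return "wait"                        # keep observing the hand
    return COMPLEMENT[ACTIONS[idx.item()]]

# Example call with a dummy 15-frame hand trajectory
model = IntentionNet()
print(choose_robot_action(model, torch.randn(1, 15, 3)))
```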

‘Humans normally do a sequence of actions,’ Rocco explained during his presentation. ‘The most recent thing we’ve done is to predict the sequence of actions the human will do. So, whenever the human wants to make a collaborative action, the robot will be ready, so there will be no waste of time.’
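Rocco did not describe how the sequence prediction is implemented; one simple way to anticipate the next action, shown here only as an assumed sketch, is a first-order transition model learned from previously observed task cycles, which lets the robot prepare the complementary part before the human starts moving.

```python
# Minimal, purely illustrative sketch: learn which human action tends to
# follow which, then predict the next action from the current one.
from collections import Counter, defaultdict

class NextActionPredictor:
    def __init__(self):
        self.counts = defaultdict(Counter)   # current action -> counts of following actions

    def observe(self, sequence):
        """Record one observed sequence of human actions, e.g. from a past cycle."""
        for current, nxt in zip(sequence, sequence[1:]):
            self.counts[current][nxt] += 1

    def predict(self, current):
        """Return the most frequently seen follow-up action, or None if unseen."""
        if not self.counts[current]:
            return None
        return self.counts[current].most_common(1)[0][0]

# Usage with hypothetical assembly cycles
predictor = NextActionPredictor()
predictor.observe(["pick_box", "pick_lid", "attach_lid", "place_part"])
predictor.observe(["pick_box", "pick_lid", "attach_lid", "inspect"])
print(predictor.predict("pick_lid"))   # -> "attach_lid"
```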

In Rocco’s setup, the robot constantly monitors the human’s actions so that the human’s cycle and the robot’s cycle stay synchronised. ‘The benefit was a 20 per cent reduction in cycle time just using the machine learning algorithm to infer what the human is doing,’ Rocco said.

Rocco’s group has recently founded a spin-off company called Smart Robots. The company makes a smart hardware and software solution that optimises the space where human operators and robots work side by side. The device can be linked to robots from various manufacturers, and can be used for collision avoidance or to reschedule the robot’s tasks if they interfere with what the human is doing.
