Camera tracking across multiple moving cameras

University of Washington engineers have developed an automatic tracking system that can follow people across moving and still cameras. The algorithm identifies a person in a video frame and tracks them across multiple camera views in real time.

‘Tracking humans automatically across cameras in a three-dimensional space is new,’ said lead researcher Jenq-Neng Hwang, a UW professor of electrical engineering, on the University’s website. ‘As the cameras talk to each other, we are able to describe the real world in a more dynamic sense.’

In the past, tracking people across non-overlapping fields of view has been difficult because of changing perspectives and the different colour hues produced by different cameras. To overcome this, the researchers linked the cameras by calibrating each one with a short period of training data, which allows the system to learn the differences in colour, texture and angle between views.
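The article does not describe the calibration method in detail, but one common way to compensate for colour differences between cameras is to fit a simple per-channel transfer function from a short period of paired training observations. The sketch below is purely illustrative (it is not the UW team's code, and the training values are hypothetical), showing a least-squares linear fit of one camera's colour response onto another's.

```python
# Illustrative sketch, not the researchers' actual method: learn a
# per-channel linear colour transfer (dst ~= a*src + b) between two
# cameras from a short period of paired training samples.

def fit_channel_transfer(src_vals, dst_vals):
    """Least-squares fit of dst = a*src + b for one colour channel."""
    n = len(src_vals)
    mean_s = sum(src_vals) / n
    mean_d = sum(dst_vals) / n
    cov = sum((s - mean_s) * (d - mean_d) for s, d in zip(src_vals, dst_vals))
    var = sum((s - mean_s) ** 2 for s in src_vals)
    a = cov / var if var else 1.0
    b = mean_d - a * mean_s
    return a, b

def apply_transfer(pixel, transfers):
    """Map an (R, G, B) pixel from camera A into camera B's colour space."""
    return tuple(a * c + b for c, (a, b) in zip(pixel, transfers))

# Hypothetical training data: the same surfaces observed by both cameras.
cam_a_red = [40, 80, 120, 200]
cam_b_red = [50, 90, 130, 210]   # camera B reads ~10 units brighter
a, b = fit_channel_transfer(cam_a_red, cam_b_red)
```

Once a transfer is fitted per channel, appearance features measured in one camera can be mapped into another camera's colour space before matching.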

Once the link has been formed, the tracking system identifies a person and collects data on body movement as well as clothing texture and colour.
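The article does not specify the descriptor used, but a minimal sketch of colour-based re-identification, under the assumption of a coarse colour histogram compared with histogram intersection, looks like this (the pixel data below is invented for illustration):

```python
# Illustrative sketch, not the system's actual descriptor: a coarse
# colour histogram per person, compared with histogram intersection,
# to decide whether two camera views show the same individual.

def colour_histogram(pixels, bins=4):
    """Quantise (R, G, B) pixels into bins**3 buckets, normalised to sum 1."""
    step = 256 // bins
    hist = [0.0] * (bins ** 3)
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = len(pixels)
    return [h / total for h in hist]

def intersection(h1, h2):
    """Histogram intersection similarity: 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Hypothetical crops: a person in red clothing seen by two cameras,
# and a different person in blue clothing.
person_a_cam1 = [(200, 30, 30)] * 8 + [(50, 50, 50)] * 2
person_a_cam2 = [(210, 40, 35)] * 7 + [(60, 55, 45)] * 3
person_b_cam2 = [(30, 30, 200)] * 10

h_a1 = colour_histogram(person_a_cam1)
h_a2 = colour_histogram(person_a_cam2)
h_b = colour_histogram(person_b_cam2)
# The same person scores higher than a differently dressed stranger.
```

In practice such colour features would be combined with the texture and motion cues the article mentions, and compared only after the cross-camera calibration has been applied.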

The team presented results at the Intelligent Transportation Systems Conference in October that used footage from a moving camera provided by the Swiss Federal Institute of Technology in Zurich.

The University website also said the researchers have tested the system on campus using a robot and a flying drone. The system successfully followed subjects even when they were obscured from view.

The researchers said the system could be used for marketing purposes, to gather information about consumers' movement patterns, as well as in security and surveillance applications, to monitor unusual behaviour or to track a moving suspect.
