First applications available for Imago event camera


The reactions of creative engineers at Embedded World were striking once they understood how an event-based camera works. Every pixel decides independently, at a rate of up to 10kHz, whether something has moved or changed. Static backgrounds don’t trigger a signal; only movement does. Since there is no fixed scan rate (farewell, “frames per second”) and every pixel can deliver a signal, an optimised data stream reaches the VisionCam EB. The camera must then decide algorithmically what happens next.
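
To make that data model concrete, here is a minimal, self-contained Python sketch (not the Prophesee or IMAGO SDK; all names and values are illustrative) of what such an event stream looks like: each pixel reports its own timestamped change, and a static background simply produces nothing.

```python
# Illustrative sketch of an event stream: no frames, just sparse per-pixel events.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t_us: int       # per-pixel timestamp in microseconds (no global frame clock)
    polarity: int   # +1 brightness increase, -1 brightness decrease

def events_in_window(events: Iterable[Event], start_us: int, end_us: int) -> List[Event]:
    """Collect the events that fall into a time window.

    A static background produces no events at all, so this list only
    contains pixels where something actually changed.
    """
    return [e for e in events if start_us <= e.t_us < end_us]

# Example: three pixel changes within 100 microseconds; everything else stayed static.
stream = [Event(12, 40, 10, +1), Event(13, 40, 35, +1), Event(13, 41, 80, -1)]
print(len(events_in_window(stream, 0, 100)))  # -> 3
```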

The first example applications for this are now available from sensor manufacturer Prophesee, Paris, and from IMAGO Technologies, Friedberg. They consolidate the motion information into statements such as “the number of objects falling through the camera’s field of view”, “the kinematic movement meets expectations”, or “the vibration is within the tolerance range”. Event-based vision is a new technology for established and completely new applications.
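
As a rough illustration of that kind of consolidation, the hypothetical sketch below reduces a raw event stream to one of the statements mentioned above, the number of objects falling through the field of view, by treating each dense burst of events as one passing object. The bin size and threshold are assumptions for illustration, not the vendors’ actual algorithm.

```python
# Hypothetical consolidation step: event timestamps -> "N objects passed through".
from collections import Counter

def count_falling_objects(event_times_us, bin_us=1000, min_events_per_bin=50):
    """Count bursts of event activity and treat each burst as one object.

    event_times_us: iterable of event timestamps in microseconds.
    A passing object produces a dense burst of events; quiet bins in
    between separate one object from the next.
    """
    activity = Counter(t // bin_us for t in event_times_us)
    if not activity:
        return 0

    objects = 0
    previous_bin_active = False
    for b in range(min(activity), max(activity) + 1):
        is_active = activity[b] >= min_events_per_bin
        if is_active and not previous_bin_active:
            objects += 1          # rising edge of a burst -> new object
        previous_bin_active = is_active
    return objects

# Two dense bursts of events separated by a quiet gap -> two objects counted.
burst1 = list(range(0, 1000, 10))          # 100 events in the first millisecond
burst2 = [t + 5000 for t in burst1]        # another burst 5 ms later
print(count_falling_objects(burst1 + burst2))  # -> 2
```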
