First applications available for Imago event camera
The reactions of creative engineers at Embedded World were remarkable once they had understood how an event-based camera works: every pixel decides independently, at a rate of up to 10 kHz, whether something has moved or changed. Static backgrounds trigger no signal; only motion does. Because there is no fixed scanning rate (goodbye, "frames per second") and every pixel can deliver its own signal, an optimized data stream reaches the VisionCam EB. The camera's algorithms must then decide what should happen next.
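The principle can be sketched in a few lines of Python. This is a minimal illustration, not the Prophesee API: it compares two brightness frames pixel by pixel and emits an (x, y, t, polarity) event only where the change exceeds a threshold, so an unchanging background produces no data at all. The function name, event format, and threshold value are all assumptions for illustration.

```python
def generate_events(prev_frame, curr_frame, t, threshold=10):
    """Emit (x, y, t, polarity) events for pixels whose brightness
    changed by at least `threshold` between the two snapshots.
    Event format and threshold are illustrative assumptions."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            diff = c - p
            if abs(diff) >= threshold:
                # polarity: +1 for brighter, -1 for darker
                events.append((x, y, t, 1 if diff > 0 else -1))
    return events

# A static background yields no events; only the changed pixel signals.
prev = [[100, 100], [100, 100]]
curr = [[100, 100], [100, 150]]
print(generate_events(prev, curr, t=0.0001))  # [(1, 1, 0.0001, 1)]
```

A real sensor performs this comparison in every pixel in parallel and asynchronously, which is what makes sampling rates in the kilohertz range possible without flooding the interface with redundant background data.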
For this, first example applications are now available from the sensor manufacturer Prophesee (Paris) and from IMAGO Technologies (Friedberg). They consolidate the motion information into statements such as: "number of objects falling through the camera's field of view", "the kinematic movement meets expectations", or "the vibration is within the tolerance range". Event-based vision: a new technology for established and completely new applications.
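One of the statements above, counting objects falling through the field of view, can be sketched as a simple consolidation of the event stream. This is a hypothetical illustration, not the vendors' actual application code: it groups event timestamps into bursts separated by quiet gaps and counts each burst as one passing object. The function name and gap threshold are assumptions.

```python
def count_falling_objects(timestamps, max_gap=0.005):
    """Count bursts of events in a sorted timestamp list (seconds).
    A silence longer than `max_gap` separates two objects.
    Names and the gap value are illustrative assumptions."""
    if not timestamps:
        return 0
    count = 1
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > max_gap:
            count += 1
    return count

# Three bursts of events -> three objects passed through the view.
ts = [0.000, 0.001, 0.002, 0.050, 0.051, 0.120, 0.121, 0.122]
print(count_falling_objects(ts))  # 3
```

Because the sensor only reports changes, such statistics can be computed directly on the sparse event stream instead of on full video frames.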