
Prophesee releases software kit for event-based imaging

Prophesee, a provider of neuromorphic vision technology, has released a suite of software tools to help engineers develop event-based imaging systems.

The new software is for the company's Metavision event-based sensor, which it launched last year. Event-based vision is an imaging method that records changes in a scene as they occur, rather than capturing full image frames at fixed intervals.

The Metavision Intelligence Suite has three components – Player, Designer and SDK – each aimed at a different stage of the design process. It is available as a time-unlimited free trial and as a professional version, which provides access to source code, advanced modules, revision updates, full documentation and support.

In total, the suite consists of 62 algorithms, 54 code samples and 11 ready-to-use applications. It provides users with both C++ and Python APIs as well as documentation and a wide range of samples organised by increasing difficulty to introduce the fundamental concepts of event-based machine vision.

Each pixel in Prophesee’s Metavision sensor activates only when it detects a change in the scene – an event – giving it lower power, latency and data-processing requirements than traditional frame-based systems.
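As a minimal conceptual sketch of that difference (not Prophesee's data format or API; the event layout, function name and threshold below are illustrative assumptions), an event stream can be thought of as (x, y, polarity, timestamp) records emitted only where brightness changes, approximated here by differencing two frames:

```python
# Illustrative sketch only: model an event stream as (x, y, polarity, timestamp)
# tuples emitted only where brightness changes. A real event sensor does this
# asynchronously per pixel in hardware; here we approximate it from two frames.
import numpy as np

def events_from_frames(prev, curr, t_us, threshold=15):
    """Return events where the brightness change between two frames exceeds a threshold."""
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarity = (diff[ys, xs] > 0).astype(np.int8)  # 1 = brighter, 0 = darker
    return [(int(x), int(y), int(p), t_us) for x, y, p in zip(xs, ys, polarity)]

# Two synthetic VGA (640x480) frames that differ only in a small region:
prev = np.zeros((480, 640), dtype=np.uint8)
curr = prev.copy()
curr[100:110, 200:210] = 200  # a bright object appears

events = events_from_frames(prev, curr, t_us=1000)
print(f"{len(events)} events instead of {prev.size} pixel values per frame")
```

Because only the changed pixels produce output, the data volume scales with scene activity rather than with resolution and frame rate.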

The third-generation VGA sensor is aimed at developers of cameras for industrial automation and IoT systems such as robots, inspection equipment, monitoring and surveillance devices. The software toolkit supports this work.

Ready-made algorithms provided in the suite include high-speed counting, vibration monitoring, spatter monitoring, object tracking, optical flow, ultra-slow-motion and machine learning, among others.
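As a rough illustration of one of these techniques, the sketch below counts objects crossing a virtual line directly from an event stream. The function name, event format and refractory period are assumptions chosen for clarity, not the suite's implementation.

```python
# Simplified sketch of event-based high-speed counting: count objects crossing
# a virtual line by detecting bursts of events on that line, separated by a
# quiet "refractory" period. Illustrative only, not Prophesee's algorithm.

def count_line_crossings(events, line_y, refractory_us=500):
    """events: iterable of (x, y, polarity, timestamp_us) tuples sorted by time."""
    count = 0
    last_hit_us = None
    for x, y, polarity, t_us in events:
        if y == line_y:
            if last_hit_us is None or t_us - last_hit_us > refractory_us:
                count += 1          # a new burst of events -> a new object crossing
            last_hit_us = t_us
    return count

# Two objects crossing line y=50 about 2 ms apart, each producing a short burst:
events = [(10, 50, 1, 1000), (11, 50, 1, 1005), (40, 50, 1, 3000), (41, 50, 1, 3010)]
print(count_line_crossings(events, line_y=50))  # -> 2
```

Working on the events themselves, rather than on reconstructed frames, is what lets such counting run at very high object rates with little computation.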

'We understand the importance of enabling the development ecosystem around event-based vision technology. This software toolkit is meant to accelerate engineers’ ability to take advantage of its unique benefits without having to start from scratch,' said Luca Verre, CEO and co-founder of Prophesee. 'The tools offer productivity and learning features that are valuable regardless of where a development team is on the adoption curve of event-based vision, and will jump-start design projects with production ready design aids.'

The Metavision Player module has a graphical user interface for visualising and recording data streamed by Prophesee-compatible event-based vision systems. The Designer module consists of a set of libraries, Python APIs and code examples built for quick and efficient integration and testing. Finally, the SDK is a large set of event-based vision algorithms, available via APIs. Algorithms are coded in C++; in the free version they are available as pre-compiled Windows and Linux binaries.

Earlier this year, Sony teamed up with Prophesee to develop a stacked event-based vision sensor, which was announced at the International Solid-State Circuits Conference in San Francisco.
