SynSense and Prophesee to build edge neuromorphic vision chip

SynSense and Prophesee, two leading neuromorphic technology companies, today announced a partnership that will see the two companies leverage their respective expertise in sensing and processing to develop ultra-low-power solutions for implementing intelligence on the edge for event-based vision applications.

The partnership combines in one single chip SynSense’s low-power vision SNN processor DYNAP-CNN® with Prophesee’s event-based Metavision® sensors, and is focused on developing a line of cost-efficient modules that can be manufactured at high volume.

SynSense has quickly become a world leader in commercial neuromorphic computing technology since its founding in 2017. Building on long experience in asynchronous computation, it now produces a line of dedicated event-based vision processors that provide an unprecedented combination of ultra-low power consumption and low-latency performance.

Prophesee broke new ground with the industry's first commercial neuromorphic-based vision sensing platform. Its Metavision solutions use an event-based vision approach which delivers significant improvements over traditional frame-based acquisition methods, reducing power and data processing requirements while allowing operation in the most demanding conditions.

“We couldn’t be more excited to enter into a strategic partnership with Prophesee,” said Ning Qiao, CEO of SynSense. “Both companies are world leaders in their respective fields. The deep cooperation between us will promote the development of vision-based neuromorphic intelligence and will accelerate the commercialization of neuromorphic technology.”

Through this partnership SynSense will benefit from Prophesee’s sensing technology and wide existing network of partners. Prophesee will benefit from deep integration with SynSense’s novel processing technology, and world-leading developments in low-power vision applications.

Event-based vision is a paradigm-shifting advancement that is based on how the human eye records and interprets visual inputs. The sensors facilitate machine vision by recording changes in the scene rather than recording the entire scene at regular intervals. Specific advantages over frame-based approaches include better low-light response (<1 lux) and dynamic range (>120 dB), reduced data generation (10x-1000x less than conventional approaches) leading to lower transfer/processing requirements, and higher temporal resolution (microsecond time resolution, i.e. the equivalent of >10k images per second).

With Prophesee’s patented Event-Based Metavision sensors, each pixel embeds its own processing intelligence, enabling it to activate independently and trigger events.
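The per-pixel behavior described above can be illustrated with a minimal sketch. This is a simplification for intuition only: real event-based pixels operate asynchronously and continuously in analog hardware, whereas the sketch below compares two discrete intensity snapshots. All names, the event fields, and the threshold value are hypothetical, not Prophesee's actual API or sensor parameters.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t_us: int       # timestamp in microseconds
    polarity: int   # +1 = brighter, -1 = darker

def emit_events(prev, curr, t_us, threshold=0.15):
    """Emit an event for each pixel whose intensity changed by more
    than `threshold` between two snapshots; static pixels stay silent."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            delta = c - p
            if abs(delta) >= threshold:
                events.append(Event(x, y, t_us, 1 if delta > 0 else -1))
    return events

# A mostly static 2x2 scene: only the one changed pixel fires an event,
# which is why event sensors generate far less data than frame capture.
prev = [[0.5, 0.5], [0.5, 0.5]]
curr = [[0.5, 0.9], [0.5, 0.5]]
print(emit_events(prev, curr, t_us=1000))
```

The key property this models is sparsity: an unchanging scene produces no output at all, so downstream processing (such as an SNN processor) only receives data when something in the scene actually moves.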

“As IoT applications in smart homes, offices, industries and cities proliferate, there is an increasing need for more intelligence on the edge. Our experience in implementing neuromorphic enabled vision sensing with smart pixel methods complements SynSense’s expertise in neuromorphic processing to make implementing high performance vision more practical. Together we can drive the development of highly efficient, lower cost products that provide more visibility, safety and productivity in a wide range of use cases.” said Luca Verre, CEO and co-founder of Prophesee.

The partnership will address the design, development, manufacturing and commercialization of the combined neuromorphic technology, including sensors, processors and software, to enable a broad range of applications.

“Applying neuromorphic techniques to vision applications represents a large market opportunity in many different sectors. A recent report by Yole Développement forecasts that neuromorphic computing and sensing will represent between 15% and 20% of total AI computing revenue in 2035, roughly a $20B market,” said Ning Qiao, CEO of SynSense.

The combined vision processing solution will be co-marketed by both companies and commercialized by SynSense for addressing IoT and Smart Home detection and gesture control applications.
