Embedded vision market mapped with online tool

The Edge AI and Vision Alliance, formerly the Embedded Vision Alliance, has launched a web-based tool that maps the embedded vision market.

The Embedded Vision and Visual AI Industry Map, developed with Woodside Capital Partners, provides a new way to visualise the embedded vision market. It is a resource for embedded vision professionals to identify prospective customers, suppliers and partners.

'Today, hundreds of companies are developing embedded vision and visual AI building-block technologies, such as processors, algorithms and camera modules – and thousands of companies are creating systems and solutions incorporating these technologies,' said Rudy Burger, managing director of Woodside Capital Partners. 'With so many companies in the space, and new companies entering constantly, it has become difficult to find the companies that match a specific profile or need. We’ve created the Embedded Vision and Visual AI Industry Map to address this challenge.'

The map is a free-to-use tool that provides an easy, efficient way to understand how hundreds of companies fit into the vision industry ecosystem. The tool displays companies within different layers of the vision value chain, and in specific end-application markets.

The map covers the entire embedded vision and visual AI value chain, from sub-components to complete systems.

'From our immersion in the embedded vision and visual AI industry over the past eight years, we know that the right company-to-company partnerships are essential,' said Jeff Bier, founder of the Edge AI and Vision Alliance. 'We’re excited to provide this industry map to help people efficiently find the companies they want to partner with – and to help companies make themselves more visible.'

The Embedded Vision and Visual AI Industry Map is freely accessible at https://www.edge-ai-vision.com/resources/industrymap/.

Related news

27 May 2021: The vision devices use Sony's IMX500 sensor, which can run AI algorithms on-chip to provide real-time information about free parking spaces and other transport data.

20 April 2021: The Kria K26 SOM is built on the Zynq UltraScale+ MPSoC architecture, with 4GB of DDR4 memory and 245 I/Os for connecting sensors.

14 April 2021: The platform integrates the Nvidia Jetson TX2 NX module with an edge-to-cloud software stack including the AWS Panorama Device SDK.

22 February 2021: Participants in the Embedded Camera API Exploratory Group will discuss requirements for new interoperability standards to accelerate market growth and reduce development costs.

19 February 2021: On 4 March, panellists from Basler, MVTec, Sick, and Amazon Web Services will discuss developments in embedded vision during the Embedded World digital show; IMVE's Greg Blackman will moderate.