Asset identification with neuromorphic vision on a drone
This project ports to hardware a proof of concept realized in simulation, in which IBM Spiking Neural Units (SNUs) and neuromorphic vision were exploited in an asset monitoring use case. An existing vision algorithm for object identification was transferred to the neuromorphic paradigm by integrating optical flow computed with SNUs. This method will be implemented in hardware and optimized to highlight the benefits of neuromorphic hardware: high sparsity, low latency, and low power consumption. With these properties, the approach offers an alternative to existing asset identification methods based on, for example, WLAN or Bluetooth.
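To make the neural building block concrete, the following is a minimal sketch of a simplified SNU state update, assuming the commonly described dynamics (leaky integration with spike-triggered reset and a hard threshold); the function name, parameters, and constants are illustrative, not the project's implementation.

```python
import numpy as np

def snu_step(x, s_prev, y_prev, W, b, decay=0.8, threshold=0.0):
    """One time step of a simplified Spiking Neural Unit (SNU).

    The membrane state integrates weighted input on top of the decayed
    previous state, which is reset whenever the neuron spiked on the
    previous step; the output is a binary spike from a hard threshold.
    """
    # Membrane potential: new input plus decayed state, gated by the
    # previous spike (a spike resets the accumulated state to zero).
    s = np.maximum(0.0, W @ x + decay * s_prev * (1.0 - y_prev))
    # Binary spike output via a step function with bias b.
    y = (s + b > threshold).astype(float)
    return s, y
```

In a network, this step is unrolled over time, so sparse event streams translate directly into sparse spike activity.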
The simulated use case is asset monitoring with a drone equipped with a neuromorphic vision sensor. Light beacons attached to assets, such as workers or packages, blink with a visual pattern, similar to Morse code, to transmit useful information. The aim of this project is to recreate this use case in hardware and showcase it as a demonstrator in the fortiss labs, with an event camera embedded on a drone.
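The blinking-beacon idea can be sketched as a simple decoder over event timestamps: bin the ON events observed at a beacon's image location into symbol slots, threshold each slot to recover a bit, and look the bit pattern up in a codebook. The symbol period, codebook entries, and function name below are illustrative assumptions, not the project's actual protocol.

```python
import numpy as np

# Hypothetical decoder sketch for a Morse-like beacon code.
SYMBOL_PERIOD = 0.05  # assumed seconds per bit of the beacon blink pattern
CODEBOOK = {(1, 0, 1, 1): "worker-A", (1, 1, 0, 1): "package-7"}

def decode_beacon(event_times, n_bits=4, period=SYMBOL_PERIOD):
    """Bin ON-event timestamps into symbol slots and threshold each
    slot's activity to recover the transmitted bit pattern."""
    t0 = event_times.min()
    slots = ((event_times - t0) // period).astype(int)   # slot index per event
    counts = np.bincount(slots, minlength=n_bits)[:n_bits]
    bits = tuple((counts > 0).astype(int))               # 1 = beacon lit in slot
    return CODEBOOK.get(bits, "unknown")
```

In the real system the beacon moves, which is why the pattern must first be stabilized onto a track, and this is where the spiking optical flow comes in.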
After a successful implementation in simulation, this project aims to apply SNU-based optical flow estimation to hardware event camera input, as well as to achieve real-time object tracking and identification.
To the best of our knowledge, the low latency characterizing neuromorphic hardware has not yet been exploited for asset monitoring. This project provides a first approach that uses this property to decode variable visual patterns from moving beacons, combining spiking optical flow estimation with an object identification method.
In this nine-month project, the asset monitoring solution should be applied to different real-world use cases, especially those involving a variable transmitted payload.
01.08.2022 – 30.05.2023