
Spiking neural units enable efficient event-controlled cameras

Whether quantum computing, artificial intelligence or deep learning, computer technology is developing at a rapid pace. Yet it can still learn a great deal from nature. Nervous systems with their embedded neural networks, and the transmission of impulses via nerve cells, are marvels of dynamics and energy efficiency. These properties can be transferred to computer applications with the help of so-called spiking neural networks. An ongoing fortiss project involving so-called event-based cameras, in which spiking neural units developed by IBM are employed, demonstrates the associated advantages. These two biologically inspired technologies are building blocks of a rapidly emerging trend in AI: neuromorphic computing, or in other words, computing modeled on neural processes. This blog post explains both technologies.

Processing visual scenes in real time, identifying scattered signal patterns and reacting accordingly: biological organisms can do all of these things, which makes them highly efficient information-processing systems. Their capabilities rest primarily on the many millions of tiny communication units within the organism: the nerve cells, also referred to as neurons. These are highly specialized, highly sensitive cells responsible for forwarding information along the communication paths of our nervous system. A neuron accepts information with the help of electrical and chemical signals, processes it and forwards it on. All of this happens with a dynamism and efficiency that makes today's computer systems tip their hats in respect.

What if this dynamism and efficiency could be deciphered and ported over to computer applications? And what new fields of application could this open up? These are questions raised by scientists at fortiss and IBM Research. In their joint research center, the Center for AI, this led to a project that could eventually open up new application opportunities in fields such as object protection, asset management and vehicle traffic. The project name FAMOUS stands for "Field service and Asset Monitoring with On-board SNU and event-based vision in Simulated drones". Its goal is to use deep learning to demonstrate the utility of IBM spiking neural units (SNUs) in real image-processing applications based on event cameras.

IBM spiking neural units based on biological processes

IBM spiking neural units are based on the principle of neural information processing. Neurons receive impulses in the form of so-called spikes, which are accumulated in the cell body (the soma). As soon as a certain threshold is exceeded, the neuron passes the information along to other neurons. Rather than running continuously, this process occurs only when triggered by an outside impulse; in other words, it is event-driven. It is therefore highly dynamic and low-latency, yet extremely efficient, since the system remains idle until triggered. These are properties that can yield major benefits in modern computer applications.
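To make this accumulate-and-fire behavior concrete, here is a minimal Python sketch of such an event-driven neuron. The threshold, decay constant and reset-to-zero behavior are illustrative assumptions for this post, not the exact dynamics of IBM's units:

```python
def simulate_neuron(inputs, threshold=1.0, decay=0.9):
    """Accumulate incoming impulses in the membrane state ('soma'),
    fire a spike when the threshold is exceeded, then reset."""
    state = 0.0
    spikes = []
    for x in inputs:
        state = decay * state + x      # leaky accumulation of impulses
        if state > threshold:          # threshold exceeded -> fire
            spikes.append(1)
            state = 0.0                # reset after the spike
        else:
            spikes.append(0)           # otherwise the unit stays idle
    return spikes

# A sparse input stream: work happens only when impulses arrive.
print(simulate_neuron([0.0, 0.6, 0.6, 0.0, 0.0, 1.2, 0.0]))  # [0, 0, 1, 0, 0, 1, 0]
```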


Spiking Neural Unit
Left: neural information processing principles. As soon as a threshold is exceeded, the neuron forwards the information to other neurons.
Right: the IBM spiking neural unit follows this principle and can be seamlessly integrated into common AI systems.

IBM researchers replicated this neural process using building blocks of a deep-learning framework, which led to the creation of the IBM spiking neural units. These SNUs can be seamlessly integrated into common AI systems and, to a certain extent, form a bridge between biological processes and the digital world: they bring into AI systems the way biological processes handle information, along with the enormous dynamism this provides. Impulse transmission by means of spikes, and the distinctive neural dynamics behind it, are the most important advantages of biological processing that can be ported to a deep-learning framework. The principle is also transferable to other biological processes, depending on which properties an AI system is to adopt from biology. IBM is making the framework available for this purpose (see below: NeuroAI Toolkit).
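As a rough sketch of how such a unit can live inside a deep-learning framework, the following NumPy example implements an SNU-style recurrent cell, loosely following the SNU state equations published by IBM researchers: the state accumulates weighted input, decays over time, and is reset for units that spiked on the previous step. The weights, decay constant and step activation here are placeholder assumptions, not IBM's released implementation:

```python
import numpy as np

def snu_step(x, s_prev, y_prev, W, b, decay=0.8):
    """One time step of an SNU-style recurrent cell (sketch)."""
    # Accumulate weighted input; (1 - y_prev) resets units that just spiked.
    s = np.maximum(0.0, W @ x + decay * s_prev * (1.0 - y_prev))
    y = (s + b > 0.0).astype(float)   # step activation: spike if above threshold
    return s, y

# Toy usage: 2 inputs, 3 units, random weights (illustrative values only).
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
b = -0.5 * np.ones(3)                 # negative bias acts as a firing threshold
s, y = np.zeros(3), np.zeros(3)
for t in range(5):
    x = rng.random(2)                 # stand-in for an input spike pattern
    s, y = snu_step(x, s, y, W, b)
    print(t, y)
```

Because the cell is built from ordinary differentiable-framework operations, it can be trained and stacked like any other recurrent layer, which is what makes the bridge to common AI systems possible.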

Employing deep learning: faster and more thorough decoding of visual scenes

The FAMOUS project specifically exploits the previously mentioned advantages of event-based information transfer: low latency and higher energy efficiency through event-driven operation. Researchers at fortiss are using the technology in so-called event-based cameras to identify vehicles and other objects from drones and to map their movements. The advantage of event-driven cameras is that they capture visual scenes much faster and more completely.

Although "normal" video cameras can also identify and track objects, as they already do in many applications today, their recordings are slower and less complete than event-driven recordings modeled on neural transmission.

Simulated crane
Right: a simulated crane from the perspective of a drone using a conventional camera.
Left: the same scene using an event camera. Since the drone moves to the left, it records positive events (red) and negative events (blue).

This is explained by the recording principle behind conventional cameras, which rapidly capture sequential images, so-called frames, that together make up the video. The higher the frame rate, measured in frames per second (FPS), the more image information is captured.

Cinema films run at 24 FPS, which means that 24 still images are shown each second. The speed of the sequence tricks our brain into perceiving the movement as fluid.
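For a feel for the numbers, the blind interval between frames follows directly from the frame rate (the microsecond figure in the comment is a typical order of magnitude from the event-camera literature, not a project measurement):

```python
for fps in (24, 60, 240):
    gap_ms = 1000 / fps   # time between two consecutive frames
    print(f"{fps:3d} FPS -> {gap_ms:5.1f} ms blind interval between frames")
# At 24 FPS, nothing is recorded for ~41.7 ms at a time; event cameras,
# by contrast, timestamp changes with microsecond-scale resolution.
```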

Event cameras also record information between the frames

The downside of conventional, frame-based recording is that events occurring between the frames are not captured, leading to a loss of information. Decoding the full images also costs time. Event-driven cameras, in contrast, record only the visual information that changes, which means far less data has to be processed, and far faster.
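A common simplified model from the event-vision literature captures this principle: each pixel emits a positive or negative event whenever its log-brightness changes by more than a contrast threshold. The following sketch applies that model to a toy frame sequence; the threshold value and scene are illustrative assumptions:

```python
import numpy as np

def frames_to_events(frames, contrast_threshold=0.2):
    """Emit (t, y, x, polarity) events where per-pixel log-brightness
    has changed by more than the threshold since the pixel's last event."""
    log_ref = np.log(frames[0] + 1e-6)         # per-pixel reference brightness
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        log_i = np.log(frame + 1e-6)
        diff = log_i - log_ref
        fired = np.abs(diff) >= contrast_threshold
        for yx in zip(*np.nonzero(fired)):
            events.append((t, yx[0], yx[1], 1 if diff[yx] > 0 else -1))
        log_ref[fired] = log_i[fired]          # reset reference where events fired
    return events

# Toy scene: a bright 'object' moving one pixel to the right per frame.
frames = np.full((3, 4, 4), 0.1)
for t in range(3):
    frames[t, 1, t] = 1.0
print(frames_to_events(frames))  # only the changed pixels generate events
```

Static background pixels never appear in the output, which is exactly why the event stream is so much sparser, and cheaper to process, than full frames.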


Simulated construction site
A simulated construction site in which different objects or persons are to be identified by means of the “active optical identification sensor.”

To demonstrate these advantages, fortiss equipped simulated drones in the Neurorobotics Platform with event-driven cameras that monitored different moving objects in an industrial work environment.

Thanks to an "active optical identification sensor", the objects are detected, identified and localized. The sensor is perfectly matched to event-driven cameras: identification and monitoring in this environment is much faster and more precise than with conventional AI-based cameras.
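The post does not detail how the sensor works, but a plausible decoding step, assuming active markers that blink at object-specific frequencies, which event cameras resolve easily thanks to their fine timestamps, might look like this sketch. All names, frequencies and the lookup scheme are hypothetical:

```python
import numpy as np

def identify_beacon(event_timestamps, id_table, tol_hz=5.0):
    """Estimate a blink frequency from event timestamps at one image
    region and look it up in a table of known object IDs (hypothetical)."""
    ts = np.sort(np.asarray(event_timestamps))
    if len(ts) < 2:
        return None
    freq = 1.0 / np.median(np.diff(ts))        # median inter-event interval -> Hz
    for object_id, f in id_table.items():
        if abs(freq - f) <= tol_hz:
            return object_id
    return None

# Illustrative table: each object blinks at its own frequency.
id_table = {"excavator": 100.0, "worker_vest": 200.0, "crane_hook": 400.0}
ts = np.arange(0.0, 0.05, 1.0 / 200.0)         # ~200 Hz event train from one region
print(identify_beacon(ts, id_table))           # -> 'worker_vest'
```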

Vehicle traffic or industrial environment applications

Myriad application opportunities exist for employing the new technologies, particularly when it comes to getting a fast and detailed picture of the current situation. As shown in the example, companies can determine the exact position of employees, machines and equipment at a construction site at any time, without GPS or a wireless connection. This allows people and equipment to be located quickly and brought to safety in case of fire or a disaster. Vehicle traffic applications are another possibility, such as for tracking police, fire or rescue vehicles and switching the traffic signals to green along their route.


3D views of the FAMOUS simulation scene.

The field of computer science has always enjoyed working with analogies to biology, as evidenced by terms such as autonomic computing or artificial intelligence. The idea of drawing on efficient and successful biological "solutions" for deep-learning training is a newer approach, and one that opens up a promising field of new applications.

After all, nature is full of clever solutions that can serve as an inspiration for further fields of application for AI.
