Machine Learning for Embedded Systems

The general trend in artificial intelligence in recent years has been toward ever larger, higher-performance models. In neural networks in particular, the number of layers and weights is growing explosively, a direct consequence of the establishment of deep neural networks. But is bigger automatically better? The size of these networks demands ever larger, more energy-hungry and more expensive hardware. To conserve resources and protect the climate, Fraunhofer IMS is developing machine learning methods that run directly on smart sensors, covering both inference and actual learning.


Logo of the AIfES (Artificial Intelligence for Embedded Systems) software framework, © Fraunhofer IMS

To enable online training on very small embedded systems, such as an 8-bit microcontroller, new concepts are necessary: one of them is distributed learning. A single microcontroller is not capable of carrying out the entire learning process on its own, but networked embedded systems can master this task together. The machine learning workload is distributed along classes, layers and data. Fraunhofer IMS is researching the automated realization of distributed learning methods for a wide variety of embedded systems, using the AI software framework AIfES, which is also an in-house development.
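To make distribution along the data axis concrete, the following sketch simulates several embedded devices that each train a small model on their local data shard, while a coordinator averages the resulting weights after every round. It is a minimal Python/NumPy illustration under our own assumptions, not AIfES code, and all function names are hypothetical.

```python
import numpy as np

def local_train(w, X, y, lr=0.1, epochs=20):
    """Train a tiny logistic-regression 'device model' on its local data shard."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # forward pass (sigmoid)
        grad = X.T @ (p - y) / len(y)        # gradient of the log loss
        w = w - lr * grad                    # local gradient step
    return w

def distributed_round(w, shards, labels):
    """One communication round: every device trains locally,
    the coordinator averages the returned weights."""
    updates = [local_train(w.copy(), X, y) for X, y in zip(shards, labels)]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (X @ np.array([1.0, -2.0, 0.5, 0.0]) > 0).astype(float)

shards = np.array_split(X, 3)    # data distributed over three devices
labels = np.array_split(y, 3)

w = np.zeros(4)
for _ in range(10):              # ten communication rounds
    w = distributed_round(w, shards, labels)

print("accuracy:", np.mean(((X @ w) > 0) == y))
```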

Another research focus of Fraunhofer IMS is intelligent feature extraction for smart sensors. Deep neural networks perform feature detection implicitly, but at the cost of model size, which rules them out for small embedded systems. With feature extraction specifically adapted to the problem at hand, the network can be kept as small as possible while maintaining the highest possible accuracy. These concepts have already been implemented in LiDAR sensors and for the detection of atrial fibrillation in ECG signals.
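As an illustration of problem-adapted feature extraction, the sketch below reduces a hypothetical sequence of ECG R-peak times to a handful of rhythm features; with such hand-crafted inputs, a classifier for atrial fibrillation no longer needs the raw waveform, so the downstream network can stay very small. The feature set shown is a common textbook choice picked by us for illustration, not necessarily the one used in the IMS sensors.

```python
import numpy as np

def rr_features(r_peak_times_s):
    """Compact rhythm features from R-peak timestamps (in seconds).

    Atrial fibrillation shows up as highly irregular RR intervals,
    so simple variability statistics already carry most of the signal.
    """
    rr = np.diff(r_peak_times_s)            # RR intervals
    drr = np.diff(rr)                       # beat-to-beat changes
    return np.array([
        rr.mean(),                          # mean heart period
        rr.std(),                           # overall variability
        np.sqrt(np.mean(drr ** 2)),         # RMSSD
        np.mean(np.abs(drr) > 0.05),        # share of changes > 50 ms
    ])

# hypothetical, irregular beat sequence (timestamps in seconds)
peaks = np.cumsum([0.0, 0.62, 0.95, 0.58, 1.10, 0.70, 0.55, 1.02])
print(rr_features(peaks))                   # 4 values instead of raw samples
```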

Data processing flow in a LiDAR sensor, © Fraunhofer IMS
Use of a Physics-Guided Neural Network (PGNN) for the detection of bearing damage, © Fraunhofer IMS

Machine learning always requires data that represent the requirements of the use case as realistically as possible. However, sufficient or high-quality data do not exist for every application. One solution is to train on synthetic data. To make the synthetic data used in the learning process as realistic as possible, Fraunhofer IMS uses hybrid models in the form of Physics-Guided Neural Networks (PGNNs), merging the worlds of simulation and machine learning.
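One way to picture this coupling is a loss function with two parts: a conventional data term and a penalty for predictions that contradict the physical model. The sketch below uses a deliberately simple toy constraint (a freely cooling part must not heat up over time); the constraint, data and names are our own illustration, not the bearing-damage model shown in the figure above.

```python
import numpy as np

def pgnn_loss(y_pred, y_true, t, lam=0.5):
    """Physics-guided loss = data term + physics-consistency term."""
    data_loss = np.mean((y_pred - y_true) ** 2)          # fit to measurements
    dT = np.diff(y_pred) / np.diff(t)                    # predicted temperature gradient
    physics_loss = np.mean(np.maximum(dT, 0.0) ** 2)     # penalize any heating
    return data_loss + lam * physics_loss

# toy cooling curve and two candidate predictions
t = np.linspace(0.0, 10.0, 11)
y_true = 20.0 + 60.0 * np.exp(-0.3 * t)
plausible = y_true + 0.5                  # small offset, physically consistent
unphysical = y_true + 5.0 * np.sin(t)     # oscillates, i.e. heats up at times

print(pgnn_loss(plausible, y_true, t))    # small: data and physics agree
print(pgnn_loss(unphysical, y_true, t))   # large: physics term adds a penalty
```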


Fraunhofer IMS researches not only supervised but also unsupervised learning methods. One application example is Nonintrusive Load Monitoring (NILM). The aim is to determine device-specific energy consumption from a single aggregate measurement, without intervening in the electrical installation itself. Possible applications include energy efficiency, condition monitoring and predictive maintenance.
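The basic unsupervised idea can be sketched on a made-up aggregate power trace: switching events appear as steps in the total power, and steps of similar magnitude are grouped into appliances without any labels. The thresholds, trace and grouping heuristic below are illustrative stand-ins for a real disaggregation algorithm.

```python
import numpy as np

def detect_steps(power_w, threshold_w=30.0):
    """Switching events show up as large jumps in the aggregate power."""
    diff = np.diff(power_w)
    return diff[np.abs(diff) > threshold_w]

def group_by_magnitude(steps_w, tol_w=25.0):
    """Unsupervised grouping: steps of similar size are attributed
    to the same (unknown) appliance."""
    groups = []                                   # [representative size, members]
    for s in np.abs(steps_w):
        for g in groups:
            if abs(s - g[0]) < tol_w:
                g[1].append(s)
                g[0] = np.mean(g[1])              # update representative size
                break
        else:
            groups.append([s, [s]])
    return [g[0] for g in groups]

# hypothetical aggregate measurement: base load, a ~100 W and a ~2000 W device
power = np.concatenate([
    np.full(20, 50.0),      # base load only
    np.full(20, 150.0),     # small device switches on
    np.full(10, 2150.0),    # large device on as well
    np.full(20, 150.0),     # large device off
    np.full(20, 50.0),      # small device off
])

print(group_by_magnitude(detect_steps(power)))    # two groups: ~100 W and ~2000 W
```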

Disaggregation of energy consumers with the help of Nonintrusive Load Monitoring (NILM), © Fraunhofer IMS

Our technologies - Innovations for your products

Distributed Learning

Distributed learning enables the training of complex tasks across multiple small embedded systems.

Hybrid Learning and PGNNs

When training data are insufficient, simulations based on physical models help to improve the data basis.

Feature Extraction for Smart Sensors

By means of adapted feature extraction the size of a neural network can be reduced.

Embedded AI for LiDAR

Embedded AI can be used to accelerate distance measurement with LiDAR sensors and to improve its quality.

Industry 4.0 Research

Against the background of "Industry 4.0", Fraunhofer IMS is researching predictive maintenance solutions for the manufacturing industry.

TimestampsAI

A faster, data-reduced solution for high-resolution LiDAR systems: TimestampsAI is our latest development for your applications, capturing complex scenes in 3D in any environment.

Our technology areas - Our technologies for your development

Communication and Networking

Communication interfaces allow data exchange with other devices and connection to networks.

User Interfaces

User interfaces, as the link between device and user, allow the configuration and operation of a product.

Computer Vision

Computer Vision methods extract the maximum amount of information from image data.


Embedded Software and Artificial Intelligence (Home)

Here you can return to the overview page of the core competence Embedded Software and Artificial Intelligence (ESA).