Technical Details of AIfES

Resource-saving programming

AIfES functions work explicitly with pointer arithmetic and declare only the variables that are strictly necessary within a function. The storage areas for the training data and the weights are provided by the main program; AIfES functions access these storage areas via pointers that are passed in, so the functions themselves require very few resources.
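
A minimal sketch of this idea (the function name ann_forward and the buffer layout are purely illustrative, not the actual AIfES API): the main program owns all storage areas and the library function only receives pointers to them.

    #include <stddef.h>

    /* Illustrative sketch: the main program owns all storage areas,
     * the function only receives pointers and allocates nothing itself. */
    static void ann_forward(const float *weights, const float *inputs,
                            float *outputs, size_t n_in)
    {
        float sum = weights[0];                     /* bias */
        for (size_t i = 0; i < n_in; i++)
            sum += weights[i + 1] * inputs[i];      /* weighted sum */
        outputs[0] = sum;
    }

    int main(void)
    {
        static float weights[4] = { 0.1f, 0.5f, -0.3f, 0.8f }; /* provided by the main program */
        static float inputs[3]  = { 1.0f, 2.0f, 3.0f };
        static float outputs[1];

        ann_forward(weights, inputs, outputs, 3);   /* only pointers are transferred */
        return 0;
    }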

Platform independent and compatible

Because the code is written to be compiled with GCC, porting to nearly any platform is possible. This enables a completely self-sufficient integration, including the learning algorithm, on an embedded system. Depending on the customer's request, the Fraunhofer IMS can support compiling the source code for the respective platform.

For use with Windows, for example, the source code is compiled as a »Dynamic Link Library« (DLL) so that it can be integrated into software tools such as LabVIEW or MATLAB. The direct connection to MATLAB in particular is useful for testing different kinds of data processing.

Integration into different software development environments such as Visual Studio or a Python IDE is also possible. The main program that binds the DLL can therefore also be written in a different programming language such as C++, C#, Python, VB.NET, Java, …
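
As an illustration, a minimal sketch of binding such a DLL from a C main program on Windows; the DLL name aifes.dll and the exported function aifes_predict are assumptions and stand for whatever the compiled library actually exports.

    #include <windows.h>
    #include <stdio.h>

    /* Assumed signature of an exported AIfES function (illustrative only). */
    typedef void (*aifes_predict_fn)(const float *inputs, float *outputs);

    int main(void)
    {
        HMODULE lib = LoadLibraryA("aifes.dll");    /* example DLL name */
        if (lib == NULL) {
            fprintf(stderr, "could not load aifes.dll\n");
            return 1;
        }

        aifes_predict_fn predict =
            (aifes_predict_fn)GetProcAddress(lib, "aifes_predict");
        if (predict != NULL) {
            float in[3]  = { 0.1f, 0.2f, 0.3f };
            float out[1] = { 0.0f };
            predict(in, out);                       /* call into the DLL */
            printf("result: %f\n", out[0]);
        }

        FreeLibrary(lib);
        return 0;
    }

A Python or C# main program would bind the same DLL via ctypes or P/Invoke, respectively.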

For the initial development of an individual ANN, a PC is a suitable platform because it allows fast calculations. Once the right configuration has been found, the network can be ported to the embedded system.

A small selection of the platforms and microcontrollers on which AIfES has already been tested:

  • Windows (DLL)
  • Raspberry Pi with Raspbian
  • Arduino UNO
  • ATmega32U4
  • STM32 F4 Series (ARM Cortex-M4)

The AIfES library

Figure: Creating and transferring an ANN with AIfES (chart, © Fraunhofer IMS)

Figure: Compatibility and memory access with AIfES (chart, © Fraunhofer IMS)

The artificial neural network in AIfES

At the moment AIfES provides a feedforward network that is configurable in almost all parameters and also allows deeper network structures. The ANN is configured and computed by a single main function.
Both regression and classification tasks are possible. The network structure can also be adjusted to individual customer requirements.

A short overview of the features (a configuration sketch follows this list):

  • The number of inputs and outputs is freely definable
  • The number of hidden layers and the neurons per layer are configurable
  • Different activation functions with additional parameters
    • Sigmoid, Softsign, ReLU, PReLU, …
  • Activation functions can be adapted to customer requirements
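
The configuration sketch mentioned above could look like this in the calling program; the structure, the enum, and the field names are hypothetical and only illustrate the freely definable parameters from the list.

    #include <stddef.h>

    /* Hypothetical selection of an activation function. */
    typedef enum { ACT_SIGMOID, ACT_SOFTSIGN, ACT_RELU, ACT_PRELU } activation_t;

    /* Hypothetical network description: every size is chosen by the caller. */
    typedef struct {
        size_t        n_inputs;          /* freely definable number of inputs    */
        size_t        n_outputs;         /* freely definable number of outputs   */
        size_t        n_hidden_layers;   /* configurable number of hidden layers */
        const size_t *neurons_per_layer; /* neurons in each hidden layer         */
        activation_t  activation;        /* selected activation function         */
        float         activation_param;  /* e.g. slope parameter for PReLU       */
        float        *weights;           /* weight storage owned by the caller   */
    } ann_config_t;

    int main(void)
    {
        static const size_t hidden[2] = { 8, 4 };  /* two hidden layers */
        static float weights[73];                  /* 3-8-4-1 topology incl. biases */

        ann_config_t cfg = {
            .n_inputs = 3, .n_outputs = 1,
            .n_hidden_layers = 2, .neurons_per_layer = hidden,
            .activation = ACT_PRELU, .activation_param = 0.01f,
            .weights = weights
        };
        (void)cfg;   /* would be handed to the single main AIfES function */
        return 0;
    }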
Figure: Neural network activation functions in AIfES (© Fraunhofer IMS)

Pre-trained model or training on the embedded system

AIfES offers two possibilities for integrating a neural network on an embedded system:


The first is the classical option: the neural network is pre-trained on a high-performance system such as a PC and afterwards transferred to the embedded system. Due to the design of AIfES this integration can be done without any detours, because all platforms use the same source code and only the weights have to be transferred.
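
A minimal sketch of this workflow, assuming the weights trained on the PC are exported as a C array (the values and names below are placeholders) and compiled into the firmware:

    /* Weights trained on the PC, exported as a C array; 'const' lets the
     * compiler place them in flash on the embedded target. */
    static const float trained_weights[] = {
         0.42f, -1.07f,  0.33f,  0.91f,
        -0.58f,  0.12f,  0.77f, -0.25f
    };

    int main(void)
    {
        /* The same source code runs on PC and microcontroller, so the
         * embedded program only points the network at these weights. */
        const float *active_weights = trained_weights;
        (void)active_weights;
        return 0;
    }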

The second option is training on the embedded system itself. This can be useful if a sensor is supposed to calibrate itself or if retraining is necessary; the latter can help to compensate for production-related deviations between systems. Another example is the decentralization of intelligence in adaptive embedded systems.

Learning algorithms

AIfES already contains two different learning algorithms, which can both be used on microcontrollers:

Backpropagation:

The backpropagation learning method has been implemented with the common setting parameters. A short overview of the current features (a sketch of the weight update follows this list):

  • Online and batch backpropagation
  • Momentum
  • Automatic global learning-rate adjustment
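
The sketch below illustrates the listed features with a plain gradient-descent weight update including momentum; it is a generic textbook formulation, not the actual AIfES implementation. In online mode it would be called after every training sample, in batch mode after the gradients of a whole batch have been accumulated.

    #include <stddef.h>

    /* Generic weight update with momentum: 'velocity' keeps a fraction of
     * the previous update, 'gradient' was accumulated by backpropagation. */
    static void update_weights(float *weights, float *velocity,
                               const float *gradient, size_t n,
                               float learning_rate, float momentum)
    {
        for (size_t i = 0; i < n; i++) {
            velocity[i] = momentum * velocity[i] - learning_rate * gradient[i];
            weights[i] += velocity[i];
        }
    }

    int main(void)
    {
        float w[2] = { 0.5f, -0.2f }, v[2] = { 0.0f, 0.0f };
        float g[2] = { 0.1f, -0.3f };
        update_weights(w, v, g, 2, 0.01f, 0.9f);   /* one update step */
        return 0;
    }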

Evolutionary/genetic learning methods

Since AIfES was developed primarily for small to medium-sized neural networks, evolutionary learning is particularly well suited. The Fraunhofer IMS has developed a learning method based on an evolutionary/genetic algorithm that is capable of escaping local minima on its own. Because of the required population, this approach needs more memory, but practical studies with sensor data show that the algorithm offers a clear benefit for smaller networks compared to the classic backpropagation method. Not only was the total error smaller in these studies, the parameterization effort was also significantly lower: where the backpropagation method was sensitive to changes in the learning rate, the evolutionary method could be used with one consistent configuration. This is important in applications where no manual adjustment is possible.

A practical example of this learning method is our handwriting recognition demonstrator.

The algorithm is implemented in such a way that custom fitness functions can be integrated, so it can also be used as a universal optimizer.
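
A minimal sketch of such a population-based optimizer with a user-supplied fitness function; this is a generic (1+λ)-style illustration, not the Fraunhofer implementation.

    #include <stdlib.h>
    #include <string.h>

    #define POP_SIZE 20
    #define N_PARAMS 16        /* e.g. the weights of a small network */

    /* User-supplied fitness function: higher values are better. */
    typedef float (*fitness_fn)(const float params[N_PARAMS]);

    static float rand_uniform(float lo, float hi)
    {
        return lo + (hi - lo) * ((float)rand() / (float)RAND_MAX);
    }

    /* Mutate the current best individual POP_SIZE times per generation
     * and keep the fittest candidate found so far. */
    static void evolve(float best[N_PARAMS], fitness_fn fitness, int generations)
    {
        float candidate[N_PARAMS];
        float best_fit = fitness(best);

        for (int g = 0; g < generations; g++) {
            for (int i = 0; i < POP_SIZE; i++) {
                for (int j = 0; j < N_PARAMS; j++)
                    candidate[j] = best[j] + rand_uniform(-0.1f, 0.1f);
                float f = fitness(candidate);
                if (f > best_fit) {
                    best_fit = f;
                    memcpy(best, candidate, sizeof(candidate));
                }
            }
        }
    }

    /* Example fitness: maximize the negative sum of squares (optimum at 0). */
    static float example_fitness(const float params[N_PARAMS])
    {
        float s = 0.0f;
        for (int j = 0; j < N_PARAMS; j++)
            s -= params[j] * params[j];
        return s;
    }

    int main(void)
    {
        float best[N_PARAMS];
        for (int j = 0; j < N_PARAMS; j++)
            best[j] = 1.0f;                 /* arbitrary starting point */
        evolve(best, example_fitness, 100);
        return 0;
    }

In a network context the fitness function would, for example, return the negative total error of the network on recorded sensor data.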

Multicore

The evolutionary learning process can be used in a multicore variant on a PC or on the Raspberry Pi to reduce the learning time.
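
A minimal sketch of the idea, assuming POSIX threads: the fitness of the population members is evaluated in parallel by several worker threads. This is illustrative only and not the actual AIfES code.

    #include <pthread.h>
    #include <stdio.h>

    #define POP_SIZE  8
    #define N_THREADS 4

    static float fitness_results[POP_SIZE];

    /* Each thread evaluates the fitness of its share of the population. */
    static void *worker(void *arg)
    {
        int id = *(int *)arg;
        for (int i = id; i < POP_SIZE; i += N_THREADS)
            fitness_results[i] = (float)i;   /* placeholder for the real fitness evaluation */
        return NULL;
    }

    int main(void)
    {
        pthread_t threads[N_THREADS];
        int ids[N_THREADS];

        for (int t = 0; t < N_THREADS; t++) {
            ids[t] = t;
            pthread_create(&threads[t], NULL, worker, &ids[t]);
        }
        for (int t = 0; t < N_THREADS; t++)
            pthread_join(threads[t], NULL);

        printf("evaluated %d individuals on %d threads\n", POP_SIZE, N_THREADS);
        return 0;
    }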

Technical Details

Platform independence and special learning techniques distinguish AIfES.

Range of Application

Human-Technology Interaction, Industry 4.0, Metrology, Medical Technology, Machine Learning Algorithms and Hardware Accelerators.

Licensing

How can I use AIfES and what does the Fraunhofer IMS offer besides AIfES? You can find out more about our services here.