New possibilities arising with AIfES
Fraunhofer IMS has developed AIfES, a platform-independent and constantly growing machine learning library written in the C programming language, which provides a fully configurable Feedforward Neural Network (FNN). AIfES uses only standard libraries based on the GNU Compiler Collection (GCC). The source code is reduced to a minimum, so even integration on a microcontroller, including the learning algorithms, is possible. AIfES runs on almost any hardware, from 8-bit microcontrollers to smartphones and PCs.
AIfES is Open Source
AIfES is offered as a dual license model. For private projects or developers of Free Open Source Software (FOSS) under the GNU General Public License (GPL) version 3, AIfES can be used free of charge. If AIfES is to be combined and distributed with commercially licensed software and/or if the AIfES source code for the commercially licensed software is not to be distributed under the GNU GPL V3, a license agreement must be concluded with Fraunhofer IMS. Further information and a way to contact us can be found here.
AIfES for Arduino®
Fraunhofer IMS and Arduino are preparing to enter a partnership around AIfES. For this purpose, a version compatible with the Arduino IDE has been realized, which can run on almost any Arduino board. Follow this link to go directly to the corresponding repository.
Intelligent microelectronics and sensors
AIfES allows AI to run on small, intelligent microelectronics and sensors, independent of connectivity to a cloud or to a powerful, resource-hungry processing entity, while providing full AI mechanisms such as on-device learning. This opens the door to many new applications, ranging from real-time evaluation of sensor data and sensor calibration to pattern recognition and classification. In addition, the development of virtual sensors becomes possible: a new target value can be derived by modeling its dependency on representative measured values.
Easy start and compatibility with other frameworks
AIfES is very similar to and compatible with the well-known Python ML frameworks such as TensorFlow, Keras, and PyTorch, but with a deliberately reduced feature set. The current version supports Feedforward Neural Networks (FNN), which can be configured completely freely. The popular activation functions such as ReLU, Sigmoid, and Softmax are already integrated. A full implementation of Convolutional Neural Networks (ConvNets) will follow in the near future.
Model development is also modeled on the Python frameworks, so users will find their way around immediately. The common training algorithms, such as the Stochastic Gradient Descent (SGD) optimizer and the Adam optimizer, are included as well.
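The core of gradient-descent training fits in a few lines: each weight is nudged against its gradient, scaled by a learning rate. This is a generic sketch of that update rule, not the AIfES optimizer itself (which also offers Adam):

```c
#include <stddef.h>

/* One plain gradient-descent update step: w <- w - lr * grad.
 * Generic illustration of the SGD update rule; not AIfES code. */
void sgd_step(float *weights, const float *grads, size_t n, float lr)
{
    for (size_t i = 0; i < n; ++i)
        weights[i] -= lr * grads[i];
}
```

Adam extends this rule with per-weight running averages of the gradient and its square, which typically speeds up convergence at the cost of extra state per weight.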
It is possible to import an already trained FNN from another ML framework: only the network structure and the weights are needed to reproduce the FNN and even to train it further. A Keras example is included in the library, and it is not necessary to port the model to TensorFlow Lite. Of course, the weights of an FNN in AIfES can also be read out, stored, or sent directly to another device. Different FNNs can be loaded at runtime by reconfiguring the model and loading new weights, and the weight sets can be stored in the cloud to make them available to other devices. The applications are almost limitless.
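"Only the network structure and the weights" means an FNN can be serialized as a list of layer sizes plus one flat parameter array. The sketch below shows a hypothetical container for this (the struct and function are our own, not AIfES types) and how the parameter count follows from the layer sizes:

```c
#include <stddef.h>

/* Hypothetical container: layer sizes plus a flat parameter array
 * are enough to reproduce an FNN on another device. Not an AIfES type. */
typedef struct {
    size_t n_layers;           /* number of layers, incl. input layer */
    const size_t *layer_sizes; /* neurons per layer                   */
    const float *params;       /* flattened weights and biases        */
} fnn_model_t;

/* Total parameter count: for each connection between consecutive
 * layers, (inputs * outputs) weights plus (outputs) biases. */
size_t fnn_param_count(const size_t *sizes, size_t n_layers)
{
    size_t total = 0;
    for (size_t l = 1; l < n_layers; ++l)
        total += sizes[l - 1] * sizes[l] + sizes[l];
    return total;
}
```

For example, a 2-3-1 network carries 2*3+3 = 9 parameters in the first layer and 3*1+1 = 4 in the second, 13 in total; exporting those 13 floats from Keras is all the transfer requires.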
An embedded system with different tasks is possible
The design of the AIfES functions inside the library enables, for example, the calculation of an ANN, including its parameterization, with a single function call. This structure allows an embedded system to be completely reconfigured to perform a totally different task afterwards: the weights of the ANN are exchanged and, if necessary, the network structure is changed. The same AIfES functions remain in use; only the parameters passed to them are altered.
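The reconfiguration idea can be sketched as one generic compute routine that receives its weight set as a parameter, so switching tasks means switching the configuration passed in. The names below are hypothetical illustrations, not AIfES identifiers, and the "network" is reduced to a single linear layer for brevity:

```c
#include <stddef.h>

/* Hypothetical task configuration: the weight set (and sizes) that
 * parameterize one generic compute function. Not an AIfES type. */
typedef struct {
    const float *weights;
    size_t n_in;
} task_config_t;

/* Weight sets for two different tasks sharing the same routine. */
static const float task_a_weights[2] = { 0.5f, -0.2f };
static const float task_b_weights[2] = { 1.0f,  1.0f };

/* One shared routine; only the configuration changes per task.
 * (A real ANN would add layers and activations here.) */
float run_task(const task_config_t *cfg, const float *in)
{
    float sum = 0.0f;
    for (size_t i = 0; i < cfg->n_in; ++i)
        sum += cfg->weights[i] * in[i];
    return sum;
}
```

Swapping `task_a_weights` for `task_b_weights` in the configuration repurposes the device without touching the compute code, which mirrors the weight-exchange mechanism described above.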
Because processing can take place offline on the device, no sensitive data needs to be transferred.
AIfES decentralizes processing power: small, intelligent embedded systems process the data and provide calculated results to the next higher entity, avoiding raw-data overload of the whole system. As a result, the total amount of transferred data is significantly reduced and the system as a whole can act faster and more efficiently.
Finally, it is possible to build a network of small, adaptive systems that share tasks among themselves.