Artificial Intelligence: An exchange of ideas

Today's applications of AI and a vision of future developments

In this interview, technology experts Samuel Thomas Stähle, CEO of PowerBrain.Shop®, and Ronald Sieber, CEO of SYS TEC electronic AG, discuss the latest trends and developments in artificial intelligence.

From your perspective, how have the development, architecture and capabilities of artificially intelligent software - the colloquial "artificial intelligence" - evolved over the past few decades?

Sieber: In recent years, the industry, and thus also our company, has made significant progress in the development of very small computer architectures and embedded systems. We have been able to increase their robustness and steadily reduce their prices, mainly thanks to the ever-increasing integration density of modern components. Ultimately, this means that today's industrial edge devices have many times the computing power that was available around 50 years ago to the Apollo spacecraft, which carried people safely from the earth to the moon and back. Current edge controllers are now so powerful that we can run modern AI software directly on them.

Stähle: From a computer science perspective, AI algorithms and data structures have been under research and development for about 70 years - and this process is far from complete. It will probably never be completed, but will keep developing - with a continuous trend towards standardization and simpler applicability.

I programmed my first AI algorithms more than 20 years ago, in the 1990s. We used them for optical character recognition to evaluate print quality on curved surfaces in Germany, working with large and expensive specialized computers, very expensive digital line-scan cameras and optics, and software we wrote ourselves. At that time, buying AI modularly as a product or service was out of the question.

Today, however, there are much cheaper and miniaturized computers, including special chipsets for AI data processing. AI software is now available on the open market in industrial quality, either as a cloud service or for use directly at the edge, such as our Edge AI PowerBrain™.

This has, of course, revolutionized the use of AI, and its logic ultimately follows a familiar market pattern: a new quality is replaced by a new quantity, which in turn is replaced - iteratively - by a new quality, and so on. Over the last quarter of a century, it has become clear that AI, too, follows this cycle as a development paradigm.

What is your perception of the market - what about customer acceptance and project applications of AI in 2021?

Sieber: For years, our company has been manufacturing more and more embedded systems with AI support on behalf of customers, for example using i.MX 7 processors or NVIDIA® chipsets and large memory configurations. We therefore see steadily growing demand - primarily from the Industry 4.0, mobility and infrastructure management segments.

In the area of Industry 4.0, users are increasingly concerned with whom they entrust their data to and what happens to it. Paradigms such as data sovereignty and data economy are becoming increasingly important. Here, our edge controllers offer the great advantage that the devices already have enough computing power for AI projects to be implemented locally on site. In most cases, only status information or error messages are then transmitted to the cloud, while the often sensitive sensor readings never leave the local edge controller.
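
To illustrate this pattern, here is a minimal Python sketch - with purely hypothetical names such as read_vibration_window and publish_status, not an excerpt from any product - showing how raw readings can be evaluated directly on the edge controller so that only a compact status message ever leaves the device:

```python
# Minimal sketch (hypothetical names): evaluate sensor readings locally on the
# edge controller and forward only a compact status message, never the raw data.
import json
import statistics

VIBRATION_LIMIT_MM_S = 4.5  # assumed alarm threshold; application-specific in practice

def read_vibration_window() -> list[float]:
    """Placeholder for the local sensor driver; returns one window of readings (mm/s)."""
    return [2.1, 2.3, 2.2, 2.4, 2.2]

def publish_status(message: dict) -> None:
    """Placeholder for the cloud uplink; only this status message leaves the device."""
    print(json.dumps(message))

def evaluate_locally() -> None:
    window = read_vibration_window()                  # raw readings stay on the edge device
    rms = statistics.fmean(x * x for x in window) ** 0.5
    status = "ALARM" if rms > VIBRATION_LIMIT_MM_S else "OK"
    publish_status({"status": status, "rms_mm_s": round(rms, 2)})

if __name__ == "__main__":
    evaluate_locally()
```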

A similar picture emerges in the mobility and infrastructure management segments. In addition to protecting sensitive data, which can often be traced back to individual persons, edge computing is used here specifically in order to be independent of fluctuating Internet connections. The edge controller-based AI system thus works largely autonomously and can, if necessary, fall back on a cloud-based backend for non-time-critical actions.
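
This autonomy can be pictured with a short, purely hypothetical sketch: the time-critical control path acts immediately and locally, while non-time-critical reports are queued and flushed to the cloud backend whenever a connection happens to be available:

```python
# Minimal sketch (hypothetical names and thresholds): local control never waits
# for the Internet; reporting to the cloud backend is deferred, not skipped.
from collections import deque

report_queue: deque = deque(maxlen=1000)  # bounded local buffer for deferred reports

def control_step(sensor_value: float) -> None:
    """Time-critical path: decides and acts locally, independent of connectivity."""
    if sensor_value > 80.0:                       # assumed local safety threshold
        print("shutting down drive")              # placeholder for the real actuator call
    report_queue.append({"value": sensor_value})  # report is queued for later upload

def sync_when_online(connected: bool) -> None:
    """Non-time-critical path: flush queued reports to the backend when reachable."""
    while connected and report_queue:
        report = report_queue.popleft()
        print("uploading", report)                # placeholder for the backend API call

# Example run: control happens regardless of connectivity; reports follow later.
control_step(75.0)
control_step(83.0)
sync_when_online(connected=True)
```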

How do you think project work in AI has changed between the beginning of your career and today?

Stähle: In the past quarter century, a great deal has developed technologically - of course.

On the hardware side, significantly greater computing capacity is now available at lower prices. Modern industrial-quality hardware offers powerful processor architectures and high data storage capacities in end devices and sensor technology.

On the software side, numerous programming languages and software development tools have come onto the market that make life much easier for artificial intelligence developers and increase their efficiency. Numerous tools for analyzing and simulating training data are now also available to make data scientists more efficient.

Our PowerBrain.Shop® team has also been able to play its part in innovating AI development. For example, we have turned AI for standard industrial use cases into products and software services that can be purchased in industrial quality and deployed within minutes as AI PowerBrains™ in project and solution business. The implementation is mostly carried out by integrators or manufacturers such as SYS TEC electronic AG. This represents a big step forward in ease of use, making market entry easier for many small and medium-sized integrators and manufacturers.

One serious "megatrend" we have seen come and slowly go over the last two decades is the cloud AI narrative pushed by data centers, their manufacturers and operators, and government players. The assumption was that AI would always require huge computer farms - in other words, the cloud.

Today, this perspective is shifting noticeably as computing capacity and sensor data processing and storage capabilities in edge devices continue to grow. The historical reliance on data centers is slowly dissolving. Similar to the development of the "personal computer", it is being replaced by "personal AI" in local embedded systems with greater quality and independence. This eliminates the cost of aggregating data in the data center or cloud, cloud transaction costs, and attack vectors from the Internet.

What success factors play a special role in the development of Edge AI projects?

Stähle: It seems important to us to take a holistic view of the specific application. In the area of condition monitoring or predictive maintenance, for example, it is important that sensor data from the critical points on the machine or system is available in appropriate quality and in a timely manner. After all, machine learning models can only learn from the data provided during training and available during operational use. The resulting performance and quality of edge AI in operation therefore reflects the contexts in which the training datasets were collected and the choices the project team made about which sensor data to use and where to collect it.
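
As a purely illustrative sketch of this point (it does not describe the PowerBrain™ pipeline), the following Python example trains a standard scikit-learn IsolationForest on vibration features collected only under normal operation; whether an unseen fault pattern is flagged depends entirely on what the training data covered:

```python
# Illustrative sketch only: a model trained on vibration features can only flag
# conditions that differ from what its training data represented.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Assumed training set: [RMS, peak amplitude] pairs recorded under normal operation only.
normal_features = rng.normal(loc=[2.0, 5.0], scale=[0.2, 0.5], size=(500, 2))

model = IsolationForest(random_state=0).fit(normal_features)

# A condition never represented during training (e.g. because no sensor sat at the
# critical point) is only detected if it still looks unlike the learned "normal".
unseen_fault = np.array([[2.1, 9.5]])
print(model.predict(unseen_fault))  # -1 = flagged as anomaly, 1 = treated as normal
```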

Sieber: Indeed - we have had that experience as well. It is absolutely crucial to install vibration sensors at the vibration "hotspots" of a machine or system in order to detect the actual mix of oscillations and vibrations as early and as precisely as possible. Especially for protecting high-priced machine parts such as bearings, axles, gearboxes and drives, the right choice of vibration sensors and their mounting locations is important for optimizing maintenance.

What future trends do you expect to see in artificial intelligence?

Stähle: Presumably, there will be a two-track development - similar to what we have seen with computing systems.

On the one hand, gigantic "data octopuses" in clouds and data centers will use increasingly complex data structures, AI models and algorithms to process the collected data volumes and thus develop highly specialized AI solutions, such as AlphaFold 2 from DeepMind for predicting protein folding, in order to advance scientific discovery.

On the other hand, the ever-increasing availability of computing and storage capacity in end-user devices will drive the "democratization" of AI: every company will be able to train and deploy its own edge AI on-site on machines and equipment, completely independent of the Internet and the cloud. More and more signal processing and deep learning algorithms will be used in the process.

The next big qualitative step will probably come from potentially revolutionary research, such as that conducted by Prof. Christoph von der Malsburg, into even more intelligent and efficient AI data structures and algorithms. These would subsequently be used in edge AI as well as in data centers.

Furthermore, we expect many optimizations in the visualization of AI training and in its monitoring and quality assurance. These will make it possible, for example, to detect bias in AI models and to visualize the mental models of the trained AI, so that structural opportunities for improvement can be identified for continuous quality assurance - all without needing a university degree in data science or AI.

Sieber: We have already successfully implemented the first projects with PowerBrain™ on our sysWORXX CTR-700 edge controller, and we are amazed at the impressive results of the edge AI. The computing power required for this is decidedly moderate. Overall, the sysWORXX CTR-700 is powerful enough to perform signal preprocessing and the actual machine control alongside the edge AI. I think this sets the trend for the future: innovative edge controllers such as the sysWORXX CTR-700 will take on increasingly complex control tasks and will increasingly process multidimensional signal types such as structure-borne noise, which integrated AI solutions such as PowerBrain™ will reduce locally to information vectors that are easier to evaluate.
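
The local reduction of a raw signal to an easier-to-evaluate information vector can be sketched as follows - a minimal example assuming a sampled structure-borne noise signal and hypothetical frequency bands, not the actual PowerBrain™ processing chain:

```python
# Minimal sketch: reduce one window of a sampled signal to per-band spectral
# energies, so only a few numbers per window need to be evaluated or transmitted.
import numpy as np

SAMPLE_RATE_HZ = 10_000
BAND_EDGES_HZ = [0, 500, 1000, 2000, 5000]   # assumed frequency bands of interest

def to_information_vector(window: np.ndarray) -> np.ndarray:
    """Turn a window of raw samples into a short vector of band energies."""
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(window.size, d=1.0 / SAMPLE_RATE_HZ)
    return np.array([
        np.sum(spectrum[(freqs >= lo) & (freqs < hi)] ** 2)
        for lo, hi in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:])
    ])

# Example: one 0.1 s window of synthetic vibration data with a 120 Hz component.
t = np.arange(0, 0.1, 1.0 / SAMPLE_RATE_HZ)
window = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(to_information_vector(window))   # four numbers instead of 1,000 raw samples
```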

Stähle: Speaking of vectors, another fundamental paradigm shift involves reducing potential attack vectors to zero. Edge AI software operates autonomously on-site in the embedded system and does not require a back channel to the Internet or to hosted server farms. This makes it impossible to contact and manipulate the artificial intelligence from the outside. With cyber crime and automated attack attempts increasing every year, this maximization of cyber security is a strong driver for our end users. In addition, we hear strong customer interest in data sovereignty, data security and data privacy.

After countless leaks and sales of highly personal and confidential data on the darknet and similar platforms in recent years, users today place more emphasis on security. They no longer want to send their machine, operational or plant data halfway around the world without a compelling need to do so.

Even the training data for artificial intelligence, which could reveal information about the operating modes of systems, their statuses or, for example, production capacities, is no longer stored and processed unquestioningly on cloud servers somewhere in the world. Here, too, the paradigm is changing thanks to ever cheaper and higher-capacity storage and memory cards on site in embedded systems. This increases the self-sufficiency of users and their machine parks while lowering costs and reducing dependencies.

In essence, therefore, Edge AI is a revolutionary change in the way artificial intelligence is implemented and operated. I see the evolution as similar to the revolution that once accompanied the invention and launch of affordable personal computers (PCs) for individual homes and businesses in the 1970s. The days of oversized data centers were numbered. Who today remembers the phrase attributed to Thomas J. Watson, the one-time CEO of IBM: "I think there is a world market for maybe five computers"?
What has emerged from those centralized mainframe systems are today far more flexible, reliable, self-sufficient and cost-effective systems with much greater stability. This is made possible by decentralization and 'democratization' - by a much greater quantity of information technology.

We will all see this development continue in the artificial intelligence segment as well - and enjoy much safer edge AI in both our professional and private lives. It will also become much more affordable. Already today, for example, customers of our AI PowerBrain.Shop® can create and use their own Edge AI PowerBrains™ for various end devices, embedded systems and hardware platforms within minutes. They benefit conveniently and cost-effectively from this natural development trend.

Would you also like to benefit from the possibilities offered by AI in the edge device? Let us advise you now!

I will be happy to help you!

Florian Süß, Sales Manager

About our partner: POWERBRAINSHOP Holding Corporation

PowerBrain.Shop® is a growing AI software supplier with a global footprint, developing an AI product development and training platform for the next generation of Artificial Intelligence and powerful AI software 'brains' - called AI PowerBrains™. The impossible is made possible by enabling and greatly simplifying the next disruptive step in development - the widespread adoption of Internet-independent Edge AI - including development, training, validation and quality assurance, delivery, monitoring and auditing of AI for users and use cases across industries, business units and organizations. As a driving force in the artificial intelligence segment, we are a partner to all those who want to make our world more efficient with intelligence.