Today's applications and vision of future developments of AI.
In this interview, industry experts Samuel Thomas Stähle, CEO of PowerBrain.Shop®, and Ronald Sieber, CEO of SYS TEC electronic AG, shed light on the current trends and developments surrounding artificial intelligence.
From your point of view, how have the development, architecture and capabilities of artificially intelligent software - colloquially, "artificial intelligence" - evolved over the past decades?
Sieber: In recent years, the industry, and thus also our company, has made significant progress in the development of very small computer architectures and embedded systems. This has enabled us to increase their robustness and steadily reduce their prices, essentially thanks to the ever higher integration density of modern components. Ultimately, today's industrial edge devices have many times the computing power that was available to the Apollo spacecraft around 50 years ago, which carried people safely from the Earth to the Moon and back. Current edge controllers are now so powerful that we can run modern AI software directly on them.
Stähle: From the point of view of computer science, AI algorithms and data structures have been under research and development for about 70 years - and this process is far from complete. It will probably never be completed either, but will continue to develop - with a continuous trend toward standardization and simplification in applicability.
I programmed my first AI algorithms myself more than 20 years ago, in the 1990s. We used them in Germany for optical text recognition to evaluate print font quality on curved surfaces. We worked with expensive, large specialized computers, very expensive digital line-scan cameras and optics, and software we wrote ourselves. At that time, buying AI modularly as a product or service was out of the question.
Today, however, there are much cheaper and miniaturized computers, including special chipsets for AI data processing. AI software is now available on the open market in industrial quality, either as a cloud service or for use directly at the edge, such as our Edge AI PowerBrain™.
This, of course, revolutionized the use of AI, and its logic ultimately follows a familiar market pattern: a new quality gives rise to a new quantity, which in turn is replaced - iteratively - by a new quality, and the cycle repeats. Over the last quarter century, it has become clear that AI, too, follows this cycle as a development paradigm.
What is your perception from the market - what about customer acceptance and project applications of AI in 2021?
Sieber: For years, our company has been manufacturing more and more embedded systems with AI support on behalf of customers, for example using i.MX 7 processors or NVIDIA® chipsets and large memory configurations. We therefore see steadily growing demand - primarily from the market segments Industry 4.0, Mobility and Infrastructure Management.
In the area of Industry 4.0, users are increasingly interested in whom they entrust with their data and what happens to it. Paradigms such as data sovereignty and data economy are becoming increasingly important. Here, our edge controllers offer the great advantage that the devices already have so much computing power that AI projects can be implemented locally, directly on site. In most cases, only status information or error messages are then transmitted to the cloud, while the often sensitive sensor readings never leave the local edge controller itself.
A similar application picture emerges in the mobility and infrastructure management segments. In addition to the protection of sensitive data, which can often be directly related to individual persons, edge computing is used here specifically to be independent of fluctuating Internet connections. In this way, the edge controller-based AI system operates predominantly autonomously and can, if necessary, fall back on a cloud-based backend for actions that are not time-critical.
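This edge-first pattern can be pictured with a small, purely hypothetical sketch (the device name and threshold are illustrative, not part of any actual product interface): raw sensor readings are evaluated locally, and only a compact status payload is ever prepared for a cloud backend.

```python
import json

import numpy as np


def make_status_message(device_id: str, readings: np.ndarray, limit: float) -> str:
    """Summarize a local sensor window into a small JSON status payload.

    Illustrative edge-first pattern: the raw readings are evaluated
    locally and are never included in the outgoing message.
    """
    status = "ERROR" if float(np.max(np.abs(readings))) > limit else "OK"
    return json.dumps({"device": device_id, "status": status,
                       "window_len": int(len(readings))})


# Hypothetical device name; in a real deployment the limit would come
# from the machine's specification or from a trained model.
msg = make_status_message("edge-ctrl-01", np.array([0.10, 0.20, 0.15]), limit=1.0)
```

Only `msg` - a few dozen bytes - would cross the network boundary; the sensor window itself remains on the controller.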
How do you think AI project work has changed between the beginning of your career and today?
Stähle: In the past quarter century, a great deal has developed technologically - of course.
On the hardware side, significantly greater computing capacities are now available at lower prices. Modern industrial-quality hardware offers powerful processor architectures and high data storage capacities in the end devices and sensor technology.
On the software side, numerous programming languages and software development tools have come onto the market that make the life of the artificial intelligence developer much easier and increase their efficiency. Numerous tools for analyzing and simulating training data are now also available to make Data Scientists more efficient.
Our PowerBrain.Shop® team has also been able to contribute to the innovation of AI development. For example, we have translated AI for standard industrial use cases into products and software services that can be purchased in industrial quality and deployed within minutes as AI PowerBrains™ in project and solution business. The implementation is mostly done by integrators or manufacturers like SYS TEC electronic AG. Thus, a big step forward has been made in the area of "ease of use", facilitating the market entry for many small and medium-sized integrators and manufacturers.
One serious "megatrend" we have seen come and slowly go over the last two decades is the narrative of Cloud AI pushed by data centers, manufacturers and operators, and government actors. The assumption was that AI would always require huge computer farms - in other words, the cloud.
Today, this perspective is shifting noticeably as computing capacity and sensor data processing and storage capabilities in edge devices continue to grow. The historical dependence on data centers is thus slowly dissolving. Similar to the development of the "personal computer", it is being replaced by "personal AI" in local embedded systems of higher quality and independence. This eliminates data aggregation costs to the data center/cloud, cloud transaction costs, and attack vectors from the Internet.
What success factors play a special role in the development of Edge AI projects?
Stähle: The holistic view of the specific use case seems important to us. In the area of condition monitoring or predictive maintenance, for example, it is important that sensor data from the critical points on the machine or plant are available in appropriate quality and in a timely manner. After all, machine learning models can only learn from the data that are provided during training and available during operational use. The resulting performance and quality of Edge AI in operation therefore reflects the contexts in which the training dataset was collected and the care with which the project team chose which sensor data to capture and where.
Sieber: Indeed - we have had that experience as well. It is crucial to place vibration sensors at vibration 'hotspots' in a machine or plant in order to detect the actual mix of oscillations and vibrations as early as possible and with maximum precision. Especially for the protection of high-priced machine parts like bearings and axles as well as gearboxes and drives, the right choice of vibration sensors and their mounting locations is important to optimize their maintenance.
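To make this concrete: condition-monitoring pipelines typically reduce each raw vibration window to a handful of spectral features directly on the edge device. The following is a minimal, hypothetical sketch using NumPy - not the PowerBrain™ API, and the feature set is illustrative only.

```python
import numpy as np


def vibration_features(samples: np.ndarray, sample_rate: float) -> dict:
    """Reduce a raw vibration window to a few spectral features.

    Illustrative only -- real condition-monitoring systems use
    domain-specific features chosen for the monitored bearing or gearbox.
    """
    # Remove the DC offset so it does not dominate the spectrum.
    centered = samples - samples.mean()
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    rms = float(np.sqrt(np.mean(centered ** 2)))
    return {
        "rms": rms,                                         # overall vibration level
        "peak_freq_hz": float(freqs[np.argmax(spectrum)]),  # dominant oscillation
        "crest_factor": float(np.max(np.abs(centered)) / rms),
    }


# Sanity check with a synthetic 50 Hz sine sampled at 1 kHz for one second.
t = np.arange(0, 1.0, 1.0 / 1000.0)
feats = vibration_features(np.sin(2 * np.pi * 50 * t), 1000.0)
```

For the pure 50 Hz test tone, the dominant frequency comes out at 50 Hz and the RMS at about 0.707 - a quick sanity check before pointing such a pipeline at real bearing or gearbox data.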
What future trends would you expect to see in artificial intelligence?
Stähle: Presumably, there will be a two-track development - similar to the development of computing systems themselves.
On the one hand, gigantic "data octopuses" in clouds and data centers will use increasingly complex data structures, AI models and algorithms to process the volumes of data collected and thus develop, for example, highly specialized AI solutions such as AlphaFold 2 from DeepMind to predict the folding of proteins in order to advance scientific findings.
On the other hand, the ever-increasing availability of computing and storage capacity in end-user devices will ensure that the "democratization" of AI advances: every company will be able to train and deploy its own edge AI on-site on machines and equipment in complete Internet and cloud independence. More and more signal processing and deep learning algorithms will be used in this.
The next big qualitative step will probably come from potentially revolutionary research such as that conducted by Prof. Christoph von der Malsburg, aimed at finding even more intelligent and optimal AI data structures and algorithms. These would subsequently be used in Edge AI as well as in data centers.
Furthermore, we expect many optimizations in the visualization of AI training, its monitoring and its quality assurance - allowing, for example, bias in AI models to be detected and the mental models of the trained AI to be visualized, so that structural improvement possibilities can be identified for continuous quality assurance. All this without needing a university degree in data science or AI.
Sieber: We have already successfully implemented the first projects with PowerBrain™ on our sysWORXX CTR-700 edge controller and we are amazed at the impressive results of the edge AI. The computing power required for this can definitely be rated as moderate. Overall, the sysWORXX CTR-700 is powerful enough to simultaneously perform signal pre-processing and the actual machine control in addition to the Edge AI. I think this sets the trend for the future: innovative edge controllers such as the sysWORXX CTR-700 will take on increasingly complex control tasks and will also increasingly process multidimensional signal types such as structure-borne noise, which thanks to integrated AI solutions such as PowerBrain™ will be reduced locally to information vectors that are easier to evaluate.
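The local reduction of structure-borne noise to "information vectors" that Sieber describes can be pictured as collapsing each signal window into a short per-band energy vector. Again, this is a hypothetical sketch under assumed band limits, not the actual PowerBrain™ pre-processing.

```python
import numpy as np


def band_energy_vector(signal: np.ndarray, sample_rate: float, bands) -> np.ndarray:
    """Collapse a structure-borne-noise window into a short energy vector,
    one value per frequency band.

    Illustrative stand-in for the kind of local signal pre-processing
    described above; band limits are assumptions, not product defaults.
    """
    # Power spectrum of the DC-free signal.
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in bands])


# Sanity check: a 120 Hz tone sampled at 2 kHz for one second.
t = np.arange(0, 1.0, 1.0 / 2000.0)
vec = band_energy_vector(np.sin(2 * np.pi * 120 * t), 2000.0,
                         [(0, 100), (100, 200), (200, 1000)])
```

The 120 Hz test tone lands almost entirely in the 100-200 Hz band, so the three-element vector - rather than the 2,000 raw samples - is what downstream evaluation (or, if needed, the cloud) would see.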
Stähle: Speaking of vectors, another fundamental paradigm shift is the reduction of potential attack vectors to zero. Edge AI software operates autonomously on-site in the embedded system and does not require a return channel to the Internet or to hosted server farms. This makes it impossible to contact and manipulate the artificial intelligence from the outside. With cyber crime and automated attack attempts increasing annually, this maximization of cyber security is a strong driver for our end users. In addition, we hear of strong customer interest in data sovereignty, data security, and data privacy.
After the countless leaks and sales of highly personal as well as secret data on the darknet and similar platforms in recent years, users today place more emphasis on security. They no longer want to send their machine, operational or plant data halfway around the world without compelling necessity.
Even the training data for artificial intelligence, which could reveal the operating modes of systems, their statuses or, for example, production capacities, are hardly ever entrusted unquestioningly to cloud servers somewhere in the world anymore. Rather, the paradigm is changing here as well, thanks to the ever cheaper and larger-volume memories and memory cards on site in embedded systems. This increases the degree of self-sufficiency of users and their machine parks, while lowering costs and reducing dependencies.
In essence, therefore, Edge AI is a revolutionary change in the way artificial intelligence is implemented and operated. I see the evolution as similar to the revolution that once accompanied the invention and launch of affordable personal computers (PCs) for individual homes and businesses in the 1970s. The days of oversized data centers were numbered. Who today remembers the phrase attributed to Thomas J. Watson, the one-time CEO of IBM: "I think there is a world market for maybe five computers."?
What emerged from these centralized mainframe systems today are infinitely more flexible, reliable, self-sufficient and cost-effective systems with much greater stability. This is made possible by decentralization and 'democratization', by a much greater quantity of information technology.
We will all see the continuation of this development in the artificial intelligence segment as well - and enjoy much safer edge AI in our everyday professional and private lives. It will also be much more affordable in the future. Already today, for example, customers of our AI PowerBrain.Shop® can create and use their own Edge AI PowerBrains™ for various end devices, embedded systems and hardware platforms within minutes. They benefit conveniently and cost-effectively from this natural development trend.
Would you also like to benefit from the possibilities offered by AI in the edge device? Let us advise you now!
About our partner: POWERBRAINSHOP Holding Corporation
PowerBrain.Shop® is a growing AI software supplier with a global footprint. It develops an AI product development and training platform for the next generation of artificial intelligence, as well as powerful AI software 'brains' - so-called AI PowerBrains™. The company makes the impossible possible by enabling and greatly simplifying the next disruptive step in development - the widespread adoption of Internet-independent Edge AI - including the development, training, validation and quality assurance, delivery, monitoring and auditing of artificial intelligence for users and use cases across diverse industries, business units and organizations. As a driving force in the artificial intelligence segment, we partner with all those who want to make our world more efficient with intelligence.