

IoT-Data Science – how scalable implementation and integration works



Listen to the IoT Use Case Podcast on Spotify.
Listen to the IoT Use Case Podcast on other platforms.

IoT Use Case Podcast #129 - All for One + avantum consult

Merging IoT and AI: In the 129th episode of the IoT Use Case Podcast, we talk about how the combination of IoT data and machine learning opens up new opportunities for process optimization and interaction in industry. Our guests are Jakob Procher, at the time of recording Consultant AI & Data Science at avantum consult GmbH, and Andreas Lehner, Head of Innovation at blue-zone GmbH, representing our IoT partner All for One.

Episode 129 at a glance (and click):

  • [13:15] Challenges, potentials and status quo – This is what the use case looks like in practice
  • [23:48] Solutions, offerings and services – A look at the technologies used

Podcast episode summary

The episode is about the revolutionary fusion of IoT and AI, which opens up new possibilities for optimizing processes and improving interaction in industry. Experts such as Jakob Procher and Andreas Lehner share their valuable insights into the implementation and potential of these technologies, backed up by practical examples from their own experiences. Projects such as shopfloor.GPT illustrate how Large Language Models can be used in industry to increase efficiency and identify and rectify faults more quickly.

A central point of discussion is the indispensable role of data quality and integration for the success of AI applications. Specific use cases, such as the optimization of forging machines and heavy-duty transporters, illustrate how specific problems can be solved through the use of AI and IoT.

The episode also highlights the importance of cloud technologies, which serve as a catalyst for effective data connectivity and the realization of AI projects. Another important aspect is the principles of MLOps and DevOps, which support the automation and efficiency of AI projects.

Looking to the future, the podcast episode discusses trends such as Explainable AI and Responsible AI as future focal points in AI development. These developments have the potential to increase the transparency and accountability of AI models and facilitate their integration into the industry.

The economic benefits of such technologies, including the reduction of reject rates and the optimization of production processes, are highlighted as decisive added value for companies. In conclusion, the episode emphasizes the need for strategic implementation of AI technologies to effectively address real business problems and ensure long-term success.

Podcast interview

Today we are highlighting a truly fascinating development on the market: IoT data is increasingly being combined with machine learning, with dedicated algorithms applied to that data to extract insights and create added value. We are already seeing the first pilot projects in our network that work with large language models and thus enable a new dimension of interaction and process control in industry. An exciting example is shopfloor.GPT. Perhaps you have already heard of ChatGPT; an episode specifically on this topic will follow soon. Such tools make it easier for users to work more efficiently and help to identify and rectify faults more quickly, all with the help of artificial intelligence. But how exactly does this work, and what use cases are there? What needs to be considered during implementation? Jakob Procher, Consultant Artificial Intelligence & Data Science at avantum consult GmbH, and Andreas Lehner, Head of Innovation at blue-zone, answer these questions today on behalf of the partnership with All for One. Today we look at exactly what these companies do and what this looks like in practice.

A warm hello to Andreas and Jakob. First of all, how are you? Where are you right now?

Andreas

Hello Madeleine, thank you very much for the invitation. I’m delighted to be part of a podcast again. Like last time, I’m speaking with you from Austria. The weather isn’t ideal today, but I’m looking forward to the use case, so that doesn’t detract from it.

On this occasion, if you feel like it, episode 57 with Andreas and Frauscher, a customer in the field of railroad track technology, is highly recommended and still up to date. A really fascinating episode that you are welcome to listen to. Today it is a special pleasure for me to welcome Jakob to the group. How are you and where are you right now?

Jakob

Hello Madeleine, I’m also pleased that I was invited today and that everything went so well. I’m currently in Düsseldorf, working from home, and today I’m going to tell you a bit about my day-to-day work and AI in the context of IoT.

Very nice. I am delighted to have two experts with me today. Can you tell us a bit more about your role at avantum consult and how you support companies in implementing AI strategies, among other things based on IoT data? What exactly is your role?

Jakob

Sure, I’d love to. So my exact job description is consultant for AI and data science. Accordingly, I actually deal with all topics that have something to do with this. Avantum consult is a consultancy for analytics topics. In other words, we process company data. We are not explicitly focused solely on the IoT topic, but this is also a major issue that we are addressing. I have a background in mechanical engineering, which is why I really enjoy working with IoT topics. At some point, I moved in the direction of AI and I’m quite happy that I ended up in this area, but I can still deal with mechanical engineering and IoT topics. Accordingly, I’m always happy to be deployed in this area.

It is always important to understand what the upstream and downstream processes are and how everything is connected, especially when it comes to data – usually machine device data. Andreas, what’s your situation? Today you work for the company blue-zone, which is also a partner of All for One. Could you tell us a bit about the constellation and, above all, what role blue-zone plays in digitalization, especially when you work together on projects, perhaps even with Jakob. Could you tell us something about how everything is connected?

Andreas

At blue-zone, we are a partner of All for One and specialize in the development of embedded software, precisely where it concerns the integration of sensor data and data control. My personal background is in embedded software development, i.e. in the areas of control systems and sensor technology, and in the context of the ongoing hype surrounding the cloud, also in connectivity and cloud connection. The next logical step, and this is the reason for our discussion today, is to link AI models to this data. As an SAP system house, All for One naturally has a special focus on processes and their digitalization. Together with All for One, we are looking for use cases and implementations to support companies with digitalization – from the machine to the ERP and downstream service processes.

Did I understand correctly that you are capturing the data, say from the field? Regardless of the type of data points, you, i.e. blue-zone, connect the sensor data, for example. You at avantum then analyze this data, build complex models and provide support in the form of advice and man- and womanpower. And finally, you integrate the data into ERP systems – since All for One is an SAP partner, SAP will probably be prioritized – so that the processes can run through completely. Is that the whole package, can you understand it that way?

Andreas

Exactly, perfectly summarized, Madeleine. At the end of the day, it’s really about digitalizing and automating these processes in order to achieve real added value and benefits.

I mentioned it at the beginning, Andreas, you work with Frauscher, among others. As I said, that was a reference to episode 57, where your customer has already spoken. You have very different customers and also different companies. Who are you working with? Do you have any special sectors? It’s probably not just rail transport. What exactly do you do with which customers?

Andreas

Generally speaking, these are mainly companies from the industrial sector, including mechanical and heating engineering companies. Specifically, I can mention Frauscher, for example, about whom we have already made a podcast, or companies such as GEBHARDT Intralogistics, which offers a comprehensive health management dashboard with its Galileo IoT platform for intralogistics, which makes it possible to get started with AI topics.

I will link this briefly in the show notes for our listeners. Check it out, there are some really exciting projects, both with Frauscher and with GEBHARDT. Jakob, how are things with you? Which clients are you working with today?

Jakob

Our clientele is very diverse. All for One is particularly well represented in the SME sector, but at avantum consult we are not limited to that. We serve all sectors and company sizes. In the SME sector, the topic of IoT is being raised with us more frequently, but there is no direct restriction. One project we recently carried out was with Warema, where we evaluated and categorized maintenance orders in order to improve the maintenance processes – a very exciting project.

I always talk about the practical side of things in the podcast. I would be particularly interested in the use cases that you implement with your customers. Jakob, can you give us a few examples to help us understand what these projects are?

Jakob

Yes, definitely. I have brought you two different examples. One is a forging machine and the other is a heavy-duty transporter. I want to go into a little more detail about what we did there overall. With the forging machine, the company has automated or improved its production so that the processes run as automatically as possible. They have acquired a new machine that had initial startup difficulties. The machine works with program codes that have to be parameterized in order to forge different parts automatically. It is often only in retrospect that you realize that a setting was not optimal and rejects were produced as a result. This is of course a waste that they want to reduce. They want to analyze these program codes and understand exactly what is behind them in order to ultimately save time and money and better understand their process as a whole. It is a very exciting project.

It’s more or less about setting the process parameters, i.e. analyzing the program codes that were probably set on the machine itself but never questioned. Can you put it that way?

Jakob

Exactly. This machine is equipped with sensor technology and is updated while the process is running. This allows important information to be extracted from previous runs with different process parameters. You can recognize which findings can be used for future runs and what they ultimately mean. Maybe before I create the program, I get a hint that a certain parameter could lead to an error and I should reduce it by 10%. This allows me to fix potential problems before anything serious happens.

A really good example. I have a few more questions. But you just mentioned a second example, didn’t you?

Jakob

Yes, the second example refers to heavy-duty transporters, a more dynamic scenario than the static forging machine. These transporters are equipped with sensor technology, and the IoT data connection is already very advanced at this company: the data is already sent to and stored in an IoT cloud. There are an enormous number of these heavy-duty transporters worldwide, and all this data is already being collected. The customer is now considering what new business models are possible and how the customer experience can be further improved. We provide support in analyzing these vehicles with regard to various failure parameters. For example, you could predict whether a spring in the gearbox will soon fail. This is particularly important as heavy-duty transporters are often rented and shipped worldwide. If a transporter breaks down after shipping, that would cost an enormous amount of time and be the worst case for a project. It is therefore crucial to know as early as possible if and when something could go wrong.

So, on the one hand, we have a project that deals with the shopfloor and the connection of forging machines and, on the other hand, a very exciting project that is developing new business models involving the rental of heavy-duty transporters and other stakeholders. These are really great examples. To categorize things before we delve deeper into the projects: Do you have a sense of how many customers are currently using AI algorithms based on IoT data? In the podcast, I often talk about connecting data that is then processed in the cloud. But how big is the share of AI? Is there a feeling or perhaps even trends as to who in the industry uses these technologies more and who uses them less?

Jakob

Yes, the topic is really exciting. We are often present at trade fairs and talk to many companies. Almost everyone is talking about AI and ongoing projects these days. But realistically speaking, very few companies actively use AI in production. Some have initiated lighthouse projects, but many quickly realize that their data does not yet fit and are therefore not yet that advanced. Sometimes they still struggle with the data connection, which would then be relevant for blue-zone. But it is increasing, speaking of the SME sector. Larger companies sometimes already have their own AI teams that deal exclusively with this – for example, a data science team of around ten people covering the entire production process. In general, I would say that a lot of data is already connected in the mechanical engineering and industrial sectors, and it will be hugely important in the future. People know they have to tackle it, and they will.

Andreas

I would like to add to this, because AI is a huge term, especially in the consumer sector, and AI is not always the same as AI; simple algorithms are often referred to as AI. It is important to find a solution for a specific problem, and this can also be a simple threshold detection. It doesn’t always have to be a large machine learning model to achieve results.
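Andreas's point that a simple threshold check often solves what gets labeled "AI" can be sketched in a few lines. The bearing-temperature values and the limit below are illustrative assumptions, not figures from the episode:

```python
# Minimal sketch: a plain threshold check instead of a machine learning model.
# Sensor name and limit are invented for illustration.

def detect_threshold_violations(readings, limit):
    """Return the indices of readings that exceed a fixed limit."""
    return [i for i, value in enumerate(readings) if value > limit]

# Example: bearing temperature samples in degrees Celsius (made-up values)
temperatures = [61.2, 63.8, 60.9, 78.4, 62.1, 80.2]
alerts = detect_threshold_violations(temperatures, limit=75.0)
print(alerts)  # → [3, 5]
```

For many condition-monitoring cases, a rule like this delivers results long before a learned model is justified.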

[13:15] Challenges, potentials and status quo – This is what the use case looks like in practice

In the end, it’s always about the business case. So where can I ultimately save time or perhaps even make new profits? The example with the heavy-duty transporters is aimed precisely at this. It would be interesting to learn more about the business case. Jakob, you’ve already mentioned what the cases are about. But what is the customer currently losing in terms of time and money?

Jakob

I like to start with the forging machine. The challenges are obvious: the main aim is to reduce the number of rejected parts and thus avoid wasting material. This improves the customer’s margin. By reducing the number of rejects, delivery times for end customers can also be shortened because fewer faulty parts pass through quality assurance and are delivered to customers. In the end, the customer is more satisfied. Another advantage is that customers understand their own processes better. If you start to not only store and possibly visualize the available data, but also analyze it to identify which parameters are relevant and can be influenced – for example to optimize process times or reduce failure rates – then you have already gained a lot. And that is probably the most important aspect: developing this deeper understanding of your own processes.

I just remembered a point because I did an episode with the Saint-Gobain company. The contact person there, Markus, might be of interest to you. I can connect you afterwards, because they have exactly this case: it’s about potential in the parameterization of their settings. Another important point is to know the processes, which means that you first have to map everything that you have set up in the past, which is a huge effort.

Jakob

Yes, that’s actually a very important point. Many of our customers want to implement predictive maintenance, which is somehow considered the holy grail with IoT data. But most customers do not yet understand that the steps before that are almost more important. This includes condition monitoring, displaying IoT data and building understanding. This is essential for the success of such processes.

In relation to the example with the heavy-duty transporters: The point here is that these devices or large systems are rented out. It is therefore not just about cost savings in processes, but also about using data to bind customers more closely to the company and develop new business models. What does the business case look like?

Jakob

Yes, you don’t just buy heavy-duty transporters every month. It is therefore extremely important to retain customers in the long term so that they remain satisfied. Quality plays a major role here; customers should have high quality standards, otherwise they could be left behind in the market in the long term. Another important aspect for the future is that the obligation to provide evidence through data, for example in relation to accidents or big data, will continue to influence business cases. It is therefore crucial that customers start carefully recording and versioning their data at an early stage.

Very important issue. There are also other drivers, not only from the legislator, but also accidents or obligations to provide evidence that have to be fulfilled. This is a great opportunity to work with data there. Andreas, since you also work with different customers, do you see it similarly or are there any additions?

Andreas

The points mentioned by Jakob are absolutely relevant. What we are seeing with our customers are the challenges in the service area. Jakob already mentioned the “holy grail” of predictive maintenance, which can be seen as the ultimate goal. However, there are already numerous opportunities for improvement in advance, such as optimizing the service process to improve the availability of spare parts. This does not necessarily require a predictive scenario; it is sufficient to quickly access customer data without having to send your own service technicians to the machine. These steps do not yet require AI, but are very relevant and are also related to the shortage of skilled workers. In general, our customers are increasingly using the potential of data from their control processes to develop new business models based on data.

The two examples – the connection of forging machines and heavy-duty transportation – involve specific data. In the first example, we have already discussed the reject rates and the associated sensors. Can you give specific examples of the data required for heavy-duty transportation? What information is relevant for projects and needs to be collected?

Jakob

Sensor values are of course very important for the evaluation of IoT data, and the information from the control units is also crucial. This data can have different origins, and all of it must be used; it is particularly relevant for areas such as anomaly detection or predictive maintenance. It is important to put the whole thing in the right context. Business data is therefore at least as important and contributes to the overall picture. This involves information about the individual vehicle, such as location and rental period.

AI is ultimately about having enough data in good quality. Let’s talk about how this affects your customers’ processes. First of all, the data must be available, i.e. the data connection is essential. Then they probably need to be available in sufficient quantity and quality to be able to use algorithms efficiently. How does that affect your customers, Jakob?

Jakob

Yes, that is an important point. So, as you said, first you need some data. I say to my customers who don’t yet understand their processes: start collecting data first. Because, as you said, you need a lot of data for AI algorithms or statistical models. But don’t forget: bad data doesn’t help much. Therefore, it is advisable to analyze as soon as possible which data are truly needed, which ones are highly relevant, and which ones may not be necessary. Because in the IoT sector, the amount of data becomes an issue at some point, and not all data ends up being useful. But Andreas, you had a few more thoughts on this, which we had discussed in advance.

Andreas

Exactly, as you say, quality is crucial. You have to think about which sensors to use and what resolution of data is required. Ideally, you should focus on the pain of the specific use case you have and collect the information you need to find a solution. It is not expedient to collect unstructured data and only later consider what to do with it. It is more effective to look at what information is currently available and relevant to the use case and to collect it. You should then carry out a data analysis relatively quickly in order to obtain initial results. As Jakob said, you have to start somewhere, but not just by collecting data, but also by analyzing it. In this way, you approach the goal step by step and a structured approach is very important.

[23:48] Solutions, offerings and services – A look at the technologies used

We learned a lot about the challenges and the specific case. Finally, I would like to briefly discuss how this can be implemented. Which technologies are used? What does a solution look like? Jakob, as an expert in data warehousing, where and how do I store data and use AI algorithms? What should the IT setup and architecture look like in order to be able to implement such use cases?

Jakob

In recent years, it has become clear that the cloud is becoming increasingly important. Many companies now use a cloud, which offers great advantages in the IoT sector: various ready-made services facilitate data connectivity, and updates for machines can be imported automatically. In my opinion, the most important thing is that companies should be cloud-based. From a technological point of view, everything should be mapped as automatically as possible. In the IoT sector, DevOps – or MLOps in the AI sector – and container technologies should not be neglected.

So, I know the DevOps principle: I bring development and operations together and can provision things in IT. What exactly does MLOps, Machine Learning Operations, mean?

Jakob

Exactly, MLOps is clearly derived from DevOps, i.e. Development Operations. DevOps basically means that you have a complete iterative process that runs as automatically as possible and is organized in life cycles. The same applies to MLOps. Here you start with data quality, followed by model training, automated deployment, automated monitoring, automated retraining and all of this in a cycle. This is particularly important for machine learning models, especially now that we are using the example of IoT data. The machines simply change over time. In other words, I can’t use the same AI model for two or three years in a row because data simply changes.
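The cycle Jakob outlines – monitor the data, detect drift, retrain automatically – can be sketched roughly as follows. The mean-based "model" and the drift tolerance are deliberately simplistic illustrations, not the tooling an actual MLOps project would use:

```python
# Hedged sketch of the MLOps cycle: train, monitor for drift, retrain.
# A real pipeline would use proper models, registries and monitoring services.

from statistics import mean

def train(data):
    """'Training' here is just fitting the mean of recent sensor values."""
    return mean(data)

def drifted(model, new_data, tolerance):
    """Monitoring step: has the data moved away from what the model learned?"""
    return abs(mean(new_data) - model) > tolerance

def mlops_cycle(batches, tolerance=1.0):
    """Iterate over incoming data batches, retraining whenever drift is seen."""
    model = train(batches[0])
    retrains = 0
    for batch in batches[1:]:
        if drifted(model, batch, tolerance):
            model = train(batch)  # automated retraining on drift
            retrains += 1
    return model, retrains

# The machine's behavior shifts in the last batch (made-up values)
batches = [[5.0, 5.2, 4.9], [5.1, 5.0], [8.9, 9.2, 9.0]]
model, retrains = mlops_cycle(batches)
print(retrains)  # → 1
```

This is exactly the point about not running the same AI model for years: the loop replaces the model as soon as the data no longer matches it.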

Are you talking more about SMEs or large companies? Because it sounds as if SMEs do not currently have this in their IT infrastructure.

Jakob

Well, I actually only know a few companies that do not pursue DevOps in any form. They at least try to automate everything, and there are technologies and tools that provide a great deal of support. Large companies have certainly gained more experience with this, but it is even more important for small companies to implement it properly. This is because SMEs often do not have that many people working in the field of AI. You don’t always want to hire a service provider or consultant to take care of exactly that, so the automation should work smoothly. That’s why it matters all the more there.

Andreas, another question for you. You work primarily with the data connection as a focus, but of course also with the partnership with All for One. How do you think companies should proceed in order to apply these principles and really optimize AI models for industrial use? Do you have any best practices from your practice?

Andreas

So, the first thing is that it always starts with people, I’ll say that now. You need clear responsibility: a person who actually takes care of the issue. As you mentioned earlier, DevOps has become a familiar term and is finding its way into more and more companies, and MLOps can be thought of more or less as an extension of DevOps. Especially with software it’s clear that I have versions of my software that need to be stored in a meaningful way – I need my releases and my release notes – and of course this also applies in the MLOps area: I have to version my models. So it is specifically about a person who takes on the topic, has clear responsibility for it, takes a structured approach, and treats all ML topics the same way as source code, with releases that are developed further.

I don’t want to go too much into the technology, but I think it’s very exciting for the audience to understand a bit more about data models as well. I would like to talk a little about the solution that you at avantum consult are pursuing together with blue-zone and All for One in order to implement such data models in the IT architecture. Can you say what is important in the implementation? What should be considered in order to make them really usable for the use cases we mentioned?

Jakob

Exactly, we have already talked about the fact that business data is hugely important, but IoT data is also extremely important and must not be neglected under any circumstances. It is therefore essential to set up a clean data warehouse for the business data, with a data mart that provides data relevant to the business context at all times. At the same time, IoT data is just as important. Most people know that IoT data is very different in nature and is generated at a much higher frequency than business data. Therefore, you have to make sure that the two worlds are well married. The IoT data comes in continuously as streaming data, while the business data cannot be refreshed at that frequency, as this would mean too high a cost and too great a load. The combination of both is therefore extremely important. The AI models sit between these two worlds and have to integrate everything and bring it back together at the end so that it can be presented cleanly in a report.
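The marriage of high-frequency IoT streams and slower business data that Jakob describes might, in a much-reduced form, look like this. The vehicle IDs, sensor readings and business fields are invented for illustration, not taken from the project:

```python
# Sketch: aggregate a high-frequency sensor stream, then join it to
# low-frequency business records so both views land in one report.

from statistics import mean

# High-frequency stream: (vehicle_id, vibration_reading), many rows per vehicle
sensor_stream = [
    ("T-01", 0.42), ("T-01", 0.47), ("T-02", 0.91),
    ("T-01", 0.45), ("T-02", 0.88),
]

# Low-frequency business data: one record per vehicle (made-up fields)
business_data = {
    "T-01": {"location": "Hamburg", "rental_days": 30},
    "T-02": {"location": "Rotterdam", "rental_days": 14},
}

def aggregate(stream):
    """Reduce the stream to one average reading per vehicle."""
    grouped = {}
    for vehicle, value in stream:
        grouped.setdefault(vehicle, []).append(value)
    return {vehicle: mean(values) for vehicle, values in grouped.items()}

def join(aggregates, business):
    """Attach the aggregated sensor view to each business record."""
    return {
        vehicle: {**business[vehicle], "avg_vibration": round(avg, 3)}
        for vehicle, avg in aggregates.items()
    }

report = join(aggregate(sensor_stream), business_data)
print(report["T-02"]["avg_vibration"])  # → 0.895
```

In a production architecture the aggregation would happen in a streaming layer and the join in the data warehouse, but the principle is the same: condense the IoT stream before marrying it to the business context.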

I see. It’s good that you brought up the subject of streaming again. We don’t want to go into this topic in depth today, but you are welcome to listen to episode 74. Confluent will be represented by Kai Wehner, who will be talking about all aspects of streaming. That’s a completely different chapter, but incredibly important to look at, and maybe we can even make another episode about it. Thank you very much for these insights into the solution. Perhaps the last question for today: Is AI really always in use, i.e. as artificial intelligence, or is it perhaps just a statistical model? Using the example of the forging machine, can you explain how you decide whether to use AI as algorithms and so on, or whether to simply rely on statistical models? After all, taking this next step is to some extent a supreme discipline, isn’t it?

Andreas

Exactly as you say. So the simple statistical models are of course used where I move in the problem space, where people can still imagine how this can be solved or, for example, how it can be solved with the expertise of a specialist. You always have to analyze the problem space first and then see whether you have the necessary data and sensors available. Typically, statistical models can achieve good results with relatively few sensors and data if the problem can be described. You have to go into the AI models when the problem spaces become more complex, because that’s when things look completely different. As soon as humans can no longer imagine what is going on, AI can be particularly helpful. But maybe I’ll leave that to Jakob.

Jakob

Yes, you’ve definitely illustrated that well. You don’t have to solve every problem with a neural network or a complicated AI. This also applies to large language models: not every problem today can be solved by ChatGPT and the like. Sometimes it’s enough to use very simple statistical models, which are usually very performant, and most people have a lot of experience with them. On the one hand, this means that a smaller amount of data is generally required – it must not be too small, but you reach your goal faster. It also means that the development cycles are simpler, because the models are less complicated. Statistical models often have the further advantage that they are easier to explain. With anomaly detection or predictive maintenance, for example, you can see exactly which part of the machine is failing or which part will soon fail. This explanation is also possible with neural networks, but it is a little more complicated there; with statistical models it is easier to achieve.
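The explainability advantage of simple statistical models can be illustrated with a z-score check that names the offending sensor directly. The sensor names and readings below are made up, not values from the heavy-duty transporter project:

```python
# Sketch: z-score anomaly detection that directly names which sensor is off.
# History and current readings are invented for illustration.

from statistics import mean, stdev

def z_scores(history, current):
    """Score how far the current reading deviates from each sensor's history."""
    return {
        sensor: abs(current[sensor] - mean(values)) / stdev(values)
        for sensor, values in history.items()
    }

def anomalous_sensors(history, current, threshold=3.0):
    """Name the sensors whose current reading is a statistical outlier."""
    return [s for s, z in z_scores(history, current).items() if z > threshold]

history = {
    "gearbox_spring_force": [102.0, 99.5, 101.2, 100.4, 98.9],
    "oil_temperature": [55.0, 54.2, 56.1, 55.6, 54.8],
}
current = {"gearbox_spring_force": 88.0, "oil_temperature": 55.3}
print(anomalous_sensors(history, current))  # → ['gearbox_spring_force']
```

The result is self-explanatory in a way a neural network's output rarely is: the model points at one sensor and the deviation behind the alarm can be read off directly.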

It’s great that you’ve now come full circle back to the beginning, because your customers are naturally very interested in seeing how they can reduce their reject rates and which parameters they should scrutinize, for example in the case of the forging machine. These models can be applied accordingly in order to achieve results that have a business impact. Or in the example of the heavy-duty transporter, to see where possible failures of springs in the transmission occur and how often this happens. It would be a big mistake to deliver a device and then have failures that could be foreseen. That’s why such models are incredibly valuable and a great lever. This brings me to the last question, Jakob, what future developments do you see on the market in the area of AI based on IoT data? And what significance does this have for you in your collaboration with Andreas and All for One together with blue-zone? Can you elaborate on that?

Jakob

With pleasure. As I have already mentioned, one of the biggest trends and drivers is Explainable AI, i.e. the explainability of models. On the other hand, the topic of IoT data in general is a huge issue, and the possibilities are far from exhausted. IoT scenarios are becoming ever larger and more important. In this context, the topic of language models, especially Large Language Models, is becoming increasingly important in the IoT sector, for example, for the evaluation of maintenance information, which can be easily integrated with them. You have already named a partner who specializes in this.

Exactly, the company elunic. At this point I have to say briefly: Subscribe to the podcast if you haven’t already done so. The next episode will reveal even more exciting things. Exactly, you can listen to it again in detail. But Jakob, I didn’t want to interrupt you.

Jakob

No problem at all, I’ll definitely listen to the episode too. Basically, these are the most relevant areas. What is perhaps also important is the topic of Responsible AI. This means that models must be developed in such a way that they do not disadvantage any area or sensor and that they are correspondingly robust.

So this is really an issue that sensors should not be disadvantaged, or how should we understand this?

Jakob

Responsible AI is not easy to define. Features in individual models have different influences, and care must be taken to ensure that one sensor does not provide too much information and thus receive too much weighting; it is important to have a balanced data set. This applies not only to responsibility in a human context, but also to the fact that people are and must remain a central component of production processes, even though AI models can automate processes.

Yes, that sounds to me like an episode we can do again as a follow-up, especially on the topics of Explainable AI and Responsible AI. There are many topics that we can discuss in more detail here. I would like to take this opportunity to thank you both for your time today. I’ve put your contacts in my show notes, so if you want to talk to Jakob or Andreas, you can find them on LinkedIn or on our platform iotusecase.com. Thank you again, I thought it was great that you brought concrete customer examples with you. This was a wonderful way to understand who is on the move where, especially in the collaboration with blue-zone for data collection and avantum for data analysis, and then the integration into the ERP data world with your partner All for One, which unfortunately is not represented today. Andreas, you have done a good job as a representative. Thank you once again for being here. I’ll leave the last word to you.

Andreas

Yes, then I say thank you on my side. I’m also pleased that I was able to be there again today. We would be really happy if we could get in touch with the audience and take the first steps towards AI or other topics together. Thank you again and see you next time.

Jakob

I don’t have much more to add. I was very pleased to be part of it all and I hope that we will network and continue to have a lot to do with each other.

Yes, thank you very much and have a great week. Take care. Bye!

Jakob

Bye!

Andreas

Bye!

Please do not hesitate to contact me if you have any questions.

Questions? Contact Madeleine Mickeleit

Ing. Madeleine Mickeleit

Host & General Manager
IoT Use Case Podcast