
IoT & Data Science: Practical insights into industry projects



Listen to the IoT Use Case Podcast on Spotify.
Listen to the IoT Use Case Podcast on other platforms.

IoT Use Case Podcast #117 - ACP CUBIDO

In the 117th episode of the IoT Use Case Podcast, we focus on the connection between data science and IoT. This time, the moderator and CEO of IIoT Use Case GmbH, Madeleine Mickeleit, welcomes two experts from ACP CUBIDO: Dr. Mario Schnalzenberger, Chief Data Scientist, and Cornelia Volaucnik, Data Scientist. The episode provides practical insights into various industry projects in which IoT and data science play a central role.

Episode 117 at a glance (and click):

  • [20:34] Solutions, offerings and services – A look at the technologies used

Podcast episode summary

The episode starts with a discussion about the intersection between data science and IoT and how it applies to different business cases. Dr. Mario Schnalzenberger and Cornelia Volaucnik use three specific projects from the mining and metal products industries to explain how data science can help improve product quality and optimize data preparation and integration into existing systems such as data warehouses.

ACP CUBIDO Digital Solutions offers cross-industry digitalization solutions ranging from analytics and software development to licensing and implementation of hardware infrastructures. The customer projects presented are particularly interesting, including ÖBB with a use case for precise electricity consumption forecasts and AMAG (Austria Metall GmbH), a leading international supplier of primary aluminum and semi-finished aluminum products, which is increasing its process efficiency through innovative IoT applications.

  • AMAG (Austria Metall GmbH): Focused on optimizing production data management, which led to significant time savings in DWH loading and improved data quality.
  • SAG Innovation GmbH: The aim was to integrate data science into the production lines in order to increase productivity and product quality, which led to an increase in productivity of up to 30%.
  • Energie Steiermark: The focus here was on improving the accuracy of energy demand forecasts, which led to more accurate forecasts and cost reductions.

The episode concludes with a discussion about the future of IoT and data analytics in the customer context and provides insights into the challenges and potentials that ACP CUBIDO has identified in its projects. This makes the episode a valuable resource for anyone interested in the intersection of IoT, data science and cross-industry digitalization.

Podcast interview

Hello Mario, hello Cornelia, welcome to the IoT Use Case Podcast. I’m very happy that you are here today. How are you, where are you right now?

Mario

Thank you for the invitation. I am here at our head office in Leonding in Upper Austria. We had a pleasant start to a new working week after I had unfortunately been ill for a while. But now I’m recovered and hopefully everything will be back to normal.

You said Upper Austria, where exactly are you located?

Mario

It’s a small city next to Linz.

Linz is very well known. A really beautiful region.

Mario

Together with Wels, Linz and Steyr, it is the industrial center of Upper Austria, if not the whole of Austria, with the largest industrial companies, which are focused on steel and chemical production.

Very nice. I even know some listeners from the region personally, and from Austria in general. I’m very glad that you’re here today. Cornelia, how are you? Where are you?

Cornelia

I’m doing very well, thank you. I’m delighted to be here and, just like Mario, I’m in the office next door in Leonding, so to speak.

I’m very happy that you’re joining us today. Let’s make a short introduction to your company. ACP CUBIDO Digital Solutions GmbH is part of ACP Holding Digital AG. You primarily offer analytics and software development and related hardware topics, licensing, (cloud) infrastructure, implementation and support – all from a single source. The important thing is that you focus on analytics and software development. That’s what today is all about. You ensure that the right data is available in the right place at the right time. Today we will find out exactly how this works by looking at your customer projects in practice. Cornelia, which clients do you work with?

Cornelia

I work with a wide range of customers. What we are presenting today, for example, comes from the metal sector. We have also done several projects with energy service providers. But in principle, our focus really is on data.

Yes, you operate across different industries. In the podcast, I always talk about various projects and, above all, use cases from practice. Cornelia, you’ve just mentioned that you have various projects with customers. Which project are you working on today or what are we looking at in more detail? What use cases does this involve?

Cornelia

Yes, very much so. The project I want to talk about today is with AMAG Austria Metall AG. They produce aluminum strips, sheets, aluminum plates, etc. The project we implemented with them was in the area of predictive quality. This was a really impressive project in which we analyzed process data to find out what influence it has on product quality.

Yes, very cool. I'd like to dig deeper into that later. Mario, which customers are you working with and what use cases are they working on?

Mario

I have brought two small examples. The first concerns ÖBB, which has been working with our precise electricity consumption forecast for almost 10 years. What consumes the most electricity there? The locomotives, of course. Consumption is allocated in such a way that these locomotives have to draw their electricity from the infrastructure accordingly. A considerable amount of electricity is consumed, and an accurate forecast pays off because it saves a lot of money when buying on the national or international market. In the use case itself, savings of several hundred thousand euros per year were mentioned; that has held for the past nine years and continues today. Electricity consumption forecasts definitely pay off for companies.

The second use case, an OEE use case, will hopefully be familiar to most of the audience from the manufacturing industry. This is overall equipment efficiency combined with predictive quality or predictive maintenance. That’s from SAG, the Salzburg Aluminum Group. There, we have achieved an increase in productivity of up to 30 percent with our implementation of an OEE together with Predictive Quality. This is the second use case we want to take a closer look at.

Yes, super exciting. So SAG is a manufacturer of aluminum profiles, so to speak, or what do they produce?

Mario

In this case, aluminum containers. SAG focuses on aluminum containers and, in this particular case, pressure vessels for vehicles, i.e. where gases are stored under pressure, regardless of whether these feed compressors, tanks or the like.

Before we get into the details of the projects, perhaps one more question in advance. You are now working with very diverse customers. What is your vision with the topic of IoT and your focus on data analytics? What is the vision you are pursuing with your customers?

Mario

Basically, I see IoT, and the ability to access and process this data, as being as essential as electricity itself. This form of data processing should be seen as completely normal and an everyday matter, because anyone who knows how their machines work, what their customers need and exactly how they need it, can build machines and products much better and more efficiently, and subsequently offer them more cheaply. Sooner or later this will become a matter of course, just as it was once not taken for granted that machines in production could be connected with each other or that a machine would have an Internet connection. Today that is practically everyday reality.

What interests many people is the business case behind the projects. You have already mentioned the use cases. It’s about predictive quality, it’s about OEE. Let’s start with SAG. Mario, what is the business case for SAG behind this project? Why did they decide to go this way?

Mario

As a basic principle, the goal is not IoT. The goal should always be a solution that helps me. That means starting with a problem or a challenge that I can then solve, whether predictive maintenance or predictive quality. How can I improve my product, and perhaps use IoT to do so? I don't have to. At SAG, scrap minimization was the issue. They have an extremely innovative manufacturing process and even hold several patents in this area. Welding in particular was always a challenge, and the metallurgists were faced with a conundrum. On the one hand, they had no data because they had not recorded it. On the other hand, the process really was completely new and patented, extremely innovative. This made the company unique, as it was the only one manufacturing such products, but it also presented a particular challenge: there was no experience from others to build on. It was therefore all the more important to make use of their own experience. As I always say, experience and knowledge are contained in the data, and this is where we come in. We collect the data and then try to exploit that experience. The final outcome was a 30 percent increase in productivity. It was essential that we found the two or three sticking points at the relevant stations. The production line had a total of seven stations, with four or five different stations preceding the welding robot. The task was to really find out why something is not working properly. Tightness is the top quality objective for pressure vessels, so we had to find out the reason and how to optimize it. We also discovered other things: once I start collecting data, I can also analyze it in different ways. That's another point. Data should not be an end in itself, collected for a single purpose, but always viewed from several perspectives. That's kind of the key.

Okay, that means you have optimized the data preparation, so to speak. Was it also about a specific system that you connected or optimized at the customer’s site? Was it about the general IT architecture, which is then connected to this line?

Mario

Actually both. On the one hand, since the process is patented, they have programmed a lot themselves, right down to the system control. This gave us many opportunities to implement things in different places. But elsewhere they have off-the-shelf equipment, such as a welding robot. There are various suppliers; you take one of these welding robots, you can program it and then it's ready. You can't change the equipment itself, but you can still access its data. So in some cases you can really build things yourself, in others you just have to take what's there and try to use it as much as possible. We also changed people's perspective. Digitalization doesn't just start at the machine, but also in the minds of employees. If I do something, it should really be assignable and clear where it comes from. All this extra work, like entering or confirming something somewhere or pressing an additional button, isn't there just to annoy and keep people busy. At the end of the day, this is what helped us achieve that 30 percent productivity increase. That was the door opener that showed the system wasn't there to annoy or replace them, but to make their work easier. Fewer rejects also means less of the work nobody likes: sorting things out, throwing them away, reworking them, extra handling and so on.

At SAG, you have connected the production line with various assets, such as these robots, to the IoT platform, so to speak. You have taken on the topic of data acquisition, then you have implemented OEE as a use case and the last one goes in the direction of predictive quality control, i.e. product quality and the traceability of such production data.

Cornelia, you mentioned AMAG as a customer. What did you do there? Are these the same use cases or different ones?

Cornelia

No, that's a different use case, though there are of course certain similarities. In principle, the aim was really to find out which production steps have the strongest influence on the quality of the end product. An aluminum ingot goes through many production steps and many plants, and there are a lot of influencing factors acting on it. A production cycle like this can take a really long time. In principle, if the quality of an end product is not right, you want to know what the deciding factor was. That is exactly what we analyzed in this IoT project. We carried out statistical analyses and used data science methods to determine which process parameters influence the end product. In this project, the correct transformation of the collected sensor data was very important; there was a lot of data engineering involved.

Did you focus mainly on data acquisition in this project, i.e. connecting the data, or was the focus more on data processing at the end? What was the focus of this project?

Cornelia

I would say that both were very important in this project. It really was a lot of data, very large amounts. Because we wanted to map this production process, data engineering was very important; you simply have to capture this complexity.

Was there a specific system that the customer was interested in connecting or was it generally about the data connection of the production line?

Cornelia

No, that was the data connection of the lines in general, so to speak.

Thank you very much for introducing these two customers. What is always exciting for many customers or end users is the integration across different disciplines, other companies, subsidiaries and so on. What was it like with AMAG? Is connecting other disciplines with IoT also an issue? How important is it for your customers to involve other stakeholders?

Mario

Basically, with both clients we worked with a central innovation unit, under different names, whether Big Data, Digitalization or Innovation. The projects then had to be realized together with various subsidiaries or sister companies in different places. It gets really exciting when third-party and fourth-party providers come in who have to implement extensions in the ERP system; the specialized ERP systems then had to be adapted again by whoever implements them. At the end of the day, there is usually a large project group around the table that has to discuss where, what and how adjustments need to be made. I am always convinced, and I always try to convey this to customers and project partners, that problems should be solved where they can be solved efficiently, meaning with as little effort and as little impact as possible. If a sensor measures something incorrectly, I should not send the raw values to 27 project partners and let each of them calculate the correction; I make the correction in one place and send the corrected values on. That is very much simplified, and the real situations are more complex, but that is roughly how it should happen and the approach we always take. We strive to resolve problems as early as possible so that we do not have to ask ourselves each time whether the information is really correct.

Let’s go back to SAG, there are various relevant data that are connected there. What types of data are relevant? Is it about real-time data? Can you tell us a bit more about the data you have connected there?

Mario

Basically, IoT always involves real-time data when we communicate with machines. The core always consists of real-time data from machines that actively or passively send data. For example, the robot was addressed via AB and the data was extracted from there. As soon as the welding process was complete, a trigger sent the data onward. How the data is retrieved depends on how our partner or data provider acts. We provide the data as JSON or Parquet, depending on the requirement. JSON is advantageous in the IoT environment because it is a flexible format to which fields can be added or removed at any time. Parquet is a compact and efficient format, especially for large amounts of data, but less flexible in other scenarios. When it comes to integration into ERP or other systems, the connection is made directly. This means that we obtain data from various sources, including system sources and traditional methods such as data collection from orders and customer data. This information is needed to calculate key figures such as overall equipment effectiveness (OEE), including productivity and other factors. It is important to note that this information is not normally contained in the machine itself, as machines are primarily designed for production. The OEE can only be calculated with a certain amount of control information.
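The control information Mario mentions feeds the standard OEE formula: availability × performance × quality. A minimal sketch of that calculation, with all figures invented for illustration (the real inputs would come from the ERP/MES and machine data, as described):

```python
# Standard OEE formula sketch; the example figures below are invented.

def oee(planned_min, downtime_min, ideal_cycle_s, total_parts, good_parts):
    """availability x performance x quality, from shift-level control data."""
    runtime_min = planned_min - downtime_min
    availability = runtime_min / planned_min
    # performance: ideal production time for the parts made vs. actual runtime
    performance = (ideal_cycle_s * total_parts) / (runtime_min * 60)
    quality = good_parts / total_parts
    return availability * performance * quality

# Example shift: 480 min planned, 30 min downtime, 20 s ideal cycle,
# 1200 parts produced, 1150 of them good.
score = oee(480, 30, 20, 1200, 1150)
print(f"OEE: {score:.1%}")  # roughly 80%
```

The point Mario makes holds here too: planned time, downtime and order data are not in the machine; they have to be joined in from control systems before this formula can be evaluated.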

[20:34] Solutions, offerings and services – A look at the technologies used

Let's dive into the solution that you built for the customer. I think many listeners will recognize themselves in this section in particular, since the processes and challenges of these use cases are very similar across companies today. How did you do it? How did you set up the solution at SAG?

Mario

You can imagine it a bit like Lego: I assemble three parts from machine 1 in machine 2, and so on. In the end we have one product, but at the beginning of the production line there are seven or ten individual parts. Nevertheless, at the end we would like to know whether the product is good or bad overall. This comprehensive evaluation and the tracing of qualities backwards and forwards enable us to view information on each production step both in isolation and in the overall context. This approach from large to small and from small to large, together with the corresponding evaluations, led to significant findings. For example, we found that the cycle times were not accurate once the ERP side came into play and information did not match. This lets us create a holistic picture of a company and recognize that changes may be required not only in availability and productivity but also in purchasing, production planning or sales. It may mean that we have to adjust the product price or make a better or worse assessment in purchasing. This is what makes our solution special: at the beginning is the OEE, at the end we get answers to many questions.
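The backwards-and-forwards tracing Mario describes can be pictured as a part genealogy: each assembly records which upstream parts went into it, so a defect found at the end can be linked to every station it passed. A hypothetical sketch (part and station names are invented, not from the SAG project):

```python
# Hypothetical part-genealogy sketch: each assembled part records its
# direct components, so a quality finding can be traced backwards.

genealogy = {
    "vessel-001": ["shell-A", "cap-B"],   # final welded vessel
    "shell-A": ["blank-1", "blank-2"],    # formed at an upstream station
    "cap-B": ["blank-3"],                 # formed at another station
}

def trace_back(part, tree):
    """Return all upstream parts that went into `part`, depth-first."""
    upstream = []
    for component in tree.get(part, []):
        upstream.append(component)
        upstream.extend(trace_back(component, tree))
    return upstream

# A leak found on vessel-001 can now be linked to every blank it contains.
print(trace_back("vessel-001", genealogy))
```

The same structure read in the other direction answers the forward question: which finished products contain a suspect blank.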

What also makes the project special is this OEE dashboard that you have also built there, where you have linked this production line with various upstream systems or certain other IT systems that were used to calculate the KPIs, so to speak. How did you do this in terms of the technology stack? You have to collect the data somewhere, process it and then run the analysis. I know that you also work a lot with Azure. Was that mainly Azure Services then? Do you collect the data and then transfer it to the Azure service in the cloud or how exactly did you do this?

Mario

Basically yes, we are a Microsoft partner and therefore primarily use Azure. Only Azure was used in the solution, if you like. The on-premises access methods are of course not in the cloud, but from the edge onwards everything is. One aha effect was that "start small, grow big" worked and the cost promises were fulfilled. We can easily scale the volume: when we start with one machine, we only need to provision for that machine, and as the project grows, the solution adapts easily because the capacity is there. We are not restricted by limitations. As far as evaluations are concerned, we like to work with Power BI, which offers two main methods in this context. One is the direct connection to data streams. This works quite efficiently but is very time-sensitive, as the information must already be available at the time of the query. In some ERP systems there may occasionally be problems if, for example, order data has not yet been fully recorded, which can mean that the order is not yet displayed. This is a problem we have seen with several customers, independent of the customer. To ensure that the data is nevertheless correct, we have developed combined methods, applied retrospectively on a daily basis, to correct the data. This means that the OEE dashboard offers 100% coverage, even for the last day. For each part it is recorded exactly which order it belongs to, how well it was manufactured and what effect it has on the OEE: whether it is a good or bad part, whether it was later used in another part that then became a bad part, and so on.

Cornelia, we also talked about AMAG. How did you do that? In the end, it was all about optimizing and modernizing data management. You’ve just explained a bit about what the goal was. How did you solve it there, from connection to processing to analysis? Was that the same or a similar case or how did you do it?

Cornelia

Yes, technically similar, but not identical, I would say. The data came via an HPC, was then routed via an IoT hub and finally written to a data lake via Stream Analytics. This was a really big part of the project, as complex processes were mapped here. The data processing and analysis then took place in Azure Databricks. As Mario already mentioned, we work in Azure for all of these components.

At the beginning, you also talked about improved quality, particularly in connection with predictive quality as a use case. How did you actually analyze the production parameters? Do you really go into detail and examine different types of data? You have just given some examples. How did you manage to really carry out the analysis that ultimately improved data availability and the traceability of the production parameters that affect quality? How did you go about it?

Cornelia

We worked intensively with Databricks, and the focus here was on tracking all this sensor data through to the end product. A lot happens in the production process, such as forming or cutting, and we have worked closely with our customer, AMAG, to ensure that we map the data handling process correctly and accurately. In this way, we were actually able to trace the quality losses in the end product back to the sensors and find the corresponding correlations. If, for example, the temperature of a sensor is too high or another process parameter is set differently, this affects our end product.

This means that in the end, you have provided the customer with an end-to-end solution and developed it further together with them. This solution involved the integration of data and the effective use of this data in the system, which ultimately led to improved data availability. Of course, you also used your own tools. It is also the case that you continue to develop various tools, features and things. What’s to come in the future? What are the trends that are emerging? What can we look forward to from CUBIDO over the next few years?

Mario

Wow, what's next? I definitely see the future in the IoT sector: everyone will need IoT and everyone will want to analyze this data. Citizen data science will become an issue, meaning every company will carry out its own analyses, with its own analysts who want to work with the data. That's why the data is needed, because without data there is no analysis; that's a key point. Another big topic is, of course, AI. Whether we're talking about chatbots or other AI methods, it's an important topic. On the one hand, this involves using AI as a tool to make work easier and reduce laboratory activities, for example predicting product quality from images or other methods to really speed up predictive quality. On the other hand, it's about having methods that can provide answers, and here ChatGPT is one possibility. It may not be perfect yet, but it is at least an option for obtaining answers to a large number of questions. However, much will depend on how the information is prepared. This brings us back to data engineering and the processing of information, and I think we have a lot of work to do here. Wherever you look, people talk about what ChatGPT and other tools can do, but the preliminary work required, namely translating information into a form these tools can reason with, is rarely discussed.

Yes, absolutely. What you are saying is exactly the key point, namely the processing of data and in particular providing the correct data and identifying it. The two projects you presented today also show the impressive possibilities of IoT in combination with data science. Real-time data from the field, combined with analytics and data science, offers immense opportunities.

Today we learned about two exciting cases, both from the AMAG side and from the SAG side, which leverage real business cases. I think there is still a lot of potential for the future. Cornelia, you are now an expert, especially on these AI topics. Are there any additions from your side? How do you see the future? What else is in store for us?

Cornelia

Yes, Mario has actually summed it up quite well. I also think that AI topics in connection with IoT data will be really exciting over the next few years. Especially when it comes to predictive quality in the field of image recognition or something like that, because you can already analyze that extremely well. Often the crucial point is that I need the right data. I may still have to collect the right data, but there really is a lot of potential in the longer term.

I would be very happy if we could perhaps do a special episode on this topic, since today we only hinted at the possibilities. I was really impressed by the two projects you from CUBIDO presented today together with your clients.

We learned that AMAG, a leading supplier of aluminum products, has achieved significant time savings in production data management by modernizing with your help. In the second project, in collaboration with SAG Innovation, we also saw the power of combining IoT and data science in the aluminum manufacturing sector, enabling them to increase their productivity by 30 percent. These are really exciting cases. I would give you the last word for today. Thank you very much for being here. Thank you very much for all the exciting collaborations and with that I would like to hand over the last word to you.

Mario

Thank you, Madeleine, for the invitation and for the opportunity to present our projects here today. One core element for us is always to emphasize working together: we really try to take an individual approach, to take the customer by the hand and bring them along on the journey. After all, one of our goals is to enable the customer to proceed more or less independently so that they can really make faster progress. That means being able to IoT-enable people.

Cornelia

Thank you for inviting us. My experience from this project is that good cooperation with the customer is very important. Especially when it comes to process knowledge and knowledge of the data content, it is simply important to work closely with the customer on the project.

Very nice. Thank you very much for the last closing words for today and I wish you a nice rest of the week. Take care and yes, see you next time.

Mario

Thank you, see you next time, ciao!

Cornelia

Thank you, ciao!

Ciao!


Questions? Contact Madeleine Mickeleit

Ing. Madeleine Mickeleit

Host & General Manager
IoT Use Case Podcast