A Digital Data Collector – Multitool for the Shop Floor

Listen to the IoT Use Case Podcast on Spotify.
Listen to the IoT Use Case Podcast on other platforms.

IoT Use Case Podcast #40 – Manz, iT Engineering

“The value-added share of software in today’s mechanical engineering should not be underestimated.” – the use case of the 40th episode of the IoT Use Case Podcast illustrates this once again. Wolfram Schäfer, founder and managing director of iT Engineering Software Innovations GmbH, brought along Stephan Lausterer (Head of R&D System Engineering, Manz AG), an actual user of his digital Swiss Army knife for the shop floor, and together they presented the concrete value creation from practice.

Podcast episode summary

“Collect”, “Explore” and “Improve” – the IIoT Building Blocks from iT Engineering Software Innovations (iTE SI) work according to this three-stage principle. In this podcast episode, the guests demonstrate the use of the Data Collector using Manz as a practical example. Step 1: Data collection – even with a heterogeneous plant landscape. Step 2: Visualization and first derivations. Step 3: Learning from the data and adding value through optimizations.

The users of iTE SI’s solution are spread across the mechanical engineering sector, regardless of industry. The software company has been at the interface between machines and IT for over 20 years and accompanies the digital transformation of mechanical engineering and the manufacturing industry on the way to Industry 4.0 with its solutions and products in the manufacturing environment. They also help the Swabian engineering company Manz with the commissioning of special-purpose machines. Manz is active in lithium-ion battery technology, solar and photovoltaic cells, wet chemistry, laser processing, inspection systems, and the manufacture of various electronic components. iTE SI was brought on board with the goal of shortening the duration of special-purpose machine development up to final use on the shop floor and, once there, generating added value from data.

The digitization solution collects raw data in a minimally invasive way without affecting the controls themselves. The Data Collector works with various interfaces and can therefore be used on a wide range of devices and machine controls, such as PLC or NC controls. The data is first pre-processed on an edge device, then cleaned and “harmonized” before being sent to the cloud – creating an all-round clean data picture. The data can then be visualized on an app dashboard. The use case discussed involves high-frequency data as well as image processing data: axis positions of laser robots, drive data in control real time, or laser data such as pulse duration. Among other things, these data enable targeted fault analysis, predictive fault avoidance and cycle time optimization.
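To make this three-step data flow a bit more concrete, here is a minimal, hypothetical sketch in Python of what edge-side pre-processing before cloud upload could look like. The field names, units and clean-up rules are illustrative assumptions and do not reflect the internals of iTE SI’s actual Data Collector.

```python
# Hypothetical sketch of edge-side pre-processing: clean and harmonize raw
# machine data before forwarding it to the cloud. Names and units are
# illustrative only and do not reflect the actual Data Collector internals.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Sample:
    source: str        # e.g. "laser_1" or "axis_x"  (assumed names)
    signal: str        # e.g. "temperature", "pulse_duration_us"
    value: float
    unit: str
    timestamp: datetime

def harmonize(sample: Sample) -> Sample:
    """Bring all temperature readings to degrees Celsius."""
    if sample.signal == "temperature" and sample.unit == "degF":
        sample.value = (sample.value - 32.0) * 5.0 / 9.0
        sample.unit = "degC"
    return sample

def preprocess(raw_batch: list[Sample]) -> list[Sample]:
    """Drop obviously invalid readings, then harmonize units."""
    cleaned = [s for s in raw_batch if s.value is not None]
    return [harmonize(s) for s in cleaned]

# Usage: an edge device would call preprocess() on each raw batch and only
# then hand the result to whatever cloud uploader is in place.
batch = [Sample("laser_1", "temperature", 98.6, "degF", datetime.now(timezone.utc))]
print(preprocess(batch)[0])
```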

What makes the data even more valuable is that it is enriched with existing process knowledge from the customer and manufacturer. “In the end, it’s about bringing everyone’s knowledge together,” is summarized in this podcast episode.

Podcast interview

Hello Wolfram and hello Stephan, welcome to the IIoT Use Case Podcast. I am very happy that you are with us. Glad you guys took the time. I would start right away with an introduction. Wolfram, would you like to introduce yourself briefly and say a little bit about the company?

Wolfram

Greetings to the round. My name is Wolfram Schäfer. I am the founder and managing director of the company iT Engineering Software Innovations GmbH. We are a software service company; we develop software for mechanical engineering and for industrial production and production technology. Our customers are spread across the mechanical engineering sector regardless of industry. For them we develop products, or solutions in the manufacturing environment. All in all, we have been involved in this area for over 20 years. For the last five to six years, we have of course been strongly involved in the topic of digitalization in industry and in mechanical engineering – with new technologies, new approaches to solutions and, as I said, corresponding products and solutions that we offer.

 

Thank you very much. It’s nice that you also brought your customer, the Manz company. Hello Stephan, also in your direction. I would be pleased if you could briefly join the round of introductions and perhaps also introduce Manz’s core business a little bit, so that we can then also get into the content. 

Stephan

Yes, with pleasure. Hello, also from me to the round. My name is Stephan Lausterer. I have been with Manz for twelve years and am responsible for software development in the Basic Technology division. This includes libraries for our controls, for all our machines, as well as the HMI user interface, quality assurance for the software products we create, and electrical designs. A big topic in recent years is Industry 4.0, where we have developed our smartPRODUCTIONKIT. Manz is a Swabian engineering company. We are a high-tech company specializing in technologies for the production of lithium-ion batteries, electronic components as well as solar and photovoltaic cells. Our core competencies are wet chemistry, laser processing and inspection systems – above all optical inspection systems – and, newly added, the area of Industry 4.0 with the software products developed there.

 

I would like to come back to the topic of lithium-ion batteries, which is a very topical one, later on. To start with, Wolfram, here’s my question to you: What relevance does the topic of digitization have for your customers in general, or do you see something on the market that is emerging more and more? What is your opinion? 

Wolfram

As I said, we have been at the interface between machines and IT for over 20 years, and it sounds a bit trite, but data is the crude oil of the future, of the digital transformation in industry. With the experience we have there, data is the basis for any digitization effort: interoperability with the machines – bidirectional, being able to communicate with machines and with everything that moves on the shop floor – and putting together new solutions on that basis. Keywords include condition monitoring, predictive maintenance, digital twin, track & trace applications, and OEE optimization. For us, in our business, it is moving more and more towards providers. In other words, our customers are in the mechanical engineering sector, and we develop products for them that they in turn use for their customers. Or, on the other hand, we implement digitization solutions on the shop floor for the users themselves, who operate machines and produce something. These are the previously mentioned examples and, beyond that, the connection to ERP systems, communication with EMS systems and so on. And in the last few years we have noticed on a broad front – after the emergence of Industry 4.0 ten years ago, which was perhaps more of a marketing hype where everyone was cautious – that solution ideas are slowly emerging that actually promise added value, that a little more light is coming into the darkness, or the fog is clearing, and we are seeing where we can actually start with new business models and, above all, where we can create benefits. From my experience, I would say the topic simply becomes more relevant from year to year.

 

Stephan, once again in your direction: You had said that you are first of all a classic mechanical engineer, also in the field of lithium-ion battery technology. In terms of your core portfolio, you are not classically oriented toward digitization for the time being. Why did you choose this path? What were the drivers you saw in the market for you? 

Stephan

Perhaps we are not classically active in this area, but I don’t think the value-added share of software in today’s mechanical engineering should be underestimated. It has been growing steadily for five or ten years. And here, of course, the topics of digitization and Big Data are becoming more and more prevalent. What specifically brought us to this four years ago is the fact that we build a lot of special-purpose machines – no off-the-shelf products, no machines that you can order from a catalog. Here, the path from the development of the machine to the machine being at the customer’s site is sometimes relatively long. That’s when we looked for possible solutions to shorten this path. One idea was to use digitization solutions and the visualization and analysis of data to give the commissioning engineer, and ultimately the customer, the opportunity to see transparently, at a glance, the characteristic values, key figures, performance data and KPIs of the machine they receive.

 

Before I come to the commissioning process, I would like to ask you a very brief question. You had talked about lithium-ion battery technology and special-purpose machines. What do these machines actually look like today? How do you have to imagine this? To draw a picture for listeners who may have never seen anything like it.

Stephan

From the outside, the machine looks like a mill-turn machine as you would see in a catalog. There is a housing on the outside, and what is special about the machine are the processes that take place on the inside, behind the doors, so to speak. As I mentioned at the beginning, our big technology areas are laser technology, image processing and the processes needed to manufacture lithium-ion batteries with our machines. These are roll-to-roll applications on which the anode, cathode and separator films can be processed, cutting, punching, welding processes – a diverse field of activity that ultimately leads to being able to drive a car with electricity. 

 

I was just trying to get my mind around that. The bottom line is that the lithium-ion batteries are manufactured in your machines, right? From the outside, it looks like a mill-turn machine, but this is a complex process that is probably also customer-specific, hence special-purpose machine. 

Stephan

Exactly, and some of these are individual machines, while others are linked to form lines in order to map the production process.

 

Now you had just said it’s about commissioning. If I imagine a commissioning process like this, then the machine has to be set up somewhere – planning, setting up on site, there’s a complex process behind it. How does such a commissioning process work today? Do you have an example? Maybe also from one of your customers, that you understand the process behind it a little bit. 

Stephan

In simplified terms, of course, there is an agreement with the customer. There is a specification, and acceptance criteria are defined together with the customer, which must be met at the end of the day. Then a machine is designed – mechanically, electrically and in terms of software. Of course, there is a pool of basic data and modules, but the devil is, as so often, in the details. The machine is built through Operations, i.e. through the entire procurement process, to assembly. Then, when everything is assembled – mainly the mechanics and the electrics – you can flip the main switch at some point, and the commissioning engineers and software developers come to the machine and breathe life into it. And not only to be able to move the machine, but also to start up the processes I was talking about before. These are partly strongly dependent on the material. This means that the customer provides us with material that has to be processed on the machine, and tests have to be carried out on this material in order to meet the aforementioned quality or performance criteria. It is precisely this data that we want to display and analyze as early and as precisely as possible.

 

Now, when you talk about material – I don’t know if you’re allowed to talk that much about it – but what kind of material is it? Is that the classic housing of such a lithium-ion battery or is that the filling with certain acid or how do you have to imagine it?

Stephan

This starts at the very front with the cathode, anode and separator. These are foil materials that are coated. The base material is usually aluminum and copper, which is coated accordingly. That is then processed – cut or die-cut – and goes into the housing of the cell. This can be a so-called pouch cell, which is virtually wrapped in a foil and relatively flexible. Or it goes into an enclosure, into a can – flippantly speaking, a tin can – which is then also sealed. At some point the electrolyte is added and then it is sealed again. Then you can do electrical tests and start up the battery. So ultimately it’s about the whole chain from the anode, cathode and electrode foil to the finished battery pack, as far as the lithium-ion batteries themselves are concerned. But we don’t just do battery technology; we also do the manufacturing of the electronics components that are needed to run a battery pack in an automobile or a smartphone, for example.

 

What are the classic challenges that you encounter or that customers present you with? Can you tell us a bit about the day-to-day challenges?

Stephan

The challenge is that the processes must run at a correspondingly high quality and must also meet the customer’s quality criteria. Tolerances are specified there, or speeds that have to be met – for example, the number of foils per hour that have to be produced so that the throughput is achieved and the machine pays for itself in the time the customer envisages. That is one issue. The other is malfunctions. As is so often the case, the straightforward path is done relatively quickly, but catching the sources of errors, error handling, analyzing where the errors are coming from and then eliminating them is another issue. And that is where the collection of data, the analysis of data, and ultimately – where we want to end up, of course – working with the data helps. This means making optimizations on the basis of the data obtained, or perhaps even avoiding cases of malfunction, so that consistently high-quality products come from our machines during production.

Now, of course, I talk a lot about data in the podcast. I don’t know if you can go into a specific process right now, but maybe you can tell us a little bit about which metrics from the hardware you are building end up being of interest to you or your customer?

Stephan

In general, we work a lot with axes and with drives. The positions are important, of course, because that is exactly where the parts have to be produced. This means that the axis position – when it is necessary to position a laser, or to transport the foils via a robot to a specific location – must be precise, again and again. This is data that comes from the drives, as well as data from image processing: when inspecting how an object lies, or when optically inspecting a weld seam, for example, the question is whether one can deduce from it the quality of the weld seam – whether this is a good or a bad part. Then there are things like laser power, the pulse duration of a laser, or electrical measurements such as an ohmic resistance, or simply an electrical continuity test to see if there is a short or if everything is as it should be.
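As a purely illustrative example of how such measured values might be checked against limits, here is a small sketch; the thresholds and signal names below are invented and are not Manz’s real acceptance criteria.

```python
# Hypothetical sketch: classify a part as good/bad from a few of the measured
# values Stephan mentions (laser power, pulse duration, ohmic resistance).
# All limit values are invented for illustration.
def part_ok(laser_power_w: float, pulse_duration_us: float, resistance_ohm: float) -> bool:
    power_ok = 950.0 <= laser_power_w <= 1050.0      # assumed tolerance band
    pulse_ok = 80.0 <= pulse_duration_us <= 120.0    # assumed tolerance band
    continuity_ok = resistance_ohm < 0.5             # assumed: no open circuit

    return power_ok and pulse_ok and continuity_ok

print(part_ok(1000.0, 100.0, 0.1))   # True
print(part_ok(1000.0, 100.0, 12.0))  # False: continuity test failed
```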

 

Wolfram, now the question for you. As I said, we are talking about a wide variety of data. In general, there is already a lot of intelligence in the machine. But I now also have to set threshold values in order to then create added value for the customer afterwards – to prevent malfunctions or downtimes, for example. How do I do that exactly? How do I bring this knowledge, which I may already have or that the employees have, into the digital world? 

Wolfram

In the first step, it’s not really a matter of starting to think in terms of machine learning, but simply of bringing together the data from all processes on a common time base, for example. We had already heard in Stephan’s remarks that this specific case involves many machines, many processes, several lines producing in parallel. A lot of data is generated. In order to be able to make a statement at the end, it must first all be brought onto the same temporal basis and harmonized. In some cases, the data is high-frequency. And in order to bring all this together at the end, I first need tools and technologies to be able to store it together. After that, it’s a matter of being able to analyze it, because we want to find correlations in this data in order to perhaps find improvements or reasons for malfunctions. In other words, it is first of all an analysis process that has to bring together the experience of those responsible for the process, the software developers, the control engineers and so on, in order to have a common basis. To do that, we again need tools that go beyond a normal Excel chart, with which we can show more complex relationships and develop further in that area. Only in the next step might we be able to train forecasts. That is where we would be in the future with an AI or machine learning model, with which we can then actually achieve improvements. And how do I bring it all into the digital realm? I need the tool, of course, to create the connectivity to the machine. I need a database where I can store the data – whether in an edge cloud or on an edge device – to be able to evaluate it at that point. Then it is simply also an engineering process to work with this data – so first of all an aid, a tool, that I need to be able to analyze processes.
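A minimal sketch of what “bringing data onto the same time base” can look like in practice, here with pandas; the signal names, sample rates and values are assumptions for illustration, not data from the Manz machines.

```python
# Hypothetical sketch: align two signals sampled at different rates onto a
# common time base so they can be correlated. Signal names and rates are
# illustrative assumptions (e.g. 4 ms drive data vs. 100 ms process data).
import numpy as np
import pandas as pd

t_drive = pd.date_range("2021-01-01", periods=2500, freq="4ms")
axis_position = pd.Series(np.sin(np.linspace(0, 10, 2500)), index=t_drive)

t_process = pd.date_range("2021-01-01", periods=100, freq="100ms")
laser_power = pd.Series(np.random.normal(1000, 5, 100), index=t_process)

# Resample both onto a shared 100 ms grid, then look for correlations.
aligned = pd.DataFrame({
    "axis_position": axis_position.resample("100ms").mean(),
    "laser_power": laser_power.resample("100ms").mean(),
}).dropna()

print(aligned.corr())
```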

Stephan

In this context, it is also very important not to focus only on the data of one machine and then leave it there, but if you find effects, then it often occurs that these are relevant in subsequent processes. An example: A weld on a machine was not successful. This is not recognized properly and it is only in a follow-up process, two or three machines later, that it is discovered that something went wrong five minutes earlier. This is also technically relevant to recognize and process. 

Wolfram

Or even if the result is just recognized as being within tolerance, there could still be major deviations in a further, later step, hence this harmonization across all process steps. 

Stephan

Exactly, or if you think about mechanical stresses there that may arise as the manufacturing process continues, if it’s boundary-pushing process results, for example. 

 

The bottom line is that process knowledge, which the customer also has, is synchronized somewhere with manufacturer and machine knowledge in order to leverage these effects, isn’t it? 

Wolfram

I think that’s really what it’s all about in the end, bringing everyone’s knowledge together. On the one hand, the data with it, but then the process knowledge from the customer, the process knowledge in the case of Manz, who deal in depth with processes, that they come together and have a common basis for discussion, on which they can then further develop processes, machines, products. It’s all a very, very highly dynamic process at this point. The products that are manufactured may sometimes not even have been completely designed when the machine is already being constructed; everything is created simultaneously, so to speak. 

 

I believe there is a lot of potential that can be leveraged in these individual processes – be it laser power, milling data, image processing of the weld seam, and so on. Wolfram, you said that it’s also about connectivity, about the topic of the edge – just as a buzzword. To put it very concretely: How do I get the data from the customer’s infrastructure to the point where someone can evaluate it and thus add value?

Wolfram

Exactly. From our point of view, it’s not about saying we primarily have to pump the data into the cloud. It’s a lot of data, a lot of machines and, as was alluded to before, simply high-frequency data. By this we often mean drive data in control real time – there we are in the range of 4 milliseconds or even faster – or even higher-frequency acceleration data from acceleration sensors. From our point of view, it doesn’t necessarily make sense to pump this data directly into the cloud. Instead, it is first of all raw data that we can process on an edge device and then either process the information directly there or store it in the cloud afterwards. This is actually a question of IT technology: how do we get through the network to the cloud, and where do we want to evaluate the data? Sometimes we use a buzzword like edge cloud and mean something in between. For this purpose, we have developed the so-called Collector. It is a service that has to meet a few requirements. It taps data from the controllers in a minimally invasive way; for example, it must not influence the utilization of the controller itself. These are sometimes the problematic details you are confronted with there. This means that it is a small, slim, lean service that can run on any machine controller, but also, for example, on an image processing computer that offers a corresponding OPC UA interface, or on a sensor device. In this environment we have not only machine controls, PLC controls or NC controls, but many devices that generate data. That’s why we need this service.
We also have to administer the many machines with their high-frequency data. That means there is a second component that is operable over the network – a browser-based app that I can use to configure all of these services. Of course, many machines have similar data that we want to collect, but then again also specific data. That is, some things we want to copy and adopt, and some things we want to set specifically. That’s why it was split into two parts. We can administer it from the office, if you will, or easily over the network, so that we don’t have sneaker administration in manufacturing. Then we distribute these services and initially collect time series data in one or more central databases, which we can harmonize, on which we can place trigger points, which we can offset. Let’s say we have several different manufacturers on a shop floor – one puts out temperatures in degrees Celsius, the other in degrees Fahrenheit. It is then necessary to say: at the source, where we want to evaluate this data later, we have everything in degrees Celsius, for example, so that the process people who look at it later do not draw any false conclusions. This is what we mean by harmonize. These are functionalities that we can support with this Collector and the Collector App. Importantly, this high-frequency collection is hardware-independent, i.e. it works across control boundaries. And if you think one step further: we’re talking about a greenfield solution right now – that means all new machines, modern interfaces. In other use cases, on the other hand, we sometimes have a brownfield situation and have to be able to communicate with various interfaces to the field, to the shop floor. This is then a relatively heterogeneous landscape. This creates a distributed application that can be administered from a central location. And from there, information is routed to the cloud, with all the dashboard topics that you can then build on top of it.
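To make the idea of a small, minimally invasive, read-only service tangible, here is a hypothetical polling sketch using the open-source python-opcua client as a stand-in. The endpoint, node IDs, poll interval and storage call are invented for illustration and are not iTE SI’s actual Collector.

```python
# Hypothetical sketch of a minimally invasive collector service: it only reads
# values over OPC UA at a modest rate and never writes to the controller.
# Endpoint, node IDs and storage are invented; the real Collector works differently.
import time
from datetime import datetime, timezone
from opcua import Client  # pip install opcua (open-source stand-in)

ENDPOINT = "opc.tcp://machine-plc.local:4840"            # assumed endpoint
NODES = {
    "axis_x_position": "ns=2;s=Drive.AxisX.Position",    # assumed node IDs
    "laser_power_w":   "ns=2;s=Laser.Power",
}

def store(record: dict) -> None:
    """Placeholder for writing into a central time-series database."""
    print(record)

client = Client(ENDPOINT)
client.connect()
try:
    while True:
        record = {"ts": datetime.now(timezone.utc).isoformat()}
        for name, node_id in NODES.items():
            record[name] = client.get_node(node_id).get_value()  # read-only access
        store(record)
        time.sleep(0.1)  # assumed 100 ms poll interval; real drive data is faster
finally:
    client.disconnect()
```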

 

True, many have a heterogeneous system landscape. With brownfield it gets a little more complex, and if I have the opportunity to plan greenfield, it is perhaps a little easier. The question that poses itself to me: on the one hand I have Manz – with the incredibly exciting process knowledge about the machines, about the special-purpose machines, perhaps also about the individual processes and how they then run at the customer – and on the other hand you, who bring the knowledge on the software side. How do I get this processed in real time now? Who takes on which part here; where is the handover point, so to speak, the interface?

Wolfram

We developed the solution in such a way that we first offer this Collector, this app, this software, to our customers, and we call it the IIoT Building Blocks – “Collect”, “Explore”, “Improve”. Those are three steps. Explore and Improve are deliberately open topics where we are also happy to provide support with open-source technologies that we have evaluated; the transition there is fluid. We deliberately want to support our customers so that they can contribute their own knowledge and capacity, because ultimately they have the process knowledge and they have to be able to work with it. It was also a goal of this development to offer a landscape, so to speak, that makes it possible not to rely on prefabricated dashboards and to implement every change at great expense, but to quickly offer the possibility of saying: okay, here a process manager who also has IT knowledge can create his own evaluations, his own dashboards, to make his work easier. We don’t want every change, every new idea, to have to go through an IT specialist; it should be presented at a very simple level so that you can lend a hand yourself and solve your challenges in an agile environment. It is a smooth transition, so to speak.

 

Stephan, I’d like to ask you once again: you record this data from the controllers, from the sensors of your machines, so to speak – using this Collector, for example. But before that, you bring in your process knowledge and sort of pre-process the data on the edge, right?

Stephan

We use iT Engineering’s Data Collector because it was important to us to have a product on hand that is capable of logging data at a high clock rate and that is also fail-safe. So if a connection to the edge device or to the cloud is lost, the data is still not lost. There is a mechanism in there that makes this possible. It’s important to simply have an all-around clean data image. The next point is that the users – the process developers, the process engineers and the commissioning engineers, who then also have to process the data – have access to the data. This means that they can select and display the data. We have a dashboard where you select, for example: I would now like to visualize the laser power of the laser in the welding module in the form of a line chart. And we want to bring that directly to the user, to the internal Manz user, so that he can display, visualize and track the data he needs for process quality and stability virtually without software development knowledge, without the specialists. So it goes hand in hand: one part is making the data available to the colleague in the first place. This is where iTE SI’s Data Collector makes a big contribution. It writes the data into the database, and from there it is taken and processed by our specialists, process colleagues and commissioning engineers. In the simplest case, the data is only visualized; in more complex cases, it is analyzed – with limit values, with offsets, with evaluations on subsequent machines, which I mentioned before. There are a variety of options, but it is as simple as possible for the user.
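Stephan’s point about fail-safety – no data loss if the connection to the edge device or cloud drops – can be illustrated with a simple store-and-forward buffer. This is a generic sketch with an invented file path and upload stub, not the mechanism actually built into the Data Collector.

```python
# Hypothetical store-and-forward sketch: samples are appended to a local
# on-disk buffer first and only removed once the upload has succeeded, so a
# lost connection does not lose data. Generic illustration only.
import json
import os

BUFFER_FILE = "collector_buffer.jsonl"   # assumed local buffer location

def buffer_sample(sample: dict) -> None:
    """Always persist locally before any upload attempt."""
    with open(BUFFER_FILE, "a") as f:
        f.write(json.dumps(sample) + "\n")

def upload(sample: dict) -> bool:
    """Placeholder for the real upload; returns False when the link is down."""
    return True

def flush_buffer() -> None:
    """Try to upload everything in the buffer; keep whatever still fails."""
    if not os.path.exists(BUFFER_FILE):
        return
    with open(BUFFER_FILE) as f:
        pending = [json.loads(line) for line in f]
    remaining = [s for s in pending if not upload(s)]
    with open(BUFFER_FILE, "w") as f:
        for s in remaining:
            f.write(json.dumps(s) + "\n")

buffer_sample({"signal": "laser_power_w", "value": 1001.3})
flush_buffer()
```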

 

Thank you, that also helped again to back it up a bit from practice. Now we’re always talking about business cases. In the end, you want to create added value somewhere for the customer who, for example, uses Manz’s special-purpose machines. Now it’s all about cost savings or maybe even moving towards new business models. So what does the business case look like on your end? What is the bottom line?

Stephan

As stated at the beginning, the main development purpose was initially to have the machine at the customer’s site more quickly – and with the quality that the customer expects. First and foremost, we are aiming to save time and money. Beyond that, we would of course like the customer to continue using the smartPRODUCTIONKIT in his production. There are now various approaches, for example a freemium model: when the plant or the line is shipped, the smartPRODUCTIONKIT can continue to be used free of charge for a while. If the customer then, hopefully, sees a benefit and added value, he can license it from us with various licensing models. He then gets updates, or customer needs can be addressed by adding more dashboards, other dashboards or analytics – multiple possibilities.

 

Now, if I can just move slowly toward the end: We talked about some challenges at the beginning. Have they been solved today? Is the project ongoing and what does the result look like for you in practice?

Stephan

We have the first machines in the field where the smartPRODUCTIONKIT is running, and also running very reliably. We have it in use in commissioning. The project is ongoing. They always say: Software is never finished. I don’t want to put it quite that blatantly, but we still have ideas about how to develop the whole thing further. And ultimately, of course, our customers, our internal customers and also the end customer, the user of the machine, always have more ideas. The product continues to develop together with the customer. And we are working on that, and of course we also have ideas on how to further develop the product in terms of service usage or learning with data. Even though Industry 4.0 has now been around for 10 years, it is a relatively young domain compared to mechanical engineering. I think there is still some potential there, also if you take the buzzword AI and link it together. You can come up with a lot of thoughts and ideas that can be used profitably. 

 

I think that even if you look at the end customers, a lot is developing there right now – also in terms of acceptance. In the end, it’s also a learning curve that you have to embrace. The whole issue is incredibly complex, and I see that every day. Wolfram, once again in your direction: You are probably also constantly developing your products. We had thought of the product a bit in the direction of the end customer. How are you still doing with your IIoT Building Blocks? What’s going on? How is it all progressing for you?

Wolfram

The Building Blocks are divided into “Collect”, “Explore” and “Improve”. The Collect part is not the value-adding part in itself. I think our motivation was to provide a piece of software that could be used like a kind of Swiss Army knife – towards interfaces with the shop floor and databases. In the use case shown – I wanted to add this to the previous point – you can see how nicely you can also map the entire product lifecycle. For example, the “Explore” module can be used here to visualize data and create transparency. I think that’s a very central point in all these improvement measures, because it’s only through this transparency that you get impulses that you might then have to implement in purely human terms. This goes all the way to machine learning and predictive models that you might be able to use to optimize quality because you’ve learned about these manufacturing steps. So I can train neural networks, for example, that might counteract if I know that the welding process is at the upper tolerance limit; maybe I can counteract at another point so that I end up with a high-quality product again. The whole thing is happening in the greenfield, but of course the development continues towards more interfaces for the brownfield. The average age of the machines in Germany or Europe – I’m not quite sure right now – is between 12 and 18 or 15 and 20 years, according to various sources. So in any case rather older. This means that we basically don’t have current interfaces like OPC UA or MQTT everywhere, but rather quite different interfaces that we have to connect in order to be able to view this holistically. Our development will continue in this area, and of course there are many other use cases that we have there.
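As a rough illustration of the “learn from the data” step Wolfram describes, here is a hypothetical sketch that fits a simple model to predict downstream weld quality from upstream process values, so drifts toward a tolerance limit can be flagged early. The training data, features, thresholds and choice of model are entirely invented and are not part of the IIoT Building Blocks.

```python
# Hypothetical sketch: predict at-risk welds from upstream process values.
# Data, features, thresholds and model choice are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Invented training data: columns = [laser_power_w, pulse_duration_us, foil_tension_n]
X = rng.normal([1000.0, 100.0, 50.0], [10.0, 5.0, 2.0], size=(500, 3))
# Assumed labeling rule producing "at-risk" welds for the example
y = (X[:, 0] > 1010.0) | (X[:, 1] > 108.0)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# A new part running close to the assumed upper power tolerance:
print(model.predict([[1012.0, 101.0, 50.0]]))  # likely flagged as at-risk
```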

 

Now that was a nice segue. I was just about to ask the follow-up question: What are the challenges of other customers? What use cases are you still working on? That’s probably not always about the commissioning process.

Wolfram

First of all, it is about creating a lot of connectivity to machines, including brownfield connectivity, that is, to older existing plants. This involves topics such as condition monitoring or predictive maintenance, which in my view is an exciting topic with a lot of potential, but also a difficult one. It is about cycle time optimizations or OEE optimizations. In many cases, it is these connectivity questions at the moment: How can we communicate with machines? How can we also write back and influence processes on the machine? These are the many use cases that we are discussing in this environment.

 

Thank you very much. Thank you both for this exciting content.

Please do not hesitate to contact me if you have any questions.

Questions? Contact Madeleine Mickeleit

Ing. Madeleine Mickeleit

Host & General Manager
IoT Use Case Podcast