Whether manufacturer or supplier: more and more companies in the automotive and manufacturing industries are demanding direct access to telemetry data from the machines involved in production. Even individual processes should be analyzable with a company's own tools. This entails requirements for digital solutions, or for corresponding capabilities of business partners and machine builders. Which requirements these are, and which use cases are already being implemented today, is shown in episode 89 of the IoT Use Case Podcast with FRÄNKISCHE Rohrwerke and daenet.
Episode 89 at a glance (and click):
- [11:56] Challenges, potentials and status quo – This is what the use case looks like in practice
- [24:00] Solutions, offerings and services – A look at the technologies used
- [41:15] Results, Business Models and Best Practices – How Success is Measured
Podcast episode summary
Like ACP Digital, daenet GmbH is part of the ACP Group and consists of approximately 160 employees who are exclusively dedicated to the topics of digitization and digital technologies. They develop solutions in the emerging technologies field. In close cooperation with Microsoft, daenet is working on the latest technologies that are on their way to becoming established on the market. They then help their (and Microsoft’s) customers to successfully implement these technologies in their products.
This is also the case with the project with FRÄNKISCHE Rohrwerke that is the subject of this podcast episode. FRÄNKISCHE’s business revolves around the development and production of a wide variety of pipes, accessories and system components made of plastic and metal. Customers come from the building construction, underground engineering, automotive and industrial sectors. The desire for direct access to telemetry data is playing an ever-increasing role for them. Every single component produced should be precisely tracked.
In this episode, you will learn how FRÄNKISCHE built a modern IoT platform for planning, monitoring and recording production processes. Reporting: Stefan Endorff (Team Leader Digital Transformation Office at FRÄNKISCHE Industrial Pipes) and Damir Dobric (CEO and Lead Software Architect at ACP Digital – daenet GmbH).
Podcast interview
Hello Damir and hello Stefan! I am very happy that you are with us today. Damir, how are you and where are you right now?
Damir
Hello! I am doing wonderfully today; apart from all the technical problems we are struggling with day and night. I am currently in my office in Frankfurt am Main.
Very nice! You’re in your office, so to speak.
Damir
Almost my second home.
Glad to have you with us today. Stefan, I welcome you to the round as well. I’m glad you took the time. Are you at the office too?
Stefan
Thanks for the invitation as well! I'm currently in my home office, but just got back from the office; that's the modern, hybrid way of working, practically speaking. Unfortunately, the sun is not shining here, so I can't really bring good weather. But we are very close to nature locally, so it's nice to be able to have a conversation in this quiet environment.
I was about to say, you with FRÄNKISCHE, I just looked on Google Maps, you are located in the direction of Würzburg, Haßfurt, Schweinfurt, that area. That’s kind of your headquarters, if you will, isn’t it?
Stefan
Exactly, around here it is called the “Wellrohr Triangle”, because most of the competitors are directly descended from us. We are located in the middle of Coburg, Bamberg and Würzburg, relatively close to various main traffic routes.
I’m really looking forward to hearing more about what you guys are doing on site in a bit.
Damir, I would start with you briefly to introduce your company in context to the topic of IoT. As I understand it so far, there is “ACP Holding Digital AG”. It was established in 2019 under the umbrella of the ACP Group, as a so-called competence center for digital business solutions in the German-Austrian region.
You’re more than 160 experts, covering AI, IoT, AR, VR, Mixed Reality, Big Data and Analytics. Among others, your company, “daenet GmbH”, belongs to this group. You have been in business since 1998, based in Frankfurt am Main, and are experts in software development, especially with a strong partnership with Microsoft, which I find very exciting.
You see yourselves as, and call this, a “Trusted Technical Advisory” for early adoption of technologies focused on cloud, IoT and AI. That's why it fits in very well, and I'm very excited to see what project you've implemented together.
Damir, you are CEO and Lead Software Architect at daenet. What exactly does your department do and which clients do you work with?
Damir
That's right, we're part of ACP Digital and we're about 160 employees who focus exclusively on digitization, digital technologies and related topics. We also belong to the ACP Group, which has about 2,000 people doing broadly comparable business. In the context of digitalization, my unit specifically, daenet GmbH, builds solutions around emerging technologies, together with some other partners.
I’m personally a Microsoft Regional Director and MVP (Microsoft Azure) and we’ve been focused on Microsoft Azure technology for eleven or twelve years and for eight or nine years, you could say we were working with IoT. We are working with Microsoft on the latest technologies, technologies that are coming in one to two years. And when these become established on the market, we help our, and Microsoft’s, customers to successfully implement these technologies in their products.
The whole thing is very exciting and insanely fun. However, it takes a lot of learning and testing time, and the things we deal with most of the time are usually not yet mature, which can be frustrating sometimes. But when something is then launched and successfully implemented, it's a great feeling to look back at what you've done in the last few years and see it running in production worldwide today.
Regardless of technologies, we feel comfortable in industrial companies, especially manufacturing. We work a lot there, but not exclusively, because IoT can also be used for building management or building administration, or even in logistics. In the meantime, we are also taking our first steps in IoT with insurance companies, something that was hardly thought possible a few years ago. There are very innovative ideas and projects that we are implementing.
Very exciting environment in which you are active. What use cases are you working on with different customers? Can you give us a quick intro to what project we’re looking at in detail today?
Damir
One topic is the smart factory, but I would like to talk about manufacturing first. We have projects here where it's not just about connecting and controlling machines. With some customers we have things like robotic control, remote via the cloud, to help build houses with 3D printers, for example.
Or we do remote imaging for inspection technologies: materials, for example plates, are irradiated with ultrasound and then inspected, like a tomography device working with ultrasound. That's another kind of IoT.
There are also projects in logistics where we can track various objects/items worldwide. Be it on airplanes, on ships, or even on trucks.
As far as IoT is concerned, the world is wide open. Especially in the last few years, we have had increasing technological possibilities to reach the smallest possible things, such as sensors, with high-level software. This enables many scenarios that were not so easily possible not long ago, or, where possible, were not affordable.
That’s what gives a whole different perspective on it now. I don’t even want to mention the topic of the cloud, what possibilities we are given here to map the scenarios that are not just tied to one company, to one specific location, but we can look at this more globally.
This also enables use cases such as “roaming”: building IoT scenarios around specific things that move, not just within one place and location, but across networks. You have to imagine an airplane: with which IP address does it take off, and with which does it land? In between, it may not have one at all. These are scenarios that were once only conceivable in science fiction movies.
I'll link the use cases you just mentioned as well; you have some really exciting ones here on the portal, and I'll put them in the show notes.
How did you two meet in the first place? How did the project come about?
Stefan
Quite classically! Of course, you follow what is going on in the IoT and industrial sector in terms of digital manufacturing and how it develops over the years. Without a concrete need, there is little budget and few resources for it. This goes well until demand comes, and then you realize that the given schedule seems extremely tight and you don't have the practical experience internally at all.
At that point, there are two possibilities when such a project comes to the table: you act as if all is well with the world and work on it for a long time, only to realize at the end that it cannot be implemented at all. Or you assess your own capabilities realistically and recognize that you need both a competent project management team and a partner who can support you technically, help build up the know-how and push you forward.
One thing must be clear: it is a matter of both exploitation and exploration, and both must be managed. You can't do that without expertise. There are enough nice PowerPoint slides, but without experience and resources it is impossible to implement something like this.
We then came in through Microsoft, as we were in the process of figuring out with our central IT how to approach the issue. In my department we also have a software development team that develops for Linux and also has a Microsoft focus. We had already worked along the classic path to the cloud, but not yet to that extent. The decision was then relatively clear that we wanted to set this up based on the potential scaling opportunities of the Azure cloud.
Microsoft then named partners, and after the first joint conversation we found each other. The usual approach is to start as small as possible; in our case this was not an option, because the target, and the expectation of concrete customer development, demanded a huge framework in a global context for us, and also for daenet.
Challenges, potentials and status quo – This is what the use case looks like in practice [11:56]
Let’s dive right into your project and understand what exactly you’ve done and what your challenges are as well.
FRÄNKISCHE Rohrwerke initially develops and produces pipes and accessories. Also system components made of plastic and metal for building construction and civil engineering, for a wide variety of sectors, for example automotive or even across the industry. You were founded in 1906, are a family business and employ more than 5,000 people at 19 locations.
What is your vision for digitization? What is the big “why” behind it?
Stefan
Digitization in the form of creating digital tools: an app for scheduling stormwater treatment topics, online systems, various videos on YouTube and the like, to bring customers, employees and potential target groups closer to the topics we work on daily.
We already have all of that, but those are just individual tools. You have to make a clear distinction with digital transformation: that is a holistic approach in which I look at the entire company with all the processes and all the employees involved; in other words, not just a tool that is needed for the current requirement, but the system landscape, the manufacturing technologies and the process landscape around it. Several other issues beyond manufacturing come into play for the topic we are discussing, and we need to look at it in an overall context.
For us, the topic of IoT was planned integratively from the very beginning, which also meant that we could not approach it with individual solutions tailored to single needs. Those usually promise short-term success, but then end up causing increased effort when they have to be integrated with other topics. So from the beginning a generic approach was desired: to create a platform rather than a point solution, so that the services you create can be used not only for the one project, but, with minor adjustments, for anything that comes our way in the future.
Our vision starts from the customer, who brings us the business and the orders, then runs via project planning into development and the ramp-up of products, accompanies value creation in parallel with the existing ERP system, and covers the whole aftermarket topic through to the phase-out of projects.
All these small keywords have a great significance in the background. The metaphor of the iceberg, of which you can only see the tip and not the rest under water, is very true. But we are also concerned with redesigning using the current technological possibilities and the wishes of the market and bringing this together.
This requires constant evaluation of technologies, where we have been strongly supported by daenet, and of customer needs in terms of sustainability, stability and security.
The buzzwords for us in the future, for technology and also for people, are diversity and hybrid solutions. We operate globally, so diversity is already a given at our company and is supported and encouraged. The hybrid solutions relate more to people's work environment.
Just as I'm sitting in my home office today after being on site this morning: it's now quite normal to be able to work independently of the workplace. The same goes for the whole issue of integration, which runs through the entire area. Because what has become clear is that pure homogeneity is stagnation, and you cannot develop with stagnation.
What are the processes behind it? How do you deal with your customers?
Stefan
From the process side, manufacturing works like this: a customer demand comes in and, once confirmed, it is resolved not only into the demand for the finished part, but into the complete planning of all required components, down to the raw materials. This then goes through purchasing, into production planning and logistics. Here, the stocks are checked until the production order arrives in the production department at one of our locations.
It may also be that it was scheduled at a different plant, depending on where the particular step is performed. This does not have to be directly at the location where the order was received, but can be another location worldwide. The production starts, makes its added value on the raw material, on the components and at the end there is the quality inspection.
Generally, the product is sent via logistics to the next location, where further finishing is carried out. The added value is further increased, components are built and in the process there are accompanying tests and final tests until the product is then finally sent to the customer via logistics.
There, too, the various data are transmitted in parallel. This is standard data transmission, so that the customer can see what is delivered where and when. Of course, we also have performance measurements, analyses, capability studies and internal failure analysis running in parallel, in case errors occur somewhere.
Thinking into this scenario, what are classic challenges that you guys have seen in this flow? So a demand comes in from the customer, an order comes in, the whole thing is produced by you. What then are day-to-day challenges that the teams on site face and also potentials towards IoT that you have seen?
Stefan
It is still easy to handle in planning if I can act and work independently of the following and preceding steps. However, the current project was about creating a complete audit trail, i.e. traceability at the individual-part level from start to finish.
From raw material procurement to shipping to the customer, to map everything across all locations is a completely different order of magnitude. The accompanying tests are just as much a part of it, and they must then also be located in the part.
Let's start with raw material testing: there are incoming inspections, documents, and tests that were done internally or provided by the supplier; they all have to be recorded and assigned to the part that comes out of that raw material batch. Then the raw material is transported; this is a delicate process in terms of making sure that the right raw material, in the right mix and in the right preparation, arrives at the appropriate manufacturing facilities. Exactly here we developed a solution with daenet, where all connections are queried again to verify that the correct connections were really set.
This is recorded, tracked, and enriched with the metadata from the production order to ensure in a truly traceable way that the right material arrives at the right extrusion line. As a rule, the first step with us is always an extrusion of the pipe.
The pipe must first be created. Then, in all the subsequent processes, this reference to the measurement data, test data, connection data, raw material and material flow is carried along, so that I can stand at the end product, across all the different manufacturing processes, and say that this test was carried out in the respective step with the result. In some cases, there are also inspection pictures that were brought from the machines in order to be able to trace the material flow.
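The audit trail Stefan describes, carrying batch, connection and test data along every production step, can be pictured as a simple data model. The following Python sketch is purely illustrative; the class, field and step names are invented here, not taken from FRÄNKISCHE's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class StepRecord:
    """One production step with its accompanying test results."""
    step: str            # e.g. "extrusion", "assembly", "pressure test"
    site: str            # plant where the step was performed
    measurements: dict   # test name -> measured value

@dataclass
class PartTrace:
    """Audit trail for a single part, from raw material to shipping."""
    part_id: str
    raw_material_batch: str
    production_order: str
    steps: list = field(default_factory=list)

    def record_step(self, step, site, measurements):
        self.steps.append(StepRecord(step, site, measurements))

    def full_history(self):
        # Standing "at the end product", list every step and its results.
        return [(s.step, s.site, s.measurements) for s in self.steps]

trace = PartTrace("P-0001", "RM-batch-4711", "PO-2023-001")
trace.record_step("extrusion", "Plant A", {"temperature_C": 212.5})
trace.record_step("assembly", "Plant B", {"insertion_force_N": 48.2})
trace.record_step("pressure test", "Plant B", {"leak_tight": True})
```

The point of the model is that the reference to measurement, test, connection and material data travels with the part across plants, exactly as described for the extruded pipe above.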
Your pipes are manufactured in such extrusion machines and the main issue here now was to ensure this exact tracking of these individual components. For example, this component may be a pipe for a particular customer, where different materials are used, which was not necessarily possible before. Is this summarized correctly?
Stefan
There have also been paper documents so far, but there I am always at the batch level, not at the single part. And tracking, or even an analysis to discover features one was not aware of before, is only possible manually, and thus effectively not possible at all.
It gets interesting in the subsequent steps with assembly, when various tubes have been formed, are assembled and external or internal attachments are connected. Here we measure certain insertion forces to know that the component is seated correctly and that it went through the correct temperature profiles during forming. There is also a final quality check at the end, where the presence and position of components is checked visually, the complete contour of the finished part is checked, a pressure test is made and the tightness is verified. This then generates a unique number for the customer, with which they can view all this data in a platform created in parallel, based on the number or the article, across the entire batch.
The customer has access to all measured data, both quality and process data, and can carry out inspections in the portal again, make analyses themselves and, if there is a failure, can look up the cause ad hoc, which gives us agility and enables us to react much faster.
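Conceptually, the unique number generated at final inspection acts as a key into all recorded quality and process data. A minimal sketch of such a lookup, assuming a simple in-memory index (the real portal runs on Azure; the function and serial formats here are invented):

```python
# Hypothetical in-memory index: serial number -> recorded quality/process data.
records = {}

def register_part(serial, order, data):
    """Store all measured data under the customer-facing serial number."""
    records[serial] = {"order": order, "data": data}

def lookup(serial):
    """What the portal does conceptually: resolve a serial number ad hoc."""
    return records.get(serial)

register_part("FR-2023-000042", "PO-2023-001",
              {"tightness": "pass", "insertion_force_N": 48.2})
result = lookup("FR-2023-000042")
```

The ad-hoc failure analysis mentioned above then amounts to resolving a serial and inspecting the stored data, rather than requesting records by phone or email.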
That is, until now such data could only be retrieved manually; it had to be requested by phone or email. Or how did that work so far? How has the customer done it until now?
Stefan
The production orders and metadata from the ERP system were already digital, of course, but the records on machines, speeds, temperatures, these were recorded manually until then. The test characteristics were recorded manually, there are still accompanying tests, where employees make tests, take out parts, perform destructive tests and the like.
This was also recorded manually, but not yet assigned to a part or batch; it lived in a dedicated system. So we had all these different existing systems, plus the IoT, which now provides data automatically, linked up by daenet, who connected all the machines. That included bringing in machines that were not yet on the network, some of which were not even prepared for it: they met the technical specifications, but were not yet prepared in terms of connectivity, and all of that has now been added.
Solutions, offerings and services – A look at the technologies used [24:00]
You started the project together. What were technological requirements for the solution?
Stefan
The basic requirement at the start of the project: it was not possible to estimate what resources would be needed, in terms of IT, hardware, network and the like. You can’t go blindly into procurement to purchase equipment, especially at a time when equipment was not available.
Global supply chains escalated, from the chip manufacturers all the way to the edge devices, i.e. the industrial PC manufacturers: delivery times jumped from a few weeks to one and a half years. This is where the cloud, which we had used before, showed its advantage particularly clearly. We were able to run before we learned to walk, and are now adjusting the scale; in some cases resources are being optimized so that we can reduce costs. We could simply book the resources we needed at the push of a button. That was the biggest advantage of the cloud.
Just the other day, we painfully noticed what it’s like when Microsoft has hiccups and our production comes to a standstill. But I have to say positively, within an hour the problem was solved; that’s when the contingency plans took effect.
We also had such issues in the run-up this fall, when a patch did not work. Here, too, Microsoft provided support. That was not as critical, but it took a week. It does require some thinking about alternatives.
I'm still excited about the cloud, though. That takes nothing away from it, but you do have to point out the one or two negatives, because they are often used as criteria. You have the same problems in the physical, local world, the same choices, and you have to be able to react there as well. As I said, it was commendable how quickly it was resolved.
Basically, digital manufacturing requires a stable and secure infrastructure, both within the company and on the connections, which have to provide the best possible bandwidth. At the moment we are still transferring images and measurement data, but once video comes into play, the bandwidth you have and also the resilience become more important.
Together, you have built an IoT platform via Microsoft Azure for planning, monitoring and also recording these processes that you have mentioned. Who did what part of the project?
Stefan
The first thing we did was assess the situation and hand the whole IoT topic over to daenet. We made a complete project definition: what has to be done at all, what scope we have in terms of machines, how many plants are affected. We also had other topics; for example, our MES system had to be completely revised. This is where we got into the joint work, taking care of the back end.
Raw material tracking and monitoring then also flowed into the front end, which can be operated via the interface as well as via other IoT devices, such as handheld scanners. The employee doesn't have to configure everything by hand in the interface and then check whether it fits; they can simply walk through the shop floor with the scanner, scan their connections and their production order, and the rest is done in the background by the software.
With this in mind, the user interface was also designed to provide an optimal user experience. We have taken the path of optimization here to make it as simple and self-explanatory to use as possible, as well as using current technologies so that we are device-independent.
It is web technology, which is now the standard, and it can be used on stationary or mobile devices. Card readers and RFID readers can also be connected to the solution for easier operator use, so that the user does not have to log in again each time, but can simply badge in with their card and scan their assets.
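The scanner flow described here, scan a connection and a production order, and the software resolves the rest in the background, can be sketched as a small dispatcher that classifies scanned codes. This is an illustrative sketch only; the code prefixes and the completeness rule are assumptions, not FRÄNKISCHE's actual formats.

```python
def handle_scan(code, context):
    """Classify a scanned code and update the order context in the background,
    so the operator only scans instead of configuring the UI by hand."""
    if code.startswith("PO-"):          # production order (hypothetical prefix)
        context["order"] = code
    elif code.startswith("CON-"):       # material-flow connection (hypothetical)
        context.setdefault("connections", []).append(code)
    else:
        raise ValueError(f"Unknown code format: {code}")
    # The context is complete once an order and at least one connection exist.
    context["ready"] = "order" in context and bool(context.get("connections"))
    return context

ctx = {}
handle_scan("PO-2023-001", ctx)
handle_scan("CON-LINE-07", ctx)
```

The design choice mirrors the interview's point: the operator walks through the physical shop floor scanning, and the software assembles the digital context without manual configuration.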
It was about the technical solution as well as process optimization, with the least possible extra effort from the new solution, which first has to be accepted and trained, because everything new is initially off-putting, until the operator realizes that there is also a benefit, an advantage. That's why emphasizing the user experience was a big feature: if we have user interfaces, they have to be as simple and understandable as possible.
This is the holistic digital transformation: each individual has to learn the ropes; this is something new, a change taking place in your company. daenet built up and also supports the IoT part, and the data analysis or evaluation is a part that lies with you, isn't it?
Stefan
Both. We brought in another partner to execute all the Data Factory topics in the Azure cloud par excellence. The advantage is that even as a normal Microsoft user you can get to grips with tools like Data Factory very quickly. For scalability, sustainability and stability, there are more professional approaches.
We received support here and, building on this data forwarding and processing in other systems, made enrichments for the customer portal, so that we could concentrate on the visualization there and, of course, integrate existing systems with it.
Can you explain how you guys did the data connectivity? How does data recording work?
Damir
That's a very good trick question. Such a project usually starts, apart from the business requirements, with data acquisition, and at this point it can also happen that the project doesn't come about at all because the data doesn't deliver what we would like. Many say they have a lot of data and would like to do something with it, but there is nothing in that data that would satisfy a business requirement.
Here, it’s not just that we have to connect the machines. In the end, a pipe comes out, but it’s remarkable to see the complexity behind it; I’m not talking about software.
Let’s move on to software and simple requirements. I now collect the data, for example, from the machine. It’s not just machines involved here, we also have other systems that we integrate and we also have other devices, such as a scale, a camera or for example a barcode scanner, QR code scanner and so on. All this must be solved somehow.
That is the reason why we decided to pick an IoT platform. In this case, we are Microsoft partners and we like to do this because we believe that Microsoft gives us such a ready-made platform so that we don’t always have to start from scratch, which is also good for us and, of course, also for our customers, or rather for our customers’ customers.
How have we connected the machines here now? In this case, we connected the machines mainly via OPC UA protocols. This is common practice in Germany, although not in the rest of the world.
There are about 230 to 250 different protocols on the market, and no one really knows how to handle them all; there is hardly a platform that simply maps all of this. Therefore, at some point, German industry took the initiative and created OPC UA, the successor of the OPC of old.
Regardless of whether or not that would really be the best solution in every scenario, it’s a standard for now, at least at our location. That is what we very much welcome.
We now have OPC UA, and this component, which in this particular case comes from Microsoft, has to be installed somewhere near the machine, technically speaking. It doesn't have to be bolted onto the machine, but the component must communicate with the machine and do something with that data.
The part we use is a kind of gateway, and implementing such a gateway yourself is a product in its own right; it could be a matter of years. The beauty of the Azure IoT platform is that there's a pre-built concept called Azure IoT Edge, and on top of that come a couple of components from Microsoft that provide security, automatic updates and everything you know from Microsoft. The actual intellectual property, the implementation of the business cases, is implemented in so-called “modules”.
For those of you who are tech-savvy: we used .NET and C# in this project to fuel these modules with code.
This is only part of the solution. We get the data out of the machine with these modules, in this case via OPC UA. But for a scale or a QR code scanner, for example, we actually have to develop the driver software ourselves, which is also deployed as such modules. The data is then pulled out of the machine and automatically fed into the cloud according to a standard, using certain protocols from Microsoft.
The component at Microsoft that receives this data is called the IoT Hub. And once the data is there, the real fun begins: it can be stored somewhere and, among other things, analyzed, so that it appears nicely to a user in an application and something can be done with it. This is what we call the “telemetry flow”: feeding data from specific devices, machines or sensors into the cloud.
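Conceptually, each module wraps machine readings into a telemetry message before handing it upstream towards the IoT Hub. The Python sketch below only illustrates that envelope idea; the real modules in the project are written in C# on Azure IoT Edge, and the device and field names here are invented.

```python
import json
import time

def build_telemetry(device_id, readings):
    """Package raw machine readings (e.g. values read from OPC UA nodes)
    into one JSON telemetry message, the unit that flows device -> cloud."""
    return json.dumps({
        "deviceId": device_id,
        "timestamp": time.time(),
        "readings": readings,
    })

# Hypothetical example: an extrusion line reporting two process values.
msg = build_telemetry("extruder-07", {"melt_temperature_C": 212.5,
                                      "line_speed_m_min": 14.2})
payload = json.loads(msg)
```

In the real platform, such a message would be sent over the IoT Hub's device-to-cloud channel rather than merely serialized locally.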
The interesting thing is, that's not the only direction. Sometimes we also have to take control of the machine. That is exactly what is interesting about these cloud technologies: how do I get from the cloud to a machine or to a sensor? To do this, you need a platform that makes this relatively easy, but also secure. You have to imagine that a machine or a gateway is pretty well protected, at FRÄNKISCHE or anywhere else; you can't just get your hands on it. There's a firewall and whatever other technology behind it. There are special protocols and various procedures that we use to make such things possible.
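In the Azure world, this cloud-to-machine direction is typically handled with mechanisms such as direct methods or device twins. The sketch below only simulates the dispatch idea, a named command from the cloud routed to a registered handler on the gateway; it uses no real Azure SDK calls, and all names are hypothetical.

```python
# Hypothetical command dispatcher, mimicking how a direct-method call
# arriving from the cloud is routed to a handler on the gateway.
handlers = {}

def on_command(name):
    """Decorator registering a handler for a named cloud command."""
    def register(fn):
        handlers[name] = fn
        return fn
    return register

@on_command("set_line_speed")
def set_line_speed(payload):
    # In reality this would write to the machine, e.g. via OPC UA.
    return {"status": 200, "applied_speed": payload["speed_m_min"]}

def dispatch(name, payload):
    """Route an incoming command; unknown commands are rejected."""
    if name not in handlers:
        return {"status": 404}
    return handlers[name](payload)

resp = dispatch("set_line_speed", {"speed_m_min": 12.0})
```

The security properties Damir mentions (firewalls, protected gateways) are exactly why such commands travel over the platform's authenticated channel instead of direct network access to the machine.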
There is one more difficulty with these things. It's all nice when it works, but one question remains: how does the software get there? Traditionally, someone would sit down, plug in a DVD or CD, and then it would run.
But the complexity at FRÄNKISCHE is such that colleagues keep coming up with new ideas, and they don't want to wait three years for someone to travel somewhere or install anything from a CD. This has to happen immediately. We solved this together as part of a DevOps strategy, with certain other tools from Microsoft, for all these modules and gateways that we have.
We compile automatically, we test automatically and we install automatically. And not into the cloud, but directly into their cluster, which in this case is Kubernetes, where we host, deploy and install the whole thing. All these devices are in an IT environment that is closed and controlled.
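As an illustration of the final step of such a pipeline, the rollout into the edge cluster can be described declaratively. The following is a generic, hypothetical Kubernetes manifest sketch, not the actual deployment used in the project; the module and image names are invented.

```yaml
# Hypothetical Kubernetes deployment for one gateway module,
# rolled out automatically at the end of the CI/CD pipeline.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: opcua-gateway-module
spec:
  replicas: 1
  selector:
    matchLabels:
      app: opcua-gateway-module
  template:
    metadata:
      labels:
        app: opcua-gateway-module
    spec:
      containers:
        - name: gateway
          image: registry.example.com/iot/opcua-gateway:1.0.0
```

Because the manifest is declarative, updating a module amounts to the pipeline pushing a new image tag and reapplying it, which is how "install automatically" works without anyone visiting a machine.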
DevOps is the collaboration, or methodology, between software development and IT operations that you set up there. If anyone still has questions, you're open to detailed technical queries. The data analysis, you said, is done partly by yourselves, partly with partners. What does the user interface look like for you? What is the outcome in the end?
Stefan
It depends on who the user is. For us internally, there is an archive function and also a graphical representation, currently for the raw material topic, which is shown visually: this is the pallet with the raw material, here is a line depicting the conveying line, numbered with the connectors, and then comes a processing plant of whatever kind, even if it is only drying.
I have a distribution to the plants so that the employee sees it as if they are standing in front of it in the production. The symbols are based on reality, as is the hand scanner, where the user walks through reality and scans and this automatically ends up in the digital world. This provides the user with an archive in which they can find out in a very simple list form what was transported via which line and when, with which raw material, and with just a few clicks they have a complete overview of the life of the part.
The second perspective is the customer portal, again a web interface, which is also available externally (the first topic is of course internal only). Here there are all kinds of charts and table views, and filters can be set. For our internal use, there are also various analysis options, similar to Excel.
There are already plans for follow-up projects to automatically extract characteristics from the growing data using machine learning or artificial intelligence, and then to generate added value from the data rather than just costs.
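A very first step toward such automatic characteristic extraction could be as simple as flagging outliers in a telemetry series. The sketch below uses a plain z-score rule on the Python standard library; the threshold and the example values are invented for illustration and do not come from FRÄNKISCHE's data.

```python
from statistics import mean, stdev

def flag_outliers(values, threshold=3.0):
    """Return indices of values more than `threshold` standard
    deviations from the mean (a simple anomaly heuristic)."""
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mu) / sigma > threshold]

# Example: a temperature series with one suspicious spike.
temps = [220.1, 219.8, 220.3, 220.0, 260.5, 219.9, 220.2]
print(flag_outliers(temps, threshold=2.0))  # flags index 4, the spike
```

Real feature extraction would of course go far beyond a z-score, but the point stands: once the data is collected live, even trivial statistics start paying back the cost of collecting it.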
Results, Business Models and Best Practices – How Success is Measured [41:15]
What's the business case for you, for your company? Have you done any return-on-investment calculations or similar considerations? Can you give us some insights?
Stefan
Of course, we also try to get the benefit out of it. Since it was a concrete, project-related order, the cost accounting was done as part of the project. Beyond that, of course, you have to look at future orientation and competitiveness.
A major upheaval is currently taking place, particularly in the automotive industry. Pioneers from America are entering our German market, our territory, with their electric drives. Companies that have been broadly and well positioned for years now often get into trouble because these newcomers have a completely different mindset. New players are entering the market.
Our domestic automotive industry is also going ever deeper into digital networking, getting into topics that were previously not its core engineering domains, and naturally wants suppliers who can be integrated seamlessly; so the shift toward digital production is advancing step by step. In the medium term, this will increasingly become a basic requirement. It can be met with solutions tailored to actual needs. Blind actionism does not help either; it is not sustainable and costs more than it delivers.
Together with daenet, we are moving forward on the IoT topics with a hybrid approach. We have not merely captured, stored and displayed data; an interaction with the cloud was mapped, camera-based test processes were developed natively for us by daenet, and so were the connection and communication between processes and machines.
This whole virtual and physical world is now merging seamlessly and will relieve employees in the future. Many administrative, repetitive activities can be taken over by the redesigned digital processes. User interfaces are being simplified and adapted to the user, and assessments are increasingly backed by facts where gut feeling used to rule; data bases are emerging from which really robust analyses can be made.
In conclusion, are there any insights from your joint project that you would like to share? Best practices to draw on, or pitfalls for others looking to implement such projects?
Stefan
The most important thing is communication. Fortunately, we already had DevOps in use internally. As a collaboration platform it was excellent: you get the complete project documentation, transparent and comprehensible, as well as the communication and all the software modules and building blocks that were developed.
This entire technical process, including provisioning, is mapped via this platform. And in the manufacturing areas, staffing is lean these days; many are also trying agile. You can see, especially with large customers, for example with the Audi E-Tron or the ID.4 from VW, how difficult such large customers find it, despite resources we can only dream of, simply because of the one-sided switch into the field of software development. That is what we managed with Damir and with daenet: internally we already had this mindset. We had to motivate the rest and pull them along, but because the partner was already working 100% in this area, it wasn't that hard for us.
You have to see very clearly that we are also suppliers to the major manufacturers via OEMs. Especially in the premium sector: if someone buys a car for €130,000, they don't want to be stranded somewhere with a software problem and a car that no longer drives.
Damir
It was a very complex project, but the nice thing about it, from my perspective, is that it was a very good approach to bring two different worlds together: the OT world, this automation world with the machines, and the IT world with its new, modern software technologies.
Personally, I very often find that there has always been a misunderstanding between these two worlds. The software people may not understand what the people or the machines are doing. For them, the cloud is probably something that could just as well be left out.
We have really shown here that you can apply modern approaches to software development. This is about a revolution in the way industry does things with software, and that is what the cloud enables.
I would recommend that anyone who hasn't yet embarked on this journey definitely check whether it is really a good idea to ditch the cloud. I'm not suggesting the cloud is the answer for all solutions; that's never going to happen. Here, perhaps 50% of the solution runs "on edge", not in the cloud at all, and it will stay that way, which is fine. But all these questions of how things are done, the speed at which a new idea is implemented, completely transformed and redeployed, change entirely.
This efficiency is remarkable. I don't think Stefan and his colleagues are even in a position yet to measure the possibilities this opens up. That will only become apparent in a few years, when they develop their potential and creativity and see that it is all supported. This is not to be underestimated.
These are endless possibilities that you have created with this IoT platform, aren’t they?
Stefan
Right, definitely. We have laid the foundations, so to speak. Now the focus is on deepening and stabilization. Every employee in every plant now has to get to grips with the platform and with the requirements, which have also grown as a result. But we have now laid the foundations so that we can tackle the issues we have been eyeing for a long time but could not address, such as predictive maintenance or automatic error and anomaly detection. We can address them now because we now have the data live.
In my department, we have a separate analytics area; they now have data sets with which they can build and train models and gain first experience. At the moment, we are not even fully aware of what we will be able to do with this in the future. The more we roll this out across all plants and the more employees understand it, the more ideas will come from the employees, because this is not a one-man show. There were always at least two partners involved, usually three or four. We first had to bring all the external machine manufacturers on board, because there really was no machine manufacturer who could deal with these techniques. They all advertise Industry 4.0, they all have OPC UA, but the integrative approach of bringing that into the platform, connecting it, and letting the machines communicate with each other, none of them could offer. Here we had to involve our own employees as well as the external suppliers.
It's a journey that a wide variety of companies are on right now. A big thank you to you both for sharing your project so openly, and thank you for being with us today. Really a very exciting topic; I'm curious to see how it develops in the future.
Thank you very much! See you then!