Medium-sized machinery and plant engineering – laying the cornerstone for predictive maintenance

Listen to the IoT Use Case Podcast on Spotify.
Listen to the IoT Use Case Podcast on other platforms.

IoT Use Case Podcast #70, consileo + elastic

Episode 70 at a glance (click to jump to a chapter):

  • [06:54] Challenges, potentials and status quo – This is what the use case looks like in practice
  • [14:32] Solutions, offerings and services – A look at the technologies used
  • [30:30] Results, business models and best practices – How success is measured

Podcast episode summary

More and more companies in the mechanical and plant engineering sector are focusing on further development and on finding and expanding new areas of business.

The IT service provider consileo has made it its business to design, build, deliver and support software, both in the IoT and in the cloud environment. As a further segment, consileo covers security and compliance. New technologies, co-developed by mechanical engineer and managing director Sebastian Fischer, are intended to deliver noticeable and measurable results for customers.

On the other side is Elastic, a kind of “booster” for search queries that also serves as an analytics platform. Elastic supports businesses and individuals with software solutions that process data at the largest scale in near real time. David Schreckenberg, Enterprise Account Executive, explains in detail which tools are used to process the data.

In today’s episode of the IoT Use Case Podcast, we find out how these two companies came together, what kind of IoT platform the special machinery manufacturer Tracto-Technik has developed together with Elastic, and how data ingestion, processing, evaluation and analysis work in this context.

Podcast interview

Welcome Sebastian and David. Sebastian, what I found very exciting when we met was your personal journey. You have many years of professional experience in a medium-sized mechanical engineering company, but you have also dealt specifically with IoT topics and business development. That is particularly exciting in the context of new business areas. I think a lot of people out there are new to this segment and are just learning the ropes. How was this journey for you?

Sebastian

That was very exciting, no question about it. I’m a classic mechanical engineer, and with the topic of digitization you get into completely new areas. It also has a bit to do with the fact that I personally have a great affinity for new technologies, like to try out new things and have always thought innovatively. That makes it easier to get into the topics. Of course, it is a new world: software development, all the terminology that comes up. The way of working is different, the technologies are completely new. It takes quite some time to get used to it.

 

In addition to the actual development, there are also legal questions. How do you deal with marketing? How do you distribute digital products in the first place? At that point you are deep in business development, yet far away from development itself.

 
Very true. It’s a real buzzword jungle with all this terminology, and that is exactly where we want to bring some clarity.
 
Some context on your company: you are part of consileo’s management. What exactly do you do there, and what does consileo do?
Sebastian
Consileo is an IT service provider that essentially focuses on two areas: on the one hand software development, i.e. developing creative new things, especially in the IoT and cloud environment; on the other hand, compliance and security. The two topics fit together very well. When you start a new development, you have to think about security right away; there is a legal framework that applies immediately, the GDPR for example. My job is to build up a new business unit that brings exactly these topics to machine builders and to the manufacturing sector.
 
With my mechanical engineering background and digital know-how, I know how companies work. This is an ideal combination to start exciting projects with the customer, I think.
 
That means you’ve moved from the customer side to the service provider side, right?
Sebastian
You could say that, yes. I also find that very exciting, because now you have the opportunity to support customers right from the start and across a variety of projects and topics, rather than being stuck on a single one. I found that thought very appealing.
 
Absolutely. David from Elastic is also with us today. I would like to know: how did you two get together?
Sebastian
The service provider who did the development at the time had the idea of using Elastic for this case, which is actually a rather atypical choice. That is how the contact with David came about. We immediately got along very well, both in terms of content and personally. David, I think, found the case quite interesting. So it came about spontaneously, and we now enjoy talking to each other a lot and often.
 
David, you’re what’s called an Enterprise Account Executive at Elastic. You build long-term partnerships with customers. Elastic is a publicly traded company, for a start. You are not just any company; you focus on software solutions that can process and handle data in real time and at scale. You work with huge brands: Cisco, eBay, Goldman Sachs, Microsoft, NASA, you name it; I believe Wikipedia is also among them. Even names you wouldn’t directly associate with the industry.
 
What are you doing together with medium-sized mechanical engineering companies?
David
I was also surprised at the time when Sebastian approached us together with his colleague, because we are used as a technology by many customers in many areas. At our core, we are a search technology, you could also say an analytics platform, for whenever it comes to making large and small amounts of data searchable with high performance. Whether that’s searching a web store or looking for anomalous conditions in an IoT landscape, an IT landscape or in cyber security, i.e. whether anything is happening that shouldn’t be happening, Elastic is a very well-suited technology for this purpose.
 
When Sebastian approached me, I was surprised, because I don’t have a mechanical engineering background myself, but a background in power engineering, where I worked for many years. And I was very impressed at first with what the two of them had achieved at Tracto, namely an IoT use case: the monitoring of the devices, of the machines that Tracto builds. That was already very exciting and impressive.
 

Challenges, potentials and status quo - This is what the use case looks like in practice [06:54]

On the one hand, we want to understand your technologies, especially the search technologies and what they are all about. On the other hand, from consileo’s side, which projects there are. Sebastian, what use cases and real-world projects did you bring with you today?

Sebastian

It is about the customer Tracto. Tracto builds special machines for trenchless pipe laying. So when pipes have to be laid trenchlessly under roads or rivers, where you don’t want to interrupt traffic and where there may be obstacles in the way, Tracto produces the right machines. Tracto had the major goal of further advancing the topic of “digitization” – digitization on several levels in the company, including in the product area. The question was: what does digitization mean in the product, both for the customer and for the end user who ultimately operates the machine? And what benefits can Tracto itself, as a company, derive from digitization?

What are we talking about in concrete terms? Tracto has developed and built an IoT platform with us where machine data is collected. This data is used at various levels. On the one hand, the end user uses it to log on-board data, for example, or to do things like fleet management – to see how the machine is doing and what it is doing. On the other hand, it is of course also used to optimize internal service procedures and processes at Tracto. I don’t want to talk about predictive maintenance yet; first it is about having the ability to see, based on data, what the machines are doing in the field. How are the machines doing? What errors occur? What anomalies might you see in the data? That makes it very exciting, and it puts Tracto in a position to offer proactive rather than reactive service in the future.

 

Put that together and you have a great benefit: once on the customer side, which automatically receives data from the machines for logging, and also for Tracto itself, which is able to provide better service.

 
You said it’s about huge special machines and drilling. Can you describe the day-to-day job of your client? These machines are somewhere out in the middle of nowhere, on a road or a construction site, and they are drilling?
Sebastian
Exactly. Tracto sells the machines to its customers, and Tracto’s customers carry out the drilling. So the machine can be out in the field or on a road, performing a bore. Tracto-Technik itself has to provide the service: if a machine is ever defective or breaks down, service has to be performed on it. So you have to distinguish between the two a little.
 
When we talk about topics like service, what were the initial challenges that Tracto started from? Where did they recognize: oh, there is potential here that can be leveraged in the individual processes?
Sebastian
One of the challenges at Tracto was that the machines were blind in the field. That is, they had little or no information about the condition of the machines out there. Whenever there was a failure or error, the customer actively called Tracto and some kind of telephone troubleshooting was done. This ties up capacity, but it also ties up know-how.
 
If I want to scale in the future, bring more machines into the field and sell more machines, then I have to make sure that my service organization can grow with that. It can’t do that if it depends on specific know-how carriers and manual processes. That was ultimately the trigger to say: okay, Tracto has to think about this and tackle topics like data collection and IoT. On the other hand, there are Tracto’s end customers, who have documentation requirements of their own. Imagine documenting a borehole by hand – almost inconceivable in this day and age, when the data is basically already there. So several benefits were created at once: the customer gains digital capabilities along with process improvement and simplification, and Tracto itself can streamline and simplify its processes.
 
The service organization that has to grow – very good point. That also raises the question: why am I doing IoT at all? Remote maintenance already exists. The issue of “I have a service contract, so why IoT now?” – why should I go one step further? In the end, it’s also about a service organization that depends on manual processes and on knowledge that sits in people’s heads, right?
Sebastian
Absolutely. The topic of scaling and internationalization in particular is something you have to think about. Machines located anywhere in the world also need 24/7 support, assistance and service. There is no way for me to cover that other than digitally.
 
IoT is the next logical step. Topics like remote maintenance existed before; that’s nothing new. The logical evolution is IoT with active monitoring and active alerts that come up when things are not running normally.
 
In the end, it’s always about data. Can you tell us what data is interesting here? Which is particularly exciting for your customer?
Sebastian
Essentially, it is the operating data of the machine. The classic operating data, such as data from the diesel engine on board: what engine temperatures do we have? What are the hydraulic pressures? What torques and forces result there? What data is relevant for the actual drilling process? This goes all the way to errors that the machine produces, which you also want to see.
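
To make this more tangible, here is a purely hypothetical sketch of what a single telemetry record from such a machine could look like. The field names and values are illustrative assumptions, not Tracto’s actual data model.

```python
# Hypothetical telemetry record -- field names and values are illustrative
# assumptions, not Tracto's actual data model.
reading = {
    "@timestamp": "2022-09-01T10:15:00Z",   # when the sample was taken
    "machine_id": "machine-001",            # which machine sent it
    "engine": {"temperature_c": 92.5, "rpm": 1800},
    "hydraulics": {"pressure_bar": 210.0},
    "drilling": {"torque_nm": 4300, "thrust_kn": 120},
    "error_codes": [],                      # active fault codes, if any
}
```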
 
One more question about your customer’s challenges. That is also what it’s about for you: you built a system and accompanied the customer through it. What were the requirements where your customer said: this absolutely has to be there, we need it, it must work?
Sebastian
In the end, several things were important. A key point is that the machines are in the field, and it is quite possible that they are located in regions where network coverage is not one hundred percent. The machines had to be able to buffer data and transmit it later, so that I still get the full picture.
 
Another important requirement was that the data should be presented in a simple form and be accessible to different areas of the company. We talked a lot about service, but ultimately product management is also interested: how do the machines perform in the field? What are the average operating hours per month? Or the sales organization would like to know how many operating hours a machine has run, so that the customer can be approached proactively about buying a new one.
 

Solutions, offerings and services - A look at the technologies used [14:32]

Together with Tracto, you built the IoT technology, or rather the architecture, and co-developed the app. I’d like us to work our way from data ingestion to processing to evaluation and analysis. How does the data acquisition work, and with which hardware?

Sebastian

The data that is generated essentially comes from sensors and actuators on the machine. Pressure sensors, position sensors and speed sensors provide signals to a machine controller, a PLC, which in turn passes the data to a gateway. This is a PC that receives the data, processes it accordingly and then sends it to the cloud. It also ensures that data is buffered if there is no network coverage. This was the main use case for which we approached Elastic: they offer a very lean and elegant solution for it.
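
To illustrate the buffering idea Sebastian describes, here is a minimal Python sketch, not Tracto’s actual gateway software: readings are spooled to disk while the machine is offline and forwarded once connectivity returns. The ingest URL and spool path are made-up placeholders.

```python
import json
import time
from pathlib import Path
from urllib import error, request

# Made-up placeholders -- not from the episode.
INGEST_URL = "https://example-ingest.cloud.example.com/machine-data"
SPOOL_DIR = Path("/var/spool/machine-data")


def store_locally(reading: dict) -> None:
    """Buffer a reading on disk while the gateway is offline."""
    SPOOL_DIR.mkdir(parents=True, exist_ok=True)
    fname = SPOOL_DIR / f"{int(time.time() * 1000)}.json"
    fname.write_text(json.dumps(reading))


def try_send(reading: dict) -> bool:
    """Attempt to forward one reading to the cloud; return False if the network is down."""
    req = request.Request(
        INGEST_URL,
        data=json.dumps(reading).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        request.urlopen(req, timeout=5)
        return True
    except error.URLError:
        return False


def handle_reading(reading: dict) -> None:
    """Send immediately if possible, otherwise buffer for later."""
    if not try_send(reading):
        store_locally(reading)


def flush_spool() -> None:
    """Re-send buffered readings once connectivity is back, oldest first."""
    for f in sorted(SPOOL_DIR.glob("*.json")):
        if try_send(json.loads(f.read_text())):
            f.unlink()
        else:
            break  # still offline, keep the rest buffered
```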

 
David, now I have the data in this gateway and it is being sent over an appropriate network, but how exactly does the data processing work on your side?
David
In this specific case, we are using a component called Filebeat. From a software point of view, this is a very lightweight little thing that you install wherever the data needs to be tapped – in this case the small PC that sits on the machines and sends the data from there towards Elasticsearch. Filebeat is a component from the ingest layer of our platform idea. There are different levels in the platform: for example the visualization layer, where you can build analytics and display different things, and the data storage, which is Elasticsearch proper – it’s right there in our company name. Below that is the layer made up of various small software components. I’ve already mentioned Filebeat, but there are other options that can be installed quite flexibly wherever data originates, to tap it and send it towards Elasticsearch.
 
Filebeat in the sense of “file”, as in files, right? A type of data packet that gets sent.
David
That’s right – log files, normal files. In Tracto’s case, they are sent via Filebeat to our managed cloud service. We offer our technology in several ways; at Tracto, it’s the variant where we provide the platform as a service in the cloud. For Tracto, Sebastian and consileo, this has the advantage that it can be scaled up and down very flexibly, without having to worry about compute – which is not a trivial subject these days, with delivery times and so on. You can start flexibly: start small and grow into it over time.
 
That is, I have the PLC as a data source with the individual sensors and actuators. Then the state information, error logs, whatever, are sent to your Elastic Stack, where all that machine data is aggregated and processed so that it can be analyzed in the next step. Sebastian, do you have any additions from the field?
Sebastian
What’s important at this point, both for Tracto and for us as the implementation partner, is the ability to process the data quickly and retrieve it quickly. This is especially important for the analysis case. I can write data into the database very quickly, but I can also get it out again quickly. If I make complex queries against the database, such as: give me the average of the operating hours of all machines in the last month – I want that to be fast and efficient. That was a major factor in choosing Elastic, because that’s what the database is made for.
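
For a query like the one Sebastian mentions, the average operating hours of all machines over the last month, a minimal sketch with the official Elasticsearch Python client could look like this. The deployment URL, API key, index and field names are assumptions, and older 7.x clients expect the request body via a body= argument instead.

```python
from elasticsearch import Elasticsearch

# Hypothetical deployment URL, API key, index and field names.
es = Elasticsearch("https://my-deployment.es.cloud.example.com", api_key="...")

resp = es.search(
    index="machine-telemetry",
    size=0,  # only the aggregation is needed, not individual documents
    query={"range": {"@timestamp": {"gte": "now-1M/M", "lt": "now/M"}}},
    aggs={"avg_operating_hours": {"avg": {"field": "operating_hours"}}},
)
print(resp["aggregations"]["avg_operating_hours"]["value"])
```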
 
We are talking about large volumes of fast-moving data. David, how do you handle that data with your technology? Why is this important?
David
It has to be said, it’s not always important, but in more and more use cases it is becoming important to be able to make data searchable in near real time. How we do that is basically Elastic’s secret sauce: the distributed logic. Elastic is a very scalable platform; it doesn’t really matter whether we’re looking at a few gigabytes or several petabytes of data. It scales well horizontally, in the sense that I don’t keep making one machine bigger and bigger – at some point you hit a physical limit – but instead I can parallelize compute. I can place compute side by side and thus handle the largest amounts of data with high performance.
 
That’s why, in projects where there are high demands on search performance and you want to work quickly with data – whether it’s an IoT use case or cyber security – you don’t want a query answered overnight; you want to know quickly what’s going on. Here Elastic has, it’s fair to say, become a standard for many things.
 
Just thinking about your other huge customers, that’s masses of data flowing there, so this shouldn’t be a problem for you. To go deeper: how does the evaluation of this data actually work? Because at the end of the day, you want an application somewhere that makes this data available, for example to product management.
Sebastian
There are several ways to evaluate the data, and Tracto uses several of them. On the one hand, there is Kibana from Elastic as a visualization tool. From my point of view, this is a pro tool for looking deep into the data.
 
Then there is also the possibility of passing the data to Power BI, Microsoft’s standard reporting tool. This is used in the company by controlling, among other departments. The reports there tend to be general overviews rather than deep visualizations. These are exactly questions like: give me the average consumption, the average speeds of the last three days, or where have the machines been?
 
The third option is to access the data directly via an interface at Elastic, for example to feed custom-developed applications with data. In this case, that is the application for the customers, who can access their machine data and view their drilling data there.
 
There are more options, but Tracto uses just these three.
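
For the third option, feeding a custom application directly from Elastic, a hedged sketch of such a query with the Python client might look like this. The machine ID, index and field names are invented for illustration.

```python
from elasticsearch import Elasticsearch

# Hypothetical deployment, index and field names -- purely illustrative.
es = Elasticsearch("https://my-deployment.es.cloud.example.com", api_key="...")

# Fetch the most recent drilling records of one customer machine.
resp = es.search(
    index="drilling-logs",
    query={
        "bool": {
            "filter": [
                {"term": {"machine_id": "machine-001"}},
                {"range": {"@timestamp": {"gte": "now-7d"}}},
            ]
        }
    },
    sort=[{"@timestamp": "desc"}],
    size=100,
)
for hit in resp["hits"]["hits"]:
    print(hit["_source"])
```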
 
Kibana is an open platform, or rather a software solution, that you additionally used to solve this specific problem, right?
David
Kibana is the software component in our platform that is responsible for visualization. Data is brought into Elastic through the indexing layer and then sits in Elasticsearch, where it is available in the database – database here as a synonym for Elasticsearch. Kibana is the layer above, on which you can build dashboards in all colors and shapes. You can map very individual things there and recreate entire production lines, but there are also quite a few out-of-the-box dashboards and visualizations. You can drill deep into the data, but also build quick analytics practically on the fly.
 
This is a very powerful tool. You can operate that once you’ve had a little exposure to the basic concepts. It is definitely very versatile in terms of possibilities.
 
Sebastian, you can probably assess this a bit. Have you worked with it before? Is this something you can just use, or do you need certain skills beforehand?
Sebastian
You do have to bring a few skills to the table. An affinity for data is certainly not wrong. You also need to understand the structure of the data and, at a basic level, the structure of Elastic. If you work with it a little, you can build strong visualizations.
 
With Kibana I can answer any question about the data, in any way. Whether it’s GPS positions of machines, averages, maximums or minimums, or very fast aggregations of data – that is, data that comes in every second, summarized over several days, months, or even millions of devices. It’s very fast and efficient, but you have to engage with it.
 
And for that, consileo is there as a contact; you help with it. One more question: you are using Microsoft Azure services. What exactly do you do with them?
Sebastian
We use Microsoft Azure services for various purposes. For one, we use Elastic as a managed service; with many customers, including Tracto, it runs on Microsoft Azure, which is why we chose it. Our development environment – the tools we develop with – also runs on Microsoft Azure, as do the digital apps.
 
We use the Azure Kubernetes Service there. Simply put, it provides virtual compute that scales very elegantly and automatically grows with user demand.
 
If we talk in the next step about really analyzing the data – you spoke at the beginning about predictive maintenance as a kind of supreme discipline, going in the direction of data science. How does it work to intelligently analyze all the drilling and fleet management data?
David
The Elasticsearch platform is accessible via a GUI – again through Kibana – but you can approach it the same way via APIs if you want it to be more technical. On top of that, there is a data science or machine learning framework, under the marketing slogan “Data Science for Everyone”.
 
It really is like that. If you come from the domain and know what you want, if you have a bit of an understanding of what you want to achieve with the data, then you have a framework there that lets you configure all sorts of things via a wizard. This is done via outlier detection and anomaly detection jobs – finding patterns in data that are out of the ordinary, instead of configuring countless alerts with thresholds, which is infinitely labor-intensive if you do it for many data points. It is much more elegant to let an AI do that work for you. The whole thing is structured in such a way that you don’t have to build up a huge staff of data scientists in the company to work with it – which is a real problem these days anyway.
 
Every company, no matter how much money it puts into it, struggles with that. That’s what we provide as a platform. What Sebastian has created with Tracto is a great foundation for further use cases in that direction, namely predictive maintenance: moving from reactive to proactive maintenance. A first step is to run anomaly detection over the data to give early warning: attention, something is happening here that wasn’t there before. There’s a lot of opportunity there and some very interesting next steps to tackle.
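
As a sketch of the anomaly detection jobs David describes, here is what setting one up via Elastic’s machine learning APIs could look like with the Python client. The job, datafeed, index and field names are invented, so this is a configuration sketch rather than Tracto’s actual setup.

```python
from elasticsearch import Elasticsearch

# Hypothetical deployment, index and field names.
es = Elasticsearch("https://my-deployment.es.cloud.example.com", api_key="...")

# Create a job that models the mean engine temperature per machine
# and flags time buckets that deviate from the learned baseline.
es.ml.put_job(
    job_id="engine-temperature-anomalies",
    analysis_config={
        "bucket_span": "15m",
        "detectors": [
            {
                "function": "mean",
                "field_name": "engine_temperature",
                "by_field_name": "machine_id",
            }
        ],
    },
    data_description={"time_field": "@timestamp"},
)

# Feed the job from the telemetry index and start analyzing.
es.ml.put_datafeed(
    datafeed_id="engine-temperature-feed",
    job_id="engine-temperature-anomalies",
    indices=["machine-telemetry"],
)
es.ml.open_job(job_id="engine-temperature-anomalies")
es.ml.start_datafeed(datafeed_id="engine-temperature-feed")
```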
 
If I’m not an IoT expert, I probably don’t need to know buzzwords like “wizard” and “thresholds” and so on. But one question: you talked about a GUI. I know APIs; GUI is new to me, though. Is that a search-specific term with you?
David
It is simply a graphical user interface – the UI, as it is also called – on top of the platform. It lets you interact with the platform visually, as opposed to via APIs, which is more of a programming matter.
 
Got it, all clear. Sebastian, this data analysis topic, is that already something for your customers or is that still a long way off?
Sebastian
That is already very present – I think for every customer who gets their hands on data for the first time. Analyzing it is the whole purpose of the data. There are several ways to do that. There are very lightweight approaches, anomaly detection for example: the classic case where the engine temperature is always 100 degrees Celsius and suddenly it is 120, and I want to be informed about it. There are also much broader analytics that go beyond what Elastic offers; for those you almost need a data scientist.
 
You get to these topics very quickly. As soon as the first data comes in, you ask yourself: What do I do with it?
 
You just mentioned the topic of competencies in medium-sized mechanical engineering; that really is a big challenge. Sebastian, you took the plunge and brought a partner on board. Many do – saying, hey, we’re partnering up and almost building ecosystems of the partners we need in order to do this together. Is that a path you feel is right?
Sebastian
The competence is often simply not there. The right way is to rely on a competent partner with IT expertise in these areas, but one who, to some extent, also understands how mechanical engineers work and think. There is no way around it: the topic will become part of value creation at every machine builder. You also have to build up your own competence to some extent. That means you need a kind of sparring partner on the machine builder’s side who can discuss things with the external service providers, develop better solutions and also understand the business well. That is how good solutions are created. Of course, this is a mix, and also a transformation of the machine builder, no question.
 
David, how’s that working out for you?
David
I believe that medium-sized mechanical engineering companies are in the same situation as all other companies: talent in IT, in almost any discipline, is hard to come by. So it’s smart to use a partner like consileo for various topics to speed things up, so that you don’t have to sit on your hands until you’ve perhaps got someone on board, which can take a long time.
 
What we as Elastic can contribute here, or where we see ourselves in this problem, is exactly that: a platform for different solutions. I already touched on some of the areas we are active in. Classic search is one thing, for example in web store applications. Then there’s the monitoring of IT infrastructure and of IoT infrastructure – whether that’s drilling machines or the production lines of car manufacturers.
 
Cyber security is another area we address with solutions on our platform. Centralizing things is increasingly interesting, as we are also seeing with customers. Instead of running many isolated solutions, each with its own license model, each needing maintenance and people who have to be trained and may leave the company, you say: okay, instead of seventeen solutions maybe we’ll take only three, or ideally just one, and that’s called Elastic. That’s where we can bring value to companies in mechanical engineering.

Results, business models and best practices - How success is measured [30:30]

In the end, it’s always about the business case. How do I save costs? Do I build a new business model from this, or can I increase my sales through new services? Sebastian, how was that for Tracto? In summary, what is the outcome and business case for your client?

Sebastian

I think you can see the business case in two places. By using innovative technologies and offering benefits to the customer, for example in drilling data acquisition, you hopefully sell more machines, and with more machines you have more business. That is certainly a driver. You have to face up to digital change.

The other thing, of course, is the cost-reduction potential that arises in service. Because the data is available, fewer field calls are needed and I don’t always have to send a service technician out. This results in savings of between 15 and 30 percent.

David, you just said “whether that’s a production line or this case” – do you have best practices for these types of business cases from your side?
David
Not in general, but certainly in specific cases. We work very closely with customers to add value, because enterprise software sales doesn’t work by pushing packs of CD-ROMs across the counter, so to speak; you have to sell something that represents value to the customer. One anecdote: we work with the Mayr-Melnhof Group from Austria, a cardboard manufacturer. That is a very energy-intensive process and goes in a similar direction to Sebastian’s and Tracto’s use case. They digitized their production lines and visualized them in Elasticsearch. Correlation analyses were used to optimize the production process until they were able to save 20 percent of the high-priced raw materials while keeping the same product quality.
That was one, but there are many, and in our case that’s more of an individual look at the business case, project by project.
Glad you have some best practices there after all. You also have very exciting references from which one can perhaps derive something. Sebastian, in the end it’s also about learnings – insights that can be shared so that mistakes aren’t made twice. What were the biggest insights on your journey?
Sebastian
Essentially, there were two. First, that you sometimes have to think outside the box and can use things, like Elastic, that were not intended for this case at all, yet turn out to be the optimal solution. The second, I think, is that at the end of the day internal processes can also be optimized through IoT. In this case, it was simply the service organization that benefited significantly. A lot of people don’t have that in mind; they think strongly in terms of customer orientation, which is the right thing to do, but you can also unlock internal value.
It also takes, I think, courage and trust. You have to find the right partner and have the courage to go down this path with the technology. That works, but it also takes courage, doesn’t it?
Sebastian
Very true. It doesn’t work without courage. This is a new technology, a whole new field for machine builders as well. That takes a lot of courage and a lot of commitment. In the end, however, it pays off.

Transferability, scaling and next steps - How you can use this use case [33:43]

One last question about use cases. Sebastian, a wide variety of use cases come to mind – whether it’s the production line I mentioned, which you can cover with Elastic and the corresponding data handling, or the core business with Tracto. What you’ve done is probably transferable to other machine and plant builders too, right?

Sebastian

Yes, definitely. This is very generalizable, because at the end of the day many construction machines have the same requirements: they are also out in the field and may also have poor network coverage. That is the same use case as with Tracto. But there are other use cases, such as a steel mill that generates large amounts of data and wants it analyzed, where the solution works just as well.

That is a nice note to close on, I think. Anyone who is interested can approach you directly and discuss the potential. The corresponding information is linked in the show notes.

Please do not hesitate to contact me if you have any questions.

Questions? Contact Madeleine Mickeleit

Ing. Madeleine Mickeleit

Host & General Manager
IoT Use Case Podcast