This episode is of particular interest to shopfloor managers who want to know how data can be collected in a structured and, above all, standardized manner from a wide variety of OT and IT systems and integrated into other systems. The ZEISS Group shows the path they have taken here with Orchestra from soffico.
Episode 86 at a glance (and click):
- [07:36] Challenges, potentials and status quo – This is what the use case looks like in practice
- [25:13] Solutions, offerings and services – A look at the technologies used
- [34:21] Results, business models and best practices – How success is measured
Podcast episode summary
Orchestra is used at ZEISS primarily for comprehensive data transformation and integration, bridging the gap between IT and OT. In the architecture, Orchestra acts as an integration platform across all levels, through which all methods for data communication are provided and executed. This puts ZEISS in a position to continuously connect new applications, machines, and systems, or replace existing ones.
In episode 87 of the IoT Use Case Podcast, the Head of Connected Smart Factory at the ZEISS Group, Jochen Scheuerer, explains the approach he has chosen for the factories of the business units and how, together with their partner soffico, they have set up a so-called “data hub” with a wide variety of templates. This data hub can be used for a wide variety of use cases for data connection and integration – without having to “reinvent the wheel” every time. Rica Holzmann (Team Leader International Partner Management & Sales) will be representing soffico here.
There will also be an in-depth discussion on how other manufacturing companies can address the issue – with real best practices and lessons learned directly from the field.
Here’s the whitepaper for this podcast’s use case – download now!
Hello Rica and Jochen! Glad to have you with me today. Rica, how are you and where are you right now?
Hi Madeleine, thanks for inviting me to the podcast! I’m doing great, sitting in our office in Augsburg.
I’m also in the office, in Oberkochen, at the headquarters. I’m looking forward to lunch soon; once we’ve got the work done, we can get food in our great canteen. I’ll pick up some colleagues and take them along.
Oberkochen, where exactly is that?
Oberkochen is five kilometers from Aalen on the Ostalb, right on the A7 between Ulm and Ellwangen.
Ah okay, and ZEISS is otherwise also represented in Jena, right?
Exactly, the company was originally founded in Jena, was set up in Oberkochen in West Germany after the war, and then merged back into one company after the fall of the Berlin Wall.
Rica, to soffico: The bottom line is that you offer a software solution and, above all, an integration platform called “Orchestra”. This is used to connect machine data flexibly, securely and reliably with each other – keyword data hub for the smart factory.
You also offer a wide variety of architecture components with which applications can be run and rolled out quickly. You do all of this with integration teams and a large pre-sales team of your own, which develops the solution together with the customer or provides it as a service. Software and service interact closely here, and you build that up jointly with the customer. In terms of connectivity, it doesn’t matter whether the protocols are old or new – from OPC UA and S7 controllers to MQTT for connecting smart sensors.

You are also the experts for integration – a data hub in the sense of also integrating data from IT, whether that is SAP data or data from other IT systems such as MES or cloud systems.

Lastly, you are part of the x-tention group of companies, and with soffico you digitize organizations across industries with corresponding services, all the way to operations.
You’re a team leader in partner management and sales, even internationally. What is your job here and what kind of clients do you work with?
There is not much I can add to that product description. Our customers are mainly found among “excellent” German medium-sized businesses, in both process and discrete manufacturing. Our customer base includes not only manufacturing companies, but also companies that, as we put it, come from the service-oriented area: customers from banking, e-commerce, the insurance industry, and logistics. We also serve other companies that need highly scalable IT and OT integrations.

As you correctly mentioned, we are part of the x-tention group. x-tention itself focuses mainly on the healthcare sector and has built up a very good name, especially in e-health, electronic patient files, and the like. soffico benefits greatly from this, because our software is used to being deployed in critical, regulated environments; we can therefore fulfill the specific requirements of those environments, bring the corresponding certifications with us, and support validation.
Healthcare alone is a huge area. You are responsible for building and implementing the whole thing with partners and customers, right?
Exactly, we are very technology-oriented at soffico, which means we work very closely with customers and partners. This ranges from the initial idea, through the concept phase, to supporting the implementation, and we are very cooperative in supporting our customers and our partner portfolio.
You brought Jochen from the company “ZEISS” with you today, we’re practically leaping into the context of the smart factory. How did you first get to know each other?
ZEISS is a large German manufacturer of everything from the smallest devices to large machines. In our case, an acquisition took place: we bought an MES manufacturer. Before we used the product in our own company, we first looked at what they were doing and how it worked. At the time, Orchestra from soffico was simply recommended to us for connecting that solution, because it was so multifunctional. That was the first contact from Oberkochen to Augsburg.
Challenges, potentials and status quo – This is what the use case looks like in practice [07:36]
Rica, what use cases do you handle in general and which ones are we looking at in detail together with ZEISS?
Our solution is used in different areas by different customers and partners. This ranges from connectivity-as-a-service, for example for OEE analytics, condition monitoring, or MES connectivity, where use cases are implemented together with a partner, to large instances such as those found at various global technology groups. There, Orchestra is the central hub for implementing data exchange.

We also support a wide variety of use cases, such as energy data management or digital manufacturing orders. At ZEISS, I would say, the whole thing was taken to a larger scale again. This is really about introducing an entire data integration layer to enable a complete, highly scalable integration architecture, where you can go down to the OT level, collect the data there, aggregate it, and then make it available to IT systems. Beyond that, our use cases are very diverse; this is certainly also because connectivity is one of the foundations for the use cases built on top of it, which help make data visible, transparent, and at some point suitable for forecasting.
Today, it is important to understand this integration architecture. Jochen, ZEISS is a leading global technology company in the optics and optoelectronics industry; you produce a wide variety of solutions for industrial metrology and quality assurance, including microscopy solutions. You build everything from the smallest devices to large machines.

You have an annual turnover of 7.5 billion euros (as of 2021) and 35,000 employees, are represented in 50 countries, and have a long corporate history: founded in Jena in 1846, with the umbrella company Carl Zeiss AG wholly owned by the foundation, one of the largest German foundations for the promotion of science.
One of your growth areas is the topic of “I4.0” and the use of IIoT technologies. Can you tell us something about your portfolio? Also what your specific job is, and the vision that drives your team.
As you mentioned: a large company with a long history, reunited with the former East German ZEISS combine since reunification in 1990. ZEISS now has four major business units.
First of all, the business unit with the highest sales: lithography. We manufacture exposure equipment for a leading global manufacturer, used to make chips. Without ZEISS there is no exposure, without exposure there is no chip, and without a chip there is no iPhone. We have an extremely high market share and are fortunate that virtually all the chips in the world have been produced with ZEISS exposure machines; miniaturization also plays a major role here.
The second area is medical technology. We manufacture medical technology products for eye surgery; microscopes are among the products here. Just recently we won the German Future Prize for technology that allows cell proliferation to be observed without much light falling on the cells, because a cell will not proliferate if it is exposed to too much light. With our R&D (research & development), we are very much in the business of helping people.
The third area is the classic metrology division: 3D coordinate measuring machines to measure and evaluate any product that is manufactured in any form. Nowadays, with laser technology, there are many ways to easily control and monitor products in the process.
Last but not least, we have the “ophthalmic lens” division; ophthalmic lenses are manufactured in 70 factories around the world. Here, too, we manufacture very innovative products. The area also includes the things I personally enjoy: eyeglasses, spotting scopes, photo lenses, and cameras; as a “ZEISSian”, you engage with your own products in your free time.
It’s an honor to have you join us today and share something from your practice. If we dive into the vision of “Industry 4.0,” where do you want to go? Which plants are you responsible for?
My department is relatively small; we are a competence center within group IT and do not belong to any of the four divisions mentioned earlier. We try to make our competence available as a service to our internal customers, the specialist departments in the business units. Our task is to provide a comprehensive MES system. Some business units are somewhat more advanced here, but there are also some that are not quite state of the art.
We are trying to bring everyone to the same digital level. A classic way to put it: I just want to stop using paper in manufacturing – what do I have to do for that? At the same time, we want to gain data from the manufacturing process; data is the new gold! The more I know about the product, the more efficiently I can develop my manufacturing processes. The task is to reduce scrap rates and machine downtimes and otherwise optimize the whole process. It is very important to know what happens when and where, and to analyze it early on.
That sounds like a big job for you in the Group IT role, since you’re also responsible for multiple plants.
You said you’re in Oberkochen, but you’ll probably also be responsible for plants across divisions, right?
We provide a service that our internal customers in the business units buy, and these business units are then responsible for their own plants, for the process; we are a classic cost center where we offer shared services and serve as an enabler for technologies.
If I were to accompany you on a typical day at work, what are your main tasks?
My daily work routine is that of a classic manager: keeping things focused, keeping things running, speeding things up, getting in touch with internal customers and checking that everything is running; the classic operating principle.
What my colleagues do is much more interesting: together with a large German software manufacturer in Walldorf, we have integrated and developed an MES system and make it available to several internal business units. We did the whole thing on a template basis: we had pilot projects that gave us their requirements, and out of those we built real templates – one for assembly, one for automation, and one for direct manufacturing. We continue to develop these templates with new internal customers, or roll out the existing ones to other plants.

Where paper is still in use today, we try to digitize it – always together with the department. I get the requirements, look at what my templates already cover, what I can roll out, and whether I have to extend the templates so that the department can work fully.
What is the biggest challenge you see there? Is that the use case or rather connectivity on the shopfloor?
I think we can cover the use case side quite well, because the processes are already running; I just need to digitize them and can optimize them where necessary. The problem tends to lie with people who do not understand why you would want to optimize a system that is already running; it is somewhat of a challenge to convince them of the benefit when the whole thing is supposed to change again in five years anyway. The biggest problem we have is heterogeneity. Imagine a business that is 175 years old: not every machine has an OPC UA interface out of which everything I need comes. The retrofit idea – giving existing equipment more functions or even simpler services – should not be underestimated.

Getting a torque wrench from 1990 connected so that it ends up sending me OPC UA is the biggest problem. There are so many interfaces and manufacturer-specific controls. Every manufacturer can do what they want here, but I want to get at the data, and I want it standardized – and handed over to Orchestra as early as possible so I can do as much as possible with it there.
If our listeners have similar problems, I would like to invite them to share their experiences; this is exactly what we have the experts for in the network. I’ll link the whole thing in the show notes, many here probably have similar challenges.
What are classic data packages that are relevant for you in the projects?
It depends on the manufacturing process. In assembly, for example, I might want to know with which torque screws were tightened, or whether the employee who built the product was wearing ESD shoes – is he antistatically grounded? Detecting out-of-range values or anomalies can also be a point here. For example, coating quality can suffer simply because it is summer, the doors are open, and the wind blows through.
These are things where we try to save time, money, scrap, and work efficiently. In the USA and Asia, of course, we have different requirements than here; that’s another factor.
What requirements were important to you in the collaboration with soffico?
Such things don’t happen overnight; there is a history behind them. As a company, we started out long ago with 1-to-1 connections. For example, an ERP system was hard-wired to an RPM sensor on a machine. If the RPM value was not received, the ERP could not continue; and if the ERP was at a standstill, the machine’s values could not be retrieved.

Today, you no longer work with such 1-to-1 connections; instead, you have layers that can cache data. So the first big requirement was: I want something between the ERP and the actual sensor that buffers data for me, so that the data not only goes to the ERP system but can also be routed into another channel – for example, into a data lake.
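To make the idea of such a decoupling layer concrete, here is a minimal sketch in Python. All names (`Buffer`, `ErpClient`, `DataLakeClient`) are invented for illustration and are not soffico APIs; the point is only that readings are cached once and then fanned out to several independent consumers instead of being hard-wired to one system.

```python
import queue

class Buffer:
    """Caches sensor readings so a slow or offline consumer never blocks the source."""
    def __init__(self):
        self._q = queue.Queue()

    def publish(self, reading):
        self._q.put(reading)

    def drain(self):
        items = []
        while not self._q.empty():
            items.append(self._q.get())
        return items

class ErpClient:
    """Stand-in for the ERP connection (illustrative only)."""
    def __init__(self):
        self.received = []
    def send(self, reading):
        self.received.append(reading)

class DataLakeClient:
    """Stand-in for a second channel, e.g. a data lake (illustrative only)."""
    def __init__(self):
        self.stored = []
    def send(self, reading):
        self.stored.append(reading)

def fan_out(buffer, consumers):
    """Deliver every buffered reading to each consumer independently."""
    for reading in buffer.drain():
        for consumer in consumers:
            consumer.send(reading)

# The sensor publishes into the buffer, never directly into the ERP.
buf = Buffer()
buf.publish({"sensor": "rpm", "value": 1480})
buf.publish({"sensor": "rpm", "value": 1510})

erp, lake = ErpClient(), DataLakeClient()
fan_out(buf, [erp, lake])
```

If the ERP is down, the readings simply stay in the buffer until the next `fan_out` run; nothing on the machine side has to stop.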
Due to our work in medical technology, we have the obligation to prove for 30 years who, when, how and where, with which qualification, which screw was tightened with which torque. This goes as far as the composition of a batch of glasses.
You probably benefited from the fact that you had already worked a lot in the healthcare sector; you need to know such evidence.
Sure, documentation is always a huge issue here, as is the ability to validate in general. You have historical landscapes and little standardization; healthcare may even be further along here than industry, and this is something we need to address together. We achieve this through the scalable cell concept, as we call it within Orchestra, which allows data to be processed, transformed, and mapped at different levels and then provided wherever it is needed.
Solutions, offerings and services – A look at the technologies used [25:13]
Rica, can you say in general who had which role? Jochen, you are the expert unit from the services area, which itself works with internal customers; and Rica, you have your role around the “cell concept” and the topic of the data hub and integration.

The project consists of different roles, both within ZEISS and on our side. We collaborated very closely on creating these templates. We try to make the whole thing as generic as possible so that the different systems of the individual BUs can be connected accordingly, without incurring this effort anew every time.
Orchestra provides the necessary elements here from the ground up; we call this “Channel”. These are more or less connectors that can address different systems via the respective protocols.
You develop communication scenarios around the channels, which can be individual, but don’t have to be, if they are processes that can be found across systems in different areas. Then generic templates can be created and rolled out in the individual BUs.
So we have all the different processes and use cases; can we work our way through from data acquisition, to processing, to analysis? Data acquisition has to happen via the various interfaces. Say we have an old torque wrench and a new turning center with OPC UA – how does the data acquisition process work?
For data acquisition, we first look at what electronic components we have and what interfaces are available. Take, for example, a turn-drill-mill center with an OPC UA interface: that’s great, but I need the labels so that I know whether “speed” really means rotational speed and not revolutions or something else entirely. I first have to map the existing machine to a standard. We do this by reading the data from the OPC UA servers and putting it into a so-called JSON file – a ZEISS-specific document in which the labels are uniform. I can read exactly this information with Orchestra as a manufacturing service bus – that’s what we named it, MSB – and via Orchestra I can store it again in different target platforms, each in the required format.

An SAP system might want to receive this through an IDoc interface, differently from a SQL database in Azure, somewhere in the cloud, where I keep a long-term history; that is the advantage. Although I have the information only once, I can store it in different target platforms according to their specifications. To do this, you create scenarios on the MSB, and the data is stored for me in a standardized way.
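The label-mapping step described above can be sketched roughly like this. The tag names, the mapping table, and the target renderers are all assumptions made up for illustration – they are not the actual ZEISS schema or the Orchestra API. The idea is just: vendor-specific tags are mapped once onto a standardized document, which can then be rendered per target system.

```python
import json

# Hypothetical mapping from vendor-specific OPC UA tag names
# to one standardized label convention.
LABEL_MAP = {
    "spd": "rotational_speed_rpm",
    "tq":  "torque_nm",
}

def standardize(raw_tags: dict) -> dict:
    """Map vendor tags onto the shared labels; unknown tags are dropped."""
    return {LABEL_MAP[k]: v for k, v in raw_tags.items() if k in LABEL_MAP}

def to_sql_row(doc: dict) -> tuple:
    """Flat row, e.g. for a relational long-term history."""
    return (doc["rotational_speed_rpm"], doc["torque_nm"])

def to_json_payload(doc: dict) -> str:
    """JSON payload, e.g. for a cloud endpoint."""
    return json.dumps(doc, sort_keys=True)

# One raw reading, captured once, rendered for two different targets.
raw = {"spd": 1500, "tq": 12.5, "vendor_internal": "xyz"}
doc = standardize(raw)
row = to_sql_row(doc)
payload = to_json_payload(doc)
```

The key property, as in the interview: the information exists only once (`doc`), yet each target platform gets it in its own format.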
You talked about templates of scenarios; is that the standardized shell you’ve built for data connectivity?
A distinction must be made here: the channel does the data collection. The scenario is modeled in our designer; here you model cross-system data processes, meaning there can be any number of source and target systems. Classically, we do this via “BPMN”. A BPMN process can be very small, but also very large and complex, so with this low-code approach you can model and map even complex data processes relatively easily.
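Conceptually, a scenario like the ones modeled graphically above is an ordered chain of processing steps between source and target systems. The tiny pipeline below is an invented analogy in Python, not Orchestra’s actual engine; the step names (`validate`, `enrich`) are made up.

```python
def scenario(*steps):
    """Compose processing steps into one data process, executed left to right."""
    def run(message):
        for step in steps:
            message = step(message)
        return message
    return run

# Two illustrative steps: check that a value is present, then add a unit label.
validate = lambda m: {**m, "valid": m.get("value") is not None}
enrich   = lambda m: {**m, "unit": "rpm"}

process = scenario(validate, enrich)
result = process({"value": 1500})
```

In the graphical designer, each step would be a BPMN element; swapping, adding, or removing a step changes the data process without touching the source or target systems.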
What is BPMN again?
Business Process Model and Notation.
You have channels in your Orchestra that essentially do this mapping, and then I can model the whole thing, including the data processing into other systems, right?

That’s right, and it is all visual! We call it low-code; coding is of course an aspect we also cover, but theoretically you can simply model everything graphically. The mapping, too – connecting the fields of the source and target systems – can be implemented graphically.
It used to be that ERP systems had 1:1 connections; how does data integration work across layers? How does data processing work in Orchestra?
We have the cell concept, so Orchestra runs at different levels. There is a small Orchestra artifact, “Orchestra Juno”. It runs at a very small scale, for example on industrial computers; it can receive, aggregate, and buffer data there and make it available to a central Orchestra via a secure cell connection. There, the data is processed further and can, among other things, be made available to a third Orchestra instance, for example in a cloud.
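A minimal sketch of that edge-cell idea, assuming invented names throughout (`EdgeCell` and its batching behavior are illustrative, not the Juno implementation): a small instance buffers readings locally and only forwards an aggregate upstream once a batch is full, so a temporarily unreachable central instance loses no data.

```python
class EdgeCell:
    """Tiny edge instance: buffer locally, forward aggregates upstream in batches."""
    def __init__(self, batch_size, upstream):
        self.batch_size = batch_size
        self.upstream = upstream      # callable receiving one aggregate dict
        self._buffer = []

    def receive(self, value):
        self._buffer.append(value)
        if len(self._buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self._buffer:
            return
        # Aggregate before sending, so the upstream cell sees less raw traffic.
        agg = {
            "count": len(self._buffer),
            "mean": sum(self._buffer) / len(self._buffer),
            "max": max(self._buffer),
        }
        self.upstream(agg)
        self._buffer = []

# Upstream here is just a list; in the real architecture it would be the
# secure connection to the central instance.
received = []
cell = EdgeCell(batch_size=3, upstream=received.append)
for v in [10, 20, 30, 5]:
    cell.receive(v)
# The first three values were flushed as one aggregate; the fourth is still buffered.
```

The same pattern stacks: a central instance can itself act as an `upstream` for many edge cells and forward onward to a cloud instance.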
Juno is practically a ready-made architecture component: it simply runs for you, you can just use it, and you can “assemble” things with it yourself.
If we take the use cases a step further: How do you get this data analysis on such a basis that you, Jochen, can implement the use cases in the end?
With connectivity, we form the basis for this transparency and visibility. We deliver the data needed for certain forecasts or for monitoring into various monitoring and analytics systems, depending on what is in use and what needs to be analyzed. We do not take on the classic data analysis ourselves, simply because these are separate concerns: one is the data infrastructure and data architecture, the other is the data analysis, which can then be performed individually in each case.

The big advantage in our case is simply that we can push this data via Orchestra as the central system. I can push the data I gain through Orchestra into a time-series analysis tool and find out when, where, what, and how something was done. I can push it into an ERP analytics cloud; there, too, the information can be prepared in a way that adds process value, so that the process engineer can derive actions or build long-term histories – for example, which product heats up faster and why. In the past, you could never have figured that out except with qualified people watching incessantly.
Today I can do that wherever the customer wants, play the data there and they can analyze for themselves, what do I need to do to improve my process?
Orchestra is, so to speak, the central data hub for processing the information in the end. ZEISS has a lot of manpower, but you also support smaller companies in setting all this up.
Results, business models and best practices – How success is measured [34:21]
Do you have any insights on the business case for the data hub?
Because we get the data prepared in a structured way, we have many possibilities for comparison: I can compare lines with each other, as well as products and raw materials. The other business case is optimizing my manufacturing process; I can also do predictive maintenance, and we have seen a really good return on investment in cases where process engineers had run out of levers to pull. We then looked at the data and found that things were happening in various places; by triggering a cleaning of the device beforehand, the next batch would turn out fine – otherwise it might have been scrap.
In any case, it is important not to do everything at once, but to see where you can create maximum value with little effort, so that the whole thing gains momentum on its own. What you should not do is change the architecture while the implementation is running. I have to think up front about what an 80/20 architecture might look like; adaptations always happen – products disappear, new ones are added.

I need an overview of the system landscape, I need to understand the processes behind it, and then I can get going. That’s how we did it too! We launched two pilot projects with different templates, which we may roll out close to 30 times across the business. This is not without its problems, but it only works because we are no longer touching the architecture. That has to be fixed once – starting with the naming convention and ending with the performance that Orchestra is supposed to provide. For example, I have a classic three-system landscape: a development system, a quality system, and a production system. In the end, I cannot say that one system would be enough; it wouldn’t take a minute before a machine stopped.
Rica, do you have any experience data from other projects?
I can absolutely underline that and say that ZEISS has come incredibly far here; it is very visionary and strategically well thought out. I can only report positive experiences, compared to companies that have grown historically with 1:1 interfaces. At some point we call it “spaghetti interfacing”, when you can no longer even find a single connection in the multitude of interfaces. If you approach the whole thing strategically and build it up with an architecture, you have gained a lot; that is the main experience from other projects. Think in advance about what the architecture should look like, and you have already laid good groundwork!
Would you also say that the future trend is for every manufacturing company to have a universal system and to connect every machine to it? At least it makes sense if you don’t have a single market-leading machine manufacturer installed across your shopfloor.
Many machine manufacturers now supply their own systems and clouds through which data can be collected from the machines or from sensors. But then I end up with a big colorful bouquet of vendor-specific clouds that I cannot bring together. If I say I only have these machines in use and only want to keep these in the future, that’s fine.
As soon as something special comes along and needs this data as well, then you need something universal.
I see it the same way, the whole ecosystem is relevant and this best-of-breed approach prevails, simply because you want to maintain a certain independence. That’s where it makes sense to get the right partners and not be bound to one manufacturer for the entire process.
That is also the approach of IoT Use Case: creating an ecosystem. I believe that both machine builders and component manufacturers will at some point package the data on the wear limits of their components into IoT services. Here, too, it doesn’t make sense for everyone to build their own cloud infrastructure. That is the current trend; the question is how it will develop in the future, since theoretically every manufacturing company wants to collect data openly for its specific use cases.
It’s simply a question of philosophy: do I use something open source, do I use something from a manufacturer I might be bound to, or do I try to find a middle way? The Open Industry 4.0 Alliance, of which soffico is also a member, is helpful here. Different manufacturers get together to generate open, mutually usable standards.
If someone is the only one who builds a car assembled with screws with a thread pitch of 1.5, then they have a nice car; but if no one else wants to supply or make these screws, they simply cannot get spare parts. For many years, I worked for a well-known Finnish mobile phone manufacturer and was responsible for Bluetooth tests and compatibility tests, among other things. This was already a challenge 25 years ago, until people decided to create standards; it doesn’t matter whether a manufacturer is from Sweden or Finland – they got together to develop a worldwide Bluetooth standard.
That was a beautiful closing! It came across very clearly how your technology around the data hub works. A very exciting project; respect to you! Maybe we’ll hear from you again – thanks for your time!
Until next time!