
IT/OT Integration in the Pharmaceutical Industry: Scaling in a GMP-Compliant Way


Listen to the IoT Use Case Podcast on Spotify.
Listen to the IoT Use Case Podcast on other platforms.

In this episode of the IoT Use Case Podcast, host Dr. Peter Schopf talks with Bastian Erb, Director Production Engineering at Vetter Pharma, and Fabian Fitzer, Team Lead Consulting at soffico. The focus is on how highly regulated production processes in the pharmaceutical industry can be digitalized — using clear processes, governance, and a shared target vision rather than rushing into technology for its own sake.

Podcast episode summary

How can digitalization in the pharmaceutical industry succeed under GMP conditions? In this episode, Vetter Pharma and soffico demonstrate how IT/OT integration can be implemented sustainably even in highly regulated environments.

The starting point: paper-based GMP documentation, high manual effort, heterogeneous machine and laboratory systems, and strict requirements for data integrity, validation, and auditability. At the same time, a scalable data foundation for future use cases was missing.

The approach: a central data integration architecture based on the Orchestra platform from soffico. Instead of point-to-point connections, Vetter relies on a standardized, three-layer architecture covering the OT level, aggregation layer, and IT level. QA, validation, and operations were integrated from the very beginning. Automation using Kubernetes and CI/CD enables scalability while maintaining compliance.

The result: fewer manual checks through “review by exception,” consistent data for MES, LIMS, and analytics applications, and a robust foundation for data-driven optimization — all the way to AI-based use cases.

The episode is aimed at managers in regulated industries who want to implement IT/OT integration strategically and for the long term.

Podcast interview

Today on the IoT Use Case Podcast: How highly regulated production processes in the pharmaceutical industry can be made stable and digital. The key is not to start with technology, but with clear processes and a shared target vision. Our guests are Bastian Erb, Director Production Engineering at Vetter Pharma, and Fabian Fitzer, Team Lead Consulting at soffico. Our conversation shows that long-term thinking, clear responsibilities, and a common target vision often have a stronger impact than individual technology decisions. Enjoy listening.
Today, we have two exciting perspectives on the podcast: Vetter Pharma — important to note, spelled with a “V” — one of the world’s leading companies for sterile filling and packaging in the pharmaceutical industry. In technical terms, this is often referred to as aseptic fill and finish. It is therefore a highly regulated industry, as it directly relates to human health. We are also joined by soffico with their Orchestra platform, which enables complex IT-OT data flows to be mapped cleanly while complying with regulatory requirements. Bastian, let’s start with you: could you briefly introduce yourself and explain which topics you are bringing to the discussion today?

Bastian

My name is Bastian Erb, and I am Head of Production Engineering at Vetter Pharma. As you already mentioned, Peter, we are a leading CDMO in the pharmaceutical sector. That means we act as a contract manufacturer for large Big Pharma companies as well as for smaller firms. We handle the sterile filling of injectables. We offer an end-to-end process: starting in the early clinical phase with development work together with our customers, accompanying them on the path to commercialization, and taking over the fill-and-finish process through to final packaging. This means we perform the filling for our customers, carry out visual inspection of the units, handle the final packaging, and also provide laboratory services that support our customers throughout their journey.

Can you briefly describe how things developed for you during the COVID period? Overall, there was quite a bit of turbulence in the market, while from the outside your situation appeared more like a strong upward trend.

Bastian

The COVID period did not move Vetter too far away from our standard operations. Our focus is strongly on products that are not part of the classic vaccine segment, meaning higher-value products. These did not increase significantly during COVID. The pandemic was primarily driven by vaccines and large-scale mass production, and our portfolio remained relatively stable in that area. We did not experience a major surge as a result. However, our figures clearly show that we are overall a very successful company and that the market increasingly requires our services. It is a strongly growing market. Today, we operate nearly 24 cleanrooms and can rely on a large machine portfolio. We have grown significantly in recent years and will continue to grow in the years ahead, which is why digitalization is a very important factor for us. This is also how we embarked on a longer journey together with soffico.

That sounds good. Where are you located at the moment, Bastian, and where are your sites?

Bastian

We are currently strongly positioned in southern Germany. Our home base is Ravensburg. There, as well as in Langenargen on Lake Constance, we operate three aseptic filling sites. In addition, we have a large logistics center in the Ravensburg area. Most of our activities therefore take place within a radius of around 20 kilometers around Ravensburg. Beyond that, we operate a research site in Rankweil, Austria, as well as another one in Chicago in the United States. However, our commercial production is clearly centered in southern Germany around Ravensburg.

Before we go deeper into that, Fabian, let’s briefly turn to you as well.

Fabian

My name is Fabian Fitzer from soffico. I work in IT consulting there and lead a team. soffico is primarily a software manufacturer of a standard software product called Orchestra. Orchestra is a low-code connectivity platform that enables manufacturer-independent integration of OT and IT systems. To achieve this, Orchestra provides standard adapters both for machine integration and for various IT systems. On the OT side, this includes protocols such as OPC UA, Modbus, or MQTT; on the IT side, REST, SOAP, or databases. We integrate these systems in the backend and can graphically model process flows, design them with any level of complexity, and then roll them out in a scalable way. In principle, this allows almost any data-driven use case to be implemented, while the data is reliably transported in the background. In addition to the Orchestra product itself, we also offer services. This brings me to my area, consulting. That means we support our customers in integration projects across all phases—from architecture and solution design through to implementation. Customers can, of course, also work with Orchestra themselves, which they are happy to do. When support is needed, that is where we come in. I have been with soffico for five years now and have worked on many large projects as a technical project lead, architect, and in implementation roles. My focus is strongly on the pharmaceutical industry, but I have also worked in other sectors such as automotive and electronics manufacturing. We have been working with Vetter on this project for several years now, and we can now go into more detail on that.
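Orchestra itself is a low-code platform, so its adapters are configured graphically rather than coded. Still, the core pattern Fabian describes — taking a raw OT payload from a machine protocol and normalizing it into one canonical record that IT systems such as MES or LIMS can consume — can be sketched in plain Python. Tag names, field layout, and the function name below are illustrative assumptions, not Orchestra's actual schema or API:

```python
import json
from datetime import datetime, timezone

def normalize_ot_payload(machine_id: str, raw_tags: dict) -> str:
    """Map raw machine tag values onto one canonical JSON record.

    This mimics what an integration layer does between the OT side
    (OPC UA, Modbus, MQTT) and the IT side (REST, databases): strip
    protocol-specific noise and emit a uniform, self-describing record.
    """
    record = {
        "source": machine_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        # Drop empty tags so consumers never see protocol placeholders.
        "values": {tag: val for tag, val in raw_tags.items() if val is not None},
    }
    return json.dumps(record)

# Example: a hypothetical filling line reports temperature and fill weight.
msg = normalize_ot_payload("filler-01", {"temp_c": 21.4, "fill_mg": 502.1, "spare": None})
```

The point of such a canonical record is that every downstream consumer (MES, LIMS, analytics) parses one format, regardless of which vendor's machine produced the data.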

Absolutely, that will be very exciting. One challenge, as you mentioned, is that there is so much that can be done and so much data available. Bastian, where did you start with IT/OT integration? Which data do you use, and what is your objective?

Bastian

We started our first evaluations together with soffico in 2022. We operate in a highly regulated environment, which means that data is one of our most valuable assets. We must ensure data integrity, understand what happens to our data, be able to interpret it, and guarantee its quality. For this reason, we have been moving step by step toward a targeted and scalable architecture since 2022. This year, we reached a major milestone and are now in the final phase of our conceptual design. We now know what our scalable architecture looks like, and we know that the process works with soffico and Orchestra. We have identified various use cases that represent around 80 to 90 percent of our shopfloor in a representative way. For that reason, we deliberately took a lot of time to define the architecture properly instead of going live immediately with the first use cases. soffico supported us very well in this process. We were able to learn from each other, and it was also interesting to gain insights into the pharmaceutical industry as well as other industrial sectors. In this way, we have created a concept that will remain viable for Vetter over the next five to seven years, even in light of our strong growth.

[07:39] Challenges, potentials and status quo – This is what the use case looks like in practice

What was the initial pressure for you? Did it come from regulatory requirements, or did you decide yourselves that you needed to act and wanted to tackle this topic? Where did the motivation come from, and how did you proceed? Did you approach it in a very systematic way, or were you more guided along the way?

Bastian

You have to imagine that today the documentation of a GMP process still largely takes place on paper. This means that for each of our commercial orders we generate a large amount of paper to document the process and ultimately ensure traceability. A single batch can consist of up to 500 pages of documentation, with several thousand signatures. Many process and machine data points are still transferred manually into this paper documentation. For us, it is therefore a significant efficiency gain if we can access machine data directly and automate process evaluations. This also allows us to gradually move away from paper. Our goal was not to simply do “paper to digital,” meaning to replicate paper-based documentation in an application, but to truly automate processes. To achieve this, you need data and generic interfaces. A company of our size does not work with just one machine supplier, but with many different ones, ranging from large process systems to small laboratory equipment. Integration therefore has to work in a generic way, ideally using a template-based approach, so that what is developed today can also be reused for the next piece of equipment tomorrow. Very early in our digitalization journey, we therefore came to the conclusion that without an integration layer—without middleware—we would not be able to move forward at the required speed. We deliberately chose to look at data integration before selecting the final application. Of course, we want to represent our batch record in an MES system later on, but we focused on data integration first, before choosing a final MES provider. For us, data availability, data quality, and data integrity are central topics. That is why we made these decisions at a very early stage. Over the past few years, we have developed strongly from a process perspective. 
We do not come from a technology-driven background; instead, starting in 2023, we intensively analyzed our business processes and derived the relevant use cases from them. We mapped these out in 2024 using various proofs of concept and used 2025 to transition these POCs into a scalable architecture that is prepared for future growth. In 2026, we will move into rollout.

To put this into context once more: GMP is quite specific to the pharmaceutical industry—Good Manufacturing Practice. It requires proof and documentation of what was done, how it was done, and how changes were implemented. I find it very interesting that you approached this topic in a strongly process-driven way. In earlier discussions, including a special episode, we observed that in many digitalization projects technology takes center stage. This often leads to problems, because while you may end up with an impressive POC, the processes themselves are not truly understood—neither in their current state nor in how they will evolve. A third element that is often missing is the human factor. My personal guideline has therefore become: technology, processes, and people. How did you approach this? In terms of change management, user involvement, and acceptance of the technology—did you have concrete approaches in place for that?

Bastian

Before we engaged in digitalization at this scale, we established a governance model at a very early stage to ensure that everyone was working toward the same target vision. This target vision serves as a company-wide point of reference. It is deliberately defined at a certain level of abstraction — not at the level of individual applications or specific tools — but in a way that clearly describes our roadmap and our intended architecture. This gives the individual departments a clear orientation. For us, it was important to focus first on the core processes. In our case, that meant the aseptic core, including the laboratories. We wanted to provide value to our customers at an early stage, because our customers also want to work with data. Sharing data at defined stages of digital maturity is an important factor. That is why we started with a clear target vision — one that provides guidance and one that everyone is expected to align with. Change management is a critical factor, and we are certainly not at the end of that journey yet. In fact, there will never be a final endpoint, because it is a living process. However, you can clearly see that we have evolved significantly over the past few years. The company has clearly stated the goal of digitalizing the core processes. This helps enormously, because processes and resources can then be aligned around a shared corporate objective. As a result, a solid approach has emerged for how we address these topics. We are not driven by technology, but by a holistic perspective. We incorporate relevant EAM approaches, focus on processes, systems, and data, and aim to truly understand how they interact.

[13:39] Solutions, offerings and services – A look at the technologies used

Fabian, turning to you and soffico: Can you explain how Orchestra supports this approach? Especially in such a highly regulated environment, this is not an everyday setting. Where do you come from originally, and how did your journey into this environment begin?

Fabian

At soffico, we were heavily involved in the healthcare sector for a long time — working with hospitals and health insurance providers. These are also highly regulated environments. We are very familiar with this type of regulatory framework, and we feel comfortable operating within it. When we decided to place a stronger focus on industry, it was a natural step for us to also move deeper into the pharmaceutical sector. At Vetter, the goal is a unified, clear, and scalable architecture that can be applied across all sites and centrally monitored and operated. In other words, we are aiming for a very coherent overall concept that we are planning and implementing together. We are operating in a GMP-regulated pharmaceutical environment. GMP means that European regulations for the production and packaging of medicinal products must be strictly followed. As a result, this environment is highly documentation-driven and process-driven. Especially at the OT level, the pharmaceutical industry is often relatively rigid, slow-moving, and less flexible compared to other industries. Despite these conditions, our goal is to build a scalable, flexible, and evolvable architecture. That is the core challenge. On the one hand, we use Orchestra as an integration platform that enables versatile, vendor-independent integration. On the other hand, we rely on a very high degree of automation. In practical terms, this means using CI/CD pipelines in combination with a Kubernetes environment, which we also deploy at Vetter. This level of automation allows us to define processes clearly and implement them in a compliant way, including automated test cases and all required validations. The objective is to run these processes in a highly standardized and low-risk manner. Once a process is clearly defined, tested, and approved, it can be scaled efficiently. It is important to understand that GMP regulation mandates certain practices. 
The principle of documented evidence applies: if something is not documented, it is considered not to have happened. This requires clear processes, clear approval workflows, changes based on the four-eyes principle, and strictly separated phases from development to testing to production — without shortcuts. In the automotive industry, quality is also a major concern, but it is not regulated by the EU in the same way or to the same extent.
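The approval discipline Fabian outlines — documented evidence, the four-eyes principle, and no shortcuts between development, testing, and production — can be expressed as a simple promotion gate. The following is a minimal sketch under stated assumptions; the function and field names are hypothetical and do not represent soffico's or Vetter's actual pipeline:

```python
def may_promote_to_production(change: dict) -> bool:
    """Gate a change for promotion from qualification to production.

    A change passes only if its automated tests succeeded, it carries
    at least two approvals, and the author has not approved their own
    work (the four-eyes principle).
    """
    approvers = set(change.get("approvers", []))
    return (
        change.get("tests_passed", False)
        and change.get("author") not in approvers  # no self-approval
        and len(approvers) >= 2
    )
```

In a real CI/CD setup such a check would run as a pipeline step before deployment, and its result would itself be logged as part of the documented evidence GMP requires.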

You touched on a number of exciting topics, such as CI/CD pipelines and similar concepts that originally come from the IT world and are now being used at the interface between OT and IT for data acquisition and data transparency. From your perspective, what is the concrete benefit of this? Do you see this as a general market standard, or more as something specific that you are implementing here?

Fabian

In general, Kubernetes is not something completely new. We have also discussed it several times on your podcast. Kubernetes was originally developed by Google and has been around for several years, but it has only started to gain broader adoption in industry relatively recently. So we are still operating in a very innovative field. Especially the way we use Kubernetes at Vetter — specifically for machine connectivity at the OT level — is something new. In this context, we are taking a very innovative approach and leveraging technologies from the IT world that give us capabilities we simply would not have with more traditional approaches, at least not with the level of flexibility and process dynamics we are aiming for.

Exactly. That is what I find particularly interesting — bringing established IT concepts more and more into the OT world, and doing so in a regulated environment with additional requirements. There must have been many topics around data security, governance, compliance, and audits. From your perspective, what was the most challenging part? Who took the longest to grant approvals — quality, works council, or other stakeholders?

Bastian

For us, this was actually a very substantial work package within the concept project. We realized early on that we had to deal intensively with validation processes. Only by doing so could we ensure that what we were designing from a technological and process perspective would ultimately be transparently validatable. This is crucial in order to be able to fully demonstrate the processes during audits — both customer audits and regulatory authority audits. If we had not considered validation from the very beginning, it would certainly have become a major challenge later on. Data integrity plays a central role in this context. It must be ensured that the data we capture at the machine level also arrives correctly in the higher-level systems. This is safeguarded through classic end-to-end tests, which are an integral part of the validation process. We work with a typical three-system landscape: development, qualification, and production, each running in separate instances. From my point of view, a major success factor was that we designed our integration from the very beginning around this three-stage landscape, aligned with established IT processes and release management. This allowed us to leverage synergies with existing processes. As a result, we encountered relatively little resistance along the way. Instead, we understood the initiative as an enablement for digitalization and vertical integration — and worked together toward a common goal.

Fabian

Exactly. I believe a very decisive factor was that QA was involved from the very beginning. QA was already included during the architecture phase, so that the ideas and concepts could be anchored early within the company and viewed holistically. This was extremely important in order to create acceptance across the entire organization for a new topic and a change process, which naturally always represents a certain hurdle. QA, in this context, stands for the Quality Assurance department.
If you look at how the setup looks today, Orchestra runs as a data integration platform on Kubernetes. Orchestra acts as a middleware that connects machines and IT systems — essentially as a kind of service bus in which process flows run and a unified data architecture is established. Our goal is to avoid point-to-point connections, isolated solutions, silos, and so-called spaghetti interfacing, and instead operate a central platform. This platform runs on Kubernetes and handles both machine connectivity and integration toward the IT layer. To achieve this, we have built a three-tier architecture: one layer at the OT level for direct machine connectivity, an intermediate layer for aggregating segments, and a layer toward the IT level. This architecture fits Vetter’s requirements very well and is crucial for scalability while at the same time ensuring a high level of fault tolerance — which is extremely important in the pharmaceutical industry.

We have talked a lot about transparency. In the past, processes were heavily paper-driven; today, they are becoming digital. When comparing the original as-is process with the target to-be process: are there specific use cases or aspects that particularly stand out or that you find especially interesting?

Bastian

We integrate highly distributed data sources. Our shopfloor is structured like a pyramid: very broad at the bottom, with many different systems and processes, and becoming increasingly narrow at the higher levels of the ISA pyramid. We identified this diversity and the distributed nature of the data sources as a key factor very early on. Only then can we ultimately automate state-based and event-driven workflows. Our goal is to make as many decisions as possible data-driven — whether they relate to process decisions, process analysis, or process optimization. Today, it is often the case that a value is written down from a machine, compared against a target value on paper, and then signed off using the four-eyes principle. If I integrate this data source vertically, ideally I only need to perform this review when a system detects a deviation. As long as systems such as MES, LIMS, or other systems report that the actual process corresponds to the target process, I no longer need to deal with every individual decision in detail. Instead, I can focus on a review by exception. This results in a significant efficiency gain and at the same time gives us the time to analyze the exceptions in detail, correlate them with additional process data, and make a well-founded assessment of whether they have an impact on the process or not. Business intelligence and data analytics are major and continuously growing fields. The better we orchestrate and standardize our data sources, the better we can use this data across the company for analytics applications. Today, a company of our size is already strongly data-driven, but primarily based on enterprise-level data. By connecting data all the way down to the shopfloor, we can position ourselves much more broadly and go much deeper into process optimization — ultimately creating additional value for Vetter.
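The "review by exception" idea Bastian describes — sign off only when a system detects a deviation, instead of manually checking every value — reduces to a comparison of measured values against target ranges. A minimal sketch, with purely illustrative parameter names and limits:

```python
# Illustrative spec limits; real GMP limits come from the validated
# master data, not from code.
SPEC_LIMITS = {"fill_mg": (495.0, 505.0), "temp_c": (18.0, 25.0)}

def exceptions_for_review(measurements: dict) -> dict:
    """Return only the measurements outside their target range.

    Everything in range needs no individual sign-off; only the
    returned deviations go to a human review.
    """
    flagged = {}
    for name, value in measurements.items():
        low, high = SPEC_LIMITS[name]
        if not (low <= value <= high):
            flagged[name] = {"value": value, "limits": (low, high)}
    return flagged
```

The efficiency gain comes from the empty case: a batch whose values all fall inside the limits produces no review items at all, freeing reviewers to analyze the genuine exceptions in depth.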

Overall, I find your approach very convincing, especially because you had a clear target vision from the very beginning. Many companies struggle in the early stages with building business cases based on individual use cases — typically with a clear ROI after one or two years. In a context like this, that is often hardly feasible, because you first need to create the foundations that later enable a wide range of different use cases. It quickly becomes a vicious circle: you do not yet have transparency regarding potential benefits, savings, or effects, but you still have to invest upfront in comparatively costly groundwork, such as data structures and integration. That requires a certain level of initial conviction to even get started. Was this driven by individual people in your case, or was it a conscious organizational decision? This is highly relevant for many listeners, because at some point this decision has to be made — and it is anything but trivial.

Bastian

This was a clear organizational decision. We are a family-owned company and, despite our size, we still see ourselves as a mid-sized enterprise. We had full backing from management and a clear directive: we want to digitalize. In doing so, we did not only focus on vertical data integration, but very early on also addressed the topic of master data. After all, automation only makes sense if the relevant master data is available digitally. A key term here is product lifecycle management. Processes can only be automated efficiently if the underlying master data is cleanly available in digital form. For us, these were the first two digital initiatives: on the one hand, migrating master data management into a PLM system and establishing a classic product lifecycle management process, and on the other hand, driving data integration—both horizontally at the IT level and vertically down to the shop floor. Only on the basis of these two building blocks were we then able to build processes together with additional applications such as MES and LIMS systems. We deliberately did not start by looking at individual tools, but instead focused on master data and integration as the foundation.

[26:51] Transferability, scaling and next steps – Here’s how you can use this use case

Would you say that you have now reached an intermediate stage? Is Orchestra generally ready for use across the pharmaceutical industry — in the sense of plug and play with everything showing green lights? Or where do you see the boundary between what can be adopted one-to-one and where individual customization is still required? And how do you see this evolving going forward?

Fabian

What we are implementing at Vetter is, of course, very pharma-specific. When we look at machine connectivity, we are often talking about protocols such as OPC UA, which are now widely used across industry. However, when we zoom into the laboratory environment, things look quite different. There we are dealing with devices such as scales, UV meters, pH meters, or osmometers, some of which use very specific protocols — for example RS232, serial interfaces, or proprietary, file-based formats. You do not encounter this in every company, and as a result, you end up with a very heterogeneous environment that still needs to be integrated and managed. What Vetter is doing together with us in terms of using Kubernetes is also still very innovative in this form. I would say that Vetter, together with us, is acting as an innovator here and opening up a field that is not yet widely established. In summary, IT/OT projects often resemble each other in their fundamental patterns; the use cases and objectives are comparable. But in the end, every customer is different, with their own philosophies, challenges, and requirements. That is why the 80–20 rule is a good way to describe it.

In which other industries are you active with Orchestra beyond this?

Fabian

Basically, we are active wherever something is manufactured or operated. That ranges from automotive and pharmaceuticals to the electronics industry. Beyond that, we also work with organizations in the public sector, such as public administration, as well as with banks, insurance companies, and even in the defense sector. So we are very broadly positioned. In every organization, the core challenge is ultimately the same: getting data from A to B via a standardized, maintainable, and operable path. That is exactly what Orchestra is well suited for — across industries.

That sounds like a high degree of flexibility. To wrap things up, I’d like to ask how things will continue on your side. Bastian, turning to you: you have now built the data infrastructure, identified multiple application areas, and also mentioned artificial intelligence — which of course requires data and context to be applied meaningfully. What does your aligned target vision look like, and how are you approaching the next steps?

Bastian

We are now entering the rollout phase for the use cases. First, we are finalizing the infrastructure, and then we will roll out the use cases with the appropriate operating model. As in any company, this is not just about data integration, but also about consistently further developing the overall system landscape. At the moment, we are right in the middle of the journey. Our goal is to consistently advance the remaining digitalization initiatives and, step by step, work toward our target vision. We will reach this vision through intermediate stages in the foreseeable future, but it will never remain static for ten years. We will continuously benchmark it against the market and against our own processes. For us, digitalization is not a one-off initiative; it will accompany us over the coming years and repeatedly require adaptability. That said, I believe we have laid the foundation to further develop our core processes in a way that generates significant added value — both for ourselves and for our customers.

What I find particularly positive about this is that you are doing it from a position of strength. You are growing strongly, and that often leads companies to prioritize growth over innovation. All the more encouraging to see that you are combining both: building the foundations while at the same time remaining innovative. Digitalizing early, from a position of strength and before real problems arise, strikes me as a very good approach. Fabian, from soffico’s perspective: where do you currently see the strongest areas of development?

Fabian

If we look at the phases we are in, we have completed the initial steps. The architecture is in place, the first connections have been implemented, and next year we will move into the actual rollout phase. That is when things will become particularly exciting, because we will see whether the processes and the architecture perform in day-to-day operations exactly as we envisioned — especially once the validation processes are truly lived and applied in practice. I believe that in one or two years, we will be in a very good position to report on best practices, the learnings we have taken away, and also where we may have made mistakes — because that is part of the journey as well. Looking toward 2026 and beyond, this will definitely remain a very exciting topic.

Thank you very much from my side. I found the conversation extremely interesting and look forward to updates in the coming months and years. Do you have any final words for our listeners, or have we covered everything?

Bastian

To wrap things up, I would like to explicitly thank Fabian and the entire soffico team for what they have accomplished together with us over the past few years. This has been an important foundation for our growth. The support was excellent, the level of flexibility was very high, and in the end, the product itself convinced us as well. A big thank you to Fabian and the entire soffico team.

Fabian

Thank you very much, Bastian. I would like to return that sentiment in exactly the same way. I think what we have already achieved together is fantastic. Now it is about taking the next steps. For us at soffico, having a truly partnership-based relationship with our customers is extremely important in order to be successful together and pull in the same direction. That works very well with Vetter, and I am very much looking forward to our continued collaboration. For anyone who is interested in the topic: at soffico, we regularly host events. In November, we held our Orchestra Symposium, where Bastian kindly gave a very insightful presentation. For next year, we are planning additional roundtables with different customers — the next one, with a focus on pharmaceuticals, is planned for the first quarter. Anyone who is interested is very welcome to get in touch, stop by, and discuss the challenges in the pharmaceutical industry with us.

Great. Thank you very much for the interesting episode, and see you soon.

Fabian

Thank you, goodbye.

Bastian

Thank you very much.

Questions? Contact Madeleine Mickeleit
