Making OT Data Usable in the Cloud with Connectors
The challenge: Proprietary structures inhibit the enterprise-wide use of data
While IT applications have long benefited from storing data in the cloud, operational technology (OT) data still faces numerous hurdles on the way there. The sheer variety of formats alone makes standardizing the information laborious: OT data ranges from production data and sensor signals to service information and metric time-series data.
In addition, much of this information is bound to proprietary technology and can only be prepared for company-wide use at great expense. Access to DCS, PLCs, SCADA systems, and historians must therefore be set up individually for each system. Today there is no clear path to making this data available, which not only harbors security vulnerabilities but also represents an immense cost factor.
Tag fees and licensing requirements further complicate the process. Companies that want to use their operational data across sites and divisions for analysis and further development are therefore looking for ways to bring the different formats together. In view of the constantly growing amount of data produced by machines, sensors, and simulation tools, the need to use that data intelligently is increasing as well. According to an IDC estimate, global data volumes will increase 61 percent to 175 zettabytes in 2025 compared to 2018, and industrial data accounts for a large part of this volume. Yet only about 5 percent of this data is used at all, and not all of it consistently. Given that only 3 percent of companies even meet the basic data-quality standards for advanced analytics, there is huge potential to generate value from this volume of data.
The solution: One data pool as the company-wide basis for information and industrial data analysis
When industrial operating data, which is acquired sporadically and structured very differently, is brought into a uniform format, completely new possibilities for analysis and operations arise. Microsoft’s Azure cloud provides the platform on which data models and structures can be unified. Uptake Fusion is a SaaS tool in the Azure Marketplace that consolidates metadata as well as historical and time-series data from multiple sources in the cloud; quality assurance and normalization of the data happen automatically.
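To make the idea of a uniform format concrete, here is a minimal Python sketch of the kind of normalization step such a tool performs. The source field names (`TagName`, `Ts`, `Val`, `EngUnits`, `Quality`) stand in for a hypothetical historian export; they are not Uptake Fusion’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Unified record: every source tag reading is mapped onto this one schema.
@dataclass
class TagReading:
    tag: str             # normalized tag name, e.g. "plant1/boiler/temp"
    timestamp: datetime  # always UTC
    value: float
    unit: str
    quality: str         # "good" or "uncertain"

def normalize_historian_row(row: dict) -> TagReading:
    """Map one (hypothetical) historian export row to the unified schema."""
    return TagReading(
        tag=row["TagName"].lower().replace(".", "/"),
        timestamp=datetime.fromtimestamp(row["Ts"], tz=timezone.utc),
        value=float(row["Val"]),
        unit=row.get("EngUnits", ""),
        # OPC-style quality codes: 192 and above count as "good".
        quality="good" if row.get("Quality", 192) >= 192 else "uncertain",
    )

row = {"TagName": "Plant1.Boiler.Temp", "Ts": 1700000000, "Val": 87.5, "EngUnits": "degC"}
reading = normalize_historian_row(row)
print(reading.tag)  # -> plant1/boiler/temp
```

Once every source feeds records of this one shape, downstream analytics no longer need to know which historian or PLC the reading came from.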
Once the data is stored in the cloud, access for different users can be defined individually through roles. On this basis, tools such as Time Series Insights and Power BI deliver important insights that can be used to increase efficiency and thus improve the bottom line.
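Role-based access of this kind can be pictured with a small sketch. The role names and tag prefixes below are invented for illustration and are not Uptake Fusion’s actual access model.

```python
# Each role grants read access to a set of normalized tag-name prefixes
# (hypothetical roles and plants, for illustration only).
ROLE_PREFIXES = {
    "maintenance": ["plant1/boiler/", "plant1/pump/"],
    "analytics":   ["plant1/", "plant2/"],
}

def can_read(role: str, tag: str) -> bool:
    """Return True if the given role may read the given normalized tag."""
    return any(tag.startswith(prefix) for prefix in ROLE_PREFIXES.get(role, []))

print(can_read("maintenance", "plant2/mill/load"))  # -> False
print(can_read("analytics", "plant2/mill/load"))    # -> True
```

The point of such a mapping is that access rules are written once against the unified tag namespace, instead of per source system.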
Uptake Fusion “liberates” data from storage that is often locked away in individual parts of the enterprise, each protected by the security measures in place there. As soon as enterprise-wide analyses are to be performed, the material must be extracted from these silos to become meaningful “big data”. The IT and OT perspectives then merge into a holistic improvement across the enterprise.
Uptake Fusion also preserves the metadata that is largely lost with other methods when data is transferred to the cloud. Delivered as Software as a Service (SaaS), it hosts the data either in the customer’s own cloud or in the Uptake Cloud.
Uptake Fusion provides rapid, centralized access to granular industrial operational data from multiple sources for experimentation and analysis. Asset hierarchies and data models can be easily imported and reconciled with existing data management systems. In less than two weeks, the cloud-native Industrial Analytics Data Hub can put event and time-series data to work in new and impactful ways.
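An asset-hierarchy import often starts from a flat export of (node id, parent id, name) rows. The following sketch uses made-up rows to show how such a flat list can be rebuilt into full hierarchy paths for reconciliation with an existing data model.

```python
# Hypothetical flat export of an asset hierarchy: (node id, parent id, name).
rows = [
    (1, None, "Plant1"),
    (2, 1, "Boiler"),
    (3, 1, "Pump"),
    (4, 2, "TempSensor"),
]

names = {node_id: name for node_id, _, name in rows}
parents = {node_id: parent_id for node_id, parent_id, _ in rows}

def path(node_id: int) -> str:
    """Walk up the parent chain and return the full path, root first."""
    parts = []
    while node_id is not None:
        parts.append(names[node_id])
        node_id = parents[node_id]
    return "/".join(reversed(parts))

print(path(4))  # -> Plant1/Boiler/TempSensor
```

Such full paths give each asset a stable, human-readable identity that can be matched against the hierarchy already maintained in an existing data management system.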
The result: Curated data for every conceivable purpose
Uptake Technologies’ customers and references speak for themselves: United Road Services, a leading expert in vehicle transportation, increased fleet uptime with a 4x ROI using Uptake. A customer in the freight rail segment increased the time between unscheduled maintenance events by 34 percent. And one of the largest mining companies in the world selected Uptake and identified more than $34 million in added value over a five-year period through predictive industrial analytics. In short: Uptake delivers business value across industries and use cases.
With an implementation time of only about two weeks, the transition to Uptake Fusion is quick, and after this short period all of its benefits are available to users. The secure storage and standardization of operational data creates synergies for almost all operational units, because standardization reduces the effort needed to process data before further use. Some applications are never even implemented because of the expected effort; Uptake Fusion therefore also opens up new fields of analysis and application that are not possible within confining structures. What saves time also saves costs, and working efficiently increases not only employee satisfaction but also the overall operating result.
The data also gives companies a basis for meeting their ESG targets. Especially in asset-intensive industries such as oil and gas, reliable data is one of the basic requirements for sustainability programs, whether for real-time statements about consumption or for audits.
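As a simple illustration of a consumption statement derived from normalized time-series data, the sketch below sums hypothetical energy-meter deltas into a daily figure; the plant, date, and unit are assumptions, not real data.

```python
# Illustrative only: hourly energy-meter deltas (kWh) for one plant on one day,
# aggregated into the daily consumption figure an ESG report would cite.
readings = [
    ("2024-05-01T08:00:00", 120.0),
    ("2024-05-01T12:00:00", 95.5),
    ("2024-05-01T16:00:00", 110.5),
]

daily_kwh = sum(value for _, value in readings)
print(f"plant1 consumption on 2024-05-01: {daily_kwh} kWh")  # -> 326.0 kWh
```

With all meters normalized into one pool, the same aggregation runs identically for every site, which is what makes auditable, cross-site sustainability reporting practical.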
Click here to find the app in the Microsoft Azure Marketplace: Uptake Fusion: Cloud-Native Industrial Data Analytics Hub