Accelerate data pipeline development with a platform


IoT Use Case Quix
Reading time: 4 minutes

Processing real-time business data places high demands on the IT infrastructure and on the expertise of IT teams. Development environments such as Quix’s platform, which simplify complex technologies and accelerate the development of data-driven applications, are therefore in demand.

The challenge: High IT requirements for processing real-time data

Data has become a major economic factor in recent years. Companies can use it to better analyze their customers, develop products faster and build new types of business models. These include, for example, delivery forecasts in logistics, machine learning and digital twins in manufacturing, or connected services in the automotive industry. The prerequisite for this type of data project is a specific infrastructure that is capable, above all, of evaluating events in real time (event streaming).

One example: Many industrial companies use the Industrial IoT to capture the status and process data of their machines and systems. They use it to monitor production and check the quality of the manufactured products. Large amounts of data are generated in the process, especially when visual inspection methods are used that evaluate image and video information. This real-time data must be analyzed immediately after it is generated, as it forms the basis for further processes.

The demands on IT performance are therefore very high. In medium-sized companies in particular, slow databases with high latencies and database architectures poorly suited to time series data often prevent meaningful use of this data. Easy-to-use development environments that hide complex technologies such as Kubernetes, Kafka or Docker from the companies using them and simplify the development of data-driven applications are therefore in demand.

The solution: A platform for the development of data-driven real-time applications

Quix is a platform for developing real-time data-driven applications that includes IT infrastructure, application programming interfaces (APIs), and a development environment. It allows Python developers, for example, to stream, process, and store data without requiring IT teams to deploy additional technologies. In doing so, the platform can easily scale to handle trillions of business events per day and provide organizations of all sizes with data pipelines for real-time processing.

The platform is an open source toolbox that provides three types of building blocks for a data pipeline: data sources, data transformations and data targets. These building blocks support a wide range of Industrial IoT applications, as a simplified example shows: IoT sensors in a machine tool serve as the data source. The platform processes their readings with a data transformation, for example by comparing them against a threshold value. The results are then fed into a data target, such as a real-time dashboard that displays them.
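The source → transformation → target pattern described above can be sketched in plain Python. This is an illustrative sketch only, not the actual Quix API: the function names and the simulated sensor values are invented for the example.

```python
# Illustrative sketch of the source -> transformation -> target pattern.
# All names and values here are hypothetical, not the Quix API.

def sensor_source():
    """Data source: simulated IoT sensor readings from a machine tool."""
    for temperature in [61.0, 72.5, 95.3, 88.1, 101.7]:
        yield {"sensor": "spindle-temp", "value": temperature}

def threshold_transformation(readings, threshold=90.0):
    """Data transformation: flag each reading that exceeds a threshold."""
    for reading in readings:
        yield {**reading, "alert": reading["value"] > threshold}

def dashboard_target(events):
    """Data target: collected into a list here; in practice, a live dashboard."""
    return list(events)

pipeline_output = dashboard_target(
    threshold_transformation(sensor_source(), threshold=90.0)
)
```

In a real deployment, each building block would run as its own service connected via Kafka topics; the generator chaining above only mirrors that flow in a single process.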

Quix’s particular strength is its rich ecosystem of technical integrations and partners, which allows the platform to be easily embedded into an existing architecture. For example, many customers already use a streaming solution such as Apache Kafka, often in conjunction with Confluent, which they complement with Quix. To analyze this data, Quix provides pre-built transformation objects for tasks such as fraud detection and sentiment analysis, as well as for some of the predictive AI models from the Hugging Face community. That community hosts many more pre-trained models, such as the well-known GPT-2. In principle, you can build your own Quix data transformations on top of well over 60,000 AI models covering every conceivable machine learning task.
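Conceptually, plugging a pre-trained model into a pipeline means wrapping it in a transformation step. The sketch below illustrates this shape with a toy stand-in classifier; in a real setup the stand-in would be replaced by an actual pre-trained model (for example a Hugging Face sentiment-analysis pipeline), and none of the names here are part of the Quix API.

```python
# Sketch: a pre-trained model wrapped as a data transformation.
# `toy_sentiment_model` is a hypothetical stand-in for a real model call
# (e.g. a Hugging Face sentiment-analysis pipeline).

def toy_sentiment_model(text: str) -> str:
    """Stand-in classifier; a real deployment would invoke a pre-trained model."""
    negative_words = {"broken", "late", "faulty"}
    words = set(text.lower().split())
    return "NEGATIVE" if words & negative_words else "POSITIVE"

def sentiment_transformation(messages):
    """Data transformation: enrich each message with a sentiment label."""
    for message in messages:
        yield {**message, "sentiment": toy_sentiment_model(message["text"])}

feedback = [
    {"text": "Delivery arrived on time"},
    {"text": "The replacement part was faulty"},
]
labeled = list(sentiment_transformation(feedback))
```

The transformation stays a pure function of each incoming message, which is what makes it easy to slot into a streaming pipeline between a source and a target.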

The result: A developer stack for data products

Quix provides all the functionality needed to develop applications with data pipelines. Using Quix creates new data products with a distinct advantage: data is kept in memory, resulting in lower latency and lower operating costs.

Quix’s developer stack provides a web UI, APIs and an SDK that shield developers from the underlying infrastructure. The entire platform follows a serverless approach with a very tight integration with Kafka. Its database capabilities also include a time series database, so companies are ready for any Industrial IoT scenario.
