Energy-efficient buildings through IoT data analytics

Listen to the IoT Use Case Podcast on Spotify.
Listen to the IoT Use Case Podcast on other platforms.

IoT Use Case Podcast #90 - Microsoft + Techem

Saving energy with the cloud – In this podcast episode, technology giant Microsoft and the well-known energy service provider Techem talk about their IoT collaboration and how using Microsoft’s Azure IoT technology for metering data processing has created sustainable energy management solutions and services at Techem.

Episode 90 at a glance (and to click through):

  • [08:33] Challenges, potentials and status quo – This is what the use case looks like in practice
  • [20:15] Solutions, offerings and services – A look at the technologies used
  • [36:33] Results, business models and best practices – How success is measured

Podcast summary

Techem is one of the leading energy service providers for all aspects of real estate. Intelligent metering technology can be found in almost every third German rental apartment. But not only here in Germany: Techem is “out in the field” with a total of more than 50 million end devices in 19 countries. For many years, bulk processing has been day-to-day business for the energy service provider. But how do they collect these masses of data?

Energy is generated, distributed and then consumed. The path of energy leads us to some of the questions of this podcast:

  • Generation: How can it be matched to consumption?
  • Grid distribution: How can I optimize power grids? How can I monitor a system with a wide variety of connectors and asset interfaces in such a way that I can detect anomalies at an early stage and intervene with predictive maintenance? How can downtimes in the networks be reduced?
  • Consumption: How do energy management solutions for end customers work (e.g. via dashboards)?


The podcast explores the specific use case that Microsoft has implemented together with Techem: the IoT data is read from the meters and streamed to Azure Data Explorer via Azure IoT Hub. Azure Data Explorer is a fully managed Big Data analytics platform that enables the analysis of large amounts of data in near real time: as soon as a device sends data, it can be processed immediately, without time delays or latency.
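To make the pipeline concrete, here is a minimal sketch of the ingest side, assuming the Python Azure IoT Device SDK; the connection string and payload fields are hypothetical placeholders, not Techem’s actual schema:

```python
# Minimal sketch: stream one meter reading to Azure IoT Hub, which can then
# route it on to Azure Data Explorer. Requires: pip install azure-iot-device
# Connection string and field names below are hypothetical placeholders.
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<meter>;SharedAccessKey=<key>"

def send_reading(client: IoTHubDeviceClient, serial: str, kwh: float) -> None:
    """Send one consumption value as a JSON telemetry message."""
    msg = Message(json.dumps({"serial": serial, "consumption_kwh": kwh}))
    msg.content_type = "application/json"
    msg.content_encoding = "utf-8"
    client.send_message(msg)

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()
send_reading(client, "XY-12345", 42.7)
client.shutdown()
```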

More about this and many exciting insights into the project from Techem and the selected partner Microsoft can be found in episode 90 of the IoT Use Case Podcast. Guests at the microphone: Malte Schulenburg (Senior Client Executive, Microsoft) and Dr. Roland Werner (Head of IT, Techem).

Podcast interview

Today we have a very exciting episode with the Techem Group. Most of us know them from the so-called “radio meter” hanging on the heating system. One can only imagine how many millions of these devices are installed in the utility units of apartments and buildings, and in industry as well.

What is particularly exciting is how Techem collects these masses of data. When billing heating costs, is it possible to see energy consumption per square meter more often during the year? And is it possible to access this data and, if so, how? You can find out all the answers and many exciting insights about the project from Techem and its selected partner Microsoft right now. And with that, straight into the podcast studio – let’s go!

Hello Malte and Roland, welcome to the IoT Use Case Podcast. Nice to have you with us today. Malte, how are you, and where are you right now? Are you working from home?

Malte

Good morning, Madeleine. I am actually working from home; I am in the beautiful city of Berlin. It is still rather cold, but the sun is shining.

Greetings, then I can practically wave to you. And Roland, it’s also nice that you took the time to join us today. Where are you right now, and how are you doing?

Roland

I am doing wonderfully. Today I am sitting in my office at our headquarters in Eschborn, just outside Frankfurt.

Very nice, I even know Eschborn. Do you have any production facilities there?

Roland

We are not a manufacturing business; we deal with data and the insights derived from it. That is our production, and it takes place in many different locations – especially at the IoT end devices on our customers’ sites.

I don’t think I need to introduce Microsoft here. Many of our partners from the IoT use case ecosystem, but also many of our users, are already working with your services today. You are well known around the world, especially for your Microsoft Azure services and the cloud computing services behind them. I think it would be much more exciting to talk about you personally: you’re an Account Executive. What does that job role involve, and what types of clients do you work with?

Malte

In my role as an Account Executive, I work directly with customers at the forefront. That means I’m always the first point of contact when customers are dealing, or want to deal, with Microsoft in any way. Especially around Azure, there are many questions and possible use cases. I look after customers exclusively in the energy sector – customers involved, in the broadest sense, in energy production, energy distribution or energy savings.

Microsoft has also been a guest on the podcast before; I’ll link to the other episode. But today it’s all about the use cases from your area and especially about the projects. Can you give a few examples of the use cases you address in the energy segment, and tell us which project we are looking at in detail today?

Malte

Especially if you look at the Azure landscape, there is an enormous number of ways to apply it. This means that we always work from the customer’s perspective. For example, when you look at energy management solutions, the question is: How can I monitor my system, with its wide variety of connectors and asset interfaces, in such a way that I can detect anomalies at an early stage and intervene through predictive maintenance?

Then there are use cases where you work in the area of “power grids”, which is the distribution of electricity. There we work with our customers on the development of Digital Twins. In principle, this means mirroring the hardware network so that it can then be digitally planned, expanded and monitored.

Another use case is smart charging. Renewable electricity production always fluctuates, which means that, unlike a classic gas or coal-fired power plant, it cannot always supply a constant load to the grid.

An increasing requirement in this area is that consumption is matched to production: consumption follows the feed-in to the grid. You can do that with EV charging stations, among other things; these are all examples of where we work with customers. With Techem specifically, we worked together on how to save CO2 – in other words, how CO2 savings and CO2 optimization can be achieved specifically through data analysis.

This is also what you mean by connectors and assets. So an asset could also be a Techem device, for example. Is that how it works?

Malte

Exactly, it’s always about how I can make a piece of hardware that is connected somewhere smart and, above all, network it so that I can then use the data that I read out somewhere to generate added value and achieve options for action. I need data transparency first, and then I can take certain steps.

Do you have one or two examples of which projects you are still working on with which customers?

Malte

Gladly. If you look at energy, you can in principle trace a path: energy is generated, distributed and then consumed. Our topics can be organized along these subdivisions as well.

When we look at generation, it’s often about how generation can be matched to consumption. Power grid distribution is about topics like: How can I optimize power grids? How can I detect anomalies early so that interventions can be made using predictive maintenance? How can downtimes in the networks be reduced?

Consumption itself is about energy management solutions for end customers, so that they can see everything via dashboards, for example. To give you a concrete example: we did a project with E.ON in which end customers were given their own dashboard in their smart homes, so that they could get an overview of their consumption and their connected devices, and then flexibly realize cost savings.

One trend we see is with Allego, for example. There, we helped build smart charging solutions for electric cars. The car is connected to a charging solution, and charging is then carried out depending on the feed-in of renewable electricity. Here, too: if I as a customer have some flexibility about when my car needs to be charged, then charging aligns directly with when the electricity is fed into the grid.

This has two advantages: the utilization of renewable electricity is increased, so sustainability improves. And secondly, costs can be saved, which the end customer then books as a direct benefit.
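To illustrate the scheduling idea Malte describes, here is a deliberately simplified sketch in Python; the feed-in forecast and the three-hour charging requirement are invented for illustration:

```python
# Toy sketch of smart charging: shift a flexible charging window toward the
# hours with the highest forecast renewable feed-in. All numbers are made up.
def pick_charging_hours(feed_in_forecast: dict[int, float], hours_needed: int) -> list[int]:
    """Return the hours of the day (0-23) with the highest feed-in."""
    ranked = sorted(feed_in_forecast, key=feed_in_forecast.get, reverse=True)
    return sorted(ranked[:hours_needed])

forecast = {h: 0.5 for h in range(24)}                  # baseline feed-in, GW
forecast.update({11: 3.2, 12: 4.1, 13: 4.5, 14: 3.8})  # midday solar peak
print(pick_charging_hours(forecast, hours_needed=3))    # -> [12, 13, 14]
```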

I will also link these projects in the show notes.

Challenges, potentials and status quo – This is what the use case looks like in practice [08:33]

Today you brought Roland, one of your customers from the company Techem. What exactly did you do here? What was your project about?

Roland

Techem is a familiar name to some people, as we are represented in every third rented apartment in Germany with our intelligent metering technology. We operate in a total of 19 countries and, out in the field – as we call it – we have over 50 million end devices. That’s why we like to call ourselves the “Hidden IoT Champion” from time to time, because mass data processing has been our daily business for many years.

Just looking around here, I have two Techem devices. And I think almost everyone has one in their apartment.

Roland

We are, of course, a company with history; we have been around for 70 years. But we are in the middle of a massive shift toward digitization. On the one hand, this concerns the end-device technology. If you look at a radio heat cost allocator mounted on your radiator today, it has a small digital display and is a radio-controlled device. No one has to come into the apartment to take readings anymore; that’s the past!

Today, these devices are digital and send their data at preset frequencies – monthly, for example – so that we can see how much heat was consumed in each apartment. That information is returned in an orderly manner to the tenant and the landlord, and proper heating cost billing can be created from it.

But there are also water meters: what is the water consumption? And heat meters. Our vision and mission is to make residential buildings green, smart and healthy. Healthy also includes aspects such as smoke alarms that hang from the ceiling and automatically check that they are functioning properly and will really work in an emergency. We are present there with our device technology.

You have a huge range of different products. But there is your digitalization vision: to approach the whole thing holistically, make data usable across different areas, and move toward new services that benefit your customers but are also used internally for optimization.

Roland

The decisive factor is the added value we bring to our customers. Our clients are typically landlords and managers of residential buildings, occasionally of industrial buildings. In essence, what our customers care about is being able to pass on the heating costs they incur to their tenants in an orderly and accurately measured way.

You can’t just distribute that per square meter; it has to be calculated based on consumption, and calculated correctly. This is one of our essential core businesses. And nowadays, as you can imagine, the topic is ever-present: how can I reduce energy use, how can I use less gas?

The so-called warm rent is a very expensive component for many of our tenants and also for landlords. That’s where we try to step in and help reduce all of that. How do you avoid consuming energy unnecessarily in the basement? That is part of our solution spectrum.

Can you describe the challenges you’ve seen while implementing this vision, from both a business and a technology standpoint?

Roland

As I just briefly indicated, the classic recording of heat consumption in apartments takes place once a month, which is sufficient as a measured value. Only these data are transferred, and from them the cost allocation can be made properly, both during the year and at the end of the year. But the requirements are increasing: it is no longer enough for me as a tenant to get a bill after a year and only then learn how much I have to pay on top.

In times of the current gas crisis, many people have real concerns about this. It is therefore very important to provide this information during the year as well. We always say: transparency increases awareness, and awareness enables change. If I as a tenant can already see once a month that I consume more than others, I can counteract that and am not at the mercy of a bill whose size I cannot even guess.

So intra-year consumption information is hugely important for behavior change. As anyone who reads the press knows, people in Germany have started to reduce their energy consumption by turning the temperature down a bit and changing their ventilation habits. Anyone can take part, and this applies not only to tenants but also to landlords.

That’s also an insanely exciting approach, not just for me personally, but probably for anyone listening. Because the issue affects us all. I imagine there are a lot of technological challenges that come along with that. What are they for you?

Roland

You have to make a distinction between the data received from above the basement ceiling – i.e. from the apartments – and the data we receive from the basement itself, which continuously monitors the heating system, for example whether it is functioning and optimally set. These involve different frequencies and volumes of data coming at us.

Thanks to our in-house technology development, we have a great foundation to start from. The devices we brought into apartments many years ago all transmit via radio. They regularly send their information, and we increasingly install gateways in the corridors of the buildings, which forward this data to us once a month. That means no one has to do anything manually anymore.

Nowadays, we really only have to go into an apartment when a water meter is due for replacement because of its calibration period. We continuously receive this data stream using IoT technology that we have engineered ourselves. These are all battery-powered devices with long run times – sustainability is very important to us, and the batteries last up to 20 years. The data, continuously sent to us, then lands in our cloud, where processing continues.

I can imagine that you are subject to strict legal regulations. Can you give some insight into the dependencies you have here and the regulations from legislators that are still to come?

Roland

For one thing, there are regulations on what needs to be properly recorded in the first place. There is the so-called Heating Costs Ordinance, which governs how landlords and property managers are allowed to allocate heating costs to their tenants. This is all very precisely defined in Germany and in neighboring countries, and of course we adhere to it strictly. We also comply with data protection regulations in everything we do; for example, we only need one reading at the end of the month, not within the month.

Of course, we also have to take IT security into account in everything, encrypting the values so that nothing gets lost. That is hugely important to us and priority number one in everything we do. There are also more and more legal regulations in Germany and in Europe on how energy consumption in buildings is to be optimized – and on what has to be recorded and displayed for this purpose.

I just mentioned the category of consumption information. This is an EU directive that has to be implemented: wherever heat consumption can be recorded remotely, tenants must also be given the chance to see it, so that they can adjust their behavior during the year.

What kind of data packets and data types do you have to process?

Roland

A huge amount of data comes to us every day – continuously, but also in peaks, for example at the beginning of a month. These are classic IoT telegrams, which say, for example: “Hello, I am end device with serial number XY, and here is my month-aggregated value” or “here is my current value.” These are mostly metered values, e.g. consumption units of heat, or units such as kilowatt hours or cubic meters of water.

In the basement we also get temperatures and other units. These telegrams are standardized: we adhere strictly to the Open Metering Standard (OMS), which we helped to establish in Europe, so the data is transported in this form. We then receive the telegrams, decrypt them, and store them in scalable databases – nowadays very much in the Azure cloud.
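As a rough illustration of what such a decrypted telegram might carry once unpacked – the real OMS wire format is binary and encrypted, and these field names are assumptions for illustration – consider this sketch:

```python
# Sketch of a decrypted OMS-style telegram after receipt and unpacking,
# just before it is written to a scalable store. Field names are illustrative.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MeterTelegram:
    serial: str          # device serial number ("Hello, I am device XY")
    medium: str          # e.g. "heat" or "water"
    value: float         # e.g. kilowatt hours or cubic meters
    aggregated: bool     # month-aggregated value vs. current value
    timestamp: datetime

def to_row(telegram: MeterTelegram) -> dict:
    """Flatten a telegram into a row for a database table."""
    return {
        "serial": telegram.serial,
        "medium": telegram.medium,
        "value": telegram.value,
        "aggregated": telegram.aggregated,
        "ts": telegram.timestamp.isoformat(),
    }
```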

I would be interested to know what the technological requirements were for the solution. And why did you choose Microsoft?

Roland

When I joined Techem three years ago as Head of IT, after many years in IT consulting, it became clear to me that we were facing a massive increase in the variability and volume of data. On the one hand, this has to do with the fact that our business is simply growing and that the types and quantities of smart devices in homes are increasing. That alone would still have been manageable and predictable, but we also have a massive increase in sensors, especially, as I said, from the boiler room.

We have seen a massive increase in readings, both in quantity and in frequency. That meant I had to find a solution for our business that no longer scaled on a data-center basis. We come from a world of very strong databases and capabilities in that area, but the goal was to master these new volumes and this variability – in the Big Data sense – with the latest cloud technology on top.

When I joined the company, software development and our operations teams were already doing very well, including with Microsoft technology. At the beginning of the Covid year, we had made a massive shift to Microsoft 365, and that worked well for us. So it was an obvious step for me to declare an Azure cloud strategy for our IT team as well. We are actually pursuing a single-cloud strategy with Microsoft Azure and focusing on moving there as quickly as possible, especially because, from my perspective, it offers very good IoT software and middleware, which we also brought into use very quickly.

Solutions, offerings and services – A look at the technologies used [20:15]

Malte, can you describe who did what in this project and who had which responsibilities?

Malte

I think a very important point is that you always have to work collaboratively on a project like this. It’s very important to see it as a team effort, where you work closely together and also define the cloud strategy together to some extent. If you want to tackle something like this, work with a lot of data and process it in a timely manner, you have to know why you’re doing it and how you’re going to set up your business model around it.

Cloud is not an end in itself; I need to know why I’m doing it, and this must be defined together at the beginning. The division in this case was that Techem focuses on its core competence – I don’t know as much about that as Roland does, and Microsoft is not as familiar with it as Techem is. That means everything concerning the hardware and the devices remains in Techem’s hands.

Where Microsoft comes into play is in providing the cloud infrastructure and the various services that reside on it. Microsoft is also completely responsible for the further development of those services. In principle, you can divide it so that Techem takes care of what they do with their data, and Microsoft takes care of what happens at the infrastructure and service level.

We make sure that security is guaranteed in the cloud and that scalability is ensured. Techem is always provided with the service they need at the time. The further development of services is also part of this. That’s one area Techem doesn’t have to worry about. So that’s quite a good split where we work together.

So how does the data collection work from each of the assets and devices that you have?

Roland

There is the OMS, which defines part of how a telegram is structured. We engineered the radio technology ourselves – we have our own research and development department – have the devices produced for Techem, and bring them into the field with our fitters, who then establish radio contact.

First, we transmit by radio out of the apartment. In the past, if someone walked by once a year with a receiver, they would pick up that telegram. In the meantime, the gateways we have engineered forward the telegrams instead.

Where do these telegrams go? We currently run two or three routes. For several years now, the telegrams sent to us once a month have used the so-called CoAP protocol. This is not MQTT, but a second IoT protocol optimized for battery-powered use: we want the gateways we install, with their battery packs, to last as long as possible, just like the end devices.

There, the data first lands in a CoAP receiver and is unpacked and processed from there. In the meantime, however, we have increasingly switched to the MQTT protocol for the mains-connected, continuously transmitting devices. You have to imagine: when it’s about the boiler room and data is sent every 15 minutes over mobile connections, with corresponding 3G, 4G or 5G routers, I also need a power supply, because these transmit all the time.

To do that, we’ve made a massive shift to Azure standards for receiving data, with IoT Hub and the IoT devices. For us, the essential data service where we store all these telegrams is Azure Data Explorer.
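For the mains-powered boiler-room route, a publish over MQTT might look roughly like this sketch, using the widely used paho-mqtt client; the broker address, topic, and payload are hypothetical:

```python
# Sketch of the MQTT path for mains-powered boiler-room gateways.
# Requires: pip install "paho-mqtt<2" (1.x-style constructor used below).
# Broker, topic, and payload fields are hypothetical placeholders.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)

payload = json.dumps({
    "serial": "GW-0815",
    "supply_temp_c": 62.5,   # boiler supply temperature
    "outdoor_temp_c": 4.0,
    "interval_min": 15,      # one reading every 15 minutes
})
client.publish("boiler-room/GW-0815/telemetry", payload, qos=1)
client.disconnect()
```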

This means that you receive all these data packets via your OMS radio standard; they are routed through the gateways and now primarily transmitted via MQTT. Do you also record building and infrastructure data? Is integrating IT systems or existing data an issue for you?

Roland

The pure measurement data is only transactional data. It wouldn’t help us if we couldn’t determine where exactly each device is installed and to which usage unit it belongs. That’s why managing a digital twin is hugely important to us.

We have had an in-house core system there for almost 20 years, and we are very proud of it. It first records which property we serve. You have to imagine: this can be a building with several entrances and several floors. The question quickly arises: which device is installed on which floor, in which usage unit – even in which room, or on which radiator, is this radio heat cost allocator?

So we have the master data and the mapping of the real world. The measurement data from the field is continuously coming in, and we can assign it. Only in this way can we look at the total consumption of a usage unit or of an entire building. This proper data maintenance and assignability is enormously important for us. We bring the data together both in the data center and in the cloud.
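The assignment step Roland describes boils down to joining each incoming reading against master data keyed by serial number; a minimal sketch, with an invented registry:

```python
# Sketch of the master-data lookup: raw telegrams only carry a serial number,
# so each reading is joined against a registry that knows property, floor,
# usage unit, and room. Registry contents are invented for illustration.
MASTER_DATA = {
    "XY-12345": {"property": "Musterstrasse 1", "entrance": "A",
                 "floor": 2, "unit": "2nd floor left", "room": "living room"},
}

def assign(serial: str, value: float) -> dict | None:
    """Attach master data to a raw reading; None if the device is unknown."""
    meta = MASTER_DATA.get(serial)
    if meta is None:
        return None  # unassignable reading -> data-quality follow-up
    return {"serial": serial, "value": value, **meta}

print(assign("XY-12345", 42.7))
```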

This data processing in Azure – I’ll just call it the cloud now – how does it work? Can you explain this service?

Malte

When we look at Azure, it’s the infrastructure first: it ensures that we map out an infrastructure in a scalable way, and services for data processing can then be integrated on top. If you look at data processing, there are basically three domains – ingest, transform/model, and serve – and all of them can be represented within Azure: data collection, querying, and then the visualization and management of the data that is read out.

If you look at the specific case that we implemented together with Techem, it’s that the IoT data is read out of the meters and it can then be streamed to Azure Data Explorer using Azure IoT Hub.

What is Azure Data Explorer? It is a fully managed Big Data analytics platform – one of the services that runs on Azure and can be used by customers. It enables large amounts of data to be analyzed in near real time.

Whenever a device sends data, it can be processed immediately, without time delays or latency. That is enormously important, especially in these use cases, in order to have transparent data in real time so that decisions can be made or deviations detected, to which a customer can then react.
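On the consumption side, querying Azure Data Explorer shortly after ingestion can be done with the azure-kusto-data package; a sketch, where the cluster URL, database, and table/column names are hypothetical:

```python
# Sketch of a near-real-time query against Azure Data Explorer.
# Requires: pip install azure-kusto-data (and a logged-in Azure CLI).
# Cluster URL, database, table, and column names are hypothetical.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<cluster>.westeurope.kusto.windows.net")
client = KustoClient(kcsb)

# KQL: average supply temperature per device over the last hour,
# available as soon as the telegrams have been ingested.
query = """
Telemetry
| where ts > ago(1h)
| summarize avg_supply_temp = avg(supply_temp_c) by serial
"""
for row in client.execute("metering", query).primary_results[0]:
    print(row["serial"], row["avg_supply_temp"])
```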

Roland

I’m happy to add to that; Azure Data Explorer was also a nice example of the interplay for me. Where did Microsoft help us here? When I was looking for solutions back then, I reached out to our Microsoft account team, and you helped us learn about other cases: how does the automotive industry do it, or how do comparable companies do it? Azure Data Explorer was a very new product on the market at the time.

It had previously been in internal use for telemetry capture, especially within Azure. It was a well-known tool there and originally came to Microsoft via the acquisition of an Israeli company.

What I liked was that the team gave me contacts to the product management and the development team. Within two months, my team and I were able to try out the service. We pumped data in, and it worked beautifully.

In fact, it has run continuously to this day, and we do not have to worry about it. I don’t need anyone to take care of the infrastructure, innovations are released continuously, and it is well documented. We have done very well with this decision; it has always worked for us and, above all, has been very scalable.

We then developed this further, of course: how can we build models and machine learning methods from our analytics and bring them into this database, in the Big Data sense? I’m not worried about data volumes or frequencies exploding.

How exactly do you do the analysis at that point?

Roland

I’ll gladly give you an example from the boiler room. There are decisions there that have to be made once a year – for example, should I renew this plant, or can I optimize it in some way? But there are also decisions that have to be made more frequently, up to daily.

For example, a heating system typically has what is called a heating curve: the ratio that determines how warm to make the supply temperature, depending on the outdoor temperature. What does my outdoor temperature sensor say? How cold is it on my street? If this heating curve is set incorrectly and the temperature changes, then heat is generated unnecessarily that is never consumed – it is generated again and again, although it is not actually needed. That is waste.

This happens, for example, when people forget to switch a heating system from winter to summer operation. These are things that we can monitor.
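A heating-curve check like the one Roland describes can be sketched in a few lines; the curve parameters and the tolerance below are invented for illustration:

```python
# Toy sketch of a heating-curve check: the curve maps outdoor temperature to
# a target supply temperature, and sustained deviations (e.g. winter-mode
# supply heat in summer) are flagged. All parameters are invented.
def target_supply_temp(outdoor_c: float) -> float:
    """Simple linear heating curve: the colder outside, the hotter the supply."""
    return max(30.0, min(75.0, 55.0 - 1.5 * outdoor_c))

def curve_anomaly(outdoor_c: float, measured_supply_c: float,
                  tolerance_c: float = 8.0) -> bool:
    """True if the measured supply temperature strays far from the curve."""
    return abs(measured_supply_c - target_supply_temp(outdoor_c)) > tolerance_c

# Warm summer day, but the boiler still runs at winter supply temperature:
print(curve_anomaly(outdoor_c=25.0, measured_supply_c=70.0))  # -> True
```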

How does this work with Azure Services? In the end, someone must have this visualization in front of them. What does it look like then?

Roland

It is a mixture of humans and machines prepared to give signals to the humans. In the first round, when we designed this digital boiler room, we started with a Power BI dashboard for our internal purposes, showing the 15-minute values but also alerts. In the meantime, this has grown into a fully multi-client-capable customer portal, where end customers can also see their own situation.

Some want us to handle it, some give us the mandate to do it for them. We can look at the data visually, but the alarms are more important. We continuously run algorithms that evaluate new data as it comes in. For example, if we notice that a hot water temperature is falling below a certain threshold and there is therefore a risk of legionella forming, we can take immediate action; or if we notice that a system is slowly failing, we can send a technician out.

For these algorithms, we worked a lot with Databricks on the Azure cloud as a modeling environment. The models are then brought into Azure Data Explorer and run there with the embedded Python engine. So we derive insights from the 15-minute data on an hourly basis and can take action.
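A threshold alert of the kind described (the legionella example) can be sketched as a small streaming check; the 55 °C threshold and the three-reading persistence window are assumptions, not Techem’s actual rule:

```python
# Sketch of a threshold alert: each new hot-water reading is evaluated on
# arrival, and a sustained drop below the threshold fires an action (e.g.
# dispatching a technician). Threshold and window size are assumptions.
from collections import deque

LEGIONELLA_THRESHOLD_C = 55.0
window: deque = deque(maxlen=3)  # last three 15-minute readings

def on_new_reading(temp_c: float) -> bool:
    """Return True if an alert should fire after this reading."""
    window.append(temp_c)
    return (len(window) == window.maxlen
            and all(t < LEGIONELLA_THRESHOLD_C for t in window))

for t in [58.0, 54.0, 53.5, 52.8]:
    if on_new_reading(t):
        print(f"ALERT: hot water at {t} degC, legionella risk")
```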

On the subject of the regulations you are subject to: how does Microsoft deal with this?

Malte

Regulatory requirements must of course be met in order to make a business case. None of this is an end in itself; you also have to add value, and that requires complying with the regulations. Encrypting data is always very important. That sounds relatively mundane, but it’s just the way it is. In these use cases – here with Techem, but also with other customers – we work together to ensure that they have full data autonomy and that the data is encrypted so that no one else can access it.

Even we at Microsoft can’t access it then. Customers can set up their own customer-managed encryption key; only they have access. Microsoft has also introduced the EU Data Boundary, which ensures that data processing and the use of all services take place only on Microsoft servers within the EU.

Moreover, the customer can even choose the region, data center, and country in which they want to keep this data. We really give customers a high degree of autonomy there.

Roland

I think we’re also on the Azure cloud together in Amsterdam. EU data processing is enormously important for us and for our end customers. We have our own data center in Gothenburg, and there is a good, fast line between the two. We also have customer-managed encryption turned on, including for Azure Data Explorer, so that the data there is additionally encrypted.

The possibilities are all there. I think the task for every IT leader is then to train their team; these are hot times as far as cyber and IT security are concerned. I’m proud of my small but excellent team, and continuous training on cloud technologies and IT security is very important there. I feel very comfortable, because professional data centers run by professional operators basically give me a good level of protection. If I add best practices on top – and I get good advice there, also from the Microsoft team, on what other customers do – then you can build very good protective barriers. This is a continuous effort for us.

Results, business models and best practices – How success is measured [36:33]

What’s the business case for you? Can you summarize it once more at the end?

Roland

For us, the use of Microsoft technology has become essential to our measurement data processing. In the beginning it was a supplementary instrument, but it has become indispensable. We are now able to handle large amounts of data and new frequencies, to identify and respond to anomalies for our customers even faster, and to offer them value-added services – and simply to deliver our current service ever faster and more reliably, with excellent radio read rates.

If you look at our website, you will see that things that may have taken days or weeks in the past are working faster and faster. Our customers appreciate this very much. Above all, I feel it puts us in a better position for the future. I assume that the geopolitical changes we are all subject to will once again bring new regulations and ideas to the table. I want to be ready for new regulatory requirements, but also for opportunities – equipped with high tech.

Malte

One point I always like to highlight is the relatively low entry barrier. In the past, a customer had to set up their own data center and think about investments. In the cloud, I can start up an instance with a few clicks in my Azure dashboard and simply try things out first. That enables me to act as a fast mover in the first place – and all of it within an OpEx framework.

Even if things don’t work out, I just shut down the instances and have practically no further costs. You could almost call this a paradigm shift, one that allows companies to try out new ideas very quickly. You don’t have to spend a lot of time theorizing about whether something is a viable business case; instead, you can simply try it quickly and get direct feedback from the market on whether it works.

None of this is an end in itself; it must add business value, and only then does it make sense to use it. I can try things out quickly – if it doesn’t work, I stop. If it works well, the solution can scale up virtually without limits, I don’t have to worry about it, and I can grow flexibly with the cloud.

That’s the beauty of working with clients: you can really look at a specific use case and just try out different concepts. It’s so much fun right now because failure is no longer expensive. This is the mindset we need more of to drive innovation forward.

What services are still to come? What else can we look forward to seeing you bring to the table in the future?

Malte

New features of existing solutions are constantly being developed, and new solutions are added as well. Two topics come to mind because they are immensely important. One is the area of Security & Compliance: data governance with Microsoft Purview is a particularly exciting topic.

The second is Microsoft’s partnership with OpenAI in the field of artificial intelligence. Looking at the first use cases, I believe this is a trend that will continue to grow in the coming months and years. In concrete terms: how can I use artificial intelligence, including Azure Machine Learning, to tap into data even more cleverly? Often the problem is not that we don’t have enough data, but that we are not able to understand it and work with it. This is where artificial intelligence can provide added value with real potential.

What’s coming from the Techem environment?

Roland

In addition to developing our device technology toward further sensor capabilities, I see the expansion of digital services for our end customers – making our added value simple to consume. We are proud of the good foundation we have in how the IoT data reaches us, but far more decisive is how we answer our customers’ core questions.

Another trend that is important to us, and that I see both in-house and at Microsoft: all this compute power, whether for AI or for IoT, takes a lot of energy. It’s important for us to work with companies that try to produce things as green as possible, so we are glad that Microsoft is making enormous efforts there. Among other things, we have a green coding initiative. We can all pitch in to save CO2, including in what we do with data and computing.

Thank you for that closing statement! Thanks, Roland, for your time, and Malte for yours as well. A really exciting project.

Have a great week! See you then!

Please do not hesitate to contact me if you have any questions.

Questions? Contact Madeleine Mickeleit

Ing. Madeleine Mickeleit

Host & General Manager
IoT Use Case Podcast