📣 A quick reminder before we start: Our next ITOT.Academy kicks off in May, and our early bird offer is available once more. Want to join our fourth group and learn how to bridge IT and OT? There is no better time than now!
👉 Check the curriculum & enrol via ITOT.Academy 👉
If you’ve been following our blog and podcast, you know we spend most of our time in what we call the IT/OT zone: data platforms, connectivity, governance, AI use cases, and everything in between. We’ve also covered the Purdue model, MES, UNS, and even Model Predictive Control. However, we rarely talk about what happens at Level 1 and Level 2 — the actual process control layer that keeps plants running. Not the data it produces. Not the dashboards built on top of it. The control itself.
So when we had the chance to sit down with Margret Bauer, Professor of Process Automation at the Hamburg University of Applied Sciences, we jumped at it. Margret is an electrical engineer by training, did her PhD in data analytics on process data back in the early 2000s (before “data analytics” was cool), worked for ABB Corporate Research, and even did early IT/OT integration work — connecting SAP with ABB’s 800xA system back in 2007. (Yes, 2007)
PID: The Most Important Algorithm Most People Have Never Heard Of
Let’s talk about PID control.
Not P&ID (the diagram) — PID, short for proportional, integral, derivative. If you studied engineering, you probably had one half-lecture on it, sandwiched between Kalman filters and Lyapunov functions. Easy to overlook.
Except it runs the world.
Margret was blunt about this: 99.9% of all rockets that have flown into space run on PID control. All the robots you see online? PID underneath. Every valve opening and closing in a chemical plant, a refinery, a bakery? PID.
The concept is elegant: the proportional part looks at the present, the integral part looks at the past, and the derivative part looks at the future. Three aspects of time, one controller. As Margret put it: it has the worst name and the best track record of any control strategy out there.
But don’t let the simplicity fool you. In practice, PID is hard to implement well. Valves have physical limits — they can’t open beyond 100% (no matter how politely you ask). They take time to respond. And when you need to coordinate two valves for the same flow — say, one big valve for coarse control and a small valve for fine-tuning — the logic layered on top of PID gets complex fast. These layered strategies exist in every process plant, and nobody outside the automation world ever talks about them.
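To make the past/present/future idea concrete, here is a minimal sketch of a discrete PID loop in Python. This is our own illustrative example, not code from the interview: the gains, scan time, and the simple anti-windup scheme are all assumptions, and real DCS/PLC implementations are considerably more elaborate. It does show the two practical points above: the three time-based terms, and the output clamp that models a valve that simply cannot open beyond 100%.

```python
class PID:
    """Textbook discrete PID with output clamping and basic anti-windup.

    Illustrative sketch only; gains and limits are hypothetical.
    """

    def __init__(self, kp, ki, kd, dt, out_min=0.0, out_max=100.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_min, self.out_max = out_min, out_max  # valve travel limits (%)
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        # Proportional: reacts to the present error
        p = self.kp * error
        # Integral: accumulates the past to remove steady-state offset
        self.integral += error * self.dt
        i = self.ki * self.integral
        # Derivative: anticipates the future from the error's rate of change
        d = self.kd * (error - self.prev_error) / self.dt
        self.prev_error = error
        out = p + i + d
        # Clamp to the valve's physical range, and back the integral off
        # while saturated so it doesn't "wind up" with the valve pinned
        if out > self.out_max:
            self.integral -= error * self.dt
            out = self.out_max
        elif out < self.out_min:
            self.integral -= error * self.dt
            out = self.out_min
        return out
```

In a real plant this runs at a fixed scan rate inside the controller, and picking kp, ki, and kd for a sluggish valve on a nonlinear process is precisely the craft Margret is talking about — the algorithm is three lines; the tuning and the layered strategies around it are the career.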
A Dying Breed
Margret posted on LinkedIn that process control engineers are a dying breed. When we asked why, her answer was painfully logical: the automation worked. Companies invested in control systems in the 1970s, 80s, and 90s. Plants got more stable. And then management looked at the 20-person controls department and said: “Why do we still need these people? The process runs fine.”
So they cut the teams. One by one, across the industry.
And that is a major problem.
In industry, many control departments are gone, and with them the expertise to improve or even maintain automation performance. In academia, process control is barely taught anymore, so few new process control engineers are coming through the pipeline. The academics who still focus on it? A handful worldwide, passionate but outnumbered (and Margret is certainly one of the passionate ones 🙂).
The AI Reality Check
Willem couldn’t resist: “Margret, of course, I’m going to come in with the solution for all your problems. You need to use AI. It’s going to solve everything.”
(We all laughed.)
Margret’s response was, predictably, more measured.
One of her master’s students developed a reinforcement learning algorithm for a batch penicillin process that improved throughput by 25%. Genuinely impressive. But it worked because the student had a well-understood simulation model. In the real world? The algorithm wasn’t scalable, wasn’t repeatable, and wouldn’t transfer to another process.
This ties straight into something we’ve been discussing a lot recently on this blog: the physical twin problem. AI models need to understand the underlying physics, the process behaviour, the control strategies. Without that, you’re optimising in a vacuum. David’s own experience with nonlinear MPC during his master’s thesis confirmed the same thing — beautiful results on simulated data, useless on real plant data.
The takeaway isn’t that AI can’t help. It’s that AI without process knowledge is just maths looking for a purpose.
The Operator Paradox
There’s another angle Margret brought up that resonated with us: the better your automation, the more bored your operators become. One of her students — a former operator — said she used to bring a book to her shift. Press the button, sit down, read for eight hours, hand over to the next shift.
That’s great from a stability standpoint. But it creates a dangerous gap. When something does go wrong — and it always does eventually — operators haven’t seen enough upsets to know how to respond.
The more you automate, the less exposed your operators are to disturbances, and the harder it becomes to train them for the exceptions.
And you can’t just “turn off the MPC layer to make things interesting again,” as David pointed out. So the industry adds another layer — operator training simulators, essentially flight simulators for plant operations. Layer upon layer upon layer.
Margret’s view? We’ll never fully automate everything. Every process is different, every plant is an individual. We’ll always need people. The question is how we keep them engaged, trained, and ready for the moments that matter.
Why This Matters for the IT/OT World
If you’re reading this blog, chances are you’re working on data platforms, digital twins, AI use cases, or integration architectures. All of that is important. But it all sits on top of a foundation that most of us take for granted.
Process automation isn’t a solved problem. It’s an under-invested, under-documented, under-appreciated layer that directly determines the quality of the data we work with, the stability of the processes we try to optimise, and the feasibility of the AI models we try to deploy.
If the foundation crumbles, nothing above it works.
So next time you’re debugging a data quality issue, or wondering why your AI model produces nonsense, or trying to understand why a sensor reading oscillates when it shouldn’t — maybe the answer isn’t in your data platform. Maybe it’s one layer below.
Find Margret on LinkedIn: https://www.linkedin.com/in/margret-bauer-a885618/
Book ‘Process Control in Practice’ mentioned during the podcast: https://www.amazon.de/Process-Control-Practice-Gruyter-Textbook/dp/3111103722
Stay Tuned for More!
Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.
🚀 See you in the next episode!
Youtube: https://www.youtube.com/@TheITOTInsider
Apple Podcasts:
Spotify Podcasts:
Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.