Operations intelligence is the information that gives you confidence to choose what’s next.
It could be as small as knowing which work order to focus on next and having the confidence that it’s the right one.
It could be as big as deciding to bring in contract resources to hit a company goal, armed with real information that supports the call.
That’s what I set out to build: a system that generates that kind of clarity, for everyone across the plant.
The problem I kept running into
I have spent more than a decade and a half in operations-facing roles at a Fortune 250 electrical utility. Over that time I built a lot of things — near real-time reporting infrastructure for major operations programs, an asset installation tracker with dynamic targeting, a dimensional data model spanning governance and operations layers, a customer-facing platform serving millions of users.
Every time, the same problem. Before any of that work could matter, someone had to build the model. Someone had to take the raw export, define what “downtime” meant for this floor, map the reason codes, figure out which timestamp to trust, and build a foundation clean enough to report from.
That work doesn’t get talked about. It’s not glamorous. But without it, nothing else works.
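That foundational work is easier to picture with a concrete sketch. The snippet below is purely illustrative, not the OIM's code: it shows the kind of decisions described above, mapping raw reason codes to meaningful categories and choosing which of two timestamps to trust. Every column name and code here is hypothetical.

```python
# Illustrative sketch of the "invisible" modeling work: resolving
# reason codes and picking a trusted timestamp. All names are invented.

RAW_ROWS = [
    {"reason_code": "RC-01", "event_ts": "2024-03-01T08:00:00", "sync_ts": ""},
    {"reason_code": "RC-99", "event_ts": "", "sync_ts": "2024-03-01T09:15:00"},
]

# One plant's definition of what each code means. This mapping is
# the kind of foundation that has to exist before reporting can.
REASON_MAP = {
    "RC-01": "planned_maintenance",
    "RC-99": "unplanned_downtime",
}

def normalize(row):
    """Resolve the reason code and pick the most trustworthy timestamp."""
    return {
        "reason": REASON_MAP.get(row["reason_code"], "unknown"),
        # Prefer the event timestamp; fall back to the sync timestamp.
        "ts": row["event_ts"] or row["sync_ts"],
    }

clean = [normalize(r) for r in RAW_ROWS]
print(clean)
```

Trivial as it looks, every line encodes a judgment call someone had to make about this specific floor's data.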
I built the OIM because I was tired of rebuilding the same intelligence framework over and over — and watching manufacturing operations struggle with data they couldn’t trust.
What it actually does
The OIM takes an ERP CSV export and auto-generates a proper dimensional data model. No warehouse, no consultant, no $200K implementation.
It doesn’t matter whether you’re the operator on the line, the shop foreman, the plant manager, or the executive steering committee — the OIM gives everyone a view of the same reality, translated for the decision they actually need to make.
That’s not a small thing. Most manufacturers have years of operational data sitting in an ERP that was configured a decade ago. The data exists. The model doesn’t. The OIM builds the model.
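To make the idea concrete, here is a minimal sketch of what "auto-generating a dimensional model from a CSV" could look like. This is not the OIM's implementation, just one plausible heuristic: treat repeating text columns as dimension candidates, numeric columns as fact measures, and leave unique identifiers as degenerate keys. The column names are invented.

```python
# Sketch: propose a star schema from a flat CSV export.
# Heuristic only; all column names below are hypothetical.
import csv
import io

SAMPLE = """work_order,machine,shift,downtime_min
WO-1,Press-A,day,12
WO-2,Press-A,night,30
WO-3,Lathe-B,day,5
WO-4,Lathe-B,night,8
"""

def propose_model(text, dim_threshold=0.5):
    rows = list(csv.DictReader(io.StringIO(text)))
    dims, facts = [], []
    for col in rows[0]:
        values = [r[col] for r in rows]
        if all(v.replace(".", "", 1).isdigit() for v in values):
            facts.append(col)            # numeric column: fact measure
        elif len(set(values)) / len(values) <= dim_threshold:
            dims.append(col)             # repeating text: dimension candidate
        # everything else (e.g. unique IDs) stays a degenerate key
    return {"dimensions": dims, "facts": facts}

print(propose_model(SAMPLE))
# {'dimensions': ['machine', 'shift'], 'facts': ['downtime_min']}
```

A real system needs far more than cardinality checks, but the point stands: the structure is latent in the export, waiting to be inferred.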
Great, not perfect
The OIM is designed to be great — not perfect.
In the pursuit of perfection, data models become brittle. They take months to deploy, require constant maintenance, and collapse the moment the ERP changes. That’s not a data product. That’s a liability.
The chief goal of the OIM is to be quick to deploy, useful, and trusted by all.
That doesn’t mean it’s slapped together. Far from it. It borrows from proven methodologies: dimensional modeling from Kimball, configuration-driven design from Data Vault 2.0, and a decade of pragmatic engineering from real-world operations. But it draws from those frameworks where they serve the goal — and stops where they don’t.
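Configuration-driven design is worth a small illustration. The toy below is my own sketch, not the OIM's code or Data Vault 2.0's specification: the shape of the model lives in data rather than code, so when the ERP export changes, you edit a config entry instead of rewriting a pipeline. All names are hypothetical.

```python
# Toy example of configuration-driven modeling: each dimension is
# described by a config entry, and one generic function builds it.
# All table, column, and key names here are invented.

MODEL_CONFIG = {
    "dim_machine": {"source_column": "machine", "surrogate_key": "machine_key"},
    "dim_shift": {"source_column": "shift", "surrogate_key": "shift_key"},
}

def build_dimension(name, rows, config=MODEL_CONFIG):
    """Build one dimension table purely from its config entry."""
    spec = config[name]
    values = sorted({r[spec["source_column"]] for r in rows})
    return [{spec["surrogate_key"]: i + 1, spec["source_column"]: v}
            for i, v in enumerate(values)]

rows = [{"machine": "Press-A", "shift": "day"},
        {"machine": "Lathe-B", "shift": "night"}]
print(build_dimension("dim_machine", rows))
```

Adding a new dimension means adding one config entry, which is exactly the property that keeps a model from collapsing when the source system shifts.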
Proven, not precious.
Why I’m writing about this
I’m publishing the OIM’s architecture decisions as I go — not because they’re novel, but because the reasoning behind them matters. Most manufacturing software is sold as a black box. You don’t know why it works the way it does, and when it doesn’t, you’re stuck.
The OIM is built to be understood. The posts in this series are the paper trail.
If you’ve ever inherited a data mess from an ERP migration, had to explain to a plant manager why the numbers don’t match, or rebuilt the same reporting infrastructure for the third time at a new company — this is for you.
The OIM is what I wish had existed when I was in the trenches.