Practitioners Unplugged

Episode #12 | Building a Data Highway for AI Success in Manufacturing with Jonathan Alexander of Albemarle Intelligence

September 03, 2025

If you’ve been tracking the hype around AI in manufacturing, you’ve heard it all—pilot purgatory, proof-of-concepts that don’t scale, shiny tools that don’t move the P&L. Episode 12 of Practitioners Unplugged cuts through that fog with Jonathan Alexander, Global Manufacturing AI and Advanced Analytics Manager at Albemarle. His philosophy is refreshingly clear: start with operational excellence, use technology as an accelerator, and make context the backbone of everything.

Working within Albemarle’s global manufacturing excellence organization, Jonathan brings over 15 years of experience turning digital transformation buzzwords into measurable results. His approach challenges conventional wisdom: instead of chasing the latest AI trends, focus on solving universal manufacturing problems through standardized, scalable systems.

As Jonathan puts it: “Semantic models are king and context is king.”

Here are the five key insights from our conversation with Jonathan:

1. Manufacturing First, Technology Second—Build Your Analytics Highway

When asked where his function sits—OT or IT—Jonathan’s answer reveals Albemarle’s core philosophy: “I would consider myself neither OT or IT, I would consider myself manufacturing.” This manufacturing-first lens shapes everything: problem selection, architecture, and adoption strategies.

Jonathan’s most powerful analogy centers on infrastructure investment. Just as President Eisenhower’s Interstate Highway System transformed American commerce by creating standardized routes, manufacturing organizations need an “analytics highway” before they can scale AI initiatives.

“We had to say, well, how are we, what problems are we going to solve? And the traditional way of people doing digital transformation is they find a technology and then they go and apply it on a certain use case, and then they step back and take that technology and find another place for it.”

Instead of technology-first thinking, Albemarle started with their biggest universal problem: process variability. They built a standardized infrastructure using PI System’s Asset Framework to contextualize 76,000+ instruments across six sites. The resulting 300-page standardization document—admittedly “the least fun thing” Jonathan ever created—became the foundation that enabled rapid deployment of analytics at scale.
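To make the idea of a reusable template concrete, here is a minimal sketch in plain Python (not the PI Asset Framework itself) of what a standardized unit-operation template can look like. Every class name, attribute, and tag pattern below is a hypothetical illustration, not Albemarle’s actual standard.

```python
# Purely illustrative sketch of a standardized unit-operation template.
# Names (UnitOperationTemplate, AttributeSpec, tag patterns) are hypothetical,
# not taken from Albemarle's 300-page standardization document.
from dataclasses import dataclass, field


@dataclass
class AttributeSpec:
    """One contextualized signal on a unit operation."""
    name: str              # standardized attribute name, e.g. "JacketTemperature"
    engineering_unit: str  # consistent unit across all sites
    tag_pattern: str       # how the raw historian tag is resolved per site/asset


@dataclass
class UnitOperationTemplate:
    """A reusable template; sites inherit it instead of re-engineering each asset."""
    name: str
    attributes: list[AttributeSpec] = field(default_factory=list)


reactor_template = UnitOperationTemplate(
    name="Reactor",
    attributes=[
        AttributeSpec("JacketTemperature", "degC", "{site}.{unit}.TI-*"),
        AttributeSpec("AgitatorSpeed", "rpm", "{site}.{unit}.SI-*"),
        AttributeSpec("VesselPressure", "barg", "{site}.{unit}.PI-*"),
    ],
)
```

The point of the structure is the one Jonathan makes: once every reactor, column, or filter inherits the same template, analytics built against the template deploy everywhere at once instead of being re-engineered site by site.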

The ROI parallel is striking: the Interstate Highway System generated $6 for every $1 invested over 70 years. Similarly, Albemarle’s infrastructure investment pays dividends through reusable templates that eliminate repetitive custom engineering work.

2. One Visualization to Rule the Chaos: Make Complex Analytics Look Like Google Maps

Albemarle didn’t chase isolated use cases. They picked a global class of problems—process variability—and built a standard path from raw signals to action. Rather than building custom dashboards for each use case, they standardized on Statistical Process Control (SPC) charts as their universal interface.

“We said, okay, imagine if we have all of our instruments monitored with this, with the right filtering so that they’re giving good signals. It’s kind of like the concept of Google Maps.”

Just as Google Maps uses consistent visual language across any destination, Albemarle’s approach means operators in Australia, China, or South America see the same interface: green dots for normal operation, red dots for out-of-control conditions. This eliminates training barriers and enables consistent interpretation across diverse workforces.

Think of SPC at scale as traffic conditions across thousands of “lanes” (tags): you can spot congestion instantly and know where to focus. It’s common-sense, teachable, and consistent—exactly what you need across regions, shifts, and skill levels.
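As a rough illustration of that “green dot, red dot” logic, here is a minimal SPC sketch assuming a basic Shewhart individuals chart with mean ± 3-sigma limits. Production deployments would add the filtering and rule sets Jonathan mentions; this only shows how one tag’s readings could be classified.

```python
# Minimal sketch of a "green dot / red dot" SPC check: points outside
# mean +/- 3 sigma of in-control history are flagged red. Thresholds and
# data are illustrative assumptions, not a production rule set.
import numpy as np


def spc_status(values: np.ndarray, baseline: np.ndarray) -> np.ndarray:
    """Return 'red' for points outside the control limits, else 'green'."""
    center = baseline.mean()
    sigma = baseline.std(ddof=1)
    upper, lower = center + 3 * sigma, center - 3 * sigma
    return np.where((values > upper) | (values < lower), "red", "green")


baseline = np.random.default_rng(0).normal(100.0, 2.0, size=500)  # in-control history
new_readings = np.array([100.5, 99.2, 108.7, 101.1])              # 108.7 trips the limit
print(spc_status(new_readings, baseline))                          # ['green' 'green' 'red' 'green']
```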

3. Focus on “Action Boards,” Not Dashboards—Standard Templates Enable Speed

Jonathan deliberately banned the term “dashboard” in favor of “action boards”—a semantic shift that fundamentally changed how his team approached analytics deployment.

“You can create all these great insights, but if nobody used them, it’s just another art piece on the wall. And so for us, what we wanted to do is teach people not to be builders. We didn’t want all of our engineers to spend all of their time building new calculations and new dashboards.”

The problem with traditional dashboards is that engineers love building them. Each creates their own version, then leaves the organization, forcing the next engineer to start over. This cycle wastes enormous resources on redundant development rather than value-generating activities.

With SPC as the front door to action, Albemarle layered in anomaly detection using principal component analysis (PCA). Here, the magic is the template. Albemarle created standard unit-operation templates (with derived versions for reactors, columns, filters, etc.) tied into the PI Asset Framework. Once contextualized, “we could build a machine learning model and… have it deployed across hundreds of operators all in one hour.”

They now run approximately 1,200 PCA models, typically trained on 2–12 months of relevant history for golden-batch-like behavior rather than strict predictive analytics. Standardization plus context equals speed.
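For readers curious how a PCA model flags trouble, here is a hedged sketch of one common approach: fit PCA on a window of healthy multivariate history, then flag new samples whose reconstruction error exceeds a limit learned from that history. The component count and 99th-percentile threshold below are assumptions for illustration, not Albemarle’s settings.

```python
# Sketch of PCA-based anomaly detection on multivariate process data.
# A sample is flagged when its reconstruction error (Q / SPE statistic)
# exceeds a threshold derived from healthy training data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler


def _reconstruction_error(pca, scaler, X):
    Z = scaler.transform(X)
    reconstructed = pca.inverse_transform(pca.transform(Z))
    return ((Z - reconstructed) ** 2).sum(axis=1)


def fit_pca_monitor(healthy: np.ndarray, n_components: int = 3):
    """Train on 2-12 months of healthy history for one unit operation."""
    scaler = StandardScaler().fit(healthy)
    pca = PCA(n_components=n_components).fit(scaler.transform(healthy))
    errors = _reconstruction_error(pca, scaler, healthy)
    threshold = np.percentile(errors, 99)  # assumed limit, tuned in practice
    return pca, scaler, threshold


def is_anomalous(pca, scaler, threshold, X_new):
    """True where the correlation structure breaks down versus training data."""
    return _reconstruction_error(pca, scaler, X_new) > threshold
```

Because the model is attached to a standard template rather than a one-off asset, the same monitoring code can be stamped out across every unit that inherits that template, which is what makes the one-hour deployment claim plausible.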

4. Context Is King: The Bottleneck Isn’t Data Ingress—It’s Semantics

A lot of manufacturers assume the historian is the heavy lift. For Albemarle, the harder part was building the semantic model over time and across generations of assets and drawings.

“The challenge for us, it wasn’t getting data into the system, it was contextualizing it in the right way,” Jonathan explains. Automation of P&IDs and tag lists fell short because “not all of our P&IDs and our tag lists were accurate. They were done by different standards over a different time.”
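A toy sketch of why that matters: automated mapping only works when raw tag names follow a recognizable pattern, and anything that does not match has to fall back to human review. The patterns and tags below are invented for illustration.

```python
# Illustrative tag-to-semantic-model mapping. Patterns and tag names are
# hypothetical; tags created under older or inconsistent standards fall
# through to manual review rather than being contextualized automatically.
import re

ATTRIBUTE_PATTERNS = {
    "JacketTemperature": re.compile(r"(TI|TT)[-_]?\d+", re.IGNORECASE),
    "VesselPressure": re.compile(r"(PI|PT)[-_]?\d+", re.IGNORECASE),
}


def map_tag(raw_tag: str) -> str:
    """Return the standardized attribute name, or flag the tag for manual review."""
    for attribute, pattern in ATTRIBUTE_PATTERNS.items():
        if pattern.search(raw_tag):
            return attribute
    return "NEEDS_MANUAL_REVIEW"


print(map_tag("SITE1.R101.TT_4512"))   # JacketTemperature
print(map_tag("legacy_temp_probe_7"))  # NEEDS_MANUAL_REVIEW
```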

This contextualization challenge created unexpected resilience benefits. The true test of Albemarle’s system came during significant workforce turnover at one facility. Traditionally, losing multiple operators, engineers, and managers simultaneously would trigger weeks of downtime as institutional knowledge walked out the door.

“We had this system in place and in the past when that would’ve happened, that would’ve been just unbelievably disruptive to the manufacturing sites. But we didn’t, they didn’t experience that.”

The standardized analytics infrastructure maintained operational stability despite the knowledge drain. More importantly, it created a higher baseline performance level that doesn’t degrade when people leave. In a world of rotating contractors, generational change, and tight labor markets, this resilience is strategic.

5. Let Business Value Drive Technology Choices: GenAI’s Place (For Now)

While generative AI dominates current conversations, Jonathan maintains focus on traditional machine learning techniques that deliver measurable ROI. His principal component analysis models have prevented costly equipment failures by detecting correlation changes that human operators miss.

Albemarle is experimenting with genAI (like code acceleration in Microsoft Copilot), and the time savings can be stunning. But Jonathan is candid about the challenge: “The challenge that people are having with generative AI… is being able to specifically describe the business case and the ROI,” beyond soft savings.

“I think with the advances in generative AI right now, semantic models are king and context is king. ‘Cause we’ve actually done some stuff with generative AI, just dumping a bunch of data from here into there and kind of crossing our fingers and praying that it will understand the information that we give it and it doesn’t.”

As a result, genAI is “less than 10% of my time” today, while the team continues to scale SPC and machine-learning methods that tie directly to measurable process outcomes. This balanced approach avoids both “shiny object syndrome” and technological conservatism.

Conclusion: Building the Foundation for Manufacturing’s AI Future

Jonathan Alexander’s journey at Albemarle demonstrates that successful AI at scale requires patience, standardization, and relentless focus on manufacturing fundamentals. Their approach—treating AI as an enabler for proven continuous improvement methodologies rather than a revolutionary replacement—offers a practical roadmap for other manufacturers.

The key lessons for practitioners: Anchor on variability and value through SPC at scale. Template everything to turn months into hours for model deployment. Invest in the semantic model because ingestion plumbing is necessary, but semantics is decisive. Design for turnover with standard visuals and thresholds. Keep genAI grounded with clear business cases tied to the semantic layer.

Jonathan sums up the ethos well: don’t let “the AI tail wag the operations excellence dog.” Technology is “just an enabler” to move faster through proven continuous improvement methods.

The lesson extends beyond AI to digital transformation broadly: sustainable change comes from solving universal problems through systematic approaches, not from deploying impressive technology demonstrations. Organizations that build their “analytics highway” first create the foundation for whatever technologies emerge next.

The highway is built one mile at a time, but once complete, it transforms how quickly you can reach any destination.

To submit a request for a new episode topic from Practitioners Unplugged, visit our contact page.

Thanks for reading. Don’t forget to subscribe to our weekly newsletter to get every new episode, blog article, and content offer sent directly to your inbox.

Explore more about Schneider Electric & AVEVA