
Over the past few weeks, I’ve been writing about a pattern I’ve seen across a lot of data platforms:
AI initiatives start strong… and then stall.
Not because of talent.
Not because of tooling.
Because of architecture.
Across different organizations, the pattern is surprisingly consistent:
- SQL Server environments built for stability are asked to support experimentation
- Lift-and-shift migrations move workloads to the cloud without changing architecture
- Costs increase, but performance and agility don’t
- AI teams are forced to work around the platform instead of with it
None of these are unusual decisions.
In fact, they’re the default path for many teams.
But when you step back, a different picture starts to emerge.
The organizations making real progress with AI aren’t necessarily using radically different tools.
They’re making different architectural decisions.
They’re separating workloads.
They’re aligning platforms to how those workloads behave.
They’re enabling experimentation without putting operational systems at risk.
And in many cases, they’re adopting hybrid architectures where:
- SQL Server continues to power operational systems
- Platforms like Snowflake – or Microsoft-native solutions like Microsoft Fabric – support analytics and AI workloads
That shift isn’t about replacing what works.
It’s about removing friction where it doesn’t.
Over time, I’ve found that the most valuable conversations aren’t about tools or migrations.
They’re about questions such as:
- Where is our architecture creating unnecessary friction?
- Which workloads are we forcing into the wrong platform?
- What would change if experimentation didn’t compete with production?
Those questions tend to lead to much better outcomes than starting with:
“What should we migrate?”
If you’ve been following along in this series, you’ve probably already started mapping some of this to your own environment.
And if not, that’s usually where the conversation begins.
I’ve been spending more time recently helping teams work through these exact questions – evaluating existing architectures, identifying friction points, and mapping out practical paths toward AI-ready platforms.
No two environments look exactly the same.
But the patterns are often familiar.
If you’re working through similar challenges, I’m always open to comparing notes.