This is the first article in a short series on how shifting constraints are changing the role of business software.
For much of the last decade, software was built under a comfortable assumption: compute was effectively infinite. If something ran slowly, we scaled it. If it cost more, we absorbed it. Cloud infrastructure, cheap capital, and stable supply chains made this approach feel not just reasonable, but permanent.
As an experienced engineer, I built systems in that world. The incentives were clear. Abstract aggressively. Optimize for developer speed. Let infrastructure absorb inefficiency. Those decisions weren’t reckless — they were rational responses to the environment we were operating in.
That environment is changing.
Cloud costs are no longer invisible. Energy constraints are becoming operational concerns, not just policy debates. Supply chain disruptions have reminded businesses that physical limits still exist. The idea that you can always “just add more compute” is being quietly but decisively challenged.
This series looks at what happens when long-standing assumptions like that start to break.
How Software Learned to Waste Resources
Modern software stacks are remarkably good at hiding complexity. Layers of abstraction make systems easier to extend and teams easier to scale. They also make it easier to ignore what software actually consumes at runtime.
That trade-off was intentional. For years, businesses prioritized speed to market and feature velocity over efficiency. As an engineer, you were rewarded for shipping, not for shaving milliseconds or reducing memory pressure. Efficiency became something you worried about only when things were already on fire.
Over time, waste crept in — not through negligence, but through design patterns optimized for a world of abundance. Each individual decision made sense. The accumulated effect was software that quietly burned resources in exchange for convenience.
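One of those convenience-over-efficiency patterns can be sketched concretely. This is a hypothetical illustration (the function names and data are invented, not from any particular system): materializing an entire dataset to filter out a handful of rows, versus streaming it. Both are "correct"; only one assumes memory is free.

```python
# Hypothetical illustration of a convenient pattern vs. a constraint-aware one.
# All names and data here are invented for the example.

def active_users_eager(rows):
    # Convenient: materialize everything, then filter.
    # Peak memory grows with the whole input, not the result.
    all_rows = list(rows)
    return [r for r in all_rows if r["active"]]

def active_users_lazy(rows):
    # Same result, one row at a time.
    # In practice `rows` would be a DB cursor or stream, so peak memory
    # is bounded by the output rather than the input.
    return (r for r in rows if r["active"])

users = [{"id": i, "active": i % 100 == 0} for i in range(1_000_000)]
eager = active_users_eager(users)
lazy = list(active_users_lazy(iter(users)))
assert eager == lazy  # identical results, very different resource profiles
```

Neither version is wrong. The eager one is simply a bet that memory is abundant, and for a decade that bet kept paying off.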
As long as compute felt infinite, this wasn’t a problem.
Why “Just Scale It” Is Breaking Down
When inefficiency finally shows up, the default response is still to scale infrastructure. Bigger instances. More replicas. Higher tiers. Sometimes that’s the right move.
Increasingly, it’s just a way to avoid architectural decisions.
The problem is that scaling hardware doesn’t fix the underlying assumptions. It amplifies them. Every abstraction becomes more expensive. Every inefficiency becomes a recurring cost. What used to be a technical concern turns into a financial one — and it doesn’t go away.
At a certain point, systems that “work” become systems that are too costly to keep working the same way.
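A rough back-of-the-envelope sketch makes the amplification concrete. Every number below is invented for illustration; the point is the shape of the math, not the figures:

```python
# Back-of-the-envelope sketch; all numbers are invented for illustration.

def monthly_cost(replicas, instance_cost_per_month, waste_fraction):
    """Total monthly spend, and the share attributable to inefficiency."""
    total = replicas * instance_cost_per_month
    wasted = total * waste_fraction
    return total, wasted

# Suppose 30% of each instance's work is overhead from convenient abstractions.
# Scaling from 4 to 40 replicas scales the waste right along with the capacity.
for replicas in (4, 40):
    total, wasted = monthly_cost(replicas,
                                 instance_cost_per_month=500,
                                 waste_fraction=0.3)
    print(f"{replicas} replicas: ${total}/mo total, ${wasted:.0f}/mo wasted")
```

The waste fraction never shrinks by adding replicas; it just gets multiplied, which is why a technical concern becomes a recurring financial one.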
The Counterintuitive Move: Design for Constraints Again
This isn’t a call to abandon modern tooling or regress to hand-tuned systems everywhere. It’s a call to be intentional.
Constraints force clarity. When resources are no longer assumed to be infinite, teams have to ask better questions. What actually needs to be fast? What can be asynchronous? Where does abstraction help, and where does it get in the way?
Software designed around known workloads and real business needs can make those trade-offs explicitly. It doesn’t try to solve every problem generically. It solves the problems the business actually has.
That shift — from assuming abundance to designing within constraints — is where performance discipline quietly returns. Not as nostalgia, but as strategy.
Conclusion: Why This Matters Now
Generic platforms are built to assume abundance. They have to be. They serve everyone.
Purpose-built software is different. It reflects the realities of a specific business, operating under real constraints, with real costs. It allows teams to trade wasteful abstraction for efficiency where it actually matters — and to do so deliberately, not reactively.
That theme runs through this entire series: many of today’s “new” problems are really the result of old assumptions no longer holding.
In the next article, we’ll look at another assumption that’s breaking just as visibly: the idea that our data was ever as solid as we thought.
