Architectures have layers - just like ogres

Layers, layers, layers. I seem to be running into layers a lot lately, and I'm getting as tired of them as I am of ogres. For the benefit of those of you without small children or a social life, I present this bit of dialog from the original Shrek movie:

SHREK: Okay, um, ogres are like onions.
DONKEY: [Sniffs] They stink?
SHREK: Yes. No!
DONKEY: They make you cry?
SHREK: No!
DONKEY: You leave them out in the sun, they get all brown, start sproutin' little white hairs.
SHREK: No! Layers! Onions have layers! Ogres have layers! Onions have layers. You get it? We both have layers. [Sighs]

What brings this into the developer context for me are some recent ideas I've seen going around about layered architectures. It seems that somewhere along the way architectures all developed layers - and like ogres and onions, architectures with layers are not always appreciated.

If you go back a few years, you'll recall the transitions from dumb terminals with timesharing to client-server systems to three-tier systems to (inevitably, it seems) n-tier systems. And over time, n has just kept increasing. These days, it's not unusual to find a system with a database at the bottom, a Web page at the top, and seven or eight layers of increasing abstraction and services in between.

Slowly but surely, complexity has been sneaking into our applications. Oh, there's always a good reason: "We need a data access layer in case we decide to switch databases" or "We need to abstract out the business objects so that they'll work if we need to move to a Web farm." But more and more, I think the cart is getting far ahead of the horse here.

The issue, I think, is that problems breed solutions - and then we developers can't resist applying those solutions to other problems. Someone thrashes around with a database change, and someone else implements a data factory as the way to handle the same problem in the future. Over the years, we've accumulated a wide array of collective wisdom in the form of patterns and best practices.

But what we haven't accumulated (or at least, what's not as common) is the wisdom to know which best practices to apply in which situations. The temptation is apparently overwhelming to hook every application up with factories and logging layers and stored procedures and business objects and custom controls and who knows what else. The result is an application that is future-proofed against any possible change in its external conditions - but so complex as to be virtually unmaintainable. Adding a new logical entity ends up rippling changes through all the layers, requiring massive amounts of code churn for what seems like a simple change.
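To make that ripple concrete, here's a minimal C# sketch of what adding a single field can look like in an over-abstracted design. Every class and field name here is invented for illustration; it's not any particular framework:

    // Hypothetical sketch: one new MiddleName field, and every layer it touches.

    // Change #1: the data transfer object
    public class CustomerDto
    {
        public string FirstName;
        public string LastName;
        public string MiddleName;   // the new field
    }

    // Change #2: the business object that wraps the DTO
    public class Customer
    {
        private CustomerDto _dto = new CustomerDto();
        public string MiddleName
        {
            get { return _dto.MiddleName; }
            set { _dto.MiddleName = value; }
        }
    }

    // Change #3: the data access layer that maps the field to a parameter
    public class CustomerDataAccess
    {
        public void Save(CustomerDto dto, System.Data.SqlClient.SqlCommand cmd)
        {
            cmd.Parameters.AddWithValue("@MiddleName", dto.MiddleName);
            // Change #4 is the stored procedure itself, and change #5 is
            // whatever factory or mapping layer builds these objects.
        }
    }

That's five coordinated edits for one new column - and a real system often has more layers than this sketch does.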

What's the answer? I can see two possible ways to cut through this thicket. The first is an increasing reliance on code generation. If you can specify a necessary change in a single place and let automated tools rebuild all the code for you, then it doesn't much matter whether there are three layers or a dozen. There's a lot of exciting work going on in this area right now, and I'm seeing more mature and useful code generation products all the time.
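As a toy illustration of that single-point-of-change idea (not any particular product - the class and method names are invented), here's a C# sketch that regenerates a DTO class from one entity description. A real tool would emit the business objects, data access code, and stored procedures from the same description:

    using System;
    using System.Text;

    class EntityField
    {
        public string Name;
        public string ClrType;
        public EntityField(string name, string clrType) { Name = name; ClrType = clrType; }
    }

    class DtoGenerator
    {
        // Emit a DTO class from the field list; each layer would have its
        // own template driven by the same entity description.
        public static string GenerateDto(string entityName, EntityField[] fields)
        {
            StringBuilder sb = new StringBuilder();
            sb.AppendFormat("public class {0}Dto\n{{\n", entityName);
            foreach (EntityField f in fields)
                sb.AppendFormat("    public {0} {1};\n", f.ClrType, f.Name);
            sb.Append("}\n");
            return sb.ToString();
        }

        static void Main()
        {
            EntityField[] fields = {
                new EntityField("FirstName", "string"),
                new EntityField("LastName", "string"),
                new EntityField("MiddleName", "string") // added once, regenerated everywhere
            };
            Console.Write(GenerateDto("Customer", fields));
        }
    }

Add MiddleName to the field list once, rerun the generator, and every generated layer picks it up.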

My second suggestion is more heretical: don't use all those darned layers! Last year I did an ASP.NET application with a very simple architecture: Web pages with code-behind classes that use ADO.NET to talk directly to SQL Server stored procedures. You know what? It works. It was delivered on time and within budget, and a year later, I can duck in and make small changes without studying the architecture for a week. Sure, this application would not be robust in the face of a move to a server farm or a change of the underlying database. But we looked at those issues up front and decided that they were not risks worth worrying about. If you add some sensible risk analysis to your planning, you too may decide that you can simplify the layers of your applications - and make them a little less like ogres.
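For concreteness, here's roughly what that two-layer shape looks like. This is a minimal hypothetical sketch, not the actual application; the page, grid, procedure name, and connection string are all invented:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    public class OrdersPage : System.Web.UI.Page
    {
        protected System.Web.UI.WebControls.DataGrid OrdersGrid;

        protected void Page_Load(object sender, EventArgs e)
        {
            if (IsPostBack) return;
            using (SqlConnection conn = new SqlConnection(
                "Server=(local);Database=Sales;Integrated Security=SSPI"))
            {
                // Call the stored procedure directly from the code-behind:
                // no business objects, no data access layer, no factories.
                SqlCommand cmd = new SqlCommand("GetOpenOrders", conn);
                cmd.CommandType = CommandType.StoredProcedure;
                conn.Open();
                OrdersGrid.DataSource = cmd.ExecuteReader();
                OrdersGrid.DataBind();
            }
        }
    }

Everything a maintainer needs to understand is on one screen - which is exactly the point.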