Engineering Laws
I have a soft spot for laws you will not find in physics class or law school but that still hold up in court. Except here the court is production.
The longer I am in software, the more they stop feeling like old professor folklore. Instead they feel like another Tuesday. I guess I’m at the age where I can’t help but start collecting them.
Here’s a growing list of the ones I keep running into.
Conway’s Law
“Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization’s communication structure.”
— Melvin Conway, “Conway’s Law” (1968)
Melvin Conway’s original statement sounds bulky. In plain terms, an org’s communication structure shows up in its systems, whether it wants to or not.
Here is the version most people know:
“If you have four groups working on a compiler, you’ll get a 4-pass compiler.”
What’s funny is that a 2008 Microsoft Research study on Windows Vista found that organizational metrics were statistically significant predictors of failure-proneness in the codebase. It’s hard not to enjoy the symmetry here.
And then Tom Cheatham’s version, which takes this from observation to meme:
“If a group of N persons implements a COBOL compiler, there will be N-1 passes. Someone in the group has to be the manager.”
Next time someone asks for a system architecture, send the org chart instead. If they want fewer layers, add more management.
Goodhart’s Law
“Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”
— Charles Goodhart, Problems of Monetary Management: The UK Experience (1975)
Again, the original is a mouthful. This version captures the idea cleanly:
“When a measure becomes a target, it ceases to be a good measure.”
— Marilyn Strathern, Improving ratings (1997)
The moment you start tracking velocity, people get faster at… looking fast. Once someone ties rewards to a metric, the system optimizes for the metric, not the outcome.
Suddenly you have perfect test coverage without actually testing anything, uptime that ignores user pain, and dashboards that glow green while no one’s getting anything done.
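The coverage version of this is easy to reproduce. A hypothetical sketch (function and test names are mine, not from any real codebase): both tests below execute every line of `discount`, so a line-coverage tool reports 100%, but only one of them can ever fail.

```python
def discount(price, percent):
    """Return the price after applying a percentage discount."""
    return price - price * percent / 100

def test_discount_gamed():
    # Executes every line, asserts nothing.
    # Coverage goes up; confidence does not.
    discount(100, 20)

def test_discount_real():
    # Actually pins down the behavior the code is supposed to have.
    assert discount(100, 20) == 80

test_discount_gamed()
test_discount_real()
```

Both tests count identically toward the coverage target. Only the second one tests anything.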
This one ages like wine. Or milk. Depends who’s measuring.
Schneier’s Law
“Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can’t break.”
— Bruce Schneier, “Memo to the Amateur Cipher Designer” (1998)
Bruce Schneier’s original framing is about cryptography, but the lesson is broader: it’s the Dunning-Kruger effect with production consequences.
Cory Doctorow later gave it the name and broadened the idea to security systems generally:
“any person can invent a security system so clever that she or he can’t think of how to break it.”
— Cory Doctorow, “Microsoft Research DRM talk” (2004)
What is hard is creating a system that nobody else can break. The only proof is exposure: let people try.
This is also why security by obscurity doesn’t work. A design that has never faced outside attackers hasn’t proven anything; hiding it just delays the first real test.