Note 1, December 2024

In which agility turns out to be a polite word for lost.

The Overcorrection

Forgetting as Governance

Organizations don't respond to failure by asking what went wrong. They respond by asking how to ensure they're never blamed for it again. That distinction shapes everything that follows.

A missed forecast produces a tighter review cadence, a security incident introduces a mandatory approval layer, and a failed launch creates a standing committee. Each change is rational. Each points to something real: an event. But the target is rarely abstract risk. It's exposure, memory, the reputational residue of who was present when something broke.

Institutions remember embarrassment more vividly than error. The scar becomes structure.

The Accumulation Problem

Most dysfunction isn't designed. It accumulates.

A safeguard is added after a specific failure. Another follows the next. Each is defensible in isolation. Together they form something no one would have chosen in advance: a system optimized less for business outcomes than for the distribution of blame.

The committee formed after one bad quarter becomes permanent. The approval step added after a compliance scare survives three leadership transitions. The risk log outlives the risk.

Organizations are extremely good at addition and structurally bad at subtraction, because subtraction is political.

The Politics of Subtraction

Adding a control signals responsibility. Removing one signals risk.

Introducing a review is interpreted as diligence. Eliminating a review reads as exposure. Even if the conditions that justified it have changed, the memory of the original incident remains emotionally intact.

To propose subtraction is to implicitly argue that the cost of protection now exceeds the risk of failure. That's a hard claim to make in any institution shaped by recent harm.

Safeguards redistribute authority. A new approval step centralizes discretion. A new reporting requirement narrows autonomy. A new escalation path elevates certain roles into permanent arbitration. Protection isn't neutral. It shifts who gets to decide.

Which is why subtraction rarely happens organically. The people empowered by an added layer aren't usually the ones incentivized to remove it.

Over time, reaction hardens into identity. The cautious organization tells itself a story about discipline. The fast organization tells itself a story about boldness. Both narratives justify the structures that followed their last wound.

Overcorrection is rarely recognized as such because it feels like professionalism.

The Mirror Image of Speed

Not all overcorrection takes the form of additional constraint.

Some organizations learn that their wound was hesitation. They watched opportunity expire while they deliberated. They concluded that friction, not recklessness, was the threat.

So they subtract.

Reviews shrink. Documentation is deferred. Decisions compress. The cultural hero becomes the person who clears blockers rather than the one who raises structural concerns. Speed becomes identity.

But subtraction without reflection is still accumulation. Shortcuts compound. Temporary workarounds solidify into architecture. Institutional memory fragments across undocumented decisions and private threads. The same problems recur because the system lacks durable retention.

Velocity without memory produces instability.

Caution and speed present as opposites. In practice, they're siblings. Both are responses to pain. Both can calcify into doctrine. Both reshape authority and accountability in predictable ways.

The difference is tempo, not structure.

Institutional Memory and Institutional Forgetting

Healthy systems require both memory and forgetting.

Memory preserves lessons. Forgetting prevents calcification. Governance, at its most durable, is the management of both.

Most organizations over-index on memory: archived incidents, formalized safeguards, preserved rituals. They build dashboards to ensure past failures never repeat. But very few build mechanisms for deliberate expiration.

When was this approval step introduced? What incident justified it? Do the conditions that produced that incident still exist? If the answer is no, why does the structure remain?

Without intentional forgetting, safeguards outlive their context and shortcuts outlive their urgency. Structures designed for one moment become permanent architecture.

The cost accumulates quietly.

In highly cautious systems, opportunity decays before it can be tested. In highly fast systems, coherence erodes before it can stabilize. One declines slowly and quietly. The other fails abruptly and publicly. Neither recognizes itself as miscalibrated, because each can point to recent evidence that its model works.

Miscalibration rarely feels like failure. It feels like discipline.

Recalibration Over Reaction

The alternative to overcorrection isn't moderation. It's recalibration across time.

Recalibration requires building expiration dates into reaction. It means auditing not only outcomes but structural accretions, asking which approvals, which shortcuts, which rituals were introduced in response to a specific moment, and whether that moment still governs present conditions.

Not every safeguard deserves to survive its founding crisis.

Not every shortcut deserves to become culture.

Forgetting, in this sense, isn't negligence. It's governance.

It's the willingness to say: this structure solved a real problem. The problem has changed. The structure must change with it.

Institutions don't collapse because they valued caution or velocity. They weaken when reaction hardens into identity and memory is never balanced by subtraction.

The scar becomes strategy.

Good governance knows when to let it fade.

Footnotes

Fear in organizations rarely presents as panic. It presents as professionalism: one more review, one more escalation, one more sprint, one more launch. The sincerity of the response doesn't prevent overcorrection. It often accelerates it.

Engineers have language for technical debt because it can be inspected in code. Process debt is harder to surface. It lives in meeting structures, approval flows, and informal norms that accumulate without version history. Its weight is felt long before it's visible.

Cautious systems tend to decline gradually, through missed opportunity and attrition: the kind of decline that shows up in exit-interview data nobody reads and pipeline metrics that soften slowly enough to explain away for several quarters.

Fast systems often fail discontinuously. A hidden dependency breaks; a compliance gap surfaces; a key person leaves, and the institutional memory leaves with them. The failure arrives as a surprise even though the conditions were in place for years.

This asymmetry is part of why organizations rarely learn from each other's mistakes. The cautious company looks at the fast company's collapse and sees recklessness. The fast company looks at the cautious company's irrelevance and sees timidity. Both diagnoses are correct and both miss the point. The failure mode in each case wasn't the tempo. It was the inability to examine what the tempo was protecting against, and whether that thing still existed.

Most post-mortems don't ask that question. They ask what broke. Which is how you get the next overcorrection.

Some safeguards are foundational and shouldn't be revisited. Financial controls, security reviews, safety protocols, and legal compliance exist because the downside risk is structural rather than episodic. The point isn't that protection is unnecessary. It's that protection calibrated to a specific moment can quietly outlive the conditions that justified its intensity.


