Convergence of Software Engineering Productivity in Changing Organisations
Module 4 of 6 · Intermediate · 35 min

Why New Technology Rarely Raises the Ceiling — Understanding Absorption Patterns

Prerequisites:

  • Why Engineering Productivity Always Finds Its Level
  • Reading the J-Curve — Disruption, Recovery, and What the Endpoint Really Means
  • Team Structures, Cognitive Load, and the Architecture of Your Productivity Ceiling

What you'll learn

  • Why technology adoption follows a predictable arc and where the productivity plateau occurs
  • How absorptive capacity constrains the speed and ceiling of technology-driven productivity gains
  • How Brooks' "No Silver Bullet" principle distinguishes gains against accidental complexity (real but bounded) from gains against essential complexity (rare)
  • How to predict where a given technology adoption initiative will plateau, given your organisation's current structural context

Why this matters

When productivity is stuck, the first question most leaders ask is "what tool can we adopt?" It is an understandable instinct. New technology is visible, purchasable, and carries the implicit promise of a step-change. Vendors and conference talks reinforce the story: teams that adopted X saw a 40% velocity gain. Why wouldn't you try it?

What those stories rarely include is what happened six months later — and why the gains rarely held at scale across the whole organisation. If you have lived through a tool adoption that delivered less than promised, you have already observed the pattern this module names. Understanding why technology gains plateau is not pessimism; it is the foundation for making smarter decisions about where to invest. The difference between a leader who chases the next tool and one who raises the structural ceiling lies entirely in knowing what technology can and cannot do to a productivity baseline.

Core concept

The plateau is not a surprise — it is the default

Technology adoption follows a recognisable arc. The Gartner Hype Cycle — a framework from research and advisory firm Gartner — maps this arc across five phases: Innovation Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment, and Plateau of Productivity. The names are memorable because they match what engineers actually experience. The "Plateau of Productivity" is the stabilised endpoint, not a high-water mark — it is where a technology's genuine, sustainable contribution to organisational output eventually settles. That plateau is almost always lower than the Peak of Inflated Expectations suggested.

The pattern is not unique to any particular class of technology. Object-oriented programming, cloud migration, microservices, continuous integration pipelines — each followed the same arc: early excitement, a trough of integration pain, then a settling into a level of benefit that was real but smaller than promised.

Absorptive capacity: the hard ceiling underneath the hype

The key mechanism that governs where the plateau lands is absorptive capacity: an organisation's ability to integrate new knowledge, technologies, or practices into existing workflows and mental models at a given point in time, determined by the overlap between existing knowledge and the incoming change. It is a finite resource, constrained by structural factors.

This definition matters precisely because it is not about the technology's quality. An organisation's absorptive capacity is not raised by buying better tools; it is raised by building knowledge overlap, learning mechanisms, and structural readiness before and during adoption. A team that lacks the foundational knowledge to understand what a tool is doing — and why — cannot extract its value regardless of how good the tool is.

Absorptive capacity is also a flow constraint, not just a ceiling. Even an organisation that eventually could absorb a technology will do so more slowly if its teams are already carrying high cognitive load — the total mental demand placed on a team by its domain scope, tool complexity, and coordination overhead. Teams deep in integration overhead or context-switching do not have the spare germane capacity (the share of cognitive capacity left over for valuable learning) to absorb new practices quickly.

This is why technology adoption timelines reliably surprise organisations: they calculate learning time based on how long it takes a motivated individual to learn the tool, then multiply by headcount. They do not account for the distributed cognitive work of aligning workflows, updating mental models, and rebuilding team-level coordination patterns.
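
To make the gap concrete, here is a minimal sketch of the two estimates, assuming a toy model: the hypothetical knowledge_overlap and germane_capacity factors, and every constant, are invented for illustration rather than drawn from any study.

```python
# Toy contrast between a naive adoption estimate and an absorption-adjusted one.
# All factors and constants below are illustrative assumptions, not measurements.

def naive_team_weeks(individual_weeks: float) -> float:
    # Common planning shortcut: everyone learns in parallel, so the team
    # timeline is assumed to equal one motivated individual's learning time.
    return individual_weeks

def absorption_adjusted_weeks(individual_weeks: float,
                              knowledge_overlap: float,    # 0-1: prior familiarity
                              germane_capacity: float) -> float:  # 0-1: load headroom
    # Hypothetical model: absorption slows as knowledge overlap and spare
    # cognitive capacity shrink. A fully ready team (1.0, 1.0) matches the
    # individual estimate; a loaded, unfamiliar team takes far longer.
    return individual_weeks / (knowledge_overlap * germane_capacity)

print(naive_team_weeks(4.0))                     # plan says: 4 weeks
print(absorption_adjusted_weeks(4.0, 0.5, 0.4))  # a loaded team: 20 weeks
```

The point is not the specific numbers but the shape: halving either the knowledge overlap or the cognitive headroom multiplies the timeline, which is why headcount-times-learning-time plans reliably undershoot.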

Brooks' "No Silver Bullet": accidental versus essential complexity

In 1986, Fred Brooks — the computer scientist who led the IBM OS/360 operating system project — published an essay called "No Silver Bullet: Essence and Accidents of Software Engineering." His central argument has aged remarkably well.

Brooks distinguished two types of complexity in software work:

  • Accidental complexity — difficulties that arise from imperfect tools, languages, or processes; friction that humans have introduced and could, in principle, remove. Writing boilerplate, navigating slow build systems, managing deployment configuration manually.
  • Essential complexity — the irreducible difficulty inherent in the problem being solved. Understanding the business domain, managing distributed state, reasoning about security boundaries, designing for change. These difficulties exist because the problem itself is hard, not because of any tool choice.

Brooks' argument: most technologies target accidental complexity. Higher-level languages, better IDEs, code generators, AI code assistants — these reduce friction. They are genuinely valuable. But they do not touch essential complexity, which is where the hard work lives. And because accidental complexity gains are bounded (there is a floor below which process friction cannot drop), each successive technology delivers smaller marginal gains. The productivity increase from moving from assembly to C was enormous. The increase from one high-level language to another is much smaller. The increase from one AI coding assistant to a slightly better one is smaller still.
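
A worked toy example may make the bounded-gains argument easier to see. This is a sketch under invented numbers — essential effort held fixed, each tool generation halving the accidental friction that remains above an irreducible floor — not a calibrated model.

```python
# Toy model of Brooks' bounded-gains argument. All numbers are illustrative
# assumptions: essential effort is untouched by tooling; each tool generation
# removes half of the accidental friction remaining above an irreducible floor.

essential = 60.0     # effort units inherent to the problem itself
accidental = 40.0    # effort units caused by tool and process friction
floor = 5.0          # friction that no realistic tool removes entirely
removal_rate = 0.5   # each generation halves the removable friction

for generation in range(1, 6):
    before = essential + accidental
    accidental = floor + (accidental - floor) * (1 - removal_rate)
    after = essential + accidental
    gain_pct = (before - after) / before * 100
    print(f"Generation {generation}: total effort {after:.2f}, marginal gain {gain_pct:.1f}%")
```

Total effort converges toward essential plus floor (65 units here), so each generation's marginal gain shrinks: roughly 17.5%, then 10.6%, then 5.9%, and so on — real improvements competing over an ever-smaller pool of addressable friction.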

This is not a counsel of despair — it is an accurate model of where technology investment pays off and where it hits diminishing returns.

The adoption curve: why early-adopter success does not scale automatically

Technology adoption across an organisation follows a curve first described by Everett Rogers and later popularised in Geoffrey Moore's Crossing the Chasm: innovators, early adopters, early majority, late majority, laggards. Each cohort differs from the last not only in timing but in precisely the characteristics that made the earlier cohorts successful.

Innovators and early adopters are typically high-expertise, high-motivation engineers who treat mastering new tools as an intrinsic goal. They succeed with a technology partly because of who they are, not only because of what the technology offers. When organisations attempt to scale early-adopter success to the early majority and beyond, they encounter a silent assumption failure: the knowledge overlap, motivation, and fault tolerance of the pioneer cohort are not uniformly present. The same tool that accelerated a team of experts can stall a team of competent generalists who lack the foundational concepts to troubleshoot it under pressure.

Absorptive capacity differences between teams are the mechanism behind this scaling failure.

Why technology alone cannot raise the productivity baseline

Returning to the core model from earlier modules: the structural ceiling is the maximum sustainable productivity level achievable given the organisation's current team structure, cognitive load distribution, and communication patterns. The productivity baseline trends toward this ceiling over time.

Technology adoption is a disruption event — it triggers a J-curve (the specific shape of the convergence curve during a single disruption: productivity dips below the pre-change level, then recovers). But crucially, where the recovery endpoint lands is governed by the structural ceiling, not by the technology's inherent power. If the team structures, decision ownership, and communication patterns have not changed, the productivity baseline will converge back toward where it already was — with, at best, a modest upward shift from the portion of accidental complexity that the technology genuinely removed.

Technology gains get absorbed into existing structure. The bottleneck moves; it does not disappear.
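
As a sketch, the whole dynamic fits in a few lines of simulation. Every constant below is an illustrative assumption; the point is that the recovery endpoint is computed from the structural ceiling plus a modest shift, never from the size of the initial promise.

```python
# Minimal sketch of a technology-adoption J-curve, with invented constants.
# The recovery endpoint is set by structure, not by the technology's hype.

structural_ceiling = 100.0  # set by team topology and communication patterns
baseline = 95.0             # pre-adoption productivity, already near the ceiling
tech_shift = 3.0            # modest gain from genuinely removed accidental friction
dip = 15.0                  # initial disruption cost of the adoption event
recovery_rate = 0.35        # fraction of the remaining gap closed each month

# Endpoint: the old baseline plus the technology's modest shift, capped by structure.
endpoint = min(structural_ceiling, baseline + tech_shift)

productivity = baseline - dip   # the J-curve trough
for month in range(1, 13):
    productivity += recovery_rate * (endpoint - productivity)
    print(f"Month {month:2d}: productivity {productivity:5.1f} (endpoint {endpoint:.1f})")
```

However large you make tech_shift's advertised promise elsewhere, the curve here can only converge to where structure allows it to land.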

Concrete example

Apex Engineering is a 200-person software engineering organisation inside a mid-sized financial services company. In Year 3 of their journey, following a microservices migration and a company acquisition, Apex's engineering director leads the adoption of AI-assisted code review across all teams.

The initial results are encouraging. Within the first two months, velocity metrics — measured through deployment frequency and code review cycle time — improve by 12%. Engineers are enthusiastic; the tool flags common issues automatically and reduces the time reviewers spend on style and obvious logic errors. The engineering director begins exploring whether to expand the AI tooling investment further.

Six months later, the picture is different. Velocity gains have plateaued at 5–7%. The AI tooling is still in use and still valued, but the incremental improvement has stalled. Investigation reveals why: the code review process itself has not changed. Reviewers still operate in the same way, with the same scope of review and the same communication patterns. The AI tool addressed some accidental complexity — automating checks that were slow and tedious — but the essential work of architectural review, cross-team dependency discussion, and integration verification remains unchanged. The tool did not alter team boundaries or decision ownership.

The engineering director recognises the signal. The 5–7% sustained gain is real and worth keeping, but further AI investment will not raise this number significantly. The absorptive capacity of Apex's teams for this class of tool has been reached. The underlying workflow and review processes have not changed, so the structural ceiling has not moved.

This is the point at which the engineering director turns her attention back to the structural redesign work from Year 2.5 — the reorganisation into seven stream-aligned teams (long-lived teams that own an end-to-end value stream from user need to production, with minimal handoffs) — recognising that structural change produced more durable productivity gains than the technology adoption. That is the lesson of absorption patterns: technology adds value within the existing ceiling; only structural change raises it.

Analogy

Consider climbing a mountain with better boots. Upgrading from worn-out hiking shoes to high-performance climbing boots makes the ascent genuinely easier — your feet are more comfortable, your grip is more reliable, and you expend less energy on equipment friction. Early climbers who switch see real gains. But the mountain's steepness does not change. The altitude of the summit does not move. Once all climbers have better boots, the improvement from upgrading is exhausted. The hardest sections of the climb remain hard; they were always hard because of the terrain, not the footwear.

This is accidental complexity reduction at work. Better boots address the accidental friction of poor equipment. They do not address the essential difficulty of altitude, weather, and the route itself.

A related analogy: adding a faster self-checkout lane at a grocery store. The lane genuinely speeds up certain transactions, and early adopters (customers who already know the interface) see real time savings. But overall store throughput is also constrained by stockers restocking shelves, cart availability, customer flow from the aisles, and the number of parking spaces. Eliminating checkout friction shifts the bottleneck elsewhere. Total throughput converges toward a baseline set by the store's structural capacity, not by checkout speed alone.
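
The arithmetic of the shifting bottleneck is simple enough to sketch, using made-up stage capacities: overall throughput is the minimum across stages, so raising one stage's capacity helps only until another stage becomes the constraint.

```python
# Bottleneck view of the store analogy: throughput is the minimum across
# stages. Capacities are invented customers-per-hour figures.

stages = {"parking": 120, "aisle flow": 150, "restocking": 110, "checkout": 90}
print(min(stages.values()))   # 90 -> checkout is the current bottleneck

stages["checkout"] = 200      # add faster self-checkout lanes
print(min(stages.values()))   # 110 -> the constraint moves to restocking
```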

Going deeper

The historical pattern of technology optimism

Every generation of software engineering has believed that the current wave of tooling will finally break through the productivity ceiling. Structured programming, relational databases, object-oriented programming, fourth-generation languages, component-based development, agile methodologies, DevOps toolchains, and now AI-assisted development — each was framed, in its moment, as a potential order-of-magnitude improvement. Each delivered genuine gains. None delivered the order of magnitude.

Brooks wrote his essay after observing the OS/360 project, which remains one of the most studied software development efforts in history. His conclusion — that essential complexity is irreducible — was not a prediction about specific technologies but a structural observation about the nature of software work itself. The hypothesis has survived four decades of technological change largely intact.

Absorptive capacity as a concept from organisational theory

The term absorptive capacity originates in organisational theory — specifically, a 1990 paper by Wesley Cohen and Daniel Levinthal titled "Absorptive Capacity: A New Perspective on Learning and Innovation." Their insight was that a firm's ability to exploit external knowledge depends critically on the firm's pre-existing knowledge base. Organisations cannot learn at random from the environment; they can only learn in directions adjacent to what they already know.

This has a direct implication for technology adoption sequencing. Organisations that attempt to adopt a technology for which they lack prerequisite knowledge do not simply learn more slowly — they often learn the wrong things, developing workarounds and cargo-cult practices that mimic the surface behaviour of the technology without capturing its architectural intent. This is a common failure mode in large-scale tool rollouts.

The concept of absorptive capacity reappears in Module 5, where it plays a different role: explaining why different types of organisational change produce different recovery signatures, and why some organisations recover from disruption faster than others.

Technology adoption and the convergence curve

In the terms established earlier in this curriculum: a technology adoption event is a disruption that triggers a convergence curve — the path productivity traces over time from pre-disruption through disruption trough to post-convergence baseline. The technology adoption variant of this curve is distinct from a structural reorganisation in one important way: the recovery endpoint is constrained by the existing structural ceiling.

This is why Apex's microservices migration in Year 1 — covered in earlier modules — also plateaued. Velocity recovered close to its prior level by month 8, but not above it. The team structures had not changed; developers still owned loosely defined slices of the system rather than end-to-end service boundaries. The technology addressed the accidental complexity of deployment and scaling. It could not address the essential coordination friction caused by misaligned team boundaries.

Common misconceptions

"Adopting the technology is the same as gaining the benefit."

Deployment and absorption are not the same thing. A technology is inert until it is integrated into team workflows, mental models, and collaboration patterns. Organisations frequently measure adoption by licence activation rates or feature rollout percentages, then are surprised when productivity metrics do not move in proportion. The gap between deployment and absorption is where the promised gains disappear. Absorption requires time, knowledge overlap, and structural readiness — none of which are supplied by the technology itself.

"Elite early-adopter success will scale organisation-wide."

When a team of senior engineers with deep domain expertise produces remarkable results with a new tool, the natural impulse is to roll that tool out broadly. But the factors that made the early-adopter team successful — technical depth, tolerance for ambiguity, motivation to experiment — are not uniformly distributed. The early majority and late majority cohorts have different knowledge baselines and different cognitive load situations. Scaling a tool without simultaneously building the prerequisite knowledge is a reliable path to disappointing results. Absorptive capacity differences between teams make uniform scaling unreliable.

"More sophisticated technology solves coordination and structure problems."

Organisations sometimes reach for technology when the real constraint is team boundary misalignment, unclear decision ownership, or communication pattern friction. Adding sophisticated tooling to an organisation with structural problems can amplify the problem: the tool adds complexity and cognitive load without addressing the underlying friction. Conway's Law — the principle that organisations design systems that mirror their communication structures — is useful here: if the communication structure is dysfunctional, the systems built with and on top of new tools will reflect that dysfunction. Technology absorption fails when the structural prerequisites for using it well are absent.

Check your understanding

  1. Apex Engineering's AI code review adoption produced a 12% initial velocity gain that plateaued at 5–7%. Using the concept of absorptive capacity, explain why the gains stalled — and what would have had to change for them to continue rising.
Answer: The initial 12% gain reflects genuine reduction in accidental complexity: automated checks replaced manual, time-consuming review steps. The plateau at 5–7% indicates that Apex's teams had absorbed as much of the tool's benefit as their existing workflows and knowledge structures could accommodate. The underlying review process — its scope, the communication patterns around it, the decision ownership — remained unchanged. To continue raising gains, Apex would have needed to redesign those processes (structural change), which would have required teams to build new shared understanding and coordination patterns. Absorptive capacity is constrained by what the organisation already knows and how it currently works; without changing those structural inputs, the ceiling on technology-driven gains is fixed.
  2. A colleague argues: "We should roll out AI-assisted development tools to our entire engineering organisation immediately — our most senior team has already seen great results." What question would you ask first, and why?
Answer: The most important question is: what does the typical team's knowledge baseline look like relative to the senior team's? Early-adopter success is partly attributable to the characteristics of the early adopters — technical depth, motivation, fault tolerance — rather than solely to the tool. If the broader organisation lacks prerequisite knowledge, rolling out the tool will push each team into a learning curve without the expertise to navigate it effectively. The result is not the same productivity trajectory but a much slower absorption, cargo-cult adoption, and potential disillusionment. Before scaling, you would want to understand absorptive capacity variation across teams and build a knowledge-transfer plan alongside the tool rollout.
  3. How does Brooks' distinction between accidental and essential complexity explain why productivity gains from successive generations of tooling tend to get smaller?
Answer: Accidental complexity is bounded: there is a floor below which process friction cannot be reduced. Each generation of tooling removes some portion of the remaining accidental friction. The first high-level language removed enormous friction compared to assembly; the jump from one modern language to another removes much less, because much of the low-hanging friction was already gone. Essential complexity — the irreducible difficulty of the problem being solved — is not touched by tooling improvements at all. As accidental complexity approaches its floor, successive tools compete over a shrinking pool of addressable friction, producing smaller and smaller marginal gains.
  4. Why does a technology adoption J-curve typically not raise the productivity baseline to the same degree as a structural change?
Answer: The J-curve recovery endpoint is governed by the structural ceiling, not by the technology's inherent power. Technology gains are absorbed into the existing team structure, cognitive load distribution, and communication patterns. Those structural factors have not changed, so the productivity baseline they support has not changed. The technology may move some accidental friction out of the way, producing a modest upward shift in where the baseline converges — but it cannot raise the ceiling that the structure sets. Structural change, by contrast, alters team topology, decision ownership, and communication patterns — the factors that define the structural ceiling itself.
  5. An organisation's engineering director says: "Our absorptive capacity is low right now — teams are overloaded. Let's wait until things calm down before adopting any new tools." Is this good advice, and what might she be missing?
Answer: The instinct is sound: adopting new technology when teams are at cognitive load saturation will slow absorption and may produce negative outcomes (poor implementation, increased extraneous load, frustrated teams). However, "wait until things calm down" misses the structural question: why are teams overloaded? If the answer is misaligned team boundaries or coordination overhead — structural problems — then things may not calm down on their own. The director would do well to distinguish between waiting (passively hoping conditions improve) and addressing the structural cause of high cognitive load (structural change that reduces it deliberately). A better formulation: "Our absorptive capacity is low. Let's address the structural cause before introducing new technology."

Key takeaways

  • Technology adoption follows a predictable arc — the Gartner Hype Cycle's "Plateau of Productivity" is not a high-water mark but the settled level of genuine, sustainable benefit, which is typically lower than initial projections.
  • Absorptive capacity — an organisation's ability to integrate new knowledge, technologies, or practices into existing workflows and mental models — is the hard ceiling that governs how fast and how far technology gains can materialise. It is constrained by existing knowledge and structural factors, not by the technology's quality.
  • Brooks' "No Silver Bullet" principle remains accurate: most technologies address accidental complexity (tool friction, process awkwardness), which is bounded and diminishing. Essential complexity — the irreducible difficulty of the problem — is rarely touched.
  • Early-adopter success does not automatically scale. Absorptive capacity differences between teams make uniform rollout unreliable without simultaneous knowledge-building investment.
  • Technology gains get absorbed into existing structure; the bottleneck moves, it does not disappear. Only structural change — altering team topology, cognitive load distribution, and decision ownership — can raise the structural ceiling and with it the productivity baseline.

References

  • Brooks, F. P. (1987). "No Silver Bullet: Essence and Accidents of Software Engineering." IEEE Computer, 20(4), 10–19.
  • Cohen, W. M., & Levinthal, D. A. (1990). "Absorptive Capacity: A New Perspective on Learning and Innovation." Administrative Science Quarterly, 35(1), 128–152.
  • Moore, G. A. (1991). Crossing the Chasm: Marketing and Selling Technology Products to Mainstream Customers. HarperBusiness.
  • Rogers, E. M. (1962). Diffusion of Innovations. Free Press.