We're All Software Developers Now

I’ve been writing software for 30 years. In that time, I’ve built bank processing systems, worked on OS kernels and device drivers, written compilers, developed embedded systems, architected large-scale web backends, and shipped consumer software. I led the team that built and received FDA authorization for the first purely software-as-a-medical-device autism diagnosis aid [1]—navigating the full regulatory gauntlet from design through submission to approval. I’ve seen mainframes give way to PCs, the web emerge from nothing, mobile eat the world, and cloud computing reshape infrastructure. I’ve learned to expect constant transformation—it’s the nature of this industry.

But what’s happening now is qualitatively different.

For the last several months, I’ve been working extensively with Claude Code—Anthropic’s AI coding assistant. The experience has been startling, exhilarating, and frankly terrifying. I now routinely do work that would have required a team of 3-5 people six months ago. I’m talking about work that would have needed a designer, multiple developers, QA engineers, and technical writers. I’m doing it alone, for $200 a month.

This isn’t about working slightly faster or being a bit more productive. This is a fundamental shift in what’s possible for a single person to accomplish. The multiplier effect is real, it’s here, and it should both excite and terrify you—it excites and terrifies me.

The Multiplier Effect: What I’ve Actually Seen

When I say “3-5x productivity,” I’m not speaking in abstract terms or citing someone else’s benchmarks. I mean work that, six months ago, would have required a coordinated team of 3-5 people—with all the communication overhead and project management that entails—and that I’m now completing solo.

The most concrete example I can give you is Isambard [2], a project I wrote about recently. In about a month of intermittent work—evenings, weekends, whenever I had time—I built a production-quality Discord bot framework with AWS infrastructure integration and full Claude Agent SDK support. When I wrote that blog post a couple of weeks ago, it was 45,000 lines of TypeScript. It’s now over 54,000 and continues to grow. It’s not just “working code” either—it has 100% mutation test coverage, meaning that when a mutation-testing tool deliberately seeds small bugs into the code, the test suite catches every single one. It’s not just tested; the tests themselves are verified to be meaningful. It’s maintainable, extensible, solid.
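
If you haven’t run into mutation testing before, here’s a toy illustration. This is hypothetical code, nothing from Isambard: a tool like Stryker seeds small bugs (“mutants”) into your source and re-runs your tests, and 100% mutation coverage means every mutant gets caught.

```typescript
// Hypothetical illustration: not Isambard code. A mutation-testing tool
// (e.g. Stryker for TypeScript) seeds small bugs ("mutants") into code
// like this and re-runs the tests.
import assert from "node:assert";

export function isAdult(age: number): boolean {
  return age >= 18;
}

// A weak test: it passes for the original code AND for the mutant
// `age > 18`, so that mutant would survive and mutation coverage drops.
assert.equal(isAdult(30), true);

// Boundary tests: the mutant `age > 18` returns false for 18, the first
// assertion throws, and the mutant is "killed". 100% mutation coverage
// means every seeded mutant dies this way.
assert.equal(isAdult(18), true);
assert.equal(isAdult(17), false);
```

Weak assertions let mutants survive; boundary-hugging tests kill them. That’s the discipline the number enforces.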

Six months ago, that would have been a 3-6 month project for a small team. Minimum. You’d need someone to design the architecture, multiple developers to implement different components, QA to write comprehensive tests, and someone to write documentation. The coordination alone would eat weeks. I did it in a month, working intermittently, part-time. [3]

In my entire software career, I have always been resource-constrained. Always. There has never been enough time, never enough developers, never enough capacity to build everything worth building. I have never once had an empty backlog. Software is never done. There is always another feature to build, another system to improve, another idea worth exploring.

A 3-5x multiplier is more than welcome—and it’s still not enough. I’m still not producing as much software as I want to produce, as quickly as I want to produce it. Give me more.

This is the crucial insight that many people miss when they worry about AI eliminating developer jobs: amplification doesn’t mean you need fewer developers—it means you produce more software. The backlog doesn’t shrink; it expands. What changes is the scope of what becomes possible.

If I had a team of 3-5 people working for me right now, I wouldn’t lay them off. I’d teach them how to use these tools to become 3-5x more productive themselves. Then we’d tackle projects that are currently unthinkable. That’s what amplification means in practice.

At least, that’s what it means for now.

The Pace of Change: Why This Time Really Is Different

Six months ago, I was using Aider. It was the best tool available, and it worked—sort of. But it required constant hand-holding. You had to break everything down into tiny steps, review every change carefully, and frequently backtrack when the AI went off the rails. It was helpful, but barely. Pre-Opus 4.5, pre-Codex 5.x, the models just weren’t good enough to be truly transformational.

Today is completely different. The combination of Anthropic’s newest models with massive improvements in Claude Code has crossed some invisible threshold. The tool doesn’t just assist—it understands. It maintains context across complex codebases, suggests architectural improvements, catches bugs before I do, and writes tests that I would have written myself. The experience went from “frustratingly helpful” to something else entirely.

And the rate of improvement appears to be accelerating, not plateauing. That’s what keeps me awake at night.

Six months from now, I expect another 3-5x improvement over today. Compounding, that would put me at 9-25x more productive than I was a year earlier. Six months after that? We might be looking at 27-125x improvement over an 18-month period.

For context, Moore’s Law gave us a doubling of compute power every 18 months. That was revolutionary. The entire software industry adapted to that pace—we learned how to ride that exponential curve, how to plan for it, how to structure our careers and businesses around it.

3-5x every six months isn’t just quantitatively faster than Moore’s Law. It’s qualitatively different. It’s not a pace of change we know how to manage. Our institutions, our educational systems, our career planning, our corporate structures—none of them are built for this.

We’re used to managing Moore’s Law’s pace of change. This new pace? This is utterly disruptive. And I don’t mean “disruptive” in the usual Silicon Valley sense where someone’s excited about their new app. I mean disruptive in the sense that we don’t yet know how to think about or manage the change itself.

The Expanding Definition of “Developer”

So how many software developers are there in the world? The answer depends entirely on who you count.

Traditional industry surveys put the number of professional developers at around 36 million globally. [4] Include serious hobbyists, and that number jumps to around 47 million. GitHub claims over 180 million users, though many of those accounts are inactive or non-developers.

Now consider this: ChatGPT has roughly 900 million weekly active users. [5] Surveys suggest over 40% of ChatGPT users have used it for code. [6] Even if we assume only 10% of those 900 million have used AI to produce working code in a meaningful way, that’s 90 million people—already more than double the “official” developer count.

And the code they’re creating is real. Index.dev reports that 41% of all new code in 2025 is AI-generated or AI-assisted. [7] Y Combinator’s Winter 2025 batch had 25% of startups with codebases that were 95% AI-generated. [8] This isn’t experimental anymore. This is production software running in the real world.

Who are these new “developers”? They’re not who you’d expect.

Biology researchers are generating next-generation sequencing analysis pipelines with no programming background—work recently published in Biochemistry and Molecular Biology Education as a case study in democratized computational biology. [9] Journalists are building web scrapers to investigate stories—the Pulitzer Center published a guide titled “How Non-Coding Journalists Can Build Web Scrapers With AI.” [10] Kevin Roose at the New York Times has been documenting his “software for one” experiments, building personal apps in 10 minutes that solve one-off problems. [11]

Is the software produced “solid”? Often not. Is it maintainable? Usually no. Does it follow best practices? Rarely. But for single-use tasks, does any of that matter?

If you need to analyze a CSV file once, or scrape data from a website for a one-time research project, or build a quick calculator for your specific workflow, who cares if the code is “maintainable”? You used it, it worked, you’re done. The software served its purpose.
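
Here’s the flavor of the thing: a hypothetical one-shot script of the kind an AI assistant will happily generate in seconds, used once and then deleted.

```typescript
// total-column.ts: hypothetical throwaway script that sums the third
// column of a CSV you will never look at again.
import { readFileSync } from "node:fs";

const rows = readFileSync("survey-results.csv", "utf8")
  .trim()
  .split("\n")
  .slice(1)                        // skip the header row
  .map((line) => line.split(",")); // naive split; breaks on quoted commas, fine for a one-off

const total = rows.reduce((sum, cols) => sum + Number(cols[2]), 0);
console.log(`Total of column 3: ${total}`);
```

No error handling, no tests, no maintainability. None of that matters, because the script’s whole life is one run.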

This is the democratization of software development. Different groups will have different standards for what constitutes “adequate” software. A journalist scraping a website doesn’t need a maintainable codebase—they need the data for their story.

Of course, some of these domains have higher stakes than others. A biologist analyzing sequencing data that might inform FDA approval of a mass-market drug really does need code that produces correct results—not code that merely looks right. A lawyer using AI to research case law needs to verify those cases actually exist. An analyst processing financial data needs to know that edge cases didn’t silently drop half the rows.

These new developers won’t know what “adequate” means at first. There will be mistakes—critical systems that fail because someone trusted AI-generated output without verification. Lawyers have already cited non-existent case law. Data pipelines have silently corrupted results. But this is how professional software development matured too: through painful lessons, gradually developing practices and tools to catch errors before they matter.

The difference now is that both the tools and the users will improve simultaneously. As AI systems gain better built-in testing, validation, and quality checks, and as users develop intuition for when to trust and when to verify, these incidents will become rarer. The gap between professional and amateur code is narrowing, and it’s narrowing fast.

The Luddite Parallel

We call people who resist technological change “Luddites.” The actual history is worth remembering. [12]

The original Luddites were English textile workers between 1811 and 1816 who destroyed weaving machinery that threatened their livelihoods. They weren’t ignorant or irrational—they were skilled craftspeople watching their skills become obsolete, their wages plummet, and their communities collapse. The British government’s response was harsh—they deployed more troops to suppress the Luddites than they had fighting Napoleon on the Continent at the same time. The fear was real: unchecked, Luddite unrest could broaden into full revolution and societal collapse.

Two hundred years later, the Industrial Revolution inarguably improved working-class lives in Britain and worldwide. Life expectancy doubled. Poverty decreased dramatically. Education became universal. The standard of living exploded.

We forget that the transition was brutal. Many people starved. Many suffered while the new order developed. It took generations for the benefits to materialize and distribute widely. Whatever the long-term outcome, the Luddites’ short-term pain was acute and real.

A Taoist parable called “The Farmer’s Horse” captures this uncertainty:

A farmer’s horse runs away.

“How terrible!”

“We’ll see.”

The horse returns with wild horses.

“How wonderful!”

“We’ll see.”

The farmer’s son breaks his leg training a wild horse.

“How terrible!”

“We’ll see.”

The army comes to conscript young men for war. The son is spared because of his broken leg.

“How wonderful!”

“We’ll see.”

We’re in the middle of the story.

The Industrial Revolution happened at a pace that society could barely manage, and our support systems—government assistance, education, labor protections—evolved slowly in response. Those systems are still not adapted to Moore’s Law’s pace of change, much less to what we’re experiencing now. We’re utterly unprepared, institutionally, for 3-5x improvements every six months.

There’s no neutral word for what’s coming. “Transformation” undersells it—it sounds managed, intentional, positive. “Cataclysm” oversells it and implies we know the outcome will be negative. But we don’t. Like the farmer, all we can honestly say is: “We’ll see.”

Will Developers Lose Jobs?

The question everyone wants answered: will software developers lose their jobs?

The honest answer is: I don’t know. Here are three cases.

The optimistic case: Backlogs never empty. There is always more software to build, always another problem worth solving, always another idea worth exploring. Amplification creates more and better software for the same cost, not the same software for less cost. Developers don’t disappear—they become more capable, more valuable, able to tackle bigger problems. The role evolves but doesn’t vanish. Lots of software gets built without them, but plenty still needs them.

The pessimistic case: At some point, simple math kicks in. If one developer can now do the work of 30 developers, don’t you need 1/30th the headcount? Even accounting for expanded backlogs and new projects, the equation eventually favors downsizing. Junior roles disappear first—why hire someone to write boilerplate when AI does it instantly? Mid-level roles get squeezed as the bar for what constitutes “valuable developer work” rises sharply. Only senior architects and specialists survive, and even they are much more productive, meaning you need fewer of them.

The middle case: The definition of “developer” expands dramatically while the profession itself transforms. Many people who never would have called themselves developers—journalists, researchers, designers, analysts—now build software as a natural part of their work. Meanwhile, the bar for professional developers rises. Junior roles do change, but they don’t disappear—they require different skills. The ability to prompt effectively, validate AI output, and integrate AI-generated code becomes as important as traditional coding skills. Professional developers become more like architects, reviewers, and integrators than line-by-line coders. The strongest skills become user empathy, domain understanding, critical thinking—already the hallmarks of the best software engineers, but not the core toolkit of many today.

I’m a techno-optimist by nature. I believe in human adaptability and the power of technology to improve lives. But I’m also watching this transformation unfold in real-time, and I can see the disruption clearly. I’m both excited and terrified, and I think both reactions are appropriate.

I said earlier that if I had a team, I’d invest in amplification. But not every manager will make that choice. Some will see the opportunity; others will see only the cost savings. Disruption is… disruptive. Some developers will thrive. Some will struggle. The same is true for managers, for companies, for shareholders. The winners and losers aren’t predetermined, and they won’t sort neatly along any obvious line.

Many software “end users” won’t need professionally-developed software anymore, nor professional software developers to make it. The bar for being a software developer keeps dropping. A journalist who can articulate what they need and iteratively work with AI to refine it doesn’t need to hire a developer anymore. They can build it themselves. The smarter non-technical folks can “know it when they see it” and work with AI until they get what they need.

That’s both beautiful and terrifying. Beautiful because it democratizes software creation. Terrifying because it displaces professional work wholesale.

The Economics Question

So what happens economically?

For professional developers, my best guess is that the top end gets even stronger. High-end work will be even more polished, more capable, delivered faster. Senior developers who can effectively use AI to amplify their work will be extraordinarily valuable. The work they produce will be better, and they’ll produce more of it.

At the low end, there will be an explosion of amateur software. Most of it will be disposable, and that’s fine. If it solves one problem and gets thrown away, it served its purpose. This software was never going to be commissioned professionally anyway—it only exists because the barrier to creation dropped so low.

The squeeze comes in the middle. Routine professional work that can be automated, boilerplate systems that follow known patterns, straightforward CRUD applications—this is where disruption hits hardest. It’s not that this work disappears entirely, but the number of people needed to do it shrinks.

Paradoxically, the quality of this middle-tier software may actually increase. Mediocre developers today produce mediocre software—it’s more expensive than it should be and often poorly suited to users’ needs. When a domain expert can work directly with AI to build exactly what they need, the result may be better than what they’d have gotten from a developer who didn’t understand the problem.

Current quality concerns are real but temporary. Right now, 46% of developers don’t trust AI output accuracy, and 75% won’t merge AI code without manual review. [13] The “vibe coding hangover” is real—senior engineers report “development hell” when maintaining AI-generated code that lacks proper structure. [14]

I’ve largely moved past this myself. I rarely review the actual code in Isambard anymore. Instead, I’ve built infrastructure around Claude Code that prevents quality from dropping too far—rigorous linting, comprehensive testing, strong typing, mutation testing. Now I mostly just trust it. Half my agentic cycles are “help me design and add this new feature” and the other half are “go through all the tests and make sure they’re legitimate; think about angles that aren’t covered and cover them; can we refactor things to be more solid? Do that.” The AI does both halves.
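
As a sketch of what that infrastructure can look like (the specific tools here, tsc, ESLint, Vitest, and Stryker, are illustrative stand-ins, not a claim about Isambard’s actual stack), here’s the kind of gate script every agentic cycle has to pass before its work counts:

```typescript
// scripts/verify.ts: illustrative quality gate run after every agentic
// cycle. The tool choices are assumptions, not Isambard's actual stack.
import { execSync } from "node:child_process";

const gates = [
  "tsc --noEmit",              // strong typing: the whole tree must type-check
  "eslint . --max-warnings 0", // rigorous linting: even warnings fail the gate
  "vitest run",                // the full test suite must pass
  "stryker run",               // mutation testing: surviving mutants fail the build
];

for (const gate of gates) {
  console.log(`> ${gate}`);
  // execSync throws on a non-zero exit code, halting the gate immediately.
  execSync(gate, { stdio: "inherit" });
}

console.log("All quality gates passed.");
```

The point isn’t the particular tools; it’s that the agent can’t declare victory until an independent harness agrees.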

This kind of infrastructure is going to get folded into baseline AI coding tools, inevitably—it’s low-hanging fruit that dramatically improves output quality. The trust issues, the “vibe coding hangover,” the need for manual review? These are current limitations, not permanent ones. Six months ago, Claude Code couldn’t maintain context across a large codebase. Today it can. Six months from now? Security vulnerabilities will decrease, accuracy will improve, generated code will be more maintainable. Maybe even formal proofs of correctness will be automatic. [15]

The tools are getting better faster than we’re used to adapting to them—and possibly faster than we can adapt to them. That’s the core challenge.

Exponential Growth Is Always a Motherfucker

Exponential growth is always a motherfucker. The human brain didn’t evolve to reason about exponential processes. We’re linear thinkers trying to navigate an exponential world.

In 30 years of living and breathing the insane rates of transformation in software, I have never seen anything remotely close to this pace. Not the web boom. Not mobile. Not cloud. Not even close. This is different. Technology is disrupting technological development itself. The meta-stability is gone.

Don’t get me wrong—there have been real productivity shifts over those 30 years. CVS to Subversion to Git. IDEs that actually understood code. Stack Overflow. Container orchestration. Each one mattered. But you want to know the single biggest felt productivity boost I experienced before now? Backlit laptop keyboards. Suddenly I could work a few extra hours into the night without squinting—an instant doubling of my productive hours! (Yes, I know what that says about developer work culture. That’s a different essay.) The point is: all those genuine improvements—and there were many—feel incremental compared to what’s happening now. AI doesn’t just dwarf them; it makes them feel like rounding errors.

The internet growth during the late 90s and early 2000s was massively impactful for the world, but for developers specifically? It meant new platforms to build on, new problems to solve. It didn’t make us dramatically more productive at the act of writing code. This does.

I don’t have prescriptive advice. I don’t know what you should do with your career, your company, your investments, or your fears. What I can offer is a framing:

Stay curious. The tools will keep changing. What won’t change is the value of being able to adapt, to learn, to understand what you’re trying to build and whether the AI actually built it. The ability to validate, to architect, to understand systems—those skills aren’t going away. They might be the only skills that don’t go away.

Stay learning. If you’re a developer, learn to use these tools effectively. Don’t resist them out of pride or fear. Learn to amplify yourself. If you’re not a developer but you have problems to solve, experiment. You might be surprised what you can build.

Stay agile. Don’t plan based on what AI could do six months ago, or what it can do today, or even what you think it will do a year from now. You will fall behind if you operate that way. Plan in shorter increments. Use AI to help you plan. Accelerate your own velocity—and keep accelerating. The pace isn’t slowing down.

And accept the uncertainty. We are in the middle of something whose shape we cannot yet see. It might be wonderful. It might be terrible. Most likely it will be both, in ways we can’t predict, affecting different people and communities differently.

I’m excited because the possibilities are genuinely thrilling. I’m terrified because the pace of change is faster than our institutions can handle, and I don’t know who gets hurt in the transition.

Right now, this explosion of productivity has me super excited for the future.

“How wonderful!”

“We’ll see.”


[1] Cognoa Receives FDA Marketing Authorization for First-of-Its-Kind Autism Diagnosis Aid, PR Newswire, 2021.
[2] Building Isambard, my blog post about the project.
[3] Isambard commit history on GitHub.
[4] Global Developer Population Trends 2025, SlashData, 2025.
[5] ChatGPT Nears 900 Million Weekly Active Users, The Information, December 2025.
[6] ChatGPT Usage Statistics, Zebracat, 2025.
[7] Developer Productivity Statistics With AI Tools, Index.dev, 2025.
[8] A quarter of startups in YC’s current cohort have codebases that are almost entirely AI-generated, TechCrunch, March 2025.
[9] AI-assisted computational biology education, Biochemistry and Molecular Biology Education, 2025.
[10] How Non-Coding Journalists Can Build Web Scrapers With AI, Pulitzer Center.
[11] Vibe Coding: Building Software for One, Kevin Roose, The New York Times, February 2025.
[12] Brian Merchant’s Blood in the Machine is an excellent account of the Luddite uprising and its parallels to today’s tech disruption.
[13] Stack Overflow Developer Survey 2025: AI, Stack Overflow, 2025.
[14] The vibe coding hangover is upon us, Fast Company, 2025.
[15] Sung Kim on formal verification, Bluesky, 2026.