Since the 1970s, productivity has outpaced wage growth in many sectors. One sector where wages have kept up with productivity growth is tech, especially among programmers. But I suspect this is changing thanks to the use of AI coding assistants.
To quickly review the relevant economic history, before 1973 wage growth roughly tracked productivity in Western economies, but since then productivity has grown 6 times faster than wages in most industries. Economists have given many possible explanations for this change: globalization, automation, declining union power, and policy shifts. There's not a clear consensus about the causes, but it's clear what the effect is: less value is being captured by labor and more by capital.
Programmers, however, have been largely exempt from this trend. Instead, they've seen a rapid rise in wages, especially over the past 20 years, and an even more rapid rise in non-wage compensation (stock, perks, etc.). Why? Because the production of high-quality software is hard to automate, hard to outsource, critical to businesses, and protected by skill barriers.
AI is changing all of this. It's beginning to automate many programming tasks. It lets businesses effectively outsource some programming work to AI rather than to additional programmers. And, as we've seen with the rise of vibecoding, it's lowering the skill barriers to producing functional code.
I've seen these changes in my own work as a programmer. I use Claude Code nearly every day. Although it doesn't let me do anything I couldn't do before, it lets me do everything I need to do faster and more easily. Gone are the days of spending long hours reading docs and Stack Overflow, trying to puzzle out how to make something work. Now I just ask Claude and, most of the time, it figures it out in a few minutes.
But these gains in productivity may have come at some personal cost. I'm probably 2-3x more productive than I used to be, measured by time to ship code to production. I'm not, however, suddenly being paid 2-3x more for doing this. In fact, I'm being paid just about the same as I was before I started using Claude Code, even though I'm doing a lot more. And looking at public job listings, I don't see signs that anyone else is getting pay raises either.
But if I and other programmers are becoming more productive by using AI, why aren't we seeing an increase in salaries? It could simply be lag. AI is new, and the market hasn't had time to respond. But I'm suspicious that something else is going on.
Across many industries, labor-saving devices have failed to increase wages in proportion to the value they generate. Some of this makes sense: it takes capital to operate these devices, and capital needs to recoup those costs. But naively you might think that, if labor is now more productive, it should be worth paying more for.
What I think this analysis misses is that wages are set in a competitive environment. I'm not actually paid, directly, based on how productive I am. I'm instead paid a wage that clears the programmer hiring market at my level of skill. And since the main effect of AI coding agents is to allow more people to write more code, we've effectively increased the supply of programming labor, putting downward pressure on the hiring market's clearing price.
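To make that mechanism concrete, here's a toy linear model (my own illustrative simplification, not anything drawn from real labor data): suppose the supply of programming labor is $Q_s = a + bw$ and demand is $Q_d = c - dw$, where $w$ is the wage and $b, d > 0$. Setting $Q_s = Q_d$ gives the clearing wage

$$w^* = \frac{c - a}{b + d}.$$

If AI tools shift supply outward (a larger $a$) while demand stays put, $w^*$ falls, even though each individual programmer is now producing more.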
We already see signs that the bottom of the market has been priced out. New grads from computer science programs report great difficulty finding jobs, whereas just a few years ago they were receiving multiple offers. Coding bootcamp grads face similar headwinds. And even among experienced programmers, although there are many open roles, hiring processes are increasingly competitive and companies are increasingly choosy about who they hire.
What does this mean for programmers' career prospects? I'm not certain. I might be jumping the gun in my analysis, and in 12-18 months we may see wages catch up with productivity gains. But if that doesn't happen, then I strongly suspect that programmers will see their wages stagnate even as they become more productive. The silver lining is that, at least for now, many programmers receive equity compensation, which lets them capture some of the value of their productivity gains that would otherwise flow to capital. Hopefully that will be enough for me and my fellow programmers to stay solvent in the years ahead.
Cross-posted to Less Wrong, where you may find additional discussion in the comments.
Hey Gordon! My economic model is that individual worker productivity acts only as a ceiling on wages/compensation, through its impact on labor demand (e.g., few businesses can pay someone more than the value they produce for that business).
Actual compensation is a dance between labor demand and supply. In practice, compensation appears to increase along with productivity, but that's because higher productivity raises demand. Moving from $1 in productivity to $2 increases the number of businesses that can get a positive return on employing that labor, so demand rises.
But if supply rose exactly to match, compensation wouldn't change. Labor supply is very laggy in software, though, because it takes years to train new programmers.
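A toy linear model shows the point (my own simplification, just to illustrate the mechanism): with supply $Q_s = a + bw$ and demand $Q_d = c - dw$, the clearing wage is $w^* = (c - a)/(b + d)$. A productivity gain that shifts demand out by $\Delta$ (so $c$ becomes $c + \Delta$) raises $w^*$ only if supply stays fixed; if supply shifts out by the same $\Delta$, the clearing wage becomes $((c + \Delta) - (a + \Delta))/(b + d)$, exactly where it started.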
Sharing for feedback!
The hard question for me to answer is about the demand for additional software in the world. How much software is currently not economically viable to build because building software is too expensive? I don't even know how to guess at that.
But if a lot of software becomes economically viable at 2x productivity, then demand rises and wages could go up. Even at 100x productivity that could happen, as long as achieving that 100x required a rare level of skill, capability, or effort.
If AI continues to automate and the owners of the businesses capture all that value, it feels essential to own as much of these companies as possible.