AI isn't coming for your programming job
We haven't run out of new information problems to solve
I’ve been using GitHub Copilot a bit and it’s really useful. I’m talking about the older version, not the one currently in beta with a chat interface. I’m sure the new one will be much better, but the current version is already really good.
GitHub released some research showing that Copilot improves programming productivity. Now, it’s GitHub’s product and they charge money for it, so what else are they going to say? But they also tell their own programmers to use it, and code completion appears to be an obvious area where generative AI can be applied to maximum effect:
Code is much simpler and far less nuanced than human language, so the problem of usefully completing code from an initial prompt is much easier than the general AI chatbot use case.
Lots of programming involves writing ordinary “boilerplate” code that does something other programmers have done a zillion times before, like:
Using a programming API to accomplish a standard task like creating a user record.
Issuing a database query.
Processing a form with standard inputs like name, email, phone number, etc.
And on and on and on…
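To make that concrete, here’s a sketch of the kind of boilerplate a completion tool sees endlessly: validating standard form fields and creating a user record. The schema, function name, and validation rules below are invented for illustration, not taken from any real codebase.

```python
import re
import sqlite3

# A deliberately ordinary, hypothetical example: code like this has been
# written a zillion times, which is exactly what makes it easy to complete.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def create_user(conn, name, email, phone):
    """Validate standard form inputs and insert a user record."""
    if not name.strip():
        raise ValueError("name is required")
    if not EMAIL_RE.match(email):
        raise ValueError("invalid email address")
    conn.execute(
        "INSERT INTO users (name, email, phone) VALUES (?, ?, ?)",
        (name.strip(), email, phone),
    )
    conn.commit()

# Usage against an in-memory database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT, phone TEXT)")
create_user(conn, "Ada Lovelace", "ada@example.com", "555-0100")
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
```

Nothing here is novel, which is the point: the pattern is so well-worn that a model trained on public code barely has to guess.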
There’s an enormous amount of existing code to use as training data and, critically, much of that code is known to work. So the training set is large and much of it is high quality.
The number of Copilot users passed a million a while ago. Lots of other companies are getting into the generative-AI-for-coding game as well. It appears that AI-assisted coding is going to be a thing, and it will produce a not-insignificant productivity increase.
When productivity goes up, you don’t need the same number of people to create the same amount of stuff. So one thing that could happen is that overall you build the same amount of stuff but with fewer people, i.e., some people will lose their jobs.
This has obviously happened in, say, agriculture: thanks to mechanized farm equipment and fertilizers, far fewer people are employed in farming today than in the past, even though population growth means we need to grow much more food.
So it’s plausible to say that AI-assisted coding will mean fewer programming jobs. But if you’re going to make that claim, there’s a serious problem you’ll have to face: this is hardly the first time that programming productivity has increased due to advances in technology. I will make what I think are generally non-controversial claims:
Programming in higher-level languages is more productive than writing machine code or assembly and language design continues to advance even today, i.e., languages are getting even more powerful and thus more productive.
Using sophisticated pre-AI Integrated Development Environments is more productive than using primitive text editors as you get:
Code highlighting and folding.
Limited (but still useful) code completion.
Interactive access to documentation.
Sophisticated code searching and navigation.
The availability of open-source modules to accomplish standard tasks has been monotonically increasing over time and this increases productivity by reducing the amount of code you need to write yourself to solve a particular programming problem.
Moore’s Law has increased productivity by vastly increasing the computing power available to run software.
The rise of cloud computing increases productivity through a similar mechanism by putting larger amounts of compute and storage within the reach of more people.
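The open-source-modules point is easy to see in miniature. Parsing CSV with quoted fields is a task programmers once solved by hand; a standard module (Python’s `csv`, used here as a stand-in for the broader module ecosystem) reduces it to a couple of lines. The data is invented for illustration.

```python
import csv
import io

# Hand-writing a quoting-aware CSV parser used to be routine work;
# a standard module turns the whole task into two lines.
data = 'name,email\n"Lovelace, Ada",ada@example.com\n'
rows = list(csv.DictReader(io.StringIO(data)))
```

Every such module is code you no longer write, debug, or maintain yourself, and that compounding effect has been raising programmer productivity for decades, long before AI assistance arrived.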
Through all of this, the number of people employed in creating software in some form or another has been going up (though the job titles have changed over time). In other words, every previous increase in software engineering productivity has led the world to collectively choose to make more software with more people.
The Bureau of Labor Statistics projects that employment of Software Developers, Quality Assurance Analysts, and Testers will grow by 25% over the next ten years. Has the BLS just missed the boat on AI?
If you think AI is coming for programming jobs, then I believe the burden of proof is on you to explain why this time is different, i.e., why this time the world will choose to maintain the same amount of software with fewer people despite never having made that choice before. What that claim amounts to is that just as AI coding comes on the scene, the world has for some reason run out of new information problems to solve. This, I think, is an incredibly strong claim. I also don’t think it’s true.