Software engineering as an occupation grew because of static analysis and garbage collection (they're literally why the labor market is the size it is today); AI advances appear to be producing the opposite outcome.
The same happened with accountants and spreadsheet software: the number of accounting jobs grew, but the actual work they performed changed. I think something similar is likely to happen in the software world.
Tech has already learned there isn't enough real frontier left to reap a bounty from (once you remove the zero interest rates that incentivized the mere flow of capital). This stuff is being invested in to yield the most productivity at the least cost. There will either be a permanent net decrease in demand or, the work being so high level, most openings will pay no more than $60-70K (likely with reduced benefits) in an America where wages are already largely stagnant.
I think there is definitely merit to your statements. I believe the future of the average software developer job involves a very high-level language, API integration, and basic full-stack work with a lot of AI assistance. And those roles will mostly be at small-to-medium businesses that can't afford the salaries or benefits that are standard in the US industry.
Almost every small business I know has an accountant or bookkeeper position filled by someone with no formal education whose role is just managing QuickBooks. I don't think the need for formally educated accountants who can handle large corporate books has decreased significantly, though I don't have any numbers to back that up. I'm just making the comparison to say I don't think the hard/cool stuff that a lot of software developers love doing is going away. But these are just my thoughts.
It's reasonable to expect that sometime relatively soon, AI will be a clear-cut aid to developer productivity. At the moment, I consider it a wash. Chatbots don't clearly save me time, but they clearly save me effort, which is a more important resource to conserve.
Software is still heavily rate-limited by how much of it developers can write. Making it possible for them to write more will result in more software, rather than fewer developers. I've seen nothing from AI, either in production or on the horizon, that suggests that it will meaningfully lower the barrier to entry for practicing the profession, let alone enable non-developers to do the work developers do. It will make it easier for the inexperienced to do tasks that need a bit of scripting, which is good.
> Software is still heavily rate-limited by how much of it developers can write
Hmm. We have very different experiences here. IME, the vast majority of industry work is understanding, tweaking, and integrating existing software. There is very little "software writing" as a percentage of the total time developers spend doing their jobs across industry. That is the collective myth the industry uses to make the job seem more appealing and creative than it is.
At least, this is my experience in the large FAANG type companies. We already have so much code. Just figuring out what that code does and what else to do with it constitutes the majority of the work. There is a huge legibility issue: relatively simple things are obscured by a morass of complexity many layers deep. A huge additional fraction of time is spent on deployments and monitoring. A very small fraction of the work is creatively developing new software. For example, one person will creatively develop the interface and overall design for a new cloud service; the vast majority of work after that point is spent on integration, monitoring, testing, releases, and so on.
The largest task for AI here would be understanding what is going on at both (1) the technical layer and (2) the fuzzy human layer on top. If it can only do (1), then knowledge workers will still spend a lot of effort doing (2) and figuring out how to turn insights from (1) into cash flow.
>At least, this is my experience in the large FAANG type companies. We already have so much code. Just figuring out what that code does and what else to do with it constitutes the majority of the work.
That sounds horrible. I've always sought out smaller companies that need stuff built. It certainly doesn't pay as much as the SV companies do, but it's pretty stimulating. Sometimes being a big fish in a small pond is pretty nice.
IMO, maintaining someone else's code is probably the worst type of programming job there is, especially if it's bad/disjointed code. A lot of people can make a good living doing it though. It would be nice if AI could alleviate the pain of learning and figuring out a gnarly codebase.
Yep. It's not very satisfying, but that's the state of things. I think we should be more honest as an industry about that. Most of the content that prospective SWEs look at has a self-marketing slant that makes things look more interesting than they typically are. The reality is far more mundane. Or worse: micromanaging, pressure-driven, and abusive in many places.
> I've seen nothing from AI, either in production or on the horizon, that suggests that it will meaningfully lower the barrier to entry for practicing the profession, let alone enable non-developers to do the work developers do.
Good observation. Come to think of it, all the examples of AI coding I've seen require a competent human to hold the other end, or else it makes subtle errors.
How many humans do you need per project, though? The number can only go down as AI tooling improves. And will employers pay the same rates when they're already paying a subscription for their AI tools and the work involved is so much more high level?
I don't claim to have any particular prescience here, but doesn't this assume that the scope of "software" remains static? The potential universe of programmatically implementable solutions is vast. It just so happens that many or most of those potential future verticals are not commercially viable in 2024.
Exactly. Custom software is currently very expensive. Making it cheaper to produce will presumably increase demand for it. Whether this results in more or fewer unemployed SWEs, and if I'll be one of them, I don't know.
> Making it possible for them to write more will result in more software, rather than fewer developers.
Goddamnit, software developers are already writing more software than we need. I wish they'd stop. Or redirect all that energy to new problems to solve. Instead, we're seeing cloud-deployed microservice-architecture CRUD apps that do what systems built for mainframes with kilobytes of RAM did, only worse. We're in a glut of bad software; do you think AI accelerating production of more of the same will make things better?
If chatbots aren't saving you time, you need to refine what you choose to use them for. They're absolutely amazing at refactoring, producing documentation, adding comments, translating structured text files from one format to another, implementing well-known algorithms in newer/niche languages where repository versions might not exist, etc. On the other hand, I've mostly stopped asking GPT-4 to write quickstart code for libraries that don't have star counts in the high thousands at least, and while I'll let it convert CSS/style objects/etc. into Tailwind, it's pretty bad at styling in general, though it is good at suggesting potentially problematic styles when debugging layout.
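To make the format-translation case concrete, here's roughly the shape of it, sketched with the OpenAI Python SDK (v1+). The file names and prompt wording are made up, and the output still wants a human review:

    # Sketch: farm a structured-text conversion out to GPT-4.
    # Assumes the openai package (v1+) with OPENAI_API_KEY set in the
    # environment; file names and prompt wording are illustrative only.
    from pathlib import Path

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    source = Path("config.toml").read_text()

    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Convert the user's TOML into equivalent YAML. "
                        "Output only the YAML, with no commentary."},
            {"role": "user", "content": source},
        ],
    )

    # Worth eyeballing before anything downstream depends on it.
    Path("config.yaml").write_text(resp.choices[0].message.content)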
> you need to refine what you choose to use them for
This is making assumptions about the work I do which don't happen to be valid.
For example:
> libraries that [...] have star counts in the high thousands at least
Play little to no role in my work, and
> I'll let it convert CSS/style objects/etc. into Tailwind
Is something I simply don't have a use for.
Clearly your mileage varies, and that's fine. What I've found is that for the sort of task I farm out to the chatbots, the time spent explaining myself clearly, showing the bot counterexamples when it gets things wrong, and otherwise verifying that the code is fit for purpose is right around the time I would spend on the task to begin with.
But it's less effort, which is good. I find that at least as valuable if not more so.
I remember watching this really funny video where a writer by trade was talking about recent AI products he was exploring.
He saw a "Make longer" button which took some text and made it longer by fluffing it out.
He was saying that it was the antithesis of his entire career.
As a high schooler who really didn't care, I would've loved it, though.
I heard one CEO asked about gen-AI tools being adopted in his company. The answer was vague, along the lines of "we're evaluating the tooling." One good example did come up, though: ChatGPT is really good at writing emails, and at summarizing text as well.
He said they don't want a situation where the sender uses ChatGPT to write a fancy email and the recipient uses ChatGPT to read it. However, I think that's exactly the direction we're heading right now.
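You could build that whole loop in a dozen lines today. A toy sketch, again assuming the OpenAI Python SDK, with made-up prompts and note text:

    # Toy sketch of the email round-trip: the sender inflates a note,
    # the recipient deflates it back. Assumes the openai package (v1+)
    # with OPENAI_API_KEY set; model choice and prompts are illustrative.
    from openai import OpenAI

    client = OpenAI()

    def ask(instruction: str, text: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system", "content": instruction},
                {"role": "user", "content": text},
            ],
        )
        return resp.choices[0].message.content

    note = "Release slips to Friday; blocker in the payments integration."

    fancy = ask("Expand this note into a polite, professional email.", note)
    summary = ask("Summarize this email in one sentence.", fancy)

    print(summary)  # with luck, roughly the note we started from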
I was giving examples, in the hopes that you could see the trend I was pointing towards for your own benefit. You can take that and learn from it or get offended and learn nothing, up to you.
Not sure why you are scared of GPT-assisted documentation. First drafts are universally garbage; honestly, I expect GPT to produce a better and more accurate first draft in a fraction of the time, which should encourage a lot of people who otherwise wouldn't have documented anything at all to produce passable documentation.
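For what it's worth, the workflow I have in mind looks like the sketch below: hand the model real source, take the draft, then edit it like any other first draft. The helper name, prompt, and model choice are all made up.

    # Sketch: get a first-draft docstring for an undocumented function.
    # Assumes the openai package (v1+) with OPENAI_API_KEY set; the
    # helper name, prompt, and model choice are illustrative.
    import inspect

    from openai import OpenAI

    client = OpenAI()

    def draft_docstring(func) -> str:
        """Return a model-drafted docstring; a human still edits it."""
        source = inspect.getsource(func)
        resp = client.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system",
                 "content": "Write a concise Google-style docstring for "
                            "this Python function. Output only the "
                            "docstring text."},
                {"role": "user", "content": source},
            ],
        )
        return resp.choices[0].message.content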
> Yikes. Not looking forward to that in the future.
Instead of documentation, I'm hoping more for "analysis": a helper that can take in a whole project (legacy or not), tell you what it's supposed to be doing, and maybe point out areas for improvement.
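A crude sketch of the direction, assuming the OpenAI Python SDK; the directory name, character budget, and prompt are placeholders, and a real tool would need proper chunking or retrieval rather than a naive size cap:

    # Sketch: ask a model what a whole (small) project appears to do.
    # Assumes the openai package (v1+) with OPENAI_API_KEY set; the
    # path and MAX_CHARS budget are stand-ins for real chunking or
    # retrieval over a large codebase.
    from pathlib import Path

    from openai import OpenAI

    client = OpenAI()

    MAX_CHARS = 40_000  # crude budget to stay within the context window

    chunks, total = [], 0
    for path in sorted(Path("legacy_project").rglob("*.py")):
        text = f"# file: {path}\n{path.read_text()}\n"
        if total + len(text) > MAX_CHARS:
            break
        chunks.append(text)
        total += len(text)

    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "You are reviewing a codebase. Explain what it is "
                        "supposed to do and point out areas for improvement."},
            {"role": "user", "content": "".join(chunks)},
        ],
    )
    print(resp.choices[0].message.content)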