As a STEM discipline, software engineering is probably still secure against AI tool wielders waving around chainsaws without the necessary training and accidentally sawing production databases in half.
What LLMs are poised to replace is basically the entirety of the liberal arts and academic disciplines, where output is largely measured on the production of language - irrespective of that language's correspondence to any actual underlying reality. Musicians and authors - fiction and non-fiction alike - are already witnessing their impending obsolescence in real time.
Neither musicians nor authors are going anywhere, and it's kind of bizarre to claim otherwise.
People make music because they want to. They always have, since the dawn of civilization. Even if we end up somehow making music bots that can generate something OK, people won't magically stop wanting to make music.
And in both cases, it's important to remember that ML/LLMs/etc. can only rearrange what's already been created. They're not really capable of generating anything novel - which matters for both music and writing.
LLMs specifically and AI generally are not going to replace your Taylor Swift, Ed Sheeran, Twenty One Pilots, Linkin Park, etc.
Speaking to the parent comment specifically - it strikes me as uninformed that “liberal arts and academic” disciplines rely only on language production. That’s absurd.
I have a hard science degree, but I started my education in the humanities and spent time studying business and management as well. That gave me an immense advantage that many technologists lack. More importantly, it gave me insight into what people with only a hard science education don't understand about those fields. The take that they aren't connected to reality is chief among those misunderstandings.
LLMs can’t replace any field for which advancement or innovation relies upon new thinking. This is different from areas where we have worked out mathematically articulable frameworks, like in organic chemistry. Even in those areas, actual experts aren’t replaceable by these technologies.
> The take that they aren't connected to reality is chief among those misunderstandings.
On the contrary: twentieth-century leftist thinking, and post-modernism in particular, is infamous for dispensing with the notion of objective reality and the enterprise of science and empiricism entirely, rejecting them as grand narratives that exist merely to perpetuate the ruling power structures [0].
> Speaking to the parent comment specifically - it strikes me as uninformed that “liberal arts and academic” disciplines rely only on language production. That’s absurd.
If everything is narrative in structure and linguistically mediated, then yes, these disciplines are primarily focused on the production of language as a means for actuating reality.
The comment I replied to read as if the liberal arts are predicated on producing language rather than language with utility, which is something I’ve heard before coming from people who, often, have no real exposure to those areas of academia. I wasn’t attempting to paint with a broad brush in general terms across all of history or litigate specific schools of thought.
It’s certainly true that those disciplines rely on the production of language in much the same way (at least in application) as computer science relies on the production of some kind of logical expression device (be it an electronic abacus, a web application, or a paper formally analyzing an ISA).
> as if the liberal arts are predicated on producing language rather than language with utility
I'm sure that the liberal arts are engaged in activities and the production of language with some utility, but this is orthogonal to the question of its correspondence to reality or its epistemic value, as originally posed.
Playing the publish or perish game is different from developing some genuine insight, "justified true belief," into the state and mechanics of the world.
The grievance studies affair [0] is replete with "scholarly works" accepted for publication that are devoid of both epistemic and utilitarian value, ranging from the merely absurd to literally paraphrasing Hitler.
> And for both cases, it's important to remember that ML/LLM/etc can only re-arrange what's already been created.
There's millions upon millions of pieces of music, books, and other media forms in existence. Pretty much anything humans make is derivative in some way. The idea that humans create totally novel things is sort of silly. TVTropes is a thing for a reason.
I'm definitely not an AI booster, but I'm realistic about the stuff humans create. It's entirely possible for AI to write hit pop songs and you'd never know unless someone admitted to it. They're not complicated, and they're extremely derivative even when humans write them.
> it's important to remember that ML/LLM/etc can only re-arrange what's already been created. They're not really capable of generating anything novel
Any music that is completely novel is crap. All good music is inspired by, borrows from and outright steals from past music.
“Immature poets imitate; mature poets steal; bad poets deface what they take, and good poets make it into something better, or at least something different. The good poet welds his theft into a whole of feeling which is unique, utterly different than that from which it is torn.”
- T.S. Eliot
An LLM may well not be as good as a human at creating music. But it's certainly capable of creating music that is either more or less novel than any arbitrary human composition. You just need randomness to create novel music; making it tasteful still requires a human.
“They're not really capable of generating anything novel”
This is one of those statements that can be comforting, but it's only true for increasingly specific definitions of “novel”. It relies on looking at inputs (how LLMs work) rather than outputs (what they actually create).
A multi-modal LLM can draw “inspiration” from numerous sources, enough to make outputs that have never been created before. They can absolutely make novel things.
I think the real question is whether they can have taste. Can they really tell good ideas from bad ones beyond obvious superficial issues?
> I think the real question is whether they can have taste. Can they really tell good ideas from bad ones beyond obvious superficial issues?
Someone not musically talented but appreciates music could generate a hundred songs with an AI, pick ten good ones, and have an album that could sell well. The AI is a tool, the user is the only one that really needs taste.
Most people like to create art for other people to appreciate. It's a rare individual who wants to make art and then show it to no one.
Unfortunately, the market for art will be filled with lowest-common-denominator stuff generated by AI. Only the high end will be driven by non-AI work, and there's not enough of a market for everyone to do high-end stuff.
Unfortunately, "want to" doesn't pay the bills. Before recorded music, there was a lot more demand for a jazz saxophone player, and that musician made a decent middle class living off of that skill. Now we have Spotify and don't need to pay that musician. Suno.ai makes very passible music. Midjourney makes passible art. The microwave makes passible food. LLMs write passible books.
Programming as a job will go the way of the saxophone player as a job, but then what's left? Are we a jester economy? One where everyone competes to perform tricks for wealthy billionaires in order to get some money to eat?
You do not need wealth to make 'great art'. It's nice to have access to the best software tools, the highest quality paints, the finest instruments, etc - but those have never been needed. It's pretty reductive to think of art that way.
Wealth is the time required to make great art; every great artist needs this.
Currently, some artists are able to make a living from their art, and can spend the time and effort to create great art without being independently wealthy, but that is going to become increasingly difficult.
Music and other arts are a manifestation of human life and its ephemeral nature. And because a human creates it, another human can relate to it, because we all share the same fate in the end.
LLMs have no understanding of the limited time of human existence. That's why what they produce is flat and lacking emotion - like food that tastes bland, missing those human touches, those final spices that make it special.