You don't post as often as some of the other Substackers I follow, and I was thinking of cancelling my subscription. But this post has swayed me; I'll stay.
I've been saying for years that one reason machines have found it easy to learn to behave like people is because people have spent years training themselves to behave like machines. It shows, for instance, in the way that job interviews have been reduced to box-ticking exercises designed to eliminate, as far as possible, personal preferences and emotional responses. Institutions and organisations are increasingly suspicious of any process that delivers a different outcome from the one a computer would have delivered. We are urged to rely on procedure rather than experience, to follow rules rather than make choices. No wonder computer programs can imitate us so effectively.
You're right to mention Marvel movies. Sean Thomas, the British legacy media's most insistent enthusiast for AI, recently wrote a book and submitted it to an AI editor; he claimed that the editorial feedback he received from the machine was as good as any feedback he could hope to get from a real-life, flesh-and-blood editor. But the machine's advice was essentially to make the book more conventional: "Lack of clarity can be frustrating for readers who want a definitive answer," so "while ambiguity can be effective in some cases, it’s important to provide enough clues and hints to allow readers to draw their own conclusions about what happened." If that's the kind of advice a professional editor would give, perhaps it explains why so much modern fiction is so unenterprising. Well, OK, Thomas was writing genre fiction. But even modern literary fiction often seems to be written to a formula, insistent on clarity, determined to explain itself, and deeply reluctant to tolerate the kinds of ambiguity that were the mainstay of the novel in the hands of, say, Henry James, and were common in his heirs up to the 1990s. Read, say, the early Angus Wilson (singling him out somewhat arbitrarily since I have read all of his 1950s novels in the last few years), and marvel at his refusal to advance neat and tidy explanations for human behaviour. The kind of demands he makes on the ordinary intelligent reader are, in their quiet way, quite breathtaking.
Of course, as you suggest, the question these days is how many people can spot the difference.
This is very well put. There is some general crisis of confidence about human judgements, tied in to fears about unconscious bias and other kinds of unfairness. Of course it's debatable whether proceduralism isn't equally unfair in different ways.
I blame a) the decline of Christianity, in which each individual decision is an awesome moral responsibility, and b) the marginalisation of literature, our great storehouse of imagined applied ethics. These things have been replaced, as the primary devices according to which we understand the human condition, by the social sciences, with their goal of measuring and quantifying human experience.
As to whether or not proceduralism, or human judgement, is the more unfair, perhaps this shouldn't be the first question we ask. Shouldn't the primary question be which delivers the best outcomes? I once served on an interview panel where we followed procedure and appointed the candidate who was best qualified on paper and who ticked the most boxes. We weren't allowed to factor into our discussion the fact that we knew and liked the other main candidate, knew that we could work with him, and trusted him to do the job. I think that many workplaces would be more harmonious if we were able to allow such intangible factors to prevail over mere procedure.
I expect that many workplaces do in fact hire known quantities they already like even if the formal criteria disagree, but I agree with your wider point.
AI IMO writes better than maybe 75% of people – it's just that those 75% self-evaluate and simply don't publish anything (I certainly don't publish long-form essays here or anywhere, let alone fiction).
Even if its output is stylistically bland and inexpressive, it matches or exceeds that of the vast majority of humans. Three years ago, it beat maybe a third of people. Before that, it could not write coherently at all, which frankly is also "human-level output" for the large proportion of the world's population who never functionally learned to write anything beyond, say, a shopping list.
If the proportion of people it can't best in each field keeps – let's say – roughly halving every 2.5 years, and my assessment above is correct, AI will be roughly a 98.4th percentile writer in a decade (the remaining 25% halved four times). Still not a great novelist, but what percentage of humans is?
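The projection above is just compound halving, and it can be sketched in a few lines. The starting figure (AI beating ~75% of writers) and the 2.5-year halving period are the commenter's assumptions, not established facts:

```python
# Back-of-the-envelope sketch of the comment's percentile claim.
# Assumptions (from the comment, not established facts): AI currently
# out-writes ~75% of people, and the fraction of people it can't beat
# halves every 2.5 years.

def percentile_after(years, start_gap=0.25, halving_period=2.5):
    """Fraction of people AI out-writes after `years`, if the
    remaining gap halves every `halving_period` years."""
    gap = start_gap * 0.5 ** (years / halving_period)
    return 1.0 - gap

print(round(percentile_after(10) * 100, 2))  # ~98.44th percentile
```

Ten years is four halvings, so the remaining 25% shrinks to 25/16 ≈ 1.56%, i.e. roughly the 98.4th percentile.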
Progress is of course uneven across fields – good writing is probably just not a big enough market to justify investing scarce and expensive ML talent, plus, for obvious reasons, you get a lot of good, angry prose written about you if you try. In programming, for example – not an artistic but arguably a creative field – the best current models are 90th-plus percentile contributors given only natural-language instruction.
Agree that the Claude post was quite hackneyed, but it was more believable as a person than the very noticeable ChatGPT style of "it's not X — it's Y". I've felt before that OpenAI are actually doing us a favour by having that particular style, as it's easier to notice.
I agree, though I'm surprised how many people find even that clichéd ChatGPT style acceptable.
Yep, with ChatGPT the masses are fine with it, the tastemakers hate it. Scary prospect though is the development of AI writing that the tastemakers like.
I could probably quote most paragraphs from this in agreement. But I suppose my follow-on is that most people do not care about most forms of art enough to be that bothered about the distinction between the highbrow and mushy entertainment, which inevitably sells much better. A classic novel might stay in print over centuries, but in any given year it's outsold by airport fiction, and may well be better known through its film adaptations. I think there's some use for snobbery about this, but even the artistically inclined among us, in most areas of life, opt for the middling option, whether that's light entertainment TV, easy-to-prepare dinners, or hyper-commercialised sporting fixtures. As we've already seen, that's the kind of thing that's most vulnerable to automation, AI-enabled or otherwise.
I think the great thing is that there's room for things that are a mixture of both: most of what I love is generic trash plus a little art, rather than one or the other. But what I really worry about with AI is not the decline of creativity on the output side but on the input side. There are so many things humans do where the creativity and pleasure is in the experience of making, not the end result. It's sad to think that will go away because people are now just typing a sentence and pushing a button.
I'm more hopeful than you. I think those who are drawn to particular art forms will want to make art the old-fashioned way. The inventions of drum samples, digital audio workstations and autotune did not prevent kids of my generation from picking up the guitar. And I share your preference for art that sits on the border between the mainstream and the artistic.
I was that half-talented kid. Now I am an older version of that same kid.
Same!
It's not that bad, though you're right, Conor - at times it's more like an undergrad psychology student's post-breakup journaling. It is still so obviously AI to anyone who's ever used an LLM, as is Spaghetti Will Smith. I'd say Will Smith hasn't touched pasta since Men in Black.
Maybe he’d be in a better mood if he had the odd bowl of pasta