
AI is Coming for You

Writing, 28 April 2024
by L.A. Davenport
Gottfried Wilhelm Leibniz, inventor of the modern binary number system, in a portrait by Christoph Bernhard Francke (public domain, via Wikimedia Commons)
Last week, I talked about the dark side of artificial intelligence (AI) lying hidden behind the glossy sheen and the seductive promise of transforming our humdrum lives in myriad different ways.

I could have stopped there and moved on to a topic that will likely feature in my next column, but something has changed all of that. Or rather, the true meaning of a work-related conversation has become chillingly clear, and I felt compelled to discuss it here.

As I mentioned before, there have been endless articles predicting that, while AI will bring hitherto unimagined efficiencies to our apparently inefficient professional lives (although one could question how efficient one really wants to be), this great streamlining is expected to lead to thousands, if not millions, of job losses.

The introduction and proliferation of automation has already led to the shedding of a vast number of blue-collar jobs, of course, and the ongoing impact of its inexorable march can be seen in factories and warehouses, and on building sites, up and down the land. But this time we are talking about white-collar jobs, those supposed safe havens for people with the alleged ticket to lifelong employment: higher education.

A recently updated article in Business Insider suggested that there are 10 main areas where the cuts will be felt the most:
  • Technology, affecting roles such as coders, computer programmers, software engineers and data analysts
  • Media, including advertising, content creation, technical writing and journalism
  • The legal industry, specifically paralegals and legal assistants
  • Market research analysts
  • Teachers
  • Finance, including financial analysts and personal financial advisors
  • Traders
  • Graphic designers
  • Accountants
  • Customer service agents

One can argue over the degree to which each of those sectors, and the roles singled out within them, will be affected.

However, it is clear from even a cursory glance at the way in which AI (which I have repeatedly argued, here and here, is not ready or even able to make these jobs obsolete) is already being used by companies to take over human tasks that the technology will be put to work to ensure that a significant proportion of people will be out of work within the next five years or so, perhaps even sooner.

I have spoken a few times on here about my day job as a journalist, specifically as a medical reporter, and I have to admit that I would previously have been surprised to see myself on Business Insider’s list.

As an aside, my previous employment was as a technical writer and content creator, and I freely admit I can see a thousand ways in which half of the day-to-day tasks that make up those two roles, and consequently half the people doing them, could easily be made redundant in the blink of an eye by some fancy AI tool.

But journalism? Yes, I know there are a thousand websites out there that either reproduce press releases verbatim and call it a news story, or scrape information from other sources to cobble together second-rate dross, simply to create ‘content’ for whoever might be tempted to read it. What they offer could be, and probably already is being, done just as easily by an AI tool.

But real, actual journalism? The act of reporting the news? Done by a machine? Impossible!

I am aware that ChatGPT can, in response to a command, cobble together a meaningful-looking text by scraping information from other sources, but digesting and interpreting a primary source to turn it into an article aimed at a specific audience is, by definition, a creative undertaking.

Just to be clear here, we are not talking about the kind of reporting that involves interviewing one or more people. That, I hope, will remain forever beyond the ken of AI. What I am talking about is simply the writing of a news story from purely text-based sources.

While it might sound straightforward to the casual observer, it in fact requires a working understanding of the materials, the judicious selection of information and the balancing of different elements to develop a compelling narrative for your audience, of which you have a deep knowledge; a narrative that doesn’t simply rely on hoary old cliches but brings something fresh to the topic.

That’s the theory anyway, and done well it is far beyond the current capabilities of AI.

That gets to the heart of the nature of the intelligence behind AI. As Sasha Budd argued in a recent issue of Prospect magazine: “Human intelligence is not just about description and prediction based on existing data. It is about explanation.”

Human intelligence, she notes, seeks “answers to questions that we ourselves have posed, questions that originate in the things we care about.”

“None of this is true of AI in its present form.”

She points to a 2023 article in the New York Times by Noam Chomsky and others, which notes that while AI programs “can be helpful in computer programming, for example, or in suggesting rhymes for light verse,” it is known from linguistic science and the philosophy of knowledge “that they differ profoundly from how humans reason and use language.”

“These differences place significant limitations on what these programs can do, encoding them with ineradicable defects.”

Chomsky and colleagues say that it is “at once comic and tragic, as [Jorge Luis] Borges might have noted, that so much money and attention should be concentrated on so little a thing—something so trivial when contrasted with the human mind.”

They continue: “The human mind is not, like ChatGPT and its ilk, a lumbering statistical engine for pattern-matching, gorging on hundreds of terabytes of data and extrapolating the most likely conversational response or most probable answer to a scientific question. On the contrary, the human mind is a surprisingly efficient and even elegant system that operates with small amounts of information.”

Consequently, “such programs are stuck in a prehuman or nonhuman phase of cognitive evolution. Their deepest flaw is the absence of the most critical capacity of any intelligence: to say not only what is the case, what was the case and what will be the case—that’s description and prediction—but also what is not the case and what could and could not be the case.”

And that most fascinating of observations is in fact what any parent notices when they spend time with their child. Children seem to know instinctively that there is so much to be gained by being deliberately wrong and then exploring the consequences, both in terms of how that feels to utter and to experience, and in terms of the effect on one’s interlocutor.

That, to me, is true intelligence, and it is demonstrated by babies who are barely able to crawl; they continue to explore the concept of instant negation and its consequences for the rest of their childhood. Indeed, it shows that what makes us human, and AI programs so tiresomely limited, is our ability to question and invert, not simply to find out what is.

But back to my career as a medical reporter. It is worth stating here that I am highly specialised in my role. I commonly write complex articles and features that require years of experience and a depth of knowledge that allows me to switch from one disease area to another with relative ease, as well as an ability to interview world experts in such a way as to be able to explain their theories and findings to doctors and other medically qualified professionals, all in under 1000 words.

To be honest, I felt safe from the mission creep of AI up here in my ivory tower, along with the few dozen other people in the world at my level in medical journalism. And yet I didn’t feel so comfortable when I recently found myself in a conference call with the section editor of a leading medical news agency.

They explained that the more day-to-day stories based on text-only sources I talked about earlier, which form the bread and butter of our work and a notable chunk of our income, had initially been farmed out to a company in Mumbai but were now going to be entrusted to an AI tool, as soon as they could work out how to optimise the inputs.

I reflected on this for a moment, and wondered what it would mean for my future earnings if that chunk of work was excised, to borrow a surgical term.

But before I could go any further, the section editor said that the consequence was that their staff writers, who had previously written the majority of those day-to-day stories, would now be free to do the more complex stories that, to borrow a culinary phrase, are my signature dish.

Oh, I see.

It dawned on me that the use of AI to take over this more mundane but still highly valuable writing work is not going to cannibalise the jobs of those who previously wrote those stories, leaving them out on their ear. Rather, those people, who are much less qualified and experienced than me and my colleagues, will be given my work, presumably without getting a raise.

And so those cheaper people will end up doing our specialised work, and it is the specialists of today, whom everyone, me included, had assumed would be protected from AI’s onslaught, who will end up losing their careers.

While I leave you to chew on the implications of that, I will simply note that this week I added the final two parts of The White Room, a short story from my collection No Way Home, to this site (here and here).

I should also note that the countdown to the launch of Escape, The Hunter Cut, continues, and it is now only just over a week before it hits the virtual shelves, which will be celebrated by a week-long blog tour. I have to confess that I am somewhat giddy with excitement!