In the New Statesman, as well as here on Substack, John Oxley has written about how political campaigns are now entering the “AI age”.
It’s an interesting piece, arguing that the first campaigns to use AI effectively will reap real benefits (i.e. win). In this view, they’ll target better, personalise more and save money on production, resulting in the first “AI election”.
Perhaps. But it’s far from guaranteed, so we thought we’d take the opportunity to set out the practical issues that will get in the way, and that might slow adoption to a crawl.
1/ There will be reputational costs for campaigns using generative AI, at least initially. Using synthetic images, video or text (even if you’re open about it) will open campaigns up to questions about their authenticity. No one will celebrate the fact that the campaign was terrifically efficient, but plenty will call it fake, and someone will have to spend time explaining why the campaign did it. (At this point, please reflect on the fact that the British press has spent the last two weeks talking about a single tweet from the Labour Party.)
2/ The value of AI for targeting and personalisation is overblown. Big campaigns (and companies) have had access to lots of (often poor quality) data for years. Garbage in, garbage out, as the saying goes. But that’s not the only problem. Perhaps AI will overcome poor quality data, discover new insights about people’s preferences and motivations, and thus personalise messages to an unparalleled degree. But these won’t be the only messages people see. Their TV will be playing in the background, their social media feeds scrolling endlessly and the WhatsApp notifications popping up, one after another. Why should AI-optimised messages suddenly cut through and prove unusually persuasive? We remember “Take Back Control”, “Get Brexit Done” and “Yes We Can” because they’re catchy and simple, not because they’re personalised.
3/ AI doesn’t deliver campaign messages. At the moment, personalised campaign messages will be transmitted via digital means (social media, emails, messaging and so on) because that’s where the audiences for them are. Politicians will still be at the whim of social media algorithms, ad delivery mechanisms, spam filters, moderators and more. AI might free up resources, or get you a slightly better open rate for your email fundraising messages, but the services a campaign uses to reach people will continue to be a limiting factor. AI doesn’t (yet) solve the “last-mile problem”.
4/ If a campaign employs fully automated, mass personalisation, it also gives up control. Most of the campaigners we’ve spoken to about AI are extremely sceptical that they’d ever want to be in a situation where they couldn’t stand behind every single combination of message, visuals and targeting that goes out in the name of their campaign. The AI did something weird? Go clean up its mess.
5/ There’s no reason to think your campaign will be any good at generative AI. At this very moment, millions of people are typing random phrases into a ChatGPT box and hoping for the best. “I’m sure I could have got it to do exactly what I wanted with a better prompt”, they say (note: everyone is saying that). We’re currently in the f**k around and find out stage, and we’re finding out that ChatGPT is about 75% as good as “actually good”. Can campaigns expect that the digital staffers ordered to “get us using AI” will do any better with these tools?
6/ The current generation of tools is designed to prevent political use, and may become even more restrictive over time. OpenAI, Google, Microsoft and Midjourney are all desperate to avoid the kinds of controversy that come with political misuse of their tools (see Puffa’ed Popes and Presidential Perp-walks). When Meta launches its generative AI tools for advertising, it will try to do the same. As such, they’re all putting guardrails around political uses of AI and are extremely sensitive to people finding ways around them. Even if a campaign can get generative prompts to do useful things, those same prompts might not work tomorrow. It’s called “prompt hacking” for a reason.
7/ Campaigns will get ripped off by AI charlatans. Soon after the 2024 US Presidential election finishes, politicians across the rest of the world will start to get calls from American consultants who “used AI to raise money, mobilise supporters and persuade voters”. They’ll promise the world for a surprisingly high price, but with no guarantees and some hard-to-verify ethical claims. If campaigns are very lucky, they’ll just be a bit overhyped, and they won’t end up with a Cambridge Analytica-type scandal on their hands. It’ll be a coin toss.
8/ The shiny penny gets dull over time. For a while in 2007/8, people thought they were getting emails written by Barack Obama. Now, the same people routinely describe all political email as “spam”. In 2016, Trump used Twitter and Facebook to further amplify his already loud voice. Four years later, people were tired of hearing it. The AI chatbots are coming, but there will be too many of them, they will become annoying, and voters will work out how to block them. The best a campaign can hope for is to find a good use for AI in the tiny window before people hate it. A winning “AI campaign” will be the one for which that tiny window aligns with polling day.
9/ Campaigns should demand more transparency from AI toolmakers. Even the most unenlightened campaigns should be asking for the verification programmes and other safeguards that the social media platforms brought in after the Russian interference of 2016. Given the experience of the last seven years, waiting to add these safeguards until after the problems emerge is the wrong way to go. Bad actors will be a problem, and the fact that mechanisms for separating good from bad don’t really exist yet should make the mainstream players wary.
10/ Regulators will be sniffing around. The Italian data protection regulator has already banned ChatGPT in Italy. In the UK, we still have the GDPR, and it’s particularly hot on “special category data”, which includes people’s political opinions. If a campaign starts using AI to reach millions of people each week, it should expect questions - lots of them, asked using all of the tools the law provides (subject access requests, investigations, audits and more). So the “smart” campaign saved money on graphic designers, data scientists and email writers? Well, it then spent it on compliance officers and lawyers instead. Nice.
11/ Winners write history. Every time you read about the brilliant techniques a campaign used, you should also note that it won. No one writes much about the losers - survivorship bias is the norm in “what just happened in that campaign?” coverage. Furthermore, no party is going to make up a 15% poll deficit because it was “good at AI”. Even if (and it’s a big if, for the reasons above) there’s a first-mover advantage for a campaign using AI in a 51-49 election, attributing the victory to its use of a particular set of tools will be incredibly hard. If you read otherwise, it’s because the winners wrote it.
None of this is to say campaigns won’t use AI in the coming years (or even months). They’ll definitely try, not least for the splashy press it will get them. For campaigns with lots of resources (like the 2024 Democrats), there’s little cost to trialling AI tools - after all, with all that money sloshing around, they won’t really have to worry too much (add in the fact that the US press is generally less sceptical about the role of tech in politics, unless it happens to help Trump get elected again).
For everyone else, everywhere else, where money and time are tight, and choices about what to do are therefore genuinely difficult, there are plenty of reasons to weigh up the costs and benefits. The “AI election” may feel inevitable, but it won’t be plain sailing for the campaigns that want to be “first”.
We’ll be back shortly with our normal service - an update on what we’re seeing in the local election campaigns ahead of May 4th.
Until then,
Team FD @ WTM