Greater Lyon in the era of post-truth
In a world swaying between alternative facts and conspiracy theories, the Greater Lyon metropolis is betting on preaching falsehoods to restore the truth.
Context
For emlyon business school, 2018
A speculative podcast episode, developed as part of the Disrupted Futures programme, dedicated to introducing the foresight thinking taught at emlyon business school.
F·r·ictions
Synopsis of this future
In a fictional future, an online radio station in Lyon hosts a sociologist – a researcher in Science, Technology, and Society at the University of Lyon – for an interview. This provides an opportunity to discuss how the fake in all its forms (falsehoods, forgeries, counterfeits) has transformed the very fundamentals of civic life, to the point of disrupting our relationship with the existential notion of truth.
Fragments of f·r·iction
— Host: My dear listeners, welcome to this new edition of your podcast “Tech Paf,” your go-to show where we talk about tech and what it throws at us. This week, you voted for us to revisit how the fake has infiltrated our lives. Fake news, deepfakes – so many terms, but more importantly, digital shocks that have shaken our very concept of truth. And if there’s one area that has taken the brunt of these fakes, it’s politics and, more broadly, democracy. To discuss this, I have the pleasure of welcoming Jérémie d’Orsac, a researcher in Science, Technology, and Society in the Department of Anthropology, Sociology, and Political Science at Lumière University Lyon 2. Let’s get started with Tech Paf!
— Jérémie d’Orsac (J.O.): First of all, if we want to understand our post-truth society, we need to put fakes in their historical context: the manipulation of falsehoods and lies has always served power struggles, long before the arrival of the internet and connected technologies. What we’ve seen with the spread of these technologies – or techniques, more accurately – is how massively they have scaled up the production and dissemination of fake content. From the late 2010s, fake news – information designed to poison public opinion – and deepfakes, those highly sophisticated digital forgeries, started impacting our lives. One clear, and regrettable, example of this impact was the role of fakes in the election and re-election of Donald Trump in 2016 and then again in 2024. By the early 2030s, fakes were no longer just kingmakers; they were dictating the rules to power, which was increasingly elected under contentious circumstances. The difference today is that these fakes are no longer crafted by human master-programmers but by artificial intelligence systems that generate and spread them algorithmically. The explosion of fakes and their manipulation, whether for fun or to further ideological agendas, has found an amplifier in the phenomenon of filter bubbles…
— Host: … Filter bubbles, just to remind everyone, are those echo chambers we find ourselves in when our social networks only show us content that aligns with our opinions and interests.
— J.O.: Yes, exactly! The personalisation of online content and its targeted dissemination have become increasingly effective, contributing to exactly what you described: locking users into a narrow worldview. Often, the content they see turns out to be fake. The result of this combination – lack of exposure to diverse viewpoints and the inability to distinguish between truth and lies – has led, and continues to lead, to extreme polarisation in public debate. It also causes communities to retreat into themselves, which is another step towards the end of the idea of the common good.
— Host: For your study, you focused on Greater Lyon. What did you find there that was so particular?
— J.O.: Through this study, we realised that Greater Lyon is both a typical case of what’s happened to local democracies and a unique example of resilience against the dominance of fakes. At first, the Lyon metropolis was hit hard by this transformation. It hadn’t anticipated it, and worse, its initial responses turned out to be false solutions. It had to face highly sophisticated disinformation, such as altered videos of elected officials’ speeches. Its first reflex was to strengthen public information policies, but this had little success in a world full of doubt and deception. A second response was to throw open the doors of power to citizens, attempting to convince them through action, since they couldn’t be swayed by communication alone. Greater Lyon thus implemented an ambitious citizen participation programme, supported by civic tech. However, the crisis of trust in institutions, combined with the massive contribution of ideologically driven bots to online participation, eventually undermined this effort. With successive changes in political majorities, we saw a retreat into a rigid and closed governance model – top-down, with everything decided at the summit and handed down. In the end, trust was broken in both directions. In the study, I point out that the constant doubt and denial fuelled by ubiquitous misinformation pushed institutions to the edge of destabilisation and even overthrow. Today, we know exactly – examples abound – who benefits from these agitators’ actions…
— Host: … You mean the so-called populist parties?
— J.O.: Yes, exactly. Populist parties were quick to adopt the doctrine of fakes as a key tool for spreading their ideas. And they weren’t the only ones. We’ll come back to this later, but Greater Lyon itself learned to fully exploit deepfake technologies and turn them to its advantage!
— Host: And what about the press? Wasn’t it supposed to prevent this situation?
— J.O.: For a long time, the press was considered a counterbalance, a guarantor of verified and objective information. But it’s now clear to everyone that the local press, especially in Lyon, failed to adapt to the post-truth era. This isn’t unique to Lyon’s regional press; it’s just that the post-truth era is diametrically opposed to the very DNA of journalism. Untangling truth from lies became an impossible task for journalists, even as the press’s business model struggled to evolve and public trust in it hit an all-time low. Unsurprisingly, the local press crumbled, afraid of relaying false information and unable to reinvent itself, leaving its readership alone to face the disinformation around it.
— Host: And how did the citizens of Greater Lyon react to this transformation of public truth? Were they complicit or resistant?
— J.O.: In a world where multiple versions of the same story collide and saturate both social media and the streets, the search for truth is no longer anyone’s priority. In some ways, everyone became complicit. If we step away from the case of Greater Lyon for a moment, we now know that part of the citizens’ salvation came from the same technologies that had previously been blamed for their ruthless efficiency in spreading falsehoods. From fact-checking, more and more citizens moved to glitch-checking – also known as fake-checking. Behind this neologism lies the use of artificial intelligence programmed to detect deepfakes by hunting for anomalies – those famous glitches – or any other trace left by the automated system that generated the fake, anything that might reveal its true nature. Nowadays, these widely used plug-ins accompany users as they browse via their augmented glasses or holophones. They’ve become as indispensable as ad blockers were back in the day, before online ads were made illegal.
In my opinion, these are far from trivial details: talk to anyone who has headed communications for Greater Lyon over the past decade, and they will all tell you that these countermeasures greatly inspired the metropolis’s response.
— Host: Speaking of which, I wanted to ask you about that. Can you tell us how Greater Lyon reacted to this erosion of its legitimacy? Because, let’s be honest, it’s pretty badass, to use an old expression no one really says anymore.
— J.O.: Greater Lyon’s response was not only “badass”, as you put it, but also bold and controversial. Greater Lyon simply decided to play by the rules – and with the rules. The city’s response was unprecedented in France. Greater Lyon chose to use the very technologies that had widened the gap between it and its citizens. It capitalised on the automated generation of fakes, whether in audio, images, or video. The city’s communications teams exploited – and continue to exploit – the latest advancements in deepfake technology to produce a new form of public information, which they call “public affirmation”: an acknowledgement that public information has become highly subjective, persuasive content. To disseminate this content, these teams rely on micro-targeting technologies, delivering custom-made messages to highly specific audiences with extreme precision. It’s essentially a shift marked by the abandonment of the last transparency policies in public life. The resources previously dedicated to transparency efforts were redirected towards this counter-offensive strategy, turning the practice of fakes to the city’s advantage.
— Host: What’s fascinating in this chapter of your study is how Greater Lyon turned the logic of fakes against the fakes themselves. To help us understand, how did this concretely play out in the city’s actions?
— J.O.: During my study, I identified four major post-truth communication strategies implemented by Greater Lyon.
First, there’s egocratic projection. In this strategy, the digital rendering technologies used to create deepfakes are applied to simulate and show how a particular public policy might influence an individual’s daily life. The rendering, whether visual or audio, features the individual – their voice and appearance synthesised for the occasion – who sees themselves projected into a possible future shaped by the implementation of that public policy. The primary goal is to convince the citizen to support the proposal – and, by extension, the institution – by allowing them to visualise the benefits of the policy at a personal level. In other words, it shines a spotlight on the direct and indirect benefits they could gain by backing the proposal. This process intrigued me because it taps into the concept of egocracy, where citizens prioritise their individual interests over the common good. It marks a shift from public information that was once intended to be factual and objective to a message that is emotional and subjective.
Next comes rumour-driven simulation. The idea is simple, though a bit twisted, one might say. The metropolis creates fakes that act as trial balloons to simulate and test the reception of a public policy or an upcoming decision. These fakes are covertly sent to micro-communities, selected according to a precise protocol. The fakes generate rumours, which the city then monitors and evaluates using social listening techniques. This allows it to gauge how this focus group perceives the public policy being tested. It simulates, on a small scale, what could happen if a particular local reform were implemented. The reactions observed provide valuable data that help inform the city’s decisions and anticipate citizens’ responses. In reality, this is nothing new. It’s almost the industrialisation of the old practice of “floating a trial balloon”, where politicians would slip a comment to the media to observe the reactions of their peers or the public.
Another striking strategy, for which I’ve found no equivalent elsewhere, is justification through alternate history. When the time for accountability comes, or when criticism needs addressing, Greater Lyon looks back instead of telling us about its future. It uses the principle of alternate history for justification. This rhetoric relies on the concept of uchronia – the rewriting of history based on the alteration of a past event. Using convincing deepfakes, the city shows citizens what, in its view, might have happened if a particular decision hadn’t been made or if a different course of action had been taken. By presenting this alternate narrative, illustrated through deepfake technologies, the city justifies its actions with speculative reasoning rather than objective facts.
Finally, all these fake-driven strategies need to find their audience. That’s the role of hyper-targeted micro-campaigning. Younger listeners may hardly remember the large-scale public ad campaigns, long since dismissed as archaic techniques and kept alive only in the nostalgia of a few printers. The different fake-driven strategies I mentioned earlier – egocratic projections, rumour-driven simulations, and justifications through alternate history – all rely on micro-campaigning to reach their target audience. Thousands of tailored micro-campaigns are distributed digitally to specific segments, according to the objectives pursued by Greater Lyon. These messages are sometimes relayed through channels that aren’t directly associated with the city, to maximise their impact. This approach risks not being recognised as “official” information and thus treads dangerously close to playing the same game as the other fakes.
— Host: And what conclusion do you draw from these strategies?
— J.O.: Many experts see this controversial shift in strategy by the city as a pharmakon – both poison and remedy. I agree with their conclusion, for a simple reason: by abandoning the principle of public information communicated equally to all, the city sacrificed the last remnants of the common good on the altar of social stability. Some are already calling it a new ticking time bomb for local democracy.
— Host: Thank you so much, Jérémie d’Orsac, for your insights. I’d like to remind our listeners about your stunning study, “Being a Democracy in the Age of Fake”, released as an interactive documentary and available on all major free streaming platforms. As for us, we’ll see you in three days for a very special episode of Tech Paf, celebrating the 10th anniversary of the Digital Rights for Animals Act. You can now return to your offline activities. See you soon!