Jimmy Francis, Founder and Creative Partner, Interesting Times

The early internet was sold as a techno-utopia – a free space for expression, democracy, and discovery. In practice, it was chaos: dial-up tones, suspicious chat rooms, and a sense of wonder that anything was possible. The promise was decentralisation and access. The reality became banner ads, flame wars, and AOL CDs in the mail. Still, the dream was sincere: that connectivity could cure inequality and alienation. Sound familiar?
Underlying this techno-utopianism was the belief that if enough self-interested individuals connected through digital networks, a harmonious society would naturally emerge. It’s the fantasy of a social order without politics, just code. But without structure, accountability, or intent, what emerged was a breeding ground for monopolies, echo chambers, and bad faith actors. The system didn’t balance – it amplified.
Today’s artificial intelligence (AI) carries a similar promise: liberation from drudgery, enhancement of creativity, and frictionless living. But just like the internet before it, this vision glosses over who controls the tools and who gets left behind. AI is pitched as a benevolent helper, yet it’s trained on stolen labour, built into closed ecosystems, and weaponised to replace rather than uplift. It’s less messiah, more polite automation overlord.
We’ve long loved the myth of the self-correcting system – from markets to ecosystems to algorithms. But this dream lets us off the hook of actually making hard choices. ‘Let it run itself’ often means letting inequality metastasise until the crash. The promise of stability becomes a licence for complacency. Chaos becomes the update schedule.
The current myth is that AI is neutral, objective and rational. That it will simply ‘know’ what’s best – faster and fairer than we ever could. But intelligence without context is just overconfident maths. AI doesn’t ‘know’; it predicts. And what it predicts is often a bland, biased, regurgitated version of the past. Automation without ethics is just scaled-up indifference.
Generative AI doesn’t invent the future – it repackages the past. It can only remix what it’s trained on, which is mostly what’s already been popular. It can give you ten thousand versions of what has already worked. But if you’re looking for something new, something never before seen, you won’t find it in the latent space of yesterday.
Creative work has quietly shifted from communicating with people to pleasing platforms. We design for feeds, not feelings. Aesthetic choices are filtered through engagement stats, SEO heatmaps, and brand alignment decks. The algorithm is now the client, and it pays in dopamine, not dollars. The question is no longer ‘Is this good?’ but ‘Will it trend?’
Borrowing metaphors from nature sounds enlightened – until it gets weaponised to avoid accountability. Ecosystem thinking became a way to say, ‘it’s all connected,’ while ducking the fact that someone is still in charge. It romanticises balance but forgets that nature also has parasites, extinction events, and fungus that eats your brain. It’s not always the best role model for governance.

The fantasy of the flat network was that power would dissolve in the light of connection. No more bosses, just nodes. But power didn’t go away – it just got less visible. Platform moderation, algorithmic curation, data harvesting – these are forms of control too. The 21st century didn’t eliminate hierarchies; it made them look like user experience (UX) design.
In many AI systems today, it’s unclear who’s accountable. The outputs are complex, the training data murky, the goals vaguely defined. And so decisions with massive impact are made with plausible deniability. ‘The model decided’ becomes the ultimate corporate shrug. We’ve built machines of immense power and given them a cosmic ‘vibe check’ as a moral compass.

You’re feeding the machine every time you post, like, share, and create. But how much of what you make is shaped by what the machine trained you to want? Are you leading the model – or are you just the meat in the loop? The line between creator and dataset is getting blurry. Time to redraw it – with thick, angry lines.
Once we decided humans were just biological machines running selfish code, it became easier to justify systems that strip away empathy and nuance. In this reductionist worldview, there’s no room for mystery, only metrics. But when people are treated like programmable objects, we stop asking bigger questions – like what makes us more than the sum of our code? If human nature is selfish, then UX becomes manipulation. Many apps are now dark patterns disguised as ‘frictionless’ design.
The world of creative tools starts to mirror slot machines: tap, swipe, get a dopamine hit. But what if delight isn’t something that can be A/B tested? What if joy isn’t the same as engagement? When we let these systems define creativity, we risk reducing imagination to a probability curve. Art becomes prediction. Vision becomes variation.

Remember when the internet was supposed to decentralise power? Instead, it created a new class of gods: cloud landlords, platform kings, surveillance merchants. The irony is almost poetic – our dream of digital freedom ended in five companies owning everything. Now AI is repeating the same move, but faster.
AI doesn’t just replace tasks – it reshapes behaviours. Tools that claim to liberate us often subtly retrain us to behave in ways the system prefers. You’re not being freed, you’re being formatted. It’s not autonomy – it’s ambient coercion wearing a UX smile.
We used to design for people. Now we design for APIs, KPIs, and LLMs. Every headline, every layout, every beat is reverse-engineered for machine parsing. We don’t need art directors – we need machine whisperers. But the machine doesn’t care about meaning. It cares about click-through rate. Every time we optimise for metrics, we risk draining meaning out of the work. What’s measurable becomes what’s valuable. But storytelling isn’t just about retention graphs – it’s about resonance. Meaning doesn’t always trend, but it lasts. Optimisation is cheap. Significance isn’t.
You’re feeding a model that might soon call itself ‘creative’ and pitch better than you in meetings. We’re all unknowingly training our future rivals. It’s tempting to let the tools lead. They’re fast, smart, tireless. But the moment we let them define the creative process, we lose something essential. Machines don’t have intuition. They don’t feel risk. They don’t fear failure. You do – and that’s where real art lives.
Art that matters is rarely optimised. It’s messy, political, human. It gets under your skin. It doesn’t convert – it confronts. Algorithms fear subjectivity because it can’t be modelled. That’s your edge. Your refusal to be efficient is a revolution. AI is a master of the average. It gives you what’s been proven, what’s worked before. But true creative work isn’t repeatable. It’s disruptive. It’s the thing that doesn’t belong.
Machines can remix surprise – but only you can make it real. Dreams aren’t predictions. They’re ruptures. They’re absurd, poetic, terrifying, beautiful. They don’t make sense – and that’s their power. AI can simulate thought, simulate style, simulate connection. But it can’t feel awe. It can’t wake up in the middle of the night in a panic or a poem.

“We are not watched over by machines of loving grace – we are watched over by machines built by people, for profit, with power.” I did not write a single word of this article – I made AI write this. Well, except that last sentence.