Beyond the Algorithm - 1
Why Technology Needs Philosophy
2025-10-09 16 min
Description & Show Notes
Culture in the Age of Algorithms: Who Owns Attention?
Attention is the new currency. Cora unpacks how platforms capture, sell, and shape attention — and what that means for culture, creativity, and free will.
#GfAev #GesellschaftFürArbeitsmethodik #BrigitteESJansen
- Why algorithms are not just tools but cultural actors
- Technology and philosophy – an old but urgent relationship
- How ethics can guide innovation
- First outlook: responsibility in AI design
Literature
- Shoshana Zuboff, The Age of Surveillance Capitalism, PublicAffairs, New York, 2019.
- Luciano Floridi, The Ethics of Information, Oxford University Press, Oxford, 2013.
- Hans Jonas, Das Prinzip Verantwortung, Suhrkamp, Frankfurt am Main, 1979.
- Martin Heidegger, Die Technik und die Kehre, Neske, Pfullingen, 1962.
Transcript
Welcome to Beyond the Algorithm, the
podcast where philosophy, ethics, culture,
and marketing collide with technology. I'm
Cora, and today we're diving into a
question that affects every single one of
us. Picture this. You're scrolling through
Instagram late at night. You've had a long
day. You're tired. Maybe even a little
lonely. Suddenly, an ad pops up. Not just
any ad, the exact pair of sneakers you
looked at yesterday. Not only that,
they're in your size, your favourite
colour, and there's even a limited time
discount. You hesitate for a second, then
click. Was that your decision? Or was it
the decision of an algorithm that knows
exactly when you're vulnerable? This is
persuasion in the age of artificial
intelligence. It's subtle. It's constant.
And it raises questions that go far beyond
marketing. Is it persuasion? Is it
manipulation? Or is it something entirely
new? Today, we'll explore the ethics of
persuasion in AI marketing. We'll look at
where persuasion came from, how it's
changed, the dilemmas it creates, and what
it means for business, for culture, and
for philosophy. So let's go beyond the
algorithm. A short history of persuasion.
Persuasion is as old as humanity. Long
before algorithms, long before advertising
agencies, humans tried to change each
other's minds. In ancient Greece,
persuasion was considered a skill, even an
art. Aristotle described three pillars:
ethos (credibility), pathos (emotion), and
logos (logic). If a speaker could combine
these three, they could move a crowd.
Imagine standing in the Agora in Athens,
listening to a philosopher argue why a law
should be passed. His authority gave him
ethos. His passionate delivery gave him
pathos. And his reasoning gave him logos.
Fast forward to the Middle Ages.
Persuasion was in the sermons of priests,
in the edicts of kings, in the stories
told by wandering bards. It shaped not
just opinions, but entire worldviews. Then
came the printing press. Suddenly,
persuasion could scale. Pamphlets spread
revolutionary ideas across Europe. Martin
Luther's theses were not just theology.
They were marketing. The Protestant
Reformation was powered as much by the
printing press as by belief. In the 20th
century, persuasion became industrialized.
Advertising agencies learned how to shape
desire. Edward Bernays, nephew of Sigmund
Freud, applied psychology to marketing. He
convinced women to smoke by branding
cigarettes as torches of freedom. It
wasn't about the product. It was about
identity. Television took persuasion into
every living room. Think of Coca-Cola
commercials at Christmas. Think of
political debates broadcast to millions.
And then, the Internet. Suddenly
persuasion wasn't just one-to-many. It was
interactive. Personalised. Trackable. And
now, with artificial intelligence,
persuasion is no longer limited by human
creativity or human reach. It's automated.
It's optimised. And it may know you better
than you know yourself. So persuasion has
always been with us. But AI changes the
scale, the speed, and maybe even the
morality. Persuasion in the age of AI.
Let's look at what's different today.
Imagine YouTube. You go in to watch one
video. 30 minutes later, you're still
there, pulled deeper and deeper into
recommendations. That's not an accident.
That's algorithmic persuasion at work. Or
think of Amazon. You search for a book.
Suddenly you see three more suggestions.
People who bought this also bought that.
It feels helpful. But it's persuasion by
association. Or consider Spotify. The
Discover Weekly playlist doesn't just
reflect your taste. It shapes it. Over
time, your sense of what you like is
guided by what the algorithm offers. This
is persuasion powered by AI. Unlike
traditional persuasion, which was broad,
visible, and often slow, AI persuasion is
personalised: no two people see the same
feed. Real-time: messages adapt instantly
to your behaviour. Invisible: you often
don't realise persuasion is happening.
Predictive: it doesn't just respond to
what you want; it anticipates it. This
raises new questions. If an ad is shown to
you not because you searched for
something, but because the system predicts
you're about to want it, is that still
persuasion? Or is it manipulation? If
TikTok knows you'll watch 10 more clips
when you're restless, and serves them
up precisely in that moment, are you
choosing? Or are you being nudged, gently
but relentlessly, by code? The difference
may seem subtle. But over millions of
interactions, it reshapes behaviour,
culture, and even democracy. Case study,
Cambridge Analytica. Let's talk about one
of the most famous examples. Cambridge
Analytica. In the 2010s, this company
harvested data from millions of Facebook
users without their consent. With that
data, they built psychological profiles.
Then, during elections, they targeted
people with messages tailored to their
fears, hopes, and biases. If you were
anxious about immigration, you saw one
message. If you cared about the economy,
you saw another. If you were undecided,
you were shown carefully crafted content
to sway you. This was persuasion at scale.
Not a single speech. Not a single ad. But
millions of personalised whispers,
designed to influence choices that shaped
entire nations. Was it persuasion? Or was
it manipulation? That depends on where you
draw the line. But one thing is clear.
AI-powered persuasion is not just about
selling sneakers. It's about selling
ideas, values, even political futures.
Everyday examples. You don't need to look at
politics to see this. You can see it in
your daily life. Open Netflix. The
thumbnails you see are not random. They're
personalised. If you like romance, the
cover will show the love story. If you
like action, the same film might be shown
with an explosion on the thumbnail. The
movie is the same. The story you're being
sold is different. Scroll through
Instagram. Why does one post show up first
and another never appear? Because the
algorithm decides what keeps you engaged.
Order food online. The app nudges you
toward what's popular nearby or what's on
promotion. Your craving feels natural. But
the choice may have been guided long
before you opened the app. This is the
subtlety of algorithmic persuasion. It
doesn't tell you what to think. It shapes
what you see and what you don't. And in
doing so, it shapes what you believe you
want. The ethical dilemmas. So what are
the ethical challenges? Autonomy. Are we
free if machines can predict and exploit
our weaknesses? If you buy because you
were nudged at the perfect moment, was it
your decision? Transparency. In the past,
persuasion was visible. You knew when you
were watching an ad. But today, persuasion
hides in the feed. We don't always realise
what's influencing us. How can we give
consent to something we don't even
perceive? Power. Never before have so few
had so much influence over so many. A
handful of companies, Meta, Google,
Amazon, TikTok, control the flow of
persuasion for billions. This imbalance
raises not only business questions, but
democratic ones. And finally,
manipulation. Where is the line between
helping people discover what they want and
making them want what benefits you? The
philosopher Kant argued that morality
means treating people as ends, not as
means. But algorithmic persuasion often
reduces us to means, data points in a
system optimised for engagement, not for
human flourishing. Business, culture, and
philosophy. For business, persuasion works.
Personalisation sells. But effectiveness
without ethics is fragile. Brands that
manipulate too aggressively risk backlash.
In the future, ethical persuasion may be
the strongest form of marketing. For
culture, algorithms shape what we see,
hear, and share. They amplify outrage,
accelerate trends, and fragment public
conversation. We no longer consume the
same culture. We live in microcultures,
each shaped by personalised feeds. For
philosophy, free will is under pressure.
If algorithms predict us better than we
predict ourselves, what does it mean to
choose freely? Are we autonomous agents
or predictable patterns? This is not just
a marketing question. It is a question
about what it means to be human in a
machine-driven age. Let's pause and
reflect. First, persuasion is ancient. But
AI makes it faster, subtler, and more
powerful than ever before. Second, the
line between persuasion and manipulation
is blurring. Autonomy, transparency, and
power are at stake. Third, to go beyond
the algorithm, we must demand ethics in
persuasion. Because persuasion without
ethics is not just bad marketing. It's a
threat to trust, to culture, and to
freedom. This was Beyond the Algorithm.
Today we explored the ethics of persuasion
in AI marketing and the fine line between
influence and manipulation. If you enjoyed
this conversation, subscribe and share it
with someone who's ever wondered why their
ads feel so eerily personal. Next time,
we'll look at attention, the most valuable
currency in the digital age. Who owns it?
Culture, technology, or us? Until then,
stay curious. And stay beyond the
algorithm.