An AI Pondering?
What can't an algorithm imitate & what gives a story a heart?
Does AI make the value of human storytelling clearer than ever? Algorithms can generate content, but they cannot replicate the power, authenticity, or emotional impact of sharing lived experience. When a person shares their unique experience, it not only offers us insight into ourselves, in similarity or contrast, but also broadens our understanding of what it means to be human: another human experience woven into the tapestry of our collective understanding. Far from replacing us, does AI have the potential to encourage us to embrace what makes us distinctive, and to share our inimitable perspectives and stories with even more honesty, personality and nuance? Going further, does it offer us clarity on our gifts, the very ones that distinguish us from an algorithm and from any possible kind of imitation? So that we can value and interrogate what it really means to be human: what it means to hold access to an imagination, a heart and an emotional link between our kin and kind.
I asked AI:
What can't an algorithm imitate?
ChatGPT said:
Great question. Short answer: algorithms can simulate many surfaces of expression, but they can’t originate from a living body in a real world with stakes. Here’s a tighter breakdown:
What an algorithm can’t truly imitate
Lived stakes — Decisions that cost you something: reputation, relationships, sleep. Real consequences shape voice.
Embodied sensation — The weight of grief in your chest, the salt on your lips after a swim. Qualia aren’t in training data.
First-person memory — Not just events, but the misremembrances, gaps, smells, and private symbols that organize your life.
Moral courage — Choosing a position and carrying the accountability for it. Machines don’t risk, confess, or atone.
Vulnerability — Willingness to be seen before the ending is tidy; to publish while still unsure.
Taste formed over years — Idiosyncratic preferences earned by doing, failing, iterating—not sampled, but seasoned.
Context-in-relation — Being a daughter, neighbour, teacher, lover. Relational history gives meaning an edge AI can’t feel.
Place-sense — Weather, dialect, micro-rituals, land. The grain of a specific street at 5 a.m.
Subtext & silence — What’s meant but unsaid, negotiated in breath, glance, timing; social improvisation in the moment.
Original fieldwork — Interviews, site visits, experiments, archive digs. New inputs beat recycled patterns.
A changing self — Growth arcs, contradictions, converted beliefs. Models don’t repent or deepen; people do.
Care — The slow attention of editing, fact-checking, and protecting sources. Responsibility is a human art.
How to lean into your human edge (practical cues)
Start with a lived scene (time, place, body detail).
Name a real risk you took or a cost you paid.
Offer a specific contradiction you still hold.
Include place-bound particulars (sounds, slang, weather).
Add one fresh datum you gathered yourself (quote, photo, measurement).
Close with a decision you stand behind—own the accountability.
Use these as section headers or a sidebar. They make a strong case: AI can echo patterns; you can make meaning.
It can’t create new knowledge or invent new things. The models are trained by scraping existing knowledge from screens. It will never replace handmade art: the feeling of clay in one’s hands, the feeling of painting on paper with a brush… I asked one to create a Bach fugue; it couldn’t.
I tried it with creating a perfume. It produced one, but nothing that was innovative, interesting or different. It only knows what existing perfumers have created or commissioned. It can’t even explore archival sources, because they are closed off and the models have not been trained on them. It can’t take creative leaps, because it doesn’t understand the chemistry and has no experience of what things actually smell like. It isn’t a gardener.