What AI cannot taste
An outsourced life
ML Issue 49 - A weekly takeaway for creatives 🥡
“Feeling insignificant because the universe is large has exactly the same logic as feeling inadequate for not being a cow.”
— David Deutsch,
The Beginning of Infinity: Explanations That Transform the World

Mona’s notes
One evening last week, I wanted to cook a cute, savoury Gujarati snack: dhokla.
Normally, I’d ring my mum, but she and dad are currently in India, escaping the London rain.
Out of habit, I asked AI instead of searching. It replied:
“Here’s a clear, reliable recipe for traditional Gujarati dhokla (the soft, steamed kind — not fried).”
Something felt off.
I realised I didn’t want an AI recipe; I’d asked out of muscle memory. What I actually wanted was to choose one myself, from a website with photos, comments, reviews. Somewhere I could sense reliability through texture and context.
Because really, what does AI know about how dhokla should taste? It has likely never eaten it.
I know people who would trust the AI recipe and cook these cute yellow, spongy snacks without a second thought. But in that moment, it became clear that I was relying on language as a proxy for experience, rather than experience itself.
I wasn’t consulting a cook. Instead, I was interacting with a generative Large Language Model (LLM), which produces new text based on statistical patterns learned from vast amounts of data. It isn’t retrieving a definitive recipe or verifying correctness; instead, it’s generating something believable, and this matters.
The model produces something that resembles knowledge, but what it is actually doing is recombining existing language (recipes, descriptions, instructions) into a new arrangement that feels coherent. It does not know which version matters. It does not know which deviations are innovative and which are ineffective.
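If you want a feel for what "recombining existing language" means mechanically, here is a toy sketch in Python. It is not how a real LLM works (those use neural networks over billions of examples), just the simplest possible version of the same idea: predict the next word purely from statistics of which word followed which. The three-line "corpus" is invented for illustration.

```python
import random

# Invented toy corpus: three recipe-flavoured sentences.
corpus = (
    "steam the batter until soft . "
    "steam the dhokla until spongy . "
    "temper the mustard seeds in oil ."
).split()

# Count which words follow each word in the corpus.
follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, []).append(b)

def generate(word, n=6, seed=0):
    """Emit n more words by repeatedly sampling a plausible successor."""
    rng = random.Random(seed)
    out = [word]
    for _ in range(n):
        word = rng.choice(follows.get(word, ["."]))
        out.append(word)
    return " ".join(out)

print(generate("steam"))
```

Every sentence this produces is fluent-looking and built entirely from fragments of its training text, yet the program has no idea whether the resulting instructions would cook anything edible. That gap, between plausible arrangement and verified experience, is the one the essay is pointing at.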
Understanding AI as a material
If you use AI within your artistic practice, it’s worth understanding what it actually is, structurally and materially, especially if you’re relying on its outputs.
AI is not an authority, a witness, or a participant. It is a powerful language engine, fluent in description, but detached from lived experience.
When we use AI in our practice, whether to generate text, images, concepts, or structures, we are not collaborating with another consciousness. Rather, we are working with a mirror made of language. The output reflects the archive it was trained on, not an encounter with the world.
In other words: AI can reference culture, but it cannot participate in it.
What changes is the authorship of a work. The artist is no longer the sole creator of form, but an editor of probability and a curator.
To work responsibly with these tools, we as artists need to understand what we are working with: it is not intelligence in the human sense, but pattern recognition at scale. A system optimised for believability rather than truth.
I believe that this is perhaps the new critical skill today: knowing when language has detached from experience. Knowing when to stop generating and start tasting.
— Love Mona x
Q: What colour is Sunday?




Monday = red
Tuesday = green
Wednesday = blue
Thursday = purple
Friday = black
Saturday = orange
Sunday = yellow
🫀