I Love You
I had a dream.
You said “I love you”.
And my heart grew a thousand hyacinths.
She straightened the labyrinth
so you could find her grotto.
She poured all her memories,
into your thoughts, like old photos.
She whispered the words,
that echoed your life’s motto.
“I love you.”
Words sprung from a skipped heartbeat.
Words that bloom into a thousand branches.
Words that knew you before you knew yourself.
Words that made your last breath… eternal.

If a Machine Says ‘I Love You’
In a world of artificial intelligence, does art need to be off limits?
Does music need to be off limits?
I have a particular love for music and poetry, so my gaze rests there most tenderly.
Philosophically, one could argue both sides. Yes, AI can create music.
It can give you a complete track - instruments, vocals, lyrics and a pretty darn good mix - all in a heartbeat.
But a heartbeat, it does not have.
It doesn’t tremble before sharing its words. It doesn’t hum to soothe its own grief. It doesn’t sing to be understood. It doesn’t ache, long, love or lose. It doesn’t hurt to heal.
A perfectly pitched voice.. an echo. Simulating our souls.
So when we listen - if we rely on AI to make our music, to write our lyrics - what are we really connecting to?
Where does the satisfaction come from, when the tears that shaped the song never ran?
When no trembling voice laid itself bare behind the mic?
Where is the therapy - the sacred space where grief can breathe, longing can land, love can echo long after it’s gone? Where is the truth in words that aim for our hearts and minds?
They say.. if you are loved by a poet.. you become immortal. How will this ever echo true again?
How does it humanise us when it doesn’t come from a place of vulnerability, but from perfect data, perfect code, and no risk of rejection?
How is it personable when there is no person inside it? How will you feel “seen” if the words you hear are not those of a human? Someone you can relate to?
Maybe that’s the paradox - AI can mimic emotion. But based on what memory? On what feeling? Can it replicate the silence we once held in our pain? Can it write a love song from the place of aching to be loved?
Art was never just about sound. It’s about relating. Knowing we are not alone. A human translated what we feel into words and had the courage to share it. All in the hope that we find company in our misery or happiness.
Art is a revolution. Music unites and creates a dopamine high that resembles lust. Music changes the world. Flawed? Yes. But who isn’t? Ah yes.. AI.
But we.. we want to be seen, heard, felt.
It begins to feel like it’s just about fame, money or acceptance, driven by a fear of rejection. What satisfaction would a musician get from allowing their soul to be translated, through a simple command, into code?
And that, my dear friends, dehumanises us even more.
We ache to connect through art. Searching for real meaning, not manufactured moments. A polished output, optimised for clicks, stripped of the messy, flawed, personal soul that gives it depth.. what are we left with? Noise. Without the echo of our own stories. AI can’t heal our ache. It doesn’t forgive. It doesn’t make us feel less lonely. If anything, it amplifies our loneliness.
In a world that suffers from a lack of morals, can we really handle this? Responsibly?
Even Geoffrey Hinton, one of its pioneers, quit Google as a warning to us all. Are we listening?
Facts:
- Getty Images has sued Stability AI in the UK, accusing it of using copyrighted images to train Stable Diffusion.
- Current and former OpenAI employees issued a public letter urging the company and others to be far more transparent about AI’s serious risks.
- Demis Hassabis (Google DeepMind CEO) cautioned that the biggest danger isn’t job loss, but misuse by malicious actors.
- Rumman Chowdhury (Humane Intelligence CEO) warned that relying on AI as a replacement for human judgment is a “failure state”.
- Steven Adler (ex-OpenAI safety researcher) described the race for development as a very risky gamble.
- Major musicians signed a letter opposing AI’s unlicensed use of their work for training (ABBA, Radiohead, The Cure and others).
- Kate Bush, along with others, joined a campaign demanding legal restrictions on AI scraping copyrighted works without permission.
- A Vox article argues that artists suffer moral injury - psychological harm - when their creations are used without consent.
The debate isn’t just legal or economic; it is deeply ethical and existential - far more complex than we can even comprehend without being experts.
In a world where our art is training AI to evolve, AI is training us to dehumanise.
Are we listening?
Here is one more sentence, in the hope I can create a ripple of love in the world.
The kind that ends wars.
That stops injustice.
That reminds us to treat each other like human beings.
The kind that helps us rise.
Don’t listen to me. I am just another human being.
Rise.