When people talk about AI, they usually talk about speed, automation and efficiency. What they rarely ask is something much harder:
Can AI help communicate emotional truth?
And even more challenging:
Can it have a meaningful role in palliative care — one of the most sensitive human spaces there is? For us, this was not a theoretical question. Through our work with the non-profit initiative Superhelden fliegen vor, we were confronted with a reality most creative and tech teams never truly encounter: the emotional depth, fragility and dignity of accompanying young palliative patients and their families.
That changes how you think about communication. And it changes how you think about technology.
Why this question matters
In most industries, AI is judged by one thing:
Does it make work faster? In palliative care, that standard is not enough. Speed means very little if the result feels distant, cold or emotionally false. In a context shaped by illness, care and mortality, the real question is different:
Can technology support dignity, memory and emotional resonance without flattening the human experience?
That is a much higher standard. And it requires much more human direction.
AI is not emotional by itself.
It becomes meaningful only through human care, context and intention.
Mirko I Eliah GOGO
What we explored
Together with Superhelden fliegen vor, we looked at two things in parallel.
First, how AI could support internal workflows by reducing friction and saving time in a resource-sensitive non-profit environment. Second, whether AI could also help create emotional communication in a deeply sensitive setting — not by replacing human experience, but by opening a different narrative space where language alone sometimes falls short.
That question became real in the partially AI-generated short documentary "Love your Existence".
At its center is Nene, an AI-generated symbolic character representing the many palliative patients supported by the initiative over time. Nene was never meant as a gimmick or a substitute for a real person. The character was created as a vessel for memory, vulnerability and emotional presence.
That was the turning point.
Because suddenly AI was no longer just a tool for efficiency.
It became part of a form of storytelling shaped by tenderness, dignity and care.
So, can AI be emotional?
Not on its own. AI does not feel. It does not grieve. It does not understand mortality. But neither does a book or a CD on its own; each has to be filled with human experience. And all of them can support emotional storytelling when they are guided by people who do understand, who have lived the story. That is the difference.
In other words:
AI can support emotional communication. An AI avatar can become a figure that touches you through what it says and does. But it cannot replace human depth on its own.
What we learned
This work taught us three things.
1. Sensitive contexts require more human direction, not less.
The more delicate the subject, the less automation can stand on its own.
2. Symbolic AI characters can carry emotional weight.
If handled carefully, they can create space for empathy, reflection and dignity.
3. Innovation also belongs in care.
AI should not only be discussed in the language of business, scale and productivity. It also needs to be explored in spaces where humanity is tested most deeply.
Final thought
Most conversations about AI still revolve around efficiency. But if AI is going to matter culturally, it also has to prove itself in places where dignity matters more than speed. Our work with Superhelden fliegen vor showed us that AI can have a place there — not as a substitute for care, but as a tool in service of it. And that may be one of the most important questions we can ask of this technology.