Lessons from our Experiments with AI Image Generation

We recently created four AI-generated images to promote our work, but they didn’t get the vote of confidence we were hoping for.

Although we created some remarkably impactful images, our supporters felt deceived.

We had just completed a round of qualitative interviews with members of our Nicosia Dignity Centre in Cyprus to understand our impact. These interviews were anonymised to encourage people to be open about their opinions. We know that faces are more engaging, so we thought it was a great opportunity to use AI-generated images: we could protect our service users’ anonymity and still have powerful, positive images to illustrate their points.

Wrong. It seems the uncanny valley is still undermining our intentions and people just don’t trust these images. They make people suspicious. Here are the main objections we heard from our supporters:

1. We are being inauthentic

The biggest issue is inauthenticity. That goes both for what we are portraying…

“I understand that a proper portrait has more impact but if it’s not of the real person, is that impact real?”

…and how others perceive it, which could expose us to criticism. As an organisation with authenticity as a core value, this concern really matters.

“I would feel a bit cheated to be honest and therefore worry that this might make RSE look disingenuous to others too.”

2. They dehumanise our work

To supporters who have worked in our Centres, the images represent a gap between our very personal work with individuals and an AI-generated image of it:

“It takes away something from the human experience that as volunteers is what brings us to the work. It doesn’t ring true for me.”

And again, this raises a concern. Do these images make it harder for our potential supporters to relate to the people we serve?

3. The look is off

This will probably get better in time, but some of the images are still a little off. In this one, for example, the hand isn’t quite right:

We also found it challenging to create an image of a young woman that didn’t look like it came from a glossy magazine. This gives an insight into the limitations of the current technology (we used Midjourney) and the data on which it’s trained, though we expect it to develop and improve quickly.

4. ‘Why can’t you just use real images with consent?’

We have used hundreds of photos of real people and will continue to do so. I would rather not use them, because it is so hard to get right, but they are critical for raising money.

For every photo we publish, we now ask ourselves whether it upholds people’s dignity. In the past we have used photos that show people suffering or looking overly grateful, and I regret that we have perpetuated the myths around helpless victims and white saviours.

Some organisations blur out faces or use silhouettes and cropping. The problem is that these techniques often make people look like criminals.

Many organisations ask people to sign consent forms. But how can consent be guaranteed in the context of such an unequal power dynamic? Of course people consent to their photo being taken and published if it means they can continue to receive essential support.

When publishing images of real people, we have the added complication of serving a marginalised group who already experience persecution. Digital images never go away and facial recognition algorithms are already impressive.

We thought we had discovered a great opportunity to create impactful images, overcome the many consent issues and uphold people’s dignity, but we have learnt a lot through this process of experimentation and gathering feedback.

The appeal of AI is that it’s a tool to help us achieve our goals. While we are committed to investing in our expertise with this emerging tech, we are a small organisation with limited time and resources, and we need to make sure we get the balance right. Ultimately, if our values seem to be compromised, then it’s not the right move for us at the moment.

What do you think about these images and issues? We’d love to hear from you in the comments…

 
