Tech Talk: Through the looking glass

The Future Gazers session at Dubai Lynx: Gemma Spence, VMLY&R Commerce; Thomas Kolster, Goodvertising Agency; Daniel Hulme, Satalia & Georgia Kinahan, Cannes Lions

By Jalaja Ramanunni

Someone recently said there are two types of people in the marketing world: those who can't utter the words 'data' and 'creativity' in the same sentence, and those who believe they go hand in hand. But marketing is no longer about data versus creativity; data has seeped into every aspect of it.

It was almost impossible to have a discussion at Dubai Lynx 2023 without mentioning technology. E-commerce is becoming more social as brands look to meet shoppers in their own environment. Meanwhile, AI and AR are playing a bigger role in storytelling and customer experiences. The pressure on CMOs to step out of the 'creative-first' world is real, we hear, driven in no small part by their CFOs.

While data is now used as a source of inspiration for creativity, it is essential to consider the potential implications of technology in marketing. On one hand, marketers are creating hyper-targeted, hyper-personalised campaigns that resonate with their audiences. On the other, there is growing concern about gender bias in generative AI. MullenLowe recently prompted AI tools to create images of mechanical engineers, F1 drivers, mathematicians, CEOs, boxers and football players. The results consistently showed male representation in these roles.

Not surprising. At first glance AI may seem gender-biased, but is it really? There is only so much AI can do when these professions are dominated by men. According to the BBC, there are no female drivers at the very top of F1, and it is nearly 50 years since a woman last contested a Formula 1 Grand Prix. AI mirrors reality.

I decided to go to the source and asked our chatty friend GPT-4 if generative AI is gender-biased. It said: “If the training data used to train a generative AI model contains biased information or stereotypes, the resulting model may generate biased outputs. To address these issues, it is important to use diverse and inclusive training data and to regularly evaluate the outputs of generative AI models for bias.” 

In simpler words, generative AI is gender-biased because we are, and the answer hints that the bias can be trained out. MullenLowe felt the need to take corrective measures and created a database of women's representation across different roles to train AI. It is surely a move in the right direction. Technology will do what we consciously ask it to do, and it seems we need to learn how to work with AI, including de-biasing its decisions.
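The mechanism GPT-4 describes, skewed training data producing skewed outputs, can be illustrated with a toy sketch. Everything here is invented for illustration: the professions, the proportions and the function names are assumptions, and the "model" is just sampling from its training distribution, which is of course a drastic simplification of a real generative system.

```python
import random
from collections import Counter

# Toy "training data": gender labels attached to images of each profession.
# The 90/10 and 15/85 splits are made up purely to illustrate skew.
training_data = {
    "engineer": ["male"] * 90 + ["female"] * 10,
    "nurse": ["male"] * 15 + ["female"] * 85,
}

def generate(profession, n=1000, seed=0):
    """Stand-in for a generative model: it simply samples from the
    empirical distribution of its training data, so any skew in the
    data reappears in the outputs."""
    rng = random.Random(seed)
    return Counter(rng.choice(training_data[profession]) for _ in range(n))

def rebalanced(profession, n=1000, seed=0):
    """One corrective, analogous to curating a more representative
    dataset: sample from a deliberately balanced pool instead."""
    pool = sorted(set(training_data[profession]))
    rng = random.Random(seed)
    return Counter(rng.choice(pool) for _ in range(n))
```

Run `generate("engineer")` and the outputs are overwhelmingly male, not because the sampler "decided" anything, but because that is what it was shown; `rebalanced("engineer")` comes out roughly even. Curating the training pool, as MullenLowe's database does, is exactly this kind of intervention, just at a vastly larger scale.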

However, the solution must go beyond fixing the symptoms. Gender bias stems from human behaviour, and real change is required on the ground.