
For years, marketers have been integrating AI-powered products into nearly every aspect of their work, gaining critical insights and driving more conversions through better campaign optimization.

More recently, the profusion of news surrounding generative AI has renewed focus on the critical importance of keeping a human in the driver’s seat, especially when it comes to building advertising creative. Commercial creative would not exist without the talent behind it, full stop. And just as marketers have been using AI-powered tools, creatives have been experimenting with AI in everything from ads that talk back to concept decks that sing.

With responsive search ads (RSAs) and responsive display ads (RDAs), we’ve already seen the ability of AI to optimize thoughtfully crafted assets into the best-performing combinations of creative. We know that AI can scale asset production to serve the most relevant version of an ad across channels via Performance Max, App, and Video action campaigns. Now, to take the pulse of how interdisciplinary creative teams are already exploring AI, we asked agency leads across three regions how they use it. Here’s what they said.

Novel use cases for AI tools


1. Interactive ads from voice activation to audiovisual journeys

During the 2019 holiday season, a team tasked with selling the Google Nest to Australian shoppers had a unique opportunity to let people test the product from the comfort of their own homes.

“We wanted to bring the experience of Nest into the ad itself, so that was the moonshot then — essentially, quite a while back before generative AI was a thing,” said Tingyan Han, senior data strategy director, APAC at EssenceMediacom. “We needed to demonstrate the many ways in which Nest makes moments at home better.”


To do that, the team turned to a Google speech-to-text API, built with the help of the Google Nest and Google Media Lab teams.

“There were many considerations when working with a speech-to-text API, from the UX to ad inventory and measurement, all the way down to testing whether local accents were recognizable,” said Han. The ad would pause within the first few frames to allow the user time to opt in to the experience. When users said, “Hey Google, show me gingerbread house recipes,” it would do as told.
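
The campaign’s actual integration isn’t public, so as an illustration only, here is a minimal Python sketch of that kind of opt-in voice flow using the Google Cloud Speech-to-Text client library. The function names and the keyword trigger are hypothetical stand-ins.

```python
# A minimal sketch of an opt-in voice trigger, assuming audio has already been
# captured client-side after the user opts in. Uses the Google Cloud
# Speech-to-Text v1 Python client; function and keyword names are hypothetical.
from google.cloud import speech

def transcribe_opt_in(audio_bytes: bytes, language_code: str = "en-AU") -> str:
    """Transcribe a short clip of user speech captured after the ad pauses."""
    client = speech.SpeechClient()
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code=language_code,  # e.g., "en-AU" to test local accents
    )
    audio = speech.RecognitionAudio(content=audio_bytes)
    response = client.recognize(config=config, audio=audio)
    # Join the top alternative from each result into a single transcript.
    return " ".join(r.alternatives[0].transcript for r in response.results)

def matches_command(transcript: str, keyword: str = "gingerbread") -> bool:
    """Hypothetical trigger check: did the user ask for the demo content?"""
    return keyword in transcript.lower()
```

In practice, the hard parts Han lists (ad inventory, measurement, and accent coverage) sit around this call rather than inside it: the API handles transcription, while the ad experience decides when to listen and how to respond.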

The activation led to an increase in purchase consideration. Vincent Tay, creative director, APAC at EssenceMediacom, said that’s because the interactivity served to demonstrate the benefits of the product itself.


“While the thought of integrating new and shiny technology into every facet of our campaign can be alluring, we must always remember to place the human truth at its core,” said Tay, who partnered with a UX/UI designer alongside Han on the campaign.

More recently, Publicis-owned Leo Burnett Australia used AI to build a talking car; Leo Burnett Taiwan used another kind of AI to celebrate Lunar New Year with potato chip-inspired fortunes.

“We have been using Google Colab regularly for early prototyping efforts involving up-and-coming AI algorithms,” said Laurent Thevenet, head of creative tech at Publicis covering APAC and MEA. “We are also starting to use Bard, as we like the fact that it is connected to the internet and can retrieve real-time information.”

In addition to sending regular updates on emerging tech to its 4,000 creatives across the region, Publicis is also mentoring talent internally, sending mentees back to their respective agencies and teams with new knowledge.

“Machines need operators and, for the creative industry, it means that creatives will partner with machines, directing them towards the desired output,” said Thevenet. “These creatives have to be hybrid talents able to think critically while having the ability to operate and connect these emerging AI systems with one another.”

2. Envisioning results — and selling that vision to clients

“Imagine a concept store on the moon. What does that look like?”

Ed Yeoman, creative director at London-based branding and design agency Human After All, has observed his team bringing AI into the ideation stage to collaborate on design fiction for this type of “wild-card idea.”

“Some people use it to get verbal prompts around language and messaging, and some people use it to get visual prompts to help them kind of imagine something that doesn’t exist yet,” including events and event spaces, he said.


Gabriel Cheung, global executive creative director at TBWA\Chiat\Day New York, said AI-generated content can also challenge teams to push their thinking. “Because of the way AI works with existing information, we let AI-powered tools quickly generate the ‘first thought’ ideas. Our thinking is that if AI can generate the same idea, then we haven’t done our job well enough to come up with something new,” said Cheung.

Google recently announced a new conversational experience in Google Ads that helps build Search campaigns and can also be used to enhance brainstorms, a use case Yeoman’s team found helpful.

“[The prompting] part of the process … puts you in the role of the creative or strategic director,” said Yeoman. “Oftentimes, at that level, you’re helping to frame challenges for people to inspire them to do certain things. It’s the same with AI. You’re hoping to frame this challenge in a way which gives you an optimal output.”


Meanwhile, Google’s announcement of the forthcoming addition of Adobe Firefly to Bard represents a commitment to ethical image generation. Firefly is trained on a catalog of hundreds of millions of licensed images in Adobe Stock. As private technology companies develop responsible AI and call for regulation, creative agencies are writing their own ethical guidelines.

“We felt that there was a little bit of a lack of people talking about what it meant for humans,” said Yeoman. “At the heart of our business are the principles of empathy and reason. We try and build those into everything.”

3. Guiding concept iteration, strategy, and production at scale

Every year, global creative agency Jellyfish Group invites its technologists to a companywide hackathon. Participants at this year’s event in Paris received a new kind of challenge: design an AI-powered solution for improving the quality and performance of creative assets. The event brought together 120 technologists across data, media, and creative to address this critical business need.

“The benefit is huge; it elevates everyone’s knowledge and experience, and that carries throughout the rest of the year,” said Sam Yates, chief solutions officer of creative technology at Jellyfish.


“AI has developed so fast and there’s so much opportunity, but it’s also kind of a Wild West scenario,” said Di Wu, VP of data science at Jellyfish Group. “People are just starting to think about regulation and what it means for us to adopt AI. We created a task force around it to basically say, AI is not just about ethics; it’s literally a business problem for us. If we’re not using AI the right way, it could lead to bias because of the training samples used [to train the hypothetical model].” Agencies will soon have access to more tools to create variations on their own work, keeping that work ownable.

Working within established partner networks is one way to ensure safety, added Yates. “We don’t want to stifle innovation in any way, but we do have to think about some of the risk mitigations that go with exploring new technologies,” he said. “Working closely with Google, for example, is a great opportunity for us to explore a tech stack in a safe and responsible way.”


Even before the milestones of 2023, Google had partnered with Jellyfish to design a bespoke service powered by Google Cloud Platform’s Vision and Video APIs. The resulting service, Optics, could decode which elements of an ad performed best according to a brand’s goals and best practices like the YouTube ABCDs (attention, branding, connection, direction). With advances in AI, such methodologies could eventually enable teams to distinguish emotional content across ads, or even the presence of humor, to shed light on which creative choices resonate most with their brands’ audiences.
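
Optics itself is bespoke and its internals aren’t public, but as a rough sketch of the building block it rests on, the snippet below uses the Google Cloud Vision client library to tag the elements present in a single frame of a creative asset. Joining those tags to a brand’s performance data is the hypothetical part.

```python
# A hedged sketch: label the visual elements in one frame of an ad using the
# Google Cloud Vision API. Pairing labels with performance data (the step a
# service like Optics would add) is shown only as a hypothetical comment below.
from google.cloud import vision

def label_ad_frame(image_path: str) -> list[tuple[str, float]]:
    """Return (label, confidence) pairs for one frame of a creative asset."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [(label.description, label.score) for label in response.label_annotations]

# Hypothetical usage: tally which detected elements co-occur with
# better-performing ads, according to a brand's own reporting data.
for label, score in label_ad_frame("ad_frame_001.jpg"):
    print(f"{label}: {score:.2f}")
```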

Once teams have finalized creative assets, they can get the most out of them using smart creative formats like RSAs and RDAs. Tools like Performance Max can help make sure the right ad reaches the right audience at the right time, optimized for their chosen channels and screens. Research shows that advertisers see 2X more conversions, on average, when adding an RDA to an ad group with a static display ad.

From creative testing to storyboarding, AI-powered solutions have a pivotal role to play in the future of creative development. That role must be defined by the needs of real creative teams, and some of the most useful applications may be found in unexpected places. Through active exploration and experimentation with AI, we can develop more concrete thinking about the guardrails we want to uphold and the benefits they unlock for creative professionals.