Ogilvy, the renowned creative agency, has unveiled an initiative aimed at driving policy change within the advertising industry. With the increasing use of artificial intelligence (AI) in marketing campaigns, concerns about authenticity and transparency have emerged.
In response, Ogilvy has introduced the ‘AI Accountability Act,’ an effort focused on bringing greater transparency to influencer-led campaigns on social media platforms. By advocating for disclosure and clear identification of AI-generated influencer content, Ogilvy seeks to foster trust and maintain the efficacy of social media platforms.
As the advertising industry continues to embrace AI technology, questions surrounding the authenticity of influencer-led campaigns have become more prevalent. The widespread use of AI-generated influencers has prompted the need for increased transparency, ensuring consumers are aware when human influencers are replaced by their virtual counterparts. Ogilvy recognizes the weight of these concerns and is taking a proactive approach to addressing the grey areas of influencer-led marketing.
Ogilvy has long been committed to promoting ethical practices within the advertising industry. In line with its ‘inclusive influence’ commitment, the agency ceased collaborating with influencers who manipulate or distort their appearance in sponsored ads. By taking a stand against deceptive practices, Ogilvy aims to protect consumers from misleading content and maintain the integrity of influencer marketing.
With the introduction of the ‘AI Accountability Act,’ Ogilvy intends to drive policy changes by advocating for transparency in AI-driven influencer campaigns. Under the initiative, brands would be expected to disclose the use of AI-generated influencer content to the public, ensuring consumers are aware when virtual influencers are employed. This disclosure aims to prevent deception and promote authenticity in influencer-led campaigns across social media platforms.
Ogilvy is also urging social media platforms to clearly identify and label advertisements featuring virtual influencers. By providing visible markings and clear declarations, the platforms can help maintain the effectiveness and trustworthiness of influencer marketing.
By encouraging full disclosure through the hashtag declaration #poweredbyAI, the agency aims to foster a culture of trust among brands, influencers, and consumers. Additionally, a new watermark on AI-generated content will serve as a visual identifier, further ensuring accountability and transparency when such content is used in influencer campaigns.
Ogilvy’s ‘AI Accountability Act’ follows the pioneering steps taken by the Advertising Standards Council of India (ASCI), which became the first national watchdog to mandate clear disclosure rules for AI-generated influencer content.
Other industry giants such as Meta and TikTok have also acknowledged the importance of ethical guidelines and visible markings on virtual influencers. However, Ogilvy asserts that more work remains to be done to ensure responsible marketing practices in the AI influencer landscape.
Julianna Richter, global CEO, Ogilvy PR, said: “The ability of AI to create and learn at speed has already transformed the way we produce personalized content online. But AI must be centred around empathy and transparency. That’s where you create honest interactions with consumers and can drive real impact at the intersection of new digital capabilities.”
Rahul Titus, global head of influence, Ogilvy, added: “The AI market is valued at $4.6 billion and projected to grow by 26% by 2025, in large part because of the growing increase using AI in Influence. As leaders in this industry, we have the responsibility to be ethical and transparent as we populate this new frontier. The technological advances using AI are exciting for the Influencer marketing landscape but runs the risk of compromising authenticity if we don’t declare the difference between what’s real and what’s not.”