Generative AI & Advertising: Decoding AI Regulation

Aug 31st '23

Generating tailored ads that resonate uniquely with each individual viewer. Designing dynamic campaigns that evolve according to consumer response. Creating a year’s worth of brand content with just a few clicks. These are just a few of the ways that generative AI might transform the ad industry, with the promise of even greater changes on the horizon.


So, how is the Committee of Advertising Practice (CAP) regulating the use of generative AI in advertising?


  • What is generative AI?

To put it simply, generative AI is a form of artificial intelligence that can create original content, such as images, text or music, tailored to a user’s requests. It learns from the existing data it has been trained on to produce new output that has not been explicitly programmed in.


  • Nothing new under the Code

You may be wondering whether it matters to CAP if an image used in an ad has been generated by AI.


Similar to previous advice on new platforms (in short: the Code is media-neutral), if an ad falls within our scope, our rules will apply, regardless of how the ad was created. This is because we regulate ads based on how consumers will interpret them, which is not affected by the means used to generate a specific piece of content.


However, there are instances where how an ad was created might be relevant to whether it complies with the rules. To our knowledge, the Advertising Standards Authority (ASA) has yet to rule on any ads using AI-generated images. If such images were used to make efficacy claims, however, there is the potential for them to mislead if they do not accurately reflect the efficacy of the product, much in the same way as images that are photoshopped or subjected to social media filters might. Advertisers should consider this if they are thinking about using AI-generated images in their ads.


There is also a risk of some AI models amplifying biases already present in the data they are trained on, which could lead to socially irresponsible ads. There have been documented examples of specific generative AI tools tending to portray people in higher-paying occupations as men or as individuals with lighter skin tones, or to portray idealised body standards that could be harmful or irresponsible. Advertisers that want to use AI-generated images in their advertising should be mindful of the potential for inherent bias and sense-check their ads to ensure that they are socially responsible and do not inadvertently portray harmful stereotypes.


Overall, advertisers should be aware that, as stated in this recent Stripe & Stare Ltd ruling, even if marketing campaigns are entirely generated or distributed using automated methods, advertisers retain primary responsibility for ensuring that their ads are compliant. This is true for all stages of ad creation and distribution.


Source: CAP


How can we help?

LS Consultancy are experts in marketing and compliance, and work with a range of firms to help improve their documents, processes and systems to mitigate risk.


We provide cost-effective, timely and bespoke copy advice and copy development services to make sure all your advertising and campaigns are compliant, clear and fit for purpose.


Our range of innovative solutions can be tailored to suit your unique requirements, whether you’re currently working from home or continuing to go into the office. Our services can be deployed individually or combined into a broader solution, freeing you to focus on your clients.


Contact us today for a chat or send us an email to find out how we can support you in meeting your current and future challenges with confidence.


Explore our full range today.


Contact us


Call Us Today on 020 8087 2377 or send us an email.



Connect with us via social media and drop us a message from there. We’d love to hear from you and discuss how we can help.


Facebook | Instagram | LinkedIn | X (formerly Twitter) | YouTube