“Degrading” ads objectifying women, depicting non-consensual sexual encounters and using pornographic tropes have been banned by the UK advertising watchdog after being shown to child audiences on mobile gaming apps.
An investigation by the Advertising Standards Authority (ASA) used avatars to mimic the browsing behaviour of different age groups and genders, monitoring the ads that appeared in mobile games and identifying breaches of the UK advertising code.
Although most of the almost 6,000 adverts that appeared did comply with UK rules, the watchdog identified eight that portrayed women in a “shocking” way and banned them.
One advert for the app Perfect Lie – a game that included sexual innuendo – was shown to a female child avatar while she was using a game featuring a virtual cat, which likely appeals to a younger audience.
The offending ad, which showed a teacher bent over, with her bottom appearing pixelated, was found to risk causing harm and serious offence.
Another ad for an interactive romance game called My Fantasy was shown to both male and female child avatars while using a game that involved freeing cars from traffic jams.
It showed an animation of a woman being approached by another woman and being pushed on to a desk. It then showed options asking what she should do – “enjoy it”, “push her away”, “please continue” and “stop it”.
The watchdog said the content was “strongly suggestive and implied the sexual encounters were not consensual”.

Two ads for an artificial intelligence chatbot app called Linky: Chat With Characters AI appeared while the female child avatar was using a flight simulator game and a character simulation game.
The ad began with a woman dressed in a manga T-shirt, a short skirt and bunny ears dancing in a bedroom. It then showed a text that read: “Tell me which bf [boyfriend] I should break up with.”
It then showed a text conversation with three manga-style men. One character was conveyed as “obsessively possessive, aggressively jealous and won’t let you out of his sight. He’s also a kidnapper and killer”. The text described yanking the woman “into the car, swiftly knocking her out”. She asked, “okay but what if I enjoy this” and he replied, “You will not enjoy this.”
The ASA said the ad was “suggestive and implied scenarios involving violent and coercive control and a lack of consent”.
The report highlighted that although such instances were rare, the ASA has a “zero-tolerance” approach to content that shows “degrading portrayals of women”.
“We know that seeing harmful portrayals of women can have lasting effects, especially on younger audiences,” said Jessica Tye, regulatory projects manager at the ASA.
“Whilst we’re glad to see that most advertisers are doing the right thing, the small number who aren’t must take responsibility. Through this report, we’re making it clear: there’s no room for these kinds of ads in mobile gaming, or anywhere,” she added.
Over the past two years the watchdog has investigated and upheld 11 complaints in cases where in-app ads have harmfully objectified women, or condoned violence against them.
Almost half of Britons are concerned about the way women and girls are depicted in ads, a separate YouGov survey of 6,500 people revealed.
It found 45 per cent of people are concerned about ads that show idealised images of women and 44 per cent are concerned about the objectification of women and girls in ads.