
Guest post: “We take action where advertisers get it wrong.”

MumsnetGuestPosts · 11/02/2019 13:30

The internet can seem like a daunting place. Barely a week goes by without concerns about the impact of children's time online making headlines. While platforms and parents have responsibility for keeping children safe, the Advertising Standards Authority (ASA) also has a role to play in ensuring the ads that children see, hear and engage with online aren’t harmful.

Recent research released by Ofcom has shown that children spend just over two hours a day online – more time than they spend in front of a TV. Ofcom’s study of their media use also showed that children are increasingly drawn to platforms like YouTube and Netflix for the choice and control they offer. This is why it’s important our rules protect children across all media, including online and social media, advergames (games which contain advertising), kids’ apps and video-on-demand services.

For example, advertisers must not exploit children’s vulnerability or lack of experience; nor should ads contain anything that is likely to result in children's physical, mental or moral harm.

As a result, our strict rules are designed to ensure children are not targeted by inappropriate content. For instance, ads for age-restricted products such as gambling, alcohol, e-cigarettes, and food and soft drinks high in fat, salt or sugar (HFSS) must be carefully targeted to reduce the likelihood of children being exposed to them. Where children do see such ads, strict content rules prohibit them from being of particular appeal to children – for example, through the use of cartoon or licensed characters and certain colours and imagery.

Targeting tools are becoming ever more sophisticated and parents, rightly, want reassurance that proper protections are in place. We take action where advertisers get it wrong.

Last year we banned an advergame for Swizzels, ‘Squashies World’, where the advertiser was unable to prove its audience didn’t include a significant number of children. Our rules prevent the advertising of HFSS products to an audience that is made up of more than 25% under-16s. Similarly, we banned content on a Cadbury’s website because some of the downloadable storybooks and activity packs promoted HFSS food and were targeted at children.

We have also ruled on instances where a child may be surfing the internet while logged in via an adult’s profile. Last year, a seven-year-old saw an in-app gambling ad on a shared device and we banned the ad, ruling that the advertiser should have taken further measures to minimise the likelihood of under-18s being exposed to it.

It’s not enough for advertisers to simply rely on self-declared ages; we know that children can register false birth dates to open social media accounts. Advertisers must prove to us that they’ve filtered out users whose browsing behaviour or profiles indicate that they might be age-inappropriate.

Parents are understandably wary of apps that encourage their child to make in-app purchases. This can be a real worry when children are browsing the internet on their own or on a shared device. Our rules state that children should not be directly pressured to make purchases; we ruled against two online games, Bin Weevils and Moshi Monsters, which broke these rules.

In these instances, both games contained language and prominent calls to action that we considered put pressure on young players to purchase a membership subscription and in-game currency to take part in additional gameplay.

This year, we’ve started a new project researching the ads that appear in games apps targeted at children under the age of eight, in order to get a clearer picture of exactly what ads they’re seeing. In particular, we’re checking whether they are being shown ads for age-restricted products and whether they are being put under pressure to make in-app purchases or micro-transactions. If we find problems with ads on these apps, we’ll take firm action to put things right.

We work year-round with advertisers to ensure that ads are targeted correctly, don’t contain inappropriate content marketing restricted products, and don’t pressure children to make purchases.

Our rules are in place to make sure all ads are responsible – if you see an ad that you believe is misleading or harmful, we’d encourage you to submit a complaint to us.

Now, more than ever, there are measures in place to keep children safe when it comes to online ads – but we’re not complacent and will continue to make sure the rules are proportionate and robust, to ensure we’re doing our bit to make the internet safer for everyone.

From MNHQ: We will be passing on your questions to Craig and his team on Wednesday 13th February.

MumsnetGuestPosts · 19/02/2019 16:38

Here is a statement from Lydia Marshall at ASA:

"Thanks for all the questions in response to our guest post. We’ve done our best to answer as many as possible here.

When it comes to YouTubers advertising their own merchandise on their channels, this is allowed under our rules – it is unlikely to need any form of labelling if the fact it’s an ad is clear within the context. It would need to be clear from the title and/or thumbnail that it was advertising if the whole video was about the merch, or in the video itself if it’s just a section in otherwise unrelated content. Claims for any products or merchandise would, of course, have to stick to our rules, not cause harm or offence and should be appropriately targeted.

Instagram requires users to be at least 13, and any ads for age-restricted products or that are inappropriate for an under-18 audience should not appear on accounts where the user is a child. Advertisers are able to target ads on social media platforms away from accounts where the interests or browsing history is in line with that of an underage person.

A common theme from your questions appears to be concern around the use of sex or innuendo in advertising. The use of sex or innuendo to advertise products is not, in and of itself, prohibited by our rules. But it does, of course, often prompt complaints. In judging whether the rules have been broken we have to consider the content of the ad, the context in which it appears and the audience that is likely to see it. Advertisers should exercise care when using overtly sexual content, and in particular ensure that they do not target children.

When it comes to gender stereotypes in ads, our new rule – which comes into force in June this year – will prevent advertisers from depicting harmful gender stereotypes. This doesn’t mean marketers can’t show girls playing with dolls and boys playing with cars – but they mustn’t suggest that stereotypical roles or characteristics are always uniquely associated with one gender, or are the only options available to one gender.

We don’t currently have an app-based option for reporting ads, but we are exploring ways to make our processes more efficient and effective, which could include lighter touch reporting tools for people to let us know what they think.

If you’d like any more information, please visit our website, and if you have concerns about an ad, fill in our online complaints form."
