
This week, the 'interactive services' industry took a major step in its efforts at self-regulation.

The publication of guidelines on moderating sites for children will have an impact on brands that provide online communities or games attracting a younger audience.

As all you new media types will know, it's social media week. This means lots of interesting events, meeting up with people you only know by their avatar (can’t that be a shock sometimes?), extra blogging duties and trying to keep track of #smwldn.

What you may not know is that the ‘interactive services’ industry improved its efforts at self-regulation this week, when the UK Council for Child Internet Safety launched guidelines for moderating ‘interactive services’ for children (in other words, online environments such as social networks, MMOGs, forums, Facebook, IM, Twitter and so on).

This is important for two reasons. Firstly, self-regulation helps maintain creativity in the industry and shakes off any remaining ‘wild west’ associations. Secondly, proper guidelines give brands and their agencies real advice on what they should be doing within their social media campaigns to make the internet a safe place for children.

These are sensible guidelines that have been drafted with input from industry, charities and child safety experts, as well as specialist moderation agencies (including my own company, eModeration, together with Tempero and Chat Moderators).

When the last set of good practice guidelines was produced in 2005, Facebook was tiny, Club Penguin was just launching (Moshi Monsters was still two years away), and Twitter didn’t exist. With the number of children now using social media, we desperately needed updated guidance in place.

It’s great to see the UK leading the way on this. I was talking to a colleague in the US who said that the intricate system of state regulation means something like this just wouldn’t be possible, and I’ve been really encouraged to see the way that charities and industry have come together to share best practice.

So if you have any involvement in creating, managing or moderating an online environment that children might go to, please do read the guidelines, and work with them.

We all have a duty to make sure the online world is safe for our children. And self-regulation will make all our lives that bit easier.


Published 10 February, 2011 by Tamara Littleton

Tamara Littleton is CEO at social media management agency eModeration and a contributor on Econsultancy.
