
The investigation into Habbo Hotel has thrown up some difficult questions about how much online environments do and don’t do to keep children safe.

There’s a line that children’s brands tread between protecting their young users from harm and allowing them to express themselves in an environment that they enjoy.

It should be stating the obvious to say that child safety is paramount for any company that creates a place for children to play online (although sadly this isn’t always the case).

On any site, however well managed, there will be an element of risk, as there is in the real world, but by using a combination of safety checks, proper moderation and education it is possible to minimise that risk, and create an environment where children and teenagers can play safely.

Part of the problem, I think, is that some sites try to be all things to all people. Decide what age group you're targeting, and create the appropriate environment for that group. If you leave children in a room together with nothing much to do, they'll create their own games.

If you put hidden, private areas in that room, you’re going to attract secretive, illicit and, in the most extreme cases, dangerous behaviour. The same applies to an online world, but exaggerated by a cloak of anonymity.

Young children are still learning that actions have a consequence, how to take responsibility and how to manage risk. If you let them create a new persona through an avatar, they’re one step removed, and their inhibitions go. They take more risks. So you need to keep a closer eye on what they’re doing.

The need for moderation

Any site that children use should have solid moderation that combines technology (to filter out the most obviously inappropriate content) with human moderators. You need real people who are native speakers to understand the nuances of language.

Children create new words and phrases all the time, particularly when they’re talking about sex. They’ll try all sorts of variants on phrases to get round filters.
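To make that concrete, here's a minimal sketch (my own illustration, not any particular vendor's tool) of the kind of automated first pass such a site might layer beneath human moderation: it normalises common character substitutions and repeated letters before matching against a watch list, so simple evasions like "$kype" or "skyyype" still get flagged. The word list and substitution table are purely illustrative.

```python
import re

# Illustrative watch list only; a real tool would use far richer lists,
# context and machine learning, with humans reviewing everything flagged.
BLOCKED = {"webcam", "skype"}

# Common character substitutions used to slip past filters (assumed examples).
LEET = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                      "5": "s", "7": "t", "@": "a", "$": "s"})

def flag_message(text: str) -> bool:
    """Return True if the message should be queued for a human moderator."""
    normalised = text.lower().translate(LEET)
    # Collapse repeated letters ("skyyype" -> "skype") before matching.
    # This is deliberately crude: it also mangles legitimate double letters,
    # which is exactly why a human moderator makes the final call.
    normalised = re.sub(r"(.)\1+", r"\1", normalised)
    words = re.findall(r"[a-z]+", normalised)
    return any(word in BLOCKED for word in words)
```

Anything the filter flags goes into a queue for a human, rather than being auto-blocked; the filter narrows the haystack, it doesn't make the judgement.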

Again, you have to strike a balance between allowing children to talk freely (nothing kills a site off quicker than censored chat) and being on the alert for danger signs.

There are some obvious things that should trigger an intervention by a moderator. A request for a webcam chat, for example, or to take a conversation out of the site and onto Skype or MSN should ring a warning bell.

Children will often share personal details, and that should be prevented. Those details might be as straightforward as a phone number or address, or something less obvious, like information about a school’s football team that could identify the school the child is at.
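The straightforward cases (phone numbers, postcodes) are the ones pattern matching can catch before a message is ever posted. As a rough, assumed illustration (the two regexes below are mine, not a description of any real site's system), a pre-post check might look like this; the subtler leaks, like a school football team, still need a human eye.

```python
import re

# Illustrative patterns only. Production systems use much richer detection
# (names, addresses, social handles) plus human review of anything borderline.
PHONE_RE = re.compile(r"\b(?:\+?\d[\s-]?){9,13}\d\b")       # 10-14 digit runs
POSTCODE_RE = re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b", re.I)  # UK postcode shape

def contains_personal_details(text: str) -> bool:
    """Flag messages that look like they share a phone number or postcode."""
    return bool(PHONE_RE.search(text) or POSTCODE_RE.search(text))
```

Note that short digit runs (scores, ages) deliberately don't trigger the phone pattern, to keep false positives down.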

There are also warning signs that experienced moderators and best-of-breed tools can spot to identify someone who's pretending to be younger than they are. Cultural references, language, context and past behaviour patterns are all things that can help you identify grooming behaviour.

Of course, you’re unlikely to be able to stop two people of the same age flirting with each other online, but you can stop overtly sexualised chat on a site that attracts young children.

If a child is at any sort of risk, for example of self-harm, suicide or sexual abuse, there must be a clearly defined escalation process to alert the relevant authorities (such as CEOP and the IWF in the UK, or the CyberTipline in the US) so they can intervene quickly. If you're not watching the site, you won't be able to do this.

What age group are you targeting?

A site that lets all ages mix together is trying to be all things to all people. You could have areas for different ages; or, if you're not prepared to apply the resource needed to moderate content for things like sexualised chat, the site should only be open to adults.

Education is important, both of the children on the site, and of their parents. Children should understand risk, and be taught to avoid it; and parents of young children should monitor – and understand – what their children are doing online. CEOP does some great work in this area.

Above all, the culture of a site will inform how children behave on it. If a site gets a reputation for being a free-for-all, with unchecked behaviour, it will attract the kind of behaviour that could put children at risk.

But if that site has a culture of supporting its users, doing everything it can to keep them safe and educating them about the risks so they can protect themselves, it will attract a different kind of play.

That takes time and resource, so a site has to make a choice: safety over profits, or profits over safety?  


Published 14 June, 2012 by Tamara Littleton

Tamara Littleton is CEO at social media management agency Emoderation and a contributor on Econsultancy.


Comments (1)


Hannah Rainford, Associate Director of Social Media at Jellyfish Online Marketing

Moderation of online spaces for children is always going to be a difficult subject, and I have respect for anyone who works in that field, having worked in it myself for a few years.

The only way to fully protect children is with premoderation. This does take away the 'social' aspect, but it is the only way you are going to come close to eliminating this problem. Another way is to take a leaf out of Club Penguin's book, which asks that parents have an account connected to the child's, so that activity can be monitored.

I'm hoping that as the computer-generations grow up, we end up with parents who are more savvy in online spaces and are able to realise what their children could be exposed to online and take greater steps in preventing this.

I think it's too late for Habbo Hotel. They're not going to be able to come back from this unless they revisit their entire moderation strategy with premoderation and more human moderators, which in turn would ruin the fun 'social' element for the children.

