Companies investing in virtual worlds and Massively Multiplayer Online Games (MMOGs) now have a practical guide on how best to moderate these online environments, thanks to a new whitepaper published by international moderation company, eModeration. The report, Five Techniques For Creating Safer Environments For Children, outlines how organisations can not only use moderation to make virtual worlds as safe as possible, but also to enhance gameplay and create an even more positive, engaging experience. The paper also urges companies to strike the right balance between creating a safe environment and imparting advice that does not unduly scare children and parents.

Recommendations included in the report detail how companies should: consult all available research when drafting parental guidelines; involve parents as much as possible with their children’s internet activities; use automated moderation filters; utilise the expertise and experience of moderators; make reporting inappropriate behaviour clear and simple.

The whitepaper has been drafted by Tamara Littleton, a respected pioneer and authority on virtual-world and MMOG moderation, who is an original member of the Home Office Internet Taskforce for Child Protection on the Internet that advised the UK government on moderation of communities to help safeguard children. eModeration currently provides moderation services for virtual worlds including Dizzywood.

The full white paper can be accessed at:
http://www.emoderation.com/news/press-release-virtual-world-and-mmog-whitepaper

Below is a summary of the techniques covered in the report:

1. Consult all available research when drafting parental guidelines – any organisation that plans to set up a virtual world must explain in its parental guidelines how it mitigates risk. Each virtual world will vary in theme and content, but there are a number of rules that children and parents should adhere to. The full list can be found in the whitepaper, but the golden rule for children is never to share personally identifiable information (PII) online; this way, a child cannot be traced.

2. Use automated moderation filters – these can intercept the disclosure of a child’s personal information, preventing children from giving away their mobile phone number, email or IM address and their social network pages, which would otherwise hand an adult with malicious intent a wealth of useful information. Sophisticated filters can flag to a moderator when a child is being persistently pursued for information, such as where they are from, what school they go to, or personal preferences such as their favourite football team or singer. Filters can also be used to tackle overt bullying, abuse and harassment.

3. Utilise the expertise and experience of moderators – automated filters often detect inappropriate or abusive behaviour, but they do not remove the need for human moderators, who are trained to keep the peace and ensure a healthy playing environment. Without human moderators, children can find themselves in a ‘Lord of the Flies’ scenario. Moderators who take on the role of an in-game character or host also enhance the playing experience for children: they can help them overcome in-game challenges and obstacles, and suggest new things within the game for them to experience.

4. Make reporting inappropriate behaviour clear and simple – virtual worlds must provide a very easy way for children and parents to report instances of inappropriate behaviour and should provide easily-accessible contact details for a moderator. The site should also have a very clear policy on what constitutes bullying so that children understand what is, and what isn’t, acceptable behaviour before they play.

5. Get parents involved – parents have a very important role to play in ensuring virtual worlds and MMOGs are safe and positive places for their children to play. There should always be a ‘parents’ guidelines’ page clearly visible on the site. It is also very important that parents are encouraged to adopt a balanced approach when it comes to educating their children on the dangers of virtual worlds, and do not unwittingly frighten them before they’ve even played the game.
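As a rough illustration of the automated filtering described in technique 2, a minimal chat filter might combine pattern matching on outgoing messages (to block PII such as phone numbers and email addresses) with a per-conversation counter that flags a human moderator when probing questions persist. Everything here — the pattern set, the probing phrases, the threshold and the class name — is a hypothetical sketch for illustration, not eModeration's actual system:

```python
import re
from collections import defaultdict

# Illustrative patterns only; a production filter would be far more extensive.
PII_PATTERNS = {
    "phone": re.compile(r"\+?\d[\d\s-]{8,}\d"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

# Example phrases that suggest someone is pressing a child for personal details.
PROBING_PHRASES = ("where do you live", "what school", "how old are you")
PROBING_THRESHOLD = 3  # escalate to a human moderator after repeated probing


class ChatFilter:
    def __init__(self):
        # Count probing messages per (sender, recipient) pair.
        self.probing_counts = defaultdict(int)

    def check_message(self, sender, recipient, text):
        """Return a list of actions: 'block' if the message contains PII,
        'flag' if the sender has persistently probed this recipient."""
        actions = []
        lower = text.lower()
        # Block messages that would disclose personally identifiable information.
        if any(pattern.search(text) for pattern in PII_PATTERNS.values()):
            actions.append("block")
        # Tally probing questions and flag a moderator once the threshold is hit.
        if any(phrase in lower for phrase in PROBING_PHRASES):
            self.probing_counts[(sender, recipient)] += 1
            if self.probing_counts[(sender, recipient)] >= PROBING_THRESHOLD:
                actions.append("flag")
        return actions
```

For example, `check_message("a", "b", "my email is kid@example.com")` would return `["block"]`, while three successive probing questions from the same sender to the same child would return `["flag"]` on the third. Keeping the blocking and flagging decisions separate mirrors the division of labour the report describes: the filter acts instantly on clear-cut disclosures, and escalates patterns of behaviour to human moderators.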

Tamara Littleton, CEO, eModeration, comments: “It’s important to keep a sense of perspective and remember that the risks associated with virtual worlds should be tackled in a similar way to those facing children in the real world. Dangers do exist, but by following safety guidelines and applying common sense, these risks can be mitigated as far as possible and children can enjoy a positive, stimulating experience.”

For more information, visit www.emoderation.com.

About eModeration
Founded in 2002, eModeration Limited is an international, specialist user-generated content moderation company. It provides 24-hour community and content moderation to clients in the entertainment and digital publishing industry and major corporate clients hosting online communities and consumer-driven projects.

eModeration's CEO and founder, Tamara Littleton, has an established background in editorial quality control, fault escalation and process management gained from previous work as the Product Delivery Director for Chello Broadband and Online Operations Manager for BBC Online, where she managed the world's first ISO 9000-accredited team for digital publishing management and monitored over 400 BBC websites. Tamara Littleton is a member of the Home Office Internet Taskforce for Child Protection on the Internet, which brings together government, law enforcement, children’s agencies and the internet industry, all working to ensure that children can use the internet in safety. She was also the Chair of e-mint, the online community for community professionals, from 2006 to 2007.

eModeration's team of moderators and staff are the key to eModeration's success and excellent client list. eModeration draws on the expertise of carefully recruited and trained moderators located mainly in the US and Europe with specialist editorial and community moderation skills, which are matched uniquely to the client. The company can moderate 24/7/365 in more than 18 languages. All its moderators are managed online from eModeration's headquarters in London, United Kingdom.

Further press information:

Malini Majithia / Kate Hartley
Carrot Communications
Tel: +44 (0)20 7386 4860
E: emoderation@carrotcomms.co.uk

Published on: 12:00AM on 16th May 2008