With brands and marketing agencies building their own online spaces, or joining existing ones, to exploit a growing social media audience, child safety should always be the number one priority for those managing social media projects.
Tempero MD Dominic Sparkes worked with Econsultancy on the recently released Child Protection Best Practice Guide, which looks at the legal and ethical considerations for brands and companies operating in this area.
We asked him about the issues involved in child safety online in general, and setting up and managing online communities for children...
Can you tell me about Tempero's work on the issue of child safety online?
Since its inception in 2003, Tempero has put the protection of children and young people using community and social media tools at the very heart of its business.
We not only work to protect children online through stringent moderation policies and procedures, but we also work to educate young people in keeping safe and empower them to protect themselves.
Tempero has been an active contributor and member of the Home Office task force group, which looked at creating moderation good practice guidance. We’re also a member of the IWF and are currently collaborating with UKCCIS.
What are the major issues involved with ensuring child safety online?
Initially, it’s critical to understand the law and how it applies to your community, to review moderation options, and to assess which tools are available and how to use them. It’s also important to create stringent reporting processes, both internally and externally. All of this needs to be balanced with creating engaging spaces.
How big an issue is it?
With the explosion in user-generated content (UGC), and social media tools emerging as ways for children to make new friends, catch up with old ones and participate in interactive gaming, the problem of safety for children and young people has become increasingly prevalent.
In 2000 the Government set up the Internet Task Force Group to look at child safety online, and has more recently set up the UK Council for Child Internet Safety, recognising the importance of law enforcement, children’s charities and industry working together to provide better safeguards for children online.
Social media isn’t going away, and the volume of use by children in particular will only grow. Unfortunately, the volume of issues is likely to grow with it.
Do you think the government is dealing effectively with the issue of child safety online? Is there adequate legislation?
For such a difficult issue, yes, and the UK is leading the way in Europe. There’s always more that can be done, though, and for that we all have to take responsibility.
The Government has initially been trying to develop recommendations and allocate responsibility, but over time they may have to implement further legislation as the industry grows.
Not every company can afford to moderate so there needs to be more in place in terms of education and corporate responsibility, safety tools, and clear explanation of best practice.
What restrictions/guidelines are placed on marketers attempting to target children online? Do these work in practice?
Best practice has been recommended for everyone operating in this space: community owners, educators and marketers. There is great variation in how this best practice is implemented, though, and in general we believe that’s due to a lack of awareness and education within the industry.
What are the best forms of moderation for online communities with young audiences?
It’s important for companies to do a comprehensive risk assessment before starting a community aimed at children and young people. Following this assessment, decisions regarding safety and moderation services need to be made.
In practical terms, pre-moderation is the way forward, ideally through a combination of human moderation and advanced technical tools, which enable more sophisticated tracking of long-term behaviours and patterns.
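To make the idea concrete, here is a minimal sketch of a pre-moderation pipeline: every post is held back until it passes an automated filter *and* a human moderator approves it. All of the names here (the word list, the function names, the approval callback) are illustrative placeholders, not anything Tempero uses.

```python
# A toy pre-moderation queue: content is published only if it passes
# BOTH an automated check and human review. The word-list filter is a
# stand-in for the "advanced technical tools" mentioned above.

BANNED_TERMS = {"badword"}  # placeholder list, purely illustrative

def automated_check(text: str) -> bool:
    """Reject posts containing any banned term."""
    words = set(text.lower().split())
    return words.isdisjoint(BANNED_TERMS)

def pre_moderate(posts, human_approve):
    """Return only the posts that pass the automated filter and are
    then approved by a human moderator; everything else is withheld."""
    published = []
    for text in posts:
        if automated_check(text) and human_approve(text):
            published.append(text)
    return published

# Example: a moderator who approves everything the filter lets through
queue = ["hello everyone", "badword here", "lets play"]
print(pre_moderate(queue, human_approve=lambda t: True))
# → ['hello everyone', 'lets play']
```

In a real community the human review step would be an asynchronous moderation dashboard rather than an inline callback, and the automated layer would track behaviour over time rather than matching single words; this sketch only shows the gating logic.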
What are the biggest problems for brands and websites which target children online?
In terms of regulation, the biggest problem is striking a balance between protection and overly stringent restrictions. Data protection is also a major issue, and although this is one area where the legislation is very clear, many brands and communities aren’t even aware that they are technically capturing and holding data. Age verification adds to the challenges, with no single solution yet being ideal.
Other issues include how to run communities that are safe, within budget, technically scalable and fully protected, while still empowering children to have fun and benefit from social media.
Do operators of websites focusing on children have a good enough understanding of the issues and legislation surrounding child safety online?
No. The government needs to give good practice guidance and legislation a wider reach. Many companies don’t know what they should be doing, or how to get the right information and help.
Which websites are dealing with these issues effectively? Do you have examples of best practice in this area?
Many of our clients have fantastic best practice procedures which we have worked with them to develop. CBBC, for example, is a market leader in this area.
What advice would you give to people setting up an online community for children?
Design your child protection strategy before you start; this includes doing a comprehensive risk assessment. Over the years we’ve seen many companies add safety measures and tools after launch, at great cost and stress.