For many online publishers, user-generated content is often created through commenting systems that allow users to engage in discussion around a publisher's content. In many cases, these user-generated comments are more interesting than the content they are in response to. That's a boon to publishers.
But comments can be problematic. Trolls and spammers, often anonymous, can wreak havoc and turn a friendly experience into an experience plagued by hate and vitriol.
This weekend, Matthew Ingram, a writer at GigaOm and former community editor for The Globe and Mail, weighed in on the topic of anonymous comments, asking "Are they good or evil?"
His post is interesting, and well worth a read. For publishers, the debate about such matters is far more than just theoretical. User-generated content can't be ignored, and developing a sensible comments policy is a must for publishers who permit users to interact around their content.
Here are five tips for doing just that.
- Know who your users are. Different kinds of content bring out different kinds of commenters. You can't develop a sensible commenting policy unless you understand who your users are and what motivates them to leave comments.
- Check your content. High-quality content is far more likely to produce high-quality comments. So if you're worried about the kind of comments you're receiving, consider the possibility that your content is part of the problem. Needless to say, publishers should not be surprised when sensationalist headlines and linkbait attract less-than-quality commenting.
- Let the community filter. As a publisher, you should be able to trust your users. If you've built a loyal, engaged audience, chances are its members are more than willing to help you filter comments. Filtering can range from the promotion and demotion of comments (e.g. a voting system) to a simple spam-reporting mechanism.
- Incentivize reputation-building. In many cases, debates over the value of identity really revolve around reputation. Instead of, say, forcing all users to provide a first and last name in the hope that this will keep commenters on their best behavior, look at ways to reward commenters who add value. After all, that's what you really want. Points, rankings, special profile features (e.g. avatars, user titles) and the promotion of superb comments are just a few of the ways websites effectively encourage commenters to add value.
- Don't throw the baby out with the bath water. Nobody likes bad apples, but a few of them never hurt anybody. Trolls, spammers and troublemakers can never be completely defeated, but if they're not ruining the experience for everyone else, consider whether draconian measures are really warranted.
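To make the community-filtering idea above concrete, here is a minimal sketch of how a voting and spam-reporting mechanism might decide which comments to display. The `Comment` class, the thresholds and the function names are all hypothetical illustrations, not the design of any particular commenting system; a real implementation would also weigh factors like commenter reputation and account age.

```python
# A minimal sketch of community-driven comment filtering.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass

SPAM_REPORT_THRESHOLD = 3   # hide a comment once this many users report it
DEMOTION_FLOOR = -2         # hide comments voted down below this net score

@dataclass
class Comment:
    author: str
    body: str
    upvotes: int = 0
    downvotes: int = 0
    spam_reports: int = 0

    @property
    def score(self) -> int:
        # Net community vote: promotions minus demotions.
        return self.upvotes - self.downvotes

def visible_comments(comments):
    """Return comments the community hasn't filtered out, best first."""
    kept = [
        c for c in comments
        if c.spam_reports < SPAM_REPORT_THRESHOLD and c.score > DEMOTION_FLOOR
    ]
    # Promote highly voted comments to the top.
    return sorted(kept, key=lambda c: c.score, reverse=True)
```

For example, a thoughtful comment with five upvotes stays visible and rises to the top, while one reported as spam by several users, or voted well below zero, is hidden without any moderator intervention.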
It's important to remember that commenting functionality is an outlet for human thought and emotion. The form both take usually depends on the environment individuals are placed in. A good comments policy is designed to create a good environment.