Moderation & subjectivity
Welcome to Whisplay, a platform for the gamers, by the gamers! We want to build a healthy community where everyone can speak up and feel comfortable, through respect and mutual aid. This raises a very important subject: moderation.
This practice, which consists of filtering and tempering exchanges between internet users, is widespread on the web, and for good reasons, yet we are often unhappy with the results. What is the right balance between censorship and laxity, so that these exchanges remain fair and beneficial? Gaming networks' reputation for verbal toxicity must come to an end, and to achieve that, fair and technically efficient moderation must be put in place.
Some words or images are immediately recognizable as inappropriate. For everything else, subjectivity lingers over the vast majority of subjects: it all depends on our cultural or generational background, or simply on our personal opinions and references.
Our culture inevitably influences our perception of what is acceptable; history shapes minds, whether we are aware of it or not. How we interpret a sentence may depend on our nationality or continent, but not only that: generational gaps also play a role. What was acceptable years ago can be inadmissible today, and the reverse is also true. As a consequence, content moderation must evolve over time and take everyone, everywhere, into consideration.
Godwin point¹
Let's examine a strong yet simple example: imagine you want to write an article about how video games take liberties with historical facts. Many scenarios take place during the Second World War and feature characters from this era. If you explain why certain facts have been tampered with, downplayed or rewritten for a particular game, will some people call you a Holocaust denier, or see a glorification of Nazism? Or will this article be seen as one simple analysis among others?
The turn of phrase, the tone, the state of mind and the personal history of the reader (some of which are very subjective) may well influence the interpretation of the content. In this case, effective and adaptive moderation is essential to protect the integrity of the article.
What can we do?
The diversity of experiences and points of view is a treasure, one that might even be contagious and let you rediscover the universe from an angle you never suspected existed. We must be able to exercise our right to freedom of speech without the threat of censorship or of our opinions being attacked, and we must equally protect users from hate and potentially devastating content.
That is the challenge of every moderation project. To meet it, we have to determine at which point an exchange ceases to be constructive and requires intervention from the moderation staff.
How can we respect ideas and lift them up? Some rely only on algorithms and bots, some trust only human moderators to authorize content, and some simply hope the community will behave properly and respectfully…
At Whisplay, these three techniques are used together:
- First, Google bots are deployed to automatically isolate images, texts and videos that do not respect the site's ethics;
- The community will also have the ability to report content it considers inappropriate, by choosing a reason from a list or writing its own;
- On top of that, our community relations team will be mobilized to rule on disputed content (and hunt trolls).
When a publication is reported (article, image or comment), it is frozen, rendered invisible and sent to our moderation team. Of course, a single report from a single person is not enough for content to be blocked: the process weighs the stated reason together with the number of reports relative to the total number of views. Homophobic content, for example, must be handled much faster and more severely than a simple off-topic post.
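The weighting idea described above can be sketched as follows. This is a minimal illustration, not Whisplay's actual implementation: the reason names, severity weights and threshold are all assumptions.

```python
# Hypothetical sketch of report weighting: each report reason carries a
# severity weight, and content is frozen for review once the weighted
# report count relative to total views crosses a threshold.

SEVERITY = {
    "hate_speech": 1.0,  # e.g. homophobic content: acted on immediately
    "harassment": 0.8,
    "off_topic": 0.1,    # a simple off-topic post needs many reports
}

FREEZE_THRESHOLD = 0.05  # weighted reports per view (illustrative value)

def should_freeze(reports, views):
    """reports: list of (reason, count) pairs; views: total view count."""
    if views == 0:
        return False
    weighted = sum(SEVERITY.get(reason, 0.5) * count
                   for reason, count in reports)
    return weighted / views >= FREEZE_THRESHOLD

# A single off-topic report on a popular post does not freeze it...
print(should_freeze([("off_topic", 1)], views=1000))   # False
# ...but one hate-speech report on a barely-seen post does.
print(should_freeze([("hate_speech", 1)], views=10))   # True
```

The point of the ratio is that popular content is not penalized for attracting a handful of reports, while severe reasons trigger review even with very few reports.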
The intervention of the moderation team (whose numbers will grow along with the number of Whisplay users) is very important, but represents a gigantic cost for a small company. The tools available to our team will mitigate this problem: when a moderator is confronted with reported content, they are presented with two choices:
- Block the content indefinitely, and notify the authorities if necessary;
- Block the content temporarily, put it on hold and notify the author so they can modify it accordingly.
On many websites, the report button is unfortunately misused. Healthy content then gets reported for the wrong reasons:
- I don’t agree with the author,
- I don’t like this author,
- I’m a troll and I like it…
Because of these unsound behaviors (or maybe the word "report" is not clear enough, hmm?), the moderation team can be overwhelmed rather quickly. Our system allows any moderator to permanently validate content considered "healthy". This Approved! stamp stays on as long as the content remains intact (no modification). Trolls will be free to keep trolling and playing with the report button, while our moderators chill on the terrace.
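One simple way to realize the "stays on as long as the content stays intact" rule is to tie the stamp to a hash of the content, so any edit silently invalidates it. The class and method names below are illustrative assumptions, not the platform's real API.

```python
# Illustrative sketch of the "Approved!" stamp: a moderator's approval
# is bound to a hash of the content at approval time. While the stamp
# is valid, further reports on that content can be ignored; editing the
# content changes the hash and the stamp lapses automatically.
import hashlib

class Post:
    def __init__(self, text):
        self.text = text
        self.approved_hash = None  # no stamp yet

    def _hash(self):
        return hashlib.sha256(self.text.encode("utf-8")).hexdigest()

    def approve(self):
        """Moderator stamps the current version as healthy."""
        self.approved_hash = self._hash()

    def edit(self, new_text):
        self.text = new_text  # any modification invalidates the stamp

    def is_approved(self):
        return self.approved_hash == self._hash()

post = Post("A perfectly healthy article.")
post.approve()
print(post.is_approved())  # True: new reports can be ignored
post.edit("A heavily rewritten article.")
print(post.is_approved())  # False: stamp lapsed, reports count again
```

The design choice here is that no one has to remember to revoke the stamp: invalidation is a consequence of the edit itself.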
Efficient moderation means finding the right balance between censorship and laxity, a balance that gaming networks unfortunately lack today. When the project was launched, one of the most important goals was to create a free, respectful and constructive space for all gamers. It is time for a game-changing platform; join us soon on Whisplay!
¹Godwin's law: an internet adage asserting that "As an online discussion grows longer, the probability of a comparison involving Nazis and Hitler approaches 1".
Pictures: The Whisplay Company, unsplash.com