How to handle online discussion - a guide from The Association of Norwegian Editors

The discussion of so-called hate speech, harassment, threats and bullying, and of the general content and tone of online comments, debate forums and social media, has become increasingly important in the public sphere. It is highly relevant to edited media. Many media outlets still provide comment sections beneath their published stories, and many have in addition established profiles on various social media, where audiences can express themselves directly, without prior moderation.

1. Clarify the rules for the debate and the debaters' own responsibility

The basis for all debate is that the participants are responsible for their own statements. This should be made clear to anyone considering taking part in online forums where audiences are invited to share comments or opinions.

Many media outlets have guidelines similar to the following:
"You are welcome to continue the discussion of this article. Please take into consideration how you present yourself to others and which expressions you use. A rule of thumb: Don't write anything that couldn't have been cried out in a public place with many listeners. You must use your full name; false profiles will be deleted. Stay on topic, and show others respect and generosity. Harassment, threats and hateful messages will be deleted." (Nordlys)

Or:

"iTromsø wants an open, constructive debate. We consecutively remove postings that are racist, harassing, unethical or illegal. We encourage all participants to argue reasonably and to show respect for the opinions of others, and we reserve the right to exclude participants who disobey our rules for participation." (iTromsø)

NRK Nordland goes further, emphasising the debaters' own responsibility:

"Stay within the law. Any post that may be legally problematic will be removed (libellous remarks, copyright infringements et cetera). Racism, gender discrimination, ethnic harassment or other hateful expressions will not be tolerated. You are responsible for your contributions to the online discussion; Norwegian Broadcasting Corporation is not." (Our emphasis.)

Verdens Gang does the same:

"However, we would like to inform you that when writing a comment, you are personally responsible for its content, whether you post it at our website or within your own closed network of Facebook friends."

In Budstikka, the guidelines start by clarifying the individual responsibility:

"Budstikka informs you that you are personally, legally responsible for your comments to our articles. This is valid whether you comment through our website or through the newspaper's Facebook page. This also means that the commenter is not under any form of source protection.

We warmly welcome all readers to comment our stories. At Budstikka.no, we want engaging discussion. Budstikka routinely reads comments and moderates postings in the commentaries. We demand a respectful and reasonable tone, and ask you to stay on topic. Harassment, nasty personal characterisations, hateful attacks or threats will not be accepted. Spam and commercial postings will be deleted.

You must comment using your own name. You can alert Budstikka if you see comments you think are against acceptable debate practice. Participants who do not follow the newspaper's rules may face exclusion."

At diskusjon.no, it is made expressly clear that users of the forum are not automatically under any form of source protection. This is obviously correct, but likely also often misunderstood:

"As user of the forum, you are not automatically under any form of source protection, but the editor may enter prior agreement with the user on source protection or independently decide on such. Diskusjon.no will provide police or the judicial system with personal information if a court order related to (possible) criminal actions under police investigation, if such information is relevant. (…) You are personally responsible for the content of the comments you publish."

In our opinion, far more media outlets ought to impress upon participants their personal responsibility for what they write, and also make clear that there is no automatic source protection attached to the option of commenting online. One idea is to require users to tick a box confirming that they have read the commenting guidelines before they are able to publish.
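As an illustration of that last suggestion, here is a minimal sketch in TypeScript, with hypothetical names, of how a comment form could refuse submission until the user has confirmed reading the guidelines and supplied a full name:

```typescript
// Minimal sketch (hypothetical names): refuse to accept a comment until the
// user has confirmed having read the commenting guidelines.
interface CommentSubmission {
  author: string;               // full name, as many of the guidelines above require
  body: string;
  guidelinesAccepted: boolean;  // bound to a checkbox in the comment form
}

function validateSubmission(c: CommentSubmission): string[] {
  const errors: string[] = [];
  if (!c.guidelinesAccepted) {
    errors.push("You must confirm that you have read the commenting guidelines.");
  }
  if (c.author.trim().split(/\s+/).length < 2) {
    errors.push("Please comment under your full name.");
  }
  if (c.body.trim().length === 0) {
    errors.push("The comment cannot be empty.");
  }
  return errors; // publish only when this list is empty
}
```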

Further, the Norwegian Press Complaints Commission has in some cases treated clear guidelines for online commenting as a mitigating circumstance. This is an additional reason why such guidelines matter.

2. Maintain a proper registration system

If you are to regulate the discussion, and to be able to trace forum users afterwards, a proper registration system is of the utmost importance. It should be organised in a way that makes it impossible to circumvent with fake profiles or false information.

Even if you allow users to participate under aliases or nicknames, the moderators ought to know their identities. This is essential if anyone is to prosecute users who have expressed themselves in a legally prohibited manner.

There are several ways to register users, for instance through a Facebook account, Disqus, aID and others. We find it difficult to recommend one system over another, but we do advise you to spend some time finding the system that suits you best.

Also, be aware that some media outlets - Nettavisen, for instance - have built systems of their own, as they do not consider Facebook secure enough because of the possibility of fake profiles.

3. Consider which stories are appropriate for debate

Consider carefully which issues/stories are appropriate for debate, and do not open more debate threads than can be followed by the moderators.

It is no longer the case that practically every news story published online is open for comments, but the point still bears repeating. Even though one can never fully control where unfounded, offensive comments will show up, it is clear that certain topics attract them more often than others. Naturally, we cannot make a complete list, but in general there is reason to warn against opening reader comments on stories concerning accidents and crime. The same may be true of stories about specific cases within children's services, mental health and related sectors, and stories closely connected to individuals' private lives.

To best take these considerations into account, we recommend that editors establish guidelines covering which stories should offer the option to comment, and which should not.

These guidelines ought to be as clear as possible. It is of great importance that editorial employees familiarise themselves with the guidelines.

4. Strengthen moderation in certain contexts

Related to the previous point are stories where it can be difficult to turn off comments - because they deserve to be discussed and reader interest is high - but where the risk of illegal or unethical comments is also high and the discussion can easily take an unwanted direction.

Some typical topics are immigration and integration, religion and sex; there are more. The Middle East conflict is a typical example of an international issue that generates significant interest, where that very same interest can produce statements one does not want among the reader comments.

5. Use filter programs and alert buttons

There are a number of filter programs on the market. Calibrating them so that they react to what you want them to react to can be difficult, but we recommend considering them as a possible measure all the same.

The same applies, not least, to giving commenters the opportunity to report others' posts when they find them in conflict with the law or with ethics. The latter is likely an even more effective measure than the former, even though it can be difficult to administer when some users decide to report everything they disagree with.

Most media outlets that provide readers with the opportunity to comment also provide alert buttons, a measure we do recommend. When choosing to delete posts, it is important to inform the users of the reason.
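To make the two measures concrete, here is a small TypeScript sketch of a crude keyword filter combined with a report counter; the word list and the threshold are placeholders, not recommended values, and real filter programs are considerably more sophisticated:

```typescript
// Sketch only: a crude keyword filter plus a report counter.
// The word list and the threshold are placeholders; flagged posts still
// need human review before any deletion.
const watchlist = ["placeholder-term-1", "placeholder-term-2"];

function needsManualReview(body: string): boolean {
  const text = body.toLowerCase();
  return watchlist.some((word) => text.includes(word.toLowerCase()));
}

interface ModeratedPost {
  body: string;
  reportCount: number; // incremented each time a reader presses the alert button
}

const REPORT_THRESHOLD = 3; // placeholder value

function shouldHidePendingReview(post: ModeratedPost): boolean {
  // Hide the post from the public feed until a moderator has looked at it.
  return post.reportCount >= REPORT_THRESHOLD || needsManualReview(post.body);
}
```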

6. Provide the opportunity to grade comments

Several media outlets have found it useful to allow their commenters to grade posts.

Of course, this is not an infallible method of prioritising or grading comments by how constructive they are, but it is worth a try.

Inviting commenters to pay attention to the posts they find valuable, rather than those they disagree with the most or find offensive, may help improve the discussion. This can be combined with rewarding commenters who follow the rules with greater visibility.

7. Take part in the debate!

Moderators, journalists and editors ought to take part in the debate.

Online comments can frequently lead to developments in news stories. For this reason, it is important to encourage moderators, journalists and editors to take part in the discussion. Media outlets that have made this a priority have found that it helps calm the most extreme commenters, and that it helps keep the debate on topic.

Journalists and editors may find participation challenging, because it is easy to slip into sharing one's opinions on the underlying issues. Keeping the roles apart is essential. It should, however, be possible to discuss the journalism itself, and questions related to its content or relevance, without crossing into partiality by asserting one's own opinions.

The tone one uses when entering the discussion can influence the debate culture. Representatives of the media outlet ought to use a polite but unequivocal tone. Taking part in dialogue with commenters can help keep the discussion well founded and build a shared understanding of the topic.

8. Close the debate at times

There is no law stating that the option to comment must be kept open around the clock.

In our opinion, comment sections should only be open when editorial staff with a real opportunity to supervise the discussion and follow up alerts from users are at work. For most media outlets, this means closing the comment sections at night (even vg.no does that), and during the parts of the weekend when few or no employees are on duty.

From experience, we know that the most extreme expressions occur at times when few or no employees are available at the media outlet.
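For outlets that automate the opening and closing, a minimal TypeScript sketch of an "is the comment section open now?" check could look like the following; the hours and days are placeholders and should mirror your actual staffing:

```typescript
// Sketch: keep the comment section open only during staffed hours.
// The hours and days below are placeholders, not a recommendation.
const OPENING_HOUR = 7;   // 07:00 local time
const CLOSING_HOUR = 23;  // 23:00 local time
const STAFFED_DAYS = new Set([1, 2, 3, 4, 5]); // Monday to Friday

function commentsOpen(now: Date = new Date()): boolean {
  const hour = now.getHours();
  const weekday = now.getDay(); // 0 = Sunday, 6 = Saturday
  return STAFFED_DAYS.has(weekday) && hour >= OPENING_HOUR && hour < CLOSING_HOUR;
}
```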

9. Exclude transgressors

For most edited media, exclusion of individual users is the strongest sanction.

Many use it actively. For major outlets with much user-generated content and high activity, as many as 50 individuals can be excluded at any time.

The duration of the exclusion varies, but most outlets operate with a three-month exclusion for first-time offenders. We think longer exclusions should be considered for repeat offenders.
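As an illustration only - the escalation for repeat offenders below is an assumption, not a rule from this guide - an exclusion with an expiry date could be recorded roughly like this in TypeScript:

```typescript
// Sketch of an exclusion record with an expiry date (hypothetical data model).
interface Exclusion {
  userId: string;
  reason: string;
  expires: Date;
}

function excludeUser(userId: string, reason: string, previousOffences: number): Exclusion {
  // Three months for first-time offenders; the longer periods for repeat
  // offenders are an assumption for illustration.
  const months = previousOffences === 0 ? 3 : 3 * (previousOffences + 1);
  const expires = new Date();
  expires.setMonth(expires.getMonth() + months);
  return { userId, reason, expires };
}

function isExcluded(exclusion: Exclusion, now: Date = new Date()): boolean {
  return now < exclusion.expires;
}
```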

10. Turn off search engines and refrain from posting in social media

As a last point, we mention the option of refraining from sharing a specific story on social media, and of keeping it out of search engines.

This can be a double-edged sword, as it also reduces the traffic and attention the story can generate. Nor does it make the unwanted material impossible to find; it only becomes a bit more difficult. The advantage is that one can more easily control the reach and retrievability of illegal or unethical comments. As an example, VG never shared its award-winning 2015 story about a teenage suicide on social media.
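On the technical side, keeping a single story out of search engines is typically done with a robots directive. The sketch below, assuming a plain Node HTTP handler and a hypothetical list of excluded paths, shows the standard X-Robots-Tag header and the equivalent meta tag:

```typescript
import * as http from "http";

// Hypothetical list of story paths that should not be indexed by search engines.
const EXCLUDED_PATHS = new Set(["/story/sensitive-example"]);

http
  .createServer((req, res) => {
    const excluded = req.url !== undefined && EXCLUDED_PATHS.has(req.url);
    if (excluded) {
      // Standard robots directive: well-behaved crawlers will not index the page.
      res.setHeader("X-Robots-Tag", "noindex, nofollow");
    }
    res.setHeader("Content-Type", "text/html");
    // The same signal can be placed in the page itself:
    // <meta name="robots" content="noindex, nofollow">
    res.end("<html><body>Story content goes here.</body></html>");
  })
  .listen(8080);
```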

We want to thank all those who have contributed to the work on this guide, with special thanks to editors Halvor Finess Tretvoll, Erik Stephansen, Tone Tveøy Strøm-Gundersen and Geir Ramnefjell, and to attorney Jon Wessel-Aas, for their advice.

If you would like to give input, please contact us at The Association of Norwegian Editors.
