Discord has released a new set of policies and community guidelines to tackle issues on the platform, including bad behavior, health misinformation, and hate speech.
This marks the first major policy update for the platform in almost two years, and it is designed to target users or groups who participate in harassment, violence, and the spread of harmful health misinformation, including anti-vaccination material.
Discord's New Policy
Discord is finally making its platform policies clearer, especially around health misinformation. The announcement comes two years after the pandemic began.
Under the new community guidelines, which take effect on Mar. 28, users may not post false or misleading information that may cause physical or societal harm.
This includes content that could injure others, damage physical infrastructure, or endanger public health.
The policy change is aimed at anti-vaxxers who post and promote misleading health information. However, this does not mean all anti-vaccine content will be removed from Discord.
Clint Smith, chief legal officer at Discord, said in an interview with The Verge that if a user posts a fake cure for COVID-19 or other diseases, the post will be removed because medical consensus does not support it as a treatment and following that advice carries a high risk of harm.
However, Smith stated that if a user posts about holding crystals, meditating, or exercising to improve lung health in the context of the pandemic, that is not something the platform is going to take action on.
Smith explained that crystals, meditation, and such exercises are likewise not supported by medical consensus as ways to fight COVID-19, but the risk of harm to users who follow that advice is very low, which is why the platform won't remove it.
In deciding when to take action, Discord said it would weigh the context of messages, the likelihood of direct harm, a poster's intent to convince others, and whether a poster has a history of repeated violations of the platform's policies.
Users can continue to report this content by clicking on a message and hitting the report button, according to The Conversation.
Monitoring the User's Behavior
Discord also stated that it would penalize users for harmful behavior outside the platform. That includes joining violent organizations, making threats of violence toward people, events, organizations, or locations, and joining groups that sexualize children.
The platform is limiting its enforcement of off-platform behavior to the highest-harm categories of organized violence and the sexualization of children, according to Security Magazine.
Smith explained that behaviors such as being charged with drug possession or being implicated in exam cheating are not classified as highest-harm off-platform behaviors.
The platform will focus on the off-platform behaviors that carry the highest potential for harm to its users and their online experience.
Research published by the Institute for Strategic Dialogue revealed that far-right groups have exploded on platforms like Steam and Discord.
These groups can organize real-world violence and participate in harassment. Discord has been banning such groups for years, but they keep finding their way back to the platform.
Related Article: Discord: iOS Users Face NSFW Content Ban on the App--An Effort to Comply with Apple's Policies?
This article is owned by Tech Times.
Written by Sophie Webster