Harsher Punishments Are Coming to Valorant for Voice and Chat Toxicity
Valorant developer Riot Games has posted a blog update pledging harsher, more immediate punishment for voice and chat toxicity.
Last year, Riot Games introduced the Valorant Systems Health Series, designed to track and improve the game's systems while addressing problems such as AFKs, gameplay toxicity, smurfing, and matchmaking fairness.
Below, we'll cover the second article in the series, titled Valorant Systems Health Series - Voice and Chat Toxicity.
Harsher Punishments Coming to Valorant
"While we can never remove the bad conduct itself from individuals, we can work to deter behavior such as insults, threats, harassment, or offensive language through our game systems," the developers wrote. "...we want to walk you through some of the steps we’ve taken (and the measured results), as well as the additional steps we’re planning to take to improve the chat and voice experience in VALORANT."
Check out everything they've worked on within the past few months below.
What’s Been Done So Far?
Player Feedback
The developers write that the first form of detection within their system is player feedback.
Any reports made within Valorant are used to administer punishments, and the developers keep track of that data.
Muted Words List
Recently, Riot Games added the Muted Words List, a feature that allows players to mute words and phrases that they don't like.
The Valorant developers wrote, "This serves two functions: First, knowing that our automatic detection systems aren’t perfect, we want to give some agency to you to filter unwanted communication in the game. Second, we plan on incorporating the words that you filtered using the Muted Words List into our automatic detection in future iterations of the system..."
What Work Is Planned?
1. Generally harsher punishments for existing systems
"For some of the existing systems today to detect and moderate toxicity, we’ve spent some time at a more “conservative” level while we gathered data (to make sure we weren’t detecting incorrectly)."
Riot Games states that this should result in quicker punishment for players acting out of line.
2. More immediate, real-time text moderation
"While we currently have automatic detection of 'zero tolerance' words when typed in chat, the resulting punishments don’t occur until after a game has finished."
The team behind Valorant is looking into ways for punishments to be administered immediately after offenses occur.
3. Improvements to existing voice moderation
"Currently, we rely on repeated player reports on an offender to determine whether voice chat abuse has occurred."
The Valorant developers acknowledge that voice chat abuse is harder to detect than text chat abuse.
The Valorant team writes, "Instead of keeping everything under wraps until we feel like voice moderation is “perfect” (which it will never be), we’ll post regular updates on the changes and improvements we make to the system."
Those interested can keep an eye on the next update around the middle of 2022.
4. Regional Test Pilot Program
Valorant's Turkish team released a local pilot program to "try and better combat toxicity in their region."
How Can Players Help?
Players can help make these systems better.
"Please continue to report toxic behavior in the game; please utilize the Muted Words List if you encounter things you don’t want to see; and please continue to leave us feedback about your experiences in-game and what you’d like to see," the Valorant team wrote. "By doing that, you’re helping us make VALORANT a safer place to play, and for that, we’re grateful."