Toxic players who enjoy harassing others through in-game communications are in for an equally unpleasant awakening.

As the creators and publishers of some of the biggest online multiplayer games, Ubisoft and Riot Games are no strangers to dealing with toxic players. Fortunately, the two companies have joined forces for Zero Harm In Comms, a research partnership aimed at combating bad behavior online. The project will use artificial intelligence to collect and process in-game chat data and train detection tools to identify instances of online bullying, in order to "promote more rewarding social experiences and prevent harmful interactions."

Ill-mannered gamers will no longer be able to hide behind the anonymity of their keyboards. The research partnership aims to create a system in which information about offending players is stored in a database and shared across the industry.

Three Project Q characters fighting in a park

(Image credit: Ubisoft)

This is it

You've probably heard the term "toxic" used to describe certain types of online gamers. Whether it's referring to specific gaming techniques that are frowned upon by the general gaming crowd or a case of genuine abuse or bullying, it's hard to put a face to a screen name in the age of online anonymity.

Although anti-cheat software has been introduced into many major online PvP games, it doesn't tend to notice when players are actively harming others with their words. The narrow space between swearing and bullying is often filled by the kind of trolls you'd expect to find only in fairy tales.

Reyna and Jett from Valorant

(Image credit: Riot Games)

Yo, chatbot

In Ubisoft La Forge's announcement, executive director Yves Jacquier expresses his empathy for players who find themselves in these dire positions. "Disruptive player behavior is a problem that we take very seriously, but one that is also very difficult to solve," he says, referring to how existing systems continually fail to identify and punish users for their misbehavior. "We believe that by coming together as an industry, we will be able to address this issue more effectively."

And they will come together, as Riot Games' head of technology research, Wesley Kerr, gushes. "We are committed to working with industry partners like Ubisoft who believe in creating safe communities," he says. This partnership with Ubisoft is just one example of the "broader commitment and work [Riot Games are] doing... to develop systems that create healthy, safe, and inclusive interactions."

The AI software will work by taking chat logs from across Ubisoft's and Riot's games, stripping out any sensitive personal information, and then tagging each log based on the behavior displayed. All of this data will be used to train AI models to detect players who violate community guidelines.
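Neither company has shared implementation details, but that description points to a fairly standard data-labeling pipeline: scrub identifying details from each chat line, attach a behavior tag, and use the result to train a classifier. The short Python sketch below is purely illustrative; the label set, regular expressions, and function names are assumptions, not anything Ubisoft or Riot has published.

import re

# Hypothetical behavior tags a moderation team might apply to chat lines.
LABELS = {"neutral", "trash_talk", "harassment", "hate_speech"}

# Rough patterns for personal details that would be scrubbed before a log
# is shared outside the game it came from.
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
MENTION_RE = re.compile(r"@\w+")

def anonymize(line: str) -> str:
    """Replace obvious personal identifiers with placeholder tokens."""
    line = EMAIL_RE.sub("<email>", line)
    line = MENTION_RE.sub("<player>", line)
    return line

def build_training_record(line: str, label: str) -> dict:
    """Pair a scrubbed chat line with its behavior tag for later model training."""
    if label not in LABELS:
        raise ValueError(f"unknown label: {label}")
    return {"text": anonymize(line), "label": label}

if __name__ == "__main__":
    raw = "@xXSniperXx you are garbage, uninstall. contact me at mad@example.com"
    print(build_training_record(raw, "harassment"))
    # {'text': '<player> you are garbage, uninstall. contact me at <email>', 'label': 'harassment'}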

Sure, some games have made strides in combating bad manners on their own, but having your gear taken away in the middle of a Call of Duty: Modern Warfare 2 match hardly feels like a deterrent. The knowledge that your comments are being logged, flagged, and shared across the industry will hopefully make disreputable players think twice before sounding off in post-game chat.
