Recently, multiple companies have been working on AI algorithms and databases to stop “bad people” from facilitating “hate” around the web and in the workplace. In late 2019, Hatebase announced that it wants to eliminate online hate speech and keep up with “white nationalist lingo.” In mid-May 2020, Facebook announced the Hateful Memes Challenge, which aims to stop “offensive memes and hate speech.” And as of this week, the International Game Developers Association (IGDA) is pushing to tackle crunch and support diversity by using a database to report “bad behavior.”
With a growing number of databases and AI algorithms being trained to curtail “wrongthink” or bad behavior, expect more of these programs to emerge in the near future.
In the meantime, gamesindustry.biz reports that Renee Gittins — executive director of the IGDA (far right in the header image) — is gunning to tackle four “key problems” in the industry: “event diversity,” “crunch,” “ethics,” and “proper credit for completed work.”
The IGDA and Gittins claim they want to “help improve the industry and make it supportive of everyone within it.” This will supposedly manifest through a database that gathers reports of bad behavior.
Below, Gittins explains how she and the IGDA want to apply pressure to the games industry on these four key issues, since said industry is “in need of help”:
“We thought it would be good to put them forward all together, so they have the most effect, with a plan to apply pressure to encourage companies to really consider them. We want these to be conversations that people are having, and clearly right now is a huge time of need within the game industry.”
Furthermore, employees will be able to call out their company via a report. That report will be transformed into an anonymized account, which will then be added to the database:
“We’re going to be tracking reports, if people say that a company has been in explicit violation of those standards.”
These accounts will be tracked to keep tabs on a company’s behavior over time. Gittins explains:
“If anyone is thinking of working with [a company] in the future, or there are some questions about how they’re behaving, people can… get a history of how they behaved in comparison to the standards the IGDA is putting forward. This is going to evolve based on the needs of the community, and tracking that data is going to be important in the long-term.”
Unlike other systems that may be loud in concept and practice, the method Gittins and the IGDA propose will take place quietly. In other words, a company will be nudged over time toward current-year standards by accruing badges, without much noise reaching the public:
“We’re not looking to call people out — we’re looking to improve the industry. Our mission is to support and empower game developers in having fulfilling and sustainable careers. In order for that to happen, the companies they work for both have to be sustainable, as in continuing to operate, and they have to be sustainable to work at, as in not burning out their employees.
However, one thing we’re working towards creating, on the opposite side of the spectrum, is a badge system. Our studio partners who have done particularly well in crediting [their staff], or who have done really well in supporting those on paternity and maternity leave, there will be badges to celebrate that they’re supporting employees with best practices.
We’re trying to move things forward and not call out specific companies — we want to improve all of the companies.”
Lastly, don’t expect the IGDA to be the last organization to bring forth a database and AI algorithm for tracking “bad behavior” in the games industry. With that said, Gittins closes out by saying the following:
“We’re doing something new on reaching out to companies, working with them, and providing resources. And truly being a voice for game developers, and putting forth their best interests in one place.”