Without a doubt, playing online can be a great experience with the right people, but it can also be real torture if the community is toxic and every few minutes we have to endure insults or abuse. Bleep is a project that Intel will launch soon that uses artificial intelligence to mitigate this problem.
The Californian company has been working on Bleep for more than two years, the result of a collaboration with Spirit AI, a firm specialized in software that removes abusive language from text chats. The challenge now is to build a solution that does the same with voice, and in real time.
A few days ago, those attending the GDC 2021 Showcase saw a beta demo of Bleep during the session "Billions of Gamers, Thousands of Needs, Millions of Opportunities." The software is capable of recognizing and removing or replacing offensive words or expressions in real time, for the moment only in English.
Bleep is also customizable. The user will be able to decide whether or not they want to hear certain expressions, and to what degree. Some of the categories shown are verbal abuse, racism and xenophobia, profanity, hate toward the LGBTQ+ community, and sexually explicit language. For each category we can choose between none, some, most, or all.
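Intel has not published how Bleep's filtering works internally, but the per-category settings described above can be sketched as a simple lookup-and-threshold filter. Everything here is an illustrative assumption: the category names, the severity scale, and the toy lexicon are invented for the example, not Intel's actual implementation (which operates on live voice audio, not text).

```python
# Hypothetical sketch of per-category filtering in the style of Bleep's
# settings. Categories, severities, and the lexicon are illustrative only.

# User-facing levels mapped to a severity threshold: words at or above
# the threshold get bleeped. "none" filters nothing.
LEVELS = {"none": 4, "some": 3, "most": 2, "all": 1}

# Toy lexicon: word -> (category, severity from 1 = mild to 3 = severe)
LEXICON = {
    "darn": ("profanity", 1),
    "jerk": ("verbal_abuse", 2),
}

def bleep(text, settings):
    """Replace words whose severity meets the user's per-category level."""
    out = []
    for word in text.split():
        entry = LEXICON.get(word.lower())
        if entry:
            category, severity = entry
            threshold = LEVELS[settings.get(category, "none")]
            if severity >= threshold:
                out.append("*" * len(word))  # censor the word
                continue
        out.append(word)
    return " ".join(out)

# With profanity fully filtered but verbal abuse allowed:
print(bleep("you darn jerk", {"profanity": "all", "verbal_abuse": "none"}))
# -> you **** jerk
```

The real product would need speech recognition and audio suppression rather than string matching, but the sliding per-category threshold is the core of the user experience Intel demonstrated.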
Obviously, Bleep will not solve a problem that goes far beyond choosing "how much racism do you want to endure" by moving a slider, as Intel itself acknowledges. At the same time, the company considers it a step in the right direction, one that will let players better control their experience.
Roger Chandler, vice president of Intel, said in a statement that the beta version of Bleep will be available on laptops and desktops with the latest Intel processors, so very soon we will be able to test how far an AI can improve our online gaming sessions.
More information | Vice