TakeTwo is an app that aims to reduce racial bias, whether overt or subtle, in digital content, with a focus on text across news articles, headlines, web pages, blogs, and even code. The solution provides a consistent set of language recommendations, drawing on directories of inclusive terms compiled by trusted sources.
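The core idea, matching text against a curated directory of terms and suggesting inclusive alternatives, can be sketched as follows. This is an illustrative sketch only, not TakeTwo's actual implementation: the `TERM_DIRECTORY` entries and function name are hypothetical stand-ins for the directories compiled by trusted sources.

```python
import re

# Hypothetical directory for illustration: term -> (category, suggested alternative).
# A real deployment would load curated lists from trusted sources.
TERM_DIRECTORY = {
    "blacklist": ("race-related", "denylist"),
    "whitelist": ("race-related", "allowlist"),
    "master branch": ("race-related", "main branch"),
}

def flag_terms(text):
    """Return (term, category, suggestion) for each directory term found,
    matching case-insensitively on word boundaries."""
    findings = []
    for term, (category, suggestion) in TERM_DIRECTORY.items():
        if re.search(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            findings.append((term, category, suggestion))
    return findings

for term, category, suggestion in flag_terms(
    "Add the host to the whitelist before merging to the master branch."
):
    print(f"{term!r} ({category}): consider {suggestion!r}")
```

The key design point this sketch reflects is that the recommendations come from a maintained directory rather than from the tool itself, which keeps the suggestions consistent and attributable to trusted sources.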
Background: In response to the murder of George Floyd, IBM hosted a Call for Code for Racial Justice challenge to develop socio-technical solutions that address racial bias. Our team focused on the problem statement "Bias can be learned and perpetuated in different ways (e.g., societal beliefs, misrepresentation, ignorance) that consequently may create inequitable outcomes across all spheres of life." We wanted to leverage technology to mitigate bias in content, address bias detection in the media, and educate people in the process. Our intention was to give people context for the words and phrases they use so they can choose what they say with awareness. Ultimately, we hoped this would minimize bias perpetuated through content, particularly bias stemming from ignorance.
From July 2020 to March 2022, my team and I developed a starter solution grounded in extensive user and social science research, released it to the open source community under the Linux Foundation, and promoted TakeTwo across multiple channels to attract multidisciplinary contributors.