Algorithmic misogynoir in content moderation practice


"Algorithmic misogynoir in content moderation practice," by Brandeis Marshall, offers an intersectional perspective, exploring the discrimination specifically faced by Black women in the United States.


This paper is one of two complementary articles addressing the cultural, technical, and structural problems in current content moderation practices – particularly the double standards faced by those from marginalized communities. Taken together, the papers offer a transatlantic perspective on this increasingly important global issue.

"The state of content moderation for the LGBTIQA+ community and the role of the EU Digital Services Act," by researcher Christina Dinar, focuses on the challenges faced by the LGBTIQA+ community in Europe and offers detailed recommendations for the forthcoming Digital Services Act.

While we are one human race, social, economic, and political constructs impose a hierarchy that centers white colonialist culture and white patriarchy through manipulation, coercion, or force. These social, economic, and political structures are not designed for equity. Rather, they’re designed to devalue everything but white culture. As Gulati-Partee and Potapchuk discuss in their work on advancing racial equity, “[r]acial disparities are driven and maintained by public- and private-sector policies that not only disadvantage communities of color but also over-advantage whites.”

We exist in a global society that exudes anti-Blackness – and this is true for online spaces as well. The tech industry is monopolized by white men in ownership, leadership, and the workforce. When building businesses, Black women receive funding at levels 20 times lower than the national median across all demographics of business owners – a measure of how much harder it is for Black women to raise the capital they need. Men account for 70-80% of the tech workforce and leadership. This industry has built a well-documented culture of toxicity, in which men hold all the cards and make all the rules, and covert structural racism persists. And this toxic tech culture is perpetuated in the classroom when computer science educators are not committed to changing it. These circumstances make the pursuit of equitable practices tenuous at best.

Being a Black woman – offline and online – means experiencing two extremes: 1) no one pays attention to what you say or how you say it, or 2) your words are fodder for scrutiny, surveillance, and judgment. Being simultaneously invisible and hypervisible is a consistent theme (as explored in Dr. Tressie McMillan Cottom’s Thick) and part of what Moya Bailey calls misogynoir (Crunk Feminist Collective, March 2010), the “anti-Black racist misogyny that Black women experience.” The Abuse and Misogynoir Playbook sheds light on a five-phase cycle of disbelieving, devaluing, and discrediting the contributions of Black women as the historical norm. Algorithmic misogynoir builds on Bailey’s description to identify how these interactions play out online and in code for Black women.

In digital spaces, Black women’s presence, experiences, and interactions include everything from communicating our accomplishments to sharing our trauma. But now, the interactions and responses come extremely quickly, from real, bot, and troll profiles from anywhere on the globe.

Black people and women are more likely to be the targets of online harassment than their white and male counterparts. Black women have been criticized internationally for their scholarship, have had posts removed for statements that were met with more scrutiny than the posts of their white counterparts, and have been suspended or banned from platforms for speaking out against any form of algorithmic discrimination.

Black women’s distinct circumstances aren’t recognized and centered. Shireen Mitchell’s 2018 Stop Online Violence Against Black Women report showed how online campaigns using Facebook ads were created to disparage Black girls and women with sexualized memes, hashtags, and fake accounts to help spread disinformation ahead of and during the 2016 U.S. Presidential Election. And Charisse C. Levchak’s Microaggressions and Modern Racism documents anti-Black racism at the micro and macro level, both via in-person and online interactions. Levchak points to racial slurs and other forms of hate speech used by micro-aggressors and macro-aggressors on the internet. She specifically calls on organizations to enact policies and procedures to address instances of racism, penalize these micro/macro-aggressors, and support survivors of racism.

All this falls under what Matamoros-Fernandez calls “platformed racism,” which is “a new form of racism derived from the culture of social media platforms – their design, technical affordances, business models and policies – and the specific cultures of use associated with them.” Content moderation can work towards creating inclusive, welcoming spaces for Black women, but current practices embrace misogynoir and then deploy it algorithmically.