A review of Ruha Benjamin’s “Race After Technology”

itsnotmyfault
Apr 10, 2022

Ruha Benjamin’s book is truly a wonder of the modern academic landscape. As its subtitle (Abolitionist Tools for the New Jim Code) implies, it explicitly references Michelle Alexander’s (2012) book “The New Jim Crow”, a deep dive on how mass incarceration disproportionately targets black people, pressures them into guilty pleas that label them as felons for life, and thereby systematically disenfranchises and impoverishes them. “The New Jim Crow” is a classic that is foundational to the current understanding of “systemic racism”, and it takes a very data-driven sociological view in making its arguments for legal reform.

As I [previously stated](https://www.reddit.com/r/BlockedAndReported/comments/ol72zh/book_club_reminder_testosterone_rex_july_24_1pm/h5eno1e/?context=3), I’ve read Weapons of Math Destruction by Cathy O’Neil and thought it was very good. It’s a pretty suitable description of how the systems-level thinking popularized in part by Alexander extends to technological solutions intended either to minimize human bias or to lower corporate labor costs in decision-making. The by-now classic example: if a job-applicant-sifting algorithm is trained on historical data about who made a good employee, it will likely learn that college-educated, straight, white males are better employees… but it will likely only reveal this preference behind closed doors, via proxies like “home zip code” or “credit score”, rather than through information it is illegal to discriminate upon (such as race or marital status). The “Weapons of Math Destruction” of the title are the “New Jim Crow”-esque self-reinforcing feedback loops these systems create: if I can’t get a job at [Company A] because they said my credit score was too low, I can’t earn enough income to pay my bills on time, and my credit score drops further.
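That feedback loop is easy to make concrete. Here’s a toy simulation of my own (the thresholds and point values are made up for illustration, not taken from O’Neil’s book) showing how two applicants with nearly identical starting scores end up on diverging trajectories once a hiring screen keys on credit score:

```python
# Toy model of the credit-score feedback loop: a hiring screen rejects
# low scores, rejection means missed bills, and missed bills lower the
# score further. All numbers are invented for illustration.

def simulate(score, threshold=650, rounds=5):
    """Return the score trajectory over several application cycles."""
    history = [score]
    for _ in range(rounds):
        if score >= threshold:
            score = min(850, score + 20)   # hired: steady income, bills paid, score rises
        else:
            score = max(300, score - 30)   # rejected: bills missed, score falls
        history.append(score)
    return history

print(simulate(660))  # starts just above the bar: climbs every round
print(simulate(640))  # starts just below: spirals downward
```

Two people separated by twenty points at the start end up separated by hundreds, purely through the loop itself.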

Building upon these works, Benjamin’s book is, as I said, truly a wonder of the modern academic landscape. Despite being only a few years old, it has racked up an incredible number of citations and immediately pushed Benjamin into the spotlight. In my view, a large portion of it is a retread of the same ground as O’Neil’s book, but adapted into the academic language of a humanities graduate instead of a Wall Street quant. In stark contrast to O’Neil’s and Alexander’s intense focus on clearly presenting the data and calmly explaining the brutal logic of the system-level operation, Benjamin’s book is a masterful display of the “counter-storytelling” approach that is a major pillar of Critical Theory in academia.

And by that I mean that I think you’d have to have a Ph.D to buy into some of the stuff written here, because from my dum-dum perspective it’s literally sophistry and word games. Most specifically, chapter 3, “Coded Exposure: Is Visibility a Trap?”, begins with a few definitions of the word exposure: the amount of light per unit area, the disclosure of something secret, the condition of being unprotected, the condition of being at risk of financial loss, and the condition of being presented to view or made known.

> Some technologies fail to see Blackness, while others render Black people hypervisible and expose them to systems of racial surveillance. Exposure, in this sense, takes on multiple meanings. (p. 99, paper copy)

I felt like I was being trolled, because it was more or less explicitly saying “we’ll be using a lot of sophistry”. And sure, it’s a convenient way to bring up a dozen different things in a row and pretend they’re all connected, because you can use the same English word to describe them. After EXPOSING film, my black subjects are not VISIBLE: racist. Automated surveillance and police facial databases disproportionately contain black faces? This increases black people’s EXPOSURE to risk and EXPOSES them to view, forcing them into VISIBILITY: racist. If an AI system is trained primarily on Asian and white faces and doesn’t identify darker people as accurately: it also EXPOSES black people to greater risk, makes them not VISIBLE, and is also racist. The body of Saartjie Baartman used as a freak show attraction, scientific subject, museum exhibit, etc.: routinely EXPOSED and made VISIBLE and racist. After much legal wrestling, France agrees to give back the body, but isn’t 100% sure which bones are hers, and the decision is made to just never figure it out, because DNA testing would further EXPOSE her to invasive VISIBILITY and be racist. It continues on like that into the next chapter, which does the same trick with FIX, as in a bug fix, fixed in place, to fix a problem, fixin’ to go do something, and to fix a dog.

It’s not that I disagree that each individual instance is a problem; my complaint is that the connection to “exposure” or “visibility” or “fix” is just surface-level window dressing. In my head I can see that it’s just good writing, a way to keep each instance from ending abruptly with an “and now, for something completely different”, but my experience while reading is that I’ve now gone through every permutation of “white/black person was exposed/not exposed previously and is now more/less exposed, and that is racist”. It starts to feel like every possible combination of actions leads to the singular conclusion. Tying the examples together with a set of key words that ultimately provide no predictive power is a serious issue if one’s goal is to help others understand the situation better, but it seems like the goal is simply to have others perceive you as an incredible wordsmith.

Thank you, Dr. Benjamin. You’ve got a very high verbal IQ and it really shows, but I have just a few quick questions. If it was racist for Kodak to have film that didn’t capture black people well, is it ONLY racist for them to start making film that captures black people well if they make it for a police contract, or is it always racist because they’re trying to extract a profit out of black people? Or is it more racist still if they stick with their first product forever, never developing a product for darker skin tones? Should I be assuming that there’s some medium level of exposure, simply too boring to be discussed in the book, that I should be aiming for?

Part of my frustration is that she’ll bring up a particular example that I know a thing or two about, but instead of actually talking about the technical details of how such a thing came to occur (which I know to be interesting and revealing), she’ll sort of just… immediately move on to focus on race again. There’s only ever a cursory acknowledgment that something deeper is going on in the particular example; our focus must always stay at the macro level, and the only explanation on offer is racism. An early example discusses the infrared sensors that are ubiquitous in automatic soap dispensers, self-flushing toilets, and hands-free sinks. She recounts a viral moment where a dispenser worked for a white woman but not a black woman, compares this to “whites only” bathrooms, and then walks it back to a more grounded discussion of the underlying physics.

> That said, there is a straightforward explanation when it comes to the soap dispenser: near infrared technology requires light to bounce back from the user and activate the sensor so skin with more melanin, absorbing as it does more light, does not trigger the sensor. But this strictly technical account says nothing about why this particular sensor mechanism was used, whether there are other options, which recognize a broader spectrum of skin tones, and how this problem was overlooked during development and testing, well before the dispenser was installed. Like segregated water fountains of a previous era, the discriminatory soap dispenser offers a window onto a wider social terrain. (p. 68, paper copy)

To the first part, I would say “yes, and that’s exactly the kind of book I would rather be reading.” She knows exactly the questions to ask! Would it kill her to just tell me whether there’s another technology that’s better for touchless detection at that range, and whether it was as cheap as IR rangers at the time they were being popularized? Maybe ultrasonic rangers, though I thought people were using laser/IR much earlier. Were these products driven by (white) American/European companies, or is this more a case of “Japanese automotive automation made IR rangers really cheap and people started thinking about other uses for them”? Do early generations of these products keep a constant, arbitrary threshold, while later ones account for ambient light conditions and adjust so that they can detect black hands more reliably? It’s a really interesting technical question!
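The threshold question is concrete enough to sketch. Here’s a back-of-the-envelope model of my own (the reflectance numbers are invented, and I don’t know which design real dispensers actually use) contrasting the two designs I’m speculating about: a fixed absolute threshold versus one that measures the ambient baseline and triggers on the *change* in reflected signal.

```python
# Toy model of two IR proximity-sensor designs. All numbers are invented
# for illustration; real sensor specs will differ.

AMBIENT = 0.10  # baseline IR reaching the sensor with no hand present

def reflected(skin_reflectance, ambient=AMBIENT):
    """Signal at the sensor: ambient light plus light bounced off the hand."""
    return ambient + skin_reflectance

def fixed_threshold_trigger(signal, threshold=0.45):
    # Naive design: fire only if the raw signal clears a hard-coded bar.
    return signal >= threshold

def ambient_compensated_trigger(signal, ambient=AMBIENT, delta=0.05):
    # Alternative design: fire on any meaningful rise above the measured baseline.
    return (signal - ambient) >= delta

light_hand = reflected(0.60)  # high-reflectance skin
dark_hand = reflected(0.15)   # low-reflectance skin (more melanin absorbs more IR)

print(fixed_threshold_trigger(light_hand), fixed_threshold_trigger(dark_hand))
print(ambient_compensated_trigger(light_hand), ambient_compensated_trigger(dark_hand))
```

Under these made-up numbers the fixed threshold fires for the light hand but not the dark one, while the baseline-relative design fires for both, which is the failure mode and the fix the viral video seems to point at.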

To the second part, I have just one thing to say: did you know that those IR sensors will get set off by anyone wearing a high-visibility safety jacket that so much as comes into the same zip code? So, while the robot sinks may be racist, they’re certainly not classist.

Book club members may be amused to see both John McWhorter and Kathryn Paige Harden make appearances. Crossposted at substack: https://itsnotmyfault.substack.com/p/a-review-of-ruha-benjamins-race-after?s=w
