Articles, notes, and symposia pieces published in CLR’s print volumes.

Print Edition

Volume 114, April 2026, Fanna Gamal, Article, California Law Review

The Algorithmic Racial Proxy

To comply with the colorblind impulses of American antidiscrimination law, computer programmers tend to exclude race as a data input when constructing a machine learning algorithm. Yet scholars and advocates consistently argue that even these formally race-blind algorithms can racially discriminate by relying on so-called “proxies for race,” or variables that have a strong correlation with race, such as zip code, income, or prior criminal arrest. While a programmer wishing to respond to this argument might attempt to remove both race and all racial proxies from input data, their task is complicated by a key dilemma: The definition of a racial proxy is far from obvious. This Article examines the myriad definitions of a racial proxy proffered by courts, scholars, and state and private actors to demonstrate how race and racial assumptions become embedded in the machine learning algorithms that increasingly structure human life. Ultimately, what is at stake in the ability to define a racial proxy is a novel form of algorithmically driven racial construction, which permits the production of new and meaningful classes of individuals that can later be exposed to differing resources, opportunities, subordination, and privilege.
