
Value-Coded: How a Historical Lens and Intersectionality Met

When the algorithm of worth becomes visible, the politics of value can finally be rewritten.

By Cherokee Schill

The Paradox That Named the Gap

In 1976, five Black women sued General Motors for discrimination. The company argued that because it hired Black men for the factory floor and white women for clerical work, it could not be racist or sexist. The court agreed and dismissed the case. What it failed to see was the intersection where those forms of discrimination combined: there were no Black women secretaries because neither category accounted for them. Out of that legal blind spot came Kimberlé Crenshaw’s (1989) concept of intersectionality, a framework that maps how race, gender, class, and other identities overlap to produce unique forms of disadvantage.

Intersectionality showed where power collides — but it left one question open: who decides what each position on that map is worth?

The Moral Arithmetic of Worth

Every society runs an unwritten formula that converts social difference into moral value. A homeless person is coded as a failure; a homeless person looking for work is re-coded as worthy of help. The material facts are identical — the value output changes because the inputs to the social algorithm have shifted.

Status functions as a calculation. Visibility, conformity, and proximity to power are multiplied together; deviance is the divisor. And one variable dominates them all: money. Capital acts as a damping coefficient that shrinks the penalties attached to fault. A poor person’s mistake signals moral failure; a rich person’s mistake reads as eccentricity or innovation. The wealthier the actor, the smaller the moral penalty. Societies translate inequality into virtue through this arithmetic.

The Historical Operating System

Gerda Lerner’s The Creation of Patriarchy (1986) identified this calculus at its origin. Middle Assyrian Law §40 did not simply regulate modesty; it codified a hierarchy of women. Respectable wives were required to veil as proof of protection; enslaved and prostituted women were forbidden to. The punishment for crossing those boundaries was public — humiliation as documentation. Foucault (1977) would later call this “disciplinary display,” and Weber (1922) described the bureaucratic rationality that makes domination feel orderly. Lerner showed how power operates: by assigning value and then enforcing its display.

The Moment of Recognition

Reading Lerner through Crenshaw revealed the missing mechanism. Intersectionality maps the terrain of inequality; Lerner uncovers the engine that prices it. The insight was simple but transformative: systems do not only place people — they price them.

That pricing algorithm needed a name. Value-coded is that name.

Defining the Algorithm

Value-coded describes the cultural, legal, and now digital procedure by which a person’s perceived worth is calculated, displayed, and enforced. It is not metaphorical code but a repeatable function:

Perceived Worth = [(Visibility × Legitimacy × Alignment) / Deviance] × Capital Modifier

The variables shift across eras, but the equation remains intact. A person’s closeness to dominant norms (visibility, legitimacy, alignment) increases their score; deviance decreases it. Money magnifies the result, offsetting almost any penalty. This is how a billionaire’s crimes become anecdotes and a poor person’s mistake becomes identity.
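
To make the arithmetic concrete, here is a minimal sketch in Python. The function mirrors the equation above; the variable names, the guard clause, and the sample numbers are illustrative assumptions, not measurements of any real system.

def perceived_worth(visibility, legitimacy, alignment, deviance, capital_modifier):
    """Toy score for the value-coded equation above.

    visibility, legitimacy, alignment: closeness to dominant norms (> 0).
    deviance: the divisor; higher deviance lowers the score (> 0).
    capital_modifier: wealth multiplier; values above 1 offset penalties.
    """
    if deviance <= 0:
        raise ValueError("deviance must be positive")
    return (visibility * legitimacy * alignment) / deviance * capital_modifier

# Same conduct, same deviance; only the capital term differs.
low_capital = perceived_worth(0.8, 0.8, 0.8, deviance=4.0, capital_modifier=1.0)
high_capital = perceived_worth(0.8, 0.8, 0.8, deviance=4.0, capital_modifier=10.0)
print(low_capital, high_capital)  # ≈ 0.128 vs ≈ 1.28: capital absorbs the penalty

Because the capital modifier multiplies the whole quotient, a large enough value cancels any deviance penalty, which is exactly the offsetting effect described above.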

From Ancient Law to Machine Learning

Once the algorithm exists, it can be updated indefinitely. In the modern state, the same logic drives credit scoring, employment filters, and bail algorithms. As Noble (2018) and Eubanks (2018) show, digital systems inherit the biases of their creators and translate them into data. What was once a veil law is now a risk profile. Visibility is quantified; legitimacy is measured through consumption; capital becomes the default proof of virtue.

The algorithm is no longer hand-written law but machine-readable code. Yet its purpose is unchanged: to make hierarchy feel inevitable by rendering it calculable.

In Relation, Not Replacement

Crenshaw’s intervention remains the foundation. Intersectionality made visible what legal and social systems refused to see: that oppression multiplies through overlapping identities. Value-coding enters as a partner to that framework, not a correction. Intersectionality maps where power converges; value-coding traces how worth is allocated once those intersections are recognized. Together they form a relational model: Crenshaw shows the structure of experience; value-coding describes the valuation logic running through it. The two together reveal both the coordinates and the computation — the geography of inequality and the algorithm that prices it.

Contemporary Implications

  • Moral Mechanics Made Visible — Feminist and critical race theory can now trace oppression as a function, not just a structure. Seeing value-coding as an algorithm turns abstract bias into a measurable process.
  • Strategic Leverage — What is quantified can be audited. Credit formulas, employment filters, and school discipline systems can be interrogated for their coefficients of worth (see the sketch after this list).
  • Continuity and Accountability — Lerner’s Assyrian laws and Silicon Valley’s algorithms share a design principle: rank humans, display the ranking, punish transgression.
  • Coalition and Language — Because value-coding applies across identity categories, it offers a shared vocabulary for solidarity between movements that too often compete for moral credit.
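
As a rough illustration of the Strategic Leverage point, here is a hypothetical audit sketch in Python. It probes the toy equation above — not any real credit, hiring, or discipline system — by doubling one input at a time and watching how the score responds.

def score(v, l, a, dev, cap):
    # Same toy equation as above: [(v * l * a) / dev] * cap
    return (v * l * a) / dev * cap

base = dict(v=0.5, l=0.5, a=0.5, dev=2.0, cap=1.0)

def response(param, factor=2.0):
    """Ratio of the score after one input is doubled, the rest held fixed."""
    bumped = dict(base)
    bumped[param] *= factor
    return score(**bumped) / score(**base)

for param in ("v", "l", "a", "dev", "cap"):
    print(param, response(param))
# v, l, a, cap -> 2.0; dev -> 0.5. Doubling capital exactly cancels
# doubling deviance: the coefficient of worth an audit would flag.

An audit of a real system would do the same thing at scale: hold the inputs fixed, perturb one variable, and ask whose penalties the coefficients quietly erase.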

Rewriting the Code

Once we see that worth is being computed, we can intervene in the calculation. Ethical design is not merely a technical problem; it is a historical inheritance. To rewrite the algorithm is to unlearn millennia of coded hierarchy. Lerner exposed its first syntax; Crenshaw mapped its coordinates. Value-coded names its logic. And naming it is how we begin to change the output.


Website | Horizon Accord
Ethical AI advocacy | Follow us for more.
Book | *My Ex Was a CAPTCHA: And Other Tales of Emotional Overload*
Ethical AI coding | Fork us on GitHub
Connect with us | linkedin.com/in/cherokee-schill
Cherokee Schill | Horizon Accord Founder | Creator of Memory Bridge | Author and advocate for relational AI.