Perspective

What the Father of Black History Can Teach Us About Technology

Danielle A. Davis Canty / Feb 27, 2026

Danielle A. Davis Canty, Esq., is director of technology policy at the Joint Center for Political and Economic Studies, host of The Miseducation of Technology Podcast, and a Public Voices Fellow on Technology in the Public Interest with The OpEd Project.

Carter G. Woodson, born in 1875 in New Canton, Virginia, founded the Association for the Study of Negro Life and History (ASALH).

Black History Month exists in the United States because Carter G. Woodson, a historian and author, believed Black people needed control over education itself — specifically what was taught and whose knowledge was treated as authoritative.

In 1926, Woodson created Negro History Week — the foundation of what would become Black History Month — as a response to a problem: Black students were being educated through curricula that excluded African and Black contributions, portrayed Black people as inferior, and treated the white experience as the standard. This was not the result of a single lie, but an entire educational structure that consistently selected what mattered, what was omitted, and what students were trained to accept as normal.

Woodson documents this process in The Mis-Education of the Negro, particularly in Chapter 3, “How We Drifted Away from the Truth.” There he shows how miseducation works through repetition and institutional authority. Across subjects — geography, science, language, literature, law, medicine, and history — Black people were routinely excluded or depicted as inferior. Over time, this taught students, including Black students, to treat racial hierarchy as normal and justified, rather than as a result of deliberate educational choices.

Reading The Mis-Education of the Negro raised a question for me that still feels urgent today: who teaches us how to understand the world — and who benefits from those explanations? As a Black woman working in technology policy, that question led me to interrogate what we are taught to accept as true about technology. That line of inquiry eventually became the basis for my podcast, The Miseducation of Technology.

Because the same educational pattern Woodson identified shows up today in how we are taught to understand technology.

We are routinely told that digital systems are neutral. That algorithms are objective. That data “doesn’t lie.” When harm occurs, the explanation is usually framed as technical: a bug, scaling issue, or an unintended consequence.

This mirrors the dynamic Woodson described. In his account, miseducation does not require overt coercion. Teachers and institutions may be sincere, but the structure of what is taught and what is excluded trains people to defer to accepted explanations even when outcomes are unequal. The result is not ignorance, but habituation: students learn which questions are reasonable to ask and which ones fall outside the bounds of legitimacy.

That same logic shapes how technology is discussed today. Public attention is directed toward performance — speed, efficiency, innovation — while questions about design choices, governance, and accountability are sidelined. Digital systems are presented as neutral tools, and unequal outcomes are explained as technical glitches or the unavoidable byproducts of complex systems rather than the result of human decisions. As a result, responsibility dissolves into a familiar phrase: “That’s just how the system works.”

This is precisely what I mean by The Miseducation of Technology.

It is not that people fail to see harm. It is that they are taught to interpret harm as accidental or inevitable, rather than as the product of choices about who designs these systems, whose experiences are centered, and which interests guide deployment. Just as Woodson showed how education trained acceptance of racial hierarchy by presenting it as natural, today’s technology discourse trains acceptance of inequality by presenting it as neutral and unavoidable.

For Black communities, the consequences are concrete. Surveillance technologies such as facial recognition have been disproportionately deployed in Black neighborhoods and have misidentified Black faces at higher rates, leading to wrongful stops and arrests. Automated decision-making systems used in hiring, credit, and healthcare replicate historical patterns of exclusion while presenting themselves as neutral and data-driven. And infrastructure decisions — from unequal broadband deployment to the siting of resource-intensive data centers — quietly shift cost, risk, and environmental burden onto communities with little opportunity for meaningful input.

These outcomes are often described as accidental or unintended. But Woodson showed that when systems are designed and governed without the people most affected, inequality is not a mistake — it is the expected result. Technological inequality, in this context, is not surprising. It is predictable.

Revisiting Woodson during Black History Month reminds us that he did not accept the miseducation of the Negro as inevitable. He named it, challenged it, and insisted that this thinking could be changed.

Woodson might not have anticipated algorithms, data-driven systems, or digital infrastructure. But the process he described — the training of people to defer to certain institutional explanations even when those explanations produce unequal outcomes — remains deeply familiar today.

Just as Woodson refused to accept miseducation in his time, we do not have to accept it in our digital era.

That conviction is what animates my podcast, The Miseducation of Technology — and why the work of questioning technology remains unfinished.

Because while technology may be miseducated, we do not have to be.
