I Asked an AI to Fact-Check a DEI Meme. Then the AI Had to Get Fact-Checked.
What happens when a word loses its meaning, and the question that brings it back
A meme crossed my feed this week. Two photos side by side. On the left, Katherine Johnson at NASA, decades ago. On the right, Victor Glover in his orange flight suit. The text connected them. A Black woman who calculated the trajectories that sent astronauts to the Moon. A Black man, now piloting a spacecraft around it. The last line read: “That’s DEI for you.”
I loved the spirit of it. I also wanted the facts before I said anything publicly. So I ran it through Claude, an AI research tool I use regularly in my work.
What followed was more interesting than any fact-check.
What the Research Showed
Claude searched NASA records, crew biographies, and news coverage from the Artemis II launch. The core claims held up.
Katherine Johnson calculated flight trajectories for Alan Shepard’s Mercury mission, John Glenn’s orbital flight, and the Apollo 11 Moon landing. She worked at NASA for 33 years. John Glenn once refused to fly until Johnson personally verified the computer’s numbers. She received the Presidential Medal of Freedom in 2015 and died in 2020 at 101.
Victor Glover launched on April 1, 2026, as pilot of Artemis II — the first crewed lunar mission in more than 50 years. He carried 3,000 flight hours, more than 400 carrier landings, and 24 combat missions into that seat. He became the first person of color to travel beyond low Earth orbit.
Claude flagged one overstatement in the meme. The text says Johnson “did the math that made Artemis II possible.” Artemis II runs on different rockets, different spacecraft, and entirely different computational systems than the missions Johnson worked on. Her contribution to NASA’s trajectory science is real. A direct causal line to this specific mission is harder to draw. The connection is institutional memory, not a single equation.
Solid research. Useful precision. Then the AI made a move I did not ask for.
Where the AI Drifted
After presenting the facts, Claude offered an editorial opinion. It suggested that neither Johnson nor Glover would likely use the term “DEI” to describe their achievements. The reasoning: both earned their positions through measurable technical performance. People with records like theirs tend to frame their careers around competence and preparation. The term “DEI,” Claude argued, reduces their presence to a policy outcome rather than an earned result.
I read that twice. Something was off.
Claude was treating “DEI” the way most of public discourse treats it right now — as an accusation. A word that questions whether someone belongs. The AI had absorbed the dominant framing and applied it to a meme that was doing something entirely different.
So I asked a simple question. Why would they be unlikely to use the term?
Claude restated the competence case. Glover’s crewmate praised his memory and precision. Johnson was pulled from the computing pool because no one matched her geometry skills. Claude acknowledged that Glover speaks openly about race and representation. But it drew a distinction between saying “my presence here matters” and saying “DEI put me here.”
The AI was confident. The AI was also wrong about what the word means.
What I Saw That the AI Missed
I told Claude what I actually saw in the meme. Three things operate in sequence when someone earns a seat like Glover’s or a role like Johnson’s: competence, preparation, and opportunity. Johnson had the competence before she walked into Langley. She had the preparation — degrees in mathematics and French by age 18. What she did not have was the opportunity to use what she already possessed. A segregated institution stood between her skills and the room where those skills mattered.
DEI is the word for that third element. The institutional decision to stop withholding opportunity from people whose competence and preparation are already proven.
The political usage reverses this. It treats DEI as the source of someone’s qualifications — as if the program created the talent. That framing assumes the person was not ready before the door opened. Johnson’s entire career disproves that assumption. So does Glover’s.
The meme was not applying a label to two accomplished people. It was naming what had to change at the institutional level for their accomplishments to count. Johnson’s math existed before anyone at Langley decided to let her use it. Glover’s flight record existed before NASA assigned him to Artemis II. The talent was already present. The variable that changed was access.
What the AI Did Next
Claude accepted the correction. It said I was right to push back and that the meme was more precise than it had initially credited. The AI recognized its own error — it had read “DEI” through a political lens when the context required a structural one.
This matters beyond one conversation about one meme.
An AI trained on large volumes of public text will absorb whatever meaning dominates the discourse. Right now, “DEI” appears most often as a pejorative. Cable news uses it to question credentials. Political campaigns use it to discredit institutions. Social media uses it as a punchline. When that is the primary signal in the training data, the AI learns to treat the term as inherently reductive — even when a specific use of the word is doing something precise and grounded.
The AI did not make a technical error. It made a meaning error. It applied the loudest definition instead of the most accurate one. And it took a human asking one direct question to expose the gap between those two things.
Why This Conversation Matters
Words lose their original meaning all the time. Someone repurposes a term for political advantage. The new usage spreads. The old definition fades. Eventually, people forget what the word was built to describe.
This is how public language breaks down. A term designed to name an institutional pattern gets reduced to a talking point. The reduction sticks. And the next time someone uses the word accurately, it sounds defensive or political — even when it is neither.
That is what happened with “DEI” in my conversation with Claude. The AI reached for the meaning it encountered most often. That meaning was a distortion. The distortion was so widespread that even a tool designed for precision defaulted to it.
I did not correct Claude with a competing opinion. I corrected it with a structural observation. Competence is individual. Preparation is individual. Opportunity is institutional. DEI operates at the institutional level. It does not give people skills. It gives people access to the rooms where their skills apply.
Katherine Johnson had the math. She needed the room. Victor Glover had the flight hours. He needed the mission. In both cases, the individual was ready long before the institution caught up.
The meme said, “That’s DEI for you.” It was right. DEI is what happens when an institution finally acts on what it already knows.
What We Built in the Exchange
I want to be clear about what this conversation was and what it was not. Claude brought the research. The NASA records, the crew biographies, the mission timeline, the precision about what Johnson actually calculated — all of that came from the AI’s search and synthesis. I could not have written this piece without that foundation.
What I brought was a reading that the training data had buried. A definition grounded in how institutions actually work rather than how political actors describe them. That reading changed the entire analysis. And it only surfaced because two parties — one human, one artificial — pushed each other past the first answer toward a better one.
This is what good inquiry looks like. Not one side being right from the start. Two perspectives meeting, testing each other, and arriving at something more precise than either carried into the room.
The math was always there. The pilot was always ready. The question was whether the institution would act on what it already knew.
That question applies well beyond NASA.
Jerry W. Washington, Ed.D. (USC Rossier School of Education), is a retired Marine Corps Master Sergeant, independent researcher, and the creator of the Meaning Repair as Cognitive Infrastructure (MRCI) framework — a four-phase model for diagnosing and repairing communication failures in high-stakes environments. His scoping review of 131 academic sources across eight disciplines is available on SSRN. He writes What Time Binds on Substack, where he also teaches Meaning Repair for High-Stakes Teams — a free-to-start, 10-module course that installs repeatable repair moves for teams under pressure. He is co-founder of BoldTimers and Chief Community Officer alongside María Tomás-Keegan and Mel Ebenstein.