The Moral Dyad
The most under-rated model in social psychology
A California jury on Wednesday found that Meta and Google were to blame for the depression and anxiety of a woman who compulsively used social media as a small child, awarding her $6 million in a rare verdict holding Silicon Valley accountable for its role in fueling a youth mental health crisis.
—NPR, March 25
I see the jury’s verdict as a classic instance of the Moral Dyad.
The Moral Dyad is the tendency to attribute agency only to one side of a moral situation and feelings only to the other. This is the villain-victim mindset.
If you think of Meta and Google as unfeeling, all-powerful agents and you think of the woman as helpless and hurt, then you are seeing the case in Moral Dyad terms. In my opinion, the Moral Dyad is the most under-rated model in all of social psychology. I think everyone should know about this model, which is why I am writing yet another essay about it.
The Moral Dyad model was propounded by Daniel Wegner and Kurt Gray in their book The Mind Club, published in 2016. (I reviewed their book here.) Their research sought to determine how we view the minds of other human beings.
What they found was that there are two clusters of beliefs that we hold about other humans. One cluster concerns agency. We think of other humans as having the ability to make choices, form plans, and work toward goals.
The other cluster concerns feelings. We think of other humans as having the capacity to experience sensations. We are especially inclined to notice when other humans feel pain.
The Moral Dyad model says that in any moral situation we are inclined to view one person or group as having all of the agency, while the other person or group feels all of the pain. That is, instead of recognizing that both sides have agency and feelings, we gravitate toward taking an either-or view of the situation.
Wegner and Gray use a robot/baby metaphor to describe the Moral Dyad. A robot can carry out intentions but cannot feel pain. A baby can feel pain but is not equipped to undertake deliberate actions. The Moral Dyad model says that we treat one side as if it were a robot and the other side as if it were a baby.
In the court case, the jury treated the woman as a helpless baby. Her use of social media was outside of her control. Conversely, Meta and Google were like an unfeeling robot. They used their superior technical skills to exploit the woman.
My own theory in The Three Languages of Politics is consistent with this tendency to reduce moral situations to Dyads. Clearly, the progressive oppressor-oppressed axis treats the oppressor group as having agency without feelings while the oppressed group has feelings without agency. But in the conservative model of civilization vs. barbarism there is also an inclination to look at criminals or terrorists as having nothing but evil intentions and to see others as helpless victims. And libertarians see the state as having agency while ordinary citizens can be helpless in the face of government coercion.
When we view situations in Moral Dyad terms, we become overconfident in our judgments. We fail to see nuance. We do not understand the other side. Once we understand the theory of Moral Dyads, we can be more careful not to take the simplistic robot/baby view. I think we would then be less certain that Big Tech is purely evil.


Anecdotally, as parents, we have seen firsthand the harm that social media can cause to certain groups, most notably to pre-teens and teenagers.
Instead of going the roundabout route of initiating litigation against the social media companies, which takes years, we simply unplugged their services in a matter of minutes. While our child didn’t like the change initially, her self-reported mood improved dramatically. Problem solved.
Unfortunately, I don’t have any robust scientific evidence to demonstrate the harms of social media, and that alone may be a nonstarter on a Substack like this one. But the personal decision to unplug doesn’t necessarily need a scientific basis.
Who’s to say an algorithm designed for engagement is inherently bad? If I want to watch videos of Arnold Kling, YouTube will gladly serve them up. Pick almost any subject, and YouTube or Instagram will quickly find more of it for you. That can narrow your world, but it can also be highly useful: these platforms often give people more of what they have already shown they want.