Once Again, a New Book Debunks Some History I Never Knew In the First Place


I am once again befuddled by history:

The full role of white women in slavery has long been one of the “slave trade’s best-kept secrets.” “They Were Her Property,” a taut and cogent corrective, by Stephanie E. Jones-Rogers, who teaches at the University of California, Berkeley, examines how historians have misunderstood and misrepresented white women as reluctant actors. The scholarship of the 1970s and ’80s, in particular, did much to minimize their involvement, depicting them as masters in name only and even, grotesquely, as natural allies to enslaved people — both suffered beneath the boot of Southern patriarchy, the argument goes.
Jones-Rogers puts the matter plainly. White slave-owning women were ubiquitous. Not only did they profit from, and passionately defend, slavery, but the institution “was their freedom.” White women were more likely to inherit enslaved people than land. Their wealth brought them suitors and gave them bargaining power in their marriages. If their husbands proved unsatisfactory slave owners in their eyes, the women might petition for the right to manage their “property” themselves, which they did, with imaginative sadism.

Am I befuddled by history? Or by historiography? Or do I need a different word altogether?

Until five minutes ago, before I read this book review, it never would have occurred to me that white women were anything less than full partners with men in the white supremacy of the antebellum South. I have never read anything that even remotely suggests such a thing. And yet, apparently this has been a widely held belief—and not just by the masses, but by practicing historians as well.

If it were just that I was ignorant of this era in history, that would be one thing. But that's not it. I'm no expert, but I've read the usual amount about America before the Civil War and about slavery in particular. And the conclusion I've always drawn—without ever really thinking hard about it—is that white women were every bit as racist, cruel, and domineering as white men. I've never read the opposite. So where did that notion come from? Was it taught in college classes just after I graduated? In popular books? In movies? Solely in journal articles for professionals? Or what? Can someone un-befuddle me?