Once Again, a New Book Debunks Some History I Never Knew In the First Place

I am once again befuddled by history:

The full role of white women in slavery has long been one of the “slave trade’s best-kept secrets.” “They Were Her Property,” a taut and cogent corrective, by Stephanie E. Jones-Rogers, who teaches at the University of California, Berkeley, examines how historians have misunderstood and misrepresented white women as reluctant actors. The scholarship of the 1970s and ’80s, in particular, did much to minimize their involvement, depicting them as masters in name only and even, grotesquely, as natural allies to enslaved people — both suffered beneath the boot of Southern patriarchy, the argument goes.

Jones-Rogers puts the matter plainly. White slave-owning women were ubiquitous. Not only did they profit from, and passionately defend, slavery, but the institution “was their freedom.” White women were more likely to inherit enslaved people than land. Their wealth brought them suitors and gave them bargaining power in their marriages. If their husbands proved unsatisfactory slave owners in their eyes, the women might petition for the right to manage their “property” themselves, which they did, with imaginative sadism.

Am I befuddled by history? Or by historiography? Or do I need a different word altogether?

Until five minutes ago, before I read this book review, it never would have occurred to me that white women were anything less than full partners with men in the white supremacy of the antebellum South. I have never read anything that even remotely suggests such a thing. And yet, apparently this has been a widely held belief—and not just by the masses, but by practicing historians as well.

If it were just that I was ignorant of this era in history, that would be one thing. But that’s not it. I’m no expert, but I’ve read the usual amount about America before the Civil War and about slavery in particular. And the conclusion I’ve always drawn—without ever really thinking hard about it—is that white women were every bit as racist, cruel, and domineering as white men. I’ve never read the opposite. So where did it come from? Was it taught in college classes just after I graduated? In popular books? In movies? Solely in journal articles for professionals? Or what? Can someone un-befuddle me?
