Once Again, a New Book Debunks Some History I Never Knew In the First Place

I am once again befuddled by history:

The full role of white women in slavery has long been one of the “slave trade’s best-kept secrets.” “They Were Her Property,” a taut and cogent corrective, by Stephanie E. Jones-Rogers, who teaches at the University of California, Berkeley, examines how historians have misunderstood and misrepresented white women as reluctant actors. The scholarship of the 1970s and ’80s, in particular, did much to minimize their involvement, depicting them as masters in name only and even, grotesquely, as natural allies to enslaved people — both suffered beneath the boot of Southern patriarchy, the argument goes.

Jones-Rogers puts the matter plainly. White slave-owning women were ubiquitous. Not only did they profit from, and passionately defend, slavery, but the institution “was their freedom.” White women were more likely to inherit enslaved people than land. Their wealth brought them suitors and gave them bargaining power in their marriages. If their husbands proved unsatisfactory slave owners in their eyes, the women might petition for the right to manage their “property” themselves, which they did, with imaginative sadism.

Am I befuddled by history? Or by historiography? Or do I need a different word altogether?

Until five minutes ago, before I read this book review, it never would have occurred to me that white women were anything less than full partners with men in the white supremacy of the antebellum South. I have never read anything that even remotely suggests such a thing. And yet, apparently this has been a widely held belief—and not just by the masses, but by practicing historians as well.

If it were just that I was ignorant of this era in history, that would be one thing. But that’s not it. I’m no expert, but I’ve read the usual amount about America before the Civil War and about slavery in particular. And the conclusion I’ve always drawn—without ever really thinking hard about it—is that white women were every bit as racist, cruel, and domineering as white men. I’ve never read the opposite. So where did it come from? Was it taught in college classes just after I graduated from college? In popular books? In movies? Solely in journal articles for professionals? Or what? Can someone un-befuddle me?
