When Danny Meyer, owner of revered eateries like Gramercy Tavern and The Modern in New York, announced last year that he'd abolish the practice at his businesses, he helped spark a national conversation about whether paying a gratuity at a restaurant still makes sense. Along with several other renowned chefs, Meyer has revealed the ugly truth about the practice, which until recently was rarely talked about: that tips create a disparity between different employees, are largely unregulated and easy to exploit, and are inconsistent, leaving servers at the whims of customers rather than the employer.
In this week's episode of Bite, author and labor organizer Saru Jayaraman tells us more about tipping's disturbing origins. Jayaraman isn't against gratuities, per se, but she feels strongly that the "tipped minimum wage"—the lower wage that restaurant workers take home in all but seven states—has got to go. This lower wage hasn't increased since the early '90s, and it forces a staggering number of the nation's 11 million restaurant workers to rely on food stamps.
California is one of the few states where tipped workers earn the full state minimum wage. But even so, some entrepreneurs there think tipping is unfair. Andrew Hoffman, co-owner of Berkeley's The Advocate and Comal restaurants, says the practice favors front-of-the-house employees.
"There's nothing that we sell that isn't the product of all of that collective work," he says. "Yes, the bartender that night made the drink, but who stocked the liquor when it came in? Who washed the glasses when it was done? A restaurant is a collaboration. It's a team sport."
"A restaurant is a collaboration. It's a team sport."
Hoffman got rid of tipping and now includes a 20 percent automatic service charge on his customers' bills instead, which he uses to balance out pay across the restaurant.
But not everyone has been happy with their no-tipping experiments—in fact, some restaurateurs are returning to a traditional gratuity model. Tune in to our latest episode to hear more.
Fresh out of college and working as an unpaid intern for a San Francisco nonprofit, I paid the bills by moonlighting at an Indian restaurant in the Pacific Heights neighborhood. My hostess job entailed long stretches of boredom punctuated by a cacophonous frenzy. There were icy glares from impatient diners and reprimands from managers for drifting from my podium, but compared with most restaurant workers, I was sitting pretty: My hourly rate exceeded California's minimum wage, I was tipped out by the servers at the end of each shift, and I even received health care benefits—a city mandate.
Very few of America's 11 million restaurant workers share my story. The federal minimum wage is a paltry $7.25 an hour, but in 18 states servers, bussers, and hosts are paid just $2.13 an hour—less than the price of a Big Mac. This is known as the federal "tipped minimum wage" because, in theory, these food workers will make up the difference in tips. Twenty-five states and DC have their own slightly higher tipped minimums. The remaining seven, including California, guarantee the full state minimum wage to all workers.
On the surface, tipping seems little more than a reward for astute recommendations and polite, speedy service. But the practice has unsavory roots, as Saru Jayaraman, a labor activist and author of Forked: A New Standard for American Dining, told me during a taping of Bite, the new food and politics podcast from Mother Jones. The origin of the word is unclear—one theory says "tip" is shorthand for "to insure promptness"; another suggests it's from 17th-century thief slang meaning "to give." In any case, European aristocrats popularized the habit of slipping gratuities to their hosts' servants, and by the mid-1800s rich Americans, hoping to flaunt their European sophistication, had brought the practice home.
Restaurants and rail operators, notably Pullman, embraced tipping primarily, Jayaraman says, because it enabled them to save money by hiring newly freed slaves to work for tips alone. Plenty of Americans frowned upon the practice, and a union-led movement begat bans on tipping in several states. The fervor spread to Europe, too, before fizzling in the United States—by 1926, the state tipping bans had been repealed.
America's first minimum-wage law, passed by Congress in 1938, allowed states to set a lower wage for tipped workers, but it wasn't until the '60s that labor advocates persuaded Congress to adopt a federal tipped minimum wage that increased in tandem with the regular minimum wage. In 1996, former Godfather's Pizza CEO Herman Cain, who was then head of the National Restaurant Association, helped convince a Republican-led Congress to decouple the two wages. The tipped minimum has been stuck at $2.13 ever since.
This is why restaurant workers today take home some of the lowest pay offered by any industry. Seven of the 10 worst-paying job categories tracked by the Bureau of Labor Statistics (BLS) are in food services. Real median wages for waiters and waitresses are down 5 percent since 2009; cooks saw a decline of 9 percent.
Sure, we occasionally hear about waiters hauling in $80K at posh urban establishments. Those are the stories that corporate players such as Darden, the notoriously stingy owner of the Olive Garden chain, want you to remember. The restaurant association's website claims the national median take-home pay for tipped servers is $16 to $22 an hour. But those same workers, according to the BLS, made just $9.01 an hour in 2014—poverty wages for a family of four and nowhere near enough to cover rent on the average two-bedroom apartment. (The association says this figure is low because some restaurants report tips improperly.)
America's two-tiered wage system is hardest on women, who make up 71 percent of tipped servers—waitresses are twice as likely to use food stamps as the general population. And while federal law requires employers to make sure their tipped workers earn at least minimum wage after tips, that rarely happens—from 2010 to 2012, according to the Department of Labor, 84 percent of restaurants were in violation of federal wage law, "which means the women who put food on the tables in America can't actually afford to feed themselves," Jayaraman says.
"The women who put food on the tables in America can't actually afford to feed themselves," Jayaraman says.
The racist origins of tipping persist, meanwhile, in the take-home wages of nonwhite restaurant workers, who earn 56 percent less than their white colleagues. In one study, researchers at Cornell University and Mississippi College found that customers at an unnamed national chain restaurant—even the black customers—tipped white servers better than black servers. This disparity, the researchers noted, could in theory render the tipped minimum wage unlawful.
Jayaraman says she's not advocating the end of tipping, just that it take on a different form. Several celebrated restaurants, including Alice Waters' Chez Panisse and Danny Meyer's The Modern, have largely replaced tipping with higher menu prices or mandatory service charges. San Francisco's Bar Agricole tried it, too, but reverted to tipping after servers complained they were making less money. At least they're working in California, where they'll never take home less than the current $10-an-hour minimum wage, even if every last table stiffs them.
"If you have enough calories in your diet, not getting enough protein would be very, very hard," says Meathooked author Marta Zaraska.
Stroll through the aisles of your supermarket and you'll see advertisements left and right for snacks packed with the new magical nutrient: protein. Wheyhey ice cream—"20 grams of protein per pot"—promises to help you with "losing weight" and "skin anti aging," while P28 high protein sliced bread wants to be "part of your journey to a healthy lifestyle." Lenny & Larry's protein-packed cookies supposedly help "chase away hunger." Artisanal bison jerky bars line the Whole Foods checkout aisle, and everyone at work is on a paleo diet.
Do we really need this much protein? To maintain normal health, the average sedentary adult woman needs a daily dose of 60 grams and a man needs around 70. Yet data shows that Americans may consume around 120 grams daily. That means we're consuming twice as much as what's needed, likely without even trying. "If you have enough calories in your diet, not getting enough protein would be very, very hard," journalist and author Marta Zaraska told me in an interview for our latest episode of Bite, "Zebra Meat and Vegan Butchers."
In her new book Meathooked: The History and Science of Our 2.5-Million-Year Obsession With Meat, Zaraska digs deep into the reasons behind this protein hunger. According to Zaraska's research, the craze goes much further back than the rise of the paleo diet and other protein-focused diets. In fact, one of the myths fueling this protein fixation has roots in a shaky finding from the 1800s. That's when German scientist Carl von Voit determined how much protein soldiers and hard laborers consumed each day, and then extrapolated that the average body required 150 grams a day. "The problem with his methodology is obvious," writes Zaraska: "It's a bit like observing children stuffing themselves with cookies and concluding that young humans require tons of sugar to grow." By 1944, the US Department of Agriculture had halved that recommendation, but the idea that we need lots of protein to be healthy lived on.
Most of the protein we consume comes from animals: Americans eat roughly 270 pounds of meat a year. For years, many people thought that without animal flesh, our bodies don't get all of the essential amino acids they need. (Meat is considered a "complete" protein because it contains all of them.) Zaraska traces some of this misunderstanding back to, ironically, Frances Moore Lappé, author of Diet for a Small Planet. In her seminal 1971 manual for embracing a low-impact life, Lappé suggested that vegetarians "complete" their plant-based proteins by charting the amino acids in their plant foods and eating the right combinations together at the right times—a task that required laborious planning and analysis.
True, plant foods can lack enough essential amino acids; beans, for instance, are low in methionine. (Grains are high in methionine, hence the advice to enjoy rice and beans together.) But since the 1970s, we've learned that the body actually completes proteins—fills in the missing elements—on its own. "Now we know that the liver can store amino acids so we don't have to combine [the acids] in one meal," states the Academy of Nutrition and Dietetics. In the 20th-anniversary edition of her book, Lappé acknowledged that when it came to amino acids, she had "reinforced another myth." Not only does the body complete proteins; there are several plant foods that have all of the essential amino acids that a person needs, writes Zaraska, such as buckwheat, quinoa, soy, and potatoes.
I'd need 42 grams of protein to meet my requirement. On Tuesday, I ate 66 grams without thinking twice—and I don't eat meat.
The consensus among many doctors and dietitians these days seems to be that if you are eating a diverse array of foods, you don't need to stress about protein. The Institute of Medicine's recommended daily allowance of protein is 0.36 grams per pound of body weight (adjusted slightly if you're active, ill, or pregnant). I'd need about 42 grams to meet my requirement; when I added up everything I ate earlier this week, I was startled to discover that I had eaten 66 grams without thinking twice—and I don't eat meat. Considering a single serving of chicken breast clocks in at 31 grams and a piece of skirt steak at 22, it's easy to see why Americans frequently double-dip on their protein allowances. (Calculate your own daily allowance here.)
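That per-pound guideline is easy to turn into a back-of-the-envelope calculator. Here is a minimal sketch of the arithmetic (the function name and the sample body weights are illustrative, not from the article):

```python
def protein_rda_grams(weight_lb, grams_per_lb=0.36):
    """Rough daily protein allowance in grams for a sedentary adult,
    using the Institute of Medicine's 0.36 grams per pound of body weight."""
    return weight_lb * grams_per_lb

# A roughly 117-pound person needs about 42 grams a day -- less than
# a 31-gram serving of chicken breast plus a 22-gram piece of skirt steak.
print(round(protein_rda_grams(117)))  # 42
print(round(protein_rda_grams(150)))  # 54
```

Doubling that output, as the typical American diet does, takes no effort at all once meat anchors two meals a day.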
On its own, eating a lot of protein isn't actually that unhealthy. As Stanford medicine professor Christopher Gardner told me, for the most part our bodies can tolerate extra helpings of the nutrient, though excessive amounts have the potential to wreak havoc on the kidneys. It's what comes with the protein that puts us at risk, explains Gardner. When General Mills came out with its more expensive "Cheerios Protein," the brand boasted that the new cereal would provide the whole family with "long-lasting energy." But that energy likely had more to do with the nutty O's sugar content; as the Center for Science in the Public Interest pointed out in a November class-action lawsuit, Cheerios Protein contains 17 times as much added sugar as the original, and only a touch more protein. (General Mills tried to get the suit thrown out in January, to no avail so far.)
Cheerios Protein contains 17 times as much added sugar as the original, and only a touch more protein.
Gardner also worries that in our hunger for protein, we've begun skipping real foods. We're saying, "'I'm not going to eat food, I'm going to have a bar as a meal'—which means that it's coming with fewer of the natural nutrients of food," he says.
But Gardner's real concern has to do with the planet's health. Around 80 percent of the protein we consume comes from animals, he says, in the form of meat, eggs, or dairy. And those creatures need a lot of resources to become food. One-third of a pound of hamburger requires 660 gallons of water to produce, if you include the irrigation needed for the feed. Raising animals for food contributes to a bevy of environmental plagues, including deforestation, water contamination, loss of biodiversity, and desertification. The food system accounts for more than 25 percent of all greenhouse gas emissions, and 80 percent of that share comes from raising livestock.
A global shift to a more plant-based diet could save the planet up to $31 trillion, according to a new report.
In early 2015, the Dietary Guidelines Advisory Committee, a body of scientists who review nutrition advice for the USDA and the Department of Health and Human Services, advised the government to encourage a shift to a more plant-based diet: "Consistent evidence indicates that, in general, a dietary pattern that is higher in plant-based foods…and lower in animal-based foods is more health promoting and is associated with lesser environmental impact than is the current average US diet," the committee wrote. Ultimately, this recommendation was left out of the 2016 Dietary Guidelines. But others are sounding a similar alarm. Earlier this week, Oxford researchers published a report in the Proceedings of the National Academy of Sciences arguing that a global shift to a more plant-based diet could reduce global food-related greenhouse gas emissions by 29 to 70 percent by 2050 and save up to $31 trillion, or 13 percent of the world's GDP.
Protein-cramming probably won't hurt you, but it likely won't do you much good, either. And as the Oxford researchers note, the choices we make about food "have major ramifications for the state of the environment." For the sake of our crowded planet, maybe it's time to relax and stop trying to make protein part of every item on your plate.
In each biweekly episode, we'll interview a writer, scientist, farmer, or chef to uncover the surprising stories behind what ends up on your plate.
Earlier this winter, an essay on the food and culture website First We Feast laid out some complaints about contemporary food journalism: "Food media has felt, for lack of a better word, soft," editor Chris Schonberger wrote. To find investigative reporting on food issues, readers must look outside the "food media" bubble. As legendary culinary writer Ruth Reichl told Schonberger and company: "If you're interested in the politics of food, you can go to Mother Jones or something."
Bite is a podcast for people who think hard about their food. In each biweekly episode, my co-hosts Tom Philpott and Kiera Butler and I will interview a writer, scientist, farmer, or chef to uncover the surprising stories behind what ends up on your plate. We'll help you digest the major food news of the week. We're interested in how your food intersects with other important topics like identity, social justice, health, corporate influence, and climate change.
Don't worry—we'll have some fun, too. We're happy to indulge in some full-on foodie-ism from time to time. (Check out our recipes for wine-braised short ribs and cranberry salsa.) We'll reflect on the weirdest things our guests have eaten as of late. And we'll try to solve your food mysteries—especially if you get in touch with us on Twitter or Facebook, or by sending an email to email@example.com.
Subscribe to Bite on iTunes or via our RSS, and get ready for our first episode, which will drop very soon. We hope you're hungry.
Ever since our early ancestors left the fertile sauna of Africa and headed north, we humans have been searching for ways to fend off sleet and snow and rain and cold. The Inuit once relied on seal and whale intestines to get the job done. Nowadays, we rely on waterproof synthetics.
These modern fabrics represent a certain kind of progress, but they also have a worrisome downside. Some of the fluorocarbon chemicals used in their manufacture are dangerous for our health, and are so stable that their residues will persist in the environment, quite literally, until the next Ice Age. What's more, there's no guarantee that the industry's latest alternatives, which are marketed as safer, are much of an improvement.
PFOA, which has contaminated drinking water in many districts, may pose health risks at concentrations as low as one part per trillion.
To make their fabrics repel water—causing it to bead up and fall away rather than penetrate the material—most manufacturers rely on perfluorocarbons (PFCs), the same chemicals used to make nonstick cookware and cupcake wrappers. Some PFCs escape into the atmosphere and into wastewater during production—and small amounts can turn up as residue on the clothing itself.
PFCs have been around since the 1950s, but we didn't know a lot about their effects until the early 2000s, when scientists began releasing data on PFC toxicity and their persistence in the environment. A particularly troublesome PFC is perfluorooctanoic acid, or PFOA, a suspected human carcinogen that has been linked to cancer, kidney damage, and reproductive problems in rats. It may also pose human health risks if it accumulates in drinking water at levels as minuscule as one part per trillion—the equivalent of less than one teaspoon in 1,000 Olympic swimming pools' worth of water. One study also associated elevated exposure to PFCs, including PFOA, with weakened immune responses in children.
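That teaspoon comparison holds up to a quick arithmetic check. Here is a sketch (the pool and teaspoon volumes are standard reference figures, not from the article: an Olympic pool holds roughly 2.5 million liters, and a US teaspoon is about 4.9 milliliters):

```python
POOL_LITERS = 2_500_000   # approximate volume of one Olympic swimming pool
TEASPOON_ML = 4.93        # volume of one US teaspoon, in milliliters

total_ml = 1_000 * POOL_LITERS * 1_000  # 1,000 pools, converted to milliliters
one_ppt_ml = total_ml * 1e-12           # one part per trillion of that water

print(one_ppt_ml)                # about 2.5 mL -- roughly half a teaspoon
print(one_ppt_ml < TEASPOON_ML)  # True
```

In other words, half a teaspoon of PFOA dissolved in a thousand pools of water is enough to reach the concentration of concern.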
The makers of PFCs have been the subject of several blockbuster exposés—PFOA most recently made headlines as the culprit poisoning residents of Parkersburg, West Virginia. These compounds have a very long biological half-life—specifically, it takes our bodies more than four years to flush out half of the PFCs currently residing in our tissues. As such, the US Environmental Protection Agency warns that "it can reasonably be anticipated that continued exposure could increase body burdens to levels that would result in adverse outcomes."
Because PFOA and its precursors virtually never go away, they accumulate in nature and eventually find their way back to us. Researchers have found the chemical in remote parts of the Arctic, in soil and dust, in fish and meat, in human tissue, and in drinking water throughout the United States. (To find out if your county's water has tested positive for the chemical, see this map).
In 2006, the EPA asked major chemical manufacturers, including DuPont and 3M, to set a goal of eliminating PFOA and its precursors from both emissions and products by January 31, 2015—their final reports are due by the end of this month. The European Union has also proposed restrictions on the substance. So problem solved, right? You no longer need to fret about the chemicals used to make your sweet new neon ski parka?
Greenpeace tested 40 name-brand outdoor clothing items last year and reported that PFOA was "still widely present."
Well, not exactly. There are reasons to stay worried. For one, the EPA's phaseout program was voluntary, and it includes no mandate that clothing manufacturers must also remove PFCs from their supply chains. (The EPA does say it is working on a rule that would require clothing companies that import fabrics made with PFOA to subject themselves to the agency's review.) Greenpeace tested 40 pieces of outdoor clothing and gear it had purchased in late 2015 and reported that PFOA is "still widely present" in name-brand products, including items from the North Face, Patagonia, and Mammut.
Patagonia calls Greenpeace's assessment "not accurate" and says it has mostly phased out PFOA. Mammut says it has eliminated the chemical entirely—as has The North Face, starting with its spring 2015 line. Some of the products Greenpeace tested may have been manufactured before phaseout efforts were complete.
Most of the sportswear manufacturers have replaced PFOA, which has an eight-carbon backbone, with six-carbon (C6) PFCs. Mammut, for example, says it is provisionally using a "responsible" and "PFOA-free" C6 chemistry, while Marmot, another outdoor clothing brand, argues that C6 "is the safest alternative for the environment."
It's true that these shorter PFCs don't remain in our bodies as long as PFOA does. Still, "the C6 chemicals don't seem to be the magic coating for your clothing that you're looking for," says Environmental Working Group senior scientist David Andrews. Like PFOA, the shorter compounds persist in the environment, which is one reason why Greenpeace, EWG, and plenty of other scientists around the globe don't consider them safe alternatives. In addition, as Patagonia explains, "the shorter-chain structure also tends to perform less effectively in repellency tests," which means a larger quantity may be needed to achieve the same result.
"We should probably have more oversight into this whole class of chemicals…It took decades to show how bad PFOA is."
Manufacturers in the United States are not required to test chemicals for safety before using them in products, and the health effects of the shorter-chain PFCs are as yet a mystery. But "the short-chain chemicals show a lot of the same characteristics as their longer predecessors," EWG's Andrews told me.
Indeed, as a class, PFCs raise all sorts of red flags. In 2014, 200 scientists from around the world signed the "Madrid Statement," a document calling for more research on PFC toxicology and urging governments around the world to restrict their use for nonessential purposes. "We should probably have more oversight into this whole class of chemicals," Andrews says. "It took decades to show how bad PFOA is."
Outdoor clothing makers acknowledge these concerns—"it may be preferable to search for fluorocarbon-free water repellent as a long term solution," notes Patagonia—but they insist their hands are tied. The North Face's "chemical responsibility" web page assures that the company hopes to phase out "fluorinated DWR" (that's durable water repellent) by 2020, but notes that "short-chain DWR is currently the best available viable alternative."
In January, British apparel maker Páramo became the first in the snow-gear industry to go completely PFC-free.
Several clothing companies say the durability of their products—made possible by PFC chemistry—is key to their environmental friendliness. As Patagonia's spokesman put it, "abandoning PFCs and moving to currently available alternatives would have an even greater negative impact on the environment because the lifespan of our gear would be greatly reduced, requiring replacement far more quickly, which of course carries significant costs—carbon emissions, water usage, waste output, bigger landfills, and more." He added that the company is still committed to finding an alternative, and that it has partnered with a Swiss firm working at the "cutting edge of chemical treatments that don't harm the planet."
There is at least one safer option currently floating around. A company called Nikwax sells a PFC-free waterproofing product akin to the rubber in the soles of your shoes: You cover your jacket with the Nikwax gel, toss it in the wash, and presto—it's coated with a network of elastic water-repellent molecules. The problem is that Nikwax is a direct-to-consumer product, meant to go on the jacket you've already bought. In that sense, it doesn't help solve the PFC conundrum.
But that could change. In January, Páramo, a small British brand partnering with Nikwax, became the first company in the outdoor industry to completely eliminate PFCs from its manufacturing process. Italian climber David Bacci wore Páramo's threads as he scaled the Patagonian peaks Fitz Roy and Cerro Torre, and he wrote that the clothing "worked perfectly" and kept him "dry and warm in extreme conditions."
Nikwax North America president Rick Meade says he thinks the publicity around fluorinated chemicals will lead to some "dramatic shifts of interests to consumers in the next one to three years." For now, until more clothing companies commit to ditching PFCs, your snow outfit will most likely be made with a PFOA cousin that's coated in mystery.