• Let Us Now Praise Apple (Sort Of)

    Here is professional technology enthusiast David Pogue on the big Apple product rollout today:

    It turned out that everyone was right. The new iPhones have wireless charging, faster processors, and, in the ultra-luxe iPhone X, edge-to-edge screens. In other words, all the stuff that my Samsung Galaxy S8 already has. And yet everyone is so excited they can barely sit still.

    I like to make fun of this because I’m an Apple cynic, but honestly, kudos to them for keeping up their marketing mojo long after their actual products have ceased to be very interesting. Compare this to Microsoft, which—well, let me tell a story about that.

    I’m a Surface tablet junkie. I love my Surface 3.¹ And my Surface Pro 4. And my Surface Pro 2017, which I bought a couple of weeks ago. But wait. Why did I only buy it a couple of weeks ago? It was announced in May and began shipping in June. What took me so long?

    Answer: I didn’t know it existed. I’ve been using Windows since 1991 and I’ve purchased two Surface tablets since 2015. The first one was purchased at a Microsoft store. I adore Surface tablets. I would appear in commercials for Microsoft if they asked me. And yet I had no idea that a new one was out. I didn’t get so much as a single email about it.² My Twitter feed had nothing about it. None of the media outlets I read bothered to highlight it enough that I saw it.

    Is this because it was boring? In a way, yes. It’s the same size as the previous model and has all the same ports. Basically, it has better battery life, a faster processor, and a new stylus that’s considerably better than the old one (which was pretty good to start with).³ It’s definitely evolutionary. But the longer battery life was enough to suck me in.

    If Apple had introduced this, everyone on the planet would know about it. The battery life would be a category killer for laptops. The pen would be an artist’s dream come true. The faster processors would do wonders for gaming and 3D rendering. The miscellaneous updates in the operating system would be game changers. It would be the greatest upgrade in history.

    I dunno. Some people can get away with nonsense like this. Donald Trump. Kanye West. Apple. Everyone plays along because it’s part of the act. But no one else can do it because, after all, it’s also pretty ridiculous.

    Still, you’d think Microsoft could at least do ordinary boring marketing for their new tablet. There’s no reason to almost literally keep it a secret, is there?

    ¹This is now Marian’s Surface 3.

    ²In fairness, it’s possible that this is because I opted out of getting emails from Microsoft. If so, kudos to them for keeping their word. Still, you’d think they could figure out some kind of workaround to target folks who obviously like Surface tablets.

    ³It also has the same ridiculously paltry storage options as the previous model. Someday I need to have a long talk with the Surface product manager about that. What’s the deal with 128 GB on a tablet that supposedly can “replace your desktop”? You can upgrade to 512 GB, of course, but only if you upgrade to the top-end model, which will cost you about an extra thousand dollars. Seriously?

  • Here’s How Big Pharma Helped Set New Pain Guidelines

    I wrote yesterday about opioids and pain treatment, and along the way I mentioned that the trend toward more opioid prescribing was blessed in 2001 by The Joint Commission, an accrediting agency for medical facilities. That year they issued new guidance about pain management, which required hospitals to treat pain “aggressively” if they wanted to remain accredited.

    I had a strong recollection that the commission’s recommendations had been heavily influenced by lobbying from the pharmaceutical industry, but I didn’t trust my memory about that and wasn’t able to immediately find confirmation. However, Dr. Anna Lembke is an expert on this. She’s the author of Drug Dealer, MD, a book about the forces that have driven our nationwide opioid addiction. Here she is in an NPR interview a few months ago:

    On what Lembke means when she says that big medicine and Big Pharma “were in cahoots”

    The pharmaceutical industry realized that they can no longer directly go to doctors to get them to prescribe their pills. Various regulations were put in place to prevent them giving gifts and pens and hats and things that we do know can influence doctor prescribing. So instead they took a kind of Trojan horse approach and infiltrated regulatory agencies and academic medicine in order to convince doctors that prescribing more opioids was evidence-based medicine, and evidence-based medicine means medicine based on science, and that’s something that all doctors are supposed to practice. …

    So for example, what they did was Purdue Pharma joined forces with the Joint Commission, and the Joint Commission is an organization that accredits hospitals, and Purdue Pharma gave all kinds of teaching material to the Joint Commission and said, “You really need to make doctors treat pain more aggressively and that needs to be a quality measure.” So the Joint Commission said, “You know what? You’re absolutely right, and we’re going to do that and we’re going to take your videos that you made that tell doctors that opioids aren’t addictive as long as they’re treating them for pain.” …

    So it became a kind of groupthink where it looked like treating pain aggressively with opioids was something that was based on science, when in fact it was based on Big Pharma’s influence of these major regulatory bodies.

    As I said yesterday, there’s plenty of blame to go around. But there’s no question that Big Pharma deserves a big share of it.

  • Wildfire Season in the West Lasts Longer and Longer

    On the East Coast, the big danger comes from hurricanes, which are becoming ever more intense as ocean waters continue to warm. Here on the West Coast, we worry about wildfire season, which lasts longer and longer as summer temperatures linger into fall:

    Despite record-breaking rain and snowfall across the West in 2017, this year’s fire season has been unforgiving….“Typically by the third week of September we see not as much fire activity,” said Jessica Gardetto, spokeswoman for the National Interagency Fire Center. “But we just haven’t had that relief.” The blazes have been responsible for the deaths of eight firefighters and have destroyed more than 500 homes.

    ….What makes the fires burning across the West so extreme? One aspect that sets this year apart is the length of time the fire season has lasted, in part because of dry air, conducive for sustaining wildfires. Lightning strikes in Oregon and Washington have sparked many of the wildfires still ravaging large swaths of land, while drought-stricken Montana continues to battle several large fires.

    Rising sea levels. Bigger hurricanes. More drought. Longer wildfire seasons. This is global warming, folks.

  • Chart of the Day: Household Income Finally Beats 1999 Record

    The Census Bureau has finally gotten around to calculating household income for 2016, and the news is good: adjusted for inflation, income was up 3.2 percent last year. In fact, household income is now at an all-time high:

    And now for the buzzkill portion of this post: this means household income has increased a whopping 0.6 percent since 1999. That’s $22 per year.
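
    If you want to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python. The $59,000 median is an assumed round number for the 2016 figure, not the exact Census number:

      # Back-of-the-envelope check of the "$22 per year" claim.
      # The ~$59,000 median is an assumed round number, not the exact Census figure.
      median_2016 = 59_000        # approximate 2016 real median household income
      total_growth = 0.006        # 0.6 percent total real growth since the 1999 peak
      years = 2016 - 1999         # 17 years

      base_1999 = median_2016 / (1 + total_growth)      # implied 1999 median, in 2016 dollars
      gain_per_year = (median_2016 - base_1999) / years
      print(f"Implied 1999 median: ${base_1999:,.0f}")
      print(f"Real gain per year:  ${gain_per_year:,.0f}")   # ~$21, within rounding of $22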

    On the other hand, this is better than the top .01 percent has done. According to Piketty and Saez, their income peaked in 2000 and is down 4 percent through 2015 (with some spikes in between). Will 2016 be the year that they too beat their 20th century high? P&S take even longer than the Census Bureau to make their calculations, so we’ll have to wait and see. But I’ll bet they do.

  • Here’s Why I Hate Credit Reporting Agencies — And Why You Should Too

    A few days ago, Equifax, one of the Big Three credit reporting agencies, admitted that the personal data of 143 million consumers had been compromised. This is not the biggest data breach ever, but it might be the worst. After all, Equifax is not just any company. It’s a company whose main job is collecting masses of private financial data—and it does this even though it has neither a business relationship with, nor explicit permission from, the people it monitors. This is a massive and unprecedented FUBAR.

    (For more on why the Equifax breach is even worse than you think, Michael Hiltzik explains here.)

    I am no fan of the credit reporting business, one of the most arrogant and anti-consumer industries imaginable. Twelve years ago I wrote about them for the Washington Monthly, and it’s startling how little has changed since then. I could republish the story today with only the most cursory changes.

    For example, part of my piece was devoted to “credit freezes,” something you may have heard a lot about lately. This is an action you can take to protect yourself in case of identity theft: if you ask for your account to be frozen, credit agencies will furnish a credit report only after they’ve confirmed that it really is you who applied for credit. This stops identity thieves in their tracks: if they apply for a credit card in your name, the credit agency will call you first. When you tell them you never applied for the card, it doesn’t get issued.

    But this really shouldn’t be an option you have to request. It should be routine for all credit transactions. The reason it isn’t is that it’s inconvenient for the credit reporting agencies, who have fought regulation on this topic tooth and nail. It’s also because they literally make money on identity theft—no, that’s not a typo—and therefore don’t have much incentive to do anything about it.

    Still, as much as I think all accounts should be frozen by default, my solution to the problem of identity theft isn’t to force the credit reporting agencies to freeze or unfreeze accounts—or to force them to do anything else. It’s to make them responsible for all damages related to identity theft and then let them figure out the best solution. Here’s what I wrote in my Monthly piece:

    There is a successful precedent for this type of approach. In 1968, Congress passed the Truth in Lending Act, which imposed a variety of regulations on the lending industry. One notably simple provision was that consumers could be held liable for no more than $50 if their credit cards were stolen and used without their authorization. For anything above that, it was the credit-card issuer who had to pay. The result was predictable: Credit-card companies have since taken it upon themselves to develop a wide range of effective anti-fraud programs. Congress didn’t tell them to do it, or even how. It just made them responsible for the losses, and the card issuers did the rest themselves.

    The same method should be used for identity theft. There’s no need to create mountains of regulations, which are uniformly despised by the credit industry. Instead, simply make the industry itself—and any institution that handles personal data—liable for the losses in both time and money currently borne by consumers. The responsible parties will do the rest themselves.

    There’s more to say about this, but sadly, my piece is no longer available at the Monthly site. The great linkrot plague has devoured it. Luckily, I’m a magazine packrat and I still have a dead-tree copy. So I scanned it and turned it into a PDF. Click here to read it—and to find out just why I hate the credit reporting agencies so intensely. It’s worth your time, especially considering how little has been done about this over the past decade. It represents one of the all-time abject surrenders to Big Finance, and it’s something the Elizabeth Warren wing of the Democratic Party should be all over. The time for small-bore proposals is over. It’s time to make the credit agencies—and others—pay for their flagrantly careless behavior. When they allow someone to steal your identity, they’re the ones who should pay the price, not you.

    UPDATE: The Wayback Machine also has a copy of my article. I shoulda checked! Click here to see it.

  • American Hospitals Are Ungodly Expensive

    Chapin White of RAND tells us that hospitals are expanding every which way in Indiana:

    Indianapolis was historically characterized by geographically distinct hospital submarkets, with a dominant system controlling each area. Those geographic divisions have broken down in the last two decades, with systems building new facilities and encroaching on each other’s territory, particularly in suburban areas with concentrations of privately insured patients. As in the rest of the country, hospitals in Indianapolis have established and tightened their relationships with physician practices and used those relationships to drive referrals within their systems.

    Recently, observers have speculated that the breakdown of hospitals’ geographic territories, as well as the expansion in the number of facilities, might lead to greater competition and lower hospital prices in Indianapolis. But that speculation has not been tested empirically.

    In theory, this means that Indiana hospitals are competing more. At the same time, they’ve been consolidating into a small number of big chains, and as White says, they have “tightened” their relationships with physicians. This means that most physicians no longer refer their patients to different hospitals for different things. They always refer them to the same place. So what’s the upshot? Here’s a chart showing how much above the Medicare reimbursement rate Indiana hospitals charge private insurers:

    There are two takeaways here. First, these hospitals charge private insurers a lot more than Medicare. The average for outpatient care was 258 percent more. For inpatient care it was 117 percent more.

    Second, there’s stunning variation in prices. This is an old story, but it’s an old story that never gets old. The least expensive hospital charged private insurers 71 percent more than Medicare for outpatient services. The most expensive charged 396 percent more than Medicare for the same basket of services.
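
    Just to make those percentages concrete, here’s a quick sketch in Python. The $100 Medicare price is a made-up round number; only the markups come from the RAND figures above:

      # Translate "X percent more than Medicare" into dollar prices.
      # The $100 Medicare reimbursement is hypothetical; the markups are from the post.
      medicare_price = 100.0

      markups = {                       # percent above Medicare
          "average outpatient": 258,
          "average inpatient": 117,
          "cheapest hospital (outpatient)": 71,
          "priciest hospital (outpatient)": 396,
      }

      for label, pct_more in markups.items():
          private = medicare_price * (1 + pct_more / 100)
          print(f"{label}: Medicare ${medicare_price:.0f} -> private ${private:.0f}")

    On a $100 Medicare service, that works out to $171 at the cheapest hospital versus $496 at the priciest, nearly a threefold spread for the same basket of services.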

    What accounts for the difference? Did big hospitals with economies of scale, lots of competition, and tight relationships with physicians charge less? In a word, no. In two words, hell no:

    At the bottom of the price distribution are the independent CAHs [critical access hospitals, which are all small and rural] and three small systems….Although CAHs are, by definition, geographically isolated and have no nearby competitors, that lack of competition does not correspond to higher negotiated prices. The upper end of the price distribution is dominated by five large hospital systems, with Parkview Health standing out for having exceptionally high prices. Hospital systems and consolidation among hospitals have been cited as drivers of high and increasing prices, and these findings are consistent with that argument.

    This is the not-so-hidden story of exploding medical costs. We’ve become so accustomed to hating on insurers that we hardly notice that hospital consolidation is a much bigger villain. When a big insurer has a local monopoly, it can usually negotiate lower prices from hospitals because the hospitals have nowhere else to go. But when there are lots of insurers and only one or two local hospitals, it’s the hospitals that have the upper hand. They can charge high prices because the insurers have no choice except to do business with them. As hospital systems get steadily larger and rope in more and more physicians, their effective competition decreases and they have the ability to demand ever higher prices.

    Insurance companies are hardly innocent bystanders in the health care system, but if you want to really target the drivers of higher costs, look to the source: the actual providers of medical services. That means doctors, hospitals, pharmaceutical companies, and medical device makers. That’s where the real money is.

  • We Are Rethinking Our Rethinking About Pain

    Last week, in an interview with author and surgeon Atul Gawande, Sarah Kliff asked about the opioid epidemic:

    “We started it,” Gawande told me flatly. He argued that health providers are at the root of the country’s staggering opioid epidemic. He didn’t blame the pharmaceutical companies — although there is good evidence that they played a large role — but instead focused on how views of pain began to shift in the 1990s, with doctors urged to take their patients’ suffering more seriously.

    The medical profession certainly shares some of the blame for this, but I wouldn’t go as far as Gawande. I’m open to correction on this, but my understanding is that it was really the confluence of three different things:

    Doctors. During the 70s and 80s, the medical profession began to get more serious about pain treatment. Several influential articles in medical journals argued that patients who were treated with opioids rarely became addicted, and this contributed to an increased willingness to prescribe them.

    Parents. During the 80s and 90s, parents became more insistent about treating pain in their children for things like sprained ankles and broken bones. Instead of aspirin, they wanted Vicodin. This made everyone, doctors and patients alike, more comfortable about using opioids.

    Big Pharma. Pharmaceutical companies never bothered promoting morphine because it’s cheap and earns them no money. But when patented opioids like Percocet and OxyContin came onto the market, pain suddenly became a big moneymaker. This required steady introductions of new products as old ones went off patent, and therefore much more aggressive marketing than in the past.

    In 2001, this all came together when The Joint Commission, which accredits medical facilities, issued new guidelines on pain:

    • Pain should be assessed in all patients.
    • Pain intensity should be evaluated using the now-familiar 1-10 scale, and that scale should be prominently posted everywhere that patients are assessed.
    • Pain should be managed “aggressively and effectively.”
    • Patients should be instructed about pain and the importance of effective pain management.

    These days, The Joint Commission is eager to disassociate itself from this mess. Last year they issued a statement saying that the 2001 standards never so much as mentioned opioids and certainly had nothing to do with the rise in opioid use. Needless to say, this is special pleading on steroids. It’s true that The Joint Commission didn’t start the opioid epidemic, but they certainly put their blessing on it. And there’s little question that both pharma and doctors lobbied for standards that mandated more aggressive pain management.

    But that doesn’t let the rest of us off the hook. Boomer parents also bear some responsibility thanks to their unwillingness to tolerate even moderate pain in their children. Doctors were primed to respond, and even when they were skeptical they often decided that in the face of a demanding parent, the easiest course was just to prescribe an opioid and send everyone on their way. It’s not like the kids are all going to become junkies, right?

    The goal now, obviously, is to substantially reduce the routine prescription of opioids for every ache and pain—especially in children and teens—but without making life hell for chronic pain sufferers who genuinely need strong medication. We need to react, but not overreact.

  • Has Social Media Made Us All Into Political Junkies?

    Atrios today:

    Once upon a time only weirdos like me (and you, dear readers) paid this much attention to politics. I documented the minutiae that most people weren’t aware of. Now everybody knows! Social media has led to a kind of “bleed” such that everybody is aware of all of the stupid shit that once upon a time only weirdos like me (and you!) were aware of. Even if you aren’t that interested in politics, the information comes at you like the Kardashians. I don’t know if this is good or bad. Once upon a time it was easy to tune out politics. Now it is impossible.

    I think this is wildly wrong. At a guess, public knowledge of political comings and goings hasn’t changed in decades. The vast, vast majority of people pay only the slightest attention to politics, spending their time instead on soap operas, gossip magazines, kids’ soccer games, problems at work, trying to lose ten pounds, unpaid bills, pro football, the price of hamburger, and the guy down the street with the barking dog. Social media may have pushed politics into people’s faces a little more, but it’s also pushed all that other stuff into people’s faces a little more.

    However, this seems eminently measurable. So how about it, political science types? What kind of survey data do we have that measures interest in politics beyond the superficial? What other indicators might provide clues? Turnout rates aren’t up. Small-dollar political contributions are probably up (?), but that’s still a minuscule portion of the population. Total viewership of news programs is probably down. What else?

  • The Republican Party Is Not the Problem

    Over at New York magazine, Jon Chait pushes back against Lee Drutman’s notion that America is trapped in a descent into “doom-loop partisanship.” The problem, Chait says, is far more specific: one of our major political parties is normal while the other has gone nuts.

    The Only Problem in American Politics Is the Republican Party

    The psychological relationship between the parties has a certain symmetry…. superficial similarity to the terror with which partisans now greet governments controlled by the opposing party…. competing tribal epistemologies…. news media that is open to contrary facts…. no equivalent to a Rush Limbaugh in influence and sheer lunacy….etc.

    ….The doom loop Drutman describes is, in reality, both sides responding to the phenomenon of Republican extremism. Republicans are sealed off in a bubble of paranoia and rage, and Democrats are sealed off from that bubble. Democrats fear Republican government because it is dangerous and extreme. Republicans fear Democratic government because they are dangerous and extreme.

    There’s much more at the link, where Chait describes the asymmetry between the parties well. I don’t disagree with a word he says. However, I want to stress one small qualification. America is a democracy, and parties survive only if they gain popular support. Over the past couple of decades, we liberals have marveled at the steadily increasing lunacy of the Republican Party, confidently predicting at every turn that eventually the fever would break. But it hasn’t. Republicans have won the presidency at the same rate as usual. They have won the House. They have won the Senate. They control state governments. They control county governments. There are still a few blue enclaves like California where Democrats truly control things, but not many. Generally speaking, the only thing Democrats truly control in America is its big cities. Urban mayors are almost uniformly Democratic.

    In other words, the problem is not the Republican Party. The problem is that lots of people vote for the Republican Party. The lunacy will stop when that does.

    If you think this comment is pedantic, I submit that you have a deep misunderstanding of politics. Roughly speaking, liberals would do well to forget the Republican Party even exists. Their focus should be almost exclusively on how and why conservatives continue to attract the support of half the American public no matter how crazy they seem to become. Until we figure this out, things are only going to get worse.