By now you’ve probably read the news that deaths from painkillers are rising. This is indeed bad news, but the story carries with it the same assumption made in illegal drug cases: the drug or drugs in question can’t be used responsibly except under the supervision of a doctor. With illegal drugs, all use is abuse; with semi-legal drugs, all use without a prescription is abuse.
Here’s a case to consider: you hurt yourself somehow and a doctor prescribes you Vicodin. You have, say, twenty, and by the time the pain stops you’ve taken seventeen exactly as your doctor directed. Now you have three left. You noticed that in addition to relieving your pain, the drug gave you a mighty pleasant feeling. A month goes by, and one day you’re feeling less than 100%. Remembering the precautions your doctor gave you, you take one of your leftover Vicodins in a totally safe manner. Now, according to drug czar Gil Kerlikowske, you’re a drug abuser. You’re part of the “drug abuse epidemic”. You’re an accident waiting to happen. Take it a step further: you simply take one, safely, without any pain or other health-related motive. You just want to feel good for a while. Should that be a crime?
I know there’s a big difference between washing down a handful of pills with a 40 oz. and the scenario I just described. But in Kerlikowske’s statistics there isn’t.
These stories, when they pretend to be balanced, acknowledge the legitimate desire of patients with severe or chronic pain to relieve it. And that’s one reason to avoid letting scary stories rush us to foolish action. But another is the desire, not generally recognized as legitimate, of people to get high. People like to feel good. People have been discovering and inventing ways to do this since prehistory. (It’s not my cup of tea, but there’s a very large and influential group of people who believe that when God Incarnate came to earth, the very first miracle he performed to show his divine nature was assisting people to this end.)
Balancing the needs of people in pain against accidental overdose deaths is one thing. But if you must do that calculation, the proper version weighs accidental overdose deaths against the needs of people in pain and of people with a legitimate desire to alter their biochemical state.
The WSJ blog Real Time Economics reports that about 15% of the US is on food stamps. The states of David Hackett Fischer’s “backcountry” region figure heavily, as do a few others.
I’ve read that the Obama administration has sought to make it easier for people to get food stamps, and this of course has differing interpretations. One is that they genuinely want struggling people to get the extra help they need. The other is that expanding the program—making more people dependent on the government and on this administration specifically—bolsters their chances for the next election. Since the decision did not spring forth from Obama’s forehead alone we can fairly assume a mix of the two. But clearly, in an ideal world, nobody would be on food stamps. How can we get there from here?
Here I’m going to rely on some anecdotal evidence, but since I was the observer, I trust it. I lived in New Mexico recently, and between finishing college, being underemployed, and partying probably a little too much, I knew lots of people on food stamps. The impression I got, overwhelmingly, was that very few people consider food stamps a temporary measure. People might not have thought about food stamps much, but when it occurred to them that they could qualify, many considered it free money for the taking. From what I could observe, getting food stamps didn’t change anyone’s behavior much, and when it did, not in a direction an omnipotent but benevolent observer would consider better. You were poor, you got food stamps, and then you were slightly less poor. End of story.
I don’t doubt that many people who get food stamps want to keep their time on the program as short as possible, or that many actually do land that next job and leave it. But I’m not at all sure that in the bulk of cases there’s much that can be done, policy-wise, to get people off the program other than tightening its conditions.
Bryan Caplan recently tweeted this:
What other mass murdering regime inspired Stauffenberg-style tyrannicide? Imagine an American officer plot to kill Pres over Indian Wars.
Twitter’s space limitation makes it interesting for some things and not very good for others. One of those other things is a proper consideration of this question.
Even if the Allies had not captured a large number of German documents intact, the world would still have known the broad outlines of this assassination plot: the German government publicly acknowledged the attempt. As far as I know there are larger gaps in the Soviet records available to us, for the obvious reason that the German records were seized before they could be destroyed and the Soviet records were not. An incredible amount of Soviet material is available, to be sure, but much of it was released selectively, for political or diplomatic purposes.
Plots were announced by the Soviet government—always after having been discovered and eliminated—with high frequency. But the nature of Soviet propaganda makes it hard to know which were actual plots against the leadership and which were masks for some kind of purge or other. Simply put, I have trouble imagining that nobody in the Soviet leadership thought to eliminate Stalin during the incredibly bloody years of his reign before World War II. What doesn’t seem obvious a priori is how real plots would have been dealt with.
It seems, based on what we know of the Great Purge, that there was almost no substance to the criminal accusations against the majority of the people killed then. They were eliminated for potential disloyalty to the regime, in order to consolidate power. However, the regime gave this at least the thinnest pretense of legality; questions could have arisen, and this was a CYA procedure. This makes it seem as if my hypothetical plotters, people who really intended to kill Stalin, would have been executed quietly and put into the deepest memory hole possible.
On the other hand, the propaganda value of a real plot vs. that of a fake plot is identical, since the public never knows anyway. This makes it seem possible that some of the show trials we know of were “real” trials.
Either way, we’d never know.
The Washington Post reports that the D.C. metro area picked up 7,000 people aged 25-34 during the recession. What explanation do they come up with?
“It’s the economy and hipness,” said William Frey, a demographer with the Brookings Institution, who analyzed the census data comparing the 2005 to 2007 period with 2008 to 2010. “Young people are going to places that have a certain vibe. If there’s a recession, they want to ride it out in a place like that. And Washington has the extra advantage of being a government town that’s not as hard-hit by recessions as others.”
Now compare that with this:
Bert Sperling, who scours census statistics to compile lists of the best places to live — and who lives in Portland — said he expects the Washington area eventually will revert to a more traditional role, as a place to establish a career and eventually move on.
“Washington is known as a center for power and, during the recession, has taken on the image of a place where the jobs are,” he said. “I haven’t heard anyone say Washington is hip and cool. But I don’t know if you want your seat of government to be too cool and quirky.”
(Why William Frey would say that D.C. is hip at all boggles the mind. The only explanation I can think of is that he presumably lives inside the Beltway and, as such, barely has an idea that there’s a world outside of it worth living in.)
Rather than acknowledge that economic emergencies are great times to get government jobs, as 15 minutes on the Reason blog will tell you, the WP gives us this.
I’ve often wondered how exactly English came to be so dominant as a world language. The British deserve the credit for getting this started, but after that it’s a little fuzzy. Other countries, most notably Spain and France, had empires as well. The French language is spoken, at least by elites, in many countries, but most of these are former colonies. Spanish is spoken by a large number of people as well, but mostly as a first language. Yet English has by far the largest number of non-primary speakers, over a billion.
The long history of the British empire, as I said, got this started. But before they were dominant there was the Spanish empire, which now pales in linguistic influence. The relevant difference may be that Spain’s empire was (relatively) geographically concentrated, while Britain’s possessions were far-flung across the world.
One key component in the 20th century is that two great-power nations used it. There are other great powers, but no other two with a language in common. In a case of 2 vs. 1 vs. 1…, it looks like the language shared by two powers wins. (Austria-Hungary and Germany shared a language, but I have trouble classifying Austria-Hungary on the same level as the US and UK, and even within Austria-Hungary most people would have preferred speaking a language other than German.)
I still don’t have a great answer, but a study mentioned in Die Welt today says that English is the most efficient language, so perhaps that’s another reason.
The round of midterms in my first semester of the Ph.D. program is done. I did OK, not fantastic. It’s alerted me to a problem of mine: I hardly know how to study. I have never really done it before. Sure, a little here, a little there, but only on an ad hoc basis. I suppose everyone in grad school thinks he or she is the smartest one in the room, but I honestly can’t imagine that some of the folks outscoring me on the midterms are actually smarter than I am. What I can see is that they take a much more organized approach to academics than I do, that this is the superior approach to graduate school, and that I will have to adopt it immediately.
The Wikipedia article on tipping indicates that there’s no clear trend worldwide: it doesn’t seem that tipping started in one culture and then carried on in places where that culture had influence. Hong Kong, which like mainland China has no native tradition of tipping, has begun to see it under Western influence, but in Australia and New Zealand, obviously culturally Western places, it’s very rare. Jordan has a culture of tipping; Belgium does not. In many places, again in something of a checkerboard pattern, tipping is not the norm but is used to reward exceptional service, so that it is a known but not everyday phenomenon. A third common arrangement is a “service fee” or some such charge included in the bill, covering the money Americans would leave as a tip.
(I have a humorous recollection of being in Uruguay with some other Americans, asking our local friends how to say “tip” in Spanish. We had to explain what it was, and all they could wonder was why in hell you’d leave extra money after you’d already paid. There is a word, actually, but it’s not used there!)
Tipping is ingrained enough in US culture that federal minimum-wage law carves out an exemption for tipped workers. What I wonder is whether a culture that does include tipping would ever phase it out. I can think of a few ways:
- Enough people migrating from a non-tipping area to a tipping area to change the balance. This is unlikely for large countries like the US because if it became uncommon in some zone it would still be common in others around it.
- Included fees becoming common enough that people stopped tipping overall. I could imagine hard-up restaurants sneaking extra fees onto the receipt, but for this to displace tipping it would have to spread very quickly over a wide area. Or maybe it would be a gradual lowering of the automatic-gratuity threshold: from large groups to mid-sized ones, then to parties of one or two, until the fee is just an expected part of any meal.
- People becoming more rude overall, although old folks complain that this is happening already, and it hasn’t led to tipping being phased out.
- The nuclear option: the US government drops the minimum wage exemption. People now know that restaurants are paying full wages to servers, and correspondingly they stop tipping, whether because of increased food prices, the reduced social obligation, or both. I don’t know for sure, but I’m told some places in the US have a city or state minimum wage with no service exemption, e.g. Portland, so I wonder if people there are gradually tipping less. This could be counteracted, though, as mentioned in the first point.
On the contrary, tipping in the US seems to be subject to “inflation”. It used to be 15% as a general standard, as far as I’m aware, and now I see tip calculators and receipt suggestions for 18% and 20%.
Cultures with strong concern for social obligations (as proxied by popular welfare-state policies, high charitable giving, and the like) don’t seem to correlate with the prevalence of tipping.
I’m sure this has been amply covered elsewhere, I just haven’t read those sources yet.
I’ve been thinking about Warren Buffett lately. He’s on record saying that he should be paying a larger percentage of his income in taxes. Aside from the fact that for people who really believe they aren’t paying enough there’s already a place to send that check, the reaction to his comments is very interesting. Political expediency rules the day, or, as an economist might look at it, naivety.
Let’s step back a moment for a glance at some facts. Fact 1, Warren Buffett clearly has business genius to a very rare degree. Fact 2, he endorsed Barack Obama and made campaign contributions to him for the 2008 election. Fact 3, he knows that his words carry great financial value—just ask for a free copy of his report. It’s unlikely that this tax is going to go through, so essentially Warren Buffett is gambling. If the tax passes, well, he loses, although it’s not that big of a deal to him. He already has more money than he can spend, and has already announced he’s not passing on his massive fortune to his children. If the tax does not pass, he gave politically valuable support to Obama at essentially no cost to himself—a non-monetary campaign contribution.
What nobody seems to have said so far (unless I missed it) is that there’s no way he doesn’t think Obama will remember this later.
This should surprise no one, but the Wall Street Journal blog reports that “Public Sector Workers Much More Likely to Get Sick, Hurt at Work”. I suppose this includes people like firefighters, who have a genuinely risky occupation, but surely they can’t entirely account for a difference of 5.7 instances of injury or illness per 100 full-time workers versus 1.8 in the private sector, more than three times the private-sector rate.
Another angle to the JFK story to consider is that whoever the conspirators were, and whatever their motives were, Dallas was the place where they acted. One has to assume that the plot was not cobbled together on the morning of the visit. Thus the conspirators waited until Kennedy was in Dallas.
What follows from this? Either the conspirators were based in Dallas (as Oswald was), or they traveled there to kill Kennedy, or some combination of the two. This casts extra suspicion on people who knew, or presumably had access to, Kennedy’s itinerary. It hardly narrows the field, however. Various US government officials knew his schedule, and from them informants could pass the information outward to foreign governments, the Mafia, or any group sufficiently capable of maintaining some kind of intelligence service, broadly defined.
It also follows that the conspirators believed they could get away with the assassination in Dallas. (Maybe they could have gotten away with it anywhere, and Dallas just happened to be the spot where they did. Or maybe there was something special about Dallas.) In planning beforehand, they’d have needed a strategy for dealing with the authorities: either eluding the massive law enforcement presence or corrupting part of it.