Al Franken wrote to me this morning—we're tight, like that; he emails me a couple times a week—and he began:
Health care is a right. It's not a privilege.

Now, I've been in favor of universal health care in this country since before a lot of you were born, so I'm happy to see my buddy Al take up the cause. But it was the language that caught my attention. I hear this phrase a lot, these days, and I'm perplexed. Because in my youth, health care was not a right. Not even us leftie commies thought it was a right. It was something we thought everyone could and should have: but that's not quite the same thing. Rights are inalienable. They're intrinsic to being human. They're something—to stick to the ground they grew in—they're something that God intended as part of every human being's humanness.
A lot of people don't live on that ground anymore. I never did, having been raised atheist. So I'm a little cautious of the rhetoric of rights. What exactly are we talking about? Jefferson knew quite precisely. "They are endowed by their Creator with certain unalienable rights." That's clear enough. But what do I mean, when I say that people have a right to free speech? Do I mean something other than "I think everybody ought to be able to speak freely"?
I think I do. Certainly we produce these assertions, not as our own whims, but gravely, as fundamental laws of human nature: we're not talking about "laws" like laws against jaywalking—our tone implies—but "laws" like the law of gravity. We may know that an ancient Greek would have been totally baffled by the notion that health care could be a right (and would probably dispute that political rights could ever properly belong to someone who was not the male head of a freehold in the first place). But we nevertheless hold that rights are—somehow—self-evident. Like my friend Al up there. He doesn't go on to argue it. You don't need to argue things like that. You state it and you're done.
You can see that this is true from the way the Republicans have fumbled the Obamacare repeal. You might expect Paul Ryan or Mitch McConnell to say, "What nonsense. Health care is not a right. It's something you buy if you can afford it!" This is clearly what they think. But, as highly evolved political beings, they know, they can tell by its scent in the air, that to say so would be political death. So they tie themselves in knots.
And at that point, when even the guys on the other side of the fence feel they can't deny it out loud, I think that we have to say—yes, health care has in fact become a right. Whatever rights may be, access to health care is one of them. So my question, dear reader, is—how did this happen? It's a sea change. A new right has been born in our very presence. Did you see it being born? Do you understand how it happened?
Or was I simply wrong, and has it been a right all along? Just because I was there to observe the waning years of the 20th century doesn't make me an expert on them.
I would love to know what you think! This one puzzles me.