Friday, October 26, 2012

The "Coals to Newcastle" Model of Health and Medicine



The fount of vernacular springs eternal. Even in America, where most people don’t know where or what a “Newcastle” is (outside of brown ale in a yellow-labeled bottle), the term “coals to Newcastle” is still well-worn in many circles, particularly collegiate ones. For those who don’t already know: Newcastle upon Tyne has been England’s coal-exporting port and best-known coal-mining town since deep into the Middle Ages. To carry coals to Newcastle, then, is to do something that is rendered totally pointless by scale – similar to a drop of rain in the ocean, a grain of sand in a desert, or, for that matter, a vote in a presidential election (yes I said it). There are two ways to employ this phrase: to say directly that something is superfluous because it is already vastly outweighed by an overabundance of the same thing, e.g., Newcastle already has enough damn coal; and to say indirectly that a small gesture is pointless because it is vastly outweighed by other, more powerful factors, e.g., drinking a “Diet” Coke with your extra large lardburger and grease-sticks. It is with the second one that I am concerned in this essay.

Like most people of a scientific bent, I have long suspected that herbal medicines, alternative remedies, and so-called healing foods – their very concepts, in fact – were nearly, if not actually, completely hogwash. I have never taken a dose of ginseng, echinacea, kava kava, rhodiola, ashwagandha, cat’s claw, St. John’s wort, or chamomile and had it do anything other than impart a nasty taste (except that last one, which can be pretty tasty). Nor have I ever experienced any surges in immunological function from mega-doses of vitamin C, any calming effects from magnesium, any beautifying effects from bio-available silica, or any noticeable healing acceleration from colloidal silver. And as for the currently insanely popular fermented teas, I stand alongside the growing body of researchers who believe those things are potentially more harmful than beneficial.

And yet – and yet… Until recently – and only recently, in the West – nearly all medicines were taken straight out of the gardens, fields, and forests in which people dwelled. Practitioners in the field of ethnobotany have done a terrific job of tallying and studying the use of plant- and animal-derived medicines and entheogens that continue in some cultures to this very day, and to them and their cultural informants we owe – with allowances for ethical slipperiness in a lot of early cases – a considerable debt for a whole host of modern medicines. According to a 2007 article by the U.S. National Cancer Institute’s David Newman and others, around 70 percent of all new medicines introduced in the United States in the last 25 years were derived from “natural products.” Among the most widely-known and beloved examples are aspirin and quinine, which both come from tree bark; novocaine and cocaine, which come from South American bushes; opiate painkillers like codeine and morphine, which come from poppies; and birth control, the original biochemical source of which was wild yams from Mexico.

Granted, in cases of nature-derived cures such as these, the emphasis is on the word derived. The name of the game in modern medicine is to identify, isolate, synthesize, and amplify the specific molecule or molecules in the host sample in order to make it stronger, more chemically durable, more easily manufactured and marketed, and more easily regulated. While there’s no telling how much willow tree bark it would take to clear up a headache – well, I assume someone has done the experiment, but I certainly haven’t – it usually only takes a pair of aspirin tablets to do the trick, each of which is no larger than the nail on my pinky. And therein lies the root of my total lack of faith in alternative medicine as it is practiced by most Americans, most of the time.

To reiterate: I have never experienced a noticeable effect from any of the alternative remedies and supplements mentioned above, nor any others to speak of. Bear that word “noticeable” in mind. Meanwhile I, like most people in this country, smoked cigarettes for many years, drank and continue to occasionally drink alcohol, wake up in the morning with a fishbowl-sized mug of coffee, and, while I eat as cleanly and healthfully as I can manage, as a matter of course I regularly ingest immeasurable amounts of preservatives, nutrient additives, chemical flavor enhancers, industrial food-like substances, pesticides, herbicides, and so on. I also breathe city air and drink city water, and God only knows what dark forces lurk therein. And that’s me, somebody who eats what most Americans would consider a positively Spartan diet and exercises nearly every day; I can scarcely imagine the total amounts of who-knows-what that make their way into the bodies of average Americans. Against such odds, how could something as subtle as an active compound in its organic, non-isolated, non-magnified form possibly have any noticeable effects?

Note again the first three complicating factors on my list: smoking, booze, and coffee; the primary vehicles for, respectively, nicotine, alcohol, and caffeine. Smoking is falling out of fashion in the US, thankfully, and while alcohol is still very much in fashion that doesn’t mean everyone drinks it regularly and/or to the point of intoxication. But coffee is pretty ubiquitous. Tobacco, by the way, is a plant, as are the bushes from which we get coffee “beans” (actually the seeds of berries), and alcohol is produced by yeast fermenting cereal or fruit sugars, mostly in the absence of oxygen. Moreover, all three of them were originally employed for either ritual or very occasional use, at least in most cases, and were neither manufactured nor consumed in anywhere near the industrially titanic concentrations in which they’re now available (with the one exception of South American tobacco variants which were, at one time, way stronger than what we grow in Virginia). Add into that refined sugars, saturated fats, preservatives and additives and enhancers and whatever else, and you’ve got the Newcastle to which ginseng is a bucket of coals.

While I’ve mused on this line for many years, I’d never seen it experimentally tested until now. The point was driven home with some force in a recent federal study headed up by Brown University’s Rena Wing and including a remarkable supporting cast. The study was intended to measure the effects of a change in diet and exercise on obese adults with type 2 diabetes. Which seems like a foregone conclusion, when you think of it; and for that reason little more than an expensive and totally pointless waste of federal dollars – of course diet and exercise will help! Earlier studies had already shown that proper diet and exercise significantly lower blood sugar levels, blood pressure, and cholesterol. At this point I pass the mic to Gina Kolata of The New York Times:

The study randomly assigned 5,145 overweight or obese people with Type 2 diabetes to either a rigorous diet and exercise regimen or to sessions in which they got general health information. The diet involved 1,200 to 1,500 calories a day for those weighing less than 250 pounds and 1,500 to 1,800 calories a day for those weighing more. The exercise program was at least 175 minutes a week of moderate exercise.

But 11 years after the study began, researchers concluded it was futile to continue [because] the two groups had nearly identical rates of heart attacks, strokes and cardiovascular deaths.   

Www... whuh?

Noted blogger and paleoanthropologist John Hawks took this up in an online commentary, noting that “One expert quoted in the article thinks that the large effects of smoking cessation, statin drugs and blood pressure medications may swamp the small effect of diet and exercise.” To which he added this piquant reaction: “Swamp the small effect of diet and exercise on type 2 diabetes in obese patients? Wow.” (Italics in the original.)

So there you have it. None of the researchers interviewed for the Times article had anything to say about the daily intake of [use your imagination] by Americans, particularly obese ones, throughout their entire lives, although that may come up in the subsequent research articles the team is readying to publish. But they don’t have to. The study involved limiting patients to either 1,200-1,500 or 1,500-1,800 calories a day, but I can find no information about the sources of those calories. Contrary to popular belief – for some reason – a calorie is not a discrete thing, like a vitamin or a nutrient; a calorie is a unit of energy, like a joule or a kilowatt-hour, specifically the amount of energy required to raise the temperature of one gram of water by one degree Celsius at normal atmospheric pressure (the “calories” on food labels and in studies like this one are actually kilocalories). Focusing on calories is a handy dietary starting-point but it leaves out quite a lot, not least of which is the fact that literally anything from which your body can derive energy therefore contains calories, including toothpaste, cardboard, and candle wax. Exactly what does that have to say about the quality of one’s diet?
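For the unit-minded, here’s a quick back-of-the-envelope sketch of that point. The conversion factors are standard physics, and the 1,500-Calorie figure is just a number pulled from the middle of the study’s stated range, not anything the researchers reported:

```python
# A rough sanity check of the units, nothing more. Assumes the standard
# definitions: 1 dietary Calorie = 1 kilocalorie = 4,184 joules.
KCAL_TO_JOULES = 4184

daily_intake_kcal = 1500  # roughly the middle of the study's 1,200-1,800 range
daily_intake_megajoules = daily_intake_kcal * KCAL_TO_JOULES / 1e6

print(f"{daily_intake_kcal} dietary Calories is about {daily_intake_megajoules:.1f} megajoules")
# -> about 6.3 megajoules a day, a figure that says nothing about whether
#    the energy came from spinach, soda, or candle wax.
```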

Very little, I’m afraid. Author and dietary guru Michael Pollan likes to lament over our nutrient-oriented view of diet, as if getting the Recommended Daily Intake of this and that is the best way to achieve optimal health. (It isn’t – the Inuit eat almost nothing but animal fats, the San eat mostly vegetables and tubers, and the French drink wine and olive oil in almost equal measure, and all three populations boast considerably better average health than even those Americans who only shop at Whole Foods.) If there is any general rule of dietary health, it is derived not from consideration of diets as constellations of nutrients but as behavioral adaptations, and it would sound approximately like this: eat what you, by your specifically ethnic as well as generally human background, have evolved to require for maximum nutritional efficiency, whatever the hell that happens to be. It probably involves fresh vegetables. It probably doesn’t involve drugs like caffeine and alcohol. It definitely doesn’t involve high-fructose corn syrup, monosodium glutamate, potassium benzoate, Yellow #5, or DDT. And it surely isn’t helpful to augment that by breathing carbon monoxide every time you step outside or drinking chlorine every time you pour yourself a glass of water.

All of which converges on one central point: there are simply too many elements going into our bodies every hour of every day to even keep track of, no matter how healthfully we try to live, and the bulk of them speak with much louder voices than do the dried leaves in a small bag of Yogi Tea. This is America, where subtlety is simply not an option. We live large, work large, play large, drive large cars and trucks, create and endure large amounts of stress, eat large and drink large and swallow large handfuls of magnified large-molecule drugs every time we get so much as a hangnail. And we are all sick, tired, and dying, in a very large way, in the face of which herbal and other "traditional" remedies are indeed as useless as carrying coals to Newcastle. You're better off carrying a backpack away from all this. 



Sunday, August 19, 2012

Personae Non Grata



As a teenager, I once wrote an entire essay arguing that the indefinite article a/an is the first sign that you’re dealing with an imbecile. What, for example, is the difference between someone who says “I am Catholic” and someone who says “I am a Catholic”? Or someone who says “I am liberal” and someone who says “I am a liberal”? The difference is that the first person is giving one (of possibly very many) descriptors of him- or herself, while the second is identifying entirely with a social role. The same can be said of someone who says “I am a nerd” versus “I am nerdy,” someone who says “I am a spiritualist” versus “I am spiritual,” someone who says “I’m a Yankees fan” versus “I like/love/support the Yankees,” and so on. Would you call yourself feminist, or would you call yourself a feminist? Only the former is a person. The latter is an artifact, a thing, cobbled together from the hearts and minds of others.

I’ve grown up a lot since writing that original hate-speech, and I now consider it to be rather reductionist, but the principle of the thing is still valid and in fact I now consider it more apt than I ever did before. From Carl Jung – and expounded upon by Joseph Campbell – we learn that the very words person and persona come from the Latin word for masks that were used on-stage by actors in Roman drama, through which they “sounded” (per-, through; sonare, to sound) their lines. The reason for this is pretty simple: nobody came to a play with a program featuring a dramatis personae or list of characters because there weren’t any printing presses, so in order to remember who was speaking you needed only to look at his mask. If whoever is talking looks like Oedipus, then whatever he’s saying is what Oedipus is supposed to say. In society, according to Campbell, “one has to appear in one mask or another if one is to function socially at all; and even those who reject such masks can only put on others, representing rejection.”

In that one prescient sentence, Campbell predicts the hipster subculture about a half-century before its inception.

So we’ve all got roles to play, and we’re supposed to behave according to the parameters associated with those roles. Idealists like to argue that we should strip off our masks and live in a more natural state – which just underscores the point, since that’s exactly what idealists are supposed to say. In reality, we cannot, in our present civilized state, ever strip away our masks or entirely step out of our roles because of just how deep they go. We pretend we don’t like criminals, for instance, but in fact in a tacit way we mostly feel that criminals have a place in society; they are on the opposite side of the balancing-beam from philanthropists, and they provide paying jobs for cops, judges, security thugs and jailers. What we really don’t like is unpredictability. This is why “crazy” people get locked away in institutions and/or medicated into vegetation, regardless of whether they are actually a threat to themselves or anyone else – it’s because they’re unpredictable. An insane person might walk into a grocery store and start doing a strip-tease for the potato chips, and that scares us – or anyway it scares our social leaders – far more than an armed gunman. At least armed gunmen act like armed gunmen, in a predictably ruthless and naughty way.

It warrants noting that there really is a way to live outside of behaviorally-constrictive social roles, and it’s the way we all live as children before we get shipped off to the factory (elsewhere referred to as a “school”) to have our spontaneity stripped away and replaced with predictability. That’s what schools really teach, incidentally; you can jolly well learn readin’ and writin’ and ‘rithmetic at home with your parents, but only by having both stern adults and a densely-packed peer group referring to you over and over as “the nice boy” or “the mean girl” or “the joker” or “the scholar” or “the jock” or whatever do you get pigeonholed into the role that’s supposedly your identity. Before this process occurs, however, we are as malleable as clay, being totally different people in each and every setting in which we find ourselves. Naughty, nice, friendly, lazy, courteous, quiet, loud, thoughtful, reckless… we respond to our environments the way chameleons respond to colors. And in fact it’s the very spontaneity of pre-adolescence that Zen people and their ilk aim for in their practice. The end game of non-attachment doesn’t mean letting go of your television; it means letting go of your supposed identity and, ultimately, your very self.

But that’s for Zen people. As for the rest of us, we come to adopt our social identity as our actual identity not because we’re lazy or sheep-like but because, contrary to a silly cliché, we only see ourselves in other people’s eyes. Everything we exude, every action we take or appearance we assume creates an impression in others, and it’s that impression that shapes our conceptions of ourselves. Which seems simple enough and is actually pretty self-evident, but for the following: when we focus too much on the impressions we make on others it makes us appear to be fake, our identity mere affectation. True, in the strictest philosophical sense all identities are fake – if by “fake” one means inorganic, or not arising of its own accord – but, like the film buff who detests spoilers and behind-the-scenes stills, we loathe actually seeing that fakeness. Nobody goes to a puppet show to appreciate the wires. And the fakeness of identity is nowhere more obvious and grating than in people who identify themselves wholly, emphatically, and exclusively with the details of a single social role.

You don’t have to look far to find examples of this. Leak a story or start an internet meme that associates toughness with eating radishes, and the stores around every frat house will suddenly be radish-free. Plaster pictures or stories all over the place linking rock-a-billy music with Sailor Jerry-style tattoos, linking Dr. Who with steampunk, linking hippie music with hula-hoops or tight ropes (call it a “slack line” all you want; it’s still the same thing I saw in the circus when I was nine), linking night-clubbing with bright things in martini glasses (seriously, as a former bartender, how in the hell do you dance holding onto a cosmo or an appletini? I can’t even carry one of those things without spilling it all over me), linking rap music with sagging jeans, linking snowboarding with silly hats… and those aren’t even hypothetical examples, they’re real ones…. Anyway, spread the idea that thing X is intimately associated with thing Y, and everyone whose identity is solely and obsessively fused to one of them will ravenously snap up the other. 

I don’t often quote psychologists – indeed, I don’t often even acknowledge them – but I do have a few favorites, including the abovementioned Jung. Another of my favorites is A. H. Maslow. Known principally for his “hierarchy of needs,” Maslow was also the first prominent psychologist to make the seemingly heretical claim that pleasure might not be a mere by-product of gratified biological urges but may be an end in itself. Being happy for the sake of being happy, instead of as a reward for good works – imagine such a thing! Another of his contributions was the notion that personal identity is real, or that it can be real, but that it’s often hidden under a rolling snowball’s worth of accumulated bullshit. Said Maslow of finding one’s identity:

The loss of illusions and the discovery of identity, though painful at first, can be ultimately exhilarating and strengthening.

Yes, it’s painful to cease being the center of your own attention, the precisely-crafted model of a comfortingly narrow role within a comfortingly narrow subset of the larger and more impersonal culture that surrounds it. You will no longer be an Alpha Nerd if you admit that while you like The Big Bang Theory you don’t like the original Star Trek; no longer be an Alpha Bro if you admit that while you think Axe Body Spray smells okay you also think wearing a ball cap at a 45-degree angle looks ridiculous; no longer be an Alpha Hippie if you admit that while you think getting stoned is fun you also find that eating organic burritos all the time gives you painful gas; no longer be an Alpha Hipster if you admit that while you genuinely enjoy American Spirits you have always thought Pabst Blue Ribbon tastes like someone wrung a gym sock into a can… No, I’m sorry to say, once you start letting go of some of the prescriptive baubles of your selected subculture you can no longer sneer at everyone else like they just limped onto the field, but at least you’ll be one step farther from obviously and overtly fake. Mix it up enough and you might just pass as no more fake than the rest of us.  




Friday, August 3, 2012

Online Dating: A Short Treatise on a Target-Rich Environment


Online dating has evolved from an esoteric practice to a social phenomenon, a cash cow, and, as more and more people turn to it, a legitimate way to find romance (if such a thing exists). Internet-based dating services pop up at a rate of over 100 per year, catering to every niche and nod that tickles people’s fancies along the way. Yet among a goodly number of people the whole idea of online dating still shoulders an unpleasant stigma, conjuring images of pudgy shut-ins with no redeeming physical qualities and no social skills, because – let’s be honest, here – that’s how it all began and – let’s be more honest – in a lot of ways that’s still the case. But that isn’t the case exclusively, and it’s becoming less so with each new year and new flock of people willing to give online dating a try. My purpose in this article is to explore why and how that works.

For anyone who has deigned to try online dating, its appeals are fairly straightforward and obvious. For one thing, in the real world you never know if that person you’re eyeballing from across the room is single or taken, and in fact in most cases (not because of luck but as a consequence of that ultimate buzzkiller: statistics) it’s the latter. On a dating website you have only to read a person’s status. Moreover, because everyone’s there for a stated purpose, there’s no need for awkward fumbling-with-the-laces questioning. The ice, if you will, comes pre-broken.

Accessibility is another boon of online dating. Not only do you not have to squeeze a night at the bar into what for many Americans is already a hectic schedule, you don’t even have to be in the same place. I’ve pre-emptively set up dates in places where I was either moving to or planning to spend a stint on an archaeological project (what I do when I’m not writing), and with no more or less success than normal. I don’t care how smooth you are, you can’t astrally project smoothness over state lines. I personally didn't find this to be the shiniest of appeals when I tried online dating because I'm often recklessly confident and a pretty shameless flirt, e.g., I met my second-to-last girlfriend standing in a crowded city bus. But for some people it's a godsend.

And then there’s what social scientists call the online disinhibition effect, whereby a measure of distance between oneself and one’s audience – in William Golding’s Lord of the Flies, Jack’s painted face accomplished the same thing, freeing him from shame and self-consciousness – makes a person less inhibited. This is probably also the case with the popularity of texting; if people can’t hear your voice they can’t hear your hesitance, your nervousness, or your lisp (nor can you hear their boredom or reproach). Behind the safety glass of the internet you can really cut loose, freely admit to being a cat-lover or a pervert or a vegan or a religious fanatic or whatever, without having to meet anyone’s critical gaze. Some people are born with the ability to be wholly socially confident right out of the gate; others, like myself, possess it now but really had to work at it; and for everyone else there’s online dating.

Finally, there’s the geography of the dating game: the internet broadens one’s range. This benefit correlates inversely with a person’s confidence, i.e., the more cripplingly insecure a person is the more online dating improves his or her chances of meeting a potential mate – having nowhere to go from zero but up. But even for the measurably confident it can buffer the stakes. Take me: I’ve got about a 20% success rate in the world of face-to-face dating, “success” here meaning one responds to another’s advances and at least one decent date ensues; and about a 3-7% success rate online (for reasons I’ll get into shortly). That’s about a 25% combined success rate overall if I’m hitting both avenues. The only sure-fire way for me to get more dates is to either a) take a college course in my subject, where I tend to stand out as "that motherfucker who always talks in class;" or b) lower my standards.
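For what it’s worth, that “about 25%” is roughly what the arithmetic gives you. Here’s a minimal sketch, under the admittedly shaky assumptions that the two avenues are independent and that “success rate” means the chance any one attempt leads to a decent date; the 20% and 5% figures are just my own rough numbers from above:

```python
# Rough combination of the two "success rates" quoted above, assuming the
# in-person and online avenues are independent shots.
p_in_person = 0.20  # the face-to-face rate claimed above
p_online = 0.05     # middle of the 3-7% online range

# Chance that at least one of the two avenues pans out on a given try:
p_combined = 1 - (1 - p_in_person) * (1 - p_online)
print(f"combined success rate: {p_combined:.0%}")  # -> 24%, i.e., "about 25%"
```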

Naysayers will balk, of course, and rightly so. Dating profiles can at times be so informative they take away all chance of surprise. Part of the adventure of romance is getting to know somebody new. Then there’s the amount of time it often takes to “feel each other out” – without the benefit of in-person contact it can be very difficult to satisfactorily express oneself and one’s desires, particularly for those who aren’t professional writers (yep - sorry). And of course there are online dating’s biggest and most obvious problems, the participants themselves, the worst of whom summarily fit into one of three separate and discrete groups:

1.      Liars – that is, people who lie about their weight, age, and/or other physical attributes. Beyond the obvious “you don’t look anything like your pictures” that resounds in the online dating world like a tolling bell, I have an especially ugly tale of this sort to tell. When I was 22 I checked out Yahoo’s fledgling Personals service (probably my first mistake) and immediately hit it off with a cute, heavily-tattooed 21-year-old – or so her ID said – skate punk I'll call Hailey, although Pinocchiette The Whore would work just as well. Everything was fun and exciting for about a month, until one day we were in the mall for some reason when we ran into some pimply-faced teenager at the food court. Hailey showed him her newest tattoo, an enormous piece on her upper back, and he exclaimed, “Wow Hailey, I can’t believe you’ve got so many tattoos already and you’re only 16!” So watch out for that.

2.      Liars – that is, people who misrepresent their personalities or character traits. This is more common than you’d think, even more so than people misrepresenting their looks, and it has to do with our magnificent culture and its tendency to teach people that “putting your best foot forward” means “making up a bunch of fake shit about yourself as if you were bait on a hook.” I can’t think of how I could limit myself to just one specific example of this, so I’ll provide a broad one: while attempting to find dates online I’ve gone on dates with something like 30 women who described themselves as “adventurous,” exactly one of whom owned so much as a pair of boots. One of them called herself "athletic" and was in fact obese, like morbidly obese; she's athletic in that she's "going to the gym every day trying to get in shape." I can only imagine it’s just as bad with guys.

3.      Liars – that is, people who lie about their non-physical, non-psychological facets, the most common of which are men or women who lie about their relationship status. Nobody likes to be someone’s “other” man or woman (well, okay, some people do, but they prefer to know it), and it’s murderously infuriating to find out that a potential lover is actually married and trolling for an affair. I’ve never had this happen to me, personally, but it has happened to a few of my female friends and they were incensed. Luckily, because all things on the internet tend to move ahead apace, there is a handy workaround for this last one that works about 25% of the time (which is enough): grab a few photos and plug them into Google’s Image Search. If those same images appear on a social networking site, it’ll pop up. Incidentally, this is also a handy way to see if a potential date has gained 50 pounds since that photo was taken, and/or is or isn’t really a billionaire astronaut with an Olympic medal in sex.

Of course, honesty in its own right doesn’t always count for much if it’s poorly packaged and presented, and because of the aforementioned disinhibition effect of online dating you see that rather a lot. One woman who contacted me had this as the opening line of her dating profile:

First off, I am a heavier person, and if you’re so shallow that you can’t see past that then don’t even bother contacting me!

Ignoring the obvious (e.g., that it isn’t a case of “see” so much as “feel,” and that anyone who couldn’t see that she was overweight hadn’t bothered to look at her photos), this is an upfront accusation of prejudice. How does not being attracted to overweight people make someone shallow? Being curvaceous or slightly overweight is one thing, but this woman was positively rotund and that isn’t a matter of shallowness – it’s a serious health risk. Had she instead opened with “I know I’m overweight but I’m working on it” or “I know I’m overweight but I’m totally okay with myself” – that is, had she spun the argument around so that it celebrated her rather than made her seem judgmental – then... well... more likely someone else would have been willing to date her. Someone not me.

And it isn’t just women; I knew a guy in Oregon who pulled a similar stunt. His line was this:

I am a no-nonsense badass looking to get the most out of life. I’m not interested in censoring myself or dividing my time between my passions and some chick. Hit me up if you want to come along for the ride!

He thought it made him come across as a tough guy. It didn’t. It made him come across as the last person in the world any sane-minded woman would want to date.

Anyway, the bottom line is that dating – every form of dating – is a numbers game. You will never meet all of the single people in a given region, city, or even neighborhood, and if you’re serious about wanting to find romance you do well to bolster your chances by broadening your range to include online venues. Online dating sites are, after all, places where people go for almost no other reason; they represent what a military strategist would call a “target-rich environment.” Sure, there are risks, but they’re mostly - mostly - harmless and usually easy to evade. Besides, if you aren’t willing to take a few risks then your love life is probably doomed already.



Friday, June 15, 2012

Relegated to the Quacks: The Reiterative Function of Medical Dysfunction

Much ink has been spilled on the topic of the legitimacy of what many consider the “trendy” modern maladies, including Lyme disease. I contracted Lyme a little over a year ago in a non-endemic area in western Utah while working on an archaeological project, and was diagnosed about seven months later by a plucky physician’s assistant who had the unprecedented courage to order a serological Lyme test in a place where it supposedly doesn’t exist. I hope his boss never finds out… Anyway, the circus that I endured both before and after diagnosis prompted me to investigate the Lyme phenomenon from every possible angle, a task made much easier by my being a graduate student with access to medical research journals (they don’t like the public to be able to see those). The point of this article is a narrow, forceful attack on how sufferers of elusive maladies like Lyme are treated by mainstream medical practitioners and the impact this has on both the patients and the maladies themselves.

During the months I spent enduring horrific mental, emotional and physiological symptoms before diagnosis, during all but the last few weeks of which I did not have health insurance, I was treated to a hilariously infuriating cavalcade of white-coated clowns demonstrating various levels of ineptitude, the most delightfully baffling of which took place at Flagstaff Medical Center (FMC) in Flagstaff, AZ. I was admitted with (as usual) severe dizziness, cognitive dysfunction, heart palpitations, chronic headache unresponsive to pain medication, twitchiness, jumpiness, insomnia, overwhelming fatigue, stomachache and occasional nausea, mild fever, and “tracers” in my vision that reminded me of when I took LSD as a teenager. The doctor in the ER listened carefully to my symptoms, had a look at the pile of blood tests and heart tests and whatever all else I carried in my little medical folder, and then launched into a pejorative rant about the dangers of medical trends.

“There is no reason to test for Lyme disease because it doesn’t exist here; everyone thinks they have it because they saw it on CNN or something. And I’ll bet people have also suggested you might have a food intolerance? Or a yeast overgrowth? Or fibromyalgia or chronic fatigue syndrome…?” I nodded – weakly – and he nodded back, his bearded face suddenly looking less professional and more menacingly cocky. “It is extremely frustrating how we doctors have to deal with these trends. People see something on TV, and *bam* they think they’ve got it. And when we legitimate doctors try to tell them they don’t have these conditions, then they go and find ‘alternative’ witch doctors and snake oil salesmen who tell them they do. I blame it on the internet… Why, back in my day…” And on it went. After I’d sat in considerable pain and misery listening to this diatribe against the dangers of trends, I finally heard his diagnosis: anxiety/depression. Full stop. His advice was for me to drive back to Page (two hours away, in the dark of night, while unable to focus my eyes or stop shaking or remember what road signs mean, thus making his advice not only unethical and very stupid but also criminally dangerous) and start seeing a shrink.

This same story has – with alarmingly little variation – been repeated over and over again among sufferers of mysterious, novel, and/or elusive maladies throughout the nation. The annals of both anecdotal and clinical reports abound with such nonsense, and many authors contend that Lyme patients see, on average, about twelve or fifteen doctors before they finally find one who even bothers to test for it (in my case it was fourteen). Even in the face of what should be fairly conspicuous coincidence these sorts of shenanigans persist: a friend and colleague in Salt Lake City, with whom I’d been conducting fieldwork when my symptoms began, developed a similar suite of symptoms on an identical timeline and was told by half a dozen doctors that “depression can cause all of these symptoms,” even after she told them about me.

I mention these anecdotal tales to set up two lines of criticism. First, with regard to my experience at FMC, there is something not only illogical but frankly and insidiously wretched about lecturing someone about the stupidity of jumping on trends and then stamping that person with the trendiest diagnosis of all. In addition to being just plain hypocrisy, it also speaks volumes about the current culture of medical practice in the US. It is taboo – indeed, it is practically anathema – for a doctor to say to a patient, “I don’t know.” Someone who walks into a hospital, pays through the nose, and then leaves with a diagnosis of “beats me” can come back with a lawyer. This is why it’s handy to have umbrella diagnoses around for especially troubling cases, but, sadly, use of the words spirit or humors is no longer fashionable, and blaming mysterious ailments on a person’s genetics took on an unpalatable flavor following the rise of eugenics and Nazism. So doctors have a new one: variously called “psychosomatic” or “somatoform” syndromes, and most often assuming the rubber-stamp guise of anxiety/depression. This is nothing more than medieval humors for the modern era, a nebulous and all-inclusive diagnosis that is both legally legit and wholly scientifically invalid.

This leads then to my second point: with regard to both my and my colleague’s so-called diagnosis and the pronouncement that “depression can cause all of these symptoms,” a single afternoon’s worth of research belies the validity of that statement. One particularly pernicious fallacy among people who don’t understand statistics, but use them anyway, is that of thinking probabilities can be stacked. If, in other words, there is “a chance” for A and “a chance” for B and “a chance” for C, then there is “a chance” for ABC – period. In reality, the breadth of that chance narrows at each successive step. Think back to middle school arithmetic: the chance of a coin landing on heads is 50%. Add another coin landing on heads, and the chance drops to 25%. Add another, and it drops to 12.5%. There is only a 6% chance of getting four heads in a row, a 3% chance of five in a row, and so on – and that’s starting with 50/50 odds. The chance of depression causing chronic headaches that don’t respond to most painkillers is somewhere closer to about 5%. The chances of depression causing overwhelming fatigue, body aches, chills, memory lapses, general cognitive dysfunction, and gastrointestinal distress are about the same, and while the chance of depression causing a detectable fever is even lower I’ll grant it 5% just to keep things even and give doctors the benefit of the doubt. That means the chance of depression causing all eight of those symptoms is… about 0.000000004%, or somewhere around one in 26 billion. Tack on the helpful addendum “anxiety,” which is characterized by chronic muscular strain and thus has a higher probability of causing physiological ailments, and the overall chance does indeed rise – from around one in 26 billion to something closer to one in a billion. Granted, one in a billion is still “a chance” in many people’s eyes, or else Las Vegas wouldn’t exist, but as a medical diagnosis you might as well be waving a wand at someone.
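If you want to see that multiplication spelled out, here is a minimal sketch. It assumes, as the argument above implicitly does, that the symptoms are independent events, and the 5% figures are my own rough estimates rather than anything clinical:

```python
# The "probability stacking" arithmetic from above, spelled out.
# Assumes independence between symptoms and a rough 5% chance apiece.
p_per_symptom = 0.05
symptoms = [
    "chronic headache", "overwhelming fatigue", "body aches", "chills",
    "memory lapses", "cognitive dysfunction", "GI distress", "fever",
]

p_all = p_per_symptom ** len(symptoms)
print(f"chance of depression causing all {len(symptoms)}: {p_all:.1e}")  # ~3.9e-11
print(f"or about one in {round(1 / p_all / 1e9)} billion")               # ~one in 26 billion

# The coin-flip version of the same idea: 0.5 ** 4 = 0.0625, i.e. about a
# 6% chance of four heads in a row.
```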

Which, sadly, is exactly what ends up happening to a lot of us: having been diagnosed as crazy, we find that the only avenue left for effective treatment is “alternative” in the mainstream sense. Thus the “witch doctors and snake oil salesmen” mentioned by that insidious buffoon in Flagstaff. Do a Google search for Lyme+disease+cures and have a look at the results – you’ll see everything from herbal remedies to DVD-based hypnotherapy to peroxide injections to hyperbaric chamber therapy, among rather a lot else. And indeed, some or even most forms of alternative medicine have at least some merit, especially herbal remedies, but the scientist in me looks askance at a lot of that stuff. There is no regulation on its production or use and very little research to support its efficacy. Which is okay, if you aren’t really sick; feedback machines and at-home hypnotherapy may not work if you’ve got a genuine infection, but they won’t kill you either (although peroxide injections and hyperbaric chambers certainly can). The main problem with these alternative treatments isn’t whether or not they work, it’s the sociocultural ramifications. For every new magic wand that gets sold under the heading Cure Your Lyme Disease there’s another support beam under the mainstream medical opinion that Lyme disease is, at least in many cases, nonsense. If that’s true, then it really is all in people’s heads. Which legitimizes doctors calling Lyme patients crazy. Which causes them to seek alternative treatments. Which creates an economic need for ever more clever magic wands. Which adds another support beam… and so on. This is what’s known among scientists as a reiterative process or, more commonly, a feedback loop. The problem, by design, can and will only make itself worse.

So how do you break that loop? Of the many effective means I’ve found, one stands out as the simplest: demand proof. If your doctor feels that he or she is also entirely qualified to hand out psychiatric diagnoses, then demand to see a psychiatric degree. If your doctor tells you that anxiety/depression can cause your symptoms, ask what research supports that hypothesis. No lawyer would get away with saying “because I know these things” when giving evidence in a tricky case, and no doctor should be able to either. The appeal to authority is not enough when that authority has demonstrated itself time and time again to be grossly incompetent, easily corruptible, and, increasingly, just plain malicious. Until our government passes a Patient Autonomy Act you pretty much have to create your own autonomy, and it starts with not being afraid to call your doctor the crazy one.

Friday, May 11, 2012

Faux Toes

Because people keep asking me to do this...
Rattlesnake Poetry

A Blog about Blogging

*Clears throat*...

Recently, someone posted an online op-ed piece (read: blog - call it what the fuck it is, people) on her experiences at the university in what is technically my hometown. It looked like this. The responses were so numerous and vitriolic that, were it not accompanied by a picture of the author, I would almost suspect the article was secretly written by Rush Limbaugh as a way to get hated by even more people. ("What demographic haven't I pissed off yet...? I know - I'll pose as a self-righteous young whipper-snapper publicly denigrating a small city that nobody's ever heard of!")

This was my original response as it appeared on that site:

American journalists are a pretty low form of life, and the lowest of them are doe-eyed op-ed newspaper bloggers. I didn't encounter any Noble Truths in this article, just a short series of criticisms by someone who wants to sound clever. Still, the grammar wasn't horrific and I did enjoy the "partying heart beating furiously at its center" metaphor - if only for the irony of trying to imagine a partying heart. But then that's modern hipster poetry-prose in sum. Overall I give it a C. 

Having said that: what exactly did she expect? When did Binghamton become anything more than a dreary semi-no-man's-land, and how does anyone with more than a kindergarten education not know that already? The "famous" bars of Binghamton? Yes, I suppose that's true, if you move there from Harpursville or Elmira or Mogadishu; otherwise they're standard dives for a dying city with one university and too much rain, Seattle's grumpy little sister who never learned to play nice but still looks kind of sexy on her motorcycle. I moved out west 10 years ago and, whenever people ask where I'm from, I recommend they watch a few episodes of The Twilight Zone. But then I know that. Everyone knows that, including the people who still live there and love it. Anyone who expects differently deserves what (in this case: she) gets.

This is what I'd like to add now:

That article has been viewed tens of thousands of times, at this point, which means it's been seen by more people than have read my books, published and unpublished articles, and Cracked contributions combined. It also generated something approaching a million lines of text in response (one woman wrote a diatribe in the Comments section that was actually longer than the article itself). To which I must humbly say: bravo.

I didn't like the article very much, although I didn't exactly hate it either. I've always dogged on my hometown because it's grey and dreary and full of racists and other ne'er-do-wells, but that's always been in the tone of good-natured ribbing rather than outright hatred. I lived in New Orleans for five years after leaving Binghamton and, let me tell you, for greyness AND dreariness AND dangerous criminals you really cannot beat New Orleans. But I still love the place. I love both of them. I just love them the way you love a parent who drinks too much; a sort of "Oh you" hands-on-hips/shaking-head *sigh* kind of love, like I wish it had done more with its life. But then that's probably what inspired me to do so much more with my own life. How many people grow up in Beverly Hills and decide, as an act of defiance, to go out and become a professional archaeologist, author, adventurer, research biologist, behavioral ecologist, teacher, artist, medical activist... Probably close to zero. Life is just too good in Beverly Hills, and when it's not good you can always snort coke and mooch off the "goodness" of your neighbors. Not so in Binghamton (at least not the second one).

I also think she leaned too heavily on Harry Potter references. Four years of writing instruction should have taught her to either be more subtle in her allusions or, at the very least, not direct them all at one source. And I don't count the "Noble Truths" bit as a reference to Buddhism because the article didn't use Gautama's fourfold prescriptive philosophical structure - it just spouted a bunch of opinions, roughly (but not actually) adding up to about four.

But, and here is the bravo point, this is the direction in which I like to see the world going. I don't want to repeat myself too much, so let me just reiterate a point I make very often about music, movies, and the academic and medical industries: the days of the Big Dogs spoon-feeding us whatever the fuck they feel like are numbered. For most of the history of Western civilization things like knowledge, entertainment, medicine, and the news were in the hands of people who knew damned good and well that the commonweal had to swallow what they were handed because nobody had any other choice. Not so anymore. You can now go online and cherry-pick good songs from albums without having to taste any of the filler, access databases and libraries around the world and conduct your own research, find medicines and medical practitioners galore that efficaciously outdistance most of the white-coated retards at the local clinic, and so on. And the same is true for what we generously call the news. Think Fox is too slanted? Think CNN is too dog-and-pony-show? Think the local news focuses too much on "it will kill your children" hooks and puff pieces? Think NPR is just too.... NPR? Go online. See what the people on the ground are reporting. Scroll down to the Comments and see what other people think about what they're reporting. Leave some comments yourself. In other words: let the Big Dogs know that we don't have to put up with their acrimonious bullshit anymore. We have alternatives now, more than ever before. Their reign of tyranny is at an end.

You may not agree with the author of the Binghamton-slamming blog. Or you may agree with her article and think its droves of critics are just childishly chanting "I know you are but what am I!" (Or, for that matter, you might not read the thing at all - that's a freedom you have too, after all.) But it's no worse than anything I've ever heard on Fox or CNN or MSNBC and it's better than anything Limbaugh ever squatted and shat out. So I'm closing this with a reference of my own, a little older and more threadbare than JK Rowling but also a bit more poignant:

"I disapprove of what you say, but I will defend to the death your right to say it." ~ Evelyn Beatrice Hall

Monday, April 23, 2012

Dear: Salt Lake City (Spring 2012),


My good friend Shelley presciently – if somewhat suspiciously – told me when I first moved back here in January that “this time it’s gonna be better than last time.” I considered it dubious because last time I moved here, in 2010, I was in perfect health and had just graduated from college; by contrast, this time I moved here following a long bout with neurological Lyme disease having just spent about nine months barely able to move or remember how to tie my shoelaces. Also my beloved grandmother had just died. How, I wondered, could this possibly be a better experience?

Oddly, strangely, and very unexpectedly, she was right. Last time I lived in Salt Lake City, in “perfect health,” I was here for almost a year and spent most of that time lonely, miserable and bitchy. No friends to speak of, no lovers at all, and a whole series of just downright ridiculous living situations that ended with me living in a hostel for over a month. When I left for Glen Canyon a little over a year ago I considered it a blessed escape from a prolonged nightmare. And then Fate, being the bastard that it is, taught me what a real prolonged nightmare feels like…

And now: despite having moved here with an IV catheter in my arm and an ongoing “where the hell am I” mindset, and in just under five months, this time it really has been cool. I’ve had a great (and successful) time in grad school, a small handful of really awesome romances, made some great new friends and reconnected with some great old ones, and even had a few sweet adventures. Granted, it hasn’t all been roses – I’m pretty sure the infection is dead but I’m still recovering from the damage it wreaked, as well as recovering from sinus surgery – but it has certainly been a lot MORE roses than last time.

So this is my final thought in the last few weeks of school and packing before spending my summer back in what I consider my real homeland: you never know what’s coming. You really, really never know what’s coming.    



Tuesday, April 17, 2012

"I Recommend Tragedy"

Continuing my “because somebody asked” blog/essay series, this edition finds me addressing the topic of personal suffering through the viewpoints of a) the ego or identity and b) the interior workings of complex phenomena outside of the human brain. Here is the first one.

After being diagnosed with mesothelioma – way before effective treatments of this rare and especially unpleasant form of cancer spread into the market – Dr. Stephen Jay Gould, responding to assertions that the most effective cure for a statistically incurable disease is a positive attitude, quipped, “From the depths of my skeptical and rationalist soul, I ask the Lord to protect me from California touchie-feeliedom” and, less concisely,

We must stand resolutely against an unintended cruelty of the “positive attitude” movement – insidious slippage into a rhetoric of blame for those who cannot overcome their personal despair and call up positivity from some internal depth. We build our personalities laboriously and through many years, and we cannot order fundamental changes just because we might value their utility: no button reading “positive attitude” protrudes from our hearts, and no finger can coerce positivity into immediate action by a single and painless pressing.

It is, in other words, not within our power to dissolve our own egos, wipe the slate clean, and shift gears from curmudgeon to roseate at the drop of a hat for the express purpose of buttressing a psychological defense strategy against something for which there is little other feasible defense.

Or so sayeth the late, great SJG. While I usually agree with practically everything Gould has to say about everything, I have yet to find a single person of any intellectual caliber with whom I don’t disagree at one time or another. In fact the ego can be dissolved, albeit not easily, and we really can wipe the slate clean and start anew. All it takes is to realize that your ego is not you; it is an image in terms of which you present yourself to yourself, and to others, and by which you are measured and regarded. In that sense it is like a mask, and it is pivotal at this point to reiterate a curious piece of linguistic evolution that I’ve mentioned before: the term personality, and its root form persona, come from the Latin word for the masks worn by actors on-stage (per = through, as in perceive or pervade; son = sound, as in sonic or sonar), the personae – plural of persona, which is still used interchangeably with “personality” – being the masks that told the audience who was talking. This was important because, without the mask, audience members wouldn’t be able to tell if the person onstage was Zeus or Aristophanes or a peasant or even a cloud. Donning the mask relaxed the actor’s burden of having to *play* so complicated a role as a god or a cloud, allowing him to simply speak the lines and make the motions. The audience knew at a glance who/what was speaking.

For reasons known best to scholars who have a hell of a lot more free time than me, this term has moved offstage and into our everyday usage, and, moreover, our personalities are now considered among our most valuable possessions. Your persona is who you are. Someone devoid of personality is either considered vapid or, on a bad day, a sociopath; it is our personalities, and the trappings and signals attached to them, that define us. “He is a nice guy.” “She is a Christian who likes to play Frisbee.” “Those people are hippies but they also shop at Whole Foods which means their anti-corporate rhetoric is at odds with their behavior…” and so on. Our personalities, our masks, are supposed to be immutable, and people become nervous when someone is or becomes unpredictable. So we act as though our personae were our selves. They aren’t.

One of the things that Zen training advocates is dissolution of the illusion that we, as entities, are discrete and separate from everything else. But you don’t need Zen to hammer this point, and in fact I sometimes feel that I lean on Zen and other so-called Eastern philosophies way too often. Instead, just think about when you were a child: you were one person at home, another person at school, quite another person playing with your friends, and yet an altogether other person when visiting your grandparents or attending church. As time goes on we encourage – if not actually force – children to assume socially recognizable roles, and even if those roles aren’t considered “good” or “moral” they’re still preferable to not having a role at all. Thus there is a place in our society for criminals, and it’s within reason to be a thug, but God forbid you’re a criminal or a thug one day and a charitable philanthropist the next. Unpredictability is anathema in our culture; it is worse than being evil, which is at least immutable. However, strip away all of that – all the favorite foods and music, all the Type A or Type B nonsense, all the experience and preference and taste and whatever all else – and what’s left? A spontaneous being is left, like a shark or a chimpanzee, which sometimes moves in detectable patterns but is by no means constrained by them.

So: how does one go about dissolving the ego? There is a broad array of methods, only a few of which I’ll mention here.

First, there are drugs. Psychedelic, psychotropic, and/or dissociative drugs can pretty well wipe one’s slate, and in fact even alcohol can do it to the extent that alcohol intoxication tends to strip away inhibitions and social mores (in vino, for better or for worse, truly veritas). This was among the earliest and most significant aspects of drugs like LSD and DMT elucidated by researchers: someone in the throes of such chemicals becomes, as it were, primitive in the sense that he or she becomes disconnected from what is best described as “personal reality.” The problems with this method are twofold: first, drugs are dangerous, and even in societies where psychoactive compounds aren’t illegal they are (usually) used exclusively by, or under the direction of, shamans; and second, as easily as people in such states can become disconnected from personal reality, they can also become disconnected from capital-r Reality. As a whole. Permanently.

Second, there is meditation and/or prayer, two seemingly very different practices that, in fact, yield identical results. Among Buddhist, Taoist, Hindu, Yogi, and similar-scoped practitioners, meditation is a practice that rids one of the illusion of separateness of identity and illuminates or “enlightens” our total connection with the universe. Among Christians, Muslims, Jews, and other practitioners of that ilk, intense and rigorous prayer is a practice that decries the transience of human life and connects us to the great and irreducible infinitude of which we are created and to which we inevitably return, a.k.a., God. What’s the difference? But meditation has its own drawbacks, chief among which: it’s hard, at least for a lot of people. It involves lots of sitting and lots of deep breathing and, worse, trying not to let our minds wander indiscriminately, and, worse still, it takes time and effort that could be spent doing something else. All culture is distracting (hell, all of *life* is distracting, and it probably wasn’t a whole lot easier to sit in silent contemplation beside a fire when you had to keep feeding the fire in order to keep the lions away), but our culture in particular has a distinctive talent for distraction, specializes in it, revels in it, positively rejoices in it. If we aren’t entertained for whole seconds we get bored, for whole minutes we get restless, for whole hours we start to claw at the walls, and it’s little wonder that the second-to-worst punishment in our society is solitary confinement. A Buddhist or Franciscan monk would delight in being put in a quiet cell with reliable food service for a couple of days, having nothing to do but sit on the floor and contemplate Existence. Meanwhile the average American starts tapping his or her feet when the elevator is a couple of seconds too slow, and a traffic light that doesn’t change rapidly enough can lead to a multi-car pileup involving blood, pain, mayhem, and – worst of all – lawyers, doctors, and cops. So, for most Americans, meditation is probably not an option. 

But then there’s a third option, although I hesitate to use the word “option” because you can’t really plan for it: horror, catastrophe, tragedy. There’s a term that psychologists use to explain what happens when someone goes through a horrific event, and it is, not coincidentally, the same term psychologists use to describe the state of mind of one who is on a heavy psychedelic drug trip or in a very deep meditative state, and that term is “dissociation.” It is, to use just one of many definitions (because it’s the one that I prefer, and for no other reason than that), a state of mind characterized by manic or intractable distractedness from current goings-on, not just around the subject but also within the subject’s own head. Dissolution of the self, in other words, insofar as the “self” is the sum total both of one’s perceptions and of one’s perceptions of those perceptions. Another word for that would be “zombification,” in popular terminology, and it too can be created by a certain class of drugs, but let us not muddy the waters…

So just how effective is tragedy at dissolving the ego? Dipping back into Buddhism for a moment: long before he became “the Buddha,” young Siddhartha Gautama lost his mother, who died under mysterious circumstances when he was only a week old. In the words of author and psychiatrist Mark Epstein: “Something tragic happened… right at the beginning. That might be what it takes to become a Buddha, is that you have to suffer on such a primitive level.” This by itself sounds kind of dire. St. Paul, having realized that sin and grace are only recognizable in terms of one another – like lightness and darkness, up and down, hot and cold, yin and yang, and all other complementary dualities – asked in a rhetorical panic, “Shall we continue in sin, that grace may abound?” Not… necessarily, although I can’t imagine a blessedly moral person having even the slightest inkling of his or her goodness without having had at least a taste of what it meant to be otherwise. And I’m not sure I agree that becoming enlightened requires that one first suffer some nasty fall. On the other hand, I definitely think it helps.

For my part, my dissociating event was Lyme disease. While I continue to suffer with the aftermath to this day, and probably will right up until I die, the acute and most insufferable part of the disease is behind me – but that part lasted for at least five months and only trailed off toward nine or ten. During that time I effectively had no personality. I would watch a movie or attend a class or, on days when I could muster the energy, do something “fun” like kayak or hike, and afterward someone would ask me my opinion. I never had an answer. I literally could not form any opinions. I would hear a story about some hero who rescued a child from drowning, and my reaction would be, “Huh – so, what you’re saying is, there are now a couple of people who get to live… a bit longer. I don’t see the relevance of that.” Food all tasted like food, no matter what it was. People all looked like people, neither attractive nor ugly nor funny nor smart nor stupid nor anything but bipedal animals that knew how to talk.

And then I woke up. Slowly, with great effort, and to this day my thinking is cloudy and my memory is riddled with holes like a sign on a backcountry road. But wake up I did, and continue to do so, and I am more like a baby than like a 32-year-old man. I keep finding myself looking back on my pre-Lyme life in utter astonishment. Was I really that clever? Was I really that self-righteous? Did I really believe all those things, and did I really not believe all those other things? Did I really think solitude and adventurism were of utmost value? Did I really think love was a myth? Was I really so cocky half the time? Was I really so insecure the rest of the time? Just exactly who in fuck was that guy?

So that, in a rambling nutshell, is my answer to the question, “Is it possible to rid oneself of one’s own ego?” (It was asked after I made some offhand comment about how ego is not a real thing but a concept, and can be shed like any other concept, albeit with tremendous difficulty – which comment had been met with assiduous skepticism.) It is possible, but it is not easy, and the most expedient ways are either dangerous or horrifying or both. Then again, speaking of the not-expedient and entirely harmless meditative method, German author Hermann Hesse said, “Though cured of an illusion, I found this disintegration of the personality by no means a pleasant and amusing experience.”

Preach auf, Kamerad!



Sunday, April 1, 2012

Resurrected Diatribe on the Social Role of Law, or "How to rob Peter to pay Paul to rob Peter some more."


Like most of my writings these days, this one comes as a lengthy response to a two-part question posed by a friend of mine: “Why do you hate cops? Also, aren’t you friends with some cops?” The simplest response would be “I don’t” and “I am,” but that doesn’t begin to approach the complexity of my real answer, which is that I don’t hate law-enforcement officers per se but I definitely hate laws, and not just the uniquely American laws that don’t make any logical sense (e.g., helmet laws, jaywalking, drinking and smoking ages, certain drug laws, …) but the very idea of law and of codified legal systems. Codified legal systems are an example of cultural evolution and, further, of how “evolution” does not mean the same thing as “progress” – and can, in fact, mean exactly the opposite. An old Roman proverb asks, Quis custodiet ipsos custodes (who will guard the guards)? In our own society the answer is as obvious as it is frustrating: nobody.

All human societies have sanctions against undesirable activities, like murder and rape and theft, going back to the earliest roots of humanity. Sometimes rooted in mysticism, sometimes in material economics, sometimes in inherent psychological abhorrence, these sanctions almost always make some sort of sense upon close scrutiny. Among the early Abrahamic tribes, for instance, it was taboo to eat swine, a mystical paradigm that most likely arose from plain old fear of death by trichinosis. In the pre-Hindu Vedic tradition it was forbidden to kill a cow, again a mystical paradigm that very likely boiled out of simple economic sense: a dead cow can provide meat for maybe a week, but a live cow can provide dairy for years (it’s tough for Americans to get their heads around that sort of logic, but remember we’re talking about a marginal agro-pastoral ecology that didn’t offer a whole lot of alternatives). The near-universality of taboos against incest is most likely an adaptation based on myriad generations of “oops”-looking children, although that doesn’t stop some societies from condoning it to this day. And of course sanctions against rape and murder are best explained by the mechanics of simple psychology: anytime something makes somebody scream in horror, you probably shouldn’t do it.

Theft… is a tricky one. In order for theft to even exist, there has to be private ownership. I would say that nobody likes having his or her stuff taken – it seems so obvious and self-evident – but in fact that may be a reflection of my modern Western upbringing. Ethnographic accounts of so-called primitive societies are rife with examples of people just taking things from each other, although the ideologies that often surround this “taking” appear to be based on a fundamental and important difference between their societies and ours. As one !Kung bushman explained to anthropologist Richard Lee (paraphrased), “We don’t exchange with things, we exchange with people.” In other words it’s the relationship between the individuals that matters most, not the relationship of people to their things.

Meanwhile the punishments for disobeying these sanctions or taboos also speak volumes about their respective societies. In some Native American groups, particularly in the Southwest, the practice of “clowning” – literally making fun of someone’s actions during a public ceremony – was (is?) adequate punishment for minor infractions like cheating on a spouse or being a jerk, while more serious infractions warranted banishment or possibly death. That was for the group to decide. In even simpler societies, like the hunter-gatherer groups of the Kalahari, group fluidity solved many of these problems: someone who brought social ruckus was either asked or forced to leave camp. And even the Biblical law of like for like, summarized in the popular phrase “an eye for an eye and a tooth for a tooth,” which also derives from the practices of early Abrahamic tribes, essentially suggests that individuals should mete out punishments that befit infractions. In these and countless other examples the burden of decision falls on the individual or the group, and is not explicitly codified.

Thus we come to the issue of codification, my favorite example of which comes from China. During the Shang Dynasty (1766-1027 BC, or roughly 3700-3000 BP) the Chinese created large, beautiful bronze cauldrons and used them for public cooking of sacrifices. Writings often adorned the sides of these cauldrons, and many of these writings referred to laws or judicial ceremonies consistent with local practices. Thus, whenever anyone came to offer his or her sacrifice the laws of the land were right there to see, literally written in bronze. The Confucians considered this an excellent practice, reasoning that explicitly-written laws will eliminate confusion (no pun intended) about exactly what is and is not permissible; but the Taoists thought it was ridiculous, reasoning that explicitly writing laws fails to account for human nature and the tendency of people to enjoy solving puzzles. When sanctions aren’t made explicit in the form of laws they’re simply aspects of society, taboos and no-no’s that you either do or don’t do depending on whether or not you feel like dealing with the social fallout. Codifying those sanctions converts them from tacit traditions into overt obstacles, and boy-oh-boy do people love trying to negotiate obstacles, or so the Taoists – who came along many centuries after the Shang – believed.

This is one of the earliest and most delightful examples I can think of where ancient scholars were already way ahead of modern psychologists. People really do enjoy solving puzzles; it’s an evolutionary trait that helped us survive and become what we are, and among the most fun puzzles you can present to someone is that of getting through a maze. Codifying laws is an abstract form of creating a maze, with solid walls representing forbidden or off-limits or not allowed, and the moment you do that people immediately start trying to figure out ways to get around, over, under, or even through those walls. This is known as “litigiousness,” a tendency to attempt to wangle around laws based on the language of their codification. It is the principal – if not sole – occupation of lawyers, the sum of whose careers was best exemplified when Bill Clinton explained that it “depends upon what the meaning of the word ‘is’ is.”

So: taking social sanctions and taboos out of the hands/minds/hearts of individuals or groups of living, breathing people and codifying them as a set of concrete abstractions engenders litigiousness. I can live with that. But it also engenders officiousness. The literal definition of officious is “volunteering one’s services where they are neither asked nor needed; meddlesome.” The common-usage definition is somewhat more concise: self-righteous incursion into others’ affairs. The only difference between this and simple bullying is that bullies don’t believe they’re doing something moral, virtuous, or socially justified; they just think it’s fun. An officious person is one who gets in your way, pushes you down, lectures you and embarrasses you and frightens you, and sometimes even physically restrains and/or beats you – in other words, bullies you – all with the motivating benefit of rationale. That rationale is law.  

Finally, there is the issue of material consideration. Recall the !Kung example and what “taking” means in a society that places principal emphasis on people rather than things. This is an important distinction to bear in mind when considering what follows: because codifying laws and punishments begets litigiousness, and because litigiousness begets lots and lots of people trying to wangle around those laws, an entire societal subcomplex must be erected to enforce those laws and punishments. This creates its own problems, chief among which is that of allocation: you can’t get something from nothing, not in this universe, so who pays for all of that? Why, the people being policed and disciplined, of course – who did you think? This is easily justified by the tacit (and sometimes overt) explanation that it’s for our own good, because the taxes and fines that we pay to support the penal system are visited back upon us in the form of protection and a sense of safety in our cities and towns. Right? Don’t you feel safe in modern American cities and towns? Don’t you feel like the amount of money put toward law enforcement – not just salaries, but cars, guns, office buildings, courthouses, and of course prisons, among rather a lot else – is offset by the warm and fuzzy feeling of total security you have, just knowing that nobody will ever hurt or wrong you in America? For those who have been wronged by criminals: didn’t they catch the crook and punish him or her to a degree that you found personally satisfying? Doesn’t that happen like all the time? And for those who have been punished for doing something that is, for either stupid reasons or no reason at all, considered unlawful: don’t you feel like you deserved it because you’re a bad person? Don’t you feel like it’s okay for your tax dollars to finance a mechanism that reaches back around into your wallet again when you park in the wrong place or cross the street outside of a pair of painted lines? Doesn’t that seem totally sensible and not at all like some sort of racket?

If the answer to any of those questions was “no,” then you get my point. Our legal system, like any other codified legal system, exists for the ostensible purpose of maintaining social order but operates primarily as a way to a) protect the interests of our investments rather than our selves – this is why mall cops and retail outlet security guards deserve all the hassle that meth heads and teenagers give them; they don’t even entertain the illusion of serving and protecting people, they serve and protect things and are apparently totally okay with that – and b) generate its own capital. So that it can keep growing. So that it can enforce more laws, hire more policemen and buy more guns and cars and prisons and high-tech gadgetry. So that it can arrest more people and levy more fines. So that it can keep growing…

If this sounds suspiciously like what a cancer cell does, that’s because it is.

So that’s why I hate the legal system, and not just our legal system but all of them. We as a species got by just fine for over a quarter of a million years without scrolls and books and cauldrons telling us what to do, what not to do, and how we can expect to be punished if we disobey. Non-penal social sanctions operated just fine during this entire era, and in fact I – and many sociologists – think those sorts of sanctions are tremendously more effective than what we’ve got now. If you disobey the law against smoking marijuana you’ll lose some money, either directly to the penal system via fines or indirectly via paying for a good lawyer; but if you disobey the social taboo against being an asshole you will be ridiculed, ostracized, and possibly beaten up by your friends and neighbors. Which form of punishment has a deeper, more lasting, and more meaningful impact? And the threat of which one is more likely to deter an infraction? Our society has raised us to think it’s the former, because money is of such inordinate importance and nobody wants to have his or her money taken away (this applies to “time” and “life” as well, both of which are also inordinately important and useful tools for take-away threats, but I’ll leave that alone for now). But let’s be honest: what’s really more important? Your monetary currency or your social currency? We answer that question all the time when we use our money to buy expensive showoff items; after all, if money really were of apex importance we wouldn’t so readily transpose it into fancy baubles. So fines for breaking stupid laws are, at best, a toll that you pay to do things that society doesn’t want you to do, and you only have to pay that toll when you get caught. If it weren’t for the extremely obvious fact that the whole thing is a self-serving racket, I would criticize it for being a lousy way to ensure public safety. Which is true – but the truer statement is the one about it being a racket.

On a final and interesting historical note, Thomas Aquinas, aka Saint Thomas Aquinas, argued in his writings that the Ten Commandments, those most sacred examples of codified stricture in the history of Christendom, were not passed into the hands of humankind as a set of rules to be followed. They were passed into the hands of humankind precisely to demonstrate that they couldn’t be followed. Thou-shalting us not to murder, steal, worship another God or commit adultery are reasonable enough requests, but *requiring* people to love God and their mothers and fathers (“love” being a thing that’s pretty difficult to achieve under duress) is a tall order, as is not doing any work on the Sabbath. What if the roof is leaking…? And forbidding covetousness is just plain ridiculous; that’s like forbidding people from salivating when they smell pizza. Granted, I’m pretty sure every armchair philosopher who has ever really considered the Ten Commandments came to his or her own conclusion about this stuff, tucked it away and then trotted it out during a “this is why religion is stupid” rant, but again: Saint Thomas Aquinas beat you to the punch. A set of impossible rules means that everyone is a sinner. If everyone is a sinner, then everyone goes through life feeling penitent and humble. If everyone feels penitent and humble, then everyone will constantly be trying to repent and make a better life for him/herself and others through acts of faith and charity. In sum, according to St. Thomas, the most canonical and easily-recognizable set of codified rules in the Western world was specifically designed so that everyone who knew about them was always guilty and would therefore always be trying to do better. Our own laws are very similar, in that pretty much everyone is guilty of something-or-other pretty much all the time, but the state of constant guilt they inspire is not designed to compel us to be better people. It is designed to fuck us over.

As for why I hate cops: again, I don’t. Most of the law-enforcement personnel that I’ve known had a genuine desire to protect and serve the interests of their fellow citizens. Many of them aren’t cursed by deep thinking the way that I am, and none of them are anthropologists, yet even then I’ve known plenty of cops who had a sense of equity (that is: they know when the “letter of the law” doesn’t apply and are willing to employ independent thoughts rather than just follow orders). The legal system engenders officiousness and state-mandated theft of property via ridiculous punishments for infractions against ridiculous laws, but not intractably, and at the end of the day cops and judges are still people and I’ve known and liked quite a lot of them. If anything I feel bad for them, the way that I feel bad for Wal-Mart employees. I would like to shake them by the shoulders and scream, “Don’t you realize you’re doing awful work for an awful system that cares more about things than people? Don’t you realize your paychecks are consignment dregs from a vat of misanthropic terribleness?!” But of course I would never be so rude. Not in this economy.


First Post Script: this… this whatever-this-is is titled “Resurrected” because I began writing it over a decade ago and only recently decided to finish it – by which I mean I recently stumbled across it and wondered, as I typed it out, whether my opinions had changed since then. I think the setup to this piece was when I was at a party that got busted by some cops and I said to a friend, “fucking pigs…” So: a belated “sorry” is due to all of my friends in law enforcement. I don’t hate you, I just hate how the legal system has become as much of an insidious racket as the modern medical system and religious and educational institutions. It’s okay; really, it is. Everybody needs a job.

Second Post Script: I am indebted to G. K. Chesterton, Richard B. Lee, Alan Watts, Marvin Harris and Lao Tzu for material relevant to this… for this… for being too dead – except Lee – to prevent me from indiscriminately squishing their ideas together despite (and I’m serious about this) all of their theories being at diametric odds. Research!