February 22, 2007

Recent Reading: The Return - II

Roger Gathman at Limited Inc. recommended Michael Pollan's article in the New York Times, Unhappy Meals, a fascinating history of the rise and effect of nutritionism.
No single event marked the shift from eating food to eating nutrients, though in retrospect a little-noticed political dust-up in Washington in 1977 seems to have helped propel American food culture down this dimly lighted path. Responding to an alarming increase in chronic diseases linked to diet — including heart disease, cancer and diabetes — a Senate Select Committee on Nutrition, headed by George McGovern, held hearings on the problem and prepared what by all rights should have been an uncontroversial document called “Dietary Goals for the United States.” The committee learned that while rates of coronary heart disease had soared in America since World War II, other cultures that consumed traditional diets based largely on plants had strikingly low rates of chronic disease. Epidemiologists also had observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease temporarily plummeted.

Naïvely putting two and two together, the committee drafted a straightforward set of dietary guidelines calling on Americans to cut down on red meat and dairy products. Within weeks a firestorm, emanating from the red-meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about food — the committee had advised Americans to actually “reduce consumption of meat” — was replaced by artful compromise: “Choose meats, poultry and fish that will reduce saturated-fat intake.”...

Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, and would instead arrive clothed in scientific euphemism and speaking of nutrients, entities that few Americans really understood but that lack powerful lobbies in Washington. This was precisely the tack taken by the National Academy of Sciences when it issued its landmark report on diet and cancer in 1982. Organized nutrient by nutrient in a way guaranteed to offend no food group, it codified the official new dietary language. Industry and media followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids and carotenes soon colonized much of the cultural space previously occupied by the tangible substance formerly known as food. The Age of Nutritionism had arrived.

...

The first thing to understand about nutritionism ... is that it is not quite the same as nutrition. As the “ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s exerting its hold on your culture...

In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. From this basic premise flow several others. Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists speak) to explain the hidden reality of foods to us. To enter a world in which you dine on unseen nutrients, you need lots of expert help.

...

Another potentially serious weakness of nutritionist ideology is that it has trouble discerning qualitative distinctions between foods. So fish, beef and chicken through the nutritionists’ lens become mere delivery systems for varying quantities of fats and proteins and whatever other nutrients are on their scope. Similarly, any qualitative distinctions between processed foods and whole foods disappear when your focus is on quantifying the nutrients they contain (or, more precisely, the known nutrients).

This is a great boon for manufacturers of processed food, and it helps explain why they have been so happy to get with the nutritionism program. In the years following McGovern’s capitulation and the 1982 National Academy report, the food industry set about re-engineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and less of the bad, and by the late ’80s a golden era of food science was upon us. The Year of Eating Oat Bran — also known as 1988 — served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern had been established, and every few years since then a new oat bran has taken its turn under the marketing lights. (Here comes omega-3!)

By comparison, the typical real food has more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t easily change its nutritional stripes (though rest assured the genetic engineers are hard at work on the problem). So far, at least, you can’t put oat bran in a banana. So depending on the reigning nutritional orthodoxy, the avocado might be either a high-fat food to be avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate of each whole food rises and falls with every change in the nutritional weather, while the processed foods are simply reformulated. That’s why when the Atkins mania hit the food industry, bread and pasta were given a quick redesign (dialing back the carbs; boosting the protein), while the poor unreconstructed potatoes and carrots were left out in the cold.

Of course it’s also a lot easier to slap a health claim on a box of sugary cereal than on a potato or carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over, the Cocoa Puffs and Lucky Charms are screaming about their newfound whole-grain goodness.
Read it all.

Meanwhile, discovering that Mr Gathman had been a contributor at the Austin Chronicle, I dug up some of his old articles. In a review of a number of medical texts, I discovered this interesting tale of serendipity. The tale itself is well known; just how serendipitous the discovery was I had not known.
Here's how Alexander Fleming discovered penicillin: He found some contaminating mold growing on a petri dish of staphylococci he had left standing by a window. The mold had sprouted there, on its own, and was destroying the bacteria. But this account understates the element of accident. Other scientists could never reproduce the discovery as Fleming recounted it because, as a matter of fact, penicillium won't usually grow that way. Someone finally discovered that the temperature of London at the end of July, 1928, when Fleming's discovery was made, had been exceptionally cool. This allowed a spore, floating up from the laboratory on the floor below, which was investigating fungi, to grow. Then, the temperature returned to normal, and that allowed the staphylococci to grow. Penicillin, in other words, was as improbable as the "chance meeting of an umbrella and a sewing machine on a dissecting table," to quote Lautréamont's line which, in June of 1928, surrealists in Paris were proclaiming as an aesthetic principle. Probably Fleming never heard of André Breton, but they were brothers under the skin. As Le Fanu succinctly puts it, "The therapeutic revolution of the post-war years was not ignited by a major scientific insight, rather the reverse: it was the realization by doctors and scientists that it was not necessary to understand in any detail what was wrong, but that synthetic chemistry blindly and randomly would deliver the remedies."
Speaking of book reviews, while searching for reviews of Hampton Sides' Blood and Thunder I was taken with this summary of the argument of Jared Diamond's Collapse.
A thousand years ago, a group of Vikings led by Erik the Red set sail from Norway for the vast Arctic landmass west of Scandinavia which came to be known as Greenland. It was largely uninhabitable—a forbidding expanse of snow and ice. But along the southwestern coast there were two deep fjords protected from the harsh winds and saltwater spray of the North Atlantic Ocean, and as the Norse sailed upriver they saw grassy slopes flowering with buttercups, dandelions, and bluebells, and thick forests of willow and birch and alder. Two colonies were formed, three hundred miles apart, known as the Eastern and Western Settlements. The Norse raised sheep, goats, and cattle. They turned the grassy slopes into pastureland. They hunted seal and caribou. They built a string of parish churches and a magnificent cathedral, the remains of which are still standing. They traded actively with mainland Europe, and tithed regularly to the Roman Catholic Church. The Norse colonies in Greenland were law-abiding, economically viable, fully integrated communities, numbering at their peak five thousand people. They lasted for four hundred and fifty years—and then they vanished.

...

There was nothing wrong with the social organization of the Greenland settlements. The Norse built a functioning reproduction of the predominant northern-European civic model of the time—devout, structured, and reasonably orderly. In 1408, right before the end, records from the Eastern Settlement dutifully report that Thorstein Olafsson married Sigrid Bjornsdotter in Hvalsey Church on September 14th of that year, with Brand Halldorstson, Thord Jorundarson, Thorbjorn Bardarson, and Jon Jonsson as witnesses, following the proclamation of the wedding banns on three consecutive Sundays.

The problem with the settlements, Diamond argues, was that the Norse thought that Greenland really was green; they treated it as if it were the verdant farmland of southern Norway. They cleared the land to create meadows for their cows, and to grow hay to feed their livestock through the long winter. They chopped down the forests for fuel, and for the construction of wooden objects. To make houses warm enough for the winter, they built their homes out of six-foot-thick slabs of turf, which meant that a typical home consumed about ten acres of grassland.

But Greenland’s ecosystem was too fragile to withstand that kind of pressure. The short, cool growing season meant that plants developed slowly, which in turn meant that topsoil layers were shallow and lacking in soil constituents, like organic humus and clay, that hold moisture and keep soil resilient in the face of strong winds. “The sequence of soil erosion in Greenland begins with cutting or burning the cover of trees and shrubs, which are more effective at holding soil than is grass,” he writes. “With the trees and shrubs gone, livestock, especially sheep and goats, graze down the grass, which regenerates only slowly in Greenland’s climate. Once the grass cover is broken and the soil is exposed, soil is carried away especially by the strong winds, and also by pounding from occasionally heavy rains, to the point where the topsoil can be removed for a distance of miles from an entire valley.” Without adequate pastureland, the summer hay yields shrank; without adequate supplies of hay, keeping livestock through the long winter got harder. And, without adequate supplies of wood, getting fuel for the winter became increasingly difficult.

The Norse needed to reduce their reliance on livestock—particularly cows, which consumed an enormous amount of agricultural resources. But cows were a sign of high status; to northern Europeans, beef was a prized food. They needed to copy the Inuit practice of burning seal blubber for heat and light in the winter, and to learn from the Inuit the difficult art of hunting ringed seals, which were the most reliably plentiful source of food available in the winter. But the Norse had contempt for the Inuit—they called them skraelings, “wretches”—and preferred to practice their own brand of European agriculture. In the summer, when the Norse should have been sending ships on lumber-gathering missions to Labrador, in order to relieve the pressure on their own forestlands, they instead sent boats and men to the coast to hunt for walrus. Walrus tusks, after all, had great trade value. In return for those tusks, the Norse were able to acquire, among other things, church bells, stained-glass windows, bronze candlesticks, Communion wine, linen, silk, silver, churchmen’s robes, and jewelry to adorn their massive cathedral at Gardar, with its three-ton sandstone building blocks and eighty-foot bell tower. In the end, the Norse starved to death.

...

[T]he disappearance of the Norse settlements is usually blamed on the Little Ice Age, which descended on Greenland in the early fourteen-hundreds, ending several centuries of relative warmth. (One archeologist refers to this as the “It got too cold, and they died” argument.) What all these explanations have in common is the idea that civilizations are destroyed by forces outside their control, by acts of God.

...

It did get colder in Greenland in the early fourteen-hundreds. But it didn’t get so cold that the island became uninhabitable. The Inuit survived long after the Norse died out, and the Norse had all kinds of advantages, including a more diverse food supply, iron tools, and ready access to Europe. The problem was that the Norse simply couldn’t adapt to the country’s changing environmental conditions. Diamond writes, for instance, of the fact that nobody can find fish remains in Norse archeological sites. One scientist sifted through tons of debris from the Vatnahverfi farm and found only three fish bones; another researcher analyzed thirty-five thousand bones from the garbage of another Norse farm and found two fish bones. How can this be? Greenland is a fisherman’s dream: Diamond describes running into a Danish tourist in Greenland who had just caught two Arctic char in a shallow pool with her bare hands. “Every archaeologist who comes to excavate in Greenland . . . starts out with his or her own idea about where all those missing fish bones might be hiding,” he writes. “Could the Norse have strictly confined their munching on fish to within a few feet of the shoreline, at sites now underwater because of land subsidence? Could they have faithfully saved all their fish bones for fertilizer, fuel, or feeding to cows?” It seems unlikely. There are no fish bones in Norse archeological remains, Diamond concludes, for the simple reason that the Norse didn’t eat fish. For one reason or another, they had a cultural taboo against it.

Given the difficulty that the Norse had in putting food on the table, this was insane. Eating fish would have substantially reduced the ecological demands of the Norse settlements. The Norse would have needed fewer livestock and less pastureland. Fishing is not nearly as labor-intensive as raising cattle or hunting caribou, so eating fish would have freed time and energy for other activities. It would have diversified their diet.

Why did the Norse choose not to eat fish? Because they weren’t thinking about their biological survival. They were thinking about their cultural survival. Food taboos are one of the idiosyncrasies that define a community. Not eating fish served the same function as building lavish churches, and doggedly replicating the untenable agricultural practices of their land of origin. It was part of what it meant to be Norse, and if you are going to establish a community in a harsh and forbidding environment all those little idiosyncrasies which define and cement a culture are of paramount importance. “The Norse were undone by the same social glue that had enabled them to master Greenland’s difficulties,” Diamond writes. “The values to which people cling most stubbornly under inappropriate conditions are those values that were previously the source of their greatest triumphs over adversity.” He goes on:
To us in our secular modern society, the predicament in which the Greenlanders found themselves is difficult to fathom. To them, however, concerned with their social survival as much as their biological survival, it was out of the question to invest less in churches, to imitate or intermarry with the Inuit, and thereby to face an eternity in Hell just in order to survive another winter on Earth.
Diamond’s distinction between social and biological survival is a critical one, because too often we blur the two, or assume that biological survival is contingent on the strength of our civilizational values. That was the lesson taken from the two world wars and the nuclear age that followed: we would survive as a species only if we learned to get along and resolve our disputes peacefully. The fact is, though, that we can be law-abiding and peace-loving and tolerant and inventive and committed to freedom and true to our own values and still behave in ways that are biologically suicidal. The two kinds of survival are separate.

...

Rivers and streams and forests and soil are a biological resource. They are a tangible, finite thing, and societies collapse when they get so consumed with addressing the fine points of their history and culture and deeply held beliefs—with making sure that Thorstein Olafsson and Sigrid Bjornsdotter are married before the right number of witnesses following the announcement of wedding banns on the right number of Sundays—that they forget that the pastureland is shrinking and the forest cover is gone.

When archeologists looked through the ruins of the Western Settlement, they found plenty of the big wooden objects that were so valuable in Greenland—crucifixes, bowls, furniture, doors, roof timbers—which meant that the end came too quickly for anyone to do any scavenging. And, when the archeologists looked at the animal bones left in the debris, they found the bones of newborn calves, meaning that the Norse, in that final winter, had given up on the future. They found toe bones from cows, equal to the number of cow spaces in the barn, meaning that the Norse ate their cattle down to the hoofs, and they found the bones of dogs covered with knife marks, meaning that, in the end, they had to eat their pets. But not fish bones, of course. Right up until they starved to death, the Norse never lost sight of what they stood for.
You could, of course, just read the whole review, or even the book, which currently sits amongst my other volumes, mocking me with its two-inch-wide spine.

While on the environment, here's a great article from Jeffrey St Clair on the destructive impact of grazing on public lands. Being the kind of chap I am, I'll quote his snarky takedown of the Western myth that attaches to ranchers, rather than the more informative stuff.
Let's face it, public lands ranching is a peculiarly American form of social welfare. To qualify, you just need to get yourself a base ranch adjacent to some of the most scenic landscapes in North America. Many ranchers came by their properties through primogeniture, a quaint medieval custom still common throughout much of the rural West. If you're a second son, a daughter or a latecomer, don't despair: you can simply buy a ranch with a grazing permit, like billionaire potato king JR Simplot did. It's easy. No background checks or means-testing required. A million bucks should get you started.

Rest assured, your public subsidies are guaranteed—by senate filibuster if necessary. Ranching subsidies are an untouchable form of welfare, right up there with the depletion allowance and the F-22 fighter. And guess what? It's guilt free. Nobody complains that your subsidies came at the expense of Head Start or aid to mothers with dependent children. Or that you take your profits from the subsidies and spend it on a case of Jack Daniels, a chunk of crystal meth, a night at the Mustang Ranch, a week at the roulette tables in Reno—or donate it all to Operation Rescue or the Sahara Club. (The most outlandish fables about profligate welfare queens have nothing on the average slob rancher.) Hell, do it right and somebody might even promote you as a cultural hero, a damn fine role model for future generations.

The Western ranching fraternity is more homogenous than the Royal Order of the Moose: landed, white, male and politically conservative to a reactionary and vicious degree. It is not culturally, ethnically or racially representative of America and never has been. Shouldn't there be equal access to this federal grazing largesse, these hundreds of millions in annual subsidies to well-off white geezers? A creative litigator might be able to demonstrate that federal grazing policies violate the Civil Rights Act.

Strip away the Stetsons, bolo ties, and rattlesnake-skin boots, and why should we view Western ranchers differently than the tobacco farmers of the Southeast? And what about their political benefactors? Is there a fundamental difference between, say, Max Baucus and Saxby Chambliss? Robert Bennett and Trent Lott? (Okay, Lott and Baucus have fewer bad hair days.) All annually raid the federal treasury to sustain industries that degrade the environment, ravage the public health and enervate the economy. Federal tobacco subsidies amount to about $100 million a year; grazing subsidies may exceed a billion a year. No wonder there's so little congressional support for a single-payer health care system: passage of such a plan might actually force the Cow and Cigarette Caucus to choose between the health of the citizenry and their allegiance to political porkbarrel.

Some argue that the rancher is rooted to place, that he loves the land, holds a unique reverence for its contours, beauty, rhythms. The rancher, they say, is one of Wallace Stegner's "stickers," not an itinerant booster or commercial migrant, like the cut-and-run logger. He doesn't leave behind radioactive tailings piles or slopes of stumps in his wake, but gently tends and grooms the landscape, improving Nature's defects, year after year, generation after generation.

But take away the subsidies, the nearly free forage, the roads, the even cheaper water that magically appears from nowhere in the middle of the high desert, the tax breaks, predator control, abeyances from environmental standards and disproportionate political clout when anything else goes against him, such as drought, range fires, bad investments. Then charge them for the gruesome externalities of their "avocation" and see how many stick around for the hardscrabble lifestyle that remains. Federal subsidies and political protection are the velcro for most of these guys, not the view of the Wind River Range.
Also enjoyable is St Clair's angry piece The Withering of the American Environmental Movement.

If you're looking for angry, be sure to check out Roger Morris' mini-biography of Donald Rumsfeld, in two parts at TomDispatch. (Actually, it's pretty much everywhere.)
[I]n the 1960s, Rumsfeld's ardor for a high-tech military was only stirring, a minor dalliance compared to his preoccupation with advancement. While few seemed to notice, the brash freshman made an extraordinary rush at the lumbering House. In 1964, before the end of his first term, he captained a revolt against GOP Leader Charles Halleck, a Dwight D. Eisenhower loyalist prone to bipartisanship and skepticism of both Pentagon budgets and foreign intervention. By only six votes in the Republican Caucus, Rumsfeld managed to replace the folksy Indianan with Michigan's Gerald Ford.

In the inner politics of the House, the likeable, agreeable, unoriginal Ford was always more right-wing than his benign post-Nixon, and now posthumous, presidential image would have it. Richard Nixon called Ford "a wink and a nod guy," whose artlessness and integrity left him no real match for the steelier, more cunning figures around him. To push Ford was one of those darting Capitol Hill insider moves that seemed, at the time, to win Rumsfeld only limited, parochial prizes -- choice committee seats, a rung on the leadership ladder, useful allies.

Taken with Rumsfeld's burly style that year was Kansas Congressman Robert Ellsworth, a wheat-field small-town lawyer of decidedly modest gifts but outsized ambitions and close connections to Nixon. "Just another Young Turk thing," one of their House cohorts casually called the toppling of Halleck.

It seems hard now to exaggerate the endless sequels to this small but decisive act. The lifting of the honest but mediocre Ford higher into line for appointment as vice president amid the ruin of President Richard Nixon and his Vice President, Spiro Agnew; Ford's lackluster, if relatively harmless, interval in the Oval Office and later as Party leader with the abject passing of the GOP to Ronald Reagan in 1980; Ellsworth's boosting of Rumsfeld into prominent but scandal-immune posts under Nixon; and then, during Ford's presidency, Rumsfeld's reward, his elevation to White House Chief of Staff, and with him the rise of one of his aides from the Nixon era, a previously unnoticed young Wyoming reactionary named Dick Cheney; next, in 1975-1976, the first Rumsfeld tenure at a Vietnam-disgraced but impenitent Pentagon that would shape his fateful second term after 2001; and eventually, of course, the Rumsfeld-Cheney monopoly of power in a George W. Bush White House followed by their catastrophic policies after 9/11 -- all derived from making decent, diffident Gerry Ford Minority Leader that forgotten winter of 1964.
Ah, history.
Barely a year after moving next to the Oval Office (and contrary to Ford's innocent, prideful recollection decades later that it was his own idea), Don and Dick characteristically engineered their "Halloween Massacre." Subtly exploiting Ford's unease (and Kissinger's jealous rivalry) with cerebral, acerbic Defense Secretary James Schlesinger, they managed to pass the Pentagon baton to Rumsfeld at only 43, and slot Cheney, suddenly a wunderkind at 34, in as presidential Chief of Staff.

In the process, they even maneuvered Ford into humbling Kissinger by stripping him of his long-held dual role as National Security Advisor as well as Secretary of State, giving a diffident Brent Scowcroft the National Security Council job and further enhancing both Cheney's inherited power at the White House and Rumsfeld's as Kissinger's chief cabinet rival. A master schemer himself, Super K, as an adoring media called him, would be so stunned by the Rumsfeld-Cheney coup that he would call an after-hours séance of cronies at a safe house in Chevy Chase to plot a petulant resignation as Secretary of State, only to relent, overcome as usual by the majesty of his own gifts.
Also at TomDispatch I found the essay that Mike Davis has expanded into a book on the history of the car-bomb. (Part two here.)
The members of the Stern Gang were ardent students of violence, self-declared Jewish admirers of Mussolini who steeped themselves in the terrorist traditions of the pre-1917 Russian Socialist-Revolutionary Party, the Macedonian IMRO, and the Italian Blackshirts. As the most extreme wing of the Zionist movement in Palestine -- "fascists" to the Haganah and "terrorists" to the British -- they were morally and tactically unfettered by considerations of diplomacy or world opinion. They had a fierce and well-deserved reputation for the originality of their operations and the unexpectedness of their attacks. On January 12, 1947, as part of their campaign to prevent any compromise between mainstream Zionism and the British Labor government, they exploded a powerful truck bomb in the central police station in Haifa, resulting in 144 casualties. Three months later, they repeated the tactic in Tel Aviv, blowing up the Sarona police barracks (5 dead) with a stolen postal truck filled with dynamite.

In December 1947, following the UN vote to partition Palestine, full-scale fighting broke out between Jewish and Arab communities from Haifa to Gaza. The Stern Gang, which rejected anything less than the restoration of a biblical Israel, now gave the truck bomb its debut as a weapon of mass terror. On January 4, 1948, two men in Arab dress drove a truck ostensibly loaded with oranges into the center of Jaffa and parked it next to the New Seray Building, which housed the Palestinian municipal government as well as a soup-kitchen for poor children. They coolly lingered for coffee at a nearby café before leaving a few minutes ahead of the detonation.

"A thunderous explosion," writes Adam LeBor in his history of Jaffa, "then shook the city. Broken glass and shattered masonry blew out across Clock Tower Square. The New Seray's centre and side walls collapsed in a pile of rubble and twisted beams. Only the neo-classical façade survived. After a moment of silence, the screams began, 26 were killed, hundreds injured. Most were civilians, including many children eating at the charity kitchen." The bomb missed the local Palestinian leadership who had moved to another building, but the atrocity was highly successful in terrifying residents and setting the stage for their eventual flight.

It also provoked the Palestinians to cruel repayment in kind. The Arab High Committee had its own secret weapon -- blond-haired British deserters, fighting on the side of the Palestinians. Nine days after the Jaffa bombing, some of these deserters, led by Eddie Brown, a former police corporal whose brother had been murdered by the Irgun, commandeered a postal delivery truck which they packed with explosives and detonated in the center of Haifa's Jewish quarter, injuring 50 people. Two weeks later, Brown, driving a stolen car and followed by a five-ton truck driven by a Palestinian in a police uniform, successfully passed through British and Haganah checkpoints and entered Jerusalem's New City. The driver parked in front of the Palestine Post, lit the fuse, and then escaped with Brown in his car. The newspaper headquarters was devastated with 1 dead and 20 wounded.

According to a chronicler of the episode, Abdel Kader el-Husseini, the military leader of the Arab Higher Committee, was so impressed by the success of these operations -- inadvertently inspired by the Stern Gang -- that he authorized an ambitious sequel employing six British deserters. "This time three trucks were used, escorted by a stolen British armored car with a young blond man in police uniform standing in the turret." Again, the convoy easily passed through checkpoints and drove to the Atlantic Hotel on Ben Yehuda Street. A curious night watchman was murdered when he confronted the gang, who then drove off in the armored car after setting charges in the three trucks. The explosion was huge and the toll accordingly grim: 46 dead and 130 wounded.
And if that leaves you insufficiently disturbed, try this rather old (but new to me) article by Craig Unger on Tim LaHaye and the premillennial dispensationalists (also soon to be a book).
For miles around in all directions the fertile Jezreel Valley, known as the breadbasket of Israel, is spread out before us, an endless vista of lush vineyards and orchards growing grapes, oranges, kumquats, peaches, and pears. It is difficult to imagine a more beautiful pastoral panorama.

The sight LaHaye's followers hope to see here in the near future, however, is anything but bucolic. Their vision is fueled by the book of Revelation, the dark and foreboding messianic prophecy that foresees a gruesome and bloody confrontation between Christ and the armies of the Antichrist at Armageddon.

...

As we walk down from the top of the hill of Megiddo, one of them looks out over the Jezreel Valley. "Can you imagine this entire valley filled with blood?" he asks. "That would be a 200-mile-long river of blood, four and a half feet deep. We've done the math. That's the blood of as many as two and a half billion people."

When this will happen is another question, and the Bible says that "of that day and hour knoweth no man." Nevertheless, LaHaye's disciples are certain these events—the End of Days—are imminent. In fact, one of them has especially strong ideas about when they will take place. "Not soon enough," she says. "Not soon enough."
Mr Unger recently produced a report on the people shilling for a war with Iran, From the Wonderful Folks Who Brought You Iraq. Any thematic segue between that piece and the earlier one on Rapturism is hopefully entirely in your own mind.

February 21, 2007

Recent Reading: The Return - I

As February is almost over, it's about time for the January round-up of highlights from internet commentary I've been reading instead of posting stuff here. In no particular order, of course, linked by irrelevant segues, and including a bunch of stuff from this month just to undercut the premise.

There's a lot so I think I might stagger it over a few posts, just to be super-lazy.

First up there's The General in his Labyrinth, Tariq Ali's review of In the Line of Fire, Pervez Musharraf's memoirs, and a potted history of Pakistan.
In 1977, when Zia came to power, 90 per cent of men and 98 per cent of women in Afghanistan were illiterate; 5 per cent of landowners held 45 per cent of the cultivable land and the country had the lowest per capita income of any in Asia. The same year, the Parcham Communists, who had backed the 1973 military coup by Prince Daud after which a republic was proclaimed, withdrew their support from Daud, were reunited with other Communist groups to form the People’s Democratic Party of Afghanistan (PDPA), and began to agitate for a new government. The regimes in neighbouring countries became involved. The shah of Iran, acting as a conduit for Washington, recommended firm action – large-scale arrests, executions, torture – and put units from his torture agency at Daud’s disposal. The shah also told Daud that if he recognised the Durand Line as a permanent frontier the shah would give Afghanistan $3 billion and Pakistan would cease hostile actions. Meanwhile, Pakistani intelligence agencies were arming Afghan exiles while encouraging old-style tribal uprisings aimed at restoring the monarchy. Daud was inclined to accept the shah’s offer, but the Communists organised a pre-emptive coup and took power in April 1978. There was panic in Washington, which increased tenfold as it became clear that the shah too was about to be deposed. General Zia’s dictatorship thus became the lynchpin of US strategy in the region, which is why Washington green-lighted Bhutto’s execution and turned a blind eye to the country’s nuclear programme. The US wanted a stable Pakistan whatever the cost.

...

From 1979 until 1988, Afghanistan was the focal point of the Cold War. Millions of refugees crossed the Durand Line and settled in camps and cities in the NWFP. Weapons and money, as well as jihadis from Saudi Arabia, Algeria and Egypt, flooded into Pakistan. All the main Western intelligence agencies (including the Israelis’) had offices in Peshawar, near the frontier. The black-market and market rates for the dollar were exactly the same. Weapons, including Stinger missiles, were sold to the mujahedin by Pakistani officers who wanted to get rich quickly. The heroin trade flourished and the number of registered addicts in Pakistan grew from a few hundred in 1977 to a few million in 1987. (One of the banks through which the heroin mafia laundered money was the BCCI – whose main PR abroad was a retired civil servant called Altaf Gauhar.)

As for Pakistan and its people, they languished. During Zia’s period in power, the Jamaat-e-Islami, which had never won more than 5 per cent of the vote anywhere in the country, was patronised by the government; its cadres were sent to fight in Afghanistan, its armed student wing was encouraged to terrorise campuses in the name of Islam, its ideologues were ever present on TV. The Inter-Services Intelligence also encouraged the formation of other, more extreme jihadi groups, which carried out acts of terror at home and abroad and set up madrassahs all over the frontier provinces. Soon Zia, too, needed his own political party and the bureaucracy set one up: the Pakistan Muslim League.
It's often forgotten that Brzezinski's bright idea to give Russia their own Vietnam not only made a mess of Afghanistan, it also succoured the Islamic Religious Right (as "Islamism" / "radical Islam" / "militant Islam" would be more accurately termed) in Pakistan.

Here's a cheery topic: Eliminationism in America, a 10 part series from David Neiwert.

From Counterpunch, The Profits of Escalation: Why the US is Not Leaving Iraq by Ismael Hossein-Zadeh.
This is key to an understanding of why the US ruling elite is reluctant to pull US troops out of Iraq. The reluctance or "difficulty" of leaving Iraq stems not so much from pulling 140,000 troops out of that country as it is from pulling out more than 100,000 contractors. As Josh Mitteldorf of the University of Arizona recently put it, "There are a lot of contractors making a fortune and we don't want that money tap turned off, even though it is borrowed money, which our children and grandchildren will have to repay."
Here's a nice rant from Bernard Chazelle, "Bush, the Empire Slayer".
Cravenness is bigotry's favorite nourishment, and cynics might expect the political class to gorge on it by blaming our imperial agony on the natives. In America, today, cynics rarely go wrong; and the air, indeed, is thick with talk of fainthearted hordes of Mesopotamian ingrates, who quail at the latest bombing and wail at the moon in exotic garb.

Not long ago, the achingly earnest Nicholas D. Kristof, a New York Times columnist whose only sin is to be more virtuous than you—and keep you informed of this in each and every one of his bromidic columns—reassured his readers that the trouble is not with the Muslims but with the Arabs. They are too violent and they give Islam a bad name. Well, that settles that. Funny, though, that in the last twenty years Americans have outkilled Arabs in a ratio in excess of one hundred to one. But there I go again, nitpicking, while Saint Kristof is back in Cambodia, rescuing teenage prostitutes one Pulitzer prize at a time.
In The Nation, George Scialabba laid out a program for the new Democratic Majority, a lot of which makes sense.
Any nonrich, nonreligious person who has paid attention to politics since 1994, when the Goldwater/Gingrich Republicans took over Congress, and above all these past six years, has probably exhausted his or her capacity for indignation. The greed, the mendacity, the indifference, even hostility, to such notions as the common good or the public interest--the whole sorry record, reviewed in sickening detail by David Sirota and Mark Green, whose powerful books very much warrant their enraged titles and subtitles--have left many of us gasping.

We now have a bit of breathing space, thanks to the midterms. It's time to consider how the right got away with it and how to prevent it from happening again. The most useful of these books ... is Steven Hill's 10 Steps to Repair American Democracy. "To ponder the shortcomings of our political system is to court despondency," Hendrik Hertzberg observes in his foreword. The Electoral College, the Senate, the disenfranchisement of the District of Columbia, the two-party duopoly, the winner-take-all principle, partisan redistricting, 95 percent incumbent re-election rates, media concentration, Buckley v. Valeo, the K Street Project, voter turnout below 50 percent, shortages of voting machines and poll workers--this is a functioning democracy? If these travesties of logic and fairness promoted majority rule rather than prevented it, they would doubtless have been abolished long ago.
Richard Seymour recently linked to The Conscience of the Ex-Communist, Isaac Deutscher's 1950 review of The God That Failed, which measures the disillusionment of mid-20th century leftists with the Russian revolution against the similar turning away by 19th century former admirers of the French. He makes the point that recognising the betrayal of a revolution is no excuse for joining the reactionaries - a point which should be blindingly obvious in the age of the neocon.
An honest and critically minded man could reconcile him­self to Napoleon as little as he can now to Stalin. But despite Napoleon's violence and frauds, the message of the French revolution survived to echo powerfully throughout the nineteenth century. The Holy Alliance freed Europe from Napoleon's oppression; and for a moment its victory was hailed by most Europeans. Yet what Castlereagh and Metternich and Alexander I had to offer to "liberated" Europe was merely the preservation of an old, decomposing order. Thus the abuses and the aggressiveness of an empire bred by the revolution gave a new lease on life to European feudal­ism. This was the ex-Jacobin's most unexpected triumph. But the price he paid for it was that presently he himself, and his anti-Jacobin cause, looked like vicious, ridiculous anachronisms. In the year of Napoleon's defeat, Shelley wrote to Wordsworth:
In honoured poverty thy voice did weave
Songs consecrate to truth and liberty—
Deserting these, thou leavest me to grieve,
Thus having been, that thou shouldst cease to be.

If our ex-Communist had any historical sense, he would ponder this lesson.
And speaking of Mr Seymour, here's his review of Dominick Jenkins' The Final Frontier.
The Military Services Institute, formed in 1878, was to represent and coordinate the interests and knowledge disciplines of what Jenkins calls the 'military progressives', those who were persuaded of the need for a professionalised officer corps, a standing army, military academies... the trouble was, they depended on Congress for appropriations but could not point to a single enemy that raised the need for a large standing army. What they sought to do therefore was to offer the state control over warmaking, using the sciences to derive laws akin to those provided by mathematics and mechanics. Far from being dangerous to liberty, they could show with copious example, standing armies were essential to it. What is more, by understanding the mechanics of conflict better, they could minimise the risk of war, as well as the risks of warmaking. It was necessary, of course, to engage in the inflation of threats (or the invention of them), since America's railway system, industry and agricultural surpluses all favoured its rapid defense in the event of an attempted invasion. General Emory Upton advanced some unique arguments: first, that America's military successes were impaired by excessive human and financial waste, a matter which would be remedied through science and professionalisation; and second, that there was a great propensity for internal commotion - Shays' Rebellion, the Whisky Rebellion, the Great Rebellion, the railroad riots of 1877 - which would need to be crushed before it became a nation-wide insurgency so that democracy could prevail. Others wondered how much the immigrant communities really valued American interests, particularly given a conflict with the societies from which they had emigrated.
Further, it was argued that America's growing transportation and commerce internationally would make it more vulnerable to attack, and that to assert her rights as a trading nation, it would be essential to have the military wherewithal to resist rival intimidation. And they offered the instance of China, a great civilisation, plundered and humbled by a cluster of imperial locusts. New York's growing financial prominence might well surrender to foreign conquest as the Yangzi Delta's manufacturing dominance once had.

While the patrician reformers converged with the military progressives in their empire-building tendencies, the crucial gulf between them was how they perceived military service itself. The reformers tended toward a romanticised view of volunteer warriors, and of the army as a place to emulate American heroes past. The military progressives knew that it could never be thus. They sought an army capable of defeating a large European or Asian power, which meant conscription - men would be forced to fight by drill, propaganda and the threat of the firing squad. What is more, the military leadership knew as well as the reformers did that the main examples of heroism past were less salutary than anyone would publicly admit: the war against the south having been won through the prodigious use of terror against the civilian population. There was one way, and one way alone, to get around this: if the ordinary soldier could not be a hero, the commander could. The future of romantic combat lay in the charismatic power of commanding officers.

I think in all of this, you have the essential ingredients for the transition from an increasingly challenged, polarised and crisis-ridden republic to an empire.
Apropos of which, here's Chalmers Johnson's National Intelligence Assessment on the United States, published in Harper's.
Eisenhower went on to suggest that such an arrangement, which he called the “military-industrial complex,” could be perilous to American ideals. The short-term economic benefits were clear, but the very nature of those benefits—which were all too carefully distributed among workers and owners in “every city, every statehouse, every office of the federal government”—tended to short-circuit Keynes's insistence that government spending be cut back in good times. The prosperity of the United States came increasingly to depend upon the construction and continual maintenance of a vast war machine, and so military supremacy and economic security became increasingly intertwined in the minds of voters. No one wanted to turn off the pump.

Between 1940 and 1996, for instance, the United States spent nearly $4.5 trillion on the development, testing, and construction of nuclear weapons alone. By 1967, the peak year of its nuclear stockpile, the United States possessed some 32,000 deliverable bombs. None of them was ever used, which illustrates perfectly Keynes's observation that, in order to create jobs, the government might as well decide to bury money in old mines and “leave them to private enterprise on the well-tried principles of laissez faire to dig them up again.” Nuclear bombs were not just America's secret weapon; they were also a secret economic weapon.

...

To understand the real weight of military Keynesianism in the American economy today, however, one must approach official defense statistics with great care. The “defense” budget of the United States—that is, the reported budget of the Department of Defense—does not include: the Department of Energy's spending on nuclear weapons ($16.4 billion slated for fiscal 2006), the Department of Homeland Security's outlays for the actual “defense” of the United States ($41 billion), or the Department of Veterans Affairs' responsibilities for the lifetime care of the seriously wounded ($68 billion). Nor does it include the billions of dollars the Department of State spends each year to finance foreign arms sales and militarily related development or the Treasury Department's payments of pensions to military retirees and widows and their families (an amount not fully disclosed by official statistics). Still to be added are interest payments by the Treasury to cover past debt-financed defense outlays. The economist Robert Higgs estimates that in 2002 such interest payments amounted to $138.7 billion.
The CIA aren't allowed to do a National Intelligence Estimate on the US itself but, as Johnson says, the ones on other countries he vetted while at the Agency were little different from magazine articles. "When my wife once asked me what was so secret about them, I answered that perhaps it was the fact that this was the best we could do."

Also at Harper's, on Ken Silverstein's "Washington Babylon" weblog (the name is taken from the book Silverstein, then co-editing Counterpunch, wrote with Alexander Cockburn detailing the seamier side of business as usual in the US capital, through which I heard about their newsletter, way back in the '90s, years before I had access to the net), various experts give their views on the likelihood of Bush starting a war with Iran, and on the likely result, if you're following that stuff. The consensus seems to be that there is no deliberate plan for an attack, and certainly not for an invasion, but that Bush's posturing might just start a war accidentally. The opinion on intent seems about right; the gibberish currently being spouted about Iranian arms shipments to the Iraqi "insurgency" could very well be less an attempt to manufacture a casus belli for war with Iran than a way of claiming that America's abject failure in Iraq is due not to incompetence but to the hidden machinations of mastermind Saddam Zarqawi Ahmadinejad. The agenda is domestic, and all about the administration's refusal to admit their own idiocy.