No single event marked the shift from eating food to eating nutrients, though in retrospect a little-noticed political dust-up in Washington in 1977 seems to have helped propel American food culture down this dimly lighted path. Responding to an alarming increase in chronic diseases linked to diet — including heart disease, cancer and diabetes — a Senate Select Committee on Nutrition, headed by George McGovern, held hearings on the problem and prepared what by all rights should have been an uncontroversial document called “Dietary Goals for the United States.” The committee learned that while rates of coronary heart disease had soared in America since World War II, other cultures that consumed traditional diets based largely on plants had strikingly low rates of chronic disease. Epidemiologists also had observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease temporarily plummeted.

Read it all.
Naïvely putting two and two together, the committee drafted a straightforward set of dietary guidelines calling on Americans to cut down on red meat and dairy products. Within weeks a firestorm, emanating from the red-meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about food — the committee had advised Americans to actually “reduce consumption of meat” — was replaced by artful compromise: “Choose meats, poultry and fish that will reduce saturated-fat intake.”...
Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, and would instead arrive clothed in scientific euphemism and speaking of nutrients, entities that few Americans really understood but that lack powerful lobbies in Washington. This was precisely the tack taken by the National Academy of Sciences when it issued its landmark report on diet and cancer in 1982. Organized nutrient by nutrient in a way guaranteed to offend no food group, it codified the official new dietary language. Industry and media followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids and carotenes soon colonized much of the cultural space previously occupied by the tangible substance formerly known as food. The Age of Nutritionism had arrived.
...
The first thing to understand about nutritionism ... is that it is not quite the same as nutrition. As the “ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s exerting its hold on your culture...
In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. From this basic premise flow several others. Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists speak) to explain the hidden reality of foods to us. To enter a world in which you dine on unseen nutrients, you need lots of expert help.
...
Another potentially serious weakness of nutritionist ideology is that it has trouble discerning qualitative distinctions between foods. So fish, beef and chicken through the nutritionists’ lens become mere delivery systems for varying quantities of fats and proteins and whatever other nutrients are on their scope. Similarly, any qualitative distinctions between processed foods and whole foods disappear when your focus is on quantifying the nutrients they contain (or, more precisely, the known nutrients).
This is a great boon for manufacturers of processed food, and it helps explain why they have been so happy to get with the nutritionism program. In the years following McGovern’s capitulation and the 1982 National Academy report, the food industry set about re-engineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and less of the bad, and by the late ’80s a golden era of food science was upon us. The Year of Eating Oat Bran — also known as 1988 — served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern had been established, and every few years since then a new oat bran has taken its turn under the marketing lights. (Here comes omega-3!)
By comparison, the typical real food has more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t easily change its nutritional stripes (though rest assured the genetic engineers are hard at work on the problem). So far, at least, you can’t put oat bran in a banana. So depending on the reigning nutritional orthodoxy, the avocado might be either a high-fat food to be avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate of each whole food rises and falls with every change in the nutritional weather, while the processed foods are simply reformulated. That’s why when the Atkins mania hit the food industry, bread and pasta were given a quick redesign (dialing back the carbs; boosting the protein), while the poor unreconstructed potatoes and carrots were left out in the cold.
Of course it’s also a lot easier to slap a health claim on a box of sugary cereal than on a potato or carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over, the Cocoa Puffs and Lucky Charms are screaming about their newfound whole-grain goodness.
Meanwhile, discovering that Mr Gathman had been a contributor at the Austin Chronicle, I dug up some of his old articles. In one of them, a review of a number of medical texts, I found this interesting tale of serendipity. The tale itself is well known; just how serendipitous the discovery was, I had not known.
Here's how Alexander Fleming discovered penicillin: He found some contaminating mold growing on a petri dish of staphylococci he had left standing by a window. The mold had sprouted there, on its own, and was destroying the bacteria. But this account understates the element of accident. Other scientists could never reproduce the discovery as Fleming recounted it because, as a matter of fact, Penicillium won't usually grow that way. Someone finally discovered that the temperature in London at the end of July, 1928, when Fleming's discovery was made, had been exceptionally cool. This allowed a spore, floating up from the laboratory on the floor below, where fungi were being investigated, to grow. Then the temperature returned to normal, and that allowed the staphylococci to grow. Penicillin, in other words, was as improbable as the "chance meeting of an umbrella and a sewing machine on a dissecting table," to quote Lautréamont's line which, in June of 1928, surrealists in Paris were proclaiming as an aesthetic principle. Probably Fleming never heard of André Breton, but they were brothers under the skin. As Le Fanu succinctly puts it, "The therapeutic revolution of the post-war years was not ignited by a major scientific insight, rather the reverse: it was the realization by doctors and scientists that it was not necessary to understand in any detail what was wrong, but that synthetic chemistry blindly and randomly would deliver the remedies."

Speaking of book reviews, while searching for ones of Hampton Sides' Blood and Thunder I was taken with this summary of the argument of Jared Diamond's Collapse.
A thousand years ago, a group of Vikings led by Erik the Red set sail from Norway for the vast Arctic landmass west of Scandinavia which came to be known as Greenland. It was largely uninhabitable—a forbidding expanse of snow and ice. But along the southwestern coast there were two deep fjords protected from the harsh winds and saltwater spray of the North Atlantic Ocean, and as the Norse sailed upriver they saw grassy slopes flowering with buttercups, dandelions, and bluebells, and thick forests of willow and birch and alder. Two colonies were formed, three hundred miles apart, known as the Eastern and Western Settlements. The Norse raised sheep, goats, and cattle. They turned the grassy slopes into pastureland. They hunted seal and caribou. They built a string of parish churches and a magnificent cathedral, the remains of which are still standing. They traded actively with mainland Europe, and tithed regularly to the Roman Catholic Church. The Norse colonies in Greenland were law-abiding, economically viable, fully integrated communities, numbering at their peak five thousand people. They lasted for four hundred and fifty years—and then they vanished.

You could, of course, just read the whole review, or even the book, which currently sits amongst my other volumes, mocking me with its two-inch-wide spine.
...
There was nothing wrong with the social organization of the Greenland settlements. The Norse built a functioning reproduction of the predominant northern-European civic model of the time—devout, structured, and reasonably orderly. In 1408, right before the end, records from the Eastern Settlement dutifully report that Thorstein Olafsson married Sigrid Bjornsdotter in Hvalsey Church on September 14th of that year, with Brand Halldorstson, Thord Jorundarson, Thorbjorn Bardarson, and Jon Jonsson as witnesses, following the proclamation of the wedding banns on three consecutive Sundays.
The problem with the settlements, Diamond argues, was that the Norse thought that Greenland really was green; they treated it as if it were the verdant farmland of southern Norway. They cleared the land to create meadows for their cows, and to grow hay to feed their livestock through the long winter. They chopped down the forests for fuel, and for the construction of wooden objects. To make houses warm enough for the winter, they built their homes out of six-foot-thick slabs of turf, which meant that a typical home consumed about ten acres of grassland.
But Greenland’s ecosystem was too fragile to withstand that kind of pressure. The short, cool growing season meant that plants developed slowly, which in turn meant that topsoil layers were shallow and lacking in soil constituents, like organic humus and clay, that hold moisture and keep soil resilient in the face of strong winds. “The sequence of soil erosion in Greenland begins with cutting or burning the cover of trees and shrubs, which are more effective at holding soil than is grass,” he writes. “With the trees and shrubs gone, livestock, especially sheep and goats, graze down the grass, which regenerates only slowly in Greenland’s climate. Once the grass cover is broken and the soil is exposed, soil is carried away especially by the strong winds, and also by pounding from occasionally heavy rains, to the point where the topsoil can be removed for a distance of miles from an entire valley.” Without adequate pastureland, the summer hay yields shrank; without adequate supplies of hay, keeping livestock through the long winter got harder. And, without adequate supplies of wood, getting fuel for the winter became increasingly difficult.
The Norse needed to reduce their reliance on livestock—particularly cows, which consumed an enormous amount of agricultural resources. But cows were a sign of high status; to northern Europeans, beef was a prized food. They needed to copy the Inuit practice of burning seal blubber for heat and light in the winter, and to learn from the Inuit the difficult art of hunting ringed seals, which were the most reliably plentiful source of food available in the winter. But the Norse had contempt for the Inuit—they called them skraelings, “wretches”—and preferred to practice their own brand of European agriculture. In the summer, when the Norse should have been sending ships on lumber-gathering missions to Labrador, in order to relieve the pressure on their own forestlands, they instead sent boats and men to the coast to hunt for walrus. Walrus tusks, after all, had great trade value. In return for those tusks, the Norse were able to acquire, among other things, church bells, stained-glass windows, bronze candlesticks, Communion wine, linen, silk, silver, churchmen’s robes, and jewelry to adorn their massive cathedral at Gardar, with its three-ton sandstone building blocks and eighty-foot bell tower. In the end, the Norse starved to death.
...
[T]he disappearance of the Norse settlements is usually blamed on the Little Ice Age, which descended on Greenland in the early fourteen-hundreds, ending several centuries of relative warmth. (One archeologist refers to this as the “It got too cold, and they died” argument.) What all these explanations have in common is the idea that civilizations are destroyed by forces outside their control, by acts of God.
...
It did get colder in Greenland in the early fourteen-hundreds. But it didn’t get so cold that the island became uninhabitable. The Inuit survived long after the Norse died out, and the Norse had all kinds of advantages, including a more diverse food supply, iron tools, and ready access to Europe. The problem was that the Norse simply couldn’t adapt to the country’s changing environmental conditions. Diamond writes, for instance, of the fact that nobody can find fish remains in Norse archeological sites. One scientist sifted through tons of debris from the Vatnahverfi farm and found only three fish bones; another researcher analyzed thirty-five thousand bones from the garbage of another Norse farm and found two fish bones. How can this be? Greenland is a fisherman’s dream: Diamond describes running into a Danish tourist in Greenland who had just caught two Arctic char in a shallow pool with her bare hands. “Every archaeologist who comes to excavate in Greenland . . . starts out with his or her own idea about where all those missing fish bones might be hiding,” he writes. “Could the Norse have strictly confined their munching on fish to within a few feet of the shoreline, at sites now underwater because of land subsidence? Could they have faithfully saved all their fish bones for fertilizer, fuel, or feeding to cows?” It seems unlikely. There are no fish bones in Norse archeological remains, Diamond concludes, for the simple reason that the Norse didn’t eat fish. For one reason or another, they had a cultural taboo against it.
Given the difficulty that the Norse had in putting food on the table, this was insane. Eating fish would have substantially reduced the ecological demands of the Norse settlements. The Norse would have needed fewer livestock and less pastureland. Fishing is not nearly as labor-intensive as raising cattle or hunting caribou, so eating fish would have freed time and energy for other activities. It would have diversified their diet.
Why did the Norse choose not to eat fish? Because they weren’t thinking about their biological survival. They were thinking about their cultural survival. Food taboos are one of the idiosyncrasies that define a community. Not eating fish served the same function as building lavish churches, and doggedly replicating the untenable agricultural practices of their land of origin. It was part of what it meant to be Norse, and if you are going to establish a community in a harsh and forbidding environment, all those little idiosyncrasies which define and cement a culture are of paramount importance.

“The Norse were undone by the same social glue that had enabled them to master Greenland’s difficulties,” Diamond writes. “The values to which people cling most stubbornly under inappropriate conditions are those values that were previously the source of their greatest triumphs over adversity.” He goes on:

To us in our secular modern society, the predicament in which the Greenlanders found themselves is difficult to fathom. To them, however, concerned with their social survival as much as their biological survival, it was out of the question to invest less in churches, to imitate or intermarry with the Inuit, and thereby to face an eternity in Hell just in order to survive another winter on Earth.

Diamond’s distinction between social and biological survival is a critical one, because too often we blur the two, or assume that biological survival is contingent on the strength of our civilizational values. That was the lesson taken from the two world wars and the nuclear age that followed: we would survive as a species only if we learned to get along and resolve our disputes peacefully. The fact is, though, that we can be law-abiding and peace-loving and tolerant and inventive and committed to freedom and true to our own values and still behave in ways that are biologically suicidal. The two kinds of survival are separate.
...
Rivers and streams and forests and soil are biological resources. They are tangible, finite things, and societies collapse when they get so consumed with addressing the fine points of their history and culture and deeply held beliefs—with making sure that Thorstein Olafsson and Sigrid Bjornsdotter are married before the right number of witnesses following the announcement of wedding banns on the right number of Sundays—that they forget that the pastureland is shrinking and the forest cover is gone.
When archeologists looked through the ruins of the Western Settlement, they found plenty of the big wooden objects that were so valuable in Greenland—crucifixes, bowls, furniture, doors, roof timbers—which meant that the end came too quickly for anyone to do any scavenging. And, when the archeologists looked at the animal bones left in the debris, they found the bones of newborn calves, meaning that the Norse, in that final winter, had given up on the future. They found toe bones from cows, equal to the number of cow spaces in the barn, meaning that the Norse ate their cattle down to the hoofs, and they found the bones of dogs covered with knife marks, meaning that, in the end, they had to eat their pets. But not fish bones, of course. Right up until they starved to death, the Norse never lost sight of what they stood for.
While we're on the environment, here's a great article from Jeffrey St Clair on the destructive impact of grazing on public lands. Being the kind of chap I am, I'll quote his snarky takedown of the Western myth that attaches to ranchers, rather than the more informative stuff.
Let's face it, public lands ranching is a peculiarly American form of social welfare. To qualify, you just need to get yourself a base ranch adjacent to some of the most scenic landscapes in North America. Many ranchers came by their properties through primogeniture, a quaint medieval custom still common throughout much of the rural West. If you're a second son, a daughter or a latecomer, don't despair: you can simply buy a ranch with a grazing permit, like billionaire potato king JR Simplot did. It's easy. No background checks or means-testing required. A million bucks should get you started.

Also enjoyable is St Clair's angry piece The Withering of the American Environmental Movement.
Rest assured, your public subsidies are guaranteed -- by Senate filibuster if necessary. Ranching subsidies are an untouchable form of welfare, right up there with the depletion allowance and the F-22 fighter. And guess what? It's guilt free. Nobody complains that your subsidies came at the expense of Head Start or aid to mothers with dependent children. Or that you take your profits from the subsidies and spend them on a case of Jack Daniels, a chunk of crystal meth, a night at the Mustang Ranch, a week at the roulette tables in Reno -- or donate it all to Operation Rescue or the Sahara Club. (The most outlandish fables about profligate welfare queens have nothing on the average slob rancher.) Hell, do it right and somebody might even promote you as a cultural hero, a damn fine role model for future generations.
The Western ranching fraternity is more homogeneous than the Royal Order of the Moose: landed, white, male and politically conservative to a reactionary and vicious degree. It is not culturally, ethnically or racially representative of America and never has been. Shouldn't there be equal access to this federal grazing largesse, these hundreds of millions in annual subsidies to well-off white geezers? A creative litigator might be able to demonstrate that federal grazing policies violate the Civil Rights Act.
Strip away the Stetsons, bolo ties, and rattlesnake-skin boots, and why should we view Western ranchers differently than the tobacco farmers of the Southeast? And what about their political benefactors? Is there a fundamental difference between, say, Max Baucus and Saxby Chambliss? Robert Bennett and Trent Lott? (Okay, Lott and Baucus have fewer bad hair days.) All annually raid the federal treasury to sustain industries that degrade the environment, ravage the public health and enervate the economy. Federal tobacco subsidies amount to about $100 million a year; grazing subsidies may exceed a billion a year. No wonder there's so little congressional support for a single-payer health care system: passage of such a plan might actually force the Cow and Cigarette Caucus to choose between the health of the citizenry and their allegiance to political pork barrel.
Some argue that the rancher is rooted to place, that he loves the land, holds a unique reverence for its contours, beauty, rhythms. The rancher, they say, is one of Wallace Stegner's "stickers," not an itinerant booster or commercial migrant, like the cut-and-run logger. He doesn't leave behind radioactive tailings piles or slopes of stumps in his wake, but gently tends and grooms the landscape, improving Nature's defects, year after year, generation after generation.
But take away the subsidies, the nearly free forage, the roads, the even cheaper water that magically appears from nowhere in the middle of the high desert, the tax breaks, predator control, abeyances from environmental standards and disproportionate political clout when anything else goes against him, such as drought, rangefires, bad investments. Then charge them for the gruesome externalities of their "avocation" and see how many stick around for the hardscrabble lifestyle that remains. Federal subsidies and political protection are the Velcro for most of these guys, not the view of the Wind River Range.
If you're looking for angry, be sure to check out Roger Morris' mini-biography of Donald Rumsfeld, in two parts at TomDispatch. (Actually, it's pretty much everywhere.)
[I]n the 1960s, Rumsfeld's ardor for a high-tech military was only stirring, a minor dalliance compared to his preoccupation with advancement. While few seemed to notice, the brash freshman made an extraordinary rush at the lumbering House. In 1964, before the end of his first term, he captained a revolt against GOP Leader Charles Halleck, a Dwight D. Eisenhower loyalist prone to bipartisanship and skepticism of both Pentagon budgets and foreign intervention. By only six votes in the Republican Caucus, Rumsfeld managed to replace the folksy Indianan with Michigan's Gerald Ford.

Ah, history.
In the inner politics of the House, the likeable, agreeable, unoriginal Ford was always more right-wing than his benign post-Nixon, and now posthumous, presidential image would have it. Richard Nixon called Ford "a wink and a nod guy," whose artlessness and integrity left him no real match for the steelier, more cunning figures around him. To push Ford was one of those darting Capitol Hill insider moves that seemed, at the time, to win Rumsfeld only limited, parochial prizes -- choice committee seats, a rung on the leadership ladder, useful allies.
Taken with Rumsfeld's burly style that year was Kansas Congressman Robert Ellsworth, a wheat-field small-town lawyer of decidedly modest gifts but outsized ambitions and close connections to Nixon. "Just another Young Turk thing," one of their House cohorts casually called the toppling of Halleck.
It seems hard now to exaggerate the endless sequels to this small but decisive act. The lifting of the honest but mediocre Ford higher into line for appointment as vice president amid the ruin of President Richard Nixon and his Vice President, Spiro Agnew; Ford's lackluster, if relatively harmless, interval in the Oval Office and later as Party leader with the abject passing of the GOP to Ronald Reagan in 1980; Ellsworth's boosting of Rumsfeld into prominent but scandal-immune posts under Nixon; and then, during Ford's presidency, Rumsfeld's reward, his elevation to White House Chief of Staff, and with him the rise of one of his aides from the Nixon era, a previously unnoticed young Wyoming reactionary named Dick Cheney; next, in 1975-1976, the first Rumsfeld tenure at a Vietnam-disgraced but impenitent Pentagon that would shape his fateful second term after 2001; and eventually, of course, the Rumsfeld-Cheney monopoly of power in a George W. Bush White House followed by their catastrophic policies after 9/11 -- all derived from making decent, diffident Gerry Ford Minority Leader that forgotten winter of 1964.
Barely a year after moving next to the Oval Office (and contrary to Ford's innocent, prideful recollection decades later that it was his own idea), Don and Dick characteristically engineered their "Halloween Massacre." Subtly exploiting Ford's unease (and Kissinger's jealous rivalry) with cerebral, acerbic Defense Secretary James Schlesinger, they managed to pass the Pentagon baton to Rumsfeld at only 43, and slot Cheney, suddenly a wunderkind at 34, in as presidential Chief of Staff.

Also at TomDispatch I found the essay that Mike Davis has expanded into a book on the history of the car-bomb. (Part two here.)
In the process, they even maneuvered Ford into humbling Kissinger by stripping him of his long-held dual role as National Security Advisor as well as Secretary of State, giving a diffident Brent Scowcroft the National Security Council job and further enhancing both Cheney's inherited power at the White House and Rumsfeld's as Kissinger's chief cabinet rival. A master schemer himself, Super K, as an adoring media called him, would be so stunned by the Rumsfeld-Cheney coup that he would call an after-hours séance of cronies at a safe house in Chevy Chase to plot a petulant resignation as Secretary of State, only to relent, overcome as usual by the majesty of his own gifts.
The members of the Stern Gang were ardent students of violence, self-declared Jewish admirers of Mussolini who steeped themselves in the terrorist traditions of the pre-1917 Russian Socialist-Revolutionary Party, the Macedonian IMRO, and the Italian Blackshirts. As the most extreme wing of the Zionist movement in Palestine -- "fascists" to the Haganah and "terrorists" to the British -- they were morally and tactically unfettered by considerations of diplomacy or world opinion. They had a fierce and well-deserved reputation for the originality of their operations and the unexpectedness of their attacks. On January 12, 1947, as part of their campaign to prevent any compromise between mainstream Zionism and the British Labor government, they exploded a powerful truck bomb in the central police station in Haifa, resulting in 144 casualties. Three months later, they repeated the tactic in Tel Aviv, blowing up the Sarona police barracks (5 dead) with a stolen postal truck filled with dynamite.

And if that leaves you insufficiently disturbed, try this rather old (but new to me) article by Craig Unger on Tim LaHaye and the premillennial dispensationalists (also soon to be a book).
In December 1947, following the UN vote to partition Palestine, full-scale fighting broke out between Jewish and Arab communities from Haifa to Gaza. The Stern Gang, which rejected anything less than the restoration of a biblical Israel, now gave the truck bomb its debut as a weapon of mass terror. On January 4, 1948, two men in Arab dress drove a truck ostensibly loaded with oranges into the center of Jaffa and parked it next to the New Seray Building, which housed the Palestinian municipal government as well as a soup-kitchen for poor children. They coolly lingered for coffee at a nearby café before leaving a few minutes ahead of the detonation.
"A thunderous explosion," writes Adam LeBor in his history of Jaffa, "then shook the city. Broken glass and shattered masonry blew out across Clock Tower Square. The New Seray's centre and side walls collapsed in a pile of rubble and twisted beams. Only the neo-classical façade survived. After a moment of silence, the screams began, 26 were killed, hundreds injured. Most were civilians, including many children eating at the charity kitchen." The bomb missed the local Palestinian leadership who had moved to another building, but the atrocity was highly successful in terrifying residents and setting the stage for their eventual flight.
It also provoked the Palestinians to cruel repayment in kind. The Arab High Committee had its own secret weapon -- blond-haired British deserters, fighting on the side of the Palestinians. Nine days after the Jaffa bombing, some of these deserters, led by Eddie Brown, a former police corporal whose brother had been murdered by the Irgun, commandeered a postal delivery truck which they packed with explosives and detonated in the center of Haifa's Jewish quarter, injuring 50 people. Two weeks later, Brown, driving a stolen car and followed by a five-ton truck driven by a Palestinian in a police uniform, successfully passed through British and Haganah checkpoints and entered Jerusalem's New City. The driver parked in front of the Palestine Post, lit the fuse, and then escaped with Brown in his car. The newspaper headquarters was devastated with 1 dead and 20 wounded.
According to a chronicler of the episode, Abdel Kader el-Husseini, the military leader of the Arab Higher Committee, was so impressed by the success of these operations -- inadvertently inspired by the Stern Gang -- that he authorized an ambitious sequel employing six British deserters. "This time three trucks were used, escorted by a stolen British armored car with a young blond man in police uniform standing in the turret." Again, the convoy easily passed through checkpoints and drove to the Atlantic Hotel on Ben Yehuda Street. A curious night watchman was murdered when he confronted the gang, who then drove off in the armored car after setting charges in the three trucks. The explosion was huge and the toll accordingly grim: 46 dead and 130 wounded.
For miles around in all directions the fertile Jezreel Valley, known as the breadbasket of Israel, is spread out before us, an endless vista of lush vineyards and orchards growing grapes, oranges, kumquats, peaches, and pears. It is difficult to imagine a more beautiful pastoral panorama.

Mr Unger recently produced a report on the people shilling for a war with Iran, From the Wonderful Folks Who Brought You Iraq. Any thematic segue between that piece and the earlier one on Rapturism is hopefully entirely in your own mind.
The sight LaHaye's followers hope to see here in the near future, however, is anything but bucolic. Their vision is fueled by the book of Revelation, the dark and foreboding messianic prophecy that foresees a gruesome and bloody confrontation between Christ and the armies of the Antichrist at Armageddon.
...
As we walk down from the top of the hill of Megiddo, one of them looks out over the Jezreel Valley. "Can you imagine this entire valley filled with blood?" he asks. "That would be a 200-mile-long river of blood, four and a half feet deep. We've done the math. That's the blood of as many as two and a half billion people."
When this will happen is another question, and the Bible says that "of that day and hour knoweth no man." Nevertheless, LaHaye's disciples are certain these events—the End of Days—are imminent. In fact, one of them has especially strong ideas about when they will take place. "Not soon enough," she says. "Not soon enough."
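Out of morbid curiosity, I checked whether the math they claim to have done actually holds up. A quick back-of-the-envelope sketch, assuming roughly five litres of blood per person (my figure, not theirs) and a uniform rectangular channel:

```python
# Sanity check of the "river of blood" arithmetic quoted above.
# Assumptions (mine, not the article's): ~5 litres of blood per person,
# and a river shaped like a uniform rectangular channel.

MILES_TO_M = 1609.34
FEET_TO_M = 0.3048

people = 2.5e9               # "two and a half billion people"
blood_per_person_l = 5.0     # assumed average adult blood volume, litres
length_m = 200 * MILES_TO_M  # "200-mile-long river"
depth_m = 4.5 * FEET_TO_M    # "four and a half feet deep"

total_blood_m3 = people * blood_per_person_l / 1000.0  # litres -> cubic metres
implied_width_m = total_blood_m3 / (length_m * depth_m)

print(f"total blood: {total_blood_m3:.3g} m^3")         # ~1.25e7 m^3
print(f"implied river width: {implied_width_m:.0f} m")  # ~28 m
```

For what it's worth, the figures roughly cohere, provided you grant the river a width the speakers never specify: at 200 miles long and four and a half feet deep, the blood of two and a half billion people implies a channel only about thirty metres wide. Internally consistent, if no less lurid.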