Stuart Jeanne Bramhall's Blog: The Most Revolutionary Act
February 25, 2014
Rethinking Industrial Agriculture
Small food forest
(This is the second of two posts about dramatic changes that are occurring in food production and marketing, as well as consumer food choices. Part II addresses the application of design technology to water and soil management, which is revolutionizing the movement towards local food production.)
Applying Design Technology to Farming
Most food localization initiatives have been accompanied by radical technological advances that apply design principles to the way food is grown. The design technology employed in the rapidly growing fields of permaculture and biointensive farming is based on a radically different approach to water and soil management, modeled on nature’s ecosystem design principles. Anyone who studies natural ecosystems can’t help but notice there are no neat rows or bare soil in natural forests and prairies. Nature crams as many living organisms as possible, all with complex symbiotic relationships, into every square inch.
Ironically this “revolutionary” technology happens to be 4,000 years old. Chinese farmers discovered around 2,000 B.C. that designing their fields to replicate natural ecosystems produced the highest yields. This approach is well-described in F.H. King’s 1911 book Farmers of Forty Centuries. The US Department of Agriculture sent King to China in the early 1900s to investigate why Chinese farms were so amazingly productive. What he discovered was a highly sophisticated system of water and soil management that emphasized species diversity and rational utilization of ecological relationships among plants and between plants and animals.
The Watershed Model of Water Management
Despite King’s innovative work, it has taken English-speaking countries a full century for his lessons to sink in. Applying a capitalist slash-and-burn mentality to farming clearly hasn’t worked. Agricultural practices in Britain and its former colonies, which all employ similar “modern” methods of water management, have destroyed tons of topsoil and reduced yields by roughly a third. In a desperate attempt to ramp yields back up, chemical insecticides and herbicides were introduced after World War II. These, in turn, systematically killed off microscopic soil organisms essential to plant health.
Britain, the US, Canada, Australia, New Zealand and other former British colonies all adopted the “drainage” system of water management. In this approach, trees are systematically cleared (usually by burning) and wetlands and springs are drained. Typically land managed in this way is subject to alternating flooding and drought, creating an unending cycle of economic hardship for farmers and farming communities. Besides destroying existing crops, repeated flooding also washes away topsoil and essential plant nutrients.
In contrast, traditional farmers in non-English-speaking countries are more likely to use the centuries-old “water catchment model” of water management, sometimes referred to as terraquaculture. Because they deliberately design their farms to catch and hold water, they aren’t subject to flooding, soil erosion or drought. Chinese farmers wouldn’t dream of draining their wetlands, which are always the most productive areas for high-energy food crops such as rice and other grains.
Plowing “Kills” Soil
Soil technology has also advanced greatly in the last five decades, with the discovery of complex micro-ecosystems that support optimal plant growth. These ecosystems include myriad soil yeasts, bacteria and other organisms that live in symbiosis with host plants. Not only do they provide nutrients to the root systems of larger plants, but they also produce a range of natural insecticides and herbicides to protect them against pests. Mechanically disrupting the soil through plowing kills these organisms. They can potentially recover if the soil is left undisturbed – unless the grower totally wipes them out with pesticides, herbicides or bactericidal GMOs.
Studies show that plant diversity is also essential to a healthy plant ecosystem. Planting a single crop in neat rows surrounded by bare soil is a perfect invitation for weeds and insects to attack it.
Permaculture, in contrast, discourages noxious weeds and insect pests by creating “food forests” made up of compatible food-producing trees, shrubs and ground cover crops. Unlike veggie gardens limited to annuals that have to be replanted every year, the food forest is self-sustaining with minimal input. For people worried about the economy collapsing and their gardens being invaded by barbarians from the big city, it’s also virtually indestructible.
To get some idea of what a food forest looks like, check out this video by Australian permaculture guru Geoff Lawton:
Attention City Dwellers
Lawton is also a big fan of small space urban permaculture because it’s the most productive in terms of yield per square foot. The following is a video by one of his students about designing a permaculture food growing system on your balcony or terrace:
photo credit: London Permaculture via photopin cc
Originally published in Dissident Voice


February 24, 2014
Corporate Food is Bad for You
Chicago Lights Urban Farm
(This is the 1st of 2 posts about dramatic changes that are occurring in food production and marketing, as well as consumer food choices. Part I addresses the conscious shift many consumers have made over the past decade to locally grown organic food.)
Various studies reveal that as many as 20% of Americans make the conscious choice to eat organic food. Those who make the switch from corporate, industrially produced food do so for a variety of reasons. The main ones are cost, health and ethical concerns. Cost is a big consideration for low income families. In an economic depression accompanied by spiking food prices, growing your own fruits and vegetables or purchasing them from a grower at a farmers’ market can save families literally thousands of dollars a year.
Ironically, the economic crisis has one silver lining in inner cities, as neighborhoods organize to create urban orchards and gardens on vacant, foreclosed land. An example is Chicago Lights Urban Farm, which supplies fresh produce for the once notorious Cabrini Green subsidized housing complex. For many inner city residents – whose neighborhoods lost their supermarkets in the mass exodus of grocery chains during the eighties and nineties – this is their first access to fresh produce in decades.
Health issues linked to industrial agriculture are the second biggest reason people choose locally grown organic food over the standard corporate options. The growing list includes a number of debilitating and fatal illnesses linked with endocrine disruptors (estrogen-like molecules) in chemical herbicides and pesticides; contamination with infectious organisms; severe allergies, immune problems and cancers associated with GMOs (genetically modified organisms) and nanoparticles; type II diabetes related to growth hormones fed to US cattle; and the proliferation of superbugs like MRSA (methicillin-resistant Staphylococcus aureus) linked to antibiotics routinely fed to factory farmed animals.
Endocrine Disruptors and Food Borne Pathogens
At the moment the biggest concern for health advocates is the epidemic of breast cancer and infertility linked to the growing presence of endocrine disruptors in our water supply and food chain. Breast cancer currently affects one out of eight women, and sperm counts in American men are among the lowest in the industrialized world. However, the infectious organisms arising from factory farming methods and lax regulation of slaughter facilities are also responsible for a growing number of health problems. Infectious organisms linked with severe illness and death include the prion carried by cattle that causes Creutzfeldt-Jakob disease (aka Mad Cow Disease); campylobacter, salmonella and pathogenic E. coli from the fecal contamination associated with overcrowded livestock pens and inadequate regulation of slaughterhouse hygiene; and Mycobacterium avium paratuberculosis (MAP), an increasingly common organism linked to a big spike in Crohn’s disease. Lax US food regulation and inspection regimes are worrying enough. Adding to all these concerns is the vast amount of supermarket food imported from third world countries where food production is totally unregulated.
Genetically Modified Organisms (GMOs)
GMO-related health issues are another reason more and more consumers are going organic. Unlike New Zealand and most of Europe, which ban GMOs, in the US 88% of corn, 93% of soy, 90% of canola (used in cooking oil) and 90% of sugar beets (the source of half of US sugar) are genetically modified. Moreover, thanks to the millions Monsanto spends lobbying to block product labeling laws, the majority of US shoppers have no way of knowing whether supermarket foods contain GMOs. Knowledgeable consumers are especially angry about the so-called “Monsanto Protection Bill,” a clause inserted in a recent continuing budget resolution that virtually guarantees Monsanto immunity against lawsuits for GMO-related health problems and environmental damage.
Nanoparticles
The latest food controversy involves the presence of untested nanoparticles in processed foods. Nanoparticles are submicroscopic particles the food industry adds to foods and packaging to lengthen shelf life, act as thickening agents and seal in flavor. As You Sow, the NRDC and Friends of the Earth first raised the alarm about five years ago regarding the nanoparticles used in cosmetics. They were mainly concerned about studies showing that inhaled nanoparticles cause the same kind of lung damage as asbestos and can lead to cancer. More recently the American Society of Safety Engineers has issued a warning about research showing that nanoparticles in food pass into the bloodstream, accumulate in organs and interfere with metabolic processes and immune function.
Environmental and Psychological Benefits
Aside from cost and health concerns, an increasing number of consumers eat locally produced organic food for ethical and environmental reasons. In doing so, they are consciously opting out of an insane corporate agriculture system in which food is transported halfway around the world to satisfy an artificially created demand for strawberries in the winter. They are joining food localization initiatives springing up in thousands of neighborhoods and communities to increase options for locally produced organic food. As they reconnect with local growers to start farmers’ markets (the number in the US is 3,200 and growing) and Community Supported Agriculture (CSA) initiatives*, they find they are simultaneously rebuilding fundamental community ties their grandparents enjoyed.
Many farmers’ markets serve the additional function of a key gathering place for friends and neighbors, as you can see from the following video:
*Community Supported Agriculture is an alternative, locally-based economic model of agriculture and food distribution, in which local residents pre-subscribe to the produce of a given plot of farmland and take weekly delivery of fresh fruits and vegetables and free range/organic meat, eggs, raw milk, etc.
photo credit: crfsproject via photopin
Originally published in Dissident Voice


February 23, 2014
Let Them Eat Crickets
(With apologies to Marie Antoinette. This post is dedicated to readers who have lost their pensions or unemployment benefits or who are looking at having their hours, wages or Social Security benefits reduced. Some simple cricket recipes below. Please note in preparing cricket flour, you must first remove the legs and antennae. Next week: garden snails recipe from Gordon Ramsay.)
Excerpts from the UN’s Edible Insects: Future Prospects for Food and Feed
By 2050 the population of the Earth will be 9 billion, and food production will have to double to feed everyone. This will be a major challenge, given that the oceans are already overfished and fresh water is in growing shortage – a shortage that will drastically worsen as the planet warms. The Food and Agriculture Organization of the United Nations believes that edible insects, both as food and animal feed, may be the answer to growing food shortages.
They point out that over 1,900 insect species are edible and that two billion people around the world already consume them as a regular part of their diet. Insects are a highly nutritious and healthy food source, high in fat, protein, vitamins, fiber, omega-3 fatty acids and essential minerals. The insect species most commonly used as food are bees, wasps, ants, grasshoppers, locusts, crickets, cicadas, leaf- and planthoppers, scale insects and true bugs, termites, dragonflies and flies.
Farming insects for food or animal feed is relatively new but has enormous potential, especially in third world countries. Insect rearing is not necessarily a land-based activity and does not require land clearing to expand production. Moreover because they are cold-blooded, insects are very efficient at converting feed into protein (crickets, for example, need 12 times less feed than cattle, four times less feed than sheep, and half as much feed as pigs and broiler chickens to produce the same amount of protein). Like pigs, they can be fed on organic waste streams.
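The feed-conversion ratios quoted above can be turned into a quick back-of-the-envelope comparison. The sketch below is illustrative only: the absolute baseline figure of 1.7 kg of feed per kg of cricket protein is an assumed number (the FAO report quotes only the relative multiples), and the variable names are our own.

```python
# Rough feed-conversion comparison based on the relative multiples
# quoted above: crickets need 1/12 the feed of cattle, 1/4 that of
# sheep, and 1/2 that of pigs or broiler chickens.
CRICKET_FEED_PER_KG_PROTEIN = 1.7  # kg feed per kg protein (assumed baseline)

relative_feed = {
    "cricket": 1,
    "broiler chicken": 2,
    "pig": 2,
    "sheep": 4,
    "cattle": 12,
}

for animal, multiple in relative_feed.items():
    feed = CRICKET_FEED_PER_KG_PROTEIN * multiple
    print(f"{animal:16s} ~{feed:5.1f} kg feed per kg protein")
```

Whatever the true baseline turns out to be, the relative multiples alone show why cold-blooded livestock are so much cheaper to feed per unit of protein.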
As a business, insect harvesting and rearing is a low-tech, low-capital investment option that offers entry to the very poorest sections of society, including the landless. Protein and other nutritional deficiencies are typically more widespread in disadvantaged segments of society and during times of social conflict and natural disaster. Because of their nutritional composition, accessibility, simple rearing techniques and quick growth rates, insects can offer a cheap and efficient opportunity to counter nutritional insecurity by providing emergency food and by improving livelihoods and the quality of traditional diets among vulnerable people.
Simple Cricket recipes (from http://www.insectsarefood.com/recipes.html)
It is important to note that crickets should only be purchased from reliable sources. Crickets should be treated in much the same manner as any other raw food, in particular seafood: in other words, keep them as fresh as possible. Prior to preparing your crickets for a meal, place them inside a plastic container or storage bag and keep them in the refrigerator for at least an hour, or until you are ready to use them. This will not kill the crickets, but rather slow down their metabolism, inducing a state of hypothermia that keeps them from moving when removed from the container. If you prefer, however, as many people do, feel free to place them inside the freezer for an hour or two, as this will definitely kill them, guaranteeing their immobility.
After removing them from the refrigerator or freezer, place them in a pot of boiling water large enough to hold the amount of crickets you’re using. Add a few pinches of salt. Boil for about two minutes. This ensures cleanliness. Once boiled, remove from the water and let cool. At this point the crickets can be placed in storage bags and kept in the freezer, or used right away in any number of recipes. All crickets should be prepared in this manner prior to eating.
Dry Roasted Crickets
Served as a snack for any number of persons
Ingredients:
25 – 50 live crickets – or however many you wish to cook/serve
Salt, or any preferred seasoning that can be shaken or sprinkled onto the crickets after roasting
Directions:
Preheat oven to 200 degrees. Arrange the crickets on a cookie sheet, making sure none of them overlap. Bake at this low temperature for about 60 minutes, or until the crickets are completely dry or dry enough for personal taste.
Open the oven at the 45-minute mark and test a cricket to see if it’s dry enough by crushing it with a spoon against a hard surface or, if you prefer, between your fingers. The crickets should crush fairly easily. If not, place them back inside the oven until crisp.
Once the crickets are roasted and cooled, place a few between your palms and carefully roll them, breaking off the legs and antennae in the process. This ensures clean, crisp crickets without legs or antennae getting in the way.
Season them with salt, Kosher salt, sea salt, smoked salt or whatever sort of seasoning you wish. They are very good and healthy to eat as a roasted snack. Eat them on the spot or place them back into the freezer for future use.
Cricket Flour
Ingredients:
4 cups of flour
1 cup of roasted crickets (¼ – ½ cup of crickets to every cup of flour works well.)
Directions:
Break off the antennae and legs by gently rolling the cricket between your hands.
Once you collect enough crickets in a bowl proceed to crush either using a mortar and pestle or rolling pin on a hard surface.
Gather the crushed crickets – they should look like small specks (usually of dark brown color) and blend them well into the flour of your choosing.
Once you’ve blended the crickets with the flour you’re set to use it in any way you wish.
Hoppin’ Good™ Cricket Fried Rice
Serves 4 – 6
Ingredients:
4 cups cold cooked brown rice
1 ½ cups of roasted crickets (about 3 – 4 dozen)
1 cup chopped scallions
½ cup cooked corn kernels
2 large eggs
1 teaspoon Kosher salt
Powdered ginger to taste
Powdered coriander to taste
Garlic powder to taste
1 teaspoon fresh ground black pepper, or to taste
4 tablespoons oil for stir-frying, or as needed
1 ½ tablespoons light soy sauce or oyster sauce, as desired
Directions:
Wash and finely chop scallions. Lightly beat the eggs with salt, ginger, garlic powder, coriander and pepper.
Heat a wok or frying pan and add 2 tablespoons oil. When the oil is hot, add the eggs. Cook, stirring, until they are lightly scrambled but not too dry. Remove the eggs and wipe clean the wok or frying pan.
Add 2 tablespoons of oil. Add rice. Stir-fry for a few minutes, using wooden spoon to break it apart. Add crickets. Add scallions. Stir in soy sauce or oyster sauce as desired. Continue stirring for a few more minutes.
When the rice is heated through, add the scrambled egg back into the pan. Mix thoroughly. Stir in corn kernels. Serve hot.
This dish goes great with any other dish or appetizer, i.e., cooked greens, egg rolls, dumplings, etc.


February 22, 2014
Where Have All the Unions Gone?
Loss of union protection is catastrophic for millions of American workers with no way to protect themselves against layoffs and wage, benefit and pension cuts. In 2013, only 11.3% of US workers belonged to unions. Many Americans are unaware of the deliberate 95-year campaign by Wall Street to destroy the trade union movement. It all started in 1919, when the National Association of Manufacturers engaged Edward Bernays, the father of public relations, to destroy public support for a steel workers strike. Following a brief rise in union activism during the Great Depression, the campaign continued with the punitive 1947 Taft-Hartley Act, the expulsion of militant unionists during the McCarthy Era, and the cozy cold war collaboration between the CIA and AFL-CIO bureaucrats. The most decisive blow would be the trade liberalization of the 80s and 90s and the wholesale export of skilled union jobs to third world sweatshops.
Edward Bernays’ Campaign to Demonize Unions
In his 1995 book Taking the Risk Out of Democracy, the late Australian psychologist Alex Carey describes how the National Association of Manufacturers engaged Edward Bernays to launch a massive media campaign to reverse public support for steel workers striking for the right to bargain collectively. Bernays first got his start helping President Woodrow Wilson sell World War I to a strongly isolationist and antiwar American public. Following the war, he was immediately engaged by major corporate clients that included Procter & Gamble, CBS, the American Tobacco Company, Standard Oil, General Electric and the United Fruit Company.
Bernays is also regarded as the father of “consumerism” – the transformation of Americans from engaged citizens into passive consumers by bombarding them with thousands of pro-consumption messages. He was also instrumental in convincing doctors and dentists (without a shred of scientific evidence) that disposing of the industrial toxin fluoride in municipal water supplies would be good for people’s teeth.
His media campaign to convince the American public that striking workers were dangerous radicals, Bolsheviks and anarchists was an instant success. The anti-Red hysteria it created ushered in a decade of severe repression, enabling Bureau of Investigation director J. Edgar Hoover to launch a Red Scare and illegally arrest, detain and deport several hundred suspected radicals.
The 1947 Taft-Hartley Act
During the Great Depression of the 1930s, unions became popular again. Then, as now, corporations took advantage of high unemployment rates to cut wages, increase hours and force employees to work under unsafe sweatshop conditions. Led largely by the CIO (Congress of Industrial Organizations), organized labor fought back with scores of sit down and wildcat strikes.
Immediately following World War II, the National Association of Manufacturers sought to reverse union gains by ramming the Taft-Hartley Act through a Congress dominated by Republicans and conservative southern Democrats. Among other provisions restricting worksite unionization drives, Taft-Hartley prohibits mass picketing, as well as wildcat and sit-down strikes.
The McCarthy Era
The effect of the 1947 Taft-Hartley Act on union membership was almost immediate. In 1946 the Congress of Industrial Organizations (CIO) had 6.3 million members. By 1954, when it merged with the AFL, this number was down to 4.6 million, or 34.7% of the American workforce. The percentage steadily declined as union officials used the anticommunist hysteria of the McCarthy Era (1950-56) to expel militant trade unionists from their ranks. The original Taft-Hartley Act included a provision barring current or former Communist Party members from holding office in a labor union – a provision the Supreme Court struck down as unconstitutional in 1965.
Thanks to the Taft-Hartley Act and the purging of militant grassroots unionists, a trade union bureaucracy arose that felt closer to management than to the workers it supposedly represented. This stemmed, in part, from the perks officials received for delivering “labor discipline” (i.e., preventing disruptive industrial action). Thus, instead of lobbying to repeal Taft-Hartley and relying on a well-organized rank and file and industrial action, union officials became more focused on the “sweetheart deals” they made with managers.
Enter the CIA
According to former CIA officer Tom Braden, many AFL-CIO officers were also on the CIA payroll for their work with USAID in suppressing foreign unions with anti-US leanings. Braden bragged about this in the Saturday Evening Post in 1967. Led in its early years by prominent Wall Street lawyer Allen Dulles, the CIA has always played a major role in protecting Wall Street interests. It has a long history of overthrowing democratically elected governments that threaten US corporations with overseas investments (e.g., major oil companies, and the United Fruit Company and Coca-Cola in Latin America).
Killing Off American Manufacturing
With Reagan’s election in 1980, numerous trade laws protecting US industries and workers were repealed through the Caribbean Basin Initiative and the General Agreement on Tariffs and Trade. Clinton continued this process by fast-tracking both NAFTA and the World Trade Organization treaty through Congress. Once protective quotas and tariffs were repealed, there was nothing to stop Wall Street corporations from shutting down thousands of US factories and reopening them as sweatshops in the third world. In the process, millions of US workers lost union manufacturing jobs and took minimum wage jobs at McDonald’s and Wal-Mart.
The loss of the US manufacturing sector has clearly played a major role in the failed recovery and declining US global influence. This seems an enormous price to pay for the sake of destroying trade unions. Our children and grandchildren, who will reap the consequences, will not look kindly on the neoliberal presidents (Reagan, Clinton, both Bushes, and Obama) who enacted these disastrous policies.
photo credit: DonkeyHotey via photopin cc


The Role of Ideology in Inspiring Change
The space between the TV screen and the child is nothing less than sacred ground – Mr Rogers
Crossroads: Labor Pains of a New World View
Joseph Obeyon, 2012
Film Review
Crossroads is an exciting and surprisingly uplifting new documentary about the role of ideology in finding solutions to the urgent global crises humankind faces in the 21st century.
In bringing an evolutionary perspective to these urgent economic and ecological crises, the film offers a uniquely optimistic view of political and social change. Featuring a broad range of scientific experts, it focuses primarily on the role of ideology in preventing or facilitating change. For the last few centuries a competitive/individualistic view of ourselves was helpful in driving the engines of development and technological progress. However increasing evidence suggests that this widely embraced ideology is no longer sustainable.
This competitive/individualist worldview is also totally at odds with the collectivist/interdependent way of life our genes have programmed us for. Scientists have discovered that people share much of the same genetic code that enables schools of fish and flocks of birds to perform complex maneuvers as if they were a single organism. Primitive peoples have preserved the ability to see themselves this way, but it has been lost to most of industrialized society.
Crossroads stresses the role of television advertising – which pressures people to consume by making them feel insecure – in perpetuating this flawed individualistic view of ourselves. Constant bombardment with psychologically sophisticated messaging is far more powerful than actual experience, even though studies consistently show that people derive the most happiness from activities that connect them with other people.
The dilemma facing 21st century political and environmental activists is how to get large numbers of people to make major changes quickly. Crossroads frames this and the multiple crises humankind faces as questions to be answered, rather than problems. High levels of global unrest suggest a substantial proportion of the world’s population already knows the old answers don’t work any more. The secret to finding new answers, according to one social scientist interviewed, is to get people to answer the fundamental question of what it means to be human.
The film ends by asking whether enough of humankind can change quickly enough to save the human species. Obeyon clearly believes we can. He cites studies showing that a critical mass of only 10% of a population is necessary to bring about sweeping social change. The same studies reveal that below this threshold it appears as if no visible progress is being made.
He stresses that global political and business leaders won’t be leading the change: they have too much to gain from maintaining the status quo.
photo credit: vauvau via photopin cc
Crossposted at Daily Censored


February 21, 2014
An NSA-approved Guide to Revolution
Activists who advocate violent revolution don’t advertise their views on the Internet, for obvious reasons. That being said, Storm Clouds Gathering treads a really fine line with their recent Revolution: An Instruction Manual. They don’t exactly advocate using violence to dismantle corporate fascism. But they don’t really condemn it, either. Instead, they argue from the perspective that revolutions are mainly won by psychological means, and that it makes most sense to attack the state where it is weakest.
The filmmakers are totally non-ideological in their approach to dismantling capitalism. In fact, they begin with the assertion that any revolution with an inflexible, pre-ordained view of the desired outcome is doomed to failure.
They then share a general overview of their own vision – a loose confederation of self-governing communities similar to the Iroquois Confederacy. This was the model for the Articles of Confederation, the founding document of the United States of America before the bankers and mercantilists used the Constitution to strip the 13 original states of their power.
Audience Participation Required
The film is interactive and requires audience participation. In fact, it stops at the 1:47 mark until the viewer answers “yes” or “no” to whether they believe the system can be reformed. If they click “yes,” the video ends. I clicked “no.”
The strategy the filmmakers lay out for dismantling the corporate state involves removing, one by one, what they identify as the three “pillars of power”:
Control of the “public mind,” as it concerns patriotism and nationalistic beliefs, such as freedom, democracy and terrorism.
Control of money and finance through money creation, taxation and inflation.
A state monopoly on violence to compel obedience through fear.
How They Got Past the NSA Censors
The film finishes quite abruptly by recommending people read three books on revolution, including Gene Sharp’s From Dictatorship to Democracy. This was an extremely wise choice, as this is the training manual the State Department and CIA-linked foundations widely distributed to activists engaged in the “color” revolutions in Eastern Europe and the Arab Spring.
I have written at length about the CIA role in financing the nonviolent movement, as well as nonviolent guru Gene Sharp’s historic links with the Pentagon, State Department, and US intelligence.
Thierry Meyssan, editor of Voltaire Net, was the first to go public (in 2005) with Sharp’s longstanding links to the military-intelligence complex.* The only weakness of Meyssan’s original article is his failure to cite his references. I researched the sources and confirmed each of his original assertions for a 2012 Daily Censored article entitled The CIA and Nonviolent Resistance.
Also see How the CIA Promotes Nonviolence, The CIA Role in the Arab Spring and How Nonviolence Protects the State.
*In 2002, Meyssan’s The Big Lie was also the first to expose US intelligence involvement in 9-11.


February 20, 2014
Was Occupy Wall Street Coopted?
Occupy New Plymouth (NZ) Oct 15, 2011
Curious about where the Occupy movement had disappeared to, I recently ran across an article about a new project called Rolling Jubilee, in which a coalition of Occupy groups has joined up to pay off individuals’ personal debt. Rolling Jubilee is a project of Strike Debt, a group formed by that coalition in November 2012 to oppose all forms of debt imposed on society by banks.
The aim of Rolling Jubilee is to abolish millions of dollars of personal debt by purchasing it (at random) on the secondary debt market, as collection agencies do. The latter commonly purchase debt for as little as 1% of its value and then reap enormous profits by demanding debtors pay the full amount. Instead of seeking repayment from debtors, Rolling Jubilee simply erases the debt.
In its first six months of operation, Rolling Jubilee raised sufficient funds to buy and abolish more than $8.5 million worth of debt. The debt it has purchased and eliminated is listed on the Rolling Jubilee website. Most appears to be medical debt, i.e., debt incurred for treatments that aren’t covered by health insurance.
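The leverage involved is easy to see with a little arithmetic. The sketch below assumes the 1%-of-face-value secondary-market price quoted above (actual prices vary by debt type and age), and the function name is our own invention, not Rolling Jubilee’s:

```python
# How far donations stretch when debt sells for pennies on the dollar.
def debt_abolished(donations: float, price_per_dollar: float = 0.01) -> float:
    """Face value of debt a donation pool can buy and erase,
    given the price paid per dollar of face value."""
    return donations / price_per_dollar

print(debt_abolished(100))       # $100 in donations erases $10,000 of debt
print(debt_abolished(85_000))    # about the $8.5 million cited above
```

At a 1% price, roughly $85,000 in donations is enough to retire $8.5 million in face value, which is why the project scaled so quickly.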
A Far Cry from Ending Corporate Rule
At first glance Rolling Jubilee strikes me as a typical feel-good kind of project – like walking 20 miles for a cancer cure – that allows liberals to believe they are making positive change without threatening corporate interests in any way. The project is a far cry from Occupy Wall Street’s original goal of ending corporate rule. I honestly can’t see any way that paying off patients’ medical debt is going to help dismantle the corporate oligarchy that currently rules the industrialized world.
Banks and corporations seem to have the same reaction I do. They love Rolling Jubilee. Business Insider describes the project as brilliant. A Forbes column on the Rolling Jubilee featured the headline “Finally an Occupy Wall Street Idea We Can All Get Behind.”
According to Forbes, banks, credit card companies and student loan agencies can’t forgive debt because the IRS considers this kind of debt relief a “gift” and charges the debtor tax on it. This is utter nonsense, of course. It makes you wonder if the people who write for Forbes have ever met or talked to any unemployed or homeless people. There is no way the IRS is going to tax anyone without income or assets.
Making a Cottage Industry Out of Revolution
Twenty years ago this example of Occupy morphing into a less politically threatening pro-corporate entity would have been condemned as cooptation. However, in an era in which CIA-funded left-gatekeeping and democracy-manipulating foundations head up the nonviolent movement, cooptation no longer seems like the correct term. Maybe we need to invent a new one – pre-optation, perhaps?
February 19, 2014
Collapse: Revisiting the Adam and Eve Myth
A Short History of Progress
by Ronald Wright (2004 Carroll & Graf)
Book Review
The theme of A Short History of Progress is social collapse. In it, Canadian historical archeologist Ronald Wright summarizes humankind’s biological and cultural evolution, as well as tracing the role of ecological destruction in the collapse of some of the most significant civilizations (Sumer, Mesopotamia, Greece, Rome, Easter Island and the Mayan civilization). Exhaustively researched, the book advances the theory that many of the colossal blunders made by modern leaders are very old mistakes made by earlier civilizations. Wright starts with the mystery of the agricultural revolution that occurred around 10,000 BC, when Homo sapiens ceased to rely on hunting and berry-picking and began growing their own food. Twelve thousand years ago, the global population was still small enough that there was more than ample wild food to feed it. Yet for some reason, a half dozen human settlements in widely separated regions simultaneously domesticated plants and animals. Why?
The Importance of Stable Climate
Citing extensive geological and archeological evidence, Wright suggests plant and animal domestication may have been triggered by unprecedented climate stability. Prior to 10,000 BC, the earth’s climate was wildly unstable, with ice ages developing and abating over periods as short as a decade or so. These sudden periodic changes in climate forced our hunter-gatherer ancestors to continually migrate in search of food. The climate stabilization that occurred following the last ice age (around 10,000 BC) enabled them to settle in larger groups, save seeds to cultivate crops that took months to harvest, and engage in trade for other basic necessities.
Wright goes on to describe a number of diverse civilizations that arose and collapsed between 4,000 and 1,000 BC – and their unfortunate tendency towards mindless habitat destruction and runaway population growth, consumption, and technological development. In each case, an identical social transformation takes place as resources become increasingly scarce. As prehistoric peoples find it harder and harder to feed themselves, inevitably a privileged elite emerges to confiscate communal lands and enslave their inhabitants. They then install a despotic tyrant who hastens ecological collapse by wasting scarce resources on a spree of militarization and temple or pyramid building. This process is almost always accompanied by wholesale murder, torture, and unproductive wars.
Wright relates this typical pattern of ecological destruction and collapse to a series of “progress traps,” in which specific human inventions turn out to have extremely negative unintended consequences. Instead of fixing the underlying problem they’re meant to solve, the inventions create an even worse environmental mess. It’s a pattern so common in prehistory that it’s become enshrined in the Adam and Eve and similar creation myths. All describe how the quest for knowledge ended humankind’s access to freely available and abundant food and forced them to produce their own.
Our Ancestors Wipe Out the Neanderthals and Mammoths
According to Wright, the first of these “progress traps” was the invention of weapons (for hunting) by early Homo sapiens. Wright blames this early invention of weapons for the first (archeologically) recorded instance of genocide – namely the wiping out of Homo neanderthalensis (Neanderthal man) by Cro-Magnon man between 40,000 and 30,000 BC. This was followed by other important mass extinctions as Homo sapiens spread out across the globe between 30,000 and 15,000 BC. The most recent archeological evidence suggests the mammoth, camel and horse became extinct in North America during this period because of perfected hunting techniques that allowed human beings to carry out mass slaughters (involving as many as 1,000 mammoths or 100,000 horses simultaneously).
Some archeologists attribute the end of hunting as a predominant food source (in numerous regions simultaneously) and the rise of plant-based diets to the decline in game animals stemming from this indiscriminate slaughter. The birth of agriculture, in turn, leads to widespread deforestation and soil erosion in all the ancient civilizations, accompanied by soil salinization from over-irrigation. According to Wright, the entire cycle takes around a thousand years, which happens to be the average lifespan of most historic civilizations.
Turning Iraq Into a Desert
The first civilization to collapse in this way was Sumer (in southern Iraq), which flourished between 3,000 and 2,000 BC. The Sumerians invented irrigation, the city, the corporation (in the form of priestly bureaucracies), writing (for trade purposes), hereditary kings and slavery. By 2,500 BC, soil salinization (from irrigation) had caused a massive drop-off in crop yields. Instead of implementing environmental reforms, the ruling elite tried to intensify production by confiscating communal lands, introducing slavery and human sacrifice and engaging in chronic warfare.
From Sumer the cradle of civilization moved north to Mesopotamia (Babylon), in the region of northern Iraq and Syria, where humankind created one of the first man-made deserts out of a region lush in date palms and other native vegetation.
Around 1,000 BC, similar civilizations also appeared in India, China, Mexico, Peru and parts of Europe. The Greeks (around 600 BC) were the first with any conscious awareness that they were destroying their own habitat. Plato writes a vivid description of the dangers of erosion and runoff from deforestation. The Athenian leader Solon tried to halt increasing ecological devastation by outlawing debt serfdom, food exports, and farming on steep slopes. Pisistratus offered grants to farmers to plant olive trees for soil reclamation.
Wright makes a good case for similar environmental destruction, rather than barbarian invasion, causing Rome to collapse. By the time of Augustus, Italian land had become so degraded that Rome was forced to import most of its food from North Africa, Gaul, and other colonies.
The Role of the New World
The most interesting section of the book concerns the role the New World played in rescuing the environmentally decimated European civilization. According to Wright, it was mainly New World gold and silver that capitalized the industrial revolution. However he also stresses the importance of the New World foods that were added to the European diet at a point where the population had outstripped its food supply. Maize (sweet corn) and potatoes are twice as productive (in terms of calories per acre) as wheat and barley, the traditional European staples. He also makes the point – ominously – that, despite all our apparent technological progress, humankind hasn’t introduced one new food since the Stone Age. In fact, Homo sapiens hasn’t evolved culturally or intellectually since our ancestors failed to confront resource scarcity in a way conducive to their survival.
If anything, given mass extinctions, potentially catastrophic climate change, and a growing scarcity of energy, water and fertile soil, we seem to be repeating the old maladaptive pattern. As examples, Wright cites the idiotic war on terrorism, which has ironic parallels with the chronic warfare the Sumerians launched 4,000 years ago. He also cites the rise of the New Right and the folly of trying to address resource scarcity by consolidating wealth and power in the hands of a tiny elite.


February 18, 2014
Is a College Degree Worth the Cost?
The Best Educated Janitors in the World
Given the $962 billion Americans owe in student loan debt, it seems reasonable to ask what a college degree buys them in employability and future income.
Not much, according to a recent Online Degree feature revealing that 33,655 PhDs and 239,029 master’s degree recipients are on food stamps. American janitors are the most educated in the world, with 5,000 of them holding doctorates. According to the Bureau of Labor Statistics, approximately one-third of US college graduates work in jobs not requiring a bachelor’s degree.
Peter Schiff’s recent encounter with college grads in New Orleans is also extremely revealing:
photo credit: an untrained eye via photopin cc
Crossposted at Daily Censored and Veterans Today


February 16, 2014
Horror Film About Nuclear Waste
Into Eternity
Directed by Michael Madsen (2010)
Film Review
Into Eternity is an eerie account of Onkalo, the world’s first permanent nuclear waste repository. So-called “spent” fuel rods from nuclear energy plants remain radioactive for 100,000 years. Most of the radiation that has contaminated northern Japan post-Fukushima is from spent fuel rods being temporarily stored in water pools on the roof of one of the reactors. Exposed following the earthquake and tsunami, the fuel rods caught fire, releasing massive amounts of radiation.
There are an estimated 250,000 – 300,000 tons of nuclear waste lying around in cooling pools in countries that rely on nuclear energy to produce electricity. The scope of the problem is mind-boggling: 250,000 tons of highly radioactive material capable of wiping out all living things and contaminating adjacent agricultural lands and future crops for 100,000 years. The amount of waste increases daily, as the US and other countries merrily churn out spent fuel rods from existing – and new – nuclear reactors.
A Security Nightmare
As Fukushima and Into Eternity make clear, these temporary cooling pools are extremely vulnerable to natural and man-made disasters (e.g. earthquakes, volcanoes, tsunamis, wars, civil unrest). In a world on the brink of economic Armageddon, they are a security nightmare, owing to the extensive maintenance and surveillance they require. At present permanent underground storage is the only possible solution. The film briefly discusses reprocessing and transmutation, dismissing both as unfeasible: each reduces, without eliminating, the quantity of permanent radioactive waste. Reprocessing reduces the total quantity of nuclear waste by transforming it into plutonium, which takes one million years to degrade.
The History and Future of Onkalo
The Finnish and Swedish governments are collaborating to dispose of their own nuclear waste (6,000 tons) in a huge system of underground tunnels blasted out of solid bedrock in Olkiluoto, Finland. Work on the facility commenced in the 1990s. Once the spent fuel rods have been deposited, Onkalo will be cemented over, backfilled and decommissioned more than a century from now. No person working on the facility today will live to see it completed.
After outlining the immense danger posed by 250,000 – 300,000 tons of nuclear waste that will remain radioactive for 100,000 years, the film centers mainly on the debate over marking Onkalo to prevent future generations from inadvertently drilling into it. This is essential, as a new Ice Age is anticipated in 60,000 years, which will likely obliterate all Finnish cities for 10,000 years or so. Most ancient languages are forgotten in a matter of centuries. Beowulf and other literature written 1,000 years ago in Old English is virtually unreadable today.
It’s nearly impossible for human beings to conceptualize time spans beyond a few generations. The human species has changed drastically since it originated in Africa 100,000 years ago. If humans survive another 100,000 years, they will likely be as different from us as we are from our hairy ancestors.
More Sad than Scary
My personal reaction to this film was immense sadness, rather than horror. I cried through much of it. It forced me to confront that our planet’s 250,000 tons of nuclear waste – not catastrophic climate change or water or energy scarcity – is the single biggest factor threatening human survival and civilization. Unless some solution can be found before the global economic system implodes, our children and grandchildren will be left with a planet on which wide swathes of territory are totally uninhabitable.
Even more horrifying than the film is that it has received almost no mention in the US media. I guess the corporate media prefers Obama’s solution to the nuclear waste problem: denial. Obama has recently authorized billions of dollars of taxpayer subsidies to build new nuclear reactors.
I wonder what his children and grandchildren will say?


