The High Cost of Freedom from Fossil Fuels

Wanderings

by Walter Brasch

For a few hours on the afternoon of Nov. 1, the people of southern California were scared by initial reports of an alert at the San Onofre Nuclear Generating Station. An “alert” is the second of four warning levels.

Workers first detected an ammonia leak in a water purification system about 3 p.m. Ammonia, when mixed into air, is toxic. The 30 gallons of ammonia were caught in a holding tank and posed no health risk, according to the Nuclear Regulatory Commission (NRC).

During the 1970s and 1980s, at the peak of the nuclear reactor construction, organized groups of protestors mounted dozens of anti-nuke campaigns. They were called Chicken Littles, the establishment media generally ignored their concerns, and the nuclear industry trotted out numerous scientists and engineers from their payrolls to declare nuclear energy to be safe, clean, and inexpensive energy that could reduce America’s dependence upon foreign oil.

Workers at nuclear plants are highly trained, probably far more so than workers in any other industry, and operating systems are closely regulated and monitored. However, problems caused by human negligence, manufacturing defects, and natural disasters have plagued the nuclear power industry throughout its six decades.

It isn’t alerts like the one at San Onofre that are the problem; it’s the level 3 (site area emergency) and level 4 (general emergency) disasters. There have been 99 major disasters, 56 of them in the U.S., since 1952, according to a study conducted by Benjamin K. Sovacool, director of the Energy Justice Program at the Institute for Energy and the Environment. One-third of all Americans live within 50 miles of a nuclear plant.

At Windscale in northwest England, a 1957 fire destroyed the reactor core, releasing significant amounts of Iodine-131. At Rocky Flats near Denver, radioactive plutonium and tritium leaked into the environment several times over a two-decade period. At Church Rock, New Mexico, more than 90 million gallons of radioactive waste poured into the Rio Puerco, directly affecting the Navajo nation.

In central and northeastern Pennsylvania, the Three Mile Island (TMI) meltdown of 1979, the same year as the Church Rock disaster, released radioactive Cesium-137 and Iodine-131, as well as an excessive level of Strontium-90, into the ground. To keep waste tanks from overflowing, the plant’s operator dumped several thousand gallons of radioactive waste into the Susquehanna River. An independent study by Dr. Steven Wing of the University of North Carolina revealed that the incidence of lung cancer and leukemia downwind of TMI within six years of the meltdown was two to ten times that of the rest of the region.

At the Chernobyl meltdown in April 1986, about 50 workers and firefighters died lingering and horrible deaths from radiation poisoning. Because of wind patterns, about 27,000 persons in the northern hemisphere are expected to die of cancer, according to the Union of Concerned Scientists. An area within about 18 miles of the plant remains uninhabitable. The nuclear reactor core is now protected by a crumbling sarcophagus; a replacement shelter is not yet complete, and even the new shield is expected to crumble within a century. The current director at Chernobyl says it could be 20,000 years before the area again becomes habitable.

In March, a magnitude-9.0 earthquake and the ensuing 50-foot tsunami led to the meltdown of three of Japan’s Fukushima Daiichi nuclear reactors. Japan’s nuclear regulatory agency reported that 31 radioactive isotopes were released; in contrast, 16 radioactive isotopes were released by the A-bomb that hit Hiroshima on Aug. 6, 1945. The agency also reported that the radioactive cesium released was almost 170 times the amount from the A-bomb, and that the release of radioactive Iodine-131 and Strontium-90 was about two to three times the A-bomb’s level. The release into the air, water, and ground included about 60,000 tons of contaminated water. The half-lives of Sr-90 and Cs-137 are about 30 years each, so the full effects may not be known for at least two generations. Twenty-three nuclear reactors in the U.S. have the same design, and the same design flaws, as the Daiichi reactors.

About five months after the Daiichi disaster, the North Anna plant in central Virginia declared an alert following a magnitude-5.8 earthquake that was felt throughout the mid-Atlantic and lower New England states. The earthquake cracked buildings and shifted canisters holding spent fuel. The North Anna plant was designed to withstand an earthquake of only magnitude 5.9-6.2. More than 1.9 million persons live within a 50-mile radius of the plant, according to 2010 census data.

Although nuclear plant security is designed to protect against significant and extended forms of terrorism, the NRC believes as many as one-fourth of the 104 U.S. nuclear plants may need upgrades to withstand earthquakes and other natural disasters, according to an Associated Press investigation. About 20 percent of the world’s 442 nuclear plants are built in earthquake zones, according to data compiled by the International Atomic Energy Agency.

The NRC has determined that the East Coast plants most in danger of being compromised by an earthquake are in the extended metropolitan areas of Boston, New York City, Philadelphia, Pittsburgh, and Chattanooga, Tenn. The highest risk, however, may be at California’s San Onofre and Diablo Canyon plants, both built near major fault lines. Diablo Canyon, near San Luis Obispo, was even built by workers who misinterpreted the blueprints.

Every nuclear spill affects not just those in the immediate evacuation zone but people throughout the world: prevailing winds can carry airborne radiation thousands of miles from the source, and the world’s water systems can carry radioactive materials into the drinking water and agriculture of most nations. After every nuclear disaster, the governments eventually declare the immediate area safe. But animals take far longer than humans to return to the area. If they can figure out that radioactivity released into the water, air, and ground is a health hazard, certainly humans can too.

Following the disaster at Daiichi, Germany announced it was closing its 17 nuclear power plants and would expand development of solar, wind, and geothermal energy sources. About the same time, Siemens abandoned financing and building nuclear power plants, leaving only American-based Westinghouse and General Electric, which own or have constructed about four-fifths of the world’s nuclear plants, and the French-based Areva.

The life of the first nuclear plants was about 30-40 years; newer plants have a 40-60 year life. After that time, they become so radioactive that the risk of radiation poisoning outweighs the benefits of continued operation. So the operators seal the plant and abandon it, carefully explaining to the public the myriad safety procedures in place and the federal regulations. Cooling and decommissioning take 50-100 years before the plant is safe enough for individuals to walk through it without protection. More critical, there still is no safe technology for handling spent fuel rods.

The United States has no plans to abandon nuclear energy. The Obama administration has proposed financial assistance to build the first nuclear plant in three decades, and a $36 billion loan guarantee for the nuclear industry. However, the Congressional Budget Office believes the default rate on such loans could be as high as 50 percent. Each plant already receives $1-1.3 billion in tax rebates and subsidies. Nevertheless, in the past three years, plans to build nuclear generators have been abandoned in nine states, mostly because major financiers believe the return on investment is lower, and construction and maintenance costs higher, than expected.

A Department of Energy analysis revealed the budget for 75 of the first plants was about $45 billion, but cost overruns ran that to $145 billion. The last nuclear power plant completed was the Watts Bar plant in eastern Tennessee; construction began in 1973 and was completed in 1996. Part of the federal Tennessee Valley Authority, the Watts Bar plant cost about $8 billion and produces 1,170 megawatts from its only reactor. Work on a second reactor was suspended in 1988 because of a lack of need for additional electricity. Construction resumed in 2007, with completion expected in 2013; finishing the reactor, which was about 80 percent complete when work was suspended, is estimated to cost an additional $2.5 billion.

The cost to build new power plants is well over $10 billion each, with a proposed cost of about $14 billion to expand the Vogtle plant near Augusta, Ga. The first two units had cost about $9 billion.

Added to the cost of every plant are decommissioning costs, averaging about $300 million to more than $1 billion, depending upon the amount of energy the plant is designed to produce. The nuclear industry proudly points to studies showing that the cost to produce energy from nuclear reactors is still lower than the costs from coal, gas, and oil. The industry also rightly points out that nuclear plants produce about one-fifth of the nation’s electricity, with none of the emissions of the fossil fuels.

For more than six decades, this nation essentially sold its soul for what it thought was cheap energy that may not be so cheap, and clean energy that is not so clean.

It is necessary to ask the critical question. Even if there were no human, design, and manufacturing errors; even if there could be assurance of no accidental leaks and spills of radioactivity; even if there were a way to safely and efficiently dispose of long-term radioactive waste; even if all of this were possible, can the nation, struggling in a recession while giving subsidies to the nuclear industry, afford to build more nuclear generating plants at the expense of solar, wind, and geothermal energy?

[Walter Brasch’s latest book is Before the First Snow, a fact-based novel that looks at the nuclear industry during its critical building boom in the 1970s and 1980s.]

Walter M. Brasch, Ph.D.

Latest Book: Before the First Snow: Stories from the Revolution

(www.greeleyandstone.com)

www.walterbrasch.com

www.walterbrasch.blogspot.com

A Patch of Pumpkin Heads

by Rosemary and Walter Brasch

                In a few days, millions of children will put on costumes, go door to door, and shout “trick or treat.” By Nov. 1, it’ll be over.

           But, it won’t be over for Americans who will face presidential candidates for the next year. The candidates will continue to try to mask their true selves, while luring us with treats that disguise tricks. Let’s see what each of the candidates might be wearing for the coming year.

            President Obama could dress as a stable boy. Since his first day on the job, he’s had to shovel whatever it is that was left for him in the stable. His opponents, however, think he should dress up as Pinocchio, with an exceptionally long wooden nose, and carrying a hammer and sickle.

           Rick Santorum had begun fading away after he was trounced in a Senate re-election campaign in Pennsylvania, too reactionary even for the Republicans. Wrap him in bandages as the Invisible Candidate.

           The other Rick in the race is Perry. For a while, he was the leader of the pack, until the other candidates ganged up on him. Moderates thought he was too reactionary; the extreme right wing thought he was too liberal. Dress him in a helmet, black leather jacket, and jeans; etch a few tattoos onto his body; and encase him in a sandwich board. For a few brief shining moments, he was everything that Camelot wasn’t.

The current front-runner is Herman Cain, whose mask is a cloth pizza slice, cut to the 9’s. But since he’ll be a passing pizza, as the Republican voters love and unlove their front runners, perhaps he could also wear a half-eaten slice with a red bull’s eye on his back.

           Michele Bachmann has become one-with-a-teapot. Every voting citizen is likely to see her during the coming year spewing scalding steam, but unable to make quality tea.

Dr. Ron Paul could wear a surgeon’s scrubs, with a lot of fringe, able to leap onto any patient to cut fat and some muscle.

           Jon Huntsman, perhaps the most intelligent and most civil of the candidates, could dress in a three-piece striped pants suit of the diplomat he once was. But, since civility isn’t a trait among this year’s Republican crop, the other candidates will probably throw a potato sack over him and bury him in the dirt.

           The cast from The Wizard of Oz always presents good costume possibilities.

           Mitt Romney, once standing straight, is now leaning so far right that he is likely to be kissing the floor soon. Perhaps he could dress as the Cowardly Lion and hope to find some courage.

           It’s too obvious to dress Newt Gingrich as a salamander, none of whom have monogamous relationships. But it is possible that this incarnation of the former House speaker could wear the mask of Dick Cheney, the man without a heart, who dresses as the Tin Man.

Dorothy, the sweet innocent with intelligence and compassion, isn’t in the running for the Republican nomination. Sens. Susan Collins, Olympia Snowe, and Lisa Murkowski are all possible Dorothys, but they have no reason to dress up, since the Republican Party doesn’t like anything sweet and moderate.

           The Wizard manipulating everyone might be Roger Ailes, the brilliant president of Fox News. But, since we are writing the story, we’ll make this wizard evil, blustery, and dense. Cerberus, the three-headed vicious dog who prevents souls condemned to Hell from ever escaping, could be the disguise that best identifies Rush Limbaugh, Glenn Beck, and Sean Hannity.

           And, of course that leaves just one main character from the Oz saga, the Scarecrow without a brain. Need anyone look farther than the Alaskan Tundra for the one most likely to seize all the treats she can and still trick the people?

[Walter Brasch’s latest book is Before the First Snow, a look at America between 1964 and 1991, the eve of the Persian Gulf War. Rosemary Brasch is a retired labor grievance officer and Red Cross family services specialist.]

Iraq: Just Another War Without an End

by Walter Brasch

We know the names of every one of the 4,479 Americans who were killed and the 32,200 who were wounded, both civilian and military, between March 20, 2003 and Oct. 21, 2011, the day President Barack Obama, fulfilling a campaign promise, declared the last American soldier would leave Iraq before the end of the year.

We know Second Lieutenant Therrel Shane Childers was the first American soldier killed by hostile fire in Operation Iraqi Freedom.

On March 21, 2003, less than a day after the U.S.-led invasion, Childers was shot in the stomach by hostile forces while leading a Marine platoon to secure an oil field in southern Iraq. His father, Joseph, told NPR that it had been his son’s dream to lead Marines into combat.

Childers, from Gulfport, Miss., had enlisted in the Marines 12 years earlier, was a security guard at the Geneva consulate and the Nairobi embassy, fought in the Persian Gulf War, and then attended the Citadel on a special program that allows enlisted personnel to be commissioned upon graduation. He was a French major and on the Dean’s List. Childers, who had wanted to be a horse trainer when he retired from the Marines, was 30 years old when he died. The Marines promoted him to first lieutenant posthumously.

On the day Childers was killed, 12 men (seven from the United Kingdom, one from South Africa, and four from the U.S.) died in a helicopter crash near Umm Qasr, a port city in southern Iraq. At the time, the Marine Corps called the crash of the CH-46E Sea Knight accidental, but didn’t elaborate.

About the time the helicopter crashed, Lance Corporal José Antonio Gutierrez, a 22-year-old Marine, was killed by what is euphemistically known as “friendly fire.” He was an orphan from Guatemala who had illegally crossed into the United States from Mexico, lived on the streets of San Diego and Los Angeles, was granted a temporary visa, lived with a series of foster families, graduated from high school, and began attending college, hoping to become an architect. The U.S. granted him citizenship posthumously.

On the second day of the war, three more Americans and six from England were killed. On the third day, 30 more Americans and four British were killed. By the end of March, 92 were killed.

One month before the invasion, Defense Secretary Donald Rumsfeld had declared the upcoming war, which he warned would be a “shock and awe” strategy, might last “six days, maybe six weeks; I doubt six months.”

On May 1, 2003, aboard the U.S.S. Abraham Lincoln off the coast of San Diego, President George W. Bush, decorated in flight gear, declared “Mission Accomplished.” Official military records show that when the President made his announcement, 172 Coalition troops had been killed. More than 4,600 American and allied soldiers would die in Iraq after that proclamation of bravado; more than 31,500 Americans would be wounded, many permanently disabled.

We know the oldest American soldier to die in combat was 60; the youngest were 18, and there were 34 of them. We know that 476 of those killed were from California, and that Pennsylvania and Florida each had 176 deaths by the time the President announced full withdrawal from Iraq.

There are names we don’t know. We don’t know the names and life stories of the 4.7 million refugees, nor of the two million Iraqis who fled the violence caused by the Coalition invasion. We don’t know the names of the orphaned children, one-third of all of Iraq’s youth. We don’t know the names of the 100,000-150,000 civilians killed. We don’t have accurate records of the more than a million who were wounded. It no longer matters who killed or wounded them, who destroyed their lives and property, whether American, allied, Shia, Sunni, insurgent, criminal, or al-Qaeda. It doesn’t matter whether they died from IEDs, suicide bombers, gunshots, artillery, bombs, or missiles. In war, they’re simply known as “collateral damage.”

In Afghanistan, 2,769 Coalition troops had been killed, 1,815 of them American, by the day President Obama announced the withdrawal from Iraq. There are already 14,343 wounded among the Coalition forces. Between 36,000 and 75,000 Afghan civilians have been killed by insurgents and Coalition troops during the past decade, according to the United Nations. President Obama told the world that the war in Afghanistan would continue at least two more years.

You can try to sanitize the wars by giving them patriotic names: Operation Iraqi Freedom, Operation Enduring Freedom. But that doesn’t change the reality that millions of people of every demographic have been affected. War doesn’t discriminate. The dead on all sides are physicians and religious leaders; tradespeople, farmers, clerks, merchants, teachers, and mothers. And they are babies and students. We don’t know what they might have become had they been allowed to grow up and live a life of peace, one without war.

We also don’t yet know who will be the last American soldier to be killed in Iraq. We don’t know how Post-Traumatic Stress Disorder (PTSD) will affect the one million soldiers who were called for as many as seven tours of duty, nor when the last Iraq War veteran will die from permanent injuries. And we will never know how this war will affect the children and grandchildren of the veterans.

But there is one more thing we do know. A year before José Antonio Gutierrez was killed, he had written a “Letter to God” in Spanish. Translated, it read: “Thank you for permitting me to live another year, thank you for what I have, for the type of person I am, for my dreams that don’t die. . . . May the firearms be silent and the teachings of love flourish.”

[Walter Brasch first began writing about war in 1966. He wishes he didn’t have to. His latest book is Before the First Snow, a novel that focuses upon America between 1964 and 1991, the eve of the Persian Gulf War.]

 

Banning the First Amendment

by Walter Brasch

Parents demanded it be banned.

School superintendents placed it in restricted sections of their libraries.

It has been the most challenged book in four of the past five years, according to the American Library Association (ALA).

“It” is a 32-page illustrated children’s book, And Tango Makes Three, by Peter Parnell and Justin Richardson, with illustrations by Henry Cole. The book is based upon the real story of Roy and Silo, two male penguins, who had formed a six-year bond at New York City’s Central Park Zoo, and who “adopted” a fertilized egg and raised the chick until she could be on her own.

Gays saw the story as a positive reinforcement of their lifestyle. Riding to rescue America from homosexuality were the biddies against perversion. Gay love is against the Bible, they wailed; the book isn’t suitable for the delicate minds of children, they cried as they pushed libraries and schools to remove it from their shelves or at the very least make it restricted.

The penguins may have been gay, or maybe they weren’t. It’s not unusual for animals to form close bonds with others of their own sex. But the issue is far greater than whether the penguins were gay or whether the book promoted homosexuality as a valid lifestyle. People have an inherent need to defend their own values, lifestyles, and worldviews by attacking others who hold a different set of beliefs. Banning or destroying free speech and the freedom to publish is one of the ways people believe they can protect their own lifestyles.

During the first decade of the 21st century, the most challenged books, according to the ALA, were J.K. Rowling’s Harry Potter series, apparently because some people believe fictionalized witchcraft is a dagger into the soul of organized religion. Stephenie Meyer’s Twilight series was the 10th most challenged in 2010. Perhaps some parents weren’t comfortable with their adolescents having to make a choice between werewolves and vampires.

Among the most challenged books is Ray Bradbury’s Fahrenheit 451, the vicious satire about firemen burning books to save humanity. Other books that are consistently among the ALA’s list of most challenged are Brave New World (Aldous Huxley), The Chocolate War (Robert Cormier), Of Mice and Men (John Steinbeck), I Know Why the Caged Bird Sings (Maya Angelou), Forever (Judy Blume), and The Adventures of Huckleberry Finn (Mark Twain), regarded by most major literary scholars as the finest American novel.

Name a classic, and it’s probably on the list of the most challenged books. Conservatives, especially fundamentalist religious conservatives, tend to challenge more books. But challenges aren’t confined to any one political ideology. Liberals are frequently at the forefront of challenging books that don’t agree with their own social philosophies. The feminist movement, while giving the nation a better awareness of the rights of women, wanted to ban Playboy and all works that depicted what it believed were unflattering images of women. Liberals have also attacked the works of Joel Chandler Harris (the Br’er Rabbit series), without understanding history, folklore, or the intent of the journalist-author, who was well regarded as a liberal for his era.

Although there are dozens of reasons why people say they want to restrict or ban a book, the one reason that threads its way through all of them is that the book challenges conventional authority or features a character who is perceived to be “different,” who may give readers ideas that many see as “dangerous.”

The belief there are works that are “dangerous” is why governments create and enforce laws that restrict publication. In colonial America, as in almost all countries and territories at that time, the monarchy required every book to be licensed, to be read by a government official or committee to determine if the book was suitable for the people. If so, it received a royal license. If not, it could not be printed.

In 1644, two decades before his epic poem Paradise Lost was published, John Milton wrote a pamphlet, Areopagitica, to be distributed to members of Parliament, against a recently enacted licensing law. In defiance of the law, the pamphlet was published without license. Using Biblical references and pointing out that the Greek and Roman civilizations didn’t license books, Milton argued, “As good almost kill a man as kill a good book; who kills a man kills a reasonable creature, God’s image,” he told Parliament, “but he who destroys a good book kills reason itself, kills the image of God.” He concluded his pamphlet with a plea: “Give me the liberty to know, to utter, and to argue freely according to conscience, above all liberties.”

A century later, Sir William Blackstone, one of England’s foremost jurists and legal scholars, argued against prior restraint, the right of governments to block publication of any work they found offensive for any reason.

The arguments of Milton and Blackstone became part of the foundation of a new country, the United States of America, and of its First Amendment.

Every year, at the end of September, the American Library Association sponsors Banned Books Week and publishes a summary of book challenges. And every year it becomes more obvious that those who want to ban books, sometimes building bonfires and throwing books upon them as Nazi Germany did, fail to understand the principles upon which this nation was created.

[Walter Brasch was a newspaper and magazine reporter and editor before becoming a professor of mass communications, with specialties in First Amendment and contemporary social issues. His current book is the mystery novel, Before the First Snow, a look at the 1960s, and how issues unresolved during those years are affecting today’s society.]

         

The Mugging of SpongeBob SquarePants

by Walter Brasch

SpongeBob SquarePants may be hazardous to your mental development, if you’re a four-year-old. At least that’s what two psychologists at the University of Virginia claim, based upon a study they conducted that may have as many holes as the average sponge who lives under the sea.

In the first paragraph of an article published this week in the academic journal Pediatrics, Angeline S. Lillard and Jennifer Peterson set up their study with a pick-and-choose, somewhat slanted view of television. According to these psychologists, “correlational studies link early television viewing with deficits in executive function . . . a collection of prefrontal skills underlying goal-directed behavior, including attention, working memory, inhibitory control, problem solving, self-regulation, and delay of gratification.” Translated into English, we conclude that psychologists don’t speak English.

To make sure no one misreads the study as anything but pure empirical science, they toss in “covariant assessment,” “covariate,” “post hoc analyses,” “backward digit span,” “encoding,” “cognitive depletion,” and something known as the “Tower of Hanoi,” not to be mistaken, apparently, for the Hanoi Hilton, or the Tower of Babel, which this study seems most likely to emulate.

For their subject group, they rounded up four-year-olds from “a database of families willing to participate.” Three groups of children were given the same four separate tasks. Those who watched a truncated version of a “SpongeBob” cartoon, which has scene changes an average of every 11 seconds, fared worse on the measurements than did the group that watched a more “realistic” and “educational” PBS cartoon (“Caillou”), which had an average scene change of every 34 seconds. The third group (the “control” group) drew pictures and participated in all the tasks. On all four tests, “SpongeBob” lost. The fact that the researchers labeled “Caillou” educational could reveal preconceived bias; even a cursory look at “SpongeBob,” although primarily entertainment, reveals numerous social and educational issues that could lead to further discussion.

The pre-schoolers were mostly White and from middle- and upper-class families; thus, there was no randomly selected group, something critical in most such studies. The researchers do acknowledge this, as well as a few other defects in the study itself. Possibly salivating over future grants, they tell us that “further research . . . is needed.”

The reality may be not that four-year-olds who watch “SpongeBob” and similar cartoons have developmental deficits, but that they are far more interested in the cartoon than in other activities and temporarily suspend those “good quality” activities while they remember the cartoon and think about the events and issues SpongeBob and the cast got into. The researchers measured the children’s responses shortly after they watched the cartoons; measurements a few hours or a week later might have given different results.

Nevertheless, the researchers, hung up on standard deviations, regression analysis, and Cronbach’s Alpha, among other empirical tests, didn’t do the most basic of all research. They didn’t ask the children what they thought about the cartoons, nor any questions leading to why the children who viewed “SpongeBob” may not have performed as well as the other two groups on tests that may or may not be of value. It’s entirely possible that watching fast-paced, well-written, tightly directed animated cartoons may be more fun, and more productive, than watching slower-paced educational cartoons. But we don’t know, because the research was merely quantified.

The wounded response by Nickelodeon, which airs “SpongeBob SquarePants,” isn’t much better than the academic study. Squeezed into a sentence, the comment is that the cartoon is for 6-to-11-year-olds, not the four-year-olds who were tested. The Nickelodeon PR machine wants us to believe that even if everything the researchers said were true, it doesn’t matter, because the cartoon isn’t aimed at four-year-olds. Apparently, even if older siblings are watching “SpongeBob,” or their parents are watching horror, adventure, or war movies, it doesn’t matter, because those forms of entertainment aren’t for four-year-olds.

For more than eight decades, animated cartoons have come under fire from all kinds of academic researchers and certain “we-do-good” public groups. From 1930 to 1968, the Hays Office, ensconced in Puritan ideals of morality, censored films and cartoons for all kinds of reasons. By the 1960s, academic researchers began questioning the violence in cartoons, focusing primarily upon the Warner Brothers characters. For a few years, television programmers, either believing themselves to be great pillars of morality or afraid of losing sponsors, forcibly retired many of the most popular cartoons from the screen.

At least half of the studies concluded that watching violence could be one of the factors leading to violent acts; another group of studies showed little correlation. But, stripping away the academic verbiage, the most logical conclusion of all the studies that denuded a small forest was that persons predisposed to violence may become violent if exposed to violence in cartoons. Certainly, watching Road Runner/Wile E. Coyote cartoons won’t cause a Quaker to go out and mug Baptists.

The mugging that SpongeBob (and other quick-sequencing characters) received is another attempt to quantify life by excising a small part of it, running tests, and trying to explain human cognition and development without understanding humans.

[Walter Brasch has a Ph.D. in mass communication. That means during his career he has been subjected to more than his fair share of annoying academic studies. Among his 16 books, he is the author of Cartoon Monickers: A History of the Animation Industry, and Before the First Snow, a novel about the history of America and its counter-culture between 1964 and 1991.]

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Walter M. Brasch, Ph.D.

Latest Book: Before the First Snow: Stories from the Revolution

(www.greeleyandstone.com)

www.walterbrasch.com

www.walterbrasch.blogspot.com

www.facebook.com/walterbrasch

http://www.youtube.com/watch?v…

 

Toxic Lead to Cover Iowa Killing Fields

by Walter Brasch

Iowa, which gave us the carnival known as the Iowa Straw Poll and artery-clogging deep-fried butter, will unleash another health problem, beginning Sept. 1.

The Iowa legislature last year approved a dove hunting season, the first in more than nine decades. However, the state’s Department of Natural Resources (DNR) and the Natural Resources Commission banned the use of lead shot and bullets.

That led to an all-out assault by the National Rifle Association (NRA) and the U.S. Sportsman’s Alliance (USSA).

In a letter to Gov. Terry Branstad, the NRA underscored its opposition with a veiled threat, calling the ban on lead ammunition an “attack [on] our freedoms.”

“Absurd,” replied Robert Johns of the American Bird Conservancy, who explained that “the NRA continues to deliberately miscast the lead-versus-nonlead ammunition issue as an attack on hunting.” There is nothing in the Constitution or in any federal court decision that would prohibit the banning of any specific kind of ammunition.

The NRA blatantly suggested the ban on lead shot “is designed to price hunters out of the market and keep them from taking part in traversing Iowa’s fields and forests.” For its “evidence,” it pointed out the cost of non-toxic ammunition is higher than ammunition made of lead. However, the use of non-toxic shot results in only a 1-2 percent increase in total costs for hunters, according to a study conducted by the National Wildlife Research Centre, certainly not enough to justify the NRA’s paranoid panic that non-toxic bullets will lead to a decrease in hunting.

Iowa’s DNR, the NRA claimed, was echoing not just environmental extremism but “the unscientific battle cry of the anti-hunting extremists.”

Contrary to NRA and USSA statements, there are several hundred scientific studies that conclude that lead shot is a health and environmental danger. Lead can cause behavioral problems, learning disabilities, reduced reproduction, neurological damage, and genetic mutation. For those reasons alone, the U.S. bans lead in gasoline, water pipes, windows, pottery, toys, paint, and hundreds of other items.

“Wildlife is poisoned when animals scavenge on carcasses shot and contaminated with lead-bullet fragments, or pick up and eat spent lead-shot pellets[,] mistaking them for food or grit,” the Center for Biological Diversity points out. As many as 20 million birds and other animals die each year from lead poisoning, says the CBD.

Humans can be poisoned by eating animals that have eaten the pellets from the ground or that have fed on decaying carcasses of birds shot with lead ammunition. Iowa is one of only 15 states without any regulation banning lead in shot and ammunition. Most European countries ban the use of lead shot for hunting.

The U.S. Fish and Wildlife Service in 1991 banned the use of lead shot in all waterfowl hunting. The NRA screamed its opposition at that time. However, the ban didn’t lead to a reduction of hunting or hunters, nor did it violate any part of the Constitution.

R.T. Cox, in his column, “The Sage Grouse,” notes that “bird hunters can leave 400,000 pellets per acre of intensely hunted areas.” About 81,000 tons of lead shot are left on shooting ranges each year, according to the Environmental Protection Agency. Part of the reason for so much lead shot on the ground is that doves, which can fly up to 50 miles per hour and make sharp turns, are difficult to hit. While hunters may claim they shoot the birds as a food source, such claims are usually blatant lies meant to hide the reality that the 20 million doves killed each year are nothing more than live targets. The five-ounce mourning dove, hit by shot, provides little usable meat. The NRA even advises hunters that for health reasons, they should “cut away a generous portion of meat around the wound channel.”

Lead on the dove killing fields isn’t the only problem. An investigation by the North Dakota Dept. of Health in 2007 revealed that 58 percent of venison donated to food banks by the Safari Club contained lead fragments. During the past decade, 276 California condors were found to have had lead poisoning; there are fewer than 400 in the state. California enacted a ban on lead shot in 2007.

There are alternatives to using lead. Non-toxic bullets and shot are made from tungsten, copper, and steel, without the negative health problems. While some hunting advocates maintain that lead bullets are significantly better in the field, there is no evidence to suggest that “green” ammunition results in fewer kills.

Nevertheless, disregarding the scientific evidence and facing the NRA’s wrath, Branstad agreed with a legislative panel’s decision to restore lead-shot hunting, saying that the state’s professional wildlife conservationists had exceeded their authority.

Andrew Page, a senior director for the Humane Society of the United States, has another opinion, one far more logical than the NRA/USSA rants: “If hunters are conservationists as they say they are, they should be the first to stand up and say they won’t poison wildlife or the ecosystem.”

‘Step Right up!’ Snake Oil for Sale

by Walter Brasch

The Tea Party, mutant spawn of the Republicans, held its spineless parents and the nation hostage during the debt ceiling crisis, and is now demanding an even greater ransom.

Flushed with what they mistakenly believe is success, they have launched an all-out assault upon the presidency. Their generals, fattened by Iowa corn and midway schmaltz, are Michele Bachmann, Rick Perry, Rick Santorum, and Herman Cain. Sarah Palin, hovering near the battlefields to soak up the media sunlight, much like a black hole absorbs all energy and light from nearby stars, is waiting to see how the war goes (and if she can write some intelligent sentences) before deciding to re-enter battle.

Bachmann is the winner of the strangest political non-election in the country, the Iowa Straw Poll. She won the race the old-fashioned way. She bought it.

To make sure that Iowans entered the Tents of Instant Gratification and, thus, cast their ballots the right way, the candidates, who paid $15,000-$31,000 to rent space at Iowa State, provided food, music, and carnival fun for the voters. Bachmann had a petting zoo, and drew fans to a concert by country superstar Randy Travis. Cost of the banquet: $30 a ticket.

To assure there were enough votes, Bachmann’s campaign, like all other campaigns, paid for the $30 admission tickets. That would be $144,690 for her 4,823 votes, plus several hundred thousand dollars in related campaign expenses, which included renting charter buses to bring voters from throughout Iowa to Ames.

But, Iowans aren’t stupid. Many wanted to see Randy Travis and eat the food of politics but didn’t plan to vote for Bachmann. About 6,000 persons took the “free” $30 tickets. Thus, she officially paid $180,000, $37.32 a vote; unofficially, with all expenses figured in, the cost could easily have been well over $200 a vote so she could be the winner and earn the title of Media Darlin’ of the Week.

The establishment media generally avoided Ron Paul, the second-place finisher, who “only” got 4,671 votes (27.7 percent), 152 fewer than Bachmann. Paul is a pariah in the Republican Party, and something the media can’t figure out, because he actually has a core set of principles, which sometimes leads him to ally with liberals, but for different reasons.

Third place, with 13.6 percent of the vote, went to Tim Pawlenty, who, according to numerous media pundits, is not charismatic enough to be a serious contender; he didn’t drink much of the tea and dropped out of the race after spending about $1 million in Iowa. Not dropping out were Tea Party favorites Rick Santorum (9.8 percent) and Herman Cain (8.6 percent), who lured voters into his tent with free Godfather’s Pizza. Mitt Romney, who had spent about $2 million on the 2007 Straw Poll, skipped this year’s non-binding poll and finished behind Rick Perry. Perry, dripping tea with every statement he makes, entered the presidential race only after the Iowa Straw Poll, yet still drew 718 write-in votes, 4.3 percent of the total. Nevertheless, Romney is still believed to be the front-runner.

Thus, going into the primary season, the Tea Party can arouse itself with Bachmann, Perry, Santorum, Cain, and maybe Palin. Not identified with the Tea Party, but in its gravitational pull, are Romney, Jon Huntsman, and whatever is left of Newt Gingrich’s chances.

The Tea Party began a few months after Barack Obama was elected president, with a stated purpose to reduce wild government spending. But its deep structure shows an amorphous bunch of white middle-class ultra-conservatives, aided by upper-class political consultants and media manipulators, who have developed the ability to sound impressive with only half-truths behind their rants and chants, and a zealous determination to keep President Obama out of a second term.

During the debt ceiling crisis, Tea Partiers refused to budge on their demands: not raising the debt ceiling, cutting numerous social and educational programs, and holding firm to the Bush tax cuts for the wealthy. Everyone must cut back, especially during economic crises, they bleated. Austerity is their mantra.

But, based upon their extravagant lifestyle and the wild spending they did in Iowa, shouldn’t their mantra now be “hypocrisy”?

A Punishing Educational Curriculum

by Walter Brasch

           With the nation’s unemployment rate hovering at about 10 percent, recent high school graduates are escaping reality by going to college, and college grads are avoiding reality by entering grad school. The result is that it now takes an M.A. to become a shift manager at a fast food restaurant.

           Colleges have stayed ahead of the Recession by becoming business models, where students are “inventory units,” and success is based upon escalating profit. Increasing the number of incoming units, class size, and tuition, while not increasing teaching and support staff, leads some colleges to believe they are solvent in a leaking economy. Budgets for academics are decreasing; budgets for dorms are increasing. Enrollment in degree-granting institutions is expected to be about 19.1 million in 2012, an increase of about 25 percent from 2000, according to the National Center for Education Statistics.

           Desperate to destroy their image as places of scholarship, colleges are using the 98.6 admissions criterion: admit almost anyone with a body temperature. Colleges may claim they admit only students with at least a 3.0 grade point average, which at some high schools is about half the student body, but it’s likely that students with lower averages aren’t recruited because they’re already working as lab specimens.

           Across the nation, Developmental Education classes are increasing, with some departments now within the Top 5 in the college. For those who don’t speak “academicese,” that means more students are in college who have basic readin’, ‘riting, and ‘rithmetic problems.

           Nevertheless, there are still a few hold-outs among colleges where students actually go to study, develop their minds, and hope to make great contributions to society. This, of course, in a declining economy, is not acceptable.

           At Neargreat Tech, when the Admissions department failed to increase enrollment because most high school grads didn’t want to be associated with geeks, the President convened a Judiciary Review Board to reduce the college’s academic reputation. First in was the class valedictorian.

           “Bennish, this is the fifth time this semester you’ve been caught sneaking into the library. This administration just doesn’t know what to do with you.”

           “Sir, maybe I could increase my community service and read books to the ill and illiterate.”

           “Why can’t you just go to our football games Saturday afternoons, then party and get drunk like a normal college student?”

           “Because, sir, we don’t have a football team.”

           “Then start one! If it’s as bad as it could be, you’ll have an excuse to drink. Next!”

           Next in was a student accused of disturbing the peace.

           “Rachmaninoff, your advisor says you’re a pretty good musician, but you only want to play the classical stuff. We’re assigning you to the marching band.”

           “But, Dean, I play the piano.”

           “Great! The band needs a pianist.”

           “Sir, it might be difficult to carry a piano along Broadway. Besides, there are only 20 members in the band anyhow.”

           “Even better! Pick an instrument. Banjo. Double bass. Electric guitar. They need everything! Dismissed!”

           Next to be called to face a disciplinary hearing was Schopenhauer. “You were seen lying on the grass beneath a tree in the quad,” said the president. “The campus police claim you were thinking. We should give you an opportunity to defend yourself against this egregious accusation. What exactly were you doing?”

           “Thinking.”

           “That’s outrageous! You know we don’t like our students to think. What’s your major?”

           “Philosophy, sir.”

           “That’s the problem,” the president declared. “Since you’re only a freshman, and probably don’t know better, I’ll be lenient. You are sentenced to a day of writing graffiti on the university’s bathroom walls.” He paused a moment, then snapped, “And don’t let me catch you writing anything intelligent on those walls!”

           Later that afternoon, the president met with his staff.

           “This isn’t going to work,” said the dejected president. “We can’t catch every practicing scholar on campus. They’re just snickering at our rules. If we can’t stop education, then we won’t be able to raise our enrollment and get performance bonuses.”

           That’s when Winslow, a newly-appointed deputy assistant dean spoke up. “Perhaps we need to look elsewhere for our inspiration. What is it that almost every college but ours has?” He didn’t wait for a response when he declared the college needed fraternities and sororities.

           “How do we know the students will even want to participate?” asked the president. “Most of our students have no desire to participate in a system that humiliates them, strips them of their individuality, and causes them to walk six abreast down a narrow street while singing off-key.”

           “Perhaps,” suggested the deputy assistant dean, “we can tap our reserve fund and build a couple of fraternity houses, maybe a sorority house or two.”

           “Will that guarantee we’ll get more common students to raise the enrollment?”

           “If you build it, they will party,” said the deputy assistant dean.

           “Winslow may have a bright idea here,” said the president, who immediately promoted him to vice-president of academics and parties.  

‘Pssst. Hotdogs Ten Bucks Each’

“Pssst.”

           I walked straight ahead, looking neither right nor left in a darkened alley illuminated by a half-moon.

           “Pssst.”

           I quickened my pace, but there was no avoiding the shadowy figure.

           “Ain’t gonna harm ya. Jus’ wanna sell ya somethin’.”

           I hesitated, shaking. Stepping in front of me, he shoved a hotdog under my nose. “Ten bucks each,” he whispered ominously.

           “Ten bucks?!” I asked, astonished at the cost.

           “You want it or not?”

           With Michelle Obama (who chose to attack obesity rather than poverty, worker exploitation, or even hunger and malnutrition) leading the charge, supported by publicity-hungry legislators, hotdogs were the latest feel-good food to come under assault. A medical association whose members are vegans had spent $2,750 to place a billboard message near the Indianapolis Motor Speedway. The picture showed four grilled hot dogs sticking out of a cigarette box that had a skull and crossbones symbol on its face. An oversized label next to the box informed motorists and fans of the upcoming Brickyard 400, “Warning: Hot dogs can wreck your health.” The Physicians Committee for Responsible Medicine claimed that just one hot dog eaten daily increased the risk of colorectal cancer by 21 percent.

           The Committee isn’t the only one destroying Americans’ rights to eat junk food. The Center for Science in the Public Interest, which seems to come up with a new toxic food every year, once declared theatre popcorn unhealthy. Many schools banned soda machines. Back in 2011, McDonald’s reduced the number of french fries in its Happy Meal and substituted a half-order of some abomination known as apples. Even cigarette company executives, trying to look professorial at a Congressional hearing, once said that smoking cigarettes was no worse than eating Twinkies. However, smoking a Twinkie could cause heart and lung diseases, cancer, and diabetes.

           Nevertheless, in Michelle Obama’s second term as First Anti-Fat Lady, I was desperate for my daily fix of hot dogs, and my would-be supplier knew it. I leaped at my stalking shadowy figure with the miracle junk.

           “Not so fast!” he growled, pulling the hotdog away. “Let’s see your bread.”

           “I don’t have any bread,” I pleaded. “Not since a zoologist at Penn concluded that hummingbirds that ate two loaves of bread a day got constipation.”

           “Not that bread, turkey! Bread! Lettuce!”

           “I haven’t eaten lettuce in three years since the government banned it for having too many pesticides, and the heads that remained were eaten by pests.”

           The man closed his trench coat and began to leave.

           “Wait!” I pleaded, digging into my pockets. “I’ve got change.”

           He laughed, contemptuously. “That’s not even coffee money.”

           “I don’t drink coffee,” I mumbled. “Not since the government arrested Juan Valdez and his donkey for being unhealthy influences on impressionable minds.”

           I grabbed for his supply of hotdogs, each disguised in a plain brown wrapper, each more valuable than a banned rap record. He again pulled them away.

           “I ain’t no Salvation Army. You want ‘dogs, you pay for ‘dogs. I got thousands who will.”

           “I need a fix. You can’t let me die out here on the streets.”

           “If it was just me, I’d do it. But there’s the boys. They keep the records. If I give you a ‘dog and bun, and don’t get no money, they’ll break two of my favorite fingers. I don’t cross nobody. And I don’t give it away.”

           “Please,” I begged. “I need a ‘dog. It’s all I have left to live for. I don’t care about colorectal cancer. Without hotdogs, my life is over. You can’t let me die out here on the streets.” He shrugged, and so I suddenly got bold. “Give me a ‘dog,” I demanded, “or I’ll tell everyone you have the stuff. You won’t be able to meet the demand. The masses will tear you apart like a plump frank.”

           “You wouldn’t do that to a guy just trying to make a buck, would you?”

           “Two ‘dogs with mustard and onions, and I keep my mouth shut. No ‘dogs and I scream like a fire engine.” He had no choice.

           Walking away, he stopped, turned back, and called after me-“Tomorrow. This corner. This time. Two ‘dogs. Twenty bucks. I’ll see you every night.”

           I didn’t reply. He knew he had me.

[Rosemary Brasch, who likes hotdogs, assisted on this column. Walter Brasch says he prefers hamburgers, but will defend to the death the right of Americans to eat what they want. His latest book is Before the First Snow, a look at a part of America, as seen by a “flower child” and the reporter who covered her story for more than three decades, beginning in the 1960s.]

’10 Commandments Judge’ Running for Presidency

by Walter Brasch

           The chief justice of the Alabama Supreme Court who was removed from office for defying the Constitution and a federal court order is one of 14 major candidates running for the Republican nomination for the presidency.

           Alabama’s Court of the Judiciary unanimously had ordered Roy S. Moore removed from office in November 2003 after he refused to remove from the judiciary building rotunda a 5,280-pound granite monument of the Ten Commandments; he had personally overseen its funding, carving, and placement.

           The federal court ruled that placement of the monument, and Moore’s repeated statements that the monument represented God’s sovereignty over all matters judicial and moral, violated the Establishment Clause of the First Amendment.

           With strong popular support, Moore said not only was the court’s ruling illegal, but that he would continue to defy it. The message sent to the voters was that it’s acceptable to disregard two centuries of legal history that gave the federal constitution supremacy over states, and to violate federal law if you disagree with it. For a citizen to do so carries penalties; for a judge to do so carries removal from office.

           Reflecting upon the case, Moore said that even eight years after his removal from office, he “would still make the same decision.”  The role of government, says Moore, “is to secure those rights that [a Christian] God has given us.” Although he says he supports religious diversity, the “source of our morality stems from our belief in a god, and a specific god.” However, in his Dec. 13, 2006, column for WorldNetDaily, Moore stated that Rep. Keith Ellison (D-Minn.), a Muslim, should be denied the right to hold office because “in the midst of a war with Islamic terrorists we should not place someone in a position of great power who shares their doctrine.”

           Roy Moore says he is running for the presidency because “there’s a need for leadership in the country,” and neither President Obama nor the leaders of both parties in Congress are providing that leadership. “Petty politics,” he says, are taking precedence over the needs of the country. “We can’t get anything done,” he says, “because decisions are [made] not what’s good for the country, but what is good for the party.”

           Moore identifies a weak economy as “the foremost problem today.” The nation “is going the wrong way,” he says. He acknowledges that much of the problem came under the Bush-Cheney Administration, “but was increased by Obama.” Although the Republicans propose cutting social programs rather than raising the debt ceiling, every Congressional leader, Democrat and Republican, voted to increase the debt ceiling during the past decade, with the highest increases under Republican presidents: Ronald Reagan (189%), George H.W. Bush (55%), and George W. Bush (86%). In Bill Clinton’s two terms, the debt ceiling was increased only 37 percent; Barack Obama is asking for a 35 percent increase.

           Moore, a “states’ rights advocate,” shares the views of most conservative candidates for the Presidency. Among those views are:

            The federal income tax should be abolished.

            Abortion, for any reason, should not receive federal funds because not only does abortion “contradict the right to life contained in the organic law of our country,” it violates the 14th Amendment.

            People should “have the right to choose their own employment,” instead of having to join unions. Therefore, says Moore, all states should have “right-to-work” laws. If Moore’s vision is enacted, these laws would effectively cripple unions from representing the workers.

            Same sex marriage, says Moore, violates the will of God. In one case, as chief justice, he argued that homosexual behavior is “a crime against nature, an inherent evil, and an act so heinous that it defies one’s ability to describe it.”

           However, on a couple of issues, his views lean closer to those of liberals. He opposes the nation’s entry into war without Congressional authorization. A graduate of West Point, Moore became an MP company commander at the end of the Vietnam War, and then graduated from the University of Alabama law school. He opposes the U.S. intrusion into Libya on both military and legal grounds. “It’s very easy for a president to be sucked into global wars,” he says, “but it’s not our goal to go over there [Libya] and take out a leader just because we don’t like him.” Unlike many Republicans, he acknowledges that the Libyan attack, like the U.S. invasion of Iraq under the Bush-Cheney Administration, should have had Congressional approval under the War Powers Act of 1973.

           Moore, who owns horses-he once spent a year as a cowboy in Australia working for a fundamentalist Christian-believes that the dwindling population of wild horses and burros in the Southwest, and all wild animals, should be protected. Both the Bush-Cheney and Obama administrations have failed to do so, often influenced by the cattle and meat industry.

           Moore, near the bottom of the pack in the polls, probably won’t become the Republican nominee. But, unlike some conservative candidates, he doesn’t parade his religious beliefs to gain votes. He lives the life of his religious convictions, and isn’t afraid to make sure everyone knows what they are, especially when they provide the base for his political views.

[Brasch’s current book is Before the First Snow, a look at the nation’s counterculture and social problems, as seen through the eyes of a “flower child” and the reporter who covered her story for more than three decades.]
