Study raises the question: do people get rich by acting unethically or act unethically because they think they are privileged?

The news this week that a new study found that wealthier people were more likely to behave unethically set off a chicken-or-egg debate in my mind. 

In the study, Paul Piff, a graduate student at the University of California-Berkeley, led a team of researchers at UC-Berkeley and the University of Toronto in a variety of behavioral experiments involving about 1,000 people.

They ran seven experiments, all of which concluded that upper-class individuals behave more unethically than lower-class individuals:

  • In two of the studies, upper-class individuals were more likely than lower-class individuals to break the law while driving.
  • In a laboratory study, upper-class individuals were more likely to exhibit unethical decision-making tendencies.
  • Other lab studies showed upper class people more likely to take valued goods from others, to lie in a negotiation and to cheat to increase their chances of winning a prize.
  • Yet another lab study showed that those in the upper class were more likely than the poor to endorse unethical behavior at work.

The researchers used a number of ways to evaluate socioeconomic status, such as education levels, annual income and the participants’ own perception of their social standing. But it didn’t matter what measure they used to sort participants into classes: those with higher status tended to behave in ways that served their own self-interest, even if it was unethical. My own view on why it didn’t matter which criterion they measured: in the United States, virtually all social status reduces to money. The wealthier you are, the higher the social class, the higher the self-perception of class, the higher the annual income and the higher the level of education.

By the way, neither the study nor anyone else is saying that all wealthy people are unethical, only that a larger percentage of wealthy people than poor people will do or consider doing things that most people consider to be unethical.

Now comes the chicken or egg: do people gain higher status primarily by engaging in unethical activities? Or does having a higher status make people think they are better than others and can play by their own rules?

Let’s start with the “chicken” speculation of the authors of the study, i.e., that being rich changes how people behave:

  • The independence offered by financial security may foster a sense of entitlement and a lack of concern for others.
  • Affluent people may be more likely to get away with misbehavior because they have better-paying jobs, and better-paying jobs are associated with less supervision.
  • The affluent may be more willing to take ethical risks because they have the resources at their disposal to address the inconvenience of getting caught.

We know that there is less social mobility in the United States than ever before in its history and less than in any other industrialized country. What that means is that today’s upper class of wealth is primarily, but not exclusively, children of the upper class. That would certainly speak to the idea that the chicken came first, that is, that wealth created the pattern of bad behavior and not the other way around, at least in the current generation.

The researchers did find an “egg” explanation, which means that there may be something about cheating, lying and other unethical behavior that helps people get rich. The researchers found that unethical behavior was closely related to positive feelings about greed. Although the connection appeared to be strongest among high-status individuals, even lower-status individuals were more prone to ethical lapses if they felt that greed was good.

In other words, if you want money as an end in itself and value the acquisition of wealth, you will be more likely to behave unethically.

Here comes the paradox: greed imbues our entire value system.  We equate the pursuit of happiness with the pursuit of wealth. We measure success by wealth and what it buys: the amount of the sports or book contract, the size of the diamond or the house. We admire rich people and follow their doings. Much of the mass media revolves around celebrities, who for the most part are either rich or striving to be rich. Over the past 30 or so years, our leaders have created a system that provides no constraint on the accumulation of wealth, even if that means a decline in public services for everyone else, thus confirming private greed as the central economic value.

Everywhere we go, we are told to be greedy. Now comes this research that shows that greed causes people to turn their backs on our shared norms of social and economic interactions. In other words, the basic drive that propels our society erodes its foundation.

The right wing likes to blame many groups for what it sees as the decline of American civilization. But the real implication of this research is that it’s not the pointy-headed academics who have led us into the fix we’re in, nor the European-style socialists, nor those who offend archaic family values.

No, it’s greed pure and simple that is sinking the United States.

The Great Courses video lecture series presents a history of the West and calls it world history

On the back cover of a recent New York Times Book Review was a full-page advertisement for a “Great Course” (that’s the brand name) set of DVDs or CDs titled “The World Was Never the Same: Events That Changed History.”

I never read these ads, but the idea of reducing history to 36 incidents appeals to my “top 10” mentality, like the “10 greatest battles in history” or the “25 most influential people of the 20th century.”  Great Courses describes the video course as “36 of the most important and definitive events in the history of the world. It’s an intriguing and engaging tour of thousands of years of human history.”

With only 36 events, I assumed that the selection of events to include on the list would reveal the ideological bent of the lecturer, in this case a professor at the University of Oklahoma by the name of J. Rufus Fears.

Never fear, in his list of important events of world history, Fears reveals himself to be another white male selling an old-fashioned and specifically American version of the growth of the Christian world, spiced with a survey of world religions to promote a superficial diversity.  Check out the list or skip to my analysis:

  1. Hammurabi Issues a Code of Law (1750 B.C.)
  2. Moses and Monotheism (1220 B.C.)
  3. The Enlightenment of the Buddha (526 B.C.)
  4. Confucius Instructs a Nation (553–479 B.C.)
  5. Solon—Democracy Begins (594 B.C.)
  6. Marathon—Democracy Triumphant (490 B.C.)
  7. Hippocrates Takes an Oath (430 B.C.)
  8. Caesar Crosses the Rubicon (49 B.C.)
  9. Jesus—The Trial of a Teacher (A.D. 36)
  10. Constantine I Wins a Battle (A.D. 312)
  11. Muhammad Moves to Medina—The Hegira (A.D. 622)
  12. Bologna Gets a University (1088)
  13. Dante Sees Beatrice (1283)
  14. Black Death—Pandemics and History (1348)
  15. Columbus Finds a New World (1492)
  16. Michelangelo Accepts a Commission (1508)
  17. Erasmus—A Book Sets Europe Ablaze (1516)
  18. Luther’s New Course Changes History (1517)
  19. The Defeat of the Spanish Armada (1588)
  20. The Battle of Vienna (1683)
  21. The Battle of Lexington (1775)
  22. General Pickett Leads a Charge (1863)
  23. Adam Smith (1776) versus Karl Marx (1867)
  24. Charles Darwin Takes an Ocean Voyage (1831)
  25. Louis Pasteur Cures a Child (1885)
  26. Two Brothers Take a Flight (1903)
  27. The Archduke Makes a State Visit (1914)
  28. One Night in Petrograd (1917)
  29. The Day the Stock Market Crashed (1929)
  30. Hitler Becomes Chancellor of Germany (1933)
  31. Franklin Roosevelt Becomes President (1933)
  32. Mao Zedong Begins His Long March (1934)
  33. The Atomic Bomb Is Dropped (1945)
  34. John F. Kennedy Is Assassinated (1963)
  35. Dr. King Leads a March (1963)
  36. September 11, 2001

There are so many things wrong with this list from the standpoint of world history that I don’t know where to start!

Except for the Hammurabi Code and the founding of some major world religions, all the events involve the growth of the Christian West according to the mid-20th century middlebrow American version that goes from Greece to Rome to Christian Europe to the land of the free.

One quarter of all the events involve the United States, a nation that has existed for about 230 years, or a mere 2% of the time since humans began cultivating plants and animals. How, for example, did the assassination of John Kennedy change world history? Eisenhower and Kennedy had already committed us to Viet Nam and the civil rights movement was already growing. And how do both the 1929 stock market crash and the election of FDR get onto a list of 36 events that changed history?

Neither of the events representing arts and letters changed history. The two creative artists involved have much in common: both are widely recognized, both represent more of a summing up of a tradition than a break with it, and both created examples of Christian art:

  • Dante is one of my favorite poets, one whom I have studied extensively and reread quite often, but I would never claim that he changed history, not even literary history, unless you believe that no one else would have thought of writing in the vernacular instead of Latin. Other than the innovation of writing in Italian, his style represents the height of medievalism, soon to be swept away by the stylistic innovations of the Renaissance. Real literary innovators with lasting influence include Li Po and Tu Fu (Tang Dynasty poets), Cervantes and Joyce.
  • The same goes for Michelangelo, whose event was receiving the commission to paint the Sistine Chapel. It is a magnificent and extremely famous work, but Michelangelo, though imitated, did not revolutionize painting; he only helped it evolve a little. Nothing Michelangelo did was as influential as the work of Giotto, Masaccio, Van Eyck, Monet or Picasso, not to mention the Yuan Dynasty Chinese painter Zhao Mengfu, who had a lasting impact on both Chinese and European art.

There are subtle signs of ethnocentricity everywhere: For example, the founding of Europe’s first university in 1088 is mentioned, but not the introduction of the examination system in the Chinese civil service more than 475 years earlier. We learn of the Battle of Vienna in which the Christian Hapsburgs turned back the Islamic Ottomans, but nothing about the founding of any Chinese dynasty, nothing about the Mughal conquest of India, nothing about Chinggis Khan! Also note the male-centric nature of the lectures: no event involves a woman.

We would have to hear or see the lecture to know for sure, but the promotional materials suggest that Fears’ handling of slavery and its bloody demise—a major theme in world history—is a whitewash. The event representing the American Civil War is presented from the point of view of the South, i.e., “General Pickett leads a charge.” The paragraph description of this lecture on the website makes the specious claim that the South would have won the Civil War if it had won the Battle of Gettysburg. This understanding of the war conveniently forgets the tremendous resource advantage that the North had, which Ulysses S. Grant understood and used to his advantage in planning his battles. But more to the point, there is no room in the course for any event depicting the 400-year history of the slave trade, which funded the economic and technological advances of Western Europe and the United States from the 16th through much of the 19th centuries.

And who is this J. Rufus Fears, who lectures to us about these important events in world history?

The Great Course promotional material calls Fears a “historian,” but judging from what I could find about him on the Internet, he is as much of a practicing historian as Rufus T. Firefly.

An Internet search finds that Fears teaches a lot of video/audio courses for adults, all about great men or great ideas of the West or of world religions. His only publication other than course material that I could find is as editor of a book by Lord Acton, a 19th century English Catholic historian and politician. Fears did write an article in “Atlantis: Fact or Fiction,” a 1978 book of essays on possibilities of the actual existence of Atlantis that focuses on Atlantis in myth and literature. The University of Oklahoma says that he’s a professor of classics, which is not really history, although it can involve history. I cannot find one scholarly article or book by Fears, or even one book or article of popular history of the sort that David McCullough or Bruce Catton might write.

While perhaps not a historian, Fears is a regular guest on “The Rusty Humphries Show,” a right-wing radio talk show that runs on more than 250 stations. A visit to Humphries’ website lists some other recent guests, the usual right-wing suspects, including John Bolton, Michele Bachmann, Newt Gingrich, Paul Ryan, anti-choice activist John Schneider and Rand Paul.

I’m not saying that “The World Was Never the Same: Events That Changed History” doesn’t provide history to those who buy the course. But everything I can find out about the course and the good Professor Fears indicates that what we’re getting is comforting if distorted verification of the ideological imperatives of western superiority and American exceptionalism.

I think I’ll pass on Fears’ version of world history and instead reread some Fernand Braudel, Mote’s magisterial Imperial China: 900-1800 or Howard Zinn’s A People’s History of the United States.

By saying Obama follows a secular, not a religious, agenda, Republicans make the case for reelecting the President

The Republicans are doing a lot to restore my faith in President Obama.

First Rick Santorum accused Barack Obama of being the most anti-religious president in the nation’s history. Then Mitt Romney said that Obama sought to substitute a “secular agenda” for one based on faith.

What a relief to know we have a secular president.

What a relief, especially after weeks of hearing Santorum, Romney and the other Republicans talk about introducing religion into public policy and political decisions in one way or another.

What a relief to have reaffirmed the fact that President Obama, unlike his immediate predecessor, follows the U.S. Constitution and the wishes of our mostly deist founding leaders and promotes a secular agenda.

We are, after all, a secular country, one that is not supposed to have religion enter into government decisions, nor to favor one religion over others.

I understand that a good 20-25% of voters think differently. They believe that we are a Christian nation. What’s more, they want to force a set of values on everyone that they associate with Christian practice.

But we have to go no further than the issue of birth control to recognize how much the real world diverges from the ideals of the Christian right wing. In the real world, 98% of all women use birth control at some time during their lives. In the real world, the cost of birth control is far less than the cost of an unwanted pregnancy, which means that when you ask a religious organization to pay for its employees’ birth control you are asking it for no money; in fact, you are giving it money, since its insurance costs will decline.

To be sure, both Romney and Santorum are playing to the hard core base that now determines Republican primary elections. But besides pandering to the right-wing “values” voters, the labeling of Obama as “non-Christian” and “non-religious” also has a subtle impact on other voters. It’s another, harder to disprove, version of the “Obama wasn’t born in the United States” canard. Wherever we fall on the political spectrum and however devout or non-practicing we are, most Americans have a Christian background and live their lives by an ethos they identify as Christian. To say that Obama is not Christian or is anti-religious (which is just another way for them to say “anti-Christian”) turns him into the “other” or the “stranger” who has historically been so feared in American culture and politics. The ultimate outsider, of course, is the Black.

There’s an economic aspect to the accusations, too: that old saw that communists and socialists are godless. For decades, Americans have been used to hearing the words “godless” and “socialist” (or “communist”) pronounced one after the other to describe progressives and liberals. To say that Obama is against religion is also a veiled way of saying that he is against our free market capitalist system.

And yet, I think many will share my desire that religion not enter into a president’s decision-making. I think most of us prefer that decisions be based on facts, science, reason, the law and what’s best for the country and its people.

That Romney and Santorum affirm that our current president is following a secular path gives me more confidence in what Obama is doing. That the Republican candidates don’t like Obama’s secular path is scary. Because they could get elected, and that would be trouble. We recently had a faith-based president and it didn’t really work out, unless you like useless, goalless wars, state-sponsored torture, catastrophic environmental change and the largest deficit in American history.

Why pick on affirmative action? Why hasn’t anyone sued universities about favoritism to legacies and athletes?

Yesterday’s announcement that the U.S. Supreme Court has accepted an appeal from a rejected white applicant to the University of Texas-Austin reminds me that every time the constitutionality of affirmative action returns to the issues agenda, one question is always left out:

Why didn’t the applicant suing a university for accepting minorities with a less impressive record of academic achievement also or instead sue the university for discriminating in favor of athletes and legacies? Legacies, for those not up on academic admissions parlance, are students whose parents previously went to or have contributed money to the university.

The unfairness of lowering standards to accept athletes to an institution dedicated to intellectual achievement and professional training seems fairly obvious. I can understand giving a break on the SAT or grades to a national chess champion or the winner of a science fair, but what does sports have to do with the mission of higher education?

And yet where are the lawsuits claiming that the university acted illegally in preferring a kid with a 1000 on the SATs who can throw a football 70 yards through a tire to someone with 1100 on the college boards who has no extra-curricular activities?

The advantage given to legacies is even more unfair, because it is a major part of the rigidity in the college system that necessitated affirmative action in the first place. 

A quick search of “college admissions legacies” will reveal the oft-told history of legacy preferences, which Ivy and other colleges began to use after World War I, when their objective criteria were leading to the admission of too many Jews. Today, legacies typically make up 10% to 15% of acceptances at Ivy League and other private colleges, and some schools give as many as 30% of all acceptances to legacy applicants. Overall, many more students are admitted through legacy preferences than through affirmative action programs. What’s more, polls find that 75% of all Americans are opposed to legacy preferences.

Yet no one sues universities because they are giving preferences to people whose parents graduated from or gave money to the school.

Correction. In his 2010 analysis of everything that’s wrong with legacy preferences in The Chronicle of Higher Education, educational policy guru Richard D. Kahlenberg cites a losing 1970s case filed against legacy preferences at the University of North Carolina-Chapel Hill. The problem was that the plaintiff included legacies in a hodgepodge of discrimination complaints, including discrimination for being an out-of-state student. Her SAT scores were 850, uncommonly low for an in-state or out-of-state UNC student both back then and now.

Kahlenberg summarizes the compelling legal argument against legacies at both public and private universities, based on the 14th amendment and the Civil Rights Act of 1866. He also demolishes the argument that universities need legacy admissions to keep the donations rolling into the university coffers. Analysis reveals that people give about the same to their alma mater or other universities with or without the legacy factor.

The higher the university on the food chain, the more the legacy admission undercuts the ambitions of other competent but less connected candidates. Let’s face it, the more important the job, the more likely it will be filled by an Ivy or Ivy-like (e.g., Stanford or Northwestern) graduate, and if not an Ivy, a public Ivy (e.g., U of Washington or UNC) or other prestigious school. The college educated earn more in general, so no matter how you slice it, legacies come from wealthier families on average than non-legacies at virtually every university.

When you’re better off, you are more likely to have special lessons, more likely to travel abroad, more likely to participate in national youth competitions, more likely to take an SAT prep course and more likely to live the lifestyle behind the cultural assumptions of the SATs. Affirmative action is one of the ways that colleges can level the playing field.

I’m not saying that once legacy admissions are ended we won’t need affirmative action anymore. What I’m saying is that the legacy system reflects the subtle action of institutional racism and is one more reason we need affirmative action. By the way, we’ll know that we no longer need affirmative action when the rate of poverty among African-Americans or the average wage of African-Americans is about what it is for everyone else.

As others have pointed out, the Supreme Court decision to take the appeal is especially disturbing in light of its 2003 ruling upholding affirmative action. In that ruling, the Supreme Court laid down some affirmative action guidelines for universities and suggested that the high court shouldn’t revisit the issue for another 25 years. Of course that was before Roberts and Alito joined the court.

Most of the plaintiffs in these affirmative action lawsuits are middle class and upper middle class whites. That leaves us with the fact that none of these fighters for equality ever thought to take on legacies. There are certainly more legacies than there are affirmative action students, and the legacies tend to include more of the children of those people who have taken money from the middle class through the economic and tax policies of the past 30 years.

It’s quite puzzling. The only answer that I have is that it’s another manifestation of the racism that has distorted the politics and social policy of this country since its inception.

MIT professor revisits the cultures that bombed Pearl Harbor, destroyed the WTC and dropped A-bombs

I’m reading a fascinating scholarly study called Cultures of War by MIT history professor John Dower. Professor Dower analyzes in detail the similarities in the cultural assumptions, bureaucratic decision-making processes, fascination with technology, religious orientation, use of propaganda and strategic military imperatives of four events that serve as symbolic points in the cultural history of two wars.

Interestingly enough, in all cases the decision to act proved disastrous for mankind. In three, and maybe all four, it was also disastrous for the nation/organization instigating the act:

  1. The Japanese attack of Pearl Harbor
  2. The United States detonation of atomic bombs on Hiroshima and Nagasaki
  3. Al Qaida’s terrorist attacks by suicide crash on September 11, 2001
  4. The United States “war by choice” against Iraq under the false pretext of destroying weapons of mass destruction and disabling Al Qaida.

The obvious symmetry in considering the cultures that produced all four of these actions is that they are paired: in both pairs, the actions of the United States are typically considered to be reactions against horrible deeds, by a nation in one pair and by a terrorist organization in the other.

But Dower carefully calls U.S. defensive motives into question: He recapitulates what we already know about the duplicitous lead-up to the invasion of Iraq by the Bush II Administration. He also reminds us that no one can say for sure that dropping two atom bombs saved more lives than the more than 200,000 that the U.S. obliterated in two fairly short bombing raids. We know for a fact, however, that the U.S. wanted to brandish its new weapon for the Soviets and everyone else in the world and wanted to stop potential grumbling at home about the cost of the Manhattan Project. Dower also shows us how much the U.S. wanted to go to war against Japan before Pearl Harbor and how much the Bush II (non)brain trust wanted to attack Iraq before 9/11.

Many would consider it blasphemy and/or treason to equate the moral bearing of Osama bin Laden and the U.S. under Bush II or Roosevelt, but Dower makes a very strong case.   Here are some of the similarities:

  • The use of religion as a justification and of religious imagery in manifestos about the events
  • The postulation of a battle between civilizations
  • The belief that your civilization/religion is infinitely superior to the civilization/religion of the foe
  • A justification of killing innocent civilians, politely known as “collateral damage”
  • The focus on technology (in the case of Osama, it was computers, not weapons)

We really did feel threatened by the combined force of the Japanese and Germans, and we really did feel threatened by the terrorist attack. But Dower makes it clear that Osama’s followers, too, felt their civilization threatened by U.S. military activity and economic and social imperialism. 

The fact that many of us think that the Japanese and the extreme Islamists were fools or devils to feel that their way of life was superior merely suggests that we are unable to transcend our own cultural imperatives that tell us that our way of life is the best. I’m not saying that Al Qaida was right to launch the 9/11 attack. It was as wrong as we were to drop the atom bombs and to attack Iraq (and to pursue the Viet Nam War for that matter). But they certainly were right to think their civilization was threatened by U.S. military and political actions. 

And just as Pearl Harbor united even the most vocal pacifists and isolationists in the United States, just as 9/11 united us again, so did the invasion of Iraq, the declarations by the Bush II Administration of a holy war and our establishment of a world-wide torture gulag help Osama recruit many new terrorists throughout the Islamic world. In all cases, too, the governments and terrorist organization embarked on major propaganda campaigns to convince their people that they did the right thing by unleashing death, in one case against soldiers sworn to fight to the death and in the others against innocent bystanders.

Perhaps the most horrible similarity in all the cultures of war that Dower considers in his provocative and easy-to-read book is that in all four attacks, the participants—the military men, the government officials, the scientists and engineers, the soldiers who did the dirty work—were able to forget that they were engaged in killing large numbers of people. 

Many factors led to the dehumanization of the people at the receiving end of the bombs, tanks and suicide plane crashes:

  • A bureaucratic language that used euphemisms and passive constructions to conceal horrible realities
  • A focus on the complex challenge of the task at hand (as opposed to the destructive ends)
  • A belief in the inferiority of the victims
  • The self-deception often generated by the constant creation of propaganda for others
  • A religious belief, i.e., that you’re on a religious mission

These factors affected the decisions and actions of the one-party Shinto autocracy, the Christian representative democracy and the Islamic theocracy. Who has the moral high ground here?

There is no threshold for terror, for starting wars of choice or for unleashing weapons of mass destruction. 

There is no military justification for bombing and attacking civilians that can offset or override the moral evil involved in killing masses of innocents. 

Heinous acts of terror, genocide and lawlessness in a just cause turn that cause to evil and take from the perpetrators any claims of morality or civilization.

I came away from reading Cultures of War more convinced than ever of these things.

New York Fashion Week and Westminster Dogs: lots of prancing in NYC this week

It is one of those rare poetic coincidences that both the Westminster Dog Show and New York Fashion Week unfolded in the same week this year. Some might even call it proof of an intelligent, if satirical, design behind cultural history.

Imagine: on one floor in Manhattan, pedigree dogs bred and trained to an arbitrary standard parade around, while on another Manhattan floor, mostly overly thin and mostly female models, with who knows what surgeries, maquillage and other enhancements applied to imitate another arbitrary standard, do the same.

The news media and television entertainment programming have dedicated extensive space and time to both these shows. Fashion plays into the greater universe of celebrity culture, since celebrities are often the ones who wear the newest fashions or sell fashions to the consumer. As celebrity culture has grown, so has interest in these fashion shows. 

In the same way, the growth of interest in the Westminster Dog Show mirrors the growth in America’s interest in dogs and other pets, as witnessed by the growing number of new products for dogs that come out each year, the growth of spending on dogs, the growth of advertising of pet products and the growth in the use of dogs as a cultural icon in other commercials. Consider, for example, the TV ad in which someone is walking a dog when his button bursts from overeating. Or the one in which a dog drinks beer with his master. The one in which a woman tells her dog that she reacts to eating a cereal the way the pooch reacts to having its belly scratched. The list of dogs popping up in commercials for other products in recent years is endless.   

What’s most interesting is that in the case of both the animals and the models, these living embodiments of perfect form prance gracefully in front of the adoring crowds not just for themselves, but to glorify a third party. In the case of the dogs, it’s their owners and trainers. In the case of the models, it’s the designers whose clothes adorn their bodies.

At least the dogs get to do tricks and demonstrate their intelligence. The models could readily be replaced by a future generation of robots that looked and moved as humans do.

In the real world, most people like seeing an attractive person in interesting or provocative dress in the street, even if they would never go to a fashion show. We don’t always feel so positively about dogs, which can be big and mean-looking, howling or whimpering viragos, dangerously left to wander without leash or not cleaned up after by the master.

It’s ironic then that in the world of shows, it’s the models who are getting into trouble with the neighbors. Like its Parisian version, New York Fashion Week is a tent affair, erected in front of Lincoln Center in Damrosch Park. Residents are complaining that between setting up, operating and taking apart two Fashion Weeks and a circus, the park is out of commission for 10 months of every year. The New York Times reported yesterday that a group of residents and a group called NYC Park Advocates announced that they had sent a “cease and desist” letter to the city and to Lincoln Center demanding that Damrosch Park be returned to its proper use as a city park.

We could break every week down to a series of three to five cultural stories that represent the playing out of long-term trends in ideological indoctrination. This week, for example, to Fashion Week and the Westminster Dog Show we might add the death of Whitney Houston, the excitement created because a devoutly Christian professional basketball player of Chinese descent played six good games and the Republicans relenting on offsets to the continuation of the temporary cut in the Social Security tax. 

Beneath these immediate stories lurk ideological imperatives: the culture of celebrity; the commoditization of emotion; the idea that the private sector solves all economic problems and government none. That two of these stories are shows that take place in New York merely suggests how much the mass culture of consumption now dominates the public marketplace of ideas.

There are too many ways that employers can raid the pension plans of workers

An article in the Wall Street Journal earlier this week described the various ways that employers can legally raid the pension funds of their employees and retirees.

The premise of the article, titled “Signs Your Pension Plan Is in Trouble,” was that even healthy pension plans that are adequately funded can be in trouble.

The article doesn’t admit it, but all of the risks to healthy pension plans described in it come down to permitted but nefarious actions by employers, including:

  • The employer selects financial assumptions that make the plan look like a bigger burden than it really is and uses that as an excuse to freeze payouts or commitments.
  • The employer offers incentives for early retirement using the assets of the pension plan, which depletes the pension plan of funds to pay retirees.
  • If the employer either sells or spins off the operation, the new owner uses the assets of the plan to pay off its own underfunded obligations to its plans, thus putting the strong plan in trouble.
  • A religious or other nonprofit organization changes the status of its plan to a “church” plan, which exempts it from federal pension rules including the requirement to fund the plan.

The common theme in all these scenarios is that the employer legally takes money earmarked for paying the pensions of retirees and uses it for something else. It sounds a lot like what the rightwing wants to do with public pensions: cut them to pay for tax cuts.

The article basically mongers fear, but anyone reading the article who cares about fairness should feel outrage instead. 

Every single one of these tricks should be illegal. Whether unionized or not, the employee accepts pension benefits in lieu of current salary. The money in the pension plans should be sealed off from other funds into which the employer dips to meet expenses, just as Social Security is sealed off from the rest of the federal government, with its own trust. Despite recent reform, pension laws still provide too little protection to both private and public employees.

When the mainstream media chatters about the retirement crisis in America, it usually points a finger at the profligate Baby Boomers, who spent most of their earnings (and thus shored up the U.S. and world economies) and now don’t have enough saved for retirement. Perhaps more should be said about the profligate company executives who treated and continue to treat the assets of pension plans as their company’s private piggy bank.

In GOP’s alternative world, Social Security is part of overall budget and not in a special trust

The sparring between Republicans and Democrats over extending the temporary cut in the Social Security tax continues to take place in an alternative universe.

In the real world, cutting a program that is part of the budget doesn’t offset a temporary decrease in the Social Security tax, since Social Security is administered by a special trust, which lends money to the federal government. A cut in the budget only cuts the amount of money the federal government has to borrow from Social Security. The government will eventually have to pay it all back, no matter what. That is, unless the rightwing has its way and the U.S. defaults on its financial obligation to repay the Social Security Trust Fund.

The original beauty of putting more money into consumers’ hands by cutting the Social Security tax rate temporarily was that as long as the federal government pays back what it owes the Social Security Administration, Social Security is safely financed for a long time, with no real problem until 2037 if current trends continue.  The Obama Administration thus was able to pump money into the economy to fight the recession without hurting government finances because the money was coming from the fiscally strongest part of the government’s financial structure.

But in the Republicans’ alternative world, Social Security is part of the budget and Social Security taxes are part of the revenues that the government collects to pay for its expenditures. This fiction, supported to a large degree by imprecision in the mainstream media, plays into two basic principles of the current right-wing:

  1. Make the tax system more regressive. Unlike income taxes, everyone pays the same rate for Social Security and there is a cap on the income assessed that rate (someone earning twice the cap pays half the effective rate of someone earning exactly the cap), so treating it as just another tax makes the overall tax system more regressive.
  2. Destroy Social Security and replace it with a risky private pension system. If Social Security is considered part of the overall budget, then the program is in trouble, since the United States has been spending more than it takes in for years, thanks to Bush II’s decade-old “temporary” cuts for the wealthy and our senseless, costly and bloody wars in Iraq and Afghanistan. 

The House Republicans have now retreated from their intransigent insistence that spending cuts in the budget must make up for the lost revenue to the Social Security Trust Fund from continuation of these temporary cuts through year’s end. They have clearly lost face with the Tea Party element, but to block the continuation of this temporary tax cut for virtually all workers would have lost them the election and taxing the wealthy to “compensate” for the extension would have angered their financial backers.

The Republicans may have lost the battle, but by no means did they lose the war.  The months-long bickering over “funding the temporary Social Security tax cuts” did establish the false idea in the news media that Social Security is part of the overall budget. And while still strong, Social Security is financially weaker than it would be if it had the additional revenues represented by the temporary tax cut. 

Another, more progressive, way to kick-start the stalled economy might have been to pump money into infrastructure improvements, alternative energy development and other job-creating programs and to finance it by ending Bush II’s temporary income tax cuts for the wealthy. The Republicans have so far prevented this option, despite the fact that surveys keep showing that voters want to raise taxes on the wealthy. So the loss of the one battle has enabled the Republicans to keep winning the war.

I want to close this post by recommending a blog called Stochastic Scientist, upon which I stumbled while responding to a tweet from its writer, Kathy Orlinsky. Stochastic Scientist offers a very pleasing and well-written mix of science news.

There are two types of right-wing extremists and both base their actions on faith, not reason

“Tell me what you think makes a person right wing extremist.”

That’s the question posed to me in a personal tweet I received yesterday from someone named Silly Girl. Silly Girl describes herself as “pro-life” and a “Tea Party Patriot.”

Responding to Silly Girl’s question takes far more characters than Twitter’s limit.  Twitter is great for sloganeering and enticing followers to link elsewhere.  But it’s not a great venue for deep thought.  So here’s my answer, for Silly Girl and the rest of the world:

There are currently two kinds of extreme right-wingers participating in the American marketplace of ideas:

  1. Social extremists
  2. Economic-political (econ-pol) extremists

Interestingly enough, both types of right-wing extremist base their ideas on a faith that they believe overrides the lessons of science, history and rational thinking:

  • Social extremists have faith in religion, primarily Christianity, and refuse to believe the truth of science.
  • Econ-pol extremists have faith in the unencumbered free market to solve all problems and sort out wealth in a fair manner, even though both history and economic analysis reveal that the free market often works against the best interests of society.

By contrast, social and econ-pol extremists differ in their approach to social control.  This difference is striking when we consider that many people such as Rick Santorum, Michele Bachmann and John Boehner are both social and economic conservatives:

  • Social extremists want to control actions of individuals, e.g., preventing abortions, birth control and gay marriage and forcing religion into the school and other public places.
  • Econ-pol extremists want to take all limitations from the actions of institutions, specifically businesses, so they can operate free of constraints such as labor, safety and environmental laws and regulations.

We can see this difference in the approach to control when we take a look at the major positions held by the two types of right-wing extremist. I think I have them all, but if I missed some of the major ideas of right-wing extremists, please let me know.

Basic beliefs of social right-wing extremists:

  1. Global warming is not occurring and the theory of evolution is wrong
  2. Religion should be taught in the schools
  3. Abortion, stem cell research, gay marriage and birth control are wrong
  4. Homosexuality is a sin and a failing, not a natural occurrence in the population
  5. It’s okay to discriminate against practitioners of Islam and/or religions other than Christianity

Basic beliefs of econ-pol right-wing extremists:

  1. There should be no constraints on businesses, including constraints to promote fair wages, workplace or consumer safety or a clean environment
  2. The private sector can solve all problems and deliver all goods and services better than government
  3. Private institutions should replace public ones, e.g. schools, roads, prisons and military services
  4. Labor organizations should be made illegal and there should be no minimum wage or other employer mandates to compensate employees fairly
  5. Taxes should not only be low but also regressive, meaning that the richer one is the less one’s relative tax burden should be
  6. The military should engage to protect the private interests of large American multinational companies

I label as extremists those who subscribe to either of these sets of beliefs for several reasons:

  • To believe in all of these ideas (as opposed to just one or two) defines extremism on the right.
  • Unwillingness to compromise characterizes both types of extremists, and is a trait associated with extremists of all varieties.
  • For the most part, these ideas are rejected by mainstream America.  Most Americans support keeping abortion legal, although sometimes with limits.  Most Americans support raising taxes on the wealthy.  People overwhelmingly support birth control, the minimum wage and safety regulations.
  • Those views of the extreme rightwing that large numbers of other Americans share usually firmly contradict all scientific evidence, such as not believing in Darwinism or doubting the occurrence of global warming. Failure to believe truth is a sure sign of extremism, all else notwithstanding.

These two kinds of extremists really have nothing in common, except that they both tend to support views formed before the 18th century and both must exercise a great deal of faith to uphold their beliefs in light of sometimes overwhelming contradictory evidence. Social extremists tend to be less educated and live in the South or in rural communities; while some have money, as a group they are not wealthy. Econ-pol extremists tend to be wealthier than the average American and are not as defined by geography. In a sense, two unrelated groups have formed a partnership out of political convenience.

What is so odd about the current coalition of social and economic-political right-wingers constituting the Republican Party is that the policies of the econ-pols hurt a large majority of the social true believers, many of whom are poor, near poor or in the struggling, some would say drowning, middle class. And yet the social right-wingers continue to support the econ-pols.  It’s a deal with the devil that is keeping much of America poor and driving much of middle class America towards poverty.

Individuals should take precedence over institutions when it comes to a health matter like contraception

Is it a matter of women’s rights or is it a matter of religious freedom?

That question has grown to become one of the major concerns of the mainstream news media since the Obama Administration mandated that contraception be covered in all health insurance policies. 

The answer to this question of definition determines where you stand on what the Obama Administration did:  If you think we’re talking about women’s rights, then you likely believe the Obama Administration did the right thing.  If you think we’re talking about religious freedom, you likely believe it’s illegal for the Obama Administration to make Catholic organizations cover birth control in health insurance policies they offer employees.  Look at the coverage in Google News and you’ll see that Democrats, women’s organizations and progressives such as Rachel Maddow are talking about women’s rights and Republicans and right-wing pundits like Rush Limbaugh are talking about religious freedom.

Both sides have a point.

So what do we do when two fundamental rights come into conflict?

We could go with majority rules. An overwhelming majority of Americans have used birth control, support birth control and think that health insurance should cover it. In fact, as Gail Collins, Rachel Maddow and others have pointed out, two-thirds of Catholic women currently use birth control and virtually all have used some form of birth control at some point in their lives. The surveys show that the only group among Americans in which a majority is not in favor of birth control is the right-wing Christian evangelical movement, which unfortunately now sets policy for one of our only two political parties and thereby defines the terms of virtually every debate involving social issues in the United States.

Even including the overly loud voice of the Protestant right, majority rules would dictate that the Obama Administration made the right move.

Of course, at the heart of the very idea of rights is the principle that the majority can’t bully a minority.  But in this case it’s a minority wanting to bully the majority. 

Both those who are in favor of all health insurance policies covering contraceptives for women and those against it focus their attention on individual institutions or persons.  They often forget to mention the third party in the discussion, and that’s society.

As a matter of public policy, the government is charged with securing the public health.  We can judge success in this matter by the health of our citizens and by the funds we must allocate for health care.  As measured by people or by money, there can be no doubt that contraceptives help to promote a healthier society.  Birth control prevents two major factors in healthcare costs: unwanted pregnancies and venereal diseases. Birth control also leads to fewer abortions. 

When the government considers public policy issues, it often has to weigh conflicting rights.  Some examples include water rights policies, environmental standards, rules for eminent domain actions and product safety standards.

In the case of covering contraception, the public policy decision looks like a no-brainer: something that serves the public interest by leading to a healthier population and lower health costs is supported by an overwhelming majority of citizens. The policy favors a right central to the lives of individuals versus the right of an institution to avoid paying for something considered standard by most individuals and institutions.

I don’t believe there is any religious freedom involved in setting a standard for public health, which is all the Obama Administration did. In fact, a decision to exempt religious organizations might have infringed on the religious freedom of the women using birth control. 

But even if a religious freedom were involved, it would be superseded by public policy, just as the right to marry more than one woman was superseded by public policy in the last century, and just as the right to believe in the power of prayer ends as a matter of law and public policy when parents deny medical treatment to a child for religious reasons.