Fifty years from now, what will we remember about 2013?

Today the news media concludes a week of looking backwards at the past 365 days, 52 weeks and 12 months. I write all three to suggest that time is an arbitrary measure. To be sure, the year, month and day are based on the natural movements of the earth, sun and moon. But it is arbitrary both to begin the year in the dead of winter instead of at the beginning of spring or another time, and certainly arbitrary to imbue one measure of time with significance over another. What is there about one trip around the Sun that makes it a natural time to look back or to use as an increment of meaning? Instead of years or decades, we may think of human lives and history in terms of stages (childhood, adolescence, adulthood, middle age and so on), which can be of variable length depending on the society and individual. There’s the long 19th century that some historians begin in 1789 and end in 1914, for example. On a smaller scale of time, the Internet has just about eradicated the idea of a daily cycle of events that is reviewed in the morning or afternoon by perusing the newspaper or watching the news.

News stories and cultural trends sometimes emerge for brief periods but often transcend years. For example, we could label 2013 the year of electronic spying, but I have a feeling that this issue will remain before the public’s eye for years to come. We could also call 2013 the year of the twerk, but it would probably be more accurate to call August the month of the twerk, except that Miley’s notorious hip thrusts took place on August 26, so the media twerking frenzy really occurred over the two weeks of late August and early September, AKA the “back to school season.” Of course, if you ask retailers, the back to school season doesn’t begin a week before Labor Day, but weeks earlier with the beginning of back to school promotions near the end of July.

I love looking at old lists of major events, movies, TV shows, music and trends from past years. We can see support for ending capital punishment grow, then ebb, then grow again. We can see attitudes towards taxation, LGBT people, a woman’s right to control her own body and many other social and legal issues evolve. We can see the gradual increase in the rejection of science and truth by civic leaders, organized groups and media outlets, especially in attitudes and reporting on global warming, vaccinations and science education. These last few years we can see the ever-increasing rightward movement of the Republican Party to the point that it is willing to sacrifice the well-being of the country on the altar of cutting government spending programs and keeping taxes at an extraordinarily low rate for the wealthy and near wealthy.

Perusing these lists can help us identify long-term trends and news stories, but we also see how often our society has focused on trivia like twerking and ignored the important. For example, the initial deployment of ARPANET was on no list of major news or trends of the year at the end of 1969, but history books will note it. ARPANET’s descendant, the Internet, has significantly widened the disconnect between what we think is important at the end of the year or at any given time and what becomes important as we gain a little perspective. Take, for example, this year’s lists of the topics most tweeted about, most linked to or most written about: the Kardashians, Duck Dynasty, twerking and the birth of another royal leech will make all these lists. The trivial has overwhelmed the important in the short term.

This time of year also brings many predictions on what will happen in the coming year in politics, culture, entertainment, sports and fashion. Most of these predictions aim at being clever or snarky and all are tinged with the ideology of the predictor.

So let me wade briefly into this morass of lists and assertions with a few observations:

We are on much stronger ground if we use the last day of the year to take stock of the current situation: to evaluate where we stand at this current moment. What I see is a country polarized by a series of issues or philosophical stances: the 1% (or 5%) versus everyone else; those who believe in a diverse society versus those who want to impose their morality and mores on everyone; empiricists who trust the findings of research versus the ultra-religious; those who want to help the poor versus those who think the poor are responsible for their lack of resources; those who believe that government has a role to play in the economy versus those who believe that the free market unfettered by regulation always works best for society. On many of these issues, most people lean to the progressive side in what they believe, but by giving play to both sides the news media keeps the side with the minority views not only alive but dominant. Further muddling the waters are racism and religion, which drive those whose economic interests would be better served by progressive policies into the arms of the free-marketeering plutocrats.

As to the past year, I want to optimistically propose that decades from now we will remember Edward Snowden as a hero and his revelations as one of the most important news events of the year and decade.  The unnecessary blips that marred the rollout of the Affordable Care Act will be forgotten, just as the snafus that accompanied the first days of Medicare and Medicare Part D disappeared from our collective memory. Unfortunately the victims and refugees of Syria, South Sudan, Iraq and elsewhere will also be forgotten. So will the 1.3 million people who lost unemployment benefits and the 1.9 million additional people who will lose their benefits by June if Congress doesn’t reverse itself in 2014.

Instead of making predictions for 2014, I want to close with a few wishes for the new year: I wish that extended unemployment benefits would be reinstated. I wish that Congress would remove the cap on income subject to the Social Security tax, thereby ending any future funding problem for America’s only reliable retirement plan. I wish that Congress would end all subsidies for nuclear, gas, oil and ethanol production and put the money into wind and solar generation of electricity. I wish that we would raise taxes on people with incomes of more than $200,000 enough to pay for our Iraq and Afghanistan fiascos. I wish that the federal government would end its use of drones and its widespread spying on all Americans. I wish that a significant number of people would get rid of their cars and use mass transit. I wish that democratic government would be reinstalled in Egypt, that there would be a peaceful overthrow of the Syrian regime and that Israel and the Palestinians would negotiate a lasting peace. But most of all, I wish that progressives would flood the polls in November and vote out the right-wingers.

And to all my readers, I wish a joyous and prosperous New Year.

Supporting free speech is one thing, but “standing with Phil” signals homophobia, sexism & racism

It’s one thing to support Phil Robertson’s constitutional right to free speech. It’s quite another thing to proclaim you “Stand with Phil,” which about 200,000 people have done in signing an electronic petition available at the Faith Driven Consumer website.

Saying that you stand with Phil means that you agree with his frequently expressed homophobic, racist and sexist views. I wonder how many of the 200,000 people who signed the petition understand that they have now insulted and demeaned real people—work associates, people they see in the supermarket, friends of their children. It’s possible that a number of them are like Sarah Palin and didn’t even read the remarks, but still knee-jerked in support of a celebrity they like.

Faith Driven Consumers, by the way, is a membership organization that claims to represent the 15% of the population who it says want to buy goods and services only from companies that actively support Christianity. The website posts reviews of businesses that analyze their commitment to the Christian faith. Under the fast food category, for example, the organization gives Chick-fil-A 4.5 stars for “leaning towards a Biblical (sic) view of the world” and McDonald’s 1.5 stars for “leaning against a Biblical view of the world.” Backyard Burger, whatever that is, earns 3 stars for a “mixed response.”

Here is what Faith Driven Consumers says about McDonald’s: “While it is making efforts to encourage healthier eating and to assist families in crisis through its Ronald McDonald House philanthropy, we can’t reconcile its celebration of the homosexual agenda and its promotion of abortion services with a corporate focus on catering to children and families.”

The agenda of Faith Driven Consumers sounds vaguely reminiscent of the 1930s, when the Nazis encouraged Germans not to shop at Jewish stores.

Perhaps more frightening than the exclusionary policies is the fact that the website gives no information about the organization’s leadership or backers. I can find nothing on the Internet about the founder and spokesperson, someone named Chris Stone. Faith Driven Consumers is not a nonprofit organization, meaning that it makes money from its recommendations, just like Angie’s List. Joining costs nothing and I see no solicitation for money or any place on the website to contribute, so the website and organization must be getting surreptitious backing. But from where? That’s the scary part.

The Reverend Jesse Jackson made an interesting observation that the Duck Dynasty Dude is worse than the bus driver who hassled Rosa Parks because the driver at least was following state law.

Phil Robertson thinks he’s following the law, too: his god’s law, which he believes forbids homosexuality and keeps women subservient to men.

Phil Robertson has his religion and Jesse Jackson has his, and in their hearts both believe that religious dictates supersede the laws of man.

But Jackson was talking not about the laws of god, which are subject to interpretation, but about the laws of man. Jackson is a leading figure in the civil disobedience movement, which is based on peacefully disobeying bad and immoral laws. His career has been built on confrontations with people who are just following orders. He understands that the man just has a job to do.

By contrast, Phil Robertson goes out of his way to say hurtful and insensitive things about minority groups and then tries to hide behind his narrow and harsh version of Christianity.

NY Times runs another Op/Ed column arguing science should not try to extend human lifespan

The New York Times opinion page seems to be on a full-bore campaign against radical extension of human life.

For the second time in less than a month, the Times has decided that the voices in favor of not pursuing life-extending technologies and therapies need to be heard. Three weeks ago it was so-called bioethicist Daniel Callahan who questioned the value of extending human life much beyond the 78 years that the average American now enjoys. Now the Times has found room for a column by Roger Cohen—a supporter of the U.S. invasion of Iraq and defender of Rupert Murdoch—to make exactly the same argument that Callahan made.

Like Callahan, Cohen depicts radical life extension not as a blessing and a sign of success as a species, but as a burden on society because of current limitations on both natural and medical resources and a lack of jobs in society. Cohen is unable to exercise even an iota of imagination to conjure a world run by renewable resources in which there is fairer distribution of the rewards of work, people have fewer children and everyone regardless of age has access to education, food, medical care and adequate shelter. All he sees are the problems of taking care of the elderly instead of the great joy that life can provide at any age.

Cohen cites statistics suggesting that 56% of Americans don’t want to undergo medical treatments to live to 120 or more. Of course the question is theoretical. I know a lot of very active people in their 80s and 90s—some with pain or illness, some without, but not one of them is sitting around waiting for or longing for death.

At the end of his article, Cohen waxes philosophical about the relationship between death and meaning. Like many before him, he claims that human life has no meaning without death. His exact words: “This resistance to the super-centenarian dream demonstrates good sense. Immortality — how tempting, how appalling! What a suffocating trick on the young! Death is feared, but it is death that makes time a living thing. Without it life becomes a featureless expanse. I fear death, up to a point, but would fear life without end far more.”

That’s fine for him, and I also know that many long for death because they believe in an afterlife that will be a better, happier place.

But for me, human life is the ultimate value and extending it and making it more comfortable is the greatest good. I for one would not be bored with a longer life, even with eternal life: I could study more about human history, human society, evolution and science. I could learn more languages. I could visit more of the world—at a more leisurely pace than current junkets abroad since I would have more time. I might even travel in space. I love playing games and watching sports, but even more, I get a great sensual pleasure out of preparing food and eating. As for sex—even if I ever became unable to achieve an erection, I would still take immense joy in the many other pleasures we label as sexual. Cohen says that death gives our lives meaning. I disagree: I think the knowledge we are going to die imbues all pleasure with melancholy or sadness. I’m not the first to express this belief—it was part of the philosophy of the ancient Roman and Greek Epicureans.

I love life and I don’t want anyone to take even a minute away from me. The thought that humans keep extending our lives through the pursuit of knowledge keeps me from despair. The idea that the human species could survive the destruction of the earth when the sun burns out by transporting large numbers of people to another planet in another solar system sustains my hope.

But I also realize that we have to change our ways for humans to survive as a species and for us to attain radical life extension for all. It will take a more equitable distribution of wealth, a focus on renewable resources, replacement of the accumulation of material things as the ultimate goal of life, an end to expensive and destructive wars, the basing of community decisions on science and not on convenience or the best interests of a few—in short it will take a repudiation of our wasteful, materialistic, war-mongering society. That’s something that those advocating against life extension don’t seem willing to contemplate.

Small problem with Joseph Epstein’s complaint about meritocracy: where is it?

Every once in a while, a white male who has made his living as a “responsible conservative” or a conservative parading as a centrist produces an article bemoaning the fact that we are now ruled by a meritocracy. Through the years, George Will, Irving Kristol and William Buckley Jr. can count themselves among the many so-called public intellectuals who have bemoaned the coming of the meritocracy.

The latest is Joseph Epstein, a long-time conflater of civic virtues with the rights of the privileged, in a Wall Street Journal article titled “The Late, Great American WASP.” Like most of his predecessors, Epstein contrasts the current meritocracy with the former system, in which the most powerful people were likely to be male, Protestant, of British descent, from wealthy and well-established families with many connections to business opportunities, and to have attended an Ivy League school. Epstein defines the WASPs as the ruling class that dominated politics, economics (by which I think he means business) and education until it was gradually replaced by a meritocracy starting after World War II. By putting a right-wing slant on carefully selected anecdotes, Epstein hopes to prove that when WASPs ruled we muddled through pretty well and that now that we have a meritocracy, as witnessed by the Clinton and Obama presidencies, we are pretty much going to hell in a handbasket.

The problem is that we do not have a real meritocracy, and certainly not in politics, business or education. Epstein can’t make his argument without this assumption, which is patently false.

In the days of WASP ascendancy, the most powerful people in most fields went to Ivy League or Ivy-type schools, and that’s still the case. If you don’t believe me, pick any field outside sports, even entertainment, and start investigating the backgrounds of the most powerful people in it. In all cases you’ll find an inordinate percentage, and often a majority, came from wealthy families or went to a top-echelon school, be it Harvard, Yale, Duke or Stanford.

In the old days, mostly rich and well-connected kids—kids from the ruling elite—got to go to this handful of schools, and that’s still the case. As many researchers have noted, legacies get bigger breaks in admissions decisions at Ivy League schools than do athletes and minorities. That’s what got our second President Bush into Yale (and his opponent in the 2000 election, Al Gore, into Harvard), a fact that Epstein ignores in substantiating his side argument that Bush II turned himself into a non-WASP.

There is a very good reason that so many kids who get into the top schools are wealthy: they have all the advantages. The latest research shows that kids from the poorest of backgrounds lose 10-13 IQ points because they have to dedicate too much of their brains to thinking about their next meal. That point spread spans the difference between being a smart kid and a genius. The wealthy have an edge over the middle classes because they can afford to spend more in the ever-escalating race to prepare children: the more money the family makes, the more likely the child will get special classes, travel abroad, summer camps with intellectual enrichment, SAT tutors, SAT prep courses, educational consultants and subject tutors. Wealthy parents are also more likely to make large contributions to the university.

Take a look at the statistics: the U.S. currently has less mobility between the classes and less upward mobility than at any time in more than a century. The social mobility in today’s United States is lower than that of any other westernized industrial or post-industrial nation. Poor people move up to the middle or upper classes less frequently here than in any of the nations that had royalty and a rigid class system for centuries.

Parts of our American society do operate as a meritocracy. Bill Clinton, Barack Obama and Joe Biden all prove that the brightest and most talented do achieve positions of power. Harvard, Yale and Stanford do accept the “best and the brightest” alongside the merely good who come from money. But that was always the case when the WASPs ruled as well. Even in the days of European royalty, even in the bad old slave days of ancient Rome, if you had a near photographic memory, could compute large sums instantaneously or displayed perfect musical pitch, the rich folk were going to find you and make sure you could help them run their society. That hasn’t changed one bit. But despite what you may have heard from your parents or may think about your own children, those extremely talented people are so rare as to be statistically irrelevant when discussing whether or not we have a meritocracy.

What has changed is that it’s not just the white males anymore in the positions of power. An increasingly ethnically and racially diverse ruling elite has emerged, but it is an elite based more on money and connections than on true merit.

Epstein’s argument fails both in its logic and in its details. He calls Laura Bush a “middle class librarian.” It’s true that Laura’s profession was/is librarian, but I would not call her background middle class by any means: her father was a home builder and successful real estate developer, two professions that lead to both wealth and power in the local economy. In his latest book, The Myth of Liberal Ascendancy, William Domhoff documents the enormous political influence that real estate interests have had on local and regional politics. By the way, Laura’s maiden name was Welch and her mom’s was Hawkins. She was raised as a Methodist. Sounds like an upper class (for Midland, Texas society) WASP to me.

Later in the article, Epstein claims that the two strongest presidents since 1950 are Truman, who never attended college, and Reagan, who went to the antithesis of Ivy—a small Christian college. Epstein states Truman and Reagan’s greatness matter-of-factly as if it’s common knowledge and readily accepted by most people. In the case of Reagan, believing that he was a great or a detestably awful president is a litmus test for political views: right-wingers and right-wingers-in-centrist-clothing rate him highly; progressives rate him as one of our worst presidents. Now most people do rate Truman highly, but I personally consider him the worst president in American history by virtue of his having approved dropping two atom bombs on civilian targets. The larger point is that Epstein pretends that his own opinion is evidence that the meritocracy doesn’t work as well as the old WASPocracy did.

Articulate and well-bred conservatives railing against the so-called meritocracy reflect the broader anti-intellectualism that the ruling elite imposes on American society via the mass media. But whereas the reason for the anti-intellectual message in movies and ads remains hidden, it stands out crystal clear in arguments such as Epstein’s: It’s about power. In a true meritocracy, the most talented are in charge in whatever the field, not the rich and connected. In even the least complex of agrarian societies, talent manifests itself as knowledge and the ability to accumulate and use knowledge. Conservatives represent traditional society in which the wealthy rule. They fear a society in which the most capable for each job gets that job as opposed to keeping themselves and their offspring in the best and best-paying positions. So when the wealthy aren’t busy buying up the best and the brightest to do their bidding and justify their hold on power, they try to disparage intellectual activity.

Thumbs up to A&E for suspending “Duck Dynasty” celebrity, thumbs down for ever creating the show

When Sean Hannity, Bobby Jindal, Sarah Palin and other right-wingers come out in favor of freedom of speech, you know that someone has just said something false, stupid and insulting about a group routinely demonized by ultra-conservatives.

In this case, these Christian right illuminati are standing up for a bearded and backward backwoodsman’s right to slur gays.

The latest right-wing freedom fighter to speak his mind and stand up for religious values is Phil Robertson, one of the stars of “Duck Dynasty,” a reality show about a family business that sells duck calls and other duck-hunting paraphernalia in the swampy backwoods of Louisiana. The Robertson family thrives by displaying rural values and wearing their fundamentalist Christianity on both their overalls and their long, untamed beards.

Robertson’s outrageous views emerged in answer to this question by a GQ interviewer, “What, in your mind, is sinful?” Robertson’s response was not that growing inequality was sinful, not that chemical warfare was sinful, not that cutting food stamp benefits for children was sinful, not that herding people into camps was sinful, not that torture or bombing civilians were sinful, not that paying immigrants less than minimum wage was sinful, not that polluting our atmosphere and waterways was sinful.

No, in answering this softball of a question, none of these horrible sins came top of mind to Robertson. What did was male homosexuality: “Start with homosexual behavior and just morph out from there. Bestiality, sleeping around with this woman and that woman and that woman and those men…It seems like, to me, a vagina—as a man—would be more desirable than a man’s anus. That’s just me. I’m just thinking: There’s more there! She’s got more to offer. I mean, come on, dudes! You know what I’m saying? But hey, sin: It’s not logical, my man. It’s just not logical.” Note that women never enter the picture except as preferred receptacle—it’s all about his antipathy to male homosexuality.

There can be no doubt that Robertson has the right to speak these ugly opinions. But shame on the public figures who have decided to select this particular instance to defend the right to free speech. I suppose it’s easier for them to defend his right to speak than to defend his views, which they may or may not believe but certainly want certain voters to think they believe.

And there can be no doubt that A&E had the right to suspend Robertson. I’m delighted they did, but whether they should have or not is not that interesting a question, certainly not as interesting as considering whether A&E ever should have run the series in the first place. “Duck Dynasty” is the most popular reality TV show ever on cable TV. Like all reality TV, storylines are scripted, so what we’re seeing is not reality, but a kind of cheaply-produced semi-fiction produced in a quasi-documentary style that lends a mantle of credibility to its insinuation that we are viewing reality. The great invention of reality TV is the divorcing of fame from any kind of standard: these people are not actors, sports stars, born wealthy or royalty. They haven’t even slept with the famous, as the Kardashians have. Like the Jersey wives, the Robertsons represent the purest form of celebrity—famous for nothing more than being famous.

A&E and the show’s producers have always sanitized and romanticized the harsh aspects of the Robertsons’ lives, even to the point of bleeping out “Jesus” from the speech of the bearded boys. Suspending Robertson is part of the continuing strategy to sand down the rough spots of rural American life. Besides, the network had no choice but to act quickly or risk a boycott of the entire network by sponsors and gay rights groups.

Moreover, A&E had everything to gain and nothing to lose by suspending Robertson. Those offended by Robertson’s views will never tune in or ceased watching a long time ago, but perhaps there are still people out there who haven’t watched yet and share Phil Robertson’s views. After all, even the premiere of the fourth season—the most watched nonfiction program in cable history—drew only 11.8 million viewers. That’s a drop in the bucket compared to the 45% of the population who believe homosexuality is a sin (or so reports a recent Pew study).

(Having lived only within the borders of large cities for more than 40 years, I find these numbers shocking, but in many ways we have two societies now: blue and red, urban and suburban, multicultural and religious fundamentalist. I’m a resident of the blue, urban, multicultural world and tend to interact only with others who share my views on social and political issues.)

The gay-bashing controversy also serves as this week’s “Duck Dynasty” media story. Only the Kardashians seem to get more stories about them than the Robertsons.

I won’t blame A&E for developing shows for the rural market, but I do blame it for developing these particular shows. Reality TV is the end game of the Warhol aesthetic—the apotheosis of branding elements into human deities called celebrities through a medium that has ostensibly avoided the distortions created by the artist’s mediation. But that avoidance is only apparent, since it is not reality we see but an imitation of reality made to seem real by the suppression of most artistic craft.

Suburbanites, denizens of new cities, rural hunters—every major demographic group gets its own lineup of reality TV in post-modern America. In all cases, the producers varnish reality and give it a dramatic shape that at the end of the day feeds on commercial activity and conspicuous consumption. You wouldn’t catch Snooki squatting in a duck blind, nor Phil Robertson clubbing in South Beach. But they represent the same value of undeserved celebrity selling mindless consumption.

NY Times uses anecdotal thinking to create the feeling that food stamp fraud is rampant in an article saying it’s minor

News features often use examples or anecdotes to highlight a trend that is the subject of the story. Sometimes all the writer has as proof of his or her thesis are the examples, so the article strings together a couple of anecdotes to demonstrate that a new trend is unfolding, such as people eating strange foods in expensive restaurants or craving limited-edition cosmetics. Quite often, though, the anecdote depicts a real trend; for example, more families in homeless shelters or the problems signing up for health insurance on an exchange.

In the case of either a real or a false trend, it is common for the article to start with an anecdote that shows us the trend or idea at work. Instead of saying, “people are eating ants,” we get a description of a dish or a pleased gourmand crunching away. Beginning inside an anecdote brings the story alive and makes the reader react emotionally before the mind engages with the facts of the matter. An early advocate of starting inside a case history instead of with a statement of thesis was the Roman poet Horace, who suggested in Ars Poetica about two thousand years ago that the writer “begin in the middle” (in medias res). Horace, like most great writers, understood that showing something is much more powerful than merely telling people about it.

How strange, then, that the New York Times would publish an article that reports a fact, but only provides case histories that run counter to that fact. Moreover, the article begins with a case history counter to the facts being reported, which means that by the time most readers get to the facts, the anecdote has convinced them of the very opposite of what the facts prove.

What isn’t surprising is that the article disproves a long-held right-wing belief and that the anecdotes in the article support the disproved belief.

The issue is food stamp fraud, people illegally using food stamps to buy liquor, gasoline or other forbidden items. In “Food Stamp Fraud, Rare but Troubling,” Kim Severson correctly reports that food stamp fraud is practically non-existent, a mere 1.3% of total food stamp aid, down from more than 4% in the 1990s before debit cards replaced paper food stamps. Compare this paltry 1.3% to 10%, the current figure for Medicare and Medicaid fraud (typically committed by physicians, a fact Severson’s article does not mention). Or compare the $3 billion lost to food stamp fraud, overpayments and government audits combined to the estimated $100 billion a year that insurance fraud costs insurers and their customers.

I’m not denying that the anecdotes occurred. Certainly, a relatively small number of people try to defraud the government by misusing food stamps, but the statistics suggest that the problem is practically non-existent and not worth mentioning or worrying about. The demagogues stating that food stamp fraud is an enormous problem are trying to promote antipathy toward recipients of social benefits, the so-called “welfare queens” accusation. The facts of the article demolish this view as it concerns food stamps.

We can only speculate on how this story developed: Did the editor assign the writer an article that would support the right-wing view that food stamp fraud is rampant, a reason they want to cut the program (and let hundreds of thousands face food insecurity) and did the facts turn the article a different way, leaving the writer with nothing but anecdotes to support the editor’s goal? Or was it the opposite: a conservative reporter trying to put a right-wing face on the facts through anecdotes that go counter to those facts?

Or did the writer pit anecdotes against facts as a way to present a “fair and balanced” story? If so, the writer forgot that anecdotes are as much like facts as apples are like oranges.

Unless, of course, the writer has read Daniel Kahneman’s Thinking, Fast and Slow, in which the eminent social scientist uses numerous controlled experiments to prove that people will believe a single anecdote that conforms to their ideas over multiple facts that disprove them. In other words, the writer could have cleverly constructed a story to influence the reader to believe the very thing that the article disproves by providing random anecdotes that run counter to the underlying facts. The facts say, “No food stamp fraud,” but the richly detailed case histories may convince us otherwise.

“Food Stamp Fraud, Rare but Troubling” is thus a masterpiece of deniable deception. The article claims to prove one thing—and it does, except for those internal heartstrings plucked so expertly by the anecdotes, which sing to right-wingers that they were right all along.

Detroit’s bankruptcy latest attempt of wealthy to steal from poor

Kudos to Ross Eisenbrey of the Economic Policy Institute for rejecting the notion that overly generous pensions led to Detroit’s bankruptcy.

Instead of pensions, Eisenbrey cites several reasons for Detroit’s financial problems:

  • A depleted revenue stream as wealthy people moved to nearby municipalities, taking advantage of the city as an economic driver while destroying the city’s tax base.
  • Bad financial deals with banks, including interest rate swaps: contracts in which two parties agree to exchange interest rate cash flows on a specified amount, trading a fixed rate for a floating rate, a floating rate for a fixed rate, or one floating rate for another (a minimal sketch of the mechanics follows this list). Each side is betting that a certain set of economic conditions will prevail so that it comes out ahead on the swap. As Eisenbrey details, these swaps were profitable for Wall Street banks and exposed Detroit to financial risks that ended up costing the city $600 million in additional interest.
  • Corporate subsidies and tax loopholes for businesses that did not create enough jobs to justify these gifts to private sector companies.
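
For readers unfamiliar with swaps, here is a minimal sketch in Python of how a fixed-for-floating deal works. Every number in it (the notional amount, the rates, the number of periods) is hypothetical, chosen only to show why falling interest rates turn the fixed side of a swap into a drain; these are not Detroit’s actual terms.

```python
# A minimal sketch of a fixed-for-floating interest rate swap.
# All terms here (notional, rates, periods) are hypothetical and
# illustrative only; they are not Detroit's actual swap terms.

def swap_net_payments(notional, fixed_rate, floating_rates):
    """Return the fixed payer's net cash flow for each period.

    Each period the fixed payer owes notional * fixed_rate and
    receives notional * floating_rate; only the difference changes
    hands. A negative number means the fixed payer loses that period.
    """
    return [notional * (floating - fixed_rate) for floating in floating_rates]

# Hypothetical: a city locks in a 5% fixed rate, effectively betting
# that floating rates will stay high. When rates collapsed after 2008,
# cities on the fixed side of such deals had to pay the difference.
payments = swap_net_payments(
    notional=100_000_000,                        # $100 million, illustrative
    fixed_rate=0.05,
    floating_rates=[0.052, 0.048, 0.02, 0.005],  # rates falling over time
)
for period, net in enumerate(payments, start=1):
    print(f"Period {period}: fixed payer nets ${net:,.0f}")
```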

Unmentioned by Eisenbrey is the fact that all three of these forces represent the same theme: rich folk squeezing a city dry of its wealth and then leaving it to flounder. Wealthy suburbanites benefited from living near Detroit without paying taxes to the city. Wealthy banks essentially benefited from selling Detroit’s politicians a bill of goods. Wealthy company owners lowered their operating costs without giving back enough in new jobs.

As Eisenbrey advocates, the burden of solving Detroit’s financial problems should not fall on the Motor City’s middle class and working class people, who have worked long years for pensions that they negotiated and upon which they depend to survive. Funny, isn’t it: while it’s not okay to break the financing contracts with the banks, politicians think nothing of breaking the contracts the city signed years ago with its workers. Eisenbrey wants Detroit to say “enough is enough” to the banks and walk away from the onerous interest rate swaps and other financing gimmicks. The banks have made enough money on the Motor City already.

Eisenbrey also wants to end the loopholes and special deals to corporations and have the state of Michigan chip in more money to pay Detroit’s bills. I would add a special regional tax based on income (or, as in France, on wealth) that the state would collect for the city from Bloomfield Hills, Grosse Pointe, Birmingham, Franklin and the other nearby and distant Detroit suburbs.

In his very perceptive article, Eisenbrey also suggests that Detroit’s emergency manager Kevyn Orr, Michigan Governor Rick Snyder and other civic leaders are mischaracterizing Detroit’s problems by focusing on the $18 billion in long-term debt the city owes. It’s another example of right-wing politicians defining the issue in terms that benefit their constituencies. Let’s set aside the possibility that $18 billion may be a grossly overstated estimate. Eisenbrey correctly reasons that municipalities cannot liquidate the way private companies can, so the size of the debt is not the issue. All that matters is the cash flow—how much money Detroit needs to pay its bills each month. Right now Detroit faces a $198 million cash flow shortage.

Cash flow is easy for municipalities to deal with, at least in theory: raise taxes or lower costs. The city has already cut costs not only to the bone, but to the marrow. Now it’s time to raise taxes, but on a regional level. For too long, wealthy suburbanites have sucked Detroit dry. It’s now time for them to give something back.

But that’s not going to happen. More likely, Detroit will become a model for the latest way for the rich to continue their 30+ year war on the rest of us: declare a city in financial trouble and use that excuse to gut pensions and workers’ salaries, thus putting even more downward pressure on the wages of private sector workers and ensuring the continuation of the low-tax regime that has a financial chokehold on most families.

Why did the FDA make its new antibiotic restrictions voluntary instead of mandatory?

Were you as delighted as I was when I read the headline that the Food and Drug Administration has a new policy prohibiting the use of antibiotics to speed the growth of pigs and other animals cultivated for human consumption? Trace antibiotics in the animals we eat have contributed to the increasing resistance of bacteria to the antibiotics we use to treat infections. The new policy forbids use of antibiotics as growth stimulants and also requires farmers to get prescriptions each and every time they want to treat a sick animal with antibiotics.

On the surface it looks like a great victory for every American because it is going to make all of us safer and less likely to die of an illness. The New York Times version of the announcement points out that two million people fall sick and 23,000 die every year from antibiotic-resistant infections. CNN reports that in April the FDA said that 81% of all the raw ground turkey the agency tested was contaminated with antibiotic-resistant bacteria. Currently every hospital patient encounters the danger of opportunistic infections that don’t respond to antibiotics.

Every one of the 15 news reports I read hails it as big news: “major new policy,” “broad measures” and “sweeping plan” are some of the descriptions of the FDA action.

But before we break out the champagne, let’s read the fine print: It’s all voluntary.

Virtually all the news stories bury this fact or downplay it. For example, the Times says that, based on comments made during the discussion period that precedes all federal regulations, rules and advisories, the FDA was confident that drug companies would comply (which I suppose means refusing to sell antibiotics to farmers without prescriptions for specific animals).

Then there’s the matter of a three-year phase-in period. No one has bothered to explain why anyone would need three years to implement this plan: just stop doing it, right away.

As some reports have noted, health officials have warned about the overuse of antibiotics leading to increased resistance since the 1970s. In other words, after 40 years of warnings, studies, discussions and negotiations regarding a major public health challenge, the best we can come up with is a voluntary plan.

Have no doubts about it: Some drug company somewhere in the world will continue to sell this stuff to farmers and farmers will still use it.

If the federal government were really serious about lowering the amount of antibiotics humans ingest in their food and water, it would have set mandatory regulations that took effect within 30 days. But such an action would take a cash stream from drug manufacturers and raise the cost of raising domesticated animals. Farmers and meat processors would make less money and consumers would likely pay a little more for their ground round and chicken nuggets. It’s worth it, though, as the eradication of the non-therapeutic use of antibiotics will make everyone in the United States (and the world) safer from the threat of contracting a life-threatening infection every time they have an operation and safer from the risk of an epidemic of virulent and untreatable infections.

Industry pressures most assuredly caused the wishy-washy action of asking drug makers to resist the urge to make more money. The news behind the news, then, is that once again our government has compromised the health, safety and economic well-being of its citizens to enable a small group of companies to continue making money. The additional illnesses and deaths are paid for by all of society, bringing down costs or raising profits for a small segment of society. It’s another example of shifting costs from companies to society at large, and it demonstrates once again that unfettered free market capitalism does not lead to the greatest good for the most people.

Serious economists must be laughing at Wall Street Journal attempt to use Laffer Curve to support tax cuts

Wall Street Journal editorials often twist facts, leave out key facts, make incorrect inferences from facts or just plain get the facts wrong.  But the editorial titled “Britain’s Laffer Curve” shows that sometimes the editorial writers simply have no idea what the facts are saying.

In this editorial, the Journal wants to show that cutting taxes leads to increased tax revenues and invokes the notorious Laffer Curve to do so. Laffer Curve theory has been around for ages but is associated with right-wing economist Arthur Laffer, who supposedly drew the curve on a paper cocktail napkin for some government luminaries during the 1970s. When I interviewed Laffer in 1981 for a television news report, he denied the myth.

What the Laffer Curve postulates is that as taxes are raised, less money circulates in the economy and rich folk are less likely to invest to make more money, since they are keeping so little of it. Research suggests that neither of these statements is true, but if we assume that they are, we can imagine a situation in which taxes are so high that by lowering them, you raise the amount of revenue the government collects. Laffer Curve theory proposes that there is a theoretical point at which the tax rate produces the most revenue possible from an economy. Laffer Curve theory also predicts that there are occasions when raising taxes will indeed raise significantly more revenue and lowering taxes will indeed lower revenues—it depends on whether we are on the upward or downward slope of the imaginary Laffer Curve.
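
To make the shape of the argument concrete, here is a toy numerical sketch. The way the tax base responds to the rate is an assumption invented purely for illustration, not an empirical claim about any economy.

```python
# A toy numerical sketch of the Laffer Curve idea described above.
# The way the tax base responds to the rate is a made-up assumption,
# chosen only to produce the curve's characteristic hump; it is not
# an empirical model of any economy.

def revenue(rate, base=100.0):
    """Revenue collected if the taxable base shrinks as the rate rises.

    At a 0% rate nothing is collected; at a 100% rate the base has
    (by assumption) shrunk to zero, so again nothing is collected.
    Somewhere in between lies the revenue-maximizing rate.
    """
    shrunken_base = base * (1.0 - rate)  # assumed response, illustrative only
    return rate * shrunken_base

for tenths in range(11):
    rate = tenths / 10
    print(f"rate {rate:.0%}: revenue {revenue(rate):5.1f}")

# Under this toy assumption the peak sits at a 50% rate. Below the peak,
# raising taxes raises revenue; above it, cutting taxes raises revenue.
# The right-wing argument assumes we are always above the peak.
```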

President Ronald Reagan and a slew of right-wingers since him have used the theory of the Laffer Curve to justify cutting taxes. They assume that no matter what the conditions are, we are always on the side of the imaginary Laffer Curve on which a cut in taxes always leads to an increase in revenues.

The Journal, of course, takes it for granted that taxes are always too high, especially on businesses, even though they are currently still much lower than during most of the last hundred years and certainly far lower than when Laffer supposedly took Montblanc to napkin.

The editorial in question proudly states that since Great Britain cut its corporate tax rate from 28% to 22% in 2010, the British Treasury has gained 45 to 60 cents in additional taxes for every dollar of revenue lost by cutting the tax rate. In other words, economic growth (or more people paying all their taxes) compensated for 45%-60% of the revenues lost through the tax cut.

Now that may or may not prove the existence of a Laffer Curve that describes the relationship between tax rates and taxes collected. But it does prove that you cannot use Laffer Curve economics to justify a tax break. Even after the Laffer Curve effects, the British government is still 40%-55% in the hole, meaning it must find other sources of revenue or cut government spending by that amount.
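
The arithmetic is worth spelling out. Here is a quick sketch using nothing but the editorial’s own figures:

```python
# A quick check of the arithmetic, using only the editorial's own figures:
# for every dollar of revenue given up by the rate cut, growth returned
# an estimated 45 to 60 cents to the Treasury.

for recovered in (0.45, 0.60):
    shortfall = 1.00 - recovered
    print(f"recovered {recovered:.0%} per dollar -> still short {shortfall:.0%}")

# recovered 45% per dollar -> still short 55%
# recovered 60% per dollar -> still short 40%
# Either way the cut loses revenue, which is the editorial's own evidence
# against using the Laffer Curve to justify it.
```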

And where did the shortfall go? To businesses and their owners, AKA rich folk, who history suggests will invest their additional wealth in the secondary stock market and luxury goods, neither of which really help the economy to grow.

The Journal wants us to believe that the experience of Britain should make us want to cut taxes to raise government revenues. But what the example shows is that cutting taxes leads to a loss of government revenue and a net transfer of an enormous amount of wealth from the poor and middle class to the wealthy. It’s as if the editorial writers have looked at a blue sky and declared, “Look at that blue sky. It proves that the sky is always yellow.” They see the facts, but that doesn’t dissuade them from believing what they want to believe.

Real economists the world over must be laughing at the Journal and its editorial board’s gross misinterpretation of the facts. Except, of course, those economists in the pay of right-wing think tanks.

Increase in adults reading juvenile fiction another sign of infantilization of Americans

The title of Alexandra Alter’s Wall Street Journal article on adults reading fiction written for middle-schoolers describes the situation perfectly. “See Grown-ups Read. Read, Grown-ups, Read” suggests not a middle school, but an elementary school reading level. Alter’s story describes one of the many ways that our mass culture is infantilizing adults, turning them into oversized children.

Alter finds several reasons why adults like reading fiction written at the reading, intellectual and maturity level of 12- to 15-year-olds:

  1. The Harry Potter series of books continues to influence reading choices.
  2. There is less of a difference in tastes between generations today than in the past.
  3. There is less of a stigma in adults reading children’s books for pleasure.
  4. The quality of literature for middle-school children has increased and the themes have become more mature.

The first three reasons are euphemistic ways to say that many adults now maintain or pursue the interests of childhood. Of course, Alter avoids the negative judgment implied—and meant—by my expression, “the infantilization of adults.” As one of the several experts Alter quotes puts it, “It used to be kids who would emulate what their parents were reading, and now it’s the reverse.”

The fourth reason is worth analyzing further. Let’s accept the premise that the quality of the writing in books for the middle school audience has improved and the themes and situations are more complex than in the past. The easy rhetorical response is that these books are still for children and not for adults. There is no stream-of-consciousness writing, no shifting of perspectives without signaling the shift (known as free indirect discourse), no long elegant Proustian sentences, and no modernistic imagery. Even today’s new and improved middle school fiction falls short of the best of fictional writing for adults. In addition, the themes covered are those of interest to the middle schooler and thus inherently less complicated than what should be of interest to adults.

Alter peppers the article with quotes from experts, but all of them are authors, editors or publishers of juvenile fiction. Nowhere does she find room for the views of a sociologist, psychologist or philosopher, who might fear, as I do, that adults are losing their capacity for complex thought by reverting to their childhood joys and activities, be it juvenile fiction, theme parks or shoot-shoot-bang-bang video games. In fine Wall Street Journal free-market tradition, the article is about a growing market. In the Journal’s view, all free markets are good and the results of free market growth are always good. The editorial slant of the newspaper reflects a modern version of Voltaire’s buffoonish professor, Dr. Pangloss. He’s the one who keeps repeating in Candide that everything is for the best in the best of all possible worlds. For the Journal, everything is for the best when the free market is operating.

Besides, the infantilization of adults is good for Journal advertisers and the American consumer economy in general. Infantilization makes people less able to understand the fine print, less able to judge whether what is for sale is really of value. It leaves people less in control of their emotions and more insecure and susceptible to manipulation, just as children and teens are when compared to mature adults. In short, it’s easier to sell products and services—especially useless ones—to the less mature mind.