What is the biggest cause of the drop in crime rates?

The latest statistics demonstrate that New York City’s draconian stop-and-frisk policy has not been the cause of the precipitous drop in the rate of violent crime in the five boroughs. Even after NYC’s finest curtailed stops and frisks made without cause, crime rates continued to plummet.

I’ve been meaning for some time to analyze why crime rates have dropped and continue to drop across the United States, especially in urban areas outside of Chicago. Despite the right’s wails and lamentations about unsafe communities, most of us live in far safer places than we did a decade or two ago. Interestingly enough, the crime rate is down most precipitously in that modern Sodom and Gomorrah, the Big Not-So-Rotten Apple.

Why has crime decreased?

First, I want to discount the idea that crime fell as a result of increased incarceration of individuals, victims of the many three-strikes-you’re-out and anti-crack laws passed in the late 70’s and 80’s. We have filled our prisons with a bunch of people—black males to a large extent—who don’t deserve to be incarcerated. All they have done is minor kid stuff or drugs. We have the highest incarceration rate in the Western world and yet we still have the highest rate of violent crime. No doubt, some small percentage of those locked up for years for tooting crack might have committed future crimes, but some percentage of those locked up learned criminal ways in prison and became lost to society. I’m thinking the net effect disproves the idea that locking up more people than any other industrialized nation led to a drop in crime rates.

One of the gun lobby’s many fantasies is that the increase in open carry and other gun rights leads to a decrease in crime, because criminals won’t want to run into someone who would shoot back. This absurd claim crumbles as soon as we look at the facts. Forget that incidents of citizens stopping criminals by pulling out a gun are extremely rare. Consider that the higher the prevalence of guns in any country in the world, the higher that country’s rate of deaths and injuries from guns. More guns equal more violent deaths. Also consider that while there are more guns out there now, fewer households own guns today than 20 years ago, continuing a trend that is more than 50 years old. Fewer people own more guns. I think it likely that the decline in the number of gun owners contributed to the drop in crime.

So far, I’ve considered some bogus arguments conservatives make about the drop in crime. Now let’s take a look at three legitimate arguments that I think have been factors in the continued drop in crime, though none is the primary cause.

Let’s start with the end of the use of lead paint. The theory goes that crime increased soon after we started using lead-based paint in apartment buildings, because children would eat the paint chips and suffer one or more of the side effects, which include learning disabilities resulting in decreased intelligence, attention deficit disorder and behavior issues, all predictors of criminal behavior. Once we stopped using lead paint, the crime rate went down (even though the rate of diagnosing ADD continues to soar). It’s a very believable theory backed by evidence that suggests but does not prove causality. Not enough research has been done on the effect of lead paint on human adherence to social norms, but the explanation does sound plausible.

We can also look at the growth of dispute resolution programs in the schools as another factor in lowering the rate of crime. I think it was some time in the 80’s when these programs began, first in urban areas. Having sixth-graders mentor first-graders, throwing middle school kids in with high schoolers, bringing together groups of students from different schools to talk about race, religion and other hate issues, the growth in organized sports leagues—all of this additional socialization had to turn many marginal children away from crime.

My own pet theory is that the growth of video game play helped to lower the crime rate. The idea is that people work out their anger and anti-social urges playing Grand Theft Auto and Call of Duty: Black Ops. So while I despair that most video games tend to infantilize young men, preventing their ideas and thought processes from maturing, I do think that the games have kept many young men busy and out of trouble.

I do reject one non-conservative theory: a professor has postulated that the legalization of abortion resulted in fewer unwanted children being born and that unwanted children commit more crimes. The problem with this theory is that the introduction of birth control pills assuredly prevented the birth of more children than did the legalization of abortion, yet the introduction of the pill paralleled the increase in the crime rate in the 1960’s and early 70’s, at least at first.

Lead paint, growth in socialization programs and video games all played a role in the decrease in crime, without being the main cause. Sociologists and historians who calculate crime rates across many cultures and centuries report that the rate of crime is primarily a function of the number of 16-29-year-old males in the population. Most crime is committed by young men, so the higher the percentage of young males in the population, the higher the crime rate.

The facts certainly match this theory until about 2003. When the Baby Boom turned 16, crime rates started to soar. Males aged 16-29 represented the largest percentage of our population in our nation’s history. When Generation X—otherwise known as the Baby Bust—started to turn 16 and Baby Boomers started turning middle-aged, crime rates started dropping. The birth rate increased again with the Millennial generation (AKA Generation Y, although judging from the high achievements of its female members, maybe Generation Non-Y is a better moniker!). But when the Millennials started turning 16, the crime rate did not pick up again.

My thought is that the impact of the Millennials on the overall population is far less than that of the truly outsized Baby Boom generation. So while we have more 16-29-year-old males, this demographic segment is not as great a percentage of the whole as it was at the height of Boomer young adulthood. The end of lead paint, greater socialization, the growth of video games, a decline in gun ownership and other factors still unidentified all combined to keep the crime rate going down. By this theory, if the Millennials had been as large a factor as the Baby Boom generation, the crime rate might have risen again, but not to Boomer levels, because of these additional factors.


Ayn Rand Institute factotum tries to convince us Wal-Mart pays its employees enough

Right-wing factotum Doug Altner, an analyst with the Ayn Rand Institute, poses a provocative question in a Forbes article: if Wal-Mart is such a crappy place to work, why do 1.4 million Americans work there, and why do many more want to?

It’s the type of question that extreme free-marketers love to ask, because it assumes that we really have a free market in which every market participant has equal rights and equal clout.

Altner forgets that the poor and the uneducated have very few options.  That 10,000 people apply for 300 openings when a new Wal-Mart opens says less about the attractiveness of the company and more about the lack of jobs, especially for young people who don’t have diplomas from one of the top 300 or so colleges. In many rural areas, Wal-Mart’s entrance into the marketplace destroyed many local and regional retailers who paid their employees more in wages and benefits. Wal-Mart does pay marginally more money than burger-flipping, but that does not mean that it pays a decent wage.

Altner gives three reasons why he thinks people clamor to work for Wal-Mart:

1.     The work is not physically strenuous.

What Altner writes is that “Many entry-level Walmart jobs consist of comparatively safe and non-strenuous work such as stocking shelves, working cash registers, and changing price labels.” Altner goes on to mention that these jobs pay more than other entry-level jobs that require few skills. What he doesn’t say is that other non-strenuous jobs pay much more money. I can’t remember the last time I worked up a sweat writing an article or meeting with a client. I do remember that an attorney and a company president with whom I recently met both looked absolutely exhausted—they had just come from a racquetball game at 10:00 a.m. on a weekday! Sarcasm aside, the fact that work is not physically exhausting shouldn’t justify paying less than a living wage.

2.     Wal-Mart provides entry to myriad career opportunities.

Altner points out that Wal-Mart tends to promote from within and that three-quarters of its store managers started as hourly workers (Altner uses the Wal-Mart euphemism and calls them “associates”). I’m not sure how Altner earned a PhD in industrial engineering, because he certainly forgot to do the math. Counting 1.3 million employees and about 5,000 Wal-Marts and Sam’s Clubs across the 50 states, a new hire at Wal-Mart has a less than three-tenths of one percent chance of becoming a store manager. But the 1.3 million number is the wrong one to use, since 70% of all Wal-Mart employees quit within a year, a stunning indicator of employee dissatisfaction which Altner neglects to mention. If we consider the competition for 75% of store manager jobs to be everyone hired in a 12-month period, then the statistical probability of becoming a store manager is significantly less than two-tenths of a percent. But the real odds are assuredly even lower, since the company does not turn over all store manager jobs within a year. Maybe the competition for store manager jobs includes everyone hired by Wal-Mart in a six-year period (less than 6/100ths of a percent) or even a 10-year period (less than 4/100ths of a percent). I’m guessing that if you consider everyone who draws a salary playing in all the major and minor leagues of baseball, basketball, soccer, hockey, race car driving, golf, tennis and every other professional sport, the chance of becoming a professional athlete is probably as great as the chance of starting as an entry-level Wal-Mart employee and becoming a store manager. In other words, the opportunity is there, but it’s far-off and very unlikely.
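
For readers who want to check the arithmetic, here is a back-of-the-envelope sketch in Python. The employee, store, promotion and turnover figures are the ones cited above; treating each multi-year hiring pool as the current headcount plus one replacement hire per departure is my simplifying assumption, not a figure from Altner or Wal-Mart.

```python
# Rough odds that a Wal-Mart hire ever becomes a store manager.
employees = 1_300_000               # current workforce (figure cited above)
stores = 5_000                      # Wal-Marts and Sam's Clubs (figure cited above)
internal_managers = 0.75 * stores   # three-quarters of managers rose from hourly jobs

# Odds for the current workforce alone:
print(f"current workforce: {internal_managers / employees:.2%}")  # ~0.29%

# With 70% of employees quitting within a year, far more people pass
# through entry-level jobs than hold them at any one time. Assumption:
# each departure is replaced by one new hire.
annual_hires = 0.70 * employees
for years in (1, 6, 10):
    pool = employees + years * annual_hires
    print(f"{years}-year hiring pool: {internal_managers / pool:.3%}")
```

Run it and the one-year pool comes out well under two-tenths of a percent and the 10-year pool under four-hundredths, in line with the estimates above.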

The article does not miss a beat in defending Wal-Mart. For example, the one example of an hourly worker who made it to the top is a woman named by Fortune Magazine as one of the 50 most powerful women of 2006. I’m sure that offers consolation and hope to the many female Wal-Mart employees who are suing the company for wage discrimination and a hostile work environment. He doesn’t say it, but Altner selects this woman as his example to fight the negative publicity Wal-Mart continues to get for its treatment of women.

3.     Others are waiting for the Wal-Mart jobs.

Altner declares that Wal-Mart can get away with paying low wages, so why should it pay more? He complains that critics think Wal-Mart should pay each worker more, “regardless of whether he has the skills or experience to justify such a wage, regardless of whether he is a model employee or a slouch, regardless of how many other individuals are willing and able to do his job for less, regardless of whether raising wages will be good for the company’s bottom line. In effect, their premise is that $12+ per hour wages shouldn’t have to be earned or justified; they should be dispensed like handouts.” He forgets that it is Wal-Mart that is getting a handout, because so many of its employees rely on government assistance programs to supplement their low wages.

Altner is really making a case for no minimum wage and, in fact, no government regulation.  He can’t really believe that Wal-Mart offers a great deal if 70% of employees quit within a year, unless he postulates that all 70% are “slouches.” The fact that others are waiting in line to take the jobs is no excuse for paying less. Many attorneys are clamoring for high-priced jobs at corporate law firms. I get dozens of resumes a month from people clamoring for a job at a public relations agency. These are high-paying jobs that stay high-paying despite the fact that lots of people want them.

For decades, Wal-Mart has taken advantage of the economic climate. It thrives by exploiting the fact that the minimum wage has lost its buying power to inflation and is much lower than it should be; the fact that current laws allow corporations to play games with part-time work to limit benefits; the fact that there are far fewer jobs than there are people seeking them. In this sense, the company is a bottom feeder that takes advantage of the poor and uneducated.

The heart of Ayn Rand’s philosophy is selfishness. In its discussion of Ayn Rand’s philosophy, the Ayn Rand Institute website writes, “Man—every man—is an end in himself, not the means to the ends of others. He must exist for his own sake, neither sacrificing himself to others nor sacrificing others to himself. The pursuit of his own rational self-interest and of his own happiness is the highest moral purpose of his life.”

I disagree entirely. All of our sustenance and pleasure—our necessities like food and shelter and our joys like music and sports—come through our interactions with society. We owe society—for roads and bridges, our electrical grids, our safety, our markets, our cultural norms and standards, our base of knowledge and all our goods and services. No one can live outside society or without other people. As a society, we owe everyone, and by this I mean everyone in the world, food, shelter, education, health care and a chance to better themselves.

As individuals, we manifest what we owe society by following its rules and regulations. For the good of the whole, we impede people’s rights to adulterate food or use inaccurate weights and measures. And we also impede people’s right to take advantage of the downtrodden. That’s what the minimum wage is all about.

If Wal-Mart (and many other employers) did not operate always and only on the principle of pure selfishness, there would be no need to unionize or to raise the minimum wage. But they do, so society and its members have every right to foster unionization and to set a minimum wage.

We are living in a particularly harsh time now, mainly because for too long we have let those like Doug Altner and other Randers set the tone of the political debate. We hear too much about the rights and prerogatives of corporations and rich folk and too little about the family of humans and how interconnected we all are.

Dunkin’ Donuts adds both extra sugar and salt to new hot chocolate-flavored concoction

The amazing thing about the diversity of manufactured food offerings in food stores and restaurants is the degree to which the proliferation of new food products leaves everything tasting the same: half salty and half sweet.

For one thing, there’s the sauce inflation at national chains for what is called “casual dining.” You know: Applebee’s, Red Lobster, Olive Garden, Outback. The fad now is to cook meat with one sauce, glaze or other covering and then pour both a cheese or cheese product and another sauce over the entire entree. For example, Olive Garden has its Steak Gorgonzola Alfredo and Pizza Fritters Napoli. Applebee’s has its Fiesta Lime Chicken and Four-Cheese Macaroni and Cheese with Honey Pepper Chicken Tenders. And check out the description of this sweet and salty delicacy from Outback: wood-fire grilled chicken breast topped with sautéed mushrooms, crisp bacon, melted Monterey Jack and cheddar and honey mustard sauce. The combinations of meats, sauces and spices always leave these dishes with both a salty and a sweet taste, sometimes with a slightly hot flavor, depending upon whether or not the dish is billed as “spicy.” There is no other taste and no subtlety of taste or aroma.

Forget about the added calories from the cheese and multiple sauces for the moment. Just think what happens when you put salt and sugar on everything—it all comes out tasting the same. Now I know that we only taste salt, sweet, sour, bitter and umami (meat flavor), but with the varied smells of food, it really is possible to create a myriad of taste combinations. Yet it seems as if all American food manufacturers want to do is make us taste only salt and sugar.

The latest salt-and-sugar concoction—at least as far as I can tell with my limited TV viewing and total abstinence from processed food and national chain restaurant offerings—is Dunkin’ Donuts’ (DD) “salted caramel hot chocolate.”

Caramel is the smoky, sugary flavor you get when you melt sugar at low heat. There is always some sugar or other sweetener in chocolate, but adding caramel pumps another jolt of sugar—or whatever sweetener DD uses—into the drink. Plus there’s the salty part: who wants to drink something salty anyhow? Salty food makes you want to drink something—I’ve found water is best to quench a salted thirst. So the effect of drinking DD’s salted caramel hot chocolate is to make you want to drink something else. A Dunkin’ coffee, anyone?

The small version of this beverage runs 220 calories and the extra large comes in at 550 calories, approximately one-quarter of what nutritional experts tell us the average adult male should eat in an entire day. While looking at the list of ingredients, keep in mind that ingredients are always listed in order of quantity—the first item is used the most in the food product, the second item second most, and so on.
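
Before turning to that list, here is a quick check of the calorie claim, sketched in Python. The drink calories are the figures above; the 2,200-2,600-calorie daily range for an average adult male is my assumption, since no specific daily figure is cited here.

```python
# Share of a day's calories in one drink, at a few plausible daily
# intakes for an average adult male (the 2,200-2,600 range is assumed).
drinks = {"small": 220, "extra large": 550}

for size, calories in drinks.items():
    for daily in (2_200, 2_600):
        print(f"{size}: {calories / daily:.0%} of a {daily:,}-calorie day")
```

The extra large lands between roughly 21% and 25%, about a quarter of a day’s calories, as stated above.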

Here are the ingredients in salted caramel hot chocolate: Water, Salted Caramel Flavored Hot Chocolate Powder {Sugar, Non Dairy Creamer [Partially Hydrogenated Coconut Oil, Corn Syrup Solids, Sodium Caseinate (a milk derivative), Dipotassium Phosphate, Sugar, Mono and Diglycerides, Sodium Silicoaluminate, Sodium Stearoyl Lactylate, Soy Lecithin, Artificial Flavor and Artificial Color (Annatto and Turmeric)], Nonfat Dry Milk, Cocoa processed with alkali, Natural and Artificial Flavor, Salt, Cellulose Gum, Silicon Dioxide, Sodium Citrate}.

The first thing to note is that what we have is a packet of dry chemicals that DD stirs into hot water. Some of the chemicals derive from the processing of plants or animals; others are completely artificial, in that the starting point wasn’t a food that grows, such as sugar or cocoa. Also note that cocoa is probably only the fifth most used ingredient in the dry mixture, behind the non-dairy creamer, which in and of itself has seven chemical parts, including corn syrup and even more sugar. I write “probably” because of all the stuff in the non-dairy creamer. Note, too, that there is more non-dairy creamer than nonfat dry milk. Thank goodness there is less salt than cocoa in this brew.

Drinking one, or even five, of these beverages composed of sugar, chemicals and salt won’t count toward the daily dark chocolate dose that keeps away heart disease. In fact, a DD salted caramel hot chocolate is one of the unhealthiest things you can put in your body. Drinking one defiles the body much as dumping radioactive waste defiles a wooded area or wetlands.

Don’t think that the sauces, glazes, cheese products and drizzles at national casual dining chains are any different. Dig deep into the ingredients of most of these concoctions and you’ll find a lot of chemicals, a lot of corn syrup and sugar and a lot of artificial flavors.

Of course, with all the potential carcinogens and all those unneeded empty calories comes that simultaneous hit of both sugar and salt which the food manufacturing industry has trained us to crave.

Pro-nuclear journalists nuke mainstream news media; fall-out is wasted time in addressing global warming

In the past week, both Eduardo Porter, a left-leaning columnist at the New York Times, and Hendrik Hertzberg, a centrist-looking-left columnist at the New Yorker, have advocated nuclear power as the necessary bridge to solar and wind power.

Both writers use the same arguments: we can’t produce enough energy—by which they mean electricity—from solar and wind, so we need nuclear to replace carbon-spewing coal and oil if we are to address global warming in time. Porter and Hertzberg make two assumptions: 1) we must wait for the market to develop for solar and wind, as opposed to making massive government investment to create the market; and 2) the only viable solutions involve central generation of electricity controlled by large companies, and decentralized solutions such as mini-generators in neighborhoods and solar panels for heating space and water are unacceptable.

But using nuclear energy to produce electricity strikes a bargain with a devil as pernicious to the earth as global warming. The safety problems with nuclear power are numerous and well documented. Briefly, a major accident spewing significant radiation has occurred somewhere in the world about every 10 years since the 1950’s, including Chalk River, Kyshtym, Three Mile Island, Chernobyl and Fukushima, to name some of the more notorious accidents.

There is also the issue of storage. The United States still hasn’t found a permanent place to store nuclear waste, so it sits at the plants or in temporary locations, smoldering and spewing radioactivity. The half-life of some nuclear waste is 25,000 years, which means that in 25,000 years half of the radioactivity will have dissipated. That’s more than twice the length of time that humans have had civilizations with written records and agriculture. How many people can speak the languages humans spoke 25,000 years ago? When I think of storage of nuclear waste, I always imagine future intelligent life discovering cavernous vaults with strange hieroglyphics on them, wondering what the symbols mean and eager to break open the vaults to see what is inside. They drill through, only to let the radioactivity escape and poison them and those in nearby settlements.
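
To put the 25,000-year figure in perspective, here is a minimal sketch of the standard exponential-decay arithmetic; the half-life is the one cited above, and the time points are arbitrary illustrations.

```python
# Fraction of radioactivity remaining after t years, for waste with a
# 25,000-year half-life: remaining = 0.5 ** (t / half_life).
half_life = 25_000  # years (figure cited above)

for years in (25_000, 50_000, 100_000, 250_000):
    remaining = 0.5 ** (years / half_life)
    print(f"after {years:>7,} years: {remaining:.2%} remaining")
```

Even after ten half-lives (250,000 years, an order of magnitude longer than recorded human history), about a tenth of a percent of the radioactivity remains.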

By spending money on building more plants for the nuclear generation of electricity, we deny resources that could make solar viable. Instead of the private sector investing in nuclear, why can’t the government increase taxes on the private sector’s use of coal and oil and use the funds to buy solar-based batteries, solar roof panels for heating federal buildings and other existing solar products? Making solar and wind more competitive does not have to involve only making them cheaper; it can also involve making coal and oil more expensive by raising taxes or withdrawing current tax breaks. The government could also give greater tax breaks for developing windmills and solar plants. It could pass a law that makes it harder for rich folk to pursue lawsuits because their view of the Atlantic is impeded by a windmill.

Why waste precious public and private resources on nuclear energy? Let’s go right to what must be the energy of the future—solar and wind.


What do Wal-Mart and Arne Duncan have in common? Neither understands that it’s all about the wages

Two stories floating around the news media lately both make me want to shake the principal actors and yell in their faces, “Raise wages and you’ll solve the problem.”

The first story involves Wal-Mart’s latest embarrassment—employees in its Canton, Ohio store organized a Thanksgiving food drive for fellow workers. This act of charity—by and for employees only—raised the question that pundits, labor leaders, left-leaning actors and supporters of the minimum wage are all asking: Does Wal-Mart pay its employees too little money?

The Canton food drive for Wal-Mart employees came on the heels of a report by Demos, the liberal think tank, finding that if Wal-Mart had not engaged in a stock buy-back program in recent years, it would have had the money to raise employee salaries by $5.83 an hour while keeping the same profit. My only problem with the report is that it doesn’t attack the profit margin, which is pretty fat for Wal-Mart and could be reduced as another way to pay employees a living wage.

At this point, Wal-Mart’s treatment of its employees has achieved near mythic notoriety in the mainstream and the left-leaning media. The food drive is merely this week’s “Wal-Mart doesn’t pay its employees enough” story. I’m sure many others are as tired as I am of shouting at the paper, TV, radio or computer screen, “Just do the decent thing and raise their salaries to $15 an hour!”

Perhaps not so many people were yelling at Secretary of Education Arne Duncan the other day when he announced a new public relations campaign by the Department of Education to get more kids to consider careers as school teachers. Other sponsors include the Advertising Council, Microsoft, State Farm Insurance, Teach for America, the nation’s two largest teachers’ unions and several other educational groups. The problem the campaign addresses is that the Baby Boom generation of teachers is beginning to retire, and many predict teacher shortages in the future.

If Arne Duncan doesn’t know it, maybe his friends at Microsoft and the multinational advertising agencies involved in the Advertising Council could tell him that it’s a simple matter to attract more—and more competent—people to a job or career. Just offer more money.

I suspect that Duncan is not entirely serious about attracting more people to the teaching profession, given his continued support of charter schools. From day one, the goal of the charter school movement has been to hammer down salaries of teachers by destroying public school unions.  We know that the big money funding the charter school movement doesn’t really care about quality education. Otherwise they would have pulled the plug on charter schools years ago, given that on average charter schools underperform public schools.

The equation is simple:

1. Charter schools pay less.

2. Thus, charter schools drive down teachers’ salaries.

3. Lower teacher salaries decrease interest in becoming a teacher.

If this esteemed group of government entities, companies and nonprofit organizations really wanted to build the next generation of school teachers, it would be bankrolling a campaign to make union organizing easier and to set high federal wage standards for all school teachers, public and private.

Both these stories come down to people with power scratching their heads and wondering what to do when the answer is standing right in front of them like a large, cold and hungry elephant, shivering and trumpeting loudly. PAY THEM MORE! It may mean taking a little less in profits, which are currently exorbitant. Or it may mean raising taxes. It doesn’t matter—those with jobs should make enough money to feed their families, and the professionals to whom we entrust our children should not have their wages reduced but raised to the rates at which we pay lawyers, accountants and other professionals. Pay teachers as much as we pay neurosurgeons and top PR execs, and we’ll have more people interested in the profession.

In our adulation of the dead JFK, let’s not forget almost every myth about him is false

In the tsunami of stories about the 50th anniversary of the assassination of President John F. Kennedy, no one yet has observed that JFK’s rise was one of the first and finest examples of manipulating the mass media to elect a major candidate.

In 1956, Kennedy was a back-bench Senator known for little more than being the son of a rich man and the right-wing alternative to a moderate Tennessean as Adlai Stevenson’s running mate. Then his family launched an incessant public relations program based on the question, “Can a Catholic be elected president?” It seemed as if every month some national magazine or prestigious newspaper was asking the question and answering mostly in the affirmative. In launching this PR campaign, the Kennedy family had one very large advantage: the family business was the largest advertiser in the mass media in the 1950’s.

After the first debate with Richard Nixon, the Kennedy PR machine shifted into fifth gear to focus the media conversation not on what the candidates said, but on how they said it and what they looked like. It was certainly the first time that issues—real or fabricated—took a back seat to style in discussing a major election. Likeability, that ineffable essence that the media later told us Bush II had and Al Gore did not, became a factor, and the news media made sure we liked JFK a lot more than we did RMN. Of course, they had some help from Tricky Dick himself!

Fifty years after his assassination, the Kennedy legend is mostly built on myths, the most significant and mendacious of which is that he was a liberal or a progressive. Kennedy came from a dark past: His father sympathized with the Nazis. His younger brother was a lawyer for Joseph McCarthy.

As president, Kennedy tended to favor the right wing. He called for decreasing taxes on the wealthy and corporations and for an increase in military spending. The two fiascos of his Administration—the Bay of Pigs invasion and the assassination of the head of the South Vietnamese government—were both examples of American imperialism and militarism. Both decisions came back to haunt our country for years, like the equally foolish decision to invade Iraq decades later.

In retrospect, Kennedy’s civil rights record was shabby. Yes, he was hobbled by his inability to manage Congress, but reviews of his administration’s actions in such books as Taylor Branch’s Parting the Waters suggest that Kennedy was always looking for an excuse to declare failure in Democratic attempts to pass civil rights legislation. Other books suggest that what finally moved Congress and the American people to pass civil rights and anti-poverty legislation was violence at the marches and riots in the inner cities, far more than the desire to fulfill the legacy of a martyred president.

Although Kennedy was born 30 years too early to be part of the Baby Boom generation, the fact that he was America’s youngest president when the Baby Boomers were reaching their teens did make it easy for Kennedy to become a symbol of a new, younger America. His public lifestyle and his rhetoric did seem to symbolize that youthful time, but his political actions did not represent youthful rebellion and idealism, but rather immature adventurism in foreign affairs and a middle-aged willingness to live with the status quo in everything else.

Part of the Kennedy myth is his personal glamour and elegance—but it was the glamour of rich folks spending their money on expensive stuff. The glamour was also part of the Kennedy PR machine, as exemplified by the first lady’s televised tour of the White House. I do, however, appreciate the fact that until Obama, Kennedy was our last president to cherish urban and urbane values. Between these two, all our presidents have wanted to be seen flipping sausages at a barbecue pit or chopping wood.

I do not believe someone’s personal life should enter into an accounting of his or her public legacy. I don’t care one way or another that Kennedy is reported to have bedded dozens if not hundreds of women. It has nothing to do with his ability to perform as president or his public legacy, unless the sex were not consensual or there were something else he did that indicated poor judgment or unacceptable behavior—underage women, hypocritically advocating celibacy while whoring around, sexual harassment or rape, for example. That Kennedy once forced a White House intern with whom he was having an affair to publicly fellate a Secret Service agent does not speak well of the man.

It probably helps Kennedy’s legacy that no president has died in office since he did. I remember many older family members telling me how heartbroken they were when Franklin Delano Roosevelt died in office. That occurred 18 years before Kennedy was assassinated and 22 years after Warren G. Harding died in office. It’s now been 50 years and, thank goodness, no one has supplanted Kennedy as “the president I remember dying in office.” The violence of the assassination heightens the sadness and sense of tragedy surrounding Kennedy, as well it should. That there are so many photographs and moving images of Kennedy makes it easy for even those born long after him to know him, or at least know his myths.

The persistent rumors of a conspiracy to assassinate JFK also contribute to his high visibility. In fact, most of the Kennedy myth has little to do with the public man. Just think of the ways that his life and death are being covered these past few weeks:

  • The details of the assassination
  • The conspiracy theories
  • His wealth and glamour
  • His sex life (helped by the fact that one of his paramours was a third-rate actress who had a habit of bedding famous and powerful men and after her death became another mass media martyr)
  • The excitement of the New Frontier
  • The sad fact that he didn’t have time to work on his political agenda.

Of course, there are a few stories of substance as well, mostly discussions of whether Kennedy would have escalated the war in Viet Nam. Typically, left-wingers say no and right-wingers say yes. In this case, the right is most certainly closer to the truth, based on all of Kennedy’s actions as president.

Like most public myths, the Kennedy myth is a vessel into which we can pour whatever beliefs we want. You can see him as right-wing or left-wing, cold warrior, dove, hawk, symbol of a more optimistic time, glamour god, sex symbol, sex pervert, leader of youth, friend to minorities—whatever you want.

My take on Kennedy is that he was a rich guy whose family spent a lot of money helping him obtain an office for which he was less than qualified.  His politics reflected the views of large corporations of his time, from lowering taxes on the wealthy to pursuing an aggressive imperialism throughout the world (For more on how large corporations controlled Kennedy, read G. William Domhoff’s recent The Myth of Liberal Ascendancy). He basically cared about power and his social class.  That he is beloved as one of our greatest presidents of all time is just another proof of the power of rich folk to manipulate the news media.

How often does the mass media exhort the public to imitate people who aren’t rich?

We’re seeing a very rare media trend this fall. Story after story in the style, living, home and even business sections of newspapers and websites are advising people to imitate individuals who aren’t famous and don’t earn a lot of money, maybe $30,000 to $57,000 a year.

The envied group we’re supposed to imitate consists of professional shoppers. At least that’s the conclusion I draw from typing “Black Friday shopping tips” into the Google search box. Of the 1.24 million results that come up, the first few pages are filled with articles that are going to teach us how to “shop like a pro,” by which the writers must mean professional shoppers, those low-paid gofers of party planners, marketing departments and rich folk.

Here is a sampling of articles in which we can learn how to “shop like a pro”:

  • “How to Shop on Black Friday Like a Pro” lays out three steps and three tips for shopping like a pro the day after Thanksgiving. Unfortunately, the writer and editor are less than pros and make a number of irritating syntactical errors, such as writing “your” instead of “you’re.”
  • “Shop Black Friday Like a Pro” starts with the premise that the readers—like the writer—love to shop for the Holidays.
  • “12 tips for shopping Black Friday like a pro” is a graduate seminar in how to shop during the Holidays. The last tip, however, places a dark cloud on the whole process (I write “process,” since when there are 12 steps, there must be a “process”): “Plan a nice brunch or other social gathering at the end of your trip, so you’ll have something to look forward to.” Wait a second. If, as the article claims earlier, you are so excited about shopping that you “are already salivating,” why do you need something to look forward to? Maybe professional shoppers are supposed to end their work days with a snack, kind of like reverse carbo-loading.
  • “5 Steps to Shopping Black Friday Like a Pro” advises people to have a Holiday shopping strategy.
  • “Black Friday Survival Guide – How to Shop Like a Pro” compares Black Friday to the Superbowl, but warns that on this “potentially dangerous and stressful day” you had better learn how to shop like a pro. Football serves as the appropriate analogy for the grim picture the author paints of waiting on line, running towards products, and pushing and shoving.
  • “3 Ways to Shop Black Friday Like a Pro” boils it down to the essentials of planning your route and coordinating with friends, so that one of you shops for certain items while the other looks for other things.

Those who aren’t satisfied merely to achieve professional status, though, may prefer “How to win Black Friday 2013: Tips from a master.” The article never tells us what it means to win, but if there are three things I know about 21st-century America, they are these:

  1. We like to shop
  2. Winning is fun
  3. We like to aspire to the pinnacle, such as the pinnacle of shopping professionalism, that exalted point at which others laud you as a “master.”

The “shop like a pro” theme doesn’t exhaust the ways in which writers are giving us advice for Black Friday. You can find tips, steps, ways, lessons, strategies, tactics and ideas in any quantity you like: 3, 5, 7, 10, 12 or 20. There are even 10 issues worth discussing in the eternal debate between shopping in person on Black Friday or online on Cyber Monday. Don’t worry—there are advantages to each.  That’s the great thing about America—you’re doing okay, as long as you’re shopping.

Interestingly enough, no one mentions products much in their advice on shopping. There are occasional references to tablets and video games, but mostly the products don’t matter—it’s all about the act of buying.

Traditionalists needn’t worry just yet. My Google News search of “Thanksgiving” yielded 158 million stories, as opposed to a mere 136 million for “Black Friday.” If we measure significance by number of Google hits, Thanksgiving is still the top Holiday of the last week of November. There are many how-to articles for Thanksgiving, too—how to roast a turkey, how to make a turducken, how to make gravy, how to plan a vegetarian Thanksgiving, how to address family disputes, how to decorate in a festive way. It all seems mundane and old-fashioned, though, compared to the thrilling rapture of pulling a credit card out of a wallet and handing it to a cashier.

But give it time. Black Friday is relatively new as a holiday. It is still developing its traditions and its history. In the future, perhaps, certain foods will become associated with Black Friday, like Mexican food with the Superbowl (my money is on hot turkey tacos). People will start telling stories of Black Friday the way they remember it in the good old days. The year Mom wrestled a PlayStation away from a 400-pound man. The year we roasted turkey on the portable grill in the Wal-Mart parking lot. And sooner or later, someone is going to figure out that, as with most other American holidays, the best way to celebrate Black Friday is to buy something for someone and give it to them. Yes, I can see the glorious day—glorious for retailers—when people exchange presents for Black Friday. And at that point, we’ll have to create a new holiday—the one on which we shop for Black Friday presents.

Nation’s Michelle Goldberg nails GOP for denying poverty exists

Michelle Goldberg puts a lot of facts together to reach her perceptive conclusions about Republican attitudes on American poverty in “Poverty Denialism,” in the latest Nation.

Goldberg quotes the usual suspects—FOX commentators, Bobby Jindal, Tea Party Congressmen—to demonstrate that the GOP denies poverty exists, and instead proposes that there are only a bunch of people who could work if they wanted to but prefer to sit at home getting fat on government handouts.

The article points out that the Republicans could blame President Obama for poverty, if they wanted to. But to do so would be to admit that poverty exists and require them to present a plan to deal with it. Goldberg recalls that Nixon, Kemp and even Bush II proposed initiatives to address poverty.  All the current Republicans want to do now is cut, cut, cut.  I urge you to read Goldberg’s article.

The sheer meanness in the attitudes of most Republicans is dazzling. They are willing to watch their fellow citizens starve so that taxes on the wealthy and corporations can remain low and go even lower.  It’s not even good economics, since the poor will spend all the money the government gives to them and thereby fuel the economy and create more jobs, whereas the wealthy recipients of GOP largess are just going to save it. But these Republicans don’t really care about creating new jobs or strengthening the economy. They want theirs. Thus they ignore the poor except to cut their food stamp benefits by 7% and to make it harder for them to get public assistance, or to make clever and cruel comments about their laziness.

The current Republican neglect and mocking of poverty represents the height of selfishness. But it only makes sense: Selfishness goes hand-in-glove with a materialistic culture in which the mass media and TV commercials tell you to indulge yourself all the time.

Of course, the same people who fund the GOP mostly control or own the companies that are advertising and producing our entertainment. It’s up to the American people to reject this selfish thinking and elect representatives who want to make sure that everyone in the country eats at night, can access needed medical care and has a chance to attend a quality school. As humans in a wealthy society, we have both a right to these basics and the responsibility to make sure that everyone has them. Republicans once knew it, but the current bunch seems to have forgotten.

Rollout and communications snafus don’t invalidate good of Affordable Care Act

So far, the rollout of the health exchanges—the heart of the Affordable Care Act (ACA)—has reminded me of the incompetence associated with the Bush II Administration. The interesting part of the Obama Administration’s bungling of the rollout is that the mistakes were made not in the heat of the moment, but months ago, when Barack Obama and his inner circle had time to think things through.

Waiting until the Supreme Court affirmed the ACA to begin writing the software and setting up the website was at the very least overly cautious. I would call it gutless, because it was not in the best interest of the United States and its citizens to wait, and the money saved would have been a drop in the bucket compared to the current deficit. Far better if the Administration had given the tech folks the time to do more extensive testing of the system. We should note, however, that as with many large websites of private sector companies, it’s very possible that the health exchange website would have encountered problems even with the extra time.

It’s a shame, because the state exchanges are mostly working and it’s primarily the people in the Republican-controlled states who have to wait until the federal website is fixed to sign up for health insurance.  Those are the same states in which the poor eligible for Medicaid coverage under the ACA won’t get coverage because their governors rejected the federally funded expansion of Medicaid in their states.

Perhaps more interesting to me is the mistake in messaging that the President made when it comes to the relatively small number of people who are losing their existing policies.

First the facts: ACA sets new standards for health insurance plans. Setting standards has been a government function since at least the Sumerians. It is the government that tells us how much an ounce must weigh and how much fat and extenders you can throw in and still call it “lean ground beef.” All indications are that in writing both the laws and the regulations, large health insurers had input into developing the new standards. The policies held by the 3 million who will have to change do not meet the basic standards established in the ACA. Many are lousy policies, not worth the paper upon which they’re printed.

What the President said—months ago—was that no one would lose their policies. What he should have said was that less than one percent of people would have to change policies because their policies were below the new standard. This more truthful statement would have set the bar of expectations at the appropriate level. Many of those who have to change policies would still be angry and frustrated, especially if they lived in Republican-controlled states and couldn’t get through on the federal website. But the truthful statement would not have provoked a scandal in the news media, nor would the rest of the public have been up in arms.

Setting the expectations of the audience is one of the basic principles of communications. The operating theory is the idea of “relative deprivation,” which basically states that people get angrier at being deprived of something than at never having had it. Contrast the positive reaction if you promise someone $80 and you give her $90 to the negative reaction if you promise someone $110 and you give her $100 for doing the same job. The public was led to believe that no one would lose their policies and now feels the frustrations of relative deprivation.

When I conduct seminars on communications, I advise my students—mostly executives and professionals—to set the audience’s expectations at the very beginning of the interaction. For example, before you open the floor to questions at a meeting of many people, always say that individuals will have the chance to ask only one question and one quick follow-up question until everyone has had a chance to pose questions. Without setting that expectation, if someone tries to grab control of the meeting by barraging you with questions and long comments, the audience may think you’re trying to suppress discussion when you try to stop him. But if you have set expectations, when the demagogue tries to take control, the audience will be on your side and shout out, “Give someone else a turn.” I’ve seen both scenarios play out multiple times.

In a sense, the website snafu is also a failure to meet expectations. When an organization announces a website is up and running, the expectation is that it will work.

I’m fairly confident that the website will get fixed and that the health exchanges, new standards and other features of the new law will lead to many more Americans being covered by good health insurance plans, an improvement in the nation’s health and a decline in the cost of medical care. The Affordable Care Act will work, but it’s unfortunate that before it does, the Administration has to learn some basic lessons in setting expectations…and meeting them.

Jewelry ads pop up – a sign that America’s favorite holiday—Black Friday—is fast approaching

On the side of the highway into Pittsburgh today, I passed a billboard for Orr’s, a regional jewelry chain, and it reminded me that we are about to be inundated with ads cajoling us to buy bits of rock embedded in metal and other functionless baubles.

The Orr’s billboard is mostly white, with black lettering focused on the words “Stephen Webster,” who a quick trip to the Internet told me is a British jewelry designer. On the right is a highly stylized photograph of a nearly bald woman with her head and neck twisted upwards, almost in the elongated style of the Italian Mannerist painter Pontormo. The only colors on the billboard are the jewels in the rings and earrings she wears. But the figure blends so much into the white background that all people in passing cars can really digest is the words “Stephen Webster.”

The unspoken message that Orr’s assumes we will get is that Stephen Webster is the equivalent of Cadillac or Apple, a premium brand. But I didn’t know it, nor did the other person in the car, nor did anyone in my office, nor does anyone except those interested in jewelry or perhaps design in general. Thus, what Orr’s is selling is not the “Stephen Webster” brand, but the fact that Orr’s has the brands. A quick trip to the Orr’s website confirms this analysis: the home page features a long horizontal billboard space in which ads for Stephen Webster, Henri Daussi, Roberto Coin, Cartier (the only one I recognized) and Marco Bicego rotate in succession.

Orr’s basic marketing message—we have the best brands—got me thinking of other approaches that jewelry stores take to selling what are essentially luxury and, to my mind, frivolous products with arbitrary value.

We are starting to be bombarded by TV ads from Jared, a national chain that primarily places its stores in malls. For years, Jared has used the line, “He went to Jared.” The line is whispered as lascivious gossip between neighbors, screamed to best friends and sisters, intoned seriously by admiring but envious buddies. The context is always other people and their reactions. Whether you label the operative behavior as “keeping up with the Joneses” or “if you’ve got it, flaunt it,” the message Jared is trying to send is: come to Jared to make sure your friends and neighbors respect, honor, envy and like you.

Thorstein Veblen’s “Theory of the Leisure Class” seems to be the operating theory here. For Veblen, the leisure class engages in conspicuous consumption for the sole purpose of making individual distinctions based on financial wealth. You show you are worth something by buying your spouse “diamonds as big as horse turds,” as my father used to put it. In this case, you make sure that everyone knows you are prosperous by going to Jared.

Jared has been using the slogan for years, so it must be working. But around even longer and working even better, I think, is the slogan of another national chain, Kay’s. Their line, sung in commercials at the moment when the presentation of a ring, earrings, bracelet or watch leads to an embrace, is “Every kiss begins with Kay.”

The reason I like the Kay’s approach—selling sex—is that it comes closer to the reason that most jewelry (other than graduation watches and sweet-16 charm bracelets) is purchased: to give to a beloved to symbolize a sexual relationship. Yes, some people want to make sure that they have the top quality, whatever that means, and have been trained to associate brands with quality. But they aren’t buying jewelry to get a brand. And yes, keeping up with or surpassing the Joneses is a big motivation for many people when they purchase jewelry, but it’s only a secondary motivation. No one buys an engagement ring to please the family (although he or she may have proposed because of family or societal pressure). They buy the ring or the earrings or whatever for the loved one. They may select one item over another because they know it’s better than what the Joneses have, but only after the decision to buy has been made.

The causal connection between giving someone jewelry and engaging in sexual relations is strongly rooted in our society precisely for the reasons that Veblen details. It is a form of display that is supposed to make the wearer more attractive and more of a status symbol for the giver. The exchange of rings symbolizes marriage, which is a public construct and the traditionally sanctioned locus for sexual relations. We give a ring to mark the engagement, as well, and for key anniversaries. We are brainwashed that other types of jewelry are the go-to gift for the spouse.

In a real sense, jewelry commoditizes romantic relationships, which means it turns romance into something that can be bought and sold.  Instead of buying conjugal rights, you buy the symbol and give that to the object of affection.  In the world of symbols and hidden meanings by which our society lives, every kiss does (or can) begin with the presentation of jewelry. Kay’s hits the bull’s eye.