The symbolism of a “slutwalk” is clear: No means no

The current issue of The Nation has an opinion piece by Salamishah Tillet that discusses a term of which I was only vaguely aware before: the slutwalk.

A slutwalk is an anti-rape march and street protest.  As I understand it, the women marching in a slutwalk dress in a provocative and revealing manner that shows plenty of skin and/or adorn themselves with the kind of cosmetics or hair styles often associated with women presumed to have a lot of sexual partners.

According to Tillet, the first slutwalk occurred last April in Toronto after a Toronto police officer told a group of students in a public safety class that women “should avoid dressing like sluts in order not to be victimized.” Tillet reports that there have now been more than 70 slutwalks all over the world, including in Chicago, Berlin, Cape Town, New Delhi and Mexico City.  New York City is holding one tomorrow, October 1.

I personally don’t care for women who don the attire and cosmetics associated with slutwalks.  No woman with whom I have ever been involved or in whom I have ever had a romantic interest has ever dressed in the slutwalk manner.  When women at my company have occasionally dressed that way, I have asked other female employees to speak with them about what constitutes a business-like appearance and why.

But I think the slutwalk is a wonderful statement of the simple fact that “no means no.” 

As we know from the news and entertainment media, defense attorneys in rape trials often accuse the rape victim of “having asked for it” by the way she dressed or presented herself, or try to establish that the woman had engaged in intimate relationships with many men, or with several men in the hours preceding the rape.  I have never understood either line of reasoning:

  • So what if a woman is dressing in a sexually arousing way?  We don’t excuse a murderer because the deceased pissed him/her off by their actions.  And we don’t excuse a thief because he is poor or has lost a lot of money gambling or in the stock market.  Civilized humans are supposed to be able to curb their instincts when to act upon them would be against the law or inappropriate for the situation.  This ability to discipline oneself is part of the essence of our humanity. Many philosophers through the ages would say it’s the primary factor that distinguishes us from other animals.

  • Why is it that a man who sleeps with a lot of women is called a stud, which has a positive connotation (unless he is married and prominent, in which case he is a “sex addict who needs treatment”), but a woman who sleeps with a lot of men is called a slut, which has a negative connotation? The stud-slut dichotomy was and is a major tenet of sexism and all right-thinking people should reject this double standard. Treat the goose and gander the same way.

Let me use the rhetorical technique called reductio ad absurdum—taking something to its most absurd conclusion—to make my point.  Let’s say a woman, of her own free will and not under the influence of foreign substances, has decided to have sex with 8 men in a row, with the other men watching while she is engaged with each, what is crudely called a “gang bang” or “pulling the chain.”  She has completed her business with 7 of the men and the 8th is about to take his turn. If she says “no” and he continues, it’s rape. Period. End of story.  That police departments, prosecutors, judges and juries don’t always see it that way is a continuing travesty of justice and makes a mockery of our concepts of freedom and free will.

I salute the women marching against rape in tomorrow’s slutwalk in New York and in all the slutwalks that have taken place or will take place.  These women are not saying that women should dress or act provocatively.  They’re saying that “no” means “no,” no matter what.  They’re saying that rape is not about anything other than violently forcing a woman to engage unwillingly in a sexual act. 

“No” means “no.”

The time has come to run warnings on TV commercials for fast food, junk food and processed food

When traveling in foreign lands I always tune into the local TV for a few hours each day.  I’m currently in the middle of a two-week trip to France (which explains why the OpEdge blogs have not been as topical as usual), and so have been watching some French TV.

For the most part, I can’t tell the difference between the offerings on French and American TV:  Reality TV, superficial news coverage, police procedurals, situation comedies and mediocre or popular movies dominate French television programming, some of it older episodes of American shows such as Grey’s Anatomy and Two and a Half Men (called “Uncle Charlie,” a name they’ll now have to change).  I even got to see the end of Police Academy I and the beginning of Police Academy II dubbed in French the other night.

The French TV commercials also resemble those on American TV.  The major products sold are cars and fast, junk and processed food products such as Kellogg’s cereal with chocolate and frozen bread. Not a commercial break went by in which I didn’t see a commercial for McDonald’s, although I have seen no commercial for any kind of alcohol.

But there is one enormous difference between French and American TV.  Every French commercial for fast, junk and processed food has a warning superimposed on the screen.  The warnings I have seen include (my translations):

  • “Avoid eating between meals.”

  • “Have at least five servings of fruits and vegetables each day.”

  • “For your health, avoid eating too much sugar and fat.”

  • “For your health, participate regularly in physical activity.”

Statistically speaking, the French are among the slimmest people in the developed world, but obesity there doubled from 1994 to 2003.  The French weight problem, however, pales compared to what we face in the United States. Here are the latest statistics I was able to find in a quick Internet search (with some rounding up or down):

Percent obese: United States, 34%; France, 11.3%

Percent overweight: United States, 34%; France, 31%

Obesity among children: United States, 17%; France, 3.8%

Ranking among 29 developed countries: United States, First!!; France, 23rd

And here’s the kicker: A few years ago, France became the first country in the European Union in which childhood obesity rates started to level off.

Obesity is bad for the individuals carrying the extra pounds and it’s bad for society in general.  Obesity (and to a lesser extent being overweight) has been linked to a number of serious health problems, including diabetes and heart disease.  Obesity also costs our society a lot of money, since people with weight-related health problems consume many more medical resources than those without a weight problem and thus jack up the cost of health insurance for everyone.

    The French government saw the problem and acted.  The French are doing many things to overcome their weight problem, but only the naïve (or those with a vested interest) could deny that a major step is to remind people of healthy eating habits every time they see tantalizing commercials for innutritious or less nutritious food.  TV is the main source of information for many people, and TV commercials use just about every tactic and technique of persuasion.

By contrast, the United States seems only to pay lip service to acknowledging and acting upon our collective weight problem:

    • There are no warnings on American TV commercials for fast, processed and junk food. 

• The Department of Agriculture’s food pyramid was turned into a confusing mess, and it remains to be seen if the new “food plate” will be any better.

• Our mass media stresses exercise as much as, if not more than, reduced food consumption as the key to losing weight.  Regular exercise is important for many reasons, but when it comes to weight loss, it’s absurd to put as much stress on exercise as on food consumption once you learn that it takes half an hour to an hour of moderate exercise to work off one donut.
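
The donut arithmetic in that last point can be checked with round numbers.  The calorie figures below are illustrative assumptions (a plain donut at roughly 250 kilocalories, moderate exercise such as brisk walking burning roughly 300 kilocalories per hour), not measured values:

```python
# Back-of-the-envelope check: how long does moderate exercise take
# to burn off one donut?  Both calorie figures are rough assumptions.
donut_kcal = 250          # assumed calories in one plain donut
burn_kcal_per_hour = 300  # assumed burn rate for moderate exercise

minutes_per_donut = donut_kcal / burn_kcal_per_hour * 60
print(f"About {minutes_per_donut:.0f} minutes of moderate exercise per donut")
```

Under these assumptions the answer lands around 50 minutes, squarely in the “half an hour to an hour” range the point above relies on.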

    Why the difference? Here’s my take: When the French government saw the problem, it acted in the best interest of its people, whereas the U.S. government has felt constrained by the lobbying efforts of processed, junk and fast food interests.  Both France and the United States have market economies, but the French are much more willing to intervene in the markets for the good of the country.

There is no such thing as an absolutely free market, despite what the right wing says.  We give subsidies to industries.  We enter into trade agreements.  We don’t allow people to steal from others and sell the stolen goods.  We enforce contracts.  We collect taxes.  All of these are constraints on the free market. To add one more and make processed, junk and fast food advertisers run warnings on their TV commercials seems like a no-brainer.

    After deciding to deep-six Islamic-tinged white bread music, did PA school board have Mexican fast food for lunch?

Around my house, when one wants to give an example of sappy and saccharine light classical music, one usually invokes Borodin’s “Polovtsian Dances.”  And when we want to disparage the kind of corny white-bread show-tune music of our parents’ generation, the go-to song is “Take my hand, I’m a stranger in paradise,” a hit from the 1955 movie of the musical Kismet, which uses Borodin’s dances and takes a small whiff of the Islamic orientalism of the 1001 Arabian Nights to primp up a standard western love story told unimaginatively.  Kismet is kind of like applying the “theme restaurant” approach to musical drama.  Instead of the pinch of cilantro of a Chili’s or a few icons of Italian décor in a Pizza Hut, Kismet gives us a little romantic jigger of the Near East.

    You’d think such an American chain-like recipe would be perfect for a rural western Pennsylvania school district looking for a safe play for the annual high school drama.  But the Richland School District has decided to scrap its plans to have the high-schoolers tackle Kismet after community members complained about the timing so soon after the 10th anniversary of the Sept. 11 attacks.

    I’m not sure what timing has to do with it.  The only thing offensive in Kismet is its very lack of offensiveness—something that the residents of rural areas would usually embrace, judging from the local restaurant, movie and radio selections.  In this case, “timing” is a code word for “anti-Islamic,” much as “support our troops” was usually a code phrase for “support this illegal and ill-considered war.”

What’s most disappointing is that the school district capitulated to a group of local ignoramuses.  Nowhere in the coverage are we told how many people actually complained.  I do know from past experience that organizations tend to capitulate too quickly to complaints and often draw conclusions from a very small sample.  The latest to fold to a small segment of the public was Netflix, which made a smart long-term business move by separating the fee for DVD rentals from that for unlimited program streaming.  When people complained, Netflix reacted with a stupid move: separating the two delivery mechanisms into two distinct companies.

    I remember when I was PR counsel for a large supermarket company, an advocacy organization with a name that included the word “American” wanted the client to put brown slip covers on copies of Cosmopolitan, GQ and other supposedly racy magazines that the supermarket displayed on its shelves.  An absurd request, since the material is far less risqué than what’s on TV and billboards.  Another major supermarket had recently agreed to this organization’s demands. 

Instead of knee-jerking to this unnecessary assault on First Amendment rights, I did some research.  I found out that in the previous three years, only one of the more than 50,000 complaints the supermarket had received had mentioned risqué magazine covers; I should point out that virtually all of the company’s stores were in rural areas or small cities, places in which one would be more likely to receive such a complaint.  The other fact I uncovered was that this foundation consisted of one individual who ran a website.  We did not fold, and we received no further complaints.

Would that the Richland School District had stood its ground!  Then I could have lodged my complaint that the school district has no business offering its students such pabulum as Kismet with South Pacific, My Fair Lady, H.M.S. Pinafore and West Side Story available.

Another survey of “best” cities fixes the outcome by selecting what it thinks is important for the good life

The latest mass media survey of the best cities in which to live again fixes the results by building bias into the criteria by which it measures things.  The fix is always in favor of an automobile-, mall- and chain store-based existence, even when considering city life.  This time, it’s a Bloomberg Businessweek study of the “best cities,” appearing earlier this week, that uses its list to communicate the ideological imperative of American consumerism.

    At first, Bloomberg teases us with the idea that it will be judging cities on what have traditionally been the virtues of cities (except for mass transit): “What if you could live in a city that offered a wealth of culture, entertainment, good schools, low crime and plenty of green space? Many people might opt for the obvious choices, such as New York or San Francisco, but, great as they are, data reveals there are other cities that are even better.”

But when it gets down to actual evaluation, Bloomberg relies on very few attributes that define traditional urban life:  “We looked at a range of positive metrics around quality of life, counted up restaurants, evaluated school scores, and considered the number of colleges and pro sports teams.”

    Here’s what they forgot:

    • Mass transit
    • Number of locally-owned non-chain restaurants (they only count the absolute number of restaurants)
    • Museums/monuments and architectural marvels (they only list pro sports teams)
    • Diversity
    • Average environmental footprint per resident
    • Public space, which includes more than parks, and does not include malls, which are private spaces
    • Access to the highest quality health care

Here is Bloomberg Businessweek’s list of “Best Cities” for those who want the suburban experience and don’t mind driving a lot and eating at a lot of chain restaurants:

    1. Raleigh, North Carolina
    2. Arlington, Virginia
    3. Honolulu, Hawaii
    4. Scottsdale, Arizona
    5. Irvine, California
    6. Washington, DC
    7. San Diego, California
    8. Virginia Beach, Virginia
    9. San Francisco, California
    10. Anchorage, Alaska

The Bloomberg list includes San Francisco, Irvine and Honolulu, and thus evidently does not measure cost of living.  It also includes a number of places that aren’t really cities but suburbs that depend on their proximity to cities for some of their high rating, e.g., Arlington, Scottsdale and Irvine. The only city not in the South or West is in Alaska. Only two have decent mass transit, Washington and San Francisco; and except for these two and parts of San Diego and Raleigh, all of these cities look and are laid out like car-loving suburbs.

Now I’d like to present my alternative list of America’s “Best Cities” for living, based on the bulleted items above, plus good schools (universities and secondary schools) and entertainment from the Bloomberg list.  Note that I am talking about cities in which you live within the city limits or can walk (not take the car) to a train (not a bus) into the city.  Also note that the high-cost cities on the list also tend to offer higher salaries, and that the few cities with mediocre mass transit demand only very short car trips and have a lot of walkable neighborhoods.

    OpEdge American Best Cities

    1. New York, New York
    2. Boston, Massachusetts
    3. Washington, DC
    4. Chicago, Illinois
    5. Philadelphia, Pennsylvania
    6. Pittsburgh, Pennsylvania
    7. San Francisco, California
    8. Seattle, Washington
    9. Milwaukee, Wisconsin
    10. Portland, Oregon

I didn’t mean for it to happen this way, but note that every city on my list is located in the bluest of blue states.  By contrast, with the exception of Honolulu and Washington, the Bloomberg cities are located in red states or the red-state part of California. As the French poet Baudelaire once put it, “To everyone, his (or her) illusion.”

    Is the Obama Administration planning to go to war in Somalia and Yemen?

    Reading the lead story on the front page of today’s New York Times certainly sent a shiver up and down my spine, as I’m sure it did for many people. 

    The Times story discusses a debate in administration circles on “whether the United States may take aim at only a handful of high-level leaders of militant groups who are personally linked to plots to attack the United States or whether it may also attack the thousands of low-level foot soldiers focused on parochial concerns: controlling the essentially ungoverned lands near the Gulf of Aden, which separates the countries.”  The Times goes on to state that the dispute over legal limits on the use of expanded lethal force in the region has divided the State Department and the Pentagon for months, but the article claims that the discussions are all theoretical since current administration policy is to attack only “high-value individuals” in the region.

The low-level soldiers involved in the “attack or not to attack” debate are in Somalia and Yemen, so essentially the theoretical discussion is, or may be, an implicit or veiled way to determine if President Obama has the legal right to wage war in Somalia and Yemen without the approval of Congress.

    In other words, the Obama Administration is considering a war against groups in Yemen and Somalia.

    In that context, the article is another attempt by The Times to float an extreme idea, something The Times likes to do a lot. For example, this past January The Times, in another front page lead story, floated the idea that states be allowed to go bankrupt in a way that would allow them to pay bondholders but break contracts and pension agreements with unions.

    Clearly, some powerful forces in the Obama Administration want us, once again, to expand the war on terrorism beyond its natural boundaries by attacking people on foreign soil who are uninvolved in terrorist acts.  It sounds like an idea from the Cheney-John Yoo branch of the Republican Party, and the very fact that the Administration is still discussing this option—even in theory— after months does not speak well for the Obama Presidency.

    Every time President Obama imitates the Republicans or gives away the store in a negotiation (or more typically before the talking starts), he turns off more of the Democratic base of progressives and unionized workers.  In doing so, he gains no political points with the right, which will hate him no matter what he does.  Instead of making courageous stands that he could defend in a winning reelection campaign, Obama prefers to create a lose-lose situation: he loses more of his base but gains nothing in return.  At this point, for Obama to tack center, he would have to move to the left.

     

    What was the Smithsonian thinking when it accepted money from the “UnHistory” Channel for a natural history exhibit?

    On my recent trip to Washington, D.C., I visited the wonderful Smithsonian National Museum of Natural History.  The museum excels at meeting its mission to dedicate itself “to inspiring curiosity, discovery, and learning about the natural world through its unparalleled research, collections, exhibitions, and education outreach programs.”  The many exhibits explore evolution, geography, mineralogy, biology, archeology, anthropology, oceanography, climatology and other natural sciences with verve and with very little of the dumbing down that often dooms natural history and science museums.  The collection of specimens, dinosaur bones, trilobite fossils, primate skulls, gems, stuffed animals and other artifacts is magnificent.

    What I want to explore today is a temporary exhibition we saw called “Written in Bone: Forensic Files of the 17th-Century Chesapeake,” which uses the methodologies of anthropologists to examine history through 17th-century bone biographies, including those of colonists on the edge of survival at Jamestown, Virginia and those of wealthy individuals in St. Mary’s City, Maryland.  “Written in Bone” depicts how researchers use recent technological breakthroughs to clarify and test the assumptions of colonial history.

    A very interesting exhibition, but why did the Smithsonian have to stoop to taking money from the History Channel and letting the History Channel co-sponsor it?

Go ahead and ask me what my objection is to the Smithsonian partnering with the History Channel, or the UnHistory Channel, as I like to call it.

    It all comes down to the programming. 

Let’s look, for example, at the UnHistory Channel’s daytime and prime-time shows last Sunday: documentaries about ice-road truckers, pawn brokers, the impact of aliens on ancient engineers, UFOs and Bigfoot.  Note that there is nothing that’s pure history or natural history.  Instead we see reality-show slumming mixed with irrational beliefs such as aliens affecting the Earth and imaginary creatures.

The weekday lineup isn’t much better.  This week’s morning and afternoon programming on the UnHistory Channel includes shows on comic book heroes, UFO hunters, people who scour thrift stores for bargains, the Templar code (popular because of its use in recent Dan Brown novels), “MonsterQuest,” which “uses the latest high-tech equipment to take a scientific look at legendary creatures around the world” and “God versus Satan,” which explores theological beliefs surrounding a mythical celestial battle called Armageddon.

    Weeknight primetime programming is no better: UFOs, ancient aliens, theology and pawn shops fill the bill, although later in the week, UnHistory Channel does have one prime-time show about the day after the September 11 attacks and another about New York City.

    I understand that public institutions have faced funding challenges since President Ronald Reagan put them on a starvation diet that continues unabated after 30+ years.  But why stoop to accepting money and thereby providing widespread positive publicity to an organization that seems dedicated to tearing down everything that the National Natural History Museum stands for?  By entering into a “partnership” with the UnHistory Channel, the Natural History Museum endorses that TV station’s programs and approach to programming, which essentially is to give the superstitious, the theocrats and the intellectually lazy what they want, instead of presenting truth and examining real historical and scientific controversies.

When a natural history museum accepts funding from banks, computer companies, oil companies and other large corporations, it sometimes finds itself dealing with entities that damage the environment or would prefer to deny the impact of global warming.  But denying global warming or drilling for oil in protected lands is not the mission of these companies; it’s a byproduct of their short-sighted approach to their real missions.  Like most people, these corporations are ethically complex creatures. They do many things, some bad and some very good.

    But the mission of the UnHistory Channel, as seen in its programming, is completely antithetical to any natural history museum.  Virtually every show the channel currently airs is harmful to legitimate science and history, because it 1) enables unscientific and unhistorical myths to fester; 2) passes off irrational belief as empirical science;  or 3) plays into the most trivial concerns of contemporary reality television such as pawn shops and those who drive trucks over ice.

Throughout the “Written in Bone” exhibition were little icons of UnHistory Channel sponsorship.  Perhaps because of the colonial connection, I looked at these icons as little badges of shame, much like Hawthorne’s scarlet letter on the dress of the adulteress, Hester Prynne.  The difference, of course, is that the fictional colonial Bostonian society forced Hester to wear her bright red A.  By contrast, the Smithsonian has willingly embraced its badges of shame.

    The Smithsonian should have passed on working with the UnHistory Channel on “Written in Bone,” and it should accept no future funding from the UnHistory Channel. 

With real unemployment more than 16%, why are experts worried that the labor force may be shrinking?

Perhaps to dishonor Labor Day, over the weekend Bloomberg published an article bemoaning the fact that the U.S. labor force may be shrinking.  What they are talking about is not the number of jobs but the number of people holding or looking for jobs.

The key fact presented by writers Steve Matthews and Joshua Zumbrun is that over the next 40 years, the labor force is expected to grow by a mere 0.6% a year, compared to the 2% a year growth rate from 1950 to 2008, at which point the great recession of 2008-20?? hit and the labor force began to shrink.

The article and its several experts and worried corporate leviathans argue that with fewer workers, there will be less income around to purchase stuff, and therefore the economy will shrink.  The experts present a perfect mirror image of the widely held belief that an economy can only prosper if it is growing and can only grow if the population grows.  They reduce economic planning to a homily: population increase leads to economic growth; population decrease leads to economic trouble. I would assert, however, that it’s possible for an economy to thrive in stasis, and in fact imperative that we pursue no-growth and negative-growth strategies to address resource depletion and human-induced global warming.

    But we don’t even have to question the much-employed shibboleth of growth to realize what’s behind the worries of the economic experts and corporations: they’re all afraid that wages will go up for the working stiff.

Let’s start by reviewing the reasons given in the article for the decline in the workforce (actually the stagnation of the workforce): fewer women in the workforce and the retirement of the enormous baby boom generation.

In other words, people are leaving the labor force but not dying or leaving the country.  And is that such a bad thing? Don’t we have an official unemployment rate of more than 9% and a real rate of unemployment (which includes those underemployed and those who have stopped looking for work) of more than 16%?  If the labor force shrinks, unemployment will shrink, won’t it? (And remember that for years the real estate bubble bloated our job rolls with all those unnecessary broker, appraiser, builder, banking, insurance and related jobs that will never return.)

    The women and retirees leaving or not entering the workforce aren’t going to disappear.  They will still need goods and services that their spouses, families, Social Security or retirement plans will finance. Now unless we have immigration, our population will shrink when baby boomers start to pass away in large numbers, but the Bloomberg article isn’t talking about population shrinkage.  It’s talking about the labor force shrinking relative to the population because of retirements and lifestyle changes.

An axiom of all economic theories, be they free market or Marxist, is that when the supply of any good or service decreases, its relative value to purchasers increases and so does its price.  And that’s what the large corporations, the economists who do their “big picture” thinking and the journalists who indoctrinate the world with their messages are all afraid of.  They are afraid that with fewer workers, they’ll eventually have to start paying more at every level, from burger flippers to physical therapists, from sanitation workers to (heaven forbid) school teachers!  It may even be possible to reverse the 30+-year trend of stagnant or falling wages that has plagued virtually everyone employed who doesn’t own or operate a business.

A case history of relative deprivation: Netflix raises its prices and people complain

Transport yourself back in time about 10 years, and assume you have some kind of a decent job.  With the price of movie tickets pushing eight or nine bucks and pay TV offering a disappointing selection of movies, wouldn’t you have paid $16 a month for unlimited movies over your computer and one DVD at a time delivered to your home, no muss, no fuss?

    Most people would have jumped on it like flies on honey. 

    But now, when Netflix is unbundling streaming movies from delivering DVDs by mail, lots of people, goaded by the news media, are bitching about the new Netflix prices.

The media is trying to whip up protests against Netflix and thereby is exaggerating both the nature and the impact of the price increase. It is completely absurd for the media to dwell on the 60% increase in price.  In absolute terms, it’s only a few bucks more for something that’s dirt cheap to begin with.  I’m certain that the extra six bucks (which you don’t have to pay, by the way, if you elect to receive either streaming or DVDs but not both) will hurt some financially pressed families, but when six bucks a month is the straw that breaks the financial back, Netflix is the least of your worries.

One report says that 30,000 people have made negative comments on the Netflix Facebook page, which has 1.5 million Facebook followers; Netflix has about 23 million members overall.  Last time I checked, the protesters amounted to two percent of the Facebook followers and a little more than one-tenth of one percent of all members, hardly a groundswell of discontent.  Of course, the media is used to overestimating the impact of minorities, as demonstrated by its incessant Tea Party coverage in the last election cycle.
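
Those proportions are easy to verify from the figures just cited (30,000 negative comments, 1.5 million Facebook followers, roughly 23 million members), a quick sanity check on the media’s framing:

```python
# Sanity-check the protest numbers reported above.
complaints = 30_000
facebook_followers = 1_500_000
netflix_members = 23_000_000

share_of_followers = 100 * complaints / facebook_followers
share_of_members = 100 * complaints / netflix_members

print(f"{share_of_followers:.1f}% of Facebook followers")  # 2.0%
print(f"{share_of_members:.2f}% of all members")           # 0.13%
```

Two percent of the Facebook audience, and barely a tenth of a percent of the membership, which is the point: a loud minority is not a groundswell.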

    But even if we discount the media feeding frenzy, it’s clear that a lot of people are pissed at Netflix, despite the fact that the product is still very cheap, especially for the frequent user.

    The Netflix incident represents a perfect example of the concept of relative deprivation.  Here’s how Wikipedia defines relative deprivation: “Relative deprivation is the experience of being deprived of something to which one believes oneself to be entitled to have. It refers to the discontent people feel when they compare their positions to others and realize that they have less than them. Schaefer defines it as ‘the conscious experience of a negative discrepancy between legitimate expectations and present actualities.’ It is a term used in social sciences to describe feelings or measures of economic, political or social deprivation that are relative rather than absolute.”

    Some perhaps not-so-hypothetical examples:

• You’re willing to take a new job for $55,000 a year until you hear that they offered it to someone else, who turned it down, at $67,000.  You therefore balk at accepting their offer to you of $62,000.
    • You’re used to paying $2.00 for a gallon of gasoline, even though you know Europeans are paying $4.50.  Now you’re angry because you have to pay $3.90 a gallon.
    • Five months after buying your first smartphone, you find yourself complaining because the Internet downloads are slower when you’re driving through a tunnel.

    Relative deprivation can make us resent a good situation or get us so used to a certain standard of living that we take unwarranted risks rather than cut back.  Relative deprivation is what happens when luxuries become necessities.

    To the savvy consumer, Netflix has offered an opportunity to save two dollars a month (or 20% using the media’s hyped-up numerology).  In the unbundling, the cost for getting one DVD at a time or for unlimited streaming fell to $8 a month each.  Consumers who primarily use one or the other service can cut their costs, and also cut their footprint of overall consumption. 

    That’s what we intended to do…but then…

    We started to switch to DVDs only, since more than 85% of the movies watched in our household are not currently streamable.  But then we saw that we could now get two DVDs at a time for just $12, so that’s what we did.  I’m guessing we’ll watch about 10-12 movies a month, all of which we’ll actually want to see, at a cost of around a dollar a movie.
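    The per-movie arithmetic behind that guess, as a quick sketch (monthly figures from the paragraph above):

```python
# Two-DVDs-at-a-time plan and an estimated 10-12 movies watched per month
monthly_cost = 12.00
movies_low, movies_high = 10, 12

# Per-movie cost range: fewer movies watched means a higher cost per movie
cost_per_movie_high = monthly_cost / movies_low
cost_per_movie_low = monthly_cost / movies_high

print(f"${cost_per_movie_low:.2f} to ${cost_per_movie_high:.2f} per movie")  # $1.00 to $1.20 per movie
```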

    Somehow I don’t feel deprived, relatively, absolutely or in any other way.

    The media’s obsession with deep-fried weird foods reflects a society fixated on consumption

    The last few days have seen this year’s two-day cycle of stories about deep-fried food at state fairs.  For the past several years, virtually all national and much local coverage of state fairs has been reduced to a gee-whiz feature on some new recipe to dip in batter and then deep-fry an unexpected common food or, as is often the case, food product.  

    This year’s batter-dipped-and-deep-fried morsel is deep-fried Kool-Aid; last year it was deep-fried beer. 

    Some other foods that have been given the deep-fried treatment at state fairs in recent years include cookie dough, butter, a banana split, peanut-butter-jelly-and-banana sandwich, pork chop, margarita, Twinkies (giving new life to the Twinkie defense: “I got indigestion, so I killed them”), cheese steak, corn on the cob, Klondike Bars (what would you do?), Snickers bar, Oreos, Girl Scout cookies, avocados and fresh peaches.  Perhaps the most Byzantine concoction is to deep-fry two donuts with a piece of chicken sandwiched between them! 

    Many of these food products come from the kitchen of a 300-pound food entrepreneur named Charlie Boghosian who sets up food booths at 400 state fairs and other festivals throughout the year.

    To a large degree, the media has replaced coverage of state fairs with coverage of other types of large summer gatherings such as music festivals.  And yet that one round of national and local news stories on state fairs persists, but now it’s always on the same topic: deep-fried food. 

    Nothing about 4H contests, butter sculpting, auto races, hoe-downs, horse shows, cowboy and agricultural exhibitions, amusement park rides, petting zoos, penny arcades, local bands, parades, craft sales, baking competitions or any of the other stuff I remember from the last state fair I attended some 25 years ago.   Beyond what the fair offers, there are all kinds of ways the media could cover state fairs on the national level, including noting long-term trends: that they are smaller and less important than they used to be, and that to a large degree they offer a homogeneity of entertainments, in part thanks to national vendors such as Charlie Boghosian. 

    But no trend stories, either.  It’s all about the deep-fried food.

    The media covers each year’s deep-fried fancy with a combination of prurient voyeurism and accident-gawking, all delivered with the kind of irony that says, “I’m both kidding and not kidding.” 

    All of the articles assume that these foods are popular without question and take it for granted that we now associate summer fairs with deep-fried weird food.  Few speculate as to why, and those who do all contrast the eating of the deep-fried fancy with some virtuous ideal of food consumption. Some examples:

    • “Many, though, relish in the experience of sinking their teeth into something so utterly unhealthy in complete rebellion against doctors’ orders and societal pressures to eat fresh vegetables and low-fat foods.”
    • “In a nation where every meal can sometimes seem a celebratory indulgence, the State Fair is a chance for Americans to drop all pretence of restraint and really make pigs of themselves.”
    • “But there’s something really unsettling about a food that consists of taking a chemically flavored powder and adding more and more unhealthy layers around it. Am I crazy? Have I been watching too much Jamie Oliver?”

    In the last case, the writer’s worry about the health implications of consuming deep-fried Kool-Aid becomes a symbol of her own assumed nutritional elitism, which takes the bite out of her scolding.  After questioning the healthiness of it all, she challenges her own credentials to pose such a question, since she’s a health nut who watches Jamie Oliver.  The anti-elitist (and therefore American) joys of anti-nutrition are suggested in one unspoken belch of subtext.

    What’s most interesting is that all of these comments are variations on the “guilty pleasure” theme which dominates advertising for high-end junk food and desserts.  In this case, the guilty pleasure is slumming at the fair. 

    For both sophisticated dark chocolate and down-market deep-fried Twinkies, advertising and media coverage sell the same concept, “guilty pleasure.”  For so-called diet food products, the theme morphs into “guiltless pleasure,” but the underlying thought process is the same.  Whether you feel like crap or are having a great time, you deserve to have something to eat.   Even if it does make you feel a little guilty.  After all, guilt is just another emotion that the Great American Instant Gratification Machine can help you assuage through consumption—perhaps some shopping, and maybe a little something more to eat.

    FCC recommendations to increase reporting depend on the free market that has gutted news coverage

    Don’t berate yourself for missing a study the Federal Communications Commission (FCC) released late last week.  As of this morning, Google News reports only 13 stories in total.  Most of the stories were based on a New York Times article; evidently the FCC released the study to the Times early, a frequent occurrence which often makes sense.

    The 478-page study, titled “The Information Needs of Communities,” demonstrates that the amount of news reported is shrinking rapidly, even as the number of news outlets is proliferating. The report seems to supply little if any original research, but analyzes a large number of recent studies and an enormous amount of anecdotal evidence.  The findings reveal that far less original reporting goes on than 10 years ago, with local coverage especially lacking:

    • Newspaper newsrooms have lost about 13,400 positions in the past four years.
    • Between 2006 and 2009, newspapers cut the amount they collectively spent on reporting each year by $1.6 billion.
    • Over the past 10 years, reporting has shrunk significantly in the following areas: state government, investigative and environmental coverage.  In addition, there is much less reporting done on the significance of national policy on the local community.

    As a study of the city of Baltimore released a year ago by the Pew Research Center’s Project for Excellence in Journalism shows, newspapers provide most of the original coverage, with other media usually reporting what they read in the newspaper or Associated Press.  Pew points out that newspapers account for 50% of original reporting, with TV accounting for another 30%, but as the FCC study suggests, much TV news is about crime and very little about politics, government or the economy.

    The FCC study does an excellent job in describing the proliferation of new media, virtually all of it Internet based, including on-line versions of traditional media, on-line media doing original reporting or aggregating other sources, blogs and social media.

    We thus have an increase in media and a decrease in reported news.  What’s filling the gap?  The FCC report does a less than stellar job in stating what the hundreds of thousands of new media outlets are feeding us, so I thought I would do it:

    • Opinion pieces, such as this very blog.
    • “Citizen’s reporting,” which is news reported by amateur journalists or passersby with cameras.  The FCC report does a good job of analyzing this phenomenon, both the amateurs who do investigative reporting and the citizen snapshots that end up on newspaper websites.  The report admits, however, that “citizen’s reporting” can never replace professional journalism.
    • Gotcha stories, such as when Sarah Palin tries to rewrite history, and YouTube/social media embarrassments.
    • Celebrity and mass entertainment news. I opened this piece by mentioning that Google News reported 13 mentions of the FCC study. On the day it was released, Google News reported approximately 24,000 media outlets doing stories on Lady Gaga.
    • Sports.
    • How-to stories, primarily related to relationships, shopping, finances, child-rearing, fitness and dieting.
    • Corporate shilling for products and services, some of which businesses pay for as if it were advertising and some of which stems from the need to fill the news and the readiness with which corporate PR types are able and willing to help.

    The FCC study rightly proclaims that “the independent watchdog function that the Founding Fathers envisioned—going so far as to call it crucial to a healthy democracy—is in some cases at risk at the local level.”  I guess the panel of experts, led by former journalist Steven Waldman, hasn’t read the New York Times or heard NPR lately—national and international news is also suffering.  The decline in reporting has occurred at all levels of the news, as the FCC study itself documents, so the risk inherent in having an uninformed citizenry exists on every level.

     The report’s recommendations to confront this problem make me imagine a physician who proposes to cure an allergic reaction to peanuts by feeding the patient more peanuts.  The peanut in this case is the unregulated free market.  Remember that many more companies used to own our TV and radio stations and our newspapers, because until the Telecommunications Act of 1996 deregulated the industry, there used to be strict limits on how many and what types of media outlets a single company could own. Fewer owners naturally lead to a consolidation of resources, which we have seen most dramatically in radio, where a small number of relatively right-wing owners have largely replaced local news talk shows and local news with Rush Limbaugh, Sean Hannity and their ilk.  

    For example, the FCC study recommends that “nonprofit media need to develop more sustainable business models, especially through private donations.”  To help, the study proposes giving tax breaks for donating to nonprofit media.  But who has the most money to donate? Rich folk and corporations, who will thus be able to influence nonprofit reporting.  Of course some would say that rich folk and corporations have always controlled for-profit media.  True enough, but with so few reporters and so few media owners, the problem is far worse today than in prior decades.

    If the FCC report really wanted to address the problem of a decline in original reporting, it would have advocated a return to the days when no single corporate entity could own so many media outlets.  It would also call for raising the federal budget for support of public broadcasting and nonprofit news media, particularly to support local news reporting.   It might have also proposed rules that require media to more prominently tell us when advertisers are paying for a story or when corporations provide the story as a video news release or an article.

    Some of the other recommendations that the FCC report advocates make a lot of sense:

    • Use the Internet to create greater government transparency so citizens can directly monitor institutions.
    • Spend more government advertising, such as armed forces recruitment, on local media.
    • Keep the Internet open and work towards achieving universal access to broadband Internet.
    • Take into account historically underserved communities when crafting rules and regulations. 

    All well and good, but none will encourage the news media to report more news.