Over time, derogatory labels such as nerd and queer warp, transform and become terms of pride.

Ian Simpson gave a very reasoned response to my recent blog entry about anti-intellectualism in mass culture.  His comment regarded the word, “nerd”: “The term ‘nerd’ is often worn as a badge of honor these days by a substantial subset (myself, included). In the digital age, being a nerd is actually kind of cool. The tech-savvy hacker is this year’s model, and the jocks and buttoned-down frat rats get their comeuppance in the end (at least in the movies)…all the same, being a nerd isn’t quite as bad as it used to be.”

Everything Ian says is absolutely true and on point.

Ian’s response reminded me of two other words that once were considered to be completely derogatory, queer and the “n” word.

A long time ago, “queer” meant someone who was odd or strange, but sometime during the 20th century the meaning devolved to “someone strange sexually,” that is to say, GLBT.  When I was reaching manhood in the late 60’s and early 70’s, queer was a very derogatory term, one that I was afraid to use around anyone who was homosexual, or around anyone else for that matter, for fear that someone would think I was anti-gay.  I still hate using or hearing the word.  It grates on my ears, like long nails on a chalkboard.

But in fact, queer is no longer a derogatory term among GLBT and enlightened heterosexuals and asexuals.  “Queer as folk”, “queer studies” and “queer theology” all attest to the change in use of the term.  For many GLBT, queer is now a label of honor.

Those benighted non-souls who still hold deep-seated prejudices against GLBT individuals, however, still use queer as a poisonous curse word.

The “n” word has seen a slightly different evolution.  To my recollection, in the 60’s and 70’s, it was never very cool to say the “n” word, although I heard it a lot from other whites.  I remember that by 1972, it was better not to use the word, “Negro,” which gave way first to “Black,” and then to “African-American.”

And yet African-Americans now will use the “n” word with and to other African-Americans, especially in typical male bonding environments, such as sports fields, locker rooms, taverns, dens and back yards.  But it is still anathema for a non-African-American to say, “n—.” It’s about the only word that I am too embarrassed to write out in an essay, although I would write it out if I had a character say it in a work of fiction.  I think my squeamish attitude reflects the attitude of Americans not of sub-Saharan African origin, except for that sizable group of unrepentant virulent racists.

On the surface, it looks as if “nerd” shares the same fate as “queer”: a derogatory term is now worn as a badge of pride by the people it describes, while remaining an insult to those with prejudices and resentments against the group.

But note this important difference in the original meanings of “queer” and “nerd”: In the case of “queer,” its meaning was accurate—someone who is not a heterosexual.  A lot of the false and malicious baggage that was attached to GLBT people decades ago, and unfortunately still today, also attached itself to the term “queer.”  But “queer” always meant, and still means, someone who is GLBT.

But the original definition of “nerd” was, and is, false!  “An unstylish, unattractive, or socially inept person; especially: one slavishly devoted to intellectual or academic pursuits.” 

There are plenty of very handsome men and beautiful women who are intellectuals and get good grades.  Many intellectuals and academics are also great athletes.  

The mass media wants us to believe that those who are very smart or academic are not attractive to the opposite sex.  The term “nerd” hasn’t escaped that image.  Just think of phenomena such as nerd love jokes and nerd quotes.  So even when the nerd gets the girl (or guy), the nerd love experience imagined in the mass media is an awkward one.

Of course, as Ian and most avid book-readers already know, that image is ridiculously untrue.

Call for nominations for OpEdge’s Ketchup Award, named after the condiment Reagan officials called a vegetable serving.

Many people act as if they really believe the first line of St. John’s gospel, “In the beginning was the word.”  They think that by using a word or phrase they can create or deform the reality being described, or make people look at it from a different perspective.

For example, when corporations started using the term “downsizing” to describe massive layoffs of employees, the idea was to conceal the human misery that layoffs cause by shifting the focus from the people to the ephemeral entity that is a corporation.  With this newly created compound noun, they sought to replace the message, “2,000 people are losing their jobs” with the more positive message, “The company is getting smaller (and stronger).”

Of course, most people saw through the ruse, so in time another new phrase entered the lexicon of terms to describe massive firings: right-sizing.  People quickly saw through that one, too.

(A quick note: companies sometimes do have to terminate the employment of many people when changed market conditions or foolish moves by management threaten the continued operation of the business.  What I’m talking about here is the language they use, and not the actions they take.)

Examples of these euphemisms are everywhere: “pre-owned” to describe a used vehicle; “police action” to describe a war; “special methods of questioning” and “refined interrogation techniques” to describe torture.    

Sometimes, the replacement term is a piece of jargon that sounds weird until it is repeated endlessly, such as the use of the word “product” to describe something intangible like insurance, software or a professional service.

So far, most of the examples I’ve given are simple euphemisms: synonyms that pretty up the situation or concept.  Sometimes, though, the new term is meant to manipulate or completely distort, usually for an ulterior motive.  My favorite example of all time is the Reagan Administration’s attempt to classify as a vegetable that goopy combination of tomato paste and corn syrup we know as ketchup.  Reagan’s folks wanted to define ketchup as a vegetable and not what it is, a condiment, so that they could cut the budget for the school lunch program and still say that the children were getting a balanced, healthy meal.  Never mind the low nutritional value of ketchup compared to fresh or canned tomatoes, green beans, carrots, kale or other real vegetables (not to mention that to constitute one serving of vegetable, someone would have to choke down a half-cup of ketchup).

Other times, the new label is an out-and-out lie, as when earlier this year Tea Party elder mis-statesman and former Congressman Dick Armey said that the founders of Jamestown, all capitalists to their core, were socialists.  Armey turned these early American entrepreneurs into socialists rather than admit that a capitalist venture could ever fail, and to hammer home his false message that any and every economic failure must stem from socialistic actions.

As a writer and a student of language and society, I find these new words and phrases to be quite fascinating, especially when they spread lies or manipulate the public.  That’s why I decided to bestow an award each year on the weirdest, funniest and/or most manipulative new or newly reported label, word or phrase used by an organization or individual to distort or recreate reality.

It’s called the Ketchup Award, after the Reagan Administration’s favorite vegetable, and I’m asking my readers to send me nominations by December 31.  I’ve mentioned the awards twice on OpEdge, so consider this blog entry the final call for submissions.

If you would like to nominate a new or newly reported distortion for the first annual Ketchup Awards, just post it in a comment on one of my blog entries, send your nomination to the OpEdge Facebook page or email ketchupawards@gmail.com.  Please include the phrase and the person or organization who said it in your nomination.  No need to include any links, but keep in mind that my staff and I will have to verify the word or phrase, who said it and that it was actually said in 2010, and a link will make it much easier for us to do so.

In a special blog entry on or around January 15, 2011, I will list at least 10 finalists and make three awards:  3rd Place gets One Dollop; 2nd Place Two Dollops; and the grand prize winner will get The Full Squeeze. 

There will be no prize for the submitter of the winning entries, except for the recognition you will receive on OpEdge and the warm feeling you’ll get inside knowing that you have helped to unmask a charlatan.

Thanks in advance for your nominations.

The news media keeps busy covering celebrity worship and parents trying to game the educational system.

A while back I wrote about Parade’s use of the July Fourth celebration as a platform for worshipping celebrity culture.  As I said then, it’s the “modus operandi” (the way it works) in the mass media. 

This past Sunday, Parade once again reminded us to worship actors and entertainers, this time as part of the new rite of passage for American teens—going off to college.

The title of the article says it all: “Schools of the Rich and Famous.”

And who are these rich and famous?  Of the 30 names mentioned, 27 are actors and entertainers, skewing young but ranging from Emma Watson to Joan Rivers.  Two are titans of business, Warren Buffett and Steve Jobs.  The other is a caterer turned home-advice expert turned business titan and entertainer, Martha Stewart.  There are no writers, scientists, explorers, astronauts, diplomats, inventors, community activists, physicians, politicians, elected or government officials, classical or jazz musicians; not even an athlete, which is truly weird.

Once again, Parade is telling us that the highest achievement is to be in front of a camera on TV or in the movies.

What’s truly hilarious is how the writer Rebecca Webber presents this list of where celebrities went to college:  She gives us a multiple choice quiz.  The subhead is “Test your knowledge of celebrities and their student days.”  Celebrity trivia is not a body of knowledge, nor will accumulation of information about where celebrities went to college help anyone either to solve today’s pressing problems or to consider the wisdom of the ages.  There is no knowledge involved or discussed in this article at all.

On the other hand, perhaps Webber thinks that taking her quiz will help the kids prepare for their standardized exams.

I think I’ll nominate Webber’s use of the word “knowledge” for a Ketchup Award, which this blog will give at the end of the year to the most obnoxious and most absurd bending of language of the prior year.  I call it the Ketchup Award in honor of the condiment that the Reagan administration declared a vegetable for the purpose of evaluating the nutritional value of the federal school lunch program.

Turning to another growing trend, The Sunday New York Times placed an article on the decision to hold a child back for a second year of kindergarten on the front page of the Sunday Styles section, right under its steamy coverage of the breakup of a billionaire’s marriage. 

Now of course, certain children need to start late because of emotional problems or maybe they aren’t ready to learn how to read.  But in many cases, as the Times reports, parents are holding back their children so that they will have an edge in sports and in the classroom. 

It’s another trick of parents trying to give their kids a leg up instead of letting them stand on their own two feet.  It works in sports, perhaps.  But in the case of holding them back so they do better in school, it won’t work and in some cases it may backfire.

The hold-back trend had already taken hold when my son was getting ready to go to kindergarten.  At that time, the cutoff for school had recently changed from December 31 to September 30, but every boy born after June 30 whom we knew in our large middle-class circle of acquaintances was held back by his parents.  And virtually all of them had some behavior problems in the early grades.  Hey, maybe they were bored.  Years later, it turned out that a lot of the kids who started on time got into top-notch universities, even the youngest, while lots of the kids who started late ended up going to D-list colleges.  That may sound like strong evidence, but keep in mind that it’s all anecdotal, based only on my experience, so don’t put too much stock in it.

But think about this notion: if all parents or even a significant number held back their kids, then the advantage would be lost. 

Parents who hold back their kids for sports should compute the statistical odds of their children becoming professional athletes: There are about 3,700 jobs a year in the four major sports, or about two-thousandths of one percent of the population of U.S. males.  Then again, many athletes now come from other lands, so the odds are even worse.  So, realistically, athletics are for fun only.  Ask yourself, then: do you want your kids to start their careers or go to graduate school a year later for an edge in a fun activity?
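
For readers who want to run the numbers themselves, here is a minimal back-of-the-envelope sketch.  The 3,700 roster spots come from the paragraph above; the figure of roughly 150 million U.S. males is a rounded assumption of mine, so treat the output as an order-of-magnitude estimate, not a precise statistic.

```python
# Back-of-the-envelope odds that a U.S. male ends up in one of the roughly
# 3,700 jobs in the four major professional sports.
MAJOR_SPORT_JOBS = 3_700            # figure cited in the paragraph above
US_MALE_POPULATION = 150_000_000    # assumed, rounded estimate

odds = MAJOR_SPORT_JOBS / US_MALE_POPULATION
print(f"Odds: {odds:.7f} ({odds * 100:.4f} percent)")
# Prints roughly 0.0000247 (0.0025 percent) -- a few thousandths of one
# percent, in the same ballpark as the estimate quoted above, and the odds
# only get longer once foreign-born athletes are counted.
```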

Be that as it may, in most cases starting kids late, for either academic or athletic reasons, is just another way to extend childhood and another way for parents to interfere in the educational process to give their child an unfair, although in this case a dubious, advantage.

Maureen Dowd speaks in code and everyone understands what she means.

Another low point in the endless and senseless coverage of a married professional golfer’s extramarital affairs was Maureen Dowd’s smarmy attempt to find a trend in the actions of Tiger Woods and White House social secretary Desiree Rogers, who might (but might not) share some blame for the party-crashers who penetrated President Obama’s first state dinner, and who demurred from testifying to Congress about the incident.

GenXProgress in the Daily Kos has already discussed the inherent racism in the connection between Tiger and Ms. Rogers, since the two people and two cases have absolutely nothing to do with each other except that both are at least part African-American.

What GenXProgress does not detail, and I will provide here, are the many racial code phrases Dowd uses or creates on the spot to explain why our anger at Tiger and Ms. Rogers is and should be similar in nature:

  • “…put themselves beyond authority…”
  • “…perfectionist high-achievers brought low…”
  • “…both the golf diva and the social diva…”
  • “…it was the assertion of personal privilege by Tiger and Desiree that was offensive…” (and not Ms. Rogers’s assertion of “executive privilege” or Tiger’s desire not to air his dirty laundry in public, a desire he shares with virtually every public person caught in extracurricular hanky-panky).
  • “She mistook herself for the principal, sashaying around and posing in magazines as though she were the first lady…”

As far as I can tell, these phrases are all code for “uppity N*****s.”  Shame on Ms. Dowd for stooping so low.

Trust, but Verify Who Actually Said It.

Here are some of the more than one million pages of news media, books and websites on the Internet that cite Ronald Reagan as having said, “Trust, but verify.” Many of the citers are writing about politics or foreign policy, but the sample of links below shows that the citation of Reagan as having said this slogan is far-reaching, and includes articles or documents about investments, the urban lifestyle, campus facilities management, auto dealerships and even web design:

The problem is that while Reagan said “Trust, but verify,” he was not the first to say it, contrary to what most of these sources state or imply.

“Trust, but verify” is an old Russian proverb, as a New York Times editorial in 1987 and the Wikipedia article both point out.

Why would we have a collective failure of memory about who said it first?  In this case, I think there are two causes:

  • The desire of our society in general to glorify presidents, and of the right wing to glorify this particular president.
  • A cultural reluctance to cite not just alien sources, but our recent enemies, the communists. 

In my view, writers of non-fiction have an ethical responsibility to check their facts and write the truth.  If you want to say that it was a favorite Russian proverb of Reagan’s, fine, but make sure you let us know that it was originally Russian, for the sake of truth.

Speaking of the Devil “They”

No sooner did I post a rant about syntactical mistakes that editors (and teachers) hate to see, in yesterday’s blog on ways to get the media to toss away a news release, than, lo and behold, I pulled from my mailbox an incredibly embarrassing example of the offense that editors make fun of perhaps more than any other: the use of “they” for “he,” “she” or “it.”

Here is the headline from a 9” X 6” postcard sent by Fragasso Financial Advisors, a financial planning firm:

 Did your advisor talk with you during the downturn?

What did they say?

The mistake, of course, was to use “they” to refer to “your advisor,” since “advisor” is singular and “they” is plural.  Granted, to write “What did he or she say?” is pretty stiff, as are all the variants: he/she, heshe, shehe, et al.

I would rewrite it as “Did your advisors talk with you…,” pluralizing “advisor” so that “they” has a plural noun to refer to.

If the text were for a speech, radio ad, TV spot or other spoken application, “they” would be usable, if not preferred.  Spoken language is always less formal than written nonfiction prose.  On the other hand, those elements of language that arbiters most resist changing usually have to do with logic, and there is nothing logical about referring to a single entity as a plural one.

An open question is why the firm uses “advisor” instead of “financial advisor.”  The less precise term could refer to a wide range of professional service vendors and thus lends an element of inaccuracy, or perhaps imprecision, to the headline.

Wake Up the Copy Editors

The New York Times may be going a little too far in trying to make its Tuesday “Science Times” section accessible to the mythical average Joe-and-Jane.  I’m sure many of my (perhaps mythical) readers know the section I’m talking about: It’s the one with all the health care and technology/engineering articles with some occasional science thrown into the mix.

The beginnings of two articles in today’s “Science Times” both take a chatty, “here’s what the in crowd thinks” approach to introducing their subjects, a writing style that really belongs in gossip, society and fashion magazines.  Neither article offers up any study or expert to support the assertions made at the beginning, but rather presents them as self-evident, at least to those of us who are “in the know.”  Both articles go on to present theories of experts that contradict or stand in contrast to the ideas presented as known gospel in the first paragraph(s).

The first article concerns recent theories on the “evolutionary” advantage of sleep.  Before we go any further, I must state that I am a firm believer in the theory of evolution and the fact that humans and all living things descended through time and mass extinctions from single-cell creatures.  I just don’t like to see pop-Darwinism thrown around to make silly conjectures about the complex behavior of humans.

 “If there is a society of expert sleepers out there, a cult of smug snoozers satisfied that they’re getting just the right number of restful hours a night, it must be a secretive one. Most people seem insecure about their sleep and willing to say so: they would like to get a little more; maybe they wish they could get by on less; they wonder if it’s deep enough.”

This next story, from the same issue, also is about the evolutionary origins of an aspect of human behavior, in this case, the serial monogamy that many people in western societies practice.  The assumption that “we” (or at least “our crowd”) believe the ridiculous sexist nonsense proffered in the first two paragraphs is its own kind of reinforcing ideological subtext, and an offensive one to my way of thinking.  And note again the lack of any expert or study to support the “theories,” or even support the assertion that most people believe these theories.

“In the United States and much of the Western world, when a couple divorces, the average income of the woman and her dependent children often plunges by 20 percent or more, while that of her now unfettered ex, who had been the family’s primary breadwinner but who rarely ends up paying in child support what he had contributed to the household till, climbs accordingly. The born-again bachelor is therefore perfectly positioned to attract a new, younger wife and begin building another family.

“Small wonder that many Darwinian-minded observers of human mating customs have long contended that serial monogamy is really just a socially sanctioned version of harem-building. By this conventional evolutionary psychology script, the man who skips from one nubile spouse to another over time is, like the sultan who hoards the local maidenry in a single convenient location, simply seeking to “maximize his reproductive fitness,” to sire as many children as possible with as many wives as possible. It is the preferred male strategy, especially for powerful men, right? Sequentially or synchronously, he-men consort polygynously.”

While we’re on the subject of leads, I’ve seen another sign that cutbacks in newsrooms are leading to a lowering of editorial standards.  Here are the first two paragraphs in today’s Associated Press story on the Yankees-Orioles baseball game last night:

“Andy Pettitte retired his first 20 batters before a lamentable seventh-inning sequence spoiled both his perfect game and no-hit bid, and the New York Yankees beat the Baltimore Orioles 5-1 Monday night.

“Pettitte (12-6) was poised to finish the seventh without allowing a baserunner, but former Oriole Jerry Hairston Jr. let a two-out grounder by Adam Jones slip through his legs for an error. Hairston was playing in place of Alex Rodriguez, who was given the night off.”

Now here’s the entire story about the game that ran in The Pittsburgh Post-Gazette and many other newspapers, which have taken up the practice of using the first sentence of the A.P. article as the complete story in a round-up section of baseball games:

“Andy Pettitte retired his first 20 batters before a lamentable seventh-inning sequence spoiled both his perfect game and no-hit bid, and the New York Yankees beat the Baltimore Orioles 5-1 Monday night.”

Someone at the Post-Gazette really should have taken the time to revise “before a lamentable seventh-inning sequence” (not such a great phrase to begin with, but acceptable when followed by a sentence of explanation) into something like “before an error and a hit…”

 

Tar Sand in the Eyes

Michael Lynch, a so-called energy consultant who used to be involved in energy research at MIT’s Center for International Studies, has a silly little piece of specious reasoning in today’s New York Times. The point of Lynch’s article is that the widespread belief in the “peak oil” theory is leading us to waste money on “harebrained renewable energy schemes” and to impose “unnecessary and expensive conservation measures.”

Of course, Lynch never gives a single example of a “harebrained”  scheme or of unnecessary conservation, preferring to spend his limited words on attempting to demonstrate that the earth has 2.5 trillion barrels of oil and not a mere 2 trillion as claimed by some “peak oil” advocates.  As Wikipedia tells us, the peak theory, first proposed by M. King Hubbert, is a mathematical way to determine when the production of petroleum from any given oil field peaks, after which the yield from the field will start to dwindle inevitably to nothing. 

Let’s say that all of the adjustments that Lynch proposes to make to the peak oil theory are accurate and appropriate:  Won’t we still run out of oil one day?  Are we better off sticking our hands in the tar sands (from which oil companies hope to one day pull oil) because we have more oil than one set of engineers says we do?  And isn’t the burning of oil for fuel still a major cause of global warming?

The interesting part of the article, of course, is the sly way in which, by changing the argument from “when will we run out of oil” to “have we reached what can technically be described as the ‘peak’ in possible oil production,” Lynch hopes to justify less regulation and less investment in a viable energy future.  It’s an old rhetorical trick, akin to throwing (tar) sand into the eyes of the reader.