The Trouble with ‘Inappropriate’

Since the Harvey Weinstein scandal broke last fall, published reports of men’s sexual misconduct have abounded. Just this week, the Chronicle of Higher Education published an investigation of Jorge Domínguez, “a prominent Harvard professor and former vice provost accused of groping, kissing, and other inappropriate behavior by close to 20 women.” That and all the other reports divide into two distinct categories, a split I didn’t fully register until I watched an episode of All In With Chris Hayes on MSNBC in December. Hayes’s guests included Rebecca Traister and Irin Carmon, journalists who have been covering this beat.

Here’s the exchange that got me thinking:

TRAISTER: I think that journalism has been driving this in a way that has made it airtight, at least up to a point. So, in all of the cases where there has been terrific reporting, in advance of whatever winds up happening to the accused, we have gotten this incredibly detailed, incredibly well checked, in all of the cases that we know about, accurate vision of what has unfolded. …

CARMON: I mean, I think there is something that reporting can do here that other tools like the criminal justice system or HR are unable to do, right. You’re able to weigh all of the different stories in a way that is publicly accountable. You try to get people to use their names on the record. The thing that was the most effective I think about the Harvey Weinstein story when I got to Ashley Judd’s name, wow! Ashley Judd was on record. …

TRAISTER: I also think that you could tell where there has been the reporting and where there hasn’t been, because as this has gone on, there are some cases where some people have been fired or suspended in advance of reports. So for example, Garrison Keillor was fired pre-emptively, I believe, and we only have the sketchiest vision of what he did. … When a company fires somebody or suspends them … they’re legally obligated by some measures not to reveal all of the terms of why they are doing this, and that has left people with confusion about what is happening, what has been alleged, and you can see where there is an absence of the reporting.

So here’s the divide. In one bucket are cases driven by journalistic reporting — those of Weinstein, Domínguez, James Levine of the Metropolitan Opera, Bill O’Reilly, Charlie Rose, Al Franken, Kevin Spacey, and in fact not that many others. They have in common that through a combination of the prominence of the male and the frequency or severity of the offenses, a major news organization deemed the charges worth investigating and, ultimately, the investigation worth publishing. As Carmon and Traister suggested in the interview, current journalistic best practices call for a high level of checking and verification. Preferably, sources go on the record, as Ashley Judd did; if they decline to, a New Yorker or New York Times or Washington Post will use their account only if solid corroboration is found.

In the other bucket are cases the world learns about only after a company or other organization has taken a personnel action, like firing or suspension. Examples that come to mind, besides Garrison Keillor, are Ryan Lizza of The New Yorker and Jonathan Schwartz and Leonard Lopate of New York Public Radio. Traister said the organization is “legally obligated by some measures not to reveal all of the terms of why they are doing this, and that has left people with confusion.” I am not an expert in this or any area of the law, but I suspect that companies withhold details by choice, not obligation, knowing that revealing them might open the companies up to defamation suits, compel them to reveal accusers’ names, or cause other problems. But whatever the reason, they don’t give particulars, resulting in reactive (rather than proactive) press coverage that’s vague and often, as Traister says, confusing.

For example, in February, the Ford Motor Company fired a top executive, Raj Nair, and a press release “explained”:

The decision follows a recent internal investigation into reports of inappropriate behavior. The review determined certain behavior by Nair was inconsistent with the company’s code of conduct.

“We made this decision after a thorough review and careful consideration,” said Ford President and CEO Jim Hackett. “Ford is deeply committed to providing and nurturing a safe and respectful culture and we expect our leaders to fully uphold these values.”

The New York Times article about the move heavily quoted the release and added no details about Nair’s alleged behavior.

The word that jumped out at me in the release and the article, only because it was so predictable, was inappropriate. It is the catchall designation for bad behavior, or behavior that we are asked to think was bad, but can’t really judge, because we aren’t told exactly what was done.

There’s nothing new about inappropriate, I hasten to say. Back in 1999, I wrote an essay for the Chronicle of Higher Education about the word, which was coined in the early 19th century, according to the OED, and was used by Charles Dickens in Dombey and Son: “[He] invaded the grave silence with the singularly inappropriate air of ‘A cobbler there was.’” At the time of my piece, inappropriate had been propelled into heavy rotation by Bill Clinton’s inappropriate behavior, and his admission, “I did have a relationship with Miss Lewinsky that was not appropriate.” I complained,

While it is currently deployed to characterize virtually anything a writer or speaker finds unsatisfactory, its most common use is clearly as a euphemism for sexually explicit material, especially when this gets onto children’s radar screens, or (as with the President) forms of sexual behavior that for various reasons are not universally accepted. … The problem with the overuse of “inappropriate,” finally, is that it is fuzzy language and inevitably results in missed signals and squawky communication.

But the frequency then was nothing compared with the present. Inappropriate appeared in the New York Times 803 times in 2017. So far this year it has appeared 226 times, a projected annual pace of 1,755 — more than double last year’s total.
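The projection is simple extrapolation. Here is the back-of-the-envelope arithmetic, on the assumption (implied by the numbers, not stated above) that the count was taken about 47 days into the year, i.e., in mid-February:

226 × (365 ÷ 47) ≈ 1,755, and 1,755 ÷ 803 ≈ 2.2

So “more than double” checks out.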

I hate it when people talk about banning words, and I certainly don’t propose a ban on inappropriate. But I have a suggestion for the press regarding the Raj Nair-type cases, ones where the firing is newsworthy but not newsworthy enough to dispatch a team of reporters to unearth the backstory. Ask the company for details. If it does not provide any, simply write, “the company did not provide details.” And leave the wimpy and weaselly word inappropriate to the press-release writers of the world.

The Yagoder Effect

This went out over the wires a few days ago and was picked up on Twitter:

[Screenshot of the AP item, in which the reporter rendered the sheriff’s “shiva” as “shiver”]
The AP reporter’s mistake is understandable. Scott Israel is a New York native and has a pronounced New York accent.

Thus, if Israel were to say “shiver,” it would indeed come out as “shivva” — so presumably the reporter was correcting for accent. Unfortunately, he or she wasn’t aware of the Jewish tradition of “sitting shiva” for the deceased.

A version of the same thing once happened to me. For many years, I have been a customer of Hitchcock, a purveyor of wide shoes based in Hingham, Massachusetts. One day a Hitchcock catalog came in the mail, and on it my name was written as “Ben Yagoder.” A characteristic of the Massachusetts accent, of course, is to add an “r” sound to words that for everyone else end in a schwa: hence John F. Kennedy’s famously imitated pronunciation of Cuba as “Cuber.”

I figured one of two things happened. One, a Hitchcock staffer made the same mistake as the AP reporter: heard me say “Yagoda,” (somehow) knew I was from New York, and “corrected” me. Or, two, a staffer was so wedded to the “-er” pronunciation that they extended it beyond pronunciation to spelling.

I like theory two.

“Smacked-ass”: An Appreciation

Sign outside a South Philadelphia restaurant

The night before my (adoptive) hometown Philadelphia Eagles took on the New England Patriots in Super Bowl LII (which I keep reading as the Trumpian insult “Lil’”), Saturday Night Live aired a brilliant skit imagining Colonials from each region trash-talking each other at the Continental Congress.

Local girl Tina Fey led the Fluffyans (the way we say the city’s name) and nailed the weird local vowels, like pronouncing the team as “Iggles,” the place where you hang your hat as something like “hay-ome,” and a generic encouragement as “C’mawn!” She also laid down a few distinctive Philadelphia terms, like pop-pop (for grandfather), youse (for the second-person plural), and hoagie (for hero/submarine sandwich).

Since moving here in 1982, I’ve noticed a few other Philadelphia words, including wooder ice for what I grew up calling “Italian Ice”; gravy for spaghetti sauce; down the shore (“to the beach”); jimmies, for what the rest of the country (except Boston!) calls the sprinkles you put on ice cream; and the relatively recent and much talked-about all-purpose noun jawn.

But my favorite is a term I didn’t even know was a localism till last week. That’s when I read a remembrance by Stephen Fried of D. Herbert Lipson, the longtime publisher of Philadelphia Magazine, who passed away in December at the age of 88. It began:

I could tell you about my first weeks at the magazine in 1982, when Herb ordered me to get a haircut and then sent his assistant around every day to see if I had. (I hadn’t. Still haven’t.) I could tell you about the time he stormed into and out of my messy office, calling me an “unmade bed” — prompting one of my colleagues to actually buy a dollhouse bed he “unmade” for me. Or when he told the folks in the art department that I was “such a smacked ass.”

I also started working at Philadelphia Magazine in 1982 — it’s what brought me to the city — and I well remember the Lipsonian insults of Fried. I had never heard “smacked-ass” before and immediately adopted it. Well, actually, I use it in only one situation: in reference to photographs of myself. “If possible, try not to make me look like a smacked-ass,” I’ll say. Or, “Don’t use that one. I look like a smacked-ass.”

After reading Steve’s piece, I realized I had never heard anyone else use “smacked ass” since ’82, and, naturally, investigated, first by looking at the Oxford English Dictionary, Dictionary of American Regional English, and the prominent online dictionaries, none of which listed it. The top definition on Urban Dictionary was posted in 2004: “an absolute idiot that walks around as if he’s got no clue in life.” Searching the term on Google Books yielded 256 hits. To the extent I could discern the home of the authors, they were all from Philadelphia, including the memoirist Joe Queenan, the children’s book author Jerry Spinelli, and the crime writer Lisa Scottoline, whose novel Rough Justice has this line: “Then I hold a press conference where I tell the world that the mayor is a smacked ass.” That obviously suggested a Philly provenance. Herb Lipson himself was from Easton, Pa., but started working in Philadelphia in 1953, right after graduating from Lafayette College.

The first Google Books citation was a snippet from a 1977 criminology text, quoting (presumably) a criminal: “I just asked for change for a ten-dollar bill and felt like a real smacked-ass to myself.” The snippet view doesn’t allow me to search for any info on the person being quoted, but one of the co-authors, the late James Inciardi, was a professor at my institution, the University of Delaware, and may have done fieldwork in Philly, less than an hour away.

A Google search for “smacked ass” led me to a bulletin board where someone used it and was asked what it meant. He replied, “Northeastern US slang for ‘complete idiot.’” Someone else responded, “Funny, I’ve never heard of that in my 30 years of existence, all of it in the Northeast.” Then the original poster said, “Philadelphia, actually. Maybe it was just my mother.”

Green’s Dictionary of Slang contains no entry for the term, but it does have “Face like a smacked arse,” defined as “a phrase used to describe someone who looks very depressed.” It appears to be common in Ireland and the North of England. The first cite for it is 2000, but I found a 1986 quotation on Google Books: “Big red nose, big red face, just like a smacked arse.”—Cedar, by James Murphy. “Face like a smacked arse” has gotten quite popular, with 23 Google Books hits since 2010.

I posed the question on the American Dialect Society e-mail list and got some helpful responses. John Baker dug up a 2005 Philadelphia Inkwire (the way we say Inquirer) article discussing “Phillyspeak”: “Smacked ass. Peculiarly Philadelphian, this refers to a person, generally male, who has done something really dumb or foolish.” And Garson O’Toole found the earliest use I’ve seen, by the Inquirer columnist Tom Fox in 1971: “I had trouble my first year in high school. I was 13 and a real smacked ass. I knew all the answers. I was so smart when I was 13 I flunked everything but gym and expression.”

Anecdotally, I asked around. Everybody from Philadelphia was familiar with the expression; everyone from somewhere else wasn’t (even if they had lived here for decades). So smacked ass appears to be a particularly Philadelphia expression (with an intriguing Irish connection) that emerged no later than 1971. Any pre-dating or insights welcome.

In conclusion, here’s a message from all us Fluffyans to the Iggles, snapped by an Inkwire photographer:

Photo credit: Philadelphia Inquirer


One Step (Backwards) for the Parts of Speech

I was watching The Rachel Maddow Show a couple of weeks ago, on the night when Sen. Rand Paul, the Kentucky Republican, was about to force a brief government shutdown through a filibuster and other delaying tactics. Maddow showed video of him speaking on the Senate floor about one of his favorite themes, government waste, and I briefly glimpsed one of his visual aids that, if I read it correctly, was deeply strange. I searched the Internet for the poster, and it turned out I had read it correctly.

[Rand Paul’s poster about the Neil Armstrong study]

The deeply strange thing, needless to say, is that the word a is not a preposition but an article. Talk about waste; Paul spent I don’t know how much government money on a slick poster and couldn’t even be bothered to get his parts of speech right.

That made me wonder if there was anything else Paul got wrong in his attack on this study, which he initially broached in a 2016 press release. Spoiler alert: The answer is “yes.”

Backing up a bit, when Neil Armstrong arrived on the moon in July 1969, the first words he said seemed to be, “That’s one small step for man, one giant leap for mankind.” This doesn’t really make sense, since “man” and “mankind” are pretty much the same thing. Armstrong subsequently contended that he had actually said, ” … step for a man … ” (emphasis added), which does make sense, and makes it a good quote.

You can judge for yourself by listening to this recording.

Over the years, there have been various attempts to determine whether he said “a” or not. In 2006, an Australian computer programmer ran the recording through audio software and concluded that Armstrong had voiced the article but static blotted it out. Three years later, two other researchers, using supposedly higher-quality recordings, came to the opposite conclusion. Armstrong himself, who died in 2012, would adopt the position that while he intended to say a, he wasn’t completely sure whether he had done so and preferred that the word be rendered in parentheses. That seems reasonable.

The study ridiculed by Rand Paul was published in September 2016; you can read it here. The lead author was Melissa Baese-Berk, a linguist at the University of Oregon. Paul’s poster (and his press release) misrepresented the purpose of the study, which it described as an attempt “to figure out whether Neil Armstrong used the preposition ‘a.’” Rather, the authors examine, through an experimental study, how the confusion over Armstrong’s quote may relate to broader issues regarding how we hear what others say. As Dr. Baese-Berk wrote to me in an e-mail, “We were less interested in whether Armstrong used the article and more interested in whether this instance could illustrate the very issues of timing and speech reduction that we were interested [in].”

Furthermore, the poster is deeply disingenuous regarding funding. Take a look at it. How much do you think the National Science Foundation spent on Dr. Baese-Berk’s Neil Armstrong study — $700,000, right? Wrong. In fact, the grant was spread out over more than forty complementary studies. Paul’s press release is equally misleading on this point.

Having read the Armstrong study, I acknowledge that its findings do not amount to a cure for cancer. However, to state what should be obvious, that isn’t the way science works. It advances our understanding through fits and starts. The starts could not occur without the fits.

Nor am I saying that all academic research, government-funded or not, is worthwhile. Some of it amounts to self-perpetuating boondoggling, with little hope of eventually yielding real enlightenment or utility, and deserves to be critiqued.

But not through Rand Paul’s two-bit demagoguing.

Woo-Hoo for “Woo Woo”

Woo-woo tips mingle with practical pointers. “Eat from heart-shaped bowls, and put heart stickers on your refrigerator,” Minich recommends. (Why? “To keep the spirit of love alive,” duh.)

–The New York Times, March 27, 2016, review of Whole Detox, by Deanna Minich

… “Valley of Love,” a logy, woo-woo drama about a former couple who, at the request of their son, who killed himself earlier that same year, have come to find answers in the California desert.

–The New York Times, March 24, 2016

“I fluctuate between being very practical and very impulsive, and this was a very impulsive decision,” continued Mr. [Tim] Daly. … “Not to get too woo-woo, but there was a good vibe and I just kind of leapt.”

–The New York Times, February 5, 2016

Clearly, woo woo has hit center stage, or at least that portion of it occupied by The New York Times. And what exactly is woo woo? Deepak Chopra offered a rather defensive definition in a 2011 Huffington Post piece: “It used to annoy me to be called the king of woo woo. For those who aren’t familiar with the term, ‘woo woo’ is a derogatory reference to almost any form of unconventional thinking, aimed by professional skeptics who are self-appointed vigilantes dedicated to the suppression of curiosity.”

Some sources attribute the term — presumably an onomatopoetic rendition of the eerie soundtrack that plays when mystical folk unleash their mysticism — to James Randi, the longtime magician/skeptic whose career of debunking was recently chronicled in the documentary film An Honest Liar. The earliest reference I’ve been able to find is from a 1983 edition of New Age Journal, cited in a 1984 Philadelphia Inquirer article by Steven X. Rea:

George Winston, who practices yoga and who currently has three albums on the jazz charts … has jokingly called this crowd the “woo-woos.” In a 1983 interview in New Age Journal, Winston, asked if he knew who comprised his audience, answered that there were some classical fans, some jazz, some pop and “all the woo-woos.”

“You know,” he added, “there’s real New Age stuff that has substance, and then there’s the woo-woo. A friend of mine once said, ‘George, you really love these woo-woos, don’t you?’ and I said ‘Yes, I do love them,’ and I do. I mean, I’m half woo-woo myself.”

Woo woo soon developed from a noun to an adjective, as in this 1988 quote from a journal called Training: “Subsidiary gurus, licensed to deliver high woo-woo programs developed by others, often will remind you of TV weathermen.” (Interjection-noun-adjective is a rather unusual course of anthimeria.) The Times’ first use came two years after that, in an article about the Earth First movement: “In small towns among the redwoods, new-age settlers have appeared in tie-dyed wardrobes and dreadlocks. They work as carpenters, holistic healers, mandolin players, giving themselves names like ‘Sequoia’ and ‘The Man Who Walks in the Woods.’ Within Earth First, these neo-hippies are known as the ‘woo-woo element.’”

[Photo: Hugh Herbert]

While looking into the origin of the mystic-mocking term, I was struck by how many other different ways it has been used, including as the catch phrase of Hugh Herbert, a rubber-faced comedy actor of the 1930s and ’40s. Wikipedia tells us:

His screen character was usually absent-minded and flustered. He would flutter his fingers together and talk to himself, repeating the same phrases: ‘hoo-hoo-hoo, wonderful, wonderful, hoo hoo hoo!’ So many imitators (including Curly Howard of The Three Stooges and Etta Candy in the Wonder Woman comic book series) copied the catchphrase as ‘woo woo’ that Herbert himself began to use ‘woo woo’ rather than ‘hoo hoo’ in the 1940s.

Interestingly, a 1938 article by Lucius Beebe in the New York Herald-Tribune associates the phrase with other comedians: “Originated by the Ritz Brothers and long accepted in the West as a cry of dismay, festivity, or general acclamation, the screaming of ‘woo woo’ has penetrated the New York bars.”

People nicknamed “Woo Woo” include:

  • Arnie (Woo Woo) Ginsberg, a retired Boston disk jockey, one of whose trademark sound effects was a train whistle. Jonathan Richman referenced him in the 1989 song “Fender Stratocaster”: “Like Woo Woo Ginsberg at the juke box joint/You hear the sound and you get the point.”
  • Legendary Chicago Cub fan Ronnie (Woo Woo) Wickers. (Not to be confused with Philadelphia Phillie fan Brad Golden, who shouts, “Everybody hits! Wha Hoo!”)
  • In the 1940s, 15-year-old Ellsworth (Sonny) Wisecarver Jr. developed a habit of running off with older women, garnering him national publicity and the moniker the Woo Woo Kid. Fun fact: A 1987 film based on Wisecarver’s exploits, In the Mood, was the first starring role of Patrick (McDreamy) Dempsey.

The Wisecarver woo woo seems to stem from the term’s use to denote a sense of risqué hijinks, sort of the intersection of “hubba hubba,” “ooh la la,” and, in another bit of onomatopoeia, a wolf whistle, with an implied association with the idea of pitching woo. In 1960, Time magazine illustrated the glamour of the financial writer Sylvia Porter by quoting a letter to her lecture agency: “Our second choice would not have the allure and woo-woo of Miss Porter.”

Then there was the Hamilton Jordan affair. As readers who were past the age of reason in 1978 may recall, Jordan, a top adviser in the Carter administration, made headlines that year when, at a Washington bar, he supposedly spit his drink on a woman’s blouse. The White House thereupon issued a 33-page white paper denying the allegation. The Washington Post reported:

The White House rebuttal issued yesterday rested heavily on the statements of Daniel V. Marshall III, a bartender at Sarsfield’s at 2524 L St. NW, where the incident occurred. …

Marshall’s version of what happened is that Jordan was quickly surrounded by young women who wanted to be near the “celebrity.” He said Jordan “woofed down” a steak and drank a beer and two Amaretto-and-creams.

The women were coming up to Jordan “and ‘woo-woo,’ you know what I mean?” Marshall asked.

I could discuss South Park’s Woo Woo PC Chant, the Woo Woo cocktail (vodka, peach schnapps, and cranberry juice), and Jeffrey Osborne’s 1986 “You Should Be Mine (the Woo Woo Song),” but you get the idea. Woo woo has an uncanny semantic productivity. Not to get too woo woo on you.

Let’s Call the Whole Thing ‘Often’

I was listening the other day to “Reply All,” a podcast about the Internet, and P.J. Vogt, the reporter/host, had occasion to say the word “often.” I was pretty confident that I knew how he was going to pronounce it. After all, Vogt is young (I would judge in his early 30s), and speaks with vocal fry, list lilt, uptalk, and, generally, a pronounced Ira Glass-esque lack of slickness.

In other words, I knew he would say “off-ten,” pronouncing the t.

And he did.

A good deal of history is embedded in his choice. The Oxford English Dictionary notes that the word often became commonly used (supplanting oft) only in the 15th century, and that in the 16th and 17th, it was sometimes said with the t voiced, sometimes not. Queen Elizabeth I said offen (the dictionary doesn’t say how it knows this), and that pronunciation became the accepted one. In the blog Daily Writing Tips, Maeve Maddox quotes John Walker’s Critical Pronouncing Dictionary, published in 1791: “in often and soften the t is silent.”

John Keats seemed to be assuming such a pronunciation in lines he wrote for a draft of “Endymion” (1818):

“… O foolish rhyme! / What mighty power is in thee that so often / Thou strivest rugged syllables to soften … ”

(My colleague Charles Robinson, a Romantics scholar, cautions, “I would agree that he probably pronounced often without the t — but you cannot prove it from the rhyme. Remember, there are partial and sight and near rhymes — so even if he did pronounce it off-ten, it would still ‘rhyme’ with soffen.”)

But the t version would soon revive. According to the American Heritage Dictionary, “With the rise of public education and literacy and, consequently, people’s awareness of spelling in the 19th century, sounds that had become silent sometimes were restored, as is the case with the t in often.”

The dictionary is noncommittal about the shift, but in the 20th century, usage commentators often got exercised about off-ten. H.W. Fowler wrote in Modern English Usage (1926) that the t-voiced version was “practised by two oddly consorted classes — the academic speakers who affect a more precise enunciation than their neighbours’ … & the uneasy half-literates who like to prove that they can spell.” Alan S.C. Ross’s “Linguistic Class-Indicators in Present-Day English,” the 1954 essay that coined the terms “U” (upper-class) and “non-U” (everyone else), put off-ten decidedly in the non-U camp.

Eric Partridge’s Usage and Abusage (1957) quotes a contemporary edition of The Concise Oxford Dictionary as calling the t-pronunciation “vulgar.” He adds: “It is certainly unnecessary and is usually due to an affectation of refinement.”

There is a regional as well as a class element to this, at least in the United States. The Dictionary of American Regional English quotes a 1928 issue of American Speech: “The Ozarker nearly always pronounces the t in often.” And DARE also cites the Linguistic Atlas of the Gulf States (1989) as reporting 453 informants who said the t as opposed to 290 who did not.

Data on pronunciation, as opposed to writing, are hard to come by, but I did my best. I listened on YouTube to 12 versions of the opening line of “On the Street Where You Live” — “I have often walked on this street before.” It was offen in both the My Fair Lady original cast album and the movie soundtrack, and in the renditions by Vic Damone, Etta Jones, Bobby Darin, Nat King Cole, Harry Connick Jr., Dean Martin, and Willie Nelson (whose version is my favorite). Only Tom Jones (a Welshman), Nancy Wilson (African-American, born in Ohio), and Smokey Robinson (African-American, born in Detroit) sang off-ten.

“Birches,” by Robert Frost, has the lines: “Often you must have seen them/Loaded with ice a sunny winter morning/After a rain.” In this recording, Frost says offen.

As I suggested at the outset, it’s my sense that in recent years, young people have become partial to off-ten. The language blogger Jan Freeman agrees and offers anecdotal support:

I’ve been interested in this one since my daughter, brought up as an OFF-en speaker, went to college at the University of Michigan and came back saying OFF-ten. I don’t think it’s a regional thing — I grew up two hours south of Ann Arbor, and I don’t remember OFF-ten even as a variant. It must have been something she picked up from friends.

To at least pseudo-scientifically test this proposition, I met individually with the undergraduates in the class I’m currently teaching and asked them to read aloud the sentence, “Experience has shown that first impressions are often lasting ones.” Eight said off-ten and five said offen. (Obviously, their pronunciation may have been affected by seeing the t on the piece of paper in front of them, or by self-consciousness.)

Whence the appeal of this pronunciation? All I know is that it seems of a piece with the popularity of amongst, whomever, saying “a person that” instead of “a person who,” pronouncing either as eye-ther, and the spellings grey and advisor. These are all changes in previously accepted usage that seem more formal, more British, and/or fancier, and (in off-ten and the first three examples) are slightly longer. I leave to greater minds than mine the question of why these qualities are desirable.

In any case, in keeping with these trends, the question of how to pronounce “often” may soon cease to matter. Just as it replaced oft back in the day, it is being supplanted — if my students’ work can be trusted — by an amongst-ish antique word. That’s right, I’m talking “oftentimes.”

Who That?

A couple of weeks ago, referring to Ben Carson’s (supposedly) terrible temper, Donald Trump said, “I don’t want a person that’s got a pathological disease.”

What caught my eye was that he didn’t say, “… a person who’s got a pathological disease.” For some years, I have been noticing that my students favor the choice of that over who as a relative pronoun; I did some grumbling about it here, lumping it with other popular usages (“one-year anniversary” instead of “first anniversary,” sticking a comma after a sentence-starting “But” or “And”) that I collectively referred to as “clunk.”

I hasten to say that that that is perfectly correct, grammatically. The Merriam-Webster Dictionary of English Usage sums up the matter: “In current usage, that refers to persons or things, which refers chiefly to things and rarely to subhuman entities, who chiefly to persons and sometimes to animals.”

Nor is human that any kind of newfangled thing. Shakespeare writes in Hamlet, “By heaven, I’ll make a ghost of him that lets me.” Horace Walpole observed, “This world is a comedy to those that think, a tragedy to those that feel.” The Man That Corrupted Hadleyburg is a Mark Twain title. Ira Gershwin wrote “The Man That Got Away” and Irving Berlin “The Girl That I Marry,” possibly to avoid having the word whom in the title of a song. (On the other hand, the lovely Oscar Hammerstein-Jerome Kern tune is “The Folks Who Live on the Hill.”) Way back when, which was sometimes slotted in as well, as in the 1662 edition of The Book of Common Prayer: “Our Father, which art in heaven.”

Like so many other shibboleths, the idea that that is incorrect in reference to humans originated in the 18th century. The impact on usage was swift, as seen in the Google Ngram Viewer chart below. The blue line represents the relative frequency of the phrase “a person that,” the red line of “a person who”:

[Ngram Viewer chart: “a person that” (blue) vs. “a person who” (red)]

Ngram Viewer charts usage in books, but the Corpus of Contemporary American English (COCA), which contains 450 million words written or uttered between 1990 and 2012, attests that human that is most common in speech. The chart below shows the frequency of “a person that” in the different generic databases in COCA; “Spoken” mainly comes from broadcast transcripts.

[COCA chart: frequency of “a person that” by genre]

But even in print, Ngram Viewer attests that my observation of my students’ affection for that is part of a broader trend: since 1965, the frequency of “a person that” has increased roughly 150 percent.

[Ngram Viewer chart: frequency of “a person that” since 1965]

What’s the reason for the trend? Some discussions propose that it reflects a societal move toward depersonalization. Others have suggested that that now tends to be used when the subject is vague (“Anyone that wants to retire comfortably should start saving early”) and who when it is specific (“I’m a person who … “). But in my reading and listening, I don’t perceive such a distinction. Mignon Fogarty, aka Grammar Girl, points to a nifty passive-aggressive use: “I always think of my friend who would only refer to his new stepmother as the woman that married my father. He was clearly trying to indicate his animosity.” Maybe Trump was attempting such a ploy.

But I’m going to stick with my earlier hypothesis that a fondness for that is part of a generational sense that streamlined, glossy language moves — even so seemingly small a thing as the use of the word who — are somehow cheesy, and that it’s better to embrace the awkwardness. And why does the younger generation feel that way? Sorry. I’m not the sort of blogger that would hazard a guess on that.

How “Online” Became “Offline”

I read this sentence in The New York Times not long ago: “Most evenings, before watching late-night comedy or reading emails on his phone, Matt Nicoletti puts on a pair of orange-colored glasses that he bought for $8 off the Internet.”

The phrase that caught my inner ear was “off the Internet.” It sounded odd because, given the widespread use of the expressions online and on the Internet, one would expect the preposition to be on. 

A possible explanation for the “bought it off the Internet” formulation stems from the use of off (since the 1600s, according to the Oxford English Dictionary) in the sense of from, “esp. with take, buy, borrow, hire, and the like.” It’s a colloquial but very real idiom, as in “I bought it off my brother.” (Even more colloquial is “I bought it off of my brother.”)

But I don’t buy this etymology for “bought it off the Internet.” For one thing, the off-instead-of-from pattern doesn’t really apply: it sounds weird to say, “I bought it from the Internet.” Looking into the history of the phrase further convinced me that the explanation lies elsewhere. Here are some examples from the early years:

  • “The G Box has the responsibility of taking packets off the Internet and handing them over to the LAN or vice versa.”—Computerworld magazine, 1992
  • “‘Getting information off the Internet is like taking a drink from a fire hydrant,’ says Kapor.”—The Nation, 1993
  • ” … sexual images can be downloaded off the Internet.” —CIO magazine, 1993
  • ” … pulling shareware off the Internet.” —InfoWorld magazine, 1994
  • “And we’re looking for ways to try to at least help parents deal with what their children can get off the Internet.”—Pres. Bill Clinton, 1994
  • “People said they would buy more off the Internet if they knew the privacy policies for the companies whose sites they visit.”—Network World magazine, June 1997

The progression is interesting. The early references are to files, software, text, or images, and the word off suggests a sense of the Internet as a giant clothesline, or tree, on which these things are hanging, ready to be plucked. I believe that notion extended to the Matt Nicoletti idea of purchasing things from Internet vendors, as first seen in the 1997 Network World quote.

Before long, people started talking about buying something off a particular vendor. From Nick Hornby’s 2007 novel Slam: “Mum buys stuff off Amazon sometimes.”

I mentioned all this to my daughter Maria Yagoda, and she said people her age (twenties) and younger have taken things a step farther, saying, “I bought it offline” to indicate something purchased in an Internet transaction. Sure enough, a poster to Urban Dictionary created an entry for this in 2005:

[Urban Dictionary entry for this sense of “offline,” posted in 2005]

And it’s still very much around a decade later. I did a Google search for the phrase “bought it offline,” limited to things posted in the last year. Of the hits that were not ambiguous, about half referred to purchases that were made in stores (the traditional “offline”) and about half to ones that were made online. Examples of the latter:

  • “Rodriguez purchased a bus pass on the Facebook group recently and said her pass ‘worked perfectly’ and she bought it offline because of the cheaper cost.”—The State News (Michigan State University student paper)
  • “i agree about the naked palette i bought it offline because i couldn’t find it in Australia”—YouTube video “Makeup Products OVERHYPED”
  • “Where Do I Find My Product Code If I Bought It Offline And Dont Have The Confirmation Email Anymore?”— message board

How did this usage arise? In 2009, someone posed that question to the Yahoo Answers community: “How come when a lot of people buy something online they tell others they bought it ‘offline’?” Sacha’s response was chosen “best answer,” and I think it’s basically right, including the implicit observation that the old-fashioned and somewhat Al Gore-y term Internet has been supplanted by the all-purpose online. Sacha opined: “its a quick way of saying it. for example instead of saying i bought it off the internet, they say i bought it offline – coz it wouldnt it make sense if they sed i bought it off online. if you get what im saying? lol.”

Lol indeed. The etymology is all well and good, but the phrase remains peculiar at best, nonsensical and confusing at worst. As Neil Roberts points out, offline is in all other contexts understood to mean not connected to the Internet, so where is the possible logic in saying “I bought it offline” when what is clearly meant is “I bought it online”?

But demanding logic from language developments is a mug’s game. So I’m going to withdraw the question and go offline.

Letterman and Irony

With the end of David Letterman’s long TV run, it seems that everyone has weighed in on his significance and contributions. Here’s my take (originally published in the Chronicle of Higher Education) on Letterman’s characteristic stance: irony.

“What’s all this irony and pity?”
“What? Don’t you know about Irony and Pity?”
“No. Who got it up?”
“Everybody. They’re mad about it in New York.”
–Hemingway, The Sun Also Rises

To paraphrase Philip Larkin, irony began in 1973, between Robert Altman’s The Long Goodbye and Randy Newman’s fifth LP. The ur-text, for me, was the first paragraph of the preface of Kurt Vonnegut’s novel Breakfast of Champions:

The expression “Breakfast of Champions” is a registered trademark of General Mills, Inc., for use on a breakfast cereal product. The use of the identical expression as the title for this book is not intended to indicate an association with or sponsorship by General Mills, nor is it intended to disparage their fine products.

The kind of irony I’m talking about is verbal, which I define as a form of expression in which one makes a point or conveys an idea by saying something other than what one means. (It’s different from situational irony — the “Gift of the Magi” sort of thing — and dramatic irony, as in a novel where a character traveling on the Titanic excitedly discusses what he’s going to do after landing.) The term, which derives from a stock character in Greek comedy, the eirôn, describes a rhetorical device that obviously originated long before the 1970s, and is most famously employed by Mark Antony: “Brutus is an honorable man.” Anatole France, in the 19th century, adopted “irony and pity” as a sort of watchword; it got into The Sun Also Rises via the critic Gilbert Seldes. (The Language Hat blog has helpfully sketched out this history.)

Hemingway is the great modern ironist. His particular discovery and innovation was the invocation of strong emotion via (ironic) terseness. That extends to his characters, such as Jake Barnes, who remarks, “I’d a hell of a lot rather not talk about it.”

Irony wasn’t a mere technique for Hemingway: It was rooted in his sense that the standard literary language of his time was outmoded, false, and, to a certain extent, debased. He was the most influential stylist in 20th-century American literature, inspiring Raymond Chandler and other private-eye novelists, sports scribes like Jimmy Cannon and W.C. Heinz, tabloid columnists like Jimmy Breslin and Pete Dexter, “minimalist” short-story writers like Raymond Carver and Ann Beattie (who early in her career incorporated as Irony and Pity Inc.), and Vonnegut, who, along with Donald Barthelme, expanded the comic possibilities of irony in the 1960s and 70s.

When I read Breakfast of Champions in 1973, the phrases that jumped out from the preface and gave me an I-needed-that slap in the face were “breakfast cereal product” and “their fine products.” I gathered, without being able to articulate it at the time, that Vonnegut was appropriating corporate and promotional language, thereby suggesting how debased it had become. But he wasn’t asserting that the products weren’t fine, which made what he was doing irony, not merely sarcasm.

And that brings me to Vonnegut’s fellow Hoosier David Letterman, whose final television broadcast aired on May 20. Think of Letterman mouthing the words “television broadcast” — or “beverage” or “ladies and gentlemen” or even introducing himself as “Dave” Letterman — and you get a sense that he was working similar effects, in the realm of the television broadcast. The opposite of irony is sincerity, and sincerity has for a long time been debased by TV talkers, with their sympathetic nods, creased brows, and phony concern. For years and years, Letterman was palpably not sincere in a single syllable he uttered.

Starting with and moving beyond the 1960s “put-on,” Letterman’s comedy generation did remarkable things with ironic poses. The list is long: Bill Murray’s smarmy lounge singer on Saturday Night Live; Steve Martin’s wild and crazy guy; Albert Brooks’s faux standup persona; SCTV’s pinky-ringed Sammy Maudlin and Bobby Bittman (played by Joe Flaherty and Eugene Levy); Martin Short’s Jackie Rogers Jr. and Irving Cohen on SCTV — and his whole self-presentation for the last 10 years; Letterman’s band leader and sidekick Paul Shaffer, with his groovy lingo, elephantine shades, and circus-clown sport coats. All took on the dissembling and self-aggrandizing affectations of an earlier show-biz era. (That this shtick played so well and lasted so long is testament to the pleasures and power of the old model. Again, irony and not sarcasm.) The younger Stephen Colbert went ironically all in to an extent never seen before, in his decade-long stint as a preening and blustering conservative talk-show pundit.

Of course, Colbert ended his run last year and will step into Letterman’s time slot in the fall, presumably playing himself. That’s appropriate. Irony is extremely hard to carry off over the long haul. Look at Hemingway, who was unable or unwilling to drop it and became a self-caricature.

Letterman’s pivot from irony has been a result not merely of getting older but also of a series of powerful events in his and the nation’s life. In 2000, he had quintuple bypass surgery and a glimpse of mortality. The following year came 9/11 (which Graydon Carter predicted would bring an end to the age of irony; not so much). Letterman came on the air less than a week after the attacks and delivered what was probably his most sincere televised declaration to date: “If you didn’t believe it before, you can certainly believe it now. New York City is the greatest city in the world.” In 2002, after his friend Warren Zevon received a terminal diagnosis, Letterman devoted an entire affecting episode to the singer; three years later came the death of his mentor, Johnny Carson. In 2009, after receiving blackmail threats, he acknowledged multiple affairs with staff members and devoted a segment of the show to a public apology to his wife and staff.

But the biggest happening was the 2003 birth of his son, whom he often talks about on the air, with warmth and emotion. Once, referring to his bypass surgery, he held up a picture of the lad and said, “This is the reason I think my life was spared, so I could be part of this kid’s life.”

In the run-up to his final show, Letterman has said what he means, a lot, expressing appreciation for his long run and gratitude to his longtime staffers and favorite guests, especially musicians. But it’s not that easy being sincere, especially for someone with so much irony in his blood. In these weeks, he’s tended to haul out go-to phrases like “Thanks for everything” and (when someone thanks him) “You’re too kind,” making him sound like he’s in a receiving line.

And, as inevitably happens when an ironist puts away his mask, there’s a bit of the Boy Who Cried Wolf effect. When Oprah Winfrey finally came on his show, ending their years-long feud, or “feud,” Letterman told her, “It means a great deal to have you.”

“Does it really?” she replied. “Or are you just doing your Dave thing?”

Ironically, you couldn’t really tell.