What the internet does to the mind is something of an eternal question. Here at The Atlantic, in fact, we pondered that question before the internet even existed. Back in 1945, in his prophetic essay “As We May Think,” Vannevar Bush outlined how technology that mimics human logic and memory could transform “the ways in which man produces, stores, and consults the record of the race”:
Presumably man’s spirit should be elevated if he can better review his shady past and analyze more completely and objectively his present problems. He has built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory. His excursions may be more enjoyable if he can reacquire the privilege of forgetting the manifold things he does not need to have immediately at hand, with some assurance that he can find them again if they prove important.
Bush didn’t think machines could ever replace human creativity, but he did hope they could make the process of having ideas more efficient. “Whenever logical processes of thought are employed,” he wrote, “there is opportunity for the machine.”
Fast-forward six decades, and search engines had claimed that opportunity, acting as a stand-in for memory and even for association. In his October 2006 piece “Artificial Intelligentsia,” James Fallows confronted the new reality:
If omnipresent retrieval of spot data means there’s less we have to remember, and if categorization systems do some of the first-stage thinking for us, what will happen to our brains?
I’ve chosen to draw an optimistic conclusion, from the analogy of eyeglasses. Before corrective lenses were invented, some 700 years ago, bad eyesight was a profound handicap. In effect it meant being disconnected from the wider world, since it was hard to take in knowledge. With eyeglasses, this aspect of human fitness no longer mattered in most of what people did. More people could compete, contribute, and be fulfilled. …
It could be the same with these new computerized aids to cognition. … Increasingly we all will be able to look up anything, at any time—and, with categorization, get a head start in thinking about connections.
But in Nicholas Carr’s July 2008 piece “Is Google Making Us Stupid?,” he was troubled by search engines’ treatment of information as “a utilitarian resource to be mined and processed with industrial efficiency.” And he questioned the idea that artificial intelligence would make people’s lives better:
It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
Even as Carr appreciated the ease of online research, he felt the web was “chipping away [his] capacity for concentration and contemplation.” It was as if the rote tasks of research and recall, far from wasting innovators’ time, were actually the building blocks of more creative, complex thought.
On the other hand, “you should be skeptical of my skepticism,” as Carr put it. And from the beginning, one great benefit of the internet was that it brought people in contact not just with information, but with other people’s ideas. In April 2016, Adrienne LaFrance reflected on “How Early Computer Games Influenced Internet Culture”:
In the late 1970s and early 1980s, game makers—like anyone who found themselves tinkering with computers at the time—were inclined to share what they learned, and to build on one another’s designs. … That same culture, and the premium it placed on openness, would eventually carry over to the early web: a platform that anyone could build on, that no one person or company could own. That idea is at the heart of what proponents for net neutrality are trying to protect—that is, the belief that openness is a central value, perhaps even the foundational value, of what is arguably the most important technology of our time.
But as tech culture evolved and pervaded life outside the web, even its problem-solving methods began to seem reductive at times. Ian Bogost outlined that paradox in November 2016 when a new product called ketchup leather was billed as the “solution” to soggy burgers:
The technology critic Evgeny Morozov calls this sort of thinking “solutionism”—the belief that all problems can be solved by a single and simple technological solution. … Morozov is concerned about solutionism because it recasts social conditions that demand deeper philosophical and political consideration as simple hurdles for technology. …
But solutionism has another, subtler downside: It trains us to see everything as a problem in the first place. Not just urban transit or productivity, but even hamburgers. Even ketchup!
So, what’s your personal experience of how the internet affects creativity? Can you point to a digital distraction—Netflix, say, or Flappy Bird—that’s enriched your thinking in other areas of your life? On the flip side of the debate, can you point to a tool like email or Slack that’s sharpened your efficiency but narrowed the scope of your ideas? We’d like to hear your stories; please send us a note: hello@theatlantic.com.
Last week, a reader who signed his email “J.” gave us a detailed critique of what he calls the “zombie rules” of grammar—the gripes against such things as split infinitives and dangling prepositions that “fuel ... people’s misconceptions (and their nervous cluelessness) about English.” This next reader, Chris, has a rebuttal from his experience as an ESL teacher:
I find that adhering to grammar rules, however zombified they may be, is important for me in teaching university students. Once they complete their studies, they will be on the job hunt, and their English abilities will be on trial. The likelihood that a future employer is a follower of zombie rules of English grammar is quite high, so rather than have a student judged at that most crucial time, I try to nip the problem in the bud early if possible.
NB: In “croissants uneaten,” uneaten can DEFINITELY still be looked at as something other than a verb with the verb left or went having been elided. For example,
The croissants were left uneaten by the partygoers.
This seems to act more as an adjective disguised as an adverb, similar to hungry in “The children went hungry for three days.”
a. Consider the word in a complete sentence:

Sentence 1: The uneaten croissants were finally discarded.

We see that the verbal structure in this sentence employs a passive form: “were discarded.” “The croissants” is the subject and “The uneaten croissants” is the SP (subject phrase).
Since an SP consists of a determiner (the), any number of adjectives, and some number of nouns, gerunds, etc., but not any embedded verb forms; and since Sentence 1 already has a well-formed verb structure (“were discarded”); then the tentative conclusion is that in Sentence 1, uneaten is an adjective and not a verb form.
b. It’s normal in English for adjectives to precede the subject noun, although there are many instances when it might follow, e.g. “athlete extraordinaire,” “the person responsible,” “battle royal,” “devil incarnate.”
Thus, the NP (noun phrase) “croissants uneaten” could just as easily be construed as “uneaten croissants” and, as shown above, “uneaten croissants” can arguably be construed as an adjective-noun combination. Therefore it’s arguable that “croissants uneaten” is also an adjective-noun combination, with the adjective following the noun.
c. Your correspondent J. also makes this argument:
Although it appears in many of the same syntactic positions as adjectives, uneaten does not meet most of the criteria for adjective-hood (an asterisk indicates that something is ungrammatical):
It is not gradable: *more uneaten, *most uneaten
It cannot be modified by words like too and very: *very uneaten croissants
It doesn’t work with a verb like become: *The croissants became uneaten.
I don’t think that’s the way to look at it. The adjective uneaten has a binary meaning, i.e. something is either eaten or it’s not. (We’ll ignore the brief interim during which it is transitioning between the two states.) Thus, eaten is an absolute state, as is uneaten.
Consequently, adverbs of gradation (more, most, too, very) simply don’t apply. This is precisely parallel to the rule that tells us not to modify the adjective unique with the adverbs very, more, etc., since ‘unique’ is absolute.
Now, can we say that something is “partially eaten” (and therefore “partially uneaten”)? Of course, but it is understood that the part that is eaten is absolutely eaten, and the part that is uneaten is absolutely uneaten.
As for “The croissants became uneaten,” of course we don’t say it that way, but saying
Sentence 2: The croissants went uneaten.
conveys the same meaning, except with more felicity.
d. Finally, as to this comment:
To get a better sense of all of this, compare uneaten to a past participle that has clearly become an adjective, like embarrassed.
The adjective embarrassed is not used in a binary-state way. One could argue that one is either embarrassed or one is not, but the fact is that we do modify this adjective in a gradation:
Sentence 3: I was slightly/somewhat/quite/greatly/mortifyingly embarrassed.
Conclusion: My arguments lead me to assert that “croissants uneaten” is an acceptable English noun phrase and that uneaten is not a verb, but an adjective.
Personally I’m on the side of the adjectives—though I take full responsibility for having prompted this debate by advertising uneaten as a verb in the first place. I draw this conclusion not only from the guidance of my trusty arbiter, the Merriam-Webster dictionary (or just Merriam, as my college friends and I called her), but also from the fact that I’m unable to translate “croissants were uneaten” into an equivalent construction using the active verb uneat. (Side note: If anyone out there has found a way to boldly uneat where everyone else has eaten before, please let me know: My colleagues share a lot of snacks here at The Atlantic, and I’m constantly late to the party. (Side note to the side note: If there’s no such thing as a split infinitive, what happens if you try to split an infinitive that doesn’t exist?))
But I’ll give J. this: In “Walking Dead autopsied, croissants uneaten,” uneaten does work in parallel with autopsied—each describes what’s being done, respectively, to the hit TV show and the croissants. Though uneaten may not be a verb itself, it functions in context as a past participle. That, anyway, was my excuse when I tried to pass it off as one.
I know it might seem petty, or trollish, or a waste of everyone’s valuable time to spend these paragraphs arguing over the character of a single word. (Apologies to my editor.) But this debate underscores for me something much bigger, and more important: A word, as Knox pointed out last week, is functional—a mechanism for meaning-delivery. But it’s not a machine; the right metaphor, I think, would be something much closer to human. A word has an essential identity in its definition, and carries that everywhere; and yet it’s a shape-shifter, context changing its meaning and its grammatical function. Words are like people—multifaceted and messy and hard to pin down. And isn’t that kind of beautiful?
David Frum is worried it will happen under President Trump. “The fancy term is authoritarian kleptocracy,” Frum says in a long and enriching talk with Atlantic editor Scott Stossel last Thursday about the dangers of the Trump administration (starting at the 10:22 mark):
The SoundCloud audio version is here. And if you haven’t yet read David’s cover story on Trump, or want to read it again in light of this discussion, here’s the link. If you prefer to listen to it on the go or while doing chores around the house, here’s the audio version:
This reader really liked the piece:
I’d just add a philosophical aspect, which is that if Obama was our first black president, then Trump is our first postmodern president. In postmodernity all truth is local, thus if you deconstruct any attempt at claiming an overarching truth, you’ll find a power grab.
This particularly applies to Trump’s relation with the media. If the media calls out one of his lies, it is seen by him and his supporters as not truth but a competing narrative—or, in today’s terms, #FakeNews. And so Trump has weaponized language, and any attempts at restraining him through shaming, appeals to tradition, and appeals to logic fall flat.
With the news landscape so fragmented, it’s really hard to solve this problem. I can ignore the traditional gatekeepers like the NYT and WaPo, and I can confirm all my biases on platforms such as Breitbart or Daily Kos. Can we overcome that fragmentation? I think so.
Ultimately I believe it comes down to the need to return to hard-nosed investigative journalism and to put out fewer opinion pieces. Say Trump goes forward with his tariffs on Mexico. It may help the Rust Belt workers, but it will be detrimental to workers in border towns. So you’d want a reporter talking to the people and businesses affected. Stories like these are much harder to ignore than opinion pieces.
In general, to overcome the cultural malaise that led to Trump, we’re going to need more dialogue across communities. The goal is to rebuild the kind of common “meta-narrative” that postmodernity tears down. We need grassroots activity and the revival of social institutions (churches/mosques/synagogues, mutual aid societies, neighborhood councils, etc.). So it just comes down to countering balkanization in media, culture, and politics.
This next reader has a very different view:
“The American free press” consists of some of the largest businesses in the world, huge corporations worth billions of dollars, the unregulated “fourth estate” in America. They are more powerful than politicians or representatives, free to say anything under the guise of “freedom of the press.”
They are no longer really “the press”; they represent the interests of the owners who, through their exposure to many millions of people, have power even beyond that of the president or elected representatives.
Let’s get real. Your idea of what is happening in the world is what is presented to you by the media. You see “reality” through their lens. What they say seems to be the same as fact. They really control what you think! The Washington Post endlessly disses Trump, gives his critics more coverage, mixes fact with opinion, and distorts facts. They are manipulating you.
Not so fast, replies this reader:
Or, alternatively, you could simply apply rational thought to what you read and draw rational conclusions based on the quality of evidence provided, the number of peer sources co-validating it, and the logic of the arguments presented. Or just buy into unsubstantiated conspiracy theories that everything is a lie.
Another reader piles on:
Newspapers like The Washington Post provide sources; Trump never does, unless it’s his own gold-plated observation—like the phantom thousands of people in New Jersey whom he saw cheer on 9/11. The major newspapers also apologize and issue corrections when they make an error; Trump will do the same only when Mar-a-Lago freezes over. And lastly, Trump provides us all with seemingly never-ending examples of distortion, insults, and unethical sexual behavior. Trump is manipulating his penurious lemmings and then spits nails after the majority of the American people resist him.
Update from a reader who suggests that part of the problem is that online media is too democratized:
Very interesting article by Frum, and the follow-up posts by readers. I want to add that the rise of Twitter is a major factor in this. It allows people like Trump to reach their target audience, unchecked. Nuance, fact-checking, and hard questions cannot be condensed into 140 characters, or whatever the Twitter limit is now.
It also promotes people like Milo Yiannopoulos, who have nothing valuable to contribute but instead are ready to throw verbal Molotov cocktails and watch the world burn. There is no accountability, and therefore no need to be truthful.
Let me also pose this question: Why are all of us equipped to comment on news and what’s happening in the world? We don’t let all of us build rockets or do neurosurgery. So why doesn’t that standard of having sense, education, training, and aptitude apply to being a journalist? Having a blog—or worse, a collection of loony opinions like Breitbart—is not journalism.
On Monday, February 20, we’ll celebrate Presidents’ Day. So this week, we asked our Politics & Policy Daily readers: What U.S. president do you admire most—and why? We received dozens of thoughtful responses, but here are a few of our favorites.
For Dolores Oliver, the answer is George H. W. Bush. She admires his ability to “work beyond ideological barriers”:
First, Bush was willing to resist pressure to aggressively brag about the fall of the Soviet Union. This approach reminded me of Lincoln’s commitment to welcoming back the South after the Civil War. He worked hard to respond with humility and support, bringing the former Soviet satellite countries—and eventually Russia—into the international community. Had the West come out with a prideful, bellicose attitude, perhaps we would be far worse off in our relationship with Russia than we are currently.
Second, he was willing to stand firm against great pressure within his own party and sign the Americans with Disabilities Act (ADA). He recognized the need to give individuals with disabilities the opportunity to function independently, thus empowering many who otherwise would be homebound.
Third, he was willing to stand firm against tyranny when Saddam Hussein invaded Kuwait. He worked carefully and wisely to forge a coalition of more than thirty countries to remove Iraqi forces and liberate Kuwait in under four months.
Lastly, he was willing, again against great pressure, to acknowledge the need to increase taxes—which would eventually cost him a second term.
On a similar note, Mary Shannahan chose President Jimmy Carter because he “walks his talk”:
I admire him because of his integrity while in office. Since his term ended, he’s facilitated peace on a global level and monitored elections throughout the world for their integrity, or lack of it. Here in the States he’s active with Habitat for Humanity. His principles are guided by his faith.
From Jennifer Poulakidas: “LBJ, for sure”:
What President Johnson was able to accomplish during his tenure is undeniably amazing and advanced our country in many very significant ways. AND, he was able to get a majority of the Congress to join him! The Civil Rights Acts, the Voting Rights Act, the first ESEA and HEA bills, the Immigration Act of 1965, the establishment of Head Start, Medicare, Medicaid, and Work Study, the creation of the National Endowments for the Humanities and the Arts—the list could continue.
Paul E. Doherty suggested President Harry Truman, whom he calls a real “man’s man.” Why?
He probably made more difficult decisions than any other president, and right or wrong, he made them in the best interest of our country. He truly meant it with the sign on his desk in the Oval Office that said “The Buck Stops Here!” After leaving the White House he went back to Independence, Missouri, to live the rest of his life with his family. Truly a great American!
Reader Cindy Simpson would have some questions for FDR:
If he were in office today, he’d probably be impeached: Did he know about Pearl Harbor? If so, when, and if not, why? And what about those affairs—for both him and his wife?
But I admire Franklin D. Roosevelt. I believe he led this country through a very difficult time—he helped to get people relief and employment during and after the Great Depression; established Social Security, the SEC, and the FDIC; and navigated the U.S. entry into WWII (though of course, it wasn’t all good).
For college student Zubair Merchant, it’s a tossup between two young presidents, John F. Kennedy and Barack Obama:
Both men had a passion and honor in office that I think is characteristically unique to them. It also helps that they were young and inspirational presidents and that I am in college.
I think that 50 percent of the presidency is policy and 50 percent is rhetoric. On the policy side, you can argue that JFK didn’t have time to do much, yet Obama (I believe) moved this country forward in a way that we haven’t seen in a long time (he’s the liberal Reagan, but cooler). On the rhetoric front, they are, in my view, the most inspirational presidents in history, and their youth carried a message that is unparalleled.
Finally, Christopher Wilson didn’t support Barack Obama during his candidacy, but says he still admires him the most—“without question”:
When someone is observed with such scrutiny and vigilance, they cannot escape their faults. President Obama had his. The pivot of his leadership was changing an opinion of what had been a strong conviction—not for the purpose of politics and remembrance—but because he knew it was the right thing to do! Specifically, having held strong opposing views of [same-sex marriage], President Obama made a remarkable turnaround and went full throttle in securing rights and becoming a quiet champion for the community—this in spite of his own personal beliefs. That’s rarely seen in politics, and applaudable.
Lastly, he gave the face of the presidency its most human touch. His humor, casual style, personal interests, candor, and confidence were his beauty. I, like many others, was able to connect with and see him for who he was ... a great father, husband, brother, uncle, son, friend, and human being.
Thanks for your comments, and stay tuned for next week’s Question of the Week contest.
That’s the charge leveled by one reader, J., who responds to my grammar confession from earlier this week by advising me to “battle the misinformed pedantry of the peevers”—and points out a number of ways in which I’m guilty of misinformation myself. But first, two more readers offer their defenses for linguistic laxity.
Knox, a self-described “ambiguity ally,” says her attitude to English was shaped by growing up in a family of dyslexics:
In my younger years, I thought I had missed out on the family superpower. Today, we’ve come to terms with the differences: Acute writing skills are as much a wieldable power as the extraordinary three-dimensional thinking that can make reading more difficult. But in the name of intellectual stimulation, debate around the importance of grammar and spelling still arises at the dinner table.
My youngest brother has a favorite defense; he likes to define “a word” with a sly smile and a hefty dose of side-eye. “Well. Don’t you know the definition of a word?” (He’ll pause for dramatic effect.) “According to the dictionary,” a stab at my English degree, “a word is a unit of language that functions as a principal carrier of meaning. The purpose of a word is not grammatical accuracy but a mode of conveying meaning. So, if you understood what I meant, then my mastery of language is intact.”
I’ll argue with him in the name of a good dinner debate, but truthfully I can’t help but agree. The English language for me is less a network of rules and codes and more a tool for impact. But here’s the upshot: It’s always the combination of the two—the codes and the meaning—that crafts the highest-impact message.
George takes a similarly laissez-faire approach:
Years of teaching both English and French as second languages have convinced me that when it comes to usage, the bottom line is getting the message across. All languages (except dead ones) are in a constant state of flux, and there is nothing any of us can do about it. It may seem at times that a language is “deteriorating,” but those who are most knowledgeable about language know that no language has ever “deteriorated.” All languages evolve.
I love to quote—perhaps not totally accurately—the inimitable “Mr. Language Person” (Dave Barry of the Miami Herald—retired) who reported an overheard conversation between Eileen and her friend. Eileen was complaining about being unable to go to the church social for lack of a ride. Her friend replied: “Eileen, ’f I’d a know’d you’d a wanna went, I’d a see’d you’d a got to get to go!” This is 100 percent wrong grammatically, but the message comes across perfectly. Why correct it?
But another defense of what I’ve described as “rule-breaking” lies not in rejection of grammatical rules, but in a more precise interpretation of them. Here’s J., whose point-by-point response to my post begins by unpacking Ruby’s critique of the Atlantic Daily verbs:
In “croissants uneaten,” uneaten is indeed a verb—specifically a passive verb—not an adjective. A “croissant uneaten” is a croissant that no one has eaten. That is, the verbal sense is clearly intact.
Although it appears in many of the same syntactic positions as adjectives, uneaten does not meet most of the criteria for adjective-hood (an asterisk indicates that something is ungrammatical):
It is not gradable: *more uneaten, *most uneaten
It cannot be modified by words like too and very: *very uneaten croissants
It doesn’t work with a verb like become: *The croissants became uneaten.
To get a better sense of all of this, compare uneaten to a past participle that has clearly become an adjective, like embarrassed. To be sure, when we’re discussing past participles, the line between verb and adjective is sometimes hazy. All we can do is look at the evidence.
***
I too have sometimes wondered if “Verbs” would be better titled “Past Participles”
The past participle is one of the six forms that every lexical verb has. The title “Verbs” encompasses those six forms. Don’t let a few misinformed peevers cause you to change the name.
You have what are clearly four verbs. Despite some readers’ desire for parallelism, there is nothing wrong or inelegant about having two past participles and two present-tense verbs.
You have four clear verbs. Does unimpressed straddle the line between verb and adjective? Probably. But isn’t there a verbal meaning there, i.e., that the press was unimpressed by someone or something, that someone or something did not impress them?
***
And I know that it’s frowned-upon to start a sentence with “and”
But it’s not. “Don’t start sentences with conjunctions” is a zombie rule. It has never been an actual rule of English grammar, and sentence-initial conjunctions are easy to find at all levels of formality, from Supreme Court decisions to essays in The Atlantic to newspaper articles to fiction to social-media posts.
***
On the other hand, isn’t language shaped democratically by those who use it?
YES!
***
So tell me: Are you a grammar geek who takes occasional guilty pleasure in splitting infinitives? Do you dare to dangle prepositions?
With all due respect, this is the kind of stuff that perpetuates zombie rules. It perpetuates ignorance about the way our language works.
There is no “guilty pleasure” in “splitting” infinitives because it has never been ungrammatical in English. And anyway, “split infinitive” is a misnomer, one born of early grammarians’ attempts to apply the grammatical rules of Latin (in which it is impossible to split an infinitive) to English. A to-infinitive clearly comprises two parts: the infinitival subordinator to and the plain form of the verb. This is clear in sentences like “I need to eat and sleep” and “We could go to the dance, but Cozznester doesn’t want to.” Nothing is being split in a “split infinitive.”
When you suggest that splitting infinitives and stranding prepositions is something that only grammar renegades do—especially when you do it in a widely read publication—you’re adding fuel to people's misconceptions (and their nervous cluelessness) about English. There’s no guilty pleasure in doing these things: They’re a natural part of English grammar. There are conventions that formal writing must adhere to. But conflating stylistic conventions with grammar leads people to believe that those conventions are actual rules.
I’m aware that this piece is trying to be light, to adopt a cheeky tone. But people who write about language need to battle the misinformed pedantry of the peevers. They need to strive to show readers how English actually works, not how those peevers want it to work.
Point taken. And gauntlet thrown.
Searching for further insight into stylistic peevery, I followed one of J.’s links to discover Britt Peterson’s 2014 Boston Globe column “Why We Love the Language Police.” Here’s Peterson’s central question:
It’s long been recognized that language is culturally contingent and constantly evolving, rather than being a strict, logical system that can be frozen in its 16th-century state, as [grammarian N.M.] Gwynne would have it.
And yet the enthusiasm with which people read Gwynne suggests that, outside academia, there’s some continuing appeal in being lectured about split infinitives and misplaced apostrophes. In fact, for hundreds of years, English-speakers have reveled in scolding each other and being scolded about language. Gwynne’s little book is just the latest to put the spotlight on an enduring conundrum: In a world where hundreds of millions of people use the language effectively every day, why do so many of us love to hear that we’re doing it wrong?
Proud pedants and peevers come forward: What’s so great about your usage rules? Can you defend against the charge of spreading misconceptions? If there’s no grammatical case against (for instance) a split infinitive, what’s the aesthetic one? Send your best case for conventions to hello@theatlantic.com.
Forgive me, dear readers: I have sinned against grammar and in thy sight, and, as I might have expected, you’ve caught me. I’m referring to the “Verbs” section of The Atlantic Daily newsletter, which includes a series of four links attached to four (hopefully) sonically pleasing predicates. For example, our February 7 edition:
The problem is that they’re not always, technically speaking, verbs. As one reader, Ruby, explains:
With respect, the phrase “croissants uneaten” contains no verb. Rather, uneaten is a verbal, a verb form that acts as another part of speech. In the phrase “croissants uneaten,” uneaten is an adjective that describes croissants.
Michelle asks for “parallel structure, please”:
While I loved seeing the Verbs section reinstated, I was a tad dismayed when “add up” appeared alongside “unimpressed,” “soured,” and “swiped.” As a former English teacher, I always impressed upon my students the importance of parallel structure to assist readers in following along, which is perhaps why I found the shift from past to present tense jarring: Why not “Press unimpressed, sugar scientists soured, identity swiped, figures added up”? I realize there is a slight difference between the phrase “add up,” which connotes “making sense,” versus “added up,” which suggests “tallying.” Perhaps you should have selected another example since the first three verb forms function as past passive participles (adjectives), while the last is definitely a verb.
And Joseph looks even closer: “Please note that ‘unimpressed’ is an adjective, not a verb.”
It’s true! It’s true! I throw myself upon your mercy. (Being also at the mercy of Merriam-Webster, I have verified that preposition.) But what’s a would-be wordplayer to do? The rules of grammar are many and rigid, the headline-pun options comparatively few. I reserve the right to rebel for rhythm’s sake. I must claim my freedom to conjugate! And, well, it’s the little things in life that keep us going, and on a grim news day something like “press unimpressed” can be too much fun to pass up.
Yea, though I walk in the shadow of stylebooks AP, MLA, and Chicago—though I am passionately pro-Oxford comma; though I get distressed by misplacement of hyphens; though indeed, I too have sometimes wondered if “Verbs” would be better titled “Past Participles”—I am only a writer and only human, and I persist in doubt.
As my colleague Joseph knows after fielding my not-so-correct attempt to correct him, I still have trouble understanding how the phrase “to jibe with” can reasonably signify agreement. My editor, Chris, can attest to my habit of putting commas in places where they are unwelcome, if not strictly prohibited (it’s for the musicality, I have oh-so-earnestly told him). And I know that it’s frowned upon to start a sentence with “and” or follow a semicolon with “but”; but there are times when, for reasons of cadence or tone, it just feels right to do it. I know that “but it sounds good!” is not much of a logical argument for anything—but such is the logic to which I bow, time and time again.
I blame my education for the crisis of faith. In college, I divided my time between copy-editing jobs and creative-writing workshops, developing equal reverence for protocol and for experimentation. I also earned a degree in literature, which means I am now well acquainted with the glorious multitude of things one can do with the English language, extremely skilled at overthinking the meaning behind a particular comma, and—when it comes to my own writing—desperately confused. Forget grammar-Nazism: Communication is a kind of social contract, and there’s an egalitarian rightness to holding all writers to the same standards. On the other hand, isn’t language shaped democratically by those who use it? And art is a meritocracy anyway, and creativity means pushing limits, and where’s the danger and joy and intuitive magic in playing strictly by the rules?
I like to think I’m not alone in all this agita. So tell me: Are you a grammar geek who takes occasional guilty pleasure in splitting infinitives? Do you dare to dangle prepositions? Are your serial commas (however you feel about them) selectively enforced? Send your copy confessions my way: hello@theatlantic.com.
(Editor’s note: Alana Semuels joined the TAD discussion group of Atlantic readers for an “Ask Me Anything,” and a lightly edited version of that Q&A is below. Reader questions are in bold, followed by replies from Semuels.)
Hi Alana. Welcome to TAD and thank you for being here. I live in the heart of the Rust Belt—Pittsburgh—and I was wondering what you see as the best hope for river towns like Aliquippa and Beaver Falls that were founded on steel but now barely scrape by. We are losing young people at a rate of 30 percent, I think. A couple of towns have found a niche and have become viable, but I just don’t see many of these places recovering. Do you think they will inevitably disappear, like so many other towns in the Midwest?
I started my journalism career in Pittsburgh, at the Post-Gazette, so I have a special allegiance to the region (except to the Steelers. Go Pats!). There are towns—like Goshen, Indiana—that have survived the rural exodus, mostly by specializing in a few niche industries. My article “America Is Still Making Things” talks a little more about this. But only a few towns are going to be able to pull this off. I think the rest are going to keep losing population and young people. There’s hope for them to become retirement communities, but that’s not necessarily the most dynamic economic engine.
As someone who really went around and talked to a lot of people from all corners of America, did you think Trump might win the election? Or were you as surprised as the rest of us?
No, I was surprised, too. I wish I had talked to more people about this before the election, but I, like many other journalists, was focused on other things.
What do you think is the biggest misconception people have about the average Trump voter?
Democrats seem to think Trump voters are dumb; they aren’t. They just really don’t like Democrats, especially Hillary. A lot of the people I talked to said they were more anti-Hillary than they were pro-Trump. A friend who is a pollster said people in his groups thought Hillary was a liar and Trump an a-hole, and they’d rather vote for an a-hole than a liar.
I think a lot of people were long-time Republicans, and are as unlikely to change parties as urban Democrats are. But there was one woman who said to me she didn’t know who she was voting for until she got into the voting booth, and then she thought about the FBI and Hillary, and then voted for Trump. I think she is fairly representative.
It irritates me when Democrats criticize Rust Belt voters for supporting Trump. That’s the point of voting—everyone gets to choose who they want. Alexander Hamilton would have preferred that only the educated choose who was in charge, but that’s not a democracy.
Do you think sexism was a big factor in Midwest voters’ hate for Hillary?
No, I actually don’t. But I’m a business reporter, not a politics reporter, so I could be wrong.
Well, the most surprising thing was when I was sitting at a pizza parlor talking to two young guys who said some really racist stuff (they didn’t like cities because they had too many black people, etc.), which they knew was on the record. I think it really illuminated for me how different the two worlds are: What they were saying was perfectly fine to say in the world they lived in. In the world I live in, it was shocking.
Where is somewhere you’ve traveled that has really surprised you and changed how you think, either in a good way or a bad way?
Beaumont, Texas, was a fascinating place for me to visit. I had written a lot about segregation at that point, but it is often hard to articulate why segregation is so problematic (beyond general issues of equality and fairness). But I talked to a mother whose daughter had been succeeding in a good school in a white neighborhood, and then had to move to a bad school in a poor neighborhood. In the first school, her daughter had access to a computer, books, and an engaged teacher. In the second one, many of the kids in her class didn’t know how to read.
Based on what you’ve seen of America, do you consider it likely or unlikely that large-scale violent conflict breaks out between factions of Americans?
Hmm, I don’t think widespread violence is likely. One interesting thing I’ve noticed in trips since the election is how everyone is just going about their daily lives as before. Guys, the world has not ended (!!). If anything, people seem more politically engaged than ever.
One of the things I was most curious about after the election was who was going to be impacted first (apparently, the answer was immigrants from seven countries). But people live locally, and act locally, and will see little changed in their lives for now, I think.
What do you think is the limit at which Trump’s support among rural voters collapses, if there even is one?
I have thought about this a lot, and I think that the limit is a lot higher than Democrats would hope. I was in rural North Carolina last week talking to voters, and I was surprised how many of them—poor, rich, white, black—said they thought Trump was doing a good job. (This was in the midst of the immigration furor.) They said they thought he had a big mouth, and said things that he shouldn’t, but they wanted to give him a shot to turn the country around.
I enjoyed your story “President Trump, Job Creator?” Do you think that Trump either knows or cares that companies are playing him by letting him claim credit for things that they were going to do anyway?
I think he loves this. Announcements like Intel’s recent one about the chip factory in Arizona make him look good, even though he did nothing to make them happen. Intel, like most companies that make these announcements, had planned to do this long ago. By announcing it Trump’s way, though, they might be able to curry favor with him. I don’t think they’re playing him; I think he’s playing them.
What do you think will be the long-term ramifications of Trump’s economic policies? Do you think these ramifications could have a major impact on how rural areas vote, or do you think values and religious concerns will still be supreme?
This is a good question, but unfortunately I don’t have a great answer. It’s possible that Trump will convince more companies to manufacture here. The voters I talked to certainly think he is doing a good job so far. He is really good at making independent business decisions sound like they were because of him. If he keeps this up, I think he’s going to keep a strong contingent of happy voters in the Rust Belt.
But a lot of these manufacturing jobs are going to be automated, and so that’s not going to help these voters in the long term. The automation could take a decade or so, though, so it may not be relevant for 2020.
However, as I wrote in my piece from Elkhart [“It’s Not About the Economy”], economic progress doesn’t necessarily mean voters will support the president. People live in bubbles of their own making, and they often don’t let facts disrupt their narrative of what is going on. (Liberals too!)
I really liked your article last November about the Democrats not having an easy answer for the Rust Belt. So my question is: Is there actually a pitch the Democrats can make that will work? Trump is promising the moon, and while I don’t think he can deliver, it seems like an impossible promise to compete against.
I think the pitch that will work is not the most sensible one, which is training and education. Rather, I think if somehow Democrats can go more progressive, towards a “growth that includes everyone” type of message, that could be more appealing, especially to one-time union voters. That might mean talking more about employee-owned companies, about the importance of unions, or about making businesses share more profits.
I liked your piece about the TPP and its real impact on the American worker, but it’s interesting how few people in the Rust Belt seem to understand these concepts. Where do you think the disconnect in communication and understanding is? Is there a better way to get these topics across?
That’s a really good question. I think that’s another thing I’ve really learned while talking to people across the country: People often believe the version of economics that’s simplest. So, “your job is being outsourced to Mexico” is easier to get angry about than “TPP would have raised wages overseas, which in turn could have driven companies to relocate to the U.S., which in turn would have created jobs here.” People in the Rust Belt really hate NAFTA, and it’s going to be hard to change their minds about trade.
Seeing that the Trump administration has so far been rather, let’s say, incompetent, do you think he’ll actually be able to impose the trade restrictions he wants? And how do you think they’ll play out if they do indeed happen?
I think that for the next two years, if past is prologue, Trump is going to do pretty much whatever he wants. He already killed TPP. Renegotiating NAFTA is going to be harder, but I think the administration is very serious about this border adjustment tax. The good thing for Democrats, I think, is that most of these trade policies are going to be an absolute disaster for the economy. It’s worrying that Trump does not seem to want to listen to economists, but this will be an interesting experiment in what happens when a country does not follow widely accepted economic principles.
Do you think the extremely polarizing nature of Trump will make the communities you visited more insular and defensive, and only deepen the divide in America? If so, is there anything that could be done to mitigate this? Is there anything the media could do on this front?
Yes, I do think the divide is going to deepen. People have beliefs about the country and the president and they are going to seek out news sources that confirm those beliefs. So people who like Trump are going to read things saying he’s doing a good job, and those that don’t are going to consume things saying he’s terrible.
I think local newspapers are important here. People care about what’s happening in their communities and still consume local news. So to the degree that those papers can burst through those bubbles and share facts, that’s pretty important.
I was recently reading a New Yorker article about how a man from rural America who said he hated black people started up a discussion on C-SPAN with Heather McGhee, the head of Demos, who is black.
They’re now friends, and McGhee recommended that the man read up on black history and get to know more black people, which he did. I think the more people read up on people very different from them, and make contacts with those people, the better (yes, I realize this sounds very Kumbaya). I am trying to read Hillbilly Elegy right now (though I am not making much progress), and I want to read more about people in rural areas, even as I do more reporting there.
Something I’m always curious about, particularly from writers such as yourself: Do you think online commenting provides an opportunity to bridge some of these economic/educational/cultural divides, or does it just widen the gulf? TAD was founded as an escape from the usual fracas of online comment boards, but I’m curious what role you think open online comments play in today’s America.
If you mean commenting on sites like The Atlantic, I’m not sure that it can bridge the gulf. Many of the Republicans I talk to in the Rust Belt have never heard of The Atlantic and certainly would never read it. They have their news sources; Democrats have theirs.
I’ve thought a lot about how to bridge the divide I wrote about in “America’s Great Divergence,” and I just don’t have an idea. I do know that the opinions and input of people different from me are really important in my reporting, but I usually get those inputs by visiting somewhere really far away and talking to random people. I’d love to have more of those people in my Facebook feed, but I just don’t.
“I’d love to have more of those people in my Facebook feed, but I just don’t.”
I think that’s a really great point. Elsewhere on TAD today it was mentioned that Vox was reporting only 9 percent of Republicans disapprove of Trump right now. That isn’t surprising in and of itself, but it did strike me how everything I’ve seen lately, through social media and through the news sources I regularly read, has been so negative about Trump that you’d think the entire country had turned on him. It’s an important reminder that there really is another world out there that is easy to completely miss.
What do you think will be the most underreported, yet necessary economic/business reporting of the coming year?
I think the middle class is going to continue to hollow out, no matter what Trump does. People at the high-skill, educated end of the spectrum are going to do great; everyone else is going to continue to scrimp. Especially with GOP control in the nation and in many states, there’s going to be little appetite for raising the minimum wage in many places, and there’s likely to be more rollback of union protections. (I believe Iowa is considering scrapping collective bargaining.)
I also think that Trump’s changes to the tax code, whatever they end up being, are going to be a big deal. So many economists I talk to say the way to lessen income inequality is to raise taxes on the rich. That is definitely not going to happen now.
Is there anything you can share about any big (or little, or anything in between) stories you’ve got coming up? Do you see your writing taking a specific shape under the Trump administration that maybe you wouldn’t have expected last October?
Yeah, for better or worse, Trump dominates the news cycle, and he is what people are interested in reading about. I’m interested in what will happen if Trump dismantles regulations to make things easier for business, especially in the environmental arena. I think a lot of journalists are wary of writing stories for four years that are basically “Trump just announced a policy that is dumb. Here is why,” but on the other hand, you can’t ignore when he puts forth things that fly in the face of decades of economic thinking.
I am always open to story ideas about the new world we live in, so if there’s anything you’d like to see us cover, shoot!
The poet Thomas Lux died on February 5. It seems fitting to honor him and his decades of Atlantic contributions with a brief history, but also with his own words in his own voice.
Speaking about his craft in an Atlantic interview from 2004, Lux is both a magpie of unusual facts (“Without the dung beetle we’d all be up to our clavicles in cow pies. They deserve an ode!”) and a defender of poetry’s essential weirdness:
I love mystery, strangeness, nuttiness, wildness, leaps across chasms, irreverence, all the crazy stuff we love about poetry. We don’t usually love poems because they are well made, or smart, or deep. We love them for their crazy hearts.
In the nine poems Lux published in our pages, you’ll find wry humor—1984’s “Snake Lake” begins:
My friends, I hope you will not swim here:
this lake isn’t named for what it lacks.
And you’ll find startling echoes of the present in “Henry Clay’s Mouth” (1999):
He said: “Kissing is like the presidency,
it is not to be sought and not to be declined.”
…
It was written, if women had the vote,
he would have been President,
kissing everyone in sight,
dancing on tables (“a grand Terpsichorean
performance ...”), kissing everyone,
sometimes two at once, kissing everyone,
the almost-President
of our people.
Years ago, as part of a series for poetry month, we gathered a selection of old Atlantic audio recordings of poets reading their works. My part was to convert the files from an obsolete, unplayable format to mp3. Among them was Lux’s reading of “Virgule,” an ode to “/” that begins:
What I love about this little leaning mark
is how it divides
without divisiveness. The left
or bottom side prying that choice up or out,
the right or top side pressing down upon
its choice: either/or,
his/her.
Listen to him read the entire poem:
Far more qualified people can speak to his particular brilliance—I’m just someone who tried to rescue his voice, or a minute and 38 seconds of it, from the online abyss and deliver him to you.
I asked my colleague David Barber, the Atlantic’s poetry editor, for his memories of the magazine’s long history with Lux. He writes:
Tom Lux’s quirky, wily, incorrigibly uncanny poems left their mark far and wide from way back, but The Atlantic could be said to have a special claim on him.
For one thing, he was a local boy made good: Born and raised in Northampton, Mass., where his father ran a dairy farm, he was a fixture for many years in Boston and its environs, home base to the august bewhiskered poets who founded the magazine in 1857. His editor at Boston’s Houghton Mifflin for several of his celebrated collections was Peter Davison, the Atlantic’s late longtime poetry editor and literary lion of parts. His work appeared early and often in these pages over those years, immediately recognizable for its mordant wit, offbeat verve, and matchless knack for musing beguilingly on just about anything. The only predictable trait of a Lux poem was that it would be the one and only thing of its kind.
It’s the weariest of clichés to say that a certain poet sounds like none other. Lux was the real McCoy. It’s there in the deadpan delivery, the sure comic timing, the live-wire ear for oddball lingo and kooky hearsay, the slyboots way of spinning tall tales out of small talk. His bittersweet satirical bent belongs to no school or tribe; his smarts and chops were his and his alone. Is there another American poet since Stevens who conjured up so many humdinger titles? Could anyone else have composed an ode to the secret life-force of a punctuation mark? Was there ever a laconic elegy for long-gone summertimes quite as definitively disarming as “The Man Into Whose Yard You Should Not Hit Your Ball”?