Living a long life seems the obvious goal for most people, and many of them, like Dylan Thomas, raged against the dying of the light. Others—like the transhumanists that Olga featured recently—want to transcend death entirely.
Well, like most things, the answer is not a simple yes or no; it depends—on so many factors, some of which we can control (e.g., not smoking) and some of which we can’t (e.g., our genetic makeup). If you’re in good health physically and have all your faculties and some purposeful work or hobby, or just something you really enjoy doing, then maybe it’s a good idea to live a long life. But those are a lot of ifs.
Another reader, John, looks to human connections:
Health is essential to making survival good, but it also helps to have a caring partner, for companionship and support. I am biased, because at 81, I have my health and a good wife. I’d like to live past 100 if these conditions remain. But if I become disabled, chronically ill, or alone, life is unlikely to be worth it.
Rita has a bleaker outlook:
Looking at my genetics, I’m starting to think I may live a long time. I’m not yet 70, but I can probably expect to go until 95 at least.
This doesn’t fill me with joy. Who’s going to look after me when my eyesight starts to crap out and I get weaker? Where’s the money going to come from to continue to pay my bills? These are not minor questions. Their answers, as far as I can see, are “nobody” and “nowhere.”
And anyway, it’s not as if I can look forward to hiking in the desert or exploring foreign cities in my extreme old age. Nor will many of us be directing films or conducting research in our nineties. What most of us can anticipate is day after day staring at a TV set, wondering if anyone is coming for a visit.
She adds, “That Atlantic excerpt you cited from 1928 nails it”—namely, “Any programme which has for its object the prolongation of life must also have, accompanying this increased span of life, the ability of the individual to engage actively and with some degree of effectiveness in the affairs of life.” Another reader, Bernyce, is also worried about infirmity:
After the age of 75, the human body declines—if not steadily, then in jerks and/or slopes. People begin to lose hearing, eyesight, and useful teeth, as well as the ability to digest the food they ingest. A younger friend (74), living in an assisted-living facility because her son lives 200 miles away and she is no longer able to walk, says her companions say the food is delicious. She says the desserts are tasty but everything else is flavorless and slippery. People who have loved ones to care for them may be more fortunate.
Watch the French film Amour. It is a short, beautiful, and painful glimpse of the end of life in a loving marriage. Even when we are not alone, the end of life is very difficult.
Here’s the haunting trailer for Amour:
Here’s Jim:
At the somewhat advanced age of 88 (and I’ll be 89 in a few days), I’m tired. I think I’ve accomplished all I’m capable of and am ready to rest … permanently, I guess. Curious to see what, if anything, comes next. I’ll let you know.
Jim’s “I’m tired” reminds me of a similar sigh of acceptance from William Buckley in one of his final interviews, before his death at the age of 82:
The clip is worth watching in full, even if you’re no fan of the conservative figure, but it begins with Charlie Rose asking Buckley if he wishes he were 20 again, and he replies:
No, absolutely not. If I had a pill that would reduce my age by 25 years I wouldn’t take it. Because I’m tired of life. I really am. I am utterly prepared to stop living on. There are no enticements to me that justify the weariness, the repetition ...
Buckley goes on to quote Sherwin Nuland—a surgeon, professor of bioethics, and author of How We Die: Reflections on Life’s Final Chapter—who once quipped, “The greatest enemy of older people is young doctors”—because they’re determined to keep you alive at any cost. This next reader would likely fight them off:
I am ready to go at 61. We have no problem helping our sick and injured pets, farm animals, etc. find final peace, and now people are beginning to evolve on this point too. Thank god. (Yes, I think god would agree.)
Let’s face it, after 60, folks begin kicking the ol’ bucket for normal end-of-life reasons. Seems the body remembers “hard” living in the early years. And this is okay. I’m reading The Razor’s Edge right now and that helps me understand.
Here’s Bill:
As an 85-year-old, I recognize that my usefulness is coming to a close.
At this time, I seem to provide joy to my children and grandchildren.
When I become a liability and need the constant care of others, I am content to have my life end, even if I have to take care of that myself.
At this time I do not need nor want that kind of care. But it may come soon, and I can face that comfortably.
Sharon is a very longtime reader:
Dear Atlantic, magazine of my youth and age:
I believe that one’s life should be as long as one can make a contribution in some way. For me, personally, I wish to live only as long as I can be useful. At 72, and a few years before, I made the decision that when I felt I could no longer contribute in a tangible way, I will end my life.
I was greatly miffed by an article by a know-it-all person of the psychiatric persuasion, who said that anyone who wished to end his or her life was depressed. In my opinion, that’s balderdash. My firm belief is that we should live only as long as we can help to decrease our particular footprint on the planet by benefitting others. My desire is to have 15 years of retirement, but if I can’t meet my personal hook, I’ll discard that goal.
I think it is immoral to artificially prolong the physical existence of an individual who is in no more than a vegetative state. On the other hand, I believe that no one has the right to make that choice for another person.
Kent has some advice:
I think everyone should think about a long life, and when you’re about halfway there or within 30 years of being there, set yourself a goal of how old and how alert you want to be. It’s likely to affect your health and wealth by making you focus on more important things in life and your ability to experience them. The earth doesn’t owe anyone longevity, so it’s up to you to figure out what and where and when you’ll take charge of your existence and final stages of life.
Kent’s note reminds me of my stepfather, who’s approaching 70 and has a really wise approach to the remainder of his life: Instead of focusing on how long he’s going to live, he’s focused on how short he can make the window of time in which he’ll be infirm. By eating healthy, cycling dozens of miles per week, and generally keeping his stress low, he’s determined to shrink that final period as much as possible.
This next reader, Rachel, also looks to her parents:
I am compelled to write to you! That has never happened before.
In the last six years, I saw both my parents off this planet. Both were happy to go and did not overstay. My mother, always in good health, had hoped for some more years but fell ill. Once that happened, she did not want to linger. It was too physically painful.
My father simply grew lonely and disinterested, and he too welcomed the end. He actually asked me to hasten it for him, but I reminded him it was against the law (!)
Now I have my parents-in-law. He is a priest whose life revolved around being connected to others and doing pastoral work but who has withdrawn into himself these last five years and today makes no contribution to anyone, anything, anywhere. This is so wrong. He could bring meaning to people but has closed those doors.
My mother-in-law, who has had to put him into a home because she cannot care for him, spends her days wracked with guilt for having done so. While he abhors the thought of death (I thought he would want to go to his maker??), she welcomes it—to be relieved of her guilt.
But neither is dying soon. What kind of life is this for them and their families, everybody’s pocketbook, and the earth’s resources?
I am soon 58 and HAVE NO DESIRE to live long. My parents checked out at 87 and 89 and I would be happy to go sooner, while I am still making some contribution to the world and to my loved ones.
Emma contributes through teaching:
Life that includes giving, sharing, and caring for others is worth it. In contrast, life as a “parasite”—endlessly entertained by television and card games—is perhaps a more arrogant use of resources. Of course I can say this now, at age 75, on the day I teach a Chinese émigré English, the day after I teach three little girls piano, and the day when I will soon perform music for fellow residents in our retirement community.
What I will say ten years from now, when all I hope for is to see my grandchildren safely through adolescence and I have no energy to spare for what I do now, I do not know.
“If we are being quite frank, there are a few exceptional people who may have something special to give to humanity, but the vast majority of people are simply useless.”
Are the majority of people useless? If we only consider people who have made contributions to the world through their inventions, philosophies, scientific or medical research, political leadership, military or business achievements, etc., then I would agree that the vast majority of people would seem to be useless.
However, every person who has ever lived on the face of the earth has influenced or impacted the lives of those around them in ways we know nothing about, unless their life touched us personally. And then only I can know how they impacted my own life experience.
Some people’s influences were/are positive and constructive; some negative and destructive. But they all contribute to the evolutionary process of human consciousness, and therefore to each person’s experience, which in turn influences the lives of people of succeeding cultures and generations.
The greater question to me is why we are here at all. What is the reason or need for our actual existence? But this gets into a philosophical discussion that could go on and on.
On Wednesday, a Northern Virginia school district shut down for the day after a number of staff members asked for the day off to participate in “A Day Without a Woman,” a protest to highlight the contributions of women to society. A few weeks ago, a number of restaurants and fast-food chains closed down for “A Day Without Immigrants” to spotlight immigrant contributions in the United States.
So this week, we asked Politics & Policy Daily readers to fill in the blank with a group of people that deserves to be commemorated: A Day Without ______. Our first entry comes from Leslie, who recommends holding “A Day Without Daycare” in order to show:
(1) how important daycare services are to productivity
(2) how parents’ need for daycare is critical (so that they can work)
(3) how much families rely on unpaid daycare help from relatives and friends
Similarly, Brooke proposes a “Day Without Caregivers”—of any kind:
Schools would have no after-care and closed daycares would mean many workers would stay home. By doing our own care work, we would all appreciate how much work it is, how lovely it is to be present for each other, and how hard it is to be present for each other.
Once, when I lived in Bangladesh, a friend’s father was hospitalized. We took turns cooking for him (and the rest of the family) because the hospital did not provide food, maintaining his shadow “chart” so that we had a record of everything that happened to him, and sitting with him so that he always knew someone by his side. In a day without caregivers, we would honor caregivers and the relationships of care that are part of individual and social health.
Sally would agree:
The purpose of the commemoration is to highlight a group which is historically underappreciated, substantially underpaid for their labor, and taken for granted, yet would be sorely missed all around the country. Caregivers—for frail and disabled folks—fit. Now that baby boomers are reaching the stage of needing caregivers, we need to shine a light on how necessary they are for the people they serve and their families. The hard part about this choice is that caregivers can’t simply vanish for a day without endangering people’s lives.
Another reader proposes “A Day Without Cooks” to help recognize their importance in society and in families, adding “In China, there is an idiom: ‘The God of the people is food.’” Emily suggests “A Day Without Working Parents,” and Lynn can’t pick just one group of people; she wants to honor garbage collectors, janitors, teachers, and nurses.
Andrew wonders how Americans would fare for 24 hours without petroleum products:
This would be especially shocking for those on the left and in Congress who like to make the domestic oil and gas industry their perpetual whipping boy: modern healthcare, manufacturing, cheap and safe food, sanitation, and all manner of things depend on plastics and petrochemicals. A similar argument could be made for the financial services industry. Both are backbone industries upon which the economy runs, so their success (and their compensation) is a testament to how critical and integral they are to our daily lives and quality of life.
Andrew added:
White males, police officers, doctors and nurses, (even) politicians.
I don’t seriously propose this, but rather point out that we all play a part in society and the economy, and the whole premise of the “day without” is to draw attention to the contributions of certain groups of people. Like a multi-legged stool, if you remove one of the legs, things get a little wobbly. I’m more of an advocate for everyone doing their jobs, doing them well, and letting your accomplishments speak for themselves.
Chris thinks a good group to commemorate might be older, retired Americans who serve as volunteers in museums, hospitals, and schools:
Many people believe retired Americans are just living it up and collecting their social security and Medicare. We are doing some of that; we’ve earned it. But, we’re also contributing every day in many, many ways.
Catherine figures a day without her fellow Millennials might be enlightening: “I think that might indicate to our elder detractors how hard and how much we work in our current economy.” Avid question-responder Howard thinks “A Day Without the Mainstream Media” would help put things in perspective:
Without them, we’d be relegated to the likes of Breitbart, Alex Jones’s Infowars, Daily Caller, The Blaze, American Pravda (aka Fox “News” Channel) and their counterparts on the nutty left (although there aren’t nearly as many).
Finally, Joseph Luchok simply hopes for “a day without Twitter.”
Alexandra of Russia and her son Alexei, photographed between 1910 and 1913. (Library of Congress)
Today is International Women’s Day. It also happens to be the 100th anniversary of the start of the revolution that brought down the Russian empire. Given the coincidence, I was delighted to find in our archives an article from our January 1928 issue titled “The Fall of the Russian Empire: The Part Played by a Woman”—that is, until I read author Edmund Walsh’s assessment of exactly what that “part” was:
Russia was the last island fortress of absolutism in the rising tide of democracy, the outstanding anachronism of the twentieth century. … It defied the elements for three hundred years—until the deluge came. Whose hand unloosed the flood gates? In my opinion, a woman, all unconsciously, had more to do with the final debacle than any other single cause. … History probably will clear the memory of Alexandra Feodorovna [of treason, but] it can never clear her memory of tendencies, practices, and imprudences that contributed notably to Russia's ruin. The domination which this imperious, proud, aloof, and resolute woman exercised over her irresolute and impressionable husband became such a menace that more than one grand duke, duchess, and general cried out in warning against it. …
Revolutions are made by men and women determining events. Men are swayed by powerful human emotions. Women create them. And the master passion, particularly in neurotic females, can be as elegantly indifferent to the realities of life and war as ever Montesquieu was to the existence of God.
It’s a fascinating historical document, undeniably sexist in its overtones. The gist of Walsh’s argument is that the Tsarina Alexandra, driven by fear for the health of her hemophiliac son, gave the self-proclaimed holy man and healer Grigori Rasputin a level of influence that irrevocably weakened the Russian government. For evidence, Walsh delves into the embarrassing intimacies of Alexandra’s letters to her husband. And he criticizes the empress on two familiar, contradictory fronts: On the one hand, she’s weak and overly emotional, too much guided by motherly worries to see the bigger picture of Russian politics. On the other, she’s aggressive and overly domineering, stepping outside her proper sphere of childrearing to advise her husband on governance. She’s portrayed as a femme fatale, making a “subtle approach to political questions … through the gateway of the Tsar’s affections.” But she’s not granted agency either: Walsh argues she brought about the fall of the empire “all unconsciously.” She is, like female leaders still are, damned for the stereotypes of womanhood she does fulfill and damned for the ones she does not.
But none of this is to dispute the chain of events that Walsh describes. Alexandra and her husband did fail at governance: For any leader, male or female, it’s a heartbreaking reality that even the safety of one’s own family must come second to the national interest. And for all the sexism embedded in Walsh’s narrative, I agree with his central point that “revolutions are made by men and women determining events.” What struck me, reading this article today, was Alexandra’s simple human vulnerability, and my own reaction to it—my inclination to sweep this unflattering story under the rug. When we seek to recognize the women of history, what do we do with the history that reveals individual women as less than admirable? How do we celebrate women—our role models, ourselves—as powerful, vulnerable, fully complex humans, flaws intact?
If you have thoughts on that question, please let us know. It reminded me of a comment about female characters I’d seen recently from a reader during TAD’s book-club discussion of Margaret Atwood’s The Handmaid’s Tale. The dystopian government of that novel is built on the oppression of women, to which its central character, Offred, bears witness but doesn’t fight back. The reader wrote:
When I started reading The Handmaid’s Tale before Christmas, someone (a LADY, before you get all “pffft men!”) told me that the women of the novel, especially Offred, bothered her. It took me a long time to settle into why Offred might be kind of frustrating—and it’s because I think, as women, we all want her to fight, to struggle, to know that what is happening to her is wrong. But she’s a product of her conditioning, isn’t she? And she’s been conditioned so well. She passively allows things to happen to her. She just rides along, not speaking up or out or anything.
But as another reader pointed out, “Offred is like a lot of folks. When it comes down to it, it’s easier to put your head down and survive.” And the first reader agreed, summing it up:
I think we expect women characters to be strong nowadays. We’re shocked when they aren’t.
We’ve come a long way since 1928, when portrayals of women like Walsh’s were more or less the norm. Over decades—centuries, for that matter—women have worked to prove that they are strong and brave and smart; that they are leaders and revolutionaries; that they are more than mothers only; that motherhood is no lesser thing. It’s been proven time and again, but women still must demand the freedom to be imperfect—which is no more and no less than what every human deserves.
That’s the question that reader John Harris has been asking himself lately. He’s not alone: In 1862, one of The Atlantic’s founders, Ralph Waldo Emerson, wondered the same thing about aging. Acknowledging that “the creed of the street is, Old Age is not disgraceful, but immensely disadvantageous,” Emerson set out to explain the upsides of senescence. A common theme is the sense of serenity that comes with age and experience:
Youth suffers not only from ungratified desires, but from powers untried, and from a picture in his mind of a career which has, as yet, no outward reality. He is tormented with the want of correspondence between things and thoughts. … Every faculty new to each man thus goads him and drives him out into doleful deserts, until it finds proper vent. … One by one, day after day, he learns to coin his wishes into facts. He has his calling, homestead, social connection, and personal power, and thus, at the end of fifty years, his soul is appeased by seeing some sort of correspondence between his wish and his possession. This makes the value of age, the satisfaction it slowly offers to every craving. He is serene who does not feel himself pinched and wronged, but whose condition, in particular and in general, allows the utterance of his mind.
By 1928, advances in medicine had made it more possible to take a long lifespan for granted. In an Atlantic article titled “The Secret of Longevity” (unavailable online), Cary T. Grayson noted that “probably at no other time in the history of the human race has so much attention been paid to the problem of prolonging the span of life.” He offered a word of warning:
Any programme which has for its object the prolongation of life must also have, accompanying this increased span of life, the ability of the individual to engage actively and with some degree of effectiveness in the affairs of life. Merely to live offers little to the individual if he has lost the ability to think, to grieve, or to hope. There is perhaps no more depressing picture than that of the person who remains on the stage after his act is over.
On the other hand, as Cullen Murphy contended in our January 1993 issue, an eternity spent with no decrease in faculties wouldn’t necessarily be desirable either:
There are a lot of characters in literature who have been endowed with immortality and who do manage to keep their youth. Unfortunately, in many cases nobody else does. Spouses and friends grow old and die. Societies change utterly. The immortals, their only constant companion a pervading loneliness, go on and on. This is the pathetic core of legends like those of the Flying Dutchman and the Wandering Jew. In Natalie Babbitt’s Tuck Everlasting, a fine and haunting novel for children, the Tuck family has inadvertently achieved immortality by drinking the waters of a magic spring. As the years pass, they are burdened emotionally by an unbridgeable remoteness from a world they are in but not of.
Since antiquity, Murphy wrote, literature has had a fairly united stance on immortality: “Tamper with the rhythms of nature and something inevitably goes wrong.” After all, people die to make room for more people, and pushing lifespans beyond their ordinary limits risks straining resources as well as reshaping families.
Charles C. Mann examined some of those potential consequences in his May 2005 Atlantic piece “The Coming Death Shortage,” predicting a social order increasingly stratified between “the very old and very rich on top … a mass of the ordinary old … and the diminishingly influential young.” Presciently, a few years before the collapse of the real-estate bubble that wiped out millions of Americans’ retirement savings, Mann outlined the effects of an increased proportion of older people in the workforce:
When lifespans extend indefinitely, the effects are felt throughout the life cycle, but the biggest social impact may be on the young. According to Joshua Goldstein, a demographer at Princeton, adolescence will in the future evolve into a period of experimentation and education that will last from the teenage years into the mid-thirties. … In the past the transition from youth to adulthood usually followed an orderly sequence: education, entry into the labor force, marriage, and parenthood. For tomorrow’s thirtysomethings, suspended in what Goldstein calls “quasi-adulthood,” these steps may occur in any order.
In other words, Emerson’s period of “ungratified desires and powers untried” would be extended indefinitely. Talk about doleful deserts! On top of such Millennial malaise, Mann also predicted increased marital stress, declining birth rates, a depleted labor force, and a widespread economic slowdown as the world’s most powerful nations entered a “longevity crisis.”
But that’s just one vision. Another came from Gregg Easterbrook, who anticipated “a grayer, quieter, better future” in his October 2014 Atlantic article “What Happens When We All Live to 100?” His argument has some echoes of Emerson’s, but with modern science to back it up:
Neurological studies of healthy aging people show that the parts of the brain associated with reward-seeking light up less as time goes on. Whether it’s hot new fashions or hot-fudge sundaes, older people on the whole don’t desire acquisitions as much as the young and middle-aged do. Denounced for generations by writers and clergy, wretched excess has repelled all assaults. Longer life spans may at last be the counterweight to materialism.
Deeper changes may be in store as well. People in their late teens to late 20s are far more likely to commit crimes than people of other ages; as society grays, the decline of crime should continue. Violence in all guises should continue downward, too. … Research by John Mueller, a political scientist at Ohio State University, suggests that as people age, they become less enthusiastic about war. Perhaps this is because older people tend to be wiser than the young—and couldn’t the world use more wisdom?
It’s a good point. Couldn’t we all use more wisdom, more experience, more opportunities to learn? Wouldn’t we make better use of our lives if our lives went on forever? Not so fast, Olga Khazan wrote last month:
A common fear about life in our brave, new undying world is that it will just be really boring, says S. Matthew Liao, director of the Center for Bioethics at New York University. Life, Liao explained, is like a party—it has a start and end time. … “But imagine there’s a party that doesn’t end,” he continued. “It would be bad, because you’d think, ‘I could go there tomorrow, or a month from now.’ There’s no urgency to go to the party anymore.”
The Epicureans of ancient Greece thought about it similarly, [psychologist Sheldon] Solomon said. They saw life as a feast: “If you were at a meal, you’d be satiated, then stuffed, then repulsed,” he said. “Part of what makes each of us uniquely valuable is the great story. We have a plot, and ultimately it concludes.”
Even so, some futurists believe immortality is within reach:
So, what do you think: Is there a limit to how long people should live? Is it selfish to want eternity for yourself, or would having even a few immortals around make the world better for everyone? Here’s one reader’s take:
This reminds me a bit of the Cylons in the “new” Battlestar Galactica.
With the ability to reincarnate infinitely, and be effectively immortal, they were callous towards humans, and killed humans with impunity. It was only when their ability to reincarnate was ended and they became effectively mortal (and thus subject to basically the same rules of death as humans) that they were driven to behave in a moral way.
But another reader argues:
I for one think the world would be a better place if we collectively took a longer view, and what better way to do that than to give everyone a stake in it?
On Tuesday, President Trump outlined his plans to increase defense spending and invest in America’s infrastructure. This week, we asked Politics & Policy Daily readers where they would allocate extra funds if they were in charge of the country’s budget. Here are some of our favorite responses.
The vast majority of respondents, including Stella Porto here, would invest more in education:
If I controlled the federal budget, I would strengthen basic public education. Provide more access to pre-school education. Make college more affordable. Expand community colleges. Develop re-training programs for those whose jobs have been eliminated by automation or other economic trends.
Everything in the country depends on the level of education of its people—absolutely everything, from preventing illness and choosing a better lifestyle to raising kids responsibly, choosing elected officials, and fighting for important causes. Citizenship depends on education. Access to good education is at the root of equality.
Chuck Barnes, a retired university faculty member and geologist, suggested funding a year or two of universal service for high school graduates:
I don’t mean military service, although that could be one option. Other options would include a wide range of work and/or training in social service, physical work, military service, etc. This would accomplish two interrelated goals: 1) recognizing that we are such a great nation, and that one to two years of service are a debt that should be paid for the privilege of being an American; and 2) helping young people from disparate worlds to interact in positive ways while growing up and maturing.
Donna Hoffman, a former English and drama teacher, thinks America should invest in a new kind of education:
I would take that fictional extra money and put it into the National Endowment for the Arts, and change from our current, terrible system of education to the Montessori system used in Europe and in private schools around the U.S. Yes, our education system needs an overhaul, but it needs to be done by Europeans, not by Americans, who are so enmeshed in what we’re doing now that they cannot see the forest for the trees.
Susan Berkow said she wouldn’t increase military spending because it “is already big enough” but she would spend more on support for veterans.
Connie Hellyer said investing in advancing reproductive rights for women around the world would be a “three-fer” because access to contraception “improves women’s health and ability to enter the labor force,” “improves children’s health,” and “relieves pressure on the environment.”
John Friedin would use the extra money to conduct “scores of scientifically run experiments with guaranteed basic income for all.” More on basic income here.
Jerry Purmal would focus on eliminating student debt:
To reduce the time over which each student’s debt lingers, those EXTRA funds would be applied to pay the annual interest on student debt, permitting the student’s obligatory loan payments—following graduation and gainful employment—to go entirely toward reducing the principal, interest-free.
Finally, in a time of “alternative facts” and “fake news,” Ken Prahl was thinking about how to learn from some of the lessons of 2016:
I’d use the funds to set up adult-education classes on critical thinking, what it is, and how to perform it—also explaining how history can be described using different narratives and giving examples of different narratives tied to various ideologies.
Because of the Internet I write more and receive feedback from people I know (on Facebook) and online strangers (on TAD and other platforms that use Disqus). I use it as a jumping-off place and resource for planning lessons for my high-school students in science.
However, I don’t practice music as often as I used to.
On a similar note, another reader confesses, “I draw less because I’m always on TAD”:
As a sketch artist, I appreciate my ability to Google things I want to draw for a reference point, but that doesn’t make me more creative. I already had the image in my head and the ability to draw. I honed my skills drawing people the old-fashioned way, looking at pictures in books or live subjects and practicing till my fingers were going to fall off.
In my opinion, the internet also encourages people to copy the work of others that goes “viral” rather than creating something truly original. The fact that you can monetize that viral quality also makes it more likely that people will try to copy rather than create.
That’s the same reason a third reader worries that “the internet has become stifling for creativity”:
Maybe I am not looking in the right place, but most platforms seem to be more about reblogging/retweeting/reposting other people’s creations. Then there is the issue of having work stolen and credits removed.
As another reader notes, “This is the central conflict of fan fiction”:
It’s obviously creative. On the other hand, it is all based on blatant copying of another writer’s work. How much is this a huge expansion of a creative outlet, and how much is this actually people choosing to limit their own creativity by colonizing somebody else’s world rather than creating a new one?
For my part, I tend to think the internet has encouraged and elevated some amazing new forms of creativity based on reaction and re-creation, collaboration and synthesis. Take this delightful example:
Those creative forms are a big part of my job too: When I go to work, I’m either distilling my colleagues’ articles for our Daily newsletter or piecing together reader emails for Notes, and those curatorial tasks have been exciting and challenging in ways that I never expected. But I’ve also missed writing fiction and poetry and literary criticism, and I worry sometimes that I’m letting those creative muscles atrophy. If you’re a fanfic reader or writer (or videographer, or meme-creator, or content-aggregator) and would like to share your experience, please let us know: hello@theatlantic.com.
This next reader speaks up for creativity as “the product of synthesis”:
It’s not so much a quest for pure “originality,” as it is a quest for original perspectives or original articulations. I’d say that my creativity has been fueled by letting myself fall into occasional rabbit holes. Whether that’s plodding through artists I don’t know well on Spotify or following hyperlinks in a Wiki piece until I have forgotten about what it was that I initially wondered, that access to knowledge in a semi-random form triggers the old noggin like little else.
On the other hand: So much knowledge! So many rabbit holes! Jim is paralyzed:
I find many more ideas and inspirations, but the flow of information and ideas is so vast that I never find time to develop them. I need to get off the internet.
Diane is also exasperated:
The promise of digital technology was: spinning piles of straw into useful pieces of gold.
My reality is: looking for golden needles in a giant haystack of unusable straw.
I spend so much time looking for the few things actually useful to my project, my writing, my daily info needs, and by the end of the day I feel like I’ve wasted so much time and effort sorting through useless crap. And the pile of useless keeps getting bigger and bigger, like a bad dream.
This next reader provides some tips for productive discovery:
I am old enough to vaguely recall a time before I began to use the internet on a daily basis. What I would do, back then, when I got stuck and could not find a creative angle on a problem, was to go to some arbitrary corner of the library, take down the first book that caught my interest even though it had nothing to do with the problem at hand, and read a few pages—sometimes, the whole book. More often than not, it would trigger all sorts of analogies, and at least a few of them usually turned out to be fruitful. (Even if nothing turned out to be relevant, I usually still learned something interesting, so it was a win-win strategy.) It was a great way (to borrow Horace Walpole’s definition of serendipity) to make discoveries, by accidents and sagacity, of things one was not in quest of.
I try to use the internet in a somewhat similar fashion: When I’m stuck, I often spend a morning strolling around arbitrary corners of the internet, trying to discover stuff I did not know I was in quest of. Typically, I start in some academic resource like JSTOR. (I almost always start by limiting my search to articles at least 50 years old; it ensures that one does not end up reading fashionable stuff and thus thinking the same thoughts as all the other hamsters in the academic wheel. Also, older articles are usually far better written than the crap that results from the publish-or-perish system.) I am not above using, e.g., Wikipedia, though, at least as a point of departure.
I also like reading old stuff in online newspaper/magazine archives. Sometimes, a stray remark in one of those wonderful 19th-century magazines written by and for men of letters is all you need to get a fresh angle on a familiar problem.
Gotta love those 19th-century magazines. In some ways, their mission wasn’t so different from that of the Facebook groups and Reddit threads and Disqus forums of today: creating a space for discourse and exchange and reflection, where exciting new ideas could bump up against each other. As James Russell Lowell, The Atlantic’s founding editor, wrote to a friend in 1857, “The magazine is to be free without being fanatical, and we hope to unite in it all available talent of all modes of opinion.” And as Terri, one of the founding members of TAD, reflects today:
TAD itself has been a creative endeavor for me and the other mods. Envisioning the community we wanted. Coming up with ideas to bring it to life. We developed ideas around the mix of politics, open and fun threads that the community has taken on and grown. It really has been a creative experience in collaboration on the internet.
Check out TAD’s whole discussion on creativity here, as well as many more of its discussions. As for the offline benefits of online collaboration, take it from this reader—a “furniture maker and Weimaraner enthusiast”:
I would like to share a story about a project I am working on in which the internet has certainly aided my creativity. Zeus, our 8-month-old Weimaraner, is a couch hog. When my girlfriend and I sit down on the couch to watch TV, he will sit directly in front of us and bark until we make room for him. There are three large dog beds in the house, but Zeus steadfastly refuses to lie on the dog beds.
I am a member of a Weimaraner-owner Facebook group called Weim Crime. Several people in the group have had similar problems. We came up with a solution I tested out last week: build a dog bunk bed with one bed on the bottom and one bed about the same height as our couch.
It has worked out very well. Zeus quietly relaxes on the top dog bunk while we sit on the couch. I am now collecting feedback from that same group before building the more attractive final version. I have received very useful feedback—for example, lowering the top bunk deck to 18 inches or lower to prevent joint injuries. My end goal is to design and build a simple, low-cost dog bunk bed that is more attractive than the prototype and post a YouTube video showing other owners how to build a similar one.
This is just one silly project, but the feedback and interest I have received regarding the project has been really inspiring.
What questions about your day-to-day experience of the world have you been pondering? We welcome your feedback and inspirations. Check back Monday for the next discussion question in this series—and in the meantime, enjoy some Weimaraner art:
When I first contacted Nikolai Formozov about his paper on a 30,000-year-old squirrel originally found by Gulag prisoners, which I wrote about today, he told me he had a “few other colorful details” that didn’t make it into his paper. Would I be interested in hearing more?
I replied yes, of course, wondering how much more interesting this story could get. What he sent was magnificent.
(A note on nomenclature, which should not put you off from reading to the very end: Urocitellus parryii is the scientific name for present-day Arctic ground squirrels, and U. glacialis refers to ones from the Ice Age.)
Nikolai wrote:
After we had made sense of the complicated and dramatic fate of Urocitellus parryii in northeastern Eurasia (it had once colonized the territory, then become extinct, and then re-colonized it from America), we began to wonder if there were modern descendants of glacialis in Asia, and whether refugiums (shelters) from the Ice Age still existed. Naturally, we considered Kamchatka, one of the warmest places in the region. But we had no material from there.
At the time, my friend Igor Shpilenok, a wildlife photographer and popular blogger, was working in the Kronotskiy Wildlife Reserve on Kamchatka. From his blog, I noticed that he often saw a Red Fox, whom he had named Alisa, and that she brought ground squirrels to her puppies.
I wrote to Igor: “Where did Alisa find those ground squirrels? They should not be there (in that part of Kamchatka).”
Igor said, “Oh, they came here 20 years ago from the center of the peninsula.”
I said, “Could you ask her to collect some ground squirrels for us?”
Igor said: “Simple, I’ll trade her cookies for them. She loves cookies.”
But a strange thing happened after that. Igor wrote me: “You know, now I don’t even need cookies, because after I received your letter, Alisa began leaving ground squirrels on my porch, the way cats do.”
So we received our first four specimens from Kamchatka courtesy of Alisa, and they were closely related to glacialis, as we predicted.
In our academic article, I had wanted to mention Alisa in the Acknowledgments, but this idea of mine was vetoed.
Have you read it cover to cover? If so, it’s time to test your memory. The quiz below contains 20 surprising facts, each one drawn from a different article in our latest issue. Each question includes the page number where you can find the answer, so if you’ve got a copy of the magazine handy, you can follow along on paper. Otherwise, go to the online table of contents, where the articles are listed in the same order as they appear in the quiz.
On Monday, the Weekly Standard published an article by Lee Smith titled “Fake News, Exposed.” It alleged that Rumana Ahmed, a former National Security Council staffer and the author of an Atlantic essay about why she left the Trump administration, had misled readers about the nature of her position.
“Ahmed was a political appointee in the Obama White House. According to Trump White House officials, it was very late in her tenure in the Obama administration when she applied for a civil service position with administrative duties,” Smith wrote. “‘Burrowing,’ as it's commonly called, is the process through which political appointees move into career government status. She was granted her new status at the end of January, just as the Trump team was moving into the White House.”
In fact, Ahmed held a term appointment that was not set to expire until the summer of 2018. Ahmed’s employment documents, which were reviewed by The Atlantic, show that her position with the NSC, which began in June 2014, was a Schedule A excepted service term appointment. Her term was renewed for another two years in August 2016.
“A Schedule A term appointment to the NSC would not ordinarily be described as a political appointment and it is a standard hiring authority for staffing the NSC,” explained Max Stier, president and CEO of the Partnership for Public Service. “You’re not serving at the pleasure of the president, you’re serving a two-year term.”
The 2016 Plum Book, an exhaustive list of political positions in the federal government, lists the executive director—along with the national security adviser and his deputies—as the only political appointees among the NSC staff. A broader definition might encompass most other senior staff on the NSC, who are hired into the excepted service on Schedule C and required to submit their resignations when a president leaves office.
Ahmed did not change the nature of her non-political appointment with the NSC late in the Obama administration, nor was that status renewed or changed in January. Multiple former senior NSC officials confirmed this account of her employment.
The Weekly Standard, Ahmed said, made no effort to contact her to verify its claims prior to publication.