
Notes

First Drafts, Conversations, Stories in Progress

Alexandra of Russia and her son Alexei, photographed between 1910 and 1913. Library of Congress

Today is International Women’s Day. It also happens to be the 100th anniversary of the start of the revolution that brought down the Russian empire. Given the coincidence, I was delighted to find in our archives an article from our January 1928 issue titled “The Fall of the Russian Empire: The Part Played by a Woman”—that is, until I read author Edmund Walsh’s assessment of exactly what that “part” was:

Russia was the last island fortress of absolutism in the rising tide of democracy, the outstanding anachronism of the twentieth century. … It defied the elements for three hundred years—until the deluge came. Whose hand unloosed the flood gates? In my opinion, a woman, all unconsciously, had more to do with the final debacle than any other single cause. … History probably will clear the memory of Alexandra Feodorovna [of treason, but] it can never clear her memory of tendencies, practices, and imprudences that contributed notably to Russia's ruin. The domination which this imperious, proud, aloof, and resolute woman exercised over her irresolute and impressionable husband became such a menace that more than one grand duke, duchess, and general cried out in warning against it. …

Revolutions are made by men and women determining events. Men are swayed by powerful human emotions. Women create them. And the master passion, particularly in neurotic females, can be as elegantly indifferent to the realities of life and war as ever Montesquieu was to the existence of God.

It’s a fascinating historical document, undeniably sexist in its overtones. The gist of Walsh’s argument is that the Tsarina Alexandra, driven by fear for the health of her hemophiliac son, gave the self-proclaimed holy man and healer Grigori Rasputin a level of influence that irrevocably weakened the Russian government. For evidence, Walsh delves into the embarrassing intimacies of Alexandra’s letters to her husband. And he criticizes the empress on two familiar, contradictory fronts: On the one hand, she’s weak and overly emotional, too much guided by motherly worries to see the bigger picture of Russian politics. On the other, she’s aggressive and overly domineering, stepping outside her proper sphere of childrearing to advise her husband on governance. She’s portrayed as a femme fatale, making a “subtle approach to political questions … through the gateway of the Tsar’s affections.” But she’s not granted agency either: Walsh argues she brought about the fall of the empire “all unconsciously.” She is, as female leaders still are, damned for the stereotypes of womanhood she does fulfill and damned for the ones she does not.

But none of this is to dispute the chain of events that Walsh describes. Alexandra and her husband did fail at governance: For any leader, male or female, it’s a heartbreaking reality that even the safety of one’s own family must come second to the national interest. And for all the sexism embedded in Walsh’s narrative, I agree with his central point that “revolutions are made by men and women determining events.” What struck me, reading this article today, was Alexandra’s simple human vulnerability, and my own reaction to it—my inclination to sweep this unflattering story under the rug. When we seek to recognize the women of history, what do we do with the history that reveals individual women as less than admirable? How do we celebrate women—our role models, ourselves—as powerful, vulnerable, fully complex humans, flaws intact?

All notes on "Debating Feminism" >

That’s the question that reader John Harris has been asking himself lately. He’s not alone: In 1862, one of The Atlantic’s founders, Ralph Waldo Emerson, wondered the same thing about aging. Acknowledging that “the creed of the street is, Old Age is not disgraceful, but immensely disadvantageous,” Emerson set out to explain the upsides of senescence. A common theme is the sense of serenity that comes with age and experience:

Youth suffers not only from ungratified desires, but from powers untried, and from a picture in his mind of a career which has, as yet, no outward reality. He is tormented with the want of correspondence between things and thoughts. … Every faculty new to each man thus goads him and drives him out into doleful deserts, until it finds proper vent. … One by one, day after day, he learns to coin his wishes into facts. He has his calling, homestead, social connection, and personal power, and thus, at the end of fifty years, his soul is appeased by seeing some sort of correspondence between his wish and his possession. This makes the value of age, the satisfaction it slowly offers to every craving. He is serene who does not feel himself pinched and wronged, but whose condition, in particular and in general, allows the utterance of his mind.

By 1928, advances in medicine had made it easier to take a long lifespan for granted. In an Atlantic article titled “The Secret of Longevity” (unavailable online), Cary T. Grayson noted that “probably at no other time in the history of the human race has so much attention been paid to the problem of prolonging the span of life.” He offered a word of warning:

Any programme which has for its object the prolongation of life must also have, accompanying this increased span of life, the ability of the individual to engage actively and with some degree of effectiveness in the affairs of life. Merely to live offers little to the individual if he has lost the ability to think, to grieve, or to hope. There is perhaps no more depressing picture than that of the person who remains on the stage after his act is over.

On the other hand, as Cullen Murphy contended in our January 1993 issue, an eternity spent with no decrease in faculties wouldn’t necessarily be desirable either:

There are a lot of characters in literature who have been endowed with immortality and who do manage to keep their youth. Unfortunately, in many cases nobody else does. Spouses and friends grow old and die. Societies change utterly. The immortals, their only constant companion a pervading loneliness, go on and on. This is the pathetic core of legends like those of the Flying Dutchman and the Wandering Jew. In Natalie Babbitt’s Tuck Everlasting, a fine and haunting novel for children, the Tuck family has inadvertently achieved immortality by drinking the waters of a magic spring. As the years pass, they are burdened emotionally by an unbridgeable remoteness from a world they are in but not of.

Since antiquity, Murphy wrote, literature has had a fairly united stance on immortality: “Tamper with the rhythms of nature and something inevitably goes wrong.” After all, people die to make room for more people, and pushing lifespans beyond their ordinary limits risks straining resources as well as reshaping families.

Charles C. Mann examined some of those potential consequences in his May 2005 Atlantic piece “The Coming Death Shortage,” predicting a social order increasingly stratified between “the very old and very rich on top … a mass of the ordinary old … and the diminishingly influential young.” Presciently, a few years before the collapse of the real-estate bubble that wiped out millions of Americans’ retirement savings, Mann outlined the effects of an increased proportion of older people in the workforce:

All notes on "Question Answers" >

In this week’s Atlantic coverage, our writers explored why animals need sleep, the changing shape of American families, Mahershala Ali’s history-making Oscar, the jobs that might be better off automated, a new way to stop the spread of superbugs, and more.

Can you remember the key facts? Find the answers to this week’s questions in the articles linked above—or go ahead and test your memory now:

For more tricky questions and surprising facts, try last week’s quiz, and subscribe to our daily newsletter.

All notes on "Weekly Quiz" >

On Tuesday, President Trump outlined his plans to increase defense spending and invest in America’s infrastructure. This week, we asked Politics & Policy Daily readers where they would allocate extra funds if they were in charge of the country’s budget. Here are some of our favorite responses.

The vast majority of respondents, including Stella Porto here, would invest more in education:

If I controlled the federal budget, I would strengthen basic public education. Provide more access to pre-school education. Make college more affordable. Expand community colleges. Develop re-training programs for those whose jobs have been eliminated by automation or other economic trends.

Everything in the country depends on the level of education of its people—absolutely everything, from preventing illness, to choosing a better lifestyle, to raising kids responsibly, to choosing elected officials, to fighting for important causes, etc. Citizenship depends on education. Access to good education is at the root of equality.

Chuck Barnes, a retired university faculty member and geologist, suggested funding a year or two of universal service for high school graduates:

All notes on "Question of the Week" >
Women work together at an internet cafe in Kabul, Afghanistan, on March 8, 2012. Mohammad Ismail / Reuters

Is the internet helpful or hurtful to human creativity? I posed that question to the reader discussion group known as TAD, and the consensus seems to be: It’s both. It’s complicated. And naturally, it depends a lot on what form of creativity you’re talking about. Here’s how one reader sums it up:

Because of the Internet I write more and receive feedback from people I know (on Facebook) and online strangers (on TAD and other platforms that use Disqus). I use it as a jumping-off place and resource for planning lessons for my high-school students in science.

However, I don’t practice music as often as I used to.

On a similar note, another reader confesses, “I draw less because I’m always on TAD”:

As a sketch artist, I appreciate my ability to Google things I want to draw for a reference point, but that doesn’t make me more creative. I already had the image in my head and the ability to draw. I honed my skills drawing people the old-fashioned way, looking at pictures in books or live subjects and practicing till my fingers were going to fall off.

In my opinion, the internet also encourages people to copy the work of others that goes “viral” rather than create something truly original. The fact that you can monetize that viral quality also makes it more likely that people will try to copy rather than create.

That’s the same reason a third reader worries that “the internet has become stifling for creativity”:

Maybe I am not looking in the right place, but most platforms seem to be more about reblogging/retweeting/reposting other people’s creations. Then there is the issue of having work stolen and credits removed.

As another reader notes, “This is the central conflict of fan fiction”:

It’s obviously creative. On the other hand, it is all based on blatant copying of another writer’s work. How much is this a huge expansion of a creative outlet, and how much is this actually people choosing to limit their own creativity by colonizing somebody else’s world rather than creating a new one?

The fanfic debate is fascinating, and more readers expand on it here.

For my part, I tend to think the internet has encouraged and elevated some amazing new forms of creativity based on reaction and re-creation, collaboration and synthesis. Take this delightful example:

Those creative forms are a big part of my job too: When I go to work, I’m either distilling my colleagues’ articles for our Daily newsletter or piecing together reader emails for Notes, and those curatorial tasks have been exciting and challenging in ways that I never expected. But I’ve also missed writing fiction and poetry and literary criticism, and I worry sometimes that I’m letting those creative muscles atrophy. If you’re a fanfic reader or writer (or videographer, or meme-creator, or content-aggregator) and would like to share your experience, please let us know: hello@theatlantic.com.

This next reader speaks up for creativity as “the product of synthesis”:

It’s not so much a quest for pure “originality,” as it is a quest for original perspectives or original articulations. I’d say that my creativity has been fueled by letting myself fall into occasional rabbit holes. Whether that’s plodding through artists I don’t know well on Spotify or following hyperlinks in a Wiki piece until I have forgotten about what it was that I initially wondered, that access to knowledge in a semi-random form triggers the old noggin like little else.

On the other hand: So much knowledge! So many rabbit holes! Jim is paralyzed:

All notes on "Question Answers" >

When I first contacted Nikolai Formozov about his paper on a 30,000-year-old squirrel originally found by Gulag prisoners, which I wrote about today, he told me he had a “few other colorful details” that didn’t make it into his paper. Would I be interested in hearing more?

I replied yes, of course, wondering how much more interesting this story could get. What he sent was magnificent.

(A note on nomenclature, which should not put you off from reading to the very end: Urocitellus parryii is the scientific name for present-day Arctic ground squirrels, and U. glacialis refers to ones from the Ice Age.)

Nikolai wrote:

After we had made sense of the complicated and dramatic fate of Urocitellus parryii in northeastern Eurasia (it had once colonized the territory, then become extinct, and then re-colonized it from America), we began to wonder if there were modern descendants of glacialis in Asia, and if there were refugia (shelters) from the Ice Age that still existed. Naturally, we considered Kamchatka, one of the warmest places in the region. But we had no material from there.

At the time, my friend Igor Shpilenok, a wildlife photographer and popular blogger, was working in the Kronotskiy Wildlife Reserve on Kamchatka. From his blog, I noticed that he often saw a Red Fox, whom he had named Alisa, and that she brought ground squirrels to her puppies.

I wrote to Igor: “Where did Alisa find those ground squirrels? They should not be there (in that part of Kamchatka).”

Igor said, “Oh, they came here 20 years ago from the center of the peninsula.”

I said, “Could you ask her to collect some ground squirrels for us?”

Igor said: “Simple, I’ll trade her cookies for them. She loves cookies.”

But a strange thing happened after that. Igor wrote me, “You know, now I don’t even need cookies, because after I received your letter, Alisa began leaving ground squirrels on my porch, the way cats do.”

So we received our first four specimens from Kamchatka courtesy of Alisa, and they were closely related to glacialis, as we predicted.

In our academic article, I had wanted to mention Alisa in the Acknowledgments, but this idea of mine was vetoed.


In the March 2017 issue of The Atlantic, our writers explored the path to autocratic government, the future of artificial intelligence, Victorian sex, 20th-century sainthood, and much more.

Have you read it cover to cover? If so, it’s time to test your memory. The quiz below contains 20 surprising facts, each one drawn from a different article in our latest issue. Each question includes the page number where you can find the answer, so if you’ve got a copy of the magazine handy, you can follow along on paper. Otherwise, go to the online table of contents, where the articles are listed in the same order as they appear in the quiz.

Good luck!

For more tricky questions and surprising facts, try last month’s quiz—and subscribe to our daily newsletter.

All notes on "Weekly Quiz" >

On Monday, the Weekly Standard published an article by Lee Smith titled “Fake News, Exposed.” It alleged that Rumana Ahmed, a former National Security Council staffer and the author of an Atlantic essay about why she left the Trump administration, had misled readers about the nature of her position.

“Ahmed was a political appointee in the Obama White House. According to Trump White House officials, it was very late in her tenure in the Obama administration when she applied for a civil service position with administrative duties,” Smith wrote. “‘Burrowing,’ as it's commonly called, is the process through which political appointees move into career government status. She was granted her new status at the end of January, just as the Trump team was moving into the White House.”

In fact, Ahmed held a term appointment that was not set to expire until the summer of 2018. Ahmed’s employment documents, which were reviewed by The Atlantic, show that her position with the NSC, which began in June 2014, was a Schedule A excepted service term appointment. Her term was renewed for another two years in August 2016.

“A Schedule A term appointment to the NSC would not ordinarily be described as a political appointment and it is a standard hiring authority for staffing the NSC,” explained Max Stier, president and CEO of the Partnership for Public Service. “You’re not serving at the pleasure of the president, you’re serving a two-year term.”

The 2016 Plum Book, an exhaustive list of political positions in the federal government, lists only the executive director—along with the national security adviser and his deputies—as political appointees among the NSC staff. A broader definition might encompass most other senior staff on the NSC, who are hired into the excepted service on Schedule C and required to submit their resignations when a president leaves office.

Ahmed did not change the nature of her non-political appointment with the NSC late in the Obama administration, nor was that status renewed or changed in January. Multiple former senior NSC officials confirmed this account of her employment.

The Weekly Standard, Ahmed said, made no effort to contact her to verify its claims prior to publication.

Our collection of power plants for this photo series keeps growing: a nuclear one over Michigan, another along the California coastline, a bunch of wind turbines over Colorado, a pair of coal-fired plants in Iowa, solar panels with crop circles in Arizona—and now a massive solar plant in Nevada that looks like a moon base or a SETI satellite:

Roberto T. Martins

The stunning image was sent by Roberto, a reader in Georgia:

This is the Crescent Dunes Solar Energy project, in the Nevada desert, as seen on a flight from Denver to San Francisco last November. I had just heard about it on NPR when I saw it right under our flight path. (If I hadn’t listened, I would have no idea what it was.)

Here’s the NPR story that he’s likely referencing. It provides some fascinating details about the unique nature of the Crescent Dunes solar plant, which can generate electricity for up to 10 hours even after the sun goes down. What’s the secret? Molten salt:

“It actually looks like water. It’s clear — it flows like water,” Smith says. He says the molten salt has to remain above 450 degrees Fahrenheit to stay liquid. It’s sent up the tower to the glowing tip, where it’s heated further. When the salt comes back down, it is 1,050 degrees. The molten salt is used to make steam to power a generator.

Here’s a closer view of the plant from Roberto, with the central tower casting a sundial-like shadow across the desert floor:

The plant generates enough electricity to power 75,000 Nevada homes. But it’s had some blemishes: “During a test [of Crescent Dunes last year], observers recorded a video of birds flying into heat from the mirrors and being incinerated.” The group Basin and Range Watch is now suing the agency to get more information on the dangers to wildlife. But flaming fowl isn’t unique to Crescent Dunes; the Ivanpah Solar Electric Generating System in California is another example of a broader problem for solar plants. Here’s an explanation from Emma Roller via our archives:

First, insects are drawn to the reflective light of the solar mirrors. That draws small, insect-eating birds, which in turn draw larger predatory birds. The rays of the mirrors’ reflected light produce temperatures from 800 degrees to 1,000 degrees Fahrenheit. Any animal caught in the intense glare of the mirrors’ rays may catch fire and plummet toward the ground, or spontaneously combust altogether.

That beam of fiery death is called a “solar flux.” The bigger threat to birds, however, comes from wind turbines. As my colleague Clare Foran noted, “Research published in the peer-reviewed scientific journal Biological Conservation [in 2013] estimated that between 140,438 and 327,586 birds — or a mean of 234,012 — are killed annually due to collisions with turbines across the U.S.” Petroleum is another big danger:

All notes on "Aerial Views of America" >
A Kuwaiti oil field set afire by retreating Iraqi troops during Operation Desert Storm on March 1, 1991 (JO1 Gawlowicz / Department of Defense / Wikimedia)

Spurred by our collection of stories from readers who used marijuana as a substitute for prescription opioids, another reader writes:

I am a totally and permanently service-connected, disabled Marine veteran with Gulf War Illnesses. I was an infantryman in the first war in Iraq and spent a good deal of time in and around the burning oil fields. I was also dosed with long-term, low-dose nerve agents from the “superplume” of oil smoke and chemical weapons inadvertently made airborne by coalition forces during demolition while I was aboard ship in the Persian Gulf after the ground combat had ended.

After my four years of active duty I attended college and earned a civil engineering degree. I worked for a few years as a consulting civil engineer for a Fortune 500 engineering firm until the symptoms of my illnesses became too much for me to continue gainful employment as a civil engineer.

My medical care as an engineer was very good, since I had very good private insurance—until I could no longer work. By that point, I had been awarded a 50-percent disability rating and the VA stepped in to cover my treatment. As my illnesses progressed, I became less and less active and more and more dependent upon the 13 different pharmaceutical medications and three pharmaceutical inhalers the VA doctors prescribed to me for daily use. My symptoms/illnesses used to include:


The Zika epidemic—the global emergency thankfully now past, though there is still no vaccine—spread through 60 countries and affected thousands of pregnant women in 2015 and 2016. The disease is most dangerous for pregnant women because of the risk of birth defects as severe as microcephaly, in which the fetus develops an abnormally small head and an underdeveloped brain. To prevent that gruesome fate for their babies, pregnant women with Zika often turn to abortion (though the procedure is illegal in many of the countries most affected by the virus).

Before Zika, there was the rubella epidemic of 1964 to 1965, when an estimated 12.5 million Americans acquired the disease (also known as German measles). As with Zika, rubella’s symptoms for most adults are mild—a rash and a low-grade fever that lasts two or three days. But for a pregnant woman and her fetus, rubella is “very dangerous,” according to the CDC, resulting in birth defects ranging from deafness to heart problems to mental disabilities. Also like Zika, rubella is often asymptomatic, so many pregnant women don’t realize they’re carrying the virus until it’s too late. In the 1960s, prior to the release of the rubella vaccine in 1969 and the Roe decision in 1973 that made abortion legal nationwide, a small number of doctors illegally performed the procedure for pregnant women with rubella.

One of those women is Bette, an Atlantic reader who had a second-trimester abortion in March 1971. She was a 24-year-old married Christian at the time, and she frames her abortion story as “God’s will for my family”:

My husband and I celebrated my pregnancy with friends on Thanksgiving Day in 1970. Although the pregnancy was a bit of a surprise, we were delighted to welcome a baby into the world.

I was teaching fifth grade at the time, and I’ll never forget the moment when a student walked up to my desk and said he didn’t feel very well. When I saw the rash on his face, I flashed back to a terrible photograph I had seen in a magazine in my obstetrician’s office the week before. It was of a “Rubella baby,” and the caption said “Bobby’s mother recovered from German measles in 3 days. Bobby wasn’t so lucky.”

I didn’t know what that meant exactly, but I later found out that scientists realized what the rubella virus did to a fetus when someone connected delivery-room personnel coming down with the three-day measles to a baby born with severe birth defects. Although the mother recovers in three days, the baby stays sick throughout the remaining time of gestation and is still contagious at birth.

I had almost forgotten about that student and the magazine picture a couple of weeks later when I got up and saw a very slight rash on my own face.

All notes on "Abortion Stories" >
