Friday, November 30, 2018

Roll Over, Émile Durkheim — BoBo Boy Has Rediscovered Anomie (The Breakdown Of Social Bonds Between An Individual And The Community)

BoBo Boy (so dubbed for coining "Bourgeois Bohemians" in his book Bobos in Paradise [2000]) has traced the current malaise in this land to something other than the state of the economy. There are multiple causes of today's national unhappiness, and much — not all — of it can be traced to the Horse's A$$ in the Oval Office. To say that our social fabric is torn and frayed is an understatement. If this is (fair & balanced) social analysis, so be it.

[x NY Fishwrap]
It’s Not The Economy, Stupid
By BoBo Boy (David Brooks)



We’re enjoying one of the best economies of our lifetime. The GDP is growing at about 3.5 percent a year, which is about a point faster than many experts thought possible. We’re in the middle of the second-longest recovery in American history, and if it lasts for another eight months it will be the longest ever. If you were born in 1975, you’ve seen the US economy triple in size over the course of your lifetime.

The gains are finally being widely shared, even by the least skilled. As Michael Strain of the American Enterprise Institute recently noted, the median usual weekly earnings for workers who didn’t complete high school shot up by 6.5 percent over the past year. Thanks mostly to government transfer programs, incomes for the bottom fifth of society have increased by about 80 percent over the past four decades.

And yet are we happy?

About 60 percent of Americans are dissatisfied with the way things are going in this country. Researchers with the Gallup-Sharecare Well-Being Index interviewed 160,000 adults in 2017 to ask about their financial security, social relationships, sense of purpose and connectedness to community. Last year turned out to be the worst year for well-being of any since the study began 10 years ago.

As the recovery has advanced, people’s faith in capitalism has actually declined, especially among the young. Only 45 percent of those between 18 and 29 see capitalism positively, a lower rate than in 2010, when the country was climbing out of the Great Recession.

So why the long faces?

Part of the problem is Donald Trump. People can’t feel good about things when they think the country is disastrously led.

Part of it, as Noah Smith of Bloomberg theorizes, may be disappointment among the well-educated young. They graduated from college, saddled with debt, and naturally expected the world to embrace them as their parents and schools had done. Instead, many entered into the gig economy, where a lot of work is temporary and insecure. Normal professions for liberal arts grads, like the law, are drying up.

How many unpaid internships can you endure before you lose faith in the system?

But the biggest factor is the crisis of connection. People, especially in the middle- and working-class slices of society, are less likely to volunteer in their community, less likely to go to church, less likely to know their neighbors, less likely to be married than they were at any time over the past several decades. In short, they have fewer resources to help them ride the creative destruction that is ever-present in a market economy.

And they are dying. On Thursday, the Centers for Disease Control and Prevention reported that life expectancy in the United States declined for the third straight year. This is an absolutely stunning trend. In affluent, well-connected societies, life expectancies rise almost as a matter of course. The last time American life expectancy fell for three straight years was 1915-1918, during World War I and the flu pandemic, which took 675,000 American lives.

And yet here we are — a straight-up social catastrophe.

Economic anxiety is now downstream from and merged with sociological, psychological and spiritual decay. There are thousands of employers looking for workers and unable to find any. Many young people do not have the support structures they need to persevere in school and get skills.

Many working-class men have not been raised in those relationships that inculcate the so-called soft skills. A 2018 LinkedIn survey of 4,000 professionals found that training for those soft skills — leadership, communication and collaboration — was the respondents’ highest priority. They valued these flexible skills more than specific technical ones and found them in short supply.

There’s an interesting debate going on in conservative circles over whether we have overvalued total GDP growth in our economic policy and undervalued programs that specifically foster dignity-enhancing work. The way I see it is this: It’s nonsense to have an economic policy — or any policy — that doesn’t account for and address the social catastrophe happening all around us. Every single other issue exists under the shadow of this one.

Conservatives were wrong to think that economic growth would lead to healthy families and communities all by itself. Moderate Democrats were wrong to think it was sufficient to maximize growth and then address inequalities with transfer payments. The progressives are wrong to think life would be better if we just made our political economy look more like Denmark’s. The Danes and the Swedes take for granted a cohesive social fabric that simply does not exist here.

To make the crucial differences, economic policymakers are going to have to get out of the silos of their economic training and figure out how economic levers can have moral, communal and sociological effect. Oren Cass’s book The Once and Future Worker (2018) begins this exploration, as do Isabelle Sawhill’s The Forgotten Americans (2018) and Nebraska Senator Ben Sasse’s Them: Why We Hate Each Other — and How to Heal (2018).

It’s not jobs, jobs, jobs anymore. It’s relationships, relationships, relationships. ###


[David Brooks became an Op-Ed columnist for The New York Times in September 2003. His column appears every Tuesday and Friday. He is currently a commentator on “PBS NewsHour,” NPR’s “All Things Considered” and NBC’s “Meet the Press.” He is the author of Bobos in Paradise: The New Upper Class and How They Got There (2000), On Paradise Drive: How We Live Now (And Always Have) in the Future Tense (2004), and The Social Animal: The Hidden Sources of Love, Character, and Achievement (2011). Most recently he has written The Road to Character (2015). Brooks received a BA (history) from the University of Chicago (IL) and he is a member of the American Academy of Arts and Sciences.]

Copyright © 2018 The New York Times Company



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Thursday, November 29, 2018

Roll Over, Jimmy Wales — Cyber-Innovation Has No Limits??

Reading of this New Zealander's adventures in cyberspace brought back memories for this blogger. In the early years of the Internet, this blogger worked with an online network for historians (H-Net) that has grown into a vast network of disciplinary and interdisciplinary virtual groups. This blogger's adventures didn't end with H-Net, because his appreciation for ParSystem software led him to talk (actually, e-mail) his way onto the Web server of the Department of Classics at the University of Pennsylvania, home of an e-mail list known then as Par-L for users of the ParSystem software. And, as this blogger approached retirement from the Collegium Excellens, he discovered another cyber-activity — blogging — with Blogger software that continues to this day, thanks to Blogger's present owner, Google. So, the account of the cyber-scholar in New Zealand is met with a sense of recognition and is unsurprising, because it is possible for even a dog to participate in cyber-activities. If this is the (fair & balanced) astounding marvel known as interconnected technology, so be it.

[x NY Fishwrap]
From Encyclopedic Collector To "Wikipedian-At-Large"
By Charlotte Graham-McLay



In the grand library of the Auckland War Memorial Museum on a Saturday morning in August, a small group of new and slightly nervous Wikipedia editors gathered for a day of training that would arm them to tackle New Zealand’s lackluster representation on the crowdsourced online encyclopedia.

Leading the so-called Wikiblitz was New Zealand’s official Wikipedian-at-Large, Mike Dickison, 49, who has in some senses been preparing his entire life for this post. As a collector of things and knowledge, he has pursued a string of enthusiasms, beginning with insects, shells and feathers (he put together his own museum as a boy), then giant flightless birds (a PhD on those), that ended, appropriately enough, with a job as the natural history curator at a museum. He once taught a class in knitting as therapy for stressed-out men after a major earthquake.

For the moment, he was involved in something a little less fascinating, guiding the group through the process of adding photos from the museum’s collection to pages on Wikipedia. The new editors — curious members of the public, many of whom had created their accounts the evening before — were mostly women, a fact Mr. Dickison was pleased to note; Wikipedia records its editors as 90 percent male.

“Be bold! Don’t be stymied by worry,” Mr. Dickison told the group, assuring them that early in his Wikipedia career, he had accidentally “blanked” more than one entire page by mistake.

As the country’s roving Wikipedian-at-Large, he is spending a year coaxing New Zealanders to take up volunteer editing on what is the world’s fifth-most-visited website. His salary and travel are funded by a grant from the Wikimedia Foundation, the nonprofit that runs Wikipedia, as part of its investment in “emerging communities” on the site, including New Zealand.

The South Pacific country is underrepresented on Wikipedia, and Mr. Dickison described the state of many of the country’s pages as “dire.”

His skills are self-taught, but Mr. Dickison’s affinity with Wikipedia’s gathering of community knowledge is the culmination of his lifelong obsession with collecting. Growing up in Christchurch, on the South Island of New Zealand, the son of an apprentice boilermaker father — who later ran a sporting goods store — and a homemaker mother, Mr. Dickison felt he was “destined” for a museum curator’s job.

What started as a typical childhood infatuation with dinosaurs developed into a fascination with the moa, a giant flightless bird native to New Zealand, which is now extinct. Mr. Dickison’s father, who had left school at 15, encouraged his son’s enthusiasm for curating. He built the glass display cases where his son could display his treasures in his “museum.”

Even then, Mr. Dickison was irked by the lack of available information about New Zealand’s native fauna.

“I was mad on insects, and in 1983 you had one book on New Zealand insects, which was written in the ’70s, with just a few color plates,” he said, adding that he is now writing his own children’s book on New Zealand’s natural history.

His preoccupation with the moa led Mr. Dickison to complete a PhD on the subject of giant flightless birds at Duke University decades later.

“I had no idea where North Carolina was or any fact about it whatsoever,” Mr. Dickison said. “The only Duke I knew was ‘The Dukes of Hazzard.’”

The appeal of giant flightless birds, to Mr. Dickison at least, seemed simple: “They’re just enormous. They’re really big. I mean, why do kids like dinosaurs? Because they’re huge.”

He dreams of traveling back 1,000 years to see the moa in its natural New Zealand habitat before it was wiped out by Polynesian settlers 500 years ago. He has even investigated the taxonomic origins of the “Sesame Street” character Big Bird (his conclusion: a giant flightless crane).

His enthusiasm for the smallest pieces of knowledge — Mr. Dickison’s website [a blog] includes a map recording everywhere in the world he has received a haircut — led to a day job as natural history curator at Whanganui Regional Museum, on the North Island. But by night, he was beginning to rack up hours as a volunteer editor on Wikipedia, and ran workshops training other new editors at a local library.

His edit history, which began in 2009, is not quite as lettered as his museum pedigree: Mr. Dickison made some of his first contributions to pages about Jaffas (a type of New Zealand candy), mandolins, and the film “This Is Spinal Tap.”

He realized toward the end of his tenure as a natural history curator, he said, that the work he did on Wikipedia in his free time had “much more impact” than what he did at his day job.

After his application for a Wikimedia Foundation grant for the national Wikipedian-in-residence role was successful, Mr. Dickison said, he left his job, “filled my four-wheel-drive with plastic bins of worldly possessions and launched off around the country on an adventure.”

Mr. Dickison is no stranger to connecting unlikely groups of people. Upon his return to New Zealand from Duke, the sometime ukulele player was frustrated by the lack of sheet music for New Zealand standards. So he wrote a book of local songs for ukulele and traveled the North Island, teaching and performing them.

After a deadly earthquake struck his home city of Christchurch in 2011, killing 185 people and flattening much of the central city, Mr. Dickison ran the knitting-as-therapy class, having taught himself first as a way of dealing with the aftermath of the quakes.

In the 1990s, he had hosted sessions in internet cafes to help newcomers explore the World Wide Web.

“I don’t understand why I do these things,” he said. “I’m supposed to be an introvert.”

“But if I find something I’m passionate about, I need to share it and get other people involved too,” he added.

The fate of Christchurch was a cautionary tale about the need for societies to preserve their information, Mr. Dickison said. When the 2011 earthquake struck, every formative place from his childhood was destroyed, including his family home and former schools.

“Google Street View was still running images of pre-quake Christchurch for a while after the earthquake, and there was a huge worry that they would take those down and replace them with up-to-date views,” he said, adding that images of the city before the disaster had now been archived and preserved.

“I feel like we’ve been a bit cavalier about looking after knowledge in New Zealand,” Mr. Dickison said. “Too often, it just slips away.”

As part of his Wikipedian-at-Large role, he is charged with recruiting others to help preserve that knowledge online, with a particular emphasis on women and minorities, who are underrepresented in New Zealand’s small editing community. He plans more meet-ups and training sessions like the one at Auckland Museum, and will be resident around the country at locations including a government department and a bird sanctuary.

Mr. Dickison also hopes to entice reticent public and private institutions to crack open their vaults of knowledge and expertise, making them more accessible for editors to use while editing Wikipedia.

“I often have experts tell me they read a Wikipedia article that they know something about and it was full of inaccuracies,” he said.

“I always say, ‘Well, did you fix them? And if you didn’t fix them, why are you complaining to me? It’s like walking outside and complaining that it’s raining and not putting up an umbrella. Of course you’re wet!’” ###

[Charlotte Graham-McLay is a journalist, broadcaster, and writer who lives in Wellington and can often be found covering New Zealand for The New York Times. Since 2006, she has worked on and off at Radio New Zealand (RNZ) in a variety of roles, including presenter, editor, producer, reporter, and social journalist. As well as RNZ and the Times, her writing appears in The Telegraph, New Zealand Geographic, Landfall, and on the BBC. She received an undergraduate degree in broadcast communication from the New Zealand Broadcasting School and an MJ, with distinction, from Massey University (NZ).]

Copyright © 2018 The New York Times Company



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, November 28, 2018

Warning — Ignore Today's Post To This Blog At Your Peril!

Yesterday's post to this blog received a disappointing number of clicks because it wasn't very cute, clever, or snarky. That was yesterday. Today, you will ignore this article at your peril if you expect to be living in 2050. At that point, you will be living in "interesting times," as the ancient Chinese curse predicts. Bill McKibben has been writing and speaking about our common peril for the past three decades. If this is a (fair & balanced) warning to be heeded, so be it.

[x New Yorker]
Life On A Shrinking Planet
By Bill McKibben



Thirty years ago, this magazine published “The End of Nature,” a long article about what we then called the greenhouse effect. I was in my twenties when I wrote it, and out on an intellectual limb: climate science was still young. But the data were persuasive, and freighted with sadness. We were spewing so much carbon into the atmosphere that nature was no longer a force beyond our influence—and humanity, with its capacity for industry and heedlessness, had come to affect every cubic metre of the planet’s air, every inch of its surface, every drop of its water. Scientists underlined this notion a decade later when they began referring to our era as the Anthropocene, the world made by man.

I was frightened by my reporting, but, at the time, it seemed likely that we’d try as a society to prevent the worst from happening. In 1988, George H. W. Bush, running for President, promised that he would fight “the greenhouse effect with the White House effect.” He did not, nor did his successors, nor did their peers in seats of power around the world, and so in the intervening decades what was a theoretical threat has become a fierce daily reality. As this essay goes to press, California is ablaze. A big fire near Los Angeles forced the evacuation of Malibu, and an even larger fire, in the Sierra Nevada foothills, has become the most destructive in California’s history. After a summer of unprecedented high temperatures and a fall “rainy season” with less than half the usual precipitation, the northern firestorm turned a city called Paradise into an inferno within an hour, razing more than ten thousand buildings and killing at least sixty-three people; more than six hundred others are missing. The authorities brought in cadaver dogs, a lab to match evacuees’ DNA with swabs taken from the dead, and anthropologists from California State University at Chico to advise on how to identify bodies from charred bone fragments.

For the past few years, a tide of optimistic thinking has held that conditions for human beings around the globe have been improving. Wars are scarcer, poverty and hunger are less severe, and there are better prospects for wide-scale literacy and education. But there are newer signs that human progress has begun to flag. In the face of our environmental deterioration, it’s now reasonable to ask whether the human game has begun to falter—perhaps even to play itself out. Late in 2017, a United Nations agency announced that the number of chronically malnourished people in the world, after a decade of decline, had started to grow again—by thirty-eight million, to a total of eight hundred and fifteen million, “largely due to the proliferation of violent conflicts and climate-related shocks.” In June, 2018, the Food and Agriculture Organization of the UN found that child labor, after years of falling, was growing, “driven in part by an increase in conflicts and climate-induced disasters.”

In 2015, at the UN Climate Change Conference in Paris, the world’s governments, noting that the earth has so far warmed a little more than one degree Celsius above pre-industrial levels, set a goal of holding the increase this century to 1.5 degrees Celsius (2.7 degrees Fahrenheit), with a fallback target of two degrees (3.6 degrees Fahrenheit). This past October, the UN’s Intergovernmental Panel on Climate Change [IPCC] published a special report stating that global warming “is likely to reach 1.5 C between 2030 and 2052 if it continues to increase at the current rate.” We will have drawn a line in the sand and then watched a rising tide erase it. The report did not mention that, in Paris, countries’ initial pledges would cut emissions only enough to limit warming to 3.5 degrees Celsius (about 6.3 degrees Fahrenheit) by the end of the century, a scale and pace of change so profound as to call into question whether our current societies could survive it.

Scientists have warned for decades that climate change would lead to extreme weather. Shortly before the IPCC report was published, Hurricane Michael, the strongest hurricane ever to hit the Florida Panhandle, inflicted thirty billion dollars’ worth of material damage and killed forty-five people. President Trump, who has argued that global warming is “a total, and very expensive, hoax,” visited Florida to survey the wreckage, but told reporters that the storm had not caused him to rethink his decision to withdraw the U.S. from the Paris climate accords. He expressed no interest in the IPCC report beyond asking “who drew it.” (The answer is ninety-one researchers from forty countries.) He later claimed that his “natural instinct” for science made him confident that the climate would soon “change back.” A month later, Trump blamed the fires in California on “gross mismanagement of forests.”

Human beings have always experienced wars and truces, crashes and recoveries, famines and terrorism. We’ve endured tyrants and outlasted perverse ideologies. Climate change is different. As a team of scientists recently pointed out in the journal Nature Climate Change, the physical shifts we’re inflicting on the planet will “extend longer than the entire history of human civilization thus far.”

The poorest and most vulnerable will pay the highest price. But already, even in the most affluent areas, many of us hesitate to walk across a grassy meadow because of the proliferation of ticks bearing Lyme disease which have come with the hot weather; we have found ourselves unable to swim off beaches, because jellyfish, which thrive as warming seas kill off other marine life, have taken over the water. The planet’s diameter will remain eight thousand miles, and its surface will still cover two hundred million square miles. But the earth, for humans, has begun to shrink, under our feet and in our minds.

“Climate change,” like “urban sprawl” or “gun violence,” has become such a familiar term that we tend to read past it. But exactly what we’ve been up to should fill us with awe. During the past two hundred years, we have burned immense quantities of coal and gas and oil—in car motors, basement furnaces, power plants, steel mills—and, as we have done so, carbon atoms have combined with oxygen atoms in the air to produce carbon dioxide. This, along with other gases like methane, has trapped heat that would otherwise have radiated back out to space.

There are at least four other episodes in the earth’s half-billion-year history of animal life when CO2 has poured into the atmosphere in greater volumes, but perhaps never at greater speeds. Even at the end of the Permian Age, when huge injections of CO2 from volcanoes burning through coal deposits culminated in “The Great Dying,” the CO2 content of the atmosphere grew at perhaps a tenth of the current pace. Two centuries ago, the concentration of CO2 in the atmosphere was two hundred and seventy-five parts per million; it has now topped four hundred parts per million and is rising more than two parts per million each year. The extra heat that we trap near the planet every day is equivalent to the heat from four hundred thousand bombs the size of the one that was dropped on Hiroshima.

As a result, in the past thirty years we’ve seen all twenty of the hottest years ever recorded. The melting of ice caps and glaciers and the rising levels of our oceans and seas, initially predicted for the end of the century, have occurred decades early. “I’ve never been at . . . a climate conference where people say ‘that happened slower than I thought it would,’ ” Christina Hulbe, a New Zealand climatologist, told a reporter for Grist last year. This past May, a team of scientists from the University of Illinois reported that there was a thirty-five-per-cent chance that, because of unexpectedly high economic growth rates, the UN’s “worst-case scenario” for global warming was too optimistic. “We are now truly in uncharted territory,” David Carlson, the former director of the World Meteorological Organization’s climate-research division, said in the spring of 2017, after data showed that the previous year had broken global heat records.

We are off the literal charts as well. In August, I visited Greenland, where, one day, with a small group of scientists and activists, I took a boat from the village of Narsaq to a glacier on a nearby fjord. As we made our way across a broad bay, I glanced up at the electronic chart above the captain’s wheel, where a blinking icon showed that we were a mile inland. The captain explained that the chart was from five years ago, when the water around us was still ice. The American glaciologist Jason Box, who organized the trip, chose our landing site. “We called this place the Eagle Glacier because of its shape,” he said. The name, too, was five years old. “The head and the wings of the bird have melted away. I don’t know what we should call it now, but the eagle is dead.”

There were two poets among the crew, Aka Niviana, who is Greenlandic, and Kathy Jetnil-Kijiner, from the low-lying Marshall Islands, in the Pacific, where “king tides” recently washed through living rooms and unearthed graveyards. A small lens of fresh water has supported life on the Marshall Islands’ atolls for millennia, but, as salt water intrudes, breadfruit trees and banana palms wilt and die. As the Greenlandic ice we were gazing at continues to melt, the water will drown Jetnil-Kijiner’s homeland. About a third of the carbon responsible for these changes has come from the United States.

A few days after the boat trip, the two poets and I accompanied the scientists to another fjord, where they needed to change the memory card on a camera that tracks the retreat of the ice sheet. As we took off for the flight home over the snout of a giant glacier, an eight-story chunk calved off the face and crashed into the ocean. I’d never seen anything quite like it for sheer power—the waves rose twenty feet as it plunged into the dark water. You could imagine the same waves washing through the Marshalls. You could almost sense the ice elevating the ocean by a sliver—along the seafront in Mumbai, which already floods on a stormy day, and at the Battery in Manhattan, where the seawall rises just a few feet above the water.

When I say the world has begun to shrink, this is what I mean. Until now, human beings have been spreading, from our beginnings in Africa, out across the globe—slowly at first, and then much faster. But a period of contraction is setting in as we lose parts of the habitable earth. Sometimes our retreat will be hasty and violent; the effort to evacuate the blazing California towns along narrow roads was so chaotic that many people died in their cars. But most of the pullback will be slower, starting along the world’s coastlines. Each year, another twenty-four thousand people abandon Vietnam’s sublimely fertile Mekong Delta as crop fields are polluted with salt. As sea ice melts along the Alaskan coast, there is nothing to protect towns, cities, and native villages from the waves. In Mexico Beach, Florida, which was all but eradicated by Hurricane Michael, a resident told the Washington Post, “The older people can’t rebuild; it’s too late in their lives. Who is going to be left? Who is going to care?”

In one week at the end of last year, I read accounts from Louisiana, where government officials were finalizing a plan to relocate thousands of people threatened by the rising Gulf (“Not everybody is going to live where they are now and continue their way of life, and that is a terrible, and emotional, reality to face,” one state official said); from Hawaii, where, according to a new study, thirty-eight miles of coastal roads will become impassable in the next few decades; and from Jakarta, a city with a population of ten million, where a rising Java Sea had flooded the streets. In the first days of 2018, a nor’easter flooded downtown Boston; dumpsters and cars floated through the financial district. “If anyone wants to question global warming, just see where the flood zones are,” Marty Walsh, the mayor of Boston, told reporters. “Some of those zones did not flood thirty years ago.”

According to a study from the United Kingdom’s National Oceanography Centre last summer, the damage caused by rising sea levels will cost the world as much as fourteen trillion dollars a year by 2100, if the U.N. targets aren’t met. “Like it or not, we will retreat from most of the world’s non-urban shorelines in the not very distant future,” Orrin Pilkey, an expert on sea levels at Duke University, wrote in his book Retreat from a Rising Sea (2016). “We can plan now and retreat in a strategic and calculated fashion, or we can worry about it later and retreat in tactical disarray in response to devastating storms. In other words, we can walk away methodically, or we can flee in panic.”

But it’s not clear where to go. As with the rising seas, rising temperatures have begun to narrow the margins of our inhabitation, this time in the hot continental interiors. Nine of the ten deadliest heat waves in human history have occurred since 2000. In India, the rise in temperature since 1960 (about one degree Fahrenheit) has increased the chance of mass heat-related deaths by a hundred and fifty per cent. The summer of 2018 was the hottest ever measured in certain areas. For a couple of days in June, temperatures in cities in Pakistan and Iran peaked at slightly above a hundred and twenty-nine degrees Fahrenheit, the highest reliably recorded temperatures ever measured. The same heat wave, nearer the shore of the Persian Gulf and the Gulf of Oman, combined triple-digit temperatures with soaring humidity levels to produce a heat index of more than a hundred and forty degrees Fahrenheit. June 26th was the warmest night in history, with the mercury in one Omani city remaining above a hundred and nine degrees Fahrenheit until morning. In July, a heat wave in Montreal killed more than seventy people, and Death Valley, which often sets American records, registered the hottest month ever seen on our planet. Africa recorded its highest temperature in June, the Korean Peninsula in July, and Europe in August. The Times reported that, in Algeria, employees at a petroleum plant walked off the job as the temperature neared a hundred and twenty-four degrees. “We couldn’t keep up,” one worker told the reporter. “It was impossible to do the work.”

This was no illusion; some of the world is becoming too hot for humans. According to the National Oceanic and Atmospheric Administration, increased heat and humidity have reduced the amount of work people can do outdoors by ten per cent, a figure that is predicted to double by 2050. About a decade ago, Australian and American researchers, setting out to determine the highest survivable so-called “wet-bulb” temperature, concluded that when temperatures passed thirty-five degrees Celsius (ninety-five degrees Fahrenheit) and the humidity was higher than ninety per cent, even in “well-ventilated shaded conditions,” sweating slows down, and humans can survive only “for a few hours, the exact length of time being determined by individual physiology.”
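The "wet-bulb" limit described above combines heat and humidity into a single survivability number. As a rough illustration only (this is not the method the Australian and American researchers used; it is a standard empirical fit by Stull, offered here as an assumption for the sake of a worked example), the sketch below estimates wet-bulb temperature from air temperature and relative humidity:

```python
import math

def wet_bulb(temp_c: float, rh_pct: float) -> float:
    """Approximate wet-bulb temperature (deg C) from air temperature
    (deg C) and relative humidity (percent), using Stull's empirical
    fit, valid for roughly 5-99 percent humidity at ordinary surface
    temperatures."""
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(temp_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# Conditions near the thresholds the study flags: 35 deg C air
# temperature at 90 percent humidity.
print(round(wet_bulb(35.0, 90.0), 1))
```

Plugging in the study's danger zone (thirty-five degrees Celsius at ninety-per-cent humidity) yields an estimate within a degree or two of the thirty-five-degree wet-bulb ceiling beyond which sweating can no longer cool the body.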

As the planet warms, a crescent-shaped area encompassing parts of India, Pakistan, Bangladesh, and the North China Plain, where about 1.5 billion people (a fifth of humanity) live, is at high risk of such temperatures in the next half century. Across this belt, extreme heat waves that currently happen once every generation could, by the end of the century, become “annual events with temperatures close to the threshold for several weeks each year, which could lead to famine and mass migration.” By 2070, tropical regions that now get one day of truly oppressive humid heat a year can expect between a hundred and two hundred and fifty days, if the current levels of greenhouse-gas emissions continue. According to Radley Horton, a climate scientist at the Lamont-Doherty Earth Observatory, most people would “run into terrible problems” before then. The effects, he added, will be “transformative for all areas of human endeavor—economy, agriculture, military, recreation.”

Humans share the planet with many other creatures, of course. We have already managed to kill off sixty per cent of the world’s wildlife since 1970 by destroying their habitats, and now higher temperatures are starting to take their toll. A new study found that peak-dwelling birds were going extinct; as temperatures climb, the birds can no longer find relief on higher terrain. Coral reefs, rich in biodiversity, may soon be a tenth of their current size.

As some people flee humidity and rising sea levels, others will be forced to relocate in order to find enough water to survive. In late 2017, a study led by Manoj Joshi, of the University of East Anglia, found that, by 2050, if temperatures rise by two degrees, a quarter of the earth will experience serious drought and desertification. The early signs are clear: São Paulo came within days of running out of water last year, as did Cape Town this spring. In the fall, a record drought in Germany lowered the level of the Elbe to below twenty inches and reduced the corn harvest by forty per cent. The Potsdam Institute for Climate Impact Research concluded in a recent study that, as the number of days that reach eighty-six degrees Fahrenheit or higher increases, corn and soybean yields across the US grain belt could fall by between twenty-two and forty-nine per cent. We’ve already overpumped the aquifers that lie beneath most of the world’s breadbaskets; without the means to irrigate, we may encounter a repeat of the nineteen-thirties, when droughts and deep plowing led to the Dust Bowl—this time with no way of fixing the problem. Back then, the Okies fled to California, but California is no longer a green oasis. A hundred million trees died in the record drought that gripped the Golden State for much of this decade. The dead limbs helped spread the waves of fire, as scientists earlier this year warned that they could.

Thirty years ago, some believed that warmer temperatures would expand the field of play, turning the Arctic into the new Midwest. As Rex Tillerson, then the CEO of Exxon, cheerfully put it in 2012, “Changes to weather patterns that move crop production areas around—we’ll adapt to that.” But there is no rich topsoil in the far North; instead, the ground is underlaid with permafrost, which can be found beneath a fifth of the Northern Hemisphere. As the permafrost melts, it releases more carbon into the atmosphere. The thawing layer cracks roads, tilts houses, and uproots trees to create what scientists call “drunken forests.” Ninety scientists who released a joint report in 2017 concluded that economic losses from a warming Arctic could approach ninety trillion dollars in the course of the century, considerably outweighing whatever savings may have resulted from shorter shipping routes as the Northwest Passage unfreezes.

Churchill, Manitoba, on the edge of the Hudson Bay, in Canada, is connected to the rest of the country by a single rail line. In the spring of 2017, record floods washed away much of the track. OmniTrax, which owns the line, tried to cancel its contract with the government, declaring what lawyers call a “force majeure,” an unforeseen event beyond its responsibility. “To fix things in this era of climate change—well, it’s fixed, but you don’t count on it being the fix forever,” an engineer for the company explained at a media briefing in July. This summer, the Canadian government reopened the rail at a cost of a hundred and seventeen million dollars—about a hundred and ninety thousand dollars per Churchill resident. There is no reason to think the fix will last, and every reason to believe that our world will keep contracting.

All this has played out more or less as scientists warned, albeit faster. What has defied expectations is the slowness of the response. The climatologist James Hansen testified before Congress about the dangers of human-caused climate change thirty years ago. Since then, carbon emissions have increased with each year except 2009 (the height of the global recession) and the newest data show that 2018 will set another record. Simple inertia and the human tendency to prioritize short-term gains have played a role, but the fossil-fuel industry’s contribution has been by far the most damaging. Alex Steffen, an environmental writer, coined the term “predatory delay” to describe “the blocking or slowing of needed change, in order to make money off unsustainable, unjust systems in the meantime.” The behavior of the oil companies, which have pulled off perhaps the most consequential deception in mankind’s history, is a prime example.

As journalists at InsideClimate News and the Los Angeles Times have revealed since 2015, Exxon, the world’s largest oil company, understood that its product was contributing to climate change a decade before Hansen testified. In July, 1977, James F. Black, one of Exxon’s senior scientists, addressed many of the company’s top leaders in New York, explaining the earliest research on the greenhouse effect. “There is general scientific agreement that the most likely manner in which mankind is influencing the global climate is through carbon-dioxide release from the burning of fossil fuels,” he said, according to a written version of the speech which was later recorded, and which was obtained by InsideClimate News. In 1978, speaking to the company’s executives, Black estimated that a doubling of the carbon-dioxide concentration in the atmosphere would increase average global temperatures by between two and three degrees Celsius (5.4 degrees Fahrenheit), and as much as ten degrees Celsius (eighteen degrees Fahrenheit) at the poles.

Exxon spent millions of dollars researching the problem. It outfitted an oil tanker, the Esso Atlantic, with CO2 detectors to measure how fast the oceans could absorb excess carbon, and hired mathematicians to build sophisticated climate models. By 1982, they had concluded that even the company’s earlier estimates were probably too low. In a private corporate primer, they wrote that heading off global warming and “potentially catastrophic events” would “require major reductions in fossil fuel combustion.”

An investigation by the Los Angeles Times revealed that Exxon executives took these warnings seriously. Ken Croasdale, a senior researcher for the company’s Canadian subsidiary, led a team that investigated the positive and negative effects of warming on Exxon’s Arctic operations. In 1991, he found that greenhouse gases were rising due to the burning of fossil fuels. “Nobody disputes this fact,” he said. The following year, he wrote that “global warming can only help lower exploration and development costs” in the Beaufort Sea. Drilling season in the Arctic, he correctly predicted, would increase from two months to as many as five months. At the same time, he said, the rise in the sea level could threaten onshore infrastructure and create bigger waves that would damage offshore drilling structures. Thawing permafrost could make the earth buckle and slide under buildings and pipelines. As a result of these findings, Exxon and other major oil companies began laying plans to move into the Arctic, and started to build their new drilling platforms with higher decks, to compensate for the anticipated rises in sea level.

The implications of the exposĂ©s were startling. Not only did Exxon and other companies know that scientists like Hansen were right; they used his NASA climate models to figure out how low their drilling costs in the Arctic would eventually fall. Had Exxon and its peers passed on what they knew to the public, geological history would look very different today. The problem of climate change would not be solved, but the crisis would, most likely, now be receding. In 1989, an international ban on chlorine-containing man-made chemicals that had been eroding the earth’s ozone layer went into effect. Last month, researchers reported that the ozone layer was on track to fully heal by 2060. But that was a relatively easy fight, because the chemicals in question were not central to the world’s economy, and the manufacturers had readily available substitutes to sell. In the case of global warming, the culprit is fossil fuel, the most lucrative commodity on earth, and so the companies responsible took a different tack.

A document uncovered by the Los Angeles Times showed that, a month after Hansen’s testimony, in 1988, an unnamed Exxon “public affairs manager” issued an internal memo recommending that the company “emphasize the uncertainty” in the scientific data about climate change. Within a few years, Exxon, Chevron, Shell, Amoco, and others had joined the Global Climate Coalition [GCC], “to coordinate business participation in the international policy debate” on global warming. The GCC coördinated with the National Coal Association and the American Petroleum Institute on a campaign, via letters and telephone calls, to prevent a tax on fossil fuels, and produced a video in which the agency insisted that more carbon dioxide would “end world hunger” by promoting plant growth. With such efforts, it ginned up opposition to the Kyoto Protocol, the first global initiative to address climate change.

In October, 1997, two months before the Kyoto meeting, Lee Raymond, Exxon’s president and CEO, who had overseen the science department that in the nineteen-eighties produced the findings about climate change, gave a speech in Beijing to the World Petroleum Congress, in which he maintained that the earth was actually cooling. The idea that cutting fossil-fuel emissions could have an effect on the climate, he said, defied common sense. “It is highly unlikely that the temperature in the middle of the next century will be affected whether policies are enacted now, or twenty years from now,” he went on. Exxon’s own scientists had already shown each of these premises to be wrong.

On a December morning in 1997 at the Kyoto Convention Center, after a long night of negotiation, the developed nations reached a tentative accord on climate change. Exhausted delegates lay slumped on couches in the corridor, or on the floor in their suits, but most of them were grinning. Imperfect and limited though the agreement was, it seemed that momentum had gathered behind fighting climate change. But as I watched the delegates cheering and clapping, an American lobbyist, who had been coördinating much of the opposition to the accord, turned to me and said, “I can’t wait to get back to Washington, where we’ve got this under control.”

He was right. On January 29, 2001, nine days after George W. Bush was inaugurated, Lee Raymond visited his old friend Vice-President Dick Cheney, who had just stepped down as the CEO of the oil-drilling giant Halliburton. Cheney helped persuade Bush to abandon his campaign promise to treat carbon dioxide as a pollutant. Within the year, Frank Luntz, a Republican consultant for Bush, had produced an internal memo that made a doctrine of the strategy that the GCC had hit on a decade earlier. “Voters believe that there is no consensus about global warming within the scientific community,” Luntz wrote in the memo, which was obtained by the Environmental Working Group, a Washington-based organization. “Should the public come to believe that the scientific issues are settled, their views about global warming will change accordingly. Therefore, you need to continue to make the lack of scientific certainty a primary issue in the debate.”

The strategy of muddling the public’s impression of climate science has proved to be highly effective. In 2017, polls found that almost ninety per cent of Americans did not know that there was a scientific consensus on global warming. Raymond retired in 2006, after the company posted the biggest corporate profits in history, and his final annual salary was four hundred million dollars. His successor, Rex Tillerson, signed a five-hundred-billion-dollar deal to explore for oil in the rapidly thawing Russian Arctic, and in 2012 was awarded the Russian Order of Friendship. In 2016, Tillerson, at his last shareholder meeting before he briefly joined the Trump Administration as Secretary of State, said, “The world is going to have to continue using fossil fuels, whether they like it or not.”

It’s by no means clear whether Exxon’s deception and obfuscation are illegal. The company has long maintained that it “has tracked the scientific consensus on climate change, and its research on the issue has been published in publicly available peer-reviewed journals.” The First Amendment preserves one’s right to lie, although, in October, New York State Attorney General Barbara D. Underwood filed suit against Exxon for lying to investors, which is a crime. What is certain is that the industry’s campaign cost us the efforts of the human generation that might have made the crucial difference in the climate fight.

Exxon’s behavior is shocking, but not entirely surprising. Philip Morris lied about the effects of cigarette smoking before the government stood up to Big Tobacco. The mystery that historians will have to unravel is what went so wrong in our governance and our culture that we have done, essentially, nothing to stand up to the fossil-fuel industry.

There are undoubtedly myriad intellectual, psychological, and political sources for our inaction, but I cannot help thinking that the influence of Ayn Rand, the Russian émigré novelist, may have played a role. Rand’s disquisitions on the “virtue of selfishness” and unbridled capitalism are admired by many American politicians and economists—Paul Ryan, Tillerson, Mike Pompeo, Andrew Puzder, and Donald Trump, among them. Trump, who has called The Fountainhead his favorite book, said that the novel “relates to business and beauty and life and inner emotions. That book relates to . . . everything.” Long after Rand’s death, in 1982, the libertarian gospel of the novel continues to sway our politics: Government is bad. Solidarity is a trap. Taxes are theft. The Koch brothers, whose enormous fortune derives in large part from the mining and refining of oil and gas, have peddled a similar message, broadening the efforts that Exxon-funded groups like the Global Climate Coalition spearheaded in the late nineteen-eighties.

Fossil-fuel companies and electric utilities, often led by Koch-linked groups, have put up fierce resistance to change. In Kansas, Koch allies helped turn mandated targets for renewable energy into voluntary commitments. In Wisconsin, Scott Walker’s administration prohibited state land officials from talking about climate change. In North Carolina, the state legislature, in conjunction with real-estate interests, effectively banned policymakers from using scientific estimates of sea-level rise in the coastal-planning process. Earlier this year, Americans for Prosperity, the most important Koch front group, waged a campaign against new bus routes and light-rail service in Tennessee, invoking human liberty. “If someone has the freedom to go where they want, do what they want, they’re not going to choose public transit,” a spokeswoman for the group explained. In Florida, an anti-renewable-subsidy ballot measure invoked the “Rights of Electricity Consumers Regarding Solar Energy Choice.”

Such efforts help explain why, in 2017, the growth of American residential solar installations came to a halt even before March, 2018, when President Trump imposed a thirty-per-cent tariff on solar panels, and why the number of solar jobs fell in the U.S. for the first time since the industry’s great expansion began, a decade earlier. In February, at the Department of Energy, Rick Perry—who once skipped his own arraignment on two felony charges, which were eventually dismissed, in order to attend a Koch brothers event—issued a new projection in which he announced that the US would go on emitting carbon at current levels through 2050; this means that our nation would use up all the planet’s remaining carbon budget if we plan on meeting the 1.5-degree target. Skepticism about the scientific consensus, Perry told the media in 2017, is a sign of a “wise, intellectually engaged person.”

Of all the environmental reversals made by the Trump Administration, the most devastating was its decision, last year, to withdraw from the Paris accords, making the US, the largest single historical source of carbon, the only nation not engaged in international efforts to control it. As the Washington Post reported, the withdrawal was the result of a collaborative venture. Among the anti-government ideologues and fossil-fuel lobbyists responsible was Myron Ebell, who was at Trump’s side in the Rose Garden during the withdrawal announcement, and who, at Frontiers of Freedom, had helped run a “complex influence campaign” in support of the tobacco industry. Ebell is a director of the Competitive Enterprise Institute [CEI], which was founded in 1984 to advance “the principles of limited government, free enterprise, and individual liberty,” and which funds the Cooler Heads Coalition, “an informal and ad-hoc group focused on dispelling the myths of global warming,” of which Ebell is the chairman. Also instrumental were the Heartland Institute and the Koch brothers’ Americans for Prosperity. After Trump’s election, these groups sent a letter reminding him of his campaign pledge to pull America out. The CEI ran a TV spot: “Mr. President, don’t listen to the swamp. Keep your promise.” And, despite the objections of most of his advisers, he did. The coalition had used its power to slow us down precisely at the moment when we needed to speed up. As a result, the particular politics of one country for one half-century will have changed the geological history of the earth.

We are on a path to self-destruction, and yet there is nothing inevitable about our fate. Solar panels and wind turbines are now among the least expensive ways to produce energy. Storage batteries are cheaper and more efficient than ever. We could move quickly if we chose to, but we’d need to opt for solidarity and coördination on a global scale. The chances of that look slim. In Russia, the second-largest petrostate after the US, Vladimir Putin believes that “climate change could be tied to some global cycles on Earth or even of planetary significance.” Saudi Arabia, the third-largest petrostate, tried to water down the recent IPCC report. Jair Bolsonaro, the newly elected President of Brazil, has vowed to institute policies that would dramatically accelerate the deforestation of the Amazon, the world’s largest rain forest. Meanwhile, Exxon recently announced a plan to spend a million dollars—about a hundredth of what the company spends each month in search of new oil and gas—to back the fight for a carbon tax of forty dollars a ton. At a press conference, some of the IPCC’s authors laughed out loud at the idea that such a tax would, this late in the game, have sufficient impact.

The possibility of swift change lies in people coming together in movements large enough to shift the Zeitgeist. In recent years, despairing at the slow progress, I’ve been one of many to protest pipelines and to call attention to Big Oil’s deceptions. The movement is growing. Since 2015, when four hundred thousand people marched in the streets of New York before the Paris climate talks, activists—often led by indigenous groups and communities living on the front lines of climate change—have blocked pipelines, forced the cancellation of new coal mines, helped keep the major oil companies out of the American Arctic, and persuaded dozens of cities to commit to one-hundred-per-cent renewable energy.

Each of these efforts has played out in the shadow of the industry’s unflagging campaign to maximize profits and prevent change. Voters in Washington State were initially supportive of a measure on last month’s ballot which would have imposed the nation’s first carbon tax—a modest fee that won support from such figures as Bill Gates. But the major oil companies spent record sums to defeat it. In Colorado, a similarly modest referendum that would have forced frackers to move their rigs away from houses and schools went down after the oil industry outspent citizen groups forty to one. This fall, California’s legislators committed to using only renewable energy by 2045, which was a great victory in the world’s fifth-largest economy. But the governor refused to stop signing new permits for oil wells, even in the middle of the state’s largest cities, where asthma rates are high.

New kinds of activism keep springing up. In Sweden this fall, a one-person school boycott by a fifteen-year-old girl named Greta Thunberg helped galvanize attention across Scandinavia. At the end of October, a new British group, Extinction Rebellion—its name both a reflection of the dire science and a potentially feisty response—announced plans for a campaign of civil disobedience. Last week, fifty-one young people were arrested in Nancy Pelosi’s office for staging a sit-in, demanding that the Democrats embrace a “Green New Deal” that would address the global climate crisis with policies to create jobs in renewable energy. They may have picked a winning issue: several polls have shown that even Republicans favor more government support for solar panels. This battle is epic and undecided. If we miss the two-degree target, we will fight to prevent a rise of three degrees, and then four. It’s a long escalator down to Hell.

Last June, I went to Cape Canaveral to watch Elon Musk’s Falcon 9 rocket lift off. When the moment came, it was as I’d always imagined: the clouds of steam venting in the minutes before launch, the immensely bright column of flame erupting. With remarkable slowness, the rocket began to rise, the grip of gravity yielding to the force of its engines. It is the most awesome technological spectacle human beings have produced.

Musk, Jeff Bezos, and Richard Branson are among the billionaires who have spent some of their fortunes on space travel—a last-ditch effort to expand the human zone of habitability. In November, 2016, Stephen Hawking gave humanity a deadline of a thousand years to leave Earth. Six months later, he revised the timetable to a century. In June, 2017, he told an audience that “spreading out may be the only thing that saves us from ourselves.” He continued, “Earth is under threat from so many areas that it is difficult for me to be positive.”

But escaping the wreckage is, almost certainly, a fantasy. Even if astronauts did cross the thirty-four million miles to Mars, they’d need to go underground to survive there. To what end? The multimillion-dollar attempts at building a “biosphere” in the Southwestern desert in 1991 ended in abject failure. Kim Stanley Robinson, the author of a trilogy of novels about the colonization of Mars, recently called such projects a “moral hazard.” “People think if we fuck up here on Earth we can always go to Mars or the stars,” he said. “It’s pernicious.”

The dream of interplanetary colonization also distracts us from acknowledging the unbearable beauty of the planet we already inhabit. The day before the launch, I went on a tour of the vast grounds of the Kennedy Space Center with NASA’s public-affairs officer, Greg Harland, and the biologist Don Dankert. I’d been warned beforehand by other NASA officials not to broach the topic of global warming; in any event, NASA’s predicament became obvious as soon as we climbed up on a dune overlooking Launch Complex 39, from which the Apollo missions left for the moon, and where any future Mars mission would likely begin. The launchpad is a quarter of a mile from the ocean—a perfect location, in the sense that, if something goes wrong, the rockets will fall into the sea, but not so perfect, since that sea is now rising. NASA started worrying about this sometime after the turn of the century, and formed a Dune Vulnerability Team.

In 2012, Hurricane Sandy, even at a distance of a couple of hundred miles, churned up waves strong enough to break through the barrier of dunes along the Atlantic shoreline of the Space Center and very nearly swamped the launch complexes. Dankert had millions of cubic yards of sand excavated from a nearby Air Force base, and saw to it that a hundred and eighty thousand native shrubs were planted to hold the sand in place. So far, the new dunes have yielded little ground to storms and hurricanes. But what impressed me more than the dunes was the men’s deep appreciation of their landscape. “Kennedy Space Center shares real estate with the Merritt Island Wildlife Refuge,” Harland said. “We use less than ten per cent for our industrial purposes.”

“When you look at the beach, it’s like eighteen-seventies Florida—the longest undisturbed stretch on the Atlantic Coast,” Dankert said. “We launch people into space from the middle of a wildlife refuge. That’s amazing.”

The two men talked for a long time about their favorite local species—the brown pelicans that were skimming the ocean, the Florida scrub jays. While rebuilding the dunes, they carefully bucket-trapped and relocated dozens of gopher tortoises. Before I left, they drove me half an hour across the swamp to a pond near the Space Center’s headquarters building, just to show me some alligators. Menacing snouts were visible beneath the water, but I was more interested in the sign that had been posted at each corner of the pond explaining that the alligators were native species, not pets. “Putting any food in the water for any reason will cause them to become accustomed to people and possibly dangerous,” it went on, adding that, if that should happen, “they must be removed and destroyed.”

Something about the sign moved me tremendously. It would have been easy enough to poison the pond, just as it would have been easy enough to bulldoze the dunes without a thought for the tortoises. But NASA hadn’t done so, because of a long series of laws that draw on an emerging understanding of who we are. In 1867, John Muir, one of the first Western environmentalists, walked from Louisville, Kentucky, to Florida, a trip that inspired his first heretical thoughts about the meaning of being human. “The world, we are told, was made especially for man—a presumption not supported by all the facts,” Muir wrote in his diary. “A numerous class of men are painfully astonished whenever they find anything, living or dead, in all God’s universe, which they cannot eat or render in some way what they call useful to themselves.” Muir’s proof that this self-centeredness was misguided was the alligator, which he could hear roaring in the Florida swamp as he camped nearby, and which clearly caused man mostly trouble. But these animals were wonderful nonetheless, Muir decided—remarkable creatures perfectly adapted to their landscape. “I have better thoughts of those alligators now that I’ve seen them at home,” he wrote. In his diary, he addressed the creatures directly: “Honorable representatives of the great saurian of an older creation, may you long enjoy your lilies and rushes, and be blessed now and then with a mouthful of terror-stricken man by way of dainty.”

That evening, Harland and Dankert drew a crude map to help me find the beach, north of Patrick Air Force Base and south of the spot where, in 1965, Barbara Eden emerged from her bottle to greet her astronaut at the start of the TV series “I Dream of Jeannie.” There, they said, I could wait out the hours until the pre-dawn rocket launch and perhaps spot a loggerhead sea turtle coming ashore to lay her eggs. And so I sat on the sand. The beach was deserted, and under a near-full moon I watched as a turtle trundled from the sea and lumbered deliberately to a spot near the dune, where she used her powerful legs to excavate a pit. She spent an hour laying eggs, and even from thirty yards away you could hear her heavy breathing in between the whispers of the waves. And then, having covered her clutch, she tracked back to the ocean, in the fashion of others like her for the past hundred and twenty million years. ###

[William (Bill) McKibben, a former New Yorker staff writer, is both the founder of the grassroots climate campaign 350.org and the Schumann Distinguished Scholar in environmental studies at Middlebury College (VT). McKibben wrote the first book about global warming, The End of Nature (1989). Find other books by Bill McKibben here. Bill McKibben received an AB (English) from Harvard University (MA).]

Copyright © 2018 The New Yorker/Condé Nast Digital



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, November 27, 2018

More From The Department Of Better Late Than Never — A Secular Homily On Thanksgiving

Today, David Von Drehle offers a lesson on the use of history when it seems that life in November 2018 is the worst time ever. Of course, we don't dwell on the horrors of the past, but that doesn't erase those earlier horrors. It is awful to contemplate that today's horrors will fade only with the emergence of new horrors forty years hence. If this is a (fair & balanced) sense of relief for this blogger, who won't be around to moan about the horrible times of 2058, so be it.

[x WaPo — DC Fishwrap]
You Think Things Are Bad Now? (Look Back 40 Years)
By David Von Drehle


TagCrowd Cloud of the following piece of writing

created at TagCrowd.com

In more robust times for the metropolitan newspaper business, the Kansas City Star ran a column devoted to nourishing civic memory. The feature recalled events and personalities from 40 years past — roughly half a lifetime — and frequently sparked reflections along the lines of that great French adage: Plus ça change, plus c'est la même chose (The more things change, the more they stay the same).

Alas, the column has not survived the many rounds of budget cuts visited on the Star and other local papers. But its spirit survives in the 40 Years Ago Column Club. I was their guest at lunch the other day.

The club newsletter at my seat reminded me of the tenor of the world in 1978. Amid the powder keg of the Middle East, a revolution was underway in Iran. The aged Shah, a despot installed and propped up by US power, imposed military rule in a desperate attempt to extend his reign. Among oppressed Iranians, a rumor spread that the stern visage of Ayatollah Ruhollah Khomeini had appeared in the full moon — an omen that would be fulfilled a short time later when the Shah fled and Khomeini returned from exile to inaugurate an Islamic republic.

Meanwhile, a cult leader from California, who had taken his followers to a jungle compound in South America, ordered the assassination of a visiting congressman. Then Jim Jones led his people in a ghastly murder-suicide that left more than 900 bodies putrefying in the heat. A third of them were children.

Six nuclear weapons were detonated around the globe in a single month by four different countries — the United States, France, the United Kingdom and the Soviet Union. The total number of test explosions in 1978 was 66. Anxiety over the risk of nuclear war was rising steeply. Within a handful of years, 100 million Americans — an astounding 43 percent of the population — would tune in to a single broadcast: “The Day After,” which depicted the destruction of Kansas City and nearby towns in a nuclear holocaust.

The newsletter took me 40 years deeper into the past: 1938. A darker period of human history is hard to imagine. Joseph Stalin’s terror was culminating in the Moscow “show trials” of his political enemies, while millions of ordinary Soviets were enslaved throughout the gulag archipelago. Farther east, the Japanese army was continuing its brutal conquest of China.

In the United States, the economy was failing again. One in five Americans was out of work. The midterm elections were a massive rebuke of President Franklin D. Roosevelt, who was widely criticized as a would-be autocrat. Roosevelt’s Democrats lost 71 seats in the House and seven in the Senate. “The New Deal has been halted,” the influential columnist Arthur Krock declared.

And in those corrupted storehouses of Western culture, Germany and Austria, the murder of a Nazi diplomat in France was the pretext for an orgy of state-sponsored violence against Jews. More than 1,000 synagogues were burned in two November days and nights; at least 7,500 businesses were looted; some 30,000 Jewish men were arrested and condemned to slavery at Dachau, Buchenwald and Sachsenhausen. This “Kristallnacht” — the glass of smashed windows covered the Reich like crystals — was followed by a month-long barrage of harsher anti-Jewish decrees. One-fifth of all Jewish wealth was confiscated, along with insurance payments for the destroyed property. Jews were forbidden to own businesses; their children were banned from public schools.

Forty years. This exercise could be continued, I suppose, as far back as records will take us. Forty years before Kristallnacht, in November 1898, a white supremacist army overran Wilmington, NC, and deposed the elected municipal government in US history’s only coup d’état. Incited by speeches and editorials calling for the mass lynching of African Americans, the heavily armed mob drove hundreds of Wilmington families from their homes and gunned down an estimated 60 black citizens while burning the office of a newspaper editor who dared to suggest that white women might consent to sex with black men.

The lesson: Never in the course of human events was everything well and goodness unchallenged. Hatred and vice have always obstructed and opposed the exercise of virtues — of kindness, generosity, comity, humility, honesty and all the others. If we feel these contending forces more sharply in current events than through history books, it’s not because our day is more conflicted, more anxious, more contentious or more dangerous. It’s because it’s our day.

Now approaches the day set aside for giving thanks. I’m grateful for all those, famous and forgotten, past and present, who have lived through difficult times and chosen hope over despair, respect over contempt, gentleness over cruelty, communion over division, liberty over tyranny, beauty over ugliness, dignity over base impulses and manipulation. These choices are right and good, in all times and places. But that doesn’t make them easy — not today, not tomorrow, not 40 years hence. ###

[David Von Drehle is a columnist for The Washington Post, where he writes about national affairs and politics. He joined The Post in 2017 after a decade at Time magazine, where he wrote more than 60 cover stories as editor-at-large. During a previous stint at The Post, Von Drehle served as a writer and editor on the National staff, in Style, and at the Magazine. He is the author of a number of books, including the award-winning bestseller Triangle: The Fire That Changed America (2003). See Von Drehle's other books here. He received a BA (English) from the University of Denver (CO), where he was also a Boettcher Foundation Scholar and editor of the Denver Clarion, the student newspaper. In 1985 Von Drehle also received an MLitt (literature) from Oxford University (UK) as a Marshall Scholar.]

Copyright © 2018 The Washington Post







Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves