Tuesday, March 31, 2015

How Many Founding Fathers Could Dance On The Head Of A Pin?

At the bottom of Indiana's Religious Freedom Restoration Act of 2015 is more than a defense of Christianity. It legitimizes discrimination against same-sex couples and gay marriage. One of the greatest canards promulgated by the Religious Right is that the Founding Fathers (like Jefferson and Franklin) were devout Christians. Fie unto the proponents who proclaim that the "war on the Christian faith" is real. If this is a (fair & balanced) call upon the Religious Right to cease and deist, so be it.

[x Patheos]
What Is Deism?
By Thomas S. Kidd

[Tag Cloud of the following piece of writing, created at TagCrowd.com]

The claim that any of the Founding Fathers were deists generates pushback among certain conservatives. This helps to account for the firestorm of controversy (which I covered for WORLD magazine) over David Barton’s The Jefferson Lies (2012) and the book’s subsequent abandonment by Thomas Nelson Publishers. Barton argued that until late in life, Jefferson was an orthodox, Trinitarian Christian, but critics showed that Barton neglected earlier evidence that Jefferson, among other things, denied the Trinity.

When I get questions about deism and the Founders, I quickly point out that we know Ben Franklin was a deist because he called himself a deist. That quiets most potential critics because, after all, we don’t want to argue with the Founders’ own words. (Although I did once have someone insist that Franklin does not, in fact, call himself a deist in his autobiography, but a Christian. Perhaps this was a postmodern interpretation of Franklin’s statement “I soon became a thorough Deist”?)

Part of the problem with calling any of the Founders deists is the difficulty of defining deism. What did that term mean in the eighteenth century? Could you be a deist and somehow believe in prayer, as Franklin apparently did, at least as of the Constitutional Convention? (Franklin made a failed motion for the convention to open its sessions in prayer.) Could you be a deist and say with Jefferson, “I am a real Christian”?

Arguments about whether any or all the Founders were deists usually are hamstrung by overly precise definitions of deism. Deists believed in God as the cosmic watchmaker, critics protest, so any sign that a person believed in prayer or Providence automatically disqualifies them. But deism in eighteenth-century Europe and America could mean many different things. Its adherents could range from people who had qualms about Calvinism, to those who criticized the corruptions of the church as “priestcraft,” to more radical deists who espoused beliefs that seem close to atheism.

We should also remember that “deism” and “deists” were terms probably more often used by critics against their opponents, rather than by deists themselves. Self-identifying deists like Franklin were quite rare in America, although scholarly work by Amanda Porterfield and others has recently suggested that deism and freethinking may have had a stronger presence in post-Revolutionary America than we had previously realized. Evangelical writers also magnified the perceived threat of deists to the Christian character of the republic, just as some popular Christian authorities today herald the imminent fall of most teenage Christians into the hands of secularists.

So what was deism? In spite of all its diversity, deism was a strain of rationalist religion — many of its advocates, like Jefferson, would have called themselves Christians — which focused on the ethical, rational requirements of true faith and criticized the authority of ministers and institutional churches. Many of them, especially in England and America, believed that there was a true core of Christianity that one could recover through attention to Jesus’s teachings alone. One important aspect of deism that we often miss is that its adherents could hardly imagine a world not organized on theistic moral categories, such as the inherent goodness of charity. Most deists really did consider themselves serious theists, and many considered themselves devotees of Jesus and his teachings. Their deism was not just a convenient cloak for atheism.

Both Franklin and Jefferson wanted to dispense with Christian dogma and recover the true faith, which was a quality of living rather than a set of arcane propositions which (as they saw it) the guardians of orthodoxy defended in order to protect their own power. This is why Franklin gave so much attention to tests of personal virtue, and experimented constantly with charitable projects. Likewise, Jefferson was almost obsessed with the person and teachings of Jesus, but believed that in his teaching and behavior Jesus served as the preeminent example of “human excellence,” and that his followers imposed claims about his divinity and resurrection after the teacher’s death. But neither Jefferson nor Franklin imagined that we could do without this recovered rationalist Christianity — it was the best guide we had to real virtue.

The deists’ closest descendants today are not the “new atheists” who have stirred up so much media chatter in recent years. Their closest descendants are probably liberal mainline Christians who see Jesus as their model but who eschew (or even deny) the particular, exclusive doctrines that have been associated with Christian orthodoxy for millennia. Even though it has had some very influential devotees, that kind of non-orthodox faith has never seemed to win or hold many adherents in America. Ω

[Thomas Kidd is a Professor of History at Baylor University; he also is a Senior Fellow at the Baylor Institute for Studies of Religion. He has written God of Liberty: A Religious History of the American Revolution (2010) and Patrick Henry: First Among Patriots (2011). Kidd received the following degrees (history): BA and MA from Clemson University and a PhD from the University of Notre Dame.]

Copyright © 2015 Patheos



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2015 Sapper's (Fair & Balanced) Rants & Raves

Monday, March 30, 2015

Cruz'n For A Losin'?

C.(rackpot) Cruz is the focus of today's 'toon, which appears on Mondays in this blog. The dude was a super-wonk at both Princeton and Harvard Law. C. Cruz has a fatal flaw: Canadian birth. Black's Law Dictionary (9th Edition) defines "Natural Born Citizen" as "A person born within the jurisdiction of a national government." Rafael Edward (C.) Cruz was born in Canada's Alberta Province, where his father operated a Calgary-based seismic-data-processing firm for oil drillers. In this blogger's humble opinion, C. Cruz is an unnatural-born citizen of the United States. If this is a (fair & balanced) pronouncement, so be it.

[x This Modern World]
Ted Cruz's Path To The White House
By Tom Tomorrow (Dan Perkins)

[Cartoon by Tom Tomorrow (Dan Perkins)]

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow". His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins, a longtime resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political weblog, also entitled "This Modern World," which he began in December 2001. More recently, Perkins was named the winner of the 2013 Herblock Prize for editorial cartooning.]

Copyright © 2015 Tom Tomorrow (Dan Perkins)



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2015 Sapper's (Fair & Balanced) Rants & Raves

Sunday, March 29, 2015

Today's Dismal View Of How Economies Work

Economists wage their internecine squabbles, and the sun still rises in the east and sets in the west. If this is (fair & balanced) blogger-ignorance, so be it.

[x Boston Fishwrap]
Not Even Paul Krugman Is A Real Keynesian
By Jonathan Schlefer

[Tag Cloud of the following piece of writing, created at TagCrowd.com]

Economists’ ideas are far more powerful than is immediately obvious. As the economist John Maynard Keynes once wrote: “Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually slaves of some defunct economist.”

Keynes was surely thinking of his influential predecessor, Knut Wicksell, who had died a decade before. This remark is from his General Theory, published in 1936, Keynes’s famous account of capitalism and depression, in which he broke free from Wicksell’s thinking.

Wicksell argued that if interest rates rise above some “natural rate,” they weaken investment demand and growth, while if they fall below it, they spur excessive demand and inflation. But markets inherently correct imbalances, restoring optimal equilibrium.

Keynes disagreed. “The economic system... seems capable of remaining in a chronic condition of subnormal activity for a considerable period without any marked tendency either towards recovery or towards complete collapse,” he wrote. “The evidence indicates that full, or even approximately full, employment is of rare and short-lived occurrence.”

Keynes’s insights have enormous practical importance, according to Lance Taylor and Duncan Foley of the New School. Temperamentally opposite — Foley a brilliant theorist, Taylor a pragmatist influential in developing nations — they jointly received the Leontief Prize for Advancing the Frontiers of Economic Thought at Tufts University’s Global Development and Environment Institute on Monday.

But isn’t Keynes now mainstream? No, say Foley and Taylor. The mainstream still sees economies as inherently moving to an optimal equilibrium, as Wicksell did. It still says demand causes short-run fluctuations, but only supply factors, such as the capital stock and technology, can affect long-run growth.

Even Paul Krugman, a self-described Keynesian, Nobel laureate, and New York Times columnist, writes in the 2012 edition of his textbook: “In the long run the economy is self-correcting: shocks to aggregate demand affect aggregate output in the short run but not in the long run.” He says Keynes and Wicksell are in key respects “essentially equivalent.”

Krugman does point to one exception: If interest rates are nearly zero, as during the financial crisis, markets lose restorative force. But, Taylor asks, what’s the logic?

Keynes saw capitalism’s general state as allowing almost arbitrary unemployment: hence his General Theory. Full employment was a lucky exception.

To Taylor, calling full employment the general state and allowing one unlucky exception turns Keynes upside down. And look where this confusion has brought us, he adds. Take the current eurozone disaster. For two decades, the European Union bureaucracy in Brussels, the German Council of Economic Experts, and a chorus of others branded Germany, the “sick man of Europe,” as suffering from a sclerotic supply side: rigid labor unions, impediments to layoffs, a burdensome welfare state. But German labor costs per unit of output sank steadily, and Germany generated huge trade surpluses — hardly signs of a sclerotic supply side. Yet growth has barely averaged 1 percent a year since 2000.

The problem? Weak demand, according to Taylor. The chorus ignored demand because the mainstream says it cannot affect long-run growth and employment. But it does. Now the same chorus is telling southern Europe to institute “reforms” like Germany. But if the German model doesn’t work well in Germany, how can it work in Greece, Italy, or Spain? A misguided idea is undermining the European Union itself.

Two economists could hardly have arrived at broadly similar views by more different paths. A lifelong Quaker, Duncan Foley planned to go into the Foreign Service but had to postpone training until his partner at the time could become a citizen. He’d liked economics as an undergraduate, so he entered the Yale PhD program in 1964.

Herbert Scarf’s [course] Mathematical Economics lit his thinking on fire. Scarf mapped out “the whole development of high economic theory” for the next quarter century, Foley recalls in an essay. A sad reflection on economic theory, he adds. Theorists would fritter away those years working out “increasingly esoteric implications of well-established concepts.”

Scarf’s teaching awoke in him a lifelong obsession with uncovering what would later be dubbed the “microfoundations of macroeconomics”: How might interactions among individuals and firms conspire to generate the workings of an entire economy?

On the one hand, the canonical “general equilibrium” model, published a decade earlier, had come as close to capturing Adam Smith’s ideal of the “invisible hand” as anything economists have ever invented. It depicts an economy in which numerous distinct individuals, making decisions according to their personal preferences, and trucking and bartering in competitive markets, create a best of all possible economic worlds. There is neither recession nor overheating. There is no unemployment.

On the other hand, Keynes’s world sees “dark forces of time and ignorance” enveloping the future. Speculators chase after what they think other speculators think, sometimes wreaking disaster. Firms may fail to sell goods or workers to find jobs. But this world is rather metaphorical. Instead of being inhabited by individual persons, it is composed of homogenized “capital” and “labor,” “consumption” and “investment.”

How could Foley reconcile the general equilibrium model, more realistically based on diverse individuals but depicting an unbelievable utopia, with the Keynesian model, depicting a more realistic world but based on disembodied labor and capital?

When Foley thought of quitting Yale after a year, his sympathetic adviser James Tobin, later a Nobel laureate, persuaded him to take general exams that summer and write his thesis the following year. He finished his PhD in an astonishing two years. He laments that Yale failed to socialize him into “the extremely narrow and ideological constraints” of the profession, though he concedes he might have been uneducable in that respect.

Foley pursued these explorations after landing “in the club” as a faculty member at MIT, then widely regarded as the top economics department anywhere. There was great team spirit, says Taylor, who also taught there: Arrayed around “Paul Samuelson as the king and Robert Solow as the prince,” MIT spread its influence far and wide.

Ironically, it was none other than Foley who brought general-equilibrium theory, often considered the crown of mainstream economics, to MIT. Money, central to Keynes, is absent from the barter general-equilibrium model. Foley hoped that by weaving money into general equilibrium he could show how economies really work.

Foley’s colleagues were intrigued by his projects — even such forays as studying anthills — but feared they were too quixotic to net many journal articles. He recalls that he would chew the fat with Joseph Stiglitz, who occupied the next office and liked “small models” addressing contained questions. Small models netted many journal articles, and if they somewhat contradicted one another, depending on the assumed situation, the MIT ethos held that each should be applied only with its specific scenario in mind. Stiglitz later was awarded the Nobel Prize. Foley wanted a more unified understanding. He did publish articles in top journals but finally concluded that you cannot weave money into general equilibrium because it is already there, operating so efficiently as to be invisible. When his wife was hired to teach classics at Stanford, he moved there.

The same questions still obsess Foley. In a paper he will give in April, he argues that economies assume dark Keynesian aspects because people interact socially outside markets. If we didn’t care how others value stocks, we would each value them at what we consider companies’ fundamental value. But we do care how others value stocks. “Technical traders” care about nothing else, buying and selling according to market “sentiment.” The result, Foley’s paper shows, is to drive a moderate equilibrium to extremes, limited only by traders’ perhaps scant “common sense” and bankers’ rather flexible “patience.” Likewise, social interactions drive aggregate demand and unemployment.

Lance Taylor had a less tempestuous career. Majoring in math at Caltech, he was fascinated by Keynes’s “General Theory.” Even in the 1960s, reading Keynes was unusual. Gregory Mankiw, who teaches macroeconomics at Harvard, has written, “One might suppose that reading Keynes is an important part of Keynesian theorizing. In fact, quite the opposite is the case.” Taylor could hardly disagree more.

In the economics PhD program at Harvard, he focused on developing nations: “It might do some good,” he says. He found his way down Massachusetts Avenue to MIT’s Center for International Studies, then financed partly by the CIA. He thinks Paul Rosenstein-Rodan, a principal founder of development economics who ran the center, may have been a CIA agent: “There was this amiable old guard who let you in, but he had a gun under his desk.” Rosenstein-Rodan helped Taylor get a job in Chile, where he learned about how developing economies worked “or didn’t work,” he cautions.

The fierce class divisions impressed him. His wife, a doctor, attended to the poor, contracted typhoid, and spent a month reading “One Hundred Years of Solitude” as she recovered. Next, off to Brasilia, bleak modernist architecture plunked into the jungle, he learned about the Texas of Latin America. Offered a job teaching at MIT, he published a stream of articles and sailed through tenure.

In 1978 Taylor wrote an article with Krugman, then an MIT graduate student. Unlike some professors who take credit for their brightest students’ work, he left Krugman’s name first. It became Krugman’s “job paper,” the one he sent to schools to get a job. Despite intellectual disagreements since then, Krugman gave a warm remembrance of Taylor when Taylor retired from full-time teaching last year.

Reminded how the pursuit of “small models” bothered Foley, Taylor shrugs, “I do it all the time.” Of course, his “small models” follow different lines from the mainstream. As for deriving models of economies from the microeconomics of interacting individuals, he responds, “I always thought standard microeconomics was silly.” In the mid-1990s, Taylor moved to the more collegial environment of the New School, where his collaboration with Foley and others deepened.

He argues that the structure of an economy cannot be derived from individuals; social institutions inherently shape it. He starts from national accounts — a tally of purchases and sales, investment and savings, among households, firms, the government, and banks, a framework that Keynes helped develop and saw as critical to understanding economies. Today, they are published by statistical agencies, principally the Bureau of Economic Analysis in the United States. On this accounting framework, Taylor imposes assumptions intended to capture the structure of key social institutions. He labels his approach “structuralist,” a term borrowed from Latin American economists.

Two key economic facts, Taylor argues, are largely determined outside of — or even in spite of — markets. He follows Keynes in insisting that the first is demand. He treats demand as generally driving long-run growth, not just the ups and downs of business cycles. For example, all advanced nations employed industrial policies when they were developing, in part to sustain demand. Most economics texts tell how England, better at manufacturing, traded with Portugal, better at winemaking. Taylor notes that, in fact, the British navy sailed into Lisbon harbor, lowered its guns, and forced Portugal to sign the famous 1703 trade treaty to expand its market for manufactured goods. Britain required its colonies to buy its goods for the same reason. And it imposed some of the highest tariffs ever to block manufactured imports. The United States did likewise when it industrialized after the Civil War.

The second fact — again determined largely by societies, not markets — is the distribution of income between profits and wages, and between high earners and low earners. In his “General Theory,” arguably as a concession to the mainstream, Keynes treated wages as determined by markets. But in response to friendly critics, according to Taylor, Keynes revised that idea and allowed that social bargaining determines wages. Taylor considers this approach both more consistent with Keynes’s central thinking and more realistic.

Taylor concedes that societies cannot ignore markets. According to work by Taylor and colleagues that extends Keynes’s approach, although demand drives all economies, some turn out to be “wage-led”: Consumer demand spurred by higher wages drives growth. Others turn out to be “profit-led”: Investment demand spurred by higher profits drives growth.

Modeling the US economy has convinced Taylor it is profit-led. A higher profit share boosts growth. Asked if he gets pushback from friends on the left, he hesitates. “Yes,” interjects Foley. “It’s constant.” His friends want the economy to be wage-led, Taylor agrees. “It’s their political preference. And it’s mine.” But he sees his models as tools to understand how economies work, not how he wants them to work. Not a bad lesson for all economists, Keynesian or not. Ω

[Jonathan Schlefer, a researcher at the Harvard Business School, is the author of The Assumptions Economists Make (2012). Schlefer received an AB (Greek literature and mathematics) from Harvard University, an M Arch (architecture) from the University of California at Berkeley, and a PhD (political science) from the Massachusetts Institute of Technology. He was the editor-in-chief of Technology Review (1982-1991) at MIT.]

Copyright © 2015 Boston Globe Media Partners



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2015 Sapper's (Fair & Balanced) Rants & Raves

Saturday, March 28, 2015

Words/Textbox To Dread: "Enter Your Password Here"

Ah, passwords. One of the regular irritants this blogger encounters in cyberspace. Just yesterday, a password-protected site refused the password that had worked two days previously. The Help Desk was not very helpful, and the quick-and-dirty solution was for the site's IT department to delete the blogger's account with instructions that this blogger re-register as a "New User." And so it goes. Another day, another password to remember. The pile of stones to be pushed up the hill just keeps getting higher. If this is (fair & balanced) cyber-frustration, so be it.

[x NY Fishwrap 'Zine]
The Secret Life Of Passwords
By Ian Urbina

[Tag Cloud of the following piece of writing, created at TagCrowd.com]

Howard Lutnick, the chief executive of Cantor Fitzgerald, one of the world’s largest financial-services firms, still cries when he talks about it. Not long after the planes struck the twin towers, killing 658 of his co-workers and friends, including his brother, one of the first things on Lutnick’s mind was passwords. This may seem callous, but it was not.

Like virtually everyone else caught up in the events that day, Lutnick, who had taken the morning off to escort his son, Kyle, to his first day of kindergarten, was in shock. But he was also the one person most responsible for ensuring the viability of his company. The biggest threat to that survival became apparent almost immediately: No one knew the passwords for hundreds of accounts and files that were needed to get back online in time for the reopening of the bond markets. Cantor Fitzgerald did have extensive contingency plans in place, including a requirement that all employees tell their work passwords to four nearby colleagues. But now a large majority of the firm’s 960 New York employees were dead. “We were thinking of a major fire,” Lutnick said. “No one in those days had ever thought of an entire four-to-six-block radius being destroyed.” The attacks also knocked out one of the company’s main backup servers, which were housed, at what until that day seemed like a safe distance away, under 2 World Trade Center.

Hours after the attacks, Microsoft dispatched more than 30 security experts to an improvised Cantor Fitzgerald command center in Rochelle Park, N.J., roughly 20 miles from the rubble. Many of the missing passwords would prove to be relatively secure — the “JHx6fT!9” type that the company’s I.T. department implored everyone to choose. To crack those, the Microsoft technicians performed “brute force” attacks, using fast computers to begin with “a” and then work through every possible letter and number combination before ending at “ZZZZZZZ.” But even with the fastest computers, brute-force attacks, working through trillions of combinations, could take days. Wall Street was not going to wait.
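
[A brief aside from this blog, not from the article: the brute-force idea described above can be sketched in a few lines of Python. This is only a toy illustration under assumed details (a SHA-256 hash, a small alphabet, short candidate strings, function names of the blogger's own choosing) and bears no relation to the actual tools Microsoft's technicians used.]

    import hashlib
    from itertools import product
    from string import ascii_letters, digits

    def brute_force(target_hash, max_length=4, alphabet=ascii_letters + digits):
        # Try every candidate string, shortest first, until one hashes to the
        # target. The search space grows as len(alphabet) ** length, which is
        # why cracking long passwords this way can take days even on fast hardware.
        for length in range(1, max_length + 1):
            for combo in product(alphabet, repeat=length):
                candidate = "".join(combo)
                if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
                    return candidate
        return None

    # Example: recover a short password from its SHA-256 hash.
    secret = hashlib.sha256(b"Zz9").hexdigest()
    print(brute_force(secret))  # prints "Zz9"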

Microsoft’s technicians, Lutnick recalled, knew that they needed to take advantage of two facts: Many people use the same password for multiple accounts, and these passwords are typically personalized. The technicians explained that for their algorithms to work best, they needed large amounts of trivia about the owner of each missing password, the kinds of things that were too specific, too personal and too idiosyncratic for companies to keep on file. “It’s the details that make people distinct, that make them individuals,” Lutnick said. He soon found himself on the phone, desperately trying to compartmentalize his own agony while calling the spouses, parents and siblings of his former colleagues to console them — and to ask them, ever so gently, whether they knew their loved ones’ passwords. Most often they did not, which meant that Lutnick had to begin working his way through a checklist that had been provided to him by the Microsoft technicians. “What is your wedding anniversary? Tell me again where he went for undergrad? You guys have a dog, don’t you? What’s her name? You have two children. Can you give me their birth dates?”

“Remember, this was less than 24 hours after the towers had fallen,” he said. “The fire department was still referring to it as a search-and-rescue mission.” Families had not accepted their losses. Lutnick said he never referred to anyone as being dead, just “not available right now.” He framed his questions to be an affirmation of that person’s importance to the company, he said. Conversations oscillated between sudden bawling and agonizing silences. “Awful,” he said. Sometimes it took more than an hour to work through the checklist, but Lutnick said he made sure he was never the one to hang up first.

In the end, Microsoft’s technicians got what they needed. The firm was back in operation within two days. The same human sentimentality that made Cantor Fitzgerald’s passwords “weak” ultimately proved to be its saving grace.

Several years ago I began asking my friends and family to tell me their passwords. I had come to believe that these tiny personalized codes get a bum rap. Yes, I understand why passwords are universally despised: the strains they put on our memory, the endless demand to update them, their sheer number. I hate them, too. But there is more to passwords than their annoyance. In our authorship of them, in the fact that we construct them so that we (and only we) will remember them, they take on secret lives. Many of our passwords are suffused with pathos, mischief, sometimes even poetry. Often they have rich back stories. A motivational mantra, a swipe at the boss, a hidden shrine to a lost love, an inside joke with ourselves, a defining emotional scar — these keepsake passwords, as I came to call them, are like tchotchkes of our inner lives. They derive from anything: Scripture, horoscopes, nicknames, lyrics, book passages. Like a tattoo on a private part of the body, they tend to be intimate, compact and expressive.

Perhaps my biggest surprise has been how willing, eager actually, people are to openly discuss their keepsakes. The friends I queried forwarded my request, and before long I started receiving passwords from complete strangers. There was the former prisoner whose password includes what used to be his inmate identification number (“a reminder not to go back”); the fallen-away Catholic whose passwords incorporate the Virgin Mary (“it’s secretly calming”); the childless 45-year-old whose password is the name of the baby boy she lost in utero (“my way of trying to keep him alive, I guess”).

Sometimes the passwords were playful. Several people said they used “incorrect” for theirs so that when they forgot it, the software automatically prompted them with the right one (“your password is incorrect”). Nicole Perlroth, The New York Times’s cybersecurity reporter, told me about the awkward conversation she had not long ago, when, locked out of her account, she was asked by the newspaper’s tech-support staff to disclose her password: a three-digit code plus an unpublishable epithet — a reference to a funny exchange she overheard years earlier between a store clerk and a thief.

Often, though, these disclosures had an emotional edge to them. One woman described the jarring realization that her sister’s name was the basis for all of their mother’s passwords. Another, Becky FitzSimons, recalled needling her husband, Will, after their wedding in 2013 because he was still using the digits of his ex-girlfriend’s birthday for his debit-card PIN. “I’m not a jealous person,” FitzSimons said. “But he changed it to my birthday the next day.”

Standing at the park watching my 11-year-old son climb on the jungle gym, I struck up a conversation with a woman walking her dog, and I told her about my keepsakes idea. Like most people, she did not want her name used in my article, because she said her vignette was too personal; she also feared being hacked. But she proceeded to tell me that several months after her son committed suicide, she found his password written on a piece of paper at his desk: “Lambda1969.” Only then, after some Internet searching, did she realize he had been gay. (Lambda is the Greek lowercase “l,” which some historians say stands in gay culture for liberation. The number, “1969,” she explained, referred to the year of the Stonewall Riots — the protests that followed a police raid on the Stonewall Inn in Greenwich Village.)

Some keepsakes were striking for their ingenuity. Like spring-loaded contraptions, they folded big thoughts down into tidy little ciphers. After being inspired by Sheryl Sandberg’s book, Lean In: Women, Work and the Will to Lead (2013), Cortni Kerr, a running partner of mine, began using “Ww$$do13,” which stood for “What would Sheryl Sandberg do” plus “13” for the year (2013) of the password’s creation. “TnsitTpsif” was the password of another friend, a computer scientist who loves wordplay. It stands for “The next sentence is true. The previous sentence is false,” which in philosophy is called a liar’s paradox. For my friend, it was a playful reference to the knots that language can tie. When I described keepsake passwords to Paul Saffo, who teaches engineering at Stanford and writes often about the future of technology, he coined the term “crypto haiku.”

Rachel Malis, 29, a friend’s former housemate, heard about my password fixations and emailed hers to me: “Odessa,” the Ukrainian city of her father’s birth. It seemed unremarkable to me. But she said there was more to it. So I suggested we meet for coffee. We sat for an hour while Malis nursed a latte and explained what gave her password its power for her.

“Odessa,” she said, referred not just to her lineage but also to a transformative trip she took there in 2008 with her father. In a sense, it was a place that had always separated them — it embodied a language, a regime and a past that she could never share. Her father fled Ukraine in 1980 when he was 28, and he vowed never to return. Even in America, old habits, like his KGB-induced skepticism of the police, lingered. Malis said that during her childhood in Trumbull, CT, near New Haven, he would close the living-room blinds whenever he wanted to discuss anything “sensitive,” like summer travel plans or family finances. The city loomed large in her father’s consciousness when Malis was growing up. She once asked why there was no fleck of green anywhere in their house — not in the wallpaper, pictures, dishes, throw rugs — and her mother explained that it was because the color reminded him of painful early years spent in the army.

On that trip back, Malis paid for her father’s plane ticket and arranged their accommodations, and they were both surprised to find him just as lost as she was in the streets of Odessa. Her laconic father was more talkative, though, in his native tongue. He was strangely calm visiting his father’s grave but became choked up when he showed her the tracks where he caught the train that whisked him out of the city one panicked night so long ago. Above all, Malis said, typing “Odessa” every time she logged in to her computer was a reminder of the true epiphany she carried home: that getting closer to something — her father, this city — didn’t make it smaller or more manageable. “It actually just brought their complexity and nuance more into focus,” she said.

At least as interesting as the amount of thought Malis had packed into this one six-letter word was the fact that she was telling me it all. I confessed to her that I loved “Odessa” as a password. At the same time, I worried that her office’s techies might not share my affection, given that their first rule is to avoid choosing passwords with personal significance. Malis pointed out that we break that rule precisely because secure passwords are so much harder to remember. Our brains are prone to mooring new memories to old ones, she said. I added that I thought the behavior spoke to something deeper, something almost Cartesian. Humans like, even need, to imbue things with meaning, I suggested. We’re prone to organizing symbols into language.

Malis gave me an inquisitive look. So I continued: We try to make the best of our circumstances, converting our shackles into art, I said. Amid all that is ephemeral, we strive for permanence, in this case ignoring instructions to make passwords disposable, opting instead to preserve our special ones. These very tendencies are what distinguish us as a species.

These special passwords are a bit like origami, I suggested: small and often impromptu acts of creativity, sometimes found in the most banal of places. Malis seemed to agree. She nodded, shook my hand and left.

Asking strangers about their passwords is a touchy proposition. Push too hard, and you come off as a prospective hacker. Go too easy, and people just rant about how much they hate passwords. Still, it’s not every day that you stumble across a conversation topic that teaches you new things about people you’ve known for years.

I discovered, for example, that my father — a recently retired federal judge and generally a pretty serious guy — derived his passwords from a closeted love for goofy, novelty songs from the late ’50s and early ’60s (“The Purple People Eater,” “Monster Mash”).

The “4622” that my wife uses in her passwords was not just the address of her own father’s childhood home but also a reminder of his fragility and strength. Apparently when the former 270-pound football standout, a scholarship athlete and the pride of his working-class neighborhood in west Tulsa, was a small boy, he had to sing his home address (“4622 South 28th West Avenue”) in one full breath rather than try to say it normally; otherwise, his debilitating stutter would trip him up.

My young son revealed that his password was “philosophy,” because, he said, several years earlier, when he created it, he took secret pride in knowing the meaning of a concept that big. The disclosure had an interesting echo for me, because one of my first childhood passwords was a play on “ontogeny recapitulates phylogeny,” an evolutionary theory from a high-school biology class that I found especially captivating. (The hypothesis, now unfashionable, posits that the physical or intellectual development of each individual passes through stages similar to the developmental stages of that individual’s species or civilization.)

I asked Andy Miah, a professor of science communication and digital media at the University of Salford in England, for his thoughts on passwords, and he offered an anthropological outlook. Keepsake passwords, he suggested, ritualize a daily encounter with personal memories that often have no place else to be recalled. We engage with them more frequently and more actively than we do, say, with the framed photo on our desk. “You lose that ritual,” Miah said, “you lose an intimacy with yourself.”

For some people, these rituals are motivational. Fiona Moriarty, a competitive runner, told me that she often used “16:59” — her target time for the 5,000 meters in track. Mauricio Estrella, a designer who emailed me from Shanghai, described how his passwords function like homemade versions of popular apps like Narrato or 1 Second Everyday, which automatically provide their users with a daily reminder to pause and reflect momentarily on personal ambitions or values. To help quell his anger at his ex-wife soon after their divorce, Estrella had reset his password to “Forgive@h3r.” “It worked,” he said. Because his office computer demanded that he change his password every 30 days, he moved on to other goals: “Quit@smoking4ever” (successful); “Save4trip@thailand” (successful); “Eat2@day” (“it never worked, I’m still fat,” Estrella wrote); “Facetime2mom@sunday” (“it worked,” he said, “I’ve started talking with my mom every week now”).

Keepsakes also memorialize loss or mark painful turning points. Leslye Davis, the New York Times reporter who produced the video series that accompanies this article online, said that “stroke911” was her original Facebook password because she happened to create her page on the same day that her cousin had a stroke. My friend Monica Vendituoli’s keepsake was “swim2659nomore” — a reference to a career-ending shoulder injury in 2008 that prevented her from hitting the 26.59-second qualifying time in the 50-yard freestyle she needed for a championship meet in high school. But the effect of typing this password had shifted over the years, she added. What started as a mourning ritual, she said, was now more a reminder of how “time heals all.”

These personal tributes vary widely, I found. Stuck on a tarmac last year, I sat next to a chatty man who, judging by his expensive watch and suit, seemed to have done well for himself. We made small talk about our jobs, and eventually I told him about my interest in passwords. After a long, silent look out the window, he turned to me and said that he typically uses “1060” in his passwords. This was his SAT score, he explained. He liked reminding himself of it, he said, because he took a certain private satisfaction in how far he had come in life in spite of his mediocre showing on the standardized test.

I got an email from a college student, Megan Welch, 21, who described having been trapped several years earlier in a relationship with a physically abusive boyfriend. She recounted how he routinely spied on her email. When she tried to change her password, he always either guessed or got her to tell him the new one. “I was so predictable,” she said. After finally deciding to break up with him, she used for her new password the date of her decision, plus the word “freedom” — a deviation, she said, from the cutesy words that had been her norm. In being uncharacteristic, her password became unhackable; it was at once a break from her former self and a commemoration of that break.

Keepsake passwords are so universal that they are now part of the fabric of pop culture. I noticed, for instance, that on Showtime’s “Dexter,” the main character (a blood-spatter analyst for the police by day, vigilante serial killer by night) forgot his work computer’s password. He was soon visited by the ghost of his adoptive father, Harry, who killed himself after witnessing Dexter’s violent tendencies. The visit reminded Dexter of his password (“Harry”) and the viewer of the longevity and depth of his personal torment.

Googling for more examples, I came across Jack Donaghy, Alec Baldwin’s character on the NBC sitcom “30 Rock.” He convinced himself that a high-school crush still had feelings for him after he learned that her voice-mail code, “55287,” stood for “Klaus,” the name Jack used in the high-school German class they took together. I found George Costanza from “Seinfeld” nearly driving his girlfriend mad, and maybe even killing a guy, by refusing to share his A.T.M. password, “Bosco,” a reference to George’s weakness for the chocolate syrup.

But perhaps the most bizarre one I found was Jerry Seinfeld’s A.T.M. code — “Jor-El.” On the simplest level — as the episode explained — this was the name of Superman’s Kryptonian father. It served as a nod to the fictional Jerry’s love of the comic-book character. But in digging a bit further, I found that the real-life Jerry’s father was of Eastern European-Jewish descent, and his first name was Kalman, a.k.a. Kal. This is why one of the actor’s two sons, born long after the episode was made, has Kal as his middle name. Though most people know Superman as Clark Kent, his Kryptonian name is Kal-El. What Jerry hid in his PIN looped between fact and fiction, past and present; and comic book, sitcom and real life.

I loved the Seinfeld password story because it was so convoluted that in retelling it I could barely follow it myself. Its circularity inspired a certain awe in me — the way you might feel when you first see an optical illusion by Escher. That got me thinking about the intricate and self-referential patterns famously described in Douglas R. Hofstadter’s 1979 classic Gödel, Escher, Bach: An Eternal Golden Braid. The book is a beautiful and personal musing on how we mold both language and our sense of self from the inanimate material around us.

I wondered if there might be some (modest) parallel between what I saw in keepsakes and the elaborate loops in music, math and art that he described in his book. Like a fractal running through human psychology, maybe we have a tendency not just to create keepsakes but to create ones with self-referential loops in them.

So I called Hofstadter to get his take. He was reserved but intrigued. I suggested that many of these passwords seem to be quiet celebrations of things we hold dear. Hofstadter concurred. His primary password, he said, was the same one he has used since 1975, when he was a visiting scholar at Stanford. It consisted of a sentimental date from his past coupled with a word problem.

“Might there be something deeper at work in these password habits and in the self-referential loops you studied?” I asked.

Some of these patterns we discover, Hofstadter said, others we create. But above all, “we oppose randomness,” he said. “Keepsake passwords are part of that.”

The Internet is a confessional place. With so little privacy, passwords may soon be tomorrow’s eight-track player, quaintly described to our grandchildren. Ten years ago, Bill Gates announced during a tech-security conference in San Francisco that “people are going to rely less and less” on passwords, because they cannot “meet the challenge” of keeping critical information secure. In recent years, there has been a push for machines to identify us not by passwords but by things we possess, like tokens and key cards, or by scanning our eyes, voices or fingerprints. This year, for example, Google purchased SlickLogin, a start-up that verifies IDs using sound waves. iPhones have come equipped with fingerprint scanners for more than a year now. And yet passwords continue to proliferate, to metastasize. Every day more objects — thermostats, car consoles, home alarm systems — are designed to be wired into the Internet and thus password protected. Because big data is big money, even free websites now make you register to view virtually anything of importance so that companies can track potential customers. Five years ago, people averaged about 21 passwords. Now that number is 81, according to LastPass, a company that makes password-storage software.

Partly this push is being fueled by a growing and shared hatred of passwords. The digital era is nothing if not overwhelming. The unrelenting flood of information. The constant troubleshooting. We only just master one new device before it becomes outmoded. These frustrations are channeled into tantrums over forgotten passwords.

There is scarcely a more modern sense of anomie than that of being caught in the purgatory where, having forgotten a password, we’re asked personal trivia questions about ourselves that we can’t seem to answer correctly. The almost-weekly stream of news stories about major security breaches makes it tough not to feel as if privacy on the Internet is unattainable.

It’s enough to make the conscientious objectors seem sane. These are the many people I interviewed who said they had given up on the whole notion of online security, opting instead to adopt intentionally insecure passwords.

Digital nudists of sorts, these people throw all discretion to the wind, leaving themselves naked to hackers and identity thieves; they are protected only by the hope that they might disappear in the crowd. Their humble acts of rebellion seem to suggest that maybe the reason people were so willing to tell me their keepsakes was that it offered a small, private catharsis from the pent-up pressure that we all feel to police our online security.

In December 2009, an Eastern European hacker trolling the Internet for vulnerable targets stumbled across the mother lode: a database of 32 million passwords for a company called RockYou that runs a network of online games. Several weeks later, the hacker published the database, which remains among the largest such archives ever released.

The digital nudists were well represented. At least one of every 10 users chose a name or a name plus a year for his password. Two of every thousand passwords were the word “password.” But the RockYou breach had bigger lessons to offer. Most password research is focused on security, rather than on psychology or anthropology. Few modern activities, however, are more universal than creating a password. Rich, poor, young, old, virtually all of us are confronted daily by some kind of registration-demanding technology: wire transfers, prepaid cellphones, online banking, email, calling cards. The RockYou database could show how, when and why words gather weight — existential, personal weight.

This is partly why, for the past several years, a small team of computer scientists at the University of Ontario Institute of Technology has studied the RockYou database for lexical patterns. Among their more interesting finds: “Love” was by far the most common verb among the passwords — about twice as common as conjugations of the verb “to be” and roughly 12 times as common as conjugations of the verb “to hate.” By far the most popular adjectives used in the database’s passwords were “sexy,” “hot” and “pink.” Men’s names were about four times as likely as women’s names to appear as the object of passwords that start with “I love.”

Christopher Collins, one of the group’s lead researchers, explained that affection even appears in disguised forms. What at first looked like a disproportionately frequent use of the word “team,” for instance, turned out to be versions of the Spanish words “te amo,” or “I love you,” Collins said. The number “14344” appeared unusually often, and the researchers at first figured that it referred to a date: March 14, 1944. After consulting Urban Dictionary, they soon found out that the number actually is popular code for “I love you very much.” (Count the letters in each word.)
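
[Another aside from this blog: the letter-count encoding is simple enough to sketch. The Python snippet below is purely illustrative, and the function name is the blogger's own invention.]

    def letter_count_code(phrase):
        # "I love you very much" -> "14344": one digit per word,
        # each digit being the number of letters in that word.
        return "".join(str(len(word)) for word in phrase.split())

    print(letter_count_code("I love you very much"))  # prints 14344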

In my own conversations, I, too, noticed that love (familial, unrequited, Platonic, failed) seemed to be a common source of inspiration for keepsakes. Perhaps my favorite of these anecdotes came from Maria T. Allen, who wrote that in 1993, when she was 22, she used for her password a combination of the name of her summer crush, J. D., with an autumn month and the name of a mythological female deity (she wouldn’t tell me which) to whom he had compared her when they first met. The fling ended, and they went their separate ways. But the password endured. Eleven years later, out of the blue, Allen received a message through Classmates.com from J. D. himself. They dated for several years, then decided to marry. Before the wedding, J. D. asked Maria if she had ever thought of him during that interim decade. “About every time I logged in to my Yahoo account,” she replied, before recounting to him her secret. He had the password inscribed on the inside of his wedding ring.

Granted, passwords harbor humanity’s darker side too. Joseph Bonneau, 30, who was among the first computer scientists to study RockYou’s archive, said he was amazed that tens of thousands of people would choose to introduce messages like “killmeplease,” “myfamilyhatesme” and “erinisaslut” — not to mention a slew of obscenities and racial slurs — into their lives multiple times a day.

In studying the database, Bonneau’s focus was not on the meaning of passwords but their security. And the further he dug into it, he said, the more he worried about the fate of privacy as so much of life moves online. “What the database made clear,” he said, “was that humans really are the weak link when it comes to data security.”

But precisely what made passwords so flawed is also what Bonneau said he found uplifting. “People take a nonnatural requirement imposed on them, like memorizing a password,” he said, “and make it a meaningful human experience.”

I later recounted Bonneau’s comment to Collins, who agreed. “We don’t just make it a meaningful experience,” he said. “Statistically speaking, at least based on the data, it’s most often an affectionate experience.”

There is something mildly destructive about collecting people’s keepsakes. Observers disturb the things we measure. But with passwords, or other secrets, we ruin them in their very discussion. Virtually all the people who revealed their passwords to me said they planned to stop using them. And yet they divulged them all the same.

Over the course of a half-hour, Hossein Bidgoli, a management information systems professor at California State University, Bakersfield, and editor of The Internet Encyclopedia, told me about the many dangers of using personal information in passwords. He fell silent, however, when I asked him whether he thought keepsakes were a bad thing.

Then he began to tell me about his life. He grew up in a small town near Tehran, he said, where he lived until he left Iran in 1976 to pursue his doctoral studies. He described his high school, which was named Karkhaneh, and the roses and rhododendron at a nearby plantation where he and his parents used to picnic. He recalled the distinct taste of the freshly made olive oil that his father, an engineer, used to bring home from the olive-processing plant where he worked.

“What you’re calling keepsake passwords,” Bidgoli said, “mine is ‘Karkhaneh.’ ”

Translated from Farsi, the word means “the place where people work,” he said. But for him, the name conjured a past happiness, time spent with his parents and the place that shaped his work ethic and his ethnic identity. “It’s a pretty memory,” he said, sotto voce.

I wondered why someone so concerned about security would be willing to tell me his password. I figured it might just be an extension of the oversharing culture that the Internet has created. Maybe my very hunt for significance in passwords and people’s general eagerness to help in that endeavor says more than any particular meaning I might actually find in the passwords themselves. Humans aren’t the only ones who solve puzzles. We are, however, the only ones who make puzzles simply so that we can solve them.

Bidgoli said he wasn’t sure why he disclosed his password. “It just seemed like your keepsakes are true,” he added after a long pause. “I wanted to contribute to that.” Ω

[Ian Urbina is a reporter for The New York Times, based in the paper’s Washington bureau. He has degrees in history from Georgetown University (BA) and the University of Chicago (MA), and his writings, which range from domestic and foreign policy to commentary on everyday life, have appeared in the Los Angeles Times, The Guardian, Harper’s, and elsewhere.]

Copyright © 2015 The New York Times Company



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2015 Sapper's (Fair & Balanced) Rants & Raves

Friday, March 27, 2015

What Is The Toughest Job In Kentucky?

Meet one of the most courageous people in Kentucky (and no, it's not Senate Majority Leader Mr. Turtle). Professor James Krupa faces class after class of students who are hostile to the E-Word (evolution) and keeps pushing his particular rock up the hill, class meeting after class meeting. If this is (fair & balanced) academic freedom, so be it.

[x Orion]
Defending Darwin
By James J. Krupa

[Tag Cloud of the following piece of writing, created at TagCrowd.com]

I’m often asked what I do for a living. My answer, that I am a professor at the University of Kentucky, inevitably prompts a second question: “What do you teach?” Responding to such a question should be easy and invite polite conversation, but I usually brace for a negative reaction. At least half the time the person flinches with disapproval when I answer “evolution,” and often the conversation simply terminates once the “e-word” has been spoken. Occasionally, someone will retort: “But there is no evidence for evolution.” Or insist: “It’s just a theory, so why teach it?”

At this point I should walk away, but the educator in me can’t. I generally take the bait, explaining that evolution is an established fact and the foundation of all biology. If in a feisty mood, I’ll leave them with this caution: the fewer who understand evolution, the more who will die. Sometimes, when a person is still keen to prove me wrong, I’m more than happy to share with him an avalanche of evidence demonstrating I’m not.

Some colleagues ask why I bother, as if I’m the one who’s the provocateur. I remind them that evolution is the foundation of our science, and we simply can’t shy away from explaining it. We don’t avoid using the “g-word” when talking about gravitational theory, nor do we avoid the “c-word” when talking about cell theory. So why avoid talking about evolution, let alone defending it? After all, as a biologist, the mission of advancing evolution education is the most important aspect of my job.

To teach evolution at the University of Kentucky is to teach at an institution steeped in the history of defending evolution education. The first effort to pass an anti-evolution law (led by William Jennings Bryan) happened in Kentucky in 1921. It proposed making the teaching of evolution illegal. The university’s president at that time, Frank McVey, saw this bill as a threat to academic freedom. Three faculty members—William Funkhouser, a zoologist; Arthur Miller, a geologist who taught evolution; and Glanville Terrell, a philosopher—joined McVey in the battle to prevent the bill from becoming law. They put their jobs on the line. Through their efforts, the anti-evolution bill was defeated by a forty-two to forty-one vote in the state legislature. Consequently, the movement turned its attention toward Tennessee.

John Thomas Scopes was a student at the University of Kentucky then and watched the efforts of his three favorite teachers and President McVey. The reason the “Scopes Monkey Trial” occurred several years later in Dayton, Tennessee—where Scopes was a substitute teacher and volunteered to be prosecuted—was in good part due to the influence of his mentors, particularly Funkhouser. As Scopes writes in his memoir, Center of the Storm: “Teachers rather than subject matter rekindled my interest in science. Dr. Funkhouser . . . was a man without airs [who] taught zoology so flawlessly that there was no need to cram for the final examination; at the end of the term there was a thorough, fundamental grasp of the subject in bold relief in the student’s mind, where Funkhouser had left it.”

I was originally reluctant to take my job at the university when offered it twenty years ago. It required teaching three sections of non-majors biology classes, with three hundred students per section, and as many as eighteen hundred students each year. I wasn’t particularly keen on lecturing to an auditorium of students whose interest in biology was questionable given that the class was a freshman requirement.

Then I heard an interview with the renowned evolutionary biologist E. O. Wilson in which he addressed why, as a senior professor—and one of the most famous biologists in the world—he continued to teach non-majors biology at Harvard. Wilson explained that non-majors biology is the most important science class that one could teach. He felt many of the future leaders of this nation would take the class, and that this was the last chance to convey to them an appreciation for biology and science. Moved by Wilson’s words, and with the knowledge that William Funkhouser once held the job I was now contemplating, I accepted the position. The need to do well was unnerving, however, considering that if I failed as a teacher, a future Scopes might leave my class uninspired.

I realized early on that many instructors teach introductory biology classes incorrectly. Too often evolution is the last section to be taught, an autonomous unit at the end of the semester. I quickly came to the conclusion that, since evolution is the foundation upon which all biology rests, it should be taught at the beginning of a course, and as a recurring theme throughout the semester. As the renowned geneticist Theodosius Dobzhansky said: “Nothing in biology makes sense except in the light of evolution.” In other words, how else can we explain why the DNA of chimps and humans is nearly 99 percent identical, and why the blood and muscle proteins of chimps and humans are nearly identical as well? Why are these same proteins slightly less similar to gorillas and orangutans, while much less similar to goldfish? Only evolution can shed light on these questions: we humans are great apes; we and the other apes (gibbons, chimps, gorillas, bonobos, and orangutans) all evolved from a common ancestor.

Soon, every topic and lecture in my class was built on an evolutionary foundation and explained from an evolutionary perspective. My basic biology for non-majors became evolution for non-majors. It didn’t take long before I started to hear from a vocal minority of students who strongly objected: “I am very offended by your lectures on evolution! Those who believe in creation are not ignorant of science! You had no right to try and force evolution on us. Your job was to teach it as a theory and not as a fact that all smart people believe in!!” And: “Evolution is not a proven fact. It should not be taught as if it is. It cannot be observed in any quantitative form and, therefore, isn’t really science.”

We live in a nation where public acceptance of evolution is the second lowest of thirty-four developed countries, just ahead of Turkey. Roughly half of Americans reject some aspect of evolution, believe the earth is less than ten thousand years old, and believe that humans coexisted with dinosaurs. Where I live, many believe evolution to be synonymous with atheism, and there are those who strongly feel I am teaching heresy to thousands of students. A local pastor, whom I’ve never met, wrote an article in The University Christian complaining that not only was I teaching evolution and ignoring creationism, but I was teaching it as a non-Christian, alternative religion.

There are students who enroll in my courses and already accept evolution. Although not yet particularly knowledgeable on the subject, they are eager to learn more. Then there are the students whose minds are already sealed shut to the possibility that evolution exists, but need to take my class to fulfill a college requirement. And then there are the students who have no opinion one way or the other but are open-minded. These are the students I most hope to reach by presenting them with convincing and overwhelming evidence without offending or alienating them.

Some students take offense very easily. During one lecture, a student asked a question I’ve heard many times: “If we evolved from monkeys, why are there still monkeys?” My response was and is always the same: we didn’t evolve from monkeys. Humans and monkeys evolved from a common ancestor. One ancestral population evolved in one direction toward modern-day monkeys, while another evolved toward humans. The explanation clicked for most students, but not all, so I tried another. I asked the students to consider this: Catholics are the oldest Christian denomination, and so if Protestants evolved from Catholics, why are there still Catholics? Some students laughed, some found it a clarifying example, and others were clearly offended. Two days later, a student walked down to the lectern after class and informed me that I was wrong about Catholics. He said Baptists were the first Christians and that this is clearly explained in the Bible. His mother told him so. I asked where this was explained in the Bible. He glared at me and said, “John the Baptist, duh!” and then walked away.

To truly understand evolution, you must first understand science. Unfortunately, one of the most misused words today is also one of the most important to science: theory. Many incorrectly see theory as the opposite of fact. The National Academy of Sciences provides concise definitions of these critical words: A fact is a scientific explanation that has been tested and confirmed so many times that there is no longer a compelling reason to keep testing it; a theory is a comprehensive explanation of some aspect of nature that is supported by a vast body of evidence generating testable and falsifiable predictions.

In science, something can be both theory and fact. We know the existence of pathogens is a fact; germ theory provides testable explanations concerning the nature of disease. We know the existence of cells is a fact, and that cell theory provides testable explanations of how cells function. Similarly, we know evolution is a fact, and that evolutionary theories explain biological patterns and mechanisms. The late Stephen Jay Gould said it best: “Evolution is a theory. It is also a fact. And facts and theories are different things, not rungs in a hierarchy of increasing certainty. Facts are the world’s data. Theories are structures of ideas that explain and interpret facts.”

Theory is the most powerful and important tool science has, but nonscientists have perverted and diluted the word to mean a hunch, notion, or idea. Thus, all too many people interpret the phrase “evolutionary theory” to mean “evolutionary hunch.”

Not surprisingly, I spend the first week of class differentiating theory from fact, as well as defining other critical terms. But I’m appalled by some of my colleagues who, despite being scientists, do not understand the meaning of theory. As I was preparing to teach a sophomore evolution class a few years ago, a biology colleague asked how I was going to approach teaching evolution. Specifically, he asked if I would be teaching evolution as a theory or a fact. “I will teach evolution as both theory and fact,” I said, trying hard to conceal my frustration. No matter. My colleague simply walked away, likely questioning my competence to teach the class.

Once I lay down the basics of science, I introduce the Darwinian theories of evolution. Charles Darwin was by no means the first or only to put forth evolution; others came before him including his grandfather, Erasmus Darwin, who wrote about descent with modification. Later, while Charles was amassing evidence in England for natural selection, one of the most eloquent scientific theories ever, Alfred Russel Wallace was also developing the same theory during his travels in Indonesia. But it was Charles Darwin alone who advanced the theory of descent with modification, with his bold idea that all species belong to the same tree of life and thus share a common ancestor. He also gave us sexual selection theory, which explains how evolution is shaped by competition for mates as well as choice of mates. Too often only natural selection and descent with modification are emphasized in introductory biology classes. I also cover Darwin’s theories of gradualism (including the nuance of punctuated equilibrium); descent from a common ancestor; multiplication of species; and sexual selection. I emphasize that five of the theories explain the patterns of evolution, while natural and sexual selection are the mechanisms that drive evolution.

Once the two essential strands of the class have been presented—the basic tenets of science and evolutionary theory—it’s time to tie the two together and thread them through the rest of the semester. I choose examples that will catch the class’s attention, such as the plight of the ivory-billed woodpecker, also known as the Lord God Bird due to its magnificent appearance. The story of the bird’s decline from habitat loss and hunting, and the failed efforts to save it from extinction, is riveting and heartbreaking. It pulls students in as we discuss how evolution can explain why this North American bird is so similar to a group of large South American woodpeckers, as well as Old World “ivorybills.” Students have to generate hypotheses that explain this phenomenon, and determine what evidence is needed to support or refute their hypotheses. They use a fact-gathering approach, plus all the Darwinian theories, to explain how and why these similar groups of big woodpeckers, many with sturdy white bills, live on three continents. Both scientific approach and evolutionary theory are now intertwined—an approach that is, in my opinion, essential for the teaching of biology at all levels. It does not shy away from public resistance to evolution education but stares it directly in the eye. To this end, I include a section on human evolution, a topic that, somewhat surprisingly, is avoided by many who teach evolution.

Rarely do I have a Kentucky student who learned about human evolution in high school biology. Those who did usually attended high schools in large urban centers like Louisville or Lexington. Given how easily it can provoke parents, the teaching of human evolution is a rarity in high school, so much so in Kentucky that it startled me when I first arrived. I had naively assumed it was something all students learned. I was fortunate to have attended Omaha Central High School in Nebraska, where the science teachers were excellent, and inspiring as well. They never sidestepped controversial topics relevant to their science. One teacher in particular—Creighton Steiner, the equivalent of my Funkhouser, and whom I regret not thanking in time before his passing—taught biology, earth science, and anthropology. One semester of his anthropology class was devoted entirely to human evolution. Steiner’s fascination with evolution ignited my passion for the subject. He was the first person to tell me about the age-old clash between science and religion, and how evolution was now at the heart of the conflict. He helped me realize that defending science and evolution is an obligation.

Human evolution is the greatest of all stories. It explains how we came to be. To weave the story of our ancestors and their evolutionary contributions to our existence is an exciting part of the semester. Australopithecus, Homo habilis, Homo erectus—I want my students to know these amazing beings, as well as the many more that have been discovered since my school days. The story of our evolutionary history captivates many of my students, while infuriating some. During one lecture, a student stood up in the back row and shouted the length of the auditorium that Darwin denounced evolution on his deathbed—a myth intentionally spread by creationists. The student then made it known that everything I was teaching was a lie, and stomped out of the auditorium, slamming the door behind him. A few years later during the same lecture, another student also shouted out from the back row that I was lying. She said that no transitional fossil forms had ever been found—despite my having shared images of many transitional forms during the semester. Many of her fellow students were shocked by her combativeness, particularly when she stormed out, also slamming the door behind her. Most semesters, a significant number of students abruptly leave as soon as they realize the topic is human evolution.

My classes provide an abundance of examples of how evolutionary theory explains biological phenomena, with evolutionary medicine surfacing toward the end of the semester. I focus on four basic points: our evolutionary legacy influences present-day health problems; overuse of antibiotics is causing pathogens to evolve resistance; treating conditions (fever, coughing, sneezing, diarrhea, vomiting) as symptoms of an illness can harm our health, while treating these conditions as adaptations and leaving them to run their course (unless they’re acute) can benefit our health; and the ecological phenomenon of “corridors” (not washing hands, openly sneezing and coughing, shaking hands, unprotected sex) causes pathogens to spread easily, permitting them to evolve greater virulence, while maintaining “barriers” (washing hands, covering your mouth when sneezing and coughing, not shaking hands, using condoms) causes pathogens to evolve lower virulence.

If mild fever evolved as an adaptation to “cook” pathogens, and coughing, sneezing, and diarrhea evolved to expel them, then it is unwise to use medications to suppress these adaptations. Similarly, if a virulent strain can kill a host and escape to another via a corridor, greater virulence evolves. If, however, a barrier prevents spread of this pathogen, the most virulent die along with their host, leaving only less virulent forms to survive.
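The logic of that last paragraph can be made concrete with a toy selection model, sketched below in a few lines of Python. To be clear, the model and its numbers are illustrative assumptions, not anything from Krupa’s course: it supposes that more virulent strains shed more copies of themselves, but that the stronger the barriers, the less likely a very sick host passes the pathogen on before being removed.

import random

def fitness(virulence, barrier_strength):
    # Assumed trade-off (illustrative only): more virulent strains shed more
    # (1 + 2*v), but stronger barriers make it harder for a very sick host
    # to infect anyone before dying or being isolated: (1 - v) ** barriers.
    return (1.0 + 2.0 * virulence) * (1.0 - virulence) ** barrier_strength

def mean_virulence_after_selection(barrier_strength, generations=200, pop_size=500, seed=1):
    rng = random.Random(seed)
    population = [rng.random() for _ in range(pop_size)]  # strain virulence values in [0, 1]
    for _ in range(generations):
        weights = [fitness(v, barrier_strength) for v in population]
        parents = rng.choices(population, weights=weights, k=pop_size)
        # offspring inherit their parent's virulence, plus a small mutation
        population = [min(1.0, max(0.0, v + rng.gauss(0.0, 0.02))) for v in parents]
    return sum(population) / pop_size

print("corridors (weak barriers): ", round(mean_virulence_after_selection(0.5), 2))
print("barriers (strong barriers):", round(mean_virulence_after_selection(4.0), 2))

With these made-up parameters, mean virulence settles far higher in the weak-barrier run than in the strong-barrier run, which is Krupa’s point in miniature: leave the corridors open and the nastiest strains still reach new hosts; close them and the nastiest strains die with their hosts.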

Evolutionary medicine brings the significance of evolution home. Students realize that not understanding evolution can have severe consequences to their health. If everyone understood that pathogens evolve (not develop) resistance to antibiotics when used excessively and unnecessarily, we would have fewer problems with ineffective antibiotics and highly resistant pathogens.

To explain evolutionary legacy, I point out that our physiology evolved for a hunter-gatherer diet and is not adapted for the modern Western diet, which is one reason obesity, diabetes, and other health issues are a growing problem. I also point out that a significant percentage of my students experience lower back problems—a seemingly odd phenomenon for such a young crowd. The explanation is that the vertebrate spine evolved 500 million years ago as a horizontal support structure from which internal organs hung (essentially a suspension bridge to oppose the forces of gravity). But seven million years ago, our ancestors evolved into upright walking creatures with vertical spines. Our spines no longer offset the pull of gravity, leaving our internal organs to push on our lower extremities. With the horizontal support structure gone, we are left to deal with lower back pain, ruptured disks, hemorrhoids, hernias, and varicose veins.

After a semester filled with evidence of evolution, capped off with a dose of evolutionary medicine, one might expect that every last student would understand it and accept it as fact. Sadly, this is not the case. There are those who remain convinced that evolution is a threat to their religious beliefs. Knowing this, I feel an obligation to give my “social resistance to evolution” lecture as the final topic.

This lecture lays down the history of the antiscience and anti-evolution movements, the arguments made by those opposing evolution, and why these arguments are wrong. I make it clear that one can accept evolution and maintain one's religious beliefs. They are not mutually exclusive. Among the religious groups and organizations that support the teaching of evolution are the Episcopal Church, Lutheran World Federation, United Methodist Church, Presbyterian Church, Unitarian Universalists, Roman Catholic Church, and the American Jewish Congress. In fact, 77 percent of all American Christians belong to a denomination that supports the teaching of evolution, and several high-profile evangelical Christians are ardent defenders of it, including former President Jimmy Carter and Dr. Francis Collins, director of the National Institutes of Health. Even Pope John Paul II acknowledged the existence of evolution in an article he published in The Quarterly Review of Biology, in which he argued that the body evolved, but the soul was created. Pope Francis has made it clear that he accepts evolution as well.

This lecture should put students at ease knowing that religion and science need not be at odds. Of all the lectures I give, this one provokes the most discussion after class. And yet it often results in students expressing concern that I might not be saved. I never say anything about my personal religious beliefs, yet it is assumed I am an atheist. One student told me she hoped I could find God soon. When I again pointed out that John Paul accepted evolution—and he certainly wasn’t an atheist—the student countered that Catholics aren’t Christians. Several simply let me know they will be praying for me and praying hard. One student explained that as a devout Catholic he had no choice but to reject evolution. He accused me of fabricating the pope’s statements. When I explained that he could go to the Vatican website for verification or call the Vatican to talk to a scientist, he insisted that there was no such information available from the Vatican. He then pointed his finger at me and said the only way he would believe me is if Pope John Paul II came to my class to confirm these quotes face-to-face. The student then stomped out, again slamming the auditorium door behind him.

The thing about teaching is we are never sure we are making a difference. We never know how many students have been reached. What I have never come to grips with is that no matter how hard I try to be the best teacher I can, I will fail to connect with some students. Every time a student stomps out of my auditorium slamming the door on the way, I can’t help but question my abilities. Based on evaluations from the 24,000 students I’ve taught, 8 percent of my students simply detest me, but 90 percent love my class. That makes me one of the most hated and loved professors at the university.

I’m occasionally told my life would be easier if I backed off from my relentless efforts to advance evolution education. Maybe so. But to shy away from emphasizing evolutionary biology is to fail as a biology teacher. I continue to teach biology as I do, because biology makes sense only in the light of evolution.

And it’s a message that sometimes gets through. There’s one student I can remember in particular, who took my freshman seminar on evolutionary medicine. He was an ardent evangelical Christian who believed in the literal truth of biblical creation. The seminar was very hard on him, and he struggled with the information, questioning and doubting everything we read. Several years later, our paths crossed, and we stopped for what turned out to be a long, easy chat. Now a doctor, he explained to me that, at the time, he was so upset with my seminar that he attended a number of creationist public lectures looking for evidence that I was wrong. He said he found himself embarrassed by how badly these individuals perverted Christian teachings, as well as known facts, to make their argument. He wanted me to know that he came to understand he could be a Christian and accept evolution. Then he did something that resonates with any teacher: he thanked me for opening his eyes, turning his world upside down, and blurring the line between black and white. Ω

[James J. Krupa has won several national and state teaching awards, as well as every major teaching award at the University of Kentucky, where he is a tenured professor of biology. Krupa received the following degrees (all in biology): a BS from the University of Nebraska at Omaha, an MS from the University of Nebraska-Lincoln, and a PhD from the University of Oklahoma.]

Copyright © 2015 Orion Magazine



This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2015 Sapper's (Fair & Balanced) Rants & Raves