Friday, January 20, 2017

A Wish For Inauguration Day, 2017: A Civil Annulment — STAT!

Today, The Jillster (Jill Lepore) goes on a stroll down Inaugural Lane and draws upon memories of Inaugural essays that she has written — along with companion essays by Louis Menand — [links provided below]. The Jillster's interpretation of the Inaugural ceremony portrays it as a marriage ceremony (between the Inaugural groom and the People — or, in some cases, the Constitution). This blogger will avert his eyes (and ears) from today's spectacle in Washington, DC. If this is a (fair & balanced) wish for a speedy denouement, so be it.

[x New Yorker]
Trump's Washington Wedding
By The Jillster (Jill Lepore)

An American Inauguration is like a wedding: the President is the groom, the people his bride. Donald Trump is about to pledge his troth. It didn’t always work this way, and, really, it shouldn’t. Washington isn’t Vegas.

Only lately has the American Presidency become romantic. The oath itself, established in Article II of the Constitution, is an oath of office, not a confession of love, and it doesn’t mention the American people. Instead, before Congress, the new President is supposed to pledge himself to the office and promise to protect the Constitution: “I do solemnly swear that I will faithfully execute the office of President of the United States, and will to the best of my ability, preserve, protect and defend the Constitution of the United States.”

The relationship between the first American Presidents and the American people wasn’t spousal; it was paternal. This began with George Washington. In the first draft of his Inaugural Address, Washington remarked that, having no children of his own, he would never establish a dynasty—“the Divine providence hath not seen fit that my blood should be transmitted or my name perpetuated by the endearing though sometimes seducing, channel of personal offspring”—but this also assured Americans that no one was closer in his affections. Washington addressed his remarks to Congress; the people didn’t hear them, they read them. Thomas Jefferson addressed his Inaugural to “Friends & Fellow Citizens,” though this was purely notional: he was speaking to Congress. As I wrote eight years ago, in an essay on the history of the Inaugural Address, James Monroe was the first President to be inaugurated outdoors (only because the Capitol was closed for renovations), before an audience of eight thousand, who could not possibly have heard a word he said. Andrew Jackson, in 1829, was the first President to ignore Congress and instead address his speech to the American people—“Fellow-Citizens” —twenty thousand of whom showed up to watch Chief Justice John Marshall ask Jackson to take his vows by placing his hand on a Bible. One witness described the scene: “The President took it from his hands, pressed his lips to it, laid it reverently down, then bowed again to the people—Yes, to the people in all their majesty.” In his speech, Jackson talked, a bit sentimentally, about “the habits of our Government and the feelings of our people.” When he finished, he bowed again, and rode his horse to the White House.

Except for Zachary Taylor’s, every nineteenth-century Inaugural Address mentioned the Constitution, which is what the President is actually wedding himself to. Jackson aside, there was little of romance in the ceremony itself. In Abraham Lincoln’s first Inaugural Address, he gave a brilliant and searing lecture about the powers of the different branches of government, while, upending Washington, nodding to “my rightful masters, the American people.” With Southern states already seceding, Lincoln talked about brotherhood, and about friendship. “We are not enemies, but friends,” he said. “Though passion may have strained it must not break our bonds of affection.” For Lincoln, the marriage that mattered was between the North and the South: “A husband and wife may be divorced and go out of the presence and beyond the reach of each other,” he said, “but the different parts of our country cannot do this.” And still the house divided.

Matters began to take a turn in 1889, when Benjamin Harrison, a widower, argued that, because the oath of office was public, the vow he had taken was not to the office but to the people. “The oath taken in the presence of the people becomes a mutual covenant,” Harrison said. “My promise is spoken; ours unspoken, but not the less real and solemn.” Ever since, the swearing-in has had some element of this to it, a certain bouquet, the whiff of a wedding day.

Inaugural Addresses got more intimate when they were broadcast on the radio, beginning in 1925, and on television, starting in 1949. The rhetoric during much of the early broadcast era, though, concerned a common purpose. “If I read the temper of our people correctly, we now realize, as we have never realized before, our interdependence on each other,” FDR said, in 1933. If this was a marital vow, it was that of an equal marriage. (Theodore Roosevelt gave the bride away, in 1905, when Franklin married Eleanor. “Her bouquet was of lilies of the valley,” the Times reported.) JFK’s Inauguration had an air of the nuptial about it, but only because of his youth and attractiveness. His rhetoric was fraternal, though it went beyond the fraternity of the American people to the fraternity of nations: “And so, my fellow Americans: ask not what your country can do for you—ask what you can do for your country. My fellow citizens of the world: ask not what America will do for you, but what together we can do for the freedom of man.”

Ronald Reagan began what historians call the “rhetorical Presidency.” He tended to bypass Congress and bring his policies directly to the American people. His Inauguration was the first to really look and sound like a wedding ceremony. “You and I,” he said to the American people, and vowed, “Your dreams, your hopes, your goals are going to be the dreams, the hopes, and the goals of this administration, so help me God.”

Barack Obama’s followers swooned for him, in much the way that Reagan’s did. “We, the people, declare today that the most evident of truths—that all of us are created equal—is the star that guides us still,” Obama said, in 2013, delivering an Inaugural Address that sounded more like a sermon, as if he were the minister at his own wedding.

Trump’s campaign was, from the start, an old-fashioned courtship: the kisses, the promises, the needy late-night phone calls. The chairman of his Inauguration committee promises the ceremony will have a “soft sensuality.” Trump’s attitude about the American people appears to have a lot in common with his ideas about women. “I AM YOUR VOICE,” he said in his speech at the Republican National Convention. At Trump’s third wedding, at Mar-a-Lago, in January, 2005, Billy Joel sang “Just the Way You Are.” It was a few months later that Trump talked to Billy Bush on a bus, about another courtship. “I moved on her very heavily,” he said. So help me God. ###

[Jill Lepore is the David Woods Kemper '41 Professor of American History at Harvard University as well as the chair of the History and Literature Program. She also is a staff writer at The New Yorker. Her latest books are The Story of America: Essays on Origins (2012), Book of Ages: The Life and Opinions of Jane Franklin (2013), and The Secret History of Wonder Woman (2014). Lepore earned a BA (English) from Tufts University, an MA (American culture) from the University of Michigan, and a PhD (American studies) from Yale University.]

Copyright © 2017 The New Yorker/Condé Nast Digital

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Thursday, January 19, 2017

The Dismal Science's Hobgoblin

Very few people have "no opinion" when the issue of increasing the minimum wage is raised. In today's post, law professor James Kwak takes issue with the standard appeal to the law of supply and demand that is beloved by those who oppose a minimum wage increase. Most of the stupids would uphold the idea that a minimum wage increase would distort the "iron law of wages." If this is a (fair & balanced) critique of magical thinking, so be it.

[x The Atlantic]
The Curse Of Econ 101
By James Kwak

[This essay was adapted from James Kwak’s book, Economism: Bad Economics and the Rise of Inequality.]

In a rich, post-industrial society, where most people walk around with supercomputers in their pockets and a person can have virtually anything delivered to his or her doorstep overnight, it seems wrong that people who work should have to live in poverty. Yet in America, there are more than ten million members of the working poor: people in the workforce whose household income is below the poverty line. Looking around, it isn’t hard to understand why. The two most common occupations in the United States are retail salesperson and cashier [PDF]. Eight million people have one of those two jobs, which typically pay about $9–$10 per hour. It’s hard to make ends meet on such meager wages. A few years ago, McDonald’s was embarrassed by the revelation that its internal help line was recommending that even a full-time restaurant employee apply for various forms of public assistance.

Poverty in the midst of plenty exists because many working people simply don’t make very much money. This is possible because the minimum wage that businesses must pay is low: only $7.25 per hour in the United States in 2016 (although it is higher in some states and cities). At that rate, a person working full-time for a whole year, with no vacations or holidays, earns about $15,000—which is below the poverty line for a family of two, let alone a family of four. A minimum-wage employee is poor enough to qualify for food stamps and, in most states, Medicaid. Adjusted for inflation, the federal minimum is roughly the same as in the 1960s and 1970s, despite significant increases in average living standards over that period. The United States currently has the lowest minimum wage, as a proportion of its average wage, of any advanced economy, contributing to today’s soaring levels of inequality. At first glance, it seems that raising the minimum wage would be a good way to combat poverty.
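The annual-earnings figure above can be checked with simple arithmetic. The short sketch below assumes a 40-hour week worked all 52 weeks of the year (the essay's "no vacations or holidays" scenario); the constants are taken from the essay, and the script is purely illustrative.

```python
# Rough check of full-time annual earnings at the 2016 federal minimum wage.
# Assumes a 40-hour week and a 52-week year with no time off (illustrative only).
HOURLY_MINIMUM = 7.25   # federal minimum wage, 2016
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 52

annual_earnings = HOURLY_MINIMUM * HOURS_PER_WEEK * WEEKS_PER_YEAR
print(f"Annual earnings: ${annual_earnings:,.0f}")  # prints "Annual earnings: $15,080"
```

That comes to $15,080 — "about $15,000," as the essay says, and below the federal poverty line for a family of two.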

The argument against increasing the minimum wage often relies on what I call “economism”—the misleading application of basic lessons from Economics 101 to real-world problems, creating the illusion of consensus and reducing a complex topic to a simple, open-and-shut case. According to economism, a pair of supply and demand curves proves that a minimum wage increases unemployment and hurts exactly the low-wage workers it is supposed to help. The argument goes like this: Low-skilled labor is bought and sold in a market, just like any good or service, and its price should be set by supply and demand. A minimum wage, however, upsets this happy equilibrium because it sets a price floor in the market for labor. If it is below the natural wage rate, then nothing changes. But if the minimum (say, $7.25 an hour) is above the natural wage (say, $6 per hour), it distorts the market. More people want jobs at $7.25 than at $6, but companies want to hire fewer employees. The result: more unemployment. The people who are still employed are better off, because they are being paid more for the same work; their gain is exactly balanced by their employers’ loss. But society as a whole is worse off, as transactions that would have benefited both buyers and suppliers of labor will not occur because of the minimum wage. These are jobs that someone would have been willing to do for less than $6 per hour and for which some company would have been willing to pay more than $6 per hour. Now those jobs are gone, as well as the goods and services that they would have produced.
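The textbook story in the paragraph above can be made concrete with a toy model. The linear supply and demand curves below are invented for illustration — they are not drawn from the essay — but they reproduce its numbers: a "natural" wage of $6 and a floor at $7.25.

```python
# Toy linear labor market illustrating the Econ 101 price-floor argument.
# Units of labor are arbitrary; all curve parameters are made up for illustration.

def demand(wage):
    """Labor firms want to hire at a given wage (falls as the wage rises)."""
    return 100 - 10 * wage

def supply(wage):
    """Labor offered by workers at a given wage (rises with the wage)."""
    return 10 * wage - 20

# Equilibrium ("natural") wage: 100 - 10w = 10w - 20  ->  w = 6
natural_wage = 6.0
employment_at_equilibrium = demand(natural_wage)        # 40 units of labor

# Impose a binding floor above the natural wage, as in the essay's example.
minimum_wage = 7.25
employment_at_floor = demand(minimum_wage)              # 27.5 -> firms hire less
job_seekers_at_floor = supply(minimum_wage)             # 52.5 -> more people want jobs
unemployed = job_seekers_at_floor - employment_at_floor # 25 left without work
```

On these made-up curves, the floor cuts employment from 40 units to 27.5 while drawing more people into the job market — exactly the "more unemployment" conclusion of the two-curve diagram. The rest of the essay explains why real labor markets need not behave like two straight lines.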

The minimum wage has been a hobgoblin of economism since its origins. Henry Hazlitt wrote in Economics in One Lesson (1946; PDF), “For a low wage you substitute unemployment. You do harm all around, with no comparable compensation.” In Capitalism and Freedom (1962; reissued 1982 and 2002), Milton Friedman patronizingly described the minimum wage as “about as clear a case as one can find of a measure the effects of which are precisely the opposite of those intended by the men of good will who support it.” Because employers will not pay people more money than their work is worth, he continued, “insofar as minimum-wage laws have any effect at all, their effect is clearly to increase poverty.” Jude Wanniski similarly concluded in The Way the World Works (1978), “Every increase in the minimum wage induces a decline in real output and a decline in employment.” On the campaign trail in 1980, Ronald Reagan said, “The minimum wage has caused more misery and unemployment than anything since the Great Depression.” Think tanks including Cato, Heritage, and the Manhattan Institute have reliably attacked the minimum wage for decades, all the while emphasizing the key lesson from Economics 101: Higher wages cause employers to cut jobs.

In today’s environment of increasing economic inequality, the minimum wage is a centerpiece of political debate. California, New York City, and Seattle are all raising their minimums to $15, and President Barack Obama called for a federal minimum of $10.10. An army of commentators has responded by reminding us of what we should have learned in Economics 101. In The Wall Street Journal, the economist Richard Vedder explained, “If the price of something rises, people buy less of it—including labor. Thus governmental interferences such as minimum-wage laws lower the quantity of labor demanded.” Writing for Forbes, Tim Worstall offered a mathematical proof: “A reduction in wage costs of some few thousand dollars increases employment. Obviously therefore a rise in wage costs of four or five times that is going to have significant unemployment effects. QED: A $15 minimum wage is going to destroy many jobs.” (Of theoretical arguments in favor of a higher minimum wage, he continued, “I’m afraid I really just don’t believe those arguments.”) Jonah Goldberg of the American Enterprise Institute and National Review chimed in, “A minimum wage is no different from a tax on firms that use low-wage and unskilled labor. And if there’s anything that economists agree upon, it’s that if you tax something you get less of it.”

The real impact of the minimum wage, however, is much less clear than these talking points might indicate. Looking at historical experience, there is no obvious relationship between the minimum wage and unemployment: adjusted for inflation, the federal minimum was highest from 1967 through 1969, when the unemployment rate was below 4 percent—a historically low level. When economists try to tackle this question, they come up with all sorts of results. In 1994, David Card and Alan Krueger evaluated an increase in New Jersey’s minimum wage [PDF] by comparing fast-food restaurants on both sides of the New Jersey-Pennsylvania border. They concluded, “Contrary to the central prediction of the textbook model ... we find no evidence that the rise in New Jersey’s minimum wage reduced employment at fast-food restaurants in the state.”

Card and Krueger’s findings have been vigorously contested across dozens of empirical studies. Today, people on both sides of the debate can cite papers supporting their position, and reviews of the academic research disagree on what conclusions to draw. David Neumark and William Wascher, economists who have long argued against the minimum wage, reviewed more than one hundred empirical papers in 2006 [PDF]. Although the studies had a wide range of results, they concluded that the “preponderance of the evidence” indicated that a higher minimum wage does increase unemployment. On the other hand, two recent meta-studies (which pool together the results of multiple analyses) have found that increasing the minimum wage does not have a significant impact on employment. In the past several years, a new round of sophisticated analyses comparing changes in employment levels between neighboring counties also found “strong earnings effects and no employment effects of minimum wage increases.” (That is, the number of jobs stays the same and workers make more money.) Not surprisingly, Neumark and Wascher have contested this approach. The profession as a whole is divided on the topic: When the University of Chicago Booth School of Business asked a panel of prominent economists in 2013 whether increasing the minimum wage to $9 would “make it noticeably harder for low-skilled workers to find employment,” the responses were split down the middle.

The idea that a higher minimum wage might not increase unemployment runs directly counter to the lessons of Economics 101. According to the textbook, if labor becomes more expensive, companies buy less of it. But there are several reasons why the real world does not behave so predictably. Although the standard model predicts that employers will replace workers with machines if wages increase, additional labor-saving technologies are not available to every company at a reasonable cost. Small employers in particular have limited flexibility; at their scale, they may not be able to maintain their operations with fewer workers. (Imagine a local copy shop: No matter how fast the copy machine is, there still needs to be one person to deal with customers.) Therefore, some companies can’t lay off employees if the minimum wage is increased. At the other extreme, very large employers may have enough market power that the usual supply-and-demand model doesn’t apply to them. They can reduce the wage level by hiring fewer workers (only those willing to work for low pay), just as a monopolist can boost prices by cutting production (think of an oil cartel, for example). A minimum wage forces them to pay more, which eliminates the incentive to minimize their workforce.

In the above examples, a higher minimum wage will raise labor costs. But many companies can recoup cost increases in the form of higher prices; because most of their customers are not poor, the net effect is to transfer money from higher-income to lower-income families. In addition, companies that pay more often benefit from higher employee productivity, offsetting the growth in labor costs. Justin Wolfers and Jan Zilinsky identified several reasons why higher wages boost productivity: They motivate people to work harder, they attract higher-skilled workers, and they reduce employee turnover, lowering hiring and training costs, among other things. If fewer people quit their jobs, that also reduces the number of people who are out of work at any one time because they’re looking for something better. A higher minimum wage motivates more people to enter the labor force, raising both employment and output. Finally, higher pay increases workers’ buying power. Because poor people spend a relatively large proportion of their income, a higher minimum wage can boost overall economic activity and stimulate economic growth, creating more jobs. All of these factors [PDF] vastly complicate the two-dimensional diagram taught in Economics 101 and help explain why a higher minimum wage does not necessarily throw people out of work. The supply-and-demand diagram is a good conceptual starting point for thinking about the minimum wage. But on its own, it has limited predictive value in the much more complex real world.

Even if a higher minimum wage does cause some people to lose their jobs, that cost has to be balanced against the benefit of greater earnings for other low-income workers. A study by the Congressional Budget Office (CBO) estimated that a $10.10 minimum would reduce employment by 500,000 jobs but would increase incomes for most poor families, moving 900,000 people above the poverty line. Similarly, a recent paper by the economist Arindrajit Dube [PDF] finds that a 10 percent raise in the minimum wage should reduce the number of families living in poverty by around 2 percent to 3 percent. The economists polled in the 2013 Chicago Booth study thought that increasing the minimum wage would be a good idea because its potential impact on employment would be outweighed by the benefits to people who were still able to find jobs. Raising the minimum wage would also reduce inequality by narrowing the pay gap between low-income and higher-income workers.

In short, whether the minimum wage should be increased (or eliminated) is a complicated question. The economic research is difficult to parse, and arguments often turn on sophisticated econometric details. Any change in the minimum wage would have different effects on different groups of people, and should also be compared with other policies that could help the working poor—such as the negative income tax (a cash grant to low-income households, similar to today’s Earned Income Tax Credit) favored by Milton Friedman, or the guaranteed minimum income that Friedrich Hayek assumed would exist.

Nevertheless, when the topic reaches the national stage, it is economism’s facile punch line that gets delivered, along with its all-purpose dismissal: people who want a higher minimum wage just don’t understand economics (although, by that standard, several Nobel Prize winners don’t understand economics). Many leading political figures largely repeat the central theses of economism, claiming that they have only the best interests of the poor at heart. In the 2016 presidential campaign, Senator Marco Rubio opposed increasing the minimum wage because companies would then substitute capital for labor: “I’m worried about the people whose wage is going to go down to zero because you’ve made them more expensive than a machine.” Senator Ted Cruz also chimed in on behalf of the poor, saying, “the minimum wage consistently hurts the most vulnerable.” Senator Rand Paul explained, “when the [minimum wage] is above the market wage it causes unemployment” because it reduces the number of employees whom companies can afford to hire. The former governor Jeb Bush also invoked Economics 101, saying that wages should be left “to the private sector,” meaning companies like Walmart, which “raised wages because of supply and demand.” For Congressman Paul Ryan, raising the minimum wage is “bad economics” and “will hurt the economy because it raises the price of labor.”

This conviction that the minimum wage hurts the poor is an example of economism in action. Economists have many different opinions on the subject, based on different theories and research studies, but when it comes to public debate, one particular result of one particular model is presented as an unassailable economic theorem. (Politicians advocating for a higher minimum wage, by contrast, tend to avoid economic models altogether, instead arguing in terms of fairness or helping the poor.) This happens partly because the competitive market model taught in introductory economics classes is simple, clear, and memorable. But it also happens because there is a large interest group that wants to keep the minimum wage low: businesses that rely heavily on cheap labor.

The restaurant industry has been a major force behind the advertising and public relations campaigns opposing the minimum wage, including many of the op-ed articles repeating the basic lesson of supply and demand. For example, Andy Puzder, the CEO of a restaurant company (and President-elect Trump’s nominee to lead the Labor Department), explained in The Wall Street Journal, “Every retailer has locations that are profitable, but only marginally. Increased labor costs can push these stores over the line and into the loss column. When that happens, companies that want to stay competitive will close them.” As a result, “broad increases in the minimum wage destroy jobs and hurt the working-class Americans that they are supposed to help.” A recent study by researchers at the Cornell School of Hotel Administration, however, found that higher minimum wages have not affected either the number of restaurants or the number of people that they employ, contrary to the industry’s dire predictions, while they have modestly increased workers’ pay. Because restaurant closings do not seem to increase, the implication is that paying employees more cuts into excess profits—profits beyond those necessary to stay in business. Or, as the financial commentator Barry Ritholtz put it, “raising the minimum wage works as a wealth transfer, from shareholders and franchisees, to minimum wage workers.” But instead of greedily demanding higher profits, industry executives can invoke Economics 101, which provides a simple explanation of the world that serves their interests.

The fact that this is the debate already demonstrates the historical influence of economism. Once upon a time, the major issue affecting workers’ wages and income inequality was unionization. In the 1950s, about one in every three wage and salary employees was a union member. Unions, of course, were an early and frequent target of economism. Hayek argued that unions are bad both for workers, because “they cannot in the long run increase real wages for all wishing to work above the level that would establish itself in a free market,” and for society as a whole, because “by establishing effective monopolies in the supply of the different kinds of labor, the unions will prevent competition from acting as an effective regulator of the allocation of all resources.” For Friedman, unions “harmed the public at large and workers as a whole by distorting the use of labor” while increasing inequality even within the working class. The changing composition of the U.S. workforce, state right-to-work laws, and aggressive anti-unionization tactics by employers—increasingly tolerated by the National Labor Relations Board, beginning with the Reagan administration—all contributed to a long, slow fall in unionization levels. By 2015 [PDF], only 12 percent of wage and salary employees were union members—fewer than 7 percent in the private sector. Low- and middle-income workers’ reduced bargaining power is a major reason why their wages have not kept pace with the overall growth of the economy. According to an analysis by the sociologists Bruce Western and Jake Rosenfeld [PDF], one-fifth to one-third of the increase in inequality between 1973 and 2007 results from the decline of unions.

With unions only a distant memory for many people, federal minimum-wage legislation has become the best hope for propping up wages for low-income workers. And again, the worldview of economism comes to the aid of employers by abstracting away from the reality of low-wage work to a pristine world ruled by the “law” of supply and demand. ###

[James Kwak is a professor at the University of Connecticut School of Law. His most recent book is Economism: Bad Economics and the Rise of Inequality (2017). He received an AB (history, magna cum laude) from Harvard University; a PhD (history) from the University of California at Berkeley; and a JD from Yale Law School.]

Copyright © 2017 The Atlantic Monthly Group

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, January 18, 2017

Today's Burning Question: Who Moved My -Cheese- Blog?

In these dark and troubled days, a pervasive sense of helplessness is not uncommon for this blogger. If this is a (fair & balanced) consideration of life's basic dilemma, so be it.

[x Hedgehog Review]
The Necessity Of Self-Help Lit
By Joseph E. Davis

They have a “reputation for nonsense,” these “bumper-sticker books.” Dashed off in “pop-culture” prose without “analytical rigor,” they offer up “useless platitudes” and “false promises.” Their “portentous pronouncements” convey a “neoliberal” message of “radical privatization,” built on an image of persons as “autonomous monads.” Their authors are “snake-oil peddlers” and “self-appointed gurus,” ringing up tidy profits by “preying on an unwary public.” People’s openness to their message signals a “trend toward authoritarianism,” while the effect of their “misleading quackery” has been to foster “relational detachment” and an “inward retreat” to “self-absorption.” Against an older ethic of “individualists in a common struggle,” they have promoted an “apolitical movement” and growing “social disengagement.” Their “new age sophistries” are “sapping our nation’s soul.”

Such characterizations of self-help literature, drawn from journalists and intellectuals, give a sense of the “healthy contempt” that flows from their pens, at least on those occasions when they give the genre any attention at all. It should be noted that most of the books, especially in the areas of medicine, psychology, and popular science, are written by people with advanced degrees (they display them on the cover), and many of the works in business and management are written by seasoned professionals in these two fields. In fact, one of the fastest-growing sub-categories of self-help lit consists of books of neuroscience and positive psychology that are peppered with scientific theories, experimental findings, and brain images. Many of the readers, and there are millions, are well educated. Nonetheless, intellectuals tend to dismiss the whole class of such books for promoting a new and fanatical project of self-creation, itself sustained by the illusion of self-sufficiency and self-mastery.

Because “self-help” is such a loose genre, generalizations about it are bound to be overly broad. It encompasses many different types of books dealing with spirituality, work, personal relationships, health, and what Dwight Macdonald once called “howtoism.” Self-helpery does not speak with a single voice. As folklorist Sandra Dolby shows, for instance, in her 2005 study Self-Help Books: Why Americans Keep Reading Them, the authors of such works draw on any one of at least four different concepts of the self: the detached self, as in books influenced by Eastern philosophy; the wounded self, which is common in books of popular psychology; the social self, which is often encountered in books on work in the corporate world, with some emphasis on “giving back”; and the obligated self, presupposed in books about spiritual growth and enrichment that tend to emphasize an individual religious duty to seek self-improvement. While the entire genre cannot be reduced to endless variations on the theme of Robert Ringer’s 1977 bestseller Looking Out for #1, it nevertheless conveys a broad common message that runs something like this: Life is a reflexive project, self-defined (and redefined) according to values and courses of action freely chosen, and divested as much as possible from the determining influence of family, cultural conditioning, and old habits of thought.

On the whole, then, the self-help message comes close to encouraging the project of radical self-creation that its critics find so objectionable. But in what sense is this project new or neoliberal or, for that matter, unique to self-help books? Isn’t this the “masterless” or “sovereign” self so commonly encouraged by thinkers of the eighteenth and nineteenth centuries? How is self-help different from the creative self-making that was Rousseau’s answer to the question “Who am I?”—the same Rousseau who, in his Confessions, identified the succession of his own feelings as the “one faithful guide on which I can depend”? Are the autonomy claims for the self in self-help stronger than the self-defining and reality-creating power accorded the imagination in the writings of Romantics such as William Wordsworth and Ralph Waldo Emerson? Although more hardheaded, John Stuart Mill provided the classic modern definition of freedom: “pursuing our own good in our own way,” so long as it does not deprive or interfere with the liberty of others. Individuality, for Mill, was exercised to the degree that a person’s “desires and impulses” and “plan of life” were self-chosen and self-cultivated independent of the “despotism of custom.” If the sovereign self of self-help books is an old ideal, is there anything actually new in them? That they are so widely and continuously read, and found helpful, suggests that more is on offer than rehashed self-talk.

A clue may lie less in the various solutions these works offer—exercise regimes, diets, spiritual visions, cognitive strategies to overcome faulty thinking or to assist emotional control, time management techniques, guides to intimacy, and so on—than in the cultural analysis they provide. By cultural analysis, I mean both the writers’ stated critique and their more implicit normative guidance. The stated critique, as Dolby observes, sets up the solution, and since some form of cultural conditioning is usually the root problem, the author presents a survey of the landscape that identifies the “traditions or conventional ways of thinking” that are part of the culture and have been harmfully internalized by the reader. The proposed solution, the enlightened thinking, follows deductively. And in the process, the reader is given a little lesson in “how cultural conditioning works.”

Normative guidance also concerns the dynamics of the changing social world in which readers live. While the ideal of the sovereign self is old, the social conditions that make constant reflexivity and instrumental relationships a virtual necessity are not. What is striking about the earlier authors who championed self-invention is just how much social and cultural stability they took as given, a stability they attributed variously to the natural world and human nature. Mill, like all these writers, assumed the class standing, material prosperity, and (male) freedom that—with its necessary dabbling and self-cultivation—his notion of individuality required. He assumed a background culture that would produce the qualities in (most) people that he took for granted. People were, in his understanding, generally self-restrained, rational, well-meaning, and energetic. In his 1859 book On Liberty, Mill wrote from the assumption that children were confidently “taught and trained” on the basis of the “ascertained results of human experience.” This shared “mode of existence” included a cultural consensus on traditions, customs, and practices; the individualist was one who deviated from this basic formation and the weight of its authority. Moreover, Mill viewed most people most of the time as simply conforming, “ape-like,” to prescribed roles and life plans, and living lives more or less bound by the confines of family, faith, community, custom, and tradition. He took for granted a social world in which such a nestled existence could take place.

That world, if never quite so determinative, has progressively vanished. And self-help books are both an indication of its disappearance and, critically, a response to it. Their normative guidance is their instruction in the principles and precepts by which life is now being lived. Norms are shared definitions of desirable behavior, standards for conduct under various circumstances. Amid a broad and continuous array of social changes and economic dislocations, more and more of daily life requires individual decisions. Much of private life has been deinstitutionalized; the social rites and rituals that superintended life transitions such as adulthood, courtship, and death have disappeared; long-term commitments have weakened; and the vibrancy of “communities of memory” has diminished. In any aspect of life, a variety of potentially incompatible norms are in play. And with the social world in flux, all the rules are in some degree of flux as well. Individuals are faced with the difficult and ongoing task of knowing what it is good and right to do in conducting their affairs.

Self-help witnesses to this need and addresses it. In this respect, it is less about bettering oneself than it is about help-seeking. People turn to self-help books in the face of challenges in intimate relationships, work, lifestyle decisions, self-care, emotional management, and much more. These works are especially likely to be turned to in moments of tough transition—being laid off, divorcing, breaking up, experiencing problems with physical or mental health, when the need for help and direction can be particularly intense. But in suggesting what to do and how to do it, self-help authors also articulate their sense of the way things are now. They convey what they see as the relevant cultural norms and show how, in this rough-and-tumble environment, syncing one’s own behavior with those norms will both increase success and—just as importantly—avert failure, embarrassment, and hurt.

One brief but indicative example from the work-and-management sub-genre is Spencer Johnson’s 1998 runaway bestseller (more than 10 million copies in print) Who Moved My Cheese? An A-Mazing Way to Deal with Change in Your Work and in Your Life. The “cheese” of the title is a metaphor for what people desire to have in life, the things—a career, health, recognition, a family, and so on—they believe will make them happy. The book was written as a parable for how people should respond when circumstances change and interfere with or derail their cherished dreams. According to Johnson, a medical doctor, the norm—or at least the one successful people live by—is to expect change, in any aspect of life at any time, and welcome it. Living by this standard involves a number of other norms, most of which Johnson leaves implicit. They include living with few if any strong attachments, entering only lightly into relationships and on purely contractual terms, managing life with relatively little support or need for care, disengaging from status relations, and living life forward, without resistance, rumination, or regret. Follow those rules, Johnson intimates, and there will always be better things ahead.

Does Johnson himself live—or really expect that others will live—in such a fully detached, free-floating way? Probably not. Providing a full account of the good life is not his purpose. His purpose is to engage people in the everyday world, especially the white-collar workplace, where they have been burned or made vulnerable, where employers neither extend nor expect loyalty, where decisions that affect employment and career are made by executives you will never meet, where seniority and acquired knowledge may actually be a liability, where keeping your job may require relocating across the country, where impression management may be more important than performance. Who has not dealt with such things? Johnson is offering a survival guide for a world in which unexpected and unwelcome change is common.

At the same time, he is not suggesting that his eyes-wide-open approach constitutes looking into the abyss. To the contrary, his message, as in all self-help books, is ultimately about basic trust. To Johnson, the world is not capricious. Given our social circumstances, the norms he points to are better guides to fulfillment—now you know how to proceed. And if you trust yourself and trust the cosmos, everything will always work out for the good.

In self-help books like Johnson’s, we find the old sovereign self but in a new key. It now appears less an ideal to strive for, less the object of heroic struggle, than simply an obligation and survival strategy. To be sure, there is in such self-help a promise of fulfillment in the self-making. But cultivating a sharp and reflexive self-awareness is also how you navigate the shoals of social disorder and institutional change, and deal with (and even manipulate) other people. Getting on in life requires a careful investment policy for your commitments, your emotional attention, and your expectations. Ideas like these are not unique to self-help, but books like Who Moved My Cheese? give them a particularly stark expression. By comparison, what Mill says of individuality and the freedom to choose one’s course sounds like little more than the freedom to be eccentric.

While the self of self-help remains masterless, it is not masterful. The critics are wrong in characterizing the message of self-help as encouraging self-absorption or a withdrawal from civic engagement. That is not the message. As Sandra Dolby observes, “Even the most crassly self-centered books advocate doing something for others, of engaging in social, community, and environmental service.” On the other hand, the self-sufficiency and self-mastery promoted in much self-help are, as critics suggest, illusions. We do not make ourselves, and we cannot validate ourselves. Imagining that we can will bring not independence or confidence or trust, but, ironically, a lot of anxious searching for the approval of others—or a return to the self-help literature for the key to that elusive autonomy. If self-help books tend to reproduce the very problems they seek to address, they also respond to a felt need and are meant to help people navigate real problems. We would do well to understand and acknowledge, rather than ridicule, what is at stake. ###

[Joseph E. Davis is Research Associate Professor of Sociology at the University of Virginia as well as the Director of Research and Publisher of The Hedgehog Review. He is the author of Accounts of Innocence: Sexual Abuse, Trauma, and the Self (2005), which was the co-winner of the 2006 Cooley Award given by the Society for the Study of Symbolic Interaction. Davis received a BA (anthropology, summa cum laude) from the University of Minnesota and a PhD (sociology) from the University of Virginia.]

Copyright © 2017 The Hedgehog Review / Institute for Advanced Studies in Culture (IASC)

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, January 17, 2017

Today, We Have A Taxonomy Of Congressional Stupids

As the darkest day of this blogger's times approaches, T. A. Frank offers a scouting report on who will be loyal or disloyal to Il Douche. Of course, this blogger wishes misfortune on all of their houses, but there is a morbid curiosity that is akin to watching a horror film. If this is (fair & balanced) political analysis, so be it.

[x VF]
What The GOP Really Thinks Of Trump Or, Frenemies Of Trump
By T. A. Frank

TagCrowd cloud of the following piece of writing

For all that liberals insist that Donald Trump was the inevitable culmination of trends in the conservative movement and the Republican Party, he just wasn’t, and isn’t. Ted Cruz was the apogee of the conservative movement, and Marco Rubio was the apogee of the GOP. Trump is his own concoction, less a creature of fixed politics than an oddball with spectacular intuition and (to no one’s benefit) crippling impulses. Republicans are thrilled with the election but, in many cases, almost as apprehensive as Democrats. With Rubio in office, they would have had a genial by-the-numbers politician with a suitcase full of GOP study guides and almost no mind of his own. With Trump in office, they get a mystery.

I’ve had conversations with a number of Republicans over the past couple of weeks, and they all seem to have the same questions the rest of us do. Will Trump’s coalition be one of establishment Republicans and rebel Republicans? Or one of rebel Republicans and Bernie Sanders Democrats? Or one of something else entirely? Trump doesn’t know, and neither does anyone else. “I think we are feeling our way along,” said Deputy Majority Whip Tom Cole. “Just looking at it from a whip standpoint, it’s going to be a very different dynamic in terms of putting together a Republican coalition.”

The Senate is a beast of its own, and some of Trump’s fiercest enemies there are fellow Republicans like Lindsey Graham and John McCain. So let’s focus in this column on Republicans in the House. Very roughly speaking, what awaits Trump there are three groups. The first, a small one, loves his populist vision and intends to hold him to it on all fronts. The second, slightly larger, is made up of Republicans who are anywhere from half to three-quarters on board—they like Trump’s line on trade, or immigration, or nationalism more broadly, while dissenting on Trumpian policies on spending, or taxes, or tariffs, or Russia. A third faction doesn’t buy into populism at all and seems to view Trump like an uncontrolled bull, one they hope to rig up to a generator and harness for GOP energy. I’ll call them the Trumpists, the Freedomists, and the Ryanists.

Start with the Trumpists. Prior to the ascension of Trump, and before it had a name, Trumpism—a Pat Buchanan–esque philosophy of economic and military self-containment—was just one school of thought among Republican outliers in the House and Senate. Those who easily fit the category were few in number—fewer than five, would be my guess, and arguably as few as zero, if you define it narrowly enough. Jeff Sessions, who in 2013 advised Republicans to choose a “humble and honest populism” over Gang of Eight–style immigration bills, is one of them. Tennessee congressman Jimmy Duncan, a trade skeptic and reliable foe of illegal immigration—plus one of few Republicans to vote against authorizing George W. Bush to go to war with Iraq—is arguably another. There are a few more. But, again, it’s a small group.

This makes the Trumpists important mainly as keepers of the flame. Whatever Trump does, he wants to keep this group on board. One line that I encountered when speaking to people in this orbit was that deficit spending on infrastructure would be necessary as a bandage during hard times. That is to say: putting the brakes on globalization—with tariffs, revised trade deals, and stricter immigration control—could play near-term havoc with the economy, even if it brings longer-term benefits. The way to ease the transition is to create lots of jobs—in the construction and repair of roads, bridges, tunnels, rail lines, and airports. While that is going on—in this hopeful scenario—the private sector will complete most of its adaptations and emerge in a couple of years ready to hire, with shiny new roads and bridges at its disposal to boot. This would require tolerating considerable deficit spending, which could mean losing Republican support but gaining some among Democrats, especially those who represent working-class districts.

It’s all very simple, in theory, but such plans run with a thud into group two, which I’ll call the “Freedomists.” (No one in Congress, to my knowledge, goes by such a label, but I’m using it as a catchall for Republicans who dissent from the establishment.) These include the Tea Party caucus, although it exists more in name than in action, and the House Freedom Caucus, which was founded two years ago and has about 30 members. The Freedomists generally espouse limited government, and they have rebelled against Republican leadership on various issues, leading the charge to oust John Boehner as House Speaker in 2015. But the strongest glue bonding them has been fiscal hawkishness. (South Carolina congressman Mick Mulvaney, who is among their number, will be Trump’s director of the Office of Management and Budget.)

Many of the Freedomists are sympathetic to Trump. They know what it’s like to battle the establishment, and most see themselves as advocates for the little guy. Virginia congressman Dave Brat, famous as the underdog who defeated donor-class favorite Eric Cantor during a primary in 2014, is among them. Brat and his supporters view illegal immigration as a gift to the cheap-labor lobby, which, as Brat reminded me in conversation, gets all the benefits of low-paid employees while palming off the large attendant costs—an average of $10,000 a year to send each child of these workers to school—on middle-class taxpayers. Brat is also generally excited by the populism of the Trump movement and told me that he fears mainly that the kludgeocracy of Washington will impede efforts to create real change. But if Trump is hoping to levy trade tariffs or raise the debt ceiling, Brat is unlikely to join him. “When it comes to sticking points, the debt ceiling is going to be it,” he says. “There would have to be some credible commitment to a pro-growth corporate-rate bill that has a trillion in repatriation or something like that to get my buy-in. Otherwise, it’s a no.”

As Brat and many other Freedomists see it, doing away with regulations that hamstring U.S. industry will make it competitive and equip it to fight off competition from China. In this view, no tariffs will be required, nor will we need any infrastructure stimulus, at least not one that involves increasing deficits. Simply cutting red tape and regulations will unleash an economic boom in itself and revive labor markets at home. If there are tough times for a year or two, we ride them out. Proposals to increase deficit spending will therefore cause a lot of Freedomists to jump ship, and some of them, like Walter Jones, have a record of doing so even when George W. Bush was in power. Since they are over 30 in number, the Freedomists can stand in the way of party-line legislation. Quite possibly, then, Trump will find that the Freedom Caucus are supporters in spirit but obstacles in practice.

This leaves the establishment GOP, now called the Ryanists. In theory, the Ryanist GOP is Trump’s biggest headache, since it’s as in thrall to Bushism today as it was 15 years ago, happy to continue down the current path on trade, war, and immigration, with a repeal of Obamacare and cuts to Social Security and (by using vouchers) Medicare to boot. In practice, though, this faction of the party, which is by far the largest, is probably easiest to work with on deals. The question, then, is who trades what. If Trump gets his infrastructure bill, will the establishment demand a swifter repeal of Obamacare, for instance? The answer depends a lot on clout. Supporters of Trump would argue that the recent election proved that most Republican voters—to say nothing of most Americans—reject the establishment priorities. Supporters of the establishment would argue that establishment Republicans keep getting elected, so voters are with them.

Working in favor of the establishment members of the House and Senate is that the money is largely on their side, that Washington doesn’t change, and that they have much more seasoning in politics than Trump. Most try to live by the famous adage of Texas legislator Samuel Ealy Johnson Jr., father of Lyndon, who said, “You can’t be in politics unless you can walk in a room and know in a minute who’s for you, and who’s against you.” For all of Trump’s success in business, no one knows if he’s got anything close to Sam Johnson’s skill when it comes to navigating the labyrinth of legislative deal-making, even if he’s got Mike Pence at his side.

In Trump’s favor, though, is that he’s got an army of ardent fans who are prepared to direct a storm at anyone who defies their man. This is one reason why Trump went on a thank-you tour after the election. It was to keep these supporters energized for the clash of swords that starts in January. Also, despite having the thinnest skin of any politician ever to advance beyond the neighborhood-council level, Trump comes across as much shrewder than Ryan, even when he’s weakened. In early October, when Trump was in the biggest trouble he’d ever faced, Ryan still seemed like the feckless one, while Trump stayed on the warpath. “Disloyal R’s are far more difficult than Crooked Hillary,” he tweeted on October 11. “They come at you from all sides. They don’t know how to win—I will teach them!” And, arguably, he did just that.

Watching this battle play out will offer us fascinating lessons on the workings of power. (Forgive the cold word choice, but it’s a bleak truth that periods of human suffering and peril are those to which historians are most drawn—hence the curse of “interesting times.”) “I think most Republicans thought they were going to be a firewall against Hillary Clinton’s overreach,” says Cole, the deputy majority whip. “Now all of a sudden they’ve got to be the point of the spear. And they’re going to be confronted with some dilemmas they didn’t anticipate.”

While the skirmishes and scheming will take place in whispered conferences in rooms all around downtown Washington, DC, what comes to mind is an image of armies, swords drawn, unleashing a war cry and launching headlong into a battle of all against all. This all starts the day after the inauguration. You can almost hear the horses stomping. ###

[T. A. Frank, a Vanity Fair contributor who covers politics and policy, also has written for The New Republic, The American Prospect, The Weekly Standard, The Christian Science Monitor, and The Washington Monthly. In 2010, Frank ended a term as an Irvine Fellow at the New America Foundation. He holds a BA (East Asian studies) from Columbia University.]

Copyright © 2017 Vanity Fair/Condé Nast Digital

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Monday, January 16, 2017

Today, Tom Tomorrow (Dan Perkins) Goes From The Sublime ("Incredible Trump") To The Sublimer ("Unbelievable Baby-Man")

In an e-mail message that accompanied this week's TMW 'toon, Tom Tomorrow (Dan Perkins) wrote:

I’ve decided to retire the “Incredible Trump” character, at least for awhile — at this point, it almost seems too flattering a caricature. Baby-Man captures something much more fundamental about his temperament — it’s not just about rage, it’s about complete and utter solipsism. I had a friendly argument with a reporter I know when I was at the DNC last summer — he believed quite firmly that Trump had no “superpowers” and had no chance of winning the election; I countered that his complete lack of shame *was* his superpower.

If this is a (fair & balanced) illustration of an unlikely superpower: "shamelessness," so be it.

[x TMW]
The Unbelievable Baby-Man
By Tom Tomorrow (Dan Perkins)

Tom Tomorrow/Dan Perkins

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow." His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins, a longtime resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political blog, also entitled "This Modern World," which he began in December 2001. He won the 2013 Herblock Prize for editorial cartooning and was a runner-up for the 2015 Pulitzer Prize for Editorial Cartooning.]

Copyright © 2017 This Modern World/Tom Tomorrow (Dan Perkins)

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Sunday, January 15, 2017

Roll Over, Hannah Arendt — Make Way For The Banality Of Being Il Douche

As the darkest day of 2017 (January 20, 2017) nears its dawning, the horrible nightmare will begin. As Bill Maher put it best in a recent TV interview: "The POTUS, after January 20, won't have to rely on Twitter in dealing with his 'enemies.' He will have the FBI." If this is a (fair & balanced) preview of our long national nightmare, so be it.

[x Salon]
Donald Trump: Not Exactly Hitler! But...
By Andrew O'Hehir

TagCrowd cloud of the following piece of writing


By next weekend [January 20, 2017], Donald Trump will be president of the United States. That is, I think he will — given everything that has happened throughout the absurdist drama otherwise known as the 2016 presidential election, it’s best not to make assumptions.

Only two axioms have proven true about this bizarre and chaotic presidential transition. The first one is easy: Just when you think things can’t get any weirder, they do. A week ago, we thought it was strange that Trump was praising Julian Assange, and we did not know about the hypothetical existence of a supposed “sex tape” made in a Moscow hotel room, which — if it exists, which it probably doesn’t — does not involve any actual sex. Those were innocent times!

The second axiom is a bit more complicated, but I’ll give it a try: More than a year’s worth of comparisons between Donald Trump and Adolf Hitler, which began as shrill liberal paranoia, have ended up seeming simultaneously more ridiculous than ever and more convincing than ever. If that sounds like an irresolvable contradiction, welcome to America in 2017.

If Donald Trump possessed any significant degree of self-awareness — a logical impossibility, because he wouldn’t be Donald Trump if he did — he probably wouldn’t send out tweets comparing America to Nazi Germany because confidential information is sometimes leaked to journalists. Some mechanism of psychological displacement would seem to be at work here, which like so many other things about Trump would be funny if it weren’t chilling: Porous government agencies and an adversarial press corps are exactly what the monolithic Nazi state didn’t have, and are exactly what Trump would like to get rid of, in the process of Nazifying, or at least Trumpifying, the dysfunctional American state.

As my colleague Chauncey DeVega has previously argued (and argues again this weekend, in an article to be published shortly after this one), more than anything else Trump resembles a “heel” from the world of professional wrestling, who also happens to be an authoritarian political leader with undeniable fascist tendencies in the real world. Trump is something like Hitler, but he is more like a pop-culture parody version of Hitler — the “Heil Honey I’m Home!” version of Hitler — created to entertain an audience that understands this is mostly a joke but isn’t quite sure where the shtick ends and the mass murder begins.

It would be conventional to say that we don’t know what kind of president Donald Trump will be, but it would be closer to the truth to say that what we really don’t know is how bad a president he will be, and how much damage he will inflict on the nation before he leaves office. One of the things DeVega’s pro wrestling analogy drives at, perhaps, is the sense that Trump is more like a phenomenon than a person.

Apocryphal Moscow sex tape or not, Trump’s personality and personal biography are of little interest, even to his followers and to himself. Trump struggled to come up with heartwarming family anecdotes for the ghostwriter of his own autobiography, and has said he doesn’t remember much of his childhood. (There’s a bit of Hitler there, too; we’ll get to that.) The Trump phenomenon illustrates what one scholar has described as the principle “that historical greatness can be linked with paltriness on the part of the individual concerned.”

What can I possibly mean (you may be asking) by ascribing historical greatness to Donald Trump, the comical real-estate buffoon who was elected president by accident, despite having absolutely no qualifications for the job? Well, that is his quality of greatness, pretty much. Look what he has accomplished so far; look how he has outfoxed and defeated every opponent. Trump’s peculiar greatness, we might say, is essentially linked to his distinctive quality of excess. It is a tremendous outpouring of energy, for good or ill, that shatters all existing standards. Granted, gigantic scale is not the same thing as historic greatness; there can be power in triviality as well. But Trump is not only gigantic and not only trivial. The eruption he has unleashed has been stamped through every one of its stages by his guiding will.

Throughout his political rise, Trump has had an amazing instinct for understanding what forces can be mobilized to serve his interests and how to mobilize them, and has not allowed prevailing trends to deceive him. At the time he began to move into politics, the entire system remained in the grip of 60 years or more of bipartisan consensus around the norms of liberal democracy. But he grasped the latent opposition to that system, just below the surface, and by bold and wayward combinations seized upon those factors and incorporated them into his campaign.

Trump’s conduct has always appeared foolish to sage political minds, and for years — indeed, virtually to the moment of his final victory — arrogant conventional wisdom did not take him seriously. The widespread mockery heaped upon him has been justified by his appearance, his unhinged rhetorical flights and the theatrical atmosphere he deliberately created around himself. (Again, like a pro wrestling villain.) Yet in a manner almost impossible to describe, he has always stood above or outside his banal and dull-witted persona.

Everything in those last three paragraphs is, I believe, accurate when it comes to Donald Trump, and indeed contains a few flashes of original insight. But I can’t take credit; I didn’t write it. Well, I sort of did: I revised it, edited it and updated the language here and there. In a word, I plagiarized it: All the ideas in that passage, and most of the actual writing, come from the introduction to Joachim Fest’s famous biography Hitler, published in 1973.

Does quoting a biographer making the point that Hitler was a theatrical, manipulative genius with uncanny political instincts, who was consistently underestimated by his opponents until it was too late, prove that Trump is just like Hitler? Of course not. What it suggests, as I said earlier, is that Trump is something like Hitler, and that the type of authoritarian personality — or non-personality, in both these cases — who rises to the top when a democracy collapses conforms to a general pattern. As I see it, the parallels between Trump and Hitler are at once superficial — because they are different men, in sharply different historical contexts — and profound, because the underlying pattern is so similar and so disturbing.

Fest’s biography has been superseded in many ways by reams of Hitler scholarship published over the past four decades, but it retains considerable iconic power and remains a work of marvelous, troubling clarity. It was perhaps the first serious attempt made in postwar Germany to reckon with the dictator’s legacy in dispassionate fashion, at a time when many Germans either preferred to pretend Hitler had never existed or viewed him as a demonic arch-nemesis who had no history and required no explanation. Fest rejected both of the ideas about Hitler that were then dominant: that he was a uniquely evil person possessed of almost supernatural powers, or that he was a total nonentity serving the class interests of Germany’s financial, industrial and imperialist elite. Is it too much of a stretch to say that the same stereotypes have been applied to Donald Trump, and that they are just as incorrect now as they were then?

What is most striking about rereading Fest’s Hitler in 2017 is his portrayal of the future Führer as an embittered, uncultured provincial outsider, who consistently exaggerated the poverty of his background and who longed to join the upper levels of an urban bourgeois elite that viewed him with evident scorn (until, of course, it was compelled to grovel at his feet). Although the circumstances are different, I hardly need to observe that the parallels are striking.

Donald Trump was born into money and has never completely been able to deny it, although he continues to insist on the myth that his real estate empire was somehow the result of hard work, and that his father gave him only minimal help early in his career. Hitler created a more robust legend around the idea that his family had been “humble cottagers” and that he had pulled himself from rural poverty by his bootstraps, when in fact his father had been a highly respectable customs official for the Austro-Hungarian government, and young Adolf lived on family money for years. Fest reports that Hitler flew into a rage any time actual facts about his background threatened to emerge; his Twitter feed would have been amazing.

So in each case it’s the father who could legitimately be considered a self-made man — Alois Hitler (né Schicklgruber) was the illegitimate child of a domestic servant, and his paternity was never certain — and the son who yearns to leave that struggle behind and ascend to empyrean realms of metropolitan glamour. No matter how rich he got or how many appearances he made in the New York tabloids, to the WASP and Jewish elites of Manhattan Trump has always been a boorish kid from Queens with vulgar tastes and vulgar connections.

As Fest points out, the young Hitler was supposedly an aspiring artist in Vienna during one of that city’s greatest cultural explosions, but he apparently knew little or nothing about the art of Gustav Klimt or Oskar Kokoschka, the music of Mahler or Richard Strauss, the poetry of Rilke. He would later claim to have been a political and artistic “revolutionary,” but was really a frustrated dabbler who sold insipid watercolors to tourists and wandered the city alone in dandyish clothes, “full of sentimental admiration for the bourgeois world” of wealth and power that had conclusively rejected him.

What is Trump Tower but a cheaply made imitation of old-money luxury? And what has Trump’s entire career been, first in business and then in politics, if not (as Fest says of Hitler) the work of a reactionary “partisan of the establishment, paradoxically defending a reality that he simultaneously repudiated”? There could be no better description of the peculiar quality of the emerging Trump administration, which gained power on a promise to “drain the swamp” and overturn the existing order, and now plans to wield power through an assemblage of plutocrats and Establishment insiders unlike anything ever seen in American political history.

As I’ve already said, there are important differences. Adolf Hitler devoted years to building a political movement and a political party, which held to a somewhat coherent (if noxious) ideology. Trump’s movement sprang up virtually overnight, like a bloom of poisonous mushrooms, and is little more than a funnel for existing currents of nationalism, racism and cultural warfare. But both were nourished in similarly toxic soil — the alt-right of Hitler’s day being the anti-Semitic underground, viewed as déclassé by the polite anti-Semites of mainstream society — and both were fundamentally the instruments of one man’s will, whims and outsized persona.

One more important set of differences may be deduced: Hitler was able to unite and mobilize a large majority of German society, and to drive out or crush nearly all opposition. Americans are too lazy — or at least too deeply committed to consumerism and leisure time — to be mobilized on any large scale, and too fractious and diverse to be united about anything. Opposition to Trump is not going away, and once it works through its shock and internal squabbles is likely to forge a formidable coalition.

Trump’s political situation before assuming office is weaker than Hitler’s ever was. Even people who loved his performance as the pro-wrestling cartoon version of Hitler would presumably be less excited about the real thing. Presumably! But the cartoon Hitler is about to become the real president, and the time for pretending that we have the slightest idea what will happen next is long past. ###

[Andrew O'Hehir is a staff writer at Salon who covers movies, books, media, and politics. O'Hehir received a BA and an MA (both in humanities) from The Johns Hopkins University.]

Copyright © 2017 Salon Media Group

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves