Wednesday, August 20, 2014

Caesar Had His Brutus, Charles I Had His Cromwell, George III Had His Patrick Henry, The Trickster Had His Leon Jaworski — May Goodhair (But No Brains) Have His Michael McCrum

Yesterday, at 5:00 PM (CDT), Governor Goodhair (But No Brains) presented himself to the Travis County Criminal Justice Center for arraignment and a mugshot as an indicted felon who will face a criminal trial this fall. Goodhair (But No Brains) also posted bond of $20 after a negotiation over the bond amount between Special Prosecutor Michael McCrum and Goodhair's stable of lawyers paid by the taxpayers of Texas. This blogger wanted to see the sumbitch shackled at the wrists and ankles as he did the perp walk into the Justice Center. An orange jumpsuit with "Prisoner" stenciled across the back would have completed the dream. Goodhair (But No Brains) bellowed his mantra of belief in "The Rule Of Law" and this blogger hopes that "The Rule Of Law" results in a pair of guilty verdicts as Goodhair (But No Brains) is loaded aboard a Texas Department of Criminal Justice bus for the trip to The Walls Unit in Huntsville. As the fantasy continues, Goodhair (But No Brains) will be placed in the general population where he will spend his time holding his ankles while a hulking inmate drives home "The Rule Of Law." In the meantime, this blog presents a pair of contrarian essays in the midst of all of the bleating about poor, persecuted Goodhair (But No Brains). If this is (fair & balanced) hope for judicial retribution, so be it.

[Vannevar Bush Hyperlink: Bracketed Numbers Directory]
[1] Goodhair Needs More Lawyers (Jeffrey Toobin)
[2] Goodhair's Legal Blunder (Charles P. Pierce)


[1] Back To Directory
[x The New Yorker]
Why Rick Perry May Be Out Of Luck
By Jeffrey Toobin

Tag Cloud of the following piece of writing (created at TagCrowd.com)

Governor Rick Perry of Texas and President Barack Obama, strangest of bedfellows, are making similar discoveries about the scope of prosecutorial discretion. In short, it’s very broad.

Perry’s education on the subject is an unhappy one. Late Friday, the Texas Governor, who has about five months left in his term, was indicted on two counts: abuse of official capacity and coercion of a public servant. What those charges mean, though, is hard to say. The indictment itself [PDF] is just two pages and, to put it charitably, unelaborated.

The case has its origins in Perry’s long-running feud with Rosemary Lehmberg, a district attorney in Travis County, which includes Austin and represents an island of blue in the deep-red sea of Texas. Last year, Lehmberg was charged with drunken driving. She promptly pleaded guilty, which, in light of the YouTube videos of her sobriety test and her booking at the police station, was no surprise.

Lehmberg served several days in jail but declined to resign, so Perry decided to make the most of her difficulties. He said that, unless she resigned, he would use his power as Governor to veto $7.5 million in state money for her Public Integrity Unit, which had been hard at work prosecuting Texas pols, many of them Republicans. He could not, he said, support “continued state funding for an office with statewide jurisdiction at a time when the person charged with ultimate responsibility of that unit has lost the public’s confidence.”

What Perry did was obvious. The Governor was using his leverage to jam a political adversary—not exactly novel behavior in Texas, or most other states. But Democrats succeeded in winning the appointment of a special prosecutor, Michael McCrum, to investigate Perry’s behavior, and on Friday McCrum brought the hammer down. The threat to veto the money for the D.A. amounted to, according to the prosecutor, two different kinds of felonies: a “misuse” of government property, and a corrupt attempt to influence a public official in “a specific exercise of his official power or a specific performance of his official duty” or “to violate the public servant’s known legal duty.” (In the charmingly archaic view of Texas statutes, every public official is a “him.”)

Perry’s indictment has been widely panned, including by many liberals, as an attempt to criminalize hardball politics. (Vetoing things is, generally, part of a governor’s job.) Perry himself is all wounded innocence. “I intend to fight against those who would erode our state’s constitution and laws purely for political purposes, and I intend to win,” he said at a news conference. (It would be easier to feel sorry for Perry if he expressed similar concern about, say, the constitutional rights of those who were executed on his watch and with his support.)

So Perry may have a point, but he also has a problem. Prosecutors have wide, almost unlimited, latitude to decide which cases to bring. The reason is obvious: there is simply no way that the government could prosecute every violation of law it sees. Think about tax evasion, marijuana use, speeding, jay-walking—we’d live in a police state if the government went after every one of these cases. (Indeed, virtually all plea bargaining, which is a ubiquitous practice, amounts to an exercise of prosecutorial discretion.) As a result, courts give prosecutors virtual carte blanche to bring some cases and ignore others. But, once they do bring them, courts respond to the argument that “everyone does it” more or less the same way that your mother did. It’s no excuse. So if Perry’s behavior fits within the technical definition of the two statutes under which he’s charged, which it well might, he’s probably out of luck.

The President is relying on the same concept of discretion to push immigration reform, even though Congress has refused to pass a law to do so. The legislative branch writes the laws, which define the classes of people who are subject to deportation. But it is the executive branch that decides which actual individuals it will pursue and deport. Over the past several years, the Obama Administration has used its discretion to allow more immigrants to stay. During the 2012 campaign, the President announced his Deferred Action for Childhood Arrivals (DACA) program, which amounted to a kind of administrative DREAM Act. It limited the number of deportations of people who had been children when they were brought illegally to this country, provided they met certain other conditions. The legality of DACA has not been successfully challenged.

Prosecutorial discretion is not unlimited. The executive branch can refrain from prosecuting certain individuals, but it cannot, in theory, offer immunity to entire classes of law-breakers. Nor can a prosecutor charge only people of a certain race, or, for that matter, political party. But it’s hard to know who would have standing to challenge a failure to bring a criminal case or a deportation. The rules of standing are usually limited to individuals who have suffered a specific harm, and there’s no harm in not being prosecuted. (The New Republic has a useful primer on the subject.)

That sort of limitation on prosecutorial discretion is unlikely to help Rick Perry. His complaint is that the prosecutor is bringing one case too many, not too few. That claim, almost invariably, is a loser. So, it turns out, may be the soon-to-be-former governor. Ω

[Jeffrey Toobin, a staff writer at The New Yorker since 1993, writes about legal affairs. Before joining The New Yorker, Toobin served as an Assistant United States Attorney in Brooklyn, New York. He also served as an associate counsel in the Office of Independent Counsel Lawrence E. Walsh, an experience that provided the basis for his first book, Opening Arguments: A Young Lawyer’s First Case—United States v. Oliver North (1991). Toobin's most recent book is The Nine: Inside the Secret World of the Supreme Court (2007). He graduated magna cum laude with a Bachelor of Arts degree from Harvard College and earned a Truman Scholarship. Thereafter, he graduated magna cum laude from Harvard Law School with a J.D., where he was an editor of the Harvard Law Review.]

Copyright © 2014 Condé Nast Digital


[2] Back To Directory
[x Esquire]
The Joke That Is Rick Perry Is Not Funny
By Charles P. Pierce

Tag Cloud of the following piece of writing (created at TagCrowd.com)

In a sane democracy, the idea of Rick Perry as president of the United States would be treated as the fringe enthusiasm of shoeless mouth-breathers. In 2012, the man could barely get from a subject to a verb without turning an ankle. Now, four years later, he puts on a pair of hornrims, and we are all supposed to buy him as legitimately the best person for the worst job in the world? I guess that's what the kidz at Tiger Beat On The Potomac think, anyway, because they broke a lot of rock telling us that Goodhair is going to ride out the latest ditch into which he's steered his bandwagon -- to wit, beating both Chris Christie and Scott Walker to the tape in the Who Will Be Indicted First? Derby. I will grant you that the indictment seems based on an arcane bit of Texas law that seems to criminalize Being A Dick to a disturbing extent. (However, if Perry really did offer another job to Rosemary Lehmberg, the Travis County DA at the center of things here, in exchange for Lehmberg's resignation as head of the Public Integrity Unit, then that's a whole 'nother thing.) I am less than compelled, however, by this argument.

The closest precedent dates back to 1917, when Gov. James Ferguson, who wanted the University of Texas to fire some faculty and staff of which he disapproved, was indicted based on his veto of funding to the university. Ferguson resigned before he was convicted. "There's not really any legal or political precedent for this. You've got to go back nearly a century," Jillson said.

Wait. Whoa there. In 1917, they brought exactly the same case against a sitting governor, and that precedent is not a precedent because it goes back "nearly a century"? Is there a statute of limitations on relevance of which I am not aware? Because it sure would have come in handy back in 1998, when the Republicans in the House impeached a sitting president despite the fact that such an action hadn't been taken for well over a century.

But that's not the real reason why Rick Perry as president should be a bitter joke in an evolved political culture. The real reason was outlined in shocking detail by an act of actual journalism committed over the weekend by the good folks at the Dallas Morning News. To put it briefly, being a worker in Rick Perry's Texas means that, every day, you put your life in your hands to an extent unheard of throughout the rest of the nation.

More workers die here than in any other state. On average, a Texas worker is 12 percent more likely to be killed on the job than someone doing the same job elsewhere, according to a Dallas Morning News analysis of federal data. That translates to about 580 excess workplace deaths over a decade.

Construction has contributed mightily to Texas' booming economy. And the state's construction sites are 22 percent deadlier than the national average. Forty percent of Texas' excess death toll was among roofers, electricians and others in specialty construction trades. Such workers are sometimes treated as independent contractors, leaving them responsible for their own safety equipment and training. Many are undocumented immigrants.

Government and industry here have invested relatively little in safety equipment, training and inspections, researchers say. And Texas is one of the toughest places to organize unions, which can promote safety. "There's a Wild West culture here," said University of Texas law professor Thomas McGarity, who has written several books about regulation. Texans often think, "We don't want some nanny state telling workers how to work and, by implication, telling employers how to manage the workplace," he said.

And the next thing you know, fertilizer plants explode and take entire towns with them.

Among The News' findings: California had 1,204 fewer deaths than expected; Texas had the highest rate of excess deaths among the 10 biggest states; there were 17 states with higher rates of excess deaths. But all of them had fewer than one-fourth of Texas' workplace deaths, which statistically skews the comparison. While oil and gas drilling is among the most dangerous industries in the U.S., Texas' fatality rate in that industry was 62 percent below average. There were 49 fewer deaths than expected. The most excess deaths in Texas were among specialty construction trades. There were 719 such fatalities, or 242 beyond what would have been expected.

Read the entire thing and remember that, outside of getting tough with the browns so people don't boo him like they did last time, Rick Perry is going to base his entire campaign for president on the economic performance of Texas while he was governor. What the Morning News describes is what Rick Perry is going to tell us he wants for the whole country — an unregulated dystopia in which a certain number of unavoidable deaths on the job are the price we pay for the freedom enjoyed by our job creators to risk all our lives. That is the basis of Rick Perry's upcoming campaign. It is built on a foundation of anonymous bones. Ω

[Charles P. "Charlie" Pierce is a sportswriter, political blogger, author, and game show panelist. Pierce is the lead political blogger for Esquire, a position he has held since September 2011. He has written for Grantland, The New York Times, the Los Angeles Times, the Chicago Tribune, the Boston Globe Sunday Magazine, the Milwaukee Journal-Sentinel, Sports Illustrated, The National Sports Daily, GQ, and Slate. Pierce makes appearances on radio as a regular contributor to a pair of NPR programs: "Only A Game" and "Wait Wait...Don't Tell Me!" He graduated from Marquette University (BA, Journalism).]

Copyright © 2014 Hearst Communications



Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, August 19, 2014

The Dilemma O'The Day — Phish Or Cut Clickbait?

Website pop-up ads are wonderful! Not! The pop-ups are obnoxious and obtrusive. This blogger suspects that if you ask MIT's Ethan Zuckerman for the time of day, he will tell you how to make a watch. Nonetheless, Zuckerman provides an interesting look at the state of the Web as it turns 25. If this is (fair & balanced) geekiness, so be it.

[x The Atlantic]
The Internet's Original Sin
By Ethan Zuckerman

Tag Cloud of the following piece of writing (created at TagCrowd.com)

Ron Carlson’s short story “What We Wanted To Do” takes the form of an apology from a villager who failed to protect his comrades from marauding Visigoths. It begins:

What we wanted to do was spill boiling oil onto the heads of our enemies as they attempted to bang down the gates of our village. But as everyone now knows, we had some problems, primarily technical problems, that prevented us from doing what we wanted to do the way we had hoped to do it. What we’re asking for today is another chance.

There’s little suspense in the story—the disastrous outcome is obvious from the first paragraph—but it works because of the poignancy of the apology. All of us have screwed up situations in our lives so badly that we’ve been forced to explain our actions by reminding everyone of our good intentions. It’s obvious now that what we did was a fiasco, so let me remind you that what we wanted to do was something brave and noble.

The fiasco I want to talk about is the World Wide Web, specifically, the advertising-supported, “free as in beer” constellation of social networks, services, and content that represents so much of the present day web industry. I’ve been thinking of this world, one I’ve worked in for over 20 years, as a fiasco since reading a lecture by Maciej Cegłowski, delivered at the Beyond Tellerrand web design conference. Cegłowski is an important and influential programmer and an enviably talented writer. His talk is a patient explanation of how we’ve ended up with surveillance as the default, if not sole, internet business model.

The talk is hilarious and insightful, and poignant precisely for the reasons Carlson’s story is. The internet spies on us at every twist and turn not because Zuckerberg, Brin, and Page are scheming, sinister masterminds, but due to good intentions gone awry. With apologies to Carlson:

What we wanted to do was to build a tool that made it easy for everyone, everywhere to share knowledge, opinions, ideas and photos of cute cats. As everyone knows, we had some problems, primarily business model problems, that prevented us from doing what we wanted to do the way we hoped to do it. What we’re asking for today is a conversation about how we could do this better, since we screwed up pretty badly the first time around.

I use the first person plural advisedly. From 1994 to 1999, I worked for Tripod.com, helping to architect, design, and implement a website that marketed content and services to recent college graduates. When that business failed to catch on, we became a webpage-hosting provider and proto-social network. Over the course of five years, we tried dozens of revenue models, printing out shiny new business plans to sell each one. We’d run as a subscription service! Take a share of revenue when our users bought mutual funds after reading our investment advice! Get paid to bundle a magazine with textbook publishers! Sell T-shirts and other branded merch!

At the end of the day, the business model that got us funded was advertising. The model that got us acquired was analyzing users’ personal homepages so we could better target ads to them. Along the way, we ended up creating one of the most hated tools in the advertiser’s toolkit: the pop-up ad. It was a way to associate an ad with a user’s page without putting it directly on the page, which advertisers worried would imply an association between their brand and the page’s content. Specifically, we came up with it when a major car company freaked out that they’d bought a banner ad on a page that celebrated anal sex. I wrote the code to launch the window and run an ad in it. I’m sorry. Our intentions were good.

Cegłowski’s speech explains why Tripod’s story sounds familiar. Advertising became the default business model on the web, “the entire economic foundation of our industry,” because it was the easiest model for a web startup to implement, and the easiest to market to investors. Web startups could contract their revenue growth to an ad network and focus on building an audience. If revenues were insufficient to cover the costs of providing the content or service, it didn't matter—what mattered was audience growth, as a site with tens of millions of loyal users would surely find a way to generate revenue.

There are businesses, Cegłowski notes, that make money from advertising, like Yahoo and Gawker. But most businesses use advertising in a different way. Their revenue source is investor storytime:

Investor storytime is when someone pays you to tell them how rich they’ll get when you finally put ads on your site.

Pinterest is a site that runs on investor storytime. Most startups run on investor storytime.

Investor storytime is not exactly advertising, but it is related to advertising. Think of it as an advertising future, or perhaps the world’s most targeted ad.

Both business models involve persuasion. In one of them, you’re asking millions of listeners to hand over a little bit of money. In the other, you’re persuading one or two listeners to hand over millions of money.

The key part of investor storytime is persuading investors that your ads will be worth more than everyone else’s ads. That’s because most online ads aren’t worth very much. As a rule, the ads that are worth the most money are those that appear when you’re ready to make a purchase—the ads that appear on Google when you’re searching for a new car or for someone to repair your roof can be sold for dollars per click because advertisers know you’re already interested in the services they are offering and that you’re likely to make an expensive purchase. But most online advertising doesn’t follow your interest; it competes for your attention. It’s a barrier you have to overcome (minimizing windows, clicking it out of the way, ignoring it) to get to the article or interaction you want.

A back-of-the-envelope analysis from Felix Stalder gives a sense of how little these ads are worth. Last quarter, Facebook reported that it had 1.32 billion users, collected $2.91 billion in revenue and made a profit of $791 million, for a profit margin of 27 percent. Facebook is clearly doing a great job making money from ads. But the profit per user is just under $0.60. That’s a fascinating figure, because Facebook reports that users spend 40 minutes per day on the site, or roughly 60 hours per quarter.

Stalder is interested in the idea that users are working for Facebook, generating content that the company profits from without getting compensated. But even if we ignore the important idea of “free cultural labor” that makes a business like Facebook (or Tripod!) possible, it’s striking that our attention, as viewers, is worth only a penny an hour to Facebook’s advertisers.
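
As a rough sketch of the back-of-the-envelope arithmetic behind that penny-an-hour figure (my sketch, not Stalder's; it uses only the quarterly numbers reported above):

```python
# Back-of-the-envelope check of the Facebook figures cited above.
users = 1.32e9              # reported users
quarterly_profit = 791e6    # reported quarterly profit, in USD
minutes_per_day = 40        # reported time on site per user
days_per_quarter = 91       # roughly one quarter

profit_per_user = quarterly_profit / users                    # ~$0.60
hours_per_quarter = minutes_per_day * days_per_quarter / 60   # ~61 hours
profit_per_hour = profit_per_user / hours_per_quarter         # ~$0.01

print(f"${profit_per_user:.2f} per user; ${profit_per_hour:.3f} per attention-hour")
```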

Don Marti uses the same set of Facebook earning numbers to demonstrate that print newspapers make roughly four times as much money in advertising as Facebook does in the United States. Print advertising generates these enviable, if shrinking, numbers despite capturing only about 14 minutes a day of Americans’ attention. This “print dollars, digital dimes” problem is an apparent paradox: Why are targeted digital ads worth an order of magnitude less than untargeted print ads, in terms of “attention minutes”? Marti argues that advertising in a public place, like a newspaper, builds brands in a way that private, targeted ads can’t. (I’ve argued that this is a legacy effect and that print ads will fall in price once there are efficient digital ways to reach the majority of consumers in a market.)
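
The same figures make Marti's order-of-magnitude gap easy to check. A quick sketch of the arithmetic (mine, using only the ratios summarized above; absolute revenue cancels out of the comparison):

```python
# Relative value per attention-minute: US print newspapers vs. Facebook.
print_revenue_ratio = 4.0   # print earns ~4x Facebook's US ad revenue
print_minutes = 14          # minutes per day of US attention to print
facebook_minutes = 40       # minutes per day of US attention to Facebook

ratio = (print_revenue_ratio / print_minutes) / (1.0 / facebook_minutes)
print(f"Print ads earn roughly {ratio:.0f}x more per attention-minute")  # ~11x
```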

Cegłowski tells us that it doesn’t matter.

The poor performance of digital ads just makes investor storytime more compelling. After showing how poor YouTube’s targeted ads are in understanding him as a consumer, he explains, “Of course, for ad sellers, the crappiness of targeted ads is a feature! It means there’s vast room for improvement. So many stories to tell the investors.”

Most investors know your company won’t grow to have a billion users, as Facebook does. So you’ve got to prove that your ads will be worth more than Facebook’s. In 1997, I argued that Tripod’s users were more valuable to advertisers than the average web user because I could use algorithms to analyze the home pages they posted and target ads to their interests and demographic data. Facebook makes a vastly more sophisticated version of that argument, and faces problems much like those we faced almost two decades ago. Targeting to intent (as Google’s search ads do) works well, while targeting to demographics, psychographics or stated interests (as Facebook does) works marginally better than not targeting at all.

Demonstrating that you’re going to target more and better than Facebook requires moving deeper into the world of surveillance—tracking users' mobile devices as they move through the physical world, assembling more complex user profiles by trading information between data brokers.

Once we’ve assumed that advertising is the default model to support the Internet, the next step is obvious: We need more data so we can make our targeted ads appear to be more effective. Cegłowski explains, “We’re addicted to ‘big data’ not because it’s effective now, but because we need it to tell better stories.” So we build businesses that promise investors that advertising will be more invasive, ubiquitous, and targeted and that we will collect more data about our users and their behavior.

I have come to believe that advertising is the original sin of the web. The fallen state of our Internet is a direct, if unintentional, consequence of choosing advertising as the default model to support online content and services. Through successive rounds of innovation and investor storytime, we’ve trained Internet users to expect that everything they say and do online will be aggregated into profiles (which they cannot review, challenge, or change) that shape both what ads and what content they see. Outrage over experimental manipulation of these profiles by social networks and dating companies has led to heated debates amongst the technologically savvy, but hasn’t shrunk the user bases of these services, as users now accept that this sort of manipulation is an integral part of the online experience.

Users have been so well trained to expect surveillance that even when widespread, clandestine government surveillance was revealed by a whistleblower, there has been little organized, public demand for reform and change. As a result, the Obama administration has been slightly more transparent about government surveillance requests, but has ignored most of the recommendations made by the president's own review panel and suffered few political consequences. Only half of Americans believe that Snowden’s leaks served the public interest and the majority of Americans favor criminal prosecution for the whistleblower. It’s unlikely that our willingness to accept online surveillance reflects our trust in the American government, which is at historic lows. More likely, we’ve been taught that this is simply how the Internet works: If we open ourselves to ever-increasing surveillance—whether from corporations or governments—the tools and content we want will remain free of cost.

At this point in the story, it’s probably worth reminding you that our intentions were good.

What I wanted to do was to build a tool that allowed everyone to have the opportunity to express themselves and be heard by anywhere from a few friends to the entire globe. In 1995, there weren’t a lot of ways to offer people free webpage hosting and make money. Charging users for the service would have blocked most of our potential customers—most of the world still doesn’t have a credit card today, and fewer did in 1995. E-payment systems like PayPal didn’t come online until 1999. But because Tripod’s services were free and ad supported, users around the world found us and began posting webpages they could not host elsewhere.

In 1996, we noticed that the majority of our users were coming from four countries: the United States, Canada, the U.K., and Malaysia. Since none of our content was in Malay and since we’d never done any outreach to Malaysian users, this was a surprise. I started printing out heavily trafficked webpages posted by Malaysian users and brought a sheaf of them to a professor at nearby Williams College, who read them over and informed me that we had become a major vehicle for expression for Malaysia’s opposition political group, Anwar Ibrahim's Reformasi movement.

The adoption of Tripod by Malaysian activists was not directly due to our use of an ad-supported model, but it was an unintended, positive consequence. We couldn’t find a way to make money from advertising to Malaysian users, and we had internal discussions about whether we should “cut our losses” and provide services only to users in countries where we could sell advertising, conversations that Facebook and other ad-supported companies are now wrestling with as they expand in the developing world. I’m glad that we made the right decision (morally, if not fiscally) and that Facebook, thus far, has done so as well.

The great benefit of an ad-supported web is that it’s a web open to everyone. It supports free riders well, which has been key in opening the web to young people and those in the developing world. Ad support makes it very easy for users to “try before they buy,” eliminating the hard parts of the sales cycle, and allowing services like Twitter, Facebook, and Weibo to scale to hundreds of millions of users at an unprecedented rate. This, in turn, has powerful network effects: Once all your high school classmates are on Facebook, there’s a strong temptation to join, even if you don’t like the terms of service, as it’s an efficient way to keep in touch with that social circle.

In theory, an ad-supported system is more protective of privacy than a transactional one. Subscriptions or micropayments resolved via credit card create a strong link between online and real-world identity, while ads have traditionally been targeted to the content they appear with, not to the demo/psychographic identity of the user. In practice, part of Facebook’s relentless drive to ensure all users are personally identifiable is to improve targeting and reassure advertisers that their messages are reaching real human beings.

And while we’re considering the benefits of an ad-supported web, it’s worth exploring the idea that the adoption of advertising as a business model normalized the web much more quickly than it otherwise would have spread. Companies like Tripod worked to convince massive companies that were at least a decade from selling online, like auto manufacturers, that they needed a presence on the web to build their brand in an important new medium. (Second Life unsuccessfully tried the same strategy a decade later, persuading Pontiac to open dealerships for virtual cars.) Taking a small portion of the auto industry’s vast ad budget allowed companies to persuade investors that online advertising would be huge and brought those companies online years before they had a business need to do so.

An ad-supported web grows quickly and is open to those who can’t or won’t pay. But it has at least four downsides as a default business model.

First, while advertising without surveillance is possible—unverifiable advertising was the only type of advertising through most of the 20th century—it’s hard to imagine online advertising without surveillance. The primary benefit of online advertising is the ability to see who’s looking at an ad. Simply paying for online advertising requires surveillance, if only to eliminate clickfraud. And if Cegłowski’s theory is true, there’s no apparent escape from escalating surveillance to create more attractive business propositions.

Second, not only does advertising lead to surveillance through the “investor storytime” mechanism, it creates incentives to produce and share content that generates pageviews and mouse clicks, but little thoughtful engagement. Clickbait has become so prominent that even Upworthy, popularizer of spreadable media as a tool for social change, is asking advertisers to consider how much attention readers are paying to content, rather than how many pageviews it generates. Some new media empires are so attached to advertising metrics that they are giving writers days off from “traffic whoring” duty to allow them to produce content that has greater social and informational value. While many newspapers are shielding their reporters from statistics about whether their stories are being read, the increasing importance of digital news outlets to the public sphere suggests we may get less news that helps us engage as citizens and more news designed to get us to click the “next page” button.

Third, the advertising model tends to centralize the web. Advertisers are desperate to reach large audiences as the reach of any individual channel shrinks. A generation ago, you could reach a significant fraction of the American population by buying ad time on four television networks. Very few companies can offer that “Super Bowl ad” reach today. Advertisers purchase ads scattered across hundreds of sites, buying demographic targeting at the lowest rates available. Companies like Facebook want to get as much of that money as possible, which means chasing users and reach. Using cash from investors and ad sales, they can acquire smaller companies that are starting to build rival networks. (See Facebook’s acquisition of Instagram and, to a lesser extent, WhatsApp.) This centralization has dangers for online speech—it means decisions these platforms take to ban speech are as powerful as decisions made by governments, as Rebecca MacKinnon has eloquently documented.

Finally, even attempts to mitigate advertising’s downsides have consequences. To compensate us for our experience of continual surveillance, many websites promise personalization of content to match our interests and tastes. (By giving platforms information on our interests, we are, of course, generating more ad targeting information.)

This personalization means that two readers of The New York Times may see a very different picture of the world, and that two users of Facebook certainly do, shaped both by our choice of friends and by Facebook’s algorithms. Research suggests that these personalized sites may lead us into echo chambers, filter bubbles, or other forms of ideological isolation that divide us into rival camps that cannot agree on anything, including a set of common facts on which we could build a debate. While many have written on this topic (and I wrote a book on it), few have shown the implications of overpersonalization as well as Gilad Lotan did in this recent analysis of media consumption in Israel and Palestine, where he describes the view participants in the current Gaza war have of the conflict as “personalized propaganda.”

It’s easier to rant about technology than it is to propose solutions. To Cegłowski’s credit, he closes his talk with a set of practical suggestions about limits we might put on the use of digital data by advertisers. He demands that we be given a right to review and delete the data companies hold on us, and proposes limits on how long data can be held and on how it can be shared. Implementing these regulations, of course, would require finding a regulator with teeth, and it’s not clear that the Federal Trade Commission would be willing to enforce such constraints on companies that are becoming powerful actors in Washington.

More importantly, Cegłowski offers us a way forward through his own actions. Cegłowski wrote and maintains Pinboard.in, a simple and powerful bookmarking service with an unusual business model. Each user of the service pays a one-time fee, which rises a fraction of a cent with each new user. (When I signed up for Pinboard, it cost $5, and now costs a bit more than $10.) The cost has the benefit of keeping the service spam-free—Metafilter has seen some of the same benefits from their nominal membership fee—and has meant that the service has been profitable since it was launched. Users can upgrade to a $25-per-year version that archives every webpage you bookmark, creating a permanent, searchable archive of your journeys through the web. Cegłowski promises that he will never sell ads on the site and never sell data to third parties, reminding us, “If you’re not paying for your bookmarking, then someone else is, and their interests may not be aligned with yours.”
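
As a sketch of that pricing scheme (illustrative only: the essay gives just the fraction-of-a-cent rule and two observed price points, so the increment below is a hypothetical that happens to pass near them):

```python
# Hypothetical model of Pinboard-style signup pricing: a one-time fee
# that rises a fraction of a cent with each new user. An increment of
# a tenth of a cent lands near the $5 and ~$10 prices mentioned above.
def signup_price(existing_users: int, increment: float = 0.001) -> float:
    """One-time fee, in dollars, charged to the next user who signs up."""
    return existing_users * increment

print(signup_price(5_000))    # 5.0  -- $5 once 5,000 users have joined
print(signup_price(10_000))   # 10.0 -- $10 once 10,000 users have joined
```

A curve like this makes each marginal signup slightly more expensive, which throttles growth and deters spammers without locking anyone out.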

Pinboard launched in 2009, in part in reaction to changes at Del.icio.us, a beloved bookmarking site started by Joshua Schachter and sold to Yahoo, which slowly ran the site into the ground. Many of Pinboard’s policies can be read as Cegłowski’s attempts to protect himself—and anyone else—from having precious personal data held hostage when a company changes hands. But these principles are also an invitation to think about, and perhaps to create, a web that works very differently.

The web is celebrating its 25th anniversary, but that milestone marks the invention of the HTTP protocol by Tim Berners-Lee. In practical terms, the web as we know it is less than 20 years old. Many of the services we rely on, like Twitter, are less than 10 years old. Yet it’s often hard to imagine making deep, structural changes to the web. It’s easy to assume that aspects of the web’s architecture and business model are inevitable: We will inexorably move towards a web that is centralized, ad supported and heavily surveilled.

Part of the celebration of the Web’s 25th anniversary is The Web We Want, a campaign to open the dialog about how the Internet is structured and governed to voices from around the globe. I think it’s at least as important to consider how we want the web to make money, as these decisions have powerful unintended consequences.

One simple way forward is to charge for services and protect users’ privacy, as Cegłowski is doing with Pinboard. What would it cost to subscribe to an ad-free Facebook and receive a verifiable promise that your content and metadata weren’t being resold, and would be deleted within a fixed window? Google now does this with its enterprise and educational email tools, promising paying users that their email is now exempt from the creepy content-based ad targeting that characterizes its free product. Would Google allow users to pay a modest subscription fee and opt out of this obvious, heavy-handed surveillance?

Users will pay for services that they love. Reddit, the lively recommendation and discussion community, sells Reddit Gold subscriptions that give users special privileges and the ability to turn off ads. (Those ads, by the way, are a lot less intrusive than those on most social networks.) Reddit advertises Gold both by detailing benefits of membership and setting “daily goals” for gold subscriptions, telling readers, “We believe the more reddit can be user-supported, the freer we will be to make reddit the best it can be.” A culture has developed of giving a month’s Reddit gold membership to someone whose comments or content you’ve especially appreciated, rewarding both the individual and the community as a whole. It would be interesting to see if Facebook could support a premium model. I suspect many people use Facebook because they feel they have to, not because they love it, as they love Reddit.

There are numerous consequences, not all intended, not all good, to using fee for service as a default model on the web. Many users would abandon services that weren’t worth paying for—we’d see usage numbers for sites shrink as well as grow, rather than climb inexorably. Most sites would have much smaller userbases. This likely means we’d have a harder time finding our exes on Facebook, but might mean we’d see more competition, less centralization and more competitive innovation.

If we want to build a web that’s really global, we need to rethink online payment systems. Visa and Mastercard may never become pervasive in India and sub-Saharan Africa, as mobile money already has a strong market share there. Still, systems like M-Pesa suffer the same problems as credit cards and PayPal: high transaction costs. The model Ted Nelson dreamed of with Xanadu, where hyperlinks would ensure authors were cited and compensated for their work, required a micropayment system with low transaction costs.

We may be nearing such a system with Bitcoin and other cryptocurrencies that permit transactions with extremely low transaction costs. (In theory. In practice, Bitcoin transaction costs are currently still a significant fraction of a dollar.) Projects like Stellar are focused on mainstreaming cryptocurrency and ensuring these systems aren’t open only to the digerati. If Stellar takes off (a big if) and if transaction costs drop low enough (a very big if), we might see an Internet supported on micropayments of a fraction of a cent to compensate the operators of services or creators of content. This sort of architecture might provide a viable alternative to advertising for new businesses starting online. (Consider Jeremy Rubin’s Tidbit project, which quickly ran into legal trouble—viewers of a webpage would pay a micropayment through giving cycles of their computer’s CPU to mine Bitcoins as a way of paying the page’s owner.)

There is no single “right answer” to the question of how we pay for the tool that lets us share knowledge, opinions, ideas, and photos of cute cats. Whether we embrace micropayments, membership, crowdfunding, or any other model, there are bound to be unintended consequences.

But 20 years into the ad-supported web, we can see that our current model is bad, broken, and corrosive. It’s time to start paying for privacy, to support services we love, and to abandon those that are free, but sell us—the users and our attention—as the product. Ω

[Ethan Zuckerman is director of the Center for Civic Media at MIT and principal research scientist at MIT’s Media Lab. He is the author of Rewire: Digital Cosmopolitans in the Age of Connection (2013). Zuckerman received a BA (philosophy) from Williams College. Following graduation from Williams, he was a Fulbright Scholar at the University of Ghana, Legon, and at the National Theatre of Ghana, studying ethnomusicology and percussion.]

Copyright © 2014 The Atlantic Monthly Group



Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Monday, August 18, 2014

Roll Over, RoboCop — Meet Officer Unfriendly Of The Ferguson (MO) Police Department

The entire world got a preview of militarized police following the Boston Marathon bombing. Entire neighborhoods instantly became free-fire zones. Now, the police in Ferguson, MO have provided another demonstration of a military response to a Missouri-style Intifada. Tear gas and rubber bullets came into use when Ferguson's Thin Blue Line was confronted by civilian rioters. What's next? Drones and Hellfire missiles? "Use 'em or lose 'em" is the motto o'the day when dealing with the Department of Defense Excess Property Program's distribution of surplus military equipment to state and local law enforcement agencies for use in counter-terrorism and counter-narcotic activities. The Ferguson Police Department probably has received a re-supply of rubber bullets and tear gas canisters. If this is a (fair & balanced) national disgrace, so be it.

[x This Modern World]
Officer Friendly
By Tom Tomorrow (Dan Perkins)


[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow". His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins, a long time resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political weblog, also entitled "This Modern World," which he began in December 2001. More recently, Dan Perkins, pen name Tom Tomorrow, was named the winner of the 2013 Herblock Prize for editorial cartooning.]

Copyright © 2014 Tom Tomorrow (Dan Perkins)



Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Sunday, August 17, 2014

Today, An Ode To "The Cockroach Of The Internet"

In 2003, this blogger vowed that his blog would be a substitute for the e-mail that clogged his In Box: "Read this!" or "Read that!" Now, this blogger receives forwarded e-mail from his geezer friends that contains this funny or that bizarre content. This blog has become a giant In Box that one and all can visit (or not) to read funny, bizarre, or mundane stuff. The difference between a blog post and spammy e-mail is one of the marvels of the Internet. If this is (fair & balanced) geek-speak, so be it.

[x The Atlantic]
Email Is Still The Best Thing On The Internet
By Alexis C. Madrigal

Tag Cloud of the following piece of writing (created at TagCrowd.com)

All these people are trying to kill email.

"E-mail is dead, or at least that’s what Silicon Valley is banking on," wrote Businessweek tech reporter Ashlee Vance.

There's the co-founder of Asana, the work software startup. Email has "become a counter-productivity tool,” Justin Rosenstein likes to say.

Slack, the superhot work chat tool, likes to brag that they've "saved the world from over 70,000,000 emails" (if you assume that every five Slack messages prevent one email from getting its wings).

And it's not just entrepreneurs with cloud software to sell. There are the young people, too, especially whatever we call the younger-than-Millennials.

Getting an email address was once a nerdy rite of passage for Gen-Xers arriving on college campuses. Now, the kids are waging a war of indifference on poor old email, culling the weak and infirm old-people technology. One American professor maintained that, to his students, "e-mail was as antiquated as the spellings 'chuse' and 'musick' in the works by Cotton Mather and Jonathan Edwards." The vice-chancellor of Exeter University claimed, "There is no point in emailing students any more." The youth appear to think there are better, faster, more exciting ways to communicate than stupid email.

Yet, despite all the prognosticators predicting it will—choose the violence level of your metaphor—go out of style, be put out to pasture, or taken out back and shot, email grinds on.

You can't kill email! It's the cockroach of the Internet, and I mean that as a compliment. This resilience is a good thing.

"There isn't much to sending or receiving email and that's sort of the point," observed Aaron Straup Cope, the Cooper-Hewitt Design Museum's Senior Engineer in Digital and Emerging Media. "The next time someone tells you email is 'dead,' try to imagine the cost of investing in their solution or the cost of giving up all the flexibility that email affords."

Email is actually a tremendous, decentralized, open platform on which new, innovative things can and have been built. In that way, email represents a different model from the closed ecosystems we see proliferating across our computers and devices.

Email is a refugee from the open, interoperable, less-controlled "web we lost." It's an exciting landscape of freedom amidst the walled gardens of social networking and messaging services.

Yes, email is exciting. Get excited!

For all the changes occurring around email, the experience of email itself has been transformed, too. Email is not dying, but it is being unbundled.

Because it developed early in the history of the commercial Internet, email served as a support structure for many other developments in the web's history. This has kept email vitally important, but the downside is that the average inbox in the second decade of the century had become clogged with cruft. Too many tasks were bolted on to email's simple protocols.

Looking back on these transitional years from the 2020s, email will appear to people as a grab bag of mismatched services.

Email was a newsfeed. With the proliferation of newsletters, email alerts, flash sale emails, and other email-delivered content, one's email client became a major site of media consumption. It was a feed as much as an inbox.

Email was one's passport and identity. Before Facebook became a true alternative for verifying one's identity on the web, the email address was how one accomplished serious things on the Internet. Want to verify a bank account? Email. Amazon? Email. Forums? Email. Even Facebook in the early days? Email. And it meant something where your email address was hosted. FirstName@YourLastName.com signaled you owned a domain. A Hotmail account might indicate you were a beginner and a WELL address connoted early Internet connectivity. For a time, Gmail addresses were a sign of sophistication. Now, both the functional and symbolic importance of email addresses is in decline. There are so many more ways to signal who we are online now.

Email was the primary means of direct social communication on the Internet. Email was how to send a message to someone, period. BBSs, chat rooms, and message boards have existed for as long as email, but email formed the private links between people that undergirded the public channels, which evolved before and with the web. Now, there are a lot of ways to reach someone on the net. There is one's phone, Facebook profile, Twitter account, LinkedIn, Instagram, Qik, WhatsApp, etc., etc. It's telling that in the mobile world, app developers want access to a user's phone's contact list, not her email connections.

Email was a digital package-delivery service. After FTP faded from popularity, but before Dropbox and Google Drive, email was the primary way to ship heavy digital documents around the Internet. The attachment was a key productivity tool for just about everyone, and it's hard to imagine an Internet without the ability to quickly append documents to a message. Needless to say, email is a less than ideal transmission or storage medium, relative to the new services.

Email was the primary mode of networked work communication. Most companies would have a hard time functioning without email, the French company Atos's successful email ban notwithstanding. And it's this last category of email service that so many companies are eager to reform. HootSuite's CEO laid out why in a "Fast Company" article in 2013: Email is, he said, unproductive, linear, not social, and paradoxically tends to create information silos. Plus, who doesn't want some enterprise budget? Many startups, tiring of or failing in the consumer space, need to pivot somewhere.

Looking at this list of email's many current uses, it is obvious that some of these tasks will leave its domain. Each person will get to choose whether they use email as their primary identity on the web. Work and simple social messaging will keep moving to other platforms, too. The same will be true of digital delivery, where many cloud-based solutions have already proved superior.

So, what will be left of the inbox, then?

I contend email might actually become what we thought it was: an electronic letter-writing platform.

My colleague Ian Bogost pointed out to me that we've used the metaphor of the mail to describe the kind of communication that goes on through these servers. But, in reality, email did not replace letters, but all classes of communications: phone calls, in-person encounters, memos, marketing pleas, etc.

The metaphor of electronic mail never fully fit how people use e-mail. But, now, perhaps it might. Email could become a home for the kinds of communications that come in the mail: letters from actual people, bills, personalized advertisements, and periodicals.

This change might be accelerated by services like Gmail's Priority Inbox, which sorts mail neatly (and automatically) into categories, or Unroll.me, which allows users to bundle incoming impersonal communications like newsletters and commercial offers into one easy custom publication.

That is to say, our inboxes are getting smarter and smarter. Serious tools are being built to help us direct and manage what was once just a chronological flow, which people dammed with inadequate organization systems hoping to survive the flood. (Remember all the folders in desktop email clients!)

It's worth noting that spam, which once threatened to overrun our inboxes, has been made invisible by more sophisticated email filtering. I received hundreds of spam emails yesterday, and yet I didn't see a single one because Gmail and my Atlantic email filtered them all neatly out of my main inbox. At the same time, the culture of botty spam spread to every other corner of the Internet. I see spam comments on every website and spam Facebook pages and spam Twitter accounts every day.

Email has gotten much smarter and easier to use, while retaining its ubiquity and interoperability. But there is no one company promoting Email (TM), so those changes have gone relatively unremarked upon.

But recall Hotmail in 1996 or Microsoft Outlook in 1999 or—and I know some nerds will hate me for saying this—Pine over a telnet connection in 1993. Compare them to Gmail today or Mailbox on an iPhone. The process of receiving email has gotten so much better, friendlier, and more sophisticated.

And one last thing.... This isn't something the originators of email ever could have imagined, but: Email does mobile really well.

While the mobile web is a rusting scrapheap of unreadable text, broken advertisements, and janky layouts, normal emails look great on phones! They are super lightweight, so they download quickly over any kind of connection, and the tools to forward or otherwise deal with them are built expertly and natively into our mobile devices.

All this to say, email has soaked up many of the great things about the current web. It's pretty. It's convenient. Algorithms work over the raw feed to simplify the flow of information. Email, generally, is mobile-friendly and renders beautifully on all devices. These are the things that the current generation of web companies strive to accomplish. And look at old email, doing all that effortlessly.

While email's continued evolution is significant, what it has retained from the old web sets it apart from the other pretty, convenient apps. Email is an open, interoperable protocol. Someone can use Google's service, spin up a server of her own, or send messages through Microsoft's enterprise software. And yet all of these people can communicate seamlessly. While various governments have done what they can to hassle or destroy anonymous email services in the post-Snowden world, email is one of the more defensible and private parts of the mainstream Internet experience, especially if one is willing to go through some extra security procedures.
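
That interoperability is concrete enough to show in a few lines. A minimal sketch using Python's standard library (the addresses and the "localhost" server are placeholders, not details from the article): any client that speaks SMTP can hand a message to any compliant server.

```python
# Email's openness in practice: the standard library alone can compose
# a message and hand it to any SMTP server -- Google's, Microsoft's,
# or one you run yourself.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.com"        # hypothetical addresses
msg["To"] = "bob@example.org"
msg["Subject"] = "Interoperability, demonstrated"
msg.set_content("Any server, any client, one protocol.")

with smtplib.SMTP("localhost") as server:   # substitute any SMTP host
    server.send_message(msg)
```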

Last, Silicon Valley startups seem to be able to offer the great experiences that they do because they centralize our information within their server farms. But email proves that this is not necessarily the case. Progress can come from much more distributed decision-making processes. The email protocol evolves based on the deliberations of the Internet Engineering Task Force, not by the fiat rule of a single company in Silicon Valley or New York.

And what's changing isn't a product that must be rolled out to all users, but an ecosystem that provides niches for all kinds of different emailers.

Perhaps the way, then, to recover some of the old web, before the dominance of Apple, Google, Amazon, and Facebook, isn't to build new competitors to those companies, but to redouble our use and support of good old email.

Email—yes, email—is one way forward for a less commercial, less centralized web, and the best thing is, this beautiful cockroach of a social network is already living in all of our homes.

Now, all we have to do is convince the kids that the real rebellion against the pressures of social media isn't to escape to the ephemerality of Snapchat, but to retreat to the private, relaxed confines of their email inboxes. Ω

[Alexis Madrigal is the senior editor of "The Atlantic," where he also oversees the Technology Channel. He's the author of Powering the Dream: The History and Promise of Green Technology (2011). Madrigal received a BA (magna cum laude, English) from Harvard University.]

Copyright © 2014 The Atlantic Monthly Group



Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Saturday, August 16, 2014

One Last Curtain Call: Robin Williams, RIP

In the aftermath of the news of the death of Robin Williams, this blogger was cruising along in his wheels while the voice-controlled sound system (the blogger's iPod via a USB connection) played Billy Joel's "The Entertainer" over the speakers. Billy Joel was singing about the challenge of staying atop the pop music charts, and later this blogger encountered Eags' farewell to Robin Williams. Williams faced a challenge comparable to that of Billy Joel's entertainer: both men confronted the Sisyphean task of creating hit films or hit songs. Robin Williams threw away countless insights during his improvisational riffs as a standup comedian. However, this blogger will never forget that Williams quoted Sigmund Freud as saying, "If it's not one thing, it's your mother." If this is the (fair & balanced) insight of our age, so be it.

[x NY Fishwrap]
Robin Williams, The Vulnerable Showman
By Timothy Egan

Tag Cloud of the following piece of writing (created at TagCrowd.com)
[x YouTube/Jetvane55 Channel]
"The Entertainer"
By Billy Joel

Every writer fears the blank page. Every comedian dreads silence at the end of a punch line. Every creative soul quivers at the prospect of brush touching canvas or fingers reaching for keyboards and finding — nothing.

In all the tributes to the propulsive force of genius that was Robin Williams, in all the helpful pivots from a celebrity’s suicide to the larger issue of depression that cripples millions, we may have forgotten one obvious thing about him. Williams was a showman — perhaps the most vulnerable of public persons. And in the digital age, when trolls can make a piñata of even the most gifted performers, the pressure on those on stage or in the arena has never been greater. Nor has the arc from triumph to tragedy been shorter.

In 2008, Williams was trying out new material at the Showbox in Seattle, a small venue. He was coming off a string of mediocre movies and unkind reviews, and was trying to dig his way out of a career hole. That night he did the work of 12 comedians, all the manic flights, the voices, the riffs, sweating through his shirt and gulping bottled water like a marathon runner. He was Springsteen in encore, Michael Jordan in double overtime.

Afterward, in his cramped dressing room, he appeared deflated, exhausted, spent and, as is often said of famous people when you meet them in real life, smaller than his screen self. We complimented him on the new stuff — he had killed, leaving the room in tears of laughter. “Really?” he asked in schoolboy hunger for a pat on the head. “Did you really like it?”

On stage, he owned the room. Backstage, he was lonely and needy and a little out of breath; laughter was his oxygen. Two years later, in a podcast interview with the comedian Marc Maron, he offered another glimpse of the pressure to create and remain fresh. “You bottom out,” he said. “People say, ‘You have an Academy Award.’ The Academy Award lasted about a week, then one week later people are going, ‘Hey, Mork.’ ”

Williams needed those nights in Seattle (he donated all the proceeds to a food bank, without making a fuss of it), because artists need room to fail. But I’m not sure that our era, in which everything is exposed, shared and composted in less time than it takes a cup of coffee to go cold, allows for these creative resuscitations anymore.

Look at the most savage reaction to Williams’s death. Rush Limbaugh said the suicide “fits a certain picture, or a certain image that the left has.” He said later he was talking about the “liberal media,” but he was doing what he always does: Finding the most heartless thing to say about a situation that requires a teaspoon of genuine empathy. On the Glenn Beck website for “The Blaze,” the hate flowed. “The way I look at it, that’s one less Hollywood lib-prog donor and voter,” one reader wrote in the comments. There were many others in the same vein. Harassment from Internet trolls forced the comedian’s daughter, Zelda Williams, to shut down her social media sites “maybe forever.”

We’ll never know if the bullies with public forums could drive Williams, who was 63, to severe depression. So much of chronic melancholy has more to do with clinical issues of the brain than immediate slights. And if Williams was indeed suffering from early stages of Parkinson’s disease, as his wife, Susan Schneider, announced Thursday, he could have sensed a coming diminishment of his talents.

Life events can trigger a downward episode. And Williams, as an artist, was highly sensitive to nuances of success or failure, and the accelerated cycles of those two sides in the 21st century.

He once said he heard “a little voice saying, ‘You’re garbage, you’re nothing.’ ” He did a riff, comic with a ring of haunting truth, about his GPS steering him over the Golden Gate Bridge. “I said: ‘Why? Have you seen my movies lately?’ ”

Ernest Hemingway, often called the greatest writer of the 20th century, realized that creativity has its own land mines. “That terrible mood of depression of whether it’s any good or not is known as the artist’s reward,” he said. Hemingway was emotionally wounded by the negative reviews of a novel published in 1950, “Across the River and Into the Trees.”

Were he around today, a tortured, possibly medicated Hemingway would probably be hiding under his boat in Cuba. He would certainly hope, as all creative types do, that the well would never run dry. But he might find the pressure to do something original too much in the modern age.

My plea here is for people to give the needed space to artists and performers to fail every now and then, and to understand how exposed someone feels when trying something new. The trolls, the Twitter executioners and the like should save their savagery for those who are famous for being famous.

“You’re only given one little spark of madness,” Williams said. “You mustn’t lose it.” Finding the strength to keep the spark from going dark, as it turns out, was probably the great struggle of his life. Ω

[Timothy Egan writes "Outposts," a column at the NY Fishwrap online. Egan — winner of both a Pulitzer Prize in 2001 as a member of a team of reporters who wrote the series "How Race Is Lived in America" and a National Book Award (The Worst Hard Time in 2006) — graduated from the University of Washington with a degree in journalism, and was awarded an honorary doctorate of humane letters by Whitman College in 2000 for his environmental writings. Egan's most recent book is The Big Burn: Teddy Roosevelt and the Fire that Saved America (2009).]

Copyright © 2014 The New York Times Company



Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves