Monday, April 30, 2012

The Sun, the Moon and Walmart

By HOMERO ARIDJIS

A CHILD in Mexico soon learns that corruption is a way of life, and that to get ahead in school, work and politics, “El que no transa, no avanza” — loosely, “You’re not going anywhere if you don’t cheat.”

When I was in junior high school, my history teacher sold us lottery tickets, promising that the more we bought, the higher our grades would be. The winning number, he said, would coincide with the National Lottery winner. I happened to buy that number and received the highest grade, but because he kept the tickets, I never got the money.

Years later, as president of an environmental activist organization called the Group of 100, I was offered visits to Las Vegas (chips provided), cars (drivers included), cash and even prostitutes in exchange for staying silent. But my most uncomfortable experience was in 1988, when I met with the secretary of Fisheries to protest the killing of dolphins by tuna fishers. He asked me, “What’s your problem?” “I don’t have any problems,” I replied. “How can I help you?” “Make the tuna fleet stop killing dolphins.” He reached for his checkbook. “Let’s talk money, how much do you want?”

So the news that Walmart may have paid $24 million in bribes for permits to open stores in Mexico was no surprise to me. When President Felipe Calderón declared he was “very indignant,” I thought of Claude Rains in Casablanca: “I’m shocked, shocked to find that gambling is going on in here!”

Walmart already had a history of controversial behavior in Mexico. Most notably, in November 2004, despite widespread opposition, the company opened a 72,000-square-foot store within the boundaries of the 2,000-year-old city of Teotihuacán, which features the Pyramids of the Sun and the Moon (“the place where men became gods” — or consumers?). Walmart has also built a supermarket on forested land in the resort town of Playa del Carmen, in Quintana Roo — though the permit for the building later turned out to have been granted for another site, on the island of Cozumel. The question now is who allows this, and in exchange for what?

Will the federal investigation discover how many Walmarts were built on the quicksands of corruption? Marcelo Ebrard, the mayor of Mexico City, is carrying out his own investigation, but considering that his brothers have been Walmart executives, I don’t have much hope that the truth will emerge. The other day I visited a Walmart, and one of the teenage packers, who are unsalaried and work for tips, told me that they had been forbidden to say anything to the press about their employer. They were told to consider themselves lucky to have a job at all.

In this country, corruption exists at all levels, from magnates to street vendors. It seems easier to get something done with a bribe than to fill out myriad forms and wait in lines to confront evasive civil servants. According to a recent study, companies shell out approximately 10 percent of their earnings to corrupt officials. In the last 30 years, the Mexican economy has lost more than $870 billion to corruption, crime and tax evasion.

The consequences of this corruption are clear. When devastating earthquakes hit Mexico City in 1985, an alarming number of shoddily constructed public buildings — schools, hospitals and government offices — were destroyed. Our school system has been hijacked by the politically powerful teachers’ union, and around 90 percent of the budget is eaten up by teachers’ salaries, though many on the payroll work for the union or hold political office instead of teaching.

Extortion and protection rackets flourish alongside drug trafficking. President Álvaro Obregón, who was assassinated in 1928, once said that “no general can resist a 50,000-peso cannon blast,” a precursor to today’s “plata o plomo” — silver or lead, the drug cartel’s offer to officials of a bribe or a bullet.

Clearly, putting an end to corruption — to kickbacks and nepotism, to crooked judges and policemen, to delinquent bureaucrats and drug lords — is Mexico’s greatest challenge. In 2000, when the left-of-center Institutional Revolutionary Party lost the presidency and its 71-year grip on power, there were hopes for reform, but it remains to be seen whether increased democratization will lead to lessened corruption.

This January, when a 341-foot-tall quartz-clad tower known as the Estela de Luz was inaugurated to commemorate 200 years of independence from Spain, Mr. Calderón called it “an emblem of a new era for Mexico.” And yet, the tower was finished 16 months late, at three times its planned cost. An investigation has begun; public servants have been charged with criminal offenses; protesters call it a monument to corruption.

The truth is, we have created a corrupt system that preys on both Mexicans and foreigners — how can we be outraged when an American company exploits it? At the same time, how can we hope for Mexicans to put an end to corruption when one of the most powerful and allegedly law-abiding companies in the United States gives in to the same temptations? As a former governor of Chihuahua once said, after being accused of corruption, “If we put everyone who’s corrupt in jail, who will close the door?”

Homero Aridjis, a former ambassador, is a poet, a novelist and the author of the anthology “Eyes to See Otherwise.” This essay was translated by Betty Ferber from the Spanish.

A version of this op-ed appeared in print on May 1, 2012, on page A25 of the New York edition with the headline: The Sun, the Moon and Walmart.
NYT

Clouds’ Effect on Climate Change Is Last Bastion for Dissenters

JUSTIN GILLIS

LAMONT, Okla. — For decades, a small group of scientific dissenters has been trying to shoot holes in the prevailing science of climate change, offering one reason after another why the outlook simply must be wrong.

Over time, nearly every one of their arguments has been knocked down by accumulating evidence, and polls say 97 percent of working climate scientists now see global warming as a serious risk.

Yet in recent years, the climate change skeptics have seized on one last argument that cannot be so readily dismissed. Their theory is that clouds will save us.

They acknowledge that the human release of greenhouse gases will cause the planet to warm. But they assert that clouds — which can either warm or cool the earth, depending on the type and location — will shift in such a way as to counter much of the expected temperature rise and preserve the equable climate on which civilization depends.

Their theory exploits the greatest remaining mystery in climate science, the difficulty that researchers have had in predicting how clouds will change. The scientific majority believes that clouds will most likely have a neutral effect or will even amplify the warming, perhaps strongly, but the lack of unambiguous proof has left room for dissent.

“Clouds really are the biggest uncertainty,” said Andrew E. Dessler, a climate researcher at Texas A&M. “If you listen to the credible climate skeptics, they’ve really pushed all their chips onto clouds.”

Richard S. Lindzen, a professor of meteorology at the Massachusetts Institute of Technology, is the leading proponent of the view that clouds will save the day. His stature in the field — he has been making seminal contributions to climate science since the 1960s — has amplified his influence.

Dr. Lindzen says the earth is not especially sensitive to greenhouse gases because clouds will react to counter them, and he believes he has identified a specific mechanism. On a warming planet, he says, less coverage by high clouds in the tropics will allow more heat to escape to space, countering the temperature increase.

His idea has drawn withering criticism from other scientists, who cite errors in his papers and say proof is lacking. Enough evidence is already in hand, they say, to rule out the powerful cooling effect from clouds that would be needed to offset the increase of greenhouse gases.

However, politicians looking for reasons not to tackle climate change have embraced Dr. Lindzen and other skeptics, elevating their role in the public debate.

Dr. Lindzen has obliged by assuring them that they are running no risks by refusing to enact emission limits. “There’s been a lot of scare stuff put out that just doesn’t make sense,” he said in an interview.

Some politicians have welcomed that message, regularly calling Dr. Lindzen and a handful of other contrarian scientists before Congressional committees. During a hearing before a House subcommittee, Representative Dana Rohrabacher, a California Republican and vocal global warming skeptic, complained that “in the scientific community, there are people trying to tell us that we have got to accept draconian changes in our way of life mandated by law because the CO2 that we are emitting is going to cause drastic consequences to the planet’s climate.”

He repeatedly sought affirmation from Dr. Lindzen for his views, and got it.

At gatherings of climate change skeptics on both sides of the Atlantic, Dr. Lindzen has been treated as a star. During a debate in Australia over carbon taxes, his work was cited repeatedly. When he appears at conferences of the Heartland Institute, the primary American organization pushing climate change skepticism, he is greeted by thunderous applause.

The scientific majority acknowledges that the lingering uncertainty about clouds plays into the hands of skeptics like Dr. Lindzen, but mainstream scientists say that he has gone beyond any reasonable reading of the evidence to provide a dangerous alibi for inaction.

Dr. Lindzen is “feeding upon an audience that wants to hear a certain message, and wants to hear it put forth by people with enough scientific reputation that it can be sustained for a while, even if it’s wrong science,” said Christopher S. Bretherton, an atmospheric researcher at the University of Washington. “I don’t think it’s intellectually honest at all.”

With climate policy nearly paralyzed in the United States, many other governments have also declined to take action, and worldwide emissions of greenhouse gases are soaring.

Natural Thermostats

Clouds are so familiar they are easy to take for granted, but scientists point out that they have an enormous effect on the climate.

The energy that drives life on earth arrives as sunlight. To remain at a steady temperature, the earth has to return the energy it receives back to space, primarily as heat. Clouds alter the energy flow in both directions.

On balance, in today’s climate, clouds cool the earth. Dense, low-lying clouds are responsible for most of that effect, because they reflect considerable sunlight back to space. Many high, thin clouds have the opposite influence, allowing incoming sunshine to pass through but effectively trapping heat that is trying to escape.

“It’s like putting a lid on a pot on the stove,” said Andreas Muhlbauer, a cloud researcher at the University of Washington.

Humans are perturbing the earth’s heat balance by releasing greenhouse gases. Chemists proved in the 19th century that these gases, especially the carbon dioxide that results from burning fossil fuels, work like an invisible blanket in the atmosphere, blocking some heat that is attempting to escape to space. In the mid-20th century, as it became clear how fast carbon dioxide levels were rising, some scientists began to predict a warming of the planet. But they also realized that an exact forecast was difficult for several reasons, especially the question of how clouds would react.

Researchers are virtually certain the amount of water vapor in the atmosphere will rise with temperature, and evidence suggests this is already happening. But that does not say much about the type or location of clouds that will condense from the vapor.

Scientists use sophisticated computer programs to forecast future climate, but the computers are not yet powerful enough to predict the behavior of individual clouds across the whole earth over a century, which forces the researchers to use rough approximations.

The most elaborate computer programs have agreed on a broad conclusion: clouds are not likely to change enough to offset the bulk of the human-caused warming. Some of the analyses predict that clouds could actually amplify the warming trend sharply through several mechanisms, including a reduction of some of the low clouds that reflect a lot of sunlight back to space. Other computer analyses foresee a largely neutral effect. The result is a big spread in forecasts of future temperature, one that scientists have not been able to narrow much in 30 years of effort.

The earth’s surface has already warmed about 1.4 degrees Fahrenheit since the Industrial Revolution, most of that in the last 40 years. Modest as it sounds, it is an average for the whole planet, representing an enormous addition of heat. An even larger amount is being absorbed by the oceans. The increase has caused some of the world’s land ice to melt and the oceans to rise.

By midcentury, the level of greenhouse gases in the atmosphere is expected to double compared with the value that prevailed before the Industrial Revolution. At the low end, computers predict that the earth could warm in response by another 2 degrees Fahrenheit. The likelier figure, the analyses say, is 4 degrees. At the high end of projections, the warming could exceed 8 degrees. In all possible outcomes, the warming over land would be roughly twice the global average, and the warming in the Arctic greater still.
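Read as arithmetic, those figures are estimates of “climate sensitivity,” the equilibrium warming produced by each doubling of greenhouse-gas concentration. A minimal back-of-the-envelope sketch (my framing, assuming the standard logarithmic dependence of warming on concentration, which the article itself does not spell out):

    % S = climate sensitivity: equilibrium warming per doubling of the
    % greenhouse-gas concentration C relative to its preindustrial value C_0.
    \Delta T \;\approx\; S \,\log_2\!\left(\frac{C}{C_0}\right)
    % For the doubling described above, C/C_0 = 2, so \Delta T \approx S:
    % about 2 degrees Fahrenheit at the low end, 4 as the likelier figure,
    % and more than 8 at the high end of the projections.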

Even in the low projection, many scientists say, the damage could be substantial. In the high projection, some polar regions could heat up by 20 or 25 degrees Fahrenheit — more than enough, over centuries or longer, to melt the Greenland ice sheet, raising sea level by a catastrophic 20 feet or more. Vast changes in rainfall, heat waves and other weather patterns would most likely accompany such a large warming.

“The big damages come if the climate sensitivity to greenhouse gases turns out to be high,” said Raymond T. Pierrehumbert, a climate scientist at the University of Chicago. “Then it’s not a bullet headed at us, but a thermonuclear warhead.”

A major goal of climate research is to improve the way clouds are represented in the computer analyses, which should narrow the range of predicted temperatures. And some of the most important data that researchers need to do so are streaming from a hilltop in rural Oklahoma, near the town of Lamont, where the Department of Energy runs the world’s largest facility for measuring the behavior of clouds.

Accuracy is an overriding goal there. One recent morning, Patrick Dowell, a technician, worked his way across the hill with a rag in hand, carefully dusting dozens of instruments pointed at the sky. When his fingers knocked one gauge off kilter, tiny motors whirred and the device snapped back to position, as though annoyed with him.

“When you clean it,” Mr. Dowell said, “it kind of fights you.”

The questions that scientists still need to answer are voluminous. For instance, they want a better idea of how clouds form at a microscopic scale, how their behavior varies under different atmospheric conditions, and how sensitive they are to higher temperatures.

Recently, $30 million worth of new radars have been installed in Oklahoma and at other research facilities, promising a better view of the innards of clouds. Satellites are also supplying better data, and theories of the atmosphere are improving. “I feel like we’re on our way to doing a lot better,” said Anthony D. Del Genio, a researcher with NASA.

But the problem of how clouds will behave in a future climate is not yet solved — making the unheralded field of cloud research one of the most important pursuits of modern science.

A Feedback Loop?

Among the many climate skeptics who plaster the Internet with their writings, hardly any have serious credentials in the physics of the atmosphere. But a handful of contrarian scientists do. The most influential is Dr. Lindzen.

Dr. Lindzen accepts the elementary tenets of climate science. He agrees that carbon dioxide is a greenhouse gas, calling people who dispute that point “nutty.” He agrees that the level of it is rising because of human activity and that this should warm the climate.

But for more than a decade, Dr. Lindzen has said that when surface temperature increases, the columns of moist air rising in the tropics will rain out more of their moisture, leaving less available to be thrown off as ice, which forms the thin, high clouds known as cirrus. Just like greenhouse gases, these cirrus clouds act to reduce the cooling of the earth, and a decrease of them would counteract the increase of greenhouse gases.

Dr. Lindzen calls his mechanism the iris effect, after the iris of the eye, which opens at night to let in more light. In this case, the earth’s “iris” of high clouds would be opening to let more heat escape.

When Dr. Lindzen first published this theory, in 2001, he said it was supported by satellite records over the Pacific Ocean. But other researchers quickly published work saying that the methods he had used to analyze the data were flawed and that his theory made assumptions that were inconsistent with known facts. Using what they considered more realistic assumptions, they said they could not verify his claims.

Today, most mainstream researchers consider Dr. Lindzen’s theory discredited. He does not agree, but he has had difficulty establishing his case in the scientific literature. Dr. Lindzen published a paper in 2009 offering more support for his case that the earth’s sensitivity to greenhouse gases is low, but once again scientists identified errors, including a failure to account for known inaccuracies in satellite measurements.

Dr. Lindzen acknowledged that the 2009 paper contained “some stupid mistakes” in his handling of the satellite data. “It was just embarrassing,” he said in an interview. “The technical details of satellite measurements are really sort of grotesque.”

Last year, he tried offering more evidence for his case, but after reviewers for a prestigious American journal criticized the paper, Dr. Lindzen published it in a little-known Korean journal.

Dr. Lindzen blames groupthink among climate scientists for his publication difficulties, saying the majority is determined to suppress any dissenting views. They, in turn, contend that he routinely misrepresents the work of other researchers.

“If I’m right, we’ll have saved money” by avoiding measures to limit emissions, Dr. Lindzen said in the interview. “If I’m wrong, we’ll know it in 50 years and can do something.”

But mainstream scientists counter that society’s impulse to wait only heightens the risks.

Ultimately, as the climate continues warming and more data accumulate, it will become obvious how clouds are reacting. But that could take decades, scientists say, and if the answer turns out to be that catastrophe looms, it would most likely be too late. By then, they say, the atmosphere would contain so much carbon dioxide as to make a substantial warming inevitable, and the gas would not return to a normal level for thousands of years.

Researchers are trying various shortcuts to get a rapid answer. One of those is to use short-term natural variations, such as the El Niño cycle, to see how clouds react to higher ocean temperatures. Dr. Dessler, the Texas A&M researcher, did that recently. His analysis, while not definitive, offered some evidence that clouds will exacerbate the long-term planetary warming, just as many of the computer programs have predicted. Most, but not all, papers relying on the historical cloud record have come to similar conclusions.

In his Congressional appearances, speeches and popular writings, Dr. Lindzen offers little hint of how thin the published science supporting his position is. Instead, starting from his disputed iris mechanism, he makes what many of his colleagues see as an unwarranted leap of logic, professing near-certainty that climate change is not a problem society needs to worry about.

“You have politicians who are being told if they question this, they are anti-science,” Dr. Lindzen said. “We are trying to tell them, no, questioning is never anti-science.”

Among the experts most offended by Dr. Lindzen’s stance are many of his colleagues in the M.I.T. atmospheric sciences department, some of whom were once as skeptical as he about climate change.

“Even if there were no political implications, it just seems deeply unprofessional and irresponsible to look at this and say, ‘We’re sure it’s not a problem,’ ” said Kerry A. Emanuel, another M.I.T. scientist. “It’s a special kind of risk, because it’s a risk to the collective civilization.”

NYT

Simons Foundation Chooses U.C. Berkeley for Computing Center - NYTimes.com

Cardenas to Lead PEMEX


Mexico’s Oil Monopoly Chafes Under Regulatory Scrutiny - NYTimes.com

10 Reasons the Government Should Not Regulate the Internet

In light of recent controversies regarding bills like the Stop Online Piracy Act (SOPA) and the Preventing Real Online Threats to Economic Creativity and Theft of Intellectual Property Act (PROTECT IP Act), Americans are having more discussions about the implications of a government-controlled internet. Already a reality in many parts of the world, regulated and heavily censored internet activity seems more of a possibility than ever for the United States as well. Here are ten reasons why governments should not regulate the internet.

  1. To Protect the First Amendment – One of the most cherished rights granted to Americans, the right to free speech and freedom of the press, is protected by the First Amendment of the Constitution. Regulating and censoring online content would be in direct opposition to the Amendment.
  2. Encouraging Entrepreneurial Activity – An open internet also encourages another beloved aspect of the American Dream: the ability to create our own fortunes. The web allows entrepreneurs to fill any one of an endless array of niches, which stimulates economic activity.
  3. Facilitating Innovation – The internet as Americans know it today provides a variety of platforms for exploring emerging technology and improving upon it, keeping the nation competitive in innovation and the development of new fields.
  4. Complications of Regulating Legitimate Sites Under Sweeping Legislation – Broadly worded legislation could sweep up legitimate sites, causing them to become lost in the shuffle of “objectionable” sites and depriving users of their potentially valuable information.
  5. Maintaining Citizens’ Right to Privacy – In our post-9/11 world, the concept of a citizen’s right to privacy has changed significantly. The Patriot Act and other similar bills have already increased the amount of surveillance the public endures; regulating the internet would be another step on a very slippery slope.
  6. “Offensive” is Arbitrary – The freedom of religion and the ability to make our own choices are key parts of the American cultural identity; what one person considers offensive may not be questionable in the least to another. In the event of a regulated internet, who would make the final call on web content and its level of offensiveness?
  7. Protecting Educational Value of the Web – While there are certainly dangers lurking in the darker corners of the internet, the vast stores of knowledge that can be accessed outweigh them greatly. Changing the functionality of the web could quite possibly make it more difficult to access educational material in an attempt to censor more controversial content.
  8. Preventing the Increase of Government Spending – The creation of a regulated internet would require an enormous amount of manpower in surveillance alone. The cost of that manpower, paired with the money that would have to be spent on creating filters and sifting through an almost infinite amount of information, would be staggering.
  9. It Could Fan the Flames of Civil Unrest – The outrage of the Egyptian people at their government’s disabling of the internet during a period of political upheaval should serve as a very strong example of why the government should not interfere with the web. An already-disillusioned populace can very quickly become mutinous when its ability to interact with the outside world is taken away.
  10. Savvy Hackers Will Defeat the System Anyway – If groups like Anonymous have proved anything, it’s that a keen mind and a determination to access information will inevitably lead to a back-door solution. Hackers would still be able to override the system to see the same content they do now; however, an already miserably overpopulated prison system would be immensely burdened by the influx of “criminals.”

These are only a few of the reasons why the government should not attempt to censor or filter the internet; like the proverbial iceberg, the bulk of the argument lies beneath the surface of what the average citizen sees.

Taken From Internet Service

Sunday, April 29, 2012

Wal-Mart Bribery Scandal Complicates U.S. Expansion Plans - NYTimes.com

For Businesses in China, a Minefield of Bribery Risks - NYTimes.com

220 Youngsters Fall Ill From Children's Day Party - NYTimes.com

Connecting the Dots

The two notes below do connect.

Do you see how?

UCSB Student Arrested for I.V. Foot Patrol Arson Attack - The Santa Barbara Independent

by TYLER HAYDEN

Santa Barbara detectives have arrested a 23-year-old UCSB senior for the New Year's Day firebombing of the Isla Vista Foot Patrol station. Authorities say Keith Keiper hurled a Molotov cocktail at the front door of the 6504 Trigo Road building at around 11:30 p.m. The station sustained minor damage, but no one was hurt.

Keiper, who lives on El Colegio Road and has been at UCSB for two years, was booked into County Jail on January 25 and faces a felony charge of arson of an inhabited dwelling. His bail is set at $250,000. It's believed he acted alone.

Keiper’s alleged attack came around six weeks after an unknown suspect threw two Molotov cocktails at the Foot Patrol building on November 15, 2011. That attack also caused minor damage to the front of the building and smashed the windshield of an unoccupied Sheriff’s patrol car. The November case remains unsolved, and detectives are working to determine whether the incidents are connected, which is why they won’t go into detail about how Keiper was identified and caught. The FBI’s $5,000 reward for information that leads to the arrest and conviction of the person responsible for the November firebombing remains in effect.

A transfer student from Mira Costa College in Oceanside, Keiper was previously arrested in October during an Occupy Santa Barbara protest in De la Guerra Plaza. He was charged with two misdemeanors: staying in the park past curfew and refusing to sign a citation. The outcome of that case is unclear at this time.

Anyone with information helpful to the arson investigations is urged to call the Sheriff’s office at (805) 681-4100, the Sheriff’s anonymous tip line at (805) 681-4171, or the FBI at (805) 642-3995, a Sheriff’s Department spokesperson said.

Independent

Wasting Our Minds

PAUL KRUGMAN

In Spain, the unemployment rate among workers under 25 is more than 50 percent. In Ireland almost a third of the young are unemployed. Here in America, youth unemployment is “only” 16.5 percent, which is still terrible — but things could be worse.

And sure enough, many politicians are doing all they can to guarantee that things will, in fact, get worse. We’ve been hearing a lot about the war on women, which is real enough. But there’s also a war on the young, which is just as real even if it’s better disguised. And it’s doing immense harm, not just to the young, but to the nation’s future.

Let’s start with some advice Mitt Romney gave to college students during an appearance last week. After denouncing President Obama’s “divisiveness,” the candidate told his audience, “Take a shot, go for it, take a risk, get the education, borrow money if you have to from your parents, start a business.”

The first thing you notice here is, of course, the Romney touch — the distinctive lack of empathy for those who weren’t born into affluent families, who can’t rely on the Bank of Mom and Dad to finance their ambitions. But the rest of the remark is just as bad in its own way.

I mean, “get the education”? And pay for it how? Tuition at public colleges and universities has soared, in part thanks to sharp reductions in state aid. Mr. Romney isn’t proposing anything that would fix that; he is, however, a strong supporter of the Ryan budget plan, which would drastically cut federal student aid, causing roughly a million students to lose their Pell grants.

So how, exactly, are young people from cash-strapped families supposed to “get the education”? Back in March Mr. Romney had the answer: Find the college “that has a little lower price where you can get a good education.” Good luck with that. But I guess it’s divisive to point out that Mr. Romney’s prescriptions are useless for Americans who weren’t born with his advantages.

There is, however, a larger issue: even if students do manage, somehow, to “get the education,” which they do all too often by incurring a lot of debt, they’ll be graduating into an economy that doesn’t seem to want them.

You’ve probably heard lots about how workers with college degrees are faring better in this slump than those with only a high school education, which is true. But the story is far less encouraging if you focus not on middle-aged Americans with degrees but on recent graduates. Unemployment among recent graduates has soared; so has part-time work, presumably reflecting the inability of graduates to find full-time jobs. Perhaps most telling, earnings have plunged even among those graduates working full time — a sign that many have been forced to take jobs that make no use of their education.

College graduates, then, are taking it on the chin thanks to the weak economy. And research tells us that the price isn’t temporary: students who graduate into a bad economy never recover the lost ground. Instead, their earnings are depressed for life.

What the young need most of all, then, is a better job market. People like Mr. Romney claim that they have the recipe for job creation: slash taxes on corporations and the rich, slash spending on public services and the poor. But we now have plenty of evidence on how these policies actually work in a depressed economy — and they clearly destroy jobs rather than create them.

For as you look at the economic devastation in Europe, you should bear in mind that some of the countries experiencing the worst devastation have been doing everything American conservatives say we should do here. Not long ago, conservatives gushed over Ireland’s economic policies, especially its low corporate tax rate; the Heritage Foundation used to give it higher marks for “economic freedom” than any other Western nation. When things went bad, Ireland once again received lavish praise, this time for its harsh spending cuts, which were supposed to inspire confidence and lead to quick recovery.

And now, as I said, almost a third of Ireland’s young can’t find jobs.

What should we do to help America’s young? Basically, the opposite of what Mr. Romney and his friends want. We should be expanding student aid, not slashing it. And we should reverse the de facto austerity policies that are holding back the U.S. economy — the unprecedented cutbacks at the state and local level, which have been hitting education especially hard.

Yes, such a policy reversal would cost money. But refusing to spend that money is foolish and shortsighted even in purely fiscal terms. Remember, the young aren’t just America’s future; they’re the future of the tax base, too.

A mind is a terrible thing to waste; wasting the minds of a whole generation is even more terrible. Let’s stop doing it.

NYT

The Man Who Makes the Future: Wired Icon Marc Andreessen | Epicenter | Wired.com

By Chris Anderson

He’s not a household name like Gates, Jobs, or Zuckerberg. His face isn’t known to millions. But during his remarkable 20-year career, no one has done more than Marc Andreessen to change the way we communicate. At 22, he invented Mosaic, the first graphical web browser—an innovation that is perhaps more responsible than any other for popularizing the Internet and bringing it into hundreds of millions of homes. He cofounded Netscape and took it public in a massive (for that time) stock offering that helped catalyze the dotcom boom. He started Loudcloud, a visionary service to bring cloud computing to business clients. And more recently, as a venture capitalist, he has backed an astonishing array of web 2.0 companies, from Twitter to Skype to Groupon to Instagram to Airbnb.

As Wired prepares for its 20th anniversary issue in January 2013, we are launching a series called Wired Icons: in-depth interviews with our biggest heroes, the tenacious pioneers who built digital culture and evangelized it to the world over the past two decades. There’s not a more fitting choice for our first icon than Andreessen—a man whose career, which almost exactly spans the history of our magazine, is a lesson in how to spot the future. In an interview at Andreessen’s office in Palo Alto, California, Wired editor in chief Chris Anderson talked with him about technological transformation, and about the five big ideas that Andreessen had before everyone else.

idea one

1992

Everyone Will Have the Web

As a 22-year-old undergraduate at the University of Illinois, Andreessen developed Mosaic, the first graphical browser for the World Wide Web, then brought the technology to Silicon Valley and cofounded Netscape. By August 1995, Netscape had gone public and was worth $2.9 billion.

Chris Anderson: At 22, you’re a random kid from small-town Wisconsin, working at a supercomputer center at the University of Illinois. How were you able to see the future of the web so clearly?

Marc Andreessen: It was probably the juxtaposition of the two—being from a small town and having access to a supercomputer. Where I grew up, we had the three TV networks, maybe two radio stations, no cable TV. We still had a long-distance party line in our neighborhood, so you could listen to all your neighbors’ phone calls. We had a very small public library, and the nearest bookstore was an hour away. So I came from an environment where I was starved for information, starved for connection.

Anderson: And then at Illinois, you found the Internet.

Andreessen: Right, which could make information so abundant. The future was much easier to see if you were on a college campus. Remember, it was feast or famine in those days. Trying to do dialup was miserable. If you were a trained computer scientist and you put in a tremendous amount of effort, you could do it: You could go get a Netcom account, you could set up your own TCP/IP stack, you could get a 2,400-baud modem. But at the university, you were on the Internet in a way that was actually very modern even by today’s standards. At the time, we had a T3 line—45 megabits, which is actually still considered broadband. Sure, that was for the entire campus, and it cost them $35,000 a month! But we had an actual broadband experience. And it convinced me that everybody was going to want to be connected, to have that experience for themselves.

Anderson: But the notion that everyday consumers would want it over dialup—that was pretty radical.

Andreessen: True. At the time, there were four presumptions made against dialup Internet access, and after Mosaic took off I could see that they were all wrong. The first presumption was that dialup flat-out wouldn’t work.

Anderson: That it would always be too slow, too clunky.

Andreessen: Right. The second presumption was that it was too expensive—and that it would always stay as expensive as it was. The third presumption was that people wouldn’t be smart enough to figure out how to get it working at home. But the most interesting presumption was the fourth one: that consumers wouldn’t want it, that they wouldn’t know what to do with it.

Anderson: Your big idea, really, was that they would want it—and they’d eventually get it.

Andreessen: Yeah. It was essentially knocking through all four of those assumptions. I thought it was obvious that everyone would want this and that they would be able to do lots of things with it. And I thought it was obvious that the technology would advance to a point where you wouldn’t need a computer science degree to do it.

Anderson: Which was the one problem you could do something about.

Andreessen: Well, actually, I think that Mosaic helped address a few of the problems at once. It did make the Internet much easier to use. But making it easier to use also made it more apparent how to use it, all the different things that people could do with it—which then made people want it more. And it’s also clear that we helped drive faster bandwidth: By creating the demand, we helped increase the supply.

Anderson: I remember the first time I interviewed you, back in 1995 when I was at The Economist. I thought we were going to talk about, you know, TCP/IP and HTTP. But you wanted to talk about globalization, about international trade. You were already thinking about the Internet in macroeconomic terms. Have you always seen the world that way, or was there an awakening somewhere in the process?

Andreessen: The awakening probably happened for me during that period. Once you understand that everybody’s going to get connected, a lot of things follow from that. If everybody gets the Internet, they end up with a browser, so they look at web pages—but they can also leave comments, create web pages. They can even host their own server! So not only is everybody consuming, they can also produce. And once you get instantaneous communication with everybody, you have economic activity that’s far more advanced, far more liquid, far more distributed than ever before.

Anderson: Looking back on the browser after 20 years, what are the biggest surprises? What did you not expect?

Andreessen: Number one, that it worked. The big turning point for me was when Mosaic worked. I was like, wait a minute, you can actually change the world!

Anderson: But you got that surprise early on. Mosaic was a huge success within 12 months.

Andreessen: Yeah, that’s true. But the second surprise is that it has kept working. Notwithstanding certain cover stories in certain magazines, I think the browser is as relevant today as it’s ever been.

idea two

1995

The Browser Will Be the Operating System

During the browser wars with Microsoft, when Netscape Navigator and Internet Explorer vied for domination on the PC desktop, Andreessen prophesied a future where computers would dispense with feature-heavy operating systems entirely. Instead, we would use a browser to run programs over the network. Netscape lost its battle with Microsoft, but in key respects Andreessen’s vision has come to pass. Google Chrome OS, for example, is a fully browser-based operating system, while most of our favorite applications, from email to social networks, now live entirely on the network.

Anderson (left) and Andreessen in Palo Alto in January 2012.
Photo: Nigel Parry

Anderson: A quote of yours that I’ve always loved is that Netscape would render Windows “a poorly debugged set of device drivers.”

Andreessen: In fairness, you have to give credit for that quote to Bob Metcalfe, the 3Com founder.

Anderson: Oh, it wasn’t you? It’s always attributed to you.

Andreessen: I used to say it, but it was a retweet on my part. [Laughs.] But yes, the idea we had then, which seems obvious today, was to lift the computing off of each user’s device and perform it in the network instead. It’s something I think is inherent in the technology—what some thinkers refer to as the “technological imperative.” It’s as if the technology wants it to happen.

Anderson: As in Stewart Brand’s famous formulation that “information wants to be free.”

Andreessen: Right. Technology is like water; it wants to find its level. So if you hook up your computer to a billion other computers, it just makes sense that a tremendous share of the resources you want to use—not only text or media but processing power too—will be located remotely. People tend to think of the web as a way to get information or perhaps as a place to carry out ecommerce. But really, the web is about accessing applications. Think of each website as an application, and every single click, every single interaction with that site, is an opportunity to be on the very latest version of that application. Once you start thinking in terms of networks, it just doesn’t make much sense to prefer local apps, with downloadable, installable code that needs to be constantly updated.

“We could have built a social element into Mosaic. But back then the Internet was all about anonymity.”

Anderson: Assuming you have enough bandwidth.

Andreessen: That’s the very big if in this equation. If you have infinite network bandwidth, if you have an infinitely fast network, then this is what the technology wants. But we’re not yet in a world of infinite speed, so that’s why we have mobile apps and PC and Mac software on laptops and phones. That’s why there are still Xbox games on discs. That’s why everything isn’t in the cloud. But eventually the technology wants it all to be up there.

Anderson: Back in 1995, Netscape began pursuing this vision by enabling the browser to do more.

Andreessen: We knew that you would need some processing to stay on the computer, so we invented JavaScript. And then we also catalyzed Java, which enabled far more sophisticated applications in the network, by building support for Java into the browser. The basic idea, which remains in force today, is that you do some computation on the device, but you want the server application to be in control of that. And the whole process is completely invisible to the user.

Anderson: Unlike with Mosaic, where your original ideas were proven correct within a year, it seems like this idea has taken 15 years to come to fruition.

Andreessen: Right. And only with the arrival of tablets and smartphones, really. If you draw a pie chart of all the personal computing devices in use, smartphones and tablets are now over 50 percent and growing rapidly. It took a lot longer than we expected, but these really are the network computers. Now, in an ironic twist of fate, the devices do have all these local apps …

Anderson: Well, exactly.

Andreessen: … but I can go on an iPad or an Android smartphone or a Linux tablet and I can access all the same websites and all the same applications and all the same services that I get on my desktop.

Anderson: But we do still have lots of desktops and laptops out there. Let me ask you in 2012: Do you still think that the web and browsers will render computer operating systems a “poorly debugged set of device drivers”?

Andreessen: I will pull a full Henry Kissinger and answer a different question. The application model of the future is the web application model. The apps will live on the web. Mobile apps on platforms like iOS and Android are a temporary step along the way toward the full mobile web. Now, that temporary step may last for a very long time. Because the networks are still limited. But if you grant me the very big assumption that at some point we will have ubiquitous, high-speed wireless connectivity, then in time everything will end up back in the web model. Because the technology wants it to work that way.

idea three

1999

Web Businesses Will Live in the Cloud

In September 1999, Andreessen cofounded Loudcloud, a firm that would enable whole businesses to move into the cloud; it would host and manage their web services and software so that companies wouldn’t need to run any servers at all. That business didn’t last—despite an IPO in 2001, Loudcloud changed its name and business model in 2002 and was sold to Hewlett-Packard in 2007. But its vision has been vindicated in the phenomenal rise of Amazon Web Services, which serves as the backbone for hundreds of thousands of businesses online.

Andreessen on the cover of Time in 1996.
Courtesy of: Time Magazine

Anderson: With the name Loudcloud, did you make the first use of the word cloud in this context—as a place where applications run on the network?

Andreessen: It was a common term in the telecom business. AT&T used it to talk about their Centrex service, which—going way back here—took all the hassles of switching phone calls out of the individual enterprise and turned it into a service. So our idea with Loudcloud was to offer a similar proposition, but for software. When we first announced it, I described it as Silicon Valley Power & Light.

Anderson: Tech companies would use it as a utility.

Andreessen: Exactly: the software power grid. We actually used the electrical metaphor more than the telecom metaphor. When electricity first came to factories, every factory had its own generator. But eventually that didn’t make any sense, because everyone could draw electricity off the grid. At the height of the first dotcom boom, we saw the exact same thing happening in Silicon Valley. You’d raise $20 million of venture capital, and then you’d have to turn around and write $5 million checks to Oracle, Sun, EMC, and Cisco just to build out your server farm. It was literally like everybody building their own electrical generator over and over again.

Anderson: You were the first company to provide software as a service.

Andreessen: I would say we were the first cloud provider in the modern sense of the term. Our pitch was, you should be able to buy all this software by the drink, instead of having to shell out for the bottle up front. By capitalizing on economies of scale, Loudcloud could provide higher levels of service than you could get in-house, and a startup could get its product to market almost instantaneously. It could spend its time and energy building the actual product instead of trying to figure out how to host it and keep it live. That was the pitch.

Anderson: It didn’t really work.

Andreessen: Well, it worked beautifully right up to the point when all the startups went bankrupt, and then all our big clients decided they didn’t have to worry about competing with the startups anymore. After that, it went completely sideways. Literally every other company we were competing with went bankrupt; we were the only one that got through it. So we just went back to basics and we said, OK, we couldn’t make it work as a service provider, but we think we can make it work as a software company, selling the back-end software to manage big networks of servers. We changed our name to Opsware. That ultimately worked, as a business.

Anderson: You were acquired by HP for $1.6 billion.

Andreessen: That whole transition happened during an unfun time in the tech economy. Everybody went through a crisis of confidence between 2002 and 2006. Up and down Sand Hill Road, VCs would refuse to fund consumer Internet companies, because it had been decided that those simply weren’t going to work.

Anderson: Looking back, it’s somewhat ironic that you started with the right name, Loudcloud, but abandoned it. Now the world has come back to cloud. What did it take?

Andreessen: In retrospect, we were five or six years too early. Besides the rebound in the startup economy, there have also been two huge developments in server technology. The first is commoditization: We were running on expensive Sun servers, but now you can buy Linux servers at a fraction of the cost. The second is virtualization, which makes managing the servers and apportioning services to clients far easier than was possible back in 1999. And that’s why Amazon’s cloud service has been so magical. It’s the same core concept—but with supercheap hardware, which makes the economics far more attractive for everybody, and with virtualization, which makes the entire environment far more adaptable.

idea four

2004

Everything Will Be Social

In 2004, when very few consumer Internet companies were getting funded, Andreessen cofounded Ning, a service to let groups of people create their own social apps. It was a modest success, but “social” has become just as ubiquitous as he predicted—increasingly, what we buy, what we listen to, even our search results are influenced by our friends’ tastes and choices. And most of the successful startups in this arena, from Facebook to Groupon to Instagram, have Andreessen as an investor or board member.

Andreessen on the cover of Wired in 2000.

Anderson: Your bet on Ning hasn’t paid off as handsomely as your previous two companies did, but you did bet correctly on a future where social would be knit into everything. What was your thinking around that venture?

Andreessen: In the 1990s, lots of people talked about Moore’s law, which predicts that processing speed will increase exponentially, and Metcalfe’s law, which holds that a network gets exponentially more valuable as nodes are added. But I was also fascinated with Reed’s law. That’s a mathematical property about the forming of groups—for any group of size n, the number of subgroups that can be assembled is 2^n.

Anderson: So the bigger the network gets, the more subnetworks that will want to organize themselves—a richer and more varied set of social groups.

Andreessen: We see this playing out in retail, where ecommerce is becoming a group activity. Long before Ning, actually, in 1999, I invested in a company called Mobshop, which was Reed’s law applied to commerce, through group sales. It didn’t work back then. But 10 years later, I invested in Groupon, because I could see it was the same idea—finding, on the fly, a group of people who want the same product and using their massive numbers to command steep discounts. The Internet lets you aggregate groups in a way that was never previously possible.
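To put rough numbers on the two laws (a sketch of mine, not anything from the interview), compare Metcalfe-style pairwise connections with Reed-style subgroup counts as a network grows:

    # Illustrative sketch: Metcalfe's law values a network by its pairwise
    # links, which grow roughly like n^2; Reed's law counts the possible
    # subgroups of n people, which grow like 2^n.
    for n in (5, 10, 20, 30):
        pairs = n * (n - 1) // 2        # Metcalfe: distinct pairs
        subgroups = 2**n - n - 1        # Reed: nontrivial subgroups (size >= 2)
        print(f"n={n:>2}  pairs={pairs:>6}  subgroups={subgroups:>13,}")

At n = 30 the subgroup count already exceeds a billion, which is the point: group formation, not raw connectivity, is the steeper curve.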

Anderson: What changed between 1999 and 2009 that made Groupon—and Facebook, and all these other profitable consumer Internet companies—possible?

Andreessen: A big part of it was broadband. Ironically, it was during the nuclear winter, from 2000 to 2005, that broadband happened. DSL got built out, cable modems got built out. So then you started to have 100, 200, 300 million people worldwide on broadband. Also, the international market started to really open up: China, India, Indonesia, Thailand, Turkey. Still, though, starting a new consumer Internet company in 2004 was a radical act. [Laughs.]

Anderson: As I recall, your initial concept for Ning was to let groups create their own Craigslists, effectively—trusted marketplaces.

Andreessen: Yeah, at the time we had this concept of “social apps.” Friendster hadn’t worked, MySpace was just getting a little bit of traction, and Facebook was still at Harvard. What we knew worked were focused applications: Craigslist, eBay, Monster. So our idea was to bring social into these domains, in the form of apps that groups could run for themselves: their own job boards, their own selling marketplaces, and so on. Then later we sort of abstracted that up into the idea of building your own social network.

Anderson: In retrospect, it seems like social is another dimension of the Internet that was there from the beginning—as if the technology wanted it to happen.

Andreessen: I often wonder if we should have built social into the browser from the start. The idea that you want to be connected with your friends, your social circle, the people you work with—we could have built that into Mosaic. But at the time, the culture on the Internet revolved around anonymity and pseudonyms.

Anderson: You built in cookies so that sites could remember each user.

Andreessen: But we didn’t build in the concept of identity. I think that might have freaked people out.

Anderson: It might still.

Andreessen: Yeah, I’m not sure at the time people were ready for it. I don’t think it was an accident that it took, you know, 13 or 14 years after we introduced the browser for people to say, “I want my identity to be a standard part of this.”
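For readers who don’t remember the early web, here is a minimal sketch of the distinction Andreessen is drawing (my illustration, using Python’s standard http.cookies module, not Netscape’s actual code): a cookie is an opaque token that a site issues and the browser echoes back, so a site can remember a browser without any claim about who the person is.

    # Cookie-based "memory" without identity: the token names a session,
    # not a person.
    from http.cookies import SimpleCookie

    # Server response: hand the browser an opaque session token.
    response = SimpleCookie()
    response["session"] = "a1b2c3"
    print(response.output())         # Set-Cookie: session=a1b2c3

    # A later request: the browser echoes the token back unchanged.
    request = SimpleCookie("session=a1b2c3")
    print(request["session"].value)  # a1b2c3 -- a token, not a name

Identity in the Facebook sense is exactly the layer this mechanism leaves out.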

Anderson: And it took Mark Zuckerberg to figure out how to make it pay off.

Andreessen: It was really a generational shift—a group of young entrepreneurs, including Andrew Mason and Mark Zuckerberg, who weren’t burned by the dotcom boom and bust. I came to Ning with all these psychic scars. They just looked at the Internet and said, “This stuff is really cool, and we want to build something new.”

Anderson: No cynicism.

Andreessen: One of the first times Zuckerberg and I got together, in 2005 or 2006, he stopped me in the middle of conversation and asked: “What did Netscape do?” And I said, “What do you mean, what did Netscape do?” And he was like, “Dude, I was in junior high. I wasn’t paying attention.”

Anderson: How big can Facebook get?

Andreessen: We don’t really know. The Internet is still the Wild West. Eight years ago, Facebook was just a gleam in a Harvard sophomore’s eye. It is still possible to build these things from scratch. So I can’t tell you what the top five platforms are going to be even five years from now. I’m pretty sure that Facebook, Apple, and Google will be on that list. But I don’t know what the other two will be. Maybe Microsoft comes roaring back with Windows Phone. Maybe Twitter evolves and gets to scale. HP is planning to open source its WebOS—maybe it’s that! Or maybe it’s something we haven’t even heard of, a company that’s just getting funded right now.

idea five

2009

Software Will Eat the World

In 2009, Andreessen and his longtime business partner, Loudcloud cofounder Ben Horowitz, created a venture capital firm called Andreessen Horowitz. Their vision today: an economy transformed by the rise of computing. Andreessen believes that enormous technology companies can now be built around the use of hyperintelligent software to revolutionize whole sectors of the economy, from retail to real estate to health care.

Anderson: Take us back to when you were forming Andreessen Horowitz. You’d been an investor for some time already, but now you decided to formalize it. So what was the guiding philosophy?

Andreessen: Our vision was to be a throwback: a Silicon Valley venture capital firm. We were going to be a single-office firm, focusing primarily on companies in the US and then, within that, primarily companies in Silicon Valley. And—this is the crucial thing—we’re only going to invest in companies based on computer science, no matter what sector their business is in. We are looking to invest in what we call primary technology companies.

Anderson: Give me an example.

Andreessen: Airbnb—the startup that lets you rent out your home or a room in your home. Ten years ago you would never have said you could build Airbnb, which is looking to transform real estate with a new primary technology. But now the market’s big enough.

Anderson: I guess I’m struggling a little bit with “primary technology.” How does Airbnb qualify?

Andreessen: Airbnb makes its money in real estate. But everything inside of how Airbnb runs has much more in common with Facebook or Google or Microsoft or Oracle than with any real estate company. What makes Airbnb function is its software engine, which matches customers to properties, sets prices, flags potential problems. It’s a tech company—a company where, if the developers all quit tomorrow, you’d have to shut the company down. To us, that’s a good thing.

Anderson: I’m probably a little bit elitist in this, but I think a “primary technology” would need to involve, you know, some fundamental new insight in code, some proprietary set of algorithms.

Andreessen: Oh, I agree. I think Airbnb is building a software technology that is equivalent in complexity, power, and importance to an operating system. It’s just applied to a sector of the economy instead. This is the basic insight: Software is eating the world. The Internet has now spread to the size and scope where it has become economically viable to build huge companies in single domains, where their basic, world-changing innovation is entirely in the code. We’ve especially seen it in retail—with companies like Groupon, Zappos, Fab.

Anderson: And these aren’t copycats, or me-toos, but fundamentally new insights in software?

Andreessen: Yes, absolutely. I have another theory that I call the missing campus puzzle. When you drive down Highway 101 through Silicon Valley, you pass the Oracle campus and then the Google campus and then the Cisco campus. And some people think, wow, they’re so big. But what I think is, I’ve been driving for close to an hour—why haven’t I passed a hundred more campuses? Why is there all this open space?

Anderson: What’s your answer?

Andreessen: Think about what it has meant to build a primary technology company up until now. In order to harness a large enough market, to attract the right kind of technical talent, to pay them adequately, to grow the company to critical mass—until now that’s only been possible with companies that are providing tools for all sectors, not just specific sectors. Technology has been just a slice of the economy. We’ve been making the building blocks to get us to today, when technology is poised to remake the whole economy.

Anderson: What categories are next?

Andreessen: The next stops, I believe, are education, financial services, health care, and then ultimately government—the huge swaths of the economy that historically have not been addressable by technology, that haven’t been amenable to the entrance of Silicon Valley-style software companies. But increasingly I think they’re going to be.

Anderson: Today, so much software is instantiated in hardware—Apple being a great example. As software “eats the world,” do you think that we’ll see fewer companies like Apple that deliver their revolutionary software in the form of shiny objects?

Andreessen: Yes, but I’m not a purist. In fact, we’re funding some hardware companies. Let me give two examples. The first is Jawbone—they make portable speakers, noise-canceling headsets, and now a wristband that tracks your daily movements. Jawbone is an Apple-style company, in that it has genius in hardware and marketing as well as in software design. But if you took away the software, you’d have nothing.

Anderson: What’s the second?

Andreessen: The other one is Lytro, which is making light-field cameras—this amazing new technology that lets you capture the whole depth of field in three dimensions and then focus and compose your picture later.

Anderson: It’s a computer science company.

Andreessen: Yeah, it’s computer science. But it’s going to ship as a camera. And before I met Ren Ng, the founder, if you had asked me if we’d ever back a camera company, I would have said you’re smoking crack.

Anderson: There’s an app for that!

Andreessen: And Kodak filed for bankruptcy. But what Ren has is a completely different approach to photography. There’s a lot of hardware engineering that goes into it, but 90 percent of the intellectual property is software. So we look at Lytro and we look at Jawbone and we see software expressed as hardware—highly specialized hardware that will be hard to clone.

Anderson: One last question for you. Software eating the world is dematerialization, in some sense: These sectors of the economy get transformed into coding problems. But I’m wondering whether there is an economic path by which dematerialization leads to demonetization—where the efficiency of the software sucks economic value out of the whole system. Take Craigslist, for example: For every million that Craigslist made, it took a billion out of the newspaper industry. If you transform these big, inefficient industries in such a way that the value all accrues to a smaller software company, what’s the broad economic impact?

Andreessen: My bet is that the positive effects will far outweigh the negatives. Think about Borders, the bookstore chain. Amazon drove Borders out of business, and the vast majority of Borders employees are not qualified to work at Amazon. That’s an actual, full-on problem. But should Amazon have been prevented from doing that? In my view, no. Because it’s so much better to live in a world where that happened, it’s so much better to live in a world where Amazon is ascendant. I told you that my childhood bookstore was something you had to drive an hour to get to. But it was a Waldenbooks, and it was, like, 800 square feet, and it sold almost nothing that you would actually want to read. It’s such a better world where we have Amazon, where everything is universally available. They’re a force for human progress and culture and economics in a way that Borders never was.

Anderson: So it’s creative destruction.

Andreessen: When Milton Friedman was asked about this kind of thing, he said: Human wants and needs are infinite, and so there will always be new industries, there will always be new professions. This is the great sweep of economic history. When the vast majority of the workforce was in agriculture, it was impossible to imagine what all those people would do if they didn’t have agricultural jobs. Then a hundred years later the vast majority of the workforce was in industrial jobs, and we were similarly blind: It was impossible to imagine what workers would do without those jobs. Now the majority are in information jobs. If the computers get smart enough, then what? I’ll tell you: The then what is whatever we invent next.

Chris Anderson (canderson@wired.com) is editor in chief of Wired. He wrote about the death of the web in issue 18.09.

Wired

The Man Who Makes the Future: Wired Icon Marc Andreessen

David Peel Sang Once for Lennon, Now for Occupy Wall Street

LENNON GLASSES ON: David Peel performs in Union Square.

By COREY KILGANNON

“A lot of people come up to me and say, ‘Peel, I thought you were dead,’ ” said David Peel, who is not only alive but basically living the same life he led 40 years ago, when he was a muse for the Yippie and hippie movements in New York City.

At 68, Mr. Peel is past conventional rock ’n’ roll retirement age, but why retire if you can rock? He still has his acoustic guitar, his three chords and his festive, irreverent marijuana shout-music.

And lately he has found new relevance, and new listeners. He was a regular last fall at the Occupy Wall Street movement’s Zuccotti Park encampment, and now shows up in Union Square to jam with the Occupy protesters there.

“Not many of them know who I was, because it’s a new generation,” he said. But the old-timers like Kenny Vena, 72, certainly do. “Man, David Peel — you were an East Village icon,” he said upon spotting Mr. Peel strumming in the park one recent weekday. “ ‘The Marijuana Song,’ right?”

Mr. Peel smiled and broke into “I Like Marijuana,” one of the classics by his band, David Peel & the Lower East Side. With his songs about smoking pot and blasting the establishment, he became a fixture at counterculture marches and demonstrations beginning in the late 1960s.

Now he has written songs for the Occupy movement, including “Up Against the Wall Street” and “Mic Check, No Check.”

“If you want to win the movement, you must have music, the way John Lennon gave us ‘Give Peace a Chance’ for the hippie movement,” he said as the drum circle banged away.

Whoops — we had almost gone too far without letting Mr. Peel invoke his friendship with John Lennon. His business card bears a picture of him with Lennon and Yoko Ono, and he still wears his Lennon-style sunglasses. He became so associated with Lennon that in the early 1970s, the Federal Bureau of Investigation mistakenly put Mr. Peel’s head shot on an intelligence advisory about Mr. Lennon.

Mr. Peel was born David Rosario and grew up in Midwood, Brooklyn. He served in the Army in the mid-1960s and then began living downtown and playing the guitar in parks and on the street.

Lennon first saw Mr. Peel playing in Washington Square Park in 1971 and was introduced to him by the founders of the Yippie movement, Jerry Rubin and Abbie Hoffman. They all immediately began singing along to Mr. Peel’s song “The Pope Smokes Dope,” until the scene was broken up by the police.

Lennon began performing with Mr. Peel and signed him to Apple Records for the album “The Pope Smokes Dope,” which Mr. Peel said was widely banned, largely for its title.

At the time, Lennon was quoted as saying that Mr. Peel “writes beautiful songs” and that if he ever wanted to write non-controversial music, he could write hits as “easy as pie.”

He acknowledged the criticism that Mr. Peel “can’t sing, or he can’t really play,” and called Mr. Peel’s music simple, but added, “Picasso spent 40 years trying to get as simple as that.”

Mr. Peel has been high on that endorsement to this day. Regarding his three-chord approach, he said: “Technique is not talent. A jazz musician with a million chords can go all over the place and end up no place.”

Mr. Peel is unmarried with no children and lives in a rent-stabilized apartment on Avenue B. He survives on modest royalties, small gig fees and the sale of old and current records.

At Union Square, dressed in East Village black, with his long black hair pulled back in a ponytail, he made the rounds greeting the other old hippies — including Aron Kay, 62, the Yippie Pieman — and then began playing one of his classics: “I’m Proud to Be a New York City Hippie.”

Then he was walking along St. Marks Place. He wore his Lennon sunglasses and handed out fliers for an upcoming marijuana rally. When asked about his pot intake these days, he flashed a grin and offered a typically ambiguous Peel-ism: “Lose dope, lose hope.”

He said he planned to continue to sing on the streets and in the parks downtown “until the day I drop dead and go to rock ’n’ roll heaven.”

E-mail: character@nytimes.com

NYT

Lennon, Ono, and Peel

President Obama, Warrior in Chief

By PETER L. BERGEN

THE president who won the Nobel Peace Prize less than nine months after his inauguration has turned out to be one of the most militarily aggressive American leaders in decades.

Liberals helped to elect Barack Obama in part because of his opposition to the Iraq war, and probably don’t celebrate all of the president’s many military accomplishments. But they are sizable.

Mr. Obama decimated Al Qaeda’s leadership. He overthrew the Libyan dictator. He ramped up drone attacks in Pakistan, waged effective covert wars in Yemen and Somalia and authorized a threefold increase in the number of American troops in Afghanistan. He became the first president to authorize the assassination of a United States citizen: Anwar al-Awlaki, who was born in New Mexico, played an operational role in Al Qaeda and was killed in an American drone strike in Yemen. And, of course, Mr. Obama ordered and oversaw the Navy SEAL raid that killed Osama bin Laden.

Ironically, the president used the Nobel Peace Prize acceptance speech as an occasion to articulate his philosophy of war. He made it very clear that his opposition to the Iraq war didn’t mean that he embraced pacifism — not at all.

“I face the world as it is, and cannot stand idle in the face of threats to the American people,” the president told the Nobel committee — and the world. “For make no mistake: Evil does exist in the world. A nonviolent movement could not have halted Hitler’s armies. Negotiations cannot convince Al Qaeda’s leaders to lay down their arms. To say that force is sometimes necessary is not a call to cynicism — it is a recognition of history, the imperfections of man, and the limits of reason.”

If those on the left were listening, they didn’t seem to care. The left, which had loudly condemned George W. Bush for waterboarding and due process violations at Guantánamo, has been relatively quiet as the Obama administration, acting as judge and executioner, has ordered more than 250 drone strikes in Pakistan since 2009, strikes in which at least 1,400 lives were lost.

Mr. Obama’s readiness to use force — and his military record — have won him little support from the right. Despite countervailing evidence, most conservatives view the president as some kind of peacenik. From both the right and left, there has been a continuing, dramatic cognitive disconnect between Mr. Obama’s record and the public perception of his leadership: despite his demonstrated willingness to use force, neither side regards him as the warrior president he is.

Mr. Obama had firsthand experience of military efficacy and precision early in his presidency. Three months after his inauguration, Somali pirates held Richard Phillips, the American captain of the Maersk Alabama, hostage in the Indian Ocean. Authorized to use deadly force if Captain Phillips’s life was in danger, Navy SEALs parachuted to a nearby warship, and three sharpshooters, firing at night from a distance of 100 feet, killed the pirates without harming Captain Phillips.

“GREAT job,” Mr. Obama told William H. McRaven, then the vice admiral who oversaw the daring rescue mission and who would later oversee the Bin Laden operation in Abbottabad, Pakistan. The SEAL rescue was the president’s first high-stakes decision involving the secretive counterterrorism units. But he would rely increasingly upon their capacities in the coming years.

Soon after Mr. Obama took office, he reframed the fight against terrorism. Liberals wanted to cast anti-terrorism efforts in terms of global law enforcement rather than war. The president didn’t choose this path; instead he declared “war against Al Qaeda and its allies.” In switching rhetorical gears, Mr. Obama abandoned Mr. Bush’s vague and open-ended fight against terrorism in favor of a war with particular, violent jihadists.

The rhetorical shift had dramatic — non-rhetorical — consequences. Compare Mr. Obama’s use of drone strikes with that of his predecessor. During the Bush administration, there was an American drone attack in Pakistan every 43 days; during the first two years of the Obama administration, there was a drone strike there every four days. And two years into his presidency, the Nobel Peace Prize-winning president was engaged in conflicts in six Muslim countries: Iraq, Afghanistan, Pakistan, Somalia, Yemen and Libya. The man who went to Washington as an “antiwar” president was more Teddy Roosevelt than Jimmy Carter.

Consider the comparative speed with which Mr. Obama and his Democratic predecessor, Bill Clinton, opted for military intervention in various conflicts. Hesitant, perhaps, because of the Black Hawk Down disaster in Somalia in 1993, Mr. Clinton did nothing to stop what, at least by 1994, was evidently a genocidal campaign in Rwanda. And Bosnia was on the verge of genocidal collapse before Mr. Clinton decided — after two years of dithering — to intervene in that troubled area in the mid-1990s. In contrast, it took Mr. Obama only a few weeks to act in Libya in the spring of 2011 when Col. Muammar el-Qaddafi threatened to massacre large portions of the Libyan population. Mr. Obama went to the United Nations and NATO and set in motion the military campaign — roundly criticized by the left and the right — that toppled the Libyan dictator.

None of this should have surprised anyone who had paid close attention to what Mr. Obama said about the use of force during his presidential campaign. In an August 2007 speech on national security, he put the nation — and the world — on alert: “If we have actionable intelligence about high-value terrorist targets and President Musharraf won’t act, we will,” he said, referring to Pervez Musharraf, then president of Pakistan. He added, “I will not hesitate to use military force to take out terrorists who pose a direct threat to America.”

That’s about as clear a statement as can be. But Republicans and Democrats blasted Mr. Obama with equal intensity for suggesting that he would authorize unilateral military action in Pakistan to kill Bin Laden or other Al Qaeda leaders.

Hillary Rodham Clinton, then a Democratic rival for the presidential nomination, said, “I think it is a very big mistake to telegraph that.” Mitt Romney, vying for the Republican nomination, accused Mr. Obama of being a “Dr. Strangelove” who is “going to bomb our allies.” John McCain piled on: “Will we risk the confused leadership of an inexperienced candidate who once suggested bombing our ally, Pakistan?”

Once in office, Mr. Obama signed off on a large increase in the number of C.I.A. officers on the ground in Pakistan and an intensified campaign of drone warfare there; he also embraced the use of drones or covert military units in places like Somalia and Yemen, where the United States was not engaged in traditional land warfare. (Mr. Bush, who first deployed C.I.A.-directed drones, did not do so on the scale that Mr. Obama did; and Mr. Obama, of course, had the benefit of significantly improved, more precise, drone technology.)

Nothing dramatizes Mr. Obama’s willingness to use hard power so well as his decision to send Navy SEAL Team 6 to Abbottabad, to take out Bin Laden. Had this risky operation failed, it would most likely have severely damaged Mr. Obama’s presidency — and legacy.

Mr. Obama’s advisers worried that a botched raid would disturb — or destroy — the United States-Pakistan relationship, which would make the war in Afghanistan more difficult to wage since so much American matériel had to travel through Pakistani airspace or ground routes.

The risks were enormous. A helicopter-borne assault could easily turn into a replay of the debacle in the Iranian desert in 1980, when Mr. Carter authorized a mission to rescue the American hostages in Tehran that ended with eight American servicemen dead and zero hostages freed.

SOME of Mr. Obama’s top advisers worried that the intelligence suggesting that Bin Laden was in the Abbottabad compound was circumstantial and much too flimsy to justify the risks involved. The deputy C.I.A. director, Michael J. Morell, had told the president that in terms of available data points, “the circumstantial evidence of Iraq having W.M.D. was actually stronger than evidence that Bin Laden was living in the Abbottabad compound.”

At the final National Security Council meeting to consider options connected to Bin Laden’s possible presence in the Abbottabad compound, Mr. Obama gave each of his advisers an opportunity to speak. When the president asked, “Where are you on this? What do you think?” so many officials prefaced their views by saying, “Mr. President, this is a very hard call,” that laughter erupted, providing a few moments of levity in the otherwise tense, two-hour meeting.

Asked his view, Vice President Joseph R. Biden Jr. said, “Mr. President, my suggestion is, don’t go.”

For the president, however, the potential rewards clearly outweighed the risks involved. “Even though I thought it was only 50-50 that Bin Laden was there, I thought it was worth us taking a shot,” he said. “And I said to myself that if we have a good chance of not completely defeating but badly disabling Al Qaeda, then it was worth both the political risks as well as the risks to our men.”

The following morning, on Friday, April 29, at 8:20 a.m. in the White House Diplomatic Reception Room, Mr. Obama gathered his key national security advisers in a semicircle around him and told them simply, “It’s a go.”

Three days later Bin Laden was dead.

The Bin Laden mission will surely resurface in the coming election; the campaign has already produced a 17-minute documentary that showcases the raid. This, combined with Mr. Obama’s record of military accomplishment, will make it hard for Mitt Romney to convince voters that Mr. Obama is a typical, weak-on-national-security Democrat. And, if Mr. Romney tries to portray Mr. Obama this way, he will very likely trap himself into calling for a war with Iran, which many Americans oppose.

Mr. Obama plans to be in Chicago for the NATO summit meeting in late May, just as the election campaign heats up. He’ll arrive knowing that the United States and Afghanistan have already agreed to a long-term strategic partnership that is likely to involve thousands of American soldiers in Afghanistan, in advisory roles, after combat operations end in 2014. (The details of the agreement are still being negotiated.) This should inoculate the president from would-be Romney charges that he is “abandoning” Afghanistan.

None of this suggests that Mr. Obama is trigger-happy or that, when considering the use of force, he is more likely to trust his gut than counsel provided during structured, often lengthy, deliberations with his National Security Council and other advisers. In instances in which the risks seem too great (military action against Iran) or the payoff too murky (some form of military intervention in Syria), Mr. Obama has repeatedly held America’s fire.

This said, it is clear that he has completely shaken the “Vietnam syndrome” that provided a lens through which a generation of Democratic leaders viewed military action. Still, the American public and chattering classes continue to regard the president as a thinker, not an actor; a negotiator, not a fighter.

What accounts for the strange, persistent cognitive dissonance about this president and his relation to military force? Does it stem from the campaign in which Mrs. Clinton repeatedly critiqued Mr. Obama for his stated willingness to negotiate with Iran and Cuba? Or is it because he can never quite shake the deliberative tone and mien of the constitutional law professor that he once was? Or because of his early opposition to the Iraq war? Whatever the causes, the president has embraced SEAL Team 6 rather than Code Pink, yet many continue to see him as the negotiator in chief rather than the warrior in chief that he actually is.

Peter L. Bergen is a director of the New America Foundation and the author of the forthcoming book “Manhunt: The Ten-Year Search for Bin Laden — From 9/11 to Abbottabad.”

NYT
