Monday, October 31, 2011
Decoding the Brain’s Cacophony
By BENEDICT CAREY
Published: October 31, 2011
ST. HELENA, Calif. — The scientists exchanged one last look and held their breath.
Everything was ready. The electrode was in place, threaded between the two hemispheres of a living cat’s brain; the instruments were tuned to pick up the chatter passing from one half to the other. The only thing left was to listen for that electronic whisper, the brain’s own internal code.
The amplifier hissed — the three scientists expectantly leaning closer — and out it came, loud and clear.
“We all live in a yellow submarine, yellow submarine, yellow submarine ....”
“The Beatles’ song! We somehow picked up the frequency of a radio station,” recalled Michael S. Gazzaniga, chuckling at the 45-year-old memory. “The brain’s secret code. Yeah, right!”
Dr. Gazzaniga, 71, now a professor of psychology at the University of California, Santa Barbara, is best known for a dazzling series of studies that revealed the brain’s split personality, the division of labor between its left and right hemispheres. But he is perhaps next best known for telling stories, many of them about blown experiments, dumb questions and other blunders during his nearly half-century career at the top of his field.
Now, in lectures and a new book, he is spelling out another kind of cautionary tale — a serious one, about the uses of neuroscience in society, particularly in the courtroom.
Brain science “will eventually begin to influence how the public views justice and responsibility,” Dr. Gazzaniga said at a recent conference here sponsored by the Edge Foundation.
And there is no guarantee, he added, that its influence will be a good one.
For one thing, brain-scanning technology is not ready for prime time in the legal system; it provides less information than people presume.
For another, new knowledge about neural processes is raising important questions about human responsibility. Scientists now know that the brain runs largely on autopilot; it acts first and asks questions later, often explaining behavior after the fact. So if much of behavior is automatic, then how responsible are people for their actions?
Who’s driving this submarine, anyway?
In his new book, “Who’s in Charge? Free Will and the Science of the Brain,” being published this month by HarperCollins, Dr. Gazzaniga (pronounced ga-ZAHN-a-ga) argues that the answer is hidden in plain sight. It’s a matter of knowing where to look.
The Split Brain
He began thinking seriously about the nature of responsibility only after many years of goofing off.
Mike Gazzaniga grew up in Glendale, Calif., exploring the open country east of Los Angeles and running occasional experiments in his garage, often with the help of his father, a prominent surgeon. It was fun; the experiments were real attempts to understand biochemistry; and even after joining the Alpha Delta Phi fraternity at Dartmouth (inspiration for the movie “Animal House”), he made time between parties and pranks to track who was doing what in his chosen field, brain science.
In particular, he began to follow studies at the California Institute of Technology suggesting that in animals, developing nerve cells are coded to congregate in specific areas in the brain. This work was captivating for two reasons.
First, it seemed to contradict common wisdom at the time, which held that specific brain functions like memory were widely — and uniformly — distributed in the brain, not concentrated in discrete regions.
Second, his girlfriend was due to take a summer job right there near Caltech.
He decided to write a letter to the director of the program, the eminent neurobiologist Roger Wolcott Sperry (emphasizing reason No. 1). Could Dr. Sperry use a summer intern? “He said sure,” Dr. Gazzaniga said. “I always tell students, ‘Go ahead and write directly to the person you want to study with; you just never know.’ ”
At Caltech that summer after his junior year, he glimpsed his future. He learned about so-called split-brain patients, people with severe epilepsy who had surgery cutting the connections between their left and right hemispheres. The surgery drastically reduced seizures but seemed to leave people otherwise unaffected.
Back at Dartmouth, he couldn’t stop thinking about it: Totally unaffected? Combing the literature, he found that the best attempt to detect an effect had found no changes in thinking or perception among 26 patients who had had the surgery at the University of Rochester.
Could that be possible? Mr. Gazzaniga was so eager to test the patients’ perceptions himself that he wrote another letter, this time to the surgeon — and got permission to do so.
“It’s spring break, I get all my gear together, I get all the way over there, and the guy changes his mind,” Dr. Gazzaniga said. “Like, ‘Hey, buddy, go home!’ ”
After graduating, he headed straight for Caltech.
“It wasn’t just ambition, it was something else — he was gutsy,” said Mitch Glickstein, who was in Dr. Sperry’s lab at the time and is completing a book, “Neuroscience: A Historical Introduction.” “Here’s this junior in college, he knows all about the split-brain patients, and he’s ready to do original research. At 20 years old.”
Caltech in those days was like a frat house for Nobel Prize contenders. Here’s Richard Feynman, the physicist, parking himself in the lab unannounced and making wisecracks about the experiments. There’s Dr. Sperry, annoyed, wondering how to one-up Dr. Feynman. One afternoon Dr. Sperry’s young student scrambled out into the hallway on all fours after an escaped lab animal and nearly kneecapped Linus Pauling, the eminent chemist. (“Why don’t you try anesthetizing a bowl of jelly instead?” Dr. Pauling remarked icily.)
And then there were the experiments, each one a snapshot into the dark box of the brain. In the early 1960s, Dr. Gazzaniga, then a graduate student, teamed with Dr. Sperry and Dr. Joseph Bogen, a brain surgeon, to publish a string of reports that dramatically demonstrated hemispheric specialization in humans.
The researchers devised a way to flash a picture of a bicycle to the right hemisphere alone. When split-brain patients were asked what they saw, they replied, “Nothing”: Because of the severed connection, the left hemisphere, where language is centered, got no visual input and no information from the right hemisphere. So the right hemisphere — which “saw” the bike — had no language to name it.
But here was the kicker: The right hemisphere could direct the hand it controls to draw the bicycle.
In other studies, the three scientists showed that the right hemisphere could also identify objects by touch, correctly selecting, say, a toothbrush or a spoon by feel after seeing the image of one.
The implications were soon clear. The left hemisphere was the intellectual, the wordsmith; it could be severed from the right without loss of I.Q. The right side was the artist, the visual-spatial expert.
The findings demolished the theory that specific functions were widely and uniformly supported in the brain. They also put “left brain/right brain” into the common language, as shorthand for types of skills and types of people. Still, in a field defined by incremental, often arcane advances, the Caltech team had achieved a moon shot.
Dr. Gazzaniga, now all of 25, could write his own ticket. He soon had a grant for a study to record the electronic chatter between the two hemispheres in the brain of a cat.
The Interpreter
The Beatles song that surged through the receiver in that experiment provided Dr. Gazzaniga with something almost as valuable as insight: a good story. Yet it also served as a rude reminder that he and his colleagues were missing something important in their assumptions about the brain.
“The question, ultimately, was why?” Dr. Gazzaniga said. “Why, if we have these separate systems, is it that the brain has a sense of unity?”
Even as he built his early triumph into a career, moving from Caltech to U.C. Santa Barbara and eventually to Dartmouth, with several stops along the way, the same question hung in the air, without a satisfactory answer. In the late 1970s, with the psychologist and linguist George A. Miller, he founded the field of cognitive neuroscience, a marriage of psychology and biology aimed at solving just such puzzles.
It didn’t happen, at least not quickly. In the decades to follow, brain scientists found that the left brain-right brain split is only the most obvious division of labor; in fact, the brain contains a swarm of specialized modules, each performing a special skill — calculating a distance, parsing a voice tone — and all of them running at the same time, communicating in widely distributed networks, often across hemispheres.
In short, the brain sustains a sense of unity not just in the presence of its left and right co-pilots. It does so amid a cacophony of competing voices, the neural equivalent of open outcry at the Chicago Board of Trade.
How?
It turned out, yet again, that people who’d had the split-brain surgery helped provide an answer. Dr. Gazzaniga, now at Dartmouth, performed more of his signature experiments — this time with an added twist. In one study, for instance, he and Joseph LeDoux, then a graduate student, showed a patient two pictures: The man’s left hemisphere saw a chicken claw; his right saw a snow scene. Afterward, the man chose the most appropriate matches from an array of pictures visible to both hemispheres. He chose a chicken to go with the claw, and a shovel to go with the snow. So far, so good.
But then Dr. Gazzaniga asked him why he chose those items — and struck gold. The man had a ready answer for one choice: The chicken goes with the claw. His left hemisphere had seen the claw, after all. Yet it had not seen the picture of the snow, only the shovel. Looking down at the picture of the shovel, the man said, “And you need a shovel to clean out the chicken shed.”
The left hemisphere was just concocting an explanation, Dr. Gazzaniga said. In studies in the 1980s and ’90s, he and others showed that the pattern was consistent: The left hemisphere takes what information it has and delivers a coherent tale to conscious awareness. It happens continually in daily life, and most everyone has caught himself or herself in the act — overhearing a fragment of gossip, for instance, and filling in the blanks with assumptions.
The brain’s cacophony of competing voices feels coherent because some module or network somewhere in the left hemisphere is providing a running narration. “It only took me 25 years to ask the right question to figure it out,” Dr. Gazzaniga said.
“One of the toughest things in any science, but especially in neuroscience, is to weed out the ideas that are really pleasing but unencumbered by truth,” said Thomas Carew, former president of the Society for Neuroscience and dean of the New York University School of Arts and Sciences. “Mike Gazzaniga is one of those in the field who’s been able to do that.”
Dr. Gazzaniga decided to call the left-brain narrating system “the interpreter.” The storyteller found the storyteller.
Emergent Properties
Knowing the breed well, he also understood its power. The interpreter creates the illusion of a meaningful script, as well as a coherent self. Working on the fly, it furiously reconstructs not only what happened but why, inserting motives here, intentions there — based on limited, sometimes flawed information.
One implication of this is a familiar staple of psychotherapy and literature: We are not who we think we are. We narrate our lives, shading every last detail, and even changing the script retrospectively, depending on the event, most of the time subconsciously. The storyteller never stops, except perhaps during deep sleep.
But another implication has to do with responsibility. If our sense of control is built on an unreliable account from automatic brain processes, how much control do we really have? Are there thresholds of responsibility, for instance, that can be determined by studying neural circuits? Dr. Gazzaniga and his wife, Charlotte, raised six children, so like any parents they had to determine levels of responsibility on the fly, just to get someone to set the table.
Yet questions like these became increasingly difficult to ignore for Dr. Gazzaniga, as he took on a more prominent role advising policy makers on the applications of brain science. He was appointed to a Congressional technology panel in 1991; in 2002, he took a position on the President’s Council on Bioethics. And in 2007, he became the founding director of the John D. and Catherine T. MacArthur Foundation’s Research Network on Law and Neuroscience, which tracks and evaluates applications in the legal system.
There, in particular, brain science has had a growing impact. In recent years lawyers have begun to present brain images as evidence, usually to mitigate responsibility for a crime or to test the veracity of testimony, as in a polygraph; increasingly, those images have been admitted. And more are coming: In imaging studies, for instance, neuroscientists have identified cortical areas that are highly active when people suppress impulses or other behaviors.
But there are clear shortcomings in the application of each of these methods in courtrooms. Brain images are snapshots, for one thing; they capture a brain state at only one moment in time and say nothing about its function before or after. For another, the images vary widely among people with healthy brains — that is, a “high” level of activity in one person may be normal in another. Can brain science tell exactly where automatic processes end and self-directed, “responsible” ones begin?
Not now and not likely ever, Dr. Gazzaniga argues in his book. Social constructs like good judgment and free will are even further removed, and trying to define them in terms of biological processes is, in the end, a fool’s game.
“My contention is that, ultimately, responsibility is a contract between two people rather than a property of the brain, and determinism has no meaning in this context,” he writes in “Who’s in Charge?”
Like generosity and pettiness, like love and suspiciousness, responsibility is what he calls a “strongly emergent” property — a property that, though derived from biological mechanisms, is fundamentally distinct and obeys different laws, as do ice and water.
Dr. Gazzaniga is not the first scientist to make this case. It is far from a settled matter, in part because researchers do not yet have a complete picture of how automatic and deliberate systems interact biologically.
“I see Gazzaniga’s point, and it would indeed be easiest if we could ignore conclusions derived from brain science and psychology when it comes to legal issues,” said Ap Dijksterhuis, a psychologist at Radboud University Nijmegen, in the Netherlands, in an e-mail. “However, I do not think we can do this forever, and at some point, some key legal concepts such as accountability or responsibility will have to be redefined.”
Until then, Dr. Gazzaniga’s advice is to look for them where they’ve always been: in the hearts and moral intuitions of human beings, in their laws and customs.
And, it should be said, in their stories.
NYT
Friday, October 28, 2011
Thursday, October 27, 2011
Wednesday, October 26, 2011
Crony Capitalism Comes Home
Published: October 26, 2011
Whenever I write about Occupy Wall Street, some readers ask me if the protesters really are half-naked Communists aiming to bring down the American economic system when they’re not doing drugs or having sex in public.
The answer is no. That alarmist view of the movement is a credit to the (prurient) imagination of its critics, and voyeurs of Occupy Wall Street will be disappointed. More important, while alarmists seem to think that the movement is a “mob” trying to overthrow capitalism, one can make a case that, on the contrary, it highlights the need to restore basic capitalist principles like accountability.
To put it another way, this is a chance to save capitalism from crony capitalists.
I’m as passionate a believer in capitalism as anyone. My Krzysztofowicz cousins (who didn’t shorten the family name) lived in Poland, and their experience with Communism taught me that the way to raise living standards is capitalism.
But, in recent years, some financiers have chosen to live in a government-backed featherbed. Their platform seems to be socialism for tycoons and capitalism for the rest of us. They’re not evil at all. But when the system allows you more than your fair share, it’s human to grab. That’s what explains featherbedding by both unions and tycoons, and both are impediments to a well-functioning market economy.
When I lived in Asia and covered the financial crisis there in the late 1990s, American government officials spoke scathingly about “crony capitalism” in the region. As Lawrence Summers, then a deputy Treasury secretary, put it in a speech in August 1998: “In Asia, the problems related to ‘crony capitalism’ are at the heart of this crisis, and that is why structural reforms must be a major part” of the International Monetary Fund’s solution.
The American critique of the Asian crisis was correct. The countries involved were nominally capitalist but needed major reforms to create accountability and competitive markets.
Something similar is true today of the United States.
So I’d like to invite the finance ministers of Thailand, South Korea and Indonesia — whom I and other Americans deemed emblems of crony capitalism in the 1990s — to stand up and denounce American crony capitalism today.
Capitalism is so successful an economic system partly because of an internal discipline that allows for loss and even bankruptcy. It’s the possibility of failure that creates the opportunity for triumph. Yet many of America’s major banks are too big to fail, so they can privatize profits while socializing risk.
The upshot is that financial institutions boost leverage in search of supersize profits and bonuses. Banks pretend that risk is eliminated because it’s securitized. Rating agencies accept money to issue an imprimatur that turns out to be meaningless. The system teeters, and then the taxpayer rushes in to bail bankers out. Where’s the accountability?
It’s not just rabble-rousers at Occupy Wall Street who are seeking to put America’s capitalists on a more capitalist footing. “Structural change is necessary,” Paul Volcker, the former chairman of the Federal Reserve, said in an important speech last month that discussed many of these themes. He called for more curbs on big banks, possibly including trimming their size, and he warned that otherwise we’re on a path of “increasingly frequent, complex and dangerous financial breakdowns.”
Likewise, Mohamed El-Erian, another pillar of the financial world who is the chief executive of Pimco, one of the world’s largest money managers, is sympathetic to aspects of the Occupy movement. He told me that the economic system needs to move toward “inclusive capitalism” and embrace broad-based job creation while curbing excessive inequality.
“You cannot be a good house in a rapidly deteriorating neighborhood,” he told me. “The credibility and the fair functioning of the neighborhood matter a great deal. Without that, the integrity of the capitalist system will weaken further.”
Lawrence Katz, a Harvard economist, adds that some inequality is necessary to create incentives in a capitalist economy but that “too much inequality can harm the efficient operation of the economy.” In particular, he says, excessive inequality can have two perverse consequences: first, the very wealthy lobby for favors, contracts and bailouts that distort markets; and, second, growing inequality undermines the ability of the poorest to invest in their own education.
“These factors mean that high inequality can generate further high inequality and eventually poor economic growth,” Professor Katz said.
Does that ring a bell?
So, yes, we face a threat to our capitalist system. But it’s not coming from half-naked anarchists manning the barricades at Occupy Wall Street protests. Rather, it comes from pinstriped apologists for a financial system that glides along without enough of the discipline of failure and that produces soaring inequality, socialist bank bailouts and unaccountable executives.
It’s time to take the crony out of capitalism, right here at home.
NYT
War in Oakland?
Reuters
Tuesday, October 25, 2011
The Need for Science Education
Society needs scientific storytellers who can inspire young minds to greatness in thought and discovery. Science education suffers in America because it lacks an early, engaging, interdisciplinary approach to explaining the origins and evolution of the Universe. Science education in public schools is failing to connect the dots and lacks the grand cosmic story that fuels the flames of genius. We need more classes, and more teachers, able to present the greatest story in the Universe: from the big bang to big brains. Cosmic evolution and the evolution of life on Earth need to be explained in a way that gives young minds a more expansive perspective on this life.

"Telling a story is one of the most persuasive means of communication... How we persuade is how we deliver and tell our story to the jury. Storytelling is the most basic means of communication." -Gerry Spence, renowned trial attorney

When a culture teaches its young that religious mythology is truth and that modern science is a conspiracy of lies, that culture will breed a generation of dogmatic stagnation, not fluid exploration.

"During times of universal deceit, telling the truth becomes a revolutionary act." -George Orwell

"I am against religion because it teaches us to be satisfied with not understanding the world." -Richard Dawkins

"I think it would be a real cultural deprivation if everyone could not share the mystery and wonder of the cosmos that modern science reveals to us: the emergence, from simple beginnings, of stars and planets, and the intricate evolution on Earth of life and intelligence." -Sir Martin Rees

http://www.bighistoryproject.com/
Monday, October 24, 2011
Saturday, October 22, 2011
Immanuel Wallerstein
Friday, October 21, 2011
An Erratic Leader, Brutal and Defiant to the End
Published: October 20, 2011
COL. MUAMMAR EL-QADDAFI, 1942-2011
In death, as in life, his circumstances proved startling, with jerky video images showing him captured, bloody and disheveled, but alive. A separate clip showed his half-naked torso, with eyes staring vacantly and what appeared to be a gunshot wound to the head, as jubilant fighters fired into the air. In a third video, posted on YouTube, excited fighters hovered around his lifeless-looking body, posing for photographs and yanking his limp head up and down by the hair.
Throughout his rule, Colonel Qaddafi, 69, sanctioned spasms of grisly violence and frequent bedlam, even as he sought to leverage his nation’s oil wealth into an outsize role on the world stage.
He embraced a string of titles: “the brother leader,” “the guide to the era of the masses,” “the king of kings of Africa” and — his most preferred — “the leader of the revolution.”
But the labels pinned on him by others tended to stick the most. President Ronald Reagan called him “the mad dog of the Middle East.” President Anwar el-Sadat of neighboring Egypt pronounced him “the crazy Libyan.”
As his dominion over Libya crumbled with surprising speed, Colonel Qaddafi refused to countenance the fact that most Libyans despised him. He placed blame for the uprising on foreign intervention — a United Nations Security Council resolution intended to defend civilians became the contentious basis for NATO airstrikes on his troops.
“I tell the coward crusaders: I live in a place where you can’t get me,” he taunted defiantly after the uprising against his rule started in February. “I live in the hearts of millions.”
That attitude endured to the end. In one of his last speeches, made weeks after Tripoli fell and he was a fugitive, he exhorted Libyans to defeat the uprising.
“The people of Libya, the true Libyans, will never accept invasion and colonization,” he said in remarks broadcast by a Syrian television station because he had lost control of Libya’s airwaves. “We will fight for our freedom, and we are ready to sacrifice ourselves.”
Colonel Qaddafi was a 27-year-old junior officer when he led the bloodless coup that deposed Libya’s monarch in 1969. Soon afterward, he began styling himself a desert nomad philosopher. He received dignitaries in his signature sprawling white tent, which he erected wherever he went: Rome, Paris and, after much controversy, New York, on a Westchester estate in 2009. Inside, its quilted walls might be printed with motifs like palm trees and camels, or embroidered with his sayings.
Colonel Qaddafi declared that his political system of permanent revolution would sweep away capitalism and socialism. But he hedged his bets by financing and arming a cornucopia of violent organizations, including the Irish Republican Army and African guerrilla groups, and he became an international pariah after his government was linked to terrorist attacks, particularly the 1988 bombing of a Pan Am jet over Lockerbie, Scotland, which killed 270 people.
After the American-led invasion of Iraq, Colonel Qaddafi announced that Libya was abandoning its pursuit of unconventional weapons, including a covert nascent nuclear program, ushering in a new era of relations with the West. But in Libya, he ruled through an ever smaller circle of advisers, including his sons, destroying any institution that might challenge him.
By the time he was done, Libya had no parliament, no unified military command, no political parties, no unions, no civil society and no nongovernmental organizations. His ministries were hollow, with the notable exception of the state oil company.
A Tight Grip on Power
Eight years into his rule, he renamed the country the Great Socialist People’s Libyan Jamahiriya. (Jamahiriya was his Arabic translation for a state of the masses.) “In the era of the masses, power is in the hands of the people themselves and leaders disappear forever,” he wrote in The Green Book, a three-volume political tract that was required reading in every school.
For decades, Libyans noted dryly that he did not seem to be disappearing any time soon; he became the longest-serving Arab or African leader. Yet he always presented himself as beloved guide and chief clairvoyant, rather than ruler. Indeed, he seethed when a popular uprising inspired by similar revolutions next door in Tunisia and Egypt first sought to drive him from power.
“I am a glory that Libya cannot forgo and the Libyan people cannot forgo, nor the Arab nation, nor the Islamic nation, nor Africa, nor Latin America, nor all the nations that desire freedom and human dignity and resist tyranny!” Colonel Qaddafi shouted in February. “Muammar Qaddafi is history, resistance, liberty, glory, revolution!”
It was a typically belligerent and random harangue. He vowed to fight to his last drop of blood.
“This is my country!” he roared as he shook his fist and pounded the lectern. “Muammar is not a president to quit his post. Muammar is the leader of the revolution until the end of time!”
He blamed all manner of bogeymen — including the United States, operatives of Al Qaeda and youths “fueled by milk and Nescafé spiked with hallucinogenic drugs.” But he also made it clear that he was ready to hunt all the “rats” to eliminate anyone who participated. “Everything will burn,” he vowed.
At least once a decade, Colonel Qaddafi fomented shocking violence that terrorized Libyans.
In the late 1970s and early ’80s, he eliminated even mild critics through public trials and executions. Kangaroo courts were staged on soccer fields or basketball courts, where the accused were interrogated, often urinating in fear as they begged for their lives. The events were televised to make sure that no Libyan missed the point.
The bodies of one group of students hanged in downtown Tripoli’s main square were left there to rot for a week, opposition figures said, and traffic was rerouted to force cars to pass by.
In the 1990s, faced with growing Islamist opposition, Colonel Qaddafi bombed towns in eastern Libya, and his henchmen were widely believed to have opened fire on prisoners in Tripoli’s Abu Salim prison, killing about 1,200.
“Qaddafi’s ability to have survived so long rests on his convenient position in not being committed to a single ideology and his use of violence in such a theatrical way,” said Hisham Matar, the author of “In the Country of Men,” a novel depicting the devastation of life under Colonel Qaddafi. “He deliberately tried to create a campaign that would terrorize the population, that would traumatize them to such an extent that they would never think of expressing their thoughts politically or socially.”
Colonel Qaddafi survived countless coup and assassination attempts and cracked down harshly afterward, alienating important Libyan tribes. He imported soldiers from his misadventures in places like Sudan, Chad and Liberia, transforming Libya’s ragtag militias into what he styled as his African or Islamic legions.
Muammar el-Qaddafi was born to illiterate Bedouin parents in a tent just inland from the coastal town of Surt in 1942. (Some sources give the date as June 7.) His father herded camels and sheep. One grandfather was killed in the 1911 Italian invasion to colonize Libya.
His parents scrimped for his education, first with a local cleric and then secondary school. He began to idolize President Gamal Abdel Nasser of Egypt, who preached Arab unity and socialism after deposing the king in a 1952 coup. He showed enough promise to enter the Royal Military Academy at Benghazi, in eastern Libya, and in 1966 was sent to England for a course on military communications. He learned English.
On Sept. 1, 1969, he led young officers in seizing the government in just a few hours while King Idris was abroad. They dissolved Parliament and set up a 12-member Revolutionary Command Council to rule Libya, mirroring Mr. Nasser’s Egypt. He was promoted to colonel and armed forces commander. Egypt was his blueprint, and he proclaimed that the newly named Libyan Arab Republic would advance under the Arab nationalist slogan “Socialism, unity and freedom.”
Yet in a country where the deposed king, Idris, had come from a long line of religious figures, Colonel Qaddafi felt compelled to shore up his Islamic credentials. He banned alcohol and closed bars, nightclubs and casinos. He outlawed teaching English in public schools. Traffic signs and advertisements not in Arabic were painted over.
Decades later, only one nightclub had opened in Tripoli, in a nondescript building plastered with revolutionary slogans and displaying the mandatory picture of the leader, who seemed to stare down from every wall in Libya.
Colonel Qaddafi claimed that Mr. Nasser had declared him his son, and in the early years of his rule, he set about trying to win his idol’s approval by modernizing Libya and trying, in vain, to unite it with other Arab countries. He expelled American and British military bases, then nationalized the property of Italian settlers and a small Jewish community. He railed against Israel.
He also vowed to eliminate Libya’s tribes, worried that they were too powerful, even though Libya’s urbanizing population had been moving away from them for some time.
Libya had been desperately poor until oil was discovered in 1959. A decade later, Libyans had touched little of their wealth.
The 1969 coup changed that. The new Libyan government forced the major oil companies to cede majority stakes in exchange for continued access to the country’s oil fields, and it demanded a greater share of the profits. The pattern was emulated across the oil-producing states, profoundly changing their relationship with the oil giants.
With the increased revenue, Colonel Qaddafi set about building roads, hospitals, schools and housing. And Libyans, who had suffered during the Italian occupation before World War II, were allowed to celebrate an anticolonial, Arab-nationalist sentiment that had been bottled up under the monarchy, said Prof. Ali Ahmida, an expert at the University of New England.
Life expectancy, which averaged 51 years in 1969, is now over 74. Literacy leapt to 88 percent. Per capita annual income grew to above $12,000 in recent years, though the figure is markedly lower than that found in many oil-rich countries. Yet Colonel Qaddafi warned his people that the oil would not last.
“Petroleum societies are lazy everywhere,” he observed. “People are used to having more money and want everything available. This revolution wants to change this life and to promote production and work, to produce everything by our hands. But the people are lazy.”
The Guiding Philosophy
The mercurial changes in policy and personality that kept Libyans off balance began in earnest with the three volumes of his Green Book, published from 1976 to 1979. (Green, he explained, was for both Islam and agriculture.) The book offered his “third universal theory” to improve on capitalism and socialism, and elevated the mundane to the allegedly profound, condemning sports like boxing as barbarism and pointing out that men and women are different because women menstruate.
Colonel Qaddafi also introduced Orwellian revolutionary committees in every neighborhood to purge the country of the ideologically unsound, calling it “people power.” He began foisting social experiments on Libyans.
Once he demanded that all Libyans raise chickens to promote self-sufficiency, even deducting the costs of cages from their wages. “It made no sense to raise chickens in apartments,” said Mansour O. El-Kikhia, a Qaddafi biographer at the University of Texas and a member of an opposition family. “People slaughtered the chickens, ate them and used the cages as dish racks.”
Colonel Qaddafi said women were not equal to men, but he exhibited them as a symbol of the success of the Libyan revolution. None had a higher profile than his phalanx of female bodyguards, in camouflage fatigues, red nail polish and high-heeled sandals, and carrying submachine guns.
To consolidate his power, Colonel Qaddafi tried to eliminate or isolate all of the 11 other members of the original Revolutionary Command Council. Strikes or unauthorized news reports resulted in prison sentences, and illegal political activity was punishable by death. Western books were burned, and private enterprise was banned. Libyan intelligence agents engaged in all manner of skulduggery, reaching overseas to kidnap and assassinate opponents.
He vowed to turn Libya into an agricultural powerhouse through the Great Man-Made River, a grandiose $20 billion project to pump water from aquifers underneath the Sahara and send it over 1,200 miles to the coast through a gargantuan pipeline.
Meanwhile he was cementing Libya’s rogue-state status by bankrolling terrorist and guerrilla organizations, including Abu Nidal, the radical Palestinian organization, and the violent Red Army Faction in Europe. At least a dozen coups or coup attempts in Africa were traced to his backing. That set him on a collision course with the West.
In the early 1980s, President Reagan closed the Libyan Embassy in Washington, suspended oil imports and shot down two Libyan fighters after Colonel Qaddafi tried to extend Libya’s territorial waters across the Gulf of Sidra.
In London in 1984, gunshots from the Libyan Embassy killed a police officer and wounded 11 demonstrators. In 1986, Libyan agents were linked to the bombing of a disco in West Berlin, killing two American service members and a Turkish woman and wounding 200 people.
President Reagan retaliated 10 days later by bombing targets in Libya, including Colonel Qaddafi’s residence in his compound at the Bab al-Aziziya barracks in Tripoli.
He preserved the wreckage of the house as a symbol of American treachery and, in front of it, installed a sculpture of a giant fist crushing an American jet fighter. It became his preferred stage for major events; his major speech during the 2011 uprising was delivered from the first floor.
During the ’80s, Colonel Qaddafi also invaded Chad after encroaching on its border for years. Chad finally defeated the effort in 1987 with French and American military aid.
The Lockerbie Bombing
In 1988, in the deadliest terrorist act linked to Libya, 259 people aboard Pan Am Flight 103 died when the plane exploded in midair over Lockerbie. The falling wreckage killed 11 people on the ground. Libyan agents were also believed to have been behind the explosion of a French passenger jet over Niger in 1989, killing 170 people.
Nearly a decade of international isolation started in 1992, after Libya refused to hand over two suspects indicted by the United States and Britain in the Lockerbie bombing. France also sought four suspects in the Niger bombing, among them Abdullah Senussi, a brother-in-law of Colonel Qaddafi’s and the head of external intelligence. He was convicted in absentia.
The United Nations imposed economic sanctions, and when his fellow Arabs enforced them, Colonel Qaddafi turned away from the Arab world. He began his quest to become leader of Africa, coming closest in title, at least, in 2009, when he was named the chairman of the African Union for a year.
In 1999, Libya finally handed over two Lockerbie suspects for trial in the Netherlands under Scottish law and reached a financial settlement with the French. One suspect was acquitted, but the other, Abdel Basset al-Megrahi, was convicted and sentenced to 27 years in a Scottish jail. When the Scottish government released him in 2009, on the grounds that he was terminally ill, the outcry was swift. The British were accused of trying to curry favor with Tripoli for oil and arms deals.
The international sanctions against Libya were lifted in 2003 after it accepted responsibility for the bombing and agreed to pay $2.7 billion to the families of victims in the Lockerbie bombing and in other attacks.
The Libyans did not admit guilt, however. They made it clear that they were simply taking a practical step toward restoring ties with the West. But when Judge Mustafa Abdel-Jalil, the justice minister, defected during the uprising in February 2011, he told a Swedish newspaper that he had proof that Colonel Qaddafi had ordered the operation.
Tripoli truly began to emerge from the cold after the September 2001 attacks against the United States. Colonel Qaddafi condemned them and shared Libya’s intelligence on Al Qaeda with Washington. Libya had been the first country to demand an international arrest warrant for Osama bin Laden.
Colonel Qaddafi also said he would destroy his weapons stockpile. President George W. Bush said Libya’s decision demonstrated the success of the invasion of Iraq, in that it had persuaded a rogue state to abandon its menacing ways, although Libya had made a similar overture years before and many experts did not consider its programs threatening.
Nevertheless, Britain and the United States re-established diplomatic relations. Prime Minister Tony Blair and Secretary of State Condoleezza Rice led a parade of world leaders to Colonel Qaddafi’s tent seeking trade deals. Ms. Rice was the first American secretary of state to visit since 1953.
Before the visit, Colonel Qaddafi was effusive about Ms. Rice. “I support my darling black African woman,” he said on the network Al Jazeera, adding: “I admire and am very proud of the way she leans back and gives orders to the Arab leaders. Yes, Leezza, Leezza, Leezza — I love her very much.”
State Department cables released by WikiLeaks suggested that there was another woman who had won Colonel Qaddafi’s affection, and confidence — a “voluptuous blonde” Ukrainian nurse, described as the senior member of a posse of nurses around him.
The cables described him as a hypochondriac who feared flying over water and who often fasted twice a week. He followed horse racing, loved flamenco dancing and added “king of culture” to his myriad titles, the cables said.
Around 1995 he published a collection of short stories and essays called “The Village, the City, the Suicide of the Astronaut and Other Stories.” It later came out in Britain as “Escape to Hell and Other Stories.” As a reviewer in the British newspaper The Guardian put it: “There are no characters, no twists, no subtle illuminations; indeed, there is precious little narrative. Instead, you get surreal rants and bizarre streams of consciousness obviously unmolested by the hand of any editor.”
Colonel Qaddafi married at least twice. His oldest son, Mohammed, from his first marriage, became a businessman and the agent for foreign companies working in Libya.
Seven other children — six sons and a daughter — came from his marriage to Safia Farkash, a former nurse. Seif al-Islam, the oldest son, had been the face of modern Libya, establishing an international charity and forever pledging that political reform was just around the corner. His moderate reputation evaporated with the uprising after he vowed that Libya would flow with blood. He was later indicted by the International Criminal Court, accused of crimes against humanity during the uprising.
Among Seif’s brothers, Muatassim, Hannibal and Khamis were military officers who commanded their own brigades. Muatassim headed the National Security Council but was also known for carousing in hot spots like the Caribbean island of St. Bart’s, where he was reported to have paid singers, including Mariah Carey, $1 million each for appearing at his holiday parties. Libyan television showed what it called his lifeless body on a gurney Thursday.
Hannibal gained notoriety for beating his wife and servants in luxurious European hotels. After he was arrested in Switzerland in 2008, Colonel Qaddafi broke off diplomatic relations and held two Swiss businessmen hostage.
The anti-Qaddafi forces said they killed Khamis, once head of the feared Khamis Brigade guarding Tripoli, in August as he and his bodyguards tried to break through a rebel checkpoint.
Another son, Saadi, a military officer, had been a professional soccer player who was allowed onto Italian teams more for the publicity than for his skills. The seventh son, Seif al-Arab, was believed to have been killed in an air raid, with his brothers acting as pallbearers.
The daughter, Aisha, gained attention as a lawyer after she offered to join Saddam Hussein’s legal defense team.
Colonel Qaddafi was also believed to have adopted two children: Hanna and Milad, a nephew.
Mohammed, Hannibal and Aisha all fled to neighboring Algeria, while Saadi was given refuge in Niger to the south. There were conflicting reports Thursday over whether Seif al-Islam had been captured or killed.
As the circle around Colonel Qaddafi shrank, his sons increasingly became his advisers, but it was never clear if he had anointed any of them as his successor. He was believed to play one off against the other, granting and then withholding favor.
As Colonel Qaddafi grew older, the trim, handsome officer with short black hair gave way to someone more flamboyant. Brocade and medals festooned his military uniforms, as if he were some Gilbert & Sullivan admiral, while his black curls grew long and unruly. After he adopted Africa as his cause, he favored African robes in a riot of colors.
His long effort to eliminate the government left Libya in a shambles, its sagging infrastructure belying its oil wealth. That fact never seemed to bother Colonel Qaddafi. “Once he was in a position to sustain himself, the fact that nothing improved in Libya was something he did not notice,” said Lisa Anderson, the president of the American University in Cairo.
He was “notoriously mercurial,” a cable obtained through WikiLeaks said, a man who “avoids making eye contact” during meetings and thinks nothing of “long, uncomfortable periods of silence.” He would sometimes show up hours late for a state banquet honoring an African head of state, then sit in a far corner before bolting away. African leaders accepted this behavior in exchange for a check for a million dollars or two, diplomats said.
Capricious Dictates
After he put his worst years of sponsoring terrorism behind him, the West and the rest of the Arab world tended to treat him as comic opera, though he could still outrage, as he did in 2009, when, appearing for the first time before the United Nations General Assembly, he spoke for some 90 minutes instead of his allotted 15 and seemed to tear a copy of the charter, condemning the Security Council as a feudal organization.
When scores of children in a Benghazi hospital developed AIDS, most likely because of unsanitary conditions, Colonel Qaddafi accused the C.I.A. of developing the virus that caused it and of sending a group of Bulgarian nurses to spread it in Libya. The nurses were arrested, tortured, tried and sentenced to death before eventually being freed.
He never tired of pushing his idea for an Israeli-Palestinian solution, a unified country called “Isratine” in which both Jews and Arabs would enjoy equal rights as soon as all Palestinian refugees were allowed to return. The proposal elicited derision from other Arab leaders or senior officials.
At home, though, Libyans suffered under his dictates. He switched from the standard Muslim calendar to one marking the years since the Prophet Muhammad’s death, only to decide later that the birth year was a more auspicious place to start. Event organizers threw up their hands and reverted to the Western calendar. He also decided to rename the months. February was Lights. August was Hannibal.
Given the conceit that “popular committees” — and not Colonel Qaddafi himself — ran the country, everyone was required to attend committee sessions called at random once or twice a year to discuss an agenda “suggested” by the grand guide. Every single office — schools, government ministries, airlines, shops — had to shut for days, sometimes weeks. Scofflaws risked fines.
Colonel Qaddafi once declared that any money over $3,000 in anyone’s bank account was excessive and should revert to the state. Another time he lifted a ban on sport utility vehicles, then changed his mind a few months later, forcing everyone who had bought one to hide it.
Libyans grumbled that they had no idea what had happened to their oil money; the official news agency said the country earned $32 billion in 2010 alone. When prices were low or Libya was under sanctions during most of the 1980s and ’90s, the nearly one million people on the public payroll never got a pay raise; experts calculated that most lived on $300 to $400 per month.
The general disarray was another way of ensuring that no one developed the confidence and connections to try to overthrow him. Libyans lived constantly on edge. “It is an awful feeling when you don’t know what tomorrow is going to bring,” said Dr. Kikhia, the biographer. “People don’t work — they cannot make a decision at any level.”
Colonel Qaddafi saw his rule as a never-ending quest, without ever defining the objective. “The state of Libya was a state of constant revolution, which suggests there was no goal,” said Mr. Matar, the novelist. “It was all false; it was a way to keep them all occupied.”
When revolutions succeeded in two Arab neighbors, deposing the presidents of Tunisia and Egypt, Colonel Qaddafi was one of the few leaders in the region to speak out publicly. The people had been swayed by foreign plots, he maintained. He tried to warn his people that Tunisians now lived in fear of being killed at home or on the streets. But few Tunisians died.
His first speech after the Libyan uprising erupted proved a classic Qaddafi moment, mocked by the outside world while accompanying a grueling civil war at home. He vowed to hunt down the insurgents “house by house and alley by alley.” The unusual Libyan word for alley — “zenga zenga” — was turned into a jingle, with various young women belly dancing to it in YouTube videos.
Even as Libyans died, Colonel Qaddafi demanded recognition that he alone had made them relevant.
“In the past, Libyans lacked an identity,” Colonel Qaddafi roared in the February speech. “When you said Libyan, they would tell you Libya, Liberia, Lebanon — they didn’t know Libya! But today you say Libya, they say Libya — Qaddafi, Libya — the revolution!”
NYT
Party of Pollution
Published: October 20, 2011
Last month President Obama finally unveiled a serious economic stimulus plan — far short of what I’d like to see, but a step in the right direction. Republicans, predictably, have blocked it. But the new plan, combined with the Occupy Wall Street demonstrations, seems to have shifted the national conversation. We are, suddenly, focused on what we should have been talking about all along: jobs.
So what is the G.O.P. jobs plan? The answer, in large part, is to allow more pollution. What you need to know is that weakening environmental regulations would do little to create jobs and would make us both poorer and sicker.
Now it would be wrong to say that all Republicans see increased pollution as the answer to unemployment. Herman Cain says that the unemployed are responsible for their own plight — a claim that, at Tuesday’s presidential debate, was met with wild applause.
Both Rick Perry and Mitt Romney have, however, put weakened environmental protection at the core of their economic proposals, as have Senate Republicans. Mr. Perry has put out a specific number — 1.2 million jobs — that appears to be based on a study released by the American Petroleum Institute, a trade association, claiming favorable employment effects from removing restrictions on oil and gas extraction. The same study lies behind the claims of Senate Republicans.
But does this oil-industry-backed study actually make a serious case for weaker environmental protection as a job-creation strategy? No.
Part of the problem is that the study relies heavily on an assumed “multiplier” effect, in which every new job in energy leads indirectly to the creation of 2.5 jobs elsewhere. Republicans, you may recall, were scornful of claims that government aid that helps avoid layoffs of schoolteachers also indirectly helps save jobs in the private sector. But I guess the laws of economics change when it’s an oil company rather than a school district doing the hiring.
Moreover, even if you take the study’s claims at face value, it offers little reason to believe that dirtier air and water can solve our current employment crisis. All the big numbers in the report are projections for late this decade. The report predicts fewer than 200,000 jobs next year, and fewer than 700,000 even by 2015.
You might want to compare these numbers with a couple of other numbers: the 14 million Americans currently unemployed, and the one million to two million jobs that independent estimates suggest the Obama plan would create, not in the distant future, but in 2012.
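To see how much of those projected totals rests on the assumed multiplier, here is a minimal sketch of the arithmetic using only the figures quoted above; the split into direct and indirect jobs is an illustration of how a 2.5 multiplier behaves, not a breakdown published by the study.

```python
# Illustrative arithmetic only: how a 2.5 "indirect jobs" multiplier inflates
# a headline jobs number. Totals are the ones quoted in the column; the
# direct/indirect split is inferred from the multiplier, not from the study.
MULTIPLIER = 2.5          # assumed indirect jobs created per direct energy job
UNEMPLOYED = 14_000_000   # Americans unemployed at the time

for label, total_jobs in [("next year (report)", 200_000),
                          ("by 2015 (report)", 700_000),
                          ("Perry's claim", 1_200_000)]:
    direct = total_jobs / (1 + MULTIPLIER)   # total = direct * (1 + 2.5)
    indirect = total_jobs - direct
    print(f"{label:>18}: ~{direct:,.0f} direct jobs, ~{indirect:,.0f} assumed indirect, "
          f"{total_jobs / UNEMPLOYED:.1%} of the unemployed")
```

Even taking the report's own figures, next year's total comes to roughly 1.4 percent of the 14 million unemployed, and most of that total is the assumed multiplier rather than direct hiring.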
More pollution, then, isn’t the route to full employment. But is there a longer-term economic case for less environmental protection? No. Serious economic analysis actually says that we need more protection, not less.
The important thing to understand is that the case for pollution control isn’t based on some kind of aesthetic distaste for industrial society. Pollution does real, measurable damage, especially to human health.
And policy makers should take that damage into account. We need more politicians like the courageous governor who supported environmental controls on a coal-fired power plant, despite warnings that the plant might be closed, because “I will not create jobs or hold jobs that kill people.”
Actually, that was Mitt Romney, back in 2003 — the same politician who now demands that we use more coal.
How big are these damages? A new study by researchers at Yale and Middlebury College brings together data from a variety of sources to put a dollar value on the environmental damage various industries inflict. The estimates are far from comprehensive, since they only consider air pollution, and they make no effort to address longer-term issues such as climate change. Even so, the results are stunning.
For it turns out that there are a number of industries inflicting environmental damage that’s worth more than the sum of the wages they pay and the profits they earn — which means, in effect, that they destroy value rather than create it. High on the list, by the way, is coal-fired electricity generation, which the Mitt Romney-that-was used to stand up to.
As the study’s authors say, finding that an industry inflicts large environmental damage compared with its apparent economic return doesn’t necessarily mean that the industry should be shut down. What it means, instead, is that “the regulated levels of emissions from the industry are too high.” That is, environmental regulations aren’t strict enough.
Republicans, of course, have strong incentives to claim otherwise: the big value-destroying industries are concentrated in the energy and natural resources sector, which overwhelmingly donates to the G.O.P. But the reality is that more pollution wouldn’t solve our jobs problem. All it would do is make us poorer and sicker.
NYT
Thursday, October 20, 2011
Wednesday, October 19, 2011
Watching Athens Burn at Al Jazeera
The cradle of modern Western civilization is unraveling before our very eyes: corruption among the older workers, lockout for the younger ones. Now the shit is hitting the fan.
Here in Chilpancingo we have unemployed young people too, and our university is corrupt.
When will the bomb explode here?
Take This Waltz: Cohen-Lorca
Now in Vienna there's ten pretty women
There's a shoulder where death comes to cry
There's a lobby with nine hundred windows
There's a tree where the doves go to die
There's a piece that was torn from the morning
And it hangs in the Gallery of Frost
Aey, aey, aey, aey
Take this waltz, take this waltz
Take this waltz with the clamp on its jaws

Oh I want you, I want you, I want you
On a chair with a dead magazine
In the cave at the tip of the lily
In some hallway where love's never been
On a bed where the moon has been sweating
In a cry filled with footsteps and sand
Aey, aey, aey, aey
Take this waltz, take this waltz
Take its broken waist in your hand

This waltz, this waltz, this waltz, this waltz
With its very own breath of brandy and death
Dragging its tail in the sea

There's a concert hall in Vienna
Where your mouth had a thousand reviews
There's a bar where the boys have stopped talking
They've been sentenced to death by the blues
But who is it climbs to your picture
With a garland of freshly cut tears?
Aey, aey, aey, aey
Take this waltz, take this waltz
Take this waltz it's been dying for years

There's an attic where children are playing
Where I've got to lie down with you soon
In a dream of Hungarian lanterns
In the mist of some sweet afternoon
And I'll see what you've chained to your sorrow
All your sheep and your lilies of snow
Aey, aey, aey, aey
Take this waltz, take this waltz
With its, I'll never forget you, you know

This waltz, this waltz, this waltz, this waltz
With its very own breath of brandy and death
Dragging its tail in the sea

And I'll dance with you in Vienna
I'll be wearing a river's disguise
The hyacinth wild on my shoulder
My mouth on the dew of your thighs
And I'll bury my soul in a scrapbook
With the photographs there, and the moss
And I'll yield to the flood of your beauty
My cheap violin and my cross
And you'll carry me down on your dancing
To the pools that you lift on your wrist
Oh my love, oh my love
Take this waltz, take this waltz
It's yours now, it's all that there is
Aey, aey, aey, aey

© LEONARD COHEN / STRANGER MUSIC INC
Tuesday, October 18, 2011
Monday, October 17, 2011
Not Such a Stretch to Reach for the Stars
By KENNETH CHANG
Published: October 17, 2011
ORLANDO, Fla. — A starship without an engine?
It may seem a fantastical notion, but hardly more so than the idea of building a starship of any kind, especially with NASA’s future uncertain at best.
Yet here in Orlando, not far from the launching site of the space program’s most triumphant achievements, the government’s Defense Advanced Research Projects Agency, or Darpa, drew hundreds this month to a symposium on the 100-Year Starship Study, which is devoted to ideas for visiting the stars.
Participants — an eclectic mix of engineers, scientists, science fiction fans, students and dreamers — explored a range of ideas, including how to organize and finance a century-long project; whether civilization would survive, since an engine powerful enough to propel a starship could also be used as a weapon to obliterate the planet; and whether people need to go along for the trip. (Alternatively, machines could build humans at the destination, perhaps tweaked to live in non-Earth-like environs.)
“The space program, any space program, needs a dream,” said one participant, Joseph Breeden. “If there are no dreamers, we’ll never get anywhere.”
It was Dr. Breeden who offered the idea of an engineless starship.
A physicist by training, he had most recently devised equations that forecast for banks how much they were likely to lose on their consumer loans.
From his doctoral thesis, Dr. Breeden remembered that in a chaotic gravitational dance, stars are sometimes ejected at high speeds. The same effect, he believes, could propel starships.
First, find an asteroid in an elliptical orbit that passes close to the Sun. Second, put a starship in orbit around the asteroid. If the asteroid could be captured into a new orbit that clings close to the Sun, the starship would be flung on an interstellar trajectory, perhaps up to a tenth of the speed of light.
“The chaotic dynamics of those two allow all the energy of one to be transferred to the other,” said Dr. Breeden, who came toting copies of a paper describing the technique. “It’s a unique type of gravity assist.”
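To get a feel for the speeds involved, here is a minimal back-of-the-envelope sketch in Python. It is not Dr. Breeden's calculation (his paper describes a chaotic three-body energy transfer); it uses only the textbook single-flyby picture of a gravity assist, in which the idealized ceiling on the speed gained is roughly twice the assisting body's own heliocentric speed, and the asteroid orbit is invented purely for illustration.

```python
import math

# Sketch of the speeds in a close-in gravity assist (textbook single-flyby
# picture, not the chaotic three-body transfer in Dr. Breeden's paper).
# The asteroid orbit below is a made-up example.

GM_SUN = 1.327e20      # solar gravitational parameter, m^3 / s^2
AU = 1.496e11          # astronomical unit, m

def orbital_speed(r, a):
    """Heliocentric speed at distance r on an orbit of semi-major axis a (vis-viva)."""
    return math.sqrt(GM_SUN * (2.0 / r - 1.0 / a))

# Hypothetical asteroid: aphelion at 1 AU, perihelion hugging the Sun at 0.05 AU.
r_peri, r_apo = 0.05 * AU, 1.0 * AU
a = 0.5 * (r_peri + r_apo)

v_peri = orbital_speed(r_peri, a)   # asteroid's speed at closest approach
v_gain_ceiling = 2.0 * v_peri       # idealized single-flyby limit

print(f"asteroid speed at perihelion:  {v_peri / 1000:.0f} km/s")
print(f"idealized flyby speed ceiling: {v_gain_ceiling / 1000:.0f} km/s")
```

On those made-up numbers the idealized single-flyby ceiling comes to a few hundred kilometers per second; the far larger gains Dr. Breeden envisions depend on the chaotic transfer of the asteroid's whole orbital energy described in his paper, which this simple picture does not capture.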
Darpa, by design, pursues out-of-the-box projects without immediate military use. (In the 1960s and 1970s, for instance, the agency laid the groundwork for the Internet.)
David L. Neyland, the director of tactical technology at Darpa, who orchestrated the one-year starship study, noted that his agency was founded more than 50 years ago as a response to Sputnik, the Soviet Union’s cold war satellite coup.
And the research and development of technologies that could lead to a starship, he said, would likely create useful military spinoffs.
“At every step along the way in the space business, the Department of Defense has benefited,” Mr. Neyland said.
In the talks, speakers laid out challenges that, while herculean, did not seem out of the realm of the possible, even without resorting to exotic physics like “Star Trek” warp drives.
Still, the sheer distances are daunting. “The problem of the stars is larger than most people realize,” said James Benford, a physicist who organized sessions on starship propulsion.
Richard Obousy, president of Icarus Interstellar, an organization of volunteers that has already spent several years on starship design, gave an analogy. If Earth were in Orlando and the closest star system, Alpha Centauri, were in Los Angeles, then NASA’s two Voyager spacecraft, the most distant manmade objects, would have traveled just one mile.
Another way of looking at the challenge is that in 10,000 years, the speed of humans has jumped by a factor of about 10,000, from a stroll (2.6 m.p.h.) to the Apollo astronauts’ return from the Moon (26,000 m.p.h.). Reaching the nearest stars in reasonable time — decades, not centuries — would require a velocity jump of another factor of 10,000.
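Both comparisons are easy to check. A minimal sketch of the arithmetic, assuming round figures that are mine rather than the article's (Alpha Centauri at about 4.37 light-years, Voyager 1 at roughly 120 AU from the Sun in late 2011, and Orlando to Los Angeles at roughly 2,200 miles):

```python
# Quick arithmetic behind the two scale analogies above.
LY_KM = 9.461e12                  # one light-year, in kilometers
alpha_cen_km = 4.37 * LY_KM       # distance to the Alpha Centauri system
voyager1_km = 1.8e10              # Voyager 1's distance from the Sun (~120 AU)
orlando_la_miles = 2_200          # rough Orlando-to-Los Angeles distance

scale = orlando_la_miles / alpha_cen_km          # map miles per real kilometer
print(f"Voyager on the Orlando-to-L.A. map: {voyager1_km * scale:.1f} miles")

stroll_mph, apollo_mph = 2.6, 26_000
print(f"stroll to Apollo return: factor of {apollo_mph / stroll_mph:,.0f}")

target_mph = apollo_mph * 10_000                 # one more factor of 10,000
light_mph = 670_600_000                          # speed of light in m.p.h.
years_to_alpha_cen = 4.37 / (target_mph / light_mph)
print(f"that is about {target_mph / light_mph:.2f} c, "
      f"or roughly {years_to_alpha_cen:.0f} years to Alpha Centauri")
```

On those assumptions Voyager indeed lands about a mile out of Orlando, and the second factor of 10,000 works out to roughly 0.4 times the speed of light, which would put Alpha Centauri about a decade away: decades, not centuries.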
The first steps, however, are easy to imagine. Even in the 1950s, rocket scientists realized that the current engines — burning kerosene or hydrogen and spewing flames out the nozzle — are the rocket equivalent of gas guzzlers. They designed nuclear engines that use reactors to heat liquid hydrogen into a fast-moving stream of gas. NASA had such engines ready for a hypothetical manned mission to Mars to follow the Moon landings.
Today, the space agency has revived that work, beginning with studies on an ideal fuel for a space reactor, and new nuclear engines could be ready by the end of the decade.
As for radioactivity concerns, the reactors would not be started until they reached space. “Space is a wonderful place to use nuclear power, because it is already radioactive,” said Geoffrey Landis, a scientist at the NASA Glenn Research Center in Ohio (and a science fiction author).
More advanced nuclear engines could use reactors to generate electric fields that accelerate charged ions to provide thrust. Then fusion engines — producing energy by fusing hydrogen nuclei — could finally be powerful enough for interstellar travel.
The British Interplanetary Society put together a concept for a fusion-powered starship in the 1970s called Daedalus, extrapolating from known physics and technology. Dr. Obousy’s group, Icarus Interstellar, is revisiting the Daedalus design to see if 30-some years of new technology can produce a better starship.
Daedalus dwarfs the Saturn 5, the rocket that took astronauts to the Moon. “However, it’s no bigger than a Nimitz aircraft carrier,” Dr. Obousy said. “We have the ability to create big things. We just don’t have the ability to launch big things.”
Dr. Benford advocated another approach, harking back to the era of sailing ships. Giant sails on the starship could billow from photons beamed from Earth by lasers or giant antennae. “Here’s a case where we know the physics, and the engineering seems doable,” he said.
By contrast, no one has yet built an energy-producing fusion reactor.
Some of the questions posed at the symposium seemed almost mundane: What kind of lights should a starship have? How do you pack enough spare parts for a 50-year trip when there’s no Home Depot along the way? Other talks ruminated on theological and philosophical questions. “Did Jesus Die for Klingons, Too?” was the title of one.
“Vision without execution is daydreaming,” Mr. Neyland said in his introductory remarks, paraphrasing a Japanese proverb.
“And what we’re trying to inspire with the 100-Year Starship Study is that first step in establishing a bar that’s high enough, with challenges that are hard enough that people will actually go start tackling some of these really hard problems.”
For Dr. Breeden, discussions with other attendees affirmed his underlying idea and calculations, though it seems unlikely that asteroid flinging would be sufficient by itself. Still, it could prove a useful and cost-effective supplement to other propulsion systems.
The $1.1 million study — $1 million from Darpa, $100,000 from NASA — will culminate with the awarding of a $500,000 grant to an organization that will take the torch for further work.
Darpa would then exit the starship business, sidestepping interrogation by Congress at the next budget hearings about why it was spending taxpayer money on science fiction dreams.
“They want to get people thinking about a topic and propagate it very subtly,” said Gregory Benford, a physics professor at the University of California, Irvine, who is also a science fiction author (and the twin brother of James Benford). “They want it out of the budget by early next year.”
Perhaps tellingly, no high-level NASA officials spoke at the symposium other than Pete Worden, director of the Ames Research Center in California, whom Mr. Neyland described as a “co-conspirator” and who is often regarded as a maverick in the space agency.
“If we’re lucky, it will change NASA,” the science-fiction-writing Dr. Benford said of the starship research.
Some speakers said they thought the first goal over the next century should be colonizing the solar system, starting with Mars.
Dr. Obousy, for one, made his preference known in a couplet:
On to the stars!
Pleas Unheeded as Students’ U.S. Jobs Soured
By JULIA PRESTON
Published: October 16, 2011
The college student from Moldova was in the United States on a cultural exchange program run for half a century by the federal government, a program designed to build international understanding by providing foreign students with a dream summer of fun in America. So he summoned his best English for the e-mail he sent to the State Department in June.
“Pleas hellp,” wrote the student, Tudor Ureche. He told them about “the miserable situation in which I’ve found myself cought” since starting a job under the program in a plant packing Hershey’s chocolates near the company’s namesake town in Pennsylvania.
Students like Mr. Ureche, who had paid as much as $6,000 to take part in the program, expected a chance to see the best of this country, to make American friends and sightsee, with a summer job to help finance it all.
Instead, many students who were placed at the packing plant found themselves working grueling night shifts on speeding production lines, repeatedly lifting boxes weighing as much as 60 pounds and financially drained by low pay and unexpected extra costs for housing and transportation. Their complaints to the contractor running the program on behalf of the State Department were met with threats that they could be sent home.
Events this summer at the Hershey packing plant in Palmyra, Pa., revealed major holes in the State Department’s oversight of its summer work and travel program, the largest and most ambitious of its cultural exchanges. The program, which placed 130,000 foreign students in all sorts of jobs across the country this year, has a large impact in shaping the country’s image for young generations overseas.
The Hershey students finally got the department’s attention on Aug. 17 when 200 of them, waving placards and chanting union slogans, walked out of the plant, the first labor protest in the 50-year history of the department’s exchange programs.
The protests raised questions about whether the State Department is equipped to manage what has become a vast temporary work program, especially in times when suitable jobs for foreign students — even short-term jobs — are harder to come by as high unemployment persists in the United States.
The protests also exposed serious lapses by the Council for Educational Travel, USA, a nonprofit group based in California and one of more than 70 sponsors contracted by the State Department to organize the students’ trips to the United States and find jobs and housing for them.
The group, known as Cetusa, placed nearly 400 foreigners from 18 countries, many of them graduate students in medicine, engineering and economics, in physically arduous jobs at the Palmyra factory that were overwhelming for some.
The students, who were earning about $8 an hour, said they were isolated within the plant, rarely finding moments to practice English or socialize with Americans. With little explanation or accounting, the sponsor took steep deductions from their paychecks for housing, transportation and insurance that left many of them too little money to afford the tourist wanderings they had eagerly anticipated.
Program documents and interviews with 15 students show that Cetusa failed to heed many distress signals from students over many months, and responded to some with threats of expulsion from the program.
A Cry for Help
Mr. Ureche, 22, an engineering student, said he had begun to appeal to Cetusa for a different job as soon as he went to work lifting boxes loaded with Hershey’s candies.
“I’ve been having serious back pains since the first day of work,” Mr. Ureche reported in his e-mail to the State Department on June 6, sent two weeks after he started on the job. “If I continue in this rythm of work, it may cause me serious health damages.”
He felt “mistreated and ignored by my sponsor,” he wrote. And the organization told him, he said, that if he complained to Washington, “they will immediately cancel my visa.” A few days later Mr. Ureche quit his job, making his way to New York and finding work.
When the walkout came two months later, State Department officials reacted swiftly, opening an investigation centering on Cetusa that has not yet concluded.
The department was already on notice about trouble in the program, after an Associated Press investigation last year found abuses of foreign students in several states. Officials started a broad review earlier this year, and on July 15 they inaugurated new regulations, which tighten requirements on sponsor organizations to ensure that students are matched with jobs that are appropriate and safe. A newly expanded staff of 18 inspectors will begin on-site audits of sponsor organizations this fall, officials said.
“We are asking hard questions,” said Rick Ruth, the State Department official in charge of cultural exchange programs — questions that include whether the program should be scaled back in light of the hard times in the United States.
Cetusa responded to the protest by arranging for students to have a paid week off from the plant and by paying for two trips to historic sites in Pennsylvania. The Hershey Company hosted a daylong visit to its headquarters so students could learn about its business strategies.
“This is a beautiful, great program,” Rick Anaya, Cetusa’s chief executive, said of the cultural exchange.
Mr. Anaya said he was aware that the work in Palmyra was strenuous. “It is hard to lift,” he said. “But they get used to it and they are fine with it after a week or so.” He said all students had received and signed job descriptions before going to Palmyra. If packing work seemed too difficult, he said, “they didn’t have to sign up for the job.”
Mr. Anaya blamed the discontent on the National Guestworker Alliance, a labor group that helped organize the walkout, together with the A.F.L.-C.I.O. and other unions.
“It’s clear and obvious to me that this whole thing was started and fueled by the unions,” he said.
The foreign students’ travails did prove fertile ground for the alliance, an advocate for temporary foreign workers. Joined by some of the students, the alliance since August has led a campaign against the Hershey Company, accusing it of exploiting foreign students to displace American workers. Some students agreed.
“They take students who came on a cultural exchange to slave for them and make next to nothing, when these jobs could be going to families in Pennsylvania,” said Godwin Efobi, 26, a Nigerian medical student who was a protest leader.
Created under a 1961 law, the State Department’s summer work and travel program was designed to give foreign university students who do not hail from wealthy elites at home a brief plunge into American life, at no cost to the American taxpayer. The students come on a visa known as a J-1, which allows them to work for up to four months and travel for a month.
Students in the program, a legacy of the cold war, come mainly from China, Russia and Eastern European countries, with some from Latin America. Traditionally they have been employed in national parks, amusement parks, summer camps, beach resorts and restaurants, in low-wage but congenial jobs.
Over the years the program has won many happy reviews after students returned home. But in recent years it grew rapidly, to 150,000 students in 2008 from about 30,000 a decade earlier. It is now bigger than most federal programs explicitly dedicated to importing temporary foreign workers.
As a cultural exchange, the program is not monitored by labor authorities, said Daniel Costa, an analyst at the Economic Policy Institute in Washington who studies the J-1 program. Unlike the Department of Labor, he said, the State Department does not collect employment data that would show, for example, how many students have been placed in factories like the Palmyra plant, or how recently.
Mr. Anaya said his organization began by sending small numbers of students to Palmyra six years ago, for the annual summertime surge packing Hershey’s candies for the Christmas holidays.
Many students were surprised to learn when they arrived that they would not work for Hershey, but for SHS OnSite Solutions, a staffing subcontractor for Exel, the contractor Hershey hired to operate the packing plant.
This summer Cetusa placed nearly 400 students in jobs in which, according to the fine print of the visa papers, the worker “spends 100 per cent of the shift standing, walking, stooping, bending or lifting” and “involved in repetitive motion work,” and must be able to “lift up to 27 kilograms throughout the shift” (about 60 pounds) and “function effectively” in a cold room.
‘This Is America’
Many students — including many who left Hershey before the labor groups arrived — said the jobs were an immersion in misery.
Ignacio Torres Sibaja, a 21-year-old graphic arts student, said he started to weep at his home in Costa Rica when he read news about the walkout, recalling the three months he had worked in Palmyra last winter.
“I spent some of the worst moments of my life during that exchange,” Mr. Torres said. Speaking by telephone from San José, he said he had applied for a resort job in Colorado, learning only at the last minute that he would go to Palmyra. He did not focus on the job description. He thought working for Hershey would be fun.
A partner organization of Cetusa in Costa Rica had assured him he would earn back the $4,000 he borrowed from his parents to pay airfare and charges by Cetusa for his visa and their fees, Mr. Torres said. But that partner went out of business days after he arrived in the United States. With no guidance from either group, Mr. Torres spent two sleepless days, including one night he passed in a mall, finding a bus from Kennedy Airport to Harrisburg, Pa., where Cetusa has an office.
A representative of the sponsor greeted him with a demand for $800 to cover a rental deposit on an apartment, Mr. Torres said. The agency provided small apartments, often 15 miles away in Harrisburg, charging each student $400 a month to live in cramped quarters with four or five others.
Mr. Torres turned over all the spending money — $500 — he had scraped together for the trip. “I spent a week not eating,” he said.
At the plant, Mr. Torres said, “the packing line gets really, really fast and stressful.” Even though the plant was chilled below 60 degrees, he said, “I would be sweating all over.”
Mr. Torres echoed many students when he said his lowest moment came with his first paycheck. After deductions by Cetusa for rent, utilities, bus fare and other items, he took home $85 for 35 hours of work.
“You wanted a cultural exchange,” Mr. Torres was told by the group representative, he said. “This is America and this is the way we do things here.” Although Cetusa is a nonprofit organization in the United States, commercial affiliates manage housing and insurance for its international student programs.
Mr. Torres finished his job too broke to travel in the United States, he said, and went home in debt, feeling cheated.
Hoping to Be Fired
In June, a student from China, Tian Jia Yi, started on the packing line at the Hershey plant. Mr. Tian, 20, a hotel management student at a college in Qingdao, quickly discovered that the work was too much for him.
“My supervisor always ordered us to carry these chocolate boxes every day that were too heavy,” Mr. Tian said in a telephone interview, striving to express himself in correct English. After he pleaded for weeks for a different task, he said, a manager fired him.
“In a stupid way it was my dream that she would fire me because I can’t bear that work anymore,” he said.
Cetusa offered Mr. Tian a new job in California — but he had no money to travel there. Then the organization ordered him to leave his Pennsylvania apartment. Stranded and alone, he managed to locate a relative in Flushing, Queens, and retreated there to search, unavailingly, for a new job.
“I feel very ashamed that I have to spend a lot of money that my parents sent to me,” Mr. Tian said.
Under program regulations, sponsors are required to monitor their students throughout their stay here. But according to State Department officials, Mr. Anaya told them shortly after the walkout that his organization had not received any complaints from Palmyra before the protest.
Mr. Torres, however, recalled a gathering in March when dozens of students assailed Cetusa representatives with their grievances. “Everybody rose up and started confronting them,” he said. In early August, a representative of the organization, Malgorzata Tekgoz, worked a night shift at the Palmyra plant to assess conditions there.
“It was fine, of course I got tired at the end of the shift,” she reported in an e-mail. She said students had raised “many standard complaints,” particularly about “the frequency of lifting boxes.”
With some students who pressed their case, Cetusa played tough. When the agency learned that Mr. Ureche had complained to the State Department, it terminated his participation in the program, putting him in violation of his visa, according to correspondence between Alan J. Leahy, a lawyer for the organization, and a labor lawyer Mr. Ureche contacted, Laurence E. Norton.
Still, dozens of foreign students employed at the Palmyra plant did not join the protests, State Department officials noted. Some had returned this summer for a second tour there.
Lenka Vavrova, 23, a student from Slovakia, told Cetusa in an e-mail that she was “ashamed of who some of these students are.” She wrote, “We came here for a short time so we should respect the conditions and laws.”
Mr. Anaya said he was convinced the demonstrators had been “misled and sold a bill of goods that is unfair to them” by the labor groups. “I do believe in the kids,” he said. “I believe eventually they will feel sorry for what they did.”
Saket Soni, director of the National Guestworker Alliance, said mobilizing the students had not been hard. “We talked about fundamental labor and civil rights protections that cover these students, and all workers in America,” he said. After the walkout, Exel, the contractor, said it would no longer use J-1 students at Palmyra.
Mr. Ureche said he remained disappointed by his experience here. “The students are not cheap workers,” he said. “They are coming here to meet new people, make some money for travel. This program is not for living without food, without money, without nothing.”
Alain Delaquérière contributed reporting.