History

The Cold War as game theory

From Charles Platt’s “The Profits of Fear” (August 2005):

Game theory began with the logical proposition that in a strategic two-player game, either player may try to obtain an advantage by bluffing. If the stakes are low, perhaps you can take a chance on trusting your opponent when he makes a seemingly fair and decent offer; but when the penalty for being deceived can be nuclear annihilation, taking a chance is out of the question. You work on the principle that the person you are dealing with may be utterly ruthless, unethical, and untrustworthy, no matter how peaceful his intentions may seem. You also have to assume that he may be smart enough to use game theory just like you; and therefore, he will assume that _you_ are ruthless, unethical, and untrustworthy, no matter how peaceful _your_ intentions may seem. In this way a supposedly rational system of assessment leads to a highly emotional outcome in which trust becomes impossible and strategy is based entirely on fear. This is precisely what happened during the decades of the Cold War.
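
Platt’s reasoning is the maximin rule from game theory: score each strategy by its worst case and pick the least bad. A minimal sketch of that rule, with payoff numbers invented purely for illustration (they are not in Platt’s essay):

```python
# Maximin ("assume the worst") reasoning in a two-strategy trust game.
# Payoffs are illustrative only: rows are our strategy, columns are what
# the opponent actually does.
payoffs = {
    "trust":    {"honest": 10, "betrays": -1000},  # deceived = annihilation
    "distrust": {"honest": 1,  "betrays": -10},
}

def maximin(payoffs):
    # Score each strategy by its worst-case payoff, then pick the strategy
    # whose worst case is least bad.
    worst = {s: min(outcomes.values()) for s, outcomes in payoffs.items()}
    return max(worst, key=worst.get)

print(maximin(payoffs))  # "distrust"
```

Since the opponent runs the same computation, both sides settle on distrust, which is exactly the trust-proof standoff Platt describes.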

Prescription drug spending has vastly increased in 25 years

From Clifton Leaf’s “The Law of Unintended Consequences” (Fortune: 19 September 2005):

Whatever the answer, it’s clear who pays for it. You do. You pay in the form of vastly higher drug prices and health-care insurance. Americans spent $179 billion on prescription drugs in 2003. That’s up from … wait for it … $12 billion in 1980 [when the Bayh-Dole Act was passed]. That’s a 13% hike, year after year, for two decades. Of course, what you don’t pay as a patient you pay as a taxpayer. The U.S. government picks up the tab for one in three Americans by way of Medicare, Medicaid, the military, and other programs. According to the provisions of Bayh-Dole, the government gets a royalty-free use, forever, of its funded inventions. It has never tried to collect. You might say the taxpayers pay for the hat–and have it handed to them.
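
Leaf’s year-over-year figure checks out against the compound-growth formula; this quick sketch uses only the two dollar amounts and the 1980–2003 span quoted above:

```python
# Implied annual growth rate for $12 billion (1980) -> $179 billion (2003).
start, end = 12e9, 179e9
years = 2003 - 1980  # 23 years

annual_rate = (end / start) ** (1 / years) - 1
print(f"{annual_rate:.1%}")  # 12.5% per year, roughly the 13% Leaf cites
```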

What patents on life has wrought

From Clifton Leaf’s “The Law of Unintended Consequences” (Fortune: 19 September 2005):

The Supreme Court’s decision in 1980 to allow for the patenting of living organisms opened the spigots to individual claims of ownership over everything from genes and protein receptors to biochemical pathways and processes. Soon, research scientists were swooping into patent offices around the world with “invention” disclosures that weren’t so much products or processes as they were simply knowledge–or research tools to further knowledge.

The problem is, once it became clear that individuals could own little parcels of biology or chemistry, the common domain of scientific exchange–that dynamic place where theories are introduced, then challenged, and ultimately improved–began to shrink. What’s more, as the number of claims grows, so do the overlapping claims and legal challenges. …

In October 1990 a researcher named Mary-Claire King at the University of California at Berkeley told the world that there was a breast-cancer susceptibility gene–and that it was on chromosome 17. Several other groups, sifting through 30 million base pairs of nucleotides to find the precise location of the gene, helped narrow the search with each new discovery. Then, in the spring of 1994, a team led by Mark Skolnick at the University of Utah beat everyone to the punch–identifying a gene with 5,592 base pairs that codes for a protein nearly 1,900 amino acids long. Skolnick’s team rushed to file a patent application and was issued title to the discovery three years later.

By all accounts the science was a collective effort. The NIH had funded scores of investigative teams around the country and given nearly 1,200 separate research grants to learn everything there was to learn about the genetics of breast cancer.

The patent, however, is licensed to one company–Skolnick’s. Myriad Genetics, a company the researcher founded in 1991, now insists on doing all U.S. testing for the presence of unknown mutations in the two related genes, BRCA1 and BRCA2. Those who have a mutation in either gene have as high as an 86% chance of getting cancer, say experts. The cost for the complete two-gene analysis: $2,975.

Critics say that Myriad’s ultrarestrictive licensing of the technology–one funded not only by federal dollars but also aided by the prior discoveries of hundreds of other scientists–is keeping the price of the test artificially high. Skolnick, 59, claims that the price is justified by his company’s careful analysis of thousands of base pairs of DNA, each of which is prone to a mutation or deletion, and by its educational outreach programs.

1980 Bayh-Dole Act created the biotech industry … & turned universities into businesses

From Clifton Leaf’s “The Law of Unintended Consequences” (Fortune: 19 September 2005):

For a century or more, the white-hot core of American innovation has been basic science. And the foundation of basic science has been the fluid exchange of ideas at the nation’s research universities. It has always been a surprisingly simple equation: Let scientists do their thing and share their work–and industry picks up the spoils. Academics win awards, companies make products, Americans benefit from an ever-rising standard of living.

That equation still holds, with the conspicuous exception of medical research. In this one area, something alarming has been happening over the past 25 years: Universities have evolved from public trusts into something closer to venture capital firms. What used to be a scientific community of free and open debate now often seems like a litigious scrum of data-hoarding and suspicion. And what’s more, Americans are paying for it through the nose. …

From 1992 to September 2003, pharmaceutical companies tied up the federal courts with 494 patent suits. That’s more than the number filed in the computer hardware, aerospace, defense, and chemical industries combined. Those legal expenses are part of a giant, hidden “drug tax”–a tax that has to be paid by someone. And that someone, as you’ll see below, is you. You don’t get the tab all at once, of course. It shows up in higher drug costs, higher tuition bills, higher taxes–and tragically, fewer medical miracles.

So how did we get to this sorry place? It was one piece of federal legislation that you’ve probably never heard of–a 1980 tweak to the U.S. patent and trademark law known as the Bayh-Dole Act. That single law, named for its sponsors, Senators Birch Bayh and Bob Dole, in essence transferred the title of all discoveries made with the help of federal research grants to the universities and small businesses where they were made.

Prior to the law’s enactment, inventors could always petition the government for the patent rights to their own work, though the rules were different at each federal agency; some 20 different statutes governed patent policy. The law simplified the “technology transfer” process and, more important, changed the legal presumption about who ought to own and develop new ideas–private enterprise as opposed to Uncle Sam. The new provisions encouraged academic institutions to seek out the clever ideas hiding in the backs of their research cupboards and to pursue licenses with business. And it told them to share some of the take with the actual inventors.

On the face of it, Bayh-Dole makes sense. Indeed, supporters say the law helped create the $43-billion-a-year biotech industry and has brought valuable drugs to market that otherwise would never have seen the light of day. What’s more, say many scholars, the law has created megaclusters of entrepreneurial companies–each an engine for high-paying, high-skilled jobs–all across the land.

That all sounds wonderful. Except that Bayh-Dole’s impact wasn’t so much in the industry it helped create, but rather in its unintended consequence–a legal frenzy that’s diverting scientists from doing science. …

A 1979 audit of government-held patents showed that fewer than 5% of some 28,000 discoveries–all of them made with the help of taxpayer money–had been developed, because no company was willing to risk the capital to commercialize them without owning title. …

A dozen schools–notably MIT, Stanford, the University of California, Johns Hopkins, and the University of Wisconsin–already had campus offices to work out licensing arrangements with government agencies and industry. But within a few years Technology Licensing Offices (or TLOs) were sprouting up everywhere. In 1979, American universities received 264 patents. By 1991, when a new organization, the Association of University Technology Managers, began compiling data, North American institutions (including colleges, research institutes, and hospitals) had filed 1,584 new U.S. patent applications and negotiated 1,229 licenses with industry–netting $218 million in royalties. By 2003 such institutions had filed five times as many new patent applications; they’d done 4,516 licensing deals and raked in over $1.3 billion in income. And on top of all that, 374 brand-new companies had sprouted from the wells of university research. That meant jobs pouring back into the community …

The anecdotal reports, fun “discovery stories” in alumni magazines, and numbers from the yearly AUTM surveys suggested that the academic productivity marvel had spread far and wide. But that’s hardly the case. Roughly a third of the new discoveries and more than half of all university licensing income in 2003 derived from just ten schools–MIT, Stanford, the usual suspects. They are, for the most part, the institutions that were pursuing “technology transfer” long before Bayh-Dole. …

Court dockets are now clogged with university patent claims. In 2002, North American academic institutions spent over $200 million in litigation (though some of that was returned in judgments)–more than five times the amount spent in 1991. Stanford Law School professor emeritus John Barton notes, in a 2000 study published in Science, that the indicator that correlates most perfectly with the rise in university patents is the number of intellectual-property lawyers. (Universities also spent $142 million on lobbying over the past six years.) …

So what do universities do with all their cash? That depends. Apart from the general guidelines provided by Bayh-Dole, which indicate the proceeds must be used for “scientific research or education,” there are no instructions. “These are unrestricted dollars that they can use, and so they’re worth a lot more than other dollars,” says University of Michigan law professor Rebecca Eisenberg, who has written extensively about the legislation. The one thing no school seems to use the money for is tuition–which apparently has little to do with “scientific research or education.” Meanwhile, the cost of university tuition has soared at a rate more than twice as high as inflation from 1980 to 2005.

“Have you ever been admitted to a mental institution?”

From Tom Stites’s “Guest Posting: Is Media Performance Democracy’s Critical Issue?” (Center for Citizen Media: Blog: 3 July 2006):

And then there were [Walter] Annenberg’s political shenanigans – he shamelessly used his news columns [in The Philadelphia Inquirer] to embarrass candidates who dared to run against his favorites. One day in 1966 a Democrat named Milton Shapp held a press conference while running for governor and Annenberg’s hand-picked political reporter asked him only one question. The question was, “Mr. Shapp, have you ever been admitted to a mental institution?” “Why no,” Shapp responded, and went away scratching his head about this odd question. The next morning he didn’t need to scratch his head any more. A five-column front page Inquirer headline read, “Shapp Denies Mental Institution Stay.” I’m not making this up. I’ve seen the clipping – a friend used to have a framed copy above his desk. Those were not the good old days.

Corporate consolidation reigns in American business, & that’s a problem

From Barry C. Lynn’s “The Case for Breaking Up Wal-Mart” (Harper’s: 24 July 2006):

It is now twenty-five years since the Reagan Administration eviscerated America’s century-long tradition of antitrust enforcement. For a generation, big firms have enjoyed almost complete license to use brute economic force to grow only bigger. And so today we find ourselves in a world dominated by immense global oligopolies that every day further limit the flexibility of our economy and our personal freedom within it. There are still many instances of intense competition — just ask General Motors.

But since the great opening of global markets in the early 1990s, the tendency within most of the systems we rely on for manufactured goods, processed commodities, and basic services has been toward ever more extreme consolidation. Consider raw materials: three firms control almost 75 percent of the global market in iron ore. Consider manufacturing services: Owens Illinois has rolled up roughly half the global capacity to supply glass containers. We see extreme consolidation in heavy equipment: General Electric builds 60 percent of large gas turbines as well as 60 percent of large wind turbines. In processed materials: Corning produces 60 percent of the glass for flat-screen televisions. Even in sneakers: Nike and Adidas split a 60-percent share of the global market. Consolidation reigns in banking, meatpacking, oil refining, and grains. It holds even in eyeglasses, a field in which the Italian firm Luxottica has captured control over five of the six national outlets in the U.S. market.

The stakes could not be higher. In systems where oligopolies rule unchecked by the state, competition itself is transformed from a free-for-all into a kind of private-property right, a license to the powerful to fence off entire marketplaces, there to pit supplier against supplier, community against community, and worker against worker, for their own private gain. When oligopolies rule unchecked by the state, what is perverted is the free market itself, and our freedom as individuals within the economy and ultimately within our political system as well.

The first movie theater

From Adam Goodheart’s “10 Days That Changed History” (The New York Times: 2 July 2006):

APRIL 16, 1902: The Movies

Motion pictures seemed destined to become a passing fad. Only a few years after Edison’s first crude newsreels were screened — mostly in penny arcades, alongside carnival games and other cheap attractions — the novelty had worn off, and Americans were flocking back to live vaudeville.

Then, in spring 1902, Thomas L. Tally opened his Electric Theater in Los Angeles, a radical new venture devoted to movies and other high-tech devices of the era, like audio recordings.

“Tally was the first person to offer a modern multimedia entertainment experience to the American public,” says the film historian Marc Wanamaker. Before long, his successful movie palace produced imitators nationally, which would become known as “nickelodeons.”

The day FDR was almost assassinated

From Adam Goodheart’s “10 Days That Changed History” (The New York Times: 2 July 2006):

FEB. 15, 1933: The Wobbly Chair

It should have been an easy shot: five rounds at 25 feet. But the gunman, Giuseppe Zangara, an anarchist, lost his balance atop a wobbly chair, and instead of hitting President-elect Franklin D. Roosevelt, he fatally wounded the mayor of Chicago, who was shaking hands with F.D.R.

Had Roosevelt been assassinated, his conservative Texas running mate, John Nance Garner, would most likely have come to power. “The New Deal, the move toward internationalism – these would never have happened,” says Alan Brinkley of Columbia University. “It would have changed the history of the world in the 20th century. I don’t think the Kennedy assassination changed things as much as Roosevelt’s would have.”

The date Silicon Valley (& Intel) was born

From Adam Goodheart’s “10 Days That Changed History” (The New York Times: 2 July 2006):

SEPT. 18, 1957: Revolt of the Nerds

Fed up with their boss, eight lab workers walked off the job on this day in Mountain View, Calif. Their employer, William Shockley, had decided not to continue research into silicon-based semiconductors; frustrated, they decided to undertake the work on their own. The researchers — who would become known as “the traitorous eight” — went on to invent the microprocessor (and to found Intel, among other companies). “Sept. 18 was the birth date of Silicon Valley, of the electronics industry and of the entire digital age,” says Mr. Shockley’s biographer, Joel Shurkin.

Where we are now

From Gary Kamiya’s “License to lie” (Salon: 23 June 2006):

We are in a peculiar moment, one in which our politicians seem unable to articulate or even grasp the train wreck unfolding in front of them. Someday in the future, if the Democratic Party manages to transform itself from a cowering shadow to something approaching sentience, perhaps what really happened during the Bush era will be publicly debated.

Perhaps then we can ask how it happened that the government of the United States was hijacked by a bullying, fact-averse religious fanatic and his puppetmaster, an evil courtier out of Shakespeare. How we were plunged into a disastrous war simply because a cabal of ideologues and right-wing zealots, operating in autocratic secrecy, decided they wanted war. And how all of the normal workings of a democratic government — objective analysis, checks and balances, transparency — were simply trashed by an administration waving the bloody shirt of “terror.”

The origin of broadcast journalism

From Nicholas Lemann’s “The Murrow Doctrine” (The New Yorker: 23 & 30 January 2006: 38-43):

There is a memorable entry in William Shirer’s Berlin Diary in which he describes – as, in effect, something that happened at work one day – the birth of broadcast journalism. It was Sunday, March 13, 1938, the day after Nazi troops entered Austria. Shirer, in London, got a call from CBS headquarters, in New York, asking him to put together a broadcast in which radio correspondents in the major capitals of Europe, led by Shirer’s boss, Edward R. Murrow, who was on the scene in Vienna, would offer a series of live reports on Hitler’s move and the reaction to it.

Writers take a while to attain full power

From Thomas Babington Macaulay’s “A Speech Delivered In The Committee of the House Of Commons On The 6th Of April 1842” (Prime Palaver #4: 1 September 2001):

It is the law of our nature that the mind shall attain its full power by slow degrees; and this is especially true of the most vigorous minds. Young men, no doubt, have often produced works of great merit; but it would be impossible to name any writer of the first order whose juvenile performances were his best. That all the most valuable books of history, of philology, of physical and metaphysical science, of divinity, of political economy, have been produced by men of mature years will hardly be disputed. The case may not be quite so clear as respects works of the imagination. And yet I know no work of the imagination of the very highest class that was ever, in any age or country, produced by a man under thirty-five. Whatever powers a youth may have received from nature, it is impossible that his taste and judgment can be ripe, that his mind can be richly stored with images, that he can have observed the vicissitudes of life, that he can have studied the nicer shades of character. How, as Marmontel very sensibly said, is a person to paint portraits who has never seen faces? On the whole, I believe that I may, without fear of contradiction, affirm this, that of the good books now extant in the world more than nineteen-twentieths were published after the writers had attained the age of forty.

Macaulay in 1841: copyright a tax on readers

From Thomas Babington Macaulay’s “A Speech Delivered In The House Of Commons On The 5th Of February 1841” (Prime Palaver #4: 1 September 2001):

The principle of copyright is this. It is a tax on readers for the purpose of giving a bounty to writers. The tax is an exceedingly bad one; it is a tax on one of the most innocent and most salutary of human pleasures; and never let us forget, that a tax on innocent pleasures is a premium on vicious pleasures. I admit, however, the necessity of giving a bounty to genius and learning. In order to give such a bounty, I willingly submit even to this severe and burdensome tax. Nay, I am ready to increase the tax, if it can be shown that by so doing I should proportionally increase the bounty. My complaint is, that my honourable and learned friend doubles, triples, quadruples, the tax, and makes scarcely any perceptible addition to the bounty.

Macaulay in 1841 on the problems of the copyright monopoly

From Thomas Babington Macaulay’s “A Speech Delivered In The House Of Commons On The 5th Of February 1841” (Prime Palaver #4: 1 September 2001):

The question of copyright, Sir, like most questions of civil prudence, is neither black nor white, but grey. The system of copyright has great advantages and great disadvantages; and it is our business to ascertain what these are, and then to make an arrangement under which the advantages may be as far as possible secured, and the disadvantages as far as possible excluded. …

We have, then, only one resource left. We must betake ourselves to copyright, be the inconveniences of copyright what they may. Those inconveniences, in truth, are neither few nor small. Copyright is monopoly, and produces all the effects which the general voice of mankind attributes to monopoly. …

I believe, Sir, that I may with safety take it for granted that the effect of monopoly generally is to make articles scarce, to make them dear, and to make them bad. … Thus, then, stands the case. It is good that authors should be remunerated; and the least exceptionable way of remunerating them is by a monopoly. Yet monopoly is an evil. For the sake of the good we must submit to the evil; but the evil ought not to last a day longer than is necessary for the purpose of securing the good. …

For consider this; the evil effects of the monopoly are proportioned to the length of its duration. But the good effects for the sake of which we bear with the evil effects are by no means proportioned to the length of its duration. A monopoly of sixty years produces twice as much evil as a monopoly of thirty years, and thrice as much evil as a monopoly of twenty years. But it is by no means the fact that a posthumous monopoly of sixty years gives to an author thrice as much pleasure and thrice as strong a motive as a posthumous monopoly of twenty years. On the contrary, the difference is so small as to be hardly perceptible. We all know how faintly we are affected by the prospect of very distant advantages, even when they are advantages which we may reasonably hope that we shall ourselves enjoy. But an advantage that is to be enjoyed more than half a century after we are dead, by somebody, we know not by whom, perhaps by somebody unborn, by somebody utterly unconnected with us, is really no motive at all to action. …

Dr Johnson died fifty-six years ago. If the law were what my honourable and learned friend wishes to make it, somebody would now have the monopoly of Dr Johnson’s works. Who that somebody would be it is impossible to say; but we may venture to guess. I guess, then, that it would have been some bookseller, who was the assign of another bookseller, who was the grandson of a third bookseller, who had bought the copyright from Black Frank, the doctor’s servant and residuary legatee, in 1785 or 1786. Now, would the knowledge that this copyright would exist in 1841 have been a source of gratification to Johnson? Would it have stimulated his exertions? Would it have once drawn him out of his bed before noon? Would it have once cheered him under a fit of the spleen? Would it have induced him to give us one more allegory, one more life of a poet, one more imitation of Juvenal? I firmly believe not. I firmly believe that a hundred years ago, when he was writing our debates for the Gentleman’s Magazine, he would very much rather have had twopence to buy a plate of shin of beef at a cook’s shop underground. Considered as a reward to him, the difference between a twenty years’ and sixty years’ term of posthumous copyright would have been nothing or next to nothing. But is the difference nothing to us? I can buy Rasselas for sixpence; I might have had to give five shillings for it. I can buy the Dictionary, the entire genuine Dictionary, for two guineas, perhaps for less; I might have had to give five or six guineas for it. Do I grudge this to a man like Dr Johnson? Not at all. Show me that the prospect of this boon roused him to any vigorous effort, or sustained his spirits under depressing circumstances, and I am quite willing to pay the price of such an object, heavy as that price is. But what I do complain of is that my circumstances are to be worse, and Johnson’s none the better; that I am to give five pounds for what to him was not worth a farthing.
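
In modern terms, Macaulay’s point about “very distant advantages” is a discounting argument. A minimal formalization, with the 5% rate and the two horizons chosen purely for illustration:

```latex
% Present value today of a royalty R falling due t years from now,
% at discount rate r:
\[
  PV = \frac{R}{(1+r)^{t}}
\]
% With r = 0.05:  R / 1.05^{60}  \approx 0.053 R
%                 R / 1.05^{100} \approx 0.0076 R
% Both are already close to nothing, so forty extra years of posthumous
% monopoly add almost no motive -- "the difference is so small as to be
% hardly perceptible."
```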

The real purposes of the American school

From John Taylor Gatto’s “Against School” (Harper’s Magazine: September 2003):

Mass schooling of a compulsory nature really got its teeth into the United States between 1905 and 1915, though it was conceived of much earlier and pushed for throughout most of the nineteenth century. The reason given for this enormous upheaval of family life and cultural traditions was, roughly speaking, threefold:

1) To make good people.
2) To make good citizens.
3) To make each person his or her personal best.

These goals are still trotted out today on a regular basis, and most of us accept them in one form or another as a decent definition of public education’s mission, however short schools actually fall in achieving them. But we are dead wrong. Compounding our error is the fact that the national literature holds numerous and surprisingly consistent statements of compulsory schooling’s true purpose. We have, for example, the great H. L. Mencken, who wrote in The American Mercury for April 1924 that the aim of public education is not

to fill the young of the species with knowledge and awaken their intelligence. … Nothing could be further from the truth. The aim … is simply to reduce as many individuals as possible to the same safe level, to breed and train a standardized citizenry, to put down dissent and originality. That is its aim in the United States … and that is its aim everywhere else.

[Alexander Inglis, author of the 1918 book Principles of Secondary Education], for whom a lecture in education at Harvard is named, makes it perfectly clear that compulsory schooling on this continent was intended to be just what it had been for Prussia in the 1820s: a fifth column into the burgeoning democratic movement that threatened to give the peasants and the proletarians a voice at the bargaining table. Modern, industrialized, compulsory schooling was to make a sort of surgical incision into the prospective unity of these underclasses. Divide children by subject, by age-grading, by constant rankings on tests, and by many other more subtle means, and it was unlikely that the ignorant mass of mankind, separated in childhood, would ever reintegrate into a dangerous whole.

Inglis breaks down the purpose – the actual purpose – of modern schooling into six basic functions, any one of which is enough to curl the hair of those innocent enough to believe the three traditional goals listed earlier:

1) The adjustive or adaptive function. Schools are to establish fixed habits of reaction to authority. This, of course, precludes critical judgment completely. It also pretty much destroys the idea that useful or interesting material should be taught, because you can’t test for reflexive obedience until you know whether you can make kids learn, and do, foolish and boring things.

2) The integrating function. This might well be called “the conformity function,” because its intention is to make children as alike as possible. People who conform are predictable, and this is of great use to those who wish to harness and manipulate a large labor force.

3) The diagnostic and directive function. School is meant to determine each student’s proper social role. This is done by logging evidence mathematically and anecdotally on cumulative records. As in “your permanent record.” Yes, you do have one.

4) The differentiating function. Once their social role has been “diagnosed,” children are to be sorted by role and trained only so far as their destination in the social machine merits – and not one step further. So much for making kids their personal best.

5) The selective function. This refers not to human choice at all but to Darwin’s theory of natural selection as applied to what he called “the favored races.” In short, the idea is to help things along by consciously attempting to improve the breeding stock. Schools are meant to tag the unfit – with poor grades, remedial placement, and other punishments – clearly enough that their peers will accept them as inferior and effectively bar them from the reproductive sweepstakes. That’s what all those little humiliations from first grade onward were intended to do: wash the dirt down the drain.

6) The propaedeutic function. The societal system implied by these rules will require an elite group of caretakers. To that end, a small fraction of the kids will quietly be taught how to manage this continuing project, how to watch over and control a population deliberately dumbed down and declawed in order that government might proceed unchallenged and corporations might never want for obedient labor. …

Class may frame the proposition, as when Woodrow Wilson, then president of Princeton University, said the following to the New York City School Teachers Association in 1909: “We want one class of persons to have a liberal education, and we want another class of persons, a very much larger class, of necessity, in every society, to forgo the privileges of a liberal education and fit themselves to perform specific difficult manual tasks.” …

Now, you needn’t have studied marketing to know that there are two groups of people who can always be convinced to consume more than they need to: addicts and children. School has done a pretty good job of turning our children into addicts, but it has done a spectacular job of turning our children into children. Again, this is no accident. Theorists from Plato to Rousseau to our own Dr. Inglis knew that if children could be cloistered with other children, stripped of responsibility and independence, encouraged to develop only the trivializing emotions of greed, envy, jealousy, and fear, they would grow older but never truly grow up. …

Now for the good news. Once you understand the logic behind modern schooling, its tricks and traps are fairly easy to avoid. School trains children to be employees and consumers; teach your own to be leaders and adventurers. School trains children to obey reflexively; teach your own to think critically and independently. Well-schooled kids have a low threshold for boredom; help your own to develop an inner life so that they’ll never be bored. Urge them to take on the serious material, the grown-up material, in history, literature, philosophy, music, art, economics, theology – all the stuff schoolteachers know well enough to avoid. Challenge your kids with plenty of solitude so that they can learn to enjoy their own company, to conduct inner dialogues. Well-schooled people are conditioned to dread being alone, and they seek constant companionship through the TV, the computer, the cell phone, and through shallow friendships quickly acquired and quickly abandoned. Your children should have a more meaningful life, and they can.

First, though, we must wake up to what our schools really are: laboratories of experimentation on young minds, drill centers for the habits and attitudes that corporate society demands. Mandatory education serves children only incidentally; its real purpose is to turn them into servants. Don’t let your own have their childhoods extended, not even for a day. If David Farragut could take command of a captured British warship as a preteen, if Thomas Edison could publish a broadsheet at the age of twelve, if Ben Franklin could apprentice himself to a printer at the same age (then put himself through a course of study that would choke a Yale senior today), there’s no telling what your own kids could do. After a long life, and thirty years in the public school trenches, I’ve concluded that genius is as common as dirt. We suppress our genius only because we haven’t yet figured out how to manage a population of educated men and women. The solution, I think, is simple and glorious. Let them manage themselves.

The birth of Geology & gradualism as a paradigm shift from catastrophism

From Kim Stanley Robinson’s “Imagining Abrupt Climate Change : Terraforming Earth” (Amazon Shorts: 31 July 2005):

This view, by the way, was in keeping with a larger and older paradigm called gradualism, the result of a dramatic and controversial paradigm shift of its own from the nineteenth century, one that is still a contested part of our culture wars, having to do with the birth of geology as a field, and its discovery of the immense age of the Earth. Before that, Earth’s history tended to be explained in a kind of Biblical paradigm, in which the Earth was understood to be several thousand years old, because of genealogies in the Bible, so that landscape features tended to be explained by events like Noah’s flood. This kind of “catastrophism” paradigm was what led Josiah Whitney to maintain that Yosemite Valley must have been formed by a cataclysmic earthquake, for instance; there simply hadn’t been time for water and ice to have carved something as hard as granite. It was John Muir who made the gradualist argument for glacial action over millions of years; and the eventual acceptance of his explanation was part of the general shift to gradualist explanations for Earth’s landforms, which also meant there was enough time for evolution to have taken place. Gradualism also led by extension to thinking that the various climate regimes of the past had also come about fairly gradually.

America the aggressive

From Harold Pinter’s “Nobel Lecture: Art, Truth & Politics” (Nobel Prize: 7 December 2005):

Direct invasion of a sovereign state has never in fact been America’s favoured method. In the main, it has preferred what it has described as ‘low intensity conflict’. Low intensity conflict means that thousands of people die but slower than if you dropped a bomb on them in one fell swoop. It means that you infect the heart of the country, that you establish a malignant growth and watch the gangrene bloom. When the populace has been subdued – or beaten to death – the same thing – and your own friends, the military and the great corporations, sit comfortably in power, you go before the camera and say that democracy has prevailed. This was a commonplace in US foreign policy in the years to which I refer. …

The United States supported and in many cases engendered every right wing military dictatorship in the world after the end of the Second World War. I refer to Indonesia, Greece, Uruguay, Brazil, Paraguay, Haiti, Turkey, the Philippines, Guatemala, El Salvador, and, of course, Chile. The horror the United States inflicted upon Chile in 1973 can never be purged and can never be forgiven.

Hundreds of thousands of deaths took place throughout these countries. Did they take place? And are they in all cases attributable to US foreign policy? The answer is yes they did take place and they are attributable to American foreign policy. But you wouldn’t know it.

It never happened. Nothing ever happened. Even while it was happening it wasn’t happening. It didn’t matter. It was of no interest. The crimes of the United States have been systematic, constant, vicious, remorseless, but very few people have actually talked about them. You have to hand it to America. It has exercised a quite clinical manipulation of power worldwide while masquerading as a force for universal good. It’s a brilliant, even witty, highly successful act of hypnosis.

I put to you that the United States is without doubt the greatest show on the road. Brutal, indifferent, scornful and ruthless it may be but it is also very clever. As a salesman it is out on its own and its most saleable commodity is self love. It’s a winner. Listen to all American presidents on television say the words, ‘the American people’, as in the sentence, ‘I say to the American people it is time to pray and to defend the rights of the American people and I ask the American people to trust their president in the action he is about to take on behalf of the American people.’

It’s a scintillating stratagem. Language is actually employed to keep thought at bay. The words ‘the American people’ provide a truly voluptuous cushion of reassurance. You don’t need to think. Just lie back on the cushion. The cushion may be suffocating your intelligence and your critical faculties but it’s very comfortable. This does not apply of course to the 40 million people living below the poverty line and the 2 million men and women imprisoned in the vast gulag of prisons, which extends across the US. …

We have brought torture, cluster bombs, depleted uranium, innumerable acts of random murder, misery, degradation and death to the Iraqi people and call it ‘bringing freedom and democracy to the Middle East’. …

I have said earlier that the United States is now totally frank about putting its cards on the table. That is the case. Its official declared policy is now defined as ‘full spectrum dominance’. That is not my term, it is theirs. ‘Full spectrum dominance’ means control of land, sea, air and space and all attendant resources.

The United States now occupies 702 military installations throughout the world in 132 countries, with the honourable exception of Sweden, of course. We don’t quite know how they got there but they are there all right.

The United States possesses 8,000 active and operational nuclear warheads. Two thousand are on hair trigger alert, ready to be launched with 15 minutes warning. It is developing new systems of nuclear force, known as bunker busters. … We must remind ourselves that the United States is on a permanent military footing and shows no sign of relaxing it.

Why software is difficult to create … & will always be difficult

From Frederick P. Brooks, Jr.’s “No Silver Bullet: Essence and Accidents of Software Engineering” (Computer: Vol. 20, No. 4 [April 1987] pp. 10-19):

The familiar software project, at least as seen by the nontechnical manager, has something of this character; it is usually innocent and straightforward, but is capable of becoming a monster of missed schedules, blown budgets, and flawed products. So we hear desperate cries for a silver bullet–something to make software costs drop as rapidly as computer hardware costs do.

But, as we look to the horizon of a decade hence, we see no silver bullet. There is no single development, in either technology or management technique, that by itself promises even one order-of-magnitude improvement in productivity, in reliability, in simplicity. …

The essence of a software entity is a construct of interlocking concepts: data sets, relationships among data items, algorithms, and invocations of functions. This essence is abstract in that such a conceptual construct is the same under many different representations. It is nonetheless highly precise and richly detailed.

I believe the hard part of building software to be the specification, design, and testing of this conceptual construct, not the labor of representing it and testing the fidelity of the representation. We still make syntax errors, to be sure; but they are fuzz compared with the conceptual errors in most systems. …

Let us consider the inherent properties of this irreducible essence of modern software systems: complexity, conformity, changeability, and invisibility.

Complexity. Software entities are more complex for their size than perhaps any other human construct because no two parts are alike (at least above the statement level). …

Many of the classic problems of developing software products derive from this essential complexity and its nonlinear increases with size. From the complexity comes the difficulty of communication among team members, which leads to product flaws, cost overruns, schedule delays. From the complexity comes the difficulty of enumerating, much less understanding, all the possible states of the program, and from that comes the unreliability. From complexity of function comes the difficulty of invoking function, which makes programs hard to use. From complexity of structure comes the difficulty of extending programs to new functions without creating side effects. From complexity of structure come the unvisualized states that constitute security trapdoors.

Not only technical problems, but management problems as well come from the complexity. It makes overview hard, thus impeding conceptual integrity. It makes it hard to find and control all the loose ends. It creates the tremendous learning and understanding burden that makes personnel turnover a disaster.

Conformity. … No such faith comforts the software engineer. Much of the complexity that he must master is arbitrary complexity, forced without rhyme or reason by the many human institutions and systems to which his interfaces must conform. …

Changeability. … All successful software gets changed. Two processes are at work. First, as a software product is found to be useful, people try it in new cases at the edge of or beyond the original domain. The pressures for extended function come chiefly from users who like the basic function and invent new uses for it.

Second, successful software survives beyond the normal life of the machine vehicle for which it is first written. If not new computers, then at least new disks, new displays, new printers come along; and the software must be conformed to its new vehicles of opportunity. …

Invisibility. Software is invisible and unvisualizable. …

The reality of software is not inherently embedded in space. Hence, it has no ready geometric representation in the way that land has maps, silicon chips have diagrams, computers have connectivity schematics. As soon as we attempt to diagram software structure, we find it to constitute not one, but several, general directed graphs superimposed one upon another. The several graphs may represent the flow of control, the flow of data, patterns of dependency, time sequence, name-space relationships. These graphs are usually not even planar, much less hierarchical. …
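
Brooks’s “several general directed graphs superimposed” is easy to make concrete. The sketch below invents a tiny three-function program (it is not from the paper) and records two of his views, flow of control and flow of data, over the same nodes:

```python
# Two superimposed views of one imaginary program: the same three functions,
# but entirely different edge sets.
call_graph = {             # flow of control: who invokes whom
    ("report", "parse"),
    ("report", "validate"),
}
data_flow = {              # flow of data: whose output feeds whom
    ("parse", "validate"),
    ("validate", "report"),
}

print(call_graph & data_flow)  # set(): the two views share no edges
print(call_graph | data_flow)  # superimposed, the picture is already tangled
```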

Past Breakthroughs Solved Accidental Difficulties

If we examine the three steps in software technology development that have been most fruitful in the past, we discover that each attacked a different major difficulty in building software, but that those difficulties have been accidental, not essential, difficulties. …

High-level languages. Surely the most powerful stroke for software productivity, reliability, and simplicity has been the progressive use of high-level languages for programming. …

What does a high-level language accomplish? It frees a program from much of its accidental complexity. …
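
A contrived before-and-after may help here. The essential concept below is just “the sum of a sequence”; the low-level version drags in a cursor, an accumulator, and a bounds test, all of them accidental, while the high-level construct states only the essence:

```python
values = [3, 1, 4, 1, 5, 9]

# Accidental complexity exposed: index bookkeeping that is no part of the
# idea "sum".
total = 0
i = 0
while i < len(values):
    total += values[i]
    i += 1

# The high-level construct keeps only the essential concept.
assert total == sum(values)
```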

Time-sharing. Time-sharing brought a major improvement in the productivity of programmers and in the quality of their product, although not so large as that brought by high-level languages.

Time-sharing attacks a quite different difficulty. Time-sharing preserves immediacy, and hence enables one to maintain an overview of complexity. …

Unified programming environments. Unix and Interlisp, the first integrated programming environments to come into widespread use, seem to have improved productivity by integral factors. Why?

They attack the accidental difficulties that result from using individual programs together, by providing integrated libraries, unified file formats, and pipes and filters. As a result, conceptual structures that in principle could always call, feed, and use one another can indeed easily do so in practice.
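
That composability can be sketched in a few lines. Here each stage consumes and produces plain lines of text, as an analogy to Unix text streams rather than a reconstruction of the actual tools:

```python
# Filters compose because every stage agrees on one format: an iterable of
# strings, standing in for Unix's unified text streams.
def grep(pattern, lines):        # keep matching lines, like grep
    return (l for l in lines if pattern in l)

def count(lines):                # count lines, like wc -l
    return sum(1 for _ in lines)

log = ["GET /", "POST /login", "GET /about"]
print(count(grep("GET", log)))   # 2, composed like `grep GET log | wc -l`
```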

Paradigm shifts explained

From Kim Stanley Robinson’s “Imagining Abrupt Climate Change : Terraforming Earth” (Amazon Shorts: 31 July 2005):

… paradigm shifts are exciting moments in science’s ongoing project of self-improvement, making itself more accurately mapped to reality as it is discovered and teased out; this process of continual recalibration and improvement is one of the most admirable parts of science, which among other things is a most powerful and utopian set of mental habits; an attitude toward reality that I have no hesitation in labeling a kind of worship or devotion. And in this ongoing communal act of devotion, paradigm shifts are very good at revealing how science is conducted, in part because each one represents a little (or big) crisis of understanding.

As Thomas Kuhn described the process in his seminal book The Structure of Scientific Revolutions, workers in the various branches of science build over time an interconnected construct of concepts and beliefs that allow them to interpret the data from their experiments, and fit them into a larger picture of the world that makes the best sense of the evidence at hand. What is hoped for is a picture that, if anyone else were to question it, and follow the train of reasoning and all the evidence used to support it, they too would agree with it. This is one of the ways science is interestingly utopian; it attempts to say things that everyone looking at the same evidence would agree to.

So, using this paradigm, always admitted to be a work in progress, scientists then conduct what Kuhn calls “normal science,” elucidating further aspects of reality by using the paradigm to structure their questions and their answers. Sometimes paradigms are useful for centuries; other times, for shorter periods. Then it often happens that scientists in the course of doing “normal science” begin to get evidence from the field that cannot be explained within the paradigm that has been established. At first such “anomalies” are regarded as suspect in themselves, precisely because they don’t fit the paradigm. They’re oddities, and something might be wrong with them as such. Thus they are ignored, or tossed aside, or viewed with suspicion, or in some other way bracketed off. Eventually, if enough of them pile up, and they seem similar in kind, or otherwise solid as observations, attempts might be made to explain them within the old paradigm, by tweaking or re-interpreting the paradigm itself, without actually throwing the paradigm out entirely.

For instance, when it was found that Newtonian laws of gravitation could not account for the speed of Mercury, which was moving a tiny bit faster than it ought to have been, even though Newton’s laws accounted for all the other planets extremely well, at first some astronomers suggested there might be another planet inside the orbit of Mercury, too close to the Sun for us to see. They even gave this potential planet a name, Vulcan; but they couldn’t see it, and calculations revealed that this hypothetical Vulcan still would not explain the discrepancy in Mercury’s motion. The discrepancy remained an anomaly, and was real enough and serious enough to cast the whole Newtonian paradigm into doubt among the small group of people who worried about it and wondered what could be causing it.

It was Einstein who then proposed that Mercury moved differently than predicted because spacetime itself curved around masses, and near the huge mass of the Sun the effect was large enough to be noticeable.

Whoah! This was a rather mind-bogglingly profound explanation for a little orbital discrepancy in Mercury; but Einstein also made a new prediction and suggested an experiment; if his explanation were correct, then light too would bend in the gravity well around the sun, and so the light of a star would appear from behind the sun a little bit before the astronomical tables said that it should. The proposed experiment presented some observational difficulties, but a few years later it was accomplished during a total eclipse of the sun, and the light of a certain star appeared before it ought to have by just the degree Einstein had predicted. And so Einstein’s concepts concerning spacetime began to be accepted and elaborated, eventually forming a big part of the paradigm known as the “standard model,” within which new kinds of “normal science” in physics and astronomy could be done. …
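
For reference, both effects Robinson narrates have standard closed forms in general relativity; these are textbook results, not formulas from the essay. The extra perihelion advance per orbit (which for Mercury accumulates to about 43 arcseconds per century) and the deflection of light grazing the Sun (about 1.75 arcseconds at the limb) are:

```latex
\[
  \Delta\varphi = \frac{6\pi G M}{c^{2}\, a\, (1 - e^{2})}
  \qquad
  \delta = \frac{4 G M}{c^{2}\, b}
\]
% M: solar mass; a, e: semi-major axis and eccentricity of the orbit;
% b: impact parameter of the light ray (the solar radius, at the limb).
```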

Origins of the interstate highway system

From Robert Sullivan’s “An Impala’s-Eye View of Highway History” (The New York Times: 14 July 2006):

Another traveler, Dwight D. Eisenhower, spent two months in 1919 driving a military convoy across the country; the shoddy roads left a lasting impression on him. After World War II he studied Hitler’s autobahn and concluded that the American military should have one. In 1956 he signed the Federal-Aid Highway Act, which, the president recounted in his memoir, resulted in enough concrete to build “six sidewalks to the moon.” The new highways were originally meant to loop around cities that could be skirted should they be destroyed by atomic bombs. Instead, the loops started a suburban construction boom that continues to this day. [Robert] Sullivan reports that Phoenix, a city that virtually rose out of the Interstate, currently gobbles up land at the rate of 1.2 acres per hour.

In the 1960’s state toll roads entered into the system, extending the web to all corners of the country. Today almost 47,000 miles of Interstate highways – with attendant motels, fast-food courts and construction projects – have paved over the continent with such efficiency that one can move from sea to shining sea with speed, economy and almost zero interpersonal interaction.
