
Poetry: Johnny Mercer’s “Early Autumn”

Man, these are beautiful, evocative lyrics by Johnny Mercer:

When an early autumn walks the land and chills the breeze
and touches with her hand the summer trees,
perhaps you’ll understand what memories I own.

There’s a dance pavilion in the rain all shuttered down,
a winding country lane all russet brown,
a frosty window pane shows me a town grown lonely.

That spring of ours that started so April-hearted,
seemed made for just a boy and girl.
I never dreamed, did you, any fall would come in view
so early, early.

Darling if you care, please, let me know,
I’ll meet you anywhere, I miss you so.
Let’s never have to share another early autumn.


David Foster Wallace on the impossibility of being informed & the seduction of dogma

From David Foster Wallace’s “Introduction” (The Best American Essays 2007):

Here is an overt premise. There is just no way that 2004’s reelection could have taken place—not to mention extraordinary renditions, legalized torture, FISA-flouting, or the passage of the Military Commissions Act—if we had been paying attention and handling information in a competent grown-up way. ‘We’ meaning as a polity and culture. The premise does not entail specific blame—or rather the problems here are too entangled and systemic for good old-fashioned finger-pointing. It is, for one example, simplistic and wrong to blame the for-profit media for somehow failing to make clear to us the moral and practical hazards of trashing the Geneva Conventions. The for-profit media is highly attuned to what we want and the amount of detail we’ll sit still for. And a ninety-second news piece on the question of whether and how the Geneva Conventions ought to apply in an era of asymmetrical warfare is not going to explain anything; the relevant questions are too numerous and complicated, too fraught with contexts in everything from civil law and military history to ethics and game theory. One could spend a hard month just learning the history of the Conventions’ translation into actual codes of conduct for the U.S. military … and that’s not counting the dramatic changes in those codes since 2002, or the question of just what new practices violate (or don’t) just which Geneva provisions, and according to whom. Or let’s not even mention the amount of research, background, cross-checking, corroboration, and rhetorical parsing required to understand the cataclysm of Iraq, the collapse of congressional oversight, the ideology of neoconservatism, the legal status of presidential signing statements, the political marriage of evangelical Protestantism and corporatist laissez-faire … There’s no way. You’d simply drown. We all would. It’s amazing to me that no one much talks about this—about the fact that whatever our founders and framers thought of as a literate, informed citizenry can no longer exist, at least not without a whole new modern degree of subcontracting and dependence packed into what we mean by ‘informed.’ [8]

[8] Hence, by the way, the seduction of partisan dogma. You can drown in dogmatism now, too—radio, Internet, cable, commercial and scholarly print—but this kind of drowning is more like sweet release. Whether hard right or new left or whatever, the seduction and mentality are the same. You don’t have to feel confused or inundated or ignorant. You don’t even have to think, for you already Know, and whatever you choose to learn confirms what you Know. This dogmatic lockstep is not the kind of inevitable dependence I’m talking about—or rather it’s only the most extreme and frightened form of that dependence.


Bernie Madoff & the 1st worldwide Ponzi scheme

From Diana B. Henriques’s “Madoff Scheme Kept Rippling Outward, Across Borders” (The New York Times: 20 December 2008):

But whatever else Mr. Madoff’s game was, it was certainly this: The first worldwide Ponzi scheme — a fraud that lasted longer, reached wider and cut deeper than any similar scheme in history, entirely eclipsing the puny regional ambitions of Charles Ponzi, the Boston swindler who gave his name to the scheme nearly a century ago.

Regulators say Mr. Madoff himself estimated that $50 billion in personal and institutional wealth from around the world was gone. … Before it evaporated, it helped finance Mr. Madoff’s coddled lifestyle, with a Manhattan apartment, a beachfront mansion in the Hamptons, a small villa overlooking Cap d’Antibes on the French Riviera, a Mayfair office in London and yachts in New York, Florida and the Mediterranean.

In 1960, as Wall Street was just shaking off its postwar lethargy and starting to buzz again, Bernie Madoff (pronounced MAY-doff) set up his small trading firm. His plan was to make a business out of trading lesser-known over-the-counter stocks on the fringes of the traditional stock market. He was just 22, a graduate of Hofstra University on Long Island.

By 1989, Mr. Madoff’s firm was handling more than 5 percent of the trading volume on the august New York Stock Exchange …

And in 1990, he became the nonexecutive chairman of the Nasdaq market, which at the time was operated as a committee of the National Association of Securities Dealers.

His rise on Wall Street was built on his belief in a visionary notion that seemed bizarre to many at the time: That stocks could be traded by people who never saw each other but were connected only by electronics.

In the mid-1970s, he had spent over $250,000 to upgrade the computer equipment at the Cincinnati Stock Exchange, where he began offering to buy and sell stocks that were listed on the Big Board. The exchange, in effect, was transformed into the first all-electronic computerized stock exchange.

He also invested in new electronic trading technology for his firm, making it cheaper for brokerage firms to fill their stock orders. He eventually gained a large amount of business from big firms like A. G. Edwards & Sons, Charles Schwab & Company, Quick & Reilly and Fidelity Brokerage Services.

By the end of the technology bubble in 2000, his firm was the largest market maker on the Nasdaq electronic market, and he was a member of the Securities Industry Association, now known as the Securities Industry and Financial Markets Association, Wall Street’s principal lobbying arm.


COBOL is much more widely used than you might think

From Darryl Taft’s “Enterprise Applications: 20 Things You Might Not Know About COBOL (as the Language Turns 50)” (eWeek: September 2009). http://www.eweek.com/c/a/Enterprise-Applications/20-Things-You-Might-Not-Know-About-COBOL-As-the-Language-Turns-50-103943/?kc=EWKNLBOE09252009FEA1. Accessed 25 September 2009.

Five billion lines of new COBOL are developed every year.

More than 80 percent of all daily business transactions are processed in COBOL.

More than 70 percent of all worldwide business data is stored on a mainframe.

More than 70 percent of mission-critical applications are in COBOL.

More than 310 billion lines of software are in use today and more than 200 billion lines are COBOL (65 percent of the total software).

There are 200 times more COBOL transactions per day than Google searches worldwide.

An estimated 2 million people are currently working in COBOL in one form or another.
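Two of those figures can be cross-checked against each other: 200 billion COBOL lines out of more than 310 billion total lines is roughly 64.5 percent, which the article rounds to the 65 percent it quotes. A minimal Python sanity check, using only the numbers given above:

    # Cross-check the claimed COBOL share of all software in use.
    # Both figures are the ones quoted from the eWeek article.
    total_lines = 310e9   # more than 310 billion lines of software in use
    cobol_lines = 200e9   # more than 200 billion of those lines are COBOL
    share = cobol_lines / total_lines
    print(f"COBOL share of all code in use: {share:.1%}")  # 64.5%, quoted as 65%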


What Google’s book settlement means


From Robert Darnton’s “Google & the Future of Books” (The New York Review of Books: 12 February 2009):

As the Enlightenment faded in the early nineteenth century, professionalization set in. You can follow the process by comparing the Encyclopédie of Diderot, which organized knowledge into an organic whole dominated by the faculty of reason, with its successor from the end of the eighteenth century, the Encyclopédie méthodique, which divided knowledge into fields that we can recognize today: chemistry, physics, history, mathematics, and the rest. In the nineteenth century, those fields turned into professions, certified by Ph.D.s and guarded by professional associations. They metamorphosed into departments of universities, and by the twentieth century they had left their mark on campuses…

Along the way, professional journals sprouted throughout the fields, subfields, and sub-subfields. The learned societies produced them, and the libraries bought them. This system worked well for about a hundred years. Then commercial publishers discovered that they could make a fortune by selling subscriptions to the journals. Once a university library subscribed, the students and professors came to expect an uninterrupted flow of issues. The price could be ratcheted up without causing cancellations, because the libraries paid for the subscriptions and the professors did not. Best of all, the professors provided free or nearly free labor. They wrote the articles, refereed submissions, and served on editorial boards, partly to spread knowledge in the Enlightenment fashion, but mainly to advance their own careers.

The result stands out on the acquisitions budget of every research library: the Journal of Comparative Neurology now costs $25,910 for a year’s subscription; Tetrahedron costs $17,969 (or $39,739, if bundled with related publications as a Tetrahedron package); the average price of a chemistry journal is $3,490; and the ripple effects have damaged intellectual life throughout the world of learning. Owing to the skyrocketing cost of serials, libraries that used to spend 50 percent of their acquisitions budget on monographs now spend 25 percent or less. University presses, which depend on sales to libraries, cannot cover their costs by publishing monographs. And young scholars who depend on publishing to advance their careers are now in danger of perishing.

The eighteenth-century Republic of Letters had been transformed into a professional Republic of Learning, and it is now open to amateurs—amateurs in the best sense of the word, lovers of learning among the general citizenry. Openness is operating everywhere, thanks to “open access” repositories of digitized articles available free of charge, the Open Content Alliance, the Open Knowledge Commons, OpenCourseWare, the Internet Archive, and openly amateur enterprises like Wikipedia. The democratization of knowledge now seems to be at our fingertips. We can make the Enlightenment ideal come to life in reality.

What provoked these jeremianic-utopian reflections? Google. Four years ago, Google began digitizing books from research libraries, providing full-text searching and making books in the public domain available on the Internet at no cost to the viewer. For example, it is now possible for anyone, anywhere to view and download a digital copy of the 1871 first edition of Middlemarch that is in the collection of the Bodleian Library at Oxford. Everyone profited, including Google, which collected revenue from some discreet advertising attached to the service, Google Book Search. Google also digitized an ever-increasing number of library books that were protected by copyright in order to provide search services that displayed small snippets of the text. In September and October 2005, a group of authors and publishers brought a class action suit against Google, alleging violation of copyright. Last October 28, after lengthy negotiations, the opposing parties announced agreement on a settlement, which is subject to approval by the US District Court for the Southern District of New York.[2]

The settlement creates an enterprise known as the Book Rights Registry to represent the interests of the copyright holders. Google will sell access to a gigantic data bank composed primarily of copyrighted, out-of-print books digitized from the research libraries. Colleges, universities, and other organizations will be able to subscribe by paying for an “institutional license” providing access to the data bank. A “public access license” will make this material available to public libraries, where Google will provide free viewing of the digitized books on one computer terminal. And individuals also will be able to access and print out digitized versions of the books by purchasing a “consumer license” from Google, which will cooperate with the registry for the distribution of all the revenue to copyright holders. Google will retain 37 percent, and the registry will distribute 63 percent among the rightsholders.
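To make that division of revenue concrete, here is a small Python sketch; the 37/63 split is from the settlement as described above, while the $10 purchase price is purely an assumed figure for illustration:

    # Revenue split on a single consumer-license sale.
    # Only the 37/63 split is from the settlement; the price is an assumption.
    GOOGLE_SHARE = 0.37
    REGISTRY_SHARE = 0.63
    price = 10.00  # hypothetical purchase price in dollars
    google_cut = price * GOOGLE_SHARE            # $3.70 retained by Google
    rightsholder_pool = price * REGISTRY_SHARE   # $6.30 distributed by the registry
    print(f"Google keeps ${google_cut:.2f}; registry distributes ${rightsholder_pool:.2f}")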

Meanwhile, Google will continue to make books in the public domain available for users to read, download, and print, free of charge. Of the seven million books that Google reportedly had digitized by November 2008, one million are works in the public domain; one million are in copyright and in print; and five million are in copyright but out of print. It is this last category that will furnish the bulk of the books to be made available through the institutional license.

Many of the in-copyright and in-print books will not be available in the data bank unless the copyright owners opt to include them. They will continue to be sold in the normal fashion as printed books and also could be marketed to individual customers as digitized copies, accessible through the consumer license for downloading and reading, perhaps eventually on e-book readers such as Amazon’s Kindle.

After reading the settlement and letting its terms sink in—no easy task, as it runs to 134 pages and 15 appendices of legalese—one is likely to be dumbfounded: here is a proposal that could result in the world’s largest library. It would, to be sure, be a digital library, but it could dwarf the Library of Congress and all the national libraries of Europe. Moreover, in pursuing the terms of the settlement with the authors and publishers, Google could also become the world’s largest book business—not a chain of stores but an electronic supply service that could out-Amazon Amazon.

An enterprise on such a scale is bound to elicit reactions of the two kinds that I have been discussing: on the one hand, utopian enthusiasm; on the other, jeremiads about the danger of concentrating power to control access to information.

Google is not a guild, and it did not set out to create a monopoly. On the contrary, it has pursued a laudable goal: promoting access to information. But the class action character of the settlement makes Google invulnerable to competition. Most book authors and publishers who own US copyrights are automatically covered by the settlement. They can opt out of it; but whatever they do, no new digitizing enterprise can get off the ground without winning their assent one by one, a practical impossibility, or without becoming mired down in another class action suit. If approved by the court—a process that could take as much as two years—the settlement will give Google control over the digitizing of virtually all books covered by copyright in the United States.

Google alone has the wealth to digitize on a massive scale. And having settled with the authors and publishers, it can exploit its financial power from within a protective legal barrier; for the class action suit covers the entire class of authors and publishers. No new entrepreneurs will be able to digitize books within that fenced-off territory, even if they could afford it, because they would have to fight the copyright battles all over again. If the settlement is upheld by the court, only Google will be protected from copyright liability.

Google’s record suggests that it will not abuse its double-barreled fiscal-legal power. But what will happen if its current leaders sell the company or retire? The public will discover the answer from the prices that the future Google charges, especially the price of the institutional subscription licenses. The settlement leaves Google free to negotiate deals with each of its clients, although it announces two guiding principles: “(1) the realization of revenue at market rates for each Book and license on behalf of the Rightsholders and (2) the realization of broad access to the Books by the public, including institutions of higher education.”

What will happen if Google favors profitability over access? Nothing, if I read the terms of the settlement correctly. Only the registry, acting for the copyright holders, has the power to force a change in the subscription prices charged by Google, and there is no reason to expect the registry to object if the prices are too high. Google may choose to be generous in its pricing, and I have reason to hope it may do so; but it could also employ a strategy comparable to the one that proved to be so effective in pushing up the price of scholarly journals: first, entice subscribers with low initial rates, and then, once they are hooked, ratchet up the rates as high as the traffic will bear.


The future of news as shown by the 2008 election

From Steven Berlin Johnson’s “Old Growth Media And The Future Of News” (StevenBerlinJohnson.com: 14 March 2009):

The first Presidential election that I followed in an obsessive way was the 1992 election that Clinton won. I was as compulsive a news junkie about that campaign as I was about the Mac in college: every day the Times would have a handful of stories about the campaign stops or debates or latest polls. Every night I would dutifully tune into Crossfire to hear what the punditocracy had to say about the day’s events. I read Newsweek and Time and the New Republic, and scoured the New Yorker for its occasional political pieces. When the debates aired, I’d watch religiously and stay up late soaking in the commentary from the assembled experts.

That was hardly a desert, to be sure. But compare it to the information channels that were available to me following the 2008 election. Everything I relied on in 1992 was still around of course – except for the late, lamented Crossfire – but it was now part of a vast new forest of news, data, opinion, satire – and perhaps most importantly, direct experience. Sites like Talking Points Memo and Politico did extensive direct reporting. Daily Kos provided in-depth surveys and field reports on state races that the Times would never have had the ink to cover. Individual bloggers like Andrew Sullivan responded to each twist in the news cycle; HuffPo culled the most provocative opinion pieces from the rest of the blogosphere. Nate Silver at fivethirtyeight.com did meta-analysis of polling that blew away anything William Schneider dreamed of doing on CNN in 1992. When the economy imploded in September, I followed economist bloggers like Brad DeLong to get their expert take on the candidates’ responses to the crisis. (Yochai Benkler talks about this phenomenon of academics engaging with the news cycle in a smart response here.) I watched the debates with a thousand virtual friends live-Twittering alongside me on the couch. All this was filtered and remixed through the extraordinary political satire of Jon Stewart and Stephen Colbert, which I watched via viral clips on the Web as much as I watched on TV.

What’s more: the ecosystem of political news also included information coming directly from the candidates. Think about the Philadelphia race speech, arguably one of the two or three most important events in the whole campaign. Eight million people watched it on YouTube alone. Now, what would have happened to that speech had it been delivered in 1992? Would any of the networks have aired it in its entirety? Certainly not. It would have been reduced to a minute-long soundbite on the evening news. CNN probably would have aired it live, which might have meant that 500,000 people caught it. Fox News and MSNBC? They didn’t exist yet. A few serious newspapers might have reprinted it in its entirety, which might have added another million to the audience. Online perhaps someone would have uploaded a transcript to Compuserve or The Well, but that’s about the most we could have hoped for.

There is no question in my mind that the political news ecosystem of 2008 was far superior to that of 1992: I had more information about the state of the race, the tactics of both campaigns, the issues they were wrestling with, the mind of the electorate in different regions of the country. And I had more immediate access to the candidates themselves: their speeches and unscripted exchanges; their body language and position papers.

The old line on this new diversity was that it was fundamentally parasitic: bloggers were interesting, sure, but if the traditional news organizations went away, the bloggers would have nothing to write about, since most of what they did was link to professionally reported stories. Let me be clear: traditional news organizations were an important part of the 2008 ecosystem, no doubt about it. … But no reasonable observer of the political news ecosystem could describe all the new species as parasites on the traditional media. Imagine how many barrels of ink were purchased to print newspaper commentary on Obama’s San Francisco gaffe about people “clinging to their guns and religion.” But the original reporting on that quote didn’t come from the Times or the Journal; it came from a “citizen reporter” named Mayhill Fowler, part of the Off The Bus project sponsored by Jay Rosen’s Newassignment.net and The Huffington Post.


David Foster Wallace on postmodernism & waiting for the parents to come home

From Larry McCaffery’s “Conversation with David Foster Wallace” (Dalkey Archive Press at the University of Illinois: Summer 1993):

For me, the last few years of the postmodern era have seemed a bit like the way you feel when you’re in high school and your parents go on a trip, and you throw a party. You get all your friends over and throw this wild disgusting fabulous party. For a while it’s great, free and freeing, parental authority gone and overthrown, a cat’s-away-let’s-play Dionysian revel. But then time passes and the party gets louder and louder, and you run out of drugs, and nobody’s got any money for more drugs, and things get broken and spilled, and there’s a cigarette burn on the couch, and you’re the host and it’s your house too, and you gradually start wishing your parents would come back and restore some fucking order in your house. It’s not a perfect analogy, but the sense I get of my generation of writers and intellectuals or whatever is that it’s 3:00 A.M. and the couch has several burn-holes and somebody’s thrown up in the umbrella stand and we’re wishing the revel would end. The postmodern founders’ patricidal work was great, but patricide produces orphans, and no amount of revelry can make up for the fact that writers my age have been literary orphans throughout our formative years. We’re kind of wishing some parents would come back. And of course we’re uneasy about the fact that we wish they’d come back—I mean, what’s wrong with us? Are we total pussies? Is there something about authority and limits we actually need? And then the uneasiest feeling of all, as we start gradually to realize that parents in fact aren’t ever coming back—which means we’re going to have to be the parents.


Mine fires that burn for 400 years


From Joshua Foer’s “Giant Burning Holes of the World” (Boing Boing: 16 June 2009):

… these sorts of mine fires can stay lit for a very long time. One burned in the city of Zwickau, Germany from 1476 to 1860. Another coal fire in Germany, at a place called Brennender Berg (Burning Mountain), has been smoking continually since 1688!


7 tools of propaganda

From Roger Ebert’s “The O’Reilly Procedure” (Roger Ebert’s Journal: 14 June 2009):

The seven propaganda devices include:

  • Name calling — giving something a bad label to make the audience reject it without examining the evidence;
  • Glittering generalities — the opposite of name calling;
  • Card stacking — the selective use of facts and half-truths;
  • Bandwagon — appeals to the desire, common to most of us, to follow the crowd;
  • Plain folks — an attempt to convince an audience that they, and their ideas, are “of the people”;
  • Transfer — carries over the authority, sanction and prestige of something we respect or dispute to something the speaker would want us to accept; and
  • Testimonials — involving a respected (or disrespected) person endorsing or rejecting an idea or person.


David Foster Wallace on the familiar & the strange

From Larry McCaffery’s “Conversation with David Foster Wallace” (Dalkey Archive Press at the University of Illinois: Summer 1993):

If you mean a post-industrial, mediated world, it’s inverted one of fiction’s big historical functions, that of providing data on distant cultures and persons. The first real generalization of human experience that novels tried to accomplish. If you lived in Bumfuck, Iowa, a hundred years ago and had no idea what life was like in India, good old Kipling goes over and presents it to you. … Well, but fiction’s presenting function for today’s reader has been reversed: since the whole global village is now presented as familiar, electronically immediate—satellites, microwaves, intrepid PBS anthropologists, Paul Simon’s Zulu back-ups—it’s almost like we need fiction writers to restore strange things’ ineluctable “strangeness,” to defamiliarize stuff, I guess you’d say.

… For our generation, the entire world seems to present itself as “familiar,” but since that’s of course an illusion in terms of anything really important about people, maybe any “realistic” fiction’s job is opposite what it used to be—no longer making the strange familiar but making the familiar strange again. It seems important to find ways of reminding ourselves that most “familiarity” is mediated and delusive.


Steve Jobs on mediocrity & market share

From Steven Levy’s “OK, Mac, Make a Wish: Apple’s ‘computer for the rest of us’ is, insanely, 20” (Newsweek: 2 February 2004):

If that’s so, then why is the Mac market share, even after Apple’s recent revival, sputtering at a measly 5 percent? Jobs has a theory about that, too. Once a company devises a great product, he says, it has a monopoly in that realm, and concentrates less on innovation than protecting its turf. “The Mac user interface was a 10-year monopoly,” says Jobs. “Who ended up running the company? Sales guys. At the critical juncture in the late ’80s, when they should have gone for market share, they went for profits. They made obscene profits for several years. And their products became mediocre. And then their monopoly ended with Windows 95. They behaved like a monopoly, and it came back to bite them, which always happens.”


Fossils are the lucky ones

From Errol Morris’s “Whose Father Was He? (Part Five)” (The New York Times: 2 April 2009):

I had an opportunity to visit the fossil collections at the Museum of the Rockies in Bozeman, Montana. It was part of a dinosaur fossil-hunting trip with Jack Horner, the premier hunter of T-Rex skeletons. Downstairs in the lab, there was a Triceratops skull sitting on a table. I picked it up and inserted my finger into the brain cavity. (I had read all these stories about how small the Triceratops brain had to have been and I wanted to see for myself.) I said to Jack Horner, “To think that someday somebody will do that with my skull.” And he said, “You should be so lucky. It’s only the privileged few of us who get to be fossils.”


David Foster Wallace on leadership

From David Foster Wallace’s “The Weasel, Twelve Monkeys And The Shrub: Seven Days In The Life Of The Late, Great John McCain” (Rolling Stone: 13 April 2000):

The weird thing is that the word “leader” itself is cliché and boring, but when you come across somebody who actually is a real leader, that person isn’t cliché or boring at all; in fact he’s sort of the opposite of cliché and boring.

Obviously, a real leader isn’t just somebody who has ideas you agree with, nor is it just somebody you happen to think is a good guy. A real leader is somebody who, because of his own particular power and charisma and example, is able to inspire people, with “inspire” being used here in a serious and non-cliché way. A real leader can somehow get us to do certain things that deep down we think are good and want to be able to do but usually can’t get ourselves to do on our own. It’s a mysterious quality, hard to define, but we always know it when we see it, even as kids. You can probably remember seeing it in certain really great coaches, or teachers, or some extremely cool older kid you “looked up to” (interesting phrase) and wanted to be just like. Some of us remember seeing the quality as kids in a minister or rabbi, or a Scoutmaster, or a parent, or a friend’s parent, or a supervisor in a summer job. And yes, all these are “authority figures,” but it’s a special kind of authority. If you’ve ever spent time in the military, you know how incredibly easy it is to tell which of your superiors are real leaders and which aren’t, and how little rank has to do with it. A leader’s real “authority” is a power you voluntarily give him, and you grant him this authority not with resentment or resignation but happily; it feels right. Deep down, you almost always like how a real leader makes you feel, the way you find yourself working harder and pushing yourself and thinking in ways you couldn’t ever get to on your own.

Lincoln was, by all available evidence, a real leader, and Churchill, and Gandhi, and King. Teddy and Franklin Roosevelt, and de Gaulle, and certainly Marshall and maybe Eisenhower. (Of course Hitler was a real leader too, a very powerful one, so you have to watch out; all it is is a weird kind of power.)

Now you have to pay close attention to something that’s going to seem real obvious. There is a difference between a great leader and a great salesman. Because a salesman’s ultimate, overriding motivation is his own self-interest. If you buy what he’s selling, the salesman profits. So even though the salesman may have a very powerful, charismatic, admirable personality, and might even persuade you that buying really is in your interest (and it really might be) — still, a little part of you always knows that what the salesman’s ultimately after is something for himself. And this awareness is painful … although admittedly it’s a tiny pain, more like a twinge, and often unconscious. But if you’re subjected to enough great salesmen and salespitches and marketing concepts for long enough — like from your earliest Saturday-morning cartoons, let’s say — it is only a matter of time before you start believing deep down that everything is sales and marketing, and that whenever somebody seems like they care about you or about some noble idea or cause, that person is a salesman and really ultimately doesn’t give a shit about you or some cause but really just wants something for himself.

Yes, this is simplistic. All politicians sell, always have. FDR and JFK and MLK and Gandhi were great salesmen. But that’s not all they were. People could smell it. That weird little extra something. It had to do with “character” (which, yes, is also a cliché — suck it up).


Why did Thomas Jefferson bring a stuffed moose to France?

From David G. Post’s “Jefferson’s Moose” (Remarks presented at the Stanford Law School Conference on Privacy in Cyberspace: 7 February 2000):

In 1787, Jefferson, then the American Minister to France, had the “complete skeleton, skin & horns of the Moose” shipped to him in Paris and mounted in the lobby of his hotel. One can only imagine the comments made by bemused onlookers and hotel staff.

This was no small undertaking at that time — I suppose it would be no small undertaking even today. It’s not as if he had no other things to do with his time or his money. It’s worth asking: Why did he do it? What could have possessed him?

He wanted, first, to shock. He wanted his French friends to stand back, to gasp, and to say: There really is a new world out there, one that has things in it that we can hardly imagine. He wanted them to have what Lessig called an “aha! moment” in regard to the New World from out of which Jefferson (and his moose) had emerged.

But there was another, more specific, purpose. He wanted to show them that this new world was not a degenerate place. The Comte de Buffon, probably the most celebrated naturalist of the late 18th Century, had propounded just such a theory about the degeneracy of life in the New World. Jefferson described Buffon’s theory this way:

“That the animals common both to the old and new world, are smaller in the latter; that those peculiar to the new, are on a smaller scale; that those which have been domesticated in both, have degenerated in America; and that on the whole the New World exhibits fewer species.”

Though it may be hard to appreciate from our more enlightened 21st century perspective, this was deadly serious stuff — both as science and, more to our point here, as politics; to Jefferson, Buffon’s theory had ominous political implications, for it was, as he put it, “within one step” of the notion that man, too, would degenerate in the New World. Thus, it could and did give a kind of intellectual cover to the notion that man in the New World could not be trusted to govern himself.

Sometimes a picture — or, better yet, a carcass — is worth a thousand words. So out comes the moose; larger than its European counterparts (the reindeer and caribou), its brooding presence in downtown Paris would surely make observers think twice about Buffon’s theory. Jefferson was no fool; he knew full well that one data point does not settle the argument, and he would provide, in his “Notes on the State of Virginia,” a detailed refutation of Buffon’s charge, page after page of careful analysis of the relative sizes of American and European animals.


Totalitarian regimes adopt the trappings of religion for themselves

From Steven Weinberg’s “Without God” (The New York Review of Books: 25 September 2008):

It has often been noted that the greatest horrors of the twentieth century were perpetrated by regimes – Hitler’s Germany, Stalin’s Russia, Mao’s China – that while rejecting some or all of the teachings of religion, copied characteristics of religion at its worst: infallible leaders, sacred writings, mass rituals, the execution of apostates, and a sense of community that justified exterminating those outside the community.


4 sources of tension between science and religion

From Steven Weinberg’s “Without God” (The New York Review of Books: 25 September 2008):

But if the direct conflict between scientific knowledge and specific religious beliefs has not been so important in itself, there are at least four sources of tension between science and religion that have been important.

The first source of tension arises from the fact that religion originally gained much of its strength from the observation of mysterious phenomena – thunder, earthquakes, disease – that seemed to require the intervention of some divine being. There was a nymph in every brook, and a dryad in every tree. But as time passed more and more of these mysteries have been explained in purely natural ways. Explaining this or that about the natural world does not of course rule out religious belief. But if people believe in God because no other explanation seems possible for a whole host of mysteries, and then over the years these mysteries were one by one resolved naturalistically, then a certain weakening of belief can be expected.

Of course, not everything has been explained, nor will it ever be. The important thing is that we have not observed anything that seems to require supernatural intervention for its explanation. There are some today who cling to the remaining gaps in our understanding (such as our ignorance about the origin of life) as evidence for God. But as time passes and more and more of these gaps are filled in, their position gives an impression of people desperately holding on to outmoded opinions.

The problem for religious belief is not just that science has explained a lot of odds and ends about the world. There is a second source of tension: that these explanations have cast increasing doubt on the special role of man, as an actor created by God to play a starring part in a great cosmic drama of sin and salvation. We have had to accept that our home, the earth, is just another planet circling the sun; our sun is just one of a hundred billion stars in a galaxy that is just one of billions of visible galaxies; and it may be that the whole expanding cloud of galaxies is just a small part of a much larger multiverse, most of whose parts are utterly inhospitable to life. As Richard Feynman has said, “The theory that it’s all arranged as a stage for God to watch man’s struggle for good and evil seems inadequate.”

A third source of tension between science and religious belief has been more important in Islam than in Christianity. Around 1100, the Sufi philosopher Abu Hamid al-Ghazzali argued against the very idea of laws of nature, on the grounds that any such law would put God’s hands in chains. According to al-Ghazzali, a piece of cotton placed in a flame does not darken and smolder because of the heat of the flame, but because God wants it to darken and smolder. Laws of nature could have been reconciled with Islam, as a summary of what God usually wants to happen, but al-Ghazzali did not take that path.

Al-Ghazzali is often described as the most influential Islamic philosopher. I wish I knew enough to judge how great was the impact on Islam of his rejection of science. At any rate, science in Muslim countries, which had led the world in the ninth and tenth centuries, went into a decline in the century or two after al-Ghazzali. As a portent of this decline, in 1194 the Ulama of Córdoba burned all scientific and medical texts.

Nor has science revived in the Islamic world. … in 2002 the periodical Nature carried out a survey of science in Islamic countries, and found just three areas in which the Islamic world produced excellent science, all three directed toward applications rather than basic science. They were desalination, falconry, and camel breeding.

Something like al-Ghazzali’s concern for God’s freedom surfaced for a while in Christian Europe, but with very different results. In Paris and Canterbury in the thirteenth century there was a wave of condemnations of those teachings of Aristotle that seemed to limit the freedom of God to do things like create a vacuum or make several worlds or move the heavens in straight lines. The influence of Thomas Aquinas and Albertus Magnus saved the philosophy of Aristotle for Europe, and with it the idea of laws of nature. But although Aristotle was no longer condemned, his authority had been questioned – which was fortunate, since nothing could be built on his physics. Perhaps it was the weakening of Aristotle’s authority by reactionary churchmen that opened the door to the first small steps toward finding the true laws of nature at Paris and Lisieux and Oxford in the fourteenth century.

There is a fourth source of tension between science and religion that may be the most important of all. Traditional religions generally rely on authority, whether the authority is an infallible leader, such as a prophet or a pope or an imam, or a body of sacred writings, a Bible or a Koran. …

Of course, scientists rely on authorities, but of a very different sort. If I want to understand some fine point about the general theory of relativity, I might look up a recent paper by an expert in the field. But I would know that the expert might be wrong. One thing I probably would not do is to look up the original papers of Einstein, because today any good graduate student understands general relativity better than Einstein did. We progress. Indeed, in the form in which Einstein described his theory it is today generally regarded as only what is known in the trade as an effective field theory; that is, it is an approximation, valid for the large scales of distance for which it has been tested, but not under very cramped conditions, as in the early big bang.

We have our heroes in science, like Einstein, who was certainly the greatest physicist of the past century, but for us they are not infallible prophets.


Intelligent Design? How about a flat earth?

From Steven Weinberg’s “Without God” (The New York Review of Books: 25 September 2008):

Contradictions between scripture and scientific knowledge have occurred again and again, and have generally been accommodated by the more enlightened among the religious. For instance, there are verses in both the Old and New Testament that seem to show that the earth is flat, and as noted by Copernicus (quoted by Galileo in the same letter to Christina) these verses led some early Church fathers like Lactantius to reject the Greek understanding that the earth is a sphere, but educated Christians long before the voyages of Columbus and Magellan had come to accept the spherical shape of the earth. Dante found the interior of the spherical earth a convenient place to store sinners.

What was briefly a serious issue in the early Church has today become a parody. The astrophysicist Adrian Melott of the University of Kansas, in a fight with zealots who wanted equal time for creationism in the Kansas public schools, founded an organization called FLAT (Families for Learning Accurate Theories). His society parodied creationists by demanding equal time for flat earth geography, arguing that children should be exposed to both sides of the controversy over the shape of the earth.


MySpace/Facebook history & sociology

From danah boyd’s “Social Media is Here to Stay… Now What?” at the Microsoft Research Tech Fest, Redmond, Washington (danah: 26 February 2009):

Facebook had launched as a Harvard-only site before expanding to other elite institutions before expanding to other 4-year-colleges before expanding to 2-year colleges. It captured the mindshare of college students everywhere. It wasn’t until 2005 that they opened the doors to some companies and high schools. And only in 2006 did they open to all.

Facebook was narrated as the “safe” alternative and, in the 2006-2007 school year, a split amongst American teens occurred. Those college-bound kids from wealthier or upwardly mobile backgrounds flocked to Facebook while teens from urban or less economically privileged backgrounds rejected the transition and opted to stay with MySpace while simultaneously rejecting the fears brought on by American media. Many kids were caught in the middle and opted to use both, but the division that occurred resembles the same “jocks and burnouts” narrative that shaped American schools in the 1980s.


A history of the negative associations of yellow

From Allen Abel And Madeleine Czigler’s “Submarines, bananas and taxis” (National Post: 24 June 2008):

Depicted in frescoes and canvases from the early Middle Ages onward in the robes of the betrayer of the Christ, “Judas yellow” devolved into an imprint of depravity, treason and exclusion.

By the 12th century, European Jews were compelled to wear yellow hats, prostitutes were bound by yellow sashes and yellow flags flew above the pus-stained hovels of the Black Death. From this would descend our own yellow of cowardice and insanity, and the yellow badges of the star-crossed Juden of the Third Reich.
