David Foster Wallace on the problems with postmodern irony

From Larry McCaffery’s “Conversation with David Foster Wallace” (Dalkey Archive Press at the University of Illinois: Summer 1993):

Irony and cynicism were just what the U.S. hypocrisy of the fifties and sixties called for. That’s what made the early postmodernists great artists. The great thing about irony is that it splits things apart, gets up above them so we can see the flaws and hypocrisies and duplicities. The virtuous always triumph? Ward Cleaver is the prototypical fifties father? “Sure.” Sarcasm, parody, absurdism and irony are great ways to strip off stuff’s mask and show the unpleasant reality behind it. The problem is that once the rules of art are debunked, and once the unpleasant realities the irony diagnoses are revealed and diagnosed, “then” what do we do? Irony’s useful for debunking illusions, but most of the illusion-debunking in the U.S. has now been done and redone. Once everybody knows that equality of opportunity is bunk and Mike Brady’s bunk and Just Say No is bunk, now what do we do? All we seem to want to do is keep ridiculing the stuff. Postmodern irony and cynicism’s become an end in itself, a measure of hip sophistication and literary savvy. Few artists dare to try to talk about ways of working toward redeeming what’s wrong, because they’ll look sentimental and naive to all the weary ironists. Irony’s gone from liberating to enslaving. There’s some great essay somewhere that has a line about irony being the song of the prisoner who’s come to love his cage.

The problem is that, however misprised it’s been, what’s been passed down from the postmodern heyday is sarcasm, cynicism, a manic ennui, suspicion of all authority, suspicion of all constraints on conduct, and a terrible penchant for ironic diagnosis of unpleasantness instead of an ambition not just to diagnose and ridicule but to redeem. You’ve got to understand that this stuff has permeated the culture. It’s become our language; we’re so in it we don’t even see that it’s one perspective, one among many possible ways of seeing. Postmodern irony’s become our environment.

The Uncanny Valley, art forgery, & love

From Errol Morris’ “Bamboozling Ourselves (Part 2)” (The New York Times: 28 May 2009):

[Errol Morris:] The Uncanny Valley is a concept developed by the Japanese robot scientist Masahiro Mori. It concerns the design of humanoid robots. Mori’s theory is relatively simple. We tend to reject robots that look too much like people. Slight discrepancies and incongruities between what we look like and what they look like disturb us. The closer a robot resembles a human, the more critical we become, the more sensitive to slight discrepancies, variations, imperfections. However, if we go far enough away from the humanoid, then we much more readily accept the robot as being like us. This accounts for the success of so many movie robots — from R2-D2 to WALL-E. They act like humans but they don’t look like humans. There is a region of acceptability — the peaks around The Uncanny Valley, the zone of acceptability that includes completely human and sort of human but not too human. The existence of The Uncanny Valley also suggests that we are programmed by natural selection to scrutinize the behavior and appearance of others. Survival no doubt depends on such an innate ability.

EDWARD DOLNICK: [The art forger Van Meegeren] wants to avoid it. So his big challenge is he wants to paint a picture that other people are going to take as Vermeer, because Vermeer is a brand name, because Vermeer is going to bring him lots of money, if he can get away with it, but he can’t paint a Vermeer. He doesn’t have that skill. So how is he going to paint a picture that doesn’t look like a Vermeer, but that people are going to say, “Oh! It’s a Vermeer?” How’s he going to pull it off? It’s a tough challenge. Now here’s the point of The Uncanny Valley: as your imitation gets closer and closer to the real thing, people think, “Good, good, good!” — but then when it’s very close, when it’s within 1 percent or something, instead of focusing on the 99 percent that is done well, they focus on the 1 percent that you’re missing, and you’re in trouble. Big trouble.

Van Meegeren is trapped in the valley. If he tries for the close copy, an almost exact copy, he’s going to fall short. He’s going to look silly. So what he does instead is rely on the blanks in Vermeer’s career, because hardly anything is known about him; he’s like Shakespeare in that regard. He’ll take advantage of those blanks by inventing a whole new era in Vermeer’s career. No one knows what he was up to all this time. He’ll throw in some Vermeer touches, including a signature, so that people who look at it will be led to think, “Yes, this is a Vermeer.”

Van Meegeren was sometimes careful, other times astonishingly reckless. He could have passed certain tests. What was peculiar, and what was quite startling to me, is that it turned out that nobody ever did any scientific test on Van Meegeren, even the stuff that was available in his day, until after he confessed. And to this day, people hardly ever test pictures, even multi-million dollar ones. And I was so surprised by that that I kept asking, over and over again: why? Why would that be? Before you buy a house, you have someone go through it for termites and the rest. How could it be that when you’re going to lay out $10 million for a painting, you don’t test it beforehand? And the answer is that you don’t test it because, at the point of being about to buy it, you’re in love! You’ve found something. It’s going to be the high mark of your collection; it’s going to be the making of you as a collector. You finally found this great thing. It’s available, and you want it. You want it to be real. You don’t want to have someone let you down by telling you that the painting isn’t what you think it is. It’s like being newly in love. Everything is candlelight and wine. Nobody hires a private detective at that point. It’s only years down the road when things have gone wrong that you say, “What was I thinking? What’s going on here?” The collector and the forger are in cahoots. The forger wants the collector to snap it up, and the collector wants it to be real. You are on the same side. You think that it would be a game of chess or something, you against him. “Has he got the paint right?” “Has he got the canvas?” You’re going to make this checkmark and that checkmark to see if the painting measures up. But instead, both sides are rooting for this thing to be real. If it is real, then you’ve got a masterpiece. If it’s not real, then today is just like yesterday. You’re back where you started, still on the prowl.

David Foster Wallace on rock, the rise of mass media, & the generation gap

From Larry McCaffery’s “Conversation with David Foster Wallace” (Dalkey Archive Press at the University of Illinois: Summer 1993):

Rock music itself bores me, usually. The phenomenon of rock interests me, though, because its birth was part of the rise of popular media, which completely changed the ways the U.S. was unified and split. The mass media unified the country geographically for pretty much the first time. Rock helped change the fundamental splits in the U.S. from geographical splits to generational ones. Very few people I talk to understand what “generation gap”’s implications really were. Kids loved rock partly because their parents didn’t, and obversely. In a mass-mediated nation, it’s no longer North vs. South. It’s under-thirty vs. over thirty. I don’t think you can understand the sixties and Vietnam and love-ins and LSD and the whole era of patricidal rebellion that helped inspire early postmodern fiction’s whole “We’re-going-to-trash-your-Beaver Cleaver-plasticized-G.O.P.-image-of-life-in-America” attitude without understanding rock ‘n’ roll. Because rock was and is all about busting loose, exceeding limits, and limits are usually set by parents, ancestors, older authorities.

David Foster Wallace on minimalism & metafiction

From Larry McCaffery’s “Conversation with David Foster Wallace” (Dalkey Archive Press at the University of Illinois: Summer 1993):

Minimalism’s just the other side of metafictional recursion. The basic problem’s still the one of the mediating narrative consciousness. Both minimalism and metafiction try to resolve the problem in radical ways. Opposed, but both so extreme they end up empty. Recursive metafiction worships the narrative consciousness, makes “it” the subject of the text. Minimalism’s even worse, emptier, because it’s a fraud: it eschews not only self-reference but any narrative personality at all, tries to pretend there “is” no narrative consciousness in its text. This is so fucking American, man: either make something your God and cosmos and then worship it, or else kill it.

David Foster Wallace on the familiar & the strange

From Larry McCaffery’s “Conversation with David Foster Wallace” (Dalkey Archive Press at the University of Illinois: Summer 1993):

If you mean a post-industrial, mediated world, it’s inverted one of fiction’s big historical functions, that of providing data on distant cultures and persons. The first real generalization of human experience that novels tried to accomplish. If you lived in Bumfuck, Iowa, a hundred years ago and had no idea what life was like in India, good old Kipling goes over and presents it to you. … Well, but fiction’s presenting function for today’s reader has been reversed: since the whole global village is now presented as familiar, electronically immediate—satellites, microwaves, intrepid PBS anthropologists, Paul Simon’s Zulu back-ups—it’s almost like we need fiction writers to restore strange things’ ineluctable “strangeness,” to defamiliarize stuff, I guess you’d say.

… For our generation, the entire world seems to present itself as “familiar,” but since that’s of course an illusion in terms of anything really important about people, maybe any “realistic” fiction’s job is opposite what it used to be—no longer making the strange familiar but making the familiar strange again. It seems important to find ways of reminding ourselves that most “familiarity” is mediated and delusive.

David Foster Wallace on TV, loneliness, & death

From Larry McCaffery’s “Conversation with David Foster Wallace” (Dalkey Archive Press at the University of Illinois: Summer 1993):

One thing TV does is help us deny that we’re lonely. With televised images, we can have the facsimile of a relationship without the work of a real relationship. It’s an anesthesia of “form.” The interesting thing is why we’re so desperate for this anesthetic against loneliness. You don’t have to think very hard to realize that our dread of both relationships and loneliness, both of which are like sub-dreads of our dread of being trapped inside a self (a psychic self, not just a physical self), has to do with angst about death, the recognition that I’m going to die, and die very much alone, and the rest of the world is going to go merrily on without me.

David Foster Wallace on fiction’s purpose in dark times

From Larry McCaffery’s “Conversation with David Foster Wallace” (Dalkey Archive Press at the University of Illinois: Summer 1993):

Look man, we’d probably most of us agree that these are dark times, and stupid ones, but do we need fiction that does nothing but dramatize how dark and stupid everything is? In dark times, the definition of good art would seem to be art that locates and applies CPR to those elements of what’s human and magical that still live and glow despite the times’ darkness. Really good fiction could have as dark a worldview as it wished, but it’d find a way both to depict this world and to illuminate the possibilities for being alive and human in it.

Now THAT is fantabulous!

From Roy Kesey’s piece in “Remembering David Foster Wallace” (Edward Champion’s Reluctant Habits: 15 September 2008):

The first story of David’s I ever read was that one Brief Interview that he had in the Paris Review maybe ten or eleven years ago. For me it was paradigm-altering, quietly fantabulous, in exactly the way that having a clay pot broken over your head would be fantabulous if instead of dirt it turned out to be full of cocaine and Slim Jims.

Van Gogh on death

From Roger Ebert’s “Go gentle into that good night” (Roger Ebert’s Journal: 2 May 2009):

Van Gogh in Arles wrote this about death:

Looking at the stars always makes me dream, as simply as I dream over the black dots representing towns and villages on a map. Why, I ask myself, shouldn’t the shining dots of the sky be as accessible as the black dots on the map of France? Just as we take a train to get to Tarascon or Rouen, we take death to reach a star. We cannot get to a star while we are alive any more than we can take the train when we are dead. So to me it seems possible that cholera, tuberculosis and cancer are the celestial means of locomotion, just as steamboats, buses and railways are the terrestrial means. To die quietly of old age would be to go there on foot.

Roger Ebert on death

From Roger Ebert’s “Go gentle into that good night” (Roger Ebert’s Journal: 2 May 2009):

What I expect will most probably happen [when I die] is that my body will fail, my mind will cease to function, and that will be that. My genes will not live on, because I have had no children. Perhaps I have been infertile. If I discover that somewhere along the way I conceived a child, let that child step forward and he or she will behold a happy man. Through my wife, I have had stepchildren and grandchildren, and I love them unconditionally, which is the only kind of love worth bothering with.

I am comforted by Richard Dawkins’ theory of memes. Those are mental units: thoughts, ideas, gestures, notions, songs, beliefs, rhymes, ideals, teachings, sayings, phrases, clichés, that move from mind to mind as genes move from body to body. After a lifetime of writing, teaching, broadcasting and happily torturing people with my jokes, I will leave behind more memes than many. They will all eventually die as well, but so it goes.

I drank for many years in a tavern that had a photograph of Brendan Behan on the wall, and under it this quotation, which I memorized:

I respect kindness in human beings first of all, and kindness to animals. I don’t respect the law; I have a total irreverence for anything connected with society except that which makes the roads safer, the beer stronger, the food cheaper and the old men and old women warmer in the winter and happier in the summer.

For 57 words, that does a pretty good job of summing it up. “Kindness” covers all of my political beliefs. No need to spell them out. Kindness is why I vote liberal and not conservative–but let’s not go there, not today. I believe that if, at the end of it all, according to our abilities, we have done something to make others a little happier, and something to make ourselves a little happier, that is about the best we can do. To make others less happy is a crime. To make ourselves unhappy is where all crime starts. We must try to contribute joy to the world. That is true no matter what our problems, our health, our circumstances. We must try. I didn’t always know this, and am happy I lived long enough to find it out.

David Foster Wallace on moving mountains

From Bill Katovsky’s “David Foster Wallace: A Profile” (McSweeney’s Internet Tendency: November 2008):

“I spent a lot of time as a volunteer in a nursing home in Amherst last summer. I was reading Dante’s Divine Comedy to an old man, Mr. Shulman. One day, I asked him where he was from. He said, ‘Just east of here, the Rockies.’ I said, ‘Mr. Shulman, the Rockies are west of here.’ He did a voilà with his hands and then said, ‘I move mountains.’ That stuck with me. Fiction either moves mountains or it’s boring; it moves mountains or it sits on its ass.”

4 sources of tension between science and religion

From Steven Weinberg’s “Without God” (The New York Review of Books: 25 September 2008):

But if the direct conflict between scientific knowledge and specific religious beliefs has not been so important in itself, there are at least four sources of tension between science and religion that have been important.

The first source of tension arises from the fact that religion originally gained much of its strength from the observation of mysterious phenomena – thunder, earthquakes, disease – that seemed to require the intervention of some divine being. There was a nymph in every brook, and a dryad in every tree. But as time passed more and more of these mysteries have been explained in purely natural ways. Explaining this or that about the natural world does not of course rule out religious belief. But if people believe in God because no other explanation seems possible for a whole host of mysteries, and then over the years these mysteries were one by one resolved naturalistically, then a certain weakening of belief can be expected.

Of course, not everything has been explained, nor will it ever be. The important thing is that we have not observed anything that seems to require supernatural intervention for its explanation. There are some today who cling to the remaining gaps in our understanding (such as our ignorance about the origin of life) as evidence for God. But as time passes and more and more of these gaps are filled in, their position gives an impression of people desperately holding on to outmoded opinions.

The problem for religious belief is not just that science has explained a lot of odds and ends about the world. There is a second source of tension: that these explanations have cast increasing doubt on the special role of man, as an actor created by God to play a starring part in a great cosmic drama of sin and salvation. We have had to accept that our home, the earth, is just another planet circling the sun; our sun is just one of a hundred billion stars in a galaxy that is just one of billions of visible galaxies; and it may be that the whole expanding cloud of galaxies is just a small part of a much larger multiverse, most of whose parts are utterly inhospitable to life. As Richard Feynman has said, “The theory that it’s all arranged as a stage for God to watch man’s struggle for good and evil seems inadequate.”

A third source of tension between science and religious belief has been more important in Islam than in Christianity. Around 1100, the Sufi philosopher Abu Hamid al-Ghazzali argued against the very idea of laws of nature, on the grounds that any such law would put God’s hands in chains. According to al-Ghazzali, a piece of cotton placed in a flame does not darken and smolder because of the heat of the flame, but because God wants it to darken and smolder. Laws of nature could have been reconciled with Islam, as a summary of what God usually wants to happen, but al-Ghazzali did not take that path.

Al-Ghazzali is often described as the most influential Islamic philosopher. I wish I knew enough to judge how great was the impact on Islam of his rejection of science. At any rate, science in Muslim countries, which had led the world in the ninth and tenth centuries, went into a decline in the century or two after al-Ghazzali. As a portent of this decline, in 1194 the Ulama of Córdoba burned all scientific and medical texts.

Nor has science revived in the Islamic world. … in 2002 the periodical Nature carried out a survey of science in Islamic countries, and found just three areas in which the Islamic world produced excellent science, all three directed toward applications rather than basic science. They were desalination, falconry, and camel breeding.

Something like al-Ghazzali’s concern for God’s freedom surfaced for a while in Christian Europe, but with very different results. In Paris and Canterbury in the thirteenth century there was a wave of condemnations of those teachings of Aristotle that seemed to limit the freedom of God to do things like create a vacuum or make several worlds or move the heavens in straight lines. The influence of Thomas Aquinas and Albertus Magnus saved the philosophy of Aristotle for Europe, and with it the idea of laws of nature. But although Aristotle was no longer condemned, his authority had been questioned – which was fortunate, since nothing could be built on his physics. Perhaps it was the weakening of Aristotle’s authority by reactionary churchmen that opened the door to the first small steps toward finding the true laws of nature at Paris and Lisieux and Oxford in the fourteenth century.

There is a fourth source of tension between science and religion that may be the most important of all. Traditional religions generally rely on authority, whether the authority is an infallible leader, such as a prophet or a pope or an imam, or a body of sacred writings, a Bible or a Koran. …

Of course, scientists rely on authorities, but of a very different sort. If I want to understand some fine point about the general theory of relativity, I might look up a recent paper by an expert in the field. But I would know that the expert might be wrong. One thing I probably would not do is to look up the original papers of Einstein, because today any good graduate student understands general relativity better than Einstein did. We progress. Indeed, in the form in which Einstein described his theory it is today generally regarded as only what is known in the trade as an effective field theory; that is, it is an approximation, valid for the large scales of distance for which it has been tested, but not under very cramped conditions, as in the early big bang.

We have our heroes in science, like Einstein, who was certainly the greatest physicist of the past century, but for us they are not infallible prophets.

David Foster Wallace on David Lynch

From David Foster Wallace’s “David Lynch Keeps His Head” (Premiere: September 1996):

AN ACADEMIC DEFINITION of Lynchian might be that the term “refers to a particular kind of irony where the very macabre and the very mundane combine in such a way as to reveal the former’s perpetual containment within the latter.” But like postmodern or pornographic, Lynchian is one of those Potter Stewart-type words that’s ultimately definable only ostensively – i.e., we know it when we see it. Ted Bundy wasn’t particularly Lynchian, but good old Jeffrey Dahmer, with his victims’ various anatomies neatly separated and stored in his fridge alongside his chocolate milk and Shedd Spread, was thoroughgoingly Lynchian. A recent homicide in Boston, in which the deacon of a South Shore church reportedly gave chase to a vehicle that had cut him off, forced the car off the road, and shot the driver with a high-powered crossbow, was borderline Lynchian. A Rotary luncheon where everybody’s got a comb-over and a polyester sport coat and is eating bland Rotarian chicken and exchanging Republican platitudes with heartfelt sincerity and yet all are either amputees or neurologically damaged or both would be more Lynchian than not.

Why David Foster Wallace used footnotes

From D. T. Max’s “Notes and Errata*: A DFW Companion Guide to ‘The Unfinished’” (The Rumpus: 31 March 2009):

He explained that endnotes “allow . . . me to make the primary-text an easier read while at once 1) allowing a discursive, authorial intrusive style w/o Finneganizing the story, 2) mimic the information-flood and data-triage I expect’d be an even bigger part of US life 15 years hence. 3) have a lot more technical/medical verisimilitude 4) allow/make the reader go literally physically ‘back and forth’ in a way that perhaps cutely mimics some of the story’s thematic concerns . . . 5) feel emotionally like I’m satisfying your request for compression of text without sacrificing enormous amounts of stuff.”

He was known for endlessly fracturing narratives and for stem-winding sentences adorned with footnotes that were themselves stem-winders. Such techniques originally had been his way of reclaiming language from banality, while at the same time representing all the caveats, micro-thoughts, meta-moments, and other flickers of his hyperactive mind.

Why we can easily remember jingles but not jokes

From Natalie Angier’s “In One Ear and Out the Other” (The New York Times: 16 March 2009):

In understanding human memory and its tics, Scott A. Small, a neurologist and memory researcher at Columbia, suggests the familiar analogy with computer memory.

We have our version of a buffer, he said, a short-term working memory of limited scope and fast turnover rate. We have our equivalent of a save button: the hippocampus, deep in the forebrain, is essential for translating short-term memories into a more permanent form.

Our frontal lobes perform the find function, retrieving saved files to embellish as needed. And though scientists used to believe that short- and long-term memories were stored in different parts of the brain, they have discovered that what really distinguishes the lasting from the transient is how strongly the memory is engraved in the brain, and the thickness and complexity of the connections linking large populations of brain cells. The deeper the memory, the more readily and robustly an ensemble of like-minded neurons will fire.

This process, of memory formation by neuronal entrainment, helps explain why some of life’s offerings weasel in easily and then refuse to be spiked. Music, for example. “The brain has a strong propensity to organize information and perception in patterns, and music plays into that inclination,” said Michael Thaut, a professor of music and neuroscience at Colorado State University. “From an acoustical perspective, music is an overstructured language, which the brain invented and which the brain loves to hear.”

A simple melody with a simple rhythm and repetition can be a tremendous mnemonic device. “It would be a virtually impossible task for young children to memorize a sequence of 26 separate letters if you just gave it to them as a string of information,” Dr. Thaut said. But when the alphabet is set to the tune of the ABC song with its four melodic phrases, preschoolers can learn it with ease.

And what are the most insidious jingles or sitcom themes but cunning variations on twinkle twinkle ABC?

Really great jokes, on the other hand, punch the lights out of do re mi. They work not by conforming to pattern recognition routines but by subverting them. “Jokes work because they deal with the unexpected, starting in one direction and then veering off into another,” said Robert Provine, a professor of psychology at the University of Maryland, Baltimore County, and the author of “Laughter: A Scientific Investigation.” “What makes a joke successful are the same properties that can make it difficult to remember.”

This may also explain why the jokes we tend to remember are often the most clichéd ones. A mother-in-law joke? Yes…

A history of the negative associations of yellow

From Allen Abel And Madeleine Czigler’s “Submarines, bananas and taxis” (National Post: 24 June 2008):

Depicted in frescoes and canvases from the early Middle Ages onward in the robes of the betrayer of the Christ, “Judas yellow” devolved into an imprint of depravity, treason and exclusion.

By the 12th century, European Jews were compelled to wear yellow hats, prostitutes were bound by yellow sashes and yellow flags flew above the pus-stained hovels of the Black Death. From this would descend our own yellow of cowardice and insanity, and the yellow badges of the star-crossed Juden of the Third Reich.

The cochineal insect’s gift of red

From Allen Abel and Madeleine Czigler’s “Scandal, communism, blood” (National Post: 27 June 2008):

The blood-red allure of lipstick is a gift of a parasitic insect that infests cactus plants, principally in Mexico and Peru. It has been known since Aztec and Mayan times that, when boiled, the body of the cochineal insect dissolves into a deep crimson dye. France is the leading importer. Cochineal dye, which is neither Kosher nor Halal (since it is forbidden for Jews or Muslims to consume any insect) also is used in thousands of foods and beverages, ranging from sausages and gelatin desserts to some Cheddar cheese.

Wikipedia, freedom, & changes in production

From Clay Shirky’s “Old Revolutions, Good; New Revolutions, Bad” (Britannica Blog: 14 June 2007):

Gorman’s theory about print – that its capabilities ushered in an age very different from manuscript culture – is correct, and the same kind of shift is at work today. As with the transition from manuscripts to print, the new technologies offer virtues that did not previously exist, but are now an assumed and permanent part of our intellectual environment. When reproduction, distribution, and findability were all hard, as they were for the last five hundred years, we needed specialists to undertake those jobs, and we properly venerated them for the service they performed. Now those tasks are simpler, and the earlier roles have instead become obstacles to direct access.

Digital and networked production vastly increase three kinds of freedom: freedom of speech, of the press, and of assembly. This perforce increases the freedom of anyone to say anything at any time. This freedom has led to an explosion in novel content, much of it mediocre, but freedom is like that. Critically, this expansion of freedom has not undermined any of the absolute advantages of expertise; the virtues of mastery remain as they were. What has happened is that the relative advantages of expertise are in precipitous decline. Experts the world over have been shocked to discover that they were consulted not as a direct result of their expertise, but often as a secondary effect – the apparatus of credentialing made finding experts easier than finding amateurs, even when the amateurs knew the same things as the experts.

The success of Wikipedia forces a profound question on print culture: how is information to be shared with the majority of the population? This is an especially tough question, as print culture has so manifestly failed at the transition to a world of unlimited perfect copies. Because Wikipedia’s contents are both useful and available, it has eroded the monopoly held by earlier modes of production. Other encyclopedias now have to compete for value to the user, and they are failing because their model mainly commits them to denying access and forbidding sharing. If Gorman wants more people reading Britannica, the choice lies with its management. Were they to allow users unfettered access to read and share Britannica’s content tomorrow, the only interesting question is whether their readership would rise ten-fold or a hundred-fold.

Britannica will tell you that they don’t want to compete on universality of access or sharability, but this is the lament of the scribe who thinks that writing fast shouldn’t be part of the test. In a world where copies have become cost-free, people who expend their resources to prevent access or sharing are forgoing the principal advantages of the new tools, and this dilemma is common to every institution modeled on the scarcity and fragility of physical copies. Academic libraries, which in earlier days provided a service, have outsourced themselves as bouncers to publishers like Reed-Elsevier; their principal job, in the digital realm, is to prevent interested readers from gaining access to scholarly material.
