
Ray Bradbury on Edgar Rice Burroughs

From Sam Weller’s interview of Ray Bradbury in “The Art of Fiction No. 203” (The Paris Review: Spring 2010, No. 192):

But as it turns out—and I love to say it because it upsets everyone terribly—[Edgar Rice] Burroughs is probably the most influential writer in the entire history of the world.

INTERVIEWER Why do you think that?

BRADBURY By giving romance and adventure to a whole generation of boys, Burroughs caused them to go out and decide to become special. That’s what we have to do for everyone, give the gift of life with our books. Say to a girl or boy at age ten, Hey, life is fun! Grow tall! I’ve talked to more biochemists and more astronomers and technologists in various fields, who, when they were ten years old, fell in love with John Carter and Tarzan and decided to become something romantic. Burroughs put us on the moon. All the technologists read Burroughs. I was once at Caltech with a whole bunch of scientists and they all admitted it. Two leading astronomers—one from Cornell, the other from Caltech—came out and said, Yeah, that’s why we became astronomers. We wanted to see Mars more closely.


Kurt Vonnegut on the basic plots available to writers

From David Hayman, David Michaelis, George Plimpton, & Richard Rhodes’s interview of Kurt Vonnegut in “The Art of Fiction No. 64” (The Paris Review: Spring 1977, No. 69):

VONNEGUT The others aren’t that much fun to describe: somebody gets into trouble, and then gets out again; somebody loses something and gets it back; somebody is wronged and gets revenge; Cinderella; somebody hits the skids and just goes down, down, down; people fall in love with each other, and a lot of other people get in the way; a virtuous person is falsely accused of sin; a sinful person is believed to be virtuous; a person faces a challenge bravely, and succeeds or fails; a person lies, a person steals, a person kills, a person commits fornication.

INTERVIEWER If you will pardon my saying so, these are very old-fashioned plots.

VONNEGUT I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old-fashioned plots is smuggled in somewhere. I don’t praise plots as accurate representations of life, but as ways to keep readers reading. When I used to teach creative writing, I would tell the students to make their characters want something right away—even if it’s only a glass of water. Characters paralyzed by the meaninglessness of modern life still have to drink water from time to time.


John Steinbeck on Ernest Hemingway

From Nathaniel Benchley’s interview of John Steinbeck in “The Art of Fiction No. 45” (The Paris Review: Fall 1969, No. 48):

The first thing we heard of Ernest Hemingway’s death was a call from the London Daily Mail, asking me to comment on it. And quite privately, although something of this sort might be expected, I find it shocking. He had only one theme—only one. A man contends with the forces of the world, called fate, and meets them with courage.


Anthony Burgess on satire

From John Cullinan’s interview of Anthony Burgess in “The Art of Fiction No. 48” (The Paris Review: Spring 1973, No. 56):

Satire is a difficult medium, ephemeral unless there’s tremendous vitality in the form itself—like Absalom and Achitophel, Tale of a Tub, Animal Farm: I mean, the work has to subsist as story or poetry even when the objects of the satire are forgotten.


2 great examples of Tom Wolfe’s early New Journalism writing style

From Tom Wolfe’s “The Last American Hero Is Junior Johnson. Yes!” (Esquire: March 1965):

Ten o’clock Sunday morning in the hills of North Carolina. Cars, miles of cars, in every direction, millions of cars, pastel cars, aqua green, aqua blue, aqua beige, aqua buff, aqua dawn, aqua dusk, aqua aqua, aqua Malacca, Malacca lacquer, Cloud lavender, Assassin pink, Rake-a-cheek raspberry. Nude Strand coral, Honest Thrill orange, and Baby Fawn Lust cream-colored cars are all going to the stock-car races, and that old mothering North Carolina sun keeps exploding off the windshields. Mother dog!

Working mash wouldn’t wait for a man. It started coming to a head when it got ready to and a man had to be there to take it off, out there in the woods, in the brush, in the brambles, in the muck, in the snow. Wouldn’t it have been something if you could have just set it all up inside a good old shed with a corrugated metal roof and order those parts like you want them and not have to smuggle all that copper and all that sugar and all that everything out here in the woods and be a coppersmith and a plumber and a cooper and a carpenter and a pack horse and every other goddamned thing God ever saw in this world, all at once.


How the Madden NFL videogame was developed

From Patrick Hruby’s “The Franchise: The inside story of how Madden NFL became a video game dynasty” (ESPN: 22 July 2010):

1982

Harvard grad and former Apple employee Trip Hawkins founds video game maker Electronic Arts, in part to create a football game; one year later, the company releases “One-on-One: Dr. J vs. Larry Bird,” the first game to feature licensed sports celebrities. Art imitates life.

1983-84

Hawkins approaches former Oakland Raiders coach and NFL television analyst John Madden to endorse a football game. Madden agrees, but insists on realistic game play with 22 on-screen players, a daunting technical challenge.

1988-90

EA releases the first Madden football game for the Apple II home computer; a subsequent Sega Genesis home console port blends the Apple II game’s realism with control pad-heavy, arcade-style action, becoming a smash hit.


You can measure the impact of “Madden” through its sales: as many as 2 million copies in a single week, 85 million copies since the game’s inception and more than $3 billion in total revenue. You can chart the game’s ascent, shoulder to shoulder, alongside the $20 billion-a-year video game industry, which is either co-opting Hollywood (see “Tomb Raider” and “Prince of Persia”) or topping it (opening-week gross of “Call of Duty: Modern Warfare 2”: $550 million; “The Dark Knight”: $204 million).

Some of the pain was financial. Just as EA brought its first games to market in 1983, the home video game industry imploded. In a two-year span, Coleco abandoned the business, Intellivision went from 1,200 employees to five and Atari infamously dumped thousands of unsold game cartridges into a New Mexico landfill. Toy retailers bailed, concluding that video games were a Cabbage Patch-style fad. Even at EA — a hot home computer startup — continued solvency was hardly assured.

In 1988, “John Madden Football” was released for the Apple II computer and became a modest commercial success.

THE STAKES WERE HIGH for a pair of upstart game makers, with a career-making opportunity and a $100,000 development contract on the line. In early 1990, Troy Lyndon and Mike Knox of San Diego-based Park Place Productions met with Hawkins to discuss building a “Madden” game for Sega’s upcoming home video game console, the Genesis. …

Because the game that made “Madden” a phenomenon wasn’t the initial Apple II release, it was the Genesis follow-up, a surprise smash spawned by an entirely different mindset. Hawkins wanted “Madden” to play out like the NFL. Equivalent stats. Similar play charts. Real football.

In 1990, EA had a market cap of about $60 million; three years later, that number swelled to $2 billion.

In 2004, EA paid the NFL a reported $300 million-plus for five years of exclusive rights to teams and players. The deal was later extended to 2013. Just like that, competing games went kaput. The franchise stands alone, triumphant, increasingly encumbered by its outsize success.

Hawkins left EA in the early 1990s to spearhead 3DO, an ill-fated console maker that became a doomed software house. An icy rift between the company and its founder ensued.


A summary of Galbraith’s The Affluent Society

From a summary of John Kenneth Galbraith’s The Affluent Society (Abridge Me: 1 June 2010):

The Concept of the Conventional Wisdom

The paradigms on which society’s perception of reality are based are highly conservative. People invest heavily in these ideas, and so are heavily resistant to changing them. They are only finally overturned by new ideas when new events occur which make the conventional wisdom appear so absurd as to be impalpable. Then the conventional wisdom quietly dies with its most staunch proponents, to be replaced with a new conventional wisdom. …

Economic Security

… Economics professors argue that the threat of unemployment is necessary to maintain incentives to high productivity, and simultaneously that established professors require life tenure in order to do their best work. …

The Paramount Position of Production

… Another irrationality persists (more in America than elsewhere?): the prestigious usefulness of private-sector output, compared to the burdensome annoyance of public expenditure. Somehow public expenditure can never quite be viewed as a productive and enriching element of national output; it is forever something to be avoided, at best a necessary encumbrance. Cars are important, roads are not. An expansion in telephone services improves the general well-being, cuts in postal services are a necessary economy. Vacuum cleaners to ensure clean houses boost our standard of living, street cleaners are an unfortunate expense. Thus we end up with clean houses and filthy streets. …

[W]e have wants at the margin only so far as they are synthesised. We do not manufacture wants for goods we do not produce. …

The Dependence Effect

… Modern consumer demand, at the margin, does not originate from within the individual, but is a consequence of production. It has two origins:

  1. Emulation: the desire to keep abreast of, or ahead of one’s peer group — demand originating from this motivation is created indirectly by production. Every effort to increase production to satiate want brings with it a general raising of the level of consumption, which itself increases want.
  2. Advertising: the direct influence of advertising and salesmanship creates new wants which the consumer did not previously possess. Any student of business has by now come to view marketing as fundamental a business activity as production. Any want that can be significantly moulded by advertising cannot possibly have been strongly felt in the absence of that advertising — advertising is powerless to persuade a man that he is or is not hungry.

Inflation

… In 1942 a grateful and very anxious citizenry rewarded its soldiers, sailors, and airmen with a substantial increase in pay. In the teeming city of Honolulu, in prompt response to this advance in wage income, the prostitutes raised the prices of their services. This was at a time when, if anything, increased volume was causing a reduction in their average unit costs. However, in this instance the high military authorities, deeply angered by what they deemed improper, immoral, and indecent profiteering, ordered a return to the previous scale. …

The Theory of Social Balance

The final problem of the affluent society is the balance of goods it produces. Private goods (TVs, cars, cigarettes, drugs and alcohol) are overproduced; public goods (education, healthcare, police services, park provision, mass transport and refuse disposal) are underproduced. The consequences are extremely severe for the wellbeing of society. The balance between private and public consumption will be referred to as ‘the social balance’. The main reason for this imbalance is relatively straightforward. The forces we have identified which increase consumer demand as production rises (advertising and emulation) act almost entirely on the private sector. …

It is arguable that emulation acts on public services to an extent: a new school in one district may encourage neighbouring districts to ‘keep up’, but the effect is relatively minuscule.

Thus, private demand is artificially inflated and public demand is not, and the voter-consumer decides how to split his income between the two at the ballot box: inevitably public expenditure is grossly underrepresented. …


A great example of poor comments in your code

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 43:

[Peter Samson, one of the first MIT hackers], though, was particularly obscure in refusing to add comments to his source code explaining what he was doing at a given time. One well-distributed program Samson wrote went on for hundreds of assembly language instructions, with only one comment beside an instruction which contained the number 1750. The comment was RIPJSB, and people racked their brains about its meaning until someone figured out that 1750 was the year Bach died, and that Samson had written an abbreviation for Rest In Peace Johann Sebastian Bach.
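The moral carries beyond 1950s assembly language. As a hedged illustration (a Python sketch with invented names, not anything from Samson's actual program), compare the same constant explained by a private joke versus a descriptive comment:

```python
# Samson's style: the constant is "documented" only by an in-joke.
MAGIC = 1750  # RIPJSB

# A descriptive comment spares the next reader the detective work:
# 1750 is the year Johann Sebastian Bach died
# ("Rest In Peace, Johann Sebastian Bach").
BACH_DEATH_YEAR = 1750

def years_since_bach_died(current_year: int) -> int:
    """Illustrative use of the self-explanatory constant."""
    return current_year - BACH_DEATH_YEAR

print(years_since_bach_died(2010))  # 260
```

Both versions behave identically; the difference is whether the reader needs to rack their brains the way Samson's colleagues did.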


The Hacker Ethic

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 40-46:

Still, even in the days of the TX-0 [the late 1950s], the planks of the platform were in place. The Hacker Ethic:

  • Access To Computers — And Anything Which Might Teach You Something About The Way The World Works — Should Be Unlimited And Total. Always Yield To The Hands-On Imperative!
  • All Information Should Be Free.
  • Mistrust Authority — Promote Decentralization. The last thing you need is a bureaucracy. Bureaucracies, whether corporate, government, or university, are flawed systems, dangerous in that they cannot accommodate the exploratory impulse of true hackers. Bureaucrats hide behind arbitrary rules (as opposed to the logical algorithms by which machines and computer programs operate): they invoke those rules to consolidate power, and perceive the constructive impulse of hackers as a threat.
  • Hackers Should Be Judged By Their Hacking, Not Bogus Criteria Such As Degrees, Age, Race, Or Position. This meritocratic trait was not necessarily rooted in the inherent goodness of hacker hearts–it was mainly that hackers cared less about someone’s superficial characteristics than they did about his potential to advance the general state of hacking, to create new programs to admire, to talk about that new feature in the system.
  • You Can Create Art And Beauty On A Computer.
  • Computers Can Change Your Life For The Better.
  • Like Aladdin’s Lamp, You Could Get It To Do Your Bidding.


The origin of the word “munge”, “hack”, & others

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 23:

The core members hung out at [MIT’s Tech Model Railroad Club in the late 1950s] for hours; constantly improving The System, arguing about what could be done next, developing a jargon of their own that seemed incomprehensible to outsiders who might chance on these teen-aged fanatics … When a piece of equipment wasn’t working, it was “losing”; when a piece of equipment was ruined, it was “munged” (Mash Until No Good); the two desks in the corner of the room were not called the office, but the “orifice”; one who insisted on studying for courses was a “tool”; garbage was called “cruft”; and a project undertaken or a product built not solely to fulfill some constructive goal, but with some wild pleasure taken in mere involvement, was called a “hack.”

This latter term may have been suggested by ancient MIT lingo — the word “hack” had long been used to describe the elaborate college pranks that MIT students would regularly devise, such as covering the dome that overlooked the campus with reflecting foil. But as the TMRC people used the word, there was serious respect implied. While someone might call a clever connection between relays a “mere hack,” it would be understood that, to qualify as a hack, the feat must be imbued with innovation, style, and technical virtuosity.


Luther & Poe both complained about too many books

From Clay Shirky’s “Does The Internet Make You Smarter?” (The Wall Street Journal: 5 June 2010):

In the history of print … complaints about distraction have been rampant; no less a beneficiary of the printing press than Martin Luther complained, “The multitude of books is a great evil. There is no measure of limit to this fever for writing.” Edgar Allan Poe, writing during another surge in publishing, concluded, “The enormous multiplication of books in every branch of knowledge is one of the greatest evils of this age; since it presents one of the most serious obstacles to the acquisition of correct information.”


The dangers of loyalty based on personality, not policies

This quotation is directly about politics, but it’s about anyone – or even anything – we emotionally attach ourselves to.

From Glenn Greenwald’s “My friend the president” (Salon: 8 December 2009):

Those who venerated Bush because he was a morally upright and strong evangelical-warrior-family man and revere Palin as a common-sense Christian hockey mom are similar in kind to those whose reaction to Obama is dominated by their view of him as an inspiring, kind, sophisticated, soothing and mature intellectual. These are personality types bolstered with sophisticated marketing techniques, not policies, governing approaches or ideologies. But for those looking for some emotional attachment to a leader, rather than policies they believe are right, personality attachments are far more important. They’re also far more potent. Loyalty grounded in admiration for character will inspire support regardless of policy, and will produce and sustain the fantasy that this is not a mere politician, but a person of deep importance to one’s life who — like a loved one or close friend or religious leader — must be protected and defended at all costs.


The Irish Church lies in creative – and evil – ways

From Patsy McGarry’s “Church ‘lied without lying’” (Irish Times: 26 November 2009):

One of the most fascinating discoveries in the Dublin Archdiocese report was the concept of “mental reservation”, which allows clerics to mislead people without believing they are lying.

According to the Commission of Investigation report, “mental reservation is a concept developed and much discussed over the centuries, which permits a church man knowingly to convey a misleading impression to another person without being guilty of lying”.

It gives an example. “John calls to the parish priest to make a complaint about the behaviour of one of his curates. The parish priest sees him coming but does not want to see him because he considers John to be a troublemaker. He sends another of his curates to answer the door. John asks the curate if the parish priest is in. The curate replies that he is not.”

The commission added: “This is clearly untrue but in the Church’s view it is not a lie because, when the curate told John that the parish priest was not in, he mentally reserved the words ‘…to you’.”

Cardinal Desmond Connell had explained the concept to the commission as follows:

“Well, the general teaching about mental reservation is that you are not permitted to tell a lie. On the other hand, you may be put in a position where you have to answer, and there may be circumstances in which you can use an ambiguous expression realising that the person who you are talking to will accept an untrue version of whatever it may be – permitting that to happen, not willing that it happened, that would be lying. It really is a matter of trying to deal with extraordinarily difficult matters that may arise in social relations where people may ask questions that you simply cannot answer. Everybody knows that this kind of thing is liable to happen. So mental reservation is, in a sense, a way of answering without lying.”

In Mr Madden’s case, Cardinal Connell emphasised that he did not lie to the media about the use of diocesan funds for the compensation of clerical child sexual abuse victims.

[Cardinal Connell] explained to [Andrew] Madden [a sexual abuse victim, that] he had told journalists “that diocesan funds ARE (report’s emphasis) not used for such a purpose; that he had not said that diocesan funds WERE not used for such a purpose. By using the present tense he had not excluded the possibility that diocesan funds had been used for such purpose in the past. According to Mr Madden, Cardinal Connell considered that there was an enormous difference between the two.”


David Foster Wallace on the impossibility of being informed & the seduction of dogma

From David Foster Wallace’s “Introduction” (The Best American Essays 2007):

Here is an overt premise. There is just no way that 2004’s reelection could have taken place—not to mention extraordinary renditions, legalized torture, FISA-flouting, or the passage of the Military Commissions Act—if we had been paying attention and handling information in a competent grown-up way. ‘We’ meaning as a polity and culture. The premise does not entail specific blame—or rather the problems here are too entangled and systemic for good old-fashioned finger-pointing. It is, for one example, simplistic and wrong to blame the for-profit media for somehow failing to make clear to us the moral and practical hazards of trashing the Geneva Conventions. The for-profit media is highly attuned to what we want and the amount of detail we’ll sit still for. And a ninety-second news piece on the question of whether and how the Geneva Conventions ought to apply in an era of asymmetrical warfare is not going to explain anything; the relevant questions are too numerous and complicated, too fraught with contexts in everything from civil law and military history to ethics and game theory. One could spend a hard month just learning the history of the Conventions’ translation into actual codes of conduct for the U.S. military … and that’s not counting the dramatic changes in those codes since 2002, or the question of just what new practices violate (or don’t) just which Geneva provisions, and according to whom. Or let’s not even mention the amount of research, background, cross-checking, corroboration, and rhetorical parsing required to understand the cataclysm of Iraq, the collapse of congressional oversight, the ideology of neoconservatism, the legal status of presidential signing statements, the political marriage of evangelical Protestantism and corporatist laissez-faire … There’s no way. You’d simply drown. We all would. It’s amazing to me that no one much talks about this—about the fact that whatever our founders and framers thought of as a literate, informed citizenry can no longer exist, at least not without a whole new modern degree of subcontracting and dependence packed into what we mean by ‘informed.’8

8 Hence, by the way, the seduction of partisan dogma. You can drown in dogmatism now, too—radio, Internet, cable, commercial and scholarly print—but this kind of drowning is more like sweet release. Whether hard right or new left or whatever, the seduction and mentality are the same. You don’t have to feel confused or inundated or ignorant. You don’t even have to think, for you already Know, and whatever you choose to learn confirms what you Know. This dogmatic lockstep is not the kind of inevitable dependence I’m talking about—or rather it’s only the most extreme and frightened form of that dependence.


Religion, God, history, morality

From Steve Paulson’s interview with Robert Wright, “God, He’s moody” (Salon: 24 June 2009):

Do you think religions share certain core principles?

Not many. People in the modern world, certainly in America, think of religion as being largely about prescribing moral behavior. But religion wasn’t originally about that at all. To judge by hunter-gatherer religions, religion was not fundamentally about morality before the invention of agriculture. It was trying to figure out why bad things happen and increasing the frequency with which good things happen. Why do you sometimes get earthquakes, storms, disease and get slaughtered? But then sometimes you get nice weather, abundant game and you get to do the slaughtering. Those were the religious questions in the beginning.

And bad things happened because the gods were against you or certain spirits had it out for you?

Yes, you had done something to offend a god or spirit. However, it was not originally a moral lapse. That’s an idea you see as societies get more complex. When you have a small group of hunter-gatherers, a robust moral system is not a big challenge. Everyone knows everybody, so it’s hard to conceal anything you steal. If you mess with somebody too much, there will be payback. Moral regulation is not a big problem in a simple society. But as society got more complex with the invention of agriculture and writing, morality did become a challenge. Religion filled that gap.

For people who claim that Israel was monotheistic from the get-go and its flirtations with polytheism were rare aberrations, it’s interesting that the Jerusalem temple, according to the Bible’s account, had all these other gods being worshiped in it. Asherah was in the temple. She seemed to be a consort or wife of Yahweh. And there were vessels devoted to Baal, the reviled Canaanite god. So Israel was fundamentally polytheistic at this point. Then King Josiah goes on a rampage as he tries to consolidate his own power by wiping out the other gods.

You make the point that the Quran is a different kind of sacred text than the Bible. It was probably written over the course of two decades, while the stories collected in the Bible were written over centuries. That’s why the Bible is such a diverse document.

We think of the Bible as a book, but in ancient times it would have been thought of as a library. There were books written by lots of different people, including a lot of cosmopolitan elites. You also see elements of Greek philosophy. The Quran is just one guy talking. In the Muslim view, he’s mediating the word of God. He’s not especially cosmopolitan. He is, according to Islamic tradition, illiterate. So it’s not surprising that the Quran didn’t have the intellectual diversity and, in some cases, the philosophical depth that you find in the Bible. I do think he was actually a very modern thinker. Muhammad’s argument for why you should be devoted exclusively to this one God is very modern.

Are you also saying we can be religious without believing in God?

By some definitions, yes. It’s hard to find a definition of religion that encompasses everything we call religion. The definition I like comes from William James. He said, “Religious belief consists of the belief that there is an unseen order and that our supreme good lies in harmoniously adjusting to that order.” In that sense, you can be religious without believing in God. In that sense, I’m religious. On the God question, I’m not sure.


Bernie Madoff & the 1st worldwide Ponzi scheme

From Diana B. Henrioques’s “Madoff Scheme Kept Rippling Outward, Across Borders” (The New York Times: 20 December 2008):

But whatever else Mr. Madoff’s game was, it was certainly this: The first worldwide Ponzi scheme — a fraud that lasted longer, reached wider and cut deeper than any similar scheme in history, entirely eclipsing the puny regional ambitions of Charles Ponzi, the Boston swindler who gave his name to the scheme nearly a century ago.

Regulators say Mr. Madoff himself estimated that $50 billion in personal and institutional wealth from around the world was gone. … Before it evaporated, it helped finance Mr. Madoff’s coddled lifestyle, with a Manhattan apartment, a beachfront mansion in the Hamptons, a small villa overlooking Cap d’Antibes on the French Riviera, a Mayfair office in London and yachts in New York, Florida and the Mediterranean.

In 1960, as Wall Street was just shaking off its postwar lethargy and starting to buzz again, Bernie Madoff (pronounced MAY-doff) set up his small trading firm. His plan was to make a business out of trading lesser-known over-the-counter stocks on the fringes of the traditional stock market. He was just 22, a graduate of Hofstra University on Long Island.

By 1989, Mr. Madoff ‘s firm was handling more than 5 percent of the trading volume on the august New York Stock Exchange …

And in 1990, he became the nonexecutive chairman of the Nasdaq market, which at the time was operated as a committee of the National Association of Securities Dealers.

His rise on Wall Street was built on his belief in a visionary notion that seemed bizarre to many at the time: That stocks could be traded by people who never saw each other but were connected only by electronics.

In the mid-1970s, he had spent over $250,000 to upgrade the computer equipment at the Cincinnati Stock Exchange, where he began offering to buy and sell stocks that were listed on the Big Board. The exchange, in effect, was transformed into the first all-electronic computerized stock exchange.

He also invested in new electronic trading technology for his firm, making it cheaper for brokerage firms to fill their stock orders. He eventually gained a large amount of business from big firms like A. G. Edwards & Sons, Charles Schwab & Company, Quick & Reilly and Fidelity Brokerage Services.

By the end of the technology bubble in 2000, his firm was the largest market maker on the Nasdaq electronic market, and he was a member of the Securities Industry Association, now known as the Securities Industry and Financial Markets Association, Wall Street’s principal lobbying arm.


COBOL is much more widely used than you might think

From Darryl Taft’s “Enterprise Applications: 20 Things You Might Not Know About COBOL (as the Language Turns 50)” (eWeek: September 2009; accessed 25 September 2009): http://www.eweek.com/c/a/Enterprise-Applications/20-Things-You-Might-Not-Know-About-COBOL-As-the-Language-Turns-50-103943/?kc=EWKNLBOE09252009FEA1

Five billion lines of new COBOL are developed every year.

More than 80 percent of all daily business transactions are processed in COBOL.

More than 70 percent of all worldwide business data is stored on a mainframe.

More than 70 percent of mission-critical applications are in COBOL.

More than 310 billion lines of software are in use today and more than 200 billion lines are COBOL (65 percent of the total software).
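The 65 percent figure is consistent with the two line counts quoted in that claim, as a quick back-of-the-envelope check (a sketch, using only the article's own numbers) shows:

```python
# Check the article's claim that 200 of the 310 billion lines of
# software in use are COBOL ("65 percent of the total software").
total_lines = 310e9
cobol_lines = 200e9

share = cobol_lines / total_lines
print(f"COBOL share: {share:.1%}")  # about 64.5%, i.e. roughly 65 percent
```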

There are 200 times more COBOL transactions per day than Google searches worldwide.

An estimated 2 million people are currently working in COBOL in one form or another.


Why Picasso charged a million dollars


From Josh Olson’s “I Will Not Read Your Fucking Script” (The Village Voice: 9 September 2009):

There’s a great story about Pablo Picasso. Some guy told Picasso he’d pay him to draw a picture on a napkin. Picasso whipped out a pen and banged out a sketch, handed it to the guy, and said, “One million dollars, please.”

“A million dollars?” the guy exclaimed. “That only took you thirty seconds!”

“Yes,” said Picasso. “But it took me fifty years to learn how to draw that in thirty seconds.”


Apple’s role in technology


From Doc Searls’s “The Most Personal Device” (Linux Journal: 1 March 2009):

My friend Keith Hopper made an interesting observation recently. He said one of Apple’s roles in the world is finding categories where progress is logjammed, and opening things up by coming out with a single solution that takes care of everything, from the bottom to the top. Apple did it with graphical computing, with .mp3 players, with on-line music sales and now with smartphones. In each case, it opens up whole new territories that can then be settled and expanded by other products, services and companies. Yes, it’s closed and controlling and the rest of it. But what matters is the new markets that open up.


Who would ever think that it was a good idea?


Read this article about Paul Krassner’s experiences with the Manson Family & note the emphasis I’ve added – is this not the greatest sentence out of nowhere you’ve ever seen? How in the world did that ever seem like a good idea?

From Paul Krassner’s “My Acid Trip with Squeaky Fromme” (The Huffington Post: 6 August 2009):

Manson was on Death Row — before capital punishment was repealed (and later reinstated, but not retroactively) in California — so I was unable to meet with him. Reporters had to settle for an interview with any prisoner awaiting the gas chamber, and it was unlikely that Charlie would be selected at random for me.

In the course of our correspondence, there was a letter from Manson consisting of a few pages of gibberish about Christ and the Devil, but at one point, right in the middle, he wrote in tiny letters, “Call Squeaky,” with her phone number. I called, and we arranged to meet at her apartment in Los Angeles. On an impulse, I brought several tabs of acid with me on the plane.
