Talk about Markdown to SLUUG this Wednesday

I’ll be giving a talk to the St. Louis UNIX Users Group this Wednesday night about Markdown, a tool I absolutely love.

You’re invited to come. Please do – I think you’ll definitely learn a lot.

Date: Wednesday, Nov. 9, 2011
Time: 6:30 – 9 pm
Where: 11885 Lackland Rd., St. Louis, MO 63146
Map: http://g.co/maps/6gg9g
Directions: http://www.sluug.org/resources/meeting_info/map_graybar.shtml

Here’s the description:

John Gruber, the inventor of Markdown, describes it this way: “Markdown is a text-to-HTML conversion tool for web writers. Markdown allows you to write using an easy-to-read, easy-to-write plain text format, then convert it to structurally valid XHTML (or HTML). Thus, ‘Markdown’ is two things: (1) a plain text formatting syntax; and (2) a software tool, written in Perl, that converts the plain text formatting to HTML. … The overriding design goal for Markdown’s formatting syntax is to make it as readable as possible. The idea is that a Markdown-formatted document should be publishable as-is, as plain text, without looking like it’s been marked up with tags or formatting instructions.”

This talk by Scott Granneman & Bill Odom will cover the basics of Markdown’s syntax, key variants of Markdown, tools for composing Markdown (including vim, of course!), and ways you can easily transform a plain text file written in Markdown into HTML, JSON, TXT, LaTeX, man, MediaWiki, Textile, DocBook XML, ODT, EPUB, Slidy and S5 HTML and JavaScript slide shows, RTF, or even Word!
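
To give a hedged taste of the conversion step the talk will demo, here’s a minimal sketch in Python using the third-party markdown package (pip install markdown). It’s one implementation among many – Gruber’s original tool is a Perl script, and the longer list of output formats above comes from converters such as Pandoc.

    # A minimal sketch of Markdown-to-HTML conversion, assuming the
    # third-party "markdown" package (pip install markdown) is available.
    import markdown

    source = "# A heading\n\nPlain text with *emphasis* and a [link](https://example.com/)."
    print(markdown.markdown(source))
    # <h1>A heading</h1>
    # <p>Plain text with <em>emphasis</em> and a
    # <a href="https://example.com/">link</a>.</p>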

If you have any questions, please contact me. Hope to see you there!


Steve Jobs, genius

From Stephen Fry’s “Steve Jobs” (The New Adventures of Stephen Fry: 6 October 2011):

Henry Ford didn’t invent the motor car, Rockefeller didn’t discover how to crack crude oil into petrol, Disney didn’t invent animation, the McDonald brothers didn’t invent the hamburger, Martin Luther King didn’t invent oratory, neither Jane Austen, Tolstoy nor Flaubert invented the novel and D. W. Griffith, the Warner Brothers, Irving Thalberg and Steven Spielberg didn’t invent film-making. Steve Jobs didn’t invent computers and he didn’t invent packet switching or the mouse. But he saw that there were no limits to the power that creative combinations of technology and design could accomplish.

I once heard George Melly, on a programme about Louis Armstrong, do that dangerous thing and give his own definition of a genius. “A genius,” he said, “is someone who enters a field and works in it and when they leave it, it is different. By that token, Satchmo was a genius.” I don’t think any reasonable person could deny that Steve Jobs, by that same token, was a genius too.


How changes in glass changed working conditions

From Nicholas Carr’s “(re)framed” (Rough Type: 3 June 2011):

I’m reminded of an interesting passage in the book Glass: A World History:

As we have seen, one of the rapid developments in glass technology was the making of panes of window glass, plain and coloured, which was particularly noticeable in the northern half of Europe [after the twelfth century]. One very practical effect of this was on working conditions. In the cold and dark northern half of Europe people could now work for longer hours and with more precision because they were shielded from the elements. The light poured in, yet the cold was kept out. Prior to glass only thin slivers of horn or parchment were used and the window spaces were of necessity much smaller and the light admitted, dimmer.


Speaking at SLUUG: Amazing, Stupendous, Mind-Blowing Apps for iPad2

Jans Carton & I are delivering a talk at the St. Louis UNIX Users Group at 6:30 pm this Wednesday, 8 June 2011, titled “Amazing, Stupendous, Mind-Blowing Apps for iPad2”. We’ll be demoing iPad apps live for everyone. If you want to find out more about the iPad, or discover some awesome new iPad apps, then come hear Jans & me speak.

Here’s the complete description:

It’s becoming crystal clear that tablets and other mobile devices are leading the computing revolution into the next decade, and at the forefront (at least for now) is Apple’s iPad. Scott Granneman & Jans Carton will demonstrate iPad apps that they find particularly useful, cool, or amazing – sometimes all three at the same time! You’ll see apps for everything from video to games to reading to productivity, and much, much more. We guarantee you’ll see something that makes you think, and something else that makes you go “Wow!”.

The meeting will be held at Graybar Electric Co., Inc. at 11885 Lackland Road, 63146. Directions are at http://sluug.org/resources/meeting_info/map_graybar.shtml, & a Google Map can be found at http://goo.gl/maps/jnpX.


Clay Shirky on the changes to publishing & media

From Parul Sehgal’s “Here Comes Clay Shirky” (Publishers Weekly: 21 June 2010):

PW: In April of this year, Wired‘s Kevin Kelly turned a Shirky quote—“Institutions will try to preserve the problem to which they are the solution”—into “the Shirky Principle,” in deference to the simple, yet powerful observation. … Kelly explained, “The Shirky Principle declares that complex solutions, like a company, or an industry, can become so dedicated to the problem they are the solution to, that often they inadvertently perpetuate the problem.”

CS: It is possible to think that the Internet will be a net positive for society while admitting that there are significant downsides—after all, it’s not a revolution if nobody loses.

No one will ever wonder, is there anything amusing for me on the Internet? That is a solved problem. What we should really care about are [the Internet’s] cultural uses.

In Here Comes Everybody I told the story of the Abbot of Sponheim who in 1492 wrote a book saying that if this printing press thing is allowed to expand, what will the scribes do for a living? But it was more important that Europe be literate than for scribes to have a job.

In a world where a book had to be a physical object, charging money was a way to cause more copies to come into circulation. In the digital world, charging money for something is a way to produce fewer copies. There is no way to preserve the status quo and not abandon that value.

Some of it’s the brilliant Upton Sinclair observation: “It’s hard to make a man understand something if his livelihood depends on him not understanding it.” From the laying on of hands of [Italian printer] Aldus Manutius on down, publishing has always been this way. This is a medium where a change to glue-based paperback binding constituted a revolution.

PW: When do you think a similar realization will come to book publishing?

CS: I think someone will make the imprint that bypasses the traditional distribution networks. Right now the big bottleneck is the head buyer at Barnes & Noble. That’s the seawall holding back the flood in publishing. Someone’s going to say, “I can do a business book or a vampire book or a romance novel, whatever, that might sell 60% of the units it would sell if I had full distribution and a multimillion dollar marketing campaign—but I can do it for 1% of the cost.” It has already happened a couple of times with specialty books. The moment of tip happens when enough things get joined up to create their own feedback loop, and the feedback loop in publishing changes when someone at Barnes & Noble says: “We can’t afford not to stock this particular book or series from an independent publisher.” It could be on Lulu, or iUniverse, whatever. And, I feel pretty confident saying it’s going to happen in the next five years.


David Pogue’s insights about tech over time

From David Pogue’s “The Lessons of 10 Years of Talking Tech” (The New York Times: 25 November 2010):

As tech decades go, this one has been a jaw-dropper. Since my first column in 2000, the tech world has not so much blossomed as exploded. Think of all the commonplace tech that didn’t even exist 10 years ago: HDTV, Blu-ray, GPS, Wi-Fi, Gmail, YouTube, iPod, iPhone, Kindle, Xbox, Wii, Facebook, Twitter, Android, online music stores, streaming movies and on and on.

With the turkey cooking, this seems like a good moment to review, to reminisce — and to distill some insight from the first decade in the new tech millennium.

Things don’t replace things; they just splinter. I can’t tell you how exhausting it is to keep hearing pundits say that some product is the “iPhone killer” or the “Kindle killer.” Listen, dudes: the history of consumer tech is branching, not replacing. Things don’t replace things; they just add on.

Sooner or later, everything goes on-demand. The last 10 years have brought a sweeping switch from tape and paper storage to digital downloads. Music, TV shows, movies, photos and now books and newspapers. We want instant access. We want it easy.

Some people’s gadgets determine their self-esteem. … Today’s gadgets are intensely personal. Your phone or camera or music player makes a statement, reflects your style and character. No wonder some people interpret criticisms of a product as a criticism of their choices. By extension, it’s a critique of them.

Everybody reads with a lens. … feelings run just as strongly in the tech realm. You can’t use the word “Apple,” “Microsoft” or “Google” in a sentence these days without stirring up emotion.

It’s not that hard to tell the winners from the losers. … There was the Microsoft Spot Watch (2003). This was a wireless wristwatch that could display your appointments and messages — but cost $10 a month, had to be recharged nightly and wouldn’t work outside your home city unless you filled out a Web form in advance.

Some concepts’ time may never come. The same “breakthrough” ideas keep surfacing — and bombing, year after year. For the love of Mike, people, nobody wants videophones!

Teenagers do not want “communicators” that do nothing but send text messages, either (AT&T Ogo, Sony Mylo, Motorola V200). People do not want to surf the Internet on their TV screens (WebTV, AOLTV, Google TV). And give it up on the stripped-down kitchen “Internet appliances” (3Com Audrey, Netpliance i-Opener, Virgin Webplayer). Nobody has ever bought one, and nobody ever will.

Forget about forever — nothing lasts a year. Of the thousands of products I’ve reviewed in 10 years, only a handful are still on the market. Oh, you can find some gadgets whose descendants are still around: iPod, BlackBerry, Internet Explorer and so on.

But it’s mind-frying to contemplate the millions of dollars and person-years that were spent on products and services that now fill the Great Tech Graveyard: Olympus M-Robe. PocketPC. Smart Display. MicroMV. MSN Explorer. Aibo. All those PlaysForSure music players, all those Palm organizers, all those GPS units you had to load up with maps from your computer.

Everybody knows that’s the way tech goes. The trick is to accept your gadget’s obsolescence at the time you buy it…

Nobody can keep up. Everywhere I go, I meet people who express the same reaction to consumer tech today: there’s too much stuff coming too fast. It’s impossible to keep up with trends, to know what to buy, to avoid feeling left behind. They’re right. There’s never been a period of greater technological change. You couldn’t keep up with all of it if you tried.


Hanoi’s last blacksmith

From Seth Mydans’s “A Lone Blacksmith, Where Hammers Rang” (The New York Times: 25 November 2010):

HANOI, Vietnam — He is the last blacksmith on Blacksmith Street, dark with soot, his arms dappled with burns, sweating and hammering at his little roadside forge as a new world courses past him.

The son and grandson of blacksmiths, Nguyen Phuong Hung grew up when the street still rang with the sounds of the smithies, producing farm equipment, horseshoes and hand tools, before modern commerce and industrial production made them obsolete. “I still remember, when it was raining lightly, the streets were empty and all you could hear was the sounds of the hammers,” said Mr. Hung, 49.

The men who worked there left for lighter, better-paying work, and because the word was out that no modern woman would marry a blacksmith, Mr. Hung said. There may be other blacksmiths working in Vietnam, he said, but not here in the capital.

“Once I am gone the street will have no meaning anymore,” he said. “Blacksmith Street will be only a name.” That has been the fate of almost all the 36 narrow streets in Hanoi’s tree-shaded Ancient Quarter, each of them named for the guilds that once controlled them — Fan Street, China Bowl Street, Sweet Potato Street, Conical Hat Street.

There is nothing like this little corner of the urban past anywhere else in Vietnam. Only four of the streets have retained something of their original businesses, said Nguyen Vinh Phuc, a leading historian of Hanoi. There are still jewelry shops on Silver Street, sweets and pastries on Sugar Street, votive papers and toys on Votive Paper Street and pots and pans on Tin Street.

Traders have done business on this spot since the ninth century, Mr. Phuc said. The 36 guilds established themselves at the start of the 19th century.

Blacksmith Street got its name at the beginning of the 19th century, Mr. Phuc said, when French colonial administrators sent out a call for metal workers to help build the Long Bien bridge over the Red River. It was designed by the French architect Gustave Eiffel and became a target of American bombing raids during the Vietnam War.

Mr. Hung’s family has been here from the start, and like his father and grandfather he was called to help out around the forge when he was just a boy, as young as 6. But he rebelled and left for jobs as a driver and factory worker until, when he was 35, his father called him back. “My father told me this is the family trade and I’m the only one left to do it,” Mr. Hung said. “He said, ‘Just watch me work and you’ll learn what to do.’”

Mr. Hung discovered that he loved the work, and that it was his destiny to be a blacksmith. He remembered his father’s words: “When the iron glows red, you earn your money. That is your life.”

Mr. Hung has set up a little tea table on the sidewalk, refilling a thermos from a huge iron kettle that swings gently above the hot coals. A giant bamboo pipe leans against the table, and passersby are welcome to stop for a lungful of strong tobacco.

Mr. Hung hammers with the confidence of a master, bare-handed as he works because he says gloves would dull his touch. Wearing a pair of plastic sandals, he ignores the sparks that sting his feet and pepper his shirt with holes. Flames and smoke gush from the hot metal as he tempers it in a bucket of oil. By the end of the day, his arms and face are black with soot.


Unix: An Oral History

From “Unix: An Oral History”:

Multics

Gordon M. Brown

[Multics] was designed to include fault-free continuous operation capabilities, convenient remote terminal access and selective information sharing. One of the most important features of Multics was to follow the trend towards integrated multi-tasking and permit multiple programming environments and different human interfaces under one operating system.

Moreover, two key concepts had been picked up on in the development of Multics that would later serve to define Unix. These were that the less important features of the system introduced more complexity and, conversely, that the most important property of algorithms was simplicity. Ritchie explained this to Mahoney, articulating that:

The relationship of Multics to [the development of Unix] is actually interesting and fairly complicated. There were a lot of cultural things that were sort of taken over wholesale. And these include important things, [such as] the hierarchical file system and tree-structure file system – which incidentally did not get into the first version of Unix on the PDP-7. This is an example of why the whole thing is complicated. But at any rate, things like the hierarchical file system, choices of simple things like the characters you use to edit lines as you’re typing, erasing characters were the same as those we had. I guess the most important fundamental thing is just the notion that the basic style of interaction with the machine, the fact that there was the notion of a command line, the notion of an explicit shell program. In fact the name shell came from Multics. A lot of extremely important things were completely internalized, and of course this is the way it is. A lot of that came through from Multics.

The Beginning

Michael Errecart and Cameron Jones

Files to Share

The Unix file system was based almost entirely on the file system for the failed Multics project. The idea was for file sharing to take place with explicitly separated file systems for each user, so that there would be no locking of file tables.

A major part of the answer is that the file system had to be open. The needs of the group dictated that every user had access to every other user’s files, so the Unix system had to be extremely open. This openness is even seen in the password storage, which is not hidden at all but is encrypted. Any user can see all the encrypted passwords, but can only test one solution per second, which makes it extremely time consuming to try to break into the system.
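
That design survives in the classic Unix crypt() routine: the stored string is a hash anyone may read, and checking a guess means re-hashing it with the same salt. Here’s a rough sketch in Python, using the standard library’s Unix-only crypt module (deprecated in recent Python versions); the password and salt are made up for illustration.

    # A rough sketch of the world-readable-but-hashed scheme described
    # above, using Python's Unix-only "crypt" module (deprecated in
    # recent Python versions). Password and salt are invented examples.
    import crypt

    stored = crypt.crypt("wizard", "ab")   # what a world-readable file would hold
    print(stored)                          # the hash, not the password

    def check(guess: str) -> bool:
        # Re-hash the guess, using the stored value as the salt, and compare.
        return crypt.crypt(guess, stored) == stored

    print(check("wizard"), check("hacker"))   # True False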

The idea of standard input and output for devices eventually found its way into Unix as pipes. Pipes enabled users and programmers to send one function’s output into another function by simply placing a vertical line, ‘|’, between the two functions. Piping is one of the most distinctive features of Unix …
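
The same plumbing is reachable from any language that can spawn processes. As a hypothetical sketch in Python, here is the shell pipeline ls /usr/bin | wc -l wired up by hand, one program’s output feeding the next one’s input:

    # A sketch of a pipe: run the equivalent of `ls /usr/bin | wc -l`,
    # connecting ls's standard output to wc's standard input.
    import subprocess

    ls = subprocess.Popen(["ls", "/usr/bin"], stdout=subprocess.PIPE)
    wc = subprocess.Popen(["wc", "-l"], stdin=ls.stdout, stdout=subprocess.PIPE)
    ls.stdout.close()   # let wc see end-of-file when ls finishes
    print(wc.communicate()[0].decode().strip())   # the line count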

Language from B to C

… Thompson was intent on having Unix be portable, and the creation of a portable language was intrinsic to this. …

Finding a Machine

Darice Wong & Jake Knerr

… Thompson devoted a month apiece to the shell, editor, assembler, and other software tools. …

Use of Unix started in the patent office of Bell Labs, but by 1972 there were a number of non-research organizations at Bell Labs that were beginning to use Unix for software development. Morgan recalls the importance of text processing in the establishment of Unix. …

Building Unix

Jason Aughenbaugh, Jonathan Jessup, & Nicholas Spicher

The Origin of Pipes

The first edition of Thompson and Ritchie’s The Unix Programmer’s Manual was dated November 3, 1971; however, the idea of pipes is not mentioned until the Version 3 Unix manual, published in February 1973. …

Software Tools

grep was, in fact, one of the first programs that could be classified as a software tool. Thompson designed it at the request of McIlroy, as McIlroy explains:

One afternoon I asked Ken Thompson if he could lift the regular expression recognizer out of the editor and make a one-pass program to do it. He said yes. The next morning I found a note in my mail announcing a program named grep. It worked like a charm. When asked what that funny name meant, Ken said it was obvious. It stood for the editor command that it simulated, g/re/p (global regular expression print). … From that special-purpose beginning, grep soon became a household word. (Something I had to stop myself from writing in the first paragraph above shows how firmly naturalized the idea now is: ‘I used ed to grep out words from the dictionary.’) More than any other single program, grep focused the viewpoint that Kernighan and Plauger christened and formalized in Software Tools: make programs that do one thing and do it well, with as few preconceptions about input syntax as possible.

eqn and grep are illustrative of the Unix toolbox philosophy that McIlroy phrases as, “Write programs that do one thing and do it well. Write programs to work together. Write programs that handle text streams, because that is a universal interface.” This philosophy was enshrined in Kernighan and Plauger’s 1976 book, Software Tools, and reiterated in the “Foreword” to the issue of The Bell System Technical Journal that also introduced pipes.
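
The “one thing well” idea is small enough to show whole. Here’s a hypothetical miniature grep in Python – one pass over standard input, printing every line that matches the regular expression given on the command line:

    # minigrep.py -- a hypothetical miniature grep: one pass over stdin,
    # printing each line that matches the pattern given as the first argument.
    import re
    import sys

    pattern = re.compile(sys.argv[1])
    for line in sys.stdin:
        if pattern.search(line):
            sys.stdout.write(line)

Run it as, say, python minigrep.py 'fever' < notes.txt; because it reads standard input and writes standard output, it drops straight into a pipeline.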

Ethos

Robert Murray-Rust & Malika Seth

McIlroy says,

This is the Unix philosophy. Write programs that do one thing and do it well. Write programs to work together. Write programs that handle text streams, because that is a universal interface.

The dissemination of Unix, with a focus on what went on within Bell Labs

Steve Chen

In 1973, the first Unix applications were installed on a system involved in updating directory information and intercepting calls to numbers that had been changed. This was the first time Unix had been used in supporting an actual, ongoing operating business. Soon, Unix was being used to automate the operations systems at Bell Laboratories, automating monitoring, assisting with measurement, and helping to route calls and ensure call quality.

There were numerous reasons for the friendliness the academic community, especially the academic Computer Science community, showed towards Unix. John Stoneback relates a few of these:

Unix came into many CS departments largely because it was the only powerful interactive system that could run on the sort of hardware (PDP-11s) that universities could afford in the mid ’70s. In addition, Unix itself was also very inexpensive. Since source code was provided, it was a system that could be shaped to the requirements of a particular installation. It was written in a language considerably more attractive than assembly, and it was small enough to be studied and understood by individuals. (John Stoneback, “The Collegiate Community,” Unix Review, October 1985, p. 27.)

The key features and characteristics of Unix that held it above other operating systems at the time were its software tools, its portability, its flexibility, and the fact that it was simple, compact, and efficient. The development of Unix in Bell Labs was carried on under a set of principles that the researchers had developed to guide their work. These principles included:

(i) Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features.

(ii) Expect the output of every program to become the input to another, as yet unknown, program. Don’t clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don’t insist on interactive input.

(iii) Design and build software, even operating systems, to be tried early, ideally within weeks. Don’t hesitate to throw away the clumsy parts and rebuild them.

(iv) Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you’ve finished using them.

(M.D. McIlroy, E.N. Pinson, and B.A. Tague, “Unix Time-Sharing System: Foreword,” The Bell System Technical Journal, July-August 1978, vol. 57, no. 6, part 2, p. 1902)


HDTVs widely adopted by American households

From Alex Mindlin’s “Room to Grow as Homes Add HD TVs” (The New York Times: 21 November 2010):

High-definition televisions have entered American homes with startling speed; 56 percent of households now have at least some HD channels and an HD set, according to Nielsen. Among consumer technologies, that speed of adoption is rivaled only by the VCR.

But the average television viewer is still unlikely to watch television in high definition. One reason is that people watch some standard channels on their HD sets. But a more important reason for the low HD viewing rate is that most homes have several televisions, of which the HD set is only one.

“What we’re seeing,” [Pat McDonough, a senior television analyst at Nielsen] said, “is that people want a second high-definition set pretty quickly, once they get used to watching it.”

Indeed, in households with a single HD set, family members diverge from normal American viewing patterns; rather than migrating to their rooms at night to watch TV on separate sets, they cluster in the living room around the HD set.


My favorite iPhone apps

Someone on a mailing list asked for a list of our favorite iPhone apps. Here’s what I said:

Reeder is the best RSS reader (tied to Google Reader, natch), bar none.

Articles presents Wikipedia beautifully.

Dropbox is an essential for the reasons Martin gave.

Echofon is a great Twitter app, especially since it syncs with its Mac desktop app.

Pano takes panoramic pix, ColorSplash allows you to make pix B&W & then selectively colorize them, & Camera+ has all sorts of goodies.

Rowmote Pro lets me control my Mac mini connected to my TV remotely.

Simplenote is a great note app that syncs with its website & JustNotes on my Mac.

1Password keeps passwords, account info, serial #’s, & sensitive notes encrypted & synced with the Mac version of the app using Dropbox.

Nightstand is a gorgeous alarm clock & more.

Amazon.com makes it too easy for me to spend $$$.

PhoneFlicks manages my Netflix queue.

And finally, even though it’s only been out for a day or two, Rage 3D is a killer shooter that looks freakin’ gorgeous.


Evaluating software features

When developing software, it’s important to rank your features, as you can’t do everything, & not everything is worth doing. One way to rank features is to categorize them in order of importance using the following three categories:

  1. Required/Essential/Necessary: Mission critical features that must be present
  2. Preferred/Conditional: Important features & enhancements that bring better experience & easier management, but can wait until later release if necessary
  3. Optional/Nice To Have: If resources permit, sure, but otherwise…

Of course, you should also group your features based upon the kinds of features they are. Here’s a suggestion for those groups:

  • User experience
  • Management
  • Security
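
To make the two-axis scheme concrete, here’s a hypothetical sketch in Python; the priority and group names mirror the lists above, and the example features are invented:

    # A hypothetical sketch of the ranking scheme above: every feature
    # gets a priority (how important) and a group (what kind of feature).
    from dataclasses import dataclass
    from enum import Enum

    class Priority(Enum):
        REQUIRED = 1    # mission critical; must be present
        PREFERRED = 2   # important, but can slip to a later release
        OPTIONAL = 3    # nice to have, if resources permit

    class Group(Enum):
        USER_EXPERIENCE = "user experience"
        MANAGEMENT = "management"
        SECURITY = "security"

    @dataclass
    class Feature:
        name: str
        priority: Priority
        group: Group

    backlog = [
        Feature("encrypted password storage", Priority.REQUIRED, Group.SECURITY),
        Feature("admin dashboard", Priority.PREFERRED, Group.MANAGEMENT),
        Feature("custom themes", Priority.OPTIONAL, Group.USER_EXPERIENCE),
    ]

    # Work the backlog in priority order.
    for f in sorted(backlog, key=lambda f: f.priority.value):
        print(f"{f.priority.name:9} {f.group.value:15} {f.name}")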


My response to the news that “Reader, Acrobat Patches Plug 23 Security Holes”

I sent this email out earlier today to friends & students:

For the love of Pete, people, if you use Adobe Acrobat Reader, update it.

http://krebsonsecurity.com/2010/10/reader-acrobat-patches-plug-23-security-holes/

But here’s a better question: why are you using Adobe Reader in the first place? It’s one of the WORST programs for security you can have on your computer. And most of the time, you just don’t need it!

If you use Windows, give Foxit Reader (http://www.foxitsoftware.com/pdf/reader/) a whirl. It’s free!

If you use a Mac, you already have a great PDF reader installed with your operating system: Preview. Use it.

The ONLY reason to use Adobe Reader is to fill out tax forms. When I need to do that, I download Adobe Reader, download the PDFs from the gubmint, fill out the PDFs, send ’em to the Feds & the State, & then remove Adobe Reader. I encourage others to do the same.


How the Madden NFL videogame was developed

From Patrick Hruby’s “The Franchise: The inside story of how Madden NFL became a video game dynasty” (ESPN: 22 July 2010):

1982

Harvard grad and former Apple employee Trip Hawkins founds video game maker Electronic Arts, in part to create a football game; one year later, the company releases “One-on-One: Dr. J vs. Larry Bird,” the first game to feature licensed sports celebrities. Art imitates life.

1983-84

Hawkins approaches former Oakland Raiders coach and NFL television analyst John Madden to endorse a football game. Madden agrees, but insists on realistic game play with 22 on-screen players, a daunting technical challenge.

1988-90

EA releases the first Madden football game for the Apple II home computer; a subsequent Sega Genesis home console port blends the Apple II game’s realism with control pad-heavy, arcade-style action, becoming a smash hit.


You can measure the impact of “Madden” through its sales: as many as 2 million copies in a single week, 85 million copies since the game’s inception and more than $3 billion in total revenue. You can chart the game’s ascent, shoulder to shoulder, alongside the $20 billion-a-year video game industry, which is either co-opting Hollywood (see “Tomb Raider” and “Prince of Persia”) or topping it (opening-week gross of “Call of Duty: Modern Warfare 2”: $550 million; “The Dark Knight”: $204 million).

Some of the pain was financial. Just as EA brought its first games to market in 1983, the home video game industry imploded. In a two-year span, Coleco abandoned the business, Intellivision went from 1,200 employees to five and Atari infamously dumped thousands of unsold game cartridges into a New Mexico landfill. Toy retailers bailed, concluding that video games were a Cabbage Patch-style fad. Even at EA — a hot home computer startup — continued solvency was hardly assured.

In 1988, “John Madden Football” was released for the Apple II computer and became a modest commercial success.

THE STAKES WERE HIGH for a pair of upstart game makers, with a career-making opportunity and a $100,000 development contract on the line. In early 1990, Troy Lyndon and Mike Knox of San Diego-based Park Place Productions met with Hawkins to discuss building a “Madden” game for Sega’s upcoming home video game console, the Genesis. …

Because the game that made “Madden” a phenomenon wasn’t the initial Apple II release, it was the Genesis follow-up, a surprise smash spawned by an entirely different mindset. Hawkins wanted “Madden” to play out like the NFL. Equivalent stats. Similar play charts. Real football.

In 1990, EA had a market cap of about $60 million; three years later, that number swelled to $2 billion.

In 2004, EA paid the NFL a reported $300 million-plus for five years of exclusive rights to teams and players. The deal was later extended to 2013. Just like that, competing games went kaput. The franchise stands alone, triumphant, increasingly encumbered by its outsize success.

Hawkins left EA in the early 1990s to spearhead 3DO, an ill-fated console maker that became a doomed software house. An icy rift between the company and its founder ensued.


A summary of Galbraith’s The Affluent Society

From a summary of John Kenneth Galbraith’s The Affluent Society (Abridge Me: 1 June 2010):

The Concept of the Conventional Wisdom

The paradigms on which society’s perception of reality is based are highly conservative. People invest heavily in these ideas, and so are heavily resistant to changing them. They are only finally overturned by new ideas when new events occur which make the conventional wisdom appear so absurd as to be impalpable. Then the conventional wisdom quietly dies with its most staunch proponents, to be replaced with a new conventional wisdom. …

Economic Security

… Economics professors argue that the threat of unemployment is necessary to maintain incentives to high productivity, and simultaneously that established professors require life tenure in order to do their best work. …

The Paramount Position of Production

… Another irrationality persists (more in America than elsewhere?): the prestigious usefulness of private-sector output, compared to the burdensome annoyance of public expenditure. Somehow public expenditure can never quite be viewed as a productive and enriching element of national output; it is forever something to be avoided, at best a necessary encumbrance. Cars are important, roads are not. An expansion in telephone services improves the general well-being, cuts in postal services are a necessary economy. Vacuum cleaners to ensure clean houses boost our standard of living, street cleaners are an unfortunate expense. Thus we end up with clean houses and filthy streets. …

[W]e have wants at the margin only so far as they are synthesised. We do not manufacture wants for goods we do not produce. …

The Dependence Effect

… Modern consumer demand, at the margin, does not originate from within the individual, but is a consequence of production. It has two origins:

  1. Emulation: the desire to keep abreast of, or ahead of, one’s peer group — demand originating from this motivation is created indirectly by production. Every effort to increase production to satiate want brings with it a general raising of the level of consumption, which itself increases want.
  2. Advertising: the direct influence of advertising and salesmanship create new wants which the consumer did not previously possess. Any student of business has by now come to view marketing as fundamental a business activity as production. Any want that can be significantly moulded by advertising cannot possibly have been strongly felt in the absence of that advertising — advertising is powerless to persuade a man that he is or is not hungry.

Inflation

… In 1942 a grateful and very anxious citizenry rewarded its soldiers, sailors, and airmen with a substantial increase in pay. In the teeming city of Honolulu, in prompt response to this advance in wage income, the prostitutes raised the prices of their services. This was at a time when, if anything, increased volume was causing a reduction in their average unit costs. However, in this instance the high military authorities, deeply angered by what they deemed improper, immoral, and indecent profiteering, ordered a return to the previous scale. …

The Theory of Social Balance

The final problem of the affluent society is the balance of goods it produces. Private goods: TVs, cars, cigarettes, drugs and alcohol are overproduced; public goods: education, healthcare, police services, park provision, mass transport and refuse disposal are underproduced. The consequences are extremely severe for the wellbeing of society. The balance between private and public consumption will be referred to as ‘the social balance’. The main reason for this imbalance is relatively straightforward. The forces we have identified which increase consumer demand as production rises (advertising and emulation) act almost entirely on the private sector. …

It is arguable that emulation acts on public services to an extent: a new school in one district may encourage neighbouring districts to ‘keep up’, but the effect is relatively minuscule.

Thus, private demand is artificially inflated and public demand is not, and the voter-consumer decides how to split his income between the two at the ballot box: inevitably public expenditure is grossly underrepresented. …


Microsoft’s real customers

From James Fallows’s “Inside the Leviathan: A short and stimulating brush with Microsoft’s corporate culture” (The Atlantic: February 2000):

Financial analysts have long recognized that Microsoft’s profit really comes from two sources. One is operating systems (Windows, in all its varieties), and the other is the Office suite of programs. Everything else — Flight Simulator, Slate, MSNBC, mice and keyboards — is financially meaningless. What these two big categories have in common is that individuals are not the significant customers. Operating systems are sold mainly to computer companies such as Dell and Compaq, which pass them pre-loaded to individual consumers. And the main paying customers for Office are big corporations (or what the high-tech world calls LORGs, for “large-size organizations”), which may buy thousands of “seats” for their employees at hundreds of dollars apiece. Product planning, therefore, is focused with admirable clarity on those whose decisions really matter to Microsoft — the information-technology manager at Chevron or the U.S. Department of Agriculture, for example — rather than some writer with an idea about how to make his colleagues happier with a program.


A great example of poor comments in your code

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 43:

[Peter Samson, one of the first MIT hackers], though, was particularly obscure in refusing to add comments to his source code explaining what he was doing at a given time. One well-distributed program Samson wrote went on for hundreds of assembly language instructions, with only one comment beside an instruction which contained the number 1750. The comment was RIPJSB, and people racked their brains about its meaning until someone figured out that 1750 was the year Bach died, and that Samson had written an abbreviation for Rest In Peace Johann Sebastian Bach.
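
The moral translates to any language. Here’s a hypothetical before-and-after in Python; the constant is invented, but the contrast is the point:

    # Samson-style: the comment records a private joke, not a reason.
    TIMEOUT = 1750  # RIPJSB

    # Descriptive: a reader learns what the number is for and where it came from.
    TIMEOUT = 1750  # ms to wait before retrying (tuned by hand; no special meaning)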


The Hacker Ethic

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 40-46:

Still, even in the days of the TX-0 [the late 1950s], the planks of the platform were in place. The Hacker Ethic:

  • Access To Computers — And Anything Which Might Teach You Something About The Way The World Works — Should Be Unlimited And Total. Always Yield To The Hands-On Imperative!
  • All Information Should Be Free.
  • Mistrust Authority — Promote Decentralization. The last thing you need is a bureaucracy. Bureaucracies, whether corporate, government, or university, are flawed systems, dangerous in that they cannot accommodate the exploratory impulse of true hackers. Bureaucrats hide behind arbitrary rules (as opposed to the logical algorithms by which machines and computer programs operate): they invoke those rules to consolidate power, and perceive the constructive impulse of hackers as a threat.
  • Hackers Should Be Judged By Their Hacking, Not Bogus Criteria Such As Degrees, Age, Race, Or Position. This meritocratic trait was not necessarily rooted in the inherent goodness of hacker hearts – it was mainly that hackers cared less about someone’s superficial characteristics than they did about his potential to advance the general state of hacking, to create new programs to admire, to talk about that new feature in the system.
  • You Can Create Art And Beauty On A Computer.
  • Computers Can Change Your Life For The Better.
  • Like Aladdin’s Lamp, You Could Get It To Do Your Bidding.


The origin of the words “munge”, “hack”, & others

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 23:

The core members hung out at [MIT’s Tech Model Railroad Club in the late 1950s] for hours; constantly improving The System, arguing about what could be done next, developing a jargon of their own that seemed incomprehensible to outsiders who might chance on these teen-aged fanatics … When a piece of equipment wasn’t working, it was “losing”; when a piece of equipment was ruined, it was “munged” (Mash Until No Good); the two desks in the corner of the room were not called the office, but the “orifice”; one who insisted on studying for courses was a “tool”; garbage was called “cruft”; and a project undertaken or a product built not solely to fulfill some constructive goal, but with some wild pleasure taken in mere involvement, was called a “hack.”

This latter term may have been suggested by ancient MIT lingo – the word “hack” had long been used to describe the elaborate college pranks that MIT students would regularly devise, such as covering the dome that overlooked the campus with reflecting foil. But as the TMRC people used the word, there was serious respect implied. While someone might call a clever connection between relays a “mere hack,” it would be understood that, to qualify as a hack, the feat must be imbued with innovation, style, and technical virtuosity.


Luther & Poe both complained about too many books

From Clay Shirky’s “Does The Internet Make You Smarter?” (The Wall Street Journal: 5 June 2010):

In the history of print … complaints about distraction have been rampant; no less a beneficiary of the printing press than Martin Luther complained, “The multitude of books is a great evil. There is no measure or limit to this fever for writing.” Edgar Allan Poe, writing during another surge in publishing, concluded, “The enormous multiplication of books in every branch of knowledge is one of the greatest evils of this age; since it presents one of the most serious obstacles to the acquisition of correct information.”


Refusing a technology defines you

From Sander Duivestein’s “Penny Thoughts on the Technium” (The Technium: 1 December 2009):

I’m interested in how people personally decide to refuse a technology. I’m interested in that process, because I think that will happen more and more as the number of technologies keeps increasing. The only way we can sort out our identity is by not using technology. It used to be that you defined yourself by what you used. Now you define yourself by what you don’t use. So I’m interested in that process.
