
Steve Jobs, genius

From Stephen Fry’s “Steve Jobs” (The New Adventures of Stephen Fry: 6 October 2011):

Henry Ford didn’t invent the motor car, Rockefeller didn’t discover how to crack crude oil into petrol, Disney didn’t invent animation, the McDonald brothers didn’t invent the hamburger, Martin Luther King didn’t invent oratory, neither Jane Austen, Tolstoy nor Flaubert invented the novel and D. W. Griffith, the Warner Brothers, Irving Thalberg and Steven Spielberg didn’t invent film-making. Steve Jobs didn’t invent computers and he didn’t invent packet switching or the mouse. But he saw that there were no limits to the power that creative combinations of technology and design could accomplish.

I once heard George Melly, on a programme about Louis Armstrong, do that dangerous thing and give his own definition of a genius. “A genius,” he said, “is someone who enters a field and works in it and when they leave it, it is different. By that token, Satchmo was a genius.” I don’t think any reasonable person could deny that Steve Jobs, by that same token, was a genius too.


David Pogue’s insights about tech over time

From David Pogue’s “The Lessons of 10 Years of Talking Tech” (The New York Times: 25 November 2010):

As tech decades go, this one has been a jaw-dropper. Since my first column in 2000, the tech world has not so much blossomed as exploded. Think of all the commonplace tech that didn’t even exist 10 years ago: HDTV, Blu-ray, GPS, Wi-Fi, Gmail, YouTube, iPod, iPhone, Kindle, Xbox, Wii, Facebook, Twitter, Android, online music stores, streaming movies and on and on.

With the turkey cooking, this seems like a good moment to review, to reminisce — and to distill some insight from the first decade in the new tech millennium.

Things don’t replace things; they just splinter. I can’t tell you how exhausting it is to keep hearing pundits say that some product is the “iPhone killer” or the “Kindle killer.” Listen, dudes: the history of consumer tech is branching, not replacing.

Things don’t replace things; they just add on. Sooner or later, everything goes on-demand. The last 10 years have brought a sweeping switch from tape and paper storage to digital downloads. Music, TV shows, movies, photos and now books and newspapers. We want instant access. We want it easy.

Some people’s gadgets determine their self-esteem. … Today’s gadgets are intensely personal. Your phone or camera or music player makes a statement, reflects your style and character. No wonder some people interpret criticisms of a product as a criticism of their choices. By extension, it’s a critique of them.

Everybody reads with a lens. … feelings run just as strongly in the tech realm. You can’t use the word “Apple,” “Microsoft” or “Google” in a sentence these days without stirring up emotion.

It’s not that hard to tell the winners from the losers. … There was the Microsoft Spot Watch (2003). This was a wireless wristwatch that could display your appointments and messages — but cost $10 a month, had to be recharged nightly and wouldn’t work outside your home city unless you filled out a Web form in advance.

Some concepts’ time may never come. The same “breakthrough” ideas keep surfacing — and bombing, year after year. For the love of Mike, people, nobody wants videophones!

Teenagers do not want “communicators” that do nothing but send text messages, either (AT&T Ogo, Sony Mylo, Motorola V200). People do not want to surf the Internet on their TV screens (WebTV, AOLTV, Google TV). And give it up on the stripped-down kitchen “Internet appliances” (3Com Audrey, Netpliance i-Opener, Virgin Webplayer). Nobody has ever bought one, and nobody ever will.

Forget about forever — nothing lasts a year. Of the thousands of products I’ve reviewed in 10 years, only a handful are still on the market. Oh, you can find some gadgets whose descendants are still around: iPod, BlackBerry, Internet Explorer and so on.

But it’s mind-frying to contemplate the millions of dollars and person-years that were spent on products and services that now fill the Great Tech Graveyard: Olympus M-Robe. PocketPC. Smart Display. MicroMV. MSN Explorer. Aibo. All those PlaysForSure music players, all those Palm organizers, all those GPS units you had to load up with maps from your computer.

Everybody knows that’s the way tech goes. The trick is to accept your gadget’s obsolescence at the time you buy it…

Nobody can keep up. Everywhere I go, I meet people who express the same reaction to consumer tech today: there’s too much stuff coming too fast. It’s impossible to keep up with trends, to know what to buy, to avoid feeling left behind. They’re right. There’s never been a period of greater technological change. You couldn’t keep up with all of it if you tried.


Unix: An Oral History

From “Unix: An Oral History”:

Multics

Gordon M. Brown

[Multics] was designed to include fault-free continuous operation capabilities, convenient remote terminal access and selective information sharing. One of the most important features of Multics was to follow the trend towards integrated multi-tasking and permit multiple programming environments and different human interfaces under one operating system.

Moreover, two key concepts picked up in the development of Multics would later serve to define Unix: that the less important features of the system introduced more complexity and, conversely, that the most important property of algorithms was simplicity. Ritchie explained this to Mahoney, articulating that:

The relationship of Multics to [the development of Unix] is actually interesting and fairly complicated. There were a lot of cultural things that were sort of taken over wholesale. And these include important things, [such as] the hierarchical file system and tree-structure file system – which incidentally did not get into the first version of Unix on the PDP-7. This is an example of why the whole thing is complicated. But at any rate, things like the hierarchical file system, choices of simple things like the characters you use to edit lines as you’re typing, erasing characters were the same as those we had. I guess the most important fundamental thing is just the notion that the basic style of interaction with the machine, the fact that there was the notion of a command line, the notion of an explicit shell program. In fact the name shell came from Multics. A lot of extremely important things were completely internalized, and of course this is the way it is. A lot of that came through from Multics.

The Beginning

Michael Errecart and Cameron Jones

Files to Share

The Unix file system was based almost entirely on the file system for the failed Multics project. The idea was for file sharing to take place with explicitly separated file systems for each user, so that there would be no locking of file tables.

A major part of the answer is that the file system had to be open. The needs of the group dictated that every user had access to every other user’s files, so the Unix system had to be extremely open. This openness is even seen in the password storage, which is not hidden at all but is encrypted. Any user can see all the encrypted passwords, but can only test one solution per second, which makes it extremely time consuming to try to break into the system.
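
That design survives in modern password storage: the stored value can be public as long as each guess is expensive. Here is a minimal Python sketch of the idea, using PBKDF2 as a stand-in for the original Unix crypt(3); the function names, iteration count, and parameters are illustrative assumptions, not the historical implementation.

```python
import hashlib
import os
import time

def hash_password(password: str, salt: bytes) -> bytes:
    # A deliberately slow hash: the iteration count makes each guess
    # expensive, so the stored value can be world-readable, much as
    # the early Unix password file was.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 500_000)

# The "password file" is open: anyone can read the salt and the hash.
salt = os.urandom(16)
stored = hash_password("correct horse", salt)

# An attacker who can read the file still pays the full cost per guess.
start = time.time()
assert hash_password("wrong guess", salt) != stored
print(f"one guess took ~{time.time() - start:.2f} s")
```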

The idea of standard input and output for devices eventually found its way into Unix as pipes. Pipes enabled users and programmers to send one function’s output into another function by simply placing a vertical line, a ‘|’, between the two functions. Piping is one of the most distinctive features of Unix …
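
As a rough illustration of what the shell does with that vertical line, here is a small Python sketch that wires one process’s standard output into another’s standard input, the equivalent of running `who | wc -l`; the command names are just examples and assume a Unix-like system.

```python
import subprocess

# Equivalent of the shell pipeline: who | wc -l
# The first process's stdout is handed to the second as its stdin.
who = subprocess.Popen(["who"], stdout=subprocess.PIPE)
wc = subprocess.Popen(["wc", "-l"], stdin=who.stdout, stdout=subprocess.PIPE)
who.stdout.close()  # so who receives SIGPIPE if wc exits early
output, _ = wc.communicate()
print(output.decode().strip(), "login sessions")
```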

Language from B to C

… Thompson was intent on having Unix be portable, and the creation of a portable language was intrinsic to this. …

Finding a Machine

Darice Wong & Jake Knerr

… Thompson devoted a month apiece to the shell, editor, assembler, and other software tools. …

Use of Unix started in the patent office of Bell Labs, but by 1972 there were a number of non-research organizations at Bell Labs that were beginning to use Unix for software development. Morgan recalls the importance of text processing in the establishment of Unix. …

Building Unix

Jason Aughenbaugh, Jonathan Jessup, & Nicholas Spicher

The Origin of Pipes

The first edition of Thompson and Ritchie’s The Unix Programmer’s Manual was dated November 3, 1971; however, the idea of pipes is not mentioned until the Version 3 Unix manual, published in February 1973. …

Software Tools

grep was, in fact, one of the first programs that could be classified as a software tool. Thompson designed it at the request of McIlroy, as McIlroy explains:

One afternoon I asked Ken Thompson if he could lift the regular expression recognizer out of the editor and make a one-pass program to do it. He said yes. The next morning I found a note in my mail announcing a program named grep. It worked like a charm. When asked what that funny name meant, Ken said it was obvious. It stood for the editor command that it simulated, g/re/p (global regular expression print). … From that special-purpose beginning, grep soon became a household word. (Something I had to stop myself from writing in the first paragraph above shows how firmly naturalized the idea now is: ‘I used ed to grep out words from the dictionary.’) More than any other single program, grep focused the viewpoint that Kernighan and Plauger christened and formalized in Software Tools: make programs that do one thing and do it well, with as few preconceptions about input syntax as possible.
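
The one-pass structure McIlroy describes is simple enough to sketch in a few lines. Here is a hedged Python approximation of the idea, not Thompson’s original implementation:

```python
import re
import sys

def grep(pattern: str, lines) -> None:
    # One pass over the input: compile the regular expression once,
    # print every matching line, and never look back.
    regex = re.compile(pattern)
    for line in lines:
        if regex.search(line):
            print(line, end="")

if __name__ == "__main__":
    # Usage: python grep.py 'pattern' < somefile
    grep(sys.argv[1], sys.stdin)
```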

eqn and grep are illustrative of the Unix toolbox philosophy that McIlroy phrases as, “Write programs that do one thing and do it well. Write programs to work together. Write programs that handle text streams, because that is a universal interface.” This philosophy was enshrined in Kernighan and Plauger’s 1976 book, Software Tools, and reiterated in the “Foreword” to the issue of The Bell System Technical Journal that also introduced pipes.

Ethos

Robert Murray-Rust & Malika Seth

McIlroy says,

This is the Unix philosophy. Write programs that do one thing and do it well. Write programs to work together. Write programs that handle text streams, because that is a universal interface.

The dissemination of Unix, with a focus on what went on within Bell Labs

Steve Chen

In 1973, the first Unix applications were installed on a system involved in updating directory information and intercepting calls to numbers that had been changed. This was the first time Unix had been used in supporting an actual, ongoing operating business. Soon, Unix was being used to automate the operations systems at Bell Laboratories. It automated monitoring, performed measurement, and helped route calls and ensure call quality.

There were numerous reasons for the friendliness the academic community, especially the academic Computer Science community, showed towards Unix. John Stoneback relates a few of these:

Unix came into many CS departments largely because it was the only powerful interactive system that could run on the sort of hardware (PDP-11s) that universities could afford in the mid ’70s. In addition, Unix itself was also very inexpensive. Since source code was provided, it was a system that could be shaped to the requirements of a particular installation. It was written in a language considerably more attractive than assembly, and it was small enough to be studied and understood by individuals. (John Stoneback, “The Collegiate Community,” Unix Review, October 1985, p. 27.)

The key features and characteristics of Unix that held it above other operating systems at the time were its software tools, its portability, its flexibility, and the fact that it was simple, compact, and efficient. The development of Unix in Bell Labs was carried on under a set of principles that the researchers had developed to guide their work. These principles included:

(i) Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features.

(ii) Expect the output of every program to become the input to another, as yet unknown, program. Don’t clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don’t insist on interactive input.

(iii) Design and build software, even operating systems, to be tried early, ideally within weeks. Don’t hesitate to throw away the clumsy parts and rebuild them.

(iv) Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you’ve finished using them.

(M.D. McIlroy, E.N. Pinson, and B.A. Tague, “Unix Time-Sharing System: Foreword,” The Bell System Technical Journal, July-Aug 1978, vol. 57, number 6, part 2, p. 1902)


How the Madden NFL videogame was developed

From Patrick Hruby’s “The Franchise: The inside story of how Madden NFL became a video game dynasty” (ESPN: 22 July 2010):

1982

Harvard grad and former Apple employee Trip Hawkins founds video game maker Electronic Arts, in part to create a football game; one year later, the company releases “One-on-One: Dr. J vs. Larry Bird,” the first game to feature licensed sports celebrities. Art imitates life.

1983-84

Hawkins approaches former Oakland Raiders coach and NFL television analyst John Madden to endorse a football game. Madden agrees, but insists on realistic game play with 22 on-screen players, a daunting technical challenge.

1988-90

EA releases the first Madden football game for the Apple II home computer; a subsequent Sega Genesis home console port blends the Apple II game’s realism with control pad-heavy, arcade-style action, becoming a smash hit.


You can measure the impact of “Madden” through its sales: as many as 2 million copies in a single week, 85 million copies since the game’s inception and more than $3 billion in total revenue. You can chart the game’s ascent, shoulder to shoulder, alongside the $20 billion-a-year video game industry, which is either co-opting Hollywood (see “Tomb Raider” and “Prince of Persia”) or topping it (opening-week gross of “Call of Duty: Modern Warfare 2”: $550 million; “The Dark Knight”: $204 million).

Some of the pain was financial. Just as EA brought its first games to market in 1983, the home video game industry imploded. In a two-year span, Coleco abandoned the business, Intellivision went from 1,200 employees to five and Atari infamously dumped thousands of unsold game cartridges into a New Mexico landfill. Toy retailers bailed, concluding that video games were a Cabbage Patch-style fad. Even at EA — a hot home computer startup — continued solvency was hardly assured.

In 1988, “John Madden Football” was released for the Apple II computer and became a modest commercial success.

THE STAKES WERE HIGH for a pair of upstart game makers, with a career-making opportunity and a $100,000 development contract on the line. In early 1990, Troy Lyndon and Mike Knox of San Diego-based Park Place Productions met with Hawkins to discuss building a “Madden” game for Sega’s upcoming home video game console, the Genesis. …

The game that made “Madden” a phenomenon wasn’t the initial Apple II release; it was the Genesis follow-up, a surprise smash spawned by an entirely different mindset. Hawkins wanted “Madden” to play out like the NFL. Equivalent stats. Similar play charts. Real football.

In 1990, EA had a market cap of about $60 million; three years later, that number swelled to $2 billion.

In 2004, EA paid the NFL a reported $300 million-plus for five years of exclusive rights to teams and players. The deal was later extended to 2013. Just like that, competing games went kaput. The franchise stands alone, triumphant, increasingly encumbered by its outsize success.

Hawkins left EA in the early 1990s to spearhead 3DO, an ill-fated console maker that became a doomed software house. An icy rift between the company and its founder ensued.


Microsoft’s real customers

From James Fallows’s “Inside the Leviathan: A short and stimulating brush with Microsoft’s corporate culture” (The Atlantic: February 2000):

Financial analysts have long recognized that Microsoft’s profit really comes from two sources. One is operating systems (Windows, in all its varieties), and the other is the Office suite of programs. Everything else — Flight Simulator, Slate, MSNBC, mice and keyboards — is financially meaningless. What these two big categories have in common is that individuals are not the significant customers. Operating systems are sold mainly to computer companies such as Dell and Compaq, which pass them pre-loaded to individual consumers. And the main paying customers for Office are big corporations (or what the high-tech world calls LORGs, for “large-size organizations”), which may buy thousands of “seats” for their employees at hundreds of dollars apiece. Product planning, therefore, is focused with admirable clarity on those whose decisions really matter to Microsoft — the information-technology manager at Chevron or the U.S. Department of Agriculture, for example — rather than some writer with an idea about how to make his colleagues happier with a program.


A great example of poor comments in your code

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 43:

[Peter Samson, one of the first MIT hackers], though, was particularly obscure in refusing to add comments to his source code explaining what he was doing at a given time. One well-distributed program Samson wrote went on for hundreds of assembly language instructions, with only one comment beside an instruction which contained the number 1750. The comment was RIPJSB, and people racked their brains about its meaning until someone figured out that 1750 was the year Bach died, and that Samson had written an abbreviation for Rest In Peace Johann Sebastian Bach.


The Hacker Ethic

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 40-46:

Still, even in the days of the TX-0 [the late 1950s], the planks of the platform were in place. The Hacker Ethic:

  • Access To Computers — And Anything Which Might Teach You Something About The Way The World Works — Should Be Unlimited And Total. Always Yield To The Hands-On Imperative!
  • All Information Should Be Free.
  • Mistrust Authority — Promote Decentralization. The last thing you need is a bureaucracy. Bureaucracies, whether corporate, government, or university, are flawed systems, dangerous in that they cannot accommodate the exploratory impulse of true hackers. Bureaucrats hide behind arbitrary rules (as opposed to the logical algorithms by which machines and computer programs operate): they invoke those rules to consolidate power, and perceive the constructive impulse of hackers as a threat.
  • Hackers Should Be Judged By Their Hacking, Not Bogus Criteria Such As Degrees, Age, Race, Or Position. This meritocratic trait was not necessarily rooted in the inherent goodness of hacker hearts–it was mainly that hackers cared less about someone’s superficial characteristics than they did about his potential to advance the general state of hacking, to create new programs to admire, to talk about that new feature in the system.
  • You Can Create Art And Beauty On A Computer.
  • Computers Can Change Your Life For The Better.
  • Like Aladdin’s Lamp, You Could Get It To Do Your Bidding.


The origin of the word “munge”, “hack”, & others

From Steven Levy’s Hackers: Heroes of the Computer Revolution (Penguin Books: 2001): 23:

The core members hung out at [MIT’s Tech Model Railroad Club in the late 1950s] for hours; constantly improving The System, arguing about what could be done next, developing a jargon of their own that seemed incomprehensible to outsiders who might chance on these teen-aged fanatics … When a piece of equipment wasn’t working, it was “losing”; when a piece of equipment was ruined, it was “munged” (Mash Until No Good); the two desks in the corner of the room were not called the office, but the “orifice”; one who insisted on studying for courses was a “tool”; garbage was called “cruft”; and a project undertaken or a product built not solely to fulfill some constructive goal, but with some wild pleasure taken in mere involvement, was called a “hack.”

This latter term may have been suggested by ancient MIT lingo– the word “hack” had long been used to describe the elaborate college pranks that MIT students would regularly devise, such as covering the dome that overlooked the campus with reflecting foil. But as the TMRC people used the word, there was serious respect implied. While someone might call a clever connection between relays a “mere hack,” it would be understood that, to qualify as a hack, the feat must be imbued with innovation, style, and technical virtuosity.


Lovely – Microsoft will let companies create ad-filled desktop themes

From Jeff Bertolucci’s “Windows 7 Ads: Microsoft Tarts Up the Desktop” (PC World: 13 November 2009):

Microsoft has announced plans to peddle Windows 7 desktop space to advertisers, who’ll create Windows UI themes–customized backgrounds, audio clips, and other elements–that highlight their brand, Computerworld reports. In fact, some advertiser themes are already available in the Windows 7 Personalization Gallery, including desktop pitches for soft drinks (Coca-Cola, Pepsi), autos (Ducati, Ferrari, Infiniti), and big-budget Hollywood blockbusters (Avatar).

The advertiser themes are different, however, in that they won’t be foisted on unsuspecting users. Rather, you’ll have to download and install the ad pitch yourself. As a result, I doubt many Windows 7 users will gripe about ad themes. Hey, if you’re a Preparation H fan, why not devote the desktop to your favorite ointment?


Bernie Madoff & the 1st worldwide Ponzi scheme

From Diana B. Henriques’s “Madoff Scheme Kept Rippling Outward, Across Borders” (The New York Times: 20 December 2008):

But whatever else Mr. Madoff’s game was, it was certainly this: The first worldwide Ponzi scheme — a fraud that lasted longer, reached wider and cut deeper than any similar scheme in history, entirely eclipsing the puny regional ambitions of Charles Ponzi, the Boston swindler who gave his name to the scheme nearly a century ago.

Regulators say Mr. Madoff himself estimated that $50 billion in personal and institutional wealth from around the world was gone. … Before it evaporated, it helped finance Mr. Madoff’s coddled lifestyle, with a Manhattan apartment, a beachfront mansion in the Hamptons, a small villa overlooking Cap d’Antibes on the French Riviera, a Mayfair office in London and yachts in New York, Florida and the Mediterranean.

In 1960, as Wall Street was just shaking off its postwar lethargy and starting to buzz again, Bernie Madoff (pronounced MAY-doff) set up his small trading firm. His plan was to make a business out of trading lesser-known over-the-counter stocks on the fringes of the traditional stock market. He was just 22, a graduate of Hofstra University on Long Island.

By 1989, Mr. Madoff’s firm was handling more than 5 percent of the trading volume on the august New York Stock Exchange …

And in 1990, he became the nonexecutive chairman of the Nasdaq market, which at the time was operated as a committee of the National Association of Securities Dealers.

His rise on Wall Street was built on his belief in a visionary notion that seemed bizarre to many at the time: That stocks could be traded by people who never saw each other but were connected only by electronics.

In the mid-1970s, he had spent over $250,000 to upgrade the computer equipment at the Cincinnati Stock Exchange, where he began offering to buy and sell stocks that were listed on the Big Board. The exchange, in effect, was transformed into the first all-electronic computerized stock exchange.

He also invested in new electronic trading technology for his firm, making it cheaper for brokerage firms to fill their stock orders. He eventually gained a large amount of business from big firms like A. G. Edwards & Sons, Charles Schwab & Company, Quick & Reilly and Fidelity Brokerage Services.

By the end of the technology bubble in 2000, his firm was the largest market maker on the Nasdaq electronic market, and he was a member of the Securities Industry Association, now known as the Securities Industry and Financial Markets Association, Wall Street’s principal lobbying arm.


Programmer jokes

Q: How do you tell an introverted computer scientist from an extroverted computer scientist?

A: An extroverted computer scientist looks at your shoes when he talks to you.


Knock, knock.

Who’s there?

very long pause….

Java.


Saying that Java is nice because it works on every OS is like saying that anal sex is nice because it works on every gender.


A physicist, an engineer and a programmer were in a car driving over a steep alpine pass when the brakes failed. The car was getting faster and faster, they were struggling to get round the corners and once or twice only the feeble crash barrier saved them from crashing down the side of the mountain. They were sure they were all going to die, when suddenly they spotted an escape lane. They pulled into the escape lane, and came safely to a halt.

The physicist said “We need to model the friction in the brake pads and the resultant temperature rise, see if we can work out why they failed”.

The engineer said “I think I’ve got a few spanners in the back. I’ll take a look and see if I can work out what’s wrong”.

The programmer said “Why don’t we get going again and see if it’s reproducible?”


To understand what recursion is you must first understand recursion.


Apple’s role in technology


From Doc Searls’s “The Most Personal Device” (Linux Journal: 1 March 2009):

My friend Keith Hopper made an interesting observation recently. He said one of Apple’s roles in the world is finding categories where progress is logjammed, and opening things up by coming out with a single solution that takes care of everything, from the bottom to the top. Apple did it with graphical computing, with .mp3 players, with on-line music sales and now with smartphones. In each case, it opens up whole new territories that can then be settled and expanded by other products, services and companies. Yes, it’s closed and controlling and the rest of it. But what matters is the new markets that open up.


Grab what others type through an electrical socket


From Tim Greene’s “Black Hat set to expose new attacks” (Network World: 27 July 2009):

Black Hat USA 2009, considered a premier venue for publicizing new exploits with an eye toward neutralizing them, is expected to draw thousands to hear presentations from academics, vendors and private crackers.

For instance, one talk will demonstrate that if attackers can plug into an electrical socket near a computer or draw a bead on it with a laser they can steal whatever is being typed in. How to execute this attack will be demonstrated by Andrea Barisani and Daniele Bianco, a pair of researchers for network security consultancy Inverse Path.

Attackers grab keyboard signals that are generated by hitting keys. Because the data wire within the keyboard cable is unshielded, the signals leak into the ground wire in the cable, and from there into the ground wire of the electrical system feeding the computer. Bit streams generated by the keyboards that indicate what keys have been struck create voltage fluctuations in the grounds, they say.

Attackers extend the ground of a nearby power socket and attach to it two probes separated by a resistor. The voltage difference and the fluctuations in that difference – the keyboard signals – are captured from both ends of the resistor and converted to letters.

This method would not work if the computer were unplugged from the wall, such as a laptop running on its battery. A second attack can prove effective in this case, Bianco’s and Barisani’s paper says.

Attackers point a cheap laser at a shiny part of a laptop or even an object on the table with the laptop. A receiver is aligned to capture the reflected light beam and the modulations that are caused by the vibrations resulting from striking the keys.

Analyzing the sequences of individual keys that are struck and the spacing between words, the attacker can figure out what message has been typed. Knowing what language is being typed is a big help, they say.


How security experts defended against Conficker

From Jim Giles’s “The inside story of the Conficker worm” (New Scientist: 12 June 2009):

23 October 2008 … The dry, technical language of Microsoft’s October update did not indicate anything particularly untoward. A security flaw in a port that Windows-based PCs use to send and receive network signals, it said, might be used to create a “wormable exploit”. Worms are pieces of software that spread unseen between machines, mainly – but not exclusively – via the internet (see “Cell spam”). Once they have installed themselves, they do the bidding of whoever created them.

If every Windows user had downloaded the security patch Microsoft supplied, all would have been well. Not all home users regularly do so, however, and large companies often take weeks to install a patch. That provides windows of opportunity for criminals.

The new worm soon ran into a listening device, a “network telescope”, housed by the San Diego Supercomputing Center at the University of California. The telescope is a collection of millions of dummy internet addresses, all of which route to a single computer. It is a useful monitor of the online underground: because there is no reason for legitimate users to reach out to these addresses, mostly only suspicious software is likely to get in touch.

The telescope’s logs show the worm spreading in a flash flood. For most of 20 November, about 3000 infected computers attempted to infiltrate the telescope’s vulnerable ports every hour – only slightly above the background noise generated by older malicious code still at large. At 6 pm, the number began to rise. By 9 am the following day, it was 115,000 an hour. Conficker was already out of control.

That same day, the worm also appeared in “honeypots” – collections of computers connected to the internet and deliberately unprotected to attract criminal software for analysis. It was soon clear that this was an extremely sophisticated worm. After installing itself, for example, it placed its own patch over the vulnerable port so that other malicious code could not use it to sneak in. As Brandon Enright, a network security analyst at the University of California, San Diego, puts it, smart burglars close the window they enter by.

Conficker also had an ingenious way of communicating with its creators. Every day, the worm came up with 250 meaningless strings of letters and attached a top-level domain name – a .com, .net, .org, .info or .biz – to the end of each to create a series of internet addresses, or URLs. Then the worm contacted these URLs. The worm’s creators knew what each day’s URLs would be, so they could register any one of them as a website at any time and leave new instructions for the worm there.
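
A minimal Python sketch of how such a domain-generation scheme works is below. It illustrates the general technique only; the seeding, hash function, and name lengths here are assumptions for illustration, not Conficker’s actual algorithm.

```python
import hashlib
from datetime import date

TLDS = [".com", ".net", ".org", ".info", ".biz"]

def daily_domains(day: date, count: int = 250) -> list:
    # Both the worm and its authors can derive today's list from the
    # date alone, so no direct contact between them is ever needed.
    domains = []
    for i in range(count):
        digest = hashlib.md5(f"{day.isoformat()}-{i}".encode()).hexdigest()
        # Turn pairs of hex digits into letters to get a meaningless name.
        name = "".join(
            chr(ord("a") + int(digest[2 * j : 2 * j + 2], 16) % 26)
            for j in range(8)
        )
        domains.append(name + TLDS[i % len(TLDS)])
    return domains

# The defenders' problem: all of these must be blocked, every day.
print(daily_domains(date(2009, 1, 15))[:5])
```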

It was a smart trick. The worm hunters would only ever spot the illicit address when the infected computers were making contact and the update was being downloaded – too late to do anything. For the next day’s set of instructions, the creators would have a different list of 250 to work with. The security community had no way of keeping up.

No way, that is, until Phil Porras got involved. He and his computer security team at SRI International in Menlo Park, California, began to tease apart the Conficker code. It was slow going: the worm was hidden within two shells of encryption that defeated the tools that Porras usually applied. By about a week before Christmas, however, his team and others – including the Russian security firm Kaspersky Labs, based in Moscow – had exposed the worm’s inner workings, and had found a list of all the URLs it would contact.

[Rick Wesson of Support Intelligence] has years of experience with the organisations that handle domain registration, and within days of getting Porras’s list he had set up a system to remove the tainted URLs, using his own money to buy them up.

It seemed like a major win, but the hackers were quick to bounce back: on 29 December, they started again from scratch by releasing an upgraded version of the worm that exploited the same security loophole.

This new worm had an impressive array of new tricks. Some were simple. As well as propagating via the internet, the worm hopped on to USB drives plugged into an infected computer. When those drives were later connected to a different machine, it hopped off again. The worm also blocked access to some security websites: when an infected user tried to go online and download the Microsoft patch against it, they got a “site not found” message.

Other innovations revealed the sophistication of Conficker’s creators. If the encryption used for the previous strain was tough, that of the new version seemed virtually bullet-proof. It was based on code little known outside academia that had been released just three months earlier by researchers at the Massachusetts Institute of Technology.

Indeed, worse was to come. On 15 March, Conficker presented the security experts with a new problem. It reached out to a URL called rmpezrx.org. It was on the list that Porras had produced, but – those involved decline to say why – it had not been blocked. One site was all that the hackers needed. A new version was waiting there to be downloaded by all the already infected computers, complete with another new box of tricks.

Now the cat-and-mouse game became clear. Conficker’s authors had discerned Porras and Wesson’s strategy and so from 1 April, the code of the new worm soon revealed, it would be able to start scanning for updates on 500 URLs selected at random from a list of 50,000 that were encoded in it. The range of suffixes would increase to 116 and include many country codes, such as .kz for Kazakhstan and .ie for Ireland. Each country-level suffix belongs to a different national authority, each of which sets its own registration procedures. Blocking the previous set of domains had been exhausting. It would soon become nigh-on impossible – even if the new version of the worm could be fully decrypted.

Luckily, Porras quickly repeated his feat and extracted the crucial list of URLs. Immediately, Wesson and others contacted the Internet Corporation for Assigned Names and Numbers (ICANN), an umbrella body that coordinates country suffixes.

From the second version onwards, Conficker had come with a much more efficient option: peer-to-peer (P2P) communication. This technology, widely used to trade pirated copies of software and films, allows software to reach out and exchange signals with copies of itself.

Six days after the 1 April deadline, Conficker’s authors let loose a new version of the worm via P2P. With no central release point to target, security experts had no means of stopping it spreading through the worm’s network. The URL scam seems to have been little more than a wonderful way to waste the anti-hackers’ time and resources. “They said: you’ll have to look at 50,000 domains. But they never intended to use them,” says Joe Stewart of SecureWorks in Atlanta, Georgia. “They used peer-to-peer instead. They misdirected us.”

The latest worm release had a few tweaks, such as blocking the action of software designed to scan for its presence. But piggybacking on it was something more significant: the worm’s first moneymaking schemes. These were a spam program called Waledac and a fake antivirus package named Spyware Protect 2009.

The same goes for fake software: when the accounts of a Russian company behind an antivirus scam became public last year, it appeared that one criminal had earned more than $145,000 from it in just 10 days.


Outline for an Unpublished Linux Textbook

Back in 2004 or so, I was asked to write an outline for a college textbook that would be used in courses on Linux. I happily complied, producing the outline you can see on my website. The editor on the project loved the outline & showed it to several professors to get their reactions, which were uniformly positive, with one prof reporting back that (& I’m paraphrasing here) “It was like this author read my mind, as this is exactly the book I’d like to use in my course!” Sadly, the book was never written, because the editor’s boss didn’t like the fact that I didn’t have a PhD in Computer Science. I thought that to be a silly reason then, & I think it’s a silly reason to reject the book now.

However, their loss is your gain. Here’s the outline for the book. Yes, it’s sadly outdated. Yes, it focuses quite a bit on SUSE, but that was what the publisher wanted. Yes, Linux has come a LONG way since I wrote this outline. But I still think it’s a damn good job, and you may find it interesting for historical reasons. So, enjoy!


Could Green Dam lead to the largest botnet in history?


From Rob Cottingham’s “From blocking to botnet: Censorship isn’t the only problem with China’s new Internet blocking software” (Social Signal: 10 June 2009):

Any blocking software needs to update itself from time to time: at the very least to freshen its database of forbidden content, and more than likely to fix bugs, add features and improve performance. (Most anti-virus software does this.)

If all the software does is to refresh the list of banned sites, that limits the potential for abuse. But if the software is loading new executable code onto the computer, suddenly there’s the potential for something a lot bigger.

Say you’re a high-ranking official in the Chinese military. And let’s say you have some responsibility for the state’s capacity to wage so-called cyber warfare: digital assaults on an enemy’s technological infrastructure.

It strikes you: there’s a single backdoor into more than 40 million Chinese computers, capable of installing… well, nearly anything you want.

What if you used that backdoor, not just to update blocking software, but to create something else?

Say, the biggest botnet in history?

Still, a botnet 40 million strong (plus the installed base already in place in Chinese schools and other institutions) at the beck and call of the military is potentially a formidable weapon. Even if the Chinese government has no intention today of using Green Dam for anything other than blocking pornography, the temptation to repurpose it for military purposes may prove to be overwhelming.


Steve Jobs on mediocrity & market share

From Steven Levy’s “OK, Mac, Make a Wish: Apple’s ‘computer for the rest of us’ is, insanely, 20” (Newsweek: 2 February 2004):

If that’s so, then why is the Mac market share, even after Apple’s recent revival, sputtering at a measly 5 percent? Jobs has a theory about that, too. Once a company devises a great product, he says, it has a monopoly in that realm, and concentrates less on innovation than protecting its turf. “The Mac user interface was a 10-year monopoly,” says Jobs. “Who ended up running the company? Sales guys. At the critical juncture in the late ’80s, when they should have gone for market share, they went for profits. They made obscene profits for several years. And their products became mediocre. And then their monopoly ended with Windows 95. They behaved like a monopoly, and it came back to bite them, which always happens.”


Extreme male brains

From Joe Clark’s “The extreme Google brain” (Fawny: 26 April 2009):

… Susan Pinker’s The Sexual Paradox, which explains, using scientific findings, why large majorities of girls and women behave almost identically at different stages of their lives – while large minorities of boys and men show vast variability compared to each other and to male norms.

Some of these boys and men exhibit extreme-male-brain tendencies, including an ability to focus obsessively for long periods of time, often on inanimate objects or abstractions (hence male domination of engineering and high-end law). Paradoxically, other male brains in these exceptional cases may have an ability to experiment with many options for short periods each. Though this latter ability is often pejoratively diagnosed as attention-deficit disorder, Pinker provides evidence that it is actually a strength for some entrepreneurs.

The male brain, extreme or not, is compatible with visual design. It allows you to learn every font in the Letraset catalogue and work from a grid. In fact, the male-brain capacity for years-long single-mindedness explains why the heads of large ad agencies and design houses are overwhelmingly male. (It isn’t a sexist conspiracy.)

In the computer industry, extreme male brains permit years of concentration on hardware and software design, while also iterating those designs seemingly ad infinitum. The extreme male brain is really the extreme Google brain. It’s somewhat of a misnomer, because such is actually the average brain inside the company, but I will use that as a neologism.

Google was founded by extreme-male-brain nerds and, by all outward appearances, seems to hire only that type of person, not all of them male.


$9 million stolen from 130 ATM machines in 49 cities in 30 minutes

From Catey Hill’s “Massive ATM heist! $9M stolen in only 30 minutes” (New York Daily News: 12 February 2009):

With information stolen from only 100 ATM cards, thieves made off with $9 million in cash, according to published reports. It only took 30 minutes.

“We’ve seen similar attempts to defraud a bank through ATM machines but not, not anywhere near the scale we have here,” FBI Agent Ross Rice told Fox 5. “We’ve never seen one this well coordinated,” the FBI told Fox 5.

The heist happened in November, but FBI officials released more information about the events only recently. …

How did they do it? The thieves hacked into the RBS WorldPay computer system and stole payroll card information from the company. A payroll card is used by many companies to pay the salaries of their employees. The cards work a lot like a debit card and can be used in any ATM.

Once the thieves had the card info, they employed a group of ‘cashers’ – people employed to go get the money out of the ATMs. The cashers went to ATMs around the world and withdrew money.

“Over 130 different ATM machines in 49 cities worldwide were accessed in a 30-minute period on November 8,” Agent Rice told Fox 5.
