
It’s hard to judge the young, but the market can

From Paul Graham’s “Hiring is Obsolete” (May 2005):

It’s hard to judge the young because (a) they change rapidly, (b) there is great variation between them, and (c) they’re individually inconsistent. That last one is a big problem. When you’re young, you occasionally say and do stupid things even when you’re smart. So if the algorithm is to filter out people who say stupid things, as many investors and employers unconsciously do, you’re going to get a lot of false positives. …

The market is a lot more discerning than any employer. And it is completely non-discriminatory. On the Internet, nobody knows you’re a dog. And more to the point, nobody knows you’re 22. All users care about is whether your site or software gives them what they want. They don’t care if the person behind it is a high school kid.

It’s hard to judge the young, but the market can Read More »

The real vs. stated purpose of PowerPoint

From Paul Graham’s “Hiring is Obsolete” (May 2005):

For example, the stated purpose of Powerpoint is to present ideas. Its real role is to overcome people’s fear of public speaking. It allows you to give an impressive-looking talk about nothing, and it causes the audience to sit in a dark room looking at slides, instead of a bright one looking at you.

The real vs. stated purpose of PowerPoint Read More »

Why did it take so long for blogging to take off?

From Paul Graham’s “Hiring is Obsolete” (May 2005):

Have you ever noticed that when animals are let out of cages, they don’t always realize at first that the door’s open? Often they have to be poked with a stick to get them out. Something similar happened with blogs. People could have been publishing online in 1995, and yet blogging has only really taken off in the last couple years. In 1995 we thought only professional writers were entitled to publish their ideas, and that anyone else who did was a crank. Now publishing online is becoming so popular that everyone wants to do it, even print journalists. But blogging has not taken off recently because of any technical innovation; it just took eight years for everyone to realize the cage was open.

Why did it take so long for blogging to take off? Read More »

Why is American design so often terrible compared to Japanese design?

From Paul Graham’s “Made in USA” (November 2004):

Americans are good at some things and bad at others. We’re good at making movies and software, and bad at making cars and cities. And I think we may be good at what we’re good at for the same reason we’re bad at what we’re bad at. We’re impatient. In America, if you want to do something, you don’t worry that it might come out badly, or upset delicate social balances, or that people might think you’re getting above yourself. If you want to do something, as Nike says, just do it. …

For centuries the Japanese have made finer things than we have in the West. When you look at swords they made in 1200, you just can’t believe the date on the label is right. Presumably their cars fit together more precisely than ours for the same reason their joinery always has. They’re obsessed with making things well.

Not us. When we make something in America, our aim is just to get the job done. Once we reach that point, we take one of two routes. We can stop there, and have something crude but serviceable, like a Vise-grip. Or we can improve it, which usually means encrusting it with gratuitous ornament. When we want to make a car “better,” we stick tail fins on it, or make it longer, or make the windows smaller, depending on the current fashion. …

Letting focus groups design your cars for you only wins in the short term. In the long term, it pays to bet on good design. The focus group may say they want the meretricious feature du jour, but what they want even more is to imitate sophisticated buyers, and they, though a small minority, really do care about good design. Eventually the pimps and drug dealers notice that the doctors and lawyers have switched from Cadillac to Lexus, and do the same.

Why is American design so often terrible compared to Japanese design? Read More »

Who made money during the era of railroads

From Paul Graham’s “What the Bubble Got Right” (September 2004):

In fact most of the money to be made from big trends is made indirectly. It was not the railroads themselves that made the most money during the railroad boom, but the companies on either side, like Carnegie’s steelworks, which made the rails, and Standard Oil, which used railroads to get oil to the East Coast, where it could be shipped to Europe.

Who made money during the era of railroads Read More »

What successful startups need

From Paul Graham’s “How to Start a Startup” (March 2005):

You need three things to create a successful startup: to start with good people, to make something customers actually want, and to spend as little money as possible. Most startups that fail do it because they fail at one of these. A startup that does all three will probably succeed. …

I can think of several heuristics for generating ideas for startups, but most reduce to this: look at something people are trying to do, and figure out how to do it in a way that doesn’t suck. …

What matters is not ideas, but the people who have them. Good people can fix bad ideas, but good ideas can’t save bad people.

What successful startups need Read More »

Cultural differences between Unix and Windows

From Joel Spolsky’s “Biculturalism” (Joel on Software: 14 December 2003):

What are the cultural differences between Unix and Windows programmers? There are many details and subtleties, but for the most part it comes down to one thing: Unix culture values code which is useful to other programmers, while Windows culture values code which is useful to non-programmers.

This is, of course, a major simplification, but really, that’s the big difference: are we programming for programmers or end users? Everything else is commentary. …

Let’s look at a small example. The Unix programming culture holds in high esteem programs which can be called from the command line, which take arguments that control every aspect of their behavior, and the output of which can be captured as regularly-formatted, machine readable plain text. Such programs are valued because they can easily be incorporated into other programs or larger software systems by programmers. To take one miniscule example, there is a core value in the Unix culture, which Raymond calls “Silence is Golden,” that a program that has done exactly what you told it to do successfully should provide no output whatsoever. It doesn’t matter if you’ve just typed a 300 character command line to create a file system, or built and installed a complicated piece of software, or sent a manned rocket to the moon. If it succeeds, the accepted thing to do is simply output nothing. The user will infer from the next command prompt that everything must be OK.

This is an important value in Unix culture because you’re programming for other programmers. As Raymond puts it, “Programs that babble don’t tend to play well with other programs.” By contrast, in the Windows culture, you’re programming for Aunt Marge, and Aunt Marge might be justified in observing that a program that produces no output because it succeeded cannot be distinguished from a program that produced no output because it failed badly or a program that produced no output because it misinterpreted your request.
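The "Silence is Golden" convention is concrete: a Unix program signals success through its exit status, not through output. A minimal sketch (Python's subprocess module, assuming a POSIX system with the standard `true` utility):

```python
import subprocess

# Run a command that succeeds silently; "true" does nothing and exits 0.
result = subprocess.run(["true"], capture_output=True, text=True)

# Callers (and shells) check the exit status, not the output:
# 0 means success, anything else means failure.
assert result.returncode == 0
assert result.stdout == ""  # silence is golden
```

This is also why shell chains like `cmd && next_step` work: the whole chain is driven by exit codes, with no output to parse.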

Similarly, the Unix culture appreciates programs that stay textual. They don’t like GUIs much, except as lipstick painted cleanly on top of textual programs, and they don’t like binary file formats. This is because a textual interface is easier to program against than, say, a GUI interface, which is almost impossible to program against unless some other provisions are made, like a built-in scripting language. Here again, we see that the Unix culture values creating code that is useful to other programmers, something which is rarely a goal in Windows programming.
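The composability Spolsky describes comes from treating plain text as the interface. A small sketch, driving a shell pipeline from Python (assumes a POSIX shell with `printf` and `wc`):

```python
import subprocess

# Three lines of text flow from printf into wc -l. Each program sees
# only a plain-text stream, so they compose without knowing anything
# about each other.
pipeline = "printf 'a\\nb\\nc\\n' | wc -l"
proc = subprocess.run(pipeline, shell=True, capture_output=True, text=True)
line_count = int(proc.stdout.strip())  # wc's machine-readable answer
```

A GUI offers no equivalent of that last line: there is no stream to capture and parse.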

Which is not to say that all Unix programs are designed solely for programmers. Far from it. But the culture values things that are useful to programmers, and this explains a thing or two about a thing or two. …

The Unix cultural value of visible source code makes it an easier environment to develop for. Any Windows developer will tell you about the time they spent four days tracking down a bug because, say, they thought that the memory size returned by LocalSize would be the same as the memory size they originally requested with LocalAlloc, or some similar bug they could have fixed in ten minutes if they could see the source code of the library. …

When Unix was created and when it formed its cultural values, there were no end users. Computers were expensive, CPU time was expensive, and learning about computers meant learning how to program. It’s no wonder that the culture which emerged valued things which are useful to other programmers. By contrast, Windows was created with one goal only: to sell as many copies as conceivable at a profit. …

For example, Unix has a value of separating policy from mechanism which, historically, came from the designers of X. This directly led to a schism in user interfaces; nobody has ever quite been able to agree on all the details of how the desktop UI should work, and they think this is OK, because their culture values this diversity, but for Aunt Marge it is very much not OK to have to use a different UI to cut and paste in one program than she uses in another.

Cultural differences between Unix and Windows Read More »

Why disclosure laws are good

From Bruce Schneier’s “Identity-Theft Disclosure Laws” (Crypto-Gram Newsletter: 15 May 2006):

Disclosure laws force companies to make these security breaches public. This is a good idea for three reasons. One, it is good security practice to notify potential identity theft victims that their personal information has been lost or stolen. Two, statistics on actual data thefts are valuable for research purposes. And three, the potential cost of the notification and the associated bad publicity naturally leads companies to spend more money on protecting personal information — or to refrain from collecting it in the first place.

Why disclosure laws are good Read More »

Microsoft’s BitLocker could be used for DRM

From Bruce Schneier’s “Microsoft’s BitLocker” (Crypto-Gram Newsletter: 15 May 2006):

BitLocker is not a DRM system. However, it is straightforward to turn it into a DRM system. Simply give programs the ability to require that files be stored only on BitLocker-enabled drives, and then only be transferable to other BitLocker-enabled drives. How easy this would be to implement, and how hard it would be to subvert, depends on the details of the system.

Microsoft’s BitLocker could be used for DRM Read More »

Exploits used for corporate espionage

From Ryan Naraine’s “Microsoft Confirms Excel Zero-Day Attack Under Way” (eWeek: 16 June 2006):

Microsoft June 15 confirmed that a new, undocumented flaw in its widely used Excel spreadsheet program was being used in an attack against an unnamed target.

The company’s warning comes less than a month after a code-execution hole in Microsoft Word was exploited in what is described as a “super, super targeted attack” against business interests overseas.

The back-to-back zero-day attacks closely resemble each other and suggest that well-organized criminals are conducting corporate espionage using critical flaws purchased from underground hackers.

Exploits used for corporate espionage Read More »

Change the AMD K8 CPU without authentication checks

From Bruce Schneier’s Crypto-Gram Newsletter (15 August 2004):

Here’s an interesting hardware security vulnerability. Turns out that it’s possible to update the AMD K8 processor (Athlon64 or Opteron) microcode. And, get this, there’s no authentication check. So it’s possible that an attacker who has access to a machine can backdoor the CPU.

[See http://www.realworldtech.com/forums/index.cfm?action=detail&id=35446&threadid=35446&roomid=11]

Change the AMD K8 CPU without authentication checks Read More »

1st 2 questions AOL tech support asks

From “Spare me the details” (The Economist: 28 October 2004):

LISA HOOK, an executive at AOL, one of the biggest providers of traditional (“dial-up”) internet access, has learned amazing things by listening in on the calls to AOL’s help desk. Usually, the problem is that users cannot get online. The help desk’s first question is: “Do you have a computer?” Surprisingly often the answer is no, and the customer was trying to shove the installation CD into the stereo or TV set. The help desk’s next question is: “Do you have a second telephone line?” Again, surprisingly often the answer is no, which means that the customer cannot get on to the internet because he is on the line to the help desk. And so it goes on. …

1st 2 questions AOL tech support asks Read More »

Unix specs vs. Windows specs

From Peter Seebach’s “Standards and specs: Not by UNIX alone” (IBM developerWorks: 8 March 2006):

In the past 20 years, developers for “the same” desktop platform (“whatever Microsoft ships”) have been told that the API to target is (in this order):

* DOS
* Win16
* OS/2
* Win32
* WinNT
* WinXP
* and most recently .NET.

Of course, that list is from last year, and now the “stable” target that you should be developing for, if you have an eye for the future, is Vista.

It hasn’t been quite as bad in the Macintosh world, where the number of major API changes has been limited: classic single-tasking Mac OS, classic multitasking Mac OS (System 7), Carbon (System 8/9 and preview of OS X), and Cocoa (OS X), but even there, the cost of migration has been significant. At least OS X finally offers a stable UNIX API for the back-end part of programs, allowing developers to ignore the API creep except in GUI code.

By contrast, twenty-year-old UNIX utilities still compile and run. A new desktop computing API will come and everyone will have to rewrite for it, but mountains will erode away before read() and write() stop working. This is the reason that all the hassle of formal UNIX standards has had so little effect on practical UNIX software development; the core API is simple, clean, and well-designed, and there is no need to change it significantly.
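The stability of that core API is easy to demonstrate: the same `read()`/`write()` loop that worked on 1970s UNIX is exposed essentially unchanged in modern languages. A sketch in Python, whose `os.read`/`os.write` are thin wrappers over those system calls:

```python
import os

def copy_file(src: str, dst: str, bufsize: int = 65536) -> None:
    """Copy a file using the raw POSIX read()/write() system calls."""
    fd_in = os.open(src, os.O_RDONLY)
    fd_out = os.open(dst, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        while True:
            chunk = os.read(fd_in, bufsize)   # read(): unchanged for decades
            if not chunk:                     # empty read means end-of-file
                break
            os.write(fd_out, chunk)           # write(): likewise
    finally:
        os.close(fd_in)
        os.close(fd_out)
```

The loop, the EOF convention, and the file-descriptor model are the same ones documented in the earliest UNIX manuals.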

… UNIX users have been switching hardware platforms since the 1970s; it’s no big deal. …

Just as there are many varieties of UNIX, there are many UNIX standards:

* Probably the oldest standard that people still refer to is AT&T’s 1985 System V Interface Definition (SVID). This standard shows up, for instance, in man pages describing the standards compliance of functions that have been in the C library “forever.”
* Meanwhile, X/Open (now the Open Group) was developing “portability guides” with names like XPG2, XPG3, and so on. XPG1 was actually released in 1985. The XPG guides are largely subsumed into newer specs, but once again, are still referred to sometimes in documentation.
* The IEEE’s POSIX standard showed up in 1988 with updates in 1992 and 1993 and a second edition in 1996. It’s still a viable standard, although it has suffered from poor accessibility. POSIX specs have names like 1003.x; for instance, 1003.1 and 1003.2, which refer to different parts of the standard, or 1003.1-1988 and 1003.1-1990, which refer to two versions of the standard.
* The fairly ominous sounding “Spec 1170” (also known as “UNIX 98” or “Single Unix Specification”) is probably the most complete specification; it is produced by the Open Group, and is effectively a descendant of the XPG series. In practice, this is “the” UNIX standard these days, although it’s a little large; this has had an impact on conformance testing.
* The Linux Standards Base is not strictly a UNIX standard, but it’s a standardization effort relevant to a very large number of developers working with code designed to run “on UNIX.” …

You can look at OS specifications in two very different ways: one is from the point of view of a developer trying to port an application, and the other is from the point of view of the user trying to interact with the system.

UNIX conveniently blurs this distinction. The primary user interface is also one of the primary development environments; therefore, UNIX specifications often cover not only the C language API, but also the shell environment and many of the core utilities shell programmers rely on. …

From the perspective of a developer who’s seen many Unix-like systems, Linux is probably mostly sort of similar to System V. The heavy focus on GNU utilities gives a sort of surreal combination of Berkeley and System V features, but if you have to guess whether Linux does something the Berkeley way or the System V way, go with System V. This is especially true of system startup; nearly all Linux systems use the System V /etc/inittab and /etc/rc.d structure, or something very close to it. …
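For a flavor of what that System V heritage looks like, `/etc/inittab` consists of colon-separated `id:runlevels:action:process` records; a typical fragment (illustrative paths and values, not from any particular distribution) looks like:

```
# id:runlevels:action:process
id:3:initdefault:                      # default runlevel is 3
si::sysinit:/etc/rc.d/rc.sysinit       # run once at boot
l3:3:wait:/etc/rc.d/rc 3               # run the runlevel-3 scripts
1:2345:respawn:/sbin/getty 38400 tty1  # keep a login prompt on tty1
```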

Unix specs vs. Windows specs Read More »

AT&T’s security tv station

From Stephen Lawson & Robert McMillan’s “AT&T plans CNN-style security channel” (InfoWorld: 23 June 2005):

Security experts at AT&T are about to take a page from CNN’s playbook. Within the next year they will begin delivering a video streaming service that will carry Internet security news 24 hours a day, seven days a week, according to the executive in charge of AT&T Labs.

The service, which currently goes by the code name Internet Security News Network (ISN), is under development at AT&T Labs, but it will be offered as an additional service to the company’s customers within the next nine to 12 months, according to Hossein Eslambolchi, president of AT&T’s Global Networking Technology Services and AT&T Labs.

ISN will look very much like Time Warner’s Cable News Network, except that it will be broadcast exclusively over the Internet, Eslambolchi said. “It’s like CNN,” he said. “When a new attack is spotted, we’ll be able to offer constant updates, monitoring, and advice.”

AT&T’s security tv station Read More »

Why Microsoft is threatened by open source

From “How Microsoft played the patent card, and failed” (The Register: 23 December 2004):

… the joint lead on the Samba project, Jeremy Allison …: “Microsoft has bought off and paid off every competitor it has, except open source. Every single player they could buy out, they did. That leaves Real, and FOSS. And they can’t buy us out, because you can’t buy off a social movement.”

Why Microsoft is threatened by open source Read More »

A profile of phishers & their jobs

From Lee Gomes’s “Phisher Tales: How Webs of Scammers Pull Off Internet Fraud” (The Wall Street Journal: 20 June 2005):

The typical phisher, he discovered, isn’t a movie-style villain but a Romanian teenager, albeit one who belongs to a social and economic infrastructure that is both remarkably sophisticated and utterly ragtag.

If, in the early days, phishing scams were one-person operations, they have since become so complicated that, just as with medicine or law, the labor has become specialized.

Phishers with different skills will trade with each other in IRC chat rooms, says Mr. Abad. Some might have access to computers around the world that have been hijacked, and can thus be used in connection with a phishing attack. Others might design realistic “scam pages,” which are the actual emails that phishers send. …

But even if a phisher has a “full,” the real work has yet to begin. The goal of most phishers is to use the information they glean to withdraw money from your bank account. Western Union is one way. Another is making a fake ATM card using a blank credit card and a special magnetic stripe reader/writer, which is easy to purchase online.

A phisher, though, may not have the wherewithal to do either of those. He might, for instance, be stuck in a small town where the Internet is his only connection to the outside world. In that case, he’ll go into an IRC chat room and look for a “casher,” someone who can do the dirty work of actually walking up to an ATM. Cashers, says Mr. Abad, usually take a cut of the proceeds and then wire the rest back to the phisher.

Certain chat rooms are thus full of cashers looking for work. “I cash out,” advertised “CCPower” last week on an IRC channel that had 80 other people logged onto it. “Msg me for deal. 65% your share.”

The average nonphisher might wonder what would prevent a casher from simply taking the money and running. It turns out, says Mr. Abad, that phishers have a reputation-monitoring system much like eBay’s. If you rip someone off, your rating goes down. Not only that, phishers post nasty notices about you on IRC. “Sox and Bagzy are rippers,” warned a message posted last week.

Phishers, not surprisingly, are savvy about their targets. For instance, it wasn’t just a coincidence that Washington Mutual was a phisher favorite. Mr. Abad says it was widely known in the phishing underground that a flaw in the communications between the bank’s ATM machines and its mainframe computers made it especially easy to manufacture fake Washington Mutual ATM cards. The bank fixed the problem a few months ago, Mr. Abad says, and the incidence of Washington Mutual-related phishing quickly plummeted. …

Mr. Abad himself is just 23 years old, but he has spent much of the past 10 years hanging out in IRC chat rooms, encountering all manner of hackers and other colorful characters. One thing that’s different about phishers, he says, is how little they like to gab.

“Real hackers will engage in conversation,” he says. “With phishers, it’s a job.”

A profile of phishers & their jobs Read More »

Spammers causing problems to DNS

From Dennis Fisher’s “Spammers’ New Tactic Upends DNS” (eWeek: 10 January 2005):

One troublesome technique finding favor with spammers involves sending mass mailings in the middle of the night from a domain that has not yet been registered. After the mailings go out, the spammer registers the domain early the next morning.

By doing this, spammers hope to avoid stiff CAN-SPAM fines through minimal exposure and visibility with a given domain. The ruse, they hope, makes them more difficult to find and prosecute.

The scheme, however, has unintended consequences of its own. During the interval between mailing and registration, the SMTP servers on the recipients’ networks attempt Domain Name System look-ups on the nonexistent domain, causing delays and timeouts on the DNS servers and backups in SMTP message queues.

“Anti-spam systems have become heavily dependent on DNS for looking at all kinds of blacklists, looking at headers, all of that,” said Paul Judge, a well-known anti-spam expert and chief technology officer at CipherTrust Inc., a mail security vendor based in Atlanta. “I’ve seen systems that have to do as many as 30 DNS calls on each message. Even in large enterprises, it’s becoming very common to see a large spam load cripple the DNS infrastructure.”
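The kind of blacklist lookup Judge describes can be sketched in a few lines. A DNS blacklist (DNSBL) is queried by reversing an IPv4 address’s octets and appending the list’s zone; `zen.spamhaus.org` below is one well-known zone, used here purely as an example. Each check is a real DNS round trip, which is exactly why dozens of them per message against slow or nonexistent domains back up resolvers:

```python
import socket

def dnsbl_query(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the DNSBL query name: 1.2.3.4 -> 4.3.2.1.<zone>."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """One of the many DNS calls an anti-spam filter makes per message."""
    try:
        socket.gethostbyname(dnsbl_query(ip, zone))  # blocks on the resolver
        return True
    except socket.gaierror:  # NXDOMAIN: the address is not listed
        return False
```

Multiply that blocking call by 30 per message, as Judge describes, and a large spam run turns the local DNS infrastructure into the bottleneck.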

Spammers causing problems to DNS Read More »