
Walke describes the Battle of Island Number 10

From “Operations of the Western Flotilla” by Henry A. Walke, Commander of the Carondelet, describing the Battle of Island Number Ten:

Having received written orders from the flag-officer, under date of March 30th, I at once began to prepare the Carondelet for the ordeal. All the loose material at hand was collected, and on the 4th of April the decks were covered with it, to protect them against plunging shot. Hawsers and chain cables were placed around the pilot-house and other vulnerable parts of the vessel, and every precaution was adopted to prevent disaster. A coal-barge laden with hay and coal was lashed to the part of the port side on which there was no iron plating, to protect the magazine. And it was truly said that the old Carondelet at that time resembled a farmer’s wagon prepared for market. The engineers led the escape-steam, through the pipes aft, into the wheel-house, to avoid the puffing sound it made when blown through the smoke-stacks.

All the necessary preparations having been made, I informed the flag-officer of my intention to run the gauntlet that night, and received his approval. Colonel Buford, who commanded the land forces temporarily with the flotilla, assisted me in preparing for the trip, and on the night of the 4th brought on board Captain Hollenstein, of the Forty-second Illinois, and twenty-three sharp-shooters of his command, who volunteered their services, which were gratefully accepted. Colonel Buford remained on board until the last moment to encourage us. I informed the officers and crew of the character of the undertaking, and all expressed a readiness to make the venture. In order to resist boarding parties in case we should be disabled, the sailors were well armed, and pistols, cutlasses, muskets, boarding-pikes, and hand-grenades were within reach. Hose was attached to the boilers for throwing scalding water over any who might attempt to board. If it should be found impossible to save the vessel, it was designed to sink rather than burn her, as the loss of life would probably be greater in the latter case by the explosion of her magazine. During the afternoon there was promise of a clear, moonlight night, and it was determined to wait until the moon was down, and then to make the attempt, whatever the chances. …

At ten o’clock the moon had gone down, and the sky, the earth, and the river were alike hidden in the black shadow of a thunder-storm, which had now spread itself over all the heavens. As the time seemed favorable, I ordered the first master to cast off. Dark clouds now rose rapidly over us, and enveloped us in almost total darkness, except when the sky was lighted up by the welcome flashes of vivid lightning, to show us the perilous way we were to take. Now and then the dim outline of the landscape could be seen, and the forest bending under the roaring storm that came rushing up the river.

With our bow pointing to the island, we passed the lowest point of land without being observed, it appears, by the enemy. All speed was given to the vessel to drive her through the tempest. The flashes of lightning continued with frightful brilliancy, and “almost every second,” wrote a correspondent, “every brace, post, and outline could be seen with startling distinctness, enshrouded by a bluish white glare of light, and then her form for the next minute would become merged in the intense darkness.” When opposite Battery No. 2, on the mainland, the smoke-stacks blazed up, but the fire was soon subdued. It was caused by the soot becoming dry, as the escape-steam, which usually kept the stacks wet, had been sent into the wheel-house, as already mentioned, to prevent noise. With such vivid lightning as prevailed during the whole passage, there was no prospect of escaping the vigilance of the enemy, but there was good reason to hope that he would be unable to point his guns accurately. Again the smoke-stacks took fire, and were soon put out; and then the roar of the enemy’s guns began, and from Batteries Nos. 2, 3, and 4 came almost incessantly the sharp crack and screaming sound of their rifle-shells, which seemed to unite with the electric batteries of the clouds to annihilate us.

While nearing the island or some shoal point, during a few minutes of total darkness, we were startled by the loud, sharp order, “Hard a-port!” from our brave and skillful pilot, First Master Hoel. We almost grazed the island, and it appears were not observed through the storm until we were close in, and the enemy, having no time to point his guns, fired at random. In fact, we ran so near that the enemy did not, probably could not depress his guns sufficiently. While close under the lee of the island and during a lull in the storm and in the firing, one of our pilots heard a Confederate officer shout, “Elevate your guns!” “Yes, confound you,” said the pilot, in a much lower key, “elevate.” It is probable that the muzzles of those guns had been depressed to keep the rain out of them, and the officers, not expecting another night attack in such a storm, and arriving late, ordered the guns elevated just in time to save us from the direct fire of the enemy’s heaviest fort; and this, no doubt, was the cause of our remarkable escape. Nearly all the enemy’s shot went over us.

Having passed the principal batteries, we were greatly relieved from suspense, patiently endured, however, by the officers and crew. But there was another formidable obstacle in the way — a floating battery, which was the great “war elephant” of the Confederates, built to blockade the Mississippi permanently. As we passed her she fired six or eight shots at us, but without effect. One ball struck the coal-barge and one was found in a bale of hay; we found also one or two musket-bullets. We arrived at New Madrid about midnight with no one hurt, and were most joyfully received by our army. At the suggestion of Paymaster Nixon, all hands “spliced the main brace.”


Flat local calling rates in US helped grow the Net

From Andrew Odlyzko’s “Pricing and Architecture of the Internet: Historical Perspectives from Telecommunications and Transportation”:

Moreover, flat rates for local calling played a key role in the rise of the Internet, by promoting much faster spread of this technology in the U.S. than in other countries. (This, as well as the FCC decisions about keeping Internet calls free from access charges, should surely be added to the list of “the 10 key choices that were critical to the Net’s success,” that were compiled by Scott Bradner [28].)


Terrorist social networks

From Technology Review’s “Terror’s Server”:

For example, research suggests that people with nefarious intent tend to exhibit distinct patterns in their use of e-mails or online forums like chat rooms. Whereas most people establish a wide variety of contacts over time, those engaged in plotting a crime tend to keep in touch only with a very tight circle of people, says William Wallace, an operations researcher at Rensselaer Polytechnic Institute.

This phenomenon is quite predictable. “Very few groups of people communicate repeatedly only among themselves,” says Wallace. “It’s very rare; they don’t trust people outside the group to communicate. When 80 percent of communications is within a regular group, this is where we think we will find the groups who are planning activities that are malicious.” Of course, not all such groups will prove to be malicious; the odd high-school reunion will crop up. But Wallace’s group is developing an algorithm that will narrow down the field of so-called social networks to those that warrant the scrutiny of intelligence officials. The algorithm is scheduled for completion and delivery to intelligence agencies this summer. …
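The 80-percent figure Wallace mentions boils down to a simple ratio over a communication log. Below is a minimal sketch of that idea in TypeScript; the message format, the fixed 0.8 threshold, and the group definition are assumptions made for illustration, not the RPI group's actual algorithm.

```typescript
// Sketch: flag groups whose members mostly talk only to each other.
type Message = { from: string; to: string };

function withinGroupRatio(group: Set<string>, log: Message[]): number {
  const sent = log.filter(m => group.has(m.from));
  if (sent.length === 0) return 0;
  const internal = sent.filter(m => group.has(m.to)).length;
  return internal / sent.length;
}

// "When 80 percent of communications is within a regular group ..."
function looksInsular(group: Set<string>, log: Message[], threshold = 0.8): boolean {
  return withinGroupRatio(group, log) >= threshold;
}

// Three people who only ever write to one another score 1.0.
const log: Message[] = [
  { from: "a", to: "b" },
  { from: "b", to: "c" },
  { from: "c", to: "a" },
];
console.log(looksInsular(new Set(["a", "b", "c"]), log)); // true
```

As the excerpt notes, insularity alone is not evidence of malice; a real system would still need further signals to filter out the odd high-school reunion.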


When to use XML

From W3C’s “Architecture of the World Wide Web, Volume One”:

XML defines textual data formats that are naturally suited to describing data objects which are hierarchical and processed in a chosen sequence. It is widely, but not universally, applicable for data formats; an audio or video format, for example, is unlikely to be well suited to expression in XML. Design constraints that would suggest the use of XML include:

1. Requirement for a hierarchical structure.
2. Need for a wide range of tools on a variety of platforms.
3. Need for data that can outlive the applications that currently process it.
4. Ability to support internationalization in a self-describing way that makes confusion over coding options unlikely.
5. Early detection of encoding errors with no requirement to “work around” such errors.
6. A high proportion of human-readable textual content.
7. Potential composition of the data format with other XML-encoded formats.
8. Desire for data easily parsed by both humans and machines.
9. Desire for vocabularies that can be invented in a distributed manner and combined flexibly.
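As a rough illustration of constraints 1, 6, and 8, the snippet below embeds a small hierarchical, human-readable document and parses it where a DOM implementation such as the browser's DOMParser is available; the element names are invented for the example.

```typescript
// A tiny hierarchical, self-describing, human-readable document.
// Element names are made up purely for illustration.
const xml = `
<library>
  <book id="b1">
    <title>Architecture of the World Wide Web</title>
    <publisher>W3C</publisher>
  </book>
</library>`;

// Parsing it with a standard tool (DOMParser exists in browsers).
const doc = new DOMParser().parseFromString(xml, "application/xml");
const title = doc.querySelector("book > title")?.textContent;
console.log(title); // "Architecture of the World Wide Web"
```

The same document could be read by any of the wide range of tools mentioned in point 2, which is much of the format's appeal.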


Jefferson Davis, seeker after discord

From Shelby Foote’s The Civil War: Fort Sumter to Perryville (127):

Men interpreted [Jefferson Davis] as they saw him, and for the most part they considered him argumentative in the extreme, irascible, and a seeker after discord. A Richmond editor later wrote, for all to read, that Davis was “ready for any quarrel with any and everybody, at any time and all times; and the suspicion goes that rather than not have a row on hand with the enemy, he would make one with the best friend he had on earth.”


The sky-god as origin of evil

From Gore Vidal, quoted in Richard Dawkins’ “Time to Stand Up”:

The great unmentionable evil at the center of our culture is monotheism. From a barbaric Bronze Age text known as the Old Testament, three anti-human religions have evolved — Judaism, Christianity, and Islam. These are sky-god religions. They are, literally, patriarchal — God is the Omnipotent Father — hence the loathing of women for 2,000 years in those countries afflicted by the sky-god and his earthly male delegates. The sky-god is a jealous god, of course. He requires total obedience from everyone on earth, as he is not just in place for one tribe, but for all creation. Those who would reject him must be converted or killed for their own good.


Google on the Google File System (& Linux)

From Sanjay Ghemawat, Howard Gobioff, & Shun-Tak Leung’s “The Google File System”:

We have designed and implemented the Google File System, a scalable distributed file system for large distributed data-intensive applications. It provides fault tolerance while running on inexpensive commodity hardware, and it delivers high aggregate performance to a large number of clients. …

The file system has successfully met our storage needs. It is widely deployed within Google as the storage platform for the generation and processing of data used by our service as well as research and development efforts that require large data sets. The largest cluster to date provides hundreds of terabytes of storage across thousands of disks on over a thousand machines, and it is concurrently accessed by hundreds of clients. …

We have seen problems caused by application bugs, operating system bugs, human errors, and the failures of disks, memory, connectors, networking, and power supplies. Therefore, constant monitoring, error detection, fault tolerance, and automatic recovery must be integral to the system.

Second, files are huge by traditional standards. Multi-GB files are common. Each file typically contains many application objects such as web documents. When we are regularly working with fast growing data sets of many TBs comprising billions of objects, it is unwieldy to manage billions of approximately KB-sized files even when the file system could support it. As a result, design assumptions and parameters such as I/O operation and block sizes have to be revisited.

Third, most files are mutated by appending new data rather than overwriting existing data. Random writes within a file are practically non-existent. Once written, the files are only read, and often only sequentially. …
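A minimal sketch of that append-mostly workload, using Node's ordinary file API rather than anything from GFS itself (the record shape and file name are made up):

```typescript
// Append-only log: records are added at the end and read back in order.
// Illustrates the access pattern the paper describes; this is not GFS code.
import { appendFileSync, readFileSync } from "node:fs";

const LOG_FILE = "records.log"; // hypothetical file name

function appendRecord(record: object): void {
  // One JSON record per line; earlier data is never overwritten.
  appendFileSync(LOG_FILE, JSON.stringify(record) + "\n");
}

function readAllRecords(): object[] {
  // A single sequential scan from the start, the common read pattern.
  return readFileSync(LOG_FILE, "utf8")
    .split("\n")
    .filter(line => line.length > 0)
    .map(line => JSON.parse(line));
}

appendRecord({ url: "http://example.com/", fetchedAt: Date.now() });
console.log(readAllRecords().length);
```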

Multiple GFS clusters are currently deployed for different purposes. The largest ones have over 1000 storage nodes, over 300 TB of disk storage, and are heavily accessed by hundreds of clients on distinct machines on a continuous basis. …

Despite occasional problems, the availability of Linux code has helped us time and again to explore and understand system behavior. When appropriate, we improve the kernel and share the changes with the open source community.


The original description of Ajax

From Jesse James Garrett’s “Ajax: A New Approach to Web Applications”:

Ajax isn’t a technology. It’s really several technologies, each flourishing in its own right, coming together in powerful new ways. Ajax incorporates:

  • standards-based presentation using XHTML and CSS;
  • dynamic display and interaction using the Document Object Model;
  • data interchange and manipulation using XML and XSLT;
  • asynchronous data retrieval using XMLHttpRequest;
  • and JavaScript binding everything together.

The classic web application model works like this: Most user actions in the interface trigger an HTTP request back to a web server. The server does some processing — retrieving data, crunching numbers, talking to various legacy systems — and then returns an HTML page to the client. It’s a model adapted from the Web’s original use as a hypertext medium, but as fans of The Elements of User Experience know, what makes the Web good for hypertext doesn’t necessarily make it good for software applications. …

An Ajax application eliminates the start-stop-start-stop nature of interaction on the Web by introducing an intermediary — an Ajax engine — between the user and the server. It seems like adding a layer to the application would make it less responsive, but the opposite is true.

Instead of loading a webpage, at the start of the session, the browser loads an Ajax engine — written in JavaScript and usually tucked away in a hidden frame. This engine is responsible for both rendering the interface the user sees and communicating with the server on the user’s behalf. The Ajax engine allows the user’s interaction with the application to happen asynchronously — independent of communication with the server. So the user is never staring at a blank browser window and an hourglass icon, waiting around for the server to do something. …

Every user action that normally would generate an HTTP request takes the form of a JavaScript call to the Ajax engine instead. Any response to a user action that doesn’t require a trip back to the server — such as simple data validation, editing data in memory, and even some navigation — the engine handles on its own. If the engine needs something from the server in order to respond — if it’s submitting data for processing, loading additional interface code, or retrieving new data — the engine makes those requests asynchronously, usually using XML, without stalling a user’s interaction with the application.
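A minimal sketch of that flow: a user action becomes a JavaScript call into the engine, which answers locally when it can and otherwise issues an asynchronous XMLHttpRequest. The endpoint, payload shape, and element id below are invented for the example.

```typescript
// Toy "Ajax engine": handle what it can locally, otherwise ask the
// server asynchronously and update the page when the reply arrives.
function handleUserAction(field: string, value: string): void {
  // Simple validation needs no round trip to the server.
  if (value.trim() === "") {
    showStatus(field + " must not be empty");
    return;
  }

  // Anything that needs server data goes out asynchronously;
  // the user keeps interacting while the request is in flight.
  const xhr = new XMLHttpRequest();
  xhr.open("POST", "/validate", true);           // hypothetical endpoint
  xhr.setRequestHeader("Content-Type", "application/json");
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4 && xhr.status === 200) {
      showStatus(JSON.parse(xhr.responseText).message);
    }
  };
  xhr.send(JSON.stringify({ field, value }));
}

function showStatus(text: string): void {
  const el = document.getElementById("status");  // assumed status element
  if (el) el.textContent = text;
}
```

Today the same pattern would usually be written with fetch and promises, but the division of labor between engine and server is the same.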


Religion & evolution

From Salon’s “Religious belief itself is an adaptation”, an interview with Edward O. Wilson:

Religious belief itself is an adaptation that has evolved because we’re hard-wired to form tribalistic religions. Religion is intensely tribalistic. A devout Christian or Muslim doesn’t say one religion is as good as another. It gives them faith in the particular group to which they belong and that set of beliefs and moral views. …

You cannot explain the patterns of diversity in the world, the geography of life, the endless details of distribution, similarity and dissimilarity in the world, by any means except evolution. That’s the one theory that ties it together. It is very hard to see how traditionalist religious views will come to explain the meaning of life on this planet. …


Architecture & the quality without a name

From Brian Hayes’ “The Post-OOP Paradigm”:

Christopher Alexander [a bricks-and-steel architect] is known for the enigmatic thesis that well-designed buildings and towns must have “the quality without a name.” He explains: “The fact that this quality cannot be named does not mean that it is vague or imprecise. It is impossible to name because it is unerringly precise.”


A very brief history of programming

From Brian Hayes’ “The Post-OOP Paradigm”:

The architects of the earliest computer systems gave little thought to software. (The very word was still a decade in the future.) Building the machine itself was the serious intellectual challenge; converting mathematical formulas into program statements looked like a routine clerical task. The awful truth came out soon enough. Maurice V. Wilkes, who wrote what may have been the first working computer program, had his personal epiphany in 1949, when “the realization came over me with full force that a good part of the remainder of my life was going to be spent in finding errors in my own programs.” Half a century later, we’re still debugging.

The very first programs were written in pure binary notation: Both data and instructions had to be encoded in long, featureless strings of 1s and 0s. Moreover, it was up to the programmer to keep track of where everything was stored in the machine’s memory. Before you could call a subroutine, you had to calculate its address.

The technology that lifted these burdens from the programmer was assembly language, in which raw binary codes were replaced by symbols such as load, store, add, sub. The symbols were translated into binary by a program called an assembler, which also calculated addresses. This was the first of many instances in which the computer was recruited to help with its own programming.

Assembly language was a crucial early advance, but still the programmer had to keep in mind all the minutiae in the instruction set of a specific computer. Evaluating a short mathematical expression such as x²+y² might require dozens of assembly-language instructions. Higher-level languages freed the programmer to think in terms of variables and equations rather than registers and addresses. In Fortran, for example, x²+y² would be written simply as X**2+Y**2. Expressions of this kind are translated into binary form by a program called a compiler.

… By the 1960s large software projects were notorious for being late, overbudget and buggy; soon came the appalling news that the cost of software was overtaking that of hardware. Frederick P. Brooks, Jr., who managed the OS/360 software program at IBM, called large-system programming a “tar pit” and remarked, “Everyone seems to have been surprised by the stickiness of the problem.”

One response to this crisis was structured programming, a reform movement whose manifesto was Edsger W. Dijkstra’s brief letter to the editor titled “Go to statement considered harmful.” Structured programs were to be built out of subunits that have a single entrance point and a single exit (eschewing the goto command, which allows jumps into or out of the middle of a routine). Three such constructs were recommended: sequencing (do A, then B, then C), alternation (either do A or do B) and iteration (repeat A until some condition is satisfied). Corrado Böhm and Giuseppe Jacopini proved that these three idioms are sufficient to express essentially all programs.
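As a toy illustration (mine, not from Hayes' article), the three idioms look like this in a modern language; a routine built only from them has one entrance and one exit.

```typescript
// Sequencing, alternation, and iteration, with no goto anywhere.
function countPositives(numbers: number[]): number {
  let positives = 0;              // sequencing: one statement after another
  for (const n of numbers) {      // iteration: repeat until the list is exhausted
    if (n > 0) {                  // alternation: either do this...
      positives += 1;
    }                             // ...or fall through and do nothing
  }
  return positives;               // single exit point
}

console.log(countPositives([3, -1, 4])); // 2
```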

Structured programming came packaged with a number of related principles and imperatives. Top-down design and stepwise refinement urged the programmer to set forth the broad outlines of a procedure first and only later fill in the details. Modularity called for self-contained units with simple interfaces between them. Encapsulation, or data hiding, required that the internal workings of a module be kept private, so that later changes to the module would not affect other areas of the program. All of these ideas have proved their worth and remain a part of software practice today. But they did not rescue programmers from the tar pit.

Object-oriented programming addresses these issues by packing both data and procedures—both nouns and verbs—into a single object. An object named triangle would have inside it some data structure representing a three-sided shape, but it would also include the procedures (called methods in this context) for acting on the data. To rotate a triangle, you send a message to the triangle object, telling it to rotate itself. Sending and receiving messages is the only way objects communicate with one another; outsiders are not allowed direct access to the data. Because only the object’s own methods know about the internal data structures, it’s easier to keep them in sync.

You define the class triangle just once; individual triangles are created as instances of the class. A mechanism called inheritance takes this idea a step further. You might define a more-general class polygon, which would have triangle as a subclass, along with other subclasses such as quadrilateral, pentagon and hexagon. Some methods would be common to all polygons; one example is the calculation of perimeter, which can be done by adding the lengths of the sides, no matter how many sides there are. If you define the method calculate-perimeter in the class polygon, all the subclasses inherit this code.
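A compact sketch of both ideas, the triangle example and the inherited perimeter calculation, in a modern object-oriented language; the coordinate representation is an assumption, since Hayes gives none.

```typescript
// Data and the procedures that act on it are packed into one object.
class Polygon {
  // Encapsulated: outsiders reach the vertices only through methods.
  constructor(protected vertices: Array<[number, number]>) {}

  // Defined once here, inherited by every subclass, however many sides.
  perimeter(): number {
    let total = 0;
    for (let i = 0; i < this.vertices.length; i++) {
      const [x1, y1] = this.vertices[i];
      const [x2, y2] = this.vertices[(i + 1) % this.vertices.length];
      total += Math.hypot(x2 - x1, y2 - y1);
    }
    return total;
  }
}

class Triangle extends Polygon {
  // "Send a message to the triangle object, telling it to rotate itself."
  rotate(radians: number): void {
    this.vertices = this.vertices.map(([x, y]): [number, number] => [
      x * Math.cos(radians) - y * Math.sin(radians),
      x * Math.sin(radians) + y * Math.cos(radians),
    ]);
  }
}

const t = new Triangle([[0, 0], [3, 0], [0, 4]]);
t.rotate(Math.PI / 2);
console.log(t.perimeter()); // 12, computed by code inherited from Polygon
```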


Intel: anyone can challenge anyone

From FORTUNE’s “Lessons in Leadership: The Education of Andy Grove”:

[Intel CEO Andy] Grove had never been one to rely on others’ interpretations of reality. … At Intel he fostered a culture in which “knowledge power” would trump “position power.” Anyone could challenge anyone else’s idea, so long as it was about the idea and not the person–and so long as you were ready for the demand “Prove it.” That required data. Without data, an idea was only a story–a representation of reality and thus subject to distortion.


Intel’s ups and downs

From FORTUNE’s “Lessons in Leadership: The Education of Andy Grove”:

By 1983, when Grove distilled much of his thinking in his book High Output Management (still a worthwhile read), he was president of a fast-growing $1.1-billion-a-year corporation, a leading maker of memory chips, whose CEO was Gordon Moore. … What Moore’s Law did not and could not predict was that Japanese firms, too, might master this process and turn memory chips into a commodity. …

Intel kept denying the cliff ahead until its profits went over the edge, plummeting from $198 million in 1984 to less than $2 million in 1985. It was in the middle of this crisis, when many managers would have obsessed about specifics, that Grove stepped outside himself. He and Moore had been agonizing over their dilemma for weeks, he recounts in Only the Paranoid Survive, when something happened: “I looked out the window at the Ferris wheel of the Great America amusement park revolving in the distance when I turned back to Gordon, and I asked, ‘If we got kicked out and the board brought in a new CEO, what do you think he would do?’ Gordon answered without hesitation, ‘He would get us out of memories.’ I stared at him, numb, then said, ‘Why shouldn’t you and I walk out the door, come back, and do it ourselves?'”

… once IBM chose Intel’s microprocessor to be the chip at the heart of its PCs, demand began to explode. Even so, the shift from memory chips was brutally hard–in 1986, Intel fired some 8,000 people and lost more than $180 million on $1.3 billion in sales–the only loss the company has ever posted since its early days as a startup.


Unpatched Linux, 3 months; unpatched Windows, 20 minutes

From Bruce Schneier’s “Linux Security”:

I’m a big fan of the Honeynet Project … Basically, they wire computers up with sensors, put them on the Internet, and watch hackers attack them.

They just released a report about the security of Linux:

Recent data from our honeynet sensor grid reveals that the average life expectancy to compromise for an unpatched Linux system has increased from 72 hours to 3 months. …

This is much greater than that of Windows systems, which have average life expectancies on the order of a few minutes.

… That’s the real story: the hackers aren’t bothering with Linux. Two years ago, a vulnerable Linux system would be hacked in less than three days; now it takes three months.

Why? My guess is a combination of two reasons. One, Linux is that much more secure than Windows. Two, the bad guys are focusing on Windows — more bang for the buck.


Four principles of modernity

From “Relativity, Uncertainty, Incompleteness and Undecidability”:

In this article four fundamental principles are presented: relativity, uncertainty, incompleteness and undecidability. They were studied by, respectively, Albert Einstein, Werner Heisenberg, Kurt Gödel and Alan Turing. …

Relativity says that there is no privileged, “objective” viewpoint for certain observations. … Now, if things move relative to each other, then obviously their positions at a given time are also measured relative to each other. …

Werner Heisenberg showed that if we built a machine to tell us with high precision where an electron is, this machine could not also tell us the speed of the electron. If we want to measure its speed without altering it, we can use a different kind of light, but then we wouldn’t know where it is. At the atomic scale, no instrument can tell us at the same time exactly where a particle is and exactly at what speed it is moving. …
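In its standard quantitative form (which the article does not spell out), the trade-off reads Δx · Δp ≥ ħ/2: the product of the uncertainties in position and momentum can never drop below a fixed constant, so pinning one down precisely forces the other to spread.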

If this system is complete, then anything that is true is provable. Similarly, anything false is provably false. Kurt Gödel had the intuition that traditional mathematical logic was not complete, and devoted several years to trying to find one thing, a single thing, that was inside mathematics but outside the reach of logic. … Gödel’s incompleteness means that the classical deductive system of mathematical logic, and indeed any consistent and sufficiently expressive logical system, is not complete: it has “holes” full of expressions that are neither logically true nor false. …

Turing’s halting problem is one of the problems that fall into the category of undecidable problems. It says that it is not possible to write a program that decides whether another program is correctly written, in the sense that it will never hang. This places a limit on the verification of all programs, since every attempt to build actual computers that are usable in practice and different from Turing machines has proved to be equivalent in power and limitations to the basic Turing machine.
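Turing's argument can be sketched in a few lines of code: assume some function halts(program, input) always answers correctly, then build a program that does the opposite of whatever it predicts about itself. The names below are hypothetical; this compiles as TypeScript but is a proof sketch, not something to run.

```typescript
// Assume, for contradiction, that a perfect oracle existed.
declare function halts(program: string, input: string): boolean;

// A contrary program: do the opposite of what halts() predicts
// when a program is run on its own source code.
function contrary(source: string): void {
  if (halts(source, source)) {
    while (true) { /* loop forever */ }
  }
  // otherwise: return immediately, i.e. halt
}

// Ask halts() about contrary applied to its own source:
//  - if it answers "halts", contrary loops forever;
//  - if it answers "loops", contrary halts at once.
// Either way the oracle is wrong, so no such halts() can be written.
```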


The incompetent don’t know it

From “Unskilled and Unaware of It”:

It seems that the reason for this phenomenon is obvious: The more incompetent someone is in a particular area, the less qualified that person is to assess anyone’s skill in that space, including their own. When one fails to recognize that he or she has performed poorly, the individual is left assuming that they have performed well. As a result, the incompetent will tend to grossly overestimate their skills and abilities. A few years ago, two men from the Department of Psychology at Cornell University made an effort to determine just how profoundly one misoverestimates one’s own skills in relation to one’s actual abilities. They made four predictions, and executed four studies.

Justin Kruger and David Dunning made the following predictions before beginning their investigation:

  • Incompetent individuals, compared with their more competent peers, will dramatically overestimate their ability and performance relative to objective criteria.
  • Incompetent individuals will suffer from deficient metacognitive skills, in that they will be less able than their more competent peers to recognize competence when they see it–be it their own or anyone else’s.
  • Incompetent individuals will be less able than their more competent peers to gain insight into their true level of performance by means of social comparison information. In particular, because of their difficulty recognizing competence in others, incompetent individuals will be unable to use information about the choices and performances of others to form more accurate impressions of their own ability.
  • The incompetent can gain insight about their shortcomings, but this comes (paradoxically) by making them more competent, thus providing them the metacognitive skills necessary to be able to realize that they have performed poorly.

… In short, the study showed that the researchers’ predictions were spot-on. …

Also interestingly, the top performers tended to underestimate their own performance compared to their peers. The researchers found that those participants fell prey to the false-consensus effect, a phenomenon where one assumes that one’s peers are performing at least as well as oneself when given no evidence to the contrary.


Interesting way to acquire someone’s signature

From Simson Garfinkel’s “Absolute Identification”, chapter 3 of Database Nation:

Already, the United Parcel Service, the nation’s largest package delivery service, is also the nation’s leader in biometric piracy. For most packages, UPS requires that a signature be written to serve as proof of delivery. In 1987, UPS started scanning the pen-and-ink signatures recorded for each package delivery. These images were stored in a database and faxed to any person who called UPS’s 800 number and asked for a ‘proof of delivery’ receipt. In 1990, UPS improved its piracy technology by equipping its drivers with portable electronic computers called DIADs (Delivery Information Acquisition Devices). Each computer has a built-in bar code reader and a signature pad. When a delivery is made, the UPS driver scans the bar code on each package and then has the person receiving the delivery sign for the package. The bar code number and the handwritten signature are recorded inside the DIAD, and ultimately uploaded to the company’s databanks.

The push to make signatures available in electronic form came from UPS customers, Pat Steffen, a spokesperson for UPS, told me when I called the company to complain about the practices. Signatures are considered proof of delivery. Digitizing that proof allows UPS to manipulate it like any other digital data. The faxed proof-of-delivery certificates are sent automatically from UPS computers, she explained. It’s also possible for UPS customers to download tracking software and view the signatures directly on their personal computers.

Ironically, by making a person’s written signature widely available, UPS is helping to dilute the written signature’s very value. Once the signature is digitized, it’s easy to manipulate it further with a computer–for example, you can paste it at the bottom of a contract. UPS’s system is particularly vulnerable: any package can be tracked as long as you know the package’s airbill, and UPS issues its preprinted airbills in sequential order–for example, ‘0930 8164 904,’ ‘0930 8164 913,’ and ‘0930 8164 922.’ An attacker can easily learn a company’s UPS airbill, use that airbill to obtain a comprehensive list of every delivery recipient–and then make a copy of every recipient’s signature.

UPS understands the vulnerability, but it can’t address the problem very well. A note on the company’s web site says:

UPS authorizes you to use UPS tracking systems solely to track shipments tendered by or for you to UPS for delivery and for no other purpose. Any other use of UPS tracking systems and information is strictly prohibited.

But, realistically speaking, UPS can do little to prevent this kind of attack. ‘If someone wants to go out of their way to get package numbers, it can be done. If someone wants to go out of their way to do anything, I suppose that’s possible. It is not an easy thing to do,’ said Steffen. Guessing would be harder, of course, if UPS used longer airbill numbers and didn’t issue them in a predictable sequence.
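The fix Garfinkel alludes to, longer identifiers issued in no predictable order, is easy to sketch; the format below is invented and is not UPS's actual numbering scheme.

```typescript
// Hard-to-guess tracking identifiers instead of a predictable sequence.
// Purely illustrative; not UPS's real format.
import { randomBytes } from "node:crypto";

function newTrackingId(): string {
  // 10 random bytes -> 20 hex characters; the neighbors of a known number
  // reveal nothing, unlike sequentially issued airbills.
  return randomBytes(10).toString("hex").toUpperCase();
}

console.log(newTrackingId()); // different on every call
```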


An interesting way to look at DRM

From “The Big DRM Mistake?”:

Fundamentally, DRM is about persistent access control – it is a term for a set of technologies that allow data to be protected beyond the file system of the original machine. Thus, for example, the read/write/execute access control on most *nix file systems would apply not only to the original machine but to all machines.

Stated in these terms, I agree with the aims of DRM. However, it is the ways in which large media and software businesses have mis-applied DRM that have ruined the associations most users have with the technology.


What is a socio-technical system?

From “Why a Socio-Technical System?”:

You have divined by now that a socio-technical system is a mixture of people and technology. It is, in fact, a much more complex mixture. Below, we outline many of the items that may be found in an STS. In the notes, we will make the case that many of the individual items of a socio-technical system are difficult to distinguish from each other because of their close inter-relationships.

Socio-technical systems include:

Hardware. Mainframes, workstations, peripherals, connecting networks. This is the classic meaning of technology. It is hard to imagine a socio-technical system without some hardware component (though we welcome suggestions). In our above examples, the hardware is the microcomputers and their connecting wires, hubs, routers, etc.

Software. Operating systems, utilities, application programs, specialized code. It is getting increasingly hard to tell the difference between software and hardware, but we expect that software is likely to be an integral part of any socio-technical system. Software (and by implication, hardware too) often incorporates social rules and organizational procedures as part of its design (e.g. optimize these parameters, ask for these data, store the data in these formats, etc.). Thus, software can serve as a stand-in for some of the factors listed below, and the incorporation of social rules into the technology can make these rules harder to see and harder to change. In the examples above, much of the software is likely to change from the emergency room to the elementary school. The software that does not change (e.g. the operating system) may have been designed more with one socio-technical system in mind (e.g. Unix was designed with an academic socio-technical system in mind). The re-use of this software in a different socio-technical system may cause problems of mismatch.

Physical surroundings. Buildings also influence and embody social rules, and their design can affect the ways that a technology is used. The manager’s office that is protected by a secretary’s office is one example; the large office suite with no walls is another. The physical environment of the military supplier and the elementary school are likely to be quite different, and some security issues may be handled by this physical environment rather than by the technology. Moving a technology that assumes one physical environment into a different one may cause mismatch problems.

People. Individuals, groups, roles (support, training, management, line personnel, engineer, etc.), agencies. Note that we list here not just people (e.g. Mr. Jones) but roles (Mr. Jones, head of quality assurance), groups (Management staff in Quality Assurance) and agencies (The Department of Defense). In addition to his role as head of quality assurance, Mr. Jones may also have other roles (e.g. a teacher, a professional electrical engineer, etc.). The person in charge of the microcomputers in our example above may have very different roles in the different socio-technical systems, and these different roles will bring with them different responsibilities and ethical issues. Software and hardware designed assuming the kind of support one would find in a university environment may not match well with an elementary school or emergency room environment.

Procedures. Official and actual procedures, management models, reporting relationships, documentation requirements, data flow, rules & norms. Procedures describe the way things are done in an organization (or at least the official line regarding how they ought to be done). Both the official rules and their actual implementation are important in understanding a socio-technical system. In addition, there are norms about how things are done that allow organizations to work. These norms may not be specified (indeed, it might be counter-productive to specify them). But those who understand them know how to, for instance, make complaints, get a questionable part passed, and find answers to technical questions. Procedures are prime candidates to be encoded in software design.

Laws and regulations. These also are procedures like those above, but they carry special societal sanctions if the violators are caught. They might be laws regarding the protection of privacy, or regulations about the testing of chips in military use. These societal laws and regulations might be in conflict with internal procedures and rules. For instance, some companies have implicit expectations that employees will share (and probably copy) commercial software. Obviously these illegal expectations cannot be made explicit, but they can be made known.

Data and data structures. What data are collected, how they are archived, to whom they are made available, and the formats in which they are stored are all decisions that go into the design of a socio-technical system. Data archiving in an emergency room will be quite different from that in an insurance company, and will be subject to different ethical issues too.


Create web sites with PONUR

From “Dive Into Mark”:

Web Content Accessibility Guidelines 2.0 working draft from April 24, 2002. Only 3 weeks old!

The overall goal is to create Web content that is Perceivable, Operable, Navigable, and Understandable by the broadest possible range of users and compatible with their wide range of assistive technologies, now and in the future.

  1. Perceivable. Ensure that all content can be presented in form(s) that can be perceived by any user – except those aspects of the content that cannot be expressed in words.
  2. Operable. Ensure that the interface elements in the content are operable by any user.
  3. Orientation/Navigation. Facilitate content orientation and navigation
  4. Comprehendible. Make it as easy as possible to understand the content and controls.
  5. Technology Robust. Use Web technologies that maximize the ability of the content to work with current and future accessibility technologies and user agents.

I like that: perceivable, operable, navigable, understandable, and robust. That deserves to become a new acronym, PONUR.
