technology

1% create, 10% comment, 89% just use

From Charles Arthur’s “What is the 1% rule?” (Guardian Unlimited: 20 July 2006):

It’s an emerging rule of thumb that suggests that if you get a group of 100 people online then one will create content, 10 will “interact” with it (commenting or offering improvements) and the other 89 will just view it.

It’s a meme that emerges strongly in statistics from YouTube, which in just 18 months has gone from zero to 60% of all online video viewing.

The numbers are revealing: each day there are 100 million downloads and 65,000 uploads – which as Antony Mayfield (at http://open.typepad.com/open) points out, is 1,538 downloads per upload – and 20m unique users per month.

That puts the “creator to consumer” ratio at just 0.5%, but it’s early days yet …
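
A quick, hedged check of the arithmetic above (the variable names are mine; the article does not spell out how its 0.5% figure is derived):

```python
# Sanity-checking the YouTube figures quoted above.
daily_views = 100_000_000    # "100 million downloads" per day
daily_uploads = 65_000       # uploads per day

views_per_upload = daily_views / daily_uploads
print(f"views per upload: {views_per_upload:,.0f}")           # ~1,538, Mayfield's figure

# Uploads as a share of daily viewing activity is well under 1%; the article's
# 0.5% "creator to consumer" figure rests on a comparison it does not spell out.
print(f"uploads per view: {daily_uploads / daily_views:.3%}")  # ~0.065%
```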

50% of all Wikipedia article edits are done by 0.7% of users, and more than 70% of all articles have been written by just 1.8% of all users, according to the Church of the Customer blog (http://customerevangelists.typepad.com/blog/).

Earlier metrics garnered from community sites suggested that about 80% of content was produced by 20% of the users, but the growing number of data points is creating a clearer picture of how Web 2.0 groups need to think. For instance, a site that demands too much interaction and content generation from users will see nine out of 10 people just pass by.

Bradley Horowitz of Yahoo points out that much the same applies at Yahoo: in Yahoo Groups, the discussion lists, “1% of the user population might start a group; 10% of the user population might participate actively, and actually author content, whether starting a thread or responding to a thread-in-progress; 100% of the user population benefits from the activities of the above groups,” he noted on his blog (www.elatable.com/blog/?p=5) in February.

Open source turns software into a service industry

From Eric Steven Raymond’s “Problems in the Environment of Unix” (The Art of Unix Programming: 19 September 2003):

It’s not necessarily going to be an easy transition. Open source turns software into a service industry. Service-provider firms (think of medical and legal practices) can’t be scaled up by injecting more capital into them; those that try only scale up their fixed costs, overshoot their revenue base, and starve to death. The choices come down to singing for your supper (getting paid through tips and donations), running a corner shop (a small, low-overhead service business), or finding a wealthy patron (some large firm that needs to use and modify open-source software for its business purposes).

Differences between Macintosh & Unix programmers

From Eric Steven Raymond’s “Problems in the Environment of Unix” (The Art of Unix Programming: 19 September 2003):

Macintosh programmers are all about the user experience. They’re architects and decorators. They design from the outside in, asking first “What kind of interaction do we want to support?” and then building the application logic behind it to meet the demands of the user-interface design. This leads to programs that are very pretty and infrastructure that is weak and rickety. In one notorious example, as late as Release 9 the MacOS memory manager sometimes required the user to manually deallocate memory by manually chucking out exited but still-resident programs. Unix people are viscerally revolted by this kind of mal-design; they don’t understand how Macintosh people could live with it.

By contrast, Unix people are all about infrastructure. We are plumbers and stonemasons. We design from the inside out, building mighty engines to solve abstractly defined problems (like “How do we get reliable packet-stream delivery from point A to point B over unreliable hardware and links?”). We then wrap thin and often profoundly ugly interfaces around the engines. The commands date(1), find(1), and ed(1) are notorious examples, but there are hundreds of others. Macintosh people are viscerally revolted by this kind of mal-design; they don’t understand how Unix people can live with it. …

In many ways this kind of parochialism has served us well. We are the keepers of the Internet and the World Wide Web. Our software and our traditions dominate serious computing, the applications where 24/7 reliability and minimal downtime is a must. We really are extremely good at building solid infrastructure; not perfect by any means, but there is no other software technical culture that has anywhere close to our track record, and it is one to be proud of. …

To non-technical end users, the software we build tends to be either bewildering and incomprehensible, or clumsy and condescending, or both at the same time. Even when we try to do the user-friendliness thing as earnestly as possible, we’re woefully inconsistent at it. Many of the attitudes and reflexes we’ve inherited from old-school Unix are just wrong for the job. Even when we want to listen to and help Aunt Tillie, we don’t know how — we project our categories and our concerns onto her and give her ‘solutions’ that she finds as daunting as her problems.

The first movie theater

From Adam Goodheart’s “10 Days That Changed History” (The New York Times: 2 July 2006):

APRIL 16, 1902: The Movies

Motion pictures seemed destined to become a passing fad. Only a few years after Edison’s first crude newsreels were screened — mostly in penny arcades, alongside carnival games and other cheap attractions — the novelty had worn off, and Americans were flocking back to live vaudeville.

Then, in spring 1902, Thomas L. Tally opened his Electric Theater in Los Angeles, a radical new venture devoted to movies and other high-tech devices of the era, like audio recordings.

“Tally was the first person to offer a modern multimedia entertainment experience to the American public,” says the film historian Marc Wanamaker. Before long, his successful movie palace produced imitators nationally, which would become known as “nickelodeons.”

The date Silicon Valley (& Intel) was born

From Adam Goodheart’s “10 Days That Changed History” (The New York Times: 2 July 2006):

SEPT. 18, 1957: Revolt of the Nerds

Fed up with their boss, eight lab workers walked off the job on this day in Mountain View, Calif. Their employer, William Shockley, had decided not to continue research into silicon-based semiconductors; frustrated, they decided to undertake the work on their own. The researchers — who would become known as “the traitorous eight” — went on to invent the microchip (and to found Intel, among other companies). “Sept. 18 was the birth date of Silicon Valley, of the electronics industry and of the entire digital age,” says Mr. Shockley’s biographer, Joel Shurkin.

The origin of broadcast journalism

From Nicholas Lemann’s “The Murrow Doctrine” (The New Yorker: 23 & 30 January 2006: 38-43):

There is a memorable entry in William Shirer’s Berlin Diary in which he describes – as, in effect, something that happened at work one day – the birth of broadcast journalism. It was Sunday, March 13, 1938, the day after Nazi troops entered Austria. Shirer, in London, got a call from CBS headquarters, in New York, asking him to put together a broadcast in which radio correspondents in the major capitals of Europe, led by Shirer’s boss, Edward R. Murrow, who was on the scene in Vienna, would offer a series of live reports on Hitler’s move and the reaction to it.

DRM converts copyrights into trade secrets

From Mark Sableman’s “Copyright reformers pose tough questions” (St. Louis Journalism Review: June 2005):

It goes by the name “digital rights management” – the effort, already very successful, to give content owners the right to lock down their works technologically. It is what Washington University law professor Charles McManis has characterized as attaching absolute “trade secret” property-type rights to the content formerly subject to the copyright balance between private rights and public use.

The birth of Geology & gradualism as a paradigm shift from catastrophism

From Kim Stanley Robinson’s “Imagining Abrupt Climate Change: Terraforming Earth” (Amazon Shorts: 31 July 2005):

This view, by the way, was in keeping with a larger and older paradigm called gradualism, the result of a dramatic and controversial paradigm shift of its own from the nineteenth century, one that is still a contested part of our culture wars, having to do with the birth of geology as a field, and its discovery of the immense age of the Earth. Before that, Earth’s history tended to be explained in a kind of Biblical paradigm, in which the Earth was understood to be several thousand years old, because of genealogies in the Bible, so that landscape features tended to be explained by events like Noah’s flood. This kind of “catastrophism” paradigm was what led Josiah Whitney to maintain that Yosemite Valley must have been formed by a cataclysmic earthquake, for instance; there simply hadn’t been time for water and ice to have carved something as hard as granite. It was John Muir who made the gradualist argument for glacial action over millions of years; and the eventual acceptance of his explanation was part of the general shift to gradualist explanations for Earth’s landforms, which also meant there was enough time for evolution to have taken place. Gradualism also led by extension to thinking that the various climate regimes of the past had also come about fairly gradually.

Why software is difficult to create … & will always be difficult

From Frederick P. Brooks, Jr.’s “No Silver Bullet: Essence and Accidents of Software Engineering” (Computer: Vol. 20, No. 4 [April 1987] pp. 10-19):

Of all the monsters that fill the nightmares of our folklore, none terrify more than werewolves, because they transform unexpectedly from the familiar into horrors. For these, one seeks bullets of silver that can magically lay them to rest. The familiar software project, at least as seen by the nontechnical manager, has something of this character; it is usually innocent and straightforward, but is capable of becoming a monster of missed schedules, blown budgets, and flawed products. So we hear desperate cries for a silver bullet–something to make software costs drop as rapidly as computer hardware costs do.

But, as we look to the horizon of a decade hence, we see no silver bullet. There is no single development, in either technology or in management technique, that by itself promises even one order-of-magnitude improvement in productivity, in reliability, in simplicity. …

The essence of a software entity is a construct of interlocking concepts: data sets, relationships among data items, algorithms, and invocations of functions. This essence is abstract in that such a conceptual construct is the same under many different representations. It is nonetheless highly precise and richly detailed.

I believe the hard part of building software to be the specification, design, and testing of this conceptual construct, not the labor of representing it and testing the fidelity of the representation. We still make syntax errors, to be sure; but they are fuzz compared with the conceptual errors in most systems. …

Let us consider the inherent properties of this irreducible essence of modern software systems: complexity, conformity, changeability, and invisibility.

Complexity. Software entities are more complex for their size than perhaps any other human construct because no two parts are alike (at least above the statement level). …

Many of the classic problems of developing software products derive from this essential complexity and its nonlinear increases with size. From the complexity comes the difficulty of communication among team members, which leads to product flaws, cost overruns, schedule delays. From the complexity comes the difficulty of enumerating, much less understanding, all the possible states of the program, and from that comes the unreliability. From complexity of function comes the difficulty of invoking function, which makes programs hard to use. From complexity of structure comes the difficulty of extending programs to new functions without creating side effects. From complexity of structure come the unvisualized states that constitute security trapdoors.

Not only technical problems, but management problems as well come from the complexity. It makes overview hard, thus impeding conceptual integrity. It makes it hard to find and control all the loose ends. It creates the tremendous learning and understanding burden that makes personnel turnover a disaster.
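
Brooks’s point about nonlinear growth is easy to make concrete. A minimal sketch (my own illustration, not from the paper): the number of potential pairwise interactions among components grows quadratically with their count, and the number of distinct states of a set of independent on/off flags grows exponentially, so doubling a system’s size far more than doubles what must be understood and tested.

```python
# Back-of-the-envelope illustration of nonlinear growth (not from Brooks's paper):
# potential pairwise interactions grow quadratically with the number of parts,
# and the state space of n independent on/off flags grows as 2**n.
from math import comb

for parts in (10, 20, 40, 80):
    interactions = comb(parts, 2)
    states = 2 ** parts
    print(f"{parts:>3} parts -> {interactions:>5} pairwise interactions, {states:.2e} states")
```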

Conformity. … No such faith comforts the software engineer. Much of the complexity that he must master is arbitrary complexity, forced without rhyme or reason by the many human institutions and systems to which his interfaces must conform. …

Changeability. … All successful software gets changed. Two processes are at work. First, as a software product is found to be useful, people try it in new cases at the edge of or beyond the original domain. The pressures for extended function come chiefly from users who like the basic function and invent new uses for it.

Second, successful software survives beyond the normal life of the machine vehicle for which it is first written. If not new computers, then at least new disks, new displays, new printers come along; and the software must be conformed to its new vehicles of opportunity. …

Invisibility. Software is invisible and unvisualizable. …

The reality of software is not inherently embedded in space. Hence, it has no ready geometric representation in the way that land has maps, silicon chips have diagrams, computers have connectivity schematics. As soon as we attempt to diagram software structure, we find it to constitute not one, but several, general directed graphs superimposed one upon another. The several graphs may represent the flow of control, the flow of data, patterns of dependency, time sequence, name-space relationships. These graphs are usually not even planar, much less hierarchical. …
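
Brooks’s “several directed graphs superimposed” is easy to picture with even a toy program. A minimal sketch (the module names and edges are invented for illustration): the same set of nodes carries different edge sets for control flow, data flow, and build-time dependency, and no single two-dimensional drawing captures all of them at once.

```python
# Toy illustration: one small program, three different directed graphs over the same nodes.
# (The modules and edges are invented; the point is that the edge sets disagree.)
modules = {"parser", "validator", "store", "reporter"}

control_flow = {          # who calls whom at run time
    "parser": {"validator"},
    "validator": {"store", "reporter"},
    "store": set(),
    "reporter": set(),
}
data_flow = {             # where data values travel
    "parser": {"store"},
    "store": {"reporter"},
    "validator": set(),
    "reporter": set(),
}
build_deps = {            # what must be built/imported before what
    "reporter": {"store", "parser"},
    "validator": {"parser"},
    "store": set(),
    "parser": set(),
}

for name, graph in [("control flow", control_flow),
                    ("data flow", data_flow),
                    ("build deps", build_deps)]:
    edges = sorted((a, b) for a, targets in graph.items() for b in targets)
    print(f"{name:12}: {edges}")
```

Each view is a legitimate diagram of the same software, which is exactly why no single “blueprint” of it exists.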

Past Breakthroughs Solved Accidental Difficulties

If we examine the three steps in software technology development that have been most fruitful in the past, we discover that each attacked a different major difficulty in building software, but that those difficulties have been accidental, not essential, difficulties. …

High-level languages. Surely the most powerful stroke for software productivity, reliability, and simplicity has been the progressive use of high-level languages for programming. …

What does a high-level language accomplish? It frees a program from much of its accidental complexity. …
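
A small illustration of what “accidental complexity” means in practice (the example is mine, not Brooks’s): the essential idea below is “the average of the readings,” but the first version drags in index bookkeeping and manual accumulation that have nothing to do with that idea, while the second states it almost directly.

```python
readings = [98.6, 99.1, 97.8, 100.2]

# Low-level style: the essential idea (an average) is buried in bookkeeping.
total = 0.0
i = 0
while i < len(readings):
    total = total + readings[i]
    i = i + 1
average = total / len(readings)

# Higher-level style: the accidental detail disappears, only the concept remains.
average = sum(readings) / len(readings)
print(average)
```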

Time-sharing. Time-sharing brought a major improvement in the productivity of programmers and in the quality of their product, although not so large as that brought by high-level languages.

Time-sharing attacks a quite different difficulty. Time-sharing preserves immediacy, and hence enables one to maintain an overview of complexity. …

Unified programming environments. Unix and Interlisp, the first integrated programming environments to come into widespread use, seem to have improved productivity by integral factors. Why?

They attack the accidental difficulties that result from using individual programs together, by providing integrated libraries, unified file formats, and pipes and filters. As a result, conceptual structures that in principle could always call, feed, and use one another can indeed easily do so in practice.
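
The pipes-and-filters idea is simple to sketch outside the shell as well. A minimal illustration (my own, using generator functions as stand-ins for small Unix tools): each filter does one job, consumes a stream, and produces a stream, so independently written pieces compose without knowing about one another.

```python
# Pipes-and-filters in miniature: small independent stages composed into a pipeline.
def read_lines(text):
    for line in text.splitlines():
        yield line

def grep(pattern, lines):
    for line in lines:
        if pattern in line:
            yield line

def to_upper(lines):
    for line in lines:
        yield line.upper()

log = "ok: started\nerror: disk full\nok: retry\nerror: timeout"
pipeline = to_upper(grep("error", read_lines(log)))   # read | grep error | upper
for line in pipeline:
    print(line)
```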

Just how big is YouTube?

From Reuters’s “YouTube serves up 100 mln videos a day” (16 July 2006):

YouTube, the leader in Internet video search, said on Sunday viewers are now watching more than 100 million videos per day on its site, marking the surge in demand for its “snack-sized” video fare.

Since springing from out of nowhere late last year, YouTube has come to hold the leading position in online video with 29 percent of the U.S. multimedia entertainment market, according to the latest weekly data from Web measurement site Hitwise.

YouTube videos account for 60 percent of all videos watched online, the company said. …

In June, 2.5 billion videos were watched on YouTube, which is based in San Mateo, California and has just over 30 employees. More than 65,000 videos are now uploaded daily to YouTube, up from around 50,000 in May, the company said.

YouTube boasts nearly 20 million unique users per month, according to Nielsen//NetRatings, another Internet audience measurement firm.

What kinds of spam are effective?

From Alex Mindlin’s “Seems Somebody Is Clicking on That Spam” (The New York Times: 3 July 2006):

Spam messages promoting pornography are 280 times as effective in getting recipients to click on them as messages advertising pharmacy drugs, which are the next most effective type of spam.

The third most successful variety is spam advertising Rolex watches, 0.0075 percent of which get clicked on, according to an analysis by CipherTrust, a large manufacturer of devices that protect networks from spam and viruses.

Ban USB devices or glue USB ports shut

From AAP’s “Computers ‘glued’ to protect data” (News.com.au: 4 July 2006):

A rise in the level of corporate data theft has spurred some companies to take measures to stop rogue employees sneaking corporate data out of the workplace on memory sticks, iPods and mobile phones, The Australian Financial Review reported.

Rising data theft has prompted a number of companies to ban portable storage devices – such as the ubiquitous memory stick – that can be plugged into computers to download files from one machine and transfer to another. …

“We have heard of at least one case where a company took steps to disable USB ports on their PCs with superglue,” SurfControl Australia’s managing director, Charles Heunemann, said.

NSA spying: Project Shamrock & Echelon

From Kim Zetter’s “The NSA is on the line — all of them” (Salon: 15 May 2006):

As fireworks showered New York Harbor [in 1976], the country was debating a three-decades-long agreement between Western Union and other telecommunications companies to surreptitiously supply the NSA, on a daily basis, with all telegrams sent to and from the United States. The similarity between that earlier program and the most recent one is remarkable, with one exception — the NSA now owns vastly improved technology to sift through and mine massive amounts of data it has collected in what is being described as the world’s single largest database of personal information. And, according to Aid, the mining goes far beyond our phone lines.

The controversy over Project Shamrock in 1976 ultimately led Congress to pass the 1978 Foreign Intelligence Surveillance Act and other privacy and communication laws designed to prevent commercial companies from working in cahoots with the government to conduct wholesale secret surveillance on their customers. But as stories revealed last week, those safeguards had little effect in preventing at least three telecommunications companies from repeating history. …

[Intelligence historian Matthew Aid] compared the agency’s current data mining to Project Shamrock and Echelon, the code name for an NSA computer system that for many years analyzed satellite communication signals outside the U.S., and generated its own controversy when critics claimed that in addition to eavesdropping on enemy communication, the satellites were eavesdropping on allies’ domestic phone and e-mail conversations. …

If you want some historical perspective look at Operation Shamrock, which collapsed in 1975 because [Rep.] Bella Abzug [D-NY] subpoenaed the heads of Western Union and the other telecommunications giants and put them in witness chairs, and they all admitted that they had cooperated with the NSA for the better part of 40 years by supplying cables and telegrams.

The newest system being added to the NSA infrastructure, by the way, is called Project Trailblazer, which was initiated in 2002 and which was supposed to go online about now but is fantastically over budget and way behind schedule. Trailblazer is designed to copy the new forms of telecommunications — fiber optic cable traffic, cellphone communication, BlackBerry and Internet e-mail traffic. …

Echelon, in fact, is nothing more than a VAX microcomputer that was manufactured in the early 1970s by Digital Equipment Corp., and was used at six satellite intercept stations [to filter and sort data collected from the satellites and distribute it to analysts]. The computer has long since been obsolete. Since 9/11, whatever plans in place to modernize Echelon have been put on hold. The NSA does in fact have a global intercept network, but they just call it the intercept collection infrastructure. They don’t have a code name or anything sexy to describe it, and it didn’t do domestic spying.

A private espionage company for businessmen

From Bo Elkjaer and Kenan Seeberg’s “Echelon’s Architect” (Cryptome: 21 May 2002):

After that, [Bruce McIndoe] started to design Echelon II, an enlargement of the original system.

Bruce McIndoe left the inner circle of the enormous espionage network in 1998, a network run by the National Security Agency, the world’s most powerful intelligence agency, in cooperation with other Western intelligence services. Ekstra Bladet tracked down Bruce McIndoe to IJet Travel Intelligence, a private espionage agency where he is currently second in command.

IJet Travel Intelligence is an exceedingly effective, specialized company that employs former staff members of the NSA, CIA, KGB and South African intelligence services.

The company’s task is to furnish reports for top executives from US business and industry that reveal everything about the destination to which they are travelling for their multinational company. All the information they need to make the trip as safe as possible. The company resembles a miniature version of his previous employer, the world’s most powerful intelligence agency, the NSA. …

“Okay. In short, we have transferred everything I did for the NSA and other services to a private company that then sells intelligence to businesspersons. We get information on everything from local diseases, outbreaks of malaria epidemics and local unrest to strikes, the weather and traffic conditions. Our customers are large multinational companies like Prudential and Texas Instruments. We also work for institutions like the World Bank and the IMF.” …

“Yes, exactly. Our staff are also former intelligence agents who have either developed or run espionage operations for US intelligence agencies or people from the UK, South Africa and Russia.”

Patenting is hurting scientific research & progress

From American Association for the Advancement of Science’s “The Effects of Patenting in the AAAS Scientific Community” [250 kb PDF] (2006):

Forty percent of respondents who had acquired patented technologies since January 2001 reported difficulties in obtaining those technologies. Industry bioscience respondents reported the most problems, with 76 percent reporting that their research had been affected by such difficulties. In contrast, only 35 percent of academic bioscience respondents reported difficulties that affected their research.

Of the 72 respondents who reported that their work had been affected by the technology acquisition process, 58 percent of those reported that their work was delayed. Fifty percent reported that they had to change their research, and 28 percent reported abandoning their research project as acquisition of the necessary technologies involved overly complex licensing negotiations.

OnStar: the numbers

From PR Newswire’s “OnStar Achieves Another First as Winner of Good Housekeeping’s ‘Good Buy’ Award for Best Service” (3 December 2004):

Each month on average, OnStar receives about 700 airbag notifications and 11,000 emergency assistance calls, which include 4,000 Good Samaritan calls for a variety of emergency situations. In addition, each month OnStar advisors respond to an average of 500 stolen vehicle location requests, 20,000 requests for roadside assistance, 36,000 remote door-unlock requests and 19,000 GM Goodwrench remote diagnostics requests.

How to get 1 million MySpace friends

From Nate Mook’s “Cross-Site Scripting Worm Hits MySpace” (Beta News: 13 October 2005):

One clever MySpace user looking to expand his buddy list recently figured out how to force others to become his friend, and ended up creating the first self-propagating cross-site scripting (XSS) worm. In less than 24 hours, “Samy” had amassed over 1 million friends on the popular online community.

How did Samy transcend his humble beginnings of only 73 friends to become a veritable global celebrity? The answer is a combination of XSS tricks and lax security in certain Web browsers.

First, by examining the restrictions put into place by MySpace, Samy discovered how to insert raw HTML into his user profile page. But MySpace stripped out the word “javascript” from any text, which would be needed to execute code.

With the help of Internet Explorer, Samy was able to break the word JavaScript into two lines and place script code within a Cascading Style Sheet tag.
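
The filtering weakness described here is the classic failure of a substring blacklist. A minimal sketch of the general idea (this is my illustration, not MySpace’s actual filter or Samy’s actual payload): a filter that removes the literal word “javascript” does nothing about the same word split across a line break, which 2005-era Internet Explorer was willing to reassemble inside a style value.

```python
# Why a substring blacklist fails (illustration only; not MySpace's actual filter
# or Samy's actual payload).
def naive_filter(html: str) -> str:
    """Strip the forbidden keyword wherever it appears literally."""
    return html.replace("javascript", "")

blocked = '<div style="background:url(javascript:doSomething())">'
evasive = '<div style="background:url(java\nscript:doSomething())">'

print(naive_filter(blocked))   # keyword removed, payload neutered
print(naive_filter(evasive))   # split keyword passes straight through the filter
```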

The next step was to simply instruct the Web browser to load a MySpace URL that would automatically invite Samy as a friend, and later add him as a “hero” to the visitor’s own profile page. To do this without a user’s knowledge, the code utilized XMLHTTPRequest – a JavaScript object used in AJAX, or Web 2.0, applications such as Google Maps.

Taking the hack even further, Samy realized that he could simply insert the entire script into the visiting user’s profile, creating a replicating worm. “So if 5 people viewed my profile, that’s 5 new friends. If 5 people viewed each of their profiles, that’s 25 more new friends,” Samy explained.

It didn’t take long for friend requests to start rolling in – first in the hundreds, then thousands. By 9:30pm that night, requests topped one million and continued arriving at a rate of 1,000 every few seconds. Less than an hour later, MySpace was taken offline while the worm was removed from all user profiles.

California’s wide-open educational software reveals personal info

From Nanette Asimov’s “Software glitch reveals private data for thousands of state’s students” (San Francisco Chronicle: 21 October 2005):

The personal information of tens of thousands of California children — including their names, state achievement test scores, identification numbers and status in gifted or special-needs programs — is open to public view through a security loophole in dozens of school districts statewide that use a popular education software system.

Teacher names and employee identification numbers are also visible to anyone logging onto the system, which is used locally by school districts including San Francisco, San Jose and Hayward.

The problem occurs when the districts issue a generic password to teachers using the system. Until the teacher changes to a unique password, anyone can type in a teacher’s user name and generic password and gain access to information about students that is supposed to be guarded as closely as the gold in Fort Knox. …
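
The hole described here is the familiar default-credential problem. A minimal sketch of the usual mitigation (my own illustration, not how OARS actually works): accounts created with the generic password are flagged, and the application refuses to show anything but a password-change screen until the flag is cleared.

```python
# Sketch of forcing a change of a generic/default password on first login.
# (Illustration only; not the behavior of the OARS system described above.)
from dataclasses import dataclass

@dataclass
class Account:
    username: str
    password: str
    must_change_password: bool = True   # set when the account ships with the generic password

def login(account: Account, supplied_password: str) -> str:
    if supplied_password != account.password:
        return "access denied"
    if account.must_change_password:
        return "password change required before any student data is shown"
    return "access granted"

teacher = Account("jsmith", "district-default")
print(login(teacher, "district-default"))   # forced to the change-password step
```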

San Francisco administrators immediately shut down access to the service, called OARS — Online Assessment Reporting System — after a reporter phoned and said she had been able to access student information for all the children in two middle-school classes where the teachers had not yet changed their passwords. …

Most of the 96 districts statewide that use the system are in Southern California and the Central Valley. …

“We have confidence in the professionalism of our teachers” not to share their passwords, Bradshaw said.

But told how simple it was to gain access to the student records of any teacher who had not yet changed to a unique password, the administrators said they planned to make sure teachers did so.

“We will definitely monitor that,” Quinn said. “We don’t want anyone getting into student information.”

Microsoft: only way to deal with malware is to wipe the computer

From Ryan Naraine’s “Microsoft Says Recovery from Malware Becoming Impossible” (eWeek: 4 April 2006):

In a rare discussion about the severity of the Windows malware scourge, a Microsoft security official said businesses should consider investing in an automated process to wipe hard drives and reinstall operating systems as a practical way to recover from malware infestation.

“When you are dealing with rootkits and some advanced spyware programs, the only solution is to rebuild from scratch. In some cases, there really is no way to recover without nuking the systems from orbit,” Mike Danseglio, program manager in the Security Solutions group at Microsoft, said in a presentation at the InfoSec World conference here.

Offensive rootkits, which are used to hide malware programs and maintain an undetectable presence on an infected machine, have become the weapon of choice for virus and spyware writers and, because they often use kernel hooks to avoid detection, Danseglio said IT administrators may never know if all traces of a rootkit have been successfully removed.

He cited a recent instance where an unnamed branch of the U.S. government struggled with malware infestations on more than 2,000 client machines. “In that case, it was so severe that trying to recover was meaningless. They did not have an automated process to wipe and rebuild the systems, so it became a burden. They had to design a process real fast,” Danseglio added.

… “We’ve seen the self-healing malware that actually detects that you’re trying to get rid of it. You remove it, and the next time you look in that directory, it’s sitting there. It can simply reinstall itself,” he said.

“Detection is difficult, and remediation is often impossible,” Danseglio declared. “If it doesn’t crash your system or cause your system to freeze, how do you know it’s there? The answer is you just don’t know. Lots of times, you never see the infection occur in real time, and you don’t see the malware lingering or running in the background.”

… Danseglio said the success of social engineering attacks is a sign that the weakest link in malware defense is “human stupidity.”

“Social engineering is a very, very effective technique. We have statistics that show significant infection rates for the social engineering malware. Phishing is a major problem because there really is no patch for human stupidity,” he said.

A big benefit of open source: better learning & teaching

From Jon Udell’s “Open source education” (InfoWorld: 7 June 2006):

Open source software development, to a degree unmatched by any other modern profession, offers apprentices the opportunity to watch journeymen and masters at work, to interact with them, and to learn how they think, work, succeed, and fail. Transparency and accountability govern not only the production of source code but also the companion processes of design, specification, testing, maintenance, and evaluation. …

It’s typical of many professions to cultivate an aura of infallibility and monopoly control of information. Open source doesn’t work that way. There are prima donnas, to be sure, but the culture requires practitioners to show their cards, and it erodes information monopolies. Shared code is just the tip of the iceberg. Below the waterline, there’s a vast body of shared knowledge and tradition, grounded in what Tim O’Reilly calls an architecture of participation.

We’ve come to see open source as an economic innovation. Cooperative production of commodity infrastructure paves the way for competitive production of high-value products and services. Perhaps we’ll someday see open source as an educational innovation, too. Cooperative production of shared knowledge isn’t just a by-product. When apprentices, journeymen, and masters engage in a continuous cycle of learning and teaching, an old approach to education is made new again.
