Three top botnets

From Kelly Jackson Higgins’ “The World’s Biggest Botnets” (Dark Reading: 9 November 2007):

You know about the Storm Trojan, which is spread by the world’s largest botnet. But what you may not know is there’s now a new peer-to-peer based botnet emerging that could blow Storm away.

“We’re investigating a new peer-to-peer botnet that may wind up rivaling Storm in size and sophistication,” says Tripp Cox, vice president of engineering for startup Damballa, which tracks botnet command and control infrastructures. “We can’t say much more about it, but we can tell it’s distinct from Storm.”

Researchers estimate that there are thousands of botnets in operation today, but only a handful stand out by their sheer size and pervasiveness. Although size gives a botnet muscle and breadth, it can also make it too conspicuous, which is why botnets like Storm fluctuate in size and are constantly finding new ways to cover their tracks to avoid detection. Researchers have different head counts for different botnets, with Storm by far the largest (for now, anyway).

According to Damballa, the top three botnets are Storm, with 230,000 active members per 24 hour period; Rbot, an IRC-based botnet with 40,000 active members per 24 hour period; and Bobax, an HTTP-based botnet with 24,000 active members per 24 hour period.

1. Storm

Size: 230,000 active members per 24 hour period

Type: peer-to-peer

Purpose: Spam, DDOS

Malware: Trojan.Peacomm (aka Nuwar)

Few researchers can agree on Storm’s actual size — while Damballa says it’s over 200,000 bots, Trend Micro says it’s more like 40,000 to 100,000 today. But all researchers say that Storm is a whole new brand of botnet. First, it uses encrypted, decentralized peer-to-peer communication, unlike the traditional centralized IRC model. That makes it tough to kill because you can’t necessarily shut down its command and control machines. And intercepting Storm’s traffic requires cracking the encrypted data.

Storm also uses fast-flux, a round-robin method where infected bot machines (typically home computers) serve as proxies or hosts for malicious Websites. These are constantly rotated, changing their DNS records to prevent their discovery by researchers, ISPs, or law enforcement. And researchers say it’s tough to tell how the command and control communication structure is set up behind the P2P botnet. “Nobody knows how the mother ships are generating their C&C,” Trend Micro’s Ferguson says.
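Fast-flux is easiest to spot from the resolver’s side: the same hostname keeps answering with different, short-lived sets of IP addresses drawn from the rotating pool of infected machines. Here is a minimal sketch of how a researcher might watch for that churn, using only Python’s standard library; the hostname, polling interval, and round count are placeholders, not details from the article:

```python
# Sketch: poll a hostname's A records and count how many distinct IPs it serves over time.
# "fluxy-domain.example" is a placeholder, not a real fast-flux domain.
import socket
import time

def poll_a_records(hostname: str, rounds: int = 10, pause: float = 60.0) -> set:
    """Resolve `hostname` repeatedly and collect every IP address seen."""
    seen = set()
    for _ in range(rounds):
        try:
            _, _, addresses = socket.gethostbyname_ex(hostname)
            seen.update(addresses)
        except socket.gaierror:
            pass  # the domain may already have been taken down or rotated away
        time.sleep(pause)
    return seen

if __name__ == "__main__":
    ips = poll_a_records("fluxy-domain.example", rounds=5, pause=30)
    # A legitimate site usually resolves to a small, stable set of addresses;
    # a fast-flux domain churns through dozens of (often residential) IPs.
    print(f"{len(ips)} distinct IP addresses observed: {sorted(ips)}")
```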

Storm uses a complex combination of malware called Peacomm that includes a worm, rootkit, spam relay, and Trojan.

But researchers don’t know — or can’t say — who exactly is behind Storm, except that it’s likely a fairly small, tightly knit group with a clear business plan. “All roads lead back to Russia,” Trend Micro’s Ferguson says.

“Storm is the only thing now that keeps me awake at night and busy,” he says. “It’s professionalized crimeware… They have young, talented programmers apparently. And they write tools to do administrative [tracking], as well as writing cryptographic routines… and another will handle social engineering, and another will write the Trojan downloader, and another is writing the rootkit.”

2. Rbot

Size: 40,000 active members per 24 hour period

Type: IRC

Purpose: DDOS, spam, malicious operations

Malware: Windows worm

Rbot is basically an old-school IRC botnet that uses the Rbot malware kit. It isn’t likely to ever reach Storm size because IRC botnets just can’t scale accordingly. “An IRC server has to be a beefy machine to support anything anywhere close to the size of Peacomm/Storm,” Damballa’s Cox says.

Rbot’s underlying malware uses a backdoor to gain control of the infected machine, installing keyloggers and viruses, stealing files from the machine, and carrying out the usual spam and DDOS attacks. It can disable antivirus software, too.

3. Bobax

Size: 24,000 active members per 24 hour period

Type: HTTP

Purpose: Spam

Malware: Mass-mailing worm

Bobax is specifically for spamming, Cox says, and uses the stealthier HTTP for sending instructions to its bots on who and what to spam. …

According to Symantec, Bobax bores open a back door and downloads files onto the infected machine, and lowers its security settings. It spreads via a buffer overflow vulnerability in Windows, and inserts the spam code into the IE browser so that each time the browser runs, the virus is activated. And Bobax also does some reconnaissance to ensure that its spam runs are efficient: It can do bandwidth and network analysis to determine just how much spam it can send, according to Damballa. “Thus [they] are able to tailor their spamming so as not to tax the network, which helps them avoid detection,” according to company research.

Even more frightening, though, is that some Bobax variants can block access to antivirus and security vendor Websites, a new trend in Website exploitation.

Largest botnet as of 2006: 1.5 M machines

From Gregg Keizer’s “Dutch Botnet Bigger Than Expected” (InformationWeek: 21 October 2005):

Dutch prosecutors who last month arrested a trio of young men for creating a large botnet allegedly used to extort a U.S. company, steal identities, and distribute spyware now say they bagged bigger prey: a botnet of 1.5 million machines.

According to Wim de Bruin, a spokesman for the Public Prosecution Service (Openbaar Ministerie, or OM), when investigators at GOVCERT.NL, the Netherlands’ Computer Emergency Response Team, and several Internet service providers began dismantling the botnet, they discovered it consisted of about 1.5 million compromised computers, 15 times the 100,000 PCs first thought.

The three suspects, ages 19, 22, and 27, were arrested Oct. 6 …

The trio supposedly used the Toxbot Trojan horse to infect the vast number of machines, easily the largest controlled by arrested attackers.

Why botnet operators do it: profit, politics, & prestige

From Clive Akass’ “Storm worm ‘making millions a day’” (Personal Computer World: 11 February 2008):

The people behind the Storm worm are making millions of pounds a day by using it to generate revenue, according to IBM’s principal web security strategist.

Joshua Corman, of IBM Internet Security Systems, said that in the past it had been assumed that web security attacks were essentially ego driven. But attackers now fall into three camps.

‘I call them my three Ps, profit, politics and prestige,’ he said during a debate at a NetEvents forum in Barcelona.

The Storm worm, which had been around for about a year, had been a tremendous financial success because it created a botnet of compromised machines that could be used to launch profitable spam attacks.

Not only do the criminals get paid for sending out spam in far greater quantity than a single machine could manage, but they also get a cut of any business done off the spam.

Srizbi, Bobax, & Storm – the rankings

From Gregg Keizer’s “RSA – Top botnets control 1M hijacked computers” (Computerworld: 4 October 2008):

Joe Stewart, director of malware research at SecureWorks, presented his survey at the RSA Conference, which opened Monday in San Francisco. The survey ranked the top 11 botnets that send spam; by extrapolating their size, Stewart estimated the bots on his list control just over a million machines and are capable of flooding the Internet with more than 100 billion spam messages every day.

The botnet at the top of the chart is Srizbi. According to Stewart, this botnet — which also goes by the names “Cbeplay” and “Exchanger” — has an estimated 315,000 bots and can blast out 60 billion messages a day.

While it may not have gotten the publicity that Storm has during the last year, it’s built around a much more substantial collection of hijacked computers, said Stewart. In comparison, Storm’s botnet counts just 85,000 machines, only 35,000 of which are set up to send spam. Storm, in fact, is No. 5 on Stewart’s list.

“Storm is pretty insignificant at this point,” said Stewart. “It got all this attention, so Microsoft added it to its malicious software detection tool [in September 2007], and that’s removed hundreds of thousands of compromised PCs from the botnet.”

The second-largest botnet is “Bobax,” which boasts an estimated 185,000 hacked systems in its collection. Able to spam approximately nine billion messages a day, Bobax has been around for some time, but recently has been in the news again, albeit under one of its several aliases.
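Taken at face value, Stewart’s figures imply very different per-bot workloads for each botnet. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Back-of-the-envelope check of the per-bot spam volumes implied by Stewart's figures.
botnets = {
    "Srizbi": {"bots": 315_000, "msgs_per_day": 60_000_000_000},
    "Bobax": {"bots": 185_000, "msgs_per_day": 9_000_000_000},
}

for name, stats in botnets.items():
    per_bot = stats["msgs_per_day"] / stats["bots"]
    print(f"{name}: roughly {per_bot:,.0f} messages per bot per day")

# Srizbi: roughly 190,476 messages per bot per day
# Bobax: roughly 48,649 messages per bot per day
```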

Number of bots drops 20% on Christmas

From Robert Lemos’ “Bot-infected PCs get a refresh” (SecurityFocus: 28 December 2006):

On Christmas day, the number of bots tracked by the Shadowserver group dropped nearly 20 percent.

The dramatic decrease in weekly totals–from more than 500,000 infected systems to fewer than 400,000 computers–puzzled researchers. The Internet Storm Center, a threat monitoring group managed by the SANS Institute, confirmed a drop of about 10 percent.

One of the Internet Storm Center’s network monitoring volunteers posited that the decrease was due to the large number of computers given as gifts this Christmas. The new systems running Microsoft Windows XP will be using Service Pack 2, which also means the firewall will be on by default, adding an additional hurdle for bot herders looking to reclaim their drones.

“Many of the infected machines are turned off, the new shiny ones have not been infected, and the Internet is momentarily a safer place,” Marcus Sachs, director of the ISC, stated in a diary entry. “But like you said, give it a few weeks and we’ll be right back to where we started from.”

1/4 of all Internet computers part of a botnet?

From Nate Anderson’s “Vint Cerf: one quarter of all computers part of a botnet” (Ars Technica: 25 January 2007):

The BBC’s Tim Weber, who was in the audience of an Internet panel featuring Vint Cerf, Michael Dell, John Markoff of the New York Times, and Jon Zittrain of Oxford, came away most impressed by the botnet statistics. Cerf told his listeners that approximately 600 million computers are connected to the Internet, and that 150 million of them might be participants in a botnet—nearly all of them unwilling victims. Weber remarks that “in most cases the owners of these computers have not the slightest idea what their little beige friend in the study is up to.”

In September 2006, security research firm Arbor Networks announced that it was now seeing botnet-based denial of service attacks capable of generating an astonishing 10-20Gbps of junk data. The company notes that when major attacks of this sort begin, ISPs often do exactly what the attacker wants them to do: take the target site offline.

Prices for various services and software in the underground

From Tom Espiner’s “Cracking open the cybercrime economy” (CNET News: 14 December 2007):

“Over the years, the criminal elements, the ones who are making money, making millions out of all this online crime, are just getting stronger and stronger. I don’t think we are really winning this war.”

As director of antivirus research for F-Secure, you might expect Mikko Hypponen to overplay the seriousness of the situation. But according to the Finnish company, during 2007 the number of samples of malicious code in its database doubled, having taken 20 years to reach the size it was at the beginning of this year.

“From Trojan creation sites out of Germany and the Eastern bloc, you can purchase kits and support for malware in yearly contracts,” said [David Marcus, security research manager at McAfee Avert Labs]. “They present themselves as a cottage industry which sells tools or creation kits. It’s hard to tell if it’s a conspiracy or a bunch of autonomous individuals who are good at covering their tracks.”

Joe Telafici, director of operations at McAfee’s Avert Labs, said Storm is continuing to evolve. “We’ve seen periodic activity from Storm indicating that it is still actively being maintained. They have actually ripped out core pieces of functionality to modify the obfuscation mechanisms that weren’t working any more. Most people keep changing the wrapper until it gets by (security software)–these guys changed the functionality.”

Peter Gutmann, a security researcher at the University of Auckland, says in a report that malicious software via the affiliate model–in which someone pays others to infect users with spyware and Trojans–has become more prevalent in 2007.

The affiliate model was pioneered by the iframedollars.biz site in 2005, which paid Webmasters 6 cents per infected site. Since then, this has been extended to a “vast number of adware affiliates,” according to Gutmann. For example, one adware supplier pays 30 cents for each install in the United States, 20 cents in Canada, 10 cents in the United Kingdom, and 1 or 2 cents elsewhere.
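Those per-install rates amount to a simple payout schedule. A toy sketch of the arithmetic using the figures quoted above; the country codes, the default rate for “elsewhere,” and the function itself are illustrative assumptions, not anything described by Gutmann:

```python
# Illustrative payout schedule built from the per-install rates quoted above (US dollars).
PER_INSTALL_RATE = {"US": 0.30, "CA": 0.20, "UK": 0.10}
DEFAULT_RATE = 0.015  # "1 or 2 cents elsewhere" -- midpoint used as a rough default

def affiliate_payout(installs_by_country: dict) -> float:
    """Total owed to an affiliate for a batch of reported installs, by country."""
    return sum(
        count * PER_INSTALL_RATE.get(country, DEFAULT_RATE)
        for country, count in installs_by_country.items()
    )

# 10,000 US installs alone are worth about $3,000 at these rates.
print(affiliate_payout({"US": 10_000, "CA": 2_000, "DE": 5_000}))  # 3475.0
```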

Hackers also piggyback malicious software on legitimate software. According to Gutmann, versions of coolwebsearch co-install a mail zombie and a keystroke logger, while some peer-to-peer and file-sharing applications come with bundled adware and spyware.

In March, the price quoted on malware sites for the Gozi Trojan, which steals data and sends it to hackers in an encrypted form, was between $1,000 and $2,000 for the basic version. Buyers could purchase add-on services at varying prices starting at $20.

In the 2007 black economy, everything can be outsourced, according to Gutmann. A scammer can buy hosts for a phishing site, buy spam services to lure victims, buy drops to send the money to, and pay a cashier to cash out the accounts. …

Antidetection vendors sell services to malicious-software and botnet vendors, who sell stolen credit card data to middlemen. Those middlemen then sell that information to fraudsters who deal in stolen credit card data and pay a premium for verifiably active accounts. “The money seems to be in the middlemen,” Gutmann says.

One example of this is the Gozi Trojan. According to reports, the malware was available this summer as a service from iFrameBiz and stat482.com, who bought the Trojan from the HangUp team, a group of Russian hackers. The Trojan server was managed by 76service.com, and hosted by the Russian Business Network, which security vendors allege offered “bullet-proof” hosting for phishing sites and other illicit operations.

According to Gutmann, there are many independent malicious-software developers selling their wares online. Private releases can be tailored to individual clients, while vendors offer support services, often bundling antidetection. For example, the private edition of Hav-rat version 1.2, a Trojan written by hacker Havalito, is advertised as being completely undetectable by antivirus companies. If it does get detected then it will be replaced with a new copy that again is supposedly undetectable.

Hackers can buy denial-of-service attacks for $100 per day, while spammers can buy CDs with harvested e-mail addresses. Spammers can also send mail via spam brokers, handled via online forums such as specialham.com and spamforum.biz. In this environment, $1 buys 1,000 to 5,000 credits, while $1,000 buys 10,000 compromised PCs. Credit is deducted when the spam is accepted by the target mail server. The brokers handle spam distribution via open proxies, relays and compromised PCs, while the sending is usually done from the client’s PC using broker-provided software and control information.

Carders, who mainly deal in stolen credit card details, openly publish prices, or engage in private negotiations to decide the price, with some sources giving bulk discounts for larger purchases. The rate for credit card details is approximately $1 for all the details down to the Card Verification Value (CVV); $10 for details with CVV linked to a Social Security number; and $50 for a full bank account.

Criminals working together to improve their tools

From Dan Goodin’s “Crimeware giants form botnet tag team” (The Register: 5 September 2008):

The Rock Phish gang – one of the net’s most notorious phishing outfits – has teamed up with another criminal heavyweight called Asprox in overhauling its network with state-of-the-art technology, according to researchers from RSA.

Over the past five months, Rock Phishers have painstakingly refurbished their infrastructure, introducing several sophisticated crimeware packages that get silently installed on the PCs of its victims. One of those programs makes infected machines part of a fast-flux botnet that adds reliability and resiliency to the Rock Phish network.

Based in Europe, the Rock Phish group is a criminal collective that has been targeting banks and other financial institutions since 2004. According to RSA, they are responsible for half of the worldwide phishing attacks and have siphoned tens of millions of dollars from individuals’ bank accounts. The group got its name from a now discontinued quirk in which the phishers used directory paths that contained the word “rock.”

The first sign the group was expanding operations came in April, when it introduced a trojan known alternately as Zeus or WSNPOEM, which steals sensitive financial information in transit from a victim’s machine to a bank. Shortly afterward, the gang added more crimeware, including a custom-made botnet client that was spread, among other means, using the Neosploit infection kit.

Soon, additional signs appeared pointing to a partnership between Rock Phishers and Asprox. Most notably, the command and control server for the custom Rock Phish crimeware had exactly the same directory structure as many of the Asprox servers, leading RSA researchers to believe Rock Phish and Asprox attacks were using at least one common server. …

RSA researchers also noticed that a decrease in phishing attacks hosted on Rock Phishers’ old servers coincided with never-before-seen phishing attacks used on the Asprox botnet.

In this case, Rock Phishers seem to be betting that the spoofed pages used in their phishing attacks will remain up longer using fast-flux technology from Asprox.

“It just shows that these guys know each other and are willing to provide services to each other,” said Joe Stewart, a researcher at SecureWorks who has spent years tracking Asprox and groups that use fast-flux botnets. “This goes on in the underground all the time.”

Wikipedia, freedom, & changes in production

From Clay Shirky’s “Old Revolutions, Good; New Revolutions, Bad” (Britannica Blog: 14 June 2007):

Gorman’s theory about print – its capabilities ushered in an age very different from manuscript culture — is correct, and the same kind of shift is at work today. As with the transition from manuscripts to print, the new technologies offer virtues that did not previously exist, but are now an assumed and permanent part of our intellectual environment. When reproduction, distribution, and findability were all hard, as they were for the last five hundred years, we needed specialists to undertake those jobs, and we properly venerated them for the service they performed. Now those tasks are simpler, and the earlier roles have instead become obstacles to direct access.

Digital and networked production vastly increase three kinds of freedom: freedom of speech, of the press, and of assembly. This perforce increases the freedom of anyone to say anything at any time. This freedom has led to an explosion in novel content, much of it mediocre, but freedom is like that. Critically, this expansion of freedom has not undermined any of the absolute advantages of expertise; the virtues of mastery remain as they were. What has happened is that the relative advantages of expertise are in precipitous decline. Experts the world over have been shocked to discover that they were consulted not as a direct result of their expertise, but often as a secondary effect – the apparatus of credentialing made finding experts easier than finding amateurs, even when the amateurs knew the same things as the experts.

The success of Wikipedia forces a profound question on print culture: how is information to be shared with the majority of the population? This is an especially tough question, as print culture has so manifestly failed at the transition to a world of unlimited perfect copies. Because Wikipedia’s contents are both useful and available, it has eroded the monopoly held by earlier modes of production. Other encyclopedias now have to compete for value to the user, and they are failing because their model mainly commits them to denying access and forbidding sharing. If Gorman wants more people reading Britannica, the choice lies with its management. Were they to allow users unfettered access to read and share Britannica’s content tomorrow, the only interesting question is whether their readership would rise ten-fold or a hundred-fold.

Britannica will tell you that they don’t want to compete on universality of access or sharability, but this is the lament of the scribe who thinks that writing fast shouldn’t be part of the test. In a world where copies have become cost-free, people who expend their resources to prevent access or sharing are forgoing the principal advantages of the new tools, and this dilemma is common to every institution modeled on the scarcity and fragility of physical copies. Academic libraries, which in earlier days provided a service, have outsourced themselves as bouncers to publishers like Reed-Elsevier; their principal job, in the digital realm, is to prevent interested readers from gaining access to scholarly material.

The Yakuza’s influence in Japan

From Jake Adelstein’s “This Mob Is Big in Japan” (The Washington Post: 11 May 2008):

Most Americans think of Japan as a law-abiding and peaceful place, as well as our staunch ally, but reporting on the underworld gave me a different perspective. Mobs are legal entities here. Their fan magazines and comic books are sold in convenience stores, and bosses socialize with prime ministers and politicians. …

I loved my job. The cops fighting organized crime are hard-drinking iconoclasts — many look like their mobster foes, with their black suits and slicked-back hair. They’re outsiders in Japanese society, and perhaps because I was an outsider too, we got along well. The yakuza’s tribal features are also compelling, like those of an alien life form: the full-body tattoos, missing digits and pseudo-family structure. …

The Japanese National Police Agency (NPA) estimates that the yakuza have almost 80,000 members. The most powerful faction, the Yamaguchi-gumi, is known as “the Wal-Mart of the yakuza” and reportedly has close to 40,000 members. In Tokyo alone, the police have identified more than 800 yakuza front companies: investment and auditing firms, construction companies and pastry shops. The mobsters even set up their own bank in California, according to underworld sources.

Over the last seven years, the yakuza have moved into finance. Japan’s Securities and Exchange Surveillance Commission has an index of more than 50 listed companies with ties to organized crime.

In the good old days, the yakuza made most of their money from sleaze: prostitution, drugs, protection money and child pornography. Kiddie porn is still part of their base income — and another area where Japan isn’t acting like America’s friend.

In 1999, my editors assigned me to cover the Tokyo neighborhood that includes Kabukicho, Japan’s largest red-light district. Japan had recently outlawed child pornography — reluctantly, after international pressure left officials no choice. But the ban, which is still in effect, had a major flaw: It criminalized producing and selling child pornography, not owning it. So the big-money industry goes on, unabated.

I’m not entirely objective on the issue of the yakuza in my adopted homeland. Three years ago, [Tadamasa Goto, a notorious Japanese gang boss, the one that some federal agents call the “John Gotti of Japan”] got word that I was reporting an article about his liver transplant. A few days later, his underlings obliquely threatened me. Then came a formal meeting. The offer was straightforward. “Erase the story or be erased,” one of them said. “Your family too.”

ODF compared & contrasted with OOXML

From Sam Hiser’s “Achieving Openness: A Closer Look at ODF and OOXML” (ONLamp.com: 14 June 2007):

An open, XML-based standard for displaying and storing data files (text documents, spreadsheets, and presentations) offers a new and promising approach to data storage and document exchange among office applications. A comparison of the two XML-based formats–OpenDocument Format (“ODF”) and Office Open XML (“OOXML”)–across widely accepted “openness” criteria has revealed substantial differences, including the following:

  • ODF is developed and maintained in an open, multi-vendor, multi-stakeholder process that protects against control by a single organization. OOXML is less open in its development and maintenance, despite being submitted to a formal standards body, because control of the standard ultimately rests with one organization.
  • ODF is the only openly available standard, published fully in a document that is freely available and easy to comprehend. This openness is reflected in the number of competing applications in which ODF is already implemented. Unlike ODF, OOXML’s complexity, extraordinary length, technical omissions, and single-vendor dependencies combine to make alternative implementation unattractive as well as legally and practically impossible.
  • ODF is the only format unencumbered by intellectual property rights (IPR) restrictions on its use in other software, as certified by the Software Freedom Law Center. Conversely, many elements designed into the OOXML formats but left undefined in the OOXML specification require behaviors upon document files that only Microsoft Office applications can provide. This makes data inaccessible and breaks work group productivity whenever alternative software is used.
  • ODF offers interoperability with ODF-compliant applications on most of the common operating system platforms. OOXML is designed to operate fully within the Microsoft environment only. Though it will work elegantly across the many products in the Microsoft catalog, OOXML ignores accepted standards and best practices regarding its use of XML.

Overall, a comparison of both formats reveals significant differences in their levels of openness. While ODF is revealed as sufficiently open across all four key criteria, OOXML shows relative weakness in each criterion and offers fundamental flaws that undermine its candidacy as a global standard.

The future of security

From Bruce Schneier’s “Security in Ten Years” (Crypto-Gram: 15 December 2007):

Bruce Schneier: … The nature of the attacks will be different: the targets, tactics and results. Security is both a trade-off and an arms race, a balance between attacker and defender, and changes in technology upset that balance. Technology might make one particular tactic more effective, or one particular security technology cheaper and more ubiquitous. Or a new emergent application might become a favored target.

By 2017, people and organizations won’t be buying computers and connectivity the way they are today. The world will be dominated by telcos, large ISPs and systems integration companies, and computing will look a lot like a utility. Companies will be selling services, not products: email services, application services, entertainment services. We’re starting to see this trend today, and it’s going to take off in the next 10 years. Where this affects security is that by 2017, people and organizations won’t have a lot of control over their security. Everything will be handled at the ISPs and in the backbone. The free-wheeling days of general-use PCs will be largely over. Think of the iPhone model: You get what Apple decides to give you, and if you try to hack your phone, they can disable it remotely. We techie geeks won’t like it, but it’s the future. The Internet is all about commerce, and commerce won’t survive any other way.

Marcus Ranum: … Another trend I see getting worse is government IT know-how. At the rate outsourcing has been brain-draining the federal workforce, by 2017 there won’t be a single government employee who knows how to do anything with a computer except run PowerPoint and Web surf. Joking aside, the result is that the government’s critical infrastructure will be almost entirely managed from the outside. The strategic implications of such a shift have scared me for a long time; it amounts to a loss of control over data, resources and communications.

Bruce Schneier: … I’m reminded of the post-9/11 anti-terrorist hysteria — we’ve confused security with control, and instead of building systems for real security, we’re building systems of control. Think of ID checks everywhere, the no-fly list, warrantless eavesdropping, broad surveillance, data mining, and all the systems to check up on scuba divers, private pilots, peace activists and other groups of people. These give us negligible security, but put a whole lot of control in the government’s hands.

That’s the problem with any system that relies on control: Once you figure out how to hack the control system, you’re pretty much golden. So instead of a zillion pesky worms, by 2017 we’re going to see fewer but worse super worms that sail past our defenses.

My new book – Google Apps Deciphered – is out!

I’m really proud to announce that my 5th book is now out & available for purchase: Google Apps Deciphered: Compute in the Cloud to Streamline Your Desktop. My other books include:

(I’ve also contributed to two others: Ubuntu Hacks: Tips & Tools for Exploring, Using, and Tuning Linux and Microsoft Vista for IT Security Professionals.)

Google Apps Deciphered is a guide to setting up Google Apps, migrating to it, customizing it, and using it to improve productivity, communications, and collaboration. I walk you through each leading component of Google Apps individually, and then show you exactly how to make them work together, whether on the Web or by integrating them with your favorite desktop apps. I provide practical insights on Google Apps programs for email, calendaring, contacts, wikis, word processing, spreadsheets, presentations, video, and even Google’s new web browser Chrome. My aim was to collect and present tips and tricks I’ve gained by using and setting up Google Apps for clients, family, and friends.

Here’s the table of contents:

  • 1: Choosing an Edition of Google Apps
  • 2: Setting Up Google Apps
  • 3: Migrating Email to Google Apps
  • 4: Migrating Contacts to Google Apps
  • 5: Migrating Calendars to Google Apps
  • 6: Managing Google Apps Services
  • 7: Setting Up Gmail
  • 8: Things to Know About Using Gmail
  • 9: Integrating Gmail with Other Software and Services
  • 10: Integrating Google Contacts with Other Software and Services
  • 11: Setting Up Google Calendar
  • 12: Things to Know About Using Google Calendar
  • 13: Integrating Google Calendar with Other Software and Services
  • 14: Things to Know About Using Google Docs
  • 15: Integrating Google Docs with Other Software and Services
  • 16: Setting Up Google Sites
  • 17: Things to Know About Using Google Sites
  • 18: Things to Know About Using Google Talk
  • 19: Things to Know About Using Start Page
  • 20: Things to Know About Using Message Security and Recovery
  • 21: Things to Know About Using Google Video
  • Appendix A: Backing Up Google Apps
  • Appendix B: Dealing with Multiple Accounts
  • Appendix C: Google Chrome: A Browser Built for Cloud Computing

If you want to know more about Google Apps and how to use it, then I know you’ll enjoy and learn from Google Apps Deciphered. You can read about and buy the book at Amazon (http://www.amazon.com/Google-Apps-Deciphered-Compute-Streamline/dp/0137004702) for $26.39. If you have any questions or comments, don’t hesitate to contact me at scott at granneman dot com.

A single medium, with a single search engine, & a single info source

From Nicholas Carr’s “All hail the information triumvirate!” (Rough Type: 22 January 2009):

Today, another year having passed, I did the searches [on Google] again. And guess what:

World War II: #1
Israel: #1
George Washington: #1
Genome: #1
Agriculture: #1
Herman Melville: #1
Internet: #1
Magna Carta: #1
Evolution: #1
Epilepsy: #1

Yes, it’s a clean sweep for Wikipedia.

The first thing to be said is: Congratulations, Wikipedians. You rule. Seriously, it’s a remarkable achievement. Who would have thought that a rag-tag band of anonymous volunteers could achieve what amounts to hegemony over the results of the most popular search engine, at least when it comes to searches for common topics?

The next thing to be said is: what we seem to have here is evidence of a fundamental failure of the Web as an information-delivery service. Three things have happened, in a blink of history’s eye: (1) a single medium, the Web, has come to dominate the storage and supply of information, (2) a single search engine, Google, has come to dominate the navigation of that medium, and (3) a single information source, Wikipedia, has come to dominate the results served up by that search engine. Even if you adore the Web, Google, and Wikipedia – and I admit there’s much to adore – you have to wonder if the transformation of the Net from a radically heterogeneous information source to a radically homogeneous one is a good thing. Is culture best served by an information triumvirate?

It’s hard to imagine that Wikipedia articles are actually the very best source of information for all of the many thousands of topics on which they now appear as the top Google search result. What’s much more likely is that the Web, through its links, and Google, through its search algorithms, have inadvertently set into motion a very strong feedback loop that amplifies popularity and, in the end, leads us all, lemminglike, down the same well-trod path – the path of least resistance. You might call this the triumph of the wisdom of the crowd. I would suggest that it would be more accurately described as the triumph of the wisdom of the mob. The former sounds benign; the latter, less so.
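The dynamic Carr describes is essentially a rich-get-richer process: each click or link makes a source slightly more visible, which earns it the next click too. A crude simulation of that feedback loop — purely illustrative, not a model of Google’s actual ranking algorithms — shows how quickly one source can come to dominate:

```python
# Crude rich-get-richer simulation: sources get chosen in proportion to the
# attention they already have, and one of them tends to end up dominating.
import random

random.seed(42)
attention = {f"source_{i}": 1 for i in range(10)}  # ten sources, all starting equal

for _ in range(10_000):  # each step, one "searcher" clicks one source
    sources = list(attention)
    weights = [attention[s] for s in sources]
    chosen = random.choices(sources, weights=weights, k=1)[0]
    attention[chosen] += 1  # the click makes that source a little more prominent

ranked = sorted(attention.items(), key=lambda kv: kv[1], reverse=True)
top_name, top_score = ranked[0]
print(f"{top_name} ends with {top_score / sum(attention.values()):.0%} of all attention")
```

Rerun with different seeds and a different source wins, but some source almost always pulls far ahead of the rest, even though all of them started out identical.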

Old botnets dead; new botnets coming

From Joel Hruska’s “Meet Son of Storm, Srizbi 2.0: next-gen botnets come online” (Ars Technica: 15 January 2009):

First the good news: SecureWorks reports that Storm is dead, Bobax/Kraken is moribund, and both Srizbi and Rustock were heavily damaged by the McColo takedown; Srizbi is now all but silent, while Rustock remains viable. That’s three significant botnets taken out and one damaged in a single year; cue (genuine) applause.

The bad news kicks in further down the page with a fresh list of botnets that need to be watched. Rustock and Mega-D (also known as Ozdok) are still alive and kicking, while newcomers Xarvester and Waledac could cause serious problems in 2009. Xarvester, according to Marshal, may be an updated form of Srizbi; the two share a number of common features, including:

* HTTP command and control over nonstandard ports
* Encrypted template files contain several files needed for spamming
* Bots don’t need to do their own DNS lookups to send spam
* Config files have similar format and data
* Uploads Minidump crash file

A definition of cloud computing

From Darryl K. Taft’s “Predictions for the Cloud in 2009” (eWeek: 29 December 2008):

[Peter] Coffee, who is now director of platform research at Salesforce.com, said, “I’m currently using a simple reference model for what a ‘cloud computing’ initiative should try to provide. I’m borrowing from the famous Zero-One-Infinity rule, canonically defined in The Jargon File…”

He continued, “It seems to me that a serious effort at delivering cloud benefits pursues the following ideals—perhaps never quite reaching them, but clearly having them as goals within theoretical possibility: Zero—On-premise[s] infrastructure, acquisition cost, adoption cost and support cost. One—Coherent software environment—not a ‘stack’ of multiple products from different providers. This avoids the chaos of uncoordinated release cycles or deferred upgrades. Infinity—Scalability in response to changing need, integratability/interoperability with legacy assets and other services, and customizability/programmability from data, through logic, up into the user interface without compromising robust multitenancy.”

Social networks can be used to manipulate affinity groups

From Ronald A. Cass’ “Madoff Exploited the Jews” (The Wall Street Journal: 18 December 2008):

Steven Spielberg. Elie Wiesel. Mort Zuckerman. Frank Lautenberg. Yeshiva University. As I read the list of people and enterprises reportedly bilked to the tune of $50 billion by Bernard Madoff, I recalled a childhood in which my father received bad news by asking first, “Was it a Jew?” My father coupled sensitivity to anti-Semitism with special sympathy for other Jews. In contrast, Mr. Madoff, it seems, targeted other Jews, drawing them in at least in some measure because of a shared faith.

The Madoff tale is striking in part because it is like stealing from family. Yet frauds that prey on people who share bonds of religion or ethnicity, who travel in the same circles, are quite common. Two years ago the Securities and Exchange Commission issued a warning about “affinity fraud.” The SEC ticked off a series of examples of schemes that were directed at members of a community: Armenian-Americans, Baptist Church members, Jehovah’s Witnesses, African-American church groups, Korean-Americans. In each case, the perpetrator relied on the fact that being from the same community provided a reason to trust the sales pitch, to believe it was plausible that someone from the same background would give you a deal that, if offered by someone without such ties, would sound too good to be true.

The sense of common heritage, of community, also makes it less seemly to ask hard questions. Pressing a fellow parishioner or club member for hard information is like demanding receipts from your aunt — it just doesn’t feel right. Hucksters know that, they play on it, and they count on our trust to make their confidence games work.

The level of affinity and of trust may be especially high among Jews. The Holocaust and generations of anti-Semitic laws and practices around the world made reliance on other Jews, and care for them, a survival instinct. As a result, Jews are often an easy target both for fund-raising appeals and fraud. But affinity plays a role in many groups, making members more trusting of appeals within the group.

DIY genetic engineering

From Marcus Wohlsen’s “Amateurs are trying genetic engineering at home” (AP: 25 December 2008):

Now, tinkerers are working at home with the basic building blocks of life itself.

Using homemade lab equipment and the wealth of scientific knowledge available online, these hobbyists are trying to create new life forms through genetic engineering — a field long dominated by Ph.D.s toiling in university and corporate laboratories.

In her San Francisco dining room lab, for example, 31-year-old computer programmer Meredith L. Patterson is trying to develop genetically altered yogurt bacteria that will glow green to signal the presence of melamine, the chemical that turned Chinese-made baby formula and pet food deadly.

Many of these amateurs may have studied biology in college but have no advanced degrees and are not earning a living in the biotechnology field. Some proudly call themselves “biohackers” — innovators who push technological boundaries and put the spread of knowledge before profits.

In Cambridge, Mass., a group called DIYbio is setting up a community lab where the public could use chemicals and lab equipment, including a used freezer, scored for free off Craigslist, that drops to 80 degrees below zero, the temperature needed to keep many kinds of bacteria alive.

Patterson, the computer programmer, wants to insert the gene for fluorescence into yogurt bacteria, applying techniques developed in the 1970s.

She learned about genetic engineering by reading scientific papers and getting tips from online forums. She ordered jellyfish DNA for a green fluorescent protein from a biological supply company for less than $100. And she built her own lab equipment, including a gel electrophoresis chamber, or DNA analyzer, which she constructed for less than $25, versus more than $200 for a low-end off-the-shelf model.

Social networking and “friendship”

From danah boyd’s “Friends, Friendsters, and MySpace Top 8: Writing Community Into Being on Social Network Sites” (First Monday: December 2006):

John’s reference to “gateway Friends” concerns a specific technological affordance unique to Friendster. Because the company felt it would make the site more intimate, Friendster limits users from surfing to Profiles beyond four degrees (Friends of Friends of Friends of Friends). When people login, they can see how many Profiles are “in their network” where the network is defined by the four degrees. For users seeking to meet new people, growing this number matters. For those who wanted it to be intimate, keeping the number smaller was more important. In either case, the number of people in one’s network was perceived as directly related to the number of friends one had.

“I am happy with the number of friends I have. I can access over 26,000 profiles, which is enough for me!” — Abby

The number of Friends one has definitely affects the size of one’s network but connecting to Collectors plays a much more significant role. Because these “gateway friends” (a.k.a. social network hubs) have lots of Friends who are not connected to each other, they expand the network pretty rapidly. Thus, connecting to Collectors or connecting to people who connect to Collectors opens you up to a large network rather quickly.
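The “number of Profiles in your network” that Friendster displays is just a count of the accounts reachable within four hops of you in the Friend graph, which is why a single Collector inflates it so dramatically. A small sketch of that computation over a made-up friend graph — the graph, the names, and the hop-limit handling are illustrative assumptions, not Friendster’s code:

```python
# Count the profiles reachable within N hops of a user -- Friendster's "in your network" figure.
from collections import deque

def network_size(friends: dict, start: str, max_degree: int = 4) -> int:
    """Breadth-first search out to `max_degree` hops, excluding the user themselves."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        person, depth = frontier.popleft()
        if depth == max_degree:
            continue  # don't expand beyond four degrees
        for friend in friends.get(person, set()):
            if friend not in seen:
                seen.add(friend)
                frontier.append((friend, depth + 1))
    return len(seen) - 1

# Toy graph: "hub" is a Collector whose Friends are otherwise unconnected to each other.
graph = {
    "abby": {"ben", "hub"},
    "ben": {"abby"},
    "hub": {"abby"} | {f"stranger_{i}" for i in range(200)},
}
print(network_size(graph, "abby"))  # 202 -- a single Collector opens up 200 strangers
```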

While Collectors could be anyone interested in amassing many Friends, fake Profiles were developed to aid in this process. These Fakesters included characters, celebrities, objects, icons, institutions, and ideas. For example, Homer Simpson had a Profile alongside Jesus and Brown University. By connecting people with shared interests or affiliations, Fakesters supported networking between like-minded individuals. Because play and connecting were primary incentives for many Fakesters, they welcomed any and all Friends. Likewise, people who wanted access to more people connected to Fakesters. Fakesters helped centralize the network and two Fakesters — Burning Man and Ali G — reached mass popularity with over 10,000 Friends each before the Web site’s creators put an end to their collecting and deleted both accounts. This began the deletion of all Fakesters in what was eventually termed the Fakester Genocide [8].

While Friendster was irritated by fake Profiles, MySpace embraced this practice. One of MySpace’s early strategies was to provide a place for everyone who was rejected from Friendster or who didn’t want to be on a dating site [9]. Bands who had been kicked off of Friendster were some of the earliest MySpace users. Over time, movie stars, politicians, porn divas, comedians, and other celebrities joined the fray. Often, the person behind these Profiles was not the celebrity but a manager. Corporations began creating Profiles for their products and brands. While Friendster eventually began allowing such fake Profiles for a fee, MySpace never charged people for their commercial uses.

Investigating Friendship in LiveJournal, Kate Raynes-Goldie and Fono (2005) found that there was tremendous inconsistency in why people Friended others. They primarily found that Friendship stood for: content, offline facilitator, online community, trust, courtesy, declaration, or nothing. When I asked participants about their practices on Friendster and MySpace, I found very similar incentives. The most common reasons for Friendship that I heard from users [11] were:

1. Actual friends
2. Acquaintances, family members, colleagues
3. It would be socially inappropriate to say no because you know them
4. Having lots of Friends makes you look popular
5. It’s a way of indicating that you are a fan (of that person, band, product, etc.)
6. Your list of Friends reveals who you are
7. Their Profile is cool so being Friends makes you look cool
8. Collecting Friends lets you see more people (Friendster)
9. It’s the only way to see a private Profile (MySpace)
10. Being Friends lets you see someone’s bulletins and their Friends-only blog posts (MySpace)
11. You want them to see your bulletins, private Profile, private blog (MySpace)
12. You can use your Friends list to find someone later
13. It’s easier to say yes than no

These incentives account for a variety of different connections. While the first three reasons all concern people that you know, the rest can explain why people connect to a lot of people that they do not know. Most reveal how technical affordances affect people’s incentives to connect.

Raynes-Goldie and Fono (2005) also found that there is a great deal of social anxiety and drama provoked by Friending in LiveJournal (LJ). In LJ, Friendship does not require reciprocity. Anyone can list anyone else as a Friend; this articulation is public but there is no notification. The value of Friendship on LJ is deeply connected to the privacy settings and subscription processes. The norm on LJ is to read others’ entries through a “Friends page.” This page is an aggregation of all of an individual’s Friends’ posts. When someone posts an LJ entry, they have a choice as to whether the post should be public, private, Friends-only, or available to subgroups of Friends. In this way, it is necessary to be someone’s Friend to have access to Friends-only posts. To locate how the multiple and conflicting views of Friendship cause tremendous conflict and misunderstanding on LJ, Raynes-Goldie and Fono speak of “hyperfriending.” This process is quite similar to what takes place on other social network sites, but there are some differences. Because Friends-only posts are commonplace, not being someone’s Friend is a huge limitation to information access. Furthermore, because reciprocity is not structurally required, there’s a much greater social weight to recognizing someone’s Friendship and reciprocating intentionally. On MySpace and Friendster, there is little to lose by being loose with Friendship and more to gain; the perception is that there is much more to lose on LJ.
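The asymmetry Raynes-Goldie and Fono describe reduces to a simple visibility rule: whether you can read a Friends-only post depends solely on whether the author lists you, not on whether you list the author. A minimal sketch of that rule as described above; the class, field names, and group handling are assumptions for illustration, not LiveJournal’s actual implementation:

```python
# Sketch of the LJ-style visibility rule described above (not LiveJournal's actual code).
from dataclasses import dataclass, field

@dataclass
class Journal:
    owner: str
    friends: set = field(default_factory=set)         # one-directional: no reciprocity required
    friend_groups: dict = field(default_factory=dict)  # subgroups of Friends for restricted posts

    def can_read(self, reader: str, visibility: str, group=None) -> bool:
        if visibility == "public":
            return True
        if reader == self.owner:
            return True  # authors always see their own posts
        if visibility == "friends":
            return reader in self.friends
        if visibility == "group" and group is not None:
            return reader in self.friend_groups.get(group, set())
        return False  # "private" and anything unrecognized

alice = Journal("alice", friends={"bob"}, friend_groups={"close": {"bob"}})
print(alice.can_read("bob", "friends"))    # True: alice lists bob
print(alice.can_read("carol", "friends"))  # False: carol listing alice would not change this
```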

While users can scroll through their list of Friends, not all Friends are displayed on the participant’s Profile. Most social network sites display Friends in the order in which their account was created or their last login date. By implementing a “Top 8” feature, MySpace changed the social dynamics around the ordering of Friends. Initially, “Top 8” allowed users to select eight Friends to display on their Profile. More recently, that feature was changed to “Top Friends” as users have more options in how many people they could list [12]. Many users will only list people that they know and celebrities that they admire in their Top Friends, often as a way to both demarcate their identity and signal meaningful relationships with others.

There are many advantages to the Top Friends feature. It allows people to show connections that really say something about who they are. It also serves as a bookmark to the people that matter. By choosing to list the people who one visits the most frequently, simply going to one’s Profile provides a set of valuable links.

“As a kid, you used your birthday party guest list as leverage on the playground. ‘If you let me play I’ll invite you to my birthday party.’ Then, as you grew up and got your own phone, it was all about someone being on your speed dial. Well today it’s the MySpace Top 8. It’s the new dangling carrot for gaining superficial acceptance. Taking someone off your Top 8 is your new passive aggressive power play when someone pisses you off.” — Nadine

There are a handful of social norms that pervade Top 8 culture. Often, the person in the upper left (“1st” position) is a significant other, dear friend, or close family member. Reciprocity is another salient component of Top Friends dynamics. If Susan lists Mary on her Top 8, she expects Mary to reciprocate. To acknowledge this, Mary adds a Comment to Susan’s page saying, “Thanx for puttin me on ur Top 8! I put you on mine 2.” By publicly acknowledging this addition, Mary is making certain Susan’s viewers recognize Mary’s status on Susan’s list. Of course, just being in someone’s list is not always enough. As Samantha explains, “Friends get into fights because they’re not 1st on someone’s Top 8, or somebody else is before them.” While some people are ecstatic to be added, there are many more that are frustrated because they are removed or simply not listed.

The Top Friends feature requires participants to actively signal their relationship with others. Such a system makes it difficult to be vague about who matters the most, although some tried by explaining on their bulletins what theme they are using to choose their Top 8 this week: “my Sagittarius friends,” “my basketball team,” and “people whose initials are BR.” Still others relied on fake Profiles for their Top 8.

The networked nature of impressions does not only affect the viewer — this is how newcomers decided what to present in the first place. When people first joined Friendster, they took cues from the people who invited them. Three specific subcultures dominated the early adopters — bloggers, attendees of the Burning Man [14] festival, and gay men mostly living in New York. If the inviter was a Burner, their Profile would probably be filled with references to the event, with images full of half-naked, costumed people running around the desert. As such, newcomers would get the impression that it was a site for Burners and would create a Profile that displayed that facet of their identity. In deciding whom to invite, newcomers would then perpetuate the framing by only inviting people who were part of the Burning Man subculture.

Interestingly, because of this process, Burners believed that the site was for Burners, gay men thought it was a gay dating site, and bloggers were ecstatic to have a geek socializing tool. The reason each group got this impression had to do with the way in which context was created on these systems. Rather than having the context dictated by the environment itself, context emerged through Friends networks. As a result, being socialized into Friendster meant connecting to Friends that reinforced the contextual information of early adopters.

The growth of MySpace followed a similar curve. One of the key early adopter groups was hipsters living in the Silverlake neighborhood of Los Angeles. They were passionate about indie rock music and many were musicians, promoters, club goers, etc. As MySpace took hold, long before any press was covering the site, MySpace took off amongst 20/30-something urban socializers, musicians, and teenagers. The latter group may not appear obvious, but teenagers are some of the most active music consumers — they follow music culture avidly, even when they are unable to see the bands play live due to age restrictions. As the site grew, the teenagers and 20/30-somethings pretty much left each other alone, although bands bridged these groups. It was not until the site was sold to News Corp. for US$580 million in the summer of 2005 that the press began covering the phenomenon. The massive press helped it grow larger, penetrating those three demographics more deeply but also attracting new populations, namely adults who are interested in teenagers (parents, teachers, pedophiles, marketers).

When context is defined by whom one Friends, and addressing multiple audiences simultaneously complicates all relationships, people must make hard choices. Joshua Meyrowitz (1985) highlights this problem in reference to television. In the early 1960s, Stokely Carmichael regularly addressed segregated black and white audiences about the values of Black Power. Depending on his audience, he used very different rhetorical styles. As his popularity grew, he began to attract media attention and was invited to speak on TV and radio. Unfortunately, this was more of a curse than a blessing because the audiences he would reach through these mediums included both black and white communities. With no way to reconcile the two different rhetorical styles, he had to choose. In choosing to maintain his roots in front of white listeners, Carmichael permanently alienated white society from the messages of Black Power.

Notes

10. Friendster originally limited users to 150 Friends. It is no accident that they chose 150, as this is the “Dunbar number.” In his research on gossip and grooming, Robin Dunbar argues that there is a cognitive limit to the number of relations that one can maintain. People can only keep gossip with 150 people at any given time (Dunbar, 1998). By capping Friends at 150, Friendster either misunderstood Dunbar or did not realize that their users were actually connecting to friends from the past with whom they are not currently engaging.

12. Eight was the maximum number of Friends that the system initially let people have. Some users figured out how to hack the system to display more Friends; there are entire bulletin boards dedicated to teaching others how to hack this. Consistently, upping the limit was the number one request that the company received. In the spring of 2006, MySpace launched an ad campaign for X-Men. In return for Friending X-Men, users were given the option to have 12, 16, 20, or 24 Friends in their Top Friends section. Millions of users did exactly that. In late June, this feature was introduced to everyone, regardless of Friending X-Men. While eight is no longer the limit, people move between calling it Top 8 or Top Friends. I will use both terms interchangeably, even when the number of Friends might be greater than eight.

Many layers of cloud computing, or just one?

From Nicholas Carr’s “Further musings on the network effect and the cloud” (Rough Type: 27 October 2008):

I think O’Reilly did a nice job of identifying the different layers of the cloud computing business – infrastructure, development platform, applications – and I think he’s right that they’ll have different economic and competitive characteristics. One thing we don’t know yet, though, is whether those layers will in the long run exist as separate industry sectors or whether they’ll collapse into a single supply model. In other words, will the infrastructure suppliers also come to dominate the supply of apps? Google and Microsoft are obviously trying to play across all three layers, while Amazon so far seems content to focus on the infrastructure business and Salesforce is expanding from the apps layer to the development platform layer. The degree to which the layers remain, or don’t remain, discrete business sectors will play a huge role in determining the ultimate shape, economics, and degree of consolidation in cloud computing.

Let me end on a speculative note: There’s one layer in the cloud that O’Reilly failed to mention, and that layer is actually on top of the application layer. It’s what I’ll call the device layer – encompassing all the various appliances people will use to tap the cloud – and it may ultimately come to be the most interesting layer. A hundred years ago, when Tesla, Westinghouse, Insull, and others were building the cloud of that time – the electric grid – companies viewed the effort in terms of the inputs to their business: in particular, the power they needed to run the machines that produced the goods they sold. But the real revolutionary aspect of the electric grid was not the way it changed business inputs – though that was indeed dramatic – but the way it changed business outputs. After the grid was built, we saw an avalanche of new products outfitted with electric cords, many of which were inconceivable before the grid’s arrival. The real fortunes were made by those companies that thought most creatively about the devices that consumers would plug into the grid. Today, we’re already seeing hints of the device layer – of the cloud as output rather than input. Look at the way, for instance, that the little old iPod has shaped the digital music cloud.
