
How the fundamentalist thinks

From ScienceDaily’s “Brain Differences Found Between Believers In God And Non-believers” (5 March 2009):

In two studies led by Assistant Psychology Professor Michael Inzlicht, participants performed a Stroop task – a well-known test of cognitive control – while hooked up to electrodes that measured their brain activity.

Compared to non-believers, the religious participants showed significantly less activity in the anterior cingulate cortex (ACC), a portion of the brain that helps modify behavior by signaling when attention and control are needed, usually as a result of some anxiety-producing event like making a mistake. The stronger their religious zeal and the more they believed in God, the less their ACC fired in response to their own errors, and the fewer errors they made.

“You could think of this part of the brain like a cortical alarm bell that rings when an individual has just made a mistake or experiences uncertainty,” says lead author Inzlicht, who teaches and conducts research at the University of Toronto Scarborough. “We found that religious people or even people who simply believe in the existence of God show significantly less brain activity in relation to their own errors. They’re much less anxious and feel less stressed when they have made an error.”

“Obviously, anxiety can be negative because if you have too much, you’re paralyzed with fear,” [Inzlicht] says. “However, it also serves a very useful function in that it alerts us when we’re making mistakes. If you don’t experience anxiety when you make an error, what impetus do you have to change or improve your behaviour so you don’t make the same mistakes again and again?”


Add houseplants to your home & office

From David Pogue’s “TED’s Greatest Hits” (The New York Times: 10 February 2009):

Kamal Meattle reported the results of his efforts to fill an office building with plants, in an effort to reduce headaches, asthma, and other productivity-sapping ailments in thickly polluted India. After researching NASA documents, he concluded that three particular common, waist-high houseplants—areca palm, Mother-in-Law’s Tongue, and Money Plant—could be combined to scrub the air of carbon dioxide, formaldehyde and other pollutants.

At about four plants per occupant (1200 plants in all), the building’s air freshened considerably, and the health and productivity results were staggering. Eye irritation dropped by 52 percent, lower respiratory symptoms by 34 percent, headaches by 24 percent and asthma by 9 percent. There were fewer sick days, employee productivity increased, and energy costs dropped by 15 percent.


The color of the TV you watch determines the color of your dreams

From Richard Alleyne’s “Black and white TV generation have monochrome dreams” (The Telegraph: 17 October 2008):

New research suggests that the type of television you watched as a child has a profound effect on the colour of your dreams.

While almost all under 25s dream in colour, thousands of over 55s, all of whom were brought up with black and white sets, often dream in monochrome – even now.

Research from 1915 through to the 1950s suggested that the vast majority of dreams are in black and white but the tide turned in the sixties, and later results suggested that up to 83 per cent of dreams contain some colour.

Since this period also marked the transition between black-and-white film and TV and widespread Technicolor, an obvious explanation was that the media had been priming the subjects’ dreams.


DIY genetic engineering

From Marcus Wohlsen’s “Amateurs are trying genetic engineering at home” (AP: 25 December 2008):

Now, tinkerers are working at home with the basic building blocks of life itself.

Using homemade lab equipment and the wealth of scientific knowledge available online, these hobbyists are trying to create new life forms through genetic engineering — a field long dominated by Ph.D.s toiling in university and corporate laboratories.

In her San Francisco dining room lab, for example, 31-year-old computer programmer Meredith L. Patterson is trying to develop genetically altered yogurt bacteria that will glow green to signal the presence of melamine, the chemical that turned Chinese-made baby formula and pet food deadly.

Many of these amateurs may have studied biology in college but have no advanced degrees and are not earning a living in the biotechnology field. Some proudly call themselves “biohackers” — innovators who push technological boundaries and put the spread of knowledge before profits.

In Cambridge, Mass., a group called DIYbio is setting up a community lab where the public could use chemicals and lab equipment, including a used freezer, scored for free off Craigslist, that drops to 80 degrees below zero, the temperature needed to keep many kinds of bacteria alive.

Patterson, the computer programmer, wants to insert the gene for fluorescence into yogurt bacteria, applying techniques developed in the 1970s.

She learned about genetic engineering by reading scientific papers and getting tips from online forums. She ordered jellyfish DNA for a green fluorescent protein from a biological supply company for less than $100. And she built her own lab equipment, including a gel electrophoresis chamber, or DNA analyzer, which she constructed for less than $25, versus more than $200 for a low-end off-the-shelf model.


How it feels to drown, get decapitated, get electrocuted, and more

From Anna Gosline’s “Death special: How does it feel to die?” (New Scientist: 13 October 2007):

Death comes in many guises, but one way or another it is usually a lack of oxygen to the brain that delivers the coup de grâce. Whether as a result of a heart attack, drowning or suffocation, for example, people ultimately die because their neurons are deprived of oxygen, leading to cessation of electrical activity in the brain – the modern definition of biological death.

If the flow of freshly oxygenated blood to the brain is stopped, through whatever mechanism, people tend to have about 10 seconds before losing consciousness. They may take many more minutes to die, though, with the exact mode of death affecting the subtleties of the final experience.

Drowning

Typically, when a victim realises that they cannot keep their head above water they tend to panic, leading to the classic “surface struggle”. They gasp for air at the surface and hold their breath as they bob beneath, says Mike Tipton, a physiologist at the University of Portsmouth, UK. Struggling to breathe, they can’t call for help. Their bodies are upright, arms weakly grasping, as if trying to climb a non-existent ladder from the sea. Studies with New York lifeguards in the 1950s and 1960s found that this stage lasts just 20 to 60 seconds.

When victims eventually submerge, they hold their breath for as long as possible, typically 30 to 90 seconds. After that, they inhale some water, splutter, cough and inhale more. Water in the lungs blocks gas exchange in delicate tissues, while inhaling water also triggers the airway to seal shut – a reflex called a laryngospasm. “There is a feeling of tearing and a burning sensation in the chest as water goes down into the airway. Then that sort of slips into a feeling of calmness and tranquility,” says Tipton, describing reports from survivors.

That calmness represents the beginnings of the loss of consciousness from oxygen deprivation, which eventually results in the heart stopping and brain death.

Heart attack

The most common symptom is, of course, chest pain: a tightness, pressure or squeezing, often described as an “elephant on my chest”, which may be lasting or come and go. This is the heart muscle struggling and dying from oxygen deprivation. Pain can radiate to the jaw, throat, back, belly and arms. Other signs and symptoms include shortness of breath, nausea and cold sweats.

Most victims delay before seeking assistance, waiting an average of 2 to 6 hours. Women are the worst, probably because they are more likely to experience less well-known symptoms, such as breathlessness, back or jaw pain, or nausea, says JoAnn Manson, an epidemiologist at Harvard Medical School.

Even small heart attacks can play havoc with the electrical impulses that control heart muscle contraction, effectively stopping it. In about 10 seconds the person loses consciousness, and minutes later they are dead.

Bleeding to death

People can bleed to death in seconds if the aorta, the major blood vessel leading from the heart, is completely severed, for example, after a severe fall or car accident.

Death could creep up much more slowly if a smaller vein or artery is nicked – even taking hours. Such victims would experience several stages of haemorrhagic shock. The average adult has 5 litres of blood. Losses of around 750 millilitres generally cause few symptoms. Anyone losing 1.5 litres – either through an external wound or internal bleeding – feels weak, thirsty and anxious, and would be breathing fast. By 2 litres, people experience dizziness, confusion and then eventual unconsciousness.
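The staged symptoms above map naturally onto a small lookup. Here is a minimal Python sketch: the thresholds and symptom lists come from the article, while the band between 0.75 and 1.5 litres is not labelled in the text and is marked as such. It is illustrative only, not a medical reference.

```python
def haemorrhagic_shock_symptoms(loss_litres):
    """Rough symptom bands for blood loss in an average adult
    (total volume ~5 litres), per the stages quoted above.
    Illustrative only -- not a medical tool."""
    if loss_litres >= 2.0:
        return "dizziness, confusion, then unconsciousness"
    if loss_litres >= 1.5:
        return "weakness, thirst, anxiety, rapid breathing"
    if loss_litres >= 0.75:
        return "worsening symptoms (band not detailed in the article)"
    return "few symptoms"

print(haemorrhagic_shock_symptoms(0.5))  # few symptoms
print(haemorrhagic_shock_symptoms(1.6))  # weakness, thirst, anxiety, rapid breathing
```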

Fire

Long the fate of witches and heretics, burning to death is torture. Hot smoke and flames singe eyebrows and hair and burn the throat and airways, making it hard to breathe. Burns inflict immediate and intense pain through stimulation of the nociceptors – the pain nerves in the skin. To make matters worse, burns also trigger a rapid inflammatory response, which boosts sensitivity to pain in the injured tissues and surrounding areas.

Most people who die in fires do not in fact die from burns. The most common cause of death is inhaling toxic gases – carbon monoxide, carbon dioxide and even hydrogen cyanide – together with the suffocating lack of oxygen. One study of fire deaths in Norway from 1996 found that almost 75 per cent of the 286 people autopsied had died from carbon monoxide poisoning.

Depending on the size of the fire and how close you are to it, concentrations of carbon monoxide could start to cause headache and drowsiness in minutes, eventually leading to unconsciousness. According to the US National Fire Protection Association, 40 per cent of the victims of fatal home fires are knocked out by fumes before they can even wake up.

Decapitation

Beheading, if somewhat gruesome, can be one of the quickest and least painful ways to die – so long as the executioner is skilled, his blade sharp, and the condemned sits still.

Quick it may be, but consciousness is nevertheless believed to continue after the spinal cord is severed. A study in rats in 1991 found that it takes 2.7 seconds for the brain to consume the oxygen from the blood in the head; the equivalent figure for humans has been calculated at 7 seconds.

It took the axeman three attempts to sever the head of Mary Queen of Scots in 1587. He had to finish the job with a knife.

Decades earlier in 1541, Margaret Pole, the Countess of Salisbury, was executed at the Tower of London. She was dragged to the block, but refused to lay her head down. The inexperienced axeman made a gash in her shoulder rather than her neck. According to some reports, she leapt from the block and was chased by the executioner, who struck 11 times before she died.

Electrocution

In accidental electrocutions, usually involving low, household current, the most common cause of death is arrhythmia, stopping the heart dead. Unconsciousness ensues after the standard 10 seconds, says Richard Trohman, a cardiologist at Rush University in Chicago. One study of electrocution deaths in Montreal, Canada found that 92 per cent had probably died from arrhythmia.

Higher currents can produce nearly immediate unconsciousness.

Fall from a height

A high fall is certainly among the speediest ways to die: terminal velocity (no pun intended) is about 200 kilometres per hour, achieved from a height of about 145 metres or more. A study of deadly falls in Hamburg, Germany, found that 75 per cent of victims died in the first few seconds or minutes after landing.
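The quoted figures are easy to sanity-check with the drag-free free-fall relation v = √(2gh); a short Python sketch follows. Air resistance is what actually caps speed at terminal velocity, so the ideal formula only approximates the numbers in the text.

```python
import math

def impact_speed_kmh(height_m, g=9.81):
    """Ideal free-fall impact speed, ignoring air resistance: v = sqrt(2*g*h)."""
    return math.sqrt(2 * g * height_m) * 3.6  # convert m/s to km/h

print(round(impact_speed_kmh(145)))  # ~192 km/h, close to the quoted ~200 km/h
print(round(impact_speed_kmh(75)))   # ~138 km/h for the Golden Gate Bridge;
                                     # drag brings the quoted figure down to ~120
```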

The exact cause of death varies, depending on the landing surface and the person’s posture. People are especially unlikely to arrive at the hospital alive if they land on their head – more common for shorter (under 10 metres) and higher (over 25 metres) falls. A 1981 analysis of 100 suicidal jumps from the Golden Gate Bridge in San Francisco – height: 75 metres, velocity on impact with the water: 120 kilometres per hour – found numerous causes of instantaneous death including massive lung bruising, collapsed lungs, exploded hearts or damage to major blood vessels and lungs through broken ribs.

Survivors of great falls often report the sensation of time slowing down. The natural reaction is to struggle to maintain a feet-first landing, resulting in fractures to the leg bones, lower spinal column and life-threatening broken pelvises. The impact travelling up through the body can also burst the aorta and heart chambers. Yet this is probably still the safest way to land, despite the force being concentrated in a small area: the feet and legs form a “crumple zone” which provides some protection to the major internal organs.

Some experienced climbers or skydivers who have survived a fall report feeling focused, alert and driven to ensure they landed in the best way possible: relaxed, legs bent and, where possible, ready to roll.

Hanging

Suicides and old-fashioned “short drop” executions cause death by strangulation; the rope puts pressure on the windpipe and the arteries to the brain. This can cause unconsciousness in 10 seconds, but it takes longer if the noose is incorrectly sited. Witnesses of public hangings often reported victims “dancing” in pain at the end of the rope, struggling violently as they asphyxiated. Death only ensues after many minutes, as shown by the numerous people who have been resuscitated after being cut down – even after 15 minutes.

When public executions were outlawed in Britain in 1868, hangmen looked for a less performance-oriented approach. They eventually adopted the “long-drop” method, using a lengthier rope so the victim reached a speed that broke their neck. The drop had to be tailored to the victim’s weight, however, as too great a force could rip the head clean off, a professionally embarrassing outcome for the hangman.

Despite the public boasting of several prominent executioners in late 19th-century Britain, a 1992 analysis of the remains of 34 prisoners found that in only about half of cases was the cause of death wholly or partly due to spinal trauma. Just one-fifth showed the classic “hangman’s fracture” between the second and third cervical vertebrae. The others died in part from asphyxiation.


Michael Spence, an anthropologist at the University of Western Ontario in London, Canada, has found similar results in US victims. He concluded, however, that even if asphyxiation played a role, the trauma of the drop would have rapidly rendered all of them unconscious. “What the hangmen were looking for was quick cessation of activity,” he says. “And they knew enough about their craft to ensure that happened. The thing they feared most was decapitation.”
Lethal injection

US-government approved, but is it really painless?

Lethal injection was designed in Oklahoma in 1977 as a humane alternative to the electric chair. The state medical examiner and chair of anaesthesiology settled on a series of three drug injections. First comes the anaesthetic thiopental to speed away any feelings of pain, followed by a paralytic agent called pancuronium to stop breathing. Finally potassium chloride is injected, which stops the heart almost instantly.

Each drug is supposed to be administered in a lethal dose, a redundancy to ensure speedy and humane death. However, eyewitnesses have reported inmates convulsing, heaving and attempting to sit up during the procedure, suggesting the cocktail is not always completely effective.

Explosive decompression

In real life there has been just one fatal space depressurisation accident. This occurred on the Russian Soyuz-11 mission in 1971, when a seal leaked upon re-entry into the Earth’s atmosphere; upon landing all three flight crew were found dead from asphyxiation.

Most of our knowledge of depressurisation comes from animal experiments and the experiences of pilots in accidents at very high altitudes. When the external air pressure suddenly drops, the air in the lungs expands, tearing the fragile gas exchange tissues. This is especially damaging if the victim neglects to exhale prior to decompression or tries to hold their breath. Oxygen begins to escape from the blood and lungs.

Experiments on dogs in the 1950s showed that 30 to 40 seconds after the pressure drops, their bodies began to swell as the water in tissues vaporised, though the tight seal of their skin prevented them from “bursting”. The heart rate rises initially, then plummets. Bubbles of water vapour form in the blood and travel through the circulatory system, obstructing blood flow. After about a minute, blood effectively stops circulating.

Human survivors of rapid decompression accidents include pilots whose planes lost pressure, or in one case a NASA technician who accidentally depressurised his flight suit inside a vacuum chamber. They often report an initial pain, like being hit in the chest, and may remember feeling air escape from their lungs and the inability to inhale. Time to the loss of consciousness was generally less than 15 seconds.


Hallucinating the presence of the dead

From Vaughan Bell’s “Ghost Stories: Visits from the Deceased” (Scientific American: 2 December 2008):

The dead stay with us, that much is clear. They remain in our hearts and minds, of course, but for many people they also linger in our senses—as sights, sounds, smells, touches or presences. Grief hallucinations are a normal reaction to bereavement but are rarely discussed, because people fear they might be considered insane or mentally destabilised by their loss. As a society we tend to associate hallucinations with things like drugs and mental illness, but we now know that hallucinations are common in sober healthy people and that they are more likely during times of stress.

Mourning seems to be a time when hallucinations are particularly common, to the point where feeling the presence of the deceased is the norm rather than the exception. One study, by the researcher Agneta Grimby at the University of Goteborg, found that over 80 percent of elderly people experience hallucinations associated with their dead partner one month after bereavement, as if their perception had yet to catch up with the knowledge of their beloved’s passing. As a marker of how vivid such visions can seem, almost a third of the people reported that they spoke in response to their experiences. In other words, these weren’t just peripheral illusions: they could evoke the very essence of the deceased.

Occasionally, these hallucinations are heart-rending. A 2002 case report by German researchers described how a middle aged woman, grieving her daughter’s death from a heroin overdose, regularly saw the young girl and sometimes heard her say “Mamma, Mamma!” and “It’s so cold.” Thankfully, these distressing experiences tend to be rare, and most people who experience hallucinations during bereavement find them comforting, as if they were re-connecting with something of the positive from the person’s life. Perhaps this reconnecting is reflected in the fact that the intensity of grief has been found to predict the number of pleasant hallucinations, as has the happiness of the marriage to the person who passed away.

There are hints that the type of grief hallucinations might also differ across cultures. Anthropologists have told us a great deal about how the ceremonies, beliefs and the social rituals of death differ greatly across the world, but we have few clues about how these different approaches affect how people experience the dead after they have gone.


A woman who never forgets anything

From Samiha Shafy’s “An Infinite Loop in the Brain” (Der Spiegel: 21 November 2008):

Price can rattle off, without hesitation, what she saw and heard on almost any given date. She remembers many early childhood experiences and most of the days between the ages of 9 and 15. After that, there are virtually no gaps in her memory. “Starting on Feb. 5, 1980, I remember everything. That was a Tuesday.”

“People say to me: Oh, how fascinating, it must be a treat to have a perfect memory,” she says. Her lips twist into a thin smile. “But it’s also agonizing.”

In addition to good memories, every angry word, every mistake, every disappointment, every shock and every moment of pain goes unforgotten. Time heals no wounds for Price. “I don’t look back at the past with any distance. It’s more like experiencing everything over and over again, and those memories trigger exactly the same emotions in me. It’s like an endless, chaotic film that can completely overpower me. And there’s no stop button.”

She’s constantly bombarded with fragments of memories, exposed to an automatic and uncontrollable process that behaves like an infinite loop in a computer. Sometimes there are external triggers, like a certain smell, song or word. But often her memories return by themselves. Beautiful, horrific, important or banal scenes rush across her wildly chaotic “internal monitor,” sometimes displacing the present. “All of this is incredibly exhausting,” says Price.

The scientists were able to verify her autobiographical data because she has meticulously kept a diary since the age of 10. She has filled more than 50,000 pages with tiny writing, documenting every occurrence, no matter how insignificant. Writing things down helps Price organize the thoughts and images shimmering in her head.

In fact, she feels a strong need to document her life. This includes hoarding every possible memento from childhood, including dolls, stuffed animals, cassette tapes, books, a drawer from a dresser she had when she was five. “I have to be able to touch my memories,” Price explains.

[James McGaugh, founder of the Center for the Neurobiology of Learning and Memory at the University of California in Irvine,] and his colleagues concluded that Price’s episodic memory, her recollection of personal experiences and the emotions associated with them, is virtually perfect. A case like this has never been described in the history of memory research, according to McGaugh. He explains that Price differs substantially from other people with special powers of recall, such as autistic savants, because she uses no strategies to help her remember and even does a surprisingly poor job on some memory tests.

It’s difficult for her to memorize poems or series of numbers — which helps explain why she never stood out in school. Her semantic memory, the ability to remember facts not directly related to everyday life, is only average.

Two years ago, the scientists published their first conclusions in a professional journal without revealing the identity of their subject. Since then, more than 200 people have contacted McGaugh, all claiming to have an equally perfect episodic memory. Most of them were exposed as fakes. Three did appear to have similarly astonishing abilities. “Their personalities are very different. The others are not as anxious as Jill. But they achieve comparable results in the tests,” McGaugh reports.

The subjects do have certain compulsive traits in common, says McGaugh, especially compulsive hoarding. The three others are left-handed, and Price also showed a tendency toward left-handedness in tests.

In neurobiological terms, a memory is a stored pattern of links between nerve cells in the brain. It is created when synapses in a network of neurons are activated for a short time. The more often the memory is recalled afterwards, the more likely it is that permanent links develop between the nerve cells — and the pattern will be stored as a long-term memory. In theory there are so many possible links that an almost unlimited number of memories can be permanently stored.

So why don’t all people have the same powers of recollection as Jill Price? “If we could remember everything equally well, the brain would be hopelessly overburdened and would operate more slowly,” says McGaugh. He says forgetting is a necessary condition of having a viable memory — except in the case of Price and the other three memory superstars.


An analysis of Google’s technology, 2005

From Stephen E. Arnold’s The Google Legacy: How Google’s Internet Search is Transforming Application Software (Infonortics: September 2005):

The figure Google’s Fusion: Hardware and Software Engineering shows that Google’s technology framework has two areas of activity. There is the software engineering effort that focuses on PageRank and other applications. Software engineering, as used here, means writing code and thinking about how computer systems operate in order to get work done quickly. Quickly means the sub one-second response times that Google is able to maintain despite its surging growth in usage, applications and data processing.

Google is hardware plus software

The other effort focuses on hardware. Google has refined server racks, cable placement, cooling devices, and data center layout. The payoff is lower operating costs and the ability to scale as demand for computing resources increases. With faster turnaround and the elimination of such troublesome jobs as backing up data, Google’s hardware innovations give it a competitive advantage few of its rivals can equal as of mid-2005.

How Google Is Different from MSN and Yahoo

Google’s technology is simultaneously just like other online companies’ technology, and very different. A data center is usually a facility owned and operated by a third party where customers place their servers. The staff of the data center manage the power, air conditioning and routine maintenance. The customer specifies the computers and components. When a data center must expand, the staff of the facility may handle virtually all routine chores and may work with the customer’s engineers for certain more specialized tasks.

Before looking at some significant engineering differences between Google and two of its major competitors, review this list of characteristics for a Google data center.

1. Google data centers now number about two dozen, although no one outside Google knows the exact number or their locations. When a new facility comes online, it automatically, under the direction of the Google File System, starts taking work from other data centers. These facilities, sometimes filled with 10,000 or more Google computers, find one another and configure themselves with minimal human intervention.

2. The hardware in a Google data center can be bought at a local computer store. Google uses the same types of memory, disc drives, fans and power supplies as those in a standard desktop PC.

3. Each Google server comes in a standard case called a pizza box with one important change: the plugs and ports are at the front of the box to make access faster and easier.

4. Google racks are assembled for Google to hold servers on their front and back sides. This effectively allows a standard rack, normally holding 40 pizza box servers, to hold 80.

5. A Google data center can go from a stack of parts to online operation in as little as 72 hours, unlike more typical data centers that can require a week or even a month to get additional resources online.

6. Each server, rack and data center works in a way that is similar to what is called “plug and play.” Like a mouse plugged into the USB port on a laptop, Google’s network of data centers knows when more resources have been connected. These resources, for the most part, go into operation without human intervention.
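The rack arithmetic in points 1 and 4 is straightforward to check. A toy calculation in Python: the 40-servers-per-standard-rack and 10,000-servers-per-facility figures come from the text above; everything else is illustrative.

```python
STANDARD_RACK = 40               # pizza-box servers in a conventional rack
GOOGLE_RACK = 2 * STANDARD_RACK  # servers mounted front and back: 80

def racks_needed(servers, per_rack):
    """Ceiling division: racks required to house a given server count."""
    return -(-servers // per_rack)

print(racks_needed(10_000, STANDARD_RACK))  # 250 conventional racks
print(racks_needed(10_000, GOOGLE_RACK))    # 125 double-sided racks
```

Mounting servers on both faces of the rack halves the floor-space and rack count for the same machine population, which is exactly the payoff the list describes.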

Several of these factors are dependent on software. This overlap between the hardware and software competencies at Google, as previously noted, illustrates the symbiotic relationship between these two different engineering approaches. At Google, from its inception, Google software and Google hardware have been tightly coupled. Google is not a software company nor is it a hardware company. Google is, like IBM, a company that owes its existence to both hardware and software. Unlike IBM, Google has a business model that is advertiser supported. Technically, Google is conceptually closer to IBM (at one time a hardware and software company) than it is to Microsoft (primarily a software company) or Yahoo! (an integrator of multiple softwares).

Software and hardware engineering cannot be easily segregated at Google. At MSN and Yahoo, hardware and software are more loosely coupled. Two examples will illustrate these differences.

Microsoft – with some minor excursions into the Xbox game machine and peripherals – develops operating systems and traditional applications. Microsoft has multiple operating systems, and its engineers are hard at work on the company’s next-generation of operating systems.

Several observations are warranted:

1. Unlike Google, Microsoft does not focus on performance as an end in itself. As a result, Microsoft gets performance the way most computer users do. Microsoft buys or upgrades machines. Microsoft does not fiddle with its operating systems and their subfunctions to get that extra time slice or two out of the hardware.

2. Unlike Google, Microsoft has to support many operating systems and invest time and energy in making certain that important legacy applications such as Microsoft Office or SQLServer can run on these new operating systems. Microsoft has a boat anchor tied to its engineer’s ankles. The boat anchor is the need to ensure that legacy code works in Microsoft’s latest and greatest operating systems.

3. Unlike Google, Microsoft has no significant track record in designing and building hardware for distributed, massively parallelised computing. The mice and keyboards were a success. Microsoft has continued to lose money on the Xbox, and the sudden demise of Microsoft’s entry into the home network hardware market provides more evidence that Microsoft does not have a hardware competency equal to Google’s.

Yahoo! operates differently from both Google and Microsoft. Yahoo! is in mid-2005 a direct competitor to Google for advertising dollars. Yahoo! has grown through acquisitions. In search, for example, Yahoo acquired 3721.com to handle Chinese language search and retrieval. Yahoo bought Inktomi to provide Web search. Yahoo bought Stata Labs in order to provide users with search and retrieval of their Yahoo! mail. Yahoo! also owns AllTheWeb.com, a Web search site created by FAST Search & Transfer. Yahoo! owns the Overture search technology used by advertisers to locate key words to bid on. Yahoo! owns Alta Vista, the Web search system developed by Digital Equipment Corp. Yahoo! licenses InQuira search for customer support functions. Yahoo has a jumble of search technology; Google has one search technology.

Historically Yahoo has acquired technology companies and allowed each company to operate its technology in a silo. Integration of these different technologies is a time-consuming, expensive activity for Yahoo. Each of these software applications requires servers and systems particular to each technology. The result is that Yahoo has a mosaic of operating systems, hardware and systems. Yahoo!’s problem is different from Microsoft’s legacy boat-anchor problem. Yahoo! faces a Balkan-states problem.

There are many voices, many needs, and many opposing interests. Yahoo! must invest in management resources to keep the peace. Yahoo! does not have a core competency in hardware engineering for performance and consistency. Yahoo! may well have considerable competency in supporting a crazy-quilt of hardware and operating systems, however. Yahoo! is not a software engineering company. Its engineers make functions from disparate systems available via a portal.

The figure below provides an overview of the mid-2005 technical orientation of Google, Microsoft and Yahoo.

2005 focuses of Google, MSN, and Yahoo

The Technology Precepts

… five precepts thread through Google’s technical papers and presentations. The following snapshots are extreme simplifications of complex, yet extremely fundamental, aspects of the Googleplex.

Cheap Hardware and Smart Software

Google approaches the problem of reducing the costs of hardware, setup, burn-in and maintenance pragmatically. A large number of cheap devices using off-the-shelf commodity controllers, cables and memory reduces costs. But cheap hardware fails.

In order to minimize the “cost” of failure, Google conceived of smart software that would perform whatever tasks were needed when hardware devices fail. A single device or an entire rack of devices could crash, and the overall system would not fail. More important, when such a crash occurs, no full-time systems engineering team has to perform technical triage at 3 a.m.

The focus on low-cost, commodity hardware and smart software is part of the Google culture.

Logical Architecture

Google’s technical papers do not describe the architecture of the Googleplex as self-similar. Google’s technical papers provide tantalizing glimpses of an approach to online systems that makes a single server share features and functions of a cluster of servers, a complete data center, and a group of Google’s data centers.

The collection of servers running Google applications on the Google version of Linux is a supercomputer. The Googleplex can perform mundane computing chores like taking a user’s query and matching it to documents Google has indexed. Furthermore, the Googleplex can perform the side calculations needed to embed ads in the results pages shown to users, execute parallelized, high-speed data transfers like computers running state-of-the-art storage devices, and handle necessary housekeeping chores for usage tracking and billing.

When Google needs to add processing capacity or additional storage, Google’s engineers plug in the needed resources. Due to self-similarity, the Googleplex can recognize, configure and use the new resource. Google has an almost unlimited flexibility with regard to scaling and accessing the capabilities of the Googleplex.

In Google’s self-similar architecture, the loss of an individual device is irrelevant. In fact, a rack or a data center can fail without data loss or taking the Googleplex down. The Google operating system ensures that each file is written three to six times to different storage devices. When a copy of that file is not available, the Googleplex consults a log for the location of the copies of the needed file. The application then uses that replica of the needed file and continues with the job’s processing.
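The failover behavior described above can be sketched in a few lines. This is an illustrative toy, not Google’s actual file-system code; the file name, server names, and the replica-log structure are invented for the example, with the 3x replication factor taken from the text:

```python
# Hypothetical replica log: file name -> list of servers holding a copy.
REPLICA_LOG = {
    "crawl/index-0042": ["server-a17", "server-b03", "server-c29"],
}

def read_file(name, is_alive):
    """Return the first live replica holding `name`, or None if all are gone."""
    for replica in REPLICA_LOG.get(name, []):
        if is_alive(replica):
            return replica          # the job continues on this copy
    return None                     # every copy lost (very unlikely at 3-6x)

# Simulate the loss of an entire rack holding server-a17:
dead = {"server-a17"}
print(read_file("crawl/index-0042", lambda s: s not in dead))  # server-b03
```

Because the lookup falls through to the next replica automatically, the loss of a device is invisible to the running job, which is the point of the design.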

Speed and Then More Speed

Google uses commodity pizza box servers organized in a cluster. A cluster is a group of computers that are joined together to create a more robust system. Instead of using exotic servers with eight or more processors, Google generally uses servers that have two processors similar to those found in a typical home computer.

Through proprietary changes to Linux and other engineering innovations, Google is able to achieve supercomputer performance from components that are cheap and widely available.

… engineers familiar with Google believe that read rates may in some clusters approach 2,000 megabytes a second. When commodity hardware gets better, Google runs faster without paying a premium for that performance gain.

Another key notion of speed at Google concerns writing computer programs to deploy to Google users. Google has developed short cuts to programming. An example is Google’s creating a library of canned functions to make it easy for a programmer to optimize a program to run on the Googleplex computer. At Microsoft or Yahoo, a programmer must write some code or fiddle with code to get different pieces of a program to execute simultaneously using multiple processors. Not at Google. A programmer writes a program, uses a function from a Google bundle of canned routines, and lets the Googleplex handle the details. Google’s programmers are freed from much of the tedium associated with writing software for a distributed, parallel computer.
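The “canned routine” idea can be illustrated with a MapReduce-style word count. This is a sketch, not Google’s actual library: the function names are invented, and Python’s `multiprocessing` pool stands in for the Googleplex. The programmer writes only the per-document logic; the library call handles spreading the work across processors:

```python
from collections import Counter
from multiprocessing import Pool

def map_words(doc):
    """Map step: one worker counts the words in one document."""
    return Counter(doc.split())

def parallel_count(docs):
    """The 'canned routine': call it and let the library
    distribute the map work across processors, then reduce."""
    with Pool() as pool:
        partials = pool.map(map_words, docs)   # parallel map
    total = Counter()
    for part in partials:                      # reduce step
        total += part
    return total

if __name__ == "__main__":
    docs = ["google runs fast", "google scales", "fast fast"]
    print(parallel_count(docs)["fast"])  # 3
```

The tedium the text mentions (partitioning work, synchronizing results) lives inside `parallel_count`; the application programmer never sees it.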

Eliminate or Reduce Certain System Expenses

Some lucky investors jumped on the Google bandwagon early. Nevertheless, Google was frugal, partly by necessity and partly by design. The focus on frugality influenced many hardware and software engineering decisions at the company.

Drawbacks of the Googleplex

The Laws of Physics: Heat and Power 101

In reality, no one knows. Google has a rapidly expanding number of data centers. The data center near Atlanta, Georgia, is one of the newest deployed. This state-of-the-art facility reflects what Google engineers have learned about heat and power issues in its other data centers. Within the last 12 months, Google has shifted from concentrating its servers at about a dozen data centers, each with 10,000 or more servers, to about 60 data centers, each with fewer machines. The change is a response to the heat and power issues associated with larger concentrations of Google servers.

The most failure-prone components are:

  • Fans.
  • IDE drives, which fail at the rate of one per 1,000 drives per day.
  • Power supplies, which fail at a lower rate.
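At that drive-failure rate, the maintenance load scales directly with fleet size. The quick check below uses a hypothetical 100,000-drive fleet (the fleet size is my assumption for illustration; only the one-per-1,000-per-day rate comes from the list above):

```python
DRIVES = 100_000           # hypothetical fleet size
FAIL_RATE = 1 / 1_000      # failures per drive per day, per the list above

expected_daily_failures = DRIVES * FAIL_RATE
print(expected_daily_failures)   # 100.0 drives to swap every single day
```

A hundred dead drives a day is exactly why the smart-software approach matters: replacement has to be routine, not an emergency.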

Leveraging the Googleplex

Google’s technology is one major challenge to Microsoft and Yahoo. So to conclude this cursory and vastly simplified look at Google technology, consider these items:

1. Google is fast anywhere in the world.

2. Google learns. When the heat and power problems at dense data centers surfaced, Google introduced cooling and power conservation innovations to its two dozen data centers.

3. Programmers want to work at Google. “Google has cachet,” said one recent University of Washington graduate.

4. Google’s operating and scaling costs are lower than most other firms offering similar businesses.

5. Google squeezes more work out of programmers and engineers by design.

6. Google does not break down, or at least it has not gone offline since 2000.

7. Google’s Googleplex can deliver desktop-server applications now.

8. Google’s applications install and update without burdening the user with gory details and messy crashes.

9. Google’s patents provide basic technology insight pertinent to Google’s core functionality.

An analysis of Google’s technology, 2005 Read More »

Richard Stallman on the 4 freedoms

From Richard Stallman’s “Transcript of Richard Stallman at the 4th international GPLv3 conference; 23rd August 2006” (FSF Europe: 23 August 2006):

Specifically, this refers to four essential freedoms, which are the definition of Free Software.

Freedom zero is the freedom to run the program, as you wish, for any purpose.

Freedom one is the freedom to study the source code and then change it so that it does what you wish.

Freedom two is the freedom to help your neighbour, which is the freedom to distribute, including publication, copies of the program to others when you wish.

Freedom three is the freedom to help build your community, which is the freedom to distribute, including publication, your modified versions, when you wish.

These four freedoms make it possible for users to live an upright, ethical life as a member of a community and enable us individually and collectively to have control over what our software does and thus to have control over our computing.

Richard Stallman on the 4 freedoms Read More »

How con artists use psychology to work

From Paul J. Zak’s “How to Run a Con” (Psychology Today: 13 November 2008):

When I was in high school, I took a job at an ARCO gas station on the outskirts of Santa Barbara, California. At the time, I drove a 1967 Mustang hotrod and thought I might pick up some tips and cheap parts by working around cars after school. You see a lot of interesting things working the night shift in a sketchy neighborhood. I constantly saw people making bad decisions: drunk drivers, gang members, unhappy cops, and con men. In fact, I was the victim of a classic con called “The Pigeon Drop.” If we humans have such big brains, how can we get conned?

Here’s what happened to me. One slow Sunday afternoon, a man comes out of the restroom with a pearl necklace in his hand. “Found it on the bathroom floor,” he says, following with “Geez, looks nice – I wonder who lost it?” Just then, the gas station’s phone rings and a man asks if anyone found a pearl necklace that he had purchased as a gift for his wife. He offers a $200 reward for the necklace’s return. I tell him that a customer found it. “OK,” he says, “I’ll be there in 30 minutes.” I give him the ARCO address and he gives me his phone number. The man who found the necklace hears all this but tells me he is running late for a job interview and cannot wait for the other man to arrive.

Huum, what to do? The man with the necklace said “Why don’t I give you the necklace and we split the reward?” The greed-o-meter goes off in my head, suppressing all rational thought. “Yeah, you give me the necklace to hold and I’ll give you $100” I suggest. He agrees. Since high school kids working at gas stations don’t have $100, I take money out of the cash drawer to complete the transaction.

You can guess the rest. The man with the lost necklace doesn’t come and never answers my many calls. After about an hour, I call the police. The “pearl” necklace was a two dollar fake and the number I was calling went to a pay phone nearby. I had to fess up to my boss and pay back the money with my next paycheck.

Why did this con work? Let’s do some neuroscience. While the primary motivator from my perspective was greed, the pigeon drop cleverly engages THOMAS (The Human Oxytocin Mediated Attachment System). … THOMAS is a powerful brain circuit that releases the neurochemical oxytocin when we are trusted and induces a desire to reciprocate the trust we have been shown–even with strangers.

The key to a con is not that you trust the conman, but that he shows he trusts you. Conmen ply their trade by appearing fragile or needing help, by seeming vulnerable. Because of THOMAS, the human brain makes us feel good when we help others–this is the basis for attachment to family and friends and cooperation with strangers. “I need your help” is a potent stimulus for action.

How con artists use psychology to work Read More »

50% of people infected with personality-changing brain parasites from cats

From Carl Zimmer’s “The Return of the Puppet Masters” (Corante: 17 January 2006):

I was investigating the remarkable ability parasites have to manipulate the behavior of their hosts. The lancet fluke Dicrocoelium dendriticum, for example, forces its ant host to clamp itself to the tip of grass blades, where a grazing mammal might eat it. It’s in the fluke’s interest to get eaten, because only by getting into the gut of a sheep or some other grazer can it complete its life cycle. Another fluke, Euhaplorchis californiensis, causes infected fish to shimmy and jump, greatly increasing the chance that wading birds will grab them.

Those parasites were weird enough, but then I got to know Toxoplasma gondii. This single-celled parasite lives in the guts of cats, shedding eggs that can be picked up by rats and other animals that just so happen to be eaten by cats. Toxoplasma forms cysts throughout its intermediate host’s body, including the brain. And yet a Toxoplasma-ridden rat is perfectly healthy. That makes good sense for the parasite, since a cat would not be particularly interested in eating a dead rat. But scientists at Oxford discovered that the parasite changes the rats in one subtle but vital way.

The scientists studied the rats in a six-foot by six-foot outdoor enclosure. They used bricks to turn it into a maze of paths and cells. In each corner of the enclosure they put a nest box along with a bowl of food and water. On each of the nests they added a few drops of a particular odor. On one they added the scent of fresh straw bedding, on another the bedding from a rat’s nest, on another the scent of rabbit urine, and on another the urine of a cat. When they set healthy rats loose in the enclosure, the animals rooted around curiously and investigated the nests. But when they came across the cat odor, they shied away and never returned to that corner. This was no surprise: the odor of a cat triggers a sudden shift in the chemistry of rat brains that brings on intense anxiety. (When researchers test anti-anxiety drugs on rats, they use a whiff of cat urine to make them panic.) The anxiety attack made the healthy rats shy away from the odor and in general makes them leery of investigating new things. Better to lie low and stay alive.

Then the researchers put Toxoplasma-carrying rats in the enclosure. Rats carrying the parasite are for the most part indistinguishable from healthy ones. They can compete for mates just as well and have no trouble feeding themselves. The only difference, the researchers found, is that they are more likely to get themselves killed. The scent of a cat in the enclosure didn’t make them anxious, and they went about their business as if nothing was bothering them. They would explore around the odor at least as often as they did anywhere else in the enclosure. In some cases, they even took a special interest in the spot and came back to it over and over again.

The scientists speculated that Toxoplasma was secreting some substance that was altering the patterns of brain activity in the rats. This manipulation likely evolved through natural selection, since parasites that were more likely to end up in cats would leave more offspring.

The Oxford scientists knew that humans can be hosts to Toxoplasma, too. People can become infected by its eggs by handling soil or kitty litter. For most people, the infection causes no harm. Only if a person’s immune system is weak does Toxoplasma grow uncontrollably. That’s why pregnant women are advised not to handle kitty litter, and why toxoplasmosis is a serious risk for people with AIDS. Otherwise, the parasite lives quietly in people’s bodies (and brains). It’s estimated that about half of all people on Earth are infected with Toxoplasma.

Parasitologist Jaroslav Flegr of Charles University in Prague administered psychological questionnaires to people infected with Toxoplasma and controls. Those infected, he found, show a small, but statistically significant, tendency to be more self-reproaching and insecure. Paradoxically, infected women, on average, tend to be more outgoing and warmhearted than controls, while infected men tend to be more jealous and suspicious.

… [E. Fuller Torrey of the Stanley Medical Research Institute in Bethesda, Maryland] and his colleagues had noticed some intriguing links between Toxoplasma and schizophrenia. Infection with the parasite has been associated with damage to a certain class of neurons (astrocytes). So has schizophrenia. Pregnant women with high levels of Toxoplasma antibodies in their blood were more likely to give birth to children who would later develop schizophrenia. Torrey lays out more links in this 2003 paper. While none is a smoking gun, they are certainly food for thought. It’s conceivable that exposure to Toxoplasma causes subtle changes in most people’s personality, but in a small minority, it has more devastating effects.

50% of people infected with personality-changing brain parasites from cats Read More »

Biometric photo watermarking using your iris

From Eric’s “Canon’s Iris Registration Mode – Biological Copyright Metadata” (Photography Bay: 9 February 2008):

A recent Canon patent application (Pub. No.: US 2008/0025574 A1) reveals the next step in digital watermarking – Iris Registration.

The short and sweet of it?

1. Turn the Mode dial to “REG”
2. Choose between “REG 1” through “REG 5” (for up to 5 registered users)
3. Put eye to viewfinder
4. Look at display of center distance measurement point
5. Press the shutter button
6. Iris image captured
7. Go shoot

Additional embedded info can be added later. All metadata will be added to images after you’re finished shooting in a collective manner and not for each image. The purpose of the collective tagging, if you will, is to refrain from hampering the camera’s speed (frames per second) while shooting.
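The deferred “collective” tagging described above can be sketched as a buffer that is flushed only after the shooting session ends. The class and field names below are invented for illustration and do not reflect Canon’s actual firmware design:

```python
class IrisTagger:
    """Buffer shots during a session, then tag them all in one pass,
    so no metadata work slows the camera between frames."""

    def __init__(self, registered_user):
        self.user = registered_user      # e.g. the "REG 1" iris profile
        self.pending = []                # shots taken this session

    def shoot(self, image):
        self.pending.append(image)       # no tagging here: keeps fps high
        return image

    def finish_session(self):
        for image in self.pending:       # collective tagging afterwards
            image["copyright_iris"] = self.user
        done, self.pending = self.pending, []
        return done

cam = IrisTagger("REG 1")
cam.shoot({"file": "IMG_0001"})
cam.shoot({"file": "IMG_0002"})
tagged = cam.finish_session()
print(tagged[0]["copyright_iris"])   # REG 1
```

The design trade-off is latency for throughput: tagging happens once, in bulk, instead of on every frame.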

Biometric photo watermarking using your iris Read More »

The latest on electronic voting machines

From James Turner’s interview with Dr. Barbara Simons, past President of the Association for Computing Machinery & recent appointee to the Advisory Board of the Federal Election Assistance Commission, at “A 2008 e-Voting Wrapup with Dr. Barbara Simons” (O’Reilly Media: 7 November 2008):

[Note from Scott: headers added by me]

Optical Scan: Good & Bad

And most of the voting in Minnesota was done on precinct-based optical scan machines: a paper ballot which is then fed into the optical scanner at the precinct. And the good thing about that is it gives the voter immediate feedback if there is any problem, such as over-voting (marking more choices in a race than allowed).
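The immediate feedback a precinct scanner gives can be sketched as a per-race mark count. The ballot encoding here is invented for illustration; real scanners read marks optically, but the acceptance logic amounts to the same check:

```python
def check_ballot(marks, max_choices=1):
    """Return the list of over-voted races (more marks than allowed).
    An empty list means the ballot is accepted.
    `marks` maps race name -> list of marked candidates."""
    return [race for race, chosen in marks.items()
            if len(chosen) > max_choices]

ballot = {"President": ["Kerry", "Gephardt"],   # two marks: over-vote
          "Senate": ["Smith"]}
print(check_ballot(ballot))   # ['President'] -> scanner kicks the ballot back
```

Because the check runs while the voter is still standing at the scanner, the spoiled ballot can be returned and a fresh one cast on the spot.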

Well there’s several problems; one is–well first of all, as you say because these things have computers in them they can be mis-programmed, there can be software bugs. You could conceivably have malicious code. You could have the machines give you a different count from the right one. There was a situation back in the 2004 race where Gephardt in one of the Primaries–Gephardt received a large number of votes after he had withdrawn from the race. And this was done–using paper ballots, using optical scan paper ballots. I don’t know if it was this particular brand or not. And when they were recounted it was discovered that in fact that was the wrong result; that he had gotten fewer votes. Now I never saw an explanation for what happened but my guess is that whoever programmed these machines had mistakenly assigned the slot that was for Kerry to Gephardt and the slot that was for Gephardt to Kerry; that’s my guess. Now I don’t know if that’s true but if that did happen I think there’s very little reason to believe it was malicious because there was really nothing to be gained by doing that. So I think it was just an honest error but of course errors can occur.

DRE Studies

Ohio conducted a major study of electronic voting machines called the Everest Study, which was commissioned by the current Secretary of State, Jennifer Brunner. This study uncovered huge problems with most of these touch screen voting systems. They were found to be insecure, unreliable, and difficult to use. A similar study had been conducted in California not too much earlier, called the Top to Bottom Review, and the Ohio study confirmed all of the problems that had been uncovered in California and found additional problems, so based on that there was a push to get rid of a lot of these machines.

States Using DREs

Maryland and Georgia are entirely touch screen States and so is New Jersey. In Maryland they’re supposed to replace them with optical scan paper ballots by 2010, but there’s some concern that there may not be the funding to do that. In fact Maryland and Georgia both use Diebold (now called Premier) paperless touch screen voting machines. Georgia started using them in 2002, and that’s the race in which Max Cleland, the Democratic Senator and Vietnam War vet, was defeated, and I know that there are some people who questioned the outcome of that race because the polls had shown him winning. And because those machines are paperless there was no way to check the outcome. Another thing that was of concern in Georgia in 2002 was that there were last-minute software patches being added to the machines just before the Election, and the software patches hadn’t really been inspected by any kind of independent agency.

More on Optical Scans

Well I think scanned ballots–well certainly scanned ballots give you a paper trail and they give you a good paper trail. The kind of paper trail you want and it’s not really a paper trail; it’s paper ballots because they are the ballots. What you want is you want it to be easy to audit and recount an election. And I think that’s something that really people hadn’t taken into consideration early on when a lot of these machines were first designed and purchased.

Disabilities

One of the things that was investigated in California when they did the Top to Bottom Review was just how easy is it for people with disabilities to use these touch screen machines? Nobody had ever done that before, and the test results came back very negatively. If you look at the California results they’re very negative on these touch screen machines. In many cases people in wheelchairs had a very difficult time being able to operate them correctly; people who were blind sometimes had trouble understanding what was being said, or things were said too loudly or too softly, or they would get confused about the instructions, or some of the ways they had for manually inputting their votes were confusing.

There are these things called Ballot Generating Devices, which are not what we generally refer to as touch screen machines, although they can be touch screen. The most widely used one is called the Auto Mark. The way the Auto Mark works is you take a paper ballot, one of these optical scan ballots, and you insert it into the Auto Mark, and then it operates much the same way that these potentially paperless touch screen machines work. It has a headset so that a blind voter can use it; it’s possible for somebody in a wheelchair to vote, although in fact you don’t have to use this if you’re in a wheelchair; you can clearly vote optical scan. Somebody who has severe mobility impairments can vote on these machines using a sip-and-puff device, where if you sip it’s a zero or a one and if you puff it’s the opposite, or a yes or a no. The Auto Mark was designed with people with disabilities in mind from early on, and it fared much better in the California tests. At the end, when the voter with disabilities is finished, he or she will say okay, cast my ballot. At that point the Auto Mark simply marks the optical scan ballot; it just marks it. And then you have an optical scan ballot that can be read by an optical scanner. There should be no problems with it because it’s been generated by a machine. And you have a paper ballot that can be recounted.

Problems with DREs vs Optical Scans

There are a couple of things to keep in mind when thinking about replacing these systems. The first is that the States and localities that buy these direct recording electronic systems, or touch screen systems as they’re called, have to have maintenance contracts with the vendors, because they’re very complicated systems to maintain and of course the software is a secret. So some of these contracts are quite costly, and these are ongoing expenses with these machines. In addition, because they have software in them they have to be securely stored and they have to be securely delivered, and those create enormous problems, especially when you have to worry about delivering large numbers of machines to places prior to the election. Frequently these machines end up staying in people’s garages or in churches for periods of time when they’re relatively insecure.

And you need far fewer scanners; the security issues with scanners are not as great because you can do an audit and a recount. So altogether it just seems to me that it makes sense to move to paper-based optical scan systems with precinct scanners, so that the voter gets feedback on the ballot: if the voter votes twice for President, the ballot is kicked out and the voter can vote a new ballot.

And as I say there is the Auto Mark for voters with disabilities to use; there’s also another system called Populex but that’s not as widely used as Auto Mark. There could be new systems coming forward.

1/2 of DREs Broken in Pennsylvania on Election Day

Editor’s Note: Dr. Simons wrote me later to say: “Many Pennsylvania polling places opened on election day with half or more of their voting machines broken — so they used emergency paper ballots until they could fix their machines.”

The latest on electronic voting machines Read More »

Tracking children who might commit a crime later

From Mark Townsend and Anushka Asthana’s “Put young children on DNA list, urge police” (The Guardian: 16 March 2008):

Primary school children should be eligible for the DNA database if they exhibit behaviour indicating they may become criminals in later life, according to Britain’s most senior police forensics expert.

Gary Pugh, director of forensic sciences at Scotland Yard and the new DNA spokesman for the Association of Chief Police Officers (Acpo), said a debate was needed on how far Britain should go in identifying potential offenders, given that some experts believe it is possible to identify future offending traits in children as young as five.

Tracking children who might commit a crime later Read More »

10,000 hours to reach expertise

From Malcolm Gladwell’s “A gift or hard graft?” (The Guardian: 15 November 2008):

This idea – that excellence at a complex task requires a critical, minimum level of practice – surfaces again and again in studies of expertise. In fact, researchers have settled on what they believe is a magic number for true expertise: 10,000 hours.

“In study after study, of composers, basketball players, fiction writers, ice-skaters, concert pianists, chess players, master criminals,” writes the neurologist Daniel Levitin, “this number comes up again and again. Ten thousand hours is equivalent to roughly three hours a day, or 20 hours a week, of practice over 10 years… No one has yet found a case in which true world-class expertise was accomplished in less time. It seems that it takes the brain this long to assimilate all that it needs to know to achieve true mastery.”
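Levitin’s equivalence checks out arithmetically; the quick verification below only restates the quoted figures:

```python
hours_per_week = 20        # "20 hours a week", per the quote
years = 10                 # "over 10 years"
weeks_per_year = 52

total = hours_per_week * weeks_per_year * years
print(total)   # 10400 -- just over the 10,000-hour threshold
```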

10,000 hours to reach expertise Read More »

Correcting wrong info reinforces false beliefs

From Jonathan M. Gitlin’s “Does ideology trump facts? Studies say it often does” (Ars Technica: 24 September 2008):

We like to think that people will be well informed before making important decisions, such as who to vote for, but the truth is that’s not always the case. Being uninformed is one thing, but having a population that’s actively misinformed presents problems when it comes to participating in the national debate, or the democratic process. If the findings of some political scientists are right, attempting to correct misinformation might do nothing more than reinforce the false belief.

This sort of misinformation isn’t hypothetical; in 2003 a study found that viewers of Fox News were significantly more misinformed about the Iraq war, with far greater percentages of viewers erroneously believing that Iraq possessed WMDs or that there was a credible link between the 9/11 attack and Saddam Hussein than those who got their news from other outlets like NPR and PBS. This has led to the rise of websites like FactCheck and SourceWatch.

Saying that correcting misinformation does little more than reinforce a false belief is a pretty controversial proposal, but the claim is based on a number of studies that examine the effect of political or ideological bias on fact correction. In the studies, volunteers were shown news items or political adverts that contained misinformation, followed by a correction. For example, a study by John Bullock of Yale showed volunteers a political ad created by NARAL that linked Justice John Roberts to a violent anti-abortion group, followed by news that the ad had been withdrawn. Interestingly, Democratic participants had a worse opinion of Roberts after being shown the ad, even after they were told it was false.

Over half (56 percent) of Democratic subjects disapproved of Roberts before the misinformation. That rose to 80 percent afterward, but even after correcting the misinformation, 72 percent of Democratic subjects still had a negative opinion. Republican volunteers, on the other hand, only showed a small increase in disapproval after watching the misinformation (11 percent vs 14 percent).

Correcting wrong info reinforces false beliefs Read More »

Interesting psychological disorders

From Lauren Davis’ “Delusion or Alien Invasion? Disorders That Make Life Seem Like Scifi” (io9: 27 September 2008):

Capgras Delusion: You believe a loved one has been replaced with an exact duplicate.

Reduplicative Paramnesia: You believe that a place or location has been moved to another site, or has been duplicated and exists in two places simultaneously.

Alien Hand Syndrome: Your hand seems to have a will of its own.

Alice in Wonderland Syndrome: You perceive objects as much larger or smaller than they actually are.

Fregoli Syndrome: You believe that multiple people in your life are actually a single person in disguise.

Jumping Frenchman of Maine Disorder: You obey any order shouted at you in a commanding voice.

Delusional Parasitosis: You believe that you are infested with parasites.

Cotard Delusion: You believe you have died and that your body is rotting and/or your soul is gone.

Interesting psychological disorders Read More »

To solve a problem, you first have to figure out the problem

From Russell L. Ackoff & Daniel Greenberg’s Turning Learning Right Side Up: Putting Education Back on Track (2008):

A classic story illustrates very well the potential cost of placing a problem in a disciplinary box. It involves a multistoried office building in New York. Occupants began complaining about the poor elevator service provided in the building. Waiting times for elevators at peak hours, they said, were excessively long. Several of the tenants threatened to break their leases and move out of the building because of this…

Management authorized a study to determine what would be the best solution. The study revealed that because of the age of the building no engineering solution could be justified economically. The engineers said that management would just have to live with the problem permanently.

The desperate manager called a meeting of his staff, which included a young recently hired graduate in personnel psychology…The young man had not focused on elevator performance but on the fact that people complained about waiting only a few minutes. Why, he asked himself, were they complaining about waiting for only a very short time? He concluded that the complaints were a consequence of boredom. Therefore, he took the problem to be one of giving those waiting something to occupy their time pleasantly. He suggested installing mirrors in the elevator boarding areas so that those waiting could look at each other or themselves without appearing to do so. The manager took up his suggestion. The installation of mirrors was made quickly and at a relatively low cost. The complaints about waiting stopped.

Today, mirrors in elevator lobbies and even on elevators in tall buildings are commonplace.

To solve a problem, you first have to figure out the problem Read More »

Energy-efficient washing machines

From “Is a Dishwasher a Green Machine?”:

To really green up your automatic dishwashing, you should always use the air-drying function, avoid the profligate “rinse hold” setting, wash only full loads, and install the machine far away from your refrigerator.

Just promise that you’ll scrape your dishes instead of pre-rinsing, use the shortest wash cycles possible, and buy phosphate-free detergents – or, if you’re handy with a blender, make your own.

Energy-efficient washing machines Read More »