education

The real digital divide: knowing how to use what you have & not knowing

From Howard Rheingold’s interview in “Howard Rheingold’s Latest Connection” (Business Week: 11 August 2004):

Here’s where Wikipedia fits in. It used to be if you were a kid in a village in India or a village in northern Canada in the winter, maybe you could get to a place where they have a few books once in a while. Now, if you have a telephone, you can get a free encyclopedia. You have access to the world’s knowledge. Knowing how to use that is a barrier. The divide increasingly is not so much between those who have and those who don’t, but those who know how to use what they have and those who don’t.

Reading the impenetrable too fast

From Lee Siegel, quoted in Juliet Lapidos’s “Overrated: Authors, critics, and editors on ‘great books’ that aren’t all that great” (Slate: 11 August 2011):

It was like Herbert Marcuse’s advice to a despairing graduate student who said he had spent days on a sentence in Hegel and still couldn’t understand it: “You’re reading too fast,” Marcuse told him.

Shelby Foote on how the Civil War changed the gender of the teaching profession

From Carter Coleman, Donald Faulkner, & William Kennedy’s interview of Shelby Foote in “The Art of Fiction No. 158” (The Paris Review: Summer 1999, No. 151):

About the time that war started I think roughly eighty-five or ninety percent of the teachers in this country were men. After the war was over something like eighty-five to ninety percent of teachers were women.

Kurt Vonnegut on using our talents

From David Hayman, David Michaelis, George Plimpton, & Richard Rhodes’s interview of Kurt Vonnegut in “The Art of Fiction No. 64” (The Paris Review: Spring 1977, No. 69):

I bawled [my daughter] out one time for not doing more with the talents she had. She replied that having talent doesn’t carry with it the obligation that something has to be done with it. This was startling news to me. I thought people were supposed to grab their talents and run as far and fast as they could.

A summary of Galbraith’s The Affluent Society

From a summary of John Kenneth Galbraith’s The Affluent Society (Abridge Me: 1 June 2010):

The Concept of the Conventional Wisdom

The paradigms on which society’s perception of reality are based are highly conservative. People invest heavily in these ideas, and so are heavily resistant to changing them. They are only finally overturned by new ideas when new events occur which make the conventional wisdom appear so absurd as to be impalpable. Then the conventional wisdom quietly dies with its most staunch proponents, to be replaced with a new conventional wisdom. …

Economic Security

… Economics professors argue that the threat of unemployment is necessary to maintain incentives to high productivity, and simultaneously that established professors require life tenure in order to do their best work. …

The Paramount Position of Production

… Another irrationality persists (more in America than elsewhere?): the prestigious usefulness of private-sector output, compared to the burdensome annoyance of public expenditure. Somehow public expenditure can never quite be viewed as a productive and enriching element of national output; it is forever something to be avoided, at best a necessary encumbrance. Cars are important, roads are not. An expansion in telephone services improves the general well-being, cuts in postal services are a necessary economy. Vacuum cleaners to ensure clean houses boost our standard of living, street cleaners are an unfortunate expense. Thus we end up with clean houses and filthy streets. …

[W]e have wants at the margin only so far as they are synthesised. We do not manufacture wants for goods we do not produce. …

The Dependence Effect

… Modern consumer demand, at the margin, does not originate from within the individual, but is a consequence of production. It has two origins:

  1. Emulation: the desire to keep abreast of, or ahead of one’s peer group — demand originating from this motivation is created indirectly by production. Every effort to increase production to satiate want brings with it a general raising of the level of consumption, which itself increases want.
  2. Advertising: the direct influence of advertising and salesmanship creates new wants which the consumer did not previously possess. Any student of business has by now come to view marketing as fundamental a business activity as production. Any want that can be significantly moulded by advertising cannot possibly have been strongly felt in the absence of that advertising — advertising is powerless to persuade a man that he is or is not hungry.

Inflation

… In 1942 a grateful and very anxious citizenry rewarded its soldiers, sailors, and airmen with a substantial increase in pay. In the teeming city of Honolulu, in prompt response to this advance in wage income, the prostitutes raised the prices of their services. This was at a time when, if anything, increased volume was causing a reduction in their average unit costs. However, in this instance the high military authorities, deeply angered by what they deemed improper, immoral, and indecent profiteering, ordered a return to the previous scale. …

The Theory of Social Balance

The final problem of the affluent society is the balance of goods it produces. Private goods: TVs, cars, cigarettes, drugs and alcohol are overproduced; public goods: education, healthcare, police services, park provision, mass transport and refuse disposal are underproduced. The consequences are extremely severe for the wellbeing of society. The balance between private and public consumption will be referred to as ‘the social balance’. The main reason for this imbalance is relatively straightforward. The forces we have identified which increase consumer demand as production rises (advertising and emulation) act almost entirely on the private sector. …

It is arguable that emulation acts on public services to an extent: a new school in one district may encourage neighbouring districts to ‘keep up’, but the effect is relatively minuscule.

Thus, private demand is artificially inflated and public demand is not, and the voter-consumer decides how to split his income between the two at the ballot box: inevitably public expenditure is grossly underrepresented. …

Errol Morris on “investigative journalism”

From Errol Morris’s “Film Legend Errol Morris Salutes New Graduates At 2010 Commencement” (Berkeley Graduate School of Journalism: 10 May 2010):

I have often wondered why we need the phrase investigative journalism. Isn’t all journalism supposed to be investigative? Isn’t journalism without an investigative element little more than gossip? And isn’t there enough gossip around already?

Errol Morris on film noir

From Errol Morris’s “Film Legend Errol Morris Salutes New Graduates At 2010 Commencement” (Berkeley Graduate School of Journalism: 10 May 2010):

There are many things I liked about noir. But in particular, there are images of one benighted character after another struggling to make sense of the world – and sometimes failing in the effort. [Their failure could be chalked up to many things. But most severe among the possibilities was the thought that the world might be intractable. That you can never figure out how it works, what makes it tick. A terribly sad thought. There has to be, there just has to be the presumption that you can figure things out.]

David Foster Wallace on the impossibility of being informed & the seduction of dogma

From David Foster Wallace’s “Introduction” (The Best American Essays 2007):

Here is an overt premise. There is just no way that 2004’s reelection could have taken place—not to mention extraordinary renditions, legalized torture, FISA-flouting, or the passage of the Military Commissions Act—if we had been paying attention and handling information in a competent grown-up way. ‘We’ meaning as a polity and culture. The premise does not entail specific blame—or rather the problems here are too entangled and systemic for good old-fashioned finger-pointing. It is, for one example, simplistic and wrong to blame the for-profit media for somehow failing to make clear to us the moral and practical hazards of trashing the Geneva Conventions. The for-profit media is highly attuned to what we want and the amount of detail we’ll sit still for. And a ninety-second news piece on the question of whether and how the Geneva Conventions ought to apply in an era of asymmetrical warfare is not going to explain anything; the relevant questions are too numerous and complicated, too fraught with contexts in everything from civil law and military history to ethics and game theory. One could spend a hard month just learning the history of the Conventions’ translation into actual codes of conduct for the U.S. military … and that’s not counting the dramatic changes in those codes since 2002, or the question of just what new practices violate (or don’t) just which Geneva provisions, and according to whom. Or let’s not even mention the amount of research, background, cross-checking, corroboration, and rhetorical parsing required to understand the cataclysm of Iraq, the collapse of congressional oversight, the ideology of neoconservatism, the legal status of presidential signing statements, the political marriage of evangelical Protestantism and corporatist laissez-faire … There’s no way. You’d simply drown. We all would. It’s amazing to me that no one much talks about this—about the fact that whatever our founders and framers thought of as a literate, informed citizenry can no longer exist, at least not without a whole new modern degree of subcontracting and dependence packed into what we mean by ‘informed.’8

8 Hence, by the way, the seduction of partisan dogma. You can drown in dogmatism now, too—radio, Internet, cable, commercial and scholarly print—but this kind of drowning is more like sweet release. Whether hard right or new left or whatever, the seduction and mentality are the same. You don’t have to feel confused or inundated or ignorant. You don’t even have to think, for you already Know, and whatever you choose to learn confirms what you Know. This dogmatic lockstep is not the kind of inevitable dependence I’m talking about—or rather it’s only the most extreme and frightened form of that dependence.

Malcolm Gladwell on training to be a journalist

From Alex Altman’s “Q&A: Author Malcolm Gladwell” (TIME: 20 October 2009):

If you had a single piece of advice to offer young journalists, what would it be?

The issue is not writing. It’s what you write about. One of my favorite columnists is Jonathan Weil, who writes for Bloomberg. He broke the Enron story, and he broke it because he’s one of the very few mainstream journalists in America who really knows how to read a balance sheet. That means Jonathan Weil will always have a job, and will always be read, and will always have something interesting to say. He’s unique. Most accountants don’t write articles, and most journalists don’t know anything about accounting. Aspiring journalists should stop going to journalism programs and go to some other kind of grad school. If I was studying today, I would go get a master’s in statistics, and maybe do a bunch of accounting courses and then write from that perspective. I think that’s the way to survive. The role of the generalist is diminishing. Journalism has to get smarter.

Huck Finn caged

From Nicholas Carr’s “Sivilized” (Rough Type: 27 June 2009):

Michael Chabon, in an elegiac essay in the new edition of the New York Review of Books, rues the loss of the “Wilderness of Childhood” – the unparented, unfenced, only partially mapped territory that was once the scene of youth.

Huck Finn, now fully under the thumb of Miss Watson and the Widow Douglas, spends his unscheduled time wandering the fabricated landscapes of World of Warcraft, seeking adventure.

All about freezing to death

[Photo: Ice mask, C.T. Madigan, photograph by Frank Hurley; Creative Commons, State Library of New South Wales collection]

From Peter Stark’s “As Freezing Persons Recollect the Snow–First Chill–Then Stupor–Then the Letting Go” (Outside: January 1997):

There is no precise core temperature at which the human body perishes from cold. At Dachau’s cold-water immersion baths, Nazi doctors calculated death to arrive at around 77 degrees Fahrenheit. The lowest recorded core temperature in a surviving adult is 60.8 degrees. For a child it’s lower: In 1994, a two-year-old girl in Saskatchewan wandered out of her house into a minus-40 night. She was found near her doorstep the next morning, limbs frozen solid, her core temperature 57 degrees. She lived.

The cold remains a mystery, more prone to fell men than women, more lethal to the thin and well muscled than to those with avoirdupois, and least forgiving to the arrogant and the unaware.

Were you a Norwegian fisherman or Inuit hunter, both of whom frequently work gloveless in the cold, your chilled hands would open their surface capillaries periodically to allow surges of warm blood to pass into them and maintain their flexibility. This phenomenon, known as the hunter’s response, can elevate a 35-degree skin temperature to 50 degrees within seven or eight minutes.

Other human adaptations to the cold are more mysterious. Tibetan Buddhist monks can raise the skin temperature of their hands and feet by 15 degrees through meditation. Australian aborigines, who once slept on the ground, unclothed, on near-freezing nights, would slip into a light hypothermic state, suppressing shivering until the rising sun rewarmed them.

The exertion that warmed you on the way uphill now works against you: Your exercise-dilated capillaries carry the excess heat of your core to your skin, and your wet clothing dispels it rapidly into the night. The lack of insulating fat over your muscles allows the cold to creep that much closer to your warm blood.

Your temperature begins to plummet. Within 17 minutes it reaches the normal 98.6. Then it slips below.

At 97 degrees, hunched over in your slow search, the muscles along your neck and shoulders tighten in what’s known as pre-shivering muscle tone. Sensors have signaled the temperature control center in your hypothalamus, which in turn has ordered the constriction of the entire web of surface capillaries. Your hands and feet begin to ache with cold.

At 95, you’ve entered the zone of mild hypothermia. You’re now trembling violently as your body attains its maximum shivering response, an involuntary condition in which your muscles contract rapidly to generate additional body heat.

And after this long stop, the skiing itself has become more difficult. By the time you push off downhill, your muscles have cooled and tightened so dramatically that they no longer contract easily, and once contracted, they won’t relax. You’re locked into an ungainly, spread-armed, weak-kneed snowplow.

As you sink back into the snow, shaken, your heat begins to drain away at an alarming rate, your head alone accounting for 50 percent of the loss. The pain of the cold soon pierces your ears so sharply that you root about in the snow until you find your hat and mash it back onto your head.

But even that little activity has been exhausting. You know you should find your glove as well, and yet you’re becoming too weary to feel any urgency. You decide to have a short rest before going on.

An hour passes. At one point, a stray thought says you should start being scared, but fear is a concept that floats somewhere beyond your immediate reach, like that numb hand lying naked in the snow. You’ve slid into the temperature range at which cold renders the enzymes in your brain less efficient. With every one-degree drop in body temperature below 95, your cerebral metabolic rate falls off by 3 to 5 percent. When your core temperature reaches 93, amnesia nibbles at your consciousness.

In the minus-35-degree air, your core temperature falls about one degree every 30 to 40 minutes, your body heat leaching out into the soft, enveloping snow. Apathy at 91 degrees. Stupor at 90.

You’ve now crossed the boundary into profound hypothermia. By the time your core temperature has fallen to 88 degrees, your body has abandoned the urge to warm itself by shivering. Your blood is thickening like crankcase oil in a cold engine. Your oxygen consumption, a measure of your metabolic rate, has fallen by more than a quarter. Your kidneys, however, work overtime to process the fluid overload that occurred when the blood vessels in your extremities constricted and squeezed fluids toward your center. You feel a powerful urge to urinate, the only thing you feel at all.

By 87 degrees you’ve lost the ability to recognize a familiar face, should one suddenly appear from the woods.

At 86 degrees, your heart, its electrical impulses hampered by chilled nerve tissues, becomes arrhythmic. It now pumps less than two-thirds the normal amount of blood. The lack of oxygen and the slowing metabolism of your brain, meanwhile, begin to trigger visual and auditory hallucinations.

At 85 degrees, those freezing to death, in a strange, anguished paroxysm, often rip off their clothes. This phenomenon, known as paradoxical undressing, is common enough that urban hypothermia victims are sometimes initially diagnosed as victims of sexual assault. Though researchers are uncertain of the cause, the most logical explanation is that shortly before loss of consciousness, the constricted blood vessels near the body’s surface suddenly dilate and produce a sensation of extreme heat against the skin.

There’s an adage about hypothermia: “You aren’t dead until you’re warm and dead.”

At about 6:00 the next morning, his friends, having discovered the stalled Jeep, find him, still huddled inches from the buried log, his gloveless hand shoved into his armpit. The flesh of his limbs is waxy and stiff as old putty, his pulse nonexistent, his pupils unresponsive to light. Dead.

But those who understand cold know that even as it deadens, it offers perverse salvation. Heat is a presence: the rapid vibrating of molecules. Cold is an absence: the damping of the vibrations. At absolute zero, minus 459.67 degrees Fahrenheit, molecular motion ceases altogether. It is this slowing that converts gases to liquids, liquids to solids, and renders solids harder. It slows bacterial growth and chemical reactions. In the human body, cold shuts down metabolism. The lungs take in less oxygen, the heart pumps less blood. Under normal temperatures, this would produce brain damage. But the chilled brain, having slowed its own metabolism, needs far less oxygen-rich blood and can, under the right circumstances, survive intact.

Setting her ear to his chest, one of his rescuers listens intently. Seconds pass. Then, faintly, she hears a tiny sound–a single thump, so slight that it might be the sound of her own blood. She presses her ear harder to the cold flesh. Another faint thump, then another.

The slowing that accompanies freezing is, in its way, so beneficial that it is even induced at times. Cardiologists today often use deep chilling to slow a patient’s metabolism in preparation for heart or brain surgery. In this state of near suspension, the patient’s blood flows slowly, his heart rarely beats–or in the case of those on heart-lung machines, doesn’t beat at all; death seems near. But carefully monitored, a patient can remain in this cold stasis, undamaged, for hours.

In fact, many hypothermia victims die each year in the process of being rescued. In “rewarming shock,” the constricted capillaries reopen almost all at once, causing a sudden drop in blood pressure. The slightest movement can send a victim’s heart muscle into wild spasms of ventricular fibrillation. In 1980, 16 shipwrecked Danish fishermen were hauled to safety after an hour and a half in the frigid North Sea. They then walked across the deck of the rescue ship, stepped below for a hot drink, and dropped dead, all 16 of them.

The doctor rapidly issues orders to his staff: intravenous administration of warm saline, the bag first heated in the microwave to 110 degrees. Elevating the core temperature of an average-size male one degree requires adding about 60 kilocalories of heat. A kilocalorie is the amount of heat needed to raise the temperature of one liter of water one degree Celsius. Since a quart of hot soup at 140 degrees offers about 30 kilocalories, the patient curled on the table would need to consume 40 quarts of chicken broth to push his core temperature up to normal. Even the warm saline, infused directly into his blood, will add only 30 kilocalories.
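The rewarming arithmetic in the passage above can be sketched in a few lines of Python. The 60-kilocalories-per-degree and 30-kilocalories-per-quart figures come from the article; the 78.6 °F starting core temperature is an assumption, chosen because a 20-degree deficit reproduces the article’s 40-quart figure.

```python
# Sketch of the rewarming arithmetic quoted above. Figures per the article;
# the starting core temperature below is a hypothetical example.

KCAL_PER_DEGREE_F = 60    # heat to raise an average-size male's core temp 1 degree F
KCAL_PER_QUART_SOUP = 30  # heat delivered by one quart of 140-degree soup

def quarts_of_soup_needed(core_temp_f, normal_temp_f=98.6):
    """Quarts of hot soup needed to restore a normal core temperature."""
    deficit_degrees = normal_temp_f - core_temp_f
    total_kcal = deficit_degrees * KCAL_PER_DEGREE_F
    return total_kcal / KCAL_PER_QUART_SOUP

# A hypothetical patient 20 degrees below normal:
print(round(quarts_of_soup_needed(78.6)))  # -> 40
```

The same arithmetic shows why warm intravenous saline, at roughly 30 kilocalories per bag, barely moves the needle on its own.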

Ideally, the doctor would have access to a cardiopulmonary bypass machine, with which he could pump out the victim’s blood, rewarm and oxygenate it, and pump it back in again, safely raising the core temperature as much as one degree every three minutes. But such machines are rarely available outside major urban hospitals.

You’d nod if you could. But you can’t move. All you can feel is throbbing discomfort everywhere. Glancing down to where the pain is most biting, you notice blisters filled with clear fluid dotting your fingers, once gloveless in the snow. During the long, cold hours the tissue froze and ice crystals formed in the tiny spaces between your cells, sucking water from them, blocking the blood supply. You stare at them absently.

“I think they’ll be fine,” a voice from overhead says. “The damage looks superficial. We expect that the blisters will break in a week or so, and the tissue should revive after that.”

If not, you know that your fingers will eventually turn black, the color of bloodless, dead tissue. And then they will be amputated.

You’ve seen that in the infinite reaches of the universe, heat is as glorious and ephemeral as the light of the stars. Heat exists only where matter exists, where particles can vibrate and jump. In the infinite winter of space, heat is tiny; it is the cold that is huge.

Extreme male brains

From Joe Clark’s “The extreme Google brain” (Fawny: 26 April 2009):

… Susan Pinker’s The Sexual Paradox, which explains, using scientific findings, why large majorities of girls and women behave almost identically at different stages of their lives – while large minorities of boys and men show vast variability compared to each other and to male norms.

Some of these boys and men exhibit extreme-male-brain tendencies, including an ability to focus obsessively for long periods of time, often on inanimate objects or abstractions (hence male domination of engineering and high-end law). Paradoxically, other male brains in these exceptional cases may have an ability to experiment with many options for short periods each. Though pejoratively diagnosed as attention-deficit disorder, this latter ability, Pinker provides evidence, is actually a strength for some entrepreneurs.

The male brain, extreme or not, is compatible with visual design. It allows you to learn every font in the Letraset catalogue and work from a grid. In fact, the male-brain capacity for years-long single-mindedness explains why the heads of large ad agencies and design houses are overwhelmingly male. (It isn’t a sexist conspiracy.)

In the computer industry, extreme male brains permit years of concentration on hardware and software design, while also iterating those designs seemingly ad infinitum. The extreme male brain is really the extreme Google brain. It’s somewhat of a misnomer, because such is actually the average brain inside the company, but I will use that as a neologism.

Google was founded by extreme-male-brain nerds and, by all outward appearances, seems to hire only that type of person, not all of them male.

Michael Pollan’s rules for food

[Image by rsgranne via Flickr]

From John Schwenkler’s “Food for thought: renewing the culinary culture should be a conservative cause” (The American Conservative: 2008):

Michael Pollan’s In Defense of Food deconstructs the pretensions of “food science” in often hilarious fashion and distills all you need to know about eating into three directives: Eat food (as opposed to things with unfamiliar or unpronounceable ingredients, packaged “food products” that make government-sanctioned health claims, and pretty much anything from the middle aisles of the grocery store); Not too much (go for quality over quantity, and eat at a table, with others); Mostly plants (in unprocessed form when possible).

Chemically remove bad memories

From Nicholas Carr’s “Remembering to forget” (Rough Type: 22 October 2008):

Slowly but surely, scientists are getting closer to developing a drug that will allow people to eliminate unpleasant memories. The new issue of Neuron features a report from a group of Chinese scientists who were able to use a chemical – the protein alpha-CaM kinase II – to successfully erase memories from the minds of mice. The memory losses, report the authors, are “not caused by disrupting the retrieval access to the stored information but are, rather, due to the active erasure of the stored memories.” The erasure, moreover, “is highly restricted to the memory being retrieved while leaving other memories intact. Therefore, our study reveals a molecular genetic paradigm through which a given memory, such as new or old fear memory, can be rapidly and specifically erased in a controlled and inducible manner in the brain.”

One can think of a whole range of applications, from the therapeutic to the cosmetic to the political.

Socioeconomic analysis of MySpace & Facebook

From danah boyd’s “Viewing American class divisions through Facebook and MySpace” (danah boyd: 24 June 2007):

When MySpace launched in 2003, it was primarily used by 20/30-somethings (just like Friendster before it). The bands began populating the site by early 2004 and throughout 2004, the average age slowly declined. It wasn’t until late 2004 that teens really started appearing en masse on MySpace and 2005 was the year that MySpace became the “in thing” for teens.

Facebook launched in 2004 as a Harvard-only site. It slowly expanded to welcome people with .edu accounts from a variety of different universities. In mid-2005, Facebook opened its doors to high school students, but it wasn’t that easy to get an account because you needed to be invited. As a result, those who were in college tended to invite those high school students that they liked. Facebook was strongly framed as the “cool” thing that college students did.

In addition to the college framing, the press coverage of MySpace as dangerous and sketchy alienated “good” kids. Facebook seemed to provide an ideal alternative. Parents weren’t nearly as terrified of Facebook because it seemed “safe” thanks to the network-driven structure.

She argues that class divisions in the United States have more to do with lifestyle and social stratification than with income. In other words, all of my anti-capitalist college friends who work in cafes and read Engels are not working class just because they make $14K a year and have no benefits. Class divisions in the United States have more to do with social networks (the real ones, not FB/MS), social capital, cultural capital, and attitudes than income. Not surprisingly, other demographics typically discussed in class terms are also a part of this lifestyle division. Social networks are strongly connected to geography, race, and religion; these are also huge factors in lifestyle divisions and thus “class.”

The goodie two shoes, jocks, athletes, or other “good” kids are now going to Facebook. These kids tend to come from families who emphasize education and going to college. They are part of what we’d call hegemonic society. They are primarily white, but not exclusively. They are in honors classes, looking forward to the prom, and live in a world dictated by after school activities.

MySpace is still home for Latino/Hispanic teens, immigrant teens, “burnouts,” “alternative kids,” “art fags,” punks, emos, goths, gangstas, queer kids, and other kids who didn’t play into the dominant high school popularity paradigm. These are kids whose parents didn’t go to college, who are expected to get a job when they finish high school. These are the teens who plan to go into the military immediately after school. Teens who are really into music or in a band are also on MySpace. MySpace has most of the kids who are socially ostracized at school because they are geeks, freaks, or queers.

In order to demarcate these two groups, let’s call the first group of teens “hegemonic teens” and the second group “subaltern teens.”

Most teens who exclusively use Facebook are familiar with and have an opinion about MySpace. These teens are very aware of MySpace and they often have a negative opinion about it. They see it as gaudy, immature, and “so middle school.” They prefer the “clean” look of Facebook, noting that it is more mature and that MySpace is “so lame.” What hegemonic teens call gaudy can also be labeled as “glitzy” or “bling” or “fly” (or what my generation would call “phat”) by subaltern teens. Terms like “bling” come out of hip-hop culture where showy, sparkly, brash visual displays are acceptable and valued. The look and feel of MySpace resonates far better with subaltern communities than it does with the upwardly mobile hegemonic teens. … That “clean” or “modern” look of Facebook is akin to West Elm or Pottery Barn or any poshy Scandinavian design house (that I admit I’m drawn to) while the more flashy look of MySpace resembles the Las Vegas imagery that attracts millions every year. I suspect that lifestyles have aesthetic values and that these are being reproduced on MySpace and Facebook.

I should note here that aesthetics do divide MySpace users. The look and feel that is acceptable amongst average Latino users is quite different from what you see the subculturally-identified outcasts using. Amongst the emo teens, there’s a push for simple black/white/grey backgrounds and simplistic layouts. While I’m using the term “subaltern teens” to lump together non-hegemonic teens, the lifestyle divisions amongst the subalterns are quite visible on MySpace through the aesthetic choices of the backgrounds. The aesthetics issue is also one of the forces that drives some longer-term users away from MySpace.

Teens from poorer backgrounds who are on MySpace are less likely to know people who go to universities. They are more likely to know people who are older than them, but most of their older friends, cousins, and co-workers are on MySpace. It’s the cool working class thing and it’s the dominant SNS at community colleges. These teens are more likely to be interested in activities like shows and clubs and they find out about them through MySpace. The subaltern teens who are better identified as “outsiders” in a hegemonic community tend to be very aware of Facebook. Their choice to use MySpace instead of Facebook is a rejection of the hegemonic values (and a lack of desire to hang out with the preps and jocks even online).

Class divisions in military use

A month ago, the military banned MySpace but not Facebook. This was a very interesting move because the division in the military reflects the division in high schools. Soldiers are on MySpace; officers are on Facebook. Facebook is extremely popular in the military, but it’s not the SNS of choice for 18-year old soldiers, a group that is primarily from poorer, less educated communities. They are using MySpace. The officers, many of whom have already received college training, are using Facebook. The military ban appears to replicate the class divisions that exist throughout the military. …

MySpace is the primary way that young soldiers communicate with their peers. When I first started tracking soldiers’ MySpace profiles, I had to take a long deep breath. Many of them were extremely pro-war, pro-guns, anti-Arab, anti-Muslim, pro-killing, and xenophobic as hell. Over the last year, I’ve watched more and more profiles emerge from soldiers who aren’t quite sure what they are doing in Iraq. I don’t have the data to confirm whether or not a significant shift has occurred but it was one of those observations that just made me think. And then the ban happened. I can’t help but wonder if part of the goal is to cut off communication between current soldiers and the group that the military hopes to recruit.

Thoughts and meta thoughts

People often ask me if I’m worried about teens today. The answer is yes, but it’s not because of social network sites. With the hegemonic teens, I’m very worried about the stress that they’re under, the lack of mobility and healthy opportunities for play and socialization, and the hyper-scheduling and surveillance. I’m worried about their unrealistic expectations for becoming rich and famous, their lack of work ethic after being pampered for so long, and the lack of opportunities that many of them have to even be economically stable, let alone better off than their parents. I’m worried about how locking teens indoors, coupled with a fast food/junk food advertising machine, has resulted in a decrease in health levels across the board, which will just get messy as they are increasingly unable to afford health insurance. When it comes to ostracized teens, I’m worried about the reasons why society has ostracized them and how they will react to ongoing criticism from hegemonic peers. I cringe every time I hear of another Columbine, another Virginia Tech, another site of horror when an outcast teen lashes back at the hegemonic values of society.

I worry about the lack of opportunities available to poor teens from uneducated backgrounds. I’m worried about how Wal-Mart Nation has destroyed many of the opportunities for meaningful working class labor as these youth enter the workforce. I’m worried about what a prolonged war will mean for them. I’m worried about how they’ve been told that to succeed, they must be a famous musician or sports player. I’m worried about how gangs provide the only meaningful sense of community that many of these teens will ever know.

Given the state of what I see in all sorts of neighborhoods, I’m amazed at how well teens are coping and I think that technology has a lot to do with that. Teens are using social network sites to build community and connect with their peers. They are creating publics for socialization. And through it, they are showcasing all of the good, bad, and ugly of today’s teen life.

In the 70s, Paul Willis analyzed British working class youth and he wrote a book called Learning to Labor: How Working Class Kids Get Working Class Jobs. He argued that working class teens will reject hegemonic values because it’s the only way to continue to be a part of the community that they live in. In other words, if you don’t know that you will succeed if you make a run at jumping class, don’t bother – you’ll lose all of your friends and community in the process. His analysis has such strong resonance in American society today. I just wish I knew how to fix it.

My new book – Google Apps Deciphered – is out!

I’m really proud to announce that my 5th book is now out & available for purchase: Google Apps Deciphered: Compute in the Cloud to Streamline Your Desktop. My other books include:

(I’ve also contributed to two others: Ubuntu Hacks: Tips & Tools for Exploring, Using, and Tuning Linux and Microsoft Vista for IT Security Professionals.)

Google Apps Deciphered is a guide to setting up Google Apps, migrating to it, customizing it, and using it to improve productivity, communications, and collaboration. I walk you through each leading component of Google Apps individually, and then show you exactly how to make them work together on the Web or by integrating them with your favorite desktop apps. I provide practical insights on Google Apps programs for email, calendaring, contacts, wikis, word processing, spreadsheets, presentations, video, and even Google’s new web browser Chrome. My aim was to collect and present the tips and tricks I’ve gained by using and setting up Google Apps for clients, family, and friends.

Here’s the table of contents:

  • 1: Choosing an Edition of Google Apps
  • 2: Setting Up Google Apps
  • 3: Migrating Email to Google Apps
  • 4: Migrating Contacts to Google Apps
  • 5: Migrating Calendars to Google Apps
  • 6: Managing Google Apps Services
  • 7: Setting Up Gmail
  • 8: Things to Know About Using Gmail
  • 9: Integrating Gmail with Other Software and Services
  • 10: Integrating Google Contacts with Other Software and Services
  • 11: Setting Up Google Calendar
  • 12: Things to Know About Using Google Calendar
  • 13: Integrating Google Calendar with Other Software and Services
  • 14: Things to Know About Using Google Docs
  • 15: Integrating Google Docs with Other Software and Services
  • 16: Setting Up Google Sites
  • 17: Things to Know About Using Google Sites
  • 18: Things to Know About Using Google Talk
  • 19: Things to Know About Using Start Page
  • 20: Things to Know About Using Message Security and Recovery
  • 21: Things to Know About Using Google Video
  • Appendix A: Backing Up Google Apps
  • Appendix B: Dealing with Multiple Accounts
  • Appendix C: Google Chrome: A Browser Built for Cloud Computing

If you want to know more about Google Apps and how to use it, then I know you’ll enjoy and learn from Google Apps Deciphered. You can read about and buy the book at Amazon (http://www.amazon.com/Google-Apps-Deciphered-Compute-Streamline/dp/0137004702) for $26.39. If you have any questions or comments, don’t hesitate to contact me at scott at granneman dot com.

A single medium, with a single search engine, & a single info source

From Nicholas Carr’s “All hail the information triumvirate!” (Rough Type: 22 January 2009):

Today, another year having passed, I did the searches [on Google] again. And guess what:

World War II: #1
Israel: #1
George Washington: #1
Genome: #1
Agriculture: #1
Herman Melville: #1
Internet: #1
Magna Carta: #1
Evolution: #1
Epilepsy: #1

Yes, it’s a clean sweep for Wikipedia.

The first thing to be said is: Congratulations, Wikipedians. You rule. Seriously, it’s a remarkable achievement. Who would have thought that a rag-tag band of anonymous volunteers could achieve what amounts to hegemony over the results of the most popular search engine, at least when it comes to searches for common topics?

The next thing to be said is: what we seem to have here is evidence of a fundamental failure of the Web as an information-delivery service. Three things have happened, in a blink of history’s eye: (1) a single medium, the Web, has come to dominate the storage and supply of information, (2) a single search engine, Google, has come to dominate the navigation of that medium, and (3) a single information source, Wikipedia, has come to dominate the results served up by that search engine. Even if you adore the Web, Google, and Wikipedia – and I admit there’s much to adore – you have to wonder if the transformation of the Net from a radically heterogeneous information source to a radically homogeneous one is a good thing. Is culture best served by an information triumvirate?

It’s hard to imagine that Wikipedia articles are actually the very best source of information for all of the many thousands of topics on which they now appear as the top Google search result. What’s much more likely is that the Web, through its links, and Google, through its search algorithms, have inadvertently set into motion a very strong feedback loop that amplifies popularity and, in the end, leads us all, lemminglike, down the same well-trod path – the path of least resistance. You might call this the triumph of the wisdom of the crowd. I would suggest that it would be more accurately described as the triumph of the wisdom of the mob. The former sounds benign; the latter, less so.

DIY genetic engineering

From Marcus Wohlsen’s “Amateurs are trying genetic engineering at home” (AP: 25 December 2008):

Now, tinkerers are working at home with the basic building blocks of life itself.

Using homemade lab equipment and the wealth of scientific knowledge available online, these hobbyists are trying to create new life forms through genetic engineering — a field long dominated by Ph.D.s toiling in university and corporate laboratories.

In her San Francisco dining room lab, for example, 31-year-old computer programmer Meredith L. Patterson is trying to develop genetically altered yogurt bacteria that will glow green to signal the presence of melamine, the chemical that turned Chinese-made baby formula and pet food deadly.

Many of these amateurs may have studied biology in college but have no advanced degrees and are not earning a living in the biotechnology field. Some proudly call themselves “biohackers” — innovators who push technological boundaries and put the spread of knowledge before profits.

In Cambridge, Mass., a group called DIYbio is setting up a community lab where the public could use chemicals and lab equipment, including a used freezer, scored for free off Craigslist, that drops to 80 degrees below zero, the temperature needed to keep many kinds of bacteria alive.

Patterson, the computer programmer, wants to insert the gene for fluorescence into yogurt bacteria, applying techniques developed in the 1970s.

She learned about genetic engineering by reading scientific papers and getting tips from online forums. She ordered jellyfish DNA for a green fluorescent protein from a biological supply company for less than $100. And she built her own lab equipment, including a gel electrophoresis chamber, or DNA analyzer, which she constructed for less than $25, versus more than $200 for a low-end off-the-shelf model.