terrorism

Why airport security fails constantly

From Bruce Schneier’s “Airport Passenger Screening” (Crypto-Gram Newsletter: 15 April 2006):

It seems like every time someone tests airport security, airport security fails. In tests between November 2001 and February 2002, screeners missed 70 percent of knives, 30 percent of guns, and 60 percent of (fake) bombs. And recently, testers were able to smuggle bomb-making parts through airport security in 21 of 21 attempts. …

The failure to detect bomb-making parts is easier to understand. Break up something into small enough parts, and it’s going to slip past the screeners pretty easily. The explosive material won’t show up on the metal detector, and the associated electronics can look benign when disassembled. This isn’t even a new problem. It’s widely believed that the Chechen women who blew up the two Russian planes in August 2004 probably smuggled their bombs aboard the planes in pieces. …

Airport screeners have a difficult job, primarily because the human brain isn’t naturally adapted to the task. We’re wired for visual pattern matching, and are great at picking out something we know to look for — for example, a lion in a sea of tall grass.

But we’re much less adept at detecting random exceptions in uniform data. Faced with an endless stream of identical objects, the brain quickly concludes that everything is identical and there’s no point in paying attention. By the time the exception comes around, the brain simply doesn’t notice it. This psychological phenomenon isn’t just a problem in airport screening: It’s been identified in inspections of all kinds, and is why casinos move their dealers around so often. The tasks are simply mind-numbing.

Al Qaeda hijacks web server to distribute video

From Matt Tanase’s Don’t let this happen to you:

Smaller companies often assume they have nothing of interest to hackers. Oftentimes that is the case, but the hackers are still after resources, as in this case. Unfortunately, the hackers here are tied to Al Qaeda. They placed the recent hostage video on a California company’s server. Imagine all of the lovely publicity this brought in.

From News24’s “US firm spread hostage video” (17 June 2004):

Video images of a US engineer taken hostage in Saudi Arabia, possibly by the al-Qaeda network, could have been put on the internet via a US firm based in California, Der Spiegel magazine reported on Thursday.

The video was released on Tuesday and shows relatively high-quality film of hostage Paul Johnson, who kidnappers from a group called “al-Qaeda in the Arabian Peninsula” have threatened to kill by Friday.

The origin of the video was traced to Silicon Valley Land Surveying Incorporated, a California land surveying and mapping company, said Spiegel online, the internet service for the respected German weekly.

The magazine said that according to its research the move was the first time al-Qaeda had “hijacked” a website to broadcast its propaganda.

Social network analysis by the NSA

From John Diamond and Leslie Cauley’s “Pre-9/11 records help flag suspicious calling” (USA TODAY: 22 May 2006):

Armed with details of billions of telephone calls, the National Security Agency used phone records linked to the Sept. 11, 2001 attacks to create a template of how phone activity among terrorists looks, say current and former intelligence officials who were briefed about the program. …

The “call detail records” are the electronic information that is logged automatically each time a call is initiated. For more than 20 years, local and long-distance companies have used call detail records to figure out how much to charge each other for handling calls and to determine problems with equipment.

In addition to the number from which a call is made, the detail records are packed with information. Also included: the number called; the route a call took to reach its final destination; the time, date and place where a call started and ended; and the duration of the call. The records also note whether the call was placed from a cellphone or from a traditional “land line.” …
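
As a rough illustration of what such a record contains, here is a minimal sketch of a call detail record as a Python data class. The field names and types are assumptions for illustration only, not any carrier’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CallDetailRecord:
    """Hypothetical call detail record holding the fields the article lists."""
    originating_number: str   # the number from which the call is made
    called_number: str        # the number called
    route: str                # the route the call took to its final destination
    start_time: datetime      # time and date the call started
    end_time: datetime        # time and date the call ended
    origin_place: str         # place where the call started
    end_place: str            # place where the call ended
    duration_seconds: int     # duration of the call
    is_cellphone: bool        # cellphone vs. traditional "land line"
```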

Calls coming into the country from Pakistan, Afghanistan or the Middle East, for example, are flagged by NSA computers if they are followed by a flood of calls from the number that received the call to other U.S. numbers.

The spy agency then checks the numbers against databases of phone numbers linked to terrorism, the officials say. Those include numbers found during searches of computers or cellphones that belonged to terrorists.
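
Building on the hypothetical CallDetailRecord sketch above, the pattern described here (an incoming call from a watched region, followed by a flood of outgoing calls from the receiving number, then a check of those numbers against terrorism-linked databases) could be sketched roughly as follows. The country codes, flood threshold, and time window are invented for illustration; the article gives no such specifics.

```python
from datetime import timedelta

WATCHED_COUNTRY_CODES = {"92", "93", "966", "971"}   # e.g. Pakistan, Afghanistan, Saudi Arabia, UAE (illustrative)
FLOOD_THRESHOLD = 10                  # assumed size of "a flood" of follow-up calls
FLOOD_WINDOW = timedelta(hours=24)    # assumed window after the incoming call

def is_from_watched_region(number: str) -> bool:
    """Crude country-code prefix check on an international number."""
    digits = number.lstrip("+")
    return any(digits.startswith(code) for code in WATCHED_COUNTRY_CODES)

def flag_suspicious(records: list, watchlist: set):
    """Yield (incoming_call, follow_up_calls, watchlist_hits) triples matching the pattern."""
    for incoming in records:
        if not is_from_watched_region(incoming.originating_number):
            continue
        receiver = incoming.called_number
        follow_ups = [
            r for r in records
            if r.originating_number == receiver
            and incoming.end_time <= r.start_time <= incoming.end_time + FLOOD_WINDOW
        ]
        if len(follow_ups) >= FLOOD_THRESHOLD:
            # Check the numbers that were called against databases of
            # phone numbers linked to terrorism.
            hits = {r.called_number for r in follow_ups} & watchlist
            yield incoming, follow_ups, hits
```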

It is not clear how much terrorist activity, if any, the data collection has helped to find.

Matching identities across databases, anonymously

From MIT Technology Review’s “Blindfolding Big Brother, Sort of”:

In 1983, entrepreneur Jeff Jonas founded Systems Research and Development (SRD), a firm that provided software to identify people and determine who was in their circle of friends. In the early 1990s, the company moved to Las Vegas, where it worked on security software for casinos. Then, in January 2005, IBM acquired SRD and Jonas became chief scientist in the company’s Entity Analytic Solutions group.

His newest technology, which allows entities such as government agencies to match an individual found in one database to that same person in another database, is getting a lot of attention from governments, banks, health-care providers, and, of course, privacy advocates. Jonas claims that his technology is as good at protecting privacy as it is at finding important information. …

JJ: The technique that we have created allows the bank to anonymize its customer data. When I say “anonymize,” I mean it changes the name and address and date of birth, or whatever data they have about an identity, into a numeric value that is nonhuman readable and nonreversible. You can’t run the math backwards and compute from the anonymized value what the original input value was. …
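
Jonas doesn’t say which transform SRD actually uses, only that the output is non-human-readable and nonreversible. A keyed one-way hash over normalized identity fields is one common way to get that property; the sketch below is an assumption, and the shared key, field set, and function name are invented for illustration.

```python
import hashlib
import hmac

SHARED_SECRET = b"key agreed upon by the matching parties"  # assumed, so both sides produce comparable tokens

def anonymize_identity(name: str, address: str, date_of_birth: str) -> str:
    """Turn identity attributes into a nonreversible, non-human-readable token."""
    # Normalize so trivially different spellings of the same identity hash alike.
    normalized = "|".join(
        field.strip().lower() for field in (name, address, date_of_birth)
    )
    # A keyed hash cannot be run backwards to recover the original input values.
    return hmac.new(SHARED_SECRET, normalized.encode("utf-8"), hashlib.sha256).hexdigest()
```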

Here’s the scenario: The government has a list of people we should never let into the country. It’s a secret. They don’t want people in other countries to know. And the government tends to not share this list with corporate America. Now, if you have a cruise line, you want to make sure you don’t have people getting on your boat who shouldn’t even be in the United States in the first place. Prior to the U.S. Patriot Act, the government couldn’t go and subpoena 100,000 records every day from every company. Usually, the government would have to go to a cruise line and have a subpoena for a record. Section 215 [of the Patriot Act] allows the government to go to a business entity and say, “We want all your records.” Now, the Fourth Amendment, which is “search and seizure,” has a legal test called “reasonable and particular.” Some might argue that if a government goes to a cruise line and says, “Give us all your data,” it is hard to envision that this would be reasonable and particular.

But what other solution do they have? There was no other solution. Our Anonymous Resolution technology would allow a government to take its secret list and anonymize it, allow a cruise line to anonymize their passenger list, and then when there’s a match it would tell the government: “record 123.” So they’d look it up and say, “My goodness, it’s Majed Moqed.” And it would tell them which record to subpoena from which organization. Now it’s back to reasonable and particular. ….
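
Continuing that assumed sketch, the matching step then compares only the anonymized tokens: each side hashes its own records, neither sees the other’s raw data, and a hit returns nothing but record identifiers, which is what puts the government back to a “reasonable and particular” subpoena.

```python
def match_anonymized(watchlist_tokens: dict, passenger_tokens: dict) -> list:
    """Compare anonymized tokens; neither side ever sees the other's raw data.

    watchlist_tokens:  {token: government_record_id}
    passenger_tokens:  {token: cruise_line_record_id}
    Returns (government_record_id, cruise_line_record_id) pairs for matches,
    e.g. so the government can subpoena "record 123" from that one organization.
    """
    return [
        (watchlist_tokens[token], passenger_tokens[token])
        for token in watchlist_tokens.keys() & passenger_tokens.keys()
    ]
```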

TR: How is this based on earlier work you did for Las Vegas casinos?

JJ: The ability to figure out if two people are the same despite all the natural variability of how people express their identity is something we really got a good understanding of while assisting the gaming industry. We also learned how people try to fabricate fake identities and how they try to evade systems. It was learning how to do that at high speed that opened the door to make this next thing possible. Had we not solved that in the 1990s, we would not have been able to conjure up a method to do anonymous resolution.

Security will retard innovation

From Technology Review’s “Terror’s Server”:

Zittrain [Jonathan Zittrain, codirector of the Berkman Center for Internet and Society at Harvard Law School] concurs with Neumann [Peter Neumann, a computer scientist at SRI International, a nonprofit research institute in Menlo Park, CA] but also predicts an impending overreaction. Terrorism or no terrorism, he sees a convergence of security, legal, and business trends that will force the Internet to change, and not necessarily for the better. “Collectively speaking, there are going to be technological changes to how the Internet functions — driven either by the law or by collective action. If you look at what they are doing about spam, it has this shape to it,” Zittrain says. And while technological change might improve online security, he says, “it will make the Internet less flexible. If it’s no longer possible for two guys in a garage to write and distribute killer-app code without clearing it first with entrenched interests, we stand to lose the very processes that gave us the Web browser, instant messaging, Linux, and e-mail.”

Terrorist social networks

From Technology Review’s “Terror’s Server”:

For example, research suggests that people with nefarious intent tend to exhibit distinct patterns in their use of e-mails or online forums like chat rooms. Whereas most people establish a wide variety of contacts over time, those engaged in plotting a crime tend to keep in touch only with a very tight circle of people, says William Wallace, an operations researcher at Rensselaer Polytechnic Institute.

This phenomenon is quite predictable. “Very few groups of people communicate repeatedly only among themselves,” says Wallace. “It’s very rare; they don’t trust people outside the group to communicate. When 80 percent of communications is within a regular group, this is where we think we will find the groups who are planning activities that are malicious.” Of course, not all such groups will prove to be malicious; the odd high-school reunion will crop up. But Wallace’s group is developing an algorithm that will narrow down the field of so-called social networks to those that warrant the scrutiny of intelligence officials. The algorithm is scheduled for completion and delivery to intelligence agencies this summer. …
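
The article reports only the 80-percent figure, not Wallace’s actual algorithm. A naive way to apply that threshold to a log of messages might look like the sketch below; the data layout and function names are assumptions.

```python
def internal_communication_ratio(messages, group):
    """Fraction of a group's outgoing messages that stay inside the group.

    messages: iterable of (sender, recipient) pairs
    group:    set of identifiers belonging to the candidate group
    """
    sent_by_group = [(s, r) for s, r in messages if s in group]
    if not sent_by_group:
        return 0.0
    internal = sum(1 for _, r in sent_by_group if r in group)
    return internal / len(sent_by_group)

def flag_insular_groups(messages, candidate_groups, threshold=0.8):
    """Return candidate groups that communicate at least 80% among themselves."""
    messages = list(messages)  # may need to iterate more than once
    return [g for g in candidate_groups
            if internal_communication_ratio(messages, g) >= threshold]
```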

How terrorists use the Web

From Technology Review’s “Terror’s Server”:

According to [Gabriel] Weimann [professor of communications at University of Haifa], the number of [terror-related] websites has leapt from only 12 in 1997 to around 4,300 today. …

These sites serve as a means to recruit members, solicit funds, and promote and spread ideology. …

The September 11 hijackers used conventional tools like chat rooms and e-mail to communicate and used the Web to gather basic information on targets, says Philip Zelikow, a historian at the University of Virginia and the former executive director of the 9/11 Commission. …

Finally, terrorists are learning that they can distribute images of atrocities with the help of the Web. … “The Internet allows a small group to publicize such horrific and gruesome acts in seconds, for very little or no cost, worldwide, to huge audiences, in the most powerful way,” says Weimann. …

Bruce Schneier on what we should do

From Bruce Schneier’s “Searching Bags in Subways”:

Final note: I often get comments along the lines of “Stop criticizing stuff; tell us what we should do.” My answer is always the same. Counterterrorism is most effective when it doesn’t make arbitrary assumptions about the terrorists’ plans. Stop searching bags on the subways, and spend the money on 1) intelligence and investigation — stopping the terrorists regardless of what their plans are, and 2) emergency response — lessening the impact of a terrorist attack, regardless of what the plans are. Countermeasures that defend against particular targets, or assume particular tactics, or cause the terrorists to make insignificant modifications in their plans, or that surveil the entire population looking for the few terrorists, are largely not worth it.

Tracking terrorists with Unintended Information Revelation

From “New search engine to help thwart terrorists”:

With news that the London bombers were British citizens, radicalised on the streets of England and with squeaky-clean police records, comes the realisation that new mechanisms for hunting terrorists before they strike must be developed.

Researchers at the University of Buffalo, US, believe they have discovered a technique that will reveal information on public web sites that was not intended to be published.

The United States Federal Aviation Administration (FAA) and the National Science Foundation (NSF) are supporting the development of a new search engine based on Unintended Information Revelation (UIR), and designed for anti-terrorism applications.

UIR supposes that snippets of information – that by themselves appear to be innocent – may be linked together to reveal highly sensitive data.

… “A concept chain graph will show you what’s common between two seemingly unconnected things,” said Srihari. “With regular searches, the input is a set of key words, the search produces a ranked list of documents, any one of which could satisfy the query.

“UIR, on the other hand, is a composite query, not a keyword query. It is designed to find the best path, the best chain of associations between two or more ideas. It returns to you an evidence trail that says, ‘This is how these pieces are connected.’”
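
Srihari’s “best chain of associations” description suggests a path search over a graph of concepts. The sketch below builds such a graph from concept co-occurrence within documents and finds a connecting chain with a plain breadth-first search; it is a toy stand-in under those assumptions, not the UIR engine itself.

```python
from collections import defaultdict, deque
from itertools import combinations

def build_concept_graph(documents):
    """documents: iterable of concept sets; concepts co-occurring in a document get an edge."""
    graph = defaultdict(set)
    for concepts in documents:
        for a, b in combinations(concepts, 2):
            graph[a].add(b)
            graph[b].add(a)
    return graph

def concept_chain(graph, start, goal):
    """Breadth-first search for the shortest chain of associations between two concepts."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path  # the "evidence trail" showing how the pieces are connected
        for neighbor in graph[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no chain of associations found
```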

Don’t fly where we won’t tell you not to fly

From Bruce Schneier’s “The Silliness of Secrecy”, quoting The Wall Street Journal:

Ever since Sept. 11, 2001, the federal government has advised airplane pilots against flying near 100 nuclear power plants around the country or they will be forced down by fighter jets. But pilots say there’s a hitch in the instructions: aviation security officials refuse to disclose the precise location of the plants because they consider that “SSI” — Sensitive Security Information.

“The message is: ‘Please don’t fly there, but we can’t tell you where there is,’” says Melissa Rudinger of the Aircraft Owners and Pilots Association, a trade group representing 60% of American pilots.

Determined to find a way out of the Catch-22, the pilots’ group sat down with a commercial mapping company, and in a matter of days plotted the exact geographical locations of the plants from data found on the Internet and in libraries. It made the information available to its 400,000 members on its Web site — until officials from the Transportation Security Administration asked them to take the information down. “Their concern was that [terrorists] mining the Internet could use it,” Ms. Rudinger says.

How to fake an anthrax scare

From Bruce Schneier’s “White Powder Anthrax Hoaxes”:

Earlier this month, there was an anthrax scare at the Indonesian embassy in Australia. Someone sent them some white powder in an envelope, which was scary enough. Then it tested positive for bacillus. The building was decontaminated, and the staff was quarantined for twelve hours. By then, tests came back negative for anthrax.

A lot of thought went into this false alarm. The attackers obviously knew that their white powder would be quickly tested for the presence of a bacterium of the bacillus family (of which anthrax is a member), but that the bacillus would have to be cultured for a couple of days before a more exact identification could be made. So even without any anthrax, they managed to cause two days of terror.

… In an interesting side note, the media have revealed for the first time that 360 “white powder” incidents have taken place since 11 September 2001. This news had been suppressed by the government, which had issued D notices to the media for all such incidents. So there has been one such incident approximately every four days — an astonishing number, given Australia’s otherwise low crime rate.
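
A quick back-of-the-envelope check of that “every four days” figure, assuming the column was written in mid-2005 (the exact endpoint is an assumption):

```python
from datetime import date

incidents = 360
span_days = (date(2005, 6, 1) - date(2001, 9, 11)).days  # assumed mid-2005 endpoint
print(span_days / incidents)  # about 3.8 days between incidents
```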

Risk management

From Glenn Fleishman’s post to the Interesting People mailing list:

I heard the strangely frank head of TSA on NPR this morning–perhaps he forgot he was speaking to the public?–talk quite honestly about what I would describe as “yield management for risk.”

Basically:

* The pilots are now protected, so the plane won’t be weaponized even if many passengers were to die on board.
* Passengers will overwhelm someone armed with relatively minor weapons, even if some passengers die. That’s acceptable risk.
* A lot of stuff on planes can be used as weapons already (he didn’t elaborate).
* The evaluated risk of smaller knives is low in their testing — meaning whatever air marshals wear for protection will resist punctures from smaller knives.

He said the focus is now on explosive detection.
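
One way to read this “yield management for risk” framing is as ranking threats by expected impact (likelihood times consequence) and accepting the cheap ones. The toy sketch below uses entirely invented numbers to illustrate why explosives end up at the top of the list.

```python
# All likelihood and impact figures below are invented for illustration only.
threats = {
    "hijack cockpit with a small knife":   {"likelihood": 0.01, "impact": 1000},  # cockpit now protected
    "small knife used against passengers": {"likelihood": 0.20, "impact": 10},    # passengers will overwhelm the attacker
    "improvised weapon from onboard items": {"likelihood": 0.50, "impact": 5},
    "explosive device smuggled aboard":     {"likelihood": 0.05, "impact": 1000},
}

ranked = sorted(threats.items(),
                key=lambda kv: kv[1]["likelihood"] * kv[1]["impact"],
                reverse=True)
for name, t in ranked:
    print(f"{name}: expected impact {t['likelihood'] * t['impact']:.1f}")
```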

Jans clarifies it for us

Back in November 2002, a bunch of us went camping in a cabin in the woods. Around midnight, we were sitting around the fire, talking. The subject of crime came up, specifically the statute of limitations.

Scott: I think the only crimes the statute of limitations doesn’t apply to are murder and rape.

Denise: That’s right.

Scott: What about terrorism? Is there no statute of limitations on that?

Paul: Well, usually terrorism includes murder.

Jans: If there’s no murder, then it’s just scaryism.
