CRYPTO-GRAM
September 15, 2006
by Bruce Schneier
Founder and CTO
Counterpane Internet Security, Inc.
[log in to unmask]
http://www.schneier.com
http://www.counterpane.com
A free monthly newsletter providing summaries, analyses, insights, and
commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit
<http://www.schneier.com/crypto-gram.html>.
You can read this issue on the web at
<http://www.schneier.com/crypto-gram-0609.html>. These same essays
appear in the "Schneier on Security" blog:
<http://www.schneier.com/blog>. An RSS feed is available.
** *** ***** ******* *********** *************
In this issue:
What the Terrorists Want
Details on the British Terrorist Arrest
More Than 10 Ways to Avoid the Next 9/11
Fifth Anniversary of September 11, 2001
Crypto-Gram Reprints
Educating Users
Human/Bear Security Trade-Off
Land Title Fraud
News
Is There Strategic Software?
Media Sanitization and Encryption
What is a Hacker?
Counterpane News
TrackMeNot
USBDumper
Microsoft and FairUse4WM
Comments from Readers
** *** ***** ******* *********** *************
What the Terrorists Want
On August 16, two men were escorted off a plane headed for Manchester,
England, because some passengers thought they looked either Asian or
Middle Eastern, might have been speaking Arabic, wore leather jackets,
and looked at their watches -- and the passengers refused to fly with
them on board. The men were questioned for several hours and then released.
On August 15, an entire airport terminal was evacuated because someone's
cosmetics triggered a false positive for explosives. The same day, a
Muslim man was removed from an airplane in Denver for reciting prayers.
The Transportation Security Administration decided that the flight crew
overreacted, but he still had to spend the night in Denver before flying
home the next day. The next day, a Port of Seattle terminal was
evacuated because a couple of dogs gave a false alarm for explosives.
On August 19, a plane made an emergency landing in Tampa, Florida, after
the crew became suspicious because two of the lavatory doors were
locked. The plane was searched, but nothing was found. Meanwhile, a man
who tampered with a bathroom smoke detector on a flight to San Antonio
was cleared of terrorism, but only after having his house searched.
On August 16, a woman suffered a panic attack and became violent on a
flight from London to Washington, so the plane was escorted to the
Boston airport by fighter jets. "The woman was carrying hand cream and
matches but was not a terrorist threat," said the TSA spokesman after
the incident.
And on August 18, a plane flying from London to Egypt made an emergency
landing in Italy when someone found a bomb threat scrawled on an air
sickness bag. Nothing was found on the plane, and no one knows how long
the note was on board.
I'd like everyone to take a deep breath and listen for a minute.
The point of terrorism is to cause terror, sometimes to further a
political goal and sometimes out of sheer hatred. The people terrorists
kill are not the targets; they are collateral damage. And blowing up
planes, trains, markets, or buses is not the goal; those are just
tactics. The real targets of terrorism are the rest of us: the billions
of us who are not killed but are terrorized because of the killing. The
real point of terrorism is not the act itself, but our reaction to the act.
And we're doing exactly what the terrorists want.
We're all a little jumpy after the recent arrest of 23 terror suspects
in Great Britain. The men were reportedly plotting a liquid-explosive
attack on airplanes, and both the press and politicians have been
trumpeting the story ever since.
In truth, it's doubtful that their plan would have succeeded; chemists
have been debunking the idea since it became public. Certainly the
suspects were a long way off from trying: None had bought airline
tickets, and some didn't even have passports.
Regardless of the threat, from the would-be bombers' perspective, the
explosives and planes were merely tactics. Their goal was to cause
terror, and in that they've succeeded.
Imagine for a moment what would have happened if they had blown up ten
planes. There would be canceled flights, chaos at airports, bans on
carry-on luggage, world leaders talking tough new security measures,
political posturing and all sorts of false alarms as jittery people
panicked. To a lesser degree, that's basically what's happening right now.
Our politicians help the terrorists every time they use fear as a
campaign tactic. The press helps every time it writes scare stories
about the plot and the threat. And if we're terrified, and we share that
fear, we help. All of these actions intensify and repeat the terrorists'
actions, and increase the effects of their terror.
(I am not saying that the politicians and press are terrorists, or that
they share any of the blame for terrorist attacks. I'm not that stupid.
But the subject of terrorism is more complex than it appears, and
understanding its various causes and effects is vital for understanding
how to best deal with it.)
The implausible plots and false alarms actually hurt us in two ways. Not
only do they increase the level of fear, but they also waste time and
resources that could be better spent fighting the real threats and
increasing actual security. I'll bet the terrorists are laughing at us.
Another thought experiment: Imagine for a moment that the British
government arrested the 23 suspects without fanfare. Imagine that the
TSA and its European counterparts didn't engage in pointless airline
security measures like banning liquids. And imagine that the press
didn't write about it endlessly, and that the politicians didn't use the
event to remind us all how scared we should be. If we'd reacted that
way, then the terrorists would have truly failed.
It's time we calm down and fight terror with anti-terror. This does not
mean that we simply roll over and accept terrorism. There are things our
government can and should do to fight terrorism, most of them involving
intelligence and investigation -- and not focusing on specific plots.
But our job is to remain steadfast in the face of terror, to refuse to
be terrorized. Our job is to not panic every time two Muslims stand
together checking their watches. There are approximately 1 billion
Muslims in the world, a large percentage of them not Arab, and about 320
million Arabs in the Middle East, the overwhelming majority of them not
terrorists. Our job is to think critically and rationally, and to ignore
the cacophony of other interests trying to use terrorism to advance
political careers or increase a television show's viewership.
The surest defense against terrorism is to refuse to be terrorized. Our
job is to recognize that terrorism is just one of the risks we face, and
not a particularly common one at that. And our job is to fight those
politicians who use fear as an excuse to take away our liberties and
promote security theater that wastes money and doesn't make us any safer.
Incidents:
http://www.dailymail.co.uk/pages/live/articles/news/news.html?in_article_id=
401419&in_page_id=1770
or http://tinyurl.com/k5njg
http://news.bbc.co.uk/2/hi/uk_news/england/5267884.stm
http://www.cbsnews.com/stories/2006/08/17/national/main1906433.shtml
http://www.cbc.ca/story/canada/national/2006/08/18/doctor-winnipeg.html
or http://tinyurl.com/emnox
http://www.heraldnet.com/stories/06/08/16/100wir_port1.cfm
http://www.miami.com/mld/miamiherald/news/local/states/florida/counties/brow
ard_county/15321870.htm
or http://tinyurl.com/s5oxe
http://www.usatoday.com/news/nation/2006-08-20-fbi-passenger_x.htm
http://www.theage.com.au/articles/2006/08/17/1155407916156.html
http://www.guardian.co.uk/uklatest/story/0,,-6024132,00.html
http://news.bbc.co.uk/2/hi/europe/5283476.stm
http://forums.worldofwarcraft.com/thread.html?topicId=11211166
There have been many more incidents since I wrote this -- all false
alarms. I've stopped keeping a list.
The chemical unreality of the plot:
http://www.theregister.co.uk/2006/08/17/flying_toilet_terror_labs/print.html
or http://tinyurl.com/eeen2
http://www.interesting-people.org/archives/interesting-people/200608/msg0008
7.html
or http://tinyurl.com/etrl8
http://www.boingboing.net/2006/08/14/tatp_about_that_pyro.html
http://www.timesonline.co.uk/article/0,,2-2306994,00.html
http://www.cnn.com/2006/US/08/10/us.security/index.html
http://www.wondermark.com/d/220.html
http://kfmonkey.blogspot.com/2006/08/wait-arent-you-scared.html
This essay also makes the same point that we're overreacting, as well as
describing a 1995 terrorist plot that was remarkably similar in both
materials and modus operandi -- and didn't result in a complete ban on
liquids.
http://www.salon.com/opinion/feature/2006/08/17/airport_futility/
My previous related writings:
http://www.schneier.com/essay-096.html
http://www.schneier.com/essay-038.html
http://www.schneier.com/blog/archives/2006/08/terrorism_secur.html
http://www.schneier.com/essay-087.html
http://www.schneier.com/essay-045.html
This essay originally appeared in Wired:
http://www.wired.com/news/columns/0,71642-0.html
** *** ***** ******* *********** *************
Details on the British Terrorist Arrest
Details are emerging:
* There was some serious cash flow from someone, presumably someone
abroad.
* There was no imminent threat.
* However, the threat was real. And it seems pretty clear that it
would have bypassed all existing airport security systems.
* The conspirators were radicalized by the war in Iraq, although it is
impossible to say whether they would have been otherwise radicalized
without it.
* They were caught through police work, not through any broad
surveillance, and were under surveillance for more than a year.
What pisses me off most is the second item. By arresting the
conspirators early, the police squandered the chance to learn more about
the network and arrest more of them -- and to present a less flimsy
case. There have been many news reports detailing how the U.S.
pressured the UK government to make the arrests sooner, possibly out of
political motivations. (And then Scotland Yard got annoyed at the U.S.
leaking plot details to the press, hampering their case.)
I still think that all of the new airline security measures are an
overreaction. As I said in a radio interview a couple of weeks ago:
"We ban guns and knives, and the terrorists use box cutters. We ban box
cutters and corkscrews, and they hide explosives in their shoes. We
screen shoes, and the terrorists use liquids. We ban liquids, and the
terrorists will use something else. It's not a fair game, because the
terrorists get to see our security measures before they plan their
attack." And it's not a game we can win. So let's stop playing, and
play a game we actually can win. The real lesson of the London arrests
is that investigation and intelligence work.
http://www.nytimes.com/2006/08/28/world/europe/28plot.html?ex=1314417600&en=
3bd0e2092e48e4f1&ei=5090&partner=rssuserland&emc=rss
or http://tinyurl.com/gnbeb
The above URL is unavailable in the UK:
http://www.nytimes.com/2006/08/29/business/media/29times.html?ex=1314504000&
en=d2eb8d24ef801b5f&ei=5090&partner=rssuserland&emc=rss
or http://tinyurl.com/n3lxo
http://www.timesonline.co.uk/article/0,,200-2322630,00.html
http://msnbc.msn.com/id/14320452/
http://www.craigmurray.co.uk/archives/2006/08/the_uk_terror_p.html
http://observer.guardian.co.uk/politics/story/0,,1854503,00.html
http://images.ucomics.com/comics/tmclo/2006/tmclo060817.gif
http://www.webcomicsnation.com/ericburns/stark/series.php?ID=39633
My initial comments on the arrests:
http://www.schneier.com/blog/archives/2006/08/terrorism_secur.html
** *** ***** ******* *********** *************
More Than 10 Ways to Avoid the Next 9/11
On 10 September 2006, the New York Times published a feature called "Ten
Ways to Avoid the Next 9/11": "The Op-Ed page asked 10 people with
experience in security and counterterrorism to answer the following
question: What is one major reason the United States has not suffered a
major attack since 2001, and what is the one thing you would recommend
the nation do in order to avoid attacks in the future?"
Actually, they asked more than 10, myself included. But some of us were
cut because they didn't have enough space. This was my essay:
Despite what you see in the movies and on television, it's actually very
difficult to execute a major terrorist act. It's hard to organize,
plan, and execute an attack, and it's all too easy to slip up and get
caught. Combine that with our intelligence work tracking terrorist
cells and interdicting terrorist funding, and you have a climate where
major attacks are rare. In many ways, the success of 9/11 was an
anomaly; there were many points where it could have failed. The main
reason we haven't seen another 9/11 is that it isn't as easy as it looks.
Many of our counterterrorism efforts are nothing more than security
theater: ineffectual measures that look good. Forget the war on terror;
the difficulty isn't killing or arresting the terrorists, it's finding
them. Terrorism is a law enforcement problem, and needs to be treated
as such. For example, none of our post-9/11 airline security measures
would have stopped the London shampoo bombers. The lesson of London is
that our best defense is intelligence and investigation. Rather than
spending money on airline security, or sports stadium security --
measures that require us to guess the plot correctly in order to be
effective -- we're better off spending money on measures that are
effective regardless of the plot.
Intelligence and investigation have kept us safe from terrorism in the
past, and will continue to do so in the future. If the CIA and FBI had
done a better job of coordinating and sharing data in 2001, 9/11 would
have been another failed attempt. Coordination has gotten better, and
those agencies are better funded -- but it's still not enough. Whenever
you read about the billions being spent on national ID cards or massive
data mining programs or new airport security measures, think about the
number of intelligence agents that the same money could buy. That's
where we're going to see the greatest return on our security investment.
http://www.nytimes.com/2006/08/29/business/media/29times.html?ex=1314504000&
en=d2eb8d24ef801b5f&ei=5090&partner=rssuserland&emc=rss
or http://tinyurl.com/n3lxo
** *** ***** ******* *********** *************
Fifth Anniversary of September 11, 2001
It occurs to me that many people here didn't read what I wrote a few
days after 9/11, or what I wrote a couple of weeks after that.
http://www.schneier.com/crypto-gram-0109.html#1
http://www.schneier.com/crypto-gram-0109a.html
** *** ***** ******* *********** *************
Crypto-Gram Reprints
Crypto-Gram is currently in its ninth year of publication. Back issues
cover a variety of security-related topics, and can all be found on
<http://www.schneier.com/crypto-gram-back.html>. These are a selection
of articles that appeared in this calendar month in other years.
Movie-Plot Threats:
http://www.schneier.com/crypto-gram-0509.html#1
Hurricane Katrina and Security:
http://www.schneier.com/crypto-gram-0509.html#2
Trusted Computing Best Practices:
http://www.schneier.com/crypto-gram-0509.html#13
Security at the Olympics:
http://www.schneier.com/crypto-gram-0409.html#2
Trusted Traveler program:
http://www.schneier.com/crypto-gram-0409.html#5
No-Fly List:
http://www.schneier.com/crypto-gram-0409.html#10
Accidents and Security Incidents:
http://www.schneier.com/crypto-gram-0309.html#1
Benevolent Worms:
http://www.schneier.com/crypto-gram-0309.html#8
Special issue on 9/11, including articles on airport security,
biometrics, cryptography, steganography, intelligence failures, and
protecting liberty:
http://www.schneier.com/crypto-gram-0109a.html
Full Disclosure and the Window of Exposure:
http://www.schneier.com/crypto-gram-0009.html#1
Open Source and Security:
http://www.schneier.com/crypto-gram-9909.html#OpenSourceandSecurity or
http://makeashorterlink.com/?U25716849
Factoring a 512-bit Number:
http://www.schneier.com/crypto-gram-9909.html#Factoringa512-bitNumber or
http://makeashorterlink.com/?J17752849
** *** ***** ******* *********** *************
Educating Users
I've met users, and they're not fluent in security. They might be fluent
in spreadsheets, eBay, or sending jokes over e-mail, but they're not
technologists, let alone security people. Of course they're making all
sorts of security mistakes. I too have tried educating users, and I
agree that it's largely futile.
Part of the problem is generational. We've seen this with all sorts of
technologies: electricity, telephones, microwave ovens, VCRs, video
games. Older generations approach newfangled technologies with
trepidation, distrust, and confusion, while the children who grew up
with them understand them intuitively.
But while the don't-get-it generation will die off eventually, we won't
suddenly enter an era of unprecedented computer security. Technology
moves too fast these days; there's no time for any generation to become
fluent in anything.
Earlier this year, researchers ran an experiment in London's financial
district. Someone stood on a street corner and handed out CDs, saying
they were a "special Valentine's Day promotion." Many people, some
working at sensitive bank workstations, ran the program on the CDs on
their work computers. The program was benign -- all it did was alert
some computer on the Internet that it was running -- but it could just
as easily have been malicious. The researchers concluded that users don't
care about security. That's simply not true. Users care about security
-- they just don't understand it.
I don't see a failure of education; I see a failure of technology. It
shouldn't have been possible for those users to run that CD, or for a
random program stuffed into a banking computer to "phone home" across
the Internet.
The real problem is that computers don't work well. The industry has
convinced everyone that people need a computer to survive, and at the
same time it's made computers so complicated that only an expert can
maintain them.
If I try to repair my home heating system, I'm likely to break all sorts
of safety rules. I have no experience in that sort of thing, and
honestly, there's no point in trying to educate me. But the heating
system works fine without my having to learn anything about it. I know
how to set my thermostat and to call a professional if anything goes wrong.
Punishment isn't something you do instead of education; it's a form of
education -- a very primal form of education best suited to children and
animals (and experts aren't so sure about children). I say we stop
punishing people for failures of technology, and demand that computer
companies market secure hardware and software.
This originally appeared in the April 2006 issue of "Information
Security Magazine," as the second part of a point/counterpoint with
Marcus Ranum. You can read Marcus's essay here:
http://www.ranum.com/security/computer_security/editorials/point-counterpoin
t/users.html
or http://tinyurl.com/pgyp4
** *** ***** ******* *********** *************
Human/Bear Security Trade-Off
I like this example from Slashdot: "Back in the 1980s, Yosemite
National Park was having a serious problem with bears: They would wander
into campgrounds and break into the garbage bins. This put both bears
and people at risk. So the Park Service started installing armored
garbage cans that were tricky to open -- you had to swing a latch, align
two bits of handle, that sort of thing. But it turns out it's actually
quite tricky to get the design of these cans just right. Make it *too*
complex and people can't get them open to put away their garbage in the
first place. Said one park ranger, 'There is considerable overlap
between the intelligence of the smartest bears and the dumbest tourists.'"
It's a tough balance to strike. People are smart, but they're impatient
and unwilling to spend a lot of time solving the problem. Bears are
dumb, but they're tenacious and are willing to spend hours solving the
problem. Given those two constraints, creating a trash can that can
both work for people and not work for bears is not easy.
http://yro.slashdot.org/comments.pl?sid=191810&cid=15757347
** *** ***** ******* *********** *************
Land Title Fraud
There seems to be a small epidemic of land title fraud in Ontario, Canada.
What happens is someone impersonates the homeowner, and then sells the
house out from under him. The former owner is still liable for the
mortgage, but can't get into his former house. Cleaning up the problem
takes a lot of time and energy.
The problem is one of economic incentives. If banks were held liable
for fraudulent mortgages, then the problem would go away really quickly.
But as long as they're not, they have no incentive to ensure that this
fraud doesn't occur. (They have some incentive, because the fraud costs
them money, but as long as the few fraud cases cost less than ensuring
the validity of *every* mortgage, they'll just ignore the problem and
eat the losses when fraud occurs.)
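The bank's calculus in that parenthetical can be made concrete with toy numbers (all figures below are hypothetical, invented purely for illustration):

```python
# Hypothetical figures -- which is cheaper for a bank: rigorously
# verifying every mortgage, or eating the occasional fraud loss?
mortgages_per_year = 100_000
verification_cost_per_mortgage = 50      # extra identity/title checking
fraud_cases_per_year = 20
loss_per_fraud_case = 200_000

cost_of_verifying = mortgages_per_year * verification_cost_per_mortgage
cost_of_eating_fraud = fraud_cases_per_year * loss_per_fraud_case

print(cost_of_verifying)     # 5000000
print(cost_of_eating_fraud)  # 4000000 -- ignoring the fraud is cheaper
```

With these numbers, ignoring the fraud is the rational choice. Holding banks liable for the full cost of each fraudulent sale raises the per-case loss until verification becomes the cheaper option -- which is exactly why liability changes incentives.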
http://www.thestar.com/NASApp/cs/ContentServer?pagename=thestar/Layout/Artic
le_Type1&c=Article&cid=1156542610726&call_pageid=968332188774&col=9683501164
67
or http://tinyurl.com/g5qyr
** *** ***** ******* *********** *************
News
Last year, New York City implemented a program of random bag searches in
the subways. It was a silly idea, and I wrote about it then. Recently
the U.S. Court of Appeals for the 2nd Circuit upheld the program.
Daniel Solove wrote about the ruling.
http://www.concurringopinions.com/archives/2006/08/nyc_subway_sear_2.html
or http://tinyurl.com/q8j99
My commentary from last year:
http://www.schneier.com/blog/archives/2005/07/searching_bags.html
A futile attempt to improve the security of Japan's hanko identification
system.
http://asia.cnet.com/reviews/blog/mobileojisan/0,39050793,39390184,00.htm
or http://tinyurl.com/mgjyd
A 1963 FBI book on fingerprinting, with an introduction by J. Edgar
Hoover, is on Project Gutenberg.
http://www.gutenberg.org/files/19022/19022-h/19022-h.htm
You can buy a real copy here.
http://www.antiqbook.com/boox/cro/7958.shtml
A 2001 story about people dressing up as Australian census takers to
collect personal data for fraudulent purposes.
http://tinyurl.com/qftxy
The age of this story makes it more interesting. This is the sort of
identity-theft tactic that I would have expected to see this year, as
criminals have gotten more and more sophisticated. It surprises me that
they were doing this five years ago as well.
"Ten Worst Privacy Debacles of All Time." Not a bad list.
http://www.wired.com/news/politics/privacy/0,71622-0.html
Daniel Solove comments:
http://www.concurringopinions.com/archives/2006/08/the_ten_greates.html
"You are what you say: privacy risks of public mentions," Proceedings of
the 29th Annual International ACM SIGIR Conference on Research and
Development in Information Retrieval, 2006.
http://portal.acm.org/citation.cfm?doid=1148170.1148267
http://www-users.cs.umn.edu/~dfrankow/files/privacy-sigir2006.pdf
Kobi Alexander fled the United States ten days ago. He was tracked down
in Sri Lanka via a Skype call. Ars Technica explains: "The fugitive
former CEO may have been convinced that using Skype made him safe from
tracking, but he -- and everyone else that believes VoIP is inherently
more secure than a landline -- was wrong. Tracking anonymous
peer-to-peer VoIP traffic over the Internet is possible. In fact, it can
be done even if the parties have taken some steps to disguise the
traffic." Let this be a warning to all of you who thought Skype was
anonymous.
http://www.haaretz.com/hasen/spages/754476.html
http://arstechnica.com/news.ars/post/20060824-7582.html
http://ise.gmu.edu/~xwangc/Publications/CCS05-VoIPTracking.pdf
Stephen Colbert on protecting your computer:
http://www.comedycentral.com/shows/the_colbert_report/videos/season_2/index.
jhtml?playVideo=72869&rsspartner=rssfofReduxx
or http://tinyurl.com/pz4o4
http://www.comedycentral.com/shows/the_colbert_report/videos/season_2/index.
jhtml?playVideo=72870&rsspartner=rssfofReduxx
or http://tinyurl.com/qjq4w
Stupid Security Award nominations open:
http://www.privacyinternational.org/article.shtml?cmd[347]=x-347-541996
or http://tinyurl.com/hhgzf
Call forwarding credit card scam:
http://www.schneier.com/blog/archives/2006/08/call_forwarding_1.html
World War II statistics-and-security story: estimating the number of
tanks the Germans produced:
http://www.guardian.co.uk/g2/story/0,,1824525,00.html
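The statistical method behind that story is the classic "German tank problem": given k captured serial numbers drawn without replacement from 1..N, the minimum-variance unbiased estimate of N is m + m/k - 1, where m is the largest serial number observed. A minimal sketch (the function name is mine):

```python
def estimate_total(serials):
    """Estimate how many items exist in total, given serial numbers
    sampled without replacement from 1..N (the German tank problem).
    Uses the minimum-variance unbiased estimator: m + m/k - 1."""
    k = len(serials)
    m = max(serials)
    return m + m / k - 1

# Four captured tanks with these serial numbers suggest ~74 were built:
print(estimate_total([19, 40, 42, 60]))  # 74.0
```

The intuition: the sample maximum m underestimates N, and the average gap between ordered serial numbers (roughly m/k) tells you how far past m the true total likely extends.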
"The Dread Pirate Bin Ladin" argues that, legally, terrorists should be
treated as pirates under international law:
http://www.legalaffairs.org/issues/July-August-2005/feature_burgess_julaug05
.msp
or http://tinyurl.com/am9a9
Ross Anderson's "Security Engineering" is a great book. And I'm not
saying that because I wrote the foreword. Since it was published in
2001, I have regularly recommended it to engineers interested in
security. None of this is news. What is news is that you can download
the book, free and legally.
http://www.cl.cam.ac.uk/~rja14/book.html
Some news about behavioral profiling as a counterterrorism measure:
http://www.schneier.com/blog/archives/2006/08/behavioral_prof.html
And behavioral profiling caught Warren Jeffs:
http://www.schneier.com/blog/archives/2006/08/behavioral_prof_2.html
Don't use Browzar:
http://web3.0log.org/2006/09/01/new-secure-browser-browzar-is-fake-and-full-
of-adware/
or http://tinyurl.com/q7pxy
An anti-terrorism expert claimed to have smuggled a bomb onto an
airplane, twice. Then he recanted. Near as I can tell, he's an idiot.
http://www.schneier.com/blog/archives/2006/09/man_claims_to_h.html
Airport security cartoons, lots of them:
http://www.schneier.com/blog/archives/2006/09/airport_securit_1.html
This is absolutely essential reading for anyone interested in how the
U.S. is prosecuting terrorism. Put aside the rhetoric and the
posturing; this is what is actually happening. Transactional Records
Access Clearinghouse (TRAC) puts this data together by looking at
Justice Department records. The data research organization is connected
to Syracuse University, and has been doing this sort of thing --
tracking what federal agencies actually do rather than what they say
they do -- for over fifteen years.
http://trac.syr.edu/tracreports/terrorism/169/
I am particularly entertained by the Justice Department's rebuttal,
which basically just calls the study names without offering any
substantive criticism:
http://www.detnews.com/apps/pbcs.dll/article?AID=/20060904/NATION/609040358/
or http://tinyurl.com/r3s2b
People sell, give away, and throw away their cell phones without even
thinking about the data still on them:
http://www.cnn.com/2006/TECH/ptech/08/30/betrayed.byacellphone.ap/index.html
or http://tinyurl.com/z5a73
More and more, our data is not really under our control. We store it on
devices and third-party websites, or on our own computer. We try to
erase it, but we really can't. We try to control its dissemination, but
it's harder and harder.
California is about to secure wireless networks with stickers:
http://www.theregister.co.uk/2006/09/04/wi-fi_warnings_legislated/
An August 2005 cover story from Business Week on "The State of
Surveillance":
http://www.businessweek.com/magazine/content/05_32/b3946001_mz001.htm
A CIO Insight article on the death of privacy:
http://www.cioinsight.com/article2/0,1540,2012398,00.asp
And here's my essay on "The Future of Privacy."
http://www.schneier.com/blog/archives/2006/03/the_future_of_p.html
Bomb or not? Can you identify the bomb:
http://www.bombornot.com/
In related news, here's a guy who makes it through security with a live
vibrator in his pants.
http://www.zug.com/gab/index.cgi?func=view_thread&thread_id=68619
There's also a funny video on Dutch TV. A screener scans a passenger's
bag, putting aside several obvious bags of cocaine to warn him about a
very tiny nail file.
http://u1.peersphere.net/cas/controller/Luchthaven.mpg?livelinkDataID=144647
1
or http://tinyurl.com/zlqrv
Here's where to buy stuff seized at Boston's Logan Airport. I also read
somewhere that some stuff ends up on eBay.
http://www.boston.com/business/articles/2006/09/04/banned_items_find_new_hom
e_in_discount_bin/
or http://tinyurl.com/gs735
And finally, Quinn Norton said: "I think someone should try to blow up a
plane with a piece of ID, just to watch the TSA's mind implode."
http://www.ambiguous.org/archive.php3/2006/08/31#quinn2006831.1
The chairman of Hewlett-Packard, annoyed at leaks, hired investigators
to track down the phone records (including home and cell) of the other
HP board members. One board member resigned because of this. The
leaker has refused to resign, although he has been outed. Note that the
article says that the investigators used "pretexting," which is illegal.
http://www.msnbc.msn.com/id/14687677/site/newsweek/
http://riskman.typepad.com/perilocity/2006/09/does_hp_have_an.html
http://news.com.com/Leak+scandal+costs+HPs+Dunn+her+chairmans+job/2100-1014_
3-6114655.html
or http://tinyurl.com/pu286
Police lose Semtex during test. Oops. It's only eight ounces of the
stuff, but still....
http://www.boston.com/news/globe/city_region/breaking_news/2006/09/state_pol
ice_lo.html
or http://tinyurl.com/r9ak5
Digital snooping for the masses:
http://www.nytimes.com/2006/09/07/fashion/07spy.html?ex=1315281600&en=c48cca
6a35e9bd22&ei=5090&partner=rssuserland&emc=rss
or http://tinyurl.com/ovoyn
Notes from the Hash Function Workshop:
http://www.schneier.com/blog/archives/2006/09/notes_from_the.html
The "Ultimate Secure Home." Hoax or not?
http://ultimatesecurehome.com/
http://www.schneier.com/blog/archives/2006/09/ultimate_secure.html
Seems like Sudanese customs officials are seizing laptops from people
entering the country and checking the data on their hard drives. While
the stated reason is pornography, anyone bringing a computer into the
country should be concerned about personal information, writing that
might be deemed political by the Sudanese authorities, confidential
business information, and so on.
http://edition.cnn.com/2006/WORLD/africa/08/30/sudan.crackdown.reut/index.ht
ml
or http://tinyurl.com/pdag7
http://ngosecurity.blogspot.com/2006/09/incident-laptop-seizures-sudan.html
or http://tinyurl.com/lla8b
This should be a concern regardless of the border you cross. Your
privacy rights when trying to enter a country are minimal, and this kind
of thing could happen anywhere. (I have heard anecdotal stories about
Israel doing this, but don't have confirmation.) If you're bringing a
laptop across an international border, you should clean off all
unnecessary files and encrypt the rest.
Turing Bombe recreated at Bletchley Park:
http://www.telegraph.co.uk/news/main.jhtml?xml=/news/2006/09/07/nbletchley07
.xml
or http://tinyurl.com/mg6fw
Burglars foil alarm system. Clever hack I talked about in Beyond Fear:
http://www.schneier.com/blog/archives/2006/09/burglars_foil_a.html
A paper from the Cato Institute: "Doublespeak and the War on Terrorism":
http://www.cato.org/pub_display.php?pub_id=6654
Defeating a coin-op photocopy machine with a paper clip:
http://www.instructables.com/id/EW8JTRWKO9ERIE1UQD/
Article on industrial spying. Lots of hype, but interesting nonetheless:
http://news.bbc.co.uk/2/hi/technology/5313772.stm
Ed Felten and his team at Princeton have analyzed a Diebold AccuVote-TS
machine, and they have discovered all sorts of vulnerabilities. They're
able to introduce a virus that flips votes, and automatically spreads
from machine to machine. Amazing stuff. Diebold, of course, is
pretending that there's no problem.
http://itpolicy.princeton.edu/voting/
Video demonstration: http://itpolicy.princeton.edu/voting/videos.html
http://www.salon.com/opinion/feature/2006/09/13/diebold/index.html
http://arstechnica.com/news.ars/post/20060913-7735.html
http://www.msnbc.msn.com/id/14825465/
http://www.computerworld.com/blogs/node/3475
"The Onion" on airport security oversights:
http://www.theonion.com/content/node/52333
And a cryptography cartoon:
http://xkcd.com/c153.html
** *** ***** ******* *********** *************
Is There Strategic Software?
If you define "critical infrastructure" as "things essential for the
functioning of a society and economy," then software is critical
infrastructure. For many companies and individuals, if their computers
stop working, they stop working.
It's a situation that snuck up on us. Everyone knew that the software
that flies 747s or targets cruise missiles was critical, but who thought
of the airlines' weight and balance computers, or the operating system
running the databases and spreadsheets that determine which cruise
missiles get shipped where?
And over the years, common, off-the-shelf, personal- and business-grade
software has been used for more and more critical applications. Today we
find ourselves in a situation where a well-positioned flaw in Windows,
Cisco routers or Apache could seriously affect the economy.
It's perfectly rational to assume that some programmers -- a tiny
minority I'm sure -- are deliberately adding vulnerabilities and back
doors into the code they write. I'm actually kind of amazed that back
doors secretly added by the CIA/NSA, MI5, the Chinese, Mossad and others
don't conflict with each other. Even if these groups aren't infiltrating
software companies with back doors, you can be sure they're scouring
products for vulnerabilities they can exploit, if necessary. On the
other hand, we're already living in a world where dozens of new flaws
are discovered in common software products weekly, and the economy is
humming along. But we're not talking about this month's worm from Asia
or new phishing software from the Russian mafia -- we're talking
national intelligence organizations. "Infowar" is an overhyped term, but
the next war will have a cyberspace component, and these organizations
wouldn't be doing their jobs if they weren't preparing for it.
Marcus Ranum is 100 percent correct when he says it's simply too late to
do anything about it. The software industry is international, and no
country can start demanding domestic-only software and expect to get
anywhere. Nor would that actually solve the problem, which is more about
the allegiance of millions of individual programmers than which country
they happen to inhabit.
So, what to do? The key here is to remember the real problem: current
commercial software practices are not secure enough to reliably detect
and delete deliberately inserted malicious code. Once you understand
this, you'll drop the red herring arguments that led to Check Point not
being able to buy Sourcefire and concentrate on the real solution:
defense in depth.
In theory, security software is an after-the-fact kludge because the
underlying OS and apps are riddled with vulnerabilities. If your
software were written properly, you wouldn't need a firewall -- right?
If we were to get serious about critical infrastructure, we'd recognize
it's all critical and start building security software to protect it.
We'd build our security based on the principles of safe failure; we'd
assume security would fail and make sure it's OK when it does. We'd use
defense in depth and compartmentalization to minimize the effects of
failure. Basically, we'd do everything we're supposed to do now to
secure our networks.
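The safe-failure principle can be made concrete with a small sketch. The
authorization backend below is hypothetical; the point is only the
pattern: when the security check itself fails, the system denies rather
than allows.

```python
def is_authorized(user: str, resource: str) -> bool:
    """Hypothetical backend check; may raise if the auth service is down."""
    raise ConnectionError("auth service unreachable")

def allow_access(user: str, resource: str) -> bool:
    """Fail closed: any failure in the check denies access.

    The unsafe alternative -- returning True on error so that work can
    continue -- is exactly the kind of failure mode that defense in
    depth is supposed to prevent.
    """
    try:
        return is_authorized(user, resource)
    except Exception:
        return False

assert allow_access("alice", "db") is False
```

The same reasoning applies at every layer: a firewall that can't load
its ruleset should block, not pass.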
It'd be expensive, probably prohibitively so. Maybe it would be easier
to continue to ignore the problem, or at least manage geopolitics so
that no national military wants to take us down.
This is the second half of a point/counterpoint I did with Marcus Ranum
in the September 2006 issue of "Information Security Magazine." Here's
his half:
http://www.ranum.com/security/computer_security/editorials/point-counterpoint/strategic.html
or http://tinyurl.com/nmask
** *** ***** ******* *********** *************
Media Sanitization and Encryption
Last week NIST released Special Publication 800-88, "Guidelines for
Media Sanitization."
There is a new paragraph in this document (page 7) that was not in the
draft version: "Encryption is not a generally accepted means of
sanitization. The increasing power of computers decreases the time
needed to crack cipher text and therefore the inability to recover the
encrypted data can not be assured."
I have to admit that this doesn't make any sense to me. If the
encryption is done properly, and if the key is properly chosen, then
erasing the key -- and all copies -- is equivalent to erasing the files.
And if you're using full-disk encryption, then erasing the key is
equivalent to sanitizing the drive. For that not to be true means that
the encryption program isn't secure.
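The argument can be made concrete with a toy sketch. The cipher below
is a stand-in built from a SHA-256 keystream, not a real disk-encryption
product; the point is only that once every copy of the key is destroyed,
the ciphertext is all an attacker has left.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-based keystream.

    A stand-in for a real cipher such as AES; do not use in production.
    """
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(x ^ y for x, y in zip(data, stream))

# "Full-disk" contents encrypted under a random 256-bit key.
key = secrets.token_bytes(32)
plaintext = b"confidential business information"
ciphertext = keystream_xor(key, plaintext)

# While the key exists, the data is recoverable:
assert keystream_xor(key, ciphertext) == plaintext

# Crypto-erase: destroy every copy of the key. Recovering the plaintext
# now requires breaking the cipher itself.
key = None
```

If erasing the key did *not* sanitize the data, that would mean the
cipher is breakable without the key -- which is the scenario NIST's
paragraph quietly assumes.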
I think NIST is just confused.
http://csrc.nist.gov/publications/nistpubs/#sp800-88
** *** ***** ******* *********** *************
What is a Hacker?
A hacker is someone who thinks outside the box. It's someone who
discards conventional wisdom, and does something else instead. It's
someone who looks at the edge and wonders what's beyond. It's someone
who sees a set of rules and wonders what happens if you don't follow
them. A hacker is someone who experiments with the limitations of
systems for intellectual curiosity.
I wrote that last sentence in the year 2000, in my book "Beyond Fear."
And I'm sticking to that definition.
This is what else I wrote in "Beyond Fear":
"Hackers are as old as curiosity, although the term itself is modern.
Galileo was a hacker. Mme. Curie was one, too. Aristotle wasn't.
(Aristotle had some theoretical proof that women had fewer teeth than
men. A hacker would have simply counted his wife's teeth. A good hacker
would have counted his wife's teeth without her knowing about it, while
she was asleep. A good bad hacker might remove some of them, just to
prove a point.)
"When I was in college, I knew a group similar to hackers: the key
freaks. They wanted access, and their goal was to have a key to every
lock on campus. They would study lockpicking and learn new techniques,
trade maps of the steam tunnels and where they led, and exchange copies
of keys with each other. A locked door was a challenge, a personal
affront to their ability. These people weren't out to do damage --
stealing stuff wasn't their objective -- although they certainly could
have. Their hobby was the power to go anywhere they wanted to.
"Remember the phone phreaks of yesteryear, the ones who could whistle
into payphones and make free phone calls. Sure, they stole phone
service. But it wasn't like they needed to make eight-hour calls to
Manila or McMurdo. And their real work was secret knowledge: The phone
network was a vast maze of information. They wanted to know the system
better than the designers, and they wanted the ability to modify it to
their will. Understanding how the phone system worked -- that was the
true prize. Other early hackers were ham-radio hobbyists and model-train
enthusiasts.
"Richard Feynman was a hacker; read any of his books.
"Computer hackers follow these evolutionary lines. Or, they are the same
genus operating on a new system. Computers, and networks in particular,
are the new landscape to be explored. Networks provide the ultimate maze
of steam tunnels, where a new hacking technique becomes a key that can
open computer after computer. And inside is knowledge, understanding.
Access. How things work. Why things work. It's all out there, waiting to
be discovered."
Computers are the perfect playground for hackers. Computers, and
computer networks, are vast treasure troves of secret knowledge. The
Internet is an immense landscape of undiscovered information. The more
you know, the more you can do.
And it should be no surprise that many hackers have focused their skills
on computer security. Not only is it often the obstacle between the
hacker and knowledge, and therefore something to be defeated, but also
the very mindset necessary to be good at security is exactly the same
mindset that hackers have: thinking outside the box, breaking the rules,
exploring the limitations of a system. The easiest way to break a
security system is to figure out what the system's designers hadn't
thought of: that's security hacking.
Hackers cheat. And breaking security regularly involves cheating. It's
figuring out a smart card's RSA key by looking at the power
fluctuations, because the designers of the card never realized anyone
could do that. It's self-signing a piece of code, because the
signature-verification system didn't think someone might try that. It's
using a piece of a protocol to break a completely different protocol,
because all previous security analysis only looked at protocols
individually and not in pairs.
That's security hacking: breaking a system by thinking differently.
It all sounds criminal: recovering encrypted text, fooling signature
algorithms, breaking protocols. But honestly, that's just the way we
security people talk. Hacking isn't criminal. All the examples two
paragraphs above were performed by respected security professionals, and
all were presented at security conferences.
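The power-analysis example has a software cousin that's easy to
demonstrate: an early-exit comparison leaks, through its running time,
how much of a guessed secret is correct. A minimal Python sketch (the
token value is invented):

```python
import hmac
import timeit

def naive_equal(a: bytes, b: bytes) -> bool:
    """Early-exit comparison: returns as soon as a byte differs, so the
    running time leaks the length of the matching prefix."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

secret = b"S3CRET-TOKEN-VALUE!!"
wrong_early = b"X" + secret[1:]   # differs at the first byte
wrong_late = secret[:-1] + b"X"   # differs only at the last byte

# Over many trials, the late-mismatch guess takes longer to reject --
# timing as a side channel, analogous to the smart card's power draw.
t_early = timeit.timeit(lambda: naive_equal(secret, wrong_early), number=50_000)
t_late = timeit.timeit(lambda: naive_equal(secret, wrong_late), number=50_000)

# The standard defense: a comparison whose running time doesn't depend
# on where the inputs differ.
assert not hmac.compare_digest(secret, wrong_late)
```

The card's designers never modeled the attacker measuring power; the
programmer of `naive_equal` never modeled the attacker holding a
stopwatch. Same cheat, different channel.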
I remember one conversation I had at a Crypto conference, early in my
career. It was outside amongst the jumbo shrimp, chocolate-covered
strawberries, and other delectables. A bunch of us were talking about
some cryptographic system, including Brian Snow of the NSA. Someone
described an unconventional attack, one that didn't follow the normal
rules of cryptanalysis. I don't remember any of the details, but I
remember my response after hearing the description of the attack.
"That's cheating," I said.
Because it was.
I also remember Brian turning to look at me. He didn't say anything,
but his look conveyed everything. "There's no such thing as cheating in
this business."
Because there isn't.
Hacking is cheating, and it's how we get better at security. It's only
after someone invents a new attack that the rest of us can figure out
how to defend against it.
For years I have refused to play the semantic "hacker" vs. "cracker"
game. There are good hackers and bad hackers, just as there are good
electricians and bad electricians. "Hacker" is a mindset and a skill
set; what you do with it is a different issue.
And I believe the best computer security experts have the hacker
mindset. When I look to hire people, I look for someone who can't walk
into a store without figuring out how to shoplift. I look for someone
who can't test a computer security program without trying to get around
it. I look for someone who, when told that things work in a particular
way, immediately asks how things stop working if you do something else.
We need these people in security, and we need them on our side.
Criminals are always trying to figure out how to break security systems.
Field a new system -- an ATM, an online banking system, a gambling
machine -- and criminals will try to make an illegal profit off it.
They'll figure it out eventually, because some hackers are also
criminals. But if we have hackers working for us, they'll figure it out
first -- and then we can defend ourselves.
It's our only hope for security in this fast-moving technological world
of ours.
This essay appeared in the Summer 2006 issue of "2600."
** *** ***** ******* *********** *************
Counterpane News
Counterpane has a new application security assessment service:
http://www.counterpane.com/application-security-assessment.html
Schneier is speaking via teleconference at the Hack-in-the-Box
Conference, September 20, in Kuala Lumpur, Malaysia:
http://conference.hitb.org/hitbsecconf2006kl/
Schneier is speaking at the University of Southern California, September
26, in Los Angeles:
http://netzoo.net/cpd/schneier.html
Schneier is speaking at the ACLU National Capital Area President's
Committee Dinner, September 27, in Washington, DC.
Schneier is speaking at Michigan Technical University, October 2, in
Houghton, MI.
http://www.greatevents.mtu.edu/geseason/04.html
Schneier is speaking at Sandia National Laboratories, October 5, in
Livermore, CA.
Schneier is speaking at "Security Takes Off," October 9, in Malmoe, Sweden.
http://www.dfs.se/kretsar/sodra
Schneier is speaking at Information Security Solutions Europe, October
10, in Rome.
http://www.eema.org/static/isse/budapest.htm
Schneier was interviewed for Martin McKeay's security podcast.
http://www.mckeay.net/secure/2006/08/network_security_podcast_episo_35.html
or http://tinyurl.com/mxkfe
Bruce Schneier Facts:
http://geekz.co.uk/schneierfacts/
Some of these are pretty funny. And no, I had nothing to do with it.
** *** ***** ******* *********** *************
TrackMeNot
In the wake of AOL's publication of search data, and the "New York
Times" article demonstrating how easy it is to figure out who did the
searching, we have TrackMeNot:
"TrackMeNot runs in Firefox as a low-priority background process that
periodically issues randomized search-queries to popular search engines,
e.g., AOL, Yahoo!, Google, and MSN. It hides users' actual search trails
in a cloud of indistinguishable 'ghost' queries, making it difficult, if
not impossible, to aggregate such data into accurate or identifying user
profiles. TrackMeNot integrates into the Firefox 'Tools' menu and
includes a variety of user-configurable options."
Let's count the ways this doesn't work.
One, it doesn't hide your searches. If the government wants to know
who's been searching on "al Qaeda recruitment centers," it won't matter
that you've made ten thousand other searches as well -- you'll be targeted.
Two, it's too easy to spot. There are only 1,673 search terms in the
program's dictionary. Here, as a random example, are the program's "G"
words: gag, gagged, gagging, gags, gas, gaseous, gases, gassed, gasses,
gassing, gen, generate, generated, generates, generating, gens, gig,
gigs, gillion, gillions, glass, glasses, glitch, glitched, glitches,
glitching, glob, globed, globing, globs, glue, glues, gnarlier,
gnarliest, gnarly, gobble, gobbled, gobbles, gobbling, golden, goldener,
goldenest, gonk, gonked, gonking, gonks, gonzo, gopher, gophers, gorp,
gorps, gotcha, gotchas, gribble, gribbles, grind, grinding, grinds,
grok, grokked, grokking, groks, ground, grovel, groveled, groveling,
grovelled, grovelling, grovels, grue, grues, grunge, grunges, gun,
gunned, gunning, guns, guru, gurus
The program's authors claim that this list is temporary, and that there
will eventually be a TrackMeNot server with an ever-changing word list.
Of course, that list can be monitored by any analysis program -- as
could any queries to that server.
In any case, every twelve seconds -- exactly -- the program picks a
random pair of words and sends it to either AOL, Yahoo, MSN, or Google.
My guess is that your searches contain more than two words, you don't
send them out in precise twelve-second intervals, and you favor one
search engine over the others.
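As a sketch of how easy the filtering would be, here's a hypothetical
profiler pass in Python. The word list and query log are invented, but
the two-dictionary-word test mirrors the program's actual output shape
(a real analysis could also key on the rigid twelve-second interval):

```python
# Subset of the generator's fixed dictionary (hypothetical sample).
GHOST_WORDS = {"gonzo", "gopher", "glitch", "grok", "grue", "gorp",
               "warez", "whack", "gnarly", "glob", "gag", "gillion"}

def looks_generated(query: str) -> bool:
    """True if the query has the generator's shape: exactly two words,
    both drawn from its known dictionary."""
    words = query.lower().split()
    return len(words) == 2 and all(w in GHOST_WORDS for w in words)

# Hypothetical query log for one client.
log = ["gonzo gopher", "cheap flights to manila", "glitch grok",
       "al qaeda recruitment centers", "warez whack"]

# The ghost queries drop out; the real search trail remains.
real_queries = [q for q in log if not looks_generated(q)]
```

Ten lines of filtering undo the entire cover-traffic scheme.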
Three, some of the program's searches are worse than yours. The
dictionary includes: HIV, atomic, bomb, bible, bibles, bombing, bombs,
boxes, choke, choked, chokes, choking, chain, crackers, empire, evil,
erotics, erotices, fingers, knobs, kicking, harier, hamster, hairs,
legal, letterbomb, letterbombs, mailbomb, mailbombing, mailbombs, rapes,
raping, rape, raper, rapist, virgin, warez, warezes, whack, whacked,
whacker, whacking, whackers, whacks, pistols
Does anyone really think that searches on "erotic rape," "mailbombing
bibles," and "choking virgins" will make their legitimate searches less
noteworthy?
And four, it wastes a whole lot of bandwidth. A query every twelve
seconds translates into 2,400 queries a day, assuming an eight-hour
workday. A typical Google response is about 25K, so we're talking 60
megabytes of additional traffic daily. Imagine if everyone in the
company used it.
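The arithmetic, for anyone who wants to check it:

```python
# One query every 12 seconds over an 8-hour workday, with roughly a
# 25 KB response per query (the essay's estimate).
seconds_per_workday = 8 * 60 * 60
queries_per_day = seconds_per_workday // 12   # 2,400 queries
traffic_kb = queries_per_day * 25             # 60,000 KB
traffic_mb = traffic_kb / 1000                # about 60 MB per user per day
```

Multiply by a few hundred employees and it's real money in bandwidth.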
I suppose this kind of thing would stop someone who has a paper printout
of your searches and is looking through them manually, but it won't do
much against anyone who isn't lazy: it wouldn't be hard for a computer
profiling program to ignore these generated searches.
As one commentator put it: "Imagine a cop pulls you over for speeding.
As he approaches, you realize you left your wallet at home. Without your
driver's license, you could be in a lot of trouble. When he approaches,
you roll down your window and shout. "Hello Officer! I don't have
insurance on this vehicle! This car is stolen! I have weed in my
glovebox! I don't have my driver's license! I just hit an old lady
minutes ago! I've been running stop lights all morning! I have a dead
body in my trunk! This car doesn't pass the emissions tests! I'm not
allowed to drive because I am under house arrest! My gas tank runs on
the blood of children!" You stop to catch a breath, confident you have
supplied so much information to the cop that you can't possibly be
caught for not having your license now."
Yes, data mining is a signal-to-noise problem. But artificial noise
like this isn't going to help much. If I were going to improve on this
idea, I would make the plugin watch the user's search patterns. I would
make it send queries only to the search engines the user does, only when
he is actually online doing things. I would randomize the timing.
(There's a comment to that effect in the code, so presumably this will
be fixed in a later version of the program.) And I would make it
monitor the web pages the user looks at, and send queries based on
keywords it finds on those pages. And I would make it send queries in
the form the user tends to use, whether it be single words, pairs of
words, or whatever.
But honestly, I don't know that I would use it even then. The way
serious people protect their web-searching privacy is through
anonymization. Use Tor for serious web anonymization. Or Black Box
Search for simple anonymous searching (there's a Greasemonkey extension
that does that automatically). And set your browser to delete search
engine cookies regularly.
TrackMeNot:
http://mrl.nyu.edu/~dhowe/TrackMeNot/
Another commentator:
http://blog.air0day.com/2006/08/21/worst-security-tool-ever/
Other tools:
http://tor.eff.org/
http://www.blackboxsearch.com/
http://blog.nemik.net/2006/08/21/dont-leave-traces/
AOL privacy breach:
http://www.schneier.com/blog/archives/2006/08/aol_releases_ma.html
http://mrl.nyu.edu/~dhowe/TrackMeNot/NYTimes_AOL_Exposed.htm
** *** ***** ******* *********** *************
USBDumper
USBDumper is a cute little utility that silently copies the contents of
an inserted USB drive onto the PC. The idea is that you install this
piece of software on your computer, or on a public PC, and then you
collect the files -- some of them personal and confidential -- from
anyone who plugs his USB drive into that computer. (There's a similar
program that downloads a disk image, allowing someone to recover deleted
files as well.)
No big deal to anyone who worries about computer security for a living,
but probably a rude shock to salespeople, conference presenters, file
sharers, and many others who regularly plug their USB drives into
strange PCs.
http://www.secuobs.com/news/07062006-sstic_usbdumper.shtml
http://www.secuobs.com/USBDumper.rar
http://www.rfc1149.net/blog/2006/08/23/wiping-unused-space-in-a-file-system/
or http://tinyurl.com/m757a
** *** ***** ******* *********** *************
Microsoft and FairUse4WM
If you really want to see Microsoft scramble to patch a hole in its
software, don't look to vulnerabilities that impact countless Internet
Explorer users or give intruders control of thousands of Windows
machines. Just crack Redmond's DRM.
Security patches used to be rare. Software vendors were happy to pretend
that vulnerabilities in their products were illusory -- and then quietly
fix the problem in the next software release.
That changed with the full disclosure movement. Independent security
researchers started going public with the holes they found, making
vulnerabilities impossible for vendors to ignore. Then worms became more
common; patching -- and patching quickly -- became the norm.
But even now, no software vendor likes to issue patches. Every patch is
a public admission that the company made a mistake. Moreover, the
process diverts engineering resources from new development. Patches
annoy users by making them update their software, and piss them off even
more if the update doesn't work properly.
For the vendor, there's an economic balancing act: how much more will
your users be annoyed by unpatched software than they will be by the
patch, and is that reduction in annoyance worth the cost of patching?
Since 2003, Microsoft's strategy to balance these costs and benefits has
been to batch patches: instead of issuing them one at a time, it's been
issuing them all together on the second Tuesday of each month. This
decreases Microsoft's development costs and increases the reliability of
its patches.
The user pays for this strategy by remaining open to known
vulnerabilities for up to a month. On the other hand, users benefit from
a predictable schedule: Microsoft can test all the patches that are
going out at the same time, which means that patches are more reliable
and users are able to install them faster with more confidence.
In the absence of regulation, software liability, or some other
mechanism to make unpatched software costly for the vendor, Patch
Tuesday is the best users are likely to get.
Why? Because it makes near-term financial sense to Microsoft. The
company is not a public charity, and if the Internet suffers, or if
computers are compromised *en masse*, the economic impact on Microsoft
is still minimal.
Microsoft is in the business of making money, and keeping users secure
by patching its software is only incidental to that goal.
There's no better example of this principle in action than Microsoft's
behavior around the vulnerability in its digital rights management
software PlaysForSure.
In August, a hacker developed an application called FairUse4WM that
strips the copy protection from Windows Media DRM 10 and 11 files.
Now, this isn't a "vulnerability" in the normal sense of the word:
digital rights management is not a feature that users want. Being able
to remove copy protection is a good thing for some users, and completely
irrelevant for everyone else. No user is ever going to say: "Oh no. I
can now play the music I bought for my computer in my car. I must
install a patch so I can't do that anymore."
But to Microsoft, this vulnerability is a big deal. It affects the
company's relationship with major record labels. It affects the
company's product offerings. It affects the company's bottom line.
Fixing this "vulnerability" is in the company's best interest; never
mind the customer.
So Microsoft wasted no time; it issued a patch three days after learning
about the hack. There's no month-long wait for copyright holders who
rely on Microsoft's DRM.
This clearly demonstrates that economics is a much more powerful
motivator than security.
It should surprise no one that the system didn't stay patched for long.
FairUse4WM 1.2 gets around Microsoft's patch, and also circumvents the
copy protection in Windows Media DRM 9 and 11beta2 files. And four days
later, Microsoft issued another patch.
That's where things stand now. Any guess on how long it will take the
FairUse4WM people to update their software? And then how long before
Microsoft patches once again?
Certainly much less time than it will take Microsoft and the recording
industry to realize they're playing a losing game, and that trying to
make digital files uncopyable is like trying to make water not wet.
If Microsoft abandoned this Sisyphean effort and put the same
development effort into building a fast and reliable patching system,
the entire Internet would benefit. But simple economics says it probably
never will.
http://en.wikipedia.org/wiki/Microsoft_PlaysForSure
http://forum.doom9.org/showthread.php?t=114916
http://www.engadget.com/2006/08/25/fairuse4wm-strips-windows-media-drm
http://www.dailytech.com/article.aspx?newsid=3999
http://www.engadget.com/2006/08/28/microsoft-already-on-their-way-to-patching-fairuse4wm
or http://tinyurl.com/lh6rw
http://www.engadget.com/2006/09/02/fairuse4wm-peeps-stay-one-step-ahead-of-microsoft
or http://tinyurl.com/ogev6
Commentary:
http://www.businessethics.ca/blog/2006/09/microsoft-ethics-economics-of-customer.html
or http://tinyurl.com/pggqk
BSkyB halts its download service because of the DRM breaks.
http://www.washingtonpost.com/wp-dyn/content/article/2006/09/12/AR2006091200837.html
or http://tinyurl.com/hdd3p
A version of this essay originally appeared on Wired.com.
http://www.wired.com/news/columns/0,71738-0.html
** *** ***** ******* *********** *************
Comments from Readers
There are hundreds of comments -- many of them interesting -- on these
topics on my blog. Search for the story you want to comment on, and join
in.
http://www.schneier.com/blog
** *** ***** ******* *********** *************
CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses,
insights, and commentaries on security: computer and otherwise. You can
subscribe, unsubscribe, or change your address on the Web at
<http://www.schneier.com/crypto-gram.html>. Back issues are also
available at that URL.
Comments on CRYPTO-GRAM should be sent to [log in to unmask]
Permission to print comments is assumed unless otherwise stated.
Comments may be edited for length and clarity.
Please feel free to forward CRYPTO-GRAM, in whole or in part, to
colleagues and friends who will find it valuable. Permission is also
granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the
best sellers "Beyond Fear," "Secrets and Lies," and "Applied
Cryptography," and an inventor of the Blowfish and Twofish algorithms.
He is founder and CTO of Counterpane Internet Security Inc., and is a
member of the Advisory Board of the Electronic Privacy Information
Center (EPIC). He is a frequent writer and lecturer on security topics.
See <http://www.schneier.com>.
Counterpane is the world's leading protector of networked information -
the inventor of outsourced security monitoring and the foremost
authority on effective mitigation of emerging IT threats. Counterpane
protects networks for Fortune 1000 companies and governments world-wide.
See <http://www.counterpane.com>.
Crypto-Gram is a personal newsletter. Opinions expressed are not
necessarily those of Counterpane Internet Security, Inc.
Copyright (c) 2006 by Bruce Schneier.