JiscMail
Email discussion lists for the UK Education and Research communities

CYBER-SOCIETY-LIVE Archives

CYBER-SOCIETY-LIVE@JISCMAIL.AC.UK



Subject: [CSL]: CRYPTO-GRAM, March 15, 2006
From: J Armitage <[log in to unmask]>
Reply-To: Interdisciplinary academic study of Cyber Society <[log in to unmask]>
Date: Thu, 16 Mar 2006 08:19:21 -0000
Content-Type: text/plain
Parts/Attachments: text/plain (1190 lines)

-----Original Message-----
From: Bruce Schneier [mailto:[log in to unmask]] 
Sent: 15 March 2006 09:13
To: [log in to unmask]
Subject: CRYPTO-GRAM, March 15, 2006

                  CRYPTO-GRAM

                March 15, 2006

               by Bruce Schneier
                Founder and CTO
       Counterpane Internet Security, Inc.
            [log in to unmask]
            <http://www.schneier.com>
           <http://www.counterpane.com>


A free monthly newsletter providing summaries, analyses, insights, and 
commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit 
<http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at 
<http://www.schneier.com/crypto-gram-0603.html>.  These same essays 
appear in the "Schneier on Security" blog: 
<http://www.schneier.com/blog>.  An RSS feed is available.


** *** ***** ******* *********** *************

In this issue:
      The Future of Privacy
      Face Recognition Comes to Bars
      Security, Economics, and Lost Conference Badges
      Crypto-Gram Reprints
      Data Mining for Terrorists
      Airport Security Failure
      News
      Police Department Privilege Escalation
      Database Error Causes Unbalanced Budget
      Credit Card Companies and Agenda
      Counterpane News
      Proof that Employees Don't Care About Security
      U.S. Port Security and Proxies
      Comments from Readers


** *** ***** ******* *********** *************

      The Future of Privacy



Over the past 20 years, there's been a sea change in the battle for 
personal privacy.

The pervasiveness of computers has resulted in the almost constant 
surveillance of everyone, with profound implications for our society 
and our freedoms. Corporations and the police are both using this new 
trove of surveillance data. We as a society need to understand the 
technological trends and discuss their implications. If we ignore the 
problem and leave it to the "market," we'll all find that we have 
almost no privacy left.

Most people think of surveillance in terms of police procedure: Follow 
that car, watch that person, listen in on his phone conversations. This 
kind of surveillance still occurs. But today's surveillance is more 
like the NSA's model, recently turned against Americans: Eavesdrop on 
every phone call, listening for certain keywords. It's still 
surveillance, but it's wholesale surveillance.

Wholesale surveillance is a whole new world. It's not "follow that 
car," it's "follow every car." The National Security Agency can 
eavesdrop on every phone call, looking for patterns of communication or 
keywords that might indicate a conversation between terrorists. Many 
airports collect the license plates of every car in their parking lots, 
and can use that database to locate suspicious or abandoned cars. 
Several cities have stationary or car-mounted license-plate scanners 
that keep records of every car that passes, and save that data for 
later analysis.

More and more, we leave a trail of electronic footprints as we go 
through our daily lives. We used to walk into a bookstore, browse, and 
buy a book with cash. Now we visit Amazon, and all of our browsing and 
purchases are recorded. We used to throw a quarter in a toll booth; now 
EZ Pass records the date and time our car passed through the booth. 
Data about us are collected when we make a phone call, send an e-mail 
message, make a purchase with our credit card, or visit a website.

Much has been written about RFID chips and how they can be used to 
track people. People can also be tracked by their cell phones, their 
Bluetooth devices, and their WiFi-enabled computers. In some cities, 
video cameras capture our image hundreds of times a day.

The common thread here is computers. Computers are involved more and 
more in our transactions, and data are byproducts of these 
transactions. As computer memory becomes cheaper, more and more of 
these electronic footprints are being saved. And as processing becomes 
cheaper, more and more of it is being cross-indexed and correlated, and 
then used for secondary purposes.

Information about us has value. It has value to the police, but it also 
has value to corporations. The Justice Department wants details of 
Google searches, so they can look for patterns that might help find 
child pornographers. Google uses that same data so it can deliver 
context-sensitive advertising messages. The city of Baltimore uses 
aerial photography to surveil every house, looking for building permit 
violations. A national lawn-care company uses the same data to better 
market its services. The phone company keeps detailed call records for 
billing purposes; the police use them to catch bad guys.

In the dot-com bust, the customer database was often the only salable 
asset a company had. Companies like Experian and Acxiom are in the 
business of buying and reselling this sort of data, and their customers 
are both corporate and government.

Computers are getting smaller and cheaper every year, and these trends 
will continue. Here's just one example of the digital footprints we leave:

It would take about 100 megabytes of storage to record everything the 
fastest typist inputs to his computer in a year. That's a single flash 
memory chip today, and one could imagine computer manufacturers 
offering this as a reliability feature. Recording everything the 
average user does on the Internet requires more memory: 4 to 8 
gigabytes a year. That's a lot, but "record everything" is GMail's 
model, and it's probably only a few years before ISPs offer this service.

The typical person uses 500 cell phone minutes a month; that translates 
to 5 gigabytes a year to save it all. My iPod can store 12 times that 
data. A "life recorder" you can wear on your lapel that constantly 
records is still a few generations off: 200 gigabytes/year for audio 
and 700 gigabytes/year for video. It'll be sold as a security device, 
so that no one can attack you without being recorded. When that 
happens, will not wearing a life recorder be used as evidence that 
someone is up to no good, just as prosecutors today use the fact that 
someone left his cell phone at home as evidence that he didn't want to 
be tracked?
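
As a rough check on those figures, here is a back-of-the-envelope sketch 
in Python. The rates and usage assumptions (characters per second, 
uncompressed telephone audio) are mine, chosen to show how the essay's 
numbers fall out; they are illustrations, not measurements.

# Back-of-the-envelope storage estimates; all rates are assumptions.
KB, MB, GB = 1e3, 1e6, 1e9

# Keystrokes: a very fast typist sustaining ~14 characters/second,
# 8 hours a day, 250 days a year, 1 byte per character.
typing = 14 * 8 * 3600 * 250
print(f"typing: {typing / MB:.0f} MB/year")        # ~100 MB/year

# Phone: 500 minutes/month at ~16 KB/s (uncompressed telephone audio).
phone = 500 * 12 * 60 * 16 * KB
print(f"phone: {phone / GB:.1f} GB/year")          # ~5.8 GB/year

# The "life recorder" figures imply these continuous recording rates:
seconds_per_year = 365 * 24 * 3600
print(f"audio: {200 * GB / seconds_per_year / KB:.0f} KB/s")   # ~6 KB/s
print(f"video: {700 * GB / seconds_per_year / KB:.0f} KB/s")   # ~22 KB/s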

In a sense, we're living in a unique time in history. Identification 
checks are common, but they still require us to whip out our ID. Soon 
it'll happen automatically, either through an RFID chip in our wallet 
or face-recognition from cameras. And those cameras, now visible, will 
shrink to the point where we won't even see them.

We're never going to stop the march of technology, but we can enact 
legislation to protect our privacy: comprehensive laws regulating what 
can be done with personal information about us, and more privacy 
protection from the police. Today, personal information about you is 
not yours; it's owned by the collector. There are laws protecting 
specific pieces of personal data -- videotape rental records, health 
care information -- but nothing like the broad privacy protection laws 
you find in European countries. That's really the only solution; 
leaving the market to sort this out will result in even more invasive 
wholesale surveillance.

Most of us are happy to give out personal information in exchange for 
specific services. What we object to is the surreptitious collection of 
personal information, and the secondary use of information once it's 
collected: the buying and selling of our information behind our back.

In some ways, this tidal wave of data is the pollution problem of the 
information age. All information processes produce it. If we ignore the 
problem, it will stay around forever. And the only way to successfully 
deal with it is to pass laws regulating its generation, use and 
eventual disposal.

This essay was originally published in the Minneapolis Star-Tribune.
<http://www.startribune.com/562/story/284023.html>


** *** ***** ******* *********** *************

      Face Recognition Comes to Bars



BioBouncer is a face recognition system intended for bars:

"Its camera snaps customers entering clubs and bars, and facial 
recognition software compares them with stored images of previously 
identified troublemakers. The technology alerts club security to image 
matches, while innocent images are automatically flushed at the end of 
each night, Dussich said. Various clubs can share databases through a 
virtual private network, so belligerent drunks might find themselves 
unwelcome in all their neighborhood bars."

Anyone want to guess how long that "automatically flushed at the end of 
each night" will last?  This data has enormous value.  Insurance 
companies will want to know if someone was in a bar before a car 
accident.  Employers will want to know if their employees were drinking 
before work -- think airplane pilots.  Private investigators will want 
to know who walked into a bar with whom.  The police will want to know 
all sorts of things.  Lots of people will want this data -- and they'll 
all be willing to pay for it.

And the data will be owned by the bars that collect it.  They can 
choose to erase it, or they can choose to sell it to data aggregators 
like Acxiom.

It's rarely the initial application that's the problem.  It's the 
follow-on applications.  It's the function creep.  Before you know it, 
everyone will know that they are identified the moment they walk into a 
commercial building.  We will all lose privacy, and liberty, and 
freedom as a result.

<http://www.wired.com/news/technology/1,70265-0.html>


** *** ***** ******* *********** *************

      Security, Economics, and Lost Conference Badges



Conference badges are interesting security tokens.  They can be very 
valuable -- a full conference registration at the RSA Conference last 
month in San Jose, for example, cost $1,985 -- but their value decays 
rapidly with time.  By the end of the conference, they are worthless.

Counterfeiting badges is one security concern, but an even bigger 
concern is people losing their badge or having their badge 
stolen.  It's way cheaper to find or steal someone else's badge than it 
is to buy your own.  People could do this sort of thing on purpose, 
pretending to lose their badge and giving it to someone else.

A few years ago, the RSA Conference charged people $100 for a 
replacement badge, which is far cheaper than a second registration.  So 
the fraud remained.  (At least, I assume it did.  I don't know anything 
about how prevalent this kind of fraud was at RSA.)

Last year, the RSA Conference tried to further limit these types of 
fraud by putting people's photographs on their badges.  Clever idea, 
but difficult to implement.

For this to work, though, guards need to match photographs with 
faces.  This means that either 1) you need a lot more guards at 
entrance points, or 2) the lines will move a lot slower.  Actually, far 
more likely is 3) no one will check the photographs.

And it was an expensive solution for the RSA Conference.  They needed 
the equipment to put the photos on the badges.  Registration was much 
slower.  And pro-privacy people objected to the conference keeping 
their photographs on file.

This year, the RSA Conference solved the problem through 
economics:  "If you lose your badge and/or badge holder, you will be 
required to purchase a new one for a fee of $1,895.00."

Look how clever this is.  Instead of trying to solve this particular 
badge fraud problem through security, they simply moved the problem 
from the conference to the attendee.  The badges still have that $1,895 
value, but now if it's stolen and used by someone else, it's the 
attendee who's out the money.  As far as the RSA Conference is 
concerned, the security risk is an externality.

Note that from an outside perspective, this isn't the most efficient 
way to deal with the security problem.  It's likely that the cost to 
the RSA Conference for centralized security is less than the aggregate 
cost of all the individual security measures.  But the RSA Conference 
gets to make the trade-off, so they chose a solution that was cheaper 
for them.

Of course, it would have been nice if the conference provided a 
slightly more secure attachment point for the badge holder than a thin 
strip of plastic.  But why should they?  It's not their problem anymore.


** *** ***** ******* *********** *************

      Crypto-Gram Reprints



Crypto-Gram is currently in its ninth year of publication.  Back issues 
cover a variety of security-related topics, and can all be found on 
<http://www.schneier.com/crypto-gram-back.html>.  These are a selection 
of articles that appeared in this calendar month in other years.

SHA-1 Broken:
<http://www.schneier.com/crypto-gram-0503.html#1>

The Failure of Two-Factor Authentication:
<http://www.schneier.com/crypto-gram-0503.html#2>

Sensitive Security Information (SSI):
<http://www.schneier.com/crypto-gram-0503.html#11>

"I am not a Terrorist" Cards:
<http://www.schneier.com/crypto-gram-0403.html#10>

The Security Risks of Centralization:
<http://www.schneier.com/crypto-gram-0403.html#11>

Practical Cryptography:
<http://www.schneier.com/crypto-gram-0303.html#1>

SSL flaw:
<http://www.schneier.com/crypto-gram-0303.html#3>

SSL patent infringement:
<http://www.schneier.com/crypto-gram-0303.html#8>

SNMP vulnerabilities:
<http://www.schneier.com/crypto-gram-0203.html#1>

Bernstein's factoring breakthrough?
<http://www.schneier.com/crypto-gram-0203.html#6>

Richard Clarke on 9/11's Lessons:
<http://www.schneier.com/crypto-gram-0203.html#7>

Security patch treadmill:
<http://www.schneier.com/crypto-gram-0103.html#1>

Insurance and the future of network security:
<http://www.schneier.com/crypto-gram-0103.html#3>

The "death" of IDSs:
<http://www.schneier.com/crypto-gram-0103.html#9>

802.11 security:
<http://www.schneier.com/crypto-gram-0103.html#10>

Software complexity and security:
<http://www.schneier.com/crypto-gram-0003.html#SoftwareComplexityandSecurity>

Why the worst cryptography is in systems that pass initial cryptanalysis:
<http://www.schneier.com/crypto-gram-9903.html#initial>


** *** ***** ******* *********** *************

      Data Mining for Terrorists



In the post 9/11 world, there's much focus on connecting the dots. Many 
believe that data mining is the crystal ball that will enable us to 
uncover future terrorist plots. But even in the most wildly optimistic 
projections, data mining isn't tenable for that purpose. We're not 
trading privacy for security; we're giving up privacy and getting no 
security in return.

Most people first learned about data mining in November 2002, when news 
broke about a massive government data mining program called Total 
Information Awareness. The basic idea was as audacious as it was 
repellent: suck up as much data as possible about everyone, sift 
through it with massive computers, and investigate patterns that might 
indicate terrorist plots. Americans across the political spectrum 
denounced the program, and in September 2003, Congress eliminated its 
funding and closed its offices.

But TIA didn't die. According to "The National Journal," it just 
changed its name and moved inside the Defense Department.

This shouldn't be a surprise. In May 2004, the General Accounting 
Office published a report that listed 122 different federal government 
data mining programs that used people's personal information.  This 
list didn't include classified programs, like the NSA's eavesdropping 
effort, or state-run programs like MATRIX.

The promise of data mining is compelling, and convinces many. But it's 
wrong. We're not going to find terrorist plots through systems like 
this, and we're going to waste valuable resources chasing down false 
alarms. To understand why, we have to look at the economics of the system.

Security is always a trade-off, and for a system to be worthwhile, the 
advantages have to be greater than the disadvantages. A national 
security data mining program is going to find some percentage of real 
attacks, and some percentage of false alarms. If the benefits of 
finding and stopping those attacks outweigh the cost -- in money, 
liberties, etc. -- then the system is a good one. If not, then you'd be 
better off spending that cost elsewhere.

Data mining works best when there's a well-defined profile you're 
searching for, a reasonable number of attacks per year, and a low cost 
of false alarms. Credit card fraud is one of data mining's success 
stories: all credit card companies data mine their transaction 
databases, looking for spending patterns that indicate a stolen card. 
Many credit card thieves share a pattern -- purchase expensive luxury 
goods, purchase things that can be easily fenced, etc. -- and data 
mining systems can minimize the losses in many cases by shutting down 
the card. In addition, the cost of false alarms is only a phone call to 
the cardholder asking him to verify a couple of purchases. The 
cardholders don't even resent these phone calls -- as long as they're 
infrequent -- so the cost is just a few minutes of operator time.
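
To make "a well-defined profile" concrete, here is a minimal sketch of 
the kind of rule-based scoring involved. The rules, thresholds, and 
field names are invented for illustration; they are not any card 
network's actual criteria.

# Toy fraud profile: score a transaction against known theft patterns.
# All rules and thresholds below are illustrative only.
THEFT_PATTERNS = {"luxury_goods", "easily_fenced", "gift_cards"}

def fraud_score(txn: dict) -> int:
    score = 0
    if txn["category"] in THEFT_PATTERNS:
        score += 1                        # matches a known spending pattern
    if txn["amount"] > 2000:
        score += 1                        # unusually large purchase
    if txn["country"] != txn["home_country"]:
        score += 1                        # far from the cardholder's home
    return score

txn = {"category": "luxury_goods", "amount": 3500,
       "country": "RO", "home_country": "US"}
if fraud_score(txn) >= 2:
    print("hold the card and phone the cardholder")   # cheap false alarm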

Terrorist plots are different. There is no well-defined profile, and 
attacks are very rare. Taken together, these facts mean that data 
mining systems won't uncover any terrorist plots until they are very 
accurate, and that even very accurate systems will be so flooded with 
false alarms that they will be useless.

All data mining systems fail in two different ways: false positives and 
false negatives. A false positive is when the system identifies a 
terrorist plot that really isn't one. A false negative is when the 
system misses an actual terrorist plot. Depending on how you "tune" 
your detection algorithms, you can err on one side or the other: you 
can increase the number of false positives to ensure that you are less 
likely to miss an actual terrorist plot, or you can reduce the number 
of false positives at the expense of missing terrorist plots.
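
That tuning knob is easy to see in a toy detector: a single threshold on 
a "suspicion score," where lowering the threshold misses fewer plots but 
flags more innocents. The score distributions below are synthetic, 
purely to illustrate the trade-off.

import random
random.seed(0)

# Synthetic suspicion scores: innocents cluster low, plotters high,
# but the two distributions overlap, which is the whole problem.
innocents = [random.gauss(0.3, 0.15) for _ in range(100_000)]
plotters = [random.gauss(0.7, 0.15) for _ in range(10)]

for threshold in (0.5, 0.6, 0.7):
    fp = sum(s >= threshold for s in innocents)   # false positives
    fn = sum(s < threshold for s in plotters)     # false negatives
    print(f"threshold {threshold}: {fp:>5} false alarms, {fn} plots missed")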

To reduce both those numbers, you need a well-defined profile. And 
that's a problem when it comes to terrorism. In hindsight, it was 
really easy to connect the 9/11 dots and point to the warning signs, 
but it's much harder before the fact. Certainly, there are common 
warning signs that many terrorist plots share, but each is unique, as 
well. The better you can define what you're looking for, the better 
your results will be. Data mining for terrorist plots is going to be 
sloppy, and it's going to be hard to find anything useful.

Data mining is like searching for a needle in a haystack. There are 900 
million credit cards in circulation in the United States.  According to 
the FTC September 2003 Identity Theft Survey Report, about 1% (10 
million) of them are stolen and fraudulently used each year. Terrorism is 
different. There are trillions of connections between people and events 
-- things that the data mining system will have to "look at" -- and 
very few plots. This rarity makes even accurate identification systems 
useless.

Let's look at some numbers. We'll be optimistic. We'll assume the 
system has a 1 in 100 false positive rate (99% accurate), and a 1 in 
1,000 false negative rate (99.9% accurate).

Assume one trillion possible indicators to sift through: that's about 
ten events -- e-mails, phone calls, purchases, web surfings, whatever 
-- per person in the U.S. per day. Also assume that 10 of them are 
actually terrorists plotting.

This unrealistically accurate system will generate one billion false 
alarms for every real terrorist plot it uncovers. Every day of every 
year, the police will have to investigate 27 million potential plots in 
order to find the one real terrorist plot per month. Raise that 
false-positive accuracy to an absurd 99.9999% and you're still chasing 
2,750 false alarms per day -- but that will inevitably raise your false 
negatives, and you're going to miss some of those ten real plots.
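
Redoing that arithmetic directly (a sketch; the rates and counts are the 
essay's own assumptions):

events = 1_000_000_000_000    # one trillion indicators per year
plots = 10                    # real plots hidden among them
fp_rate = 1 / 100             # 99% accurate on innocent events
fn_rate = 1 / 1_000           # 99.9% accurate on real plots

false_alarms = (events - plots) * fp_rate     # ~10 billion per year
plots_found = plots * (1 - fn_rate)           # ~10, nearly all of them
print(f"{false_alarms / plots_found:,.0f} false alarms per plot found")
print(f"{false_alarms / 365:,.0f} potential plots to investigate per day")
print(f"{events * 1e-6 / 365:,.0f} per day even at 99.9999% accuracy")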

This isn't anything new.  In statistics, it's called the "base rate 
fallacy" and it applies in other domains as well. For example, even 
highly accurate medical tests are useless as diagnostic tools if the 
incidence of the disease is rare in the general population. Terrorist 
attacks are also rare, so any "test" is going to result in an endless 
stream of false alarms.
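
The medical version of the fallacy takes one line of Bayes' theorem. 
With an invented prevalence of 1 in 10,000 and a test that is 99% 
accurate in both directions, a positive result still means the patient 
is almost certainly healthy:

prevalence = 1 / 10_000
sensitivity = 0.99            # P(positive | sick)
specificity = 0.99            # P(negative | healthy)

# Bayes' theorem: P(sick | positive)
p_positive = (sensitivity * prevalence
              + (1 - specificity) * (1 - prevalence))
posterior = sensitivity * prevalence / p_positive
print(f"P(sick | positive) = {posterior:.1%}")   # about 1%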

This is exactly the sort of thing we saw with the NSA's eavesdropping 
program: the "New York Times" reported that the computers spat out 
thousands of tips per month. Every one of them turned out to be a false 
alarm.

And the cost was enormous: not just the cost of the FBI agents running 
around chasing dead-end leads instead of doing things that might 
actually make us safer, but also the cost in civil liberties. The 
fundamental freedoms that make our country the envy of the world are 
valuable, and not something that we should throw away lightly.

Data mining can work. It helps Visa keep the costs of fraud down, just 
as it helps Amazon.com show me books that I might want to buy, and 
Google show me advertising I'm more likely to be interested in. But 
these are all instances where the cost of false positives is low -- a 
phone call from a Visa operator, or an uninteresting ad -- and in 
systems that have value even if there is a high number of false negatives.

Finding terrorism plots is not a problem that lends itself to data 
mining. It's a needle-in-a-haystack problem, and throwing more hay on 
the pile doesn't make that problem any easier. We'd be far better off 
putting people in charge of investigating potential plots and letting 
them direct the computers, instead of putting the computers in charge 
and letting them decide who should be investigated.

This essay originally appeared on Wired.com.
<http://www.wired.com/news/columns/0,70357-0.html>

TIA:
<http://www.epic.org/privacy/profiling/tia/>
<http://www.fas.org/sgp/congress/2003/tia.html>

Its return:
<http://nationaljournal.com/about/njweekly/stories/2006/0223nj1.htm>

GAO report:
<http://www.epic.org/privacy/profiling/gao_dm_rpt.pdf>

MATRIX:
<http://www.aclu.org/privacy/spying/15701res20050308.html>

Base rate fallacy:
<http://www.cia.gov/csi/books/19104/art15.html#ft145>

New York Times on the NSA eavesdropping program:
<http://www.schneier.com/blog/archives/2006/01/post_1.html>


** *** ***** ******* *********** *************

      Airport Security Failure



At LaGuardia, a man successfully walked through the metal detector, but 
screeners wanted to check his shoes.  (Some reports say his shoes set 
off an alarm.)  But he didn't wait, and disappeared into the crowd.

The entire Delta Airlines terminal had to be evacuated, and between 
2,500 and 3,000 people had to be rescreened.  I'm sure the resultant 
flight delays rippled through the entire system.

Security systems can fail in two ways.  They can fail to defend against 
an attack.  And they can fail when there is no attack to defend 
against.  The 
latter failure is often more important, because false alarms are more 
common than real attacks.

Aside from the obvious security failure -- how did this person manage 
to disappear into the crowd, anyway? -- it's painfully obvious that the 
overall security system did not fail well.  Well-designed security 
systems fail gracefully, without affecting the entire airport 
terminal.  That the only thing the TSA could do after the failure was 
evacuate the entire terminal and rescreen everyone is a testament to 
how badly designed the security system is.

<http://www.newsday.com/news/printedition/newyork/nyc-nydelt114658156mar11,0,7784010.story> or <http://tinyurl.com/jtzuo>
<http://news.bbc.co.uk/2/hi/americas/4795534.stm>


** *** ***** ******* *********** *************

      News



"Lessons from the Sony CD DRM Episode" is an interesting paper by J. 
Alex Halderman and Edward W. Felten.
<http://itpolicy.princeton.edu/pub/sonydrm-ext.pdf>

This is a great example of a movie-plot threat: terrorists hijacking 
school buses and using them to blow up things.
<http://www.schneier.com/blog/archives/2006/02/school_bus_driv.html>

A court has ruled that companies do not have to encrypt data under 
Gramm-Leach-Bliley.  I know nothing of the legal merits of the case, 
nor do I have an opinion about whether Gramm-Leach-Bliley does or does 
not require financial companies to encrypt personal data in its 
purview.  But I do know that we as a society need to force companies to 
encrypt personal data about us.  Companies won't do it on their own -- 
the market just doesn't encourage this behavior -- so legislation or 
liability are the only available mechanisms.  If this law doesn't do 
it, we need another one.
<http://writ.news.findlaw.com/commentary/20060220_sinrod.html>
<http://www.securityfocus.com/columnists/387>

I find this phishing attack impressive for several reasons.  One, it's 
a very sophisticated attack and demonstrates how clever identity 
thieves are becoming.  Two, it narrowly targets a particular credit 
union, and sneakily uses the fact that credit cards issued by an 
institution share the same initial digits.  Three, it exploits an 
authentication problem with SSL certificates.  And four, it is yet 
another proof point that "user education" isn't how we're going to 
solve this kind of risk.
<http://blog.washingtonpost.com/securityfix/2006/02/the_new_face_of_phishing_1.html> or <http://tinyurl.com/773mg>
<http://isc.sans.org/diary.php?storyid=1118>
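
The "same initial digits" detail is the issuer identification number: 
the first six digits of a card number identify the issuing institution, 
so a phisher who knows one credit union's prefix can "confirm" digits he 
already knows and sound legitimate. A trivial sketch, with a made-up 
prefix:

# The first six digits of a card number (the IIN/BIN) identify the
# issuer.  "123456" is a made-up prefix used only for illustration.
CREDIT_UNION_IIN = "123456"

def issued_by_target(card_number: str) -> bool:
    return card_number.replace(" ", "").startswith(CREDIT_UNION_IIN)

print(issued_by_target("1234 5678 9012 3456"))   # True: this issuer's card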

Patrick Smith, a former pilot, writes about his experiences -- 
involving the police -- taking pictures in airports.
<http://www.salon.com/tech/col/smith/2006/02/10/askthepilot173/index1.html> or <http://tinyurl.com/gxuaw>

More on port security (funny):
<http://www.defectiveyeti.com/archives/001599.html>

The Houston chief of police wants to put surveillance cameras in 
apartment complexes, downtown streets, shopping malls and even private 
homes to fight crime during a shortage of police officers.  He said: "I 
know a lot of people are concerned about Big Brother, but my response 
to that is, if you are not doing anything wrong, why should you worry 
about it?"  One of the problems we have in the privacy community is 
that we don't have a crisp answer to that question.  I asked for 
suggestions on my blog, and there were some really good responses.
<http://www.dallasnews.com/sharedcontent/APStories/stories/D8FPQU300.html> or <http://tinyurl.com/z7shz>
<http://www.schneier.com/blog/archives/2006/02/police_cameras.html>

Here's how to make your own hardware key logger for PS/2 keyboards.
<http://www.makezine.com/blog/archive/2006/02/diy_hardware_keylogger.html> or <http://tinyurl.com/h3bp8>
Here's where to buy one:
<http://www.keykatcher.com/>
<http://www.lakeshoretechnology.com/KeyPhantom.asp>
<http://www.keyghost.com/kgpro.htm>
<http://www.keytrapper.com/>
<http://www.spectorsoft.com/>

The M4 project is trying to break three original Enigma messages left 
over from World War II.
<http://www.bytereef.org/m4_project.html>
<http://news.bbc.co.uk/2/hi/technology/4763854.stm>

Something like 50 million pounds was stolen from a banknote storage 
depot in the UK.  The Times writes: "Large-scale cash robbery was once 
a technical challenge: drilling through walls, short-circuiting alarms, 
gagging guards and stationing the get-away car. Today, the weak points 
in the banks' defences are not grilles and vaults, but human beings. 
Stealing money is now partly a matter of psychology. The success of the 
Tonbridge robbers depended on terrifying Mr Dixon into opening the 
doors. They had studied their victim. They knew the route he took home, 
and how he would respond when his wife and child were in mortal danger. 
It did not take gelignite to blow open the vaults; it took fear, in the 
hostage technique known as 'tiger kidnapping,' so called because of the 
predatory stalking that precedes it. Tiger kidnapping is the point 
where old-fashioned crime meets modern terrorism."
<http://www.timesonline.co.uk/newspaper/0,,175-2057507,00.html>
A good chronology of events:
<http://news.bbc.co.uk/1/hi/england/kent/4742972.stm>

DNA surveillance in the UK:
<http://www.schneier.com/blog/archives/2006/02/dna_surveillanc.html>

Quantum computing just got more bizarre: you don't even have to turn 
the computer on to get a result.  So now, even turning the machine off 
won't necessarily prevent hackers from stealing passwords.
<http://www.newscientist.com/channel/info-tech/mg18925405.700.html>
<http://cosmicvariance.com/2006/02/28/paul-kwiat-on-quantum-computation> or <http://tinyurl.com/g87yo>

Last month I wrote about a wiretapping scandal in Greece.  More details 
are emerging.  It turns out that the "malicious code" was actually code 
designed into the system.  It's eavesdropping code put into the system 
for the police.  The attackers managed to bypass the authorization 
mechanisms of the eavesdropping system, and activate the "lawful 
interception" module in the mobile network. They then redirected about 
100 numbers to 14 shadow numbers they controlled.  There is an 
important security lesson here.  I have long argued that when you build 
surveillance mechanisms into communication systems, you invite the bad 
guys to use those mechanisms for their own purposes.  That's exactly 
what happened here.
<http://www.schneier.com/blog/archives/2006/02/phone_tapping_i.html>
<http://homes.esat.kuleuven.be/~gdanezis/intercept.html>
<http://www.quintessenz.org/cgi-bin/index?id=000100002344>
<http://betabug.ch/blogs/ch-athens/312>

Jury duty identity-theft scam:
<http://www.snopes.com/crime/fraud/juryduty.asp>

The FedEx Kinko's ExpressPay stored value card has been 
hacked.  There's nothing particularly amazing about the hack; the most 
remarkable thing is how badly the system was designed in the first 
place.  The only security on the cards is a three-byte code that lets 
you read and write to the card.  I'd be amazed if no one has hacked 
this before.
<http://www.mal-aware.org/2006/02/28/fedex-kinkos-smart-cards-hacked/>
<http://www.eweek.com/article2/0,1759,1932824,00.asp?kc=EWRSS03119TX1K0000594> or <http://tinyurl.com/zf58a>

Nothing too surprising in this study of password generation practices:
<http://psychology.wichita.edu/surl/usabilitynews/81/Passwords.htm>

Caller ID spoofing: harmful pranks, invasions of privacy, and fraud.
<http://www.startribune.com/484/story/278518.html>
<http://www.wired.com/news/technology/0,70320-0.html>

AT&T's 1.9-trillion-call database:
<http://www.schneier.com/blog/archives/2006/03/atts_19trillion.html>

This story shows how badly terrorist profiling can go wrong: a couple 
is investigated for paying down their credit card balance.  The article 
goes on to blame something called the Bank Privacy Act, but that's not 
correct.  The culprit here is the amendments made to the Bank Secrecy 
Act by the USA Patriot Act, Sections 351 and 352.  Remember, all the 
time spent chasing down silly false alarms is time wasted.  Finding 
terrorist plots is a signal-to-noise problem, and stuff like this 
substantially decreases that ratio: it adds a lot of noise without 
adding enough signal.  It makes us less safe, because it makes 
terrorist plots harder to find.
<http://www.shns.com/shns/g_index2.cfm?action=detail&pk=RAISEALARM-02-28-06> or <http://tinyurl.com/pju6w>
The law:
<http://www.epic.org/privacy/terrorism/hr3162.html>
<http://www.cyberlaw.com/aml.html>
<http://www.fincen.gov/352ccards.pdf>

There seems to be some massive class break against Citibank ATM cards 
in Canada, the UK, and Russia.  I haven't waded through all the 
details, but here are a bunch of links:
<http://www.schneier.com/blog/archives/2006/03/more_on_the_atm.html>
<http://ioerror.livejournal.com/301520.html>
<http://www.boingboing.net/2006/03/05/citibank_under_fraud.html>
<http://www.consumerist.com/consumer/citibank/massive-citibank-fraud-alert-update-158565.php> or <http://tinyurl.com/gs9y2>
<http://www.liquidmatrix.org/blog/2006/03/05/citibank-under-fraud-attack/> or <http://tinyurl.com/eey3o>
<http://www.boingboing.net/2006/03/06/citibank_live_richly.html>
<http://www.securityfocus.com/brief/157>
<http://www.msnbc.msn.com/id/11714119/>
<http://software.silicon.com/security/0,39024655,39157043,00.htm>

Using social engineering to crash the Oscars:
<http://www.cnn.com/2006/SHOWBIZ/Movies/03/04/oscars.crashers.reut/index.html> or <http://tinyurl.com/k65ct>

Fighting misuse of the Patriot Act:
<http://www.suntimes.com/output/steyn/cst-edt-steyn051.html>

Essay on the "analog hole," the human dimension of the problem of 
securing information.
<http://www.infoworld.com/article/06/03/01/75874_10OPstrategic_1.html>

Along the same lines, here's a story about the security risks of 
talking loudly:
<http://networks.silicon.com/mobile/0,39024665,39156987,00.htm>

Criminals are breaking into stores and pretending to ransack them, as a 
cover for installing ATM skimming hardware, complete with a 
transmitter.  Note the last paragraph of the story -- it's in Danish, 
sorry -- where the company admits that this is the fourth attempt they 
know of criminals installing reader equipment inside ATM terminals for 
the purpose of skimming numbers and PINs.
<http://www.dr.dk/Nyheder/Indland/kriminalitet/2006/02/15/152517.htm>

According to the TSA, in the 9th Circuit Case of John Gilmore, you are 
allowed to fly without showing ID -- you'll just have to submit 
yourself to secondary screening.  The Identity Project wants you to try 
it out.  If you have time, try to fly without showing ID.  I know you 
can do this if you claim that you lost your ID, but I don't know what 
the results would be if you simply refuse to show ID.
<http://www.papersplease.org/investigation.html>

In the Netherlands, criminals are stealing money from ATMs by 
blowing them up.  First, they drill a hole in an ATM and fill it with 
some sort of gas.  Then, they ignite the gas -- from a safe distance -- 
and clean up the money that flies all over the place after the ATM 
explodes.  Sounds crazy, but apparently there has been an increase in 
this type of attack recently.  The banks' countermeasure is to install 
air vents so that gas can't build up inside the ATMs.
<http://www.nu.nl/news/683538/13/Banken_wapenen_zich_tegen_plofkraak.html> or <http://tinyurl.com/z9mng>

GPG is an open-source version of the PGP e-mail encryption 
protocol.  Recently, a very serious vulnerability was discovered in the 
software: given a signed e-mail message, you can modify the message -- 
specifically, you can prepend or append arbitrary data -- without 
disturbing the signature verification.  (This bug is fixed in Version 
1.4.2.2.  Users should upgrade immediately.)  It appears this bug has 
existed for years without anybody finding it.  Moral: Open source does 
not necessarily mean "fewer bugs."  I wrote about this back in 1999.
<http://lists.gnupg.org/pipermail/gnupg-announce/2006q1/000216.html>
<http://www.schneier.com/crypto-gram-9909.html#OpenSourceandSecurity>
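
The class of bug is easy to model in miniature. The toy verifier below 
(not GnuPG's actual code; an HMAC stands in for a public-key signature) 
authenticates only a length-delimited payload and never rejects trailing 
bytes, so appended data rides along under a "good signature" verdict:

import hashlib
import hmac

KEY = b"toy-shared-secret"    # stand-in for a real signing key

def sign(msg: bytes) -> bytes:
    # Format: 4-byte payload length, payload, 32-byte MAC over payload.
    mac = hmac.new(KEY, msg, hashlib.sha256).digest()
    return len(msg).to_bytes(4, "big") + msg + mac

def naive_verify(blob: bytes) -> bytes:
    n = int.from_bytes(blob[:4], "big")
    payload, mac = blob[4:4 + n], blob[4 + n:4 + n + 32]
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(mac, expected):
        raise ValueError("bad signature")
    # BUG: bytes after the MAC are never checked; returning them lets
    # attacker-appended text pass as part of the verified message.
    return payload + blob[4 + n + 32:]

signed = sign(b"Pay Alice $10.")
tampered = signed + b" P.S. Also pay Mallory $1,000."
print(naive_verify(tampered))   # verifies, yet includes the appended text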

Finding covert CIA agents using the Internet:
<http://www.chicagotribune.com/news/nationworld/chi-060311ciamain-story,1,123362.story> or <http://tinyurl.com/qhe2d>
<http://www.theregister.co.uk/2006/03/13/ispy/>
<http://www.telegraph.co.uk/news/main.jhtml?xml=/news/2006/03/13/ucia.xml&sSheet=/portal/2006/03/13/ixportaltop.html> or <http://tinyurl.com/h673n>

An article explains how you can modify, and then print, your own 
boarding pass and get on an airplane even if you're on the no-fly 
list.  This isn't news; I wrote about it in 2003.
<http://www.csoonline.com/read/020106/caveat021706.html>
<http://www.schneier.com/crypto-gram-0308.html#6>

Interesting, but long, article on bioterrorism:
<http://www.technologyreview.com/BioTech/wtr_16485,306,p1.html>

Clever college basketball prank relies on social engineering:
<http://www.schneier.com/blog/archives/2006/03/basketball_pran.html>


** *** ***** ******* *********** *************

      Police Department Privilege Escalation



It's easier than you think to create your own police department in the 
United States.

Yosef Maiwandi formed the San Gabriel Valley Transit Authority -- a 
tiny, privately run nonprofit organization that provides bus rides to 
disabled people and senior citizens. It operates out of an auto repair 
shop.  Then, because the law seems to allow transit companies to form 
their own police departments, he formed the San Gabriel Valley Transit 
Authority Police Department.  As a thank you, he made Stefan Eriksson a 
deputy police commissioner of the San Gabriel Transit Authority 
Police's anti-terrorism division, and gave him business cards.

Police departments like this don't have much legal authority, but they 
don't really need it.  My guess is that the name alone is impressive enough.

In the computer security world, privilege escalation means using some 
legitimately granted authority to secure extra authority that was not 
intended.  This is a real-world counterpart.  Even though transit 
police departments are meant to police their vehicles only, the title 
-- and the ostensible authority that comes along with it -- is useful 
elsewhere.  Someone with criminal intent could easily use this 
authority to evade scrutiny or commit fraud.

"Deal said that his agency has discovered that several railroad 
agencies around California have created police departments -- even 
though the companies have no rail lines in California to patrol. The 
police certification agency is seeking to decertify those agencies 
because it sees no reason for them to exist in California.

"The issue of private transit firms creating police agencies has in 
recent years been a concern in Illinois, where several individuals with 
criminal histories created railroads as a means of forming a police 
agency."

The real problem is that we're too deferential to police power.  We 
don't know the limits of police authority, whether it be an airport 
policeman or someone with a business card from the "San Gabriel Valley 
Transit Authority Police Department."

<http://www.latimes.com/news/local/la-me-ferrari8mar08,0,3717162.story>


** *** ***** ******* *********** *************

      Database Error Causes Unbalanced Budget



A house erroneously valued at $400 million caused budget shortfalls and 
possible layoffs in municipalities and school districts in northwest 
Indiana.  It seems that an unauthorized user accidentally changed the 
value of a database entry.

Three things immediately spring to mind:

One, the system did not fail safely.  This one error seems to have 
cascaded into multiple errors, as the new tax total immediately changed 
the budgets of "18 government taxing units."

Two, there were no sanity checks on the system.  "The city of 
Valparaiso and the Valparaiso Community School Corp. were asked to 
return $2.7 million."  Didn't the city wonder where all that extra 
money came from in the first place?

Three, the access-control mechanisms on the computer system were too 
broad.  When a user is authenticated to use the "R-E-D" program, he 
shouldn't automatically have permission to use the "R-E-R" program as 
well.  Authentication isn't all or nothing; it should be granular to 
the operation.

<http://cnews.canoe.ca/CNEWS/WeirdNews/2006/02/10/1436417-ap.html>
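
Both the missing sanity check and the overly broad access control have 
simple shapes in code. A sketch, where the R-E-D and R-E-R program names 
come from the story and everything else is invented for illustration:

# Per-operation authorization: a user authenticated for one program
# gets no implicit right to run any other.
PERMISSIONS = {"clerk_42": {"R-E-D"}}     # may run R-E-D, not R-E-R

def authorize(user: str, operation: str) -> None:
    if operation not in PERMISSIONS.get(user, set()):
        raise PermissionError(f"{user} may not run {operation}")

def set_assessed_value(user: str, parcel: str,
                       old_value: float, new_value: float) -> None:
    authorize(user, "R-E-R")
    # Sanity check: an implausible jump in assessed value gets flagged
    # for human review instead of silently cascading into the budgets
    # of 18 taxing units.
    if new_value > 100 * max(old_value, 1.0):
        raise ValueError(f"{parcel}: implausible change, needs review")
    print(f"{parcel} updated to {new_value:,.0f}")

# Raises PermissionError: clerk_42 is not authorized for R-E-R.
set_assessed_value("clerk_42", "parcel-001", 121_900.0, 400_000_000.0)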


** *** ***** ******* *********** *************

      Credit Card Companies and Agenda



A guy tears up a credit card application, tapes it back together, fills 
it out with someone else's address and a different phone number, and 
sends it in.  He still gets a credit card.

Imagine that some fraudster is rummaging through your trash and finds a 
torn-up credit card application.  That's why this is bad.

To understand why it's happening, you need to understand the trade-offs 
and the agenda.  From the point of view of the credit card company, the 
benefit of giving someone a credit card is that he'll use it and 
generate revenue.  The risk is that it's a fraudster who will cost the 
company money.  The credit card industry has dealt with the risk in 
two ways: they've pushed a lot of the risk onto the merchants, and 
they've implemented fraud detection systems to limit the damage.

All other costs and problems of identity theft are borne by the 
consumer; they're an externality to the credit card company.  They 
don't enter into the trade-off decision at all.

We can laugh at this kind of thing all day, but it's actually in the 
best interests of the credit card industry to mail cards in response to 
torn-up and taped-together applications without doing much checking of 
the address or phone number.  If we want that to change, we need to fix 
the externality.

<http://www.cockeyed.com/citizen/creditcard/application.shtml>


** *** ***** ******* *********** *************

      Counterpane News



Counterpane and MessageLabs have released a joint Attack Trends report:
<http://www.counterpane.com/pr-20060313.html>
The report itself:
<http://www.counterpane.com/dl/attack-trends-2005-messagelabs.pdf>

TippingPoint partners with Counterpane:
<http://www.crn.com/sections/security/security.jhtml?articleId=181500683> or <http://tinyurl.com/e5rax>

Schneier is speaking at the Software Development Conference in Santa 
Clara on March 15:
<http://www.sdexpo.com/>

Schneier is speaking at the IDC IT Security Roadshow in Istanbul, 
Prague, and Warsaw, March 21st - 28th:
<http://www.idc-cema.com/?showproduct=28039&content_lang=ENG>
<http://www.idc-cema.com/?showproduct=28034&content_lang=ENG>
<http://www.idc-cema.com/?showproduct=28036&content_lang=ENG>

Schneier is speaking at the Rochester Institute of Technology on April 7th:
<http://www.gccis.rit.edu/index.php3?dir=sidebar/&var=summary_of_DLS>

Counterpane is hiring.  We have an urgent need for a Database and 
Systems Analyst and a Senior Java Software Engineer.  There are other 
job openings, too.
<http://www.counterpane.com/jobs.html>

Schneier received the 2006 Dr. Dobb's Journal Excellence in Programming 
Award:
<http://www.ddj.com/documents/s=10035/ddj0604a/0604a.html>

Password Safe 3.0 Beta is available for public testing:
<https://sourceforge.net/forum/forum.php?forum_id=549069>

CNN interviewed Schneier for a story about cell phone tracking.  Video:
<http://dynamic.cnn.com/apps/tp/video/tech/2006/03/14/sieberg.cell.track.affl/video.ws.asx> or <http://tinyurl.com/j4sy4>

** *** ***** ******* *********** *************

      Proof that Employees Don't Care About Security



Does anyone think that this experiment would turn out any differently?

"An experiment carried out within London's square mile has revealed 
that employees in some of the City's best known financial services 
companies don't care about basic security policy.

"CDs were handed out to commuters as they entered the City by employees 
of IT skills specialist The Training Camp and recipients were told the 
disks contained a special Valentine's Day promotion.

"However, the CDs contained nothing more than code which informed The 
Training Camp how many of the recipients had tried to open the CD. 
Among those who were duped were employees of a major retail bank and 
two global insurers.

"The CD packaging even contained a clear warning about installing 
third-party software and acting in breach of company acceptable-use 
policies -- but that didn't deter many individuals who showed little 
regard for the security of their PC and their company."

This was a benign stunt, but it could have been much more serious.  A 
CD-ROM carried into the office and run on a computer bypasses the 
company's network security systems.  You could easily imagine a 
criminal ring using this technique to deliver a malicious program into 
a corporate network -- and it would work.

But concluding that employees don't care about security is a bit 
naive.   Employees care about security; they just don't understand 
it.  Computer and network security is complicated and confusing, and 
unless you're technologically inclined, you're just not going to have 
an intuitive feel for what's appropriate and what's a security 
risk.  Even worse, technology changes quickly, and any security 
intuition an employee has is likely to be out of date within a short time.

Education is one way to deal with this, but education has its 
limitations.  I'm sure these banks had security awareness campaigns; 
they just didn't stick.  Punishment is another form of education, and 
my guess is that it would be more effective.  If the banks fired everyone who 
fell for the CD-ROM-on-the-street trick, you can be sure that no one 
would ever do that again.  (At least, until everyone forgot.)  That 
won't ever happen, though, because the morale effects would be huge.

Rather than blaming this kind of behavior on the users, we would be 
better served by focusing on the technology.  Why does the average 
computer user at a bank need the ability to install software from a 
CD-ROM?  Why doesn't the computer block that action, or at least inform 
the IT department?  Computers need to be secure regardless of who's 
sitting in front of them, irrespective of what they do.

If I go downstairs and try to repair the heating system in my home, I'm 
likely to break all sorts of safety rules -- and probably the system 
and myself in the process.  I have no experience in that sort of thing, 
and honestly, there's no point trying to educate me.  But my home 
heating system works fine without my having to learn anything about 
it.  I know how to set my thermostat, and to call a professional if 
something goes wrong.

Computers need to work more like that.

<http://software.silicon.com/security/0,39024655,39156503,00.htm>


** *** ***** ******* *********** *************

      U.S. Port Security and Proxies



Does it make sense to surrender management, including security, of six 
U.S. ports to a Dubai-based company? This question has set off a heated 
debate between the administration and Congress, as members of both 
parties condemned the deal.

Most of the rhetoric is political posturing, but there's an interesting 
security issue embedded in the controversy. It's about proxies, trust, 
and transparency.

A proxy is a concept I discussed in my book "Beyond Fear." It's a 
person or organization that acts on your behalf in some way. It's how 
complex societies work -- it's impossible for us all to do everything 
or make every decision, so we cede some authority to proxies.

Whether it's the cook at the restaurant where you're eating, the 
suppliers who make your business run or your government, proxies are 
everywhere. Doctors, stockbrokers, hotel chains and government 
regulators like the FDA and the FAA are all proxies.

Sometimes proxies act on our behalf simply because we can't do 
everything. But more often we have these proxies because we don't have 
the expertise to do the work ourselves.

Most security works through proxies. We just don't have the expertise 
to make decisions about airline security, police coverage and military 
readiness, so we rely on others. We all hope our proxies make the same 
decisions we would have, but our only choice is to trust -- to rely on, 
really -- our proxies.

Here's the paradox: Even though we are forced to rely on them, we may 
or may not trust them. When we trust our proxies, we come to that trust 
in a variety of ways -- sometimes through experience, sometimes through 
recommendations from a source we trust. Sometimes it's third-party 
audit, affiliations in professional societies or a gut feeling. But 
when it comes to government, trust is based on transparency. The more 
our government is based on secrecy, the more we are forced to "just 
trust" it and the less we actually trust it.

The security of U.S. ports involves a lot of proxies. We, the people, 
gave our proxy to our elected officials. They passed laws -- the 
Maritime Transportation Security Act (a U.S. law) and the International 
Ship and Port Facility Security codes -- regulating security at these 
ports, and tasked the Coast Guard, another proxy, to oversee that.

The same elected officials (or perhaps some different elected 
officials, through some other bureaucratic proxy entirely) have hired 
yet another proxy -- a company based in the United Kingdom called 
Peninsular and Oriental Steam Navigation Company (P&O) -- to manage the 
ports in New York, New Jersey, Philadelphia, Baltimore, Miami and New 
Orleans.

And now the officers of P&O, acting as proxies for the company's 
shareholders, agreed to be absorbed by yet another proxy: Thunder FZE, 
itself a subsidiary of another company called Dubai Ports World, which 
is a corporation based in the United Arab Emirates.

Still another proxy, the Committee on Foreign Investment in the United 
States, part of the Treasury Department, approved the sale. And 
finally, both P&O and Thunder FZE hire thousands of proxies -- 
employees, suppliers and partners -- to carry out the actual jobs at 
the various ports they operate.

This is a complicated web of proxies, but it's a complicated system. We 
have trouble trusting it, because so much is shrouded in secrecy. We 
don't know what kind of security these ports have. We hear snippets 
like "only 5 percent of incoming cargo is inspected," but we don't know 
more than that. We read that security aspects of the P&O sale were 
"rigorously reviewed," and also that the review was "cursory."

We don't know what kind of security there is in the UAE, Dubai Ports 
World or the subsidiary that is actually doing the work. We have no 
choice but to rely on these proxies, yet we have no basis by which to 
trust them.

Pull aside the rhetoric, and this is everyone's point. There are those 
who don't trust the Bush administration and believe its motivations are 
political. There are those who don't trust the UAE because of its 
terrorist ties -- two of the 9/11 terrorists and some of the funding 
for the attack came out of that country -- and those who don't trust it 
because of racial prejudices. There are those who don't trust security 
at our nation's ports generally and see this as just another example of 
the problem.

The solution is openness. The Bush administration needs to better 
explain how port security works, and the decision process by which the 
sale of P&O was approved. If this deal doesn't compromise security, 
voters -- at least the particular lawmakers we trust -- need to 
understand that.

Regardless of the outcome of the Dubai deal, we need more transparency 
in how our government approaches counter-terrorism in general. Secrecy 
simply isn't serving our nation well in this case. It's not making us 
safer, and it's properly reducing faith in our government.

Proxies are a natural outgrowth of society, an inevitable byproduct of 
specialization. But our proxies are not us and they have different 
motivations -- they simply won't make the same security decisions as we 
would. Whether a king is hiring mercenaries, an organization is hiring 
a network security company or a person is asking some guy to watch his 
bags while he gets a drink of water, successful security proxies are 
based on trust. And when it comes to government, trust comes through 
transparency and openness.

Think of it as security from proxies.

This essay previously appeared in Wired:
<http://www.wired.com/news/columns/0,70258-0.html>

This is some good commentary on the Dubai port issue:
<http://politicalwire.com/archives/2006/03/01/dubai_and_our_ports_whos_taking_over_what.html> or <http://tinyurl.com/og5ap>


** *** ***** ******* *********** *************

      Comments from Readers



There are hundreds of comments -- many of them interesting -- on these 
topics on my blog.  Search for the story you want to comment on, and 
join in.

<http://www.schneier.com/blog>


** *** ***** ******* *********** *************

CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses, 
insights, and commentaries on security: computer and otherwise.  You 
can subscribe, unsubscribe, or change your address on the Web at 
<http://www.schneier.com/crypto-gram.html>.  Back issues are also 
available at that URL.

Comments on CRYPTO-GRAM should be sent to 
[log in to unmask]  Permission to print comments is assumed 
unless otherwise stated.  Comments may be edited for length and clarity.

Please feel free to forward CRYPTO-GRAM to colleagues and friends who 
will find it valuable.  Permission is granted to reprint CRYPTO-GRAM, 
as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier.  Schneier is the author of 
the best sellers "Beyond Fear," "Secrets and Lies," and "Applied 
Cryptography,"  and an inventor of the Blowfish and Twofish 
algorithms.  He is founder and CTO of Counterpane Internet Security 
Inc., and is a member of the Advisory Board of the Electronic Privacy 
Information Center (EPIC).  He is a frequent writer and lecturer on 
security topics.  See <http://www.schneier.com>.

Counterpane is the world's leading protector of networked information - 
the inventor of outsourced security monitoring and the foremost 
authority on effective mitigation of emerging IT threats. Counterpane 
protects networks for Fortune 1000 companies and governments 
world-wide.  See <http://www.counterpane.com>.

Crypto-Gram is a personal newsletter.  Opinions expressed are not 
necessarily those of Counterpane Internet Security, Inc.

Copyright (c) 2006 by Bruce Schneier.
