
Subject: [CSL]: CRYPTO-GRAM, February 15, 2006
From: J Armitage <[log in to unmask]>
Reply-To: Interdisciplinary academic study of Cyber Society <[log in to unmask]>
Date: Wed, 15 Feb 2006 09:23:52 -0000
Content-Type: text/plain
Parts/Attachments: text/plain (1249 lines)

From: Bruce Schneier [mailto:[log in to unmask]] 
Sent: 15 February 2006 07:06
To: [log in to unmask]
Subject: CRYPTO-GRAM, February 15, 2006

                  CRYPTO-GRAM

               February 15, 2006

               by Bruce Schneier
                Founder and CTO
       Counterpane Internet Security, Inc.
            [log in to unmask]
            <http://www.schneier.com>
           <http://www.counterpane.com>


A free monthly newsletter providing summaries, analyses, insights, and 
commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit 
<http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at 
<http://www.schneier.com/crypto-gram-0602.html>.  These same essays 
appear in the "Schneier on Security" blog: 
<http://www.schneier.com/blog>.  An RSS feed is available.


** *** ***** ******* *********** *************

In this issue:
      Risks of Losing Portable Devices
      Multi-Use ID Cards
      Ben Franklin on the Feeling of Security
      Valentine's Day Security
      Crypto-Gram Reprints
      U.S. Customs Opening International Mail
      The Failure of US-VISIT
      Identity Theft in the UK
      News
      Passlogix Misquotes Me in Their PR Material
      The Doghouse: Super Cipher P2P Messenger
      Privatizing Registered Traveler
      Counterpane News
      Security Problems with Controlled Access Systems
      Security in Cartoons
      Countering "Trusting Trust"
      Security in the Cloud
      Comments from Readers


** *** ***** ******* *********** *************

      Risks of Losing Portable Devices



Some years ago, I left my laptop computer on a train from Washington to 
New York. Replacing the computer was expensive, but at the time I was 
more worried about the data.

Of course I had good backups, but now a copy of all my e-mail, client 
files, personal writings and book manuscripts were...well, somewhere. 
Probably the drive would be erased by the computer's new owner, but 
maybe my personal and professional life would end up in places I didn't 
want them to be.

If anything, this problem has gotten worse. Our digital devices have 
all gotten smaller, while at the same time they're carrying more and 
more sensitive information.

My laptop is my primary computer. It could easily contain every e-mail 
I've sent and received over the past 12 years, an enormous amount of 
work-related documents, and my personal everything.

I have several USB thumb drives, including a 2-gig drive that serves as 
my primary backup. The one I carry with me contains a complete dump of 
the past 12 months of my life, in a device so easy to lose some people 
I know buy them in bulk.

My cell phone is a Treo. It holds not only my frequently called phone 
numbers, but my entire address book -- including any personal notes 
I've made -- my calendar for the past six years, hundreds of e-mails, 
all my SMS messages, and a log of every phone call I've made and 
received. At least, it would if I didn't take specific pains to clean 
that information out once in a while.

A friend of mine has a habit of leaving his iPod on airplanes; he's 
been through three so far. The most recent one he lost contained not 
only his full music library, but his address book and calendar as well. 
And the press regularly reports stories about lost and stolen laptops 
with sensitive corporate documents or personal information of hundreds 
of thousands of individuals.

I could go on forever.

The point is that it's now amazingly easy to lose an enormous amount of 
information. Twenty years ago, someone could break into my office and 
copy every customer file, every piece of correspondence, everything 
about my professional life. Today, all he has to do is steal my 
computer. Or my portable backup drive. Or my small stack of DVD 
backups. Furthermore, he could sneak into my office and copy all this 
data, and I'd never know it.

This problem isn't going away anytime soon.

There are two solutions that make sense. The first is to protect the 
data. Hard-disk encryption programs like PGP Disk allow you to encrypt 
individual files, folders or entire disk partitions. Several 
manufacturers market USB thumb drives with built-in encryption. Some 
PDA manufacturers are starting to add password protection -- not as 
good as encryption, but at least it's something -- to their devices, 
and there are some aftermarket PDA encryption programs.
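
To make the first solution concrete, here is a minimal sketch of
encrypting a single file at rest using the Fernet recipe from the
Python "cryptography" package.  It illustrates the general idea only,
not PGP Disk or any product mentioned above; in practice the key would
be derived from a passphrase and kept off the device, so a lost laptop
or thumb drive yields nothing but ciphertext.

  import sys
  from cryptography.fernet import Fernet  # authenticated symmetric encryption

  def encrypt_file(path, key):
      # Read the plaintext, encrypt it, and write the ciphertext next to it.
      with open(path, "rb") as f:
          plaintext = f.read()
      with open(path + ".enc", "wb") as f:
          f.write(Fernet(key).encrypt(plaintext))

  def decrypt_file(path, key):
      # Raises cryptography.fernet.InvalidToken if the file was altered.
      with open(path, "rb") as f:
          return Fernet(key).decrypt(f.read())

  if __name__ == "__main__":
      key = Fernet.generate_key()  # real use: derive from a passphrase
                                   # and store it off the device
      encrypt_file(sys.argv[1], key)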

The second solution is to remotely delete the data if the device is 
lost. This is still a new idea, but I believe it will gain traction in 
the corporate market. If you give an employee a BlackBerry for business 
use, you want to be able to wipe the device's memory if he loses it. 
And since the device is online all the time, it's a pretty easy feature 
to add.

But until these two solutions become ubiquitous, the best option is to 
pay attention and erase data. Delete old e-mails from your BlackBerry, 
SMSs from your cell phone and old data from your address books -- 
regularly. Find that call log and purge it once in a while. Don't store 
everything on your laptop, only the files you might actually need.

I don't think we can make these devices harder to lose; that's a human 
problem and not a technological one. But we can make the loss just cost 
money, not privacy.


This essay originally appeared on Wired.com:
<http://www.wired.com/news/technology/0,70044-0.html>

A Dutch army officer lost a memory stick with classified details of an 
Afghan mission.
<http://www.expatica.com/source/site_article.asp?subchannel_id=1&story_id=27303&name=Officer+lost+USB+stick+with+details+of+Afghan+mission>
or <http://tinyurl.com/ahm7f>

** *** ***** ******* *********** *************

      Multi-Use ID Cards



I don't know about your wallet, but mine contains a driver's license, 
three credit cards, two bank ATM cards, frequent-flier cards for three 
airlines and frequent-guest cards for three hotel chains, membership
cards to two airline clubs, a library card, a AAA card, a Costco 
membership, and a bunch of other ID-type cards.

Any technologist who looks at the pile would reasonably ask: why all 
those cards? Most of them are not intended to be hard-to-forge 
identification cards; they're simply ways of carrying around unique 
numbers that are pointers into a database. Why does Visa bother issuing 
credit cards in the first place? Clearly you don't need the physical 
card in order to complete the transaction, as anyone who has bought 
something over the phone or the internet knows. Your bank could just 
use your driver's license number as an account number.

The same with those airline, hotel and rental car affinity cards. Or 
any of the discount cards given out by supermarkets, office supply 
stores, hardware stores and -- it seems -- everyone else. They could 
use any of your existing account numbers. Or simply your name and 
address. In fact, if you forget your card, they'll look up your account 
number if you give them your phone number. Why go to the trouble and 
expense of issuing unique cards at all?
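
A toy illustration of the point, with made-up data: the card number is
nothing more than a lookup key, and any other unique identifier would
serve just as well.

  # Hypothetical issuer database; the card is only a key into it.
  accounts = {
      "4111111111111111": {"holder": "J. Smith", "credit_limit": 5000},
  }

  def look_up(card_number):
      return accounts[card_number]

  # A phone number, driver's license number, or existing account number
  # would work equally well as the key, which is why the store can still
  # find your account when you leave the card at home.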

A single, centralized authentication system has long been the dream of 
many technologists. Those involved in computer security will remember 
the promise of public-key infrastructure, or PKI. Everyone was going to 
have a single digital "certificate" that would be accepted by all sorts 
of different applications. It never happened.

And today the most far-reaching proposals for national ID cards -- 
including a recent South African proposal -- envision a world where a 
single ID would be used for everything. It won't happen either.

And neither will a world of biometrics. It's the obvious next step: Why 
carry a driver's license? Use your face or fingerprint.

But the truth is, neither a national ID nor a biometric system will 
ever replace the decks of plastic and paper that crowd our wallets.

For starters, the uniqueness of the cards provides important security 
to the issuers. Everyone has different rules for card issuance, 
expiration and revocation, and everyone wants to be in control of their 
own cards. If you lose control, you lose security. So airline clubs ask 
for a photo ID with your membership card, and merchants want to see it 
when you use your credit card, but neither will replace their cards 
with that photo ID.

Another reason is reliability. Your credit card company doesn't want 
your ability to make purchases to disappear if you have your driver's 
license revoked. Your airline doesn't want your frequent-flier account 
to depend on a particular credit card. And no one wants the liability 
of having their application depend on someone else's infrastructure, or 
having their infrastructure support someone else's application.

But security and reliability are only secondary concerns. If it made 
smart business sense for companies to piggyback on existing cards, they 
would find a way around the security concerns. The reason they don't 
boils down to one word: branding.

My airline wants a card with its logo on it in my wallet. So does my 
rental car company, my supermarket and everyone else I do business 
with. My credit card company wants me to open up my wallet and notice 
its card; I'm far more likely to use a physical card than a virtual one 
that I have to remember is attached to my driver's license number. And 
I'm more likely to feel important if I have a card, especially a card 
that recognizes me as a frequent flier or a preferred customer.

Some years ago, when credit cards with embedded chips were new, the 
card manufacturers designed a secure, multi-application operating 
system for these smartcards. The idea was that a single physical card 
could be used for everything: multiple credit card accounts, airline 
affinity memberships, public-transportation payment cards, etc. Nobody 
bought into the system: not because of security concerns, but because 
of branding concerns. Whose logo would get to be on the card? When the 
manufacturers envisioned a card with multiple small logos, one for each 
application, everyone wanted to know: Whose logo would be first? On 
top? In color?

The companies give you their own card partly because they want complete 
control of the rules around their own system, but mostly because they 
want you to carry around a small piece of advertising in your wallet. 
An American Express Gold Card is supposed to make you feel powerful and 
everyone else feel green. They want you to wave it around.

That's why you still have a dozen different cards in your wallet. And 
countries that have national IDs give their citizens yet another card 
to carry around in their wallets -- and not a replacement for something 
else.


This essay originally appeared on Wired.com:
<http://www.wired.com/news/technology/0,70167-0.html>


** *** ***** ******* *********** *************

      Ben Franklin on the Feeling of Security



January 17 was Ben Franklin's 300th birthday.  Among many other 
discoveries and inventions, Franklin worked out a way of protecting 
buildings from lightning strikes, by providing a conducting path to 
ground -- outside a building -- from one or more pointed rods high atop 
the structure.  People tried this, and it worked.  Franklin became a 
celebrity, not just among "electricians," but among the general public.

An article in January's issue of "Physics Today" has a great 1769 quote 
by Franklin about lightning rods, and the reality vs. the feeling of 
security:

"Those who calculate chances may perhaps find that not one death (or 
the destruction of one house) in a hundred thousand happens from that 
cause, and that therefore it is scarce worth while to be at any expense 
to guard against it. But in all countries there are particular 
situations of buildings more exposed than others to such accidents, and 
there are minds so strongly impressed with the apprehension of them, as 
to be very unhappy every time a little thunder is within their hearing; 
it may therefore be well to render this little piece of new knowledge 
as general and well understood as possible, since to make us safe is 
not all its advantage, it is some to make us easy. And as the stroke it 
secures us from might have chanced perhaps but once in our lives, while 
it may relieve us a hundred times from those painful apprehensions, the 
latter may possibly on the whole contribute more to the happiness of 
mankind than the former."

<http://www.physicstoday.org/vol-59/iss-1/p42.html>


** *** ***** ******* *********** *************

      Valentine's Day Security



Last Friday, the Wall Street Journal ran an article about how 
Valentine's Day is the day when cheating spouses are most likely to 
trip up:

"Valentine's Day is the biggest single 24-hour period for florists, a 
huge event for greeting-card companies and a boon for candy makers. But 
it's also a major crisis day for anyone who is having an affair. After 
all, Valentine's Day is the one holiday when everyone is expected to do 
something romantic for their spouse or lover -- and if someone has 
both, it's a serious problem."

So, of course, private detectives work overtime.

"'If anything is going on, it will be happening on that day,' says 
Irene Smith, who says business at her Discreet Investigations detective 
agency in Golden, Colo., as much as doubles -- to as many as 12 cases 
some years -- on Valentine's Day."

Private detectives are expensive -- about $100 per hour, according to 
the article -- and might not be worth it.

The article suggests some surveillance tools you can buy at home: a 
real-time GPS tracking system you can hide in your spouse's car, a Home 
Evidence Collection Kit you can use to analyze stains on "clothing, car 
seats or elsewhere," Internet spying software, a telephone recorder, 
and a really cool buttonhole camera.

But even that stuff may be overkill:

"Ruth Houston, author of a book called _Is He Cheating on You? -- 829 
Telltale Signs,_ says she generally recommends against spending money 
on private detectives to catch cheaters because the indications are so 
easy to read. (Sign No. 3 under "Gifts": He tries to convince you he 
bought expensive chocolates for himself.)"

I hope I don't need to remind you that cheaters should also be reading 
that book, familiarizing themselves with the 829 telltale signs they 
should avoid making.

The article has several interesting personal stories, and warns that 
"planning a 'business trip' that falls over Valentine's Day is a 
typical mistake cheaters make."

So now I'm wondering why the RSA Conference is being held over 
Valentine's Day.


Wall Street Journal article (unfortunately, the link is only for paid 
subscribers):
<http://online.wsj.com/article/SB113953440437870240.html>

Real-time GPS tracking system:
<http://spygear4u.com/>

Home Evidence Collection Kit:
<http://trutestinc.com/>

Internet spying software:
<http://e-spy-software.com/>

Telephone recorder:
<http://uspystore.com/>

Buttonhole camera:
<http://pimall.com/nais/buttoncamera.html>


** *** ***** ******* *********** *************

      Crypto-Gram Reprints



Crypto-Gram is currently in its ninth year of publication.  Back issues 
cover a variety of security-related topics, and can all be found on 
<http://www.schneier.com/crypto-gram-back.html>.  These are a selection 
of articles that appeared in this calendar month in other years.

TSA's Secure Flight:
<http://www.schneier.com/crypto-gram-0502.html#1>

The Curse of the Secret Question:
<http://www.schneier.com/crypto-gram-0502.html#9>

Authentication and Expiration:
<http://www.schneier.com/crypto-gram-0502.html#10>

Toward Universal Surveillance:
<http://www.schneier.com/crypto-gram-0402.html#1>

The Politicization of Security:
<http://www.schneier.com/crypto-gram-0402.html#2>

Identification and Security:
<http://www.schneier.com/crypto-gram-0402.html#6>

The Economics of Spam:
<http://www.schneier.com/crypto-gram-0402.html#9>

Militaries and Cyber-War:
<http://www.schneier.com/crypto-gram-0301.html#1>

The RMAC Authentication Mode:
<http://www.schneier.com/crypto-gram-0301.html#7>

Microsoft and "Trustworthy Computing":
<http://www.schneier.com/crypto-gram-0202.html#1>

Judging Microsoft:
<http://www.schneier.com/crypto-gram-0202.html#2>

Hard-drive-embedded copy protection:
<http://www.schneier.com/crypto-gram-0102.html#1>

A semantic attack on URLs:
<http://www.schneier.com/crypto-gram-0102.html#7>

E-mail filter idiocy:
<http://www.schneier.com/crypto-gram-0102.html#8>

Air gaps:
<http://www.schneier.com/crypto-gram-0102.html#9>

Internet voting vs. large-value e-commerce:
<http://www.schneier.com/crypto-gram-0102.html#10>

Distributed denial-of-service attacks:
<http://www.schneier.com/crypto-gram-0002.html#ddos>

Recognizing crypto snake-oil:
<http://www.schneier.com/crypto-gram-9902.html#snakeoil>


** *** ***** ******* *********** *************

      U.S. Customs Opening International Mail



The press is reporting that Customs and Border Protection is opening 
international mail coming into the U.S. without warrant.

Sadly, this is legal.

Congress passed a trade act in 2002, 107 H.R. 3009, that expanded the 
Custom Service's ability to open international mail.  Here's the 
beginning of Section 344:

"(1) In general.--For purposes of ensuring compliance with the Customs 
laws of the United States and other laws enforced by the Customs 
Service, including the provisions of law described in paragraph (2), a 
Customs officer may, subject to the provisions of this section, stop 
and search at the border, without a search warrant, mail of domestic 
origin transmitted for export by the United States Postal Service and 
foreign mail transiting the United States that is being imported or 
exported by the United States Postal Service."

If I remember correctly, the ACLU was able to temper the amendment, and 
this language is better than what the government originally wanted.

Domestic First Class mail is still private; the police need a warrant
to open it.  But there is a lower standard for Media Mail and the
like, and a lower standard for "mail covers": the practice of 
collecting address information from the outside of the envelope.

<http://www.msnbc.msn.com/id/10740935/>

107 H.R. 3009:
<http://thomas.loc.gov/cgi-bin/bdquery/z?d107:h.r.3009:>


** *** ***** ******* *********** *************

      The Failure of US-VISIT



US-VISIT is the program to fingerprint and otherwise keep
tabs on foreign visitors to the U.S.  A recent article talks about how 
the program is being rolled out, but the last paragraph is the most 
interesting:

"Since January 2004, US-VISIT has processed more than 44 million 
visitors. It has spotted and apprehended nearly 1,000 people with 
criminal or immigration violations, according to a DHS press release."

I wrote about US-VISIT in 2004, and back then I said that it was too 
expensive and a bad trade-off.  The price tag for "the next phase" was 
$15B; I'm sure the total cost is much higher.

But take that $15B number.  One thousand bad guys, most of them not 
very bad, caught through US-VISIT.  That's $15M per bad guy caught.
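
The arithmetic behind that figure, taking the quoted numbers at face
value, is a one-liner:

  cost = 15e9           # "next phase" price tag, in dollars
  caught = 1000         # people apprehended, per the DHS press release
  print(cost / caught)  # 15000000.0, i.e., about $15 million per person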

Surely there's a more cost-effective way to catch bad guys?

<http://fcw.com/article91831-12-30-05-Web>

My previous essay on the topic:
<http://www.schneier.com/essay-072.html>


** *** ***** ******* *********** *************

      Identity Theft in the UK



Recently there was some serious tax credit fraud in the UK.  Basically, 
there is a tax-credit system that allows taxpayers to get a refund for 
some of their taxes if they meet certain criteria.  Politically, this 
was a major objective of the Labour Party.  So the Inland Revenue (the 
UK version of the IRS) made it as easy as possible to apply for this 
refund.  One of the ways taxpayers could apply was via a Web portal.

Unfortunately, the only details necessary when applying were the 
applicant's National Insurance number (the UK version of the Social 
Security number) and mother's maiden name.  The refund was then paid 
directly into any bank account specified on the application 
form.  Anyone who knows anything about security can guess what 
happened.  Estimates are that fifteen million pounds has been stolen
by criminal syndicates.

The press has been treating this as an issue of identity theft, talking 
about how criminals went Dumpster diving to get National Insurance 
numbers and so forth.  I have seen very little about how the 
authentication scheme failed.  The system tried -- using semi-secret 
information like NI number and mother's maiden name -- to authenticate 
the person.  Instead, the system should have tried to authenticate the 
transaction.  Even a simple verification step -- does the name on the 
account match the name of the person who should receive the refund -- 
would have gone a long way to preventing this type of fraud.
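
As a rough sketch of what authenticating the transaction could look
like in code (hypothetical field names, not the Inland Revenue's actual
system): before paying anything out, check that the person who owns the
destination account is the person who is owed the refund.

  def verify_refund(claim, bank_account):
      # claim and bank_account are plain dicts standing in for real
      # records.  Knowing an NI number and a mother's maiden name proves
      # very little; tying the payout to the account holder's identity
      # checks the transaction itself.
      claimant = claim["taxpayer_name"].strip().lower()
      holder = bank_account["holder_name"].strip().lower()
      if claimant != holder:
          raise ValueError("refund payee does not match account holder")
      return True

  # verify_refund({"taxpayer_name": "A. Taxpayer"},
  #               {"holder_name": "Some Fraudster"})  # raises ValueError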

<http://news.bbc.co.uk/1/hi/business/4617108.stm>


** *** ***** ******* *********** *************

      News



One problem with cameras is that you can't trust the watchers not to 
misuse them.  This is a story of two CCTV camera operators who have 
been jailed for spying on a naked woman in her own home.
<http://news.bbc.co.uk/1/hi/england/merseyside/4609746.stm>
<http://www.theregister.co.uk/2006/01/13/cctv_men_jailed/>

The Department of Homeland Security is funding the security of 
open-source products, including Linux, Apache, MySQL, FreeBSD, Mozilla,
and Sendmail.  I think this is a great use of public funds.  One of the 
limitations of open-source development is that it's hard to fund tools 
like Coverity.  And this kind of thing improves security for a lot of 
different organizations against a wide variety of threats.  And it 
increases competition with Microsoft, which will force it to improve 
its OS as well.  Everybody wins.
<http://www.eweek.com/article2/0,1895,1909946,00.asp>

All of that extra-legal NSA eavesdropping resulted in a whole lot of 
dead ends.  This should come as a surprise to nobody.  False alarms are 
what you get when you institute a wholesale surveillance program with
computers in charge; actual terrorist plots are simply too rare for 
anything else to happen.  Good security has people in charge, and 
technology as a tool -- not the other way around.
<http://www.nytimes.com/2006/01/17/politics/17spy.html>
A lot of the above article reads like a turf war between the NSA and 
the FBI, but the "inside baseball" aspects are interesting.
<http://www.wired.com/news/columns/0,70035-0.html>

Interesting essay on suicide bombers.  The conclusion: civil liberties 
increase security.
<http://www.sciam.com/article.cfm?chanID=sa006&articleID=0006A854-E67F-13A1-A67F83414B7F0104&pageNumber=2&catID=2>
or <http://tinyurl.com/ack8p>

Great counterfeiting story that illustrates how criminals adapt to 
security measures.  As a security measure, the merchants use a chemical 
pen that determines if the bills are counterfeit.  But that's not 
exactly what the pen does.  The pen only verifies that the paper is 
legitimate.  So criminals take low-value bills, bleach them, and turn 
them into high-value bills.
<http://news.tbo.com/news/metro/MGB60FN8IIE.html>

Last month was the 20th anniversary of the first PC virus: Brain.
<http://www.f-secure.com/v-descs/brain.shtml>
<http://www.f-secure.com/weblog/archives/archive-012006.html#00000784> 
or <http://tinyurl.com/7wnl4>

Some detail about how bomb-sniffing dogs work:
<http://www.slate.com/id/2134394/>

Anonym.OS is an anonymous operating system.  It's CD-based, designed so 
that you never touch the hard drive.  You can walk up to a public 
computer and be anonymous on the Internet.  I think this kind of thing 
is important, and am pleased to see its development.
<http://www.wired.com/news/technology/0,70017-0.html>
<http://theory.kaos.to/projects.html>
<http://yro.slashdot.org/article.pl?sid=06/01/16/2142208>

The U.S. Defense Department wants to develop a lie detector that can be 
used surreptitiously.  (Sure, who wouldn't want one of those?)
<http://www.newscientist.com/article.ns?id=mg18925335.800>

In this article about RFID cards, there's a paragraph that claims the 
chip can be read "from several yards away at border crossings."  I 
thought the government was still claiming that the chips could only be 
read from inches away?
<http://www.latimes.com/news/nationworld/nation/la-na-border18jan18,0,1125973.story>
or <http://tinyurl.com/ch8d5>

The 43rd Mersenne prime was found: 2^30,402,457 - 1, in a massively 
parallel search.  It's 9,152,052 decimal digits long.
<http://www.mersenne.org/prime.htm>
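
The digit count is easy to check: since a power of two is never a power
of ten, 2^30,402,457 - 1 has floor(30,402,457 * log10 2) + 1 decimal
digits.  A one-line check in Python:

  import math
  print(math.floor(30402457 * math.log10(2)) + 1)  # 9152052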

Article on how the French utilize domestic spying as a counterterrorism 
tool.  I like reading how a judge is intimately involved in the process.
<http://www.foreignpolicy.com/story/cms.php?story_id=3353>

How to survive a robot uprising:
<http://www.livejournal.com/users/bohunk/1561641.html>
<http://www.schneier.com/blog/archives/2006/01/how_to_survive.html>

If you have a moment, take this survey on vulnerability 
disclosure.  The researchers are trying to understand how secrecy and 
openness can be balanced in the analysis and alerting of security 
vulnerabilities:
<http://www.infowarrior.org/survey.html>

Fascinating information about espionage in New Zealand:
<http://www.stuff.co.nz/stuff/print/0,1478,3540743a6005,00.html>

EPIC has documents that show how no-bid contracts for work on voting 
system standards go to vendors of those machines.
<http://www.epic.org/foia_notes/note11.html>

This is a sad story of the U.S. no-fly list.  The person in question 
was flying from Canada to Mexico; his plane didn't land in the U.S., 
only flew over it.  Another case of mistaken identity, of course.
<http://www.thestar.com/NASApp/cs/ContentServer?pagename=thestar/Layout/Article_Type1&c=Article&cid=1136589011746&call_pageid=968332188854&col=968350060724>
or <http://tinyurl.com/dsxnn>
And here's a story of a four-year-old boy on the watch list:
<http://abclocal.go.com/ktrk/story?section=local&id=3771743>

The NSA's technology transfer program:
<http://www.nsa.gov/techtrans/index.cfm>
Also look at their 44 Technology Profile Fact Sheets:
<http://www.nsa.gov/techtrans/techt00002.cfm>

I get e-mail, occasionally weird e-mail.  Every once in a while I get 
an e-mail from someone who needs a handwritten real-world cryptogram 
solved.  This one is from 2004, and involves a multiple murder and 
suicide.  The cryptogram was left by the murderer, and is on my 
blog.  Take a look, at the note and the comments, if you are interested 
in trying to solve the mystery.  But please be respectful of the 
relatives and friends of the victims; they're also following the 
progress on the blog.
<http://www.schneier.com/blog/archives/2006/01/handwritten_rea.html>

The 2005 Information Security Salary and Career Advancement Survey is 
interesting to read.
<http://www.sans.org/salary2005>

High-tech wireless dead drop discovered in Russia.  Cool stuff.
<http://en.rian.ru/russia/20060130/43250990.html>
<http://news.bbc.co.uk/2/hi/europe/4639758.stm>
<http://www.schneier.com/blog/archives/2006/01/wireless_dead_d.html>
I am reminded of a dead drop technique used by, I think, the 9/11 
terrorists.  They used Hotmail (or some other anonymous e-mail service) 
accounts, but instead of e-mailing messages to each other, one would 
save a message as "draft" and the recipient would retrieve it from the 
same account later.  I thought that was pretty clever, actually.

A couple of Dutch hackers have cracked their country's biometric
passport.  Two points stand out.  One, the RFID chip in the passport 
can be read from ten meters.  Two, lots of predictability in the 
encryption key -- sloppy, sloppy -- makes the brute-force attack much 
easier.  But the references are from last summer.  Why is this being 
reported now?
<http://www.theregister.com/2006/01/30/dutch_biometric_passport_crack/>

Bug in Google's censored service for China:
<http://www.crypticide.com/dropsafe/articles/security/post20060129233439.html>
or <http://tinyurl.com/792x9>
And how it works, using "tiananmen" as a search term:
<http://www.computerbytesman.com/google/imagesearch.htm?tiananmen>

The NSA on how to redact (MS Word and PDF):
<http://www.fas.org/sgp/othergov/dod/nsa-redact.pdf>
Read other NSA Security Configuration Guides here:
<http://www.nsa.gov/snac/>

Interesting article about someone convicted for running a for-profit 
botnet:
<http://www.breitbart.com/news/2006/01/23/D8FALFU05.html>

A Dutch high-tech prison where "inmates wear electronic wristbands that 
track their every movement and guards monitor cells using 
emotion-recognition software."  Emotion recognition 
software?  Wow.  Remember, new surveillance technologies are first used 
on populations with limited rights: inmates, children, military 
personnel, and the mentally ill.
<http://www.cnn.com/2006/TECH/01/19/high.tech.prisons.ap>

Interesting white paper from the ACLU: "Eavesdropping 101: What Can The 
NSA Do?"
<http://www.aclu.org/safefree/nsaspying/23989res20060131.html>
<http://www.aclu.org/safefree/nsaspying/nsamap013006.html>
<http://www.politechbot.com/2006/01/31/barry-steinhardt-on>

Unknowns tapped the mobile phones of about 100 Greek politicians and 
offices, including the U.S. embassy in Athens and the Greek prime 
minister.  Details are sketchy, but it seems that a piece of malicious 
code was discovered by Ericsson technicians in Vodafone's mobile phone 
software. The code tapped into the conference call system. It 
"conference called" phone calls to 14 prepaid mobile phones where the 
calls were recorded.
<http://betabug.ch/blogs/ch-athens/288>
<http://seattlepi.nwsource.com/national/1103AP_Greece_Phone_Surveillance.html>
or <http://tinyurl.com/ajsgu>
More information in Greek:
<http://www.in.gr/news/article.asp?lngEntityID=681341&lngDtrID=244>

Interesting research paper by Shishir Nagaraja and Ross Anderson, "The 
Topology of Covert Conflict."  Implications for warfare, terrorism, and 
peer-to-peer file sharing:
<http://www.cl.cam.ac.uk/TechReports/UCAM-CL-TR-637.html>

Last year I wrote about an article by Daniel J. Solove and Chris 
Hoofnagle titled "A Model Regime of Privacy Protection."  The paper has 
been revised a few times based on comments -- some of them from readers 
of my blog and Crypto-Gram -- and published.
<http://papers.ssrn.com/sol3/papers.cfm?abstract_id=881294>
The odds of this turning into law are, unfortunately, close to zero.

This Barcelona club requires an embedded RFID chip for VIP status:
<http://edition.cnn.com/2004/TECH/10/05/spark.bajabeach/>
And this company requires the same for access to the data center:
<http://www.theregister.co.uk/2006/02/10/employees_chipped/>
<http://www.securityfocus.com/brief/134>

This article by Malcolm Gladwell on profiling and generalizations is 
excellent:
<http://www.newyorker.com/fact/content/articles/060206fa_fact>

Here's an interesting rebuttal of Laszlo Kish's theoretically secure 
classical communications scheme.
<http://terrybollinger.com/qencrypt/BollingerCritiqueOfKishPaper-2006-01-31.pdf>
or <http://tinyurl.com/as63o>
And a response from Kish:
<http://www.ece.tamu.edu/~noise/research_files/Response_Bollinger.pdf>
My original essay on the topic:
<http://www.schneier.com/blog/archives/2005/12/totally_secure.html>

Check washing is a form of fraud.  The criminal uses various solvents 
to remove data from a signed check -- the "pay to" name, the amount -- 
and replace it with data more beneficial to the criminal: his own name, 
a larger amount.  This webpage -- I know nothing about who these people 
are, but they seem a bit amateurish -- talks about check fraud, and 
then gives advice on pens and inks to check writers.
<http://www.ckfraud.org/washing.html>

Interesting paper on "petnames," which tries to solve the security 
problems inherent in naming.
<http://www.skyhunter.com/marcs/petnames/IntroPetNames.html>

The militarization of police work: more and more, police are using 
military-style weapons and tactics.  "Eastern Kentucky University's 
Peter Kraska -- a widely cited expert on police militarization -- 
estimates that SWAT teams are called out about 40,000 times a year in 
the United States; in the 1980s, that figure was 3,000 times a year. 
Most 'call-outs' were to serve warrants on nonviolent drug offenders."
<http://www.cato.org/pub_display.php?pub_id=5439>

Really interesting article on security features of Internet Explorer 7:
<http://redmondmag.com/columns/article.asp?editorialsid=1215>
My commentary:
<http://www.schneier.com/blog/archives/2006/02/the_new_interne.html>

The TSA has announced that Secure Flight, its comprehensive program to 
match airline passengers against terrorist watch lists, has been 
suspended.  I have written about this program extensively. It's an 
absolute mess in every way, and doesn't make us safer.  But don't think 
this is the end. Under Section 4012 of the Intelligence Reform and 
Terrorism Prevention Act, Congress mandated the TSA put in place a 
program to screen every domestic passenger against the watch list. 
Until Congress repeals that mandate, these postponements and 
suspensions are the best we can hope for. Expect it all to come back 
under a different name -- and a clean record in the eyes of those not 
paying close attention -- soon.
<http://msnbc.msn.com/id/11254968/>
Me on Secure Flight:
<http://www.schneier.com/blog/archives/2005/09/secure_flight_n_1.html>

I just found an interesting paper: "Windows Access Control 
Demystified," by Sudhakar Govindavajhala and Andrew W. Appel. 
Basically, they show that companies like Adobe, Macromedia, etc., have 
mistakes in their Access Control Programming that open security holes 
in Windows XP.
<http://www.cs.princeton.edu/~sudhakar/papers/winval.pdf>
Ed Felten has some good commentary about the paper on his blog.
<http://www.freedom-to-tinker.com/?p=970>

Forget RFID; you can be tracked with WiFi from several hundred meters away.
<http://us.gizmodo.com/gadgets/wireless/wifi-tracking-008264.php>
In other news, Apple is adding WiFi to its iPod:
<http://www.reghardware.co.uk/2006/02/08/portalplayer_wireless_ipod_chip/>
or <http://tinyurl.com/74t7o>
And don't forget, you can be tracked by your cell phone:
<http://news.com.com/E-tracking+through+your+cell+phone/2010-1039_3-6038468.html>
or <http://tinyurl.com/7gaxm>

Gary T. Marx is a sociology professor at MIT, and a frequent writer on 
privacy issues.  I find him both clear and insightful, as well as 
interesting and entertaining.  This new paper is worth reading: "Soft 
Surveillance: The Growth of Mandatory Volunteerism in Collecting 
Personal Information -- 'Hey Buddy Can You Spare a DNA?'"
<http://web.mit.edu/gtmarx/www/softsurveillance.html>
You can read a whole bunch of his other articles here:
<http://web.mit.edu/gtmarx/www/garyhome.html#Online>

Real fake ID cards. Or maybe they're fake real ID cards.  This website 
sells ID cards.  They're not ID cards for anything in particular, but 
they look official.  If you need to fool someone who really doesn't 
know what an ID card is supposed to look like, these are likely to work.
<http://www.real-id.com/>


** *** ***** ******* *********** *************

      Passlogix Misquotes Me in Their PR Material



I recently received a PR e-mail from a company called Passlogix.  In 
part, it said:  "Password security is still a very prevalent threat, 
2005 had security gurus like Bruce Schneier publicly suggest that you 
actually write them down on sticky-notes.  A recent survey stated 78% 
of employees use passwords as their primary forms of security, 52% use 
the same password for their accounts -- yet 77% struggle to remember 
their passwords."

Actually, I don't.  I recommend writing your passwords down and keeping 
them in your wallet.

I know nothing about this company, but I am unhappy at their 
misrepresentation of what I said.

<http://www.passlogix.com/>

My recommendation:
<http://www.schneier.com/blog/archives/2005/06/write_down_your.html>


** *** ***** ******* *********** *************

      The Doghouse: Super Cipher P2P Messenger



Super Cipher P2P Messenger uses "unbreakable Infinity bit Triple Layer 
Socket Encryption for completely secure communication."

<http://www.snapfiles.com/get/supercipherp2p.html>


** *** ***** ******* *********** *************

      Privatizing Registered Traveler



In mid-January, the TSA announced details of its Registered Traveler 
program (sometimes known as "Trusted Traveler").  Basically, you pay 
money for a background check and get a biometric ID -- a fingerprint -- 
that gets you through airline security faster.

I've already written about why this is a bad idea for security:

"What the Trusted Traveler program does is create two different access 
paths into the airport: high security and low security. The intent is 
that only good guys will take the low-security path, and the bad guys 
will be forced to take the high-security path, but it rarely works out 
that way. You have to assume that the bad guys will find a way to take 
the low-security path.

"The Trusted Traveler program is based on the dangerous myth that 
terrorists match a particular profile and that we can somehow pick 
terrorists out of a crowd if we only can identify everyone. That's 
simply not true. Most of the 9/11 terrorists were unknown and not on 
any watch list. Timothy McVeigh was an upstanding US citizen before he 
blew up the Oklahoma City Federal Building. Palestinian suicide bombers 
in Israel are normal, nondescript people. Intelligence reports indicate 
that Al Qaeda is recruiting non-Arab terrorists for US operations."

But what the TSA is actually doing is even more bizarre.  The TSA is 
privatizing this system.  They want the companies that *sell* 
for-profit, Registered Traveler passes to do the background 
checks.  They want the companies to use error-filled commercial 
databases to do this.  What incentive do these companies have to not 
sell someone a pass?  Who is liable for mistakes?

I thought airline security was important.

<http://www.tsa.gov/public/display?theme=40&content=090005198018c349>

News article:
<http://www.washingtonpost.com/wp-dyn/content/article/2006/01/20/AR2006012001812.html>
or <http://tinyurl.com/9axu6>

My previous essay:
<http://www.schneier.com/essay-051.html>

This is an excellent discussion of the problems:
<http://arstechnica.com/news.ars/post/20060125-6052.html>
"What's worse than having identity thieves impersonate you to Chase 
Bank? Having terrorists impersonate you to the TSA."


** *** ***** ******* *********** *************

      Counterpane News



Counterpane monitored something like 100 billion network events,
world-wide, in 2005.  These are the attack trends that we're seeing.
<http://www.counterpane.com/cgi-bin/attack-trends-cg.cgi>

For the RSA Conference, my wife and I have written a 110-page 
restaurant guidebook for the downtown San Jose area.  It's a fun read, 
even if you aren't looking for a San Jose restaurant.  (Do people know 
that I write restaurant reviews for the Minneapolis Star Tribune?)  The 
restaurant guide will be available at the conference -- and of course 
you can download it -- but I have a few hundred to give away 
here.  I'll send a copy to anyone who wants one, in exchange for 
postage.  (It's not about the money, but I need some sort of gating 
function so that only those actually interested get a copy.)  Cost is 
$2.50 if you live in the U.S., $3.00 for Canada/Mexico, and $6.00 
elsewhere.  I'll accept PayPal to my e-mail address -- 
[log in to unmask] -- or a check to Bruce Schneier, Counterpane 
Internet Security, Inc., 1090A La Avenida, Mountain View, CA 
94043.  Sorry, but I can't accept credit cards directly.
<http://www.schneier.com/restaurants-rsa2006.pdf>

Last weekend I spoke at the ACLU Washington Annual Membership 
Conference.  The Seattle Times covered my speech:
<http://seattletimes.nwsource.com/html/localnews/2002800247_aclu12m.html>
or <http://tinyurl.com/bomxu>
<http://www.aclu-wa.org/detail.cfm?id=391>


** *** ***** ******* *********** *************

      Security Problems with Controlled Access Systems



There was an interesting security tidbit in an article on the recent 
post office shooting.  "The shooter's pass to access the facility had 
been expired, officials said, but she apparently used her knowledge of 
how security at the facility worked to gain entrance, following another 
vehicle in through the outer gate and getting other employees to open 
security doors."

This is a failure of both technology and procedure.  The gate was 
configured to allow multiple vehicles to enter on only one person's 
authorization -- that's a technology failure.  And people are 
programmed to be polite -- to hold the door for others.

Note: There is a common myth that workplace homicides are prevalent in 
the United States Postal Service.  (Note the phrase "going 
postal.")  But not counting this event, there has been less than one 
shooting fatality per year at Postal Service facilities over the last 
20 years.  As the USPS has more than 700,000 employees, this is a lower 
rate than the average workplace.

NOTE:  Some news reports say that she got another employee's badge at 
gunpoint, which is a failure of a completely different kind.

<http://www.msnbc.msn.com/id/11107022/>


** *** ***** ******* *********** *************

      Security in Cartoons



RFID:
<http://www.ibiblio.org/Dave/Dr-Fun/df200601/df20060116.jpg>

Spamming:
<http://ars.userfriendly.org/cartoons/?id=20060131>

Airline security checkpoints:
<http://www.ucomics.com/closetohome/2006/02/07/>


** *** ***** ******* *********** *************

      Countering "Trusting Trust"



Way back in 1974, Paul Karger and Roger Schell discovered a devastating 
attack against computer systems.  Ken Thompson described it in his 
classic 1984 speech, "Reflections on Trusting Trust."  Basically, an
attacker changes a compiler binary to produce malicious versions of 
some programs, INCLUDING ITSELF. Once this is done, the attack 
perpetuates, essentially undetectably. Thompson demonstrated the attack 
in a devastating way: he subverted a compiler of an experimental 
victim, allowing Thompson to log in as root without using a password. 
The victim never noticed the attack, even when they disassembled the 
binaries -- the compiler rigged the disassembler, too.

This attack has long been part of the lore of computer security, and 
everyone knows that there's no defense.  And that makes a new paper by 
David A. Wheeler so interesting.  It's called "Countering Trusting 
Trust through Diverse Double-Compiling," and it describes a technique 
called diverse double-compiling (DDC) that detects this attack.  From 
the abstract: "Simply recompile the purported source code twice: once 
with a second (trusted) compiler, and again using the result of the 
first compilation. If the result is bit-for-bit identical with the 
untrusted binary, then the source code accurately represents the 
binary. This technique has been mentioned informally, but its issues 
and ramifications have not been identified or discussed in a 
peer-reviewed work, nor has a public demonstration been made. This 
paper describes the technique, justifies it, describes how to overcome 
practical challenges, and demonstrates it."

To see how this works, look at the attack.  In a simple form, the 
attacker modifies the compiler binary so that whenever some targeted 
security code like a password check is compiled, the compiler emits the 
attacker's backdoor code in the executable.

Now, this would be easy to get around by just recompiling the 
compiler.  Since that will be done from time to time as bugs are fixed 
or features are added, a more robust form of the attack adds a 
step:  Whenever the compiler is itself compiled, it emits the code to 
insert malicious code into various programs, including itself.

Assuming broadly that the compiler source is updated, but not 
completely rewritten, this attack is undetectable.

Wheeler explains how to defeat this more robust attack.  Suppose we 
have two completely independent compilers: A and T.  More specifically, 
we have source code S_A of compiler A, and executable code E_A and 
E_T.  We want to determine if the binary of compiler A -- E_A -- 
contains this trusting trust attack.

Here's Wheeler's trick:

Step 1: Compile S_A with E_A, yielding new executable X.

Step 2: Compile S_A with E_T, yielding new executable Y.

Since X and Y were generated by two different compilers, they should
have different binary code but be functionally equivalent.  So far, so
good.  Now:

Step 3: Compile S_A with X, yielding new executable V.

Step 4: Compile S_A with Y, yielding new executable W.

Since X and Y are functionally equivalent, V and W should be 
bit-for-bit equivalent.

And that's how to detect the attack.  If E_A is infected with the 
robust form of the attack, then X and Y will be functionally 
different.  And if X and Y are functionally different, then V and W 
will be bitwise different.  So all you have to do is to run a binary 
compare between V and W; if they're different, then E_A is infected.
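
Here is the same test as a small Python driver.  The compiler paths and
command-line conventions below are placeholder assumptions, not
anything from Wheeler's paper, and a real run also needs deterministic
builds (fixed paths, timestamps, and so on) for the final bitwise
comparison to mean anything.

  import subprocess, filecmp

  def compile_with(compiler, source, output):
      # Placeholder invocation; a real compiler needs real flags.
      subprocess.run([compiler, source, "-o", output], check=True)

  def diverse_double_compile(source_A, exe_A, exe_T):
      compile_with(exe_A, source_A, "X")  # Step 1: S_A compiled with E_A
      compile_with(exe_T, source_A, "Y")  # Step 2: S_A compiled with E_T
      compile_with("./X", source_A, "V")  # Step 3: S_A compiled with X
      compile_with("./Y", source_A, "W")  # Step 4: S_A compiled with Y
      # If E_A carries the self-perpetuating attack, X and Y differ in
      # behavior, so V and W will not be bit-for-bit identical.
      return filecmp.cmp("V", "W", shallow=False)

  # diverse_double_compile("compilerA.c", "./compilerA", "./trusted-cc")
  # True  -> V and W match; no evidence of the attack
  # False -> E_A (or the build process) is suspect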

Now you might read this and think: "What's the big deal?  All I need to 
test if I have a trusted compiler is...another trusted compiler.  Isn't 
it turtles all the way down?"

Not really.  You do have to trust a compiler, but you don't have to 
know beforehand which one you must trust.  If you have the source code 
for compiler T, you can test it against compiler A.  Basically, you 
still have to have at least one executable compiler you trust.  But you 
don't have to know which one you should start trusting.

And the definition of "trust" is much looser.  This countermeasure will 
only fail if both A and T are infected in exactly the same way.  The 
second compiler can be malicious; it just has to be malicious in some 
different way: i.e., it can't have the same triggers and payloads as
the first.  You can greatly increase the odds that the 
triggers/payloads are not identical by increasing diversity: using a 
compiler from a different era, on a different platform, without a 
common heritage, transforming the code, etc.

Also, the *only* thing compiler T has to do is compile the
compiler-under-test.  It can be hideously slow, produce code that is
hideously slow, or only work on a machine that hasn't been produced in
a decade.  You could create a compiler specifically for this task.  And
if you're *really* worried about "turtles all the way down," you can
write compiler T yourself for a computer you built yourself from vacuum
tubes that you made yourself.  Since compiler T only has to
occasionally recompile your "real" compiler, you can impose a lot of
restrictions that you would never accept in a typical production-use
compiler.  And you can periodically check compiler T's integrity using
every other compiler out there.

Now, this technique only detects when the binary doesn't match the 
source, so someone still needs to examine the compiler source 
code.  But now you only have to examine the source code (a much easier 
task), not the binary.

It's interesting: the "trusting trust" attack has actually gotten 
easier over time, because compilers have gotten increasingly complex, 
giving attackers more places to hide their attacks.  Here's how you can 
use a simpler compiler -- that you can trust more -- to act as a 
watchdog on the more sophisticated and more complex compiler.


Wheeler's paper and website:
<http://www.acsa-admin.org/2005/abstracts/47.html>
<http://www.dwheeler.com/trusting-trust>

"Reflections on Trusting Trust"
<http://www.acm.org/classics/sep95/>


** *** ***** ******* *********** *************

      Security in the Cloud



One of the basic philosophies of security is defense in depth: 
overlapping systems designed to provide security even if one of them 
fails. An example is a firewall coupled with an intrusion-detection 
system (IDS). Defense in depth provides security, because there's no 
single point of failure and no assumed single vector for attacks.

It is for this reason that a choice between implementing network 
security in the middle of the network -- in the cloud -- or at the 
endpoints is a false dichotomy. No single security system is a panacea, 
and it's far better to do both.

This kind of layered security is precisely what we're seeing develop. 
Traditionally, security was implemented at the endpoints, because 
that's what the user controlled. An organization had no choice but to 
put its firewalls, IDSs, and anti-virus software inside its network. 
Today, with the rise of managed security services and other outsourced 
network services, additional security can be provided inside the cloud.

I'm all in favor of security in the cloud. If we could build a new 
Internet today from scratch, we would embed a lot of security 
functionality in the cloud. But even that wouldn't substitute for 
security at the endpoints. Defense in depth beats a single point of 
failure, and security in the cloud is only part of a layered approach.

For example, consider the various network-based e-mail filtering 
services available. They do a great job of filtering out spam and 
viruses, but it would be folly to consider them a substitute for 
anti-virus security on the desktop. Many e-mails are internal only, 
never entering the cloud at all. Worse, an attacker might open up a 
message gateway inside the enterprise's infrastructure. Smart 
organizations build defense in depth: e-mail filtering inside the cloud 
plus anti-virus on the desktop.

The same reasoning applies to network-based firewalls and 
intrusion-prevention systems (IPS). Security would be vastly improved 
if the major carriers implemented cloud-based solutions, but they're no 
substitute for traditional firewalls, IDSs, and IPSs.

This should not be an either/or decision. At Counterpane, for example, 
we offer cloud services and more traditional network and desktop 
services. The real trick is making everything work together.

Security is about technology, people, and processes. Regardless of 
where your security systems are, they're not going to work unless human 
experts are paying attention. Real-time monitoring and response is 
what's most important; where the equipment goes is secondary.

Security is always a trade-off. Budgets are limited and economic 
considerations regularly trump security concerns. Traditional security 
products and services are centered on the internal network, because 
that's the target of attack. Compliance focuses on that for the same 
reason. Security in the cloud is a good addition, but it's not a 
replacement for more traditional network and desktop security.

This was published as a "Face-Off" in "Network World":
<http://www.networkworld.com/columnists/2006/021306faceoffno.html>

The opposing view is here:
<http://www.networkworld.com/columnists/2006/021306faceoffyes.html>


** *** ***** ******* *********** *************

      Comments from Readers



There are hundreds of comments -- many of them interesting -- on these 
topics on my blog.  Search for the story you want to comment on, and 
join in.

<http://www.schneier.com/blog>


** *** ***** ******* *********** *************

CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses, 
insights, and commentaries on security: computer and otherwise.  You 
can subscribe, unsubscribe, or change your address on the Web at 
<http://www.schneier.com/crypto-gram.html>.  Back issues are also 
available at that URL.

Comments on CRYPTO-GRAM should be sent to 
[log in to unmask].  Permission to print comments is assumed
unless otherwise stated.  Comments may be edited for length and clarity.

Please feel free to forward CRYPTO-GRAM to colleagues and friends who 
will find it valuable.  Permission is granted to reprint CRYPTO-GRAM, 
as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier.  Schneier is the author of 
the best sellers "Beyond Fear," "Secrets and Lies," and "Applied 
Cryptography,"  and an inventor of the Blowfish and Twofish 
algorithms.  He is founder and CTO of Counterpane Internet Security 
Inc., and is a member of the Advisory Board of the Electronic Privacy 
Information Center (EPIC).  He is a frequent writer and lecturer on 
security topics.  See <http://www.schneier.com>.

Counterpane is the world's leading protector of networked information - 
the inventor of outsourced security monitoring and the foremost 
authority on effective mitigation of emerging IT threats. Counterpane 
protects networks for Fortune 1000 companies and governments 
world-wide.  See <http://www.counterpane.com>.

Crypto-Gram is a personal newsletter.  Opinions expressed are not 
necessarily those of Counterpane Internet Security, Inc.

Copyright (c) 2006 by Bruce Schneier.



