The Economist: Privacy, Actually

Friday, February 15, 2008

THE internet, argues Kim Cameron, who works as “Identity Architect” at Microsoft, “was built without a way to know who and what you are connecting to”. That is bad enough in the private sector, where the only thing at stake is money. For dealing with government, it is potentially catastrophic. Technology can—just about—tell how an internet user got online. It can check the authenticity of passwords and logins, and validate smart cards or biometric checks. But such data, even if encrypted, can be stolen, borrowed, guessed or intercepted.

Internet users have become used to providing personal information to any convincing-looking box that appears on a screen. They have little idea of either the technology that helps to provide electronic security in practice or the theoretical principles that determine whether it will work. According to Mr Cameron, “there is no consistent and comprehensible framework allowing them to evaluate the authenticity of the sites they visit, and they don’t have a reliable way of knowing when they are disclosing private information to illegitimate parties. At the same time they lack a framework for controlling or even remembering the many different aspects of their digital existence.”

So financial institutions and their customers are routinely defrauded by cybergangsters, and there is little legal basis for dealing with cybercrime. Identities are valuable, allowing crooks to empty bank accounts or buy things online. Cybercriminals have been targeting individual internet users with “spyware” (which records keystrokes) and “phishing” (bogus e-mails that trick users into providing personal information online). But the huge databases held by governments would be a much bigger prize. E-government looks like a potential crock of gold for fraudsters, with huge databases compiled by law, most of them only lightly and incompetently protected, and ambitious plans for even more.
The biggest e-government contract anywhere is Britain’s £12.4 billion scheme for centralised medical records, which will be held on a database accessible by perhaps 1m NHS staff. Other grandiose plans in Britain include a national identity-card scheme; ContactPoint, a national register of all children in England, which will be accessible by 300,000 people; and a pensioners’ bus-pass scheme containing the ages and addresses of 17m people.

Officials and politicians insist that these schemes are safe. Encryption will be strong, they say, and access controlled. Any attempt to get into a patient’s medical records will leave an electronic fingerprint, which will help to protect confidentiality. Maybe. But the history of big databases so far is not encouraging. Critics worry that it would take only one careless or corrupt person with the right access to any of the planned databases for the whole country’s records to become vulnerable.

Ross Anderson, professor of security engineering at Cambridge and one of the government’s most vehement critics, argues that local systems are far more secure than national ones. Patient data held at a GP practice may be vulnerable to a security lapse on the premises, but the damage will be limited. “You can have security, or functionality, or scale—you can even have any two of these. But you can’t have all three, and the government will eventually be forced to admit this. In the meantime, billions of pounds are being wasted on gigantic systems projects that usually don’t work, and that place citizens’ privacy and safety at risk when they do.” Richard Clayton, a fellow-campaigner, says that personal information should be treated like plutonium pellets: “Kept in secure containers, handled as seldom as possible and escorted whenever it has to travel. Should it get out into the environment, it will be a danger for years to come. Putting it into one huge pile is really asking for trouble.”

Public paranoia about government databases may well be justified, but it sits oddly with the complacency, verging on carelessness, that people display when convenience is on offer. Ask the average traveller from a developed country whether he would like to be fingerprinted by an authoritarian regime and have the results stored indefinitely in its computer, and he will probably say no. But when such procedures save time, scruples go out of the window. Travellers standing in the lengthy visa and immigration queues at Dubai airport face a phalanx of bored and sullen officials who communicate by hand gestures and grunts, with nary a “please” or “thank you”. But passengers with an “e-card” fare much better. They go straight to the “e-gate”, where they swipe their card, press a finger on the glass panel and smile at the camera. All the traveller needs is his passport and a willingness to trust the country’s feudal rulers.

The hard lesson for governments is that citizens will adopt technology when it is both optional and beneficial to them, but resist it strenuously when it is compulsory, no matter how sensible it may seem. To take another example, if users of public transport in London were told that in future all their trips would be logged by the authorities, they would revolt. But offered lower fares if they use an Oyster card, issued by a branch of government called Transport for London, they have few objections. Nor do they seem to mind much that the same body photographs their car every time they visit central London on a working day to enforce the capital’s congestion charge. Oddly, people seem to mind even less about how much information the private sector holds about them. Supermarket loyalty cards record all their purchases, however revealing, and search engines note everything they have been looking for on the internet.
People who would strongly resist giving any personal information to the government are quite happy for Google to know that they have been searching for “hot Asian babes”. The result, says Microsoft’s Mr Cameron, is pernicious. He suggests rethinking the whole issue, starting from the principle that users may be identified only with their explicit consent. That sounds commonsensical, but many big government databases do things differently. Britain’s planned central records for the NHS, for example, will assume consent as it combines all the medical records held in local practice databases.

The second principle, says Mr Cameron, should be to keep down the risk of a breach by using as little information as possible to achieve the task in hand. This approach, which he calls “information minimalism”, rules out keeping information “just in case”. For example, if a government agency needs to check whether someone falls into a certain age group, it is far better to acquire and store this information temporarily as a “yes” or “no” than to record the actual date of birth permanently, which would be much more personal and therefore more damaging if leaked.

Third, identity systems must be able to check who is asking for the information, not just hand it over. How easy it is for the outside world to access such information should depend on whose identity it is. Public bodies, Mr Cameron suggests, should make themselves accessible to all comers. Private individuals, by contrast, should be protected so that they have to identify themselves only temporarily and by choice. Some existing technologies are not capable of making such distinctions. Examples include Bluetooth technology (in which gadgets such as mobile phones constantly broadcast their availability) and RFID (radio frequency identification) chips.
These tiny, remotely readable devices have already been incorporated into many countries’ passports, despite plentiful evidence that they can be remotely read, deciphered and even cloned with easily obtained equipment and software.

The final principle is a thorough understanding of the human factor. As Mr Cameron notes, “we have done a pretty good job of securing the channel between server and browser through the use of cryptography—a channel that might extend for thousands of miles. But we have failed adequately to protect the two- or three-foot channel between the browser’s display and the brain of the human who uses it. This immeasurably shorter channel is the one under attack.” When it comes to government data, a loosely guarded password can cause untold damage. Officialdom and the public alike have yet to take that on board.
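The “information minimalism” principle described above lends itself to a small illustration. The sketch below is hypothetical (the function and field names are invented, not drawn from any real government system): an age-group check returns only a yes/no answer, and only that answer is stored, so the date of birth itself never enters the record.

```python
from datetime import date

def meets_age_threshold(date_of_birth: date, min_age: int, on_date: date) -> bool:
    """Answer the age question with a single yes/no, so the caller
    never needs to retain the date of birth itself."""
    # Age in whole years, subtracting one if the birthday has not yet
    # occurred in the current year.
    age = on_date.year - date_of_birth.year - (
        (on_date.month, on_date.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= min_age

# The agency stores only the boolean answer, not the birth date:
# if this record leaks, the damage is limited to one yes/no fact.
stored_record = {"eligible": meets_age_threshold(date(1945, 3, 20), 60, date(2008, 2, 15))}
```

The design choice is that the sensitive datum crosses the boundary once, transiently, and the durable record holds only the minimal derived fact the task required.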
