Archive for October, 2012

Mobile payments – Designing for compliance

Posted on October 24th, 2012 by



Globally, we are seeing some exciting developments in peer-to-peer payments, remote mobile web commerce payments and near field communication (NFC) payments – where users hold their device in close proximity to a point-of-sale terminal. The growth of these new payment services will mean that organisations other than banks will have access to information held in bank accounts. In fact, there is likely to be a variety of new stakeholders across various jurisdictions involved in the retail payments industry, including mobile phone manufacturers, telecommunications providers, financial institutions, non-bank payment providers, mobile services infrastructure providers, customers and merchants. Inevitably, we will see telecoms providers “morph” into financial services providers and banks obtain telecoms licences.
There is a race on and the prize is customer data and the ability to use that data in a way that goes far beyond facilitating a payment transaction. Of course, having a greater amount of data about customers and using it in innovative ways is not necessarily a bad thing for the customer. However, the argument runs that there are significant barriers to entry in the form of European data protection laws and regulations for companies that seek to use US payment products more globally.
Given the myriad of potential stakeholders involved in providing a mobile payments solution, a key issue is to determine who will be acting as a data controller for the relevant personal data since it is the data controller that will generally be responsible for compliance with data protection laws. In some scenarios, for example, where the mobile payments solution simply involves an extension of the current web-based payment solutions offered by financial institutions, it will usually be the financial institutions that will be acting as data controllers. However, in other mobile payment models, the position may be different and in some situations, more than one party may be acting as a data controller.
Nevertheless, it is in the interests of all of the relevant stakeholders to ensure that the mobile payments solution as a whole achieves compliance with EU data protection laws. As in many other areas of technology development, designing for privacy from the outset is fundamental to achieving cost-effective compliance in the long term and the importance of the ‘privacy by design’ concept has been recognised by its inclusion in the proposed new EU Data Protection Regulation.
But what does ‘privacy by design’ really mean? Too often in the context of technological developments, the issue of security tends to be touted as the main data protection risk that arises. However, while security is obviously very important, it is only one part of the whole picture and there are a range of other data protection challenges that should not be overlooked at design stage. For example, ‘privacy by design’ would also demand that the relevant data processing systems are designed in a way that: (i) minimises the data collected (for example by limiting the ‘data fields’ to be completed); (ii) ensures that only users that have a ‘need-to-know’ can access detailed transaction information; and (iii) takes account of circumstances in which a banking customer’s consent may be required and, if appropriate, provides an interface with the relevant consent mechanism.
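To illustrate points (i) and (ii), here is a minimal sketch of what designing for data minimisation and need-to-know access might look like in code. All names (fields, roles, classes) are hypothetical, and a real payments system would of course involve far more than this; the point is simply that these design requirements can be expressed as enforceable rules rather than left to policy documents:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: collect only the fields strictly needed to process a
# payment (data minimisation) and gate detailed transaction data behind a
# 'need-to-know' role check (access restriction).

ALLOWED_FIELDS = {"payer_token", "payee_token", "amount", "currency", "timestamp"}

def minimise(raw: dict) -> dict:
    """Drop any submitted field that is not strictly required for the payment."""
    return {k: v for k, v in raw.items() if k in ALLOWED_FIELDS}

@dataclass
class TransactionStore:
    # Only these (hypothetical) roles have a need to know transaction detail.
    NEED_TO_KNOW = {"fraud_analyst", "payments_ops"}
    _records: list = field(default_factory=list)

    def add(self, raw: dict) -> None:
        # Minimisation is applied at the point of collection, by design.
        self._records.append(minimise(raw))

    def detail(self, role: str) -> list:
        if role not in self.NEED_TO_KNOW:
            raise PermissionError(f"role {role!r} has no need-to-know access")
        return list(self._records)
```

Extraneous data (a device identifier, say) submitted alongside a payment is simply never stored, and a role outside the need-to-know set cannot retrieve transaction detail at all.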
In general, all of the relevant data protection principles should be factored into the design stage of a mobile payments solution, where it is possible to do so. Obviously, any design implementation would also need to take account of other laws and regulations (for example, telecommunications regulations and the rules in relation to the use of e-money).
Mobile payments are, and should be, the future for customers and merchants alike, and, as with Big Data, the law should not stand in the way. But by the same token, stakeholders cannot write themselves a blank cheque as regards what they can do with customer data and should be designing for compliance from the outset.

Privacy’s greatest threat and how to overcome it

Posted on October 22nd, 2012 by



After some erroneous newspaper reports in 1897 that he had passed away, Mark Twain famously said that the reports of his death were greatly exaggerated.  The same might also be said of privacy.  Scott G. McNealy, former CEO of Sun Microsystems, reportedly once said: “You already have zero privacy. Get over it.”  However, if last week’s IAPP Privacy Academy in San Jose was anything to go by, privacy is very much alive and kicking.

It’s easy to understand why concerns about the death of privacy arise though.  Today’s data generation, processing and exploitation is simply vast – way beyond a level any of us could meaningfully hope to comprehend or, dare I suggest, control.  The real danger to privacy though is not the scale of data processing that goes on – that’s simply a reality of living in a modern day, technology-enabled, society; a Pandora’s box that, once opened, cannot now be closed.  Instead, the real danger to privacy is excessive and unrealistic regulation.

Better regulation drives better compliance

From many years of working in privacy, it’s been my experience that most businesses work hard to be compliant.  Naturally, there are outliers, but these few cases should not drive the regulation that determines how the majority conduct their business.  It’s also been my experience that compliance is most often achieved where the standards applied by legislators and regulators are accurate, proportionate and not excessive – the same standards they expect data controllers to apply when processing personal data.  In other words, legislation and regulation drive the best behaviour when they are achievable.

By contrast, excessive, disproportionate regulation that does not accurately reflect the way that technology works or recognise the societal benefits that data processing can deliver often brings about the opposite effect.  By making compliance impossible, or at least, disproportionately burdensome to achieve, businesses, unsurprisingly, often find themselves falling short of expected regulatory standards – in many cases, wholly unintentionally.

The recent “cookie law” is a good example of this: a law that, though well-intentioned, is effectively seen as regulating a technology (cookies) rather than a purpose (tracking), leading to widespread confusion about the standards that apply and – let’s be honest – non-compliance currently on an unprecedented scale throughout the EU.

Why the Regulation mustn’t make the same mistake

In its current form, the proposed General Data Protection Regulation also runs this risk.  The reform of Europe’s data protection laws is a golden, once-in-a-generation opportunity to re-visit how we do privacy and build a better, more robust framework that fosters new technologies and business innovation, while still protecting against unwarranted privacy intrusions and harm.

But instead of focussing on the “what”, the legislation focuses too much on the “how”: rather than looking to the outputs we should strive to achieve (namely, ensuring that ever-evolving technologies do not make unwarranted intrusions into our private lives) the draft legislation instead mandates excessive accountability standards that do not take proper account of context or actual likelihood of harm.

For example:

*  How, exactly, does an online business ensure that its processing of child data is predicated only on parental or guardian consent (Article 8)?  My prediction: many websites will build meaningless terms into their website privacy policies that children must not use the site – delivering no “real” protection in practice.

*  Why is it necessary for an organisation transferring data internationally to inform individuals “on the level of protection afforded by that third country … by reference to an adequacy decision of the Commission” (Article 14)? Do data subjects really care where their data goes and whether the Commission has made an adequacy decision – or do they just want assurance that their data will be used for legitimate purposes and at all times kept safe and secure, wherever it is?  How does this work in a technology environment that is increasingly shifting to the cloud?

*  Why should controllers be required to provide data portability to data subjects in an “electronic and structured format which is commonly used” (Article 18)?  Surely confidentiality and data security is best achieved through the use of proprietary systems whose technology is not “commonly used”, therefore less understood and vulnerable to external attack?  Are we legislating for a future of security weakness?

*  Why should data controllers and processors maintain such extensive levels of data processing documentation (Article 28)?  How will smaller businesses cope with this burden?  Yes, an exemption applies for businesses employing fewer than 250 persons but only if their data processing is “ancillary” to the main business activities – immediately ruling out most technology start-ups.

*  And how can we still, in this day and age, operate on a misguided assumption that model contracts provide a sound basis for protecting international exports of data (Article 42)?  Wouldn’t it make more sense to require controllers to make their own adequacy assessment and to hold them to account if they fall short of the mark?

Make your voice heard!

For the past 17 years, the European Union has been a standard-bearer in operating an effective legal and regulatory framework for privacy.  That framework is now showing its age and, if not reformed in a way that understands, respects and addresses the range of different (and competing) stakeholder interests, risks being ruinous to the privacy advancements Europe has achieved to date.

The good news is that reforming an entire European legal framework doesn’t happen overnight, and the process through to approval and adoption of the General Data Protection Regulation is a long one.  While formal consultation periods are now closed, there remain many opportunities to get involved in reform discussions through legislative and regulatory liaisons at both a European and national level.

To make their voices heard, businesses throughout the data processing spectrum must seize this opportunity to get involved.  Only through informed dialogue with stakeholders can Europe hope to output technology-neutral, proportionate legislation that delivers meaningful data protection in practice.  If it does this, then Europe stands the best chance of remaining a standard-bearer for privacy for the next 17 years too.

Brussels calling: news on the Regulation

Posted on October 12th, 2012 by



There was a definite data protection buzz in Brussels this week as the European Parliament hosted a two-day Inter-parliamentary Committee Meeting to discuss the new EU Data Protection framework, proposed by the European Commission in January.

Representatives of global technology organisations, consumer protection groups, members of national parliaments and members of the EU institutions were prominent among the innumerable stakeholders there, each eager to present their views and contribute to the debate.

The conference was organised by the Committee on Civil Liberties, Justice and Home Affairs (LIBE), the body appointed by the European Parliament to assist with the data protection reforms, headed up by rapporteurs Jan Albrecht and Dimitrios Droutsas.

Since the Lisbon Treaty came into force in 2009, the European Parliament and the Council of the European Union are jointly responsible for negotiating and agreeing upon legislative proposals put forward by the Commission. It follows then that this conference provided a fundamental platform upon which stakeholders could share their opinions and concerns, and an important means by which legislators could gain insight into the practical, legal and economic realities behind the proposals. These contributions will feed directly into the legislative process, and LIBE will no doubt consider them when preparing its draft opinion on the reforms which is expected later this year.

So what then was the outcome of the conference? There are certainly many questions that remain unanswered and it was pointed out by Simon Davies from the London School of Economics that there is almost no agreement among stakeholders on any single point. A huge amount of re-thinking and re-drafting will no doubt ensue. That said, what was abundantly clear was an overwhelming support in principle for the reforms and, despite there being some way to go in terms of getting the legislation right, a sense that the key people responsible for drafting it are listening to what people have to say.

For instance, Viviane Reding (the Vice President of the Commission) made it clear that the Commission would consider reducing the vast number of delegated acts. This will no doubt have come as welcome news to many. Delegated and implementing acts enable the Commission to supplement and amend certain non-essential elements of the legislation once it has come into force. In other words, they achieve flexibility and enable clauses to be drafted in a technologically neutral manner, making way for new technological innovations that will be prevalent in the years to come. The counter argument though is that delegated acts give the Commission excessive (and in many cases unnecessary) powers, which would constitute a bar to strengthening democracy and promoting transparency across the EU.

Francoise Le Bail (the Commission’s Director General for Justice), whilst defending the number of delegated acts currently drafted, recognised there were a lot of question marks and problems outstanding but stressed that stakeholder contributions were valued by the Commission which is determined to take into account the proposals and comments made. There is still room then for voices to be heard.

The debate on delegated acts was one thing, and there are no prizes for guessing some of the other controversial elements that repeatedly cropped up. The “right to be forgotten”, “one-stop-shop”, “consent”, “profiling” and “data protection by design” were all key concepts which unsurprisingly featured in the debate and, whether for or against them, the general view was clear. The drafting needs to be tightened up, and greater clarity is needed in many cases so as to be sure of the exact rights and obligations of everyone concerned.

The proposed legislation does after all affect a huge number of people; not just citizens, but consumers, SMEs, global organisations and public authorities are all affected, and this was also a key feature of the debate. On the one hand, we were reminded that data protection is a fundamental right of each citizen in the EU and measures must be taken to protect that right; on the other we were reminded that data, which flows across the digital environment in ever-increasing volumes, is a hugely important economic asset, not to mention a vital component in terms of law enforcement.

So a balance needs to be struck. There are clearly business incentives for building trust in the digital environment, and similarly there is an undisputed recognition of the fact that we need to bolster the rights of individuals. It seems that all stakeholders are recognising the need to be flexible in their approach and response to these reforms, and are working hard to achieve a robust and coherent legal system that will, over the coming years, facilitate innovation whilst providing people with protection and control of their data, enabling the EU to continue to be a major player in the digital economy.

LIBE is expected to present its draft report on the proposed legislation by the end of this year, after which Member States will be invited to table their amendments. LIBE will then meet to discuss those amendments and it is expected that an orientation vote (where the committee votes and concludes upon its initial position in light of the negotiations) will be held in April 2013.

Weather forecast for cloud computing in Europe is “overall good”

Posted on October 8th, 2012 by



The end of September has seen the UK Information Commissioner’s Office release its guidance on cloud computing, shortly followed by the European Commission’s announcement on a new strategy for “Unleashing the potential of cloud computing in Europe”.

ICO

The ICO’s new guidance starts with a helpful ‘setting the scene’ introduction for those new to the topic of cloud computing, going through definitions and the different deployment and service models before moving on to an analysis of the data protection obligations.

According to the ICO, since it is the cloud customer that determines the purposes for which and the manner in which any personal data may be processed, the cloud customer is most likely to be the data controller. The guidance does contain a caveat that each case of outsourcing to the cloud, and the controller/processor roles of each party, will need to be determined separately. The end of the document has a useful checklist of considerations.

The guidance sets out a logical approach that should be followed by potential customers of cloud computing services and which comprises the following steps:

  1. Data selection – selecting which data to move to the cloud and creating a record of which categories of data you are planning to move.
  2. Risk assessment – carrying out privacy impact assessments is recommended for large and complex personal data processing operations in the cloud.
  3. The type of service and provider selection – taking into account the maturity of the service offered and whether it targets a specific market.
  4. Monitoring performance – an ongoing obligation throughout the time the outsourcing to the cloud takes place.
  5. Informing cloud users – this reflects the transparency principle; cloud customers who are data controllers (who make services that run on the cloud available to individuals) will need to consider informing the individuals/cloud end users of the service about the processing in the cloud.
  6. Written contract – it is a legal requirement under the Data Protection Act to have a written contract in place between a data controller and a data processor.


With regard to selecting a cloud provider the ICO points potential cloud users to the need to look at the security offered, how the data will be protected and the access controls that have been put in place. Helpfully for data controllers, the ICO recognises that it is not always possible to carry out physical audits of the cloud provider but highlights the importance of ensuring that appropriate technical and organisational security measures are maintained at all times.

On the data transfers front the ICO states that cloud customers should ask potential cloud providers for a list of countries where data is likely to be processed and for information relating to the safeguards in place there. It is unfortunate that in this aspect the ICO follows the recent Article 29 Working Party Opinion on Cloud Computing.

EU

Turning to the European Commission’s announcement of a new strategy for “Unleashing the potential of cloud computing in Europe”, the main aim of the strategy is to support the take-up of cloud computing services by creating new homogenised technical standards on interoperability, data portability and reversibility by 2013, as well as certification schemes for cloud providers. A key area on which, according to the strategy document, the Commission will concentrate its work is safe and fair contract terms and conditions for cloud computing services. This will involve developing model terms for service level agreements. The strategy stresses the importance of the ongoing work on the proposed Data Protection Regulation and the expectation that this work should be completed in 2013.

The new strategy when coupled with the recent Article 29 Working Party Opinion shows clear signs that cloud computing is fast gaining prominence on the European Commission’s Digital Agenda. At this stage it is important to track the developments in this area and for industry members to continue providing their feedback to proposals. The ICO’s guidance proves that a pragmatic approach to cloud computing is achievable without minimising the protection afforded to individuals’ personal data.

In short, the key takeaways from these developments are that in addition to contributing to the development of model contract terms, customers of cloud computing services must look at the selection process and the contractual documentation as their top priorities when approaching a cloud service relationship.

Consent revisited

Posted on October 4th, 2012 by



If there was a prize for the most controversial provision in the draft EU Data Protection Regulation, it would probably be won by the article dealing with consent.  From Member States’ governments to the European Parliament’s committees, everyone seems to have a very strong opinion about that article.  A number of European governments have already used their representation on the Council of the EU to criticise the legal uncertainty created by the draft provision.  The level of disagreement with the Commission’s proposal is perhaps not surprising given the elevated and rather emotional role that consent has in privacy matters and the potentially catastrophic consequences of setting the bar for valid consent either too low or too high.  But the point is that, once again, the issue of individuals’ consent is proving to be an uneasy one, to say the least.

This controversy is not driven by a purely academic interest about what may or may not happen in a few years’ time when the Regulation is adopted.  Consent is a legal basis for collecting and exploiting personal information today, and in some cases, there is little or no option than to get people’s permission to use their data.  Without a doubt, the most vibrant and present legal dilemma regarding what qualifies as consent is taking place in the context of cookies and anything else that amounts to storing or accessing information stored on someone’s device.  If it wasn’t for the innate human difficulty in establishing what kind of conduct may amount to consent, it would be odd to think that after more than three years of heated debate about the cookie consent rule, we are still nowhere near finding a solution that everyone is happy with.

Some attempts to find a middle ground between a rock-solid, demonstrable opt-in consent and the mere assumption that anything goes when people surf the net have been made in recent times, but many of the approaches adopted by European websites fall short of the necessary standards.  So how can consent be obtained on the Internet other than by ticking a box?  Is the concept of implied consent – so commonly used and relied upon in our ordinary comings and goings in the offline world – a workable way forward online?  There is no reason why it shouldn’t be, but to achieve a reasonable degree of legal certainty, some minimum conditions ought to be met; otherwise, we will be back to the assumption that unless someone makes a big deal of it, anything goes when you go online.

One could probably write a long academic article about this, but at a practical level it is possible to distil the conditions for valid implied consent into four ‘must have’ elements:

*     Deploying a visible and prominent cookie notice – For someone to be in a position to have a say on anything, they really need to know what’s going on.  So in the context of websites, that means that visitors must be presented with some kind of sufficiently clear and ‘in your face’ notice, so that it is obvious to the average user what is happening.  That way, a visitor’s indication of wishes is impliedly given when they see the cookie notice, understand its meaning and rely on the functionality available to make their cookie choices. 

*     Identifying the specific conduct that amounts to consent – Whether it is closing a box, opening a page, clicking on a link or continuing to use the site, the notice must spell out what specific action or conduct undertaken by a visitor will amount to consent to cookies being set or accessed.  Otherwise, the website operator will never truly know whether the visitor accepts the use of cookies on their device.  At the very least, if an assumption is being made that the visitor is happy to receive cookies, say so!

*     Providing a mechanism for control and decision making – The flipside of agreeing to something is having the ability to object to it.  Otherwise, there is no real choice.  With cookies, a ‘take it or leave it’ approach is still a choice, but not a genuine one.  Therefore, as part of the process of obtaining consent, website visitors should be able to make their choices freely and refuse the use of cookies (other than those that fall under the strictly necessary exemption) at any time and through simple means, even if it means that the site’s functionality is limited for the user as a result.  In an ideal world, these controls need to be sufficiently granular to allow visitors to accept the types of cookies they are happy to receive and to refuse those they are not.

*     Spelling out what cookies are for – Finally, clear and comprehensive information about the use of cookies throughout the site must be continuously and readily available to satisfy the transparency requirements under European data protection law.  The law is not prescriptive about the way that this information should be provided, but it should be sufficiently full and intelligible to allow individuals to clearly understand the potential consequences of allowing cookies on their devices.
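The four ‘must have’ elements above can be sketched as a simple consent-state model. This is an illustrative sketch only, with hypothetical category and action names; a real implementation would sit behind a website’s banner and cookie-setting code, but the underlying logic is this small:

```python
# Hypothetical sketch of implied cookie consent: a prominent notice is shown
# first, a named action is recorded as the specific conduct amounting to
# consent, and choices are granular and revocable per cookie category.

STRICTLY_NECESSARY = "strictly_necessary"   # exempt from the consent requirement

class CookieConsent:
    CATEGORIES = {STRICTLY_NECESSARY, "analytics", "advertising"}

    def __init__(self):
        self.notice_shown = False
        self.consent_action = None   # the specific conduct relied on, e.g. "continue_browsing"
        self.refused = set()         # categories the visitor has refused

    def show_notice(self) -> None:
        """Element 1: the visitor must see the notice before consent can be implied."""
        self.notice_shown = True

    def record_consent(self, action: str) -> None:
        """Element 2: name the exact action treated as consent."""
        if not self.notice_shown:
            raise RuntimeError("cannot imply consent before the notice is shown")
        self.consent_action = action

    def refuse(self, category: str) -> None:
        """Element 3: granular, revocable control at any time."""
        self.refused.add(category)

    def may_set(self, category: str) -> bool:
        if category == STRICTLY_NECESSARY:
            return True              # the exemption applies regardless of consent
        return self.consent_action is not None and category not in self.refused
```

Note that no non-essential cookie can be set until after the notice has been displayed and a defined action recorded, which is precisely the ordering the four elements demand.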

The debate about whether consent should be a requirement to collect and use people’s information will no doubt continue and intensify as that information becomes more and more valuable.  Whether we will ever have a definitive answer is yet to be seen but in the meantime, let’s try to look at technology as an enabler for individual choice.  We may be surprised by what is possible.


This article was first published in Data Protection Law & Policy in September 2012.

What to do when you can’t delete data?

Posted on October 2nd, 2012 by



How many lawyers have written terms into data processing contracts along the following lines:  “Upon termination or expiry of this Agreement, the data processor shall delete any and all copies of the Personal Data in its possession or control”?

It’s a classic example of a legal clause that’s ever so easy to draft but, in this day and age, almost impossible to implement in practice.  In most data processing ecosystems, the reality is that there seldom exists just a single copy of our data; instead, our data is distributed, backed-up, and archived across multiple systems, drives and tapes, and often across different geographic locations.  Far from being a bad thing, data distribution, archival and back-up better preserves the availability and integrity of our records.  But the quid pro quo of greater data resilience is that commitments to comprehensively wipe every last trace of our data are simply unrealistic and unachievable.

Nevertheless, once data has fulfilled its purpose, deletion is seemingly what the law requires.  The fifth principle of the Data Protection Act 1998 (implementing Article 6(e) of Directive 95/46/EC) says that: “Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes.”  So how to reconcile this black and white approach to data deletion with the reality of modern day data processing systems?

Thankfully, the ICO has the answer, which it provides in a recently-published guidance note on “Deleting personal data” (available here).  The ICO starts off by acknowledging the difficulties outlined above, commenting that “In the days of paper records it was relatively easy to say whether information had been deleted or not, for example through incineration. The situation can be less certain with electronic storage, where information that has been ‘deleted’ may still exist, in some form or another, within an organisation’s systems.”

The sensible answer it arrives at is to say that, if data cannot be deleted for technical or other reasons, then it should instead be put ‘beyond use’.   Putting data ‘beyond use’ has four components, namely:

  1. ensuring that the organisation will not and cannot use the personal data to inform any decision in respect of any individual or in a manner that affects the underlying individuals in any way;
  2. not giving any other organisation access to the personal data;
  3. at all times protecting the personal data with appropriate technical and organisational security; and
  4. committing to delete the personal data if or when this becomes possible.

Broadly speaking, you can condense the four components above into: “Delete it if you can and, if you can’t, make sure it’s stored securely and don’t let anyone use it”. Which is, of course, entirely sensible advice.
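The four safeguards lend themselves to being enforced in software rather than left to policy alone. The sketch below is a hypothetical illustration (all names invented, and real systems would pair this with encryption and access controls): a record put ‘beyond use’ refuses every read or export, and honours the commitment to delete if and when deletion becomes technically possible:

```python
# Hypothetical sketch of putting a record 'beyond use' per the ICO's four
# components: no use of the data to inform decisions, no access for other
# organisations, continued protection at rest, and deletion when possible.

class BeyondUseRecord:
    def __init__(self, data: bytes):
        # In practice this would be encrypted and access-controlled at rest
        # (safeguard 3); held here in memory purely for illustration.
        self._data = data
        self.deleted = False

    def read(self) -> bytes:
        # Safeguard 1: the data may not inform any decision about anyone.
        raise PermissionError("record is beyond use: no access for any decision")

    def export(self, recipient: str) -> bytes:
        # Safeguard 2: no other organisation may be given access.
        raise PermissionError("record is beyond use: no third-party access")

    def delete_if_possible(self, deletion_supported: bool) -> bool:
        # Safeguard 4: delete as soon as the underlying system allows it.
        if deletion_supported and not self.deleted:
            self._data = None
            self.deleted = True
        return self.deleted
```

The record behaves, from the point of view of the rest of the organisation, exactly as if it had already been deleted, which is the outcome the ICO's subject-access concession (below in the guidance) also reflects.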

It does raise one interesting problem though:  what to do when the individual data subject requests access to his or her data that has been put beyond use?  Here, the ICO again takes a business-friendly view saying simply that “We will not require data controllers to grant individuals subject access to the personal data provided that all four safeguards above are in place.”  In other words, the business does not need to instigate extensive (and expensive) searches of records that have been put beyond use just because an individual requests access to his or her data – for the purposes of subject access, this inert data is treated as if it had been deleted.

But the ICO does issue a warning: “It is bad practice to give a user the impression that a deletion is absolute, when in fact it is not.” So the message to take away is this: make sure you do not commit yourself to data deletion standards that you know, in all likelihood, you can’t and won’t meet.   And, by the same token, don’t let your lawyers commit you to these either!