Archive for the ‘Right to be forgotten’ Category

2013 a big year for privacy? You ain’t seen nothing yet!

Posted on December 31st, 2013



If you thought that 2013 was a big year for privacy, then prepare yourself: it was only the beginning.  Many of the privacy stories whose winding narratives began in 2013 will continue to take unexpected twists and turns throughout 2014, with several poised to reach dramatic conclusions – or otherwise spawn spin-offs and sequels.

Here are just a few of the stories likely to dominate the privacy headlines in 2014:

1.  EU data protection reform:  The Commission’s draft General Data Protection Regulation arrived with a bang in January 2012, proposing fines of up to 2% of global turnover for data protection breaches, a 24-hour data breach notification regime, and a controversial new right for individuals to have their data “forgotten” from the Internet, among many other things.  Heated debate about the pros and cons of these reforms continued into 2013, with the European Parliament’s LIBE Committee only voting on and publishing its position on the draft Regulation in October 2013 (missing two earlier deadlines).  All eyes then turned to the Council, expecting it to put forward its position on the draft Regulation sometime in December, only to discover that it had become bogged down over the “one stop shop” principle and made little real progress at all.  With the original goal being to adopt the new Regulation before the European Parliamentary elections in May 2014, a real question mark now hangs over whether Europe will meet this deadline – and what will happen if it doesn’t.

2.  NSA surveillance:  The biggest privacy story – if not the biggest news story – of 2013 concerned the leaks of classified documents from the US National Security Agency by its contractor, Edward Snowden.  The leaks revealed that the NSA had been collecting Internet users’ metadata from the servers of leading technology companies and from the cables that carry our Internet communications around the world. This story has had a profound effect in terms of raising individuals’ privacy awareness worldwide, impacting global political and trade relationships, and adding impetus to the European Union’s regulatory reform agenda.  With the Guardian newspaper recently declaring that it has so far revealed only about 1% of the materials Edward Snowden has disclosed to it – and British television broadcasting an “alternative” Christmas message from Edward Snowden on “Why privacy matters” – it’s safe to say that this is a story that will continue to headline throughout 2014, prompting the global privacy community to contemplate perhaps the most fundamental privacy question of all: to what extent, if at all, will we trade personal privacy in the interests of global security?

3.  Safe harbor: Regulators across several European territories have, for many years now, been grumbling about the “adequacy” of the EU/US safe harbor regime as a basis for exporting data from the European Union to the US.  The Snowden revelations have further fuelled this fire, ultimately leading to the European Commission publishing a set of 13 recommendations for restoring trust in safe harbor.  The Commission has set the US Department of Commerce an ambitious deadline of summer 2014 to address these recommendations – and raised the “nuclear” prospect that it may even suspend safe harbor if this does not happen.  With some 3,000+ US companies currently relying on safe harbor for their EU data exports, many US-led corporations will be watching this story very closely – and would be well-advised to begin contingency planning now…

4.  New technologies:  Ever-evolving technologies will continue to challenge traditional notions of data privacy throughout 2014.  In the past year alone, Big Data has bumped heads with the concepts of purpose limitation and data minimisation, the Internet of Things has highlighted the shortcomings of user consent in an everything-connected world, and the exponential growth of cloud technologies continues to demonstrate the absurdity of extra-EEA data export restrictions and their attendant solutions (Do model clauses really provide adequate protection? Tsch.) Quite aside from the issues presented by technologies like Google Glass and iPhone fingerprint recognition, who can say what other new devices, platforms and services we’ll see in 2014 – and how these will challenge the global privacy community to get creative and adapt accordingly?

5.  Global interoperability:  As at year end, there are close to 100 countries with data protection laws on their statute books, with new privacy laws either coming into effect or getting adopted in countries like Mexico, Australia and South Africa throughout 2013.  And there are still many more countries with data privacy bills under discussion or with new laws coming into effect throughout 2014 (Singapore being one example).  Legislators around the world are waking up to the need to adopt new statutory frameworks (or to reform existing ones) to respect individuals’ privacy – not only in the interests of protecting their citizens but also, with the digital economy becoming ever more important, in order not to lose out to businesses looking for ‘safe’ countries to house their data processing operations.  All these new laws will continue to raise challenges in terms of global interoperability – how does an organization spread across multiple international territories comply with its manifold, and often varied, legal obligations while at the same time adopting globally consistent data protection policies, managed with limited internal resources?

6.  Coordinated enforcement:  In 2013, we’ve seen the first real example of cross-border privacy enforcement, with six data protection authorities (led by the CNIL) taking coordinated enforcement action against Google over the launch of its consolidated privacy policy across its various service lines.  With the limitations of national deterrents for data privacy breaches that exist for regulators in many territories (some cannot impose fines, while others can impose only limited fines) and continuing discussion about the need for “one stop shop” enforcement under the proposed General Data Protection Regulation, it seems likely that we’ll see more cooperation and coordinated enforcement by data protection authorities in 2014 and beyond.

2013 was undoubtedly an exciting year for data privacy, but 2014 promises so much more.  It won’t be enough for the privacy community just to know the law – we must each of us become privacy strategists if we are to do justice to the business and consumer stakeholders we represent.  We have exciting times ahead.

Happy New Year everyone!

The Internet and the Great Data Deletion Debate

Posted on August 15th, 2013



Can your data, once uploaded publicly onto the Web, ever realistically be forgotten?  This was the debate I was having with a friend from the IAPP last night.  Much has been said about the EU’s proposals for a ‘right to be forgotten’ but, rather than arguing points of law, we were simply debating whether it is even possible to purge all copies of an individual’s data from the Web.

The answer, I think, is both yes and no: yes, it’s technically possible, and no, it’s very unlikely ever to happen.  Here’s why:

1. To purge all copies of an individual’s data from the Web, you’d need either (a) to know where all copies of those data exist on the Web, or (b) the data to have some kind of built-in ‘self-destruct’ mechanism so that it purges itself after a set period of time.

2.  Solution (a) creates as many privacy issues as it solves.  You’d need either to create some kind of massive database tracking where all copies of data go on the Web, or to ensure that each copy of the data was, somehow, ‘linked’ directly or indirectly to all other copies.  Even assuming it were technically feasible, it would have a chilling effect on freedom of speech – consider how likely a whistleblower would be to post content knowing that every copy of that content could be traced back to its original source.  In fact, how would anyone feel about posting content to the Internet knowing that every single subsequent copy could easily be traced back to their original post and, ultimately, back to them?

3.  That leaves solution (b).  It is wholly possible to create files with built-in self-destruct mechanisms, but they would no longer be pure ‘data’ files.  Instead, they would be executable files – i.e. files that can be run as software on the systems on which they’re hosted.  But allowing executable data files to be imported and run on Web-connected IT systems creates huge security exposure – the potential for exploitation by viruses and malicious software would be enormous.  The other possibility would be for the data file to contain a separate data field instructing the system on which it is hosted when to delete it – much like a cookie has an expiry date.  That would be fine for proprietary data formats on closed IT systems, but is unlikely to catch on for existing, well-established and standardised data formats like .jpgs, .mpgs etc. across the global Web.  So the prospects for solution (b) catching on also appear slim.
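
To make the expiry-field idea concrete, here is a minimal sketch of how a host system might honour an embedded ‘delete-after’ field, much as a browser honours a cookie’s expiry date. It is entirely hypothetical – the sidecar file name, the "delete_after" field and the purge job are all inventions for illustration, not any existing standard:

import json
import os
from datetime import datetime, timezone

# Hypothetical scheme: each hosted content file sits alongside a plain JSON
# 'sidecar' carrying a "delete_after" timestamp (ISO 8601 with a UTC offset).
# Nothing in the data executes; the host system reads the field and acts.

def load_expiry(metadata_path):
    """Return the declared delete-after datetime for a piece of content."""
    with open(metadata_path) as f:
        return datetime.fromisoformat(json.load(f)["delete_after"])

def purge_expired(content_dir):
    """Delete any content file whose sidecar says its time is up."""
    now = datetime.now(timezone.utc)
    for name in os.listdir(content_dir):
        if not name.endswith(".meta.json"):
            continue
        meta_path = os.path.join(content_dir, name)
        data_path = meta_path[: -len(".meta.json")]
        if load_expiry(meta_path) < now and os.path.exists(data_path):
            os.remove(data_path)   # the host, not the file, does the deleting
            os.remove(meta_path)

if __name__ == "__main__":
    purge_expired("/var/hosted-content")   # e.g. run daily from cron

The catch is exactly the one described above: every host on the open Web would have to agree to ship the sidecar with every copy and to run the purge job faithfully, which is why such a scheme is only ever likely to work on closed, proprietary systems.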

What are the consequences of this?  If we can’t purge copies of an individual’s data spread across the Internet, where does that leave us?  Likely the only realistic solution is to control the propagation of the data at source in the first place.  Achieving that requires a combination of:

(a)  Awareness and education – informing individuals through privacy statements and contextual notices how their data may be shared, and educating them not to upload content they (or others) wouldn’t want to share;

(b)  Product design – utilising privacy impact assessments and privacy by design methodologies to assess product / service intrusiveness at the outset and then designing systems that don’t allow illegitimate data propagation; and

(c)  Regulation and sanctions – we need proportionate regulation backed by appropriate sanctions to incentivise realistic protections and discourage illegitimate data trading.  

No one doubts that privacy on the Internet is a challenge, and nowhere does it become more challenging than with the speedy and uncontrolled copying of data.   But let’s not focus on how we stop data once it’s ‘out there’ – however hard we try, that’s likely to remain an unrealistic goal.  Let’s focus instead on source-based controls – this is achievable and, ultimately, will best protect individuals and their data.

ECJ Advocate General: Google is NOT a controller of personal data on other sites

Posted on June 25th, 2013



We now know the Advocate General’s Opinion in the most eagerly followed data protection case in the history of the European Court of Justice (ECJ). After the prolific enforcement actions of the Spanish data protection authority to stop Google showing unwanted personal data in search results, the resulting court battles were escalated all the way to the ECJ. Whilst the final decision is still a few months away, the influential Opinion of the Advocate General (AG) is a clear indication of where things are going.

The ultimate question is whether Google, in its capacity as a search engine provider, is legally required to honour individuals’ requests to block personal data from appearing in search results. For that to be the case, the court will have to answer a three-fold legal test in the affirmative:

1. Does EU law apply to Google? The AG’s Opinion is YES if the search engine provider has an establishment in a Member State for the purpose of promoting and selling advertising space on the search engine, as that establishment acts as the bridge between the search service and the revenue generated by advertising.

Unfortunately the AG does not deal with the question of whether Google Inc. uses equipment in Spain, so we don’t know whether an Internet company with no physical presence in the EU will be caught by EU law.

2. Does a search engine process personal data? The AG’s answer here is also YES, because notions of ‘personal data’ and ‘processing’ are sufficiently wide to cover the activities involved in retrieving information sought by users.

3. Is Google a controller of that data? Crucially, the AG’s answer is NO, because a search engine is not aware of the existence of a certain defined category of information amounting to personal data. Therefore, Google is not in a position to determine the uses made of that data.

So the conclusion, according to the AG, is that a data protection authority cannot compel Google to stop revealing personal data as part of search results.

In addition, the AG goes on to say that even if the ECJ were to find that internet search engine service providers were responsible as controllers for personal data appearing in search results, an individual would still not have a general ‘right to be forgotten’, as this is not contemplated in the current Directive.

Position of Spain on the General Data Protection Regulation: flexibility, common sense and self-regulation

Posted on March 7th, 2013



As expectations and concerns rise whilst we wait for the final position of the LIBE committee and the European Parliament on the General Data Protection Regulation (the “Regulation”), the report issued by the Spanish Ministry of Justice on the Regulation (the “Report”) and the recent statements of the Spanish Minister of Justice are music to our ears.

A few weeks ago the Spanish Minister of Justice expressed concern that SMEs could be ‘suffocated’ by the new data protection framework. This concern seems to have inspired some of the amendments suggested in the Report which are designed to make the Regulation more flexible. These include substantive changes to reduce the administrative burdens for organisations with a DPO or for those that have adhered to a certification scheme, and the calculation of fines on profits rather than turnover.

Spain favours a Regulation that relies on self-regulation and accountability, clearly steering away from a restrictive ‘one size fits all’ approach which establishes an onerous (and expensive to comply with) framework. The underlying objective of these proposals seems to be the protection of the SMEs at the core of the Spanish economy. A summary of the Spanish position is provided below:

- Regulation v Directive: there is agreement that a Regulation is the best instrument to standardise data protection within the EU. This is despite the fact that this will cause complications under Spanish Constitutional law.

- Data protection principles: the Report favours the language of the Data Protection Directive (which uses the expression “adequate, relevant and not excessive”) as it allows more flexibility than the language of the Regulation which refers to personal data being “limited to the minimum necessary”. In updating personal data, the Report suggests that this should only be required “whenever necessary” and depending upon its expected use as opposed to the general obligation currently set out by the Regulation.

- Information: the requirement to inform individuals about the period during which personal data will be kept is considered excessive and very difficult to comply with. The Report suggests that this should only be required “whenever it is possible”.

- Consent: the requirement of express consent is seen as too onerous in practice and “properly informed consent” is favoured, the focus being on whether individuals understand the meaning of their actions. The adoption of sector by sector solutions in this context is not ruled out.

- Right to be forgotten: this right is considered paramount but the point is made that a balance has to be found between “theoretical technological possibilities” and “real limitations”. Making an organisation solely responsible for the erasure of personal data which has been disseminated to third parties is regarded as excessive.

- Security incidents: various amendments to the articles that regulate breach notifications are suggested to introduce less stringent requirements into the proposed regime. The suggested amendments remove the duty to notify the controller within 24 hours and also limit the obligation to notify to serious breaches only. Notifications to data subjects are also limited to those that would not have a negative impact on the investigations.

- DPOs: it is proposed that the appointment of DPOs should not be compulsory but should be encouraged by incentives such as the suppression of certain administrative burdens (as referred to below). Organisations without the resources to appoint a DPO may also be encouraged to adopt a “flexible and rigorous” certification policy or scheme. Such certifications would be by sector, revocable and renewable.

- Documentation, impact assessments and prior authorisation: the suggested amendments propose a solution whereby organisations which hold a valid certificate or which have appointed a DPO would not have to maintain documentation, carry out PIAs or request authorisation from data protection authorities as provided for by Articles 28.2, 33 and 34 of the Regulation respectively.

- International transfers: Spain favours the current system but suggests that this could be made more flexible by only requiring the authorisation of the data protection authority for contractual clauses (which have not been adopted by the Commission or an authority) when the organisation does not have a DPO or a certificate.

- One-stop-shop: this concept is endorsed in general but the Report proposes that where a corporation is established in more than one Member State, the DPA established in the country of residence of an individual complainant should have jurisdiction to deal with the matter. The consistency mechanism would be used to ensure a coherent decision where there were several similar complaints in different countries.

- Sanctions and alternatives: Spain considers that the current system could be improved by providing less stringent alternatives to the imposition of fines. Furthermore, it is proposed that the way in which sanctions are calculated is reviewed on the basis that annual turnover does not equal benefits obtained. This is to avoid the imposition of disproportionate sanctions.

- Technological neutrality: technological neutrality is supported although the Report expresses concerns that such neutrality does not provide for adequate solutions for particular challenges, such as those presented by cloud computing or the transfer of personal data over the Internet.

- Cloud computing: the Report suggests that the Regulation should take this “new reality” into account and proposes the adoption of some measures, for example those aimed at (1) finding a balance between the roles of controllers and processors in order to avoid cloud service providers becoming solely responsible for the processing of personal data; and (2) simplifying the rules on international transfers of personal data, for example by extending binding corporate rules to the network of sub-processors.

European Parliament’s take on the Regulation: Stricter, thicker and tougher

Posted on January 9th, 2013




If anyone thought that the European Commission’s draft Data Protection Regulation was prescriptive and ambitious, then prepare yourselves for the European Parliament’s approach. The much-awaited draft report by the LIBE Committee with its revised proposal (as prepared by its rapporteur Jan-Philipp Albrecht) has now been made available, and what was already a very complex piece of draft legislation has become by far the strictest, most wide-ranging and potentially most difficult to navigate data protection law ever to be proposed.

This is by no means the end of the legislative process, but here are some of the highlights of the European Parliament’s proposal currently on the table:

*     The territorial scope of application to non EU-based controllers has been expanded, in order to catch those collecting data of EU residents with the aim of (a) offering goods or services (even if they are free) or (b) monitoring those individuals (not just their behaviour).

*     The concept of ‘personal data’ has also been expanded to cover information relating to someone who can be singled out (not just identified).

*     The Parliament has chosen to give an even bigger role to ‘consent’ (which must still be explicit), since this is regarded as the best way for individuals to control the uses made of their data. In turn, relying on the so-called ‘legitimate interests’ ground to process personal data has become much more onerous, as controllers must then inform individuals about such specific processing and the reasons why those legitimate interests override the interests or fundamental rights and freedoms of the individual.

*     Individuals’ rights have been massively strengthened across the board. For example, the right of access has been expanded by adding to it a ‘right to data portability’ and the controversial ‘right to be forgotten’ potentially goes even further than originally drafted, whilst profiling activities are severely restricted.

*     All of the so-called ‘accountability’ measures imposed on data controllers are either maintained or reinforced. For example, the obligation to appoint a data protection officer will kick in when personal data relating to 500 or more individuals is processed per year, and new principles such as data protection by design and by default are now set to apply to data processors as well.

*     The ‘one stop shop’ concept that made a single authority competent in respect of a controller operating across Member States has been considerably diluted, as the lead authority is now restricted to just acting as a single contact point.

*     Many of the areas that had been left for the Commission to deal with via ‘delegated acts’ are now either specifically covered by the Regulation itself (hence becoming more detailed and prescriptive) or left for the proposed European Data Protection Board to specify, therefore indirectly giving a legislative power to the national data protection authorities.

*     An area of surprising dogmatism is international data transfers, where the Parliament has added further conditions to the criteria for adequacy findings, placed a time limit of 2 years on previously granted adequacy decisions or authorisations for specific transfers (it’s not clear what happens afterwards – is Safe Harbor at risk?), reinforced slightly the criteria for BCR authorisations, and limited transfers to non-EU public authorities and courts.

*     Finally, with regard to monetary fines, whilst the Parliament gives data protection authorities more discretion to impose sanctions, more instances of possible breaches have been added to the most severe categories of fines.

All in all, the LIBE Committee’s draft proposal represents a significant toughening of the Commission’s draft (which was already significantly tougher than the existing data protection directive). Once it is agreed by the Parliament, heated negotiations with the Council of the EU and other stakeholders (including the Commission itself) will then follow and we have just over a year to get the balance right. Much work no doubt awaits.


The UK’s Justice Committee is not impressed with the EU Data Protection Framework Proposals

Posted on November 2nd, 2012



In the week that the UK Parliament voted for a real-terms cut in the EU’s future budget, it’s no particular surprise to hear criticism from UK Parliamentarians levelled at EU institutions. On Thursday this week, the House of Commons Justice Committee produced its opinion on the European Commission’s legislative proposals for reform of EU data protection law. Whilst accepting that reform of data protection law is necessary, the opinion urges the Commission to ‘go back to the drawing board and devise a regime which is much less prescriptive’. The opinion strongly calls upon the Commission to re-think a number of issues including the division of the proposals into a Regulation and Directive, the drive towards harmonisation at the expense of flexibility, the need for a proper impact assessment, the right to be forgotten and the power of data protection authorities to issue sanctions. The Justice Committee heard evidence from the Ministry of Justice (in charge of negotiating the UK’s position on the proposals), the Information Commissioner’s Office, the EU Commission as well as representatives of UK small businesses, the police, privacy and consumer lobbyists and global businesses.   

Regulation and Directive

While the MoJ and ICO remained resistant to splitting the proposals for reform between a Regulation (for most data processing) and a Directive (for data processing for law enforcement and judicial co-operation), the Commission argued that this split was deliberate to give Member States flexibility to take their particular culture and type of legislation into consideration. So, in the case of the UK, the Commission considered this accommodated the UK’s reliance on common law.  However, a number of witnesses considered that the protection afforded by the draft Directive was less than that provided by the draft Regulation, potentially leaving individuals’ rights less well protected.

Principles rather than prescription?

There was considerable opposition to the prescriptive elements in the Regulation and the ICO, amongst others, encouraged an outcome focused approach based on principles. On the other hand, privacy and consumer lobbyists welcomed the administrative requirements on controllers which they considered helped to secure the rights of individuals.

Good for business?

It was accepted that simple, harmonised rules would greatly help small businesses seeking to expand across the EU as well as global businesses. However, the more prescriptive the rules the harder it would be for businesses to comply (particularly small businesses). The MoJ saw a real threat to business if the Regulation placed extra burdens on businesses and stated that it would influence negotiations to ensure a proportionate, flexible approach that does not impede entrepreneurship. The recent announcement from the EU Justice Commissioner Viviane Reding that she does not wish to see small businesses overburdened by the Regulation should provide some relief for businesses overawed by the compliance requirements of the Regulation.

Good for the ICO?  

Representatives from the ICO stated bluntly that they would not be able to resource their new role under the Regulation. Additionally, the MoJ made it clear that the ‘wish list of extra responsibilities and tasks‘ for the ICO under the Regulation was ‘genuinely wishful thinking’. Likewise, the ICO objected to having its hands tied by the Regulation when it came to identifying and dealing with compliance failures and wanted regulators to have more discretion to apply their own judgement and experience.   

The European Commission

In the Commission’s view enhanced harmonisation would make global processing of personal data simpler and cheaper and thus lead to increased business for the EU. However, this picture of harmonisation downplays the efforts that organisations will have to go to in order to strive for this end.  The MoJ and others sharply criticised the impact assessment that the Commission provided as inadequate and the Justice Committee called for a full assessment of the impact of the proposals.

The Commission also argued that they had sought to technology-proof the Regulation by leaving flexibility in the form of delegated Acts for the Commission to implement later. However, there was significant criticism from witnesses on the extent and scope of provisions for delegated Acts which potentially gave power to the Commission to prescribe technical formats, standards and solutions. There appears to be some scope for movement on this point given Viviane Reding’s recent announcement that she was willing to review the delegated Acts individually and to limit them to only what is truly necessary for future technological developments.

The right to be forgotten

Comments from the ICO provided insight into this controversial concept as Christopher Graham indicated (to his surprise) that Viviane Reding had told him that the right to be forgotten was ‘more of a political slogan’ which actually represented something that already existed. So amidst all the excitement and debate that the trumpeting of the right to be forgotten had stirred up, there was now a suggestion that it wasn’t really a big deal after all. The MoJ strongly emphasised that it would resist the implementation of the right to be forgotten since it would raise unrealistic expectations that will prove impossible to fulfil. More cautiously, the Justice Committee recognised the importance of an individual’s right to delete their data but recommended that the phrase ‘right to be forgotten’ should be avoided since it was misleading. Since the right to be forgotten is inextricably linked in most people’s minds with social media, it was significant that the MoJ considered that parts of the Regulation appeared to be overly-concerned with social media (an anxiety that has perhaps infected the tenor of the drafting).

Subject access rights

Although there were objections from the Federation of Small Businesses to the abolition of the £10 fee for access to personal data and the MoJ was clearly sympathetic to these concerns, the Justice Committee (along with privacy and consumer lobbyists) supported the Commission’s position that the right of access should be free. The MoJ was urged to change its negotiating position on this point.

Justice Committee’s conclusions

In the Committee’s view, the draft Regulation does not produce a proportionate, practicable, affordable or effective system of data protection. Therefore the Committee lays out a stark choice for the Commission: either pursue harmonisation under a Regulation by focusing on the elements essential to harmonise and deploy the consistency mechanism and the European Data Protection Board to achieve this, or use a Directive to set out the outcomes to be achieved and leave implementation down to Member States, thus forgoing an element of harmonisation and consistency. With respect to the new draft Directive on processing personal data for law enforcement and judicial co-operation purposes, the Committee queried whether there is a pressing need to amend EU law in this area.

What next?

The Justice Committee was asked by the European Scrutiny Committee to provide an opinion on the new data protection framework proposals. Although it has delivered its opinion, the opinion contains a number of outstanding actions on the MoJ to clarify its view or provide responses to the Committee on certain aspects of the new data protection framework. This may well inform the MoJ’s position as it continues to negotiate at European level on the shape of the data protection framework proposals.

Brussels calling: news on the Regulation

Posted on October 12th, 2012



There was a definite data protection buzz in Brussels this week as the European Parliament hosted a two-day Inter-parliamentary Committee Meeting to discuss the new EU Data Protection framework, proposed by the European Commission in January.

Representatives of global technology organisations, consumer protection groups, members of national parliaments and members of the EU institutions were prominent among the innumerable stakeholders there, each eager to present their views and contribute to the debate.

The conference was organised by the Committee on Civil Liberties, Justice and Home Affairs (LIBE), the body appointed by the European Parliament to assist with the data protection reforms, headed up by rapporteurs Jan Albrecht and Dimitrios Droutsas.

Since the Lisbon Treaty came into force in 2009, the European Parliament and the Council of the European Union are jointly responsible for negotiating and agreeing upon legislative proposals put forward by the Commission. It follows then that this conference provided a fundamental platform upon which stakeholders could share their opinions and concerns, and an important means by which legislators could gain insight into the practical, legal and economic realities behind the proposals. These contributions will feed directly into the legislative process, and LIBE will no doubt consider them when preparing its draft opinion on the reforms which is expected later this year.

So what then was the outcome of the conference? There are certainly many questions that remain unanswered and it was pointed out by Simon Davies from the London School of Economics that there is almost no agreement among stakeholders on any single point. A huge amount of re-thinking and re-drafting will no doubt ensue. That said, what was abundantly clear was an overwhelming support in principle for the reforms and, despite there being some way to go in terms of getting the legislation right, a sense that the key people responsible for drafting it are listening to what people have to say.

For instance, Viviane Reding (the Vice President of the Commission) made it clear that the Commission would consider reducing the vast number of delegated acts. This will no doubt have come as welcome news to many. Delegated and implementing acts enable the Commission to supplement and amend certain non-essential elements of the legislation once it has come into force. In other words, they achieve flexibility and enable clauses to be drafted in a technologically neutral manner, making way for new technological innovations that will be prevalent in the years to come. The counter argument though is that delegated acts give the Commission excessive (and in many cases unnecessary) powers, which would constitute a bar to strengthening democracy and promoting transparency across the EU.

Francoise Le Bail (the Commission’s Director General for Justice), whilst defending the number of delegated acts currently drafted, recognised there were a lot of question marks and problems outstanding but stressed that stakeholder contributions were valued by the Commission which is determined to take into account the proposals and comments made. There is still room then for voices to be heard.

The debate on delegated acts was one thing, and there are no prizes for guessing some of the other controversial elements that repeatedly cropped up. The “right to be forgotten”, “one-stop-shop”, “consent”, “profiling” and “data protection by design” were all key concepts which unsurprisingly featured in the debate and, whether for or against them, the general view was clear. The drafting needs to be tightened up, and greater clarity is needed in many cases so as to be sure of the exact rights and obligations of everyone concerned.

The proposed legislation does after all affect a huge number of people; not just citizens, but consumers, SMEs, global organisations and public authorities are all affected, and this was also a key feature of the debate. On the one hand, we were reminded that data protection is a fundamental right of each citizen in the EU and measures must be taken to protect that right; on the other we were reminded that data, which flows across the digital environment in ever-increasing volumes, is a hugely important economic asset, not to mention a vital component in terms of law enforcement.

So a balance needs to be struck. There are clearly business incentives for building trust in the digital environment, and similarly there is an undisputed recognition of the fact that we need to bolster the rights of individuals. It seems that all stakeholders are recognising the need to be flexible in their approach and response to these reforms, and are working hard to achieve a robust and coherent legal system that will, over the coming years, facilitate innovation whilst providing people with protection and control of their data, to enable the EU to continue to be a major player in the digital economy.

LIBE is expected to present its draft report on the proposed legislation by the end of this year, after which Member States will be invited to table their amendments. LIBE will then meet to discuss those amendments and it is expected that an orientation vote (where the committee votes and concludes upon its initial position in light of the negotiations) will be held in April 2013.

What to do when you can’t delete data?

Posted on October 2nd, 2012



How many lawyers have written terms into data processing contracts along the following lines:  “Upon termination or expiry of this Agreement, the data processor shall delete any and all copies of the Personal Data in its possession or control”?

It’s a classic example of a legal clause that’s ever so easy to draft but, in this day and age, almost impossible to implement in practice.  In most data processing ecosystems, the reality is that there seldom exists just a single copy of our data; instead, our data is distributed, backed-up, and archived across multiple systems, drives and tapes, and often across different geographic locations.  Far from being a bad thing, data distribution, archival and back-up better preserves the availability and integrity of our records.  But the quid pro quo of greater data resilience is that commitments to comprehensively wipe every last trace of our data are simply unrealistic and unachievable.

Nevertheless, once data has fulfilled its purpose, deletion is seemingly what the law requires.  The fifth principle of the Data Protection Act 1998 (implementing Article 6(e) of Directive 95/46/EC) says that: “Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes.”  So how to reconcile this black and white approach to data deletion with the reality of modern day data processing systems?

Thankfully, the ICO has the answer, which it provides in a recently-published guidance note on “Deleting personal data” (available here).  The ICO starts off by acknowledging the difficulties outlined above, commenting that “In the days of paper records it was relatively easy to say whether information had been deleted or not, for example through incineration. The situation can be less certain with electronic storage, where information that has been ‘deleted’ may still exist, in some form or another, within an organisation’s systems.”

The sensible answer it arrives at is to say that, if data cannot be deleted for technical or other reasons, then it should instead be put ‘beyond use’.   Putting data ‘beyond use’ has four components, namely:

  1. ensuring that the organisation will not and cannot use the personal data to inform any decision in respect of any individual or in a manner that affects the underlying individuals in any way;
  2. not giving any other organisation access to the personal data;
  3. at all times protecting the personal data with appropriate technical and organisational security; and
  4. committing to delete the personal data if or when this becomes possible.

Broadly speaking, you can condense the four components above into: “Delete it if you can and, if you can’t, make sure it’s stored securely and don’t let anyone use it”. Which is, of course, entirely sensible advice.
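
In practical terms, and purely as a sketch, ‘putting beyond use’ in a simple relational store might look something like the following (the table and column names are hypothetical, and component 3 – appropriate technical and organisational security – would sit outside this snippet in encryption and access controls):

import sqlite3

# Sketch of the ICO's 'beyond use' concept: flag the record rather than
# physically erase it, route every business query through a helper that
# excludes flagged rows, and attempt real deletion when it becomes possible.
conn = sqlite3.connect("customers.db")
conn.execute("""CREATE TABLE IF NOT EXISTS customers (
                    id INTEGER PRIMARY KEY,
                    name TEXT,
                    email TEXT,
                    beyond_use INTEGER DEFAULT 0)""")

def put_beyond_use(customer_id):
    # Components 1 and 2: the data can no longer inform decisions about the
    # individual and is never handed to another organisation.
    conn.execute("UPDATE customers SET beyond_use = 1 WHERE id = ?",
                 (customer_id,))
    conn.commit()

def active_customers():
    # All application queries go through here, so flagged rows never surface.
    return conn.execute(
        "SELECT id, name, email FROM customers WHERE beyond_use = 0"
    ).fetchall()

def purge_when_possible():
    # Component 4: delete for real if and when it becomes technically possible
    # (for example, once the last backup containing the record has expired).
    conn.execute("DELETE FROM customers WHERE beyond_use = 1")
    conn.commit()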

It does raise one interesting problem though:  what to do when the individual data subject requests access to his or her data that has been put beyond use?  Here, the ICO again takes a business-friendly view saying simply that “We will not require data controllers to grant individuals subject access to the personal data provided that all four safeguards above are in place.”  In other words, the business does not need to instigate extensive (and expensive) searches of records that have been put beyond use just because an individual requests access to his or her data – for the purposes of subject access, this inert data is treated as if it had been deleted.

But the ICO does issue a warning: “It is bad practice to give a user the impression that a deletion is absolute, when in fact it is not.” So the message to take away is this: make sure you do not commit yourself to data deletion standards that you know, in all likelihood, you can’t and won’t meet.   And, by the same token, don’t let your lawyers commit you to these either!

The Justice Committee’s first bite of the new Data Protection Framework Proposals

Posted on September 4th, 2012



This morning the UK Parliament’s Justice Select Committee held its first evidence session on the EU Data Protection Framework Proposals. Representatives from the Association of Chief Police Officers, the Met Police, the Federation of Small Businesses, Microsoft as well as the Information Commissioner’s Office provided their views on the two draft EU legal instruments – the Directive (concerned with criminal data) and the Regulation (concerned with pretty much everything else).

Criticism

While the witnesses accepted that the Regulation did bring welcome changes to reduce certain aspects of the current regime’s bureaucracy (for instance, around notifying DPAs), the overwhelming response was to criticise the overly-engineered text of the Directive and Regulation (including the numerous delegated powers given to the EU Commission).  A key tension in the Regulation exists between the drive towards harmonisation (particularly dear to the Commission) and the consequent prescriptive practices and procedures that the Commission’s version of harmonisation requires.

The Business view

Although international businesses are keen on a single data protection standard across the EU, this becomes less palatable when the requirements for that standard are set out in precise detail. Additionally, while the Regulation appears to hold out all sorts of new rights to individuals as data subjects, industry queried what incentives the Regulation contained for them to comply and what compensation they would receive for the additional administrative burdens they would have to bear (such as maintaining detailed documentation about their data processing and responding to subject access requests if the fee is abolished). Industry supported an approach that encouraged codes of conduct and certification to promote trust between consumers and business.

The Regulator’s view

In his evidence, Christopher Graham, the Information Commissioner, was particularly trenchant in his view that full compliance by the Information Commissioner’s Office with the requirements of the Regulation was not only unworkable but also exorbitantly expensive. He indicated that potentially millions more pounds would need to be allocated to the ICO for the office to fulfil its obligations under the Regulation such as checking that data controllers appoint DPOs or carry out PIAs. The ICO emphasised the need for the Regulation to focus on good data protection outcomes rather than prescribing the means by which this is achieved. For the ICO, the Regulation should promote a risk-based rather than one-size-fits-all approach.

The ICO was optimistic that its views would make some headway during the negotiations on the Regulation.  In particular, the ICO was not keen to see its role as a regulator that advises and assists transformed into that of an administrative centre obliged to punish compliance failures with no ability to apply discretion and judgment.

The right to be disappointed….

Although there was some discussion amongst the Committee and witnesses on the impact of the right to be forgotten, some witnesses considered this would swiftly become a ‘right to be disappointed’. Though packaged up as a new right, witnesses made the point that a similar if not identical right already exists in the current regime. Additionally, the practical impossibility of scouring the internet to identify every reference to an individual means it will be well nigh impossible for an organisation to conclusively delete that individual’s data. Disappointment and disenchantment would inevitably set in. The ICO also mentioned that it is still unclear whether search engines would be caught by the obligation to implement an individual’s right to be forgotten.

Why the Big Buzz about Big Data?

Posted on June 29th, 2012



Another year, another buzz word, and this time around it’s “Big Data” that’s getting everyone’s attention. But what exactly is Big Data, and why is everyone – commercial organisations, regulators and lawyers – so excited about it?

Put simply, the term Big Data refers to datasets that are very, very large – so large that supercomputers would traditionally have been required to process them. But, with the irrepressible evolution of technology, falling computing costs, and scalable, distributed data processing models (think cloud computing), Big Data processing is increasingly within the capability of most commercial and research organisations.

In its oft-quoted article “The Data Deluge”, the Economist reports that “Everywhere you look, the quantity of information in the world is soaring. According to one estimate, mankind created 150 exabytes (billion gigabytes) of data in 2005. [In 2010], it will create 1,200 exabytes.”  Let’s put that in perspective – 1,200 exabytes is 1,200,000,000,000 gigabytes of data. A typical Blu-Ray disc can hold 25 gigabytes – so 1,200 exabytes is the equivalent of about 48 billion Blu-Ray discs. Estimating your typical Blu-Ray movie at about 2 hours long (excluding special features and the like), then there’s at least 96 billion hours of viewing time there, or about 146,000 human lifetimes.  OK, this is a slightly fatuous example, but you get my point – and bear in mind that global data is growing year-on-year at an exponential rate so these figures are already well out of date.
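
For anyone who wants to check the back-of-the-envelope maths, the figures above fall out of a few lines of arithmetic (the 75-year lifespan is my own rough assumption):

# Back-of-the-envelope check of the figures quoted above.
exabytes = 1200
gigabytes = exabytes * 1_000_000_000         # 1 exabyte = a billion gigabytes
blu_rays = gigabytes / 25                    # 25 GB per Blu-Ray disc
viewing_hours = blu_rays * 2                 # roughly 2 hours per movie
lifetimes = viewing_hours / (75 * 365 * 24)  # assuming a 75-year lifespan

print(f"{gigabytes:,.0f} GB")                # 1,200,000,000,000 GB
print(f"{blu_rays:,.0f} Blu-Ray discs")      # 48,000,000,000 discs
print(f"{viewing_hours:,.0f} hours")         # 96,000,000,000 hours
print(f"{lifetimes:,.0f} lifetimes")         # roughly 146,000 lifetimes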

Much of this Big Data will be highly personal to us: think about the value of the data we all put “out there” when we shop online or post status updates, photos and other content through our various social networking accounts (I have at least 5). And don’t forget the search terms we post when we use our favourite search engines, or the data we generate when using mobile – particularly location-enabled – services. Imagine how organisations, if they had access to all this information, could use it to better advertise their products and services, roadmap product development to take account of shifting consumer patterns, spot and respond to potentially-brand damaging viral complaints – ultimately, keep their customers happier and improve their revenues.

The potential benefits of Big Data are vast and, as yet, still largely unrealised. It goes against the grain of any privacy professional to admit that there are societal advantages to data maximisation, but it would be disingenuous to deny this. Peter Fleischer, Google’s Privacy Counsel, expressed it very eloquently on his blog when he wrote “I’m sure that more and more data will be shared and published, sometimes openly to the Web, and sometimes privately to a community of friends or family. But the trend is clear. Most of the sharing will be utterly boring: nope, I don’t care what you had for breakfast today. But what is boring individually can be fascinating in crowd-sourcing terms, as big data analysis discovers ever more insights into human nature, health, and economics from mountains of seemingly banal data bits. We already know that some data sets hold vast information, but we’ve barely begun to know how to read them yet, like genomes. Data holds massive knowledge and value, even, perhaps especially, when we do not yet know how to read it. Maybe it’s a mistake to try to minimize data generation and retention. Maybe the privacy community’s shibboleth of data deletion is a crime against science, in ways that we don’t even understand yet.” (You can access Peter’s blog “Privacy…?” here.)

This quote raises the interesting question of whether the compilation and analysis of Big Data sets should really be considered personal data processing. Of course, many of the individual records within commercial Big Data sets will be personal – but the true value of Big Data processing is often (though not always) in the aggregate trends and patterns it reveals – less about predicting any one individual’s behaviours, reactions and preferences, and more about understanding the global picture. Perhaps it’s time that we stop thinking of privacy in terms of merely collecting data, and look more to the intrusiveness (or otherwise) of the purposes to which our data are put?
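
To illustrate the distinction in code rather than words, here is a toy sketch (hypothetical field names, plain Python rather than any real analytics stack): the individual purchase records are personal, but the output the analyst actually cares about is an aggregate pattern from which the identifiers have been dropped.

from collections import Counter

# Individual records go in...
purchases = [
    {"customer_id": "u1", "postcode_area": "EC1", "category": "books"},
    {"customer_id": "u2", "postcode_area": "EC1", "category": "music"},
    {"customer_id": "u3", "postcode_area": "SW9", "category": "books"},
    # ...millions more rows in a genuine Big Data set
]

# ...but only the aggregate trend comes out: purchases per area and category,
# with customer identifiers discarded along the way.
trend = Counter((p["postcode_area"], p["category"]) for p in purchases)
print(trend.most_common(3))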

This is perhaps something for a wider, philosophical debate about the pros and cons of Big Data, and I wouldn’t claim to have the answers. What I can say, though, is that Big Data faces some big issues under data protection law as it stands today, not least in terms of data protection principles that mandate user notice and choice, purpose limitation, data minimisation, data retention and – of course – data exports. These are not issues that will go away under the new General Data Protection Regulation which, as if to gear itself up for a fight with Big Data proponents, further bolsters transparency, consent and data minimisation principles, while also proposing a new, highly controversial ‘right to be forgotten’.

So what can and should Big Data collectors do for now? Fundamentally, accountability for the data you collect and process will be key. Your data subjects need to understand how their data will be used, both at the individual and the Big Data level, to feel in control of this and to be comforted that their data won’t be used in ways that sit outside their reasonable expectations of privacy. This is not just a matter of external facing privacy policies, but also a matter of carefully-constructed internal policies that impose sensible checks and balances on the organisation’s use of data. It’s also about adopting Privacy Impact Assessments as a matter of organisational culture to identify and address risks whenever using Big Data analysis for new or exciting reasons.

Big Data is, and should be, the future of data processing, and our laws should not prevent this. But, equally, organisations need to be careful that they do not see the Big Data age as a free-for-all hunting season on user data that invades personal privacy and control. Big issues for Big Data indeed.