European Union harmonizes the approach to sentencing cybercriminals

This post was written by Cynthia O'Donoghue.

In early July, the European Parliament adopted a new directive harmonizing the criminal laws relating to cyberattacks (Directive). It will replace the EU's existing 2005 rules (Framework Decision 2005/222/JHA). The Directive aims to harmonise the approach to cybercrime by requiring all Member States to introduce maximum imprisonment sentences ranging from two to five years for various forms of cyberattack. “Cyber crime does not stop at borders, so it is vital to have a comprehensive and joint set of rules to prevent and fight it successfully,” said Monika Hohlmeier, the German lawmaker responsible for overseeing the Directive’s passage through the European Parliament.

The Directive sets a three-tier system of maximum prison sentences applicable to cybercrimes, and it will be within each Member State’s discretion to define which attacks are classified as minor. In addition, perpetrators benefitting from their cybercrime would face penalties ranging from exclusion from public benefits to being closed down.

All infringements will need to carry at least a two-year maximum prison sentence. The crimes covered include illegal interference with IT systems or data, illegal access to information systems, illegal interception of data, and producing, selling or distributing tools designed for cyberattacks. The penalty for illegal interference with systems or data should be increased to a maximum three-year sentence where the perpetrator used tools specifically designed for large-scale attacks or misused another person’s electronic identity. A maximum term of five years should apply to illegal interference with “critical” infrastructures (e.g., government information systems or energy networks), attacks which cause serious damage, attacks committed by criminal organisations, and attacks using “botnets”, i.e., establishing remote control over a significant number of computers by infecting them with malicious software.

The Directive also creates a system for the effective exchange of information on cyberattacks. Member States will need to maintain an operational national point of contact, available on a 24/7 basis, with a required response time of eight hours for urgent reports.

The Directive was adopted by the European Parliament and will be considered by the EU Council at a forthcoming meeting. Once the Directive is fully approved and published, Member States will have two years to transpose its provisions into their national laws, except for Denmark, which exercised its opt-out right for legislation affecting law enforcement.

ENISA Cybersecurity Annual Report

This post was written by Cynthia O'Donoghue.

ENISA, the European Union Agency for Network and Information Security, has issued its Annual Incidents Report 2012. The report was issued under Article 13a of the Common Regulatory Framework Directive (2009/140/EC) for electronic communications networks and services. It highlights that 18 European Union countries reported 79 significant incidents during 2012, while only nine countries reported no significant incidents.

Nearly 50% of the reported incidents related to mobile telephony and the internet, affecting about 1.8 million users per incident. Over 75% of the incidents were reported as “system failures”, with hardware being the most common cause, followed by software issues. Only 6% of the incidents were attributed to cyberattacks, with the internet, followed by fixed telephony, being the services most affected. Cyberattacks were the second biggest cause of internet issues, behind hardware failures.

The report contains several examples of the types of incidents reported, some of which were related to hardware failures or configuration failures. Some notable incidents related to theft of fibre optic cables, vandalism by a former employee and distributed denial of service attacks.

ENISA’s report concludes that the proposal for a cybersecurity Directive contains a similar reporting requirement to the existing Framework Directive. ENISA supports reporting as a method of assisting the European Union and Member States to improve the security and resilience of electronic communications networks.

UK Court of Appeal upholds two-year sentence for cybercriminal

This post was written by Cynthia O'Donoghue.

In R v Martin [2013] EWCA Crim 1420, the UK Court of Appeal dismissed an appeal by Lewys Martin against a two-year prison sentence for various cybercrimes. Martin had pleaded guilty to various breaches of the Computer Misuse Act 1990 and was convicted and sentenced to concurrent terms for unauthorised modification of computer material, securing unauthorised access to computer material (including with intent), and for making, supplying or obtaining computer materials. In addition, there was an offence of unlawfully intercepting communications under the Regulation of Investigatory Powers Act 2000.

The underlying convictions related to denial of service (DoS) attacks against various public bodies, including Oxford and Cambridge Universities and Kent Police, as well as to the hacking of several individuals' bank accounts, all taking place during 2011 and 2012. Martin was found to be linked to the cyberattack group “Anonymous.” The Universities wasted 19 working days in dealing with the attacks, and Kent Police wasted 35 man-hours, with 30% of its security team engaged in dealing with the attacks. The individuals whose banking details were hacked and stolen were also affected, each having to cancel and obtain new bank cards and close accounts, which took weeks to resolve.

The court recognised that individuals who had their privacy invaded in this way “very seldom” got over it, and the sentence had to reflect that the attacks against the individuals bordered on identity theft, even though Martin’s acts were not financially motivated. The appellate court held that the offences fell “into the highest level of culpability: they were carefully planned offences which did and were intended to cause harm both to the individuals and organisations targeted” (para. 36).

The court also held that the seriousness of the criminality could not be measured merely by the length of a cyberattack or by its financial consequences; rather, the wider implications for society could not be ignored, given the potential of such attacks to cause great damage and their increasing prevalence. The court looked at aggravating factors, such as whether an offence was planned or persistent, the nature of the damage, the public interest, and the effect on individual privacy and on public confidence, holding that “for offending of this scale, sentences would be measured in years rather than months” (para. 43). In particular, the court acknowledged the prevalence of computer crime, its potential to cause enormous damage to IT systems, important public institutions and individuals, given the way in which society now operates, and the fact that organisations are compelled to spend substantial sums combating this type of crime. The court concluded, therefore, that a deterrent sentence was warranted and that the sentences were “amply justified” (para. 46).

U.S.-EU Safe Harbor Under Fire

This post was written by Cynthia O'Donoghue.

As part of the ongoing debate on European data protection reform, doubts have been cast over the adequacy of the Safe Harbor arrangements with the United States. Viviane Reding, the European Commissioner for Justice, Fundamental Rights and Citizenship, called the 13-year-old data-sharing agreement between the EU and the United States a potential “loophole for data transfers” that does not provide adequate protection. “The Safe Harbour agreement may not be so safe after all,” she said when announcing a review of the cross-Atlantic agreement.

The existing Data Protection Directive prohibits cross-border transfers of personal data to countries not recognised as providing adequate protection for the processing of personal data, unless certain mechanisms are in place. The U.S.-EU Safe Harbor Framework attempts to transpose European data protection standards into those of the United States, such that data transferred to companies certifying adherence to the framework is deemed to receive adequate protection for the processing of personal data. Currently, around 3,000 companies have voluntarily joined the programme by subscribing to a binding set of data transfer rules.

EU officials have raised two criticisms: first, whether Safe Harbor actually provides adequate protection; and second, whether companies certified to Safe Harbor actually observe its principles. Past studies have found organisations falsely claiming to be Safe Harbor-certified, and only a fraction of certified organisations fully complying with the Safe Harbor requirements in practice.

The U.S. Federal Trade Commission (FTC), the body responsible for enforcing the Safe Harbor, recently increased its enforcement action to ensure compliance, including by requiring annual audits of Twitter, Google, Facebook and MySpace. However, EU officials and representatives of some European data protection authorities doubt whether this is enough to make Safe Harbor work in its current form.

Article 3 of the U.S.-EU Safe Harbor Agreement allows the European Commission to reverse or suspend the agreement. Referring to this provision, the European Parliament requested that the European Commission conduct a full review of Safe Harbor. Ms. Reding has confirmed that she plans to present a comprehensive assessment of Safe Harbor before the end of the year. Companies relying on Safe Harbor may need to audit their adherence to the framework, or even to consider implementing other mechanisms for ensuring adequate protection for data transfers, including binding corporate rules.

EU Article 29 Working Party publishes new BCR guidance for processors

This post was written by Cynthia O'Donoghue.

The European Union (EU) data protection body, the Article 29 Working Party (A29WP), in April adopted new guidance on Binding Corporate Rules for Processors (BCPRs). The document supplements the opinion from June 2012, which listed elements required for valid BCPRs, by further clarifying what provisions and mechanisms must be included before BCPRs can be authorised. The BCPR process has been developed by the A29WP in response to a request from outsourcing providers to create a new legal instrument to legitimise international data transfers.

The new guidance emphasises that BCPRs are the preferred method for transfers of personal data from the EU to countries without “adequate levels of protection,” over other methods, such as the EU standard contractual clauses. BCPRs are preferred when transfers are voluminous and frequent between the primary data processor and sub-processors in the same organisation. BCPRs are also recognised within the mutual recognition scheme, such that authorisation of BCPRs by one EU member state will result in automatic authorisation in other participating EU member states.

Data controllers will remain responsible for ensuring that service providers only process data under their instructions, and that sufficient guarantees are in place to protect the personal data being transferred to a service provider and within that service provider group, even where BCPRs have been authorised.

The A29WP emphasises that the BCPRs must be binding both internally and externally, and recommends that service providers implement strict and punitive policies or codes of conduct supported by intra-group agreements. For third-party sub-processors, service providers are required to enter into agreements requiring the sub-processors to respect the same obligations as the processor group. The sub-processor agreement will need to include third-party beneficiary rights for the data controller and for data subjects. Service providers seeking authorisation for BCPRs will need to include extracts of the relevant clauses in their authorisation application.

The guidance also specifies the limits imposed on the requirements for modifying authorised BCPRs and lists other compulsory clauses, such as provisions ensuring compliance, audit mechanisms and complaint handling, and a duty to cooperate with both the controller and the relevant data protection authority. The BCPRs must also designate a corporate member within the EU that will be liable for breaches of the BCPRs by members of the group outside the EU.

While this new tool was developed in response to calls from the outsourcing community, no BCPRs have been authorised to date, although the French authority, the CNIL, has admitted to having several applications pending.

EU Article 29 Working Party criticises the proposed Data Protection Impact Assessment template for smart meters

This post was written by Cynthia O'Donoghue.

The Article 29 Working Party (A29WP) has adopted its Opinion on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems (Opinion), which evaluates the Privacy Impact Assessment (PIA) template that the member states intend to adopt. The PIA template, which was prepared by industry representatives, seeks to ensure that smart-meter operators comply with data protection rules; however, the A29WP pointed out a number of inadequacies in it.

The EU initiative to roll out smart gas and electricity meters, which can send usage data via remote communications, underpins the desire for a more effective and efficient energy supply. In the Opinion, the A29WP points out the risk that smart-meter usage data may be used to infer information about “consumers’ use of specific goods or devices, daily routines, living arrangements, activities, lifestyles and behaviour.”

The energy supply industry expert group developed the PIA to ensure that smart-meter operators comply with data protection rules, and to facilitate compliance assessments by Data Protection Authorities, as well as to provide information to consumers.

The PIA template contains an eight-step impact assessment and provides step-by-step guidance on how to carry it out. The A29WP acknowledged that the proposed template contains useful elements, but criticised the failure to include any method of directly assessing the foreseeable impacts on data subjects, including the risk of price discrimination or of criminal acts facilitated by unauthorised profiling. The A29WP also felt the PIA template confused risks and threats, and failed to match specific risks to controls based on best practice. Other criticisms included that the PIA template lacked sufficient guidance on the concepts of vulnerability, on calculating and prioritising risks, on choosing appropriate mitigating controls, and on appropriately allocating data protection responsibilities between the different stakeholders. The A29WP also recommended including an analysis of industry-specific risks and relevant controls.

The A29WP acknowledged that the industry expert group is preparing ‘best available techniques’ that may address some of the criticisms, but said it would wait to see those techniques included within the PIA template when it is resubmitted for a further opinion.

APEC's Cross-Border Privacy Rules begin to gain momentum

This post was written by Cynthia O'Donoghue.

In February 2013, Mexico became the second approved participant in the Cross-Border Privacy Rules (CBPR) programme - a system for convenient cross-border data transfers introduced in 2011 by the Asia-Pacific Economic Cooperation (APEC). At the same time, APEC and EU Data Protection Authorities (DPAs) plan to create a unified cross-border system covering both regions. This will be a welcome development likely to stimulate global trade.

APEC operates as a forum of 21 economies from the Asia-Pacific region, promoting a free and open market and encouraging economic integration. One of its recent initiatives is the CBPR system, which involves companies adopting internal privacy rules that are assessed by accountability agents. This promotes unified data privacy policies for businesses operating throughout APEC economies. APEC leaders pledged to implement the programme in November 2011, and the first country to join the system, in July 2012, was the United States.

At the same time, representatives of APEC and the EU have begun discussions on creating a unified cross-border system. Achieving this will be helped by the fact that APEC’s CBPR and the EU’s binding corporate rules (BCRs) are very similar. Both systems involve companies developing internal policies for international transfers that are approved by third-party authorities. A key difference, however, is that BCRs ensure that EEA businesses can transfer data to affiliates outside the EEA without breaching EU law, whereas the CBPR system is designed to promote transfers of data within the Asia-Pacific region. The French DPA (CNIL), which represents the EU in these discussions, has studied the similarities and differences between the two systems and carried out preliminary work on developing the unified mechanism. An initial road map for the project will be prepared by APEC and the Article 29 Working Party in the coming months. Once introduced, the new system promises to bring much-needed certainty for businesses operating or planning to operate in both regions.

The Article 29 Working Party tackles the most contested elements of the new Data Protection Regulation

This post was written by Cynthia O'Donoghue.

The Article 29 Working Party (“Art. 29 WP”), which has already released two opinions (WP191 and WP199) regarding the draft General Data Protection Regulation (“Regulation”), has issued a statement and two accompanying annexes addressing some of the most heavily debated elements. The statement addresses relaxation of the rules for the public sector, a one-stop-shop for data controllers, the pseudonymisation of data, the standard of consent, cross-border transfers, a risk-based approach, and the household exemption. Many of the views expressed by the Working Party appear to be in direct opposition to a number of observations made by other organisations, such as ITRE (see also our blog and client alert regarding the ITRE’s opinion).

The Art. 29 WP vehemently opposes the concept that the public sector should have a different regulatory regime for data protection from that of the private sector, on the basis that data protection is a fundamental right that is not affected by the status of the data controller being a public body.

The Art. 29 WP seeks the inclusion of pseudonymised and encrypted data within the scope of ‘personal data’ on the basis that pseudonymisation and encryption are security techniques that do not change the inherently personal nature of the data.

The Art. 29 WP opposes removing the requirement for explicit consent, arguing that it is essential to ensure that consent is not misused by data controllers and goes to the heart of proving the validity of consent. It also expressed support for consent being invalid when obtained where there is a significant imbalance of power.

Permitting cross-border data transfers without the need for a binding mechanism was rejected by the Art. 29 WP. Its statement advocated the use of Mutual Legal Assistance Treaties (“MLATs”) to govern disclosures of data not otherwise authorised under EU or EU member states’ national laws, where such disclosures would be based on important grounds of public interest. Without such MLATs, data controllers would continue to be prohibited from transferring data outside the EU even when subject to the court order of a third country.

The Art. 29 WP supports a risk-based and scalable approach to data protection, with risk depending not only on the size of the controller, but also on the nature and categories of the data being processed.

In relation to the household exemption, commonly relied upon by organisations, such as social media services, that ask members or users to add their contacts, the Art. 29 WP recommended that the exemption should not apply where its use would result in a gainful interest connected with a commercial activity.

This statement will be weighed by the LIBE Committee as part of determining which of the more than 3,000 suggested amendments to incorporate into the Regulation; but given that the Art. 29 WP is made up of the 27 EU member states’ data protection authorities, its statement is likely to be influential.

ECJ to weigh in on Spanish dispute with Google over the application of data protection laws

This post was written by Cynthia O'Donoghue and Katalina Chin.

As Google continues its legal battle with the Spanish Data Protection Authority (DPA), the Spanish High Court (Audiencia Nacional) has referred several questions to the European Court of Justice (ECJ). The questions cover whether individuals have the right to demand the removal and blocking of information contained within Internet search results, even though that information was lawfully collected and accurate at the time of collection. Such search results may have a negative or harmful effect on the individual since the information could potentially be available “over the lifetime of an individual and that of his descendants.”

The case at issue related to a person who, when searching his name, was provided with search results relating to a newspaper advertisement for the auction of his property stemming from an old and subsequently resolved debt. The individual had requested Google to remove the search result, and when it did not do so, complained to the Spanish Data Protection Agency, which upheld the complaint and required Google to amend the search results. As the advertisement had been published in a newspaper, Google felt that the search result should not be taken down, and appealed.

One of the issues referred to the ECJ is whether the Spanish court has jurisdiction over Google, Inc. as a data controller or whether the issue should be tried in a California court, since Google’s local subsidiary only sells advertising to the California parent. The Spanish DPA found jurisdiction on the basis that Google Spain has a sufficient presence in Spain, operates a Spanish top level domain (ccTLD), and processes personal data about Spanish citizens.

A second important question to be answered by the ECJ is whether Google can be classified as a “data controller,” rather than only a host. Google argues that it did not produce the information in question, but merely displayed it in search results, and that the data would disappear from the search index as soon as it was no longer available from the source web page. Additionally, Google asked the court to consider its right to freedom of expression, arguing that forcing it to remove the search results would be detrimental to the public interest, as “there are clear societal reasons” why information about valid legal material which still exists online should be publicly available.

The ECJ Advocate General is expected to publish an opinion on the matter on 25 June, and the judges are expected to rule by the end of the year.

EU member states argue for watering down the proposed Data Protection Regulation

This post was written by Cynthia O'Donoghue.

The proposed new EU General Data Protection Regulation may need to be watered down. The far-reaching proposed draft, which was published in January 2012, aims to unify and strengthen the data protection laws across the 27 EU countries. However, the Financial Times reports that a memo drafted by the Irish presidency admits that “several member states have voiced their disagreement with the level of prescriptiveness of a number of the proposed obligations in the draft regulation.”

There appears to be a prevailing opinion among the member states that the burdens imposed by the draft Regulation must be reduced, especially the most commonly criticised elements, such as the requirement to obtain individuals’ explicit consent and the “right to be forgotten.” Several EU member states, like the UK (see our blog about the UK’s criticism), advocate a “risk-based” approach that would have as its focus whether a substantial threat to a person’s personal data exists. Several EU member states would like small companies spared from many of the compliance burdens contained in the proposed Regulation—an approach advocated by the American Chamber of Commerce.

Some member states, including the UK, would like to see the designation of a data protection officer reduced to an optional requirement. Germany and Belgium argue for the easing of rules related to the use of data by public institutions.

The lobbying for watering down the proposed Regulation has been openly criticised by a coalition of privacy groups, as well as by Jan Philipp Albrecht, the rapporteur for the draft regulation (see also our blog about Albrecht’s report on the proposed Regulation). Given the raging debate, it looks as though enough member states oppose the draft Regulation to block the entire proposal, unless the European Parliament and the European Commission heed the calls for compromise.

European Parliament Committee on Industry, Research and Energy publishes opinion on the proposed General Data Protection Regulation

This post was written by Cynthia O'Donoghue.

Following the lead of the Committee on Civil Liberties, Justice and Home Affairs (LIBE), which released its draft report (see our prior blog) on 20 February, the European Parliament Committee on Industry, Research and Energy (ITRE Committee) has published its Draft Opinion on the proposed General Data Protection Regulation. The opinion has been submitted to LIBE, which has the task of consolidating amendments and voting on its own report at the end of April.

In the Draft Opinion, ITRE rapporteur Seán Kelly outlined his substantial support for the proposed Regulation and suggested that the changes should help avoid excessive administrative burdens for enterprises, and introduce a greater degree of flexibility, especially in terms of accountability and the notification requirements to supervisory bodies. The ITRE Committee, however, proposed significant amendments to the Regulation in an attempt to ease restrictions on companies by focusing on corporate governance, the use of impact assessments, and bringing increased clarity to the provisions. It has recommended significant alterations to the most contentious provisions, such as consent mechanisms; the rights of access, portability, and to be forgotten; the 24-hour breach notification requirement; and the sanctions regime.

For a more detailed analysis, click here to read the issued Client Alert.

UK Information Commissioner's Office presents article-by-article analysis of the proposed new General Data Protection Regulation

This post was written by Cynthia O'Donoghue.

Following the publication of its “further thoughts” on the European Commission’s proposed new data protection framework, the ICO has now published an in-depth, article-by-article analysis of the proposed General Data Protection Regulation (the Regulation). The ICO pointed out that this is an important opportunity to get the framework right, as it is likely to remain in force for many years. The paper reflected the ICO’s general concerns and expressed its opinion about some of the more contested elements of the Regulation.

The ICO reiterated the need for further clarity and expressed concerns about the number of delegated acts of the European Commission in the Regulation, on the basis that use of the delegated acts is likely to result in continued uncertainty for businesses and data subjects.

The ICO emphasises that the new data protection framework should promote a truly risk-based approach; focusing on administrative detail and compliance processes rather than outcomes could encourage paper-only compliance. The ICO also voiced strong support for the concept of protection by design, so long as the model is principle-based to accommodate scalability and flexibility.

The ICO welcomed “the high standard of consent”, but raised concerns that some data controllers may be left without a lawful basis for processing, and criticised the unequivocal barring of consent obtained in cases of alleged “significant imbalance”, pointing out that consent can be obtained for employer-employee data processing. The ICO continues to advocate for the inclusion of “pseudonymised” data within the definition of personal data, but floated the idea that individuals’ access rights should not apply to such data.

While the ICO generally supports the new right to be forgotten, the paper acknowledges that it may be impossible to deliver in practice, because data in the public domain will often be disseminated without the original data controller’s consent or knowledge, which could result in individuals developing a false belief that data is capable of being erased. Despite acknowledging concerns regarding the right to portability’s potential impact on property rights and trade secrets, and admitting it is not a “classical” element of data protection law, the ICO welcomed its inclusion, highlighting that it empowers consumers.

EU and U.S. sign joint declaration to make Internet safer for children

This post was written by Cynthia O'Donoghue.

EU Commission Vice-President Neelie Kroes, responsible for the Digital Agenda for Europe, and U.S. Secretary of Homeland Security Janet Napolitano, have signed a joint Declaration to “work collectively and in partnership to reduce the risks and maximise the benefits of the Internet for children.” The declaration demonstrates a mutual recognition by the United States and the EU of the need to establish appropriate safeguards to strengthen cyber security, and will complement the EU "Strategy for a Better Internet for Children."

The declaration sets out plans to create joint U.S./EU campaigns, with the U.S. Department of Homeland Security scheduled to participate in the EU Safer Internet Day on 5 February 2013. The joint campaigns will seek to improve cybersecurity and will focus on international cooperation between industry, public authorities, schools, and civil society to ensure a global audience.

According to the European Commission, 75% of children between the ages of 6 and 17 routinely use the Internet, and the declaration sets out three main objectives to protect children online:

  1. Increase awareness of risks and improve skills of children, and engage parents and teachers to help enable best use of the Internet by collaborating on cybersecurity awareness
  2. Work with industry, law enforcement and other stakeholders to ensure that Internet content and services can be trusted, and parents and children can make informed choices
  3. Cooperate in fighting online child sexual exploitation and abuse

The EU and U.S. have historically worked together to combat cybercrime and have established an EU-U.S. Working Group on Cybersecurity and Cybercrime, and it is this existing collaboration that has led to this “key milestone.”

Rapporteur Jan Philipp Albrecht presents report on the European Commission's proposed Data Protection Regulation

This post was written by Cynthia O'Donoghue.

On January 10, 2013, Jan Philipp Albrecht, the rapporteur to the EU Parliament’s Committee on Civil Liberties, Justice and Home Affairs (“LIBE”), presented his draft report (the “Report”) proposing amendments to the European Commission’s proposed Data Protection Regulation (the “Proposed Regulation”).

Albrecht’s amendments to what was already a complex and prescriptive piece of draft legislation have received mixed reviews from government and industry. The UK recently voiced its criticism of the current proposals, while the European Data Protection Supervisor (EDPS) reacted positively to Albrecht’s report, indicating that it was impressed with the changes made, as they included many of the EDPS and Article 29 Working Party recommendations.

Albrecht has recommended significant alterations to the most contentious provisions, such as the definition of personal data, consent, the rights of access, portability and to be forgotten, and the 24-hour breach notification. Albrecht has sought to simplify the legal framework while also strengthening individuals’ rights.

The definition of personal data would include data that could single a person out, whether from data held alone or when used in “combination with associated data.” The amendments also seek to clarify the uses of pseudonymised data and to create a definition of anonymous data as data that prevents identification of a person, where identification, directly or indirectly, would require a “disproportionate amount of time, expense and effort.”

Albrecht believes consent “is the best way for individuals to gain more control over data processing activities,” and his proposed amendments require consent to be explicit, freely given, specific, informed, and obtained through “clear affirmative action,” since pre-ticked boxes cannot be taken to express free consent.

The right of access would now include the ability to obtain information about profiling and whether a governmental authority had requested data, as well as whether an organisation had complied with that request. The right of portability would be amended to be part of the right of access, so that copies of data are provided in a format that can be migrated to another service.

In relation to the right to be forgotten, Albrecht includes a provision for erasure where there are no legitimate grounds to retain the data. This aims to ensure that companies that have transferred data to third parties without a legitimate legal basis do actually erase the data. Viviane Reding, in a speech at the EC Justice Council meeting in Dublin on 18 January 2013, endorsed this “ambitious and pragmatic” approach as necessary to avoid imposing unreasonable obligations on businesses.

Responding to the perceived short time limit of 24 hours for notifying the National Supervisory Body of personal data breaches initially proposed by the European Commission, Albrecht suggests extending the time frame to 72 hours.

Albrecht also recommends more onerous notification requirements, with data controllers required to use a multi-layered approach including easily understandable, icon-based descriptions for different types of processing.

Albrecht also recommends that organisations’ ability to rely on the legitimate interests basis for processing data be limited to “exceptional circumstances” in which it would be possible for the data controller’s interests to override the fundamental rights and freedoms of data subjects.

Other amendments proposed by Albrecht include changing the criterion for mandatory appointment of a data protection officer (DPO) from having more than 250 employees to processing the data of 500 or more individuals per year. This means that even small companies and start-ups would incur this expense.

In its recent response to the UK Justice Select Committee’s opinion on the data protection framework proposals, the UK Ministry of Justice found mandatory appointment of DPOs unnecessary and suggested that data controllers should be encouraged to appoint DPOs “if they were felt necessary to ensure compliance with the proposed Regulation.” Both the UK Ministry of Justice and the UK Justice Select Committee have been highly critical of the proposed Regulation, finding it overly prescriptive and likely to increase costs to the UK economy by between £100 million and £360 million per annum; the UK Government is likely to view Albrecht’s amendments even more harshly, since the UK would like to see the draft Regulation re-cast as a Directive to allow Member States a degree of flexibility.

The Irish government, which currently holds the EU presidency, also expressed concern at a Justice Council meeting in Dublin, suggesting that the household exemption (which exempts individuals processing data as part of a purely personal activity) and the right to be forgotten are unrealistic. While the Irish have previously said that the proposed Regulation is a priority they would like to see passed during their term of the EU presidency, the draft Regulation is continuing to prove highly contentious, and any effort to further constrain business is likely to meet with resistance from some Member States as well as industry.
 

The European Network and Information Security Agency (ENISA) publishes report on the 'Right To Be Forgotten'

This post was written by Cynthia O'Donoghue.

The "right to be forgotten" as contained in the EU Commission’s Proposed Data Protection Regulation (Proposed Regulation), enhances the existing right to data erasure obligation by including an obligation on data controllers that have personal data public, to inform third parties on the data subject's request to erase any links to, or copy or replicate personal data the individual no longer wishes to be public, from online services. How this new right may be implemented is far from straightforward, and the European Network and Information Security Agency (ENISA) has exposed many of the technical difficulties of its implementation in a report, “The right to be forgotten – between expectations and practice.”

A fundamental concern raised by ENISA is the broad scope of the definition of personal data. In addition, ENISA warns that the draft regulation is not specific enough with regard to who has the right to request the deletion of data. This can become complex in certain circumstances, especially in the context of multiple data subjects with divergent viewpoints on deletion. Although difficult to administer, according to ENISA, there is an obvious need to establish who gets to decide in these situations.

ENISA also finds the definition of "forgotten" data problematic, asking whether it is enough simply to make the data inaccessible to the public or whether absolute deletion is required. Concerns are raised about the complexities involved in deleting personal data from large data sets, or “Big Data,” especially where it may be possible to re-identify individuals from information held in those data sets. ENISA also points out that research, which depends on aggregated and derived forms of information (e.g., statistics), could be undermined if elements of the raw data from which a data set is derived are forgotten.

Because of the Internet’s openly accessible nature, once information is published it becomes impossible to prevent unauthorised copying of the information, making it difficult, if not impossible, to locate all copies of it. Enforcement of the right to be forgotten solely through technical means or through requests to "take down" information, is therefore unlikely to be feasible. ENISA suggests that technical enforcement would need to be supplemented by international legal provisions aimed at making it difficult to find personal data, for instance, by requiring search engines to filter references to forgotten data from their search results.

Although ENISA steers clear of opining on the merits of a right to be forgotten, the report demonstrates that reliance on technical means to comply with the right, should it be implemented, requires a clearer definition of the scope of personal data, clarification of who has the right to request deletion, and of the circumstances in which, and the methods by which, data can be considered "forgotten." The ENISA report shows that a technical solution by itself is impossible, and that further refinement by policymakers and data protection authorities is required if the right to be forgotten is to operate effectively.
 

The Council of the European Union issues suggested amendments to the Proposed EU Data Protection Regulation

This post was written by Cynthia O'Donoghue.

The Council of the European Union has published a new review detailing comments on the draft proposal for a General Data Protection Regulation (“Draft Regulation” or “Regulation”). Building on comments made in the DAPIX document, the review contains comments from each EU Member State with suggested changes to the first and second chapters of the Draft Regulation.

Most of the Member States commented on the excessive number of delegated acts which allow the European Commission considerable powers of discretion, and many sought to delete some, if not all, of those delegated acts. Other general comments focused on the territorial and material scope of the Regulation and the fact that some Member States would have preferred a directive (requiring national implementing legislation) to a regulation (direct effect).

Many of the Member States highlighted similar issues with Articles 1-10 of the Draft Regulation, namely:

  • In relation to Article 1 (subject matter and objectives), revision or deletion of paragraph 3 was suggested, as its impact could reduce the scope of the right to protection of personal data.
  • Frequent comments on Article 2 (material scope) were that the exemption in paragraph 2(2)(d) does not take into account the ECJ judgment in Lindqvist, where data is made available on the Internet; that the exemption under 2(2)(b) for EU institutions, bodies, offices and agencies should be deleted; and that the concept of ‘national security’ in 2(2)(a) is too vague.
  • The extension of jurisdiction in Article 3 (territorial scope) outside of the EU was considered unworkable and potentially unenforceable.
  • In relation to Article 4 (definitions), a change to the definition of ‘personal data’ to include anonymised and/or pseudonymised data, and the view that the definitions of genetic data, biometric data and data concerning health were all too wide.
  • Article 5 (principles relating to personal data processing) should give greater consideration to the use of pseudonymised data, and paragraph (f) was considered too general and imprecise, creating excessive liability for a data processor.
  • Most Member States had drafting issues or additions to Article 6 (lawfulness of processing), including in relation to processing for ‘legitimate interests’.
  • Member States welcomed the provision that the controller shall bear the burden of proof for obtaining the data subject’s consent in Article 7 (Conditions for consent), although some questioned the form of consent required.
  • Almost all of the Member States had issues with the intended scope of Article 8 (Processing of child personal data), questioning how controllers are supposed to identify and verify the age of children online, and noting that the provision may interfere with national age limits and systems.
  • In relation to Article 9 (processing special categories of data), Member States questioned whether consent should be required in all cases and whether the term ‘beliefs’ was too wide.
  • Most Member States had reservations around the wording of Article 10 (Processing not allowing identification), either querying its necessity or questioning its meaning and opting for its deletion.

The document will be discussed in greater detail in our upcoming Client Alert. We will issue Part 2 when the next stage of the review is published.

US wades into debate on revision to EU Data Protection Directive

This post was written by Cynthia O'Donoghue and Nick Tyler.

The U.S. Federal Trade Commission (FTC) has waded into the political debate with an Informal Note on the draft EU Data Protection Regulation as reported by Statewatch. In addition, Digital Civil Rights in Europe has reported that the U.S. Department of Commerce engaged in significant lobbying of the European Commission in response to the leaked draft Regulation.

The FTC’s Informal Note, provided to the EC in December 2011, focused on “two overarching concerns”:

  • “potential adverse effect on the global interoperability of privacy frameworks” – resulting in divergence rather than convergence of data privacy standards globally; and
  • “serious implications for regulatory enforcement activities involving third countries” such as the U.S. – resulting in EU data protection laws presenting a significant obstacle to international enforcement cooperation.

In both respects, the Informal Note portrays the draft Regulation as a backward step that would have an adverse effect on the global interoperability of privacy regimes by increasing differences rather than promoting convergence. The FTC also raised concerns about the draft Regulation’s potential to adversely impact international investigations, hinder information sharing between regulatory agencies, and undercut enforcement cooperation between EU data protection authorities and similar privacy enforcement agencies around the world.

In doing so, the FTC’s Informal Note echoes many of the issues highlighted in our two blogs and Client Alert following the leak of the draft Regulation. In particular, it addresses the following themes:

  • Data breach notification – criticising the Regulation’s “focus on process, instead of on improving security practices”, the note concludes that this “may…dilute the effectiveness and credibility of all such notices.” This echoes a concern first raised by the UK Information Commissioner’s Office during the IAPP Summit in November 2011, relating to notification of all data breaches regardless of seriousness or number of persons affected.
  • The “right to be forgotten” – the FTC’s concern relates to a chilling effect on rights to free speech and intimates that a right to be forgotten is little more than a pipe-dream fraught with legal and practical obstacles that render it unfeasible. Basically, the ubiquity of the Internet means that the cat’s out of the bag and any attempt to put it back is doomed to fail.
  • The definition of “child” – the EU’s definition of a child as anyone under the age of 18 runs counter to the U.S.’s longstanding regulation of children’s privacy (defined as under-13 in the Children’s Online Privacy Protection Act (COPPA)). The FTC refers the EC to its recent review of the COPPA Rule,1 suggesting it take a more modern and less paternalistic view by recognising:

“…it would be difficult to require parental permission for teenagers because they’re independent, more sophisticated with new technologies than their parents are, and have access to computers outside the home, particularly with the increasing proliferation of mobile devices.”

  • Transfers to third countries – criticising the increased complexity in determining adequacy for transferring data outside the EU, the FTC believes that the draft Regulation only makes the process more burdensome, opaque and indeterminate, rather than achieving the EC’s stated objective of clarifying it. There is undoubtedly a degree of self-interest in the FTC’s alarm at the possibility that a U.S. Safe Harbor certification may no longer be recognised (at least in its current form) as a lawful basis for transfers of personal information from the EU to the U.S., as we previously highlighted. The prospect that present lawful trans-border dataflow mechanisms will need to be replaced by new or re-vamped versions, including through the use of binding corporate rules, will alarm every U.S. organisation that has invested significantly in putting legal mechanisms in place to transfer data from the EU to the U.S.
  • International Investigations – the FTC raises concerns about the effect on international regulatory enforcement, effectively calling the draft Regulation a ‘blocking statute’, because data controllers will have to notify and receive prior authorisation from a data protection authority before disclosing personal data to any non-EU governmental or regulatory authorities or private litigants outside the EU. The FTC highlights the conflicts as well as perils such provisions will create for U.S. companies with a presence in the EU, especially if an investigation relates to anti-competitive activities, financial or consumer fraud. The FTC suggests that the draft Regulation incentivises “offshoring” evidence, resulting in untimely delays and potentially damaging the interests of consumers, including in the EU.

The FTC’s Informal Note, along with other voices loudly debating the draft Regulation, advocates a more balanced and proportional approach to privacy and data protection. 

Whether this U.S. intervention will contribute to a delay in the EC publishing the draft Regulation, or whether, as recently restated by Ms. Reding’s office, publication will still take place on Data Protection Day, 28 January, we don’t have long to find out.



1 COPPA Rule Review Request for Comment, Fed. Reg. Vol. 76, No. 187, Sept 27 2011 at 5905, available at: http://www.ftc.gov/os/2011/09/110915coppa.pdf.