Italy Releases Draft Declaration of Internet Rights

This post was written by Cynthia O’Donoghue.

Italy’s Chamber of Deputies has proposed a ‘Draft Declaration of Internet Rights’ (Declaration), acknowledging the ways in which the internet has changed interactions and erased borders, while noting that the EU’s protection of personal data is a necessary reference for governing the operation of the internet. The Declaration is open to public consultation until 27 February 2015.

The aim of the Declaration is to establish some general principles to be implemented by national legislation. It consists of a preamble and 14 articles covering topics including the fundamental right to internet access, net neutrality and the right to be forgotten.

In particular, there is strong emphasis on the protection of the individual from widespread monitoring. Article 9 of the Declaration, for example, states that restrictions on anonymous communications "may be imposed only when based on the need to safeguard the public interest and are necessary, proportionate, and grounded in law and in accordance with the basic principles of a democratic society."

This publication is not the first of its kind and follows the German Bundestag committee work on the ‘Digital Agenda’, France’s parliamentary committee report on Rights and Liberties in the Digital Age, and Brazil’s Marco Civil.

The Declaration has received a mixed response, including from Italy’s Data Protection Commissioner, who expressed some concern about the rights to be anonymous and to be forgotten (Articles 9 and 10). A particular concern about the right to be forgotten relates to expanding the scope of the right to permit court appeals of decisions on search engine de-listings where there is a public interest in preserving the information. In principle this sounds like a promotion of freedom of speech, but it could have the opposite effect by focusing undue attention on the individual requesting de-listing.

As a Declaration, it will not become binding even once finalised at the end of the public consultation period; however, it will form a statement of principles on internet governance and the rights of individuals.

Update on State Attorneys General: Connecticut Creates a Permanent Privacy Department; NAAG Covers Big Data, Cybersecurity, and Cloud Computing; and States Amend Breach Laws

This post was written by Divonne Smoyer and Christine N. Czuprynski.

The federal government may be pushing a cybersecurity and data privacy agenda, but that doesn’t mean that the states are taking a back seat. The state attorneys general are maintaining their focus on issues relating to privacy and data security and expanding the scope of that focus to address the ever-evolving nature of those issues.

On March 11, 2015, Connecticut Attorney General George Jepsen announced the creation of the Privacy and Data Security Department in his office that will be tasked with privacy and data security investigations and litigation. The attorney general, who created a privacy task force four years ago, hopes that the creation of this specialized department will solidify Connecticut’s role as a leader in this space. The attorney general is making the shift from a task force to a permanent department because the need for such a focus has not let up in the last four years, and shows no signs of doing so.

Privacy and data security are on the minds of the attorneys general as they come out of their most recent National Association of Attorneys General (NAAG) meetings and head into spring. The NAAG Southern Region Meeting, which concluded March 13, 2015, covered “Big Data – Challenges and Opportunities,” and included panels on data breach, cybersecurity, cloud computing and the proposal for a national data breach notification law.

NAAG President, Mississippi Attorney General Jim Hood, whose presidential initiative for the 2014-15 year is “Protecting Our Digital Lives: New Challenges for Attorneys General,” will host the presidential initiative summit in mid-April in Biloxi, Mississippi. On the summit agenda: intellectual property theft, cloud computing, and digital currency.

In addition, state attorneys general are seeking to revise and expand upon existing data breach and privacy legislation. We have previously discussed the changes being considered in New York and Oregon. The Washington Attorney General is also pushing for changes to that state’s data breach notification law. Regulated entities can expect to continue to see a lot of action from the states on these issues. 

Ofgem's Smart Meter Network Decision: UK gas and electricity consumer privacy gets broader protection

This post was written by Kate Brimsted and Cynthia O'Donoghue.

In February 2015, Ofgem (the UK’s Office of Gas and Electricity Markets) published its Decision on Extending the Smart Meter Framework to Remote Meters (the Decision). This confirms that, following a public consultation, the privacy requirements embedded in the supplier licence terms, which apply to suppliers’ use of customer data from “smart meters”, will also apply to a wider class of meters.

Ofgem is a non-ministerial government department and an independent National Regulatory Authority, recognised by EU Directives. The UK Department of Energy and Climate Change (the DECC) is leading the implementation of “smart metering”; gas and electricity suppliers are required to roll out around 53 million smart meters, affecting every home and smaller business in Great Britain. The rollout is scheduled to be completed by 2020. Smart meters are expected to bring significant benefits. Consumers will have more information about their energy consumption, which should help them manage their usage more effectively. There will be improved and more accurate billing, easier and quicker switching between different methods of payment, and a wider range of payment options, including Internet-based prepayment top-up. Smart meters should also help to reduce costs for the industry and, ultimately, consumers.

However, smart meters can store much more detailed energy consumption data than traditional meters, and are capable of being read remotely. The DECC therefore originally introduced a regulatory framework for data access and privacy specifically for smart meters, including new supplier licence obligations (the Privacy Requirements), as well as obligations in the Smart Energy Code to complement the Data Protection Act 1998 and to ensure that consumers have control over the use of consumption data from their meters.

The Privacy Requirements require suppliers:

  • For domestic consumers, to get opt-in consent to obtain and use data at greater detail than daily reads, or to use any detail of consumption data for marketing
  • For domestic consumers, to get opt-out consent for access to consumption data up to daily detail (the supplier is required to notify the consumer of the data it plans to take and must not take the data if the consumer so requests)
  • For micro business consumers, to get opt-out consent for access to consumption data at greater detail than monthly

Along with “smart meters”, there is a range of meters with similar functionality, such as “smart-type”, “advanced domestic”, “advanced” and “AMR” meters; Ofgem refers to these collectively as Remote Access Meters – i.e., any meter that is not a smart meter but is able to send consumption data remotely to the supplier, either on its own or with an ancillary device. Ofgem’s Decision confirms that the Privacy Requirements will also apply to Remote Access Meters in the future, regardless of when they were installed.

Article 29 Working Party issues its Cookie Sweep Combined Analysis – Report

This post was written by Cynthia O’Donoghue and Katalina Bateman.

On 3 February, the Article 29 Data Protection Working Party published its ‘Cookie Sweep Combined Analysis – Report’. The sweep was undertaken by the WP29 in partnership with eight of the European data protection regulators, including the UK’s ICO, France’s CNIL and Spain’s AEPD, in order to assess the current steps taken by website operators to ensure compliance with Article 5(3) of Directive 2002/58/EC, as amended by 2009/136/EC. The Report details the results of their assessment of the extent of the use of cookies, the level of information provided, and a review of control mechanisms in place.

The Report examines 250 websites which were selected as among the most frequently visited by individuals within each member state taking part in the sweep. Media, e-commerce and the public sector were chosen as target sectors, which were those considered by the WP29 to present the ‘greatest data protection and privacy risks to EU citizens’.

Highlights of the assessment include:

  • High numbers of cookies are being placed by websites. Media websites place an average of 50 cookies during a visitor’s first visit.
  • Expiry dates of cookies are often excessive. Three cookies in the sweep had been set with an expiry date of 31 December 9999, nearly 8,000 years in the future. Excluding such long-duration outliers, the average duration was between one and two years.
  • 26% of sites examined provide no notification that cookies are being used. Of those that did provide a notification, 50% merely inform users that cookies are in use without requesting consent.
  • 16% of sites give users a granular level of control to accept a subset of cookies, with the majority of sites relying on browser settings or a link to a third-party opt-out tool.

Since publishing the Report, the WP29 has made it clear in a Press Release that the results of the sweep “will be considered at a national level for potential enforcement action”. While the UK’s ICO has already stated that it intends to write to those organisations that are still failing to provide basic cookie information on their websites before considering whether further action is required, other European regulators have yet to comment on what actions they have planned.

China's State Administration for Industry and Commerce Releases Measures Defining Consumer Personal Information

This post was written by Cynthia O'Donoghue and Zack Dong.

In January, China’s State Administration for Industry and Commerce (SAIC) released its ‘Measures on Penalties for Infringing Upon the Rights and Interests of Consumers’ (Measures) which are due to take effect March 15, 2015.

These Measures flesh out China’s Consumer Rights Protection Law (CRPL), which was amended in March 2014, and provide guidance as to how companies may collect, use and protect the personal information of consumers.

The Measures helpfully define “consumer personal information”, which the amendments to the CRPL had failed to do, as “information collected by an enterprise operator during the sale of products or provision of services, that can, singly or in combination with other information, identify a consumer.”

Examples of consumer personal information provide additional clarity, such as information which refers to a consumer’s name, gender, occupation, birth date, identification card number, residential address, contact information, income and financial status, health status, and consumer status. This definition is a welcome addition in the midst of China’s patchwork of privacy rules and regulations.

Violations of the Measures may result in significant penalties. The Measures state that the SAIC and its local Administrations of Industry and Commerce may impose a fine of up to RMB 500,000 if there are no illegal earnings. In the event that there are illegal earnings, however, they may issue fines of up to 10 times the amount of the illegal earnings and confiscate all illegal earnings.

It is hoped that these new Measures (in combination with the CRPL) will help to repair consumer trust in Chinese companies, and protect against the improper use, disclosure and sale of consumers’ personal information in the country.

EU Art. 29 Working Party Letter on Health Data and Apps

This post was written by Cynthia O'Donoghue.

The EU Article 29 Working Party (“WP29”) has published a letter to the European Commission (“EC”) on the scope of health data in relation to lifestyle and well-being apps, following the EC’s Working Document on mHealth and the outcome of its public consultation, which generated interest in strong privacy and security tools, and strengthened enforcement of data protection.

In the letter, WP29 addresses the exceptions to processing health data for historical, statistical or scientific research, and requests that the EC ensure that any secondary processing of health data only be permitted after having obtained explicit consent from individuals.

The Annex to the letter acknowledges that determining the scope of health data is particularly complex and can have a wider interpretation depending on context, and is likely to capture apps measuring blood pressure or heart rate – exactly the types of apps that are already widely available.

The Annex makes recommendations for those gray areas where it is not always clear whether personal data is medical data, and gives examples of possible indicators to consider, such as the intended use of the data and, over time, if it is combined with other data, would it be possible to create a profile about the health of an individual, such as risks related to illness, weight gain or loss and the consequential health issues that may arise, or an indication of heart disease. To be considered ‘medical data’, the WP29 states that there has to be a relationship between the raw data set collected through the app and the ability to determine a health aspect of a person, either from the raw data itself or when that raw data is combined with other data (irrespective of whether these conclusions are accurate or not).

Finally, WP29 suggests that the data protection exception relating to further processing of health data for historical, statistical and scientific purposes should be limited to research that serves high public interests, cannot otherwise be carried out or where other safeguards apply, and where individuals may opt out.

The view of the WP29 is likely to capture most existing apps relating to well-being, which until now many organizations may have considered to be outside the scope of the additional protections afforded to sensitive data.

Google signs UK Undertaking to Improve its Privacy Policy

This post was written by Cynthia O'Donoghue.

On 30 January 2015, Google signed an Undertaking with the Information Commissioner’s Office (ICO) to improve and amend the Privacy Policy it adopted 1 March 2012.

Among other things, the modifications to the Privacy Policy allowed Google to combine personal data across all services and products. For example, personal data collected through YouTube could now be combined with personal data collected through Google Search.

The Undertaking requires Google to address three of the ICO’s particular concerns: (1) the lack of easily accessible information describing the ways in which service users’ personal data is processed by Google; (2) vague descriptions of the purposes for which the personal data is processed; and (3) insufficient explanations of technical terms for service users.

In order to address these issues, Google states in Annex 1 of the Undertaking that it will, inter alia: enhance the accessibility of its Privacy Policy to ensure that users can easily find information about its privacy practices; provide clear, unambiguous and comprehensive information regarding data processing, including an exhaustive list of the types of data processed by Google and the purposes for which data is processed; and revise its Privacy Policy to avoid indistinct language where possible.

Google has a period of two years in which to implement these changes, and it must provide a report to the ICO by August 2015, specifying the steps Google has taken in response to the commitments set out in the Undertaking.

The ICO’s measures in response to Google’s breach of national data protection laws are much lighter than those taken by other EU Member States. The data protection authorities in France (CNIL) and Spain (AEPD) have imposed fines of €150,000 and €900,000 respectively. Currently, the Dutch data protection authority is threatening Google with a €15 million fine (see our previous blog).

New Data Protection Laws in Africa

This post was written by Cynthia O'Donoghue.

In recent years, the number of African countries which have enacted privacy frameworks or are planning data protection laws has vastly increased.

Currently, 14 African countries have privacy framework laws and some form of data protection authority in place. Once the African Union Convention on Cyber Security and Personal Data Protection (Convention) is ratified across the continent, many other nations will likely enact personal data protection laws.

Currently, seven African countries have data protection bills pending: Kenya, Madagascar, Mali, Niger, Nigeria, Tanzania, and Uganda. Many analysts believe that the Convention seeks to replicate the European Union data protection model, whereby each country has its own national data protection laws and authority.

Despite these developments, the Convention still leaves many important areas without guidance. For instance, it fails to define what is meant by “consent” and “personal data”, or the “legitimate grounds” individuals can raise to object to the processing of their information.

The international human rights advocacy group, Access, welcomes these changes, but stresses that “change won’t happen overnight”, and that “it will likely be a few years” before countries enact laws to implement the Convention.

FAA Takes One Small Step Toward Legalizing Commercial Use of Small Unmanned Aircraft Systems, a.k.a Drones

This post was written by Patrick Bradley, Mark Melodia and Paul Bond.

The Federal Aviation Administration (FAA) has long been studying the promise and perils of small unmanned aircraft systems (“UAS”), a.k.a drones. The commercial potential of UAS technology is clear. Businesses are eager to use UAS to do everything from covering traffic accidents to taking real estate and wedding photos to delivering small parcels. However, the FAA currently prohibits any commercial or business use of UAS, unless the operator obtains specific permission from the FAA. Permission is only granted on a case-by-case basis, greatly restricting businesses from adopting UAS.

This framework remains in place, but the FAA has now issued a Notice of Proposed Rulemaking (NPRM). If adopted, the proposed rules would provide some rules of the sky for UAS and real regulatory relief to businesses. However, estimates are that adoption of even these first-step rules may be as far as two years out.

The FAA’s proposed rule would set forth several requirements as to operator certification, airworthiness, registration, and operation. As to operation, potentially significant restrictions include a requirement that UAS only operate in the daylight at or below 500 feet above the ground; that the operator maintain a line of sight with the UAS during operation; and that an operator only operate one UAS at a time. Drones would not be allowed to fly over any people not directly involved with the operation of the drone.

Currently, prospective commercial drone operators are required to hold at least a private pilot certificate. That would change under the new rules. Commercial UAS operators will need to pass an initial FAA knowledge test and biennial knowledge exams thereafter. Transportation Security Administration approval will also be required under the rules. Commercial drone operators will not be required to undergo an FAA medical exam.

The FAA rules do not call for the imposition of airworthiness requirements on drones, but drones will be required to be registered with the FAA and will carry N numbers like other aircraft. The pilot will need to perform a preflight inspection before every flight, and accidents must be reported.

The proposed rule would not apply to:

“(1) air carrier operations; (2) external load and towing operations; (3) international operations; (4) foreign-owned aircraft that are ineligible to be registered in the United States; (5) public aircraft; (6) certain model aircraft; and (7) moored balloons, kites, amateur rockets, and unmanned free balloons.”

As to privacy, the FAA notes:

“The FAA also notes that privacy concerns have been raised about unmanned aircraft operations. Although these issues are beyond the scope of this rulemaking… the Department and FAA will participate in the multi-stakeholder engagement process led by the National Telecommunications and Information Administration (NTIA) to assist in this process regarding privacy, accountability, and transparency issues concerning commercial and private UAS use in the NAS. We also note that state law and other legal protections for individual privacy may provide recourse for a person whose privacy may be affected through another person’s use of a UAS.” At the same time that the FAA released this NPRM, the White House issued a Presidential Memorandum to all federal agencies, setting forth administration priorities for the NTIA process and all agency rulemaking.

Comments on the FAA’s NPRM will be open for 60 days after it is published in the Federal Register.

Ofcom Publishes Plan To Support the Internet of Things

This post was written by Cynthia O'Donoghue and Angus Finnegan.

In January, Ofcom, the UK telecommunications regulator, published its Statement on ‘Promoting investment and innovation in the Internet of Things’ (Statement). The Statement acknowledges that the Internet of Things (IoT) has the potential to deliver significant benefits to citizens and consumers. In light of this, Ofcom sought views from its stakeholders on what role Ofcom might play to support the growth and innovation of the IoT.

The Statement identifies four priority areas to help support the growth of the IoT. These include: data privacy, network security and resilience, spectrum availability, and network addresses.

Ofcom identifies data privacy as the ‘greatest single barrier to the development of the IoT’. Respondents were concerned about issues such as lack of trust in sharing personal data on the part of citizens and consumers.

To address such issues, the Statement proposes the implementation of a common framework to allow consumers to easily and transparently authorise the conditions under which data collected by their devices is used and shared with others. The Statement recommends industry-led approaches to keeping consumers in control, agreed internationally where possible.

In order to foster innovation and facilitate progress on these issues at both a national and international level, Ofcom proposes to work closely with government, the Information Commissioner’s Office, other regulators, and industry.

This Statement follows the EU Article 29 Working Party’s Opinion on ‘Recent Developments on the Internet of Things’ which we reported on in January 2015.

German Data Protection Commissioners Take Action Against Safe Harbor

This post was written by Cynthia O’Donoghue, Thomas Fischl & Katharina Weimer.

At the Data Protection Conference in Berlin, the Berlin and Hamburg Data Protection Commissioners (Commissioners) made a number of important announcements regarding the ‘inadequacy’ of the EU/U.S. Safe Harbor Program.

Both Dr. Alexander Dix and Prof. Johannes Caspar, Commissioners for Berlin and Hamburg respectively, asserted that U.S. companies do not protect data to the same level as EU companies do, even when U.S. companies certify that they will adhere to the Safe Harbor provisions. In addition, the Data Protection Authorities (DPAs) stated that there may be inadequate enforcement of the Safe Harbor Program by the Federal Trade Commission. Speaking on behalf of his colleagues from 16 German states, Dr. Dix went as far as to say that:

“The Safe Harbor agreement is practically dead, unless some limits are being placed to the excessive surveillance by intelligence agencies”.

Dr. Dix further announced that the German DPAs in Berlin and Bremen have initiated administrative proceedings against two U.S. companies that base their data transfers on the EU/U.S. Safe Harbor Program. In these proceedings, the German DPAs have expressed their intention to stop data transfers for a limited time. Some commentators have suggested that an actual suspension of data transfers may lead to a court decision denying the supervisory authorities’ competence to suspend data transfers.

Other speakers, such as Paul Nemitz, Director for fundamental rights and union citizenship at the Directorate-General Justice of the European Commission, stressed that “there is an economic incentive to make Safe Harbor work”. However, in order for trans-Atlantic businesses to flourish, organisations need to be more transparent.

In light of these developments, global organisations may wish to consider alternative approaches to the Safe Harbor Program, such as EU Model Clauses, for data transfers from European jurisdictions, such as from Germany to the United States.

Finland Introduces New Information Society Code

This post was written by Cynthia O'Donoghue and Katalina Bateman.

The Information Society Code (2014/917) (Code) – a new act in Finland on electronic communications, privacy, data security, communications, and the information society in general – took effect 1 January.

The Code consolidates 10 existing acts into one, including Finland’s Communications Market Act; Act on the Protection of Privacy in Electronic Communications; Domain Name Act; Act on Radio Frequencies and Telecommunications Equipment; Act on the Measures to Prevent Distribution of Child Pornography; and Act on Television and Radio Operations.

Besides simplifying existing rules and increasing regulatory powers over the information society, the Code makes three significant changes:

  1. Extending Confidentiality Obligations

The Code extends the obligation to protect the confidentiality of communications from traditional telecom companies to all intermediaries of electronic communications services.

Under the changes, social media companies must now ensure that users of their messaging services get the same standards of privacy and security as other, already regulated, sectors, such as telecommunications companies.

  2. Extraterritorial Application

The Code’s scope has been increased, allowing the extraterritorial application of its rules. It now also covers companies based outside the EU that offer services in Finland. The obligation on operators to maintain the information security in connection with their services will apply where (1) an operator is based in Finland; (2) the communications network or other equipment to be used in the business operation are located or maintained in Finland; or (3) the services are offered in Finnish or are otherwise targeting Finland or Finns.

  3. Joint Liability

The Code introduces a new obligation whereby a telecom operator and a service provider can be held jointly liable for a defect in the provision of a service. In a significant push on consumer protection, the Code provides that, where consumers order and pay for products and services via their mobile phone, the telecom operator and the company selling those products or services share accountability.

With its focus on transparency, accountability, and its extraterritorial application, the Code does reflect many aspects of the upcoming EU General Data Protection Regulation, and is a clear enhancement of Finland’s laws on information security.

OECD Releases Guidance for Digital Consumer Products

This post was written by Cynthia O’Donoghue.

The Organisation for Economic Cooperation and Development (OECD) released Consumer Policy Guidance on Intangible Digital Content Products (Guidance) for protecting online consumers of digital content.

With the expansion of the Internet and mobile devices, digital content has grown considerably. The OECD recognizes that this has brought consumers considerable benefits, “including ready access to a wide range of high-quality products, often at reduced costs”. It has also created issues that the OECD believes “countries and business now need to address”.

According to the Guidance, consumers acquiring and using intangible digital content products face several challenges, including, among others: inadequate information disclosure, and misleading or unfair commercial practices.

The Guidance provides recommendations to address six issues concerning:

  • Digital content product access and usage conditions
  • Privacy and security
  • Fraudulent, misleading and unfair commercial practices
  • Children
  • Dispute resolution and redress
  • Digital competence

The recommendations include provisions relating to privacy and security, and address fraudulent, misleading and unfair commercial practices. In particular, the OECD suggests that terms and conditions should be made available to consumers as early as possible in the transaction, and that consumers be provided with clear information about the collection, storage and use of their personal data, including steps consumers can take to manage their data.

In addition, the Guidance addresses children’s advertising and recommends that businesses have mechanisms in place to prevent children from making in-app or digital content purchases without parental consent.

The OECD has also called for effective dispute resolution and redress mechanisms.

Given the growth in the digital market in which businesses now operate, this Guidance calls for governments, businesses and other stakeholders to work collectively to develop education and awareness programs to facilitate consumer use of digital content. Importantly, the OECD acknowledges that protection of consumers and of their personal data should form the core of any legal framework, and should be read in conjunction with the OECD’s Privacy Principles, which tend to form the basis of the data protection and privacy laws in nearly 140 countries.

Dutch Data Protection Authority Threatens Google with a €15 million fine

This post was written by Cynthia O’Donoghue.

The Dutch data protection authority, College Bescherming Persoonsgegevens (CBP), released a cease and desist order requiring Google to pay €60,000 per day, up to a maximum of €15 million, for violating the Dutch data protection law, the Wet bescherming persoonsgegevens (Wbp). Google has until the end of February 2015 to change the way it handles personal data.

The order requires Google to carry out three measures:

  • Ask for “unambiguous consent” before it shares personal data of Internet users with its other services, such as Google Maps and YouTube, the video-sharing site
  • Make it clear to users that Google is the owner of YouTube
  • Amend its privacy policy to clarify what data is collected and how the data is used

While the CBP conceded that Google has “already taken measures in the Netherlands”, CBP Chairman Jacob Kohnstamm commented that “This has been on-going since 2012 and we hope our patience will no longer be tested.” Google has responded that “We are disappointed with the Dutch regulator’s orders, especially as we have already made a number of changes to our privacy policy in response to their concerns”.

The Dutch authority has also recently announced that it will now turn its attention to Facebook’s privacy policy. The order clearly shows that the national data protection authorities are not ready to give up their national jurisdiction and enforcement powers, and that they are individually and increasingly focussing on transparency to users of social media.

EU Art. 29 Working Party Opinion on the Internet of Things

This post was written by Cynthia O’Donoghue.

The EU Article 29 Working Party (WP29) has issued an Opinion on ‘Recent Developments on the Internet of Things’ (Opinion). The Opinion stresses the privacy and security challenges generated by the development of the Internet of Things (IoT), while acknowledging the benefits of IoT to individual lives and the prospect of significant economic growth for EU companies.

The Opinion focuses on innovations such as wearable technology, connected devices in homes, cars and work environments, Quantified Self sensors, and Home Automation or “domotics.”

The WP29 identifies challenges and security risks posed by these devices, and recommends the implementation of Privacy by Design/Default (PbD) techniques and other practical tools aimed at specific industries, such as device manufacturers and application developers, to ensure that their developments safeguard the data subject’s privacy.

The Opinion recommends that organisations placing IoT devices in the marketplace should complete privacy impact assessments, conduct timely deletion of raw data, respect users’ rights to self-determination of their data, and use appropriate privacy information notices to inform and obtain consent.

ISO develops the first privacy-specific cloud standard

This post was written by Cynthia O’Donoghue.

Earlier in 2014, the International Organization for Standardization (ISO) developed a new voluntary standard, ISO 27018 (Standard), establishing commonly accepted control objectives and guidelines to protect personal information in a public cloud computing environment.

The need to create trust in cloud solutions led to the development of the Standard, in accordance with one of the key goals announced in the 2012 European Cloud Computing Strategy. By adopting an appropriate set of standards, cloud service providers that process personal data can give their customers confidence that they meet their own regulatory obligations on data security.

The Standard sets out practical recommendations to help cloud providers protect the personal information they process. Examples include:

  • Confidentiality agreements and training for those with access to personal information
  • Policies for the return, transfer or disposal of personal information at termination
  • Policies that allow the processing of personal information for marketing or advertising purposes only with customer’s express consent
  • Requirements to disclose the names of sub-processors and possible locations where personal information may be processed prior to entering into a cloud services contract
  • Independent security reviews at regular intervals or after significant changes

The Standard could not come at a better time. The Ponemon Institute, which conducts independent research on privacy, data protection and information security policy, revealed the extent of mistrust in cloud, with 72% of EU respondents accusing cloud service providers of failing to comply with data protection regulations. Obtaining the ISO cloud certification could go a long way to restoring trust, and could further facilitate the adoption of cloud computing in all sectors.

UK ICO to endorse privacy seal schemes

This post was written by Cynthia O’Donoghue.

The UK Information Commissioner’s Office (ICO) signalled its commitment to approving third-party “privacy seal” schemes following its recent public consultation. The first UK schemes should be operational by 2016.

The consultation comes in anticipation of the European Commission’s revised data protection framework proposals, which may include provisions intended to encourage the adoption of privacy seals, certification mechanisms and trust marks. It is hoped that privacy certification will establish confidence among consumers around personal data handling.

The ICO has issued a proposed framework and guidelines for organisations wishing to be approved as providers of a privacy seal scheme, including that providers must be an independent body accredited by the UK Accreditation Service.

Privacy seals are already available at the European level, and in France, Germany and Japan.

EU Art. 29 Proposes Class Actions to Enforce Privacy Rights

This post was written by Cynthia O’Donoghue.

This month, the Article 29 Data Protection Working Party (Working Party) and the French Data Protection Authority (CNIL) held the European Data Governance Forum, an international conference focusing on the issues of privacy, innovation and surveillance in Europe. The conference highlighted many of the issues raised in the Joint Statement released by the Working Party in November.

The Joint Statement emphasises the need to address “both the lack of confidence in (foreign or national) governments, intelligence and surveillance services, as well as the underlying problem of how to control access to massive amounts of personal data” in this digital age.

The Working Party proposed a series of principles and actions to create a framework enabling “private companies and other relevant bodies to innovate and offer goods and services that meet consumer demand or public needs, whilst allowing national intelligence services to perform their missions within the applicable law but avoiding a surveillance society”.

Some of the key messages suggested by the Working Party include:

  • Protection of personal data as a fundamental right
  • Strengthening public awareness and individual empowerment to help individuals limit their exposure to excessive surveillance
  • No secret, massive and indiscriminate surveillance

The use of surveillance systems can be seen as privacy-intrusive, whereas establishing an effective privacy framework focused on transparency, accountability and restoring trust, can act as a counterbalance.

Privacy Authorities Urge Mobile Apps to Implement Privacy Policies

This post was written by Cynthia O’Donoghue.

In December, 23 privacy authorities – many of which are members of the Global Privacy Enforcement Network (GPEN) – signed an open letter to the operators of seven app marketplaces, urging them to improve consumers’ access to privacy information on mobile apps.

The letter states that:

  • Mobile apps that collect data in and through mobile devices within an app marketplace store must provide users with privacy practice information (for example, privacy policy links)
  • Privacy policy links must clearly inform users about the collection and use of their data before they download the app
  • Marketplace operators must implement the necessary protections to ensure the privacy practice transparency of apps offered in their stores

This letter comes in light of this year’s privacy sweep which we reported on in September. One observation of particular concern was that 85% of the mobile apps reviewed failed to explain clearly how they were collecting, using and disclosing personal information.

With the proliferation of apps, it is clear that privacy and data protection authorities are keen to ensure that apps provide transparency to consumers, and a good privacy policy may help app developers to stand out from the competition.

ICO Publishes its Report on Big Data and Data Protection

This post was written by Cynthia O’Donoghue.

On 28 July, the ICO released its report ‘Big data and data protection’ (the ‘Report’).

The Report defines ‘Big Data’ and sets out the data protection and privacy issues raised by Big Data, as well as compliance with the UK Data Protection Act 1998 (‘DPA’) in the context of Big Data.

The ICO defines Big Data by reference to the Gartner IT glossary definition, explaining that Big Data involves the processing of personal data of significant volume, variety or velocity.

When announcing publication of the Report, Steve Wood, the ICO’s Head of Policy Delivery, stated that “Big Data can work within the established data protection principles…. The principles are still fit for purpose but organisations need to innovate when applying them”.

Under the DPA 1st Principle (fair and lawful processing), the Report emphasises that the complexity of Big Data analytics should not become an excuse for failing to seek consent where required, and that organisations must process data fairly, particularly where Big Data is used to make decisions affecting individuals. A study by Barocas and Selbst entitled ‘Big Data’s Disparate Impact’ found that Big Data has the “potential to exacerbate inequality”, and use of Big Data that resulted in discrimination would violate the fairness principle.

The Report addresses the significant issue of data collection when using Big Data analytics, and stresses that an organisation must have a clear understanding from the outset of what it intends to do with, or learn from, the data to ensure that the data is relevant and not excessive for the purpose. The Report seeks to address the growing concern that Big Data analytics tends to involve collecting as much data as possible, and emphasises that under the DPA, data minimisation remains an essential element of any Big Data project.

The Report also cautions that organisations seeking to use analytics must ensure against purpose-creep by following the purpose limitation principle to ensure that data collected for one purpose is then not used for another purpose incompatible with the original purpose. With this in mind, the ICO suggests that organisations employ a risk-based approach to identify and mitigate the risks presented by Big Data.

The Report also considers whether the growth of Big Data increases data security threats, highlighting that the European Union Agency for Network and Information Security (‘ENISA’) has identified a number of emerging threats arising from the potential misuse of Big Data by so-called ‘adversaries’. In contrast, the Report also notes evidence that Big Data can be used to improve information security.

To address these concerns, the ICO recommends several ‘tools for compliance’, including:

  • Privacy Impact Assessments (PIAs)
  • Privacy by Design
  • Promoting transparency through Privacy Notices

Big Data is a fast-growing area that offers many opportunities and commercial advantages. It also presents many challenges. As the Report argues, the benefits of Big Data can only be realised by adhering to current DPA Principles and safeguards. Only through compliance will individuals trust organisations and become more open to the use of their data for Big Data analytics.

FTC Workshop on Big Data: Focus on Data Brokers

This post was written by Divonne Smoyer and Christine N. Czuprynski.

On September 15, the Federal Trade Commission held a workshop entitled “Big Data: A Tool for Inclusion or Exclusion?” FTC Commissioner Julie Brill took the opportunity to discuss an industry that she has consistently maintained requires more regulation and scrutiny: data brokers.

Commissioner Brill stressed first that the FTC is very focused on entities regulated by the Fair Credit Reporting Act (FCRA), and reminded the audience that those entities will be held to the law by the agency. Those entities that are not subject to the FCRA are not off the hook: companies that engage in profiling, or “alternative scoring,” should take a very critical look at what they are doing, since alternative scoring has the potential to limit an individual’s access to credit, insurance, and job opportunities. Brill noted that the FTC’s May 2014 report focused on transparency, and called for legislation to make data brokers accountable – thoughts she echoed during Monday’s workshop.

Finally, Commissioner Brill stressed that all companies would be well-advised to examine whether their own big data systems cause problems that ultimately exacerbate existing socioeconomic disparities. She reiterated that companies should use their systems for good, and have a role in spotting and rooting out discrimination and differential impact. You can find the text of her full speech here.

85% of Mobile Apps Marked Down on Transparency: 'Must Try Harder' Say Global Privacy Regulators

This post was written by Cynthia O’Donoghue and Kate Brimsted.

In May this year, members of the Global Privacy Enforcement Network (GPEN) conducted a privacy sweep of 1,200+ mobile apps. The findings are now available (here).

GPEN is an informal network of 27 Data Protection Authorities (“DPAs”) established in 2007. Its members include the UK’s ICO, Australia’s OAIC, and Canada’s OPC.

DPAs from 26 jurisdictions carried out this year’s sweep (an increase of seven jurisdictions compared with the last sweep which we reported on in May 2014). The recent sweep focused on (1) the types of permissions an app was seeking; (2) whether those permissions exceeded what would be expected based on an app of its type; and (3) the level of explanation an app provided as to why the personal information was needed and how it proposed to use it.

The results showed that:

  • 85% of the mobile apps failed to explain clearly how they were collecting, using and disclosing personal information.
  • 59% left users struggling to find basic privacy information.
  • One in three apps appeared to request an excessive number of permissions to access additional personal information.
  • 43% failed to tailor privacy policies for the small screen, e.g., by presenting them in tiny type or requiring users to scroll or click through multiple pages.

In announcing their results, the GPEN made it clear that the sweep was not in itself an investigation. However, the sweep is likely to result in follow-up work, such as outreach to organisations, deeper analysis of app privacy provisions, or enforcement actions.

Privacy shortcomings are not just a regulatory matter; research by the ICO last year suggested that 49% of app users have decided not to download an app because of privacy concerns. In an increasingly crowded app marketplace, good privacy policies may be a valuable way to stand out from the competition.

Microsoft loses third round of battle against extra-territorial warrants

This post was written by Cynthia O’Donoghue, Mark S. Melodia, Paul Bond, and Kate Brimsted.

On 31 July, the chief judge of the Southern District of New York delivered the latest in a series of controversial judgments in a test case brought by Microsoft challenging an extra-territorial warrant issued under the U.S. Stored Communications Act. In the third ruling on the matter, the court found in favour of the U.S. government, upholding the warrant and ordering Microsoft to turn over customer emails stored in a data centre in Ireland. The District Court agreed to stay the order while the decision is appealed further. If Microsoft’s final appeal is dismissed, the case will have significant implications for all U.S. businesses that store customer data overseas. The implications also extend to non-U.S. customers, including companies located within the EEA, that have entered agreements with U.S.-based companies to store their data outside the U.S. In particular, there is concern that foreign companies and consumers will lose trust in the ability of American companies to protect the privacy of their data.

Click here to read the full Client Alert.

Brazilian Data Protection Authority fines Internet Provider $1.59m

This post was written by Cynthia O’Donoghue and Kate Brimsted.

In July, the Brazilian Department of Consumer Protection and Defence (‘DPDC’) fined the telecom provider Oi 3.5 million reals ($1.59 million) for recording and selling its subscribers’ browsing data, in a case based on Brazilian consumer law dating back to 1990.

The DPDC investigated allegations that Oi had entered into an agreement with British online advertising firm Phorm Inc. to develop an Internet activity monitoring program called ‘Navegador’. The investigation confirmed that this program was in use and actively collected the browsing data of Oi’s broadband customers.

The browsing data was collected and stored in a database of user profiles, with the stated purpose of improving the browsing experience. Oi then sold this data to behavioural advertising companies without having obtained the consent of its customers.

The amount of the fine imposed took into account several factors, including the economic benefit to Oi, its financial condition, and the serious nature of the offence. The fine was issued after Oi suspended its use of the Internet activity monitoring software.

Oi denied violating customer privacy and claimed that use of the Internet monitoring program was overseen by government regulators. Phorm Inc. denied that any of the data collected from Oi’s customers was sold, and said that all relevant privacy regulations had been adhered to strictly.

The fine serves as a warning that Brazil will take strong action to enforce its new Internet law.

UK set to implement emergency Data Retention and Investigatory Powers Bill

This post was written by Cynthia O'Donoghue, Angus Finnegan and Kate Brimsted.

In April, the Court of Justice of the European Union (‘Court’) declared Directive 2006/24/EC on the Retention of Data to be invalid, creating uncertainty for telecommunications operators across the region. In a controversial move by the UK Government, the Data Retention and Investigatory Powers Act 2014 (‘Act’) has been passed using emergency procedures.

Formulated in 2006, the Directive aimed to harmonise the laws of Member States in relation to the retention of data. It introduced an obligation on telecommunications operators to retain a wide range of traffic and location data, which could then be accessed by national authorities for the purpose of detecting and investigating serious crime. The Directive was implemented in the UK through the Data Retention (EC Directive) Regulations 2009.

In its judgment, the Court stated that the obligation to retain communications data, and the ability of national authorities to access them, constituted an interference with both Articles 7 and 8 of the Charter of Fundamental Rights. Whilst the Directive pursued an objective of general interest, it was not proportionate or limited to what was strictly necessary. There was concern that the data collected “may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained.”

The Act seeks to maintain the status quo by preempting any legal challenge to the Regulations, and allows the Secretary of State to issue a notice requiring the retention of all data, or specific categories of data, for a period of 12 months. Whilst the effect of the Act is largely similar to its predecessor, the language used is more expansive and appears to be capable of encompassing a broader range of data.

The Act also amends certain provisions of the Regulation of Investigatory Powers Act 2000, allowing for the extra-territoriality of warrants in certain circumstances. This is a major step not only for UK interception powers, but for interception powers globally. Last month, we reported that Microsoft would continue to challenge a U.S. court ruling that effectively allowed an extra-territorial warrant to be issued; it appears that the legal basis for similar powers may now be introduced by the back door in the UK.

It is unclear whether the Act will be a temporary piece of legislation, staying in place until a more permanent solution is implemented at EU level, or whether it will be permanent. However, one positive effect will be that telecommunications operators will know what their retention obligations are. That is not the case in almost all other Member States at present.

Italian Data Protection Authority issues new EU guidelines

This post was written by Cynthia O’Donoghue, Kate Brimsted, and Matthew N. Peters.

In early May the Italian data protection authority (“Garante”) issued “Simplified Arrangements to Provide Information and Obtain Consent Regarding Cookies” (“Guidelines”).  These are intended to provide clarity on the application of Legislative Decree No. 69/2012 (the “2012 Act”), which implemented the EU Cookie Directive in Italy.

The Guidelines synthesize the findings of a public consultation and set out simple methods for informing website users about the use of cookies and procuring their consent.

Key topics include:

i) Distinguishing technical cookies from profiling cookies: technical cookies – which include browsing/session cookies, first-party analytics cookies and functional cookies – only require that users be clearly informed; profiling cookies, which are used to create user profiles and to allow the website operator and any third parties to carry out marketing and promotional activities, require users’ consent.

ii) A ‘double decker’ approach to informing users and obtaining consent: summary cookie information is provided by means of a ‘banner’ on the website landing page, with more detailed information included in a full privacy notice linked from the banner.

iii) Linking to the consent and privacy notices of any third parties that also place cookies on a user’s device, so users remain fully informed and retain their ability to consent.

iv) Implementation and sanctions: Garante has given data controllers one year from the date of publication of the Guidelines to meet these requirements. Failure to do so carries a range of sanctions, including a maximum fine of €300,000 and ‘naming and shaming’.

European Commission releases communication on building a data-driven economy, calling for a rapid conclusion to data-protection reform

This post was written by Cynthia O'Donoghue and Kate Brimsted.

In July, the European Commission (‘Commission’) published a communication titled “Towards a thriving data-driven economy” (‘Communication’), setting out the conditions that it believes are needed to establish a single market for big data and cloud computing. The Communication recognizes that the current legal environment is overly complex, creating “entry barriers to SMEs and [stifling] innovation.” In a press statement, the Commission also called for governments to “embrace the potential of Big Data.”

The Communication follows the European Council’s conclusions of 2013, which identified the digital economy, innovation and services as potential growth areas. The Commission recognizes that for “a new industrial revolution driven by digital data, computation and automation,” the EU needs a data-friendly legal framework and improved infrastructure.

Citing statistics about the amount of data being generated worldwide, the Commission believes that reform of EU data-protection laws and the adoption of Network and Information Security Directive will ensure a “high level of trust fundamental for a thriving data-driven economy.” To this end, the Commission seeks a rapid conclusion to the legislative process.

The Commission’s vision of a data-driven economy is founded on the availability of reliable and interoperable datasets, along with enabling infrastructure, facilitating value creation and the use of Big Data across a range of applications.

To achieve a data-driven economy, coordination among Member States and the EU is necessary. The key framework conditions are digital entrepreneurship, open data incubators, developing a skills base, a data market monitoring tool and the identification of sectorial priorities. Also essential are ensuring the availability of infrastructure for a data-driven economy and addressing regulatory issues relating to consumer and data protection, including data mining and security.

In an atmosphere of increasingly complex regulation anticipated by the Draft Data Protection Regulation and rulings of Europe’s senior courts, a positive slant on the use of data should be refreshing to organisations that depend on it in their operations. The test for the recommendations will be in how the Commission and the EU seek to implement them.

Article 29 Working Party releases opinion on Anonymisation Techniques

This post was written by Kate Brimsted, Katalina Chin, and Tom C. Evans.

In April, the EU’s Article 29 Working Party (Working Party) adopted an opinion on Anonymisation Techniques (Opinion). The Opinion is designed to provide guidance for organisations on the use of common anonymisation techniques, and the risks that can be presented by them.

When data is truly anonymised – so that the original data subject cannot be identified – it falls outside of EU data protection law. The Opinion notes that the re-use of data can be beneficial, providing “clear benefits for society, individuals and organisations”. However, achieving true anonymisation is not easy, and can diminish the usefulness of the data in some circumstances.

The EU regime does not prescribe any particular technique that should be used to anonymise personal data. To guide organisations in designing their own policy on anonymisation, the Opinion examines the two principal forms: (a) randomization and (b) generalization.

In particular, the Opinion looks at the relative strengths and weaknesses of each technique, and the common mistakes and failures that arise in relation to them. Each technique is analysed against three risk criteria:

1. The risk that data identifying an individual could be singled out
2. The ‘linkability’ of two records that relate to an individual
3. Inferences that can be drawn about one set of data based on a second set of data

The Working Party stated that by considering these strengths and weaknesses, organisations will be able to take a risk-based approach to the anonymisation technique used and tailor it to the dataset in question. The Opinion emphasizes that no technique will achieve anonymisation with certainty, and that since the fields of anonymisation and re-identification are actively researched, data controllers should regularly review their policies and the techniques employed.
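To make these criteria concrete, the sketch below applies generalisation to a toy dataset and then checks for records that can still be singled out. It is purely illustrative: the dataset, the field names and the threshold k are invented for demonstration and are not drawn from the Opinion.

```python
# Illustrative sketch: generalise quasi-identifiers, then flag groups that
# remain so small that their members can still be singled out.
from collections import Counter

# Invented sample data (quasi-identifiers plus a sensitive attribute).
records = [
    {"age": 34, "postcode": "EC1A 1BB", "condition": "asthma"},
    {"age": 36, "postcode": "EC1A 4HD", "condition": "diabetes"},
    {"age": 52, "postcode": "SW1A 2AA", "condition": "asthma"},
    {"age": 55, "postcode": "N1 9GU",   "condition": "flu"},
]

def generalise(record):
    """Coarsen quasi-identifiers: bin age into decades, truncate the postcode."""
    return (record["age"] // 10 * 10, record["postcode"].split()[0])

def singling_out_risk(records, k=2):
    """Return the generalised groups with fewer than k members —
    records in such groups can still be singled out."""
    groups = Counter(generalise(r) for r in records)
    return [qid for qid, count in groups.items() if count < k]

print(singling_out_risk(records))  # → [(50, 'SW1A'), (50, 'N1')]
```

Any group returned by `singling_out_risk` contains fewer than k records, meaning those individuals remain distinguishable after generalisation; further coarsening or suppression would be needed before the data could plausibly be treated as anonymised.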

In addition, the Opinion makes clear that pseudonymisation is not a method of anonymisation in itself. Therefore, organisations that use this technique should be aware that the data they process does not fall outside of the EU data protection regime. These comments are significant because the draft EU General Data Protection Regulation contains specific references to pseudonymisation and the circumstances in which the technique can be used.

At the recent IAPP Europe Data Protection Intensive 2014 held in London, Security Engineering Professor Ross Anderson of the University of Cambridge put to the conference audience that anonymisation will never be a completely infallible tool for the security of personal data – a discussion set in the context of secondary uses of medical records. Despite these wider questions on anonymisation being posed by many, the Working Party’s Opinion will at least provide some useful guidance for organisations that have a need to anonymise data.


Privacy Regulators of the World Unite to Conduct "Sweep" of Mobile Apps from 12 May

This post was written by Kate Brimsted, Mark S. Melodia, Daniel Kadar, Paul Bond and Cynthia O’Donoghue.

In the week commencing 12 May, members of the Global Privacy Enforcement Network (GPEN) will conduct an online privacy sweep, focusing on the transparency with which mobile apps collect personal data.

GPEN is an informal network of 27 Data Protection Authorities (“DPAs”) that was established in 2007. Its members include the UK’s ICO, France’s CNIL, Spain’s AEPD, Canada’s OPC and the U.S. FTC.

The network’s tasks are to:

  • Support joint enforcement initiatives and awareness campaigns
  • Work to develop shared enforcement policies
  • Share best practices in addressing cross-border challenges
  • Discuss the practical aspects of privacy law enforcement co-operation

The sweep is part of an effort to ensure that consumers are fully aware of the ways in which apps gather and use personal data. To this end, DPAs will focus on the level of permission requested by apps, the way in which the permission is requested and the purposes for which personal data are used. The DPAs will focus in particular on whether the level of permission requested by the app is what would be expected of an app of its type, or whether it appears excessive.

This is the second time that GPEN has conducted an Internet privacy sweep. In May 2013, DPAs from 19 jurisdictions carried out a sweep of websites and apps and their privacy policies. This looked at: (1) whether there was a privacy policy; (2) whether it was easy to find; and (3) whether it was easy to read and understand. This led to regulators following up with a number of organisations, including insurance companies, financial institutions, and media companies, resulting in some substantial changes being made to their privacy policies.

The results of the 2014 sweep are expected to be published later this year.

French data protection authority ramps up inspections for 2014 - will it be a knock on the door or a "remote audit"?

This post was written by Daniel Kadar and Kate Brimsted.

At the end of April, the French data protection authority (CNIL) released its inspection schedule for 2014 (the Schedule), promising to carry out some 550 inspections over the course of the year.

Approximately 350 inspections are expected to be on-site, a quarter of which will focus on CCTV/video surveillance, and 200 will be carried out using the CNIL’s new powers of online investigation. These powers, introduced in April 2014, enable agents to carry out “remote investigations” into compliance with the French Data Protection Act.

The Schedule sets out six priority areas for inspections in the period, including:

  • Processing personal data by the National Database on Household Credit Repayments
  • Handling data security breaches by electronic communications operators
  • Collecting and using personal data, including sensitive data, by social networks, online dating providers and third-party applications linked to social networks
  • Processing personal data by the government’s system for the payment and collection of income tax
  • Processing personal data by online payment systems
  • Processing personal data by the National Sex Offenders Register

The CNIL will also continue to participate in the Article 29 Working Party’s effort to harmonise the approach of EU data protection authorities regarding Internet cookie compliance.

The CNIL further renewed its commitment to support international cooperation between data protection authorities, and is set to take part in the 2nd Global Privacy Enforcement Network’s Internet Sweep (Internet audits evaluating how well websites protect the data privacy of their users). 

International cooperation is a hot topic for EU data protection authorities. In anticipation of the General Data Protection Regulation and its proposed introduction of a “one-stop shop” mechanism, regulators across Europe will be looking to plan ahead for the changes to come. The CNIL has also, on behalf of the Article 29 Working Party, been leading the EU data protection enforcement against Google after the implementation of its new platform (also covered by this blog here [dated Jan. 21, 2014]).

Article 29 Working Party adopts opinion on Personal Data Breach Notification

This post was written by Cynthia O'Donoghue.

At the end of March, the EU’s Article 29 Working Party adopted an opinion on Personal Data Breach Notification (the Opinion). The Opinion is designed to help data controllers decide whether they are obliged to notify data subjects when a ‘personal data breach’ has occurred.

A ‘personal data breach’ under Directive 2002/58/EC (the Directive) broadly covers the situation where personal data is compromised because of a security breach, and requires communications service providers (CSPs) to notify their competent national authority. Depending on the consequences of the personal data breach, CSPs may also be under a duty to notify the individual data subjects concerned.

The Opinion contains factual scenarios outlining the process that should be used by CSPs to determine whether, following a personal data breach, individuals affected should be notified. Each scenario is assessed using the following three “classical security criteria”:

  • Availability breach – the accidental or unlawful destruction of data
  • Integrity breach – the alteration of personal data
  • Confidentiality breach – the unauthorized access to or disclosure of personal data
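The three criteria above amount to a simple classification scheme. As a purely illustrative sketch (the function and parameter names below are hypothetical, not taken from the Opinion), an internal breach-assessment helper might tag an incident as follows:

```python
# Illustrative only: a toy classifier for the Opinion's three "classical
# security criteria". Function and parameter names are hypothetical.

def classify_breach(destroyed: bool, altered: bool, disclosed: bool) -> list[str]:
    """Return which of the three breach categories apply to an incident."""
    categories = []
    if destroyed:
        categories.append("availability")     # accidental or unlawful destruction
    if altered:
        categories.append("integrity")        # alteration of personal data
    if disclosed:
        categories.append("confidentiality")  # unauthorised access or disclosure
    return categories

# A single incident can fall into more than one category: a stolen,
# unencrypted backup tape is both an availability and a confidentiality breach.
print(classify_breach(destroyed=True, altered=False, disclosed=True))
```

Whether affected individuals must then be notified depends, as the Opinion's scenarios show, on the consequences of the breach rather than on the category alone.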

The Opinion includes practical guidance for notifying individuals, including where a CSP does not have the contact details of the individuals concerned, or where the compromised data relates to children.  The Opinion also stresses the importance of taking measures to prevent personal data breaches.

Further reform in Australia

This post was written by Cynthia O’Donoghue.

Australia’s privacy protection reform laws came into force in mid-March, making significant changes to the regulation of data. Further reform is now on the horizon, with the Australian Law Reform Commission (the Commission) publishing a discussion paper titled ‘Serious Invasions of Privacy in the Digital Era’ (Discussion Paper).

The Commission is carrying out an inquiry at the request of the Australian government to find “innovative ways the law might prevent or redress serious invasions of privacy.”  Two of the Commission’s proposals are likely to be of particular concern to businesses.

First, the Discussion Paper proposes the introduction of a principle for the deletion of personal data. The principle would differ significantly from the ‘Right to Erasure’, one of the headline provisions contained in the proposed EU General Data Protection Regulation.

The current draft of the EU provision would allow citizens to request the deletion of any personal data held about them, where the data controller has no reason for retaining it. Data controllers would also be required to take reasonable steps to inform any third parties to whom they have passed the data of this request. In contrast, the Australian recommendation on data erasure would apply only to data that the citizen had personally provided to a data controller. The Discussion Paper calls for comments as to whether the data controller should be under a duty to inform third parties of this request.

Second, the Discussion Paper contains a proposal to introduce a new Commonwealth Statute which would apply to all territories in Australia. This statute would provide citizens with the ability to bring a cause of action against any individual or entity that seriously invades their privacy. The action would enable individuals to obtain damages independent of breach of the Australian Privacy Act.

The Commission is scheduled to deliver its final report to the Attorney-General in June 2014.


Safety of US-EU Safe Harbor Given Boost

This post was written by Cynthia O'Donoghue.

Following months of uncertainty about the future of the EU-U.S. Safe Harbor Framework, political leaders from the EU and the United States reiterated their commitment to the regime in a joint statement issued 26 March (the Statement).

EU-U.S. Safe Harbor is designed essentially to transpose EU data protection standards into U.S. practice, so that organisations certified under the program are deemed to adequately protect personal data transferred to them in the United States from the EU.

The future of the Safe Harbor regime was cast into doubt last year, following Edward Snowden’s revelations about the extent of NSA information gathering. In November 2013, the European Commission released a Strategy Paper which noted that “the current implementation of Safe Harbor cannot be maintained.” In particular, the paper pointed to shortcomings in transparency, enforcement and the use of the national security exception.

The situation worsened at the beginning of last month, when the EU Parliament passed a resolution calling for the “immediate suspension” of the Safe Harbor regime on the ground that it provides an insufficient level of protection to EU citizens.

The Statement is the latest development in the saga, with officials pledging to maintain the Safe Harbor framework subject to a commitment to strengthening it “in a comprehensive manner by summer 2014”. This demonstrates a slightly more diplomatic approach, which should be reassuring to businesses that currently rely on the Safe Harbor exception.

The Statement also confirms the commitment of the EU to introducing a new “umbrella agreement” for the transfer and processing of data in the context of police and judicial proceedings. The aim of this agreement is to provide citizens with the same level of protection on both sides of the Atlantic, with judicial redress mechanisms open to EU citizens who are not resident in the United States. Negotiations around this agreement commenced in March 2011, and are still on-going.

Brazil's Internet Bill: Latest Developments

This post was written by Cynthia O’Donoghue.

At the end of March, the Brazilian Chamber of Deputies voted in favour of the Marco Civil da Internet (Internet Bill), bringing the ground-breaking legislation one step closer to enactment. The Internet Bill will now progress to the Senate for approval.

In the wake of Edward Snowden’s revelations about global surveillance programs, the Internet Bill had included a provision that would have required organisations to store all data held on Brazilian citizens within the country’s border. This controversial requirement has been dropped by the Brazilian government in the latest version of the Internet Bill. However, the text voted on by the Chamber of Deputies now provides that organisations will be subject to the laws and courts of Brazil in cases where the information of Brazilian citizens is involved.

The Internet Bill will introduce a variety of measures to govern the use of the Internet, providing citizens with robust rights and imposing strict compliance requirements on organisations. The legislation is the first of its kind, and has been hailed by the Brazilian Minister of Justice as a sign that Brazil is at the forefront of efforts to regulate the web democratically. The most important provisions in the legislation are:

  • A statement of the rights of web users, including freedom of expression and the confidentiality of online communications
  • The enshrinement of “net neutrality”, a principle that prohibits ISPs and governments from making a distinction between different types of data traffic. This will prevent organisations from being able to limit access to different websites based upon subscription plans.
  • Confirmation that ISPs cannot be held liable for content uploaded by third parties using their services unless they refuse to remove such content following a court order

ICO issues updated code of practice on subject access requests

This post was written by Cynthia O'Donoghue.

The UK Information Commissioner’s Office (ICO) has issued an updated code of practice (the Code) on subject access requests, less than a year after releasing its original guidance paper on the topic. The Code is designed to help organisations fulfill their duties under the Data Protection Act 1998 (DPA) and contains guidance in relation to recognising and responding to subject access requests.

The “right of subject access” enables individuals to ask organisations what personal data is held about them. The information may include the source of the personal data, how it is processed and whether it is passed on to any third parties. The DPA also permits individuals to request a copy of the personal data held. Unless an exemption applies, organisations are under a duty to provide this information when requested.

The Code is not legally binding, but it does demonstrate the steps that the ICO considers to be good practice. The ICO also points out that by dealing with subject access requests efficiently an organisation may enhance the level of customer service offered.

The main recommendations of the Code relate to the handling of a subject access request and cover the following issues:

  1. Taking a positive approach to subject access
  2. Finding and retrieving the relevant information
  3. Dealing with subject access requests involving other people’s information
  4. Supplying information to the requester (not just copies)
  5. Exemptions

The Code also contains guidance in relation to “special cases” and enforcement action by the ICO.

UK Information Commissioner's Office and U.S. Federal Trade Commission sign Memorandum of Understanding

This post was written by Cynthia O'Donoghue.

At the beginning of March, the UK Information Commissioner’s Office (ICO) signed a memorandum of understanding (MOU) with the U.S. Federal Trade Commission (FTC) at the IAPP Global Privacy Summit. The memorandum is aimed at increasing cooperation between the agencies, with UK Information Commissioner Graham stating that the arrangement would be “to the benefit of people in the United States and the United Kingdom.”

Whilst the MOU does not create legally binding obligations between the two agencies, it sets out terms for cooperation during investigations and enforcement activities. The FTC and ICO will cooperate on serious violations. The methods for cooperation include:

  • Sharing information, including complaints and personal information
  • A mutual undertaking to provide investigative assistance to the other agency through use of legal powers
  • Coordinating enforcement powers when dealing with cross-border activities arising from an investigation of a breach of either country’s law, where the matter being investigated is the same or substantially similar to practices prohibited by the other country

Measures to encourage cooperation between national regulators have been introduced by several international organisations. For example, in 2010, the Asia-Pacific Economic Cooperation (of which the United States is a member) launched a Cross-border Data Privacy Initiative, recognising that “trusted flows of information are essential to doing business in the global economy.”

The MOU is a joint acknowledgment by the FTC and ICO that consumer protection and data protection require close collaboration, and it serves as a warning to organisations that the agencies will be proactive in carrying out investigations of serious violations of consumer protection and data protection laws.

European Parliament votes in favour of new Data Protection Regulation

This post was written by Cynthia O'Donoghue.

In March, the European Parliament voted overwhelmingly in favour of implementing the draft Data Protection Regulation, making its commitment to reforming the European regime irreversible. In order to become law, the Regulation must now be negotiated and adopted by the Council of Ministers.

Discussions around reform began in January 2012, in recognition of the growing economic significance of personal data. With estimates that by 2020 the personal data of European citizens will be worth nearly €1 trillion annually, it is important that any reform ensures an adequate level of protection for citizens while not overburdening businesses. To this end, Vice-President Viviane Reding has stated that the Regulation will “make life easier for business and strengthen the protection of our citizens.” The Regulation will make four key changes to the data protection regime, which are summarised below:

  1. Equal application in all EU Member States by replacing the “current inconsistent patchwork of national laws”, making compliance easier and cheaper
  2. Creation of a “one-stop shop” allowing organisations to deal with one data protection authority in the Member State where their main EU establishment is located, rather than with authorities across various Member States, reducing the administrative burden on organisations. EU residents may still bring complaints to the authority in their home country.
  3. Application of the Regulation to any organisation that operates within the single market to ensure that businesses are competing equally
  4. A right of EU residents to request that their data be removed where a data controller no longer has a legitimate reason to retain it

The draft Regulation continues to contain robust sanctioning powers with fines of up to 5% of annual worldwide turnover, a significant increase by the European Parliament on the 2% limit that had previously been recommended.

Despite the Parliament’s vote and ringing endorsement for the draft Regulation, the text is still subject to input from the Council of Ministers, who appear to be taking a more pragmatic approach aimed at promoting the EU Digital Agenda and continued growth in the digital marketplace. The next Council meeting is in June, so we may yet see further revisions to the existing draft.

Article 29 Working Party and APEC authorities release "practical tool" to map the requirements of the BCR and CBPR regimes

This post was written by Cynthia O'Donoghue.

At the beginning of March, representatives of the EU Article 29 Working Party and the Asia-Pacific Economic Cooperation (which includes, among others, the United States and the People’s Republic of China) announced the introduction of a new Referential on requirements for binding corporate rules (the Referential).

Both the EU and Asia-Pacific Economic Cooperation (APEC) regimes place restrictions on the transfer of data across borders. Under the EU regime, implementing a set of binding corporate rules (BCR) that have been approved in advance by national authorities will allow a company or group of companies to transfer data outside of the EEA without breaching the EU Data Protection Directive. Under the APEC regime, Cross-Border Privacy Rules (CBPR) serve the same purpose, allowing data to be transferred between participating economies. Both regimes require the rules to be approved in advance by regulators before they can be relied on.

The Referential does not achieve mutual recognition of both the EU and APEC systems, but it is intended to be a “pragmatic checklist for organizations applying for authorization of BCR and/or certification of CBPR”. The Referential acts as a comparison document, setting out a “common block” of elements that are shared by both systems, and “additional blocks” which list their differences. For example, while both systems require appropriate training to be given to employees, the EU regime requires only that this training is given to employees with permanent or regular access to personal data. In contrast, the APEC regime appears to extend to all employees.
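The Referential’s layout can be thought of as a set of shared requirements plus regime-specific additions, with the union representing the checklist for a dual-compliant ruleset. As a rough, hypothetical model (the requirement strings below are paraphrases for illustration, not quotations from the Referential):

```python
# Hypothetical model of the Referential's "common block" / "additional blocks"
# structure; requirement text is paraphrased for illustration only.

COMMON_BLOCK = {
    "internally binding rules",
    "complaint-handling mechanism",
    "appropriate employee training",
}

ADDITIONAL_BLOCKS = {
    # EU binding corporate rules: training targets staff with regular access
    "BCR": {"training for staff with permanent or regular access to personal data"},
    # APEC cross-border privacy rules: training appears to extend to all employees
    "CBPR": {"training for all employees"},
}

def combined_requirements() -> set[str]:
    """Union of both regimes: the most stringent combined position on each point."""
    result = set(COMMON_BLOCK)
    for extras in ADDITIONAL_BLOCKS.values():
        result |= extras
    return result
```

On this view, an organisation aiming for both BCR authorisation and CBPR certification works from the union, adopting the stricter of the two positions wherever they diverge.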

Work on the referential began early in 2013, with Lourdes Yaptinchay stating that cooperation between APEC and the EU “is an important next step towards better protecting personal data and could provide a foundation for more fruitful exchange between companies with a stake in the two regions.”

The comparative nature of the Referential highlights the challenges facing organisations that want to satisfy both the EU and APEC regimes with a single set of rules. Organisations can use the document to identify the most stringent requirement on any given point and draft rules that meet it, easing the approval process under both systems.

Hong Kong's Office of the Privacy Commissioner for Personal Data releases Best Practice Guide on Privacy Management Programmes

This post was written by Cynthia O'Donoghue.

Last month, Hong Kong’s Office of the Privacy Commissioner for Personal Data (OPCP) released a Best Practice Guide on Privacy Management Programmes (PMP) (the Guide). Striking a similar chord to the UK Information Commissioner’s Office in its recently released code of practice on conducting Privacy Impact Assessments, the OPCP notes that although the Personal Data (Privacy) Ordinance (the Ordinance) does not require PMPs, organisations that adopt them are likely to benefit from increased levels of trust among their customers and employees, and will be better placed to demonstrate compliance with the Ordinance.

The Guide does not provide a “one-size-fits-all” solution, and organisations will need to consider their size and nature when developing a PMP. To this end, the Guide addresses both the fundamental components of a PMP and their ongoing assessment and revision.

The Guide notes that implementation of PMPs will require organisations to consider their policies, staff training, and the processes that are followed when contracting with third parties. The Guide states that the key components of a PMP are:

  1. Organisational commitment: this includes buy-in from top management, designating a member of staff to manage the PMP (this could be a full-time member in a large organisation, or a business owner in a small organisation), and establishing reporting lines.
  2. Program controls: an inventory of personal data held by the organisation should be made. Internal policies should also be put in place to address obligations under the Ordinance, with risk-assessment tools to allow new or altered projects to be assessed.

The Guide is a welcome development for Hong Kong organisations, which, by following its terms, will be able to demonstrate their compliance with the Ordinance. However, organisations should also note that the Guide indicates that the OPCP expects organisations to take positive steps towards fulfilling their obligations.

Indian Centre for Internet and Society issues call for comments on draft Privacy (Protection) Bill

This post was written by Cynthia O'Donoghue.

A nonprofit research organisation, the Indian Centre for Internet and Society (ICIS), has issued an open call for comments on its draft Privacy (Protection) Bill 2013 (the Bill). Consultations on the Bill started in April 2013, with a series of seven roundtable talks being held in partnership with the Federation of Indian Chambers of Commerce and Industry, and the Data Security Council of India.

ICIS states that it has “the intention of submitting the Bill to the Department of Personnel and Training as a citizen’s version of privacy legislation for India.” India’s current data protection regime, a product of piecemeal development, imposes very limited duties on organisations that collect, process and store data. No national regulator is in place to oversee the data protection regime.

The draft Bill contains provisions for a new Data Protection Authority of India to be established with wide powers of investigation, review and enforcement. Penalties detailed in the Bill for infringement include terms of imprisonment and fines. In addition, the Bill proposes the introduction of comprehensive regulation, including:

  • Regulation of the collection, processing and storage of data by any person
  • Regulation of the use of voice and data interception by authorities
  • Regulation of the manner in which forms of surveillance not amounting to interceptions of communications may be conducted

If implemented, the draft Bill would be a considerable step forward for the privacy landscape in India, which has so far lacked the impetus provided by international instruments such as the European Data Protection Directive (95/46/EC).

New UK Cyber Security Principles Released

This post was written by Cynthia O'Donoghue.

Back in 2011, the Cabinet Office launched a cyber security strategy outlining steps the UK Government would take to tackle cyber crime by 2015. The National Cyber Security Programme invested £650 million to support the strategy, ‘Protecting and Promoting the UK in a digital world’. Measures proposed by the strategy included:

  • Reviewing existing legislation, e.g., the Computer Misuse Act 1990, to ensure it remains relevant and effective
  • Pioneering a joint public-private sector cyber security hub to allow the exchange of data on cyber threats across sectors and to manage the response to cyber attacks
  • Seeking to agree a voluntary set of guiding principles with Internet Service Providers
  • Developing kite marks for approved cyber security software
  • Encouraging UK courts to enforce sanctions for online offences under the Serious Crime Prevention Order
  • Creating a new national cyber crime capability as part of the National Crime Agency
  • Creating a single reporting system for cyber crime using the Action Fraud portal run by the National Fraud Authority
  • Strengthening the role of Get Safe Online to raise awareness and education about online security

In line with the third proposal of the strategy, the Department for Business, Innovation and Skills has now issued new guiding principles developed and agreed between government and leading Internet Service Providers (ISPs), such as ISPA, BT, Sky, Talk Talk, Vodafone and Virgin Media, to promote cyber security and protect ISP customers from online threats. 

The first section of the principles provides that ISPs must:

  • Increase customer awareness of cyber security issues (including by directing to Get Safe Online and other national campaigns), and educate customers on the basic online threats, how to practise safe online behaviour, and how to spot cyber crime and report through Action Fraud
  • Empower customers to protect themselves from online threats through providing tools such as anti-virus software, anti-spyware, anti-spam, malware protection or firewall protection
  • Provide clear mechanisms to encourage customers to report compromises or threats to minimise the impact of cyber threats

The second section provides that the government must:

  • Continue to make businesses aware of cyber threats and educate them on how to respond through guidance, e.g., the Cyber Security Guidance for Business issued in 2012 and the Small Business Guidance for Cyber Security issued in 2013
  • Advise nationally on improving cyber security, e.g., Get Safe Online
  • Increase enforcement of online threats through the national crime capability of the National Crime Agency

The guidelines conclude by highlighting cyber security issues the government and ISPs will partner to resolve jointly going forward to achieve the aims of the UK cyber security strategy.

Spanish Court Ruling Validates Employee Monitoring

This post was written by Cynthia O'Donoghue.

Spain’s constitutional court, the Tribunal Constitucional, made a landmark ruling in the case of Pérez González v. Alcaliber S.A. in early October, finding that companies are permitted to access and monitor employee communications via company IT resources, including emails and texts, as part of investigations into employee misconduct.

Pérez González was dismissed by Alcaliber for disseminating trade secrets to competitors. Following suspicions of wrongdoing, Alcaliber accessed Pérez González’s company emails and laptop hard drive in the presence of a notary public to confirm grounds for dismissal. Emails from 2007 and 2008 confirmed suspicions that Pérez González had disclosed information about the year’s poppy crops from his company account to a competitor of Alcaliber.

Pérez González challenged the dismissal with a claim for wrongful termination. He disputed the validity of the emails as evidence for his dismissal on the basis of his fundamental right to secrecy of communications under Article 18 of the Spanish Constitution. However, the constitutional court held that Pérez González did not have a reasonable, well-founded expectation of confidentiality when using a company email account or other workplace communications where monitoring is foreseeable. Furthermore, the company collective bargaining agreement clearly prohibited the use of company-owned communications networks for non-work purposes. On this basis, the constitutional court upheld the decisions of the Madrid Labour Court and the High Court of Justice to affirm the dismissal.

The Tribunal Constitucional held that dismissal was not disproportionate in light of the severity of sharing confidential company information. Furthermore, the court ruled that a company must be permitted to monitor employee communications to verify well-founded suspicions of transgression where such monitoring is necessary to provide evidence to justify dismissal.

This ruling recognises that employee privacy rights must be balanced against employers’ rights to investigate employee wrongdoing, and further acknowledges that employees’ rights to privacy in the EU are not absolute.

New Data Protection Law For South Africa

This post was written by Cynthia O'Donoghue.

After 10 years of debate, South Africa’s President Jacob Zuma has finally signed South Africa’s first framework privacy bill into law, the Protection of Personal Information Bill (PoPI). PoPI will reinforce the right to privacy under Article 14 of the South African Constitution. PoPI will take effect one year after the date of enactment, though there is potential for this transition period to increase to three years, depending on discussions between the Minister of Justice and Constitutional Development and the newly established data protection authority (DPA). After this date, the DPA will be empowered by PoPI to impose fines of up to 10 million Rand ($957,171) for non-compliance.

PoPI will provide protection for both individuals and juristic persons, including corporations. The new law will also allow the DPA to file lawsuits on behalf of individuals against data controllers. Data controllers will have to be aware of this strict liability they will bear, and the potential for remedies sought to include aggravated damages.

PoPI is based upon a framework of conditions, including:

  • Accountability – Data controllers will bear ultimate liability and responsibility for compliance with PoPI
  • Processing Limitation – data may only be processed lawfully and in a manner that is not excessive, and with the consent of the data subject (which can be withdrawn at any time). Explicit consent of the data subject is required for processing sensitive data.
  • Purpose Specification – data may only be collected for a specific, explicitly defined and lawful purpose. Any data collected should not be retained any longer than is necessary for achieving that purpose. Explicit consent is required for direct marketing.
  • Further Processing – any further processing must be compatible with the original purpose of collection
  • Information Quality – reasonable steps must be taken to ensure data is complete, accurate, not misleading and updated when necessary
  • Openness – Data controllers must retain open records documenting all processing operations undertaken, and must take reasonable steps to ensure data subjects are informed of the purpose and extent of data collected; the identity of the data controller; whether provision of the information is mandatory or voluntary and the consequences of failure to supply that information; any subsequent disclosure to third parties; and the full extent of rights available to data subjects under PoPI
  • Security Safeguards – Data controllers must take reasonable steps to secure the integrity and confidentiality of personal data in their possession by taking appropriate technical and organisational security measures. Any security compromises must be notified to the DPA and the data subject concerned.
  • Data Subject Participation – A data subject has rights under PoPI to access or correct their personal information held by the data controller

President Jacob Zuma commented, “PoPI will give effect to the right to privacy by introducing measures to ensure that the personal information of an individual is safeguarded when it is processed by responsible parties.”

Canada Supreme Court Declares Alberta Privacy Law Unconstitutional

This post was written by Mark S. Melodia, Cynthia O’Donoghue, and Frederick Lah.

On November 15, 2013, the Supreme Court of Canada declared Alberta’s Personal Information Protection Act (“PIPA”) unconstitutional, holding that an individual’s right to freedom of expression in the labor strike context outweighs the individual’s right to control his or her information in public. The ruling is suspended for 12 months to give Alberta’s legislature time to consider how best to amend PIPA.

This case, Alberta (Information and Privacy Commissioner) v. United Food and Commercial Workers, Local 401, arose after the union recorded and photographed employees of a casino crossing the picket line during lawful picketing activity. The union posted signs near the picket line saying that those who crossed the line would be photographed. Some of the photographs were eventually used on union newsletters and posters. The photographed employees filed complaints with Alberta’s privacy commissioner. An adjudicator appointed by the commissioner determined that PIPA prohibited the union from collecting, using, and disclosing such photographs and recordings without the consent of the employees. The case was reviewed by the appellate courts before eventually making its way to the Supreme Court.

The Supreme Court held in a 9-0 ruling that PIPA violates s. 2(b) of Canada’s Charter of Rights and Freedoms, which guarantees the “freedom of thought, belief, opinion and expression, including freedom of the press and other media of communication.” After determining that dissuading workers from crossing the picket line is protected activity, the court found “without difficulty” that PIPA unconstitutionally restricts such freedoms of expression:

“It goes without saying that by appearing in public, an individual does not automatically forfeit his or her interest in retaining control over the personal information which is thereby exposed. This is especially true given the developments in technology that make it possible for personal information to be recorded with ease, distributed to an almost infinite audience, and stored indefinitely. Nevertheless, PIPA’s restrictions operate in the context of a case like this one to impede the formulation and expression of views on matters of significant public interest and importance.”

The tension between personal privacy and freedom of expression in the labor context is not unique to Canada. In the United States, for example, the National Labor Relations Board has struck down a number of social media policies for making sweeping prohibitions on what employees can and cannot post on their social media accounts. Our Employment Law Watch Blog has previously written about the NLRB’s social media cases. Back in July, the NLRB also held a company’s notice directing employees not to discuss workplace investigations to be in violation of the National Labor Relations Act. The recent Canada Supreme Court case is just the latest example of how this tension can play out in the courtroom.

While it is too early to tell what sort of specific changes to PIPA will result, the changes will likely focus on union activity, perhaps by expressly excluding union activity from the scope of the law. In the meantime, PIPA, as currently drafted, will remain in full force for the next year.

State Attorneys General Maintain Sharp Focus on Privacy

This post was written by Mark S. Melodia and Christine E. Nielsen.

Though the National Association of Attorneys General (NAAG) Presidential Initiative “Privacy in a Digital Age” expired in June 2013 when a new NAAG president took over, the state attorneys general have maintained their sharp focus on all things privacy, with no signs that that focus will shift anytime soon. Most recent case in point: a $17 million settlement with Google related to Google’s use of tracking cookies on Safari browsers. 

On November 18, 37 states and the District of Columbia announced the settlement with Google, which resolves an investigation that began in February 2012. Default settings on Apple’s Safari browser do not allow for tracking across different websites.  The investigation centered on whether Google tricked the browser into allowing such tracking, ostensibly in contradiction to the user’s choice not to be tracked. Google faced similar scrutiny from the FTC, which entered into a $22.5 million settlement with the search engine giant late last year.

In addition to the $17 million payment, the state AG settlement prohibits Google, without the express consent of an individual user, from overriding that user’s Internet browser’s setting to block tracking cookies. Google is also prohibited from misrepresenting the extent to which a user can manage how Google serves advertisements. Google must create and maintain a page that informs users about cookies, Google’s use of cookies, and user control over cookies.  This separate “Cookie Page” must be maintained for five years.

Privacy investigations and enforcement actions are not just handled through the multistate vehicle; individual states are pursuing their own actions, scrutinizing website and mobile app privacy policies, investigating data security breaches, and paying close attention to how entities treat sensitive data like children’s information and health information. For example, California has been particularly active in this area, releasing mobile app best practices guidance earlier this year, which followed on the heels of enforcement actions filed against mobile application developers for alleged non-compliance with California’s privacy policy requirements.
Several states have also flexed their muscles in the health care arena, enforcing data breach notification requirements for the loss of protected health information under the Health Insurance Portability and Accountability Act (HIPAA). Connecticut led the charge in 2010, exercising the new enforcement authority granted to the states under the HITECH Act, with a lawsuit against Health Net. In 2012, both Massachusetts and Minnesota entered the arena with investigations of their own. With this year’s release of final rules under HITECH and a renewed national focus on health care, we wouldn’t be surprised to hear about more states jumping into that privacy arena soon.

Department for Business, Innovation & Skills Publishes Impact Assessment for European Commission Proposed Cybersecurity Directive

This post was written by Cynthia O'Donoghue.

At the end of September, the UK Government Department for Business, Innovation and Skills (BIS) issued an impact assessment (IA) on the draft Network and Information Security Directive (the Directive) proposed by the European Commission on 7 February 2013. The Directive aims to achieve a common high level of network and information security across the EU by resolving the existing discrepancies between national strategies.

To achieve this, the proposed Directive mandates:

  • All Member States must, within one month, establish competent authorities for network and information security and set up national Computer Emergency Response Teams (CERTs)
  • A cooperation network must be set up between competent authorities enabling secure and coordinated information exchange as well as an early warning system to allow effective detection and response in relation to network information security incidents
  • A culture of risk management and information sharing between private and public sectors must be developed
  • A system of reporting to the relevant competent authority of any incidents seriously compromising an entity’s networks and information systems must be established
  • National competent authorities must impose sanctions, initiate audits and publicise incidents

To assess the impact this Directive could have in the UK, BIS initiated a call for evidence on 22 May 2013 to create a baseline. It found that £1.98 billion is currently spent on security annually: large organisations spend £1.45 billion in total, an average of £540,000 each, whilst SMEs account for £533 million in total, an average of £26,000 each. The potential impact of the Directive is estimated as follows:

  • 22,935 businesses in the UK will be affected
  • Additional security spending will amount to between £992.1 million - £1,984.2 million
  • Large organisations will have to increase average spending by an extra £270,000-£540,000, whilst small organisations will have to increase average spending by an additional £13,000-£26,000
  • An overall benefit of £860.6 million is estimated if 5,000-10,000 of the affected UK organisations can achieve benefits of £27,000 by preventing 50% of cybersecurity incidents

Beyond the figures, the IA also highlights some key concerns:

  • Setting a minimum level across the EU could result in a tick-box approach to compliance
  • In many sectors, reporting infrastructures already exist with industry regulators. Additional reporting obligations could lead to duplication of procedure, increasing the administrative burden and diverting resources to compliance.
  • The scope of businesses likely to be affected is overly expansive and could impose disproportionate obligations on small businesses
  • Imposing mandatory reporting obligations as opposed to the voluntary approach could create a compliance culture discouraging information sharing
  • Audits, sanctions and publication of breaches could penalise organisations with strong capabilities for detecting breaches, and disincentivise reporting
  • Establishing a new national competent authority could be costly and unnecessary
  • A pan-European response framework is likely to inhibit and slow down effective national measures for incidents
  • Greater information sharing between national competent authorities poses significant inherent security risks

That said, BIS is hopeful that the Directive will prove flexible in approach and that the UK’s existing voluntary measures and high-level strategy will be deemed sufficient. However, until the scope and thresholds under the Directive are confirmed, it is only possible to speculate how costly its impact could be.

Does SB 568, California's New 'Eraser Button' Law, Apply to You?

This post was written by Lisa B. Kim, Joshua B. Marker, and Paul Cho.

One of the bills we have been following since May has recently cleared the governor’s desk and been signed into law. SB 568, now popularly known as the “Eraser Button” law, adds two significant, privacy-related requirements for operators of websites, online services, and mobile apps directed toward users under the age of 18. More specifically, this law applies to operators whose products and services are directed toward minors (defined as under age 18), and those who have actual knowledge that their products and services are being used by minors. Unless an exception to the new law applies, these operators will be required by law to: (1) notify minors of their right to remove posted content (whether on their own or by the operator upon request), and (2) provide instructions on how to do so.

The new law also prevents certain operators from marketing or advertising prohibited items (alcoholic beverages, firearms or handguns, dangerous fireworks, etc.) to minors. Additionally, operators who fall within the scope of the new law will be prevented from providing the personal information of minors to third parties for the purpose of marketing or advertising prohibited items. Finally, operators with products and services directed toward minors will also be required to provide notice to any third-party advertising services, who will face the same restrictions as to what they can advertise through that particular operator.

The new law applies to operators of websites, online services, and mobile apps everywhere, as long as their website, product, or service is visited or used by a California resident. It goes into effect January 1, 2015. While there is no specific private cause of action or statutory penalty set forth in the bill, violations will likely be enforced in civil lawsuits by the government and private parties under California’s unfair competition law.

Thus, anyone who operates a website or mobile application that reaches California residents should answer the questions listed in the attached flow chart to determine whether this new law applies to them, and to what extent. Also included in the attachment is a list of what needs to be done to comply with the law.

UK Office of Fair Trading Consults on Consumer Protection Principles for Children's Online Games and Apps

This post was written by Cynthia O'Donoghue and Kate Brimsted.

With more than six million apps currently in existence, the ‘appification’ of society is increasingly a topic of discussion, and certainly it was prominent at the 35th International Conference of Data Protection and Privacy Commissioners in Warsaw in September. Apps often collect large amounts of personal data and therefore have significant potential privacy implications. Young children are particularly vulnerable in this respect, as they can be captivated by online and app-based games and less aware of potential risks.

In September, the UK’s Office of Fair Trading (OFT) reported on its investigation into the ways in which online and app-based games encourage children to make purchases. It is now consulting on proposed principles that the online games industry will be expected to adhere to in order to achieve compliance with consumer protection laws, particularly with regard to children, who often constitute the “average consumer” in this context. The consultation closes on 21 November 2013.

The OFT investigation scrutinised the commercial practices of 38 web and app-based games popular with children and identified the following areas of concern:

  • A lack of transparent, accurate, up-front information about costs available prior to the decision to play or download a game
  • Misleading practices, including failing to separately identify commercial intent interspersed in game play
  • Exploitation of children’s inexperience, vulnerability and credulity, including aggressive and manipulative commercial practices
  • Direct exhortations to children to buy advertised products
  • Payments taken from account holders without their knowledge, express authorisation or informed consent

To resolve these concerns, the OFT has proposed the following principles, which are intended to apply industry-wide:

  1. Information about all costs associated with a game should be provided clearly, accurately and prominently up-front
  2. All information about the game necessary for the average consumer to make an informed decision to play, download, sign up or purchase a game should be clear, accurate, prominent and provided up-front 
  3. Information about the business should be clear, accurate, prominent and provided up-front. It should be clear to the consumer whom to contact in the case of queries or complaints and the business should be capable of being contacted rapidly and communicated with in a direct and effective manner
  4. The commercial intent of any in-game promotion of paid-for content or promotion of any other product or services should be clear and distinguishable from game play
  5. A game should not mislead consumers by giving the false impression that payments are required, or are an integral part of the way the game is played, if that is not the case
  6. Games should not include practices that are aggressive, or which otherwise have the potential to exploit a child’s inherent inexperience, vulnerability or credulity. The younger a child is, the greater the impact such practices are likely to have. 
  7. A game should not include direct exhortations to children to make a purchase or persuade others to purchase for them 
  8. Payments should not be taken from the payment account holder unless authorised. Separate informed consent should be required for in-game payments (i.e. payments additional to any one-off payment authorised at the outset to purchase, download or sign up to the game). The amount to be debited should be made clear. Consent should not be assumed, e.g. through opt-out provisions and the consumer should positively indicate his/her informed consent.

The OFT has launched a consultation on the proposed principles. All responses must be received by 5 p.m. on 21 November 2013, by email or by post to: Children’s Online Games Consultation, Office of Fair Trading, Fleetbank House, 2-6 Salisbury Square, London EC4Y 8JX. The final version of the principles is due to be published by February 2014, with a grace period until 1 April 2014, after which enforcement action may be taken against businesses likely to be in breach of consumer protection law.

OECD Publishes Revised Guidelines on the Protection of Privacy and Transborder Flows of Personal Data

This post was written by Cynthia O'Donoghue.

The international free flow of information has become fundamental in a data-driven economy. Yet the increasingly extensive use and movement of personal data creates greater privacy risks for an individual’s digital data trail; and while some 99 countries worldwide have some form of data privacy law, the legal disparities can hinder transborder data flows. Acknowledging the need for a unified standard, the Organisation for Economic Co-Operation and Development (OECD) has published a revised version of its 1980 Guidelines on the ‘protection of privacy and transborder flows of personal data.’

The original guidelines informed and became the basis for many countries' data protection laws, including those in Europe. Fundamentally, the revised version leaves the original privacy principles unchanged, and they remain widely familiar:

  • Fair, lawful and limited collection of personal data obtained with the knowledge and consent of the individual
  • Data should be relevant to the purpose for which it is collected, complete, and kept up to date
  • Use of data for new purposes must be compatible with the original purpose; otherwise, new uses or disclosures require consent
  • Use of reasonable security safeguards to protect data and accountability of any data controller
  • Individual right of access to data held, and the right to have data erased, rectified or amended

Data controller accountability is reinforced in the revised guidelines, regardless of data location, and regardless of whether it remains within their own operations, those of its agents, or is transferred to another data controller. The OECD recommends the use of tailored privacy management programs and privacy impact assessments to manage the risk of data breach. The OECD also encourages contractual provisions requiring compliance with a data controller’s privacy policy, notification protocols in the event of a security breach, and response plans for data breaches and data subject inquiries.

The OECD guidelines suggest that to manage global privacy risks, there must be improved interoperability, with national strategies between states co-ordinated at government level, and cross-border co-operation between privacy enforcement authorities.

Office of the Australian Information Commissioner (OAIC) Publishes Draft Guidelines Interpreting New Privacy Principles

This post was written by Cynthia O'Donoghue.

The Office of the Australian Information Commissioner (OAIC) has published initial draft guidelines that give a good indication of how to interpret the first five of the thirteen Australian Privacy Principles (APPs) that will form the foundation of the Privacy Amendment (Enhancing Privacy Protection) Act 2012, which becomes effective 12 March 2014.

  1. APP 1: Open and Transparent Management of Personal Information
    APP entities will be deemed accountable for taking proactive steps to manage risks to data at every stage from collection, use and storage to destruction. Emphasis is placed on the importance of IT security systems, privacy impact assessments for new projects and procedures for reporting breaches. Also important are easily accessible and up-to-date privacy policies.
  2. APP 2: Anonymity and Pseudonymity
    It is anticipated that individuals will have the right to deal with organisations where they cannot be identified from the data they provide, by opting not to provide personal information, or by providing a different name, term or descriptor. The aim is to give individuals greater control over their personal information and is seen as a method of assisting organisations with reducing their compliance burden. Organisations would need to prominently state when it is not necessary for an individual to provide personal information.
  3. APP 3: Collection of Solicited Personal Information
    APP entities will only be able to solicit information collected from other entities which is reasonably necessary for, or directly related to, the entity’s functions or activities. There will also be an additional obligation to seek explicit direct consent from individuals when soliciting sensitive personal data, except (a) where it is permitted by law, (b) where a permitted general situation exists, (c) where a permitted health situation exists, (d) for an enforcement activity, or (e) by a non-profit organisation.
  4. APP 4: Dealing with Unsolicited Personal Information
    This principle aims to address how organisations should deal with data which they have not actively sought to collect but which falls within their control, such as information received that is surplus to their functions. If the data could not have been collected under APP 3, then it must be either destroyed or de-identified.
  5. APP 5: Notification of the Collection of Personal Information
    Before or at the time of collection of any information, organisations will be expected to ensure that individuals are fully informed as to the APP entity’s identity, the purpose for collection, the consequences if that information is not collected and any intended disclosure. 

Further draft guidelines are expected to be released over the next few weeks and will cover the remaining APPs, which deal with topics including direct marketing, cross-border disclosure of personal information, and data security.

Advertising Industry Group Enters the Mobile Privacy Arena

This post was written by Joshua B. Marker.

The Digital Advertising Alliance released its self-regulatory principles for advertising in the mobile environment yesterday. While the guidance is ostensibly focused on advertising, it incorporates many important privacy principles regarding the collection, use, and sharing of data through mobile applications, and should be reviewed by any company that has a mobile presence.

Click here to read the full write-up on our sister blog, AdLaw By Request.

N.J. Supreme Court Says Warrants Required for Cell Phone Location Data

This post was written by Paul Bond and Frederick Lah.

Last week, the New Jersey Supreme Court held in State v. Earls that New Jersey residents have a constitutional right of privacy in their cell phone location data, and that law enforcement officers must obtain a search warrant in order to access the data. In the case, the police were searching for a suspected burglar and his girlfriend. In that effort, they contacted a cell phone service provider. At three different times that evening, the service provider gave information about the location of the suspected burglar’s cell phone. After the Appellate Division concluded that the defendant lacked a reasonable expectation of privacy in his cell phone location information, the Supreme Court held that the New Jersey Constitution protects an individual’s privacy interest in his or her cell phone, and that the police must obtain a warrant based on probable cause (or must qualify for an exception to the warrant requirement) to obtain location information from a cell phone. The court noted:

“When people make disclosures to phone companies and other providers to use their services, they are not promoting the release of personal information to others. Instead, they can reasonably expect that their personal information will remain private… Today, cell phones can be pinpointed with great precision, but courts are not adept at calculating a person’s legitimate expectation of privacy with mathematical certainty. What is clear is that cell phones are not meant to serve as tracking devices to locate their owners wherever they may be. No one buys a cell phone to share detailed information about their whereabouts with the police.”

If the opinion had found the right to locational privacy from the government under the federal Constitution, Earls could be undermined by subsequent federal court decisions. But the New Jersey Supreme Court has the final say over what New Jersey’s Constitution means. No court in the country can overturn the New Jersey Supreme Court’s decision that there is a state constitutional right against warrantless collection of cell phone location data by law enforcement. By framing the issue in state constitutional terms, Chief Justice Rabner insulated the decision from further review, and settled an issue in New Jersey that is unsettled across the country in the wake of the United States Supreme Court’s decision in the GPS privacy case, US v. Jones.

Earls does not apply directly to anyone but New Jersey state actors. Because the state government can exert so much power over individual citizens, the state is often constrained in ways that private companies are not.

However, the court’s reasoning – that cell phone location data can reveal “which shops, doctors, religious services, and political events [people] go to, and with whom they choose to associate” – is very much in line with the reasoning of those pushing for more legislation or regulation on geolocation data. And with more and more companies collecting location data in the mobile app space, this ruling has the potential to indirectly apply to any company that records location data about New Jersey residents and may be asked for it by law enforcement. Companies need to ensure they are not just handing this data away any time a police officer requests it, even if pursuant to a criminal investigation. Companies should also make sure that they understand the warrant requirement and that it is communicated clearly to all of their employees with access to the data. Otherwise, they run the risk of legal backlash from consumers.

As courts across the country continue to consider how to apply constitutional rights to new forms of technology, we’ll continue to monitor these cases closely.

UK Office of Fair Trading warns online businesses about fair data usage

This post was written by Cynthia O’Donoghue, Edward S. Miller and Marjorie C. Holmes.

Office of Fair Trading (OFT) research into how online businesses use consumers' information to influence prices has raised concerns over how UK companies collect and use consumer data. The report on Personalised Pricing found that many consumers are concerned by the extent of personal information collected and used online. The OFT points out that websites have failed to properly inform customers of what information they gather, how it is used, and how to opt out. The consumer protection watchdog has shared its findings with the Information Commissioner’s Office (ICO), and has vowed to continue monitoring the situation and to take enforcement action if necessary.

The report analysed the types of consumer data used by businesses to personalise prices, in particular addresses, dates of birth, past purchases, and browsing history. While there was no evidence that such data is being used to distort pricing, the OFT sent letters to more than 60 leading online businesses, encouraging them to reconsider their approach to data protection and cookie notices. The letters remind businesses that consumers value their privacy and recommend that the businesses:

  • Provide consumers with accurate, honest and clear details about how the data is used
  • Provide an opt-out in relation to non-essential data collection
  • Understand data usage by third parties
  • Ensure that terms and conditions are fair

The OFT points businesses to ICO guidance and suggests they also align their practices with any relevant trade association code of practice.

While the OFT has not alleged misconduct by the 60 online businesses, it has promised to monitor their practices and enforce consumer legislation if it finds evidence of misleading or unfair practices. The letters highlight that online businesses with inadequate data protection policies run the risk of breaching the Consumer Protection from Unfair Trading Regulations 2008 (“CPRs”). Breach of the CPRs can result in unlimited fines as well as criminal prosecution, whereas the maximum penalty under the Data Protection Act 1998 is £500,000.

The OFT will cooperate with the ICO to investigate, and Simon Entwisle, Director of Operations at the ICO, applauded the OFT for reminding UK companies how to build customer relations through data protection. The ICO’s continuing interest in promoting data protection among online businesses is clear from its involvement in the global investigation of website privacy policy standards, organised by the Global Privacy Enforcement Network (see our related blog).

Japan promotes the use of Big Data

This post was written by Cynthia O’Donoghue and Taisuke Kimoto.

On May 10, the Japanese Government released a report regarding the use of personal information in Big Data applications (available in Japanese). This comes just months after Japan announced plans to provide guidance on data anonymisation as part of the 'Japan Revitalisation Acceleration Programme’ (see our related blog). The report was prepared by the Personal Data Working Group established by the Ministry of Economy, Trade and Industry (Ministry) as part of the IT Integration Forum. The Ministry hopes that this will help Japanese businesses use Big Data to innovate and develop.

Big Data uses vast amounts of data (often personal) to gather valuable information – for example, about customer trends. A summary of the report points out that most Japanese companies’ use of Big Data has been limited when compared with other world markets. Japanese business appears not to have taken advantage of the opportunities created by data such as geolocation, radio frequency identification, web logs, and online targeted advertising. The Working Group suggests that one of the reasons why such development has not been as great as it could be relates to concerns about privacy and data protection.

The Working Group focused on three areas aimed at establishing trust between consumers and business in relation to Big Data, and recommends:

  • User-friendly descriptions
  • Use of business credibility ratings and education for businesses on handling personal data
  • Consumer choice about what information is disclosed

The Working Group criticised the practice of providing a single consent mechanism for all types of personal information. It proposed that businesses provide consumers with a comprehensive list of what personal information will be collected and for what purpose, with consumers able to indicate consent for each of the listed items.

Supreme Court Ruling in Clapper v. Amnesty International: Are Data Breach Class Actions in Danger?

This post was written by Mark S. Melodia and Paul Bond.

In Clapper v. Amnesty International, a group including journalists, human rights activists, and labor leaders challenged the 2008 amendments to the Foreign Intelligence Surveillance Act. The amendments broadened the surveillance powers of the federal government with respect to communications outside the U.S.

Plaintiffs claimed that their work required open communication with persons around the globe and that they had incurred costs to prevent this government surveillance. In a 5-4 decision, the Supreme Court majority (Alito, J.) found that the plaintiffs had no Article III standing to sue.

For a more detailed analysis, please click here to read the issued Client Alert.

Long-Awaited HITECH Final Rule is Here

This post was written by Brad M. Rostolsky, Nancy E. Bonifant, Salvatore G. Rotella, Jr., Elizabeth Doyle O'Brien, Jennifer Pike and Zachary A. Portin.

After much anticipation, the Office for Civil Rights of the United States Department of Health and Human Services published the HITECH Final Rule on January 25, 2013. The final regulation contains substantive and technical modifications and additions to the HIPAA Privacy, Security, Enforcement, and Breach Notification Rules.

For a more detailed analysis, please click here to read the issued Client Alert.

FTC Speaks About Its Mobile Privacy Disclosures Guidance

This post was written by Paul Bond and Frederick Lah.

On February 1, the FTC released its Mobile Privacy Disclosures Guidance (the Guidance) setting forth best practice recommendations for platforms, app developers, third parties such as ad networks and analytics companies, and app trade associations. We previously wrote about the Guidance when it was issued.

On February 15, Assistant Director in the FTC’s Division of Privacy and Identity Protection, Chris Olsen, spoke at the latest National Telecommunications and Information Administration (NTIA) stakeholders’ meeting in Washington, DC about the Guidance. Here are some highlights from the meeting:

  • Olsen started off the meeting by recapping the recent efforts by the FTC in the mobile space.
  • He said that the FTC believes that consumers are not really aware of the types of information collection and sharing practices that are taking place.
  • He described the mobile ecosystem as “complex” and that all the players in the ecosystem need to work together for its improvement.
  • Olsen spoke about the specific roles and responsibilities of all the players in the ecosystem – app platforms, app developers, and ad networks – as outlined in the Guidance (many of which we described in our previous blog article).
  • According to Olsen, the Guidance was designed to do three things – (1) spur members of the ecosystem to take a more active role in addressing the lack of sufficient disclosures; (2) reach as many industry participants as possible in the “diverse marketplace” and educate them on best practices; (3) provide input to industry stakeholders, such as the NTIA, on the development of a code of conduct for the mobile space.
  • One commentator noted that the Guidance sets out what industry participants should be doing, but does not seem to set out what the role of the FTC should be. Olsen responded by saying that, “the FTC needs to do better, too.” He specifically identified enforcement and outreach as areas for improvement.
  • With respect to the Guidance’s recommendations for platforms, Olsen stressed the need for platforms to be clear to consumers about what they’re doing (or not doing) and to oversee and enforce their developer agreements. 
  • Olsen pointed out that the Guidance does not set forth legal requirements and that the FTC did not issue the Guidance with the goal of providing any sort of legal framework. He did note though that Congress is interested in the issue and that they will continue to hold hearings about the state of affairs in the mobile environment, and that the FTC would provide input to Congress if called upon to do so.
  • As for the use of icons in the mobile space, Olsen said that he thinks that an essential element of any icon program must be that it “communicates a clear message” and is not ambiguous. 
  • He also noted that the recent report from the California AG’s office on mobile privacy is largely consistent with the FTC’s Guidance, but noted that the California report appears to cover a larger scope of mobile privacy issues, one not just focused on the issue of disclosures.

Olsen’s comments and the Guidance itself are informative, but it remains to be seen how the players in the ecosystem will respond to the recommended best practices. Another open question is what effect, if any, the Guidance will have on the FTC’s enforcement efforts. We’ll be monitoring this situation closely.

BlackBerry Policing Apps To Ensure Compliance With Privacy Policies

This post was written by Paul Bond, Lisa B. Kim, and Frederick Lah.

In the midst of all the recent attention on mobile apps and their privacy challenges, BlackBerry has unveiled a new “privacy notice” service that alerts customers about apps that “don’t clearly or adequately inform users about how the app is accessing and possibly managing customers’ data.” According to BlackBerry, these notices will “provide information about an application's behavior in order for customers to make an informed decision about whether to continue using the app.” In addition, the notices will provide information to users on how to remove the app.

As an example, BlackBerry issued its first “privacy notice” to NumberBook, a caller ID app. After conducting an investigation into the app, BlackBerry determined that in addition to identifying callers, NumberBook, unbeknownst to the user, was collecting the user’s contact list and GPS location, and had the ability to send text messages from the user’s device. This did not comply with BlackBerry’s privacy and mobile app guidelines because it did not provide sufficient notification to users about what information was uploaded from their device or how it was used or shared with third parties, nor did it seek consent from the user’s contacts before it disclosed their phone numbers to other NumberBook users. So BlackBerry removed the app from its App World store and issued an alert to BlackBerry owners who had previously downloaded the app.

BlackBerry’s new “privacy notice” service offers a technical solution designed to meet the increasing demand from regulators that apps provide better disclosures to consumers about the app’s privacy practices. Other companies, such as Facebook, have opted to respond to this increasing demand from regulators by agreeing to participate in the “Ad Choices” self-regulatory program, which applies to both mobile web and mobile app advertisements. Interestingly, BlackBerry's service was unveiled the same day that the FTC released its latest report on mobile privacy disclosures, which we previously covered here.  As regulatory pressure continues to build from both the FTC and the states, namely California, it will be interesting to see how other app developers and platform providers in the mobile space will respond.

FTC Tries The Carrot and The Stick: Releases Guidance on Mobile Privacy Best Practices; Enters Into $800K Consent Order with Path

This post was written by John P. Feldman, Paul Bond and Christine E. Nielsen.

Today, the Federal Trade Commission released detailed guidance on privacy in the mobile environment – at the same time it announced its largest-ever settlement with an app developer for alleged privacy violations. Combined with aggressive action on mobile privacy issues by the California attorney general’s office, Mobile Privacy Disclosures provides every company associated with a mobile app with an urgent reason to review all disclosures and practices. 

Please click here to continue reading this Client Alert

The OCR HIPAA/HITECH Final Rule Has Arrived

This post was written by Brad M. Rostolsky and Nancy E. Bonifant.

The long-awaited final rule, released yesterday by the Office for Civil Rights (OCR) of the Department of Health and Human Services, modifies the HIPAA Privacy, Security, Breach and Enforcement Rules and comprises four final rules which implement the statutory requirements of the Health Information Technology for Economic and Clinical Health Act (HITECH) and the Genetic Information Nondiscrimination Act (GINA).

Please click here for a more detailed analysis on our sister blog, Health Industry Washington Watch.

The EU Commission declares New Zealand adequate for the transfer of personal data

This post was written by Cynthia O'Donoghue.

On 19 December 2012, following years of assessment and culminating in positive recommendations by two specialist EU Committees, the European Commission formally announced that New Zealand’s data protection standards are compatible with those of the EU, and that they ensure “adequate protection” of personal data under the European Data Protection Directive 95/46/EC. Vice-President Viviane Reding, the European Commissioner for Justice, Fundamental Rights and Citizenship, declared that the decision paved the way to boosting trade with the EU’s international partners, while helping to set high standards for personal data protection at a global level.

Under the European Data Protection Directive, transfers of personal data to countries outside the European Economic Area that are not considered to provide “adequate protection” of personal data are subject to strict conditions under which adequate safeguards must be put in place in order to allow for the international transfer. This finding of adequacy will therefore allow personal data to flow from the 27 EU member states to New Zealand for processing without any further safeguards being necessary.

To date, the European Commission has recognised Andorra, Argentina, Canada, the Faeroe Islands, Guernsey, Israel, the Isle of Man, Jersey, New Zealand, Switzerland and Uruguay, as well as the United States’ Safe Harbor scheme, as providing adequate protection.

Awaiting the Release of the HITECH Final Rule

This post was written by Brad M. Rostolsky and Nancy E. Bonifant.

As the year comes to an end, the industry is speculating about the release date of the Health Information Technology for Economic and Clinical Health Act (“HITECH”) final rule. The final rule is expected to address modifications to the Privacy, Security, Enforcement, and Breach Notification Rules, and with the release date yet to be determined, it is important for Covered Entities and Business Associates to be prepared for the upcoming changes.

Please click here for a more detailed analysis on our sister blog, Life Sciences Legal Update.

Big Data Goes to Princeton for Inaugural Meeting of IAPP's New Jersey KnowledgeNet

This post was written by Paul Bond, Mark Melodia and Cynthia O'Donoghue.

The International Association of Privacy Professionals hosted its first KnowledgeNet in New Jersey December 6, 2012, at the Princeton offices of Reed Smith. Reed Smith attorneys Mark Melodia and Paul Bond presented a seminar on “Understanding and Defending Big Data” to the gathering of dozens of privacy attorneys and privacy compliance professionals from around the state. Participants discussed what Big Data means to them, as well as how Big Data solutions are being deployed in their respective industries, including financial services, health care, energy, and education. This KnowledgeNet, organized by Miranda Alfonso-Williams, Global Privacy Leader, GE Healthcare, promises to be the first of many for the growing privacy community of the Garden State.

In October, Reed Smith hosted a Silicon Valley event on “Big Data Monetization.” Video of the presentations from that panel, including those by Mark and Data Privacy, Security & Management Group co-chair Cynthia O’Donoghue, is available here.

Data Protection Concessions for SMEs hinted at by EU Justice Commissioner

This post was written by Cynthia O'Donoghue.

Viviane Reding, Vice-President of the European Commission, EU Justice Commissioner, told ministers from the European Union Member States at a Justice and Home Affairs Council meeting in Luxembourg that in an effort not to overburden small and medium-sized enterprises (SMEs), she is prepared to offer them some concessions under the revised EU Data Protection Regulation.

SMEs are currently exempt from certain requirements, including the appointment of a data protection officer, but the Commission is prepared to consider broadening this exemption to other areas through an approach that takes into account the amount and sensitivity of the data processed. The proposal further reflects the Commission's intention not to apply the same rules to “the small hairdresser as to a multinational.”

Reding emphasised that the Commission would not fall into the trap set by some lobbyists who voice concern for SMEs while in fact referring to provisions designed to help large multinational firms.

In her speech, the Commissioner also referred to the proposed implementing and delegated acts, expressing that they are not designed to be considered a “blank cheque” for the Commission. Instead, Reding suggested she would consider reviewing them one-by-one with member states to ensure they are limited to what is truly necessary.

Reding also hinted that there may be different rules for the private and public sectors, by advocating the need for greater flexibility, even though the consensus is to stick to the status quo of having the same rules apply to both. However, Reding stated that “specific rules are necessary in certain circumstances such as the land registry which should be public.” But she warned that “there can be no general exemption for the public sector.”

It’s looking increasingly unlikely that the EU Data Protection Regulation will be revised and ready for a vote during the first quarter of next year, despite the Irish Presidency’s hope to get it on the agenda for February 2013.

ICO publishes guide on Anonymisation in the UK

This post was written by Cynthia O'Donoghue.

The UK Information Commissioner's Office (ICO) has published a code of good practice on managing the risks related to anonymisation. Christopher Graham, UK Information Commissioner, believes this to be the first code of practice on anonymisation to be published by any European data protection authority, although Liechtenstein published a guide on anonymisation and pseudonymisation earlier this year.

With publicly available data increasing rapidly and the rise of “big-data,” anonymisation is an important tool in “helping society to make rich data resources available whilst protecting individuals’ privacy.” It is considered to be of particular value for organisations that want to publish data for research purposes.

The Code was issued pursuant to Recital 26 of the European Data Protection Directive (Directive 95/46/EC), which provides that “the principles of protection shall not apply to data rendered anonymous in such a way that the data subject is no longer identifiable.” Properly anonymised data ensures that an individual can no longer be identified, with the result that such data falls outside the European data protection laws. Anonymisation is not, however, always straightforward: individuals may be identified in a number of ways, raising the possibility of re-identification from a combination of anonymised data and data aggregated from other sources. The ICO recognises the difficulty of determining whether anonymised data is still classified as personal data and believes a sensible judgment should be made in the circumstances.

The Code recommends that data controllers perform regular risk assessments on the likely occurrence of re-identification since that risk may change over time. It further warns that even if anonymisation is carried out effectively, it does not necessarily protect personal data from being re-identified in the future. In borderline cases where there is uncertainty about whether re-identification can occur, organisations are urged to seek the individual’s consent for disclosure and to adopt a more rigorous form of risk analysis and anonymisation.
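The Code does not prescribe any particular anonymisation technique or risk test. Purely as an illustration of the kind of check such a risk assessment might involve, the following Python sketch generalises quasi-identifiers and applies a simple k-anonymity test; the field names and the threshold are our own illustrative assumptions, not anything drawn from the ICO's guidance:

```python
from collections import Counter

def generalise(record):
    """Coarsen quasi-identifiers: keep only the outward postcode
    (e.g. 'N1 7AA' -> 'N1') and bucket age into 10-year bands."""
    return (record["postcode"].split()[0], record["age"] // 10 * 10)

def is_k_anonymous(records, k):
    """True if every combination of generalised quasi-identifiers
    appears at least k times, i.e. no record is uniquely re-identifiable
    below the chosen group size."""
    counts = Counter(generalise(r) for r in records)
    return all(count >= k for count in counts.values())
```

A data set failing the check at the chosen k would warrant further generalisation or suppression before release, or the more rigorous form of risk analysis the ICO urges for borderline cases.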

Disclosure of anonymised data does not require consent, according to the ICO: “provided there is no likelihood of anonymisation causing unwarranted damage or distress then there will be no need to obtain consent as a means of legitimising the processing.” The ICO also acknowledged that consent can be not only onerous but sometimes impossible to obtain, and that even where consent is available, it generally considers it safer to use or disclose anonymised data.

The Information Commissioner does, however, add a layer of bureaucracy and cost for organisations by suggesting that risk assessment strategies form part of an organisation’s wider governance structure, with the appointment of a “Senior Information Risk Owner” responsible for authorising and overseeing the anonymisation process.

Protection of employee privacy rights in France: measures controlling employees in the workplace must be treated with caution - employers should avoid placing restrictions upon themselves

This post was written by Daniel Kadar.

France’s highest court (“Cour de cassation”) ruled 26 June 2012 in Monsieur X v. YBC Helpevia that a company’s internal rules may limit an employer’s access to employee emails.

French case-law has traditionally held that employees have a right to privacy at their workplace and that an employer cannot search an employee’s personal files stored on a work computer without breaching the employee’s right to privacy (Nikon France v. Onof).

As a result, case-law allows a French employer to search an employee’s professional messages, but prohibits any access to his/her personal files and messages that are specifically identified and marked “personal”.

The current decision has, however, narrowed the employer’s ability to search an employee’s professional files if the internal rules of the company place restrictions on the search.

In the current case, an employee was suspected of hacking into the email account of his employer in order to access data about potential future salary increase proposals. To confirm the suspicion, the employee’s emails were opened by the company on his professional computer whilst he was absent. However, the company’s internal rules stated that the employer would refrain generally from accessing the employee’s computer in such circumstances. The “Cour de cassation” confirmed the Rouen Court of Appeal decision and ordered the company to compensate the employee for unjust dismissal.

This decision established that if a company’s internal rules generally prohibit the employer from reading employees’ emails in their absence, no distinction between personal and professional communications will be made: none of the communications may be read.

As a consequence, by failing to distinguish between personal and professional messages in its internal rules, the company restricted its own ability to access the employee’s professional emails, since those rules stated that such access could take place only in the physical presence of the employee.

French companies looking to monitor their employees’ communications should therefore make sure that they do not unintentionally restrict themselves more than the law requires. This is also a reminder for employers that they need to draft their internal policies very carefully.

Employee protection remains a very sensitive issue in France. In a separate development, the French CNIL published on 23 October 2012 its decision to withdraw the authorization it had previously granted to employers to monitor, by biometric means, employee access to the workplace and employee work schedules.

The CNIL therefore decided to suspend for five years the application of its 2006 AU-007 “Unique Authorization,” because French labour organizations prefer non-biometric management tools in order to strengthen workers’ rights and to preserve the relationship of trust between employer and employees.

European Commission shows concern over the slow development of the Do-Not-Track standard

This post was written by Cynthia O'Donoghue.

Neelie Kroes, Vice President of the European Commission, has signalled her concern over the progress of the adoption of the Do-Not-Track (DNT) standard, which is being developed by the World Wide Web Consortium (W3C) as a universal mechanism to communicate consent, or the lack of it, to the tracking of individuals’ web data. The mechanism is already present in certain browsers, including Microsoft’s and Mozilla’s; however, the lack of a commonly agreed-upon standard is causing disputes, with increasingly broad exceptions being suggested to the W3C, exceptions that Jon Leibowitz, the Federal Trade Commission's Chairman, labelled 'a loophole you could drive a virtual truck through'.
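Mechanically, DNT is a simple HTTP request header: “DNT: 1” signals an opt-out of tracking and “DNT: 0” signals consent. As a minimal sketch only, a server-side check might look like the Python below; note that treating an absent header as a lack of consent is our own conservative assumption, reflecting the informed-consent position discussed here, and is not something the W3C draft settles:

```python
def honor_dnt(headers):
    """Return True if tracking should be disabled for this request.

    "DNT: 1" means the user has opted out of tracking; "DNT: 0" means
    the user has consented. An absent header expresses no preference;
    here we treat that, conservatively, as no consent to tracking
    (an assumption, not a requirement of the W3C draft).
    """
    value = headers.get("DNT")
    if value == "1":
        return True   # explicit opt-out: do not track
    if value == "0":
        return False  # explicit consent to tracking
    return True       # no preference expressed: assume no consent
```

The unresolved policy questions, defaults, exceptions, and what "tracking" covers, all sit outside this trivial header check, which is precisely why the standardisation process has stalled.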

Delay and frustration of the process have mostly been attributed to advertisers and marketers, many of whom insist that marketing is one of the most important freedoms of a civilised society and that DNT will harm their business operations. Indeed, the Direct Marketing Association (DMA) has asked the W3C to add marketing to the list of activities exempt from the standard, a request that has drawn much condemnation from the Commissioner.

This ‘watering down’ of the standard is the main problem according to Kroes, who would prefer that DNT build on the principle of informed consent, giving consumers control over their information and letting them choose not to be tracked. Kroes also addressed the current issues over default settings, stating that DNT should be specific in informing consumers about which default settings are in their software and devices. Anonymisation and retention limits should be incorporated to mitigate the risks of tracking, but cannot be seen as a ‘get-out clause’.

Kroes criticised the current DNT standard, declaring that it will not, on its own, guarantee satisfaction of the cookie requirements under the ePrivacy Directive, but did emphasise the value of DNT in harmonising online business and improving transparency for consumers.

Kroes warned that time is running out to create a simple, uniform and self-regulatory standard to address tracking. She views harmonised DNT standards as having universal appeal. "When I say this is in everyone's interest, I mean everyone. Including American companies. Because if you want to track Europeans, you have to play by our rules."

Opinion of the European Data Protection Supervisor on the European Commission proposal for a Regulation of the European Parliament and of the Council on trust and confidence in electronic transactions in the internal market

This post was written by Cynthia O'Donoghue.

The European Data Protection Supervisor (EDPS) has published its opinion on the European Commission draft Regulation on electronic identification and trust services for electronic transactions in the internal market. The proposed Regulation is expected to enhance trust in pan-European electronic transactions and to ensure cross-border mutual recognition of electronic identification by enhancing current rules on e-signatures and by providing a legal framework for electronic seals, time stamping, electronic document acceptability, electronic delivery and website authentication.

Electronic Identification schemes and trust services raise significant data-protection issues stemming from the processing of personal data, and the EDPS supports the proposed Regulation as a method of harmonising data-protection principles, and contributing toward mutual recognition and acceptance of electronic trust services and identification schemes. The EDPS, however, has suggested several recommendations to increase harmonisation and interoperability, such as a common set of security requirements, and clarification of individuals’ rights of access and to be informed.

The proposed Regulation leaves wide discretionary powers with the member states to create electronic identification schemes, and the EDPS recommends adopting a common set of conditions to be applied for the use of national identification schemes across borders.

In relation to the requirements for a mutual recognition scheme for electronic identification schemes, the EDPS recommends that the Proposed Regulation specify: (i) which data or categories of data will be processed for cross-border identification of individuals and set data minimisation goals; (ii) a common minimum safeguard level proportionate to the risks involved and at least compliant with the requirements set forth for the providers of qualified trust service; and (iii) a set framework for the interoperability of national identification schemes.

In relation to the requirements for the provision and mutual recognition of electronic trust services, the EDPS recommends that the Proposed Regulation specify: (i) if personal data will be processed and, if so, the data or categories of data to be processed so as to assess data protection implications; (ii) appropriate safeguards to avoid any overlap between the competencies of the supervisory bodies; (iii) that notification requirements for data breaches be consistent with those in the e-Privacy Directive; and (iv) the setting of specific time limits for data retention.

European Parliament Publishes Study Suggesting Improvements to the European Commission's Proposed Data-Protection Package

This post was written by Cynthia O'Donoghue.

The European Parliament has published a study aimed at providing advice on priority measures to ensure that the Proposed Data Protection Regulation, presented by the European Commission (EC) earlier this year, is more comprehensive in relation to data protection and more protective of consumers’ privacy rights.

The European Parliament supports various new rights, namely the right to be forgotten, the right of portability, and the right against profiling. It also commends the EC’s proposals to create a level playing field across the EU through a ‘one-stop-shop’ principle, under which a single data-protection authority is determined by an organisation’s main location; the extraterritorial application of data protection laws to businesses active in, but based outside, the EU; and the general principle of accountability.

The European Parliament believes, however, that globalisation and the advent of new technologies still need to be fully addressed. The proliferation of geo-location services, smart metering, face recognition technologies, social networking services, online gaming, and RFID technologies means that companies and governments are often processing personal data without data subjects being aware of the impact of these activities.

While the European Parliament supports the refined definition of consent, the study recommends that a variety of online identifiers, such as IP addresses or cookies, be specifically qualified, with illustrations of the situations in which they should be treated as personal data. The study also recommends that the proposed Regulation encourage anonymisation, especially for the processing of sensitive data. Further, the European Parliament would like to see the definition of a data controller limited to organisations that determine the “purposes” of data processing, rather than the “conditions”.

The European Parliament sees international data transfers as a key area requiring improvement, especially in the context of cloud computing. The study calls for greater emphasis on risk assessment and accountability prior to transferring data, and suggests the development of an accreditation system or a dedicated Cloud Safe Harbour programme, as well as self-regulatory industry standards.

Privacy International publishes its analysis of the European Commission's proposal for a General Data Protection Regulation

This post was written by Cynthia O'Donoghue.

On 25 January 2012 the EC proposed a uniform legal framework for providing legal certainty on data protection. The most notable proposed change is that from a European Directive to a Regulation (the Proposed Regulation) to ensure directly enforceable implementation across all Member States. The Proposed Regulation sets out general rules on data protection that would modernise and further harmonise the data protection regime created by the Data Protection Directive (95/46/EC).

While the European Data Protection Supervisor (EDPS) has stated that the Proposed Regulation is a huge step forward for data protection in Europe, in the EDPS’s view it still fails to offer a comprehensive set of data protection rules for the EU.

Privacy International’s analysis concurs with this sentiment.

It suggests that the Proposed Regulation goes some way to ensure that data protection law responds to contemporary and emerging threats to the right to privacy, and commends the introduction of additional controls for individual consumers with regards to access, correction and deletion and the provision of greater power for independent authorities to ensure effective enforcement. It also welcomes the emphasis on responsibility and accountability through privacy by design and the introduction of data breach notification for all industry sectors.

Privacy International, however, also expresses concern over various weaknesses that may undermine individuals’ rights, and advocates more specific, comprehensive protection, including:

  • A stronger definition of consent to make it ‘provable’.
  • A clearer definition of processing on the grounds of ‘legitimate interests’.
  • Data breach notification limited to serious risks, to avoid notification fatigue.
  • Provision to individuals of information about profiling and security measures.
  • Deletion of the provision allowing further non-compatible use on the basis that it undermines the very basis of data protection.

While most of us welcome the idea of greater harmonisation of data protection law across the EU, Privacy International’s views sit uneasily with the other fundamental purpose of data protection law: the free flow of data. Individuals’ rights should be protected, but the EU has the unenviable task of ensuring that this is done in a way that does not thwart business or have a dampening effect on its goals for the future of the digital economy.

Privacy International publishes its analysis on the European Commission's proposal for a Data Protection Directive in the law enforcement sector

This post was written by Cynthia O'Donoghue.

On 25 January 2012 the European Commission proposed a uniform legal framework for providing legal certainty on data protection. This includes a Regulation (the Proposed Regulation) with general rules on data protection that would modernise and further harmonise the data protection regime created by the Data Protection Directive (95/46/EC) and a Directive (the Proposed Directive) with specific data protection rules for the law enforcement sector.

The Proposed Directive has been met with critical reviews, with the European Data Protection Supervisor (EDPS) taking the view that the Proposed Directive does not meet the requirement of a consistent and high level of data protection and the Article 29 Data Protection Working Party (WP29) stating that it was “disappointed by the Commission’s level of ambition and [the WP29] underlines the need for stronger provisions.” This lack of high level data protection is troubling because in the context of law enforcement citizens may be put at particular risk due to the likely processing of sensitive personal data.

In its analysis of the Proposed Directive, Privacy International is principally concerned that (i) the rights of data subjects are significantly weaker than they would be under the Proposed Regulation; (ii) data controllers are subject to fewer and vaguer obligations than they would be under the Proposed Regulation; (iii) there is no mention of preventing transfers of data to non-competent authority recipients; and (iv) supervisory authorities have disproportionately limited powers in comparison to their peers under the Proposed Regulation.

In many instances there does not appear to be any justification for departing from the rules provided in the Proposed Regulation. Privacy International has therefore called for improvements to the articles of the Proposed Directive to address these concerns and to bring the Directive more closely into line with the Proposed Regulation.

The Norwegian Data Protection Authority holds off ban, permitting the use of cloud computing services by Norwegian municipalities

This post was written by Cynthia O'Donoghue.

We previously told you about the Norwegian Data Protection Authority’s (Datatilsynet, the Norwegian DPA) finding that Google Analytics breached that country’s data protection laws. In an about-face, the Norwegian DPA has now decided to hold off its ban on the use of Google’s and Microsoft’s cloud computing services by the Municipalities of Narvik and Moss, respectively. The Norwegian DPA had originally concluded that Google Apps and Microsoft 365 failed to comply with the Norwegian Data Protection Act because the municipalities lost control over the storage of, and access restrictions to, personal data processed by Google and Microsoft through their cloud computing services. The DPA’s main concerns were the failure to establish a valid data processor agreement in accordance with Section 15 of the Personal Data Act, to fulfil the information security requirements of Section 13, and to adhere to the rules on the transfer of personal data abroad in Section 29. The Norwegian DPA was also concerned that the U.S. Patriot Act represented a challenge to the safeguarding of personal privacy, even within the Safe Harbor scheme.

The Norwegian DPA is now satisfied that Google and Microsoft have increased their cloud computing security and that the data stored in the EU/EEA and in the United States under the safe harbor regime are protected by adequate safeguards. This fundamental reversal of regulatory policy suggests that the DPA is reassessing the significance of cloud computing in light of its growing popularity. However, the use of cloud computing services in Norway will be made conditional upon strict prerequisites:

  • A thorough risk and vulnerability analysis must be carried out in advance.
  • The enterprise must have established a satisfactory data processor agreement in compliance with Norwegian regulations. The municipality will be responsible for ensuring compliance with statutory requirements.
  • The use of cloud computing services must be audited on a regular basis. An independent third party must carry out a security audit on behalf of the municipality to ensure compliance with the data processor agreement.
  • The data processor agreement must be enforced, and the supplier's general privacy policy must be in compliance with the agreement.
  • In relation to the transfer of personal data, unless the destination countries have been approved as safe by the EU Commission, the transfer must be regulated by standard agreements.

The European Commission has published a new strategy document on Cloud Computing in the EU

This post was written by Cynthia O'Donoghue.

With concerns over the potential fragmentation of the digital single market and the proliferation of different data protection standards for personal data across the EU, the European Commission (Commission) has published new guidance on the use of cloud computing. The Commission has identified the steps it wishes to introduce in 2013 to ensure publicly available cloud offerings adhere to European standards, not only in regulatory terms, but also in terms of being competitive, open and secure. The Commission suggests that cloud computing, being born global, requires a reinforced international dialogue on safe and seamless cross-border use in order to create a digital single market.

The Commission believes that cloud computing has the potential to slash users' IT expenditure, boost productivity and growth, and create 3.8 million new jobs by 2020. Even the smallest firms will be able to use the cloud to reach ever-larger markets, while governments can make their services more attractive and efficient. The lack of standardisation and harmonisation across the EU is a concern for cloud computing adoption, and implementing policies to achieve both is therefore the crux of the Commission’s strategy. A preparatory study undertaken for the Commission estimates that, with cloud-friendly single market policies in place, public cloud computing would generate €250 billion in GDP in 2020, against €88 billion in a ‘no intervention’ scenario.

To deliver on its goals, the Commission states that it will launch three cloud-specific actions: (1) cutting through the jungle of standards; (2) implementing safe and fair contract terms and conditions; and (3) establishing a European Cloud Partnership (ECP) to drive innovation and growth.

On 25 January 2012, the Commission proposed a uniform legal framework for providing legal certainty on data protection which would address the issues raised by the cloud, and apply directly and uniformly across all 27 Member States. The new legal framework will provide for the necessary conditions for the adoption of codes of conduct and standards for the cloud, where stakeholders see a need for certification schemes that verify that the provider has implemented the appropriate IT security standards and safeguards for data transfers, including the adoption of cloud-friendly binding corporate rules where necessary.

The European Telecommunications Standards Institute (ETSI) has set up a Cloud Group to consider cloud standardisation needs and conformity with interoperability standards. The Commission will work with the support of ETSI, the European Network and Information Security Agency (ENISA), and other relevant bodies to assist the development of EU-wide voluntary certification schemes in the area of cloud computing.

The Commission states that standardisation in licensing and security is essential to the development of a digital single market. Standardisation in licensing would allow customers to access their personal account from multiple devices, irrespective of the territory from which the account is accessed. Moreover, rapid adoption of the Commission’s proposal for a Directive on Collective Rights Management will address many of the cross-border licensing needs for cloud content. In terms of security standards, the Commission suggests secure eAuthentication methods and the adoption of common standards for Internet transactions, which could be achieved through the adoption of its proposals on e-identification and authentication.

The Commission also proposed the adoption of model contract terms to address issues such as data retention, data disclosure, and integrity and liability.

Reed Smith Gearing Up For 'Big Data Monetization' Conference

This post was written by Mark Melodia, Cynthia O'Donoghue, Paul Bond and Frederick Lah.

Next week, Reed Smith will host a conference on “Big Data Monetization” at the Quadrus Conference Center in Silicon Valley (8:30-11:30 a.m. PDT). As we gear up, we wanted to share some of our thoughts on this notion of Big Data and give you a preview of the types of issues we’ll be tackling at the conference.

Big Data is an amorphous term, one that has taken on different meanings in different contexts. In general, Big Data is a term used to characterize the accumulation of data, especially data sets that may not be usable for analysis by themselves. The term does not just encompass the small subset of companies that actually provide data analytics, or that exist for the sole purpose of monetizing personal information and habit data, but rather extends to any significant company participating in the digital data-driven economy.

Virtually every company, in every industry, is now an information and technology company. Companies run on Big Data, whether it be customer information, employee information, or competitive intelligence. Companies store, share, and use that information in increasingly complex ways, taking advantage of cloud-based solutions and revolutions in analytics, and finding ways to turn these massive databases into revenue – for example, by creating tailored advertisements based on customer shopping preferences or online browsing history. There is no doubt a plethora of opportunities for retailers, health care providers, banks, energy companies, website operators, and data brokers alike in Big Data.

Of course, using Big Data comes with its own set of risks. Companies need to ensure their disclosures are up-to-date and accurate about their information practices, and there may be laws or regulations on the collection and use of information depending on the types of data and data subjects involved. Both Congress and the Federal Trade Commission have also recently raised concerns about data brokers. In addition, some customers may feel uneasy with the notion that a company has too much information about them, thus drawing the attention of class action plaintiffs’ counsel. And, of course, having such deep databases of personal information highlights the importance of keeping information safe and secure. The more valuable information a company holds, the more magnified the threats of data theft and data loss become. The key with monetizing Big Data is striking the balance between risk and reward.

Our Data Privacy, Security & Management team has extensive experience providing privacy compliance advice to clients, drawing upon our knowledge gained in defending more than five dozen privacy class actions and three Multidistrict Litigations; our day-to-day operational experience answering questions from our technology, financial, health, and energy clients; and our diverse skill set, which includes engineers, software developers, cybersecurity and other technology professionals, and former regulators and former in-house counsel at global banks, asset managers, and insurers. Earlier this month, Mark Melodia and Paul Bond were featured in the cover story of Law Technology News, “Defending Big Data”. Mark also recently recorded a podcast on “Defending Big Data”. We continue to stay on top of this area.

At the “Big Data Monetization” conference next week, our panel of experts will be tackling the following types of questions:

  • Why should a corporate officer or director or investor care about issues with Big Data?
  • What is the current regulatory landscape for Big Data?
  • What are the biggest challenges for Big Data as it operates in the United States and globally?
  • How does the issue of data ownership arise for Big Data?
  • What types of litigation risks exist for Big Data?
  • How does the so-called concept of the "right to be forgotten” impact Big Data?
  • How does insurance play a role in mitigating the risks that come with Big Data?

We look forward to seeing many of you next week.

Norwegian DPA finds that Google Analytics breaches national data protection law

This post was written by Cynthia O'Donoghue.

The Norwegian Data Protection Authority, Datatilsynet (Norwegian DPA), has concluded that the use of the website analytics tool Google Analytics by two state agencies violated Norway’s data protection law. The two state agencies – the Tax Administration and the State Educational Loan Fund – were unable to account for how Google Analytics worked, and the DPA found a disconnect between Google’s privacy policy and those of the state agencies.

Google Analytics is a website tool that allows organizations to create reports about how visitors and users utilize a website, and to analyze visitor and user behavior. Google Analytics is widely used, including by some other European-based data protection authorities, such as the UK Information Commissioner’s Office.

Google Analytics collects part of a visitor’s IP address, which, in a 2011 opinion, the European Court of Justice found to be personal data. The Norwegian DPA found that the agencies should be deemed to control the information collected via Google Analytics cookies, but that the data appeared to actually be collected by Google, making Google, rather than the state agencies, the data controller. In addition, the Norwegian DPA determined that neither of the two state agencies could demonstrate that data provided to Google had been anonymized, nor that its use was limited to statistical purposes. The agencies' unconditional acceptance of Google's terms and conditions appeared to imply that Google could use the IP addresses to provide additional services that would allow it to compile personal information about visitors from many different websites, and thereby potentially identify the user.

The DPA believed that Google should be functioning as a data processor for each of the agencies; it has required both agencies to correct the information on their websites and has requested that any IP addresses collected be anonymized and used only for analysis. Both state organizations now have a chance to respond to the DPA's findings before a final ruling is made.

This is the first ruling of its nature in the EEA and, in some ways, is surprising given that the collection of IP addresses by Google Analytics cookies tends to be limited to a geographic region rather than comprising the entire IP address. The final ruling will be one to watch, including whether there is a knock-on effect throughout the EEA, with other national authorities taking similar decisions about the use of Google Analytics.


Asian Developments – Tougher Data Protection Laws on the Eastern Horizon

This post was written by Cynthia O'Donoghue.

Macau, Hong Kong and Taiwan have been flexing their data protection muscles. Macau’s Office for Personal Data Protection (“OPDP”) is investigating the transfer of data from the Asian subsidiary of Las Vegas Sands to the United States. Hong Kong has just passed the Personal Data (Privacy) (Amendment) Ordinance that increases penalties and introduces new offences. Taiwan has added stronger enforcement powers to its Personal Information Protection Act (“PIPA”).

In Macau, the OPDP is investigating Sands China Limited’s subsidiary, Venetian Macau Limited, for potential violations of Macau's privacy laws, which prohibit the unsanctioned transfer of personal data to foreign jurisdictions, such as to the United States. The investigation relates to the movement of files from Macau to the United States relevant to a 2010 lawsuit. Violations of Macau's 2005 Personal Data Protection Act (“PDPA”) can be subject to civil and criminal penalties, with fines per violation of 80,000MOP (around $10,000) and a maximum jail sentence of two years. Macau has previously fined Google 30,000MOP for breaching the PDPA.

Hong Kong's amendment to the Personal Data (Privacy) Ordinance was passed in June 2012, with most provisions coming into effect on 1 October 2012. The changes particularly affect organisations engaged in direct marketing or that provide data for direct marketing. The Privacy Commissioner’s Office (“PCO”) is scheduled to provide guidance on the new compliance regime, which includes enforcement powers for the PCO such as fines of between HK$500,000 and HK$1 million ($64,500 - $129,000). The maximum fine is for a new offence designed to address malicious disclosure of personal data without consent, where the perpetrator has made financial gain, caused financial loss, or caused psychological harm to the data subject. The new law also includes:

  • An exemption relating to the use of personal data in relation to due diligence
  • Requirements for data users to adopt contractual means to prevent personal data that has been transferred to a data processor from being kept longer than necessary, and to prevent unauthorised access, unauthorised use or loss of personal data
  • A new right for individuals who have suffered harm as a result of a breach of the Data Protection Law to apply to the PCO for assistance

Taiwan amended its Computer Processed Personal Data Protection Act (“CPPDPA”) more than two years ago when it enacted the new Personal Information Protection Act. PIPA is finally set to come into force next year, but the legislature can continue to make further amendments up until 30 April 2012. Enforcement under the old CPPDPA was haphazard and intermittent, mainly because no single agency was responsible for enforcement. Under PIPA, the Ministry of Justice has been identified as the agency responsible for coordinating enforcement. Recently, however, Taiwan’s financial services regulator (the “FSC”) imposed substantial fines against banks on privacy-related grounds rather than wait for PIPA to come into force. In March 2012, the FSC fined two insurance brokers NT$600,000 ($20,000) each for illegally releasing personal data to a life insurance company.

FTC Does Not Issue a Final COPPA Rule; Instead, Seeks Comment on Modifications to Rule Definitions

This post was written by John P. Feldman, Amy S. Mushahwar and Christine Nielsen.

This morning the FTC released a supplemental notice of proposed rulemaking on the Children's Online Privacy Protection Act (COPPA) Rule. This is not a final rule. The notice suggests further modifications to proposed definitions released in the September 2011 Notice of Proposed Rulemaking on the COPPA Rule. Specifically, the FTC now seeks comment on proposed modifications to the definitions of "operator," "personal information," and "website or online service directed to children." This notice must be read in conjunction with the 2011 notice to understand the full scope of the proposed changes. The FTC is seeking comments on these proposals. Comments must be received on or before September 10, 2012. Shortly, we will be providing a detailed analysis of this notice in context with the earlier release.

California App Developer Settles Lawsuit with New Jersey AG

This post was written by John P. Feldman and Frederick Lah.

In early June, New Jersey Attorney General Jeffrey Chiesa and the New Jersey Division of Consumer Affairs brought a complaint against California-based mobile app developer 24x7 Digital LLC for alleged violations of the Children's Online Privacy Protection Act ("COPPA"). The state alleged that 24x7 Digital, through its "TeachMe" Apps, was collecting the names and unique device identifiers ("UDIDs") of children and transmitting them to a third party, without the COPPA-required notice or parental consent.

Just three weeks later, the two sides have settled. As part of the settlement, 24x7 Digital represented that it will destroy all of the information collected in violation of COPPA and that it has stopped collecting such information. The developer also agreed to comply with monitoring and reporting requirements. No money appears to be a part of the settlement. The attorney general hailed the settlement as a "clear victory for children's privacy in the age of mobile devices and the easy transfer of personal information."

According to a press release, this lawsuit was the first filed as a result of the state's ongoing initiative against Internet privacy and acts of cyberfraud. The state hinted that more suits may be on the way, saying that the Division is continuing to investigate other mobile applications and their possible unlawful sharing of personal information.

Notable is the speed with which this lawsuit was settled, as well as the absence of any money attached to this order. Cooperation and quick action may have paid off for 24x7 Digital.

Is the New Facebook Settlement About Privacy? Or, Revenge of the Prosser Torts.

This post was written by Mark S. Melodia, Paul Bond, and Frederick Lah.

Recently, Facebook announced a proposed settlement of a national class action in the United States District Court for the Northern District of California. Fraley, et al. v. Facebook, Inc., 5:11-cv-01726. This settlement has been described by some as settlement of a “privacy lawsuit.” See, e.g., “Facebook to Settle Privacy Lawsuit Over Ads” by Ann Miller in The Recorder, and “Facebook Settling ‘Sponsored Stories’ Privacy Lawsuit” by David Kravets. But is the issue really privacy? For reasons from public relations to legal analysis to insurance coverage, knowing how to characterize this type of dispute is crucial.

The Fraley Complaint challenged an alleged Facebook practice in connection with sponsored ads. Per the Complaint, Facebook would not only display such ads, but would also use the “names, photographs, likenesses, and identities” of Facebook users to help promote the product to friends of those users. The Complaint alleges that a user would be associated with a product by choosing to click a “Like” button, and would then be automatically associated with the corresponding ad campaign. The company hit back with a Motion to Dismiss contesting the existence of any claimed “right of identity,” which would be inconsistent with the operation of the campaign. Thereafter, the parties reached a settlement according to a recent court filing, although details of the settlement were not available. A separate but related lawsuit alleging Facebook violates California state law by including minors in the sponsored stories program is still pending before the court.

While the proposed settlement, if approved, will avoid the need to decide these issues in this case, the ambiguities at issue have been in play in United States law for at least 50 years. Dean Prosser, in his 1960 article “Privacy” for the California Law Review, surveyed what was, even in 1960, a haphazard patchwork of legal authority on this point. He concluded:

“What has emerged from the decisions is no simple matter. It is not one tort, but a complex of four. The law of privacy comprises four distinct kinds of invasion of four different interests of the plaintiff, which are tied together by the common name, but otherwise have almost nothing in common except that each represents an interference with the right of the plaintiff…to be let alone.” Dean Prosser, Privacy, 48 Cal. L. Rev. 388, 389 (1960).

Each of the so-called Prosser torts has since found its way into privacy class action allegations in the Internet age.

“Without any attempt to exact definition, these four torts may be described as follows: 1. Intrusion upon the plaintiff's seclusion or solitude, or into his private affairs; 2. Public disclosure of embarrassing private facts about the plaintiff; 3. Publicity which places the plaintiff in a false light in the public eye; and 4. Appropriation, for the defendant's advantage, of the plaintiff's name or likeness.” Id.

Prior suits regarding, for example, disclosure of Internet search histories or video rental habits, focused on the first two of these Prosser torts: intrusion upon the plaintiff's seclusion or solitude, or into his private affairs; and public disclosure of embarrassing private facts about the plaintiff. In addition, FTC consent orders such as those entered into by Google in connection with the launch of Google Buzz, or more recently by MySpace, involve contested claims about supposedly private affairs or private facts improperly disclosed. The Facebook settlement in Fraley is significantly different, and draws directly on the third and fourth of the Prosser torts: publicity that places the plaintiff in a false light in the public eye; and appropriation, for the defendant's advantage, of the plaintiff's name or likeness. The information at issue – that someone “Likes” a certain product – would have already been displayed on that individual’s profile, available to all of his or her friends. The combination of that freely available information with the sponsored ad makes no new information available. This is a “privacy” claim, if at all, under the aegis of the latter Prosser torts. Friends will falsely believe that a user has taken an endorsement role, the theory goes; name and likeness have been misappropriated.

In an age when brands live or die by their ability to leverage social media to improve customer engagement, including by user-generated content, understanding how all the Prosser torts may impact the use of consumer information is more critical than ever.

Article 29 Working Party adopts a "general positive stance" in its Opinion on the new EU Data Privacy Regulation and Directive

This post was written by Cynthia O'Donoghue.

In the Article 29 Working Party’s Opinion on the new EU data protection reforms, the Working Party has carefully studied both the Regulation and the Directive, and has given its first general reaction. The Working Party welcomed the provisions intended to clarify and strengthen the rights of individuals, including clarification of consent, the introduction of a transparency principle and enhanced redress, as well as the proposals to harmonise the powers among the national data protection authorities (DPAs).

Despite the positive reaction, the Working Party stated its disappointment that the reforms take the form of two legal instruments, a Regulation and a Directive, given that the objectives of the two instruments are the same and that a single comprehensive legal framework is achievable.

In relation to the Regulation, the Working Party highlights positive aspects, including:

  • Greater clarity through more precise definitions
  • Greater rights for individuals regarding their data, such as more transparency, greater control over data processing and strengthened rights to data access
  • Simplification and greater consistency for data controllers
  • Introduction of Privacy by Design
  • Data breach notification requirements
  • The Right to be Forgotten, which it hopes will strengthen individuals’ controls over their personal data
  • DPAs being given strengthened independence and powers, including fines

The Working Party also highlighted weaknesses, including serious reservations about the delegated powers reserved to the European Commission, as well as concern about the increased costs and resources needed by the DPAs, and the broad exceptions for public authorities by reason of public interest. Weakness in relation to the Right to be Forgotten relates to whether it will be possible to enforce, given the way the Internet works and the lack of a mandatory provision requiring third parties to comply with an individual’s request to erase data.

Most significantly, the Working Party welcomes the introduction of substantial fines, which it believes will act as a deterrent and contribute to a high degree of compliance by data controllers.

In relation to the Directive, the Working Party fears that the number of inconsistencies between the Regulation and the Directive will result in the two instruments not being complementary, and in the potential for the documents not to work together on core aspects, especially given that the Directive has a lower standard of protection than the Regulation.

As the new Regulation and Directive make their way through the European parliamentary process, it will be interesting to watch whether the two instruments become one so that the overall aim of consistency is achieved, especially as the Directive governs the way in which law enforcement handles individuals’ personal data, and there is a desire for not just corporates, but also government, to be held to the same standards.

FTC Issues Final Commission Report on Consumer Privacy: Agency Calls on Companies to Develop Privacy Best Practices

This post was written by Paul Bond, Christopher G. Cwalina, Amy S. Mushahwar, Frederick Lah, and Christine E. Nielsen.

This week, the Federal Trade Commission (FTC) released its long-awaited final Commission Consumer Privacy Report, entitled “Protecting Consumer Privacy in an Era of Rapid Change” (“Final Report”). The FTC emphasizes that the report only sets forth industry best practices and was “not intended” to serve as a new template for enforcement. However, this line is not exactly clear as the FTC identifies existing law and enforcement actions that form the basis of its advice (and could be the basis for Section 5 enforcement actions).

The Final Report expands on a preliminary FTC staff report issued in December 2010 and is consistent with the Department of Commerce’s (DOC) parallel privacy initiative. The Final Report calls on companies to do the following:

  • Engage in Privacy by Design: Companies should build in privacy protections – including data security, data minimization, focused data retention and data hygiene – at every stage in product development (from conceptualization to end-of-lifecycle).
  • Provide Simplified Choice: Companies should give consumers the ability to make choices about their data collection and use “at a relevant time and context,” including developing more automated choice functions like a “Do Not Track” mechanism.
  • Exhibit Greater Transparency: Companies should make their data practices more “consumer friendly” and accessible by streamlining privacy policies, providing consumers with access to data collected about them, and engaging in consumer education campaigns to promote information-age literacy.

The framework applies to all businesses that collect or use consumer data that can be “reasonably linked to a specific consumer, computer, or other device, unless the entity collects only non sensitive data from fewer than 5,000 consumers per year and does not share the data with third parties.” Notably, the framework also applies to offline or paper data. Data that has been de-identified is exempt.

The FTC also calls on Congress to develop baseline privacy legislation. To this end, FTC Chairman Jon Leibowitz and the DOC will be testifying Thursday, March 29, 2012, before the House Energy and Commerce Subcommittee on Commerce, Manufacturing and Trade to advance the legislative agenda. The hearing notice and Committee background memo are available here.

Over the next year, the FTC will focus on encouraging voluntary adoption of further privacy protections and be active in five main areas:

  • “Do Not Track” Browser Standard: While the FTC commends the progress made by the Digital Advertising Alliance (DAA) in developing an icon-based system for self-regulation of the online advertising industry, it says that more work needs to be done. The DAA, Internet browser companies, the FTC and the DOC have publicly committed to implementing the existing DAA self-regulatory standard in a browser-based automated privacy tool that will help consumers persistently opt out of online behavioral advertising and multi-site advertising.
  • Mobile Data: On the heels of the FTC Mobile Children’s Privacy Report, the FTC continues to urge all companies offering mobile services to improve privacy disclosures. In that vein, the FTC will host a web-disclosure workshop including some mobile privacy discussions May 30, 2012, to address how mobile privacy disclosures may be streamlined for mobile screen viewing.
  • Data Brokers Disclosure & Consumer Data Access: The FTC asks data brokers (those collecting information on consumers with whom they do not have a consumer-facing relationship) to create a centralized website where they would: (1) identify themselves to consumers and describe how they collect and use consumer data, and (2) detail the access rights and data choices they provide with respect to the data that they maintain.
  • Large Platform Providers: The FTC suggested that large platform providers – businesses such as ISPs, operating systems, browsers and social media companies that seek to comprehensively track consumers’ online activities – raise elevated privacy concerns. This heightened concern over multi-platform tracking is best exhibited in the FTC’s and state regulators’ scrutiny of the streamlined Google privacy policy. FTC staff intends to host a public workshop on this topic in Q3 of this year.
  • Commerce’s Development of Enforceable Self Regulatory Codes: The DOC is in the process of developing sector-specific codes of conduct. FTC staff has indicated that it will participate in this process, and if strong privacy codes are developed in the Commerce process, the Commission will view adherence to such codes favorably when it is reviewing company practices under a Section 5 action.

Please click here to view additional information from the Reed Smith Teleseminar "FTC Issues Final Commission Report on Consumer Privacy: Agency Calls on Companies to Develop Privacy Best Practices."

Some Follow-Up Thoughts on 'Public Privacy'

This post was written by Mark Melodia, John Hines, and Frederick Lah.

We wanted to follow up on a previous post we wrote about whether there is such a thing as "public privacy" -- the concept that people should be entitled to at least some expectation that their actions, even if done in public, will not be widely publicized on a site like YouTube. We continue to see cases develop in this area, particularly in the law enforcement context.

In the following client alert, we take a closer look at this notion of "public privacy."

New law regulating Internet Information Service Providers comes into force in China

This post was written by Cynthia O'Donoghue and Zack Dong.

New regulations governing the activities of Internet Information Service Providers (“IISPs”), unveiled by the Chinese Ministry of Industry and Information Technology (“CMIIT”) in December, came into force on 15 March. The “Several Provisions on Regulation of the Market Order of Internet Information Services” (“Provisions”) aim to enhance the protections available to Internet users in China in areas such as Internet security, data protection and online advertising.

For a more detailed analysis, please click here.

The Information Commissioner's Office publishes its initial analysis of the European Commission's legislative proposals for the protection of individuals with regard to the processing of personal data

This post was written by Cynthia O'Donoghue.

The ICO has published its initial analysis of the European Commission’s reform of the EU Data Protection Directive 95/46/EC ("the Directive"). The ICO published its review of the draft Data Protection Regulation and the Directive on data protection in law enforcement ("DP Framework") on 25 January 2012, but was quick to stress that its review is not a comprehensive analysis, nor will it be the ICO’s last word on the subject.

The ICO views the Commission’s proposals as a “positive contribution” towards updating data protection law in light of the current “patchwork” of national laws, and because the existing Directive is “out-of-date”. The ICO, however, would prefer a single comprehensive instrument to the two documents contained in the DP Framework. If two instruments do remain, the ICO would like to see the EU Parliament ensure as much consistency as possible between them; otherwise, the resulting inconsistency would undermine one of the European Commission’s objectives for revising the existing Directive.

The ICO points to the following concerns:

  • Consistency, while welcome, may never be truly possible because of the variations between different member states
  • The drive for harmonisation could become a burden on businesses and lead to complexity for individuals
  • The DP Framework is more detailed and prescriptive than the Directive and as a result could be onerous or disproportionate, whereas a flexible instrument may be more suitable

The ICO reviewed a selection of the 91 articles of the proposed EU DP Regulation, praising the expanded definitions of "data subject" and "personal data" (Art. 4), and the "one-stop shop" provision for controllers and processors established in more than one Member State (Art. 51(2)).

In the ICO’s view, many provisions have been “considerably weakened” when compared with the version that was leaked in December 2011. The ICO calls for the wording to be tightened or provisions to be reinstated to strengthen the level of data protection, which is of particular importance in the police and law enforcement sector, where processing personal data carries a significant privacy risk for individuals. The ICO comments that, at the very least, the basic provisions, definitions and principles within the Framework need to be aligned, such as the inconsistencies between the draft law enforcement Directive and the Regulation relating to profiling (Art. 9). According to the ICO, failure to do this is contrary to the Commission’s desire for consistency and will only lead to confusion.

Lastly, the ICO believes that the two-year implementation period is too long, mainly because data protection and privacy are not a new area of law, and many of the provisions are recognised as good practice across the EU already.

The Information Commissioner's Office wades in on Google and Microsoft privacy row

This post was written by Cynthia O'Donoghue.

The Information Commissioner’s Office (“ICO”) has asked Google to explain the operation of its system of delivering ‘third-party’ cookies to Internet users, spurred on by a dispute that has arisen between Google and Microsoft regarding Internet Explorer privacy policies.

The dispute arose when Dean Hachamovitch, Corporate Vice President for Internet Explorer (“IE”), wrote a blog post in which Microsoft accused Google of using ‘third-party’ cookies to track user behaviour online, when IE has default settings in place to stop this from occurring. ‘Third-party’ cookies are those dropped by domains other than the current domain in the address bar. They are valuable to advertisers as they can assist in building a browsing history of a user across all of the websites he or she visits.

IE by default blocks ‘third-party’ cookies unless a site presents a ‘Platform for Privacy Preferences’ (“P3P”) Compact Policy Statement describing how the site will use the cookie and pledging not to track the user. Microsoft claims Google’s P3P policy causes IE to accept Google’s cookies even though it does not say how it will use them, by utilising a nuance in the P3P specification. By using specific text, browsers read Google's policy as saying that the cookie won't be used for tracking or any similar purpose, enabling Google to bypass the cookie protection and permitting the use of ‘third-party’ cookies rather than blocking them.

Rachel Whetstone, Senior Vice President of Communications and Policy at Google, said Microsoft had "omitted important information" from the blog post. She insisted that it was well known that Microsoft’s system is outdated (dating back to 2002), and that it is “impractical to comply with Microsoft's request while providing modern web functionality”.

An ICO spokesperson has confirmed that it is making “ongoing enquiries” with Google over Microsoft’s claims in order to ensure that Google is complying with both the Data Protection Act and EU privacy regulations. The ICO could not comment on when these enquiries would be completed.

Microsoft's blog post follows an article published 17 February 2012 in the Wall Street Journal, which accused Google of using special code to circumvent privacy settings in Apple's Safari browser, allowing it to track user movement across websites.

The Balancing Act Between Individual Privacy and Public Policies Favoring Open Public Records: NJ Appellate Court Orders Disclosure of Names and Addresses of Senior Citizens

This post was written by Mark Melodia and Frederick Lah.

A New Jersey appellate court has affirmed a lower court's order requiring a county to make available an unredacted list of names and mailing addresses of senior citizens pursuant to the state's Open Public Records Act (OPRA).

The list at issue was compiled by the County of Union to allow for distribution of a senior citizen newsletter. The operator of a website called "Union County Watchdog" requested a copy of the list so that she could disseminate information in furtherance of her website's non-profit civic activities, which included informing the public of government matters. The County had provided the list but only after redacting the mailing addresses, asserting that the senior citizens' privacy interests precluded disclosure under a provision of OPRA which requires that prior to disclosure, the custodian must redact a person's "social security number, credit card number, unlisted telephone number, or driver license number."

The County conceded that a mailing address did not appear on the list of OPRA's redaction exceptions, but instead argued that the addresses in this instance were linked to another identifier -- status as "senior citizen." The court rejected this argument, likening status as a "senior citizen" to status as a "homeowner," neither of which serves as a meaningful identifier:

"[o]ur concern is that the term 'senior citizen' is too broad a label to fall within the purview of a meaningful identifier. It is without definition or parameters. We are not convinced that the designation 'senior citizen' is any more of a personal identifier than the label 'homeowner' … We contrast this with the unique identifier of a social security number."

The court's refusal to treat status as a subgroup of citizens as a personal identifier may be seen as a departure from the recent trend of some U.S. courts to extend the definition of personal information (in certain contexts) to cover data elements not traditionally considered to fit the category, such as ZIP codes. For our previous analysis of the recent privacy litigation surrounding the treatment of ZIP codes, click here.

The court also found that the potential harms that would result from the disclosure, such as unsolicited door-to-door contact or direct mailing, were minimal, noting that the list members had originally signed up to receive information about government services, and that was the type of information the Watchdog site intended to send. The court's stance is in line with the general lack of privacy regulations in the U.S. addressing direct mailing or door-to-door solicitations. It is interesting to note that these potential harms of increased solicitations or inconveniences (not recognized as meaningful by the court here) are often the type of harms that plaintiffs attempt to recover in class action litigation brought after a data breach or theft.

The tension between personal privacy and strong public policies favoring open public records is often the underlying theme in cases where access to government records is at issue. Under this particular set of circumstances, the court ruled in favor of the public interest of having open records over the need to preserve individual privacy, despite the fact that a vulnerable class of senior citizens was involved. The court noted, though, that some of the privacy concerns could be abated by providing notice to those people who sign up to receive the newsletter that their names and addresses are subject to OPRA.

App Privacy Guidelines backed by European Mobile Operators

This post was written by Cynthia O'Donoghue.

On 27 February 2012, with the support of Europe’s largest mobile operators, the GSMA published a set of global Privacy Design Guidelines for Mobile Application Development. These guidelines come just days after the largest US-based App providers, including Google, Apple and Amazon, agreed to legally enforceable privacy standards.

The Mobile App Privacy Design Guidelines are aimed at all companies who are responsible for collecting and processing personal information about mobile users, and include App developers, mobile operators and advertisers. The guidelines encourage the development, delivery and operation of mobile Apps that put users first and help them understand what personal information a mobile application may access, collect and use, what the information will be used for and why; and how users may exercise choice and control over this use. The Guidelines also suggest that users should also be informed before they download an App whether it is supported by advertisements, and mobile advertising should only use information that has been properly obtained. In addition to transparency and privacy matters, the Guidelines include recommendations on data retention and security, use of location data, Apps’ use of social networking and social media, including use by children.

App privacy is a burning issue, with the App industry facing heavy criticism for seeking to get around privacy protections. Recently, Path’s and Hipster’s Apps were exposed for uploading users’ address books without asking for permission. Facebook has also been criticised following revelations that its Android App grants Facebook permission to read users’ text messages.

European mobile operators implementing the Guidelines include France Telecom – Orange, Deutsche Telekom, Vodafone and Telecom Italia. The GSMA said the guidelines encourage the development of Apps that respect “privacy by design” and hoped the Guidelines set a global standard rather than being just for the European market.

Mobile Application Developers: California AG Settlement with Amazon, Google, Apple and Other Mobile Application Platform Providers Sends Privacy Compliance Obligations Your Way

This post was written by Paul Bond, Christopher G. Cwalina, Khurram Nasir Gore, Amy S. Mushahwar and Steven B. Roosa.

A warning from the California Attorney General’s office to mobile app developers: “Don’t get cute!” On February 22, California’s Attorney General Kamala Harris announced that her office and the six leading mobile application platform providers – Amazon, Apple, Google, Hewlett-Packard, Microsoft, and RIM – have agreed to a statement of principles that ask mobile app developers to inform users of their privacy practices before users purchase or download the app. In a press conference, Harris made it clear that failure to comply with the agreed-to principles by the thousands of mobile app developers churning out applications could lead to lawsuits being filed by the Attorney General’s office against developers.

For a detailed analysis, please click here to read the issued Client Alert.

Obama Administration Finalizes Its Privacy Framework: DOC Steams Ahead with Privacy Regulatory Blueprint in the Absence of Federal Privacy Legislation

This post was written by Paul Bond, Judith L. Harris, John P. Feldman, Christopher G. Cwalina and Amy S. Mushahwar.

Today, in a ceremony with much fanfare, Secretary of Commerce John Bryson and Federal Trade Commission Chairman Jon Leibowitz outlined the Obama administration's privacy blueprint for a "consumer bill of rights." Shortly thereafter, the Department of Commerce released its long-awaited consumer privacy report entitled "Consumer Data Privacy in a Networked World" (the "Final Report"), which follows up on a draft staff report issued well over a year ago [see our previous post, Privacy: A Washington Tale of Two Reports].

Like the previous draft, the Final Report calls for a comprehensive privacy framework for all data, instead of the current sector-specific approach to data protection that leaves some personal data (outside of the communications, health care, education, financial services and children's online sectors) largely unregulated. The Final Report calls for federal legislation to create such a "privacy bill of rights" that would supplement and fill in the gaps of existing federal privacy policy. However, scores of privacy bills have been introduced in 2010, 2011 and 2012, and few expect a comprehensive privacy bill to pass during a bitter election year.

Knowing that privacy legislation will be difficult to pass this year, the administration also laid out a set of voluntary privacy standards in the Final Report that could be adopted by industry in the absence of legislation. The Commerce Department indicated today that it is confident industry will adopt this cooperative approach for a privacy public-private partnership. Secretary Bryson also indicated that his office already conducted extensive outreach with Internet companies, data collection companies, retailers, ad networks, privacy advocates, academics and consumer groups to encourage the voluntary adoption of seven data-handling principles:

1. Individual Consumer Control of Data Through Choice Mechanisms
2. Greater Consumer Transparency
3. Respect for Data Context
4. Secure Handling of Data
5. Consumer Data Access & Correction Rights (Data Hygiene)
6. Focused Collection (Data Minimization)
7. Accountability (through audit controls and vendor contractual obligations)

Such a voluntary code, however, comes with a carrot and an eventual stick. FTC enforcement actions regarding online privacy matters are ongoing. The carrot: as indicated in the Final Report, if industry adopts any voluntary code that is developed, the FTC would consider a company's adherence to that code favorably in any investigation or enforcement action brought under its Section 5 unfair and deceptive trade practices authority. The stick comes in a few weeks, when the Federal Trade Commission is expected to release its Final Staff Report on Consumer Privacy, in sync with the administration's blueprint. Non-adherence to the Final FTC Staff Report could be used as evidence of a Section 5 violation, even in the absence of any general federal privacy legislation.

In the coming weeks we will be releasing more granular guidance on how companies should begin evaluating their respective privacy practices, as well as other elements of the staff report (i.e., international harmonization, the role of U.S. state attorneys general, and DOC support of national data breach standard legislation).

Please click here to view additional information from the Reed Smith Teleseminar "The Department of Commerce Steams Ahead with Privacy Regulatory Blueprint: What You Need to Know."

Massachusetts Data Protection Regulations: March 1, 2012 Deadline for Service Provider Contracts

This post was written by John L. Hines, Jr., Paul Bond, Amy S. Mushahwar and Frederick Lah.

The Massachusetts Data Protection Regulations, 201 C.M.R. 17.00, ("Massachusetts Regulations") establish minimum standards to be met in connection with safeguarding the personal information of Massachusetts residents. Personal information is defined as a resident's first name and last name or first initial and last name in combination with the resident's Social Security number, driver's license number or state ID card number, or financial account number.

Under the Massachusetts Regulations, companies that own or license personal information must "oversee" service providers by requiring them by contract to "implement and maintain such appropriate security measures for personal information." See 201 C.M.R. 17.03(2)(f). The Massachusetts Regulations provide a grandfather clause that deems any contract with a service provider entered into before March 1, 2010 to be in compliance, even if it does not have provisions related to adequate data security. This clause, though, expires March 1, 2012, which is quickly approaching. From that date forward, all contracts with service providers must be in compliance with the provision.

All companies—whether the owner/licensor of the information overseeing the service provider, or the service provider (who would also likely be considered a licensor)—need to ensure that any contract (new or existing) touching personal information contains a provision to implement and maintain appropriate safeguards. Such a representation should be accompanied with the requisite due diligence to ensure accuracy and the right to review/audit future compliance.

Contractual modification may prove to be harder for some companies, particularly those operating under medium- or long-term contracts that do not require that a service provider do all the things that the Massachusetts Regulations require. In this situation, good faith and cooperation may not always work. Still, you may be able to rely on contractual clauses requiring compliance with law to effectuate change. At the very least, you should communicate (and document) your expectation of compliance to the service providers.

Markey Releases Discussion Draft of the Mobile Device Privacy Act

This post was written by Amy S. Mushahwar.

Today, in response to the controversy surrounding cellphone tracking software from Carrier IQ, U.S. Representative Edward Markey (D-MA) released a draft of a cellphone privacy bill.

As background, the Carrier IQ software first made headlines in November, when a researcher posted a YouTube video claiming to show that the Carrier IQ software records users' every keystroke, including the websites they visit, the contents of their text messages and their location. Carrier IQ, a California-based software company, says its software is installed on 140 million phones but denies that it tracks keystrokes or users' locations. Carrier IQ now faces a federal investigation and multiple lawsuits on this matter.

The Markey legislation aims to remedy the perceived privacy deficiencies. In its present form, the Markey discussion draft would require companies to:

  • Disclose any mobile tracking software when a consumer buys a device (or after sale, if the software is later installed by a carrier or placed within a downloaded mobile application).
  • Notify consumers what information may be collected, any third parties to which the information would be disclosed and how such information will be used.
  • Obtain express consent before the tracking software collects or transmits information.
  • Require any third party receiving collected personal information to have policies in place to secure the information.
  • Require any third parties to prepare and file agreements on information with the Federal Trade Commission (FTC) and Federal Communications Commission (FCC).

Additionally, the legislation contemplates outlining an enforcement regime for the FTC and FCC, along with State Attorney General enforcement and a private right of action. Representative Markey is the co-chair of the Bi-Partisan Congressional Privacy Caucus, and he has previously investigated the privacy and data security practices of Google, Apple, Facebook, Amazon, and others.

EU Commission sends draft EU General Data Protection Regulation and Directive on Criminal Investigations and Judicial Proceedings to the European Parliament

This post was written by Cynthia O'Donoghue and Nick Tyler.

The European Commission today completed its task of reforming the EU Data Protection Directive by sending a draft Regulation to the European Parliament. The draft Regulation contains comprehensive reforms and seeks to harmonise data protection laws across the 27 EU Member States, and to enhance EU citizens' privacy protections in the age of the Internet.

There will be two tiers of compliance obligations and sanctions, with one aimed at small- to medium-sized enterprises and the other at large, multinational organizations. SMEs are entitled to certain exemptions to ease administrative burdens, such as no requirement to appoint a data protection officer and a sanctions cap of up to €1 million. Multinationals with more than 250 employees in the EU will have to appoint a data protection officer and may face sanctions of up to 2 percent of worldwide annual turnover for serious breaches. Multinationals outside the EU will also have to comply with the data protection rules if they seek to market products and services to EU citizens.

Key provisions include:

A single notification to the data protection authority in the country where an organization has its principal establishment. There remains an obligation to notify and seek prior authorization for a range of processing activity considered to present specific risks, such as systematic and extensive profiling and large-scale video surveillance.

Accountability principle for those processing personal data, including impact assessments for SMEs and top-down accountability for all organisations.

Data breach notification to the national data protection authority if feasible within 24 hours, and to individuals if there is a risk of harm.

Increased individual control over personal data, including a requirement to obtain individuals' explicit consent before their data may be processed, rather than consent being assumed, and the ability to refer matters to the data protection authority in their own country even if the data is processed by a company based outside the EU.

Data Portability will mean that individuals will have easier access to their own data and be able to transfer it from one service provider to another more easily.

A right to be forgotten allows individuals, including children, the ability to delete their data if an organization does not have any legitimate grounds for retaining it. The right provides exemptions for legitimate historic data such as newspaper archives, and seeks to balance the right to privacy with the right to free speech.

The sanction regime has at least been watered down from the draft Regulation circulated in November 2011, which had proposed sanctions of up to 5 percent of worldwide annual turnover.

There have been some ‘business-friendly’ changes to the draft Regulation as compared with the earlier November draft. The proposal for an opt-in for commercial marketing has been substituted with an opt-out, and the provisions relating to children’s privacy now require parental consent for children under the age of 13, rather than 18.
In addition, while there is an emphasis on binding corporate rules for international data transfers outside of the EU, contractual clauses, EU standard contracts, and findings of adequacy, as well as international commitments by countries or international organizations such as U.S. Safe Harbor, will still apply. Given the changes contemplated under the draft Regulation, existing international data transfer mechanisms may need to be reviewed and amended if the draft Regulation is adopted.
The new European Data Protection Board will no longer act as a supernational regulator in relation to approving enforcement actions and sanctions as proposed in the November version of the draft Regulation. Instead, its powers will be limited to ensuring consistent application of the Regulation without the power to overrule decisions in individual cases.
The Commission's proposed draft Regulation and accompanying Directive now go to the European Parliament and EU Member States (meeting in the Council of Ministers) for discussion. The Regulation will only take effect two years after adoption by the European Parliament, and we would expect further changes as it makes its way through the legislative process. That means any changes are probably close to three years down the road.

When might a private email account become 'public property'? Freedom of information guidance may lead to erosion of privacy for employees

This post was written by Cynthia O'Donoghue and Nick Tyler.

There will always be a tension implicit in the relationship between freedom of information and data protection laws. In the United Kingdom this is usually alleviated by the fact that both are regulated by the same person/body, the Information Commissioner’s Office (ICO). However, recently published ICO guidance, aimed at public authorities under the Freedom of Information Act 2000 (FOIA), could provide an arguable basis for allowing private sector organisations to search their employees’ private email accounts for work-related communications or company business to respond to subject access requests made under the Data Protection Act 1998 (DPA) or other legitimate requests, such as e-discovery/disclosure.

The ICO guidance 1 was prompted by reports of government ministers, elected representatives and/or public sector officials using their non-work personal email accounts (e.g. Hotmail, Yahoo and Gmail) for work-related communications and official business. Concerns that this may have been done in a deliberate attempt to circumvent the FOIA regime prompted the regulator to act. The ICO guidance makes it clear that information held in such accounts and relating to official business of a public authority is “held by the authority” and/or “held by another person on behalf of the authority” and is therefore in scope of a request made under FOIA.

We wonder whether by ensuring no stone is left unturned to identify all information within the scope of FOIA requests this guidance might have some unintended consequences, by analogy, in the context of subject access requests made under the DPA.

The guidance requires public authorities that have established the existence of such information to ask the individual “to search their account for any relevant information”. A record of such action needs to be kept “to demonstrate, if required, that appropriate searches have been made in relation to a particular request”. This may arise in the course of the ICO’s investigation of a complaint under FOIA.

The guidance recommends clear policies for email/acceptable use of IT systems, and records management, in an effort to address the acknowledged “complications” arising from the onerous requirement to request “searches of private email accounts, and other private media”.

Addressing similar “complications” could lead to employers exerting their authority over their employees in attempting to either identify all personal data within the scope of a data subject access request or within the scope of a company’s legitimate business interest, such as would be required to respond to disclosure/discovery. The rationale behind the guidance could just as easily be applied, by analogy, to those occasions when the ICO deems it appropriate that such searches should extend to personal email accounts and home computers, where these have been used to process personal data for which the employer is the data controller.

Such unintended consequences inevitably raise genuine concerns about the erosion of privacy in the workplace. At this point such concerns are likely to surface in the public sector workplace, unless accepted as the inevitable price of greater openness in the public sector. 


1 “Official information held in private email accounts”, ICO, dated 15 December 2011

The European Court of Justice rules twice in one day on data protection issues: Emerging clarity and consistency is in everyone's interests.

This post was written by Cynthia O'Donoghue and Nick Tyler.

“You wait for ages for one and then two turn up at the same time!” The European Court of Justice issued two significant rulings this past November.

The first addressed the manner in which Spain enacted the Data Protection Directive. In Asociación Nacional de Establecimientos Financieros de Crédito (ASNEF) v Administración del Estado (C-468/10) and Federación de Comercio Electrónico y Marketing Directo (FECEMD) v Administración del Estado (C-469/10), the claimants challenged Spain’s national data protection law (Organic Law 15/1999), which imposed the extra condition that personal data must be in the public domain when processed based upon a data controller’s legitimate interests. The ECJ ruled that Article 7(f) of the Data Protection Directive 95/46/EC was sufficiently precise to have direct effect in Member States’ national laws because it sets out an exhaustive list of conditions for the lawful processing of personal data, and as such Member States may not impose additional conditions.

The surprising aspect of this case, in our view, is that it has taken until now to gain a degree of consistency of interpretation for what is a relatively straightforward provision of EU data protection law. In our experience the misinterpretation of this provision in Spanish law has presented real practical difficulties to clients implementing run-of-the-mill applications involving non-sensitive personal data. The resulting emphasis in Spain on the need to gather consent has inevitably introduced increased bureaucracy and associated costs.

The other case, Scarlet Extended SA (Scarlet) v Société belge des auteurs, compositeurs et éditeurs SCRL (SABAM) (Case C-70/10), stemmed from a referral to the ECJ by the Belgian court and has important implications for the practical enforcement of copyright infringement cases. SABAM, a management company representing owners of copyright-protected works, took legal action against Scarlet, an Internet Service Provider (ISP), because Scarlet’s users were downloading works in SABAM’s catalogue through peer-to-peer networks/file sharing and so infringing copyright.

In the legal proceedings SABAM asked the Belgian courts to make an order requiring the ISP to stop such infringements “by blocking, or making it impossible for its customers to send or receive in any way files containing a musical work using peer-to-peer software without permission”. The technical solution would involve a systematic analysis of all content and the collection and identification of users’ IP addresses from which unlawful content was sent, which may also result in the blocking of lawful content. The local Belgian court granted SABAM’s request for an injunction.

Scarlet appealed, claiming that the injunction would be unlawful on several grounds, most notably in the context of data protection and privacy by breaching Belgian laws implementing Directive 2000/31, prohibiting the monitoring of communications and the general surveillance of all communications passing through the ISP’s network, and Directive 95/46/EC because the filtering system would involve the processing of IP addresses, which are personal data.

The ECJ ruled that the technical solution did not strike a fair or proportionate balance between the protection of intellectual property right holders and the freedom of businesses such as ISPs to conduct their business, nor was a fair balance struck between the protection of copyright and the fundamental rights of individuals, in this case the ISP’s customers.

Crucially, the ECJ noted the impact on the ISP’s customers and the infringement of their fundamental right to protection of their personal data (Article 8 of the Charter of Fundamental Rights of the EU) and their freedom to receive or impart information (Article 11 of the Charter).

This ruling essentially validates the Art. 29 Working Party’s opinion that in the hands of ISPs, IP addresses are personal data because “they allow those users to be precisely identified.” What is unclear from the ruling is whether IP addresses are also considered to be personal data when processed by organizations that would not have access to names and account information that would enable such precise identification.

Tougher EU Data Protection Laws on the Horizon

This post was written by Cynthia O'Donoghue.

In a bid to strengthen European data privacy rules and further protect EU consumer privacy, non-European companies are most likely to be held to the same standards as European companies.

The EU Justice Minister, Viviane Reding, and the German Consumer Protection Minister, Ilse Aigner, released a joint statement saying that the proposed reforms to the Data Protection Directive due at the end of January 2012 will be changed so that consumers’ privacy is protected regardless of a company’s country of origin. “We both believe that companies that direct their services to European consumers should be subject to EU data protection laws. Otherwise they should not be able to do business on our internal market.”

Reding and Aigner focused their statement not just on social networks but also on data that is stored in a ‘cloud’. They stressed that consumers should have more control over their data and stated “EU law should require that consumers give explicit consent before their data are used. And consumers generally should have the right to delete their data at any time, especially the data they post on the internet themselves.”

The joint statement leads us to conclude that both a new principle of accountability and a ‘right to be forgotten’ will be included in the revised EU data protection law. The statements are also consistent with the increased pressure on companies, like Google and Facebook, that operate outside the European Union but target EU-based consumers, to fully comply with EU data protection laws. The pressure on such companies can also be seen as a natural progression from the investigations into their handling of personal data that have emanated from France, Germany, the UK and Ireland. To prepare for the new horizon, organisations should start by thinking about compliance.

How to Craft Plain Language Privacy Notices and What Constitutes "Material Change"

This post was written by Christopher G. Cwalina.

Privacy policies have been reviled for their incomprehensibility; regulators are calling for clearer disclosures; and, increasingly, statutes require that privacy notices be written in plain language. In this program, our seasoned panelists—including a plain-language expert—will use real-world examples to help you craft a clear and consumer-friendly privacy notice that also satisfies legal requirements. Find out how to turn legalese into easy reading using common words; short, declarative sentences; and an emphasis on action and choice.

In addition, the FTC has said that under well-settled case law and policy, companies must provide prominent disclosures and obtain opt-in consent before using consumer data in a materially different manner than claimed when the data was collected, posted, or otherwise obtained. What constitutes using data in a materially different manner than originally claimed can be difficult to ascertain. Companies regularly develop new products and services involving new data uses. The line between an existing, already-disclosed use of data and the start of a materially different use that needs to be independently disclosed is not always clear, and privacy professionals are left to make this decision. Hear directly from an Assistant Director from the FTC's Division of Privacy and Identity Protection on this point.

Privacy Compliance: Not Just a Luxury Anymore

This post was written by Mark S. Melodia and David Z. Smith.

On August 29, 2011, a Google shareholder filed a derivative action against the company’s directors stemming from Google allegedly allowing and supporting Canadian and other foreign pharmacies to advertise and ship prescription drugs to American consumers through Google’s AdWords advertising program in violation of U.S. law. The lawsuit comes on the heels of the announcement days earlier of a $500 million settlement between Google and the U.S. Department of Justice over an investigation of those same advertising practices. Google’s AdWords program displays sponsored advertisements in response to specific searches entered into Google’s search function. AdWords not only allows advertisers to target certain search terms, but to geo-target the searchers, so that certain advertisements will only appear for search terms entered by individuals within a certain geographic location. Plaintiff thus alleges that the directors breached their fiduciary duties and wasted corporate assets by, among other things, failing to ensure that Google had proper internal controls that would have prevented Canadian pharmacies from geo-targeting U.S. citizens with advertisements for prescription drugs.

This lawsuit is the latest in a growing line of derivative and securities fraud complaints based on alleged lack of internal controls over data security and privacy. In past cases, companies such as Heartland Payment, ChoicePoint, TJX, and more recently, Sony, have all been sued for allegedly failing to develop and maintain an adequate security environment, thereby allowing consumers’ private information to be exposed and forcing the companies to expend scarce corporate resources to prevent litigation losses or further reputational hits. The Google case shows that companies not only face the risk of derivative or securities fraud actions over the failure to protect consumers’ data, but may also be forced to defend any failures to control how their systems are used (or possibly misused) by a third-party to target consumers they should not be allowed to target. With the increasing sensitivity over on-line data security and privacy, and growing public awareness of web/search advertising functionalities such as AdWords or sites that allow third-party communication and geo-location check-ins (like social media sites), these lawsuits are likely to become more frequent. Such cases also deliver a fresh reminder to senior management of how strong privacy compliance programs and practices have come to be regarded as a critical component of good corporate governance and behavior.

The end of the News of the World marks the beginning of the end for wholesale privacy intrusions by the media - the Information Commissioner says, "I told you so!"

This post was written by Nick Tyler.

The closure of the News of the World, the best-read Sunday newspaper in the English language, is a stark illustration of the reputational and commercial damage that can result from privacy-intrusive practices carried out in the name of ‘investigative journalism’.

The UK’s phone-hacking scandal, which has been rumbling for years, blew up this week after it came to light that the targets were not just public figures and celebrities, but ordinary people (and their families) who had been the victims of crime, terrorism and war. Such egregious and unconscionable behaviour prompted an advertising boycott by companies, with the result that the last edition of the newspaper this Sunday will carry no commercial advertising.

Ultimately, for the newspaper’s owner Rupert Murdoch, the reputational price proved too high, as the scandal threatens the share price of News Corporation International as well as its multi-billion pound takeover of BSkyB in the face of universal public outrage.

As the criminal investigation finally gets into gear, with arrests of high-profile figures expected and a public inquiry ordered by the Prime Minister, it is worth noting that the UK’s data protection regulator, the Information Commissioner Christopher Graham, this week reminded everyone that over five years ago his office (the ICO) first brought to light the unlawful trade in personal information with two special reports to Parliament, ‘What Price Privacy?’ and ‘What Price Privacy Now?’.

When first publishing these reports the ICO pressed for the strongest possible sanctions for those found guilty of the most serious criminal offences under UK data protection law. Those representations resulted in a power to change the law (see section 77 of the Criminal Justice and Immigration Act 2008). This power would enable the penalty for breaches of section 55 of the Data Protection Act 1998 to include custodial sentences. However, it has not yet been exercised by the UK Government.

On the back of the latest scandal, the Commissioner this week called for that power to be exercised. We can expect that call to grow stronger and louder over the coming weeks and months.

"Stick, Twist or Bust?" UK Minister warns EU Commission not to gamble with the future direction of data protection law.

This post was written by Cynthia O’Donoghue and Nick Tyler.

The UK Minister responsible for government policy on data protection has raised concerns about any proposed “radical rewrite” of the EU Data Protection Directive.

Kenneth Clarke, Lord Chancellor and Secretary of State for Justice, called for both flexibility and a common-sense solution to modernising data protection law. He recognised that “technology has moved on” and that future EU regulation of data protection must address the “broader landscape” without getting caught up in “endless” debate “over the details”.

The flagging at this stage of some fundamental UK opposition to a number of specific reforms does not bode well for a happy consensus emerging from the EU-wide negotiations to follow the hotly anticipated publication of the EU Commission proposals:

What are seen as ‘Bad Ideas’?

  • A new “right to be forgotten” – Worried about its impact on both business and the public, Mr Clarke made it plain that he wants the “right to be forgotten” to be forgotten!
  • Revision of the Data Retention Directive – Mr Clarke staunchly defended the ability of law enforcement authorities across the world to collect, retain and pool data to improve security, in spite of concerns from privacy regulators and advocates.
  • EU extra-territoriality – While acknowledging the aspirational “idea that European standards [of data protection] should apply to any firm processing EU citizens’ data anywhere in the world”, Mr Clarke was withering in his assessment that, on purely legal grounds, the European Commission must be “wrong”:

“I see little sign that the Commission has thought about this sufficiently yet. And how on earth are you going to enforce EU [data] protection on a global basis?”

Any ‘Good’ Ideas?

The Accountability Principle and Binding Corporate Rules – Referring to the UK’s consultation on revision of the EU Data Protection Directive, Mr Clarke backed a more business-friendly solution:

“. . . [W]e should consider moving from a system which restricts information based on national standards of data protection, to a system based on the standard of data protection of the particular company involved – far more relevant to modern methods of business.”

Raising the Stakes for the Future of EU Data Protection?

The UK Government appears to be set against a move toward harmonisation. In Mr Clarke’s view, sticking to a set of shared principles and values, which at present have been implemented and are enforced in 27 different ways, would allow each country to be true to its own “constitutional and cultural identities”:

“. . . let’s learn to understand each other’s legal systems better, not rewrite our respective statutes and codes from scratch.”

This is a challenging prospect for global businesses trying to understand and comply with local law variations across Europe. They can only hope that the future EU data protection regime delivers some significant improvements to work with, and avoids the imposition of bad ideas in the form of arbitrary, additional and onerous obligations.


A Supreme Court Win For Free Speech About Medical Options

This post was written by Paul Bond and Joe Metro.

States regulate doctors in issuing prescriptions. The States keep databases that show which doctors prescribe what medicines, for what purposes, and when. That information is valuable to anyone who would seek to locate doctors with certain prescription-writing habits. For example, a database user might seek out doctors to suggest that those doctors try a different drug or combination of drugs as a more effective treatment. Some doctors objected to being contacted with such suggestions, especially by commercial drug manufacturers. As a consequence, several States passed laws banning the purchase and use of prescription-writing records for purposes of commercial outreach to health care professionals. Vermont's law was challenged by IMS Health, a major provider of information services to the health care industry, among others. The United States Court of Appeals for the Second Circuit, at IMS Health's urging, struck down Vermont's law as imposing an unconstitutional impairment on commercial free speech. Today, in a 6-3 decision, the United States Supreme Court agreed, adopting a position that Reed Smith helped advance.

Justice Kennedy, writing for the majority in Sorrell v. IMS Health, stated that: "Speech in aid of pharmaceutical marketing . . . is a form of expression protected by the Free Speech Clause of the First Amendment. As a consequence, Vermont’s statute must be subjected to heightened judicial scrutiny. The law cannot satisfy that standard." The Court noted that Vermont's law would allow academics to use prescriber-identified information to promote generic drug use. However, the same law would block the makers of brand-name drugs from reaching out to doctors in a comparable, high-touch informational campaign. Thus, "the law on its face burdens disfavored speech by disfavored speakers." Lacking a compelling reason for this viewpoint-based discrimination, Vermont's law could not stand.

The dissent, authored by Justice Breyer, called for a more relaxed standard of review to be applied to the challenged State regulations. The dissent argues that the speech in question is commercial; that limits are routinely put on marketing speech especially in connection with health and safety; and moreover, that the States should be afforded great leeway in deciding for what purposes these State-created databases of prescription information are sold and used.

Reed Smith participated in this case to further explain to the Court the public health benefits arising from targeted commercial use of prescription-writing data. Reed Smith's team drafted and filed an amicus brief supporting IMS Health's position. Reed Smith submitted that brief to the Court on behalf of two former United States Secretaries of Health and Human Services (Dr. Louis W. Sullivan and Governor Tommy Thompson) as well as the Healthcare Leadership Council. The decision of the Court today is fully consistent with the positions advanced by these public health experts. Of note, the Court specifically cited and endorsed the public health benefits of a free flow of information about treatment options. As the Court found: "A consumer’s concern for the free flow of commercial speech often may be far keener than his concern for urgent political dialogue. That reality has great relevance in the fields of medicine and public health, where information can save lives."

FTC Seeks Public Comment For Revising the "Dot Com Disclosures"

Careful Consideration is Advised, as FTC's Guidance May Inform Federal and/or State Enforcement Actions

Comments Deadline: July 11, 2011

This post was written by Christopher G. Cwalina, Amy S. Mushahwar, and Frederick Lah.

The Federal Trade Commission ("FTC") seeks public comment as it considers updating and reissuing "Dot Com Disclosures: Information about Online Advertising", its business guidance document for online marketers on how to provide clear and conspicuous disclosures to consumers.

In its request for comment, the FTC cites the dramatic changes in the online world since the guidance was originally published in 2000, particularly the emergence of mobile marketing, the "App" economy, the use of "pop-up blockers," and online social networking. (This recognition of mobile is particularly important in light of last week's letter by Senator Al Franken (D-MN) to Google (maker of the Android) and Apple (maker of the iPhone and iPad) asking that all mobile apps for their devices provide "clear and understandable privacy policies.")

Even though the "Dot Com Disclosures" are considered guidance and not formal regulations, the FTC has used the document to inform Section 5 enforcement actions. For example, in one consent order, the FTC required that the company's representations about its advertisements be made "clearly and prominently," defining "clearly and prominently" almost verbatim from the guidance. The FTC also cited the guidance back in 2002 in response to a complaint brought by Commercial Alert against search engines like AOL and Microsoft for their allegedly misleading disclosures about the advertisements placed on search result lists. State courts have also cited the guidance. In 2009, a Texas court stated that in determining what constitutes deceptive conduct under Texas' Unfair Trade Practices Act, courts "are to be guided by the interpretations of that term in the guidelines of the FTC," and found that those guidelines require that disclosures be “clear and conspicuous” based on the placement of the disclosure on the webpage and its proximity to the other relevant information.

The FTC seeks comment from the industry on a number of issues. In the request for comment, the FTC provides a series of questions to help companies consider what type of revisions need to be made, such as:

  • What issues have been raised by new online technologies, Internet activities, or features that have emerged since the business guide was issued (e.g., mobile marketing, including screen size) that should be addressed in a revised guidance document?
  • What issues raised by new laws or regulations should be addressed in a revised guidance document?
  • What research or other information regarding the online marketplace, online advertising techniques, consumer online behavior, or the effectiveness of online disclosures should be considered in a revised guidance document?
  • What specific types of online disclosures, if any, raise unique issues that should be considered separately from general disclosure requirements?
  • What guidance in the original “Dot Com Disclosures” document is outdated or unnecessary?
  • What guidance in “Dot Com Disclosures” should be clarified, expanded, strengthened, or limited?
  • What issues relating to disclosures have arisen from multi-party selling arrangements in Internet commerce, such as (1) established online sellers providing a platform for other firms to market and sell their products online, (2) website operators being compensated for referring consumers to other Internet sites that offer products and services, and (3) other affiliate marketing arrangements?

Regardless of how the guidance is ultimately revised, the FTC will certainly continue to use this sort of guidance to inform its enforcement efforts. We recommend that companies carefully review the Dot Com Disclosure guidance and questions posed in the request for comment. Companies should analyze how any new guidance might affect their advertising practices and consider whether they should provide comments. July may seem like several weeks away, but because these issues are likely to impact advertising for multiple product lines within your company, we encourage you to begin an internal dialogue on this proceeding immediately.

Commissioner Brill Introduces Competition Analysis to Privacy Debate

This post was written by Paul Bond and Chris Cwalina.

In her new article, "The Intersection of Consumer Protection and Competition in the New World of Privacy," Federal Trade Commissioner Julie Brill cautions that the pursuit of privacy may conflict with the pursuit of a competitive market. Commissioner Brill's article, published in the Spring Edition of Competition Policy International, notes that the Federal Trade Commission's role is to protect consumers from many types of market failures. The FTC strives to protect consumers from unfair and deceptive information collection and use practices. But, at the same time, the FTC protects consumers from collusive and other anti-competitive behaviors. Commissioner Brill identifies a potentially problematic range of privacy enhancements which could, paradoxically, harm consumers by stifling competition. In this position, Commissioner Brill goes further than the FTC's preliminary white paper, "Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers" (2010 Privacy Report).

For example, Commissioner Brill asserts that self-regulation to date has been "slow and inadequate". This mirrors criticisms in the 2010 Privacy Report. But Commissioner Brill goes on to posit that dominant companies can misuse privacy self-regulation to stifle market entry by new competitors. The Commissioner does not describe in any detail the manner in which such an anti-competitive plan would be carried out. Presumably, the cost in money or time of complying with the industry's self-regulation would prove prohibitive for fledgling businesses, while just a "cost of doing business" for better capitalized industry leaders. There may also be a concern that existing businesses, which already hold stockpiles of consumer information, would erect barriers to data collection which would affect new enterprises disproportionately.

Commissioner Brill also raises the competitive concern that privacy regulation not unfairly benefit new entrants. "Indeed," she recognizes, "some more established data brokers and other information firms believe it is much easier for their newer competitors to design privacy protections into their new business models and new forms of communications than it is to retrofit old systems to meet the realities of today's privacy concerns."

Until now, a strategic analysis of the competitive impact of privacy regulation has not been an FTC priority. Indeed, in her article, Commissioner Brill notes that she writes only for herself, and is not reflecting the views of the Commission or the other Commissioners. Still, taken in conjunction with Commissioner Rosch's recent opinion that the Google Buzz settlement may have been a strategic ploy by Google to create insurmountable regulatory barriers to entry, it is safe to say the FTC is increasingly wary of privacy regulation being misused for private ends. Advocates of self-regulation, as well as those seeking to advance or defeat governmental regulation, must be prepared to explain why their privacy regulation or self-regulation proposals are consistent with a vigorous free market. Advocates of industry self-regulation already know that the FTC has criticized efforts to date; this is yet another hurdle that must be cleared before the FTC deems self-regulation sufficiently robust and workable.

Given how easy it is to transfer information as an asset between corporate forms, and from one part of the world to another, the prospect of strategic resistance to, or abuse of, privacy regulation by companies around the world is substantial. Commissioner Brill performs a service by injecting a note of economic realism into the ongoing debate about how information can and should be regulated in the 21st century.

China Announces State Internet Information Office

This post was written by Joseph I. Rosenbaum, Frederick H. Lah, Zack Dong and Amy S. Mushahwar.

On May 4, 2011, the Chinese government announced it was establishing the State Internet Information Office, an office dedicated to managing Internet information. According to the announcement, this office will be responsible for directing, coordinating, and supervising online content management. The office will also have enforcement authority over those in violation of China's laws and regulations (see, for example, China sets up office for Internet information management). While there are reports that many believe the purpose of the new office will be to censor political and social dissidents (see China Creates New Agency for Patrolling the Internet), the office may also have a key role in thwarting illegal spamming and other dubious data practices.

To read the complete blog post, click here.

Does "Public" Privacy Exist?

This post was written by Mark Melodia, John Hines, and Frederick Lah.

Just how much privacy are we entitled to in public places, such as public highways and buses, classrooms, restaurants, or even on the Internet? While we expect to lose some sense of privacy when we move into public spaces, does this mean that we should be subject to being recorded (and subsequently publicized on a site like YouTube) anytime we are in public? Two recent cases involving the recording of police officers highlight the debate surrounding these questions.

Back in April 2010, motorcyclist Anthony Graber was charged with violating Maryland's wiretapping laws after he used a camera in his helmet to videotape a state trooper brandishing his gun while stopping Graber for speeding. To see the YouTube video, please click here. The Maryland court dismissed the charges, holding that "[i]n this rapid information technology era in which we live, it is hard to imagine that either an offender or an officer would have any reasonable expectation of privacy with regard to what is said between them in a traffic stop on a public highway."

Later, in March 2011, the ACLU, on behalf of Khaliah Fitchette, filed a complaint against the City of Newark, N.J., after Fitchette was handcuffed and detained for using her smartphone to record two police officers dealing with a disorderly man on a bus. Fitchette was allegedly detained for two hours in the back of the squad car, but no charges were filed against her. Fitchette's phone was seized by the police and the video was deleted. The complaint alleges violations of the Fourth Amendment and of Fitchette's First Amendment right to record and disseminate the video. The case has not yet been decided.

These two cases illustrate the debate over whether police officers should be subject to being filmed or recorded while performing their duties. On the one hand, some would argue that a free and open society ought to tolerate and even encourage the rights of citizens to record and publish the activities of their public servants, especially police officers; indeed, some might argue that recording arrests and other demonstrations of police power may help reduce the incidence of abuse and unlawful invasion of individual rights. On the other hand, there is a legitimate concern that being recorded and subsequently publicized might have a chilling effect on an officer's willingness to act swiftly in critical situations and thereby jeopardize public safety and welfare.

On a deeper level, though, the reluctance that some police officers feel about being taped may serve as a visible demonstration of the reluctance that many people feel about their lack of "public" privacy. As new technologies with recording capability continue to become more widespread, any one of us is subject to being recorded anytime we step out into public. What's more, such recordings may be uploaded to YouTube and publicized to the world at the press of a button. As Harvard Law Professor Jonathan Zittrain notes in one of his books, "[C]itizens can quickly distribute to anywhere in the world what they capture in their backyard … The presence of documentary evidence [ ] creates the possibility of getting fired or disciplined where there had not been one before … As our previously private public spaces, like classrooms and restaurants, turn into public public spaces, the pressure will rise for us to be on press conference behavior."

Similarly, as Internet marketing companies continue to find new ways to track and utilize consumer information, how much privacy should people be entitled to as they browse the Internet? For example, there have been a number of lawsuits over the past year brought by consumers against companies for their use of Flash cookies / Local Shared Objects ("LSOs"). The suits generally contend that companies, without permission, use Flash cookies / LSOs to track and follow consumers as they browse the Web. While each of these suits involves individualized questions of fact, collectively they raise important social (and political) considerations on this issue of "public" privacy. Despite the fact that the Internet is largely considered to be a public place -- whether as a forum to exchange ideas or as an online marketplace -- these lawsuits show that people still feel entitled to a sense of personal privacy as they use the Internet. Perhaps the disconnect lies in our society's continued reliance on the Warren and Brandeis standard of the "right to be let alone." Some scholars have suggested that that standard no longer applies, and that the focus should instead be on preventing the tangible harms that can result when data is entrusted to a third party.

Whether we are entitled to some sense of "public" privacy is a debate that addresses important public policy considerations going to the heart of how we control what others think of us and how we maintain our ability to shape and manage our identity, reputation, and personal information. There is obviously no easy answer. The only thing that is clear is that there is no specific state or federal legal scheme designed to address this issue. As Congress and State legislatures continue to wrestle with these questions, we will continue to monitor developments.

'What Cookies Are In Your Jar?' - ICO's guidance on compliance with new EU cookie law leaves industry something to chew on (and few crumbs of comfort!)

This post was written by Cynthia O'Donoghue and Nick Tyler.

With two weeks to go until implementation of an EU-wide amendment to the law on cookies and consent, the UK’s data protection regulator, the ICO, has issued initial guidance on compliance. It proposes three actions that organisations can take to mitigate their potential exposure to enforcement action in the short-term. In the meantime, industry and the authorities are working on finding solutions to the most complex and challenging issues presented by the new law.
In our Client Alert we look more closely at what organisations need to be doing now to comply with this new EU-wide regime.  Reed Smith's Legal Bytes blog also recently posted on the topic.

California Senator Proposes State "Do-Not-Track" Bill

This post was written by Kathyleen A. O’Brien.

On April 6, 2011, California State Senator Alan Lowenthal (D-Long Beach) introduced a version of “do-not-track” legislation in the form of SB 761. An initial hearing will be held by the California Senate Judiciary Committee on April 26.

The bill largely follows the current “do-not-track” framework being proposed by U.S. Rep. Jackie Speier (D-CA) and others in Congress. Many, including Sen. Lowenthal, see the California bill as a way to spur action on the national level. Although privacy is largely viewed as a bipartisan issue, Lowenthal is hoping that because the Democrats control the California governorship and legislature, the process of passing a “do-not-track” bill will be quicker and smoother on the state level. Interestingly, the effort is attracting at least some bipartisan support with Judiciary Committee member Sen. Tom Harman (R-Huntington Beach) expressing interest in tackling the issue of online tracking. Ultimately, passage of the bill would, once again, put California out in front on online consumer protection issues much like its “do-not-call” and data breach laws have in the past.

The bill requires the Attorney General, in consultation with the California Office of Privacy Protection, to adopt regulations that would require companies doing business in California that collect, use, or store online data regarding consumers to provide those consumers with a way to opt out of such practices. Additionally, the bill would grant the Attorney General power to impose regulations that may, among other things, require companies to provide consumers with access to their personal data, and a clear and easy to understand data retention and security policy. As a nod to the business community, the Attorney General would have the power to create exemptions for commonly accepted business practices.

Any company that willfully fails to comply with the adopted regulations would be liable to consumers in a civil action, with statutory damages ranging from $100 to $1,000. The bill would also allow punitive damages, as determined by the court, as well as costs and reasonable attorney’s fees.

Research for this post was conducted by Legal Intern Noah Cherry.

'The Four Pillars of Wisdom'? EU Commissioner's speech signals key areas for reform of EU privacy rights

This post was written by Cynthia O'Donoghue and Nick Tyler.

In a recent speech, Viviane Reding, the EU Commissioner with responsibility for European Union data protection policy, identified ‘four pillars’ upon which the privacy rights of EU citizens “need to be built” so that individuals have more control over their personal data in today’s online world.

Reforming EU data protection is Commissioner Reding’s “top legislative priority” and the new proposals are expected this summer.

The ‘four pillars’ are:

  • The right to be forgotten,
  • Transparency,
  • Privacy by default, and
  • Protection regardless of geographic location.

The “right to be forgotten” (also alarmingly termed the “right to oblivion”) will comprise “a comprehensive set of existing and new rules to better cope with privacy risks online”. This new “right” will require the data controller to demonstrate the need for collecting personal data and to delete data held if consent to processing is withdrawn.

While transparency has always been a fundamental principle, Commissioner Reding is advocating transparency as a new right. This would fundamentally shift transparency from being an obligation on data controllers to a right giving individuals more control over their data. The shift seeks to address the concerns of regulators and policy makers (particularly in the context of social networks) that personal data is being misused, especially the personal data of young people. These paternalistic concerns appear to be driving Commissioner Reding’s call for “privacy by default”.

There is potential for confusion with this new term, in that “privacy by default” could easily be mistaken for the concept of “Privacy by Design”, which was recently adopted as a guiding principle by the global data protection community – see our earlier blog post and Client Alert. In fact, “privacy by default” is a much more basic idea and signals a policy shift towards more explicit consent from individuals. Its implementation would challenge the data collection practices currently relied upon by available software applications. While the focus on “explicit consent” is initially concerning, Ms Reding does appear to recognise lawful bases for collection and use other than consent. We can only hope that the “legitimate interests” of the controller will continue to provide a lawful basis to rely upon in practice, subject, of course, to any overriding interest of an individual.

Commissioner Reding has taken a particularly robust stance on the extra-territorial application of EU data protection laws to ensure protection of EU citizens’ data irrespective of geographic location:

“Any company operating in the EU market or any online product that is targeted at EU consumers must comply with EU rules.”

To make this commitment more realistic in practice, the Commissioner recognises the need to “reinforce the independence and harmonise the powers” of Member States’ privacy regulators through a more coordinated approach to EU-wide enforcement and regulation.

That’s a mighty challenge in itself since the existing European data protection landscape remains notoriously inconsistent and unpredictable with many regulators anxious to address criticism of ineffectual regulation by exercising enforcement powers. This is all likely to increase the heat on the compliance and legal functions, as well as the boardrooms, of many enterprises with EU operations. It looks like we can all look forward to a long, hot summer!

Israel is welcomed to the ranks of EU-approved personal data destinations

This post was written by Nick Tyler.

The EU Commission has recently approved Israel as a country providing “an adequate level of protection for personal data transferred from the European Union”.

This follows a lengthy process which was nearly derailed by Irish Government objections, following the assassination in Dubai last January of a Hamas official, allegedly carried out by agents of Mossad, Israel’s Secret Service, and associated allegations of identity theft involving the passports of Irish (as well as UK) citizens.

Israel has now joined a very select band of countries, including Argentina, Canada and Switzerland, which have received the EU-data protection ‘seal of approval’. This group is likely to expand further in the coming months, with the expected addition of Uruguay following the positive opinion of the Article 29 Working Party in October last year (see our related blog post). The equivalent opinion on Israel was issued as long ago as December 2009.

Israel’s data protection regulator, ILITA (the Israeli Law Information and Technology Authority) has been formally recognised as an independent supervisory authority in spite of its links with the Israeli Ministry of Justice. In October last year ILITA hosted the International Conference of Data Protection and Privacy Commissioners in Jerusalem.

This Decision marks an important legal and commercial development as it enables the automated international transfer of personal data from the EU to Israel between corporate affiliates, or from European corporations to data processing operations in Israel. It also covers non-automated transfers of personal data that will be subject to further automated processing in Israel.

There are two restrictions on the scope of the Decision:

  • It does not cover transfers of manual data that will remain subject only to non-automated processing in Israel.
  • EU data protection regulators will monitor the effectiveness of ILITA when it comes to enforcing privacy and data protection laws in Israel, to verify that personal data transferred to Israel is adequately protected in practice. The rights of those EU regulators to suspend data flows to any particular recipient in Israel have been expressly reserved.

While this Decision provides a way past the legal obstacle that previously restricted the transfer of personal data from Europe to Israel, when it comes to other data protection compliance obligations it is important that clients operating, or otherwise conducting business, in Israel take appropriate legal advice and assistance on compliance with local laws.

European Commission Communication on personal data protection in the European Union - A seasonal wish-list for a harmonious future?

This post was written by Nick Tyler and Cynthia O'Donoghue.

With so much consultation activity going on in the United States on the future of privacy regulation and enforcement, initiated by the FTC and US Department of Commerce, we should not lose sight of parallel developments and consultation activity going on in Europe following a recent Communication from the European Commission.

Now seems to be an appropriate time of year to take stock and highlight the key themes of that Communication and what it might mean for clients as they look to address and/or progress their data privacy compliance programmes in the year(s) ahead. We have therefore published a Client Alert which takes a closer look at the emerging themes and what lies ahead in 2011. 

Read the full Client Alert here.

Department of Commerce Privacy Green Paper -- Detailed Digest

This post was written by Amy Mushahwar.

As promised in our teleseminar last week, we have digested the Department of Commerce privacy green paper, entitled "Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework". The green paper will kick-start an ongoing discussion of privacy, and we encourage organizations to undertake some cost-benefit analysis now for the best outcome in 2011. Time is of the essence and comments on this green paper are due on January 28, 2011. To learn more about this important release, please read our recent client alert.

Privacy: A Washington Tale of Two Reports

This post was written by Mark Melodia, Judy Harris, Chris Cwalina, Paul Bond, and Amy Mushahwar.

We've been busy here in Washington with two seminal privacy reports released within a span of two weeks.  At Reed Smith, our interdisciplinary team of former government officials, former in-house attorneys, class action litigators and engineers (in the US and internationally) are reviewing the releases and providing prompt insights for your review.  Below, please find a link to the reports, our most recent digests and our aptly timed teleseminar that occurred on the very day that the Department of Commerce released its privacy green paper.

On December 1, 2010, the Federal Trade Commission issued its long-awaited 123-page preliminary report on privacy, Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers. The report is the most important and comprehensive guidance the FTC has ever issued in the privacy arena, and it has the potential to dramatically overhaul the way businesses think about privacy. More importantly, the document sets the stage, potentially, for a very different regulatory framework in Washington. For more detailed information on the FTC Report click here.  Comments are due on this report by January 31, 2011.

On December 16, 2010, the U.S. Department of Commerce issued its initial policy recommendation in a green paper, Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework .  The Commerce green paper issued by the specially established Internet Task Force at the Department of Commerce lends another voice to the privacy debate and attempts to create a universal privacy baseline. While the report makes no recommendations to cover specific industry sectors that are addressed by existing privacy regulations, such as, healthcare, financial services and education, it is clear that the Department of Commerce would like to lead the regulatory agenda in the online privacy overhaul that is expected in 2011.  Check back here over the next few days for a more detailed look into the report.  Comments are due on this report by January 28, 2011. 

We addressed both reports in yesterday's teleseminar by privacy counsel Mark Melodia, Chris Cwalina, Paul Bond and Amy Mushahwar,  even though our team was still digesting the Commerce item that was released only hours before the teleseminar.  Our team described how the reports may apply to your business and provided a view from Washington regarding the complex regulatory and legislative road that may lie ahead for data privacy and cyber security issues. Feel free to listen to an audio recording of the event while watching the slide show.

Hamburg DPA Fines Bank €200,000 For Accessing Customer Data and Customer Profiling

This post was written by Thomas Fischl and Katharina A. Weimer.

On November 23, 2010, the data protection authority (the “DPA”) of the German federal state of Hamburg fined regional financial institution Hamburger Sparkasse AG (“Haspa”) €200,000 for illegally allowing its customer service representatives access to customers’ bank data, and for profiling its customers and also granting the representatives access to such profiles. The bank cooperated with the DPA and immediately discontinued the illegal practices.

From the end of 2005 until August 2010, Haspa allowed its self-employed, external customer service representatives access to customer bank data, often without having first obtained the customers’ consent. According to the DPA, the number of bank accounts accessed is not clear. The bank was aware of this practice through reviews of log files that detailed the representatives’ access.  

In addition, the bank created customer character profiles which were available for all external customer service representatives. The bank used tracked account balances and data on the use of financial products to create profiles of customers. The profiles were based on neurological research and customer data, including customers’ socio-demographic status and financial products, such as direct deposit accounts and the number of transactions. The creation and use of the profiles occurred without notice to the customer.

According to the head of the Hamburg DPA, Johannes Caspar, the fine was based on the following factors: (i) bank data is considered highly sensitive as it provides a great deal of information about the individual customer, (ii) the severity and degree of the violation, and (iii) the fact that the amount of the fine should exceed the economic benefit derived from the violation. Furthermore, the DPA sought to discourage future data protection law violations, while cautioning against the use of modern neuromarketing tactics to exploit customers.

In the bank’s defense, the DPA considered that the bank’s management responded quickly with a clarification of the issues and cooperated with the DPA’s investigation. Furthermore, on July 9 the bank withdrew access rights to customer data from external service representatives. The DPA also took into consideration that, in August, the bank implemented new technical procedures designed to comply with data protection requirements and deleted unlawful customer profiles.

The case highlights the willingness of the German Data Protection Authorities to impose significant fines on companies which fail to protect customer data. In a similar case, Postbank was fined EUR 120,000 in early 2010. For more information, view the Hamburg DPA’s press release here (in German).

FTC Releases Privacy Report

This post was written by Paul Bond, Christopher G. Cwalina, Amy S. Mushahwar, and Frederick Lah.

On December 1, 2010, the FTC released its long-awaited report, Protecting Consumer Privacy in an Era of Rapid Change. This 123-page preliminary staff report proposes a sea change in US privacy law. The FTC is accepting comments on this report until January 31, 2011.

In the report, the FTC proposes a major change in the framework of US privacy law, stating bluntly that, "Industry must do better."

  • Notice-and-consent does not work, the FTC says. People do not read or understand privacy notices as now written. The Commission's view is that privacy policies have become "long" and "incomprehensible".
  • The report says that waiting for harm to come to consumers is also not an effective way to enforce privacy norms. Harm has traditionally meant economic or physical harm. Per the report, privacy harms include reputational harms and even the emotional harm of having one's information "out there," and/or "fear of being monitored". The FTC says the new framework must address and allay these anxieties; however, there is some disagreement among the Commissioners. Commissioner J. Thomas Rosch expressed in his concurrence that "the Commission could overstep its bounds" if it were to begin analyzing these more intangible harms when assessing consumer injury.
  • Industry self-regulation, per the report, is too little, too late and has failed to provide adequate and meaningful protection.

The report also challenges a number of assumptions in how we view data privacy and security.

  • The FTC casts severe doubt on claims that de-identified information need not be protected, citing multiple instances and methods by which personally identifiable information (“PII”) can be culled from data that does not include names (e.g., IP addresses or other unique identifiers). The distinction between PII and non-PII, the FTC concludes, is "of decreasing relevance". Consequently, the scope of the report is very broad and applies to "all commercial entities that collect or use consumer data that can be reasonably linked to a specific consumer, computer, or other device."
  • The report purports to apply in the online and offline world and not just to companies that work directly with consumers.
  • The FTC suggests that consumers must be made aware of and consent to onward transfers of information to non-affiliates, regardless of the industry, universalizing consumer notice requirements that hitherto applied only to certain highly regulated industries (e.g., telecommunications, education, healthcare, financial services) or certain types of highly sensitive data (e.g., credit report information, bank account information).
  • The report distinguishes between "commonly accepted data practices" and all other data practices. Borrowing from GLBA and HIPAA, commonly accepted practices, like using data to aid law enforcement or in response to judicial process or to prevent fraud, would not require notice to or consent of consumers. All other data practices would require notice and consent, in a form easy to read and understand, ideally provided to the consumer at the point the consumer enters his or her personal data. Behavioral advertising and deep packet inspection are explicitly named as not "commonly accepted data practices". Also, the FTC suggests that opt-in consent be obtained prior to implementing any material changes to a company's privacy policy that would apply to data collected under a prior policy.
  • The report suggests that to promote a free and competitive market, the privacy practices of companies need to be more transparent to consumers and that companies provide consumers with "reasonable access" to their data.
  • Per the report, appropriate data retention periods should be a legal requirement. The report cites geolocation data as especially important to phase out.
  • The report also endorses a "Do Not Track" mechanism, understanding that such a mechanism would be far more complex than the National Do Not Call registry. The FTC supports either legislation or self-regulatory efforts to develop a system whereby a consumer could opt not to be "tracked." The FTC has expressed a distinction between "tracking" and "interest-based" advertising. And, in later discussions regarding the report, the FTC has stated that it will treat first-party advertising more favorably than third-party ad servers. The FTC has not decided on the technical mechanism for creating such a registry, but has proposed a browser-level solution that could be similar to the privacy plug-in on the Firefox browser or incognito mode in Google Chrome. The FTC has not expressed whether opt-in or opt-out would be the default browser setting for any browser privacy plug-ins/modes developed.
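As a rough illustration of what a browser-level "Do Not Track" signal might look like in practice, the sketch below shows a server-side check for an opt-out preference header. The "DNT" header name and the "1" opt-out value reflect one proposal under discussion at the time; the actual mechanism, and whether opt-in or opt-out would be the default, remained undecided, so this is purely a hypothetical sketch.

```python
def tracking_allowed(headers: dict) -> bool:
    """Return False when the request carries an opt-out signal.

    Assumes a hypothetical browser-sent "DNT: 1" header expressing the
    user's preference not to be tracked; header names in HTTP are
    case-insensitive, so keys are normalised before lookup.
    """
    normalised = {k.lower(): v.strip() for k, v in headers.items()}
    return normalised.get("dnt") != "1"


# A request from a browser with the privacy preference enabled is
# excluded from tracking; a request with no signal falls back to the
# site's default behaviour.
print(tracking_allowed({"DNT": "1"}))         # opted out
print(tracking_allowed({"User-Agent": "x"}))  # no preference expressed
```

A real deployment would of course also need to decide how the signal interacts with first-party versus third-party advertising, a distinction the FTC itself draws.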

So what should businesses do?

First, companies should carefully review the report and the 50+ questions open for public comment posed in Appendix A (there are also additional questions posed in the Commissioner dissent statements).

Second, companies should strongly consider commenting on the report. In our experience, the FTC will listen to and often address business concerns, but they must be heard. Trade associations may be a good place to start but also consider unique issues that your company may face that should be addressed.

Third, now is a good time for companies to pull back and consider their privacy programs and the extent to which they incorporate privacy into their everyday business practices. The report suggests that every company should adopt "privacy by design," "building privacy protections into everyday business practices," "assigning personnel to oversee privacy issues, training employees on privacy issues, and conducting privacy reviews when developing new products and services".

The FTC's full report is available here.

From World Cup Winners to Adequate Level of Data Protection - Uruguay Set to Join Another Exclusive Club!

This post was written by Cynthia O’Donoghue and Nick Tyler.

Having hosted and won the very first ‘soccer’ World Cup in 1930, and then having won it again twenty years later, Uruguay belongs to a very exclusive band of multiple-World Cup winning countries. Having reached the semi-finals of this year’s tournament (for the fifth time in total), this relatively small South American republic has a proud and enviable record as one of the most successful footballing nations.

This year is fast proving to be significant for Uruguay for more serious reasons than national sporting prowess (more serious that is if you do not subscribe to the philosophy of Liverpool FC’s legendary manager, Bill Shankly: “Some people think football is a matter of life and death. I assure you, it's much more serious than that.”)

On 12 October the Article 29 Working Party of European data protection regulators issued an opinion approving Uruguay’s admission into another exclusive club—the list of countries that provide an ‘adequate level of protection’ within the meaning of Article 25(6) of European Data Protection Directive 95/46/EC.  The Article 29 Working Party’s opinion was issued after a two-year review process and is a pre-requisite to approval by the European Commission. Barring any unforeseen political hitches, as befell Israel’s bid for ‘adequacy’ earlier this year, such approval should follow.

Some background points to note about Uruguay’s data protection regime:

  • The relevant legislation consists of:
    • Law No. 18.331 of 11 August 2008, on the Protection of Personal Data and “Habeas Data” Action (abbreviated as LPDP in Spanish); and
    • Regulating Decree of 31 August 2009, developing LPDP (DPDP).
  • LPDP is comprehensive in its scope and reach, covering all sectors of activity.
  • The independent supervisory authority is called the Unit for Regulation and Control of Personal Data (URCDP in Spanish).
  • Together with the Unit for Access to Public Information (UAIP) URCDP forms the Agency for the Development of Electronic Government and the Knowledge-Based Society (AGESIC in Spanish).
  • URCDP operates a permanent register of databases.
  • URCDP has power to impose sanctions ranging from a warning and a fine to suspension of any database.
  • Article 8 of DPDP contains data breach notification requirements.
  • Article 15 of LPDP provides a right of correction to “every natural or legal person”.
  • In the case of any denial of the rights of subject access and correction, Article 38 of LPDP provides for an action or writ of habeas data, also exercisable on behalf of deceased persons.

The concept of habeas data features in a number of other Latin American countries’ constitutions but, unlike Argentina and, imminently, Uruguay, those countries have not achieved the vaunted status of EU-approved ‘adequate’ data protection regimes.

The UK Regulator's 'Wish List' for a New EU Data Protection Directive Highlights the Challenges Ahead

This post was written by Cynthia O’Donoghue and Nick Tyler.

The Information Commissioner’s Office (ICO), the UK data protection regulator, has recently responded to the UK Government’s Call for Evidence on the current data protection legislative framework. The Ministry of Justice sought evidence about how the European Data Protection Directive 95/46/EC and the Data Protection Act 1998 are working, and their impact on individuals and organisations. The Call for Evidence, which closed on 6 October, sought to inform the UK negotiating position for a new EU data protection instrument, with negotiations expected to start in early 2011.

In its response, the ICO asserts that the data protection principles are “sound and should be maintained”, although it acknowledges that changes are needed. The ICO listed key ‘must-haves’ for “an effective new data protection framework”:

  • A “much clearer” definition of personal data “more relevant to modern technologies and…practical realities”, capable of recognising the many different levels of “identifiability”, and in turn protection, which technology can provide;
  • A “more flexible and contextual” concept of sensitive personal data, with financial and geo-location data being examples of non-sensitive data that warrant increased vigilance and protection;
  • A revisit of the definitions of processor and controller, and a more collective form of responsibility that deals “more realistically with the collaborative nature of modern business and service delivery”;
  • A consistent approach to transparency and consent in Europe, as the two concepts are not interchangeable in meaning or legal effect;
  • A new requirement of accountability (see also our recent Client Alert) to “reinforce the responsibility of data controllers”, which can be scaled to an organisation’s size and the risks of its processing of personal data;
  • Significant changes to international data transfers to “deal more realistically with current and future international data flows”, focusing on the exporting data controller’s risk assessment and responsibilities, regardless of location, and on assessing ‘adequacy’ based on the specific circumstances and method of transfer rather than on whether a country is designated as ‘adequate’; and
  • An explicit privacy by design requirement that ensures data protection compliance measures are built in at each stage of the information lifecycle, rather than bolted on as remedial measures.

The ICO’s response is typically pragmatic and builds on several earlier contributions made over the last 18 months. Read together they provide a consistent and compelling case for change (see also, for example, “Making European data protection law fit for the 21st century”).

Indian Government discussing BlackBerry ban: "security more important than privacy"

This post was written by Cynthia O'Donoghue, Katalina Chin and Katharina Weimer.

A few days after the concession made by BlackBerry manufacturer Research in Motion (RIM) to provide Indian security agencies with access to its encrypted data, India’s Home Minister P. Chidambaram held security “to be more important than privacy”.

Security concerns in India have certainly risen following the terror attack on Mumbai in November 2008, the worsening violence in the disputed region of Kashmir and a rising Maoist insurgency in a mineral-rich territory of the East. And certainly, such concerns are fuelled by the fact that attacks are often coordinated using mobile phones, satellite phones and voice over internet calls. These mounting fears over terrorism have led the Indian Government to demand full access to the encrypted data of BlackBerry users in India from its first target, RIM.

Canadian company RIM refused this request on technical grounds, arguing that it would be impossible to provide the information. However, knowing that RIM provides data to other countries, the Indian Government stood firm in its demand: why not India? While the consumer service offered by RIM, BlackBerry Internet Service (BIS), uses RIM’s own servers for communication, RIM maintained that it cannot access the business service, BlackBerry Enterprise Service (BES). Indeed, the level of privacy afforded to RIM’s corporate customers is a strong selling point, and providing governments with access to email communication for surveillance purposes has the potential to breach a fundamental principle of RIM’s business approach: customers’ trust in the confidentiality of their communications.

Following RIM’s refusal to grant access, the Indian Government issued an ultimatum: if RIM did not grant full access to all data (encrypted or not), India would block the smart phone manufacturer’s mail service entirely. Fearing this ban on its business in India, one of the fastest growing smart phone markets in the world, RIM conceded to the Indian Government’s requests and made several suggestions to resolve the issue of providing access to its data. The decision made by Nokia, RIM’s main competitor in the region, to set up servers in India to facilitate government monitoring may well have weakened any bargaining position RIM had hoped to exploit.

The measures to be adopted by RIM have yet to be made public, but the proposals are seemingly sufficient for the Indian Government to grant a two-month grace period to evaluate RIM’s suggestions. While the reprieve offers BlackBerry users in India some breathing space, it is unclear whether RIM will be in a position to satisfy both the Indian Government’s interest in security and surveillance and its customers’ interest in the privacy of their communications. India’s Home Secretary is due to meet officials from the Department of Telecommunications, the Intelligence Bureau and the National Technical Research Organisation on Monday the 6th of September to discuss BlackBerry security issues.

In light of this development and the Indian Government’s priority on national security over privacy, there is likely to be mounting fear amongst similar online communications companies that they may be the next target and have to provide access to encrypted data transmitted online. RIM has faced similar issues in other countries, including Saudi Arabia, the United Arab Emirates, Lebanon and Indonesia. 

What kind of animal is your PET? Report on Privacy Enhancing Technologies ("PETs") released by European Commission

This post was written by Cynthia O'Donoghue and Katalina Chin.

The European Commission DG Justice, Freedom and Security commissioned London Economics, one of Europe's leading specialist economics and policy consultancies, to undertake a study and report on the economic benefits of Privacy Enhancing Technologies ("PETs") for organisations and institutions using and holding personal data in selected European member states.

But what are PETs?  It is a term used for a set of computer tools, applications and mechanisms, including procedures and management systems, which aim to protect the privacy of personal data by eliminating, anonymising or minimising personal data in order to prevent unnecessary or unwanted processing.  Features can include, for example, allowing an individual to choose the degree of anonymity, to inspect, correct and delete any of their personal data, and to track the use of their personal data; they may also include a consent mechanism prior to personal data being provided to online service providers.  The report emphasises that, "data minimisation and consent mechanisms are an important part of PETs, and PETs often combine these elements with data protection tools into an integrated privacy system".
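Two of the PET building blocks the report mentions, data minimisation and anonymisation (here in its weaker pseudonymisation form), can be sketched in a few lines of code. This is an illustrative sketch only, not drawn from the report: the field names, salt and token length are all hypothetical choices.

```python
import hashlib

# Hypothetical per-deployment secret; in practice this would be
# managed outside the code (e.g. in a key store).
SALT = b"per-deployment-secret"


def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a salted one-way token."""
    digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
    return digest[:16]


def minimise(record: dict, needed_fields: set) -> dict:
    """Keep only the fields strictly needed for the stated purpose."""
    return {k: v for k, v in record.items() if k in needed_fields}


# Example: store only the postcode (needed for the service) plus a
# pseudonymous reference, discarding the name and email address.
customer = {"name": "Alice", "email": "alice@example.com", "postcode": "N1"}
stored = minimise(customer, {"postcode"})
stored["customer_ref"] = pseudonymise(customer["email"])
print(stored)
```

Note that salted hashing of this kind only pseudonymises the data: anyone holding the salt can re-derive the token from a known identifier, which is one reason the report stresses that the privacy benefit of any PET has to be assessed case by case.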

The report highlights that "the rights [set out in Article 8 of the Charter of Fundamental Rights of the European Union which deals with an individual’s rights to the protection of personal data] form the basis of the legal framework in which PETs are deployed" and should have at their core the objective of transparency, proportionality and data minimisation.

The report explains how it is difficult to quantify the wider economic benefits of a data controller using PETs to protect an individual’s personal data, and how the evidence has shown that the benefits can only be assessed on a case-by-case basis.  If anything, the study found little evidence to show that the demand by individuals for greater privacy is driving PETs deployment, and suggests that this is in part due to “the uncertainties surrounding the risk of disclosure of personal data, a lack of knowledge about PETs, and behavioural biases that prevent individuals from acting in accordance with their stated preference for greater privacy”.

The fact of the matter is, as the report makes very clear, that data controllers can derive a variety of benefits from holding and using personal data (including the personalisation of goods and services, data mining, etc.) and to the extent that PETs limit the ability of data controllers to use personal data, this will clearly act as a disincentive in the exploitation of PETs. The report highlights that, “data controllers often favour mere data protection to protect themselves against the adverse consequences of data loss over data minimisation or consent mechanisms which can impede the use of personal data”.  Evidence considered in the study suggests that there is a role for the public sector in helping data controllers realise the benefits of PETs, such as “official endorsements of PETs, including through pioneering deployment and official certification schemes, and direct support for the development of PETs, through subsidies to researchers (e.g. the European Framework Programmes)".

As the heat in data privacy issues continues to rise, with increased powers of regulatory authorities, tougher sanctions being imposed and a greater emphasis in Europe’s legislation on security management, it is clear that privacy by design will be the most effective method of compliance.

Mexico's Senate Passes Federal Law for Protection of Personal Data

This post was written by Mark Melodia, Cynthia O'Donoghue, and Anthony Traymore.

On April 27, 2010, the Mexican Senate passed Ley Federal de Protección de Datos Personales en Posesión de los Particulares (the Federal Law for Protection of Personal Data (FLPPA)). President Felipe Calderon is expected to sign the FLPPA into law soon, and thereafter, the FLPPA will be published and its regulatory provisions enacted. The objective of the FLPPA is to provide regulatory mechanisms for the newly established replacement agency, Instituto Federal de Acceso a la Información y Protección de Datos (the Federal Institute of Information Access and Data Protection (FIIADP)), to enforce the FLPPA in relation to any individual or entity engaging in the collection, storage and/or transfer of personal data.

To view the entire alert, please click here.

Toward Reinforcement of the Applicable Legislation on Data Protection in France: The New Bill On Privacy

This post was written by Cynthia O'Donoghue and Daniel Kadar.

A bill "intended to better guarantee the right to privacy in the digital age" was adopted by a large majority of the French Senate March 23, 2010, and immediately transmitted to the French National Assembly for review.

The first objective of the bill is to educate students about the use and exposure of personal information on the Internet, notably through social media. The bill is principally aimed at significantly reinforcing the obligations of data processors and at increasing the powers of the French data protection agency, the CNIL.

To view the entire alert, please click here.