Senators Trying to Hit the Brakes on Smart Cars, Citing Privacy and Security Concerns

This post was written by Mark Melodia and Frederick Lah.

On February 11, Sens. Ed Markey (D-Mass.) and Richard Blumenthal (D-Conn.) announced that they would introduce legislation intended to address the data privacy and security vulnerabilities with Internet-connected cars. The legislation, if passed, would require manufacturers to adhere to a number of security and privacy standards, including the following:

  • All wireless access points in the car must be protected against hacking attacks, evaluated using penetration testing
  • All collected information must be appropriately secured and encrypted to prevent unwanted access
  • The manufacturer or third-party feature provider must be able to detect, report, and respond to hacking events in real time
  • Drivers must be made explicitly aware of the collection, transmission, and use of their driving information
  • Consumers must be able to choose whether their data is collected, without having to disable navigation
  • Personal driving information may not be used for advertising or marketing purposes

The legislative proposal served as a follow-up to an earlier report by Sen. Markey, “Tracking & Hacking: Security & Privacy Gaps Put American Drivers at Risk.” That report was based on the responses from 16 major automobile manufacturers to questions posed by the senator about how driver information is collected and used, and about the potential security risks of wireless technologies in cars. The report found that large amounts of personal driver information – including geographic location, destinations entered into a navigation system, parking locations, and vehicle speed – are collected without drivers being clearly informed as to how that information will be used. In most cases, the information is shared with third-party data centers, the report said. Further, the report found that nearly 100 percent of cars on the market include wireless technologies that could be vulnerable to hacking, and that most manufacturers were unaware of, or unable to report on, past hacking incidents.

In addition to Sen. Markey’s report, the FTC highlighted the potential security and privacy risks with connected cars in its recent Internet of Things Staff Report, which we previously covered here. While acknowledging the many safety and convenience benefits of smart cars, the FTC also shared Sen. Markey’s concern about their potential vulnerabilities.

In response, the industry, led by two major automobile coalitions, has adopted self-regulatory privacy principles. In November 2014, 19 U.S. car companies made a commitment to incorporate a series of self-regulatory “Consumer Privacy Protection Principles for Vehicle Services” in their vehicles no later than model year 2017. In a letter sent to the FTC, the participating manufacturers said the “privacy principles” would be applied to their vehicles’ technologies and services, such as roadside assistance and navigation services, and would provide a baseline for privacy commitments. The principles include provisions for transparency, choice, respect for context, data minimization, de-identification, data security, integrity, access, and accountability. Sen. Markey said in a statement that these self-regulatory principles were a good first step, but that they did not go far enough in terms of choice and transparency.

As more and more cars join the Internet of Things wave, legislators and regulators will continue to scrutinize their potential privacy and security risks. Any road forward – whether legislative or self-regulatory – must carefully balance the many benefits offered by smart cars against their potential risks. In the meantime, car manufacturers (and their third-party service and technology providers) should continue to monitor this area for legislative developments and start taking steps to implement the self-regulatory principles.

In Nevada Court, Millions of Dollars Wasted in the Name of Macau Data Privacy Law

This post was written by J. Joan Hon.

Clark County, Nevada, District Judge Elizabeth Gonzalez is considering further sanctions against Sands China Ltd. for redacting “personal information” from about 2,600 documents the company produced in 2013 as part of an ongoing wrongful termination suit first filed in 2010 by Steven Jacobs, the former president of Sands Macau. Jacobs alleges that he was wrongfully fired for refusing to engage in unlawful acts, including promoting prostitution and spying on Chinese politicians in order to find potentially embarrassing information for use in obtaining favorable treatment for the casino.

Jacobs sought the production of about 100,000 emails and other documents from Las Vegas Sands Corp. and Sands China in order to show that Las Vegas Sands controlled Sands China, and that the Nevada court therefore has jurisdiction over Sands China. In 2012, Judge Gonzalez ruled that neither defendant could raise the Macau Personal Data Protection Act (the “Macau PDPA”), which tracks the European Union’s 1995 Data Protection Directive more closely than any other data protection law in Asia, as an excuse to refuse disclosure. The ruling was made after it was learned that “significant amounts of data from Macau related to Jacobs was transported to the United States” and reviewed by in-house counsel for Las Vegas Sands and outside counsel. The defendants had tried to conceal the existence of the transferred data, and were subsequently ordered to make a $25,000 contribution to the Legal Aid Center for Southern Nevada, and to pay Jacobs’ legal fees for nine “needless” hearings involving issues related to the Macau PDPA.

Unable to avoid disclosure of the documents, Sands China then spent US$2.4 million to redact them, insisting that it would otherwise face civil and criminal penalties, including possible imprisonment of the company’s officers and directors. David Fleming, general counsel for Sands China, testified that Macanese officials “were furious” about the prior release of data from the region.

In 2012, the Macau Office for Personal Data Protection (“OPDP”) had begun an investigation into potential violations related to the alleged transfer of “certain data” from Sands China to the United States without permission, but to date, the government office has made no statement on any outcomes of this probe. Typically, the maximum fine per violation would be 80,000 patacas (US$10,000) and the maximum jail sentence would be two years.

Judge Gonzalez rejected arguments made by Sands China and is currently considering further sanctions against the defendants.

According to the Macau PDPA, “[t]ransfers of personal data to any destination outside the Macau SAR is prohibited unless an adequate level of protection is guaranteed by the legal system of the country where the data is transferred, and such determination is left under the discretion of the OPDP.”

Wynn Macau Fined by Macau OPDP in Relation to FCPA Investigation

In 2013, Wynn Macau Ltd. was fined 20,000 patacas (US$2,500) by the Macau OPDP for unauthorized transfers of customer information to its parent. The information was used in an investigation into whether an executive at the parent company violated the Foreign Corrupt Practices Act. The data included customer relationships and entertainment expenses, and involved officials from another country, and thus its transfer violated Macau’s Personal Data Protection Act, according to the OPDP.


Russia sets a new deadline for data localisation, and removes Hong Kong and Switzerland from Adequate Privacy Protection List

This post was written by Cynthia O’Donoghue.

The Russian Duma recently set a new deadline for companies to localise the processing of Russian citizens’ data on Russian soil, while the data protection authority published an order removing Hong Kong and Switzerland from its ‘adequate privacy protection list’.

The Russian Duma has approved, on a first reading, an accelerated effective date for the data localisation law, moving the deadline forward by a year to 1 September 2015. Federal Law No. 242-FZ, which amends Russia’s 2006 data protection statute and primary data security law (Laws 152-FZ and 149-FZ), was originally due to come into force on 1 September 2016; an earlier proposal would have brought it into force as soon as 1 January 2015.

In addition, the Russian data protection authority (Roscomnadzor) issued a new order removing Hong Kong and Switzerland from a list of countries that meet privacy protection adequacy standards in Russia. Nothing in the order indicates a reason for the removal. The order becomes effective 25 December 2014. The list of adequate countries includes all members of the Council of Europe Convention 108 on Data Protection, as well as Australia, Argentina, Israel, Canada, Morocco, Malaysia, Mexico, Mongolia, New Zealand, Angola, Benin, Cape Verde, South Korea, Peru, Senegal, Tunisia and Chile.

Hong Kong Privacy Commissioner Ends 2014 with Special Interest in Mobile Apps

This post was written by Joan Hon.

The Hong Kong Privacy Commissioner of Personal Data (the “Commissioner”) ended 2014 with a special interest in mobile applications (“apps”).

In a media statement published 15 December 2014, the Commissioner reported that versions 4.3 and earlier of Google’s Android operating system contained a flaw that allowed others to read shared memory in mobile devices without the proper user permission. The Commissioner had contacted Google twice to formally request it to “take corrective action and/or warn the end-users concerned that they are subject to the risk of data access by malicious apps without their knowledge and permission.”

This is not the first time the Hong Kong privacy regulator has reproached Google for its data practices. In 2010, Google undertook to investigate its Street View and WiFi data collection, to ensure practices complied with Hong Kong law. Also, earlier in 2014, the Commissioner pressed Google to apply the EU “right to be forgotten” safeguard to Hong Kong.

On the same date, the Commissioner completed two separate investigative reports on mobile travel apps by travel services companies,* finding that these apps had either inappropriately collected excessive personal information without giving customers notice as to how their data was to be used, or otherwise failed to safeguard customer personal information.

Finally, the Commissioner also issued a statement on a survey done in conjunction with the 2nd annual Global Privacy Enforcement Network Mobile Sweep. The survey reviewed 60 popular mobile apps developed by Hong Kong entities and found that “their transparency in terms of privacy policy was clearly inadequate and there was no noticeable improvement compared with the results of a similar survey conducted in 2013.”


* The Commissioner conducts investigations of suspected breaches of the Personal Data (Privacy) Ordinance (Cap. 486) based on complaints received, and publishes an investigation report when he opines that it is in the public interest to do so.

Direct Marketing Association releases New Privacy Code of Practice

This post was written by Cynthia O’Donoghue and Kate Brimsted.

On 18 August, the Direct Marketing Association (‘DMA’) issued its new Privacy Code of Practice (‘Code’) to address customer concerns about data privacy. The Code is a result of an 18-month consultation with the Information Commissioner’s Office, the Department for Culture, Media & Sport and Ofcom.

The Code focuses on five key principles:

  • Put your customer first
  • Respect privacy
  • Be honest and fair
  • Be diligent with data
  • Take responsibility

The Code contains desirable outcomes for each principle. For example, a customer receiving a ‘positive and transparent experience throughout their association with the company’ is a specified outcome against the ‘put your customer first’ principle.

The principles form a useful tool that encourages self-regulation and seeks to cultivate a relationship of trust with customers. Rather than issue a rule-based system, the DMA’s new Code provides flexibility to members to determine the way they will comply with both the principles and the law.

The Code will be enforced by the DM Commission, the industry’s independent watchdog. Breaking the Code will result in DMA members being expelled from the association, a move which is likely to cause reputational damage.

European Commission releases technical standards on Radio Frequency Identification

This post was written by Cynthia O'Donoghue and Kate Brimsted.

In July, the EU introduced new technical standards (‘Standards’) to assist users of Radio Frequency Identification (‘RFID’) technology to comply with the EU Data Protection regime and the Commission’s 2009 recommendation on RFID. The Standards are the result of a long-term EU project which began with a public consultation in 2006.

When RFID technology is used to gather and distribute personal data, it falls within the EU Data Protection regime. The Standards are being introduced at a critical time, as the use of RFID becomes more widespread, particularly in the health care and banking industries.

The key features of the Standards include:

  • The introduction of a new, EU-wide RFID sign which will allow people to identify products that use smart chips
  • New Standards for Privacy Impact Assessments (‘PIA’) to help ensure data protection by design
  • Guidance on the structure and content of RFID privacy policies

The Standards will be a useful tool for organisations that already use RFID technology, or are looking to do so. In particular, the Standards on PIAs will assist organisations to plan how they will comply with the forthcoming Data Protection Regulation, which requires PIAs to be carried out in various circumstances.

Article 29 Working Party supports recognition of Processor BCRs in the Data Protection Regulation

This post was written by Cynthia O'Donoghue and Kate Brimsted.

In June, the Article 29 Working Party (‘Working Party’) wrote to the President of the European Commission, setting out its case for including a reference to Binding Corporate Rules for data processors (‘BCR-P’) in the forthcoming Data Protection Regulation.

Binding Corporate Rules are one way in which data controllers or data processors in Europe can lawfully undertake international transfers of data. They are an alternative to using EU Model Clauses, or gaining Safe Harbor certification. To date, however, BCRs have been used far less than either of these methods, since they are costly and time-consuming to implement.

In the proposal for a Regulation published in January 2012, the European Commission had introduced an express reference to BCR-Ps. This reference was dropped, however, in the draft version of the Regulation that was voted on by the European Parliament in March 2014.

In its letter, the Working Party notes that it has officially allowed data processors to apply for BCRs since January 2013. In this connection, “denying the possibility for BCR-P will limit the choice of organisations to use model clauses or apply the Safe Harbor if possible, which do not contain such accountability mechanisms to ensure compliance as it is provided for in BCR-P.”

The letter makes clear that the Working Party is strongly in favour of BCR-Ps, which “offer a high level of protection for the international transfer of personal data to processors” and are “an optimal solution to promote the European principles of personal data abroad.” It notes that three multi-nationals have already had BCR-Ps approved, and that approximately 10 applications are currently pending. If the Regulation does not provide for BCR-Ps, these companies will be put at a disadvantage.

The Regulation, which is currently being negotiated between the European Parliament and the European Council, is widely expected to come into force in 2017. It will implement substantial changes to the current regime, including the introduction of significant new duties for data processors.

Ireland and the UK ban forced subject access requests

This post was written by Cynthia O'Donoghue and Kate Brimsted.

The practice of employers forcing employees or applicants to exercise subject access rights has been described by the UK’s Information Commissioner’s Office (‘ICO’) as a “clear perversion of an individual’s own rights”. It is now set to become a thing of the past in the UK and Ireland, as both jurisdictions bring laws into effect to make the practice a criminal offence.

In Ireland, provisions of the Data Protection Act 1988 and Data Protection (Amendment) Act 2003 that outlaw the practice were triggered in July 2014. In addition to the employer-employee relationship, these provisions apply to any person who engages another person to provide a service. The provisions have always been included in the Acts, but had not been brought into force until now.

In June 2014, the UK’s Ministry of Justice released guidance stating that similar provisions will come into force on 1 December 2014. Employers that attempt to force people to use their subject access rights will be committing a criminal offence, punishable by a potentially unlimited fine. The ICO has indicated in one of its blogs that it clearly intends to enforce the provision, stating that “the war is not yet won but a significant new weapon is entering the battlefield. We intend to use it to stamp out this practice once and for all.”

These developments come against a backdrop of similar regulatory changes in the United States, where the long-standing “Ban the Box” movement continues to challenge the use of criminal conviction questions on job application forms. In addition, Maryland became the first state in 2012 to ban employers from asking employees and job applicants to disclose their social media passwords. Similar legislation has now been introduced or is pending in 28 states nationwide.

In light of these developments, employers should review their procedures to ensure that they do not fall foul of these changes.

New Russian legislation requires local storage of citizens' personal data

This post was written by Cynthia O’Donoghue and Kate Brimsted.

President Putin recently signed Federal Law No. 242-FZ (the “Law”), which amends Russia’s 2006 data protection statute and primary data security law (Laws 152-FZ and 149-FZ) to require domestic storage of Russian citizens’ personal data. The Law will allow websites that do not comply to be blocked from operating in Russia and recorded on a register of organisations in breach.

The requirement to relocate database operations could place a significant burden on both international and domestic online businesses. All retail, tourism, and social networking sites, along with those that rely on foreign cloud service providers, could have their access to the Russian market heavily restricted by the Law. The Law takes effect 1 September 2016, which may not provide some organisations with enough of a transition period to make the necessary changes.

Earlier this year, the Brazilian government decided not to include a similar provision in its Internet bill, in recognition of the provision’s draconian nature, its potential economic impact, and the practical difficulties it would create. Russia has not taken this more pragmatic approach.

U.S. extraterritorial data warrants: yet another reason for swift Data Protection reform, says EU Commission

This post was written by Kate Brimsted.

In May, we reported that a U.S. magistrate judge had upheld a warrant requiring Microsoft to disclose emails held on servers in Ireland to the U.S. authorities. The ruling has now attracted the attention of Brussels, with the Vice-President of the European Commission, Viviane Reding, voicing her concern.

Microsoft had argued before the court that the warrant, which was issued under the Stored Communications Act, should be quashed. This was because it amounted to an extraterritorial warrant, which U.S. courts were not authorised to issue under the Act. In summary, the court ruled that the warrant should be upheld, noting that otherwise the U.S. government would have to rely on the “slow and laborious” procedure under the Mutual Legal Assistance Treaty, which would place a “substantial” burden on the government.

In a letter to Sophie in’t Veld, a Dutch MEP, Ms Reding noted that the U.S. decision “bypasses existing formal procedures”, and that the Commission is concerned that the extraterritorial application of foreign laws may “be in breach of international law”. In light of this, Ms Reding states that requests should not be directly addressed to companies, and that existing formal channels such as the Mutual Legal Assistance Treaty should be used in order to avoid companies being “caught in the middle” of a conflict of laws. She also advocates that the EU institutions should work towards the swift adoption of the EU data protection reform. Ms Reding further reported that the Council of Ministers has agreed with the principle reflected by the proposed Regulation – and consistent with the recent Google Spain decision – that “EU rules should apply to all companies, even those not established in the EU (territorial scope), whenever they handle personal data of individuals in the EU”.

FTC Settlement with Snapchat - What Happens on Snapchat Stays on Snapchat?

Last Thursday, the Federal Trade Commission (FTC) announced that messaging app Snapchat agreed to settle charges that it deceived consumers with promises about the disappearing nature of messages sent through the app. The FTC case also alleged that the company deceived consumers over the amount of personal data the app collected, and the security measures taken to protect that data from misuse and unauthorized disclosure. The case alleged that Snapchat’s failure to secure its Find Friends feature resulted in a security breach that enabled attackers to compile a database of 4.6 million Snapchat usernames and phone numbers.

Click here to read the full post on our sister blog AdLaw By Request.

Spain's AEPD Publishes Draft Privacy Impact Assessment Guide

This post was written by Katalina Chin.

On 17 March, the Spanish data protection agency (la Agencia Española de Protección de Datos - AEPD) published a draft privacy impact assessment guide (Evaluación del Impacto en materia de Protección de Datos Personales). At the same time, the AEPD has initiated a public consultation, open until 25 April, to garner opinion and comments on the guide, after which they will issue a final version.

The guide sets out a framework to improve privacy and data protection in relation to an organisation’s technological developments, with the aim of helping them identify, address and minimise data protection risks prior to the implementation of a new product or service.

In this draft guide, the AEPD comments on the increasing importance for organisations of demonstrating their commitment to the rights of the individuals whose personal data they process, and of meeting their legal obligations (essentially advocating the principle of accountability). In this regard, the AEPD advises that a well-developed privacy impact assessment will go a long way in evidencing an organisation’s due diligence, as well as assisting it to develop appropriate methods and procedures for addressing privacy risks.

It is not suggested, however, that the guide will provide the only methodology for carrying out a privacy impact assessment. Indeed, the AEPD says that it would be receptive to organisations that wish to develop an assessment specifically adapted to their business or sector, and would be open to providing such organisations with guidance to ensure that they meet the minimum regulatory requirements.

As well as providing general guidance on privacy impact assessments, the guide sets out a set of basic questions, together with an ‘evaluation’ tool developed by the AEPD, whereby organisations can ‘check off’ and determine the legal obligations that must be met in order to implement their intended product or service in compliance with data protection legislation.

While this privacy impact assessment is not obligatory in Spain, this type of compliance review could become a legal requirement across the EU if the European Regulation on Data Protection remains as currently drafted (Article 33).

Edward Snowden submits written testimony to the EU Civil Liberties Commission

This post was written by Cynthia O'Donoghue.

When Edward Snowden alerted the media to the extent of global intelligence surveillance programmes in 2013, he sparked investigations and debate into the gathering of data by intelligence agencies worldwide. He is now contributing to the debate again, submitting written testimony (the “Statement”) to the investigation of the EU Committee on Civil Liberties (the “Committee”).

The Committee’s investigation has involved a broad examination of the ways in which data on EU citizens is collected by both American agencies and agencies in its own “back yard”. In January, the Committee released a draft report on the investigation, with MEPs condemning the “vast, systematic, blanket collection of personal data of innocent people”.

In the Statement, Snowden explains the extent of the data gathered by agencies, stating that while working for the NSA, he could “read the private communications of any member of this committee, as well as any ordinary citizen”. Snowden criticises the use of resources to fund mass, suspicionless surveillance at the cost of “traditional, proven methods”, citing a number of examples of incidents that have not been prevented despite the use of mass surveillance.

The Statement also contains details of cooperation between EU Member States and the NSA’s Foreign Affairs Directorate (FAD), stating that FAD systematically attempts to influence legal reform across the EU. Snowden claims that, where such efforts succeed, FAD encourages states to perform “access operations”, which allow it to gain access to the bulk communications of telecoms providers within the jurisdiction.

In relation to whistleblowing within intelligence agencies, Snowden points out that the current legal protections in the United States do not apply to the employees of private companies and therefore do not provide a satisfactory level of protection to concerned individuals employed by such organisations. In addition, the Statement indicates that raising concerns internally is ineffective as other employees are fearful of the consequences that may follow.

For businesses, Snowden’s remarks when questioned about industrial espionage are likely to be the most interesting. Snowden states that the fact that “a major goal of the US Intelligence Community is to produce economic intelligence is the worst kept secret in Washington”. In addition, the Statement points out that evidence of industrial espionage can be seen in the press, with an example being recent reports that GCHQ successfully targeted a Yahoo service to gain access to the webcams of devices within citizens’ homes.

The Statement paints a concerning picture of the way in which politics influence the level of protection given to citizens. As Snowden points out, the Statement is limited to information that has already entered the public domain, and so it is unlikely to impact the Committee’s findings. However, with the European Parliament scheduled to vote on the draft data protection regulation and Safe Harbor Program, it will intensify analysis of the legal reforms being implemented in Brussels.

Court Rules That Technical Violations of Michigan Video Rental Privacy Act Give Rise to $5,000 Per Person in Statutory Damages, Alleged Violation Enough to Stay in Federal Court

This post was written by Paul Bond and Lisa B. Kim.

A Michigan federal judge has held that plaintiffs could proceed in federal court on their claims under the Video Rental Privacy Act (VRPA), a state law akin to the federal Video Privacy Protection Act (VPPA). The ruling came in three similar putative class actions alleging that Bauer Publishing Co., Hearst Communications, Inc., and Time, Inc., respectively, sold their customers’ personal information without permission. (The three cases were assigned to the same judge because of their similar allegations.)

To have jurisdiction over a case, a federal court must find that the plaintiffs satisfy Article III of the United States Constitution, including by alleging that they have suffered an injury-in-fact. Many privacy class actions falter because plaintiffs allege only a technical violation, but cannot point to any actual or imminent impact that this supposed violation had, or will have, on their lives. The plaintiffs’ bar has therefore tried to find federal and state privacy laws, with associated statutory or liquidated damage hooks, in an attempt to avoid dismissal for lack of harm.

In these cases, plaintiffs brought suit under the VRPA. Michigan’s VRPA provides that a person “engaged in the business of selling at retail, renting, or lending books or other written materials, sound recordings, or video recordings shall not disclose to any person, other than the customer, a record or information concerning the purchase, lease, rental, or borrowing of those materials by a customer that indicates the identity of the customer,” with certain exceptions. The statute also provides that “a person who violates this act shall be liable in a civil action for damages to the customer identified” for “[a]ctual damages, including damages for emotional distress, or $5,000.00, whichever is greater” (emphasis added) and costs and reasonable attorneys’ fees. Because of the provision for $5,000 per person in statutory damages, the VRPA threatens businesses with the prospect of catastrophic class damages.

Defendants moved to dismiss these cases. In part, the defendants argued that the plaintiffs had alleged no injury-in-fact, and thus, the court had no jurisdiction. The court rejected that argument. Analyzing the language of the VRPA, Judge George Steeh reasoned that the VRPA did not contain any language that would require the claimant to have suffered any actual injury apart from the violation of the statute. To the contrary, the statute expressly provided for statutory damages and actual damages as alternative remedies. Contrasting Michigan’s law with its federal counterpart, the court noted that “Unlike the VPPA, a close reading of the VRPA reveals that it contains absolutely no language to require that a claimant suffer any actual injury apart from a violation of the statute[.]” In doing so, Judge Steeh followed the reasoning of the Northern District of California, which addressed this same issue of Article III standing in connection with Michigan’s VRPA in Deacon v. Pandora Media, Inc., 901 F. Supp. 2d 1166 (N.D. Cal. 2012).

While the VRPA is similar to the VPPA, and was in fact enacted right after the VPPA, this holding is not likely to extend to the VPPA. In his ruling, Judge Steeh specifically noted differences between the language of the VRPA and the VPPA on this issue of standing, and also referenced how the Northern District of Illinois found that the VPPA requires that a plaintiff actually be “aggrieved,” i.e., have suffered an Article III injury-in-fact (see Sterk v. Best Buy Stores, L.P., 2012 WL 5197901 (N.D. Ill.)). That being said, holdings like this are surely being watched by legislatures introducing new privacy bills that explicitly include language providing that violations of the law constitute an injury.

You can read Judge Steeh’s 17-page ruling here.

California Legislature Pushing Forward Multiple Data Privacy Bills

This post was written by Sarah Woo, Lisa B. Kim and Joshua B. Marker.

The California legislature is determined to be at the forefront in the development of data privacy law by drafting a number of data privacy protection bills that will impact companies’ obligations with respect to the disclosure, compilation, removal, or sharing of consumers’ personal information.

Click here to read the Client Alert.


President Signs Amendment to Video Privacy Protection Act, Ushering in a New Era for Widespread Sharing of Viewing Histories

This post was written by Lisa B. Kim, Paul Bond, John P. Feldman, Christine E. Nielsen and Frederick Lah.

On January 10, 2013, President Obama signed the Video Privacy Protection Act Amendments Act of 2012 (“VPPAA”), which makes it easier for companies to obtain consumer consent to share video viewing information. At the same time, the amendment left in place many of the pitfalls traditionally associated with the VPPA, and added some new ones.

More specifically, the VPPAA makes the following changes:

  1. Consent Via Internet: It clarifies that a company can obtain informed, written consent to disclose the consumer’s video viewing information through electronic means, such as by the consumer signing an agreement over the Internet
  2. Consent Must Be Separate And Distinct: It requires that such consent be in a form “distinct and separate” from any form setting forth other legal or financial obligations of the consumer
  3. Consent Can Be Given In Advance: It allows consent to be given in advance for a set period of time, not to exceed two years, or until consent is withdrawn by the consumer, whichever is sooner
  4. Consent Can Be Withdrawn: It requires the company to provide an opportunity for the consumer to withdraw consent on a case-by-case basis, or to withdraw from ongoing disclosures, at the consumer's election

Previously, the VPPA had not allowed for advance consent to sharing. Because all disclosures had to be consented to “at the time” of disclosure, companies could not even obtain consent to share video history by way of their account formation documents or terms of use. Because of this Amendment, it is now possible for video content platforms to obtain valid, upfront consent from all of their customers to share individual video viewing histories with third parties. Properly done, this could result in a substantial benefit both to the sharing company and to the consumer, who will receive more highly personalized marketing offers. And consumers will retain the right to opt out.
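For illustration only, the advance-consent window described in item 3 above (consent lasts no more than two years, or until withdrawn, whichever is sooner) can be sketched in code. The helper below is hypothetical, approximates the two-year cap as 730 days, and is of course not legal advice:

```python
from datetime import date, timedelta
from typing import Optional

def consent_covers(disclosure: date, consented: date,
                   withdrawn: Optional[date] = None) -> bool:
    """Illustrative sketch of the VPPAA advance-consent window.

    Consent runs from the date it was given until the earlier of
    (a) roughly two years later and (b) the date it was withdrawn.
    """
    expiry = consented + timedelta(days=730)  # approximate two-year cap
    if withdrawn is not None:
        expiry = min(expiry, withdrawn)
    return consented <= disclosure < expiry
```

A disclosure made within the window is covered; one made after two years, or after the consumer withdraws, is not.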

But this benefit is not without its dangers. The Amendment imposes many requirements both for obtaining advance consent and for allowing revocation of consent. Many nuts-and-bolts questions concerning how to meet these requirements are left unanswered by the plain text of the amendment. When is a consent form sufficiently “separate and distinct”? Can a company require consent for video history sharing as a take-it-or-leave-it term of using the service at all? What does it mean that the customer must be allowed to withdraw consent “on a case-by-case basis”? Given that the VPPA provides for a private cause of action, statutory penalties of $2,500 per person, attorneys’ fees, and punitive damages, any failure to comply with these new consent provisions (which are surely not defined by case law yet), could expose a company to significant liability.

All companies looking to take advantage of this Amendment should carefully consider the design and operation of their consent mechanisms, drawing not only from the Act, but from all relevant sources of guidance as well.

The Article 29 Working Party issues Opinion on cookies

This post was written by Cynthia O'Donoghue.

During its meeting in early June, the Article 29 Working Party (the “Working Party”) issued an Opinion on cookies that analyses the exemptions to the requirement for informed consent, and sets out how the revised e-Privacy Directive impacts cookie usage.

Article 5.3 of the ePrivacy Directive (2002/58/EC, as amended by Directive 2009/136/EC) provides that cookies are exempt from the need to obtain informed consent when a cookie is:

A. used “for the sole purpose of carrying out the transmission of a communication over an electronic communications network” or

B. “strictly necessary in order for the provider of an information society service explicitly requested by the subscriber or user to provide the service”.

The Working Party opined that the restrictive nature of “sole purpose” in A specifically limits this exemption such that the use of cookies that assist, speed up or regulate transmission of a communication fall outside the informed consent exemption.

To satisfy B above, a cookie has to pass two tests to be exempt:

  • The user must take a positive action to request a service with a clearly defined perimeter
  • The cookie must be required such that, if it were disabled, the requested service would not work

The Working Party pointed out that cookies exempt from consent should have a lifespan related to their purpose and must expire once no longer needed, taking into account the reasonable expectations of the average user.

“Third party” cookies are more likely to require consent where they are not strictly necessary, as the data protection risk comes from the purpose(s) for the processing, rather than from the information contained within the cookie.

  • Where cookies perform multiple functions, they will only be exempt from the consent requirement if all of the distinct purposes individually satisfy the exemption criteria. The Opinion sets out several helpful examples of situations where cookies will or will not be exempt from the consent requirement by specifically discussing “user-input” cookies, authentication cookies, user-centric security cookies, multimedia player cookies and load balancing cookies, user interface customisation cookies, and social plug-in content sharing cookies.
  • Cookies which the Working Party considered to be outside the exemption from consent included social plug-in tracking cookies, third-party cookies used for behavioural advertising, and first-party analytic cookies, even though the Working Party recognized that such cookies represent a low privacy risk where they are limited to aggregated statistical data, and where the website operator provides clear information about cookies and adequate privacy safeguards, such as an opt-out from data collection and anonymisation.
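The multi-purpose rule in the first bullet above lends itself to a simple illustration. The sketch below (with hypothetical purpose labels, not any official taxonomy) treats a cookie as exempt from the consent requirement only if every one of its distinct purposes individually satisfies criterion A or B:

```python
# Purposes treated as individually exempt under Article 5.3
# (hypothetical labels for illustration only).
EXEMPT_PURPOSES = {
    "transmission",        # sole purpose of carrying out a transmission (criterion A)
    "strictly_necessary",  # strictly necessary for a service the user requested (criterion B)
}

def cookie_is_exempt(purposes: set) -> bool:
    """A multi-purpose cookie is exempt only if ALL of its distinct
    purposes individually satisfy the exemption criteria."""
    return bool(purposes) and purposes <= EXEMPT_PURPOSES
```

So a load-balancing cookie used only for transmission would be exempt, while the same cookie reused for behavioural advertising would not be, because one non-exempt purpose taints the whole cookie.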

Judge Narrows App Litigation, But Lets Plaintiffs Press On

This post was written by Christopher G. Cwalina, Paul Bond, and Christine E. Nielsen.

A recent decision in ongoing litigation over mobile application practices shows how difficult the defense of privacy class actions can be. Even if the defense wins dismissal of some causes of action, the survival of any cause of action may force the defendant into costly discovery.

On June 12, U.S. District Judge Lucy Koh granted in part and dismissed in part Motions to Dismiss filed in the iPhone Application Litigation MDL in the Northern District of California, case no. 5:11-md-02250. In this case, plaintiffs claimed defendants violated plaintiffs’ privacy rights by unlawfully allowing third-party applications to collect and use personal information, including location information, from users’ mobile devices without consent. Plaintiffs brought 13 causes of action against Apple and the Mobile Industry defendants, including claims based on federal statutes, state statutes, contract law, tort, and equity.

Defendants contended that plaintiffs lacked Article III standing and the case should be dismissed for lack of subject matter jurisdiction. They argued that plaintiffs failed to allege actual injury-in-fact. Judge Koh disagreed, noting that “Plaintiffs have alleged actual injury, including: diminished and consumed iDevice [iPhone, iPad, and iPod Touch] resources, such as storage, battery life and bandwidth; increased, unexpected, and unreasonable risk to the security of sensitive personal information; and detrimental reliance on Apple’s representations regarding the privacy protection afforded to users of iDevice apps.” The court found that plaintiffs’ alleged overpayment for those devices was enough to establish standing under California’s Unfair Competition Law (UCL). The court then found that the alleged business practices may be unlawful under California’s Consumer Legal Remedies Act (CLRA), unfair in that they are injurious to consumers and may not be outweighed by benefits to consumers, and fraudulent in that Apple made misrepresentations and material omissions to induce the purchase of mobile devices.

In addition, the court declined to dismiss the claims on the grounds that Apple’s Privacy Policy expressly permitted the collection and transfer of user data at issue, in part because the policy’s language was ambiguous as to the exact definition of “personal information.” Although many of the counts against Apple, and all of the counts against the other Mobile Industry defendants – Admob, Inc., Flurry, Inc., AdMarval, Inc., Google, Inc., and Medialets, Inc. – were dismissed, counts against Apple under the CLRA and UCL will proceed.

Notably, the court rejected Apple’s argument that all of the claims should be dismissed on the grounds that Apple has permission to collect and transfer user data pursuant to the Privacy Policy. On this point, the court said that “Plaintiffs have a colorable argument that the terms of the privacy agreement were ambiguous and do not necessarily foreclose the remaining claims against Apple.” The court stated that there was ambiguity as to whether something like a user’s unique device identifier is “personal information” under the terms of the privacy policy, and thus whether its collection and use was consistent with that policy. While this is one trial court decision on a preliminary motion, the decision reinforces the need for companies to closely examine disclosures to see how well they would hold up in any subsequent litigation.

The UK Information Commissioner's Office Has Received Numerous Complaints About Websites Not Adhering to the 'Cookie' Law

This post was written by Cynthia O'Donoghue.

The UK Information Commissioner's Office (ICO) has received 169 complaints thus far about websites whose policies appear not to comply with the cookie law that came into force May 26. UK Information Commissioner Christopher Graham is reported to have said that the complaints indicate what individuals are interested in, and should serve as a warning to organizations that are not yet compliant. "…There are many [complaints] where customers are pointing out that well-respected brands are not doing anything about the cookie law and [these customers] can't understand why not," Graham said. The CIO Journal is reporting that the ICO has sent out 70 letters to companies that have yet to comply, including to Tesco, Facebook and HSBC.

Despite the alleged non-compliance, the ICO was issuing new guidance right up until the eve of the end of the enforcement grace period. On May 25, the ICO published revised guidance to clarify points around implied consent, and the ICO’s Strategic Liaison Group Manager for Business and Industry posted a blog with a video containing answers to FAQs.

The new guidance confirms that implied consent is a valid form of user consent and complies with the Privacy and Electronic Communications (EC Directive) (Amendment) Regulations 2011, and can be used instead of an explicit opt-in measure. This issue had troubled organisations as the previous guidance seemed to suggest implied consent would not be valid, although website operators are not meant to rely on an assumption that users have read a privacy policy that may be hard to find or difficult to understand.

The latest ICO guidance confirms that user consent can be inferred from users navigating among website pages, provided users have a reasonable understanding that by doing so they have agreed to cookies being set.

The latest guidance also addresses the issue of “prior” consent. While the ICO’s position is that, wherever possible, cookies should only be set once users have had an opportunity to understand what cookies are being used and to indicate their consent, website operators should be able to demonstrate that, where it is not possible to obtain this prior consent, they are doing as much as possible to provide timely information about what cookies will be placed on the user’s device.

In addition, the guidance clarifies that the mere placement of a statement about cookies in a privacy policy is not sufficiently prominent, and website operators are expected to give the ICO a clear and specific explanation of why their website is not fully compliant. The ICO further explained that there will be a ‘sliding scale’ of enforcement, with the most intrusive cookies that pose a risk of harm to individuals being the focus of the ICO’s enquiries. Since the blog contains a link where users are invited to report their cookie concerns, expect to see the ICO making further statements about the number of complaints and its investigations.

The UK Information Commissioner's Office issues the largest monetary penalty in its history to NHS hospital trust

This post was written by Cynthia O'Donoghue.

The UK Information Commissioner’s Office (“ICO”) has issued its largest-ever fine of £325,000 (approximately US$503,705) to Brighton and Sussex University Hospitals NHS Trust following the discovery of highly sensitive personal data belonging to tens of thousands of patients and staff, including information relating to sexual health and HIV, on hard drives sold on an Internet auction site in October and November 2010. This marks the highest fine for a “serious breach” of the UK Data Protection Act issued to date by the ICO. In April 2010, the ICO was granted additional powers to issue monetary penalties of up to £500,000.

The ICO's Deputy Commissioner and Director of Data Protection David Smith said in a statement that the high amount of the penalty “reflects the gravity and scale of the data breach.” The fine is also meant to deter lax compliance by warning organisations that they remain liable for the information management activities they outsource.

The Brighton and Sussex University Hospitals NHS Trust had outsourced the destruction of 1,000 hard drives which contained the sensitive data to a third party. However, rather than being destroyed, some of the hard drives were sold in an auction.

Since January 2012, the ICO has issued at least eight fines ranging from £70,000 to £140,000 for various serious data breaches. Some of the highest penalties issued to date have included:

  • £140,000, issued in January of this year against Midlothian Council for disclosing sensitive personal data relating to children and their carers to the wrong recipients on five occasions
  • £130,000, issued in December 2011 to Powys County Council after the details of a child protection case were sent to the wrong recipient
  • £120,000 issued in June 2011 against Surrey county council after sensitive personal information was emailed to the wrong recipients on three occasions

The ICO is increasingly using its powers to issue fines and, by doing so, sending a strong message that serious breaches of the Data Protection Act will not be tolerated.



The French Data Protection Authority unveils its agenda and targets for inspections in 2012

This post was written by Cynthia O'Donoghue.

The French Data Protection Authority (the “CNIL”) issued a press release on 19 April 2012 detailing its planned enforcement agenda for the coming year. The CNIL announced that it intends to conduct around 450 on-site inspections during 2012, with particular focus on six specific themes. The CNIL will also continue work started in 2011, including at least 150 inspections related to video surveillance.

The focus will be on the following areas:

Smartphones: The CNIL will investigate both the purchasing and use of smartphones, in particular data collection by mobile operators and app providers. In relation to mobile operators, the CNIL will focus on their databases of mobile customers and the extent to which they monitor their customers’ usage.

Health data security: The CNIL intends to continue its work from 2011 in this area and to focus on the development and use of health-related data, in particular by carrying out inspections on medical research facilities, online health-related applications, health care providers, and companies that host health-related data, especially the use of cloud computing.

Data breaches: Given the August 2011 regulations on data breaches, the CNIL will focus on compliance by ISPs to notify the CNIL of data breaches, as well as to notify individuals when the data breach “affects their personal data or private life.”

Sports and hobbies: Although the CNIL has already conducted checks in this sector, it intends to examine further anti-doping controls, the hosting of sports competitions, and the processing of member and spectator personal data by the main French sports federations, including disclosures to third parties and blacklisting.

Police records: Following a parliamentary report on police records, the CNIL will organize a series of inspections and implement controls on data processing at the national and local levels, relating to the use of personal data and the internal operating services of the police.

Utility and motorway companies: The CNIL intends to focus on transparency of data processing by conducting a broad survey of major companies that provide services to millions of French citizens through the supply of water, gas and electricity, and the collection of road tolls.

Vermont Strengthens Data Breach Notification Law

This post was written by Paul Bond and Frederick Lah.

Vermont has recently updated its data breach notification law, Vt. Stat. Tit. 9, Ch. 62, sections 2430 and 2435, to make it one of the stronger data breach notification laws in the country. The new law became effective May 8, 2012. There are three main changes in the law:

First, the definition of security breach has been amended. Previously, "security breach" meant the unauthorized acquisition or access of data. The new definition no longer covers unauthorized access and only defines the term as "unauthorized acquisition … or a reasonable belief of unauthorized acquisition." To help clarify this new standard, the law lists the following factors that companies should consider when determining whether data has been acquired or reasonably believed to have been acquired:

  • Indications that the information is in the physical possession and control of a person without valid authorization, such as a lost or stolen computer or other device containing information
  • Indications that the information has been downloaded or copied
  • Indications that the information was used by an unauthorized person, such as fraudulent accounts opened or instances of identity theft reported, or
  • Indications that the information has been made public

The second major change to Vermont's law is the addition of a firm 45-day deadline, running from discovery of the breach, for sending consumer notifications. The vast majority of states speak in general terms and only require that notification be made to consumers "without unreasonable delay" or "in the most expedient time possible." Vermont now joins a handful of other states (Florida, Ohio, and Wisconsin) with a specific firm deadline. All of these states have the same 45-day deadline.

Lastly, the amended law adds a requirement that the state attorney general must be notified of a data breach. The company must notify the attorney general of the date of the breach, date of the discovery of the breach, and a preliminary description of the breach, which shall include the number of Vermont consumers affected, if known. By default, this notification must be done within 14 business days upon discovery of the breach. Puerto Rico is the only other jurisdiction with a firm deadline (10 days) for when government notification must be sent. Interestingly, though, the new law provides companies with an alternative to this 14-business-day requirement. If, prior to the breach, the company has sworn in writing to the attorney general that it maintains written policies and procedures to maintain the security of the consumer information and respond to a breach in a manner consistent with Vermont law, then the 14-business-day requirement would not apply. Instead, the company would just need to notify the attorney general prior to sending the consumer notifications (which have a firm deadline of 45 days). The law provides that the company must make this sworn statement "on a form and in a manner prescribed by the office of the attorney general"; however, no guidance has been released yet on what this form would look like.
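For illustration only, the two default deadlines described above can be computed as in the following sketch. The helper functions are hypothetical, the business-day count ignores state holidays, and none of this is legal advice:

```python
from datetime import date, timedelta

def consumer_notice_deadline(discovery: date) -> date:
    """Vermont's firm deadline: consumer notifications within 45 days
    of discovery of the breach."""
    return discovery + timedelta(days=45)

def ag_notice_deadline(discovery: date) -> date:
    """Default rule: notify the attorney general within 14 business
    days of discovery (weekends skipped; holidays ignored here)."""
    deadline = discovery
    business_days = 0
    while business_days < 14:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday=0 .. Friday=4
            business_days += 1
    return deadline
```

Note that the 14-business-day attorney general deadline will typically fall well before the 45-day consumer deadline, which is why the alternative sworn-statement route matters to companies that need time to investigate before notifying.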

This recent update to the Vermont data breach notification law provides yet another wrinkle in the complicated landscape of state data breach notification laws.

Reed Smith hosts seminar on "Taming the e-Beast: What you need to know about Records Management, Data Protection and E-Disclosure in this Electronic Age"

This post was written by Cynthia O'Donoghue, David Cohen and Rosanne Kay.

Reed Smith hosted a seminar in its London office to discuss issues companies face arising from poor Records Management, Data Protection, E-Disclosure and the Proposed EU General Data Protection Regulation. Speakers included the UK Information Commissioner’s Office Head of Strategic Liaison, Jonathan Bamford, and Reed Smith London Partners Cynthia O’Donoghue and Rosanne Kay, and Pittsburgh Partner David Cohen.

In the first session, Cynthia and David addressed the issue of poor records management and how companies can take steps to improve their approach to record keeping in the Electronic Age. They commented that the volume of documentation being stored by companies is becoming increasingly difficult to manage because of emails and documents being kept for too long a period. Companies face conflicting duties of requiring a good retention policy and being prepared for litigation, at the same time as complying with data privacy principles which state that information should not be kept for longer than necessary. Companies are often saving records beyond the point where they have any useful purpose, such as emails that tend to have a lifespan of only six months, and companies can suffer from poor employee productivity when employees spend inordinate amounts of time looking for documents. The speakers advised clients to adopt a ‘six-step action plan’ to address these issues and strike a balance between the different business needs, legal considerations, and data privacy concerns, to create a workable, appropriate retention policy.

Jonathan Bamford gave a presentation on the ICO’s perspective on the EU Data Protection Regulation and Directive. The ICO is seeking a clear, easy-to-understand set of rules containing effective requirements that are both simple to exercise and low cost. The ICO wants accountability and responsibility throughout the information life cycle, and a provision which allows organisations that are compliant with the regulations to “get ahead”. He stated that the ICO welcomed certain aspects of the regulations, including:

  • Improved rights for individuals
  • A higher standard of consent – in the new draft regulations, consent must be explicit and can be withdrawn
  • Incorporation of new concepts such as Privacy by Design
  • Stronger supervisory authorities
  • More consistency across the EU – one set of regulations across all 27 member states and “one-stop-shop” complaints procedures

Jonathan explained that some changes in the proposed framework were less welcome by the ICO, including:

  • Having a separate Regulation and Directive as the two instruments could cause confusion, because the Directive seems to have a lower standard of protection
  • The overly prescriptive nature of the proposed Regulation
  • The lack of focus on privacy risk – the UK’s current Data Protection Act and associated measures put privacy risk at the forefront
  • An outdated approach to international data transfers
  • A “one size fits all” approach towards sensitive data without considering the context and risk

He also expressed doubts regarding some concepts raised in the proposals, stating that the Right to be Forgotten will be very difficult to enforce, and that the potential workload that would be placed on supervisory authorities is almost unworkable. Echoing the ICO’s initial opinion, he stated that the published opinion will not be the ICO’s last word on the draft EU Regulation.

The last session of the seminar covered E-Disclosure and Cross-Border issues. David Cohen and Rosanne Kay discussed the various issues that arise with e-disclosure/discovery in litigation in both the UK and the US. Electronic documents have taken on great significance in litigation in recent years because they contain large amounts of information, are easy to search using keyword terms, and are difficult to destroy, yet can be difficult to locate and preserve. New technologies, such as ‘concept searching’ and ‘e-mail threading’, are emerging to aid document reviews. David highlighted an emerging trend in the United States, where sanctions have been imposed on parties for e-discovery mistakes.

Cynthia then discussed conflicting laws between the EU and US on cross-border discovery, stemming from the international data transfer bar contained in the EU Data Protection Directive and some European countries’ blocking statutes. Because of the broad definitions of ‘personal data’ and ‘processing’, any US discovery seeking documents from organizations located in Europe will be caught by national data protection laws, so that a transfer of data to the United States has the potential to violate those laws. Cynthia discussed recent developments such as the Sedona Conference Working Group 6 principles on transfers, the American Bar Association’s recent resolution urging US courts to give ‘due respect’ to foreign data protection and privacy laws, and the International Chamber of Commerce policy statement on “Cross-border law enforcement access to company data – current issues under data protection and privacy law”. The statement makes recommendations that can help ensure respect both for law enforcement interests and for data protection and privacy laws.

Sedona Conference® International Principles on Discovery, Disclosure & Data Protection - a new set of "Three Ps" for litigants and data privacy practitioners to apply in the real world

This post was written by Cynthia O'Donoghue and Nick Tyler.

Last month we highlighted a resolution of the American Bar Association urging U.S. courts to: “consider and respect…the data protection and privacy laws of any…foreign sovereign, and the interests of any person who is subject to, or benefits from, such laws”, in the context of the onerous legal requirements in the United States to preserve and disclose information in civil litigation.

This month we want to highlight an important publication by Working Group 6 of the Sedona Conference®: “International Principles on Discovery, Disclosure & Data Protection: Best Practices, Recommendations & Principles for Addressing the Preservation and Discovery of Protected Data in U.S. Litigation”. This document provides a working blueprint for litigants and data privacy practitioners alike to follow in resolving the “rock and a hard place” challenge faced by clients seeking to comply with competing international laws.

The published “European Union Edition” can be viewed as a useful companion piece to the Article 29 Working Party Working Document 1/2009 on pre-trial discovery for cross-border civil litigation, but with the distinct advantage of providing six working principles as well as practical solutions:

  • Model Protected Data Protective Order and
  • Cross-Border Data Safeguarding Process + Transfer Protocol (Process + Protocol)

The six principles include:

  1. Due respect for Data Protection Laws, such as the EU Data Protection Directive and HIPAA, which echoes the ABA resolution
  2. Good faith and reasonableness (proportionality) to resolve conflicts
  3. Limited scope of preservation, disclosure and discovery by relevance and necessity
  4. Use of protective orders to resolve/minimize conflict
  5. Data Controllers documenting the data protection safeguards taken and obligations observed, including in relation to data transfers made from Europe in the litigation context
  6. Document management based on retention period being no longer than necessary to satisfy legal or business needs

Use of the principles, process and protocol may help mitigate the conflict in laws between the EU and the United States on this subject, but organisations will need to tread carefully since the proposed EU Data Protection Regulation still does not accommodate transfers to the United States for purposes of litigation.

The Article 29 Working Party tells two online advertising groups that their proposed code of conduct for data tracking is still not satisfactory and is contrary to EU privacy laws

This post was written by Cynthia O'Donoghue.

The Article 29 Working Party has again told two online advertising groups, the Interactive Advertising Bureau (“IAB”) and the European Advertising Standards Alliance (“EASA”), that their proposed code of conduct for data tracking was still unsuitable as it failed to satisfy the requirements of EU privacy laws, and suggested adoption of the standards unveiled by the World Wide Web Consortium (“W3C”). The proposed code had already been criticized in December 2011, after which the Working Party issued recommendations in January 2012.

The code itself calls for self-regulation by the Internet advertising industry in the area of behavioural advertising, as well as participation in a pan-European website that gives users access to information regarding online advertising. The IAB and EASA implemented only a few of the Working Party’s recommendations. In a press release, EASA maintained that the code “goes beyond what any law has historically required in terms of transparency and control”. However, the Working Party held that the changes did not go far enough, and that the current approach taken by the code of conduct “does not meet the consent and information requirements of the revised e-Privacy Directive”.

The Working Party encouraged the IAB and the EASA to consider the W3C standards unveiled last November, as the W3C is considered a leading international Internet standards group and its membership includes employees of Apple Inc. and the Federal Trade Commission. The W3C plans to adopt a global Do Not Track (“DNT”) mechanism in June 2012, which the Working Party expects will give users “an active and informed choice” about whether their web activity should be tracked, and which the Working Party sees as a very efficient way of dealing with user consent.

The Working Party encouraged the IAB and the EASA to incorporate elements of the DNT mechanism into their code, and to present the code again to the Working Party for approval.

German Court 'Un-Friends' Facebook: Ruling on Friend Finder, User's IP Rights and Data Use Policy

This post was written by Katharina A. Weimer and Thomas Fischl.

On March 6, 2012, the Regional Court of Berlin issued a ruling on a case initiated by the Verbraucherzentrale Bundesverband, the Federation of German Consumer Organisations, against Facebook Ireland Limited. The court took this rare opportunity to object to several key features of Facebook’s user experience and actions:

  1. The court criticized that users are not informed properly during the registration process, that email addresses and names of the user’s contacts are imported by the friend finder functionality without the contact’s knowledge, and that this functionality sends invitations to these contacts to join Facebook without the consent of these contacts. The court ruled that Facebook can no longer use this functionality in Germany without proper information and consent.
  2. The court also ruled on the validity of certain provisions in Facebook’s terms of use, deciding that Facebook may not require the users to grant a comprehensive right to use the users’ content on a worldwide basis without royalty payments. Facebook’s users maintain their rights in any content which is protected by IP rights, e.g. pictures, and Facebook may only use it with the users’ explicit consent.
  3. Another consent Facebook will have to reword is the users’ consent to data processing for advertising purposes. In addition, any changes to the terms of use and its privacy policy need to be communicated to the users sufficiently in advance.

The court held that Facebook’s practice is in violation of data protection laws, constitutes a case of unfair competition and is not compliant with German rules on standard terms and conditions. Users and consumer protection agencies are no longer willing to accept Facebook’s approach to privacy, transparency and intellectual property, and the Berlin court was willing to lead the way. The judgment is not legally binding yet and Facebook announced that it will carefully review the judgment once available in full and will then decide on any action to be taken.

A Seasonal Reminder for Your New Year's To-Do List - Implement Your Cookie Action Plan for a "Good Enough" Solution!

This post was written by Cynthia O'Donoghue and Nick Tyler.

On Christmas Day, organisations operating in the UK will have just five months to get their act together and comply fully with the new EU-wide rules on cookies.

The 12-month lead-in period set by the UK data protection regulator, the Information Commissioner’s Office (ICO), expires on 25 May 2012. This period is a time for taking pro-active steps, with the Information Commissioner himself issuing a timely warning on his blog that not enough is being done to address compliance by too many.

If the ICO’s message wasn’t clear seven months ago, its latest reminder should be now:

“organisations will need to be able to demonstrate they have taken sensible measured action to move to compliance. If a website has not achieved full compliance at the end of the period the [ICO] will expect a specific and clear explanation of why it was not possible to comply in time, a clear timescale for when compliance will be achieved and details of specifically what work is being done to make that happen.”

The ICO has helpfully taken the opportunity to update its guidance. This now includes a number of useful examples of what some organisations are doing to meet the new requirement for positive consent to cookies and other similar technologies.

The key first steps remain the same:

1. Cookie Audit,
2. User Impact Assessment, and
3. Action Plan.

At this stage of the lead-in period, the ICO expects organisations to have decided on the solutions appropriate to them and to have ready an

4. Implementation Plan – setting out the organisation’s activities to get into compliance between now and 25 May 2012. If you haven’t yet started this process, now is the time to start and to map out your chosen solutions!

The ICO emphasises that organisations must have in place “mechanisms for exercising user choice” to better educate consumers about the different cookies they use, what they are used for, and “making the case” about the undoubted benefits of cookies. The ICO’s guidance stems from UK Government-sponsored research revealing the general public’s limited understanding of cookies and how to manage them, including among more “internet-savvy” consumers.
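As a rough illustration of what a “mechanism for exercising user choice” can mean in practice, the sketch below gates non-essential cookies on prior positive consent. It is a minimal Python sketch with hypothetical names; the split between strictly necessary and other cookies is an assumption for illustration, not taken from the ICO guidance:

```python
# Hypothetical helper illustrating positive consent to cookies:
# non-essential cookies are set only after the user has actively agreed.

ESSENTIAL = {"session_id"}  # assumed exempt: strictly necessary cookies


def cookies_to_set(requested, user_consented):
    """Return the subset of requested cookies that may be set.

    Strictly necessary cookies are always allowed; everything else
    (analytics, advertising, preference cookies) requires the user's
    prior positive consent.
    """
    if user_consented:
        return set(requested)
    return {name for name in requested if name in ESSENTIAL}
```

For example, before consent is recorded, a request to set `{"session_id", "analytics"}` would be trimmed to `{"session_id"}` only.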

While many view the new EU-wide requirement for positive consent to cookies as a legislative ‘sledgehammer to crack a nut’, the ICO’s position is that the more information given to consumers the better choice and control they are able to exercise.

The ICO’s view is the opposite of ‘less is more’: greater information and choice will result in increased consumer confidence rather than resistance to cookies.

While the ICO recognises that technical solutions remain a “work in progress”, it also challenges the prevalent criticism of the new rules, highlighting some genuine ‘quick fixes’ which, while not perfect, seem to be good enough for it to accept as compliant.

Leaked proposed EU Commission Data Protection Regulation has potential to open eyes and make mouths water!

This post was written by Cynthia O'Donoghue.

The European Commission’s new draft data protection regulation was leaked to the press earlier this month. The proposal includes repeal of the present EU Data Protection Directive 95/46 and recommends a General Data Protection Regulation, as well as a Police and Criminal Justice Data Protection Directive.

The Commission appears to have made good on its threats to increase enforcement so that U.S. and other companies outside the EEA comply. Some of the ground-breaking proposals include a harmonised enforcement and sanctions mechanism with penalties of 1%, 3% or 5% of an enterprise’s annual worldwide turnover for intentional or negligent breaches of various data protection obligations. Those penalties will certainly force organisations to sit up and take notice of their data protection obligations.
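To put the proposed tiers in perspective, a back-of-the-envelope calculation helps. The sketch below is illustrative only; which breaches would attract which tier is an assumption for the example, not taken from the leaked draft:

```python
# The leaked draft's penalty tiers: 1%, 3% or 5% of annual worldwide
# turnover. The tier labels below are assumptions for illustration.
PENALTY_RATES = {"minor": 0.01, "serious": 0.03, "severe": 0.05}


def max_penalty(annual_worldwide_turnover, tier):
    """Ceiling of the fine for a given turnover and penalty tier."""
    return annual_worldwide_turnover * PENALTY_RATES[tier]


# A group with EUR 2bn worldwide turnover could face up to EUR 100m
# at the 5% tier:
max_penalty(2_000_000_000, "severe")  # -> 100000000.0
```

Even at the 1% tier, a fine scaled to worldwide turnover rather than a fixed statutory cap is a step change from most current national regimes.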

As suspected, the draft regulation includes new elements in relation to the principles of transparency and data minimisation, as well as a new principle of accountability for data controllers. Built into the new principle is an obligation for Privacy by Design “and by default”.

In addition, the right to be forgotten shifts the burden from individuals to organisations by requiring organisations that seek to continue to process personal data to demonstrate compelling legitimate grounds for the processing which override the interests or fundamental rights of the individual. This new right to be forgotten extends to erasure of information in the public domain available via the Internet or other communication service, and links to a new right to have the data restricted.

The draft Regulation also obliges large enterprises, whether acting as data controllers or data processors, to appoint a data protection officer where the processing of personal data requires regular and systematic monitoring.

The draft Regulation further proposes a new ‘super’ regulator, a European Data Protection Board to consist of the heads of each of the Member States’ Data Protection Authorities to replace the Article 29 Working Party. This new ‘super’ regulator will have the power to review and opine on measures at the national level relating to cross-border data processing whether within the European Union or outside of it, including approvals of data transfer agreements and binding corporate rules.

As we recently saw with France’s implementation of a data protection label, the proposed Regulation encourages the use of data protection certifications, such as seals and marks, for data controllers, aimed at helping individuals assess an organisation’s privacy practices.

Unless organisations move data privacy and protection up the priority list, they will be sitting on a time bomb. The issue is not whether this proposal will come into force, but when. While there may be some changes as the proposal makes its way through the European Parliament, the way forward for organisations is now clear, and they will have at least two years to bolster their processes and procedures and get ready for the new horizon.

Even Data Privacy Obligations are Bigger in Texas

This post was written by John L. Hines, Jr., Paul Bond, Amy S. Mushahwar, Brad M. Rostolsky and Frederick Lah.

Earlier this year, Texas Governor Rick Perry signed into law Texas House Bill 300 (H.B. 300), which imposes more stringent health privacy requirements, expanded data breach notification obligations, and increased fines for violations. The law will become effective September 1, 2012. The following client alert details what businesses in Texas need to know about this new data privacy law. In addition, we wanted to remind clients about California’s amendments to its data breach notification law, as those changes are set to become effective January 1, 2012. Please feel free to pass this along to any client who may find it relevant.

To view the entire alert, please click here.

Proposed Rule Seeks To Prevent Future Contractor Leaks of Personally Identifiable Information - The WikiLeaks Response

This post was written by Melissa E. Beras.

On October 14, 2011, just one week after the release of the "WikiLeaks Order," the Department of Defense (DoD), the General Services Administration (GSA), and the National Aeronautics and Space Administration (NASA) proposed a rule that would require certain contractors to complete training that addresses the protection of privacy and the handling and safeguarding of personally identifiable information (PII). Specifically, the rule requires contractors who access government records, handle PII, or design, develop, maintain, or operate a system of government records on behalf of the government, to undergo training upon award of a contract and at least annually thereafter. Further, according to the rule, contractors would have recordkeeping requirements for documents indicating that employees have completed the mandatory training and would be required to produce those records upon government request.

In addition, the proposed Federal Acquisition Regulation (FAR) text provides that the required privacy training must, at a minimum, address seven mandatory elements. Those elements include training on privacy protection in accordance with the Privacy Act of 1974, restrictions on the use of personally owned equipment that implicates PII, breach notification procedures, and other “agency-specific” training requirements. The proposed FAR text also provides alternative language for instances where an agency would prefer that the contractor create the privacy training package, as opposed to attending an agency-developed privacy training. Additional alternative language is proposed for instances where the government determines it is in its best interest for the agency itself to conduct the training. Moreover, the clause requires that it be flowed down to any subcontractors who: (1) have access to government records; (2) handle PII; or (3) design, develop, maintain, or operate a system of records on behalf of the government.

The proposed rule is a part of a broader effort to enhance cyber security. It follows the “WikiLeaks Order,” an executive order issued October 7, 2011, and formally titled “Structural Reforms to Improve the Security of Classified Networks and the Responsible Sharing and Safeguarding of Classified Information,” which directs governmental change to ensure that classified information is shared responsibly and safeguarded on computer networks in a manner consistent with appropriate protections for privacy and civil liberties. The order expressly states that agencies bear “the primary responsibility for meeting these twin goals.” The proposed rule also comes shortly after the DoD requested the extension of a pilot program through November 2011, which helps protect the networks of its prime defense contractors by sharing intelligence about threats to their data with these contractors.

Contractors interested in sharing their views on the proposed rule have the opportunity to comment. Written comments are due by December 13, 2011.

Predictions on the New EU Data Protection Law

This post was written by Cynthia O'Donoghue.

Richard Thomas, the former UK Information Commissioner, predicted that the European Commission will issue a regulation rather than a directive as part of the overhaul of the EU data protection directive. Under EU law, a regulation has immediate legal effect, whereas a directive requires the EU member states to enact implementing legislation. The issuance of a regulation would finally harmonise data protection law across the EU member states. Thomas also predicted that a regulation would result in a standardised registration process for data controllers across the EU. He made his predictions at the 10th Annual Data Protection Compliance Conference, which took place last week in London.

At the same conference, the current Information Commissioner, Christopher Graham, complained about not having statutory powers to carry out audits in the sectors that receive the most complaints and cause him the most concern. Commissioner Graham’s complaint stems from the fact that under the UK Data Protection Act 1998 he must seek permission from organisations before being able to carry out an audit of their data protection practices. Commissioner Graham is seeking to extend his powers under the Coroners and Justice Act 2009 so that he can target the sectors most complained about, which include car insurance companies, banks and building societies.

A busy week in Europe: Do Not Track, Children's Internet Privacy, Data Breach Notification and Transfers of Passenger Record Data

This post was written by Cynthia O'Donoghue.

Hasn’t it been a busy week in Europe? The regulators seem to be falling over one another in a race to the top of privacy regulation. In their sights are web browsers and ‘do-not-track’ mechanisms, children’s internet privacy, banks, and the U.S. request for passenger data.

European Commissioner Neelie Kroes came close to threatening the advertising industry when speaking at a recent workshop in Brussels. The EU is picking up the baton from the U.S. Federal Trade Commission in calling for a ‘Do Not Track’ standard to be in place by June 2012. To the browser makers that offer, and the businesses that honour, do-not-track, Commissioner Kroes said: “But this is not enough. Citizens need to be sure what exactly companies commit to if they say they honour do not track. … If I don't see a speedy and satisfactory development, I will not hesitate to employ all available means to ensure our citizens' right to privacy."

Commissioner Kroes was also recently “disappointed” by the findings of a study by the European Commission (EC) on how social networking sites treat children. The study found that out of 14 social networking sites, only two had default settings that limited access to the approved contacts of children. The European Commission is consulting with social networking sites about rules governing the on-line privacy of children, and Commissioner Kroes will be urging sites to “make a clear commitment to remedy [their default settings] in a revised version of the self-regulatory framework” being discussed. Based on the results of the study, social networks will need to do more to protect children’s privacy.

As if banks don’t have enough on their regulatory plates, the EU Justice Commissioner Viviane Reding recently announced that banks will be required to disclose serious data protection breaches. While Commissioner Reding acknowledged feedback from the banking sector expressing concern that mandatory data breach notifications would add to their administrative burden, she says the burden "is entirely proportionate and would enhance consumers' confidence in data security and oversight."

The EC and the U.S. have been renegotiating the agreement on the transfer of passenger name record (PNR) data to the U.S. This week a leaked report showed that the EC’s legal counsel opined that the proposed agreement, which would allow the storage and retention of PNR data for 15 years (four times longer than under the present agreement), is unlawful, and expressed “grave doubts” about the agreement’s compliance with EU data protection laws, notwithstanding an acknowledgement that PNR data aids the fight against international terrorism. The new agreement will need the approval of the European Parliament, but a German Member of the European Parliament doubts that the agreement will pass through the Parliament in light of the legal advice, and would prefer the parties go “back to the drawing board” to ensure that any new agreement is compliant with EU data protection law.

UK Banks Need to Get it Right on Data Protection

This post was written by Cynthia O'Donoghue.

The Information Commissioner’s Office (ICO) told attendees of the British Bankers’ Association conference today that they need to get it right on data protection.

Banks were reminded that data protection is not only about keeping data secure; it is also about ensuring individuals remain in control of the data the banks hold about them.

Two years ago the ICO was inundated with complaints about the banks’ failures to provide information about unfair bank charges, and the ICO does not want a repeat.

In light of the recent ruling about the mis-selling of payment protection insurance, the ICO will expect banks to provide customers with timely and full responses to information requests.

The ICO also announced that it has identified the financial sector as a priority area in its draft Information Rights Strategy.

FTC and Google - Proposed Settlement Over "Buzz"

This post was written by Christopher G. Cwalina, Amy S. Mushahwar, and Frederick Lah.

Google, Inc. agreed to a proposed consent order over charges that it used deceptive tactics and violated its privacy promises to consumers when it launched its social network, Google Buzz. The FTC alleged in its Complaint that Google's information practices violated Section 5 of the FTC Act.

As background, in February 2010, Google launched Buzz, a social networking service within Gmail, its web-based email product. Google used the information of Gmail users, including first and last name and email contacts, to populate the social network. Gmail users were, in many instances, automatically set up with “followers” (people that followed the user or people that the user followed). According to the FTC's Complaint, even if a user did not enroll in Buzz, the user's information was shared in a number of ways (e.g., a user who did not enroll in Buzz could still be followed by other Gmail users who enrolled in Buzz). The FTC also alleges that the setup process for Gmail users who enrolled in Buzz did not adequately communicate that certain previously private information would be shared publicly by default. Further, the FTC alleges that certain personal information of Gmail users was shared without consumers' permission through Buzz (e.g., some information was searchable on the Internet and could be indexed by Internet search engines).

Part I of the proposed consent order prohibits Google from misrepresenting the privacy and confidentiality of any “covered information,” as well as the company’s compliance with any privacy and security program, including the U.S.-EU Safe Harbor Framework. The term "covered information" is defined very broadly to include an individual's first and last name, physical address, email address, screen name, persistent identifier (e.g., IP address), list of contacts, and physical location. The FTC noted in its press release that this is the first time it has alleged violations of the substantive privacy requirements of the U.S.-EU Safe Harbor Framework.

Part II of the proposed consent order requires Google to give its users a "clear and prominent" notice and choice. Under the terms of the proposed consent order, Google must obtain express affirmative consent before sharing any user's covered information with a third party in connection with: (1) a change, addition or enhancement to any product or service, or (2) sharing contrary to the stated sharing practices in effect at the time the information was collected. The proposed opt-in disclosure must appear separately from any end user license agreement, privacy policy, website terms of use or similar document and prominently disclose: (1) that the Google user’s information will be disclosed to one or more third parties, (2) the identity or specific categories of such third parties, and (3) the purpose(s) for Google’s sharing of the information.

Part III of the proposed order requires Google to establish and maintain a comprehensive privacy program that is reasonably designed to address privacy risks related to the development and management of new and existing products and services. The program must be documented in writing and must contain privacy controls appropriate to Google’s size and complexity, the nature and scope of its activities, and the sensitivity of covered information. Parts IV through IX of the proposed order contain reporting and compliance provisions, including obtaining ongoing biennial assessments from a qualified third-party professional about Google's privacy practices, requiring that Google retain consumer complaints for a period of six months, and mandating that Google submit an initial compliance report to the FTC and make subsequent reports available to the FTC. If finalized, the proposed consent order would remain in effect (with ongoing compliance requirements) for twenty years.

Commissioner Rosch, in a concurring statement, expressed "substantial reservations" about Part II. He said that Google never intended in its original Privacy Policy that the consent it would seek would be "opt-in" (as opposed to "opt-out"), and that such a requirement was "brand new". Commissioner Rosch also noted that the proposed consent order seems to apply to "any" new or additional sharing of previously collected personal information, not just any "material" new or additional sharing of information.

The proposed consent order will be placed on the public record until May 2, 2011, for public comment. After thirty days, the Commission will consider the comments and decide whether to make the proposed consent order final. Bottom line: this case should serve as a reminder that companies must align their business practices with the promises contained in their Privacy Policies.

FTC Brings Enforcement Action against Text Messaging Spammer

This post was written by Kevin Xu and John Hines.

On February 22, 2011, the Federal Trade Commission (“FTC”) filed a complaint against Phillip A. Flora (“Flora”) for an operation that allegedly blasted consumers with millions of illegal spam text messages, including many that deceptively advertised a mortgage modification website. The FTC is asking the court to shut down Flora’s operation and freeze his assets.

According to the FTC complaint, beginning on or about August 22, 2009, Flora transmitted or arranged for the transmission of at least 5 million spam text messages to random consumers. The messages promoted products and services including, but not limited to, loan modification programs and debt relief services. They offered to help consumers obtain mortgage loan modifications, and many stated: “Homeowners, we can lower your mortgage payment by doing a Loan Modification. Late on payments OK. No equity OK. May we please give you a call?” Consumers who visited the advertised web address arrived at a website that touted itself as the “Official Home Loan Modification and Audit Assistance Information” beneath a picture of the U.S. flag. Although its address included the term “gov”, the website was not operated by or affiliated with any governmental entity. Additionally, Flora allegedly collected information from consumers who responded to the text messages – even those asking him to stop sending messages – and sold their contact information to marketers, claiming the lists were “debt settlement leads.”

The FTC charges that Flora violated Section 5(a) of the FTC Act, which prohibits unfair or deceptive acts or practices in or affecting commerce, by sending unsolicited commercial text messages to consumers and by misrepresenting that he was affiliated with a government agency. The FTC also charges that Flora violated the CAN-SPAM Act by sending spam text messages that failed to include a way for consumers to “opt out” of future messages and failed to include the sender’s physical mailing address, as the Act requires.
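The two CAN-SPAM defects alleged here lend themselves to a simple pre-send check. The sketch below is a hypothetical illustration in Python, not legal guidance; the “STOP” keyword test is a simplifying assumption rather than statutory language:

```python
def can_spam_issues(message, sender_postal_address=None):
    """Flag the two CAN-SPAM defects alleged in the Flora complaint.

    Checks that the message offers a way to opt out of future messages
    and that a physical mailing address for the sender is supplied.
    The 'stop' keyword heuristic is an assumption for illustration.
    """
    issues = []
    if "stop" not in message.lower():
        issues.append("no opt-out instruction")
    if not sender_postal_address:
        issues.append("no physical mailing address")
    return issues


# The message quoted in the complaint would fail both checks:
can_spam_issues("Homeowners, we can lower your mortgage payment...")
# -> ['no opt-out instruction', 'no physical mailing address']
```

A compliant message, by contrast, would carry both an opt-out instruction (e.g. “Reply STOP to end”) and the sender’s postal address.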

The outcome of this case, which we note is being brought by the FTC and not the FCC, may have a significant impact on consumer data privacy rights in the mobile communications sector, and may serve as a watershed case for consumers’ potential recourses in future privacy violation situations arising from mobile communications.

Asian Data Privacy Update

This post was written by Cynthia O'Donoghue.

Asian countries continue to focus on developing their data protection legislation.

The Philippines Congress recently finished its second reading of House Bill 1554, which would introduce a unified, special law on data protection and privacy. Singapore, which already has some sectoral laws and a voluntary data protection model code, is now calling for the introduction of formal data protection legislation for parliamentary debate in early 2012.

The Philippines draft bill seeks to establish fair practices and regulate the collection, use and protection of individuals’ private information as well as to promote the development of its business process outsourcing industry. Under the Filipino bill, businesses and government agencies would have to obtain an individual’s unambiguous consent to collect and use their personal data. The bill also sets out data breach notification requirements to the regulator and to affected individuals when there is a real risk of serious harm, including breaches that may enable identity fraud. The proposed bill defines personal information quite broadly as “any data that can be used alone or in conjunction with other data to identify an individual”, and provides additional protections for sensitive personal information. Under the bill, a national Privacy Commission would be created that has the power to implement and enforce data protection legislation, including the authority to impose civil fines for certain violations and to refer suspected intentional violations to the Philippines Government’s Justice Department for investigation and potential imposition of criminal penalties of up to three years imprisonment.

In Singapore, the Information, Communication and Arts Minister indicated that the country’s review to assess the need for a data protection system has been completed and that comprehensive data protection legislation will be introduced in early 2012. The Singaporean government has concluded that it would be in Singapore’s interests to protect individuals’ personal data against unauthorised use and disclosure. The objective of the proposed law is to curb excessive and unnecessary collection of individuals’ personal data, and to require businesses and government to obtain individuals’ consent before their personal information is disclosed. In addition, the proposed legislation will create a Data Protection Council to oversee the implementation and enforcement of data protection legislation. It is not clear whether any proposed Singapore legislation will include a compulsory data breach notification element.

Both the Philippines and Singapore are members of the Asia-Pacific Economic Cooperation Forum (APEC), which issued a privacy framework for its 21 members. Both the Philippines draft house bill and the proposed Singaporean legislation are consistent with the APEC privacy framework principles.

As the number of Asian countries enacting data protection legislation increases, organisations must ensure they are ready to meet the requirements. This is particularly important for outsourcing suppliers as both countries view data protection legislation as a way of increasing commerce.

California Reins in Retail Marketing

This post was written by Joshua Marker.

Catalog and retail marketing in California just got a little bit trickier. No longer can retailers require that a customer provide a ZIP code to complete a credit card transaction, and this may impede the ability of many retailers to generate in-store marketing leads. On February 10, 2011, the California Supreme Court held that the Song-Beverly Credit Card Act (“the Act”) covers key components of an individual’s address as ‘personal identification information’ in a credit card transaction.

In that case, Pineda v. Williams-Sonoma Stores, Inc., No. S178241, at issue was Williams-Sonoma’s practice of collecting individuals’ ZIP codes when completing credit card transactions. Williams-Sonoma collected these ZIP codes for credit card verification purposes and developed a retail marketing lead list from its in-store transactions. The California Supreme Court found that this practice violated Section 1747.08(a)(2) of the Act, as ZIP codes are ‘personal identification information’ covered by the Act, and the collection of that information was thus prohibited.

Section 1747.08 of the Act prohibits a business from requiring a cardholder to provide ‘personal identification information’ in order to complete a credit card transaction. The Act defines ‘personal identification information’ as “information concerning the cardholder, other than information set forth on the credit card, and including, but not limited to, the cardholder’s address and telephone number.” Pineda alleged that Williams-Sonoma had collected her ZIP code as part of a credit card transaction, used it to obtain her complete address, and added her information to its database.

The California Supreme Court ruled that ‘personal identification information’ under the Act must include components of a cardholder’s address. If it did not, a business could request a ZIP code and find the address through a reverse search, as Williams-Sonoma was allegedly doing. This “would render the statute’s protections hollow.” The term ‘address’ must be construed to cover the components of a cardholder’s address, and not just the complete address. Otherwise a retailer would be permitted “to obtain indirectly what they are clearly prohibited from obtaining directly.”

With fines of up to $1,000 for each violation and the potential for class action litigation, the Act carries potentially steep penalties. As a result, it may be time to review your company’s in-store information collection practices and bring them in line with this new ruling.

New ENISA Report on data breach notifications issued

This post was written by Cynthia O'Donoghue and Katalina Chin.

ENISA (the European Network and Information Security Agency) has issued a new report on data breach notifications. Having approached telecoms operators and data protection authorities (DPAs) on this topic, the report highlights data breach handling practices and key stakeholder concerns.

The revised e-Privacy Directive (2002/58/EC) brought in EU data breach notification requirements for the telecoms sector and the European Commission is considering the inclusion of the finance, healthcare and small business sectors. By requiring mandatory data breach notification to the national data protection authority, the Commission hopes to encourage organisations to increase the level of security afforded to personal data and to reassure citizens about the security of their personal data by telecom sector operators.

What exactly are appropriate technical and organisational security measures?

ENISA, the EU agency ‘created as a response to security issues of the European Union’, will be preparing guidance on the technical implementation measures and procedures required to comply with Article 4 of the e-Privacy Directive on security, so this most recent report serves as a useful precursor to the issues which should be addressed in the highly-anticipated ENISA guidance.

As a general comment ENISA found that data protection authorities tend to take a varied approach to enforcing data protection and privacy in the EU. ‘Some follow EC Directives closely, while others take on additional responsibilities beyond those outlined in the Directives’. While the telecoms sector may recognise the importance data breach notification will play in data security, the uncertainty in how such notifications will be dealt with by DPAs should not be underestimated.

The key concerns raised by telecoms operators include the following:

  • Risk prioritisation - Breaches should be categorised according to specific risk levels to limit the burden of notification on the resources of both organisations and DPAs, in particular where there is no real risk to the rights of the data subject;
  • Communications channels - Brand is an important issue for all operators, who want assurance that notification requirements will not negatively affect their brand, and that they can retain control over notifying data subjects in order to manage any impact on brand perception; and
  • Support - Guidance on implementing security levels to comply with Article 4 of the e-Privacy Directive, which should aim to prevent violations before they happen, in addition to procedural guidance on the data breach notification requirement.

The DPAs interviewed for ENISA’s report raised concerns of their own. While the majority of DPAs support mandatory breach notification for telecoms operators, the report highlights a long list of factors for consideration before mandatory notification can be implemented. Those factors include:

  • adequate resources, both budgetary and in staff IT expertise, to match the high level of technical expertise found in the telecoms sector;
  • sanctioning authority to impose penalties as a tool for ensuring compliance; and
  • a clear delineation of responsibilities between relevant authorities to mitigate or prevent potential conflict.

The key concern among DPAs was that the data breach notification requirement will interfere with their ability to perform their numerous pre-existing responsibilities, a strain already evident in some member states when organisations seek authorisations for data processing and/or approval for international transfers.

Organisations should look to the legislative examples of Ireland and Germany highlighted in the ENISA report while Member States prepare their implementing legislation of the new e-Privacy Directive. ENISA cited both countries as useful examples of breach notification procedures and suggested a progress review of “both countries over time in order to gather experiences, best practices, and lessons learned”.

We will prepare a future blog on ENISA’s guidance on appropriate technical and organisational measures once it has been issued, so watch this space!

Privacy & Data Security Bills After the Midterm Elections

This post was written by Judith L. Harris, Christopher G. Cwalina and Amy S. Mushahwar.

The midterm elections will likely result in a shift of political power within the House of Representatives. The resultant divided government is likely to impact the present ambitious privacy and data security legislative agenda. Reed Smith Washington D.C. Data Privacy, Security and Management attorneys Judith Harris, Christopher Cwalina, and Amy Mushahwar have published an analysis of their predictions for 2011 legislative priorities as the incoming crop of legislators moves from campaign mode to governance. Please see their article in Information Security here.

What kind of animal is your PET? Report on Privacy Enhancing Technologies ("PETs") released by European Commission

This post was written by Cynthia O'Donoghue and Katalina Chin.

The European Commission DG Justice, Freedom and Security commissioned London Economics, one of Europe's leading specialist economics and policy consultancies, to undertake a study and report on the economic benefits of Privacy Enhancing Technologies ("PETs") for organisations and institutions using and holding personal data in selected European member states.

But what are PETs? The term covers a set of computer tools, applications and mechanisms, including procedures and management systems, that aim to protect privacy by eliminating, anonymising or minimising personal data so as to prevent its unnecessary or unwanted processing. Features can include, for example, allowing individuals to choose their degree of anonymity; to inspect, correct and delete their personal data; and to track its use. PETs may also include a consent mechanism before personal data is provided to online service providers. The report emphasises that, "data minimisation and consent mechanisms are an important part of PETs, and PETs often combine these elements with data protection tools into an integrated privacy system".
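As a purely illustrative sketch (the report does not prescribe any particular implementation), two techniques commonly grouped under the PETs umbrella, data minimisation and pseudonymisation, might look like this in Python; the record, field names, and salt are hypothetical:

```python
import hashlib

# Hypothetical raw record collected by an online service provider.
record = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "postcode": "SW1A 1AA",
    "page_viewed": "/products/widgets",
}

# Data minimisation: keep only the fields needed for the stated purpose
# (here, simple usage analytics), discarding direct identifiers.
NEEDED_FIELDS = {"postcode", "page_viewed"}
minimised = {k: v for k, v in record.items() if k in NEEDED_FIELDS}

# Pseudonymisation: replace the email address with a salted one-way hash,
# so repeat visits can be linked without storing the identifier itself.
SALT = b"example-salt"  # in practice a secret, periodically rotated value
pseudonym = hashlib.sha256(SALT + record["email"].encode()).hexdigest()[:16]
minimised["user_pseudonym"] = pseudonym

print(minimised)
```

A real PET would combine such measures with the consent and access mechanisms the report describes; this fragment only shows the data-reduction side.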

The report highlights that "the rights [set out in Article 8 of the Charter of Fundamental Rights of the European Union which deals with an individual’s rights to the protection of personal data] form the basis of the legal framework in which PETs are deployed" and should have at their core the objective of transparency, proportionality and data minimisation.

The report explains how it is difficult to quantify the wider economic benefits of a data controller using PETs to protect an individual’s personal data, and how the evidence has shown that the benefits can only be assessed on a case-by-case basis.  If anything, the study found little evidence to show that the demand by individuals for greater privacy is driving PETs deployment, and suggests that this is in part due to “the uncertainties surrounding the risk of disclosure of personal data, a lack of knowledge about PETs, and behavioural biases that prevent individuals from acting in accordance with their stated preference for greater privacy”.

The fact of the matter is, as the report makes very clear, that data controllers can derive a variety of benefits from holding and using personal data (including the personalisation of goods and services, data mining, etc.), and to the extent that PETs limit the ability of data controllers to use personal data, this will clearly act as a disincentive to the adoption of PETs. The report highlights that, “data controllers often favour mere data protection to protect themselves against the adverse consequences of data loss over data minimisation or consent mechanisms which can impede the use of personal data”. Evidence considered in the study suggests that there is a role for the public sector in helping data controllers realise the benefits of PETs, such as “official endorsements of PETs, including through pioneering deployment and official certification schemes, and direct support for the development of PETs, through subsidies to researchers (e.g. the European Framework Programmes)".

As the heat in data privacy issues continues to rise, with increased powers of regulatory authorities, tougher sanctions being imposed and a greater emphasis in Europe’s legislation on security management, it is clear that privacy by design will be the most effective method of compliance.

Consumer Privacy Issues Abound in the Dodd-Frank Wall Street Reform and Consumer Protection Act

This post was written by Chris Cwalina, Mark Melodia and Amy Mushahwar.

With President Obama scheduled to sign the Dodd-Frank Wall Street Reform and Consumer Protection Act this week, the financial services industry faces a rapidly changing regulatory environment.  While a great deal of attention has been paid to the significant restructuring of the financial services regulatory regime, little focus has been placed on the proposed changes to the oversight of consumer privacy issues, data security and data stewardship. These issues may not only affect banks, but all types of businesses servicing the financial industry as well.

To view the entire alert, please click here.