Final Rule on Human Trafficking Prevention Set to Take Effect and Require Additional Compliance Actions for Certain Contractors

This post was written by Lorraine M. Campos and Nkechi A. Kanu.

Starting the first week of March, federal contractors will be subject to new prohibitions and obligations related to human trafficking. These new requirements stem from a final rule issued January 29, 2015, that significantly strengthens the Federal Acquisition Regulation’s (FAR) existing provisions to prevent trafficking in persons. The new rule goes into effect March 2, 2015, and will apply to all new contracts, as well as to future orders under existing indefinite delivery, indefinite quantity (IDIQ) contracts.


The final rule prohibits government contractors and their employees from:

  • Engaging in severe forms of human trafficking during the period of performance of the contract
  • Procuring commercial sex acts during the period of performance of the contract
  • Using forced labor in the performance of the contract
  • Destroying, concealing, confiscating or otherwise denying employees’ access to identity or immigration documents
  • Engaging in fraudulent or misleading recruitment practices
  • Employing recruiters who violate the labor laws of the country where the recruitment takes place
  • Charging recruiting fees
  • Failing to provide return transportation to an employee who is not a national of the country where the work is to take place, subject to limited exceptions
  • Providing housing, if required, that fails to meet host country safety or housing laws, or
  • Failing to provide a written work document, if required

Contractor Obligations

The new rule also requires government contractors to undertake the following actions to combat trafficking in persons:

  • Inform employees and agents of the human trafficking requirements and the possible repercussions if violations occur
  • Disclose information to the relevant contracting officers concerning human trafficking offenses
  • Cooperate in investigations into human trafficking offenses
  • Protect victims and witnesses of human trafficking offenses

Compliance and Certification

A key component of the new rule is the requirement that certain government contractors implement a compliance plan and certification requirements. If any part of a government contract with an estimated value of more than $500,000 is for supplies acquired overseas or services to be performed overseas, the offeror must certify that: (1) it has implemented an anti-human trafficking compliance plan and procedures to prevent human trafficking violations; and (2) it has performed due diligence on its agents and subcontractors, and taken necessary remedial actions.

Government contractors should review their compliance plans to ensure that they have adequate mechanisms in place to comply with the new rule’s more robust requirements, and to monitor and evaluate the extent to which their suppliers are in compliance with the new requirements. A violation of these regulations may result in suspension of contract payments, termination of a contract, or suspension or debarment, as well as False Claims Act and other liability.

Article 29 Working Party issues its Cookie Sweep Combined Analysis - Report

This post was written by Cynthia O’Donoghue and Katalina Bateman.

On 3 February, the Article 29 Data Protection Working Party published its ‘Cookie Sweep Combined Analysis – Report’. The sweep was undertaken by the WP29 in partnership with eight of the European data protection regulators, including the UK’s ICO, France’s CNIL and Spain’s AEPD, in order to assess the current steps taken by website operators to ensure compliance with Article 5(3) of Directive 2002/58/EC, as amended by 2009/136/EC. The Report details the results of their assessment of the extent of the use of cookies, the level of information provided, and a review of control mechanisms in place.

The Report examines 250 websites, selected from among the most frequently visited by individuals within each member state taking part in the sweep. Media, e-commerce and the public sector were chosen as target sectors, as these were considered by the WP29 to present the ‘greatest data protection and privacy risks to EU citizens’.

Highlights of the assessment include:

  • High numbers of cookies are being placed by websites. Media websites place an average of 50 cookies during a visitor’s first visit.
  • Expiry dates of cookies are often excessive. Three cookies in the sweep had been set with an expiry date of 31 December 9999, nearly 8,000 years in the future. Excluding such extreme outliers, the average duration was between one and two years.
  • 26% of sites examined provide no notification that cookies are being used. Of those that did provide a notification, 50% merely inform users that cookies are in use without requesting consent.
  • 16% of sites give users a granular level of control to accept a subset of cookies, with the majority of sites relying on browser settings or a link to a third-party opt-out tool.

Since publishing the Report, the WP29 has made it clear in a Press Release that the results of the sweep “will be considered at a national level for potential enforcement action”. While the UK’s ICO has already stated that it intends to write to those organisations that are still failing to provide basic cookie information on their websites before considering whether further action is required, other European regulators have yet to comment on what actions they have planned.

Enforcement Action by the FCA is on the Rise

This post was written by Eoin O'Shea, Chris Borg, and Claude Brown.

In 2014, open investigations by the UK’s Financial Conduct Authority (FCA) increased by 20 percent, a rise speculated to be the result of the FCA’s new, stricter regime compared with its supposedly less confrontational predecessor, the FSA. Click here for the Reed Smith Client Alert to learn more about what this could mean for clients in the regulated financial services industry.

South Korean Communications Commission Releases Guidelines on Data Protection for Big Data

This post was written by Cynthia O'Donoghue and Philip Thomas.

In December 2014, the Korea Communications Commission (KCC) released the “Big Data Guidelines for Data Protection” (Guidelines). Aimed at Information and Communications Service Providers (ICSPs), they are designed to prevent the misuse of “publicly available information” to create and exploit new information. The Guidelines expressly permit ICSPs to collect and use “publicly available information”, within certain parameters.

“Publicly available information” is defined as “code, letters, sounds and images” that are “lawfully disclosed”; however, the Guidelines also cover Internet log information and transaction records.

According to the Guidelines, where such information includes personal information, the data must be de-identified before it may be collected, retained, combined, analysed or sold. The Guidelines also include a number of specific measures for ICSPs to take in connection with their collection and use of such information.

These measures include a duty to disclose their big data processing activities and policies to users, and to inform users of their rights to opt out. Other provisions include a prohibition on the collection, analysis and exploitation of sensitive information, and an obligation to ensure that information collected and used remains de-identified.

Before the Guidelines were introduced, the right to collect and use such information in South Korea was widely considered to be a grey area. The KCC has therefore provided much-needed clarification in this area. The Guidelines strike a balance between protecting personal information, on the one hand, and recognising the growth of the big data industry on the other.

China's State Administration for Industry and Commerce Releases Measures Defining Consumer Personal Information

This post was written by Cynthia O'Donoghue and Zack Dong.

In January, China’s State Administration for Industry and Commerce (SAIC) released its ‘Measures on Penalties for Infringing Upon the Rights and Interests of Consumers’ (Measures) which are due to take effect March 15, 2015.

These Measures flesh out China’s Consumer Rights Protection Law (CRPL) which was amended in March 2014 and provides guidance as to how companies may collect, use and protect personal information of consumers.

The Measures helpfully define “consumer personal information” (which the amendments to the CRPL had failed to do) as “information collected by an enterprise operator during the sale of products or provision of services, that can, singly or in combination with other information, identify a consumer.”

Examples of consumer personal information provide additional clarity, such as information which refers to a consumer’s name, gender, occupation, birth date, identification card number, residential address, contact information, income and financial status, health status, and consumer status. This definition is a welcome addition in the midst of China’s patchwork of privacy rules and regulations.

Violations of the Measures may result in significant penalties. The Measures state that the SAIC and its local Administrations of Industry and Commerce may impose a fine of up to RMB 500,000 where there are no illegal earnings. Where there are illegal earnings, however, they may issue fines of up to 10 times the amount of the illegal earnings and confiscate all illegal earnings.

It is hoped that these new Measures (in combination with the CRPL) will help to repair consumer trust in Chinese companies, and protect against the improper use, disclosure and sale of consumers’ personal information in the country.

EU Art. 29 Working Party Letter on Health Data and Apps

This post was written by Cynthia O'Donoghue.

The EU Article 29 Working Party (“WP29”) has published a letter to the European Commission (“EC”) on the scope of health data in relation to lifestyle and well-being apps. The letter follows the EC’s Working Document on mHealth and the outcome of its public consultation, which generated interest in strong privacy and security tools and in strengthened enforcement of data protection.

In the letter, WP29 addresses the exceptions to processing health data for historical, statistical or scientific research, and requests that the EC ensure that any secondary processing of health data only be permitted after having obtained explicit consent from individuals.

The Annex to the letter acknowledges that determining the scope of health data is particularly complex and can have a wider interpretation depending on context, and is likely to capture apps measuring blood pressure or heart rate – exactly the types of apps that are already widely available.

The Annex makes recommendations for those gray areas where it is not always clear whether personal data is medical data, and gives examples of possible indicators to consider, such as the intended use of the data and whether, if the data is combined with other data over time, it would be possible to create a profile about the health of an individual (such as risks related to illness, weight gain or loss and the consequential health issues that may arise, or an indication of heart disease). To be considered ‘medical data’, the WP29 states, there must be a relationship between the raw data set collected through the app and the ability to determine a health aspect of a person, either from the raw data itself or when that raw data is combined with other data (irrespective of whether these conclusions are accurate).

Finally, WP29 suggests that the data protection exception relating to further processing of health data for historical, statistical and scientific purposes should be limited to research that serves high public interests, cannot otherwise be carried out or where other safeguards apply, and where individuals may opt out.

The view of the WP29 is likely to capture most of the existing apps relating to well-being, which many organizations may until now have considered to be outside the scope of the additional protections afforded to sensitive data.

Google signs UK Undertaking to Improve its Privacy Policy

This post was written by Cynthia O'Donoghue.

On 30 January 2015, Google signed an Undertaking with the Information Commissioner’s Office (ICO) to improve and amend the Privacy Policy it adopted on 1 March 2012.

Among other things, the modifications to the Privacy Policy allowed Google to combine personal data across all services and products. For example, personal data collected through YouTube could now be combined with personal data collected through Google Search.

The Undertaking requires Google to address three of the ICO’s particular concerns: (1) the lack of easily accessible information describing the ways in which service users’ personal data is processed by Google; (2) the vague descriptions of the purposes for which the personal data is processed; and (3) the insufficient explanation of technical terms to service users.

In order to address these issues, Google states in Annex 1 to the Undertaking that it will, inter alia: enhance the accessibility of its Privacy Policy to ensure that users can easily find information about its privacy practices; provide clear, unambiguous and comprehensive information regarding data processing, including an exhaustive list of the types of data processed by Google and the purposes for which data is processed; and revise its Privacy Policy to avoid indistinct language where possible.

Google has a period of two years in which to implement these changes, and it must provide a report to the ICO by August 2015, specifying the steps Google has taken in response to the commitments set out in the Undertaking.

The ICO’s measures in response to Google’s breach of national data protection laws are much lighter than those taken by other EU Member States. The data protection authorities in France (CNIL) and Spain (AEPD) have imposed fines of €150,000 and €900,000 respectively. Currently, the Dutch data protection authority is threatening Google with a €15 million fine (see our previous blog).

Trump Taj Mahal Fined Record $10 Million for Inadequate AML Program

This post was written by Kathleen Nandan and Amy Tonti.

As disclosed recently in a bankruptcy court filing, on January 27, 2015, the Financial Crimes Enforcement Network (“FinCEN”) imposed a $10 million civil money penalty pursuant to the Bank Secrecy Act (the “BSA”) on Trump Taj Mahal Associates LLC. Trump Taj Mahal consented to the imposition of the penalty (subject to the bankruptcy court’s approval) and admitted that its conduct violated the BSA. This $10 million penalty, reported to be the largest BSA penalty ever imposed upon a casino, highlights the government’s ongoing focus on the gaming industry.

The BSA requires financial institutions, which include casinos, to establish and implement policies to detect and prevent money laundering. Casinos must implement anti-money laundering (“AML”) programs that require, among other things, the creation of internal controls to ensure compliance with the BSA, independent testing of their AML programs, AML training for personnel, the designation of individuals responsible for AML compliance, and the implementation of procedures to identify suspicious transactions and determine whether records must be made or maintained under the BSA. Trump Taj Mahal admitted that it failed to implement such an AML program.

With respect to these failures, Trump Taj Mahal admitted that its violations were “willful,” as that term is used in civil enforcement of the BSA. Trump Taj Mahal did not admit that it knew that its conduct violated the BSA or that it otherwise acted with an improper motive or bad purpose. Instead, the casino acknowledged that it acted with either reckless disregard or willful blindness.

The settlement proposes to treat the $10 million penalty as an unsecured claim in the Trump Taj Mahal bankruptcy case. Under the Bankruptcy Code, certain penalties assessed by the government are to be accorded “priority,” to be paid before general unsecured creditors. Based on current law, the penalty assessed for a BSA violation does not fall within the priority treatment, and any payment of the penalty would be based on a pro-rata distribution to general unsecured creditors. Notably, a criminal penalty assessed for a violation that occurred during the pendency of a bankruptcy (unlike the civil penalty here) might be treated as having priority by certain courts, but not all courts. Courts that deny priority to a penalty do so to enhance recovery for all creditors.

Recent enforcement actions against others in the gaming industry, as well as FinCEN Director Jennifer Shasky Calvery’s recent speeches cautioning casinos not to let the entertainment component of their business color their view of their AML obligations, suggest that the industry will remain under FinCEN’s scrutiny for the near future. As Director Shasky Calvery noted in June 2014, casinos are “complex financial institutions with intricate operations that extend credit, and that conduct millions of dollars of transactions every day. They cater to millions of customers with their bets, markers, and redemptions. And casinos must continue their progress in thinking more like other financial institutions to identify AML risks.”

New Authorizations to Export Personal Communications Items and Services to Sudan

This post was written by Michael A. Grant, Leigh T. Hansson, and Michael J. Lowell.

On February 18, 2015, the Commerce Department’s Bureau of Industry and Security (“BIS”) and the Treasury Department’s Office of Foreign Assets Control (“OFAC”) published changes to the Export Administration Regulations (“EAR”) and the Sudanese Sanctions Regulations (“SSR”) in order to advance the free flow of information and facilitate communications by the Sudanese people. OFAC’s changes are consistent with, and nearly identical to, its personal communications general license for Iran, first issued on May 30, 2013 and revised on February 7, 2014. See our guidance related to the Iran Personal Communications License. Because BIS and OFAC maintain concurrent jurisdiction to regulate trade with Sudan, BIS has amended the EAR, including the revision of License Exception Consumer Communications Devices (“CCD”), to authorize certain exports to Sudan.

Sudanese Sanctions Regulations

OFAC’s amendments to the SSR include provisions for both U.S. and non-U.S. Persons.

  • U.S. Persons are authorized to export services to Sudan incident to the exchange of personal communications
  • U.S. and non-U.S. Persons may export certain U.S.-origin software to Sudan incident to personal communications
  • U.S. Persons may export certain foreign-origin software or software that is not subject to the EAR as defined in EAR § 734.3(b)(3)
  • U.S. and non-U.S. Persons may export certain hardware and software listed in Annex B to Sudan, including (but not limited to):
    • Mobile phones and PDAs
    • Satellite phone and Broadband Global Area Network hardware
    • Consumer modems, network interface cards, radio equipment, routers
    • Laptops, tablets and personal computing devices
    • Anti-virus and anti-malware software
    • Anti-tracking software
    • Mobile operating systems and applications for mobile devices
  • U.S. Persons may provide certain consumer Internet connectivity services to Sudan
  • Authorized hardware and software may be imported into the United States
  • U.S. Persons may export certain free of cost services and software to the Government of Sudan

Export Administration Regulations

In consultation with OFAC and the State Department, BIS implemented changes to the EAR consistent with the new authorizations issued by OFAC. BIS has expanded License Exception CCD and made minor technical changes to ensure authorization of certain personal communication devices to Sudan.

Consumer Communications Devices

On January 15, 2015, BIS amended License Exception CCD to authorize the export of certain items to Cuba. The new change by BIS adds Sudan as an authorized destination for the license exception. In addition, License Exception CCD now includes authorization to export certain Global Positioning System receivers and similar satellite receivers to Sudan only. The items authorized under CCD for Sudan overlap with OFAC’s new authorizations and include (but are not limited to):

  • Computers
  • Modems
  • Network access controllers and communications channel controllers
  • Mobile phones
  • Satellite telephones
  • Personal digital assistants
  • Consumer software to be used for equipment described in this list

Reexport of Items Subject to the EAR

BIS amended Section 742.10 of the EAR so that a license is no longer required for non-U.S. Persons to reexport specified U.S.-origin items to Sudan. The items that no longer require a license for reexport are those classified under the following ECCNs: 2A994, 3A992.a, 5A991.g, 5A992, 5D992.b or .c, 6A991, 6A998, 7A994, 8A992.d, .e, .f, and .g, 9A990.a and .b, and 9A991.d and .e. In addition, these items are not included in the de minimis calculation for Sudan.

Additional Considerations

The new authorization permits U.S. Persons to provide certain free-of-charge services and items to the Government of Sudan. However, the Government of Sudan continues to be a prohibited end user for the direct or indirect export of fee-based services and hardware under both OFAC and BIS regulations. Both agencies have modified their specific licensing review policy from a policy of denial to a case-by-case assessment for personal communication services, software or hardware (OFAC) and for medical and telecommunications equipment (BIS).

New Data Protection Laws in Africa

This post was written by Cynthia O'Donoghue.

In recent years, the number of African countries which have enacted privacy frameworks or are planning data protection laws has vastly increased.

Currently, 14 African countries have privacy framework laws and some form of data protection authority in place. Once the African Union Convention on Cyber Security and Personal Data Protection (Convention) is ratified across the continent, many other nations will likely enact personal data protection laws.

Currently, seven African countries have data protection bills pending: Kenya, Madagascar, Mali, Niger, Nigeria, Tanzania, and Uganda. Many analysts believe that the Convention seeks to replicate the European Union data protection model, whereby each country has its own national data protection laws and authority.

Despite these developments, the Convention still leaves many important areas without guidance. For instance, the Convention fails to define what is meant by “consent” and “personal data”, or the “legitimate grounds” individuals can raise to object to the processing of their information.

The international human rights advocacy group, Access, welcomes these changes, but stresses that “change won’t happen overnight”, and that “it will likely be a few years” before countries enact laws to implement the Convention.

FAA Takes One Small Step Toward Legalizing Commercial Use of Small Unmanned Aircraft Systems, a.k.a. Drones

This post was written by Patrick Bradley, Mark Melodia and Paul Bond.

The Federal Aviation Administration (FAA) has long been studying the promise and perils of small unmanned aircraft systems (“UAS”), a.k.a. drones. The commercial potential of UAS technology is clear. Businesses are eager to use UAS for everything from covering traffic accidents to taking real estate and wedding photos to delivering small parcels. However, the FAA currently prohibits any commercial or business use of UAS unless the operator obtains specific permission from the FAA. Permission is granted only on a case-by-case basis, greatly restricting businesses from adopting UAS.

This framework remains in place, but the FAA has now issued a Notice of Proposed Rulemaking (NPRM). If adopted, the proposed rules would provide some rules of the sky for UAS and real regulatory relief to businesses. However, estimates are that adoption of even these first-step rules may be as far as two years out.

The FAA’s proposed rule would set forth several requirements as to operator certification, airworthiness, registration, and operation. As to operation, potentially significant restrictions include a requirement that UAS only operate in the daylight at or below 500 feet above the ground; that the operator maintain a line of sight with the UAS during operation; and that an operator only operate one UAS at a time. Drones would not be allowed to fly over any people not directly involved with the operation of the drone.

Currently, prospective commercial drone operators are required to hold at least a private pilot certificate. That would change under the new rules. Commercial UAS operators would need to pass an initial FAA knowledge test and biennial knowledge exams thereafter. Transportation Security Administration approval would also be required under the rules. Commercial drone operators would not be required to undergo an FAA medical exam.

The FAA’s proposed rules do not call for the imposition of airworthiness requirements on drones, but drones would be required to be registered with the FAA and would carry N numbers like other aircraft. The pilot would need to conduct a preflight inspection before every flight, and accidents would have to be reported.

The proposed rule would not apply to:

“(1) air carrier operations; (2) external load and towing operations; (3) international operations; (4) foreign-owned aircraft that are ineligible to be registered in the United States; (5) public aircraft; (6) certain model aircraft; and (7) moored balloons, kites, amateur rockets, and unmanned free balloons.”

As to privacy, the FAA notes:

“The FAA also notes that privacy concerns have been raised about unmanned aircraft operations. Although these issues are beyond the scope of this rulemaking… the Department and FAA will participate in the multi-stakeholder engagement process led by the National Telecommunications and Information Administration (NTIA) to assist in this process regarding privacy, accountability, and transparency issues concerning commercial and private UAS use in the NAS. We also note that state law and other legal protections for individual privacy may provide recourse for a person whose privacy may be affected through another person’s use of a UAS.” At the same time that the FAA released this NPRM, the White House issued a Presidential Memorandum to all federal agencies, setting forth administration priorities for the NTIA process and all agency rulemaking.

Comments on the FAA’s NPRM will be open for 60 days after it is published in the Federal Register.

Ofcom Publishes Plan To Support the Internet of Things

This post was written by Cynthia O'Donoghue and Angus Finnegan.

In January, Ofcom, the UK telecommunications regulator, published its Statement on ‘Promoting investment and innovation in the Internet of Things’ (Statement). The Statement acknowledges that the Internet of Things (IoT) has the potential to deliver significant benefits to citizens and consumers. In light of this, Ofcom sought views from its stakeholders on what role Ofcom might play to support the growth and innovation of the IoT.

The Statement identifies four priority areas to help support the growth of the IoT. These include: data privacy, network security and resilience, spectrum availability, and network addresses.

Ofcom identifies data privacy as the ‘greatest single barrier to the development of the IoT’. Respondents were concerned about issues such as citizens’ and consumers’ lack of trust in sharing personal data.

To address such issues, the Statement proposes the implementation of a common framework to allow consumers to easily and transparently authorise the conditions under which data collected by their devices is used and shared with others. The Statement recommends industry-led approaches to keeping consumers in control, agreed internationally where possible.

In order to foster innovation and facilitate progress on these issues at both a national and international level, Ofcom proposes to work closely with government, the Information Commissioner’s Office, other regulators, and industry.

This Statement follows the EU Article 29 Working Party’s Opinion on ‘Recent Developments on the Internet of Things’ which we reported on in January 2015.

German Data Protection Commissioners Take Action Against Safe Harbor

This post was written by Cynthia O’Donoghue, Thomas Fischl & Katharina Weimer.

At the Data Protection Conference in Berlin, the Berlin and Hamburg Data Protection Commissioners (Commissioners) made a number of important announcements regarding the ‘inadequacy’ of the EU/U.S. Safe Harbor Program.

Both Dr. Alexander Dix and Prof. Johannes Caspar, Commissioners for Berlin and Hamburg respectively, asserted that U.S. companies do not protect data to the same level as EU companies do, even when U.S. companies certify that they will adhere to the Safe Harbor provisions. In addition, the Data Protection Authorities (DPAs) stated that there may be inadequate enforcement of the Safe Harbor Program by the Federal Trade Commission. Speaking on behalf of his colleagues from 16 German states, Dr. Dix went as far as to say that:

“The Safe Harbor agreement is practically dead, unless some limits are being placed to the excessive surveillance by intelligence agencies”.

Dr. Dix further announced that the German DPAs in Berlin and Bremen have initiated administrative proceedings against two U.S. companies that base their data transfers on the EU/U.S. Safe Harbor Program. In these proceedings, the German DPAs have expressed their intention to stop data transfers for a limited time. Some commentators have suggested that an actual suspension of data transfers may potentially lead to a court decision, which could deny the supervisory authorities’ competence to suspend data transfers.

Other speakers, such as Paul Nemitz, Director for fundamental rights and union citizenship at the Directorate-General Justice of the European Commission, stressed that “there is an economic incentive to make Safe Harbor work”. However, in order for trans-Atlantic businesses to flourish, organisations need to be more transparent.

In light of these developments, global organisations may wish to consider alternative approaches to the Safe Harbor Program, such as EU Model Clauses, for data transfers from European jurisdictions, such as from Germany to the United States.

Senators Trying to Hit the Brakes on Smart Cars, Citing Privacy and Security Concerns

This post was written by Mark Melodia and Frederick Lah.

On February 11, Sens. Ed Markey (D-Mass.) and Richard Blumenthal (D-Conn.) announced that they would introduce legislation intended to address the data privacy and security vulnerabilities with Internet-connected cars. The legislation, if passed, would require manufacturers to adhere to a number of security and privacy standards, including the following:

  • Requirement that all wireless access points in the car are protected against hacking attacks, evaluated using penetration testing
  • Requirement that all collected information is appropriately secured and encrypted to prevent unwanted access
  • Requirement that the manufacturer or third-party feature provider be able to detect, report and respond to real-time hacking events
  • Transparency requirement that drivers are made explicitly aware of data collection, transmission, and use of driving information
  • Requirement that consumers can choose whether data is collected, without having to disable navigation
  • Prohibition on the use of personal driving information for advertising or marketing purposes

The legislative proposal served as a follow-up to an earlier report by Sen. Markey, “Tracking & Hacking: Security & Privacy Gaps Put American Drivers at Risk.” That report was based on the responses from 16 major automobile manufacturers to questions posed by the senator about how driver information is collected and used, and the potential security risks with wireless technologies in cars. The report found that large amounts of personal driver information – including geographic location, destinations entered into a navigation system, parking history locations, and vehicle speed – are collected without the drivers being clearly informed as to how that information will be used. In most cases, the information is shared with third-party data centers, the report said. Further, the report found that nearly 100 percent of cars on the market include wireless technologies that could pose vulnerabilities to hacking intrusions, and that most manufacturers were unaware or unable to report on past hacking incidents.

In addition to Sen. Markey’s report, the FTC highlighted the potential security and privacy risks with connected cars in its recent Internet of Things Staff Report, which we previously covered here. While acknowledging the many safety and convenience benefits of smart cars, the FTC also shared Sen. Markey’s concern about their potential vulnerabilities.

In response, the industry, led by two major automobile coalitions, has adopted self-regulatory privacy principles. In November 2014, 19 U.S. car companies made a commitment to incorporate a series of self-regulatory “Consumer Privacy Protection Principles for Vehicle Services” in their vehicles no later than model year 2017. In a letter sent to the FTC, the participating manufacturers said the “privacy principles” would be applied to their vehicles’ technologies and services, such as roadside assistance and navigation services, and would provide a baseline for privacy commitments. The principles include provisions for transparency, choice, respect for context, data minimization, de-identification, data security, integrity, access, and accountability. Sen. Markey said in a statement that these self-regulatory principles were a good first step, but that they did not go far enough in terms of choice and transparency.

As more and more cars join the Internet of Things, legislators and regulators will continue to scrutinize their potential privacy and security risks. Any road forward – whether legislative or self-regulatory – must carefully balance the many benefits offered by smart cars with their potential risks. In the meantime, car manufacturers (and their third-party service and technology providers) should continue to monitor this area for legislative developments and start taking steps to implement the self-regulatory principles.

Courts Continue To Find That Unique Device Identifiers Are Not Personally Identifiable Information (PII) Under The Video Privacy Protection Act (VPPA)

This post was written by Lisa Kim and Alan Drosdick.

Two recent federal district court rulings regarding the Video Privacy Protection Act (VPPA) follow the emerging trend of decisions indicating that courts are reluctant to find violations of the VPPA for sharing anonymous identification markers with third parties (see May 5, 2014 blog post; June 20, 2014 blog post).

On January 20, 2015, a district court judge in New Jersey dismissed with prejudice a VPPA action against Viacom, Inc. (“Viacom”), holding that disclosure of anonymous user information to Google, Inc. (“Google”) was not actionable because such information did not constitute “personally identifiable information” (“PII”) as defined under the VPPA.

In In re Nickelodeon Consumer Privacy Litigation, plaintiffs alleged that Viacom operated websites for children, and encouraged users to create personal profiles on them. Viacom then assigned each user a code name, and collected certain information about each user, including gender and birthday. Viacom also placed cookies on a user’s computer, and allowed Google to place similar cookies, that collected further information, such as IP address, device and browser settings, and web traffic. On these websites, users were able to stream videos and play games, and a record was created of the name of each video each user played. Plaintiffs alleged that Viacom shared this information with Google, and both Viacom and Google used the information to target advertising at the user. Plaintiffs claimed that this practice of sharing information without users’ consent violates the VPPA.

The court found that nothing in the VPPA or its legislative history suggested that PII included anonymous user IDs, gender and age, or data about a user’s computer. Plaintiffs argued that Google, because it already had so much general information at its disposal, could use the information garnered from Viacom to ascertain personal identities. The court disagreed, confirming that PII is information which must, without more, itself link an actual person to actual video materials. Because the user information Viacom disclosed was not PII, no violation of the VPPA occurred, and the court dismissed the claim with prejudice.

Similarly, on January 23, 2015, a district court judge in Georgia dismissed with prejudice a VPPA action against Dow Jones & Company, Inc. (“Dow Jones”), holding that the disclosure of the plaintiff’s Roku device serial number was not actionable because the Roku device serial number did not qualify as PII.

In Locklear v. Dow Jones & Company, Inc., the plaintiff alleged that she downloaded and began using the Wall Street Journal Live Channel (“WSJ Channel”), offered by Dow Jones, on her Roku device. Each time plaintiff viewed a video clip using the WSJ Channel, Dow Jones disclosed, without her consent, her anonymous Roku device serial number and video viewing history to mDialog, a third-party analytics and advertising company. mDialog, using demographic data from Roku and other such entities, was able to identify plaintiff and attribute her video records to her. Plaintiff alleged that this practice violated the VPPA.

The court dismissed the case with prejudice, finding that the Roku device serial number did not qualify as PII. Declaring the fact pattern indistinguishable from that presented in Ellis v. Cartoon Network, Inc. (see October 13, 2014 blog post), the court again defined PII as information which must, without more, itself link an actual person to actual video materials. Because mDialog had to take further steps, by turning to other sources beyond Dow Jones, to identify the user, Dow Jones’s disclosure of plaintiff’s anonymous Roku device serial number did not constitute a violation of the VPPA.

These rulings continue to demonstrate that courts are unwilling to enlarge the scope of the VPPA to cover the sharing of anonymous identification numbers or code names alone. Nevertheless, companies utilizing unique device identifiers in connection with video materials should use caution in deciding what information they share with third parties.