FTC Workshop on Big Data: Focus on Data Brokers

This post was written by Divonne Smoyer and Christine N. Czuprynski.

On September 15, the Federal Trade Commission held a workshop entitled “Big Data: A Tool for Inclusion or Exclusion?” FTC Commissioner Julie Brill took the opportunity to discuss an industry that she has consistently maintained requires more regulation and scrutiny: data brokers.

Commissioner Brill stressed first that the FTC is very focused on entities regulated by the Fair Credit Reporting Act (FCRA), and reminded the audience that those entities will be held to the law by the agency. Those entities that are not subject to the FCRA are not off the hook: companies that engage in profiling, or “alternative scoring,” should take a very critical look at what they are doing, since alternative scoring has the potential to limit an individual’s access to credit, insurance, and job opportunities. Brill noted that the FTC’s May 2014 report focused on transparency, and called for legislation to make data brokers accountable – thoughts she echoed during Monday’s workshop.

Finally, Commissioner Brill stressed that all companies would be well-advised to examine whether their own big data systems cause problems that ultimately exacerbate existing socioeconomic disparities. She reiterated that companies should use their systems for good, and have a role in spotting and rooting out discrimination and differential impact. You can find the text of her full speech here.

FTC Commissioner Brill Urges State AGs to Up the Ante

This post was written by Divonne Smoyer and Christine Czuprynski.

Businesses that think they know what privacy issues are on the minds of the state attorneys general (AGs) should be aware that AGs are being urged to take action, either on their own, or in concert with the FTC, on key cutting-edge privacy issues. At a major meeting of state AGs this week at the Conference of Western Attorneys General, FTC Commissioner Julie Brill, one of the highlighted speakers at the event, emphasized the importance of the AGs’ role in privacy regulation, and encouraged AGs to collaborate and cooperate on privacy investigations consistent with FTC efforts.

Commissioner Brill, a former assistant AG in two influential state attorney general offices, Vermont and North Carolina, outlined for the AGs several high-level privacy priorities for the FTC, including: (1) user-generated health information; (2) the Internet of Things; and, (3) mobile payments and mobile security. She invited the states to follow these and other privacy issues, and to complement the FTC’s actions in these areas in appropriate ways.

Also a focus: the Commission’s “Big Data” data broker report. Commissioner Brill emphasized her concerns about data broker practices, including their use of terms to describe and categorize individuals, such as “Urban Scramble,” “Mobile Mixers,” “Rural Everlasting,” and “Married Sophisticates.” She stressed that the information gathered by data brokers about these groups may allow businesses to make inferences about people, which in turn could impact access to credit, and in other ways. She pointed out that the FTC unanimously called for legislation to increase transparency and provide consumers with meaningful choices about how their data is used.

Building on her comments about data brokers, Commissioner Brill voiced concerns about the United States’ sectoral approach to privacy law and stressed the need for gap-filling in areas outside those sector-specific laws. With Congress focused elsewhere on privacy issues, she suggested, state action may be the best option to take on these issues and fill the gaps. This is not the first time Commissioner Brill has called on the states to take decisive action, and it won’t be the last.

Finally, Commissioner Brill addressed the FTC’s case against Wyndham in particular, noting that the FTC is aggressively fighting challenges to its Section 5 authority. She reminded the states that they have an interest in this fight given that state UDAP statutes share a common blueprint as so-called “mini-FTC Acts,” and invited collaboration on future challenges.

It is likely that many of the states will take action consistent with Commissioner Brill's urging.

FTC Settlement with Snapchat - What Happens on Snapchat Stays on Snapchat?

Last Thursday, the Federal Trade Commission (FTC) announced that messaging app Snapchat agreed to settle charges that it deceived consumers with promises about the disappearing nature of messages sent through the app. The FTC case also alleged that the company deceived consumers over the amount of personal data the app collected, and the security measures taken to protect that data from misuse and unauthorized disclosure. The case alleged that Snapchat’s failure to secure its Find Friends feature resulted in a security breach that enabled attackers to compile a database of 4.6 million Snapchat usernames and phone numbers.

Click here to read the full post on our sister blog AdLaw By Request.

Update on Federal Trade Commission v. Wyndham Worldwide Corp.: FTC Allowed To Proceed with Data Security Suit as Court Rejects Fundamental Challenge to FTC Authority

This post was written by Paul Bond and Christine N. Czuprynski.

A New Jersey federal court is allowing the FTC’s case against Wyndham Worldwide Corporation to go forward, denying Wyndham’s Motion to Dismiss on both the unfairness and deception counts.  In this closely watched case, the court emphasized that in denying Wyndham’s request for dismissal, it was not providing the FTC with a “blank check to sustain a lawsuit against every business that has been hacked.”  The far-reaching implications of this decision, though, cannot be ignored.

The Wyndham decision may well prove rocket fuel to an agency already proceeding at break-neck speed to formulate and enforce (often at the same time) new data security law.  Any company that was still waiting for the FTC to go through a formal rulemaking process on data security can wait no more.  The decision by Judge Salas has arguably ratified the reams of informal guidance the FTC has provided over the past decade-plus in enforcement actions, panel discussions, white papers, and more, as though it had gone through the formal notice-and-comment rulemaking process.  Unless a company is confident that it knows, has synthesized, and has applied this informal guidance to its own activities, it stands at risk of being the next target for the FTC's newly affirmed Section 5 authority. 

The Federal Trade Commission sued Wyndham Worldwide in June 2012 in the District of Arizona.  The FTC alleged that Wyndham’s failure to properly safeguard the personal information in its possession led to a data security breach that exposed thousands of customers to identity theft and other fraud. The case was transferred to the District of New Jersey in March 2013.  Soon thereafter, Wyndham filed its Motion to Dismiss.

Wyndham challenged the FTC’s authority to regulate unfairness in the data security context.  Wyndham further argued that the FTC could not bring unfairness claims unless and until it had promulgated regulations on the issue.  U.S. District Judge Esther Salas rejected both of these challenges, as well as Wyndham’s third challenge, that the FTC failed to sufficiently plead both its unfairness and deception claims.

Wyndham argued that section 5 of the FTC Act does not confer unfairness authority that covers data security.  Wyndham contrasted section 5 of the FTC Act to the Fair Credit Reporting Act (FCRA), the Gramm-Leach-Bliley Act (GLBA), and the Children’s Online Privacy Protection Act (COPPA), all of which include specific authority for the FTC to regulate data security in certain contexts.  Wyndham argued that those statutes, which were enacted after the FTC Act, would be superfluous if the FTC had the general data security authority it seeks to wield in this case.  The court disagreed and ruled that the FTC’s general authority over data security can coexist with more specified authority in the FCRA, GLBA, and COPPA.

Wyndham also argued that the FTC had not provided fair notice of what data-security practices a business had to implement in order to comply with the FTC Act. In rejecting that argument, the court held that the FTC was not required to engage in rulemaking before enforcing Section 5 in data-security cases, but could instead develop the law on a case-by-case basis. The court also found that fair notice was provided through the FTC’s public complaints, consent agreements, public statements, and business guidance brochures. As such, the FTC was not required to also promulgate formal regulations. In addition, the court found that the FTC had pled with enough particularity to satisfy the heightened requirements of Rule 9(b), even though it was not persuaded that this action fell under that rule.

With respect to the deception claim, the ruling also touched on the respective liability of franchisors and franchisees, an issue we’ve written about recently.  Wyndham sought to exclude Wyndham-branded hotels from the case on the grounds that Wyndham Hotels and Resorts is a legally separate entity from Wyndham-branded hotels.  It therefore argued that statements in the Hotels and Resorts website privacy policy could not form the basis for deception claims where personal information was accessed from Wyndham-branded hotels.  The court reviewed the language of the privacy policy and determined that a reasonable person could conclude that the privacy policy on the Hotels and Resorts website made statements about data security at both the Hotels and Resorts and the Wyndham-branded properties.  Despite the court’s claims of not providing the FTC with carte blanche to pursue companies that fall victim to hackers, the court’s ruling makes clear that when companies experience data breaches, they – and their franchisees – are now more, not less, likely to face the possibility of enforcement action by the FTC.

UK Information Commissioner's Office and U.S. Federal Trade Commission sign Memorandum of Understanding

This post was written by Cynthia O'Donoghue.

At the beginning of March, the UK Information Commissioner’s Office (ICO) signed a memorandum of understanding (MOU) with the U.S. Federal Trade Commission (FTC) at the IAPP Global Privacy Summit. The memorandum is aimed at increasing cooperation between the agencies, with UK Information Commissioner Graham stating that the arrangement would be “to the benefit of people in the United States and the United Kingdom.”

Whilst the MOU does not create legally binding obligations between the two agencies, it sets out terms for cooperation during investigations and enforcement activities. The FTC and ICO will cooperate on serious violations. The methods for cooperation include:

  • Sharing information, including complaints and personal information
  • A mutual undertaking to provide investigative assistance to the other agency through use of legal powers
  • Coordinating enforcement powers when dealing with cross-border activities arising from an investigation of a breach of either country’s law, where the matter being investigated is the same or substantially similar to practices prohibited by the other country

Measures to encourage cooperation between national regulators have been introduced by several international organisations. For example, in 2010, the Asia-Pacific Economic Cooperation (of which the United States is a member) launched a Cross-border Data Privacy Initiative, recognising that “trusted flows of information are essential to doing business in the global economy.”

The MOU is a joint acknowledgment by the FTC and ICO that consumer protection and data protection require close collaboration, and it serves as a warning to organisations that the agencies will be proactive in carrying out investigations of serious violations of consumer protection and data protection laws.

The Federal Trade Commission and Irish Data Protection Commissioner sign a memorandum of understanding

This post was written by Cynthia O'Donoghue.

In June 2013, the Federal Trade Commission (FTC) and Ireland's Office of the Data Protection Commissioner signed a memorandum of understanding establishing a mutual assistance and information exchange program to secure compliance with data protection and privacy laws on both sides of the Atlantic.

The privacy and data protection laws of Ireland and the United States differ significantly; however, the two agencies recognise that the global economy and the resultant increase in cross-border flows of personal information merit close cooperation. The U.S. privacy framework is based on a number of legislative acts that, in the main, apply to a specific sector or type of data, such as consumer data or health data, while Ireland’s Data Protection Acts of 1988 and 2003, which implement the EU Data Protection Directive (95/46/EC), apply to the processing of any personal data.

The MOU sets out broad objectives to ensure cooperation over the enforcement of privacy laws and to facilitate research and education in the area of data protection, including through the exchange of knowledge and expertise.

The FTC and the Irish data protection authority have agreed to use their best efforts to:

  • Share information, including complaints they receive
  • Provide each other with investigative assistance
  • Exchange data protection related information, including for purposes of consumer and business education
  • Explore opportunities for staff exchanges and joint training programs
  • Coordinate enforcement against cross-border violations
  • Regularly discuss continuing and prospective opportunities for cooperation

The memorandum also specifies the procedures and rules applying to requests for assistance. Such requests should be made only when they do not impose an excessive burden on the other agency. Any shared information, the existence of the investigations, and any requests made, are to be treated by the agencies as confidential.

FTC Brings Its First Enforcement Action for 'Internet of Things'

This post was written by John P. Feldman and Frederick Lah.

Earlier this year, we wrote about the FTC’s plan to hold a November 2013 public workshop over concerns with the “Internet of Things,” the dramatically growing capacity of smart devices to communicate information through the Internet. In advance of the workshop, the FTC has entered into a consent decree with a marketer of Internet-connected video cameras, marking the Commission’s first foray into the Internet of Things.

The marketer in this case was a provider of home security video cameras that allowed consumers to monitor their homes remotely. According to the complaint, a hacker exploited a security flaw in the marketer’s system and posted live feeds to approximately 700 home cameras, displaying babies asleep in their cribs, young children playing, and adults going about their daily lives. While the marketer did alert customers of the security flaw and offered them a security patch, the FTC alleged that the marketer had failed to use reasonable security to design and test its software, including a setting for the cameras’ password requirement. The FTC also alleged that the marketer had transmitted user login credentials in clear, readable text over the Internet, even though free software was available to secure such transmissions.

Under the terms of its settlement, the marketer is prohibited from misrepresenting the security of its cameras and the information that its cameras transmit. The marketer is also prohibited from misrepresenting the extent to which a consumer can control the security of information captured by the cameras. The FTC voted 4-0 to accept a consent agreement and the proposed order. The agreement will be subject to public comment for 30 days through October 4, 2013, after which the FTC will decide whether to make the proposed settlement final. Comments may be submitted electronically here.

This case is an important one for all companies that offer products connected to the Internet, whether they’re offering home appliances, automobiles, or even products with “smart” labels. The FTC relied upon a “reasonable” standard in bringing this action, which can always be a tricky one for companies to interpret. As a baseline, companies need to follow industry security standards and implement protections that are commensurate with the type of data they collect and transmit. As a best practice, companies should take a Privacy by Design approach and consider privacy and security as early as possible during product development. Still, no system can ever be 100 percent secure from malicious hackers, even if the company has taken extensive measures to protect its data assets; and just because a company has been the victim of a malicious hack does not by itself prove that the company was not acting reasonably.

The FTC’s workshop on the privacy and security of the Internet of Things will be held November 19, 2013. According to the FTC’s website, the workshop will address issues related to the increasingly prevalent ability of everyday devices to communicate with each other and with people.

Commissioner Brill Suggests 'Reclaim Your Name' Initiative for Consumer Control Over Big Data: FTC Signals Intent To Continue Debate Over Safeguards for Predictive Analytics

This post was written by Paul Bond and Christine Nielsen.  

While the national conversation on data collection in the United States has been dominated recently by issues of national security, the FTC remains determined to push consumer privacy. In her keynote address at the recent Computers, Freedom, and Privacy Conference, Commissioner Julie Brill reaffirmed the Commission’s commitment to address so-called data brokers in a systematic way. The speech, entitled “Reclaim Your Name,” is something of a state-of-the-union on big data matters. Aside from the admittedly catchy name, nothing in this proposal is totally new. However, in terms of tone and timing, this speech suggests that the FTC is looking to retake the privacy discussion and refocus the national debate on commercial data collection and use.

After a passing reference to the Snowden leaks and the tradeoff inherent in the use of consumer data, Commissioner Brill drills down on several substantive points. These include her judgment that an expanded use of the Fair Credit Reporting Act may be needed. She notes, on page 5 of this speech, that “our big data world strains the seams of the FCRA,” and calls for the application of the Act to new situations and new types of information products. For example, in Commissioner Brill’s view, on p. 4, even e-scores used to guide online behavioral marketing may need to eventually be put under the FCRA’s framework. After all, she asks, “What happens if lenders and other financial service providers do away with their phone banks and storefronts and market their loans and other financial products largely or entirely online?” p. 4.

Commissioner Brill also repeats her long-standing call for additional transparency among data brokers. Per Commissioner Brill, “Since most consumers have no way of knowing who these data brokers are, let alone finding the tools the companies provide, the reality is that current access and correction rights provide only the illusion of transparency.” p. 5.

The Commissioner’s third point involves notice and choice. Commissioner Brill used an example she’s cited frequently – that of Target predicting customers’ pregnancy status based on buying behaviors, and marketing pregnancy and baby products accordingly – to suggest that some types of predictive analytics will never be appropriate in certain contexts, regardless of notice. She notes:

There is nothing in the context of a retail purchase that implies notice and consent – nothing that reasonably informs the consumer her data might be collected to make predictions about sensitive health conditions or seeks her consent to do so. And if the store were to try to make the notice and consent explicit? Imagine walking into Target and reading a sign on the wall or a disclosure on a receipt that says: “We will analyze your purchases to predict what health conditions you have so that we can provide you with discounts and coupons you may want.” That clear statement would surprise – and alarm – most of us.

p. 7.

Lastly, Commissioner Brill reaffirmed the FTC’s suspicion regarding deidentification: “Because much of big data is created through predictive analysis, and because much of the analytics are for the purpose of gaining insights into specific individuals, chunks of big data will always be, by their very nature, identifiable or linkable to individuals.” p. 9.

As far as solutions, the Commissioner called for increased privacy by design, and voiced support for “the creation of ‘algorithmists’ – licensed professionals with ethical responsibilities for an organization’s appropriate handling of consumer data.” p. 9. Commissioner Brill also said she would welcome legislation on the issue. But lastly, she suggests a voluntary industry initiative:

I would suggest we need a comprehensive initiative – one I am calling “Reclaim Your Name.” Reclaim Your Name would give consumers the knowledge and the technological tools to reassert some control over their personal data – to be the ones to decide how much to share, with whom, and for what purpose – to reclaim their names.

Reclaim Your Name would empower the consumer to find out how brokers are collecting and using data; give her access to information that data brokers have amassed about her; allow her to opt-out if she learns a data broker is selling her information for marketing purposes; and provide her the opportunity to correct errors in information used for substantive decisions – like credit, insurance, employment, and other benefits. Over a year ago, I called on the data broker industry to develop a user-friendly, one-stop online shop to achieve these goals.

p. 10.

While many portions of Commissioner Brill’s recommendations are wholly aspirational, this conversation should be carefully considered by any company developing policies and procedures around predictive analytics and big data.

Supreme Court Remands Pay-for-Delay Settlement for Antitrust Review in FTC v. Actavis

This post was written by P. Gavin Eastgate, Jeremy Feinstein, William Sheridan and Jessica Rose.

The U.S. Supreme Court last week issued a significant decision subjecting pay-for-delay settlements, a common practice in the pharmaceutical industry, to antitrust review. Also known as reverse payments, these settlements typically involve payments from a brand drug manufacturer to a generic drug manufacturer to settle patent litigation that would jeopardize the brand manufacturer’s legal, patent-protected monopoly. In FTC v. Actavis, a five-member majority of the Court held that such payments may violate the antitrust laws and should be evaluated under the rule of reason. Several courts of appeals had held that reverse payments could not be anticompetitive if they fell within the scope of the patent’s exclusionary protection – a position the dissent would have adopted. This is a significant decision in an area of law that is likely still not settled – district courts will have their hands full applying antitrust’s rule-of-reason in the patent litigation settlement context, and several Senators are pushing legislation to end reverse payments.

Click here to read the issued Client Alert.

Consumer Privacy Groups Submit Comments in Advance of FTC's 'Internet of Things' Workshop

This post was written by Paul Bond and Frederick Lah.

Refrigerators automatically doing grocery shopping for you on your drive home from work and cell phone attachments measuring glucose levels don’t necessarily seem like bad things. But with the explosion of cutting-edge smart devices and applications come mounting data privacy concerns about the so-called “Internet of Things.”

The “Internet of Things” refers to the dramatically growing capacity of devices to communicate information efficiently through the Internet. The most common example – the mobile device – has become an extension of ourselves: waking us up to start the day, staying within arm’s reach at night, and proving essential to day-to-day activities.

The FTC will hold a public workshop November 21, 2013, to address concerns over the “Internet of Things,” as an increasing amount of smart devices permeate the market. In advance of the workshop, two public interest groups – Electronic Privacy Information Center (“EPIC”) and Center for Digital Democracy (“CDD”) – have submitted comments expressing their concerns over the data privacy implications of the “Internet of Things.”

EPIC’s comments outlined its concerns about some of the most common consumer technologies that enable this connectivity, ranging from Wi-Fi to GPS tracking. EPIC highlighted its concern with consumers’ personal information and behavior patterns being improperly distributed or tracked. For example, some cars now come equipped with electronic GPS “black boxes” called Event Data Recorders (“EDR”) that collect information about velocity, direction, and seat belt use in motor vehicles, and distribute it to insurance companies, police, and other third parties. As a result, EPIC warns that drivers will soon have to become accustomed to their cars revealing highly personal information about them, such as “frequency and location of hospital trips, therapy sessions, personal visits, or even daily lunch habits.”

The CDD also submitted comments to the FTC this past weekend identifying transactions in specific areas of concern such as finances, health, ethnicity/race, and youth. For example, the CDD mentioned the danger that a patient’s “health journey” may be tracked, analyzed, and sometimes even “offered up to pharmaceutical companies, surgery centers and other medical marketers.” The CDD thinks that consumers may also be “targeted on the spot for payday loans” by financial mobile marketers when entering a specific geographic location.

Like corporate efforts to capitalize on Big Data, adapting to the “Internet of Things” is a current business necessity. Not only is it integral to keep up with the advances of smart devices and integrate them into business for efficiency, but the information and feedback obtained from smart devices can also prove to be immensely beneficial for better understanding consumer habits. At the same time, companies must be mindful of the types of data they collect and how they use the information, and must also ensure that the necessary disclosures are given to consumers.


Research and drafting assistance for this post was provided by Reed Smith Summer Associate Sulina Gabale.

New FAQs Issued by the FTC for COPPA Compliance

This post was written by John P. Feldman and Caroline Klocko.

Earlier this week, the Federal Trade Commission (FTC) issued Frequently Asked Questions for complying with the Children's Online Privacy Protection Act (COPPA). The FAQs are intended as a supplement to the already issued compliance materials. As we previously reported, the revised COPPA Rule is set to go into effect on July 1, 2013. For companies running websites that collect information from children under 13, COPPA compliance will be critical. The FAQs will provide helpful guidance to reach that goal.

To learn more please visit our sister blog, AdLaw By Request.

Commissioner Brill to States: Data Brokers Aren't Going to Regulate Themselves

This post was written by Paul Bond and Christine E. Nielsen.

Federal Trade Commissioner Julie Brill, in a speech Monday at the National Association of Attorneys General (NAAG) Presidential Initiative Summit, urged the states to take a more active role in investigating data brokers and holding them accountable for violations of the Fair Credit Reporting Act (FCRA).

The FCRA regulates the use of credit report information for credit and insurance eligibility decisions, and also in background checks and other investigative reports. The traditional actors in this space have seen increasing competition from entrants into the market, many of which may not be aware of FCRA’s broad reach and statutory requirements. For example, the FTC recently notified entities that compile rental history data that they are likely subject to FCRA and must abide by its requirements.

The attorneys general have publicly pursued several privacy-related investigations and enforcement actions since Attorney General Gansler announced his “Privacy in a Digital Age” Presidential Initiative. The California attorney general has recently provided guidance to and engaged in enforcement actions against entities active in the mobile application space. And the attorneys general have recently concluded an enforcement action against Google, which resulted in a $7 million settlement for Google’s alleged interception of personal data through its Street View vehicles. Still, the FTC, and not the states, has pursued data brokers for FCRA violations.

Data brokers have long been of interest to the FTC, which singled the industry out as one that needs special attention in its 2012 privacy report. Regulators justify heightened scrutiny because data brokers amass large quantities of valuable consumer data, but are often unknown to consumers. The state attorneys general as a multi-state group investigated and eventually settled with ChoicePoint following that data broker’s 2004 security breach, and individually have investigated entities that engaged in pretexting to obtain and compile phone record data.

As we enter the final few months of Attorney General Gansler’s term as NAAG President, we will keep a close watch on whether the attorneys general answer Commissioner Brill’s call-to-action.

Supreme Court Reins in State-Action Immunity Doctrine

This post was written by William J. Sheridan.

Yesterday, in FTC v. Phoebe Putney Health Systems, Inc., the Supreme Court rejected an expansive view of the state-action immunity doctrine articulated by the U.S. Court of Appeals for the Eleventh Circuit. Saying the court of appeals applied the doctrine’s concept of foreseeability “too loosely,” the unanimous Court concluded that Georgia did not contemplate displacing competition in the hospital services market when it created hospital authorities with general corporate powers.

The court of appeals had upheld dismissal of an FTC action aimed at preventing the acquisition of a Georgia hospital by a hospital authority that controlled the only other hospital in the county. According to the 11th Circuit, the state-action immunity doctrine foreclosed the possibility of federal antitrust liability even though it agreed that the acquisition would limit competition and potentially create a monopoly.

The state-action immunity doctrine protects local government entities, like the hospital authorities in this case, from antitrust enforcement where they are acting pursuant to a clearly articulated and affirmatively expressed state policy to displace competition. Private parties enjoy similar immunity, with the additional requirement that the policy must be actively supervised by the State.

The Supreme Court determined that in this case there was no clear articulation of a State policy that would permit anticompetitive conduct. Justice Sotomayor’s opinion reasoned that although Georgia hospital authorities have the power to acquire other hospitals, it was not clearly articulated that they would have the power to “make acquisitions of existing hospitals that will substantially lessen competition.” Slip Op. at 10.

As a practical matter, any relief granted on remand will be remedial because the transaction closed after the court of appeals issued its opinion and dissolved the FTC’s temporary injunction.

The upshot of this decision seems to be that the Supreme Court has declined an invitation to broaden the antitrust protection afforded private entities and local governments by the state-action immunity doctrine. A general delegation of corporate powers by the State is not a license to reduce competition. As the Court observed, cases where it has found a “clear articulation” of State intent to displace competition without an explicit statement typically involve delegations of regulatory authority that are “inherently anticompetitive.” Slip Op. at 12.


FTC Speaks About Its Mobile Privacy Disclosures Guidance

This post was written by Paul Bond and Frederick Lah.

On February 1, the FTC released its Mobile Privacy Disclosures Guidance (the Guidance) setting forth best practice recommendations for platforms, app developers, third parties such as ad networks and analytics companies, and app trade associations. We previously wrote about the Guidance when it was issued.

On February 15, Assistant Director in the FTC’s Division of Privacy and Identity Protection, Chris Olsen, spoke at the latest National Telecommunications and Information Administration (NTIA) stakeholders’ meeting in Washington, DC about the Guidance. Here are some highlights from the meeting:

  • Olsen started off the meeting by recapping the recent efforts by the FTC in the mobile space.
  • He said that the FTC believes that consumers are not really aware of the types of information collection and sharing practices that are taking place.
  • He described the mobile ecosystem as “complex” and said that all the players in the ecosystem need to work together to improve it.
  • Olsen spoke about the specific roles and responsibilities of all the players in the ecosystem – app platforms, app developers, and ad networks – as outlined in the Guidance (many of which we described in our previous blog article).
  • According to Olsen, the Guidance was designed to do three things: (1) spur members of the ecosystem to take a more active role in addressing the lack of sufficient disclosures; (2) reach as many industry participants as possible in the “diverse marketplace” and educate them on best practices; and (3) provide input to industry stakeholders, such as the NTIA, on the development of a code of conduct for the mobile space.
  • One commentator noted that the Guidance sets out what industry participants should be doing, but does not seem to set out what the role of the FTC should be. Olsen responded by saying that, “the FTC needs to do better, too.” He specifically identified enforcement and outreach as areas for improvement.
  • With respect to the Guidance’s recommendations for platforms, Olsen stressed the need for platforms to be clear to consumers about what they’re doing (or not doing) and to oversee and enforce their developer agreements. 
  • Olsen pointed out that the Guidance does not set forth legal requirements, and that the FTC did not issue it with the goal of providing any sort of legal framework. He did note, though, that Congress is interested in the issue and will continue to hold hearings about the state of affairs in the mobile environment, and that the FTC would provide input to Congress if called upon to do so.
  • As for the use of icons in the mobile space, Olsen said that he thinks that an essential element of any icon program must be that it “communicates a clear message” and is not ambiguous. 
  • He also noted that the recent report from the California AG’s office on mobile privacy is largely consistent with the FTC’s Guidance, though the California report covers a broader range of mobile privacy issues, not just disclosures.

Olsen’s comments and the Guidance itself are informative, but it remains to be seen how the players in the ecosystem will respond to the recommended best practices. Another open question is what effect, if any, the Guidance will have on the FTC’s enforcement efforts. We’ll be monitoring this situation closely.

Deadline for Comments on Fred Meyer Guides Extended by FTC

This post was written by Keri S. Bruce.

The deadline to provide comments to aid the Federal Trade Commission (FTC) in its review of the Fred Meyer Guides (Guides) was reopened and extended until March 4, 2013. The Guides clarify the Robinson-Patman Act (Act) by explaining how manufacturers and wholesalers can provide advertising allowances and other promotional payments and services to retailers without violating the Act’s prohibition on anti-competitive price discrimination.

Please click here for more information on our sister blog, Adlaw by Request.

FTC Tries The Carrot and The Stick: Releases Guidance on Mobile Privacy Best Practices; Enters Into $800K Consent Order with Path

This post was written by John P. Feldman, Paul Bond and Christine E. Nielsen.

Today, the Federal Trade Commission released detailed guidance on privacy in the mobile environment – at the same time it announced its largest-ever settlement with an app developer for alleged privacy violations. Combined with aggressive action on mobile privacy issues by the California attorney general’s office, Mobile Privacy Disclosures provides every company associated with a mobile app with an urgent reason to review all disclosures and practices. 

Please click here to continue reading this Client Alert

Federal Trade Commission Announces Adjusted HSR Thresholds for 2013

This post was written by Debra H. Dermody, P. Gavin Eastgate, Michelle A. Mantine and William J. Sheridan.

On January 10, 2013, the Federal Trade Commission announced the annual threshold adjustments for premerger filings under the Hart-Scott-Rodino Antitrust Improvements Act of 1976 (15 U.S.C. § 18a) (“HSR”). The new thresholds have increased the dollar amount required to trigger HSR notification with respect to both the size-of-transaction and size-of-person tests.

Please click here to read the issued Client Alert.

Right on Time: FTC Announces COPPA Update

This post was written by John P. Feldman and Frederick Lah.

Earlier today, FTC Chairman Leibowitz announced the agency’s update to the COPPA rule at a press conference alongside Sens. Jay Rockefeller (D-W.Va.) and Mark Pryor (D-Ark.), and Congressmen Ed Markey (D-Mass.) and Joe Barton (R-Tex.). The changes to COPPA were two years in the making and were the result of two proposed rule revisions and comment periods. As anticipated, the new rule comes with a broadened scope. Sen. Rockefeller provided the opening remarks to the press conference, expressing his approval that the new COPPA rule “captures the new online reality” to address the rise of social networks, smartphones, tablets, and apps. Some highlights from the new COPPA rule include:

  • Expanded scope of personal information – the collection of which requires parental notice and consent – to include geolocation information, photographs, and videos. The chairman noted that this kind of information can be used to cause physical harm to children.
  • Expanded scope of personal information to also include persistent identifiers, such as mobile device unique identifiers and IP addresses, to the extent they can recognize users over time and across different websites. Chairman Leibowitz noted that these types of information can be used to build massive profiles by behavioral marketers. The definition would not be extended to include persistent identifiers if they are used for the sole purpose of supporting the site or its internal operations.
  • Closed a “loophole” that allowed covered websites or online services to permit third parties to collect personal information through plug-ins or ad networks that the covered websites or online services would not have otherwise been allowed to collect without parental consent. Third-party collectors are now also required to comply with COPPA if they have “actual knowledge” of the child-directed nature of the site from which they are collecting personal information. 
  • Offered companies a “streamlined, voluntary and transparent” approval process for new ways of getting parental consent. The chairman encouraged companies to create additional “simple, low-cost means” of obtaining verifiable parental consent.
  • Strengthened data security protections by requiring that covered websites or online services take reasonable steps to release children’s personal information only to companies that are capable of keeping it secure and confidential.
  • Strengthened the requirement that covered websites or online services adopt reasonable procedures for data retention and deletion, and strengthened the FTC’s oversight of self-regulatory safe harbor programs.
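The expanded definitions above boil down to a decision rule: certain new data types always trigger notice and consent, while persistent identifiers do so only when used beyond internal operations. Here is a minimal illustrative sketch of that rule; the category labels and the `internal_operations_only` flag are our own shorthand, not language from the rule itself:

```python
# Illustrative reading of the expanded COPPA "personal information" test.
# Category names below are our own labels, not the rule's text.

EXPANDED_CATEGORIES = {"geolocation", "photo", "video"}
PERSISTENT_IDENTIFIERS = {"device_id", "ip_address", "cookie_id"}

def requires_parental_consent(category: str, internal_operations_only: bool = False) -> bool:
    """Return True if collecting this data type triggers parental notice/consent."""
    if category in EXPANDED_CATEGORIES:
        return True
    if category in PERSISTENT_IDENTIFIERS:
        # Persistent identifiers are exempt only when used for the sole
        # purpose of supporting the site or its internal operations.
        return not internal_operations_only
    return False

print(requires_parental_consent("geolocation"))                                 # True
print(requires_parental_consent("ip_address", internal_operations_only=True))   # False
```

As the sketch shows, the internal-operations carve-out is the only path by which a persistent identifier escapes the consent requirement.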

Notably, the chairman said that advertisers can continue to advertise to children, but may not engage in behavioral advertising without the consent of parents. The chairman bluntly stated: “Until and unless you get parental consent, you may not track children to build massive profiles for behavioral advertising principles. Period.”

The amendments are expected to take effect July 1, 2013. Companies need to consider these amendments now, both with respect to their website operations and in the mobile space. This is especially true considering the FTC’s recent focus on children’s privacy in the mobile app environment. We will continue to follow this issue closely.


More News on COPPA...

This post was written by John P. Feldman and Frederick Lah.

One day after the FTC issued its second report on privacy concerns with mobile apps for kids, "Mobile Apps for Kids: Disclosures Still Not Making the Grade", a consumer privacy group filed a complaint with the FTC against a mobile game-maker for alleged violations of COPPA.  The complaint, filed by the Center for Digital Democracy, alleged that the mobile game-maker was collecting children’s personal information through its mobile game without obtaining verifiable parental consent, and without providing the requisite notice under COPPA.

As actions continue to be brought under the current COPPA regime, we’re expecting the new COPPA rules to be issued any day now. In a recent interview, Chairman Leibowitz noted that he was "pretty sure" that the new COPPA rules would be finalized by the end of the year (while sounding less optimistic that a Do-Not-Track deal would be reached in the same time frame). We will continue to monitor this issue closely in the coming weeks.

FTC Issues Second Report on Privacy Concerns with Mobile Apps for Kids

This post was written by John P. Feldman and Frederick Lah.

It continues to be a busy time in the world of mobile app privacy. Last week, we reported on the California attorney general bringing a mobile privacy enforcement action against Delta Air Lines. And just yesterday, the FTC issued its second staff report on the privacy practices of mobile apps for children, “Mobile Apps for Kids: Disclosures Still Not Making the Grade.”

This report reiterates some of the findings from the FTC’s first report on the privacy practices of mobile apps for children, “Mobile Apps for Kids: Current Privacy Disclosures are Disappointing.” The FTC continues to voice its dissatisfaction with the privacy disclosures that companies provide to parents about, for example, what type of data an app collects, who will have access to that data, how the data will be used, and with whom the data will be shared. The FTC continues to believe that such disclosures should be provided prior to download, since once an app is downloaded, it may already be collecting the child’s information.

The basis of the second report was again a survey of approximately 400 apps, 200 each from the Apple and Google Play app stores. According to the survey, only 20 percent of the apps reviewed contained any privacy-related disclosures at all, whether on the app’s promotion page, on the developer website, or within the app. In addition, the FTC’s survey found that:

  • 59 percent of the apps reviewed transmitted device ID, geolocation, or phone number to the developer, an advertising network, analytics company or other third party, yet only 11 percent of the apps disclosed that the app transmitted such data.
  • 58 percent of the apps reviewed contained advertising within the app, yet only 9 percent disclosed that the app contained advertising.
  • 22 percent of the apps reviewed contained links to social media, yet only 9 percent disclosed that fact.
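The gap the FTC flagged in each bullet above is the same one: apps whose practices outpace their disclosures. That audit can be expressed as a simple filter over survey records. A hypothetical sketch (the field names and sample records are ours, not the FTC's survey data):

```python
# Hypothetical audit mirroring the FTC survey's practice-vs-disclosure gap.
apps = [
    {"name": "app_a", "transmits_data": True,  "discloses_transmission": False},
    {"name": "app_b", "transmits_data": True,  "discloses_transmission": True},
    {"name": "app_c", "transmits_data": False, "discloses_transmission": False},
]

def undisclosed_transmitters(records):
    """Apps that transmit device data without disclosing it -- the FTC's core concern."""
    return [r["name"] for r in records
            if r["transmits_data"] and not r["discloses_transmission"]]

print(undisclosed_transmitters(apps))  # ['app_a']
```

Any app surfaced by a check like this is exactly the kind of practice/disclosure discrepancy the FTC says could violate COPPA or the FTC Act.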

The FTC intends to conduct another survey in the future and has said it “expect[s] to see improvement.” It noted that such discrepancies between a company’s privacy practices and disclosures could constitute violations of COPPA or the FTC Act. With the FTC’s continued focus in this area and the proposed COPPA rules expected to be implemented shortly, we anticipate that the intersection between mobile apps and children’s privacy will continue to draw heightened regulatory scrutiny. In the meantime, we recommend that companies operating in the mobile space (especially those providing apps for children) review and update their mobile app privacy disclosures now, before they become the next enforcement target.

Reed Smith Gearing Up For 'Big Data Monetization' Conference

This post was written by Mark Melodia, Cynthia O'Donoghue, Paul Bond and Frederick Lah.

Next week, Reed Smith will host a conference on “Big Data Monetization” at the Quadrus Conference Center in Silicon Valley (8:30-11:30 a.m. PDT). As we gear up, we wanted to share some of our thoughts on this notion of Big Data and give you a preview of the types of issues we’ll be tackling at the conference.

Big Data is an amorphous term, one that has taken on different meanings in different contexts. In general, it refers to the accumulation of data, especially data sets that are of little analytical use in isolation. The term does not just encompass the small subset of companies that actually provide data analytics, or that exist for the sole purpose of monetizing personal information and habit data; it extends to any significant company participating in the digital data-driven economy.

Virtually every company, in every industry, is now an information and technology company. Companies run on Big Data, whether it be customer information, employee information, or competitive intelligence. Companies store, share, and use that information in increasingly complex ways, taking advantage of cloud-based solutions and revolutions in analytics, and finding ways to turn these massive databases into revenue – for example, by creating tailored advertisements based on customer shopping preferences or online browsing history. There is no doubt a plethora of opportunities for retailers, health care providers, banks, energy companies, website operators, and data brokers alike in Big Data.

Of course, using Big Data comes with its own set of risks. Companies need to ensure that their disclosures about their information practices are up-to-date and accurate, and laws or regulations may govern the collection and use of information depending on the types of data and data subjects involved. Both Congress and the Federal Trade Commission have also recently raised concerns about data brokers. In addition, some customers may feel uneasy that a company holds so much information about them, unease that can draw the attention of class action plaintiffs’ counsel. And, of course, holding such deep databases of personal information highlights the importance of keeping information safe and secure: the more valuable information a company holds, the more magnified the threats of data theft and data loss become. The key to monetizing Big Data is striking the balance between risk and reward.

Our Data Privacy, Security & Management team has extensive experience providing privacy compliance advice to clients. We draw on knowledge gained in defending more than five dozen privacy class actions and three Multidistrict Litigations; day-to-day operational experience answering questions from our technology, financial, health, and energy clients; and a diverse skill set that includes engineers, software developers, cybersecurity and other technology professionals, former regulators, and former in-house counsel at global banks, asset managers, and insurers. Earlier this month, Mark Melodia and Paul Bond were featured in the cover story of Law Technology News, “Defending Big Data.” Mark also recently did a podcast on “Defending Big Data.” We continue to stay on top of this area.

At the “Big Data Monetization” conference next week, our panel of experts will be tackling the following types of questions:

  • Why should a corporate officer, director, or investor care about issues with Big Data?
  • What is the current regulatory landscape for Big Data?
  • What are the biggest challenges for Big Data as it operates in the United States and globally?
  • How does the issue of data ownership arise for Big Data?
  • What types of litigation risks exist for Big Data?
  • How does the so-called concept of the “right to be forgotten” impact Big Data?
  • How does insurance play a role in mitigating the risks that come with Big Data?

We look forward to seeing many of you next week.

FTC Does Not Issue a Final COPPA Rule; Instead, Seeks Comment on Modifications to Rule Definitions

This post was written by John P. Feldman, Amy S. Mushahwar and Christine Nielsen.

This morning the FTC released a supplemental notice of proposed rulemaking on the Children's Online Privacy Protection Act (COPPA) Rule. This is not a final rule. The notice suggests further modifications to proposed definitions released in the September 2011 Notice of Proposed Rulemaking on the COPPA Rule. Specifically, the FTC now seeks comment on proposed modifications to the definitions of "operator," "personal information," and "website or online service directed to children." This notice must be read in conjunction with the 2011 notice to understand the full scope of the proposed changes. The FTC is seeking comments on these proposals. Comments must be received on or before September 10, 2012. Shortly, we will be providing a detailed analysis of this notice in context with the earlier release.

FCC Approves Order to Tighten Regulatory Treatment of Robocalls Under the Telephone Consumer Protection Act

This post was written by Judith L. Harris and Amy S. Mushahwar.

The Federal Communications Commission (FCC) acted today to tighten its rules under the Telephone Consumer Protection Act (TCPA) and conform them, to the extent possible, with the more stringent rules already in place at the Federal Trade Commission (FTC) under the Telemarketing Sales Rule (TSR). This change will hit hardest entities, such as banks, that are not subject to FTC jurisdiction and do not already have more stringent compliance programs in place. Although the FCC’s order has not been released and no details are yet available as to how the revised rules will operate and exactly which calls they will cover, the following four points are clear:

1. Prior express WRITTEN consent will now be required before making any telemarketing robocall (using an autodialer or a prerecorded message) to a consumer; electronic signatures will be acceptable as evidence of written consent, and this change will not apply to purely informational calls (“such as those related to school closings and flight changes”);

2. The “established business relationship” will be eliminated as an exception to the prior written consent requirement that currently applies in the case of wireline calls;

3. An automated opt-out mechanism will have to be included in each robocall to facilitate a consumer’s ability to withdraw prior consent; and

4. The rules governing abandoned or “dead air” calls will be tightened, including through stricter time limits and by changing those limits to apply to each separate marketing campaign, rather than allowing the limits to be averaged over different calling campaigns, as is currently the case.
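Taken together, the first three points amount to a pre-call gating check. Here is a hedged sketch of that logic; the function, its parameters, and the call-type labels are illustrative shorthand, not the FCC's language, and real compliance turns on the order's final text:

```python
# Illustrative pre-call gate reflecting the revised TCPA robocall rules
# described above. Labels and parameters are our own shorthand.

def may_place_robocall(call_type: str, has_written_consent: bool,
                       includes_opt_out: bool) -> bool:
    """Return True if the sketched rules would permit the robocall."""
    if call_type == "informational":
        # Purely informational calls (e.g., school closings, flight changes)
        # fall outside the written-consent requirement.
        return True
    # Telemarketing robocalls need prior express WRITTEN consent -- an
    # established business relationship no longer substitutes -- and must
    # include an automated opt-out mechanism.
    return has_written_consent and includes_opt_out

print(may_place_robocall("telemarketing", has_written_consent=False, includes_opt_out=True))  # False
print(may_place_robocall("informational", has_written_consent=False, includes_opt_out=False)) # True
```

Note that the check deliberately has no "established business relationship" branch: under point 2 above, that exception is eliminated for the written-consent requirement.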

We are awaiting further details on exactly how these rules will be applied and when they will become effective. In the interim, please contact the authors of this article or the Reed Smith attorney with whom you normally work.

Federal Trade Commission Announces Adjusted HSR Thresholds for 2012

This post was written by Debra H. Dermody, Gavin P. Eastgate and Michelle Mantine.

On January 24, 2012, the Federal Trade Commission announced the annual threshold adjustments for premerger filings under the Hart-Scott-Rodino Antitrust Improvements Act of 1976 (15 U.S.C. § 18a) (“HSR”). The new thresholds have increased the dollar amount required to trigger HSR notification with respect to both the size-of-transaction and size-of-person tests.

The revised HSR thresholds will apply to all transactions that close on or after the effective date, which is 30 calendar days following publication of the adjusted thresholds in the Federal Register. Publication will occur shortly, and the effective date will be in late February.  Click here to learn more about the Adjusted HSR Thresholds for 2012.

FTC's Consent Order with ScanScout: The Latest Progression with 'Flash Cookies' and Privacy

This post was written by Mark S. Melodia, Christopher G. Cwalina, Steven B. Roosa and Frederick Lah.

Online advertising network ScanScout, Inc. has agreed to settle the FTC's charges that it deceptively represented that users could opt out of receiving targeted ads by changing their Web browser settings to block and delete cookies. The consent decree stems from the FTC's charges that ScanScout's privacy policy did not adequately inform users that Flash local shared objects, otherwise known as "Flash cookies", were being placed on their computers, or how to manage them. This news is just the latest development in the Flash cookie issue. In addition to the ongoing threat of Flash cookie-related litigation, companies should now be on notice that the failure to properly disclose the use of Flash cookies can result in FTC enforcement. The following client alert provides more detail about the consent decree itself and lists some steps that every company with an online presence should take with regard to its use of Flash cookies and other data collection technologies. Please feel free to pass this along to any client who may find it relevant.
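One concrete first step for any such review is simply inventorying the Flash local shared objects already present on a machine. LSOs are stored as `.sol` files under a few conventional directories; the sketch below scans those locations. The paths are the commonly documented defaults and may differ by platform and Flash Player version, so treat this as a starting point rather than an exhaustive audit:

```python
import os
from pathlib import Path

# Commonly documented Flash LSO (".sol") storage locations.
# These defaults vary by platform and Flash Player version.
CANDIDATE_DIRS = [
    Path.home() / ".macromedia" / "Flash_Player" / "#SharedObjects",  # Linux
    Path.home() / "Library" / "Preferences" / "Macromedia"
        / "Flash Player" / "#SharedObjects",                          # macOS
    Path(os.environ.get("APPDATA", "")) / "Macromedia"
        / "Flash Player" / "#SharedObjects",                          # Windows
]

def find_flash_cookies():
    """Return paths of all .sol files found under the candidate directories."""
    found = []
    for base in CANDIDATE_DIRS:
        if base.is_dir():
            found.extend(sorted(base.rglob("*.sol")))
    return found

if __name__ == "__main__":
    for sol in find_flash_cookies():
        print(sol)
```

The directory names in each `.sol` path typically identify the domain that set the object, which makes the inventory a quick way to compare what is actually stored against what a privacy policy discloses.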

To view the entire alert, please click here.

FTC Seeks Public Comment For Revising the "Dot Com Disclosures"

Careful Consideration is Advised, as FTC's Guidance May Inform Federal and/or State Enforcement Actions

Comments Deadline: July 11, 2011

This post was written by Christopher G. Cwalina, Amy S. Mushahwar, and Frederick Lah.

The Federal Trade Commission ("FTC") seeks public comment, as it considers updating and reissuing "Dot Com Disclosures: Information about Online Advertising", its business guidance document for online marketers on how to provide clear and conspicuous disclosures to consumers.

In its request for comment, the FTC cites the dramatic changes in the online world since the guidance was originally published in 2000, particularly the emergence of mobile marketing, the "App" economy, the use of "pop-up blockers," and online social networking. (This recognition of mobile is particularly important in light of last week's letter by Senator Al Franken (D-MN) to Google (maker of the Android) and Apple (maker of the iPhone and iPad) asking that all mobile apps for their devices provide "clear and understandable privacy policies.")

Even though the "Dot Com Disclosures" are considered guidance and not formal regulations, the FTC has used the document to inform Section 5 enforcement actions. For example, in a consent order with Advertising.com, Inc., the FTC required that Advertising.com's representations about its advertisements be made "clearly and prominently," defining "clearly and prominently" almost verbatim from the guidance. The FTC also cited the guidance in 2002 in response to a complaint brought by Commercial Alert against search engines, such as AOL and Microsoft, for their allegedly misleading disclosures about the advertisements placed on search result lists. State courts have also cited the guidance. In 2009, a Texas court stated that courts determining what constitutes deceptive conduct under Texas' Deceptive Trade Practices Act "are to be guided by the interpretations of that term in the guidelines of the FTC," and found that those guidelines require that disclosures be “clear and conspicuous” based on the placement of the disclosure on the webpage and its proximity to the other relevant information.

The FTC seeks comment from the industry on a number of issues. In the request for comment, the FTC provides a series of questions to help companies consider what type of revisions need to be made, such as:

  • What issues have been raised by new online technologies, Internet activities, or features that have emerged since the business guide was issued (e.g., mobile marketing, including screen size) that should be addressed in a revised guidance document?
  • What issues raised by new laws or regulations should be addressed in a revised guidance document?
  • What research or other information regarding the online marketplace, online advertising techniques, consumer online behavior, or the effectiveness of online disclosures should be considered in a revised guidance document?
  • What specific types of online disclosures, if any, raise unique issues that should be considered separately from general disclosure requirements?
  • What guidance in the original “Dot Com Disclosures” document is outdated or unnecessary?
  • What guidance in “Dot Com Disclosures” should be clarified, expanded, strengthened, or limited?
  • What issues relating to disclosures have arisen from multi-party selling arrangements in Internet commerce, such as (1) established online sellers providing a platform for other firms to market and sell their products online, (2) website operators being compensated for referring consumers to other Internet sites that offer products and services, and (3) other affiliate marketing arrangements?

Regardless of how the guidance is ultimately revised, the FTC will certainly continue to use this sort of guidance to inform its enforcement efforts. We recommend that companies carefully review the Dot Com Disclosures guidance and the questions posed in the request for comment. Companies should analyze how any new guidance might affect their advertising practices and consider whether to submit comments. The July deadline may seem far off, but because these issues are likely to affect advertising for multiple product lines within your company, we encourage you to begin an internal dialogue on this proceeding immediately.

Commissioner Brill Introduces Competition Analysis to Privacy Debate

This post was written by Paul Bond and Chris Cwalina.

In her new article, "The Intersection of Consumer Protection and Competition in the New World of Privacy," Federal Trade Commissioner Julie Brill cautions that the pursuit of privacy may conflict with the pursuit of a competitive market. Commissioner Brill's article, published in the Spring Edition of Competition Policy International, notes that the Federal Trade Commission's role is to protect consumers from many types of market failures. The FTC strives to protect consumers from unfair and deceptive information collection and use practices. But, at the same time, the FTC protects consumers from collusive and other anti-competitive behaviors. Commissioner Brill identifies a potentially problematic range of privacy enhancements which could, paradoxically, harm consumers by stifling competition. In this position, Commissioner Brill goes further than the FTC's preliminary white paper, "Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers" (2010 Privacy Report).

For example, Commissioner Brill asserts that self-regulation to date has been "slow and inadequate". This mirrors criticisms in the 2010 Privacy Report. But Commissioner Brill goes on to posit that dominant companies can misuse privacy self-regulation to stifle market entry by new competitors. The Commissioner does not describe in any detail the manner in which such an anti-competitive plan would be carried out. Presumably, the cost in money or time of complying with the industry's self-regulation would prove prohibitive for fledgling businesses, while just a "cost of doing business" for better capitalized industry leaders. There may also be a concern that existing businesses, which already hold stockpiles of consumer information, would erect barriers to data collection which would affect new enterprises disproportionately.

Commissioner Brill also raises the converse competitive concern: that privacy regulation should not unfairly benefit new entrants. "Indeed," she recognizes, "some more established data brokers and other information firms believe it is much easier for their newer competitors to design privacy protections into their new business models and new forms of communications than it is to retrofit old systems to meet the realities of today's privacy concerns."

Until now, a strategic analysis of the competitive impact of privacy regulation has not been an FTC priority. Indeed, in her article, Commissioner Brill notes that she writes only for herself and is not reflecting the views of the Commission or the other Commissioners. Still, taken in conjunction with Commissioner Rosch's recent opinion that the Google Buzz settlement may have been a strategic ploy by Google to create insurmountable regulatory barriers to entry, it is safe to say the FTC is increasingly wary of privacy regulation being misused for private ends. Advocates of self-regulation, as well as those seeking to advance or defeat governmental regulation, must be prepared to explain why their proposals are consistent with a vigorous free market. Advocates of industry self-regulation already know that the FTC has criticized efforts to date; this is another hurdle that must be cleared before the FTC deems self-regulation sufficiently robust and workable.

Given how easily information can be transferred as an asset between corporate forms, and from one part of the world to another, the prospect for strategic resistance to, or abuse of, privacy regulation by companies around the world is substantial. Commissioner Brill performs a service by injecting a note of economic realism into the ongoing debate about how information can and should be regulated in the 21st century.

Rep. Markey Releases a Kids Do Not Track Discussion Draft Bill

This post was written by John Feldman and Amy Mushahwar.

Bill Adds to the Web of Proposed Privacy Legislation and Contains Much More Than Kids Do Not Track

Today, Rep. Ed Markey (D-Mass.) circulated a discussion draft of his kids' online do-not-track bill, co-sponsored by Rep. Joe Barton (R-Tex.), which would make it illegal to use kids' or teens' information for targeted marketing and would require parental consent for online tracking of children's information. Both congressmen co-chair the House Privacy Caucus, and their kids' privacy bill will join other, more generally applicable privacy legislation pending in the 112th Congress from Representatives Cliff Stearns (R-Fla.), Fred Upton (R-Mich.), Jackie Speier (D-Calif.), and Bobby Rush (D-Ill.), and Senators John Kerry (D-Mass.) and John McCain (R-Ariz.), with Senator Jay Rockefeller (D-W.Va.) promising to release a generally applicable privacy bill containing Do Not Track provisions next week.

But members of the privacy community were expecting this piece of proposed legislation; Markey had promised since late 2010 that the bill was coming. Specifically, the bill would update the Children's Online Privacy Protection Act of 1998 ("COPPA") provisions relating to the collection, use, and disclosure of children's personal information. Further, it would establish protections for the personal information of teens, who were not previously addressed in COPPA at all.

Key provisions of the bill include:

Scope Updates: The bill would expand the definition of covered Internet operators to include online applications and the mobile web. The Federal Trade Commission ("FTC") would also be given rulemaking authority to create more flexible definitions of operators that account for the development of new technology. The bill also expands the personal information protected to include IP addresses, mobile SIM identifiers, and any other computer or device identifying numbers.

Privacy Policies/Disclosure: The bill would require online companies to explain the types of personal information collected, how that information is used and disclosed, and the policies for collection of personal information.

Further Parental Choice: In addition to keeping the existing requirements for online companies to obtain parental consent for the collection of children's personal information, the bill also includes provisions requiring companies to provide parents access to the information collected about their child and the opportunity to opt out of further use or maintenance of their child's data.

Targeted Marketing Prohibitions for Kids & Minors: Website operators and other online providers would be prohibited from knowingly collecting personal information for behavioral marketing purposes from children and minors. The FTC would be required to issue regulations within one year of the bill's passage.

Digital Marketing Bill of Rights for Teens & Fair Information Practices Principles: This section incorporates the Fair Information Practice Principles ("FIPPs") concept from the Department of Commerce's Privacy Green Paper. Under the proposed bill, website operators and other online providers would be prohibited from collecting personal information from any minors unless they adopt a Digital Marketing Bill of Rights for Teens. Such a bill of rights or FIPPs must include provisions regarding data collection, quality, purpose specification, use limitations, security, use transparency, and access and correction.

Geolocation Information Collection of Kids and Minors: Website operators and service providers must establish procedures for notice and choice regarding geolocation information. In the case of information collection from children, an operator/provider must obtain verifiable parental consent before this information would be collected, in most cases.

Eraser Button: Website operators must create an "Eraser Button" for parents and children, permitting users to eliminate publicly available personal information content when technologically feasible. (Such a provision, however, could lull parents and children into a false sense of security on the web. With multiple outlets for data caching, it is difficult to wholly erase data on the web.)

Expansion of FTC Jurisdiction to Telecom: In keeping with the Kerry bill, the Markey bill also seeks to expand FTC jurisdiction to telecommunications carriers.

We will be carefully evaluating these provisions while the bill is pending, but we can readily identify that complications are likely to arise for marketing to young adults. For example, teens are far more likely to lie when faced with traditional age screens. So, even though the statute contains a "knowing" information-collection requirement, to what degree would marketers be required to "fortify" their existing age screens to account for teens? And if more stringent age screens must be employed, will the more tedious screens reduce marketing to adults, too?

If this bill advances on the Hill, please look out for upcoming privacy bill updates from our team.

Reed Smith Attorney Talks McCain-Kerry Bill

Reed Smith Attorney Amy Mushahwar was recently interviewed by IT Business Edge on the McCain-Kerry Bill. According to Amy, "if enacted, the bill would expand the Federal Trade Commission’s jurisdiction to include telecommunications companies for privacy matters. Typically, telecom companies would not be within the FTC’s jurisdiction." To see the complete interview, please click here.

FTC and Google - Proposed Settlement Over "Buzz"

This post was written by Christopher G. Cwalina, Amy S. Mushahwar, and Frederick Lah.

Google, Inc. agreed to a proposed consent order over charges that it used deceptive tactics and violated its privacy promises to consumers when it launched its social network, Google Buzz. The Agency alleged in its Complaint that Google's information practices violated Section 5 of the FTC Act.

As background, in February 2010, Google launched Buzz, a social networking service within Gmail, its web-based email product. Google used the information of Gmail users, including first and last name and email contacts, to populate the social network. Gmail users were, in many instances, automatically set up with “followers” (people that followed the user or people that the user followed). According to the FTC's Complaint, even if a user did not enroll in Buzz, the user's information was shared in a number of ways (e.g., a user who did not enroll in Buzz could still be followed by other Gmail users who enrolled in Buzz). The FTC also alleges that the setup process for Gmail users who enrolled in Buzz did not adequately communicate that certain previously private information would be shared publicly by default. Further, the FTC alleges that certain personal information of Gmail users was shared without consumers' permission through Buzz (e.g., some information was searchable on the Internet and could be indexed by Internet search engines).

Part I of the proposed consent order prohibits Google from misrepresenting the privacy and confidentiality of any "covered information," as well as the company's compliance with any privacy or security program, including the U.S.-EU Safe Harbor Framework. The term "covered information" is defined very broadly to include an individual's first and last name, physical address, email address, screen name, persistent identifier (e.g., IP address), list of contacts, and physical location. The FTC noted in its press release [http://www.ftc.gov/opa/2011/03/google.shtm] that this is the first time it has alleged violations of the substantive privacy requirements of the U.S.-EU Safe Harbor Framework.

Part II of the proposed consent order requires Google to give its users a "clear and prominent" notice and choice. Under the terms of the proposed consent order, Google must obtain express affirmative consent before sharing any user's covered information with a third party in connection with: (1) a change, addition, or enhancement to any product or service; or (2) sharing that is contrary to the stated sharing practices in effect at the time the information was collected. The proposed opt-in disclosure must appear separately from any end user license agreement, privacy policy, website terms of use, or similar document, and must prominently disclose: (1) that the Google user's information will be disclosed to one or more third parties; (2) the identity or specific categories of such third parties; and (3) the purpose(s) for Google's sharing of the information.

Part III of the proposed order requires Google to establish and maintain a comprehensive privacy program that is reasonably designed to address privacy risks related to the development and management of new and existing products and services. The program must be documented in writing and must contain privacy controls appropriate to Google's size and complexity, the nature and scope of its activities, and the sensitivity of covered information. Parts IV through IX of the proposed order contain reporting and compliance provisions, including obtaining ongoing biennial assessments from a qualified third-party professional about Google's privacy practices, requiring that Google retain consumer complaints for a period of six months, and mandating that Google submit an initial compliance report to the FTC and make subsequent reports available to the FTC. If finalized, the proposed consent order would remain in effect (with ongoing compliance requirements) for twenty years.

Commissioner Rosch, in a concurring statement, expressed "substantial reservations" about Part II. He said that Google never intended in its original Privacy Policy that the consent it would seek would be "opt-in" (as opposed to "opt-out"), and that such a requirement was "brand new." Commissioner Rosch also noted that the proposed consent order appears to apply to "any" new or additional sharing of previously collected personal information, not just any "material" new or additional sharing of information.

The proposed consent order will be placed on the public record for thirty days, until May 2, 2011, for public comment. After thirty days, the Commission will consider the comments received and decide whether to make the proposed consent order final. Bottom line: this case should serve as a reminder that companies must align their business practices with the promises contained in their privacy policies.

FTC Brings Enforcement Action against Text Messaging Spammer

This post was written by Kevin Xu and John Hines.

On February 22, 2011, the Federal Trade Commission (“FTC”) filed a complaint against Phillip A. Flora (“Flora”) for an operation that allegedly blasted consumers with millions of illegal spam text messages, including many messages that deceptively advertised a mortgage modification website called “Loanmod-gov.net.” The FTC is asking the court to shut down Flora’s operation and freeze his assets.

According to the FTC complaint, beginning on or about August 22, 2009, Flora transmitted or arranged for the transmission of at least 5 million spam text messages to random consumers. The text messages promoted products and services, including, but not limited to, loan modification programs and debt relief services. The text messages offered to help consumers obtain mortgage loan modifications, and many of the messages stated: "Homeowners, we can lower your mortgage payment by doing a Loan Modification. Late on payments OK. No equity OK. May we please give you a call? Loanmod-gov.net." Consumers who visited this web address arrived at a website that touted itself as the "Official Home Loan Modification and Audit Assistance Information" beneath a picture of the U.S. flag. This website, although it included the term "gov" in its address, was not operated by or affiliated with any governmental entity. Additionally, Flora allegedly collected information from consumers who responded to the text messages – even those asking him to stop sending messages – and sold their contact information to marketers, claiming they were "debt settlement leads."

The FTC charges that Flora violated Section 5(a) of the FTC Act, which prohibits unfair or deceptive acts or practices in or affecting commerce, by sending unsolicited commercial text messages to consumers and by misrepresenting that he was affiliated with a government agency. In addition, the FTC charges that Flora violated the CAN-SPAM Act by sending consumers spam text messages that failed to include a way for consumers to "opt out" of future messages and failed to include the physical mailing address of the sender, both of which the Act requires.

The outcome of this case, which we note is being brought by the FTC and not the FCC, may have a significant impact on consumer data privacy rights in the mobile communications sector, and may serve as a watershed case for consumers' potential recourse in future privacy violations arising from mobile communications.

Department of Commerce Privacy Green Paper -- Detailed Digest

This post was written by Amy Mushahwar.

As promised in our teleseminar last week, we have digested the Department of Commerce privacy green paper, entitled "Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework." The green paper will kick-start an ongoing discussion of privacy, and we encourage organizations to undertake a cost-benefit analysis now for the best outcome in 2011. Time is of the essence: comments on the green paper are due January 28, 2011. To learn more about this important release, please read our recent client alert.