European Commission releases technical standards on Radio Frequency Identification

This post was written by Cynthia O'Donoghue and Kate Brimsted.

In July, the EU introduced new technical standards (‘Standards’) to assist users of Radio Frequency Identification (‘RFID’) technology in complying with the EU Data Protection regime and the Commission’s 2009 recommendation on RFID. The Standards are the result of a long-term EU project which began with a public consultation in 2006.

When RFID technology is used to gather and distribute personal data, it falls within the EU Data Protection regime. The Standards are being introduced at a critical time, as the use of RFID becomes more widespread, particularly in the health care and banking industries.

The key features of the Standards include:

  • The introduction of a new, EU-wide RFID sign which will allow people to identify products that use smart chips
  • New Standards for Privacy Impact Assessments (‘PIA’) to help ensure data protection by design
  • Guidance on the structure and content of RFID privacy policies

The Standards will be a useful tool for organisations that already use RFID technology, or are looking to do so. In particular, the Standards on PIAs will assist organisations in planning how they will comply with the forthcoming Data Protection Regulation, which requires PIAs to be carried out in various circumstances.

Article 29 Working Party supports recognition of Processor BCRs in the Data Protection Regulation

This post was written by Cynthia O'Donoghue and Kate Brimsted.

In June, the Article 29 Working Party (‘Working Party’) wrote to the President of the European Commission, setting out its case for including a reference to Binding Corporate Rules for data processors (‘BCR-P’) in the forthcoming Data Protection Regulation.

Binding Corporate Rules are one way in which data controllers or data processors in Europe can lawfully undertake international transfers of data. They are an alternative to using EU Model Clauses, or gaining Safe Harbor certification. However, to date, BCRs have been used to a much lesser extent than either of these methods, since BCRs are costly and time-consuming to implement.

In the proposal for a Regulation published in January 2012, the European Commission had introduced an express reference to BCR-Ps. This reference was dropped, however, in the draft version of the Regulation that was voted on by the European Parliament in March 2014.

In its letter, the Working Party notes that it has officially allowed data processors to apply for BCRs since January 2013. In this connection, “denying the possibility for BCR-P will limit the choice of organisations to use model clauses or apply the Safe Harbor if possible, which do not contain such accountability mechanisms to ensure compliance as it is provided for in BCR-P.”

The letter makes clear that the Working Party is strongly in favour of BCR-Ps, which “offer a high level of protection for the international transfer of personal data to processors” and are “an optimal solution to promote the European principles of personal data abroad.” It is noted that three multi-nationals have already had BCR-Ps approved, and that approximately 10 applications are currently pending. If the Regulation does not provide for BCR-Ps, these companies will be put at a disadvantage.

The Regulation, which is currently being negotiated between the European Parliament and the European Council, is widely expected to come into force in 2017. It will implement substantial changes to the current regime, including the introduction of significant new duties for data processors.

Ireland and the UK ban forced subject access requests

This post was written by Cynthia O'Donoghue and Kate Brimsted.

The practice of employers forcing employees or applicants to exercise subject access rights has been described by the UK’s Information Commissioner’s Office (‘ICO’) as a “clear perversion of an individual’s own rights”. It is now set to become a thing of the past in the UK and Ireland, as both jurisdictions bring laws into effect to make the practice a criminal offence.

In Ireland, provisions of the Data Protection Act 1988 and Data Protection (Amendment) Act 2003 that outlaw the practice were triggered in July 2014. In addition to the employer-employee relationship, these provisions apply to any person who engages another person to provide a service. The provisions had always been included in the Acts, but had not been brought into force until now.

In June 2014, the UK’s Ministry of Justice released guidance stating that similar provisions will come into force as of 1 December 2014. Employers that attempt to force people to use their subject access rights will be committing a criminal offence, punishable by a potentially unlimited fine. In one of its blogs, the ICO made clear that it intends to enforce the provision, stating that “the war is not yet won but a significant new weapon is entering the battlefield. We intend to use it to stamp out this practice once and for all.”

These developments come against a backdrop of similar regulatory changes in the United States, where the long-standing “Ban the Box” movement continues to challenge the use of criminal conviction questions on job application forms. In addition, Maryland became the first state in 2012 to ban employers from asking employees and job applicants to disclose their social media passwords. Similar legislation has now been introduced or is pending in 28 states nationwide.

In light of these developments, employers should review their procedures to ensure that they do not fall foul of these new provisions.

FTC Workshop on Big Data: Focus on Data Brokers

This post was written by Divonne Smoyer and Christine N. Czuprynski.

On September 15, the Federal Trade Commission held a workshop entitled “Big Data: A Tool for Inclusion or Exclusion?” FTC Commissioner Julie Brill took the opportunity to discuss an industry that she has consistently maintained requires more regulation and scrutiny: data brokers.

Commissioner Brill stressed first that the FTC is very focused on entities regulated by the Fair Credit Reporting Act (FCRA), and reminded the audience that those entities will be held to the law by the agency. Those entities that are not subject to the FCRA are not off the hook: companies that engage in profiling, or “alternative scoring,” should take a very critical look at what they are doing, since alternative scoring has the potential to limit an individual’s access to credit, insurance, and job opportunities. Brill noted that the FTC’s May 2014 report focused on transparency, and called for legislation to make data brokers accountable – thoughts she echoed during Monday’s workshop.

Finally, Commissioner Brill stressed that all companies would be well-advised to see if their own big data systems cause problems that ultimately exacerbate existing socioeconomic disparities. She reiterated that companies should use their systems for good, and have a role in spotting and rooting out discrimination and differential impact. You can find the text of her full speech here.

Further U.S. Sanctions Target Russia's Energy, Defense and Financial Sectors

This post was written by Leigh T. Hansson, Michael J. Lowell, Hena M. Schommer, and Paula A. Salamoun.

As the United States and Russia continue to clash over Russia’s actions in the Ukraine, on September 12, the U.S. Treasury Department’s Office of Foreign Assets Control (“OFAC”) issued additional sanctions further restricting designated Russian financial institutions’ access to capital markets, targeting Russian defense entities, and prohibiting exports to Russian entities that have been specifically designated as participants in exploration and production in deepwater, Arctic offshore, or shale projects. As part of these recent changes, OFAC amended Directive 1, previously issued July 16, 2014, and added Directives 3 and 4 targeting the Russian defense and energy sectors. OFAC also added five more entities to its Specially Designated Nationals List (“SDN List”).

OFAC amended Directive 1 to change the debt maturity restriction from 90 days to 30 days. U.S. persons may not transact in, provide financing for, or deal in new debt or new equity of longer than 30 days maturity with entities designated under Directive 1 of the Sectoral Sanctions Identifications List (“SSI List”). All other previous restrictions provided for in Directives 1 and 2 remain unchanged. Directive 3 specifically targets the “defense and related materiel sector” of Russia, and prohibits “all transactions in, provision of financing for, and other dealings in new debt of longer than 30 days maturity” of designated entities. OFAC designated one Russian entity under Directive 3, Rostec State Corporation.

Directive 4 introduces new prohibitions targeting entities in Russia’s energy sector. OFAC designated five entities under Directive 4, prohibiting “the provision, exportation, or reexportation, directly or indirectly, of goods, services (except for financial services), or technology in support of exploration or production for deepwater, Arctic offshore, or shale projects that have the potential to produce oil” in Russia or any territory claimed by Russia. Concurrently, OFAC also issued General License Number 2, which allows U.S. persons until September 26, 2014, to engage in activities with Directive 4 designated entities that are “ordinarily incident and necessary to the wind down of operations, contracts, or other agreements” prohibited under Directive 4. The five entities designated under Directive 4 are Lukoil OAO, OJSC Gazprom Neft, Open Joint Stock Company Gazprom, Surgutneftegas, and Open Joint-Stock Company Rosneft Oil Company. Directive 4 prohibitions also apply to any entities 50 percent or more owned by one or more of the designated entities.

According to OFAC guidance, the Directive 4 prohibition does not apply to the provision of financial services, such as clearing transactions or providing insurance. However, the exportation of services, such as drilling services, geophysical services, geological services, logistical services, management services, modeling capabilities, and mapping technologies, are examples of prohibited activities under Directive 4.

Other Reed Smith updates related to U.S.-Russia can be found here.

85% of Mobile Apps Marked Down on Transparency: 'Must Try Harder' Say Global Privacy Regulators

This post was written by Cynthia O’Donoghue and Kate Brimsted.

In May this year, members of the Global Privacy Enforcement Network (GPEN) conducted a privacy sweep of 1,200+ mobile apps. The findings are now available (here).

GPEN is an informal network of 27 Data Protection Authorities (“DPAs”) established in 2007. Its members include the UK’s ICO, Australia’s OAIC, and Canada’s OPC.

DPAs from 26 jurisdictions carried out this year’s sweep (an increase of seven jurisdictions compared with the last sweep, which we reported on in May 2014). The recent sweep focused on (1) the types of permissions an app was seeking; (2) whether those permissions exceeded what would be expected based on an app of its type; and (3) the level of explanation an app provided as to why the personal information was needed and how it proposed to use it.

The results showed that:

  • 85% of the mobile apps failed to explain clearly how they were collecting, using and disclosing personal information.
  • 59% left users struggling to find basic privacy information.
  • One in three apps appeared to request an excessive number of permissions to access additional personal information.
  • 43% failed to tailor privacy policies for the small screen, e.g., by presenting them in tiny type or requiring users to scroll or click through multiple pages.

In announcing their results, the GPEN made it clear that the sweep was not in itself an investigation. However, the sweep is likely to result in follow-up work, such as outreach to organisations, deeper analysis of app privacy provisions, or enforcement actions.

Privacy shortcomings are not just a regulatory matter; research by the ICO last year suggested that 49% of app users have decided not to download an app because of privacy concerns. In an increasingly crowded app marketplace, good privacy policies may be a valuable way to stand out from the competition.

Reed Smith attorneys conduct Q&A with State AGs

This post was written by Divonne Smoyer.

The office of Connecticut Attorney General (AG) George Jepsen has been at the forefront of state-led privacy enforcement issues for years, and Connecticut is widely considered to be one of the most active states in privacy policy and legal enforcement. The Connecticut AG’s office was one of the first to create a special privacy unit, in 2011. And when it sued Health Net in January 2010, it became the first to exercise jurisdiction under the 2009 federal HITECH Act, which extends enforcement of federal privacy and security requirements governing protected health information to state AGs.

Reed Smith Data Privacy attorneys Divonne Smoyer and Christine Czuprynski produced a series of Q&A with AG Jepsen. Click here to read the entire piece on Privacy Advisor.

Click here to also read a Q&A with Indiana AG Greg Zoeller written by Smoyer and Reed Smith Privacy attorney Paul Bond.

Ninth Circuit Refuses To Enforce Arbitration Clause Contained in Barnes & Noble's 'Browsewrap' Terms of Use Agreement

This post was written by Mark S. Melodia and Lisa B. Kim.

During recent terms, the U.S. Supreme Court has repeatedly embraced mandatory arbitration and class action waivers contained in a wide variety of consumer contracts.  The Court has sided with corporate defendants and elevated the requirements of the Federal Arbitration Act above other legal and policy interests advanced by would-be class representatives and their class action counsel.  And yet, all of this case law takes as a starting point that a valid, enforceable contract has been formed under state contract law.  Given the increasingly online nature of consumer transactions, this means that companies offering their goods and services via website or app need to assure that their terms and conditions will be recognized later by a reviewing court as a binding contract in order to get the benefit of this pro-arbitration case law.  Those counseling companies must, therefore, closely watch court decisions – particularly federal appellate authority – that do or do not enforce online terms of use.  One such decision issued earlier this week.

On August 18, the Ninth Circuit affirmed the district court’s denial of Barnes & Noble, Inc.’s motion to compel arbitration, finding that plaintiff did not have sufficient notice of Barnes & Noble’s Terms of Use agreement, and thus, could not have unambiguously manifested assent to the arbitration provision contained in it.  See Nguyen v. Barnes & Noble, Inc., Case No. 12-56628, 2014 WL 4056549, *1 (9th Cir. Aug. 18, 2014).  In Nguyen, the plaintiff brought a putative class action against Barnes & Noble after it had cancelled his purchase of two heavily discounted tablet computers during an online “fire sale.”  The plaintiff alleged that Barnes & Noble engaged in deceptive business practices and false advertising in violation of California and New York law.

In affirming the district court’s ruling, the Ninth Circuit found that the plaintiff did not have constructive notice of the arbitration clause, despite the fact that Barnes & Noble’s Terms of Use were available through a hyperlink at the bottom left of every page of its website (i.e., as a “browsewrap” agreement) and in proximity to relevant buttons the website user would have clicked on.  Id. at *5-6.  The Ninth Circuit held that the onus was on website owners to put users on notice of the terms to which they wish to bind consumers, and that this could have been done through a “click-wrap” agreement where the user affirmatively acknowledged the agreement by clicking on a button or checking a box.  Id. at *5-6.  Indeed, the decision expressly states that had there been evidence of this, the outcome of the case may have been different.  Id. at *4.

In light of this decision, website owners utilizing a “browsewrap” terms of use agreement should consider incorporating some type of “click-wrap” method for garnering the affirmative consent of their users.  Otherwise, they run the risk that courts, like the Ninth Circuit, will refuse to enforce their terms.

TCPA Plaintiffs Secure Victories in Recent Rulings on Class Certification and Prior Express Consent

This post was written by Albert E. Hartmann and Henry Pietrkowski.

In separate cases, one Illinois federal judge issued several rulings favorable to Telephone Consumer Protection Act (TCPA) plaintiffs on key issues.  One ruling certified classes of almost 1 million consumers who received automated phone calls, even though the defendants’ records alone were not sufficient to identify the class members.  In a series of rulings in another case also involving automated calls, the judge refused to dismiss the case, even though the plaintiff admitted that he gave his cellular phone number to the defendant.

In the first case, Birchmeier v. Caribbean Cruise Line, Inc., et al., # 1:12-cv-04069 (U.S. District Court for the Northern District of Illinois), United States District Judge Matthew F. Kennelly certified two classes – with a combined total membership of almost 1 million consumers – who had received automated calls in alleged violation of the TCPA.  Plaintiffs initially indicated that they had received from defendants a list of almost 175,000 phone numbers to which automated calls had “unquestionably” been made.  At oral argument on class certification, defendants’ counsel conceded that the class members associated with those numbers were ascertainable. 

Ongoing discovery expanded that number to approximately 930,000.  Plaintiffs defined the putative classes as people whose numbers were on the list of 930,000 numbers from defendants, or whose own records could prove that they received a call at issue.  Judge Kennelly rejected defendants’ arguments opposing certification of classes based on this larger number.  The judge rejected the argument that the class was not ascertainable because defendants’ records could not establish the identity of the subscribers to the called numbers at the times of the calls.  The defendants’ earlier admission that the identities of the smaller number of class members were ascertainable, combined with plaintiffs’ contentions that they could (albeit with difficulty) identify the class members, rendered the putative classes sufficiently ascertainable under Rule 23.  Judge Kennelly also ruled that class members could be identified using their own records; for example, copies of phone bills showing they received a call from one of defendants’ numbers, or potentially with sworn statements providing sufficient details.  In reaching this ruling, Judge Kennelly noted that it would be “fundamentally unfair” to restrict class membership to people only identified on defendants’ records because that could result in “an incentive for a person to violate the TCPA on a mass scale and keep no records of its activity, knowing it could avoid legal responsibility for the full scope of its illegal conduct.”  After determining that the putative classes were ascertainable, the judge held that plaintiffs had carried their burden on the remaining Rule 23 elements and certified the two classes.  Thus, even when a defendant’s records cannot identify the putative class members, the class may still be certified if plaintiff can establish a viable method to ascertain class membership.

In the second case, Kolinek v. Walgreen Co., # 1:13-cv-04806 (U.S. District Court for the Northern District of Illinois), the plaintiff alleged a TCPA violation because he received automated calls to his cellular phone prompting him to refill a prescription.  Judge Kennelly initially dismissed the case because plaintiff had provided his cellular phone number to the defendant, which the defendant argued constituted “prior express consent.”  On July 7, 2014, however, Judge Kennelly reconsidered that decision in light of a March 2014 ruling from the Federal Communications Commission (FCC) that “made it clear that turning over one’s wireless number for the purposes of joining one particular private messaging group did not amount to consent for communications relating to something other than that particular group.”  Thus, while providing a cellular number may constitute “prior express consent” under the TCPA, “the scope of a consumer’s consent depends on its context and the purpose for which it is given.  Consent for one purpose does not equate to consent for all purposes.”  Because plaintiff alleged that he had only provided his number for “‘verification purposes.’ … If that is what happened, it does not amount to consent to automated calls reminding him to refill his prescription.”  Accordingly, Judge Kennelly ruled that dismissal of the case under the TCPA’s “prior express consent” exception was not warranted.

In a second opinion, issued August 11, 2014, Judge Kennelly ruled that dismissal was not warranted under the TCPA’s “emergency purposes” exception either.  While FCC regulations define “emergency purposes” to mean “calls made necessary in any situation affecting the health and safety of consumers,” 47 C.F.R. § 64.1200(f)(4), the FCC has not read that exception to cover calls to consumers about prescriptions or refills.  Noting the absence of such FCC guidance (which the judge observed would “bind the Court”), as well as the paucity of the complaint’s allegations “about the nature or contents of the call,” the judge ruled that he could not dismiss the case without “further factual development.”  Taken together, Judge Kennelly’s rulings in the Kolinek case may allow plaintiffs to survive motions to dismiss even when they admit providing their cellular phone numbers to the defendant.

In many respects, both of these opinions are outliers.  For example, other courts have concluded that providing a cellular number to a company constitutes consent to receive calls on that number.  Moreover, the rulings are fact-specific and thus may not extend beyond the cases at issue.  TCPA plaintiffs, however, will likely seize on these rulings and read them expansively to prolong cases and pressure defendants.  Defendants, therefore, must be aware of these issues and take them into account when defending TCPA cases, especially in the Northern District of Illinois.

OFAC Revises Guidance on Entities Owned by Multiple Specially Designated Nationals

This post was written by Hena M. Schommer, Bethany R. Brown, Michael J. Lowell, Leigh T. Hansson, and Michael A. Grant.

On August 13, 2014, the Office of Foreign Assets Control (“OFAC”) revised its guidance on the status of entities owned by persons designated on the Specially Designated Nationals List (“SDN List”).  Under the new guidance, OFAC will consider an entity to be blocked if it is 50 percent or more owned, directly or indirectly, in the aggregate by one or more SDNs. This rule applies even if the entity is not itself listed on the SDN List.  The guidance reverses OFAC’s prior position on aggregate ownership by multiple SDNs.  In conjunction with the revised guidance, OFAC also issued further guidance in the form of Frequently Asked Questions (“FAQs”).

OFAC’s revised guidance addresses ownership only, not control.  OFAC clarified that an entity collectively controlled by multiple SDNs – but not owned 50 percent or more by them under the standard articulated in the guidance – is not automatically blocked.  Some more comprehensive sanctions programs, such as those for Cuba and Sudan, may apply separate SDN control criteria.  However, OFAC warns that entities that are controlled by SDNs are at high risk of future designation by OFAC.
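The aggregation arithmetic described above can be sketched in code. This is a minimal, hypothetical illustration of the 50 percent test only, not a real sanctions-screening tool; the entity names and ownership figures are invented.

```python
# Hypothetical sketch of the OFAC 50 percent aggregation rule described above.
# Entity names and ownership shares are illustrative only.

def is_blocked_by_ownership(sdn_ownership: dict[str, float]) -> bool:
    """Return True if aggregate SDN ownership is 50 percent or more.

    sdn_ownership maps each SDN owner to its direct or indirect
    ownership share of the entity (0.0-1.0). Under the revised
    guidance, shares held by multiple SDNs are summed; control
    without qualifying ownership does not trigger automatic blocking.
    """
    return sum(sdn_ownership.values()) >= 0.50

# Two SDNs holding 25% each: aggregate 50%, so the entity is blocked.
print(is_blocked_by_ownership({"SDN A": 0.25, "SDN B": 0.25}))  # True

# One SDN holding 49%: not automatically blocked (though high risk).
print(is_blocked_by_ownership({"SDN A": 0.49}))  # False
```

As the guidance itself cautions, a real compliance review also requires due diligence to uncover indirect and intermediate ownership; a simple sum like this captures only shares that are already known.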

OFAC encourages entities considering potential transactions to undertake appropriate due diligence on parties to or involved with the transaction, especially in cases where complex ownership structures exist, since 50 percent or more direct or indirect ownership by SDNs will trigger automatic blocking. Persons doing business with companies owned in part by an SDN should reevaluate the companies' status under the new guidance and consider whether existing due diligence processes will be sufficient to identify blocked persons going forward.

Wearable Device Privacy - A Legislative Priority?

This post was written by Frederick Lah and Khurram N. Gore.

Seemingly every day, new types of wearable devices are popping up on the market.  Google Glass, Samsung’s Gear, Fitbit (a fitness and activity tracker), Pulse (a fitness tracker that measures heart rate and blood oxygen), and Narrative (a wearable, automatic camera) are just a few of the more popular “wearables” currently on the market, not to mention Apple’s “iWatch,” rumored to be released later this year.  In addition, medical devices are becoming increasingly advanced in their ability to collect and track patient behavior.

As wearables become more sophisticated and prevalent, they’re beginning to attract the attention of senators and regulators.  Earlier this week, U.S. Senator Chuck Schumer (D-N.Y.) issued a press release calling on the Federal Trade Commission (“FTC”) to push fitness device and app companies to provide users with a clear opportunity to opt out before any personal health data is provided to third parties.  Schumer’s concern is that the data collected through the devices and apps – which may include sensitive and private health information – may be sold to third parties, such as employers, insurance providers, and other companies, without the users’ knowledge or consent.  Schumer called this possibility a “privacy nightmare,” given that these fitness trackers gather a wide range of health information, such as medical conditions, sleep patterns, calories burned, GPS locations, blood pressure, weight, and more. This press release comes on the heels of an FTC workshop held in May that analyzed how some health and fitness apps and devices may be collecting and transmitting health data to third parties.

Schumer’s comments were of particular interest to us.  We’ve been beta-testing Google Glass for the past several months as we try to get a better understanding of the types of data privacy and security risks that wearables pose in the corporate environment.  As the devices continue to gain popularity, we expect regulators, legislators, and companies to start paying closer attention to the data security and privacy risks associated with their use.

House of Lords' report on Google 'right to be forgotten' case concludes that it's 'bad law'

This post was written by Cynthia O’Donoghue and Kate Brimsted.

Back in May, we covered the Court of Justice of the European Union’s landmark ruling in the Google Spain case (‘CJEU Judgment’). Since then, much has been made in the media about the so-called “right to be forgotten”, and the various characters that have requested the removal of links relating to them. Now, the House of Lords Home Affairs, Health and Education EU Sub-Committee (‘Committee’) has released its own report (‘Report’) on the CJEU Judgment, calling it “unworkable, unreasonable and wrong in principle”.

One of the main concerns of the Report is that the practical implementation of the CJEU Judgment imposes a “massive burden” on search engines, and that while Google may have the resources to comply with the ruling, other smaller search engines may not. In addition, the Report makes much of the argument that classifying search engines as data controllers leads to the logical conclusion that users of search engines are also data controllers.

In relation to the “right to be forgotten” – both as implemented by the Judgment and as proposed by the Data Protection Regulation – the Committee notes a particular concern that requiring privacy by design may lead to many SMEs not progressing beyond start-up stage. Labeling the Judgment “bad law”, the Committee calls for the EU legislature to “replace it with better law”, in particular by removing the current provision that would establish a right to be forgotten. The provision is unworkable in practice since it requires the application of vast resources, and leaves to individual companies the task of deciding whether a request to remove data complies with the conditions laid down in the Judgment.

The Committee’s Report is just one of a host of criticisms that has been made of the Google Spain decision – albeit one of the most high profile. Implementing the Judgment has also caused Google PR headaches, with individual instances of the removal of links subject to widespread coverage in the media.

Microsoft loses third round of battle against extra-territorial warrants

This post was written by Cynthia O’Donoghue, Mark S. Melodia, Paul Bond, and Kate Brimsted.

On 31 July, the chief judge of the Southern District of New York delivered the latest in a series of controversial judgments stemming from a test case brought by Microsoft challenging an extra-territorial warrant issued under the U.S. Stored Communications Act. In the third ruling on the matter, the court found in favour of the U.S. government, upholding the warrant and ordering that Microsoft turn over customer emails stored in a data centre in Ireland. The District Court agreed to stay the order while the decision is appealed further.  If Microsoft’s final appeal is dismissed, the case will have significant implications for all U.S. businesses that store customer data overseas.  The implications also extend to non-U.S. customers, including those companies located within the EEA, that have entered agreements with U.S.-based companies to store their data outside the U.S. In particular, there is concern that foreign companies and consumers will lose trust in the ability of American companies to protect the privacy of their data.

Click here to read the full Client Alert.

U.S. Expands Export Restrictions Targeting Russia's Oil and Gas Production

This post was written by Hena M. Schommer, Michael J. Lowell, and Leigh T. Hansson.

Effective August 6, 2014, the United States Department of Commerce’s Bureau of Industry and Security (“BIS”) issued new regulations, identified as the “Russian Industry Sector Sanctions,” restricting exports and other transfers of certain items subject to the Export Administration Regulations (“EAR”) that may benefit Russia’s energy sector.  Newly added EAR section 746.5 imposes licensing requirements on the export, reexport, or in-country transfer of a wide range of items that may be used in Russia in the exploration or production of deepwater, Arctic offshore, or shale projects having the potential to produce oil or gas. The new regulations also clarify that applications for pertinent export licenses are subject to a presumption of denial, and that no EAR license exceptions – aside from license exception GOV – apply to covered shipments.  The BIS rule took effect immediately upon issuance: any shipments of restricted items that are in transit on or after the effective date of August 6, 2014, will be considered violations.

In section 746.5(a)(1), BIS provides a list of Export Control Classification Numbers (“ECCNs”) and a list of EAR99 items identified as the Russian Industry Sector Sanction List.  The specific ECCNs restricted for export to Russia are ECCNs 0A998 (newly added), 1C992, 3A229, 3A231, 3A232, 6A991, 8A992, and 8D999 (also newly added).  The Russian Industry Sector Sanction List, consisting of items identified by their Schedule B numbers and descriptions, includes, but is not limited to, drilling rigs, parts for horizontal drilling, drilling and completion equipment, subsea processing equipment, Arctic-capable marine equipment, wireline and down hole motors and equipment, drill pipe and casing, software for hydraulic fracturing, high pressure pumps, seismic acquisition equipment, remotely operated vehicles, compressors, expanders, valves, and risers.

U.S. and non-U.S. exporters and reexporters should carefully examine the Russian Industry Sector Sanction List and relevant ECCNs to determine whether any items recently shipped, in process, or intended for future export, reexport, or transfer, are covered.

As a result of U.S. and European Union (“EU”) cooperation, the list of restricted items is virtually identical to the items included in Annex II of the EU Regulation issued July 31, 2014.  However, the items actually controlled under the respective lists may differ because of divergent classification interpretations between the United States and the EU.  For further details on EU restrictions, see Reed Smith’s recent update here.

Brazilian Data Protection Authority fines Internet Provider $1.59m

This post was written by Cynthia O’Donoghue and Kate Brimsted.

In July, the Brazilian Department of Consumer Protection and Defence (‘DPDC’) fined the telecom provider Oi 3.5 million reals ($1.59 million) for recording and selling its subscribers’ browsing data, in a case based on Brazilian consumer law dating back to 1990.

The DPDC investigated allegations that Oi had entered into an agreement with British online advertising firm Phorm Inc. to develop an Internet activity monitoring program called ‘Navegador’. The investigation confirmed that this program was in use and actively collected the browsing data of Oi’s broadband customers.

The browsing data was collected and stored in a database of user profiles, with the stated purpose of improving the browsing experience. Oi then sold this data to behavioural advertising companies without having obtained the consent of its customers.

The amount of the fine imposed took into account several factors, including the economic benefit to Oi, its financial condition, and the serious nature of the offence. The fine was issued after Oi suspended its use of the Internet activity monitoring software.

Oi denied violating customer privacy and claimed that use of the Internet monitoring program was overseen by government regulators. Phorm Inc. denied that any of the data collected from Oi’s customers was sold, and said that all relevant privacy regulations had been adhered to strictly.

The fine serves as a warning that Brazil will take strong action to enforce its new Internet law.