St. Ledger-Roty & Olson LLP
Proud sponsor of 1410 Q Street: DC’s Innovation Hot Spot!

PRIVACY & INFORMATION LAW UPDATE
July 2011
A bimonthly update of trends and developments in privacy law & policy

Prepared by Karen L. Neuman

  • You are receiving this publication because of your interest in privacy, information management & data security. It is not intended to be and should not be considered legal advice.
  • Not interested? Unsubscribe here. Know someone who might be? Please forward.
  • If someone sent you this publication subscribe here to receive future issues.
  • To view previous issues click here.

In this Issue:
FEATURE ARTICLE: Through the Looking Glass: Can General Audience Sites & Apps Talk to Alice?
FTC Releases 10-Year Regulatory Review Calendar & Announces Changes to Regulatory Review Process
FTC Announces $1.8 Million Settlement for FCRA Violations
Supreme Court Invalidates Vermont’s Physician-Privacy Prescription Drug Marketing Law
Location Privacy Protection Act of 2011 Introduced in Senate
Payment Card Industry Security Council Issues Guidelines for Virtualized Environments
Operators of Online Virtual Worlds Agree to $3 Million Settlement in FTC COPPA Enforcement Action
CA PUC Smart Grid Privacy Decision
Coming Next

Feature Article:
Through the Looking Glass: Can General Audience Sites & Apps Talk to Alice?

Many operators of general audience websites and mobile apps may be surprised to learn that they could be subject to the Children’s Online Privacy Protection Act (COPPA) and its parental notice and consent, and data integrity and security requirements. COPPA is often mistakenly associated with children-only websites but it can also apply to “general audience” sites and online services that attract children. The widespread use of social media and mobile technologies for marketing by businesses, together with the adoption of mobile computing by children, can create complicated COPPA obligations. Recent enforcement actions by the Federal Trade Commission (FTC) leave no doubt about the risks and consequences of non-compliance.

Hotels and resorts (particularly those that market “kids club” or similar activities); apparel and retail stores (particularly those with brand-specific websites or mobile apps); online social gaming and virtual worlds (including those on social media or mobile platforms); music and video sites or apps; social “check-in” and other geolocation-based services; social media platforms (including brand “fan” pages); entities that offer online contests or sweepstakes; and charitable giving sites are all destinations likely to attract today’s device-tethered, technology-savvy children.
Read more...


FTC Releases 10-Year Regulatory Review Calendar & Announces Changes to Regulatory Review Process
On July 7, 2011, the Federal Trade Commission (FTC) published its 10-year regulatory review calendar, initiating a process that was put in place in 1992 to conduct a regular review of all of the agency’s rules and guides on a rotating basis.
Read more...


FTC Announces $1.8 Million Settlement for FCRA Violations
On June 28, 2011 the FTC announced that Teletrack, Inc. (Teletrack), a consumer reporting agency, agreed to pay $1.8 million to settle charges that it sold credit reports to marketers in violation of the Fair Credit Reporting Act (FCRA).
Read more...


Supreme Court Invalidates Vermont’s Physician-Privacy Prescription Drug Marketing Law
On June 23, 2011, the U.S. Supreme Court invalidated a Vermont law that prohibited the sale of prescription drug data to pharmaceutical companies for commercial uses on grounds that the law impermissibly infringed First Amendment speech rights. The 6-3 decision affirmed a ruling by the U.S. Court of Appeals for the Second Circuit and resolved a split among circuits resulting from a First Circuit ruling that upheld nearly identical laws in Maine and New Hampshire.[1]

As we noted previously, the case, Sorrell v. IMS Health, Inc.,[2] could have far-reaching implications for the manner and extent to which government may restrict the commercial use of non-public personal information. This case could also mark the beginning of a trend by businesses that acquire and use personal data to challenge government privacy regulation on First Amendment grounds.
Read more...


Location Privacy Protection Act of 2011 Introduced in Senate
On June 15, 2011, Sen. Franken (D-MN) and Sen. Blumenthal (D-CT) introduced a measure intended to address mounting concerns about the unauthorized collection and disclosure of location information from consumers’ mobile devices. If enacted, the Location Privacy Protection Act of 2011 would apply to companies like Apple and Google, as well as to app developers.
Read more...


Payment Card Industry Security Council Issues Guidelines for Virtualized Environments
By Karen L. Neuman and Ari Z. Moskowitz
On June 14, 2011, the virtualization subcommittee of the Payment Card Industry Security Standards Council (Council), the entity that develops and implements the Payment Card Industry Data Security Standard (PCI DSS), issued new PCI data security guidelines for cloud-based or virtualized environments. The guidelines are intended to assist organizations with meeting the 2012 compliance deadline for PCI DSS 2.0. They include 6 objectives that encompass 12 requirements.
Read more...


Operators of Online Virtual Worlds Agree to $3 Million Settlement in FTC COPPA Enforcement Action
On May 12, 2011, the Federal Trade Commission (FTC) announced that operators of 20 online virtual world websites have agreed to pay $3 million to settle charges that the sites violated the Children’s Online Privacy Protection Rule (COPPA Rule) and Section 5 of the Federal Trade Commission Act by collecting and disclosing personal information from hundreds of thousands of children under the age of 13 without obtaining prior parental consent. The settlement is the largest civil penalty to date for a COPPA violation. In addition to the civil penalty, the defendants are permanently barred from misrepresenting their information practices regarding children.
Read more...


CA PUC Smart Grid Privacy Decision
On May 6, 2011, the California Public Utilities Commission (CPUC) issued a decision adopting proposed rules to protect the privacy and security of customer usage data of the State’s investor-owned electric utilities and clarifying related obligations of the utilities and third-party contractors. The rules would also apply, through tariffs, to certain third parties that are responsible for “system, grid, […] operational needs or energy efficiency programs,” including Home Area Network (HAN) devices that automatically transfer residential data solely to those devices. Application of the proposed rules to other entities, including research facilities, could depend on a number of factors, including whether the facilities are subject to separate statutory authority governing access to and use of utility customer data, whether the data is aggregated or anonymized (and cannot be de-anonymized), or the facilities’ contractual relationships with covered utilities or third parties.
Read more...


Coming Next:

  • The US-EU Safe Harbor Privacy Framework: Strategic Advantages for U.S.-Based Businesses
  • Food for Thought: The EU Cookie Law – Is Your Company in Compliance?
  • Privacy Law Developments in India: Impact on U.S. Businesses

Feature Article:
Through the Looking Glass: Can General Audience Sites & Apps Talk to Alice?

Many operators of general audience websites and mobile apps may be surprised to learn that they could be subject to the Children’s Online Privacy Protection Act (COPPA) and its parental notice and consent, and data integrity and security requirements. COPPA is often mistakenly associated with children-only websites but it can also apply to “general audience” sites and online services that attract children. The widespread use of social media and mobile technologies for marketing by businesses, together with the adoption of mobile computing by children, can create complicated COPPA obligations. Recent enforcement actions by the Federal Trade Commission (FTC) leave no doubt about the risks and consequences of non-compliance.

Hotels and resorts (particularly those that market “kids club” or similar activities); apparel and retail stores (particularly those with brand-specific websites or mobile apps); online social gaming and virtual worlds (including those on social media or mobile platforms); music and video sites or apps; social “check-in” and other geolocation-based services; social media platforms (including brand “fan” pages); entities that offer online contests or sweepstakes; and charitable giving sites are all destinations likely to attract today’s device-tethered, technology-savvy children. Children visiting these destinations are likely to disclose personal information that triggers the COPPA rule.

THE COPPA RULE.

The COPPA rule, promulgated and enforced by the FTC, is intended to protect children from unauthorized contact from marketers or other adults. It currently prohibits operators of commercial websites and online services directed to children -- as well as general audience sites that attract children -- from collecting personal information from kids under the age of 13 without first seeking and obtaining “verifiable” parental consent. Parents must be given access to their children’s personal data and the opportunity to have it deleted from the operator’s system. The rule’s application to general audience sites is triggered when those sites “knowingly” collect personal information from children. A site is deemed to knowingly collect personal information if a child discloses that he or she is under 13 or the site otherwise learns that the child is under 13.

Personal information includes a physical address, personal e-mail address, phone number, social security number or IM identifier that contains a child’s e-mail address. It also can include gender or date of birth information if such information can be combined with other data to identify and contact a specific child.

Perhaps the most perplexing compliance challenges involve the rule’s prior verifiable parental consent and age verification requirements. Many of the rule’s parental consent mechanisms are either impractical given the real-time nature of online interactions or activities (e.g., notifying parents of intent to collect personal information and asking that they fax consent or call an “800” number staffed by trained professionals), or reflect outdated assumptions about how to guard against a child masquerading as a parent (e.g., permitting a completed credit card transaction to serve as verifiable parental consent). Many of the rule’s age screening mechanisms (for example, drop-down menus with dates of birth) are ineffective against kids who can “outsmart” this approach or who keep trying until they “pass.” Blocking repeated attempts or implementing age screening technologies can be cost-ineffective or otherwise impractical, particularly for small operators.
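To make the mechanics concrete, the date-of-birth age gate with blocking of repeated attempts described above can be sketched in a few lines of Python. This is an illustrative sketch only: the function names, the session-based blocking mechanism and the age threshold handling are hypothetical assumptions for this example, not an FTC-endorsed consent or screening mechanism.

```python
from datetime import date
from typing import Optional, Set

COPPA_AGE_THRESHOLD = 13  # the rule's current under-13 cutoff


def age_on(dob: date, today: date) -> int:
    """Full years elapsed between date of birth and today."""
    years = today.year - dob.year
    # Subtract a year if this year's birthday has not yet occurred
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years


def screen_visitor(dob: date, session_id: str,
                   blocked_sessions: Set[str],
                   today: Optional[date] = None) -> bool:
    """Return True if the visitor passes a neutral age gate.

    Once a session fails, it stays blocked so a child cannot simply
    re-enter an older date of birth until they "pass".
    """
    today = today or date.today()
    if session_id in blocked_sessions:
        return False
    if age_on(dob, today) < COPPA_AGE_THRESHOLD:
        blocked_sessions.add(session_id)
        return False
    return True


blocked = set()
# An 11-year-old is blocked, and a retry in the same session also fails
print(screen_visitor(date(2000, 1, 1), "s1", blocked, date(2011, 7, 1)))  # False
print(screen_visitor(date(1990, 1, 1), "s1", blocked, date(2011, 7, 1)))  # False
print(screen_visitor(date(1990, 1, 1), "s2", blocked, date(2011, 7, 1)))  # True
```

As the article notes, even session-based blocking like this is easily defeated (for example, by starting a new session), which is one reason more robust age verification technologies are under consideration.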

ENFORCEMENT.

The FTC has brought a number of high profile enforcement actions against general audience site operators alleging noncompliance with the COPPA rule. These actions have resulted in significant fines, remedial compliance obligations, and FTC oversight. In many instances the cost of complying with remedial measures can exceed the amount of civil penalties.

For example, in 2009 the FTC settled charges with Iconix Brand Group (Iconix) for knowingly collecting, using or disclosing personal information from children under 13 without obtaining prior parental consent. Iconix operated a number of general audience apparel websites, including some that sold brands appealing to children and teens. Visitors to the company’s brand-specific websites were required to provide personal information, including full name, e-mail address, zip code, and in some cases mailing address, gender, date of birth and phone number, in order to receive brand updates, enter sweepstakes, participate in interactive brand-awareness campaigns, post photos and share stories. Under the settlement, Iconix was fined $250,000 and required to implement a number of remedial measures, including deleting all personal information obtained and stored in violation of COPPA, distributing the settlement order and the FTC’s COPPA compliance materials to company employees, and complying with certain reporting and record-keeping obligations.

In 2008 the FTC settled similar charges against Sony BMG Music Entertainment (Sony). At the time Sony operated over 1,000 artist and label websites, 196 of which collected personal information from children under 13. Like Iconix, Sony required visitors to disclose personal information, including date of birth, when registering to use the sites. Sony agreed to pay $1 million to settle charges that it violated COPPA by collecting, maintaining and disclosing personal information from thousands of under-13 children without their parents’ consent. Sony was also required to delete all personal information collected in violation of the rule and comply with certain employee training and record-keeping measures.

UPDATING THE COPPA RULE.

In March 2010 the FTC initiated a comprehensive review of the COPPA rule to address privacy risks posed by the use of mobile technology by children to access the Internet, including interactive gaming, social and other media. Proposed changes are expected to be announced this summer. Accordingly, the rule’s reach may soon expand, making it more difficult for businesses to engage young people and even adults online.

Potential changes include mandating the adoption of age screening and verification technologies that adequately reflect changes in behavior and technology; expanding the statutory definition of “Internet” to address children’s adoption of mobile computing technologies; and expanding the definition of “personal information” to cover passively collected data such as mobile unique device identifiers and persistent identifiers such as IP addresses. The Commission may also consider extending the rule’s protections to teenagers older than 13.

CONGRESSIONAL INITIATIVES TO PROTECT CHILDREN’S ONLINE PRIVACY.

Congress has also been paying close attention to privacy risks (as well as broader consumer protection risks) facing minors, holding hearings and considering measures intended to address those risks. On June 15, 2011, Senators Franken (D-MN) and Blumenthal (D-CT) introduced a location privacy bill containing a provision that would criminalize the “knowing and intentional aggregation and sale” of location data of children 10 years old and younger. In April 2011 the Chair of the Senate Commerce Committee, Senator Rockefeller (D-WV), indicated that Congress might consider amending COPPA to reflect children’s use of mobile technology and social media. On May 13, 2011, Representatives Markey (D-MA) and Barton (R-TX) released a discussion draft of the Do Not Track Kids Act of 2011. The bill would give parents the choice of withholding consent for tracking and targeting ads to children under 13. In February 2011 several Democratic lawmakers, including Representative Markey, sent a letter to the FTC calling for agency review of in-app purchases by children on Apple and Google/Android devices after a Washington Post article highlighted the ability of children to guess parental passwords and use them to incur substantial, unauthorized charges.

Meanwhile, technology continues to evolve at an ever-accelerating pace, posing challenges to policymakers who are trying to craft a privacy regulatory framework for children that can adapt to technology without erecting unintended obstacles to innovation.

CONCLUSION.

As industry awaits anticipated changes to COPPA, operators of general audience websites and mobile apps may already need to add COPPA to their compliance checklists. Operators should be familiar with COPPA’s current requirements while anticipating likely changes, and review their data collection, retention, sharing and use practices with an experienced professional to identify and address potential compliance issues.


Back to Top


FTC Releases 10-Year Regulatory Review Calendar & Announces Changes to Regulatory Review Process
On July 7, 2011, the Federal Trade Commission (FTC) published its 10-year regulatory review calendar, initiating a process that was put in place in 1992 to conduct a regular review of all of the agency’s rules and guides on a rotating basis.

According to a statement on the FTC’s website, the purpose of the review is to address changes in technology and the marketplace by placing each rule on a 10-year review calendar, during which public comment is sought on the following questions: 1) what is the economic impact of the rule; 2) is there a continuing need for the rule; 3) are there possible conflicts between the rule and state, local, or other federal laws or regulations; and 4) has the rule been affected by any technological, economic, or other industry changes. The calendar identifies the rule, the year it is slated for review and the rule’s review status.

Privacy and consumer data items on the calendar that are currently under review include: 1) the Use of Prenotification Negative Option Plans Rule; 2) the Mail or Telephone Order Merchandise Rule; 3) the Children’s Online Privacy Protection Act Rule; and 4) the Telephone Order Merchandise Rule. Privacy and consumer data rules slated for review starting this year and continuing through 2020 include: 1) the Standards for Safeguarding Customer Information Rule; 2) the Disposal of Consumer Report Information and Records Rule; 3) the “Red Flags” Rule; 4) the Privacy of Consumer Financial Information Rule; and 5) the Health Breach Notification Rule (for non-HIPAA covered entities).

The FTC also announced that it is examining whether changes are necessary to its regulatory review process. The agency is seeking public comment on various aspects of that process, including how often it should review its rules and guides, and what changes can be put in place to make the process “more responsive to the needs of consumers and businesses.”

Your business could be impacted by either of these proceedings. Reference to the FTC regulatory review calendar will give you a good indication of agency action relevant to your business, including comment periods for proposed rules or guides. Participation in these proceedings is an important opportunity to potentially impact the regulatory outcome by educating regulators about how contemplated action could affect your industry. The regulatory review calendar is also a useful resource for alerting you to rules or guides you may have been unaware of but that your business might be subject to.

Changes to the FTC’s regulatory review process could also impact your business and you may want to consider submitting comments that alert the agency about the possible impact of its regulatory review procedures on your business.

Back to Top


FTC Announces $1.8 Million Settlement for FCRA Violations
On June 28, 2011 the FTC announced that Teletrack, Inc. (Teletrack), a consumer reporting agency, agreed to pay $1.8 million to settle charges that it sold credit reports to marketers in violation of the Fair Credit Reporting Act (FCRA).

The Complaint alleged that Teletrack created a marketing database of consumer information that the company collected through its credit reporting business. The information included lists of consumers who had applied for non-traditional “credit products” -- for example, “payday” and non-prime automobile loans. Teletrack sold this information to marketers and other third parties who in turn sought to use it to target distressed customers in need of alternative sources of credit.

The FTC alleged that these marketing lists were “credit reports” subject to the FCRA because they contained information about consumers’ creditworthiness. The FCRA makes it illegal to sell credit reports without a specific “permissible purpose”. Since marketing is not a permissible purpose under the Act, the FTC charged Teletrack with violating the Act.

In addition to the civil penalty, the settlement order requires Teletrack to provide credit reports only to entities that Teletrack has reason to believe have a permissible FCRA purpose for obtaining them, or as otherwise allowed under the FCRA. The order also imposes certain reporting and record-keeping requirements on Teletrack to ensure compliance with the terms of the settlement.

This case is another example of the proactive approach the FTC is taking to protect consumer privacy in a variety of contexts. The FTC has brought a number of high profile enforcement actions that send a clear signal that the agency believes it has the tools it needs to protect privacy, even as Congress considers expanding the FTC’s rulemaking and enforcement authority.

This case can also be seen as a precursor to heightened FTC enforcement involving financial services products as the agency awaits an anticipated spate of related rulemakings by the newly created Consumer Financial Protection Bureau which, by statute, leaves enforcement of those rules with the FTC.

Finally, this case should not be seen as limited solely to data brokers that sell consumer financial or credit information for commercial purposes; data brokers that sell this information for other uses -- notably employee screening and background checks -- are also subject to the FCRA. Accordingly, companies that sell consumer data for commercial and other purposes should be familiar with the restrictions that the FCRA places on the sale of consumer data to third parties.

Back to Top


Supreme Court Invalidates Vermont’s Physician-Privacy Prescription Drug Marketing Law
On June 23, 2011, the U.S. Supreme Court invalidated a Vermont law that prohibited the sale of prescription drug data to pharmaceutical companies for commercial uses on grounds that the law impermissibly infringed First Amendment speech rights. The 6-3 decision affirmed a ruling by the U.S. Court of Appeals for the Second Circuit and resolved a split among circuits resulting from a First Circuit ruling that upheld nearly identical laws in Maine and New Hampshire.[1]

As we noted previously, the case, Sorrell v. IMS Health, Inc.,[2] could have far-reaching implications for the manner and extent to which government may restrict the commercial use of non-public personal information. This case could also mark the beginning of a trend by businesses that acquire and use personal data to challenge government privacy regulation on First Amendment grounds.

Vermont’s Prescription Confidentiality Law[3] was enacted in 2007 to address the legislature’s concerns with the marketing practice known as “detailing”, which allows marketers to use physician prescription information to determine which drugs are likely to appeal to doctors and “how best to present a particular sales message.”

“Detailing” involves the purchase of physician prescription data by data mining companies from pharmacies (which are required by law to maintain records about both the prescribing physician and the patient). The data is then combined with information from other databases and sold to drug companies for use in brand-name prescription drug marketing campaigns directed at individual physicians. The data is also used to monitor and evaluate marketing campaigns by drug companies and individual detailers. While the data did not include patient names, it did include information that identified specific physicians. Privacy advocates contended that the information could easily be combined with other available data to identify individual patients and disclose their prescription drug histories.

Vermont’s law attempted to address this practice by prohibiting any health insurer, self-insured employer, electronic transmission intermediary or pharmacy from selling or otherwise using “prescriber-identifiable information for marketing or promoting a prescription drug” without the doctor’s consent. The law further prohibited pharmaceutical manufacturers and marketers from using “prescriber-identifiable information for marketing or promoting a prescription drug” without prior physician consent.

In reaching its decision the Court evaluated the law’s constitutionality on First Amendment (as opposed to privacy) grounds. The Court first concluded that the law warranted heightened scrutiny because it imposed content- and speaker-based burdens on protected expression.

Applying this standard, the Court noted that the law prohibited pharmaceutical companies from using prescription data for marketing uses but permitted its acquisition and use for other types of “speech” by other speakers. Given the “widespread availability and many permissible uses” of the data, the Court concluded that the State’s asserted interest in protecting physician confidentiality was undermined by the State’s failure to narrowly tailor the statute, thereby preventing Vermont from justifying “the burdens” that the law imposed on protected expression.

Rather than protecting privacy, as asserted by Vermont, the Court concluded that the law was intended to suppress a specific type of speech by specific speakers that the legislature looked upon with disfavor. Interestingly, the Court emphasized that had the legislature imposed a more comprehensive privacy regime, for example by restricting all disclosure of the data except in only “a few…well justified circumstances”, it would have viewed the law “through quite a different lens.”

Although this case arose in the context of medical privacy, its outcome could provide useful guidance to legislators about how to craft laws addressing broader privacy concerns that can withstand First Amendment scrutiny. By the same token, the decision offers businesses that rely on personal data for marketing or targeted advertising a potentially new basis for challenging privacy legislation intended to curtail those practices.


[1] See IMS Health Inc. v. Ayotte, 550 F.3d 42 (1st Cir. 2008), cert. denied, 129 S. Ct. 2864 (2009), and IMS Health Inc. v. Mills, 616 F.3d 7 (1st Cir. 2010).
[2] No. 10–779, June 23, 2011.
[3] Prescription Drug Cost Containment Law, Vt. Stat. Ann. tit. 18, § 4631 (2007–2008).

Back to Top


Location Privacy Protection Act of 2011 Introduced in Senate
On June 15, 2011, Sen. Franken (D-MN) and Sen. Blumenthal (D-CT) introduced a measure intended to address mounting concerns about the unauthorized collection and disclosure of location information from consumers’ mobile devices. If enacted, the Location Privacy Protection Act of 2011 would apply to companies like Apple and Google, as well as to app developers.

The bill attempts to address privacy concerns by closing purported loopholes in existing federal law (specifically the Electronic Communications Privacy Act and certain provisions of the Cable Act and Communications Act), by requiring covered companies and businesses to obtain “express consent” from users of smartphones, iPads and similar devices before collecting and sharing information about those users’ location with third parties.

The measure would also create criminal penalties for apps that knowingly disclose geolocation information “while knowing and intending that domestic violence or stalking will occur as a result of the disclosure,” and would criminalize the “knowing and intentional aggregation and sale” of location data of children 10 years old and younger. The bill also calls for certain law enforcement measures, including study of, and training on, dating violence and domestic violence crimes involving location-based technology.

This bill is one of a number of privacy-related bills pending in Congress. Prior efforts to enact comprehensive privacy legislation have been unsuccessful. Deficit reduction, employment and defense matters continue to be a principal focus in Congress, and it is unclear whether discrete privacy measures like this one will gain traction and eventually become law. Nevertheless, a great deal of attention has been drawn to mobile privacy as a result of high-profile data breaches, lawsuits over location information, and calls for action by privacy advocacy groups. Accordingly, mobile platforms, device manufacturers and app developers should monitor developments involving this bill and look for opportunities to shape the regulatory environment should it advance.


Back to Top


Payment Card Industry Security Council Issues Guidelines for Virtualized Environments
By Karen L. Neuman and Ari Z. Moskowitz
On June 14, 2011, the virtualization subcommittee of the Payment Card Industry Security Standards Council (Council), the entity that develops and implements the Payment Card Industry Data Security Standard (PCI DSS), issued new PCI data security guidelines for cloud-based or virtualized environments. The guidelines are intended to assist organizations with meeting the 2012 compliance deadline for PCI DSS 2.0. They include 6 objectives that encompass 12 requirements.

If your business retains payment card data in a cloud-based or virtualized environment, these guidelines will apply. Noncompliance could affect your ability to process payment card transactions. That said, the Council acknowledged that compliance may be difficult to achieve for some businesses.

Merchants or other businesses that engage cloud service providers to store credit card data should conduct an internal PCI DSS compliance evaluation and perform due diligence on prospective cloud service providers to ensure PCI DSS compliance before engaging them. Merchant due diligence should include an assessment of the virtual environment to ensure that the service provider implements procedures recommended by the Council, including network and access controls, segmented authentication, encryption and logging; the goal of these measures is to quarantine each of the service provider’s customer environments from the others.

To facilitate a merchant’s due diligence about a virtualized environment, the guidelines impose an obligation on service providers to demonstrate “rigorous” evidence of “adequate” controls, including the results of the service provider’s own PCI DSS evaluation. These disclosures should also assist merchants in negotiating for contractual language to further ensure compliance.

In addition to conducting due diligence, merchants or other businesses that use cloud-based services should:

  • undertake a holistic evaluation of their internal operations, including interactions between critical departments – e.g., network security, business, IT and even marketing -- to determine what obstacles, if any, could interfere with the interoperability of the security functions of each department and ultimately PCI DSS compliance;
  • evaluate vendor agreements for sufficient protection against security threats, including breach or catastrophic events resulting in data loss;
  • negotiate for audit rights to minimize regulatory risk, even though the attributes of virtualized environments may make it difficult to conduct a comprehensive audit;
  • update privacy, data security and other internal policies; and
  • revise and update employee training materials and conduct appropriate employee training.

One interesting question is the interplay, if any, between the virtualization guidelines and the Payment Application Data Security Standard (PA DSS). This standard applies to mobile transactions and governs the development of applications that store, process or transmit cardholder data. Ten days after issuing the virtualization guidelines, the Council issued clarification about which types of mobile payment applications will be subject to the PA DSS, as well as eligibility for PA DSS validation. However, not all mobile payment applications are eligible for PA DSS validation. According to a statement on the Council’s website, the Council focused on identifying risks associated with validating mobile payments under the PA DSS standard. A major risk factor was the ability of an application’s environment to support PCI DSS compliance.

With the advent of in-app transactions, digital currency and mobile “wallets,” app developers should be familiar with both standards and understand any compliance obligations each could impose.

The Council’s virtualization guidelines can be seen as a useful tool for merchants and service providers seeking PCI DSS compliance. Well-publicized breaches of cloud-based environments demonstrate their timeliness. The extent of their application to emerging technologies remains to be seen.

 

Back to Top


Operators of Online Virtual Worlds Agree to $3 Million Settlement in FTC COPPA Enforcement Action
On May 12, 2011, the Federal Trade Commission (FTC) announced that the operators of 20 online virtual world websites have agreed to pay $3 million to settle charges that the sites violated the Children’s Online Privacy Protection Rule (COPPA Rule) and Section 5 of the Federal Trade Commission Act by collecting and disclosing personal information from hundreds of thousands of children under the age of 13 without prior parental consent. The settlement is the largest civil penalty to date for a COPPA violation. In addition to the civil penalty, the defendants are permanently barred from misrepresenting their information practices regarding children.

This settlement signals that FTC COPPA enforcement is not being put on hold pending the outcome of proceedings that were initiated in 2010 to consider proposed changes to the COPPA Rule. (The changes attempt to address the impact on children’s privacy of the widespread adoption of mobile communications technologies over which children are able to access websites, including virtual worlds, and download mobile applications.)

The FTC charged that Playdom, Inc. and one of its executives operated online virtual worlds, including “2 Moons,” “9 Dragons” and “My Diva Doll,” where users could access social games and engage in numerous activities. The violations occurred between 2006 and 2010, including after Playdom’s acquisition of the sites’ developer studio in May 2010.

At least one of the sites was directed to children; the others were general audience websites that attracted hundreds of thousands of children. The complaint alleged that the defendants violated the COPPA Rule because they: 1) collected children’s ages and email addresses at registration, and then enabled children to publicly post on personal profile pages or online community forums personal information, including their full names, email addresses, instant messenger IDs and location information; and 2) failed to provide proper parental notice and obtain prior verifiable parental consent. The FTC further alleged that the defendants violated the Federal Trade Commission Act because Playdom’s privacy policy misrepresented that the company prohibited children under 13 from posting personal information online.

Operators of both children’s and general audience websites should be familiar with their information collection, sharing, retention and data security practices, and ensure that those practices are accurately reflected in the sites’ policies. Policies should be reviewed by an experienced professional and revised to reflect changes brought about by the adoption of new products, services, technologies or platforms.

In addition, this case appears to hold entities that acquire or otherwise take ownership of a commercial website liable for COPPA violations that occurred prior to a merger or acquisition. Accordingly, COPPA Rule compliance should be incorporated into the due diligence analysis undertaken in connection with any change in ownership of a commercial website.

 

Back to Top


CA PUC Smart Grid Privacy Decision
On May 6, 2011, the California Public Utilities Commission (CPUC) issued a decision adopting proposed rules to protect the privacy and security of customer usage data of the State’s investor-owned electric utilities and clarifying related obligations of the utilities and third-party contractors. The rules would also apply, through tariffs, to certain third parties that are responsible for “system, grid, [] operational needs or energy efficiency programs,” including Home Area Network (HAN) devices that automatically transfer residential data solely to those devices. Application of the proposed rules to other entities, including research facilities, could depend on a number of factors, including whether the facilities are subject to separate statutory authority governing access to and use of utility customer data, whether the data is aggregated or anonymized (and cannot be de-anonymized), or the facilities’ contractual relationships with covered utilities or third parties.

In addition to proposing a framework for protecting privacy and addressing certain jurisdictional issues, the decision appears to set the stage for real time pricing by requiring covered utilities to provide customers with access to their smart meter data, as well as cost, usage, pricing and bill forecast information, and notification when a rate tier is exceeded. The CPUC’s decision comes at a time when smart grid technologies are being integrated with legacy utility systems to enable consumers to monitor and control energy consumption and help utilities to forecast use and manage load.

The proposed framework takes into account a smart grid ecosystem consisting of numerous relationships and data uses. These relationships and data uses are reflected in a number of key terms that are defined as follows:

  • Covered information: any usage information obtained through the use of … Advanced Metering Infrastructure when associated with any information that can reasonably be used to identify a customer, [but not] information from which identifying information has been removed such that an individual, family, household, residence or non-residential customer cannot reasonably be identified or re-identified.
  • Covered entity: any electrical corporation or any third party that collects, stores, uses or discloses covered information relating to 11 or more customers who obtain this information from an electrical corporation or through the registration of a locked device that transfers information to that third party.
  • Primary purpose for collection and use: (1) providing for or billing for electrical power; (2) fulfilling the electric utility system’s other grid or operational needs; (3) providing services as required by state or federal law or specifically authorized by an order of the Commission; or (4) implementing demand response, energy management, or energy efficiency programs operated by, or on behalf of and under contract with, an electric…corporation.
  • Secondary purpose: any purpose that is not a primary purpose.
  • Reasonably necessary: refers to meeting a primary purpose or a secondary purpose when the secondary purpose is authorized by the customer.

The proposed framework also incorporates widely recognized Fair Information Practice (FIP) principles of Notice/Transparency/Consent, Purpose Specification, Individual Access & Control, Data Minimization, Use & Disclosure Limitation, and Data Integrity/Security. Those principles and corresponding rules are generally summarized as follows:

Use and Disclosure Limitations.

Covered utilities may collect and use covered information for primary purposes without customer consent. Subject to certain exceptions, all third parties must obtain prior customer consent even for primary purposes. Those exceptions include:

  • The third party, under contract with the utility, agrees to abide by data collection and use restrictions at least as protective as those the utility operates under, uses the data for a primary purpose, and the utility permits customers to opt-out.
  • The above exception applies where the third party intends to disclose the data to another third party.

Use of covered information for secondary purposes requires prior express written customer consent, which, if granted, expires after two years.

Transparency.

Covered entities would be required to provide customers with clear notice of the collection, use, retention and disclosure of all categories of covered information. Upon request, utilities would be required to inform the CPUC with whom they are sharing covered data.

Access and Control.

Utility customers would have access to their covered information at least at the same level of detail that the covered entity provides to third parties, and would be able to correct any inaccuracies.

Data Minimization.

Covered entities would be permitted to collect only as much covered information as is reasonably necessary, retain it only as long as reasonably necessary, and disclose to third parties only as much as is reasonably necessary.

Data Integrity/Security.

Covered entities would be required to ensure the reasonable accuracy of covered information that they collect, use and disclose. They would also be required to create and implement reasonable administrative, technical and physical safeguards to protect covered information. The rules also establish a breach notification protocol and impose certain privacy and data security accountability and audit reporting requirements on covered entities.

A subsequent phase of the proceeding will address extension of the rules to gas companies, community choice aggregators, and non-investor owned utilities.

The proposed decision can be viewed here. The CPUC was accepting comments on it until May 26, 2011.

 

Back to Top


Coming Next:

  • The US-EU Safe Harbor Privacy Framework: Strategic Advantages for U.S.-Based Businesses.
  • Food for Thought: The EU Cookie Law – Is Your Company in Compliance?
  • Privacy Law Developments in India: Impact on U.S. Businesses

 

Back to Top


Copyright © 2010 St. Ledger-Roty & Olson, LLP.
1250 Connecticut Avenue, N.W., Suite 200, Washington, D.C. 20036