St. Ledger-Roty & Olson LLP

PRIVACY & INFORMATION LAW UPDATE
March 2012
A bimonthly update of trends and developments in privacy law & policy

Karen L. Neuman, Editor

  • You are receiving this publication because of your interest in privacy and data security. It is for informational (including advertising) purposes only and is not a substitute for legal advice.
  • Not interested? Unsubscribe. Please forward to someone who might be. View previous issues.
  • If someone sent you this publication subscribe to receive future issues.

In this Issue:
FEATURE ARTICLE:
Looked at Clouds from “Both Sides Now”:
NIST Guidelines on Security & Privacy in the Cloud
FTC Issues Final Privacy Report
Department of Commerce Kicks off Multistakeholder Process with Request for Comments
FTC Approves New COPPA Compliance Safe Harbor
White House Releases Privacy Framework
California Attorney General Announces App Privacy “Joint Principles” with Application Platform Providers
FTC Report Warns Mobile App Industry It Must Do More to Disclose Data Collection Practices Involving Kids

UPDATES:
US-EU Safe Harbor Intact for Now
Federal Appeals Court Rules No Damages under VPPA for Violation of Record Destruction Requirement
European Mobile Service Providers Unveil Voluntary App Privacy Guidelines
Privacy Group Challenges New FERPA Rule

ANNOUNCEMENTS & EVENTS:
Jill Josephson Joins SLRNO

FEATURE ARTICLE:
Looked at Clouds from “Both Sides Now”:
NIST Guidelines on Security & Privacy in the Cloud

By Karen Neuman
     Ari Moskowitz


A year-long effort by the National Institute of Standards and Technology (NIST) that was launched by the Obama administration’s push to accelerate government adoption of cloud computing culminated in January with the release of NIST's Guidelines on Security and Privacy in Public Cloud Computing.1 The guidelines are intended for government personnel who are responsible for making decisions about IT security and privacy, as well as private sector users of public (as opposed to private) cloud computing services.2 The guidelines make clear that organizations bear ultimate responsibility for assessing risk and for the security of data stored in the public cloud. At the same time, NIST acknowledges that these organizations surrender direct control over security and, possibly, data ownership.
Read more...


FTC Issues Final Privacy Report
By Karen Neuman

On March 26, 2012 the FTC issued its long-awaited privacy report and framework asking businesses to implement recommended best practices for collecting, using, disclosing and protecting consumer information – both on- and offline. The report, "Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers", is intended to give consumers control over how their information is collected and used. It also calls on Congress to enact privacy (including data security and breach notification) legislation. The report is the culmination of a process initiated two years ago to address perceived threats to privacy posed by rapidly evolving technology that enables the collection and sharing of consumer information and its growing use by businesses for advertising and other purposes.
Read more...


Department of Commerce Kicks off Multistakeholder Process with Request for Comments
By Karen Neuman

On March 5, 2012 the Department of Commerce kicked off the “multistakeholder” process for developing industry privacy self-regulatory codes by publishing a Request for Comments (RFC) in the Federal Register. This process was first introduced by agency officials in public comments earlier this year and is one of the key components of the White House Framework for Protecting Privacy and Promoting Innovation in the Digital Age. The broad array of interests expected to participate (privacy advocates, consumer groups, companies, industry associations, academics, State Attorneys General and even international entities) suggests the inherent challenges of a process intended to address widespread concerns about privacy while staving off legislation and reassuring EU data protection authorities that the U.S. is serious about protecting individual privacy. The original deadline for filing comments was extended from March 26, 2012 to April 5, 2012.
Read more...


FTC Approves New COPPA Compliance Safe Harbor
By Karen Neuman

On February 24, 2012, the Federal Trade Commission (FTC) approved by a 4-0 vote a new Safe Harbor program for companies that are subject to the Children’s Online Privacy Protection Act (COPPA). The program is offered by Aristotle International, Inc.
Read more...


White House Releases Privacy Framework
By Ari Moskowitz

On Thursday, February 23 the White House released its long-awaited report introducing the Administration’s framework for protecting consumer privacy and proposing a Privacy “Bill of Rights”.1 The report, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy, does not call for new rules for businesses collecting information online, but it does establish a roadmap for strengthening privacy protections by working with industry. The report is the culmination of a process that was initiated by the Department of Commerce and builds upon the Department’s December 2010 privacy “Green Paper”, which contained a number of proposals, ranging from implementing FIPs to working with business to create industry self-regulatory codes.
Read more...


California Attorney General Announces App Privacy “Joint Principles” with Application Platform Providers
By Karen Neuman

On February 22, 2012, the California Attorney General announced the adoption of a Joint Statement of Principles with Apple, Amazon, Microsoft, RIM, Hewlett-Packard and Google that will, among other things, require apps using these platforms to have privacy policies and adhere to the representations in them. The companies will be required to enforce the requirement that apps have privacy policies. Developers whose apps breach their privacy policies by violating promises about the use and disclosure of user information may be prosecuted by the California Attorney General. According to the Attorney General, currently only 5 percent of an estimated one million apps have privacy policies.
Read more...


FTC Report Warns Mobile App Industry It Must Do More to Disclose Data Collection Practices Involving Kids
By Karen Neuman

On February 16, 2012, the FTC issued a staff report, Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing, showing the results of a survey of the privacy practices of mobile apps for kids. The survey focused on the Apple App Store and the Android Market, and evaluated the types of apps offered to children, the apps’ disclosures and interactive features, including connectivity with social media, and the ratings and parental controls offered for the apps. The report was issued as industry awaits FTC publication of the updated Children’s Online Privacy Protection Act (COPPA) Rule. It warns of potential enforcement actions following the Agency’s “additional review” during the next six months “to determine whether some mobile apps are violating COPPA”.
Read more...


UPDATES

US-EU Safe Harbor Intact for Now

On March 19, 2012 this year’s Safe Harbor conference in Washington, DC addressed the sustainability of the US-EU Safe Harbor framework for transferring data from Europe to the United States. The Conference followed the release of the European Commission’s (Commission) proposed revision of Data Protection Directive 95/46. The United States is not among the countries deemed “adequate” by the Commission for cross-border data transfers, and organizations transferring data from EU countries to the US must self-certify under the Safe Harbor program. The Commission has expressed concerns about whether the Safe Harbor may be maintained in light of the proposed changes to the Data Protection Directive.
Read more...

Federal Appeals Court Rules No Damages under VPPA for Violation of Record Destruction Requirement

On March 6, 2012, the United States Court of Appeals for the Seventh Circuit issued an opinion in which it ruled that damages are not available under the Video Privacy Protection Act (VPPA) for noncompliance with the law’s data destruction requirement. The only remedy for such a violation is court-ordered destruction of the data. The plaintiffs filed a class action against Redbox, a video rental and DVD “kiosk” operator, under subsections (b) and (c) of the VPPA for unlawful disclosure of personal information and unlawful retention of rental history, respectively. Redbox filed an interlocutory appeal in which it asked the Court to decide whether the statute permits a private right of action for damages to enforce the law’s records destruction section. Relying in part on precedent to construe what the Court characterized as a poorly drafted statute, it concluded that Congress intended to enforce only the unlawful disclosure provision with monetary damages; Congress did not intend to enforce violations of the record destruction provision with an action for damages, particularly in instances where, as in the case before the court, the Plaintiffs suffered no actual harm. The case is Sterk v. Redbox Automated Retail, LLC, No. 12-8002 (7th Cir. March 6, 2012).

European Mobile Service Providers Unveil Voluntary App Privacy Guidelines

On March 1, 2012, the Groupe Spéciale Mobile Association (GSMA) released Privacy Design Guidelines for Mobile Application Development, a report that sets forth voluntary guidelines for European mobile telephone service providers and their mobile applications. The guidelines are intended for all entities in the mobile “app” or service delivery supply and distribution chain that are responsible for collecting and processing a user’s personal information. In other words, the guidelines apply to developers, device manufacturers, platform and OS companies, mobile operators, advertisers and analytics companies.
Read more...


Privacy Group Challenges New FERPA Rule

On February 29, 2012 the Washington, D.C.-based privacy advocacy group EPIC filed a federal court action challenging recent amendments to the Family Educational Rights and Privacy Act (FERPA) rule and asking that the rule be blocked and declared unlawful. The Complaint alleges that the Department of Education exceeded its authority when it adopted proposed changes to the final rule. The changes were adopted following a rulemaking proceeding that was initiated to address how states manage and safeguard student performance and other data collected for record keeping and data-driven education reform.
Read more...


ANNOUNCEMENTS & EVENTS

Jill Josephson Joins SLRNO

SLRNO is pleased to announce that Jill Josephson has joined the firm as counsel in its transactional practice group, where she will focus on technology and business transactions. Jill has over 15 years of experience advising senior management and in-house counsel on operational, commercial, and transactional matters, including e-commerce, software development and distribution, revenue generation, and risk assessment and management. Jill can be reached at jjosephson@stlro.com.


FEATURE ARTICLE:
Looked at Clouds from “Both Sides Now”:
NIST Guidelines on Security & Privacy in the Cloud

By Karen Neuman
     Ari Moskowitz

A year-long effort by the National Institute of Standards and Technology (NIST) that was launched by the Obama administration’s push to accelerate government adoption of cloud computing culminated in January with the release of NIST's Guidelines on Security and Privacy in Public Cloud Computing.1 The guidelines are intended for government personnel who are responsible for making decisions about IT security and privacy, as well as private sector users of public (as opposed to private) cloud computing services.2 The guidelines make clear that organizations bear ultimate responsibility for assessing risk and for the security of data stored in the public cloud. At the same time, NIST acknowledges that these organizations surrender direct control over security and, possibly, data ownership.

The report attributes to cloud computing such common characteristics as “on-demand scalability of highly available and reliable pooled computing resources, secure access to metered services from nearly anywhere, and displacement of data and services from inside to outside the organization”. It identifies three distinct cloud computing “service models”: software-as-a-service, platform-as-a-service, and infrastructure-as-a-service, which span from running applications over the Internet to storing data in, and accessing data from, a remote Internet-connected storage facility.

The two major tenets of the guidelines are context and accountability.

The context in which a particular organization is considering moving to a cloud computing platform is important; there is no one-size-fits-all recommendation because every organization has different security needs.3 Illustrative of this concept is NIST’s example of two government agencies, one of which prepares information for public use and the other of which handles classified data. The security needs of each organization are very different even though both may choose to move to the cloud; they may choose different providers, different service models, or use different language in their contracts.

The guidelines stress that accountability for the security and privacy of information stored in or passed through the cloud cannot be delegated to the cloud provider; instead, it lies with the organization.4

The report identifies both benefits and drawbacks of cloud computing, noting that the benefits particularly apply to small organizations that can take advantage of economies of scale.5 Benefits include “staff specialization,” meaning that a cloud provider will often have more staff focused exclusively on data privacy and security than would a small organization (such as a startup with limited resources to devote to personnel). Similarly, according to the report, cloud providers have the resources to dedicate to maintenance, audits, and security procedures that will better protect the data they store. Other security benefits include having data centrally available to a mobile workforce without having it duplicated on numerous devices (which would increase security vulnerability), as well as more reliable backup and data recovery.

The authors note, however, that “many of the features that make cloud computing attractive [...] can also be at odds with traditional security models and controls.” The listed downsides6 include the shared environment, in which more than one client’s data is separated only by software rather than physically, creating additional security vulnerabilities. The biggest downside is the loss of control that an organization has over its data, as outsourcing data storage exacerbates all of the other security and privacy risks. An organization loses its ability to quickly change and prioritize its data security and privacy policies and procedures when that data is stored by a third party. The authors also note that legal obligations and protections for data may change when it is moved to and held by a third party.

To address these benefits and downsides, NIST presents four key guidelines7 and then a series of more detailed recommendations8 for securing data and selecting a cloud provider. The key guidelines are:

  • Carefully plan the security and privacy aspects of cloud computing solutions before implementing them.
  • Understand the public cloud computing environment offered by the cloud provider.
  • Ensure that a cloud computing solution—both cloud resources and cloud-based applications—satisfies organizational security and privacy requirements.
  • Maintain accountability over the privacy and security of data and applications implemented and deployed in public cloud computing environments.

NIST expands upon these guidelines in its recommendations for securing data when moving to the cloud. The security recommendations are divided into several areas, beginning with “Governance,” where NIST lays the responsibility for maintaining data security squarely with the organization and its employees. The recommendations include expanding policies and procedures to cover cloud computing and performing regular audits of those procedures. Under the rubric of “Compliance”, other recommendations include understanding the laws that impose security and privacy obligations, and investigating cloud providers’ electronic discovery capabilities and processes in the event of litigation. Other areas in which NIST provides recommendations include Identity and Access Management and Incident Response.

The guidelines conclude with recommendations involving three areas for organizations that are contemplating outsourcing IT to the cloud.9 These areas can be seen as spanning the arc of an organization’s relationship with a cloud provider. The first sets out NIST’s recommendations for evaluating a cloud provider before choosing one, along with a discussion of the Fair Information Practice Principles (FIPPs). The second consists of NIST’s recommendations for negotiating a contract with a cloud provider and assessing that provider’s performance. NIST advises that organizations include all privacy and security requirements in the service agreement and “involve a legal advisor in the review of the service agreement and in any negotiations about the terms of service.”10 Finally, NIST provides recommendations about how to terminate a relationship with a cloud provider.

Though NIST’s guidance is directed to public entities, it applies equally well to all organizations, including private enterprises, that are considering shifting data and services to the cloud. It is essential to keep in mind that every organization, private or public, has a unique set of privacy and security concerns and obligations, so a tailored evaluation of specific needs is the best approach to cloud computing. This may be especially, although not exclusively, the case for certain regulated industries, including financial services and health care. Questions about data security and privacy should be asked before deciding to move to cloud computing, but responsibility for maintaining the security and privacy of that data remains with the organization regardless of where it is stored or accessed. Contracts with cloud providers should be carefully crafted to meet the unique requirements of each organization and tailored to the particular types and uses of the data.


1 National Institute of Standards and Technology, Guidelines on Security and Privacy in Public Cloud Computing (NIST Special Publication 800-144), available at http://www.nist.gov/itl/csd/cloud-012412.cfm
2 citation
3 Id. at 7
4 Id. at 52
5 Id. at 8-10
6 Id. at 10-12
7 Id. at vi
8 Id. at 14
9 Id. at 37
10 Id. at 51


Back to Top


FTC Issues Final Privacy Report
By Karen Neuman

On March 26, 2012 the FTC issued its long-awaited privacy report and framework asking businesses to implement recommended best practices for collecting, using, disclosing and protecting consumer information – both on- and offline. The report, "Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers", is intended to give consumers control over how their information is collected and used. It also calls on Congress to enact privacy (including data security and breach notification) legislation. The report is the culmination of a process initiated two years ago to address perceived threats to privacy posed by rapidly evolving technology that enables the collection and sharing of consumer information and its growing use by businesses for advertising and other purposes.

The report preserves the fundamental approach for protecting privacy embodied in the preliminary staff report, including privacy by design, simplified choice, and greater transparency. However, it revises some of those earlier recommendations.

First, the original scope of the proposed framework applied to all commercial entities that collect or use consumer data that can be linked to a specific consumer, computer, or other device. The final report, however, concludes that the framework should not apply to companies that collect only non-sensitive data from fewer than 5,000 consumers a year and do not share it. (This change may not represent the type of carve-out start-ups had been seeking.)

Second, the report addresses concerns that, with technological advances, more and more data can be "reasonably linked" to consumers, computers, or devices. The final report clarifies that data is not “reasonably linkable” to the extent that a company takes reasonable measures to de-identify the data, commits not to re-identify it, and contractually prohibits downstream recipients from re-identifying it.

Third, the report recommends that Congress “consider enacting targeted legislation to provide greater transparency for, and control over, the practices of data brokers.” The staff report had proposed that companies give consumers “reasonable access to the data the companies maintain about them, proportionate to the sensitivity of the data and the nature of its use.” In response to commenters’ particular concerns with this issue as it relates to data maintained by data brokers, the FTC expanded on its proposal with a call for legislation that would include procedures for consumers to access and dispute personal data held by information brokers.

The report contemplates that Congress will enact broad privacy legislation while business will step up to “accelerate” the development of industry self-regulatory codes. In this sense the FTC report must be read with the privacy report issued earlier this month by the Department of Commerce (DOC). In that report, the DOC indicated that it would launch a multistakeholder process to develop such self-regulatory codes. The FTC report also indicates that over the course of the next year, FTC staff will focus on a number of “action” items, which are discussed throughout the report and include:

  • Do Not Track. The FTC will work with industry groups to complete implementation of easy-to-use, persistent, and effective Do Not Track systems, including those developed by the Digital Advertising Alliance and the World Wide Web Consortium. Notably, the FTC declined to ask Congress to pass Do Not Track legislation.
  • Mobile. The FTC asks businesses in the mobile ecosystem to work toward improved privacy protections, including the development of short, meaningful disclosures, and encourages participation in a workshop on May 30, 2012 that will, among other things, address mobile privacy disclosures.
  • Large Platform Providers. The FTC intends to host a public workshop later this year to address tracking activities of Internet and social media service providers, browsers and operating systems.

The report appears to have already drawn support from key members of Congress. Senator Rockefeller (D-WV), Chairman of the Senate Committee on Commerce, Science and Transportation, issued a statement in which he expressed support for the FTC’s recommendation that Congress pass privacy legislation and agreed that industry must do a better job of respecting consumer privacy choices. Senator Kerry (D-MA), Chairman of the Commerce Subcommittee on Communications, Technology, and the Internet and co-sponsor of the Commercial Privacy Bill of Rights Act of 2011 (which has provisions requiring baseline protections that are similar to the FTC framework), also issued a statement underscoring the need for privacy legislation.

Like its predecessor, this report does not have the force of law. However, it should be seen as signaling how the FTC intends to make policy as it continues using its broad enforcement powers to protect consumer privacy (as noted by the dissenting Commissioner), whether or not Congress can deliver any of the requested privacy legislation.

Back to Top

Department of Commerce Kicks off Multistakeholder Process with Request for Comments
By Karen Neuman

On March 5, 2012 the Department of Commerce kicked off the “multistakeholder” process for developing industry privacy self-regulatory codes by publishing a Request for Comments (RFC) in the Federal Register. This process was first introduced by agency officials in public comments earlier this year and is one of the key components of the White House Framework for Protecting Privacy and Promoting Innovation in the Digital Age. The broad array of interests expected to participate (privacy advocates, consumer groups, companies, industry associations, academics, State Attorneys General and even international entities) suggests the inherent challenges of a process intended to address widespread concerns about privacy while staving off legislation and reassuring EU data protection authorities that the U.S. is serious about protecting individual privacy. The original deadline for filing comments was extended from March 26, 2012 to April 5, 2012.

The RFC should be seen as the start of a process that will implement the underlying goals of the White House privacy Framework by creating legally enforceable industry self-regulatory codes subject to FTC oversight and enforcement. The RFC invites stakeholders to recommend topics that they believe should be addressed first, as well as provide comment on procedures for developing the codes.

The RFC also lists the Administration’s privacy priorities and indicates that these priorities will be the focus of the proceeding:

  • Issues associated with mobile apps in general (e.g., a code of conduct that implements the full Consumer Privacy Bill of Rights).
  • Mobile apps that provide location-based services.
  • Cloud computing services, i.e., those that store data in architectures that provide on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service; or specific cloud computing market segments.
  • Accountability mechanisms (to enable companies to demonstrate how they are implementing the Consumer Privacy Bill of Rights).
  • Online services directed toward teenagers (individuals 13 or older and younger than 18).
  • Online services directed toward children (individuals under 13 years old).
  • Trusted identity systems, such as those discussed in the National Strategy for Trusted Identities in Cyberspace.
  • The use of multiple technologies, e.g., browser cookies, local shared objects, and browser cache, to collect personal data.

At a minimum, stakeholders should closely monitor developments in this proceeding and consider taking advantage of early opportunities to participate in its outcome.

Back to Top


FTC Approves New COPPA Compliance Safe Harbor
By Karen Neuman

On February 24, 2012, the Federal Trade Commission (FTC) approved by a 4-0 vote a new Safe Harbor program for companies that are subject to the Children’s Online Privacy Protection Act (COPPA). The program is offered by Aristotle International, Inc.

The COPPA Rule applies to commercial operators of websites and online services directed to children under 13, as well as operators of general audience sites and online services that knowingly collect personal information from children under 13. COPPA Safe Harbor programs must be approved by the FTC. In order to obtain FTC approval, these programs must meet three criteria: 1) provide the same or greater protections for children as those contained in the Rule; 2) set forth effective, mandatory mechanisms for the independent assessment of members' compliance; and 3) provide effective incentives for members' compliance. Under the Safe Harbor, commercial website operators and online service providers that are subject to the COPPA Rule and choose to participate in a Safe Harbor compliance program are deemed to be in compliance with the Rule.

The Aristotle program requires participants to post COPPA-required privacy policies, comply with the required notice and consent provisions of the Rule, implement data minimization practices, and establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children. Participants must also agree to quarterly and unannounced independent reviews, and undertake annual self-evaluations. The program offers participants some novel methods for complying with the Rule’s “prior verifiable parental consent” requirement for collecting personal information from children. These methods include real-time video consent over applications such as Skype and the submission of images of parental signatures photographed and transmitted to Aristotle by mobile devices.

The Aristotle Safe Harbor was approved as industry and privacy advocates await publication of the FTC’s updated COPPA Rule, which the agency has been reviewing since 2010. One of the changes the FTC is considering is adding the type of video consent included in the Aristotle Safe Harbor program to the acceptable methods for obtaining prior verifiable parental consent. Aristotle’s FTC application for Safe Harbor approval can be viewed here.

Back to Top


White House Releases Privacy Framework
By Ari Moskowitz

On Thursday, February 23 the White House released its long-awaited report introducing the Administration’s framework for protecting consumer privacy and proposing a Privacy “Bill of Rights”.1 The report, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy, does not call for new rules for businesses collecting information online, but it does establish a roadmap for strengthening privacy protections by working with industry. The report is the culmination of a process that was initiated by the Department of Commerce and builds upon the Department’s December 2010 privacy “Green Paper”, which contained a number of proposals, ranging from implementing FIPs to working with business to create industry self-regulatory codes.

The framework’s linchpin is the Consumer Privacy Bill of Rights, which builds on recommendations in the Green Paper by adopting well established Fair Information Practice Principles (FIPPs) to protect consumer privacy and clarify the U.S. approach to privacy protection. The report calls on Congress to pass laws codifying the rights and principles embodied in the FIPPs, and launches a crucial multistakeholder process to specify how they should be implemented in various contexts and industries through the development of industry self-regulatory codes.

The Consumer Privacy Bill of Rights applies to “commercial uses of personal data,” which the Framework defines as “any data, including aggregations of data, which is linkable to a specific individual” and notably includes identifiers on mobile devices and computers that are used to build a “usage profile.”2 The descriptions of the seven FIPPs laid out by President Obama rely heavily on terms like “reasonable” and “appropriate” because their exact scope is being left to future refinement through the multistakeholder process. The descriptions also briefly address application of the FIPPs to mobile privacy, for example by noting that privacy notices must be handled differently to be clear on a small screen, but do not include specific sections dealing with privacy in the mobile arena.

The seven FIPPs adopted in the Consumer Privacy Bill of Rights are:

  • Individual Control. Consumers have a right to exercise control over what personal data companies collect from them and how they use it.
  • Transparency. Consumers have a right to easily understandable and accessible information about privacy and security practices.
  • Respect for Context. Consumers have a right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data.
  • Security. Consumers have a right to secure and responsible handling of personal data.
  • Access and Accuracy. Consumers have a right to access and correct personal data in usable formats, in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data is inaccurate.
  • Focused Collection. Consumers have a right to reasonable limits on the personal data that companies collect and retain.
  • Accountability. Consumers have a right to have personal data handled by companies with appropriate measures in place to assure they adhere to the Consumer Privacy Bill of Rights.

Other major themes of the Framework include a call for national data breach notification legislation that will preempt state law, international interoperability in privacy regulatory regimes to ease the compliance costs on businesses, and strengthened Federal Trade Commission (FTC) enforcement. Also notable is a footnote mentioning that the White House is separately considering amending the Electronic Communications Privacy Act, which restricts the government’s access to and use of individuals’ personal information; the Framework itself applies solely to the private sector.

On international cooperation, the Administration makes the case that engaging governments and industry in its proposed multistakeholder processes to develop codes of conduct, achieve mutual recognition of privacy regimes, and cooperate on enforcement is the most flexible and effective approach to achieving the goal of interoperability in privacy protection.

With regard to enhancing the FTC’s enforcement authority, the Framework report reiterates that the multistakeholder process will provide a foundation for FTC enforcement by holding companies to the voluntary codes of conduct developed in that process. Companies that fail to adhere to those codes will be subject to the FTC’s Section 5 enforcement powers. The White House goes a step further, though, and calls on Congress to enact legislation that would empower the FTC to enforce the Consumer Privacy Bill of Rights.

The Framework should be seen as the starting point of a process, not the end. Indeed, nothing is yet required of online businesses, although a number of “household name” technology companies have expressed support for the Bill of Rights and the multistakeholder process. It is unclear what impact this show of support will have on smaller players or future start-ups that may lack the resources or the inclination to participate in the process, particularly those that are in the early stages of formation or believe that they have not yet attracted enough users to draw the attention of regulators.


1 Consumer Data Privacy In A Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy, available at http://www.whitehouse.gov/sites/default/files/privacy-final.pdf.
2 Id. at 10

Back to Top


California Attorney General Announces App Privacy “Joint Principles” with Application Platform Providers
By Karen Neuman

On February 22, 2012, the California Attorney General announced the adoption of a Joint Statement of Principles with Apple, Amazon, Microsoft, RIM, Hewlett-Packard and Google that will, among other things, require apps using these platforms to have privacy policies and adhere to the representations in them. The companies will be required to enforce the requirement that apps have privacy policies. Developers whose apps breach their privacy policies by violating promises about the use and disclosure of user information may be prosecuted by the California Attorney General. According to the Attorney General, currently only 5 percent of an estimated one million apps have privacy policies.

The Agreement construes the California Online Privacy Protection Act1, applicable to websites and online services, as also applying to mobile apps. The practical effect of the Statement is to: 1) require app host platforms and app stores to make sure apps have privacy policies before approving them, and 2) require companies that seek to make their apps available on one of the host platforms (which account for an estimated 95% of apps) to have privacy policies, without regard to whether the app was developed in California or where it is downloaded. It is unclear when California intends to initiate enforcement of the Joint Statement, but the Attorney General indicated that available penalties for noncompliance include up to $5,000 per violation.

The Agreement sets out the following four principles:

  • Privacy Policy Requirement. Apps collecting personal data from a user must “conspicuously” post a privacy policy or statement clearly and completely describing the app’s collection, use and sharing of users’ personal data.
  • Optional Data Field for Hyperlink to App Privacy Policy. For new or updated apps, the platform providers will include in the submission process an optional data field for either (a) a hyperlink to the app’s privacy policy or statement or (b) the text of the app’s privacy policy or statement, to enable users to read the policy prior to purchasing or downloading an app.
  • Protocol for Reporting Noncompliance. The platform providers will have to implement a protocol for users to report noncompliance with the privacy policies or applicable law.
  • Process for Responding to Reports of Non-compliance. The platform providers will have to implement a process for responding to reported instances of non-compliance with the privacy policies or applicable law. Any action taken by the platform provider for non-compliance will not limit the government’s right to pursue an action against a developer for reported violations of applicable law.

In addition, the platform providers and the California Attorney General will develop best practices for mobile privacy, including model mobile privacy policies. Parties to the Joint Statement will assess mobile privacy, including the need for public education initiatives.


1 Cal. Bus. & Prof. Code §§ 22575–22579.

Back to Top


FTC Report Warns Mobile App Industry It Must Do More to Disclose Data Collection Practices Involving Kids
By Karen Neuman

On February 16, 2012, the FTC issued a staff report, Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing, showing the results of a survey of the privacy practices of mobile apps for kids. The survey focused on the Apple App Store and the Android Market, and evaluated the types of apps offered to children, the apps’ disclosures and interactive features, including connectivity with social media, and the ratings and parental controls offered for the apps. The report was issued as industry awaits FTC publication of the updated Children’s Online Privacy Protection Act (COPPA) Rule. It warns of potential enforcement actions following the Agency’s “additional review” during the next six months “to determine whether some mobile apps are violating COPPA”.

The report contends that neither app stores nor developers provide parents with the information they need to determine what data is being collected from their children, how it is being shared, or who has access to it. FTC Chairman Jon Leibowitz admonished businesses in the kids mobile app “ecosystem” to do more to provide parents with easily accessible information so parents can make informed decisions about the apps their children download and use. The app stores already provide an architecture for sharing pricing and category data, and should be able to offer developers a way to convey information about their data collection and sharing practices.

The report invites industry to help identify best practices to convey data practices on small screens, including participating in a public workshop hosted by the FTC this year to address mobile privacy disclosures in connection with its efforts to update the "Dot Com Disclosure" guide.

According to the report, in 2008 smartphone users could choose from about 600 available apps. Today there are more than 500,000 apps in the Apple App Store and 380,000 in the Android Market. FTC staff noted that mobile apps can automatically capture a broad range of user information from a mobile device, including a user's precise location, phone number, list of contacts, call logs, unique identifiers, and other information stored on the device. The FTC did not conduct tests to determine whether apps actually collected, used, or disclosed personal information from children. FTC staff “encountered a diverse pool of kids apps created by hundreds of different developers [and] almost no information in the app stores or on the developers’ landing pages about the data collection, sharing and use practices of the apps or stores.” The report specifically recommends that:

  • All members of the "kids app ecosystem" – stores, developers and third parties providing services – should play an active role in providing key information to parents.
  • App developers should provide data practices information in simple and short disclosures. They also should disclose whether the app connects with social media, and whether it contains ads. Third parties that collect data also should disclose their privacy practices.
  • App stores should take responsibility for ensuring that parents have basic information. "As gatekeepers of the app marketplace, the app stores should do more."

Although the report does not call for legislation, it should be seen as part of recent developments leading toward heightened scrutiny of the data collection, retention, sharing and use practices of businesses in the mobile app ecosystem. These developments include last year’s settlement of an enforcement action against a mobile app developer for COPPA violations, the California Attorney General’s agreement requiring developers to have privacy policies in place before being accepted into app stores, and last month’s revelations that Apple and Android apps, as well as the social app Path, uploaded users’ entire contact lists to developers’ servers without user notice or consent.

Back to Top


UPDATES

US-EU Safe Harbor Intact for Now

On March 19, 2012 this year’s Safe Harbor conference in Washington, DC addressed the sustainability of the US-EU Safe Harbor framework for transferring data from Europe to the United States. The Conference followed the release of the European Commission’s (Commission) proposed revision of Data Protection Directive 95/46. The United States is not among the countries deemed “adequate” by the Commission for cross-border data transfers, and organizations transferring data from EU countries to the US must self-certify under the Safe Harbor program. The Commission has expressed concerns about whether the Safe Harbor may be maintained in light of the proposed changes to the Data Protection Directive. The Administration’s privacy initiative, including the Consumer Privacy Bill of Rights and the Department of Commerce proceeding to develop voluntary industry privacy self-regulatory codes, announced after the draft Directive was released, can be seen in part as an effort to preserve the Safe Harbor as well as promote “interoperability” between the US and EU approaches to privacy protection. At the conclusion of the Conference, Commission representatives suggested that in order to sustain the Safe Harbor the US must approve comprehensive privacy legislation with enforcement provisions that have teeth. In other words, the industry self-regulatory codes, although voluntary, must be effectively binding through meaningful enforcement. The just-released FTC privacy report calls on Congress to enact privacy legislation.

The US and EU seem committed to working to preserve the Safe Harbor and recognize that it is essential to promoting trade and economic growth. In a joint statement issued after the conference, US Secretary of Commerce John Bryson and EC Vice President Viviane Reding underscored the importance of the Safe Harbor to cross-Atlantic data flows and trade objectives.

Back to Top


Federal Appeals Court Rules No Damages under VPPA for Violation of Record Destruction Requirement

On March 6, 2012, the United States Court of Appeals for the Seventh Circuit issued an opinion in which it ruled that damages are not available under the Video Privacy Protection Act (VPPA) for noncompliance with the law’s data destruction requirement. The only remedy for such a violation is court-ordered destruction of the data. The plaintiffs filed a class action against Redbox, a video rental and DVD “kiosk” operator, under subsections (b) and (c) of the VPPA for unlawful disclosure of personal information and unlawful retention of rental history, respectively. Redbox filed an interlocutory appeal in which it asked the Court to decide whether the statute permits a private right of action for damages to enforce the law’s records destruction section. Relying in part on precedent to construe what the Court characterized as a poorly drafted statute, it concluded that Congress intended to enforce only the unlawful disclosure provision with monetary damages; Congress did not intend to enforce violations of the record destruction provision with an action for damages, particularly in instances where, as in the case before the court, the Plaintiffs suffered no actual harm. The case is Sterk v. Redbox Automated Retail, LLC, No. 12-8002 (7th Cir. March 6, 2012).

Back to Top


European Mobile Service Providers Unveil Voluntary App Privacy Guidelines

On March 1, 2012, the Groupe Spéciale Mobile Association (GSMA) released Privacy Design Guidelines for Mobile Application Development, a report that sets forth voluntary guidelines for European mobile telephone service providers and their mobile applications. The guidelines are intended for all entities in the mobile “app” or service delivery supply and distribution chain that are responsible for collecting and processing a user’s personal information. In other words, the guidelines apply to developers, device manufacturers, platform and OS companies, mobile operators, advertisers and analytics companies.

The guidelines adopt a “Privacy by Design” approach to protecting individual privacy. They are intended to encourage the development, delivery and operation of mobile apps that help users understand what personal information an app can access, collect, and use; and what the information will be used for, and why; and how users may exercise choice and control over this use. The report includes examples as well as illustrative use cases.

The breadth of the guidelines reflects the GSMA’s recognition of how integral apps are to all aspects of daily life and commerce. Thus, the guidelines address apps in the education, social networking and media, location, children and adolescents and mobile advertising contexts.

The guidelines can be seen as yet another industry response to calls to impose privacy regulations on the nascent mobile app industry amid widely publicized instances of unauthorized access and distribution of user data.

Back to Top


Privacy Group Challenges New FERPA Rule

On February 29, 2012 the Washington, D.C.-based privacy advocacy group EPIC filed a federal court action challenging recent amendments to the Family Educational Rights and Privacy Act (FERPA) rule and asking that the rule be blocked and declared unlawful. The Complaint alleges that the Department of Education exceeded its authority when it adopted proposed changes to the final rule. The changes were adopted following a rulemaking proceeding that was initiated to address how states manage and safeguard student performance and other data collected for record keeping and data-driven education reform.

The Plaintiff focuses on three core amendments to the rule. First, the Complaint challenges the rule’s expansion of the pool of individuals who may access student data without obtaining parental consent by designating “non-governmental actors” as “authorized representatives” to conduct audits and evaluations. EPIC contends that by designating private entities as authorized representatives, the Department of Education performed an “unauthorized, unlawful sub-delegation of its own authority.”

The Complaint also challenges the expansion of the definition of “Education Program” to broadly capture any program administered by an educational agency or institution, which EPIC argues would “expose troves of sensitive non-academic data.”

The new definitions of “authorized representative” and “education program” were intended to facilitate access to student data for consultants under contract with school districts and educational institutions. But the rule also includes a change intended to facilitate school security by expanding the definition of “Directory Information” to permit the nonconsensual disclosure of data contained in a student’s ID card or badge. Specifically, Paragraph 17 alleges:

The agency states that the amended regulations modify the definition to clarify that an educational institution or agency may designate as directory information and non-consensually disclose a student ID number or other unique personal identifier that is displayed on a student ID card or badge if the identifier cannot be used to [access] education records except when used in conjunction with one or more factors that authenticate the user’s identity such as a PIN, password, or other known factor possessed only by the authorized individual.

EPIC argues that nonconsensual disclosure of the above-described ID card or badge data creates a risk of “re-identification”.

It is unlikely that the requested relief (blocking the amended rule and declaring it unlawful) will be granted, as courts defer to an agency’s expertise when determining whether to invalidate a rule, as long as the agency’s decision is found to be rational and within its statutory authority. Nevertheless, local school boards and third party consultants should monitor developments in this case.

Back to Top


ANNOUNCEMENTS & EVENTS

Jill Josephson Joins SLRNO


SLRNO is pleased to announce that Jill Josephson has joined the firm as counsel in its transactional practice group, where she will focus on technology and business transactions. Jill has over 15 years of experience advising senior management and in-house counsel on operational, commercial, and transactional matters, including e-commerce, software development and distribution, revenue generation, and risk assessment and management. Jill can be reached at jjosephson@stlro.com.

Back to Top


Copyright © 2010 St. Ledger-Roty & Olson, LLP.
1250 Connecticut Avenue, N.W., Suite 200, Washington, D.C. 20036