FTC Approves Children's Online Privacy Compliance Program

By Jenny Paul

The Federal Trade Commission has approved a new safe harbor program, called the iKeepSafe Safe Harbor Program, that websites and other online services may use to comply with its Children’s Online Privacy Protection Act (COPPA) rule.

COPPA regulates the collection and use by website operators and application developers of personal information from children under the age of 13.  COPPA also requires website operators and application developers to obtain parental consent before collecting a child’s personal information.  The COPPA rule allows industry groups or others to submit self-regulatory guidelines to the FTC for approval as a safe harbor program.  Operators and developers who choose to be regulated under an approved safe harbor program will be deemed in compliance with COPPA as long as they comply with the approved program’s guidelines.

A potential safe harbor program must demonstrate that it can ensure the website operators and app developers subject to the program provide children with protections equal to or greater than the safeguards in the COPPA rule.  The program must also demonstrate that it has an effective, mandatory mechanism for assessing operators’ and developers’ compliance with its guidelines and that it will take disciplinary action when non-compliance occurs.  The FTC determined that iKeepSafe met all of the above criteria.  The FTC noted that iKeepSafe and its compliance partner, Playwell, have many years of experience in the children’s privacy sphere.

The Center for Digital Democracy opposed iKeepSafe’s application, saying the applicant wrongly used “permissive standards” instead of COPPA’s mandatory requirements.  In approving the application, the FTC noted that iKeepSafe had revised it to insert mandatory language in response to commenters’ concerns.

COPPA Parental Consent Expanded by FTC

By Debbie Wong* and Nickolas Milonas

The Federal Trade Commission recently expanded the options for obtaining verifiable parental consent (“VPC”) under its revised Children’s Online Privacy Protection Act rule.  COPPA requires covered websites and online services to obtain verifiable parental consent before collecting personal information from children under 13.  The VPC method must be reasonably calculated to ensure that the person giving consent is the child’s parent.  While the rule enumerates several acceptable methods for obtaining parental consent, changes to the rule that took effect last July allow for FTC approval of new VPC methods.  As we previously reported, the FTC has already approved and rejected new VPC methods under the revised rule.

The FTC’s revised guidance on obtaining VPC includes three main clarifications.  First, the FTC will now include the collection of a parent’s payment card information as a potential VPC method.  Although obtaining a 16-digit credit or debit card number alone would not satisfy COPPA’s VPC requirement, the FTC explained that there may be circumstances in which the collection of payment information would suffice in conjunction with other safeguards.  For example, the card number could be supplemented with security questions that only a parent could answer.

Second, the FTC will allow a developer of a child-related app to use a third party, such as an app store, to obtain VPC.  The developer, however, must ensure that COPPA requirements are met.  For example, requiring an app store account number or password is insufficient for obtaining parental consent without other indicia of reliability, like authentication questions.  In addition, the developer must provide parents with direct notice that outlines its information collection practices.

Last, the FTC clarified that in instances where a third party, like an app store, obtains VPC on behalf of app developers, the third party is not an “operator” under COPPA and therefore would not be liable if it fails to investigate the privacy practices of the developers for whom it obtains consent.  However, such a third party may still be liable under the FTC Act if it misrepresents its level of oversight for child-related apps.

App developer trade associations praised the FTC’s revisions as “a major win for innovation and privacy,” saying they help clarify COPPA and remove obstacles that can discourage developers from creating educational apps for children.

*Debbie Wong is a summer associate in K&L Gates’ Washington, DC office and contributed to this post.

Snapchat Agrees to Settle FTC Allegations That It Deceived Users

 By Jenny Paul 

Snapchat, a mobile messaging app, recently agreed to settle Federal Trade Commission allegations that the company deceived consumers by making promises about the “disappearing nature” of the messages that it could not keep, the FTC announced last week.

Snapchat allows a user to send contacts a message in the form of a photo or brief video, called a “snap.”  Because of the app’s design, a snap seems to disappear from a recipient’s app interface after the recipient has viewed it for a time period designated by the sender, usually 10 seconds or so.  The app is popular among users, particularly teens, who appreciate the privacy of these apparently self-destructing snaps.

The FTC accused Snapchat of misrepresenting its service by touting it as a way for users to send snaps that would “disappear forever” after that time period expired.  In reality, the FTC alleged, recipients could save snaps indefinitely by locating the files containing the snap messages on other portions of their mobile devices, until a recent Snapchat update closed this loophole.  The FTC also alleged that recipients could use third-party applications to download and save snaps.

The FTC also accused Snapchat of misrepresenting its data collection and privacy practices when it collected and transmitted some users’ geolocation information to an analytics tracking service, despite saying in its privacy policy that it did not track or access such information. 

In addition, the FTC took issue with Snapchat’s “Find Friends” feature, alleging that the feature collected contact information from users’ address books without their consent.  Snapchat also failed to secure the feature, resulting in a security breach that exposed 4.6 million Snapchat usernames and phone numbers to third parties, the FTC alleged.

The settlement requires Snapchat to establish a comprehensive privacy program and to undergo biennial independent security assessments for 20 years.  The settlement does not include a monetary penalty.  According to the FTC, the agreement will be subject to public comment through June 9, 2014, after which the FTC will decide whether to make the proposed consent order contained in the settlement final.

FTC Has Power to Regulate Data Security Practices, Court Rules

By Jenny Paul, Roberta Anderson and Marty Stern 

In a closely watched case, a federal district court judge last week ruled that the Federal Trade Commission has the authority to bring enforcement actions against companies for data security breaches as an unfair practice under the FTC Act. 

The FTC brought suit against Wyndham Worldwide Corporation and several of its subsidiaries in 2012, alleging that the companies’ failure to maintain reasonable and appropriate data security for consumers’ sensitive personal information violated the FTC Act’s prohibition of unfair or deceptive acts or practices affecting commerce.  One of the subsidiaries, Wyndham Hotels and Resorts LLC, moved to dismiss part of the action, arguing that the FTC did not have the authority to bring an “unfairness” claim that involved data security.  The court disagreed, finding that specific data security legislation passed after the FTC Act merely complemented the FTC’s unfairness authority and did not preclude it.

Wyndham also argued the FTC was required to formally promulgate regulations to provide fair notice of what data security standards the FTC required.  The court rejected that argument, finding that formal regulations are not the only means of providing fair notice.  It noted that Section 5 of the FTC Act itself provides a three-part test for determining whether an act or practice is unfair, and that parties may look to the FTC’s complaints against entities, consent agreements, and public statements to determine the FTC’s standard for bringing an unfairness claim under the Act.

FTC Settles With Fandango, Credit Karma Over Mobile App Security

By Jenny Paul and Marc Martin 

Movie ticket seller Fandango and financial management service Credit Karma have agreed to settle the Federal Trade Commission’s allegations that the companies misrepresented the security of their mobile apps and failed to secure the transmission of millions of consumers’ sensitive personal information from the apps, the agency announced last week.

The FTC accused Fandango and Credit Karma of making security promises on their mobile apps and then failing to take “reasonable” steps to secure the apps.  In particular, the complaints against Fandango and Credit Karma assert that each company disabled a default process called SSL certificate validation, which would have secured the apps’ communications and ensured that an attacker could not intercept information submitted by consumers through the apps.  Such information, depending on the app, could have included sensitive personal information such as credit card details, e-mail addresses, passwords, Social Security numbers, dates of birth, home addresses, and credit report details, according to the FTC.
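
For readers less familiar with the technical issue, the sketch below illustrates what "certificate validation" means in practice, using Python's requests library. It is not the code at issue in the complaints, and the endpoint shown is hypothetical.

```python
# Minimal sketch: HTTPS with and without certificate validation.
# Hypothetical endpoint; not Fandango's or Credit Karma's actual code.
import requests

API_URL = "https://api.example.com/checkout"  # hypothetical

# Default behavior: requests verifies the server's certificate chain and
# hostname, so an attacker on the network cannot silently impersonate the API.
resp = requests.post(API_URL, json={"email": "user@example.com"}, timeout=10)
print(resp.status_code)

# Disabling validation (verify=False) is the kind of step the FTC complaints
# described: traffic can then be intercepted in a man-in-the-middle attack.
# resp = requests.post(API_URL, json={"email": "user@example.com"},
#                      verify=False, timeout=10)
```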

The settlements require Fandango and Credit Karma to establish “comprehensive security programs” designed to address security risks during the development of their applications and to undergo biennial independent security assessments for 20 years. The settlements also prohibit Fandango and Credit Karma from misrepresenting the level of privacy or security of their products and services.  According to the FTC, the agreements will be subject to public comment through April 28, 2014, after which the FTC will decide whether to make the proposed consent orders contained in the settlements final.

FTC Settles With Home Security Company Over "Do Not Call" Violations

By Jenny Paul and David Tallman

The Federal Trade Commission announced this week that it had reached a settlement with a home security company that allegedly violated the FTC’s Telemarketing Sales Rule by making sales calls to millions of consumers on the FTC’s Do Not Call Registry and to consumers who had asked the company not to call them again.

Versatile Marketing Solutions (VMS) allegedly called millions of consumers whose names and phone numbers it purchased from lead generators.  At least one million of these calls were made to numbers on the Do Not Call Registry, and another 100,000 were made to consumers who had previously asked the company to stop calling them, according to the FTC and Department of Justice complaint.

VMS claimed that it received sales leads from lead generators who said they obtained consumers’ express consent to receive telemarketing calls about a home security system or to receive additional information about installation of one.  According to the FTC, this was not the case.  Rather, the leads allegedly were obtained through fake survey calls and robocalls, many of which were placed to consumers on the Do Not Call Registry.  Even though VMS received complaints about the survey calls and robocalls, it continued to buy leads from the same lead generators and to make calls to consumers using those leads, the FTC said.

The settlement, which must still be approved by a court, prohibits VMS from engaging in “abusive” telemarketing practices.  That means it may not call any consumer whose number is on the Do Not Call Registry (unless VMS can prove it received written permission to make the call or that it has an established business relationship with a consumer), nor may it call consumers who previously told VMS not to call them again.  The settlement also places restrictions on how VMS may obtain and use lead-generated phone numbers in the future.

The settlement assesses a $3.4 million penalty against VMS and its owner, Jasjit Gotra, with all but $320,700 suspended due to their inability to pay. 

Apple and FTC Settle In-App Charges Complaint for More than $30 Million

 By Jenny Paul and Marc Martin 

Apple has agreed to pay at least $32.5 million to settle a Federal Trade Commission complaint that the company charged consumers millions of dollars for in-app purchases made by children in kids’ mobile apps without parental consent.

The FTC alleged in its complaint that Apple violated the FTC Act when it failed to disclose to parents that entering their Apple password not only authorized a single in-app purchase in a children’s app but also opened a 15-minute window in which passwords did not have to be reentered to make purchases within the app.  That 15-minute window allowed children to make in-app purchases without further action from their parents, the FTC alleged, and allowed the children to collectively rack up millions of dollars in charges for purchases such as virtual items or currency used in playing a game.

Under the terms of the settlement, Apple will refund a minimum of $32.5 million worth of in-app charges that customers indicate were made by a minor and were accidental or not authorized by the actual holder of the Apple account.  The settlement also requires the company to modify its billing practices by March 31 to ensure that Apple account holders provide express, informed consent for in-app billing charges and to allow these account holders to revoke consent at any time.

Earlier, Apple’s in-app billing practices for kids’ apps had been the subject of a class action brought by a group of parents who alleged that their children made purchases within apps without their consent.  That class action settled last year and required Apple to issue iTunes credit to some affected account holders whose children spent less than $30 and cash refunds for those who were billed more than $30.

FTC Approves First New Method for Verifying Parental Consent under COPPA

By Nickolas Milonas

The Federal Trade Commission recently approved a new method for verifying parental consent under the revised Children's Online Privacy Protection Act rule.  COPPA requires websites and online services geared toward children to obtain parental consent before collecting personal information from children under 13.  While the rule enumerates acceptable methods for obtaining verifiable parental consent (“VPC”), such as verification via Social Security number, changes to the rule that took effect July 1 allow for FTC approval of new VPC methods.

Imperium LLC’s approved VPC method uses a knowledge-based authentication system that verifies a parent's identity by asking a series of personal security questions, similar to those used by banks and other financial institutions.  The method is the first of its kind to be approved under the COPPA rule.  As we reported in November, AssertID proposed a social-media-based VPC method.  The FTC, however, subsequently denied AssertID’s application due to the potential for circumvention by false social media accounts.

The FTC noted that identity verification via knowledge-based authentication is “well-established and has been adequately utilized and refined in the marketplace to demonstrate that it is sufficiently reliable to verify that individuals are parents authorized to consent to the collection of children’s personal information.”  The FTC also stated that such authentication, when used under COPPA, must contain dynamic, multiple-choice questions with a low probability for guessing correctly and of sufficient difficulty that a child in the household could not reasonably ascertain the answers.
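
As a rough illustration of how knowledge-based authentication works, the sketch below selects random multiple-choice questions and requires every answer to be correct. The question pool, threshold, and function names are hypothetical; real KBA services draw questions dynamically from third-party data sources, which is what makes the answers difficult for a child in the household to guess.

```python
# Hypothetical sketch of a knowledge-based authentication (KBA) check.
# The question pool and pass criterion are illustrative, not Imperium's system.
import random

# (question, correct answer, plausible wrong answers)
QUESTION_POOL = [
    ("Which of these streets have you lived on?", "Oak Ave",
     ["Elm St", "Main St", "2nd Ave"]),
    ("In which county was your first car registered?", "Lake",
     ["Cook", "Dane", "Pima"]),
    ("Which of these employers appears in your history?", "Acme Corp",
     ["Globex", "Initech", "Umbrella"]),
]

def run_kba_session(ask, num_questions=3):
    """Ask randomly chosen multiple-choice questions; all must be answered correctly."""
    for question, correct, wrong in random.sample(QUESTION_POOL, num_questions):
        choices = wrong + [correct]
        random.shuffle(choices)
        if ask(question, choices) != correct:
            return False  # verification fails on any wrong answer
    return True
```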

FTC Says Parental Consent Proposal From AssertID Fails to Meet COPPA Rule

By Jenny Paul and Marc Martin 

A parental consent method proposed by AssertID does not meet the criteria for approval set forth in a revised Children’s Online Privacy Protection Act rule, the Federal Trade Commission recently determined.

Under the COPPA rule, covered online sites and services must obtain verifiable parental consent before collecting personal information from children under 13.  While the rule enumerates acceptable methods for gaining parental consent, changes to the rule that took effect July 1 opened the door for FTC approval of unenumerated VPC methods.

AssertID was among the first companies to react to the stricter revised rule, even as commenters predicted that companies would struggle to comply with the revisions.  AssertID, which develops privacy and identity-verification solutions, submitted for FTC approval an unenumerated method called ConsentID.  ConsentID would facilitate verification by asking a parent’s “friends” on a social network to verify the identity of the consenting parent and the existence of the parent-child relationship — so-called social-graph verification.

The FTC denied AssertID’s application by a 4-0 vote, finding that the company had failed to provide sufficient evidence that ConsentID was reasonably calculated to ensure that the person providing consent is actually the child’s parent.  In doing so, the FTC determined that there is not yet adequate research or market testing that demonstrates the reliability of the social-graph verification method.  It expressed concern that users can easily fabricate Facebook profiles, noting that Facebook itself estimates it has about 83 million fake accounts, and that children under 13 may falsify their age information to establish social media accounts that could appear credible to the software.

While AssertID’s effort to create a new verification solution to manage compliance in a stricter COPPA environment was unsuccessful, the company still plans to offer an automated VPC service that relies on verification methods the FTC has already approved.

TMT Round-up: Developments on Unlocked Phones; FTC Backs Do Not Track Standard Despite Ad Industry Objections; German Team Sets Wi-Fi Data Transmission World Record

 By Jenny Paul, Nickolas Milonas and Marc Martin

NTIA petitions FCC for rule requiring unlocked phones

The National Telecommunications and Information Administration is seeking new regulations that would require wireless carriers to unlock mobile phones, tablets and other devices upon the customer’s request.

 The NTIA filed a petition with the Federal Communications Commission in September, asking the FCC to immediately initiate a rulemaking that would shift the burden of unlocking mobile devices from consumers to wireless carriers.  Unlocking a device allows the device to be used on the networks of other carriers, not just the network of the carrier from which it was purchased.  Removing a lock on a mobile device would not affect the terms of the contract or the related penalties for termination between the consumer and the wireless carrier, according to an NTIA release.

FTC backs Do Not Track standard despite advertisers’ withdrawal from talks

Federal Trade Commission Chairwoman Edith Ramirez remains hopeful that the industry can create a Do Not Track standard, despite news that advertisers withdrew from online tracking talks at the World Wide Web Consortium.

A Do Not Track standard could allow consumers to opt out of online tracking or exercise more control over how their online activities are recorded.  The Digital Advertising Alliance withdrew from the W3C talks in September, saying it “no longer believes that the [W3C working group] is capable of fostering the development of a workable ‘do not track’ . . . solution.”

The DAA said it would work separately to consider options for enhancing consumer privacy, “rather than continuing to work in a forum that has failed.”

“[W]e intend to commit our resources and time in participating in efforts that can achieve results while enhancing the consumer digital experience,” DAA managing director Lou Mastria said in a letter to the W3C.  “The DAA will immediately convene a process to evaluate how browser-based signals can be used to meaningfully address consumer privacy. . . . This DAA-led process will be a more practical use of our resources than to continue to participate at the W3C.”

Although Ramirez said she was disappointed by the DAA’s departure, she noted that a Do Not Track standard still could be reached.  “[M]y end goal on Do Not Track remains for consumers to have meaningful choices not to be tracked, whether that option emerges from within or outside the W3C,” Ramirez said in a statement.

German Team Sets Wi-Fi Data Transmission World Record

A team of scientists from the Fraunhofer Institute for Applied Solid State Physics (IAF) and the Karlsruhe Institute of Technology (KIT) recently set the world record for wireless data transmission at 100 Gigabits per second.  The team’s Wi-Fi network transmitted data at a frequency of 237.5 GHz over a 20-meter distance in controlled laboratory conditions. 

While such high-frequency signals allow for intensive data transfers, they do not travel long distances and are easily disrupted by obstacles (e.g., buildings and walls).  At a rate of 100 Gbps, for example, you can transfer the contents of an entire Blu-ray disc in two seconds.  The team of scientists at IAF and KIT set the previous Wi-Fi data transmission record at 40 Gbps, and that technology was tested by sending data signals between the peaks of skyscrapers.  The team hopes that its new technology can be used in rural areas as “an inexpensive and flexible alternative to optical fiber networks, whose extension can often not be justified from an economic point of view.”  The same technology could also be used to patch holes in existing fiber lines.  One of the scientists also noted that the use of multiplexing techniques (transmitting multiple streams) and multiple antennas could facilitate data rates of 1 terabit per second.
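
The two-second Blu-ray figure is easy to sanity-check. Assuming a 25 GB single-layer disc and ignoring protocol overhead, the arithmetic works out as follows:

```python
# Back-of-the-envelope check of the Blu-ray claim (25 GB single-layer disc,
# protocol overhead ignored).
disc_bytes = 25 * 10**9            # 25 GB
link_rate_bps = 100 * 10**9        # 100 Gbps

transfer_seconds = disc_bytes * 8 / link_rate_bps
print(f"{transfer_seconds:.0f} seconds")  # -> 2 seconds
```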

FTC's Online Privacy Rules for Children Clarified

By Nickolas Milonas and Marc Martin

The Federal Trade Commission recently released guidance on its December 2012 updates to the Children’s Online Privacy Protection Act (COPPA).  COPPA regulates the collection and use by website operators and application developers of personal information from children under the age of 13.  COPPA also requires website operators and application developers to obtain parental consent before collecting a child’s personal information.  The guidance touches upon several issues, including geolocation data; services directed towards children vs. mixed-audience services; parental access to children’s personal information; and disclosure of information to third parties.

As we previously reported, the December changes to the COPPA regulations are scheduled to take effect this July and contain definitional changes; expand the scope of permitted operations to include the collection of certain personal information through the use of persistent identifiers; clarify the use of age screens for content targeting a broad audience vs. content specifically targeting children; heighten parental notification requirements; and implement more-stringent requirements regarding the retention and disposal of personal information.

In advance of the FTC’s guidance, industry groups voiced concerns that the complex changes could deter innovation and asked the FTC to delay implementation until 2014 to ensure compliance.  However, privacy groups advocated rejecting any delays, stating that the changes are necessary to protect children and companies have had plenty of lead time to revise their policies and products.

Updated (5/6/13): In a letter to representatives of the advertising, application, and e-business industries, the FTC confirmed that it will not delay implementation of the new COPPA rules scheduled to take effect this July. The FTC stated that all stakeholders were afforded a sufficient opportunity to raise their concerns with the new rules but did not present any facts to warrant delaying implementation.

Online Advertising Guidance Updated by FTC

By Brian McCalmon and J. Bradford Currier

The Federal Trade Commission has updated its primary business guidance for online advertising, “.com Disclosures: How to Make Effective Disclosures in Digital Advertising.” The update to the guidance, which had not been revised since 2000, addresses the emergence of new electronic media and communications protocols over the last 13 years and compliance with the FTC Act’s requirement that ads be truthful and not misleading, adequately substantiated, and not unfair. Each of these mandates can require the disclosure of qualifying information in a manner that the FTC intends to be “clear and conspicuous” – i.e., easily accessed and understood by the consumer.

With the emergence of smartphones, tablets, and social media, the form and placement of adequate disclosures have become more complicated than they were over a decade ago. For example, with their smaller screens, smartphones can complicate the task of ensuring that a necessary disclosure is clear and conspicuous. The same goes for ads on social media outlets such as Facebook or Twitter that may impose space constraints on advertisers (such as Twitter’s character limit). With its updated guidance, supported by 22 appended examples, the FTC renews its call for advertisers either to ensure that disclosures satisfy the FTC Act and rules, or to refrain from using the medium if adequate disclosures cannot be provided.

The updated guidance reflects the FTC’s recent focus on social media and consumers’ growing reliance on wireless devices to access Internet services. Although FTC guidelines do not have the legally binding effect of regulations, they are intended and considered to be highly reliable statements of the FTC’s enforcement intent; once issued, companies ignore them at their own, potentially significant, peril.

Apple Settles In-App Purchases Class Action

By Nickolas Milonas and Marc Martin

Earlier this week, Apple agreed to settle a class action lawsuit regarding so-called “bait apps”—mobile apps directed towards children, which are free to download but then charge for in-app purchases. Under the terms of the settlement, Apple agreed to issue $5 in iTunes credit to affected customers. If customers racked up more than $5 in in-app charges, Apple will issue up to $30 in iTunes credit. Apple will issue cash refunds for accounts that spent more than $30.

The size of the class is not yet determined, but Apple will send notices to 23 million potentially affected customers. Customers who were affected during the relevant time period will have to certify that the charges were made by minors and without parental permission. While the final settlement amount may vary, Apple could end up paying in excess of $100 million when all is said and done. The court is scheduled to hear the proposed settlement tomorrow, March 1.

The lawsuit was filed in 2011 by a group of parents who claimed their children made purchases within apps without their consent. The parents alleged that Apple failed to adequately disclose that these apps contained the ability to make in-app purchases. At that time, Apple’s policy required account holders to enter their passwords when downloading a mobile app, but did not require passwords to be re-entered for the next 15 minutes. During that 15-minute window, children could make in-app purchases without entering an account password. Apple has since changed its policy to require a password for every purchase.

The settlement is one piece in the larger puzzle regarding mobile app disclosures and privacy protections. Earlier this year, the FTC released a mobile privacy report and entered into a settlement with Path—a mobile-only social network that was allegedly mining information without users’ consent, including, according to the FTC, the information of minors in violation of the Children’s Online Privacy Protection Act. Late last year, the FTC released another report, highlighting the widespread practice of mobile apps collecting and sharing minors’ information with third parties without disclosing such practices. Also last year, the California Attorney General entered into an agreement with six mobile app platforms to increase consumer privacy protections. It issued a follow-up report earlier this year, Privacy on the Go, which includes recommendations for app developers, platform providers, and mobile carriers.

Data Privacy Update: FTC Releases Mobile Privacy Report and Settles Action against Path; Facebook to Identify Tracking Advertisements

By Nickolas Milonas, Marc Martin, and David Tallman

In a trio of recent data privacy developments, the FTC published mobile data policy recommendations, Path settled an FTC action regarding allegedly unlawful data collection, and Facebook will now tell users which ads are tracking their online activity.

The FTC recently released a staff report calling on mobile services to make their data policies more transparent and accessible to consumers. The report makes recommendations for mobile platform providers, application developers, advertising networks, and other key players in a rapidly expanding marketplace. The recommendations focus on providing consumers clear and timely disclosures about what consumer data is collected and how that data may be used. The report results in part from a May 2012 FTC workshop in which representatives from the industry, academia, and consumer privacy groups examined privacy risks and disclosures on mobile devices. 

Noting the expansive growth of services offered on mobile platforms, the report recognizes unique privacy concerns rooted in the “unprecedented amounts of data collection” possible from a single mobile device. The report also notes consumers are increasingly concerned about their privacy on mobile devices, stating “less than one-third of Americans feel they are in control” of their mobile personal data. 

With those concerns in mind, the report offers recommendations to improve mobile privacy disclosures. These recommendations are consistent with the broad principles previously articulated in the FTC’s March 2012 Privacy Report, which generally called upon companies handling consumer data to adhere to the core principles of “privacy by design,” simplified consumer choice, and greater transparency. The staff report elaborates on these general principles by providing guidance to address the unique challenges presented in the mobile environment (e.g., limited screen space, the centrality of platform and operating system providers, etc.). Among other recommendations, the report suggests: 

  • Developing privacy best practices and uniform, short-form disclosures;
  • Providing just-in-time disclosures to consumers and obtaining affirmative consent before allowing apps to access sensitive content like geolocation, contacts, or photos;
  • Developing a one-stop “dashboard” to review content accessed by apps; and
  • Offering a “Do Not Track” mechanism on smartphones to prevent third-party tracking at the operating system level.

On the heels of the staff report, the FTC also announced a law enforcement action against Path, a mobile-only social network accused of collecting user data without consent. Through its social networking service, Path’s app allows users to upload and share content, including photos, comments, location data, and even the names of songs that the user plays. Among other allegations, the FTC claimed that the Path application automatically collected and stored personal information from users’ mobile device address books without the users’ consent (including names, addresses, phone numbers, email addresses, Facebook and Twitter usernames, and dates of birth). The agency also alleged that Path violated the Children’s Online Privacy Protection Act by collecting personal information from approximately 3,000 children under the age of 13 without parental consent. Path settled with the FTC on the same day that the agency filed its action. Path agreed to pay $800,000 in fines, delete all information for users under 13, and submit a comprehensive privacy plan with updates/assessments every other year for the next 20 years. 

Finally, Facebook recently announced it will alert users to advertisements that are based on or track browsing history. When users are logged in to their Facebook account and hover over ads with their mouse, a new pop-up icon will alert users if they are being tracked. The feature is the product of an agreement between Facebook and the Council of Better Business Bureaus, and users are still able to opt out of brand-specific ads, as well as ad tracking altogether.

These developments highlight the continuing regulatory focus on online privacy issues, particularly in connection with social media and mobile applications.

FTC's Revised Internet Privacy Rules for Children (COPPA) Published in Federal Register

By J. Bradford Currier, Marc Martin, and David Tallman

Providers of online services, websites, and applications directed at children will need to reexamine their policies regarding the collection of information from children in light of new privacy rules issued by the Federal Trade Commission and recently published in the Federal Register. The rules impose new obligations under the Children’s Online Privacy Protection Act (“COPPA”), which generally requires online services and websites to notify parents and obtain parental consent before collecting, using, or disclosing personal information from children under 13 years of age. The revised regulations were announced late last month and represent the result of more than a year-long examination of COPPA by the FTC that produced hundreds of comments from stakeholders. The COPPA revisions are just the latest step for the FTC regarding children’s privacy, which has been the focus of increased enforcement action and staff reports over recent years.

The new regulations revise the COPPA rules in five key areas:

(1) Definition Changes: The new rules expand the definition of online “operators” subject to COPPA to include child-directed sites or services that use third parties, such as plug-ins or advertising networks, to collect personal information from children. However, the FTC clarified that this change is not intended to extend liability to platforms that merely offer access to child-directed apps (such as online application storefronts). In addition, the definition of “personal information” for which parental consent must be obtained now will include location information, as well as photos, videos, and audio files that contain a child’s image or voice.

(2) Permitted Operations: The new rules permit the collection of a child’s personal information through the use of “persistent identifiers,” which recognize users over time and across different websites, for the sole purpose of supporting the online service’s internal operations (e.g., contextual advertising, frequency capping, legal compliance, site analysis, and network communications). Such information, however, cannot be used to contact a specific user, amass a profile on that person, or for any other purpose. Operators also may allow children to participate in interactive communities without parental consent, so long as they take reasonable measures to delete a child’s personal information before it is made public.

(3) “Age Screens”: The new rules distinguish between online services and websites whose primary target audience is children and those directed to a broader audience. Services whose primary target audience is children must continue to presume that all of their users are children subject to COPPA, but services directed to a broader audience may implement procedures to “age screen” users and obtain consent only for users who self-identify as under 13 years of age.

(4) Parental Notice and Consent: The new rules amend the parental notice provisions to require concise and timely notices to parents and access to operators’ privacy policies. Specifically, the notice to parents should clearly explain: (i) the types of personal information to be collected; (ii) that the parent’s consent is required for the collection of such information; and (iii) how to provide consent. The parental notice must also include a hyperlink to the operator’s privacy policy. The revised rules add several new methods for operators to obtain parental consent, including: (i) electronic scans of signed parental consent forms; (ii) videoconferencing; (iii) government-issued identification; and (iv) alternative payment systems (such as credit or debit cards), if they meet certain criteria. The FTC also established a voluntary 120-day notice and comment process so parties can seek approval of particular parental consent methods.

(5) Increased Security Measures: The new rules require websites and services to take reasonable steps to ensure that children’s personal information is released only to service providers and third parties capable of maintaining the confidentiality, security, and integrity of such information. The new rules allow the retention of children’s personal information only for as long as reasonably necessary to provide the service and also require operators to securely dispose of the information.

The revised COPPA regulations have drawn a mixed reaction. Some stakeholders have praised the FTC for striking an appropriate balance between privacy and innovation, while others have suggested that increased compliance costs will lead to a decline in online service and mobile application development. Either way, the new rules reaffirm the importance of COPPA compliance and children’s privacy issues to online services.

The revised COPPA rules will take effect on July 1, 2013.

FTC Report Investigates Mobile Apps for Kids

By Samuel Castic

Federal Trade Commission staff recently released a report titled “Mobile Apps for Kids: Disclosures Still Not Making the Grade,” which contained the FTC’s most recent mobile app investigative findings that build upon its report from February of this year. The February report contained four key recommendations, which we summarized in a prior post.

This new report expanded on the FTC’s prior investigation by reviewing mobile app features and comparing them to disclosures made concerning the apps. The FTC found that many apps shared kids’ information with third parties without disclosing such practices to parents. Specifically:

1.      Most apps failed to disclose information collection or sharing practices before the apps were downloaded;

2.      Many apps failed to disclose that they contained advertising content or that the app shared user data with third-party advertising networks (including device IDs, geolocation information, and phone numbers);

3.      Some apps failed to disclose that they allowed in-app purchases;

4.      Some apps failed to disclose that they contained social media integrations that allow users to communicate with members of social networks; and

5.      Some app disclosures included false information. For example, certain apps expressly stated that user information would not be shared or that the apps did not contain advertising, when that was not the case.

The FTC has taken the position that mobile apps are online services for purposes of the Children’s Online Privacy Protection Act (“COPPA”), which prohibits the online collection of personal information concerning children under age 13, except in certain circumstances. As we have noted in prior posts, this area is fraught with risk and legal exposure. Indeed, the report indicates that the FTC staff plans to launch “multiple nonpublic investigations” to determine whether certain players in the mobile app space have violated COPPA or engaged in unfair acts or deceptive practices in violation of the FTC Act.

The report concludes by urging the mobile app industry to carry out the recommendations from the FTC’s recent privacy report—most notably, to:

1.      Incorporate privacy protections into the design of mobile products and services;

2.      Offer parents easy-to-understand choices about data collection and sharing through kids’ apps; and

3.      Provide greater transparency about how data is collected, used, and shared through kids’ apps.

Stay tuned in the upcoming weeks as the FTC is expected to announce new COPPA regulations that could impose further compliance challenges for mobile apps.

FTC Chairman and Experts to Examine Mobile and Online Privacy in Upcoming Webcast

A live webcast program entitled Privacy Untangled, featuring Federal Trade Commission Chairman Jon Leibowitz and an expert panel, will be carried on Broadband US TV on Friday, October 26, 2012, from 1:00-2:30 p.m. ET.

Balancing privacy with commercial interests has become increasingly complex and contentious, as businesses and government organizations rely on the collection, storage, and sharing of online and mobile consumer data. Recent regulatory initiatives, including the White House’s proposed Consumer Privacy Bill of Rights and related workshops, and the privacy enforcement actions and best practices reports of the FTC have placed evolving privacy practices in the spotlight. In addition, privacy watchdog groups continue to criticize the government’s privacy initiatives as insufficient, while service providers complain that the government is over-reaching in its regulatory approach toward industry privacy practices.

An in-depth examination of these issues will be provided in a live webcast with co-hosts Marty Stern of K&L Gates and Jim Baller of the Baller Herbst Law Group. In addition to special guest FTC Chairman Jon Leibowitz, the program will feature an expert panel with Sue Kelley, American Public Power Association General Counsel; Deborah J. Matties, Attorney Advisor to FTC Chairman Leibowitz; Emily Mossberg, Principal at Deloitte & Touche LLP; Ross Shulman, Public Policy and Regulatory Counsel at the Computer and Communications Industry Association; Berin Szoka, President at TechFreedom; and Peter Swire, former Chief Counsel for Privacy under President Clinton and current professor at the Ohio State University.

The panel will engage in a lively discussion regarding privacy issues and the government’s recent initiatives to adjust privacy regulations for an evolving online and mobile marketplace.

You can register for the webcast here (free registration required).

FTC Releases Mobile App Privacy and Advertising Guide

By J. Bradford Currier, Marc Martin, and Samuel R. Castic

Developers of mobile applications are urged to adopt truthful advertising practices and build basic privacy principles into their products under guidance recently issued by the Federal Trade Commission. The guidance is aimed at providing mobile app start-ups and independent developers with marketing recommendations designed to ensure compliance with federal consumer protection regulations. The guidance follows recent actions by the Federal Communications Commission, the White House, states, private stakeholders, and the FTC itself to establish mobile privacy codes of conduct and safeguard consumer information. The FTC guidance focuses on two key regulatory compliance areas for mobile app developers: (1) truthful advertising and (2) consumer privacy.

(1)        Truthful Advertising – The guidance recommends that mobile app developers always “[t]ell the truth about what your app can do.” The FTC cautions mobile app developers that anything a developer tells a prospective buyer or user about their app can constitute an advertisement subject to the FTC’s prohibitions on false or misleading claims. As a result, mobile app developers are encouraged to carefully consider the promises made concerning their apps on websites, in app stores, or within the app itself. Specifically, the guidance reminds mobile app developers that any claim that an app can provide health, safety, or performance benefits must be supported by “competent and reliable” scientific evidence. The FTC notes that it has taken enforcement action against mobile app developers for suggesting that their apps could treat medical conditions and recommends app developers review the FTC’s advertising guidelines before making any claims to consumers.

The guidance also advises mobile app developers to disclose key information about their products “clearly and conspicuously.” While the guidance recognizes that FTC regulation does not dictate a specific font or type size for disclosures, mobile app developers are encouraged to develop disclosures that are “big enough and clear enough that users actually notice them and understand what they say.” The FTC warns that it will take action against developers that attempt to “bury” important terms and conditions in long, dense licensing agreements. 

(2)        Consumer Privacy – The guidance calls upon mobile app developers to build privacy considerations into their products from the start, also known as “privacy by design” development. The FTC suggests that mobile app developers establish default privacy settings which would limit the amount of information the app will collect. The FTC also recommends that app developers provide their users with conspicuous, easy-to-use tools to control how their personal information is collected and shared. The guidance pushes mobile app developers to get users’ express agreement to: (1) any collection or sharing of information that is not readily apparent in the app; (2) any material changes to an app’s privacy policy; or (3) any collection of users’ medical, financial, or precise geolocation information. At all times, mobile app developers should be transparent with consumers about their data collection and sharing practices, especially when the app shares information with other entities. 

The FTC also advocates that mobile app developers install strong personal information security protections in their products. In order to keep sensitive data secure, the guidance suggests that mobile app developers: (1) collect only the data they need; (2) secure the data they keep by taking reasonable precautions against well-known security risks; (3) limit access to a need-to-know basis; and (4) safely dispose of data they no longer need. Mobile app developers are also encouraged to establish similar standards with any independent contractors.

The guidance also pays special attention to mobile apps’ protection of children’s privacy under the Children’s Online Privacy Protection Act (“COPPA”). The guidance reminds mobile app developers that, if their apps are “directed to” kids under age 13, they must clearly explain their information practices and get parental consent before collecting personal information from children, and must keep such information confidential and secure. The FTC’s recommendations parallel its recently proposed rules designed to clarify the responsibilities under COPPA when third parties (such as advertising networks or downloadable “plug-ins”) collect personal information from users on child-directed websites. Mobile app developers are encouraged to contact the FTC or review the Bureau of Consumer Protection’s business resources when developing their privacy policies.

Comment Deadline on Proposed Children's Online Privacy Rules Extended by FTC

By J. Bradford Currier and Marc Martin

The Federal Trade Commission has extended the deadline for comments on the agency’s proposed revisions to the Children’s Online Privacy Protection Act (“COPPA”) released earlier this month. As we reported previously, the proposed rules would modify key definitions contained in COPPA to clarify the parental notice and consent responsibilities of website operators as well as third-party advertisers and “plug-in” developers that collect personal information on children. The proposed rules also aim to establish clear guidelines for the use of so-called “persistent identifiers” and would potentially allow websites which appeal to a general audience to “age screen” users by birth date and provide parental notice and obtain consent only for users who identify themselves as under 13 years of age. 

Comments on the proposed rules will now be accepted until September 24, 2012.

Revisions to Children's Online Privacy Rules Proposed By FTC

By J. Bradford Currier, Marc Martin, and Lauren Pryor

Websites, social media platforms, software “plug in” developers, and online advertisements aimed at children may face new restrictions under proposed rules recently released by the Federal Trade Commission. The proposed rules would modify key definitions contained in the Children’s Online Privacy Protection Act (“COPPA”), which requires websites or online services directed at children under the age of 13 to seek and obtain parental consent before collecting or using a child’s personal information. With the new definitions, the FTC aims to clarify the responsibilities under COPPA when third parties (such as advertising networks or downloadable “plug-ins”) collect personal information from users on child-directed websites. The proposed rules represent another example of the FTC’s recent efforts to expand its enforcement on a variety of privacy-related issues related to children. Comments on the proposed rules will be accepted until September 10, 2012.

The proposed rules modify the scope of the FTC’s COPPA Notice of Proposed Rulemaking released in September 2011. As we reported previously, the earlier proposals would have expanded the definition of personal information to include so-called “persistent identifiers,” which represent unique user identification information obtained through “cookies” or other methods for purposes other than to support the website/service’s internal operations. The initial proposals would also have extended COPPA protections to photographs, videos, or audio files that include a child’s image or voice. The prior proposals further stated that the FTC would consider a wider range of factors, including whether a website included child celebrities and music content, when determining whether a website or online service was directed to children. Stakeholders submitted hundreds of comments in response to the 2011 proposals, leading the FTC to release this new round of proposed rule changes.

The new proposed rules modify the obligations under COPPA in three key areas:

(1)        Website Operators

Previous FTC guidance suggested that the responsibility for providing notice to parents and obtaining consent for the collection of personal information from children rested with the entity actually collecting the information. As a result, a child-directed website/service operator could permit others to collect personal information from child visitors without taking responsibility for seeking and obtaining parental consent. The proposed rules would now hold both the child-directed website/service operator and any third parties collecting information on such operator’s behalf responsible for the parental consent requirements. Specifically, the FTC stated that “an operator of a child-directed site or service that chooses to integrate into its site or service other services that collect personal information from its visitors should be considered a covered operator under [COPPA].” The FTC noted that the website/service operator is often in the best position to give notice and obtain consent from parents and can control which third-party plug-ins, software downloads, or advertising networks are integrated into its site.

(2)        Website/Service Directed to Children

The COPPA rules only apply to websites/services “directed to children.”  The new rules would clarify that a third-party plug-in, software download, or advertising network is covered under COPPA when the third-party provider “knows or has reason to know” that it is collecting personal information through a child-directed website or online service.  The new rules would not require third-party providers to monitor or investigate whether their services are incorporated into child-directed websites/services, but providers may not ignore information brought to their attention indicating that incorporation has occurred.

The proposed rules also attempt to address the fact that some websites/services that contain child-oriented content may also be of interest to adults. Under current FTC rules, these sites must treat all visitors as under 13 years of age. In response, some commenters suggested that the FTC adopt a system that would permit websites/services directed to a broad audience to implement procedures to differentiate among users and require notice and consent only for users who self-identify as under 13 years of age. The FTC agreed. The new rules allow general audience websites/services to “age screen” all users (i.e., by supplying a birth date) and provide notice and obtain consent only for users who identify themselves as under 13 years of age. The FTC recognized that child users may lie about their age, but thought the age screening process “strike[s] the correct balance” between privacy and access. However, child-directed websites/services that knowingly target children under 13 as their “primary audience” or whose overall content is likely to attract children under 13 must continue to treat all users as children under COPPA.
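
To make the mechanics concrete, here is a minimal sketch of what an age screen might look like in code. The threshold, function names, and consent-flow labels are hypothetical; the rule describes the required outcome (notice and consent for self-identified under-13 users), not any particular implementation.

```python
# Hypothetical age-screen sketch for a general-audience website or service.
from datetime import date

COPPA_AGE_THRESHOLD = 13

def is_under_13(birth_date, today=None):
    """Return True if the supplied birth date corresponds to a user under 13."""
    today = today or date.today()
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    age = today.year - birth_date.year - (0 if had_birthday else 1)
    return age < COPPA_AGE_THRESHOLD

def handle_signup(birth_date, today=None):
    if is_under_13(birth_date, today):
        # Defer collection of personal information until a parent
        # receives notice and provides verifiable consent.
        return "start_parental_notice_and_consent"
    return "continue_signup"

# Example: on Aug. 1, 2012, a user born June 1, 2002 is 10 and triggers the flow.
print(handle_signup(date(2002, 6, 1), today=date(2012, 8, 1)))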

(3)        Persistent Identifiers and Website/Service Support

The new rules clarify how child-directed websites/services can use persistent identifiers. The FTC first reiterated its 2011 proposal that persistent identifiers should be included in the definition of personal information. The FTC then stated that website/service operators may still use persistent identifiers without obtaining consent for activities such as performing site maintenance and analysis; performing network communications; authenticating users; maintaining user preferences; serving contextual advertisements; and protecting against fraud and theft. The exemption would not apply when the information collected through persistent identifiers is used to contact a user directly, including through the use of behaviorally-targeted advertising, or for any other purpose.

Obama Administration Pursues Mobile Privacy Code of Conduct

By J. Bradford Currier and Marc Martin

The National Telecommunications and Information Administration (“NTIA”) will hold its first meeting on July 12, 2012 aimed at developing voluntary codes of conduct designed to provide consumers with clear information regarding how personal data is handled by companies which develop and offer applications for mobile devices. The NTIA’s planned meetings with stakeholders were first announced in February 2012 as part of the White House’s proposed Consumer Privacy Bill of Rights. The NTIA meeting comes as both the Federal Trade Commission and Federal Communications Commission have recently taken action to improve consumer transparency and privacy safeguards for personal information collected by mobile apps.

A number of stakeholders have already filed comments expressing their support for improving the clarity and comprehensiveness of privacy disclosures provided to mobile app consumers. However, a number of commenters noted that the rapid pace of innovation in the mobile app market and the relatively small screen sizes of current mobile devices will make long-term, definitive disclosure rules difficult to develop. While NTIA hopes to tackle a number of Internet policy topics, including copyright and cybersecurity issues, the organization chose mobile app privacy as the first meeting topic because it believes consensus on a code of conduct can be reached “in a reasonable timeframe.” NTIA expects the mobile app privacy meeting will serve as a useful precedent for later discussions involving other online consumer protection concerns.

The NTIA meeting is open to all interested stakeholders and a venue should be announced before the end of the month. Interested stakeholders are asked to inform NTIA online in advance if they plan to attend the meeting.

Telemarketing Robocall Rules Published in Federal Register

By J. Bradford Currier, Marc Martin, and Marty Stern

Regulations restricting the use of autodialers and prerecorded voice messages in telemarketing (a practice known as “robocalling”) adopted by the Federal Communications Commission in February 2012 were recently published in the Federal Register. As we reported previously, the regulations require telemarketers to obtain a person’s prior express written consent before placing a robocall, but contained exemptions for certain “informational” autodialer calls, such as debt collection inquiries, airline delay information, and fraud detection calls from banks. The new rules are designed to harmonize the FCC’s regulations under the Telephone Consumer Protection Act with the recently amended telemarketing rules issued by the Federal Trade Commission. While the Report and Order adopting the regulations has an effective date of July 11, 2012, the federal Office of Management and Budget must still approve key aspects of the new rules. Specifically, publication of OMB’s approval in the Federal Register will trigger a number of compliance deadlines, including:

  • Telemarketers must receive a consumer’s “prior express written consent” before making any robocall to a wireless number or residential line.  The written consent requirement will take effect one year after publication of the OMB’s approval of the new regulations in the Federal Register.
  • Previously, telemarketers could rely on an established business relationship (“EBR”) with the consumer for robocall solicitations.  At the end of a one-year period following publication of OMB’s approval of the new rules in the Federal Register, the EBR exception will be eliminated.
  • All telemarketing robocalls must provide “an interactive opt-out mechanism” that allows consumers to be placed on a telemarketer’s internal “do-not-call” list.  Telemarketers will be required to provide an opt-out mechanism within 90 days of the publication of the OMB’s approval of the new regulations in the Federal Register.
  • Telemarketers must employ technologies to limit the number of “abandoned” robocalls that result in dead air.  Telemarketers will be required to meet certain abandoned call benchmarks within 30 days of the publication of the OMB’s approval of the new regulations in the Federal Register.

Myspace and FTC Agree to Privacy Consent Order

By J. Bradford Currier

Social networking site Myspace has agreed to a proposed consent order with the Federal Trade Commission which provides for independent privacy audits for 20 years. The FTC alleged that Myspace made its users’ unique identifiers, known as “Friend IDs,” available to advertisers despite its privacy policy promising that the company would not share users’ personally identifiable information without first giving notice to users and receiving their consent.

The FTC’s complaint alleged that Myspace violated Section 5(a) of the FTC Act by misleading users about how Myspace shares their personal information with third parties. Advertisers could use a Friend ID to access a user’s Myspace profile, which in many cases contained the user’s full name in addition to age, gender, and profile picture. The FTC warned that an advertiser could combine users’ personal data with additional information obtained from the advertiser’s tracking “cookies” to effectively track users across the Internet.

Under the terms of the proposed consent order, Myspace would be required to establish and maintain a “comprehensive privacy program” designed to address privacy risks and protect the confidentiality of users’ personal information. Myspace further agreed to biennial audits of its privacy program from a “qualified, objective, independent third-party professional” for 20 years to ensure that the company’s privacy controls meet the dictates of the consent order. Myspace would also be required to abide by stringent recordkeeping requirements to assist the audit process, including retaining records of customer complaints and legal actions against the company. In addition, the proposed consent order prohibits Myspace from misrepresenting its privacy policies and compliance with industry privacy frameworks, such as the U.S./EU Safe Harbor Framework, which governs transfers of personal data from the European Union to the United States.

The proposed consent order, one of several recent aggressive enforcement efforts by the FTC, will be subject to public comment for 30 days, after which the FTC will decide whether to amend the proposed order or make the consent order final.

Consumer Privacy Report Released By FTC

By J. Bradford Currier and Lauren B. Pryor

The Federal Trade Commission recently released its long-awaited Final Report on protecting consumer privacy, in which it stated that consumers should have more choice and control over how their personal information is collected and used. The FTC’s Final Report offers non-binding recommendations for companies “that collect or use consumer data that can be reasonably linked to a specific consumer, computer, or other device.” The Final Report comes more than a year after the FTC first issued its proposed framework for regulating consumer privacy and just a month after the White House released a proposed Consumer Privacy Bill of Rights.

Recognizing the potential burden of the Final Report’s recommendations on small businesses, the FTC stated that its conclusions did not apply to companies that merely collect, and do not transfer, non-sensitive data on fewer than 5,000 consumers a year. Similarly, a company’s data collection practices may fall outside the scope of the Final Report if:  (1) a given data set has been reasonably stripped of personally identifiable information; (2) the company publicly commits not to re-identify such information; and (3) the company requires any secondary users of the data to keep it in de-identified form. While the majority of the Final Report discusses protecting consumer privacy online, the FTC noted that its recommendations would also apply to companies collecting personal information offline, such as financial institutions and healthcare providers. With these qualifications, the Final Report provides three best practices for companies collecting personal information from consumers:

(1)        Privacy By Design

The Final Report recommends that companies build in consumer privacy protections at every stage of the development of their products and services. Specifically, companies should incorporate reasonable procedures for collecting, securing, and retaining customer data. The Final Report commends a number of leading online service companies that have adopted stringent encryption systems in the face of increasing cyberattacks. Companies should limit data collection to activities that are “consistent with the context of a particular transaction,” and provide prominent notices to consumers regarding the collection of data unrelated to the requested service. Companies should also destroy consumer data when the company no longer needs this information to provide the requested service. On this point, the FTC expressed support for offering consumers an “eraser button” on social media websites to allow the deletion of personal information at the user’s discretion. Additionally, companies should ensure that collected data remains accurate and offer customers an opportunity to correct erroneous information. If companies adopt these policies, the default privacy settings of most online services would be strong.

(2)        Simplified Consumer Choice

The FTC also advised companies to provide easy-to-use mechanisms allowing customers to determine how their data is collected and used. The application of the simplified consumer choice policy will vary depending on the context of the interaction between the company and the consumer. For example, a car dealership may send a coupon to a customer based upon personal information obtained during prior purchases at the dealership without providing the customer with a choice. By contrast, if the car dealership intends to sell that customer’s personal information to a third-party data broker for use in unrelated marketing activities, the car dealership must provide the consumer with the ability to prevent the sale of his or her information. 

For most online services, the FTC suggested that companies allow users to choose data sharing preferences during the registration process or at least before any personal information is collected. The FTC identified company practices requiring consumers to disclose personal data in order to obtain important services on a “take it or leave it basis” as especially problematic and inconsistent with the public interest. 

The Final Report generally concludes that companies should provide consumers with the ability to opt out of being tracked across third parties’ websites. However, the FTC stopped short of recommending that Congress pass “do not track” legislation and stated that the FTC would work closely with stakeholders to develop an industry-led solution. The FTC also reaffirmed the commitment expressed in its recent enforcement actions to require companies to provide prominent disclosures and obtain express affirmative consent before making material retroactive changes to privacy policies and before collecting especially sensitive information, such as health, financial, and precise geolocation data. The Final Report indicates that the FTC will host a workshop on the concerns raised by the data collection practices of large ISPs, search engines, and social networking platforms later this year.

(3)        Information Collection Transparency

In accordance with industry guidance on mobile applications released earlier this month, the Final Report calls for “clearer, shorter, and more standardized” privacy policies. This recommendation can result in a Catch-22: if brevity comes at the expense of accuracy, there is increased risk that a privacy policy will be deemed misleading, deceptive, or insufficient. The FTC noted that screen size limitations on mobile phones further compound the difficulties with providing sufficient privacy disclosures and stated that it will host a workshop in May 2012 regarding how mobile privacy disclosures can be short, effective, and understandable on small screens.

The Final Report also encourages transparency by recommending that companies allow consumers more options to access their personal data. Specifically, the FTC indicated its support for recent legislation that would give consumers the right to access information held by data brokers. The Final Report also suggests that the data broker industry should explore the idea of creating a centralized website where data brokers identify themselves to consumers, describe how they collect consumer data, and disclose the types of companies to which they sell information. At a minimum, the Final Report asks all companies collecting personal data to improve their consumer outreach and education efforts relating to data privacy practices.

Mobile App Platforms Reach Voluntary Agreement with California State Attorney General

By Samuel R. Castic and J. Bradford Currier

Californians who download mobile applications on their smartphones, tablets, and other mobile devices should soon have greater knowledge of how their personal information is collected and used under a non-binding Joint Statement of Principles recently reached between six mobile app platforms, including Apple, Inc., and the California Office of the Attorney General. The California announcement comes just days after the Federal Trade Commission warned app developers to improve privacy disclosures for mobile apps directed at children and within hours of the White House’s announcement of a Consumer Privacy Bill of Rights to protect citizens online.

Although the agreement does not create any new legal obligations for app providers, the parties agreed to voluntarily abide by five privacy principles: 

(1) Any app that collects personal data from a user, regardless of age, “must conspicuously post a privacy policy or other statement describing the app’s privacy practices” that informs the user how the data will be used and shared. California law already requires websites and online services to post privacy policies when they collect personally identifiable information about users. Despite this obligation, the California Attorney General reported that only 5 percent of mobile apps currently offer a privacy policy, although other parties suggest that the figure is approximately 33 percent. The agreement makes clear that the California Attorney General views mobile applications as online services subject to this law. 

(2) The agreement modifies the app submission process to make it easier for app developers to include a link to, or the text of, the privacy policy governing the app. However, the agreement contains no commitment by app platforms to notify users when a privacy policy changes. 

(3) The app platforms will create reporting procedures for users to identify apps that do not comply with applicable terms of service or applicable law. 

(4) The app platforms agreed to implement a response process to handle reported violations of app privacy policies. 

(5) The parties agreed to work together to develop “best practices” for mobile privacy policies. 

While no timetable exists for implementation of the agreement, the parties agreed that they will reassess the state of app privacy policies within six months.

FTC Warns Mobile App Developers About Privacy Practices

By Samuel Castic, J. Bradford Currier, and Lauren B. Pryor

In another example of its recent efforts to step up enforcement on a variety of privacy-related issues, the Federal Trade Commission released a staff report on privacy disclosures for mobile applications used by kids. The report follows a recent FTC enforcement action against a mobile app developer for children and a notice of proposed rulemaking to amend the Children’s Online Privacy Protection Act (“COPPA”). The staff report represents a “warning call” to the app industry to provide parents with easily accessible, basic information about the mobile apps that their children use.

Under COPPA, operators of mobile apps directed at children under the age of 13 must provide notice and obtain parental consent before collecting personal information from children. The report surveyed approximately 1,000 apps designed for children and reviewed the types of privacy disclosures currently available to parents and kids. The FTC found that users frequently received the privacy disclosures only after downloading the app, limiting parents’ ability to “pre-screen” apps for their children. Additionally, the FTC reported that app websites often failed to provide meaningful notice regarding the data collection features of the app, such that parents were not informed as to whether the app collected data from their children, the type of data collected, the purpose for such collection, and what parties may access such data. The FTC found this lack of disclosure troubling, especially in light of current technologies that allow mobile apps to access a child’s information with the click of a button and to transmit it invisibly to a wide variety of entities.

In light of these concerns, the report offered four key recommendations:

  • App developers should provide “simple and short” privacy disclosures that are easy to find and understand on a mobile device screen;
  • App developers should “alert parents if the app connects with any social media, or allows targeted advertising to occur through the app”;
  • Third parties obtaining user information through apps should make their privacy policies “easily accessible” through a link on the app promotion page or in the app developer’s disclosures; and
  • Companies that provide platforms for downloading mobile apps should take action to help better protect kids and inform parents (e.g., develop a uniform practice for developers to disclose data collection practices).

The FTC plans to conduct additional reviews over the next six months to determine whether to take enforcement action against app developers that violate COPPA. The FTC also plans to hold a workshop on mobile privacy issues later this year.

FCC Substantially Revises Robocall Telemarketing Rules

By J. Bradford Currier, Marc Martin, and Marty Stern

In response to “thousands” of complaints from consumers, the Federal Communications Commission unanimously adopted a Report and Order requiring telemarketers to obtain a person’s prior express written consent before placing a call using an autodialer or artificial/prerecorded voice (a practice known as “robocalling”). The FCC exempted certain “informational” robocalls, such as debt collection inquiries, airline delay information, and fraud detection calls from banks, from the new rules. The new rules are designed to make the FCC’s regulations under the Telephone Consumer Protection Act (“TCPA”) more consistent with the recently amended telemarketing rules issued by the Federal Trade Commission.

The robocalling regulations seek to protect consumers from unwanted telemarketing robocalls in four key ways:

            (1)        Prior Express Written Consent Requirement

Under the new rules, telemarketers must receive a consumer’s “prior express written consent” before making any robocall to a wireless number or residential line. Before consenting, the consumer must receive a “clear and conspicuous disclosure” of the consequences of providing consent and “unambiguously” agree to receive telemarketing calls at a designated telephone number. Telemarketers may not require robocall consent as a condition of purchasing any good or service. In the event of a dispute concerning consent, the telemarketer will bear the burden of showing that the required disclosure was provided and that it obtained sufficient written consent. Telemarketers may receive the required written consent through an electronic signature in compliance with state law or the federal E-SIGN Act, which allows agreements to be made through email, website form, text message, telephone key press, and other methods. Telemarketers will be required to comply with the written consent requirements within a year of the publication of the federal Office of Management and Budget's approval of the new rules in the Federal Register.

Critically, the FCC carved out a number of exceptions to the express written consent requirement, including for emergency calls, service calls from a customer’s carrier if the customer is not charged, and health-care related calls regulated under the federal Health Insurance Portability and Accountability Act. Additionally, the Report and Order stated that the prior written consent requirement only applies to telemarketing calls and that the FCC was “leav[ing] undisturbed the regulatory framework for certain categories of calls.” Specifically, solicitations from tax-exempt charitable organizations, political messages, and school-closing notifications would not require prior express written consent. The exemption for political messages is particularly notable in a presidential election year, as robocalls have become an increasingly common technique of political candidates and parties, and consumers should expect no relief from those calls under the new rules. In addition, informational calls, such as debt collection calls, airline delay notifications, bank account fraud alerts, survey inquiries, and wireless usage warnings, were also excluded from the new requirements to the extent that these communications do not include advertisements or telemarketing messages. The informational call exception represented a major victory for a broad cross-section of industry stakeholders, including credit card companies, financial services firms, and airlines, which commented that companies should not need to receive prior express written consent before providing critical financial and travel information to consumers.

            (2)        Eliminating the “Established Business Relationship” Exemption 

Under existing FCC rules, telemarketers could normally place robocalls to residential landlines without prior consent when the telemarketer had an “established business relationship” with the consumer, usually based on the consumer’s previous purchases of the telemarketer’s goods or services. The Report and Order eliminated this exception, stating that telemarketers often misused this exemption to send frequent robocalls offering unrelated and unwanted services. The FCC stated that the electronic signature provisions of the new regulations would lessen any new compliance costs caused by the elimination of the established business relationship exception. Additionally, the Report and Order noted that the FTC eliminated the established business relationship exception for telemarketers under its jurisdiction in 2008, meaning that many telemarketers had already adapted their practices to comply with the new regulations.

            (3)        Establishing an Opt-Out Mechanism

Previously, a customer who wished to opt out of receiving robocalls had to dial a telephone number provided by the telemarketer to register his or her request. Many consumers complained that the call-back system was burdensome and ineffective. Under the new regulations, all telemarketing robocalls must provide “an interactive opt-out mechanism” that is announced at the outset of the call and available throughout the recorded message. If the consumer selects the opt-out mechanism, the telemarketer must automatically add the consumer’s number to the telemarketer’s internal “do-not-call” list and immediately disconnect the call. Additionally, if the robocall is received by an answering machine or voicemail service, the message must contain a toll-free number that allows the consumer to call back and connect directly to the opt-out mechanism. The Report and Order did not prescribe a particular opt-out mechanism to be used by telemarketers. Telemarketers will be required to provide an opt-out mechanism within 90 days of the publication of the OMB's approval of the new rules in the Federal Register.
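As a rough illustration of the mechanics described above, the following minimal sketch in Python models an opt-out handler that records the caller's number on an internal do-not-call list and ends the call. The class and method names are hypothetical; as noted, the Report and Order prescribes no particular mechanism.

    # Hypothetical, simplified sketch of the opt-out flow described above;
    # the FCC did not prescribe any particular mechanism, and these names
    # are invented for illustration.
    class RobocallSession:
        def __init__(self, consumer_number: str, internal_dnc_list: set):
            self.consumer_number = consumer_number
            self.internal_dnc_list = internal_dnc_list
            self.connected = True

        def announce_opt_out(self) -> str:
            # The opt-out option must be announced at the outset of the call
            # and remain available throughout the recorded message.
            return "Press 1 at any time to stop receiving these calls."

        def handle_opt_out(self) -> None:
            # On opt-out: add the number to the internal do-not-call list
            # and immediately disconnect the call.
            self.internal_dnc_list.add(self.consumer_number)
            self.connected = False

    dnc_list = set()
    call = RobocallSession("202-555-0100", dnc_list)
    print(call.announce_opt_out())
    call.handle_opt_out()
    print(dnc_list, call.connected)   # {'202-555-0100'} False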

            (4)        Limiting Abandoned Calls Through Predictive Dialers

Telemarketers often use “predictive dialer” technologies, which initiate the next phone call while the telemarketer is on the phone with another consumer. If the telemarketer is unable to take the subsequent call, the consumer usually experiences a hang-up or “dead air.” The FCC’s rules limit the number of these abandoned calls and require telemarketers to employ technologies ensuring that no more than three percent of all calls result in abandonment. However, the FCC’s prior rules did not include a clear timeframe for measuring the rate of abandoned calls. The revised rules now calculate the abandonment rate during a “single calling campaign” over a 30-day period. The Report and Order adopted the FTC’s definition of a calling campaign as “the offer of the same good or service for the same seller,” even if the telemarketer uses different scripts containing different wording in support of the same campaign. The FCC stated that the 30-day period would provide a reasonable assessment of the abandonment rate and would take into account fluctuations caused by time of day, operator availability, and the number of phone lines used by the telemarketer. The revised abandonment rate measurement rules will become effective within 30 days of the publication of the OMB's approval of the new rules in the Federal Register. Interested parties will then have an opportunity to file petitions for reconsideration under a timeframe that will be announced by the FCC.
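The abandonment-rate benchmark lends itself to a short worked example. The following sketch in Python, using invented figures and function names, computes the share of abandoned calls in a single 30-day calling campaign and compares it to the three percent benchmark.

    # Hypothetical worked example of the three percent abandonment benchmark,
    # measured across a single calling campaign over a 30-day period.
    def abandonment_rate(abandoned_calls: int, total_calls: int) -> float:
        """Fraction of campaign calls that ended in a hang-up or dead air."""
        return abandoned_calls / total_calls if total_calls else 0.0

    def meets_benchmark(abandoned_calls: int, total_calls: int,
                        limit: float = 0.03) -> bool:
        return abandonment_rate(abandoned_calls, total_calls) <= limit

    # Invented figures for one 30-day campaign for the same good or service.
    total, abandoned = 120_000, 3_100
    print(f"Abandonment rate: {abandonment_rate(abandoned, total):.2%}")  # 2.58%
    print("Within 3% benchmark:", meets_benchmark(abandoned, total))      # True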

Facebook Settles Privacy Action with the FTC

By Lauren B. Pryor

On Tuesday, the Federal Trade Commission announced a proposed settlement with Facebook, the world’s largest social media site, in connection with an FTC complaint alleging that Facebook repeatedly deceived consumers by promising to keep certain personal information private and failing to do so. The settlement suggests that the FTC may continue to focus on privacy-related enforcement actions into the New Year.

Among other things, the complaint alleged that Facebook made privacy misrepresentations in connection with the 2009 revamping of its privacy model. The complaint also alleged that:

  • without notice or consent, Facebook changed its privacy features to allow private information – such as Friends Lists – to be publicly available; 
  • although Facebook represented that users could restrict sharing of data to limited groups (e.g., "Friends Only"), such information was shared with third-party apps;
  • Facebook shared personal information with advertisers while promising not to do so; and
  • Facebook continued to allow access to pictures and videos after user accounts were deleted or deactivated.

The proposed settlement bars Facebook from making deceptive privacy claims, requires Facebook to obtain user consent before changing privacy features and requires Facebook to submit to privacy compliance audits over the next 20 years. In addition, under the proposed settlement, Facebook must prevent access to user content after an account has been deleted and establish and maintain a comprehensive privacy program to address privacy risks related to the development and management of new and existing products and services. Notwithstanding the allegations, the proposed settlement does not constitute an admission by Facebook as to any violations of law.

In a blog post in response to the settlement, Facebook CEO Mark Zuckerberg outlined recent modifications to Facebook’s privacy policies designed to address privacy concerns. Zuckerberg further announced that the role of Facebook’s Chief Privacy Officer will be split into two distinct positions to address matters related to policy and products.

The Facebook settlement is evidence of a greater effort by the FTC to hold social media companies accountable for allegedly deceptive privacy practices. Specifically, the FTC recently settled an action with Twitter concerning its data security practices, as well as actions against the operator of www.skidekids.com and the application provider W3 Innovations, LLC, in connection with alleged violations of the Children’s Online Privacy Protection Act.

The FTC will accept public comments on the proposed Facebook settlement through December 30, 2011. Thereafter the FTC will decide whether to make the order final.

FTC Proposes Major Expansion to COPPA's Scope and Compliance Requirements

Update (11/22/11): The FTC extended the deadline for comments on the proposed COPPA reforms until December 23, 2011, citing the complexity of the questions and issues raised by the proposed amendments. The original comment deadline was November 28, 2011.

---------

The Federal Trade Commission recently announced a set of proposed revisions to the Children’s Online Privacy Protection Act (“COPPA”) which would expand the Act’s application to a greater number of websites and online services. COPPA requires that website operators notify parents and obtain parental consent before they collect, use, or disclose personal information from individuals under 13 years of age. Specifically, the proposed rules would expand the definition of personal information to include so-called “persistent identifiers,” which represent unique user identification information obtained for purposes other than the support of the internal operations of a website or online service. The new rules would also extend COPPA protections to photographs, videos, or audio files that include a child’s image or voice. Under the proposed rules, the FTC would consider a wider range of factors, including whether a website features child celebrities and music content, when determining whether the site or online service is directed to children. The proposed rules rejected a number of alternative means of obtaining parental consent proposed by stakeholders and declined to establish a safe harbor for websites and online services which follow best practices guidelines issued by the Direct Marketing Association.

A K&L Gates Client Alert providing a detailed summary of the FTC’s proposed COPPA revisions and an analysis of the potential impacts of the reforms on websites and online services may be found here.

FTC Settles Privacy Case Against Children's Social Networking Site

The Federal Trade Commission recently announced its settlement with the operator of www.skidekids.com, a social media website marketed as the “Facebook and Myspace for kids.” The FTC claimed that the website collected personal information from approximately 5,600 children without parent consent in violation of the Children’s Online Privacy Protection Act (“COPPA”). COPPA requires that website operators notify parents and obtain parental consent before they collect, use, or disclose personal information from individuals under 13 years of age. The agency also alleged that the website’s operator made deceptive claims regarding the website’s privacy policy and information collection practices.

While the Skid-e-Kids website asserted that parents would be contacted by email prior to their child’s use of the site, the FTC found numerous instances where parental notice was not provided and consent was not received. As a result, the site allowed children to create profiles, post personal information, upload pictures, and send messages to other users, resulting in the unauthorized collection of user names, birth dates, email addresses, and cities of residence. 

In addition to barring any future COPPA violations and deceptive privacy claims, the operator of Skid-e-Kids agreed to: (i) destroy all information collected from children in violation of COPPA; (ii) provide online educational material about privacy, retain an online privacy professional, or join an FTC-approved safe harbor program; and (iii) pay a $100,000 civil penalty. All but $1,000 of the penalty will be waived if the operator complies with the settlement’s oversight requirements and supplies accurate financial information to the FTC. The settlement remains subject to court approval.

The settlement is further evidence of the FTC’s recent efforts to step up enforcement on a variety of privacy-related issues. On the same day as the Skid-e-Kids settlement, the FTC reached another settlement with an online advertising company for misleading customers regarding the use of tracking cookies. Less than a month ago, the FTC settled a privacy case against a mobile application developer for alleged COPPA violations. The FTC has specifically emphasized online privacy protections for children, recently launching a website promoting safe use of social networking sites by tweens and teens.

FTC Settles First Privacy Case Involving a Mobile Application

By Samuel Castic

The FTC announced a consent decree and order on Monday settling the civil action that was commenced against W3 Innovations, LLC, and Justin Maples—the entity and person respectively behind the Broken Thumbs Apps brand—for alleged violations of the Children’s Online Privacy Protection Act (“COPPA”). Broken Thumbs Apps developed apps for Apple’s App Store, including Emily’s Girl World, Emily’s Dress Up, Emily’s Dress Up & Shop, and Emily’s Runway High Fashion, which were collectively reported to have more than 50,000 downloads. The FTC announcement indicated that this is the FTC’s first case involving mobile applications, or “apps.”

The FTC found that the apps at issue contained “subject matter, visual content, and language” that was directed at children under the age of 13, which directly implicated the COPPA requirements. The allegations in the complaint suggest that the FTC was most concerned with two aspects of the apps’ operation. First, the FTC took issue with the apps’ invitation for users to “e-mail Emily,” the fictional namesake of the apps, and the developer’s subsequent collection and maintenance of e-mail addresses from individuals who were likely to be children, even though those e-mail addresses were not publicly displayed. Second, the FTC called out the blog feature of several of the apps, which permitted users to submit comments (which could include personal information) and required users to provide a name when submitting a comment.  The app developer was thus alleged to have collected personal information from children under the age of 13 without complying with the COPPA’s requirements that it: (i) provide required notices, including parental notices, about how such information would be collected or used; and (ii) obtain verifiable parental consent before collecting, using, or disclosing personal information about children.

In settling the action, the developers agreed to, among other things: (i) a civil penalty of $50,000; (ii) an obligation to promptly respond to FTC compliance monitoring inquiries and to consent to broad FTC investigatory powers; (iii) reporting obligations for three years on changes in the app developer’s address, employment status, name, or corporate structure and on the app developer’s practices and compliance with the COPPA; (iv) detailed record keeping obligations for six years; (v) a three-year obligation to report the consent decree requirements to specified types of third-party entities that the app developer deals with; and (vi) a mandatory requirement to delete all personal information that was collected without complying with the COPPA. The details of the consent decree provide a continuing set of compliance obligations, and failure to comply in any respect can subject the app developer to further penalties.

This FTC settlement comes several months after a hearing on the COPPA was held by the U.S. Senate Committee on Commerce, Science, and Transportation. This past spring, the chairman of that committee, Senator Jay Rockefeller, made inquiries of companies like Apple and Google to ascertain what efforts they undertake to verify that app developers comply with the COPPA. Significantly, this consent decree may foreshadow continuing FTC interest in COPPA compliance for mobile application developers and content providers.

FTC Launches Antitrust Investigation Against Google

By Ryan Demotte

Last week Google acknowledged in an SEC filing that the Federal Trade Commission has launched a formal antitrust inquiry into the company’s search and advertising business practices and has issued the company a subpoena and a notice of a civil investigative demand. According to news reports, FTC lawyers have been informally gathering information for several months concerning the way Google orders search results and advertising. By taking this step, the FTC can compel Google to turn over a wide range of internal information concerning its business. Rivals argue that Google engages in anticompetitive conduct by using its dominance in the search market to favor its own services and reduce web traffic to competing services. Google already faces a similar investigation by European regulators.

In a response posted on its official blog, Google provided a preview of what may be its core defense to any antitrust allegations – that competition in the search engine market is “only one click away,” and that users are free to use any of a variety of alternatives to Google. 

Using Google is a choice – and there are lots of other choices available to you for getting information: other general-interest search engines, specialized search engines, direct navigation to websites, mobile applications, social networks, and more.   

According to this line of reasoning, since consumers, i.e., search engine users, can switch to alternative search engines at virtually no cost, Google does not have market power. Critics, on the other hand, point to Google’s high market share – over 66 percent in the U.S. and higher in Europe – as evidence that Google effectively controls traffic to web sites, and thus does have market power in search that can be leveraged anti-competitively in other businesses.

While Google has faced antitrust scrutiny on some of its acquisitions in the past, this investigation is the first to focus on its core business of search and advertising, and thus presents potentially serious legal risk for the company.

Restrictive Website Rules Found to Be Anticompetitive

By Scott M. Mendel and Michelle S. Taylon

In Realcomp II, Ltd. v. FTC (6th Cir. April 6, 2011), the Sixth Circuit upheld the Federal Trade Commission's conclusion that Realcomp, a Detroit area multiple listing service, violated Section 5 of the Federal Trade Commission Act by adopting rules restricting the ability of its broker members to advertise discounted brokerage services. While none of Realcomp’s website restrictions eliminated discount brokerage services or information regarding such services, they made such information less accessible and more costly to obtain. That was enough for the court to conclude that Realcomp’s policies had an actual anticompetitive effect based on the decline in the share of listings accounted for by discount listings.

The Realcomp decision could have significant implications for businesses, especially joint ventures, that are considering rules restricting the information that can be disseminated over their websites. Rules that prevent, restrict, or make more costly the dissemination of information relating to discounted services must be reviewed carefully to determine their potential for anticompetitive effects.

Senators McCain and Kerry Introduce Privacy Bill of Rights

On April 12, 2011, Senator John Kerry (D-MA) and Senator John McCain (R-AZ) introduced the “Commercial Privacy Bill of Rights Act of 2011” to establish the first federal statutory baseline of consumer privacy protection that would apply across industry sectors. The bill would govern how customer information is used, stored, and distributed online. We will provide more analysis soon, but for now, here are the highlights:

Information covered. The bill applies to broad categories of information, including names, addresses, phone numbers, e-mail addresses, other unique identifiers, and biometric data when any of those categories are combined with a date of birth, place of birth, birth certificate number, location data, unique identifier information (that does not, alone, identify an individual), information about an individual’s use of voice services, or any other information that could be used to identify the individual.

Right to security and accountability. Information-collecting entities would be required to implement security measures to protect user information and would be prohibited from collecting more individual information than is necessary “to enforce a transaction or deliver a service requested by that individual,” subject to certain exceptions.

Privacy by design. Entities would be required to implement privacy by design concepts, which would require entities to incorporate privacy protection into each stage of product or service development in a manner that is much more comprehensive than previously required anywhere in the United States.

Privacy policies. Entities would be required to have privacy policies or disclosures that provide individuals clear, concise, and timely notice of the entities’ practices “regarding the collection, use, transfer, and storage” of individual information, and entities would also be required to notify individuals when their practices undergo “material changes.”

Right to notice, consent, access and correction of information. The bill would offer individuals the option to opt out of most information collection activities and require that individuals affirmatively consent to the sharing of certain information with third parties and to an entity’s collection of especially sensitive personal information. Individuals would also have the right to access and correct information that entities maintain about them.

Service providers. The bill would require entities that contract with any service provider that has access to individual information to require the service provider to comply with the requirements of the bill, and to comply with the entity’s information policies and practices.

Third parties and data transfers. The bill would restrict the ability to transfer or share individual information with third parties, and would obligate the transferring entity to contract with any such third party for the protection of the individual information before transferring it.

Enforcement. The bill would empower state attorneys general and the Federal Trade Commission (“FTC”) to enforce the new restrictions. It would allow the FTC to develop safe harbor programs for authorized information collection.

Scope. The new rules would apply to non-profit organizations (a potential expansion of FTC authority), telecommunications common carriers (an expansion of FTC authority), and other entities which collect personal information on more than 5,000 individuals in a given year. The bill’s restrictions would not extend to federal and state governments or law enforcement agencies.

The privacy protections follow the decision by many popular Internet browsers to allow users to select a “do-not-track” feature for their searches. Leading Internet merchants and privacy watchdog groups praised the bipartisan bill, calling it “an important step” toward the development of a comprehensive national privacy law, while critics maintain that it does not go far enough to protect consumer privacy rights.