
Lessons from TikTok: Federal and state law implications for children’s biometric data privacy

 


Guest post by Bess Hinson, Ashley Thomas and Bisi Adeyemo, Morris, Manning & Martin, LLP

TikTok has dominated news headlines for its ongoing struggles with privacy advocates and regulators. United States politicians have raised concerns that TikTok poses a national security threat because of its potential ties to the Chinese government, and the Trump administration has moved to ban the app in the U.S. Numerous countries have opened investigations into TikTok’s data collection practices, and India has banned the app from use within its borders. A primary reason governmental authorities are scrutinizing TikTok is its collection of personal information from minors, combined with the alleged lack of security and privacy safeguards tailored to children, who are a core demographic for the app. Countries that have adopted data privacy and biometric laws impose specific requirements on the collection of children’s personal information, and given the nature and purpose of the app, TikTok has also been scrutinized for compliance with privacy laws affecting minors.

In May 2020, TikTok, Inc. (TikTok) and its Chinese-owned parent company ByteDance Inc. were named in a lawsuit filed in a federal court in San Francisco alleging that the companies violated Illinois’s Biometric Information Privacy Act (BIPA) by failing to obtain parental consent before collecting minor users’ biometric data. This lawsuit isn’t the first time TikTok has faced scrutiny over its data collection practices with respect to minors. TikTok reached a settlement with the Federal Trade Commission (FTC) in 2019 over alleged violations of the Children’s Online Privacy Protection Act (COPPA) and has drawn criticism from consumer rights groups. Companies whose products and services are directed toward children need to be aware of federal and state privacy laws regulating biometric collection, as well as the specific requirements those laws impose on the collection of minors’ data.

Illinois biometric privacy law

In response to COVID-19, businesses may increasingly adopt biometric data collection as part of efforts to limit contact and minimize the spread of the virus. Enacted in 2008, Illinois’s BIPA was the first state law governing biometric data collection. The Illinois legislature recognized that biometric information is unlike other unique identifiers or sensitive information and passed BIPA to protect it.

Under Illinois’s BIPA, a “biometric identifier” is defined as a personal feature unique to an individual, such as a retina scan, fingerprint or voiceprint, and specifically includes scans of facial geometry. BIPA further defines “biometric information” as any information based on an individual’s biometric identifier that is used to identify the individual. BIPA also lists identifiers that are excluded from coverage, including writing samples, photographs, and physical descriptions such as height and weight. Before a business can collect biometric information, BIPA requires the business to:

– Provide written notice to an individual, parent or legal guardian that biometric information is being collected and stored;

– Inform the individual, parent or legal guardian how long biometric information will be retained and the specific purpose for which the biometric information will be used;

– Receive a written release from the individual, parent or legal guardian;

– Develop a publicly available written policy that includes a retention schedule and guidelines for permanently destroying the biometric information when the initial purpose for collection no longer exists or within three years after the last interaction between the business and the individual subject, whichever is earlier.
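For companies building against these requirements, the destruction rule in the last bullet reduces to an “earlier of two dates” check. The following is a minimal Python sketch of that rule, offered only as an illustration; the function and parameter names are our own and are not drawn from the statute.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical sketch: three years approximated as 3 * 365 days.
THREE_YEARS = timedelta(days=3 * 365)

def destruction_deadline(
    purpose_ended_on: Optional[datetime],
    last_interaction_on: datetime,
) -> datetime:
    """Date by which biometric data should be permanently destroyed:
    when the initial purpose for collection ends, or three years after
    the individual's last interaction with the business, whichever is earlier."""
    three_year_deadline = last_interaction_on + THREE_YEARS
    if purpose_ended_on is None:
        # Purpose still active: only the three-year clock applies.
        return three_year_deadline
    return min(purpose_ended_on, three_year_deadline)

# Example: the collection purpose ended on 2020-01-15 and the user's last
# interaction was on 2019-06-01, so destruction is due by 2020-01-15.
print(destruction_deadline(datetime(2020, 1, 15), datetime(2019, 6, 1)))
```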

The plaintiffs, who brought the lawsuit against TikTok on behalf of minors identified as P.S. and M.T.W., allege that TikTok collected facial recognition scans of minors without obtaining parental consent before collecting or using the scans. The plaintiffs also claim that TikTok did not disclose what it does with that data, who has access to it, and whether, where, and for how long the data is stored. BIPA permits plaintiffs to obtain statutory damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation. The TikTok plaintiffs are seeking equitable relief and demanding that TikTok comply with BIPA by making publicly available a retention schedule and guidelines for permanently destroying TikTok users’ biometric identifiers and biometric information.

Children’s Online Privacy Protection Act (COPPA)

Last year, the FTC filed a complaint against TikTok alleging that the company violated COPPA by failing to notify parents before collecting personal information from children under the age of thirteen, failing to obtain verifiable parental consent, and failing to delete children’s information at parents’ request. The complaint further alleged that TikTok knew a significant percentage of its users were younger than thirteen and had received numerous complaints from parents that their children, who were under the age of thirteen, had created accounts without the parents’ knowledge.

On February 27, 2019, TikTok agreed to pay the FTC $5.7 million, a record amount at the time, to settle allegations that the company violated COPPA. TikTok also agreed to comply with COPPA going forward and to take all videos made by children under the age of thirteen offline. Although TikTok agreed to comply, the company has since faced numerous allegations that it is still violating COPPA. In December 2019, TikTok paid $1.1 million to settle a lawsuit brought by two parents in California and Illinois who alleged that TikTok had collected personal information from children under the age of thirteen without parental consent. In May 2020, child privacy advocates filed a complaint with the FTC alleging that the company is violating the settlement agreement, including by failing to delete all videos of children under the age of thirteen.

International scrutiny

The United States isn’t the only country scrutinizing TikTok’s data collection practices. In May 2020, the Dutch Data Protection Authority (DPA) announced an active investigation of TikTok’s data collection practices. Noting that a large number of Dutch children had signed up for the app during the pandemic, the Dutch DPA is examining whether the information provided to children using the app is easy to understand and adequately explains how their personal data is collected, processed and used under the European Union’s General Data Protection Regulation (GDPR). The Dutch DPA will also examine whether parental consent is required for TikTok to collect, store and use children’s personal information. In Nigeria, a not-for-profit organization, the Laws and Rights Awareness Initiative, is suing TikTok, claiming its data collection practices violate the Nigerian Data Protection Regulation. On June 29, 2020, India banned TikTok, among other mobile apps, citing national security concerns. All of these investigations and cases are still pending.

Additional state privacy laws

While TikTok hasn’t faced a lawsuit under other state privacy laws, companies that market or serve minors should be aware of state law privacy developments.

– Texas’ biometric privacy law requires prior consent before a business captures a biometric identifier or sells biometric information, requires reasonable care in storing the information, and requires that the biometric identifier be destroyed within a reasonable time. Texas imposes a steep civil penalty of up to $25,000 for each violation.

– Washington’s law prohibits any company or individual from entering biometric data in a database for a commercial purpose without first providing notice, obtaining consent, or providing a mechanism to prevent the subsequent use of a biometric identifier for a commercial purpose.

– Florida became the first state to implement a ban on the collection of its students’ biometric data in 2014.  The Florida law prohibits schools and districts from collecting, obtaining, or retaining, any student biometric information.

In 2020, two new state privacy laws that cover biometric information came into force: the California Consumer Privacy Act (CCPA) and the New York Stop Hacks and Improve Electronic Data Security Act (NY SHIELD Act).

The CCPA went into effect on January 1, 2020. It requires covered businesses to provide notice of the personal information, including biometric identifiers, that they collect about California consumers, and it provides California consumers with a private right of action if personal information is involved in certain data breaches. Under the CCPA, businesses cannot sell the personal information of consumers under sixteen years of age without prior consent. For minors under thirteen, businesses must obtain consent from a parent or guardian; for minors who are at least thirteen but under sixteen, businesses can obtain consent from the minor. The CCPA also requires businesses to establish, document, and comply with a reasonable method for verifying that the individual authorizing the sale of a child’s data is actually that child’s parent or guardian. CCPA regulations issued by the California Attorney General, who is tasked with enforcing the CCPA, provide examples of permissible forms of verification (a short sketch of the age-tier consent rule follows this list), which include:

– Providing a consent form to be signed physically or electronically by the parent or guardian under penalty of perjury and returned to the business by postal mail, facsimile, or electronic scan;

– Requiring parents or guardians to use payment methods such as credit cards that provide notification of each transaction;

– Asking the parent or guardian to communicate in person with trained personnel, either through a toll-free line or videoconference; or

– Verifying the parent or guardian’s government-issued identification against the business’s database, and then promptly deleting the parent or guardian’s personal information from that database.
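As noted above, the CCPA’s age tiers for selling a minor’s personal information reduce to a simple decision rule. The sketch below is a hypothetical Python illustration; the function and enum names are our own, and nothing here is drawn from the CCPA’s text or the Attorney General’s regulations.

```python
from enum import Enum

class ConsentRequired(Enum):
    PARENT_OR_GUARDIAN = "opt-in consent from a parent or guardian"
    MINOR = "opt-in consent from the minor"
    GENERAL_RULES = "no minor-specific opt-in; general CCPA rules apply"

def consent_required_to_sell(age: int) -> ConsentRequired:
    """Hypothetical sketch of the CCPA's age tiers for selling a minor's data."""
    if age < 13:
        # Under thirteen: a parent or guardian must authorize the sale.
        return ConsentRequired.PARENT_OR_GUARDIAN
    if age < 16:
        # At least thirteen but under sixteen: the minor may authorize the sale.
        return ConsentRequired.MINOR
    # Sixteen and older: the minor-specific opt-in requirement does not apply.
    return ConsentRequired.GENERAL_RULES

for age in (12, 14, 17):
    print(age, "->", consent_required_to_sell(age).value)
```

In practice the age itself must come from a reliable source, which is why the verification methods listed above matter.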

The NY SHIELD Act went into effect on March 21, 2020, and mandates that any business that retains computerized private information of New York residents, including biometric information, develop a data security program to protect the security, confidentiality and integrity of that information. A business covered by the NY SHIELD Act is required to implement a data security program that includes reasonable administrative, physical and technical safeguards. A business that qualifies as a small business under the SHIELD Act is deemed compliant with the data security requirements if its security program has reasonable administrative, technical and physical safeguards appropriate for the size of the business, the nature and scope of the business’s activities, and the sensitivity of the personal information the business collects from consumers. Violations of the NY SHIELD Act are considered deceptive acts or practices, the New York Attorney General is charged with enforcing the statute, and covered businesses may be liable for a civil penalty of up to $5,000 per violation.

Best practices

Biometric privacy laws are constantly evolving and businesses should actively monitor federal and state law developments. While there isn’t a one-size-fits-all approach for biometric privacy compliance for businesses that serve or market to minor users, here are some best practices that businesses should consider:

– Provide notices about personal information collected from children under 13;

– If collecting information about children under the age of 16 in Europe or California, evaluate whether further requirements apply;

– Obtain verifiable parental consent before collecting, using, or disclosing children’s personal information;

– Provide parents with the ability to review and change their children’s information or allow them to delete that information;

– Maintain children’s personal information in a reasonably secure manner to protect its confidentiality, security and integrity;

– Regularly consult with your attorney to understand emerging developments in biometric and child privacy laws.

About the authors

Bess Hinson is Chair of the Cybersecurity & Privacy Practice at Morris, Manning & Martin, LLP, concentrating on cyber and data risk management and governance, breach preparedness and response, crisis management, and global data privacy compliance. Ashley Thomas and Bisi Adeyemo are Associates in the Cybersecurity & Privacy Practice. They can be reached at bhinson@mmmlaw.com, athomas@mmmlaw.com and badeyemo@mmmlaw.com.

DISCLAIMER: Biometric Update’s Industry Insights are submitted content. The views expressed in this post are those of the authors and do not necessarily reflect the views of Biometric Update.
