Amended  IN  Assembly  May 04, 2022
Amended  IN  Assembly  March 24, 2022

CALIFORNIA LEGISLATURE— 2021–2022 REGULAR SESSION

Assembly Bill
No. 2408


Introduced by Assembly Members Cunningham and Wicks

February 17, 2022


An act to add Section 1714.48 to the Civil Code, relating to social media platforms.


LEGISLATIVE COUNSEL'S DIGEST


AB 2408, as amended, Cunningham. Child users: addiction.
Existing law, the California Consumer Privacy Act of 2018, prohibits a business from selling the personal information of a consumer if the business has actual knowledge that the consumer is less than 16 years of age, unless the consumer, in the case of a consumer at least 13 years of age and less than 16 years of age, or the consumer’s parent or guardian, in the case of a consumer who is less than 13 years of age, has affirmatively authorized the sale of the consumer’s personal information.
This bill, the Social Media Platform Duty to Children Act, would impose on an operator of a social media platform a duty not to addict, as defined, child users and would, among other things, prohibit a social media platform from addicting a child user by either of certain means, including the use or sale of a child user’s personal data. The act would authorize a person authorized to assert the legal rights of a child user who suffers injury as a result of a violation of the act to bring an action against a violator to recover or obtain certain relief, including a civil penalty of up to $25,000 per violation.
Vote: MAJORITY   Appropriation: NO   Fiscal Committee: NO   Local Program: NO  

The people of the State of California do enact as follows:


SECTION 1. This act shall be known as the Social Media Platform Duty to Children Act.

SEC. 2. The Legislature finds and declares all of the following:
(a) California should take reasonable, proportional, and effective steps to ensure that its children are not harmed by addictions of any kind.
(b) A broad diversity of psychologists and psychiatrists in the field of addiction, as well as scientists, doctors, and other researchers, acknowledge the existence of social media addiction.
(1) Research using the Bergen Social Media Addiction Scale, a widely used measure of social media platform addiction, has found that social media platform addiction has a prevalence across the general population of about 5 percent.
(2) In people who become addicted, the brain’s reward system is more active when using social media than it is in the brains of people who are not addicted. The result, according to health experts and researchers, is compulsive and excessive social media use.
(c) There is growing evidence that social media platform addiction is a problem, particularly among adolescent children.
(1) The world’s largest social media platform company’s own secret internal research validates both the existence of social media addiction in children and the fact that social media addiction hurts children. As an example, in September 2021, The Wall Street Journal published a series of articles referred to as “The Facebook Files.” Those articles, citing a trove of internal documents obtained from Frances Haugen, a whistleblower, demonstrated the extent to which Facebook knew that its platforms cause significant harm to users, especially children.
(2) More specifically, as revealed by Haugen’s sworn testimony before Congress and the accompanying secret research she revealed to The Wall Street Journal, “Facebook has studied a pattern that they call problematic use, what we might more commonly call addiction. It has a very high bar for what it believes [problematic use] is. It [means] you self-identify that you don’t have control over your usage and that it is materially harming your health, your schoolwork or your physical health.” … “Facebook’s internal research is aware that there are a variety of problems facing children on Instagram, they know that severe harm is happening to children.”
(3) During whistleblower Haugen’s sworn testimony to Congress, she revealed that, when it comes to meeting the platform’s addiction-like definition of “problematic use”: “Five to six percent of 14 year olds have the self-awareness to admit both those questions” that qualify a child as having problematic use.
(4) Five to six percent of Instagram’s child users amounts to millions of children, many thousands of whom certainly reside in California.
(d) Social media platform addiction is more acute in girls than boys.
(1) Girls experience a higher prevalence of social media addiction than boys.
(2) Girls who admit to excessive social media platform use are two to three times more likely to report being depressed than girls who use social media platforms lightly.
(3) A March 2020 presentation posted by Facebook researchers to Facebook’s internal message board reported that “66% of teen girls on IG experience negative social comparison (compared to 40% of teen boys)” and that “[a]spects of Instagram exacerbate each other to create a perfect storm.”
(e) The business models of some social media platform companies financially motivate them to deploy design features that increase the likelihood of addiction among all users, including children.
(1) Instead of charging to sign up, social media platforms earn “substantially all” of their revenue through advertising.
(2) The more time users engage with the platform, the more ads users see, and the more valuable the advertising becomes.
(3) In this regard, addicted consumers are particularly profitable because their consumption behavior goes beyond normal engagement levels.
(4) User engagement does not distinguish between engagement that increases because it is enjoyable and enhances health and well-being and engagement that increases because of addiction. In fact, many users spend even more time on social media when engaging with content that makes them subjectively unhappy or objectively less healthy.
(5) For these profit-driven reasons, social media platform companies intentionally invent, design, and deploy features that are intended to make it hard for users to stop using the platform, including deploying techniques used in gambling and techniques that mask or avoid cues that might prompt a user to stop using.
(f) Companies that market high-volume addictive products, including tobacco, have a special incentive to addict young, potentially lifelong, consumers.
(g) Adolescent children are at far greater risk than adults of becoming addicted to social media platforms.
(1) Adolescent children exhibit higher levels of stress and an increased proclivity toward taking risks.
(2) During adolescence, children’s reward systems develop much faster, while their self-control systems, which are not fully developed until 21 years of age, lag behind. For this reason, rates of behavioral addictions are elevated in adolescence as compared to adulthood.
(3) Social media platform companies can use the data they collect on children to determine which children are most likely to be vulnerable to a given ad, thereby exacerbating the risks of addiction.
(4) As compared to adults, children are more susceptible to the pressures and influence of advertisements, less likely to recognize paid-for content, and less likely to understand how data is used for these purposes.
(h) Because their brains are still developing, children are at far greater risk of being harmed by social media platform addiction than adults. Addiction adversely influences the development of judgment, attention, and memory in the brain.
(1) Higher daily rates of checking social media platforms have been linked to a reduction in the volume of brain tissue that controls memory, emotions, speech, decisionmaking, and self-control.
(2) Reduction in this kind of brain tissue is in turn correlated with higher impulsivity, to which children and adolescents are already susceptible by dint of their youth.
(3) Several studies have found links between spending time on social media platforms and rates of suicide and depression among teens.
(4) Numerous studies show that reducing social media platform use results in mental health benefits.
(5) Social media platform addiction can create a vicious cycle for shy and lonely youth. Discomfort with real-life interactions leads to internet interactions; isolation from real-world interaction causes loneliness; and loneliness combined with social phobia motivates additional engagement online.
(i) When social media platform companies create, design, implement, or maintain features for users, including child users, on their social media platforms that the company knows or should know are addictive to children, they should be held liable for the harms that result.
(j) Other addictions, including gambling addictions, have had a demonstrable negative effect on state economies.
(k) California has a compelling interest in protecting the mental health of its children from social media platform addiction for, at a minimum, all of the following reasons:
(1) To prevent needless suffering to California children and their families.
(2) To ensure the capacity of all its children to fulfill their potential and to reach normal goals for social and educational achievement to the benefit of all Californians.
(3) To prevent the costs of treating mental health harms to children from being incurred by and shifted to California families, businesses, insurers, schools, and mental health professionals.

SEC. 3. Section 1714.48 is added to the Civil Code, to read:

1714.48. (a) For purposes of this section:
(1) “Addict” means to knowingly or negligently cause or contribute to addiction through any act or omission or any combination of acts or omissions.
(2) “Addiction” means use of one or more social media platforms that does both of the following:
(A) Indicates preoccupation or obsession with, or withdrawal from or difficulty in ceasing or reducing use of, a social media platform despite the user’s desire to cease or reduce that use.
(B) Causes or contributes to physical, mental, emotional, developmental, or material harms to the user.
(3) “Child user” means a person who uses a social media platform and is younger than 18 years of age.
(4) “Personal data” means information that identifies a natural person or is linked or linkable to an identifiable natural person.
(5) (A) “Social media platform” means an internet service that meets both of the following criteria:
(i) (I) The internet service is a means by which content is generated by a user of the service, or uploaded to or shared on the service by a user of the service, that may be encountered by another user, or other users, of the service.
(II) For purposes of this subparagraph:
(ia) “Content” means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, or visual images.
(ib) “Content that may be encountered by another user, or other users, of a service” includes content that is capable of being shared with a user by operation of a functionality of the service that allows the sharing of content.
(ic) “Encounter” means to read, view, hear, or otherwise experience content.
(ii) The internet service is controlled by a business entity that generated at least one hundred million dollars ($100,000,000) in gross revenue during the preceding calendar year.
(B) “Social media platform” does not include any of the following:
(i) An email service, if emails are the only user-generated content enabled by the service.
(ii) An SMS and MMS service, if SMS or MMS messages are the only user-generated content enabled by the service.
(iii) A service offering only one-to-one live aural communications.
(iv) An internal business service that is an internal resource or tool for a business or nonprofit organization in which the service is not available to children in the general public.
(v) A service, including a comment section on a digital news internet website or a consumer review of a product or service on an online commerce internet website, with functionalities that allow users to communicate only in any of the following ways:
(I) Posting comments or reviews relating to content produced and published by the provider of the service or by a person acting on behalf of the provider of the service.
(II) Sharing comments or reviews described in subclause (I) on a different internet service.
(III) Expressing a view on comments or reviews described in subclause (I), or on content mentioned in subparagraph (A), by means of any of the following:
(ia) Applying a “like” or “dislike” button or other button of that nature.
(ib) Applying an emoji or symbol of any kind.
(ic) Engaging in yes or no voting.
(id) Rating or scoring the content, or the comments or reviews, in any way.
(vi) An internet-based subscription streaming service offered to consumers for the exclusive purpose of transmitting licensed media, including audio or video files, in a continuous flow from the internet-based service to the end user.
(vii) A service that operates for the sole purpose of cloud storage or shared document or file collaboration.
(viii) A service that operates for the sole purpose of providing general or tailored internet search services.
(b) In accordance with Section 1714, an operator of a social media platform has a duty not to addict child users. An operator of a social media platform shall be found to have violated their duty if the social media platform is found to have addicted a child user by either of the following means:
(1) The use or sale of a child user’s personal data.

(2) The development, design, implementation, or maintenance of a design, feature, or affordance.
(c) (1) A person authorized to assert the legal rights of a child user who suffers injury as a result of a violation of this section may bring an action against a violator to recover or obtain any of the following relief:
(A) (i) Actual damages.
(ii) In a class action, the amount of damages awarded pursuant to this subparagraph shall not be less than one thousand dollars ($1,000) per member of the class.
(B) A civil penalty of up to twenty-five thousand dollars ($25,000) per violation.
(C) Injunctive relief.
(D) Punitive damages.
(E) An award of litigation costs and no more than twice the amount of reasonable attorney’s fees to a prevailing plaintiff.
(F) Any other relief that the court deems proper.

(2) (A) A knowing or willful violation of this section shall subject the violator to an additional civil penalty not to exceed two hundred fifty thousand dollars ($250,000) per violation.
(B) A civil penalty pursuant to this paragraph shall not be treated as an offset against an award of damages caused by the same knowing or willful violation in an action pursuant to this subdivision.

(3) (A) A social media platform that, before January 1, 2023, developed, designed, implemented, or maintained features that were known, or should have been known, by the platform to be addictive to child users shall be liable for all damages to child users that are, in whole or in part, caused by the platform’s features, including, but not limited to, suicide, mental illness, eating disorders, emotional distress, and costs for medical care, including care provided by licensed mental health professionals.
(B) A social media platform shall not be held liable for a violation under this paragraph if, by April 1, 2023, the platform ceases development, design, implementation, or maintenance of features that were known, or should have been known, by the platform to be addictive to child users.
(d) An operator of a social media platform shall not be subject to a civil penalty pursuant to subdivision (c) if, before engaging in a practice that led to that violation, the operator did both of the following:
(1) Instituted and maintained a program of at least quarterly audits of its practices, designs, features, and affordances to detect practices or features that have the potential to cause or contribute to the addiction of child users.
(2) Corrected, within 30 days of the completion of an audit described in paragraph (1), any practice, design, feature, or affordance discovered by the audit to present more than a de minimis risk of violating this section.
(e) The provisions of this section are cumulative to any other duties or obligations imposed under other law.
(f) This section shall not be construed to impose liability for a social media platform for content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, that may be encountered by another user, or other users, of the service.
(g) This section shall not be construed to negate or limit a cause of action that may have existed against an operator of a social media platform under the law as it existed before the effective date of this section.
(h) The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.
(i) A waiver of this section is void and unenforceable as against public policy.