Google headquarters

SAN FRANCISCO - A Utah family is trying to launch a class action complaint accusing Google of failing to properly protect students from using school-issued Chromebooks to access online porn and other inappropriate websites.

Identified only as John and Jane Roe, the parents of a minor referred to as M.C. retained George Feldman McDonald, of Bloomington, Minn., and Edtech Law Center, of Austin, Texas, for the Oct. 17 complaint they filed in federal court in San Francisco.

The seven-count complaint accuses the tech giant of design defect and failure to warn, under both strict liability and negligence theories, as well as a federal civil rights violation and a claim under California’s unfair competition law.

The family claims M.C., at age 11, accessed pornography through a school-issued Chromebook and developed an unhealthy, severely harmful addiction. In addition to a jury trial, the family seeks “actual, compensatory, general, special, incidental, consequential, punitive and future pain and suffering damages” along with court orders and compensation for their legal fees and court costs.

M.C. got a Chromebook, according to the complaint, as a sixth-grader in March 2020 to facilitate remote learning. The family says M.C. was searching for information about Pokémon characters when “with each click, Google’s search algorithms pushed M.C. toward increasingly sexual content, including ‘pornographic Japanime,’ which produced sexually explicit images, and eventually to content depicting real people having sex.”

The family said the school claimed its third-party filter couldn’t prevent access to pornographic content and, despite attempts to limit access at school and monitor usage at home, M.C. “continued to access pornography surreptitiously at school for years. The school continued to advise M.C.’s parents that they were unable to prevent such access. … Using Google’s Products, M.C. engaged in other related dangerous online activities, such as sending money to strangers in exchange for sexually explicit photos and sharing personal information about himself with strangers, including his home address, thereby endangering him and his family.”

Even after the school placed a “high restriction” designation on M.C.’s account, access to pornography was possible. The parents ultimately placed M.C. in a low-technology charter school.

According to the complaint, “Google’s products are dangerous by design” because “product safety is bad for Google’s bottom line” in that its alleged chief goal is to collect and monetize user data. The family accuses Google of a plan “to colonize K-12 education in the United States and around the world” starting in 2012 after the Chromebook laptop computer “flopped in the adult consumer market when it debuted in 2011.”

The family says their child’s public school required students to use Chromebooks despite a default setting that grants “virtually unrestricted access to the internet, where Google knows students are likely to be exposed to harmful content, such as pornography and violence, and harmful communications, such as cyberbullying and sexual predation. Further, Google’s search algorithms are designed to promote maximally ‘engaging’ content, which frequently is content that is dangerous, especially for K–12 students.”

The complaint alleges schools can pay to access “tools that, if used, can make Google’s products somewhat less dangerous.” It further said that while administrator consoles have configurable safety settings, “they are inadequate and overwhelming, numbering over a thousand, they are ever-changing and difficult to navigate.”

In addition to suggesting Google could “redesign its search technology to not promote — and even suppress — content that is objectively harmful to children,” the family argues the company could sell school Chromebooks with certain safety features enabled by default and allow administrators to roll back restrictions. It also called on Google to warn administrators, teachers and parents about the dangers of access to the open internet.

“Google knows more about the intersection of human behavior and computing than any other company on the planet,” according to the complaint. “Google also knows more about the dangers lurking on the internet, especially for children. Valued at roughly $2 trillion, Google could make products that arrive safe out of the box for students of every age. Instead, it makes products that are dangerous for all students because of the internet-first design that Google’s data-monetization business model requires.”

Ultimately, the family argued the law puts the onus on companies like Google to ensure children can safely use products, but said Google’s business model improperly shifts that burden to schools, parents and even the children themselves. The complaint quoted from Gemini, Google’s own artificial intelligence model, regarding the safety of unrestricted internet access and said the company failed to account for risks it plainly understands while specifically and directly marketing its products as safe to the education market.

The complaint alleges Google’s conduct “is so entwined” with school district policies and functions — “namely, collection and maintenance of student information and provision of administrative and pedagogical tools and services” — that it can be sued as an agent of the state for an alleged violation of 14th Amendment rights “of parents to make decisions concerning the care, custody, and control of their children” and “to direct the upbringing and education of children under their control.”

Google did not respond to a request for comment.
