
Welcome to the latest edition of FPF’s Youth & Education Privacy newsletter. I’m Jamie Gorosh, policy counsel on the Youth & Education team at FPF. My work lately has been focused on the regulation and use of edtech products, student monitoring, and tracking student privacy legislation at the state level.

Next, you’ll be hearing from Bailey Sanchez! There is a lot happening in student and children’s privacy, and by rotating FPF experts, we will spotlight different perspectives on important issues in these areas.

Among other recent developments in child and student privacy, this newsletter highlights the FTC's continued focus on children's privacy, the momentum behind age-appropriate design codes, cybersecurity concerns in K-12 schools, renewed academic integrity debates, and new online safety resources for educators.

As we continue to refine the content and format of this newsletter, we want to hear from you - what’s on your mind, and how can we help? Reach out to us anytime by replying to this email.

A Focus on the FTC

As we approach ten years since the last COPPA Rule update (2013) and three years since the FTC first announced plans to undertake a new one (2019), a group of lawmakers led by Sen. Markey is encouraging the Commission to move forward with an update.

It remains unclear if or when that will happen, as Commissioner Bedoya recently stated that the Commission favors legislative action before pursuing regulatory updates. Another potential reason for the delay? “The FTC (and its roughly 50-person privacy division) may simply have its hands full with all of the tasks it has undertaken in privacy.” In addition to its Advance Notice of Proposed Rulemaking, the FTC hosted a day-long virtual workshop on October 19, “Protecting Kids from Stealth Advertising in Digital Media.” Two good recaps of that workshop are linked here and here.

The FTC has also been stepping up its enforcement efforts: last month, it announced a complaint and consent agreement against Chegg “for its lax data security practices,” noting that the personal data of millions of users had been exposed in multiple security breaches in recent years. This enforcement action, the Commission’s second related to data security in a single month at the time, follows its adoption of a COPPA policy statement last spring that effectively put edtech companies on notice about stepped-up enforcement. Although the Chegg action did not rely on COPPA as a basis for enforcement, we are watching closely to see what happens next.

Age-Appropriate Design Code

Following California’s adoption of an age-appropriate design code bill and the early success of the U.K.’s version (“the big names in tech made positive changes,” the former UK Information Commissioner recently wrote), attention has turned to what comes next. My colleague Chloe Altieri talked to Stateline about how the California law could become a national standard, and we’ve seen states indicate interest in adopting their own versions (see New York and Connecticut).

Chloe and our colleague Bailey Sanchez have been all over this topic recently. They published a comprehensive analysis of the key components of the California law and critical pending questions, and, hot off the press today, they have released a follow-up report comparing the California and UK codes. Bailey also spoke at the International Association of Privacy Professionals (IAPP) Privacy. Security. Risk. (PSR) conference, where the issue was “top of mind.” 

New research highlights the complexity and urgency of designing age-appropriate experiences for kids online. One-third of kids ages 8 to 17 who are on social media have signed up falsely claiming to be an adult, in no small part because “age assurance has been a key challenge for online platforms and services for many years.” Age assurance is difficult to do without collecting personal data from children and families in return, raising the longstanding question of how to balance safety and privacy. A new report from the Family Online Safety Institute (FOSI) dives deeper into attitudes around online safety, monitoring, and various methods of age assurance among parents and children in the US, UK, and France.

And a new coalition aims to help advance the broader conversation about potential social media reforms. The Council For Responsible Social Media, a new project of Issue One, aims to “change the national conversation around social media reform so it is focused on meaningful, achievable and bipartisan solutions.” The group of influential leaders was formed to “bridge divides in addressing the negative mental, civic, and public health impacts of social media in America.”


“Honey pots of highly sensitive information”

Cybersecurity continues to be a major concern - and priority - for schools and districts. The Director of the Cybersecurity and Infrastructure Security Agency identified K-12 schools as one of three “target-rich, resource-poor entities” that it plans to focus on. A recent Government Accountability Office (GAO) report called for additional federal coordination - and funding - to boost cybersecurity in K-12 schools.

While a ransomware attack like the one that recently hit Los Angeles Unified School District is “the single greatest cyber threat” facing K-12 schools, sometimes data breaches are more inadvertent, such as a failure to redact personal information from emails. Regardless of how a breach happens, “the repercussions of an attack on vulnerable school systems can be strong, long-lasting and expensive.”

EdSurge took a close look at the situation and potential solutions, including a few trends captured in Data Quality Campaign’s annual review of state data-related legislation. One concern: the DQC report found that 120 of the 131 bills introduced required either new data collection or updates to current practices, and some worry that the more data schools collect, the more they become “honey pots of highly sensitive information.” This is especially true for larger school districts, which “manage more money, have more users, and manage far more devices and services than smaller districts—all of which increases their vulnerability to cyberattacks.”


Academic Integrity Debates Return

As the end of the semester approaches, many students are preparing for final exams. And while the use of online proctoring has fallen from the peak of the pandemic as schools and even some proctoring companies adjust their practices in response to a variety of privacy and ethical concerns, the market is seemingly here to stay. My colleague Lauren Merk recently published an in-depth analysis of a legal ruling that found that a public university’s use of room-scanning technology during a remotely proctored exam violated a student’s Fourth Amendment right to privacy. While she notes that Fourth Amendment cases are especially fact-dependent, “Entities that employ proctoring software should be mindful of the Court’s reasoning and consider potential legal risks and privacy implications before employing proctoring technologies or requiring room scans within the home.”

The news that students are using AI to write their essays - avoiding plagiarism detection software and, in some cases, getting straight A’s - has the academic world talking (and tweeting!). While some argue this is “nothing to worry about,” others are concerned that new “worryingly good” tools may “kill college writing” and make it “easier than ever for students to cheat.”


Online Harassment and Safety Tips

Online harassment targeting teachers is on the rise; 59% of teachers, 58% of administrators, 48% of support staff, and 38% of school psychologists in the US reported experiencing a form of harassment between 2020 and 2021. Asian American students also reported an alarming rise in cyberbullying during the pandemic, and student journalists are increasingly becoming targets.

A new FPF resource outlines practical steps that educators can take to improve their online safety and lower their risk, as well as what to do if they are being harassed. It is good advice for anyone to follow - things like enabling two-factor authentication, turning off location data tracking, and requesting that your personal information be removed.

And I’d be remiss not to flag a less traditional approach to educating folks about their online risks: a TikToker who is ‘consensually doxxing’ people in her comments section to teach them about privacy, tracking down the names and birthdates of even those users with private accounts.

47 percent: Nearly one-half of Roblox’s more than 50 million active daily users are under the age of 13 and will no longer see ads, sponsored experiences/content, or user ads moving forward, the company announced in late October.
4.9 million: Researchers found 4.9 million public Facebook posts that included photos of students, and 726,000 posts that contained both a photo and student names, raising critical questions about whether school Facebook posts violate student privacy.
32 percent: There was no clear preferred form of age assurance among either parents or children according to new research by the Family Online Safety Institute (FOSI), with the most-preferred option - parental verification via text or app - only receiving support from 32% of parents.
$76,500: The amount of money that the University of Texas, Dallas spent on social media monitoring software over the past seven years, according to a report in The College Fix.
7 out of 10: Of the 42 daycare apps that EFF researched, only 10 had privacy policies that stated they did not share data with third parties, and 7 of those 10 were sharing data anyway. EFF has written to the FTC urging it to review privacy and security protections in daycare and early education apps.
30 million: Online proctoring company Proctorio, which recently announced it achieved an industry-leading level of security verification, worked with more than 4,000 education and corporate institutions to proctor more than 30 million exams in 2021 alone.
GET TO KNOW DAVID SALLAY
We first introduced David Sallay, FPF’s new director of youth and education privacy, in our newsletter last month, and now you can get to know him even better in this Q&A, where he talks about what he’s reading for work (and fun), the issues he’s worried about (and optimistic about), and more.
CONFERENCE PANELS
My colleague Bailey Sanchez spoke at the International Association of Privacy Professionals (IAPP)’s annual Privacy. Security. Risk. Conference, our colleague Lauren Merk represented the Youth & Education team on a panel at the Family Online Safety Institute (FOSI)’s annual conference, and our colleagues Jim Siegl and Miles Light each spoke on panels at the Privacy + Security Forum Fall Academy.
NEW Y&E TEAM RESOURCES
I’ve highlighted these throughout the newsletter, but would be remiss not to give one more shout-out to Bailey and Chloe’s work on California’s Age-Appropriate Design Code (here and here), Lauren’s analysis of the latest legal developments in online proctoring, Chloe and our former intern Nick’s online safety tips for educators, and the recap Lauren and I wrote of the FTC’s action against Chegg.
BOOTCAMPS ARE BACK
The Youth & Education Team recently held our first in a series of student privacy bootcamps, designed for ed tech vendors. We plan to hold several more this winter for additional audiences and would love to hear from you: what do you want to learn about? Please fill out this survey to share your thoughts and be on the lookout for future invitations. We hope to see you there!

The rollout of a 2021 Texas law that requires schools to send DNA kits to parents as part of an effort to help locate missing and trafficked children is leading to apprehension and “anger and distress” among some parents and families who are reminded of recent school shootings.
