This Week in Tech // April 6
Stuff happening in tech that is relevant to your kids, classrooms, and lives.
Second edition of the weekly roundup! Here’s what caught my eye this week:
“Chromebook remorse” is spreading. (Thanks to Baratunde Thurston for flagging this story.) Even as pressure grows to integrate generative AI into classrooms, schools all over the country are dialing back the use of “edtech.” A middle school in Kansas decided to stop using Chromebooks, and schools in North Carolina, Virginia, Maryland, and Michigan are actively re-evaluating classroom use of screens. As I said in Tech Policy Press a few years ago: “past is often prologue — at least in terms of how ‘disruptive’ technology will play out in real-world settings. If we want a preview of what’s coming, all we need to do is look at the recent past. After several cycles of ‘disruptive’ technology, I think we can safely say that the potential of new technology often breaks down in its implementation.”
Speaking of schools… a large multidistrict case against social media companies goes to a jury trial this June. The districts argue that social media companies have undermined education and created a crisis, forcing schools to divert resources from their educational mission to deal with the fallout from addictive products. Claims include negligence, failure to warn about social media’s addictive features, and public nuisance. I’m surprised there isn’t more coverage of this, especially given the implications of the recent social media trials in LA and New Mexico.
AI slop has flooded YouTube Kids (and the internet more broadly), putting bizarre and even disturbing short-form videos in front of kids. You’ve probably seen it before… the weird trance-like animation that has zero plot and just lurches from scene to scene. (My son has discovered whatever the f*ck this genre of video is…) There are tons of these accounts, and to put the volume into perspective: one of the most popular was pumping out 50 videos PER DAY. Top AI slop channels targeting children have earned over $4.25 million, so YouTube has very little incentive to intervene. P.S.: roughly 21–33% of videos on regular ol’ YouTube may be AI slop.
Microsoft Copilot is apparently “for entertainment purposes only,” according to… Microsoft’s own Terms of Service. The TOS goes on to say: “It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.” Well, thanks, Microsoft. Noted. (P.S.: This applies to all chatbots; Microsoft was just caught in a moment of honesty that it is apparently now correcting. I’m sorry… “updating.”)
The Take It Down Act has a key deadline hitting next month. (This is a May 2025 law making it illegal to publish sexually explicit images of a person — including AI-generated deepfakes — without their consent.) Platforms have until May 19, 2026 to have removal processes in place for nonconsensual intimate imagery, including deepfakes. Because the law works through takedown requests, the burden of documenting and reporting falls on victims or the organizations advocating for them. It is not a stretch to say this will create a massive documentation burden that hits schools hard. (Which, call me crazy, seems to validate the claims in the multidistrict case I mention above.)
California’s community colleges are paying millions for AI chatbots that students say give wrong answers, can’t correctly name their own college’s president, and leave students so frustrated that they turn to unofficial social media channels instead. Worth bearing in mind as schools are pressured to adopt and integrate these products at massive expense to taxpayers, teachers, and students. This underlines the issues raised in a recent open letter to Georgetown students. It also reminds me of a post I wrote about chatbots a few years ago, and a gem from Cyd Harrell in 2021.
Sources & mentions this week: The Markup, Baratunde Thurston, The 74, Fortune, NYTimes, Gadget Review, Cyd Harrell, Tech Policy Press, Courthouse News, EdTech Week, Georgetown Center on Privacy & Technology.