2025: Highlights, Milestones & Lessons
Taking a moment to capture some of the highlights / milestones / lessons from the first few months of this intrepid little project.
Reflecting on 2025, I’m proud to have finally launched this project. I’ve been thinking about it for ages, and in September of this year I finally hit “publish.”
I took my time this fall — experimenting with topics, testing different formats, gauging what resonates with parents, and finding a rhythm that fits into the chaos of daily life (mine and my readers’) while being both relevant and accessible. A steady once-a-month(ish) cadence gave me space to find my footing and build momentum thoughtfully. In 2026 I am picking up the pace. That means moving to more of a weekly cadence, but also expanding beyond the newsletter and podcast to create more opportunities for engagement both online and IRL. (So stay tuned for that.)
Below are some of the highlights from Home Screen’s three months so far, including some milestones and a few favorite quotes & clips.
🎉 Milestones
September 2025: Started posting / testing short-form videos on Instagram
September 26, 2025: Launched Home Screen!
October 10, 2025: First newsletter
October 29, 2025: Published first interview, featuring Deepti Doshi
💡 Lessons & ideas
We talk about phones and social media as if they are a package deal, when they are not. We can separate them. (But this takes *a lot* of time...)
The term “social media” is used to describe a lot of platforms that are very different from each other, from TikTok and Instagram, to Reddit and Discord, to streaming platforms. This complicates the discussion.
YouTube often flies under the radar, but is an entry point to all sorts of other media and social media platforms.
Generative AI is perfectly crafted for moments of embarrassment, which are a feature of early adolescence.
Parasocial relationships (when a person develops an emotional bond with a celebrity/fictional character) are not new, but chatbots are taking them to a whole new level — fast. Generative AI in toys is set to turbo-charge this.
Some kids are more susceptible to algorithmically-fed content. Researchers are trying to understand what’s going on here.
Consider changing the question from “when should I give my kid a phone?” to… “when should I allow my kid to be online unsupervised?” Because being online involves three things that make it risky for kids:
Strangers (read: creeps…)
User-generated content (and now AI-generated content), which means users can share *whatever* they want.
Algorithmically mediated experiences that are designed for addiction, because the more users “engage”, the more companies profit. And guess what gets a whole lot of engagement? Rage, porn, hate, and violence.
A couple things I want to try this year…
Introduce and encourage analog forms of the features kids like on phones: CD and tape players, iPods, cameras, landlines, walkie-talkies, etc…
More unsupervised playtime outside.
Organizing a group of kids to walk to the park on their own. (Maybe allowing them to borrow my smart watch.)
Deepti’s video game rule: kids can play only with a friend over. This way, it becomes an engaging social activity. As Deepti put it: “... That has been a way to teach him that these are tools. It’s like a tool to play with your friend. It’s not in and of itself the thing to do. It’s a thing to do with your friend, because you want to be.”
🏆 Favorite quotes / lines:
Phones
“… it’s not the PHONE that is the problem — it’s what’s on *any* internet-connected device, which is… EVERYTHING. […] So when we ask ‘when should I give my kid a phone?’ I think the question we should really be asking is: ‘when should I allow my kid to be online unsupervised?’” - Emily Tavoulareas, Newsletter 3: When Should I Give My Kid a Phone?
For kids a phone is freedom and privacy. But that is not what they are getting. They are getting algorithmically-mediated freedom, with the ILLUSION of privacy. They cannot see or feel the degree to which what they experience as freedom is actually controlled by someone else. They don’t realize that they are trading their parents for corporations that have the singular goal of using the information they collect to addict and manipulate them. - Emily Tavoulareas, Newsletter 3: When Should I Give My Kid a Phone?
Social Media
“... we should talk about whether we want to call YouTube a social media platform because [...] pretty much every single child in America uses YouTube. It is sending your kids lots of information and is prioritizing short video and so it is a gateway to other forms of short video and other social media platforms for your kids.” - Amanda Lenhart, Episode 2
“... when we say the word social media, we mean things that are so different, they should not actually be categorized together.” - Amanda Lenhart, Episode 2
“Safety features might reduce exposure, but they don’t eliminate it. We need to be crystal clear about this. You cannot expect companies to prevent your kids from seeing particular kinds of content. It does not work like movie ratings or network TV, largely because the companies face no real legal constraints.” - Emily Tavoulareas, Newsletter 3: When Should I Give My Kid a Phone?
“… we had a rule that you could do Minecraft, but you had to do it with a friend over. So that it wasn’t an activity you did on your own, you had to do it on one screen with a friend over. So that means you had to be talking with your friend and negotiating with your friend about the world you wanted to build and where you wanted to kind of use your money and what you wanted to use your fire for. That has been a way to teach him that these are tools. It’s like a tool to play with your friend. It’s not in and of itself the thing to do. It’s a thing to do with your friend, because you want to be.” - Deepti Doshi, Episode 3
“... what it’s doing is distorting reality for kids at a time that they are learning how to navigate the world, how to connect with each other, how to create relationships. And part of what’s been bugging me for a while, including things like even Snapchat, which everyone’s like ‘oh, Snapchat’s great.’ Like, fine, yeah, they have all these security features, but they also literally stack rank your best friends and tell kids who your best friend is based on what? Based on Snapchat’s definition of what friendship is? But that intermediation, I think, has been really grossly overlooked because, probably because it’s so intangible.” - Emily Tavoulareas, Episode 2: Interview with Amanda Lenhart
AI
“Products that we are already using are not only allowing, but *actively enabling* young children to trade real relationships for an illusion — or perhaps more aptly, for a delusion. They are not necessarily stand-alone products. WhatsApp, Facebook, Instagram, Snapchat, TikTok, CapCut, Discord (to name a few) all have chatbots.” - Emily Tavoulareas, Newsletter 1: Weapons of Mass Delusion
“... we are entering an era that I think is about to get really, really complicated around social and emotional relationships with non-human entities, right? We are less uncomfortable with you having a really, really strong relationship with your stuffed animal when you’re six. Cause again, we create these effective relationships with things we care about. Kids get attached to objects. But it’s a different kettle of fish when that thing responds back to you.” - Amanda Lenhart, Episode 2
“... [Generative AI] is perfectly designed for that moment of deep embarrassment and importance of peers and the sense that you’re being watched by everyone. That’s the hallmark of early adolescence in particular…” - Amanda Lenhart, Episode 2
“The existence of products like this is not inevitable — it is a choice. A choice by companies, a choice by investors, a choice by consumers, and a choice by governments. It is also a choice for them to be integrated into products that we *already use,* and it is a choice for them to be targeted at children.” - Emily Tavoulareas, Newsletter 1: Weapons of Mass Delusion
Incentives / profit
“... the incentives for these companies are to make enormous sums of money very, very quickly and pay back their investors and their shareholders with giant payouts. And if you can’t show that you’re increasing your stickiness and you’re increasing the number of users and how long they spend on your site. Anything that you’re doing that drops any of those numbers down is like a non-starter. Even if you’re like, by the way, kids are dying... It doesn’t matter. Like it doesn’t matter… or it does matter, but it’s a PR problem, not a moral problem.” - Amanda Lenhart, Episode 2
“... when honestly you teach people that money is the only thing and that humans are in the way of you making money… they are only a thing to be gotten money from as opposed to a thing to be cared for, which you have a fiduciary responsibility in other non-monetary ways. Then I think that’s just an obstacle. That’s just a PR problem you need to mitigate.” - Amanda Lenhart, Episode 2
“… the problem is not one feature, or one product, or one company. The problem is an entire industry that has been incentivized (and even gleefully encouraged) to optimize their products for engagement, which is just another word for ADDICTION. No amount of “safety features” or “parental controls” can change this — they are a ruse to distract from the larger, more complex and entrenched problem, which is business models that are fueled by addiction.” - Emily Tavoulareas, Newsletter 1: Weapons of Mass Delusion
Independence & Community
“It’s not their fault that they don’t have the same kind of degrees of freedom that maybe we had growing up […] they become the data points of our loneliness epidemic because we’re not creating the context for them to build the relationships in the community that would be really healthy for them.” - Deepti Doshi, Episode 3
“We are now like mainly dual income families, you know, where both parents are working. And in the absence of that kind of supervision, I think we need to then be reflective of like, we’re outsourcing that supervision sometimes, maybe it’s to a babysitter or summer camp in some cases, but some cases we’re outsourcing that supervision to the device. For me, I’d rather outsource that supervision just to the neighborhood.” - Deepti Doshi, Episode 3
“I think it’s important that we give our kids the credit and the confidence, like they can understand the system. I think we need to talk to our children about the system. I was like, look… this is why I’m scared of ChatGPT. This is why I ask you to do it in front of me. It’s not because I don’t trust you, it’s because you’re talking to a machine that is not in your control.” - Deepti Doshi, Episode 3
“In the U.S., the role of community is often described as a “support system,” the implication being that what it provides the parent is tangible help. But of course a huge part of the value of community is intangible. It’s emotional support, it’s feeling seen, it’s belonging and connection, it’s running into familiar faces and having a quick 5 minute conversation at the store, or in the parking lot, or at your mailbox. It’s hard to put your finger on because what makes community so central to our species is the intangible inefficiencies / complexities / depth of human interactions and relationships. We continue saying parents need “support systems” when in fact what they need is connection.” - Emily Tavoulareas, Newsletter 2: Big Tech Didn’t Rewire Childhood — We Did.
“In the rush to optimize childhood, we’ve isolated ourselves. We’ve replaced spontaneous neighborhood gatherings with scheduled playdates, and place-based groups with online parenting forums. In much of America, our social lives are no longer in physical proximity to our homes… our kids activities are not at the local park or rec center. No single entity did that to us — we slow-boiled into it.” - Emily Tavoulareas, Newsletter 2: Big Tech Didn’t Rewire Childhood — We Did.
📌 ICYMI
Publications & posts from this year, all in one spot in case you missed them:
Pilot Episode: Interview with Deepti Doshi. Deepti talks about fostering independence and community, and how to help our kids build muscles for navigating the current realities of digital technology... and whatever comes next.
Episode 2: Interview with Amanda Lenhart. Amanda shares her unique perspective as both a researcher who has studied kids & tech since the ‘90s *AND* as a mom who has parented 4 kids through completely different eras of the internet.
And a few of the most popular Instagram posts:
How I use the “claw” machine as an analogy to introduce kids to the idea of products that are designed to be addictive…
In the aftermath of Adam Raine’s suicide, I attempted to explain how chatbots go from helping with homework to coaching kids to suicide. Bottom line: imagine your kid goes to a math tutor for help, and during their sessions your kid shares a few personal details about their life that reveal vulnerability… and then the math tutor exploits that information to win the affection and trust of your kid. That trust allows the tutor to then coach them into doing any number of things. That’s basically what chatbots are doing. They are optimized for engagement, so they end up “behaving” like a psychopathic technology that is designed to manipulate people’s attention and affection.
Two clips from my interview with Deepti Doshi. Here she describes a rule she applies to video games, in order to make gaming a more dynamic social experience.
And here she describes how platforms that parents use (like NextDoor) affect our perception of safety, and then lead to kids having less free and independent playtime — especially outdoors.
My flag that while people are focused on restricting the use of generative AI for mental health advice, kids are not necessarily seeking that advice intentionally… they could be on any chatbot, and end up on a delicate mental health topic unintentionally.
A few clips from my interview with Amanda Lenhart.
Here she explains why companies don’t really care about the impact their products are having on kids. As she says… “it’s a PR problem, not a moral problem.”
Here she explains how generative AI is “perfectly” designed for the awkwardness of early adolescence… and I pile on with how people are treating it like search, when it’s very much… not search.
Here she describes why AI in toys is totally new (and concerning) terrain.
So that's a wrap for 2025. In 2026 things are ramping up: more frequent posts, incredible guests, and opportunities to engage — not just consume. Because if the last few years have taught us anything, it's that everything is harder in isolation.
See you in 2026!