
May 17, 2025
On a recent commute to work, I texted my distant family about our fantasy baseball league, which was nice because I felt connected to them for a second. Then I switched apps and became enraged by a stupid opinion I saw on X, which I shouldn’t be using anymore due to its advanced toxicity and mind-numbing inanity. Many minutes passed before I was able to stop reading the stupid replies to the stupid original post and relax the muscles of my face.
This is the duality of the phone: It connects me to my loved ones, and sometimes I think it’s ruining my life. I need it and I want it, but sometimes I hate it and I fear it. Many people have to navigate this problem—and it may be at its worst for parents, who’ve recently been drowned in media suggesting that smartphones and social media might be harming their children’s mental health, but who also want their kids to enjoy technology’s benefits and prepare themselves for adult life in a digital age.
It was with this tension in mind that I rode a train last week to the town of Westport, Connecticut. There, a parent-led group called OK to Delay had organized an “Alternative Device Fair” for families who wanted to learn about different kinds of phones that were intentionally limited in their functionality. (There would be no frowning at X with these devices, because most of them block social media.) Similar bazaars have been popping up here and there over the past year, often in the more affluent suburbs of the tristate area. Westport’s fair, modeled after an event held last fall in Rye, New York, was set up in a spacious meeting room in the most immaculate and well-appointed public library I’ve ever seen. When I arrived, about 30 minutes after the start of the four-hour event, it was bustling. The chatter was already at a healthy, partylike level.
The tables set up around the room each showed off a different device. One booth had a Barbie-branded flip phone; another was offering a retro-styled “landline” phone called the Tin Can. But most of the gadgets looked the same—generic, rectangular smartphones. Each one, however, has its own special, restricted app store, and a slew of parental-control features that are significantly more advanced than what would have been available only a few years ago. One parent showed me her notepad, on which she was taking detailed notes about the minute differences among these phones; she planned to share the information with an online group of parents who hadn’t been able to come. Another mom told me that she’d be asking each booth attendant how easy it would be for kids to hack the phone system and get around the parent controls—something you can see kids discussing openly on the internet all the time.
A couple of years ago, I explored the “dumb phone” trend, a cultural curiosity about returning to the time before smartphones by eschewing complex devices and purchasing something simpler and deliberately limited. One of the better phones I tried then was the Light Phone II, which I disliked only because it was so tiny that I constantly feared that I would break or lose it. At the library, I chatted with Light Phone’s Dan Fox, who was there to show people the latest version of the device. The Light Phone III is larger and thicker and has a camera, but it still uses a black-and-white screen and prohibits web browsing and social-media apps. He told me that it was his third alternative-device event in a week. He’d also been to Ardsley, a village in New York’s Westchester County, and to the Upper East Side, in Manhattan. He speculated that kids like the Light Phone because it doesn’t require all the rigmarole about filters and settings and parents. It was designed for adults, and therefore seems cool, and was designed in Brooklyn, which makes it seem cooler. (Fox then left early to go to a Kendrick Lamar concert with his colleagues.)
The crowded room in Westport was reflective of the broad concern about the effect that social media may have on children and teenagers. But it was also a very specific expression of it. Explaining the impetus for hosting the marketplace, Becca Zipkin, a co-founder of the Westport branch of OK to Delay, told me that it has become the standard for kids in the area to receive an iPhone as an elementary-school graduation present. One of her group’s goals is to push back on this ritual and create a different culture in their community. “This is not a world in which there are no options,” she said.
The options on display in Westport were more interesting than I’d thought they were going to be. They reflected the tricky balancing act parents face: how to let kids enjoy the benefits of being connected (a chess game, a video call with Grandma, a GPS route to soccer practice, the feeling of autonomy that comes from setting a photo of Olivia Rodrigo as your home-screen background) and protect them from the bad stuff (violent videos, messages from creeps, the urge to endlessly scroll, the ability to see where all of your friends are at any given time and therefore be aware every time you’re excluded).
Pinwheel, an Austin-based company, demonstrated one solution with a custom operating system for Android phones such as the Google Pixel that allows parents to receive alerts for “trigger words” received in their kids’ texts, and lets them read every message at any time. As with most of the others demonstrated at the fair, Pinwheel’s custom app store made it impossible for kids to install social media. During the demo, I saw that Pinwheel also blocked a wide range of other apps, including Spotify—the booth attendant told me and a nearby mom that the app contains “unlimited porn,” a pronouncement that surprised both of us. (According to him, kids put links to porn in playlist descriptions; I don’t know if that’s true, but Spotify did have a brief problem with porn appearing in a small number of search results last year.) The app for the arts-and-crafts chain Michaels was also blocked, for a similar but less explicit reason: A red label placed on the Michaels app advised that it may contain a loophole that would allow kids to get onto unnamed other platforms. (Michaels didn’t respond to my request for comment, and Spotify declined comment.)
Beyond the standard suite of surveillance tools, many of the devices are also outfitted with AI-powered features that preemptively censor content on kids’ phones: Nudity is blurred out and triggers an alert to a parent, for instance; a kid receiving a text from a friend with a potty mouth sees only a series of asterisks instead of expletives.
“The constant need to be involved in the monitoring of an iPhone is very stressful for parents,” Zipkin told me, referring to the parental controls that Apple offers, which can become the focus of unceasing negotiation and conflict between kids and their guardians. That is part of these alternative devices’ marketing. Pinwheel highlights the helping hand of AI on its website: “Instead of relying on parents to manually monitor every digital interaction (because who has time for that?), AI-driven tech is learning behaviors, recognizing risks, and proactively keeping kids safe.”
The story was similar at other tables. Gabb, a Lehi, Utah, company, offers a feature that automatically shuts down video calls and sends notifications to parents if it detects nudity. The AI still needs some work—it can be triggered by, say, a person in a bathing suit or a poster of a man with his shirt off, if they appear in the background of the call. Gabb also has its own music app, which uses AI and human reviewers to identify and block songs with explicit language or adult themes. “Taylor Swift is on here, but not all of Taylor Swift’s music,” Lori Morency Kun, a spokesperson for the company, told me.
At the next booth, another Utah-based company, Troomi, was demoing a system that allows parents to set content filters for profanity, discussions of violence, and “suggestive” chitchat, on a sliding scale depending on their kid’s age. The demonstrator also showed us how to add custom keywords for the system to block, in case a parent feels that the AI tools are not catching everything. (“Block harmful content BEFORE it even has the chance to get to your kiddo!” reads a post on the company’s chipper Instagram account.)
Across the room, Bark, an Atlanta-based company that started with a parental-control app and then launched its own smartphone, offered yet another nice-looking slab with similar features. This one sends alerts to parents for 26 possible problems, including signs of depression and indications of cyberbullying. I put it to the booth attendant, Chief Commercial Officer Christian Brucculeri, that a kid might joke 100 times a day about wanting to kill himself without having any real suicidal thoughts, a concern Brucculeri seemed to understand. But false positives are better than false negatives, he argued. Bark places calls to law enforcement when it receives an alert about a kid threatening to harm themselves or others, he told me, but those alerts are reviewed by a human first. “We’re not swatting kids,” he said.
Although everybody at the library was enormously polite, there is apparently hot competition in the alternative-device space. Troomi, for instance, markets itself as a “smarter, safer alternative to Pinwheel.” Pinwheel’s website emphasizes that its AI chatbot, PinwheelGPT, is a more useful tool than Troomi’s chatbot, Troodi—which Pinwheel argues is emotionally confusing for children, because the bot is anthropomorphized in the form of a cartoon woman. Bark provides pages comparing each of these competitors, unfavorably, with its own offering.
Afterward, Zipkin told me that parents had given her varied feedback on the different devices. Some of them felt that such granular monitoring of texts for any sign of emotional distress or experimental cursing was over the top and invasive. Others were impressed, as she was, with some of the AI features that seem to take a bit of the load off of parents who are tired of constant vigilance. Despite all the negative things she’d personally heard about artificial intelligence, this seemed to her like a way it could be used for good. “Knowing that your kids won’t receive harassing or bullying material or sexual images or explicit images, or anything like that, is extremely attractive as a parent,” she told me. “Knowing that there’s technology to block that is, I think, amazing.”
Of course, as every parent knows, no system is actually going to block every single dangerous, gross, or hurtful thing that can come in through a phone from the outside world. But that there are now so many alternative-device companies to choose from is evidence of how much people want and are willing to search for something that has so far been unattainable: a phone without any of the bad stuff.