Tech CEOs Under Senate Scrutiny on Children’s Safety

Some of the tech industry’s most prominent and powerful leaders are on Capitol Hill for a Senate hearing focused on protecting children online. Center for Humane Technology Policy Director Camille Carlton joins Caroline Hyde and Ed Ludlow on "Bloomberg Technology."


What, if any, policy can be put in place to ensure that children are that little bit safer? We can't change human nature, but we can potentially hope to put some barriers in place, whether with the technology itself or indeed with policy more broadly.

Yeah, absolutely. I think we're at a point, as everyone has said so far, where we are happy that these hearings are happening, because the public needs to see it. But we're also at the point in our democratic process where it's time to push forward change, and some of the things that we look for in bills, specifically around kids' safety, are three main pillars. We want a duty of care, which essentially means that, just like a doctor has a fiduciary responsibility to ensure that patients are treated in their best interest, we want platforms to ensure that kids are not harmed on their platforms. We want privacy by default. This means that if a kid goes onto Instagram, the moment they sign up, they have all of their settings at the highest level of privacy. It means that they won't get messages from people they're not friends with, and their profiles won't be able to be seen by people they don't know. And we want safety by design. We want no more addictive features, features that are known to be harmful to kids but that platforms continue to utilize because it keeps them on longer.
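To make the "privacy by default" pillar concrete, here is a minimal sketch in Python of what defaulting a minor's account to the most protective settings could look like. The `AccountSettings` fields, the age threshold, and the setting names are illustrative assumptions for this sketch, not any platform's actual API.

```python
from dataclasses import dataclass

ADULT_AGE = 18  # illustrative threshold; rules vary by jurisdiction

@dataclass
class AccountSettings:
    profile_visibility: str    # "public", "friends_only", or "private"
    allow_messages_from: str   # "everyone" or "friends_only"
    discoverable_in_search: bool
    personalized_ads: bool

def default_settings(age: int) -> AccountSettings:
    """Privacy by default: minors start at the most protective settings.

    The key property is that the safe configuration is the starting
    point, not something a child (or parent) must opt into later.
    """
    if age < ADULT_AGE:
        return AccountSettings(
            profile_visibility="private",        # not visible to strangers
            allow_messages_from="friends_only",  # no DMs from unknown accounts
            discoverable_in_search=False,        # not surfaced to strangers
            personalized_ads=False,              # no behavioral ad targeting
        )
    return AccountSettings(
        profile_visibility="public",
        allow_messages_from="everyone",
        discoverable_in_search=True,
        personalized_ads=True,
    )
```

The design choice the pillar turns on is simply which configuration is the default state: safety that must be opted into is not "by default."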
I'm going back to that privacy focus now. Some companies, I think of one in France, Yubo, that came on the show last week, say that ultimately you can use technology as a force for good here, that you can have some real ability to verify the age of whoever is actually using the system. But then privacy comes into it. How do you allow the camera to identify whether a child is indeed the age they say they are without in some way impacting their own privacy? How do you see technology being the solution here?

Yeah, I think that's a great point. But part of what's missing is that these companies already have age estimation tools that they use to know the age of their users. I mean, one thing that came out of the attorneys general lawsuit that's happening across 41 states is the fact that Meta knows that it has millions of users under the age of 13 on its platform, and it's keeping that knowledge from the public. So they know this, and they use their estimation tools to actually target against these kids and to target advertising to age ranges. So we actually don't need new technology. Companies already have this information. We just need them to enforce it.
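Camille's distinction between holding age estimation signals and enforcing against them can be sketched as a thresholding decision: the platform has a probabilistic age estimate and must decide when it is confident enough to act. Everything below, from the signal names to the thresholds, is a hypothetical illustration, not how Meta or any other platform actually works.

```python
from dataclasses import dataclass

MIN_AGE = 13             # common self-declared minimum on large platforms
CONFIDENCE_TO_ACT = 0.8  # hypothetical enforcement threshold

@dataclass
class AgeEstimate:
    declared_age: int     # what the user typed at signup
    estimated_age: float  # model output from behavioral or visual signals
    confidence: float     # how sure the model is, in [0, 1]

def enforcement_decision(est: AgeEstimate) -> str:
    """An estimate is not knowledge: act only above a confidence bar.

    The policy point in the interview is that the same signals a
    platform uses to target advertising by age range could drive
    enforcement; both uses read the same data.
    """
    if est.declared_age < MIN_AGE:
        return "block_signup"  # declared age alone already fails the rule
    if est.estimated_age < MIN_AGE and est.confidence >= CONFIDENCE_TO_ACT:
        return "require_age_verification"  # strong signal the user is a minor
    return "allow"
```

The gap between `estimated_age` plus a confidence score and actually knowing a user's age is the same gap the comment at the end of the page points at.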
Camille, I still go back to the idea that when this Senate hearing is finished, what comes out of it? Because many of the social media companies have codified policy, right? If you take TikTok as an example, they have restrictions on age; I think 13 is the minimum age to use TikTok, and there are policies. The question seems to be about enforcement and, as Caroline rightly points out, either a lack of technology, or the use of technology, to enforce those policies. I just wondered what your perspective is on policy versus action.

Yeah, I think the fundamental difference that we're talking about, though, is self-regulation by these platforms, which is where we're at right now, as opposed to regulation that's codified in law. These platforms can change their terms and conditions at any moment. And what we're seeing right now is that they release features that folks have been asking for for years ahead of a big PR moment. And that's great; we want these better features. But releasing these features because of a PR moment is no substitute for regulation that incentivizes safe innovation and age-appropriate strategies from the beginning.

Camille, Pinterest CEO Bill Ready has an op-ed in The Hill this morning likening the social media industry to Big Tobacco. That is something I've heard before. Why do you think social media companies would do something like that? Are they trying to get into the psyche of lawmakers and show that they're on the same wavelength, or something like that?

Well, I think every social media company, and the ways in which they utilize their products, are different. Pinterest has a different business model and a different position than other social media companies. We've seen that with Snap coming out and endorsing the Kids Online Safety Act, and now with X coming out and endorsing the Stop CSAM Act. They're all taking slightly different positions. But I think the main takeaway here is that we're starting to see them come to the table. There is overwhelming support from parents, from attorneys general; I mean, schools are suing these platforms. And I think that they are realizing that time is up on this "move fast and break things" mentality, and we need to come to the table and figure out exactly what a path forward looks like. And that path does require regulation, in the same way that we had to do it with tobacco.

Comments

@0x526F62

“These companies already have age estimation tools that they use to know the age of their users”. “estimation” … “know” … these are different words.