

So I think that the scale of the systems has created this responsibility for us, where we have to be more proactive about finding and addressing issues [harmful content]. And we can do that both by building technology that is possible now, but wasn't even possible five or ten years ago, and by hiring people at a scale that would not have been possible for us before.
Related Quotes
The community - more than two billion people use our products - and we get that, with that, a lot of people are using that for a lot of good, but we also have a responsibility to mitigate the darker things that people are gonna try to do.
You can think about the metaverse as an embodied internet, where instead of just viewing content - you are in it. And you feel present with other people as if you were in other places, having different experiences that you couldn't necessarily do on a 2D app or webpage, like dancing, for example, or different types of fitness.
People will always want more immersive ways to express themselves. So if you go back ten years ago on the internet, most of what people shared and consumed was text. Now a lot of it is photos. I think, going forward, a lot of it is going to be videos, getting richer and richer.
Moderating content at scale is insane. 3.2 billion people use one of our services every day. It's wild.
We're a community of a billion-plus people, and the best-selling phones - apart from the iPhone - can sell 10, 20 million. If we did build a phone, we'd only reach 1 or 2 percent of our users. That doesn't do anything awesome for us. We wanted to turn as many phones as possible into 'Facebook phones.' That's what Facebook Home is.
One of the things that I'm very mindful of is to make sure that the services that we're building help to create meaningful interactions between people and not just a place where people can zone out and consume content for a long time.