By Walt Hickey
Numlock is off until Jan. 2, 2019! During the week between Christmas and New Year’s we’re doing best-ofs. Today: my Sunday special talking with Bloomberg’s Sarah Frier about Facebook’s struggles this year, sent the week of Thanksgiving.
This week, I spoke to my friend Sarah Frier, who covers Facebook for Bloomberg. She’s appeared here before, but I wanted to talk to her this week specifically because Facebook had an unrelentingly weird week and I didn’t want to let it slip by without talking about what really happened.
We spoke about Facebook’s no good very bad week, what’s gone wrong at the company this year, and the real core of the company’s issues.
Sarah can be found on Twitter here and you should follow all of her coverage at Bloomberg. She’s also coming out with an as-yet-untitled book about Instagram.
This interview has been condensed and edited.
Walt Hickey: This week was three days long and Facebook had a bad day. All three of them. What went down this week?
Sarah Frier: The main problem this week was the aftermath of this New York Times investigation. Basically, all of the problems that we've heard about from Facebook -- about how they've handled the Russian crisis, how they've handled the data privacy issues, how they've handled their public perception -- what the Times report did, really for the first time, was link a lot of those bad decisions directly to executive decision-making. In particular, decision-making by Sheryl Sandberg, the Chief Operating Officer of Facebook, who has not thus far been in the spotlight for a lot of the issues.
We've heard calls for Mark Zuckerberg’s head for months. People say, "Oh, he should step down and be executive chairman, or he should be fired." He can't be. He has all the voting control. He is the founder and in charge, and the board would never be able to do that. But Sandberg is fireable. So now this heat is coming onto her. So that's where we start.
And then this week happens where Facebook denies this report, and then kind of says, "well, parts of it are true," and then they put out this memo confirming much of it. This report was titled "Delay, Deny, Deflect," and basically Facebook did that in response to the report! It was kind of perfect. There are also general business issues: this comes on the heels of downtime for their advertising system, which is probably the last thing you want to have happen on the eve of the biggest shopping holiday of the entire year. Marketers were calling me Wednesday morning and saying that they couldn't get their ads to run. It was really a nightmare.
Whoa. What happened there?
Well, people couldn't post the ad they wanted to post. Basically, the way Facebook advertising works is there's this online system that you have to buy all the ads through. It's a self-service system. One reason it's been so easy for them to scale that business is because there are basically no humans involved. But if you get to the point where you have trouble, there's not really a hotline to call. It just takes a while for you to get someone to fix it and for Facebook to confirm the bug. They eventually fixed it, but it took a while.
And then the other thing is all of these parliaments around the world were gathering together to ask Zuckerberg to testify. He said he wouldn't. There was also this BBC report about how fake news is leading to very grisly violence in Nigeria. So it was more of the same for Facebook basically.
It was crazy. It was a very short week. Day one you had the Times report coverage, day two you had the ad tech problems, day three they bury this memo on Wednesday evening. It was just one after another. And a lot of times, the week before Thanksgiving is where news is sent to die.
Right? So it was kind of perfect that they had denied the report when everyone was working, and just as everyone was about to go on family vacations then they said, "actually, a lot of this is true."
Broadly speaking, what's Facebook's year been like? You've done a lot of reporting on some of the founders of companies that they bought leaving.
The company is in the midst of a series of crises related to how it grew.
Facebook has responded to those crises by apologizing and fixing the specific things that ended up becoming controversies, but, of course, that doesn't fix the underlying system that led to them. You have just this pattern of Facebook getting into trouble, saying first, "Is that really trouble?" Then second, "Oh actually you're right, we'll fix this. We apologize." And then third, it happens again and the same cycle happens all over!
They haven't really fixed the fundamental problems with how Facebook works: they got to this point by focusing first and foremost on growth, on scaling, on building their business so that it required the least amount of human intervention. Technologically, the word they use is scalable. What that basically means is that they don't have to look at it in detail and can just let it grow. Now when problems happen or get brought to their attention, they have to decide, "Okay, is this a Facebook problem or is this a humanity problem?" And more and more, their critics, who are looking at them more closely now, are saying, "actually, it's a Facebook problem."
In March, when there was violence percolating in Myanmar and Sri Lanka, finally Facebook decided after years of ignoring it and saying it wasn't their problem, that actually maybe they should start to dive into it. There are so many things like that! Take privacy stuff. "We gave developers information on users years ago, but they weren't allowed to give it to any third parties." Well, a developer did give a lot of data on up to 87 million people to Cambridge Analytica, a political consulting firm, and they did use it to ill effect. So is that Facebook's fault? Well, the public thinks it is, and they're just now being held to a standard that they should have been held to earlier.
Facebook is not some underdog startup anymore. It has more than 2 billion people using it around the world and with that comes making the problems of humanity exponentially worse sometimes.
Their product is a little bit more conceptual than our typical notion of a product. If General Electric made a washing machine that helped people commit genocide, General Electric's ass would be on the line. But since Facebook is a bit more of a nebulous organization that has network effects, it's a little bit harder to pin down exactly what their role in that kind of incident is. Does that make sense?
Yeah, and that's exactly the problem. Facebook looks at the data and sees more and more people reading that or looking at this. So it must be good, right? If this story is going viral, people must love it, and it must make them happy or it must contribute to their lives in some way. It turns out that things go viral because they spur emotion and shock, and it's not always because they are good things. Even if the data says that, you know, people are reading this and they care about it -- well, they might care about it and be reading about it because they are now terrified for their lives because of some information that's not true. And so that's a negative effect. But if you're an engineer looking at it and thinking, you know, I built something that people love, you might be blind to that effect.
They're looking at a chart that is going up and they don't ask the questions about what the y-axis is actually measuring.
Right. Whether it's a positive experience.
Sounds like they're in a little bit of a pickle.
Well, the thing is Facebook Inc. is not Facebook itself. We keep talking about Facebook, the website. But this company has made bets to be more than Facebook. They own WhatsApp, they own Messenger, they own Instagram, and those products are all incredibly popular and have very different problems. WhatsApp is end-to-end encrypted, which means that even if Facebook wanted to solve some of the content issues that have cropped up, they couldn't on WhatsApp, because they can't see what people are sharing. And that's led to a lot of terrible violence in India and Sri Lanka and Brazil, just spurred on by lies that nobody can track.
That seems huge.
It's really bad. But you know, Facebook has been talking about its content policy and talking about the fact-checking networks that they have, but none of those promises apply to what is one of the largest messaging platforms in the world.
How do they even attack that problem at a certain point?
Well, in some places they've restricted forwarding. So in India you can now only forward messages to small groups. But in some countries you can still forward messages to 250 people. So it's really not an evenly-applied solution and it actually doesn't get to the root of the problem.
But Instagram is fine, right? I get to look at nice pictures of doggies on it.
No! Instagram has its own unique problems. As some of Facebook gets cleaned up, Instagram has been home to some very interest-based organizing. On Instagram you can find these really niche communities. There's no virality, so that actually helps Instagram avoid some of Facebook's problems. But there are really niche interest-based groups, and there's no real identity that is required. So you can be a total Nazi in a group with other Nazis organized by hashtag on Instagram, and it will be really difficult for the company to find you.
There have been a few cases where people have become radicalized there or spoken about an atrocity that they're planning to commit there. And it's very difficult to find until something bad happens. Like the Borderline nightclub shooter in Thousand Oaks -- I'm actually in Thousand Oaks right now for Thanksgiving -- that shooter posted on Instagram in the midst of his attack. But there are bad people everywhere, right? So it's not that these social networks have to remove all the bad stuff in society, but they have to become attuned to how their platforms amplify it and what they could do to not be the gasoline on the fire.
Oculus is still doing great, right?
Zuckerberg, when he purchased Oculus in 2014, said that he expected it would be the next platform where we all communicate after the mobile phone.
Well, readers can't see it but naturally we're talking over an Oculus right now.
I don't know that anyone's really doing it!
People at Facebook talk to me about how maybe one day we won't have to commute into work -- we will just go to work via virtual reality. Now I don't know about you, but I don't think that I could do that. But that's fine, because Zuckerberg knows that it's not working. He actually joked about it at a recent conference.
But there's been a lot of turbulence overall! All these platforms that we've been talking about -- Oculus, WhatsApp, Instagram -- have recently lost the founders who came to Facebook with the acquisitions. Facebook used to be the place to go if you had a really valuable product that you wanted to keep growing and wanted to build on Facebook's expertise in growing products to humongous audiences: you'd go there, keep running your company, and you wouldn't have to give up your power. Zuckerberg would sort of respect you as a founder and let you do your thing.
This year we've seen those founders leave. That makes for some interesting questions about the future of the company. Just as Zuckerberg is under fire for what he has been blind to in building these products, he is now even more in charge of them than he was before.
If you have anything you’d like to see in this Sunday special, shoot me an email. Comment below! Thanks for reading, and thanks so much for supporting Numlock.