Unbiased

February 2022
Ep 4: Fact-checking The Social Dilemma - Tim Kendall, CEO of Moment
Tim Kendall, former Facebook director and one of the stars of The Social Dilemma, answers the criticism that the docudrama was too simplistic in its assertion that social media is destroying society.
Show notes
[1:05] Is Facebook a scapegoat for polarization?
[9:36] Why social networks surface incendiary content
[13:10] Should big tech be regulated?
[18:25] Social media companies as arbiters of truth
[23:45] Technology as an addiction
Transcript

Arjun: In 2021, you might have been one of the hundred million people who watched The Social Dilemma on Netflix. The docudrama outlined how social media manipulates our emotions and spreads conspiracy theories in pursuit of profit. One of the main interviewees is Tim Kendall, former president of Pinterest and product director at Facebook. Tim helped shape how Facebook monetizes its users and has intimate knowledge of how social media algorithms work.
His concern about these companies has been evident for a while; he testified to the House Committee on Energy and Commerce on the ills of social media back in 2020. Now, as CEO of Moment, Tim is working to build tech for good and to atone for the shame he feels over helping build these social media giants.
Tim graduated from Stanford's School of Engineering in 1999, and he was actually my classmate during our MBA at Stanford from 2004 to 2006. For as long as I've known Tim, he's been a really kind, intelligent, and pretty funny guy. So I'm delighted to welcome him on the show today. I think we're going to have a great conversation.
Welcome Tim.

Tim: Thank you. Excited to be here.

Arjun: Dan and I have a slew of questions, not least because both of us, of course, saw The Social Dilemma with our families, as so many people across the country did. The central message of the docudrama was that social media companies are manipulating us for profit. But Facebook actually put out a response afterwards, saying such a strategy would never work in the long term, that you can't build a company on upsetting your users, and that the claim is simply not true. So what do you say? What do you make of that?

Tim: Facebook has this pattern: time and time again, it makes an assertion, and then 6, 12, or 18 months later it's proven to be incorrect. One of the most infamous examples came right after the election, when there was some chatter that the election had somehow been influenced by Facebook.
And Mark [Zuckerberg] said, that's absurd, that doesn't make any sense. Then you fast-forward 18 months and he's apologizing in front of Congress for not having his hand on the pulse of the algorithm, the company, and the actors that were all under our noses manipulating people, manipulating the electorate.
So it's a talking point that seems to hold up against logic: "Why would we put polarizing content in front of people? Wouldn't polarizing content ultimately make people leave the platform?" That sort of holds up against general logic, but it's totally inconsistent with how these products and their algorithms actually work. The algorithms are clearly given a stipulation, maximize the amount of time people spend on the platform, and as a result they figure out what sort of content will compel people to spend increasing amounts of time there. I don't even know if I can say this on your podcast. It's just total bullshit. It's more bullshit from Facebook.

Arjun: You absolutely can say that on the podcast.

Tim: Okay. Good.

Arjun: Dan, I know this has been a topic near and dear to your heart, because you've actually been thinking about this even longer than I have. So go ahead.

Dan: Absolutely. First off, I just want to say that I think it takes way more time to argue on social media than it does to agree. When you're agreeing, it's "good job" and that's it. When you're arguing, you're out there Googling, you're digging up spreadsheets, and so on. My research skills have never been better.

Tim: When our point of view is threatened, there are aspects of the limbic system that get activated. It creates this energy, this motivation, this all-consuming state that drives us to support our point of view with facts we go out and find and then post in the comments.

Dan: I have a question for you on that front, because Facebook has been blamed for a lot of the polarization in the U.S., and for that matter, so has social media as a whole. But do you feel that might just be a convenient scapegoat? Because you can look at some other countries that have social media, and they don't have anywhere near the same political polarization that we have here in the United States.

Tim: What countries do you think exemplify that?

Dan: If you look at Germany, for example, or the Nordic countries. Don't get me wrong, you can find a number of countries that are as testy as we are, but there are other countries with a decent Facebook presence that aren't.

Tim: I think there are outliers. What you just said is one of Facebook's more recent talking points: what about this country and that country? Those countries aren't that polarized, yet there are millions of Facebook users there spending billions of minutes. So clearly, by exception, we can argue that Facebook does not cause polarization. But what they don't point out (and by the way, there are 3 billion-plus people now on Facebook, WhatsApp, or Instagram) is this: of the countries in which the majority of people are on Facebook, what portion are more polarized today than they were 15 years ago? I'm sure it's the majority.
One of the things Facebook does a shitty job of (and look, they're incentivized to do a shitty job of it) is explaining just how sophisticated AI is. One of the things AI can do, and what it clearly is doing, is understand the composition of the network.
So take a country and figure out: how do I take this system, this aggregation of tens of millions of people, and get all of them, as a collection, to spend more and more time? One of the people in The Social Dilemma talks about this. The way I could get people as a collective to spend more and more time is to slowly and imperceptibly walk them a little bit further along the political spectrum, a little bit further along an axis of division than they were the day before. People hear that and say the AI can't do that, it can't figure out what the most potent axis of division is and then take Dan and me and walk us further and further apart. The reality is that's exactly what AI can do. It is that sophisticated, and that's exactly what's happened.

Arjun: Tim, are you saying that while social media is not necessarily the root cause of polarization, it will exacerbate and amplify whatever divisions we have in search of maximizing engagement and time on the platform?

Tim: Yes. I think we've all experienced it to varying degrees with COVID, right? What is truth? Truth has been completely neutered and obfuscated by social networks. We can't get aligned on anything. Anecdotally at least, and I haven't seen all the data, I suspect that if you look at the nature of your interactions with your family or friends over the last one to two years, and it's certainly the case in my experience, there's real division that has been established and exacerbated by the obfuscation of truth on social media around COVID, as an example.

Arjun: When COVID started, maybe because I'm a hopeless optimist, I actually thought this would be the thing that brought us together. Here was this disease that impacted all of us. It didn't matter if you were rich or poor, black or white, American or non-American, anyone. We were all in this together. And I thought, holy cow, finally something, an unfortunate something, that's going to unify us. And of course it did the very opposite.
I'm going to push on this a little bit, in that this disease, and everything we know about it, has been an evolving set of facts: what we knew in March of 2020, what we knew after the summer of 2020, then through 2021 the vaccination, how useful and effective it is, the variants, all these other things. Absent social media, don't you think people would still have found ways to have different points of view on COVID, given that the facts are evolving and some people have a different risk profile than others?
Generally, what I've seen is that if you have a higher risk profile, you'll be more cautious: this thing's really dangerous, we've got to really buckle down. And other people say, come on guys, it's a 1.8% fatality rate, and primarily for older people with comorbidities; chill out, don't worry. And they'd go with that set of facts. So there's selection bias, confirmation bias, and we'd each go our own way. Is it really fair to blame social media for where we are?

Tim: I think you make valid points. That would hold if the division on COVID really matched up along the dimension you're talking about, our predisposition toward health risk, and that's actually how it split. But the reality is that it fractures along political beliefs. And that dimension is not a consistent or parallel spectrum with health and risk aversion.

Dan: Tim, one of the things that popped into my head, and I want to jump back for a moment: you talked about Facebook's algorithm and how it understands engagement, how it mathematically tries to sniff out what's going to lead to higher engagement. And correct me if I'm wrong, but it sounds to me like this is a program that can effectively understand cultures, and potentially understand what gets us going better than we understand ourselves. Is that fair?

Tim: Totally fair.

Dan: The reason I bring this up is because, when it comes to COVID, is it less an issue of a concerted, unified disinformation campaign, or a group of people who agree, and more an issue of Facebook bubbling up the content it thinks I'm going to react to? So a certain number of erroneous messages make it through that filter and ultimately become, effectively, an alternative truth.

Tim: I would be speculating, but knowing what I know about how these systems work, I think that's probably exactly what's happening.

Dan: So effectively, this algorithm can increase the availability of disinformation if it predicts you're going to spend a lot more time interacting with it.

Tim: Absolutely. Tristan Harris, the main subject of the film, was interviewed on a podcast before the film came out. First of all, if you look at YouTube, 75% of the time spent on YouTube starts from recommendations. It doesn't come from people spearfishing for a particular type of video; it comes from us clicking the recommendations on the right-hand side. He talks about research they did where, basically, if you do a search for girls in bikinis, underage girls in bikinis, within two hops you get to completely inappropriate content. Basically, you get to child pornography within two instantiations of an algorithmic recommendation. Similarly, if a teenage girl does a search for diet, within two hops she will be looking at videos that glorify anorexia.
The point is that these things get us, real fast, to the worst possible content that could ever be put in front of a person, because the worst possible content is actually the best content for maximizing time spent.

Arjun: That's crazy. I didn't know about those examples. Tim, what has the reception been since The Social Dilemma aired? One thing I've sometimes seen in commentary and reviews is: look, guys like Tim Kendall made a bundle of money out of Facebook, so it's pretty easy now to sit back and cast stones, because they've made their money. How do you counter that kind of criticism?

Tim: I think it's fair criticism. I can't go back and change my participation in helping Facebook get built. Admittedly, there were thousands of people involved; it's not like I was one of the key founders. I was in the middle of the company, helping them figure things out, including the business model.
But what I will say is that I'm using a considerable amount of my resources to fight these forces and create solutions that help people counter them. I've put almost $10 million into Moment, which is a tech holding company but also a whole series of pieces of software. It has evolved from its original design into a different effort, but it's still focused on countering these forces and helping people have more agency over their social media usage.
So I think it's fair criticism. I think I should be judged today by the degree to which I'm investing my time and resources to right those wrongs. And I think you could look at that objectively and say: quite a bit of time and quite a bit of resources.

Dan: One of the things you've been an advocate for is regulation of big tech; you've actually testified in front of Congress about it. It also seems like you're part of a movement to address this from within the industry. Do you feel that regulation is needed, or do you think market forces will rectify this over the long term?

Tim: My answer to this was different a year ago. A year ago I thought we absolutely had to create regulation, that we should potentially repeal or heavily amend Section 230 of the Communications Decency Act, which was established in 1996 under the Clinton administration. It really is the reason that things like Facebook and YouTube have been able to flourish, because it has allowed internet platforms to get established and scale without any of the content liability that a traditional media company has.
YouTube, for instance, does not have liability for the harms caused by the content on its platform, whereas if I'm a newspaper and I write a slanderous article, I have liability for it. The whole point of that act was to allow these platforms to prosper, and it did a pretty good job. Now there's this concern that none of these platforms has any accountability or liability for the content that propagates, so we should in fact make them liable. If they're liable, and prompted to pay for the harms they perpetuate, then maybe they'll get their act together.
And my argument at the time was this: if you look at the history of Facebook, the chatter and criticism around it 10 years ago was largely about privacy. There was a huge issue around data leakage and people's privacy being exploited. The interesting thing is what happened once the FTC finally stepped in. They threatened Facebook for a long time over all these privacy missteps, and then they finally said, you know what, this is ridiculous, here's a $5 billion fine. And if you look for privacy slip-ups at Facebook since that fine, you would be hard-pressed to find any. So the argument was: once the FTC got its stick out and started really penalizing Facebook for being so sloppy on privacy, Facebook suddenly found the resources to take care of those issues.
So could it be the case that if you amended Section 230 so that Facebook had more liability, they would magically sort out that problem as well? Actually, looking at some of the things that have emerged over the last six to 12 months around this newer area called web3, this sort of decentralized notion of building out internet services, I think it's possible, Dan, to your point, that the market will figure it out, and it may figure it out through this whole new web3 paradigm. Through web3, this kind of crypto-enabled ecosystem of software, it's possible that some of these networks could be reconstituted and decentralized such that they actually disrupt and replace the networks we have today.

Dan: I'll editorialize a bit here. I have a teenage daughter, and we finally just banned social media in the house because of the instances of mental illness among her peers. And obviously we have the pandemic and everything as well.

Tim: Yeah. But even before that, the data were outrageous. They did a nice job in The Social Dilemma of showing the incidence; I think it's something like one in four teenage girls inflicting self-harm.

Dan: Oh yeah. And honestly, this is anecdotal, but from my experience I'd imagine it might even be higher than that. One of my big fears about regulation, and you touched on this, concerns the tech industry on the whole, to draw a big blanket, with a specific spotlight on Facebook. It does seem like Facebook and the tech industry are almost getting to the point where they could be like the tobacco industry, in a way: able to effectively write their own laws and fund their own legislation, because they just have the money. And my fear is that in implementing regulation, we could very well get regulations that favor Facebook, as opposed to ones that allow upstarts to come up.

Tim: I think it's a great point, and I think it's a huge risk around regulation. And I don't think the point you're making is lost on Facebook. They're advocating for it. They're buying full-page ads in The Wall Street Journal saying, please regulate us.

Dan: When they do that, your antenna should go up.

Tim: Yeah. It seems like what you're saying is that they should be held accountable for what's on their platform. They've gotten away without that for so long that it's time to hold them accountable. At least, that's one course of action that could emerge from this.

Arjun: When they have tried to police content on their platform, we've also seen that sometimes they get it wrong, most famously in the last year with the theory that the COVID virus may have escaped from a lab in Wuhan, China. It's still considered a remote possibility, certainly not the leading theory of where the virus came from, but also not something you can dismiss outright. Even in the last couple of days, Dr. Fauci has been grilled on whether he covered it up to some degree or coordinated with other doctors and public health officials to bury it. The social platforms all did what was, quote, thought to be the right thing: they shut it down and banned journalists who talked about it at the time. And now they're saying, maybe we were a little too quick on the trigger there. So are we just begging for censorship, in a way? And when we get it, will we say, oh crap, maybe it wasn't such a good idea to hold them accountable?

Tim: You bring up a great point. Do you want to make Facebook the arbiter of truth, and what are the risks associated with that? I think they're considerable. The COVID lab example is one of hundreds showing that when Facebook did step in and try to be the arbiter of truth, it was problematic. I get to these conversations and I just want to put my head in my hands. There's a part of me that's just grateful I don't have to figure out the answer, that I can pontificate along with you guys, because it is a morass, a disaster, and a mess.
The way out is really unclear. And Dan, I'm with you. My daughters are young, five and seven, but boy, I don't want to let them anywhere near this shit. With our second grader, we just sent a note out to her class. There's an interesting framework that might be worth sharing here, called Wait Until 8th. If you go to waituntil8th.org, all the content and templates are there. It lets you advocate within your child's class for the parents to band together and say, look, we are not giving our kids a smartphone until they're in eighth grade. The vast majority of our daughter's class, call it 90-plus percent, very quickly jumped on the thread and pledged to band together.
And that's what you really need: the network. That's what's so problematic about these things. But if you can get the network, if you get your kids' friends' parents all aligned, then look, it doesn't solve for sports teams that span different schools, but at least at their school, in their grade, you can stipulate that the vast majority of kids are not going to be on a social network until they're 14.

Dan: That's a fantastic idea, and it's a valid strategy, because one of the challenges we're having at home is that there are some friends my daughter can only communicate with on Instagram, for example. So she asks, how do I get in touch with them? I don't know the answer, but I would much rather have her figure out a creative solution for getting in touch with those people than be on that app. And the other thing you alluded to, which I'm really interested in: it sounds like in a lot of ways the hair-trigger response is that we need regulation, we need to take some massive action, but it also sounds like there's a bit of cultural education or cultural awareness that really needs to go on.

Tim: And Dan, to that end, there's certainly a cynical take on the film. But first of all, it was seen by a lot more people than probably anyone expected. Arjun, I know you mentioned a hundred million; the Netflix figure we've been given is that over a hundred and fifty million people have seen it. At the very least there's a shared experience across those 150 million people, a sort of shared set of facts and maybe a shared feeling of alarm. That's certainly a decent first step.
There certainly wasn't a mass deletion or defection from Facebook as a result of the film. But I hope people at least came away more aware of the forces at play, and more thoughtful about their own usage and their kids' usage. But it's a morass and a mess. I noticed over the holidays that I'd gotten into a pattern of using my phone a lot more, which basically just meant browsing social media more and looking at news more. I checked Apple Screen Time to see how bad it was, because it felt like I was using it way more than normal, and there were a couple of days when I was up to five hours. What I did as a result was enact a fast: I said, okay, I'm not going to be on my phone for the next several days, or if I am, I'll only check email for a little bit, or whatever. And I have to say, the before and after, just in terms of my mental state, was quite noticeable. I'm just underscoring your point: these things are cigarettes. They're addictive, and they mess us up psychologically, adults and kids alike.

Arjun: Why is this issue of addiction so important to you, Tim? Because you've spoken about this a lot.

Tim: I'm familiar, in my own family, with the carnage of classic addiction, substance addiction, where the consequences are typically more pronounced and more acute. I learned, in the context of my own family, just how corrosive the disease of addiction can be. And when I started to think about addictions beyond classic substance addictions, what are often called behavioral addictions or process addictions, you start to realize that as a world, we're all addicted. We each have different vices: some of us are addicted to work, some to shopping, some to sugar. I don't think we have a good vocabulary around addiction.
And as a result, I think we've got a culture of addicts running in ten different directions, addicted to ten different things, and in a lot of cases not even conscious that we're in the throes of addiction. So I care about it because I think it's on the same spectrum. With substances it's so clear: when you're addicted to heroin, life gets bad, and it gets bad fast, for the addict and for the family. I think this phone does the same thing; it's just more nuanced, more subtle. And our societal norms around this thing are so nascent, because it's so new, that we don't really understand how corrosive and terrible it can be. But as I said in the film, this thing is causing terrible psychological harm. It's causing people to hurt themselves, and in some cases it's causing people to kill themselves.

Arjun: I hear you. And I haven't looked at the research on causal factors in depth. One reason I'm more cautious about drawing that conclusion, and please tell me if I'm being too cautious or missing data, is that I feel like middle school was hell, and has been hell forever, long before Facebook and phones.
I hated middle school. It was a cage match. It was like Thunderdome every day: you got in there and thought, all right, which fight are we going into? And that was on the guys' side; the girls I knew also hated middle school, or found it really challenging. There were the popular kids, and there are movies like Mean Girls that existed long before social media. So one thing I wonder is: is it too easy to blame social media, such that we miss the real factors, the cultural elements that make it okay to pick on kids, or to objectify looks, or wealth, or some other dimension like that? Yes, of course, things like social media exacerbate and amplify this and make it easy to cyberbully, but there has always been bullying. So what I'm wondering is: yes, it's bad, but if we just focus on that, do we miss the cultural elements that were there long before social media?

Tim: I don't know why this example comes to mind, but one of the organizations my wife and I have given a lot of time and money to is called Thorn. They do a lot of work around child sex trafficking, and one of the things they're studying more recently is sexting among teenagers. It's become incredibly common; I think one in five girls will send a nude picture to a peer at some point, because that behavior has been normalized. And what happens, and Thorn has documented this because they're really trying to create awareness around this topic for parents, is quite often a girl will text an inappropriate picture of herself to her boyfriend, that boyfriend will share it with a group of friends, and then one of that group will share it on social media. By the time that girl goes to school the next morning, everybody in the school has seen that photo. I give you that example, Arjun, because I hated middle school too, but that never happened in middle school.

Arjun: True.

Tim: And I think that's one of 50 awful things these networks and smartphones enable. There are probably other cultural factors at play, and a lot of the research that's been done hasn't done a very clean job of showing a causal link. But it's quite clear that the incidence of self-harm is up many times over in these cohorts of teenagers, particularly teenage girls, and it's quite clear that the incidence of suicide is also up, coinciding with the same period in which both the smartphone and social media became so dominant.

Arjun: Yeah, I think your point's a really good one. Prior to phones, what would have happened in middle school is a girl might have gotten a completely unfair reputation that got around really quickly, but there was seldom photographic evidence, and it couldn't spread at the scale and speed it can now. Before, it would take maybe days or weeks to get around; now it's minutes, and her life is shattered.
So I think it's much like your earlier example: even if social media isn't the causal reason for polarization, it will take whatever inch of divisiveness we have and multiply it a thousandfold very quickly. And maybe that's really what we're looking at: an extraordinary amplifying factor that we've never had to deal with as a society before.

Tim: You should go to Thorn.org, by the way. There's a lot of information there about how to talk with your kids, as a parent, about these kinds of risks, and they have some really wonderful resources on the topic. Because that's really where we're left: what is our role as parents in getting up to speed and competent on these topics, so we can have informed conversations with our kids, so they can form their own informed points of view, exercise good judgment, and hopefully not get into too much trouble.

Arjun: That's a great recommendation. We'll have all the resources you've mentioned in the show notes so people can check out these organizations, as well as the Wait Until 8th framework and things like that. These are really good, practical steps, and hopefully we can come away saying it's not all doom and gloom. Yes, we're dealing with a different situation, a different set of issues. But the docudrama, like you said, at least got us all on the same page: whether or not you agree on the degree of severity, this is a problem we all need to be conscious of. And now there are these frameworks and people taking steps; that's progress. Hopefully it moves faster in that direction so we can avoid more catastrophic outcomes.

Tim: Agreed.

Arjun: Thank you very much, Tim. This has been fantastic. I hope everyone listening got a lot out of this. From Dan and me, we were very happy to have you on the show.

Tim: Awesome.

Dan: Thank you, Tim.