We've lost control of our phones - fmr Facebook exec
Social media, once created to bring millions of people together, is now used to control what we think and how we live, without us even knowing it. We talked to Tim Kendall, former Facebook director of monetisation, ex-president of Pinterest and CEO of the Moment app.
Sophie Shevardnadze: Tim Kendall, former Facebook director of monetisation, former president of Pinterest, and CEO of the Moment app, great to have you with us. I mean, I have so many questions for you. Where do we start? So obviously, during this pandemic, our screen time on social media is way up. I mean, do you think that this increase in usage will be permanent? Or are we going to sort of wean off once we're allowed to freely roam again?
Tim Kendall: It's hard to predict the future. I mean, I think that we've seen a surge during the quarantine, during this pandemic, because people don't have alternatives. They don't leave their houses often, they're clearly not nearly as social as they used to be. And by ‘social’ I mean physically social, like going over to each other's houses or meeting up places. So I guess I'm optimistic that I think that it will reduce to some degree once we get through this craziness of quarantining.
SS: So this year's ‘Social Dilemma’ documentary, which is dedicated to the threat that is social media, says that internet companies are using technology to compete for our attention, which they then sell to advertisers. But here's a question from someone who has worked in television for most of her life. Hasn't TV been doing the same thing for decades now? I mean, we've heard the same arguments against TV, ‘it's brainwashing’, ‘it's manipulating’, ‘break it’, ‘throw it out’. Isn't social media just a new sort of TV?
TK: Yeah, I think it's a great question. There are a couple of things that I'd say. One is that TV just isn't as good at being addictive as social media. And that's for a couple of reasons. One is it's not interactive, and two is it's not personalised to us. The content that I see on TV is not about my social world. It doesn't reflect my popularity and my standing. It doesn't allow me to compare myself to friends or colleagues in the way that social media does. So social media really preys on a bunch of things in kind of my animal brain that really get me addicted. You know, for some hard data: in the 50s people watched about four hours of TV a day, and now that stat, on average, at least in the United States, is about the same - about four hours. Just to show the difference with social media: in 2010 people spent 12.5 minutes per day on social media, and today they spend 2.5 hours per day. So it's gone up more than tenfold in 10 years per person, while the number of people using social media has grown from about 500 million to 3 billion. So it's explosive in that sense, and it's addictive. You know, the mental health data on TV just looks very different from social media, in terms of the impact that social media usage has on my mental well-being. We know that it makes me depressed, and we know that it makes me anxious.
SS: You're saying the average number is 2.5 hours - I wish mine was that low. In my case, actually, it's much more than that. I was like, 2.5 hours - wow! That's nothing compared to what happens to me. No, but we agree, right, that this concept of catching my attention and keeping it through manipulation is really an old concept, an old threat, but on steroids... Right?
TK: Absolutely. I mean, I think that, you know, artificial intelligence will probably change that. In fact, it's already doing this with Netflix. Netflix - not only are their recommendations powered by artificial intelligence, but so is some of the programming. Some of the show development and creation is now being powered by data that they have about how you and I watch their shows and what appeals to us. And so I think it's kind of a matter of time before traditional television starts becoming so personalised that it starts to suck in our attention in those sorts of ways. For now, traditional TV just isn't nearly as good as social media at stealing our attention.
SS: You mentioned before we started this interview that America is polarised like never before. And I said, well, the world is polarised, not only America. Here's the thing: you also said the political consequences of social media are scary, that the way this goes now could lead to a civil war. But once again, I'm going to bring up the traditional media. Fox News has been around since 1996, when there was no Facebook or YouTube, and the media was already polarising the landscape, because, you know, flashy things sell in the media. More blood - more ratings; moderation and objectivity just don't interest people, they don't bring ratings. So why are we saying that tech companies need to change, but we're okay with Fox and MSNBC destroying journalism in America and completely polarising the country as well?
TK: Yeah. I think that the beginning of all of this was cable TV. It was in 1996 when Fox and CNN and MSNBC started fracturing us into different groups that had different sets of facts. I think what we're saying is that social media is just that on steroids, because instead of there being three versions of the truth in the United States, there are 100 million versions of the truth. I mean, you see this even on the left, or on the right: Republicans can't align and neither can Democrats. They're completely fractured. And part of the reason that they're fractured is that - and they talk about this in the film - there's no concept anymore of truth, no concept of shared truth. But I blame cable TV for starting this whole thing in terms of polarisation. So I think you're right that that is a problem. Social media, by virtue of the technology and the ability to create your own CNN, you know, your personal news network, makes it even worse.
SS: I'm trying to pinpoint something here, help me out, because I'm talking about whether we should be targeting, in this case, the huge tech giants or something more narrow and particular. For instance, we've heard accusations that Facebook fomented off-line ethnic cleansing in Myanmar. But an old-fashioned radio programme fomented a much graver genocide in Rwanda, for instance. And we saw the broadcasters being held responsible, not the radio technology. What is the argument for treating the tech companies of today differently? Do you know what I mean?
TK: Um, well, the argument in the United States - and I testified in front of Congress about this very issue - goes back to 1996, around the same time, coincidentally, that cable was, you know, fracturing us. There is a law called Section 230. And what it does is allow these tech companies to propagate and publish any content they want without being liable for it. It's the Freedom of Information and Decency Act - I may be getting that name wrong, but it's Section 230. And the testimony that I gave basically suggested that they need to look at 230 and amend it, because you're right, news organisations are liable. If you say something that's, you know, libellous, factually incorrect, etc., you have a liability, you are held accountable. If anyone can say anything, and anyone can propagate anything that's popular and inflammatory, we can create some really serious societal problems.
SS: Here's a question turned around. For instance, the co-creator of the Facebook Like button tells us in the film that his team just wanted to create something that would be good, that would foster goodwill and enable people to reassure each other, and notes that he didn't expect teenage depression or the ‘like addiction’ to come out of it. But addiction to being liked, one way or another, has, once again, nothing to do with Instagram. It's human nature, right? So I'm just wondering, are we scapegoating big tech because we're just too embarrassed to admit that our behaviour is ugly at times, and we're just blaming our age-old vanity on Facebook?
TK: Yeah, um, I think that's a good point. I mean, my takeaway from the film is twofold. Maybe threefold. One is that the companies need to take more responsibility and accountability; the governments may need to step in if they don't do that swiftly enough. And we as individuals are responsible, too. We are making choices that are not in our best interests. And we've lost control of our phones. For a number of the reasons that you mentioned, we've lost control of our phones because we've let them prey on our vanity, our need to be right, our need for recognition, our need for other people to inflate our reputation. All of these things are human tendencies that social media really taps into. And we have a personal responsibility to make, I think, different choices if we conclude that these things have taken over our lives, in the same way that I have a responsibility, if I drink too much, to figure out a path to get that back under control because I don't like the consequences it's creating in my life. I think the same is true of cigarettes, the same is true of eating too much sugar. So I agree with you, I think it's silly. And I don't agree with people who just say, “Look, it's not my fault - it's social media, they're the ones making me pick this thing up.”
SS: So talking about cigarettes, you were Facebook's head of monetisation, and you've admitted in the documentary that you helped make Facebook ‘as addictive as cigarettes’. That's your quote. So I’m just wondering, is it a figure of speech, or are these two addictions indeed driven by the same processes in the brain?
TK: The point that I was trying to make is just that there was a threshold at which Big Tobacco realised what they were doing, and they continued to do it - they continued to put more and more things into cigarettes that made them more addictive and made me want to smoke more of them. And the point that I'm trying to make is that social media, and in particular these artificially intelligent algorithms, are doing the same thing. They're moving along, looking for new additives to inject into social media, so that you and I want to use it more. Because the algorithm is being told, “get more of Tim's attention tomorrow.” And then it goes off and thinks, “okay, how can we do that? He spent two and a half hours on it yesterday, how can we get him to spend three hours on it today?” And so the point that I was making was that these algorithms are going off and finding new additives, like the cigarette companies used to do. If we rewind 10 years, it started with, as I said earlier, popularity and comparison. And now we're in a different realm of misinformation, conspiracy theory, polarised content. We're stepping up the ladder of things that really trigger and engage the most primal part of my brain, so that I spend more time and I'm more engaged.
SS: Tim, so tell me, so that we know: what are the most sophisticated tools that keep users on the hook? I mean, are engineers in big tech consciously using neuroscientific knowledge to keep us engaged?
TK: No, I think part of what is scary is that there really aren't human beings on the other side of these services. It's really an algorithm that is, in my opinion, not very well supervised, that is being given an instruction: “make Tim spend more time on the service tomorrow. You have this universe of billions of pieces of content, he's got a couple hundred friends there publishing content, let's figure out what to put in front of him that's going to get him sucked in. And we also have push notifications. And we can play with the time of day that we send those push notifications to him. If things get really bad, we have his phone number, so we can text message him.” And by ‘really bad’ I mean “he hasn't come back for a few days”. When I go off Facebook for several days or a week, I get a text message from them with something that really tries to pull me back. So those are the tricks. And it's preying on - I mean, one of the most effective things that Facebook figured out 15 years ago was to send you an email that says, “Hey, there's a new photo of you on Facebook”. They don't send you the photo. I don't know why they don't. Why don't they send you the photo? They don't send you the photo because they want you to come to the service to see it, which, by the way, 100% of people do. And then once you're there, you stay longer. And when you stay longer, they can show you more ads, and when they show you more ads, they make more money.
SS: You know, not so long ago, I spoke to the ‘father of modern marketing’, economist Philip Kotler. I don't know if you've heard of him. And we talked about how commercials are to blame for our growing dissatisfaction with our lives, in the way that they feed us the idea that the product they're offering is what we need, and without it our lives are just incomplete. Does social media work in the same way? Or are its methods more complex than that?
TK: I think it's more complex. I think it used to work in that way and only along those dimensions, which was that it tapped into our natural tendency to compare. And that's right. I mean, that was kind of early commercials and marketing, it was like, “Look, look at your life, it could be better if you bought this product”. And look, that's a very effective way to manipulate someone, change their mind and get them to spend money. Now, I think it's gone even further and it's really starting to tap into whether you are right or wrong, whether your view of the world is right or wrong, and, by the way, these services really help convince you that you're right, because that's good for business. And they make you angry at the other side, because they put information in front of you that proves that the other side is wrong. And it's doing the inverse to the other side. And that's what's so scary. You know, Tristan Harris, who's in the film, recently said something that I think is really helpful, which is: let's just imagine that you picked up your phone and there were two options. There was a feed of information that showed you things that basically validated your worldview, that basically showed you that you were right and affirmed the things that you believed. And then there was another feed that actually challenged you, that put information in front of you that challenged the views that you held. So if you believe the Earth is round, this feed would actually show you data that illustrates how the world might be flat. People don't want to read that second feed. That second feed is not that engaging. People don't want to be told that they're wrong. They want to be validated. And so I think it's gone one step further than just, you know, convincing you that your life could be better if you buy a product. It's gone further: it's convincing you that you're right and the other is wrong. And not only are they wrong - they're bad.
SS: So another scary thing that I gathered from the documentary is when I heard you say that, despite the fact that you helped architect all those manipulative tools, you fell prey to them yourself. What does it mean for the rest of us? I mean, does that mean that being aware of all those things social media do to you is just not enough?
TK: I think, as with a lot of things that are addictive, it's not very helpful to know how bad they are. I mean, I think 15 or 20% of people still smoke, and I think the biggest reason people don't smoke now has less to do with the fact that it kills them and more to do with the fact that it's socially just not acceptable. That's at least my experience in the United States. And so I don't know that it's so helpful necessarily to know that it's bad for you in the medium and long term. What I think can be really helpful for everybody - and this is how, you know, we build the tools and focus the app at my new company Moment - is awareness... There are sort of three things that we say at Moment, and the whole idea of Moment is that we're trying to help you get back control of this thing and how you use it. The first is basically: let's help you develop awareness of how much time you spend on your phone. Most people spend - this is an aggregate, so this is social media along with everything else - most people spend 4-4.5 hours on their phone a day. But if you ask all those people, most of them say “two hours”. So there's just a disconnect between reality and perception. And pointing that out can be really useful, because it shows the loss of control. There is sort of this moment that happens, like, “Oh, I really don't have control, because I'm not even correctly perceiving reality”. And then the second thing that we suggest - and our app Moment does this - is just a bunch of ways of tweaking habits. The most useful and basic one, and it actually moves the needle the most in terms of social media usage, is “don't bring your phone into the room where you sleep”. Just set a rule.
SS: That's tough.
TK: And you know, when I talk to journalists, [for them] it's really hard.
SS: It is, because most of our work is done on the phone. Yeah, all our work is done on the phone, especially now that we're all working online, because we don't record in-person interviews, and we don't travel that much to do reports. Everything's happening on our phones.
TK: I get it. By the way, I'm working on this all day long. You should see my phone usage, especially since this film has come out.
SS: Right. So you're the CEO of Moment now, an app which, according to the description on the website, ‘helps people build healthier relationships with their phones’. So if I delete all social networks from my phone, how will my relationship with it become healthier exactly? Because, you know, I can really just check Twitter on the desktop...
TK: Yeah, well, first of all, I wouldn't recommend you delete them. We don't actually think deleting social media is realistic, or necessarily the best thing. We just want to help people be more deliberate about how often they use it. We've all had the experience of going to our phones to send an email, or to look at the weather for a few minutes, and then coming to about 45 minutes later, having been scrolling Instagram - which we didn't pick up our phones to do. So 45 minutes passes, I'm on Instagram, and I don't feel any better for it. I actually feel worse. I feel a little bit guilty, I don't feel like I've accomplished a whole lot, and I've looked at pictures of my friends presumably having a more interesting life than me. That's just a bad experience, but I can't really help myself. And so what we suggest are ways of setting limits for experiences like that. Hey, it's okay if you spend a few minutes on Instagram, but we let people go through a process of trial and error where they realise, “yeah, if I just limit myself to 10 or 15 minutes on Instagram, I feel better”.
SS: How exactly do you do that? It's a great idea. But how does that work when applied?
TK: Yeah, yeah. Well, there are two things. One is that, in general, I think reducing your phone usage overall is helpful. And so that's what we really help people do, because this gets into -
SS: How would you help a journalist like myself reduce phone usage?
TK: We give you the tips, you set some limits. And then here's the most important part, and this is the newest part of the experience: we'd have you create a group on Moment with several friends - who all have Moment - and then you go through a multi-day guide that helps you develop awareness and lets you play with certain things as a group to try to reduce your usage. And that whole time we show you and your friends how much everyone is using their phones. The idea - and this is true of a lot of behavioural addiction, in terms of how people sustainably change behaviour - is that they commit with a group of people. And that's how we came up with and developed the feature: “Look, you've probably talked about this with colleagues of yours, you've probably talked about this issue with friends of yours - agree that, at the very least, you'll just show each other how much you use your phones”. Even that in itself, we see, can help reduce usage.
SS: Definitely downloading it after our talk today.
TK: Good. Good.
SS: Thanks a lot, Tim, and good luck with everything.
TK: Yeah. Thank you. Nice talking to you.