NYT's Kevin Roose on Humans in the Automation Age, NFTs, & Substack

On this episode, we speak with Kevin Roose, a tech columnist at The New York Times, and author of “Futureproof: 9 Rules for Humans in the Age of Automation”.


Google Play / Apple Podcasts / Spotify


Kevin Roose, Mark Percival, Brent Sanders


Brent Sanders  00:06

So Kevin, thank you so much for joining the Formulated Automation podcast. Kevin's new book, "Futureproof: 9 Rules for Humans in the Age of Automation," is a new read that we here at Formulated took on over the last couple of weeks after discovering it. I think we actually heard you on a different podcast, which is how we discovered the book. So I wanted to give you an opportunity to introduce yourself. What got you writing this book?


Kevin Roose  00:32

Sure. So to introduce myself, I'm a tech columnist at The New York Times, where I've been for four or five years now, covering social media, the internet, web culture, crypto, AI, all of the things that we weigh in on now. And I got really interested a few years ago in AI and automation. I kept hearing about it from sources; all the companies in Silicon Valley were obsessed with using machine learning and neural networks to develop new products. And I was going to all these conferences where people would stand up on stage and talk about, you know, all the effects that AI and automation were going to have on jobs and society. And I was sort of annoyed at the whole conversation, to be totally honest, because it was very binary, divided into camps. Either people thought that AI and automation were going to be amazing for society, that they were going to cure all our ills, fix the climate, and allow us to spend all our time playing video games while the robots did our work for us, ushering in this sort of utopian futuristic society. Or AI and automation were going to be horrible: they were going to destroy all the jobs and everything we hold dear, turn us all into screen-addicted zombies, and probably we'd end up, you know, working as slaves on Elon Musk's Mars colony. And so I was really struck by the lack of nuance and the lack of practicality in the conversation. It seemed to me that there were so many people doing diagnosis, asking how is AI and automation changing society, and very few people saying: well, what can I do about this? If my job is slated to be automated, what can I do to make that less likely to happen? If I'm finding myself being bossed around by my algorithmic feeds and recommendations and feel like I've lost control of my own choices, what can I actually do to solve that?
And so I wanted to write a book that was both very practical and very realistic, about the fact that some of this stuff is really good, and we shouldn't be overly pessimistic about it, and some of it is pretty bad, and we shouldn't sugarcoat it either. So that's what I was trying to do with this book.


Brent Sanders  02:53

Yeah, it's a very careful line, right? We've been in this space for a while now, and Mark and I both keep trying to ask these questions of ourselves, as any technologist should: what's the impact of the code we're writing? What's the impact of the automations we're creating? And it's sometimes unclear, right? You bring up some really good points, and I'll do my best not to spoil anything; I mean, everybody should go out and immediately grab this book, so I'm not going to spoil anything. But it's similar to the industrial revolutions that have happened, and you bring up some of the other past changes in the way people have worked and how that affects people. The way I always feel about it is similar to economic policy: there are going to be winners and there are going to be losers, and we're not really sure who exactly those people are going to be, but there are going to be more winners than losers, or at least there should be. It's really hard, as we're working on projects, to know: are we automating people out of a job? I think you quote some people in the automation space who sound really similar, where they say, oh, we're going to reallocate these people, we're going to retrain them, we're going to put them on higher-value work. And we have seen that, but at the end of the day, you're still trying to save money, right, especially in these more RPA-focused engagements. So to your point, not sugarcoating it is a tricky business. And it's hard to know where the lines are going to be drawn in the future.


Kevin Roose  04:30

Totally. And I think part of the issue with this conversation, specifically this type of conversation, is that it's often different depending on who's listening. I noticed as I was reporting this book that I would go to, you know, some big AI conference, some RPA conference, or some big investor presentation, and you'd hear the CEOs of these companies get up on stage and say, oh yeah, we're releasing workers from repetitive and mundane work, we're reallocating them to, you know, fulfill their human potential. It was all very positive. And then you would talk to people off stage, or off the record in my case, and they would say, yeah, we're trying to get rid of 30% of our costs in the finance department. Like, it's not an idealistic argument at all. And so that kind of thing drove me crazy, because I felt like there was this public conversation about AI and automation, and then there was this very different private conversation that I was getting glimpses of, and I wanted to try to figure out which is the honest one, which is the one that we should actually plan around.


Brent Sanders  05:35

Yeah, yeah, I love that portion of the book. I can just picture a tech executive with a drink in his hand, or just an executive in general, giving you the actual line: these people are a pain in my butt, and I won't have to deal with them, or I won't have to deal with as many of them. Because I would say management is hard; the complicated part of business tends to be the people, right? It's a lot harder to deal with a 100-person company than a 10-person one. Well, any number of people becomes difficult, I should say. So, yeah, that doesn't mean this gets easier.


Kevin Roose  06:12

Yeah, people are a huge pain in the ass. And unfortunately, they're also brilliant and creative, and we need them in businesses. And, you know, I think for a lot of executives, the sort of the promise of automation is being able to get rid of, you know, some low performing people and keep all the high performing people. But that's not really how it shakes out in practice all the time.


Mark Percival  06:34

I think there's also this issue of what you don't know. I mean, maybe they're going to take a team and reallocate some people somewhere else. But if that automation didn't exist, that team would have grown by, let's say, 30%, and now that growth goes away. And I think you don't actually see that, because it's hard to analyze: what would the future jobs have been? What is this costing in terms of future jobs that are being eliminated?


Kevin Roose  06:55

Exactly. And that's one of the biggest mistakes I think people make when thinking about automation: they picture it happening like it used to happen in the '50s and '60s, when factories would bring in robots and lay off employees at the same time. That's not really how it works anymore. Now we see more things like, in the book, I talk about this company MYbank in China, which is one of the biggest lenders in China; they give out home loans and car loans and such. And their signature product is called 3-1-0, because it takes three minutes to apply for a loan on the internet, it takes one second for an algorithm to decide if you are qualified or not qualified, and zero humans are involved anywhere in the process. They've grown enormously big with this model, and they have many, many fewer employees than their competitors, which rely more on human evaluation and employ, you know, thousands of loan officers. So it's not that MYbank brought in robots to replace the people they employed; it's that they never employed those people in the first place. And as they displace competitors who are more human-oriented, those jobs are lost.
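The 3-1-0 flow Kevin describes (a three-minute application, a one-second algorithmic decision, zero humans) can be sketched as a rules-plus-scoring pipeline. To be clear, the field names, scoring formula, and threshold below are purely illustrative inventions, not MYbank's actual model:

```python
from dataclasses import dataclass

@dataclass
class LoanApplication:
    # Hypothetical fields a borrower might fill in during the "three minutes"
    monthly_income: float
    monthly_debt: float
    years_of_history: int
    defaults_on_record: int

def decide(app: LoanApplication, threshold: float = 0.5) -> bool:
    """The 'one second' step: a fully automated approve/deny decision.

    This toy score stands in for whatever model a real lender uses;
    no human reviews the outcome ('zero humans involved').
    """
    # Hard rules short-circuit before any scoring
    if app.monthly_income <= 0 or app.defaults_on_record > 0:
        return False
    debt_ratio = app.monthly_debt / app.monthly_income
    score = (1 - debt_ratio) * 0.7 + min(app.years_of_history, 10) / 10 * 0.3
    return score >= threshold

print(decide(LoanApplication(5000, 1000, 6, 0)))   # low debt ratio: approved
print(decide(LoanApplication(3000, 2500, 1, 0)))   # high debt ratio: denied
```

The point of the sketch is structural: once the decision is a pure function of the application, approval scales with servers rather than loan officers, which is exactly why the headcount comparison Kevin draws is so lopsided.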


Brent Sanders  08:11

That's a really interesting way of thinking about it. And I would say our focus really has been creating not these sort of greenfield automations, but more the things you couldn't do at whatever scale you're looking for with people, right? I go back to one of the first automations I did, maybe five or six years ago: just processing 10,000 or 15,000 emails, you know, every three days or so. If that were someone's job, it would be misery. It's like, we need software. And to that point, I almost don't even think of it as automation; it's just software. You draw an interesting line in the book: this stuff has been going on for a long time. My takeaway was that this is a natural extension of software, a natural extension of, what does Naval say? He's always talking about building an internet business that prints money for you while you're sleeping. It's this ideal of what a software business should be. And that, in a sense, is automation. But that's also just the nature of what started back in the '80s or even late '70s.


Kevin Roose  09:20

Sure. I mean, we don't think of TurboTax as being advanced AI automation software, but it has put countless human accountants out of business and made it much harder for them to pay their bills. You know, Excel is probably the most labor-displacing technology of the past half century. There used to be entire teams of people, data processors, who did essentially what Excel automated. So we don't think of those as job-killing robots, but they really are, and the more we can make our picture of how automation enters workplaces more realistic, I think that's a good thing.


Brent Sanders  10:05

You know, there are so many different topics that I want to talk about. You've broken these down into nine rules; that's one of the things I wanted to ask about, more on the crafting of the book. How did you land on just nine? Because, as you brought up, there are so many nuances to this topic, and it's broad, but it's also at times really narrow. How did you come up with it? How did you distill it down to just nine?


Kevin Roose  10:29

Well, I thought ten sounded like I was doing the Ten Commandments; it sounded a little grandiose or something, so I decided I would be a little more humble than that. But the way the book is roughly structured is around how automation and AI affect our lives in three areas: our home lives, our work lives, and then our society. So the rules are roughly three in each of those buckets. Three are mostly about home life and what AI and automation are doing to us as people, three about what they're doing to us as workers and how we can respond to that, and then three about how we should respond at a societal level to the threat of job loss and all the changes that are coming to the economy as a result of AI and automation.


Brent Sanders  11:22

Yeah, yeah. So, you know, you're really careful with this nuance, because some chapters, or I should say some rules, do scare you, some of them inspire us, and some of them really make you feel like the human is the key ingredient to certain tasks, which is true. But the one I have to say I walked away from with the biggest impact, that I still think about today, was around demoting your devices. And I haven't done the rubber band on my phone yet, and again, I won't throw any other spoilers in there. But this idea of your device driving you: I never really made that connection with writing automation software, but it exactly applies. You hear about the way Amazon has gamified its warehouse workers, where there's a timer and you're running from thing to thing, and you're basically, as you point out, an endpoint, right? It's what you're not trying to be. So I'm curious about the personal aspect of this book; you mentioned your own experience sort of demoting your device. Was that happening as you were writing the book? Was it before? How did that come into the book?


Kevin Roose  12:38

Yeah, well, I had a conversation several years ago that has stuck with me, with a friend of mine who works in tech. We were talking about, you know, the threat of advanced AI, its potential to be very disruptive when it gets to superintelligent algorithms. And he sort of stopped me and said: we already have superintelligent AI. It's in our pockets, and it's called YouTube and Facebook. We don't need to hypothesize about what it would be like to have artificial intelligence that was in some ways more sophisticated than we are, or at least capable of steering us in directions we didn't want to go, because we all experience this every day. I mean, we go on YouTube to watch one video, and we catch ourselves 40 videos later saying, where did the last four hours go? Or we're scrolling through TikTok, and we just lose track of time. And one thing I've realized from looking into the research on this is just how powerful this technology is. Even something that seems pretty benign, like Netflix recommendations, has a real impact on not just what we watch, but on our preferences. It actually alters us at the level of what we value. And that sort of power is something that I didn't want to let escape the scope of the book, because I think these things are really connected.
I mean, one thing that is flying under the radar a little bit is this story of internalized automation. We think of automation as this external thing that happens to us at work, or that comes into our lives in ways that are obvious. But I found myself automating myself, going on autopilot when it came to what I thought, what I believed, what I watched, what I listened to. I even subscribed to one of those wardrobe-in-a-box services where they send you clothes based on an algorithm's assessment of your style. And one day, I remember looking in the mirror wearing my algorithmically selected clothes and just thinking: I don't like this. Why am I wearing it? I hate this. And it was just because I had been told by an algorithm that I would like it. As it turns out, the research on that is fairly persuasive: algorithms don't just reveal and predict what we believe, they actually change what we believe and what we prefer. So one of the pieces of the book that I felt most strongly about was that any book about AI and automation and their effect on our lives has to deal with the stuff we carry around in our pockets all day.


Mark Percival  15:26

I mean, a lot of this comes down to your satisfaction with life. With automation, one of the sales pitches, at least on the RPA side, is: we're going to take this mundane task away from you, and you're going to be more satisfied with the work you're doing. And one of the things the book pointed out that I thought was interesting, that really made me think, because I've started to notice this in other aspects too, is that when we automate, a lot of times people are actually less happy. We were sold this line that we're taking away something mundane that you don't like, and then we take it away, and for some reason we find out that the workers on the assembly line are actually less happy now that they don't have to do this mundane task, which is counterintuitive.


Kevin Roose  16:06

Totally. There are two great pieces around this. One is from a book by the historian David Nye about the electrification of America, where he writes about what happened in the 1930s, when all the factories were being electrified. Managers thought workers would be super happy about it, that it would make them more productive, able to work longer hours, more efficient, not having to lug around heavy machinery. And instead, they were much less happy, which is, I think, what you're referring to. There's another piece that I didn't put in the book, which is maybe an even better illustration. There's a great book by Shoshana Zuboff that came out at least 20 years ago called In the Age of the Smart Machine. It's all about the PC revolution and what happened when office workers started using computers for the first time. There's a part where she asks a bunch of secretarial workers, typists, and people like that, right before they get computers, to sketch what they think their life will be like after the computer: just draw a picture of yourself after this computer arrives. And in their drawings, the sun is shining, they're kicking back in their chairs, life is good, the computer is just taking care of everything; they are living their best lives. Then the computers come in, and she goes back a while later and asks them to draw what their life is actually like with this computer in the office. And they draw, literally, jail bars: I am in prison, my life sucks, this is horrible. And it's because their bosses now have all these new ways to keep track of them, and new expectations of productivity, that are making their jobs much less relaxed and much more demanding.
And the night-and-day difference between what they thought technology was going to do to their jobs and what it actually did is something I think we should all learn from.


Mark Percival  18:08

It's interesting, because recently, I don't know if you've heard about this, there was a worker in New Jersey near the airport, did you read about this, who interfered with the GPS system? No? So basically, on AliExpress or similar sites, you can buy a plug-in cigarette-lighter GPS jammer. And at first it doesn't make any sense: why would you want to jam GPS from your car's cigarette lighter? The answer is that you work in a work truck, and your boss tracks everywhere you go, and you just want to go to lunch, or drive a little farther and do something else. So there's this huge market for these GPS jammers. And this guy, unfortunately, had a job near Newark Airport, drove around the airport with one, and subsequently got into a lot of trouble for it. But it kind of goes to show: that minute-by-minute aspect of tracking is part of that happiness question, I think.


Kevin Roose  19:02

Totally. And we're starting to see that sort of resistance with Amazon drivers and Uber drivers. There are these coordinated campaigns by Uber and Lyft drivers to collaborate to drive up the surge pricing in a given area. And I think that is a form of labor resistance of the kind we saw in earlier waves of automation, going all the way back to the Luddites. But I also think it's not just something we can brush off as, oh, these people just hate technology, they're anti-technology. We have to look at the ways that technology is changing people's jobs. Even if it's not getting rid of their jobs, it might be making them worse: more demanding, less forgiving. It might be pushing up the pace of the work to the point where they, you know, have to pee in bottles in their delivery truck. So I think that has to be part of the conversation. The conversation about automation and AI and jobs stops and starts at the level of, will jobs disappear? And the more interesting question to me is: how are jobs going to change, and what are workers going to do in response?


Mark Percival  20:15

And there's one other piece that I thought was interesting. Right after I read your book, I watched this documentary that got recommended to me on Netflix about Cadbury, and they interviewed workers who had previously worked at the Cadbury factory. One of the guys said something really interesting: I don't really know how they do it today; it seems really miserable when you see another worker 100 meters away and you can't have a conversation with them. And it made me think that a lot of this also goes back to the social aspect: sometimes when you automate work, it actually separates you from your colleagues. That feels especially relevant today, with Slack and everything going on in the pandemic, when everybody feels sort of separated. A lot of automation projects are about improving the process, but a lot of that improvement is actually taking humans, or that human-to-human communication layer, out of the equation.


Kevin Roose  21:10

Totally, that's a huge piece of it. And I think this is not just an accident. I was just reading Brad Stone, my friend who works at Bloomberg, who has a great new book out about Amazon. He writes about how the company's executives deliberately try to atomize the workers at the fulfillment centers: they try to keep them from socializing too much, they try to keep turnover high, because that means fewer people getting ideas about organizing or unionizing, since they're going to be out of there in six months anyway. I don't want to attribute this all to some grand conspiracy against workers, but I do think that a lot of the ways technology is being used to make work less social, less personal, and to remove those water-cooler effects, that is sometimes deliberate.


Mark Percival  22:05

On the other side of this, are you seeing anybody who's trying to address this from the automation standpoint of saying, Let's evaluate automation and look at how it's gonna affect people from a social standpoint? And, you know, from a happiness standpoint?


Kevin Roose  22:17

Yeah, I think there are some companies that are pushing in the other direction. One category of software that I've been keeping tabs on recently is what I call bossware, which is basically employee surveillance software that companies are now requiring remote workers to install so their bosses can keep track of: are they at their keyboard eight hours a day? Are they paying attention to this video call? Really dystopian stuff. And actually, I've talked to some companies that have experimented with that stuff and then backed off of it, because it was just too invasive, workers were pushing back against it, and it wasn't actually helping them all that much with productivity. So I think there's going to be a little pendulum swing here, and I hope that some of the more invasive uses of AI and automation for worker tracking are going to swing back toward a more sane place.


Brent Sanders  23:11

We hear about that in the form of process mining. Process mining software is a really hot area that we have not touched, but it's been recommended to us as a first step when coming into a company: just install process mining, do your normal engagement, then come back to the results in a month and see what you have. And on one side, it's tracking; it's like a click tracker, right? It's every keypress, every mouse movement, and it's meant to find repetitive patterns. But I think this is one of those things where, if you're not coming at it from a business perspective, asking what is the process I want to automate, versus here are the granular keystrokes that happen all the time, sure, you can identify whether people are just slacking off, maybe. When I heard about it, I was surprised how popular it is in automation, at least in the RPA space, and it seems a little shameless, right? It's not surveillance, I guess, because it's just movements of things, but it is very invasive. It's very much: here's everything this person is doing on the computer.
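At its core, the repetitive-pattern finding Brent describes is frequency counting over an event log. A minimal sketch of the idea follows; the event names, window size, and thresholds are made up for illustration, and real process-mining tools are far more sophisticated than this:

```python
from collections import Counter

def frequent_sequences(events, length=3, min_count=2):
    """Slide a fixed-size window over a stream of UI events and count
    repeated action sequences -- the kind of signal a process-mining
    tool would flag as a candidate for automation."""
    windows = [tuple(events[i:i + length]) for i in range(len(events) - length + 1)]
    counts = Counter(windows)
    return {seq: n for seq, n in counts.items() if n >= min_count}

# A hypothetical click/keystroke log from one worker's session:
log = ["open_email", "copy_invoice_no", "paste_into_erp",
       "open_email", "copy_invoice_no", "paste_into_erp",
       "check_slack",
       "open_email", "copy_invoice_no", "paste_into_erp"]

repeats = frequent_sequences(log, length=3, min_count=3)
print(repeats)  # the three-step invoice routine shows up three times
```

The same mechanism that surfaces the invoice routine as an automation candidate also records every one-off action in the log, which is precisely the invasiveness concern raised above.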


Kevin Roose  24:18

Totally. And, you know, there are forms of this that I think might actually be helpful to people and might not seem super invasive. But let's apply it evenly: the CTO should be using the process mining software too. Let's figure out how much of his job is repetitive and automatable. Because I suspect that it's not just the entry-level data processors who are doing mundane and repetitive work that can be automated; a lot of the executives might be just as automatable.


Brent Sanders  24:49

Yeah, we actually try to go after that, because it tends to have a bigger impact, right? If you have a CTO generating a report for the board once a month and it literally takes them weeks, or an extensive amount of time, that's a great place to dive in first, versus, you know, Sheila in accounting spending 15 extra minutes filling out a TPS report.


Kevin Roose  25:10

Right. Well, and just on that equity level: why are we surveilling? Why do we care if Sheila in accounting is watching YouTube videos during the workday when the CTO is, you know, checking sports scores? It can so easily become a tool for power to entrench itself. And my hope is that if companies are using this stuff, they're also looking at not just their low-level, low-paid workers, but also people who might have more power, who might also be wasting time during the workday, who might also not be as productive as they could be.


Mark Percival  25:51

And I think we also have to appreciate the fact that humans need to waste time. It's part of the creative process; it's part of general happiness and satisfaction with life. I mean, it is huge. Unfortunately, companies look at wasting time as something they can clamp down on, but I think the companies you see doing the most restrictive measures wind up losing people, or burning them out sooner, which is just as miserable.


Kevin Roose  26:17

Absolutely. I mean, there's this phrase I hate so much that's sort of taken off, especially in retail jobs: time theft. If you're dawdling, if you go for a walk during the day, if you take a longer snack break or something, you're committing time theft. And it's such an unproductive way of looking at things, because so much of the economy right now is moving away from jobs where the important metric is just raw output, and toward jobs where things like creativity, new ideas, inspiration, empathy, and collaboration are what we want to measure. So if you're measuring people in this very old, Frederick Winslow Taylor, industrial time-management way, you're actually incentivizing the wrong kind of value-add. You're incentivizing people who are just hustling and grinding, who are probably not coming up with new ideas, or developing new ways to work together, or doing any of the other softer forms of productivity that actually end up mattering more.


Brent Sanders  27:29

Yeah, that's one part of your book that also connected with me entirely, because I have spent parts of my career, especially early on, just grinding: trying to get as much out of it, bill as many hours, balance as many projects at once. And it only ends in one place: you're always going to burn yourself out. For me, that introduced depression, anxiety, all these things that you address in your book around this hustle culture. And it was funny, I didn't really realize until now, when I have a family and a child and family time is really important, that you point out this is almost a specific demographic of people: it's largely single males who can put all their time into work. It really is an emphasis on the wrong things. I'm a lot older now, and I value time with people a lot more, especially family. I went into this book expecting an extension of automation in business and RPA, and I really came away more with the personal pieces. It was a little bit of a self-help book. I hate to say it, and it probably wasn't your intention, but that was the effect.


Kevin Roose  28:47

I don't think of that as a pejorative. In fact, I embrace it, because it is literally a self-help book. One of the reasons I wrote this is that I was trying to figure out how I, Kevin Roose, your tech columnist for The New York Times, can deal with automation and AI in my industry and in my life. So I was literally trying to help myself. I don't think that's a bad thing, and in fact, I hope people take useful, actionable help and advice away from this.


Brent Sanders  29:16

Yeah. Can we talk about something aside from automation? I know we're going to go a little off topic here, but we had to do a little bit of research in our preparation for this, and the first thing that came up around your name when I was googling was this article you did, where you turned one of your articles into an NFT. Could you give our listeners a little bit of background about that?


Kevin Roose  29:41

Sure. So I cover tech, and I got really interested earlier this year in NFTs, which, for people who haven't dug in fully, are basically a way of claiming ownership of a specific digital good on the blockchain, so that ownership can be tracked and can serve as a sort of certificate of authenticity for the person who owns that specific digital good. You're seeing them in basketball cards and artwork and all those sorts of things, and some of them are selling for millions of dollars. So I thought, why should artists and basketball players have all the fun? Maybe I should explore this too. And I decided to write an article about NFTs, actually turn the article into an NFT, and sell it at auction with the proceeds going to charity. So we did that, and I thought, you know, maybe some New York Times reader who's really into crypto is going to pay a couple hundred bucks for this. It sold for 350 ETH, which at the time was $560,000 and is now over a million dollars. So I ended up raising a ton of money for charity and having this unexpected blockbuster NFT sale of one of my articles. Definitely the best rate I've ever gotten for an article.
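The ownership-claim mechanism Kevin sketches can be illustrated with a toy registry. Real NFTs live in on-chain smart contracts (commonly ERC-721 on Ethereum), so this in-memory class is only a conceptual stand-in, with invented names throughout:

```python
class ToyNFTRegistry:
    """Conceptual stand-in for an on-chain NFT contract: each token ID
    maps to exactly one owner, and every transfer is recorded, which is
    what lets the token act as a certificate of authenticity."""

    def __init__(self):
        self.owners = {}    # token_id -> current owner
        self.history = []   # append-only transfer log (the 'tracking')

    def mint(self, token_id, creator):
        if token_id in self.owners:
            raise ValueError("token already exists")
        self.owners[token_id] = creator
        self.history.append((token_id, None, creator))

    def transfer(self, token_id, seller, buyer):
        if self.owners.get(token_id) != seller:
            raise ValueError("only the current owner can sell")
        self.owners[token_id] = buyer
        self.history.append((token_id, seller, buyer))

registry = ToyNFTRegistry()
registry.mint("nyt-article-2021", "kevin")                 # the auction listing
registry.transfer("nyt-article-2021", "kevin", "collector")  # the winning bid
print(registry.owners["nyt-article-2021"])  # -> collector
```

The single-owner mapping plus the append-only history is the whole trick: anyone can verify who holds the token and how it got there, even though the article itself remains freely copyable.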


Brent Sanders  31:07

No, that is amazing. I found that before we reached out, I think, and I thought, oh, we have to talk about that. So that is a wonderful story. And I wasn't aware that it went to charity. What charity did it end up going to?


Kevin Roose  31:23

It went to the Neediest Cases Fund, which is The New York Times's in-house charity. So yeah, it's a great cause. And I hope that sometime soon I can actually get them the money, because it's been taking a while just due to some logistics. But I hope they get it before the price of ether crashes and it's back to only $500,000 again.


Brent Sanders  31:48

That's fantastic. Another thing: I signed up for your newsletter, and I noticed that The New York Times, it sounds like, has come out with some restrictions around employees' outside newsletters. That was about a month ago or so? I was curious if you have any new thoughts on how you feel about it or how it's affecting you. It sounds like they're putting the kibosh even on free newsletters?


Kevin Roose  32:18

Yeah, this is a live discussion at The Times that is above my pay grade. But there is some talk of restricting newsletters, and there's now this committee that has to evaluate all these newsletters that people want to write. And, you know, I get it. It's scary to have a new medium that writers are writing in, and maybe they're taking a little more liberty than they would in the newspaper, or maybe they're doing that instead of their work or something. So I get the anxiety. But I think it's ultimately a question of: there's a new medium, people are using it to do journalism. Do you want your reporters and columnists and writers experimenting there or not? I would hope that my bosses and the people who make these decisions would see it as a potential positive for writers to be communicating directly with a subset of readers, and to be going into more depth on some of the issues they cover than they could in a Times story. I think it's just gonna take a little bit of adjustment to get there.


Mark Percival  33:23

And, I mean, it sounds like time theft to me.


Kevin Roose  33:27

I didn't think about it that way. But it truly is.


Brent Sanders  33:32

So, with that, I'm wondering: this is a new medium, so how do they look at, say, you doing a book? This is my naivety around the industry. I understand the broader trends that have been happening in journalism, but I'm curious, how do they look at a book versus something like a Substack?


Kevin Roose  33:55

Well, books have just been around for a lot longer. New York Times journalists, and journalists at other big publications, have been writing books for decades, probably longer than that. So I think they've had some time to wrap their heads around the fact that people are going to do a book and maybe take some time off from their newspaper duties. And, you know, I think there's a way in which Substack is not that dissimilar: you're just writing it in chapters and sending it over email instead of writing it all at once, printing it on paper, and selling it in a bookstore. The delivery device is different, but ultimately it's just words and pixels, the same as a book. So I think they're not super different; one has just been around for a lot longer.


Brent Sanders  34:40

Yeah. Very cool. Back to automation. You talked about something that I think really resonated with us professionally, this idea of so-so automation. We see it all the time in terms of how we might differentiate what we're going to do. When I say we, I mean our company, Formulated, and the podcast: when we take on something, we're trying to be beyond something that's just so-so. I'm curious, how did that term pop up, and how did you differentiate or define so-so automation?


Kevin Roose  35:17

Well, the term so-so automation, I did not come up with that. It came from two economists, Daron Acemoglu and Pascual Restrepo, who write about automation and the economy. They were really trying to solve this puzzle, which is: we're getting all this new technology, all this new automation, and machine learning and robotics are taking huge steps forward, and yet our economy is not getting substantially more productive. Companies are not making a ton more money with a ton fewer people. You don't see the kind of productivity spike you would expect if our economy were being rapidly automated. So what explains that? Their theory, which I find pretty persuasive, is that it's because we're not getting the right kind of automation. We're not getting the kind of automation that makes firms substantially more productive, that allows them to displace tons of workers at a time and makes everything work faster and better and cheaper. We're getting this so-so automation, automation that is just barely good enough to replace a human, but not so good that it transforms the entire business. Their example of so-so automation is something like a call center. I don't know about you guys, but when I get an automated call center response, I'm pressing zero; I want to talk to a human, because the human is going to be a way better problem solver than whatever automated phone tree I'm on. That kind of thing takes costs out of the business, and it allows companies to do slightly more work with slightly fewer people, but it's not really revolutionary. It's purely substitutive. And that's the kind of automation that, on a macroeconomic level, they believe is responsible for this productivity paradox. The answer is not less automation, it's actually just better automation.


Brent Sanders  37:15

Hmm. With regard to RPA specifically, we're talking about a very specific part of automation, more of what I would consider the "beep boop" robot software. It's very much on rails; there is some decision-making, but it's very limited. I'm curious, on just RPA, would you consider yourself an optimist, a pessimist, or a sub-optimist, as you described in the book?


Kevin Roose  37:41

Well, RPA, to me, is basically pure so-so automation. I mean, it is literally designed to replace humans. There are people who would dispute that; RPA companies certainly would. But the genesis of RPA was basically that companies were looking to automate certain processes and take people out of those chairs without upgrading their entire tech stack. Instead of saying, we're gonna do away with our old Oracle database or whatever and move to something much newer that is more automated, they were saying, it's cheaper to automate the work of the people who are interacting with the old systems than it is to replace the old systems. And that's what these bots are going to be doing. So I'm not super optimistic about that industry. I do think it's probably a stopgap, probably a bridge, until these companies actually decide that they're going to upgrade their whole systems, or until those companies become obsolete and are replaced by newer companies that use more automated systems. But it seems to me like kind of a mediocre form of automation. I would be curious, though: you guys are a lot closer to that world than I am, so what am I missing there?
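A minimal sketch of what this kind of "so-so" RPA bot often amounts to in practice: a script that does exactly what the displaced clerk did, shuttling records from a legacy system's export into a newer system, without touching the underlying stack. Everything here is hypothetical and simplified; the export layout, field names, and the in-memory "new system" are stand-ins for whatever real applications a bot would bridge.

```python
import csv
import io

# Hypothetical legacy export: the bot reads the same report a clerk would.
legacy_export = """invoice_id,vendor,amount
1001,Acme,250.00
1002,Globex,1200.50
"""

def run_bot(export_text):
    """Mimic the clerk: read each row and re-key it into the 'new' system."""
    new_system = []  # stand-in for the target application's API or UI
    for row in csv.DictReader(io.StringIO(export_text)):
        new_system.append({
            "id": int(row["invoice_id"]),
            "vendor": row["vendor"],
            "amount_cents": round(float(row["amount"]) * 100),
        })
    return new_system

records = run_bot(legacy_export)
print(records[0])  # {'id': 1001, 'vendor': 'Acme', 'amount_cents': 25000}
```

Note what the sketch does not do: it doesn't retire the legacy export or change either system. It just substitutes for the human in between, which is why this style of automation is substitutive rather than transformative.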


Brent Sanders  39:01

No, I don't think you're wrong. We had a great guest on, Sean Chou, of a Chicago-based company. I'm gonna call it an automation company, but he probably would call it something else; I forget the exact term, something more along the lines of an operating system for the business. But his point was that RPA is just another form of labor arbitrage. You're paying an American X amount of dollars, you're paying somebody in India X minus 80 percent, and the folks in India say, all right, I'm gonna automate this with a computer and offload it to that. It's just moving dumb work down the chain; it's dumber and dumber work. The reason we got into this field, and specifically around automation in general, was the idea that legacy software is sort of going to come home to roost. People need to deal with their legacy software, and we quickly fell into this bucket of, well, here's a good way to start, here's a way to address these incredibly painful technical situations that companies have gotten into. But it's still a stopgap. We don't do any automation with the plan of it being around for ten years.


Mark Percival  40:14

Yeah, I mean, I think the stopgap is exactly it, right? Legacy software continues to kind of eat the world. The idea is that you do want to upgrade your systems, but as anybody who has been involved in that process knows, it's usually like boiling the ocean: to do this, it's going to take three years and we have to spend all this time, so when do you actually make the move over? It's that in-between period, where you don't have the confidence to go into the new system fully, where I think RPA tends to play. And that's why I think there's been so much interest in it, and why you see places like UiPath doing so well financially. It's exactly that: it's a stopgap, and these companies don't have a choice, they have all this legacy software in play.


Kevin Roose  40:55

Totally. And I think it's important to distinguish: I don't think the people working on this software are bad people. I don't think they're deliberately waking up every day and steepling their fingers and going, how can I get rid of workers today?


Kevin Roose  41:14

Like, I don't think there's malicious, Dr. Evil intent behind all this. And I also don't think it's entirely their fault. I mean, one thing I've realized is that automation is not an automatic process, right? People decide. Executives decide not only which jobs to automate, but also what to do with those people. Do you lay them off as soon as it's feasible? Do you retrain them to do something else in your organization? There are more and less humane ways to implement things like RPA. So I'm less interested in haranguing the RPA vendors than I am in holding the companies that are implementing this stuff accountable and asking: did you lay off all the people in the finance department after you automated their jobs, or did you keep paying them while you looked for other work for them to do inside the company? I think that's gonna be a really big differentiator.


Brent Sanders  42:09

You know, that was the big ethical question that I really wanted to get an answer to from our guests and folks in the community over the last year or so. And the best conclusion I found was that good companies do good by their people: they make the most of their human resources, and I hate to use that term, resources. They do the best they can with people, because people are really hard to replace, really hard to train, and all of it is incredibly expensive. At a board level, if you look at it just by the numbers, losing people and then having to hire other people is expensive. And sure, retraining is difficult, but people grow domain knowledge after being somewhere for five years. If the message is, this is how things are done, and then things are going to change and therefore you're going to be let go, that tends to reflect poorly, not just from a PR perspective, like they're just cutting people, but in terms of what you're doing with the resources you've invested in. So I think that was the most distilled version: good companies do good things with automation.


Kevin Roose  43:15

Yeah, I think that's right. And I think one of the mistakes that business leaders make is thinking of labor purely as a cost center, something that needs to be reduced as much as possible. But in a good company, that's where a lot of the ideas are coming from, that's where a lot of the brilliant breakthroughs are coming from, and the transformative product ideas and things like that. So I think companies that don't just see workers as something annoying to be eliminated are probably better off long term than companies that are just trying to reduce headcount as much as possible.


Brent Sanders  43:52

Yeah. Yeah. Well said. Okay, with that being said, Mark, do you want to cover anything else?


Mark Percival  43:59

No, I think this has been great. Obviously there's a lot to think about. The book has made me think differently about when we automate something, not just in terms of the effects on the workers, and less in terms of how much time we're going to save, but also how it will affect the workers in terms of improvement. Is this going to take away from social interaction? Are employees going to be less happy or happier after this? It's a really tough question, but I think it's a conversation we need to have.


Kevin Roose  44:29

And I'm really glad you guys are having that conversation. Seriously, I think it is important that the people who are designing and funding and implementing this technology think about what the downstream effects are going to be. It's not so much fun to think about all the time, but it is important. And I would just say, quickly before we go, that involving workers in this process is historically the way to do it in a way that is least disruptive and ultimately most beneficial for companies. In the 20th century, when a lot of factories, auto factories and things like that, were automating, they actually had worker automation councils: groups of workers that would evaluate new technology. The management of the company would come to them and say, we want to automate this task. And the workers would say, okay, that's a task some people don't enjoy doing, so maybe let's do it this way. What if the technology could be used to automate this part, but leave this part in the hands of humans? What if these people could be retrained to do this other part of the process, or to oversee the robot? That kind of worker involvement, we've seen, is really the difference between automation going well and automation going badly.


Brent Sanders  45:44

Yeah, that's really well said. This is partially why I'm giving out your book, the self-help piece of it. I'm giving it to people we talk to, so if they set up a meeting, it's my follow-up thing: hey, give this a read. Because I always think about Facebook. Everyone had great intent, well, I shouldn't say everyone, but there were probably a lot of really good intentions in building those features out, and we're living the effects right now. Now you have to write books about how to disconnect from devices because of these patterns. Somebody was doing their job really well but not really thinking through what the result was going to be. I think the same critical eye applies to automation. And the book really helps us explain AI: there are things it does well and things it doesn't. It paints a much more balanced and, as you put it, nuanced picture, which is really valuable for us when we come in and explain what we do.


Kevin Roose  46:49

Good, I'm glad it's helpful. And, you know, your check is in the mail for the royalties you deserve.


Brent Sanders  46:58

Awesome. Well, Kevin, thank you so much for coming on and talking to us. And again, for our listeners, check out "Futureproof: 9 Rules for Humans in the Age of Automation". It's available now.


Kevin Roose  47:07

Thanks so much for having me.
