Transcript for:
Impact of Technology on Human Behavior

I think it's a mixture of everything that's on social media that makes it so cool: stories or Bitmojis or GIFs. Teens want to try out the new feature; they want to be on the cutting edge of the technology that's coming out. So on Snapchat there are the Snap shows, there are Snapchat streaks. In the morning I just send a streak picture, and when I get home from school I send a streak picture. Now it's like a habit to do when I get on my phone.

When I post things on social media, there's definitely a period where I'm checking who saw it, how many views you get, what people are liking on my stories, and what I should keep posting. There is a lot of pressure to present a version of yourself that's close to perfect. I almost never post a picture that hasn't been touched up in any way. On Instagram I always check it; if it's not getting as many likes as my other pictures, I might delete it.

I have binge-watched, I think, multiple shows. I've watched about 10 to 12 episodes in a day. The 15 seconds between each episode definitely makes you feel like you have this urgent choice you have to make. Instead of having to wait for an episode to come out every week, Netflix as a whole makes it a lot easier to just consume so much media all at once, in a row. I'm still trying to decide, but it starts playing the next episode and I'm like, oh, it won't hurt. You find yourself two hours later and you're still watching it, and you have homework to get done.

I watch a lot of YouTube, probably more than Netflix, and the suggested videos help me subscribe to new people. I will watch videos for an hour and a half without even having a plan to do that. There's just so much content that's just addicting.

[Applause] [Music]

Thank you all so much for coming. Can you hear me? Yeah. I am so honored that all of you came, so honored that Jack and Trudy led that beautiful opening, and for Aza's introduction about his father and what it means to him personally. This is incredibly personal for me; this is basically how we've decided to spend, at least, my life's work at this point: to try and figure out how we can correct the issues that we've just been exposed to, and many more.

The reason we wanted to bring everyone here is because right now what's going on in the tech industry is a kind of cacophony of grievances and scandals: oh, they screwed this up, and now there's this election thing over here. It's kind of all over the place if you ask people what's wrong, what are we trying to fix here. What we wanted to do was say, okay, how can we get behind a common understanding of what's actually the problem, and how can we fix it? That's my hope for what we can do in this room together today.

And to do that, having seen everything you just saw, what is the problem in the tech industry? We've got tech addiction, polarization, election manipulation, outrage-ification, vanity-ification, teens' mental health, bots. I'm going to claim to you that there's actually one thing that is driving all these problems, and it has to do with something that E.O. Wilson said, which is that the real problem of humanity is that we have paleolithic emotions, medieval institutions, and god-like technology. This is kind of the problem statement of humanity, because while we have these ancient, paleolithic instincts, we've got exponential tech. Where I first got exposed to these paleolithic instincts was actually as a kid, as a magician. Magic is kind of a study of the
limits of these evolutionary features of our minds, right? You're looking at these evolutionary limits of our attention, at distraction, at menus, at the limits of misdirection; these are the kind of features of our minds. And later, as a technologist, learning how you get into people's brains when you're designing software, at the Persuasive Technology Lab, which I've talked about so many times: you can pull on these little strings of how people's brains work, and then you can do that in their relationships, and you can shape the way people are playing out their relationships. And then of course at Google, as a design ethicist, doing this with two billion people's thoughts: how are you affecting two billion people's paleolithic instincts?

Because while we were all looking out for the moment when technology would overwhelm human strengths and intelligence, right, when is it going to cross the singularity, replace our jobs, be smarter than humans, this is the thing everyone's talking about, there's this much earlier moment when technology actually overwhelms human weaknesses. And I want to claim to you today that this point being crossed is at the root of bots, addiction, information overload, polarization, radicalization, outrage-ification, vanity-ification, the entire thing, which is leading to human downgrading: downgrading our attention spans, our relationships, civility, community, habits, downgrading humans. We've been missing a name for this problem, which is kind of the system of results that takes place when you have an extractive attention economy.

So when I say extractive attention economy, I mean that there's this race to the bottom of the brain stem, which everybody knows, which is this race to reverse-engineer human instincts: who's the better magician, the better evolutionary biologist, who can figure out these tricks in the human mind. I'm going to go through the basics that everybody knows, but just bear with me: turning our phones into slot machines, turning Tinder and dating into slot machines, our email into a slot machine (we check our email 74 times a day), playing on our dopaminergic systems, overloading social proof, social availability (we have to be there for each other), social reciprocity with streaks. We're playing with all these different cognitive biases in people, and it's all in the name of getting our attention.

But in the race to the bottom of the brain stem, it's not just about getting your attention; I have to get you addicted to getting attention, because that's how I get more attention. So it works better when you sit there wanting to up your follower count, when you sit there wanting to get more likes, when you sit there wanting to get more attention for what you look like. And it's created a world where you get rewarded more by looking different from how you actually are. We spend now about a fourth of our lives on the screen, in these artificial social systems. And it's not just on the screen, it's also off the screen, because it shapes the way that we now think about our relationships; it's colonized how we see our relationships.

And it's had serious results. Here's a graph of the percentage of women with high depressive symptoms, starting in 1991 at 17 percent. If you go all the way to 2013, around the rise of the modern age of social media, it
hits about 20 or 21 percent, but then look what happens after that. This is Jean Twenge's research, which shows pretty deterministically that for 10-to-14-year-old girls especially, when you hook up social validation from friends, on a variable slot-machine schedule, to your self-image, we know what this is doing: it's downgrading and overwhelming who we are, our identities. So: downgrading our attention, downgrading our need for attention from other people, and downgrading our identity.

But then it starts to downgrade something else, which is our free will. When I say that, you might think I'm going too far, but let me give you an example. Let's say you're about to hit play on a YouTube video. How many people have done this? You're about to hit play, you think, I'm going to watch one video, just one, and then I'm going to do something else. And then of course what happens is you wake up from a trance, it's been two hours, and you're like, what the hell just happened to me? It's because when you hit play on that YouTube video, what actually happens is that it wakes up a little avatar, a voodoo doll version of you, inside a Google server somewhere, based on all your clicks, all your likes, everything you've watched, and how that compares to the other voodoo dolls they're building. Those clicks are like your nail clippings and hair trimmings, right, and they make this voodoo doll look and act more and more like you. And they don't have to manipulate you, because they just have to manipulate the voodoo doll: they test a bunch of videos on it and ask which one is going to keep you here the longest. It's like playing chess against your mind, and I'm going to win, because I can see way more moves ahead on the chessboard.

YouTube's average watch time is now more than 60 minutes a day on mobile, and it's specifically because of what the recommendation engines are putting in front of you, as said by the chief product officer at YouTube. This has actually colonized our choice, because with over a billion hours of YouTube watched daily, 70 percent of those billion hours now come from the recommendation system. So the AIs are actually downgrading and overwhelming our free choice; free will is being colonized.

But then it's deciding what we believe, and downgrading what we believe. Because if I'm YouTube, imagine this is a spectrum: on one side of YouTube you have the calm, Walter Cronkite, rational-science, Carl Sagan side, and on the other side of the spectrum you have crazy town, Bigfoot, UFOs, et cetera. If I'm YouTube and I want you to watch more, I've got the regenerative section and I've got the extractive section; which direction am I going to steer you? If I want you to watch more, I'm always going to tilt it towards crazy town. You could start in the calm section, you could start in the crazy section, but if I want you to watch more, I'm always going to tilt you that way. Three examples of that: about a year ago, if you were a teen girl and you started on a dieting video, it would recommend anorexia videos, because those were better at keeping people's attention. If you were watching a 9/11 news video, it would recommend conspiracy theories about 9/11; it recommended Alex Jones 15 billion times. Just consider for one moment: 15 billion times Alex Jones is recommended to people. And if you were watching a NASA moon landing video, it would recommend flat earth.
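To make that mechanism concrete, here is a minimal, purely illustrative sketch of "test candidate videos against a model of the viewer and pick whichever is predicted to keep them watching longest." The profile format, the toy engagement score, and all names here are hypothetical assumptions for illustration, not YouTube's actual system; it only shows the shape of an extractive ranking objective.

```python
# Illustrative sketch only: rank candidate videos by a toy "predicted watch time"
# built from a profile of past behavior (the "voodoo doll"). All names and the
# scoring model are hypothetical, not any real recommender.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Video:
    title: str
    topic_weights: Dict[str, float]  # how strongly the video hits each topic


def predicted_watch_minutes(profile: Dict[str, float], video: Video) -> float:
    """Toy engagement model: overlap between the viewer's inferred interests
    and the video's topics. Real systems learn this from billions of clicks."""
    return sum(profile.get(topic, 0.0) * w for topic, w in video.topic_weights.items())


def recommend(profile: Dict[str, float], candidates: List[Video], k: int = 2) -> List[Video]:
    # Rank purely by predicted watch time: the extractive objective described above.
    ranked = sorted(candidates, key=lambda v: predicted_watch_minutes(profile, v), reverse=True)
    return ranked[:k]


if __name__ == "__main__":
    viewer = {"diet": 0.8, "extreme_diet": 0.6, "science": 0.2}  # hypothetical profile
    pool = [
        Video("Balanced meal planning", {"diet": 0.7, "science": 0.3}),
        Video("Extreme 500-calorie challenge", {"diet": 0.5, "extreme_diet": 0.9}),
        Video("How digestion actually works", {"science": 0.8}),
    ]
    for v in recommend(viewer, pool):
        print(v.title)
```

With this objective, the more extreme candidate outranks the calmer ones whenever the profile already leans that way, which is the "tilt toward crazy town" dynamic in miniature.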
And it's not just the videos that are recommended. This is just taken from a week ago, from Guillaume Chaslot, the YouTube recommendations whistleblower. It has also polarized and changed the language that we're exposed to on a daily basis. From a week ago, what are the most recommended keywords? Gets schooled, shreds, debunks, dismantles, debates, rips, confronts, destroys, hates, demolishes, obliterates. So even if you solve fake news, this is kind of the background radiation that a population the size of Islam is exposed to on a daily basis, for 60 minutes a day.

Now let's take a look at one of these examples. If you look at flat earth, for example, what percentage of results were positively in favor of flat earth? This is from data about two years ago. If you looked on Google search, 20 percent of the results were in favor of flat earth. If you looked on YouTube search, it was 35 percent. But if you looked at YouTube recommendations, it was 90 percent. It's not like anybody wants this to happen; it's just what the recommendation system is doing. So much so that Kyrie Irving, the famous basketball player, said he believed the earth was flat, and he apologized later, blaming the YouTube rabbit hole, when he came on NPR to say, I'm sorry, I didn't want to mislead people. A bunch of students in a classroom were interviewed saying, oh, the round-earthers got to him, which shows you something really critical and important: once you tilt people in these crazy directions, it's actually really hard to bring them back. So now think about a population the size of Islam, in languages the engineers don't speak, being tilted in this direction, not because anyone wants it to happen, but because that's the extractive attention economy getting better at getting your attention.

This also happened with Facebook. About two years ago, Mark Zuckerberg said our new mission is not to make the world more open and connected, it's to bring the world closer together, and they said one of the ways we're going to do that is to build an artificial intelligence to start suggesting Facebook groups to people. He says it works, because in the first six months we helped 50 percent of people join more meaningful communities. So what does that mean? There you are, and you're about to join a Facebook group, let's say it's a new-moms Facebook group. It wakes up the avatar, the voodoo doll version of you, based on all your clicks and everything you've ever joined, and it asks: which of these groups, if I got you to join them, would keep you there the longest, would be most engaging? And what do you think one of those most engaging groups was? Anti-vaccine conspiracy theories. If you join anti-vax, then you're in what Renée DiResta calls a conspiracy correlation matrix, because it recommends you on from anti-vax to chemtrails to flat earth, et cetera. And now anti-vaccine sentiment is a top-ten global health threat, not entirely because of social media, but it's been exacerbated massively by social media.

So now we can hire 10,000 content moderators in English, and we can start hiring people in Burma to speak the languages we want to start moderating all this content in. But imagine, in that tilt graph, you've hired 10,000 boulder catchers trying to stop the boulders from rolling down the hill, while you've created a system that's impacting two billion people. It doesn't scale. By the
way, YouTube has fixed the flat earth, teen girl anorexia, and anti-vaccine conspiracy issues; those have been cranked down now. But the point is, for each one of those three issues, which are only the ones we reported on in English because we have journalists looking into this stuff, how much is this happening around the world, in languages the engineers don't speak? How many engineers have looked at 9/11 conspiracy theories in Arabic? We know from an MIT Twitter study that fake news spreads six times faster than true news, in languages the engineers don't even speak, where you have, for example, in Burma, a genocide happening, and only a couple of years ago there were four Burmese speakers at Facebook for a population of 7.3 million users in Burma who are not digitally literate, amplifying a genocide. This is the situation that's tilting the world in this crazy-making direction, and we have about four billion new people coming online into this environment in the next six years. So this is just situational awareness of what is happening.

So are humans just bad? Is this who we are? Is this humans choosing all this? No, that's not why this is happening. It's happening because we have these artificial social systems that have hijacked and overpowered human nature, combined with overwhelming AI that has overwhelmed human weaknesses by building these voodoo dolls, combined with an extractive attention economy built on getting attention from people. And this is what's causing the downgrading of humans, because while we've been upgrading the machines, we've been downgrading our humanity. And this is existential, because our problems are going up, climate change, inequality, and our ability and need to agree with each other, see the world the same way, and have critical thought and discourse is only going up, but our capacity to do that is going down because of human downgrading. So we're going the wrong way, and that's why we need to change course, and we need to change course now, because this situation is about to get even more dangerous. Sorry, we're going to get to the second part in a moment, but I just need to show you the situation first.

How many people here have been with a friend and you're convinced that Facebook was listening to your microphone, because an ad for the thing you were just talking about shows up in your Facebook feed? How many people have had that experience? I want you to look around; there's a large number of people who have had this experience. Well, it turns out the data forensics show that they're not listening to the microphone; it's a conspiracy theory. But the reason why you believe it is because they don't have to listen to your microphone: they have a little voodoo doll version of you, and they just listen to what that guy says, and because they know and can predict what that person is saying, they know exactly what you were thinking about anyway. And that's the point: we went from a world of just hijacking people's minds and cognitive biases to now completely out-predicting human nature. I don't have to steal your data, Cambridge Analytica style, anymore; there's a new paper out where, with 80 percent accuracy, I can get your Big Five personality traits just by looking at your mouse movements and click patterns. You can also do it with eye gaze, by the way.
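To make that kind of claim concrete, here is a minimal, purely illustrative sketch of the general pipeline such work relies on: summarize interaction logs into per-user features, then train a standard model to predict a trait. The feature names, the synthetic data, and the model choice are assumptions for illustration, not the actual paper's method.

```python
# Illustrative sketch only: turning behavioral traces (mouse movement summaries,
# click timing) into features and fitting a model to predict a personality trait.
# All feature names, data, and labels are synthetic and hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_users = 200

# Hypothetical per-user features summarizing interaction logs.
features = np.column_stack([
    rng.normal(300, 80, n_users),   # mean cursor speed (px/s)
    rng.normal(1.2, 0.4, n_users),  # mean pause between clicks (s)
    rng.normal(0.3, 0.1, n_users),  # fraction of "hesitation" reversals
])

# Hypothetical binary label: high vs. low on one Big Five trait, synthetically
# correlated with cursor speed just so the toy example is learnable.
labels = (features[:, 0] + rng.normal(0, 40, n_users) > 300).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(model, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

The point of the sketch is only the shape of the pipeline: ordinary behavioral traces become features, and an off-the-shelf model can recover something about the person behind them.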
And now AI can actually completely generate faces. None of the faces that you're seeing are real; this is not a cross-morph between existing faces, these are all generated from scratch, and they can be generated to be ones that you would find completely trustworthy. For example, this person might look eerily familiar and you don't even know why; he might be the combination of people you've seen on this stage. And you could have these faces animate to say the things you want them to say, and in the future, whisper in your ear the words you will find the most trustworthy. With technology, you don't have to overwhelm people's strengths; you just have to overwhelm their weaknesses. This is overpowering human nature, and this is checkmate on humanity, because while we are all watching out for that moment when it crosses our strengths, it's been manipulating our weaknesses, and as a magician, that's the only thing you need to know about someone.

So just take a moment and sit for a second with what you've heard. What should we do about this? I know, let's go grayscale, let's turn off those notifications, let's put those apps into some folders, it'll be great. That's like being in a burning-planet situation with climate change and saying, I've got it, let's ban straws, we got this. This is clearly massively insufficient; it's a systemic problem that needs systemic solutions. So what are we going to do? What if we train the engineers in ethics, have them study Jeremy Bentham and Immanuel Kant and consequentialism? That will clearly solve the problem. Believe me, I'm all for training everyone in ethics, but it is not sufficient to the problem we just laid out of human downgrading. I know, let's put it on the blockchain; let's just put the attention economy on the blockchain, this will be great. That's not going to solve the problem either, though the blockchain community is involved in solutions we can talk about later. What if they pay us for our data? Well, yeah, the companies should pay us for all the data they're using to profit off of us, but that's like this digital Frankenstein wrecking the world and handing us some pennies every now and then while doing it; that's not enough. And if they protect our data? Well, that's nice, but we just showed how I don't need your data, because I can predict everything I need to know about you, so that doesn't solve the problem. What if we wait for more research and just figure this thing out? No, because this problem is urgent. It's urgent.

There's no question that Silicon Valley is sophisticated about technology, but the thing that we have not been very sophisticated about, and need to get sophisticated about, is human nature. That means having a much more full-stack understanding of how we really work: a magician's view of ourselves, an anthropologist's view of ourselves, a meditation teacher's view of ourselves. That's like taking this evolutionary diagram and saying maybe we need to look in the mirror and see a full appraisal of all of our human weaknesses and frailties, and also our brilliance: where are we brilliant? Per what Aza said at the beginning, this is a quote from his father Jef: an interface is humane if it is responsive to human needs and considerate of human frailties. That's why we've been calling this humane technology, because we have to start with a view of human frailties and design ergonomically to wrap around those human frailties and human needs, and figure out how we leverage our brilliance. And that's why we invited all of you here today: to launch a coherent agenda for technology that actually addresses the core concerns that are actually affecting
people's real lives: real elections, the mental health of kids, polarization, the vanity-ification, the outrage-ification, the entire thing. The good news is that all of it has to do with just one thing, which is the overpowering of human nature, and we can bring it back into alignment. So how are we going to do that?

Well, the three things that got us to this world of human downgrading are artificial social systems, again, social systems that we have to bend ourselves to be inside of, things that are kind of hijacking our minds; these overwhelming AIs, AIs that out-predict human nature; and these extractive incentives. These three things are what have led to the downgrading of humans. But the good news is that by changing these three things, going from artificial to humane social systems that bring back into alignment all of our social instincts and recognize our natural brilliance; to humane AIs that are a fiduciary to our values and to the limits of human nature instead of exploiting them; and from extractive incentives to regenerative incentives that are based on competing in a race to the top to help us, we can go from the downgrading of humans to recognizing and amplifying the brilliance of human beings: the brilliance of our relationships, our civility, our creativity, our common ground, our shared truth, everything. So this is what we have to do. Let's take a look at how we do that.

How do you build humane social systems? After all, we got here by trying to do human-centered design; somehow that wasn't enough. A joke between Aza and me is that the bug in human-centered design is that it puts human bugs at the center. So we need to see what we want to protect; we have to start learning how to see new things about ourselves that we haven't been protecting, and that means, again, looking in the mirror and having a full-stack socio-ergonomic model of human nature. We're going to introduce just a little overview of how we might do that. This full-stack socio-ergonomic model involves different layers: our physiology, emotions, cognition, decision making, sense making, choice making, group dynamics. You have to have this full understanding of how we work, and I'm not going to go through all of it, but I'm going to give you a few examples. There are two sides: there are the individual ergonomics, how we work individually, our attention, our physiology, our cognition; there's a whole model for how these things are paleolithically wired and the ergonomics of how they work; and then there are our social ergonomics, how we work in groups, how we do social reasoning. And if you have this ergonomic model, then you can use it to start diagnosing problems.

So let's take an example that we all face. How many of you are writing an email, and you get midway through writing the email, and suddenly, boom, a new tab opens? How many of you have done this? You self-interrupt, a new tab opens. We actually do this about every 40 seconds; that's the average time that we focus on a computer screen now, and can you believe that this figure is actually from two and a half years ago. Now notice this wasn't from an external interruption. So if we're trying to solve this problem with technology, what are we going to do here? Should we build better AI? Do we need more data? Do we need better machine learning? Do we need a do-not-disturb button? Well, notice where the problem came from. It wasn't from the outside; the call was coming from
inside the house. The problem was inside of us. So where inside of us was it coming from? Let's take a look at this full-stack ergonomic model and start with our physiology: how are we breathing in that moment? Oftentimes there's actually something called email apnea: we don't breathe while we read our email, and that has a large effect on your cognition. Take a breath right now. Did you notice what your breathing was like before you took that breath? When you are subject to not breathing, you can feel stress, anxiety, all these other feelings, but something as simple as putting your attention back on it means you suddenly gain agency over the thing that had been downgrading you. And that's the theme we're going to show you: putting attention back on the things we can't see can actually reclaim them, because our physiology, our heart rate, our stress, our cortisol affects our emotions, and our emotions affect our attention and our cognition. So starting with this deep understanding, we're starting at the micro and we're going to go to the macro.

Okay, so that's how we start. Now let's take a look at an example of how this affects our social reasoning, leveling up. What's a good example of that? Well, when we feel socially isolated. This is one of the most common experiences with technology today: only half of Americans now say they have meaningful daily face-to-face social interactions, only half, and 18 percent say they rarely or never feel like there are people they can even talk to. All of us have those moments when you feel you're alone and you don't even know who to call. How many people have that feeling, that you're alone, you don't even know who to call, you're just sort of trapped in it? Our brains are brilliant at social reasoning, but somehow we're getting confused. For a second you could say, well, hold on, we have things like this; why don't we just use this when we're lonely? Maybe we're not doing it because our revealed preference is that we want to be alone, we actually want to just sit there alone. It's like, no, our nervous system is confused, because we're not given a really great menu of choices; the burden is all on our minds to think about which of those thousand relationships I might want to type in, and which letter do I want to type. It's not very ergonomic to the social signaling our brains are looking for.

And imagine if, instead, the kinds of information our brain is looking for were those signals that our friends care about us. For example, our brains are really good at receiving the information from our friends when they say, call me anytime you need to talk. When your friend says that to you, we know what that feels like in the moment, but our brains are not so good at remembering it when we feel down. Imagine if technology was designed differently because it recognized that fact about human nature and made those signals visible when you were feeling alone. So imagine you were alone and you would see this as the opening menu on, say, a screen, and imagine if technology and Facebook and these things were designed to help deepen those one or two relationships that are the being-there-for-each-other relationships. What if it was easy to get support from each other? What if this was as easy as getting knowledge from Wikipedia, or as easy as making content go viral on social media? What if this was the thing we were making easy? What would that do to addiction, to
polarization, to conspiracy thinking, if instead of being isolated we were more connected, because we could get support from each other?

Okay, so that was social reasoning. Now let's go up a level and look at polarization. Consider where you are on the political spectrum. Raise your hand if you think common ground exists with people on the other side. Yeah. Keep your hand up if you find it easy to find it. So we find it really hard to find common ground with people on the other side; we think we can't agree. And polarization has been going up, from the early 1990s through 2004 to 2014, some of it exacerbated by social media. But are we actually so polarized that we can't agree, or are we being presented with experiences that don't bring out our ability to find that natural common ground? Here is a study from 2004 asking Americans to estimate how much of the total wealth the top 20 percent own. If you ask Democrats, they thought it was about 55 or 60 percent, and if you ask Republicans, they actually agree on how much they thought the top 20 percent owned. But then if you ask them, okay, maybe that's fine, but what should the top 20 percent own, what's the ideal amount, Democrats and Republicans also basically agree on what they think the top 20 percent should have. And yet the actual wealth the top 20 percent has is much, much more than both the ideal and the estimate. So if we agree, why are we fighting about this radioactive topic? Maybe we're not being asked the right questions, and maybe we agree on more than we think, and maybe it's a matter of social media and technology putting us into the kinds of group dynamics, these mass broadcasts at each other, that our brains are not designed and evolved for.

So if you think about where our minds are naturally brilliant at finding common ground, it's things like campfires, where it's much easier to find reasonableness and openness and civility; dinner tables, which are as ancient as any human experience; getting a drink with one another. And these things are based on certain features of those groups: how big is the group, how much trust is there, how many people are at the table, what are the well-framed questions. There are things like Living Room Conversations that give our brains the kind of trust signals we need, because they create small, safe spaces where a small number of people can engage with well-framed questions, with good group facilitation, with diverse perspectives, and this has actually been working. Or things like Change My View, which was born out of a Reddit community that basically noticed that if you give people the incentive to say, hey, I want you to change my view, I'm giving you an invitation to change my view about vaccination, and if you see that little delta symbol there, people are rewarded the more they actually change each other's minds. This works so well that they've actually spun it out on its own now.

What if finding common ground was easy? What if we made it as easy as accessing Wikipedia, or as easy as making content go viral on social media? What happens to polarization when finding common ground is easy? What happens to conspiracy thinking when finding common ground is easy? What happens to addiction when finding common ground is easy? When it's easier to have these conversations, it has rippling, cascading benefits. And
what happens, more importantly, to our existential threats that depend on us agreeing and finding that common ground? Imagine that scaling up, so that this was the default way things like Facebook and Twitter were actually designed: for smaller spaces, with good group facilitation, with good shared questions. Some of you in the audience are actually working on problems like this. So this is an example of a more full-stack ergonomic view of how you take technology and wrap it around a more subtle and sophisticated view of human nature. And that means starting by asking, what do we want to protect? What are the zones we want to protect? Because children are naturally brilliant at doing playful stuff with each other, and we need to wrap technology around these experiences instead of replacing them or manufacturing synthetic alternatives.

Okay, so that was humane social systems: realigning technology to work with our social instincts. Now we have overwhelming AI, and we need to go to humane AI. How do we do that? We've got these voodoo dolls lying around, right, a warehouse of two billion voodoo dolls, one out of four people on earth, we've got a voodoo doll for all of them, and each one can be used to perfectly predict what will persuade us, what can keep us watching, what can politically manipulate us. What do we do with all these voodoo dolls? Well, we need a new kind of fiduciary, because this is a new kind of power, a new kind of power asymmetry. If you have technology that can perfectly manipulate you into doing and feeling whatever it wants, you need to make sure it acts in our interest. Imagine this as your AI sidekick that's designed to protect the limits of human nature and act in our interest, as opposed to sitting on the other side, working for extraction.

And lastly, we need to go from extractive incentives to regenerative incentives, because if you look at the scope of human downgrading, the scope of how it affects our attention spans, our polarization, our civility, our trust, our decency, we can't just solve this with a bunch of band-aids. We need a new set of incentives that accelerate a competition to fix these problems, like making the market work for solving climate change, a new set of incentives that create a race to the top to align our lives with our values. What I mean by this is, you might say, well, hold on a second, what about the business model? How many of you are thinking, what about the business model, because we have free right now; what are we going to do to replace these free products? Well, great: we're getting free social isolation, free downgrading of attention spans, free destruction of our shared truth, free incivility. Free is the most expensive business model we've ever created. And you would never have thought seven years ago that you might be taking Lyft and Uber rides everywhere, constantly, and paying for that; if you asked someone, they wouldn't have thought that. But we do it, and we pay for it gladly, because it makes life happen, it unlocks more life, more economic opportunity, because it's taking us where we want to go. But imagine we had the kind of incentives where there's a race to the top, so everyone is competing to help you find a relationship instead of competing to keep you swiping on screens forever, where things are competing for our trust to take us where we want to go in our lives. This is just like a better party. This isn't just, we should do this, it'd be
nice; it's, I want to live in a world where Facebook and Tinder are actually saying, no, get out of the way, I think I've got a better idea about how to help this person with their dating life. And imagine a world where you love how you make choices and you love how you're paying attention to things, because everything is actually competing to find that natural brilliance in us and coordinate those experiences to make that happen.

So this is the agenda: going from artificial to humane social systems, from overwhelming and overpowering AI to humane AI, and from extractive incentives to regenerative incentives. This is a new agenda for technology that actually addresses the human consequences and harms that we're currently experiencing. And this might seem really far off, but let me tell you why I have hope. It was about six years ago that I was at Google and I made this presentation saying, hey, we have this moral responsibility in how we shape two billion people's attention; we've created this race, we're hosting this race to the bottom of the brain stem, and we have to do something about it. And when I was at Google, I felt completely hopeless. There were literally days I went to work and I would just read Wikipedia all day and check my email, and I would have no idea, once you see something as massive as the attention economy and these perverse incentives, how could a system this big ever change? I truly felt hopeless; I was depressed. At some point, actually through Aza's help, I got an opportunity to give a TED talk on time well spent, and I saw that there was actually a lot of power in shared language. A lot of people were feeling the same way, but there wasn't language for it. And I saw how language like the attention economy, brain hacking, hijacking our minds, that language and shared understanding, creates a unified surround sound, and things can start to change. Because what would happen, over the last years, is people would go to work and they would hear these phrases coming around: you go to three meetings in one day, someone tells you about something, and what happens? That's pressure. If you think about where pressure exists, political pressure, show me the atoms of pressure, where's the material reality of pressure? It doesn't exist anywhere; where it exists is in common surround sound. And from people starting to speak up about these problems, I saw the power of shared understanding: Roger McNamee, Jaron Lanier, Justin Rosenstein, who invented the like button, Sandy Parakilas at Facebook, Renée DiResta, Guillaume Chaslot, the ex-YouTube recommendations engineer, Marc Benioff. When people start speaking up, like people in this room, and say there's a problem here, with shared language, things can start to change.

Because what happened was: Common Sense Media did a report in September 2018 showing 72 percent of teens now believe that they've been manipulated into spending more time on these services; no one was thinking about that two years prior. The Verge said the time well spent debate is over, and time well spent won. Why did they say that? Because Mark Zuckerberg, a year ago, said that our new big goal is to make sure that Facebook is time well spent. Apple launched time well spent features to help you manage your screen time on your phone; so did Google, and now they have grayscale on phones by default at night; YouTube included time well spent features. You have a billion phones now basically running tiny time well spent features, without
writing any lines of code, just by creating a shared understanding. You can move an entire ecosystem from where it is to where you want it to be. Now, these are baby steps, the tiniest baby steps, but the point is that we've actually set off a race to the top for well-being. Now Apple and Google are competing for who can better provide well-being experiences for people. But we need to upgrade that race to be a race to find humans' natural brilliance; we need to upgrade the bar to actually competing to align with this deeper model and sophistication about human nature. So that's what we have to do now.

This is a civilizational moment in a way that I'm not sure we're all reckoning with, because it's a historic moment when an intelligent species builds technology that can simulate, like a puppet, a version of its creator, and the puppet can control the master. That's an unprecedented situation to be in. It could be the end of human agency, when you can perfectly simulate, again, not the strengths of people but just their weaknesses, and the surround sound of their social environment, and the mental health and the social norms of all their friends, and pull the mental health of an entire generation of teenagers in a direction. That is a dark and important and civilizational moment. It could be the end of human agency, and that would happen by being unconscious to what's happening; that would be if we let the self-driving car of the extractive attention economy keep going, and we know exactly where that would lead, which is human downgrading. So we could let it do that. But just like Jack and Trudy did at the beginning, and I can ask you to do now, taking a breath, when you gain awareness over it, you suddenly go from being subject to it to having choice. So this is like civilization taking a breath and saying, let's see the ways that this stuff has hijacked human nature, and let's have a design agenda that fixes it.

We are just one group, at the Center for Humane Technology, that wants to drive this change, but our role is to help support the entire community in catalyzing this change. We're launching a design kit that relates to the things I've shown you today, to help people be better at examining these subtle features of human nature. We're going to be launching a podcast interviewing people who are experts in Russian disinformation, magicians, conflict mediators, the people who are the experts on this subtle terrain of how human nature works on the inside, so we can rapidly accelerate the tech industry's common understanding of these issues. We also want to host a conference where we're going to be bringing many of you, and everybody in this room who's already working on these topics, together to accelerate change in this area.

Human downgrading is like the global climate change of culture. Like climate change, it can feel stuck and it can be catastrophic. But unlike climate change, only about a thousand people need to change what they're doing, and guess what, many of us are here in this room, and many of us are watching this. So product teams can start integrating full-stack humane design into their products. Tech workers can start raising their voices and repeating this shared language of human downgrading. Journalists can shine light on the systemic problems and solutions instead of the scandals and the grievances. Voters, once they understand it, can demand policy from policymakers, saying, hey, we don't want our kids downgraded. And policymakers can respond to that by
protecting citizens and shifting incentives for tech companies. That leads to shareholders saying, hey, we want to demand commitments from these companies to shift away from human-downgrading-based business models; VCs funding that transition toward helping humanity be brilliant; and entrepreneurs building products that are sophisticated about our humanity, culminating in platforms that provide these incentives, this sort of Uber-and-Lyft competition for who gets you there first, Apple and Google competing to allow apps to compete not for our attention but for our trust, and to get us aligned with our values. Together this would create a race to the top, and doing this can move it from being impossible to being inevitable.

Again, this is a civilizational moment. For the first time in history we could be facing the end of human agency if we let this car run on autopilot. Or, by just becoming aware of what this is, of human downgrading and how it's happening, the good news is it has to do with just one thing, which is how human nature gets hijacked, and once we become aware of that, we have choice. This is the humane agenda that we are hoping all of us can get behind. Doing that, we can take E.O. Wilson and say: we can embrace our paleolithic emotions and upgrade our medieval institutions, which helps us have the wisdom to wield the god-like technology. So this is what we hope, with all of you, including in the lunch that's going to happen afterwards, we can talk about doing together. Thank you.

[Applause]

Okay, so what I wanted to do, thank you all so much, there's so much to absorb in what all that was about, I wanted to invite Jack and Trudy up to help us sink into that. Thank you, guys.

What you talked about, both of you so brilliantly, is a great human dilemma that we are a part of, that we carry, and there's some historic moment, as you say, but it's also not that easy to carry. So the first thing, before we do a little bit of a meditation, is, in cultivating the attention that you were talking about, Tristan, in ourselves, that gives us agency: take a moment quietly to sense in your body, in your heart, how all this information has landed. Did you move away from it and distract yourself? Did it make you sad? Do you feel energized? Where in your body has it landed, and how does it affect you? And continuing to pay attention, notice as well, without any judgment, how it affected you, and that you also have agency: that you can bring the qualities of courage, you can bring the great heart of compassion that can hold all the difficulties and sufferings that have been created, hold it all in the heart, and that you have vision. You have thousands of generations of ancestors who survived and gave you the capacity to face difficulty and to transform it. Feel these capacities in yourself. Yes, you can turn toward this with knowing awareness; yes, you can hold the difficulties with compassion and care; and yes, you can bring vision and courage and play your part. And finally, let yourself imagine five or six or seven years from now, when many of the possible solutions that Tristan outlined have begun to really happen, and the technological world has transformed into a humane engine for the humanity of children, of people everywhere. And sense: what was your part? When you look back and say, this is the seed I had to plant and water, this is what I could offer, it took courage and vision, reflect on what is your step, your gift, your strength to contribute. And feel the upwelling of that capacity and courage of the caring heart,
and know that this is a time that really needs you, and you have something important to add to it.

So I think all of us together are making a pitch for loving awareness. Everyone in this room loves awareness, consciousness, the brilliance that we have inside. And just to remind ourselves to focus on what we love the most in our lives, to love awareness, and to infuse that awareness and brilliance with caring for what we love the most: our children, each other, our grandchildren, our planet, our life. So thank you, thank you.

Thank you, guys, so great, thank you guys so much. I just want to briefly thank the Center for Humane Technology team, the SFJAZZ Center, Lynn Winslow, who helped put on this event, and everybody who was involved in making this happen. There's going to be lunch served outside, and the premise is that we can actually talk now amongst each other. There are amazing people in this room, amazing, amazing people, who are all part of actually making this change happen, so I really encourage you to meet each other and talk about what that might look like, and we'll talk more soon. Thank you so much.