An Ethical Crisis in Computing?



It's great to have Moshe Vardi here with us. We spent a day together yesterday on a prize committee, which is a causal explanation for why we see him here now (it wasn't at UW), so thanks for making time to come by and share your thinking with us. Moshe is a long-term colleague and an incredibly deep thinker and scholar, starting from the point of interest of how to apply logical inference and logical reasoning to computer science challenges. He has taken that to a number of places and has become known as an innovator in AI, reasoning about knowledge, database theory, systems, and many other areas of computer science; it's an impressive swathe of contributions. More recently he has been applying some of his insights to deeper questions at the intersection of technology, people, and society, so we have a kind of joining of our evolutions in terms of the questions we are asking these days and in looking for interventions and policies.

To give the more formal description: Moshe is a University Professor and the Karen Ostrum George Distinguished Service Professor in Computational Engineering at Rice University. He has won a number of awards along the lines I just described: the Gödel Prize, the ACM Paris Kanellakis Award, the ACM SIGMOD Codd Innovations Award, the Blaise Pascal Medal, the IEEE Computer Society Goode Award, and an award from ACM SIGLOG. I won't keep reading these awards, because it will make people wonder what's going on here. He has authored or co-authored over 600 papers and two books, Reasoning About Knowledge and Finite Model Theory and Its Applications. He is a fellow of numerous organizations, including the National Academy of Engineering, the National Academy of Sciences, and the American Academy of Arts and Sciences. And, I didn't see this before I read it just now: he holds six honorary doctorates. That's very interesting; how many people here hold any honorary doctorates?

I also know Moshe for almost single-handedly (of course he eventually had a team) transforming CACM from a lightish-weight magazine, a kind of society tableau, into a rich, engaging, high-quality centerpiece of the computing community. Thanks again for that; I think it has been recognized in some of the awards I didn't even mention. So today we'll hear about "An Ethical Crisis in Computing?", question mark, so we will hear the answer. Thanks, Moshe, for being here.

Thank you for this kind introduction; it is very good to be here again. I want to raise the question: are we having an ethical crisis in computing? But I want to start by talking about a concept that I think is very fundamental and is not talked about enough: trust. How many people have ever played the trust game? It's popular in corporate off-site retreats: you stand up there, fall backwards, and hope that people will catch you. Why do corporations do it? Because they know that for teams to work requires an element of trust. But this is actually bigger than small teams; it applies in a very broad way. Why? Because, as Wittgenstein said, every game has an unwritten rule, and the unwritten rule is: follow the rules of the game. It doesn't help to write it down, because then you need another unwritten rule that says to follow that one. In fact it is not enough that you know the rules of the game; you have to trust that everybody else follows the rules of the game. Think about where this applies: you come to a traffic light, and when it's green you just drive through, because you trust that the lights work properly (if you have a green light, the other side has a red light) and you trust that the other drivers will obey the rules of the game. If you want to see how important this is, go to some third-world country where a traffic light is a friendly suggestion at best: nobody drives through a green light, because you don't know what's going to happen.

Democracy is another type of a game, and it works only if you trust that everybody is following the rules, and you follow the rules. It turns out that trust has huge economic value, because in an economy without trust everything has higher friction: when you sign a contract, do you trust that the contract will be honored? People have estimated that Russia is losing about 70 percent of its GDP because it is a low-trust economy, compared, for example, to Sweden, which is a high-trust economy. And Satya Nadella said last year that we cannot afford not to build more trust in technology, because our entire economic growth agenda could come to an end without it. So trust is an incredibly fundamental concept in modern society, and actually in all societies. But to trust a technology you would have to be familiar with it, and for many of us many technologies are black boxes. So instead you trust the people who developed the technology; or maybe not the people individually, but the company, or maybe the industry. But what happens if it's none of the above?

People are saying now that we have a crisis of trust. The situation with cybersecurity is dismal; we hear about breaches every day. Democracy has essentially been hacked by technology, and I would say this is on us, guys: we should have built better foundations for society, and we failed to do that. Privacy: there are people who say privacy is dead, but of course these are people who have an interest in there not being privacy, and surveillance has become a feature, not a bug. I'll talk more about automated decision-making. And everybody is now talking about explainable AI; well, explainable AI at this point is a term of aspiration, not a technical reality.

So do people trust tech? One issue is opacity. Google and Twitter each have a privacy policy; try to read one. It's written by lawyers, written to be right but not readable. Surveillance has become a business model; I'll come back to that. Tech executives say proudly that their children see no screens at home; so the products are for your children, not for their children. In Silicon Valley there are now expensive neighborhoods where people don't want to see any self-driving cars; those should go to some other neighborhood. Marc Benioff, the CEO of Salesforce, said a couple of years ago that there is a crisis of trust concerning data privacy and cybersecurity; I think it is even broader than that. And all of this has led to what some people now call the techlash, the backlash against technology.

I came across it in October 2017, when I read a column by Peggy Noonan in the Wall Street Journal about gun ownership: why are people so attached to their guns? I think her explanation is cockamamie, but the sentiment she expresses is important. She said it is because our personal and financial information got hacked in the latest breach; because the country's real overlords are in Silicon Valley and appear to be "moral Martians" (note the beautiful alliteration) who operate on some weird new postmodern ethical wavelength; and they'll be the ones programming the robots that will soon take all the jobs. Niall Ferguson, in the Journal a few months later: most alarming was the morphing of cyberspace into "Cyberia," not to mention the cyber-caliphate, a dark and lawless realm where malevolent actors ranging from Russian trolls to pro-ISIS Twitter users could work with impunity to subvert the institutional foundations of democracy. And if you think it's only the Journal, here is Tom Friedman, just a few months ago in The New York Times: finally, there are the internet barons, who for too long ignored the weaponization of social media, which is turning our free press into a house of mirrors, where citizens can no longer cognitively discern fact from fiction and make the informed judgments essential for democracy. What is amazing about this techlash is that here is an article from just a few years earlier: "Can Silicon Valley Save the World?" What is the date on that? June 24, 2013. So in about four years we went from

saving the world to destroying the world. What happened? I don't know how many of you know Jennifer Chayes; she just moved to Berkeley, and she described the 253-million-dollar gift they now have to build a data science division. Every university has some data science initiative. But at the same time there has been a stream of books and articles describing the dangers of this new data science, starting with the 2016 book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, and there are more and more books like that.

Let me take one example: machine learning and the justice system. In April 2017 John Roberts, the Chief Justice of the United States, was visiting RPI and having a fireside chat with Shirley Ann Jackson, the university's president. She asked him: can you foresee a day when smart machines, driven by artificial intelligence, will assist with courtroom fact-finding or, more controversially even, judicial decision-making? And he said: it's a day that's here, and it's putting a significant strain on how the judiciary goes about doing things. The New York Times recently wrote about "an algorithm that grants freedom, or takes it away." It turns out that machine learning is now being deployed in our justice system for decisions about bail, sentencing, parole, separating children from families, and so on and so forth. These are all decisions that are very, very difficult for humans to make, and they have profound implications for people's lives, so judges are very happy to delegate these decisions to automated decision systems.

There is a company called Northpointe, and they have a beautiful website about "advancing justice" and "embracing community"; who can say no to that? In 2016 ProPublica published the first real exposé of the systems being used, and they showed that if you take similar data and just change a few variables, essentially changing the race of the defendant from white to black, suddenly the system makes different judgments. It's not that difficult to figure out why; everybody knows it now. Some people call it "garbage in, garbage out"; to say it a bit more politely, "culture in, culture out." There is a history of oppression against African-Americans in this country, that history is in the data, and if you train on that data, that is what the system will learn. That is why people now talk about how these new systems being deployed are essentially discriminating on the basis of race and class.

It also turns out that even though these systems have fancy websites, the machine learning underneath is very shaky. For example, two years ago there was an audit of Northpointe's COMPAS system, a system that claimed to learn from 137 attributes. It turns out that in reality its predictions match a linear predictor over just two variables, and any person with common sense would make similar decisions; there is no fancy machine learning there. And machine learning itself has come under a lot of criticism, for example for lack of reproducibility; there was an article in Science two years ago, and I'm sure you have heard about "the alchemy of machine learning." Machine learning is of course an incredibly exciting technology, and we are all excited to hear about it, but it is very far from ready to be deployed in the justice system. It is one thing to predict which link a clicker is going to click on; it is very different to make profound decisions about people's lives.

To reveal the absurdity of the situation, let me tell you about Fluffy. Fluffy is a black Lab, very cute as you can see, and I trained Fluffy to detect risk of recidivism. Fluffy is very, very good: if Fluffy barks, there is a high risk of recidivism; if Fluffy licks, there is a very low risk of recidivism. But as you can see, Fluffy is a black box and does not provide explanations. So, would you allow Fluffy to make parole decisions? First of all, we understand how ridiculous that is; but the argument is that there is not much difference between Fluffy and some black-box machine learning system that does not provide explanations. And we don't even have to go to Fluffy. I write lots of recommendation letters. Imagine that I wrote a recommendation letter that would just read: "I recommend John Smith for tenure." People would laugh at my letter; they would say it is useless. Why? Because I did not provide an explanation.

How many people have read Ender's Game? This is a tech audience, so people remember: there is a boy who, with his friends, thinks they are playing video games, but in reality they are fighting an intergalactic war, and when they win, they destroy a civilization. I think this is a metaphor for all of us: we thought we were just playing games, but we are running the world, and we have not necessarily made it a better place.

Over the last couple of years this has often been described as an ethical crisis. Here is a sequence of headlines from different articles in the media: ethical dark side, ethics crisis, chief ethics officers. Facebook is now doing what some people call "ethics washing," giving some ethics center seven and a half million dollars to show what an ethical company it is. Ethics, ethics, ethics. A year and a half ago, at the Snowbird meeting of the Computing Research Association, there was a panel on social responsibility, and the question posed to the panel was: we live in a world that we created, we computing professionals, and we can no longer pretend we are making the world a better place; what is our social responsibility as computing professionals? And the answer was: ethics, ethics, ethics. I want to be the ethics party-pooper. I want to ask: is it really about ethics? I am going to challenge that, and I am going to use a metaphor: the Ford Model T.

The Model T, I would argue, was the most important industrial product of the twentieth century. We think the computer will be that for the 21st century, but the 20th century was shaped by the automobile, and this was the first mass-produced, mass-consumed car. As soon as the Model T started being produced and people started driving around, we discovered that cars kill people; lots of people get killed; and we have now been spending more than a century trying to deal with the deadliness of cars and reduce the mortality. Of course, as the country and the population grow there are more cars, so more people get killed, which is why people in the transportation business focus on deaths per billion vehicle miles traveled: how many people get killed for each billion miles that people drive. And you can see that we have an amazing record of reducing it. (It is now going up again. Why? Drivers, and apparently pedestrians, are distracted. But ignoring that, which is shameful, we have done amazingly well in reducing the mortality from car crashes.)

How did we do it? By a gazillion things, not by one thing. The first cars didn't even have mirrors; then came cutting-edge technology like anti-lock brakes and airbags; we changed the whole terrain of the city with crosswalks and traffic lights; and we have laws like DWI, and now DWT: Texas finally outlawed driving while texting, so it is finally illegal in Texas. What did we not do? We did not say that the remedy for this mortality is to give every driver ethics training. No: we used public policy, and here public policy spans regulating the car companies, changing the way we build cities, and changing the laws (you need a license to drive a car), all kinds of things. And part of it was public pressure, to create the right incentives. Absolutely.

So now let's compare. That was the metaphor: what happened with cars. Now let's talk about the internet. The internet really goes back to the 70s, but as a public thing it dates to the mid-90s. The cultural foundation of the internet goes back further. I was a postdoc at Stanford in the 80s, and we had the WELL, which was a dial-up bulletin board, and I was using it. The cultural infrastructure for that goes back to the 60s: there was an anti-establishment sentiment (people call it the counterculture), having to do partly with the Vietnam War, and out of that anti-establishment counterculture came the mantra "information wants to be free." Of course, information doesn't want anything

, it's just a figure of speech, but the sentiment was there: information wants to be free. So when you look at Tim Berners-Lee's paper describing the World Wide Web, it was all about unfettered public sharing of information. In the beginning it was just exhilarating: all this information was there at your fingertips. But then you couldn't find anything; there was just way too much information. So the problem became how to find information. The first solution was: libraries have catalogs, so let's catalog the internet. That is what Yahoo tried to do, a catalog for the internet, and we discovered that doesn't scale. Then a new idea came: search engines. At first there were engines like AltaVista, but they were actually pretty bad, and then Google came along, really the first very, very good search engine. Remember when you used Google early on, and, like magic, the first few links were just the right links? It was really magic. It's not as good today, but it was amazingly good at the beginning.

Of course, Google immediately faced the challenge: information wants to be free, so you cannot charge for information; how do you make money? They had the model of the media, newspapers and radio and TV: we'll use advertising. But there is a problem. On TV you have a captive audience, and in a newspaper you use real estate on the page; on the screen, you had big banners (you remember the banner age; people hated banners) and people don't click. So they had another brilliant idea: micro-targeted advertising. The advertisement you see on your page is different from the one I see; everybody sees something different. But to do that you have to gather personal data, so you can target advertising to individuals. And how much data is being gathered? The answer is, we don't really know; these are well-kept secrets. Once in a while something leaks: there was a leak last year that Google has a deal with Mastercard, so that they also get the data from your offline purchases. So they have a very deep profile of you. This has led to what has become known as surveillance capitalism, often described as: if you're not paying for it, you're the product. This is the situation that has emerged, and when Facebook came along, they used exactly the same model; Google led the way, and Facebook just followed with the same idea of surveillance capitalism.

How well does it work for Google? Very well, thank you very much. This was, business-wise, one of the most brilliant business ideas; look at the stream of revenues over less than twenty years, it's unbelievable. This works so well. So let's imagine now that we invite Sergey Brin and Larry Page to an ethics boot camp, and we teach them A Theory of Justice, and Rawls, and Hume, and all this stuff. Do you expect them to go back to Mountain View and say, guys, it was all a big mistake? The issue here is what I wrote about a year ago: if society finds the surveillance business model offensive, then the remedy is public policy rather than ethical outrage. Google is running a perfectly legal business; why should we be outraged at what they are doing? If we don't like the rules, it's up to us to change the rules of the game; we shouldn't say "you're bad guys" because they found a loophole in the rules.

Now, I want to explain that the issue is not just that we don't have privacy. People say: yes, but it's a deal I'm willing to make, because I get all these free things from Google. I think this is a complete illusion, because we saw that Google makes more than 100 billion dollars. Where does the money come from? You say: not from me, it's from the advertisers. Really? Where do the advertisers get their money? For them, advertising is just a cost of doing business; everything is folded back into the price of the products that you buy. We pay for all of this; all this money comes from us, but in an indirect way. So basically there is now a kind of consumption tax on the economy, on everything, because of advertising, and it all goes back to a very small number of big online advertising companies, Google and Facebook chief among them, and it is done in a completely opaque way. As a result, people feel that their privacy is gone.

Again, the argument is that people know this and are willing to give their data in exchange for this set of free services. But it turns out that what we know doesn't mean that people in general know it. Pew has done a survey, and it turns out that 74 percent of US Facebook users do not know that Facebook has a very detailed profile on them. You can find this profile, click, click, click, but you have to be quite sophisticated to find it, and very few people know about it. And when people see the profile, they are uncomfortable and often disagree with it. Just to tell you: Facebook's profile of me says "extremely liberal"; my own self-image is that I'm a centrist. Now I'm confused; I don't know who is right anymore. And again, the opacity is important. People here use Gmail; how many people have read the terms of service of Gmail? It turns out, for example, that every time you send a paper over Gmail, you are giving Google a license to use it to improve their services. Be careful what you send.

And even beyond that: we now have a debate between capitalism and socialism, but I think experience has shown that the free market, properly regulated, is a wonderful device for economic growth. People have tried unfettered capitalism, and we have seen the result; people have tried unfettered socialism, and we have seen the result; somewhere in between there is a good market. But we have destroyed the market for information; there is no more market for information. I wrote a column about this two years ago, "How the Hippies Destroyed the Internet." It was very popular in Germany, where it was translated into German, and in the US I got real hate mail about it, absolute hate mail; it was just stunning. But again I come back to the question: is the issue here ethics, or is it whether we have the right public policy?

I want to use a different example, and it has to do with liability. Here is a license I copied; this is the license from Corporation XXX. It's not Microsoft, but it's another trillion-dollar corporation; you can guess. It says: to the extent not prohibited by applicable law, in no event shall XXX be liable for any personal injury, due to use of or inability to use this software, or whatever. Generally this is the prevailing standard in the software industry; this is not just to pick on Corporation XXX. Use it at your own risk; we are not really responsible for the product we are selling you. And notice the first line right there: "to the extent not prohibited by applicable law." Yes, very carefully written by lawyers.

Now let's compare this to the concept called strict liability. If I buy a car and the tire explodes and causes an accident and I am hurt in this accident, the tire manufacturer has strict liability. This is generally part of what is called product liability law, a development especially of what we call common law. The observation is that in modern times you cannot expect someone to get a car and inspect it for technical soundness. It used to be caveat emptor, but people said no: I cannot buy a car and check the tires myself for high standards of manufacturing, so the manufacturer is strictly liable. I don't have to prove negligence; if they were negligent, they have higher liability, going all the way up to criminal negligence, but in general they are strictly liable. And this goes all the way back four thousand years, to the laws of Hammurabi. You want to see strict liability? Here is strict liability for you: if a builder has built a house for a man and has not made his work sound, and the house which he has built has fallen down and so caused the death of the householder, that builder shall be put to death. That is strict liability.

Already more than twenty years ago, Helen Nissenbaum wrote an article about accountability in a computerized society, and what she warned about is the lack of accountability, because if there is no liability, there is no accountability. She said we are headed into a world which is run by software, while at the same time the software developers and vendors are not

strictly liable the way we have with any other product in our society and again my question is is it about ethics or about public policy and as Eric pointed out that license he said to the extent not prohibited by applicable law so the problem is not just you can say the problem is not the license per se but the absence of applicable laws that will decide that software vendo should be stick liable just like any other product and you could argue maybe in the very early days of software we didn’t have the standard of care how to build good how to develop good software I don’t think we can make the case today so again my argument it’s about lack of public policy now why is there no IT public policy well one reason is that especially today the tech the tech industry is is incredibly powerful I mean you have today the big five of more than five trillion dollars in market market equity maybe last week it went below five and today is about five again I don’t know exactly but it’s not just about today’s industry this goes back this is a long history with the IT industry to its previous incarnations doesn’t just go to date companies but you go 25 years ago in before that those always the argument is regulation stifles innovation so look that not regulated because we’re going to harm innovation like Elon Musk two years ago tweeted something and SEC accused accused accused him of trying to manipulate the stock price by using Twitter and wild magazine immediately immediately jump in said the case again illo mask which in chile innovation so let’s ignore security regulation innovation is the most important thing and this goes beyond just the industry the poor my point here is not to dump on the industry but there was a whole ecosystem that i think where this come from so John Perry Perry Barlow was one of the founders of the Electronic Frontier Foundation and he just died a couple of years ago and in 1996 he ought a declaration of independence of cyberspace this you really have 
to go in read this document it’s absolutely stunning document here is the the first few sentences government of the industrial world you were a giant of flesh and steel I come from cyberspace the new home of the mind on behalf of the future I ask you of the past to leave us alone this was the attitude it’s not just the industry whose the attitude this is somehow we’re in the some utopian thing we are in some the home of the mind leave us along with your with your steel and iron and flesh regulations okay it shows it shows in some of the philosophy was that what happened in cyberspace stays in cyberspace which of course we know it’s not the case and you can imagine what would happen if there will be an attack on the u.s. power grid which there’s some evidence that the Russian have been already kind of probing some of the week we don’t have a uni an American system we have many many different players in some of the small small ones that have less have less resource to invest in cybersecurity we’ve already seen some footprints of a of Russian trying to to penetrate the u.s. 
power grid or an attack on the US election let’s imagine something radical is dead I’ll come back to that and I would say it’s even it’s just beyond this utopianism it’s pervasive to the computing ecosystem so in 1995 plenum Chris Christensen who just passed away very recently what incredible influential article about disruptive innovation ultimately published the book but felt there was an article and he described a process where you have a new entrant to the market come with a new technology and this completely disturbed the previous the previous they don’t dominant a players in the whole the running example of this document was about this drive technology this is what this lab used to look like okay they were big okay and they shrunk every time they shrunk it was because there was a new technology but it was very rarely the same company was able to move from one technology to the other there’s always a new technology they displace the old one now Christensen was trying to be descriptive he said this is what I’m seeing in the computing industry but this now has become the mantra every business proposal that people submit in Silicon Valley is about disruptive disrupting some some market is if disruption is something to celebrate and it’s not very far to go from disrupting to breaking so you have Facebook who until a few years ago the motor was move fast and break things and literal breaking stuff you’re

not moving fast enough. But break what? It's one thing to break your own, you know, travel-reimbursement system; it's another thing to break democracy. And now the evidence is very strong that this idea of frictionless sharing led to the ability to have massive dissemination of fake news, and the ability to target people at the individual level, many millions of people, with targeted advertising. I think the most definitive word so far on what happened in 2016 is a book by Kathleen Hall Jamieson, Cyberwar: How Russian Hackers and Trolls Helped Elect a President — What We Don't, Can't, and Do Know. She makes the case that we still don't have all the information; she says it depends on the standard of evidence. If you want "beyond reasonable doubt," we don't have it. But if you take the civil-litigation standard, preponderance of evidence, then by a broad preponderance of evidence the Russian intervention helped elect Trump. And again, if you follow this philosophy of disruption and breaking, you get to the next step, which some people have called "criminality as a business model." Take the sharing economy, both Airbnb and Uber. What was the business model? Ignore current laws. Most cities regulate the lodging business, and most cities regulate the transportation business: if I were to start driving a car for money, they would say no, no, you need a medallion to do that. And these companies basically said, "Oh, we don't do it — we're just the platform." Now imagine they had done it with an app for illegal drug delivery: the app would match buyers and sellers, and they would say, "We're just the middleman here, we're just a platform." Everybody would say, no, no, no, you are an enabler of crime. This has been described as criminality as a business model. These were both brilliant business ideas, and they used the convenience to the consumer as political leverage, but the reality was that these two companies' business model was about breaking the law.

So this brings us back to a topic that, you know, I'm partly here to learn more about — how Microsoft does it; I started talking to people about it this morning — corporate responsibility. Corporate responsibility is also something that has evolved over the years. Go back to the 1960s and ask a CEO, "To whom are you responsible?" and they would say, "Oh, this is complicated. I have many stakeholders: I have shareholders, I have customers, I have employees, I'm located in a community; I have to balance the interests of all of them." But starting in the 1970s, and becoming dominant in the 1990s, the doctrine became shareholder primacy: no, no, the shareholders are the only ones that count, and therefore profit-making should trump everything else. Milton Friedman said the social responsibility of business is to increase its profits. And the Business Roundtable, which is roughly the two hundred top CEOs in the country, just last August said: oops, there was a mistake; we need to go back to stakeholders. Which, again, some people debate, because the fact that you have so many different stakeholders does not lend itself to very easy decision-making for corporations. But I have to say, that is how human decision-making works. Think of each one of you: how do you make decisions about your life? The answer is, it's about stakeholders. You have responsibilities to your family, you have responsibilities to your employer, to yourself; you have a community; you somehow have to navigate between all of these. So the question I want to pose is: technology is driving the future — who is doing the steering? And the reality we've created is that we have let tech corporations decide where society is going, and I think this is not the right choice for a society to make. Quoting an article from Bloomberg: the corporation is playing government. Technology has moved very fast, but we've neglected to develop the corresponding public policy. The problem is not the corporations; the problem is what society has not done. And what we need to remember is that innovation is not a goal; "new" is not a goal. Social progress is the goal. Innovation in the service of social progress is good; innovation per se — I don't see why innovation per se would be a goal. Now, we can go back to ethics. Is ethics important? Of course ethics is important, and I am not here to tell you that you should behave unethically. I introduced last spring at Rice — we're now teaching it for the third semester — a course on computing ethics and society, and to see the

students reacting when you force them to think about the implications of technology on society, and to see them changing — they even come to tell me they've changed how they think, or their career plans, because of it — I think it's wonderful. Chief Justice Earl Warren put it very beautifully: in civilized life, law floats in a sea of ethics. Ethics determines what kind of public policy we want to have, and it's a delicate balance where to draw the line. We decided that drinking alcohol is a bad thing — so bad that we turned it into public policy instead of leaving it to individual decision-making — and it was a disaster. We've done the same with the war on drugs, another disaster. So for some things we say, no, no, we should not legislate; we should leave it to individual decision-making. But ethics is about individual responsibility; public policy is about societal responsibility, and we create institutions to enforce public policy — we have the justice system, for example. In society, if you're unethical, I may not like you, but there are no sanctions for unethical behavior unless you violate the law. And it's become very clear now that this is changing: regulation is coming. It started coming from the media, and then in 2019 Tim Cook spoke about how we need some regulation about data. He was talking in particular about the data brokers, because, unbeknownst to most of us, it's not just our data at companies like Google and Facebook; there is a whole army of data brokers who are buying and selling information, and it is all completely opaque. Apparently you can now go — there are about a hundred and thirty-five registered data brokers — and ask them one by one to delete your data, but that requires complex navigation between 135 websites, and it's not clear that they delete anything if you tell them to. Alphabet came out recently in favor of a temporary ban on face recognition, and I think Microsoft has also talked about the need to regulate face recognition. So you now see companies going to Washington to talk about regulation, partly because the last thing they want is fifty states with fifty different regulations — business-wise it's much better to have one federal regulation. Now, how to regulate? This is difficult. It's not as if this is an easy solution; it's a hard solution, because doing smart regulation is very, very difficult, and regulation requires good governance. It's well known that once you put regulation in place, you create more openings for co-option. For example, there is so-called regulatory capture: who is going to be the regulator? People with expertise — and who are the experts? People from industry. So you end up with people rotating between regulators and industry, and you have to try to address that. You find different philosophies of regulation between the US and Europe, and clearly some technology will require international treaties — again, Microsoft, I think, has called for some international treaties on technology. And of course the elephant in the room is: is big tech too big? Basically, since the Microsoft trial there has been very little antitrust action against technology. We do know that one reason Microsoft was able to prosper was that IBM, where I used to work, was under a consent decree, and one reason Google was able to grow was that Microsoft was under restrictions coming out of the Microsoft trial. But after that trial, which did not end well for the government, there has been no serious antitrust action, and there are people who question whether we should have let Google buy YouTube, or let Facebook buy WhatsApp and Instagram and become so dominant in these categories. I'm giving just a few examples here to show that we can start thinking — we need to start thinking — about regulation. So: there are all these terms of service that are unreadable. But there is a solution to this unreadability. How many people are familiar with the Creative Commons? I'm sure what you have read is what's called the simple-language description; there is a legal document in the background, and I believe no one here, unless you specialize in copyright law, has read the actual document — you've read the simple-language description. In fact, in securities investment, whenever you buy a mutual fund, by law they have to provide a simple-language description of the fund's investment philosophy. So to me it is a no-brainer that every license should have a simple-language description of its main terms. The fact that the Chief Justice of the

Supreme Court of the United States says, "Oh, this is happening — my goodness, what can we do about it?" — the Chief Justice of the Supreme Court is the chief regulator of the federal court system! He has the power to put in place a rule that says, for example, that every automated system used by the courts must be audited for transparency, explainability, and lack of bias. It's up to him; he is the chief regulator. To me — I don't know exactly how to describe it — people have been thinking about how to automate decision-making, and the fact that a company is able to sell a system like COMPAS with no regulation and no requirement for audit, I think that's a scandal. When Yahoo was sold to Verizon, they disclosed that a year, a year and a half before, they had had a breach in which billions of users' data were exposed — and they only disclosed it because they now had to, as part of being bought. So to me it is, again, a no-brainer that corporations should have to disclose data breaches much faster. Another example: I love the free services I'm getting, but I don't like the advertising, I don't like the surveillance; I'm willing to pay for them. Yet there is no market — I can't pay to use these services; the only way to pay is with my data. I want the option of paying, and then we will see. Some people argue that then the rich will have no surveillance and the poor, who cannot pay, will have surveillance. But this would let a market determine the real value: if Google wants, say, a thousand dollars a year from me, I'm not going to pay; a hundred dollars a year — okay, I'll pay. So we'll have some discovery of the real value of these services. Now, in closing, I want to go back to the very beginning of our discipline. Many people think of Leibniz as one of the very first computing professionals, so to speak, because he built a calculator, he invented binary notation, and he talked about the calculus ratiocinator, a reasoning calculus. And what was the goal? "Mankind will then possess a new instrument which will enhance the capabilities of the mind to a far greater extent than optical instruments strengthen the eyes." So there is this tension between what we call AI and IA: do we want intelligence that will replace humans, or augment humans — intelligence augmentation, intelligence amplification? Leibniz was all in favor of intelligence augmentation. Ada Lovelace, in a letter to Babbage — Babbage, it turns out, wanted to make money from his analytical engine — writes to him: "I wish to add my mite towards expounding and interpreting the Almighty, and his laws and works, for the most effective use of mankind." She is kind of chiding him: yes, it's okay to make money, but let's also think about how we can make it good for humanity. And I want to close by observing that on this issue of technology and its impact on society, we are just the latest: other disciplines discovered the dark side of technology before us. The physicists discovered it with World War II and the atomic bomb; the chemists discovered it in World War I with chemical weapons. We somehow thought we were immune to this, but technology always has two sides. The first technology was fire, and it was such a magical technology that Greek mythology imagined Prometheus stole it from the gods. Prometheus paid a personal penalty: every day an eagle comes and eats his liver, the liver grows back overnight, and the next day it gets eaten again, and again, and again. That is the personal punishment. But there is another chapter to the story, and that is Pandora's box — the way to punish all of humanity. The gods sent Pandora with a box supposedly full of gifts, and when the box was opened, pestilence and all kinds of societal ills came out of it. So already the Greeks are telling us: technology is never free; there is always a dark side; it always has adverse impacts. Think about it: the way we look has been shaped by fire — we have these tiny jaws because we eat cooked food. And fire still kills people; we still struggle with the adverse impacts of fire — this is from the fires in California in 2018. And somehow this topic, how we live with technology, I don't think has been getting enough attention.

So we launched at Rice a new initiative, which we call Technology, Culture and Society. It's an interdisciplinary initiative in which we're going to try to study this interplay between technology, culture, and society: how does technology impact society, and how should society respond — mitigate, or not? I'm not advocating going back to the caves — we can't go back to the caves — but how do we live well with technology? Right now we are not really living well with technology. And I want to finish with a slogan that I stole from BMW. BMW was "the ultimate driving machine," and now they have to deal with, you know, automated vehicles and so on, so the new slogan is: don't be driven by technology — drive it. I think that's a good slogan for all of us. Thank you very much. [Applause]

So — I think it's interesting, with respect to ethics, to also talk about systems: can systems inherently be ethical, or be used ethically, or not? I would ask two questions about any system: one, to whom is it accountable, and two, is it sufficiently transparent to be accountable? If you look at systems like our current social media platforms, my question would be: is anything like that possibly accountable? In other words, is it possible for it to be sufficiently transparent, under any kind of organizational structure, that it could be accountable to any kind of regulation?

Well, I have to say, when people talk about ethical AI, I am a little skeptical that we can have ethical AI. First of all, let's have reliable systems — we don't even know how to do that; we still struggle with it — and now I'm going to make them also ethical? You know, suddenly — the philosophers are rolling their eyes, because there was something called the trolley problem, which was kind of an obscure thought experiment, and mainstream philosophy mostly dismissed it over time as just a thought experiment: don't take it seriously, nobody really makes decisions like this in real life. And suddenly you see I don't know how many articles about how automated vehicles are going to make these trolley-problem kinds of decisions. If anybody watches the series The Good Place — you have to watch the trolley-problem episode; it is such a fantastic episode, I don't want to ruin it; if you have not seen it, you have to watch it. So I would say: before we talk about ethical AI, let's think about the fact that we are responsible, ultimately. It's us, guys, it's on us — we are the people who develop it. Let's talk about our responsibility. And I'd say it goes even beyond us: I think society has neglected to put enough rules on us. If you have children, you discover that yes, you want your child to explore and experiment and take risks and sometimes fail, but at some places you draw a line and say: no, no, no, do not go play on the freeway. It's okay to play on the street in our quiet neighborhood; on the freeway, no. Somehow, when there are no rules, people develop their worst side. We all need some kind of societal harness to behave properly, and by now the biggest persons in our society are corporations, and they need some rules about what responsible play is. Then, once we make corporations responsible, we can talk about other degrees of responsibility. But there is a phrase that has become popular recently: "ethics washing." When Facebook gives millions of dollars to ethics research, that to me is a wonderful example of ethics washing. So — responsibility and accountability, absolutely, that is the thing. I just talked to Eric before this talk, and I said I'd love to get an article from Microsoft describing how it is going to behave responsibly, and so on — and he said I'd have to wait a little bit until this becomes public. But I think Microsoft is shaping up as the more responsible player, which is a big change — you know, twenty years ago Microsoft was the Darth Vader of the tech industry. So, yeah.

Excellent talk, thank you for coming to talk to us. You said you're skeptical about ethical AI, and I'm a PhD intern here studying ethical AI, so in some sense my job is to share that skepticism with you and interrogate it. One thing I've seen a lot of focus on is how not to make technology that is designed, or in

some ways designed specifically, to support an unethical end. But I like your analogy of fire, in the sense that it's morally ambiguous: it can be used to do good things and to do terrible things. Bringing that into the technology space, one analogy that might hold is a word processor: it depends what you use it for; by itself it doesn't have ethical implications. At the beginning of your talk you brought up the idea of separating children from families, and I think that's a really important example of something where people might squirm. So what guidance would you have for industry about building technology whose effect depends on whoever takes it and does something with it — for tool builders, people who build morally ambiguous or morally unknown tools and then give them to someone else? How do you think through that?

So, I think everything we do has dual use. We just have to live with the reality that everything — everybody here, everything you have done — can be used for good and for bad purposes, okay? And I think we just have to live with it. The answer is not "let's not do it because it could be misused." You know, I use a knife to cut my bread, and I want it to be a sharp knife; the same knife can be used by someone else to kill. This is the story of technology, okay? But there is an interesting development — I don't know how many people here read the news last week about the new review model for NeurIPS. NeurIPS now says that when you submit a paper, you have to describe in the paper its possible adverse impacts. Part of our naivety as a discipline was that we thought what we're doing is just good, end of story — what do we have to worry about? And now we are starting to say: well, it can be used for good or evil, so let's think about it early. Okay, I'm about to release a new model checker — let's think about what can go wrong. Now, I look at some of the things I've done, and of course you can use a model checker to verify some horrible program, right? Does it follow that we should not release it? The answer is: good and bad depend on the software it is applied to; our part is developing tools for reliability. But especially in the era of machine learning — I always say I don't understand why NeurIPS still has a program committee; they should just train a big model, use a big DNN, to make all the accept-or-reject decisions. And of course they would say no, no, no. Why not? If it's okay to decide on separating children from families, then accepting papers is a much less profound decision. It's called eating your own dog food, right? So I think we need to be aware. We were very naive — yeah, this happened to the chemists, it happened to the physicists, it's not us — well, we're in the same boat now. We need to start being grown-ups and think about trade-offs; life as adults is all about trade-offs.

I feel like you say we're not doing so well, but we tried, sort of, back in the 80s. When I joined graduate school in 1987 there was CPSR — Computer Professionals for Social Responsibility. How many here were members? Yeah, right — not a lot of us. Back then it was about the reliability of the Star Wars system, the Strategic Defense Initiative, and it was really hard to get people to join up. And the issues CPSR looked at were not just reliability; it expanded into community, workplace, and all that. And it closed down in 2013.

Right — it's actually amazing to me; it is a Greek-tragedy story, when it shut down. Some people went on — Rotenberg, the Electronic Frontier Foundation got started, there are a number of very small organizations — but I think, you know, we were an utter failure as a discipline at really getting behind this the way the physicists did.

And how do you feel? I just remember putting all this energy into it, and it was just, like —

So here is what's interesting. If you look again at the hard sciences, they have, you know, the Union of Concerned Scientists, the Bulletin of the Atomic Scientists; they have had this professional consciousness for years. We have somehow failed to do that, okay? CPSR was the one glaring example. I recently did a little study: what is ACM's purpose? You go back all the way to the founding, in 1947, and people said it is

just to do computing — that is the purpose. Then there is the constitution, last revised in 1998, and it says: to advance computing as a science and profession. I think now we can say, wait a minute — computing is not an end; it is a means. For what? Notice the IEEE tagline: "technology for humanity." Now, of course, a tagline is just one thing; you have to go and ask yourself what it translates into. But it's a beginning — at least you say what the purpose is. So I think for ACM to say it is about computing as a science and profession is self-serving, and it's time for us to be less self-serving.

Was it — come closer so we can hear.

So, two parts I understand; some parts are a bit hazy, particularly that last exchange. One thing I understand is the need to perhaps more rigorously enforce antitrust, which has nothing to do with technology per se but with how big these companies have become — that's a clear bucket of action. There is a second bucket, clear and actionable, which is about, for every individual, unpermitted release of their information — my private information, my activities, and so on. So I see those two as clear buckets where there is an action item for society and government that can be enforced; I see that clarity. I also agree with the comment you made in the last discussion, that this hand-wringing is confusing, because anything that you discover will be used for dual purposes. People forget about the most interesting example, the transistor: when Shockley and Bardeen passed a current at Bell Labs, nobody foresaw what it would be used for — it has been used to kill many people, it has been used for many things, and the chip has had far more impact than almost anything else post–World War II. So I take these three bullets from this talk. Is there any other message you want to give us? Because if I were to ask Shockley to anticipate what would happen — no chance; not Shockley, not anybody.

Well, there is a beautiful story I've read: when the physicists at the Cavendish Laboratory in Cambridge discovered the electron, they were very excited to have discovered a new particle, and there was a party, and they raised a toast: "To the electron — may it never be of any use to anyone." So no, we are no better than the physicists at predicting the outcomes of our technology. But by now, what we should know is that technology down the line will have consequences we cannot predict. I'll give an example. Somebody was invited to give a talk about pico-drones — pico-drones are drones about the size of a bee — and this person gave a talk about the technological issues with pico-drones: limited battery power, limited sensing ability, limited computing power. And I said, you know, I find this idea of a swarm of bees flying around me which are drones really scary, in terms of surveillance capability and, actually, assassination capability — when a bee stings me I'm not going to die from it, right, but this potentially could kill me. I said, have you thought about the possible adverse impacts of this technology? He said, other people should think about that. And that, to me, is a moral stance that is not acceptable anymore — "I'm just an engineer."

Exactly — "I'm just an engineer." Yes — I'm uncomfortable with — can you — you need to speak up, I can't hear you.

I'm uncomfortable with your conclusion, even though I'm comfortable with almost everything you said. First of all, to say just "drive it" — to say "try" — is not good enough. And this conversation is very strange to me because, as you say — Melvin Kranzberg; if you haven't read his laws, read them, or send me an email and I'll send you an essay he wrote when he retired. His first law: technology is not good, it's not bad, nor is it neutral — it will be some combination of the two, right? And the second law — yes — invention is the mother of necessity, which, as a lover of language, I love for the turnaround of the phrase. But basically it says: because it's not neutral, because it will have negative consequences, and because you can't predict the future accurately or completely, you have to clean up your own mess without throwing the baby out with the bathwater. Now the problem is that there's this matter of scale; that changes things, and there's something like an order-of-magnitude rule

that when anything changes by an order of magnitude, the "same" thing is no longer the same thing — whether it be privacy or the rate of change. If we look at history, even back in the industrial revolution, people made a distinction between progress and innovation: innovation is a technical and scientific process; progress is an ethical question. So if we say we are concerned with progress rather than just technological innovation, then that is inherently an ethical question. Now, people don't have to be able to answer the question "what are the ethical implications of what we do?" — but we're not even aware that there is a question to be answered. So the first step is: every decision we make is, as you say, an ethical decision; the question is whether we are making that decision by omission or by commission, to the best of our abilities, and then putting in the safeguards. So you can get to Kranzberg —

So what exactly — you said you're uncomfortable with something I said; can you clarify? I'm not sure I'm getting what you're uncomfortable with.

I would say that there's an emphasis on technology without saying that every technologist is actually not designing technology; they're designing culture and social structures. The technology is just a tool, and we need to take more responsibility for that. If I look at the ACM, for example, in my field of human-computer interaction — you can graduate from almost every university in the world without ever having written a program that was used by another human being, much less been marked on your ability to do so. All of the consequences of technology are human, and we don't respect that within the profession.

That's not a disagreement with me — you're saying it your way and I say it in a somewhat different way. I try to say that it's not that ethics is unimportant, but to pretend that if everybody takes an ethics course, and every company just adopts ethics principles, that doesn't change anything — unless, as Eric said, you translate it into responsibility.

So we come back to this question: it's always been the case, but the rate of change is now very high, and we can't fully regulate in advance, because otherwise we'll build fire exits where the fire walls should be and fire doors where the fire exits should be, and vice versa. So the question is: how early can we start to anticipate things, and how do we put in the structures where we can actually put the controls in place? What we can't do is what Facebook did — just "let's do it until it breaks and then we'll fix it." We could do that in an earlier era; no longer.

No, I'm in violent agreement with you. But then the question is, what are our responsibilities? Well, first of all, I would argue — that's my point — that the line "I'm just an engineer" is, to me, no longer morally acceptable, okay? And the answer to what to do about it is not going to be easy. We have unleashed technology; the genie is out of the bottle, and we can't just say "go back into the bottle." And there is the issue of when to regulate technology, which is a difficult question — you cannot do it too early. So I agree with you; the point is, I want to start the discussion. I didn't come here to give you all the answers; I came here to start the discussion in our community, and there are not going to be clear answers, and there are going to be different opinions. But the point is, it's time for us to engage with society. There is not enough scholarship on the regulation of technology — you go to a politician, and they will tell you, angrily, that they don't know how to regulate, and they need to get good advice. From whom? So we need, as our professional responsibility, to engage with society, recognizing that what we have done is not neutral: it has positive effects and adverse impacts as well. This is, as I said, the story of technology from the very beginning — people live by fire and die by fire, and still, every chair here will have some tag underneath, because we have public policy that says the chair has to be non-flammable, and taking off the tag is a criminal offense — I don't know how many people have gone to jail for taking off a tag. But we need to engage with society on this. Yes, please.

So, what you're saying is that corporations have, in some sense, lost their moral compass, and the only hope is to have government-wide regulation. But I would be very curious to hear how you think that can be enacted, especially when the big tech corporations are in America — a country where regulation doesn't really fly, and where a president has been known to say that government is not

the solution government’s the problem I can’t you see some sort of revolution or like impetus to enact that regulation with public support well I mean we can we can I’m sure spend the next two hours talking about the poly American politics I and today Super Tuesday so we may be wiser today know we may be wiser than of the day but I think by the way that sentence I viewed is one of the most destructive sentences of the 20th century because people really believe it and we tell them excuse me if your house is on fire do you want the fire people took the firemen and women to come and put it up of course you understand these are people from the government I mean we in this country we had you I don’t know if you heard the line for this were from 2016 don’t don’t let the government touch my Medicare this is part of absurd that we had in this country and right now I don’t know that that tenant and very optimistic about politics in this country but we have to think of the long-term and in fact you know one of the argument that I would make is technology partly we are in the pollution in this country and in other countries as well is the result of technology this is a whole other talk I’ve given this talk clear some time ago with the world of technology is great for people like us not being great for the majority of the population and are voting with the middle finger but let’s get away from politics Jonathan I don’t disagree with anything that you or Bella’s said but I would like to let us off the hook a little bit for what we did in the 1980s because until 1995 this art community was small Bombo genius and trusting and and in 1995 a couple things happen one was the internet became kind of legal the other was the mosaic browser came out there and suddenly we weren’t the small homogeneous community in fact the first remailer that allowed anonymity emailed to be sent anonymously came out was from finland in 1992 and it was reviled by many people so so there was an expectation 
that you actually knew everybody who was doing it. And so the big change is that bad actors came in, and you talked about them a little bit, but not very much. In your list of who we should trust, or need to trust, another category was other users, and I think those college, you know, university students tend to be in a homogeneous, trusting environment. So I think the point that we need to really emphasize, that you do need to be thinking about unintended consequences, is important, because we came from a context where that wasn't true, and students are in a context where that isn't true. And it's also a fact that, for companies, for people building something, whether it's a start-up or anything else, if they are going to spend a lot of time thinking about unintended consequences, they're not going to make it to market as fast as they would like to. So it's complex. And the last thing I want to do is just ask you one question, which I think you probably have the answer to, but I haven't found anybody who has given me a good one, so I'm hoping. On the one hand, we want Facebook to respect our privacy as individuals; we want privacy. On the other hand, we want Facebook to identify Russian hackers who are doing everything they can to disguise their identity. How is that even possible? So I'm happy to send you a little paper that I wrote; we have an institute for public policy at Rice, and I wrote it with the director of that institute. It's about the Facebook question, and I think the Facebook question is a very, very challenging public-policy question. So, how many people know what Section 230 is? Some people know. Section 230 is about 26 words, I believe, of incredible importance for our community, and many people are completely unaware of it. Here it is: the 1996 Communications Decency Act, and Section 230 basically said that computing platform owners are not responsible for
content placed by third parties. And that means that Facebook is not liable for content that I put on there, that you put on there, that all the Russian hackers put on there, and this enabled the explosive growth of platforms. Now, Facebook used to just say, we're not responsible, it's not our problem, but that has become politically untenable; they can't just say "we don't care" anymore. Politically, that's the issue for them. So some people are now saying, let's revoke Section 230. It's 26 words; it would be very easy to do, and some people, and in fact I used to think this, say let's just do it, let's make them liable. The problem with this is that Facebook will then control all the speech of about two and a half billion people. Wow. Do you want Mark Zuckerberg to wield power that Xi Jinping doesn't have, to control the speech of two and a half billion people? And I think right now this is a big dilemma, and nobody really has a good solution to it: what do we do with Facebook? And the answer maybe has to do with antitrust: maybe Facebook must, first of all, divest WhatsApp and Instagram, make it somewhat smaller. Even then, there is still a very large company controlling the speech of too many people. I think this issue, of having too few very large players controlling where public speech takes place today, is a huge policy dilemma for us; I call it the Facebook dilemma. And, you know, how many people know about the Popper paradoxes? Karl Popper was a famous philosopher of science, and most people know him from the philosophy of science: he said that for something to be scientific it has to be falsifiable. But Popper, later in his life, formulated the paradoxes, and I think he would be smiling in his grave now, because he said: too much speech leads to the curtailment of speech; too much tolerance leads to the curtailment of tolerance, if you tolerate intolerance; too much democracy leads to the curtailment of democracy. And I think it's the issue of scaling, which we talk about; you guys talk about scaling. Look, I'll tell you where we should have seen the light. There was Usenet in the mid-90s; I was a huge fan of Usenet: you could post a question, you got intelligent answers, there were the flame wars. And then in the mid-90s Usenet opened up, and overnight there were too many people, the signal-to-noise ratio went way down, and I just stopped using it because it was not useful anymore. And we failed to understand the concept of human scaling, even though our discipline is very much about scalability. When it comes to human
scaling, we were just naive and ignorant, and young and stupid, can I say. And this is the point: technology has enabled human scaling in a way that was not there before, and, you know, we didn't see the consequences. Now we understand them better, and now we need to think very hard about this. I don't have an easy answer for what to do, but Facebook, and I'm happy to send you the little paper we wrote about it, is a central policy dilemma. And maybe the answer, well, you see what happened before: suppose you were a Nazi and you wanted to publish a Nazi column. You send it to the New York Times; the Times will not publish it. You try the other main newspapers; they will not. Eventually you will find some small town in some part of the United States, and some small-town paper will publish your article, and about 50 people will read it. So there was freedom of speech, but the amplitude was limited naturally, because there were so many different players. Now you put this article online and it resounds, and we have basically given radicals, or however you want to call them, hateful speech, a megaphone they never had before. And how do we put the genie back in the bottle? I don't know. I mean, we should all think very hard about it. Okay, maybe we should try to take one more question; we have five more minutes. I want to ask a question. Yeah. We all make moral decisions in many ways not related to our work, and as producers of technology that can be misused, I mean, as with Popper's paradox, it probably doesn't make sense for everybody developing technology to spend a large fraction of their time thinking about this. But there are a couple of areas where we have a unique ability to help, and one is in explaining to the public, the non-technical public, how things work, and alerting them to ways in which it can be misused, and spending, spending time on that, which makes a difference, if you
know. And, um, we're in a unique position to do that. So rather than thinking, oh, we should be very concerned about this technology we're producing that can be misused, we should think about how being experts puts us in a position to do something useful. Somebody mentioned Star Wars.

I remember that whole debate about that, and some of us were asked to sign petitions telling the world how this was not feasible. But the people signing it were putatively computer experts who actually didn't have a clue whether it was feasible or not. So I think we also need to be responsible in the kind of advice that we give: are we acting as an expert witness to inform, or are we promoting a point of view? So I'll tell you: we teach this course, Computing Ethics and Society, and make students really think about these issues. This is for junior or senior students, and these students are going to start out, you know, at the bottom rung of the corporate hierarchy, and they say, well, I'm going to go to Google; what power will I have to shape what the product does? And I tell them: you have power in your fingers and in your feet. First of all, I still tell them that the most important power we exercise as citizens is the vote, because many of these questions will end up being political questions; these are not technical questions. The issue of what the proper level of regulation is, is a public-policy question that will end up being decided by society, by the politicians we love to despise, and ultimately that is where the decision will be made. So the power of voting is an important power. We all make decisions as consumers; we should also apply the power we have as voters. The other power they have is the power to decide where they are going to work; that's what I call the power of voting with their feet. And I can already see it with the Millennials; this is a new generation, and they have much more social conscience than the previous generation. Five years ago, Rice students came to Rice to study hard and party hard; now there are people promoting a more tech-oriented Rice, and these students are worried about gentrification, and not for themselves, though they don't realize it will also impact them. But we
have an area of town which is mostly poor neighborhoods, and the students are worried that gentrification will impact those neighborhoods adversely. This is a new phenomenon at Rice; I've never seen students putting up posters and holding demonstrations before. And we had alumni events; we had a career mixer where we let alumni come and recruit for their companies, and we have some alumni at Palantir, and the students had a petition: they wanted us to boycott Palantir. And we had to explain to them why it's inappropriate for us to decide who are the good alumni and who are the bad alumni, but that they are free, if they want, to vote with their feet. So students are already telling me that they don't want to go to Facebook, because they think Facebook is an evil company; there is a bit of an odor developing around Facebook. So some students say, I have many offers, why do I have to go to Facebook? I think with these issues, especially for this coming generation, the corporate image is going to be incredibly important, and corporations that have a negative image will have a hard time recruiting people. And that will be another way, in some sense, that the market will kind of regulate itself, just through individual decisions. So we do have power. There is the power we have as experts, which we need to exercise very, very carefully; there is our power as citizens; but we also, and especially people in our industry, since again most of our community works in industry, have the power to decide when we want to walk. Look at the turmoil that Google is already going through. I suspect that some of the turmoil in Google is what psychiatrists would call transference: people are uncomfortable with the business model of Google, but they can't protest that, because that is Google's business model, so they find ways to protest all kinds of other ethical issues instead of protesting the central issue, which is Google's business model. Okay, let's wrap it
up. [Applause]