title: Privacy could be the next big thing
url: https://kryogenix.org/code/privacy-could-be-the-next-big-thing-hackference/
hash_url: 6f628fc1d3
About privacy, and how people are scared and uneasy about what's being done with their data. And how we need to stop building new technology and start working out how to explain to everyone that it is possible to build a world where you don't have to feel exploited and frightened and you still have all the same internet superpowers that you have today.
This is the text version of the talk; not a transcript, but my working copy.
Hi, my name's Stuart Langridge, and I'm going to talk about how privacy could be the next big thing. Emphasis on the could.
Who in this room thinks that the amount of data that's collected on all of us is a bit much? Who in this room thinks that the uses to which those data get put are a bit... creepy?
In 2012, Target, the American discount store, put together a list of 25 products that, when purchased together, indicate that the purchaser is likely pregnant. Then they mailed out coupons for baby products to prospective mothers... and one recipient's father stormed into his local Target and demanded to see the manager. "My daughter got this in the mail!" he said. "She's still in high school, and you're sending her coupons for baby clothes and cribs? Are you trying to encourage her to get pregnant?" And then a couple of days later he apologised profusely when it turned out she WAS pregnant.
Google's chat app, Allo (well, one of Google's fifty chat apps) includes the Google Assistant, which will "helpfully" add things to conversations you're having with others... such as locations from your personal maps or things from your personal searches, visible to the person you're chatting to. Some of this stuff is apparently a bug and it's been fixed.
https://www.recode.net/2017/3/13/14912394/google-allo-search-history-privacy-messaging-app
Women are less likely to be shown ads for high-paying jobs.
https://www.theguardian.com/technology/2015/jul/08/women-less-likely-ads-high-paid-jobs-google-study
If your social media friends have bad credit ratings, it could be harder for you to get a loan.
http://money.cnn.com/2015/08/04/technology/facebook-loan-patent/
Uber, who are the Lord High Chancellors of this sort of thing, have tracked drivers who were attending taxi protests and fired them, have built an internal app called "God View" which continues to track you after you get out of their cars, and mined their dataset to identify "Rides of Glory" -- rides taken to get home after one-night stands.
https://motherboard.vice.com/en_us/article/53ddwb/uber-knows-too-much-about-you
They retracted that one. Because even they were aware that this sort of thing is creepy. Really, really creepy.
Isn't it great to live in the 21st century? Where deleting history has become more important than making it.
https://twitter.com/aianhangover/status/466813236520964097
There used to be a saying. If you're not paying for the product, you are the product. Now, this has always been various levels of untrue. Sometimes you're paying for the product and you're the product. If you're getting a thing for free that doesn't mean that you've agreed to be exploited in every single imaginable way.
"There is no correlation between how much money users pay and how well they're treated."
But leave all that aside for now. When people started bandying that expression around, you heard a lot of people saying... cool, I'm OK with being the product, whatever that means, because I'm not really interested enough to pay. And, hey, if you're gonna show me ads, which you are because I'm not gonna pay, then they might as well be ads for stuff I might wanna buy, right? TV ads are rubbish, or at least they were back in the days when anyone watched actual TV, as far as I can tell. What's different now is a word I've used already, and a word we're now hearing a lot. Creepy. From actual people, from media articles, from friends in the pub and people on the train and colleagues in the office. It's creepy. What do they mean?
In Anathem, Fraa Jad describes it as "an intuition of the numinous, combined with a sense of dread." This is true but nobody knows what it means. Still, the bloke saying it was hundreds of years old, so perhaps we can do better.
The issue here is aggregation. Emergent phenomena. The whole of data science is oriented around taking a big pile of facts about a thing and using them to deduce new facts that you weren't told. It's what Target did. It's what Sherlock Holmes did. Take data you've got and use it to derive new and surprising conclusions. It's a cool trick... when you watch it happening to someone else.
This is not the face of someone who is pleased and delighted by their user experience. People do not like it when you do this.
Companies. Learn this. Your data collection is creepy when you use it to deduce things you weren't told and shouldn't know.
Doing this is what data science is for. So there's something of a mismatch here. And it's not exactly new. Supermarkets are laid out in an incredibly precise way. Vegetables at the beginning because it communicates freshness. Bakery near the entrance because it smells nice. Stuff everyone buys is at the back so you have to walk through everything else to get it. It's the same reason that the only exit from the airport is through the duty free shop. It's an incredibly powerful technique.
"Every aspect of a store's layout is designed to stimulate shopping serendipity"
https://www.realsimple.com/food-recipes/shopping-storing/more-shopping-storing/grocery-store-layout
So people find this weird and unpleasant! And the worst thing is that they're helpless. They're trapped. Because there's nowhere else to go.
There are a bunch of stock answers for what you should do about this. And they're all wrong.
You can't opt out by not using this stuff at all. We're part person and part machine now. And that's OK. You'll never be lost again. Horror films now have to come up with spurious rubbish excuses why everyone's mobile phones don't work otherwise all the suspense is gone. You can listen to any music you want. You can video chat with someone on the other side of the world. Louis XIV couldn't manage any of that. These are superpowers, and we shouldn't have to trade them away.
"If you leave your phone behind, it's like missing limb syndrome" - Elon Musk
https://waitbutwhy.com/2017/04/neuralink.html
You can't regulate this problem away. The EU have done some work on this; India have declared privacy a fundamental human right. Government regulation is needed. But it's too slow, too easy to work around if you can constantly stay one step ahead, too likely to not happen. Fill in your own reasons why.
"The right to privacy is fundamental" - Indian Supreme Court
There are worries about what governments themselves might do, too. John Stuart Mill was a noted philosopher who wrote a lot about things like freedom of speech. But one of the more important things he said has been kinda forgotten; that "laws passed by governments are about the ninetieth most important restriction on our freedom of speech".
http://blog.danieldavies.com/2002/10/free-as-bird-im-profound-believer-in.html
Americans make a big deal about their First Amendment, and rightly so; people outside America often make a big deal about the First Amendment and are then rather surprised to discover that they don't have one. Government regulation is at best a part of this answer, and it's not the lead part.
You can't convince people by constantly having a go at them about it. Then you're a whiny person who annoys their friends. The Overton window is not far enough in that direction yet. If you say to someone, "use this different messenger! It doesn't matter that your friends aren't there, because you'll be right and they're wrong!" then... they will not listen.
"So....Mozilla knows, right, that 'privacy' has never been an effective selling point for software? Like ever?"
https://mobile.twitter.com/sarahmei/status/882008927516463104
And you can't get a new public who do care.
"The children of the revolution were faced with the age-old problem: it wasn't that you had the wrong kind of government, which was obvious, but that you had the wrong kind of people." - Night Watch, Terry Pratchett
This is the wrong kind of thinking. Which was, obviously, Pratchett's point.
"More than 70% of people would reveal their computer password in exchange for a bar of chocolate, a survey has found."
http://news.bbc.co.uk/1/hi/technology/3639679.stm
Who's prepared to tell me their password? I have a chocolate bar here.
And the fix is not technology. The tech is not the hard bit. There's loads of tech. Signal. Matrix. Purism. Privacy Badger. VPNs by the dozen. Password managers by the dozen. Tor.
What there is here is a chilling effect. People are frightened of what might happen with their data, because they don't know. And they don't like the things they can imagine might happen. This isn't being worried about imprisonment, or having your illegalities found out; most people don't really have any. The worry is about other things. Your rep is the big one. We're loath to do things because we don't know what will be done with the data and we fear that unknown. That's a chilling effect; something which isn't a ban, but discourages people anyway.
"Freedoms are not being taken away, we are just afraid to use them"
The Social Cooling people summarise it like this:
"If you feel you are being watched, you change your behaviour."
Ideally people really would dance like nobody's watching. But hardly anyone does.
https://www.socialcooling.com/
But everyone's still involved because they've got no choice.
but what if there were a choice?
Whoever gets this right, whoever works out how to tell this story, will define the next ten years. Mobile changed everything; it changed how we look at the world, put power in your hands, made billionaires and made industries. Everything old was new again; viewed through a new lens. Social networking changed everything: it changed how we look at the world, it put power in your hands, it made billionaires and made industries and everything changed when we viewed it through a new lens. And that's what privacy could be. Go back in time and tell someone with one of those phones from the Matrix that everything will be done on mobile phones in twenty years. Everything will be done only on mobiles. Go back in time and tell someone on Six Degrees -- anyone remember Six Degrees? First social network. -- that everything will be done on social media in twenty years. It'll elect presidents. Now go forward in time and see a world where your data is yours and everything still works, and tell them that there was a point where we felt like we had to give that stuff up. And they'll laugh at you and ask where your penny farthing is.
People want this fixed. 82% of people are not comfortable with the sale of their data to third parties in exchange for speed or convenience or product range.
Half of all people have avoided doing some basic stuff online because they have concerns about how their data will be used.
Here, finally, is an industry that truly needs disrupting.
And that's how you disrupt an industry. It has been said that if you build a better mousetrap, the world will beat a path to your door. This is a terrible lie and it ruins people's lives, and if I could go back and pour a pot of coffee into Ralph Waldo Emerson's lap I'd do it in an instant. According to Wikipedia, there have been 4,400 patents awarded in the US... for mousetraps. And nobody can name a mousetrap inventor. It's rubbish.
The way you overcome an incumbent business is by doing battle on a field that they can't compete on. Not that they don't, or won't. Can't. How did Apple beat Microsoft? Not by making a better desktop OS. They did it by shifting the goalposts. By creating a whole new field of competition where Microsoft's massive entrenched advantage didn't exist: mobile. How did Microsoft beat Digital and the mainframe pushers? By inventing the idea that every desktop should have a real computer on it, not a terminal.
How do you end up shaping the world? By inventing a thing that the current incumbents can't compete against. By making privacy your core goal. Because companies who have built their whole business model on monetising your personal information cannot compete against that. They'd have to give up on everything that they are, which they can't do. Facebook altering itself to ensure privacy for its users... wouldn't exist. Can't exist. That's how you win.
The beauty of this is that it's a weapon which only hurts bad people. A company who are currently doing creepy things with your data but don't actually have to can alter themselves to not be creepy, and then they're OK! A company who are utterly reliant on doing creepy things with your data, and that's all they can do, well, they'll fail. But, y'know, I'm kinda OK with that.
So. People want this. Everyone finds all this data collection stuff to be at least a bit unnerving. And this is no longer a conversation which just has geeks like me in it. The Daily Mash tells jokes about this stuff.
http://www.thedailymash.co.uk/news/society/privacy-experts-too-paranoid-even-for-lunch-2014082989998
"Facebook do weird things with your data" is a mainstream opinion. Tin foil hats are a fashion item now.
https://www.etsy.com/ca/listing/55473505/knit-tinfoil-hat-made-to-order
The world is ready to be convinced. Eager to be convinced.
How do we do that? No idea. I wish I had an easy, glib answer here, and I don't.
Differential privacy is not a bad one. This is the thing that Apple were talking about last year, from the Dwork, McSherry, Nissim, Smith paper.
https://link.springer.com/chapter/10.1007%2F11681878_14
The idea here is that you want to get aggregate information from your users, but you want to set things up so that you can't tell what anyone's answer actually is, or even if they participated. Here's how it works. I should be clear: this is kinda the Jackanory version of how this works; don't try this out on any cypherpunks you happen to know. But it gives a flavour. Read the paper and all the work since for the detail.
Let's say I want to know the proportion of people at Hackference who like chocolate. So, I give each person a card that says "YES", and one which says "NO". Then I say: flip a coin. If the coin comes up heads, you tell the truth, and put the true answer in the box. If it comes up tails, then flip the coin again. If that second flip is heads, you answer "YES", regardless of what your actual feelings are. If it's tails, you answer "NO".
So you've got Alice, Bob, Cantrice, and Darsh. Alice and Bob like chocolate; Cantrice and Darsh don't. 2 do, 2 don't. Alice flips a coin, gets heads, tells the truth, and we get a YES in the box. Bob flips, gets tails, flips again, gets tails again, and puts NO in the box. Cantrice flips, gets heads, tells the truth, puts NO in the box. Darsh flips, gets tails, flips again, gets heads, puts YES in the box. So in the box, there's 2 YESes and 2 NOs, which is nice and accurate!
OK, this is the most rigged demo of all rigged demos ever. The point, though, is that it does roughly work. You need to analyse the results more carefully than this, but I'm not gonna put up lots of equations about standard deviations here.
The thing is, what we're doing is adding noise to the answers. This is called the "randomised response mechanism", and it was thought up in the sixties to help people answer questions about embarrassing or illegal behaviour in secret while still giving meaningful answers. And if you have lots of people answering, then individuals can leave their answers out entirely and you still get a roughly accurate response. And the state of the art is much more advanced than this. You can do data science without being creepy about it. This is known stuff. The methods exist.
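The coin-flip scheme above, and the "more careful analysis" I'm waving my hands about, can be sketched in a few lines of code. This is a minimal illustration with fair coins, not a production mechanism; the function names and the 30% chocolate figure are mine. The key bit of analysis: a respondent answers YES with probability 0.5 × p + 0.25 (half the time they tell the truth, a quarter of the time a forced YES), so you recover the true proportion p by inverting that.

```python
import random

def randomized_response(truth: bool) -> bool:
    """One respondent's answer under the coin-flip protocol."""
    # First flip: heads means answer honestly.
    if random.random() < 0.5:
        return truth
    # Tails: a second flip decides the answer, ignoring the truth entirely.
    return random.random() < 0.5

def estimate_true_proportion(answers) -> float:
    """Undo the noise: P(YES) = 0.5 * p_true + 0.25, so p_true = 2 * P(YES) - 0.5."""
    observed = sum(answers) / len(answers)
    return 2 * observed - 0.5

# Simulate 100,000 respondents, 30% of whom genuinely like chocolate.
random.seed(42)
truths = [random.random() < 0.3 for _ in range(100_000)]
answers = [randomized_response(t) for t in truths]
print(f"estimated proportion: {estimate_true_proportion(answers):.3f}")
```

No individual answer in the box tells you anything certain about that person -- any recorded YES could be a forced YES from the second coin flip -- but with enough respondents the estimate lands close to the true 30%.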
What we need to do is come up with a way to help people understand that there are ways to never be lost again, and to listen to any music you want, and to video chat with someone on the other side of the world, without them having to feel disquieted about it. That it's not OK that you're made to feel weirded out. That it's possible for there to be alternatives. That having to feel someone rooting around in your life is not a price you should have to pay.
What we're currently presented with is a false dilemma. It's been painted as a choice: you can opt out and cut yourself off from superpowers, or you can give this stuff up to pay for them, and that's it. What we need to do is change that story. Help people understand that it can be different.
These ideas, these alternatives, they'll come from us. People in this room and rooms like it. Who's building the next big company? You are. But when you do, talk about the story. About what will change all this. People are scared and they shouldn't have to be. When you're talking in the hallways, when you're building your companies, when you're hacking on projects, talk about the story that you want in people's heads. Those of you who are curators of the user experience know this -- that that's not really about the font you use or how round the buttons are. It's about the view of the world that you're making. Because the world we've got, people don't like, and so far nobody's managed to explain that it doesn't have to be like this. We can't shout at people about it; we can't tell them to opt out; we can't wait for government to save us or people to spontaneously learn. We need to explain. To teach. To help people to understand so that stuff which is really obvious to us becomes obvious to everybody. And when that happens, everyone responds. The world really does change then, because we shift away from "hey, this one weird thing over here protects me, but how important is that?", and everyone starts saying "what do you mean, you can aggregate my data and make predictions and do what you want with it? why would I ever want that? nobody else does that!" The see-saw tips. The story changes.
Talk about how we change the story.