183 RR Consequences of Technology with Ben Hammersley

The Rogues discuss the consequences of technology with Ben Hammersley.



[This episode is sponsored by Hired.com. Every week on Hired, they run an auction where over a thousand tech companies in San Francisco, New York, and L.A. bid on Ruby developers, providing them with salary and equity upfront. The average Ruby developer gets an average of 5 to 15 introductory offers and an average salary offer of $130,000 a year. Users can either accept an offer and go right into interviewing with the company or deny them without any continuing obligations. It’s totally free for users. And when you’re hired, they also give you a $2,000 signing bonus as a thank you for using them. But if you use the Ruby Rogues link, you’ll get a $4,000 bonus instead. Finally, if you’re not looking for a job and know someone who is, you can refer them to Hired and get a $1,337 bonus if they accept a job. Go sign up at Hired.com/RubyRogues.]

[This episode is sponsored by Codeship.io. Don’t you wish you could simply deploy your code every time your tests pass? Wouldn’t it be nice if it were tied into a nice continuous integration system? That’s Codeship. They run your code. If all your tests pass, they deploy your code automatically. For fuss-free continuous delivery, check them out at Codeship.io, continuous delivery made simple.]

[This episode is sponsored by Rackspace. Are you looking for a place to host your latest creation? Want terrific support, high performance all backed by the largest open source cloud? What if you could try it for free? Try out Rackspace at RubyRogues.com/Rackspace and get a $300 credit over six months. That’s $50 per month at RubyRogues.com/Rackspace.]

[Snap is a hosted CI and continuous delivery that is simple and intuitive. Snap’s deployment pipelines deliver fast feedback and can push healthy builds to multiple environments automatically or on demand. Snap integrates deeply with GitHub and has great support for different languages, data stores, and testing frameworks. Snap deploys your application to cloud services like Heroku, Digital Ocean, AWS, and many more. Try Snap for free. Sign up at SnapCI.com/RubyRogues.]

CHUCK:  Hey everybody and welcome to episode 183 of the Ruby Rogues Podcast. This week on our panel, we have Avdi Grimm.

AVDI:  Hello from Pennsylvania.

CHUCK:  Jessica Kerr.

JESSICA:  Good morning.

CHUCK:  I’m Charles Max Wood from DevChat.TV. I’m going to give you a quick reminder. If you do JavaScript go sign up for JS Remote Conf at JSRemoteConf.com. We also have a special guest this week and that is Ben Hammersley.

BEN:  Hi there.

CHUCK:  Do you want to introduce yourself really quickly?

BEN:  Sure. I’m a writer and a broadcaster from the UK but based really on planes around the world. I lecture around the world on the effects of technology on society and consult for governments and the military and so on around the world. And then at the moment I have a BBC TV series called, and you have to say this in a silly voice, it’s called [clears throat] Cyber Crimes with Ben Hammersley.


CHUCK:  [dun-dun-dun-dun]

BEN:  Dun-dun-dun-dun, exactly, which is about the evil world of [clears throat] Cyber Crime.


BEN:  So, there you go. We can talk all of these things.

CHUCK:  Do you have sidekicks like Doctor Who and Sherlock?

BEN:  They are my bitches, yes.


CHUCK:  Awesome. So, I have two very pressing questions I have to ask before we really get going. The first one is: are you the one that coined the term podcast?

BEN:  [Laughs] Oh, god. This follows me everywhere I go.


BEN:  Yes, yes. I am. But, and I’d love to say that it was some form of heroic word creation story where I went to the top of a mountain and discussed the creation of this word with a monk and then brought it back down and gave it unto the universe. But actually, the story is that I was writing in 2004 I think it was. I was writing a story in The Guardian, a newspaper here in the UK, about this new phenomenon of automatically downloading audio files via the medium of a specially configured RSS feed.

And it was very, very close to the deadline. And my copy editor said that it was a couple of sentences short and please could I write a couple of extra sentences because it was very, very, very close to the deadline. And they didn’t want to have to pad it out themselves. And so, I wrote this sentence where I said something like, “But what shall we call this new phenomenon?” and then I made up three or four different new words. And one of those was podcast. And that was published.

And then about a year later, I had an email from the Oxford English Dictionary people saying, “We see you wrote this sentence. Where did you come up with the word?” And I said, “Well, you know, too much caffeine and it was five minutes to go before the newspaper had to be printed.” And they said, “Oh, jolly good.” And so, I didn’t know. I was sort of unaware.

JESSICA:  Wow. So, it’s official. You got an email from the Oxford English Dictionary.

BEN:  [Chuckles] That’s right.

AVDI:  I just realized I have a new item for my bucket list.


BEN:  I think it was, I don’t think they do that all the time. I think it was just for this particular case because it was word of the year that year.

AVDI:  Oh, wow.

BEN:  And so, ever since then I’ve been expecting either a trophy or a diamond-encrusted iPod to arrive in the post or something like that, but…

AVDI:  [Laughs]

BEN:  Neither of these things happened at all, which is very sad.

CHUCK:  I can’t afford a diamond-encrusted iPod, but what’s your address?

JESSICA:  [Laughs]

CHUCK:  I’m just kidding.


BEN:  The thing is if I tell anybody my address, then all of the anti-podcast people will be [inaudible], the anti-podcast jihad is quite fearsome. So, I don’t…


BEN:  You know, I don’t want to get doxxed by the anti-podcast anonymous. They’re fearsome. They’re all reel-to-reel guys that you know, [takes breath] terrifying guys.

AVDI:  They’re still mailing the tapes around?

BEN:  Yeah. And in fact, people still do that obviously for major amounts of data transfer. And so, it’s just…


BEN:  It’s like a cosplay version of that. It’s very, very weird. But you know, I’m scared of them. So, sorry.

CHUCK:  [Laughs]

BEN:  Just put London and it’ll get to me eventually.

CHUCK:  I need a mustache like yours, too. That was the other thing I wanted to bring up.

BEN:  Well, that’s what you get when you invent a word, actually. It just appears overnight.

JESSICA:  [Laughs]

CHUCK:  [Chuckles] Oh, is that all it takes?

BEN:  That’s right. That’s why if you look in all those old-timey pictures they all had quite impressive facial hair. And it’s because at that time, there were fewer words in the dictionary. And so, more people were inventing more of them. And so you know, there was a lot more space in the world to invent new words. And so, that’s why they all have such impressive facial hair. Dickens, massive mustache.


BEN:  Mark Twain, massive mustache. And it’s because they just invented a lot of language. Nowadays, the dictionaries are almost full. So, we’re in the scarcity phase. So, that’s why you don’t see quite as many impressive handlebar mustaches anymore.


CHUCK:  Alright, well…

BEN:  Well, really I feel we’re weaker as a race. We’re weaker as a civilization. Maybe we should start afresh with a new, less populous language. And then everybody can reclaim their hirsuteness.

CHUCK:  Well, you convinced me. I’m giving up on personal grooming. Anyway, I watched your talk. I really enjoyed it. I never really thought about those effects of Moore’s Law.

BEN:  Mm, yeah.

JESSICA:  Can you explain your view of Moore’s Law, Ben?

BEN:  [Laughs] Well, there are lots of different ways of looking at Moore’s Law. I think the way that I’ve put it in that talk and many other talks I’ve given since then is that the general populace really has a hard time understanding the exponential nature of Moore’s Law, that doubling and doubling and doubling again, because we don’t really have a way to grasp it in a common-sense way. We don’t have a way to grasp just how big those numbers get, how quickly. So, that’s the first thing: we just don’t instinctively understand it.

The second thing is that we don’t understand the ramifications of it. And there are two, I think there are two main ramifications. The first one is that if it already exists and it’s a little bit rubbish, then Moore’s Law basically means it’s going to be brilliant in a few years’ time.

So, the example I give of that is say digital photography. Kodak invented digital photography in the late 70s and it was terrible. It took photographs that looked like a chessboard. At the same time, Kodak was making Kodachrome film, which is the most beautiful transparency film in the history of the world. And so, they looked at digital photography and basically dismissed it because the results weren’t very good. So: we’ll just keep making film and we won’t make digital photography.

And of course, the disadvantage of that is that they entirely forgot about Moore’s Law, the idea that once it became possible, then simply through that accretion of processing power it would become good enough eventually. And so, at roughly the same time that Kodak went bankrupt, Instagram, the app which takes pictures that look as if they were shot on Kodachrome film, was sold for 800 million dollars. So, it’s the underestimation of precisely how damaging Moore’s Law can be to incumbent businesses, that’s the first thing.

And the second one is related to that, which is that if we can dream it up, it’s going to happen. And so, we’re starting to see that this year, especially with artificial intelligence, that we’re seeing AIs coming onto the market which are profoundly science fiction-y but are available in the stores.

JESSICA:  Really? What’s an example of that?

BEN:  Well, two things in the past couple of weeks. Amazon Echo for example, the device that Amazon started to sell or announce last week, which is a household appliance which just sits in the middle of a room plugged into a standard wall socket. And it’s basically the Amazon equivalent of Siri or Google Now or Cortana or any of those voice-activated interfaces that goes out to the cloud. It does web queries. I think it talks to Wolfram Alpha and things like that. And then you can also ask it to do things. You can say “Put such and such thing onto my shopping list,” and it will automatically add that item to your cart within Amazon.

Now, if you think about all of the different steps that that takes: the natural language processing, the understanding of what the question is, going out to all these different systems and finding the answers to your questions. All of that technology is actually a whole load of really quite deep AI stuff. And Amazon is selling it for 99 dollars. So, we have a household AI, or the beginnings of a household AI, for less than 100 bucks.

And again, going back to the first thing about Moore’s Law, the capability doubling every year or so, even if Amazon Echo is rubbish today, or not as impressive as you might hope it to be today, in five years’ time, in ten years’ time, which is still not a long time to go, we could be looking at technologies we can’t possibly imagine today. And it’s the same looking backwards. If we think about the technologies we have today and go back ten years, what didn’t exist? Well, smartphones didn’t exist. YouTube didn’t exist. Facebook didn’t exist. Twitter didn’t exist. Think of all those different things that weren’t around ten years ago and how they’ve transformed the world. Now, think of all of those things but with an additional six or seven or eight cycles of Moore’s Law, 64, 128, 256 times as powerful as they are today. And what we find is we can’t really imagine what’s going to happen. And that’s a fundamental shift.
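The doubling arithmetic Ben is gesturing at can be sketched in a couple of lines of Python (the cycle counts are just the ones mentioned above; this assumes the simplified "one doubling per cycle" framing used in the conversation):

```python
# Moore's Law as discussed here: capability roughly doubles each cycle,
# so after n cycles a device is 2**n times as capable as today's.
for cycles in (6, 7, 8):
    print(f"{cycles} doublings -> {2 ** cycles}x as powerful")
# prints 64x, 128x, 256x
```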

CHUCK:  So, we put Echo in our house which is listening all the time. And then we have things like Facebook that is such a privacy-minded company. And these things start talking to each other and have these capabilities. Some of it is frightening and some of it is exciting. And I’m not sure which is…

BEN:  Yes.

CHUCK:  Which I’m more of, afraid or excited.

BEN:  I think it’s both of those things. And this really comes to one of the other effects of Moore’s Law, which is that the capability of a technology, and the growth in that capability, far outpaces society’s ability to transform to deal with it. Even simple things like etiquette: you know, it took us 20 years to learn not to have your phone on in a restaurant, for example. And to this day, many people still keep their phones on in cinemas and things like that. Because it takes maybe a generation for people to learn how to deal with technologies at that very basic social level.

And when we start adding in all of these, what are effectively miracle technologies, technologies which to non-technical people are absolutely indistinguishable from magic… Amazon Echo, like you say, is a box that sits in the corner of your room and listens to everything you say until it hears its name. And then it listens more carefully and goes out and does what you want it to do. Now, if you were to take that piece of equipment and show it to somebody who didn’t understand technically how it works, it’s a god, basically. It’s awfully magical.

JESSICA:  It’s alive.

CHUCK:  Mmhmm.

BEN:  Yeah. Or if it’s not alive, then it’s certainly deeply mystical. It’s deeply powerful. It’s a crystal ball. It’s a shrine. It’s whatever you want to call it. It’s something like that. And so, you can easily see how non-technical people will be either completely freaked out by it, or awfully bewildered by it, or completely bewitched by it, one of those things. And that’s where we get into what I think are the fundamental discussions we need to have about technology in this part of the decade, which is that all of these technologies are now becoming so ubiquitous and so powerful at the same time that they cease to be technological problems and they start to be political and social problems.

CHUCK:  Well, that’s a point that you made in your talk, was that our legislators don’t understand the technology that we have now. And since it’s moving at an exponential pace, it’s moving twice as fast in a year or so as it is now and twice as fast as that a year or so later. So, how do we have the conversations so that they have even a clue of what they’re talking about or legislating against?

BEN:  Indeed. And in fact since I gave that original talk, I’ve changed my mind a little bit about that argument. What I was saying then was that legislators, politicians, and so on need to be educated, to be brought up to speed with the implications of these things so that they can make good choices around these things.

My current view is different. I think that actually what we’re seeing is the increasing irrelevancy of those existing power structures. And so, it’s not so much that we need to educate the politicians about the technologies. It’s that we need to protect society from the death throes, as it were, of those politicians or those power structures as they realize that modern technology is forcing them into a situation where they’re no longer relevant.

I think a good example of this is the financial world. If you’re the Treasury Secretary, or the Chancellor of the Exchequer here in the UK for example, 20 or 30 years ago you had tools that you could play with where you could affect the economy. Today what we’re realizing is, because of the network, because of the technologies that are involved in the financial system, the Finance Ministers, the Treasury Ministers, Chancellors, all of those people, their levers are now floppy. They don’t have the efficacy within the systems that they used to have.

And so, what we’re starting to see, certainly in Europe, is the collapse of the political system, or at least a crisis within the political system. Not because of their lack of understanding of the technology. A lot of them probably do understand the technology. It’s that the technology renders them obsolete in very subtle ways. And so, it’s more a matter of protecting ourselves from the counterblast of that.

And certainly in Europe we’re starting to see the rise of political parties who are very much, I would call them militantly nostalgic, in that they want to roll back modernity because it’s freaking them out, because all of the institutions that they hold dear and all of the ways of governing that they hold dear, all of these power structures within society that they hold dear, are falling apart because of digital technologies.

AVDI:  So, this is interesting to me. And I feel like there’s a flipside to this. I’ve spent most of my life up ‘til now very much feeling like the new structures that the technology enables are going to sort of grow up around these old structures and render them irrelevant. And I saw that as a mostly positive benign thing. But lately I’ve actually started questioning that.

We were just talking a little bit before the show about how I started watching an interesting BBC documentary series called All Watched Over by Machines of Loving Grace, which talks a bit about the rise of what they call the Californian Ideology, this libertarian, techno-utopian idea that the new self-organizing systems would be able to completely obviate the need for all the old political systems. And you see that I think more than ever in some of the stuff that comes out of Silicon Valley, where there’s almost a religion of disruption. But often it seems like there’s very little ethics that goes along with that religion of disruption. And people just plow ahead thinking that as long as you’re disrupting you’re doing good. And it’s starting to look like a lot of times that’s not really necessarily the case.

BEN:  Yes, that’s right. And we’re seeing some very good examples of that today, the day that we’re recording this, with Uber for example. Uber, the taxi company: allegedly last night one of the senior executives at Uber was at a dinner where he said that he was going to form a team within Uber to start doxxing journalists who wrote things critical of Uber.


BEN:  Yeah, which is causing enormous amounts of fuss. But Uber themselves, especially around Europe, are incredibly controversial in that they come into markets, specific cities, with disruption above all else as their mantra. And the local context is always much more complex than they necessarily seem to acknowledge. And the local context here, specifically around taxis, has grown up in particular ways for certain social reasons and in certain social places. And so, that sort of Californian ideology comes across in places other than San Francisco as being a little bit sociopathic. And for many technology companies, they come across as being completely sociopathic, or if not sociopathic then certainly, as a corporate entity, somewhere on the autism spectrum.


BEN:  Google used to be like this. Google’s original attempts at social networking were always a little bit spectrum, right? Remember six or seven years ago when they launched, I think it was called Google Wave maybe, whichever one it was, one of their social network attempts. One of the big problems with social networks is that if nobody’s a member of it then it’s useless. But if lots of people are members of it, say like Facebook, then it suddenly becomes incredibly useful for the members. And so, you have to get a social network up to critical mass for it to be useful.

And so, the thing that Google did that time was they said, “What we’ll do is we’ll enroll every Gmail user into this network automatically and we will automatically friend the top ten most contacted people in your email inbox. And we’ll let everybody see who everybody else’s friends are.”


CHUCK:  [Chuckles]

BEN:  Now, if you’re an engineer who doesn’t leave the cubicle in Mountain View then your top ten friends are your top ten friends. And that maybe seems an entirely reasonable thing to do. However, if you’re somebody who’s having an affair, if you’re somebody who is running away from an abusive spouse, if you’re somebody who has two groups of friends who should never really meet [inaudible].

AVDI:  [Chuckles]

BEN:  Whatever it is, whatever human [inaudible] or just human situation that you have, then those boundaries between those people are very, very, very important to you. And when you wake up that morning and you find that Google has automatically enrolled those ten people in your social network and is showing everybody everybody else’s friends list, then really bad real world things can happen. And that was because of this lack of understanding, from say Californian companies, of the wider context of stuff within the digital network.

Now, that lack of understanding of the wider context that technology is found within is pretty much universal. We see that, everybody has this. Everybody sees technology through their own frame. And so, companies see the technology they make through their own frame. Politicians see it through their frame. Every individual user sees it through their frame. And so, half of the debate around new technology is just a mismatch of context where people don’t quite see that a particular technology that they use in one way might be used in a completely different way by other people under completely different circumstances.

CHUCK:  Yeah.

JESSICA:  This means that as developers, as people who create software, we have a responsibility to think about people in contexts other than ours?

BEN:  Yes. Think about the ubiquity of the sort of technology that we’re talking about. And if you’re a mobile developer, or you’re building something that people access through their mobile device, then that technology’s incredibly intimate and important to that person. It’s carried around in their pocket. It’s never more than a meter away from them at any time in their life, right? It’s in their pocket. It’s on the little table by the side of their bed. It’s on the little shelf in their bathroom when they’re having a shower. So, these technologies are incredibly important to people and have access to the most important and intimate parts of people’s lives. And if you don’t understand that, then you’re dangerous. You’re actively dangerous.

As a technology industry, we’ve spent maybe the past 15 years pointing out how cool it would be if people adopted these new technologies into their lives and made them a fundamental bit of their lives. Now, people have done that. They’ve adopted these technologies and made them a fundamental part of their existence. And with that comes, from the developer’s point of view, a huge moral responsibility, because you could really ruin people’s lives, like genuinely ruin people’s lives. And not just ruin their lives; you could get them killed. And if your instant reaction is, “There’s no way I could get somebody killed through a piece of software,” then you’re just not thinking widely enough, or around enough people’s lived experience, to find a way that your piece of software could get somebody killed.

And so, as an industry we need to have these discussions and we need to watch over each other, to say to other developers, “Dude. Make sure the data you’re dripping doesn’t get somebody into trouble,” or whatever the case may be. Because we’ve reached the level of ubiquity and the level of capability where mistakes like this could be really, really bad.

JESSICA:  Ben, can you give an example of software that you’d never thought would get somebody killed but could?

BEN:  Well, any form of communications technology. So, we’re talking about any form of communications technology where you might be using it to talk about things which become unpopular in the place that you live.


BEN:  Now, in the United States, in the United Kingdom and countries like that, you’re not likely to get rounded up and shot in the back of the head for an email that you wrote. But if you are creating a communication system that’s going to be used globally, then you are going to have users which are going to be using it in those places.

This talks to a thing that happened a few weeks ago which was the new head of GCHQ here in the UK, GCHQ being the British NSA basically. There’s a new guy there and he did an essay for the Financial Times when he started his job where he said that it was disgraceful that major tech firms, specifically Apple and Google, were not allowing GCHQ and NSA to access encrypted data on their devices, and specifically that it was completely unacceptable that Apple and Google were implementing crypto on the devices that even Apple and Google couldn’t break. Because that would fundamentally weaken the capability of GCHQ or NSA to fight terror, as an example.

Now in many ways, he’s entirely right. We have to admit that there are bad guys out there who want to kill us. And we have to admit that it is the job of those intelligence agencies to find these people and stop them from killing us. And it would be, in the best possible world, probably a good idea to give them the tools that they need to do that. However, what this guy was forgetting was that the wider context is that Google and Apple make devices which are sold everywhere. And so, the very same backdoor that would be given to GCHQ or the NSA in order to find really, really bad guys could be used, for example, by the Chinese State Police to find pro-democracy demonstrators, because iPhone 6’s are on sale in China. And so, Apple’s responsibility is not to the safety and sanctity of the US and the UK and the Western world. Apple’s responsibility is to the safety and sanctity of its users. And many millions, if not the majority, of its users will be in places where the state is actually the bad guy.

And so, this is the sort of example of how what might be seen as a good thing in certain circumstances, allowing for a judicially overseen backdoor into Apple devices so that, given a court order and enough oversight, the NSA could use it to find the guy who’s got the ticking bomb that’s going to blow up Manhattan, that might be a particularly good thing to have. But having it actually endangers potentially hundreds of thousands of people who have iPhone 6’s in much more dubious regimes around the world. And this is because everybody is looking at these devices from a different context.

JESSICA:  That’s fascinating. So, the technology of software and the internet with all its new interconnections that didn’t exist 20 years ago lets our inventions spread outside of our own context fantastically quickly. And we’re not…

BEN:  Yes, insanely quickly. And so, because of that it creates enormous complexity. Because it’s basically impossible, what I’ve just asked people to do, which is to try and think of the ways that your technology can be used outside your own context. It’s basically impossible, right? Because it requires you to have almost superhuman powers of empathy, and empathy into cultures you might not necessarily know, or even have heard of. So, it really creates an impossible situation for developers. And so, that risk is actually inherent and is increasing.

And then on top of that we have the risk of complexity of all of this stuff, which is: we don’t really understand as a society what happens when all these technologies start interacting with each other. And that’s something that we saw in the financial crash, and it’s certainly something we’re about to start seeing as more and more AIs come onto the market. Not only do we not understand each other as people, not only do we not understand how other people are going to use the technologies, but we also don’t understand how other technologies are going to use other technologies and what’s going to happen to the people then, which is quite a complex thing to have just said.

CHUCK:  [Chuckles]

JESSICA:  And that gets back to your definition of Moore’s Law, which is that it’s not just about the processors. You said you use IT to invent the next generation of IT.

BEN:  Yeah.

JESSICA:  So, it’s not just the processors anymore. It’s everything we build.

BEN:  That’s right. And so, that’s why we’re seeing this increase in complexity, which is not just the raw power of the technology but the interconnections, the second-order effects of that technology: the social effects of that technology, the decision making that comes from that technology, the different ways that we now live because of that technology.

CHUCK:  I’m still not sure how I feel about the fact that I carry around tomorrow’s Cracker Jack prize in my pocket today.

BEN:  And it’s only going to get worse, right?

CHUCK:  [Chuckles] Yeah.

BEN:  I’m talking about the iPhone 6 again. The iPhone 6 is as powerful, I think, as a MacBook Pro from 2008, something like that. And it’s, I’m trying to remember, I think it’s like 600 times as powerful as a Pentium 75. And I know for a fact that the iPhone 5 was three times as powerful as a Cray-3 supercomputer, the one at the end of Superman 3, the iconic nuclear supercomputer from the end of the 80s, or the beginning of the 80s, sorry. This thing of the amount of processing that we can carry around with us, the amount of capability, and what that can then be used for. Even people who are technologically very, very savvy, we really don’t necessarily understand not only what is possible but what is soon going to be very, very commonplace.

Google Research this morning announced a project they have for an artificial intelligence that can look at a photograph and describe in English what is in that photograph. The example they have is a picture of two pizzas in boxes on top of an oven in a kitchen. And they gave that JPEG to this AI and the AI said, “It’s two pizzas in boxes on top of an oven in a kitchen.” And then there’s another one, and the AI said, “It’s two dogs playing in a park.”


BEN:  Now, that’s super cool.

JESSICA:  How fantastic for accessibility.

CHUCK:  [Chuckles]

BEN:  Right.

JESSICA:  And also, as much fun as Google Translate is now, think how much fun we can have with an image describer.

BEN:  Right. But also think of all the really bad things you can do with it, right? Let’s think about all of the really evil things that we could do with that. Let’s go out and get as many images as we possibly can from a database where there are lots of images, whether it’s Flickr, or we can crack into Snapchat or Instagram, something like that. And we say to this AI, “Okay, go through all of these billions of images and find me all of the naked people, or find me all of the naked people that look like this person.” Or, “Find me all the pictures of,” and then whatever the thing is people want to find. Undoubtedly this technology will result in all sorts of entertaining and interesting crimes. Just with blackmail, I can think of ways of mechanizing that, right?

JESSICA:  [Inaudible]

BEN:  Because how cool would that be, right? Let’s just make a bot that goes around and looks for naked selfies. And then when it finds one, it identifies where it came from and sends a message saying, “If you don’t send this many bitcoins to this address, we will post this particular image to the front page of reddit,” something like that. And you could set a botnet on doing that. It’d be quite easy, probably quite profitable. I can think of very many ways to break the law with computers. It seems to be a talent of mine.

JESSICA:  Would that be a Cyber Crime?

BEN:  Yes.


JESSICA:  I just had to get that word in there again.

BEN:  [Chuckles]

CHUCK:  So, the thing that’s really interesting about this discussion is, where does the responsibility lie? Google invents this technology, and it can be used for good or for ill. Is it their responsibility to make sure that it doesn’t get misused, or is it the responsibility of the people at the next level up, the programmers, or the rest of the public? And the other question is, I worry that we’re going to hamper our own progress for fear of these kinds of things happening.

BEN:  Well, quite. I mean, it can’t be the manufacturer’s responsibility because that would mean that for example all car crashes were the responsibility of Ford.

CHUCK:  Right.

BEN:  Or something like that. At the end of the day, so much of Western law goes against the idea that it’s Google’s fault.

CHUCK:  Right, but they do legislate that they have to have certain safety features.

BEN:  That’s true, that’s true. But you only really have to get into the gun debate to work out that that’s never going to happen.

CHUCK:  Yeah, yeah.

BEN:  So, I think we have to just say that it comes down to individual responsibility. I can go into a store down the street and buy 100 knives, because it’s a kitchenware shop. I could then go and stab 100 people. It’s the usage that makes a tool, I think.

CHUCK:  Mmhmm.

BEN:  So, we have to be, it has to come down to that. But the problem…

CHUCK:  But these tools are getting more and more powerful, right?

BEN:  Well, that’s right.

CHUCK:  By a factor of two every year, year and a half.


BEN:  That’s true. But then you have to think about it practically speaking and say, “Well, okay. What would a country have to do to be able to ban a potentially problematic technology within its borders?” And given the modern world, it would have to become North Korea.

CHUCK:  Mmhmm.

BEN:  Basically you have to become Amish. You either have to say, “We’re going to have to deal with everything,” or, “We’re just going to not deal with anything,” and build a very big fence around your country. But if you’ve got the internet, you’ve got it all. You either have it or you don’t. You can’t have half the internet, certainly not in a modern democracy. You have to have it all. And so, it really does come down to society deciding what is acceptable and what isn’t. And that’s really where this stuff starts to get really, really interesting. Because it’s not necessarily the really dangerous stuff, the stuff that will kill you, that’s going to be problematic. It’s going to be the stuff that’s socially awkward that’s the most problematic.

So, let me give you a good example. Jawbone, the fitness tracker people, have a thing called the Jawbone UP3 which comes out I think next month, just in time for Christmas in the US. And that has constant pulse detection. And the Apple Watch is going to have the same thing. And they have an API, so you’ll be able to monitor your pulse all the time. Now, it will be trivial, programmatically speaking, to take somebody’s pulse rate track and match it against their calendar, and match their calendar against LinkedIn, and do a real-time feedback thing of who it is that you meet that causes your blood pressure or your pulse rate to go up.
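Ben's pulse-plus-calendar mashup is, as he says, programmatically trivial. Here is a minimal Ruby sketch of the idea; the data shapes (`Reading`, `Meeting`), the sample numbers, and the baseline are all invented for illustration, and a real version would pull from the tracker's and calendar's actual APIs.

```ruby
# Hypothetical data shapes: a pulse reading at a point in time, and a
# calendar meeting with a single other attendee.
Reading = Struct.new(:time, :bpm)
Meeting = Struct.new(:person, :start_time, :end_time)

# Average bpm over the readings that fall inside a meeting's window.
def avg_bpm_during(readings, meeting)
  inside = readings.select { |r| r.time.between?(meeting.start_time, meeting.end_time) }
  return nil if inside.empty?
  inside.sum(&:bpm) / inside.size.to_f
end

# For each person you met, how far their meetings push you above your
# resting baseline, averaged across all their meetings.
def pulse_delta_by_person(readings, meetings, baseline)
  deltas = meetings.each_with_object(Hash.new { |h, k| h[k] = [] }) do |m, acc|
    avg = avg_bpm_during(readings, m)
    acc[m.person] << (avg - baseline) if avg
  end
  deltas.transform_values { |ds| ds.sum / ds.size }
end
```

Feed it a day of minute-by-minute readings and the day's meetings, and the person whose entries come back most elevated is, per Ben, either incredibly sexy or incredibly annoying.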

JESSICA:  Perfect.

BEN:  So, you can have a… yes, you can have…

JESSICA:  Then we can use it for online dating.

CHUCK: [Chuckles]

BEN:  Or you could use it as an online asshole meter, right?


BEN:  Because you could say every time I meet Bob from sales, he drives me completely crazy.

CHUCK:  You know Bob from sales, too?

BEN:  Yeah, what an asshole, right? So, the thing about that is you could use this sort of quantified-self-type idea to prove it to yourself. That although you previously had this feeling that Bob from sales was really aggravating, actually now you have the data. And then it will be a really, really trivial thing to have that be an extra field in LinkedIn, for example, which says of the 50 people who’ve met Bob from sales in the past month who were wearing a Jawbone UP3 or an Apple Watch or whatever, 45 of them felt an elevated pulse rate whilst they were meeting with this person. Either Bob is incredibly sexy or…

CHUCK:  [Laughs]

BEN:  He’s incredibly annoying, right? Now, technologically none of those things are difficult. The technology’s already there, right? It’s just a simple use case. Now, somebody will undoubtedly build that, even though it seems quite new; it’s something I made up on the back of a post-it note on a plane a couple of days ago. And if somebody does, then we have to come to terms with the social ramifications of an additional variable of data going around the world about you, which is people’s average increase or decrease in heart rate whenever they meet you.

Now, what would be the social implications of that? For Bob, it’s really bad. [Chuckles] Suddenly we have new metrics by which to measure ourselves. And that’s a whole new social thing. Again, we’re talking about quantifying things. We’re starting to see the rise of what’s called the quantified office where there are quite a few companies raising quite a lot of money at the moment to implement systems where they’ll be able to quantify everything that a person does.

And so, there are some companies who attempt to measure programmer productivity by the number of GitHub check-ins that they do every day. Now obviously, anybody who has ever coded in their life knows that that’s a terrible metric to measure productivity by. But that won’t stop companies from doing it. And the same goes for measuring people’s productivity by how many emails they send, or how many words of documentation or how many PowerPoint slides they produce. But these companies are being created to measure this stuff. And so, we now have a social situation where people are being measured by their number of GitHub check-ins or whatever. And that creates a whole new set of social and political questions to be asked about these technologies.
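For illustration, the commit-count metric Ben mocks really is a one-liner, and just as easy to game. The author names below are made up; a real version would tally the output of something like `git log --format='%an'`.

```ruby
# Toy "productivity dashboard": tally commits per author from a list of
# author names, one per commit. The number is trivial to compute and says
# nothing about the value of any individual commit.
def commits_per_author(author_lines)
  author_lines.tally
end

# alice made one substantial commit; bob split the same amount of work
# into three tiny ones. The metric ranks bob as "more productive".
log = ["bob", "bob", "alice", "bob"]
commits_per_author(log)  # => {"bob"=>3, "alice"=>1}
```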

JESSICA:  In the case of stabbing someone with a knife or crashing your car into someone, we don’t prohibit knives and cars outside of the airport. We instead respond by punishing the stabber after the fact. Can we do that same sort of thing for destructive activities like posting personal photos?

BEN:  You can to a certain extent. But it depends on the type of wound this has left, right?

CHUCK:  One thing that occurs to me, if I can just chime in for a minute is that the people who are doing it have the same resources that we do to find them and punish them. So, I think we can but only if we can find them, only if our offense is better than their defense.

BEN:  Right. And that’s a big problem with cyber warfare, which is what’s known as the attribution problem, which is that most countries in the world who have cyber warfare capabilities are very, very worried about using it. Because once people start using cyber warfare technologies, it becomes very, very dangerous, because you can’t prove for certain who it was who launched the attack. And it’s the same thing for Anonymous or any of those low-grade digital assaults. Identifying who’s doing it and why they’re doing it is very, very difficult in the digital realm. And so, we do end up with severely one-sided arguments, which require perhaps, I think, a social change rather than a law enforcement change. They require it to become socially beyond the pale rather than just illegal. I think there’s a fundamental difference there.

It’s the difference that we have in the UK about drink driving, DUI. There was a huge social change in the past 20 years in the UK about drinking and driving. 25 years ago, for example, it was a very low-grade offense socially. Nobody really cared, though it had always been illegal. But then the government did an advertising campaign which was incredibly powerful. If you ever want to really scare yourself silly, look on YouTube for British government anti-drink-driving advertisements from the 1980s.

But because of that advertising, it remained illegal but it is now absolutely socially unacceptable if you are caught. It’s one of the things that separates our two countries: we hear about actresses getting arrested on a DUI charge and everybody’s like, “Ah, Lindsay Lohan all over again,” whereas here in the UK if somebody’s arrested on a DUI charge they lose their job and probably get divorced. It’s like the end of their life. It’s utterly unacceptable. And I think it’s that sort of social change that might have to happen for certain activities that happen online, like say doxxing or something like that, which can’t be stopped legally, can’t really be stopped technologically, and can only be stopped by the fact that your mother will never speak to you ever again.

AVDI:  Yeah, I look out and I’m a little worried about a situation where the powers that be are always necessarily going to be a few decades behind because to be old enough to be head of government you sort of necessarily have to be old enough to not really get the current state of Moore’s Law. And then the alternative seems to be Anonymous, which is not the kind of government I want to have. [Laughs]

BEN:  No, indeed. The thing about Anonymous that frustrates me is that people seem to conflate technical capability with political understanding. The same thing with Occupy. Now, I am in fact quite sympathetic to a lot of Occupy’s political viewpoints. But the fact that Occupy or Anonymous or any of these groups are capable of using technology to express their opinion does not add merit to their opinion.

And that’s a process that I think we need to get through as a society, which is: just because you’re good at the internet doesn’t mean you’re right. Whereas at the moment it seems to be split between people who are terrible at the internet versus people who are great at the internet, and because we’re also great at the internet, we side with the people who are great at the internet. But just because you’re great at the internet doesn’t mean you’re any good. And that’s a real political evolution that’s going to have to happen in the next ten years or so.

Now, it’s going to happen through natural accretion which is just that the old people are going to die or simply retire. But between now and then we’re going to go through some, I think certainly in the west and certainly in the north of Europe we’re going to go through some quite dark patches where a lot of the sort of stuff, the social change that we have seen through the internet that we think is good, is going to get rolled back because the old people are in charge and the old people are completely freaked out by it.

AVDI:  As a programmer I know what to do if I need to learn about the latest technology that’s coming down the pipe. I’m more in the dark when it comes to educating myself politically and especially ethically, how to think about these ethical issues that flow out of the technology that we’re creating. Do you have any pointers on how a programmer is to give themselves an education in the ethical side of what they’re doing?

BEN:  There isn’t really a book that you can read on how to be a good person, tragically. I think it’s more a matter of just having a think about, well two things. If I was evil, how could I use this for bad? That would be one thing. But the other thing is to think about the culture that you’re embedding in this software that you’re writing. And that’s a really, quite a subtle point and it requires an awful lot of mindfulness almost, an introspection about it.

But the example that I give quite a lot is around Blackberry. The Blackberry as a device is really the physical instantiation of a political and cultural belief that email is the most important thing in the universe, right?

CHUCK:  [Laughs]

BEN:  If you take on a Blackberry, if you’re given a Blackberry by your business, then what you’re effectively doing is accepting into your world the idea that if that light starts blinking, you have to stop whatever it is you’re doing and answer that email. And the entire user interface of a Blackberry is based around that political belief. Now, they might not have interrogated themselves to understand that, but that’s what that is.

And so, it’s the same with any other type of technology: what are the assumptions in the design of this piece of tech? The assumption might be that this person has to be online at all times. Or that undercutting the price (being in a store, finding something, then finding that same thing for half the price online) is the most important thing for that person, or whatever it is. And so, by interrogating the underlying cultural assumptions of a technology, you can start to unpick it a little bit. When you do that, when you look at it in terms of the cultural beliefs embodied in the thing, you sometimes come up with really interesting insights into your own life.

CHUCK:  I think that phone notification noise that went off was oddly appropriate.

BEN:  Yeah. [Chuckles]

CHUCK:  While you were talking about that. [Chuckles]

BEN:  Indeed, because a mobile phone has a really interesting embodied cultural belief in it, which is that previously a telephone number was a place. You called my house or you called my desk. And if I wasn’t…

CHUCK:  And you might not be there, yeah.

BEN:  I might not be there. But the phone’s ringing. All that signifies is that I’m not there. Now however, because a phone number signifies a person, and culturally the assumption is that that person will have that phone with them at all times and on at all times, we end up with all sorts of new cultural assumptions: if you ring my phone and I don’t pick it up, then it means I don’t like you anymore, or I’m busy, or I’m doing something suspicious, or whatever it is. And so, there are these cultural assumptions in technology that sometimes we don’t interrogate, we don’t question, and we don’t even notice, but which do have a fundamental effect on the way that we coexist with that tech.

JESSICA:  You mentioned earlier that it’s impossible to empathize with everyone everywhere with all the contexts that are not ours, which is totally true. And we should try. And we all work in, well most of us work in teams now. So, if we have more diversity in our teams, then we’re better able to imagine more different contexts. So, does that make it kind of like we have an ethical responsibility to the world to aim for diversity so we can do a better job at understanding the consequences of what we make?

BEN:  Yes. [Laughs] I mean yes, I think so.

CHUCK:  [Laughs]

BEN:  And what you mean by diversity is also very important here, because we’re not just talking about ethnic or gender diversity, but the ability to see how the technology you’re developing would affect people of different lifestyles, different social positions, different social privileges, and so on. Now of course, there does come a point where you just basically have to choose something. It’s not fair of me to call on, say, the guy writing an email client for the Blackberry to create one that isn’t email-centric, you know. [Chuckles] It’s not fair on that person. But what Blackberry has to do is acknowledge that their device does have that cultural assumption.

JESSICA:  Or they could build in features like do not disturb between certain hours.

BEN:  They could do that. Or they could just simplify and say: this is the device for people who really want to be on top of their email. If you don’t want to be on top of your email, don’t go with it. And they could be radically simple in that way. That may be more successful for them. Who knows? But it couldn’t be any less successful. So, rock on.
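The "do not disturb between certain hours" feature Jessica suggests is just a time-window check; the only subtlety is a window that wraps past midnight. A minimal Ruby sketch, with the hour boundaries and defaults invented for illustration:

```ruby
# True when `hour` (0-23) falls inside a quiet window, including windows
# that wrap past midnight, e.g. 22:00 to 07:00.
def quiet_hours?(hour, start_h, end_h)
  if start_h <= end_h
    hour >= start_h && hour < end_h
  else
    hour >= start_h || hour < end_h  # window wraps past midnight
  end
end

# Suppress the blinking light during quiet hours (defaults are made up).
def deliver_notification?(hour, start_h: 22, end_h: 7)
  !quiet_hours?(hour, start_h, end_h)
end
```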

JESSICA:  [Laughs]

BEN:  But it’s just being aware of it that I think is the most important thing. I don’t think it’s necessary to have a quota system or something like that.

JESSICA:  I like what you said about diversity. It’s not that gender and ethnicity are the point; they’re clues. But really, it’s about a diversity of context and a diversity of understanding of how the world works in places where we don’t live. We can build that within ourselves by seeking out new experiences.

BEN:  Yes, exactly. That’s exactly right. And it’s a habit that, once you start getting into it, actually becomes sort of addictive, I think: the habit of looking at something and saying, “Well, how would this go down if I wasn’t me? If I was a young, single mother? Or a quasi-retiree, or somebody in the third world, or somebody in Paris trying to run their kids to school, or somebody stuck in traffic in Los Angeles,” or whatever it is. If you try and run it through all those different scenarios, that actually becomes quite addictive to do. And it also becomes very useful, I think, from a design point of view. That ability to empathize actually makes you a better designer, makes you a better coder, makes you a better maker of things, because you’re making stuff that is much more nuanced and more finely tuned by usage, even if that usage is only in your head with imaginary people.

JESSICA:  That’s beautiful.

BEN:  I don’t know if it’s beautiful. I just think it’s necessary in this day and age. If we are trying to write software or create services which run on a global platform, then they are going to be used by people around the world. Or you can choose to just make something that is only going to be very local and very specialized. But as long as you’re honest about that in your own head, then that’s cool too. But if you’re making something that you think is going to be a global release, and you want it to be globally used, then you have to be aware of the different contexts it’s going to be used in.

And it also enables you to take criticism, in a way. You know, you should never read the comments anyway. But if you read the reviews in the Apple App Store, for example, almost universally those reviews are written from the point of view of the reviewer thinking the universe revolves around them, almost 100% of them, certainly all the one-star reviews, apart from the apps that genuinely are objectively rubbish. But most of the one-star reviews of actually quite good apps come from people who are saying, “This application is terrible because it didn’t do the thing that I wanted it to do for my life.”

JESSICA:  Like the guy at GCHQ who’s upset about encryption on the phones because he thinks everything is about the GCHQ.

BEN:  Right, exactly that, exactly that. Anything that Google does is obviously about American or British national security. And of course, Google and Apple are looking at you going, “Um, no. Actually, our biggest market is China. And our biggest growth market is Africa. And actually, the United States and Northern Europe are really not our most important market, nor the one we’re thinking about most.” And that is going to be a really interesting shift in technology in the next ten years, I think: the fact that the majority of technological development is going to happen, not necessarily in Asia, but directed towards Asia.

And your Californian geek, as we’re seeing now with young, male teenage gamers in the whole GamerGate thing, is going to start feeling really left out by the industry that previously had been dedicated to servicing them as its first and most important customers. And what we’re going to start seeing soon is that the first and most important customers are actually 30-something women in the center of China rather than 20-something guys in metropolitan North America. And that’s going to be a big shock for everyone. People are going to start to really be upset when our laptops get smaller and curvier and pink.


JESSICA:  Yeah, that will be different.

AVDI:  That’s a really interesting thought. That’s going to change a lot of things.

BEN:  Yeah, it’s going to change a lot of things. And it’s not just in personal technology. It’s in silly things like architecture. And not [inaudible] architecture, but interior fittings. If you’re a company that’s making, say, bathroom fittings, then the vast majority of bathroom fittings in the world are going to be bought in China, because of all of the cities they’re building, all of the urbanization that’s happening there. And so, the vast majority of bathroom fittings are going to be made for the Chinese market. And so, we’re going to start to see Chinese interior decorating tastes come into the market much more heavily than we were previously used to. Things are going to get weird in that way, because the dominant aesthetic isn’t going to be that of the American middle class. The dominant aesthetic is going to be that of the Chinese middle class, just because there are so many more of them. And there are only so many factories that make bathroom fittings, or whatever it is that you’re talking about. And so, that’s a real mindset shift.

And again, the more conservative people in North America and Northern Europe aren’t going to take kindly to that, because they have lived their entire life in a time when we’re number one. And well, we’re not really anymore, for an awful lot of things, certainly in terms of, say, consumer choice. By that measure, we’re actually a middle-of-the-road market. And so, Apple is much more interested, you know, the iPhone 6 and iOS 8 are much more about Chinese users than they are about North American or Western European users, absolutely 100%.

AVDI:  Okay, so here’s a question that’s sort of related to what I asked you earlier. When looking at the way things are moving like you’re just describing, are there some sources of information, news sources? Where do you look to get a better perspective on these trends, because a lot of the news sources, tech news sources and others that I see are just very, they’re very contextual. They’re very myopic, focused on people like me.

BEN:  Yeah. There are lots of news sources from different places. Global Voices for example, the project that’s out of MIT, or Harvard, sorry. [Inaudible] Center at Harvard. That’s a very good resource for local news written in English about places. So, you can get interesting viewpoints. There are lots of blogs that cover Asian technology or something like that. I think the most important thing is to look outside the industry as well.

So, although the vast majority of the work that I do is consulting on the effects of technology on society and those sorts of future trends, that is usually expressed not in personal computing or mobile computing blogs, but in things like shop fitting magazines and interior décor and architecture magazines and general science publications and medicine. And even things like the big catalogs from department stores.

There’s a store here in the UK called Argos which publishes a very thick catalog. It’s maybe a thousand pages of products. They do it every quarter, I think. Imagine if, say, Target or Walmart published a catalog. Now, that’s actually incredibly good reading to start to get an idea of the zeitgeist, because it’s all of the products you would never usually look at. So for example, all the toys you wouldn’t look at if you don’t have kids, or all of the white goods you never look at if you don’t live in the suburbs. And it’s the same thing. If you can pick up those sorts of things but from other countries, you can start to get an idea of how people are living and the things they most value, outside of the usually quite minimalist, usually quite tech-centric coder lifestyle.

And once you can start to look at that, then you get an idea of hey, actually some of the things that we think are really, really commonplace actually are really, really rare. And some of the things that we think are banal and passé are actually incredibly commonplace and very, very popular. And you start to realize the real shape of the way that this technology is taken up.

JESSICA:  That’s fascinating. I feel like we’ve transitioned into picks.

CHUCK:  [Chuckles] Let’s have Jessica go first. Jessica, what are your picks?

JESSICA:  I have one blog post that I’m completely fascinated by this week. It’s by Pieter Hintjens. It’s called ‘Children of the Fight’. And it’s about how some small percentage of the population doesn’t experience empathy and how that was probably essential for the evolution of humor. It goes that far. It’s a really fascinating read. I’ll link to it in the show notes. That’s my pick.

CHUCK:  Cool. Avdi, what are your picks?

AVDI:  I think just one today, and it’s a hardware pick. I’m sure the listeners will never tire of listening to my ongoing foray into ergonomic technologies. [Chuckles] So with that in mind, I have been enjoying the MS Sculpt keyboard that I think I picked a few months ago. It’s definitely an improvement over just a regular standard keyboard. But I’ve still been getting a lot of pain. I’ve still got some ulnar nerve issues going on.

So, I decided to go to the next phase and I got myself a Kinesis Advantage Keyboard after trying out a few different crazy ergonomic keyboards. And I got to say, it’s in a completely different category than the Sculpt. The Sculpt is like, “Okay, this is like a regular keyboard only slightly less painful.” The Kinesis is just amazing. It’s very, very differently shaped and it’s deliberately shaped to not make your fingers move so much. And it’s interesting. I’m still typing relatively slow compared to what I was typing on the Sculpt. But every single time I make a typing mistake I realize it’s because I stretched a finger too far. And if I hadn’t stretched so far I actually would have hit the key that I wanted. So, it’s really remarkable.

JESSICA:  I used a Kinesis for years and they’re wonderful. My favorite part is that prime real estate right by both your thumbs. Instead of being just one big spacebar, you’ve got Control and Alt.

AVDI:  Yes.

JESSICA:  And a whole bunch of keys that you need to hit all the time.

AVDI:  I feel like the people who designed the traditional keyboard imagined humans as having this gigantic prehensile pinky.


AVDI:  And sort of…

JESSICA:  Opposable pinkies.

CHUCK:  I want one. [Laughs]

AVDI:  Pathetic stubs of thumbs that can’t do anything. And it’s funny because if you’ve ever used a video game system, you know that the thumb is the most important digit you have. Most of the buttons and most of the controls are positioned for the thumb for a very good reason, because thumbs are really useful. So yeah, that’s the other thing I love about this, is that it’s a keyboard that actually acknowledges I own thumbs. So yeah, it’s not cheap. It’s a $300 keyboard. But it is totally worth it. I’m really liking it. And that’s my pick for today.

CHUCK:  So, this last weekend my wife and I went out to Park City, Utah which is up in the mountains. It’s up where they have the Sundance Film Festival. So, when that’s going on you’re likely to see celebrities in Utah. Otherwise, you’re probably not. But anyway, it’s a really just great town. If you like to ski it’s a great place to go for that, too. But just a fun place to get away. In the summer there’s a lot of hiking and mountain biking and stuff that you can go do as well. And so, I’m just going to pick A, getting away. Get away from the computer, get away from work, and just go out and do something fun. And Park City. I’m going to pick Park City because it’s just, it was just a nice weekend to just go up and be close to nature. I don’t know if we actually went into nature because it was cold and it snowed. But it was a lot of fun. So anyway, those are my picks. Ben, what are your picks?

BEN:  So, I’m going to pick another series of podcasts actually, because I invented the word, so I’m allowed. I’ve been a huge fan for the past few years of the podcast 99% Invisible, which is an amazing podcast about design and architecture and all of the stuff that surrounds us. And a couple of years ago, Roman Mars, the guy who makes it, had a Kickstarter to go from I think monthly to fortnightly. And then last year he had another Kickstarter which funded the formation of a group of likeminded podcasts.

And then recently, a couple of days ago, they finished the Kickstarter where they raised over $600,000 to continue that series of podcasts and to add three more. And it’s a group of podcasts called Radiotopia. And as somebody who makes radio and as somebody who listens to an awful lot of it, I have to say that this is, all of them are stunningly good storytelling podcasts and absolutely well [inaudible]. If you’re into design specifically, then 99% Invisible is actually brilliant.

The latest episode of one of the other podcasts, which is called Benjamin Walker’s Theory of Everything, the latest episode has an interview with a writer [inaudible] of mine called Paul Ford who’s a huge hero of mine with his writings about the digital world. And so, I would thoroughly recommend all of the podcasts in Radiotopia and specifically 99% Invisible. And anything that can get its audience to give them over half a million dollars in additional funding year after year after year has got to be worth it. So, try those out. They’re only about half an hour each one.

CHUCK:  Cool. They sound like a lot of fun. Alright, well I don’t think we have any announcements so we’ll go ahead and wrap up the show. And we’ll catch you all next week.

[This episode is sponsored by MadGlory. You’ve been building software for a long time and sometimes it’s get a little overwhelming. Work piles up, hiring sucks, and it’s hard to get projects out the door. Check out MadGlory. They’re a small shop with experience shipping big products. They’re smart, dedicated, will augment your team and work as hard as you do. Find them online at MadGlory.com or on Twitter at MadGlory.]

[This episode is sponsored by Ninefold. Ninefold provides solid infrastructure and easy setup and deployment for your Ruby on Rails applications. They make it easy to scale and provide guided help in migrating your application. Go sign up at Ninefold.com.]

[Hosting and bandwidth provided by the Blue Box Group. Check them out at Bluebox.net.]

[Bandwidth for this segment is provided by CacheFly, the world’s fastest CDN. Deliver your content fast with CacheFly. Visit CacheFly.com to learn more.]

[Would you like to join a conversation with the Rogues and their guests? Want to support the show? We have a forum that allows you to join the conversation and support the show at the same time. You can sign up at RubyRogues.com/Parley.]