Android Influencer: Google’s Adrian Ludwig is dedicated to making Android as safe as it can be

Published 4 May 2015

Android has had an unfortunate reputation over the years. There have been numerous criticisms that the open source mobile operating system is unsafe; that it’s riddled with constant security threats and malware.

But the reality is that mobile security isn’t that black and white, and that Android is much more secure than its reputation suggests—just look at how much it evolved from KitKat to Lollipop. That’s because people like Google’s Adrian Ludwig, lead security engineer for Android, are in charge of ensuring Android becomes more secure with every new release.

I sat down with Ludwig—in a corner, on the floor of the Moscone Center in San Francisco—after one of his talks during the RSA Conference to chat about Android security, how Google keeps its mobile users safe, and what he believes is Android’s real Achilles’ heel (hint: it’s not malware).

Greenbot: Let’s start by talking logistics. How are vulnerabilities in the Android OS typically discovered? How long does it take for the Android team to get together and patch things up?

Ludwig: I don’t think this is unique to Android by any stretch. There are a lot of different ways that people find vulnerabilities. We have a team inside of Google—more than one team, actually—that tries to find vulnerabilities. There are also other people outside of Google.

There’s one technique called fuzzing. It goes back—I don’t know the exact origin of the phrase—maybe 20 years ago, there was a team of us doing exploit development for Windows at the time, and we began to realize that understanding the code was really hard, but connecting to it and just throwing random data at it would potentially result in crashes. A lot of times crashes would end up being exploited, and that’s what became known as fuzzing. You take an API, send it stuff, and see what happens. We build big farms on servers doing that—millions and millions of things being sent to try to crash it.

Sometimes, there are six variables on an API. [We look at] all the possible values and iterate through them. More and more, you want to try to get a state machine into an unpredictable place, so you’ll record a normal set of transactions and then you’ll permute that normal set of transactions. One approach might be to take every app that exists in the world, modify a byte in it, and see what happens. Most of the time, the applications work just fine, but [maybe] you modify that one byte and maybe it goes down a path it wasn’t expected to go down. It’s not at all a predictable process other than you’re just trying to cover the entire codebase.
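The one-byte mutation approach Ludwig describes can be sketched in a few lines. This is a toy illustration, not Google’s actual tooling: `parse_header` is a hypothetical parser standing in for whatever component is under test, and a real fuzzing farm would run millions of mutated inputs against native code rather than a Python function.

```python
import random

class BadMagic(Exception):
    """Expected, clean rejection for inputs without the right header."""

def parse_header(data: bytes) -> str:
    # Hypothetical parser under test: expects a 4-byte magic prefix,
    # then a UTF-8 payload. The decode step can blow up on malformed bytes.
    if len(data) < 4 or data[:4] != b"PKG1":
        raise BadMagic("bad magic")
    return data[4:].decode("utf-8")

def mutate(data: bytes) -> bytes:
    # Flip one random byte -- the single-byte mutation Ludwig describes.
    buf = bytearray(data)
    i = random.randrange(len(buf))
    buf[i] ^= random.randrange(1, 256)  # nonzero XOR always changes the byte
    return bytes(buf)

def fuzz(seed: bytes, trials: int = 10_000) -> list[bytes]:
    # Feed mutated copies of a known-good input to the target and
    # collect any input that triggers an *unexpected* failure.
    crashers = []
    for _ in range(trials):
        sample = mutate(seed)
        try:
            parse_header(sample)
        except BadMagic:
            pass  # clean rejection: not interesting
        except Exception:
            crashers.append(sample)  # unexpected failure: a potential bug
    return crashers
```

Here an unexpected exception escaping the parser counts as a “crash”; the server farms Ludwig mentions do the same thing at scale, watching whole processes for faults rather than catching exceptions.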

Greenbot: That sounds incredibly tedious.

Ludwig: Some people get very excited by it.

Greenbot: Do you get excited by it?

Ludwig: I’ve never done a lot of vulnerability research in that form, but I’ve done vulnerability research in other forms—which is the second way that we’ll see a lot—which is analyzing the code and trying to find out if there are known patterns that are likely to be applicable. So, for things like signature verification or the install process, they have to be done in a specific way. If there’s any kind of variation from that, that’s a problem.

I get really involved in the design types of things. Like, are you a detail person, or a big-picture person? The design-level stuff tends to be more of my personal interest. But I always see both of those [types of people] on Android: people who are like, “I picked up the source code, I walked through the source code, and I’ve seen this issue.” Those things get reported from time to time—that’s how a lot of those things get found internally, and externally we see a lot of fuzzing and automated testing.

Greenbot: So I imagine that there are Googlers whose job is to go through the code every day and make sure everything runs.

Ludwig: Yeah. Inside Android security we have a team called the “attack team.” That’s what they do: they find the bugs and helpfully guide the other engineers to solve those bugs.

Greenbot: What about when users bring up something in the forums? Do you have people who are watching those?

Ludwig: We don’t have anybody who does that formally on my team. We have people who do that sort of casually. There are some groups on Google+ and a few other places where things tend to get surfaced, and then there are some people that do a good job of surfacing things—I’ll give a shout-out to Justin Case. But we don’t have a dedicated community manager who is out there day in and day out.

Greenbot: How do you cook up which security features to include in future versions of Android? How do you field that stuff?

Ludwig: There are a few different approaches. It’s not unlike any other product development effort. We might have a vision for where we want to be, and then a series of features that help us get to that vision.

With security, you’re looking at the behaviors that we’re seeing going on right now, or that we know will come next based on that historical experience, that are going to be a potential risk to users. You’re prioritizing where harm might be and trying to make sure that you put defenses in place to prevent that. We do both of those things, and then it’s a standard development model: prioritize the features, stack-rank them, allocate, and then kind of go from there.

A lot of times security has a lot of excitement associated with it. There’s fear; people get scared. In practice, once you’ve done it enough times, some of that goes away and you can begin to be much more rational about it. We try to do the thing right now that’s going to help people the most. Sometimes it’s protecting them from something; sometimes it’s that they want to do something they can’t do right now, so we enable it. [For instance,] they want to be able to do financial transactions, so it’s a combination of protection and a combination of making sure that the crypto libraries you need in order to connect to your bank server are available.

Greenbot: How long does it take for security features to get to a certain version of Android? What is the cycle of patching up security? Can you explain that process a bit?

Ludwig: We’ve had a lot of fundamental technologies that have been introduced: SELinux was one of those technologies, encryption was one of those fundamental technologies, and the application sandbox was one of those fundamental technologies. At this point, I don’t want to say all the fundamental technologies are in place, but I think most of them are in place. So now, a lot of it is refinement, making sure that we very narrowly scope to the intended behavior versus non-intended behavior.

There’s a phrase I picked up at one point: every good thing that ever happened on Android came as a result of an API—then, of course, the same person pointed out that every bad thing that’s ever happened on Android has also come from an API. That’s the balance we’re constantly striking: we want to add new functionality, and we’re gonna add new functionality, but how do we do that in a way that’s safe and that makes Android better?

Greenbot: There are a lot of misconceptions about Android’s security. People say it’s unsafe. What do you say about that?

Ludwig: Android is very threatening to a lot of people. I think we’ve seen that play out in the security space as much as we have anywhere. [The security industry is a] billion-dollar business. [Those companies] came to fruition and became very successful because previous platforms did a really bad job with security. And so, as the new generation of platforms establishes itself, these are businesses that are trying to understand where they fit in. I think the first reaction from all of these businesses is to try to establish the beachhead they had on previous platforms.

Fortunately, when we built Android, we also had hindsight, and we did things quite a bit differently. And so, that’s something that’s going to take some time for people to understand: how big the differences are and what the long-term roadmap looks like. You can’t, in year one of a platform, reliably say that “this looks like the last platform 30 years ago, and it’s going to go the same way.” History can repeat itself a little bit, but not that much, especially not if you’re actually looking at it.

Greenbot: True or false: Android’s biggest security issue is malware.

Ludwig: It’s false by several orders of magnitude. It’s not that malware is not a thing we need to worry about; it’s that people lose their phones every day. That’s a $700 loss. Nothing in the malware space is as valuable as that $700.

For many people, if you own a house or a car, then your phone is the third most valuable thing. In San Francisco, you’re probably renting and using Uber, so there’s a pretty good chance that your phone is the most valuable thing you own. The physical-loss element is really significant. The percentage of people who lose their devices is like 10 to 20 percent.

[Another issue is that] what we really don’t know at this point is how big the person-to-person problem is. Incognito mode in Chrome was implemented because there’s a belief that people want to keep secrets within their own household. It’s entirely about people protecting local data from the people you have some sort of relationship with—your kids, your roommate, or whatever. That’s a real thing that we don’t have an understanding of.

Greenbot: What is the person-to-person problem exactly?

Ludwig: It’s a personal relationship, is what I really mean. Enterprise talks a lot about the insider threat—the person you think you should trust but you can’t really. I think that’s everybody when it comes to a device like this. I don’t want to say that’s the big thing we should worry about, but I think in terms of the areas where the security community and security feature set have not kept up with the level of risk, that’s an area. In Android, we’re way overinvested in malware protection right now, relative to this other area, which is why at Google we’ve got a team working on Smart Lock and trying to figure out if you can use Bluetooth to do unlocking.

Greenbot: Are third-party security apps like Lookout and the like really worth the download?

Ludwig: I think most people don’t want to think about security—I don’t want to think about security when I’m not thinking about security. So in that sense, we’ve built the security solutions so they’re sort of invisible, behind the scenes, to protect you. There are some people who want to take a very active role—they want to have an alarm system on their home, they want to have an alarm system on their car—and for those people I think it’s great that those things exist.

You have to pick: which of those kinds of people are you? Are you somebody who really wants to be thinking about this? Then go ahead and install one. But I don’t want a world in which everyone has to be that way. So we built [Android] so that if you don’t want to think about it, you don’t have to.

Greenbot: There are critics who have said that Android’s biggest security vulnerability is user privacy—that apps have access to data like phone contacts and payment information, and that though they’re required to disclose those things, you never really know what they’re doing with that data. What does Google say about that?

Ludwig: I think privacy is among the more complicated questions remaining in the world. I don’t think there’s anything unique about Android vis-à-vis privacy.

One thing that is kind of unique is that there is no central authority within Android. What we are trying to do is make the individual entities in the ecosystem responsible for their own actions, which means if you go to install app X, you need to be responsible about what X does—you need to make a decision about what your preferences are. X needs to be responsible for providing that information and their exposure to that information. The approach that we’ve taken so far is to make information available—this is Google’s mission statement, which is “to organize the world’s information and make it universally accessible and useful.” That’s what we try to do; we just try to surface it.

I think it’s driven an incredibly valuable conversation. Android was the first consumer platform that made any information available about the inner workings and behavior of applications. It caused people to ask these questions, and I think they’re great questions, and I expect that in the future we will do more and more to make those questions more precise and make sure the answers are more precise. I don’t think that’s specific to our platform, though. I think all platforms are moving in that direction, which is great to see.

Greenbot: In the short term and long term, what is Google doing to make Android more secure and to better protect users’ sensitive data?

Ludwig: We’ve got lots of features that are designed to do that. We’ve added better encryption support, better sandboxing, better exploit detection. We’re also really interested in some of the lower-level hardware features. Security companies like RSA have managed to get pretty low-level hardware features on a lot of consumer devices, but very rarely have they been exposed to the mere mortals that build applications, because they’ve traditionally required a hefty business model associated with licensing fees and the like. One of the things we’ve been working on is exposing that security infrastructure—TrustZone, the secure element, the TEE, depending on the specific device—so it can be used by any application developer.

Greenbot: Let’s get into the fun questions now: How long have you been an Android user?

Ludwig: Good question. I’m an iPhone-to-Android convert. I really like open systems and I really find long-term value in platforms that embrace innovation, so it’s almost like…a religion, a heartfelt philosophical belief. For me, that’s a lot of what Android brings: a real commitment to something more than a few dollars to a particular manufacturer.

[My first Android phone] was the HTC Hero. The white one. It was really pretty.

Greenbot: What are you currently using?

Ludwig: I don’t like big phones. I have three broken red Nexus 5s, and I have this one that’s pretty banged up. I don’t believe in cases, and I drop things a lot.

Greenbot: What is one app you absolutely cannot live without?

Ludwig: The number one would be my bank app. I also have a real estate fixation—I really like the Zillow app.