Ralph Echemendia, also known as “The Ethical Hacker,” distinguishes fact from fiction in the realm of cybersecurity.
Hacker and multimedia magnate Ralph Echemendia has been cracking code for over twenty years. His versatility and specialized insight into the world of cybersecurity made him an ideal choice to serve as technical supervisor for the Oliver Stone films Snowden and Savages.
Echemendia’s expertise was also pivotal to the development of the award-winning TV series Mr. Robot, work he fit in when he wasn’t training security experts at leading organizations such as NASA, Google, and Microsoft. In this exclusive interview, “The Ethical Hacker” reveals his biggest concerns about consumer cyber safety, why apathy is dangerous, and how to prevent corporate spies from hacking your mobile devices.
Innovation & Tech Today: After working on films and series such as Savages and Mr. Robot, how do hacking and cybersecurity in mainstream media compare to the real world?
Ralph Echemendia: A great deal of attention is paid to making it as realistic as possible while still entertaining. It’s really about the storytelling; the computers are just sort of the set. Beyond even the press and the media, movies are the biggest communicator on a global level, so people watch that and tend to go home thinking they now know something. That’s why it’s very important that it’s done correctly.
In the real world, it’s a lot different in the sense that the narrative is different, because it’s not about storytelling. It’s about operations and it’s about making money. It’s about all these different things that really don’t apply to what you’re seeing on screen. But, there’s an important symbiosis between the real world and what Hollywood is putting out and they’re going to be putting out a lot more related to technology because it’s what’s happening.
I&T Today: With all the major hacks that have occurred over the past year, how can we expect corporations to improve their cybersecurity practices?
RE: I think a big part of that is obviously legislation and regulations helping to push that on companies that have to do it. When you really think about it, only a few years ago, companies didn’t even have to necessarily disclose if they had been hacked.
The unfortunate truth is: what does “more” actually mean for companies? They don’t really have the resources. Human talent is one of the biggest problems; there’s a massive scarcity issue there. The solution is really in the people, not necessarily in buying more tech.
Forbes, a couple of years ago, wrote an article about there being a million cybersecurity jobs unfilled on a yearly basis. I saw some other reports saying there are 2.8 million jobs in cybersecurity, most of which will be unfilled because there just aren’t people who have the experience and time in cybersecurity. So that’s the biggest hurdle that companies have to deal with.
On the flip side, I think the other thing is to empower the actual users, the consumers, with more tools. I don’t necessarily think it’s a company problem or a government problem. As a consumer, you’re really the problem. At the end of the day, if you click “Yes” on one of those things, that’s not really on them, that’s on you. And most attacks now target the individual more than the infrastructure of major corporations. So I think it’s a combination of being able to address both of those issues.
I&T Today: How can consumers improve their situation with regard to cybersecurity?
RE: It’s a matter of awareness and that’s the real problem. The majority of the stuff that we use as consumers is a matter of convenience. It creates a much more convenient world for us, and so much so that we rely on it. Once we’ve adopted technology the way we have, we can barely remember what it was like not to have the technology. There has to be some level of situational awareness on the consumer’s side, which is a difficult thing to do because most consumers will want to run away from this issue of security.
It’s a scary issue, and even just the word “security” is one that instills a level of fear, uncertainty, and doubt. That’s really what we have to address. In fact, that’s been my passion over the last two years: to figure out how to communicate to consumers what their mobile devices are doing. It’s easy enough to block what is known to be bad, but what is more difficult is to inform the user of what they don’t really know is happening and empower them.
I&T Today: Do you think the public perception of data security is shifting?
RE: I think, slowly, the public perception is shifting at a mass level as a result of all these types of incidents that you hear about now in the news. It’s nothing new. This has been going on before it was reported. It’s just that now it is being reported, and it’s reaching the mass media, and therefore it’s reaching the masses. So, obviously, that is bringing up their awareness.
Unfortunately, they still don’t know what to do with it. I’ll tell you a little story. I worked on Snowden with Oliver Stone and I took my daughter to see a director’s cut of it. She watched the movie and, at the end of it, the first thing that was kind of funny is she turned to me and said, “That’s it?”
It was a weird response. What do you mean, that’s it? She goes, “That’s the end?” And I said, “Well, yeah. Why? Why are you saying it like that?” She goes, “Well, because when I came in here, I didn’t care. I didn’t know who Snowden was. And now I care. I’m aware of this privacy issue, so now I do care, but you’re not really giving me a solution. Now you’ve made me aware of a problem without giving me the ability to do anything about it.” That was very interesting to hear a 17-year-old say that.
That is really the issue: at a mass consumer level, people might know there’s a problem, but they don’t fully understand it, and they don’t know that there’s a solution. They can’t even see a solution.
I&T Today: How does your new app Seguru provide a solution to consumer cybersecurity?
RE: The idea is to empower people to become their own security gurus with information. It’s a number of different technologies, but I call it “safeware” because we’ve got all this software and hardware, clouds, malware, and ransomware, but there really hasn’t been any kind of disruptive innovation in consumer cybersecurity since the antivirus, which is 25 years old and does little against today’s attack vectors and environment.
So Seguru is mobile safeware that lets you visualize what your phone is talking to and where in the world that is, whether the traffic is legitimate or bad. If it’s bad traffic, it’ll block it. It’ll alert you to communications that don’t look like they’re coming from you, and it also alerts you to all the things that are supposed to be “normal.” What you’ll often find is that you’re not okay with some of the things your phone is doing and where that information is going. It has to be easily digestible and demystify this issue of cybersecurity in a way that almost gamifies the experience for the user.
In a way, it’s kind of turning you into a hacker yourself, with your own data. Spy on yourself, if you will, because that’s what’s happening. All these governments and corporations are spying on you and yet you don’t know what that means. So when you’re allowed to see what your data does and where it goes, you’re in a position to do something about it.
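The kind of self-inspection Echemendia describes can be roughly approximated without special tools. On Linux, for example, the kernel publishes every TCP connection in /proc/net/tcp with hex-encoded addresses; the sketch below (an illustration only, not Seguru’s actual code) decodes those entries so you can see which remote hosts a machine is talking to.

```python
import socket
import struct

def parse_remote(hex_addr):
    """Decode a /proc/net/tcp address field like '0100007F:1F90'
    (little-endian hex IPv4, then hex port) into ('127.0.0.1', 8080)."""
    ip_hex, port_hex = hex_addr.split(":")
    ip = socket.inet_ntoa(struct.pack("<I", int(ip_hex, 16)))
    return ip, int(port_hex, 16)

def remote_peers(path="/proc/net/tcp"):
    """List the (ip, port) pairs this machine currently has TCP state with."""
    peers = set()
    with open(path) as f:
        next(f)  # skip the header row
        for line in f:
            # Field 2 of each row is rem_address, the remote endpoint.
            ip, port = parse_remote(line.split()[2])
            if port:  # port 0 means a listening socket with no remote peer
                peers.add((ip, port))
    return sorted(peers)
```

Feeding each resulting IP to a geolocation database is the step that turns “an address I don’t recognize” into “why is my phone talking to Thailand?”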
When we started the Seguru project and were looking at what our phones were communicating with, we were surprised. I’m an expert and I didn’t know my phone was doing this. I didn’t know my phone was communicating with Russia. Or what the hell is in Thailand? Why is my phone talking to Thailand?
And oftentimes it’s because these companies have partners and a lot of it is advertising. A lot of it is tracking you in one way or another, primarily for the purpose of feeding you ads for consumption of whatever products are out there.
I think that we are surprised only because we don’t really know what they’re doing with our data, and once you actually start to understand that, then you can either accept it or you can choose not to accept it. I want to be able to give people the ability to say yes or no.