Feb 28, 2019 | 7 min read

Spotlight Series: Exploring the "Grim Gap" in Industrial Security with Joe Weiss

This series highlights the key insights and lessons from our Digital Leadership series of podcasts. We spotlight the important takeaways from our interviews in an accessible format. The following Digital Leadership insights come from Joe Weiss, Managing Partner at Applied Control Solutions LLC. Stay tuned for the full podcast interview with Joe Weiss next week; in the meantime, take a look at our library of podcasts.

What in your background has shaped your view of security?

By education and initial experience, I was a nuclear engineer. For my first five years at EPRI, I ran the nuclear instrumentation, controls, and diagnostics program. In the mid-1990s, I was approached about cybersecurity, and I told them no one cares about email. I was doing work getting digital systems installed, and we never thought that they were a double-edged sword. With Y2K, when officers became liable, the silos came down. We extended the unintentional Y2K programs into intentional programs to address cyber. We started not because we knew there was a problem, but because every time I went to a cyber security meeting and said, "All of this applies to a control system," I was told it was the first time anyone had brought it up.

Cyber is a three-legged stool – the first leg is physical security, the second is cyber security, and the third leg is control systems. The IT world has to solve the Windows and IP problems. Control systems are a small area, and the only people who could solve the control system problems were the utilities and the control systems community. Here we are in 2019, and there has not been a lot of work to secure control systems, or even to identify where we don't have security.

Why is it that control systems have not had security embedded already?

People complain that the engineering world didn't consider security, but the security community doesn't understand engineering liability and safety. Sensors and controllers have two roles – they feed IP networks, but they also have real safety consequences. Part of the reason I wrote my latest blog is that Dragos and GE were writing to educate the engineering community about security, but no one is educating the security community about engineering. Cyber security is typically taught in computer science, where students are not typically required to take engineering courses, and engineering students have no cyber security requirements – so the gap starts from the beginning.

What are some of the key issues that create confusion in the industry?

People need to come together on the definition of a cyber incident. The de facto definition is that you are connected to the Internet and someone is trying to steal or manipulate your data. The NIST definition is communications between systems that affect confidentiality, integrity, or availability. That definition does not mention malicious behavior. The difference in impact is that IT can't kill anybody, but engineering can. If an incident cannot impact the reliability or safety of control systems, it's not actually that important. Prior to 9/11, the engineers owned the systems and cyber was largely irrelevant – the bottom line being that if the control systems don't work, nothing else matters. After 9/11, cyber was yanked away from the control systems people and handed to IT.

What has changed as far as the focus on security?

When IT took over the cyber security problem, the focus became protection of the networks – NOT keeping the lights on. If the network is down, the lights can still be on; but if the lights are off, the network is down too. We are focusing on the wrong thing. In terms of defining standards, I'd like to say the control system industry is unified, but the electric and nuclear industries keep going off on their own. One of the biggest efforts in the electric industry is to invent its own form of Internet communications.

Can you share any examples?

In the Bay Area, two incidents happened in one week that illustrate the contrast. BART decided to turn off wireless service when the trains were underground, and Anonymous, the hacker group, responded by hacking the personal information of riders. That same week, BART had a control room problem where they lost the view of every train in the system and had to shut the system down with every train where it was. There were tons of front-page stories about Anonymous stealing customer information, but the shutdown of the trains and loss of visibility, which could have resulted in loss of life, was buried on page 6 of the San Jose Mercury News.

I have personally tracked over 1,100 incidents in the US, many of which have been completely ignored by the media. The extent of the problem is enormous. In IT security, you are typically securing a box running Windows, but in the control systems world you are dealing with systems of systems. It is really difficult to validate control systems, because they can use protocols that have no security whatsoever.
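To make "no security whatsoever" concrete, consider Modbus/TCP, one of the most widely deployed control system protocols (an illustrative example we've chosen; Weiss does not name a specific protocol above). A complete read request is just a handful of bytes, and the frame format has no field for credentials, encryption, or integrity checking – any device that can reach the controller on the network can issue one. A minimal Python sketch:

import struct

def modbus_read_holding_registers(transaction_id, unit_id, start_addr, count):
    """Build a complete Modbus/TCP 'Read Holding Registers' request.

    The frame is an MBAP header followed by the PDU. Notice what is
    NOT here: no credentials, no session key, no signature -- the
    protocol itself has no notion of an authorized sender.
    """
    function_code = 0x03                      # Read Holding Registers
    pdu = struct.pack(">BHH", function_code, start_addr, count)
    mbap = struct.pack(">HHHB",
                       transaction_id,        # transaction identifier
                       0,                     # protocol identifier (always 0)
                       len(pdu) + 1,          # remaining byte count (unit id + PDU)
                       unit_id)               # target device address
    return mbap + pdu

# Read 10 registers starting at address 0 from unit 1 -- 12 bytes on the wire.
frame = modbus_read_holding_registers(transaction_id=1, unit_id=1,
                                      start_addr=0, count=10)
print(frame.hex())                            # nothing in the frame to authenticate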

 


Momenta Partners encompasses leading Strategic Advisory, Talent, and Investment practices. We’re the guiding hand behind leading industrials’ IoT strategies, over 100 IoT leadership placements, and 17+ young IoT disruptors. Schedule a free consultation to learn more about our Connected Industry practice.