Mar 6, 2019 | 4 min read

Conversation with Joe Weiss

Podcast #49: Control Systems Cybersecurity: A Grim Gap

Joe Weiss works as the Managing Partner of Applied Control Solutions and Managing Director of the ISA99 Standards Organization. He has a deep background in control systems security and has been active in the control systems cybersecurity community for decades. Control systems are everywhere – in every type of power generation, substations, pipelines, manufacturing, transportation, medical devices, and pharmaceuticals – yet their security is often not understood. Our conversation covered a range of topics, including how security is only a small part of control system challenges, the three "legs" of security (physical security, IT security, and control systems), the cultural challenges to raising appropriate security awareness, and areas where security has been overlooked. He provides a look into the magnitude of the problem while pointing to encouraging areas of progress.

Recommendations:

Control Systems Unfettered 

Protecting Industrial Control Systems from Electronic Threats by Joe Weiss   

ISA99, Industrial Automation and Control Systems Security 

 


 

Transcript

Good day everyone, this is Ed Maguire, Insights Partner at Momenta Partners. Today our guest is Joe Weiss, who is the Managing Partner of Applied Control Solutions. The way we connected was quite interesting: Joe is an incredibly insightful thinker, and he had responded to some of the content that we put out on our website, reaching out a couple of times with really insightful commentary and a link to his blog at controlglobal.com/unfettered. I was so impressed with the depth and uniqueness of his insights into cyber security around control systems that I reached out and asked him to be a guest on the podcast and share some of his insights.

So, Joe thank you so much for joining us today. 

Thank you for asking. 

Let's start with some background. Could you share a bit of perspective on your background, what's shaped your view of technology, and what brought you into the field of expertise where you are now focused?

By education and by my initial experience, I'm a nuclear engineer, so my focus was always instrumentation, controls, and diagnostics. That's what I was doing for years and years, first when I was with GE Nuclear Energy, and then when I went to EPRI (the Electric Power Research Institute). In fact, my first five years at EPRI were spent running the nuclear instrumentation and diagnostics programs, because there were regulatory concerns over sensors and diagnostics; so much of what I was doing there was driven by those regulatory considerations. Then I moved from running the nuclear program to running the fossil plant instrumentation and controls programs, which leads to an interesting aside.

In the mid-1990s the IT organization at EPRI came to me asking me to address cyber security, and I basically told them, 'Go home, get away, because nobody cares about email' – that was what we thought cyber was in the mid-to-late nineties. The other thing that was going on was that I was doing all of this work trying to get digital systems installed everywhere, along with communication, so much of what I was doing was making things vulnerable, because we never thought there was a double-edged sword to that. We thought the more communication you had, the better things would be; the concept of a bad guy simply never came to mind. Then what happened from there is, I ended up becoming the technical lead, essentially, for the electric industry on the Y2K embedded systems program, which in a way was the first non-malicious cyber program.

But the big thing about Y2K was that because officers and directors were personally liable, all of a sudden all the silos everywhere came down, and people talked for the first time. So, that part was great, and basically at one minute past midnight, 2000, we had to figure out what to do next. What myself and others on this EPRI Y2K program thought was, 'Well, gee, since we have people talking, why don't we just extend from the unintentional,' which is what Y2K was, 'into the intentional cyber?' That's how that whole thing started; there was never any particular reason, we weren't worried or anything else. There were programs ongoing, but they were, if you will, somewhat on the dark side. So, when we started the cyber program at EPRI, it wasn't because we knew there was a problem. What happened was – and this is leading to where I am now – when I started going to all of these cyber security meetings where you'd have the big communication companies, DOD, etc., I'd sit in the back of the room and pretty much everything they said went over my head, all of these X.509s and certificates.

But each time I said, 'Wait a minute, none of this applies to a control system, and none of it could even be used by a control system,' I was told that was the first time anybody had ever brought that up. That's when I started thinking, 'Wait a minute, we may have a problem here.' So I started and put out the first EPRI product, as well as the R&D plan – this is back in 2000, maybe even early 2001. But the point was, it was supposed to have addressed the entire control system loop, from the control room, which is where you had all of your Windows HMIs, or VAX/VMS, whatever, all the way down through the controllers, down, if you will, to the sensors and final elements. It was supposed to have been the whole thing.

When I started the program – and it's also in my book, which I can go into later – I tried to say that cyber is a three-legged stool. The first leg is physical security: guns, gates, and guards. The second leg is IT security, in other words all of the networks, be it Windows, whatever, all of the IP networks. The third leg is the control systems. What I was saying all along is that the IT world really has to solve the Windows and Internet protocol problems. The control system world is a small player there, and if you look even today, obviously the IT world hasn't solved it, because we keep finding all of these major IT breaches. But the real point was that the only people who could solve the last leg of the stool, which is the control systems – where things go boom in the night – would be the utilities or the control system community, because if they didn't, it wouldn't get solved.

And here we are now in 2019, and there hasn't been a lot of work done to actually secure things, or even to identify where we don't have security, once you get below the IP network layer. So, that was the initial 'Where are we?'

We can get into this a little bit later, but I know you've been doing quite a lot of work tracking some of the security shortfalls and incidents that are completely under the radar. But to expand on the point: why is it that industrial control systems have evolved such that, to your point, the concept of security has not been embedded into the engineering? Is that just a result of the siloed nature of the technologies, or are there other factors at work?

You're right that there are other factors, but I want to point out one other thing which I hadn't really thought about until you just said it: people complain that the control system world, the engineering world, didn't consider security. Well, what I'm dealing with to this day – and this goes all the way to NIST – is that the security community still doesn't know or understand reliability and safety! So, if you want this major culture gap, here's where it's coming from.

The sensors and these controllers feed two pieces: they're feeding the IT piece through networks, but they're also what's operating systems that have to be reliable and that inherently have safety consequences. Part of the reason I wrote my latest blog, this thing about the sensor issue, is because Dragos and GE were writing a series of whitepapers to educate the 'engineering community' about cyber security, which is really good. However, other than my book and some other things, there's nobody trying to educate the security community about the engineering concerns.

I brought that up years ago; I was on a panel in Las Vegas at an education conference. And this starts right at the beginning, because cyber security is normally taught in computer science. If you're taking computer science, or cyber security, in most cases they don't require you to take any engineering classes. Meanwhile the engineering domains, whether it's electrical, mechanical, chemical, nuclear, or industrial, don't really require you to take any cyber security. So, you're creating that gap from the beginning.

It's pretty remarkable. One of the comments I've heard from some software engineers in the IT security world is that a lot of applications end up being built without any thought for the type of structural integrity and structural engineering concerns that characterize physical engineering – if bridges were built the way software is built, they'd be collapsing all over the place.

Yes, the official blue screen of death for a bridge.

Exactly, and you just can’t have that. I’d be interested to learn a bit more about some of the work that you’ve done, documenting the problem. Could you share some of the insights that you’ve come away with as you’ve done such deep work in the control system space? 

I'm going to go back to what you asked before, because I think it will tie things together, which is recognizing all of these control system cyber incidents that have occurred to date. To start with, people need to come together on the definition of a cyber incident. The undocumented but de facto definition of a cyber incident in the IT world is: you're connected to the Internet, you're using Windows, and somebody is trying to steal or manipulate your data. The NIST definition – I think it's FIPS 140, it's been around for a very long time – is electronic communication between systems that affects confidentiality, integrity, or availability. There are two things about that definition.

  1. There is no mention, by choice, of the word malicious. The NIST definition does not require an event to be malicious to be a cyber incident. 

  2. It's an IT definition, because the most important letter to us in the control system world is the letter S, for safety; IT can't kill anybody, but engineering can and has. 
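
As a purely illustrative aside, the difference between these framings can be made concrete in a few lines of Python. This is only a sketch with invented field and function names, not language from any standard: the common IT framing keys on malicious data compromise, while a CIA-style definition counts any event that degrades confidentiality, integrity, or availability, malicious or not, and neither captures the safety dimension engineers care about.

```python
# Illustrative sketch only: invented field names, not quoted from any standard.
from dataclasses import dataclass

@dataclass
class Event:
    affects_confidentiality: bool
    affects_integrity: bool
    affects_availability: bool
    malicious: bool
    safety_impact: bool  # the "S" the control system world cares about

def is_it_style_incident(e: Event) -> bool:
    """De facto IT framing: someone maliciously going after data."""
    return e.malicious and (e.affects_confidentiality or e.affects_integrity)

def is_cia_style_incident(e: Event) -> bool:
    """CIA-based framing: malice is not required."""
    return e.affects_confidentiality or e.affects_integrity or e.affects_availability

# A non-malicious controller misconfiguration that trips a plant:
event = Event(affects_confidentiality=False, affects_integrity=True,
              affects_availability=True, malicious=False, safety_impact=True)
print(is_it_style_incident(event))   # False - invisible to the IT framing
print(is_cia_style_incident(event))  # True - and it carries safety consequences
```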

It's a broader definition, right? I think what you're saying here is that safety is a much broader, encompassing concept, and security and even cyber security are just a subset of safety in the broader sense?

Yes. The whole point is this, for the engineering world: if something cannot negatively impact reliability or safety, then it's really irrelevant. So, this whole thing about cyber is about the control systems, not the data. It's really important that people don't steal your formulas or other business confidential information, but if it can't affect the reliability or safety of the operation, then I'm sorry, it's just not that important. What's gotten lost is that prior to 9/11 the engineers owned the systems, and cyber was an issue, but it was a business issue. We had a big ISA conference in Houston on 9/10; we had a couple of sessions on cyber; we had utilities, oil and gas, chemicals, water, and whatever there. Delco was there, I think Procter & Gamble was there, we had a dog food manufacturing company there – because if the control systems don't work, you can't make anything.

Well, what happened was, the next day was 9/11, and I couldn't get out of Houston, nor could anybody. Following that, cyber was made a national security issue and yanked away from the engineering organizations and handed to IT. There were a lot of unintended consequences to that. One of the first is that prior to 9/11, when the engineers owned this, the focus was: is the process doing what it's supposed to do? In other words, are you keeping the lights on, the water flowing, is the assembly line running at the right speed?

Right, it's much more about availability, right? Keeping processes going.

Availability, productivity, safety – engineering concepts. Well, when IT got it, all of a sudden it became about the network, and to this day ICS cyber security has unfortunately morphed into protecting the networks – not keeping the lights on, but protecting the networks. And let me be very clear: you can keep the lights on even if the network is down, but if the lights are off, the network isn't going to be on anyway. We're looking at the wrong things; it's the end game we should be worried about, and the networks are just a way to optimize getting there. It's turned into, 'Gee, if you don't have the networks then you're toast,' and the answer is: that's not true!

Particularly when you're dealing with control systems, right? Because the control systems themselves are…

In the control system world 

Yeah. 

And this is why I keep joking – and I'm just using a name – that if you're not worried about the control systems, you might as well be dealing with Macy's. There's nothing there; or for that matter any financial institution or anything else, any retailer or whatever. What makes this different is that every industrial company has a front end with the business operations. For that front end, like I say, you might as well be Macy's if that's all you're worried about. What makes the control system world different is that it's the back end where you actually make things, and that's where physics comes into play, and you'd better be careful what you're doing. One of the real concerns is that because IT doesn't understand the implications of what they're trying to do, they have often caused more problems than the hackers.

That's the too-many-cooks syndrome, as it were.

It's not just too many cooks, it's having cooks who don't even know what's in the ingredients. So, what happens is – and this is kind of the genesis of where the ISA 99, or ISA 62443, standards came about – you've got the ISO 27000 set of cyber security standards, but they're for IT, and the concern was that if you try to apply them directly to a control system, you could cause real havoc and maybe even damage, and there has been damage. So, ISA 99 started to essentially create an ISO 27000 that was specific to control systems. I'd like to say that the control system world is unified, but it's not, because the electric industry of all things keeps going off on its own. For that matter so does the nuclear industry, and there are some real problems in both.

That’s interesting. You’ve been involved with standards efforts, what are some of the real challenges involved with establishing those? 

Everything goes back to culture. The culture being: when this was originally engineering, you had the engineering people who were involved, and they understood the hardware, the implications, you name it. Well, once it became a 'cyber/network problem', it went from protecting the hardware and the process to being all about the networking. One of the biggest efforts going on right now in electric is through IEC TC 57, which is trying to invent its own form of internet communications – and the hardware is out of scope.

It sounds like a lot of people have already worked on that, right?

Yes. And as to where we are – considering you're Momenta Partners – there's a big world out there right now of VCs and others looking for the next great security company to fund. Almost all of them, almost all of them, are some form of network monitoring. Again, it's going back to this issue of 'IT – it's the network.' DHS, and for that matter DOE, have done very, very little to actually fund projects that specifically address the control systems, and almost nothing when you get down to these lowest-level field devices – and it's 2019.

It's pretty astounding when you talk about this, because if you think about what could be some of our most vulnerable infrastructure, the fact that there really has not been any programmatic, or even sustained, effort to address some of these threats is pretty amazing. I would love to have you talk about the scope of the problem – maybe part of the issue is that people don't appreciate how big it is. You've done a lot of work on this, but we don't get the same type of headlines for control system failures; I have to say I've almost never heard of these failures.

I'll give you a classic example from where I live, but before I do that – when you ask how big it is, first of all, this is the invisible visible: these are the things that are everywhere, but you don't even see them. So, it's not just every power plant – and by the way that could be solar, wind, coal, nuclear, hydro – it's every substation, every pipeline, all of your manufacturing, all of your transportation, defense, medical, pharmaceutical. What a control system is, is simply a device – and it could even be a human device – that monitors and controls the process. So, think about your heart controlling blood pressure, or you're monitoring how hot something is to know, do I have to open a valve, or do I have to add water? Think about how ubiquitous this is. Where we are today, it's this funny thing of saying: if you're a doctor and you can't trust your temperature or blood pressure readings, how do you make a diagnosis? It's pretty straightforward, isn't it?

And yet all of the funding and all of the focus by DOE, DHS, you name it, is on the diagnosis. The assumption is, 'Well, gee, that temperature and blood pressure reading must be right.' Why? Because we said it is; we have no way of seeing anything else, therefore it is. So, the example I wanted to give you is what happened here in the Bay Area. This is a number of years ago, and the two events occurred within a few weeks of each other. BART, the Bay Area Rapid Transit, had decided that they were going to turn off the internet for a while when the trains were underground. That caused an 'insurrection', and Anonymous, the hacker group, in order to protest, ended up hacking the personal information of BART riders. That made the front page of the San Jose Mercury and the San Francisco Chronicle. Well, what happened a couple of weeks before – and this was on page six or so of the San Jose Mercury – was that BART had a control room problem where they lost the view of every train on the system. They no longer knew where anything was, so they had to shut down every train exactly where it was.

As best as I recall it may have been a router problem, but the difference is: this could have killed how many people? And yet it was page six and ran for one day, while there was a whole week of stories about what Anonymous did by stealing the BART customers' personal information. Gee, which is more critical?

No doubt, these control system incidents… that’s truly life and death. 

Or, another way of saying it: it's existential for this country, and we're hardly even looking.

You’ve been tracking a lot of these incidents; on your blog you mention you’ve tracked, is it about 1600 incidents? 

Over 1,100, but that's what I call individual incidents, and I'll explain how big this is. I think many people are familiar with Stuxnet. What Stuxnet was, was hacking into a controller to change the logic, so you could change the process, and then changing it back so nobody would know. So, here's the one that will get you: the Volkswagen cheat device scandal. What was that? That was going in and changing the logic in every single Volkswagen and Audi diesel, so that when the car was under test it could change the fuel and emission controls and pass the test, and then after testing change the logic back so they would get their 45 miles per gallon. If I hadn't told you that, I could have just been describing Stuxnet. That was at least 800,000 cars. Fiat Chrysler got hit with, I think, either 80,000 or 100,000. These are billions. And the interesting part about these was: normally in cyber you worry about a rogue individual; these were rogue corporations.

The next thing you think about is malicious: 'Well, they must be trying to hack or damage something.' In this case it was done because the EPA had changed the environmental requirements, and it was no longer physically possible for the diesel to meet both the new emission requirement and the mileage they were promising. So, the question becomes, what is meant by malicious? In this case the harm was to the environment and to the value of the cars – that's not what most people think malicious is.

It is pretty astounding, the scope of this. Is there a way to ensure there's a seal of good housekeeping, as it were, for control systems? Of course, it's almost impossible for security…

No, I'll tell you what: there are several organizations, including ISA. ISA has a thing called ISASecure, and there's TÜV – it's like a UL, and UL is one of the two – but UL is not really in the big industrial control space. The point is, these certifiers are certifying for safety; that's why they were there from the beginning. Cyber is a very different animal, and part of why this becomes so big is that in the IT world you're worried about a box: is that Windows box secure or not? In the control system world it's a system of systems. You may have a Honeywell plant distributed control system, which has, say, a bunch of Rockwell controllers, which then have a bunch of Emerson sensors and analyzers, which may have some Pepperl+Fuchs sensors associated with them, which may have valves from… I can keep on going.

So, if you want to actually put a seal of approval on something, what we really need is a seal of approval for the system as it is being used. It is really, really difficult to do, if you will, a cyber validation of this entire system, especially when you're going to have protocols or devices that have no security in them whatsoever.
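
To illustrate why a system-level "seal of approval" is so hard, here is a minimal, hypothetical sketch; the component list and protocol attributes are invented for illustration. The point it shows is that a single component speaking an unauthenticated protocol is enough to undermine an end-to-end security claim, no matter how well the rest of the stack is certified.

```python
# Hypothetical sketch: a plant is a system of systems, and one component
# with an unauthenticated protocol breaks any end-to-end assurance claim.
plant_components = [
    {"name": "DCS controller",    "protocol": "OPC UA",         "authenticated": True},
    {"name": "PLC rack",          "protocol": "Modbus/TCP",     "authenticated": False},
    {"name": "Field transmitter", "protocol": "4-20 mA / HART", "authenticated": False},
]

weak_links = [c for c in plant_components if not c["authenticated"]]
if weak_links:
    print("System-level assurance cannot be claimed because of:")
    for c in weak_links:
        print(f"  {c['name']} ({c['protocol']})")
else:
    print("Every component in the inventory supports authentication.")
```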

Are there ways that companies can audit their systems, or will there be external requirements? 

There are in the utility and nuclear space, but they're really not adequate. In the utility space you have this thing called NERC CIP – the North American Electric Reliability Corporation Critical Infrastructure Protection standards. The reason they're wholly inadequate is, number one, all of electric distribution – in other words, what goes to your house or building – is out of scope. All of these non-routable protocols, these sensor protocols that are using 1200-baud modems, that's all out of scope. Depending on the voltage of the transmission system, if it's not high enough it's out of scope; depending on the power plant, if it's not big enough it's out of scope. And if it's out of scope, they're not even looking.

That’s the challenge. What do you think would be necessary to catalyze a broader effort to perform audits, or deeper review? 

The first thing is, I believe we're looking at everything backwards. We're looking at everything as if it's an IT system, and so there's this whole idea that if we've gone through all the networks and the networks are good, therefore we're good. Wrong. What we need to do is turn this entire paradigm upside down and go back, in a sense, to where we were before we had the IP networks, which is to ask: what is most important? The lights staying on, the water flowing and the water being clean, the boiler in a refinery operating in its optimal condition, the valves that are part of the safety system being sure to really operate. That has to be done from the engineering side, and it can't be done from the network side. The network side can help, but the network side is not the complete solution.

What cyber has been up until now is having the network side be the complete solution. It's intractable: no matter what any network monitoring company tells you, we will never be able to assure the cyber security of a control system if all we're doing is 'network monitoring'. If you're not looking at the actual process, knowing that the process is good, and then coupling that to the network, you're dead on arrival.
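
A minimal sketch of what "looking at the actual process" could mean, with entirely invented tag names and limits: the check runs against the physical measurement itself – its engineering range and its credible rate of change – independently of whatever the network traffic looks like.

```python
# Hypothetical sketch: sanity-check a reported process value against
# engineering expectations, independently of how it arrived over the network.
from dataclasses import dataclass

@dataclass
class ProcessCheck:
    tag: str          # e.g. a boiler drum level tag (name invented)
    low: float        # engineering low limit
    high: float       # engineering high limit
    max_rate: float   # maximum credible change per second, from process physics

def plausible(check: ProcessCheck, previous: float, current: float, dt_s: float) -> bool:
    """Return True if the new reading is physically credible."""
    within_range = check.low <= current <= check.high
    credible_rate = abs(current - previous) / dt_s <= check.max_rate
    return within_range and credible_rate

# Invented numbers: a drum level that jumps 40% in one second would pass any
# network-layer inspection but fails this process-level check.
drum_level = ProcessCheck(tag="BLR1_DRUM_LVL", low=0.0, high=100.0, max_rate=2.0)
print(plausible(drum_level, previous=55.0, current=95.0, dt_s=1.0))  # False
```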

Are there any either companies or industries that you think at least understands the scope of the problems, and maybe taking some encouraging steps in the right direction? 

I'm going to speak to the point of encouraging steps, and I'm going to point out again: I'm an independent consultant, I work with all kinds of different organizations, governments, etc. I'm the managing director of ISA 99, and I do the best I can to be as independent and impartial as possible, because that's my job.

I'm going to break it into two pieces: one is the control systems, and the other is the end devices. With one exception that I know of – and everything I'm telling you is what I know of – all of the major control system vendors are taking cyber security seriously. I want to be really clear about that: they are spending fortunes, they're devoting all kinds of resources to try to secure their systems. But with the legacy systems, the initial platform was a platform that was never designed to be secure. There's only one company I know of, a small start-up that is able to do this because it doesn't have legacy to protect – there's a plus and a minus there – and it's a company called Bedrock. What it has done is design a control system from scratch to be cyber secure, as well as to incorporate the lessons learned from control system issues in the past. It has even made an attempt to include EMP (electromagnetic pulse) protection. So, what I'm really saying here is, if a company really, really wanted to do it, yeah, they can… one has.

At the device level, what's happening is we've got all of these legacy devices, literally hundreds of millions of them. They can't be secured, and they're going to be there for the next 10 to 15 years, so we need to have some way of at least monitoring what is happening, and that's where this one little outfit in Israel comes in, a little company called Cedar. You've got to look at the raw signals to actually know what's happening; if you look at it after it's an Internet packet, it's too late. The hope is that as these newer sensors and devices are built, they really do incorporate the requirements laid out in 62443. It's going to be important, because these newer devices are going to have internal webservers. This is where we are and where we're going, period.

Certainly, it will open up a lot of new issues and vulnerabilities. As you look forward is there anything that you’re optimistic about, or any particular concerns that bubble up the most? 

There are two things that in a sense make me somewhat optimistic, and this is on my blog site. The first is that Moody's is now really starting to look at cyber security and recognize it as a potential solvency issue that can affect their ratings, which basically means we have a chance, for the first time, to have these kinds of issues really float up to the C-level.

That is something that’s been discussed for a long time just in terms of liability, but that’s really interesting. I was not aware of that. 

This just happened. It's on the blog site: Moody's has a monthly magazine, and in the January issue there was a full page devoted to PG&E and the potential bankruptcy. The bottom paragraph was about cyber security, not just for PG&E but for utilities in general. So here is Moody's explicitly stating, 'This is a concern to us,' and that message needs to get across, because they're not accepting 'but I've got a piece of paper saying I'm NERC CIP compliant.' They really want to know that you're securing your facilities, that you're not going to be subject to insolvency.

Hopefully the insurance companies will pick up on it, because with all due respect to what's out there, they don't really understand the real technical issues involved and the potential real impacts. So, that's the first thing.

The second is, the National Academy of Engineering wanted me to write a paper, and to start getting involved in taking this seriously. If we don't, from a market perspective the ICS cyber market is going to keep growing just because of the fear factor. The problem is, with all of that put in, you've got the biggest backdoors you could imagine, and that is a huge risk to everybody. The flip side is, if we do it right and start looking at the process, not only do we have a chance to make many of the cyber threats go away, but we also have a chance to deliver what IoT and Industry 4.0 keep promising, because IoT and Industry 4.0 are based on two things: lots of sensors and big data analytics. And if you can't trust what you measure, IoT and Industry 4.0 aren't going to go very far.

Yes. No question, that’s a big issue. 

But again, with this type of technology of looking at the sensors, the beauty is that it's an enabling technology. It can transform so much if you know, in real time, how good the measurement you've got right now is. That has so many implications; we've just never had that capability before. So, there is a chance to make lemonade out of lemons if people choose to.

No doubt. This has been a really enlightening conversation; I certainly was not aware of the extent of the challenges and the scope ahead. I would love to get a sense of some resources: where can people go to learn more about the challenge and how best to address it? Can you recommend a resource?

There are a couple of things, and again I'm going to be biased. I have a book that I wrote – I wish I could say it's out of date, but it isn't – called Protecting Industrial Control Systems from Electronic Threats. It was published in 2010, and either fortunately or unfortunately it's still valid today. It will give people a much better idea of what is and isn't a control system, of the differences between IT and control systems, what the unique issues are, and what we should be doing.

The second thing is, check out my blog site, I think it will give people a better idea, www.controlglobal.com/unfettered  

The third is, get involved with ISA 99. If people are interested in participating, it's a standards organization; I believe it's the most relevant and important of any of the control system standards organizations in the world – that is my feeling, and I happen to be on a bunch of them. So, if people are interested, they can send me an email, or they can go directly to ISA.org and find the ISA 99 site. There are a number of different parts to the standards: there's patch management, there are system considerations, there's risk management, there's what system integrators should do, what vendors should do. There's a whole series of standards within this envelope.

The main thing is, get involved with the smart people. 

That’s good advice, we’ll post a link to your blog and to the book. I really appreciate this, again this has been Ed Maguire the Insights Partner at Momenta Partners, and our guest has been Joe Weiss who is with Applied Control Solutions. Joe, thank you so much for your insights, it's fascinating and I’ve learned a ton and had no idea how deep these challenges were. So, thank you again. 

And thank you for the opportunity, I hope people found it of interest. 

 

[End] 

 
