Matthew Harris talks about the Invisible Threats of USBs

On this episode of Forging Connections, Matthew Harris talks with Tim about the threat of weaponized USBs. Listen in now to learn how USBs are an invisible threat in the workplace, and how workers can avoid the dangers of USBs and other cybersecurity threats.

Episode Transcript

Tim Verras (00:02):

Welcome to Forging Connections, a podcast from Honeywell about the convergence of IT and operational technology for industrial companies. We'll talk about the future of productivity, sustainability, safety, and cyber security. Let's get connected.

Hi everybody. This is Tim Verras with the Forging Connections podcast. Today, I'm sitting down with Matthew Harris. He's a principal marketing specialist in our cyber security team, and he's gonna talk to us a little bit about weaponized USBs. Matthew, welcome.

Matthew Harris (00:33):

Hi, welcome. Thank you.

Tim Verras (00:34):

Yeah, so talk to me a little bit about cybersecurity and your personal journey. I like to get a sense of what drove people to do the awesome things that they do. So how did you get into cybersecurity?

Matthew Harris (00:47):

Well, having been in IT for almost 30 years: several years ago, at a normal customer meeting, I walked into the office and was waiting outside the MD's office, and I glanced into the finance department to see everyone in tears. I walked in to see the MD and he was completely pale. It turns out that they'd had a ransomware attack. Somebody had taken some photos of a kid's birthday party over the weekend, brought the USB key in, and shown it around the office, not knowing that the USB key was infected. Not only did it infect those systems, it propagated through the organization and infected their backups. They paid the ransomware fee, which is a discussion point in itself, but the decryption software was bugged, so it destroyed all of their corporate data and their backup data. They basically had no business left; a multi-generation business had died. I just stumbled back to my car.

Matthew Harris (01:52):

And on the way back to the office, I kept asking myself: what on earth has just happened? How did it happen? What can I do to stop it happening again? That really kicked off my personal agenda, having seen the physical and emotional side of cybersecurity, rather than coming at cyber from a technical protection point of view. I've seen the devastation that the other side of ransomware and USB keys can do, and that really started my journey. The more I talked to people about cybersecurity and whether they were doing it, the more I saw people explaining that they didn't understand what I was talking about, or simply not paying attention. I started running cyber awareness training sessions to try and tell as many people as possible about doing the basics, and it just wasn't working, and I didn't understand why. If you think of an old technology term, "technophobia", where people are afraid of technology, there has to be some sort of "cyberphobia" on top of that which makes this whole area sit in people's blind spots. So when you talk to people about cybersecurity, you need to be really aware of where they sit mentally on the subject, what their skills and abilities are, and whether they're comfortable dealing with things outside their comfort zone, because cybersecurity fits into that category. I really got fascinated by this, and the reason I find the USB side of protection and security so interesting is that it takes me all the way back, all those years ago, to when I first walked into that manufacturing organization.

Tim Verras (03:43):

That's a great story. We talk so much here at Honeywell about the technical aspects of cybersecurity, but that emotional aspect is a side you don't often hear, so that's a really interesting approach, Matthew. And it's really awesome that you were able to pivot that into a career and focus on weaponized USBs. That's where I want to go next. So talk to me a little bit about weaponized USBs. What does that mean, what are some scenarios you might see, and how do USBs get weaponized?

Matthew Harris (04:23):

This is the thing: these things have been around for over 20 years now. At nearly every marketing event ever, you'll see USB keys sitting on the tables, and we've got them in drawers, in that special little drawer in the kitchen full of knick-knacks, and goodness knows what's sitting on those. They're in engineers' bags, pockets, coats, cupboards; they're absolutely everywhere, and the threats keep coming and evolving. Now, the simplest case is just a USB key that has malware on it: somebody inadvertently plugs it in and away it goes. But weaponized USB keys are more insidious; they're basically nasty, because they look like a normal USB key but have been reprogrammed to pretend to be a keyboard.


So when an unsuspecting user plugs that USB key in, thinking it's a normal key and they're just going to move some files about, it's already started. Look at the Rubber Ducky, which is a very funny name for a devastating bit of technology: it's a keystroke injection tool that can type about a thousand words a minute automatically. And there's a new version of the Rubber Ducky, shown at the Black Hat conference a couple of weeks ago in Vegas, that can automatically look at what environment it's been plugged into and then deploy exploits based on those results. While this is going on, the user has absolutely no idea, because none of the typed commands show on the screen. What makes them particularly nasty as well is that they don't really leave a footprint. Think about trying to get into an organization from outside, hacking in through the website, or through a VPN connection, or whatever means: you're trying to break through the corporate perimeter, and there's all manner of firewalls, and then there's the network protection in the routers.
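To get a feel for why the user sees nothing, here's a small back-of-the-envelope sketch. The thousand-words-a-minute figure comes from the discussion above; the payload string and the five-characters-per-word convention are illustrative assumptions, not measurements of any particular device:

```python
# Rough illustration: how quickly a keystroke injector can finish typing.
# ~1,000 words/min, using the common convention of 5 characters per "word".
WORDS_PER_MINUTE = 1000
CHARS_PER_WORD = 5

chars_per_second = WORDS_PER_MINUTE * CHARS_PER_WORD / 60  # roughly 83 chars/sec

payload = 'powershell -w hidden -c "..."'  # hypothetical short command
seconds_to_type = len(payload) / chars_per_second
print(f"{seconds_to_type:.2f} seconds to inject the payload")
```

At that rate, even a multi-line payload is finished in well under a few seconds, long before the victim would think to look at the screen.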


Then you maybe have departmental firewalls, and so on. So there are layers of protection you need to break through. But a weaponized USB key is plugged straight into an industrial control system workstation, so it bypasses pretty much all of the corporate security controls. And if an organization says, "oh, we have this covered, because we've got all the ports locked down": if you remove the mouse cable and plug a weaponized USB key in, it's pretending to be a keyboard, or a mouse, or something else, all while it's doing its work. When you consider that USB keys are invisible anyway, that makes them particularly dangerous. Now, the likelihood of this happening is quite low, but the potential damage is severe: what they call loss of view, where you can't see what's actually going on in the control environment, or loss of control, meaning there's now a runaway system failure.
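One defensive pattern implied here is to look past what a device claims to be and check which USB interface classes it actually exposes: a "storage" key that also presents a keyboard interface is exactly the composite device being described. The class codes 0x03 (HID) and 0x08 (mass storage) are defined by the USB specification; the device records and the allowlist below are made-up illustrations, not any real product's API:

```python
# Hypothetical sketch: flag USB devices that expose a HID (keyboard/mouse)
# interface without being on an approved allowlist. A weaponized "storage"
# key typically presents a HID interface alongside its storage interface.

HID_CLASS = 0x03           # USB interface class: Human Interface Device
MASS_STORAGE_CLASS = 0x08  # USB interface class: mass storage

APPROVED_HID_IDS = {"046d:c31c"}  # made-up vendor:product allowlist

def suspicious(device):
    """True if the device claims a HID interface but isn't an approved one."""
    return (HID_CLASS in device["interface_classes"]
            and device["id"] not in APPROVED_HID_IDS)

devices = [
    {"id": "046d:c31c", "interface_classes": [HID_CLASS]},                      # approved keyboard
    {"id": "abcd:0001", "interface_classes": [MASS_STORAGE_CLASS]},             # plain USB key
    {"id": "abcd:0002", "interface_classes": [MASS_STORAGE_CLASS, HID_CLASS]},  # storage that also "types"
]

for d in devices:
    print(d["id"], "SUSPECT" if suspicious(d) else "ok")
```

On Linux, tools such as USBGuard apply this same allowlist-by-interface idea at the kernel level; the sketch just shows the shape of the check.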

Matthew Harris (07:45):

And if you think of that happening within a chemical plant or a nuclear power station, you can imagine the consequences. When you think about the destruction, these aren't like normal hackers. When hackers try to get into an industrial organization, they're not going to just put ransomware in there to close a system down. Gartner predicts that by 2025, most OT environments are going to be weaponized, and what they mean by that is there'll be a back door or remote access into those environments, to actually do something if and when something needs to be done. And if attackers are using weaponized USB keys that also have wireless connectivity, so a threat actor can remotely connect directly into that workstation, it does change the way we think about what level of protection we need.

Tim Verras (08:45):

All right. So Matthew, that's a really fascinating way to look at it. I understand that the folks on your team have just released the yearly cybersecurity threat report that Honeywell puts out. For those of you listening who don't know it, every year we put out a threat report covering the biggest, latest threats to OT environments that we're seeing at Honeywell. We feel it's important to share that information, because the more people know about these threats, the more they can do to prevent them or detect them early. So Matthew, talk to me a little bit about our latest threat report. What are we finding?

Matthew Harris (09:19):

It's really interesting, because we've been doing this now for four years, and it's very specific to the OT USB environment. The top three points I would make: first, there's been an increase in infections from exploits specifically targeted at USBs, from 37% to 52%. Second, the remote access capabilities have stayed around 51%, which makes me think they've managed to achieve something, but again, what does that mean exactly? Third, and a bit more worrying, the threats that can actually cause a loss of control or loss of view have moved from 79% to 81%. These are the numbers we've seen and what we've been able to block, and that reemphasizes the point I mentioned earlier about Gartner and what they expect to happen by 2025.

Tim Verras (10:23):

That's great. And how do we put that threat report together? Do we gather it from our cybersecurity customers and our experts, or how do we find this information?

Matthew Harris (10:33):

The data's actually gathered from our SMX systems, dotted through hundreds and hundreds of customers, and we're primarily just analyzing it. It's anonymized as well, for security purposes understandably, so we can only draw general conclusions, but it gives us a very detailed understanding of the types and nature of the threats that are beginning to emerge. And because of the nature of the SMX solution, it helps us deal with, not zero-day, but "near-day" is probably a better description, emerging threats as they come out, targeted specifically via USB into the OT environment.

Tim Verras (11:16):

That's great. And you know, I don't like to talk too much about our products here on our podcast; we like to share knowledge. But I do think in this case it's important for our listeners to understand: what exactly is SMX? How does it work? What's its purpose?

Matthew Harris (11:30):

It has very strong capability because it does one thing, and it does that one thing very, very well. It helps customers protect their OT USB environment, helps them understand who is using what devices, and ensures that they are enforcing policy correctly. So for example, an engineer visits a particular site, and all they need to do is plug the USB key into the SMX and scan the files that they want to work on. Files that haven't been scanned will be hidden. They then move through the organization doing their job, and if any files do infect the USB key along the way, then because that infected file now on the key wasn't scanned, it can't be transmitted across to other workstations when they move on to the next machine. It also enforces behavior, because it makes sure that the engineers can only use the correct USB key.

Matthew Harris (12:35):

That's the one that's been scanned. And when they leave, they scan the USB key out, and that unlocks the rest of the USB key for work in other areas. Now, the main thing about the solution is that it was built by OT cyber engineers for the OT environment. It's extremely good at what it does, focusing down on the OT USB environment and providing that enforcement capability. A good example: say you scanned a USB key in, popped it into your bag, walked off to whatever control station you needed to go to, put your hand in to grab a USB key, but grabbed the wrong one by mistake and plugged that one in. Because that other USB key may not have been scanned in, it won't be recognized and can't be accessed. That's probably one of the key fundamental points.
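The check-in/check-out policy described here can be sketched as a simple allowlist: only a scanned-in key is usable, and only files that scanned clean are visible. To be clear, this is a toy illustration of the pattern, not how SMX is actually implemented; every function and data structure below is hypothetical:

```python
# Toy sketch of the scan-in / scan-out pattern described above.
# NOT the SMX implementation -- just the policy idea: only a scanned-in key
# is usable, and only files that scanned clean at check-in are visible.

scanned_in_keys = set()   # keys currently checked in at the kiosk
clean_files = {}          # key id -> set of filenames that scanned clean

def looks_malicious(filename):
    """Stand-in for a real scan engine (pure illustration)."""
    return filename.endswith(".exe")

def scan_in(key_id, files):
    """Check a key in: scan its files and record the clean ones."""
    scanned_in_keys.add(key_id)
    clean_files[key_id] = {f for f in files if not looks_malicious(f)}

def visible_files(key_id, files_on_key):
    """At a workstation, show only files that scanned clean at check-in."""
    if key_id not in scanned_in_keys:
        return None  # unregistered key: refuse access entirely
    return [f for f in files_on_key if f in clean_files[key_id]]

def scan_out(key_id):
    """Check the key out again, releasing it for use elsewhere."""
    scanned_in_keys.discard(key_id)
    clean_files.pop(key_id, None)

scan_in("key-1", ["config.bak", "payroll.exe"])
print(visible_files("key-1", ["config.bak", "payroll.exe"]))  # only the clean file
print(visible_files("key-2", ["anything"]))                   # never scanned in: None
```

Note how a file picked up after check-in is never in the clean set, so it can't propagate to the next workstation, which is the containment behavior described above.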

Tim Verras (13:42):

Got it. Okay. So our threat report is generated from the data we see from those various SMX systems out at our client sites, and then we share that with the public. So let's talk about that for a second. I've talked to enough cybersecurity folks to know that it's not a question of if, it's a question of when. So how do you mitigate against these USB threats when they do pop up in an OT environment?

Matthew Harris (14:07):

The easiest way of doing this, though it seems particularly challenging to achieve, is to get the basics completed. With the complexity of industrial environments, there's a particular challenge around understanding what the threat is, depending on whether you're looking at it from an IT point of view or an OT point of view: what they call the IT/OT divide. There's about an 80% match between what the technologies do, but that 20% difference in the OT environment is where a lot of issues can emerge. So when it comes to mitigating them, the OT engineer can, generally speaking, work within the IT environment safely, but not necessarily the IT engineer within an OT environment. A good way of explaining this might be: if you were patching a laptop, you would just let the patch run and pop off to make yourself a nice cup of coffee.

Matthew Harris (15:12):

If you're patching something in an OT environment, it's a physical event. Imagine you're driving your electric car down the German autobahn at 200 miles an hour, and just as you're about to come up to a tight bend, it tells you it's about to patch your brakes. The whole concept of OT is that if something stops working, there's likely to be a physical event that can cause harm or loss of life; if you have an IT failure, you have reduced access to an application or personal data. So there's a fundamental human difference between IT and OT, and that's one of the things we need to understand when we look at how we mitigate these problems: working out where the threat came through the system, identifying what that threat was, putting controls in place to stop it happening again, and then working backwards from a forensics point of view to recover, but also to understand what damage has been done, because in some cases it can take anywhere from six to nine months to understand whether a company's been hacked at all.

Tim Verras (16:28):

I imagine that's a long nine months when you're on the receiving end of it. And that's a long time to be shut down if you're measuring an assembly line in dollars lost per minute; that can have a big effect. So let's dig into that a little bit: the difference between IT and OT. Everyone has heard the big IT hack stories, right? Customer data stolen, credit card data stolen, that kind of thing. But in an OT environment, do companies have dedicated OT cybersecurity teams in the same way they have IT teams, or do they usually rely on the IT teams to carry that load?

Matthew Harris (17:09):

This is really interesting, and that's a very good question, because it's generally borne on a number of shoulders. In some organizations it may sit squarely with the site manager; in another it may just be with the automation leader. In other places it may be a combination of the site manager and the operations director, the IT director, maybe a CSO or chief information officer. But fundamentally, it comes down to what the board of directors have decided the risks are, what levels they've applied to those risks, and who they think is most capable of actually dealing with them. That's why it becomes important to make sure there's a common understanding of the risks between the board, the leadership, senior management, and the staff, so that when something does go wrong, everyone understands what they need to do. But this is the challenge, because every organization has a different way of dealing with the cyber threat, or dealing with the cyber tasks that need to be done.

Tim Verras (18:17):

And I imagine that plays right into the hands of bad actors looking to exploit it, because there's not really a uniform way of doing it, and that inconsistency can cause the kind of chaos they thrive in, I would imagine.

Matthew Harris (18:31):

Yeah, absolutely. And then we can slap some social engineering on top of that. If we look at the ways USB keys can be delivered into environments, of course you have the manipulated route, where somebody's coerced into taking a USB key in; there was an example a while back of somebody being forced to take in a weaponized mouse that gave remote access to somebody. Another aspect is USB dropping, where somebody may tailgate into an office building and leave USB keys in strategic places around the office. The restroom is one, where a key can be left by the mirror; but it's not just that it's been left by the mirror, it's how they've labeled the USB keys. In the restroom it could say "payroll.exe", and somebody will look around, make sure no one's watching, and just pop it in their pocket to have a look at later.

Matthew Harris (19:31):

In the canteen, it might say something along the lines of "honeymoon photos to be printed". An old colleague of mine was an ethical hacker, a white hat hacker I think they're called now, and he focused on banks and casinos. What was interesting is he would use a sports drone to fly around the car parks of secure environments to see where key people parked their cars. He would then fly back later with USB keys attached underneath the drone, and he wouldn't drop the keys near the board members; he would drop them near the people who want to be on the board, the ones who are hungry to get somewhere, because they were almost guaranteed, 98% statistically, to actually plug it in and see what's on it. And that's where the back door systems come in, so the threat actors can come through the IT environment and get into the OT environment. And if they're targeting engineering specifically, they could use the same approach as well.

Tim Verras (20:42):

Yeah, that's fascinating, because often when we see it depicted in movies or TV shows, it's usually someone hacking away in code on a keyboard, but really there's a lot of human psychology that they're playing with.

Matthew Harris (20:56):

There is a lot of human psychology in it, and probably the top three tips I can think of are these. First, if an email asks you something emotional, logical, or that you feel duty-bound or legally obliged to act on, it's generally going to be a bad email, and it will ask you to click on something. So a good piece of advice is to never click on a link in an email; go directly to the site via the browser. Second, have a password manager with a complex master password that you use to manage all the other passwords. And third, coming back to the human point: trust your instincts. A lot of people do feel that something's wrong with that link, with that text message, with that phone call, with the USB key, with something they've seen. Trust your instincts and pull back; don't do anything further, and ask a colleague to come and have a look.

Tim Verras (21:55):

That's great. Those are, those are, uh, excellent tips. And I think that kind of wraps us up today. Matthew, is there anything else you wanted to cover?

Matthew Harris (22:02):

I think really just reemphasizing the basics. If we understand what we've got, what it's doing, and what it's done, then we can help ourselves get the best protection. And with USB keys, I guess COVID is quite an appropriate comparison. My daughter picked me up on a regular basis about washing my hands: I'd wash them, then turn the tap off, having just touched that tap with a dirty hand, and now I'd retouched that tap. Learning a new behavior, a new habit, is a three-to-six-month process. So when we talk about how we can improve behavior, improve security, and do security by design, this isn't something you click on once or twice a month and you're done. This is almost a new behavior and a new understanding we need to come to, from a humanity point of view, moving forward; and it's about how we do that without making life any more complicated than it already is.

Tim Verras (23:06):

That's great. Well, Matthew, thank you so much for taking the time out of your busy day to talk with us, and good luck out there fighting the good fight.

Matthew Harris (23:14):

Thank you very much. Take care.

Tim Verras (23:16):

Alright, take care.


This has been Forging Connections, a podcast from Honeywell. You can follow Honeywell Forge on LinkedIn and download new episodes from our website. Thanks for listening.