Redefining CyberSecurity

Integrating Human Factors Engineering in Cybersecurity | Human-Centered Cybersecurity Series with Co-Host Julie Haney and Guest Calvin Nobles | Redefining CyberSecurity Podcast with Sean Martin

Episode Summary

Explore the groundbreaking intersection of human factors and cybersecurity with Dr. Calvin Nobles, where we uncover the essential role of designing cybersecurity systems that optimize human performance. Dive into how integrating human-centered approaches can transform the field, making it more intuitive, effective, and attuned to real-world user needs.

Episode Notes

Guests: 

Julie Haney, Computer scientist and Human-Centered Cybersecurity Program Lead at National Institute of Standards and Technology [@NISTcyber]

On LinkedIn | https://www.linkedin.com/in/julie-haney-037449119/

On Twitter | https://x.com/jmhaney8?s=21&t=f6qJjVoRYdIJhkm3pOngHQ

Dr. Calvin Nobles, Ph.D., Portfolio Vice President / Dean, School of Cybersecurity and Information Technology, University of Maryland Global Campus [@umdglobalcampus]

On LinkedIn | https://www.linkedin.com/in/calvinnobles/

____________________________

Host: Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]

On ITSPmagazine | https://www.itspmagazine.com/sean-martin

View This Show's Sponsors

___________________________

Episode Notes

In a recent episode of the Human-Centered Cybersecurity Series on the Redefining CyberSecurity Podcast, co-hosts Sean Martin and Julie Haney dove into the intriguing world of human-centered cybersecurity with their guest, Dr. Calvin Nobles, Dean of the School of Cybersecurity and Information Technology at the University of Maryland Global Campus. The episode provided a wealth of knowledge, not only about the significance of human factors in cybersecurity but also about how organizations can better integrate these considerations into their cybersecurity strategies.

The conversation illuminated the critical role of human factors, a field born out of experimental psychology and foundational to related subfields such as human-computer interaction and usability. Dr. Nobles' insights shed light on the need for cybersecurity systems to be designed with human limitations and strengths in mind, thus optimizing user performance and reducing the risk of errors. It's a call to move from technology-centered designs to ones that place humans at their core.

A significant point of discussion revolved around the common misunderstandings surrounding human factors in cybersecurity. Dr. Nobles clarified the definition of human factors, pointing out its systematic approach to optimizing human performance. By fitting the system to the user, rather than forcing the user to adapt, cybersecurity can become more intuitive and less prone to human error.

The episode also touched on the concerning gap in current cybersecurity education and practice. Dr. Nobles and Haney highlighted the sparse incorporation of human factors into cybersecurity curricula across universities, stressing the urgency for integrated education that aligns with real-world needs. This gap points to a broader issue within organizations—the lack of focused human factors programs to address the human element comprehensively.

Practical advice was shared for organizations aspiring to incorporate human factors into their cybersecurity efforts. Identifying 'human friction areas' at work, such as fatigue, resource shortages, and a lack of prioritization, can guide initiatives to mitigate these challenges. Moreover, the suggestion to provide cybersecurity professionals with education in human factors underlines the need for a well-rounded skillset that goes beyond technical expertise.

This episode serves as a beacon for the cybersecurity community, emphasizing the necessity of integrating human factors into cybersecurity education, practice, and policies. By doing so, the field can advance towards a more effective, human-centered approach that enhances both security and user experience.

Top Questions Addressed

___________________________

Watch this and other videos on ITSPmagazine's YouTube Channel

Redefining CyberSecurity Podcast with Sean Martin, CISSP playlist:

📺 https://www.youtube.com/playlist?list=PLnYu0psdcllS9aVGdiakVss9u7xgYDKYq

ITSPmagazine YouTube Channel:

📺 https://www.youtube.com/@itspmagazine

Be sure to share and subscribe!

___________________________

Resources

 

___________________________

To see and hear more Redefining CyberSecurity content on ITSPmagazine, visit: 

https://www.itspmagazine.com/redefining-cybersecurity-podcast

Are you interested in sponsoring this show with an ad placement in the podcast?

Learn More 👉 https://itspm.ag/podadplc

Episode Transcription

Integrating Human Factors Engineering in Cybersecurity | Human-Centered Cybersecurity Series with Co-Host Julie Haney and Guest Calvin Nobles | Redefining CyberSecurity Podcast with Sean Martin

Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as it is,” and we hope it can be helpful for our audience.

_________________________________________

Sean Martin: [00:00:00] And hello, everybody. You're very welcome to a new episode of the Redefining CyberSecurity Podcast. I am your host, Sean Martin, and I'm thrilled to have a new episode of the series, Human-Centered Cybersecurity, that I get the joy of co-hosting with Julie Haney from NIST, where we get to talk about all kinds of cool things, human factors and beyond. 
 

Uh, of course, as folks know who follow ITSPmagazine, Marco and I had a vision that the human element would be key. And, uh, it's great to see so much focus on this. And, uh, like I said, I'm thrilled to have Julie join me as a co-host for this conversation with Calvin Nobles. We're going to talk about just that, human factors, and, uh, how do we build a program? 
 

How do we build products? How do we build ops and all the fun stuff with the human in mind? So Julie, um, just to refresh our guests' memory: who you are, what you're up to at NIST, [00:01:00] and then, uh, we'll move over to Calvin to get some background on you.  
 

Julie Haney: Yeah. Thanks, Sean. Um, to reintroduce myself. 
 

I'm Julie Haney. I lead the Human-Centered Cybersecurity Program at the National Institute of Standards and Technology, or NIST for short. I'm very excited to be a co-host on this sub-series of podcasts with Sean. Um, today I'm very excited. I'm always excited about our guests. I'm very excited to have Dr. 
 

Calvin Nobles. Um, he is the newly appointed Dean of the School of Cybersecurity and Information Technology at University of Maryland Global Campus, um, and he's joining us today to talk about human factors. I had the pleasure of meeting Calvin, um, probably a couple months ago, and we've had a few discussions. 
 

He's super smart, very interesting, um, I'm sure you're going to be very, uh, interested in what he has to say. Um, so welcome, [00:02:00] Calvin. And, uh, just to start it off, we usually ask our guests, um, you know, can you tell us about your current role and what was the path that got you there?  
 

Calvin Nobles: Well, thank you all for having me on the show. 
 

I'm very grateful to be here. And I'm always excited to talk about human factors, the human element in cybersecurity. So, my current role, as Julie said: I'm currently the Dean of the School of Cybersecurity and Information Technology. And, uh, until recently I was working as the department chair at the Illinois Institute of Technology, leading the cybersecurity and IT programs and everything that comes with that. 
 

And so one of the things about, um, being in academia, you get to do research. And so my research area, you know, pertains to the human element in cybersecurity. And so I just recently wrote up a short paper for an academic conference where I took a look at the definitions of human factors that currently exist in the literature. And I will tell you, I looked over 100-some articles and [00:03:00] came up with 16 definitions of human factors, all different definitions, but there were some commonalities. 
 

So I was able to capture that in the article. But again, it made me think about how do we really define human factors in cyber security. And so one of the reasons why this is so important to me is because when I started working in cyber security, one of the things I worked realized that there was a lot of parallels with aviation. 
 

And so I couldn't let that go. And so I started crosswalking the human factors over to cybersecurity. And what I realized is there's a significant gap around the human element. I know when we talk about the human element, in some cases people say, you know, you want to talk negatively about it. We're just trying to bring the science of human factors to the human element in cybersecurity. 
 

Don't get me wrong, our workers and our cybersecurity professionals are some of the brightest in the world, but just like anybody else, there are times when they have weaknesses and limitations, and we must design around those so that we can optimize [00:04:00] human behavior and performance. 
 

Sean Martin: Well, the way I look at it, and I'll kick it off with this: yes, they're super smart. Yes, they work really hard. So why would we not treat them well? Why would we just say, ah, they're smart, they can deal with whatever we toss at them, and let them figure it out? No. And we can look at different organizations. 
 

I remember I have a connection back to kind of the early days of the set-top box, before everything was cloud streaming like it is now, but there was a big investment in human factors to understand how people would interact with a device connected to a TV, to make that whole viewing experience different and better. 
 

And that was purposeful, to make the experience better. And I think that can be applied. And I often talk about on the show that security is due for a transformation, and that can cross everything from how we fit into the business, uh, so we drive [00:05:00] value for the business, not just protect and reduce risk. 
 

But I also think in terms of operations and teams, some of which I'm sure we'll touch on, where we have a chance to make the environment better, make the experience better, make the teams better, make the outcomes better. And I truly believe that those are centered and rooted in the topic we're going to talk about today. 
 

So Julie, you want to kick it off with some of your ideas?  
 

Julie Haney: Sure. So it was very interesting when I met you, Calvin, and read some of your articles, um, because, like you mentioned, there are like 16 different definitions of human factors, and people kind of misuse the term sometimes, or misunderstand it completely. 
 

Um, so I'm wondering, how do you define it? What is the definition of human factors as a scientific discipline?  
 

Sean Martin: Is it one long description combining all 16, Calvin?  
 

Calvin Nobles: Now, you know, one of the things I believe in, I believe in simplicity. How do we keep it simple? [00:06:00] And so, you know, my basic definition of human factors from a scientific perspective is: it's a systematic approach to optimizing human performance and behavior based on the design of the system, so that the human can fit into the system and environment that we build for them. 
 

And so, in other words, just like in aviation, they build a cockpit so that when the pilots get in, it's very standardized, really built for the pilots. We must take that same approach in cybersecurity: we build an environment so that we've accounted for human limitations and human weaknesses, but the end users still have the ability to attain optimal behavior and performance. 
 

Julie Haney: And I mean, one of the things I was, um, thinking, Calvin, is that we see a lot of other terms that have relationships to human factors. So usability, human-computer interaction. What are the relationships between [00:07:00] those and human factors? Are those just subsets?  
 

Calvin Nobles: Yeah, absolutely. So in the United States, human factors was born out of experimental psychology. 
 

Now, a few years later, it started taking shape and became human factors as the human factors professionals know it today. And so, based out of that, some other things grew. One of those things is human-computer interaction, which came about when personal computers started becoming ubiquitous. 
 

One of the things we noticed was that we needed an area really focused on helping us produce and engineer user interfaces and get a user experience that's going to be very positive. And so that's where human-computer interaction came from. So, in other words, human factors gave birth to HCI. 
 

When we talk about, um, usability and some of the other terms you hear, you hear things like, um, human-machine interaction, also in cybersecurity, and you hear them in other fields. One thing we have to remember is that human factors gave birth to all of those fields. Now, one of the things we have to take into account is that human factors is a vast domain.[00:08:00]  
 

I mean, human factors is in everything we do. And so I clearly understand why the computer sciences and the other professions carved out an area. The medical community did it too. And so it works in their favor; it really gives them the opportunity to reduce risk. And so I understand how HCI and some of the other areas came about, because the field is very big. And so I think we should do likewise in cybersecurity.  
 

Sean Martin: So you talked about interfaces there, and Julie mentioned kind of the interaction. And that was kind of the story I gave as well, right? The interaction or engagement with the system. And you described, Calvin, fitting in nicely there. 
 

It's not just the actions, right? The pulling of levers and turning of knobs, pressing of keys. It's also what we see, perhaps what we hear, the response we get when we interact. Um, obviously, we look at phones and things where we get the haptics, right, where we actually feel [00:09:00] something as we're typing, those types of things. 
 

How important are all of the senses, incoming and outgoing, when we talk about this?  
 

Calvin Nobles: Very important. In some cases, we have some organizations that have built extremely complex cyber ecosystems. And so when you talk about maximizing your senses, whether we're hearing, whether we're touching, or whatever it may be, we can easily overload the user and put them in an information overload situation, or we can put them in a situation where the cognitive battery is draining rather quickly if they're doing a very demanding task. And so we have to account for that. And so one of the things is that managers have to really understand how cognitively demanding the job or the functions are that those individuals are doing. 
 

You can't put someone in a cognitively demanding environment and leave them in there for six to eight hours and think that they're going to have the same level of production as someone who's doing low-level [00:10:00] tasks. And so these are the things we have to start training on, helping people understand how we bring this type of expertise to the cyber domain, and helping managers by giving them extra tools in their toolkits to manage the performance of their people. 
 

Sean Martin: So, yeah, I was wondering, do you want to ask, Julie?  
 

Julie Haney: Go ahead.  
 

Sean Martin: All right. So, I mean, my brain is filled with so many things, but one of the things we want to touch on is how this relates to cybersecurity specifically. So what are some of the most pressing issues that you see here? I mean, you kind of touched on the fatigue element of it. 
 

Um, and keeping people proactive, active, and engaged. So what are some of the things you're seeing that we need to focus on?  
 

Calvin Nobles: You know, I'm glad you mentioned the thing there about fatigue. I wrote a paper several years ago around human performance issues in cybersecurity. And when we talk about human [00:11:00] performance issues in cybersecurity, we're not talking about performance evaluations driven by HR. 
 

We're talking about the actual performance of working in a cybersecurity environment, in that very large cybersecurity ecosystem. And so when we talk about it, we talk about fatigue, distractions, burnout, stress, the lack of prioritization; we could go on forever and list a whole laundry list of things that actually impact human performance. 
 

And so at the end of the day, you know, one of the things that concerns me is the lack of cybersecurity professionals in industry. We know there's a talent gap, and there are different arguments around that for a whole other show, right? And so we'll park that right there. 
 

But, you know, we've got cybersecurity professionals who are coming to work every day, and they are on small teams, or teams that are under-resourced, or teams that lack the technological capability they need, or a team that's not getting the training they need to stay up to speed. Another factor that we have to account for is the adversary; the threat [00:12:00] actor gets a vote. 
 

And so that keeps them on their toes constantly. So it's not like they get to stop running the marathon and take a quick walk. It's like they've got to be constantly moving forward at that marathon pace, because the bad guys are moving at that marathon pace as well, and the bad guys can produce offensive capabilities faster than we can engineer defenses. 
 

And so we have to take that into account. And so all these things have an impact on the human element, and people get tired. And it's not like they get time off. It's just constant. And so that constant grind over time puts people in a state where they're very fatigued, where they might be stressed out. 
 

And we're not even talking about the personal stressors that individuals have in their lives, whether they're dealing with a health issue, a family issue, the loss of a loved one. Those things hurt too, and they have an impact on our cybersecurity professionals. And so we have to take pages from other [00:13:00] sociotechnical industries, like what they do in healthcare. 
 

When a doctor's not feeling well, and he's a surgeon, he doesn't perform surgery that day, and nobody questions him. Same thing with a pilot. If a pilot is not feeling well, he or she does not get in the cockpit that day. They call a reserve pilot in; as customers, we wait, the crew gets there, and then we take off, right? 
 

And so how do we bring practices like that into cybersecurity so we can make sure we're taking good care of our cybersecurity professionals?  
 

Julie Haney: Yeah. And so I'm wondering, you know, obviously human factors is very important in cybersecurity. But, you know, are organizations thinking about this? Do they have human factors programs? 
 

How are they doing with that? What's kind of the status quo with that?  
 

Calvin Nobles: Well, I speak on this subject across the nation, and I always have a slide that I show in my presentation. And so one of the slides I show is: which engineering discipline is missing? And [00:14:00] then I say, don't worry, it's an open-book test, and we'll walk through it. And I ask, how many of you have network engineers? Hands go up. How many of you have software engineers? Hands go up. How many of you have computer engineers? Hands go up. How many of you have human factors engineers? Hands don't go up. And so the thing that we have to understand from that is that most organizations don't have the organic talent to address the human element the way it needs to be addressed. 
 

We also have to understand that when we talk about people, most people think they know about people, but this is a really complicated area, to the point that, if you look at the aviation community, the healthcare community, mining operations, maritime operations, they all have human factors professionals on the team for a reason: because they're trying to create an environment where they can reduce the risk from poor human performance and behavior. 
 

So I think that's something that we have to account for. And so I think as soon as [00:15:00] we start realizing that we need to have human factors practitioners on our teams, we'll be able to make better traction than we're making right now.  
 

Sean Martin: And I want to poke you a little bit on the engineering part of that title. 
 

Um, because in my experience, uh, building products for a security company, mind you, and thinking about this, it comes in at the product requirements phase, and it comes in at the user story part of the building out, where we're saying: here's the experience we want. Here's what we want the system to do. 
 

Here's where the data comes into play. Here's where the network and the endpoint come into play. Here's where the person interacting comes into play. So to me, it's part of that, which then feeds other engineers. So I'm wondering, and this is naive, but is there a human factors engineer that you also need sitting next to your software [00:16:00] engineer and your network engineer and your, uh, computer systems engineer, as well as somebody who, like I described in my experience, has a focus on the use cases, the user stories, and the product requirements part of it? Do you need both?  
 

Calvin Nobles: You know, I get this question a lot. They ask me, if you could have a human factors engineer, where would you put them? I said, I would take a seat and park it right beside your principal security architect and let the two work together to understand. 
 

And that's where I would start. And the reason you want them to understand is because, first of all, your human factors engineer has to develop a sense of the environment. It's not like they're coming off the street and they're going to know your environment right away. They need to learn your environment. 
 

And from there, I think they can start branching out and working with different teams, and get a feel for where your friction areas are. Every organization is different. Once you determine where your human friction areas are, you can start working on solving problems and bringing about solutions to reduce those frictions.[00:17:00]  
 

Julie Haney: So I'm wondering, because as I'm thinking, you know, um, there's usually pushback when you say we need to hire someone new, right? We need to hire a human factors engineer. Is there any other way? I mean, obviously, a trained human factors engineer is going to be the expert in it. 
 

But can we somehow raise the bar a little bit? Just raise the awareness of human factors among the cybersecurity professionals that we have now? Would that help? Or maybe with the next generation of cybersecurity professionals?  
 

Calvin Nobles: Julie, that's a great question. And one of the things that I've realized is that human factors professionals are in high demand. 
 

They're not sitting on the sideline waiting to get in the game, right? They're employed in other sociotechnical domains. So if I had to think about what's the best way for us to approach this in cybersecurity, I would approach it by taking your cybersecurity professionals and educating [00:18:00] them in human factors, whether it's through a certification program, in some cases a master's degree, or in some cases on-the-job training, working with someone that you contract and bring in with the human factors expertise so they can start to develop the skills and come up to speed. 
 

And so that's a huge problem there: we are not even looking at where we're going to get this human factors talent from, you know, because those individuals are highly employed in other places, and they're doing a great job in making those other places safer.  
 

Sean Martin: I was wondering if you had, sorry Julie, if you could give a couple of examples. 
 

Maybe something adjacent, maybe it's very tech-oriented, where cybersecurity professionals can say, I can see that environment looks like mine, um, and they benefited tremendously by having human factors engineering involved.  
 

Calvin Nobles: Right. So let's talk about nuclear power and nuclear engineering, when you look at those power plants. 
 

Now, they don't show us the inside of those places, for reasons, but when they do show us the operators working at those control [00:19:00] boards, with all those massive, you know, knobs and those massive gauges, right? Those individuals are responsible for learning and reading those gauges and understanding what's happening behind those gauges. 
 

And so human factors engineers play a huge role in helping them design the layout of that panel where they're looking at those gauges, and in the training programs, so that those individuals can really understand what's happening with the plant without seeing the plant itself. And so that's just one example. 
 

The other example I love to use is when we think about NASA. NASA is all about building around that astronaut and keeping all the astronauts safe, you know. And I'm, you know, I watch the NASA channel; I'm a junkie. And so I pick up a lot of things from a human factors perspective. You know, look at how much assistance the astronaut gets, in some cases, from the people back in the, um, support stations, planning in those support roles. 
 

The reason being that they need to be focusing on the critical missions and things at hand. [00:20:00] So it helps them out tremendously when they've got people back in the control station saying, well, what about this, running checklists for them, doing the monotonous tasks that the astronaut doesn't need to be focusing on at that particular time. 
 

And so the other one I love to use is to think about a surgical team going into surgery. A human factors engineer is not going to be in surgery, but you know what, they look at the spaces, and they make sure those spaces are designed so that the surgical team can get in and get out with that surgery, and everything they do in there is about teamwork and executing at a high rate, while they're also looking at reducing human error in those procedures. And so those are some areas where we can look at how human factors have played a critical role in some really technical fields.  
 

Julie Haney: Yeah, those are great, great examples. Um, and thinking back a little bit to, um, your previous comments about teaching human factors to cybersecurity professionals, I know that you've been quite [00:21:00] involved in that, um, and you also did some work to see, you know, which universities are incorporating human factors courses in their cybersecurity programs and computer science programs. 
 

So what did you find? And what's the approach that you're taking to teach human factors to these, um, these people that are, you know, in these technical professions?  
 

Calvin Nobles: Right. So in the research we did, we took a very simple look at how many universities were actually accredited through the National Security Agency as academic centers of excellence for cybersecurity. 
 

And what we found out: we looked at the graduate level and we looked at the undergraduate level. And, so I'm trying to remember this off the top of my head, I believe at the undergraduate level it was like 232 programs when we looked at it, and only 14 offered a course with human factors or something related in the title. 
 

And when we looked at the graduate-level programs, I think there were a little over 300, about 306 programs, and only [00:22:00] seven offered a course in, uh, in human factors. So we need to do better in terms of helping academia understand the significance of this. You know, we watch industry flail at this every day. 
 

And as academics, we've got to say, how do we help them solve that problem? And one of the ways we can help them solve that problem is by bringing human factors into the classroom. And what do we mean by 'in the classroom'? Bringing the scientific approach into the classroom, helping students understand what human factors is as a discipline, and keeping it based in the science so that we can actually leverage that in industry. 
 

I know, I won't name the universities, but I developed human factors courses for two universities, and at one university it's one of the best courses. I mean, I have to limit the number of students in the course because I can't teach a graduate-level course with that many students in it. But one of the other things I really want to do is work with you, Julie and Sean, work with you and find out how we can develop [00:23:00] workshops to take this to universities, to help them understand how to develop this course and teach this course, because I think, you know, courses in human factors in cybersecurity are necessary. And I think my students will tell you it's a lot of work, but what they get out of the course, they'll say, is: I'm exploring things I've never done before, through a different lens that's going to help me understand how the human element 
 

is supposed to be integrated the right way into cybersecurity, and not what we mostly see today in cybersecurity: we build everything on a technological front, and then we just expect humans to gather around it. You know, that's not human-centered. And so how do we bring that human-centered approach back to cybersecurity through the science of human factors engineering? That's what we've got to do. 
 

Sean Martin: And I was excited. I was going like this, raising my hand, because you had a course. And then you said it's full. Ah, I was totally deflated. Not that I'd be smart enough to take that course anyway, but, [00:24:00] uh, I'm joking, of course. But no, I love that you're doing that work, Calvin. And I think for this to be successful, it has to be a full circle, right? 
 

So universities need to understand the value. The students need to understand the value. The hiring organizations need to understand the value. That's how this kind of stuff builds and ends up taking off. Um, let's look at the organization for a moment. Um, because these students need a place to go in cybersecurity once they have this learning. 
 

What are some of the signs that a cybersecurity program is flailing, or failing, or some other -ing, because they don't have this expertise in-house?  
 

Calvin Nobles: So some of the signs you'll see are chief information security officers burning out, and they're burning out pretty quickly. And one of the things we have to start taking a look at is why are they [00:25:00] burning out? 
 

What's causing that burnout? Can we have something like a human factors council, something that takes a look across an organization's cybersecurity program and says, you know, you are burning your CISO out, and it's having a cascading effect and burning out others? You know, identify some security controls we can put in place, or some type of programs or other initiatives, to address that. 
 

For instance, when I was in the squadron, we had a human factors council, and one of the things we did with that council, when there was an aviator or aircrewman who was unable to perform his or her duties, they could come to the aviation safety office and say, I've got a problem, I've got an issue, I need to leave the airplane for a while.
 

And there was no harm, no foul. And we worked with that individual. And when they got to a better place, from whatever it was, they came back, we put them through training, and we put them back on the aircraft. Something like that in the cyber [00:26:00] security world might be what we need, but I also understand that these companies are out here to be profitable.
 

It's a different environment when you're talking about a military environment compared to a company that's trying to make a profit. But at the end of the day, we've got to find a way where we can start reducing the amount of human error we see in the cybersecurity space. And I will tell you, I'm doing research right now around the cloud environment.
 

We're looking at cloud misconfigurations. And one of the things we're hearing from all the people we're interviewing is how complex it is. It's extremely complex. And the other thing we're hearing is that there's no training that really focuses on cloud misconfiguration. Now, they get a lot of training on the configuration aspects, but they don't get anything specifically addressing cloud misconfiguration.
 

And so there are some things we can do from a human factors perspective that can really help these organizations be a lot more successful than they are.
 

Julie Haney: Yeah, Calvin, you mentioned, um, human error, that we need to [00:27:00] find ways to reduce human error. Um, and I know in my own work, one of the topics that's always, like, you know, the hot topic, the thing that people, um, want to hear most about, is security awareness training, right?
 

A lot of organizations kind of pin the human factor, right, on security awareness and training. And you had a great point in one of your position papers, um, you said equating human error to a training and awareness issue is a fallacy. So I'm wondering if you could talk a little bit more about that.
 

Calvin Nobles: Absolutely. You know, I say this up front about cybersecurity awareness. I do believe it's important. But in a lot of organizations, they get cybersecurity awareness once a year. And a lot of times they're going to get it early in the year. And when they get it early in the year, think about that. The environment where we work in cybersecurity is constantly changing. 
 

There are new techniques, tactics, and procedures that the [00:28:00] adversary is throwing at us every day. And so, if someone gets training in March, that person is not going to get more training until the following March. So, look at the time lapse. You know, in that time, new techniques could confront these individuals, and they might not know what to do.
 

And so, that's why I believe if you're basing everything on cybersecurity awareness, that's wrong from the start. But cybersecurity awareness is important. And then you also hear me say on some of my presentations, everybody don't eat mayonnaise. And what I mean by that is that you can't give everybody this exact same cybersecurity awareness. 
 

If you work in HR, you need something that's modified for your environment. If you work as an IT or cybersecurity professional, you need something that's modified, that's tailored to you. And the same thing for our senior executives. They need cybersecurity awareness tailored to them, because they are very visible targets, right?
 

And so you can't feed everybody mayonnaise, because everybody don't eat [00:29:00] mayonnaise. And so that's why I believe that, you know, we've got so much focus on cybersecurity awareness, but that's just one piece of it. I think there are some other things we can do in terms of, like, um, understanding where people are. Are they best suited for their positions?
 

And if they're in those positions, have you prioritized their work so they can optimize their performance? Are you putting too much on them and they're not saying anything? You know, because in cybersecurity, a lot of workers are afraid to speak out, because it's seen as a sign of weakness, or you don't want to be the one to rock the boat.
 

And so that's another thing human factors professionals can help you with: establishing that open communication, like what we see in aviation, when they say, if you see something, say something. And in some cases, saying that in cybersecurity doesn't always return, you know, in a favorable way for you.
 

And so we have to continue to work to include and improve our communications. We have to also work to understand that the [00:30:00] chain of communication goes up and down. And one of the things that I love, when we talk about awareness, is that in one of my courses, for the midterm, they have to come up with a modified Human Factors Analysis and Classification System for cybersecurity, the same system that the aviation community and other transportation industries use to investigate an accident.
 

And so they modify it for the cyber environment. And one of the reasons I love teaching that HFACS, it's nothing more than an error taxonomy. The reason I love teaching it is there are four levels in it. And at the very top, there's a level called organizational influences. And that level is where you capture things like organizational preparedness, organizational resources, organizational resource allocation, policies, governance.
 

The thing I'm trying to say here is this is where you [00:31:00] catch your executive responsibilities, right? But if you think about it, every time we talk about a data breach or something happening in that aspect, we always look at the lower level. But with the HFACS model, you just can't look at the lower level and say that was the root cause. You have to walk it up through every level. And by the time you walk it up to the top level and you start identifying the, you know, the factors there, what you realize is that there's no stone left unturned for something catastrophic happening in an organization, from a data breach, a cyber attack, or a ransomware attack.
 

And so again, that's another training mechanism that I would like to see organizations use in cybersecurity, because it will help give a bigger picture, a broader picture, and it helps senior leaders understand, I played a role in that, a more significant role than what I thought. And so it's not just pinning the tail on the donkey at the lowest level. You've got to pin that tail on the donkey up high, because the decisions made up there have [00:32:00] cascading effects.
 

Sean Martin: Who are you calling a donkey? 
 

My tail hurts, no. I want to stick on, stay on this, uh, human error thing, because I think you touched on it earlier just now as well. Am I getting feedback? You're good. Um, because the procedures, a lot of times, it's: this is how we do things. Here are the playbooks. Here's how you participate in that. And you're doing it.
 

And then there's always an exception where something doesn't fit the playbook, or you're not fitting, because, like you described, I'm a little stressed or I'm tired, or whatever the situation may be. And things happen. People are human. But a lot of times it's the procedure, and the procedure isn't flexible enough, or we've abstracted the details so far away from the user that they can't make better decisions, right?[00:33:00]
 

So they only have the guidance at the abstraction level. So we don't set them up to succeed in many cases. And I have an issue with the word error in that sense, because in many cases they're following what's been prescribed, and the procedure failed. It's not a human error. So how do organizations look at that?
 

And I know it's probably earlier, up front, looking at the policies and the procedures and whatnot. But how do organizations recognize that that may be an issue for them, if it is?
 

Calvin Nobles: Right. I think what you said is spot on. You know, when it is a procedure issue, we always say it's a human error, but there's a more correct way to characterize it than just saying it's a human error.
 

And so I think what organizations have to do is look at their procedures. And when you look at the procedures, you have to understand what we call human error science. There's a lot of science that's been out there for a long time that other industries [00:34:00] have really mastered. And when they mastered it, what it helped them understand is how to properly write your procedures.
 

And not only just write your procedures, you have to test your procedures. And not just like we do in cybersecurity. You don't test them in a live environment, but you walk through them, and you have people explain the procedures, and you make sure they understand the procedures. Because can you imagine dealing with a cybersecurity incident and you don't have the procedures, or you don't know the steps?
 

Are you dealing with someone who's got a sick child at home? That person might be at work physically, but cognitively they're distracted. And so having strong procedures, and having guidelines, and having playbooks, and having other instructions helps them tremendously. I mean, we look at pilots and we're like, pilots are really cool, which they are cool.
 

But if you notice, everything about them is already scripted. So when they get in the cockpit, it's almost like they're doing the same dance, because they've got standardized practices of doing things. Now, I get it, there's no way every [00:35:00] flight is going to be the same, but because they have a very standardized way of doing things, it makes life easier for them.
 

And we should be looking at that in cybersecurity. How can we implement things to make it easier for our cybersecurity professionals, so we're not draining their cognitive batteries on a daily basis, where that battery runs out within that eight-hour work period? So I think there's a lot of things we can do.
 

We also have to look at and understand, like you were saying, Sean, human error is a broad field. You have lapses, you have slips, you have mistakes, you have unintentional, you have intentional. It's a broad field. And you know, sometimes it's very hurtful when we say, you know, somebody made an error or somebody made a mistake, and we leave it at that.
 

No. Do the investigation to find out what happened, and what you will discover is that a lot of times it's more an organizational issue than it is an individual issue. Right. And so that's how we have to look at human factors. It's [00:36:00] layered. You know, you've got the organization, and then you've got the management area in there, and then you've got the individual. A lot of times it comes down to the individual level.
 

And that's where we keep it. But we also have to raise it up a little and really explore what are some of the management issues that led to this, and what are some of the organizational issues that led to this. And this is why, again, I love the HFACS model, and I'm going to keep preaching that until we get a lot more people looking to leverage it in cybersecurity, because it gives you a broader look.
 

And the other thing that we're looking to do with modified HFACS is, it's used as a tool to explore accidents. Let's move it to the left of the boom and make it a proactive instrument that we can use to help organizations identify their weaknesses, so they can put fixes in place, so that we don't have to deal with the boom.
 

Julie Haney: Yeah, that's, that's a great point. That's, that's the, yeah, definitely [00:37:00] proactive is, is much better. Um, I, I must say, I love your, your cognitive battery. Love that. I'm going to, I'm going to use that. So as you, you talked about bringing it up to the organizational level. So if I'm a podcast listener and I work in an organization that does not have a human factors program as of yet, what kind of advice would you give to those organizations that are just starting out? 
 

What are kind of like the first steps?  
 

Calvin Nobles: You know, the first thing I'd ask any organization to do is think about the human friction areas at work, whether it's fatigue, whether it's a shortage of talent, whether it's the lack of prioritization. I mean, when I worked in cybersecurity, every day, everything was a top priority.
 

And that means, well, you know what we say when that happens, right? Think about when we have things that don't go well, and there's no documentation, nothing codified to help the team work through that. So at best, they're ad hoc, right? How can we [00:38:00] help them put emergent procedures in place to reduce the stress around that?
 

You know, how can we work with managers to understand that, you know, when people come to work, they're not going to be at 100 percent physiologically every day? You know, can managers weigh in? How well do you know your people, to be able to say, you know, Julie, you don't look well today. You know, why don't you go home?
 

We've got it from here, you know, to give you that break. You know, these things matter, because other sociotechnical domains have practices in place where people can take a time out, call a day off, and everything is okay. And so what I really would like to see organizations do is really take a different look at human factors.
 

Take a look at it from the science aspect of it, and not from the working definition where human factors means all the bad things people do. We're kind of stuck there, to be honest with you. And so I'm trying to pull us back from just thinking negatively, and get us thinking more scientifically, where we can use human factors engineering to keep us to the left [00:39:00] of those booms, and to give us a better platform to take care of our people while they're at work.
 

Sean Martin: I know, uh, Calvin, you've, you've written some papers on this. You clearly have some courses available at some universities for this. Um, I'm hopeful that you can share some resources with folks. Uh, but is there one thing you could say based on our conversation today, that a team can do to redefine cybersecurity with human factors in mind? 
 

Calvin Nobles: Absolutely. There are several podcasts that, you know, I told Julie this earlier in the week, they get it right over in New York. They're getting it right. I mean, they are moving, you know, swim lanes ahead of us in human factors and cybersecurity. They've got some things that they put out, um, like, for instance, the Human Factors and Ergonomics Society.
 

They've got a cyber, uh, technical group that puts out information and work. And then you've got the, let's [00:40:00] see, can I get this right? You've got the Chartered Institute of Ergonomics and Human Factors. They've got some great documents, white papers, and other things that really help you understand the human element in cybersecurity, and these things you can download from the internet.
 

And I encourage everybody, if you're really interested in trying to solidify the human element in cybersecurity, I would start there. You know, and I will say this too, and I put this call to action out to every human factors, uh, professional that's working in academia: we've got work to do. We don't even have a textbook on human factors in cybersecurity.
 

And I'm working to change that. We've got to get a textbook out there so we have something to go to, so people can say, I really want to know how I can solidify the human element in cybersecurity, where do I start? Like, here's a textbook for you right here. And so I would love to do that and make sure that we start bringing this to the forefront, because right now, human factors in cybersecurity is on the second team, and it needs to be on the first team.[00:41:00]
 

Julie Haney: Yeah, Calvin, this has been a wonderful discussion. Um, I admire your work. Um, I know Calvin is assembling, um, a group of superstars to, uh, work with us at NIST to kind of, um, set some direction for human factors in cybersecurity, to help bring more of the scientific and the research world into practice, to make those connections, to help educate our cyber workforce.
 

I'm very excited to see where that collaboration goes. Um, yeah, Sean, any, any final thoughts? It's been a great discussion.  
 

Sean Martin: If there's room for a donkey on that, uh, on that working group, I'm happy to participate in that. It is a super important, super important project. So, um, no, I think it's incredible the work you're doing. 
 

Calvin, and I love that it's rooted in research and science, which this whole series that [00:42:00] Julie and I are putting together is connected to. And I'm thrilled for the work that you're doing, Julie, with NIST on this topic, both specifically and broadly. And, uh, yeah, I think, I mean, it's a very interesting topic as well.
 

As when you start to poke into people's minds: how do they think, and how do they interact with certain things, and why do they do certain things that way? It may not just be because of what you built for them; it's how they are interacting with what you built for them. So, uh, it's a really exciting space to explore.
 

And I'm thrilled you're doing it. Glad we had the chance to have this conversation and hopefully many more when, when the book comes out, maybe we can, we can have a follow up conversation and kind of walk through some of the highlights of that.  
 

Calvin Nobles: Absolutely. Be more than happy to.  
 

Sean Martin: Perfect. All right. Well, uh, I suspect there'll be a few good links to resources from Calvin, uh, that we can share with folks. 
 

And of course, I [00:43:00] encourage you to follow Julie and NIST, and Calvin and the work he's doing. And, uh, if you have a little spare time left over, please subscribe to the Redefining CyberSecurity podcast and stay tuned to our stuff as well. Um, please share with your friends, family, and enemies, and, uh, we'll see you on the next one.
 

Julie, always a pleasure to co-host these with you. You bring the amazing guests and the amazing topics, and I'm very appreciative of that, and all the cool questions too. And Calvin, thanks so much. We'll, uh, we'll see you all on the next one. Thank you.