Why are certain employees more likely to comply with information security policies than others?

Information security policies (ISP) that are not grounded in the realities of an employee’s work responsibilities and priorities expose organizations to a higher risk of data breaches, according to research from Binghamton University, State University of New York.

The study’s finding, that subcultures within an organization influence whether or not employees violate ISP, has led researchers to recommend overhauling the design and implementation of ISP and working with employees to fit ISP compliance seamlessly into their day-to-day tasks.

“The frequency, scope and cost of data breaches have been increasing dramatically in recent years, and the majority of these cases happen because humans are the weakest link in the security chain. Non-compliance with ISP by employees is one of the important factors,” said Sumantra Sarkar, associate professor of management information systems in Binghamton University’s School of Management.

“We wanted to understand why certain employees were more likely to comply with information security policies than others in an organization.”

How subcultures influence compliance within healthcare organizations

Sarkar, with a research team, sought to determine how subcultures influence compliance, specifically within healthcare organizations.

“Every organization has a culture that is typically set by top management. But within that, you have subcultures among different professional groups in the organization,” said Sarkar. “Each of these groups is trained in a different way and is responsible for different tasks.”

Sarkar and his fellow researchers focused on ISP compliance within three subcultures found in a hospital setting – physicians, nurses and support staff.

The expansive study took years to complete, with one researcher embedding in a hospital for over two years to observe and analyze activities, as well as to conduct interviews and surveys with multiple employees.

Because patient data in a hospital is highly confidential, one area researchers focused on was the requirement for hospital employees to lock their electronic health record (EHR) workstations when stepping away.

“Physicians, who are dealing with emergency situations constantly, were more likely to leave a workstation unlocked. They were more worried about the immediate care of a patient than the possible risk of a data breach,” said Sarkar.

“On the opposite end, support staff rarely kept workstations unlocked when they were away, as they felt they were more likely to be punished or fired should a data breach occur.”

The conclusion

Researchers concluded that each subculture within an organization will respond differently to the organization-wide ISP, leaving organizations open to a higher possibility of data breaches.

Their recommendation – consult with each subculture while developing ISP.

“Information security professionals should have a better understanding of the day-to-day tasks of each professional group, and then find ways to seamlessly integrate ISP compliance within those job tasks,” said Sarkar. “It is critical that we find ways to redesign ISP systems and processes in order to create less friction.”

In the context of a hospital setting, Sarkar recommends touchless, proximity-based authentication mechanisms that lock or unlock a workstation as an employee walks away from or approaches it.
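As a rough illustration of that idea (a sketch of ours, not something from the study), the following Python snippet shows the shape such a mechanism could take. It assumes a badge whose Bluetooth signal strength (RSSI) can be polled and a Linux workstation where loginctl can lock the session; the read_badge_rssi and request_unlock helpers and the thresholds are hypothetical placeholders.

```python
import subprocess
import time

NEAR_THRESHOLD_DBM = -60   # stronger than this: wearer is at the desk
FAR_THRESHOLD_DBM = -75    # weaker than this: wearer has walked away
POLL_SECONDS = 2

def read_badge_rssi() -> int:
    """Placeholder: return the badge's signal strength in dBm.

    A real deployment would query a BLE scanner (e.g. via the bleak
    library) for the advertising packets of the employee's badge.
    """
    return -50  # stub value so the sketch runs as-is

def lock_session() -> None:
    # Works on systemd-based Linux desktops; other platforms need
    # their own equivalent call.
    subprocess.run(["loginctl", "lock-session"], check=False)

def request_unlock() -> None:
    """Placeholder: trigger the workstation's unlock flow.

    Whether the session unlocks without a password is a policy decision;
    a real system might use the badge itself as an authentication factor.
    """
    pass

def main() -> None:
    locked = False
    while True:
        rssi = read_badge_rssi()
        # Two thresholds (hysteresis) prevent flapping at the boundary.
        if not locked and rssi < FAR_THRESHOLD_DBM:
            lock_session()
            locked = True
        elif locked and rssi > NEAR_THRESHOLD_DBM:
            request_unlock()
            locked = False
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    main()
```

The two-threshold hysteresis is the important design detail: it keeps a physician hovering at the edge of range from triggering a rapid lock/unlock loop.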

Researchers also found that most employees understand the value of ISP compliance and realize the potential cost of a data breach. However, Sarkar believes that outdated ISP compliance measures can put employees in a conflict of priorities.

“There shouldn’t be situations where physicians are putting the entire hospital at risk for a data breach because they are dealing with a patient who needs emergency care,” he said. “We need to find ways to accommodate the responsibilities of different employees within an organization.”

What’s preventing organizations from making pragmatic security decisions?

Human beings are poor judges of risk. For example, we perceive the risk of air travel to be higher than it actually is after a fatal aviation-related accident happens.

We also tend to dismiss risks simply because we don’t see a tangible negative impact right away. This is, for example, what prevents many from making dental hygiene a priority: we all know dental hygiene is critical to our health and a relatively easy “investment”, but when nothing bad happens immediately after skipping brushing once, many stop being consistent about it.

“It is hard or impossible to predict just how many skipped brushings it takes to get you in trouble with tooth pain, so we tend to take on more risk until we end up with a toothache and regret not investing enough in proactive maintenance,” Ehsan Foroughi, Vice President of Products at Security Compass, told Help Net Security.

“For security, in many cases it starts with skipping it and taking risky shortcuts when the product is not yet widely adopted or the company is small and young. But as it grows and the risk grows, we tend to overlook that until something bad ends up happening.”

Obstacles to surmount on the path to better security

Another thing that makes companies brush aside security is competition.

“Software is becoming the core of every industry’s competitive advantage and there is a lot of pressure from the market and competition to release new software or improvements to existing software faster and at a lower cost (so that a limited investment can yield more results),” he noted.

“Proper security hygiene, when done in the traditional way, gets in the way of agility and creates the dilemma: should we take on risk to move fast in the business, or should we slow down and do the right thing? Unfortunately, human nature pushes many to choose the fast and risky approach which leaves them with a ticking time-bomb of a security incident waiting to happen.”

Barriers to pragmatic security decisions

Other roadblocks to sensible security decision-making include:

  • Engineers not being well versed in security concepts and practices, and having a hard time communicating complex issues to business stakeholders
  • Executives and decision-makers at the business level lacking education and awareness around the topic, most specifically around the foundations of software security
  • Security teams being perceived as the only owner of the organization’s security.

What can CISOs do to make things better?

Like quality, security should be everybody’s job and responsibility, not just the QA/security team’s.

One of CISOs’ goals should be to improve security culture across the organization, by raising awareness, educating, consulting, promoting and providing processes and tools.

“When it comes to education, many think of hard skills such as security testing and coding skills. However, educating staff on how security affects the bigger business, how it can reduce revenue if not done right, and how it can affect them directly, is critical,” Foroughi noted.

He also advises CISOs not to wait for disaster. “The worst time to fix things is when an audit fails. Also, it costs a lot more to wrestle with malware clean ups and deal with ransomware than to enforce policies to protect data – so shift left and invest in proactive measures.”

But, at the same time, they should take care not to go overboard: enforcing extreme policies without regard for the value of the assets being protected or the impact on productivity and usability often results in people bypassing the policies, which is even more harmful.

Preparing for the future

Foroughi expects the compliance and technology landscapes to get more complex and demanding.

When it comes to introducing new technologies and the skills employees need to wrangle them, he advises organizations not to focus on a specific skill set when hiring, but to look for a foundational understanding in individuals.

“If you have the right people on board and the culture enables them to take initiative, they will bring the latest technology into the organization and will have the capability to quickly learn and adapt to deal with new problems,” he explained.

The problem of balancing security vs. time to market will also get harder to address, he says.

First and foremost, CISOs should be pragmatic and focus on getting 80% secure and 80% fast instead of choosing one over the other.

They should also know that they will have an easier time getting buy-in from the rest of the organization if they learn how decisions in the CISO’s domain affect the larger business, and how to present proposals for future investment from that perspective.

In general, CISOs have to educate executives on how security and risk management affects business goals and on the importance of finding the balance.

“Invest in automating the balanced approach to development and prioritize this investment,” he concluded. “When asking the developers to cooperate with you to roll out this automation, start by explaining why you are doing this – you will face much less resistance.”
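As one concrete illustration of the kind of automation he describes (our sketch, not a Security Compass product), the following Python pre-commit hook refuses commits that add obvious hard-coded credentials. A real programme would use a dedicated scanner; the point is the shape: a cheap, proactive check that runs before risky code ships.

```python
#!/usr/bin/env python3
"""Minimal pre-commit hook: block commits that add obvious secrets."""
import re
import subprocess
import sys

# A few common credential shapes (illustrative, not exhaustive).
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID format
    re.compile(r"(?i)(password|api[_-]?key)\s*=\s*['\"][^'\"]+['\"]"),
]

def staged_additions() -> list[str]:
    """Return only the lines being added by the staged changes."""
    diff = subprocess.run(
        ["git", "diff", "--cached", "--unified=0"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [l[1:] for l in diff.splitlines()
            if l.startswith("+") and not l.startswith("+++")]

def main() -> int:
    hits = [line.strip() for line in staged_additions()
            if any(p.search(line) for p in SECRET_PATTERNS)]
    if hits:
        print("Possible hard-coded secrets in staged changes:")
        for h in hits:
            print("  " + h)
        print("Commit blocked: move the secret to a vault or env variable.")
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Installed as .git/hooks/pre-commit, it costs developers nothing until it fires, and explaining upfront why it exists, as Foroughi suggests, greatly reduces resistance.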

How can we harness human bias to have a more positive impact on cybersecurity awareness?

Dr. Jessica Barker, Co-CEO of Cygenta, follows her passion for positively influencing cybersecurity awareness, behaviours and culture in organisations around the world.

Dr. Barker will be speaking about the psychology of fear and cybersecurity at RSA Conference 2020, and in this interview she discusses the human nature of cybersecurity.

What are some of the most important things you’ve learned over time when it comes to security culture? How important is it and why?

A positive and robust security culture is absolutely fundamental to the overall security maturity of an organisation. An organisation’s culture sets the tone for what is normal and accepted; it’s not what is written in a policy, it is what influences how people actually behave. From a security point of view, this is absolutely crucial and extremely influential.

Different cultures will influence whether people do what they should when it comes to security. For example, a culture in which leadership demonstrate a strong commitment to, and respect for, security is much more likely to result in positive security behaviours than one in which leadership are dismissive of it.

The phenomenon of social proof, in which people model their behaviour on how others act (especially those in positions of authority or those they particularly admire), means that the role of leadership in security culture is vital. People in an organisation look to those in leadership to see how they should behave.

If leaders are seen to follow security policies and good practices, such as wearing identity badges and challenging tailgating, then others throughout the organisation are more likely to follow suit.

A culture of fear, by contrast, is very destructive. If people feel they are going to be blamed for clicking a link in an email they later suspect was phishing, for example, they are less likely to report such incidents when they happen. A culture of fear does not reduce the number of incidents; it just drives them underground and reduces the likelihood of people reporting them.

When someone mentions security awareness training, there’s always a big split – some say it’s essential, others claim it’s a waste of money. What’s your take on this? Does it depend on the type of training?

Great security awareness training, which is part of a healthy cyber security culture and aimed at encouraging positive security behaviours, is essential. The problem is that awareness-raising training has a history of being dry, dull, technically focused and ineffective. Such material is not engaging, and not only will it fail to make a positive difference, it is actually likely to have a negative impact. Too often, training has been designed by people with technical expertise who may know what they want to say, but not how best to deliver it, or indeed what messaging is going to be most relevant and effective for the people they are communicating with.

For awareness training to be effective, it needs to be relevant to the people it is aimed at, it needs to be engaging and interesting, and it needs to feel useful. Talking with people about security in their personal lives, for example, can be really powerful because it is something everyone can relate to, and when people engage with the content in relation to their home lives, they absorb it in terms of their working lives, too.

Awareness-raising that feels like an experience, for example a table top exercise or a live demonstration of a hack, is memorable and fun – people go away from experiences telling their colleagues, friends and family about them, which has a positive ripple effect. Using emotion in a constructive way is really powerful, for example by telling stories. I say “constructive” because it is most important that awareness-raising is empowering, and this is something that is overlooked way too often.

Eliciting fear has been one of the most used marketing strategies in the cybersecurity industry since its inception. Can scaring employees actually make an organization more secure?

Using fear, uncertainty and doubt (FUD) is generally a classic example of awareness-raising that engages with emotion in a destructive way. When we deliver cyber security awareness, we are often talking about the threats, which inevitably will scare a lot of people, so we need to be really responsible in how we do that.

Unfortunately, people often use fear as a blunt instrument, without an understanding of the effect it has. For years, sociologists and psychologists have been studying fear, and what happens when we talk about something scary as a means of promoting behavioural change. My keynote at RSA Conference 2020 will cover some of this work and the lessons we can learn in cyber security.

What’s your take on the fact that many CISOs prefer to spend money on technology instead of on educating employees? Can they really solve their security problems with tech purchases?

It’s been encouraging to see, in recent years, that more and more CISOs and security teams understand that security can’t be solved with technology alone. I understand the tendency to want to “fix” security with a piece of shiny kit, because if that worked it would be simple and very comforting. Unfortunately, security is not simply about technology, it’s about how people engage with technology, and for this we need to focus on people at least as much as we focus on tech.

What are the biggest misconceptions about security culture and what can security leaders do in order to make sure their employees are more security conscious?

One of the biggest misconceptions about security culture is the belief that it can’t be measured and tracked in the way that other elements of security are. This is something I have been working on for my whole career in security: there are very effective ways to measure security culture and lots of metrics you can use to check progress. What’s more, it’s really important that leaders put these in place. When awareness-raising is not part of a strategy and there are no metrics to see if it is having the desired impact, it is usually not very effective. How can you know if something is working if you don’t have any way of measuring success?
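As a small illustration of what such metrics might look like (the indicators below are hypothetical examples of ours, not ones Dr. Barker names), a culture dashboard can be as simple as tracking a few ratios quarter over quarter: how often simulated phishing is reported rather than just ignored, training completion, and the share of incidents that people report themselves.

```python
from dataclasses import dataclass

@dataclass
class CultureSnapshot:
    """Counts an awareness programme might already collect each quarter."""
    phishing_sims_sent: int
    phishing_sims_reported: int   # suspicious mails reported, not just ignored
    staff_total: int
    training_completed: int
    incidents_total: int
    incidents_self_reported: int  # surfaced by the person involved, not audits

    def report_rate(self) -> float:
        return self.phishing_sims_reported / self.phishing_sims_sent

    def training_rate(self) -> float:
        return self.training_completed / self.staff_total

    def self_report_rate(self) -> float:
        # A rising share of self-reported incidents suggests people are
        # not afraid to speak up: the opposite of a culture of fear.
        return self.incidents_self_reported / self.incidents_total

# Two quarters of made-up numbers to show the trend view.
q1 = CultureSnapshot(1000, 180, 500, 410, 40, 12)
q2 = CultureSnapshot(1000, 260, 500, 455, 38, 19)

for name, q in [("Q1", q1), ("Q2", q2)]:
    print(f"{name}: report {q.report_rate():.0%}, "
          f"training {q.training_rate():.0%}, "
          f"self-report {q.self_report_rate():.0%}")
```

Rising report and self-report rates are exactly the "people feel safe to speak up" signal she describes; a drop after a policy change is an early warning worth investigating.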