WHY YOUR ORGANIZATION COULD FALL VICTIM TO UNINTENTIONAL INSIDER THREAT AND FIVE WAYS TO LOWER YOUR RISK
The overwhelming majority of insider threat events are not the result of a malicious employee’s actions; rather, they are caused by the unintentional insider: an employee falls for a spear phishing email, data spillage occurs, documents are destroyed improperly, a data storage device is lost or stolen, or someone becomes the victim of social engineering and elicitation.
Research shows that while well-known events like the Ashley Madison compromise, which involved an insider, get a lot of attention, organizations may be too focused on spectacular threat vectors. A 2013 CERT Software Engineering Institute (SEI) study on unintentional insider threat found that 17% of cases were unintentional hacks, while 49% were unintentional disclosures. The SEI highlights that employees should be cognizant of the unspectacular risks, which are far more common than the exaggerated spectacular ones. In other words, organizations may more often fall victim to 10,000 paper cuts than to a single atomic event.
While a lot of time and energy has gone into examining the root elements of the malicious insider, the unintentional vector has received less focus. Available research on the topic points to risk perception, biases, the influence of environment, and everyday stressors, though we shouldn’t discount simple ignorance. So how can organizations address the very real risk of unintentional insider threat? Start by getting inside the mind of the average employee as you roll out your strategies.
Looks Legit.
Insider threat and cybersecurity training during onboarding, or even annually, may not be enough; the constantly evolving threat landscape requires ongoing training. For example, phishing emails used to be fairly obvious, with spelling errors, an obviously incorrect sender email address, and so on. Now, spear phishers commonly spoof legitimate sender email addresses or send from legitimate user accounts they have compromised in earlier attacks.
There may be little that employees can do other than call the sender to verify unusual requests for information or action, but that highlights another challenge to security.
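To make the spoofing point concrete, here is a minimal sketch in Python of the kind of automated check that can flag a message whose visible “From” address looks legitimate but whose authentication results did not pass. It assumes the incoming mail is available as raw message text and that the receiving mail server stamps the standard Authentication-Results header; it is an illustration of the idea, not a substitute for a mail gateway or user vigilance.

```python
# Minimal sketch: flag messages whose Authentication-Results header does not
# show passing SPF/DKIM/DMARC results. Assumes raw RFC 822 message text and a
# receiving server that records the standard Authentication-Results header.
from email import message_from_string

def looks_spoofed(raw_message: str) -> bool:
    msg = message_from_string(raw_message)
    auth_results = msg.get("Authentication-Results", "").lower()
    # No recorded results at all is treated as suspect.
    if not auth_results:
        return True
    # Any explicit failure of the common mechanisms is treated as suspect.
    return any(
        f"{mech}=fail" in auth_results or f"{mech}=softfail" in auth_results
        for mech in ("spf", "dkim", "dmarc")
    )

if __name__ == "__main__":
    sample = (
        "Authentication-Results: mx.example.com; spf=fail; dkim=none; dmarc=fail\r\n"
        "From: ceo@yourcompany.com\r\n"
        "Subject: Urgent wire transfer\r\n\r\n"
        "Please process this today."
    )
    print(looks_spoofed(sample))  # True: the "From" line looks legitimate, but the checks failed
```

In practice a secure email gateway performs these checks, but the point stands: the address a reader sees in the “From” field is not, by itself, evidence that a message is genuine.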
Hoping for the Best!
Many of us don’t want to contact a superior about a seemingly simple request for fear of looking insubordinate or of challenging their authority. This deference to authority is exactly what attackers are preying on. Of course, this could be addressed by management or even incorporated into organizational policy, such as requiring voice confirmation for certain types of requests, but it must be part of a larger cultural shift to have any lasting effect.
Ignorance is Bliss.
People in general take a deferential attitude toward what happens on their workstations: if there is no error screen or warning, then things must be OK, right? Most people who are not intimately familiar with the mechanics of computing and the internet trust their machine, or an administrator, to tell them when something is wrong. Meanwhile, the guidance given to employees and the public usually amounts to little more than “run anti-virus software and don’t give your Social Security number to anyone who asks for it in an email.”
I Know What I’m Supposed to Do, But….
Consider the feet-on-the-ground workplace culture. In this case I am referring to culture as the system of beliefs, practices, orientations, acceptable norms, and demonized and praised attributes that emerges organically, not the practices as written. Individuals may be trained to respond in a certain way when risk presents itself, but they face a cost-benefit analysis in terms of cultural compliance. When there is no obstacle between the individual and cultural practices, or when individual and cultural norms are aligned, there is no pressure on the individual to act in a contrary manner (e.g., to not follow security protocol). When the expected behavior is out of line with, or contrary to, cultural norms, the individual must make a cost-benefit decision, the result of which could depend on any number of factors.
Consider the recent Department of Justice “hack” in which 20,000 FBI employee names were released. According to media reports, the hacker said he or she gained access by posing as a new employee and telling a help desk attendant that he or she lacked a token code (used for dual-factor authentication); the attendant provided one.
It seems unbelievable at first, but consider it from the attendant’s point of view. The caller seemed to know what they were talking about in terms of access. The caller might have been in a position of authority, able to put the attendant’s job at risk if the request wasn’t handled. And the attendant’s primary duty is to resolve issues, not to analyze them.
Going back to the point about authority: most of us have been trained to know when to deny a privilege escalation, but it is another thing entirely to get a request from a person who seems to be in charge and who might be able to affect our day-to-day stress level. So what do you do? What does a low-level system administrator in today’s economy do?
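A hedged sketch of what the “voice confirmation for certain types of requests” idea mentioned earlier might look like if applied to this help-desk scenario. The function name, data fields, and decision rules below are illustrative assumptions, not an actual provisioning or help-desk API; the point is that the verification step is forced by process rather than left to the attendant’s judgment under pressure.

```python
# Illustrative sketch only: hypothetical rules for issuing a replacement
# two-factor token. Field names and decision logic are assumptions made for
# this example, not a real provisioning system.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TokenRequest:
    claimed_name: str
    claimed_department: str
    callback_number_on_file: Optional[str]  # from HR records, never from the caller

def handle_token_request(request: TokenRequest, verified_by_callback: bool) -> str:
    # Rule 1: "new employee" claims with no directory record are escalated to HR,
    # never resolved on the spot.
    if request.callback_number_on_file is None:
        return "ESCALATE: no directory record on file"
    # Rule 2: the attendant must call the number on file and confirm the request
    # by voice before any credential is issued.
    if not verified_by_callback:
        return "HOLD: confirm by calling the number on file"
    return "ISSUE: log the request and generate the token through normal provisioning"

# A caller who claims to be new and has no directory record gets escalated, not serviced.
print(handle_token_request(TokenRequest("J. Doe", "Cyber", None), verified_by_callback=False))
```

The design choice worth noting is that such a rule removes the cost-benefit calculation from the attendant: refusing an authoritative-sounding caller is no longer insubordination, it is simply the procedure.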
But Mom Said it’s OK.
A dysfunctional work culture, or a work culture incongruent with documented policies and procedures, tends to be the result of some incentivized behavior, whether through perceived or actual punishment or reward. This isn’t far removed from mixed messaging in parenting psychology: inconsistent rule application and unbalanced, sometimes opposing, responses to behavior leave the child confused and less able to function accordingly.
Why Should I Care?
As I pointed out in my blog post “Why Insider Threat Detection Fails”, humans are poor performers when it comes to detecting violations of anything other than social contract or personal safety rules. While we function in a hyper-connected world of relationships, the human mind still operates as if in a hunter-gatherer world, designed to monitor maybe 50 relationships. Simply put, the human mind is chiefly concerned with its own survival and, by extension, that of its progeny; abstract concepts like a threat to the corporation through its supply chain are not natural to the human mind and do not register as an immediate threat to the self.
As such, if training and communications about cybersecurity are presented only as a series of “if-then” rules without tying them to the individual’s health and well-being, they will fall on deaf ears. That message, that the health of the company is the health of the individual, needs to be articulated, repeated, demonstrated, and made believable. Rote memorization of “if-then” rules will yield some measure of protection, but it does nothing to build a culture or to take real residence in the mind of the employee.
Your employees are your first responders, your first line of defense, and your most critical asset. Any number of factors might cause one of them to become the next unintentional insider threat, but nothing is worse than apathy.
5 Ways to Combat Insider Risk
- Climate surveys conducted by a third-party industrial psychologist can clarify what the culture really is.
- Message the workforce: if in doubt, question. Build a culture that rewards good security posture and questions suspect vectors.
- Tie organizational risk to real-life employee risk in training. Don’t just say it’s bad for the company to lose money to IP theft via insider threat; tie it to the employee’s bottom line.
- Be consistent – what’s on paper needs to match what managers exude.
- Encourage questions. It might save you a lot of money. Employees who think they might be facing a security issue, insider or cyber, should feel that reporting and questioning is a duty rather than a burden. Make this a value and you could very well save a lot of pain in the end.
Gabriel Whalen is a behavioral consultant and social engineering expert at Secure Halo.