Insider threat management is not limited to protecting government secrets against espionage from foreign nations. Businesses of all sizes need to keep a lookout for insider threat indicators to protect sensitive data against unauthorized disclosure.
In this article, you will learn to identify the top indicators of an insider threat. By paying close attention to these early warning signs, you can develop an insider threat management program that proactively identifies these threats before they can cause serious damage to your organization.
Insider threat management is the practice of combining tools, policies, and processes to detect, mitigate, and respond to security incidents caused by an organization’s insiders.
The definition of an insider is not strictly limited to employees. An insider is anyone who has access to the organization’s internal systems.
This includes, but is not limited to:
The term insider threat describes a scenario where a trusted insider becomes a security threat to the organization.
These insiders do not even need to be the ones acting maliciously. They could very well become compromised by a malicious third-party that then uses the trusted insider’s level of access to move laterally through the network.
At the mere mention of insider threat, it’s a safe bet that the majority of people will think of malicious insiders first. According to the 2021 Data Exposure Report by Code42, 42% of data breaches were caused by malicious insiders.
While malicious insiders are not the cause of the majority of insider security incidents, they are serious enough that they need to be accounted for in your insider threat management program. According to the 2019 Cost of a Data Breach Report by IBM Security, the median cost of a malicious cyber insider threat incident was $4.45 million in 2018.
Malicious insiders start as trusted individuals that are given access to sensitive information or systems as a part of their role in the organization.
While their use of these resources may start in good faith, somewhere along the way a trusted insider risks becoming a malicious insider threat, abusing their level of access for personal or financial gain.
Examples of malicious insider threats
Negligent insider threats can be further categorized into two subtypes: accidental and non-malicious.
Accidental insiders unknowingly cause damage through genuine mistakes, whereas non-malicious insiders intentionally break company policies and procedures without malicious intent.
Examples of negligent insider threats
Why is an insider threat more dangerous than an external threat? Simply put, insiders are already trusted to work within the organization’s secure perimeter. They face far fewer obstacles than an external attacker, who would need to bypass firewalls and other security measures just to gain access.
While there are security frameworks and tools that can restrict the damage that insiders can do, the simple fact that they already have intimate knowledge of and access to the organization’s systems means that they can more readily cause damage to the organization than an external attacker.
This combination of knowledge and access makes insider threats particularly dangerous.
Aside from extreme cases of deliberate corporate espionage, the vast majority of insider threats begin as trusted individuals long before an incident occurs.
So, what causes someone to become an insider threat?
A lack of training is a significant cause of negligent and accidental insider threats. Untrained insiders may simply be unaware of the dangers of their actions, or they may not understand what alternatives are available to them.
As mentioned above, employees without adequate security awareness training are more likely to commit dangerous acts. This is especially true if the organization’s security measures are overly restrictive.
If an employee does not understand the value of these security processes, they will simply see them as a barrier to their productivity, tempting them to non-maliciously break company security policies.
Alongside end-user training, organizations need to ensure that they provide their insiders with options that are both secure and convenient to use.
Otherwise trustworthy employees can be tempted to engage in sabotage, espionage, and other malicious activities when they are deeply unsatisfied with their employer or their career.
The 2020 Verizon Data Breach Investigations Report found that 86% of all data breaches are financially motivated.
If an employee is experiencing financial stressors in their life they are more likely to accept bribes from malicious third parties, consider selling sensitive data to threat actors, or steal intellectual property to gain favor with a competing company.
This example comes from the Insider Threat Mitigation Guide by the Cybersecurity and Infrastructure Security Agency (CISA).
This insider threat was an engineer at an aerospace manufacturing company. He worked on commercial and military satellites that were sold to the Air Force, Navy, and the National Aeronautics and Space Administration.
By the nature of his role, he had access to closely held trade secrets, including anti-jamming technology and encryption plans for communication with satellites. Naturally, this data was of high value to competing companies and foreign nations alike.
The CISA report notes that there were indications of potential insider threat risk from this employee.
These stressors motivated the engineer to steal detailed mechanical drawings and design information for a satellite program. He intended to take the data he stole and sell it to Russia for a significant sum of money.
Once alerted to the high-risk file transfer, his company’s insider threat team informed law enforcement. An FBI agent was then assigned to go undercover as a Russian intelligence officer to solicit an exchange for the data.
Ultimately, the engineer was sentenced to five years in prison for the attempted illegal sale of proprietary trade secrets to a foreign government’s intelligence service.
His actions violated the Arms Export Control Act (AECA) and International Traffic in Arms Regulations (ITAR). The actions of this one employee posed a threat to national security and risked significant financial harm to his company.
What would you do if you were offered 1 million dollars to install malware on your employer’s network?
Moral quandaries aside, that’s a life-changing amount of money, particularly if you’re struggling financially.
For one Tesla employee, this wasn’t a hypothetical situation.
Here’s the gist of the story:
Fortunately for Tesla, their employee thwarted a potentially devastating cyberattack by covertly working with the FBI to take down the third-party threat.
While the employee’s motives for turning down the bribe aren’t entirely clear, this case could very well have gone in a much worse direction for Tesla.
Whether it be the employee’s alignment to the company mission, fears of getting caught, moral integrity, national security concerns, not trusting that he’d receive his $1M compensation, or other motives, Tesla is incredibly fortunate that their trusted insider acted in the best interest of the company.
Malicious activities from insiders are rarely spontaneous events. With careful monitoring for insider sentiments, high-risk activities, and anomalous lifestyle changes you may very well be able to detect an emerging insider threat before they take action.
It’s important to reiterate that these risks are not unique to an employer/employee relationship. While I’ll be using the language of a working relationship, these very same signs apply in other contexts as well.
In addition to this pathway, there are potential signs of an emerging insider threat to watch out for. While these signs and triggers are not definitive proof of an insider threat, they highlight the warning signs you should be aware of.
The majority of this information was sourced from various insider threat management reports provided by CISA. I will provide links to the full reports at the end of this article if you would like to dive deeper into the subject.
Note: While anomalous changes are worth paying attention to, your insider threat management program must avoid being accusatory or otherwise hostile towards employees.
Such a program intends to monitor the workforce for potential indicators of compromise. These indicators emphasize an anomalous deviation from an employee’s normal behavior. Exhibiting one or more of these signs is not a definitive indicator of a legitimate threat; it simply means that closer attention should be paid.
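The idea of flagging anomalous deviation from an employee’s own baseline can be sketched programmatically. The example below is a minimal illustration, not a production detector: the metric names, sample values, and the 3-sigma threshold are all assumptions chosen for demonstration.

```python
from statistics import mean, stdev

def flag_anomalies(baseline, today, threshold=3.0):
    """Flag metrics that deviate sharply from an employee's own baseline.

    `baseline` maps a metric name (e.g. daily file-transfer count) to a
    list of historical observations; `today` maps the same metric names
    to the latest observation. A metric is flagged when today's value
    sits more than `threshold` standard deviations above its mean.
    """
    flagged = []
    for metric, history in baseline.items():
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (today[metric] - mu) / sigma > threshold:
            flagged.append(metric)
    return flagged

# Hypothetical 10-day baseline for one employee
baseline = {
    "files_transferred": [12, 9, 14, 11, 10, 13, 12, 9, 11, 10],
    "after_hours_logins": [0, 1, 0, 0, 1, 0, 0, 0, 1, 0],
}
today = {"files_transferred": 480, "after_hours_logins": 1}

print(flag_anomalies(baseline, today))  # → ['files_transferred']
```

Note that the comparison is against the individual’s own history, not a company-wide average; that is what keeps this approach focused on *deviation* rather than on punishing employees whose normal workload simply involves lots of file activity.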
Whether real or perceived, the very first step of a trusted individual becoming an insider threat is typically some form of grievance against their organization.
Grievances can come in many forms:
Note: As with all of these potential warning signs, an employee having a grievance is not a definitive sign that they will act maliciously. Ultimately the goal should be to create an environment that mitigates and addresses grievances in a healthy way, not one that treats employees with grievances as potential threats.
Individuals with certain behavioral traits are more likely to become an insider threat. One of the most obvious is past history; a survey by Code42 found that 63% of employees who admit to taking data with them to a new job are repeat offenders.
In its report “Combating the Insider Threat,” CISA notes more general behavioral traits that may make someone more vulnerable to becoming an insider threat risk.
These traits are:
A thorough vetting process is a critical security measure against hiring and promoting individuals with these high-risk traits. While not all of these traits are definitive warning signs (we all have our flaws, after all; it doesn’t mean we’re morally corrupt), an abundance of these traits within the workforce can lead to a greater risk of insider threats.
The CISA report further notes that signs of vulnerability, such as drug or alcohol abuse, financial difficulties, gambling, illegal activities, poor mental health,* or hostile behavior, could put insiders at risk of becoming insider threats.
This is particularly true in the case of collusion. During the early reconnaissance stage of a more advanced attack, a malicious third party will be on the lookout for signs of vulnerability that they can exploit.
Malicious third parties will use social engineering to manipulate vulnerable insiders into providing them with unauthorized access, non-public information, and other valuable resources.
Vulnerable insiders may also be a risk in and of themselves if their vulnerabilities coincide with the other risks of an insider threat. For example, an untreated gambling addiction could lead to significant financial hardship. This combination of stressors may motivate an insider to accept a bribe or steal sensitive information for financial gain.
*Note: Employees with mental health concerns are not inherently dangerous. One in five adults in the United States, roughly 46.6 million people, experience mental illness in a given year, according to the National Institute of Mental Health. While behavioral monitoring has its place for insider threat detection, it’s critical that this monitoring does not cross the line into discrimination based on mental health status.
Anomalous changes to an employee’s lifestyle may be a warning sign that an insider is progressing further along the 6-step pathway. Unexpected lifestyle changes are of particular concern if there is no notable reason for the change.
For example, an unexplainable upgrade to an employee’s standard of living could be an indication that the employee is using their insider connections to sell sensitive data.
Such changes include:
Every employee has their own unique work style, and differences in working styles are completely normal. However, an anomalous change in work style could be a sign of an emerging insider threat risk, particularly if no notable factors are influencing the change.
When a trusted member of an organization enters the later stages of the 6-step pathway, they’ll do all they can to conceal their activities and avoid detection. As such, anomalous changes to workflows may be an indication that an insider is attempting to hide their malicious intent.
Such changes include:
As with all of the signs mentioned thus far, no single indicator (or group of indicators) is a definitive sign of an insider threat. From an insider threat management perspective, these signs are simply indications that closer attention should be paid.
For example, an employee showing keen interest in matters outside the scope of their duties may very well be vying for a promotion or a career change. If their behavior risks leading colleagues to disclose privileged information, corrective action can be taken without accusing the employee of malicious intent.
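One way such work-style anomalies can be surfaced programmatically is by comparing access events against the hours an employee typically works. The sketch below is illustrative only; the timestamps, padding, and the assumption that off-hours access is the anomaly of interest are all hypothetical choices.

```python
from datetime import datetime

def typical_hours(login_times, pad=1):
    """Derive an employee's usual working window from past login timestamps,
    padded by `pad` hours on each side to allow for normal variation."""
    hours = [t.hour for t in login_times]
    return min(hours) - pad, max(hours) + pad

def is_off_hours(event_time, window):
    """Return True when an event falls outside the employee's usual window."""
    start, end = window
    return not (start <= event_time.hour <= end)

# Hypothetical history: an employee who normally logs in between 08:00 and 17:00
history = [datetime(2021, 6, d, h) for d in range(1, 11) for h in (8, 13, 17)]
window = typical_hours(history)  # (7, 18) with one hour of padding

print(is_off_hours(datetime(2021, 6, 12, 2), window))  # → True: 2 a.m. access
print(is_off_hours(datetime(2021, 6, 12, 9), window))  # → False
```

As the surrounding text stresses, a single off-hours event proves nothing; a flag like this should only prompt closer attention, not accusation.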
Now that you have an understanding of the top insider threat indicators you can take steps to prevent insider threats in your organization.
This collection of articles details the steps you can take to protect your organization and sensitive data against malicious, accidental, and negligent insider threats.
These insider threat management articles focus on the best practices for securing sensitive data against theft from employees.
User activity monitoring is a critical tool for detecting potential insider threats. To get the most out of these solutions you need to be aware of the privacy and compliance risks of monitoring employees in the workplace.
These articles provide critical tips for monitoring employees in a way that is legal and respectful of workplace privacy concerns.
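One privacy-respecting technique sometimes applied to monitoring data is pseudonymizing user identifiers so that analysts can correlate a given employee’s events without seeing their identity. This is a minimal sketch under assumed requirements; the key value, field names, and pseudonym length are all hypothetical.

```python
import hashlib
import hmac

# Secret key held by a privacy officer, not by monitoring analysts
# (hypothetical value for illustration; rotate and store it securely).
PEPPER = b"rotate-me-quarterly"

def pseudonymize(user_id: str) -> str:
    """Replace a user ID with a stable pseudonym via a keyed hash (HMAC-SHA256).

    Analysts can still link events from the same employee, but cannot
    recover the identity without the key, which supports data-minimization
    goals when monitoring the workforce.
    """
    return hmac.new(PEPPER, user_id.encode(), hashlib.sha256).hexdigest()[:12]

event = {"user": pseudonymize("jane.doe"), "action": "usb_file_copy", "bytes": 1_048_576}
print(event)
```

Because the hash is keyed, the same employee always maps to the same pseudonym while outsiders cannot brute-force identities from a public list of usernames; re-identification happens only through a controlled process involving the key holder.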
These articles provide insights into the critical security controls required to ensure that sensitive data is kept safe from unauthorized disclosure.
Insider threat management is not strictly technical. Roles that focus on the “human side,” such as privacy, insider threat training, and ethics, are equally important for preventing data breaches.
These articles highlight the role that Human Resources plays in managing security and employee satisfaction.
An insider threat management program is critical for protecting your organization against malicious and negligent insider threats. These in-depth resources provide critical information for developing such a program for your organization.
The Cybersecurity and Infrastructure Security Agency (CISA) has an abundance of resources regarding insider threat mitigation including an insider threat mitigation guide, warning signs of insider threats and what you can do about them, and details regarding the National Insider Threat Task Force’s maturity framework.
In terms of insider threat training, the Center for Development of Security Excellence (CDSE) has internet-based, self-paced training courses that are intended for use by the Department of Defense and other U.S. Government personnel and contractors within the National Industrial Security Program.
While not all of the courses are publicly available, their website does include reference material and course modules that will benefit organizations that want to implement an insider threat training program.