Insider threat management is not limited to protecting government secrets against espionage from foreign nations. Businesses of all sizes need to keep a lookout for insider threat indicators to protect sensitive data against unauthorized disclosure.
In this article, you will learn to identify the top indicators of an insider threat. By paying close attention to these early warning signs, you can develop an insider threat management program that proactively identifies these threats before they can cause serious damage to your organization.
Insider threat management is the practice of combining tools, policies, and processes to detect, mitigate, and respond to security incidents caused by an organization’s insiders.
The definition of an insider is not strictly limited to employees. An insider is anyone that has access to the organization’s internal systems.
This includes, but is not limited to:
The term insider threat describes a scenario where a trusted insider becomes a security threat to the organization.
These insiders do not even need to be the ones acting maliciously. They could very well become compromised by a malicious third-party that then uses the trusted insider’s level of access to move laterally through the network.
At the mere mention of insider threat, it’s a safe bet that the majority of people will think of malicious insiders first. According to the 2021 Data Exposure Report by Code42, 42% of data breaches were caused by malicious insiders.
While malicious insiders are not the cause of the majority of insider security incidents, they are serious enough that they need to be accounted for in your insider threat management program. According to the 2019 Cost of a Data Breach Report by IBM Security, the median cost of a malicious cyber insider threat incident was $4.45 million in 2018.
Malicious insiders start as trusted individuals that are given access to sensitive information or systems as a part of their role in the organization.
While their use of these resources may start in good faith, somewhere along the way a trusted insider risks becoming a malicious insider threat, abusing their level of access for personal or financial gain.
Examples of malicious insider threats
Negligent insider threats can be further categorized into two subtypes: accidental and non-malicious.
Accidental insiders unknowingly cause damage through genuine mistakes, whereas non-malicious insiders intentionally break company policies and procedures without malicious intent.
Examples of negligent insider threats
Why is an insider threat more dangerous than an external threat? Simply put, insiders are trusted to operate within the organization’s secure perimeter. Unlike an external attacker, who must bypass firewalls and other security measures to gain access, an insider starts with that access already granted.
While there are security frameworks and tools that can restrict the damage that insiders can do, the simple fact that they already have intimate knowledge of and access to the organization’s systems means that they can more readily cause damage to the organization than an external attacker.
This combination of knowledge and access makes insider threats particularly dangerous.
Aside from extreme cases of corporate espionage, the vast majority of insider threats start as trusted individuals before an insider threat incident occurs.
So, what causes someone to become an insider threat?
A lack of training is a significant cause of negligent and accidental insider threats. These insiders may simply be unaware of the dangers of their actions, or they may not understand what alternatives are available to them.
As mentioned above, employees without adequate security awareness training are more likely to commit dangerous acts. This is especially true if the organization’s security measures are overly restrictive.
If an employee does not understand the value of these security processes, they will simply see them as a barrier to their productivity, tempting them to non-maliciously break company security policies.
Alongside end-user training, organizations need to ensure that they provide their insiders with options that are both secure and convenient to use.
Otherwise trustworthy employees can be tempted to engage in sabotage, espionage, and other malicious activities when they are deeply unsatisfied with their employer or their career.
Examples include
The 2020 Verizon Data Breach Investigations Report found that 86% of all data breaches are financially motivated.
If an employee is experiencing financial stressors in their life, they are more likely to accept bribes from malicious third parties, consider selling sensitive data to threat actors, or steal intellectual property to gain favor with a competing company.
Financially Motivated Stressors
This example comes from the Insider Threat Mitigation Guide by The Cybersecurity and Infrastructure Security Agency (CISA).
The insider in this case was an engineer at an aerospace manufacturing company. He worked on commercial and military satellites that were sold to the Air Force, Navy, and the National Aeronautics and Space Administration.
By the nature of his role, he had access to closely held trade secrets, including anti-jamming technology and encryption plans for communication with satellites. Naturally, this data was of high value to competing companies and foreign nations alike.
The CISA report notes that there were indications of a potential insider threat risk coming from this employee.
These stressors motivated the engineer to steal detailed mechanical drawings and design information for a satellite program. He intended to take the data he stole and sell it to Russia for a significant sum of money.
His data theft attempt did not go unnoticed; User Activity Monitoring (UAM) software revealed that he had inserted a USB device and copied five folders with the sensitive data he intended to sell.
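A detection rule of the kind described above can be sketched in a few lines. This is an illustrative sketch only: the `FileEvent` record, the `"usb"` destination label, and the folder-count threshold are hypothetical assumptions, not the format or logic of any real UAM product.

```python
# Hypothetical sketch: flag bulk file copies to removable media in a UAM-style
# event log. Event schema and threshold are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class FileEvent:
    user: str
    destination: str   # e.g. "usb", "network", "local" -- assumed labels
    folder: str

def flag_bulk_usb_copies(events, folder_threshold=3):
    """Return users who copied more than `folder_threshold` distinct
    folders to removable media -- a crude high-risk transfer rule."""
    folders_by_user = {}
    for e in events:
        if e.destination == "usb":
            folders_by_user.setdefault(e.user, set()).add(e.folder)
    return {user for user, folders in folders_by_user.items()
            if len(folders) > folder_threshold}

# One user copies five folders to USB (as in the case study); another copies one.
events = [FileEvent("engineer1", "usb", f"program/{i}") for i in range(5)]
events.append(FileEvent("analyst2", "usb", "reports/q3"))
print(flag_bulk_usb_copies(events))  # {'engineer1'}
```

Real deployments layer many such rules and route matches to a human review queue rather than acting on a single trigger.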
Once alerted of the high-risk file transfer, his company’s insider threat team informed law enforcement. An FBI agent was then assigned to go undercover as a Russian intelligence officer to solicit an exchange for the data.
Ultimately, the engineer was sentenced to five years in prison for the attempted illegal sale of proprietary trade secrets to a foreign government’s intelligence service.
His actions violated the Arms Export Control Act (AECA) and International Traffic in Arms Regulations (ITAR). The actions of this one employee posed a threat to national security and risked significant financial harm to his company.
What would you do if you were offered 1 million dollars to install malware on your employer’s network?
Moral quandaries aside, that’s a life-changing amount of money, particularly if you’re struggling financially.
For one Tesla employee, this wasn’t a hypothetical situation.
This case study comes from Teslarati concerning a document from the U.S. Department of Justice.
Here’s the gist of the story
Fortunately for Tesla, their employee thwarted a potentially devastating cyberattack by covertly working with the FBI to take down the third-party threat.
While the employee’s motives for turning down the bribe aren’t entirely clear, it goes without saying that this case could very well have gone in a much worse direction for Tesla.
Whether it be the employee’s alignment to the company mission, fears of getting caught, moral integrity, national security concerns, not trusting that he’d receive his $1M compensation, or other motives, Tesla is incredibly fortunate that their trusted insider acted in the best interest of the company.
Malicious activities from insiders are rarely spontaneous events. With careful monitoring for insider sentiments, high-risk activities, and anomalous lifestyle changes you may very well be able to detect an emerging insider threat before they take action.
It’s important to reiterate that these risks are not unique to an employer/employee relationship. While the language I’ll be using is framed in the context of a working relationship, these very same signs apply to other contexts as well.
In their report “Insider Threats 101”, CISA identifies a 6-step pathway that malicious insider threats follow.
In addition to this pathway, there are potential signs of an emerging insider threat to watch out for. While these signs and triggers may not be definitive proof of an insider threat they will help inform you of warning signs to be aware of.
The majority of this information was sourced from various insider threat management reports provided by CISA. I will provide links to the full reports at the end of this article if you would like to dive deeper into the subject.
Note: While anomalous changes are worth paying attention to, your insider threat management program must avoid being accusatory or otherwise hostile towards employees.
The intent of such a program is to monitor the workforce for potential indicators of compromise. These indicators emphasize an anomalous deviation from an employee’s normal behavior. Exhibiting one or more signs is not a definitive indicator of a legitimate threat; it simply means that closer attention must be paid.
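The core idea of "anomalous deviation from normal behavior" can be illustrated with a simple per-employee baseline. This is a minimal sketch under stated assumptions: the metric (daily files accessed) and the three-standard-deviation rule are hypothetical choices for illustration, not a prescribed detection method.

```python
# Illustrative baseline check: flag a daily metric that deviates far from an
# employee's own history. The 3-sigma cutoff is an assumption, not a standard.
from statistics import mean, stdev

def is_anomalous(history, today, sigmas=3.0):
    """True if today's value is more than `sigmas` standard deviations
    away from the employee's historical mean."""
    if len(history) < 2:
        return False  # not enough data to form a baseline
    mu, sd = mean(history), stdev(history)
    if sd == 0:
        return today != mu
    return abs(today - mu) > sigmas * sd

baseline = [40, 42, 38, 41, 39, 43, 40]   # typical daily file accesses
print(is_anomalous(baseline, 41))   # False -- within the normal range
print(is_anomalous(baseline, 400))  # True  -- a 10x spike warrants a closer look
```

Note that a `True` result here means exactly what the text above says: closer attention, not an accusation.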
Whether real or perceived, the very first step of a trusted individual becoming an insider threat is typically some form of grievance against their organization.
Grievances can come in many forms
Note: As with all of these potential warning signs, an employee having a grievance is not a definitive sign that they will act maliciously. Ultimately the goal should be to create an environment that mitigates and addresses grievances in a healthy way, not one that treats employees with grievances as potential threats.
Individuals with certain behavioral traits are more likely to become an insider threat. One of the most obvious is past history; a survey by Code42 found that 63% of employees who admit to taking data with them to a new job are repeat offenders.
In CISA’s report “Combating the Insider Threat”, they note more general behavioral traits that may make someone more vulnerable to becoming an insider threat risk.
These traits are
A thorough vetting process is a critical security measure against hiring and promoting individuals with these high-risk traits. While not all of these traits are definitive warning signs (we all have our flaws, after all; it doesn’t mean we’re morally corrupt), an abundance of these traits within the workforce can lead to a greater risk of insider threats.
The CISA report further notes that signs of vulnerability, such as drug or alcohol abuse, financial difficulties, gambling, illegal activities, poor mental health,* or hostile behavior, could put insiders at risk of becoming insider threats.
This is particularly true in the case of collusion. During the early reconnaissance stage of a more advanced attack, a malicious third party will be on the lookout for signs of vulnerability that they can exploit.
Malicious third parties will use social engineering to manipulate vulnerable insiders into providing them with unauthorized access, non-public information, and other valuable resources.
Vulnerable insiders may also be a risk in and of themselves if their vulnerabilities coincide with the other risks of an insider threat. For example, an untreated gambling addiction could lead to significant financial hardship. This combination of stressors may motivate an insider to accept a bribe or steal sensitive information for financial gain.
*Note: Employees with mental health concerns are not inherently dangerous. One in five adults in the United States, roughly 46.6 million people, experience mental illness in a given year, according to the National Institute of Mental Health. While behavioral monitoring has its place for insider threat detection, it’s critical that this monitoring does not cross the line into discrimination based on mental health status.
Anomalous changes to an employee’s lifestyle may be a warning sign that an insider is progressing further along the 6-step pathway. Unexpected lifestyle changes are of particular concern if there is no notable reason for the change.
For example, an unexplainable upgrade to an employee’s standard of living could be an indication that the employee is using their insider connections to sell sensitive data.
Such changes include
Every employee has their own unique work style. Differences in working styles are completely normal; however, an anomalous change in work style could be a sign of an emerging insider threat risk, particularly if no notable factors are influencing the change.
When a trusted member of an organization enters the later stages of the 6-step pathway, they’ll do all they can to sequester themselves to avoid detection. As such, anomalous changes to workflows may be an indication that an insider is attempting to hide their malicious intent.
Such changes include
As with all of the signs mentioned thus far, no single indicator (or group of indicators) is a definitive sign of an insider threat. From an insider threat management perspective, these signs are simply indications that closer attention should be paid.
For example, an employee showing keen interest in matters outside of the scope of their duties may very well be vying for a promotion or a career change. If their behavior risks leading colleagues to disclose privileged information, corrective action can be taken without accusing the employee of malicious intent.
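One common way to operationalize "no single indicator is proof" is to combine weak signals into a score that only triggers closer human review. The following sketch is hypothetical throughout: the indicator names, weights, and threshold are invented for illustration and would need to be defined by your own program.

```python
# Minimal sketch: combine weak indicators into a review-priority score.
# Indicator names, weights, and threshold are hypothetical assumptions.
INDICATOR_WEIGHTS = {
    "expressed_grievance": 1,
    "off_hours_access_spike": 2,
    "bulk_transfer_to_removable_media": 3,
    "interest_outside_role": 1,
}

def review_priority(observed, review_threshold=4):
    """Sum the weights of observed indicators. The boolean recommends
    closer (human, non-accusatory) review -- never disciplinary action."""
    score = sum(INDICATOR_WEIGHTS.get(i, 0) for i in observed)
    return score, score >= review_threshold

print(review_priority({"interest_outside_role"}))            # (1, False)
print(review_priority({"off_hours_access_spike",
                       "bulk_transfer_to_removable_media"}))  # (5, True)
```

The design choice here mirrors the text: a single indicator (score 1) never crosses the threshold on its own, while a cluster of anomalies does.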
Now that you have an understanding of the top insider threat indicators you can take steps to prevent insider threats in your organization.
This collection of articles details the steps you can take to protect your organization and sensitive data against malicious, accidental, and negligent insider threats.
These insider threat management articles focus on the best practices for securing sensitive data against theft from employees.
Read More:
User activity monitoring is a critical tool for detecting potential insider threats. To get the most out of these solutions you need to be aware of the privacy and compliance risks of monitoring employees in the workplace.
These articles provide critical tips for monitoring employees in a way that is legal and respectful of workplace privacy concerns.
These articles provide insights into the critical security controls required to ensure that sensitive data is kept safe from unauthorized disclosure.
Insider threat management is not strictly technical. Roles that focus on the “human side,” such as privacy, insider threat training, and ethics, are equally important for preventing data breaches.
These articles highlight the role that Human Resources plays in managing security and employee satisfaction.
An insider threat management program is critical for protecting your organization against malicious and negligent insider threats. These in-depth resources provide critical information for developing such a program for your organization.
The Cybersecurity and Infrastructure Security Agency (CISA) has an abundance of resources regarding insider threat mitigation including an insider threat mitigation guide, warning signs of insider threats and what you can do about them, and details regarding the National Insider Threat Task Force’s maturity framework.
In terms of insider threat training, the Center for Development of Security Excellence (CDSE) has internet-based, self-paced training courses that are intended for use by the Department of Defense and other U.S. Government personnel and contractors within the National Industrial Security Program.
While not all of the courses are publicly available, their website does include reference material and course modules that will be of benefit to organizations that want to implement an insider threat training program.