Cyber security blighted by bias

24 June 2019


More dangerous than a Russian hacker

A study of cybersecurity professionals suggests that their own confirmation bias is more likely to sink their computer networks than a Russian hacker.

Forcepoint's report examined how cybersecurity decision-making is affected by six common biases.

One example was that older generations were typically characterized by information security professionals as "riskier users based on their supposed lack of familiarity with new technologies".

But this flies in the face of studies that have found the opposite: younger people are far more likely to engage in risky behaviour, such as sharing their passwords for streaming services.

The presumption that older workers pose more of a risk than younger workers is an example of so-called "aggregate bias" in which subjects make inferences about an individual based on a population trend.

The report said that biases like this misinform security professionals by directing their focus to individual users based on supposed group membership. As a result, analysts single out the wrong individuals as sources of security issues.

Availability bias may skew cybersecurity analysts' decision-making towards hot topics in the news, crowding out information they know but encounter less frequently and leading them to make less well-rounded decisions.

People encounter "confirmation bias" most frequently during research: analysts neglect the bigger picture, make assumptions, and then tailor their research specifically to confirm those assumptions.

When looking for issues, analysts often seek confirmation of what they already believe to be the cause rather than searching for all possible causes.

The fundamental attribution error also plays a significant role in misleading security analysts, Forcepoint found. It manifests when information security analysts or software developers blame users for being inept instead of considering that the technology may be faulty or that internal factors contributed to a security lapse.

The report also cites what it calls the framing effect: "Security problems are often aggressively worded, and use negative framing strategies to emphasize the potential for loss."
