
Erick Miyares | Ep 18

Watch or listen:
The Psychological Impact of Technology Use in Cyber Warfare.

In this episode, we explore the psychological impact of technology use in cyber warfare within armed forces intelligence units. Erick Miyares reveals the psychological toll of cyber warfare operations, detailing operator stress, cognitive overload, and the future of Operational CyberPsychology.

Connect with the guest
Erick Miyares

Erick is a veteran of the US Marines and Army Special Operations Intelligence. Now retired from the forces, he has pivoted into researching the impact of cyber warfare on the psychology of ‘Interactive Operators’.

In the emerging field of Operational CyberPsychology, Erick is pioneering research into the psychological impact on those who bear the cognitive and emotional strain of cyber warfare.

Although retired, he continues to serve his country by psychologically supporting those who are still engaged in international operations.

Connect on Social Media:

In this episode, Erick talks about his passion for cyberpsychology, which emerged from very personal experiences. After losing friends to suicide and reflecting on his own cognitive struggles upon retirement, he realised there were deeper mental health and cognitive concerns linked to continuous high-stakes technology use in military settings.

The Rise of Operational CyberPsychology

Erick highlights a growing need for “operational cyberpsychology” in the military. This perspective takes established knowledge about physically demanding roles – such as pilots and infantry troops – and applies similar research methods to those who engage in digital or ‘cyberspace’ operations. Erick argues that we understand the physiology of fighter pilots and submariners but lack insights into the cognitive and psychological burdens on cyber operators who wage war with keyboards and code.

Cyberspace Operator Syndrome

Drawing inspiration from the concept of “Operator Syndrome” (originally used to describe the constellation of physical and mental health issues experienced by special forces), Erick proposes a related idea: cyberspace operator syndrome. This syndrome accounts for the chronic stress, burnout, rumination, and moral injury that can afflict those working to infiltrate or defend networks. Traditional thinking may see cyber operators as “desk-bound,” but Erick’s research shows they shoulder enormous responsibility, from pressing the ‘Enter’ key to launch a virtual offensive to preventing adversaries from breaching critical systems.

The Extended Mind

Erick and other researchers draw on the idea that using computers can effectively transport an operator’s consciousness into a different domain. When a cyber operator is deeply engaged with adversary systems, their sense of self can become “disembodied,” merging with the online space in real time. This immersion, Erick explains, amplifies stress and heightens the psychological stakes – especially when errors could jeopardise mission success or even cost lives.

Future Directions & Broader Implications

Erick hopes his research will translate into concrete changes within defence and intelligence communities. He envisions new guidelines on assessing, selecting, and training cyber operators, alongside more holistic support for mental health and resilience. His work also complements that of other researchers, such as Todd Fletcher (focusing on cyber security professionals) and Dr Marshall S. Rich (examining the psychology of cybercriminals).

By exploring varied angles—defence, offence, and criminal behaviour—these experts collectively map the complex human dimensions behind our interconnected digital world.

Ultimately, Erick’s story and studies underscore the fact that, whether it’s a drone pilot, a special operations analyst, or a defensive cyber team, modern warfare is as much a psychological battleground as a physical or virtual one. Through ongoing research and collaboration, Erick hopes to ensure the well-being of those who protect our networks – and, by extension, the rest of us.

You can read more about Erick’s journey into Operational CyberPsychology in his LinkedIn blog from August 2024: https://www.linkedin.com/feed/update/urn:li:activity:7233880137191075841/.

Other podcasts

Dr Marshall S. Rich | Ep 12

Watch or listen:
Digital Mental Health

In this episode, we explore the world of Forensic CyberPsychology.

This involves understanding cybercriminal behaviour, and the cognitive biases and decision-making vulnerabilities of cyber attackers.

Dr. Marshall Rich merges cybersecurity, psychology, and forensics, revealing how attacker behaviour shapes modern cyber threats and defences.

Connect with the guest
Dr Marshall S. Rich

Marshall is a Forensic CyberPsychologist with over 30 years of experience in cybersecurity, incident response, and risk management.

He has served in both military and civilian sectors, blending his expertise in technical defence systems with a deep understanding of human behaviour.

Currently a Senior Expert in Cybersecurity at the United States Institute of Peace, he works at the intersection of peacebuilding and cybersecurity, crafting strategies to counter the ever-evolving cyber threats.

Connect on Social Media:

Watch an interview with Dr Rich and Professor Aiken on Forensic CyberPsychology at Capitol Technology University.

In this episode, we talk about the intersection of Forensic CyberPsychology and Cyber Security. 

Dr. Rich shared how his military experience highlighted the role of human behaviour in cybersecurity. Sophisticated systems can be undermined by human error or psychological manipulation. This realisation led him to earn a PhD in forensic cyberpsychology, where he studies how adversaries exploit cognitive biases to breach systems.

A Multi-Dimensional Approach to Cybersecurity

Dr. Rich advocates for integrating technical defences with insights from cyberpsychology. By understanding the psychology of both attackers and victims, organisations can predict and preempt cyber incidents more effectively. This holistic approach incorporates:

  • Behavioural insights to counter attackers’ tactics.
  • Dynamic training programs tailored to evolving threats.
  • Feedback loops to refine strategies in real time.

Forensic Cyberpsychology in Practice

At USIP, Dr. Rich develops training programs that equip personnel with tools to identify and mitigate cyber risks. He emphasises the importance of understanding the specific threats faced in different global conflict zones, tailoring strategies to local conditions. His predictive algorithms, developed during his PhD, achieve an impressive 70% accuracy in forecasting cyberattacks, allowing his team to focus on high-risk areas.

The Role of AI in Cybersecurity

Dr. Rich explored how AI can enhance both offensive and defensive strategies. While adversaries use AI to scale up attacks, defenders can employ it to predict behaviours and counteract threats proactively. The future of cybersecurity lies in this interplay between AI and human behavioural analysis.

Insights for Aspiring Forensic Cyberpsychologists

For those interested in the field, Dr. Rich recommends a blend of education in psychology or cybersecurity, specialised training in digital forensics and behavioural analysis, and practical experience. Networking and interdisciplinary collaboration are vital to staying ahead in this fast-evolving field.

Recommended Reads:

  • The Cyber Effect by Dr. Mary Aiken – A deep dive into how the digital world shapes human behaviour.
  • Cybersecurity and Cyberwar: What Everyone Needs to Know by P.W. Singer and Allan Friedman – Exploring the intersection of cybersecurity and global conflict.

Marshall’s experience spans the military, academia, and work with government organisations.

Dr Marshall S. Rich's Research

Todd Fletcher and Dr Chris Fullwood | Ep 8

Watch or listen:
The Psychology of CyberSecurity Professionals.

Why do cybersecurity professionals either blatantly or subconsciously disregard standard cybersecurity protocols?

In this episode, we explore how personality and psychology shape cybersecurity roles and how human factors drive defences, risk, and burnout in the digital world.

Connect with the guests
Todd Fletcher

Todd Fletcher: Cyberpsychology PhD student exploring the psychology of cybersecurity professionals, with a rich background spanning IT, networking, and cybersecurity engineering.

Visit Todd’s research and personal website.

Dr Chris Fullwood

Dr Chris Fullwood: Senior Lecturer in Psychology at Birmingham City University and co-author of the Oxford Handbook of Cyberpsychology.

Read more about Chris and his research and watch his podcast episode on how we present ourselves online.

This episode delves deeply into the intersection of psychology and cybersecurity, providing invaluable insights for professionals, students, and parents alike. Whether you’re exploring the field or safeguarding your digital presence, understanding the human element in technology is more crucial than ever.

Cyberpsychology vs Cybersecurity:

  • Cyberpsychology: Broad discipline examining human interaction with technology, from motivations to behavioural impacts.
  • Cybersecurity: Primarily technical but deeply intertwined with human psychology, focusing on protecting systems and data while understanding user behaviours and vulnerabilities.

Todd’s Research Journey:

  • Motivation: Todd’s curiosity about the psychological factors influencing cybersecurity professionals.
  • Current Focus: Examining how personality traits, organisational culture, and cognitive behaviours affect decision-making and security compliance among professionals.
  • Goal: Developing a “Security Acceptance Model” to better integrate human psychology into cybersecurity practices.

Insights on Personality and Cybersecurity:

  • Certain traits, such as curiosity, openness to new experiences, and conscientiousness, correlate with success in cybersecurity.
  • Traits like impulsivity and risk-taking can increase susceptibility to breaches, such as falling for phishing scams.

Human Factor in Security:

  • Cybersecurity breaches are often linked to human errors rather than technical failures.
  • Stress, burnout, and cognitive overload significantly impact professionals’ effectiveness and decision-making.

Challenges in the Cybersecurity Profession:

  • High burnout rates due to long hours, constant upskilling demands, and pressure to safeguard against evolving threats.
  • Lack of leadership support and understanding of cybersecurity risks within organisations.

Youth and Cybersecurity:

  • Encouraging curiosity in technology while guiding ethical practices is vital for fostering a positive interest in cybersecurity.
  • Parents should foster open communication and maintain awareness of their children’s online activities to prevent malicious influences.

Pathways into Cybersecurity:

  • Multiple routes include certifications, college degrees, and self-learning. Key attributes for success are curiosity, continual learning, and networking with industry professionals.

Favourite Cyberpsychology Resource:

  • Oxford Handbook of Cyberpsychology: A foundational text that explores the interplay of human behaviour and digital technology.
Google's Triple Threat

Search and select – big tech nudges

In July 2019, Robert Epstein (PhD) testified before Congress about research he has been conducting on Google since 2012 – specifically on its power to suppress content and to manipulate the thoughts and behaviour of those who use the search engine.

This PDF is an updated and expanded version of his testimony, in which he lays out the results of his research. He suggests not disbanding the search engine or making it a public entity, but rather ‘encouraging’ the tech giant to share its index with other entities (i.e. for the index to become a public commons) while still retaining ownership and control of it. Google already does this, to some degree, but Dr Epstein argues that making the index available to all will encourage greater competition without weakening the infrastructure already built.

A previously released article from Bloomberg Businessweek (from page 37 of the PDF) provides a shorter summary of his arguments.

Although much of the research concerns the search engine’s ability to influence US elections, a few highlights of his main points are worth noting:

  • ‘The rise of the internet has given these companies unprecedented power to control public policy, to swing elections, to brainwash our children, to censor content, to track our every move, to tear societies apart, to alter the human mind, and even to reengineer humanity’ – to reiterate this point, he links to a 2018 The Verge article on a leaked 2016 Google video to top execs highlighting their desire to resequence human behaviour to better align with Google’s values. 
  • ‘Google’s corporate culture revolves around the idea that we’re here to create a better world, where “better” is defined by the prevailing company values. If you doubt that, check out the leaked PowerPoint presentation, “The Good Censor“… the algorithms determine what more than three billion people around the world can or cannot see.’
  • ‘If you have been using the internet for a decade or more, Google has collected an equivalent of about 3 million pages of information about you… Google services are not free. We pay for them with our freedom’
  • An existing body of research suggests that these new, often invisible, ways of changing people’s thinking and behaviour are likely to have a much bigger impact on children than on adults. And who is more attached to new tech devices than anyone else? Our children, who are often unattended while immersed in social media, playing games, or communicating with other people on their computers or mobile devices.
  • ‘In January 1961, President Dwight D. Eisenhower warned about the possible rise of a “technological elite” that could control public policy without people’s awareness…The elite now exists and they have more power than you think. Democracy as originally conceived cannot survive Big Tech as currently empowered. It is up to our leaders – and to every one of us as individuals – to determine where we go from there. 
Related Articles
Locus of Control and CyberSecurity

What role does job control play in adherence to Cyber Security?

'Exploring the Role of Work Identity and Work Locus of Control in Information Security Awareness'.

Extracts and summary of the research by: Dr Lee Hadlington, Dr Masa Popovac, Prof. Helge Janicke, Dr Iryna Yevseyeva, Dr Kevin Jones (2019)

In her summary of the work, Dr Popovac describes the research as exploring ‘the adherence to organisational information security and the role of work-related and individual factors such as individuals’ perceived control within the workplace, their commitment to current work identity, and the extent to which they are reconsidering commitment to work.’

Key quotes from the research:

  • ‘Cyber security is not just about technology. Almost all successful cyber attacks have a contributing human factor’ (a direct extract from the UK National Cyber Security Strategy 2016–2021, p. 38)
  • ‘for the most part, technology cannot be the only solution to issues related to organisational cybersecurity…employee[s] (the human factor), can present a paradoxical element into the fight’
  • ‘On the one hand, employees can be a critical asset in the fight against cybersecurity breaches, and can act to deny malicious attempts to access sensitive company data. On the other hand, employees can be the ‘weakest link’…in the cybersecurity system; they are not logical, prone to misunderstanding and confusion, act on impulse and want to get their jobs done’

Summary of the research: 

This research focuses on which factors, outside of personality type, play into employee engagement with cyber security in the workplace. The main aims of the research were to understand:

  • to what extent the ability to control one’s job function affects the taking of responsibility for cyber security
  • whether identification with the workplace plays any role in improving cyber security amongst workers

The researchers point out that:

  • there is a difference between knowledge of the company’s information security policies and the ability of the employee to understand them.
  • there is also a potential gap in how individual attitudes and behaviour align with these policies.

Previous research done in the area of cyber security has identified several characteristics of those more likely to be cyber-security conscious.

In contrast, those who engage in cyberloafing (engaging in non-work tech use during working hours) or have higher levels of internet addiction were less likely to be cyber-security conscious. The assumption was that these workers believed the higher levels of company security mitigated online risk when accessing specific materials and activities. Another assumption is that those who have little regard for the company they work for, or who feel they have limited control over their job, are also more likely to have a lower interest in adhering to internet security protocols.

Employees with a higher internal locus of control are more likely to have lower stress levels, feel more in control of their work, and have greater job satisfaction. Those higher in external locus of control feel they have little control over their work, experience higher levels of stress, and have lower job/company commitment – making them more likely to engage in counter-productive work behaviours, often to rekindle a sense of self-control over their work or potentially as an active attempt to harm the company.

Those who feel less committed to their work may be less inclined to engage in cyber security behaviour, or may not see the value in doing so.

The findings of the research are: 

  • Those with a higher internal locus of control are more likely to see their actions as a way to protect both themselves and the company from cyber attacks
  • Workers with a higher external locus of control perceive themselves to have minimal control over their work and workplace, assume that both they and the company are vulnerable to attacks whatever action they do or don’t take, and so see little value in following processes relating to information security.
  • Those who have a strong work identity, and experience a sense of belonging in their workplace, are more adherent to cyber security policies
  • In contrast, those with a lower level of work identity and/or looking for a new role are less compliant. 
  • Older workers and women were also found to be more likely to engage in higher levels of information security compliance – confirming previous research.
  • Those who have a clear understanding of the formal company rules around information security are more likely to follow them. 

Definitions: 

Locus of control: ‘an individual’s expectancy related to how rewards or aspects of life outcomes are controlled on the basis of the actions of the individual (internally) or as a result of forces outside the control of the individual (externality)’.

Organisational commitment: ‘the level of attachment an employee has with their workplace’.

Work identity: ‘the strength of an individual’s identification with their work, and not directly their workplace or organisation’.

This is not an open-access article; the full original text will need to be purchased.

Google's Bias

Merely being able to see the bias, doesn’t protect you from the bias

The title of this post is a direct quote from an interview with Dr. Robert Epstein, a Senior Research Psychologist, on The Epoch Times TV channel.

If you were ever wondering how much our opinions are being influenced by ‘Big Tech’, this is worth watching. Dr Epstein has spent the last decade conducting scientific research on the effects of bias in search engines (particularly Google). An interesting quote at 1:04:12: ‘merely being able to see the bias, doesn’t necessarily protect you from the bias’.

We can change our own tech habits and behaviour, but we also need to change the regulations around how tech influences us. Technology changing our behaviour for our good is one thing; when it changes our behaviour in socially and personally detrimental ways, we need to act.

We are all at risk, no matter our age.

If you are interested, you can read more about the work done by Dr. Epstein here. 

For ease of reference, some links Dr. Epstein mentions in the interview (copied directly from the Epoch TV interview page) are included below:
Dr. Robert Epstein’s Privacy Tips
Report on Google’s Triple Threat
Taming Big Tech: The Case for Monitoring
More information on Dr. Epstein’s Google research
The American Institute for Behavioral Research and Technology

CyberSecurity and CyberPsych

When Cyber Security meets CyberPsychology

Cyber Security is not the same as CyberPsychology. The difference is like that between someone who helps you physically set up home security and someone who seeks to understand why you don’t turn that security on when you leave the house.

In a recent webinar, one of our Cyber Experts Dr John Blythe joins three of the collaborators of the latest whitepaper on Human Factors in Cyber Security. The video is a playback of the webinar.

If you want to access a copy of the white paper to read, you can find it on the Chartered Institute of Ergonomics and Human Factors website.

The webinar playback showcases a recorded video summary of the white paper and also contains a Q&A session with the three panellists. For those involved in Cyber Security within organisations, it provides valuable insight into the human factors that have affected, and continue to affect, companies in a remote and hybrid working environment.

 

Personality Type and Cyber Security

Does our personality type make us more or less susceptible to phishing and online scams?

According to academic research in this area, the short answer is ‘yes’. The majority of research has used The Big 5 Personality Types to identify different types of Cyber Security behaviour. Although research in this area has slightly conflicting results, there are some general findings that are interesting to note. These are outlined briefly below.

Openness to Experiences

Those who score higher on this trait have a greater ability to adjust their viewpoints and are therefore better able to review information in emails on the merit of the content itself, rather than on preconceived ideas about either the content or the sender. They are, therefore, better able to identify phishing content. However, they are also more likely to reveal personal information about themselves on social media and in online communication.

Extroverts

Extroverts are a lot more sociable and more likely to share information with others about phishing scams. They are also more likely to share information about themselves, and were more likely to be bored during lockdowns, craving social interaction – so are potentially more likely to click on links to help alleviate boredom.

Agreeableness

Those high in agreeableness are more inclined to want to please others and try to avoid being disliked. They are, therefore, more susceptible to phishing attacks. If an email looks like it comes from an internal department or a supplier/customer, they may try to be helpful and/or ‘fix’ things.

Neuroticism

People higher in this trait have an inherent need to believe that others are telling the truth. They also don’t like to upset people, so are more likely to fall for phishing scams.

Conscientiousness

Those who display more of this trait are the least likely to fall for phishing scams. They tend to read content more critically and are more likely to follow training guidelines.

Generic cyber security training and education is vital within any organisation to help minimise susceptibility to phishing attacks, but training should also cover how each personality type can be affected differently. This may make individual workers more vigilant towards the phishing attacks they are most susceptible to, based on their dominant personality traits.

Take the Big 5 Personality Test, with a brief explanation of each trait:

If you want to take the Big 5 Personality Test to find out more about where you fit within each range, you can find a link below.

Take the test.

Read more about The Big 5 personality types.

A few notes about Personality based Psychometric Tests:

  • Although there are a number of psychometric tests available on the market, a large number of them are complicated to decipher and/or are only commercially available. Researchers, therefore, tend to use the Big 5 personality psychometric test as an academic standard for research.
  • Personality tests can indicate a preference for specific behaviour but should not be used to stigmatise people and categorise them into neat boxes. In all things psychological and behavioural, we are all on a spectrum, and display a unique combination of characteristics to a greater or lesser degree.
  • Personality tests are self-completion questionnaires that people fill in based on how they view their own behaviour – and we are generally not very good at understanding our own behaviour. This means that, like any self-completion questionnaire, they can give an indication of different behavioural types, but should be read and interpreted as such.

If you want to know more about what cybersecurity threats you may encounter, you can read ESET’s T2 2021 Cyber Threat Report.


The Cyber Effect – Mary Aiken

Published in 2017, The Cyber Effect is one of the first CyberPsychology books written specifically for the general public.

A book that everyone should read, especially parents. It will open your eyes to the potential opportunities and dangers that exist online.

About Mary Aiken:

Professor Mary Aiken is a Professor in Forensic Cyberpsychology at UEL. Go directly to her website or follow her on Twitter.

Online Personas

Why is there sometimes a real disconnect between online and offline behaviour?

We know that cyber-bullying is an issue that has affected many of us, to a greater or lesser degree. We have also heard of people who have been publicly shamed, with their reputations dashed to pieces. Jon Ronson, in his book ‘So You’ve Been Publicly Shamed’, investigates how recipients of online shaming have coped with their experience (for better or worse).

But, what about those who instigate or perpetuate online abuse or shaming? And, what about those who reveal more about themselves to online strangers than they would to someone physically present? 

John Suler (a pioneer in cyberpsychology and a psychology professor at Rider University) first published his six perceptual classifications in ‘The Online Disinhibition Effect’ in 2004. They still hold true today and go some way to explaining why people can display uncharacteristically antisocial behaviour online when they wouldn’t do so offline.

The first is the perception of being unknown (dissociative anonymity). Each site or app offers varying degrees of anonymity, but in many cases a person’s true identity can be partially hidden behind an unrecognisable username or avatar. What they then say or do online is unlikely to be traced directly back to them, and is therefore unlikely to result in any social reprimand from anyone in their immediate social circle.

Linked to this is the second effect: the perception of not being physically seen by the recipient of the message (invisibility). When around 55% of our message is conveyed by our body language and about 40% by our tone of voice, not having to worry about how one presents oneself or one’s message can mean inhibitions are less likely to be held in check.

When there is a delay between a message being sent and the reply being received (asynchronicity), as with emails or text messages, a person has more time to contemplate their reply – potentially overthinking each element of the message without the bonus of the social cues that face-to-face communication provides for correct interpretation. This delay and lack of social cues can allow toxic thoughts to form, influencing the response given.

It is human nature to fill in information gaps. When a friend talks about someone we don’t know, we often form a visual impression of the person in our heads, which often differs substantially from what they actually look like. When we read messages from someone we haven’t met in person, we often perceive the message as a voice in our head (solipsistic introjection). Based on the phrases and words used in a message, we develop a tone of voice, intonation and persona around the person we have only met online. Any gaps in our knowledge of them, we fill in with our own ‘personal expectations, wishes and needs’ of who they ought to be. This leads to developing a much deeper affinity with someone we engage with online and can encourage us to reveal substantially more detail about who we are than if we were to meet the same person face-to-face.

Engaging with people online can sometimes seem ‘unreal’. Any online characters we create – for example, a gaming or virtual-world avatar, or even our social media profiles – can lead us to perceive our online selves as existing in a different, almost fictitious, realm (dissociative imagination). Because this world does not always carry over into reality, the rules and norms that exist online feel unlinked to, and unaccountable in, the real world. Some may even convince themselves that their online self can suspend its moral conscience with the belief that ‘it isn’t really me’.

In the real world, a person’s authority is often displayed by their dress code, their body language and their possessions. Online, the dissolution of many barriers allows anyone to become well-known, to be an authority, to be heard by many and to reach previously unattainable goals. This has led to the perception that rank and hierarchy have been flattened out online (minimising authority). We mostly feel a lot more equal in an online world and, therefore, have limited fear of disapproval from authority figures, giving us the opportunity to ‘speak out and misbehave’ in ways we otherwise wouldn’t.

In reality, it’s not just about understanding how others behave deviantly online; recognising these tendencies and traits in ourselves can help us regulate ourselves a little more online, both in how we communicate with others and in how we interpret what others say. If almost 95% of what is said needs to be seen in body language or heard in the tone or intonation of our voice, then when interpreting technology-based communication we may need to be a little more conscious of our tendency to read more into what is said, and a little more cognisant of how we phrase things online.
