PhD Literature Review Sample: Evaluating the Long-term Impact of Continuous Cybersecurity Education on Employee Behaviours in Eastern European SMEs
Theme 1: Understanding Human Behaviour in Cybersecurity
The first theme, exploring the cybersecurity behaviours of employees, was primarily informed by the frameworks of Alsharida et al. (2023) and Lahcen et al. (2020) discussed earlier in the Methodology section. Within this theme, awareness and knowledge levels served as possible antecedents of subsequent compliance with internal cybersecurity rules and mitigation strategies. The replies of the interviewed information technology specialists highlighted two main sub-themes, namely Awareness and Knowledge and Risk Perceptions, discussed further in this section.
Sub-Theme 1.1. Awareness and Knowledge
How aware are the employees of your company of cybersecurity threats?
The first question explored the degree to which employees of the analysed companies were aware of the cybersecurity threats these companies faced. As shown by the replies, awareness levels could be appraised as medium, with most staff members informed only about a limited subset of such issues.
“We do not discuss all actual threats including high-level threats with them. This is treated as somewhat classified information. We only provide training related to possible phishing threats and other threats directly affecting employees” (Interviewee 3)
These statements may be seen as problematic in light of the recommendations voiced by Muller and Burrell (2022) and Moustafa et al. (2021), namely the need to keep every team member informed about the security procedures in use and the reasons for their selection. If some startup members lack an in-depth understanding of their companies' risk profiles, they may accidentally disclose relevant information to hackers or consider some activities less risky than they actually are.
What training programs are in place to enhance awareness?
In terms of the training programmes introduced by the analysed companies, most of the interviewees reported the use of internally designed programmes rather than industry-level models such as the earlier analysed CATRAM (Cybersecurity Awareness TRAining Model).
“We design our own programmes for every employee category. This way, we can account for our resources and main risks” (Interviewee 2)
On the one hand, this method could potentially be more effective according to Alanazi et al. (2022) and Li et al. (2019). Since each startup is unique in terms of its challenges and risks, a well-designed programme tailored to its particular needs could provide unique synergies and help it use its resources more sparingly (Hong & Furnell, 2021). As discussed earlier, the minimum viable security (MVS) concept is widely applied by startups with limited funding, mainly to protect key areas such as intellectual property or sensitive data related to investor relationships. On the other hand, this scheme requires greater coordination and knowledge between team members and cybersecurity specialists than standardised solutions and industry best practices do (Mohammad et al., 2022; Sule et al., 2021). If mistakes are made, the end results may be sub-optimal compared with established frameworks such as CATRAM, which allows managers to test the skills and knowledge of their subordinates using well-described procedures.
What methods does your company utilise to test awareness levels and ensure employee readiness?
Answers to this question suggested that the interviewed information technology specialists prioritised attack simulations and classroom tests as the key methods for appraising cybersecurity awareness levels.
“We run online tests to appraise their knowledge and sometimes try to hack into our networks using phishing and social engineering. Several times, we used external experts for this purpose to get independent audits of our readiness levels” (Interviewee 1)
These practices are in line with the earlier analysed secondary evidence. They combine theoretical testing and practical exercises to ensure that staff members are aware of the optimal behaviours and actually engage in them under stressful conditions. That being said, several respondents noted that it was difficult to test the cybersecurity readiness and compliance of remote workers connecting from their own home systems. This corresponds to the concerns voiced by Borkovich and Skovira (2020), Khan et al. (2022), and Verkijika (2019) that remote work may act as a source of additional threats as startups begin to expand in order to bring their initial ideas to market.
Sub-Theme 1.2. Risk Perceptions
How do the employees of your company perceive the risks associated with cybersecurity?
In terms of risk perceptions, the majority of the interviewed specialists noted that employees were mainly afraid of data theft, compromised sensitive data, and ransomware.
“They have heard of threats such as WannaCry or disclosed user data leading to major corporate fines from mass media” (Interviewee 3)
As noted by Hasan et al. (2021), such perceived risks may not fully reflect the whole spectrum of threats that startups are exposed to in contemporary cybersecurity environments. This narrow focus could decrease sensitivity to lower-profile risks, such as direct hacker attacks, which may be dangerous for small and medium startup companies. Similar problems were reported by Burton and Lain (2020) and He and Zhang (2019) within applications of the Theory of Planned Behaviour (TPB) to cybersecurity contexts, where subjective norms directly influence cybersecurity behaviours. In this respect, multiple interviewees noted that regular startup members saw potential risks as something primarily affecting medium and large firms or as a problem mainly handled by security specialists.
What factors, information or training activities can influence their perception of risk, in your opinion?
In response to this question, the interviewees provided no specific suggestions for ensuring optimal outcomes.
“I do not know. I think that we should discuss our past risks with real examples” (Interviewee 2)
“I think that they mostly listen to the news in forming their risk perceptions rather than our training materials” (Interviewee 4)
Considering these responses, it can be suggested that the majority of the analysed startups do not have a holistic strategy for informing their staff members about the key risks affecting their businesses. While the interviewed cybersecurity specialists mentioned training exercises and tests ensuring awareness and knowledge of potential threats, this information may not be closely linked with employees' day-to-day experiences. As noted earlier by Andrade and Yoo (2019) and Lahcen et al. (2020), such gaps in understanding may lead to the underestimation of phishing, vishing, whaling, and social engineering risks. With startups frequently working with representatives of major investment funds and other high-power stakeholders, the lack of security provisions in this sphere may lead to highly detrimental consequences for all involved parties (Ahmad et al., 2022).
As shown by the results for the two identified sub-themes in ‘Understanding Human Behaviour in Cybersecurity’, the interviewed specialists appraised startup staff awareness and knowledge of cyber threats as medium. While staff members were informed about such risks in general terms, they did not possess in-depth information about the actual risks encountered by their companies. These perceptions of threats as distant and not directly linked with employees' daily routines or spheres of responsibility were further reinforced by the interviewed cybersecurity specialists' implicit preference for technology-based solutions and their view of regular users as sources of potential problems rather than facilitators of higher security standards.
Theme 2: Mitigation Strategies and Best Practices
Sub-Theme 2.1. Technical Solutions vs. Human Factors
How do technical solutions complement human factors in cybersecurity?
The majority of the interviewed specialists stated that they saw technical solutions as superior to training and other techniques focused on the human factor in cybersecurity.
“I think that zero-trust methods are superior to training. You cannot always tell a senior manager that decides to violate security procedures. However, you can block suspicious behaviours from their account to prevent potential problems” (Interviewee 7)
This supports the trend of limited inclusion of regular staff members in cybersecurity processes highlighted in the previous section. According to Ahmad et al. (2022) and Alohali et al. (2018), this may be problematic for a number of reasons. On the one hand, technological means have limited effectiveness against phishing, vishing, whaling, and other instruments relying on the human factor (Butavicius et al., 2020). If startup employees possess a limited understanding of such threats, protection systems can be easily compromised despite their infrastructural complexity. On the other hand, the mentioned ability of senior staff members to override or violate some security procedures suggests hierarchical issues whereby security specialists may not be able to follow best practices if they receive direct orders from startup owners or other senior executives (Bansal et al., 2023). This may create additional points of failure in the case of cyberattacks due to excessive rights granted to some accounts within company systems.
What balance should be struck between technology and human behaviour?
In terms of the balance between technology and human behaviour, the statements indicated a similar trend.
“I am a firm believer in technology. It helps us track user behaviours and stop both fraudulent actions and simple human errors” (Interviewee 5)
These responses further suggest that startups may rely excessively on technological cybersecurity elements while ignoring the role of the human factor. While this strategy could be effective for preventing regular attacks according to Butavicius et al. (2020) and Borkovich and Skovira (2020), companies developing unique solutions with high market potential may be targeted by major competitors using advanced industrial espionage techniques. Considering the effectiveness of phishing and other human-factor-oriented instruments used by hackers, such an imbalance could be seen as a problematic indicator.
Sub-Theme 2.2. Incident Response and Recovery
How prepared are startups to respond to cybersecurity incidents?
In response to this question, the majority of the interviewees appraised cybersecurity readiness as relatively high. However, multiple concerns were voiced by individual respondents.
“They know the risks. If they are developing know-how that they are willing to build their company on, they try to protect their intellectual property from potential competitors” (Interviewee 1)
“Small company mentality is frequently a problem. When things start to grow, they retain it but protecting the information is more difficult when there are dozens of workers” (Interviewee 4)
These statements support the concerns voiced earlier by Alanazi et al. (2022) and Sule et al. (2021) regarding the ‘growing pains’ experienced by startups from the standpoint of cybersecurity. While their initial provisions in this sphere may have been developed for a small team of highly committed developers, expanding employee numbers may not be fully compatible with small-scale information protection systems (Hong & Furnell, 2021). According to multiple interviewees, this tension could be difficult to resolve because time and resource pressures made it impossible to pause product development and revise the existing cybersecurity measures at the structural level to adjust them to the new status quo.
What role does employee behaviour play in incident recovery?
As noted by multiple interviewees, staff members could influence the recovery from incidents to a substantial degree.
“If they are prepared for emergencies, the recovery is as simple as restoring the latest backup” (Interviewee 5)
These results are in line with the findings of Li et al. (2019), Moustafa et al. (2021), and Verkijika (2019), who noted that staff training aligned with NIST principles could greatly increase the speed of incident recovery. That being said, several interviewees suggested that their firms did not entirely delegate such obligations to regular startup staff members. Instead, regular backups and continuous user behaviour monitoring were seen as the responsibility of information security personnel (Andrade & Yoo, 2019). This effect was especially evident when such personnel belonged to an external organisation providing services to startups.
The analysis of the two sub-themes in ‘Mitigation Strategies and Best Practices’ further suggests that the interviewed specialists perceived the human factor in cybersecurity as an additional threat rather than a source of opportunity. On the one hand, they prioritised technological solutions over staff preparedness, which could be problematic in the case of phishing, whaling, vishing, and other attacks involving regular users. On the other hand, the appraised levels of employee readiness for incident response and recovery were inconsistent due to the growth of staff numbers. Additionally, multiple references to technological instruments such as automated backups imply that the interviewees prioritised such tools over staff training in this sphere as well.
References
Ahmad, Z., Ong, T., Gan, Y., Liew, T., & Norhashim, M. (2022). Predictors of employees’ mobile security practice: An analysis of personal and work-related variables. Applied Sciences, 12(9), 4198-4221. https://doi.org/10.3390/app12094198
Alanazi, M., Freeman, M., & Tootell, H. (2022). Exploring the factors that influence the cybersecurity behaviors of young adults. Computers in Human Behavior, 136(1), 1-18. https://doi.org/10.1016/j.chb.2022.107376
Alohali, M., Clarke, N., Li, F., & Furnell, S. (2018). Identifying and predicting the factors affecting end-users’ risk-taking behavior. Information & Computer Security, 26(3), 306-326. https://doi.org/10.1108/ICS-03-2018-0037
Alsharida, R., Al-rimy, B., Al-Emran, M., & Zainal, A. (2023). A systematic review of multi perspectives on human cybersecurity behavior. Technology in Society, 73(1), 1-22. https://doi.org/10.1016/j.techsoc.2023.102258
Andrade, R., & Yoo, S. (2019). Cognitive security: A comprehensive study of cognitive science in cybersecurity. Journal of Information Security and Applications, 48(1), 1-18. https://doi.org/10.1016/j.jisa.2019.06.008
Bansal, G., Thatcher, J., & Schuetz, S. (2023). Where authorities fail and experts excel: Influencing internet users’ compliance intentions. Computers & Security, 128(1), 1-23. https://doi.org/10.1016/j.cose.2023.103164
Borkovich, D., & Skovira, R. (2020). Working from home: Cybersecurity in the age of COVID-19. Issues in Information Systems, 21(4), 234-246. https://doi.org/10.48009/4_iis_2020_234-246
Burton, J., & Lain, C. (2020). Desecuritizing cybersecurity: Towards a societal approach. Journal of Cyber Policy, 5(3), 449-470. https://doi.org/10.1080/23738871.2020.1856903
Butavicius, M., Parsons, K., Lillie, M., McCormac, A., Pattinson, M., & Calic, D. (2020). When believing in technology leads to poor cyber security: Development of a trust in technical controls scale. Computers & Security, 98(1), 1-20. https://doi.org/10.1016/j.cose.2020.102020
Hasan, S., Ali, M., Kurnia, S., & Thurasamy, R. (2021). Evaluating the cyber security readiness of organizations and its influence on performance. Journal of Information Security and Applications, 58(1), 1-19. https://doi.org/10.1016/j.jisa.2020.102726
He, W., & Zhang, Z. (2019). Enterprise cybersecurity training and awareness programs: Recommendations for success. Journal of Organizational Computing and Electronic Commerce, 29(4), 249-257. https://doi.org/10.1080/10919392.2019.1611528
Hong, Y., & Furnell, S. (2021). Understanding cybersecurity behavioral habits: Insights from situational support. Journal of Information Security and Applications, 57(1), 1-22. https://doi.org/10.1016/j.jisa.2020.102710
Khan, N., Yaqoob, A., Khan, M., & Ikram, N. (2022). The cybersecurity behavioral research: A tertiary study. Computers & Security, 120(1), 1-17. https://doi.org/10.1016/j.cose.2022.102826
Lahcen, R., Caulkins, B., Mohapatra, R., & Kumar, M. (2020). Review and insight on the behavioral aspects of cybersecurity. Cybersecurity, 3(10), 1-18. https://doi.org/10.1186/s42400-020-00050-w
Li, L., He, W., Xu, L., Ash, I., Anwar, M., & Yuan, X. (2019). Investigating the impact of cybersecurity policy awareness on employees’ cybersecurity behavior. International Journal of Information Management, 45(1), 13-24. https://doi.org/10.1016/j.ijinfomgt.2018.10.017
Mohammad, T., Hussin, N., & Husin, M. (2022). Online safety awareness and human factors: An application of the theory of human ecology. Technology in Society, 68(1), 1-19. https://doi.org/10.1016/j.techsoc.2021.101823
Moustafa, A., Bello, A., & Maurushat, A. (2021). The role of user behaviour in improving cyber security management. Frontiers in Psychology, 12(1), 1-17. https://doi.org/10.3389/fpsyg.2021.561011
Muller, S., & Burrell, D. (2022). Social cybersecurity and human behavior. International Journal of Hyperconnectivity and the Internet of Things (IJHIoT), 6(1), 1-13. https://doi.org/10.4018/IJHIoT.305228
Sule, M., Zennaro, M., & Thomas, G. (2021). Cybersecurity through the lens of digital identity and data protection: issues and trends. Technology in Society, 67(1), 1-18. https://doi.org/10.1016/j.techsoc.2021.101734
Verkijika, S. (2019). “If you know what to do, will you take action to avoid mobile phishing attacks”: Self-efficacy, anticipated regret, and gender. Computers in Human Behavior, 101(1), 286-296. https://doi.org/10.1016/j.chb.2019.07.034