Embedding threat intelligence and practical training in ICS cybersecurity awareness for frontline resilience - Industrial Cyber
Embedding threat intelligence and practical training in ICS cybersecurity awareness for frontline resilience
OCTOBER 12, 2025
Rethinking ICS cybersecurity awareness increasingly demands a departure from IT-centric security perspectives. The shift reflects a threat landscape now dominated by state-sponsored actors motivated by geopolitical factors. Rather than relying on compliance alone, critical infrastructure organizations are building threat intelligence and practical training directly into operations, helping frontline teams understand the tactics adversaries continually adopt and use against them.
Creating a cybersecurity-focused culture across the plant floor begins with treating security as a safety and continuity measure rather than something bolted on after incidents. Key measures include leadership setting and visibly demonstrating priorities, embedding cyber hygiene in every activity, and empowering employees to identify risks in both the digital and physical spheres. When cyber resilience becomes part of the plant's DNA, operational teams view organizational defenses as what allows the work to continue without interruption.
ICS (industrial control systems) organizations can also deploy dynamic, role-based awareness programs powered by machine learning to counter AI-driven misinformation and disinformation. These programs simulate threats and attacks, monitor employee reactions, and adapt training to the behaviors observed, reducing phishing susceptibility and improving real-time threat detection. This adaptability mirrors the way attackers use sophisticated AI to stay one step ahead of defenses, making appropriate safeguards across the organization imperative.
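The adaptive loop described above can be sketched in a few lines: record each employee's responses to simulated lures and shorten their training cadence when susceptibility is observed. This is a minimal illustrative sketch; the class names, roles, and thresholds are assumptions, not any vendor's product.

```python
from dataclasses import dataclass, field

@dataclass
class EmployeeProfile:
    role: str                                     # e.g. "operator", "engineer"
    results: list = field(default_factory=list)   # True = clicked a simulated lure

    def record_simulation(self, clicked: bool) -> None:
        self.results.append(clicked)

    def click_rate(self) -> float:
        # Fraction of simulations where the employee fell for the lure
        return sum(self.results) / len(self.results) if self.results else 0.0

def next_training_interval_days(profile: EmployeeProfile) -> int:
    """Shorter, more targeted refreshers for employees who keep clicking.
    Thresholds are illustrative assumptions for the sketch."""
    rate = profile.click_rate()
    if rate > 0.30:
        return 7       # weekly micro-training for high-susceptibility staff
    if rate > 0.10:
        return 30      # monthly refresher for moderate risk
    return 90          # quarterly when behavior is consistently good

op = EmployeeProfile(role="operator")
for clicked in (True, False, True):   # clicked 2 of 3 simulated lures
    op.record_simulation(clicked)
print(next_training_interval_days(op))  # high click rate -> 7-day interval
```

In a real program the same feedback signal would drive which scenario the employee sees next, not just how often.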
Moving from mere compliance to genuine awareness means developing programs that serve the organization's mission rather than depending on blanket regulations. Tailoring instruction to the organization's specific assets and processes transforms compliance from a checklist into a genuine security posture. Continuous organization-wide assessments, customized audits, and open communication about security help overcome challenges and build readiness for any situation.
The impact of ICS cybersecurity awareness efforts should also be gauged continuously, using metrics such as fewer cyber incidents, quicker responses, and real changes in employee behavior. Research findings show that putting employee awareness at the forefront results in faster incident recovery, decreased vulnerability to social engineering, and greater operational stability. Ultimately, strong awareness initiatives build confidence across the entire organizational hierarchy, beyond their primary function of threat prevention.
Rethinking ICS cyber awareness in a changing threat landscape
Industrial Cyber spoke with experts to examine how cybersecurity awareness in ICS environments differs fundamentally from traditional IT settings, and how it has evolved amid the rise of nation-state and geopolitically driven attacks.
John Lee, managing director of OT-ISAC
“Cybersecurity in ICS environments differs from IT because the focus is on safety and reliability, not on protecting the confidentiality, integrity, and availability of information,” John Lee, managing director of the OT-ISAC (Operational Technology Information Sharing and Analysis Centre), told Industrial Cyber. “Volt Typhoon, a state-sponsored actor that has been targeting US power facilities, and the 2023 attack on Denmark’s energy sector are among the recent attacks. Disruption of critical infrastructure can have a huge impact on lives and economies.”
Lee identified that cybersecurity awareness in ICS is paramount to prevent these threats. “Organizations should focus on the technical aspects as well as the business impacts. Industry groups and experts are advocating secure design, deployment, and operations as well as supply chain risk management. These measures will help organizations to be more resilient against potential attacks.”
Andrew Tunnecliffe, threat intelligence and detection lead of CI-ISAC
Andrew Tunnecliffe, threat intelligence and detection lead of CI-ISAC, told Industrial Cyber that the differences are significant and stem from the core priorities of each environment.
“IT focuses on protecting the confidentiality, integrity, and availability (CIA) of information, with worst-case scenarios being data theft or financial/reputational damage,” Tunnecliffe said. “In ICS, safety is paramount, followed by availability and integrity. Operational worst-case scenarios include physical equipment damage, environmental disasters, injury, or death. Because of those foundational differences, the risk context is different. Risks in IT are tied to business logic, fraud, data breaches, and service availability. In ICS, risks are linked to physics and engineering, such as improper turbine speeds or altered chemical formulas.”
Tunnecliffe added that the rise of nation-states and geopolitical attacks, exemplified by Stuxnet, Industroyer, and TRISIS, shows a shift toward physical destruction of infrastructure. “While industrial accidents remain threats, cyberattacks that take down entire power stations pose a much greater risk. The blurring lines between ICS and IT also increase supply chain risks, requiring vetting of all contractors, SaaS providers, and software vendors.”
He also highlighted that nation-state tools are designed to mimic legitimate processes and feed false data, making threats invisible. “Operators should report anything that feels ‘off,’ not just when something is broken.”
Georgianna ‘George’ Shea, chief technologist at the Foundation for Defense of Democracies (FDD)
Cybersecurity awareness in ICS environments is fundamentally different from IT because it focuses on safety and operational continuity, whereas IT awareness focuses on data confidentiality, integrity, and availability, Georgianna ‘George’ Shea, chief technologist at the Foundation for Defense of Democracies (FDD), told Industrial Cyber. “In IT, a breach often results in data loss or downtime that can be restored through backups. In ICS, a compromise can produce kinetic consequences like damaged equipment or loss of life. Many of these threats arise not from malware, but from manipulation of controller logic or unauthenticated process sensors, which are not detected by network security tools.”
Highlighting that ICS environments are also more varied with less standardization than IT, Shea identified that IT systems include office technology (servers, endpoints, firewalls, cloud services), whereas ICS deployments vary widely by industry. “An automotive manufacturer’s robotic assembly line bears little resemblance to a pharmaceutical cleanroom. This lack of uniformity complicates training and awareness, because ICS cybersecurity problems are often tied to unique industrial processes and hardware-level physics, not general-purpose IT infrastructure.”
She noted another reason awareness has lagged is market and tooling dynamics. The IT security market is massive, encouraging standardized, scalable solutions that vendors can sell broadly. ICS has a smaller market. “Most OT security tools focus on network traffic analysis, assuming Level-0 sensors and actuators are uncompromised. But if those inputs are spoofed or tampered with, as seen with Stuxnet and in real-world Aurora-like events, operators may see ‘normal’ data while equipment is being driven toward failure. This creates a false sense of security.”
“IT cybersecurity also has a mature education ecosystem, with well-defined degree programs and certifications,” Shea pointed out. “OT/ICS security is often learned on the job, forcing practitioners to bridge the gap between engineering and cybersecurity. This creates cultural friction: practices like rapid patching or segmentation, normal in IT, can be dangerous in ICS environments where uptime and safety must take precedence.”
Shea noted that with the rise of nation-states and geopolitical attacks, ICS awareness has shifted from being an engineering afterthought to a national security imperative. “Adversaries target not just IT networks but the physics of control systems, seeking to cause long-duration physical disruption. These incidents highlight that kinetic cyberattacks can masquerade as equipment malfunctions and create cascading consequences across entire sectors.”
“Finally, unlike federal IT systems, which follow FISMA, ICS has no broad, cross-sector legal requirements for cybersecurity,” she added. “While sectors like energy have mandatory standards (e.g., NERC CIP), most ICS operators adopt frameworks such as NIST SP 800-82 or ISA/IEC 62443 voluntarily. Until policy and practice explicitly address sensor authenticity, relay configuration protections, and physics-level monitoring, ICS cybersecurity will remain under-prepared for the kinds of kinetic attacks that nation-states are already demonstrating.”
Dean Parsons, SANS Certified instructor and CEO of ICS Defense Force, said that cybersecurity awareness in ICS/OT differs fundamentally from IT. “The controls, processes, and attack methods are not only different, but so too are the knowledge requirements for end users, practitioners, and leadership. IT-focused security awareness programs alone are insufficient in critical infrastructure environments because they do not address risks tied to safety, engineering consequences, or the specialized workforce knowledge needed to protect ICS/OT operations from human error.”
Dean Parsons, SANS certified instructor and CEO of ICS Defense Force
“In ICS/OT, the consequences extend beyond data protection to physical outcomes that impact safety, reliability, and engineering processes,” Parsons told Industrial Cyber. “Industrial signaling and commands operate and read sensors, programmable logic controllers (PLCs), actuators, and other equipment that directly manipulate the physical world in real time. Any disruption, whether from a cyberattack, misconfiguration, deploying incorrect non-ICS/OT aware controls, or equipment malfunction, can place human lives and operational continuity at risk.”
He mentioned that with rising nation-state and geopolitical threats, ICS/OT workforce development and security awareness initiatives must prepare personnel at every level to identify engineering-specific abnormalities, recognize industrial risks, escalate issues rapidly, and maintain safe operations through cyber incidents.
Unlike IT-focused programs, ICS/OT-specific security awareness and training address essential areas and the differences between IT and ICS/OT, including safety and industrial-grade incident response processes, specialized engineering skillsets, system design considerations, ICS/OT-specific cybersecurity controls (such as the Five ICS Cybersecurity Critical Controls), and operational support requirements.
Only by tailoring training to these domains can organizations build the knowledge and resilience necessary to defend critical infrastructure against adversaries who deliberately target industrial protocols and engineering equipment to cause harmful physical consequences.
Building cybersecurity culture on plant floor
The executives focus on how awareness programs counter the perception that security disrupts operations and instead make cybersecurity part of daily plant culture.
“Many operators see security as an obstacle to productivity. Awareness programs must reframe security as an ‘enabler of safe and reliable operations.’ This means linking cybersecurity practices directly to plant safety, regulatory compliance, and uptime,” Lee said. “For example, training can show how failing to lock a workstation could allow an attacker to change setpoints, risking equipment damage. Programs should use relatable, real-world incidents to illustrate that secure practices reduce operational risk. Involving operations staff in security decision-making also helps. When employees feel ownership and see that controls are designed to fit workflow, cybersecurity naturally integrates into ‘plant culture.’”
Tunnecliffe said organizations should anchor security to safety culture by carrying rituals and controls from the physical world into the digital one. “Examples include safety briefings, lock-out/tag-out (LOTO), and empowering users to issue stop-work orders. These actions build a sense of shared safety and ownership. Additionally, positive feedback loops are valuable. Publicly celebrate and reward staff who report suspicious activity, like leaving a personal USB device in a machine or reporting a phishing email. This fosters shared ownership and vigilance.”
Shea recognizes that the tone at the top drives the culture of the organization. “This starts at the board level and continues to the C-suite. Awareness programs should make clear that cybersecurity supports operations by protecting continuity and safety. Recently, a board cut the CEO’s bonus after a security breach. This was a fantastic way to enforce the board’s risk appetite and demand a culture of cybersecurity.”
“The best results come when cyber considerations are built into existing routines, so they are part of normal work rather than seen as extra steps,” according to Shea. “Using real ICS case studies helps employees understand how cyber incidents can create safety risks and costly downtime. Tabletop exercises that involve operators, engineers, plant managers, and security staff show that responsibility is shared across the plant. When people see how their role directly prevents disruption, they begin to view cybersecurity as a normal and essential part of keeping the plant safe and reliable.”
Parsons says, “Let’s make no mistake about it. The reality is, in engineering control systems environments, safety rightfully comes first, and inadequate, misaligned, or misapplied security will absolutely disrupt operations and put lives at risk.”
“A prime example of how misaligned security can directly disrupt control systems and impede safety is when IT security controls, processes, and procedures are applied to ICS/OT environments,” he explained. “These approaches, when not tailored for control system networks, engineering hardware, and the safety-first culture, are very likely to break critical processes, compromise reliability, and impede safe operations. Ultimately, putting lives at risk.”
Parsons notes that in ICS/OT, security must support safety, never override it. “This means that controls, processes, and technologies must be purpose-built for control system environments and fully aligned with established engineering practices. IT security controls and processes are not built with this approach in mind.”
He underlined that those responsible for ICS/OT cybersecurity must earn the trust and confidence of engineering teams, staff, and leadership who manage these facilities and bear ultimate responsibility for the safety of their staff and the potential sector cascading impacts. Only then can security be effectively integrated to defend the critical infrastructure operations that make, move, and power our world.
“To achieve this, mature organizations have embraced the safety-first approach and investment in ICS/OT-specific workforce development and security awareness training,” Parsons said. “ICS/OT security training must highlight the fundamental differences between IT and ICS/OT, equipping all personnel – end users, practitioners, and leadership – with the knowledge to safeguard operations. In an ICS/OT organization, this applies to everyone: when ICS/OT is present, ICS/OT is the business.”
Adapting ICS cybersecurity awareness to AI-driven deception
With phishing, deepfakes, and AI-driven deception on the rise, the executives address whether ICS awareness training should evolve to address these human-focused threats.
Lee said that ICS awareness must expand beyond technical safeguards to ‘psychological resilience.’
“Phishing, social engineering, and AI-powered deception target the human layer. Training should include simulated phishing campaigns, exercises in spotting manipulated audio/video, and discussions on how attackers exploit trust and urgency,” he added. “Employees should learn verification habits, such as confirming requests through secondary channels before acting. Because ICS often relies on third-party contractors and remote vendors, awareness must extend beyond employees to the entire supply chain. Cultivating skepticism, critical thinking, and verification practices is essential in defending against AI-driven manipulation. While trust may be extended, it must be accompanied by verification in cybersecurity.”
Tunnecliffe identified that the core principle should shift from ‘Don’t Click’ to ‘Verify, Then Trust,’ emphasizing healthy skepticism of unsolicited digital communication that requests an action, especially with physical consequences. “Training should incorporate out-of-band verification; for example, confirming email instructions with a human-to-human phone call. If a supervisor calls to ignore a safety alarm, operators must call back on a known, trusted, internal number to confirm. High-fidelity, contextual simulations are crucial. Demoing vishing or real-time deepfake generation, where voices are cloned, shows the simplicity of these attacks, as many people are unaware.”
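The “verify, then trust” rule Tunnecliffe describes can be expressed as a simple policy: an instruction received over one channel is only acted on after confirmation over a second, pre-registered channel. The following is a minimal sketch under stated assumptions; the directory, role names, and channel labels are hypothetical.

```python
# Known internal callback numbers, maintained independently of any
# incoming request (so a deepfaked caller cannot supply their own).
TRUSTED_CALLBACK_DIRECTORY = {
    "shift_supervisor": "+1-555-0100",
    "plant_manager": "+1-555-0101",
}

def may_act(requester_role: str,
            inbound_channel: str,
            confirmed_via_callback: bool) -> bool:
    """Allow the requested action only if the requester has a registered
    callback number AND the operator confirmed on that known number,
    no matter how convincing the inbound email or cloned voice sounded."""
    if requester_role not in TRUSTED_CALLBACK_DIRECTORY:
        return False                 # no out-of-band path exists -> escalate
    if inbound_channel == "callback":
        return True                  # request already arrived on the trusted channel
    return confirmed_via_callback    # otherwise require the call-back step

# A deepfaked "supervisor" phones in asking to ignore a safety alarm:
print(may_act("shift_supervisor", "phone_in", confirmed_via_callback=False))  # False
# After the operator calls back on the known internal number:
print(may_act("shift_supervisor", "phone_in", confirmed_via_callback=True))   # True
```

The point of the sketch is that the decision depends only on the independently maintained directory and the callback step, never on the content of the inbound message itself.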
“Cybersecurity safeguards traditionally stop at the technical boundary: system-to-system interactions are protected with controls like passwords or MFA, while human-to-human interactions often rely on assumed trust,” Shea said. “In daily operations, humans recognize colleagues by how they look or sound and rarely ask for credentials. That assumption is now a vulnerability. AI-driven deception can convincingly mimic trusted team members or vendors, making it possible to issue fraudulent instructions during video calls, phone conversations, or chat platforms.”
To close this gap, she added that organizations should systematically map where human-to-human interactions occur through technology, much like a six-sigma process review, and embed explicit verification steps. “For ICS environments, this means not only documenting system connections but also identifying communication points where operational decisions could be influenced by AI-driven deception.”
Moving ICS cybersecurity awareness beyond compliance
Beyond compliance, executives explore approaches that truly shift ICS workforces toward a security-first mindset, supporting safety and resilience.
“Compliance training is often viewed as a checklist. To create a security-first mindset, programs must focus on ‘behavioral change and cultural embedding,’” Lee said. “Storytelling about past ICS breaches, hands-on drills that simulate attacks, and linking lessons directly to personal and plant safety are effective. Recognition programs that reward secure behaviors help normalize good habits. Embedding cybersecurity discussions into daily toolbox talks or shift handovers reinforces the idea that security is part of the job, not an extra requirement.”
He added that when employees consistently see leadership modeling secure behaviors, the workforce naturally aligns with a security-first culture.
“Employees need to feel empowered to question or halt any process if they suspect a cyber issue, without fear of reprisal, similar to a strong physical safety culture,” according to Tunnecliffe. “A blame-free reporting policy is critical, focusing post-incident review on improving processes, not punishing individuals who may have been deceived.”
He mentioned that leadership must drive accountability by including cybersecurity in daily briefings, linking it to performance metrics, and publicly adhering to rules themselves. “This demonstrates that cybersecurity is a core business priority, not just an ‘IT problem.’”
Shea pointed to money, adding that “the board that cut bonuses after a security breach knows that the leadership of the organization is directly influenced by money. Insurance companies also have the power to be one of the strongest drivers of change in how organizations approach cybersecurity.”
Furthermore, she added that if insurers require companies to meet specific cybersecurity standards to qualify for coverage or favorable premiums, leadership has a direct financial incentive to prioritize security. “This influence cascades down through policies, training, and daily procedures.”
“Meaningful change for control systems and our critical infrastructure environments comes from embedding ICS/OT specific cybersecurity into the existing physical safety culture of engineering,” Parsons said. “This requires consequence-driven training that connects cyber risks directly to physical hazards and safety outcomes. Leaders must model the right behaviors, while workforce engagement should be reinforced through engineering-led tabletop exercises (not IT-driven) that reflect real operational contexts and outcomes.”
Assessing effectiveness of ICS cybersecurity awareness
Lastly, the executives look into metrics or real-world indicators that best show whether ICS cybersecurity awareness efforts are reducing risk, rather than just producing documentation.
Lee said that paper compliance does not equal reduced risk; it is not just about completing training and passing assessments. Metrics need to demonstrate changed human behavior and increased operational resilience.
He noted that some performance indicators include a reduction in successful phishing click rates during phishing simulations, faster incident reporting times when suspicious activity is noticed, and improved compliance with access control policies, such as fewer shared passwords and complete MFA enrollment. He added near-miss reporting, where employees identify and report risks before they escalate (near misses are lessons learned to prevent the next incident), decreased downtime or operational disruptions related to cybersecurity lapses, and external audits showing improvement in employee responses over past audits.
“Vanity metrics like ‘95% of staff completed cyber awareness training’ are insufficient,” Tunnecliffe said. “Indicators should focus on a positive culture. This includes the quality of reporting (e.g., proactive phishing email reports with context), near-miss reporting (employees immediately reporting accidental clicks or handing over found USB sticks to IT), and challenge rates (operators questioning out-of-process commands, even if legitimate, measured through out-of-band communication follow-ups).”
He also noted reduced Mean Time to Detect (MTTD) during incident response, and a decreased base policy violation rate (e.g., fewer work email accounts used for non-work-related social media sign-ups), which also demonstrates improved security awareness.
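The behavioral indicators the experts list above are straightforward to compute from an incident and simulation log. This sketch assumes a hypothetical event schema (the field names are illustrative, not a standard) and derives three of the metrics mentioned: phishing click rate, mean time to detect (MTTD), and near-miss report count.

```python
from datetime import datetime, timedelta

# Hypothetical event log; field names are assumptions for this sketch.
events = [
    {"type": "phish_sim", "clicked": False},
    {"type": "phish_sim", "clicked": True},
    {"type": "phish_sim", "clicked": False},
    {"type": "phish_sim", "clicked": False},
    {"type": "incident", "occurred": datetime(2025, 10, 1, 8, 0),
     "detected": datetime(2025, 10, 1, 9, 30)},
    {"type": "incident", "occurred": datetime(2025, 10, 5, 14, 0),
     "detected": datetime(2025, 10, 5, 14, 30)},
    {"type": "near_miss"},   # e.g. a found USB stick handed straight to IT
    {"type": "near_miss"},
]

def phishing_click_rate(log) -> float:
    """Fraction of simulated lures that were clicked."""
    sims = [e for e in log if e["type"] == "phish_sim"]
    return sum(e["clicked"] for e in sims) / len(sims)

def mean_time_to_detect(log) -> timedelta:
    """Average gap between an incident occurring and being detected."""
    gaps = [e["detected"] - e["occurred"] for e in log if e["type"] == "incident"]
    return sum(gaps, timedelta()) / len(gaps)

def near_miss_reports(log) -> int:
    """Count of proactively reported near misses."""
    return sum(1 for e in log if e["type"] == "near_miss")

print(phishing_click_rate(events))   # 0.25
print(mean_time_to_detect(events))   # 1:00:00
print(near_miss_reports(events))     # 2
```

Tracked over time, a falling click rate and MTTD alongside a rising near-miss count is the behavioral signal the experts describe, as opposed to a completion-rate vanity metric.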
Building on the PCAST CPS Resilience report, Shea said organizations should track fail-over to manual operations as a training outcome: how often crews practice, how quickly they assume local control, and whether they sustain minimum operations. “Use stress-testing that mirrors real multi-point scenarios to measure recognition and escalation speed, not just attendance. Pair this with awareness-driven hygiene signals: fuller and more accurate inventories across IT, OT, and key suppliers maintained by operations staff, and sustained preventative maintenance with timely patching that operators schedule and execute during planned windows.”
She also suggests adding communications resilience drills that confirm teams know how to operate safely when internet or site links are unavailable, and measure hard-restart recovery time to see if personnel can rebuild services without circular dependencies.
“Metrics that matter go beyond completion reports. Leading indicators include increased operator-reported anomalies, faster verification and access revocation drills, and routine engineering restore tests (including critical ICS/OT assets like Windows-based OT systems, PLCs, RTUs (remote terminal units), protection control relays, etc.) passing within required recovery times,” Parsons said. “Lagging indicators such as reduced operational downtime, faster industrial incident containment, and a high closure rate of lessons-learned actions demonstrate that awareness is reducing real risk in the business-critical operational environments.”
He added that in ICS/OT, operations are the business, and protecting them means protecting lives, safety, and continuity. “Mature organizations embrace the differences between IT and ICS/OT, dedicating focus, ICS/OT dedicated training, and awareness aligned with engineering and safety priorities. They appoint champions from engineering to lead security awareness, ensuring programs are ICS/OT-driven rather than IT-focused.”
In mature organizations and facilities, Parsons concluded that ICS/OT security awareness is not a yearly task or checkbox. It’s a daily practice embedded in the physical safety culture, building resilient control system-aware workforces that see cybersecurity as inseparable from safe, reliable operations.
Anna Ribeiro