Here’s what nobody admits: Your firewall isn’t the problem. Your SIEM isn’t the problem. That shiny new EDR tool you just bought? Also not the problem.
The problem is Steve from accounting, who uses “Password123” because he can’t be bothered to remember anything more complex. The problem is your CISO, who talks about zero trust but still approves exceptions for the CEO’s personal devices. The problem is the unspoken rule that security slows things down, so everyone ends up finding workarounds.
As the famous quote attributed to Peter Drucker goes, culture eats strategy for breakfast. In cyber operations, it eats your security posture for lunch. We learned this the hard way three years ago, when a mid-sized financial firm hired a colleague to figure out why they kept getting phished despite spending millions on awareness training. Their policies were pristine. Their tech stack was impressive. Their incident response plan could’ve won awards.
But their culture? Rotten to the core.
The thing about culture is that it exists in layers. What you see on the surface tells you almost nothing about what’s actually happening. You need to understand three distinct dimensions: observable, non-observable and implicit. Miss any one of them, and you’re building your security program on quicksand.
Observable culture: The stuff you can actually see
Observable culture is everything tangible. Your policies. Your procedures. The security awareness posters in the break room. The mandatory training modules everyone clicks through while checking their phones.
This is where most organizations stop. They write a 47-page security policy, mandate annual training, deploy some monitoring tools and call it a day. Box checked. Compliance achieved. Everyone goes home feeling good about themselves.
Except none of it matters if people don’t actually follow through.
Observable elements include your formal security protocols, your incident response plans and your access controls. They include visible behaviors like password hygiene, device management and whether people actually report suspicious emails. They include the technology you deploy and how you communicate about threats.
You can measure this stuff. You can audit it. You can put it in a spreadsheet and show it to the board.
But observable culture is the easiest to fake. People learn to perform security theatre. They know what they’re supposed to do. They know what gets measured. So they do just enough to avoid getting flagged while continuing their risky behaviors in the shadows.
Take Target’s 2013 breach. They had a $1.6 million FireEye malware detection system. The system did exactly what it was supposed to do. It detected the malware. It sent alerts. Multiple times.
But the security team ignored the alerts. They had policies and procedures. They had the technology. But the observable layer was disconnected from actual practice. The tools were there, but the follow-through wasn’t. The breach exposed 40 million credit card numbers and cost Target over $200 million in settlements.
The impact on cyber operations was catastrophic. The tools didn’t fail. The observable culture, the visible security apparatus, existed in a vacuum. Having security controls is meaningless if your operational culture treats alerts as noise. Target’s incident response plan looked great on paper. But when alerts fired, nobody acted. The gap between documented procedure and actual behavior created a blind spot large enough to drive a truck through.
That financial firm we mentioned? Their observable culture looked perfect. Everyone completed their training. Policies were documented and signed. Security tools were deployed and configured.
But when we dug deeper, we found developers routinely turning off security controls because they “slowed down deployments.” We found executives sharing credentials because “it’s faster than waiting for access requests.” We even found an entire shadow IT ecosystem that nobody wanted to acknowledge.
The observable layer gives you structure. Structure without substance is just theatre.
Non-observable culture: The hidden drivers
Now things get interesting.
Non-observable culture is everything happening inside people’s heads. Their beliefs about cyber risk. Their attitudes toward security. Their values and priorities when security conflicts with convenience or speed.
This is where the real decisions get made.
You can’t see someone’s belief that “we’re too small to be targeted” or “security is IT’s job, not mine.” You can’t measure their assumption that compliance equals security. You can’t audit their gut feeling that reporting a mistake will hurt their career.
But these invisible forces shape every security decision your people make.
Non-observable culture includes beliefs about the likelihood and severity of threats. It includes how people weigh security against productivity. It includes their trust in leadership and their willingness to admit mistakes. It includes all the cognitive biases that distort risk perception.
Optimism bias makes people think breaches happen to other companies. Availability bias makes recent incidents loom larger than systemic vulnerabilities. Confirmation bias makes people see what they expect to see and ignore contradictory evidence.
Sony’s 2014 breach wasn’t a tech failure. It was a belief failure. People saw security as IT’s job, not theirs. So they clicked phishing links, shared credentials and treated threats as unlikely because “we make movies.” North Korean attackers didn’t need fancy exploits. They used that non-observable culture. Result: 100TB leaked. Unreleased films, personal data, executive emails. Networks stayed down for weeks, production stalled and trust took a beating. No firewall can fix a culture that thinks it won’t be targeted.
At that financial firm, the non-observable culture was toxic. Developers believed security was an obstacle to innovation. Executives believed cyber risk was purely technical and could be solved by buying more tools. Staff felt that admitting security concerns would make them look incompetent.
Nobody said these things out loud. But everyone acted on them.
The gap between what people say they believe and what they actually think is where security programs go to die. You can mandate all the training you want. If people fundamentally believe security doesn’t apply to them, they’ll find ways around every control you implement.
Implicit culture: The deepest layer
Here’s where it gets really uncomfortable.
Implicit culture is the stuff nobody talks about because nobody even realizes it’s there. The unspoken assumptions. The invisible norms. The “way things are done here” that everyone knows but nobody questions.
This is the most powerful layer because it operates below conscious awareness. People don’t choose to follow implicit norms. They just do. Automatically. Without thinking.
Implicit culture includes unspoken beliefs like “security slows us down” or “leadership doesn’t really care about this.” It contains hidden power dynamics that determine who can challenge security decisions and who can’t. It includes the organizational identity that shapes how people see themselves and their work.
It includes psychological safety, or the lack thereof. Can people raise concerns without fear? Can they admit mistakes without punishment? Can they challenge assumptions without being labelled difficult?
Equifax’s 2017 breach wasn’t just a missed patch. It was a cultural failure. A critical Apache Struts flaw was disclosed, and security teams were warned to patch. Yet the unspoken rule was that security emails were noise, and uptime trumped fixes. Security had no real authority to stop work until the patch landed. So the vulnerability sat for months, visible and ignored. Attackers exploited it, exposing data on 147 million people, including Social Security numbers. Trust collapsed. Leadership changed. Equifax later agreed to settlements totalling more than $700 million. And nobody owned the risk decision!
At that financial firm, the implicit culture was brutal. There was an unspoken assumption that business units were more critical than security teams. There was an invisible hierarchy in which anyone with sufficient seniority could overrule security recommendations. There was a hidden belief that admitting vulnerability was a sign of weakness.
Nobody wrote these rules down. Nobody explicitly taught them to new hires. But everyone learned them within weeks of starting.
Implicit culture is why change is so hard. You can rewrite policies overnight. You can deploy new tools in a matter of weeks. But shifting deeply embedded assumptions? That takes years.
And if you don’t address this layer, nothing else sticks.
Shifting all three dimensions
How do you actually change culture?
You can’t just pick one dimension and hope the others follow. They’re interconnected. Change in one without the others creates misalignment and confusion.
Start by making the invisible visible. You can’t fix what you can’t see. Conduct culture audits. Run anonymous surveys. Bring in external facilitators who can spot blind spots you’ve normalized. Ask uncomfortable questions and actually listen to the answers.
Leadership has to model the behavior you want to see. Don’t just talk about it. Actually do it. Visibly. Consistently. When leaders admit mistakes, it creates permission for everyone else to do the same. When leaders prioritize security over convenience, it signals what really matters.
Embed security into daily operations. Not as a separate function that people have to remember. As part of how work gets done. DevSecOps isn’t just a buzzword. It’s about making security the default path, not the exception.
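To make “the default path” concrete, here is a minimal sketch of the idea in Python: a hypothetical pre-commit hook that scans staged files for obvious hard-coded secrets before a commit is allowed. The patterns and the hook itself are illustrative assumptions, not a production scanner or any specific vendor’s tooling; the point is that the check runs automatically, so nobody has to remember to ask for it.

```python
#!/usr/bin/env python3
"""Hypothetical pre-commit hook: block commits that contain obvious secrets.

Illustrative sketch only. A real pipeline would use a maintained scanner,
but the cultural point is that the check runs by default, not on request.
"""
import re
import subprocess
import sys

# Simple, illustrative patterns for hard-coded credentials.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID format
    re.compile(r"(?i)(password|secret|api_key)\s*=\s*['\"][^'\"]{8,}['\"]"),
]

def staged_files() -> list[str]:
    """Return paths of files staged for the current commit."""
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]

def main() -> int:
    findings = []
    for path in staged_files():
        try:
            text = open(path, encoding="utf-8", errors="ignore").read()
        except OSError:
            continue  # unreadable or binary files are skipped
        for pattern in SECRET_PATTERNS:
            if pattern.search(text):
                findings.append(f"{path}: matches {pattern.pattern}")
    if findings:
        print("Commit blocked: possible hard-coded secrets found:")
        print("\n".join(f"  - {f}" for f in findings))
        return 1  # non-zero exit aborts the commit
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Dropped into .git/hooks/pre-commit (and made executable), a check like this fires on every commit without anyone opting in. That is the shape of the cultural shift: security isn’t a gate someone remembers to request; it’s baked into how the work flows.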
Build continuous learning into your culture. Threats evolve. Your understanding needs to evolve, too. Post-incident reviews shouldn’t be about blame. They should be about building organizational memory and getting smarter.
Fix your incentives. If you reward speed over security, people will choose speed. If you punish people for reporting problems, they’ll stop reporting. Ensure consequences for negligence are transparent and fair, while also ensuring people feel safe raising concerns.
At that financial firm, we spent six months working through all three layers. We didn’t just update policies. We surfaced hidden beliefs through facilitated discussions. We identified implicit assumptions and challenged them openly. We changed how leadership talked about and acted on security.
It was messy. It was uncomfortable. But it worked.
The reality
In practice, technical controls are easy. Culture is hard.
You can buy tools. You can write policies. You can mandate training. But you can’t mandate belief. You can’t purchase trust. You can’t deploy psychological safety.
Target had the tools but not the operational discipline. Sony had the policies but not the shared belief that security mattered. Equifax knew about the flaw but lacked the cultural permission to act on it. Each breach happened at a different cultural layer. Each cost hundreds of millions. Each could have been prevented not by better technology but by better culture.
Culture change requires patience, consistency and a willingness to confront uncomfortable truths. It requires leaders who are willing to examine their own assumptions and behaviors. It requires organizations that value honesty over appearances.
Observable culture provides structure. Non-observable culture offers motivation. Implicit culture is the foundation. You need all three.
The organizations that survive are those where security is woven into their cultural DNA, where risk intelligence is instinctive rather than imposed, where people make good security decisions because it’s simply how things are done.
That’s the real work. Not buying another tool. Not writing another policy. Building a culture where security isn’t something people do; it’s something they are.
This article is published as part of the Foundry Expert Contributor Network.