{"id":6606,"date":"2026-01-19T10:30:00","date_gmt":"2026-01-19T10:30:00","guid":{"rendered":"https:\/\/cybersecurityinfocus.com\/?p=6606"},"modified":"2026-01-19T10:30:00","modified_gmt":"2026-01-19T10:30:00","slug":"the-culture-you-cant-see-is-running-your-security-operations","status":"publish","type":"post","link":"https:\/\/cybersecurityinfocus.com\/?p=6606","title":{"rendered":"The culture you can\u2019t see is running your security operations"},"content":{"rendered":"<div>\n<div class=\"grid grid--cols-10@md grid--cols-8@lg article-column\">\n<div class=\"col-12 col-10@md col-6@lg col-start-3@lg\">\n<div class=\"article-column__content\">\n<div class=\"container\"><\/div>\n<p>Here\u2019s what nobody admits: Your firewall isn\u2019t the problem. Your SIEM isn\u2019t the problem. That shiny new EDR tool you just bought? Also not the problem.<\/p>\n<p>The problem is Steve from accounting, who uses \u201cPassword123\u201d because he can\u2019t be bothered to remember anything more complex. The problem is your CISO, who talks about zero trust but still approves exceptions for the CEO\u2019s personal devices. The problem is the unspoken rule that security slows things down, so everyone ends up finding workarounds.<\/p>\n<p>As the famous quote attributed to Peter Drucker goes, \u201cculture eats strategy for breakfast.\u201d In cyber operations, it eats your security posture for lunch. We learned this the hard way three years ago, when a mid-sized financial firm hired a colleague to figure out why they kept getting phished despite spending millions on awareness training. Their policies were pristine. Their tech stack was impressive. Their incident response plan could\u2019ve won awards.<\/p>\n<p>But their culture? Rotten to the core.<\/p>\n<p>The thing about culture is that it exists in layers. What you see on the surface tells you almost nothing about what\u2019s actually happening. 
You need to understand three distinct dimensions: observable, non-observable and implicit. Miss any one of them, and you\u2019re building your security program on quicksand.<\/p>\n<h2 class=\"wp-block-heading\">Observable culture: The stuff you can actually see<\/h2>\n<p>Observable culture is everything tangible. Your policies. Your procedures. The security awareness posters in the break room. The mandatory training modules everyone clicks through while checking their phones.<\/p>\n<p>This is where most organizations stop. They write a 47-page security policy, mandate annual training, deploy some monitoring tools and call it a day. Box checked. Compliance achieved. Everyone goes home feeling good about themselves.<\/p>\n<p>Except none of it matters if people don\u2019t actually follow through.<\/p>\n<p>Observable elements include your formal security protocols, your incident response plans and your access controls. They include visible behaviors like password hygiene, device management and whether people actually report suspicious emails. They include the technology you deploy and how you communicate about threats.<\/p>\n<p>You can measure this stuff. You can audit it. You can put it in a spreadsheet and show it to the board.<\/p>\n<p>But observable culture is the easiest to fake. People learn to perform security theatre. They know what they\u2019re supposed to do. They know what gets measured. So they do just enough to avoid getting flagged while continuing their risky behaviors in the shadows.<\/p>\n<p>Take <a href=\"https:\/\/www.csoonline.com\/article\/544343\/compliance-target-the-breach-that-should-ve-never-happened.html\">Target\u2019s 2013 breach<\/a>. They had a $1.6 million FireEye malware detection system. The system did exactly what it was supposed to do. It detected the malware. It sent alerts. Multiple times.<\/p>\n<p>But the security team ignored the alerts. They had policies and procedures. They had the technology. 
But the observable layer was disconnected from actual practice. The tools were there, but the follow-through wasn\u2019t. The breach exposed 40 million credit card numbers and cost Target over $200 million in settlements.<\/p>\n<p>The impact on cyber operations was catastrophic. The tools didn\u2019t fail. The observable culture, the visible security apparatus, existed in a vacuum. Having security controls is meaningless if your operational culture treats alerts as noise. Target\u2019s incident response plan looked great on paper. But when alerts fired, nobody acted. The gap between documented procedure and actual behavior created a blind spot large enough to drive a truck through.<\/p>\n<p>That financial firm we mentioned? Their observable culture looked perfect. Everyone completed their training. Policies were documented and signed. Security tools were deployed and configured.<\/p>\n<p>But when we dug deeper, we found developers routinely turning off security controls because they \u201cslowed down deployments.\u201d We found executives sharing credentials because \u201cit\u2019s faster than waiting for access requests.\u201d We even found an entire shadow IT ecosystem that nobody wanted to acknowledge.<\/p>\n<p>The observable layer gives you structure. Structure without substance is just theatre.<\/p>\n<h2 class=\"wp-block-heading\">Non-observable culture: The hidden drivers<\/h2>\n<p>Now it gets interesting.<\/p>\n<p>Non-observable culture is everything happening inside people\u2019s heads. Their beliefs about cyber risk. Their attitudes toward security. Their values and priorities when security conflicts with convenience or speed.<\/p>\n<p>This is where the real decisions get made.<\/p>\n<p>You can\u2019t see someone\u2019s belief that \u201cwe\u2019re too small to be targeted\u201d or \u201csecurity is IT\u2019s job, not mine.\u201d You can\u2019t measure their assumption that compliance equals security. 
You can\u2019t audit their gut feeling that reporting a mistake will hurt their career.<\/p>\n<p>But these invisible forces shape every security decision your people make.<\/p>\n<p>Non-observable culture includes beliefs about the likelihood and severity of threats. It includes how people weigh security against productivity. It includes their trust in leadership and their willingness to admit mistakes. It includes all the cognitive biases that distort risk perception.<\/p>\n<p>Optimism bias makes people think breaches happen to other companies. Availability bias makes recent incidents loom larger than systemic vulnerabilities. Confirmation bias makes people see what they expect to see and ignore contradictory evidence.<\/p>\n<p><a href=\"https:\/\/www.csoonline.com\/article\/550056\/sony-hacked-in-feb-knew-about-huge-security-flaws-before-cybersecurity-train-wreck.html\">Sony\u2019s 2014 breach<\/a> wasn\u2019t a tech failure. It was a belief failure. People saw security as IT\u2019s job, not theirs. So they clicked phishing links, shared credentials and treated threats as unlikely because \u201cwe make movies.\u201d North Korean attackers didn\u2019t need fancy exploits. They used that non-observable culture. Result: 100TB leaked. Unreleased films, personal data, executive emails. Networks stayed down for weeks, production stalled and trust took a beating. No firewall can fix a culture that thinks it won\u2019t be targeted.<\/p>\n<p>At that financial firm, the non-observable culture was toxic. Developers believed security was an obstacle to innovation. Executives believed cyber risk was purely technical and could be solved by buying more tools. Staff felt that admitting security concerns would make them look incompetent.<\/p>\n<p>Nobody said these things out loud. But everyone acted on them.<\/p>\n<p>The gap between what people say they believe and what they actually think is where security programs go to die. You can mandate all the training you want. 
If people fundamentally believe security doesn\u2019t apply to them, they\u2019ll find ways around every control you implement.<\/p>\n<h2 class=\"wp-block-heading\">Implicit culture: The deepest layer<\/h2>\n<p>Here\u2019s where it gets really uncomfortable.<\/p>\n<p>Implicit culture is the stuff nobody talks about because nobody even realizes it\u2019s there. The unspoken assumptions. The invisible norms. The \u201cway things are done here\u201d that everyone knows but nobody questions.<\/p>\n<p>This is the most powerful layer because it operates below conscious awareness. People don\u2019t choose to follow implicit norms. They just do. Automatically. Without thinking.<\/p>\n<p>Implicit culture includes unspoken beliefs like \u201csecurity slows us down\u201d or \u201cleadership doesn\u2019t really care about this.\u201d It contains hidden power dynamics that determine who can challenge security decisions and who can\u2019t. It includes the organizational identity that shapes how people see themselves and their work.<\/p>\n<p>It includes psychological safety, or the lack thereof. Can people raise concerns without fear? Can they admit mistakes without punishment? Can they challenge assumptions without being labelled difficult?<\/p>\n<p><a href=\"https:\/\/www.csoonline.com\/article\/567833\/equifax-data-breach-faq-what-happened-who-was-affected-what-was-the-impact.html\">Equifax\u2019s 2017 breach<\/a> wasn\u2019t just a missed patch. It was a cultural failure. A critical Apache Struts flaw was disclosed, and security teams were warned to patch. Yet the unspoken rule was that security emails were noise, and uptime trumped fixes. Security had no real authority to stop work until the patch landed. So the vulnerability sat for months, visible and ignored. Attackers exploited it, exposing data on 147 million people, including Social Security numbers. Trust collapsed. Leadership changed. Equifax later agreed to settlements totalling more than $700 million. 
And nobody owned the risk decision.<\/p>\n<p>At that financial firm, the implicit culture was brutal. There was an unspoken assumption that business units were more critical than security teams. There was an invisible hierarchy in which anyone with sufficient seniority could overrule security recommendations. There was a hidden belief that admitting vulnerability was a sign of weakness.<\/p>\n<p>Nobody wrote these rules down. Nobody explicitly taught them to new hires. But everyone learned them within weeks of starting.<\/p>\n<p>Implicit culture is why change is so hard. You can rewrite policies overnight. You can deploy new tools in a matter of weeks. But shifting deeply embedded assumptions? That takes years.<\/p>\n<p>And if you don\u2019t address this layer, nothing else sticks.<\/p>\n<h2 class=\"wp-block-heading\">Shifting all three dimensions<\/h2>\n<p>How do you actually change culture?<\/p>\n<p>You can\u2019t just pick one dimension and hope the others follow. They\u2019re interconnected. Change in one without the others creates misalignment and confusion.<\/p>\n<p>Start by making the invisible visible. You can\u2019t fix what you can\u2019t see. Conduct culture audits. Run anonymous surveys. Bring in external facilitators who can spot blind spots you\u2019ve normalized. Ask uncomfortable questions and actually listen to the answers.<\/p>\n<p>Leadership has to model the behavior you want to see. Don\u2019t just talk about it. Actually do it. Visibly. Consistently. When leaders admit mistakes, it creates permission for everyone else to do the same. When leaders prioritize security over convenience, it signals what really matters.<\/p>\n<p>Embed security into daily operations. Not as a separate function that people have to remember. As part of how work gets done. DevSecOps isn\u2019t just a buzzword. It\u2019s about making security the default path, not the exception.<\/p>\n<p>Build continuous learning into your culture. Threats evolve. 
Your understanding needs to evolve, too. Post-incident reviews shouldn\u2019t be about blame. They should be about building organizational memory and getting smarter.<\/p>\n<p>Fix your incentives. If you reward speed over security, people will choose speed. If you punish people for reporting problems, they\u2019ll stop reporting. Make consequences for negligence transparent and fair, while ensuring people feel safe raising concerns.<\/p>\n<p>At that financial firm, we spent six months working through all three layers. We didn\u2019t just update policies. We surfaced hidden beliefs through facilitated discussions. We identified implicit assumptions and challenged them openly. We changed how leadership talked about and acted on security.<\/p>\n<p>It was messy. It was uncomfortable. But it worked.<\/p>\n<h2 class=\"wp-block-heading\">The reality<\/h2>\n<p>In practice, technical controls are easy. Culture is hard.<\/p>\n<p>You can buy tools. You can write policies. You can mandate training. But you can\u2019t mandate belief. You can\u2019t purchase trust. You can\u2019t deploy psychological safety.<\/p>\n<p>Target had the tools but not the operational discipline. Sony had the policies but not the shared belief that security mattered. Equifax knew about the vulnerability but lacked the cultural permission to act on it. Each breach happened at a different cultural layer. Each cost hundreds of millions. Each could have been prevented not by better technology but by better culture.<\/p>\n<p>Culture change requires patience, consistency and a willingness to confront uncomfortable truths. It requires leaders who are willing to examine their own assumptions and behaviors. It requires organizations that value honesty over appearances.<\/p>\n<p>Observable culture provides structure. Non-observable culture offers motivation. Implicit culture is the foundation. 
You need all three.<\/p>\n<p>The organizations that survive are those where security is woven into their cultural DNA, where risk intelligence is instinctive rather than imposed, where people make good security decisions because it\u2019s simply how things are done.<\/p>\n<p>That\u2019s the real work. Not buying another tool. Not writing another policy. Building a culture where security isn\u2019t something people do. It\u2019s something they are.<\/p>\n<p><strong>This article is published as part of the Foundry Expert Contributor Network.<br \/><\/strong><a href=\"https:\/\/www.csoonline.com\/expert-contributor-network\/\"><strong>Want to join?<\/strong><\/a><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>Here\u2019s what nobody admits: Your firewall isn\u2019t the problem. Your SIEM isn\u2019t the problem. That shiny new EDR tool you just bought? Also not the problem. The problem is Steve from accounting, who uses \u201cPassword123\u201d because he can\u2019t be bothered to remember anything more complex. 
The problem is your CISO, who talks about zero trust [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":6607,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3],"tags":[],"class_list":["post-6606","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-education"],"_links":{"self":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts\/6606"}],"collection":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6606"}],"version-history":[{"count":0,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts\/6606\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/media\/6607"}],"wp:attachment":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6606"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6606"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6606"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}