Satir OD applied to security work
Team dynamics as security infrastructure
The way a security team is experienced by the rest of the organisation determines how much useful information reaches it. A team that is experienced as judgemental will be managed rather than collaborated with. Developers will not surface security concerns early if they expect the response to be a lecture or a ticket. Operations teams will not escalate ambiguous situations if the last time they did, it produced work they did not have capacity for and no visible benefit.
This is not a morale issue. It is a capability issue. The security posture of an organisation depends on information flowing from where it originates to where it can be acted on. Communication patterns that suppress that flow are a vulnerability.
Security awareness that builds capacity
Most phishing training teaches people to spot last year's attacks, leaving staff confident against threats that are no longer the primary vector while current campaigns pass straight through. QR codes in PDFs, credential harvesting on legitimate cloud services, MFA bypasses via adversary-in-the-middle proxies: these do not look like the examples in the training module.
The Satir OD critique is not of the training format but of the underlying model: that awareness is a stock of information to be deposited rather than a capacity to be developed. Information degrades. Capacity is more durable.
The best approach combines occasional purple team sessions where staff experience both attacker and defender perspectives, with regular simulations that run against live defences using current threat intelligence rather than archetypal scenarios. Staff see results in real time, practise actual responses, and understand why certain behaviours matter in terms of what they would actually prevent. Metrics track not only click rates and reports but filter effectiveness and how quickly unusual activity is noticed and escalated. The goal is not zero clicks but fast detection, consistent reporting, and the organisational habit of paying attention.
Designing for how people behave under pressure
Satir’s survival stances appear reliably in security incidents. The analyst who follows procedure mechanically in a situation that requires judgement (computing). The team lead who does not escalate because the last person who raised a concern was criticised for overreacting (placating). The manager who identifies a responsible individual in the post-incident review because acknowledging a systemic failure feels more threatening than naming a person (blaming).
Incident response plans that are designed for a calm, rational team following a clear playbook will fail in conditions that activate these stances. Plans that account for how people actually behave under stress, and that create structures for slowing down and re-establishing shared understanding when things are moving fast, are more robust.
Building honest reporting culture
Incongruence in security culture is common and costly. The stated policy is that all incidents must be reported. The actual culture is that incidents attract scrutiny, scrutiny generates workload, and the person who reported the incident carries a disproportionate share of that workload. People make rational decisions about reporting based on the actual culture, not the stated one.
Changing reporting behaviour requires changing the actual culture, not reinforcing the stated one. That means post-incident processes that produce learning rather than blame, near-miss reporting that produces visible improvement rather than invisible remediation, and leadership behaviour that models the honesty it requires of others.
Satir’s contribution here is the observation that this is not primarily a policy design problem. It is a relationship and trust problem. The conditions for honest reporting are built through the accumulated experience of what happens when people are honest, and they are destroyed faster than they are built.