Cybersecurity is usually framed as a protective shield—tools, protocols, and practices designed to block attackers and preserve trust in digital systems. Yet this view obscures the deeper reality. Cybersecurity is not simply about erecting barriers, but about managing a continuous cycle of disruption and repair. Breaches and defences do not occur in isolation; they follow rhythmic patterns that resemble oscillatory systems in physics and biology. Understanding these rhythms can clarify why insecurity persists, why risk is displaced onto individuals, and how resilience might be built.
The Cycle of Disruption and Repair
After a large earthquake, the ground continues to tremble with aftershocks that taper off according to a simple rule known as Omori’s law: the rate of aftershocks decreases in inverse proportion to the time elapsed since the main quake (Utsu et al., 1995). Cyber incidents follow a similar profile. A major breach is followed by intense waves of exploitation—copycat attacks, opportunistic phishing, new malware variants—that gradually subside. Attention shifts elsewhere, until the next breach reignites the cycle.
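The Omori-style decay described above can be sketched numerically. The parameters below (K, c, p) are purely illustrative, not fitted to any real incident data; the point is only the shape of the curve: a burst of follow-on exploitation that falls off as a power law of time since the breach.

```python
def omori_rate(t, K=100.0, c=1.0, p=1.0):
    """Expected follow-on attacks per day, t days after a major breach.

    Modified Omori law: n(t) = K / (t + c)**p.
    K scales the initial burst, c avoids a singularity at t = 0,
    and p controls how quickly attention and exploitation decay.
    """
    return K / (t + c) ** p

def cumulative_attacks(days, **params):
    """Total expected attacks over the first `days` days (simple daily sum)."""
    return sum(omori_rate(t, **params) for t in range(days))

if __name__ == "__main__":
    # The rate halves quickly at first, then flattens into a long tail.
    for t in (0, 1, 7, 30):
        print(f"day {t:3d}: ~{omori_rate(t):6.1f} attacks/day")
```

With p = 1 the long tail never quite reaches zero, which matches the essay's claim that the cycle subsides without ever fully resolving.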
This pattern illustrates that insecurity is not random but structured. Attackers, defenders, and users all become phase-locked into a shared tempo: patch releases, exploit timings, and user alerts all unfold in cycles. The rhythm itself sustains the system, ensuring that the problem never resolves completely.
Risk Displacement and Commercialisation
Within this cycle, large organisations strategically displace risk. The everyday costs of insecurity—password resets, multi-factor authentication, credit monitoring—are pushed onto individuals. At the same time, insecurity is commercialised: firms sell cyber-insurance, premium secure services, or rapid-response subscriptions. Each new protective measure increases complexity, which in turn produces fresh vulnerabilities (Anderson et al., 2020). The result is not a march toward greater stability but a recurrent equilibrium, where insecurity is redistributed rather than eliminated.
Insecurity as Systemic Energy
The cycle functions like an engine. Just as heat engines transform temperature differences into work, the cybersecurity economy transforms vulnerability differences into revenue and activity. Stability in one domain requires ongoing energy drawn from instability elsewhere (Böhme & Schwartz, 2010). Entropy is not removed from the system; it is channelled, managed, and continually reintroduced.
This perspective suggests that insecurity is not a failure but a structural condition. The industry, the attackers, and the users are bound together in a system where difference—between secure and insecure, patched and unpatched—is what keeps the whole field running.
Toward Practical Resilience
If insecurity is structural, then resilience requires more than stronger locks. It requires strategies that break the predictability of cycles. Introducing irregularity into update schedules, diversifying authentication mechanisms, or deliberately slowing propagation of exploit information are ways to prevent attackers from phase-locking into defensive rhythms. The analogy with ecosystems is useful: systems that maintain diversity and non-equilibrium dynamics are less prone to collapse (Holling, 1973).
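One of the strategies above, irregular update schedules, can be sketched in a few lines. The function and its parameters are hypothetical illustrations, assuming a weekly base cadence: adding random jitter to each patch interval denies attackers a fixed rhythm to phase-lock onto.

```python
import random

def jittered_schedule(base_interval_hours=168.0, jitter_fraction=0.25,
                      n=5, seed=None):
    """Return n successive patch times (hours from now).

    Each interval is the base cadence (default: weekly) plus uniform
    jitter of up to +/- jitter_fraction of that cadence, so the next
    patch window cannot be predicted exactly from the last one.
    """
    rng = random.Random(seed)
    times, t = [], 0.0
    for _ in range(n):
        jitter = rng.uniform(-jitter_fraction, jitter_fraction) * base_interval_hours
        t += base_interval_hours + jitter
        times.append(round(t, 1))
    return times
```

The same idea generalizes to the other tactics mentioned: rotating among authentication mechanisms or staggering exploit-disclosure timing are likewise ways of injecting controlled irregularity into an otherwise periodic defensive rhythm.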
For policymakers, engineers, and security professionals, this means focusing less on total elimination of risk—an impossible goal—and more on shaping the rhythms of interaction so that the energy of insecurity is dissipated rather than amplified. The challenge is to accept insecurity as inevitable, while designing systems that remain robust within it.
References
- Anderson, R., Barton, C., Böhme, R., Clayton, R., van Eeten, M., Levi, M., Moore, T., & Savage, S. (2020). Measuring the cost of cybercrime. Journal of Cybersecurity, 6(1), tyaa017.
- Böhme, R., & Schwartz, G. (2010). Modeling cyber-insurance: Towards a unifying framework. WEIS 2010.
- Holling, C. S. (1973). Resilience and stability of ecological systems. Annual Review of Ecology and Systematics, 4(1), 1–23.
- Utsu, T., Ogata, Y., & Matsu’ura, R. S. (1995). The centenary of the Omori formula for a decay law of aftershock activity. Journal of Physics of the Earth, 43(1), 1–33.
One reply on “Rhythm: Cybersecurity as System Cycle”
Technological dependencies are generally as normatively (!) uncontrolled as they are uncontrollable, regarding complexity, entropy, and prediction.
This is a useful item of knowledge. When we know what the limit conditions are (and never can be), we can inversely infer probable, or at least plausible, system trajectories.
Presenting generalist strategic responses to cybersecurity and entropy is an unforgiving pastime. People want well-defined atomic tokens, narrative clarity, ontological identity, but all of this simply muddies the waters more. Step one: acknowledging the critically unsustainable metastability in play; everything else, such as it is, follows.