Categories: Cybernetics, Philosophy

Technological Bullying

Technology is no longer a peripheral factor in abuse and social harm. It has become part of the mechanism. Research on technology-facilitated coercive control shows how perpetrators use everyday digital tools — smartphones, cloud accounts, GPS services, social media, spyware, smart home systems — to extend surveillance, isolation and intimidation beyond physical proximity, making abuse continuous rather than episodic (Dragiewicz et al. 2018; Rogers et al. 2022; Brookfield, Fyson & Goulden 2024). At a systemic level, studies of digital infrastructures demonstrate that algorithmic systems routinely encode and reproduce existing social inequalities rather than neutralise them, reinforcing asymmetries of race, gender, class and power through the very systems designed to optimise communication, ranking and decision-making (Noble 2018; Dolata, Feuerriegel & Schwabe 2022). The same technical substrates that promise protection, efficiency and access operate as amplifiers of structural vulnerability.

What unsettles people is not the harm itself, but that it is so structurally ordinary. The systems we inhabit now stabilise power by manufacturing the conditions that justify it. They distribute fear, attention and legitimacy in ways that feel natural because they are continuous. Violence is not just enacted. It is routed. Normalised. Buffered. In this sense, technology is not failing us by accident; it is faithfully executing the logics embedded within it. And we comply, because resistance appears inefficient, unscalable, or unprofessional. The field closes around its own distortions and calls that reality. (See also The Entropy of Simplicity, Recursive Tension, and The Physics of Nothing on daedeluskite.com.)




Annotated bibliography: Technology, coercive control, and sociotechnical power

> Note: The empirical base is growing but still uneven across regions, populations, and longitudinal depth. What follows reflects strong, available peer-reviewed and scholarly work, while acknowledging these gaps.






Dragiewicz, M., Burgess, J., Matamoros-Fernández, A., Salter, M., Suzor, N., Woodlock, D. & Harris, B. (2018)
Technology-facilitated coercive control: Domestic violence and the competing roles of digital media platforms. Feminist Media Studies, vol. 18, no. 4, pp. 609–625.
What it is: A peer-reviewed empirical and theoretical study examining how digital platforms both enable and constrain coercive control in domestic violence contexts.
Why it matters: Establishes technology-facilitated coercive control (TFCC) as a structural extension of abuse, not a peripheral or exceptional case. It supports the claim that everyday platforms embed control into ordinary communicative life.




Rogers, M. M., Fisher, C., Ali, P., Allmark, P. & Fontes, L. (2022)
Technology-Facilitated Abuse in Intimate Relationships: A Scoping Review. Trauma, Violence, & Abuse, vol. 23, no. 4, pp. 1334–1349.
What it is: A systematic scoping review of peer-reviewed studies on technology-facilitated abuse in intimate relationships across multiple countries.
Why it matters: Demonstrates that digital abuse is widespread, patterned, and structurally embedded in modern relationships. It also notes that empirical data remains uneven, particularly regarding long-term impacts and cross-cultural variation.




Brookfield, K., Fyson, R. & Goulden, M. (2024)
Technology-facilitated domestic abuse: An under-recognised safeguarding issue? British Journal of Social Work, vol. 54, no. 1, pp. 419–436.
What it is: A peer-reviewed social work study analysing how technology-facilitated abuse is encountered and often overlooked within safeguarding systems.
Why it matters: Shows how institutions systematically under-recognise or fail to structurally respond to tech-mediated abuse, treating it as anomaly rather than distributed condition.




Dolata, M., Feuerriegel, S. & Schwabe, G. (2022)
A sociotechnical view of algorithmic fairness. Information Systems Journal, vol. 32, no. 4, pp. 754–818.
What it is: A peer-reviewed theoretical and literature analysis of algorithmic systems, demonstrating why “fairness” cannot be treated as purely technical.
Why it matters: Provides a foundation for understanding technology as a field-structuring force rather than a neutral tool. It underpins the argument that digital infrastructures reproduce asymmetries because they are embedded in pre-existing power relations.




Noble, S. U. (2018)
Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.
What it is: A widely cited scholarly monograph examining how commercial search algorithms reproduce and normalise structural racism and inequality.
Why it matters: While not focused on domestic violence, it is foundational for understanding how platform infrastructures encode power and bias, supporting the argument that technology stabilises and normalises structural inequality.




Brown, C. & Hegarty, K. (2024)
Measuring the impact of technology-facilitated abuse: Fear, distress and digital coercion in intimate relationships. Journal of Interpersonal Violence.
What it is: A peer-reviewed study exploring the psychological and experiential impacts of technology-facilitated abuse.
Why it matters: Provides evidence that digital abuse produces measurable psychological harm. The authors also acknowledge limitations in longitudinal and cross-cultural data, reinforcing that empirical coverage remains uneven.




Synthesis note (on uneven data):
Across these works, technology-facilitated abuse and algorithmic power emerge as real, documented, and structurally embedded phenomena. However, the empirical field still shows uneven coverage, particularly in long-term studies, institutional data transparency, and non-Western contexts. This unevenness is not a weakness of the argument; it is part of the phenomenon itself. Systems of power shape not only harm, but what is visible, measurable, and officially acknowledged.

One reply on “Technological Bullying”

The technologies accumulating the most corporate power thrive by harvesting our most primitive circuits. They do not persuade our reason. They trigger reflex, fear, craving, belonging, outrage — fast, subcortical economies of attention and response. In doing so they convert human psycho-biology itself into extractable infrastructure, where the more reactive we become, the more indispensable the machinery that feeds on it.
