Publications

Peer Reviewed Articles

Under review: Kazansky, B. Countering preemptive, data-driven surveillance regimes: proactive strategies of resistance & their tensions.

Abstract: This article explores how targeted communities counter preemptive, data-driven regimes of violence in their daily work and organizing strategies, drawing on findings from a four-year qualitative study of resistance to data-driven surveillance amongst international digital rights nonprofit organisations and grassroots groups focused on social justice organising. The article explores an imperative among surveillance-targeted communities to act proactively before violent, preemptive, surveillance-driven measures can exert harmful effects, with proactivity understood as the need to act before probable harm. The article investigates proactivity as it arises within two strategies of resistance. The first is a strategy of data-driven resilience in the context of digital security/safety practices in human rights organisations. The second is abolitionist refusal within community organising against the harms of predictive policing. The article explores the tensions and ambiguities of these two resistance strategies, looking at the contrasting ways that they prefigure the future and contend with the burdens that come from needing to continuously anticipate future harm. The analysis of proactivity provides insight into the potentials of resistance to the anticipatory regimes of contemporary data-driven surveillance and policing.

2021: Kazansky, B. “‘It depends on your threat model’: The anticipatory dimensions of resistance to data-driven surveillance”. Big Data & Society.

Abstract: While many forms of data-driven surveillance are now a ‘fact’ of contemporary life amidst datafication, obtaining concrete knowledge of how different institutions exploit data presents an ongoing challenge, requiring the expertise and power to untangle increasingly complex and opaque technological and institutional arrangements. The how and why of potential surveillance are thus wrapped in a form of continuously produced uncertainty. How then, do affected groups and individuals determine how to counter the threats and harms of surveillance? Responding to an interdisciplinary concern with agency amidst datafication, this article explores what I term ‘anticipatory data practices’ – future-oriented practices which provide a concrete anchor and a heuristic for action amidst the persistent uncertainties of life with data. This article traces how anticipatory data practices have emerged within civil society practices concerned with countering the harms of surveillance and data exploitation. The mixed-method empirical analysis draws from 50 interviews with digital security educators and technology developers, participant observation at 12 civil society events between 2016 and 2019, and textual analysis of 100 security manuals produced by NGOs and grassroots groups.

2021: Kazansky, B. & Milan, S. “Bodies not templates: Contesting dominant algorithmic imaginaries”. New Media & Society.

Abstract: Through an array of technological projects and awareness-raising initiatives, civil society mobilizes against an onslaught of surveillance threats. What alternative values, practices, and tactics emerge from the grassroots which point toward other ways of being in the datafied society? Conversing with critical data studies, science and technology studies, and surveillance studies, this article looks at how dominant imaginaries of datafication are reconfigured and responded to by groups of people dealing directly with their harms and risks. Building on practitioner interviews and participant observation in digital rights events, and surveying projects intervening in three critical technological issues of our time—the challenges of digitally secure computing, the Internet of Things, and the threat of widespread facial recognition—this article investigates social justice activists, human rights defenders, and progressive technologists as they try to flip dominant algorithmic imaginaries. In so doing, the article contributes to our understanding of how people make sense of the challenges of datafication from the bottom up.

2019: Tanczer, L., Deibert, R. J., Bigo, D., Franklin, M. I., Melgaço, L., Lyon, D., Kazansky, B., & Milan, S. “Online Surveillance, Censorship, and Encryption in Academia”. International Studies Perspectives.

Abstract: The Internet and digital technologies have become indispensable in academia. A world without email, search engines, and online databases is practically unthinkable. Yet, in this time of digital dependence, the academy barely demonstrates an appetite to reflect upon the new challenges that digital technologies have brought to the scholarly profession. This forum’s inspiration was a roundtable discussion at the 2017 International Studies Association Annual Convention, where many of the forum authors agreed on the need for critical debate about the effects of online surveillance and censorship techniques on scholarship. This forum contains five critiques regarding our digitized infrastructures, datafied institutions, mercenary corporations, exploitative academic platforms, and insecure online practices. Together, this unique collection of articles contributes to the research on academic freedom and helps to frame the analysis of the neoliberal higher education sector, the surveillance practices that students and staff encounter, and the growing necessity to improve our “digital hygiene.”

2015: Kazansky, B. “Privacy, Responsibility, and Human Rights Activism”. The Fibreculture Journal (26, 2015: Entanglements – Activism and Technology).

Abstract: In this article, we argue that many difficulties associated with the protection of digital privacy are rooted in the framing of privacy as a predominantly individual responsibility. We examine how models of privacy protection, such as Notice and Choice, contribute to the ‘responsibilisation’ of human rights activists who rely on the use of technologies for their work. We also consider how a group of human rights activists countered technology-mediated threats that this ‘responsibilisation’ causes by developing a collective approach to address their digital privacy and security needs. We conclude this article by discussing how technological tools used to maintain or counter the loss of privacy can be improved in order to support the privacy and digital security of human rights activists.

2015: Ganesh, M., Kazansky, B., Deutsch, G. “Tensions and Frictions in Researching Activists’ Digital Security and Privacy Practices”. Privacy Enhancing Technologies Symposium.

A reflection piece co-written by the three of us to discuss our respective experiences engaging in ethical and epistemologically just research in sensitive contexts.

Book Chapters

2019: Kazansky, B., Torres, G., van der Velden, L., Wissenbach, K. R., & Milan, S. “Data for the social good: Toward a data-activist research agenda”. In A. Daly & M. Mann (Eds.), Good Data. Amsterdam: Institute of Network Cultures.

Abstract: ‘Big data’ is a hyped buzzword – or rather, it has been for a while, before being supplanted by ‘newer’ acclaimed concepts such as artificial intelligence. The popularity of the term says something about the widespread fascination with the seemingly infinite possibilities of automatized data collection and analysis. This enchantment affects not only the corporate sector, where many technology companies have centered their business model on data mining, but also governments, whose intelligence agencies have adopted sophisticated machinery to monitor citizens. Many civil society organizations, too, are increasingly trying to take advantage of the opportunities brought about by datafication, using data to improve society. From crowdsourced maps about gender-based violence (‘feminicide’) in Latin America, to the analysis of audio-visual footage to map drone attacks in conflict zones, individuals and groups regularly produce, collect, process and repurpose data to fuel research for the social good. Problematizing the mainstream connotations of big data, these examples of ‘data activism’ take a critical stance towards massive data collection and represent the new frontier of citizens’ engagement with information and technological innovation. In this chapter we survey diverse experiences and methodologies of what we call ‘data-activist research’ – an approach to research that combines embeddedness in the social world with the research methods typical of academia and the innovative repertoires of data activists.

2017: Constant VZW (collective) w/ a small contribution from Kazansky, B. Technogalactic Guide to Software Observation.

An amazing project developed by the folks at Constant in Brussels. I was fortunate to participate in a multi-day workshop that had us thinking and playing with ethnographic and performative methodologies towards the study of software services. Some record of that appears in the book.

Reports & Op-eds

2023: Kazansky, B., Johnson, O., Paes, B., Kilbey, H., and The Engine Room. Chatbots in humanitarian contexts: Learning from practitioner experiences. The Engine Room. Draft forthcoming.

In recent years, chatbots have offered humanitarian operations the possibility to automate personalised engagement and support, inform tailored programme design, and gather and share information at a large scale. However, adopting a chatbot is never straightforward, and there are many considerations that should go into doing so responsibly and effectively. With some humanitarian organisations having experimented with chatbots for several years, many are now interested in taking stock of their experiences and fostering greater awareness of how to design, budget for and maintain chatbots responsibly and effectively. Responding to these priorities, The Engine Room has developed this report to explore the existing uses, benefits, tradeoffs and challenges of using chatbots in humanitarian contexts. The research resulted from a collaboration between IFRC, ICRC, and a research advisory board consisting of representatives from IFRC, ICRC, the Netherlands Red Cross, and UNHCR.

2022: Kazansky, B., Karak, M., Perosa, T., Tsui, Q., Baker, S., and The Engine Room. At the confluence of digital rights and climate & environmental justice: A landscape review. Available at: https://engn.it/climatejusticedigitalrights.

From October 2021 to April 2022, The Engine Room conducted a research project exploring areas of intersection between digital rights and climate & environmental justice. Supported by Ford Foundation, Mozilla Foundation, and Ariadne, we set out to map the important work happening at the intersections of these vast fields, with the goal of identifying current barriers to and opportunities for greater collaboration, and highlighting areas in which funders can provide needed support. We spoke to a variety of practitioners from both digital rights (DR) and environmental and climate justice (EJ-CJ), and many flagged how relevant it can be for DR and EJ-CJ practitioners to learn more about each other’s work and priorities, and to explore areas of overlap. In our research, we surfaced numerous areas of cross-cutting concern as well as existing points of interconnection between DR and EJ-CJ. In our report, we highlight five key intersections in particular: sustainable internet and technology, access to information and information disorder, safety and defence, data-driven environmental monitoring, and migration justice. The report provides an overview of some of the key areas where digital rights and allied technology work currently intersects with environmental and climate justice; offers insights on the needs and challenges of practitioners engaged in work spanning DR, tech and EJ-CJ issues; identifies barriers to, and opportunities for, an ecosystem that sits between the two; and lays out a variety of opportunities for digital rights funders to provide impactful support that is grounded in the real-world experiences of different communities and movements engaged in the fight for climate and environmental justice.

2019: Kazansky, B. and Torres, G. Everyday Data: a workshop report.

The outcome report for a 1-day workshop we organized on the heels of the Data Power conference in Bremen. We sought to create a space to explore and unpack the concept of the ‘everyday’ as it figures into studies of data practices and resistance to datafication. The workshop brought together a small group of interdisciplinary scholars working on issues related to the making and unmaking of datafication, to paraphrase Neal and Murji (2015). Participants came from sociology, anthropology, computer science, media studies, and informatics. Their topics of research include community activism, platform labor, feminist data practices, and the data-resistant practices of states, studying datafication through the respective participation of citizens, governments, corporations, and academia.

2016: Kazansky, B. Digital Security in Context: Learning how human rights defenders adopt digital security practices. Tactical Technology Collective.

This research report is the culmination of a project investigating the sociotechnical issues that make adopting digital security practices a challenge for human rights defenders. Its findings were based on 60 interviews and action research with three human rights groups between 2013 and 2014. I’m glad to say that in the years since it was published in 2016, this research has helped bolster efforts within civil society to provide more equitable and community-centric forms of digital security support for communities dealing with technological threats.

2016: Kazansky, B., Kaltheuner, F., van der Velden, L., et al. Evolution and Sustainability of Digital Security Tools: An Exploration of F/LOSS Encrypted Chat Apps. Digital Methods Initiative wiki.

Rough documentation of our collaborative project studying the maintenance of digital security tools at the Digital Methods Winter School in 2016.

2015: Kazansky, B. Addressing the Right to Privacy in 2015. Internet Policy Review.

An op-ed raising issues to watch in 2015 in the world of privacy and human rights. In particular, I tried to shine a light on cases where privacy-related activism was criminalised.

2012: Kazansky, B. In Red Hook, Mesh Network Connects Sandy Survivors Still Without Power. Techpresident. November 12, 2012.

A report on the autonomous mesh network infrastructure set up in Red Hook and the role it played following Superstorm Sandy.

Visual & Multimedia

2013: Kazansky, B. Animation producer and script writer, with Tactical Technology Collective. What is a digital shadow?

Working with talented illustrators and colleagues at Tactical Tech, I conceptualised and helped produce this educational animation.

2012: Kazansky, B. Videographer; created as the visual component of Master’s thesis. Mesh: a documentary about community networks (trailer).

Part of a much larger project that ended with the collection of 100+ hours of footage about mesh networks around the world, this trailer introduced a few key people working to democratise this technology from around 2003-2012.

2012: Redgrave, K., Chang, D., Kazansky, B., Seo, A., Sifry, M. Politics and the Internet (interactive timeline). Techpresident.

With Micah Sifry, I put together a timeline that would cover key events related to ‘online activism’ and networked politics.

2012: Kazansky, B. Podcast host and producer of “Hacking Censorship”. Drone Humanitarianism, Berkman Center, Harvard. Oct. 4, 2012.

In my first and last foray into podcasting, I interviewed a dozen hackers and Internet Freedom technologists about the fight to circumvent censorship.