Library

Image censorship - deletion of technical images

"Ignore, Ignore, delete, ..." is the rhythm that dominates the documentary "The Cleaners," 2018.

It reports on the thousands of content moderators who decide, second by second, whether visual material on social media platforms is deleted or retained. Not only is their work conducted in secret; the policies and guidelines set by their employers are also among the most closely guarded secrets. Beyond the psychological strain that this work entails, the film sheds light on the consequences of deleting images.

In 2020, "Bildwelten des Wissens" volume 16, published by the joint research unit "Das Technische Bild" of the Hermann von Helmholtz-Zentrum für Kulturtechnik and the Institute for Art and Image History of the Humboldt-Universität zu Berlin, also examined the issue of image censorship and the deletion of technical images. Its twelve essays explore different aspects of the topic. In his contribution "Blocking, Moderating, Projecting", Simon Rothöhler estimates the volume of digital images at over 1.4 trillion annually,¹ most of them taken with camera-enabled smartphones, uploaded onto networked storage infrastructures and shared via social media platforms. Images in such quantities entail the need for some form of image control to protect personal rights and copyrights.

Control, however, does not begin only when images are published. Estelle Blaschke's essay "Discrete Operations" makes clear that with the use of "high-performance processors",² the photographer is no longer merely assisted in the creation of an image; instead, the device and its software take over the "leadership in a supposedly smooth process"³ entirely. That the assisted creation of images is not a new phenomenon becomes clear in Paul Brakmann's essay "Mail from Rochester: Quality Controls in Photofinishing": even in the early days of analog photography, quality-control mechanisms were already part of the creation of an image.

Automated image optimisation creates an aesthetic standardisation that, precisely because the technology is used so pervasively, is hardly noticed by humans: the camera simply produces consistently good images. The images that result can connect like-minded communities, trigger protests, or multiply their visual message as they spread. Kerstin Schankweiler notes in her essay "The Censored Eye" that, given the power an image can develop through its dissemination, state surveillance is reversed into sousveillance, a surveillance of the state by its citizens: "The governments, state and social structures against which the image protests are directed feel downright monitored and threatened by the mobile phone recordings."⁴

Once a technical image has been disseminated on the Internet, its complete deletion becomes impossible. Gertrud Koch describes this impressively in her essay "Non-erasable Images", tracing the course of duplication and untraceable dissemination. Computerised controls and filtering systems offer a more subtle form of image censorship, one that takes place below the threshold of human perception. Yet these techniques are limited in their success because the analysis of images is based on formal criteria; the context in which images are shown is not considered. Even the intervention of a human operator in the decision to show or not to show supposedly dangerous or indecent content does not achieve complete control. Katja Müller-Helle, editor of the volume, concludes that the way of dealing "with images under technological conditions needs to be rewritten."⁵

1 Simon Rothöhler, „Blockieren, Moderieren, Projizieren", p. 24

2 Estelle Blaschke, „Diskrete Operationen: Formen präemptiver Bildzensur in der KI-gestützten Fotografie", p. 32

3 Ibid., p. 32

4 Kerstin Schankweiler, „Das zensierte Auge", p. 47

5 Katja Müller-Helle, Editorial, p. 10

