CFP: Networked Images in Surveillance Capitalism

Initial abstracts (max. 300 words) and a short biographical note (max.
100 words) are due on: 31 March 2021.
Digital Culture & Society, 2/2021
Edited by Olga Moskatova, Anna Polze and Ramón Reichert

Networked Images in Surveillance Capitalism

Capturing personal data in exchange for free services is now ubiquitous
in networked media and has recently led to diagnoses of surveillance and
platform capitalism (Zuboff 2019; Srnicek 2017). In social media
discourse, dataveillance and data mining have long been criticized as
new forms of digital work and capitalist exploitation (cf.
Allmer 2015; Andrejevic 2012; van Dijck 2014; Fuchs 2010, 2013; Scholz
2013; Trottier 2012). With the general transformation of the open web
into an ecology dominated by commercial platforms (Hands 2013; Helmond
2015; Langlois and Elmer 2013; Gillespie 2018), platformization and
economic surveillance also redefine digital visual culture, facilitating
new forms of images, everyday practices and online visibility, while
expanding the logics of social media to the rest of the web. From social
photos (Jurgenson 2019), selfies and image communities on the internet
to connected viewing and streaming, and video conferencing during the
COVID-19 pandemic – the digital image is not only predominantly networked
(Rubinstein and Sluis 2008) but also accessed through platforms (van
Dijck 2013; van Dijck et al. 2018) and structured by their economic
imperatives, data acquisition techniques and algorithmic processing.
Today, participation and commodification are closely linked in the
production, circulation, consumption and operativity of images and
visual communication, raising the question of the role networked images
play for and within proliferating surveillance capitalism.

Linking images and surveillance automatically brings traditional
concepts such as the panopticon and its numerous modifications into
play, since they rely on optical and visual metaphors (Haggerty 2006;
Buschauer 2016). In his famous analysis of the panopticon, Michel
Foucault showed to what extent power can be exercised through
visuality, thereby producing specific subjects. However, as frequently remarked
(Haggerty and Ericson 2000; Kammerer and Waitz 2015), this form of power
seems incapable of grasping the dynamics of networked digital media
technologies. In the paradigm of the control society (Deleuze 1992), not
only media but also the techniques of surveillance and control are
increasingly networked and unobtrusive. Many of their contemporary forms
do not rely on the visible demonstration and internalization of the
gaze, but on automated data-based and algorithmic forms of control that
are often motivated economically. They are not “salient”, but “silent”
(Introna and Wood 2004) and even “calm” technologies (Weiser and Brown
1997) that proliferate in everyday life and diffuse through
environments. Although the relationship between visuality and
surveillance is thus being transformed, images are nevertheless an
important part of post-panoptical media assemblages and their silent
forms of power. Since many successful economic platforms and our
everyday networked practices are image-based, an evaluation of
surveillance capitalism that takes media differences seriously becomes
decisive.

Aestheticization of Surveillance Capitalism

The special issue therefore aims to interrogate the manifold
relationships between economic surveillance and networked images, and to
identify their intersections. On the one hand, images may support the
reproduction and maintenance of surveillance capitalism in several ways:
Aesthetic strategies and media principles of user-generated,
professional and popular images such as humour, compactness, nudity,
spectacularity, cinematicity, seriality, interactivity, cuteness or
emotionality can draw users to a platform, capture attention, prolong
browsing times and generate the “network effects” (Srnicek 2017)
necessary for the functioning of surveillance
capitalism. Adding to the “attention economies” (Beller 2006; Franck
1998; Goldhaber 1997; Crogan and Kinsley 2012; Terranova 2012) and
experiential-aesthetic regulation online, they can reintroduce the
logic of the “gaze”, i.e. the focused stare, into media environments of
the “glance”, i.e. incidental and fleeting glimpses (Bryson 1983), or
into intermediate forms of active cognitive engagement with media
content such as “grazing” (Creeber 2013). As such, images can serve as
incentives themselves or be part of nudging interface and website
aesthetics (Mühlhoff 2018), and therefore contribute to the
aestheticization of digital capitalism.

Anaesthetization of Images

On the other hand, networked images can become anaesthetized, “calm” and
“silent” themselves – in a similar way to the techniques of control and
surveillance: Against the background of surveillance capitalism,
technological endeavours such as the internet of things (IoT),
ubiquitous computing and ambient intelligence appear as attempts to
expand the opportunities for data extraction and monetization. Everyday
objects become sentient things that are capable of multimodal monitoring
of environments and living beings, and of recording, storing and
circulating captured information. Visual data acquisition in the form of
sensors, webcams or computer vision operates without drawing attention
to itself. Often, not only the technologies are invisible but also the
images, which are no longer destined for human viewing and remain data
without ever being visually displayed (Paglen 2016; Rothöhler 2018). By
being processed in machine-to-machine seeing and communication within
the IoT, or used as training data for computer vision applications
(Crawford and Paglen 2019), networked and social media images are
anaesthetized and rechanneled into an invisible “visual” culture as new
economic assets (Mackenzie and Munster 2019). These “invisible image
data” (Rothöhler 2021) or “invisible images” share their
unobtrusiveness with algorithmic security systems such as facial
recognition, which exploits the publicness of the face and produces
“calm images” operating in the background without addressing the users’
conscious attention (Veel 2012).

Subjectivation in Surveillance Capitalism

Furthermore, silent and economically motivated forms of networked
surveillance do not eliminate power relations and processes of
subjectivation. Rather, silent and scopic forms of power are related in
different ways, depending on platforms and the images they provide: On
social media platforms, forms of social control based on the visibility
of the personal can hardly be separated from algorithmic sorting and
recommending. They modulate visibility and invisibility as well as the
associated social fears (Trottier and Lyon 2012) and thus
algorithmically reconfigure scopic forms of power (Bucher 2016, 2018)
and self-care (Nguyen-Trung 2020). It can be assumed that algorithmic
control not only complicates or prevents the possibility of
subjectivation (Cheney-Lippold 2011, 2016; Rouvroy 2013; Rouvroy and
Berns 2013), but also enforces new and old ways of subjectivation. This
means that categories such as gender, age, class and race, which are
gaining increasing attention in surveillance studies (Dubrofsky and
Magnet 2015; Browne 2015; Conrad 2009), take on special relevance for
investigations of networked digital capitalism. For example, not all
bodies are subjected in the same way to exposure, the economization of
attention, automated censorship and content moderation on popular
platforms for sharing images (Gillespie 2018; Müller-Helle 2020;
Roberts 2019). Nudity, female nipples, scars, bodily fluids, or pubic
hair, for instance, are regularly banned from Instagram (Byström et al.
2017; Gerling et al. 2018), while TikTok has attracted negative press
for shadow-banning LGBTQ-related hashtags and suppressing Black or
disabled creators – raising questions about the relationship between moderation,
discrimination, normalization, and economics. Image sets that are
retrieved from social media platforms without compensation and used to
train algorithms are known to exhibit racial and gender bias or a
lack of diversity (Buolamwini and Gebru 2018; Crawford and Paglen 2019;
Gates 2014; Kember 2013; Monea 2019). On streaming platforms, the
rhetoric of algorithmic personalization (Alexander 2010; Finn 2017)
also obscures collaborative filtering and stereotypical clustering,
which can reinforce gender and age biases, among others (e.g. by
correlating gender and genre; cf. Lin et al. 2019), and so modulates
specific viewer subjects (Kellogg et al. 2020).

The special issue invites papers that examine these and comparable
phenomena and shed light on the role of networked images and the
reconfiguration of visuality in surveillance capitalism. In particular,
it focuses on the tension between a visual aestheticization of
capitalism and the anaesthetization of images and/or surveillance
techniques. It raises questions such as: To
what extent and by means of which aesthetic strategies do images create
incentives for, and stabilize, surveillance capitalism? How do they
contribute to its aestheticization? How is pictoriality reconfigured in
post-panoptical, ambient media environments and subjected to forms of
anaesthetization? How is subjectivation produced in apparatuses of
dataveillance and algorithmic control, and how are the regimes of the
gaze transformed within them?

Topics can include, but are not limited to:

· The role of images for the generation of the “behavioural
surplus” (Zuboff 2019) and data extraction
· Images as decoys and nudges; medial and aesthetic incentive
strategies
· Audience labour and modulation of viewing
· (In-)visibility as social control, and its relation to data
monitoring and algorithmic sorting
· New forms of subjectivation, desubjectivation or the prevention
of subjectivation in visual surveillance capitalism

· Economization of attention
· Platform politics and automated censorship of images
· AI training on user-generated images and platform capitalism
· Surveillance capitalism in popular visual media and media arts
· Gender, race, class and algorithmic control on platforms for
(moving) images
· Calm images and invisible images
· Visual data acquisition in the internet of things, and ubiquitous
computing
· Tension between the aestheticization of surveillance capitalism
and the anaesthetization of images and/or surveillance techniques

When submitting an abstract, authors should specify to which of the
following categories they would like to submit their paper:

1. Field Research and Case Studies (full paper: 6000-8000 words). We
invite articles that discuss empirical findings from studies that
examine surveillance and political economies in digital visual culture.
These may include, for example, studies that analyze particular image platforms;
address nudging and incentive aesthetic strategies; scrutinize whether
and how algorithmic personalization produces specific consumer subjects,
etc.

2. Methodological Reflection (full paper: 6000-8000 words). We invite
contributions that reflect on the methodologies employed when
researching data-driven and algorithmic surveillance and networked
images. These may include, for example, critical evaluation of
(resistance) discourses of transparency or obfuscation, algorithmic
black boxing, and their implicit epistemologies of the visible;
discussion of new or mixed methods, and reflections on experimental
forms of research.

3. Conceptual/Theoretical Reflection (full paper: 6000-8000 words). We
encourage contributions that reflect on the conceptual and/or
theoretical dimension of surveillance, capitalism and images. This may
include, for example, the relationship between scopic and silent forms
of power and control; critical evaluation of different concepts such as
surveillance capitalism, platform capitalism, algorithmic
governmentality, etc.; the tensions between the aestheticization of
capitalism and anaesthetization of images in data-driven media
environments (e.g. due to filtering, platform censorship, calm
technologies, etc.).

4. Entering the Field (2000-3000 words). This experimental section
presents initial and ongoing empirical work. The editors have created
this section to provide a platform for researchers who would like to
initiate a discussion about their emerging (yet perhaps incomplete)
research material and plans, as well as methodological insights.

Deadlines and contact information

§ Initial abstracts (max. 300 words) and a short biographical note
(max. 100 words) are due on: 31 March 2021.
§ Authors will be notified by 19 April 2021 whether they have been
invited to submit a full paper.
§ Full papers are due on: 1 August 2021.
§ Notifications to authors of referee decisions: 1 September 2021.
§ Final versions due: 10 November 2021.
§ Please send your abstract and short biographical note to the
editors.

References

Alexander, N., 2010. Catered to Your Future Self: Netflix’s “Predictive
Personalization” and the Mathematization of Taste. In: McDonald, K.,
Smith-Rowsey, D. (eds.), The Netflix Effect: Technology and
Entertainment in the 21st Century. Bloomsbury, New York, 81-97.

Allmer, T., 2015. Critical Theory and Social Media: Between Emancipation
and Commodification. Routledge, New York.

Andrejevic, M., 2012. Exploitation in the data mine. In: Fuchs, C. et
al. (eds.): Internet and Surveillance: The Challenges of Web 2.0 and
Social Media. Routledge, New York, 71-88.

Beller, J., 2006. The Cinematic Mode of Production: Attention Economy
and the Society of the Spectacle, Dartmouth Coll. Press, Lebanon, NH.

Buolamwini, J., Gebru, T., 2018. Gender shades: Intersectional accuracy
disparities in commercial gender classification. Conference on Fairness,
Accountability and Transparency, 77-91.

Buschauer, R., 2016. Datavisions – On Panoptica, Oligoptica, and (Big)
Data. International Review of Information Ethics 24, 5-14.

Bryson, N., 1983. Vision and Painting: The Logic of the Gaze. Yale
Univ. Press, New Haven, London.

Browne, S., 2015. Dark Matters: On the Surveillance of Blackness. Duke
Univ. Press, Durham, London.

Bucher, T., 2016. Want to be on top? Algorithmic Power and the Threat of
Invisibility on Facebook. In: Chun, W. et al. (eds.): New Media, Old
Media: A History and Theory Reader. Routledge, New York, 566-578.

Bucher, T., 2018. If...Then: Algorithmic Power and Politics. Oxford
Univ. Press, New York.

Byström, A., Soda, M., Kraus, C. (eds.), 2017. Pics or it didn’t
happen. Images banned from Instagram. Prestel, New York.

Cheney-Lippold, J., 2011. A New Algorithmic Identity: Soft Biopolitics
and the Modulation of Control. Theory, Culture & Society 28(6), 164-181.

Cheney-Lippold, J., 2016. We Are Data: Algorithms and the Making of Our
Digital Selves. New York Univ. Press, New York.

Conrad, K., 2009. Surveillance, Gender, and the Virtual Body in the
Information Age. Surveillance & Society 6(4), 380-387.

Crawford, K., Paglen, T., 2019. Excavating AI: The Politics of Images in
Machine Learning Training Sets. Retrieved from:
https://excavatingai.com/

Creeber, G., 2013. Small Screen Aesthetics: From TV to the Internet.
BFI, Palgrave Macmillan, London.

Crogan, P., Kinsley, S., 2012. Paying Attention. Towards a Critique of
The Attention Economy. Culture Machine 13, 1–29.

Deleuze, G., 1992. Postscript on the Societies of Control. October 59,
3-7.

Dubrofsky, R. E., Magnet, S. A. (eds.), 2015. Feminist Surveillance
Studies. Duke Univ. Press, Durham, London.

Finn, E., 2017. What Algorithms Want: Imagination in the Age of
Computing. MIT Press, Cambridge, Mass.

Franck, G., 1998. Ökonomie der Aufmerksamkeit. Carl Hanser, München.

Fuchs, C., 2013. Class and exploitation on the Internet. In: Scholz, T.
(ed.), Digital labor. The Internet as playground and factory. Routledge,
New York, 211-224.

Gates, K., 2014. Can Computers Be Racist? Juniata Voices 15, 5-17.

Hands, J., 2013. Introduction: Politics, Power and ‘Platformativity’.
Culture Machine 14, 1-9.

Gerling, W., Holschbach, S., Löffler, P., 2018. Bilder verteilen.
Fotografische Praktiken in der digitalen Kultur. Transcript, Bielefeld.

Gillespie, T., 2018. Custodians of the Internet: Platforms, Content
Moderation and the Hidden Decisions that Shape Social Media. Yale Univ.
Press, New Haven, London.

Goldhaber, M. H., 1997. The Attention Economy and the Net. First Monday
2(4). Retrieved from: http://firstmonday.org/article/view/519/44
(01/21/2021)

Haggerty, K. D., 2006. Tear down the Walls: on Demolishing the
Panopticon. In: Lyon, D. (ed.), Theorizing Surveillance. Willan, London,
23-45.

Haggerty, K. D., Ericson, R. V., 2000. The Surveillant Assemblage.
British Journal of Sociology 51, 605-622.

Helmond, A., 2015. The Platformization of the Web: Making Web Data
Platform Ready. Social Media + Society 1(2), 1-11.

Introna, L., Wood, D., 2004. Picturing algorithmic surveillance: the
politics of facial recognition systems. Surveillance & Society 2(2/3),
177-198.

Jurgenson, N., 2019. The Social Photo: On Photography and Social Media.
Verso, London.

Kammerer, D., Waitz, T., 2015. Überwachung und Kontrolle. Einleitung in
den Schwerpunkt. ZfM 13, 10-20.

Kellogg, K. C., Valentine, M. A., Christin A., 2020. Algorithms at work:
The new contested terrain of control. Academy of Management Annals 14,
366-410.

Kember, S., 2013. Gender Estimation in Face Recognition Technology. How
Smart Algorithms Learn to Discriminate. Media Fields Journal, 7, 1-10.

Langlois, G., Elmer, G., 2013. The Research Politics of Social Media
Platforms. Culture Machine 14, 1-17.

Lin, K., Sonboli, N., Mobasher, B., Burke, R., 2019. Crank Up the
Volume: Preference Bias Amplification in Collaborative Recommendation.
arXiv, eprint arXiv:1909.06362.

Mackenzie, A., Munster, A., 2019. Platform Seeing: Image Ensembles and
their Invisualities. Theory, Culture & Society 36, 3–22.

Monea, A., 2019. Race and Computer Vision. In: Sudmann, A. (ed.), The
democratization of artificial intelligence. Net politics in the era of
learning algorithms. Transcript, Bielefeld, 189-208.

Mühlhoff, R., 2018. „Digitale Entmündigung“ und „User Experience
Design“. Leviathan – Berliner Zeitschrift für Sozialwissenschaft 46(4),
551–74.

Müller-Helle, K., 2020. Bildzensur. Löschung technischer Bilder,
Bildwelten des Wissens. De Gruyter, Berlin.

Nguyen-Trung, K., 2020. Care of the self in the age of algorithms: Early
thoughts from a Foucauldian perspective. Journal of Science Ho Chi Minh
City Open University 10(1), 79-90.

Paglen, T., 2016. Invisible Images (Your Pictures Are Looking at You).
Retrieved from
https://thenewinquiry.com/invisible-images-your-pictures-are-looking-at-you/
(01/21/2021)

Reichert, R., 2020. #Foodporn auf Instagram: Wettbewerb und
Sozialsteuerung. POP 9(1), 93–99.

Reichert, R., 2020. Medien im Ausnahmezustand. Überwachungstechnologien
in der Ära von Covid-19. FALTER. Retrieved from
https://www.falter.at/zeitung/20200422/medien-im-ausnahmezustand-ueberwachungstechnologien-in-der-aera-von-covid-19 (01/21/2021)

Reichert, R., 2018. Biosurveillance, Self-Tracking und digitale
Gouvernementalität. In: Buhr, L., Hammer, S., Schölzel, H. (eds.),
Staat, Internet und digitale Gouvernementalität. Springer VS,
Wiesbaden, 65–86.

Roberts, S. T., 2019. Behind the Screen: Content Moderation in the
Shadows of Social Media. Yale Univ. Press, New Haven, London.

Rothöhler, S., 2018. Calm Images. Bildproliferation und Bildlosigkeit
im Internet der Dinge. Merkur 72, 32-42.

Rothöhler, S., 2021. Calm Images: The Invisible Visual Culture of
Digital Image Distribution. In: Moskatova, O. (ed.), Images on the Move:
Materiality – Networks – Formats. Transcript, Bielefeld.

Rouvroy, A., 2013. The end(s) of critique: data-behaviourism vs.
due-process. In: Hildebrandt, M., de Vries, K. (eds.), Privacy, Due
Process and the Computational Turn. Routledge, New York, 143-168.

Rouvroy, A., Berns, T., 2013. Gouvernementalité algorithmique et
perspectives d'émancipation. Réseaux 177, 163-196.

Rubinstein, D., Sluis, K., 2008. A Life More Photographic. Mapping the
Networked Image. In: Photographies 1, 9-28.

Scholz, T. (ed.), 2013. Digital Labor: The Internet as Playground and
Factory. Routledge, New York.

Srnicek, N., 2017. Platform Capitalism. Polity, Cambridge, Malden, MA.

Terranova, T., 2012. Attention, Economy and the Brain. Culture Machine
13, 1–19.

Trottier, D., 2012. Social Media as Surveillance: Rethinking Visibility
in a Converging World. Ashgate, Surrey, Burlington.

Trottier, D., Lyon, D. 2012. Key Features of Social Media Surveillance.
In: Fuchs, C. et al. (eds.), Internet and Surveillance. The Challenges
of Web 2.0 and Social Media. Routledge, New York, London, 89-105.

van Dijck, J., 2014. Datafication, Dataism and Dataveillance: Big Data
between Scientific Paradigm and Ideology. Surveillance & Society 12(2),
197-208.

van Dijck, J., 2013. The Culture of Connectivity: A Critical History of
Social Media. Oxford Univ. Press, Oxford.

van Dijck, J., Poell, T., de Waal, M., 2018. The Platform Society.
Public Values in a Connective World. Oxford Univ. Press, Oxford.

Veel, K., 2012. Calm Imaging: The Conquest of Overload and the
Conditions of Attention. In: Ekman, U. (ed.), Throughout: Art and
Culture Emerging with Ubiquitous Computing. MIT Press, Cambridge, MA,
119-132.

Weiser, M., Brown, J. S., 1997. The Coming Age of Calm Technology. In:
Denning, P.J., Metcalfe, R. M. (eds.), Beyond Calculation. The Next
Fifty Years of Computing. Springer, New York, 75-85.

Zuboff, S., 2019. The Age of Surveillance Capitalism. The Fight for a
Human Future at the New Frontier of Power. Profile, London.