The Algorithmic Nickelodeon featured in Sight & Sound’s Best Video Essays of 2019

The Algorithmic Nickelodeon from Shane Denson on Vimeo.

 

Several weeks ago, Sight & Sound Magazine’s “Best Video Essays of 2019” came out, featuring 134 videos nominated by 39 contributors — including my “Algorithmic Nickelodeon” piece, picked by Jiří Anger from Charles University in Prague. He writes:

Despite its formal shortcomings, this must be one of the most thought-provoking videographic works I have seen. Denson’s theoretical manifesto imagines a form of audiovisual criticism that would not be merely expressive but transformative, reinventing our notion of subject-object relations. For this to happen, deformations of the image/object and displacements of the analyst/subject must take place simultaneously. Creative thinking joins forces with EEG headsets and editing programmes to create a media-theoretical ‘perpetuum mobile’, designed for constant questioning of what cinema means in the age of algorithms.

I am honored to have my work featured alongside so many fascinating videos, a number of which were made by friends and colleagues of mine (including especially noteworthy pieces by Chloé Galibert-Laîné, Kathleen Loock, Jason Mittell, and Tracy Cox-Stanton, as well as Allison de Fren’s “Mad Science/Mad Love and the Female Body in Pieces,” which I commissioned for the Videographic Frankenstein exhibition at Stanford and published last year in Hyperrhiz).

By the way, I agree completely with Anger’s assessment of my video’s “formal shortcomings,” which stand out all the more against the background of all the excellent and polished work featured in the poll. In fact, my video was conceived and produced as a very rough proof-of-concept for a symposium organized by Kathleen Loock in Berlin last year (where I had hoped to do a live demo of the setup but was unable to due to technical limitations in the venue). A more polished video for the project is currently being planned, but in the meantime I’m quite happy with Anger’s assessment of it as “one of the most thought-provoking videographic works”!

The Algorithmic Nickelodeon at Besides the Screen Festival (Vitoria, Brazil, September 9-12, 2019)


I am happy to report that my deformative, EEG-driven interactive video project, The Algorithmic Nickelodeon, which was screened last month at the ACUD-Kino in Berlin, has been selected for screening at the Besides the Screen Festival taking place in Vitória and São Paulo, Brazil this September. My understanding is that it will be among the works shown in Vitória from September 9-12.

The Algorithmic Nickelodeon

Yesterday was the first event on my trip to Germany and Switzerland: the symposium Videographic Criticism: Aesthetics and Methods of the Video Essay, organized by Kathleen Loock, and with talks/screenings from her, Allison de Fren, Chloé Galibert-Laîné and Kevin B. Lee, Liz Greene, David Verdeure, and myself.

Above, you will find my video contribution, “The Algorithmic Nickelodeon,” which builds on work started at the Duke S-1: Speculative Sensation Lab during my time there as a postdoc. The video is offered as proof-of-concept for an experimental approach to videographic theory: using video not (only) as a vehicle for theoretical expression but as a more radically transductive medium of media-theoretical exploration and transformation.

Speculative Data: Full Text, MLA 2016 #WeirdDH

[Slide 1]

Below you’ll find the full text of my talk from the Weird DH panel organized by Mark Sample at the 2016 MLA conference in Austin, Texas. Other speakers on the panel included Jeremy Justus, Micki Kaufman, and Kim Knight.

***

Speculative Data: Post-Empirical Approaches to the “Datafication” of Affect and Activity

Shane Denson, Duke University

A common critique of the digital humanities questions the relevance (or propriety) of quantitative, data-based methods for the study of literature and culture; in its most extreme form, this type of criticism insinuates a complicity between DH and the neoliberal techno-culture that turns all human activity, if not all of life itself, into “big data” to be mined for profit. Now, it may sound from this description that I am simply setting up a strawman to knock down, so I should admit up front that I am not wholly unsympathetic to the critique of datafication. But I do want to complicate things a bit. Specifically, I want to draw on recent reconceptions of DH as “deformed humanities” – as an aesthetically and politically invested field of “deformance”-based practice – and describe some ways in which a decidedly “weird” DH can avail itself of data collection in order to interrogate and critique “datafication” itself.

[Slide 2]

My focus is on work conducted in and around Duke University’s S-1: Speculative Sensation Lab, where literary scholars, media theorists, artists, and “makers” of all sorts collaborate on projects that blur the boundaries between art and digital scholarship. The S-1 Lab, co-directed by Mark Hansen and Mark Olson, experiments with biometric and environmental sensing technologies to expand our access to sensory experience beyond the five senses. Much of our work involves making “things to think with,” i.e. experimental “set-ups” designed to generate theoretical and aesthetic insight and to focus our mediated sensory apparatus on the conditions of mediation itself. Harnessing digital technologies for the work of media theory, this experimentation can rightly be classed, alongside such practices as “critical making,” in the broad space of the digital humanities. But due to their emphatically self-reflexive nature, these experiments challenge borders between theory and practice, scholarship and art, and must therefore be qualified, following Mark Sample, as decidedly “weird DH.”

[Slide 3]

One such project, Manifest Data, uses a piece of “benevolent spyware” that collects and parses data about personal Internet usage in such a way as to produce 3D-printable sculptural objects, thus giving form to data and reclaiming its personal value from corporate cooptation. In a way that is both symbolic and material, this project counters the invisibility and “naturalness” of mechanisms by which companies like Google and Facebook expropriate value from the data we produce. Through a series of translations between the digital and the physical—through a multi-stage process of collecting, sculpting, resculpting, and manifesting data in virtual, physical, and augmented spaces—the project highlights the materiality of the interface between human and nonhuman agencies in an increasingly datafied field of activity. (If you’re interested in this project, which involves “data portraits” based on users’ online activity and even some weird data-driven garden gnomes designed to dispel the bad spirits of digital capital, you can read more about it in the latest issue of Hyperrhiz.)
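To make the Manifest Data pipeline a bit more concrete, here is a minimal, purely illustrative sketch of its final translation step: turning parsed usage data (here imagined as per-domain visit counts) into a 3D-printable form. The lab’s actual spyware, data format, and sculpting process are not reproduced here; the cylinder-ring geometry, the OpenSCAD output, and all parameter names are my own assumptions for illustration.

```python
import math

def counts_to_scad(counts, base_height=2.0, scale=0.5):
    """Render each domain's visit count as a cylinder whose height encodes
    the count, arranged in a ring -- a minimal 'data sculpture' emitted as
    OpenSCAD source, which can then be exported for 3D printing."""
    n = len(counts)
    parts = []
    for i, (domain, count) in enumerate(sorted(counts.items())):
        angle = 2 * math.pi * i / n          # position around the ring
        x, y = 20 * math.cos(angle), 20 * math.sin(angle)
        h = base_height + scale * count      # visits -> physical height
        parts.append(
            f"translate([{x:.1f}, {y:.1f}, 0]) cylinder(h={h:.1f}, r=3);  // {domain}"
        )
    return "union() {\n  " + "\n  ".join(parts) + "\n}\n"
```

Feeding it a hypothetical browsing history, e.g. `counts_to_scad({"example.org": 120, "a-search-engine.com": 300})`, yields a small OpenSCAD model in which the most-visited domains tower over the rest.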

[Slide 4]

Another ongoing project, about which I will say more in a moment, uses data collected through (scientifically questionable) biofeedback devices to perform real-time collective transformations of audiovisual materials, opening theoretical notions of what Steven Shaviro calls “post-cinematic affect” to robustly material, media-archaeological, and aesthetic investigations.

[Slide 5]

These and other projects, I contend, point the way towards a truly “weird DH” that is reflexive enough to suspect its own data-driven methods but not paralyzed into inactivity.

Weird DH and/as Digital Critical (Media) Studies:

So I’m trying to position these projects as a form of weird digital critical (media) studies, designed to enact and reflect (in increasingly self-reflexive ways) on the use of digital tools and processes for the interrogation of the material, cultural, and medial parameters of life in digital environments.

[Slide 6]

Using digital techniques to reflect on the affordances and limitations of digital media and interfaces, these projects are close in spirit to new media art, but they also resonate with practices and theories of “digital rhetoric,” as described by Doug Eyman, with Gregory Ulmer’s “electracy,” and with Casey Boyle’s posthuman rhetoric of multistability, which celebrates the rhetorical power of digital glitches to expose the limits of computational media within an interagential relational field that includes both humans and nonhumans. In short, these projects enact what we might call, following Stanley Cavell, the “automatisms” of digital media: the generative affordances and limitations that are constantly produced, reproduced, and potentially transformed or “deformed” in creative engagements with media. Digital tools are used in such a way as to problematize their very instrumentality, hence moving towards a post-empirical or post-positivistic form of datafication as much as towards a post-instrumental digitality.

[Slide 7]

Algorithmic Nickelodeon / Datafied Attention:

My key example is a project tentatively called the “algorithmic nickelodeon.” Here we use consumer-grade EEG headsets to interrogate the media-technical construction and capture of human attention, and thus to complicate datafication by subjecting it to self-reflexive, speculative, and media-archaeological operations. The devices in question cost about $100 and are marketed as tools for improving concentration, attention, and memory. The headset measures a variety of brainwave activity and, by means of a proprietary algorithm, computes values for “attention” and “meditation” that can be tracked and, with the help of software applications, trained and supposedly optimized. In the S-1 Lab, we have sought to tap into these processes not just to criticize the scientifically dubious nature of these claims but to probe and better understand the automatisms and interfaces at work here and in media of attention more generally. Specifically, we have designed a film- and media-theoretical application of the apparatus, which allows us to think early and contemporary moving images together, to conceive pre- and post-cinema in terms of their common deviations from the attention economy of classical cinema, and to reflect more broadly on the technological-material reorganizations of attention involved in media change. This is an emphatically experimental (that is, speculative, post-positivistic) application, and it involves a sort of post-cinematic reenactment of early film’s viewing situations in the context of traveling shows, vaudeville theaters, and nickelodeons. With the help of a Python script written by lab member Luke Caldwell, a group of viewers wearing the NeuroSky EEG devices influence the playback of video clips in real time, for example changing the speed of a video or the size of the projected image in response to changes in attention as registered through brain-wave activity.
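The core mapping can be sketched in a few lines. This is not Luke Caldwell’s actual script: the averaging, the linear mapping, and the rate bounds are all assumptions for illustration. It only shows the kind of translation involved: each headset reports an “attention” value from 0 to 100, the group’s values are averaged, and the mean is mapped to a playback-rate multiplier.

```python
def group_attention(readings):
    """Average the latest attention value (0-100) across all headsets."""
    if not readings:
        return 50.0  # neutral default before any data arrives
    return sum(readings) / len(readings)

def attention_to_rate(attention, min_rate=0.5, max_rate=2.0):
    """Map mean attention to a playback-rate multiplier: flagging
    attention speeds the film up (as an early projectionist might crank
    faster through a plodding scene); rapt attention lets it run slow."""
    t = max(0.0, min(100.0, attention)) / 100.0
    return max_rate - t * (max_rate - min_rate)
```

In a running setup, a loop would poll the headsets, call `attention_to_rate(group_attention(latest))`, and hand the result to the video player each frame.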

At the center of the experimentation is the fact of “time-axis manipulation,” which Friedrich Kittler highlights as one of the truly novel affordances of technical media, like the phonograph and cinema, that arose around 1900 and marked, for him, a radical departure from the symbolic realms of pre-technical arts and literature. Now it became possible to inscribe “reality itself,” or to record a spectrum of frequencies (like sound and light) directly, unfiltered through alphabetic writing; and it became possible as well to manipulate the speed or even playback direction of this reality.

[Slide 9]

Recall that the cinema’s standard of 24 fps only solidified and became obligatory with the introduction of sound, as a solution to a concrete problem introduced by the addition of a sonic register to filmic images. Before the late 1920s, and especially in the first two decades of film, there was a great deal of variability in projection speed, and this was “a feature, not a bug” of the early cinematic setup. Kittler writes: “standardization is always upper management’s escape from technological possibilities. In serious matters such as test procedures or mass entertainment, TAM [time-axis manipulation] remains triumphant. […] frequency modulation is indeed the technological correlative of attention” (Gramophone Film Typewriter 34-35). Kittler’s pomp aside, his statement highlights a significant fact about the early film experience: Early projectionists, who were simultaneously film editors and entertainers in their own right, would modulate the speed of their hand-cranked apparatuses in response to their audience’s interest and attention. If the audience was bored by a plodding bit of exposition, the projectionist could speed it up to get to a more exciting part of the movie, for example. Crucially, though: the early projectionist could only respond to the outward signs of the audience’s interest, excitement, or attention – as embodied, for example, in a yawn, a boo, or a cheer.
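The material stakes of this variability are easy to quantify. Taking two well-documented constants of the format (35mm film carries 16 frames per foot, and a standard reel held 1000 feet), the same reel yields quite different running times depending on how fast it is cranked:

```python
# Time-axis manipulation as simple arithmetic: the same 35mm reel runs
# at different lengths depending on projection speed.
FRAMES_PER_FOOT = 16  # 35mm film

def running_time_minutes(reel_feet, fps):
    frames = reel_feet * FRAMES_PER_FOOT
    return frames / fps / 60

# A 1000-ft reel cranked at a silent-era 16 fps vs. projected at the
# later sound standard of 24 fps:
slow = running_time_minutes(1000, 16)   # about 16.7 minutes
fast = running_time_minutes(1000, 24)   # about 11.1 minutes
```

A third of the reel’s running time, in other words, lay in the projectionist’s hand on the crank.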

[Slide 10]

But with the help of an EEG, we can read human attention – or some construction of “attention” – directly, even in cases where there is no outward or voluntary expression of it, and even without its conscious registration. By correlating the speed of projection to these inward and involuntary movements of the audience’s neurological apparatus, such that low attention levels cause the images to speed up or slow down, attention is rendered visible and, to a certain extent, opened to conscious and collective efforts to manipulate it and the frequency of images now indexed to it.

According to Hugo Münsterberg, who wrote one of the first book-length works of film theory in 1916, cinema’s images anyway embody, externalize, and make visible the faculties of human psychology; “attention,” for example, is said to be embodied by the close-up. With our EEG setup, we can literalize Münsterberg’s claim by correlating higher attention levels with a greater zoom factor applied to the projected image. If the audience pays attention, the image grows; if attention flags, the image shrinks. But this literalization raises more questions than it answers, it would seem. On the one hand, it participates in a process of “datafication,” turning brain wave patterns into a stream of data called “attention,” but whose relation to attention in ordinary senses is altogether unclear. But this datafication simultaneously opens up a space of affective or aesthetic experience in which the problematic nature of the experimental “set-up” announces itself to us in a self-reflexive doubling: we realize suddenly that “it’s a setup”; “we’ve been framed” – first by the cinema’s construction of attentive spectators and now by this digital apparatus that treats attention as an algorithmically computed value.
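The Münsterberg literalization, too, comes down to a small mapping, sketched here under the same caveats as before (the linear curve, the zoom bounds, and the function names are illustrative assumptions, not the lab’s actual parameters):

```python
def attention_to_zoom(attention, min_zoom=0.5, max_zoom=2.0):
    """Map mean attention (0-100) to a zoom factor: high attention
    enlarges the image (a collective 'close-up'); flagging attention
    shrinks it."""
    t = max(0.0, min(100.0, attention)) / 100.0
    return min_zoom + t * (max_zoom - min_zoom)

def scaled_frame(width, height, attention):
    """Compute the projected frame dimensions for a given attention level."""
    z = attention_to_zoom(attention)
    return round(width * z), round(height * z)
```

At full attention a 640x480 clip fills a frame four times its neutral area; as attention flags, the image dwindles toward a quarter of it.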

So in a way, the apparatus is a pedagogical/didactic tool: it not only allows us to reenact (in a highly transformed manner) the experience of early cinema, but it also helps us to think about the construction of “attention” itself in technical apparatuses both then and now. In addition to this function, it also generates a lot of data that can indeed be subjected to statistical analysis, correlation, and visualization, and that might be marshaled in arguments about the comparative medial impacts or effects of various media regimes. Our point, however, remains more critical, and highly dubious of any positivistic understanding of this data. The technocrats of the advertising industry, the true inheritors of Münsterberg the industrial psychologist, are anyway much more effective at instrumentalizing attention and reducing it to a psychotechnical variable. With a sufficiently “weird” DH approach, we hope to stimulate a more speculative, non-positivistic, and hence post-empirical relation to such datafication. Remitting contemporary attention procedures to the early establishment of what Kittler refers to as the “link between physiology and technology” (73) upon which modern entertainment media are built, this weird DH aims not only to explore the current transformations of affect, attention, and agency – that is, to study their reconfigurations – but also potentially to empower media users to influence such configuration, if only on a small scale, rather than leave it completely up to the technocrats.

Conversations in the Digital Humanities at Duke


Today, Oct. 2, 2015, the Franklin Humanities Institute, the Wired! Lab, the PhD Lab in Digital Knowledge, and HASTAC@Duke will be presenting “Conversations in the Digital Humanities,” the inaugural event of the new Digital Humanities Initiative at Duke University. More information about the event, in which I will be participating alongside colleagues from the S-1: Speculative Sensation Lab, can be found on the FHI website.

Also, all of the 10-minute “lightning talks” will be live-streamed. The first block of sessions, from 2:15-3:45pm EST, will be streamed here, and the second block, from 4:00-5:40pm, will be viewable here. (Apparently, the videos will be archived and available after the fact as well.)

Here is the complete schedule:

2:00 – 2:15
Welcome and Introduction to Digital Humanities Initiative

2:15 – 3:45 
Session 1 (10 minutes per talk)

  1. Project Vox (Andrew Janiak and Liz Milewicz)
  2. NC Jukebox (Trudi Abel, Victoria Szabo)
  3. Visualizing Cultures: The Shiseido Project (Gennifer Weisenfeld)
  4. Going Global in Mughal India (Sumathi Ramaswamy)
  5. Israel’s Occupation in the Digital Age (Rebecca Stein)
  6. Digital Athens: Archaeology meets ArcGIS (Tim Shea, Sheila Dillon)
  7. Early Medieval Networks (J. Clare Woods)

3:45 – 4:00
Coffee Break

4:00 – 5:40 
Session 2 (10 minutes per talk)

  1. Painting the Apostles – A Case Study in “The Lives of Things” (Mark Olson, Mariano Tepper, and Caroline Bruzelius)
  2. Digital Archaeology: From the Field to Virtual Reality (Maurizio Forte)
  3. The Memory Project (Luo Zhou)
  4. Veoveo, children at play (Raquel Salvatella de Prada)
  5. “Things to Think With”: Weird DH, Data, and Experimental Media Theory (S-1 Lab)
  6. s_traits, Generative Authorship and the Emergence Lab (Bill Seaman and John Supko)
  7. Found Objects and Fireflies (Scott Lindroth)
  8. Project Provoke (Mary Caton Lingold and others)

5:40 – 6:00 
Reception

Things to Think With


As a late addition to the program, the Duke S-1 Speculative Sensation Lab will be participating in “Conversations in the Digital Humanities” this coming Friday, October 2, 2015, at the Franklin Humanities Institute at Duke. The event, which will consist of a series of brief “lightning talks” on a range of topics that run the gamut of current DH work, will take place from 2:00-6:00pm in the FHI Garage in Smith Warehouse, Bay 4. More info here: Conversations in the Digital Humanities.

Here is the abstract for the S-1 Lab’s presentation, which I will be participating in along with Lab co-director Mark Olson and our resident programmer Luke Caldwell:

“Things to Think With”: Weird DH, Data, and Experimental Media Theory

S-1 Speculative Sensation Lab

The S-1 Speculative Sensation Lab, co-directed by Mark Hansen and Mark Olson, experiments with biometric and environmental sensing technologies to expand our access to sensory experience beyond the five senses. Much of our work involves making “things to think with,” i.e. experimental “set-ups” designed to generate theoretical and aesthetic insight and to focus our mediated sensory apparatus on the conditions of mediation itself. Harnessing digital technologies for the work of media theory, this experimentation can rightly be classed, alongside such practices as “critical making,” in the broad space of the digital humanities. But due to their emphatically self-reflexive nature, these experiments challenge borders between theory and practice, scholarship and art, and must therefore be qualified, following Mark Sample, as decidedly “weird DH.”

In this presentation, we discuss a current project that utilizes consumer-grade EEG headsets, in conjunction with a custom Python script by lab member Luke Caldwell, to reflect on the contemporary shape of “attention,” as it is constructed and addressed in individual and networked forms across media ranging from early cinema to “post-cinema.”