"Stop Shooting" and Other Ideas on Managing Image Overload at Magnum Photos
In 1947, a group of seasoned photojournalists including Robert Capa and Henri Cartier-Bresson founded the international photo cooperative Magnum. At first, Magnum’s challenge was to cover the world with its limited network of photographers, and to get their pictures to as many magazine clients as possible before the novelty of those pictures expired. A decade later, Magnum’s problems had to do with filing cabinets, log books, storage space, and “dead” material. In 1958, Magnum’s New York-based executive editor John Morris begged photographers to “STOP shooting for a period of one month” so that staff could figure out a better system for editing, captioning, and selling their stories. Who has ever heard of the great Henri Cartier-Bresson being told to stop photographing?
This paper demonstrates how Magnum’s legacy – which rests on iconic images and the reputations of its technically skilled and socially concerned photographers – was produced in response to the problem of plenty at the photo agency. I show how editing, filing and retrieving images and story texts posed an ongoing problem within the agency’s operations, and examine the myriad ways in which it was handled behind the scenes by forgotten figures. I argue that Magnum’s inability to deal with its image files also became a point of pride, signaling that the cooperative wouldn’t succumb to the commercially oriented practices of competing photo agencies such as Black Star. Such narratives still operate today, while the unresolved excess of the picture archive continues to pose fundamental challenges to understanding both the history of Magnum’s operations and the role of photography in the postwar world.
Everyday Techniques of the Affect Lab: Photography in Early 20th Century Psychological Research
Grant Bollmer, North Carolina State University and the University of Sydney
We rarely encounter, at least not directly, the feelings and actions of individuals that implicitly guide scientific beliefs. We seldom confront the aberrational frustrations that emerge when a particular body is placed into an apparatus and transformed into a machine for the production of knowledge. Procedures that may be quite strange or even violent are framed in language that appears “neutral,” or “scientific,” or “objective.” In this talk I am interested in looking at the everyday life of what I term “the affect lab,” a historical, contingent fabrication in which human bodies and media meet in order to elicit and capture the motions and emotions of the human body, designed to inscribe and analyze what are often thought to be beyond the ability to inscribe and analyze. I am specifically interested in the use of books and folios of drawings and photographs of the human face in early 20th century psychological research on emotion, initially created for artists and sculptors but used in the affect lab to group facial expressions into identifiable categories. These categories today shape much of what is called the “basic emotions” theory of affect, central for proposed and realized technologies employed in surveillance, social media, and digital animation. Yet, I argue, these categories are partially an effect of the everyday use of photography – both as a means to stop motion, but also as an archival technique that bridges empirical scientific research with general aspects of how photographs can be stored and arranged.
Shame About That Microfilm
Lily Cho, York University
This paper attends to the curious shamefulness that can attach to working with the once-ubiquitous technologies of microfiche and microfilm. It begins by pondering the relationship between microfilm and photography and the way in which photography criticism privileges what Tina Campt aptly calls the “thingyness” of working with photographs. For Campt, the thingyness of the photograph enables an engagement with it that exceeds the seeming limits of its two-dimensional visuality. It is no longer just about looking. It has everything to do with (and I borrow from Eve Kosofsky Sedgwick advisedly here) touching feeling. “For while the affects of photographs are certainly produced through their visuality, they also resonate in equally profound ways through their materiality and through the haptics of their thingyness” (Campt 128). Campt’s claims for the power of thingyness will resonate with anyone who has ever stepped into a climate- and humidity-controlled room, slipped on cotton gloves, and picked up a photograph from a time outside of the one that they currently inhabit. What, then, of the feeling of the researcher whose archive is grounded solely in microfilm? Since 2008, I have been working with over 40,000 images of Chinese head tax certificates archived at Library and Archives Canada (LAC). In keeping with the protocols of “preservation technology,” the originals were destroyed when LAC chose to safeguard this archive through microfilm. It is an extraordinary archive. These certificates constitute the first use of identification photography in Canada. They offer a nearly complete chronological record of Chinese immigrants who left Canada between 1910 and 1947. These photographs also constitute one of the most significant collections of photographs of working-class, non-white people in early-twentieth-century Canada.
And, preserved as they are in several thousand feet of microfilm, they are irresistibly “thingy.” In considering the materiality of the image, my paper will take up this remarkable archive of head tax certificates in order to think through the productivity of shame. Shame, as Elspeth Probyn teaches us, comes from mistaken interests. For my larger project, Mass Capture, these photographs are not simply a record of the shameful treatment of their subjects by a state that accepted decades of their labour while shamelessly denying them access to citizenship purely on the basis of race. These photographs also put into place the shame of mistaking, and misreading, the interest of the subjects in those who will look at them. From the immigration agent who first formally encountered these photographs, to the mid-twentieth-century librarians who destroyed the original documents in the name of committing them to national memory, to contemporary researchers like me who have been so fortunate to have been held captive by their complexity, these photographs compel because there are so many ways to mistake their interest.
Between Object and Information: The Machine Circa 1930
Mark Hayward (York University) & Ghislain Thibault (Université de Montréal)
In 1932, the French engineer Jacques Lafitte published a short treatise entitled Reflections on the Science of Machines, in which he proposed a “general science of machines.” Two decades later, Lafitte’s work was “rediscovered” by the Cercle d’études cybernétiques as a precursor of Wiener’s analysis of information, communication and control. Developing this connection between Lafitte’s machine and the arrival of cybernetics, this presentation explores the concept of “the machine” as a means of ordering and interpreting the increasingly complex environment resulting from industrialization as experienced in the early twentieth century.
This presentation focuses on two contexts in which the concept of the machine was taken up. The first of these relates to the work of Lafitte, and the interest his work generated among Catholic intellectuals such as Emmanuel Mounier and Simone Weil, both of whom were interested in developing a new humanism adequate to the social and spiritual demands of the mechanical era. The second explores the embrace of the machine as an aesthetic object, focusing on the Machine Art exhibition organized by the Museum of Modern Art in 1934. Both Lafitte and the Machine Art exhibition emphasized the ubiquity of machines, highlighting the transformative role mechanization plays in how humans experience their environments.
Our presentation elaborates how the concept of the machine came to attract such attention in the 1930s, showing how its deployment as a means of mapping experience anticipated subsequent interest in information and media.
Postwar Credit Reporting and the Origins of Ubiquitous Consumer Surveillance
Josh Lauer, University of New Hampshire
“You’ve heard the term ‘credit rating,’ but do not think that you have one,” Life magazine teased its readers in 1953. “Your own financial affairs, you feel, are too trifling.” The Life feature, published in late December as many Americans piled up holiday debts, illuminated the inner workings of the consumer credit bureau. Comparing local bureaus to “branches of the FBI,” the Life writer quipped that this commercial surveillance regime, comprising some 1,700 affiliated bureaus, would “make even the head of the Soviet secret police gnash his teeth in envy.”
This paper examines the information-gathering role of consumer credit bureaus during the 1950s and 1960s, and the credit reporting industry’s key role in the development of late twentieth-century consumer surveillance. Though consumer credit bureaus were not new in the 1950s – their origins date to the 1870s – the Life story revealed both the startling reach of credit reporting networks and the shocking granularity of personal information in credit bureau files. The largest credit bureaus had millions of individual files, and each file documented the personal life, habits, and relationships of its subject, in addition to the subject’s financial affairs. Significantly, these millions of files were paper files, housed in rows of metal cabinets and retrieved by human hands. Credit bureaus did not begin to computerize until the mid-1960s.
As this paper reveals, the roots of what we might now call “ubiquitous consumer surveillance” can be traced to pre-computer information infrastructures, especially those operated by credit bureaus. Though the computerization of credit bureau records would speed the transmission of consumer data and invite new kinds of algorithmic analysis (e.g., credit scoring), analog systems for gathering and communicating detailed personal information were already in place. The “convenience” of today’s digital payment and instant-authorization systems descends from these offline, manual surveillance systems.
"To Have and To Hold": Information, Paper, and Cabinets in the Middle-Class Home
Craig Robertson, Northeastern University
In this presentation I explore the movement of office furniture into middle-class homes in the first half of the twentieth century. I focus on the index card cabinet and vertical filing cabinet as a way to think about how ideas of efficiency and information circulated outside of their articulation in the immediate spaces of corporate capitalism.
The need for cabinets to organize papers spoke to the dramatic increase in the number of official documents and papers that people needed in order to navigate their everyday lives outside of the home. This demand for information and the resulting “paperization” of everyday life (and identity) produced a recognition that papers needed to be stored in such a way that they could be protected and easily found.
Cabinets were also positioned as vital to the successful implementation of Taylorist ideas of efficiency in the home. For so-called “scientific housekeeping” to fulfill its promise to save time and make the home a more efficient and productive space, a home required accurate records of what was in it, when and how things were used, how much money was spent, and so on. Prescriptive literature showed how all of this information could be recorded on index cards organized by subject in a catalog drawer.
The information needs represented in the use of these cabinets illustrate a modern reconceptualization of information. Paper in files gave a material existence to this reconfiguring of information as a thing that could be detached and repositioned, reordered and recombined, all in the name of precision, accuracy and speed. To that end, I argue that card and filing cabinets are part of the process by which this conception of information became comprehensible.
Computable Soldiers: Media and Medicine in the First World War
Jeremy Packer (University of Toronto) and Alexander Monea (George Mason University)
This presentation examines the entangled roles of computation, media, and medicine in the U.S. Army, and demonstrates how the U.S. Army Medical Corps followed what we term a governmental a priori in the First World War. We do so by investigating the centrality of two media, tabulation machines and telephones, to the U.S. war effort. First, we see the introduction of toe tags and medical charts that follow the patient, recording information about his diagnosis, treatment, and movement across temporary field, foreign, and domestic hospitals. Following the Census Bureau's model, the U.S. Army Medical Corps also began recording this information on punch cards to allow for the continuous computation of troop strength at the front and continuous monitoring for patterns of injuries and disease. These electrically computable records also lent themselves to post-war analyses of what made for the healthiest fighting bodies, which led to changes in who was recruited, how they were assigned, where their unit was deployed and how it moved around, as well as where field and temporary hospitals were set up and how surgeons were distributed amongst them.
More broadly, we argue that in the U.S. context biopolitical advancement occurs at the cutting edge of socio-technical integration. In this instance, attempts to maximize military force involved integrating new media into war strategy and thus demanded locating and fostering new media-specific skills. In turn, a larger subset of the population, in this instance women, was opened up to a media-driven search to maximize the force writ large and to locate specialist capacities in individuals that correspond to the needs of media devices. And here we locate our second focus.
In 1917, the Adjutant General’s office under the orders of the U.S. Army Surgeon General requisitioned Hollerith tabulating machines and 25 female clerical operators to begin parsing the medical statistics of the war. They were initially tasked with transcribing draft and medical records into punch card format so that they could be computed. Soon after, machines were requisitioned for field hospitals in France so that data could be parsed on the fly. At the same time, hundreds of women were recruited as telephone operators by the Signal Corps so they could provide the necessary technical skills to successfully link the front with chains of command. The seemingly banal technologies and technicalities of telephone and tabulator leverage military capacities via their ability to collect, store, process, and distribute data. The handling of data, through bodily-technical forms of discipline and specialized knowledge, becomes a military (and civilian) skill of increased importance.
Cold War Sonar Networks and the Infosonic Imaginary
John Shiga, Ryerson University
As a set of techniques for navigating, monitoring and mapping undersea space by means of sound, sonar played a central role in military, commercial, scientific and activist reimaginings of the deep ocean as a workshop, laboratory, wilderness, battle zone, and resource extraction site. Due in large part to the development of transoceanic sonar networks by the US Navy in conjunction with telecommunications firms and bioacoustics labs in universities, by the end of the Cold War the ocean had become a matrix of quasi-territories that are now central to international conflicts over resource rights, cultural identity and national security. Sonar has played a pivotal role not only in producing ocean space and the entities therein as objects of knowledge, prediction and control, but also in the recent development of what might be called an infosonic imaginary, in which water-based sound becomes both a source of information about potential military threats and a zone of uncertainty and anxiety concerning the impact of human acoustic activity on ocean space and the entities therein. How do aquatic figurations of mediation – channels, currents, waves, “the cloud,” and so on – both anticipate and register Cold War relays between information and ocean space? How might an archaeology of sonar inform the emerging elemental approach in media studies by engaging ocean sound and energy exchange more broadly as central to the history of information and materiality? And how has the heterogeneity and unpredictability of ocean sound shaped the ways in which organizations capture, mobilize, and imagine information in water?
Bunk and Ballyhoo
Will Straw, McGill University
If a concern over information overload has been a consistent motif of cultural commentary since at least the beginning of the 20th century, this concern has nevertheless marked particular moments with exceptional intensity. In the United States, in a period roughly bounded by 1927 and 1933, new or refurbished vocabularies emerged to designate an informational culture marked by exaggeration, sensation and what, today, we might call “fake” news. In particular, terms like “Ballyhoo,” “Bunk,” “Hokum” and “Hullabaloo” -- many of them rooted in an earlier culture of carnivals and side-shows -- came to be used to characterize the culture of modern urban media. Each of these names, as well, served as the title of early 1930s magazines which offered a satirical perspective on a newly media-saturated cultural condition. While much of the new critique of media ubiquity concentrated on the relentless circulation of images, it focused, as well, on the new, condensed forms of textuality -- headlines, “brevities,” slogans -- which had become prominent in the popular culture of the 1920s. This paper will draw on ongoing research on the sensational media culture of the United States during the period 1925-1935.
Lana Swartz, University of Virginia
Pop Music Charts and the Metadata of Culture
Liam Young, Carleton University
This talk traces a pre-history of the popular music chart—from its earliest incarnation as a list of sheet music bestsellers in fin-de-siècle roadshows to its most recognizable and paradigmatic form, the Billboard Hot 100 (ca. 1958). As a solution to the linked problems of information organization and commodity circulation, charts were essential in the emergence and management of popular music as a cultural field. In spite of this, like other forms of ‘grey literature,’ they have received relatively little scholarly attention.
Charts do not offer access to recorded sound but instead compile, arrange, rank, and disseminate metadata. What data to include on the charts, how to arrange it, and how frequently to update it were decisions with long-term consequences for the popular music field: charts were a key site in the struggle for artist recognition and compensation (the appearance of the artist’s name next to song title and publisher activates alternate understandings of authorship and copyright); data points like ‘previous position’ and ‘weeks on chart’ establish the frequency and duration of song circulation and audience attention; even uses of the term ‘popular’ to describe musical genre and style, and a new category of fan, occurred first on the charts.
Focusing on these metadata functions shows how cultural knowledge and experience are premediated by interstitial forms of writing and data organization, usually unnoticed, like the chart. In processing distinctions at the heart of the popular music field, charts are an exemplar of what recent German media theory calls ‘cultural techniques.’ Conceiving of them as such throws 20th century charts into longer histories, and futures, of listing—a technique that has structured ways of knowing and acting in human societies for millennia.