archive(s) : interstice(s)

Addressing the gaps in time-based media collection and preservation

“Inconvenience has its virtues . . . ”
– Rick Prelinger

The difficulty in time-based media is that it is time-based; moving images and sound are bound to time. A specific duration is required to experience these media, unlike traditional material objects, the intake of which is sometimes satisfied with a passing glance – there is no playback mechanism required. Articulating the archive as an “interstice” is my attempt to circumvent the disparate formations of the archive that seem to hold it in a fixed time and space. Emphasizing the interstitial, I align with Jason Farman’s position “that the delay between call and answer has always been an important part of the message” (Delayed Response, 2018).

Rather than proposing yet another taxonomy of the archive, I wish to interrogate this very tendency to separate and organize notions of the archive. In short, my aim is to elevate the ephemerality and mutability in the archive(s). It is also to challenge the Foucauldian episteme that defines the archive as “the system of [a statement-thing’s] functioning” (original emphasis, Archaeology of Knowledge, p. 129). The archive as interstice acknowledges the multitudes of being and knowing the world as much as it acknowledges the myriad constructions of the archive. In this way, we might recognize the archive as a system that is “malfunctioning” or deviating from presupposed spatial orientations. This framework identifies the interstice (the gap, trace, or memory) as the major stakeholder in time-based media archiving.

I want to begin by grounding this rationale with questions that always seem to present themselves in archival epistemologies: What constitutes “collection” or “preservation”? Is it the encasing or the containment of a material object in space? What constitutes materiality? How is it negotiated in “real” or “virtual” space?

This brings me to Ranganathan’s Colon Classification system. In pivoting away from the definition of the archive as such, my hope is to highlight what falls through the cracks, what fails to encode, or what cannot be arranged according to a specific protocol. The “classification” in Ranganathan’s facets favors multiple variables over taxonomy, emphasizing “universal principles inherent in all knowledge.” [1] This thinking led to the development of the following facets:

  • Personality — what the object is primarily “about”; considered the “main facet”
  • Matter — the material of the object
  • Energy — the processes or activities that take place in relation to the object
  • Space — where the object happens or exists
  • Time — when the object occurs

Ranganathan defined an object as “any concept that a book could be written about.” [1] The limitation here is that the Colon Classification system presupposes objects as publications. Considering the flexibility afforded by the system and Ranganathan’s own acknowledgement that “the library is a living organism,” it stands to reason that library classification systems must accommodate the ever-expanding notion of materiality. [1] With this in mind, we might see the facets working for contemporary time-based media. I’ll use Scott Northrup’s Hämeenkyrö Redux as an example:

  • Personality: Romance
  • Matter: Digital video
  • Energy: Mourning, melancholia
  • Space: Hämeenkyrö
  • Time: 2018
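To make the facet logic concrete, here is a minimal, hypothetical Python sketch of how a PMEST record might be represented; the class name and the colon-joined string are my own stand-ins, not Ranganathan’s actual notation:

```python
from dataclasses import dataclass

@dataclass
class ColonRecord:
    """A toy PMEST facet record, loosely after Ranganathan's Colon Classification."""
    personality: str  # what the object is primarily "about" (the main facet)
    matter: str       # the material of the object
    energy: str       # processes or activities related to the object
    space: str        # where the object happens or exists
    time: str         # when the object occurs

    def notation(self) -> str:
        # The real system separates facets with punctuation symbols; a plain
        # colon-delimited string stands in for that notation here.
        return " : ".join([self.personality, self.matter, self.energy,
                           self.space, self.time])

redux = ColonRecord(
    personality="Romance",
    matter="Digital video",
    energy="Mourning, melancholia",
    space="Hämeenkyrö",
    time="2018",
)
print(redux.notation())
# Romance : Digital video : Mourning, melancholia : Hämeenkyrö : 2018
```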

Obviously, this reverse indexing system (where we begin with the object itself rather than the reference terminology) is quite an undertaking, but the motivation behind this organization is the implementation of a user-centered experience rather than a top-down episteme. We see this user-experience ethos working in the Prelinger Archives, where the “taxonomy” functions as a mutable installation (Figure 1). What the Colon Classification system and the Prelinger Archives demonstrate is that both archival collection and organization, in theory and practice, are interstitial in their methodologies and subject to change.

Figure 1: Prelinger Archives

Turning to the interstitial material quality of time-based media, Shannon Mattern observes the meta-dimensions of audio recordings. [2] Identifying the imbricate relationship between the recording device and the subject (or object) being recorded, she writes, “any sonic archival document is archiving the historical event and its own recording.” [2] The interstice between the mechanisms of recording and sound itself actively shapes material experience. Unlike traditional text or still photographs, sound “can suggest the material and volumetric properties of both the recorded sounding subject or object and the space in which that recording occurred.” [2] The suggestion of materiality and space is readily apparent in Alvin Lucier’s I am Sitting in a Room. With the removal of the sound’s source (in this case, Lucier’s voice), we are left with reflections of the original sound. The result is the sonic representation of the space itself. (Link: http://www.ubu.com/sound/lucier.html)

The difficulty here is the performative quality of time-based media. This leads me to Mattern’s remark that “preservation necessarily involves transformation.” [2] Considering the expanse of magnetic tape that currently safeguards our digital data, we must come to grips with the fact that some email threads, like many ephemeral films, will inevitably slip through the cracks. But as Rick Prelinger reminds us, “ephemeral films weren’t meant to be kept in the long run.” [3] Rather than merely acquiescing to contested best practices and arguments over what should or shouldn’t be preserved, we might consider leaning into the immanent, fleeting quality of time-based media. Moving beyond static models, we might begin to acknowledge the archive(s) as a mode of creation in addition to a method for collection and preservation.

Recalling the case study I presented above and earlier in the semester, Scott Northrup’s Hämeenkyrö Redux (Figure 2) calls attention to the “ghosts in the reels.” The digital video operates as a memento to a predecessor film, Hämeenkyrö mon amour, which vanished from a faulty hard drive. The resulting “redux” was produced with archival vestiges: photographs and videos taken with the artist’s iPhone and his sensorial memory embodied in sonic form. Northrup’s voice-over, which recounts a fleeting romantic interaction, is a reification of immaterial loss. While both iterations of Hämeenkyrö exist precariously as digital artifacts, they demonstrate the notion that the archive is inextricably tied to ephemerality. (Link: https://vimeo.com/289864823/90eecd2691)

Figure 2: Hämeenkyrö Redux, Digital Video, 12min. Scott Northrup, 2018.

The interstitial quality of the archive permeates not only archival research and practices but also library science, media archaeology, and media and cultural studies. Circling back to Jason Farman’s emphasis on delay, I want to end with a final loose thread. Reading Timothy Leonido’s “How to Own a Pool and Like It,” I want to also acknowledge how gaps can be manipulated to operate unethically. Leonido recounts the conviction of Edward Lee King, which was substantiated by speech-recognition technology. [4] The “dubious 99 percent accuracy rate” and the questionable research practices in Lawrence Kersta’s voiceprint trials notwithstanding, how can we trust recording apparatuses that can be so easily manipulated? A recent example of egregious misuse of technology that leads to misrepresentation:

I think this is why many of us turned to art practice this semester to illuminate the ways in which we might circumvent nefarious infrastructures and data-collection tactics. Speaking from personal experience, to practice art is to embrace inconvenience.

 

Notes:

[1]  Mike Steckel, “Ranganathan for IAs” (October 7, 2002) http://boxesandarrows.com/ranganathan-for-ias/

[2] Christine Mitchell, “Media Archaeology of Poetry and Sound: A Conversation with Shannon Mattern,” Amodern 4 (2015)

[3] “Prelinger Archives Part 1,” C-SPAN (April 11, 2013) {video}

[4] Timothy Leonido, “How to Own a Pool and Like It,” Triple Canopy (April 2017)

 

Application Post – Photo Collections

||visuals here||

 

Of the readings this week, I would like to focus specifically on Diana Kamin’s article “Mid-Century Visions, Programmed Affinities: The Enduring Challenges of Image Classification” and compare her insights with a recent example of a large image classification system that has a direct through-line to another broad and pervasive category of photography. I will also highlight my personal user experiences exploring the two online archives described in Nina Lager Vestberg’s article “Ordering, Searching, Finding” as well as the British Library’s Endangered Archives Programme (referenced in Allison Meier’s post “Four Million Images from the World’s Endangered Archives”).

Before getting into those topics, I would like to quickly comment on the other readings for this week, starting with the John Tagg paper, “The Archiving Machine; or, The Camera and the Filing Cabinet.” I did not care for this article! I found the writing overly dense, dry, and hard to get through. It succeeds in denoting the major innovation achieved by “the modern vertical file” and its associated cabinet, and I don’t wish to trivialize that by any means. However, I found the self-seriousness of the writing formidable and the conclusion (“one might say that the archive must and must not be the horizon of our future”) dissatisfying (Tagg, 34). After reading it a second time I realized that there is a video on Vimeo of the author delivering this paper word for word, which makes it all slightly more digestible but no less annoying. Apologies for this bit of editorial!

However, I appreciated the in-depth background on Alphonse Bertillon, who served as the director of the identification bureau of the Paris police at the end of the 19th century — the use case he described, the impossible task of sorting through the police’s collection of over 100,000 photos to find a single criminal, highlighted the need for and radical innovation of the classification and organizational structure of a “Bertillon” cabinet. Tagg convincingly conveys the massive impact made by the debut of the modern vertical file at the Chicago World’s Fair in 1893 and the broad changes to systems of organization that followed. This emphasis on the need for “a systematic order” for storage/retrieval and classification, specifically for collections of photographs, is a major theme through this piece and the rest of the readings for this week.

Tagg also briefly lingers on the word and concept of “capture” in terms of systematic/organizational apparatuses, but I followed it through to its function in photography specifically — each photograph is a captured moment. Vestberg expands on this in her piece by noting that “every photograph is … also a mini-archive within the archive” (476), which, when a photo is housed in a large, systematized, organized collection of other photographs, strikes me as very true. The process of “capture” in terms of photography and artistic process versus pure documentation is described in Douglas Crimp’s “The Museum’s Old, the Library’s New Subject” when comparing the works of Picasso and Ansel Adams in the context of the Museum of Modern Art’s fiftieth anniversary. The conversation of “taking a picture” versus “making a picture” (Crimp, 71) inspired me to contemplate when the photograph becomes a data element versus a constructed artwork, or when it can serve as both. This is illustrated by the anecdote Crimp recounts of coming across Ed Ruscha’s photography artbook Twentysix Gasoline Stations in the transportation section of the NYPL. Crimp concludes that “the fact is there is nowhere for [this book] within the present system of classification” (78), which is answered by Anna-Sophie Springer, who follows from Crimp in her article “Original Sun Pictures: Institutionalization” by stating that Twentysix Gasoline Stations should probably be shelved in the Photography section of the library, as “photography has advanced into a proper, canonical genre” (Springer, 123).

Moving on to the Diana Kamin article. She illustrates right out of the gate how we are currently drowning in an unorganized mass of digital imagery — “In the year 2015 alone, more photographs were taken than in the history of analog photography combined,” and “1.8 billion pictures are uploaded to social media (our contemporary universal image archive) every day” (Kamin, 311) — and the ongoing challenge of classifying and categorizing photographic imagery. She frames this problem by surveying the approaches to image organization undertaken by two librarians in the mid-20th century: Bernard Karpel, a librarian at MoMA, and Romana Javitz, who headed the Picture Collection at the NYPL.

Without re-hashing every point of the article here, Kamin categorizes Karpel’s proposed system of image classification as a model based in the “discourse of affinities,” and Javitz’s user-centric approach as housed within the “discourse of the document.” To briefly recap:

Karpel’s insane proposal for a visual system of classification, which he hoped would become as ubiquitous as the Dewey Decimal system, was based on “an ‘objective’ eye that sees [the visual object] without the clouding of context to evaluate only form.” Kamin notes that this perspective was probably bolstered by Karpel’s “long career at MoMA, where aesthetic formalism was the dominant methodological approach for analyzing works of art.” (Practically speaking, this proposed visual system sounds like a complete mess to me.) Javitz rightly argued that it would discount a broad section of the general public who did not have the specialized knowledge required to communicate inquiries into any collection of objects. Karpel proposed a basis of “aesthetic evaluatives (qualities such as tactility, transparency, and multiplicity) [that] would be arranged along the axes of his primary cataloguing tools: ‘affinity’, ‘polarity’, and ‘sequence’, which would address, respectively, formal consonance, dissonance, or relationality between two or more pictures” (314). Understandably, this very complicated system involved duplicate cataloging cards and is totally impenetrable for a common user.

Javitz’s method for image classification was just the opposite. Kamin describes how Javitz was vehemently opposed to hierarchical structures of organization and classification, as illustrated by her avoidance of subheadings wherever possible: whereas previously “Lakes would be found under ‘F – Forms of Land and Water – Lakes,’” Javitz deemed that Lakes should appear under ‘L.’

It is not hard to understand how this common-sense, user-focused approach to the classification of images won out over the very tricky visual basis of Karpel’s system of affinities. Javitz advocated that photographs be considered as documents first and classified as such. However, in our current technological moment there remains a lot of space between these two arguments. When trying to parse the surely two-billion-plus images now being uploaded every day, machine-enabled automatic indexing of images is a problem that many are working to solve, and Javitz’s and Karpel’s approaches are both being applied. Kamin notes: “Javitz’s discourse of the document is most frequently encountered in the keywording dominant on the internet (a system in which any image can be tagged with multiple identifiers, or in natural language, by its uploader), while the discourse of affinities is manifest in discussions around pattern recognition and machine vision.” (329)
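As a toy illustration of the keywording side (the “discourse of the document”), an uploader-driven tag index might look like the minimal Python sketch below; the image names and tags are invented for illustration:

```python
from collections import defaultdict

# keyword -> set of images carrying that tag (an inverted index)
index: dict[str, set[str]] = defaultdict(set)

def tag(image: str, *keywords: str) -> None:
    """Record uploader-supplied keywords for an image, in natural language."""
    for kw in keywords:
        index[kw.lower()].add(image)

tag("IMG_0001.jpg", "Lake", "mountain", "sunrise")
tag("IMG_0002.jpg", "black cat", "folklore", "Halloween")

print(index["lake"])      # {'IMG_0001.jpg'}
print(index["folklore"])  # {'IMG_0002.jpg'}
```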

Both of these approaches are visible in a project by Google that I have found myself thinking a lot about during the course of this semester. The Google Image Labeler was an interesting space on the internet while initially active between 2006 and 2011, and it has been back since 2016 in a new form that remains relevant to the conversation of indexing and classifying images.

In its original form, the Image Labeler was set up as a game between two users who would be automatically partnered by the software. A time limit would be set and a series of images would appear on the screen. Each ‘player’ would then submit text describing the contents of the image, and points would accrue when the two players agreed on terms. There were no prizes! However, this was an active space online, and it was effective in improving Google’s image search function. This was the company’s clever way to ‘gamify’ the very manual approach to keywording and indexing images that robust search would require.

The Image Labeler was shut down in 2011 — among other issues, abuse between players had become pervasive, with users spamming the game with words like ‘abrasives,’ ‘entrepreneurialism,’ and ‘forbearance,’ among others — and it reemerged in a different form in 2016. To this day you can log in to the Labeler and lend a human-based approach to visual indexing. The images that confront users today are confusing and not straightforward to answer. (How do I see fog in a photograph?) Machine learning, and approaches based on similar visual forms of the kind Karpel envisioned, have come a long way in the intervening years, but the current Labeler gives some insight into what types of images the machines are still struggling to classify, and the categories of images that users are most interested in.

Image classification via keywording remains an important and valuable practice in the broad world of stock photography. Stock photo collections are larger and more prevalent than ever. These images surround us in the world, and success for the end user (usually commercial) depends on choosing the right one. In “Re-use Value” from Cabinet Magazine, Jenny Tobias highlights the importance of keywords in the stock image marketplace. Referring to the image at the top of the article, Tobias writes:

According to the original caption, it is also a Portrait of Otto Bettmann—About 2 Years of Age, With an Umbrella.

Little Otto in his skirts could not have known that he would start collecting images as a teenager, earn a Ph.D. at twenty-five, begin compiling a picture history of civilization while curator of rare books in the Prussian State Art Library in Berlin, flee Nazism to the United States in 1935, and shortly thereafter found the Bettmann Archive, a major purveyor of stock photography. Nor did he know that sixty years after its founding, the Archive would be acquired by Microsoft’s Bill Gates for his Corbis image archive, shipped from lower Manhattan to a climate-controlled Pennsylvania mountain for preservation and digitization, and then redistributed over the Internet, where little Otto can be found today, exactly a century after the photographer snapped the shutter.

The article notes that the only transaction on this photo of Otto was for personal use, not commercial — but the keywords listed go well beyond what is shown in the actual image, and such is the nature of keywording stock images.

As Vestberg writes in “Ordering, Searching, Finding” about the user experience of navigating the iconographic digital databases of the Warburg and Conway Libraries, keywording is key for researchers and stock hunters alike — and there are challenges: “Since the keywording system for digital files closely follows the iconographic categories of the analogue files, there are a number of elements in any image that may not be included in the metadata because they have not, for whatever reason, been considered iconographically significant in the process of keywording” (Vestberg, 477).

In the case of the stock photo of Otto Bettmann, it seems impossible to feed this image to a computer and have the machine know on its own to tag it with “Prominent persons.” The central difficulty here is summed up neatly by Vestberg: “accounting for what a picture shows is never the same as describing what it depicts” (478).

I was inspired by this last article and its elucidated challenges of searching these archives for images of “arrows” in order to find representations of “Saint Sebastian,” and vice versa. I decided to explore the Warburg, the Conway, and the Endangered Archives Programme (just for fun) with the search term (keyword) “Magic” and compare results. (I chose Magic because I love the Warburg system of classification a lot, particularly the subsection of “Magic and Science.”) My findings were varied. The Warburg iconographic database returned 696 results that contained ‘magic’ in the metadata, but there were a lot of duplicates. The Conway Library surprisingly had only 23 results, and the EAP had just one more than the Warburg, with 697 (also containing duplicates). I have prepared the first ~20 results from each of these for you to see, here. I find the visual artifacts from the Warburg to be the most aesthetically satisfying, though this journey of searching and finding through the varied organizational structures of these photo collections was more meaningful than the destination of visual results.

 

 

References:

John Tagg, “The Archiving Machine; or, The Camera and the Filing Cabinet,” Grey Room 47 (Spring 2012): 24-37.

Douglas Crimp, “The Museum’s Old, The Library’s New Subject” in On the Museum’s Ruins (Cambridge, MA: MIT Press, 1993): 66-83.

Anna-Sophie Springer, “Original Sun Pictures: Institutionalization” in Fantasies of the Library, eds. Anna-Sophie Springer & Etienne Turpin (Berlin: K. Verlag, 2014): 119-31.

Diana Kamin, “Mid-Century Visions, Programmed Affinities: The Enduring Challenges of Image Classification,” Journal of Visual Culture 16:3 (2017): 310-36.

Nina Lager Vestberg, “Ordering, Searching, Finding,” Journal of Visual Culture 12:3 (2013): 472-89.

Allison Meier, “Four Million Images from the World’s Endangered Archives,” Hyperallergic (February 23, 2015).

Wikipedia contributors. (2018, August 7). Google Image Labeler. In Wikipedia, The Free Encyclopedia. Retrieved 20:12, November 20, 2018, from https://en.wikipedia.org/w/index.php?title=Google_Image_Labeler&oldid=853941020

Application Post

This week’s readings are concerned with the challenges of indexing, storing, and accessing photo archives, all of which are complicated by “the constant problem of adequately attending to the different levels of content in any picture” (Vestberg).

The things depicted in a photograph are references to pre-existing objects in the world, making every photograph a “mini-archive within the archive,” according to Vestberg. Yet, she adds, what a photograph shows might have little to do with the thing depicted, or in other words, with what the photo might mean: “Accounting for what a picture shows is never the same as describing what it depicts” (Vestberg).

This complicates not only the work of professional picture archivists but also the photo management of anyone with a camera on hand — which today is virtually everyone. The problems illustrated in the readings made me better appreciate the mess that is my own photo archive, which I want to talk about today. (Apologies for taking my own work as an example! It seemed to fit when I first started writing, but at this point I feel embarrassed.)

I have owned a camera since 2011 and have since accumulated over 15,000 pictures, not counting the thousands more taken with mobile phones. This archive consists mostly of “fine art” photography — not in the sense that it’s “fine”, or ”art”, but in that its focus is aesthetic rather than personal or documentary. I keep my pictures in Adobe Lightroom, a photo management and editing software. By default, Lightroom displays them in a grid reminiscent of a contact sheet from analog times. Pictures are grouped in folders by import date or, if this metadata is available, date taken. These folders — perhaps the digital equivalent of physical vertical files — are listed in the sidebar, like a single filing cabinet slid open.

I have never found a good way to organize my collection in this system. To make a long story short, it’s a mess that I have avoided dealing with. Until two years ago, that is, when I wanted to make a photo book. Figuring that organizing the database myself was a lost cause, and unable to rely on any metadata whatsoever, I decided it would be an interesting experiment to computer-curate the book: to automate all metadata collection and organize the material purely on computable information.

So I used online computer vision services by Google and Microsoft to generate captions and tags for each image. Then, I collected dominant colors by averaging pixel distributions, and applied the Histogram of Oriented Gradients algorithm to quantify image composition. This way I was able to generate about 850 data points for each picture.
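The exact pipeline isn’t spelled out above, but the two locally computable ingredients, dominant color and HOG composition, might look roughly like the following sketch (assumptions: scikit-image, an RGB input, and a function name of my own; the vision-API captions and tags that supplied most of the ~850 data points are omitted):

```python
import numpy as np
from skimage import color, io, transform
from skimage.feature import hog

def image_features(path: str) -> np.ndarray:
    """Flat feature vector from dominant color plus HOG composition."""
    img = io.imread(path)  # assumes an RGB image
    img = transform.resize(img, (128, 128), anti_aliasing=True)

    # "Dominant color" as the mean value per RGB channel; the original
    # project may have used histograms or clustering instead.
    mean_rgb = img.reshape(-1, 3).mean(axis=0)

    # Histogram of Oriented Gradients quantifies composition: where edges
    # point and how strongly, cell by cell across the frame.
    gray = color.rgb2gray(img)
    composition = hog(gray, orientations=8, pixels_per_cell=(32, 32),
                      cells_per_block=(1, 1), feature_vector=True)

    return np.concatenate([mean_rgb, composition])
```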

Without knowing it, I had tapped into what Kamin calls the “discourse of affinities”: processing and organizing images solely by their formal content rather than their documentary value. At the Minneapolis College of Art and Design in the 1960s (a project preceded by Warburg’s Mnemosyne Atlas in the 1920s and Malraux’s 1949 Musée Imaginaire), former MoMA librarian Bernard Karpel sought to “force a reorientation of basic library approaches away from the historical and factual formulas to those that can follow the exploitation of the image in semantic and aesthetic terms.” (Kamin)

Kamin notes that “the discourse of affinities is manifest in discussions around pattern recognition and machine vision” today — the very technology I used to complete my project. Here, the generated tags and captions are purely descriptive, as they have to be derived from machine-readable form. Only sometimes, accidentally, do they transcend into something bigger: when the computer gets things wrong.

Assuming that at this point we are all somewhat familiar with the politics of machine learning and computer vision, I want to point out that these politics become visible in my project, too, and then move on to another aspect that I think is relevant.

One of my frustrations with Adobe Lightroom is the rigid organizing structure that does not at all address the aforementioned “problem of adequately attending to the different levels of content in any picture.” (Vestberg) Generally, it seems that photo storage infrastructure is unable to account for complex interrelations between pictures: In an ideal spatial organizing system you would want a photo of a black cat, for example, to be close to other felines as well as animals in general, but also in the proximity of internet memes, the color black, and folklore.

The vertical filing cabinet in the physical archive, and its digital counterpart in Adobe Lightroom, are both limited in their spatial configuration. There are only so many attributes that can be taken into account simultaneously in adjacent files, folders, and cabinets — in the physical world with its three dimensions. In geometry, however, it doesn’t matter whether you describe a space in three, four, five, six, or more dimensions. So theoretically, my cat picture could be in the animals section in three dimensions but also next to folklore/religion in an additional dimension.

To imagine a fourth dimension, one would have to picture a fourth axis perpendicular to the other three, and to imagine 800 more is unthinkable. Yet, although we can’t see it, I was able to computationally arrange my own photo collection in 850-dimensional space, considering all 850 criteria at once.
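To make the geometric point concrete: distance, and therefore “closeness,” is computed the same way whether vectors have three components or 850. A toy sketch with random stand-in vectors (the real ones would come from the feature extraction above):

```python
import numpy as np

rng = np.random.default_rng(0)
features = rng.random((1000, 850))  # one 850-dimensional vector per picture

def nearest(index: int, k: int = 5) -> np.ndarray:
    """Indices of the k pictures closest to `index` in 850-D Euclidean space."""
    dists = np.linalg.norm(features - features[index], axis=1)
    return np.argsort(dists)[1:k + 1]  # position 0 is the picture itself

print(nearest(42))
```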

If you gloss over the fact that many aspects of a picture’s meaning can’t be derived or represented computationally, a high-dimensional archive possibly enables a superior ordering logic, where all these simultaneous connections become possible and formal as well as documentary attributes can be considered side by side, all at once, if only they can be quantified.

But there is a catch. Rendering this archive visible requires reducing all 850 dimensions again to the two or three we can perceive. To do this, I used a dimensionality reduction algorithm called t-SNE. The algorithm calculates the distances between all pictures in high-dimensional space and arranges them in 2D, trying to achieve a similar distribution.

The result is a map of affinities where pictures that were close together in high-dimensional space are grouped together in 2D as well. Unfortunately, perfect accuracy is impossible. Just as flat maps of our spherical Earth always distort the globe in some way or another (rendering Greenland or the poles huge, tiny, or misshapen), t-SNE can never account for all of the spatial relationships in high-dimensional space at once.
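A minimal sketch of this reduction step, assuming scikit-learn’s t-SNE implementation and the stand-in feature matrix from above (the perplexity value is the library default, not necessarily what the project used):

```python
from sklearn.manifold import TSNE

# features: the (n_pictures, 850) matrix from the sketch above.
# t-SNE converts pairwise distances in 850-D into a 2-D layout that tries
# to keep near neighbors near; results vary with perplexity and random
# seed, which is part of the unavoidable distortion.
coords = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)
print(coords.shape)  # (n_pictures, 2)
```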

Finally, to get the arrangement into book form, I used another algorithm that attempts to compute the shortest path through all points in the arrangement, thereby defining the page order. The sequence is bound as an accordion book, a 95-foot-long journey across categories, formal characteristics, and descriptions simultaneously. It’s a new way to traverse the collection, but it leaves the viewer unable to see the whole picture, where it all made sense.
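The text doesn’t name the path algorithm; an exact shortest path through all points is the NP-hard traveling salesman problem, so a heuristic is likely. A greedy nearest-neighbor version, as one plausible stand-in:

```python
import numpy as np

def greedy_page_order(coords: np.ndarray, start: int = 0) -> list[int]:
    """Visit every point once, always hopping to the nearest unvisited one.
    A cheap approximation of the shortest path through the t-SNE layout."""
    unvisited = set(range(len(coords)))
    order = [start]
    unvisited.remove(start)
    while unvisited:
        here = coords[order[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(coords[i] - here))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# page_order = greedy_page_order(coords)  # coords from the t-SNE step
```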

Slides
https://goo.gl/MTC5W7

Works Cited

Nina Lager Vestberg, “Ordering, Searching, Finding,” Journal of Visual Culture 12:3 (2013): 472-89.

Diana Kamin, “Mid-Century Visions, Programmed Affinities: The Enduring Challenges of Image Classification,” Journal of Visual Culture 16:3 (2017): 310-36.

 

Critlib: The Dilemmas of Meeting Theory and Practice

Critical librarianship, or critlib for short, is a term that has emerged in the last few decades in reference to the application of critical social theory to the practices of librarians, cataloguers, archivists, and others concerned with the storage, classification, and accessibility of knowledge. The term has been popularized across social media, specifically Twitter with #critlib, connecting a community of librarians around a particular set of values they bring to their professions. But regardless of whether one engages with that online community directly, critlib largely represents the process of problematizing librarianship through theoretically informed practice. As a term, critlib encompasses both the critical theory informing it and the varied practices of rethinking the role and organization of the library.

The emergence of critlib was accompanied by valid criticisms of the uses of critical theory; some argue that it is inaccessible, elitist, and too convoluted to provide a framework to work from in the day-to-day profession.[1] Others argue that the library needs to be an objective place and that the role of the librarian is to help students learn “information literacy concepts and how to apply those concepts to their tasks… we are not paid to subscribe to some abstraction about oppressive power structures or to apply our skill sets to an ambiguous and amorphous idea of ‘social change.’”[2] However, this stance fails to recognize the already non-neutral position of the library with regard to how its structure makes certain resources accessible or inaccessible, the implications of its classifications, and the mainstream emphasis on practicality in the library.

Lua Gregory and Shana Higgins look at the history of the organization and expansion of librarianship and its relationship to the rapidly industrializing and competitive characteristics of the Gilded Age (1870-1900) and the Progressive Era (1890-1920). Melvil Dewey, one of the primary founders of the American library system and director of the New York State Library from 1888 to 1905, often used language to describe librarianship indicating that he saw “business practice as the ideal for the organization and practice of librarianship.”[3] The business practices in mind during this time placed emphasis on efficiency (increased mechanization) and reflected the intensifying power of corporate entities.

Even at these early stages of the organization of the library system, however, resistance to the proposed corporate model of the library came from Mary Salome Cutler Fairchild, vice-director of the New York State Library School. Fairchild’s design of the “Reading Seminar” class there “inspired women training to become librarians to think more deeply about the implications of their work for their communities, and the historical and cultural contexts of their work.”[4] Despite many responses to a survey distributed to alumni indicating the value and importance of theoretical and philosophical training in librarianship, Dewey’s Handbook of the New York State Library School focused largely on practical matters and efficiency. Here, theory is cast aside for practice, specifically practice unengaged with looking carefully at the machinations of the corporate model, which standardizes library work into a mechanical process and disconnects the profession from the communities it is meant to serve.

Considering this legacy of commodification in librarianship and the concurrent response of certain librarians arguing for more attention to theoretical work, the modern American library system has been faced with the dilemma of reconciling theory and practice since the early formation of its aims and organization. This dilemma, as Emily Drabinski explains, is to be expected:

“If we understand action and discourse as both produced by and productive of the present, the coincidence of critical and compliance perspectives makes analytic sense. The kairos of contemporary critical approaches is not generic, but emerges from and alongside a kairos of compliance that it contests and resists… Critical perspectives on information literacy instruction represent a reaction against a kairos of compliance.”[5]

Drabinski uses the Greek term kairos here to refer to qualitative time, marrying chronological time with social, political, and historical context to produce a sense of the present.

In 2014, the Framework for Information Literacy for Higher Education (Framework) was offered as a critical alternative to the Information Literacy Competency Standards for Higher Education (Standards), which had been in place since 2000. The Framework aimed to emphasize the importance of local and contextual learning outcomes measured by local and contextual tools, where the Standards provided a general set of performance indicators and data reporting tools. Though the Framework has been lauded for making room for the specificities of community context, providing flexibility in the assessment of a library’s value, its actual implementation has been a complex process fraught with uncertainties. The Standards, with their generalized approach, provided certain tools that librarians made use of to show the importance of the library in its community in order to secure funding for maintenance.[6] Furthermore, as Alison Hicks points out in “Making the Case for a Sociocultural Perspective on Information Literacy,” the Framework, in its effort to provide a contextually based approach, has “positioned all disciplinary thinking as emerging from the same core and overarching information literacy concepts rather than, as is the case with a sociocultural perspective, recognizing the individuality and uniqueness of each discipline.”[7] By vaguely alluding to the importance of community knowing without specifying how to engage with it, the Framework also works to homogenize the value of collective and varied experiences into a hazy catch-all.

Critlib as an engagement with both theory and practice is not to be understood as some ideal harmonious meeting of the two, as it clearly comes with its own dilemmas surrounding implementation and engagement. Rather, critlib enables us to consider the way librarianship has been embedded in these dilemmas in the formation of its foundational structure through to the contemporary processes of rethinking the library. Considering the general concern of librarians with accessibility and engagement, critlib aims to meld the self-reflexive thinking of theory with the implementation of effectual practices responsive to community needs.

 

[1] Karen P. Nicholson and Maura Seale, “Introduction,” in The Politics of Theory and the Practice of Critical Librarianship, ed. Karen P. Nicholson and Maura Seale, Sacramento: Library Juice Press (2017): 8.

[2] Eamon Tewell, “The Practice and Promise of Critical Information Literacy: Academic Librarians’ Involvement in Critical Library Instruction,” College and Research Libraries (2017): 37.

[3] Lua Gregory and Shana Higgins, “In Resistance to a Capitalist Past: Emerging Practices of Critical Librarianship,” in The Politics of Theory and the Practice of Critical Librarianship, ed. Karen P. Nicholson and Maura Seale, Sacramento: Library Juice Press (2017): 26.

[4] Gregory and Higgins, 29.

[5] Emily Drabinski, “A Kairos of the Critical: Teaching Critically in a Time of Compliance,” Communications in Information Literacy, 11(1) 2017: 83.

[6] Drabinski, 85.

[7] Alison Hicks, “Making the Case for a Sociocultural Perspective on Information Literacy,” in The Politics of Theory and the Practice of Critical Librarianship, ed. Karen P. Nicholson and Maura Seale, Sacramento: Library Juice Press (2017): 73.

 

Works Cited

Drabinski, Emily. “A Kairos of the Critical: Teaching Critically in a Time of Compliance.” Communications in Information Literacy, 11(1) 2017: 76-94.

Gregory, Lua and Shana Higgins. “In Resistance to a Capitalist Past: Emerging Practices of Critical Librarianship.” in The Politics of Theory and the Practice of Critical Librarianship. ed. Karen P. Nicholson and Maura Seale, Sacramento: Library Juice Press (2017): 21-38.

Hicks, Alison. “Making the Case for a Sociocultural Perspective on Information Literacy.” in The Politics of Theory and the Practice of Critical Librarianship. ed. Karen P. Nicholson and Maura Seale, Sacramento: Library Juice Press (2017): 70-81.

Nicholson, Karen P. and Maura Seale. “Introduction.” in The Politics of Theory and the Practice of Critical Librarianship. ed. Karen P. Nicholson and Maura Seale, Sacramento: Library Juice Press (2017): 1-18.

Tewell, Eamon. “The Practice and Promise of Critical Information Literacy: Academic Librarians’ Involvement in Critical Library Instruction.” College and Research Libraries (2017).

 

Addressing Archival Injustices at Weeksville Heritage Center

For this presentation, I want to take up Kameelah Janan Rasheed’s question, “Is it okay for things not to exist for future generations?” (41:00). This question was particularly fraught for Janan Rasheed, who grappled with her own “archival impulse,” and it was also situated within a discussion about Blackness and archives: empowering Black people to create, manage, and dictate the terms of their own archives (Jules), protecting Black cultural production from being made legible or knowable through archives (St. Felix), and using illegibility or invisibility as a means of escape from the surveillance of blackness (Browne). For me, this question is central to this week’s readings in that they all, in one way or another, argue that to address the archival injustices that silence, exclude, or overwrite marginalized voices, those voices, those communities must be involved. They must be able to determine which parts of their stories should or should not exist for their own – and others’ – future generations.

For Michelle Caswell, every step of the production and maintenance of archives should reflect this self-determination. Archivists, then, have a set of obligations: “to center those people who have been marginalized in our appraisal decisions moving forward,” to describe records using “the same languages that communities use to describe themselves,” and to take a “survivor-centered approach…which is centering survivors [in] decision processes” about what is done with archived materials, such as digitization (Cole & Griffith, 24). This suggests that community involvement must be situated not at a superficial level, after the fact of collection or preservation, but from the beginning and throughout.

I think Weeksville Heritage Center (WHC), a “multidisciplinary museum” in Brooklyn, is attempting to operate along these lines. Weeksville was founded by James Weeks in 1838 and would become “one of the largest known independent Black communities in pre-Civil War America” (The Legacy Project). This community flourished through the work of African American entrepreneurs and land investors and because of its residents, who were deeply committed to sustaining their independence (5 of July). WHC’s mission is to continue this work, to “document, preserve and interpret the history of free African American communities in Weeksville…and to create and inspire innovative, contemporary uses of African American history through education, the arts, and civic engagement” (What We Do). One of their main programs is The Legacy Project, which is described as “[standing] for the freedom and right to know, document, and defend one’s own history,” with the goal of keeping Weeksville’s “legacy alive and vibrant for future generations” (Legacy Project). WHC hosts Legacy Project events at least monthly. Some recent examples include “Embodying Archives,” wherein participants were invited to explore individual and collective memories carried “genetically, spiritually, and physically” through “performances, discussions, and communal movement,” and “Archives for Black Lives,” a day of intergenerational self-documentation where participants were taught strategies for recording their oral histories and digitizing family photographs.

On November 14, visual artist Elise Peterson will lead a workshop on digital collage. Peterson’s digital collage work incorporates portraits of artists of color into famous Matisse paintings, some of which have been displayed on billboards in the United States and Canada. For The Legacy Project, digital collage is a technique “that offers [the] chance to push the visual boundaries of a design, illustration, or art piece,” but it also seems to enact an intervention, whether in a historical narrative, modes of representation, or public spaces. The Project is a manifestation of Weeksville Heritage Center’s interest in supporting “self-reliance, resourcefulness, transformation, collaboration, celebration, and liberation of Black persons in America.”

Elise Peterson.

Here, WHC seems to align with Caswell’s idea of the work of archives as the work of social justice (Cole & Griffith, 23), as well as her argument, drawing from Geoffrey Yeo’s definition of a record as a “‘persistent representation of human activity that travels across space and time,’” that records need not be material but can also be oral, kinetic, etc. (Cole & Griffith, 23).

By giving Black community members access to archival strategies and technologies, The Legacy Project creates a space of self-determination for Black people within archives, giving them a chance to modify archives by filling silences with newly recorded oral histories, for example, or to create new collaborative archives with newly digitized material. This project is explicit in its orientation to future generations – to keeping these spaces open for them and creating a foundation or toolkit for them to use. Importantly, however, only some of the material generated through The Legacy Project is archived at WHC’s Resource Center for Self-Determination and Freedom. Clips of oral histories recorded at the museum are available to the public through the Center’s digital collections, but The Legacy Project, operating alongside and sometimes with the Resource Center, does not aim solely to generate material for these digital collections, nor to make such material universally accessible; rather, it gives Black communities and families the resources to preserve their materials for their own purposes. Here, it is clear that Black communities are centered in the appraisal process, both in the sense that the archive is made by and for them and in the sense that their lives are not simply objects of knowledge but that they themselves are subjects involved in knowledge production, entitled to keep their records from the archive in the first place.

To return to Kameelah Janan Rasheed’s question – “Is it okay for things not to exist for future generations?” – while The Legacy Project’s primary focus may not be archiving records for everyone everywhere, it does have an interest in preserving material for the future generations of its own community. Preservation and stewardship are highly personal in this context, and though the archives produced through the Project’s programs might hold importance beyond the family, it is up to that family to decide whether and how to share them. In this, I also see Doreen St. Felix’s point that we have to acknowledge and accept that there are things we might never be able to access or understand; we have to recognize that legibility can be oppressive (1:07:39).

Janan Rasheed also asked, “What are the limitations of radical visibility?” but I wonder, what are the limitations of radical invisibility? I am thinking here of Joy Buolamwini’s advocacy for more inclusive code and coding practices, which, for her, would involve modifying existing systems and using more inclusive training sets. After encountering facial recognition software that could recognize her white colleague but not herself, Buolamwini formed the Algorithmic Justice League to combat bias in the design and development of algorithms. Does Buolamwini’s interest in greater inclusivity and visibility imply greater legibility? By contrast, Nabil Hassein has written “against Black inclusion in facial recognition,” arguing “I have no reason to support the development or deployment of technology which makes it easier for the state to recognize and surveil members of my community.”

Is there a productive space between legibility and illegibility? Should there be? In her book Dark Matters: On the Surveillance of Blackness, Simone Browne argued that “prototypical whiteness,” the “cultural logic that informs much of biometric information technology,” becomes meaningful only through “dark matter,” or bodies or body parts that confuse biometric technology such as facial recognition (Browne, 162). While she recognizes that the exclusion of “dark matter” from the design processes of these technologies risks reproducing existing inequalities, she also wonders whether there is some benefit to remaining “unknown” or illegible to them (Browne, 163). She points to the potentiality of this illegibility, saying “then and now, cultural production, expressive acts, and everyday practices offer moments of living with, refusals, and alternatives to routinized, racializing surveillance” (Browne, 82).

Robin Rhode, Pan’s Opticon, 2008.

I want to end with these images from Robin Rhode’s Pan’s Opticon (2008), which Browne uses in her analysis and on the cover of her book. In them, Browne argues, the subject’s “ocular interrogation” of the Panopticon and “the architecture of surveillance – corners, shadows, reflections, and light – [covers] the wall with dark matter. … [He] is not backed into a corner, but facing it, confronting and returning unverified gazes” (Browne, 59). For Browne, this kind of looking, which she refers to as “disruptive staring” and which bell hooks has called “Black looks,” is a political and transformative act with, I think, potential for archives and beyond (Browne, 58). 

Sources

Simone Browne. Dark Matters: On the Surveillance of Blackness. Duke University Press (2015) Print.

Bergis Jules, Simone Browne, Kameelah Janan Rasheed, and Doreen St. Felix, “Failures of Care” Panel, Digital Social Memory: Ethics, Privacy, and Representation in Digital Preservation conference, The New Museum, February 4, 2017 {video} (1:08).

Harrison Cole and Zachary Griffith, “Images, Silences, and the Archival Record: An Interview with Michelle Caswell,” disclosure: A Journal of Social Theory 26 (July 2018): 21-7.

Joy Buolamwini, “How I’m Fighting Bias in Algorithms,” TEDxBeaconStreet, November 2017 {video} (8:45).

Kimberly Christen, “Tribal Archives, Traditional Knowledge, and Local Contexts: Why the ‘s’ Matters,” Western Archives 6:1 (2015): 1-19.

Nabil Hassein, “Against Black Inclusion in Facial Recognition,” Digital Talking Drum. 15 August, 2017. Web.

Radical Empathy at the U.S.-Mexico Border

On October 3, 2013, a boat carrying migrants from Eritrea, Somalia, and Ghana sank off the island of Lampedusa, Italy. After these tragic events, the International Organization for Migration started a project to track and keep a record of migrant deaths. Today, the Missing Migrants Project estimates that around 28,000 migrants around the world have lost their lives since 2014.

The odyssey that these migrants have to endure takes them through unimaginably harsh terrain. Across deserts, jungles, and oceans, thousands go missing. The word “missing” here can apply to multiple scenarios:

  • Missing because they are unable to establish contact with their families, even though they may be alive
  • Missing because they have been detained without access to means of communication
  • Missing because they (or their families) choose not to seek help for fear of deportation
  • Missing because their remains may never be found, properly documented, or identified

One of the most difficult challenges we face in the world is helping the families of these migrants. If we fail to do so, we are condemning thousands of people to oblivion.

During her remarks on the Failures of Care panel, Doreen St. Felix references the essay “Venus in Two Acts” by Saidiya Hartman, which tackles the subject of impossible speech — speech that, in St. Felix’s interpretation, “occurred in history but was never able to have been recorded.” Hartman urges us to think about how we can recover those voices.

This question, framed within the context of missing migrants at the U.S.-Mexico border, takes on new levels of complexity, as it involves multiple local, state, and federal administrative bodies, each actor with its own rules, its own databases, and its own records. How is it possible for these families to navigate this never-ending bureaucratic maze?

In October 2017, the International Committee of the Red Cross published a policy paper enumerating a series of recommendations to facilitate the search for and identification of missing migrants. In the paper, the Red Cross suggests, among many other things, standardizing data collection “for the sole humanitarian purpose of searching for and identifying the missing person,” setting up effective channels of communication that support the families during their search, providing them with access to services, and lifting “any specific and legal barriers” that the families may face in the exercise of their rights.

There is no question that the goal of the Red Cross is a noble one. Reading the policy paper, one can see a clear concern and commitment to respect and defend the families’ dignity. Access to information and records of missing and deceased migrants is fundamental for these families. Nevertheless, the solutions offered by such international organizations more often than not adhere to a rights-based framework. This lens is necessary and it can certainly help, but fulfilling it requires setting in motion a larger set of bureaucratic procedures that involve multiple branches of government. That process can take years and can be drastically altered by specific electoral results or by the funding certain agencies receive to help these families.

Michelle Caswell and Marika Cifor warn us of the shortcomings of such an approach; they state that a rights-based methodology “ignores the realities of more subtle, intangible, and shifting forms of oppression that are also pressing social justice concerns.” One could argue, though, that the forms of oppression certain migrant communities endure in this country are far from subtle.

A clear example happened a month ago during a meeting in Boulder, Colorado, between the Forensic Border Coalition (an organization composed of forensic scientists, scholars, and human rights activists); Paula Wolff, a lawyer representing the FBI; and representatives of the Inter-American Commission on Human Rights. After six years of trying to make this meeting possible, the FBC asked the U.S. Government to grant it access to the National DNA Index System (NDIS) in order to help identify the remains of the migrants who have died or disappeared along the U.S.-Mexico border. Paula Wolff replied that, even though she sympathized with the families, and there was no disagreement on what must be done, “the only issues are working on how it is to be accomplished.”

Not only is the FBI limited by law in what database information can be made public; the law also states that, in order to have access to the NDIS, “all DNA samples submitted to the database must be taken in the presence of law enforcement.” Yet the majority of these families do not want to approach the authorities, because of their own legal status and for fear of privacy violations and surveillance. Unfortunately, given the harsh conditions of the desert, a DNA sample is often the only viable way of identifying a body. This rights-based framing of the problem jeopardizes the safety of this community; we must therefore ask ourselves whether there is an alternative solution that can help us establish dynamics of affective responsibility in order to help these families.

In the early 2000s, forensic anthropologists Bruce Anderson and Robin Reineke started collecting information from the relatives who called the Pima County Medical Examiner’s Office in Tucson, Arizona. This was not part of their job description, but they realized it was something important that they had to do. Reineke joined the office as part of the research for her dissertation, but she continued with this work. In her words, once they talk with one of these family members, “it’s impossible to not feel responsible to carry on the search.” In 2013 she founded the Colibrí Center, an organization that believes in radical empathy and is creating alternative channels to help these families.

During their first years of operation, Colibrí Center created a large database and helped families in their search, but a year ago, after receiving a grant from the Howard G. Buffett Foundation, they started collecting DNA from family members who needed their help. The organization’s new capabilities brought with them the necessity of developing systems that guarantee the safety and protection of the community they are trying to help. The Center keeps its database private, they do not inquire about the legal status of anyone who works with them, they schedule appointments to collect DNA samples at locations that they do not disclose publicly, the tests are free, and the names of those who are tested are not shared with the police.

To find out whether there is a DNA match, Colibrí works closely with the Pima County Office of the Medical Examiner. The county office has created a DNA database of more than 1,000 cases of unidentified individuals. The DNA samples collected are shared with a private lab, where Colibrí also sends their samples “in the hope of producing blind matches between the unidentified and the families, matches that the medical examiner then confirms. Once someone is identified, Colibrí works to notify the family and to facilitate the next steps in the process.”

Colibrí Center has traveled to multiple states in the U.S. and even to Mexico in order to help these families. They have now created a solid network based on social justice. With their archival work they have become, in the words of Michelle Caswell and Marika Cifor, “record-keepers…caregivers, bound to records creators, subjects, users,” and the community they serve.

Sources:

Michelle Caswell and Marika Cifor, “From Human Rights to Feminist Ethics: Radical Empathy in the Archives,” Archivaria 81 (Spring 2016): 23-43.

Bergis Jules, Simone Browne, Kameelah Janan Rasheed, and Doreen St. Felix, “Failures of Care” Panel, Digital Social Memory: Ethics, Privacy, and Representation in Digital Preservation conference, The New Museum, February 4, 2017 {video} (1:08)

Hay, Andrew. "Group Seeks U.S. DNA to Identify Missing Migrants." Reuters, October 5, 2018. Accessed November 12, 2018. https://www.reuters.com/article/us-usa-immigration-missing/group-seeks-u-s-dna-to-identify-missing-migrants-idUSKCN1MF2QI

International Committee of the Red Cross, “Missing Migrants and their Families”, ICRC. August, 2017. Accessed November 12, 2018. https://www.icrc.org/sites/default/files/document/file_list/missing-migrants-and-their-families.pdf

Colibrí Center, “DNA Program”, Colibrí Center. Accessed November 12, 2018. http://www.colibricenter.org/programa-de-adn/

Colibrí Center, “Colibrí’s Commitment to Protecting Privacy & Security”, Colibrí Center. May 25, 2017 {video} (3:00). Accessed November 12, 2018. https://www.youtube.com/watch?v=1BKkRpm1wbk&t=13s


The Stakes of Geologic Survey

[Image: a United States Geological Survey marker. © 2005 Ron Reznick]

The image above displays a survey marker from the United States Geological Survey. It is part of a network of markers cross-referenced with a database that contains details on the ground composition below. These markers were placed by land surveyors who would walk the land, stake out boundaries, and record whatever environmental data they could find: rock types, soil types, tree species, and so on. Effectively, these surveyors were assembling elaborate inventories of the land's natural resources, anchored to the survey marker as a reference point. This created a knowledge infrastructure that endowed the on-the-ground marking of settler boundaries with scientific universality. Today the survey marker is, of course, an outdated technology in the geosciences. State geologic surveys are now more likely to use remote sensing and GIS, but the logic that arises from the object of the survey stake remains in certain ways.

This practice of collecting, storing, and institutionalizing environmental data is, and has always been, a deeply colonial process. In Victorian Canada, for instance, the project of cataloging plants and minerals through botany and geology was intimately tied to the process of asserting power over the hinterland. The Geological Survey of Canada, established in 1842 as an active scientific arm of British imperialism, was, in fact, largely about the inventorisation of natural resources and the establishment of territory in a way that put these resources under imperial rule (Bélanger 51). In the US, the Public Land Survey System, developed shortly after the Revolutionary War, put in place a cadastral system that prepared the land for colonial settlement. The surveyors staked out rectangular blocks from east to west, dividing the land into one-square-mile sections and townships of thirty-six sections. This prompted the transition from public land to private ownership. In fact, there is something to the very act of staking out that claims ownership and separates inside from outside, mine from yours.

I am interested in the survey marker, or the survey stake, as a media technology: how it stores and communicates information with a bias towards the simultaneous accumulation of data, territory, and its resources, particularly by bringing mineral deposits under corporate or state control. As a media technology, the survey stake affords certain kinds of calculations, perceptions, and epistemologies, and therefore shapes the survey system as a process of collecting and archiving environmental data. The point I want to make is that a criticism of colonial discourse, as Ann Laura Stoler argues in "Colonial Archives and the Arts of Governance," should take into account archiving as a process rather than simply as a universal source from which to extract information. But in order to do so, we must also examine the physical technologies that create the conditions and mechanisms through which certain types of information are selected, stored, and communicated — and the types of information that these techniques cannot register and therefore preclude from the institutionalized, modern scientific knowledge system.


Environmentalism often uses environmental data from geologic surveys as an authoritative source — to raise awareness, foster a sense of urgency, and serve as a call to action — without looking at its processes of production. Of course, reading too much politics into environmental data is a dangerous bet when you want to argue for a strong and widespread environmentalism, particularly in today's political landscape, haunted as it is by rumors of post-truth politics. But the archive of environmental data, like any other, should be considered for the ways in which it collects, stores, and communicates information — for what it isolates and what it emphasizes — and it should be negotiated with its colonial histories in view. In Stoler's words, "scholars need to move from archive as source to archive as subject" — that is, attend to the ways in which documentation happens rather than extract from it uninhibitedly (90), because the colonial archive's rules and mechanisms for selecting which documents are archived were created to reproduce a certain power discourse, or to reproduce certain imaginaries (97). Similarly, through a process of quantification, the survey technology isolates the lucrative elements of the land as resources to bring them under biopolitical control. We should be suspicious of the environmental archive if the epistemological apparatus of state geologic surveys was fashioned to produce nature as a standing reserve of natural resources.

To be sure, what Foucault calls the historical a priori can also be approached from the discourse's technologies for archiving and measuring. Foucault argues for a concept of the historical a priori of the archive, by which he means the historically sanctioned group of rules that form the conditions for the emergence of new statements in a discourse. The a priori is not imposed on the discourse, he notes; rather, the two are caught up in one another, reinforce each other, and evolve together as a group. In Foucault's words, "[The historical a priori] has to take account of the fact that discourse has not only a meaning or a truth, but a history" (127).

But if discourse has a history, it also has a technology. As we saw in the week on intellectual furnishings and containers, the materiality of the container shapes the way in which we understand the document, and what can be collected and stored in the first place. The location and practical workings of the system that creates the conditions for what Foucault calls statement-events, or discursive acts — in short, the system of enunciability — remain somewhat up in the air in Foucault's text, while I would argue they are in fact located in the physical media with which we collect, store, and communicate. Indeed, what is often grouped together as German media theory calls this the "technical a priori".

Articulated initially by Friedrich Kittler, the technical a priori is the idea that the past is recorded on a prior level, that is, through the physical media of documentation. This means that the technologies of measuring and recording are the "true first archaeologists of knowledge" and, as such, are also historical documents themselves that can yield an understanding of the ways in which knowledge has emerged through history. Kittler's argument did not necessarily run counter to Foucault's; rather, the technical a priori was, in Bernhard Siegert's words, "an attempt to overcome French theory's fixation on discourse by turning discourse from its philosophical head onto its historical and technological feet" (3). This invites an evaluation of the affordances that technology yields for understanding, and a more concrete way of thinking about how power is inscribed in our daily knowledge infrastructures.

We don’t see a survey stake every day, but all reporting on climate change holds data gained through a survey technology of some sort. I am arguing that we should be conscious of the types of understanding of environment that these technologies afford, what their techniques of representation emphasize and what they isolate. It becomes urgent to find techniques and technologies of ordering environmental and geologic data that attend to the ways in which environments are caught up in markets, politics, and socialites. So my question is, if these technologies shape what can be stored in the first place, and the broader ways in which we understand the document, should the survey, in so far as it isolates the environmental processes from its political and economic histories, be the universal scientific archive that defines the rules of environmental discourse?

Works cited
Foucault, Michel. "The Historical a priori and the Archive." The Archaeology of Knowledge. Pantheon, 1972, pp. 126-131.

Siegert, Bernhard. "Cultural Techniques, or, The End of the Intellectual Postwar in German Media Theory." Cultural Techniques: Grids, Filters, Doors, and Other Articulations of the Real. Translated by Geoffrey Winthrop-Young. Fordham University Press, 2015, pp. 1-18.

Stoler, Ann Laura. "Colonial Archives and the Arts of Governance." Archival Science, vol. 2, no. 1-2, 2002, pp. 87-109.

Contentious Archives: The Afghan Films Archive and the Israeli Archive of Executions

Ann Stoler writes that "to understand an archive, one needs to understand the institutions that it served" (Stoler 2002, 107). Once we understand that archives are linked to, or part of, a larger institution or state power, it becomes clear that they are not simply stores of histories holding information deemed important and valuable, but part of a larger web of power and bureaucracy. While archives do play an important role in preserving information, they are also responsible for how this information is positioned in the context of history. In "Colonial Archives and the Arts of Governance," Stoler describes the link between archives and state power as mutually dependent. An ethnographic approach to studying archives makes the relationship between archives, politics, and forms of governance visible, showing that the organisation of, access to, and information within the archive always have political significance. Using Ariella Azoulay's article "Archive" and Mariam Ghani's long-term research project "What We Left Unfinished," I will look at the influential role archives have in knowledge creation, but also at their role in forming, sustaining, and asserting political and state power. Because the archive often acts as an extension of the bureaucratic state, decisions about what gets archived, and how, are telling of the political issues and contestations of the times. Through these two examples, it can be seen that moving information into, and around, the archive is not passive or arbitrary but often a protective and/or violent means of controlling information and its circulation.

In the article "Archive," Ariella Azoulay details the creation and leak of over two thousand classified documents from the Israel Defense Forces (IDF). The documents were leaked by Anat Kam, an Israeli citizen who copied them while carrying out her "compulsory military service" (Azoulay 3). These classified documents detailed the targeted killing of three Palestinians, revealing that the IDF had violated a ruling by the Israeli Supreme Court. They also revealed information about other Palestinians the state had executed, as well as plans and instructions for operations in the West Bank. Following the end of her mandatory service, Kam leaked the documents to a journalist, who published the findings; this then resulted in her arrest. This is an instance in which the state's utilisation of the archive and the process of archiving documented violence but was also a violent act in itself. By keeping this material in an archive, the state intended to take it out of circulation while, through the mechanisms of the archive, allowing the killings to remain documented and preserved but inaccessible and hidden from public view. Azoulay has dubbed this collection of documents the "Israeli Archive of Executions." Her article analyses the ways the bureaucracy of the archive creates a physical and conceptual distance between the visitor to the archive and the information held within. She says,

“If we follow the footsteps of those entering the archive, we shall discover that the way to file any document in it, let alone search for a document, is lined with a rich constellation of accessories and mechanisms that in themselves already serve as sentries.” (Azoulay 2)

This constellation of obstacles inherent in navigating the archive ensures that, without some prior knowledge of what is within, certain things will remain secret and inaccessible. The process of archivisation can therefore be used as a political tool: it allows information to be documented and recorded, but buried within the archive and relegated to the past. Stoler writes that "colonial archives were both sites of the imaginary and institutions that fashioned histories as they concealed, revealed, and reproduced the power of the state" (Stoler 2002, 97). The archive acts as the metaphorical and literal manifestation of the state's political interests, and in the case of the Israeli Archive of Executions, the archive was used to hide violations of court rulings as well as war crimes committed by IDF officials.

Although the presence and existence of these documents have been made known, their content has largely remained out of reach. The function of the archive, to preserve and protect information, was used for violent ends. The same mechanisms that allow violence to occur and be hidden, however, are also what protect documents and information from destruction.

Mariam Ghani's project "What We Left Unfinished" is a long-term, multimedia research project that aims to reconstruct political narratives and aspirations that were left unfinished and abandoned, using five unfinished Afghan feature films shot between 1978 and 1992 but never edited. These unfinished films reveal the important issues and tensions that existed in various political situations. Ghani explains that through a reconstruction of these unfinished films, "we can reconstruct not the truths, precisely, of how the state existed and acted in those moments, but rather its most important fictions: its desires and fears, ambitions and ghosts. In the imaginary presented by most finished films of the period, we see the ideal People's Democratic Republic that could have been, but wasn't; in the unfinished films, the reality – a utopian project secured by violent force – lingers like a shadow, just barely concealed behind allegories and codes." (mariamghani.com)

Ghani uses fictional films to investigate the political climate during different eras and regimes because it is through this medium that ideals and utopian imaginations of the future and of politics were able to be expressed and explored. It is also for this reason that the archive was a target for destruction by political forces. Despite merely being a projection of an ideal, the archive was a threat to state power.

In order to protect these films from destruction, a collection of negatives was hidden behind a brick wall between 1996 and 2002 (Ghani 48). The wall was covered by a poster of Mullah Omar, the Taliban's leader, which successfully prevented the material's discovery during a time of political turmoil under the Taliban regime. Here, Azoulay's conceptual understanding of the archive as something able to hide material becomes literal and physical. "In some ways, the whole archive was temporarily filed in the invisible dusty drawer, and only very gradually did it emerge from this position of retreat over the subsequent decade (2002-12)" (Ghani 45). Many of the films that were not hidden were burned and destroyed. It is therefore interesting to see how the physical conditions and violence towards the tangible materials have caused a conceptual and ideological shift in the archive: what was hidden for years is now understood to be the heart, the central element, of these archives. The identity and holdings of the archive were directly influenced by the political climate. Returning to Stoler's point that archives are both sites of the imaginary and institutions that determined and shaped history, the Afghan Films Archive was a target for destruction because it held films that depicted and represented alternative political futures and forms of governance.

Both of these examples are of archives containing "contested knowledge": the materials they hold were seen as highly valuable and significant because they were products and reflections of the state and, if made public, would threaten its power (Stoler 87). The classified IDF documents and the Afghan Films Archive were targets of violence because of their potential to cause political and social disruption. These archives utilised the system and process of archiving either to engage in violence or to protect against it, revealing the highly charged nature of their holdings.


Sources

Azoulay, Ariella. "Archive." Political Concepts: A Critical Lexicon, issue 1. politicalconcepts.org.

Corallo, Regina. “The Human Dimension of Archives.” SCOPE. November 29, 2015. Accessed November 06, 2018. http://www.scope-mag.com/2015/11/human_dimension_archives/.

Ghani, Mariam. "'What We Left Unfinished': The Artist and the Archive." Dissonant Archives: Contemporary Visual Culture and Contested Narratives in the Middle East, edited by Anthony Downey. I.B. Tauris & Co Ltd, New York, 2015.

“Mariam Ghani Screens Films from the Afghan Film Archive as Well as Her Own Unfinished Film.” Experimental Media and Performing Arts Center (EMPAC). January 11, 2017. Accessed November 06, 2018. http://empac.rpi.edu/events/2017/spring/watering-flowers/mariam-ghani.

Silverstein, Richard. “Kamm Agrees to Plea Bargain, Israel’s Assange Gets Nine-Year Sentence.” Eurasia Review. February 07, 2011. Accessed November 06, 2018. http://www.eurasiareview.com/07022011-kamm-agrees-to-plea-bargain-israel’s-assange-gets-nine-year-sentence/.

Stoler, Ann Laura. “Colonial Archives and the Arts of Governance,” Archival Science, 2:1-2 (2002): 87-109.

Luvitch, Vered. "Kam: History Forgives Those Who Expose War Crimes." Ynetnews. December 4, 2010. Accessed November 06, 2018. https://www.ynetnews.com/articles/0,7340,L-3874912,00.html.

“What We Left Unfinished (in Progress).” Mariamghani.com. February 08, 2018. Accessed November 06, 2018. http://www.mariamghani.com/work/366.

The Good Life: the utopian “no place” of email archives

Susan Breakell makes the point that "to archive" was not originally used as a verb; the word became one around the same time as the entrance of the PC into our homes and lives. We use the word both to mean storing records and storing electronic information that we no longer regularly use. Zielinski highlights that "the archive serves to organize mental and enforced orders in the shape of appropriate structure and to preserve, with a tremendous amount of effort, the memory of past orders." And from Mattern we see that archives demonstrate the interconnected technological, social, intellectual, and architectural infrastructures they require. This embodiment is entwined with certain politics and epistemologies, and it takes place in large part through aesthetics.

"The Good Life" is a project by artists Tega Brain and Sam Lavigne, an archival performance artwork that positions your email inbox as the stage. It is based on a portion of the emails sent between Enron employees from the late 1990s to the early 2000s. This large-scale archive of emails was the first of its kind and was the training database for many early natural language processing (NLP) algorithms – including most current spam filters and early versions of Siri. By allowing your inbox to be hijacked for a period of your choosing (between 7 and 28 years), you too can embody "The Good Life" of white-collar, mainly white, mainly male corporate workers (and some criminals) through the language-based architectures of late-90s corporate culture. In doing so, we can all explore the enduring nature and wide usage of digital archives, "the impulse to archive" against "the right to be forgotten," the inescapability of bias in training data sets, and the aesthetic of emails, the poetry, and the "rational" world order of this corporate elite.

Enron started out as an energy company. Based in Houston, Texas, it was named "America's most innovative company" six years in a row. It employed 20,000 people, and in 2000, the year before it collapsed, it claimed revenues of $101 billion. It embodied a vision of American corporate success, constantly scaling and growing, moving from energy into creating new financial instruments, from trading to investments in broadband. Right before its collapse, it was in partnership with Blockbuster to stream movies online – it could have been Netflix. In 2001 its stock price collapsed, and in the fallout the company and its executives were found to have been involved in price fixing, misrepresentation of earnings, institutionalized accounting fraud, and generally corrupt business practice. When it declared bankruptcy, it was the largest bankruptcy in American history.

As a consequence, the Federal Energy Regulatory Commission (FERC) acquired the company's data, including the massive archive of emails that had been sent to, from, and between employees – 1.6 million emails in total. After complaints, some of these emails were removed from the archive. We can consider this a form of selecting and curating the archive, though as Breakell notes, "any selection process is problematic." One hundred employees were given 10 days to search through and remove personal emails (of their coworkers, their friends, their family members, their children). These workers were told to search for terms like "social security number," "credit card number," and "divorce." However, since you can still find emails sent between divorcing spouses and flirting coworkers through The Good Life's database, it's clear many of these searches were not particularly effective. The resulting archive of 500,000 emails was the first large-scale archive of its kind to be made publicly available. It is still one of the only large public email collections that is easily and freely accessible online.

As Hal Foster writes, "no place" is the literal meaning of "utopia." The project's name, "The Good Life," speaks to Foster's and Breakell's point: in the "no place" of the archive, we see the archival impulse go further, and we can imagine "possible scenarios of alternative social relations." To fully experience "The Good Life," you can opt in to have a slightly reduced version of the archive delivered to your own inbox. You can have 225,000 emails in total sent to you in their original order, with time-spacing scaled to match how they were originally sent. Originally the project offered the option of receiving the emails over 5 days, 30 days, or 1 year, but these tiers had to be canceled because the emails kept getting blacklisted as spam. Given that modern spam filters were originally built off this very database, this seems ironic. Now, your options are to sign up to receive the emails every day for 7, 14, or 28 years.
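To make the replay mechanics concrete, here is a minimal sketch, in Python, of one way such time-stretched redelivery could work: the original send times are mapped onto a longer timeline while their relative spacing is preserved. The project's actual code is not public, so the function and values here are purely illustrative.

    from datetime import datetime, timedelta

    def reschedule(send_times, new_start, target_years):
        """Map original send times onto a stretched timeline,
        preserving their relative spacing."""
        t0 = min(send_times)
        original_span = (max(send_times) - t0).total_seconds()
        target_span = timedelta(days=365.25 * target_years).total_seconds()
        scale = target_span / original_span
        return [new_start + timedelta(seconds=(t - t0).total_seconds() * scale)
                for t in send_times]

    # Three emails spanning two years, replayed over 28 years:
    originals = [datetime(1999, 5, 1), datetime(1999, 5, 2), datetime(2001, 5, 1)]
    for t in reschedule(originals, datetime(2018, 11, 12), 28):
        print(t)

In this toy example, an email sent one day after the first one in 1999 arrives roughly two weeks into a 28-year replay; it is also easy to see why the faster 5-day and 30-day tiers delivered emails densely enough to trip spam filters.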

Beyond an examination of the banality and volume of email even in its earliest usage, the project brings into play a much deeper critical commentary on contemporary digital archives, perhaps especially unintentional ones. First, we can consider the political, social, and cultural architecture of an archive, the importance of archives, and the enduring legacy of this particular database. Finn Brunton notes that "the FERC had unintentionally produced a remarkable object: the public and private mailing activities of 158 people in the upper echelons of a major corporation, frozen in place like the ruins of Pompeii for future researchers." As the first of its kind, the corpus has been used to train spam filters, email recognition technologies like the prioritization rules in your inbox, fraud detection, counterterrorism operations, and models of workplace behavioral patterns. The hegemonic ordering of an archive that Zielinski writes about is very much alive and enduring. There is a good chance that at least something on your phone is running software that used this archive as its training database.
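Since the corpus's role in training spam filters comes up repeatedly here, it may help to see how such training works in principle. This is a minimal sketch using scikit-learn's naive Bayes classifier; the toy emails and labels are invented, and production filters are of course far more elaborate.

    # Toy spam-filter training in the spirit of early Enron-corpus work.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    emails = [
        "Please review the attached contract before the Friday meeting.",
        "Forwarding the gas trading desk numbers for Q3.",
        "WIN A FREE VACATION!!! Click now to claim your prize.",
        "Lowest prices on medications, no prescription needed!!!",
    ]
    labels = [0, 0, 1, 1]  # 0 = legitimate ("ham"), 1 = spam

    vectorizer = CountVectorizer()        # bag-of-words counts
    X = vectorizer.fit_transform(emails)
    classifier = MultinomialNB().fit(X, labels)

    test = vectorizer.transform(["Claim your free prize now"])
    print(classifier.predict(test))       # -> [1], i.e. spam

The point of the sketch is that the classifier learns whatever word statistics the training corpus happens to contain, which is exactly why the demographics of the Enron authors matter, as discussed below.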

It matters, then, that the users from whom this archive was generated were a particularly narrow group of people. The archive was used to build NLP algorithms because it was assumed to be representative of how people use email. But algorithms are only as good as the data provided, even or perhaps especially at large scale. As we discussed last week, biased inputs can generate and embed biased outputs in both allocation and representation. What cause for concern does it give us that so much of the epistemic scaffolding of our current information management systems is built off the corporate (and at least somewhat corrupt) working elite of the 1990s and early 2000s? On the other hand, as artist Mimi Onuoha has pointed out, many of our current datasets are built off the personal data of those who have no choice, or limited choice, but to sign away their data, typically the structurally disadvantaged. This archive then offers a rare view into a group of users normally afforded more "privacy" than most people.

However, it is clear there was still a personal cost. The scrubbing of the archive did not clear out, for example, a named husband and wife emailing each other as their divorce proceeded. Employees may not have been aware in the 1990s that their emails would ever resurface, least of all for public perusal. Corporate practice has likely changed since then, with increased awareness of the permanence of email – the concept of "huddling" in corporate culture today means taking something offline, communicating without leaving a digital trace. And even though we all know on an abstract level that email is not private, most of us today would still be deeply uncomfortable with our emails being publicly available in a searchable format with our names attached. While this email database deeply embodies the archival impulse, it also speaks to the right to be forgotten.

Though we might ask if that is realistic. We are now all contributing to digital archives many orders of magnitude larger than the 500,000 emails of the Enron database. Every email, click, like, hover over a link, and many other forms of our digital footprints are now collected by the biggest (and some not so big) corporate players in the world. What machines are ultimately being trained off datasets produced by our digital labors, and what implications does this have for both material and immaterial orders? Is anarchive even possible in this terrain? The artists suggest that by rendering your inbox into a timewarp between 1998 and the present day, you subvert your email provider's algorithms' ability to make accurate sense of your data. Their "service obfuscates your personal emails, and it breaks the machine learning's algorithms for understanding you." They add: the real benefit is that it also makes it impossible for you to use your email.

Though there is a strong case for examining the material infrastructure required to enable email technologies, for most people emails appear through largely immaterial means. And yet they clearly operate at an aesthetic level too. The Good Life's commitment to replicating the Enron employees' experience is achieved through the Windows 95 interface. And while we might imagine email as standardized communication, the variation in content is analogous to Zielinski's write-up of VALIE EXPORT's work: formally similar frames can bring to the forefront the heterogeneity of what is contained, in this case in emails. In teaching machines how to "think" through human language, this archive presents a range of human communications, though this range is limited both by being explicitly written content (which differs greatly from human speech, for example) and by the narrow collection of humans whose "labor" was used to generate it.

Mattern writes of a critical reviewer of an early article who was at pains to point out that highlighting the aesthetic experience might suggest that poetry is devoid of "intellectual or political engagement," and might fail to acknowledge that "poets even think rationally." Given the current political debates about whose speech is considered "rational" and "unemotional," I thought it was telling that artist Constant Dullaart and NYU data scientist Leon Yin created an experiment with Brain and Lavigne's project – a predictive text generator based on the Enron corpus (a sketch of how such a generator can work follows the poem below). When the generator was fed a "poem" (itself found in the Enron database), it emulated the speech patterns of the emails to create this rather poetic response:

 …I put my arms in front of me

The company, that Enron companies,

the service of the company

so the company

so the company seedness.

 

And went to pull her nearer

To the CIO,

The CCPM

Please no California

Thanks company

And the company

So the company.

 

And realized that my new best friend

Business conceding the company

so the company

so the company

so the companies seedness.

 

Was nothing but a mirror

Of the company

So the company.
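For readers curious about the mechanics behind output like this, a word-level Markov chain is the classic technique: the model records which words follow each short word sequence in the corpus, then generates text by repeatedly sampling a plausible next word. The sketch below is purely illustrative; it is not Dullaart and Yin's actual code, and the stand-in corpus is a fragment echoing the poem rather than the real Enron emails.

    import random
    from collections import defaultdict

    def train(text, order=2):
        """Map each n-gram of words to the words seen following it."""
        words = text.split()
        model = defaultdict(list)
        for i in range(len(words) - order):
            model[tuple(words[i:i + order])].append(words[i + order])
        return model

    def generate(model, order=2, length=20):
        """Walk the chain from a random seed, sampling successors."""
        out = list(random.choice(list(model)))
        for _ in range(length):
            successors = model.get(tuple(out[-order:]))
            if not successors:
                break
            out.append(random.choice(successors))
        return " ".join(out)

    corpus = "so the company so the company so the companies seedness and the company"
    print(generate(train(corpus)))

The looping, obsessive repetitions in the generated poem ("so the company / so the company") are characteristic of a low-order model trained on text where a few phrases dominate.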

The Aesthetics of the Digital Archive: Navigating through the Sublime.

Presentation: https://docs.google.com/presentation/d/18E8OmOfcbexfmQ1GjIZrqu2FT0M3gr3ApiEK3oTztXU/edit?usp=sharing

This week's reading focused on art practices that are inspired by the archive — aesthetic strategies that place the epistemological aspect of gathering and ordering documents in the foreground of the artworks. In my presentation, I will concentrate on the digital archive. I will survey different approaches artists take to represent the characteristics of the digital record: its vastness, fluidity, obscurity, and materiality.

I will start with an anecdote from my childhood. When I was eleven or twelve years old, I found out about the Internet through a TV show that advertised its new website. We didn't have an Internet connection at home yet, but I was very curious about it. The obvious thing to do at the time was to start filling a notebook with all the websites I heard of, so that when I did get an Internet connection, I could check them out. By the time my house was connected to the net, this list had long become obsolete. I was frustrated and fascinated by that. Of course, I could fail the same way if I ever tried to write down all the books or music discs I can think of, but the Internet has additional characteristics: it grows exponentially, and to the common eye it is intangible, invisible, and uncatalogued.

Many artists focus on that tension: the parallel, or translation, between the physical and the digital archive; between the human-scale collection and the incommensurable scale; between the classic encyclopedia set and Wikipedia. The artist Michael Mandiberg approaches this subject with the work "Print Wikipedia". In 2015 Mandiberg wrote software that transforms Wikipedia's entire English-language database (as it existed on April 7, 2015) into 7,473 volumes of 700 pages each, making the volumes available to print on demand. Mandiberg's intention to visualize this vast accumulation of knowledge is, for me, beautiful in its failure. Not only did he place just a few printed volumes in the installation, but by the time the exhibition opened, the work was already outdated. To actually transform Wikipedia into tangible words on paper would demand many "crazy" printers spilling out pages in real time as they are edited, filling the room with unordered sheets, as in Christopher Baker's work "Murmur Study", where he prints live tweets, or Jason Huff's work "Endless Opportunities," which I will show shortly.
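As a rough illustration of the pagination such software has to perform, here is a toy sketch in Python. The 700-page volume size comes from the piece itself; the lines-per-page figure and the input format are arbitrary assumptions for the sake of the example.

    # Toy pagination in the spirit of Print Wikipedia (illustrative only).
    LINES_PER_PAGE = 40      # assumed page capacity
    PAGES_PER_VOLUME = 700   # volume size cited for the piece

    def paginate(lines):
        """Group a flat stream of text lines into pages, then volumes."""
        pages = [lines[i:i + LINES_PER_PAGE]
                 for i in range(0, len(lines), LINES_PER_PAGE)]
        return [pages[i:i + PAGES_PER_VOLUME]
                for i in range(0, len(pages), PAGES_PER_VOLUME)]

    volumes = paginate(["line %d" % n for n in range(100_000)])
    print(len(volumes))   # 100,000 lines -> 2,500 pages -> 4 volumes

Even this toy version makes the core problem of the work visible: the input keeps changing while the pagination runs, so any fixed set of volumes is stale the moment it is printed.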

In the same year Ai Weiwei created "An Archive", a collection of 6,830 rice-paper sheets of printed tweets, a physical chronicle of his social media activity from 2003 to 2013. With this gesture Ai also materializes the fleeting and forgettable condition of the digital archive, in this case the personal one we create on social media.

Also focused on the fleeting aspect of the social media archive, Liat Segal created the 2014 sculpture "Confessions Machines", an ultraviolet printer that displays, on a UV-sensitive surface, confessions made as posts on Facebook. With this work Segal points to the immaterial condition of digital files and the short exposure life our words have in a digital ecosystem where they are constantly buried by newer ones.

Similarly, Christopher Baker investigates the personal voice in the large digital archive, in this case on the YouTube platform, in "Hello World! or: How I Learned to Stop Listening and Love the Noise". In this piece, Baker not only exposes the multiplicity of voices but also questions how the format of the archive platform shapes or influences the content itself.

Focusing on the short-lived aspect of the digital archive and the lack of a discernible original, but from the catalogue's point of view, Jason Huff explores in "Endless Opportunities" how image results for a single keyword change over time in search engines. Huff developed an algorithm that searches for changes in image search results and prints them onto paper.

With a similar focus on change, Dganit Elyakim, Batt-Girl, and Eran Hadas created "Wikiland", an interactive installation that approaches with humor the narrative dilemma in the Commons. As Siegfried Zielinski points out, the archive accounts for the externalization of historical consciousness. The archive represents a voice, a political position. In "Wikiland", viewers physically fight in order to change the most disputed Wikipedia articles about Israeli geography, and consequently the political position of the archive.

Although fractured, created by different authors, and dependent on different bodies, digital archives construct narratives. Many artists have focused on the characteristics of these voices. In 2017 Zach Blas created "im here to learn so", a four-channel video installation based on Tay, an artificially intelligent chatbot created by Microsoft in 2016. Tay was supposed to learn from social media posts, but within hours of her release the chatbot became extremely racist and was terminated after a single day of existence. Blas reanimates Tay in his artwork, exposing the concealed voice the social media archive creates.

With a different argument on the same subject, Jon Rafman explores the narrative poetics of the Google Street View archive. Navigating through this endless archive, apparently created by an objective and detached eye, Rafman captures decisive moments, odd moments, relevant to the human eye only. These relics are uncategorized items in the massive archive, and therefore attest to the labor Rafman put into finding them.

Similarly, Clement Valla investigates errors in the Google Street View archive. As Hal Foster describes, artists working with archives often try to make lost or displaced information present. In this case, Valla makes visible the mistakes in a seemingly perfect world of representation.

The archive's narrative is also explored in Natalie Bookchin's work "Mass Ornament". Bookchin selects several YouTube videos of home performers inspired by celebrities and weaves them into a single choreography. On the same note, Penelope Umbrico inquires into the commonplaces of our digital expressions with thematic collections of photographs taken from Flickr.

Thinking of our unconscious digital expressions and the archive they create, Evan Roth’s work “Internet Cache Self Portrait” makes visible the invisible archive we feed with our daily online interactions. Roth printed the massive collection of images saved by his browser. Again, printing is used as a strategy to compare the digital with the physical archive.

Finally, the artists Arvida Byström, Molly Soda, and Chris Kraus also bring forward the invisible, in this case the censored. Inquiring into how censorship policing shapes digital archive narratives, the artists published a book of images censored by Instagram. The book presents a parallel narrative, and by publishing it physically instead of online, the artists made further censorship more difficult to implement.

I subtitled this presentation "Navigating through the Sublime". The concept of the "Sublime" refers to a greatness beyond all possibility of calculation, measurement, or imitation. It has long been explored in aesthetics as the pleasure of the overwhelming. The "Sublime", however, also represents, as Sianne Ngai argues in Our Aesthetic Categories: Zany, Cute, Interesting, a higher power structure over the viewer. Many artists use multiplicity, vastness, and the impossibility of complete perception as strategies to convey the dimensions of the digital archive, to create an aesthetic experience, but also to question the power relationship between humans and the digital archive.

Artists Countering Colonial Archives

I’m going to do a very brief survey of a few artists that are working around the theme of countering colonial archives in different ways. Some of these artists use actual archival material from existing archives as a starting point for their work, while others create alternative archives of their own to surface narratives that are often obscured or hidden from general discourse. All incorporate collecting as part of their creative practice and explore complicated ideas around cultural heritage, acquisition and repatriation when dealing with archival materials from the “global south” that exist in institutions in the West.

The first artist I’ll talk about is Sameer Farooq. He’s a Canadian interdisciplinary artist who creates what he refers to as speculative museums in order to ‘counter what large institutions are telling citizens to think about their past.’

The particular project I want to show you is from a larger series called The Museum of Found Objects, which are these kind of crowdsourced collections of everyday, mundane objects. This particular Museum of Found Objects was an exhibition at the Art Gallery of Ontario in Toronto, and was a reaction to an exhibition happening in the room next door entitled Maharaja: The Splendour of India's Royal Courts. I want to read a few quotes from the curatorial essay for the exhibition because I think it eloquently illustrates the reason for this collection of everyday objects as a response to, and also a critique of, the Maharaja exhibition:

the narrative of the “maharaja…” exhibition celebrated the opulence of india’s rulers, describing art as a product of royal patronage but also showing how aesthetic power or value was determined by the structures of imperialism… the exhibition as a format for the presentation of this kind of history aestheticises a violent, despotic and traumatic period. moreover, it presents an elite and anglocentric narrative that prompts us to consider how this history is as much Britain’s history as it is india’s.

additionally, presenting the “maharaja…” exhibition in canada in turn makes assumptions about canadian – and specifically indo-canadian – audiences and their relationship to this colonial history. second generation and diasporic indians are often estranged from a critical understanding of this history of india…their relationship to these types of exhibitions, like that of non-south asian visitors, is largely voyeuristic.

… How do museums engage communities in constructing the narratives of history that represent them? How can museum exhibitions exhibit the contingencies of history?

So the idea for The Museum of Found Objects: Toronto was, as Farooq puts it, "to update the colonial exhibition with contemporary, everyday objects from South Asian communities across Toronto."

Through creating an exhibition of contemporary objects from South Asian communities across Toronto, Farooq playfully highlights the irony of the Maharaja exhibition's failure to address Indian and Indo-Canadian audiences. He creates visibility around the disconnect that South Asians and the South Asian diaspora feel towards the representation of their culture and history through the lens of these Western institutions. It's also important to note that by including everyday, mundane objects from these communities, he critiques the Maharaja exhibition's centering of royal and imperialistic narratives as an Anglocentric practice.

I also just think it’s great that they were able to literally exhibit next to the exhibition they were trying to challenge.

Another artist I want to talk about is Maryam Jafri. She's a Pakistani artist who, I think, is now based in New York. Her work is really interesting in that it deals with photographic archives but also touches on contemporary themes around digitization and ownership. I'll walk you through two of her works.

This particular ongoing piece is called Independence Day 1934-1975. It's a collection of over 60 photographs from former European colonies across Africa, Asia, and the Middle East, all taken on the first Independence Day of each country, the kind of twilight period that marks the transition from colony to nation state. These are photographs that she specifically pulled from archives in the respective countries, not from archives in the West. And she arranged them according to semantic themes like celebrations and swearing-in ceremonies. In arranging the photographs across these themes, she makes visible the repetition and near-homogeneity of these independence rituals. Which is quite interesting considering these photographs are from very different places, right? To see such visual similarity in independence rituals across different places in Asia and Africa actually highlights the idea that the nation state in the post-colonial context is very much a European model, and that these are simply iterations of this model around the world. She views these rituals not only as marking the transition from colony to country but as an initiation of these places into the Western definition of what it is to be a nation state.

Another project of Jafri's is Getty vs Ghana. In doing research for the previous project, Maryam became very familiar with many of the independence day photographs held in national archives in different places. So she was shocked to find a lot of the same images online, licensed by big companies like Getty and Corbis. In this work she presents pairs of photographs: one from the national archive and another from Getty or Corbis. What's interesting in the presentation of the two photographs is the strong sense of duality it creates. You could read it as "Global South" vs. the privatized, corporate West, or offline vs. online. Another interesting thing to note is that the digitized photos are often cropped or changed in some way. You can't really see it well in this example, but in some other photographs from the exhibit you can see that the crowds in the background are often cropped out, or cropped so that the center of the image falls on a European figure like the Duchess of Kent rather than on figures from the actual country. It's also funny that although she had purchased rights to a lot of the archival photographs for the previous project, she found she still had to buy the same images from Getty and Corbis. That acquisition process speaks to the theme of the project in itself.

The last artist I want to talk about is, I suppose, an emerging artist named Avani Tanya. She's from India and did a joint residency with the Delfina Foundation and the Victoria & Albert Museum, which I mentioned earlier in connection with Sameer's work. The theme of the residency was Collecting as Practice, and as part of it she had access to a lot of materials from the museum's archives. I'm going to play a video of an interview with her that I think sets up a good context for the work I'll show after.

[Video: https://vimeo.com/245194777 (3:17)]

I wanted to play that not only to set up a premise for the work I'm going to show you now, which is the result of the residency, but also because I think she draws a nice connection between the more contemporary artifact of the Primark jeans and the Jammu and Kashmir shawl. She's thinking about how the value of artifacts in an archive, and the relationships that exist between artifacts, change over time. The work that came out of the residency was a publication entitled A Selective Guide to the V&A's South Asia Collection. The book features artifacts from the South Asia collection, but instead of looking to the museum for information about acquisition and other metadata around the objects, she invited her peers from the UK, India, and Pakistan to share their own interpretations of and responses to the objects. She elevates the subjective response and, in a way, gives agency to contemporary South Asian and diasporic communities by allowing them to (re)present South Asian cultural artifacts from the colonial context.

Sivanesan, Haema. “Annotations to The Museum of Found Objects: Toronto (Maharaja and-).”
“Vimeo.” Vimeo, 13 Nov. 2018, vimeo.com/245194777.

Ordering Logics Presentation

DRAFT!

Much of my interest in my art and design practice is in surveillance and how it is enabled through technology. My research primarily covers the systems and technologies that government agencies, from GCHQ to the NYPD, create, as well as systems and technologies created by corporate institutions and marketed and sold to local and federal governments. These technologies are top secret and/or trade secrets. Moreover, the policies, strategies, and rules of engagement behind them are even more secretive, hidden behind impenetrable secret courts, cries of national security, and private contracts. Through leaked documents and product demos we can glean the priorities, hierarchies, and intent of these technologies.

In 2013, Edward Snowden, then an intelligence contractor, released documents uncovering the extent of the abuse and power the NSA holds. The documents outline several mass surveillance projects that the NSA maintains. One of these technologies is XKEYSCORE, the NSA's "widest-reaching" system (these are the NSA's own words) for developing intelligence from computer networks; the program covers nearly everything a typical user does on the internet, including emails, websites, and Google searches. The XKEYSCORE system continuously collects so much internet data that it can be stored only for 3-5 days at a time. XKEYSCORE was also used to hack into systems, giving the NSA the keys to cell phone communication. Though FISA restricts how the NSA may surveil US citizens, FISA warrants that allow the use of XKEYSCORE may lead to some US-based information being swept up by this wide-reaching, filterless data gathering. As Snowden himself claimed, an analyst could "wiretap anyone, from you or your accountant to a federal judge to even the president, if I had a personal email."

XKEYSCORE's (XKS) strength is its ability to go deep. Because much of the traffic on the internet is anonymous (thanks to the efforts of privacy activists), XKS's dragnet approach allows analysts to pick up on small bits of info to start creating profiles on "targets". As XKS relies on scraping and acquiring telecom signals, it looks at different points of access into the telecom systems [pg. 11]. This primarily includes phone numbers, email addresses, log-ins, and other internet activity. Looking at the slide deck for XKS, we can glean some of the classifications that are of interest and that make someone susceptible to surveillance; they are sprinkled throughout the deck [pg. 14-20]. Something I find interesting is not only data as classification, but ease of access to data as a class [pg. 23]. For me, two major themes in the classification stand out. One is anti-globalist and Islamophobic: this person doesn't belong in this place. The other is an interesting position on internet security and safety: Are you exposed? We'll target you! Do you care about protecting yourself on the Internet? We'll target you!
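To make concrete what targeting by "selectors" means, here is a purely hypothetical sketch of dragnet filtering: traffic records are checked against lists of identifiers of interest. No real selectors, field names, or agency code are represented; everything below is invented for illustration.

    # Hypothetical selector matching (illustrative only).
    SELECTORS = {
        "email": {"target@example.com"},
        "phone": {"+15550100"},
    }

    def matches(record):
        """Flag a traffic record if any of its fields hits a selector list."""
        return any(record.get(field) in values
                   for field, values in SELECTORS.items())

    traffic = [
        {"email": "alice@example.com", "phone": "+15550199"},
        {"email": "target@example.com", "phone": "+15550123"},
    ]
    print([r for r in traffic if matches(r)])  # only the second record matches

The design of such a system explains the themes above: whatever attributes the selectors encode (a language setting, a VPN, an encryption tool) mechanically become categories of suspicion.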

Palantir is a data mining company founded by Peter Thiel, founder and former CEO/"don" of PayPal. Palantir uses data fusion to solve big problems, from national defense to improving medical patient outcomes to supply chain management. Data fusion is the process of taking different sets of data and finding trends across them. This is exemplified in Palantir's efforts to aid law enforcement, currently for the LAPD and, secretly for six years, for the NOPD. Palantir's models utilize datasets that include court filings, licenses, addresses, phone numbers, and social media data. Like other such models, they index a probability for a given target, but instead of indexing the likelihood of buying a product or voting for a candidate, they model the likelihood of committing a crime.

In this so-called "crime forecasting", Palantir used models that treated gun violence as a communicable disease. That is to say: those who were related to, or closely associated with, those who have committed crimes were considered likely to commit a crime too. For those who have already been charged with a crime, the model computes an automated "chronic offender score"; above a certain threshold, the individual is placed on a watch list. The individual is notified that they will be under increased surveillance, to be removed only if they have no interactions with law enforcement officers — a murky situation, since law enforcement officers are now encouraged to scrutinize that citizen. Companies like Palantir allow local law enforcement to bolster its tactics of surveillance. Unlike the NSA, Palantir enables law enforcement to focus its efforts on surveilling certain people in their communities. In the slide deck presenting and pitching their work to other cities, they highlight where Palantir gathers its data: jail calls and phone logs, gang affiliation data, crime data, and social media [pg. 13]. This again perpetuates violence, as they use data that already skews (we know that the criminal justice system disproportionately targets Black and Hispanic people) to find "new" criminals, ones that have not yet been inducted into that system. We often see in the slide decks that "indirect" connections are plotted in the system, glossing over the fact that these are not "indirect" but "systemic" connections, within a system that already favors guilt by association [pg. 16].
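As an illustration of how a threshold-based score mechanically produces a watch list, consider the following hypothetical sketch. The features, weights, and cutoff are entirely invented and do not reflect Palantir's actual model; the point is only the structure of the calculation.

    # Hypothetical "chronic offender"-style scoring (illustrative only).
    WEIGHTS = {
        "prior_arrests": 5,
        "gang_affiliation_flag": 10,  # guilt by association, encoded as a feature
        "police_contacts": 3,         # includes stops generated by the watch list itself
    }
    THRESHOLD = 20

    def score(person):
        return sum(weight * person.get(feature, 0)
                   for feature, weight in WEIGHTS.items())

    person = {"prior_arrests": 2, "gang_affiliation_flag": 1, "police_contacts": 4}
    print(score(person), score(person) >= THRESHOLD)  # 32 True -> watch-listed

Notice the feedback loop even this toy version encodes: being watch-listed invites more police contact, which raises the very score that keeps one watch-listed.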

IBM is one of the oldest tech companies around. The popular IBM PC was released over 37 years ago, in 1981. Since then IBM has grown from a computer company into a computational one and has always been a major player in emerging technologies. The case has been no different with artificial intelligence. Perhaps most famously, IBM developed Watson, a machine-learning, natural-language question-answering system that appeared on Jeopardy!, brutally defeating the show's human champions. Since then Watson has been built out into one of the largest services IBM offers, extending the question-answering features to computer vision, text-to-speech, speech-to-text, and much more. With IBM's rapid growth in the AI sector, it comes as no surprise that they have lent their machine learning prowess to systems of surveillance.

Last month The Intercept, in partnership with the Investigative Fund, reported on software developed by IBM for the NYPD. Through leaked corporate documents, the public is able to see how and what the NYPD is interested in surveilling. Unbeknownst to the public, the system and IBM engineers had used access to the NYPD's CCTV network of over 500 cameras to tag individuals and train the software. The data of citizens is claimed to be kept safe via NDAs and background checks. For IBM, the NYPD was one of its first serious customers in surveillance tech, especially after 9/11. Looking through the slides of the leaked documents, we can see how the NYPD prioritizes and categorizes its citizens in order to find state aggressors. One of the most appalling categorizations is that of skin color. This is reminiscent of IBM's body camera tech, which allowed people to be categorized by "ethnicity" tags such as "Asian," "Black," and "White." For me, what stands out is that the classifications approach the cameras themselves with more nuance than the citizens being classified by skin color. This further reinforces the idea that the camera provides a ground truth, as do the defaults of these annotations [pg. 33-34], such as Light as the default skin color and Black as the default torso color (perhaps more a nod to the default fashion of NYC). Further, I'm curious about the classification field for "Large Amount of Skin in Torso" and the general interest in skin. The NYPD was quick to deny claims that it used these fields, saying it recognized their tendency to profile. But an IBM engineer offers an interesting statement to The Intercept: "A company won't invest in what the customer doesn't want."

Classification as Conceptual Container

Last week's topic, as you might remember, was containers: the boxing and shelving of physical objects and their impact on the material that can be stored and archived. The furniture and boxes not only determine what can be stored but also serve as a reference to what a certain culture considers worthy of being stored and assigned value. And what says more about a culture than which objects it considers valuable and how it preserves them?

In the following short presentation I want to look at a different kind of container: conceptual containers, and more specifically classification systems. I will draw on the introduction to Foucault's The Order of Things (1966), first for a theoretical framing and second to take a closer look at Carl Linnaeus, one of the examples Foucault mentions in his introduction. I want to pose the same questions we had concerning the physical boxes and containers to the system Linnaeus developed. And I'll focus on one example where the system becomes most visible — when it fails.

Foucault starts off writing about Jorge Luis Borges – we heard about him and his Library of Babel last week. Foucault references Borges's own reference to a "certain Chinese encyclopedia". Footnote: there is actually no evidence that this encyclopedia ever existed; most likely it was an invention of Borges himself, who is considered to be the father of magical realism in Latin America. Anyway, in this "Chinese encyclopedia" it is written that "animals are divided into: (a) belonging to the Emperor, (b) embalmed, (c) tame, (d) sucking pigs, (e) sirens, (f) fabulous, (g) stray dogs, (h) included in the present classification, (i) frenzied, (j) innumerable, (k) drawn with a very fine camelhair brush, (l) et cetera, (m) having just broken the water pitcher, (n) that from a long way off look like flies".

Of course Foucault mentions this classification at the beginning of his book because it challenges our own classification system of animals with the "exotic charm of another system of thought", demonstrating "the limitation of our own, the stark impossibility of thinking that." Taking this taxonomy seriously for a moment, I cannot see, from my perception at least, how this classification could be useful. Several classes could easily apply to the same animal, like "belonging to the Emperor" and "having just broken the water pitcher", not to mention "included in the present classification," which would apply to every animal. And what does "et cetera" say about an animal? What does this classification show you when you look through it at animals? What first struck me is that it seems recklessly time-based. A "stray dog" could be "tamed" and would therefore move from (g) to (c). And the "just" in "having just broken the water pitcher" clearly refers to some sort of process and temporal sequence. In my eyes these classes look unstable and far too unspecific.

What Foucault draws from this list is its listness. The animals classified in it are linked only through the alphabetical series, which is in itself completely random, as we probably all read in the Georges Perec piece. In a fully structuralist argument, Foucault draws the conclusion that things can meet only in language, and since thought is fundamentally linked to language, these concepts, taxonomies, and classifications are a learned structure through which humans, at least in the European context Foucault is writing in and about, perceive things.

To explain this he takes up the notion of the tabula, a table or grid “that enables thought to operate upon the entities of our world, to put them in order, to divide them into classes, to group them according to names that designate their similarities and their differences.” It is “the table upon which, since the beginning of time, language has intersected space.” Language is the fundamental classification system; it constitutes itself through difference.

This sounds almost like an inventory of objects, each assigned a specific place in the grid (as libraries do with their books) so that it can be identified and found again. That is quite similar to the project of Carl Linnaeus.

Carl Linnaeus was born in Sweden in 1707 and died in 1778. He was a botanist, physician and zoologist, and above all the founder of the modern taxonomy for botany and zoology, which is still, with some additions, in use today. The taxonomy he first introduced in 1735 in Systema Naturae is a hierarchical system that places an entity on the “tabula” by gathering it into “taxa”: groups whose members share characteristics with one another and differ from other groups. These groups are assigned ranks, which form the hierarchy we all know from our schoolbooks: domain, kingdom, phylum, class, order, family, genus and species. The easiest way to demonstrate this is with an example (see the slide).

Here you can see the classification system and the seven ranks, kingdom down to species, that Linnaeus developed. Above kingdom there would be what I will frankly call the “chaos group”: everything growing on the planet. From there one class is extracted through the characteristics of being organized bodies that live, feel and move spontaneously. That seems a reasonable classification to us, because we can already see how such a class differs from plants and minerals. So the kingdom is Animalia; the phylum is Chordata, all animals with a backbone, which excludes, for example, spiders and jellyfish. You see, we are getting more and more specific. The class is Mammalia, animals with fur and mammary glands, a criterion clearly focused on how reproduction works, which excludes all the birds. I won’t have time to go through every category in detail, but let’s take a quick look at the most specific rank: species. This rank is reserved for a single group of organisms that can reproduce only among themselves. “Canis lupus” is, interestingly, perhaps not the best example, because it can reproduce with “Canis lupus familiaris,” the common dog, which has also been classified as its own species. This shows that the differences that must be emphasized in order to build such a classification system are in some cases much more permeable.

Another great example of the grid failing, and of the academic fights that accompanied Linnaeus’s introduction of his taxonomy, is surely Homo sapiens. How to classify the classifier? But that would open up a much broader field and extend this talk even more!

Just a last quick note on binomial nomenclature, Linnaeus’s naming system. “Canis lupus,” for example, combines the genus, which can be assigned more than once, with the species-specific name, which is exclusively assigned to that particular group of organisms. In this way he established a taxonomy, a classification system and a system of naming all at once, so that every zoologist using the system knew which animal the others were talking about. In this project he in some sense fulfilled the desire for an academic universal language rooted in Latin, an academic Esperanto that prescribes not only how to name organisms but also how to perceive them.
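To see how the ranked hierarchy and the binomial name fit together, here is a minimal sketch in Python; the wolf’s ranks are the commonly cited modern ones, and the representation is mine rather than any formal standard.

```python
from collections import OrderedDict

# The wolf's commonly cited ranks, from most general to most specific.
wolf = OrderedDict([
    ("kingdom", "Animalia"),
    ("phylum",  "Chordata"),
    ("class",   "Mammalia"),
    ("order",   "Carnivora"),
    ("family",  "Canidae"),
    ("genus",   "Canis"),
    ("species", "lupus"),
])

def binomial(taxon):
    """Genus (shareable across species) + species epithet (unique within it)."""
    return f"{taxon['genus']} {taxon['species']}"

print(binomial(wolf))  # -> Canis lupus
```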

This taxonomy, and let’s cite Foucault here again, “enables thought to operate upon the entities of our world”: through this classification humans can operate, position themselves according to it, and see living organisms through that tabula, that grid. But what does this world look like?

Firstly, as is illustrated nicely in the graph I just showed, every species stands next to the others as if in a Euclidean space. Each is assigned a position in the hierarchy, which isolates it from its surroundings. It is no coincidence that such a classification system was invented in the midst of the Enlightenment, at the beginning of the industrial revolution in England and the birth of capitalism. As mentioned above, it is an attempt at an inventory of what is on earth. The word “inventory” comes from the language of business; one takes inventory to check which resources are available and in stock. Perceiving nature as bare resources is tightly connected to a capitalist worldview. Scholars like Jason Moore (Capitalism in the Web of Life) have pursued this thought much further than I can here.

Secondly, the Linnaean taxonomy cannot display interrelations. It papers over the fact that life depends on life: the tree cannot live without the fungus at the tips of its roots, and vice versa. A classification of flowers by their sexual organs makes sense only if one connects it to the insects and their co-evolution. There is more interspecies crisscrossing, exchange and dependency than even Darwin’s later theory of the evolution of species, which grew out of the Linnaean classification system, tried to picture. The tabula structure, with its columns, rows and cells, or the “tree structure” of evolution, tends to support orderly mental models that do not always allow us to appreciate flows and cross-pollinations between those cells and branches. Recent discoveries like the “Wood Wide Web” challenge this orderly thinking in columns and rows. Here we become aware that through this order a tree was seen as a single individual, although it is now clear that to talk of a tree is at the same time to talk about the forest.

Brave Pneumatic World

I have a vivid memory from childhood that has stuck with me: I’m in the back of my mom’s car at the drive-thru bank in our town. As she presses a button, a capsule shoots up a clear tube out of sight. A disembodied, telephonic voice emanates from a speaker, says a few words, and seconds later the capsule glides back down the tube, this time filled with money and a lollipop for me, as if by magic. Perhaps it is the image of Jetson-ian ‘tube-based’ movement, or the unforgettable summation of the internet itself as ‘a series of tubes’ by Senator Ted Stevens; something about housing and moving information, infrastructure, or objects with compressed air via (usually underground or hidden) tube networks just aesthetically strikes us as futuristic.

There is a long history of humans envisioning a faster, ‘non-traditional’ conception of movement; always looking for a shortcut, we have imagined systems that could bypass the pitfalls of conventional mobility and allow a faster, unobstructed journey. The pneumatic tube thus functions both as an example of technology and as a piece of infrastructure that facilitates technological or informational transfer. Susan Stewart mentions in her piece that ‘Plato likens the pigeon to a bit of knowledge’(1); like ‘pigeonholes’, each opening in a pneumatic tube arrangement not only houses ‘knowledge,’ whether a physical object or a document, but moves that particular piece to a specific and purposeful destination. The idealized vision of pneumatic systems was intended to move humans faster and with less obstruction, and to help businesses move product between locations — and it brought a futuristic expediency to these otherwise banal and unavoidable tasks.

Further adding to the air of the ‘futuristic,’ or the liminal, is the concealment of these tube systems: we typically see only the intake/output aspect of the operation before the tubing network vanishes into a wall or underground. This obfuscation of the apparatus stands in stark contrast to the purposeful design and arrangement of shelving, as discussed in the Mattern piece: “we put things on shelves, rather than behind doors or in drawers… when they’re sufficiently attractive for display”(2). While these systems of cascading tubes are fascinating, we tend to bury the infrastructural underpinning within or beneath architecture; we see only the initial setup and the final product. Preceding our modern conception of ‘instant,’ the ability to stick something into a tube in the wall and have it reach its targeted destination almost immediately has a distinctly science-fiction appeal (unsurprisingly, the pneumatic tube appears widely in works of fiction). Fitting with the sci-fi aesthetic, operators of pneumatic systems even referred to themselves as ‘rocketeers’(3).

Chun describes new media as ‘[racing] simultaneously towards the future and the past, towards what we might call the bleeding edge of obsolescence’(4). The pneumatic tube system falls within the realm of technology once viewed as forward-thinking and potentially revolutionary, only to be rendered outdated by new developments. Once these tube systems lose their utility, a skeletal remainder persists. The disused pneumatic system at the Brooklyn Public Library looked worn like other machinery we saw, but somehow also out of place, as if from a totally different era. The demise of pneumatic tube systems was brought about by their high cost (as is often the case), and what was once a tangible tool of ‘the future’ swiftly became a relic of dated thought. The New York Times eulogizes the once state-of-the-art New York pneumatic mail system as follows:

For the time, the system was thoroughly modern, even high-tech, a subterranean network for priority and first-class mail fueled by pressurized air. Only a few decades later it was mostly a dinosaur, made obsolete by the motor wagon and then the automobile.(5)

Like the payphone or the railway semaphore, a physical monument to the obsolescence of entire networks remains visible with its intended purpose obfuscated. To those not aware of the history, these structures become simply another topographical marker. Much like actual dinosaurs, we are left with only skeletal remains hinting at the magnitude of something lost to time.

We got to see in person a pneumatic system now relegated to antique status at the Brooklyn Public Library. This is the conundrum of the technology: in the present day it is both obsolete and yet still somehow reminiscent of ‘futuristic’ ideals. In her piece, Stewart mentions Cornell’s 1952 work Dovecote, which featured colored balls that could be moved from panel to panel via a series of hidden tracks within the frame of the work: “Dovecote…presents an image of memory in a process of disappearance… Dovecote appears as… a forgotten function, a device no one remembers.”(6) By design we are only privy to the ‘beginning and ending’ of pneumatic systems, unable to see the obscured infrastructure. In this way, at a glance, the obsolete systems visually call to mind notions of immediacy, ‘magic’ transportation, and both the past and the present at once.

Pneumatic tube systems are still in place and functional in certain modern settings: banks, fast food restaurants, and somewhat regularly within hospitals to transport samples or medicine to and from labs. The technology has also been applied to scenarios far beyond urbanized environments: Whooshh Innovations has crafted a pneumatic tube system used to transport migrating fish over dams, preserving the integrity of natural ecosystems and man-made infrastructure alike (I thought I was clever in finding this, but as it turns out John Oliver’s show has already produced a ‘viral clip’ about the novelty of this technology).

Perhaps the most well-known modern adaptation of the pneumatic tube system has been popularized by the eccentric, Ambien-fueled tech entrepreneur Elon Musk. His Hyperloop proposal envisions an underground human-transport system that would allow for coast-to-coast travel at breakneck (hopefully not literally) speeds. Combining a pneumatic system with the magnetic-levitation technology used in ‘bullet trains,’ the initial designs offer yet another ‘futuristic’ re-imagining of how to transport things faster than the current paradigm. Like most of Musk’s ideas, Hyperloop attracted a lot of interest, with detractors and proponents attempting to reason how and why the system could work or fail. Vox describes the promise of pneumatic travel as ‘part Victorian, part Jetson’(7) (to me, there is an element of Super Mario as well), and there is something undeniably more idealistically ‘futuristic’ about tube travel than even another of Musk’s conceptions, the self-driving car, offers. Perhaps the immediacy of ‘instantaneous’ travel at some point stops resembling technology and becomes something more like ‘magic.’

https://docs.google.com/presentation/d/1GqDRcCvpZYS1a8R136R7CT5G3ov57x_-_MK8VVgWO9w/edit#slide=id.g43bc45247d_0_10

1 http://www.wordsinspace.net/secure/Stewart_Wunderkammer.pdf page 293

2 http://www.harvarddesignmagazine.org/issues/43/before-billy-a-brief-history-of-the-bookcase

3 https://www.youtube.com/watch?v=sd58w0CXQrM  @1:10

4 http://www.wordsinspace.net/secure/Chun_EnduringEphemeral.pdf

5 https://www.nytimes.com/2001/05/07/nyregion/underground-mail-road-modern-plans-for-all-but-forgotten-delivery-system.html

6 http://www.wordsinspace.net/secure/Stewart_Wunderkammer.pdf page 293

7  https://www.youtube.com/watch?v=sd58w0CXQrM @ 3:11

Dovecote by Joseph Cornell

The Total Archive and Posthumanism

“…The Library is total and its shelves register all the possible combinations of the twenty-odd orthographical symbols … the interpolations of every book in all books.” [1]

In “The Library of Babel,” Jorge Luis Borges[2] envisions a fictional, seemingly unending and universal library that contains all things that have been written and will be written. Some of it is sensical and some nonsensical. This library is constructed of infinite space and time. It is a metaphorical replica of the universe, and it depicts the theory of the total archive.

In this story Borges undermines the idea of totality. For information to be accessible, it must be discernible. But Borges’s library holds no classification scheme, no decimal system, no form of indexing. The information is infinite, and as such cannot be counted by ephemeral humanity.

The shelves, or containment units, typically represent the accessibility of material, but here they are used ironically. Borges describes the specific architecture of this unending labyrinth: an infinite number of hexagonal galleries, each holding twenty shelves, five long shelves per side, covering all the sides but two. The shelves span the distance from floor to ceiling, a height that scarcely exceeds that of the average librarian.

Paul Otlet’s work can be seen as a more feasible attempt at a total archive, as well as a precursor to the World Wide Web. In the Mundaneum, original works were reduced to a system of three- by five-inch index cards placed in filing cabinets, limiting the information stored. “Otlet’s vision was focused on pure information, not objects, and was distinguished by its universality and its emphasis on establishing the connections between bodies of knowledge…,”[3] allowing for a more effective use of space and indexing. Otlet recognized the importance of search and retrieval. His system did not need containers for the original works, for it focused on effective retrieval of information.

The librarians in Borges’s work live surrounded by knowledge, and yet, despite the architecture of the library, the openness of the shelves, and the heights built for their reach, they still cannot find the answers they seek. The useless and the useful cohabit the same space, indiscernible. No book is more important than any other; thus knowledge becomes inaccessible.

Finally, Borges leaves us with a sentiment: “The Library is unlimited and cyclical.” This is his response to the idea that totality is achievable, and that the library (and the universe) must come to an end. He calls such a notion “absurd”: once one reached the theoretical end, it would simply begin anew.

Jonathan Basile is a writer and the creator of libraryofbabel.info, a site that aims to make Borges’s library a reality through the use of an algorithm. The digital library houses 10^4,677 books. Even so, it represents a much pared-down version, owing to digital storage limitations and to parameters drawn from Borges’s Library, such as page count (410) and symbol count (22). The site also houses a similar application for images.
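The scale here is easy to check with back-of-the-envelope arithmetic, using the parameters the story itself supplies (410 pages per book, 40 lines per page, roughly 80 characters per line, 25 orthographic symbols, and 32 books per shelf). The sketch below is mine, in Python, and is not Basile’s algorithm.

```python
from math import log10

# Borges's own parameters: 410 pages, 40 lines per page,
# ~80 characters per line, 25 orthographic symbols.
chars_per_book = 410 * 40 * 80             # 1,312,000 characters per book
exponent = chars_per_book * log10(25)      # log10 of the number of distinct books
print(f"distinct books: about 10^{exponent:,.0f}")  # about 10^1,834,097

# The story also fixes the shelving: four shelved walls per hexagon,
# five shelves per wall, 32 books per shelf.
books_per_hexagon = 4 * 5 * 32             # 640 books per gallery
print(f"books per hexagon: {books_per_hexagon}")
```

Even Basile’s 10^4,677 books, in other words, are vanishingly few next to the roughly 10^1,834,097 that Borges’s own parameters imply.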

Ultimately, he states: “my project resembles Borges’s library only by mirroring its failure.”[4] The fruition of the universal library remains elusive because, so long as the universe exists, totality is “essentially incomplete.”

Basile’s algorithmic embodiment of the Library may contain all the words that have been and will be written, but its randomness lacks intention. Humans have not written, nor said, nor will probably ever say all that can be said, leaving most of the information meaningless. But what about the future? A future that looks inhuman.

According to N. Katherine Hayles,[5] beyond this theoretical, metaphysical total archive there is a natural phenomenon that limits the practical flow of information: expansion and compression. Borges’s library can be described as a compression: once one has reached the “end” of the library, the cycle of information repeats in exactly the same order, creating a lack of randomness in the universe, in other words, a compression. The inverse of this is expansion. “The Aleph,” another Borges work, envisions a photographic archive that contains a photo of itself, which contains a photo of itself, and so on and so forth: like a set of nesting dolls, one encapsulating another, infinitely growing.

Tangibly, we see expansion and compression as information ebbs and flows through “apparatuses of control,” such as political powers and institutions.[6] As information archives expand or compress, the inverse occurs in related systems. Hayles uses the automated storage and retrieval systems employed in libraries as an example: they allow the removal, or compression, of human-browsable stacks while inversely expanding the space available for other activities.

I believe this phenomenon can be seen in the internet. It is not only the closest and latest iteration of the Mundaneum but also an expanding archive. However, as we know, server storage capacity is limited, not all permutations exist within it, and the information is transitory. As information expands, data must compress, at least for the time being.

Researchers and scientists are working on information protocols that use quantum entanglement in quantum computing, which could conceivably mean infinite storage. Even this, though, does not address the overarching issue of retrieval. Setting that issue aside: what if storage and containers pose no limitation, if space and time simply do not act as the ultimate parameters of archivable information?

Posthumanism can be envisioned as a future in which the upper echelon of intelligence no longer belongs to what we now consider human. This future includes ideas that are un-human by nature, a world that has transcended the human form. But how does this future affect current information infrastructures? Does it allow for infinite information storage? Does it allow for navigation? And must this information be transmutable, as in Otlet’s archive?

Hayles states in her book How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics that there are four challenges in thinking a posthuman future narrative: (1) overcoming anthropocentrism, (2) including nonhumans, (3) accounting for the historical present, and (4) incorporating embodied cognition. Hayles uses the word “became” deliberately, to illustrate that we are already in the fetal stage of posthumanism: our interactions with computers act as extensions of ourselves. “We already are cyborgs in the sense that we experience, through the integration of our bodily perceptions and motions with computer architectures and topologies, a changed sense of subjectivity.”[7] Hayles does not mean to say that posthumanism will mark the end of humanity, but rather the end of a certain “conception of a human.”

Technologies now include the digitization of information, with the container often being “the cloud.” This too is not unlike the system put forth by Otlet, where information changes form (in this case, digital form) to fit the container (the server). This is a reduction of expanding information, in other words, a compression.

Perhaps posthuman synths or alien intelligent life forms are able to decipher such a Library or at least one of its infinite translations.

The Pioneer plaques,[8] which use science as a universal language, were placed aboard the 1972 Pioneer 10 and 1973 Pioneer 11 spacecraft, and feature a pictorial message providing information about the origin of the spacecraft in case they are ever intercepted by extraterrestrial life. Though there is controversy about the universality of the pictured elements, conceivably information can be compressed into such dimensions and written in a singular, universal, infinite language that allows for an ever-expanding consortium of information.

But this again raises the question: how would one index infinity? The answer may lie in the quantum computational field I mentioned earlier: a system for post- or trans-humans to infinitely index information as it is infinitely archived. I am not an expert in quantum theory; perhaps it is plausible, perhaps it is not. And if it is, I wonder whether containers, boxes, shelves, filing cabinets, rooms, buildings, budgets, politics, bits and bytes, hard drives, hardware, overall computational power, and our limited perspective of linear, observable time are necessary barriers: epistemological barriers that guide the archiving focus toward information that has the ability to ultimately become knowledge.

Notes:

[1] Borges, Jorge Luis, and Andrew Hurley. Fictions. London: Penguin, 2000.

[2] an Argentine writer, noted for such works as “The Library of Babel” and “The Aleph”

[3] Molly Springfield, “Inside the Mundaneum,” Triple Canopy 8.

[4] Basile, Jonathan. Tar for Mortar: The Library of Babel and the Dream of Totality. Santa Barbara, CA: Punctum Books, 2018.

[5] Katherine Hayles is a postmodern literary critic, professor, and Director of Graduate Studies in the Program in Literature at Duke University

[6] Hayles, N. Katherine. “Theory of the Total Archive: Infinite Expansion, Infinite Compression, and Apparatuses of Control.” Lecture, CRASSH, Cambridge, UK, March 31, 2015.

[7] Hayles, N. Katherine. “Condition of Virtuality,” p. 12.

[8] Paglen, Trevor. “Friends of Space, How Are You All? Have You Eaten Yet? Or, Why Talk to Aliens Even If We Can’t.” Afterall: A Journal of Art, Context and Enquiry, no. 32 (2013): 8-19. doi:10.1086/670177.

Light as Information

As a child, my weekends consisted of sitting pretzel-style below the invasive presence of the wall-sized television set. Below our CRT television were rows of VHS tapes ranging from Chitty Chitty Bang Bang and The Wizard of Oz to A Bug’s Life and Home Alone. As the years went by, those clunky boxes of nostalgia began to fade, along with the VCR that enabled the tapes’ animation. We witnessed the arrival of a new neighbor on the shelf: the sleek and compact DVD. These optical storage discs, with their ability to refract color when spun in the light, held an aura of opulence in the eyes of a child entranced by technology, and with their new technological facades and abilities came new intellectual furnishings and systems that stored and activated their memory.

As Mattern states in her piece “Before BILLY: A Brief History of the Bookcase”: “I grew up in a domestic world that seemed to hospitably reconfigure itself around our family’s evolving interests and enterprises.” (2) In the case of my childhood, I relate. I think this is a symptom of the modern world to which we can all relate, where our past is caught in an ephemeral sandstorm, where memory and time are buried by change and technology.

Our world’s yearning to replace the old with the new, the slow with the fast, impacts more than just our storage furnishings and discs; it reaches into the very natural core of our universe, which we have harnessed for increased speed and capability. In the case of light, its radiation has been harnessed across the electromagnetic spectrum’s encompassing presence, from the atomic to the astronomical, as a catalyst for the ever-increasing speed at which information is stored and disseminated. Wendy Hui Kyong Chun, in “The Enduring Ephemeral, or the Future Is a Memory,” examines this increase in speed as a double-edged sword, one that allows more information to be disseminated but less to be assimilated. (1) It is like the inverse relationship between the wavelength and frequency of light: as frequency increases, wavelength decreases, with the wavelength representing the assimilated material.

The laws of light do not relate to the speed of information only metaphorically; light’s very properties have been harnessed to disseminate information. Returning to the rapid change of media storage formats at the new millennium: the optical storage disc Blu-ray used a precise beam of light to read the information inscribed in its surface, ultimately increasing information storage capability.

Following the creation of the DVD in 1995 by Philips and Sony, Blu-ray was officially released in 2006. While both DVD and Blu-ray use optical lasers to read and write digitally encoded information on the disc, Blu-ray advanced information storage capability through greater precision. Blu-ray’s ability to store more data stems from the short wavelength of the blue laser used to read and write the disc: its wavelength of 405 nm gives the laser more precision than a DVD’s red laser, with its wavelength of 650 nm. In the early days of Blu-ray technology, each disc layer could hold only about 25 GB of information; capacities have since grown to 100 GB discs, with data transfer at a rate of 48 Mbps, compared with the DVD’s 10 Mbps. (3)
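As a rough sanity check (my own arithmetic, not from the cited source): the spot of light a drive can resolve scales with wavelength divided by the numerical aperture (NA) of the objective lens, and areal density scales with the inverse square of the spot size. The NA values below (0.60 for DVD, 0.85 for Blu-ray) and the DVD’s 4.7 GB single-layer capacity are standard published figures.

```python
# Spot size ~ wavelength / NA; areal density ~ 1 / spot_size**2.
dvd_spot = 650 / 0.60     # ~1083 (relative spot size, arbitrary units)
bd_spot  = 405 / 0.85     # ~476

density_gain = (dvd_spot / bd_spot) ** 2
print(f"density gain: {density_gain:.2f}x")                         # ~5.17x
print(f"4.7 GB x {density_gain:.2f} = {4.7 * density_gain:.1f} GB")  # ~24 GB
```

The result, roughly 24 GB per layer, lands close to Blu-ray’s actual 25 GB, so wavelength and lens precision alone account for most of the capacity jump.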

According to Kintronics, Blu-ray is designed with capabilities for pre-recorded content (BD-ROM), recordable PC data storage (BD-R), rewritable PC data storage (BD-RW), and rewritable HDTV recording (BD-RE). (3) Not only does Blu-ray allow for more storage capacity, it also allows for increased user interactivity, including internet accessibility, instant skipping, and playlist creation. These features, while allowing for more user-friendly mobility, further Chun’s theory that increased information leads to decreased assimilation. (1)

The information on a Blu-ray is encoded in pits that spiral from the disc’s center to the edge of the optic surface. The blue-violet laser reads the pits and the flat areas between them where the information is stored; the reflected light returns toward a photoelectric cell that detects it and interprets it as binary data. (4) The amount of information that can be stored depends on the size of the pits: smaller pits allow for larger amounts of information. Compared to a DVD surface, which is written with a longer wavelength of light, a Blu-ray surface packs in a much larger number of smaller pits, allowing more information to be distributed across the surface.
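To make the encoding concrete, here is a heavily simplified toy decoder (my own illustration): on optical discs generally, a transition between pit and land is read as a 1, and each clock period without a transition as a 0, with run-length-limited modulation and error correction layered on top in the real formats.

```python
def decode(run_lengths):
    """Each run of N clock periods at one level yields '1' followed by N-1 '0's."""
    bits = []
    for n in run_lengths:
        bits.append("1" + "0" * (n - 1))
    return "".join(bits)

# A run of 3 pit-periods, then 4 land-periods, then 3 pit-periods:
print(decode([3, 4, 3]))  # -> 1001000100
```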

When it comes to physical design, Blu-ray addressed a weakness of the DVD by placing the encoded data on top of a single plate of polycarbonate; the DVD compacts its data between two plates of polycarbonate, allowing for birefringence, a splitting of the beam that risks rendering the disc unreadable. (3) Blu-ray’s furnishing has advanced disc media storage past the realm of Chitty Chitty Bang Bang nostalgia and home video and into the ambiguous space of the locus. As Chun mentions, “A locus is a place easily grasped by memory, such as a house, an intercolumnar space, a corner, an arch, or the like.” (1) The Blu-ray thus becomes a locus, a hybrid space between time, space, and light, for high-speed media storage and data recording; or is it memory, or memory making? And where does the Blu-ray position itself within the very rows of once-new storage media that lined my childhood wonderment?

As Mattern mentions in “Before BILLY,” “What were, only a few days before, systematically coded wares in a miscellany of merchandise, are now individuated objects, appreciated for their distinctive functions or aesthetic values, classified and authorized, in part, through their place on the shelf.” (2) The Blu-ray will become yet another marker of our ephemerality, bearing a once-advanced infrastructure while slowly becoming shrouded in the familiar dust of its predecessors. And just as our storage furnishings and equipment morph with time and technology, so do the phenomena we exploit for change. The harnessed energy of light that we have used to capture and reveal information has in turn pushed our desires toward a point where information can travel at the speed of light. But have we become lost in the shadows of this unfathomable velocity, where our information has become too quick to capture?


References:

1. Chun, Wendy Hui Kyong. “The Enduring Ephemeral, or the Future Is a Memory.” Critical Inquiry, vol. 35, no. 1, 2008, pp. 148–171. doi:10.1086/595632.

2. Mattern, Shannon. “Before BILLY: A Brief History of the Bookcase.” Harvard Design Magazine, President and Fellows of Harvard College, www.harvarddesignmagazine.org/issues/43/before-billy-a-brief-history-of-the-bookcase.

3. Mesnik, Bob. “How Blu-Ray Optical Discs Work.” Kintronics, Kintronics, Inc., 1 Mar. 2016, kintronics.com/how-blu-ray-optical-discs-work/.

4. Into the Ordinary. YouTube, 6 Sept. 2017, www.youtube.com/watch?v=H-jxTzFrnpg.


Application: Breast Cancer Campaign Tissue Bank, A Case Study in Building a Biobank Network

One of the central concerns in our course is the question of how the collection, organization, and analysis of information lays the foundation for how we then produce knowledge from it. By information, in this class alone we’ve considered books, manuscripts, images, tweets… In “Middlewhere: Landscapes of Library Logistics,” Professor Mattern takes us to BookOps, a centralized sorting, cataloging, and distribution facility that serves the local libraries distributed all across New York City. (1) We also get a glimpse into the workings of the Research Collections and Preservation Consortium, or ReCAP, which connects NYPL’s patrons to Princeton’s and Columbia’s resources and vice versa. We learned that if the underlying software that operates NYPL, Columbia, and Princeton can be made mutually intelligible, then ReCAP would be much more robust, allowing patrons to run “common searches” across all three catalogs. This is a question of interoperability, and it is also the central concern of my subject today, the Breast Cancer Campaign Tissue Bank, to which I will return after addressing the larger topic of biobanking.

I’m interested in the collection, organization, and analysis of biological information, and that’s what led me to look at biobanks: organizations that “collect, store, and oversee the distribution of specimens and data” for institutional, non-profit, or commercial purposes. (2) Biobanks form an important part of the infrastructure for today’s population health research and personalized (or precision) medicine initiatives. I see many overlapping concerns between biobanks and libraries in terms of their infrastructures for collection, organization, and research, including the problem of interoperability. The word “biobank” itself has no concrete definition; biobanks are also called “biorepositories,” “specimen banks,” “tissue banks,” or “bio-libraries.” Basically, a biobank stores biological information, ranging from physical tissue samples to genomic data to various forms of electronic medical records. (2)

In 2009, the biobank was named one of TIME magazine’s “10 Ideas Changing the World Right Now,” but the practice of collecting, organizing, and then analyzing biological material began long before 2009. (3) So what changed? The TIME piece itself offers some clues. The 2009 article cited several European countries’ efforts to build their own “national biobanks.” It also mentioned deCODE, an Icelandic commercial genetics company that has reportedly collected over 100,000 Icelandic individuals’ DNA, roughly 30% of Iceland’s entire population.

DeCODE was founded in 1996; it preceded most public and private population-wide biobanking initiatives, such as the UK Biobank or 23andMe, by almost a decade. The decade from 1996 to 2006 seems to mark the maturation and stabilization of the technology of mass DNA collection and sequencing. This diagram shows how, shortly after 2007, the cost of sequencing a genome started declining sharply. (4) This is the turning point at which population genomics shifts from a technology problem to a collection and analysis problem.

Biobanks do not only store genomic data, of course. Depending on its type and purpose, a biobank may collect your blood, or your permission to access your electronic medical records held elsewhere; it may ask you to perform various physical or psychological tests, or ask volunteers to come back months or years later for follow-up tests. The purpose of biobanks, large or small, is typically to advance research by bringing together multiple forms of data on a huge scale. But if analyzing genetic data (finding correlations between genes and diseases) is not complicated enough, analyzing multiple forms of data together is infinitely more so. In Kadija Ferryman and Mikaela Pitcan’s Data & Society report on “Fairness in Precision Medicine,” they quote a computer scientist calling genetic data “low-hanging fruit,” as “the methods of collecting and analyzing genetic data are more established than for other kinds of data (such as wearables data), or for analyzing multiple types of data together.” (5)

My case study today is the Breast Cancer Campaign Tissue Bank in the UK, hereafter the BCC Tissue Bank. The UK is home to one of the earliest and biggest national biobanking initiatives, simply called the UK Biobank. In the U.S., there is the “All of Us” initiative, previously known as the Precision Medicine Initiative. I chose to present on the BCC Tissue Bank because 1) it was the subject of a really neat research paper I found; (6) and 2) unlike the UK Biobank or the All of Us initiative, (7) the BCC Tissue Bank is not an actual physical biobank that recruits, collects, and stores samples from volunteers; it is a network with the specific goal of solving some of the interoperability issues that beset biobank-based research.

Around 2010, breast cancer researchers in the UK identified specific knowledge gaps in breast cancer research. To fill them, they needed “high-quality and clinically annotated samples,” which is challenging because relevant samples are spread across different biobanks, and therefore across different software systems with different terminologies and standards. The BCC Tissue Bank was created in 2010 to address this problem by creating “a single web portal from which researchers could source and request samples from across the network using the terms agreed to in the data standard.” The BCC Tissue Bank, therefore, is built to be a networked information library. (6)

To facilitate data collection between systems, the BCC Tissue Bank decided to create a “plug-in,” called the “Node,” to be installed at each individual biobank. The researchers call this the “federated” approach: it preserves the autonomy and variability of regional biobanks, as opposed to a centralized approach that mandates every bank use the same system, which would inevitably require a massive transfer of data from the biobanks already using different systems.

Data collection in the case of the BCC Tissue Bank actually means data uploads, which can take a number of forms:

  1. Direct Input.

The first option is to input data directly into a centralized database run by the BCC Tissue Bank: biobanks can enter information into the web portal by hand. This has the benefit that the data vocabulary is automatically aligned with the BCC Tissue Bank’s vocabulary. While a biobank without a robust data infrastructure might theoretically choose this option, most biobanks already have their own elaborate information systems, so entering data separately into a completely different system would prove cumbersome and unrealistic.

  2. Spreadsheets.

Spreadsheets are exported out of one system and then imported into the BCC Tissue Bank’s system. Spreadsheets allow for mass data transfer, but as anyone who has migrated datasets across systems knows, cleaning up spreadsheets so that information from one system is legible to another can be complicated and time-consuming.

  3. Using JavaScript Object Notation (JSON)

Biobanks can use JSON “to automate the push of data from their biobanks’ data systems into the Node.” This is obviously the method the BCC Tissue Bank prefers, as it eliminates the periodic labor involved in uploading via spreadsheet or direct entry; a rough sketch of what such a push might look like follows.
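Since the paper does not publish the Node’s actual schema or endpoints, everything in the sketch below (the URL, field names, and identifier) is invented for illustration; it only conveys the general shape of an automated JSON push, in Python.

```python
import json
import urllib.request

# All names here are invented; the real Node's endpoint and schema are
# not published in the paper cited above.
sample = {
    "sample_id": "BB-042-0917",
    "tissue_type": "tumour",
    "menopausal_status": "postmenopausal",  # local vocabulary; mapped later
    "collection_date": "2014-06-03",
}

req = urllib.request.Request(
    "https://node.example.org/api/samples",  # placeholder URL
    data=json.dumps(sample).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # e.g. 201 if the Node accepted the record
```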

To ensure that the data pushes through smoothly, however, there remains the problem of database-by-database or regional variation in how a term is used. For that, the Node has a module for “mapping,” which maps the terms used by the central system onto the local terms used by individual biobanks. Once the relationship between a local and a central term is established, or mapped, researchers can perform searches on the BCC Tissue Bank’s web portal using the central terms while the local biobanks continue to use whatever terms they have been using. One example: the central term “post-menopausal” is mapped onto a local system that records “postmenopausal” without the hyphen.
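In code, the mapping step could be as simple as a lookup table keyed on each biobank’s local vocabulary. The sketch below is my illustration, echoing the menopausal-status example above, not the Node’s actual implementation.

```python
# Local vocabulary -> the network's central terms.
local_to_central = {
    "postmenopausal": "post-menopausal",
    "post menopausal": "post-menopausal",
    "premenopausal": "pre-menopausal",
}

def to_central(local_term):
    key = local_term.strip().lower()
    # Fall back to the raw term so unmapped values surface for review.
    return local_to_central.get(key, local_term)

print(to_central("Postmenopausal"))  # -> post-menopausal
```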

So here is a summary of the main approaches the BCC Tissue Bank took to increase interoperability among the data systems used by individual biobanks. Similar to how BookOps and ReCAP facilitate the logistics of running a distributed network, the BCC Tissue Bank seeks to centralize information that is spread across a distributed network and embedded in varying standards and definitions. The BCC Tissue Bank is still up and running, of course, although the researchers note that the preferred method of data transfer is still spreadsheets, as there are just too many technical and regulatory issues with the automatic data push option. This shows how ingrained infrastructure can impede the adoption of a new technology and thus influence the trajectory of the technological medium itself. It reminds me of how computing technology co-evolved with punch cards for a good period before punch cards finally became history. It also makes me curious about the claims that blockchain technology will change the way medical records are accessed and shared.


References

  1. Shannon Mattern, “Middlewhere: Landscapes of Library Logistics,” Urban Omnibus (June 24, 2015)
  2. Boyer, Gregory J., et al. “Biobanks in the United States: How to Identify an Undefined and Rapidly Evolving Population.” Biopreservation and Biobanking 10.6 (2012): 511–517. PMC. Web. 2 Oct. 2018.
  3. Alice Park, “Biobank, 10 Ideas Changing the World Right Now,” TIME, March 12, 2009, http://content.time.com/time/specials/packages/article/0,28804,1884779_1884782_1884766,00.html
  4. Editorial Team, “The Past, Present and Future of Genome Sequencing,” Labiotech.eu, April 9, 2018, https://labiotech.eu/features/genome-sequencing-review-projects/
  5. Kadija Ferryman and Mikaela Pitcan, Fairness in Precision Medicine (Data & Society, February 2018)
  6. Quinlan PR, Groves M, Jordan LB, et al. The informatics challenges facing biobanks: A perspective from a United Kingdom biobanking network. Biopreserv Biobank 2015;13:363–370.
  7. All of Us Initiative, National Institutes of Health (NIH), allofus.nih.gov