Mellon DH Seminar at Price Lab

The Price Lab holds bi-weekly Mellon Seminars, where our Mellon Faculty Fellows and Associate Scholars present and discuss their work in the digital humanities.

Meyerson Conference Center, Van Pelt-Dietrich Library

Welcome & Introductions

Open to 2016–2017 Price Lab Mellon fellows and associates only.

Meyerson Conference Center, Van Pelt-Dietrich Library

Jim English (Penn English and Price Lab) and Scott Enderle (Penn Libraries)

The Contemporary Fiction Database Project 
Jim English and Scott Enderle will provide a brief overview of the CFDB project, the first phase of which involved analysis of hand-curated metadata on roughly 2,000 English-language novels published since 1960. That phase produced one important finding: a dramatic shift around 1980, after which a novel nominated for a major prize became much more likely to be set in the past, and a novel on the bestseller list much more likely to be set in the present or the future. The second and current phase of the project aims to dig further into that fulcrum point by building a digitized corpus of contemporary fiction and subjecting it to various forms of computational analysis. They will describe how they acquired a CFDB library of nearly 1,000 digitized novels, prepared the texts for text mining, and conducted their initial experiments. They will also describe a parallel stream of this research in which they are exploring questions about readers and readings of contemporary fiction via computational analysis of roughly 2 million reviews of the CFDB novels, scraped from the Goodreads social reading site.
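The phase-one finding can be illustrated with a minimal sketch of the kind of metadata comparison described above. The records and field names below are invented stand-ins, not actual CFDB data, which covers roughly 2,000 hand-curated entries.

```python
# Hypothetical sketch of the phase-one metadata analysis: compare the share
# of past-set novels on prize shortlists vs. bestseller lists, before and
# after the ~1980 fulcrum point. All records here are invented examples.
records = [
    {"year": 1965, "group": "prize", "setting": "present"},
    {"year": 1972, "group": "bestseller", "setting": "past"},
    {"year": 1985, "group": "prize", "setting": "past"},
    {"year": 1991, "group": "prize", "setting": "past"},
    {"year": 1988, "group": "bestseller", "setting": "present"},
    {"year": 1995, "group": "bestseller", "setting": "future"},
]

def past_share(rows):
    """Fraction of novels in `rows` whose setting is the past."""
    if not rows:
        return 0.0
    return sum(r["setting"] == "past" for r in rows) / len(rows)

# Split each group at 1980 and compare the two eras.
for group in ("prize", "bestseller"):
    before = [r for r in records if r["group"] == group and r["year"] < 1980]
    after = [r for r in records if r["group"] == group and r["year"] >= 1980]
    print(group, round(past_share(before), 2), round(past_share(after), 2))
```

With real metadata, the same grouping would surface the divergence the project reports: prize nominees tilting toward the past, bestsellers toward the present and future.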

Meyerson Conference Center, Van Pelt-Dietrich Library

Stewart Varner and Sayan Bhattacharyya

DocSouth Data: Library Collections Built for Text Mining 

The popularity of Digital Humanities has led to increased interest in text mining and data analysis among humanists. While there are many tools available for anyone who wants to experiment with these techniques, researchers often hit a roadblock when it comes to finding collections that are ready to be analyzed.

DocSouth Data began as an idea to address this challenge by providing access to the data behind the North American Slave Narratives collection, the most widely used collection in UNC's Documenting the American South. This was an obvious place to start for several reasons. First, it is an extremely compelling collection of obvious historical significance. Second, it is complete and coherent in that it includes all known autobiographical narratives of fugitive and former slaves published as broadsides, pamphlets, or books in English up to 1920. Third, because it was transcribed by students, it is extremely reliable data. In addition to the North American Slave Narratives collection, users can also get the data behind three other collections of similar quality: The Church in the Southern Black Community, First-Person Narratives of the American South, and the Library of Southern Literature.

This presentation from Stewart Varner, Managing Director of the Price Lab, will provide an introduction to the collection, how it was designed, and how it can be used in digital humanities research.
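Because collections like DocSouth Data ship as analysis-ready plain text, a first text-mining pass can be very short. The snippet below is a generic sketch, not DocSouth-specific code; the document contents are inlined stand-ins for files you would read from a downloaded collection.

```python
# Minimal first pass over a plain-text collection: tokenize each document
# and aggregate word counts. Filenames and excerpts are invented stand-ins;
# with a real download you would read the .txt files from disk.
import re
from collections import Counter

documents = {
    "narrative_01.txt": "I was born a slave in North Carolina.",
    "narrative_02.txt": "The narrative of my life begins in bondage.",
}

def tokenize(text):
    """Lowercase and keep alphabetic runs only."""
    return re.findall(r"[a-z]+", text.lower())

# Aggregate token counts across the whole collection.
counts = Counter()
for text in documents.values():
    counts.update(tokenize(text))

print(counts.most_common(3))
```

The point of curated, reliably transcribed data is precisely that a researcher can start at this step, rather than spending months on OCR correction and cleanup.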

 

Text analysis tools in progress from The HathiTrust Research Center

Sayan Bhattacharyya will briefly describe two ongoing tool-building initiatives at the HathiTrust Research Center (HTRC), the research wing of the HathiTrust Digital Library: the HTRC Bookworm and the HTRC "Extracted Features" functionality. The first tool, the HTRC Bookworm, consists of the generic Bookworm tool, developed by Erez Lieberman Aiden and Ben Schmidt, integrated with the HathiTrust Digital Library (HTDL). This integration leverages the extensive metadata that enriches the HTDL, enabling motivated visualizations of facets of a corpus. One such motivation, though by no means the only one, is the tracing of individual words over facets across time or across other dimensions. The second tool, HTRC's "Extracted Features", provides users with bags of words (and some other information) per page. This is not only useful for performing text analysis on those texts which cannot be made available as linear, sequential streams of words because of copyright restrictions, but also lowers the cost of processing for those texts which can be. He will argue that, in addition to their utilitarian value, tools such as these may also help problematize such notions as "text" and "reading".
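The bags-of-words idea can be sketched concretely: because each page is reduced to token counts, the readable text cannot be reconstructed, yet volume-level statistics remain computable. The data layout below is a simplified stand-in, not the actual HTRC Extracted Features JSON schema.

```python
# Simplified stand-in for per-page "Extracted Features": each page is only
# a bag of words (token -> count), so the linear text is unrecoverable,
# but aggregate analysis across pages and volumes still works.
from collections import Counter

volume = {
    "pages": [
        {"tokens": {"whale": 3, "sea": 2, "the": 11}},
        {"tokens": {"whale": 1, "ship": 4, "the": 9}},
    ]
}

def volume_counts(vol):
    """Sum per-page bags of words into one volume-level Counter."""
    total = Counter()
    for page in vol["pages"]:
        total.update(page["tokens"])
    return total

counts = volume_counts(volume)
print(counts["whale"], counts["the"])  # prints: 4 20
```

This is why the format serves copyright-restricted texts: the derived counts can be shared even when the sequential text cannot.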

Meyerson Conference Center, Van Pelt-Dietrich Library

Alexander Monea (George Mason University)

Towards a Speculative Code Studies
In this workshop/seminar, Alex Monea will offer some of the initial outlines of his current research project, which aims to develop a new methodology tentatively coined speculative code studies. In theory, the idea is that critical code, software, and hardware studies can be made to speak to blackboxed systems or pieces of code, software, and hardware, and that they can do so in an anexact yet rigorous way that preserves their critical-analytical purchase. Such a practice would look to constitute a sufficient, if piecemeal, archive of materials for rigorous speculation about the contents of black boxes. Beyond the event horizon of the black box lie the secrets to the future of technically grounded humanistic inquiry into the stakes of computational media. Without a rigorous theory and method of speculative code studies, critical code, software, and hardware studies remain subalternized, unable to speak (back) to the power structures that conditioned and continually modulate their identities. In short, if our emerging field(s) of technically grounded scholarship remains mute about Google/Alphabet, Facebook, Amazon, Alibaba, Weibo, and their ilk, then we are missing something crucial. This workshop will begin with an outline of some of these ideas and will be followed by discussion of how we might further such a research agenda and achieve the goal of socio-politically meaningful code, software, and hardware studies.

Biography
Dr. Alexander Monea is Assistant Professor of Digital Humanities serving jointly in George Mason's English Department and Cultural Studies Program. His current research focuses on examining computation and its attendant technological implementations and peripheral supports. In particular, his book project analyzes the history of computation through genealogies of numerical mediation, 'big' data, predictive analytics, and other points of entanglement between computation, governmentality, and/or capital. His recent publications range from analytical work focused on specific computational apparatuses, like Google's Knowledge Graph, to more theoretical critiques of speculation, to methodological meditations on doing politically meaningful media studies research.

Meyerson Conference Center, Van Pelt-Dietrich Library

Luciana Parisi (Goldsmiths University of London)

Computational Mediation and the Future of the Humanities
It has been argued that the transformation of the medium of knowledge production has a direct bearing on how the humanities have been able to select, store, and transmit knowledge. In particular, when reflecting upon computational systems or automated modes of aesthetic production, for instance in design, music, or art in general, the specificity of "mediality", or of the medium through which these practices work, has called into question ideas of authorship, creativity, and imagination. This seminar will discuss the possibilities that computational and digital systems offer to develop methods of research and collaboration afforded by the medium, by algorithms and their relation to data. This discussion will also point out that the humanities and social sciences need to reassess their relationship to the sciences, bringing about changes in epistemic resources and in the theorisation of the locations, modes, and objects of knowledge.

To view Luciana Parisi’s November 7th talk at the Control Societies Speaker Series, which Price Lab is cosponsoring, click here.

Meyerson Conference Center, Van Pelt-Dietrich Library

Nicole Brown (National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign)

The Master's Tools: The Paradox of Computational Platforms as Spaces of Surveillance and Resistance

Meyerson Conference Center, Van Pelt-Dietrich Library

N. Katherine Hayles (Duke University)

Unthought: The Power of the Cognitive Nonconscious
Recent research in cognitive science, neuroscience, and other fields has revealed a level of neuronal processing inaccessible to consciousness; it goes by various names, but I call it the cognitive nonconscious. Nonconscious cognition has been shown to play a major role not only for humans but for other lifeforms; on this view, all biological organisms have some cognitive capacities. Cognition here is defined as a much broader activity than thinking; rather, it consists of processes of interpreting information in contexts that connect it with meaning. By this definition, many technical devices also carry out sophisticated cognitive acts on a daily basis. The framework grounding these ideas provides a robust way to think complex human and technical systems together, enabling a major paradigm shift in understanding the exponentially increasing penetration of computational and cognitive media into all areas of life in developed countries.

Meyerson Conference Center, Van Pelt-Dietrich Library

Luciana Parisi (via Zoom)

Computational Mediation and the Future of the Humanities

Due to technical issues, Luciana was unable to speak on her original date of November 7. She will be joining the Price Lab on December 12 for a lecture via Zoom.

It has been argued that the transformation of the medium of knowledge production has a direct bearing on how the humanities have been able to select, store, and transmit knowledge. In particular, when reflecting upon computational systems or automated modes of aesthetic production, for instance in design, music, or art in general, the specificity of "mediality", or of the medium through which these practices work, has called into question ideas of authorship, creativity, and imagination. This seminar will discuss the possibilities that computational and digital systems offer to develop methods of research and collaboration afforded by the medium, by algorithms and their relation to data. This discussion will also point out that the humanities and social sciences need to reassess their relationship to the sciences, bringing about changes in epistemic resources and in the theorisation of the locations, modes, and objects of knowledge.

To view Luciana Parisi’s November 7th talk at the Control Societies Speaker Series, which Price Lab is cosponsoring, click here.

Meyerson Conference Center, Van Pelt-Dietrich Library

Serkan Şavk (Izmir University of Economics, Turkey & Princeton University)

The transforming topography of Istanbul during the early modern period is depicted in a large body of textual and visual resources produced in different forms and genres by creators with different political and cultural identities. The common feature of these resources is that they narrate the urban space through a blend of imagination and experience. A comprehensive understanding of the urban topography requires studying these diverse resources in an interrelated way, and the intertextual character of digital media is well suited to highlighting such interrelations. In the Mapping Early Modern Istanbul (MEMI) project, I aim to handle this body of resources by mapping them in multiple layers on an interactive, open-access platform. Given technical, temporal, and budgetary constraints, I have decided to build this platform with data visualization/mapping software. In the presentation, I will reflect on the theoretical and technical challenges of the project as well as the pros and cons of different mapping apps and services.

Bio: Serkan Şavk received his Ph.D. from Hacettepe University, Turkey, in 2014, and teaches in the Department of Cinema and Digital Media at Izmir University of Economics (IUE). He is currently a visiting fellow in the Department of History at Princeton University, where he is conducting a research project titled Mapping Early Modern Istanbul (MEMI). His stay at Princeton is funded by TUBITAK (The Scientific and Technological Research Council of Turkey) and the IUE Overseas Experience Program.

Meyerson Conference Center, Van Pelt-Dietrich Library

Lucas Stephens (University of Pennsylvania)

Route Navigation and Burial Monuments in an Ancient Cultural Landscape

The landscape around Gordion, in modern-day central Turkey, was monumentalized through the construction of nearly 100 burial mounds (or tumuli) during the Iron Age (900-500 BCE). Previous studies have suggested that linear alignments of the tumuli could indicate that they were built along ancient routes. This project is an attempt to thoroughly investigate that hypothesis and analyze the landscape through both digital and humanistic perspectives in order to better understand the motivations of monument builders and the subsequent effect of tumuli on daily activities. GIS analysis of the topography between the urban center of Gordion and several smaller outlying settlements has revealed a local travel network within which the tumuli acted as landmarks. A combination of digital maps, videos taken while traveling routes, and 3D models of landscape features provides useful research tools for archaeologists and depicts the cultural landscape in a way that approximates past movement.
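GIS route analyses of this kind typically rest on least-cost-path computation over a terrain grid. The toy version below uses an invented elevation raster and a simple slope-penalizing cost; real studies use DEM rasters and calibrated movement-cost functions, so this is only a sketch of the underlying idea.

```python
# Toy least-cost-path computation of the kind used in GIS route analysis.
# The elevation grid and cost function are invented for illustration.
import heapq

elevation = [
    [10, 12, 30, 31],
    [11, 13, 29, 30],
    [12, 14, 15, 16],
    [40, 41, 15, 14],
]

def step_cost(a, b):
    """Cost of moving between adjacent cells: base cost plus slope penalty."""
    return 1 + abs(a - b)

def least_cost(grid, start, goal):
    """Dijkstra over the 4-connected grid; returns the minimal path cost."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + step_cost(grid[r][c], grid[nr][nc])
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return float("inf")

print(least_cost(elevation, (0, 0), (3, 3)))  # prints: 12
```

On this grid the cheapest route hugs the low-lying corridor rather than climbing the high cells, which is exactly the logic by which modeled travel networks emerge between a center like Gordion and its outlying settlements.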