
Digital Humanities at UCLA

Digital Humanities – a Toolbox and a Theory

As taught at UCLA, the field of Digital Humanities is a self-conscious discipline, one that engages students in emerging software methods alongside critical examination of the constraints and epistemological models implicit in those techniques. On the one hand, DH projects carry a headiness about possibilities that print formats make difficult or impossible, such as text mining and search, analyzing virtual environments, or building federated collections and finding aids for scholars. DH projects can create widespread communities for knowledge sharing, often necessitating the development of open standards, licenses, and formats. One result of DH is more resources created specifically for scholarship, designed with the networked context and the need for open protocols to share work in mind.

At the same time, DH methods and tools must be understood as tightly bound with the theoretical assumptions driving them. Marija Dalbello argues that DH is not just a set of new methods but a genre of discourse reflecting on the themes and assumptions underpinning its tools (2011). DH does not only concern pragmatic analysis software for exploring electronic environments or materials – it is also the study of new epistemologies for humanistic research, of “how the interpretative task of the humanist is redefined in these changed conditions” in ways that shed light on traditional scholarly activities (Drucker 2009, xii).

In particular, DH scholars should be aware of the forms that shape their projects. New computational capacities have encouraged humanists to adopt scientific or social-scientific methodologies over more traditional interpretive methods; at the extreme, scholars have opted for purely quantitative readings of content over textual and qualitative ones, with little consideration of the social and political context of their evidentiary sources or tools. DH scholars must always ask: how do these new tools affect (infect) interpretation? What constraints do they impose? More generally, DH should help humanists become explicit about their methodological assumptions.


[Image by Morgan Currie and Drew Davis for “The Feminist Critique”]

DH Techniques of Critique

How so? For one, the infrastructure and interface of a DH project shape what is available to a user – what is seen, articulated, made functional, or made hidden. DH scholars must develop a technical literacy in how a project embodies cultural and political values, more or less transparently; they should interrogate their own modeling.

For instance, classifications and digital schemas in a database will influence what is available at the interface level, and so digital humanities projects must be sensitive to the politics of database structures. Jerome McGann foregrounds the importance of considering structural constraints to increase a project’s functionality, findability, and robustness over time; he calls for measures of control and standardization that can at the same time expand the computational and conceptual possibilities of these systems (2007). In a second example, the archive or cultural repository is a kind of institutional authority; to address this issue of institutional control, archives can take subjectivity and the politics of representation and community narrative into account. Archives can better represent minority voices, disrupting historical narratives and categorical systems by introducing new, complicating, critical content. In a third instance, GIS’s reliance on absolute Euclidean grids can give way to more relational, lived experiences of place processed over time (Gregory 2008). Through multiple timelines, GIS projects can show how a place developed up to a single point in time, revealing process, along with multiple perspectives to tell a story.

Mining Alternate Narratives

DH at UCLA has encouraged me to deploy this self-conscious, critical exploration of databases, software, and design in my own research. Another thread of my DH work over the years has been to examine and expose alternative accounts of a narrative or a statistical phenomenon using DH tools. I’ll provide three examples of this work:

In The Feminist Critique I used software designed by the Digital Humanities Initiative to scrape the editing history of Wikipedia’s ‘Feminism’ article in order to locate points of intense editing activity. My hypothesis was that these bursts of sudden changes in content would signal a controversy that had erupted among editors over how to define the topic. As it turns out, one of the highest points of editing in the article’s history concerned a user who wanted to include a heading called ‘Anti-feminism’ in the article. Accessing the article’s archives, I could read the minute-by-minute content changes as well as the heated exchanges among editors over this one user, who was ultimately banned from Wikipedia for inflammatory and sexist language. As this research shows, the article stabilized only after an intense exchange among several editors over whether and how to incorporate misogynistic views.
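The burst-detection idea behind this project can be sketched in a few lines. This is a minimal illustration, not the Digital Humanities Initiative’s actual tool: it assumes revision timestamps have already been fetched (for example, from the MediaWiki API), and the function name, daily bucketing, and two-standard-deviation threshold are my own choices.

```python
from collections import Counter
from datetime import datetime
from statistics import mean, stdev

def edit_bursts(timestamps, threshold_sds=2.0):
    """Bucket revision timestamps by day and flag days whose edit
    counts sit more than `threshold_sds` standard deviations above
    the mean daily count -- a crude proxy for an editing controversy."""
    per_day = Counter(ts.date() for ts in timestamps)
    counts = list(per_day.values())
    cutoff = mean(counts) + threshold_sds * stdev(counts)
    return sorted(day for day, n in per_day.items() if n > cutoff)

# Illustrative data: one edit per day as background, plus a spike
# of twelve extra edits on a single day (a hypothetical controversy).
revisions = [datetime(2011, 3, d) for d in range(1, 11)]
revisions += [datetime(2011, 3, 5, h) for h in range(12)]
print(edit_bursts(revisions))
```

On real data, each flagged day points to a stretch of revision history worth reading closely, as the ‘Anti-feminism’ dispute was.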

The Police Officer-Involved Homicide Database Project likewise exposes the varying statistical accounts of the controversial phenomenon of civilian deaths caused by police. In this project, a collaboration with three other UCLA students, we analyzed five databases that collected data on these incidents: three at the federal level and two covering Los Angeles County. We organized a data hackathon and invited the creators of the local databases, the Los Angeles Times Data Desk and the Youth Justice Coalition, to join us and other participants in exploring the data and finding discrepancies across the databases. This work exposed the contingency of these statistical accounts, both in the final numbers, which rarely reconciled across the datasets, and in how a police officer-involved homicide is defined to begin with.
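The reconciliation step at the heart of the hackathon can be illustrated with a small sketch. The rows, field names, and matching key below are hypothetical stand-ins for the real datasets, and matching on a name–date pair is itself a simplification: in practice, deciding what counts as “the same incident” was part of the problem.

```python
def reconcile(db_a, db_b, key=("name", "date")):
    """Compare two incident datasets on a shared key and report
    records present in one but not the other -- the kind of
    discrepancy the hackathon surfaced across databases."""
    def to_key(rec):
        return tuple(rec[k] for k in key)
    keys_a = {to_key(r) for r in db_a}
    keys_b = {to_key(r) for r in db_b}
    return {
        "only_in_a": sorted(keys_a - keys_b),
        "only_in_b": sorted(keys_b - keys_a),
        "shared": len(keys_a & keys_b),
    }

# Hypothetical rows standing in for a federal and a county dataset.
federal = [{"name": "Doe, J.", "date": "2015-03-01"},
           {"name": "Roe, R.", "date": "2015-04-12"}]
county  = [{"name": "Doe, J.", "date": "2015-03-01"},
           {"name": "Poe, E.", "date": "2015-05-20"}]
report = reconcile(federal, county)
```

Each entry in `only_in_a` or `only_in_b` is a record one source counted and the other did not, which is exactly where definitional disagreements surface.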

In Visualizing Interpretation Using the Newscape Archive, another UCLA graduate student and I explored how to model perspective in the media. News stations express differing perspectives on the causes of an event, particularly through the assertions about causality they prefer to make. This project used word frequencies to track where consensus over an event was greatest, where causal attribution diverged, and where a story received the most traction over time. We worked with news coverage of Anders B. Breivik’s attacks in Norway on July 22, 2011, analyzing CNN, FOX, and MSNBC news reports from the first five days of coverage: Friday morning (July 22, 2011) to Tuesday evening (July 26, 2011).
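The word-frequency comparison can be sketched as follows. The transcripts and vocabulary below are toy examples of my own invention, not the Newscape data, and the project tracked frequencies over time as well as across stations; this sketch shows only the cross-station comparison at a single moment.

```python
from collections import Counter

def station_frequencies(transcripts, terms):
    """Count occurrences of a fixed vocabulary of causal terms in each
    station's transcript; divergence in the dominant term suggests
    divergence in causal attribution."""
    freqs = {}
    for station, text in transcripts.items():
        words = text.lower().split()
        freqs[station] = Counter(w for w in words if w in terms)
    return freqs

# Toy transcripts; the real project worked from archived broadcast text.
terms = {"terrorism", "extremism", "islamism"}
transcripts = {
    "CNN": "officials describe extremism and terrorism terrorism",
    "FOX": "coverage stressed islamism islamism and terrorism",
    "MSNBC": "analysts cited extremism extremism in early reports",
}
freqs = station_frequencies(transcripts, terms)
dominant = {s: c.most_common(1)[0][0] for s, c in freqs.items()}
```

Here each station’s most frequent causal term differs, which in the real analysis marked a point where the stations’ accounts of the event diverged.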

Works Cited

Dalbello, Marija. (2011). “A Genealogy of Digital Humanities.” Journal of Documentation 67.3: 480–506.

Drucker, Johanna. (2009). SpecLab: Digital Aesthetics and Projects in Speculative Computing. University of Chicago Press. Print.

Gregory, Ian N. (2008). “Using Geographical Information Systems to Explore Space and Time in the Humanities.” The Virtual Representation of the Past. Digital Research in the Arts and Humanities. Aldershot: Ashgate, 135–146.

McGann, Jerome. (2007). “Database, Interface, and Archival Fever.” PMLA 122.5: 1588–1592. Print.

 
