icosilune

N. Katherine Hayles visits LCC

[General,Talks] (01.19.09, 12:02 am)

Notable scholar of literature and new media, Katherine Hayles visited us in LCC last Thursday. Her presentation was about electronic literature and about the practice of academic study in the humanities. The presentation was framed as a conflict between the traditional and digital humanities. The traditional humanities are slow to understand the digital, but the digital must be able to build from the foundation of the traditional. There are tacit and implicit differences between the two disciplines, indicating shifts in modes of thinking. The primary differences occur along the lines of scale, visualization, collaboration, database structures, and language and code, among a few others. Hayles’ research was conducted by interviewing several digital humanities scholars.

The most notable difference is the idea of scale. This relates to the sheer physical limits on a researcher’s capacity to read the domain of study. Digital technology enables a broad, but shallow, analysis of a large corpus of texts. The example is 19th century fiction. A scholar will have read around 300 to 500 texts, but these are atypical, notable works, read precisely because they stand out. The nature of research, the questions, and the conclusions change when a quantitative analysis is possible, when it is possible to look at thousands of texts at a distance.

Franco Moretti proposes reading texts at the greatest distance possible. Hayles described this as “throwing down the gauntlet to traditional humanities,” whose approach has been deep reading, looking within texts to understand psychology, allusions, and connections. Moretti attempts to read texts as assemblies, breaking them into pieces, without ever reading a whole text. This is a dramatic change in method, and comes across as wildly controversial. It is notable that Moretti does have experience of traditional practice, and is well read and familiar with the corpus; he is able to employ this approach precisely because of that familiarity. Moretti focuses on analyzing texts in terms of devices, themes, tropes, genres, or systems. The practice of analysis amounts to a kind of distant statistical profiling. Moretti analyzes how genres are born and die, tracing genres which have passed, such as epistolary and gothic novels. Moretti’s conclusion is that genres die because their readers die (not necessarily literally, but in the sense that they move on to other material).
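
To make the idea of distant statistical profiling concrete, here is a minimal sketch, not Moretti’s actual method, with an invented catalogue: counting genre labels per decade to trace when a genre rises and fades.

    # Illustrative only: the catalogue records and genre labels are invented.
    from collections import defaultdict

    catalogue = [
        {"title": "Novel A", "year": 1764, "genre": "gothic"},
        {"title": "Novel B", "year": 1794, "genre": "gothic"},
        {"title": "Novel C", "year": 1771, "genre": "epistolary"},
        {"title": "Novel D", "year": 1818, "genre": "gothic"},
    ]

    # Count how many titles carry each genre label, decade by decade.
    counts = defaultdict(lambda: defaultdict(int))
    for record in catalogue:
        decade = (record["year"] // 10) * 10
        counts[record["genre"]][decade] += 1

    for genre, by_decade in sorted(counts.items()):
        print(genre, sorted(by_decade.items()))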

Another question is how to tell when new technology platforms emerge. Hayles’ example is Tim Lenoir, who makes the claim that algorithmic processing of text counts as a form of reading. Lenoir’s project traces citations among a set of scientific papers. This network develops and defines a relationship of connections. This is interesting because the analysis is of material entirely contained within the texts themselves, and does not analyze works in terms of some external system of values. The claim that this analysis is reading is inflammatory in the traditional humanities, where reading is a hermeneutic activity focused on interpretation. The problem is that the traditional understanding of reading is wedded to comprehension. Lenoir argues that, at a wide scale, textual meaning is less important; what is really interesting are the data streams.
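
As a rough illustration of this kind of analysis (this is not Lenoir’s actual tooling or data; the paper identifiers and citation lists below are hypothetical), one could build a citation graph and “read” the corpus purely through its internal relations:

    # Hypothetical papers and citation lists, used only to illustrate the idea.
    import networkx as nx

    papers = {
        "paper_a": ["paper_b", "paper_c"],
        "paper_b": ["paper_c"],
        "paper_c": [],
    }

    # Directed graph: an edge x -> y means "x cites y".
    graph = nx.DiGraph()
    for paper, cited_list in papers.items():
        graph.add_node(paper)
        for cited in cited_list:
            graph.add_edge(paper, cited)

    # "Reading" at a distance: rank papers by how often they are cited,
    # using only relations contained within the texts themselves.
    most_cited = sorted(graph.in_degree(), key=lambda pair: pair[1], reverse=True)
    print(most_cited)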

In common with Moretti, Lenoir is interested in finding patterns. Patterns do not require a primary investment in meaning. The traditional humanities are instead interested in hermeneutic interpretation, which is bound tightly to meaning. These two perspectives are mutually opposed, but Hayles is interested in linking patterns with hermeneutic reading, finding some form of common ground from which each may build on the other.

One such example of a work which uses both strategies is Tanya Clement’s analysis of Gertrude Stein’s “The Making of Americans.” This text is a traditional narrative through its first half, but at some point in the middle the narrative breaks down and becomes virtually unreadable. The text at that point is composed of frequently repeated phrases, content which is essentially an anti-narrative. A deep reading of such a text is difficult or impossible because of the very structure of the text itself. An analysis of pattern is necessary to draw meaningful conclusions. Clement’s analysis finds that the text contains repeated 490-word sequences, in which only a few words vary. The analogy is made to the notion of character, as character is repetition with only slight variation. This is a way of understanding the text which is arguably very valuable, but would be impossible without pattern analysis.
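
A toy sketch of the kind of pattern analysis involved (the window length, the helper name, and the sample text are illustrative assumptions, not Clement’s actual method or parameters) might count long word sequences that recur in a text:

    from collections import Counter

    def repeated_sequences(text, window=6, min_count=2):
        """Return word sequences of the given length that occur at least
        min_count times in the text."""
        words = text.lower().split()
        ngrams = Counter(
            tuple(words[i:i + window]) for i in range(len(words) - window + 1)
        )
        return {seq: n for seq, n in ngrams.items() if n >= min_count}

    # A made-up, Stein-like sample; the real analysis works at the scale of
    # hundreds of words per repeated sequence.
    sample = ("some one was one being living and some one was one going on "
              "being living and some one was one being living")
    for seq, count in repeated_sequences(sample).items():
        print(count, " ".join(seq))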

The traditional humanities are usually solitary, involving a deep communion between the reader and the text. Networked culture is interested in collaborative approaches to study, and when applied to the study of texts and narrative, this comes with a shift of assumptions about how to approach a text. One way of looking at this is in scale of participation, but another approach is to break up a text and treat it as a database. David Lloyd’s project “Irish Mobility” chops up prose to remove references to subordination and cooperation. The resulting material is then embedded into a database, which allows the user to “refactor” the content. The resulting piece becomes harder to read, but arguably the content is more meaningful. The resulting form is fragmentary hypertext, and gives the user control over the narrative.
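
A rough sketch of treating a text as a database of fragments, loosely in the spirit of the project described above (the schema, the splitting rule, and the sample text are assumptions, not the project’s actual design):

    import sqlite3

    # A made-up text, split naively into sentence fragments.
    text = "First fragment. Second fragment. Third fragment."
    fragments = [s.strip() for s in text.split(".") if s.strip()]

    # Store each fragment as a row, so the original linear order is no longer
    # the only way to traverse the text.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE fragments (id INTEGER PRIMARY KEY, body TEXT)")
    conn.executemany(
        "INSERT INTO fragments (body) VALUES (?)", [(f,) for f in fragments]
    )

    # "Refactoring": the reader reorders or filters fragments with queries
    # rather than following the narrative sequence.
    for (body,) in conn.execute("SELECT body FROM fragments ORDER BY body"):
        print(body)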

Hayles gives a few examples of database projects used in education, where students build on each others’ work, which is then published. Thus, their work continues to live beyond the class, and is valuable for sharing and feedback. These projects are less interested in representation, and more interested in communication and distribution.

Regarding language and code, Hayles gives a few examples. A succinct quote comes from Tanya Clement: “Software is an exteriorization of desire.” The writer of software must have an exact articulation of what the computer must do, without tacit knowledge. Modifying code is generally easier than modifying tacit knowledge, and once created, code is also easier to observe because it is actually written and visible. Tacit assumptions are by their very nature concealed. This is not to say that digital systems are always explicit about their values, but they more clearly formulate their models, and thus the values are more concretely established within the system.

Disciplines are formed by the violence of exclusion, according to Weber. Disciplines achieve legitimacy by constructing boundaries. On one side of the boundary is placed the material which “belongs” in the discipline; on the other side is that which is excluded. This process occurs with astronomy and astrology: one side is given legitimacy while the other is denied it. The legitimacy of the traditional humanities is threatened by the digital humanities, which fall outside the boundaries of the traditional in many senses.

We were not able to extensively discuss the relationship between language and code because the presentation was running out of time. The relationship between the digital and traditional humanities is construed as a conflict, and Hayles’ goal is to find a reconciliation between the two. However, the examples described are primarily data-oriented approaches to texts and literature. The approaches of pattern analysis and interpretive hermeneutics presuppose an inherent, content-related difference in the reading of texts. I think that it would be useful to have a more process-oriented approach, one that focuses on the system rather than the structure of narrative. A common ground might be found in considering that both hermeneutics and the digital are dependent on process.
