Archive: August, 2008

WJT Mitchell: Picture Theory

[Readings] (08.29.08, 4:46 pm)


Chapter 2: Metapictures

This essay is about images that refer to images. It further relates to art that refers to art (partially itself, and partially other art). Mitchell opens with Clement Greenberg and Michael Fried, who discuss how modern art has become essential and self-referential. This seems partly representative of postmodernism's eternal self-reference and analysis. It also connects to self-reference in language and literature: metalanguages that reflect on languages.

Mitchell's focus remains on images and pictures, though; this is mentioned upfront, so we keep it in mind.

The first example given is of a man drawing a world and then finding himself inside of it. This is more of a casual and tired bourgeois type of art (it's a New Yorker illustration) than a sublime and frightening perspective on the enveloping nature of images. The fear seems to derive from images' lack of boundary and their enveloping nature. We create images, and in turn find ourselves reflected back in them.

Looking at metapictures requires a second-order discourse. The first order of an image is the "object-language", which I suppose is the manner of direct representation. Even images which self-reference (a frame within a frame, or a man painting a picture of a man painting a picture…) can be posed hierarchically by a first-order representation. This works (at least on paper) because images are necessarily finite. Extending this to a second order requires a blurring of the inside and outside.

In mathematics, first-order logic can make generalizations about classes of things, but a second order of logic is required to make generalizations about functions and relationships. With this in mind, second-order thinking involves moving beyond direct interpretation and considering external analogies and references.

Mitchell looks at multistable images, which are ones that can be interpreted in two ways. These have different thresholds, and confuse the role of pictorial representation, but they do not formally confound representation in the sense that the New Yorker cartoon does. The second-order boundary is ambiguous, but not overtly flouted. Images in this sense have a supposed mutability. That form of multistability is also observed in various literary forms (The Lady or the Tiger).

The dilemma with multistable forms is an essential question (Wittgenstein found the rabbit-duck image to be extremely troubling). The question ultimately seems to reside in where we do the identification in our minds. Is it an internal model, an external model (with respect to vision), or is it determined by some sort of grammar?

Pictures generally have some sort of presence and role in life. They can seem to have lives of their own. Thus, metapictures, which throw the reference and metaphor into ambiguity, call into question the role and self-understanding of the observer.

This is understood by looking at Foucault's writing on Las Meninas (a classical painting by Velazquez which employs some degree of reflection and ambiguity). Foucault's attention to Las Meninas, like Wittgenstein's dwelling on the duck-rabbit, complicates the image and makes it all the less penetrable or comprehensible. Both encourage us to think not of the images directly, but of how the images relate to their culture, our culture, and our thought.

Magritte's "La trahison des images" serves a similar role, but instead of exploring our relationship with images (or images' relationship with images), it explores the relationship of images and words. This, in turn, is an infinite relation.

The conclusion of the chapter offers some insights: “The metapicture is not a subgenre within the fine arts but a fundamental potentiality inherent in pictorial representations as such: it is the place where pictures reveal and “know” themselves, where they reflect on the intersections of visuality, language, and similitude, where they engage in speculation and theorizing on their own nature and history.”

Reading Info:
Author/Editor: Mitchell, W. J. T.
Title: Picture Theory
Tags: dms, visual culture
Lookup: Google Scholar, Google Books, Amazon

Clement Greenberg: Avant-Garde and Kitsch

[Readings] (08.29.08, 4:44 pm)

Avant-Garde and Kitsch

Greenberg is writing to discern and define two relatively recent antipodes of art and culture. The avant-garde and kitsch are both products of modernism, and both have emerged concurrently, affecting each other through their opposition. Greenberg begins his essay by noting that the difference between them lies deeper than mere aesthetics.


The avant-garde is a product of Western bourgeois culture. Outside of this moment of culture and the abundance produced by capitalist society, the avant-garde would not have been possible, and bohemia would never have been established. The avant-garde came into being by emigrating from the old model of aristocratic patronage, to instead pursue art for its own sake (as opposed to for funding). The avant-garde never fully detached from bourgeois society, still depending on its money.

As an academic and intellectual movement, the avant-garde is compared with “Alexandrianism”, a certain reluctance to interfere or comment on culture. The essence of Alexandrianism is a devotion to repetition of themes and old masters. A consequence of this is a non-involvement in political matters. Where the avant-garde differs is in its devotion to pushing the boundaries of art. Instead of a reverence for the tradition and history of art, the avant-garde is about its future and potential. Content is irrelevant; only the form of art itself is of importance.

This devotion to art for its own sake is coupled with a sort of desire for the recreation of the world. Greenberg notes on artistic values: “And so he turns out to be imitating, not God–and here I use ‘imitate’ in its Aristotelian sense–but the disciplines and processes of art and literature themselves.” The abstract or non-representational stems from this notion. The avant-garde artist is not imitating or re-creating the world, but imitating and re-creating the form of art itself.

This focus on form and values exposes a certain systematicity in the ideas of art. Where traditional representation evokes or creates a system of nature, the abstract evokes or creates the underlying ideas that ground representation in the first place. Instead of art that imitates life, art that imitates art. In simulation and modeling, this line of construction is prefixed with “meta”.

While the avant-garde rejects content and the bourgeois society from which it arose, it still depends on the wealth of the bourgeois to survive: the only audience with the culture and education to appreciate the strange avant-garde perspective is the audience with the wealth to afford that education and culture.


Kitsch is hard to explain, but easy to exemplify. Greenberg explains that “Kitsch is a product of the industrial revolution which urbanized the masses of Western Europe and America and established what is called universal literacy.” While universal literacy certainly sounds nice enough, it has some ominous undertones. Greenberg explains that kitsch arose to fill a demand for culture coming from the proletariat of industrialized countries. The proletariat consisted of peasants from the country who were settling in cities and simultaneously losing a taste for country culture and developing some degree of leisure time that required fulfillment.

The culture that arose from this is composed of artifacts of the existing cultural tradition. It borrows and synthesizes devices and forms from culture and manufactures products which are easily understandable and palatable for the prole audience. Kitsch is the ultimate synthesis of culture and media, and is also the ultimate recycler and disposal: it will use bits of artifacts that “work” and re-use them until exhausted. As a result (in the sense of Walter Benjamin), it destroys culture and then profits immensely.

Greenberg also explains some properties of kitsch: it is profitable, it is mechanical, it operates by formulas, it is vicarious, it is sensational, it adopts varying styles, it is the epitome of all that is spurious, it pretends to demand nothing from its customers but money. While not a formal definition, this helps clarify some things about what kitsch might be, but does not exactly explain, except in contrarian terms, what kitsch is not. What is not kitsch is not profitable, not popular, not spurious. All of these qualifiers are exceedingly vague and subjective.

Greenberg claims that artistic values are relative, but that among cultured society, there is a general consensus over what is good art and what is bad. This may work qualitatively for the classics, whose values many agree on, but there is very little agreement about contemporary works. Greenberg continues and explains that to acknowledge the relative nature of values, artistic values must be distinguished from other values, but kitsch is able to erase this distinction through its mechanical and industrial construction.

A key theme in this is the notion of value, and a relative situation of values. It is a sort of intellectual and educational background that defines and establishes these values for the educated audience, and lacking these, the proles miss the value inherent in abstract works. This education supplies a history and context, which is totally missing from the world of kitsch.

Wrapping Up

The avant garde imitates the process (and system) of art, and kitsch imitates its effect. The history of art had not enabled the interconnection of the artist with form, because of the nature of patronage as it had supported artists since the middle ages (which seems a little puzzling). Later, following the renaissance, a history of artists preoccupied and lonely in their attachment to their art begins to appear.

It is capitalism and authoritarianism that turn to kitsch as being tremendously effective at profiting from and placating the masses. Greenberg explains that socialism is the only movement that can support the avant-garde, or new culture.

Critical Notes

Greenberg’s primary concern seems to be that only the avant-garde is producing new cultural value, through the pushing of its limits. But, this attitude leaves something to be desired. Surely, cultural value must be seen as more than a scalar quantity?

There are many subtle assumptions underlying the criticism of kitsch: understood formally, as a synthetic product that seeks to make a profit, nearly all forms of art are kitsch. Ancient cultures were constantly referencing and alluding to the legitimacy of previous cultural products- Roman gods were borrowed from Greece and used to satisfy a cultural demand and need for legitimacy, yet this borrowing is not really seen as kitschy. Even kitsch is disposed to find new ideas in itself from time to time.

Many contemporary works create something new, arguably have some artistic value, reference and synthesize, and some even have the misfortune of being popular. The qualifier of kitsch seems to be withheld only when that popularity and profit are absent. Clearly, there is a spectrum of gradations of a work in terms of its accessibility, but accessibility is not necessarily equivalent to artistic value. The danger of kitsch is to blur and erase this distinction, but that seems to afford kitsch much more authority than it ought to deserve.

Other contemporary works derive from forms that might be considered kitsch; while not avant-garde, they can embrace the values of abstraction and, having emerged from a popular medium, form bubbles of artistic experimentation and radical difference and creativity. For example, within the popular medium of newspaper-style comics emerge highly experimental and complex works. These cannot be said to be kitsch in their emergence, but rather wholly new products.

In this sense, it seems that the qualitative distinction between kitsch and avant-garde, while an effective border, is little more than an arbitrary line over superficial ideas of value and imitation. The image drawn here (in 1939) is evocative of an intellectual stagnancy, one that began in the industrial revolution, but contemporary culture is certainly changed and contains new value from when Greenberg was writing. That value certainly did not all stem from avant-garde artists, nor is all of that value purely capitalist, so it must have come from elsewhere. But where?

Reading Info:
Author/Editor: Greenberg, Clement
Title: Avant-Garde and Kitsch
Tags: dms, postmodernism
Lookup: Google Scholar, Google Books, Amazon

Roland Barthes: The Death of the Author

[Readings] (08.29.08, 4:43 pm)


Barthes opens his essay by looking at a quote from Balzac’s Sarrasine, and digging into the methods of understanding the quote’s author. The quote is remarking on a castrato impersonating a woman, describing the fluid evocation of the idea of “Woman” given off by the impersonator. Barthes is trying to discern who is behind the quote, though, who is saying it. It could be the story’s hero, it could be Balzac the author, Balzac the philosopher, it could be universal wisdom or romantic psychology. Barthes explains that due to the nature of writing itself, it is impossible to know. Writing is a voice with an indeterminate speaker, whose identity and body are lost through writing.

The idea of the author is a construction that derives from rationalism and the Reformation, which were concerned with elevating and unearthing the individual in all things. There is a fascination with modern society to connect the author to their work, and to understand the human behind the work, through the work, perhaps instead of the work itself. Criticism sees a work as the product of the author, or a product of the author’s character traits.

Barthes looks into Mallarme (who was a subject of great interest to Umberto Eco), and explains that Mallarme’s work was intended as a removal of the author so that pure language remained.

Other writers seek to expound on the relationship between themselves, their works, and their characters, blending them to some degree. The author’s relationship to the work can be seen as somewhat incidental and residing in chance. Writers may challenge the position that they stand on in relation to their work. Surrealism pushes this further by playing with the system of language. This playing is supposed to expose the limits of authorship (or authorial intent, I suppose?) and exhaust the idea of the person in writing, as opposed to the subject, the one who writes.

As an aside, many popular contemporary authors see their writing as being very systematic. They do not control or master the writing from the top down, but rather they develop characters and let the characters act on their own. In this sense, the writing is a run of a simulation.

With this in mind, modern works may be approached with the knowledge of the author’s absence. If we “believe in” the author, it is as the parent of the book, the precursor to the work. In the modern text, the writer is born along with the text.

Barthes explains a perception of the text which is lacking the absolute message of the author (in an almost theological sense). The text is a “space of many dimensions”, it is a “tissue of citations”. Expression is merely translation from a dictionary. “Succeeding the Author, the writer no longer contains within himself passions, humors, sentiments, impressions, but that enormous dictionary, from which he derives a writing which can know no end or halt: life can only imitate the book, and the book itself is only a tissue of signs, a lost, infinitely remote imitation.”

In a post-author text, deciphering becomes impossible or useless. Imposing an author onto a text forces the text to adopt an ultimate signification, which destroys the writing. Modern writing instead can be distinguished and traversed.

Written language is inherently ambiguous, and when we remove the author, written language can be perfectly understood. Barthes mentions Greek tragedies, which use ambiguity and duplicity to convey meanings. It is the reader who is able to interpret, connect, and weave these together.

Barthes is not trying to criticize the meaning or unity of texts, but rather the idea that unity or meaning descend from an external author who precedes and begets the work. Rather, meaning and the unity of a work coalesce in the reader, who connects and strings together meanings from all places. The reader lacks history or psychology or identity in the sense that the author does. The reader’s meaning can be considered a liberation or popularization from the idea that meaning is from and for the author.


Authorship is interesting in a modern society, especially in terms of commercial products. In a culture where corporations are extended rights and status granted to individuals, commercial products tend to stand with the company or the corporation as their author. Some examples of this are computer software, pharmaceuticals, fast food, etcetera. Despite the fact that many individuals are responsible for creation, and these creations have evolved and changed significantly over time, the products themselves are, even legally, authored by the corporation.

Authorship is important in simulation as well. If one subscribes to the belief that all creative expressions are systematic (that is, they are embedded with models of meaning), then these systems could be said to be authored. The systems are open works, in the Umberto Eco sense, as they are free to some interpretation, but are still constrained by their original authorship.

Reading Info:
Author/Editor: Barthes, Roland
Title: The Death of the Author
Tags: dms, postmodernism, narrative
Lookup: Google Scholar, Google Books, Amazon

Espen Aarseth: Narrativism and Genre Trouble

[Readings] (08.29.08, 4:42 pm)


Aarseth presents another critique of narrative theory as applied to games. He is challenging the idea that narrativism can be used to analyze anything, or more specifically, is challenging the interpretation of anything as a text. Aarseth defines a game as consisting of three elements: rules, a semiotic system (the game world), and the resulting gameplay. The semiotic system is incidental, and may be exchanged. Knowledge of the semiotic space of the game world, or of the skin that has been applied to it, is unnecessary for skill at the game itself. However, it may be necessary to better *appreciate* the game.

Aarseth’s claim about the relevance of semiotic systems to games is tricky, though. It may be possible to interchange skins on existing games, but there are further connections that are made between the world of the game and its rules and resulting gameplay. It wouldn’t make sense to place a skin on a game of chess that randomized the types of pieces. We have certain associations with the order of chess and the order of the skins that we apply to it. What we do when skinning something is drawing a metaphor. The mechanics are necessarily unchanged, but the associative meanings are different.
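
This split between rules and skin can be made concrete in code: the mechanics live in one layer that knows nothing about presentation, while the skin is an interchangeable mapping over the same state. A toy sketch (the game, class, and skin names are all hypothetical, not from Aarseth):

```python
# Rules layer: pure mechanics, knows nothing about presentation.
class Nim:
    def __init__(self, tokens=21):
        self.tokens = tokens

    def take(self, n):
        if n not in (1, 2, 3) or n > self.tokens:
            raise ValueError("illegal move")
        self.tokens -= n
        return self.tokens == 0   # True: the mover took the last token

# Semiotic layer: interchangeable skins over identical rules.
SKINS = {
    "matches": "matchsticks remain",
    "gold":    "gold coins remain in the dragon's hoard",
}

def describe(game, skin):
    return f"{game.tokens} {SKINS[skin]}"

g = Nim()
g.take(3)
print(describe(g, "matches"))   # same state, either skin
print(describe(g, "gold"))
```

Skill at the game depends only on the `Nim` layer; the `SKINS` table changes the metaphor without touching the mechanics, which is exactly the exchange Aarseth describes.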

Aarseth is also looking to demonstrate the disconnection between games and stories. It doesn’t make a lot of sense to equate games with narratives, especially when exploring abstract games. However, an inescapable fact is that many contemporary (especially successful commercial) games are grounded in stories. This is what Jesper Juul might call the “fiction” of the game. Aarseth proceeds to wonder what the relationship is between games and stories: a dichotomy, a continuum, or a rivalry? He notes that games may be translated among game forms (Rogue and Diablo, for instance), much like narratives may be translated between narrative forms. These are structural equivalences, though: game adaptations preserve rules, narrative adaptations preserve key events and relationships. Realistically, though, many successful narrative adaptations use much more creative approaches.

The key problem with adapting games to narratives and vice versa can be found in genre theory (John Cawelti): underlying form cannot be translated, but style and convention may be adapted with relative ease. Aarseth gives the specific example of the Star Wars films and the various games associated with them. A genre that does try to mix the two is the adventure game, which Aarseth derides as being uninteresting, unsatisfying from gameplay perspectives, and limiting in terms of freedom.

Another domain, the simulation game, also employs story to a strong degree, but is flexible where the adventure game is not. Aarseth makes the significant claim that simulation is the core essence of games, and that games are the art of simulation. Aarseth further extends that by saying how simulation is much more satisfying and allows games to handle unusual situations that are not permitted in narratives. Adventure games have a conflict between player volition and the mechanics of the game. Aarseth claims that within simulation games, the player is afforded opportunities to counter authorial intention, that the authors of simulations are essentially removed from the work, and that players will have the last word.

Aarseth’s stance here is ludicrous. Simulation authors are capable of imposing very severe restrictions on players, and the simulation may be biased in the very model that defines it. Civilization used (and still uses) a very expansionist, colonialist model of history, and it is not possible for the player to thwart that ideology in any way. The only road to success is to subscribe to the ideology and act in accordance with it. The most recent release of Civilization opens up the model, so that advanced users can write new ideologies into the rules, or rip out the existing ones, but these users are not the average player. It also bears noting that in the discussion of simulation, Aarseth does not mention The Sims (though he had mentioned it earlier). Not sure what to make of that, though…

The important thing to note about comparing games and narratives, though, is to follow Aarseth’s initial focus, of looking at translatability. If we explore how narratives have been translated, adapted, and (especially) extended, it might be possible to make a not-too-revolutionary claim that many successful adaptations break many of the rules of narrative structure. A good example is Jane Austen adaptations, or extending Aarseth’s examples, one could look at Star Wars novels, and the extended universe developed around the world defined by the films. The resulting products might be narratives, but relationships might be changed, settings might be changed, characters might be changed. What has been translated might not be the narrative at all, but rather the world, or the underlying value system of the story. In this sense, we can make the claim that narratives themselves are artifacts of systems. We may not be able to adapt the narrative directly, but the elements of the system may be procedural in nature.

Reading Info:
Author/Editor: Aarseth, Espen
Title: Narrativism and Genre Trouble
Tags: dms, ludology, simulation, games
Lookup: Google Scholar, Google Books, Amazon

Stephen Wolfram: A New Kind of Science

[Readings] (08.29.08, 4:41 pm)


Wolfram’s book, A New Kind of Science is chiefly concerned with the implications of his deep study of cellular automata, originally triggered by his MacArthur grant. The main principles and findings of this research seem to be that simple rules can lead to computational complexity and very interesting results.

The notion that simple rules can lead to powerful results is not new, especially in the sense that science and mathematics have striven to find simple and elegant ways of describing the laws and theories governing nature. Wolfram’s pursuit, though, is toward computation: relating the simplicity of cellular automata to emergent natural phenomena. Wolfram aims for CAs to lead toward a new type of science and mathematics that uses the (simulative?) power of CAs to make useful claims about nature.

The visual appeal of the automata is certainly very compelling, but there is an equally disconcerting lack of mathematical reason backing up his arguments. Unfortunately, he also does not give evidence as to what mathematical justification might look like, instead choosing to demonstrate problems through visual analogy.

Wolfram’s use of CAs is also an exemplary demonstration of Baudrillard’s simulation, in that when viewed through the lens of cellular automata, everything seems to become one. CAs become the universal tool which may be used to represent and recreate everything.


The Notion of Computation

The idea of Computational Universality becomes something of great significance here. A function is universal if it can compute anything that is computable. The Turing machine is the fundamental example of this. A consequence of universality is that universal functions may be simulated or computed analogously. A consequence of Wolfram’s research has been to find that certain classes of CAs are universal and may be used as computing machines.
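
The flavor of these systems is easy to reproduce: an elementary CA updates each cell by looking up its (left, self, right) neighborhood in the rule number's binary expansion. A minimal sketch in Python, using Rule 110 (the rule Matthew Cook proved universal); the circular boundary is a simplifying assumption of mine:

```python
def step(cells, rule=110):
    """Advance one generation of an elementary cellular automaton.

    Each cell's next state is bit k of the rule number, where k is the
    3-bit value of the (left, self, right) neighborhood. The lattice
    wraps around (circular boundary)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# A single live cell seeds Rule 110's characteristic left-growing pattern.
cells = [0] * 31
cells[15] = 1
for _ in range(15):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Despite the eight-bit rule table, the printed evolution already shows the irregular, interacting structures on which Wolfram's universality claims rest.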

Wolfram has additionally discovered a number of CAs which are reversible, that is, their inputs may be determined uniquely from their outputs. Computationally, this represents an interesting class of functions, but it also references issues of information and disorder that are important in signal systems and in thermodynamics.
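
One standard way to obtain such a reversible rule, in the spirit of Wolfram's second-order "R" variants, is to XOR an ordinary rule's output with the state two steps back: since s[t+1] = f(s[t]) XOR s[t-1], we can recover s[t-1] = f(s[t]) XOR s[t+1]. A sketch (the base rule and lattice size are arbitrary choices of mine) that runs forward and then backward to recover its start:

```python
import random

def rule_step(cells, rule):
    """Ordinary elementary-CA update (circular boundary)."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                      + cells[(i + 1) % n])) & 1 for i in range(n)]

def step2(prev, curr, rule=90):
    """Second-order update: s[t+1] = f(s[t]) XOR s[t-1].
    The XOR with the state two steps back makes the rule invertible."""
    nxt = [f ^ p for f, p in zip(rule_step(curr, rule), prev)]
    return curr, nxt

random.seed(0)
a = [random.randint(0, 1) for _ in range(16)]
b = [random.randint(0, 1) for _ in range(16)]

x, y = a, b
for _ in range(100):      # run forward 100 steps
    x, y = step2(x, y)

u, v = y, x               # swap the pair to run time backwards
for _ in range(100):
    u, v = step2(u, v)

assert (v, u) == (a, b)   # the initial configuration is recovered exactly
```

The backward run uses the very same update function; only the order of the state pair is exchanged, which is what "inputs determined uniquely from outputs" amounts to here.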

The Principle of Computational Equivalence

Wolfram’s thesis is essentially this: “all processes, whether they are produced by human effort, or occur spontaneously in nature, can be viewed as computations.”

Wolfram extends this idea to the point of declaring it a new law of nature: “The Principle of Computational Equivalence introduces a new law of nature to the effect that no system can ever carry out explicit computations that are more sophisticated than those carried out by systems like cellular automata and Turing machines.” And thus, when viewing nature as a system of computation, this principle is naturally very relevant.

An issue with representing things as computations is that it disregards the idea that not everything requires brute computation; instead, certain things may be proven rather than computed. This distinction is tricky and important. Some information may be proven only with difficulty, and other facts may be much more easily computed than proven. However, there is generally a difference between that which is computed and that which is proven. The advantage of the latter is eliminating the need for the former. Wolfram’s argument hinges on the notion of raw computation, which may pale in the face of abstract proof. One may set out to compute that there are infinitely many primes congruent to 3 mod 4, which is an indefinite exercise, or one may instead aim to prove this in a finite and short number of steps.
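
The example bears spelling out, since the proof really is short where the computation is endless. The classical Euclid-style argument:

```latex
\textbf{Claim.} There are infinitely many primes $p \equiv 3 \pmod 4$.

\textbf{Proof.} Suppose not, and let $p_1, \dots, p_k$ be all such primes.
Consider
\[
  N = 4\,p_1 p_2 \cdots p_k - 1 .
\]
Then $N \equiv 3 \pmod 4$. A product of numbers each $\equiv 1 \pmod 4$
is itself $\equiv 1 \pmod 4$, so the odd number $N$ must have at least
one prime factor $q \equiv 3 \pmod 4$. But no $p_i$ divides $N$ (each
leaves remainder $-1$), so $q$ is a prime $\equiv 3 \pmod 4$ missing
from the list, a contradiction. $\blacksquare$
```

A handful of steps settles what no amount of enumerating primes ever could, which is exactly the asymmetry Wolfram's computational view has trouble accommodating.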

This point is important, but flawed. Later on, Wolfram examines rules and theorems of mathematics, and uses their countable, enumerable nature to represent them as computable. In this view, theorem proving is a process of computation, rather than some mysterious or magical intellectual exercise. This fact has been used in the past, notably by Kurt Goedel to prove the incompleteness of mathematics. This means that proofs are indeed finite and computable, but that is still not a good way of approaching them.

Computation is still computation and must obey the law that computations may not be “outdone”, as it is not possible to out-compute something in the same number of logical steps. On the other hand, proof and ideal computation are different from raw computation in that they might be more efficient and save time (or computational steps). The essence of these, the way that proofs are made and solved in practice, is not computation, but rather “intuition” and experience. The two of these may seem magical in the abstract, but actually echo more strongly the ideas of pattern matching. Instead of applying rules in brute force, pattern matching relies on analogy, recognition, and application of known rules to new inputs. This approach is still computable, but not easily by CAs.

Reading Info:
Author/Editor: Wolfram, Stephen
Title: A New Kind of Science
Tags: dms, simulation, emergence
Lookup: Google Scholar, Google Books, Amazon

Norbert Wiener: Cybernetics

[Readings] (08.29.08, 4:40 pm)



Wiener begins Cybernetics by posing some of the problems encountered by the growing field of modern science. Specifically, and this echoes Vannevar Bush, he is concerned about the massive specialization in science. He argues, though, that scientists need to be versed in each others’ disciplines. He too is interested in developing some sort of calculating machine, but is proposing an electronic model that seems to more closely resemble what we use today. What is interesting about Wiener’s model is that it is inspired by the human nervous system.

The essential problem that is set out to be solved is anti-aircraft artillery. This is the essence of the idea, and segues cleanly into the notion of feedback loops that will be explored in detail later. This idea involves a certain forecasting of the future, and relates closely to how everyday human action works. Human actions, such as picking up a pencil, involve certain kinaesthetic or proprioceptive senses. This correlates in some fashion to the intentionality described by phenomenologists.
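
The feedback idea can be caricatured in a few lines: a measured error drives a correction, and the loop settles into tracking the target with a fixed lag. This is only an illustrative proportional controller, not Wiener's statistical prediction theory; the gain and target path are arbitrary choices of mine:

```python
def track(target_positions, gain=0.5):
    """Negative feedback: the error between where we aim and where the
    target is drives the next correction."""
    aim, history = 0.0, []
    for target in target_positions:
        error = target - aim     # measured deviation (the feedback signal)
        aim += gain * error      # correction proportional to the error
        history.append(aim)
    return history

path = [float(t) for t in range(10)]   # target moving at constant speed
aims = track(path)
```

For this ramp target and gain, the post-correction error converges geometrically to a constant lag of 1 unit; good anti-aircraft fire control must go further and forecast the target's future position, which is where Wiener's prediction problem begins.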

Furthermore, the kinds of dilemmas arising from the problems Wiener is describing are generally solved by pattern recognition: distinguishing signal from noise, guidance, switching, control, etc. It is interesting to note that the type of discipline proposed by Wiener more closely resembles the analytic patterns that seem to be suggested by Dreyfus.

Some of Wiener’s application seems grounded in Gestalt psychology, which is the psychology of the coordination of senses. The sum idea is that the whole amounts to more than the sum of its parts. Generally it is a psychology of perception. One of the ideas that Wiener is approaching with this, toward the end of the introduction, is the idea of developing a fully functional prosthetic limb. The limb would not only need to fill the space and function as the lost limb, but also register the immediate senses, and furthermore the proprioceptive senses. The combination of these seems to unite the goals of cybernetics. Also notably, the idea here is the replacement/extension of a limb, not the mind.

A further concern with the potential of this prosthetic power of computation is its complicating moral significance. One moral dilemma posed is the notion of machine slave labor, which has the potential to reduce all labor to slave labor. While robots have not replaced human labor, this concern is insightful in terms of the economic changes due to computers (divisions of companies being replaced by silicon chips, etc).

Chapter 5: Computing Machines and the Nervous System

Wiener gives early on a somewhat hand-waving proof that the best way to encode information (when there is a constant cost for the information) is to use binary for storage. The logic of some operators is described, as well as the ways of implementing binary logic in several engineering approaches. After that, he mentions their potential grounding in the neurological system.
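
A back-of-envelope version of the cost argument (my reconstruction; Wiener's own case also leans on the engineering reliability of two-state relays): storing one of N values in base-b digits takes log_b N digits, so if hardware cost scales as (digits needed) times (states per digit), total cost goes as b / ln b:

```python
import math

def cost(base, n_values=10**6):
    """Cost of storing one of n_values, modeled as
    (digits needed) x (states per digit) = base * log_base(n_values).
    Up to a constant factor this is base / ln(base)."""
    digits = math.log(n_values) / math.log(base)
    return base * digits

for b in range(2, 11):
    print(b, round(cost(b), 1))
```

The continuous minimum sits at base e (about 2.718); among integers the minimum is base 3, with bases 2 and 4 tied just a few percent behind. Since two-state elements are by far the easiest to build reliably, binary wins in practice.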

Wiener next attempts to address some of the tricky details of mathematical logic (such as the paradoxes of naive set theory) with corresponding analogues that could apply to a computational system modeled after the nervous system.

Reading Info:
Author/Editor: Wiener, Norbert
Title: Cybernetics
Tags: dms, ai, digital media
Lookup: Google Scholar, Google Books, Amazon

Katherine Hayles: Writing Machines

[Readings] (08.29.08, 4:38 pm)



Institutions are not run by specified rules, but rather by complex networks of individuals, among whom the real causes and reasons for things become apparent. It is individuals and networks of them that cause things to happen. Hayles wishes to look at the digital age and writing environments, but to do that must focus on the networks of forces and individuals that surround the discipline and culture.

To do this, she is starting from a somewhat autobiographical perspective: Hayles started pursuing a track in science (specifically chemistry), but later found the cutting edge research to be tedious and unengaging. She took some courses in literature, and in a new track, found herself puzzling out the inconsistencies with her scientific discipline (ambiguity over clarity, investigating rather than solving problems, etc).

Electronic Literature:

Hayles opens this chapter by noting how the Turing Machine (and by extension the computer) was originally theorized as a calculating machine, but had a hitherto unexpected power for simulation. Hayles poses simulation as applying to environments, and this makes it seem a much more tactile and somatic experience than a conceptual one. She connects simulation to literature and the result is this sort of electronic literature.

Hayles specifically is looking at Talan Memmott’s hypertext work, “Lexia to Perplexia”, which is a jumble of jargon and wordplay, intended to confuse the idea of subjectivity. Hayles describes this language as a creole of English and computer code.

The admitted illegibility is an indication of electronic processes that the reader does not understand, or cannot grasp. “Illegibility is not simply a lack of meaning, then, but a signifier of distributed cognitive processes that construct reading as an active production of a cybernetic circuit and not merely an internal activity of the human mind.” I think this is supposed to mean that interpretation transcends the individual human mind.

The goal in this transformation is to raise awareness and weave together the human body with electronic materiality. This idea seems to be looking in the direction of Donna Haraway, but going more in the direction of a semiotic system. The goal is not to challenge human nature, but challenge subjectivity and language.

Prevalent in the work are allusions to Narcissus and Echo, and these mythological references are intended to highlight the collapse of the original into simulation. Following Baudrillard, there is no longer an ontological distinction between the real and the simulated.

The work is intended to be a “later generation” multimedia or hypertext work, very active and confusing with respect to user interaction. The work goes beyond general hypertext: instead of moving from lexia to lexia, it acts nervously, seemingly of its own accord.

Reading Info:
Author/Editor: Hayles, Katherine
Title: Writing Machines
Tags: dms, cybertext, digital media
Lookup: Google Scholar, Google Books, Amazon

Sherry Turkle: Life on the Screen

[Readings] (08.29.08, 4:37 pm)


Turkle presents some methods of looking at computation and culture from a psychological perspective. Her work is grounded extensively in ethnography, and follows individuals for whom computers are a part of their lives. Computers, online communication, and simulation each open new means for interaction and expression for individuals, and at the time of Turkle’s writing (1995), the influence of the internet on culture was still very fresh. It remains fresh today, probably partly due to its continually evolving and changing nature. There are clearly mixed feelings about many aspects of computation, but ultimately Turkle seems to make it out to be a force for good.


Turkle starts by looking at writing, in an anecdote about her learning French. The style of writing Turkle used in her example was bottom-up, rather than top-down, an approach that was contrary to accepted models. This approach uses dialogue and tinkering instead of formal or abstract modeling. (p. 50-51) She divides approaches to material into two categories: hard mastery and soft mastery, which are the practices of engaging with things in a top-down or bottom-up manner, respectively. This distinction is to become a major thread for part of the book, and was an important factor in her earlier work. The appearance of simulation in computer culture encourages soft mastery, bricolage, and tinkering, which make use of the ability to test and experiment, getting into a model as opposed to looking objectively at it. Piaget and Lévi-Strauss discuss bricolage as a stage of development in children, but they present it as a stage to be passed, rather than a whole method of thinking. (p. 56)

In software, change has been made to account for bricolage and other styles of learning and interaction: “Instead of rules to learn, they want to create environments to explore. These new interfaces project the message, ‘Play with me, experiment with me, there is no one correct path.'” [they as software designers] (p. 60) However, this positive reception is far from unanimous: Turkle also looks at the reception of computers and simulation in academic settings. There is a lot of hostility, much of which derives from the opacity of the computer, and in some domains, especially science, that opacity is threatening to fundamental principles. (p. 63-66)

On “The Games People Play: Simulation and Its Discontents”: Turkle looks at simulation from the vantage point of games. Young children learn the tools and concepts of a game by getting a feel for them through practice. As rules became more complex, they lent credibility to the microworld: “A literature professor, commenting on a colleague’s children with their Christmas Nintendo gifts, likened their discourse to that of Dante scholars, ‘a closed world of references, cross-references, and code.'” (p. 67)

On Sim games (SimCity, SimLife, etc): Simulation encourages players to develop understanding of rules and relationships, leading to estimation and intuition. Some relationships are very complex and are not understood, but this does not obstruct the interaction experience. (p. 69)

“It is easy to criticize the Sim games for their hidden assumptions, but it is also important to keep in mind that this may simply be an example of art imitating life. In this sense, they confront us with the dependency on opaque simulations that we accept in the real world. Social policy deals with complex systems that we seek to understand through computer models. These models are then used as the basis for action. And those who determine the assumptions of the model determine policy. Simulation games are not just objects for thinking about the real world but also cause us to reflect on how the real world itself has become a simulation game.” (p. 71)

Turkle describes two responses to simulation: rejection and resignation. A third response is to develop a cultural criticism: “This new criticism would not lump all simulations together, but would discriminate amongst them. It would take as its goal the development of simulations that actually help players challenge the model’s built-in assumptions.”

“Understanding the assumptions that underlie simulation is a key element of political power. People who understand the distortions imposed by simulations are in a position to call for more direct economic and political feedback, new kinds of representation, more channels of information. They may demand greater transparency in their simulations; they may demand that the games we play (particularly the ones we use to make real life decisions) make their underlying models more accessible.” There is nonetheless fear that the complexity and opacity imposed by simulations may never be cut through. A further concern is how to understand the relationship between reality and simulation.

Reconstructing ourselves in virtual communities: there is a complex interaction and relationship between ourselves and machines, and a “complex dance of acceptance and rejection to analogies of ‘the machine.'” There is appeal in thinking of ourselves as cyborg and machine-like. (p. 177) Turkle describes IRC and chat, which, while textual and one of the least “dynamic” of communication methods, are extremely personal. The relationship between self and textual identity is very close. (p. 179) Turkle describes anonymity and MUDs, which can be addictive. This seems to relate, in psychological terms, to the presentation of self. The concept of a persona is much more literal and explicit here. (p. 184)

In an interesting diversion, Turkle discusses tabletop games, citing a specific example wherein the games were used as vehicles for self-definition and epiphany. These are grounds for personal exploration and discovery. Later, MUDs serve a similar purpose, allowing expression of emotions that are difficult to express well in real life. They allow certain degrees of self-experimentation, and the ability to work through real issues. Because of their playful nature, they are not deeply binding. They allow experimentation with being a better self. (p. 186) “You are who you pretend to be”: identity is constructed by fantasy (this pulls back to role experimentation in sociology), and MUDs enable this as was never possible before. (p. 192)

Alternately, MUDs may be a place to reenact the problems of real life, and may become an addiction. (p. 199) The interesting comparison here is that Turkle’s examples so far seem to be generally positive: interaction with technology is a means to self-discovery, communication, and enlightenment. She does not delve deeply into the addictive “holding power” that games exert. While they can very much be devices for progression, that seems to be less the case nowadays. Is this something that has changed, or is it just a matter of changing perspective? Or is it a matter of the people who play games? It would seem that as a medium matures, its capacity for enlightenment would grow, but if that is not the case, why not? Is it because of the capitalization of online entertainment, that games cannot be enlightening if they are to be sold for profit?

Onto some negative qualities of online communication: MUDs enable “easy intimacy” where things can move along too quickly. Commitment is easy in the virtual world, but would be too much for real life. Furthermore, due to the lack of closeness and *embodied* intimacy, it is difficult to understand the degree to which the relationship actually exists. It may exist only in the interactors’ minds. There is a lack of (what sociologists might call) role support. The result leads to projection onto others: “In MUDs, the lack of information about the real person to whom one is talking, the silence into which one types, the absence of visual cues, all these encourage projection. This situation leads to exaggerated likes and dislikes, to idealization and demonization.” The situation is not all that different in modern multi-user environments, MMOs, and Second Life. (p. 206)

Gender play and MUDs. This is more literal in the sense that people *play* a gender. It draws attention to cross-gender portrayal, and to gender relationships and roles. It is also much more acceptable to play at other genders. This gender play is also often about self-understanding and experimentation. (p. 214)

“For some men and women, gender-bending can be an attempt to understand better or to experiment safely with sexual orientation. But for everyone who tries it, there is the chance to discover, as Rosalind and Orlando did in the Forest of Arden, that for both sexes, gender is constructed.” (p. 223) Each example also understands the male gender as constructed, but modern games fail to account for this: they hand players pre-established gender models. This is generally not done with explicit interactions, but with subtle things like the determination of dress. Male is usually the assumed term, while female is external. Is this a new development that arose with the game industry?

Netsex enables confusion over the question of trust and identity. (p. 225) Also deception… textual nature implies an intimacy that is betrayed by deception. More recent environments seem to be much more jaded? (p. 228) Being digital raises new questions of being. It changes our perceptions of community, identity, relationships, gender, and other things. (p. 232)

On the erosion of the real:

Baudrillard ref: Disneyland and shopping malls are part of culture of simulation. They tend towards a culture of isolation and retreat. The loss of the real encourages this. (p. 234) Discussing the compelling nature of false Disney animatronic crocodiles, versus the imperceptibly slow real ones: “Another effect of simulation, which I’ll call the artificial crocodile effect, makes the fake seem more compelling than the real.” (p. 237)

Janice Radway ref, cross with Henry Jenkins: Engagement with media (romance, TV) offers resistance to “stultifying categories of everyday life.” This engagement is somewhat empowering, but also has other more disempowering, limiting effects (see Radway). (p. 241)

A consistent danger is that MUDs encourage people to solve non-real problems, by living in unreal places. Digital worlds enable exploration, but their appeal may be such that one may not wish to return from them. (p. 244)

Rape in MUDs: submission to the digital realm also involves the sacrifice of some autonomy. When one controls a rule-based avatar, the player’s engagement is confined by the rules, and if those rules are compromised, or even if NOT (the rules are simply outside of player control, and possibly outside of player consent), the player’s avatar may be horribly compromised. The self-identification and experimentation with avatars can lead to exploration and understanding, but can also lead to new forms of disempowerment and victimization. (p. 251)

A problem ultimately lies in the depth of the “emote”. Authenticity is irrelevant in a culture of simulation. Emotion displayed in a simulation is necessarily inauthentic. How does one understand feeling or emotion? Emotion or action may be easily displayed in a virtual world, but real emotion is not so easily displayed or understood. Does the emote simply stand for a reflection of Fredric Jameson’s “flattening of affect in postmodern life”? (p. 254)

Reading Info:
Author/Editor: Turkle, Sherry
Title: Life on the Screen: Identity in the Age of the Internet
Tags: digital media, dms, cyberculture
Lookup: Google Scholar, Google Books, Amazon

Sherry Turkle: The Second Self

[Readings] (08.29.08, 4:35 pm)


This is one of Sherry Turkle’s earlier books, and its crux is the understanding of the computer as an evocative object. It calls into question various preconceptions and understandings that we have of ourselves: what it means to be human, what it means to think. By being a fuzzy object that expresses some humanlike characteristics but not others, it leads us to examine and return to the nature and definition of those characteristics. In this book, Turkle looks specifically at cognition, learning, programming, and the various cultures that have emerged around computation. While a bit dated, the book’s findings certainly inform an understanding of the development of culture and computers to the modern day.


Turkle opens with an analysis of the “Wild Child” discovered in France in the year 1800. Appearing shortly after the French Revolution, while theories of human nature and culture were wildly fluctuating, the Wild Child was a “test case” for understanding many of those theories. He was an evocative object, and our understanding of him challenged and provoked new understandings of what it essentially means to be human, and of our relationship with nature. (p. 12)

The computer is similarly an evocative object because of its holding power, which “creates the conditions for other things to happen.” (p. 15) It provokes self-reflection. “The computer is a ‘metaphysical machine’, a ‘psychological machine’, not just because it might be said to have a psychology, but because it influences how we think about our own.” (p. 16)

At work here is a symbolic embedding of human concepts within devices. The difference is cognitive, not embodied, but human ideals and values are put into the machine. As a product of this interaction, the machine’s concepts and values [as in all communication] return to the user. The computer does not have a mind, but it reflects the minds of its creators and programmers. (p. 17)

Turkle on technological determinism vs. attribution: determinism claims that technology has a long-term impact on people, while attribution claims that technology only has the meanings given to it and can be understood only in terms of those meanings. Both are wrong: the computer evokes rather than determines, and the opacity of the computer prevents attribution. (p. 21)

The evocative computer, through engagement with ourselves, encourages us to think introspectively and become philosophers. It asks, “what does it mean to be human?” Behind the anxiety in the popular reception of AI is a concern over what it means to think. This is similar to the anxiety in the 1950s over nuclear holocaust (as evidenced in film and whatnot), and the subtle sexual anxiety that underlies Freudian psychology. (p. 24)

Evocative objects sit on the borderline, and invite being taken apart in order to be understood. They are marginal: too far from humans for real empathy, but autonomous enough to be ambiguous. Turkle draws on Piaget’s studies of metaphysical development, which inspire transcendent questions. (p. 32)

There is a conflict between things and people. Contrast things with The Sims: the border here is especially fuzzy. Software is an extension of the computer; Turkle is discussing physical devices, but software adds an extra layer to this understanding. There is a meta-consideration of the concept of intelligence or humanity, which rational explanation undermines: computers are rational, but not human. (p. 61)

On the seduction of games: “Those who fear the games often compare them to television. Game players almost never make this analogy. When they try to describe games in terms of other things, the comparison is more likely to be with sports, sex, or meditation. Television is something you watch. Video games are something you do, something you do to your head, a world that you enter, and to a certain extent, they are something you ‘become’. The widespread analogy with television is understandable. But analogies between the two screens ignore the most important elements behind the games’ seduction: video games are interactive computer microworlds.” (p. 66-67)

Games lend immersion, and the feeling of being cut off outside the magic circle. For some (notably Jarish, one of Turkle’s primary informants in this chapter), games provide an immersive environment that is a comfortable alternative to the real world, whose complexities cannot be fully understood. The world of games is whole and conscious; it can be a pleasing alternative to reality. (p. 72)

On Woody Allen and the interactive novel: echoes of adaptation and fanfiction here. The key is immersion, being in a world. This aspect counters the classical understanding of phenomenology, which wants to look only at the real physical world. A difference is in identity: computers enable the projection of other identities, and allow for self-insertion, role-playing, and controlling characters. (p. 77) A reference to Gone with the Wind: also a matter of world construction. Clearly the present state of the game industry is quite different. (p. 78)

On the culture of simulation: “Video games offer a chance to live in simulated, rule-governed worlds. They bring this kind of experience into the child’s culture and serve as a bridge to the larger computer culture beyond. They are not the only agent to do so. Reinforcements come from surprising quarters. Children come to the video games from a culture increasingly marked by the logic of simulation.” (p. 79) Later Turkle discusses Dungeons and Dragons: which operate as a rule-governed simulation fantasy.

On “playing house” style of games: Turkle is placing rules in opposition to empathy, but rules [especially social ones] underlie even abstract fantasies. These enable experimentation and play with real domains. Sociologists describe understanding of rules and structures that occur within society, and see “playing house” games as role-learning. There is not simulation proper, but there is a reflectivity that is also present in simulation. (p. 83)

Papert and LOGO: learn to speak French the way French children do, by speaking French to French-speaking people; learn mathematical logic by speaking math to a mathematical entity. This is Piagetian learning: it happens automatically in the right circumstances. From translation studies, we know that language implies culture. (p. 97)
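Papert's idea can be illustrated with a minimal, graphics-free LOGO-style turtle. This is a hypothetical sketch in Python, not Papert's LOGO itself: the child "speaks geometry" to the turtle and can discover, for instance, that a closed square trip returns home after turning through a total of 360 degrees (Papert's Total Turtle Trip Theorem).

```python
import math

class Turtle:
    """A minimal LOGO-style turtle: position and heading, no graphics."""
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # heading in degrees

    def forward(self, distance):
        rad = math.radians(self.heading)
        self.x += distance * math.cos(rad)
        self.y += distance * math.sin(rad)

    def right(self, angle):
        self.heading = (self.heading - angle) % 360

# "Speaking math": a square is FORWARD 50 RIGHT 90, repeated four times.
# The four 90-degree turns sum to 360, and the turtle ends where it began.
t = Turtle()
for _ in range(4):
    t.forward(50)
    t.right(90)
```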

Learning and mastery: various styles, both hard and soft, are valid. This leads to wondering: what of the middle ground between them? The top-down style can be disastrous if the initial plan is faulty. The bottom-up style may never get anywhere due to its lack of structure. What of a middle-out strategy? (p. 105)

Programming is an ambiguous field through which people may explore approaches to reality. Naturally, this leads to the exploration of gender, enabled by the objectivity of the computer. “Approximate models” could be simulated and addressed reflectively. (p. 109)

Adolescence is characterized by self-discovery and self-definition. The example here is a girl for whom power was threatening, but constraint enabled control. The computer enables a “world apart” for building a new self-image. (p. 145)

Reflection is an externalization of the self. Computation and conceptual metaphors offer a new means of looking at the self, in relation to the machine and to the world. (p. 155) Children eventually turn to computational metaphors to describe themselves, and this can extend to culture at large. Has this taken place?

The computer is a catalyst for culture formation. This is even more true with widespread internet adoption, and is discussed in “Life on the Screen”. A new computational aesthetic enables a new cultural understanding. It enables previously inaccessible understanding: not just information, but knowledge. Consider Wikipedia and the like. (p. 166)

A computer is an object to think with and more, it is a building block around which may emerge new cultures and values. “The men and women I am writing about here also used the computer as an ‘object-to-think-with.’ But here the computer experience was used to think about more than oneself. It was used to think about society, politics, and education. A particular experience of the machine–only one of the experiences that the machine offers–became a building block for a culture whose values centered around clarity, transparency, and involvement with the whole. Images of computational transparency were used to suggest political worlds where relations of power would not be veiled, where people might control their destinies, where work would facilitate a rich intellectual life. Relationships with a computer became the depository of longings for a better, simpler, and more coherent life.” (p. 173-174) Here, again, the computer is a vehicle for utopian thinking. However, while the utopia may not be realized (and indeed, it hasn’t been), these computational values have definitely been adopted within computer culture.

Games feel more knowable than the real world, with which our engagement is deep but always incomplete. This is the opposite of the phenomenological expectation. It relates back to mechanization and mechanical reproduction: in a mechanized world, the person is a cog and may see only a part; in a game world, a person may see the wholeness of the system and attain mastery over it. (p. 186)

Relationships with computers: Children: keep it mysterious. Adults: make it transparent, want total understanding. (p. 195) This is not to say that children do not try to understand them, but they use computers as a vehicle for deeper philosophical understanding of things. Adults will use computers to escape the overbearing gravity and complexity of the world. (p. 192)

On the controversy of “The Hacker”: The concern is over the relationship to engineering tools, but this concern does not apply to artistic tools. Culture accepts an artist’s relation to tools as being intimate, but this seems over the line when extended to engineering tools. The danger of the hacker is the rejection of physical embodied life for the purity of the machine. (p. 205) “They are like the virtuoso painter or pianist or the sculptor possessed by his or her materials. Hackers too are ‘inhabited’ by their medium. They give themselves over to it and see it as the most complex, the most plastic, the most elusive and challenging of all. To win over computation is to win. Period.” (p. 207)

Science fiction, literature, and hacker culture revolve around the desire to control and master, imposing “hacker values” of individualism, mastery, and nonsensuality onto literary worlds. The game industry has been built around hacker culture. This may go some way toward explaining the games we have now. (p. 222)

Hacker culture is built around an intimate identification with the object. Baudrillard definitely carries through here. The purity of the object is pure seduction. (p. 238)

Turkle discusses Newell and Simon’s General Problem Solver as a big step in AI. AI models are predicted to show how people think, but they address only certain types of problems and model only certain types of thinkers. One of the projects is a computational model of Freudian slips. (p. 244)

The Freudian Slip program was evidently made by Don Norman. There is a difference between thinking and feeling behind the Freudian model here. Machine implies intention, human implies mistake. How could a simulation “make a mistake?” Searle criticizes AI for its lack of intentionality, but the problem here seems to be a lack of involuntary behavior. “Freud saw meaning behind every slip. Sometimes the meaning was obvious, sometimes it had to be traced through complex chains of association and linguistic transformations. Norman’s emphasis on computational representation draws attention not to meaning but to mechanism.” (p. 248)

Again on computational anxiety: “Behind the popular acceptance of the Freudian theory was a nervous, often guilty preoccupation with the self as sexual; behind the widespread interest in computational interpretations is an equally nervous preoccupation with the self as a machine.” (p. 299) A thought on why some models are so powerful and compelling.

Paradox is present in machines and has a power (on Gödel’s incompleteness theorem). It makes machines, and the mathematical logic underlying them, complex enough to reflect the potential for paradox, and gives them a further, human-like depth. (p. 305)

“Ours has been called a culture of narcissism. The label is apt but can be misleading. It reads colloquially as selfishness and self-absorption. But these images do not capture the anxiety behind our search for mirrors. We are insecure in our understanding of ourselves, and this insecurity breeds a new preoccupation with the question of who we are. We search for ways to see ourselves. The computer is a new mirror, the first psychological machine. Beyond its nature as an analytical engine lies its second nature as an evocative object.” Computations provide a mirror, but reflect the self as a machine. (p. 306)

Reading Info:
Author/Editor: Turkle, Sherry
Title: The Second Self: Computers and the Human Spirit
Tags: specials, digital media, cyberculture
Lookup: Google Scholar, Google Books, Amazon

Don Norman: The Psychology of Everyday Things

[Readings] (08.29.08, 4:33 pm)


In this classic and seminal work, Norman covers the design of objects and devices, using copious examples of bad design to illustrate how good design may be achieved. Norman’s perspective is that of a usability expert and an engineer. He is interested in users’ cognitive models of devices, and in how easily users may execute their goals using those objects. In this sense, he uses a classic cognitive model of problem solving, very reminiscent of Newell and Simon’s General Problem Solver.

Norman’s key points concern the cognitive models of objects that users form, the transparency with which objects let users discover those models, and how objects’ affordances map onto them. Objects should use feedback to make their states transparent, so that they may be readily understood.

In studying human engagement with artifacts, Norman uses a cycle (familiar to cognitive science) of goal formation, execution, and evaluation. Norman outlines seven stages of action:

  1. Forming the goal
  2. Forming the intention
  3. Specifying the action
  4. Executing the action
  5. Perceiving the state of the world
  6. Interpreting the state of the world
  7. Evaluating the outcome
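As a toy illustration of how this cycle iterates, the sketch below walks the seven stages for a user adjusting a dimmer whose step size does not match their goal. The scenario and all names are my own invention, not Norman's: it shows how repeated evaluation feeds back into new intentions, and how frustration arises when feedback keeps contradicting expectations.

```python
def seven_stage_cycle(world, goal_brightness, max_iterations=10):
    """Iterate Norman's seven stages until the goal is met or we give up."""
    history = []
    for _ in range(max_iterations):
        # 1. Forming the goal: reach goal_brightness.
        # 2. Forming the intention: decide to raise or lower the light.
        intention = "raise" if world["brightness"] < goal_brightness else "lower"
        # 3. Specifying the action: press the dimmer once.
        # 4. Executing the action: the device moves in steps of 3,
        #    which the user does not know (a gulf of execution).
        step = 3
        world["brightness"] += step if intention == "raise" else -step
        # 5. Perceiving the state of the world.
        perceived = world["brightness"]
        # 6. Interpreting the state of the world.
        history.append(perceived)
        # 7. Evaluating the outcome against the goal.
        if perceived == goal_brightness:
            break
    return history

# A goal of 10 can never be hit in steps of 3 from 0: the user
# oscillates between 9 and 12, and the cycle never terminates happily.
trace = seven_stage_cycle({"brightness": 0}, 10)
```

A goal that is reachable in whole steps (say, 9) ends the cycle as soon as evaluation succeeds, which is the unremarkable case Norman's model treats as normal engagement with the world.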

This clearly operates in opposition to a more phenomenological reading of interaction. It stands by the notion of man as a reasoning animal, so this cycle and its iterations (identifying error and the like) are a normal part of engagement with the world. Frustration occurs when expectations conflict with feedback during execution and evaluation.

Where Norman’s theory fails is in his treatment of aesthetics (which are always subservient to the functional capacity of an object), and in cases where user intention is entirely unaccounted for in the original design. This makes Norman’s approach conflict strongly with creative tasks. The best example of this is the Lego motorcycle (p. 83), which Norman seems to believe has a “correct” assembled shape.

Reading Info:
Author/Editor: Norman, Donald
Title: The Psychology of Everyday Things
Tags: digital media, dms, hci
Lookup: Google Scholar, Google Books, Amazon