icosilune

Archive: August 29th, 2008

Brenda Laurel: Computers as Theatre

[Readings] (08.29.08, 4:30 pm)

Notes

Chapter 1: The Nature of the Beast

Laurel opens with Spacewar, the first computer game, which was developed at MIT in 1962. The game was the "natural" thing to do with the computer when it arose. Laurel claims that this is because games have the capacity to represent action, and the key ingredient is the human element. That fact makes the interface a matter of significant importance. Agency is shared in computers: both the user and the computer act, and they have a common context for action. This idea is required for the standard dialogue and communication model of UIs. The idea of agency and action is derived from and anchored in Aristotle.

Laurel looks at psychology, which has been a longstanding influence in HCI (especially through Don Norman), and finds that psychology and theatre are closely related. Psychology "describes what goes on", and theatre "represents what might go on". The approach to interfaces defined by Norman, and in psychology in general, is found to be the same as the definition of theatre: "representing whole actions with multiple agents." (p. 7)

Laurel finds fault with traditional definitions of interfaces. An interface may be a simple layer between the user and the machine, but cognitive science introduces the notion of mental models, so the user and computer have models of each other. However, this can easily turn into a "horrible recursion" of self-reference and abstraction. Another classical approach views the interface as a prosthetic, mapping actions nearly one to one between the user and the machine. To Laurel, none of these approaches is satisfactory, so she turns to a theatrical model of interfaces. The idea is that the program performs for the user, and that the UI should be modeled after the technical support for a theatrical production, which, when it works, is totally invisible and does not intrude on the user's experience. This idea seems to reverberate in contemporary "rich client" design, which uses display effects and lots of theatrics and feedback to demonstrate interaction. However, as in many theatre productions, technical support does not always conceal the nature of the play (for instance, the moving mechanical set in Sweeney Todd). Not all models of theatre aim for transparency; some are highly reflective, and this makes quite a bit of sense, especially in connection to rich UIs.

The challenge with the theatrical model of the interface is the user's role within the production. Clearly, a play cannot easily welcome audience participation. Instead, Laurel proposes that the interface treat the user as a performer (or character), which echoes others, notably Goffman. This tradition is generally known as scripting the interactor, and is commonly used in tutorials and games. Defining interactivity leads to another conceptual problem, one that also percolates into Chris Crawford's work on interactive storytelling. Laurel's definition is this: "It enables you to act within a representation that is important." (p. 21) This definition complicates standard assumptions about interfaces, but is nonetheless quite evocative. It depends, however, on ideas of action and representation that need to be fleshed out further.

Imagination plays a significant role in models: it is used by humans for planning (in the traditional AI sense), but it is also generally used for much more. Art is an external representation of imagination. The things represented in art often tend to come with whole worlds of meaning, even when the things are wholly imaginary. The idea of an interface metaphor convolutes the relation between representation and reality, whereas the language of theatre better establishes that relation as a knowable boundary.

Chapter 2: Dramatic Foundations

The treatment of drama here is of the Aristotelian variety, focused squarely on the Poetics. The question of "Why Aristotle?" is addressed; it seems to be because Aristotle is complete, and one might also say that Aristotle is easy to model. The Greek cultural frame of drama, specifically the idea of divine possession, inspires the more modern sentiment of immersion, which links to immediacy and transparency where interfaces are concerned. Divine inspiration precludes the boundary of a medium.

Aristotle outlines four causes of representation, and Laurel expands on these and applies them to HCI. Programs themselves are difficult to reduce to this model because of their complex layers and patterns. Laurel's investigation of programs is persistently grounded in the user experience of the software. So, while a program may be very good at a task, if it is nearly impossible for users to perform that task, the task is not effectively performed by the software. Laurel finds two things specifically: "functionality consists of the actions that are performed by people and computers working in concert, and programs are the means for creating the potential for those actions." And: "The most important way in which applications, like plays, are individuated from one another is by the particular actions they represent." (p. 45)

  • Formal cause: This is the essence of something, the abstract idea that the representation aims to capture. The formal cause of a play is the action and plot that it performs. The formal cause of a human-computer activity is a representation of action (the functionality?)
  • Material cause: This is the physical construction of an object, its substance and composition. The material cause of a play is its enactment. The material cause of an interface (not a program!) is its enactment as well, the presentation and feedback given to the user.
  • Efficient cause: This is how the object is made, involving the maker and the tools. The efficient cause of a play is its technical construction, and it is the same for interfaces. However, interfaces are also notable in their use of models and concepts, especially in reference to the underlying application itself.
  • End cause: This is the purpose of the thing in question, its reason for existence and application. The end cause of a play, in the Aristotelian model, would be its catharsis. The end cause of an interface is the successful use of its functionality by the user.

Laurel also presents Aristotle's six elements of qualitative structure in drama: Action, Character, Thought, Language, Melody/Pattern, and Spectacle/Enactment. Each of these is connected to UI aspects as well as classical dramatic ones; this is notably the same scale used by Mateas and Stern in describing Facade. An effect of this structure is that material causes propagate backwards, originating in the enactment and affecting the other levels towards the top, while formal causes originate at the level of action and move downward, affecting each other aspect of the production. The conflict framed by this confluence of factors resembles the problems found in design of all sorts. Even so, this stratification of elements is found in many other sources (pertaining to architecture and design, among other things), not all of which are Aristotelian.
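To make the two causal chains concrete, here is a minimal sketch of the hierarchy as an ordered list. The ordering of elements comes from the text above; the function names and the code itself are my own illustration, not Laurel's notation.

```python
# The six qualitative elements, ordered from Action down to Enactment.
ELEMENTS = ["Action", "Character", "Thought", "Language",
            "Melody/Pattern", "Spectacle/Enactment"]

def formal_causes(element):
    """Levels above the given element: formal causation flows downward."""
    return ELEMENTS[:ELEMENTS.index(element)]

def material_causes(element):
    """Levels below the given element: material causation flows upward."""
    return ELEMENTS[ELEMENTS.index(element) + 1:]

print(formal_causes("Thought"))   # ['Action', 'Character']
print(material_causes("Thought")) # ['Language', 'Melody/Pattern', 'Spectacle/Enactment']
```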

Regarding the behavior of programs, Laurel emphasizes the importance of connecting it to the idea of character. For Aristotle, a good (virtuous, even) character is one who successfully transforms thought into action, and a virtuous object is one which fulfills its purpose. This idea also implies that a good character is one who fulfills expectations. With this in mind, the model aspires to mythologically emphatic plots and closely prescribed experience, as well as to transparent user interfaces. What complicates the situation is that characters and interfaces must be appropriate for the function or action.

Reading Info:
Author/Editor: Laurel, Brenda
Title: Computers as Theatre
Type: book
Context:
Tags: digital media, dms, cybertext
Lookup: Google Scholar, Google Books, Amazon

Brown and Duguid: The Social Life of Information

[Readings] (08.29.08, 4:29 pm)

Overview

Brown and Duguid look at the growing futurism of the information age and present a variety of critiques of common claims that digital technology will change the face of the world. The book was first published in 2000, as many of the fantastic endeavors of the dot-com boom were beginning to collapse. With sufficient retrospect, Brown and Duguid's warnings that the internet will not totally change the face of business (among other things) seem obvious and unsurprising, as the futurist claims now seem facile. They ultimately do not pin down a solution to the question of where technology is taking society, but they do assert that tunnel vision will not yield the correct answers; rather, they advise, it is important to look around. Specifically, they emphasize that the entities threatened by new technology will adapt in self-preservation, that the existing status quo is a product of evolution and has many benefits that technology cannot replace, and that technology will fall on its face if it is unable to address social needs.

The book is published by the Harvard Business School Press, and this betrays some of the interest underlying the exploration in the text. Brown and Duguid are interested in how technology shapes the economic and business world, not necessarily its inherent or expressive capabilities.

Notes

In the preface (written in 2002), Duguid and Brown say "We are not trying to ignore or denigrate information technologies (the term Luddite nonetheless appeared intermittently in conversations). Rather, we are arguing that by engaging the social context in which these technologies are inevitably embedded, better designs and uses will emerge." This places a usability/design spin on the interpretation. (p. x) They also contrast technology as a replacement for humans (or human processes) versus as an augmentation, which echoes the early division between Engelbart and common AI modes of thinking. (p. xii)

The discussion at this early point is about design: coming from the perspective of information versus social consideration in design objectives. (p. 4) A recurring theme in the book is the "6-Ds" or "6-D vision", referring to several theories which emerged around digital technologies: "demassification", "decentralization", "denationalization", "despatialization", "disintermediation", and "disaggregation". These visions predicted fundamental change at every level of our social structure, involving the demise of many of the most stable institutions, notably offices and work spaces, firms and corporations, universities, national governments, etc. Evidence and hindsight indicate that these have, indeed, not been realized. The problem is a matter of perspective: if information is seen as the cause of everything, then change in the transmission of information will create dramatic results. The authors seem to imply that the abundance of information technology has led to the vision that everything is mere information. (Sounds kinda like Raymond Williams.) (p. 22)

Predictions miss because they fail to anticipate factors arising from "around", not just along the direction of the technology itself (as in 1950s futurism). A question to consider: what external forces shape the development of information technology today? What are the next revolutions that will occur (or have occurred) in parallel, unanticipated by information futurism? (p. 32)

On agents: the authors work from a robust understanding of agents and their applications, but tend to veer toward doomsaying to a significant degree. They share Weizenbaum's assessment that agents are made to seem more like people, and vice versa. (p. 39) There is a great deal of fear over the ambiguity and opacity of agents: when we do not know what goes on under the surface, they may not be innocent. Do people expect agents to be innocent? People may be more jaded now, but we still use Amazon recommendations and various Google technologies without complaint. Are they any different, or less frightening, now? (p. 45) There is further discussion of agent brokering, with worrying remarks over whether people can tell the difference between an inept agent and a corrupt one. Compare with Jesse James Garrett, on how agents could automate many dimensions of problem-solving and search processes. (p. 48)

On agents and the social/cultural issues of goal-oriented behavior: the authors discuss "human foibles" as evidenced by shopping at a supermarket, and how decision making changes very rapidly. Human rules (for how to perceive, select, etc.) also change rapidly and are more volatile than agents could hope to be. (p. 51) Doomsaying is itself a kind of tunnel vision: it neglects the many states and measures in place to track agent behavior, and agent autonomy is extremely limited, so the fear that agents will replace human behavior seems misinformed. (p. 55) This begins to follow an argument towards embodiment, but stops short of making it explicit.

The authors do not discuss other forms of agents (for instance in games, or specifically The Sims) whose goals are to simulate and intentionally represent human foibles to some degree. One could argue that human understanding of the world (and our changes of decision) is already the result of manipulation and preference abstraction as represented by advertising stereotypes. So agents could *potentially* enlighten this matter were they transparent. The authors do not go so far as to suggest that agents could have a place in the world that would not undermine its moral foundation. This fear seems as flawed as the idyllic holy land predicted by the futurists.

Brown and Duguid move on to some traditional environments which information technology "threatened" to devastate, where the replacement version of the process or environment turned out to be less capable of satisfying social needs than the traditional one. The problems produced sound very similar to Norman's common frustration with everyday objects. (p. 72) Part of this, again in hindsight, indicates a fledgling technology/medium, where new experiments and changes are made at a rapid pace and the successful survive. Part of this critique sheds light on an interesting issue, though: existing institutions and processes tend to emerge via evolutionary means, and that which creates success will spread by natural selection. But the problems are being critiqued in a manner that sounds like a problem of design. Could Norman's principles of transparency and affordances, and his model of process-oriented interaction, be applied to constructs like an office? A worthy question to consider.

Some embodiment is discussed here: it sounds as if digital media does not enable, or is not equipped to handle, embodiment? "Putting this all on the desktop, while supporting the individual in some ways, ignores the support and knowledge latent in systems that distribute work." This argument sounds a lot like Norman, but instead of frustration arising from execution/evaluation, it arises from the designer's failure to account for human social structures. (p. 80) An example of good social design is how Apple distributed computers to schools very liberally, which led to a wide consumer base. Similarly, IBM and Microsoft benefited from the widespread distribution of IBM machines in corporate environments.

Discussing knowledge and information: the classical concern is to find the difference. The flaw of digital futurism is to equate information with knowledge. The difference seems to involve a few details: knowledge involves a knower (***), who actually knows the material in question. It also reflects a matter of knowing how, versus knowing that. Knowledge seems to involve a "gradual assimilation". Furthermore, knowing how is also a matter of knowing to be. (p. 119) Knowing how requires practice. (p. 128)

Knowledge relates to learning, while information relates to search. Learning seems to require a community? The discussion here focuses on business practices rather than more abstract knowledge. Learning requires practice and peer support. (p. 125)

Knowledge is leaky in some areas and resistant in others, and these differences seem to be community-oriented. Duguid and Brown bring up the example of Steve Jobs visiting Xerox PARC and coming away with UI ideas, where the rest of the Xerox community was ill-receptive to them. (p. 151) The ability of information to disseminate like that and find footing in diverse areas is one of the internet's greatest strengths, but it goes oddly unmentioned. Instead, they argue that the firm will persevere in its role of nurturing new businesses, and that regional clustering will remain a significant factor in the development of business. This is true, but it is foolish to ignore the many affordances of digital media for disseminating knowledge in other forms.

Document design is discussed next: how paper has grown despite many heralding its demise. Much of this revolves around the embodied nature of documents and the context that surrounds them. The authors suggest that digital technology can learn a lot from traditional document design, and I think history has shown that it has. Metadata and community-based aggregation services/technologies/portals have applied copious metadata to form context around many electronic documents. This change is a strong feature of modern web design. History (at all levels) has shown that, yes, metadata is important. (p. 205)

The authors leave off basically without answers, other than to avoid tunnel vision. They reiterate that technology will not succeed without accounting for social context, but never once imagine what capabilities technology might have were it to do so. They assert that institutions (the firm, the paper document, the university) evolved and will continue to operate, as they are evolving structures. Technology will not eliminate them. BUT technology will change them, and the question of how is left unaddressed. Some institutions, e.g. copyright, have many forces supporting their continued existence, but clearly the digital age and the ease of replicating data cannot leave copyright unchanged. There are many directions and possibilities for these changes, but Brown and Duguid are uninterested in identifying the problem or stressed areas, or in taking a position on how or what changes may or should occur. (p. 252)

Reading Info:
Author/Editor: Brown, John Seely and Duguid, Paul
Title: The Social Life of Information
Type: book
Context:
Tags: digital media, dms
Lookup: Google Scholar, Google Books, Amazon

Espen Aarseth: Cybertext

[Readings] (08.29.08, 4:27 pm)

Overview

Aarseth attempts in this work to catalogue and develop a theory of nonlinear (also multilinear and otherwise) texts, of both the electronic and paper variety. He addresses the complexity of understanding the nature of a text: its access, interaction, and phenomenological presence. The variety of these devices leads to what he calls cybertext, which differs in underlying ways from narrative.

Notes

The act of reading can be interpreted as a kind of power play. This involves a sense of safety, which can be compared across forms such as tabletop and computer simulation. Power grants added control, but with extra immersion and investment. Reading itself is not devoid of power, but it is a more subtle kind (Intro, 5). Oral storytelling traditions involve significantly more interactive and participatory characteristics; some of this is phenomenological and rooted in a sense of place. Conversation can be seen as heavily interactive, whereas dialogue is less so (Intro, 15).

Aarseth discusses semiotics in the first chapter, exploring means of interpreting signs in cybertexts (specifically games) in a literal manner, i.e., each visual element in the game is some sort of sign unit. Signs need not be human-interpretable, though, and a sign system need not be inherently linear, either. Could some cybertexts exist without an inherently logical/symbolic interpretation (Paradigms, 31)? Aarseth emphasizes the duality of code versus the execution of code, which heavily distorts the symbolic interpretation (Paradigms, 40).

Aarseth explores some types of interactivity. He cites Peter Bøgh Andersen (1990, A Theory of Computer Semiotics: Semiotic Approaches to Construction and Assessment of Computer Systems): "An interactive work is a work where the reader can physically change the discourse in a way that is interpretable and produces meaning within the discourse itself. An interactive work is a work where the reader's interaction is an integrated part of the sign production of the work, in which the interaction is an object-sign indicating the same theme as the other signs, not a meta-sign that indicates the signs of the discourse." Compare with Crawford and others.

One of the most interesting features of Aarseth's work is his statistical typology of cybertexts. The statistical approach is less absolute or conceptual, and more empirical and analytic. He identifies a set of variables along which possible works can be sorted: Dynamics, Determinability, Transiency, Perspective, Access, Linking (Textonomy, 62).
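The six variables lend themselves to a simple data model. The sketch below illustrates the shape of the typology rather than Aarseth's notation; the value sets for each variable are my assumption of his categories and may not match the book exactly.

```python
from dataclasses import dataclass
from typing import Literal

# Illustrative data model of Aarseth's typology. The variable names come
# from the notes above; the value sets are assumed and may differ from
# Aarseth's exact terminology.

@dataclass
class CybertextProfile:
    dynamics: Literal["static", "intratextonic", "textonic"]
    determinability: Literal["determinable", "indeterminable"]
    transiency: Literal["transient", "intransient"]
    perspective: Literal["personal", "impersonal"]
    access: Literal["random", "controlled"]
    linking: Literal["explicit", "conditional", "none"]

# Under this scheme, an ordinary codex novel might be profiled as:
novel = CybertextProfile(
    dynamics="static",
    determinability="determinable",
    transiency="intransient",
    perspective="impersonal",
    access="random",
    linking="none",
)
print(novel)
```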

Aarseth discusses Mary Ann Buckles (1985, "Interactive Fiction: The Storygame 'Adventure'"), which sounds like a thing to look up. Aarseth also discusses the differences between plot (sjuzet) and story (fabula), as well as drama and intrigue, in the context of the adventure game, exploring how these are dynamically intertwined and related (Adventure Game, 112). Aarseth also describes many IF works as functionally autistic (115): "Personal relations and habits in an adventure game like Deadline might best be described as autistic. The Encyclopedia Britannica defines autism as 'a neurobiological disorder that affects physical, social, and language skills.' Further, 'it may be characterized by meaningless, noncontextual echolalia (constant repetition of what is said by others) or the replacement of speech by strange mechanical sounds. Inappropriate attachments to objects may occur. There may be underemphasized reaction to sound, no reaction to pain, or no recognition of genuine danger, yet autistic children are extremely sensitive' (Britannica Online, 'Autism')".

In Cyborg Author, Aarseth criticizes dramatic theory. Finally, he critiques literature as an ideal for cybertext. “To achieve interesting and worthwhile computer-generated literature, it is necessary to dispose of the poetics of narrative literature and to use the computer’s potential for combination and world simulation in order to develop new genres that can be valued and used on their own terms.” (The Cyborg Author, 141).

Aarseth explores the MUD, describing it as a symbolic exchange environment: not necessarily utopian, but unmoderated and open. Consider application to game idea (!). Compare to Second Life and others.

Ruling the Reader describes various kinds of general communication: reading, listening to reading, listening to a lecture, conversation. Each has significant differences, and each is a different form of communication and participation.

Reading Info:
Author/Editor: Aarseth, Espen
Title: Cybertext: Perspectives on Ergodic Literature
Type: book
Context:
Tags: digital media, dms, cybertext
Lookup: Google Scholar, Google Books, Amazon

Umberto Eco: The Open Work

[Readings] (08.29.08, 4:26 pm)

Overview

The Open Work, (Opera Aperta, in its original Italian) is Umberto Eco’s first book on the subject of semiotics, although it was not considered such at the time. Eco is concerned with the evolution and values of open works, where openness is in the sense of freedom of interpretation and meaning making. Openness is dependent on the freedom for an observer to interpret or explore meaning within a work.

The Open Work is a reaction against Croce, a predecessor of Eco and a product of Italian fascism, who strongly emphasized the idea of pure meaning and authorial intent.

Semiotics as encyclopedic: sense/meaning derives from rules applied to sign systems. Infinite semiosis implies that language cannot touch the world (Wittgenstein?); in AI this connects to the problem of infinite regress.

“Meaning is an infinite regress within a closed sphere, a sort of parallel universe related in various ways to the ‘real’ world but not directly connected to it; there is no immediate contact between the world of signs and the world of the things they refer to.” (p. xxii)

The modern work is a representation of the knowledge of the contemporary world, and of the contemporary crisis. Eco argued that via formal ambiguity, art makes a political stand: breaking down form is a political act, an idea that still reverberates in the avant garde. (p. xx) Eco tones down this idea during his semiotic period, though.

Chapter 1: The poetics of the open work

Eco lists a number of composers and works that incorporate degrees of freedom for the performers. It is interesting to connect this to the composer Sylvano Bussotti, part of whose piano sheet music was included as the header for Deleuze and Guattari's chapter "Rhizome" in A Thousand Plateaus.

can connect with fan/participatory culture in terms of remixing movements

"Every reception of a work of art is both an interpretation and performance of it, because in every reception the work takes on a fresh perspective for itself." Openness is an interpretive freedom.

The next question: why does the artist need to include this openness?

Platonic form argues for closure: that there is only one right way to do something aesthetically.

This changes in medieval interpretation, where scriptures were read according to moral, allegorical, or anagogical dimensions, interpreted and applied toward new meanings. This is not indefinite or open; rather, it constrains reading along four channels of interpretation. This reflects the order of a society that is imperial and theocratic.

Eco moves on to the Baroque, which has an interesting style: its richness and complexity (between extremes of solid and void, light and darkness, curvature, etc.) and a certain plasticity all demand that an observer witness the work not just from one perspective, but move around to better absorb the movement and dynamism of the Baroque form.

Baroque culture emphasizes a certain creativity in man: since the Renaissance, man has changed into a puzzle solver, a creator, etc. However, Baroque culture still has a significant degree of codification and rigidity in its structure.

Eco considers Romanticism next, and the emergence of "pure poetry", which is by nature abstract, blurry, and interpretive. The pleasure of poetry is guessing. This leads to the idea of suggestiveness, which attempts to create openness for the reader.

On to the death of authorial intent?

Contemporary openness (Kafka, Joyce) leads to the construction of worlds, which are self-contained microcosms. These worlds reflect the incarnation of ideas, certain *senses*, which are arguably the real meanings of an open work.

The open work requires the reader to *make* the composition with the composer.

Artistic forms, their aesthetics/poetics, reflect the way that science or culture views reality. An example of this is how the emergence of the "field" in physics influences the manifestation of cause and effect in artistic works. Similarly, Eco discusses the logical problem of binary logic; this relates to the "Law of the Excluded Middle", a foundation of mathematics that has recently been questioned by some logicians. Further examples are mathematical incompleteness (Gödel, notably), as well as Einsteinian relativity, Heisenberg's uncertainty, etc.

Openness is a fundamental part of perception. We can observe and interpret, but essentially never exhaust.

For artistic creation, the artist's role is to start a work, and it is the role of the viewer (addressee) to finish it. The "completed work", which arises from the interpretation of the observer, still belongs to the artist in a sense, but must also belong to the viewer. Open works are never quite the same, but are also never gratuitously different: the open work is still constrained in its outcomes and limited in that it remains grounded within an ideology.

Concludes with some bullet points:
1) Open works are in movement and are characterized by an invitation for the observer to make the work with the author.
2) Of all works in movement, there are some works that are open for interpretation and uncovering on the part of the observer.
3) Every work is open to degrees of interpretation.

Eco stops short of connecting the worlds defined by open works: each open work is a field of possible interpretations, which together define a world of connected meanings that is consistent and navigable by observers. At the same time, this requires certain degrees of accessibility. Eco still stops short of allowing these worlds of works to connect to each other. This is what Deleuze and Guattari would argue: that the worlds of meaning set up by each work connect to each other and to the areas from which they borrowed references and ideas.

Dimensions of openness: openness and games. Games may be very prescriptive, even despite interactivity, leaving no ambiguity in terms of meaning-making. Contrasting this, we can explore meaning-making in games that have essential ambiguity (like Metal Gear, which tears down the fourth wall at points). Other games leave a great deal of freedom in understanding meaning. This openness seems to derive from inherent ambiguity in some works.

Reading Info:
Author/Editor: Eco, Umberto
Title: The Open Work
Type: book
Context:
Tags: dms, philosophy, narrative, media traditions
Lookup: Google Scholar, Google Books, Amazon

David Bordwell: Film Studies

[Readings] (08.29.08, 4:25 pm)

Notes

The book is about the end of Theory. Namely, Bordwell is referring to Grand Theory, the first general theory of film, which aimed to apply a sort of universal approach to interpreting films. Theory is to be replaced with the process of theorizing, substituting a wide range of new approaches to cinema for classical Grand Theory.

Bordwell's essay examines the ways in which film theory has developed and splintered over time. The dominant schools of thought here are subject-position theory and culturalism, both of which try to describe and explain properties of "society, history, language, and psyche". Bordwell contrasts these with a third, "middle-level" research, which aims to answer smaller problems.

The predominant theory of film originating in the 1920s was auteurism, which posed the idea of a film as authored by a single creator. This theory was challenged by others, namely structuralism in the 1960s, which looked at films within structures and genres (such as gangster films, etc.).

It is odd to note that auteurism was not attacked on the grounds that films are such massive and complex projects that a single individual cannot possibly be responsible for the film in its entirety. That idea, while it may not be widely accepted as a theory, is still massively popular.

Both auteurism and structuralism are strongly prevalent in the popular understanding of games. Games are always categorized by genre, and many notable ones have auteur names attached: Miyamoto, Wright, Molyneux, etc.

Structuralism, by way of semiotics and mythologies, serves to relate films to a ritual structure, reflecting and enacting social dilemmas. "For example, Thomas Schatz argues that like myth, Hollywood genres are social rituals replaying key cultural contradictions. The emphasis which Hollywood filmmakers place upon the resolution of the narrative indicates the importance of key thematic oppositions, such as man/woman, individual/community, work/play, order/anarchy."

Structuralism was gradually superseded by subject-position theory via the percolating influence of post-structuralist thinkers such as Derrida, Lacan, and Foucault. The changes originated in the challenge of finding the social and psychic function of cinema, which in turn led to questions about the role of the "subject". This shift in focus led to the perspectives of scopophilia and narcissism, where films serve to satisfy various voyeuristic desires. This aspect defined ideology through representations, which led to an inescapable situation where representation determined subjectivity.

A theoretical school that emerged beyond subject-position theory is culturalism, which holds that "pervasive cultural mechanisms govern the social and psychic function of cinema." This domain too seeks to define a foundation of knowing and acting, but allows subjectivity some freedom from representation. The center of cultural studies is understanding the history of cultures that use texts. Cultural studies offers a more general, lighter perspective than subject-position theory.

Having reviewed the grand theories, Bordwell examines some bullet points concerning doctrine and practice.

  1. Human practices and institutions are in all significant respects socially constructed.
  2. Understanding how viewers interact with films requires a theory of subjectivity.
  3. The spectator’s response to cinema depends upon identification.
  4. Verbal language supplies an appropriate and adequate analogue for film.

Looking at the practice of theory (now there’s a phrase!), there are several methods that are employed:

  1. Top-down inquiry
  2. Argument as Bricolage
  3. Associational Reasoning
  4. The Hermeneutic Impulse

Middle-level theory emerged from an increased awareness of the history and practice of actual filmmaking. The goal of such theories is to employ both empirical and theoretical means of understanding film, approaching theory from the perspective of specific problems rather than sweeping doctrine. Bordwell's defining point is the line "… you do not need a Big Theory of Everything to do enlightening work in a field of study." (p. 29)

It might serve well to note that some of these aspects of film theory were used to reason about games, and wound up falling flat owing to their preoccupation with visual imagery and their total failure to account for interactivity and issues of gameplay.

Reading Info:
Author/Editor: Bordwell, David
Title: Film Studies and Grand Theory
Type: book
Context:
Tags: dms, film, media traditions
Lookup: Google Scholar, Google Books, Amazon

John Wiltshire: Recreating Jane Austen

[Readings] (08.29.08, 4:23 pm)

Overview

Wiltshire examines the notion of adaptation and other work as fitting under the general umbrella of "recreation". He is interested in looking at Jane Austen explicitly, but also in generalizing toward a broader theory of adaptation.

Notes

Introduction

Wiltshire mentions early on the referencing of Pride and Prejudice in Bridget Jones's Diary. The connection can be considered both referential and adaptive. The comparison is more of a transcoding: finding different ways to meet similar ends. One interesting challenge this raises is that an adaptation may attempt to transcode different components of the source story. In Bridget Jones, the plot is structurally the same as in Pride and Prejudice, but the characters and social context are different. Other extensions (e.g., Mr. Darcy's Daughters) extend and continue the characters, but are necessarily different in plot.

There are a number of processes described here: remaking, rewriting, adaptation, reworking, appropriation. All of these are loose terms for the same category, the general process of "recreation" that Wiltshire is trying to derive.

There is some discussion of the cultural capital and the artistic value and worth of artifacts. Adaptation may be seen as trying to "take" value, or alternately, as attempting to "extend" value. Contemporary works live in the shadow of capitalism and cannot be created without at least some awareness of "marketability" (referenced from W.J.T. Mitchell). Wiltshire critiques the idea of "auteur theory" and poses the remarkable idea that adaptors (specifically scriptwriters and filmmakers) should be considered readers, because they are making interpretive choices. If we push that argument a little further, we can go so far as to say that any author is partly a reader of systems of their own design.

Referenced here is a line from Helen Fielding's Bridget Jones's Diary, in which Natasha complains of the "arrogance with which a new generation imagines that it can somehow create the world afresh." Mark Darcy's reply is "But that's exactly what they do, do." This connects to Jenkins again. "One might add that indeed each generation produces its own works of art, but not entirely out of its own materials." (p. 5) Wiltshire connects to Donald Winnicott, who is referenced persistently later.

Some notes on Jane Austen’s image in culture: conservative, stuffy, anti-contemporary. In context, she is progressive, transgressive, challenging, etcetera.

Imagining Jane Austen’s Life

On the psychology of the biographical impulse: from it comes the duality of intimacy and remoteness, of construction and terrifying unknowability. Part of this is to construct Austen's life in the romantic terms of her novels, a drive to make her inner life much more familiar and knowable in the terms that we have to remember her by. It also underlies the competing forces of nostalgia and progressiveness in reconstruction. These forces are opposed: nostalgia is a desire to idealize and evoke familiarity, and consequently reflects the stuffiness that tends to appear in portrayals of Austen. Austen's progressiveness is unusual because it separates her from this idealization and casts a shadow of separation between her and her works.

Jane Austen in Manhattan, Metropolitan, Clueless

On some of the dogma of fidelity: it extends to the differentiation of subject and object in the sense of developmental psychology (a Winnicott reference). To see an "unfaithful" adaptation is to see an objectified text, which holds some degree of reverence and authority. Unfaithfulness implies that the text is being changed or taken advantage of by the filmmaker, abducted from its state of purity and compromised. The issue of faithfulness denies the subjectivity of the text itself: not just its being interpreted subjectively, but its capacity to produce meaning independently. The broader scope of adaptation sees more of "borrowing", "influence", or "persuasion" by the original material, rather than something that sounds like wedlock.

In the adaptation of Emma in Clueless, the influence is not that of a mother text, but rather an inner presence. Instead of being idealized, the text of Emma is loved, destroyed, and remade (or reborn). Clueless not only adapts but also parodies and recontextualizes. Not bound so intimately to the original, it is free to stand as an independent work. Other adaptations have an undercurrent of anxiety, a certain nervousness in their relationship to the original. Blind faithfulness requires a reverence for the past and an unwillingness to embrace newness. With Clueless, the process of adaptation carries not cultural capital but the essence of art. (p. 56-57)

From drama, to novel, to film

Looking at how Austen borrows conventions from drama: emotions are reflected on the surface of the characters. Austen is notable for the inner life of her characters, but the conventional way of exposing this is to represent it externally, through soliloquy or indirect speech. Working around the lack of these is deeply problematic in adaptations: many forms of media are not able to carry the same degree of inner life, especially not with the device of indirect speech. Adaptations may fall back on stage conventions, making expression direct (for example, through repetition or melodrama), which lacks the novel's subtlety. Notably, simulation sounds very promising in this respect. If emotional state must be expressed, there might be multiple ways of doing so; Sims-like games, for instance, might represent thoughts visually with bubbles, following comic conventions.

Wiltshire gives an example of a filmic adaptation of Persuasion, which uses imagery and dramatic cuts to establish a complex depth through referential and metaphorical analogy. Instead of using verbal language or melodrama to convey meaning, it resorts to filmic and visual language. This is still a form of communication, and in context it carries both the same meaning, and the same tone, in that both are suitably subtle.

Pride and Prejudice, Love and Recognition

Wiltshire gives a nice overview of the philosophy and ideology of Pride and Prejudice specifically. The "great subjects" of the book are "class, love, money and marriage", while it is "principally about sex, and it's about money: those are the driving motives of the plot" (as described by Lilian Robinson, who produced the BBC television adaptation). Beyond this, however, is a deeper epistemological issue, one visible in the book's title. The issue is judging and re-judging, recognition and re-cognition, "that act by which the mind can look again at a thing and if necessary make revision and amendments until it sees the thing as it really is" (as described by Tony Tanner). The deeper issue here is one of knowing another. Interestingly, the idea of impressions and knowing relates back to Goffman, who describes this interaction in slightly more mechanical terms.

Another issue at stake is the idea of "projection" as developed by David Hume. On one extreme are characters such as Mr. Collins, Lydia, and Mrs. Bennet, who project their ideals and desires onto anyone; on the other is Darcy, for whom no one is worthy of the projection of his ideals. The issue is that with projection, the subject of the projection does not matter. For Mr. Collins, the identity and substance of the girl he wishes to marry do not matter, as long as she suits his plans. The girl has no existence for him; she is merely a frame around whom he may construct his desires.

Described here is what it means for characters to be identifiable as individuals rather than caricatures. Many games have this problem: owing to their existence as rule-based systems, agents (including the player) necessarily take on the roles of caricatures. This problem descends from a failure to find an inner life and psychology, as opposed to partial repetitions, gestures, tropes, etc. This idea relates again to Goffman in terms of establishing identity and selfhood. "Such people (politicians, celebrities, for example, but also acquaintances) occupy a space in the inner theatre that is like that of a caricature, for in the economy of our psychological lives we cannot spare the energy to lend them an inner being. Instead they serve as objects: objects onto which we may project, or into which we may invest, atavistic propensities of our own. We may think of them as wholly bad, or as buffoons, or admire them as heroes and heroines. We make do, in other words, with partial and stereotyped notions of others." (p. 103) This idea is especially strong in Pride and Prejudice, as the protagonists are faced with the realization of their false projections of their own impressions onto each other. It is this objectification (described in terms of Winnicott's psychoanalysis, though it may also echo George Herbert Mead) that enables individuals to interact with their environments, by treating objects as partial reflections of one's own ideas (and self).

Pulling in more philosophy, Wiltshire references Hegel. The phenomenon of projection may be seen as extending Hegel's master-slave relationship: in interacting with others, our relationships are dominations, in which we make psychological use/abuse of the other. The example described here is Darcy's first proposal scene, where, in the "bonds of love", he is oblivious to the person whom he is addressing. This use and objectification hearkens to the postmodern vision of others as unknowable and totally alien. However, Pride and Prejudice does seem to conclude with the notion that understanding and acceptance are ultimately possible.

Discussing the early scene in the book where Elizabeth and Caroline walk around the room, Darcy admits to his flaw of a resentful temper. A more careful analysis is taking place here, but ultimately the exchange is difficult to wholly understand. As readers, we are given facial expressions, but we do not see Darcy's inner character in his responses, leading us to make our own interpretations. While the effects of language are procedural, they are also deep and subtle; we are faced with the same dilemmas as when facing any black-box system. The reader is alone in comprehending Darcy, and as such, to us, Darcy is an interpretation and construction, much as he is to Elizabeth.

On the gradual development and recognition of emotion: Austen represents "falling in love" as an explicit sequence of thoughts and emotions which, while slow and subtle, denotes a clear development of emotion. The process is vague and obscure, but at the same time conscious and rational. If we were to use this as grounds for representing love as a concept in a game or simulation, this sort of pacing and peculiar clarity could be very useful.

The love that does develop takes the form of a shared reality, rather than a jumble of projections. At some point, the matter of projection breaks down, and in its place is a simultaneous reconstruction of something new and different. The change also takes place in the characters of the protagonists themselves, not merely in their perception of each other.

A final difference between the BBC adaptation and the novel is a more modern issue of responsibility. In the novel, a revealing point occurs when Elizabeth visits Pemberley and sees the portrait of Darcy, with the smile that she failed to recognize before; the revelation is the discovery of knowledge that was previously missed. In the television series, however, the same portrait is somber, and intercut with the scene is Darcy riding towards Pemberley and taking a spontaneous swim in a lake, seemingly seeking escape from the pressures of responsibility, as well as a release of pent-up emotion. These scenes are very divergent and represent a shift in perspective, as well as probably a modern viewpoint (the trope of escape from responsibility). The change also places Darcy in a more central role as an agent in the story (as opposed to the emphasis being on Elizabeth's recognition), but at the same time it attempts to convey the same recognition to the viewer, who can identify and empathize with the desire for freedom.

Reading Info:
Author/Editor: Wiltshire, John
Title: Recreating Jane Austen
Type: book
Context:
Tags: specials, media traditions, narrative, fiction, adaptation
Lookup: Google Scholar, Google Books, Amazon

Philip Auslander: Liveness

[Readings] (08.29.08, 4:21 pm)

Notes

Chapter 1: introduction

Auslander opens by comparing theatre and media, referencing Herbert Blau and Karl Marx. Theatre and media are rivals, and, much like industrial production, media (specifically television) has filled and saturated the cultural economy. Television has essentially formed its own culture, its own environment in and of itself. Television is "the" cultural context, not just one of many. Arguably, the same could be said today of the internet.

On live performance, Auslander challenges traditional ideas of the value of liveness. Performers in theatre cling to "the magic of live theatre". These ideas attempt to place a binary opposition between live performance and the media; generally, this style of thought defines liveness as the opposite of the recorded. "In other words, the common assumption is that the live event is 'real', and that mediatized events are secondary and somehow artificial reproductions of the real." (p. 3)

The term “mediatized” derives from Baudrillard, and is concerned with the idea of “mass media” and, in Baudrillard’s definition, a system of bringing all discourses under a single code (sort of a universal formulation of everything as equivalent under a particular semiotic system as a substrate, much like poured concrete). Mediatization can be applied to live performances, by which they become mediatized performances, for example, a play or event broadcast on TV.

Ultimately, Auslander emphasizes that there are no ontological differences between live performance and media. Live performances are just as susceptible to incorporating media elements as others.

Chapter 2: Live Performance in a Mediatized Culture

Initially, media events were modeled on live ones. Now that media is culturally dominant, live events are modeled on mediatized ones. Auslander treats this change as dependent on the historical situation rather than on intrinsic properties. Note: this could be just an instance of remediation, where disciplines borrow, reference, and support each other; that approach treats the relationship as non-antagonistic, though.

Television strove to emulate theatre (rather than film) when it emerged. Initially it had the capacity to "go live", which emphasized its qualities of liveness and immediacy, even though that is not generally how it is used today.

Television's essential properties are immediacy and intimacy. It can look on events exactly when and as they happen. Film, by contrast, is characterized by memory, repetition, and temporal displacement. Television is intimate in the sense of bringing the external into the home, without the need to travel to it. TV was seen as a cultural sanitizer, bringing only appropriate, legitimate content and values into the home. Similar things can be said of digital media, but digital media uses the hypermediate in addition to the immediate, and its illusion of the sanitized vanished much more quickly.

Live performance is now heavily influenced by (and tends to emulate) the rhetoric and practices of mediatization, with screens becoming prominent in many situations and venues.

Mediatization is reflected in the production of performance by the apparatus of reproduction. Auslander references Jacques Attali on representation as a method, compared to repetition. Representation developed initially with capitalism but was gradually replaced by repetition as a result of mass production. This sounds like a reference to Greenberg's "Avant-Garde and Kitsch".

Referencing Benjamin: the masses desire proximity, and at the same time desire reproduced objects. Benjamin's claim that reproduction devalues the original can be seen in the decay of the value of "real" liveness and intimacy, eroded and replaced by the emulated synthetic on the screen.

Auslander concludes this section with the claim that the system of the virtual has incorporated liveness into its substance. This could be read as saying that live elements may be understood as tools and media for a larger system of meaning-making. I don't think that is what Auslander is getting at, but it seems like a productive line of enquiry.

On simulation and live performance: Ronald McDonald performing in restaurants (p. 49-50). Performances occur in numerous locations, and are all live and separate, but are designed to evoke one single character, which is the template that generates each performance. “All performances of Ronald McDonald are generated from a single interpretation of the character, which functions as a template. I have chosen this example in part to make the point that a template is not the same as a script: improvisational performances, too, can be generated from a template.” (p. 50) Here, live performance aspires to the conditions of mass art.

A condition of this, related to Benjamin's aura, is the illusion of authenticity. No instance of mass or scripted art can be considered authentic, because of its reproducibility. Performances that derive from templates instead reference an ideal template and attempt to borrow its aura or authority.

Using Baudrillard's definition of the real as "that which it is possible to give an equivalent reproduction", the live must be defined as "that which can be recorded".

Loose notes:

Television has become its own culture.

live performances are “more real”
but mediatized are more artificial

live performance can function as mass media?
or, mass media better enables narrative adaptation??

book is controversial because of challenging liveness of theatre?

new media for Auslander is TV

Recent web technologies,
web 2.0, instant messaging, mobile computing
aid confusion of liveness and mediation

liveness and canon?
variation of performance is acceptable in theatre
but in TV and others, liveness leads to a conflict in the establishment of canon.
some particular instance must be elevated to some degree of authority
this derives from the idea of having the perfect performance

for example, music videos which replicate live performance
but use studio track
so the idea of liveness and the reality of it is very
confused and challenged

some performances have a celebration of the virtual
for example, Gorillaz, which hides the liveness of the performers
television yields a combination of immediacy and intimacy
filming live television is a SIMULATION not a REPLICATION

Reading Info:
Author/Editor: Auslander, Philip
Title: Liveness
Type: book
Context:
Tags: dms, performance, media theory
Lookup: Google Scholar, Google Books, Amazon

Bolter and Grusin: Remediation

[Readings] (08.29.08, 4:19 pm)

Overview

The introduction begins with a discussion of the sci-fi film "Strange Days", which partly revolves around a new technology called "The Wire", a sensory recording/playback device.

Like digital media today, it threatens to render other forms of media obsolete, but at the same time it is bound by restrictions and constraints similar to those of film and other media.

Remediation is about how media strive to achieve immediacy in spite of their mediation. Newer media attempt to do exactly what their predecessors have done, billing themselves as improved versions of other media. The book is an attempt to challenge the idea that media exist in isolation.

Transparency is an effort to make the presence of the medium disappear. The rhetoric of transparency is introduced through the discovery and use of perspective in the Renaissance. Perspective is a technological means of controlling space from a single location, and it is also a technology in the sense of its mathematical formulation. As a technology, though, it is necessarily about representation and reconstruction of the real world and the human eye. This is an example of immediacy, though one dependent on what is being made immediate.
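As a rough illustration of that mathematical formulation (my own gloss, not Bolter and Grusin's), linear perspective reduces to a pinhole projection: a point in the world is mapped onto a picture plane a fixed distance in front of a single viewing position.

```python
# Pinhole (linear perspective) projection: the viewer sits at the origin
# looking down the z axis, and the picture plane sits at z = d.

def project(x, y, z, d=1.0):
    """Map the world point (x, y, z) onto the picture plane z = d."""
    if z <= 0:
        raise ValueError("point must lie in front of the viewer")
    return (d * x / z, d * y / z)

# Doubling an object's distance halves its projected size: the geometric
# rule that lets a single viewpoint organize the represented space.
print(project(1.0, 1.0, 2.0))  # (0.5, 0.5)
print(project(1.0, 1.0, 4.0))  # (0.25, 0.25)
```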

If accurate reproduction is a manifestation of immediacy, then that would imply that Umberto Eco’s openness, and by extension, much of the avant garde, is “latent”.

The automation of computer graphics follows the same trends as the automation of photography, and in doing so it gauges its accuracy by comparison to photographs. However, the aesthetic being pursued is the automated nature of reproduction: the effacement of the programmer, the removal or hiding of subjective influence in the technology itself.

The introduction of interactivity complicates the issue of immediacy and transparency. Interactivity requires certain elements (which may be transparent in the sense that they are evocative of other forms, e.g. the desktop, the paintbox, etc.), but to provide immediacy these elements must be manifest and visible, counter to transparency.

Hypermediacy is a situation that involves a coming together of many kinds of media, and it often highlights the mediated nature of its construction. This trend can be found in medieval illumination, baroque cabinets, and of course many contemporary things, of which new media is a prime source.

Adaptation and hypermediacy: adaptations do not draw attention to their nature as mediations of the original. This is also called "repurposing". Bolter notes that McLuhan once said that the content of any medium is always another medium. The representation of one medium in another is remediation. Does that mean that adaptation is a subset of this?

The remediation of an encyclopedia in digital form bills itself not as an encyclopedia, but as an improved encyclopedia. The transition, involving hyperlinking and the addition of digital affordances to a traditional form, draws attention to the electronic medium. Thus such adaptations are translucent, not transparent.

For a game adaptation of a narrative form, the discussion of transparency and mediation becomes interesting. A goal is not to be transparent, but to evoke the original medium in a way that allows it to be read in a new and open manner.

Remediation works as figurative representation rather than conceptual representation. What tends to get “taken out” of one medium and put into another is not content, but representational practices. For example: film techniques as adopted by games.

So the issue is representational practices. Abstract forms like fractal and generative art had no real foundation in prior practice, so they forged their own style, which, admittedly, looks pretty awful, but has informed graphical representation for a long time.

The book attempts to challenge the idea that the computer is totally independent of other practices and disciplines. Convergence is serving to undermine that idea, continuing to challenge the notion of the computer’s utter newness.

“A medium is only dead when it’s not being remediated anymore.”

Reading Info:
Author/Editor: Bolter, Jay and Grusin, Richard
Title: Remediation: Understanding New Media
Type: book
Context:
Tags: dms, media theory
Lookup: Google Scholar, Google Books, Amazon

Jon McKenzie: Perform or Else!

[Readings] (08.29.08, 4:18 pm)

Notes

Introduction

Performance is seen as something applied to business, industry, and workers, as well as to art and culture. Everything can be seen as performance.

Specifically, McKenzie begins by looking at a cover of Forbes magazine whose caption is used for the title of this book. It refers to a set of firings carried out by corporate boards of directors against chief executives. The phenomenon is not limited to executives, though: performance reviews crop up in all lines of business, at all levels, with vague, ominous threats of “–or else” if performance is unsatisfactory. “Thus, the Forbes challenge and its hold upon throats around the world: Perform–or else: be fired, redeployed, institutionally marginalized.”

McKenzie proceeds to make his comparison more subtle and complicated: the threat of retribution for poor performance also echoes images of vaudeville, popular theatrical performance, and cultural performance. Cultural performance strikes a vein with performance art and other richly controversial topics, such as demonstrations, drag, etcetera. The goal of performance in these cases is a certain liminality and subversion; the threat for failure here is social normalization.

Alongside these is a subtle and often ignored dimension: technological performance. Technological performance is ascribed generally to electronic technologies, as well as to consumer products. Failure in this case tends toward being made obsolete, defunded, and discarded.

Related to this is the understanding of knowledge and education. Postmodernism tends toward a significant rethinking of the role of knowledge.

One of the main points seems to be that performance is an emergent phenomenon in a system of power and knowledge. McKenzie makes a significant claim that performance will be seen as defining the current era, the 20th and 21st centuries, much like discipline defined the previous two centuries.

A term is introduced, “the lecture machine”, whose icon is the lectern, which seems to denote the system of performance where one is empowered to know and to speak, and separates the speaker from the audience. Lecture machines are systems that enable performances which separate knower from those who do not know. The metaphor can be extended to other boundaries, such as the television and the computer screen.

Chapter 3: Technological performance

McKenzie opens this chapter by looking at some very technical perspectives on performance. Technical articles generally don’t need to define performance, as they are embedded within the discipline. McKenzie tries to relate the notions of performance across these varying engineering sciences, and notes that nothing seems to tie them together coherently. Specifically, there is a “lack of an explicit and general definition of technological performance.”

Turning again to a scientific paper, McKenzie finds that technological performance is “effectiveness in a given task”. At this point, performance of the technological variety does not seem too far from that of the business or cultural variety: the task for business is profit, and the task for culture is a certain cultural efficacy.

McKenzie goes on to explain that the idea of effectiveness at a task is highly context-dependent and contingent on external values imposed on the system. He turns to another definition, which poses performance as a “function of effectiveness, reliability, and cost”.

Technical performance might then be defined as the rate of change of effectiveness with respect to cost, as opposed to just effectiveness.
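To make the contrast concrete (the symbols are my shorthand, not McKenzie’s), the two definitions might be written:

    P = f(E, R, C)    (performance as a function of effectiveness, reliability, and cost)
    P ≈ dE/dC         (performance as the marginal effectiveness gained per unit of cost)

The second reading explains how a technology can count as “high performance” while being less effective in absolute terms, so long as it delivers more effectiveness per unit of cost.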

McKenzie looks at the social aspect of performance, which is about projects. … “projected technologies are more social than technological, more fantastic than objective”. These projects occupy a curious pre-performance state, in that they live in an imaginary dimension before they are built and realized.

Projects become relevant and developed via the effect of social influences. They are carried through by various stakeholders, and the process of development involves the project being born from an abstract world of concepts and ideas into a concrete world deprived of interpretation and ambiguity.

Referenced here is Donald MacKenzie’s work “Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance”. This text looks at the idea of accuracy as being affected by social context, as well as affecting social context.

The military-industrial-academic complex is referenced again: the Cold War influenced and spurred academic growth and the development of science and technology. This paranoia both comes from culture and comes to affect culture in return.

McKenzie comes to reference Laurel and her understanding of computers as theatre: that designing computer interfaces is really the art of designing experience (experience which is created by the performance of the software). McKenzie raises the idea of extending other cultural performance models to apply to HCI, instead of just the Aristotelian poetics that Laurel uses.

Also cited is Robert Crease, who has studied experimentation in science as a sort of performance, where the laboratory is a special stage for the enactment of material and learning of special knowledge. Science teeters between presentation (of experiments) and representation (where theory is applied to the world, or interpreted from data?).

Ultimately the point here seems to be that ideas of technological performance (as effectiveness) are still rooted in models of cultural performance, especially as defined by the stakeholders who evaluate the technology.

Finally, the tripartite collective of performance (studies, management, and technology) is united by its emergence in Cold War America, and is collectively symbolized by feedback loops and the missile.

A question: What is McKenzie trying to do? And what are we supposed to get out of this?

Reading Info:
Author/Editor: McKenzie, Jon
Title: Perform Or Else
Type: book
Context:
Tags: dms, performance, media theory
Lookup: Google Scholar, Google Books, Amazon

Friedrich Kittler: There Is No Software

[Readings] (08.29.08, 4:17 pm)

Overview

In this essay Kittler frames the argument that software ultimately serves to conceal what is important in a computer system. The clearest reading of this seems to be the manner in which software restrains and restricts the capacities of the user. Programmability is seen as a force that enables this concealment, and is considered an indictment rather than an advantage. While software is limiting and restrictive, so too is hardware, so it is difficult to tell exactly where and what the problem is.

I would say that ultimately the restrictions of software are the same as the restrictions of using any system of abstracting models.

Prevalent throughout this essay is a theme of disgust over the notion of design, and a desire to appeal to bare mathematical concepts. Unfortunately, Kittler wildly misuses his reference to fractal theory, and his treatment of Turing’s computability seems dubious as well.

A useful summary is located on mediamatic.net.

Notes

Kittler opens by noting the pervasiveness of computers, and that writing is increasingly (he says the bulk of writing is) stored in computer memory, where it is no longer perceivable by humans. In fact, we do not even write anymore; we use tools that are able to write by themselves.

Kittler is concerned with this technology-driven evolution, and the technology that has enabled it. He specifically looks at the relations between Turing machines and microprocessors. The Turing machine can imitate any other Turing machine and compute any computable function. This fact means that computation is independent of hardware and that nature itself may be considered a Turing machine.
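The universality claim is easy to make concrete. Below is a minimal sketch in Python (my illustration, not Kittler’s or Turing’s notation) of a Turing machine interpreter; the point is that the same run_tm function executes whatever rule table it is handed, which is the hardware-independence the argument leans on.

    # Minimal Turing machine interpreter. Rules map (state, symbol) to
    # (new_state, symbol_to_write, move), where move is "L" or "R".
    def run_tm(rules, tape, state="start", blank="_", halt="halt", max_steps=10000):
        cells = dict(enumerate(tape))  # sparse tape; unwritten cells read as blank
        head = 0
        for _ in range(max_steps):
            if state == halt:
                break
            symbol = cells.get(head, blank)
            state, write, move = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # A trivial machine that inverts a binary string:
    flip = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }
    print(run_tm(flip, "1011"))  # prints "0100_"

Nothing in run_tm cares what the rules mean, and the rules could themselves describe another machine’s interpreter; that is the sense in which one Turing machine imitates any other.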

This is relevant from a perspective of languages, and Kittler suggests that programming languages have transcended ordinary language and have formed something of a tower of Babel founded on computational equivalence. He uses an analogy referencing fractals and the idea of self-similarity, but that does not seem to bear much resemblance to the idea in a real mathematical sense. What he is describing is a similarity among models and languages. Beyond this, the question arises of what language *does* when it has reached this universal state.

What follows is a close look at the process of executing WordPerfect on DOS. The language issues that seem to be problematic are the syntactic qualities stemming from DOS as a platform: the “exe” and “com” extensions, as well as the pervasiveness of acronyms. One of the problematic points is where the OS ends and the program begins. This relates again to the BIOS of the computer, which is another layer operating underneath even the operating system. In turn, underneath these too is additional hardware, in which information is only represented as differences in voltage. In turn, Kittler argues, the formalization of these is mathematical theory, which is composed of sentences with words and letters.

From this sequence of observations, we are to find that, really, there is no software. It is difficult to tell how this argument is supposed to coalesce. If we are to apply this sort of reduction to everything (our bodies are cells, governed by biology, then by chemistry, then by physics), then we wind up with reductions that are of very limited use.

The current trend in design is the concealment of the technicality of the underlying machine elements: the BIOS conceals the hardware, the OS conceals the BIOS, and software conceals the OS. This concealment is “completed” by GUI design and hardware-level security. HCI has taught us that GUI design certainly does not always conceal the machine nature of the computer; in fact, it reveals it in different ways. The fallacy is best exposed when programs catastrophically fail and reveal their inner workings in a most jarring manner (the Windows blue screens are most notorious in this regard).

Kittler sees the effect of the prevalence of software as pushing this trend of concealment. He invokes the idea of software as similar to a one-way cryptographic function: something that cannot easily be reproduced or inverted. But history has shown that modern programming languages, instead of becoming progressively harder to decompile, are much easier to decompile than in the past (especially languages such as Java or C#). Similarly, all software is still accessible on one level or another; the communities that crack software and hack electronic devices such as game systems and portable music players are ample evidence of this.
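For contrast, an actual one-way function behaves quite differently from compiled software. A minimal illustration (SHA-256 as the stand-in; my example, not Kittler’s):

    import hashlib

    source = b"10 PRINT 'HELLO'"  # stand-in for some program text
    digest = hashlib.sha256(source).hexdigest()
    print(digest)
    # Computing the digest is instant; recovering `source` from `digest` is
    # believed to be computationally infeasible. Bytecode-to-source
    # decompilation (e.g. for Java or C#) carries no such guarantee, which
    # is where the one-way analogy breaks down.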

Software is seen as being able to conceal itself on every front, and to have that concealment further enforced by patents and copyrights. Its undecidability and complexity have also led to the legal status of software as material, despite its immateriality. At this point in time, due to the internet and the ease with which software can be copied, this status has become much more complex, with publishers holding very tightly to the material elements.

The capacity for software to function is limited and dependent on the power of the hardware, which ultimately limits the capacity for software to emulate other systems through its available memory.

Ultimately, Kittler concludes that software is ill-fitted to take on the increasing complexity of problem solving that will be demanded in the world. Programmability limits the effective potential of any platform due to its openness and lack of focus. Hardware, he argues, has the strength and power to simulate real-world systems much more effectively.

Reading Info:
Author/Editor: Kittler, Friedrich
Title: There is No Software
Type: book
Context:
Tags: dms, media theory, postmodernism
Lookup: Google Scholar, Google Books, Amazon