Thursday, March 28, 2013

The Digital Turn: Reflecting on the Digital Capacity for Self-Reflexivity

I want to pick up, very quickly, from where I left off with McPherson's suggestion that racial concerns permeate DH covertly, sneaking in perhaps through the back door of UNIX modularization and the logic of fragmented autonomous processes. McPherson likens this "lenticular" logic to lenticular postcards, which join two distinct images into a unitary construct, highlighting their difference even as it covers over their interrelation. The idea here is that UNIX modularization arose from the same milieu as neo-liberal covert racism. Gone are the days of the lynch mob and the segregated water fountain, replaced by the quietly managed structural racism that made cities like Detroit and Chicago even more racially segregated in the 80s than in the 50s. The logic of modularization, lenticular logic, allows multiple social "processes" to operate under the illusion of autonomy. It covers up the interconnectedness of each module, privileging smooth, uninterrupted, and straightforward articulations of what might actually be rather complex systems. While this may make for elegant computer design, it also describes a rather insidious social situation in which the concerns of the margins have no chance to make their plights known to more mainstream groups. The screens, in other words, are hidden.

How to begin investigating, let alone solving, this problem? If it is true that the very logics upon which DH is predicated begin with the principle of concealment for the sake of cleanliness, how can we turn these digital discourses back on themselves in a critically productive fashion? I believe Johanna Drucker's essay "Humanistic Theory and Digital Scholarship" begins to get at some of the ways the digital can work against its own cultural logic, though perhaps not in the way that she intends. The basic thrust of Drucker's essay is a call for humanistic criticism to play a more central role in what Ramsay would call the practice of "building." Drucker closes her essay thus:
Our challenge is to take up these theoretical principles and engage them in the production of methods, ways of doing our work on an appropriate foundation. The question is not, Does digital humanities need theory? but rather, How will digital scholarship be humanistic without it? (94)
This seems to be basically the same question McPherson asks. It's clear that digital scholarship is effective as a mode of knowledge production, but what in particular makes it useful to the theoretically committed humanist? Both McPherson and Drucker seem to suggest that only by integrating the more traditional modes of humanistic criticism at the level of "building" can we really think about DH as an actually humanistic endeavor.

The trouble is, it doesn't take very long before we run into some imaginative difficulties on this trajectory. What, after all, could Foucauldian code look like? Is it possible to write a computer language that inscribes Marxist critique at the level of Python syntax? The answer right now looks to be unequivocally "no." The computer knows only binary, and criticism necessarily undoes the polarizing work of the either/or construct. Drucker begins to get at this apparent absurdity in her examination of a digital reconstruction of a Roman forum:
If we embark on a project to study inscriptions in the Roman forum, for instance, should we simply take a virtual simulation, a fly-through model made on a standard 3-D software platform and place the writing on the variously appropriate surfaces? Or should the ways signage works as an articulation of space, the species of "spaces" described by Georges Perec, be merged in a mash-up of Christopher Alexander's "pattern language" with Gaston Bachelard's poetics of space and the Foucauldian analysis of spatialized regimes of disciplinary control structured into and structuring architectural and social relations? A ridiculous proposition emerges from that impossible sentence. (92)
This is exactly the trouble with trying to, as McPherson says in relation to race issues in the digital humanities, "hold race and computation together in a systemic manner" (153). There is something about the way computers work - not only internally, but on our own social heuristics - that simply makes sustaining cultural criticism difficult. It is as if computation necessarily imposes the lenticular modularity that prevents academic cross-pollination and therefore insulates praxis from critique.

At the height of her frustration, however, Drucker asks whether the ridiculous proposition is really ridiculous, whether the impossible sentence must really be thought of as impossible. Her own analysis takes her into a critique of visualization as a way of representing data: it always distorts, representing incorrectly or ideologically. Though she seems essentially right about this, I would submit, briefly, that this distortion is not a problem as long as it is self-reflexively implemented. Indeed, it seems to me that distortion is the deformative act that McGann wants from the beginning. Perhaps distortion is even what the critical act has always been. However, I want to move past Drucker's critique and entertain for a moment what a digital construction that uses its own tools against itself might look like. How, in other words, might a program "show the screen," not only of itself but of the cultural logic in which its lenticular modularity originated?

I want to suggest that a certain kind of video game is one way of thinking about this possibility. The simulation, it seems to me, would make room for dynamically representing the many theoretical considerations Drucker describes in her Roman forum project. Indeed, video games can simulate quite complex ideological structures that make clear the network, not the node, as McPherson might say. Next time I hope to get into some of the ludic possibilities for DH criticism by way of Paradox Interactive's game Victoria II. If anyone has other ideas of games that might solve some of Drucker's problems, I'd love to hear what you think.

2 comments:

  1. I’m going to do some drive-by criticism here and point out a huge flaw in McPherson’s argument. In your own words, Jordan, UNIX’s “modularization arose from the same milieu as neo-liberal covert racism,” and “covers up the interconnectedness of each module, privileging smooth, uninterrupted, and straightforward articulations of what might actually be rather complex systems.” However, by McPherson’s own admission, every single transfer of information between UNIX processes is clearly denoted by the | (“pipe”) symbol. This makes tracing the interconnection of processes far easier than tracing the interconnection of race in society, even for the most casual of observers. The purpose of modularity in programming languages is to make one’s code more transparent and understandable to others. The “purpose” of modularity in society is to obscure its underlying conflicts.

    Also, I have a ton to say about the possibilities of simulation, but I'll save that for class.
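    The pipe idea in this comment can be sketched in Python (a hypothetical illustration, not UNIX itself): each stage below is a small autonomous module, and every hand-off of data between stages is written out explicitly, so the connections are on the surface rather than hidden.

```python
def read_lines(text):
    """Module 1: split raw text into lines (like feeding `cat` into a pipeline)."""
    return text.splitlines()

def grep(lines, needle):
    """Module 2: keep only lines containing `needle` (like UNIX `grep`)."""
    return [line for line in lines if needle in line]

def count(lines):
    """Module 3: count the surviving lines (like `wc -l`)."""
    return len(lines)

text = "alpha\nbeta\nalphabet\n"

# The "pipeline": each | in a shell command corresponds to one explicit
# function call here -- every transfer point between modules is visible.
result = count(grep(read_lines(text), "alpha"))
```

    As in a shell pipeline like `cat file | grep alpha | wc -l`, each module stays ignorant of the others; what makes the composition legible is that the joins themselves are marked.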

  2. Though discussion will show if we disagree or not on the Golumbia/Ramsay bit, I think we’re on the same page with McPherson. A statistics class will teach you Procrustes analysis, which as I understand it refers to specific ways of examining results in which your chart is plotted by deliberately ignoring the values that do not fit within a pre-established range. To illustrate the possible gruesomeness of the thinking behind the method – the, as McPherson puts it, “worldview in which a troublesome part might be discarded without disrupting the whole” (146) – a link to the wiki page for the mythological origins of the term.

    This way of thinking is at once horrifically discriminatory and quite useful. One way to apply the principles behind the Procrustean bed (though this wouldn’t actually be ‘doing’ a Procrustes analysis in the statistician’s sense) in the context of the geocoded Virginia Woolf would be – through coding [!] – to create a way of excluding all lat/longs outside of the immediately British context of the text (if that’s what you’re looking for… then again, this might be tough with the sun-never-sets bit). This way, ‘Louis’ would just not register unless it was within the acceptable range that you set. This may naturally cause more problems than it solves, especially because some of the most cherished results on something like a geocoded map are the unexpected ones. However, unless we want to go through such results by hand in every case (perhaps even defeating the purpose of ‘distant reading’ along the way), we must likely avail ourselves of methods underwritten by the desire to ‘control complexity’.
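    The exclusion imagined here might look something like the following sketch. The bounding box and the sample coordinates are illustrative assumptions, not taken from any actual geocoded Woolf project.

```python
# A rough British-Isles bounding box (assumed values for illustration).
BRITISH_BOX = {"min_lat": 49.9, "max_lat": 60.9,
               "min_lon": -8.6, "max_lon": 1.8}

def within_box(lat, lon, box=BRITISH_BOX):
    """Return True if the point falls inside the pre-established range."""
    return (box["min_lat"] <= lat <= box["max_lat"]
            and box["min_lon"] <= lon <= box["max_lon"])

places = [
    ("London", 51.51, -0.13),
    ("St Louis", 38.63, -90.20),  # 'Louis' geocoded as a false American hit
]

# The Procrustean step: points outside the range simply do not register.
kept = [(name, lat, lon) for name, lat, lon in places
        if within_box(lat, lon)]
```

    The troublesome part is discarded silently, which is exactly the worry: ‘St Louis’ never appears on the map, and neither does any genuinely unexpected result that happens to fall outside the box.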

    It’s not the same as ‘lenticular logic,’ but you see why I’m raising it: in order to work more effectively with large sets of data we need to come up with ways to account for the values that deviate. Now to make the leap that you might not like: if we insist, with Ramsay, that all of us (humanities scholars with an interest in using digital methods) have an obligation to know how to code, then will this quite tangible exchange of time spent reading for time spent working with the ideologically infused languages of computing produce a shift in academia’s desire and ability to examine a series of questions not easily solved through digital methods?
