The Walt Whitman Archive – Whitley 2
By Edward Whitley
January 2011
What are its weaknesses? What do you wish it would let you do? What changes would you suggest?
Edward Whitley
Associate Professor of English and Director of American Studies – Lehigh University
I would like to see the Whitman Archive make more transparent the choices that go into deciding how texts are marked up and encoded. Archive co-director Kenneth M. Price has written that the “ongoing decision-making” regarding the mark-up of texts on the Archive is “interpretive rather than mechanical,” and that this interpretive coding “adds intelligence to an electronic text . . . mak[ing] it robust and illuminating as a tool for analysis.”[ref]Kenneth M. Price, “‘Debris,’ Creative Scatter, and the Challenges of Editing Whitman,” in Walt Whitman, Where the Future Becomes Present, ed. David Haven Blake and Michael Robertson (Iowa City: University of Iowa Press, 2008), 67.[/ref] I admire Price’s forthright admission that marking up a text for the digital medium is not merely a change in format, but an act of interpretation. As someone who visits the Whitman Archive on a regular basis, I’d like to have a clearer sense of what kinds of decisions—and, therefore, interpretations—are being made as texts are digitally encoded, and, specifically, what those decisions reveal about the textual condition of Whitman’s works.
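The kind of interpretive decision Price describes can be sketched with a hypothetical TEI fragment. The wording and the revision are invented for illustration, and the encoding follows general TEI conventions rather than the Archive’s own guidelines; but the underlying choice is real: a manuscript line where Whitman struck out one phrase and wrote another above it can be encoded either as a single act of substitution or as two separate interventions, and each encoding makes a different claim about how the text was composed.

```xml
<!-- Hypothetical fragment for illustration only; not taken from the
     Archive's files. The elements (l, del, add, subst) are standard TEI. -->

<!-- Encoding A: the deletion and addition are wrapped in <subst>,
     asserting that they constitute one revision event. -->
<l>and of these one and all I weave the
  <subst>
    <del rend="overstrike">poem</del>
    <add place="above">song</add>
  </subst> of myself</l>

<!-- Encoding B: the same two marks left unwrapped, making no claim
     that they belong to a single moment of revision. -->
<l>and of these one and all I weave the
  <del rend="overstrike">poem</del>
  <add place="above">song</add> of myself</l>
```

Nothing in the manuscript itself dictates which encoding is correct; the encoder must interpret the physical evidence. A search or analysis tool that counts “substitutions” will return different results for the two versions, which is precisely why a running commentary on such decisions would be valuable.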
This request is not based in a suspicion of any kind that the Archive directors are placing a biased interpretation onto Whitman’s poetry as they encode it for presentation on the Archive; rather, my request is based in a desire to follow along with the process of discovery that Matt Cohen describes as an integral part of digital encoding: “As a student in the 1990s, doing transcription and basic encoding of Whitman documents, I learned a lot about textual structures that I had not encountered in seminars; when discussing these observations with the project directors, sometimes new areas of concern or future development would emerge.”[ref]Matt Cohen, “Design and Politics in Electronic American Literary Archives,” in The American Literature Scholar in the Digital Age, ed. Amy Earhart and Andrew Jewell (Ann Arbor: University of Michigan Press, 2011), 233.[/ref] The Archive is already a valuable resource for gaining access to Whitman’s works; it can also be a resource for learning about how Whitman constructed his texts in manuscript and in print by eavesdropping on the process by which these texts are reconstructed for the digital medium.
Directors and editors at the Archive have already begun to provide the kind of running commentary on the encoding process that I would like to see more of. Brett Barney, the Senior Associate Editor of the Archive, has written an insightful essay about his encoding of the 1867 edition of Leaves of Grass that is available on the Archive, and both Price and Folsom have published, in a number of different venues, provocative essays that detail the lessons about textuality that they have learned from preparing texts for the Archive.[ref]Brett Barney, “‘Each Part and Tag of Me is a Miracle’: Reflections after Tagging the 1867 Leaves of Grass,” The Walt Whitman Archive, accessed December 2, 2010, http://www.whitmanarchive.org/about/articles/anc.00002.html. Additional essays by Folsom and Price are available at http://www.whitmanarchive.org/about/articles/index.html.[/ref] Currently, the Whitman Archive makes available the TEI-encoded XML source files for the major texts on the Archive as well as the Archive’s encoding guidelines and related DTD files. As much as I value this peek into the inner workings of the Archive, the process by which these XML files are created—the “ongoing decision-making” Price refers to—is as important as the product itself. The Archive could benefit from more self-reflexive commentaries on the encoding process, perhaps using Barney’s essay as a model. The recently added WWA-Changelog, a blog connected to the Archive that details the changes editors make to files once they are posted online, suggests how such commentaries could be more fully integrated into the experience of working with the Archive.
It has been illuminating to read this post (and ones by Claire Warwick and Evan McGonagill) underscoring the value of metacommentary in the Whitman Archive. Ed Whitley clearly wants us to devote more time to creating such metacommentary. What we have to balance is the need to make headway on editing Whitman’s writings, and to fulfill the promises made in grant applications, against the development of that metacommentary.
At times, I have thought that we could host a blog dedicated to the process of creating the Archive. It could deal with the issues of encoding choices, as Ed W. mentions, but it could also deal with a host of other matters: workflow decisions, grant writing, negotiations with university administrators, questions of long-term preservation, technical standards and platforms, and so on. I think such a blog could be immensely valuable. The only reason I haven’t pushed for its creation is that we are scrambling to keep up with current obligations, and it is hard to imagine how to fit this undertaking into the schedule, too. Nonetheless, given the amount of interest in this topic across several posts, I am sure we will think hard about this matter in the coming years, and perhaps we’ll find a way to discuss more fully, and open up more completely, the process of creating the Archive.
Notwithstanding my sympathy for transparency, I would reinforce Ken’s caveat that the more effort put into explaining how the text is marked up, the less time there is to actually mark up the text. My own two cents: the largest audience is probably interested in seeing more product available through the Archive, and explications of how that product was created might wait until further along in the process.
I do take the point about the trade-off between doing the work and documenting it. But, as I said, we have found that a lot of users do value this. The blog approach sounds like a good one: it allows people to comment on their work as they do it with minimal fuss, and it can easily be linked to the main project web page. We used this approach on the VERA (Virtual Environments for Research in Archaeology) project, and it seemed to work very well.
This is also important for the sustainability of the resource, of course. We hope this archive will be available long after you are no longer working on it, and indeed that it will still be used as an example of good practice. Thus, for the scholars who want to learn from you and the digital curators who want to preserve your work, it is vital to be able to understand your rationale for technical decision-making.