NWACC GRANT REPORT: GOALS, OUTCOME, FUTURE


1. Site-Specific Interactive Narrative, using GPS as input

September 2, 2003
Grant Recipient: Naomi Spellman, Visiting Faculty
Evergreen State College, Olympia, Washington
Course: Camera to Computer Winter/Spring 2003

2. Brief statement of project goal

The project goal was to create digital media projects that augment the physical environment where the media is encountered. The projects run on Tablet PCs with Global Positioning System (GPS) receivers and custom software. GPS tracks the viewer's location and determines where and when media is delivered; the landscape becomes the interface. Every version is rendered in real time, according to the viewer's pattern of movement. The physical surroundings provide narrative details: architectural space, natural elements, sounds, smells, etc. The digital media the students produce serves to alter our perception of the physical environment.
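The core mechanic, comparing the viewer's GPS position against the regions that trigger media, can be sketched as follows. This is an illustrative Python sketch, not the project's actual Macromedia Director code, and the hotspot coordinates and file names are hypothetical:

```python
import math

# Illustrative sketch (Python, not the course's actual Director/Lingo
# code): media "hotspots" are circular regions; when the viewer's GPS
# position enters one, the associated media file is triggered.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical hotspots: (latitude, longitude, radius in meters, media file)
HOTSPOTS = [
    (47.0379, -122.9007, 25.0, "lower_falls.mov"),
    (47.0412, -122.8962, 25.0, "brewery_story.aif"),
]

def media_for_position(lat, lon):
    """Return the media files whose hotspots contain the viewer."""
    return [media for (hlat, hlon, radius, media) in HOTSPOTS
            if haversine_m(lat, lon, hlat, hlon) <= radius]
```

In use, a loop would poll the GPS receiver a few times per second, call a function like `media_for_position`, and start or stop playback as the viewer walks in and out of regions.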

3. Teaching or research setting in which the results of the project were implemented

The project was realized in the context of a two-quarter media course called "Camera to Computer". Students worked with processes and conceptual strategies relevant to still and moving digital media production. Winter quarter was devoted to learning photographic and digital imaging skills through weekly individual and group projects, discussion, and field trips. Spring quarter focused on the project supported by the NWACC grant.

An important first step involved examining how humans interpret their physical surroundings in the context of art-making, writing, and film. How do we read the physical, social, and emotional qualities of a space? How do we provide a specific experience in a given site? The focus might be on the natural aspects of a site, its architecture, cultural artifacts, history, politics, etc. We did weekly readings for the first five weeks to aid in this kind of interpretation. Readings included Guy Debord's "A Critique of Urban Geography" (1955), Lev Manovich's article "The Poetics of Augmented Space" (2002), Bob Hughes' essay "Narrative as Landscape" (1997), Italo Calvino's novella "Invisible Cities" (1972), and excerpts from Dolores Hayden's "The Power of Place: Urban Landscapes as Public History" (1995). Students also watched the Wim Wenders film "Until the End of the World" (1991), which centers on a portable digital device that records and plays back dreams. We took several field trips to Seattle, where we saw James Turrell's installation and site models at the Henry Museum, and two nights of performances at the Northwest New Works Festival at On the Boards. The emphasis in discussions of these readings and works was on the interpretation of our physical environment through analysis, art installations, and the human body. How do we tell a story in physical space?

Concurrent with the readings, students generated site-specific media sketches. Starting the first week of spring quarter, students created sound compositions. The first and second assignments involved augmenting or altering the physical space in which the sound pieces were heard, simply by the nature of how our brains combine what we see with what we hear. We went out as a class to listen to and discuss each piece. I found Jennifer Pellinen's solution particularly effective. She created an audio piece which she situated in a downtown café (Otto's, which became our office spring quarter!). As one starts the audio, one hears recorded café activity that complements the ambient sounds of the space. Slowly a demonic voice creeps in, a seeming invocation of one's inner fear. She continued this examination in her final work presented at the course exhibition. Using her laptop, a webcam, VR goggles, and Onadime multimedia software, Jennifer generated an alteration of the existing visual landscape through various manipulations of live video and sound. I was thrilled with the initiative she took in teaching herself the Onadime software, as well as in researching the hardware.

Students were then assigned to collaborative groups based on their personal interests. Political content and mood were some of the distinctions students made in regard to the type of work they wanted to produce. Each week I met with the groups to go over their media onsite. The distinctive aspect of this project is that it is computer-based, but not desk-bound. Unlike a website, which dictates where media is experienced, the students and the public are required to go to a location that holds significance which is not readily accessed, and to have that element brought to life. For those of us who toil on computers daily, this project provided a welcome change from typical computer-based media, both during the authoring phase and during its presentation.

One group worked with a local storyteller to provide a voice for the Deschutes River in the Tumwater Historic Park. The river tells of its evolution, both prehistoric and man-made. The video compositions and soundscapes the students provided to augment this story worked beautifully. The relevance of reacting to physical cues and to a specific terrain cannot be overestimated. I have not previously experienced such a deep connection and empathy with content. The terrain and history of a physical location became a source of understanding and inspiration. When we examine and research and sweat, our relationship to the location where that activity takes place becomes deeper, and the final projects reflected that relationship. I hope that as others experience these pieces, they too will gain a deeper connection with these places.

A second group chose to work in downtown Olympia, concentrating on historical events within a two-block radius. They researched historic events in the downtown area, and decided to concentrate on catastrophic events, such as major fires, earthquakes, etc. They determined the specific locations where these events took place, then formulated a voiceover script based on their research. They supplemented this narrative history with ambient sounds that complemented the stories, as well as historic images.

The third group generated media which aimed to compare the human body with the earth's terrain. This group interspersed footage of the terrain with overviews of the body. Dictionary definitions of words which function both to describe the land and to describe the body were incorporated: shoulder, lock, etc. There seemed to be a lack of connection among the disparate parts contributing to their story. However, they did complete their work, and conceptually the work has wonderful potential.

Most of the media software the students used was taught during winter quarter. Applications used for the projects included Photoshop, Final Cut Pro, Peak, and Deck. Additionally, some students taught themselves Adobe After Effects and Sound Designer specifically for this project.

The NWACC Grant provided us with the support of visiting artist, programmer, and lecturer Jeff Knowlton, with whom I worked on the first GPS narrative project http://34N118W.net. During the eighth and ninth weeks, students were introduced to the Macromedia Director interface, as well as basic concepts in interactive programming. They learned how the interactive script controls media files, and how to alter the script for their own projects. They also learned how serial data can be parsed for specific text strings. In constructing the final pieces, they were responsible for generating a map and defining its boundaries with GPS coordinates, determining the hotspots on their maps which trigger their media, importing all media files, and inputting data specific to their location and media files. Jeff was a wonderful addition to the course and a nice break from my role in front of the class. It allowed me to function in a supporting role; I operated primarily as manager and mediator toward the end of the quarter. His presentation of computer code as an integral part of interactive media was refreshing, as was the way he physically engaged students in an exercise that demonstrated how the parsing engine sorts relevant text strings.
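The serial-parsing idea the students learned can be illustrated with a small sketch. GPS receivers emit NMEA sentences over the serial port, and the parsing engine scans the stream for a known text string such as "$GPGGA", then splits out the coordinate fields. This is a Python sketch rather than the Director/Lingo used in class, and the example sentence is a hypothetical fix, not project data:

```python
# Sketch of the kind of parsing the students learned (Python rather
# than Lingo): find a "$GPGGA" sentence in the serial data and convert
# its ddmm.mmmm / dddmm.mmmm coordinate fields to decimal degrees.

def _to_degrees(dm, deg_digits):
    """Convert an NMEA coordinate field to decimal degrees."""
    degrees = int(dm[:deg_digits])
    minutes = float(dm[deg_digits:])
    return degrees + minutes / 60.0

def parse_gga(sentence):
    """Return (latitude, longitude) in decimal degrees, or None."""
    if not sentence.startswith("$GPGGA"):
        return None  # not the sentence type we are listening for
    fields = sentence.split(",")
    lat = _to_degrees(fields[2], deg_digits=2)   # ddmm.mmmm
    if fields[3] == "S":
        lat = -lat
    lon = _to_degrees(fields[4], deg_digits=3)   # dddmm.mmmm
    if fields[5] == "W":
        lon = -lon
    return lat, lon

# Hypothetical $GPGGA sentence (a made-up fix near Olympia, WA):
line = "$GPGGA,123519,4702.274,N,12254.042,W,1,08,0.9,30.0,M,,,,*47"
lat, lon = parse_gga(line)  # roughly (47.038, -122.901)
```

In the actual pieces, coordinates parsed this way were then mapped onto the boundary coordinates and hotspots the students had defined for their sites.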

Students did a wonderful job utilizing their skills and interests while producing their media. I was impressed with the initiative the students displayed in obtaining the results they wanted. I think the collaborative structure served them well here, as a support mechanism, sounding board, and as a source of requisite skills. Overall they did a good job of finding their niche within the collaborative process, and completing all media planned for the projects.

4. Discussion of project results/extent to which goals were met

As is usual with projects utilizing new hardware and custom software, our final project was plagued by its share of bugs, crashes, and hardware and software incompatibilities. See the link "GPS Diary" on the project website for some of the trials we experienced. Since it is a site-specific project and relies on some level of accuracy, we were thrown for a loop when calculations of the user's location, based on a previously accurate piece of scripting, turned out to be off. Rip Heminway in the Evergreen Cal Lab helped us alleviate this problem.

On July 19 we held a public presentation of one of the works, "Tum Tum". Attendees seemed to enjoy the work. Some had comments about what worked and what didn’t. One response addressed a lack of complex integration of the media with the physical experience. Another visitor questioned the logic of repeating the visuals at hand on the computer screen. I think the students did a great job in the limited time they had. One idea that came out of these discussions was to pick up on the distinct pitch each waterfall consistently emits, and to use audio to harmonize with or complement these tones. Audio seems to be a more effective complement to the physical experience. Perhaps if we had retinal scanner type displays, we could integrate or overlay the computer-based visuals with the view of the physical realm, so that the real and the virtual meld. It seems we are better able to process multiple/disparate layers of audio than multiple visual sources.

5. Impact of project/future plans

This project was concerned with augmenting space, or enhancing real-world experience. With wireless consumer technology readily available, the possibility of a transparent networked realm, or floating layer, is suddenly tangible. No longer relegated to the realm of desktop experience, communication designers are now faced with addressing problems specific to location- and user-aware services, enhanced navigation, and virtual and reconfigurable architecture. Environmental data can now function as triggers that elicit intuitive responses from users. The notions of push and pull are replaced by transparent, user-aware response. In turn these applications enhance and redefine traditional 2D communication design. Bookmarks can follow you home. One no longer needs to request information.

As we witness a marked increase in the use of electronic information that is either divorced from a finite location or specific to a location, it seems necessary that media programs address this shift, especially as we experience a sharp drop-off in demand for web content. One of the things I emphasized to the students is that some of the functionality we associate with the Web is giving way to location-aware applications, where the constraints of interface design are replaced by an emphasis on smart technology. Cell phones, PDAs, and Tablet PCs containing wireless cards, GPS, and infrared allow us to access and use information specific to our location. This shift marks a sea change in our perception of information technology, and of physical space.

Overall I am thrilled with what the students gained from this project. In addition to learning about GPS, mobile computing, and computer code, they seem to have genuinely benefited from the process. They have a new understanding of how media can be employed, and how we can develop content for specific locations. The use of technology in arts education is not an end in itself, but rather a vehicle for expression. Expression through this medium, as in all mediums, will however ultimately constitute a framework or language in its own right. Like the World Wide Web, integrated mobile computing will adhere to and promote a standard. If we follow the trajectory of the Web, it may seem inevitable that coupons and commercials on location will define location-aware service for most of us. But right now I hope to contribute to a humane and stimulating use for the new meta medium.

This report was compiled by Naomi Spellman, the project director for a site specific interactive narrative project in the spring of 2003 at
The Evergreen State College with the support of the Northwest Academic Computing Consortium - NWACC - and Motion Computing.