NWACC GRANT REPORT: GPS DIARY
GPS Diary for a Site-Specific Interactive Narrative

These are excerpts from email interchanges associated with the Camera to Computer course at Evergreen State College, Spring Semester 2003.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Proposal for Use of NWACC Grant for Computing Equipment
"Camera to Computer" students are now learning photographic and digital imaging techniques, as well as animation and interactive techniques. In the spring they will learn how to integrate these skills into a more complex interactive work. They will also get away from the desktop and start employing mobile computing capability.

One of the projects they will do in the spring is a site-specific interactive media work designed for the mobile computing platform. Students will research and examine a site according to their own interests. The site may have ecological, historical, personal, or other significance. Students will then generate media to augment the site, including sound, images, text, and video. The media will be composed and arranged to complement the experience of walking through the site.

Once the piece is finished, participants will be asked to experience the work. The work is presented on a Tablet PC. The location of relevant environmental or urban elements will determine delivery of media, so as a participant is walking along, she will hear audio relevant to where she is standing and what she is looking at.

This project will use Global Positioning System (GPS) devices to track the location of the user. The GPS feeds longitude/latitude information to the Tablet PC, and the user's location triggers delivery of media as the user walks through a delineated space. The input for this interactive work depends on the user's pattern of movement and speed, rather than a conventional input device such as a keyboard or mouse.

Because the project operates in reaction to location-sensitive triggers, the fieldwork and testing associated with it take place onsite. Like any site-specific work, the project starts with a premise, then is informed by cycles of research, design, production, and testing. This process repeats itself and continues throughout the planning and completion stages. Public viewing of the project, to take place in May, will also necessitate equipment onsite.
The viewing equipment is the same as the equipment the work is designed on: Tablet PCs, headphones, and GPS devices. To enable the design and presentation of student work, we need to purchase the necessary hardware and software. In addition to its use in the Camera to Computer project, the hardware and software will be available to Evergreen students after June 2003 for other wireless projects. Evergreen students will be able to design wireless mobile computing projects for art, environmental, educational, and other applications on and off campus.

"34 North 118 West" is a GPS-controlled narrative situated in downtown Los Angeles. It delivers a historic narrative, with sound effects, to participants as they walk through a delineated neighborhood. Sounds are triggered by the location of the participant. These sounds denote activity in the neighborhood in past times. For example, on crossing a street, an antique car horn beeps. Trains depart from a long-gone freight depot. Horse carriages rattle down the cobblestone street, now paved over. The sound of bottles clanking is heard as one walks past the now-defunct bottling plant.
This work is currently running in the Art in Motion Festival, where it was chosen for the Grand Jury Prize for creative excellence. It has appeared in Wired Magazine, the Los Angeles Times Magazine, the Christian Science Monitor, washingtonpost.com, and Wireless Review. See description and information on this project at http://34n118w.net .
One of the collaborators instrumental in the above described project was the software designer, Jeff Knowlton. He will be invited to teach and integrate the existing custom software with the spring student project. He also plans to work together with a GIS specialist at Evergreen College.
A description and breakdown of equipment and costs follows. This breakdown would allow students to work in groups of 4.
Tablet PCs are computer manufacturers' first attempt to provide a lightweight, compact, hardy device for indoor and outdoor location-based applications. The StepUp Tablet PCs proposed are the first sub-$1,000 tablets.
4 Tablet PCs (StepUp DocuNote): $4048
Benefits to Evergreen and Community at Large:
Given the wireless initiative in development at TESC, these tablets will lend Evergreen students the capability to design mobile computing projects for art, environmental, educational and other applications.
Information on products
Project Schedule
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
From: NWACC@directory.reed.edu [SMTP:NWACC@directory.reed.edu]
Dear Naomi:
On behalf of the Northwest Academic Computing Consortium, it is my
pleasure to inform you that you are a recipient of a Proof of Concept
award from NWACC. Congratulations on submitting an excellent proposal.
Award payments will be made in two installments: An initial payment of
80% of the requested funds will be issued by May 15, 2003. A final
payment of the 20% balance of requested funds will be issued by October 15, 2003, provided that: (a) the project Web site has been established; and (b) satisfactory evidence of progress towards completion of the project has been made.
Please provide us (by e-mail) the name, address, and phone number of the appropriate Financial Officer at your institution to whom award checks should be sent and who is authorized to sign an award agreement on behalf of your institution.
In order to share your project with other members of NWACC, we require
that you create a project Web site as soon as possible but not later than September 15th, 2003. As soon as your Web site has been established, please send us the URL so we may link to it from the NWACC Web site. Your Web site should include a copy of your project proposal as well as other materials as they are developed. All projects must be completed and grant funds utilized by April 15, 2004. Please notify us when your project has been completed and a short final report is posted on your project Web site. A template for the final report is provided below.
If you have any questions, please contact the NWACC office at 503-777-7254 or nwacc@reed.edu. Once again, congratulations. We wish you great success in your endeavor!
Sincerely yours,
Mark Sheehan, Chair, NWACC Grant Committee 2003
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
On Mon, 12 May 2003, Spellman, Naomi wrote:
Jeff, before I send this out to my students and have them believe
something that = FALSE, can you look over this? What are some good
resources for looking at basic interactive programming principles to get
ready for week 9?
Check this out: One group is working with a local storyteller who, based
on some historic detail and myth, will spin a yarn told from the river's
point of view at Tumwater Falls. There are 5 waterfalls on the path; each
waterfall location will trigger a segment of the story, perhaps.
Other group: conflating the terrain with the human body in terms of scale
and texture. I am wondering if we can do something with the touch screen,
since this should be about TOUCHING. I don't know if motion comp screens
react without the pen, as it is a digital pen. Do you?
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
C2C:
We can use any of the data listed below (at the very bottom), which is
standard for GPS receivers. Latitude and longitude, or vertical and
horizontal location coordinates, are the obvious ones, which we have
already been working with in the existing script. Other variables, and
examples of how they could be used, follow. I am using pseudo-code:
English set up in the way one must think when programming, in precise
mathematical terms, with operators such as <, >, =, +, -, x and terms
such as if, then, while, loop, true, false, set.
The important thing to note is that whenever you choose to work with a
certain input, you must address all occurrences of that data, unless you
only want your piece to react in a specific range. In other words, when
using velocity, you need to first establish what your total range is
(0-11 mph assuming the person is on foot, 1-120 if they are in an
automobile, etc.), then divide that total range into increments, then
specify what happens at each increment. OR you could only have something
happen if they are running (greater than 6 miles per hour), and not react
to any other range. Of course, if that person got on a skateboard, a bike,
or in a car, the running mode would be on all the time. But hopefully no
one will disappear with our Tablets on a bike.
What could happen when specific criteria are met? For example, what could happen when ELEVATION is over 10 meters, or when VELOCITY is greater than 6 mph:
------------------------------------------------------------------
Speed over ground in knots, or VELOCITY
if velocity equals 0 (mph)
then do nothing (silence)
if velocity is greater than 0 (mph) and less than 1 (mph)
then play "video01.qtm"
loop while velocity is 0 OR more than 1
if velocity is greater than 1 and less than 2
then play "video02.qtm"
loop while velocity is less than 1 OR more than 2
if velocity is greater than 2 and less than 3
then play "video03.qtm"
loop while velocity is less than 2 OR more than 3
(imagine video clips working like this: when you are still, the character
on the screen is still, when you start to walk, she starts, each time you
speed up, she speeds up, etc.)
-TIP- use looping for time-based media whenever possible, so that we can
use and reuse the same clip seamlessly, versus playing a huge file. That
will bog you down.
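For anyone who wants to see the same idea in real code, here is a minimal Python sketch of the velocity-to-clip mapping (a hypothetical helper, not the actual course scripts; the clip names and 1-mph bands are taken from the pseudo-code above):

```python
def clip_for_velocity(mph):
    """Pick the looping video clip for the current walking speed.

    Bands follow the pseudo-code above: standing still means silence,
    then one clip per 1-mph increment. Anything at 2 mph or above keeps
    the fastest clip, so a runner (or cyclist) stays in top gear.
    """
    if mph <= 0:
        return None              # standing still: play nothing
    if mph < 1:
        return "video01.qtm"
    if mph < 2:
        return "video02.qtm"
    return "video03.qtm"         # 2 mph and up
```

The main loop would call this every time a new GPS reading arrives, and only swap clips when the returned name changes, so the loop keeps playing seamlessly within a band.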
------------------------------------------------------------------
Antenna altitude above/below mean sea level (geoid) or ELEVATION
An example of how you might use elevation could be
start:
set volume of all channels to zero
play sound in channels one, two and three
if elevation is greater than 0 m (meters above sea level) and less than 2 m
then set volume of channel one to 3 (volume runs from 1-7, 1 being quietest
and 7 loudest possible)
if greater than 2 m and less than 4 m
then set volume of channel two to 5
if greater than 4 m
then set volume of channel three to 7
if elevation decreases by 1 m (each occurrence)
then decrease volume levels of channels one, two, and three by 2 levels
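The same elevation logic can be sketched as a Python function (hypothetical; the channel names and the 1-7 volume scale come from the pseudo-code above):

```python
def channel_volumes(elevation_m):
    """Map elevation (meters above sea level) to per-channel volume.

    All three channels are always playing; elevation decides which
    one is brought forward. Volume runs 1-7, with 1 the quietest.
    """
    volumes = {"one": 1, "two": 1, "three": 1}   # start near silence
    if 0 < elevation_m < 2:
        volumes["one"] = 3
    elif 2 <= elevation_m < 4:
        volumes["two"] = 5
    elif elevation_m >= 4:
        volumes["three"] = 7
    return volumes
```

Note the elif chain makes the bands mutually exclusive, which is the "address all occurrences of that data" point above: every possible elevation falls into exactly one case.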
------------------------------------------------------------------
Time of day, or TIME (I believe this is in the stream, since it is
necessary for GPS, but I'll check)
If greater than 05:00 (a.m.) and less than 08:00 (a.m.)
Then use medium blue layer (make image appear blue by laying translucent
layer over image onscreen)
If greater than 08:00 (a.m.) and less than 10:00 (a.m.)
Then use light blue layer (make image appear light blue by laying
translucent layer over image onscreen)
If greater than 10:00 (a.m.) and less than 15:00 (3:00 p.m.)
Then use bright yellow layer (make image appear yellow by laying
translucent layer over image onscreen) and lighten image by 10%
Etc.
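In code, the time-of-day tinting might look like this (a hypothetical Python sketch; the hour bands and layer colors are the ones from the example above):

```python
def tint_for_hour(hour):
    """Choose the translucent overlay color for a 24-hour clock hour."""
    if 5 <= hour < 8:
        return "medium blue"
    if 8 <= hour < 10:
        return "light blue"
    if 10 <= hour < 15:
        return "bright yellow"   # and lighten the image by 10%
    return None                  # no overlay defined for other hours yet
```

The final return covers the "Etc." case: hours outside the defined bands currently get no overlay, which you would fill in as the piece develops.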
------------------------------------------------------------------
Number of satellites in use [not those in view]
If 1 (satellite)
Then show one green ball
If 2
Then show 2 green balls
If 3
Then show 3 green balls
Etc.
-OR-
Start: set volume of all channels to 1 (lowest volume)
If 1 (satellite)
Then play "sound01.aif"
If 2
Then play "sound02.aif"
If 3
Then play "sound03.aif"
(imagine a low noise growing in complexity as more satellites are picked
up. If you were in and out of range at your site, perhaps under trees,
this kind of thing might be good)
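A quick Python sketch of the second (sound) version, using the file names from the example (hypothetical helper; assumes only the three clips listed exist):

```python
def sound_for_satellites(count):
    """Layer in one sound file per satellite currently in use."""
    if count < 1:
        return None                  # no satellites yet: stay silent
    count = min(count, 3)            # only three clips in this example
    return "sound%02d.aif" % count
```

Clamping the count means a strong fix with many satellites simply holds the most complex sound rather than asking for a clip that does not exist.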
------------------------------------------------------------------
= UTC of position fix
= Data status (V=navigation receiver warning)
= Latitude of fix
= N or S
= Longitude of fix
= E or W
= Speed over ground in knots
= Track made good in degrees True
= UT date
= Magnetic variation degrees (Easterly var. subtracts from true
course)
= E or W
= GPS quality indicator (0=invalid; 1=GPS fix; 2=Diff. GPS fix)
= Number of satellites in use [not those in view]
= Horizontal dilution of position
= Antenna altitude above/below mean sea level (geoid)
= Meters (Antenna height unit)
= Geoidal separation (Diff. between WGS-84 earth ellipsoid and
mean sea level. -=geoid is below WGS-84 ellipsoid)
= Meters (Units of geoidal separation)
= Age in seconds since last update from diff. reference station
= Diff. reference station ID#
= Checksum
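For reference, here is a minimal Python sketch of pulling a few of the fields above out of a raw NMEA "GGA" sentence, the kind of text string the Garmin receivers stream over the serial cable. The field positions follow the standard NMEA 0183 GGA layout; the sample sentence is made up for illustration, and checksum verification is skipped for brevity:

```python
def parse_gga(sentence):
    """Extract a few useful fields from a $GPGGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        return None
    return {
        "utc": fields[1],                    # UTC of position fix
        "latitude": fields[2] + fields[3],   # e.g. "4700.819N"
        "longitude": fields[4] + fields[5],
        "quality": int(fields[6]),           # 0=invalid, 1=GPS fix, 2=Diff. GPS fix
        "satellites": int(fields[7]),        # satellites in use, not in view
        "altitude_m": float(fields[9]),      # antenna altitude above geoid
    }

# A made-up sentence near the Tumwater starting point:
fix = parse_gga("$GPGGA,123519,4700.819,N,12254.268,W,1,06,1.2,25.0,M,,,,*47")
```

In the actual piece, a parser like this sits between the serial port and the score, handing each trigger (velocity, elevation, satellite count) its current value.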
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
Subject: more re data as triggers
Hmmm, would it be a good idea to have something playing on the screen,
like a video, that only plays when the user is running? Seems like a
recipe for an accident.
"Of course I was looking at the screen when I ran into traffic. I couldn't
see the screen unless I was running. That's why I am litigating."
Time is already on the Tablet. We don't need the GPS for that.
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
C2c:
The Keyspan serial-USB cables we purchased were not working as consistently as KK's Belkin cable, so we purchased 2 Belkin cables and will return the original ones. We are missing one; does anyone have a Keyspan USB-serial cable? We have the packaging and the CD-ROM, but not the cable!
When referring to media files in the script, it is the sprite name and not the cast member name that must be referred to! Remember, a sprite is a cast member on the stage, and its number corresponds to the channel it resides in. That has been corrected in the one script where it was a problem.
Your maps are very small. Group 2's map is about 600 x 700 pixels. Just as a comparison, the map Jeff and I used in our original project was about 7000 x 9000 pixels. Jeff thinks this may be a factor in accuracy, in addition to the problem below. We will probably resample your map in Photoshop to test this.

GPS coordinates on the map for Group 2 appear to be incorrect. The proportions of the map graphic do not correspond to the proportions indicated by the GPS coordinates submitted. Jeff and I will redo that map today and see if that solves our problem.

File names are too long. One of the specifications given for media files residing in the cast member window was that the file names be under 8 characters total. When files get copied between PC and Mac format, long file names are shortened, which means they can no longer be found by your scripts! My suggestion is to use a naming system ("audio01.aif") instead of descriptive names ("dog_barking_at_fireman_whileeating.aif").
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
C2C: Good news, finally. The Tumwater piece from group 3 is now functioning! There is one little glitch to iron out: the cursor is visible over the QuickTime movies. I think that can be addressed through scripting; I'll see if Jeff can figure it out. I will finish group 1's piece soon, and then I'll invite people to come out to Tumwater to see those two pieces. The downtown piece is still suffering from inaccuracy. I will be working together with Rip to figure this one out, and I'll keep everyone updated.
Hope all of you are well. My course was cancelled this summer, so I will probably just be around for part of the summer. Email is the best way to reach me.
Naomi
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
From: Spellman, Naomi
Subject: site-specific interactive media work and reception
A GPS-controlled narrative, "Tum Tum", will be presented onsite at
historic Tumwater Falls this Thursday. This exciting new work, done by
students of Camera to Computer during Spring Semester, tells the story
of the river over time. Students worked together with local storyteller
Rebecca Hom to bring their story to life. The story unfolds spatially as
one walks through the park. The work runs on Tablet PCs with GPS
receivers and headphones.
This work will be available to the public for viewing on Thursday, July
17, from 6:00 to 8:00 p.m. Following the presentation, from 8:00-10:00
p.m., there will be a reception at the home of Thad Curtz with food and
drink. Directions to Tumwater park and reception below.
RSVP for presentation and/or reception appreciated but not absolutely
necessary. Just reply to this email.
If you would like more information on how this project works, my website
http://34n118w.net should be helpful.
Warmly,
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
DIRECTIONS TO TUMWATER FALLS PARK
Driving
You can take either Capitol Way or Deschutes Parkway to Tumwater Falls
Park. From downtown Oly, take Capitol Way south to Custer Way (at the
Miller Brewery). Turn right (west), cross the Deschutes river, then
left, following brown signs to "Tumwater Falls Park" (NOT "Tumwater
Historical Park"). Turn left to go down to the park from Deschutes Way.
It is just below the "Falls Restaurant". Park in the parking lot. We
will be right there at the picnic tables with equipment for you to view
the piece. You do not need to bring anything!
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
READ_ME_TUMTUM
To run "Tum Tum" onsite, you need the following:
Hardware
Software
Location
You must be onsite for this piece to work! See directions at bottom. Starting point: 47N00.819, 122W54.268
Make sure the GPS receiver and Tablet PC are OFF.
Connect serial cable to GPS receiver, by sliding the serial cable connector onto the connection pins (on Garmin units, at top of device under rubber flap).
Connect the Serial-to-USB converter cable to the serial cable on one end, and the Tablet PC's USB port on the other.
Turn the Tablet PC on (slide button on bottom right side)
When the Tablet boots up, press the button under your right hand with the key icon (this button is the same as control-alt-delete). Hit "okay" to log on as administrator with no password.
Once the Tablet is ready, turn the GPS receiver on.
You should check which com port number the GPS is connected to by looking at the device manager utility on the Tablet PC (control panel/system/devices/hardware/com ports)
Check to see if the Tablet is receiving the incoming text string from the GPS unit by opening the HyperTerminal application (PROGRAMS/UTILITIES/COMMUNICATION/HYPERTERMINAL). In HyperTerminal, go to FILE/OPEN and locate the pre-made setting that corresponds to the com port being used. When you open this ht file, you should get a text string flowing into the HyperTerminal window. If not, check all connections. Try rubbing the GPS serial pin connections with a dry eraser, then reconnecting. Remember that the GPS must be outputting the NMEA string (push the "page" button on the Garmin receiver to get to the system menu; under "interface" you can set it to NMEA output).
Once you've established a successful connection, exit Hyperterminal.
Double-click on the "TumTum" application from your hard disk.
The start-up screen asks you to specify the com port number; then hit "FINISHED". That's it. If it's working properly, and assuming you are at Tumwater Falls Park (directions at bottom), you should see a red cursor in the middle of the screen and a map that updates as you move. You may have to stop and start "TumTum" a second time before it works. (The key icon button is control-alt-delete; this brings up the Task Manager. "End task", then restart "TumTum".) Make sure the GPS has a clear view of the sky. Move out from under trees, at least to get started. The yellow hotspots on the screen are the media triggers; follow these.
*Available at Computer support.
** Available at Lab Stores.
DIRECTIONS TO TUMWATER FALLS PARK
Starting point is located at 47N00.819, 122W54.268
Driving
Yahoo! Maps - 200 Deschutes Way Sw, Tumwater, WA 98501-4087
By Bus
This report was compiled by Naomi Spellman, the project director for a site-specific interactive narrative project in the spring of 2003 at Evergreen State College.