Introduction to Wearable Computing: Project Possibilities
Class projects should be viewed as an opportunity to explore in depth
some aspects of wearable computing that are of particular interest to
you. It is fine to work in groups of any size as long as each
individual's contribution is clear. Ideally, as a class, we will be able
to design a common project to which we can all contribute components.
Projects do not have to rely upon the existing WearTools suite or be
implemented in Perl. However, I encourage you to make use of these
tools where they make sense.
Important Dates:
- Feb 27th: 1/2 page project proposal is due. You will also be
expected to give a 5 minute (1-2 slide) presentation to the
class on your plans. This class period will be dedicated to discussing
all of the projects and to coordinating efforts across project components.
(4% of class grade)
- March 11th: Design review. Be prepared to describe the details of
what you intend to implement and how it relates to others in the class. Expand your written proposal to 2 pages.
(5%)
- March 27th: Prototype I. Be prepared to demonstrate and describe a
component of your project in a 10 minute presentation. At this stage,
you should have approximately half of the project completed.
(7%)
- April 10th: Prototype II. Demonstrate and describe your working project
(again, 10 minutes each). Also - please hand in a 1/2 page description
of your user evaluation plans.
(7%)
- April 29th: User Evaluation. Recruit three test subjects (other than yourself)
to evaluate your project. Questions you are trying to answer
include: how easy/hard is your system to use? How does your system
improve (or detract) from the user's ability to perform a task (e.g.,
time, accuracy)? How should your interface/system be improved? On
this day, report on your findings, discuss planned improvements (10 minutes),
and turn in a 1-2 page document on your findings.
(7%)
- May 13th: Demonstration and presentation of complete system.
A draft of your final write-up is due (this will consist of a
synthesis of what you have written already and an extended outline of
the remaining material). You will also
be expected to give a 30 minute presentation to the class on
this day. (10%)
- May 23rd: Final write-up is due by 5pm (PostScript is fine).
Your document should be on the order of 10 pages/person and should
cover what you have done, why it is important, how it relates
to the existing literature, and the details of what you have
accomplished. (10%)
Possible Projects
1: Outdoor Navigation Assist System
Components include:
- Building maps from GPS data.
- Responding to a human query by giving audio directions to a location.
- Augmented reality environment-building tool. Through
keyboard/vocal/GPS input, provide an interface that makes it easy to
construct 3D models by placing objects and specifying walls.
- Use the 3D augmented reality
system to convey a path to a desired location.
- Use the 3D AR system to make other information available about the
space.
- Use the 3D AR system to develop a spatial adventure game (e.g., see
non-linear storytelling).
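The map-building component above has to start from raw GPS output. A
minimal sketch, assuming the receiver emits standard NMEA $GPGGA
sentences (the field layout and the ddmm.mmmm coordinate encoding are
assumptions about your particular receiver; a real system would also
validate the checksum):

```python
def parse_gga(sentence):
    """Return (lat, lon) in decimal degrees from a $GPGGA sentence, or None."""
    fields = sentence.strip().split(',')
    # Field 2/3 are latitude/hemisphere, field 4/5 longitude/hemisphere.
    if not fields[0].endswith('GGA') or not fields[2] or not fields[4]:
        return None

    def dm_to_deg(dm, hemi):
        # NMEA packs coordinates as ddmm.mmmm: whole degrees, then minutes.
        point = dm.index('.')
        degrees = float(dm[:point - 2])
        minutes = float(dm[point - 2:])
        value = degrees + minutes / 60.0
        return -value if hemi in ('S', 'W') else value

    return (dm_to_deg(fields[2], fields[3]), dm_to_deg(fields[4], fields[5]))

# Accumulating waypoints from a log gives the raw material for a map.
log = ["$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"]
waypoints = [p for p in map(parse_gga, log) if p]
```

From here, map building is a matter of clustering waypoints into paths
and intersections, which is where the interesting project work begins.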
2: Tools for Technomages
- Twiddler keyboard training system.
- Develop a communication system for wearable users that takes
advantage of the available I/O modalities. Look toward the Zephyr or
instant-messaging models.
- System for filtering phone calls in a context-dependent
manner. One possible starting point:
Gnome-o-Phone.
- Face detection and recognition for identifying user context.
- Informational avatars: wearable broadcasts web pages, etc. for
other local wearable or other users to view. Can we do this in an
interesting, context-dependent manner?
- Look paintings: the fusion of multiple images taken at
different orientations into a single panoramic view (cf Steve Mann or
Zhigang Zhu).
- Context-based messaging: the delivery of messages to a user in
very specific contexts.
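The context-based messaging idea above amounts to holding messages
until the wearer's sensed context satisfies some condition. A minimal
sketch, assuming context arrives as a dictionary of sensor-derived
attributes (that representation, and the `ContextMessenger` name, are
illustrative choices, not part of WearTools):

```python
class ContextMessenger:
    """Queue messages with a context predicate; deliver on a match."""

    def __init__(self):
        self.pending = []          # list of (predicate, message) pairs

    def post(self, predicate, message):
        """Queue a message for delivery when predicate(context) is true."""
        self.pending.append((predicate, message))

    def update(self, context):
        """Call on each context change; returns the messages now due."""
        due = [msg for pred, msg in self.pending if pred(context)]
        self.pending = [(p, m) for p, m in self.pending if not p(context)]
        return due

messenger = ContextMessenger()
messenger.post(lambda c: c.get('place') == 'grocery', "Pick up milk")
```

A call to `messenger.update({'place': 'grocery'})` would then deliver
the reminder exactly once; the hard research questions are what the
context vocabulary should be and how reliably it can be sensed.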
Project Suggestions from Last Year
- General speech interfaces for wearable computers. How to
switch between different applications in an easy and clean
manner? How to communicate the state of the interface to the
user? How to handle interruptions from other applications?
- System for filtering phone calls in a context-dependent
manner. One possible starting point:
Gnome-o-Phone.
- Face detection and recognition for identifying user context.
- High-level control of multi-robot systems. Design a user
interface that effectively manages the flow of information gathered by
many robots to the user, displaying a synthesis of the available and
relevant information. The user should be able to also post goals for
the set of robots.
- Augmented reality: overlay of images and/or sound on top of
reality. Use GPS/compass system and/or a 6-axis accelerometer to
determine position of the head.
- Augmented reality cont: Interface to Doom/Quake or VRML-based
environment.
- Adventure game. Utilize position and other user inputs to
drive evolution of game state.
- Adaptive mechanisms for mapping context to document retrieval.
- Informational avatars: wearable broadcasts web pages, etc. for
other local wearable or other users to view.
- Vision-based place recognition for contextual input.
- 3D audio systems (cf. Nomadic Radio).
- PEDnet: networks of low-cost devices in which
wearables play a role in the transport of communication packets.
- Java-based library implementation of the existing WearTools (and
do some interesting things with it).
- Java/Jini implementation of the WearTools. Addition of
fixed-location services. How to automatically construct maps of
``closest'' resources?
- Mechanisms for interaction between wearable users. Audio and
visual techniques.
- Interactive tutoring systems.
- Tour guide. Enhancement of the existing UMass system (cf Ben
Alamed). Tour guide content for campus or Amherst.
- Development of a general I2C interface for bringing in large
numbers of sensors.
- Gestural recognition via vision or accelerometers.
Understanding the activities that the user is engaged in based upon
observed gestures in the context of various manipulated objects.
- Look paintings: the fusion of multiple images taken at
different orientations into a single panoramic view (cf Steve Mann or
Zhigang Zhu).
- Body networks: the transmission of power and data through the
skin.
- Context-based messaging: the delivery of messages to a user in
very specific contexts.
- Visual enhancement (e.g., IR-enhanced or panoramic vision).
- EMG interfaces as alternative mechanisms for providing direct
input into the wearable.
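Several of the suggestions above (gesture recognition via
accelerometers, EMG input) reduce to classifying a stream of sensor
samples. A deliberately simple sketch of the first step, assuming
samples arrive as (ax, ay, az) triples in units of g; the thresholds
are placeholders, and a real project would segment the stream and use
a trained classifier rather than peak counting:

```python
import math

def detect_shake(samples, threshold=2.0, min_peaks=3):
    """Return True if enough samples exceed the acceleration threshold.

    samples: iterable of (ax, ay, az) accelerometer readings in g.
    A 'shake' gesture is crudely defined as min_peaks or more
    high-magnitude readings.
    """
    peaks = sum(1 for ax, ay, az in samples
                if math.sqrt(ax * ax + ay * ay + az * az) > threshold)
    return peaks >= min_peaks

# At rest, only gravity (about 1 g) is seen, so no peaks are counted.
still = [(0.0, 0.0, 1.0)] * 20
shaken = still + [(3.0, 0.0, 1.0), (0.0, -3.0, 1.0), (2.5, 2.5, 0.0)]
```

Even this crude detector illustrates the design questions a project
would face: windowing, per-user threshold calibration, and how to map
recognized gestures onto the manipulated-object context.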
fagg@cs.umass.edu
Last modified: Wed Jan 30 13:39:08 2002