Research Overview

Dance.Draw

The Dance.Draw project is a 3-year, NSF-funded effort exploring the intersection of dance and interactive technology. We have three main research goals: tracking the motions of dancers to create real-time interactive media that can become part of the dance production, researching methods for enhancing audience engagement and interaction, and developing novel IT support for the collaborative performing arts process. You can find out more about this project on the Dance.Draw site.

Creativity Support Evaluation

My PhD student, Erin Carroll, has been investigating how to evaluate digital creativity support tools. We developed the Creativity Support Index, a self-report metric modeled after the NASA TLX but tailored to evaluating how well tools support people in open-ended, creative work. In addition, we have been investigating how the physiological state of the body changes both in response to being presented with creative work and during the process of doing creative work. This ongoing research now focuses on using machine learning to detect states of ‘In-The-Moment-Creativity,’ periods of highly creative experience during the creative work process.
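
As a rough illustration of how a TLX-style self-report metric can be aggregated, here is a minimal Python sketch. The rating scale, weighting scheme, and example responses are assumptions for illustration, not the published Creativity Support Index instrument.

```python
# Illustrative sketch of a TLX-style weighted self-report score.
# Rating scale, weighting scheme, and example responses are assumptions
# for illustration, not the published Creativity Support Index items.

FACTORS = ["Enjoyment", "Exploration", "Expressiveness",
           "Immersion", "Results Worth Effort", "Collaboration"]

def weighted_score(ratings, pairwise_winners):
    """ratings: factor -> rating on an assumed 0-10 scale.
    pairwise_winners: the factor chosen in each of the 15 paired comparisons."""
    # A factor's weight is how often it was picked over another factor.
    weights = {f: pairwise_winners.count(f) for f in FACTORS}
    total_weight = sum(weights.values())    # 15 for six factors compared pairwise
    raw = sum(ratings[f] * weights[f] for f in FACTORS)
    return raw / total_weight * 10          # rescale to 0-100 (assumed convention)

# One participant's hypothetical responses.
ratings = {"Enjoyment": 8, "Exploration": 9, "Expressiveness": 7,
           "Immersion": 6, "Results Worth Effort": 8, "Collaboration": 3}
winners = (["Exploration"] * 4 + ["Enjoyment"] * 4 + ["Expressiveness"] * 3 +
           ["Results Worth Effort"] * 2 + ["Immersion"] * 2)
print(weighted_score(ratings, winners))     # -> 78.0
```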

Interactive Surveillance

In this research work with artist Annabel Manning, I explore new ways for users to interact with artwork in interactive projected gallery spaces.

SoundPainter Aging In Place

In this work, based on a Processing script originally written by Nathan Nifong, I use sound rendering to represent the activity levels of distributed family members.
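
As a rough illustration of the idea of sonifying activity levels, here is a minimal Python sketch (not the original Processing script); the names, activity values, and the mapping from activity to pitch and loudness are all assumptions.

```python
# Illustrative sonification sketch (not the original Processing script):
# map each family member's activity level to the pitch and loudness of a
# short tone, then write the tones to a WAV file.

import math
import struct
import wave

SAMPLE_RATE = 44100

def tone(activity, duration=0.5):
    """Higher activity -> higher pitch and louder tone (assumed mapping)."""
    freq = 220 + activity * 440          # activity in [0, 1] -> 220-660 Hz
    amp = 0.2 + 0.6 * activity           # quiet when idle, louder when active
    n = int(SAMPLE_RATE * duration)
    return [amp * math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

# Hypothetical activity levels for three distributed family members.
activity_levels = {"person_a": 0.2, "person_b": 0.8, "person_c": 0.5}

samples = []
for level in activity_levels.values():
    samples.extend(tone(level))

with wave.open("soundpainter_sketch.wav", "w") as out:
    out.setnchannels(1)
    out.setsampwidth(2)                  # 16-bit samples
    out.setframerate(SAMPLE_RATE)
    frames = b"".join(struct.pack("<h", int(s * 32767)) for s in samples)
    out.writeframes(frames)
```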

Dual-Cursor Interaction

My research has historically focused on symmetric, bimanual interaction. I still work in this area, along with my PhD student, Berto Gonzalez. More about this work (including movies) can be found on the Dual-Cursor Interaction page; some quick screenshots are shown below.

A screenshot of an image being drawn with symDraw. The menuing system is a bimanual, hierarchical, transparent pie menu design and can be invoked by either hand. Shapes are drawn with both hands by stretching out the corners; once drawn, they can be selected and then simultaneously rotated, translated, and scaled.
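
For illustration, here is a minimal sketch of the two-cursor geometry described above, not symDraw's actual implementation: two cursor positions define a shape's opposite corners, and moving both cursors at once yields a simultaneous translation, rotation, and uniform scale (classic two-point manipulation).

```python
# Sketch of two-cursor geometry (not symDraw's implementation): two cursor
# positions stretch out a shape's corners, and moving the grabbed cursor
# pair computes a simultaneous translation, rotation, and uniform scale.

import cmath

def rect_from_corners(c1, c2):
    """Axis-aligned rectangle stretched out between the two cursors."""
    (x1, y1), (x2, y2) = c1, c2
    return (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))

def two_point_transform(p1, p2, q1, q2):
    """Map cursor pair (p1, p2) -> (q1, q2); returns (scale, angle_rad, translation)."""
    a, b = complex(*p1), complex(*p2)
    c, d = complex(*q1), complex(*q2)
    z = (d - c) / (b - a)                # similarity transform as a complex ratio
    scale, angle = abs(z), cmath.phase(z)
    translation = c - z * a              # apply as: q = z * p + translation
    return scale, angle, (translation.real, translation.imag)

# Example: the cursors move apart and rotate 90 degrees around the shape.
print(rect_from_corners((10, 10), (60, 40)))
print(two_point_transform((0, 0), (10, 0), (5, 5), (5, 25)))
```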
This screenshot of symTone shows the bimanual Ken Burns Effect, where a user selects areas of interest in an image to create a slideshow using two mice and two cursors.
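
As a rough sketch of what a Ken Burns-style transition involves, the following Python snippet interpolates between two crop rectangles standing in for the selected areas of interest; the region coordinates are hypothetical and this is not symTone's implementation.

```python
# Sketch of a Ken Burns-style pan and zoom between two selected regions of
# interest (illustrative only). Each region is a crop rectangle
# (x, y, width, height) in image coordinates.

def lerp(a, b, t):
    return a + (b - a) * t

def ken_burns_path(region_start, region_end, steps):
    """Yield intermediate crop rectangles from region_start to region_end."""
    for i in range(steps + 1):
        t = i / steps
        yield tuple(lerp(s, e, t) for s, e in zip(region_start, region_end))

# Hypothetical areas of interest selected with the two cursors.
start = (100, 80, 400, 300)    # wide framing
end = (260, 180, 120, 90)      # zoomed-in detail
for crop in ken_burns_path(start, end, 5):
    print(crop)
```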

This screenshot of symTone shows the ToneZone tool, which allows users to simultaneously adjust the minimum and maximum input and output tones of the underlying image by manipulating the size and position of the dashed rectangle. This geometric manipulation serves as a memory cue to help users explore the image parameters.
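
The adjustment ToneZone exposes resembles a standard levels remapping, where the input tone range is linearly stretched onto the output tone range. Below is a minimal Python sketch under that assumption; the parameter names and example values are illustrative, not taken from symTone.

```python
# Sketch of the levels-style remapping that ToneZone's rectangle appears to
# control (illustrative; parameter names are assumptions). The rectangle's
# horizontal extent stands in for [in_min, in_max] and its vertical extent
# for [out_min, out_max].

def remap_tone(value, in_min, in_max, out_min, out_max):
    """Linearly map an input tone to the output range, clamping outside it."""
    t = (value - in_min) / (in_max - in_min)
    t = min(max(t, 0.0), 1.0)            # clamp tones outside the input range
    return out_min + t * (out_max - out_min)

# Example: stretch mid-range tones of an 8-bit image to the full output range.
pixels = [10, 60, 128, 200, 250]
adjusted = [remap_tone(p, in_min=50, in_max=220, out_min=0, out_max=255) for p in pixels]
print([round(a) for a in adjusted])      # -> [0, 15, 117, 225, 255]
```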