On September 25, I presented my idea for a theremin-inspired musical instrument that uses the Leap Motion controller in front of my Natural User Interfaces class. The primary concepts that I introduced during my five minutes were:

  1. Physical markers to aid with spatial location.
    The theremin has proximity-sensing antennae that provide spatial landmarks for the musician. The Leap Motion controller does not require these antennae, but their secondary function can be recreated using a rod with markers to indicate the positions of specific pitches.

  2. Use of the computer display to aid with pitch placement.
    Unlike a theremin, this instrument will be hooked up to a computer monitor. That monitor can display the pitch being played (or the pitch that will sound once the volume is activated) as a musical note.

  3. Use of the computer display to aid with volume control.
    There is no physical boundary between "off" and "on" for the instrument's volume. To remedy that, I plan to have a "liminal zone" in which the musician's hands can be detected without causing any notes to play. A meter on the computer display will show how close their hands are to the "active zone," in which volume can be controlled (sketched in code just after this list).

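Here is a minimal sketch of the liminal/active zone idea from point 3. The zone boundaries, the millimeter scale, and the function name are placeholder assumptions of mine; in practice the palm height would come from the controller's hand-tracking data on each frame.

```python
LIMINAL_FLOOR = 100.0   # hypothetical: hands below this height are ignored entirely
ACTIVE_FLOOR = 250.0    # hypothetical: boundary between liminal and active zones
ACTIVE_CEILING = 450.0  # hypothetical: height at which volume reaches maximum

def volume_state(palm_height_mm):
    """Return (zone, meter, volume) for a given palm height.

    zone   -- "out", "liminal", or "active"
    meter  -- 0.0-1.0 progress toward the active zone, for the on-screen display
    volume -- 0.0-1.0 output volume; always 0.0 outside the active zone
    """
    if palm_height_mm < LIMINAL_FLOOR:
        return "out", 0.0, 0.0
    if palm_height_mm < ACTIVE_FLOOR:
        # Detected but silent: the meter shows how close the hand is to sounding.
        meter = (palm_height_mm - LIMINAL_FLOOR) / (ACTIVE_FLOOR - LIMINAL_FLOOR)
        return "liminal", meter, 0.0
    # Inside the active zone, height controls volume directly.
    volume = min(1.0, (palm_height_mm - ACTIVE_FLOOR) / (ACTIVE_CEILING - ACTIVE_FLOOR))
    return "active", 1.0, volume
```
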
Professor Bowman and my classmates provided some helpful feedback that is informing my thoughts as I develop this project.

Two comments in particular fit together so well that I am almost certain to incorporate them into my design. One person suggested that I use different finger configurations (which the Leap can detect) to provide richer controls than position detection alone. Someone else suggested the possibility of "snapping" the pitch to half steps instead of sliding from pitch to pitch.

I do not want to eliminate that "slidey" sound entirely, as it is one of the distinctive features of an in-air gesture interface that cannot be reproduced easily with traditional instruments. However, I am very interested in integrating optional pitch snapping.

My new idea is to use a single finger to slide continuously through pitches and two fingers to snap to half steps. This will allow both styles of play within a single session, and should open up some interesting musical possibilities. In theory, I could map a different progression to a three-finger configuration, such as the notes of a particular scale. Maybe this could be configurable by the musician before beginning a song, the way a banjo player might choose a custom tuning, or the way a harmonica player switches instruments entirely to play in a different key.
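
A rough sketch of that one-finger/two-finger split follows, assuming the app can read a normalized horizontal hand position (0.0 to 1.0) and a count of extended fingers from the tracking data each frame. The pitch range, the mapping, and the function names are placeholders I chose for illustration, not anything from the Leap SDK; the note-name helper is the kind of thing that would feed the on-screen display from point 2 above.

```python
A4_HZ = 440.0
LOW_MIDI, HIGH_MIDI = 48, 72   # hypothetical playable range: C3 up to C5
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_to_hz(midi_note):
    # Standard equal-temperament conversion with A4 (MIDI 69) as the reference.
    return A4_HZ * 2 ** ((midi_note - 69) / 12)

def note_name(midi_note):
    # Nearest note name, e.g. 69 -> "A4"; this is what the display would show.
    n = round(midi_note)
    return NOTE_NAMES[n % 12] + str(n // 12 - 1)

def pitch_for(position, finger_count):
    """Map a 0.0-1.0 horizontal position to (frequency in Hz, note name).

    One extended finger slides continuously through pitches; two fingers snap
    the same position to the nearest half step.
    """
    midi = LOW_MIDI + position * (HIGH_MIDI - LOW_MIDI)
    if finger_count >= 2:
        midi = round(midi)   # snap to the nearest semitone
    return midi_to_hz(midi), note_name(midi)
```

A three-finger "scale" mode could reuse the same structure by snapping to the nearest entry in a list of allowed notes rather than to every semitone.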

Another suggestion was to take advantage of gestures, which the Leap can also recognize. These could serve as a method for toggling between play modes. One possible implementation would be to define one hand as the "instrument" hand, controlling pitch and volume, and the other as the "interface" hand, using gestures to affect the music externally by adding effects or changing modes.
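
To make that concrete, here is a sketch of the two-role split. The Hand record, the gesture names, and the synth methods are all placeholder assumptions for illustration; in practice this data would come from the Leap's frame and gesture events.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Hand:
    side: str                      # "left" or "right"
    position: float                # normalized 0.0-1.0, used by the instrument hand
    gesture: Optional[str] = None  # e.g. "swipe" or "circle" from the interface hand

# Which physical hand plays which role; swapping the values makes the
# instrument reversible for left- or right-handed players.
ROLES = {"right": "instrument", "left": "interface"}

def handle_frame(hands, synth):
    for hand in hands:
        role = ROLES.get(hand.side)
        if role == "instrument":
            # The instrument hand keeps doing what it always does: pitch and volume.
            synth.set_pitch_position(hand.position)
        elif role == "interface" and hand.gesture == "swipe":
            synth.toggle_snap_mode()        # e.g. switch between slide and snap modes
        elif role == "interface" and hand.gesture == "circle":
            synth.toggle_effect("reverb")   # e.g. add or remove an effect
```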

Finally, one person asked which hand would control which functions of the instrument. This led to a short discussion about configurability. I envision the interface being reversible for left- or right-handed use. Of course, there is a possibility that I will make both hands "instrument" hands, in which case it will be completely ambidextrous.