International Conference on 

                NEW INTERFACES FOR MUSICAL EXPRESSION 

                Date: May 24-26, 2002

                Location: Media Lab Europe in Dublin, Ireland 

                Important Deadlines:

                February 15, 2002 Submission deadline 
                March 15, 2002 Notification of acceptance 
                April 5, 2002 Camera-ready papers due 
                April 19, 2002 Early registration deadline (50 Euro discount)
                May 24-26, 2002 Conference and related events 


ABOUT NIME


What? 
NIME is an International Conference on New Interfaces for Musical Expression. This conference will explore the new directions
that musical interfaces are taking, addressing current research and evolving issues through presented papers, discussions, and
performances with academics, technologists, and artists working at the cutting edge.

When and where? 
It will take place at Media Lab Europe in Dublin, Ireland, from May 24-26, 2002.

Themes
Acoustic musical instruments have settled into canonical forms, taking centuries, if not millennia, to evolve their balance between
sound production, ergonomics, playability, potential for expression, and aesthetic design. As electronic music instruments liberate
the action of musical control from the sound-production mechanism, their form need not be limited by the corresponding
constraints and is free to move in many other directions.

Electronic instruments have been around for only the last century, during which rapid advances in technology have continually
opened new possibilities for sound synthesis and control, keeping the field in constant flux. Today's sensor technologies
enable virtually any kind of physical gesture to be detected, captured, and tracked, while new synthesis technologies provide
multiple parameters that can direct and continuously sculpt the detailed nuances of essentially any sound.

Inserting a computer into the loop between the musical controller and the synthesizer also enables any kind of gesture to be
mapped in software onto essentially any musical response, from the most delicate and intimate control of a virtuoso playing a fine
instrument to the limited, high-level direction of a child stomping through a simple interactive installation. The common availability
of sophisticated sensing, processing, and synthesis hardware and software has led to an explosion in the quantity and variety of
electronic music interfaces being developed.
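
To make the mapping idea concrete, here is a minimal sketch in Python. The sensor range, the mapping curves, and the
parameter names are assumptions for illustration only; a real instrument would read from sensor hardware and drive a
synthesizer, for example over MIDI or OSC.

    # Illustrative sketch: map one normalized gesture reading onto
    # synthesis parameters. Ranges and parameter names are assumed
    # for the example, not taken from any particular system.

    def scale(value, in_lo, in_hi, out_lo, out_hi):
        """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi]."""
        t = (value - in_lo) / (in_hi - in_lo)
        return out_lo + t * (out_hi - out_lo)

    def map_gesture(reading):
        """Map a gesture reading in 0.0-1.0 to synthesis parameters."""
        return {
            "pitch_hz": scale(reading, 0.0, 1.0, 220.0, 880.0),  # two octaves
            "brightness": reading ** 2,  # squared curve: finer control at the low end
        }

    if __name__ == "__main__":
        for reading in (0.0, 0.25, 0.5, 1.0):
            print(reading, "->", map_gesture(reading))

Changing only the mapping function alters the instrument's response, from delicate continuous control to coarse, high-level
direction, without touching the sensor or the synthesizer.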

But as this field grows and gains traction, it also raises many questions. What forms will the musical controllers of tomorrow finally
take, provided that they settle at all? Will they ever supplant the keyboard, string, percussion, and wind form factors that still
dominate the commercial landscape? What kinds of musical mapping standards and algorithms will we develop, and will
common controllers ever become adaptive and intelligent? How deep a union will research in musical interfaces forge with work in
Human-Computer Interaction? Is this field becoming so broad that it will fragment into subgenres with different goals and
motivations, or are there deep principles in common that can be applied throughout?