Workshop in CCMIX

Sensors for Interactive Music Performance

Yoichi Nagashima

( SUAC / ASL )

This workshop focuses on sensor technology for interactive music performance, from several viewpoints. First, I will introduce and demonstrate many types of sensors as interfaces between humans and computer systems, not only technically but also artistically, with multimedia works. Second, I will lecture on designing, developing and producing sensing systems, and on handling sensor information to create interactive art in the Max/Kyma environment. Third, we will discuss the possibilities of a "new human interface" and of interactivity with multimedia technology. Finally, Tamami Tono (composer, SHO performer) and I will give a demonstration, a "small concert", as the "live" application of the theme of this workshop.


(1) Why develop/use sensors in composition ?

First, I will show movies of my interactive multimedia works using original sensors, as examples of the theme of this workshop.

    1. CIS (Chaotic Interaction Show)

    2. Muromachi

    3. Strange Attractor

    4. Virtual Reduction

    5. David

    6. Asian Edge

    7. Johnny

    8. Brikish Heart Rock

    9. Atom Hard Mothers

    10. Ten Nimo Noboru Samusa Desu

    11. Visional Legend

    12. Bio-Cosmic Storm

    13. Eternal Traveller

    14. Beijing Power

    15. Shinkai (Installation)

    16. Wandering Highlander

    17. Windmill (Installation and Performance) produced by my students

(2) MIDI, MAX - for interactive/algorithmic control

Because we have little time in this workshop tutorial, I will not argue here that MIDI and Max (Max/MSP) are the best partners not only in computer music but also in media art. I will only show the sample Max patches below, so please check the Cycling '74 website.

(3) AKI-H8 - small microcomputer (example from Japan)

In order to handle sensors, we need an [analog-to-MIDI] interface. If you cannot build the interface on your own, you can use one of these commercial systems :

    1. I-Cube

    2. SensorLab

    3. AtoMIC Pro

    4. AKAI

    5. Roland

    6. Korg

    7. etc ...

In Japan (you know AKIHABARA ?), we can easily obtain a powerful microcomputer board with a 32-bit CPU (16 MHz clock), 2 serial (MIDI/RS232C) ports, 8 channels of 10-bit A/D, 40 or more bits of digital I/O, 2 channels of 10-bit D/A, 5 channels of 16-bit counters, 128 KB of flash EEPROM, 8 KB of RAM and more. It costs only $30. I have produced many original MIDI devices with it as part of my compositions. We call this board the "AKI-H8".
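
As a rough illustration of what such a board's firmware does, here is a minimal C sketch of the analog-to-MIDI loop. This is not actual AKI-H8 code: read_adc() and midi_send() are hypothetical stand-ins for the H8 on-chip A/D converter and serial port, stubbed here so the sketch runs anywhere.

      /* Analog-to-MIDI loop sketch (illustration only, not AKI-H8 firmware). */
      #include <stdio.h>
      #include <stdint.h>

      static uint16_t read_adc(uint8_t channel)      /* 10-bit result, 0..1023 */
      {
          return (uint16_t)(channel * 100 + 23);     /* stub: fake sensor data */
      }

      static void midi_send(uint8_t byte)            /* stub: print hex bytes  */
      {
          printf("%02X ", byte);
      }

      static void send_control_change(uint8_t ch, uint8_t ctl, uint8_t val)
      {
          midi_send(0xB0 | (ch & 0x0F));             /* status: control change */
          midi_send(ctl & 0x7F);
          midi_send(val & 0x7F);
          printf("\n");
      }

      int main(void)
      {
          uint8_t last[8] = {0};
          for (int scan = 0; scan < 2; scan++)       /* real firmware: for(;;) */
              for (uint8_t ch = 0; ch < 8; ch++) {
                  uint8_t val = (uint8_t)(read_adc(ch) >> 3);  /* 10 -> 7 bits */
                  if (val != last[ch]) {             /* send only on change    */
                      send_control_change(0, 70 + ch, val);
                      last[ch] = val;
                  }
              }
          return 0;
      }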


AKI-H8

I have little time to explain this development here, but many musicians and artists in Japan (not specialists in electronics or IT) develop their own original MIDI equipment with my Japanese web page. There are many sample circuits, sample AKI-H8 source codes, and binary codes that run directly. If you want to study this, please study Japanese first.


Developing AKI-H8 system on my desk


Developing AKI-H8 software on a Macintosh

(4) Treating Sensor Information in MAX

Here are some sample patches that handle sensor information for composition / performance control.

    1. Gating

      At the front end of the Max patch, I place a gating switch on the sensor information, so that trouble from unexpected inputs is rejected.
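
      For readers without Max, here is a minimal C sketch of the same gating logic (illustration only; the actual patch is in Max):

      #include <stdio.h>

      /* Front-end gate: while closed, incoming sensor values are
         discarded, so stray inputs cannot trigger anything downstream. */
      static int gate_open = 0;

      int gate(int value, int *out)
      {
          if (!gate_open) return 0;   /* rejected */
          *out = value;
          return 1;                   /* passed   */
      }

      int main(void)
      {
          int v;
          printf("closed: passed=%d\n", gate(64, &v));
          gate_open = 1;
          if (gate(64, &v)) printf("open: value=%d\n", v);
          return 0;
      }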

    2. Cutting Period

      When using sensors as trigger commanders, I sometimes want to cut the input for a moment after an event. This simple patch realizes such a cutting period, during which the system is senseless.
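
      A minimal C sketch of this cutting period (a dead time after each trigger; the 200 ms value is just an example):

      #include <stdio.h>

      /* After each accepted trigger the input is ignored ("senseless")
         for DEAD_MS milliseconds, so one gesture cannot fire twice. */
      #define DEAD_MS 200

      static long last_trigger = -DEAD_MS;

      int accept_trigger(long now_ms)
      {
          if (now_ms - last_trigger < DEAD_MS) return 0;  /* inside cut period */
          last_trigger = now_ms;
          return 1;
      }

      int main(void)
      {
          long times[] = {0, 50, 150, 250, 400};
          for (int i = 0; i < 5; i++)
              printf("t=%ldms -> %s\n", times[i],
                     accept_trigger(times[i]) ? "bang" : "ignored");
          return 0;
      }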

    3. Sampling

      If the sensor information traffic is too heavy, this patch samples it once per [sampling time] period.
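
      A minimal C sketch of the sampling idea, with a hypothetical 100 ms [sampling time]:

      #include <stdio.h>

      /* Rate limiter: however fast sensor values arrive, at most one
         value per SAMPLE_MS is passed on (the most recent one). */
      #define SAMPLE_MS 100

      static long next_emit = 0;

      int sample(long now_ms, int value, int *out)
      {
          if (now_ms < next_emit) return 0;   /* too soon, drop it */
          next_emit = now_ms + SAMPLE_MS;
          *out = value;
          return 1;
      }

      int main(void)
      {
          int v;
          for (long t = 0; t < 500; t += 30)  /* dense input stream */
              if (sample(t, (int)t, &v))
                  printf("t=%ldms value=%d\n", t, v);
          return 0;
      }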

    4. Averaging (Filtering)

      If the sensor data moves too sensitively or carries high-frequency noise, this patch reduces the noisy movement. It is a 5-stage moving-average calculation, i.e. low-pass filtering.
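
      A minimal C sketch of the 5-stage moving average:

      #include <stdio.h>

      #define STAGES 5

      /* 5-stage moving average: smooths jitter and high-frequency
         noise, acting as a simple low-pass filter on the stream. */
      int moving_average(int input)
      {
          static int history[STAGES];
          static int pos, count, sum;

          sum += input - history[pos];   /* update the running sum */
          history[pos] = input;
          pos = (pos + 1) % STAGES;
          if (count < STAGES) count++;
          return sum / count;
      }

      int main(void)
      {
          int noisy[] = {60, 70, 58, 72, 61, 69, 60};
          for (int i = 0; i < 7; i++)
              printf("in=%d out=%d\n", noisy[i], moving_average(noisy[i]));
          return 0;
      }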

    5. Threshold Switch

      This patch outputs a trigger bang when the input level crosses above the preset threshold.
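
      A minimal C sketch of the threshold switch (fires once per upward crossing; the threshold of 100 is just an example):

      #include <stdio.h>

      #define THRESHOLD 100

      /* Fires once each time the level rises across THRESHOLD;
         staying above it does not retrigger. */
      int threshold_switch(int level)
      {
          static int above = 0;
          if (!above && level > THRESHOLD) { above = 1; return 1; }  /* bang */
          if (level <= THRESHOLD) above = 0;
          return 0;
      }

      int main(void)
      {
          int levels[] = {40, 90, 110, 120, 80, 105};
          for (int i = 0; i < 6; i++)
              if (threshold_switch(levels[i]))
                  printf("bang at level %d\n", levels[i]);
          return 0;
      }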

    6. Peak Switch

      This patch generates a trigger bang when the input level goes above the higher threshold and then falls below the lower threshold within the set interval time.
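
      A minimal C sketch of this peak detection, assuming example thresholds of 110/90 and a 300 ms window:

      #include <stdio.h>

      #define HIGH 110
      #define LOW   90
      #define WINDOW_MS 300

      /* Peak detector: bang only if the level exceeds HIGH and then
         drops below LOW within WINDOW_MS (a sharp peak, not a plateau). */
      int peak_switch(long now_ms, int level)
      {
          static long rose_at = -1;
          if (level > HIGH) rose_at = now_ms;
          else if (level < LOW && rose_at >= 0) {
              int hit = (now_ms - rose_at) <= WINDOW_MS;
              rose_at = -1;
              return hit;
          }
          return 0;
      }

      int main(void)
      {
          long t[] = {0, 100, 200, 1000, 2000};   /* a fast peak, then a slow one */
          int  l[] = {50, 120,  60,  120,   60};
          for (int i = 0; i < 5; i++)
              if (peak_switch(t[i], l[i]))
                  printf("bang at t=%ldms\n", t[i]);
          return 0;
      }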

    7. Level Conversion (Normalize)

      Some analog sensors output the full voltage range of the AKI-H8 A/D input (0V - +5V), but many analog sensors cannot cover this full range. For example, one sensor's output range is +1.5V - +4.0V. This patch converts such a narrow input voltage range to the full MIDI range (0-127). You set the lowest voltage with [offset down], and set the gain [100 = unity gain].
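
      A minimal C sketch of the level conversion for the +1.5V - +4.0V example, using the 10-bit A/D scale (0..1023 for 0V..5V):

      #include <stdio.h>

      /* The sensor swings only 1.5V..4.0V of the 0..5V A/D range.
         Subtract the offset, apply the gain, clip to MIDI 0..127. */
      #define ADC_LOW   307   /* approx. 1.5V / 5V * 1023 */
      #define ADC_HIGH  819   /* approx. 4.0V / 5V * 1023 */

      int normalize(int adc)
      {
          int v = (adc - ADC_LOW) * 127 / (ADC_HIGH - ADC_LOW);
          if (v < 0)   v = 0;
          if (v > 127) v = 127;
          return v;
      }

      int main(void)
      {
          printf("%d %d %d\n", normalize(307), normalize(563), normalize(819));
          return 0;   /* prints 0 63 127 */
      }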

    8. Inversion (Polarity)

      This patch inverts the direction (polarity) of the sensor values. I know many of you have used such a patch between MIDI keyboard note numbers and a GM module !

    9. Random Generator(1)

      This patch is a very simple example of an algorithmic generator of [random music]. You can control each parameter with the sliders.

    10. Random Generator(2)

      You can control the interval between events with sensor information. This corresponds to [tempo] control in music.

    11. Random Generator(3)

      You can control the range of the random generation with sensor information. This corresponds to the [note range] of the part (instrument).

    12. Random Generator(4)

      The normal [random] object outputs an integer in [0-n], so the scale is chromatic. By setting a value to multiply this random integer by, the simplest kinds of [scale] are generated. For example, [* 1] means the chromatic scale, [* 2] the whole-tone scale, [* 3] the diminished 7th scale, [* 4] the augmented scale, and [* 5] the Sus4 scale. I like the [* 2] character, so I use it frequently in my works...

    13. Random Generator(5)

      You can control the offset of the generated note numbers with sensor information. This also corresponds to the [note range] of the part (instrument).
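
      Patches 10-13 combine naturally into one generator. Here is a minimal C sketch (illustration only, not a Max translation): tempo, note range, scale step and offset are the four parameters that sensor values would control:

      #include <stdio.h>
      #include <stdlib.h>
      #include <time.h>

      /* Random note generator: range sets how many steps, step sets the
         scale (2 = whole tone), offset transposes, tempo_ms sets timing. */
      int next_note(int range, int step, int offset)
      {
          return (rand() % range) * step + offset;
      }

      int main(void)
      {
          srand((unsigned)time(NULL));
          int tempo_ms = 250, range = 8, step = 2, offset = 60;  /* from sensors */
          for (int i = 0; i < 8; i++)
              printf("note %d (next after %d ms)\n",
                     next_note(range, step, offset), tempo_ms);
          return 0;   /* a real patch would wait tempo_ms between events */
      }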

    14. CHAOS Generator(1)

      This patch is the simplest demonstration of chaos. The simplest 1-dimensional chaos is calculated with the [logistic function]. The value [X(n)] (floating point, not integer) ranges over 0.0 - 1.0, and the chaos parameter [Myu] ranges over 3.0 - 4.0. So the calculation equation is very simple:

      X(n+1) = Myu * X(n) * { 1.0 - X(n) }

      Thus the next value [X(n+1)] falls in the same range 0.0 - 1.0, but in the chaotic zone of the parameter [Myu] the result cannot be predicted by anyone (even by God!). This patch demonstrates the calculation and the behavior of the chaos. You can [listen to] the chaos !!
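
      A minimal C sketch of this logistic-map calculation, taking Myu = 3.8 as an example and mapping each X(n) to a MIDI note:

      #include <stdio.h>

      /* Logistic map X(n+1) = Myu * X(n) * (1 - X(n)); for Myu in the
         chaotic zone the orbit never settles. Scaled to 0..127 it can
         drive MIDI notes directly. */
      int main(void)
      {
          double myu = 3.8, x = 0.5;
          for (int n = 0; n < 16; n++) {
              x = myu * x * (1.0 - x);
              printf("X(%2d) = %.4f  -> note %d\n", n + 1, x, (int)(x * 127));
          }
          return 0;
      }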

    15. CHAOS Generator(2)

      This patch controls the chaos parameter [Myu] with sensor information. However, with this kind of chaos in music, much of the audience cannot tell whether they are hearing [randomness] or [chaotic vibration]. This is a deep theme of [chaos in music], I think.

    16. Tonality Generator

      This patch is an example of the [weighted-scale] method. You can set a value for each note [C,C#,D,.....,Bb,B], and this value is the probability of that note appearing in the randomly generated scale. You can easily apply sensor information to this patch, so this is your homework.
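
      A minimal C sketch of the weighted-scale method; the weights here favor C-major notes as an example, and for the homework they would come from sensor values:

      #include <stdio.h>
      #include <stdlib.h>
      #include <time.h>

      /* Weighted random note choice: weight[i] is the relative
         probability of pitch class i (C=0 .. B=11). */
      int weighted_note(const int weight[12])
      {
          int total = 0, i;
          for (i = 0; i < 12; i++) total += weight[i];
          int r = rand() % total;
          for (i = 0; i < 12; i++) {
              r -= weight[i];
              if (r < 0) return 60 + i;   /* middle-C octave */
          }
          return 60;
      }

      int main(void)
      {
          int weight[12] = {8,0,4,0,4,4,0,8,0,4,0,4};  /* C major favored */
          srand((unsigned)time(NULL));
          for (int i = 0; i < 12; i++) printf("%d ", weighted_note(weight));
          printf("\n");
          return 0;
      }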

    17. MSP control

    18. SuperCollider control

      As you know, SuperCollider can deal with MIDI information directly, so you may feed sensor MIDI messages straight into SuperCollider. But I myself run the [sensor to music] algorithm in Max first, and one output of the [master] Max controls the [slave] SuperCollider. Below is the SuperCollider program of my work [Bio-Cosmic Storm]. I used 2 PowerBooks in this work, one running this program in SuperCollider and the other running the Max patch.

      Synth.scope({
      	var p111,p112,p113,p116,p117,p118;
      	var s, z, y, ss, zz, yy;
      	var p106,p107,p108,p109,p110,p114,p115;
      	var p119,p120,p121,p122,p123,p124,p89,p90,p91,p92,p93,p94;
      	var p95,p96,p71,p72,p73,p74,p75;
      	
      	p108 = MIDIController.kr(16,108,0.0,2.5,'linear');			// [A] Res LPF level
      		p106 = MIDIController.kr(16,106,100,4000,'exponential');	// 	LPF cutoff
      		p107 = MIDIController.kr(16,107,0.01,0.6,'linear');		// 	Q
      		p95 = MIDIController.kr(16,95,1.5,2.0,'linear');		// 	chaos param
      		p96 = MIDIController.kr(16,96,-1.0,1.0,'linear');		// 	pan
      	p71 = MIDIController.kr(16,71,0.0,2.5,'linear');			// [B] Res LPF level
      		p72 = MIDIController.kr(16,72,100,4000,'exponential');	// 	LPF cutoff
      		p73 = MIDIController.kr(16,73,0.01,0.6,'linear');		// 	Q
      		p74 = MIDIController.kr(16,74,1.5,2.0,'linear');		// 	chaos param
      		p75 = MIDIController.kr(16,75,-1.0,1.0,'linear');		// 	pan
      	p109 = MIDIController.kr(16,109,0.0,5.0,'linear');			// [A] Synth level
      		p110 = MIDIController.kr(16,110,0.01,0.5,'linear');		// 	echo depth
      		p111 = MIDIController.kr(16,111,0.05,1.5,'linear');		// 	pulse density
      		p112 = MIDIController.kr(16,112,30,400,'exponential');	// 	base freq
      		p113 = MIDIController.kr(16,113,0.0,1.0,'linear');		// 	freq random range
      	p114 = MIDIController.kr(16,114,0.0,5.0,'linear');			// [B] Synth level
      		p115 = MIDIController.kr(16,115,0.01,0.5,'linear');		// 	echo depth
      		p116 = MIDIController.kr(16,116,0.05,1.5,'linear');		// 	pulse density
      		p117 = MIDIController.kr(16,117,70,1000,'exponential');	// 	base freq
      		p118 = MIDIController.kr(16,118,0.0,1.0,'linear');		// 	freq random range
      	p119 = MIDIController.kr(16,119,0.0,7.0,'linear');			// [A] Noise Level
      		p120 = MIDIController.kr(16,120,0.1,15.0,'exponential');	// 	noise pan rate
      		p121 = MIDIController.kr(16,121,0.0,1.0,'linear');		// 	noise pan depth
      		p122 = MIDIController.kr(16,122,1.5,2.0,'linear');		// 	chaos param
      		p123 = MIDIController.kr(16,123,0.01,0.6,'linear');		// 	filter Q
      		p124 = MIDIController.kr(16,124,300,2500,'exponential');	// 	filter cutoff
      	p89 = MIDIController.kr(16,89,0.0,7.0,'linear');			// [B] Noise Level
      		p90 = MIDIController.kr(16,90,0.1,15.0,'exponential');	// 	noise pan rate
      		p91 = MIDIController.kr(16,91,0.0,1.0,'linear');		// 	noise pan depth
      		p92 = MIDIController.kr(16,92,1.5,2.0,'linear');		// 	chaos param
      		p93 = MIDIController.kr(16,93,0.01,0.6,'linear');		// 	filter Q
      		p94 = MIDIController.kr(16,94,300,2500,'exponential');	// 	filter cutoff
      	s = Mix.ar(Array.fill(10, { Resonz.ar(Dust.ar(p111,50), p112+(p113*1800.0.rand), 0.003)}) );
      		ss = Mix.ar(Array.fill(10, { Resonz.ar(Dust.ar(p116,50), p117+(p118*2500.0.rand), 0.003)}) );
      		z = DelayN.ar(s, 0.048);
      		zz = DelayN.ar(ss, 0.048);
      		y = Mix.ar(Array.fill(7,{ CombL.ar(z, 0.1, LFNoise1.kr(0.1.rand, 0.04, 0.05), 15) })); 
      		yy = Mix.ar(Array.fill(7,{ CombL.ar(zz, 0.1, LFNoise1.kr(0.1.rand, 0.04, 0.05), 15) })); 
      		4.do({ y = AllpassN.ar(y, 0.050, [0.050.rand, 0.050.rand], 1) });
      		4.do({ yy = AllpassN.ar(yy, 0.050, [0.050.rand, 0.050.rand], 1) });
      	Pan2.ar( RLPF.ar( Crackle.ar(p95,1.0), p106, p107, 1.0, 0 ), p96, p108 ) 
      +	Pan2.ar( RLPF.ar( Crackle.ar(p74,1.0), p72, p73, 1.0, 0 ), p75, p71 ) 
      +	( p109 * (s + ( p110 * y ) ) )
      +	( p114 * (ss + ( p115 * yy ) ) )
      +	Pan2.ar( Resonz.ar( Crackle.ar(p122,1.0), p124, p123, 1,0 ), SinOsc.kr(p120,0,p121,0), p119 ) 
      +	Pan2.ar( Resonz.ar( Crackle.ar(p92,1.0), p94, p93, 1,0 ), SinOsc.kr(p90,pi,p91,0), p89 ) 
      })
      

    19. Kyma control

      Just as with SuperCollider, I use Kyma as the [slave] component of my work/system, with Max as the [master]. Above is the output of the [SHO breath sensor] that I produced myself. I composed the work [Visional Legend] for Tamami Tono Ito (composer, SHO player), using the same concept for the work.


      System Block Diagram


      (slave) Kyma Patch


      (master) MAX Patch

      This is the 1998 version of the work, but you can read about the newest 2001 version of the work Here.

(5) MiniBioMuse-III

I have developed a new sensor called the "MiniBioMuse-III", and will give a demonstration. Here is the report on the "MiniBioMuse-III", but the page is written in Japanese. Sorry...