\documentstyle[a4j,ascmac,12pt]{article} \pagestyle{empty} \setlength{\oddsidemargin}{24mm} \setlength{\evensidemargin}{24mm} \setlength{\topmargin}{0mm} \setlength{\headheight}{11mm} \setlength{\headsep}{0pt} \setlength{\topskip}{0pt} \setlength{\footskip}{0pt} \setlength{\textheight}{230mm} \setlength{\textwidth}{160mm}
\begin{document} \bf
\begin{flushleft} {\large {\bf   \\   \\   \\ }} \vspace{2mm} {\large {\bf Multimedia Interactive Art : System Design and Artistic Concept of Real-time Performance with Computer Graphics and Computer Music }} \end{flushleft}
\begin{flushleft} Yoichi Nagashima*\footnotetext{ {\small {\bf *Composer, Art \& Science Laboratory \\   Hamamatsu, Shizuoka 430, Japan \\}    }} \end{flushleft} \vspace{5mm}
This paper reports on some applications of human-computer interaction in experimental performances of multimedia interactive art. A human performer and computer systems perform computer graphics and computer music interactively in real time. From the technical point of view, the paper investigates some special approaches: (1) ``chaos'' information-processing techniques used in the musical part, (2) a real-time communication system for performance messages, (3) original sensors and pattern-detection techniques, and (4) a distributed system using many computers, which is convenient to develop and to arrange.
\section{Background}
There are many fields of research in ``computer music''. For example, ``musical automata'' and ``algorithmic composition'' are interesting themes for composers. Many theories, musical models, and computational models have been discussed, and many systems and software tools have been researched or developed by researchers and composers \cite{degazio}.
Today, software engineering and computer technology have grown powerful enough to provide a good environment for processing musical information in real time, so the concept of ``real-time composing'' can easily be realized in a compact system \cite{chadabe}.
\subsection{PEGASUS Project}
The PEGASUS project (Performing Environment of Granulation, Automata, Succession, and Unified-Synchronism) first produced a compact system for real-time granular synthesis \cite{nagasm2}. The second step of this project aimed at ``automata and unified synchronism'', and an experimental work was composed and performed \cite{nagasm3}. The theme of the third step was ``algorithmic composition'', researched through two approaches: (1) chaos applications for real-time composition, and (2) the Chaotic Interaction Model (CIM), a flexible and dynamic generator for music \cite{nagasm5}.
\subsection{Multimedia Performance}
The new step of this project aims at ``multimedia interactive art'' through collaboration with an artist of computer graphics. In this report, I discuss three performances, applications of multimedia interactive art, which were realized at concerts and events in Osaka, Kobe, and Kyoto during 1993--1994. These performances are ``visual arts'' for the graphic artist and ``compositions'' for the music composer, but neither artist could have produced them alone. I used three different approaches to human-computer interaction in these performances. The types of message flow are: ``human---music---graphics'', ``human---graphics---music'', and ``graphics---human---music/graphics''. The human performer is also inspired by the sounds and images interactively in real time.
\subsection{``Chaos'' in Music}
``Chaos'' is easily generated with the following simple function, called the ``logistic function'': {\boldmath \[ X_{n} = \mu \cdot X_{n-1} \cdot (1 - X_{n-1}) \]}
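As a minimal sketch (the starting value, the parameter values, and the iteration counts are illustrative only, not those used in the actual software), this behaviour can be reproduced by iterating the function directly:

```python
def logistic_orbit(mu, x0=0.3, n_warmup=500, n_keep=8):
    """Iterate X[n] = mu * X[n-1] * (1 - X[n-1]) and return late iterates."""
    x = x0
    for _ in range(n_warmup):          # discard the transient
        x = mu * x * (1.0 - x)
    orbit = []
    for _ in range(n_keep):
        x = mu * x * (1.0 - x)
        orbit.append(round(x, 4))
    return orbit

print(logistic_orbit(2.5))    # settles to one fixed point (0.6)
print(logistic_orbit(3.2))    # branched into two values (period 2)
print(logistic_orbit(3.9))    # aperiodic: the ``chaos zone''
print(logistic_orbit(3.832))  # a ``window'' inside the chaos zone (period 3)
```

Varying only {\boldmath \( \mu \)} thus moves the same deterministic formula between fixed, periodic, and chaotic behaviour.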
As {\boldmath \( \mu \)} increases in the region {\boldmath \( 3 < \mu \)}, the value of {\boldmath \( X_{n} \)} bifurcates into two branches, then four, and so on, until it enters the ``chaos zone''. The parameter {\boldmath \( \mu \)} controls the random character, so the ``chaos'' dynamics can be controlled with the value of {\boldmath \( \mu \)}. I was interested in the fact that the resulting state of chaos cannot be predicted in spite of its deterministic definition. Many critical points in the ``chaos zone'' were observed in our previous work \cite{nagasm6}. Although {\boldmath \( X_{n} \)} branches into many values in the ``chaos zone'' of {\boldmath \( \mu \)}, it normally never settles into a finite set of values. However, for special values of {\boldmath \( \mu \)} within the chaos zone, called ``windows'', {\boldmath \( X_{n} \)} does take a finite set of values. When the value is varied slightly at the edge of a ``window'', the ``chaos vibration'' shifts away for a short time; in some cases it returns to the finite state, as if pushed back by something active \cite{aihara} \cite{bidlack}. This reaction is very critical and sensitive to the value of {\boldmath \( \mu \)}.
\section{``CIS (Chaotic Interaction Show)''}
``CIS (Chaotic Interaction Show)'' was performed at the IAKTA (International Association for Knowledge Technology in the Arts) workshop and at the Kobe international modern music festival in 1993. This work was produced by Yoichi Nagashima (computer music) and Yasuto Yura (computer graphics), and performed by Manato Hanaishi (percussion). There are three points in this composition: (1) ``chaos'' applications for real-time composition, generating musical primitives such as notes, rhythm, scale, and tonality; (2) collaboration with the CG artist using an interactive MIDI connection; (3) musical conversation and improvisation of the performers (percussionist and conductor) with the CG display and the sound.
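How chaotic values become musical primitives is part of the composition itself and is not specified here; as a purely hypothetical sketch of point (1), one chaotic part could quantize each iterate of the logistic function onto a pentatonic scale as MIDI note numbers (the scale, the base pitch, and the function name are illustrative assumptions, not the actual mapping used in ``CIS''):

```python
PENTATONIC = [0, 2, 4, 7, 9]          # scale degrees (a hypothetical choice)

def chaos_notes(mu, x0, length, base=60):
    """Map logistic-map iterates onto MIDI note numbers of a pentatonic scale."""
    x, notes = x0, []
    for _ in range(length):
        x = mu * x * (1.0 - x)                    # x stays in (0, 1)
        octave, degree = divmod(int(x * 10), 5)   # 10 steps -> 2 octaves x 5 degrees
        notes.append(base + 12 * octave + PENTATONIC[degree])
    return notes

print(chaos_notes(3.9, 0.3, 8))   # one chaotic phrase of eight MIDI notes
```

Eight such parts with individual {\boldmath \( \mu \)} values would correspond to the eight chaotic parts described below.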
The system was originally constructed as follows: (1) One notebook computer runs the ``chaos generator'' software, produced by the composer as one part of the composition; it generates eight individual chaotic parts in real time, controls many specially defined MIDI control messages, and manages the information from the MIDI sensors. (2) The other notebook computer is used as a normal MIDI sequencer to send BGM parts and system control messages. (3) Original MIDI sensors produced by the composer, a joystick controller and a wireless Power Glove, send control parameters for the conductor. (4) MIDI drum pad controllers are used by the percussionist and the conductor. (5) Several MIDI sound generator modules are used, together with two originally produced granular synthesizers and two sinusoidal synthesizers. (6) One CG computer generates background graphics, not only from internal sequences but also under MIDI real-time parameter control. (7) The other CG computer generates graphics in response to the performance, triggered by the playing of the pads.
\section{``Muromachi''}
``Muromachi'' was performed at ``Kontrapunkt f\"{u}r Augen und Ohren'' in Kyoto in its first version, and in a revised version (``Muromachi2'') at the 1st JACOM (Japan Computer Music Association) concert in Kobe in 1994. This work was produced by Yoichi Nagashima (computer music) and Yasuto Yura (computer graphics), and performed by Emiko Yahata and Asako Suzuki (live graphics). It is an interactive art work with computer music and computer graphics, performed as a real-time multimedia performance. The performer on the stage draws graphics freely with a special sensor using the original CG software. The music system receives the CG messages via MIDI and generates many types of sounds. The chaotic algorithm generates iconic phrases, and the background sounds are also generated without fixed sequence data.
There is no pre-fixed information in this work. The performer may go on to the next scene, may finish anywhere, or may continue forever.
\section{``Strange Attractor''}
``Strange Attractor'' was performed at the 1st JACOM concert in Kobe. This work was produced by Yoichi Nagashima alone and performed by Sachiyo Yoshida (piano). It is live computer music with piano and computer graphics, performed as interactive multimedia art. The main theme of this piece is ``chaos'', both in the music and in the graphics. Many chaotic algorithms run in the system: the original ``chaos generator'' software and eight individual chaotic MAX patches. The original CG software also generates ``2-D chaos'' graphics in real time, controlled and triggered by the performance of the piano. The ``2-D chaos'' graphics are calculated with the ``Mira's attractor'' algorithm: {\boldmath \[ F(x) = a x + (1 - a)\frac{x^{2}}{1 + x^{2}} \; \; , \; \; x_{n} = b y_{n-1} + F(x_{n-1}) \; \; , \; \; y_{n} = -x_{n-1} + F(x_{n}) \]} The pianist plays the ``prepared piano'': picking the strings, beating them with sticks, and throwing objects into the piano. The acoustic sound of the piano is also used through effectors. Original acoustic sensors detect these messages, and computer sounds are generated via MIDI. The system generates ``piano'' sounds when the pianist plays noises, and generates ``chaos phrases'' when the pianist plays the piano normally. The pianist listens to the computer sound, checking its ``chaos character'', and changes the CG and the chaotic parameters by playing the piano.
\section{Musical Concept and System Design}
The key point in the design of these systems is the ``modularity'' of all parts. For example, four notebook computers are used to generate the musical part of the performance, connected by a special network and running individual tasks for the parts of the music.
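The 2-D chaos iteration used in ``Strange Attractor'' can be sketched as follows, assuming the standard Gumowski--Mira ordering of the recurrence (each new $x$ feeds the $y$ update); the parameter values here are illustrative only, not those of the actual CG software:

```python
def mira_points(a=-0.4, b=0.99, x0=0.1, y0=0.1, n=1000):
    """Iterate Mira's attractor and collect (x, y) points for plotting."""
    def F(x):
        return a * x + (1.0 - a) * x * x / (1.0 + x * x)
    x, y, pts = x0, y0, []
    for _ in range(n):
        x_new = b * y + F(x)       # x[n] = b * y[n-1] + F(x[n-1])
        y = -x + F(x_new)          # y[n] = -x[n-1] + F(x[n])
        x = x_new
        pts.append((x, y))
    return pts

points = mira_points()             # plot each (x, y) as a pixel to see the figure
```

In the actual performance one would expect the parameters $a$ and $b$ to be driven by the MIDI messages from the piano sensors; that coupling is an assumption of this sketch.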
This method is applied not only to the graphics part but also to the combination of music and graphics. All real-time communication is realized using MIDI, with special protocols defined for sensors, graphics, and display. The human performances are detected by special and general sensors, converted to MIDI messages, and transferred to the pattern-matching systems. All output messages to the graphics system and the music system are also MIDI messages, using the special protocol for the original graphics software and the original synthesizers.
\subsection{System Performance}
The technical points in these types of performances are to ensure real-time response and to reduce information traffic. The human performer (player) feels impatient if the response of the system is delayed or if the resolution of the control is coarse. It is important to keep in mind that real-time artistic performance is a good experiment from the viewpoint of human-computer interaction. To lighten the heavy information traffic caused by high sensor resolution and high sensing rates, special MIDI protocols are defined to compress the information, and special MIDI machines cut out unnecessary information.
\subsection{Musical Model}
From the artistic standpoint, these performances are not only human-computer interaction, treating the computer as a hardware system, but also human-model interaction, treating the computer as a virtual world in software. We may recall the compositional ideas in computer music called ``algorithmic composition'' and ``real-time composition''. This compositional method is applied to these performances with the idea of ``chaos in music''. In the original software, many agents run, each generating chaotic phrases individually. This chaotic software is part of the composition, and the messages from the human performer change the chaotic parameters and trigger and excite the dynamics of the chaotic generators.
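This interaction, performer messages perturbing a population of chaotic agents, might be sketched as follows; the class, the method names, and the parameter ranges are hypothetical, not those of the actual ``chaos generator'' software:

```python
class ChaoticAgent:
    """One agent: a logistic-map phrase generator whose mu a performer message can shift."""
    def __init__(self, mu, x0):
        self.mu, self.x = mu, x0

    def excite(self, amount):
        # a performer message (e.g. from a MIDI sensor) perturbs mu,
        # changing the chaotic dynamics; mu is kept in a safe range
        self.mu = min(3.99, max(3.0, self.mu + amount))

    def next_value(self):
        self.x = self.mu * self.x * (1.0 - self.x)
        return self.x

# eight individual agents, as in the ``chaos generator'' software
agents = [ChaoticAgent(3.6 + 0.04 * i, 0.3) for i in range(8)]
agents[0].excite(0.2)                        # one sensor message arrives
phrase = [a.next_value() for a in agents]    # one value per agent per tick
```

The deterministic cores keep running on their own; the performer never plays notes directly, but only excites the dynamics.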
In the compositional sense, the computer music system is not only a complex musical instrument but also a musical partner for the human performer. Improvisation by the human performer is important in these performances, and no performance is exactly the same as a former one. The duration of the performance is not fixed, and the performer can finish anywhere. If she or he does not wish to enter the Coda, the performance continues forever.
\section{Future Work}
These experiments in art and computer technology raised the problem of the generality of computer art and the problem of the environment for artistic creation. A new project on a universal environment and an artistic software model is now starting from the viewpoint of human-computer interaction and multimedia. [Fig.1] shows the block diagram of the concept.
\begin{thebibliography}{99} {\small {\bf
\bibitem{degazio} B.Degazio : Musical Aspects of Fractal Geometry. Proceedings of International Computer Music Conference, pp.435--442, 1986.
\bibitem{chadabe} L.Chadabe : Interactive Composing. Proceedings of International Computer Music Conference, pp.298--306, 1983.
\bibitem{nagasm2} Y.Nagashima : Real-time Control System for ``Pseudo Granulation''. Proceedings of International Computer Music Conference, pp.404--405, 1992.
\bibitem{nagasm3} Y.Nagashima : Musical Concept and System Design of ``Chaotic Grains''. IPSJ SIG Notes, Vol.93, No.32, pp.9--16, 1993.
\bibitem{nagasm5} Y.Nagashima, H.Katayose, S.Inokuchi : PEGASUS-2: Real-Time Composing Environment with Chaotic Interaction Model. Proceedings of International Computer Music Conference, pp.378--390, 1993.
\bibitem{nagasm6} Y.Nagashima : Chaotic Interaction Model for Hierarchical Structure in Music. Proceedings of 46th Annual Conference of IPSJ, vol.2, pp.319--320, 1993.
\bibitem{aihara} K.Aihara, T.Yoshikawa : Ordered and Chaotic Systems and Information Processing. Journal of JSAI, vol.8, no.2, pp.179--183, 1993.
\bibitem{bidlack} R.Bidlack : Chaotic Systems as Simple (but Complex) Compositional Algorithms. Computer Music Journal, vol.16, no.3, pp.33--47, 1993.
\bibitem{nagasm7} Y.Nagashima, H.Katayose, S.Inokuchi : Chaotic Interaction Model for Compositional Structure. Proceedings of International Workshop on Knowledge Technology in the Arts, pp.19--28, 1993. } }
\end{thebibliography}
\vspace{150mm}
\begin{center} Figure 1 : Block Diagram of the New System \end{center}
\end{document}