Abstract: In this essay, we describe a system developed to generate visual meaning for live performed music. Specifically, the system is calibrated to respond to North Indian classical and folk drumming traditions, using custom-designed digital musical interfaces such as the Electronic Tabla and Electronic Dholak. A drum, when struck, does not generate its sound as a record of the force applied, but as an artifact of a physical response to that strike within the artistically controlled conditions of its material state. In designing the North Indian drum controllers, we developed a physical model for digital audio synthesis to recreate the aural qualities of the drums' response under the control of the player. From there, we abstracted that concept to develop a dynamically responsive model for real-time rhythmic visual synthesis.
In creating a visual experience for North Indian classical music, we sought to create a dynamic visual accompaniment with an ambience appropriate to the patterns and complexities of North Indian drumming. The design process was shaped by the need to react to a series of signals received from the musician through the digitized musical interfaces, while giving a visual performer the ability to modify and shape those reactions over the course of the performance. Rather than using a prerendered or purely abstract visualization, we aimed to create an audiovisual composition that is both aesthetically compelling and responsive to the conditions of live performance, while providing a meaningful visual context for what the performer is playing.
In this essay, we will describe:
- Veldt: a custom-built application for visual expression of musical performance.
- Digital Indian Drums: the Electronic Tabla and Electronic Dholak, which digitize gestural information from a live performer.
- Rhythmic visualization: how our system has been used in live performances.
Veldt
Veldt is an application designed from the ground up for visual expression and performance. It receives MIDI (Musical Instrument Digital Interface) and OSC (Open SoundControl) [1] messages from digital musical interfaces and maps them to a system of reactive events in order to generate live visuals, which are rendered in real time using the OpenGL [2] graphics API. Mappings are flexible: sets of mappings may be arranged and modified during the design and rehearsal process and triggered by control events during different movements of a performance, and arbitrary text, images, video, and geometric models may be used as source material.
We display a real-time composition of these media sources over geometric elements which are generated and modified according to the parameters of the current mapping. In addition to control events received from the performer, a physical simulation environment is incorporated to allow for a variety of secondary motion effects. This visually (and contextually) rich combination of source material over physically reactive structural elements allows for a response that is dynamically generated and artistically controlled. While the parameters that govern the overall response of the system to the drum controllers may be modified through cues such as MIDI program change messages, Veldt allows an additional visual performer to control the finer aspects of the performance.
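As an illustration of this mapping architecture, the following is a minimal sketch of how incoming control messages might be routed to visual events, and how MIDI program change cues could switch between mapping sets for different movements. Every type and function name here is a hypothetical stand-in; none of it is taken from Veldt's actual implementation.

```cpp
// Hypothetical sketch of message-to-visual mapping; not Veldt's actual code.
#include <functional>
#include <map>
#include <string>
#include <vector>

struct ControlMessage {              // a normalized MIDI note or OSC message
    std::string address;             // e.g. "/etabla/ga" (illustrative address)
    float value;                     // velocity, pressure, etc., scaled to [0,1]
};

using VisualEvent = std::function<void(float)>;

class MappingSet {
public:
    void bind(const std::string& address, VisualEvent event) {
        bindings_[address].push_back(std::move(event));
    }
    void dispatch(const ControlMessage& msg) const {
        auto it = bindings_.find(msg.address);
        if (it == bindings_.end()) return;           // unmapped messages are ignored
        for (const auto& event : it->second) event(msg.value);
    }
private:
    std::map<std::string, std::vector<VisualEvent>> bindings_;
};

class Performance {
public:
    explicit Performance(std::vector<MappingSet> movements)
        : movements_(std::move(movements)) {}
    void onProgramChange(int program) {              // cue a new movement's mapping set
        if (program >= 0 && program < static_cast<int>(movements_.size()))
            active_ = program;
    }
    void onMessage(const ControlMessage& msg) {
        movements_[active_].dispatch(msg);           // route to the active mapping set
    }
private:
    std::vector<MappingSet> movements_;
    int active_ = 0;
};
```

In a scheme like this, the visual performer's finer adjustments could be expressed as additional bindings within the currently active mapping set.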
Digital Indian Drums
One challenge of this project was retrieving musical data from a performing musician quickly enough to generate a visual response in real time. There are many methods of generating MIDI or OSC messages from audio data by extracting features in the frequency and time domains, but these methods proved too slow and inaccurate for real-time applications as the number of parameters increased. Hence, we chose to build instruments that could perceive the gestures of the performers, using sensors to capture data in real time and convert it directly to MIDI or OSC messages. The instruments we will discuss are the Electronic Tabla and the Electronic Dholak.
The Electronic Tabla
Tabla are a pair of hand drums traditionally used to accompany North Indian vocal and instrumental music. The larger, silver drum is known as the Bayan; the smaller, wooden drum is known as the Dahina. We model the Electronic Tabla (ETabla) on the hand positioning and movements of the many elaborate stroke techniques. We use a set of force-sensing resistors to measure the pressure and location of finger strikes on the drums. All data is converted to MIDI signals and sent out via a MIDI out port, triggering both sound and graphical feedback (using Veldt). [3] For our visual response system, we focused on two strokes performed with the left hand, known as Ka and Ga, [4] to make it easy for the audience to grasp how the graphics were being triggered. The Ka stroke is triggered by striking sensors at the upper edge of the EBayan. The Ga stroke is triggered by striking the center of the drum, while the pressure of the left wrist on the drum determines its pitch.
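To make this mapping concrete, here is a minimal sketch of how Ka and Ga strikes might be encoded as MIDI events, with the Ga pitch modulated by wrist pressure. The note numbers, scaling, and the sendMidi routine are assumptions for illustration only; the ETabla's actual mapping is described in [3].

```cpp
// Hypothetical encoding of Ka/Ga strikes as MIDI; not the ETabla's real firmware.
#include <cstdint>
#include <cstdio>

enum class Stroke { Ka, Ga };

// Stand-in for a real MIDI output routine (illustrative only).
void sendMidi(std::uint8_t status, std::uint8_t data1, std::uint8_t data2) {
    std::printf("MIDI %02X %02X %02X\n", status, data1, data2);
}

// pressure and wristPressure are assumed to be normalized sensor readings in [0,1].
void onStrike(Stroke stroke, float pressure, float wristPressure) {
    const std::uint8_t velocity = static_cast<std::uint8_t>(pressure * 127.0f);
    if (stroke == Stroke::Ka) {
        sendMidi(0x90, 60, velocity);                       // note-on for a Ka strike
    } else {
        // Wrist pressure on the drum head bends the pitch of the Ga stroke.
        const int bend = 8192 + static_cast<int>(wristPressure * 4095.0f);
        sendMidi(0xE0, bend & 0x7F, (bend >> 7) & 0x7F);    // 14-bit pitch bend
        sendMidi(0x90, 48, velocity);                       // note-on for a Ga strike
    }
}
```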
The Electronic Dholak
The Dholak is a barrel-shaped hand drum originating in Northern India, with a membrane on either side of the barrel. [5] Two musicians play the Dholak. The first strikes the two membranes with the left and right hands. The second sits on the other side of the drum, striking the barrel with a hard object such as a spoon or stick, producing rhythmic hits similar to a woodblock. [6]
We created the Electronic Dholak (EDholak) inspired by the collaborative nature of this traditional drum. Two musicians play the EDholak: the first strikes both heads of the double-sided drum, and the second keeps time with a “Digital Spoon” while manipulating the sounds of the first player with custom-built controls on the barrel of the drum and in software. We further explored multiplayer controllers by networking four drummers playing two EDholaks at two geographically distant sites. The EDholak captures finger-striking data with piezoelectric sensors. The Digital Spoon has a piezo sensor attached to it, as well as a force-sensing resistor on the EDholak itself, to capture how hard (pressure) and where (location) the spoon strikes. All of this gestural data is converted to MIDI and used to trigger sound and graphical responses (using Veldt). [7]
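As a rough illustration of this capture path, the sketch below fuses the spoon's two sensor streams into a single strike message: the piezo supplies intensity at the moment of onset, and the force-sensing resistor supplies location sampled at the same instant. The threshold, names, and message format are illustrative assumptions, not the EDholak's actual firmware.

```cpp
// Hypothetical fusion of the Digital Spoon's piezo and FSR readings.
#include <cstdio>

struct SpoonState {
    float piezo;   // instantaneous piezo amplitude on the spoon, normalized to [0,1]
    float fsr;     // force-sensing resistor reading on the barrel (strike location), [0,1]
};

// Emits one combined message per detected spoon strike.
void pollSpoon(const SpoonState& now, float& lastPiezo) {
    const float kOnsetThreshold = 0.2f;   // illustrative onset threshold
    if (now.piezo > kOnsetThreshold && lastPiezo <= kOnsetThreshold) {
        // In the real instrument this would be packed into a MIDI or OSC message;
        // here we simply print it.
        std::printf("/edholak/spoon intensity=%.2f location=%.2f\n",
                    now.piezo, now.fsr);
    }
    lastPiezo = now.piezo;
}
```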
Rhythmic Visualization
To further integrate the digital Indian drums into the performance environment, we coupled the drum controllers with the Veldt graphics system to generate live visual projections during a set. We have developed and tested customized methods of rhythmic visualization in two performances, which we describe in further detail below.
The ETabla Concert
For our first concert performance in May 2002, we focused on a system of geometric forms and fluid motion, driven by the percussive energy of the ETabla. The visualization we developed was based on a particle model in which strikes made by the player produced patterns composed of small geometric elements, which would appear and then merge into a background of fluid motion. As the drum player made Ka and Ga strikes on the EBayan controller, new strike patterns were generated according to their type (Ka or Ga) and the intensity and pitch of those strikes as recorded by the digital sensors. Additional control messages were sent by the visual performer to modify the mapping of ETabla signals and other ambient qualities, such as lighting and color, as the performance progressed.
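A minimal sketch of this kind of strike-to-particle mapping appears below. The pattern shapes, constants, and function names are illustrative assumptions rather than Veldt's actual implementation.

```cpp
// Hypothetical strike-to-particle spawning; shapes and constants are illustrative.
#include <cmath>
#include <vector>

struct Particle {
    float x, y;     // position
    float vx, vy;   // velocity
    float life;     // seconds remaining before the particle fades into the background
};

// Spawns a pattern of particles for one strike. Stroke type selects the shape,
// intensity sets density and speed, and pitch widens or narrows the spread.
std::vector<Particle> spawnStrike(bool isKa, float intensity, float pitch) {
    std::vector<Particle> burst;
    const int count = 8 + static_cast<int>(intensity * 56.0f);   // louder = denser
    const float kTwoPi = 6.2831853f;
    for (int i = 0; i < count; ++i) {
        Particle p{};
        if (isKa) {
            // Ka: a compact ring expanding outward from the strike point.
            const float angle = kTwoPi * i / count;
            p.x = std::cos(angle);
            p.y = std::sin(angle);
            p.vx = p.x * intensity;
            p.vy = p.y * intensity;
        } else {
            // Ga: a broad plume rising from below, its spread widened by pitch.
            p.x = (i / static_cast<float>(count) - 0.5f) * (1.0f + pitch);
            p.y = -1.0f;
            p.vx = 0.0f;
            p.vy = 0.5f + intensity;
        }
        p.life = 1.0f + intensity;
        burst.push_back(p);
    }
    return burst;
}
```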
Once created, the motion of these particles was governed by a dynamic vector field defined by a distribution of fluid cells, which determined how forces were exerted in their vicinity in response to the number, distribution, and motion of elements moving in and out of their domains. The energy introduced to the system by ETabla strikes excited secondary behaviors under the control of the visual performer. By modifying the characteristics of the particles and the cells' response behaviors, we could evoke an impression of different actions or environments: e.g., fanning a flame, striking the surface of water, or blowing leaves. In each configuration, the Ka and Ga strikes played different roles in the simulation. For example, in the fire style, Ka strokes produced small rings of sparks, while Ga strokes generated broad flames from the bottom. Variation in the pressure of the player's hand on the surface of the controller, which altered the pitch of Ga strikes, would act as a bellows, stirring up the scene. Video links to this performance are located at the bottom of this article.
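The sketch below illustrates one plausible form of such a fluid-cell field: a coarse grid in which each cell accumulates momentum from the elements passing through it, pushes back on elements inside it, and slowly relaxes when left undisturbed. The grid size, coupling constants, and interface are assumptions for illustration, not Veldt's actual parameters.

```cpp
// Hypothetical fluid-cell vector field; all constants are illustrative.
#include <algorithm>
#include <array>

constexpr int kGrid = 16;   // cells per axis over a unit-square domain

struct FluidField {
    std::array<float, kGrid * kGrid> fx{};   // per-cell force vector, x component
    std::array<float, kGrid * kGrid> fy{};   // per-cell force vector, y component

    static int cellIndex(float x, float y) {
        const int cx = std::max(0, std::min(kGrid - 1, static_cast<int>(x * kGrid)));
        const int cy = std::max(0, std::min(kGrid - 1, static_cast<int>(y * kGrid)));
        return cy * kGrid + cx;
    }

    // An element moving through a cell deposits some of its momentum there.
    void excite(float x, float y, float vx, float vy) {
        const int i = cellIndex(x, y);
        fx[i] += 0.1f * vx;
        fy[i] += 0.1f * vy;
    }

    // Elements inside a cell feel its accumulated force.
    void applyTo(float x, float y, float& vx, float& vy, float dt) const {
        const int i = cellIndex(x, y);
        vx += fx[i] * dt;
        vy += fy[i] * dt;
    }

    // Cells relax over time so the field settles between strikes.
    void decay(float dt) {
        for (int i = 0; i < kGrid * kGrid; ++i) {
            fx[i] *= 1.0f - 0.5f * dt;
            fy[i] *= 1.0f - 0.5f * dt;
        }
    }
};
```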
The Gigapop Ritual
For the Gigapop performance, we received EDholak drum messages over Gigabit Ethernet from performers playing in two separate locations (Montreal, QC and Princeton, NJ) and used them to generate visuals in the theater at the Montreal site. Our intent was to create an environment in which the actions of both drummers were visible and distinguishable.
Our solution for this concert was to allow the two players to interact through a sculptural metaphor. Using a dynamic geometry representation that allowed modifications to the structures in real time, the two performers interacted through a series of operations to create a visual artifact of their drum patterns. Their strikes were dynamically mapped to a series of geometric operations that generated, deleted, deformed, or detached elements of the structure, producing unique artifacts from the rhythms they played. Different mapping rules evolved very different structures: some mappings created smaller, separate elements rather than building from a central structure, while others resulted in a solid, sheet-like form. To add a convincing physical response to the addition and alteration of new elements, we used a mass-spring model to apply and distribute forces as the structures developed. The actions of the drummers bent and distorted the figure, while secondary forces smoothed and straightened it, like a plucked string vibrating to rest.
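Below is a minimal mass-spring sketch of the kind of secondary response described here: a strike applies an impulse to one node of the figure, and spring and damping forces pull the structure back toward rest. The constants and integration scheme are illustrative assumptions, not the actual model used in the concert.

```cpp
// Hypothetical mass-spring response to drum strikes; constants are illustrative.
#include <cmath>
#include <vector>

struct Node   { float x, y, vx, vy; };
struct Spring { int a, b; float rest; };   // indices of connected nodes and rest length

// A drum strike applies an impulse to one node of the structure.
void strike(std::vector<Node>& nodes, int target, float intensity) {
    nodes[target].vy += intensity;
}

// One integration step: springs pull nodes toward their rest lengths (Hooke's law)
// and damping brings the figure back to rest, like a plucked string settling.
void step(std::vector<Node>& nodes, const std::vector<Spring>& springs, float dt) {
    const float kStiffness = 40.0f;
    const float kDamping = 0.98f;
    for (const Spring& s : springs) {
        Node& a = nodes[s.a];
        Node& b = nodes[s.b];
        const float dx = b.x - a.x, dy = b.y - a.y;
        const float len = std::sqrt(dx * dx + dy * dy) + 1e-6f;
        const float force = kStiffness * (len - s.rest);
        const float ux = dx / len, uy = dy / len;
        a.vx += force * ux * dt;  a.vy += force * uy * dt;
        b.vx -= force * ux * dt;  b.vy -= force * uy * dt;
    }
    for (Node& n : nodes) {
        n.vx *= kDamping;  n.vy *= kDamping;
        n.x += n.vx * dt;  n.y += n.vy * dt;
    }
}
```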
To represent the shared performance space, we experimented with several different forms of visual “interaction” between the signals received from the two performance spaces. To begin, we assigned the two drummers to separate visual spaces: one drum would excite particle patterns as in the ETabla performance, while the second was assigned to build structures. We then assigned both performers as builders, so that their rhythms would build upon one another. In a third configuration, one performer's strikes would build structures while those of the second would rattle or delete them. Video links to this performance are located at the bottom of this article.
We are currently developing a system to visually represent the cyclic nature of North Indian rhythm (known as Theka). We are also building a new musical controller based on the Sitar to collect musical data about Raga, adding a new dimension to our visual system to represent pitch and modal melodies.
While developing Veldt as a visualization system for the ETabla and Gigapop concerts, we began with the goal of enriching the musical experience for the audience. Our system allowed for the further abstraction of a digital musical interface into a visual context, and required that we consider our visualization process in the role of an instrument as well as an accompaniment. This incorporation of visual feedback in concert performance can give the audience another means of conceptualizing what they are hearing and an additional way of becoming actively engaged with the artist's performance. Beyond the context of concert performance, our system can also be used pedagogically, allowing a student to receive a variety of feedback and coaching while learning to play these instruments. In addition, we have seen that performers sometimes choose to shift from using the device as an audio controller to treating it as a video controller. To that effect, we might also explore this work in the context of game environments, where our controller is an interface to a complex inner system that provides audiovisual feedback.
Links
ETabla Concert Video Clips:
http://www.cs.princeton.edu/sound/research/controllers/etabla/Bupali.mpg
http://www.cs.princeton.edu/sound/research/controllers/etabla/Groovebox.mpg
http://www.cs.princeton.edu/sound/research/controllers/etabla/Harmony.mpg
Gigapop Video Clip:
http://www.cs.princeton.edu/sound/listen/gigapop.mov
Veldt:
http://veldt.lobitlandscapes.com/performances/index.html
The Electronic Tabla:
http://www.cs.princeton.edu/sound/research/controllers/etabla/
The Electronic Dholak:
http://www.cs.princeton.edu/sound/research/controllers/edholak/
The Gigapop Ritual:
http://www.cs.princeton.edu/sound/research/gigapop/
Endnotes:
[1] Wright, M. and A. Freed. "Open SoundControl: A New Protocol for Communicating with Sound Synthesizers." Proceedings of the International Computer Music Conference, 1997.
[2] Segal, M. and K. Akeley. The OpenGL Graphics System: A Specification, Version 1.0. Silicon Graphics, April 30, 1993.
[3] Kapur, A., G. Essl, P. Davidson, and P. R. Cook. "The Electronic Tabla Controller." Journal of New Music Research 32(4), 2003, pp. 351-360.
[4] Courtney, D. R. Fundamentals of Tabla: Complete Reference for Tabla, vol. 1. Houston, TX: Sur Sangeet Services, 1995, pp. 1-15.
[5] Kothari, K. S. Indian Folk Musical Instruments. New Delhi: Sangeet Natak Akademi, 1968, pp. 43-44.
[6] "Dholak." RagaNet, Batish Institute. Available at: http://www.raganet.com/RagaNet/Issues/7/dholak.html
[7] Kapur, A., G. Wang, P. Davidson, P. R. Cook, D. Trueman, T. H. Park, and M. Bhargava. "The Gigapop Ritual: A Live Networked Piece for 2 Electronic Dholaks, Digital Spoon, DigitalDoo, 6-string Electric Violin, Rbow, Sitar, Tabla, and Bass Guitar." Performed live at Princeton University and McGill University at the International Conference on New Interfaces for Musical Expression (NIME 2003), May 23, 2003. Available at: http://www.cs.princeton.edu/sound/publications/gigapop_nime_2003.pdf