
Data structures

In this section we define the data structures that form the basis of our framework using the Java programming language. The data structures are grouped into three sections: Parameters, VirtualInstrument, and Composition.


Parameters

We will consider three types of Parameters: Numbers, Nuplets, and ControlSignals. We do not define the internal structure of the Parameters; instead, we define their interface, i.e. how their data is accessed.

A Number represents a numerical value. Its interface defines two methods, one of which returns its value as an integer, the other as a floating point number.

 public interface Number
 { public int intValue();
   public float floatValue();
 }
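As an illustration, a minimal implementation of this interface might simply wrap a single floating point value. This is only a sketch; the class name FloatNumber is ours and not part of the framework.

```java
// Hypothetical implementation of the Number interface; the name
// FloatNumber is our own, not part of the framework.
interface Number {
    int intValue();
    float floatValue();
}

class FloatNumber implements Number {
    private final float mValue;

    FloatNumber(float value) { mValue = value; }

    // Truncates towards zero, as a Java cast does.
    public int intValue() { return (int) mValue; }
    public float floatValue() { return mValue; }
}
```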


A Nuplet is an abstraction of an array of Numbers. Its interface is defined as follows:

 public interface Nuplet
 { public float value(int i);
   public int dimension();
 }

The method dimension returns the length of the array; the method value returns the element at index i. This structure can be used, for example, to store the waveform of an oscillator.
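For example, a Nuplet backed by a float array could hold one period of a sine wave for a wavetable oscillator. The class name ArrayNuplet and the factory method are our assumptions, given only for illustration.

```java
// Hypothetical Nuplet implementation backed by a float array.
interface Nuplet {
    float value(int i);
    int dimension();
}

class ArrayNuplet implements Nuplet {
    private final float[] mData;

    ArrayNuplet(float[] data) { mData = data; }

    public float value(int i) { return mData[i]; }
    public int dimension() { return mData.length; }

    // Convenience factory: a wavetable holding one period of a sine wave.
    static ArrayNuplet sineTable(int length) {
        float[] table = new float[length];
        for (int i = 0; i < length; i++)
            table[i] = (float) Math.sin(2.0 * Math.PI * i / length);
        return new ArrayNuplet(table);
    }
}
```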

The ControlSignal interface defines two methods:

 public interface ControlSignal
 { public Nuplet value(float time);
   public int dimension();
 }

ControlSignals are data structures that have a time dimension. They provide the necessary input values during the synthesis. The value of a ControlSignal, at any given time, is a Nuplet. Because the Nuplet has a fixed dimension greater than or equal to zero, we can use ControlSignals as multi-dimensional control structures; the Nuplet groups the values of every dimension into one object. Some examples of ControlSignals are the ConstantSignal and the BreakPointFunction.
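As a sketch of the simplest case, a ConstantSignal could return the same Nuplet at every time. The helper class SimpleNuplet is our own addition, used here only to make the example self-contained.

```java
interface Nuplet {
    float value(int i);
    int dimension();
}

interface ControlSignal {
    Nuplet value(float time);
    int dimension();
}

// Helper for the example: a Nuplet wrapping a fixed float array.
class SimpleNuplet implements Nuplet {
    private final float[] mData;
    SimpleNuplet(float[] data) { mData = data; }
    public float value(int i) { return mData[i]; }
    public int dimension() { return mData.length; }
}

// A ControlSignal that is constant over time: every call to value()
// returns the same Nuplet, whatever the time argument.
class ConstantSignal implements ControlSignal {
    private final Nuplet mConstant;
    ConstantSignal(Nuplet constant) { mConstant = constant; }
    public Nuplet value(float time) { return mConstant; }
    public int dimension() { return mConstant.dimension(); }
}
```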


VirtualInstrument

In this section we discuss a number of data structures that help us describe a synthesis technique. The definitions of Module, Connection, and Patch will lead us to the discussion of the VirtualInstrument.


A Module is an object that has a type, a number of typed inputs, a number of typed outputs, and a value.

 public class Module
 { String mValue;
   byte mModuleType;
   byte mInputType[];
   byte mOutputType[];
 }

A Module abstracts a function which takes a number of inputs, performs some calculation and outputs a number of results.

Connections will be used to link Modules and are directed from an output to an input.

 public class Connection
 { Module mInModule;
   int mInputNum;
   Module mOutModule;
   int mOutputNum;
 }

A Patch consists of a set of Modules and a set of Connections:

 public class Patch
 { Vector mModules;
   Vector mConnections;
 }


We want to present a formal description of a synthesis technique. We call such a description a VirtualInstrument [Bat95]. We assume that synthesis techniques can be established using unit building blocks [Mat63, Ris93]. This leads us to the definition of a VirtualInstrument as a Patch of Modules and Connections. Connections link Modules together to form a networked instrument.
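To illustrate, a very small VirtualInstrument - a sine oscillator whose frequency comes from a Parameter - could be assembled as follows. The paper defines only the fields; the constructors, the type codes, and the module names below are our assumptions.

```java
import java.util.Vector;

class Module {
    String mValue;
    byte mModuleType;
    byte mInputType[];
    byte mOutputType[];

    // Constructor added for illustration; not part of the paper's definition.
    Module(byte moduleType, String value, byte[] inputs, byte[] outputs) {
        mModuleType = moduleType;
        mValue = value;
        mInputType = inputs;
        mOutputType = outputs;
    }
}

class Connection {
    Module mInModule;
    int mInputNum;
    Module mOutModule;
    int mOutputNum;

    // Directed from an output of one Module to an input of another.
    Connection(Module out, int outputNum, Module in, int inputNum) {
        mOutModule = out;
        mOutputNum = outputNum;
        mInModule = in;
        mInputNum = inputNum;
    }
}

class Patch {
    Vector mModules = new Vector();
    Vector mConnections = new Vector();
}

class Example {
    // Hypothetical type codes (our assumption).
    static final byte PARAM = 0, MOD = 1;     // module types
    static final byte CONTROL = 0, AUDIO = 1; // port types

    static Patch sineInstrument() {
        // A param-Module whose value indexes the frequency Parameter.
        Module freq = new Module(PARAM, "0", new byte[] {}, new byte[] { CONTROL });
        // A mod-Module mapped onto the kernel's sine oscillator function.
        Module osc = new Module(MOD, "sine-osc", new byte[] { CONTROL }, new byte[] { AUDIO });

        Patch patch = new Patch();
        patch.mModules.add(freq);
        patch.mModules.add(osc);
        // Connect output 0 of the param-Module to input 0 of the oscillator.
        patch.mConnections.add(new Connection(freq, 0, osc, 0));
        return patch;
    }
}
```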

A VirtualInstrument only describes a synthesis technique. The actual synthesis will be performed by a synthesis kernel. How the VirtualInstrument and its Modules will be converted to a set of signal processing functions within the kernel is briefly discussed in section 3.2.3.

We currently accept two basic types of Modules: param-Modules and mod-Modules. The mod-Module represents a primitive building block of the synthesis technique. When the VirtualInstrument is implemented by the synthesis kernel, the mod-Module is mapped onto one of the signal processing functions of the kernel. The value of the mod-Module indicates the name of this function.

The param-Module is used to hand over data to the synthesis kernel, both for the initialization and for the control of the synthesis. Its value is an index in an array of Parameters (see section 2.3.2).

A number of remarks are in order. First, a VirtualInstrument must satisfy a number of constraints. For example, some synthesis environments only manage acyclic structures or tree structures. We will thus need additional functions to test these conditions.
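Such a test can be implemented as a depth-first search over the Connection graph. The sketch below (the class PatchChecker and its methods are our own names) reports whether a Patch is acyclic; a Connection is treated as a directed edge from its output Module to its input Module.

```java
import java.util.*;

class Module {
    String mValue;
    Module(String value) { mValue = value; }
}

class Connection {
    Module mInModule;  int mInputNum;
    Module mOutModule; int mOutputNum;
    Connection(Module out, int outputNum, Module in, int inputNum) {
        mOutModule = out; mOutputNum = outputNum;
        mInModule = in;  mInputNum = inputNum;
    }
}

class Patch {
    Vector mModules = new Vector();
    Vector mConnections = new Vector();
}

class PatchChecker {
    // True if the directed graph formed by the Connections has no cycle.
    static boolean isAcyclic(Patch patch) {
        Map<Module, List<Module>> adj = new HashMap<>();
        for (Object o : patch.mConnections) {
            Connection c = (Connection) o;
            adj.computeIfAbsent(c.mOutModule, k -> new ArrayList<>()).add(c.mInModule);
        }
        Set<Module> done = new HashSet<>(), onPath = new HashSet<>();
        for (Object o : patch.mModules)
            if (hasCycle((Module) o, adj, done, onPath))
                return false;
        return true;
    }

    private static boolean hasCycle(Module m, Map<Module, List<Module>> adj,
                                    Set<Module> done, Set<Module> onPath) {
        if (onPath.contains(m)) return true;   // back edge: cycle found
        if (done.contains(m)) return false;    // already fully explored
        onPath.add(m);
        for (Module next : adj.getOrDefault(m, Collections.<Module>emptyList()))
            if (hasCycle(next, adj, done, onPath)) return true;
        onPath.remove(m);
        done.add(m);
        return false;
    }
}
```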

Second, the current definition of VirtualInstrument is well suited to describe synthesis techniques that model the analog studio (signal models) [DP93]. It can also represent physical models that use a waveguide description. Physical models that use a modal description, however, are not well represented in this formalism. Indeed, in these models a two-way interaction between the Modules is necessary, and the connections are expressed in terms of accesses at certain locations on the Module. Describing this interaction is not possible without complicating the current definition, and we leave the issue as is for now.

A third remark concerns the multidimensional ControlSignals. This concept is not new [EGA94], but we would like to underline its usefulness again. Suppose we want a VirtualInstrument that performs additive synthesis. Using the unit generators currently found in most synthesis kernels, we can construct a VirtualInstrument that synthesizes one component using a sine wave oscillator controlled in frequency and in amplitude. If we now want additive synthesis with two components, we have to use two sine wave oscillators, and thus alter the initial description of the VirtualInstrument. What we need is a description of additive synthesis that is independent of the number of components. For this problem we propose the use of multidimensional ControlSignals: if the ControlSignals input to the sine wave oscillator have a dimension larger than one, we sum the resulting signals.
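A sketch of this idea, with names of our own choosing: a single oscillator that sums one sine component per dimension of its frequency and amplitude ControlSignals, so the same description serves any number of components.

```java
interface Nuplet {
    float value(int i);
    int dimension();
}

interface ControlSignal {
    Nuplet value(float time);
    int dimension();
}

// Helpers for the example: a constant Nuplet and a constant ControlSignal.
class ConstNuplet implements Nuplet {
    private final float[] mData;
    ConstNuplet(float[] data) { mData = data; }
    public float value(int i) { return mData[i]; }
    public int dimension() { return mData.length; }
}

class ConstSignal implements ControlSignal {
    private final Nuplet mNuplet;
    ConstSignal(Nuplet nuplet) { mNuplet = nuplet; }
    public Nuplet value(float time) { return mNuplet; }
    public int dimension() { return mNuplet.dimension(); }
}

// Hypothetical sine oscillator accepting multidimensional control inputs:
// one sine component per dimension, summed into a single output sample.
class AdditiveSineOsc {
    static float sample(ControlSignal freq, ControlSignal amp, float time) {
        Nuplet f = freq.value(time), a = amp.value(time);
        float sum = 0f;
        for (int i = 0; i < f.dimension(); i++)
            sum += a.value(i) * (float) Math.sin(2.0 * Math.PI * f.value(i) * time);
        return sum;
    }
}
```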

Lastly, the value of a mod-Module currently depends on the synthesis kernel that will realize the VirtualInstrument. Most kernels, however, offer similar kinds of signal processing functions. For example, modules for adding two sound signals can be found in every kernel. If we can determine the functions common to most kernels, and associate one value with every group of similar functions, we can construct VirtualInstruments independent of the underlying kernel. This project, which we have not yet started, will be of importance when we create our tools for the control of sound synthesis.


Composition

While Parameters and VirtualInstruments stand closer to the synthesis, we now arrive at the definition of the structures that stand closer to the composition.


A SoundObject is composed of a start time, a duration, and a reference to a Process.

 public class SoundObject
 { float mStart;
   float mDuration;
   Object mProcess;
 }

A SoundObject is one element of the composition. It can represent a single note as well as a complex sound that evolves in time - in essence, ``a single sound which might last several minutes with an inner life, and ... [has] the same function in the composition as an individual note once had'' [Cot74]. SoundObjects can be seen as ``cells, with a birth, life and death'' [Gri87].

The Process is a structure that determines the content of the SoundObject. It is the life, evolution, or force of a SoundObject. The Process can be one of two different kinds: it can be a SoundProcess or a Texture.


 public class SoundProcess
 { VirtualInstrument mVirtualInstrument;
   Parameter mParameter[];
 }

A SoundProcess is an object which represents a synthesis process, and which contains the recipe and the ingredients for this synthesis. A SoundProcess is the combination of a VirtualInstrument and an array of Parameters. The VirtualInstrument describes the synthesis algorithm. The Parameters serve as control structures for the synthesis, or as initialization values during the creation of the VirtualInstrument.


 public class Texture
 { SoundObject mSoundObject[];
 }

A Texture is a composed Process and contains a number of SoundObjects. The definitions of Textures and SoundObjects refer to each other: a SoundObject can refer to a Texture that itself can refer to a number of SoundObjects. However, we do not allow cyclic paths: a Texture cannot contain a SoundObject referring to the initial Texture. The composition can thus be organized in a tree structure [RC84].
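Because cyclic paths are disallowed, a simple recursion over the tree is guaranteed to terminate. As an illustration (the walker class and constructors are our own, not part of the framework), the sketch below counts the elementary SoundObjects, i.e. those whose Process is not a Texture.

```java
class SoundObject {
    float mStart;
    float mDuration;
    Object mProcess;

    // Constructor added for illustration.
    SoundObject(float start, float duration, Object process) {
        mStart = start;
        mDuration = duration;
        mProcess = process;
    }
}

class Texture {
    SoundObject mSoundObject[];
    Texture(SoundObject[] children) { mSoundObject = children; }
}

class CompositionWalker {
    // Counts the leaf SoundObjects of the composition tree. The recursion
    // terminates because Textures may not refer back to an enclosing Texture.
    static int countLeaves(SoundObject obj) {
        if (obj.mProcess instanceof Texture) {
            int n = 0;
            for (SoundObject child : ((Texture) obj.mProcess).mSoundObject)
                n += countLeaves(child);
            return n;
        }
        return 1; // a SoundProcess (or no process): one elementary object
    }
}
```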


Peter Hanappe
jeu 5 jun 13:03:16 MET DST 1997