
COMPUTER MUSIC COMPOSITION AND THE DIALECTICS OF INTERACTION

BY

MICHAEL HAMMAN

B.Mus., New England Conservatory of Music, 1983
M.Mus., University of Maryland at College Park, 1987

THESIS

Submitted in partial fulfillment of the requirements for the degree of Doctor of Musical Arts

in the Graduate College of the University of Illinois at Urbana-Champaign, 1997

Urbana, Illinois


© Copyright by Michael Hamman, 1997


to Paula


Acknowledgments

This thesis owes a great deal to conversations and collaborations I have had

during the last seven years while attending the University of Illinois and while living in

Urbana. To Sever Tipei, I give my thanks for the many conversations we have had

regarding my project, for his moral support throughout the development of this thesis, and

for his direction in my investigation of the possibilities for the composibility of

human/computer interaction. To William Brooks, for his probing questions which helped

me to focus many of my thoughts and articulations in developing this thesis. I thank Jim

Beauchamp for his patient instruction and direction regarding digital signal processing,

sound synthesis, and sound analysis, and for use of the Computer Music Project at the

University of Illinois at Urbana-Champaign. I thank Ricardo Uribe, who provides, in the Advanced Digital Systems Laboratory, an empowering environment for composers

and engineers to collaborate and teach one another, and for the many conversations and

clarifications regarding cybernetics. Thank you Allen Hance, at the Department of

Philosophy at the University of Illinois, for providing helpful suggestions on my

treatment of dialectics.

I am grateful to Paula Pribula-Hamman for her remarkable editing and for her

other helpful contributions to the presentation of the text of this thesis.

To Adam Cain, Anthony Carrico, Camille Goudeseune, and Charles Lipp, with

whom collaboration greatly motivated the direction, and assisted in the realization, of

fruitful compositional and technological projects undertaken for this thesis.

My special thanks go to Herbert Brün for his utterly unwavering commitment as a

teacher and as an ever-present respondent to the many projects that I have proposed over

the last seven years. I also owe a great deal to the members of the Seminar in

Experimental Composition for providing an environment in which talk is not cheap.

Thank you Robin Bargar, Adam Cain, Arun Chandra, Insook Choi, Kirk Corey,

Tom DeLio, Agostino di Scipio, Camille Goudeseune, Sal Martirano, Frank Mauceri,

Mark Sullivan, and Roseane Yampolschi for the friendships, conversations, and

discriminations which continue to open up the space for my work.

Finally, thank you Paula Pribula-Hamman for your encouragement, patience,

insightful criticism, partnership, and unconditional love. Your capacity for spontaneous

joy, love, and integrity reminds me of the possibility of my own humanity.


Table of Contents

Introduction .......... 1

PART I: Framing the Context of Interaction and Cognition .......... 7
1. Experiential Dimension of the Dialectic .......... 8
2. Autopoiesis and the Biology of Interaction .......... 17
3. The Dialectical Hermeneutics of Interaction .......... 25

PART II: The Musical Task Environment and the Problematisation of Interaction .......... 36
4. The Musical Task Environment .......... 40
5. From Programmed Structure to the Programming of Interaction .......... 51
6. Computers, Composition, and the Hermeneutics of Interaction .......... 68

PART III: Three Case Studies .......... 71
7. Chaos and Granular Synthesis .......... 74
8. ResNET: Sound Synthesis Through Dynamically Configurable Feedback/Delay Networks .......... 90
9. Orpheus: Interactive Design and Composition .......... 99
10. Conclusion .......... 131

Bibliography .......... 133
Vita .......... 144


Introduction

There are organisms that generate representations of their interactions by specifying entities with which they interact as if these entities belonged to an independent domain, while as representations they only map their own interactions.1

In this study, I give an account of the phenomenological dimension of human/computer

interaction in music composition. My central thesis focuses on the following question:

how does the use of computers affect the ways in which a composer might imagine and

realize musical thoughts and ideas? In addressing this question, I consider issues related

to computer music research from the point of view of human/computer interaction. By

this, I do not mean that I am concerned with GUIs (graphical user interfaces), or other

such matters typically of concern within human factors engineering or the computer

sciences. Rather, I am concerned with more deeply embedded epistemological and

hermeneutic questions regarding interaction: How are musical and auditory objects,

processes, concepts, and constructs represented within an interaction between a human

and a computer program? How are the thoughts and actions of a composer or other audio

researcher constrained by those representations? How does one's notion of interaction, in

the more general sense, affect one's expectations and presuppositions regarding

human/computer interaction? How does a designer's knowledge of historical musical

task environments influence the way in which s/he designs and creates systems for music

composition and sound design?

In this study, I focus on the dialectical dimension of human/computer interaction.

With this focus, I propose the following imperative: that the computer assist in the

articulation of a domain of cognitive activity in which the object of activity include the

very processes by which that activity is carried out; that the result of a process contain a

trace of that very process; that, in a word, the raison d'être for human/computer

1 Maturana (1970), p. 9.


interaction is as much concerned with the formulation of domains of interaction as it is

with the results obtained through particular interactions.

In order to be able to articulate this imperative within the field of music

composition, I first differentiate composition as production and composition as research,

focusing on the latter as the context of the current study. Composition as production

emphasizes the primacy of the 'result', treating process as the means by which a desirable

result-the object of that process-is effected. Any tools or concepts which complicate

the productive process are avoided in favor of those that will readily assist in the

realization of production. Those problems which do arise do so only within the context

of the work itself; they are not understood in relation to the conceptual or physical tools by which that work is realized. As such, problems tend to be well-understood and well-articulated cultural manifestations, having a history within a particular economy of

aesthetic and technological codes. Meanwhile, the resulting artifact becomes an object

for exchange within that very economy.2 Functioning as an object, the artifact tends to reinforce the cultural and economic edifice under whose rubric its generative

procedure is manifested. Toward this end, the tools and concepts which the composer

uses tend to have a stable state: they exhibit predictable behaviors, providing only those

kinds of surprises which can be anticipated (though not necessarily foreseen) in advance.

Composition as research, by contrast, emphasizes the primacy of the process

itself, treating process as its own object. This is not to say necessarily that the process

ends up with no object, in the usual sense of the term; it's just that in the case of

composition research, such an object retains-in fact projects-traces of the very

processes by which it is generated. The inclination is, therefore, less one of attempting to

effect a desired end result, and more one of attempting to realize the concretization of an

otherwise abstract determination. Toward this end, composition as research tends to

direct itself toward the formulation of problems (whose cultural manifestation is as yet

undefined), rather than the solution of problems (which already have at least some

meaningful cultural definition). As research, solutions to problems are treated as

provisional, experimental; they occur as situating hypotheses-hypotheses that generate

the very reality against which they seek to test themselves. Such hypotheses are of a

critical nature to the degree to which they seek to delay the moment of their own

consequence.

2 Bourdieu (1993).


What differentiates research from production centers largely around issues

concerning the composer's use of tools, both conceptual and physical, and thus hinges on

how s/he conceptualizes the task environment. A task environment defines the set of

tools, as well as the conventions for their use, which constitute how one works within a

particular problem domain. A problem domain constitutes a network of cultural and

epistemological codes which define a domain of competence and performance

appropriate to the fulfillment of tasks and goals. These tasks and goals obtain particular

meaning as a consequence of their agreement with and reflection of those codes.

Questions regarding the task environment are frequently, in hidden ways, deeply embedded in the design of computer systems. But it is only relatively

recently that human/computer interaction (HCI) has been considered as a special field

within computer science. With this development has come a variety of methodologies

regarding the design of interactive computer systems. Most research in HCI regards a

successful interactive computer system as one which sublimates the tasks, goals, and

intentions of its user over her/his behavior and over the behavior of the computer. The

computer functions in order to allow a user to perform certain tasks and accomplish

certain goals. By the same token, the user functions in order to understand how to get the

computer to help her/him to perform those tasks and accomplish those goals. The user

therefore should not have to concern her/himself with the operating requirements of the

computer; nor should the performance of the computer be hampered by inadequate or

incorrect behavior on the part of the user. Rather, the interaction is one in which the

computer functions much as a well-designed door handle or automobile control panel

does, matching its physical (or at least virtually physical) composition in such a way as

to map itself to the problem domain in which its user operates, while leaving behind as

little of its own residue as possible.

In this paper, I question this blanket transference of design science to

computer/human interaction, particularly as it applies to artistic endeavor, on the

grounds that, as a symbolic processor, the computer is capable of a much different kind of

partnership with a human than that given by a doorknob or an airplane cockpit. While

such an approach to design science is appropriate for things like hammers, doorknobs,

and word processors, it is not always appropriate for computer systems in which humans

engage in artistic research. I make this argument in defense of human agency since, by

the very same notion that the computer system disappears in order to allow the user to

simply and uninterruptedly accomplish particular tasks, so too does human agency.

In the following analysis of interaction in computer music, I try to articulate ways

in which human agency itself is foregrounded. Through the foregrounding of human


agency, interaction arises from the particularity of human engagement-from the

particular labors in which the human participant engages-rather than from appropriated

historical methodologies through which musical materials and procedures are created. As

a result of such a mode of interaction, a human has the freedom to learn that which s/he

does not already know and, thereby, to expand her/his notion of both the object of

interaction and the nature of interaction itself. Through this development, a composer

might come to understand composition as not merely the well-rehearsed process by which

this or that artifact comes into being, but as the development of the means by which such

a "coming into being" might itself be brought into being.

Given this, the historically determined criteria which underlie many interactive

computer systems can be augmented, or even supplanted, by new, more variable criteria.

One such criterion has to do with the idea of the composer's intention. In the context of

the current discussion, "intention" is understood not simply as that which brings about a

particular result but rather as that under whose aegis otherwise undirected determinations

and decisions are made. The notion that a computer system should faithfully serve the

primacy of an a priori intention is of fundamental significance in standard thinking

regarding human/computer interaction. In its dialectical unfolding, however, intention is

not something already given and fixed. Rather, it is viewed as immanent, its

manifestation emerging solely in the activity of its working itself out; emergent, not

simply as accidental progress, but as a process in which the self is actualized through its

projection as other. Human thought becomes, as such, an agency that progresses in its

self-determination through its encounters with objects of its own creation. The

conceptual and physical tools with which such encounters are realized are defined not by

historical criteria which serve to yield them as understandable; rather, they are defined by

criteria by which they are intentionally rendered as problematic. This problematisation of

working tools helps to foreground the hermeneutic horizon across which their use casts

human actions, goals, and desires-as such, enabling conscious self-reflection vis-a-vis

those actions, goals, and desires. In such an environment, "object" and "subject" arise

together in the determination of the one by the other.

This argument is clarified and elaborated over the course of this study. The goal

of this study is to elaborate a dialectical framework with respect to which one might

understand the procedural dimension of music composition with computers. This

elaboration has three parts. In the first part, I present a theory of interaction. In this part,

I first present certain ideas related to dialectical philosophy, particularly those advanced

by T.-W. Adorno, H. Marcuse, and to a certain extent G. W. Hegel. I make this

presentation in order to foreground the interrelation between "subject" and "object."


Provisionally, we can understand the "subject" as our own "consciousness" of things in

the world and the "object" as referring to those things. In our common metaphysical

interpretation, we tend to treat the two, subject and object, as separable parts of

experience. Given this separability, we understand cognition as that process by which

things in the world (the object) are "perceived" by a consciousness (the subject). In this

study, I elaborate a dialectical view of the relation between subject and object. Of

particular interest in the current study is the notion that the particularity of the object is

contingent upon the labor which the subject exerts in its comprehension of it, while, by

precisely the same development, the subject is immanent within the processes by which it

projects itself toward an object that it sees as other to itself.

Having arrived at this provisional understanding of the interrelationship between

subject and object, I turn to a consideration of the cognitive dimension of interaction.

Toward this end, I consider the theoretical work of Humberto Maturana, particularly his

development of the notion of autopoiesis. According to Maturana, living systems are

closed systems. As such, the nature of an interaction is not an input/output mechanism

between organism and environment; rather it is determined by the ways in which an

organism changes its structure in order to maintain its organization in an environment and

its perturbations. Nevertheless, it is the particularity of those perturbations which

constitute the domain of interactions which an organism generates. Interaction, as such,

becomes a form of self-reflection in that internally generated behaviors result in the

production of entities which appear for the organism as external entities, separate from

itself. This notion has interesting ramifications when we consider how human beings

interact with computers, since computers, being symbolic processors, in many ways

project and extend human beings' capacity for cognitive self-reflection. Just as many tools

used by human beings are extensions of their body, so computers are, in many respects,

extensions of a human's capacity for self-reflection.

In Part II, I develop an analysis of the interactive aspects of music composition in

general and computer music composition specifically, focusing on those aspects of

interaction which emphasize its dialectical and hermeneutic dimension. Toward this end,

I consider compositional activity as a means by which domains of interaction are

imagined and realized, occasioning moments of cognitive self-reflection, in the sense of

the term introduced above. In order to do this, an otherwise familiar task environment is

problematised; that is, some or all of its elements are configured such that a human is

prompted to notice her/his own presence as something foreign, something which is other

to itself, in the dialectical sense of the term. This principle is developed through a

presentation of the compositional procedures of John Cage, Gottfried-Michael Koenig,


and Iannis Xenakis. The focus of such compositional activity was on constructing task

environments which reframe the manner in which musical problems are posed and

solved. Computer technology has come to be understood as an important tool in such

compositional activity. With the computer, the composer is able to begin to model the

very processes by which musical problems might be formulated. Two distinct manners of

approach can be differentiated: one which tries to find ways to formulate new procedural

problems, and another which tries to find ways to re-present, albeit in an unprecedented

manner, already established procedural approaches.

In the third and final part of this study, I present three "case studies" which act as a

'concretizing' accompaniment to the principles unfolded within the first two parts. These

case studies are music composition software systems which have been developed by the

author. Each system is regarded not so much as a means for production of musical

works, but as an object of research itself. Each system marks a point on a trajectory

beginning with the possibility of the composibility of entire forms through the

specification of a data and process model, to the possibility of the composibility of the

interfaces by which those data and process models might themselves be composed.


PART I:

Framing the Context of Interaction and Cognition


1. Experiential Dimension of the Dialectic

1.1 Deductive Reasoning and the Cult of Understanding

Dialectics is a mode of thinking which radicalizes the relationship between a

concept and its object. It is, in essence, a mode of thinking that attempts "to break the

power of facts over the word"3 in order to emphasize the subject/object interrelations with

respect to which things in the world arise.

In attempting to make an explication of dialectics, perhaps the place to begin is to

say what dialectics is not. Dialectical thinking can be contrasted with deductive

reasoning with which it is frequently and erroneously assumed to be equivalent.

Deductive reasoning starts with an absolute and clear delineation and then erects new

delineations, from each of which new delineations are derived.4 As a particular form of

abstraction, deductive reasoning can only yield its particular truths when the things which

it treats are understood to be 'units.' Those units are organized according to an order

which is external to any interior semblance of those units and which is indifferent to any

inter-relations which may be obtained between such units.5 For instance, the truth of the

proposition '2 + 7 = 9' is a consequence of the definitions and rules of the number

system and is thus not inherent in any of its constituent elements. The proposition arises

as a content belonging to those definitions and rules; its elements are, in effect, 'place-holders' which manifest the true development of the proposition's content: an operation

(named 'addition'), plus a notion of identity which is understood as the demarcation of a

process which a particular operation (such as addition) engenders. Hegel terms the kind

of thinking which characterizes such a deductive system "Understanding", a term which,

in the following discussion, I will adopt.

For Hegel, Understanding operates not only in mathematics; it operates wherever

ideas and procedures are treated as separate from one another and where they are

hermetically unaffected by other ideas and procedures. Understanding sees the world as

comprised of "a multitude of determinate things, each of which is demarcated from the

other."6 Each such "thing" is identical only with itself and is opposed to all other things.

Such fixity of identity introduces the logic of their oppositions.

The single-most primary opposition or separation, as constituted within

Understanding, is that of the content of cognition and its form. According to this

3 Marcuse (1960), p. x.
4 Findlay (1958), p. 56.
5 Ibid.
6 Marcuse, op. cit., p. 44.


conception, the content of cognition is constituted by the entities that populate the

universe while the form of cognition is the mode of consciousness and thinking which

apprehends and comprehends these entities. This arrangement presupposes that "the

material of knowing is present on its own account as a ready-made world apart from

thought, [and] that thinking on its own is empty and comes as an external form to the said

material. "7 These two separate aspects of experience are understood as constituents

under cognition: "cognition is compounded from them in a mechanical or at best

chemical fashion."8 As such, they have the following properties:

the object is regarded as something complete and finished on its own account, something which can entirely dispense with thought for its actuality, while thought on the other hand is regarded as defective because it has to complete itself with a material and moreover, as a pliable indeterminate form, has to adapt itself to its material.9

Under this rubric, truth arises from the agreement of thought and object. The object is

primary while thought is secondary. Thought is supposed to mold itself in

accommodation to the object. In this regard, thought is in a sense considered to be

'defective' since it requires material to complete itself. The extent to which thought can

complete its identity with the object determines the degree to which it overcomes its

defective nature.10

In so far as thought remains contingent upon material in such an absolute and

determinate manner, it has no power for its own development. "[I]n its reception and

formation of material, [thinking] does not go outside itself; ... its reception of the material

and the conforming of itself to it remains a modification of its own self, it does not result

in thought becoming the other of itself." 11 While thinking can modify itself in its effort to

come to grips with its experience of the materiality of an object, it cannot go beyond itself

and outward toward the object; it cannot possibly breach that distance by which it and

object are separated. Such a movement on the part of reason effects its own "self-renunciation", ceding as it does the primacy of experience to the appearance of

phenomena as they occur for the senses-something to which reason may only respond

but never affect.12

7 Hegel (1969), p. 44.
8 Ibid.
9 Ibid.
10 Ibid., p. 45.
11 Ibid. (my emphasis).
12 Ibid., p. 46.


1.2 Appearance, Essence, and the Immanence of Subjectivity

Dialectical thinking attempts to reconnect the determinations which have in

Understanding separated 'things-in-the-world' from thought. Such an attempt, however,

introduces a conflict which Understanding cannot accept in that the connecting activity of

dialectical reflection constitutes a negation of the very determinations that define

Understanding's notion of reality. Through the sublation13 of that aspect of those

determinations which enforce the separations they determine (i.e. the separation of things-in-the-world and thought), dialectical thinking embraces the moment of contradiction as

the beginning of reason. 14 In this way, dialectical thinking comprehends the world

through a process of negation.

This formulation requires some development, which I now give. First, dialectic

thinking differentiates the appearance of things and their essence. This differentiation is

to be found in Plato (his differentiation between the Forms and their manifestation in

sensual experience) and in Kant (his differentiation of noumena and phenomena).

According to this notion, the thing's appearance represents the current state of that thing.

Its essence represents the thing as it is realized according to its totality, according to its

immanent potential. Appearance represents the thing in its stability, as a static and fixed

entity, while essence represents the thing in the completion of its development.

Common sense, however, equates appearance and essence, and in so doing locks

the thing in that as which it appears. According to this formulation, the thing is, in

essence, exactly as it appears. In its appearance, it comes to us 'as it is,' without the

intervention of our thought or of our comprehension. Dialectical thinking, by contrast,

breaks down the "distorting mechanisms of the prevailing state of being [given by the

thing's appearance]"15 by comparing "the apparent or given form of things to the

potentialities of those same things, and in so doing [distinguishing] their essence from

their accidental state of existence." 16 This is brought about, "not through some process of

mystical intuition, but by a method of conceptual cognition, which examines the process

whereby each form has become what it is."17

According to this manner of thinking, the thing is thought both from the point of

view of its appearance and from the point of view of that process by which its essence is

13 The simultaneous cancellation, preservation, and elevation of an element in a dialectical process as a partial element in a synthesis (Webster's Third New International Dictionary).
14 Hegel (1969), p. 46.
15 Marcuse, op. cit., p. 46.
16 Ibid.
17 Ibid.


realized. Such thinking "conceives 'the intellectual and material world' not as a totality of

fixed and stable relations, but 'as a becoming, and its being as a product and a

producing.'"18 Our experience is not welded to the appearance of phenomena as they

occur in the world; it arises within the context of its own intervention.

Given this delineation of appearance vs. essence, one is left tempted to think that a

thing arises as a development in which its essence, or potentiality, is realized and that,

therefore, this essence must be defined a priori. After all, to say that a thing is

constituted by its essence, one would have to assume that such an essence must already

exist. A dialectical analysis, however, would reject this assumption, understanding the

object as a movement, a process by which its potential is realized through the negation

of its existence and its subsequent passage to new existence. The particularity of this

development is not established a priori; rather, it is immanent within that very

development.

The dialectical 'Triad' demonstrates the contours of this development by depicting

the Dialectic as an elaborative movement. This movement is logical, not temporal. It

moves from Thesis through Antithesis to Synthesis. Again, this triad articulates, not three

separate thoughts, but the logical development of a single thought.

The initial Thesis, characterized as a positive presentation, can only be as such by

virtue of its negation, that is, its Antithesis.19 If a thing is to be positively determined, it

can only be determined by distinguishing that which it is not. In this regard, Thesis and

Antithesis are two sides of a single coin: "inseparable 'moments' of a single thought; the

Antithesis is the negated Thesis."20,21

And yet, this thought is still logically incomplete, due to its contradictory nature.

In order to complete itself, it must pass from negation (Antithesis) to Synthesis in that the

process under which the thought develops must be contained-a containment by which a

thing and its negation come together as a unity.22 Synthesis, in a sense, represents the

"entry point" of thought as a self-reflective and self-aware development, a development

which in turn constitutes the activated presence of the subject. Thought negates the

negation of the thing by synthesizing the thing-along with its negation-within

18 Marcuse, op. cit., p. 46.
19 Mure (1965), p. 34.
20 Mure (1965), p. 34 (my emphasis).
21 The term 'moments' here lends considerable insight; it is a term which Hegel had borrowed from mechanics, wherein the 'moments' of the lever, its weight and distance, all operated identically, even though weight was understood as a 'corporeal' reality, while distance was understood in a more abstract way (Mure (1965), p. 34, footnote 2).
22 Mure (1965), p. 35.


thought's own unfolding. Synthesized within thought's own unfolding, the thing is

transformed, thus constituting a new Thesis.

Hegel's formulation of Being exemplifies the development of this triadic

formula.23 He begins with "Being, pure being:"

In its indeterminate immediacy it is equal only to itself. It is also not unequal relatively to an other; it has no diversity within itself nor any with a reference outwards.24

This notion of Being is being in its abstract purity. It is "not any particular kind of being,

such as this pen, that book, this table, that chair."25 In order to arrive at pure Being, we

have to abstract from it all specific forms of determination. From the Being of a table, for

instance, we must abstract away its woodness, its squareness, its brownness, and so on,

until all we have left is its most abstract "isness." Such being is completely

indeterminate, empty, vacant.

But such an utter emptiness, such a vacuum, is really nothing; and "pure nothing

... is simply equality with itself, complete emptiness, absence of all determination and

content-undifferentiatedness in itself."26 As such, "the pure concept of being is ... seen

to contain the idea of nothing."27 This identity of Being and Nothing represents, for

Hegel, the logically originary identity of Thesis and Antithesis. Accordingly, the truth of

Being "is neither being nor nothing, but that being . . . into nothing, and nothing into

being."28 Under the same development, however, while Being and Nothing logically

dissolve into one another,

it is equally true that they are not undistinguished from each other, that, on the contrary, they are not the same, that they are absolutely distinct, and yet that they are unseparated and inseparable and that each immediately vanishes in its opposite.29

Being is not so much an identity as it is a movement; the movement of Being vanishing

into Nothing and, simultaneously, of Nothing vanishing into Being. This movement is its

23 The basis for this example is taken from Hegel's Science of Logic; however, my explication of it here is taken both from Hegel and from Stace (1955), pp. 88-93.
24 Hegel (1969), p. 82.
25 Stace (1955), p. 90.
26 Hegel (1969), p. 82.
27 Stace (1955), pp. 90-91 (my emphasis).
28 Hegel (1969), pp. 82-83.
29 Ibid., p. 83.


Synthesis: a Becoming. This movement, this Becoming, is not time-bound, however; that

is, it is not the description of a 'transformation.' Rather it is a synthesis "which can only

be stated as an unrest of incompatibles."30

This movement has a development. Consider, for instance, a stone. A stone

remains a stone, even throughout the various interactions into which it enters: it gets wet,

it resists an ax, it withstands a certain load before giving way, etc.31 Through all this, it

maintains its being as stone. However, in its various transformations, it is not the case

that the stone maintains itself-it cannot act in order to effect its own development. A

plant, by contrast, is different: it maintains itself. At first it is a bud and then it is a

blossom and finally it decays. It is never only that which it is at a particular moment: a

bud now, a blossom now, etc. Rather the plant constitutes itself as "the whole movement

from bud through blossom to decay."32 In this case, it is the entity itself which maintains

itself and not some force external to it. But even the plant, though it does direct its own

development, does not "'comprehend' this development."33 It does not recognize, or more

accurately realize, that development as its own and so it cannot intentionally bring its

own potential-its essence-into being.34 The criterion of intention is significant. An

entity which can intentionally cause the realization of its own potential is one which, in

this capacity, is self-reflective. Such a being realizes itself "in the process of positing

itself, or in mediating with its own self its transitions from one state or position to the

opposite."35 Such a being has the capacity to reflect upon its own development and the

freedom (at least potentially) to constitute itself as that development.

By constituting its own self-identity, cognition "[works] itself out through an

active self-directed process,"36 and not as a fixation of either object or subject. Such a

process is one by which "self-identity" is reinstated over and over again: each reinstating

giving rise to its negation and the necessity for a new formulation, which in turn gives

rise to the reinstatement, once again, of self-identity, and so on. Identity, in this case, is

not a fixed thing-it is the self-recognition of the movement by which the subject

develops itself: "it is the process of its own becoming."37 By this process, "I" and "not-I"

are reconciled within the self-reflective gesture of the subject. The reconciliation is not

to be thought of as anything like 'agreement.' Rather it engenders a process by which

30 Ibid., p. 91.
31 Marcuse, op. cit., p. 8.
32 Ibid.
33 Ibid., p. 9.
34 Ibid.
35 Hegel (1967), p. 80 (my emphasis).
36 Ibid., p. 81.
37 Ibid. (my emphasis).


thought thinks the contradiction itself, and from that contradiction itself, seeks a

synthesis. By this method of thinking,

the real subject-matter is not exhausted in its purpose, but in the working the matter out; nor is the mere result attained the concrete whole itself, but the result along with the process of arriving at it.38

The result of thinking finds itself not merely in the content of that which constitutes

thinking's 'substance', but in both the content and the process by which it is determined.

This process of mediation introduces the subject as a shaping force in the

unfolding of the object. Through mediation, an object is developed within the

self-reflective gesture of the subject, circumscribing a process whereby a thing passes from

positive existence (Thesis) through its negation (Antithesis) and into Synthesis.

Mediation, in this sense, counteracts the metaphysical primacy of the separation of

subject and object according to which the subject is subordinated to the apparent

immediacy of the object.

As a consequence of this formulation, the mind is no longer simply a cogitating

machine functioning solely for the apprehension of the immediate. Its force comes, not

merely from what it apprehends, but from its comprehension; and from its capacity for

projecting that which it comprehends into an object world in which it recognizes itself in

its development. From this it follows that

[t]he force of mind is only as great as its expression; its depth only as deep as its power to expand and lose itself when spending and giving out its substance.39

1.3 The Object as Immanent in the Labor of the Subject

How can such a relationship between object and subject be construed? In order to

address this question, a contradiction must be introduced. For while an object arises

within the self-reflective production of the subject, the object must be protected from the

historical tendency of the subject to subjugate the agency of the object under its own

imperative. This particular point is the basis for Adorno's analysis of the dialectic.

Adorno situates the crux of dialectical philosophy as a concern for the free

development of the object. Toward this end, he amplifies Hegel's emphasis on the

experiential dimension of dialectical philosophy.40 According to Adorno, however,

38 Ibid., p. 69 (my emphasis).
39 Ibid., p. 74.

40 Nicholson and Shapiro, Introduction to Adorno (1995), p. xv.


dialectical philosophy immerses itself, not in the experience of the subject-as subjective

idealism would have it-but rather in the experience of the object. It strives to emphasize

the otherness of experience by focusing upon the object as that toward which the subject

extends its own development, thus modifying the course of that development. This is a

tricky distinction. For, on the one hand, we are arguing for a freely-evolving object,

whose imperative is primary over that of the subject. And yet, on the other hand, the

object can only arise according to the labor exerted by a subject in its comprehension of

the object.

The key to this distinction is that the subject arises in the labor exerted over the

comprehension of the object. It is not through the mere presence of the subject that an

object is constituted, but through its (the subject's) labor. The movement of thought "is

powered by the self-reflection of the subject attempting to conceive reality."41 As such,

subjectivity is immanent in its labor and is therefore contingent upon the nature of the

appearance of the object. And yet, by precisely this same development, the nature of the

appearance of the object is also contingent upon the manner in which it is conceived by

thought, which constitutes the activating agency of the subject. Experience is essentially

dialectical in that the object depends upon the subject for its existence by the very same

cognitive process by which the subject depends upon the object for its existence. The

activity of thought (the moment of individuated subject) "turns labor inward," embracing

"the burdensomeness and coerciveness of outwardly directed labor [which perpetuates]

itself in the reflective, modeling efforts that knowledge directs toward its 'object.'"42 As

arising in its labor, the "I" becomes a particularized "I", shaped by the object with

respect to which it makes its effort. This particularized "I" constitutes the empirical

dimension of the subject wherein speculation itself becomes an "experiential content."43

Such a notion of the subject can be contrasted with that which Kant advanced.

Whereas for Kant, "no world, no constitutum, is possible without the subjective

conditions of reason, the constituens," Hegel's Dialectic maintained

that there can be no constituens and no generative conditions of the spirit that are not abstracted from actual [things] and thereby ultimately from something that is not merely subjective, from the "world."44

41 Ibid., p. xxiii.
42 Ibid., p. 21.
43 Ibid., p. xxiii.
44 Ibid., p. 9.


The subjective arises in a particularized subjectivity, while abstraction is manifested in

the concretized form by which a subject projects itself toward the object. The productive

activity of the subject becomes a generative mechanism "through which human beings

form something that then confronts them."45 This "confrontation" represents the moment

at which a subject recognizes its reflected effort as something that is other to itself, and is

the beginning of the notion of subject as that which it can comprehend both as and for

itself.

The aspect of labor, then, constitutes the subject as that which in its effort

comprehends the object as something other than itself. I say "in its effort," since nothing

comes to pass merely by virtue of there being an object and a consciousness. Spirit is not

monolithic transcendence; rather, it manifests itself as "the quintessence of the partial

moments, which always point beyond themselves and are generated from one another."46

As "the quintessence of the partial moments," 'spirit' engenders a process in which subject

and object emerge together, each, nevertheless, in the particularity of its own

development. It is through this notion of spirit, "that the opposition between mere matter

and a consciousness that bestows form and meaning is extinguished."47 Whole and parts

are obtained from the mutual constitution of the one by the other.

1.4 Summary

Through dialectics, we begin to trace a notion of interaction in which subject and

object are mutually determinative. But this mutual determination does not constitute a

transcendent and monolithic "spirit" which presides over all experience. Rather, it

accentuates the idea that, in fact, there is no transcendental object, no transcendental self:

subject and object occur in the utter particularity of the occurrence of the one for the

other, and each of itself. Interaction is the enactment of this occurrence.

45 Ibid., p. 21.
46 Ibid., p. 4.
47 Ibid., p. 5.


2. Autopoiesis and the Biology of Interaction

The neurophysiologist Humberto Maturana confronted a problem similar to the one Hegel

(and Adorno) faced: the circumscription of thought by the metaphysical assumptions

embedded within language. Maturana's philosophical ideas stem from observations he

made during his research on visual perception. The

observations he made called into question traditional notions of perception, i.e. that

perception occurs as a direct mapping from real objects "out there" onto structures within

our sensory organs. His experiments with frog vision, for example, challenged the

traditional assumption that the neurological activity of the optic nerve was a "direct

representation of the pattern of light on the retina."48 He showed, for instance, that fibers

within the retina responded not to patterns of light intensity but rather to patterns of local

variation on the retina itself. This demonstrated that at least some of the cognitive

processes relevant to the survival of the frog occurred within the visual system and not at

a higher level of neuroanatomy (such as the brain).

In subsequent research in color vision, Maturana noticed that under certain

conditions, the retina would produce color messages not actually occurring in the

environment. This and many other experiments led Maturana to question the traditional

theories of color vision as a process by which the visual system associates colors with

wavelengths on the spectrum and to postulate that the study of color vision is "the

understanding of the participation of the retina ... in the generation of the color space of

the observer."49 As such, it seemed that perception needed to be studied by viewing "the

properties of the nervous system as a generator of phenomena rather than as a filter on

the mapping of reality."50

The traditional idea that biological systems are open systems could not account,

therefore, for the empirical evidence. Given the evidence, it seemed that living systems

were closed, not open. Perception, it seemed, occurs by virtue of a system changing its

own structure in response to a stimulus, and not from changes introduced from the

outside by that stimulus. As such, a stimulus could be induced in any arbitrary manner; a

chemically induced stimulus (such as an injection in the retina) was non-differentiable

from a stimulus induced by an external visual object. Maturana maintains that all

interactions within the nervous system should be understood in this way. As Maturana

writes,

48 Winograd (1986), p. 41.
49 Maturana (1970), p. xii.
50 Winograd, op. cit., p. 42.


The focus should be on the interaction within the system as a whole, not on the structure of perturbations. The perturbations do not determine what happens in the nervous system, but merely trigger changes of state. It is the structure of the perturbed system that determines, or better, specifies what structural configurations of the medium can perturb it.51

Perception then occurs by virtue of changes that a system makes to its state in order to

adapt to particular perturbations.52 This contradicts the objectivist explanation which

defines perception and cognition in terms of "input" and "output." According to

Maturana's formulation, perception constitutes the means by which a system maintains its

own organization-through the alteration of its own structure-while interacting with

other systems. Perception is, therefore, not a process by which information is passed

from an environment into a system; it is a process by which the system changes its own

structure in response to perturbations introduced by the environment. Thus, the system's

closure to information and control.
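Footnote 52 points to Varela's demonstration of this idea with cellular automata. The following is a minimal sketch of my own (not Varela's actual code; the rule numbers are illustrative choices): the same perturbation, applied to systems with different structures, is either erased or propagated, because it is the structure of the perturbed system that specifies what a perturbation can do to it.

```python
# Illustrative sketch (mine, not Varela's): identical perturbations applied to
# elementary cellular automata whose structures (update rules) differ.

def step(state, rule):
    """One synchronous update of a 1-D elementary cellular automaton (wraparound)."""
    n = len(state)
    new = []
    for i in range(n):
        left, center, right = state[i - 1], state[i], state[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value 0..7
        new.append((rule >> neighborhood) & 1)              # Wolfram rule lookup
    return new

def perturb(state, i):
    """Flip one cell: a 'perturbation' introduced by the medium."""
    out = list(state)
    out[i] ^= 1
    return out

quiescent = [0] * 16
poked = perturb(quiescent, 8)

# Under rule 0, every neighborhood maps to 0: this structure does not register
# the perturbation, which vanishes after a single step.
after_rule0 = step(poked, 0)

# Under rule 90 (each cell becomes the XOR of its two neighbors), the same
# perturbation propagates outward, dramatically changing the state history.
after_rule90 = step(poked, 90)

print(after_rule0)
print(after_rule90)
```

The point of the sketch is not the particular rules but the asymmetry: the perturbation is identical in both runs, yet what "happens" to each system is specified entirely by that system's own structure.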

2.1 Autopoiesis

If a system is closed, however, how is it that it can have knowledge of a world?

This leads to a central idea in Maturana's theory: autopoiesis. Autopoiesis is the

principle by which a system "holds constant [its organization] and defines its boundaries

through the continuous production of its components."53 It is, as Maturana defines it,

a network of processes of production (transformation and destruction) of components that produces the components that: (i) through their interactions and transformations continuously regenerate the network of processes (relations) that produced them; and (ii) constitute it [(the system)] as a concrete unity in the space in which they (the components) exist by specifying the topological domain of its realization as such a network.54

A system maintains itself through its ability to alter the structure of its components, and

to adapt its structure to changes in its environment. Failure on the part of a living system

to do this leads to its demise, to the cessation of its organization.

51 Winograd, op. cit., p. 43.
52 See Varela (1991), pp. 150-153 for a demonstration of this idea using cellular automata interacting with "perturbations." As demonstrated, some rules do not register the perturbation (the cellular automaton is unaffected) while others can be dramatically affected.
53 Winograd, op. cit., p. 44.
54 Maturana and Varela (1980), pp. 78-79.


An autopoietic system is a type of homeostatic system; however, in the case of an

autopoietic system, the only components it produces are those of which it itself is

composed. As such, it is to be distinguished from an allopoietic system which produces

only components that belong to an organization that is other than the system producing

them. Allopoietic systems constitute human-made systems such as computers, theories,

etc. An autopoietic system, by contrast, generates "productions of precisely those

components which integrate it ... ,"55 which lead to the maintenance of its organization.

With respect to this formulation of the notion of autopoiesis, Maturana's

treatment of the terms organization and structure is significant. In everyday usage, these

two terms are understood as roughly equivalent. Maturana (along with others), however,

differentiates them as follows: Organization

refers to the function of components in the constitution of a whole. The organization of an entity or system is the set of relations that the observer specifies as defining the entity.56

Organization does not imply a particular structure-in fact, a given organization can be

(and in autopoietic systems, by necessity, is) realized by different structures. In addition,

[t]he organization of a composite system constitutes it as a unity and determines its properties as such a unity, specifying a domain in which it may interact (and be treated) as an unanalyzable whole.57

By contrast to organization, structure

refers to what is built, and to the way the components of what is built are put together. The structure of a system is the set of components and relations between components making up the unity.58

Structure determines how a system is put together: it says nothing about its organization

or about the unity by which it is distinguished. As such, one can summarize the

difference between organization and structure by saying that organization is

whole-constituting, whereas structure is component-constituting. Organization can only be

comprehended in terms of a whole, whereas structure can only be comprehended in terms

of parts. Consequently, certain kinds of systems can undergo structural change while

55 F. Varela in Von Foerster et al. (1974), p. 111 (my emphasis).
56 K. Wilson in Von Foerster et al. (1974), p. 104.
57 Maturana (1975), p. 316.
58 K. Wilson in Von Foerster et al. (1974), p. 103.


maintaining their organization. As specialized instances of such systems, autopoietic

systems continually change their structure in order to maintain their organization.
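A loose programming analogy (my own, not Maturana's) may help fix the distinction: the same whole-constituting organization can be realized by different component structures, much as one set of defining relations can be realized by different implementations. The class names below are hypothetical illustrations.

```python
# Illustration (my analogy, not Maturana's): the "organization" of a FIFO
# queue is the set of relations defining it as a whole (items leave in the
# order they entered); its "structure" is the particular arrangement of
# components that realizes those relations.

from collections import deque

class ListQueue:
    """One structure realizing the FIFO organization."""
    def __init__(self):
        self._items = []
    def enqueue(self, x):
        self._items.append(x)
    def dequeue(self):
        return self._items.pop(0)

class DequeQueue:
    """A different structure realizing the same organization."""
    def __init__(self):
        self._items = deque()
    def enqueue(self, x):
        self._items.append(x)
    def dequeue(self):
        return self._items.popleft()

# Observed "as an unanalyzable whole," the two are indistinguishable: the
# components differ, but the whole-constituting relations are the same.
for q in (ListQueue(), DequeQueue()):
    q.enqueue("a"); q.enqueue("b")
    print(q.dequeue(), q.dequeue())   # prints: a b
```

The analogy is only partial: unlike an autopoietic system, neither queue produces or replaces its own components. What it does capture is that structure can change while organization persists.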

Before proceeding, I would like to make clear that the notion of "organization" is

not to be understood as equivalent to "identity." This possible confusion is somewhat

heightened by the fact that Maturana himself uses the latter term quite often in his

writings. As Stafford Beer points out, however, the "it" which constitutes the

organization of an autopoietic system

is notified precisely by its survival in a real world. You cannot find it by analysis, because its categories may all have changed since you last looked. There is no need to postulate a mystical something which ensures the preservation of identity despite appearances. The very continuation is 'it.' ... Berkeley got the precisely right argument precisely wrong. He contended that something not being observed goes out of existence. Autopoiesis says that something that exists may turn out to be unrecognizable when you next observe it.59

The "it" which the organization constitutes is more a process than it is an appearance, or

a "state." It manifests itself solely in its continuation.

Moreover, in order for it to maintain its organization, its it-ness, an autopoietic

system requires a medium (i.e. an environment); in fact, the very existence of an

autopoietic system demands such a medium. This might seem contradictory. That a

system whose sole dynamic activity is the production of components of which it is

composed, and whose "perception" of the external world consists of its re-constituting its

own structure-that such a system would require a medium seems contradictory.

Understanding why this is so, however, helps in understanding Maturana's concept of the

organization of living systems. For an autopoietic system, the medium provides the

physical elements whose perturbations of the autopoietic system permit in it the processes

by which the production of its components-and thus of its organization-takes place.60

Without a medium, there is no means for the perturbation of the system, and therefore no

means for the system to actively engage in the production of the components which

constitute its organization. Such a system would quickly cease to exist. This formulation

has far-reaching consequences when considering interaction in general and

human/computer interaction in particular since it suggests that the particularity of a

performance within a given task domain is largely determined by the manner in which the

59 Beer (1980), p. 67.
60 Maturana (1974), p. 319.


task environment frames the presentation of phenomena-both physical and

conceptual-with respect to which a human may act.

2.2 The Cognitive Domain

The discussion so far concerns a structural explanation of autopoietic systems. A

structural explanation takes into account the behavior of a system at any given moment.

A cognitive explanation, by contrast, introduces the notion of historicity and of patterns

of interactions. A cognitive explanation is differentiated from a structural explanation in

that the former "operates in a phenomenal domain (domain of phenomena) that is distinct

from the domain of mechanistic structure-determined behavior [in which domain

structural explanations operate]."61

As a result of entering into a cognitive interaction, the internal state of an

autopoietic system is changed in a manner that is relevant to the maintenance of its

organization.62 In order to allow for this, "the nervous system enlarges the domain of

interactions of the organism by making its internal states also modifiable ...."63 For

example, in studying the vision of an animal (say, a cat), an observer would see (through

the use of measuring instruments attached to the animal's visual sensors) that the sensors

of the animal are modified by light. Moreover, an observer would also see that the

animal's behavior is modified by a visible entity (for instance, a bird). In both cases, the

observer would see that the sensors change through physical interactions, and that these

physical interactions constitute what we might call "the absorption of light quanta."64

And yet, from the point of view of the organism itself, the internal state of the animal is

changed, not by the presence of the visible entity (the bird), nor even by the light quanta

(reflecting that presence) that reach the sensors. Rather, the animal's internal state "is

modified through its interactions with the relations that hold [among] the activated

sensors that absorbed the light quanta at the sensory surface."65 This can be diagrammed

as follows:

internal state <-- sensors <-- perturbation

61 Winograd (1986), p. 47.
62 Maturana (1980), p. 13.
63 Ibid.
64 Ibid.
65 Ibid.


While a structural explanation will tell us the nature of each individual

perturbation in such an interaction, a cognitive explanation accounts for "the pattern of

interactions by which [the current structure of the animal] came to be, and the relationship

of those changes to effective action."66 In other words, it takes into account the history of

interactions. Such a history is a history of alterations of an organism's internal state and

is, as such, particular to that organism. This history constitutes the organism's cognitive

process, a process which occurs within the cognitive domain of the organism. Winograd

and Flores point out that "it is ... within this cognitive domain that we can make

distinctions based on words such as 'intention,' 'knowledge,' and 'learning.'"67

Knowledge, for instance, does not reflect a single state of a system; rather, it reflects a

history of interactions and a pattern of actions.68 Also, learning constitutes "the coupling

of the changing structure of an autopoietic unity to the changing structure of the medium

in which it exists ...."69

2.3 The Consensual Domain

Due to the various perturbations which other systems generate in it, the

autopoietic system continuously alters its own structure and, as a consequence, becomes

structurally coupled to those other systems.

When two or more organisms interact recursively as structurally plastic systems, ... the result is mutual ontogenic structural coupling.... For an observer, the domain of interactions specified through such ontogenic structural coupling appears as a network of sequences of mutually triggering interlocked conducts.... The various conducts or behaviors involved are both arbitrary and contextual. The behaviors are arbitrary because they can have any form as long as they operate as triggering perturbations in the interactions; they are contextual because their participation in the interlocked interactions of the domain is defined only with respect to the interactions that constitute the domain.... I shall call the domain of interlocked conducts ... a consensual domain.70

Consensual domains are essentially linguistic domains.71 As a consensual

domain, language "is a patterning of 'mutual orienting behavior."'72 It is the means by

66 Winograd, op. cit., p. 47.
67 Ibid.
68 Ibid.
69 Maturana (1975), p. 321.
70 Maturana as quoted in Winograd, op. cit., p. 49.
71 Winograd, op. cit., p. 49.
72 Ibid.


which interlocked systems (systems that are structurally coupled) effect sequences of

"mutually triggering interlocked conducts" within a particular consensual domain.

Viewed in this manner, language functions on a connotative, rather than denotative, level;

its function is not to reference external entities, as is commonly understood. Rather, "its

function is to orient the orientee within his cognitive domain."73

The basic function of language as a system of orienting behavior is not the transmission of information or the description of an independent universe about which we can talk, but the creation of a consensual domain of behavior between linguistically interacting systems through the development of a cooperative domain of interactions.74

This understanding of language counters the traditional view which understands

language by its denotative function, that is as a means by which meanings and contents

are conveyed. Maturana's development of language as behavior occurring within a

consensual domain rejects the common sense view that we can speak of, and have

knowledge of, things independent of our own experience of them. Our knowledge of an

"external world" is nothing more than the history of our interactions within a particular

environment or context. Such an environment is itself composed of interacting systems

which generate the sequences of perturbations which trigger the patterns of interactions

into which we enter. They are not the entities or things which we experience or of which

we have knowledge; rather they are simply other systems with which, through structural

coupling, we interact. The only way we can talk about "a world" is as observers.75

As observers, we generate distinctions in a consensual domain. A description in any domain (whether it be the domain of goals and intention, or that of physical systems) is inevitably a statement made by an observer to another observer, and is grounded not in an external reality but in the consensual domain shared by those observers.76

Consequently, "[p]roperties of things ... exist only as operational distinctions in a

domain of distinctions specified by an observer."77 They are a consequence of our

descriptions which, belonging to a consensual domain, are therefore constituted by that

domain.

73 Maturana (1980), p. 30.
74 Ibid., p. 50.
75 Ibid.
76 Winograd, op. cit., pp. 50-51.
77 Ibid., p. 51.


2.4 Conclusion

Autopoietic systems maintain their organization by virtue of interactions within

particular environments and therefore require particular environments in order to

maintain their organization. Just as the environment determines-by virtue of the

interactions it stimulates in the autopoietic system through the perturbations it

generates-the organization of the autopoietic system, so too does the organization of the

autopoietic system determine-by virtue of the perturbations to which it is

sensitive-those particular aspects of its environment that are relevant to its continued

autopoiesis. In this way, autopoiesis has hermeneutic aspects inasmuch as it views the

"mind" as that which arises within the mutually constitutive development of a

self-organizing system (a subject) and an environment (an object); that mind and world arise

together in the mutual particularity of their constituting agency.



3. The Dialectical Hermeneutics of Interaction

In the previous two chapters, I have presented a dialectical and hermeneutic framework

by which we may consider human/computer interaction. To review, some of the salient

points covered were:

• An object arises as a movement, a process by which its potential is realized

through the negation of its appearance; this negation is given by thought,

which we understand as the activating agency of the subject.

• The subject arises not as an "identity" but as a continuous process of its own

unfolding; it arises in a process by which "I" and "not-I" are continuously

distinguished, then reconciled, then distinguished once again, and so on. Such

a subject arises in its labor over the comprehension of an object; under the

labor of the subjective "I", an abstract "I" becomes a particularized "I": thus,

the subject is, effectively, shaped by the object with respect to which it labors.

• Autopoiesis constitutes the process by which a living system maintains its

organization and defines its boundaries through the continuous production of

its structure. The organization which is so maintained should not be

understood as an "identity"; rather, it should be understood as a process, a

continuity. All living systems are autopoietic systems.

• Autopoietic systems are closed systems; their interactions are self-generated;

that is, they arise as a consequence of the system's effort to maintain its

organization in the face of perturbations which arise within a medium. The

structure of the system is, at any given moment, contingent upon the

interactions which that system generates in response to such perturbations.

• A living system is coupled with its environment, in that the particularity of its

interactions (and thus of its structure) is contingent upon the perturbations

generated within that medium. Without those particular perturbations, the

system as that system would cease to exist.


• Autopoietic systems are essentially 'self-modifying' systems, in that they alter

their structure in order to maintain their continuity.
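The dynamic summarized in these points can be caricatured in a toy sketch. The following is purely illustrative (invented names and numbers, a simple homeostat, far poorer than autopoiesis in Maturana's technical sense), but it shows the one feature at issue: the compensating response is generated inside the system, not supplied by the medium.

```python
# A toy caricature of the dynamic summarized above: a 'system' that maintains
# its organization (here reduced to a numeric set point) by modifying its
# structure in response to perturbations arising in a medium.  All names and
# values are invented for illustration.

class ToySystem:
    def __init__(self):
        self.organization = 10.0  # the invariant the system works to maintain
        self.structure = 10.0     # the current, modifiable structure

    def perturb(self, disturbance):
        """The medium perturbs; the system answers with its own correction."""
        self.structure += disturbance
        correction = (self.organization - self.structure) * 0.5
        self.structure += correction  # self-modification, not external repair
        return correction

system = ToySystem()
for d in [3.0, -2.0, 1.0]:         # a history of perturbations
    system.perturb(d)
print(round(system.structure, 3))  # 10.375: drifting back toward organization
```

Remove the perturbations and nothing happens; remove the internal correction and the structure drifts without bound: in either case the "system as that system," in the sense above, ceases.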

From these observations, we can make a synthesis and orient for ourselves a

preliminary formulation regarding interaction. First, autopoiesis constitutes a dialectical

theoretical framework: it situates the subject (a living system's "structure") as contingent

upon a particular presentation of the object (the "medium" of that system). Moreover, the

interactions in which an autopoietic system engages are generated within that system

itself and are not a product of the environment or of another system. Nevertheless, the

interactions generated are triggered by perturbations produced by the medium in which

the system exists. As such, the environment (i.e. "medium") has considerable influence

on how the system unfolds its own interactions. Without perturbation, there are no

interactions: the system ceases to exist.

This seeming contradiction embraces dialectical thinking in that it views cognition

as the occurrence of a history of interactions in which an abstract "I" becomes

particularized with respect to the occurring of an object (i.e. its "medium"). Such an

object appears, within the interactions in which it presents itself, as though it belongs to a

separate domain when in fact it is only a mapping of that system's own interactions. 78 In

this sense, we could say that cognition is a process by which the subject and object arise

together in the mutual determination of the one by the other. Subjectivity becomes

particularized with respect to the presentation of the object which it projects in its self-reflection. The nature of this "self-reflection" is not that of a mirror, showing the

"identity" of the subject-rather it is that of transduction, by which process the subject

recognizes itself during its transitional moments and in its otherness to the objects which

it comprehends as separate from itself.

In the following discussion, I wish to speak about interaction according to this

dialectical framework. Specifically, I wish to understand interaction as a context for the

presentation of experience such that the subject (the living system) must alter its structure

in order to bring itself forward into a comprehension of that experience. This

understanding rejects the objectivist notion of the subject as passive and essentially

"voluntary" with respect to an object that remains separate from it. According to the

notion of interaction I wish to emphasize, subjectivity is anything but voluntary: it

constitutes the activating agency according to which the object arises. However, I do not

propose the notion of the subject as purely determinative of the object, either. Rather, I

78 Maturana (1970), p. 9.


propose an understanding of the object as that which enacts a perturbating agency upon

the subject, to conceive the object as the hypothesis according to which subjectivity tests

itself, a kind of lure which calls forth a particular "I." In a sense, the object becomes the

proverbial "hypothetical argument" according to which thought presents itself as the

activating context of an emergent-that is, as yet unimagined-subjective "I." Interaction

becomes a context in which an abstract "I" is particularized according to the presentation

of the other which is the "object," by which process it actuates the self-reflection of the

subject as other.

3.1 Interaction as Design Science

I begin with a brief overview of human/computer interaction understood as a

"design science." Design science is concerned with the design of things with which

humans must interact in order to accomplish particular goals. Such things include doors,

automobile control panels, bathtub fixtures, and so on. Interaction is enabled through the

establishment of an 'interface.' An interface, loosely speaking, constitutes the set of

mechanisms-both physical and conceptual-by which an interaction is engendered.

From the point of view of design science, the goal of an interface is to enable a human to

make passage through it with as little disruption of her/his cognitive functioning as

possible while, at the same time, directing her/him into the tasks required to accomplish

that passage.

The field of human/computer interaction (HCI) oftentimes derives its own efforts

from those made within the design sciences. These efforts are exemplified in the research

of Donald Norman's group at San Diego as well as research conducted at Apple and,

before that, at Xerox PARC. The principal notion of HCI is that human/machine interaction is essentially a 'task-oriented' and 'user-centered' domain of activity.79

According to this principle, good interface design begins with what the user wants to

do.80 Toward the conceptual development of the principle of user-centered interface

design, Donald Norman, for instance, distinguishes a person's "psychologically" expressed goals from the physical controls and variables of the physical system itself. An

interface is a mapping between these two. The difference between the two-a user's

goals and the physical mechanisms of the machine-comprises a gulf which the interface

serves to bridge.81 There are two ways to bridge this gulf:

79 Rheingold (1990). 80 Nonnan (1986). 81 Ibid., p. 40.


1. the physical mechanisms of the machine-its 'input' and 'output'

features-can be designed in order to match, as closely as possible, the psychological needs of the user; or,

2. the user can adjust her/his plans and goals in order to match, as closely as

possible, the physical mechanisms which define the machine.82

Norman refers to these as (1) the Bridge of Execution and (2) the Bridge of Evaluation.83

The Bridge of Execution represents the conceptual steps needed to formulate the

psychological plans and goals in terms of the physical mechanisms of the computer. The

Bridge of Evaluation, by contrast, involves comparison of the observable properties of the

system state (its display, outputs, etc.) with the user's desired plans and goals.
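Norman's two bridges lend themselves to a schematic sketch. The following is a hypothetical illustration only (the goal phrasing, command vocabulary, and function names are invented; Norman's model is conceptual, not computational): execution maps a psychological goal onto machine operations, while evaluation compares the observable state back against the goal.

```python
# A schematic sketch of Norman's two "bridges" (hypothetical goal and command
# vocabularies; this is an illustration, not Norman's own formulation).

# A user's psychologically expressed goal, stated in task terms.
goal = "delete the draft letter"

def bridge_of_execution(goal):
    """Translate a psychological goal into the machine's physical operations."""
    plan = {
        "delete the draft letter": ["select file 'draft.txt'", "drag to trash"],
    }
    return plan[goal]

def bridge_of_evaluation(goal, system_state):
    """Compare the observable system state with the user's desired outcome."""
    if goal == "delete the draft letter":
        return "draft.txt" not in system_state["visible_files"]
    return False

actions = bridge_of_execution(goal)        # the execution side of the gulf
after = {"visible_files": ["report.txt"]}  # what the display now shows
print(actions)                             # the planned physical actions
print(bridge_of_evaluation(goal, after))   # True: the goal appears satisfied
```

The "gulf" is precisely the distance between the vocabulary of `goal` and the vocabulary inside `plan`: user-centered design shrinks it from the machine's side.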

According to user-centered HCI, the onus for closing the gulf between what the

user wants to do and how the machine is to behave is on the side of the machine. This

means that the mechanisms which the machine presents to the user must be such that they

assist the user in translating her/his plans and goals into particular actions performed

upon those mechanisms: there must be some kind of "common ground" between the two.

One way to conceptualize this notion of "common ground" is through the

formulation of interface metaphors. Interface metaphors are representations of objects

and environments that are familiar and which reflect something of the appropriate task

environment for a given problem or activity space. 84 An example of an interface

metaphor is the 'computer file' and the pictures of documents and folders which constitute

its graphical representation. Beneath the operating system of the computer, no such

abstraction exists; the purpose and function of the abstraction is to assist the user to

bridge the conceptual gulf which would otherwise hinder user/computer interaction.
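The point can be made concrete with a hypothetical miniature (invented names and layout, not an account of any actual operating system): the user sees named documents inside folders, while the underlying store is only a flat table of numbered blocks with no such structure.

```python
# A hypothetical miniature of the 'file' interface metaphor: named documents
# and folders are a layer projected over a flat, structureless block store.

raw_blocks = {0: b"Dear Paula,", 1: b" ...", 2: b"\x00" * 4}  # the machine's view

# The metaphorical layer: 'folders' of 'documents' mapped onto block numbers.
desktop = {
    "Letters": {              # a 'folder'
        "draft.txt": [0, 1],  # a 'document' is really a list of block indices
    }
}

def read_document(folder, name):
    """Present scattered blocks as one continuous 'document'."""
    indices = desktop[folder][name]
    return b"".join(raw_blocks[i] for i in indices)

print(read_document("Letters", "draft.txt"))  # b'Dear Paula, ...'
```

Everything the user touches lives in the metaphorical layer; `raw_blocks` never appears on screen, which is exactly the bridging work the metaphor performs.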

The employment of interface metaphors in designing interfaces replaces "the

notion of the computer as a tool with the idea of the computer as a representer of a

virtual world."85 Accordingly, "action occurs in the mimetic context and only secondarily

in the context of computer operation."86 The idea of employing interface metaphors in

interface design derives from the observation of the pervasiveness of the use of metaphors

in everyday thought. According to Erikson, for instance, metaphors can act as "cognitive

hooks":

82 Ibid. 83 Ibid. 84 Laurel (1993), p. 5. 85 Ibid., p. 127. 86 Ibid.


A metaphor is an invisible web of terms and associations which underlies the way we speak and think about a concept. It is this extended structure which makes metaphor such a powerful and essential part of our thinking. Metaphors function as natural models, allowing us to take our knowledge of familiar, concrete objects and experiences and use it to give structure to more abstract concepts.87

Since metaphors, by definition, elicit well-understood representations of real-world

objects, in theory, their employment in computer interfaces makes it easier for people to

form the necessary conceptualization for their use. 88

Consider, for instance, the Macintosh "desktop metaphor." The desktop metaphor

orients a domain of interactions that references a set of actions one normally performs

within an office environment. This includes activities such as writing and editing

documents, writing and sending messages to other members in the office, organizing

files, and so on. By referencing such tasks through graphical representation of things like

files, scrolling documents, trash cans, and the like, the desktop metaphor assists a human

in understanding how to use the computer and, perhaps more importantly, how to

comprehend her/his use of it within the larger task environment. As such, the desktop

metaphor effects a simplification of one domain of interactions by referencing another.

3.2 Problematising the Domain of Interaction

While such simplifications are essential when going about our day-to-day business (such as configuring a document, typing at the keyboard, and organizing computer data),

in this paper I am concerned with interfaces whose purpose is not to solve well-known

problems, but to assist in the formulation of as-yet unformulated problems (e.g. music

composition). The particular manner in which a problem is formulated defines the

cognitive domain-i.e. the domain of interactions-in which one comes to think about

that problem. Such a domain orients the range of actions and thoughts which one

believes to be appropriate to the formulation and, thereafter, the solution of a particular

problem. To generate such an environment is to engineer an interruption-a

breakdown-in an otherwise 'normal' set of occurrences in order to bring about an alteration in the interactions into which one might enter with respect to the formulation of a problem. Consequently, the occurrence of the world comes to be predicated upon a new set of principles: what might have been considered to be the normal occurrence of objects

is subverted, problematised.

87 Erikson (1990) quoted in Laurel, op. cit., p. 128. 88 Space does not permit a fuller discussion of metaphors; cf., however, Lakoff and Johnson (1980).


This is one of the functions of artworks: to problematise the occurrence of a world

through the delineation of a domain of interactions. Consider, for example, Marcel

Duchamp's Readymade. The Readymade worked, essentially, by restructuring the

signifying space of the object through reconstruction of the functional meanings which

that object defines. This recontextualizing gesture generated a crisis of understanding

with respect to the nature of one's interactions regarding that object. Such a crisis caused

the re-appearance of that object by allowing properties, previously unnoticed, to become

dominant features. In this regard, it caused-if only for a brief moment-aspects of an

interaction that had previously been transparent to become opaque.

Take as an example, Duchamp's Comb. This particular comb is not the comb we

use for our hair. In combing our hair, the comb disappears in the functionality of its use.

The comb, as thing, does not, in our combing, exist: its coming into being is arrested by

its absorption in our use of it, by our functional interaction with it. By contrast,

Duchamp's Comb presents the comb in such a way that we are beckoned to notice those

attributes which constitute its being-attributes we might not otherwise have noticed,

since they are subsumed in our use of the comb in combing. By placing the comb into the

context in which a painting or sculpture normally stands, Duchamp prompts the viewer to

reconsider the nature of those attributes which, after all, are 'structural' in much the same

manner that a painting or sculpture is 'structural.' In this way, Duchamp transformed the

interactions with respect to which one comprehends the comb and the domain of

interactions into which one may enter with regard to it.

By this means, Duchamp's Readymades generated a break in the otherwise normal

appearance of the object concerned. This was done not by changing the object itself (at

least, for the most part, not substantially) but by shifting the context of its presentation.

This shift in context constituted a breakdown in the normal circumspective manner in

which one might otherwise comprehend the object in question.

This notion of 'breakdown' is treated extensively by M. Heidegger and forms an

important aspect of the principle of interaction that I wish to articulate within the current

study. A breakdown is an interruption in the normal occurring with respect to which we

interact with objects in our environment. This "normal occurring" circumscribes what

Heidegger calls circumspective being. Circumspective being is a mode of being in which

both we, and the objects with which we interact, disappear within our absorption over

some task or activity. Objects occur for us in terms of the functionality for which they are

fitted with respect to some activity. That which distinguishes them as objects is

subordinated to the equipmental nature of their function. Heidegger proposes that


the situated use of equipment is in some sense prior to just looking at things and that what is revealed by use is ontologically more fundamental than the substances with determinate, context-free properties revealed by detached contemplation.89

The object functions precisely as an instrument of our engagement in "concernful"

activity; other than this, it has no independent being. As a piece of "equipment" the

object is nothing more than a placeholder-our most basic way of understanding it is to

use it.90

Where something is put to use, our concern subordinates itself to the "in-order-to" which is constitutive for the equipment we are employing at the time; the less we just stare at the hammer-thing, and the more we seize hold of it and use it, the more primordial does our relationship to it become, and the more unveiledly is it encountered as that which it is-as equipment.91

Moreover,

the peculiarity of what is primarily available is that, in its availableness, it must, as it were, withdraw in order to be available quite authentically. That with which our everyday dealings primarily dwell is not the tools themselves. On the contrary, that with which we concern ourselves primarily is the task-that which is to be done at the time.92

As a result, the object disappears-as object-into its use as equipment. Consider for

instance, the hammer. As a piece of equipment-i.e. a carpenter's tool-the hammer, as

hammer, disappears into its use in hammering. While hammering, the hammer occurs

only in its equipmental functionality: that is, as "a tool for hammering." As occurant in

its equipmental functionality, the hammer as object is invisible; it is not present as

something distinct and whole unto itself. As Heidegger maintains, the nature of our

interactions with things in our environment have a similar tendency to subordinate the

thing to the equipmental function into which it, as thing, disappears.

Being-in-the-world ... amounts to a nonthematic circumspective absorption in references or assignments constitutive for the availableness of an equipmental whole. Any concern is already as it is, because of some familiarity with the

89 Dreyfus (1993), p. 61. 90 Ibid., p. 64. 91 Heidegger quoted in Dreyfus, op. cit., p. 64. 92 Dreyfus, op. cit., p. 64.


world. In this familiarity Dasein [i.e. "Being"] can lose itself in what it encounters within the world. 93

A "breakdown" constitutes a disturbance in such a circumspective absorption.

It is a commonplace that the more suavely efficient a habit the more unconsciously it operates. Only a hitch in its workings occasions emotion and provokes thought. 94

The things which populate the environment in which we are absorbed come to the fore

when there is, in some manner, a breakdown in their occurrentness as equipmental and

functional objects. When, for example, the hammer breaks it suddenly appears as object;

that is, it appears as a hammer. Meanwhile, hammering-as "nonthematic

circumspective absorption"-becomes impossible and, thus, the nature of the interaction

called "hammering" is irrevocably altered. Such a breakdown in its equipmental nature

foregrounds the utter objectness of the hammer, an objectness which is brought about by its sudden occurrence as something foreign.

What Duchamp did, in composing the Readymades, was to engineer a breakdown

in the occurring of otherwise familiar objects. Through engineering such a breakdown,

the object is distinguished from the background in which it is normally subsumed under

its being something-in-order-to. The object stands out as something noticeable,

something whose features-otherwise subsumed in their associated

functionality-suddenly come into relief. All of a sudden, it becomes possible to

hypothesize new means for the analysis and (re)synthesis of the properties by which that

object is constructed.

Such new possibilities for the hypothesis and resynthesis of experience have come

to constitute an important aspect of the very working methods by which art works are

produced. The capability of producing art works is dependent on the particularity of the

means by which they are produced: means and end mutually constitute one another. By

way of articulating this point, visual artist Robert Morris, along with other visual artists

during the 1950s and 1960s, sought ways of foregrounding the forms involved in the

activity of art-making itself-of making those forms elements of the works themselves:

I believe there are "forms" to be found within the activity of making as much as within the end products. These are forms of behavior aimed at testing the limits

93 Heidegger as quoted in Dreyfus, op. cit., p. 70. 94 Dewey as quoted in Dreyfus, op. cit., p. 70.


and possibilities involved in that particular interaction between one's actions and the materials of the environment. 95

More specifically, Morris notes that

[t]he body's activity as it engages in manipulating various materials according to different processes has open to it different possibilities for behavior. What the hand and arm motion can do in relation to flat surfaces is different from what hand, arms, and body movement can do in relation to objects in three dimensions. Such differences of engagement (and their extensions with technological means) amount to different forms of behavior.96

Thus, the means by which an artwork is constructed can be made apparent in the

end which results. One way to do this is to systematize the means by which artworks are

made: to develop "a systematic method of production which [is] in one way or another

implied in the finished product. "97 Such a systematic method engenders "a more

phenomenological basis [for making works] where order is not sought in a priori systems

of mental logic but in the 'tendencies' inherent in a materials/process interaction."98 As an

example, Morris points out the painting technique which Jackson Pollock developed in

which he laid the canvas flat upon the ground and, standing over it, flung brushfuls of

paint across its surface. In his interactions with the canvas, Pollock used his entire body

in applying the paint to the canvas. Such interactions included considerations regarding the effects of gravity upon the paint, considerations which elicited investigation of the

properties and behavior of paint materials under the force of gravity. By restructuring the

domain of interactions in such a manner, Pollock engendered the appearance of principles

which could not otherwise have been observed-principles which became the basis for

reconstituting the means by which a painting might be made.

Pollock's method of transforming the means by which a painting might be made

was uniquely his. Nevertheless, the underlying principle-that, through the

transformation of the task environment, one could transform the very nature of that which

constitutes an art work-has become a cornerstone of contemporary art. The various

ways in which such transformations are realized constitute, in aggregate, an effort at

problematising a domain of interactions through the engineering of a breakdown in the

95 Morris (1970), p. 62. 96 Ibid. 97 Ibid., p. 63. 98 Ibid.


mode of occurrence of the objects that delineate the normal unfolding of those

interactions.

3.4 Interaction and the Enactment of Experience

In the works of Duchamp and in the working methods of artists like Pollock and

Morris, the domain of interactions is not fixed according to historically-bound procedures

for its synthesis. One's interactions no longer come "ready-made"; rather they occur as

provisional and mutable, transforming the essential principle of interaction from one of

comprehending and affecting experience to one of enacting it. To comprehend and affect

experience is to condition an environment such that it facilitates a repeatable and

therefore expected performance. To enact experience, by contrast, is to generate an

environment-i.e. a domain of interactions-which orients a cognitive system toward the

formulation of any number of possible distinctions and therefore toward the generation of

an unexpected performance.

To enact experience, from a dialectical point of view, is to comprehend the

"subject" not as a thing, but as an emergent process. As an emergent process, a subject

arises in the moment at which something unfamiliar or foreign appears, and which, in its

labor over the comprehension and synthesis of that something, projects itself toward it.99

In a sense, the moment of the appearance of an object represents the very commencement

of the subject-its beginning, that is, as an activated and activating agent, as opposed to a

static, a priori, existent. 100 At this moment, the subject recognizes its labor over the

comprehension of the object as a self-modifying process by which it becomes other to

itself-it arises within a process by which a human comprehends her/his own presence

within an environment of interactions. To create such an environment is to reconstitute

the appearance of things as foreign objects; to cause, that is, the appearance of the object

as Other. Constituted as Other, the appearance of an object occasions the presentation of

a subject, which subject is bound to the particularity of that appearance.

In the following chapters, I attempt to address this formulation as a problem in

interaction. During the course of this discussion, I trace some of the ways in which some

composers have formulated musical problems by emphasizing, in one way or another, the

technological dimension in which they arise and by confronting musical problems,

ultimately, as problems of interaction. Such an approach to music composition does not

necessitate technical devices per se; nor does the presence of technical devices

necessarily project such an approach. Rather, such approaches view musical problems as

99 Adorno (1973), p. 14. 100 Ibid., pp. x-xv; pp. 8-14.


essentially technological problems and, in so doing, articulate a new framework for the

specification of musical structure.


PART II:

The Musical Task Environment and the Problematisation of Interaction



For many, this is perhaps the only reason why a computer is needed. It is a valid reason, but it is certainly not the most interesting one. More interesting ones are to hear that which could not be heard without the computer, to think that which would not be thought without the computer, and to learn that which would not be learned without the computer. 1

The only legitimation that can make this kind of request admissible is that it will generate ideas, in other words, new statements. 2

In part I of this study, I articulated a dialectical and cognitive framework for

human/ computer interaction. In part II of the study, I consider the ramifications of such a

framework as these might apply to music composition in general and to the design of

computer music systems in particular. Typically, computer music systems reflect weak,

epistemologically shallow notions of the musical task environment. In a 1989 essay, C.

Roads makes a similar observation:

Since 1984 we have witnessed a dramatic increase in the commercial exploitation of computer music systems. A lively industry has grown up around the idea of making computer music systems accessible to musicians. Presently we enjoy the benefits of this industrial success, but we are also hampered by the flawed technical and musical protocols on which this success was built. Fundamental design problems hinder today's systems and impose sharp restrictions on both the mode of work and the musical results that can be obtained.3

Roads relates the problems associated with designing computer music systems to issues

of representation-that is, how musical objects and processes are displayed to the

1 Berg (1987), p. 161. 2 Lyotard (1984), p. 65. 3 Roads (1989), p. 257.


musician, represented within computer algorithms and data structures, and in the exchange of musical signals among different devices.4 As Roads notes:

Representations are central because they define the terms and concepts that musicians must use to conceive of and specify their music to the computer. These terms and concepts shape their composition strategies-and hence the type of music-that they can realize.5

Representations are, of course, central in the development of compositional

technique during the last 50 years. And yet, while "the lesson of books like Erhard

Karkoshka's Schriftbild der Neuen Musik (1966) and John Cage's Notations (1969) was

precisely to point out the diversity of music representations in the scores and working

notes of composers ... today's commercial systems[, by contrast,] incorporate a weak, monolithic, and normative music representation."6

What Karkoshka and Cage (along with other composers) taught us was that just as

one might compose the musical materials and forms that are represented within particular

task environments, one can also compose aspects of those very task environments.

Composing the task environment is tantamount to composing the interactions one might

have with ones own representations-interactions, that is, which have self-referential

comportment. To compose the interactions that one might have with a computer is, as

Otto Laske has observed, to extend the domain of self-reference into an allo-referential

domain-i.e. to explicate a theory of composition. 7 The focus of such a theory is on the

procedural dimension of compositional activity rather than on the artifacts which result.

Such a theory is not to be proved or disproved; rather, it encapsulates the possibility of

hypotheses that are to be tested against the very reality they hypothesize.

In the following discussion, I examine the principle of interaction, as it is

embodied in music compositional procedure, in order to articulate a framework for the

design of computer music systems. The framework that I wish to articulate acknowledges

the dialectical dimension of interaction in much the same manner that the scores and

compositional methods of composers such as Cage, Stockhausen, Xenakis, and Pousseur,

and many others did back in the 50s, 60s, and 70s. First, I consider the issue of musical

scores from the point of view of the nature of interaction which they project. Then, I

consider how compositional technique itself is recontextualized through the

4 Ibid. 5 Ibid., p. 258. 6 Ibid. 7 Laske (1980), p. 427.


"parameterization" of musical materials and process. In electronic music, as well as other

"abstract" approaches to music composition (serialism, stochastics, chance, etc.), the

parameterization of musical materials not only alters the presentation of those materials:

it alters how the composer thinks about and realizes experiments in the very procedures

by which those materials are assembled and organized. By restructuring the conceptual

framework according to which musical materials are assembled and organized, the

composer enters into a new relationship with those materials, resulting in a

transformation of the means by which musical artifacts are imagined and realized. With

the computer, such a transformation of the task environment can be made explicitly. As a

consequence, the composer not only observes the process of her/his interaction, but s/he

in fact composes that very interaction itself. In this regard, the computer program

encapsulates a theory of composition; a theory, that is, which frames a domain of

interactions. A computer system is understood as dialectical in the sense that the

imperative for its design and implementation is the enactment of a possible model of

experience (through the hypothesis of interaction), rather than (as is more commonly the

case with computer programs) the affecting of an already synthesized model of

experience.
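What the parameterization of musical materials looks like in practice can be sketched in miniature. The field names and operations below are invented for illustration, loosely in the spirit of the 'note list' score languages of the Music N family: the material is specified as parameter streams, so a compositional act becomes a procedure applied to data rather than a pencil edit to a notated symbol.

```python
# An illustrative sketch (invented field names) of musical material specified
# as parameter streams, in the spirit of 'note list' score languages.

from dataclasses import dataclass

@dataclass
class Event:
    start: float      # onset time, seconds
    duration: float   # seconds
    frequency: float  # Hz
    amplitude: float  # 0.0 to 1.0

def transpose(events, ratio):
    """Operate on the material as data: scale every frequency by `ratio`."""
    return [Event(e.start, e.duration, e.frequency * ratio, e.amplitude)
            for e in events]

phrase = [Event(0.0, 1.0, 440.0, 0.5), Event(1.0, 0.5, 660.0, 0.4)]
fifth_up = transpose(phrase, 1.5)       # a procedure applied to the material
print([e.frequency for e in fifth_up])  # [660.0, 990.0]
```

Once the material exists in this form, the composer can experiment with the procedures themselves (here, `transpose`, or any other transformation over the parameter streams), which is the shift in the task environment described above.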


4. The Musical Task Environment

4.1 Music Notation

During the last 50 years, music notation has been transformed from a means for

historically determined communication between composer and interpreter to a means for

the reformulation and problematisation of that communication. A musical score has

come to be understood as the encapsulation of a theory of performance practice-a theory

whereby the interpreter is challenged to redefine her/his relationship to the technologies

of performance (e.g. the musical instrument).

As one example of this development, consider John Cage's 26' 1.1499" for a

String Player (page 1 from the score is shown in figure 4.1). The score is for any 4-

stringed instrument. The following is from the performance instructions given by the

composer:

The notation is in space, the amount equaling a second given at the top of the page. Vibrato is notated graphically .... H indicates hair of bow, W, col legno. B indicates bridge (extreme ponticello); BN is closer to bridge than normal; NB is closer to normal than bridge, etc., F indicating extreme sul tasto. Below these notations is an area where bowing pressure is indicated graphically, the top being least, the bottom most pressure (i.e. pianissimo, fortissimo). The 4 strings (e.g. violin EADG) are the lower large areas, the points of stopping these being indicated. These strings are in a continual state of changing "tune" indicated by the words, decrease and increase, i.e. tension. Slides are indicated by angles and curves, harmonics by 3 lines connected vertically by dots. Vertical lines connecting two separate events indicate legato. 4 pizzicatti are distinguished .... Manner of breaking triple and quadruple stops is indicated by arrows. If no indication is given, the player is free to break as he chooses. The lowest area is devoted to noises on the box, sounds other than those produced on the strings ... 8

Clearly, the notation delineates a very different set of interactions, between player

and instrument, than does traditional notation. When confronted with such a score, a

performer first has to learn what the notation means. Then s/he must learn how to

coordinate her/his bodily movements in response to events as notated on the score. After

a time, however, once the performer has been able to respond naturally to the score, s/he

finds her/himself in an entirely different relationship with her/his instrument. Without the

8 Cage (1960).


Figure 4.1: Page 1 from Cage's 26' 1.1499"


usual cues which an already well-known notation engenders, the performer is less prone

to fall back upon habitual and acculturated patterns of relationship.

This is occasioned by two aspects of the notation. First, Cage's score directs the

performer's attention toward the structure of behavior rather than focusing it upon a fixed

and static musical object, per se: it tells the performer what to do rather than what to

produce.9 Second, unlike say a lute tablature, that which the performer is instructed to do

references an interaction that is not part of an already well-established performance

practice. A lute tablature tells the performer what to do in order to effect, nevertheless, a

particular anticipated result. The performer's relation to the instrument, as such, remains

circumscribed by a particular performance practice. Consequently, while the lute

tablature leaves a great many aspects of the performance open-which really means that

the performer is supposed to "know what to do"-Cage's notation in 26' 1.1499" for a

String Player specifies, in a highly determinate manner, every aspect of the performer's

activity, assuming that the performer has no pre-conceived idea as to what is to be done.

It is by means of such a focus on the performance itself that the interactions which it

invokes are differentiated from a performance 'practice.'

In this manner, the score engineers a breakdown in the circumspective being in

which the instrument is normally absorbed through historical performance practice. Like

the carpenter's hammer, the instrument makes its appearance as instrument when the

interactions in which it is presented are problematised. Accordingly, the instrument itself

quite literally becomes another instrument. In such a work, the means of production are

uniquely defined according to the same process by which a supposed musical content is

defined-musical "content" and musical "production" arise together.

4.2 Electronic Music and the Crises of Musical Form

One can generalize this principle somewhat by saying that the means and ends of

production arise together, each one determining the unfolding of the other. This principle

constitutes one of the most important theoretical insights offered by the development of

electroacoustic music. This was particularly true in research carried out at the electronic

music studios in Cologne during the 1950s.

While the use of electronic means for producing music was not, in itself,

unprecedented, the notion that the technological problems of electronic music could

become interchangeable with aesthetic problems was. For, as G.-M. Koenig points out,

9 An aspect of notation which T. DeLio attributes also to the scores of Christian Wolff (cf. DeLio (1984)).


technical problems in the studio prompted investigative procedures "which [translated]

the musical structure into a technical one." 10 Solutions to the technical problems of

electronic equipment came to be directly correlated with musical problems. As a result,

many technical means were unthought of "until they [collided] with a compositional

idea. The realization of electronic music is entirely conditioned by this dual music­

technical character."ll The relationship between the particularity of a technology and the

means by which musical structures might be conceived and realized was understood to be

mutually determinative.

Of course, this understanding of the relationship between technology and

compositional procedure was not, in itself, new. Composing for orchestral instruments

was always a process by which musical problems were predicated on technological ones,

and vice versa. However, this understanding was often subsumed under the historicity of

musical practice. Through the introduction of a new set of technical objects-as was the

case in the Cologne studios-this understanding of the relationship between technology

and compositional procedure was brought into the foreground.

The introduction of the electronic means for musical production was not itself the

impetus for this new understanding. Theremin, along with many others, had already

introduced electronic instruments. In their case, however, their introduction still

preserved the traditional categories by which music was composed and performed. By

contrast, the Cologne composers saw in electronic music an entirely new means for

specifying the technological problem as a musical one, and vice versa. By avoiding any

reference to instrumental and concert music paradigms, their interpretation of electronic

technology defined a new domain of interaction and, thus, posited a re-awakened

epistemology regarding the relationship of technology and music.

As one example of the consequences of this approach in compositional technique,

consider G.-M. Koenig's Terminus. In this work, Koenig formalized the means by which

materials could be generated in order (1) to find ways in which sounds themselves could

be understood as formally complete entities (and thus not subsumed under the movement

of larger forms), and (2) to find ways for integrating large-scale form (entire

compositions) and the "low-level" forms constituting sounds themselves.

In this work, the individual sounds are not bits of decoration turning a stage into a landscape or the front parlour. The sounds tell no story other than their own. This is why the piece is not based on a form-plan in which sounds were inserted. There

10 Koenig (1960), p. 53. 11 Ibid.


was rather only one single scheme: that of the production of sounds which gradually become forms of various complexity.12

For Koenig, electronic music composition is to be utterly differentiated from instrumental

music composition. In the tradition of instrumental music, sounds become elements of

organizations whose characteristics determine the "functional" meaning of those sounds.

The forms of instrumental music surely have one thing in common: they are composed of single sounds. These sounds have characteristics such as pitch or timbre, and when analyzing forms, we are frequently forced to pay attention to just these characteristics: the extent to which melodies or harmonic structures or those of timbre are engaged on the form as a whole. We are always dealing with sounds following a particular arrangement, and this arrangement-not the sounds comprising it-is what we call form.

In the context of electronically produced sound, by contrast, sounds themselves are

understood as having a form and constitution of their own.

[T]he electronically produced sound is not only within a form but is furthermore in itself the result of being formed. I don't mean its instrumental aspect-what we could call attack or decay or intonation. For whereas the characteristics of an instrumental sound are determined to a great extent by the mechanical texture of the musical instrument, the characteristics of an electronic sound are determined by the actions of the composer producing it. Everything about it is artificial, made with artifice directly derived from the musical idea of the entire work.13

In the composition of his work Terminus I, for instance, the form of the

sound-the "sound-form"-was not defined a priori; rather it arose through the unfolding of

the form of individual sounds themselves.

One could say that the individual sound, although the result of being formed, has nonetheless no actual form, but that it does acquire its form in the sequence of many similar or dissimilar sound forms.14

From a single sound form, a larger musical form emerges. The overall form of the work

emerges from the form of individual sounds; the large-scale form is, as such, essentially

contingent upon the emerging low-level forms of individual sounds.

12 Koenig (1965), p. 10. 13 Ibid., p. 9 (my emphasis). 14 Ibid., p. 10.


Such a radical departure from traditional compositional practice arose as a

consequence of the "collision" of technology and musical practice-a collision from

which there arose a re-activation of the epistemological questions constituting musical

practice and technological investigation. As Eimert points out, "new ways of generating

sounds stipulate new compositional ideas."15 These new ways of generating sounds came

with their own constraints. In the early days, when there was but a single sine-tone

generator, for instance, composition of timbres engendered all kinds of technical

problems, the solutions to which required "a thought-out plan of realization, which

translates the musical structure into a technical one."16 Far from simplifying

compositional tasks, the electronic studio complicated them immensely. The difficulties

that were introduced focused the composer's attention upon the materials/form dialectic

of her/his compositional procedure. This is because the composer now had to concern

her/himself "with a material to which traditional, well-proven ways of his art do not

apply."17

Eimert compared this focus on the materials/form dialectic with that of composers

during the beginning of the development of polyphony in the Middle Ages.18 At both

times, the composer was faced with a 'raw' material and with the problem of determining

the manner of processes by which it might be organized and structured. Such a problem

is a musically primordial one:

despite the apparent modesty of the preliminaries of electronic music, the full brunt of an experiment is borne in that a single creative selection and successful realization can bring us face to face with the absolute nature of music.19

There was no transcendental musical object; the musical object arose as a consequence of

the particularity of the investigations and experiments under which material came to be

constituted, given the technological interpretation of the devices by which that material

was generated. As a particularised investigation into musical possibilities of an

articulated technological proposition-given a device and its interpretation-there could

be no rules in the sense of a transcendental notion of music. "Music" per se arose as a

consequence of the performances which a composer took in her/his interpretation: "that

15 Eimert (1959), p. 2. 16 Ibid., p. 53. 17 Ibid., p. 5. 18 Ibid. 19 Ibid.


which normally [belonged] within the scope of theory here [remained] bound up with the

material object."20

In much the same manner that scores such as 26' 1.1499" for a String Player

changed the relation between performer and instrument, so too did the technology of the

electronic music studio transform the relationship between the composer and means by

which musical ideas might be formulated and realized. As already stated, technology is

never neutral; nor is it to be equated with the technical devices which it circumscribes.

The technical devices which populated the Cologne studio, for instance-oscillators,

filters, amplifiers, tape recorders, etc.-were originally discarded radio transmission and

production components. Understood as compositional devices, however, their

equipmental nature was, for a moment, suspended, allowing for the emergence and

gradual appropriation of an entirely new functionality. Thus, music took on epistemic

dimensions as an idea regarding technology.

4.3 The Parameterization of Music and the Programming of Structure

What were the consequences of this new "collision" of technology and music?

One consequence was the parameterization of musical materials and processes. Eimert

reports, for instance, that it was the "triple-unit of the note (frequency, intensity,

duration)" that made possible the techniques of the studio.21 Duration was correlated to

length of tape, frequency and amplitude to easily discretized settings of an oscillator and

amplifier, timbre to manual processes involving mixing and looping, etc. Such an

approach to the generation of musical materials was the only one possible, given the

devices at hand.
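The mapping Eimert describes can be sketched schematically. The following Python fragment is illustrative only: the tape speed of 76.2 cm/s and the particular note values are my assumptions, not historical data from the Cologne studio.

```python
TAPE_SPEED_CM_PER_S = 76.2  # assumed studio tape speed; illustrative only

def tape_length_cm(duration_s: float) -> float:
    """In the studio, a duration is realized as a physical length of tape."""
    return duration_s * TAPE_SPEED_CM_PER_S

# The "triple-unit of the note": each musical parameter is correlated
# with a setting on a concrete device.
note = {"frequency_hz": 440.0, "intensity_db": -12.0, "duration_s": 2.5}
settings = {
    "oscillator_hz": note["frequency_hz"],          # frequency -> oscillator dial
    "amplifier_db": note["intensity_db"],           # intensity -> amplifier gain
    "tape_cm": tape_length_cm(note["duration_s"]),  # duration -> tape length
}
```

At the assumed speed, the 2.5-second note above corresponds to 190.5 cm of tape; the point is not the numbers but that every musical decision becomes a discrete, settable quantity.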

The parameterization of musical forms (both "sound" forms and "musical" forms)

represented an entirely new way in which a composer might come to understand the

potential for their materiality, their evolution, and for their structural interrelation with

other similar forms. This understanding had ramifications for instrumental music as well

as for electronic music. These ramifications were generative in nature, since any

abstraction could be concretized by linking its syntactic unfolding to some parameter of a

musical or sonic process. What underlies compositional procedures as diverse as those

employed by Boulez, Babbitt, Cage, and Stockhausen is the notion that a musical form

could materialize through the concretization of an abstract principle and that, through

such concretization, as yet unimagined models of materials and process might be

articulated.

20 Ibid. 21 Ibid.


This shift in compositional procedure gave the so-called "pre-compositional"

stage of composition a renewed primacy. This new emphasis constituted a reframing of

the musical task environment from one which is determined by historical and cultural

practice toward one which is determined according to a project-specific imperative. Such

a project delineated an individual, and thus highly "subjective", perspective on the nature

of music and the particularity of the problem to be formulated. One way in which this

could be done was by specifying the musical problem as an abstract system, and then

finding means for mapping that system's behavior to declared musical parameters.

Through an emphasis on composing the abstraction, and then the definition of the

"parameter space" by which musical material might be generated and organized,

composers found a way to reframe the compositional task environment itself; to set up

systems for compositional activity which would short-circuit traditional approaches to the

structuring of musical materials and processes. The objective was to formulate a

question, or a problem, such that any possible answer or response would reveal new, as-yet unimagined, possibilities for the synthesis of musical meaning.
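A minimal sketch of this strategy follows; the abstract system (a logistic map) and the mapping onto pitch are my own illustration, not any particular composer's method.

```python
def logistic(x: float, r: float = 3.9) -> float:
    """One step of the logistic map, an abstract system with no musical content."""
    return r * x * (1.0 - x)

# Declare a "parameter space" and map the system's behavior onto it:
# state in [0, 1] -> MIDI pitch number in a two-octave range (illustrative).
PITCH_LOW, PITCH_HIGH = 48, 72

x = 0.123  # arbitrary initial state
pitches = []
for _ in range(16):
    x = logistic(x)
    pitches.append(round(PITCH_LOW + x * (PITCH_HIGH - PITCH_LOW)))
```

The system itself knows nothing of music; only the declared mapping makes its syntactic unfolding appear as musical material.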

This kind of activity delineates a domain of interactions by which a composer

might formulate, organize, and otherwise synthesize musical materials and musical ideas.

To focus on interaction itself (rather than the plans and goals by which it is motivated) is

to hypothesize the material outcome of one's own thought. It is to assert that the

properties of an "object" are not given a priori, but rather arise as a correlate to an

emergent comprehension. Such a comprehension arises as much under the imperative of

the thought that tries to think it as it does under that of the object or thing for which it

tries to account in its thinking. To compose an interface, in the manner in which we are

here speaking, is to delineate a hypothesized comprehension.

The compositional methods of Xenakis exemplify this notion of composition. In

his preface to Formalized Music, Xenakis writes:

The effort to reduce certain sound sensations, to understand their logical causes, to dominate them, and then to use them in wanted constructions; the effort to materialize movements of thought through sounds, then to test them in compositions; the effort to understand better the pieces of the past, by searching for an underlying unit which would be identical with that of the scientific thought of our time; the effort to make "art" while "geometrizing," that is, by giving it a reasoned support less perishable than the impulse of the moment, and hence more serious, more worthy of the fierce fight which the human intelligence wages in all


the other domains-all these efforts have led to a sort of abstraction and formalization of the musical compositional act.22

Xenakis regarded the objectification of music compositional procedure dialectically-that

is, as the expression of an individual subjectivity freed from the historicity of cultural

practice.23 Xenakis carried out this project through the appropriation of a mathematical

model of musical procedure. Through this appropriation, Xenakis designed an approach

to compositional method in which musical material arose from the application of

otherwise abstract operations whose mappings to musical morphologies were the sole

means for the appearance of what might be considered "traditional" music compositional

activity. In this regard, compositional procedure-as a historically bound

procedure-was interrupted, "broken down," in order to allow for the appearance of a

new kind of musical material. Through the use of abstract operations, the possibility of

the introduction of the historical musical imperative was deferred until such time that its

effect could be minimized.

Consider, for instance, the procedure employed in the composition of his

Achorripsis.24 In this composition the composer asks the question "what are the minimal

constraints required for the establishment of musical coherence?"25 From this initial

question, the composer enters into a kind of Platonic dialog with himself, a dialog in

which each aspect of the composition, from the entire work down to each individual note,

is determined from results fashioned in response to a question. Each question is framed

as an operation whose parameters are interpreted as musical data.

The first such question asked is: how are densities of events to be distributed

among the cells which define the entire composition? Xenakis solves this first problem

through the application of the Poisson distribution such that events of zero density (i.e. no

events) define 107 cells, densities of 2.2 events/second define 65 cells, and so on as

follows:

Total number of cells: 196

0 events/second: 107 cells
2.2 events/second: 65 cells
4.4 events/second: 19 cells
6.6 events/second: 4 cells

22 Xenakis (1971), p. ix. 23 Of course, Xenakis was not at all alone in this regard: the entire modernist project in music was focused on this concern. 24 The following analysis is taken from Xenakis (1971), pp. 29-34. 25 Xenakis (1971), p. 29.


8.8 events/second: 1 cell
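This allocation can be reproduced computationally. In the sketch below, the mean density λ = 0.6 events per cell is an assumption on my part; it yields expected counts that match the table once rounded so that the five classes sum to 196 cells.

```python
from math import exp, factorial

CELLS = 196     # the matrix of the composition
LAMBDA = 0.6    # assumed mean number of events per cell

def poisson(k: int, lam: float = LAMBDA) -> float:
    """Probability of exactly k events under the Poisson distribution."""
    return lam ** k * exp(-lam) / factorial(k)

# Density class k corresponds to k * 2.2 events/second in the table.
for k in range(5):
    expected_cells = CELLS * poisson(k)
    print(f"{k * 2.2:.1f} events/second: about {expected_cells:.1f} cells")
```

The expected counts come out near 107.6, 64.5, 19.4, 3.9, and 0.6 cells, which round to the published values 107, 65, 19, 4, and 1.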

The second question asks: given this global distribution, how are the various

densities (five in all) to be distributed within the matrix? Again, Xenakis uses the

Poisson formula; this time, however, he uses it once for each column in order to

determine the density for each cell.

The third question concerns the sonic elements defining each cell. Consider, for

instance, the cell which occupies row three, column seventeen. This cell has a density of

4.5 sounds/measure, and its timbre class is string glissandi. With 6.5 measures per cell,

there are a total of 29 sounds (4.5 sounds/measure times 6.5 measures = 29.25, rounded to 29). Given this,

"how shall we place the 29 glissando sounds in this cell?"26 Xenakis responds with the

specification of a set of seven hypotheses which define:

- the speed of the glissando
- the starting and ending points of glissandi
- the duration of glissandi
- the frequencies of glissandi

From these, Xenakis draws up three tables of probability:

- a table of durations
- a table of speeds
- a table of intervals (between beginning and ending of each glissando)

From these tables, the composer may select freely the materials constituting the cell as

long as her/his selections follow the guidelines given by the data in the tables. As such,

the constraints represented in the table

are more of a general canalizing kind, rather than peremptory. The theory and the calculation define the tendencies of the sonic entity, but they do not constitute a slavery.27
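The step from hypotheses to tables of probability can be sketched as follows. The particular distributions below (exponential durations, half-normal speeds) and their parameters are my assumptions for illustration, not the tables Xenakis actually derives in Formalized Music.

```python
import random

random.seed(0)  # reproducible illustration

N_SOUNDS = 29        # the 29 glissandi of the cell discussed above
MEAN_DURATION = 1.0  # assumed mean duration, in arbitrary units

# "Tables of probability": distributions that canalize, but do not dictate,
# the composer's choices.
durations = [random.expovariate(1.0 / MEAN_DURATION) for _ in range(N_SOUNDS)]
speeds = [abs(random.gauss(0.0, 1.0)) for _ in range(N_SOUNDS)]  # assumed half-normal

# The composer then selects freely among candidates consistent with the tables;
# here each duration/speed pair implies an interval traversed by the glissando.
glissandi = [
    {"duration": d, "speed": s, "interval": d * s}
    for d, s in zip(durations, speeds)
]
```

The theory and the calculation define tendencies of the sonic entity; the final selection within these canalized ranges remains the composer's.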

By the time the composer has reached this stage of the compositional process,

however-a stage in which her/his freedom to select materials is almost completely

unrestrained-her/his task environment has been transformed. Consequently, the

constraints which determined the choices made earlier remain active even when he is

26 Ibid., p. 32. 27 Xenakis (1971), p. 34.


most "free" to choose for himself. This is an important aspect of the compositional

procedure which Xenakis, along with many other composers, formulated. The purpose of

the method was to find a way to subvert the tendency of the individual "I" to resort to, and

appropriate, material which belongs to a cultural and historical "I"-to, in effect,

emancipate the freely emergent and individual subject from the epistemological shackles

of the historical subject.

By problematising the working environment-by disrupting the machinery by

which a composer might normally work-the subject is allowed to emerge in its self-reflective immanence, freed from its determination as transcendent subjectivity. By

causing the conditioning of an environment, a composer engenders a situation in which a

subjectivity, which might otherwise lie submerged beneath habit and acculturation,

emerges as a consequence of its own effort to comprehend and synthesize its own equally

emergent experience. Such a compositional method does not hamper or constrain

compositional procedure. Quite the contrary: it engenders the conditions under which a

composer is freed from the automaticity of habitual and historical decisions.

My freedom will be so much the greater and more meaningful the more narrowly I limit my field of action and the more I surround myself with obstacles. Whatever diminishes constraint, diminishes strength. The more constraints one imposes, the more one frees one's self of the chains that shackle the spirit.28

Composition comes to include composing the very conditions-cognitive,

epistemological, equipmental-by which music compositional procedure per se might be

enacted.

28 Stravinsky (1970), p. 65.


5. From Programmed Structure to the Programming of Interaction

As stated in the introduction of this study, the computer can be understood as a tool

which extends a human's capacity for self-reflective activity. Self-reflective activity is

that activity by which one's internal processes become externalized in some fashion-an

activity by which a subject makes an effort in its comprehension of an object that it

conceives of as other to itself and, in that effort, generates its own projection as other. As

has been argued throughout this study, such externalization of one's internal processes

requires an environment in which the normal course of events is, in some way,

problematised. Through problematisation of the normal course of events, a breakdown in

what Heidegger called "circumspective" being is generated, throwing both object and

subject into relief. As a result, subjectivity is reconstituted along the lines of an as-yet

incomprehensible presentation. In the effort to formulate its own comprehension, the

subject must alter the structure of its own occurrence. By this process, a

thing-understood as foreign, as other-acts as a kind of "catalyst" for the emergence of

thought. Such an emergence is really always a re-emergence; seeing its environment in a

way in which it had not seen it before, a thinking subject can choose to act in a manner

which is free, at least for a moment, from its own culturally assimilated history of

interactions.

With the computer, one can formalize the conditions which might allow such an

emergence of the thinking subject. With the computer, a human can construct an

epistemological domain of interactions which calls forth the reformulation of her/his

thoughts and actions. With a newly framed domain of interactions, subjectivity, as a

complacent and circumspective phenomenon, becomes problematic; it can no longer cope

with the manner in which ideas and objects are hypothesized. Such was the case with

VisiCalc's REPLICATE feature: it generated the presentation of a possible interaction

which had not yet been presented. Through such an unprecedented presentation, human

thought was allowed to think the conditions of the problem in a manner never before

possible.

In a similar fashion, the use of the computer in music composition allows us to

ask questions and to formulate the situation of a subject/object dialectic in a manner not

possible without the computer. It is my view that this constitutes the most significant

imperative for the use of the computer. For, as Otto Laske observes, the computer forces


musicians to focus on the pro-active, rather than re-active, aspect of their activity, [giving] them a chance to choose, rather than suffer, their processes.29

For this reason,

[t]he computer has changed the potential of music theory since, for the first time, it has given composers a tool for capturing their processes, and for articulating a theory of music based on their knowledge of compositional planning and problem solving [as distinct from their knowledge of historical musical artifacts].3°

The computer has changed the compositional landscape because, with computer

programs, it has become possible not only to state explicitly the hypotheses of musical

structure but to define the domains of interactions within which a composer might

formulate such hypotheses.

5.1 Computer Music: Two Traditions

Within the history of computer music, two disparate traditions have emerged:

sound synthesis and computer-assisted score generation. Not only do these two domains

of activity trace a different tradition-the former concerned exclusively with digitally

synthesized sound, the latter primarily with instrumental music-they articulate two very

different paradigms. Digital sound synthesis can be characterized by the following

research orientation:

• focus is exclusively on the computation of single sounds;

• computation of sound is based primarily on historical models of natural

acoustic phenomena and the processes by which those phenomena can be

generated and transformed;

• separability of individual sounds and the context in which they might occur.

By contrast, computer-assisted score generation was originally characterized by:

• exclusive focus on computation of instrumental musical scores;

• computation of musical structures based primarily on compositional and not

veridical criteria;

29 Laske (1991), p. 236. 30 Laske (1989), p. 46.


• lack of concern for the evolution of individual sounds themselves; focus upon

a model of composition which derives from the tradition of concert hall music.

Classical sound synthesis has its historical basis in the field of acoustics:

composers began working with it somewhat later than did scientists. By this historical

derivation, it appropriated many aspects of the largely objectivist framework of 19th

century scientific method. As Marc Leman notes, Hermann Helmholtz "exchanged

intuition by rigorous scientific methods and made a clear separation between art and

science."31 For Helmholtz, the separability of the acoustical event and the context of its

occurrence was a necessary form of data reduction by virtue of the sheer complexity

presented by incorporating a more holistic approach. Many of the objectivist

underpinnings of sound synthesis were carried over into computer music languages such

as Music V and, later on, Csound.32

By contrast, computer-assisted score generation had its ideological underpinnings

in the American experimental music tradition, in the Second Viennese School, and in

systematic compositional methods of the European serialists. It was not, for instance, a

large step from the compositional method employed in Achorripsis to that employed in

the composition of ST/10-1, 080262, made with Xenakis' stochastic music program.33

Computer-assisted composition was, for the most part, a research project carried out by

composers who saw it as a way to expand their ability to formulate and control the

unfolding of musical generative processes.

In spite of their obvious differences, sound synthesis and computer-assisted score generation have in common that they both reflect a dualistic notion of musical structure

by which timbre computation and score generation are understood as separable domains

of musical activity. This can be contrasted with the view which Koenig articulated in his

composition of Terminus, as described above, in which the large-scale architecture of the

work was predicated upon the formal structures evolved within individual sounds.

Although Terminus was composed without the aid of a computer, the computer can be a

catalyst in bridging the gulf between sound synthesis and music composition. As a

catalyst, the computer can be used to bring about an unprecedented framework for

formulating interrelationships between higher level musical ("syntactic") structure and

lower level timbral structures, and for formulating the generative processes by which

these interrelations can be imagined and realized. The computer can facilitate the

31 Leman (1995), p. 10.
32 Csound is arguably the most used language for sound synthesis, with the possible exception of Max.
33 In fact, it was exactly the same paradigm.


delineation of a domain of interactions in which the generation of sound materials and

the generation of high level musical events are inseparable aspects of an overall

compositional procedure. In order to accomplish this, it becomes imperative that

composers explicitly identify the historically and culturally determinative models of

compositional procedure which, for the most part, lie hidden beneath and within the task

environments which composers use. By exposing the epistemological bias which

underlies common musical interfaces, we can begin to understand compositional

procedure as a free agency which reflects the particular subjective means by which

musical and sonic structure might be conceived and its design realized.

5.2 Rule-based vs. Example-based Models of Compositional Procedure

I distinguish two attitudes regarding music and timbre composition: one

attitude understands music and timbre composition as a synthetic process while the other

understands it as an artificial process. I differentiate the terms synthetic and artificial

roughly following Herbert Simon's formulation.34 With the term synthetic I refer to man-made objects in which various components are gathered together in order to resemble, in

as detailed a fashion as possible, the inner structure of an already existing phenomenon.35

The structure of the synthesized object results from stipulating an isomorphism between

that object and the "thing" or "phenomenon" it simulates. Thus, in the case of sound

synthesis, mere simulation of a single class of sound is not enough. For instance, the

ability to produce an A-440 trumpet tone through FM synthesis is not a sufficiently deep-level simulation: the model must, from a single algorithm (though with different data), be

capable of producing all the tones within the trumpet's range. This concern for deep-level

synthesis is carried even further in physical models of natural phenomena such as musical

instruments, where the efforts toward synthesis concern modeling of the physical

mechanisms themselves.36
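The point about a single algorithm covering a whole instrumental range can be illustrated with the classic FM synthesis formula (Chowning). One function, fed different data, yields different tones; the sketch below is illustrative only, and its parameter values (the modulation ratio and index) are placeholders rather than a validated trumpet patch.

```python
import math

def fm_tone(carrier_hz, ratio, index, dur=0.5, sr=44100):
    """One FM 'algorithm', many tones: y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t)).

    The modulator frequency fm is tied to the carrier by a fixed ratio, so the
    same code, given different data, produces tones across an entire range.
    The ratio and index values used below are illustrative assumptions.
    """
    fm = ratio * carrier_hz
    n = int(dur * sr)
    return [math.sin(2 * math.pi * carrier_hz * t / sr
                     + index * math.sin(2 * math.pi * fm * t / sr))
            for t in range(n)]

# The same algorithm, fed different data, covers different pitches:
a440 = fm_tone(440.0, ratio=1.0, index=5.0)   # an A-440 "brass-like" tone
e660 = fm_tone(659.3, ratio=1.0, index=5.0)   # the same patch, a fifth higher
```

The deep-level claim in the text is precisely that `fm_tone` itself, not any one output of it, constitutes the simulation.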

With the term artificial, I refer also to man-made objects. Here, however, the

impetus for creating the object comes exclusively from a desire for a particular human-made design and not from the simulation of a natural thing. While the structure of the

object may be bound by certain laws as defined within the natural sciences, it differs from

a synthetic object precisely because it functions in relation to a particular context which is

defined according to a human-made design. So, for instance, the same synthesis

34 cf. Simon (1969).
35 This interpretation of the term is not inconsistent with its use in other branches of engineering. For instance, logic synthesis concerns the implementation of a logical model as an electrical circuit.
36 Indeed, in the case of physical models, we can refer to modeling the behavior of natural phenomena themselves and not only the behavior of individual acoustic instances.


technique-FM-would be understood, not as a mechanism for the simulation of a

natural object (such as a trumpet tone), but as the delineation of an interaction through

which larger-scale compositional algorithms are linked to lower-level sound designs.37

The objective of human activity then becomes not so much the simulation of already

extant phenomena, but rather the concretization of otherwise abstract determinations.

Another way to articulate this difference is to say that the artificial presupposes a

focus on rule-based models of composition and sound design, while the "synthetic"

presupposes a focus on example-based models of composition and sound design. An

example-based approach is based on models of "existing musics."38 As models of

existing musics, example-based models foreground the primacy of "listening" over

conceptualization and model-building. The examples that are reactivated in memory "are

reconstituted from prior experience."39 Such experiences are, in turn, "de-compiled into,

and conceptualized in terms of, objects."40 Such objects represent "data models, not

process models."41 Example-based models of the compositional process take seriously

the notion that composers learn how to compose by studying and internalizing the

compositional techniques evinced by the great works of the past and, in doing so, form

"personal scripts," or "behavioral templates known as style" which constitute the

knowledge base used for the composition of new works.42

By contrast, a rule-based approach to composition is based on "an awareness, if

not an analysis, of compositional processes."43 As process models, "the focus of attention

is on a sequence of steps, or a set of decision rules."44 The objects of concern are not data

objects but rather procedural objects.45 This is not to suggest that data objects don't exist;

rather, they are subsumed under the contingencies of a process model. Rule-based

models of composition work by problematising the musical task environment-by

situating the composer, as a historical cognitive entity, within an environment which is

unfamiliar and essentially "non-historical." Such an approach forces the composer to

imagine and articulate problems in a manner which is directed toward the concretization

of an abstract idea.

37 Truax (1985).
38 Laske (1991), p. 238.
39 Ibid.
40 Ibid.
41 Ibid.
42 Laske (1989), p. 48.
43 Ibid.
44 Laske (1991), p. 238.
45 Ibid.


Herbert Brun articulates the difference between example-based and rule-based

models of composition:

It is one thing to aim for a particular timbre of sound and then to search for the means of making such sound timbre audible. It is another thing to provide for a series of events to happen and then to discover the timbre of the sound so generated. In the first case one prefers those events to happen that one wishes to hear; in the second case one prefers to hear those events one wishes would happen.46

In the first case, a composer's activity is modeled primarily upon listening for a structure.

In the second case, a composer's activity is modeled primarily upon listening from a

structure. These two different approaches, when rendered as computer systems, articulate

two different models of human cognition and epistemology-a topic to which I shall

return shortly.

5.2 The Compositional Life Cycle

Laske views the compositional process as a "compositional life cycle" which can

be depicted as shown in figure 5.1. Laske summarizes aspects of a generic life cycle as

follows:

According to the generic view of a composition's life cycle, the composer starts with a (fuzzy) design idea which leads him to generating (parametrical) data that, once fully grasped by him (data model), initiate a design process, first general, then more and more detailed, which leads to a definitive score.47

The compositional life cycle describes anything from the process of designing a

single sound to that of designing an entire family of compositions. The life cycle

comprises four basic stages: precomposition, epistemology, design, and

implementation.48 For the purpose of the current study, it is the epistemological stage

upon which the crux of the model rests. The epistemological stage is that stage during

which a composer attempts to account for the data, generated during the precompositional

stage, according to an as-yet emergent design. At this stage, whatever constitutes an

initial design impinges directly on how the composer comes to understand and otherwise

interpret the data. The transition from data to design, and from design to data, is a non-

46 Brun (1969), p. 117.
47 Laske (1989), p. 48.
48 Ibid.


linear one.49 It is this stage of the compositional life cycle which is most often, and most

deeply, obscured by historical practice and cultural habit. Even within many computer

music systems, the epistemological stage lies hidden within the organization of

"representations" and "metaphors" which constitute a computer interface.

[Figure 5.1: The "Compositional Life Cycle": a diagram connecting the precompositional level (data) through an epistemological level (data model and design model, linked by steps M1 and M2) to the design level and, finally, compositional realization.]

By exposing the epistemological stage of the compositional life cycle, computer

music systems enable the design of processes in which a composer can examine, in an

articulated manner, her/his own design procedure.

These are aspects of ordered processes that exist in the dynamic relationship of thinking and acting, cycling and transforming, generated across the moving, fuzzy boundaries between inner and outer, subject and object.50

The compositional life cycle becomes as much a description of the composer as it is a

description of the artifacts s/he composes. As such, the compositional life cycle traces a

process of enactment: the delineation of a conceptual environment which orients a

composer toward the formulation of a design without necessarily specifying that design.

49 Ibid.
50 B.C. Goodwin as quoted in Laske (1991), p. 243.


The computer becomes, in this regard, the composer's "alter ego" allowing the composer

to specify the interactions by which s/he comes to hypothesize materials as well as the

forms by which their concretization is realized. From this point of view, the computer is

more than merely a tool which actuates a performative dimension of human activity-it is

more than merely a fancy calculator or number cruncher. Rather, it is a tool by which

human self-reflective processes can be explicated and thus observed and investigated.

5.3 Example-based Composition and the Primacy of Listening

Nevertheless, most computer software systems for music composition tend to

reflect example-based notions of compositional procedure. Within the domain of sound

synthesis, for instance, the process of composing sound is most frequently referred to as

"sound design" or "timbre design." Within the domain of computer-assisted composition,

various generative procedures are separated from their original framework and used for

the sole purpose of generating decontextualized "material". Both approaches tend to

foreground the primacy of "listening." Such listening can be done by a human auditor

with the assistance of analysis tools. In this case listening, as a phenomenological

dimension, is constrained by the paradigm of analysis; human listening becomes an

extension of the analytical device.

When listening becomes the dominant point of access between a composer and

her/his material, then the primary selective criterion becomes an evaluative one based on

the judgment of the veridicality of the output. If the output matches what is expected, to

within an acceptable degree of error, then it is judged to be "good"; otherwise, it is judged

to be "bad." Such criteria have little to do with potential design issues related to a larger

musical or environmental context. Moreover, there is no means for the independent

evaluation of one's judgment, which, as such, boils down to a matter of "I like this"/"I

don't like that." Synthesis models are, thus, essentially "data models"-"they objectify

elements of information without tagging the knowledge processes that use that

information."51 Consequently, it is difficult, if not impossible, to explicitly trace the

process by which this or that sound morphology is constructed and, as such, it is

impossible to hypothesize the abstraction from which other morphologies might be

structured.

As an example of a software system in which listening is the prominent mode of

interaction, I offer a brief overview of ISEE (Intuitive Sound Editing Environment), which

was designed and written by Roel Vertegaal and Ernst Bonis. ISEE, as the authors

51 Laske (1991), p. 238.


describe it, "is a general sound-synthesis tool based on expert auditory perception and

cognition of musical instruments."52 As the authors tell us, ISEE is offered as a remedy to

the standard synthesizer interface which most often consists of the presentation of the

synthesis model's "inner parameters."53 In the usual case, those inner parameters

constitute the interface objects. Take, for instance, FM synthesis on the Yamaha DX7:

the control interface consists of the deep-level FM synthesis parameters themselves,

rather than being based on a model of timbre design.54 Of course, this situation is not

intrinsic to FM synthesis; rather, it results from a more general approach to interface

design in music synthesizers.55

The authors of ISEE propose a different approach, one which is based on the

principle of "direct manipulation." Direct manipulation has the following properties:

- the continuous representation of the object of interaction;

- physical actions with graphically rendered objects rather than complex syntax;

- immediate update of objects in response to user actions.56

Perhaps the most ubiquitous example of direct manipulation is the mouse cursor on the

computer screen: move the mouse this way and the cursor moves; move the mouse over

certain parts of the display, and the shape of the cursor changes, thereby indicating a new

range of available actions, etc. With direct manipulation, "physical action is used to

manipulate the objects of interest, which in turn give feedback about the effect of the

manipulation."57 Thus, direct manipulation focuses the interaction on task-related

semantic knowledge, rather than on how the resulting operations are manifested at deeper

layers of the computer software or hardware.

As Vertegaal argues,

A first step in making the user interface of a synthesizer more intuitive is to provide a more direct mapping between task-related semantics ("I want to make a sound brighter") and synthesizer-related semantics ("Then, I need to change the output level of the modulator or the feedback level or both") .... A second step is to simplify syntax by reducing the number of actions needed to reach a specific goal, making the physical action more directly mappable onto sound features. 58

52 Vertegaal (1994), p. 21.
53 Ibid.
54 Ibid.
55 Ibid.
56 Hutchins (1986), p. 90.
57 Vertegaal, op. cit., p. 22.
58 Ibid.


The means for such simplification can be found by mapping high-dimensional synthesis

parameters to low-dimensional control parameters.59 ISEE accomplishes this by

instantiating a control space based on a taxonomy of instrument types. This taxonomy is

activated through the definition of four abstract control parameters: overtones, brightness,

articulation, and envelope. The taxonomy is based on "an expert analysis [which the

authors] performed of existing instruments using think-aloud protocols, card sorting, and

interview knowledge-acquisition techniques."60 Figure 5.2 depicts part of their

taxonomy.

[Figure 5.2: ISEE Hierarchical Model of Timbre Design]

In this example, the first criterion for the definition of a timbre is that of its envelope.

The second criterion is the harmonicity of the spectrum. After this, the applicable criteria

vary, depending on the timbral criteria established thus far.61

One implementation of ISEE is a Macintosh-based software system and consists

of two applications. The first of these is the Control Monitor: it is "used to control and

monitor the positioning within the current instrument space and within the instrument

space hierarchy."62 This application consists of two windows. The top-most window

provides for the navigation through the instrument hierarchy and displays at which level

in the hierarchy one is currently positioned. For example, one may be currently

59 Wessel (1979).
60 Vertegaal, op. cit., p. 24.
61 Ibid.
62 Ibid., p. 27.


positioned over the "Bowed" level, meaning that whatever changes one executes will be

made to this level (discussion of how such changes can be made will be presented below).

Within this application, one can "zoom in" to a lower level of the hierarchy. Such

a zooming in would allow one to move, say, to a violin or cello timbral definition. In

addition, one could also zoom out, thus moving to a higher level in the hierarchy, which

in this case would be a move to the "Harmonic" level of the taxonomy.

The second window of the Control Monitor application consists of two smaller

windows. In each of these windows, one can change values for any of the four

parameters (i.e. overtones, brightness, articulation, and envelope). Each window defines

an X-Y coordinate system so that, by moving the mouse cursor within a particular

window, one can effect changes along both the x- and the y-axis. For the first of these

windows, the y-axis determines the Overtones parameter, while the x-axis determines the

Brightness parameter. For the second window, the y-axis determines the Articulation

parameter, while the x-axis determines the Envelope parameter. These two windows

taken together define a 4-D parameter control space.

The second application is the Interpreter application. The Interpreter application

"translates the 4-D locations it receives from the Control Monitor to corresponding MIDI

synthesizer parameter data. "63 The Interpreter includes an "instrument space editor."

This editor allows for the specification of new instruments by associating user interface

control parameters with corresponding MIDI synthesizer settings.
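The division of labor between the 4-D control space and the Interpreter can be sketched as follows. ISEE's actual translation tables are not specified here, so everything in this sketch is a simplified stand-in: the parameter names and ranges of the hypothetical "Bowed" instrument space are invented for illustration, and each control axis drives one synthesizer parameter linearly, which is surely cruder than whatever mapping the real Interpreter performs.

```python
def interpret(point, space):
    """Translate a 4-D control location into synthesizer parameter values.

    `point` is (overtones, brightness, articulation, envelope), each in 0..1.
    `space` maps each synth parameter name to (controlling axis, low, high);
    the names and ranges below are hypothetical, not ISEE's actual tables.
    """
    axes = dict(zip(("overtones", "brightness", "articulation", "envelope"), point))
    settings = {}
    for param, (axis, low, high) in space.items():
        x = axes[axis]
        settings[param] = round(low + x * (high - low))  # MIDI-style integer value
    return settings

# A hypothetical "Bowed" instrument space: one synth parameter per control axis.
bowed_space = {
    "modulator_level": ("brightness",   0, 127),
    "feedback_level":  ("overtones",    0, 99),
    "attack_rate":     ("articulation", 10, 127),
    "release_rate":    ("envelope",     10, 127),
}

print(interpret((0.5, 1.0, 0.0, 0.25), bowed_space))
```

The point of the sketch is the architecture, not the numbers: the user moves in a small, task-semantic space, and a separate interpretive layer owns the mapping to device-level parameters.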

ISEE, like other direct manipulation GUIs, employs a whole host of interface

metaphors in order to accomplish its aims. The prevailing interface metaphor in this case

is the notion that timbres can be organized as hierarchies based upon orchestral

instrument models. Another prevalent metaphor is the notion that one can think of timbre

groupings as similarity matrices. Also, there is the notion that one might specify timbre

by manipulating four parameters-brightness, articulation, overtones, and

envelope-based largely on Helmholtzian and orchestral instrument models of sounds.

Finally, and perhaps most significantly, ISEE models the timbre design process as that by

which sounds can be designed in isolation from the conceptual and physical environment

of their appearance.

As with any software system, ISEE is as much a model of the human who uses it

as it is a model of the process s/he uses to generate desired artifacts. Within ISEE, the

focus of interaction centers upon evaluative listening. "Evaluative listening" represents a

mode of observation in which one already has in mind what one wants (or does not want)

63 Ibid.


and, based on that knowledge, makes judgments regarding the outcome of an interaction.

With evaluative listening, criteria for the judgment of an outcome tend to remain fixed.

Consequently, one's interaction with the system is founded on a performative basis. On

such a basis, one interacts with a system in order to get it to do what one wants. An

effectively designed system, as such, allows the human to perform the minimum number

of actions necessary to accomplish a priori goals.

ISEE is a tool for composition as production. In composition as production, the

focus is on producing a desirable result; process is merely the means by which that result

is reached. Anything which might potentially complicate, or otherwise disrupt, the

process is to be avoided. The crux of the interface is in encapsulating a familiar (i.e.

historical or "intuitive") environment which is conducive to productive activity. The

emphasis on "encapsulating a familiar environment" is key: unfamiliarity is an unwanted

consequence in this case. Such an interface promotes a state of circumspective being.

This state of being, it will be recalled, is one in which the objects of an interaction are

absorbed in their function, effectively disappearing in the equipmental properties which

they assume through habitual practice.

5.4 Rule-based Composition and the Contingency of Listening

As long as the computer is used as an effective tool in creating an environment for

"intuitive" compositional or timbre design production, the potential for its use as "an

extension of human self-reflective activity" remains unrealizable. Under its interpretation

as a tool for compositional production, the computer can be used in the generation of

objects, but not for the investigation of what those objects might mean vis-a-vis a set of

contextual criteria.

Interpreted as a tool for compositional research (as distinct from compositional

production), the computer becomes a tool not only for the generation of objects but for

investigation and articulation of the processes by which those objects become meaningful

given some set of criteria. It becomes feasible to have an explicit trace of one's processes

and to become informed from observation of those processes. Toward the realization of

this possibility, composers have developed sound synthesis software systems in which

one specifies sound structures by specifying the processes by which they are formulated.

By specifying sound morphologies through articulation of a process, a composer sets up

an interaction which is as much informed by observation of the behavior of the process as

it is by the behavior of the outcome. This manner of interaction is enabled only by the

proclivity of the computer toward symbolic representations.


One of the earliest experiments of this sort was engineered by G.-M. Koenig in his

SSP program. With SSP, a composer essentially describes a composition "as one single

sound, the perception of which is represented as a function of amplitude distribution in

time as sound and silence, soft and loud, high and low, rough and smooth."64 Using this

program, one composes sounds by specifying data formulations by which everything from

individual sounds to aggregations and patterns of sounds to entire compositions are

fashioned. Individual elements of a waveform are generated by the same procedures as

are patterns of events and even entire sections of a work.

Through the application of specified procedures for transformation and variation

to the appearance and evolution of a waveform, the composer can link an explicitly stated

procedure for transformation with results of that transformation. Other projects which

maintain this linkability include Paul Berg's PILE, and Herbert Brun's SAWDUST. PILE

is a computer language for sound synthesis.65 As Berg notes,

PILE instructions are based on groups of machine operations, not on a particular acoustical model. Parameters such as frequency, timbre, envelope, and duration are not specifically referenced. Rather, the available instructions fall into the following categories: manipulation of the accumulator, manipulation of external devices, manipulation of variables, manipulation of lists, and manipulation of program flow.66

In Brun's SAWDUST, the lowest-level composible component is called an "element." An

"element" consists of a sequence of amplitudes, all with the same values. Through

various operations--concatenation, mutation, mixing, etc.-which the composer can

specify, elements are linked in order to produce waveforms, events, or entire

compositions.67
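Brun's element-and-operation scheme can be sketched in miniature. The function names below (`element`, `link`, `mix`) echo the operations named above, but their signatures and the sample values are illustrative assumptions, not SAWDUST's actual interface; the point is only that the same few operations build waveforms, events, or whole sections.

```python
def element(value, length):
    """A SAWDUST-style 'element': a run of samples all sharing one amplitude."""
    return [value] * length

def link(*parts):
    """Concatenate elements (or previously linked chains) into a longer chain."""
    return [s for part in parts for s in part]

def mix(a, b):
    """Sum two chains sample-by-sample, padding the shorter with silence."""
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

# One cycle of a rectangular wave, built from two elements, then repeated:
cycle = link(element(+8, 6), element(-8, 6))
wave = link(*[cycle] * 100)   # an "event": 100 cycles of that waveform
```

Because `link` and `mix` apply equally to a twelve-sample cycle and to a minute-long chain, the same procedural vocabulary spans the sample level and the formal level, which is exactly the continuity the text attributes to these systems.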

Another more recent example of such an experiment in linking music

compositional procedure and sound design is Kirk Corey's Ivory Tower.68 Ivory Tower is

a program plus a basic hardware configuration targeted for low-cost Intel PCs. The

software component is an interactive environment in which an experimenter specifies

basic boolean operations within eight input bins. These are referred to as "sub-routines."

The hardware component is comprised of a printer cable whose eight output data wires

are attached to eight input channels of a stereo mixer, or any other playback system.

64 Koenig (1978) quoted in Berg, et al. (1980), p. 26.
65 Berg (1987), p. 160.
66 Ibid.
67 Blum (1979), p. 6.
68 Corey (1997). A recording of a work made with Ivory Tower is available in Corey (1992).


Within the software component, the boolean operations that are specified in each

of the eight sub-routine bins generate 1-bit patterns (patterns comprised of 0s and 1s only)

at a particular rate. All eight streams of 1-bit patterns are sent to the printer port. The 1-bit patterns that are sent to the printer port are converted to 5-volt analog signals. In

normal use, these signals are typically sent to an on-line printer, or sometimes to an

external device. In Ivory Tower, they are sent to individual audio inputs of the mixer.

The result is eight channels of audio mixed down to stereo. Since signals are converted

from 1-bit patterns representing either 5 or 0 volts, all waveforms are rectangular.

What distinguishes Ivory Tower from other composition systems is the nature of

the feedback correlating a composer's inputs and her/his observations of resulting outputs.

The user interface provides for entry of the following information:

1. How many times should the program loop (total duration of output)?

2. How much delay should be added in a loop?

3. How often should each of 8 sub-routines bins be executed (frequency

factor)?

4. What are the opcode (assembly language) sequences which define each

sub-routine bin?

The opcode sequences for each of the sub-routine bins determine what the output signals

will be. A master loop steps through each opcode for each sub-routine bin in order,

according to the frequency factor defined for each. Static timbres result from statically

defined opcode sequences. In such a case there are no event-level sequences; only a

single event. Event ("symbol") level differentiations are introduced, however, when

opcodes that are entered within a sub-routine bin cause an opcode within that bin or

within another bin to be changed. When this happens, a system that was previously linear

can suddenly become non-linear.

In order to understand how the system works, we consider the following

experiment. First, we begin with an initial setup of the input data:

1. Number of program loops: 640,000 (about 30 seconds).

2. Amount of delay: 1 loop

3. Frequency 1 = 10; this means that sub-routine 1 will be executed once on

every tenth loop. All other subroutine bins will remain silent.

4. Sub-routine bin #1 contains a single instruction "XOR A, 1" which means

"let accumulator A equal itself exclusive-ORed with 1."


The program will produce a single square wave at 4.7 kHz.69 Now we add a second

channel by setting its frequency (keep in mind that the term frequency here does not refer

to an acoustical feature of the resulting waveform but rather to the frequency at which the

sub-routine bin is called), say, to 11. Then, we assign the sub-routine bin a single

instruction; in fact the same instruction as was assigned to the first sub-routine bin: "XOR

A, 1." At first, one might think that this would result in two square waves, one

at approximately 4.7 kHz and the other at approximately 4.2 kHz. However, closer

examination (and auditioning the result) would reveal that since both sub-routine bins are

altering the data at the same memory location, a rather complex waveform would result

whose "frequency" would be 10 * 11 or equivalent to an audio frequency of

approximately 427 Hz. However, since we are applying the exclusive-OR operation with

1, only that bit of the accumulator which is associated with channel 1 of audio would be

affected. All other channels would be zero. Figure 5.3 shows a single period of the

resulting waveform, with the sample numbers shown.

[Figure 5.3: A single period of the resulting waveform, with sample numbers (10 through 110) marked along the horizontal axis.]
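The two-bin experiment can be simulated in a few lines. This is a hedged sketch, not Corey's 80x86 code: it assumes each bin's only instruction is "XOR A, 1", that bins execute sequentially within each loop, and that bit 0 of the shared accumulator is what reaches channel 1. In this idealized model the pattern of toggle events repeats every 10 * 11 = 110 loops; but because those 110 loops contain an odd number of toggles (11 + 10 = 21), the waveform itself only repeats every 220 loops.

```python
def simulate(bin_rates, loops):
    """Idealized sketch of Ivory Tower's master loop.

    Each entry of `bin_rates` is a frequency factor: that sub-routine bin
    runs on every k-th loop, and here every bin simply executes "XOR A, 1",
    toggling bit 0 of a shared accumulator. The returned bit stream is what
    would reach channel 1 of the printer port.
    """
    acc = 0
    out = []
    for n in range(loops):
        for k in bin_rates:
            if n % k == 0:        # this bin fires on every k-th loop
                acc ^= 1          # XOR A, 1
        out.append(acc & 1)
    return out

def period(bits):
    """Smallest p > 0 with bits[n] == bits[n + p] over the sampled run."""
    for p in range(1, len(bits) // 2):
        if all(bits[n] == bits[n + p] for n in range(len(bits) - p)):
            return p
    return None

print(period(simulate([10], 2000)))       # one bin:  a square wave, period 20 loops
print(period(simulate([10, 11], 2000)))   # two bins: period 220 loops
```

Running the sketch makes the text's point concrete: because both bins alter the same accumulator bit, adding a second bin does not superimpose a second square wave but restructures the single waveform, and the composer learns this by observing the behavior of the process rather than by specifying the result.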

Next, we add an additional instruction to sub-routine bin #1; for instance, an

instruction which would add 1 to the accumulator "A". As such, sub-routine #1 contains

an instruction sequence of two instructions: exclusive-OR the accumulator with 1; then

add 1 to the accumulator.

69 This is based on a 386 CPU running at 24 MHz.


With a relatively small number of instructions, a system is created which

generates considerable variety. In this system, the traditional separation between data and

instruction is blurred. It is this blurring of the separation between data and instruction

which allows for the integration of high-level (syntax level) compositional organizations

and low-level (sample level) organizations. Periodicities at the signal level result from

"eigen-vectors" which are interrupted when some significant instruction is altered during

run-time. This interruption abruptly alters the state of the system, resulting in correlated

alterations in the waveforms it generates.

Ivory Tower constitutes an experiment in which the environment for composition

itself becomes a composible medium. Upon first sitting down with Ivory Tower, one may

be baffled. However, armed with basic knowledge about how the program works, and

some knowledge of opcodes for the 80X86 processor, a composer begins, tentatively at

first, to tinker around and eventually to conduct actual experiments. Over the course of

time, "integrities" begin to present themselves. These "integrities" reveal the

consequences of one's actions and, as such, the behavior of the system one is

constructing. Such integrities arise, however, as a consequence of a particularized

interaction; they do not manifest properties of Ivory Tower. In specifying sequences of

op-codes for each bin, a composer essentially hypothesizes a condition for the generation

of musical structure; s/he does not specify those musical structures themselves. At this

level of "remove," a composer's observations take on a different character. Criteria for

judgment are no longer so much evaluative as they are informative, in that the results of

the system which the composer is engaged in constructing reveal something about the

behavior of that system as well as revealing something about the nature of sound and

music.

Since the system operates in real-time-and since changes in op-code sequences

can be effected with relatively fast turn-around time-a composer's observation as a

listener is occasioned by the enactment of those hypotheses. Listening functions as part

of "the system", rather than as the exercise of a judgment made from the outside of that

system. In such an environment, one essentially composes the very media of one's

interactions.

Ivory Tower problematises interaction by disassociating compositional procedure

from historically determined practices. In this regard, Ivory Tower functions, for the

composer, in a way which is similar to the way in which Cage's score functions for the

performer. In both cases, the musical outcome is contingent upon the particularization of

a human performance, while, at the same time, human performance is conditioned by the

musical outcome. As a task environment it engages a question to which Pollock's method


of painting was an answer: how can I reconstitute my working environment such that I am

free to discover new "truths" regarding the materials/form process, i.e., the process of

composing? In the task environment which results from the posing of such a question, a

composer enters into a feedback structure which is hermeneutic, one which determines

the composer as much as the composer determines it.

Composition systems like Ivory Tower, PILE, SAWDUST, SSP, and many others

are novel precisely to the degree to which they amplify and foreground this hermeneutic

and dialectical dimension of interaction.


6. Computers, Composition, and the Hermeneutics of Interaction

In attempting to formulate a way of thinking about music composition and

human/computer interaction we might start by asking: in composing music with a

computer, what is the nature of our communication with the machine? Laske might tell

us that the computer becomes "an extension of ourselves" once we begin to program it,

noting that "writing a program for a computer . . . is a metaphorical expression for

'programming ourselves,' or a part of ourselves, viz., of our understandings."70 Further,

Laske would suggest that communicating with a computer is tantamount to

communicating with ourselves:

Why would it be of interest to enter into communication with a 'computer' in order to make explicit the specificity of a work or institution, and for explaining its coming-into-being as a historical process? Or rather, why enter into communication with ourselves in regard to an artifact in the world of human knowledge? The latter formulation of the question asked by the human sciences is meant to signal that it is us who question ourselves regarding a topic of our choice, on the basis of a representation of our choice, geared to a goal of our choice.71

But is it the case, as Laske seems to suggest, that the object we have in mind (our

knowledge of which we understand "in such a way as to integrate it into computer

hardware as a 'software' program"72) has an existence prior to and regardless of our

encounter with it? According to the dialectic imperative I am advocating in this study, I

would suggest that the answer to this question is yes/no. What I mean is: that the object

has an existence prior to our encounter with it, but an existence which is not necessarily

used up and exhausted in its immediate appearance. Rather, the object is constituted as

that which develops under the thought which is given by a thinking subject; it arises in a

dialectical movement by which it becomes the other of that as which it presents itself at

any given moment. In this way, the object is essentially a context for the occurrence of a

particular subjectivity.

As previously stated, the enactment of experience understands the subject as an

emergent process. As an emergent process, a subject arises in the moment at which

70 Laske (1992), p. 241.
71 Ibid., p. 242.
72 Ibid., p. 241.


something unexpected, or unfamiliar, occurs. To design an interface, thus, is to situate

such an occurrence. As the enactment of experience, an interface is no longer the

superficial trappings assigned to, or appropriated by, a computer or other device; nor is it

any longer only the means by which a mechanism is rendered as understandable to a

human. Rather, an interface becomes a tool for the specification of an epistemology. As

such a tool, it offers the means for the composition of interaction and, as such, for

projecting the interface as an orienting context rather than a bearer of content. With such

a tool, one not only specifies conditions which occasion the results of interaction; one

specifies the domain of interactions in which those very conditions might themselves be

formulated.

The computer can be used to develop such tools, even though it is most

commonly understood according to the highly performative criteria by which it is

associated with this or that specific functionality. The real imperative for the use of a

computer in the creation of works of art is to be found in its agency as a "free agent", as

a medium with which one might explicitly articulate both the means by which an artifact

is generated and the ends to which that artifact is oriented. For composers, the real

significance of the computer arises when it is understood as a conceptual tool for the

composition not merely of musical artifacts, but of the very cognitive processes by which

those artifacts might be imagined and realized. When understood in this manner, the

computer allows the composer to begin to structure compositional procedure as a domain

of interactions whose particular outcome is as-yet unknown and thus information-rich.

The composer thus becomes engaged in constructing the very framework of her/his

activity; s/he becomes engaged in a process of enacting her/his own experience vis-à-vis

the composition of musical artifacts. A composer, ultimately, becomes engaged in

framing an environment and, as such, the appearance of objects whose appearance

occasions the presentation of a particularized subjectivity. The particular subjectivity

so-presented is not constituted as a transcendent or a priori existent; rather, it is emergent,

immanent within the particularity of the labor through which it comprehends and

synthesizes its experience. This process is depicted in figure 6.1.

Such an understanding of the functionality of the computer does not come

automatically: it requires the articulation of a particular project whose import obtains

from its capability of overcoming the strong cultural pull toward historical practice.

Every technology has a social dimension by which the performative criteria of that

technology are determined; no technology is purely neutral. Thus, to engage computer

technology toward the framing of an interaction which is not rooted in historical practice


requires the explicit articulation of a contrarian approach, an epistemological gesture by

which that technology is infused with a particular subjectivity.

Figure 6.1: Specifying the Interactions by which one might specify an "Artifact."

With respect to computer systems for music composition, the point of access

between the composer and her/his working method includes, but is not limited to,

listening. This is because the composer is not only an author of the process by which this

or that sound, or musical structure, is generated; s/he is author of the process by which

such a process is constituted. As author, the composer constitutes the tools (physical and

conceptual) with which s/he sets up and implements compositional strategies at all stages

of the compositional process. By this means a composer specifies the "data" (in the usual

sense of providing musical plans, designs, etc.) as well as the structure of interaction

which s/he, as composer, might engage in with respect to that data. As observer, the

composer observes both the result (i.e. musical or other acoustic data in the form either of

uninterpreted data or in the form of rendered sound or scores) as well as the structure of

interaction through which that result is generated (figure 6.2). As such a tool, the

computer is no longer merely "assistive"; its use is compelled by the requirements of the

task.

[Figure: composer specifies -> data -> result -> composer observes]

Figure 6.2: Hermeneutics of Computer-assisted Composition


PART III:

Three Case Studies


In the following, I report on certain aspects of the experimental research which I

have been conducting over the last six years. These experiments are software

development projects for sound synthesis and computer-assisted music composition.

These experiments resonate, each in a different way, with many of the themes introduced

and elaborated in the first parts of this study.

Three experiments will be presented. The first experiment has a two-fold

objective. The first objective is to link low-level (i.e. individual sounds) and high-level

musical organization through a single data set. The second objective is to investigate the

possibility that musical form arises as a consequence of the particularity of the unfolding

of a generative procedure rather than as a result of a predetermined plan. In this

experiment, a software system is initialized with a start-up data set, which determines, to

a certain degree, the nature of the unfolding of a set of interacting generative procedures.

The precise nature of that unfolding, however, cannot be determined beforehand; only its

rough outline can be so-determined. Nevertheless, the outcome does reflect a meaningful

relation to the input startup data. Moreover, since the composer is also the programmer

of the system, s/he has intimate knowledge of how it works and, thus, can make

reasonable predictions based on that knowledge. With this software system, entire

compositions and even families of compositions can be composed using related start-up

data sets.

The second experiment concerns a framework for the synthesis of sound based on

variable networks of resonators. Unlike the first experiment, in which entire

compositions are made through execution of a single program and in which acoustical

structures unfold through heterogeneous iterative dynamic systems, this software system unfolds a

single "cybernetic" organization by which the physical structure of an acoustically

vibrating object is modeled. Typically, this latter system has been used not to create

entire musical works, but to create single sounds or aggregations of acoustical events.

The third experiment is a software system for the design of generative and

interactive structures by which sound and music might be composed and modeled. It

derives its synthesis model from experiment two. This third experimental system

promotes a "multiple-views" model of compositional design wherein a data model of a

sound, family of sounds, aggregate of events, etc., can be viewed and controlled through

various interfaces. Interaction is enabled either through direct manipulation (i.e.

graphical representations with which objects can be changed in real-time) or through

specification of procedures, scripts, and algorithms according to which sound-generating

models are created and controlled. These different "views" can be deployed with respect


to low-level sounds, to high-level structures of musical events, or to procedures by which

low-level and high-level processes might be interrelated.

These three experiments, taken together, reflect an evolving attitude with respect

to the relation between computer and compositional procedure. At one extreme is the

idea of composition through programming; at the other extreme is composition through a

combination of real-time direct manipulation and programming. In all three experiments,

however, I understand the computer not as a tool for enacting a historical performance

according to which familiar elements of the musical task environment are replicated, but

rather as a means by which the musical task environment might be redefined according to

criteria that are specific to a particular project.


7. Chaos and Granular Synthesis

Wave is a computer-assisted composition program which combines simple non-linear

systems and granular synthesis techniques to make compositions for computer-generated

tape. In this program, structure constitutes a set of processes which the program sets into

motion during its execution. At the lowest level, different aspects of a single grain of

sound are defined as a listing through a single iteration of multiple processes. At the

highest level, sonic events are defined as a result of overlapping streams of grains. As a

consequence of the non-deterministic behavior of the generative systems used, large-scale

form evolves only as the program is executed and cannot be absolutely defined

beforehand.

The design of Wave stipulates that each time a particular composition is to be

performed, a new "version" is to be computed.1 If the startup data structures are not

modified, only small variations are introduced with each new version. If, by contrast,

these data structures are modified, program algorithms can be greatly altered with

corresponding degrees of alteration appearing in each version of the resulting

composition. Even greater variation can be introduced by adding new functions to the

software. Three works for tape have been made by the author using this program:

free-Fall, Listing, and RE:Listing.

7.1 Granular Synthesis Technique

The technique of granular synthesis used in Wave involves the sampling of a

single sine function using a variable envelope similar to the Triangle envelope described

by Roads (1991).2 However, the technique used here differs in that the attack duration is

variable within certain limits (figure 7.1). This allows for control of timbre.

Figure 7.1: Grain Envelopes with Attack Durations of 0.25, 0.5, and 0.75 of the Grain Duration

1 This resonates with Koenig's notion of "structured variants" as implemented in Project 1 and Project 2 and with Sever Tipei's notion of manifold compositions (Tipei 1987).
2 Roads (1991).


7.2 Procedural Structure of the Program

The basic structure of Wave is built around two procedural loops, one embedded

in the other. The outer, or main loop, determines the basic evolutions and densities of

overlapping streams of grains. At this level, basic sound characteristics common to all

grains for a particular sequence are defined. The inner loop determines the properties of

each single grain according to the more general data defined in the outer loop. Each grain

is defined in terms of six parameters:

- Frequency

- Amplitude

- Channel placements

- Duration between successive grains

- Duration of attack portion of grain envelope

- Duration of grain

7.2.1 Procedural Structure of Outer Loop

Each iteration of the outer loop generates a single stream of grains. This is carried

out in three steps. First, a set of general grain properties are defined for each of the six

parameters identified above, the values for each parameter specifying a particular range.

The ranges are passed to the inner loop routine in which each grain of the stream is

computed and written to disk. The following pseudo-code illustrates how the values for

these six parameters are determined:

FOR each of the 6 parameters DO
    Compute a median value
    Compute a variance value related to the median value
END DO

Values for the medians and variances are computed separately. Together, these variables

determine the range within which particular values fall for each parameter of a grain.

After ranges for all six parameters have been defined, a time point is determined

for the beginning of the stream. Beginning time points can be computed such that

successive streams can overlap to varying degrees. The degree of overlap helps

determine the density of texture.


Next, the number of grains for the current stream is determined. This value is

computed against a global range which is defined at program startup. This global range

defines a maximum number of grains per stream for a particular composition. The

composer specifies a particular value for this global parameter and thus defines different

types of rhythmic activity for each composition.

Once all data structures have been defined, control is passed to the inner loop

where the actual grains are computed. After the grains have been computed, an envelope

function is defined and applied to the amplitudes of the grains. This envelope function

defines a single rise time followed immediately by a single decay time.

7.2.2 Procedural Structure of the Inner Loop

The inner loop subroutine computes the particular values for each discrete grain

comprising a stream. As is the case in the outer loop, each of the six parameters

characterizing a grain are determined separately, each computed in terms of its own

independent function.

The following pseudo-code for the inner loop sub-routine is shown as an

illustration of how this is implemented:

FOR i = 0 TO numberOfGrains DO
    FOR each of the 6 parameters DO
        Compute a base value
        Scale this base value to range specified in Main Loop
    END FOR
    Store this data for a single grain
END FOR

Each grain is defined in terms of its six parameters. Once a set of values defining the six

parameters of a grain has been determined, its data are saved in temporary storage. Each

parameter uses its own function which is scaled and offset to fit the range specified in the

Main Loop. Since all functions used are required to return a value 0 <= x <= 1, the return

value can be scaled to fit within the range specified in the Main Loop as follows:

parameter value = (f(x) * variance) + median

Figure 7.2 shows the data flow connecting inner and outer loops. As is shown, MEDIAN

and VARIANCE values are computed only once for each stream. These values remain

invariant during grain computations for a particular stream.


[Figure: the outer loop computes MEDIAN and VARIANCE values, which feed the inner loop, where base value = F() and parm value = base value * VARIANCE + MEDIAN]

Figure 7.2: Data Flow between Outer and Inner Loops

7.3 Specification of the Logistic Difference Equation

In its current implementation, Wave makes use of a very small set of linear and

non-linear systems. One of the chaotic non-linear systems used, the Logistic Difference

Equation (LDE), will now be briefly described. Its definition is as follows:

x(t+1) = x(t) * r * (1 - x(t)) (1)

where x(t) refers to the current state of the system and x(t+1) the subsequent state. The behavior of this system has been discussed at great length elsewhere,3 so I will limit the present discussion to a very brief description. A frequently used graphic representation of the behavior of the system is shown in Figure 7.3. The control parameter, r, defines the behavior of the system. As the value of r increases from approximately 3.0, its behavior becomes progressively more complex as the number of bifurcations of x values increases. Not only do period sizes increase correspondingly, but the control parameter r becomes more and more sensitive to changes in the system. It is characteristic of the Logistic Difference Equation that within this domain of apparent non-periodicity, pockets of order occur in which the system suddenly stabilizes to some very small period of oscillation (such as 3, 4, or 5) only to very quickly multiply once again toward a state of non-periodicity.

After experimentation with the Logistic Difference Equation and its many variants, I observed that this model could be used to generate rich textures through the superposition of many streams of granulated sounds. Further investigation based on this observation became the basis for subsequent experimentation, program design, and composition.

3 See May (1957), Gleick (1987), and Bai-Lin (1990).


7.4 Generating Grains with the Logistic Difference Map

It will be remembered from the discussion on basic program control flow that

each parameter of a grain is obtained in two steps:

1. Get a base value (between 0 and 1)

2. Scale this base value to the range values computed in the outer loop.

This could be written in the following 'C' code:

baseValue = F();
ValueUsed = (baseValue * GValRange) + GValMean;

Using an implementation of the LDE as the generative function, the above code could be

rewritten, using the single parameter frequency, as:

baseFreq = LDE(FREQ);
Frequency = (baseFreq * GFreqRange) + GFreqMean;

where LDE() represents the functional implementation of the LDE and FREQ refers to

the data structure used within LDE() to compute each return value. This data structure

contains four distinct data cells:

- x: input/output value of function

- r: the control parameter

- rStop: the "goal" value for the control parameter

- inc: amount by which r increments for each iteration.

Figure 7.4 shows the result of 30 iterations for the sound parameter of frequency

(displayed in Hz), and with the following data definitions:

- initial x value: .3

- initial r value: 3.6

- rStop value: 3.75

- inc value: .005

[Figure: bifurcation diagram of x plotted against r, for r from 3.0 to 4.0]

Figure 7.3: Mapping of the Logistic Difference Equation.


     baseFreq   Frequency (in Hz)
 1.  .447245     887
 2.  .369304     809
 3.  .432608     872
 4.  .360774     800
 5.  .436031     876
 6.  .373771     813
 7.  .505122     945
 8.  .651247    1091
 9.  .306592     746
10.  .456190     896
11.  .309291     749
12.  .446683     886
13.  .461053     901
14.  .477723     917
15.  .462631     902
16.  .664990    1105
17.  .732988    1173
18.  .675509    1115
19.  .568693    1008
20.  .920585    1360
21.  .914392    1354
22.  .726573    1166
23.  .616326    1056
24.  .928667    1368
25.  .665706    1105
26.  .813169    1253
27.  .865830    1305
28.  .728381    1168
29.  .657197    1097
30.  .830031    1270

Figure 7.4: Frequency Outputs for r = 3.6 to 3.75, inc = 0.005

     baseFreq   Frequency (in Hz)
 1.  .234408     674
 2.  .933490    1373
 3.  .537880     977
 4.  .220612     660
 5.  .248582     688
 6.  .325342     765
 7.  .924310    1057
 8.  .924310    1364
 9.  .697159    1137
10.  .850897    1290
11.  .927839    1367
12.  .929541    1369
13.  .903936    1343
14.  .601799    1041
15.  .691007    1131

Figure 7.5: Frequency outputs for r = 3.75 (static)

The first column shows the values for baseFreq while the second column shows

the value for Frequency: the values actually used for computing the grains. With each

iteration, the value of r has been incremented by .005. The overall form of this sequence

defines a statistical rise in frequency as the value for r moves toward 3.75. By

"statistical", I mean that while each successive frequency may either move up or down,

the overall tendency is one of increasing frequency. Figure 7.5 shows the result of 15

iterations for frequency with r fixed at the value of 3.75. As can be observed, the output

reflects roughly three formant regions: around 674, around 1000, and around 1360 Hz,

respectively.

Through a controlled selection of initial and goal state parameters, many varieties

of movements through behaviors mapped by the LDE can be articulated. These are

controlled by three types of transformations of r: increment, decrement, and no alteration.

So, for instance, r can begin with a value in some non-periodic ("chaotic") region and

decrement toward and into a more periodic region. The corresponding frequencies would

reflect this movement from "chaos" to "order." Similarly, r can begin with a value in a

more highly periodic region, and gradually increment toward and into a more non-periodic

region, with the corresponding frequencies reflecting this movement from order to

disorder.

This same process occurs uniquely for each of the six parameters which define a

grain. Since each parameter operates with its own r, rStop, and inc values, the resulting

behaviors for each will be different. As such, a stream of grains defme a trajectory

through a 6-dimensional vector field, with each trajectory defining a particular mapped

area of the LDE.

7.5 Using Functions to Generate Streams of Grains

As explained earlier, four basic steps are employed in generating each stream of

grains. First, the median and variance parameters are computed. Then, a point in time (in

seconds, relative to the beginning of the sound file) is calculated: this time point defines


the beginning of a stream. After this, the number of grains to be generated is determined,

and control is passed to the inner loop.

7.5.1 Computing Mean and Variance Values in the Main Loop

It will be recalled that each stream is defined globally in terms of two values: a

median value and a variance value. These values are computed for each of six

parameters. This can be expressed in the following code:

ParmMedian = LDE(PARM_MEDIAN) * PARM_MEDIAN_FACTOR;

ParmVariance = getVariance(PARM_VARIANCE) * PARM_VARIANCE_FACTOR;

The definition of the function LDE() implements the LDE function (the very same one

described above). Its return value is factored with PARM_MEDIAN_FACTOR in order

for the actual mean value used to be appropriate for the sound parameter being computed.

The definition of the function getVariance() is comprised of a "curving" function that

implements two sine and two cosine functions operating at frequencies which are

determined as part of the initial program startup. The resulting sub-audio waveforms are

sampled and their values returned, after being factored to within a range of 0 and 1.

Added to this basic curve algorithm is an "interruption factor" which allows for sudden

phase shifts within the overlapping sine and cosine functions. This is enabled by a

threshold of probability which determines how frequently the phases will be so shifted.

The return value of getVariance() is factored with

PARM_VARIANCE_FACTOR which, like PARM_MEDIAN_FACTOR, is used to

bring it within a range appropriate to the sound parameter being computed. This process

is illustrated in figure 7.6. This illustration shows curves computed by the curving

function that fall within the range identified as "Maximum Variance" and "Minimum

Variance." These represent global variables which can be defined for each execution of

the program. The three dark circles in the illustration represent three successive iterations

of the Curve function. At iteration i, FreqVariance has a value of 300 Hz while at

iteration i+1, its value is 310 Hz. If the FreqMedian value is the same for each iteration,

say, 400 Hz, then all grains computed for the first stream would have frequencies within

the range 400-700 Hz, while those for the second would fall within the range 400-710

Hz.

As shown in Figure 7.6, the next iteration reflects a sudden interruption of phase,

so that the very next value for FreqVariance is 2500 Hz. As a result, the grains occurring


within the stream will all have a frequency within the range 400-2900 Hz. This

represents a sudden and radical shift of behavior from those streams spawned

immediately before. Since the frequency of such shifts is determined by the probability

threshold defined at program startup, probability threshold becomes an important control

parameter. Since the principles illustrated above for frequency apply to all six

parameters, a variety of behaviors can be determined and, consequently, a variety of

discontinuities articulated.

[Figure: curve values bounded by a Maximum Variance and a Minimum Variance, sampled at iterations i, i+1, and i+2]

Figure 7.6: Iterative Selection of Values for FreqVariance using a Curving Function with Interruptions

7.5.2 Calculating Time Points for Each Stream

The calculation of beginning time points for each stream ultimately determines the

density of textures; streams spaced close together in time will result in greater textural

densities since their close temporal proximity will allow them to overlap.

In the current version of the program, the get_TimePoint() function implements a

"brownian walk" through a two-dimensional space. Each step increments one step along

the x-axis, while the point on the y-axis is determined randomly according to the

following stipulations: if the random function is less than .5, then the next y value will be

1 less than the last y value; otherwise, the next y value will be 1 more than the last.

Figure 7.7 shows a graphic example of such a brownian walk. Whenever the function

crosses the x-axis (i.e. the value of y equals 0), the distance, in number of steps between

the current x-value and the x-value at the last point at which y was equal to zero, is

returned. This can be seen in figure 7.7, where the distance bracketed and labeled B is

equal to 9. The logarithm of this value is then multiplied by a density factor which is


globally defined for each execution of the program. This final value is added to that of

the previous time point value, and a new time point is computed.

Figure 7.7: Using a Brownian Walk to Calculate a Timepoint Value

7.5.3 Calculating the Number of Grains

The duration of a stream is largely determined by the number of grains of which it

is comprised (plus the inter-grain duration). The current implementation allows the

composer to define a general range for each program execution:

numGrains = (LDE(NUMGRAINS) * NUMGRAIN_VARIANCE) + NUMGRAIN_MEDIAN;

The global variables, NUMGRAIN_VARIANCE and NUMGRAIN_MEDIAN, are

defined at program startup. The parameter, NUMGRAINS, is similarly defined and

contains values for the x, r, rStop, and inc variables used within the LDE() function.

7.6 Generating Textures Through the Fusion of Streams

The final consideration in the shaping of sounds involves fusing overlapping

streams in different ways. It was found that this could be accomplished by adding an

envelope generating function for each stream. As such, each stream would have its own,

rather distinctive, envelope and thus could blend in different ways with other overlapping

streams.

A single attack and decay time structure is used for the implementation of this

envelope generator. Three initial values are of significance: the amplitude factors at the

beginning and end points of the stream, and the point, with respect to the total duration of

a stream, at which the peak will occur. Figure 7.8 shows how this works. Here, the

amplitude factor for the beginning point of the stream is 0.0, while that for the end point

is 0.2. The attack time is calculated as the number of grains multiplied by .15. If the

number of grains is 2000, this puts the attack component as occurring during the first 300

grains, and the decay occurring during the last 1700 grains.


[Figure: amplitude envelope with attack over the first 300 grains and decay over the last 1700 grains]

Figure 7.8: Envelope Function used to Recompute the Amplitudes for a Stream of Grains

This technique serves less to generate an "envelope" in the real sense of the term

than to allow overlapping streams to seem to fuse into one another to varying degrees and

in various ways. As a result, individual streams of grains are often indistinguishable.

7.7 Initializing Parameter Values

Each level of the program described so far involves the use of a set of global data

structures. For instance, r, inc, and rStop data structures are used in LDE functions in the

outer and inner loop subroutines. Such data structures are themselves defined by the very

same types of algorithms as those used in the computation of grains and grain streams.

These global functions will now be described.

There are two such sets of functions: one for the inner loop and one for the outer

loop. The inner loop set feeds x, r, and rStop values to the LDE functions which generate

data used in computing grains. The outer loop set feeds x, r, and rStop values to the LDE

functions used to generate Median values and numbers of grains, as well as initial values

used by the curve functions for computing Variance values.

7.7.1 Initializing Parameter Data Structures in the Inner Loop

When the inner loop routine is called, the first step is to initialize all of the x, r, inc,

and rStop variables used in the computation of grain parameters. For each of the six

sound parameters, x is first initialized with a random value between 0 and 1. The r and


rStop variables each get their initial values from a "curving" routine like the one

described above.

The value for inc is computed in two steps: first a magnitude value is computed

and then its sign. The magnitude value is determined with a curving function, with each

iteration (selection of an inc value) determining a new sample value of the curve. A

succession of inc values will rise and fall according to the definition of the curving

function. The sign is determined according to the relationship between r and rStop. If

rStop is greater than r, then inc will have a positive value; otherwise, inc will be negative.
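The two-step inc computation might be sketched as follows, with an arbitrary sine curve standing in for the curving function; the phase step and scale factor are assumptions, since the actual curve definitions are not reproduced here:

```cpp
#include <cmath>

// Hedged sketch of the two-step inc computation: a curving function
// (here an arbitrary sine curve) supplies the magnitude on each call,
// and the sign is chosen from the relationship between r and rStop.
// The phase step and scale factor are assumptions.
double computeInc(double r, double rStop, double& curvePhase)
{
    curvePhase += 0.1;  // each selection samples the next point on the curve
    double magnitude = 0.001 * (1.0 + std::sin(curvePhase)) / 2.0;
    return (rStop > r) ? magnitude : -magnitude;  // positive iff rStop > r
}
```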

7.7.2 Initializing Parameter Data Structures in the Outer Loop

Since the method for obtaining parameter data structures for the outer loop is

more involved than that for the inner loop, a more detailed discussion is warranted. Five

initialization routines set up the data structures used to generate streams within the outer

loop. These can be broken down as follows:

1. Initialize data structures used in the computation of Median values

2. Initialize data structures used in the computation of Variance values

3. Initialize data structures used to compute the number of grains

4. Initialize data structures used to compute the envelopes of streams

5. Initialize data structures used to compute beginning timepoints for streams.

Recall that MEDIAN values are computed as:

ParmMedian = LDE(PARM_MEDIAN) * PARM_MEDIAN_FACTOR;

Recall also that the data structure PARM_MEDIAN (where "PARM" refers to any of the

six sound parameters) is comprised of the x, r, rStop and inc data elements as described

above, and is further defined as follows: The initial value for x is defined with a random

function. The r, and rStop elements get their initial values from a curving function which

is identical to that described in earlier sections of this paper. Complex waveforms are

sampled, using this curving function, and their values returned for use as r and rStop

values. Since r and rStop are each defined as successive samples of a curve, they will in

general always be different.
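The LDE computation over the x, r, rStop, and inc elements can be sketched as follows. That the LDE is the logistic difference equation x = r * x * (1 - x) is an inference from the variable names and the r values (around 3.5) used in this chapter's examples; treat it as an assumption:

```cpp
// Hedged sketch of an LDE step using the x, r, rStop, and inc data
// elements described above. That the LDE is the logistic difference
// equation x = r * x * (1 - x) is an assumption inferred from the
// variable names and the r values used in this chapter.
struct LdeState { double x, r, rStop, inc; };

double LDE(LdeState& s)
{
    s.x = s.r * s.x * (1.0 - s.x);  // iterate the map
    s.r += s.inc;                    // drift r toward rStop
    bool reached = (s.inc >= 0.0) ? (s.r >= s.rStop) : (s.r <= s.rStop);
    if (reached) { s.r = s.rStop; s.inc = 0.0; }  // stop incrementing
    return s.x;
}
```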

The value for inc is computed in the same manner as are the inc values for the

inner loop. In this case, its sign is determined by the relationship between r and rStop.

Recall that rStop defines a value at which iterative increments of r should stop. So, for


example, if rStop is 3.55, r is 3.50, and inc is .001, then r will be incremented 50 times.

At that point its value becomes 3.55, the same as that for rStop. The next time it is

incremented, the value of inc will become 0.0, thus effecting no incrementation.

Therefore, if rStop is less than r, then the value of inc should be less than zero.

Recall that a Variance value for a particular sound parameter is obtained as

follows:

ParmVariance = getVariance(PARM_VARIANCE) * PARM_VARIANCE_FACTOR;

PARM_VARIANCE contains data elements used as input values to the curving function

implemented within getVariance(). As such, successions of Variance values for each

parameter will trace complex curves defined within the particular curving function

implemented. These Variances are in tum normalized according to a value defined by

another global variable, PARM_ V ARIANCE_FACTOR. This allows for the definition

of an absolute range within which all Variance ranges are fitted for a particular

composition.
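The Variance path can be sketched with an illustrative raised-cosine curving function; the constants and the curve itself are assumptions, since the program's actual curve definitions are not reproduced here:

```cpp
#include <cmath>

// Hedged sketch of the Variance computation: getVariance() returns the
// next sample of a curving function (an arbitrary raised cosine here),
// and the result is scaled by a global factor that fixes the absolute
// range of Variance values for the composition. Constants are assumptions.
const double PARM_VARIANCE_FACTOR = 0.25;

double getVariance(double& curvePhase)
{
    curvePhase += 0.05;                          // next sample of the curve
    return 0.5 * (1.0 - std::cos(curvePhase));   // values trace [0, 1]
}

// Usage: double parmVariance = getVariance(phase) * PARM_VARIANCE_FACTOR;
```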

7.8 "Patching" Functions

Wave allows the composer to specify functions by which the processes described

above are determined. Some such functions are defined within the program itself. In this

case, the composer may associate them, by name, to particular processes within the

execution of the program. In addition, however, the composer can add her/his own

function definitions by writing them in C, and then linking them into the program.
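Such name-based patching might be sketched as a lookup table of function pointers (shown here in C++ for consistency with the other examples; the table, names, and signature are illustrative, not Wave's actual mechanism):

```cpp
#include <cstring>

// Hedged sketch of function "patching": named functions are associated
// with processes via a lookup table. The table, names, and signature are
// illustrative; Wave's actual linkage mechanism is not reproduced here.
typedef double (*CurveFn)(double);

static double rampUp(double x)   { return x; }
static double triangle(double x) { return x < 0.5 ? 2.0 * x : 2.0 * (1.0 - x); }

static const struct { const char* name; CurveFn fn; } patchTable[] = {
    { "rampUp",   rampUp   },
    { "triangle", triangle },
};

CurveFn lookupCurve(const char* name)
{
    for (const auto& entry : patchTable)
        if (std::strcmp(entry.name, name) == 0)
            return entry.fn;
    return nullptr;  // unknown name: caller can fall back to a default
}
```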

7.9 Discussion: Model of Composition

The above discussion of this program can be summarized as follows: for each

iteration in the Outer Loop, a set of sound parameters was computed. This set of

parameters constituted the parameters according to which a single stream of grains was to

be computed. Once such a set of parameters had been defined, program control was

passed to the Inner Loop, in which individual grains were computed according to ranges

and medians specified in the Outer Loop. Just as each sequence of grains articulated a

single stream, so too did the overlapping of streams articulate dynamically unfolding

events of varying density, register, timbre, and amplitude.


A single "listing" of such events from the program's execution comprised a

composition. Even with no changes in initializing data, however, each execution of the

program would yield a slightly different result-a different "version." This characteristic

emphasized the notion that a form is not something which is defined a priori, but is one

which defines itself by virtue of the particularity of the processes by which it is generated.

Changes in the data set, however, result not in new "versions," but in essentially different

structures. The ability to specify and differentiate new structures and versions articulates

a principle of open form that was more prevalent in the 1960s than it is today.

The support for the "patching" of functions extended this notion of "open form" to

the program itself. Since functional algorithms were interchangeable (and even

replaceable) within the program, new behaviors could be defined for both the program

and its performance of composition strategies. Due to its complexity, Wave would

frustrate most attempts at obtaining predictable results and was therefore a poor tool for

the production-oriented composer. Since it was always used only by its

author, its main purpose was as a research tool, and knowledge of the program structure

was significant in the interaction which constituted the process of making musical works.

According to this model of interaction, the composer would first provide some initial

startup data. Upon observing the musical results (i.e. listening to the sound file

generated), the composer would try adjusting the data, or changing the functions that were

"patched" in, and then running the program again.

Because of the nature of the interactions which the program defined, it was

impossible to predict, with a high degree of certainty, what would happen. Instead, one

learned about the program by interacting with it-that is, specifying data and listening to

its results, specifying new data, listening to its results, etc. Moreover, aspects of the

program itself could be altered, by altering the behavior of the functions it used, or even

by changing aspects of the program itself.

Something more general can be said about this system, and the interactions which

it elicited, by returning to the hermeneutic model of interaction depicted in figure 6.2.

This model of interaction can be summarized as shown in figure 7.9. As depicted in this

diagram, the composer specifies both startup data and the algorithms and flow of the

program itself. By the same token, the composer observes not only the results of program

execution-i.e. the resulting composition-but s/he observes how the mapping of data to

algorithm, as well as the behavior of the algorithms themselves, are reflected in the

acoustical results.

Figure 7.9: Model of Interaction, WAVE Program (the composer specifies startup data; the program generates an entire composition; the composer observes the result).


8. resNET: Sound Synthesis Through Dynamically Configurable Feedback/Delay Networks


resNET is a sound computation/composition software system in which structure is

conceived as a network of interacting agents. It was written in C++ and originally ran on

an Intel 486-based computer. It has since been ported to SGI IRIX and Windows NT

(running on a Pentium-based computer) where it runs in real-time.

Functionally, the system delineates two layers: signal propagation and signal

control. At the signal propagation layer, a network of interconnected modules is defined.

These modules are grouped into two principal types: an excitor and a resonator. As its name

implies, an excitor provides initial input signals to a system. Similarly, a resonator acts as

a resonating 'body' into which a signal is dispatched. A third module type provides

spatial placement information to the output signal. These module groupings can be

interconnected in either feed-forward or feed-back configurations, allowing for complex

re-propagation of the generated signal throughout the system, as in the following

example:

In this example, output from an excitor is fed into a resonator circuit; the output

signal from the resonator circuit is simultaneously forwarded to a signal out module

(which spatializes the output) and fed back to the excitor. The output of the signal out

module is simultaneously sent out (to a D/A converter or storage medium) and fed back

as input into the excitor and resonator modules.

At the signal control layer, each component of the signal-propagation network is

attached to a control node. A control node acts either as an independent agent, or as a

component of a larger integrated sub-system, and it dynamically constrains the behavior

of its attached module. But, while a control node may determine the constraints defining

the behavior of a signal-propagation module, it is the module itself which defines exactly

how those constraints are to be applied. Moreover, components at the signal propagation

level can themselves modulate the behavior of their attached control nodes or even of control

nodes attached to other propagation components. In the following diagram, control and

propagation layers are shown as distinct subsystems with bi-directional pipes connecting

them:


8.1 Specification of Excitor, Resonator, and Output Module Groupings

An excitor generates a signal in one of three ways: reading a sound file

stored on disk; generating it from a supplied formal specification (an 'algorithm'); or taking

input from a live signal. An excitor is encapsulated so that its output is always of a specific

type regardless of its method of generation.

A resonator is comprised of a network of delay and multiplication components

interconnected according to specifications made by the designer using an ASCII script

file. The following shows a script file fragment along with its diagrammatic

representation:

;;lowpass.scr
.Delay
delay0 = 1
.Mult
mult0 = .99
.Network
plus0 = in
plus0 += mult0
delay0 = plus0
mult0 = delay0
out = delay0

A minimal script file consists of three sections, each marked with a .Command

processor flag. Delay lengths are specified within the .Delay section. Similarly,

multiplier values are specified within the .Mult section. Finally, the network patch is

specified in the .Network section. As shown in the script file, delay0 is set to a length of

1 and the value for mult0 is set to .99. The .Network specification models a simple

low-pass filter. It should be noted that since precise ordering of network components is

essential to the correct implementation of a design, and because such an ordering may not

always be intuitive to the designer, all input script files are subjected to re-ordering prior

to execution, thereby allowing the designer to represent a design without having to be

concerned about correct ordering. In future versions of the software, the designer may


enter a design either as a script file or as a graphical circuit design like the one shown

above.
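The network in the script above reduces to the one-pole recursion y[n] = x[n] + .99 * y[n-1], which can be sketched as a per-sample tick function (the function itself is an illustration, not part of resNET):

```cpp
// Hedged sketch of one sample of the scripted network above:
//   mult0 = .99 * delay0;  plus0 = in + mult0;  delay0 = plus0;  out = delay0
// i.e. the one-pole recursion y[n] = x[n] + 0.99 * y[n-1].
double lowpassTick(double in, double& delayState)
{
    double plus0 = in + 0.99 * delayState;  // plus0 = in; plus0 += mult0
    delayState = plus0;                     // delay0 = plus0 (length 1)
    return delayState;                      // out = delay0
}
```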

The output module handles spatial placement in the current prototype version.

Control data for spatial placement is obtained from particular components within a

resonator network, or from control nodes operating at the control layer. The output

module also watches for word overflow, calling a designer-supplied interrupt should

overflow occur. One such interrupt--called BounceOverflow()--causes the overflowing

signal to 'bounce' off the 16-bit wall. This allows for interesting timbre designs whereby

the output of the system is purposely driven to be very slightly unstable: the output

module's 'bounce' interrupt acts as a kind of waveshaper. Another such interrupt throws

away the current sample, reduces all of the multiplier values by some specified value

(usually very small), causes the entire system to jump backward in its iterations by one,

and then allows the resonating subsystem to generate another sample.
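The 'bounce' interrupt can be sketched as a reflection off the 16-bit boundary; the exact folding arithmetic is an assumption about how the reflection is implemented:

```cpp
#include <cstdint>

// Hedged sketch of a BounceOverflow()-style interrupt: samples exceeding
// the 16-bit range are reflected back off the "wall" instead of clipping,
// which acts as a crude waveshaper. The folding rule is an assumption.
int16_t bounceOverflow(double sample)
{
    const double wall = 32767.0;
    while (sample > wall || sample < -wall) {
        if (sample > wall)  sample =  2.0 * wall - sample;  // reflect down
        if (sample < -wall) sample = -2.0 * wall - sample;  // reflect up
    }
    return static_cast<int16_t>(sample);
}
```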

8.2 Strategies for Timbre Design

While resNET could be used to design timbres in which traditional signal

processing constructs are employed, its real utility as a design system is in employing

experimental and incremental extensions to those traditional constructs. Moreover, event

sequences of extended duration can be realized through the application of control layer

modules.

As a first step in designing such a network, we begin with a plucked string circuit

attached to a simple filter module:


With all components fixed, a variety of plucked string-like timbres result. With the

introduction of variability to some of the components, a far greater variety of timbres can

be designed. As an example, the following configuration is considered.

In this example, delay 1 has a length of 31 samples (as is the length of the noise

burst excitor input); multipliers g1 and g2 each have a value of .5; delay 3 has a length of

2. All of the other components (including the panning module) are variable and, as such,

can be piped to a control node.


Each signal propagation component which is attached to a signal control node

tells that node its maximum range of variability, its minimum and maximum values, and

passes it a pointer to an iterative function which is used to control its behavior. This

function is either an independent function--and therefore does not exchange information

with functions controlling other signal propagation components--or it passes control to a

single 'global' function within which control nodes can exchange information according

to the specification of that function.
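The registration described above might be sketched as follows; the ControlNode structure, its names, and its clamping behavior are illustrative assumptions, not resNET's actual interface:

```cpp
#include <functional>

// Hedged sketch of a control node: the attached component registers its
// minimum and maximum values along with an iterative control function,
// and the node constrains each generated value to that range.
struct ControlNode {
    double minValue;
    double maxValue;
    std::function<double(int)> controlFn;  // iteration index -> raw value

    double next(int iteration) const {
        double v = controlFn(iteration);
        if (v < minValue) v = minValue;    // clamp into the declared
        if (v > maxValue) v = maxValue;    // range of variability
        return v;
    }
};
```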

In this example, if multiplier components g4 and g5 are piped to mutually

symmetrical functions, such as sine functions with opposing phase, the resulting

waveform consists of varying band-pass and band-reject filtered sounds. g1 is set to 1.0,

and is attached to a sine function with an amplitude at around 1.0001 times the value of

the multiplier. With delay 2 set at around 900, and its range between 200 and 1800, the

resulting behavior ranges from discernible melodic structures to continuously

transforming timbres, depending on the control functions defined.

By experimenting with incremental variations of such a configuration, one

generates timbres which can be characterized as fixed, continuously transforming, or

intermittent. Intermittent timbres are those which may fluctuate between being perceived

as a single continuously transforming timbre or as a sequence (perhaps overlapping) of

single timbres, either fixed or transforming.

8.3 Linking Composition and Timbre Design

An important goal in the design of resNET has been to explore ways in which

micro-level structures (for instance, those which generate a 'single' auditory event) and

macro-level structures (for instance, those which generate an entire composition or a

sequence of auditory events) can be joined systematically--i.e. as a matter of

system-level design. Toward this end, resNET is designed such that the signal control layer

modulates the signal propagation components, and not the signals themselves. This

design allows the composer to 'algorithmically' specify procedural inputs to the control

nodes modulating signal-propagation components. The term 'procedural inputs' refers to

inputs which alter the behavior of a procedure or function, rather than modifying the

parameters associated with that procedure or function.

As an example of such a design, consider a network in which each

component is patched to all other components through a multiplier:


In this simple example, each multiplier is attached to a control node. The control layer is

constructed as a single 'global' function into which the output signal is fed. Within the

control layer, the output signal is analyzed, and the result of that analysis is used in

'tuning' the individual multipliers in order to heuristically favor particular output signal

types. Since some multipliers will eventually be zeroed out, a specific configuration

emerges. By changing the procedural inputs to the analysis module, different

configurations can be specified. The activity involved in iterative generation of

procedural inputs to such a system is of interest to the designer who wishes to

systematically incorporate large-scale morphological structures into the generation of

individual timbral events, and vice-versa.
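The heuristic tuning might be sketched as a nudge-and-prune loop over the multiplier gains; the update rule and threshold are assumptions, since the analysis itself is not specified above:

```cpp
#include <vector>

// Hedged sketch of the heuristic "tuning" of multipliers: each gain is
// nudged by a score from the analysis of the output signal, and gains
// falling below a threshold are zeroed out, so that a specific network
// configuration gradually emerges. The update rule is an assumption.
void tuneMultipliers(std::vector<double>& gains,
                     const std::vector<double>& analysisScores,
                     double learningRate, double pruneThreshold)
{
    for (std::size_t i = 0; i < gains.size(); ++i) {
        gains[i] += learningRate * analysisScores[i];  // favor scored paths
        if (gains[i] < pruneThreshold)
            gains[i] = 0.0;                            // zero out weak paths
    }
}
```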

8.4 Composing with resNET

resNET represents an effort at specifying a computer ("virtual") instrument which

can extend the very notion of an instrument. Consistent with this effort is a desire to

incorporate composition specification directly into the design of a musical instrument.

This software has been used in the composition of topologies/surfaces/oblique

angles/installed parameters, a work for two-channel tape. In this work, a small collection

of networks representing known models-such as filters, reverberators, and plucked

string models-are broken down into their subcomponents and then reassembled

according to composed logics which only obliquely reference the timbres normally

associated with the original network configurations.

Each event was composed separately. An event constituted a single sound, or a

sound aggregate. As one example of the kinds of processes by which network


configurations were abstracted from other configurations, consider the standard

configuration which defines a simple low-pass filter:


This configuration was subject to transformations involving the following:

- variable delay lengths,

- variable feedback coefficient values,

- variation of input pulse streams,

- specification of right and left channels,

resulting in the following structure:

In addition to these kinds of alterations of the original filter structure, the network itself

was variable; through the addition of new components and the rearrangement of current

components, entirely new configurations were defined. One such variation is depicted in

figure 8.1. Here, the original low-pass type structure (with a delay length of 1 sample) is

combined with a band-pass type configuration. This configuration was, in turn, subject to

additional transformations in order to render still newer configurations. Such a

configuration is shown in figure 8.2.

This kind of process was repeated many times, beginning with many different

acoustical models; it constituted the research phase of the composition. During this phase,

sound data was gradually collected and organized for the composition. During the course

of this labor, various kinds of experiments were defined on the basis of observations


made with respect to earlier ones. These experiments were based on hypotheses

regarding the correlation of acoustical behavior with network configurations. More often

than not, hypotheses needed correcting; this need for correction yielded further

hypotheses and new experiments.

Figure 8.1: Original Configuration with Added Band-Pass-Type Component and Channel-Placement Specification.

Figure 8.2: Variation of Configuration Depicted in Figure 8.1.

Over the course of this research, the composer found himself adapting a logical

framework for the specification of hypotheses, which specifications had less and less to

do with the acoustical models which the original network structures traditionally depict.

Consequently, filter-type models came to be used for the generation of sound, while

simple physical models, and their variations, could be used for the generation of single

sounds, for the modification of such sounds, or for the generation of entire aggregates of

sound structures.

From this research arose the aesthetics of the composition. Prior to composing,

there was no a priori, pre-conceived notion of the composition. All plans and thoughts


regarding the making of the work arose from this research. The aesthetics, to which I was

introduced through my research, concerned the nature of the sounds as materials. Rather

than dramatizing a form through the projection of materials, my desire was to dramatize

the material through the projection of a form. Long silences, for instance, served not only

to articulate temporal structures, but they assisted in the projection of the material. This

is particularly the case with sounds that display persistent broad-band frequency behavior;

the sudden appearance of silence dramatizes, in a very physical manner, the characteristic

features of such sounds. The articulatory "flatness" (i.e. lack of envelope and of internal

development) of many of the sounds assists in the projection of this effect. My interest in

such "flatness" manifests itself in the treatment of "gestures" and in the definition of

amplitude envelopes. With respect to the former, those few "gestures" which appear, and

which exhibit a tendency toward dramatized form, are broken down, through the

introduction of sudden cutoffs, whose introduction results in audible "clicks." Such

clicks are, in actuality, generated with decay envelopes whose duration varies between 22

microseconds (a single sample at a sample rate of 44.1 kHz) and 2 milliseconds (100

samples at a sample rate of 44.1 kHz). The variability of such envelopes corresponds to a

variability in the frequency and amplitude characteristics of the clicks.

8.5 Discussion: Model of Composition

resNET projects a notion of interaction in which the composer is able to relate

acoustical behaviors to a particular physical model. The manner in which a model is

specified avoids reference to historical or methodological bias in favor of an experimental

approach-an approach that is not constrained by criteria of veridicality to already

existing physical systems, or extensions thereof. Here, an arbitrary structure can be

tested, based perhaps on already understood principles, and its effects observed. Through

experimental activity, a composer develops a heuristic performance model of a domain

that is relevant to criteria that have their basis in that very experimental activity. Actual

"physical models" relate less to a world of real musical instruments than to a world of

possible, embodied "virtual" instruments.

In this model of compositional procedure, the composer, on the one hand,

specifies particular networks and the data by which their behavior is executed. On the

other hand, s/he observes the acoustical results of the output sound or sound aggregates and

draws correlations among that output, the structure of the input data, and the network

which s/he has specified (figure 8.3).

Figure 8.3: Hermeneutic Feedback Model of Compositional Procedure for resNET

9. Orpheus: Interactive Design and Composition

Orpheus is a software system which extends the functionality and performance of resNET

in the following ways:

1. It provides means by which a composer might interact with the compositional

models s/he specifies.

2. It provides procedures and data structures for linking macro- and

micro-structural elements of a musical design.

3. It provides real-time response to "gestural" interfaces and "command line"

interfaces.

In the following exposition, each of these will be described. Since the software is

currently under development, some of the features which implement the above

functionality are, as yet, incomplete.

9.1 General Description of Orpheus

Orpheus is intended to be both a toolkit and an environment for sound

computation and music composition. As a toolkit, Orpheus constitutes a library of C++

classes. Eventually, these classes will form a stand-alone library for use in applications

for composition written in the C++ programming language.

As an environment, Orpheus supports the design of generative and interactive

structures by which sound and music might be composed and modeled. Such structures

enable both real-time and non real-time modes of interaction. Real-time modes of

interaction are manifested through the direct manipulation of graphical objects by which

acoustical organizations are represented. Non real-time modes of interaction are

manifested through the specification of algorithms by means of which both inside-time

and outside-time acoustical and musical processes are defined and stored. 1 On the one

hand, Orpheus provides a software environment that enables the creation of graphical

1 I use Xenakis' terms "inside-time" and "outside-time" to differentiate between musical structures that articulate themselves in time (i.e. according to criteria that explicitly reference temporal placement of musical events) and those that articulate themselves outside of time (i.e. according to criteria which make no reference to temporal placement of musical events). See Xenakis (1971) for further discussion.


objects modeling the parameter spaces of specific acoustical and musical formal

structures. Through interaction with these graphically represented objects, a composer

may investigate those particular features of a synthesis algorithm that are of significance

to particular hypotheses, designs, goals, and plans. On the other hand, Orpheus provides

a means by which the objects and processes constituting such structures and algorithms

can be linked across various levels of musical structure. This overall scenario is captured

in figure 9.1.

Figure 9.1: Dual Interfaces Enacted Within Orpheus (real-time and non real-time interfaces to the environment)

Orpheus is not intended to act as an environment for musical production, per se.

Rather, it is intended as an environment for compositional research, as distinguished in

the introduction to this document. With Orpheus, a composer designs sound 'models'

which can be used for the production of musical works. Sound models describe anything

from single sounds, to entire sections of musical works, to compositional models for use

in data modeling and auditory display environments.

9.2 Modeling the Task Environment

Orpheus defines a task environment which combines composition of sound with

composition of higher-level musical forms. These two dimensions define a dual

comportment in which specifications made in one dimension involve specifications made

in the other:


Composition of sound and composition of musical form eventually arise together. This is

because, at some point, sound forms define not only the features of individual sounds but

the larger-scale morphologies within which they are unfolded as well. Compositional

design is understood as non-hierarchical. Accordingly, one might specify a structure

whose generative consequences are realized at various levels of temporal unfolding. This

approach can be depicted as shown in figure 9.2.

A brief explanation will clarify the meaning of this diagram.

First, at the level of sound design (or composition), a composer defines sound

objects. Sound objects are structures that encapsulate particularized parameter data with

respect to a sound computation model. Sound objects are generative phenomena: they are

templates which define generative characteristics on the basis of which actual sounds

might be realized. Such realizations are called sound instances. When a sound object

generates a sound instance, it simultaneously specifies a constraint, or set of constraints,

regarding the musical context in which that sound instance is placed. These constraints

are defined according to grammars that operate at the middle-level and top-level of

design, as well as parameter data which define the sound object from which the instances

are spawned. Such grammars are only sparsely defined at the top level of design; they are

more thoroughly defined at the middle level.

Figure 9.2

At the middle or top levels of design (the differentiation between these two is

arbitrarily made for the sake of description), patterns of events are generated through the

specification of grammars and through the manipulation of graphical objects.

Typically, one would move across various levels in order to find ways in which

control can be interconnected in a manner which is, nevertheless, non-hierarchical.


9.3 The Synthesis Model

Orpheus is based on a physical modeling synthesis model along the lines of that

defined by resNET. However, in the current prototype version of the software, there is no

direct support for the specification of variable networks; this is left for a future version of

the software. In the prototype version of Orpheus, one particular network has been

constructed. This network, like all others formed in resNET, has the following basic

structure (which is discussed in section 8.1 above):

Moreover, control of the network comes from signals generated within a "signal control

layer":

All sound computation is done in real-time using a PC-based sound rendering software

engine called AREAL (Audio Rendering Engine and Library).2

Figure 9.3 depicts the resonator and output parts of this network, along with

names given to the control signals by which their behavior is modulated. Each of these

"control signals" will be referred to, throughout the remaining discussion, as a parameter

node. I use this term in order to differentiate the means by which a synthesis algorithm is

controlled by objects that are external to it, and the coefficients which are internal to that

algorithm. Parameter nodes are explicitly associated with synthesis coefficients within a

synthesis algorithm definition module. A synthesis algorithm definition module is very

much like the resNET "script," described above, by which synthesis networks are defined.

In the prototype version of Orpheus, these are hard-coded to the specific synthesis

network, which is also hard-coded.

2 Goudeseune and Hamman (1997).



Figure 9.3: Resonator and Output Modules

There are two parts to the resonator; each is comprised of a "tunable" delay module with feedback. By "tunable," I mean that it is possible to effect delays that are a non-integral number of samples. Typically, with a delay line, one can only effect delays whose resonant frequency is an integral divisor of the sampling rate. This is not

problematic when one is using delay modules in filters and in localization simulations.

However, when designing a physical model, one might wish to use delay modules which contribute to what is normally called 'frequency' behavior. This could involve delay lengths of between 1 and half the sampling rate. Suppose that the sampling rate is 44.1 kHz. If the delay length is 1, then the resonant frequency would be half the sampling rate, or 22.05 kHz. If the delay length is 2, then the resonant frequency is 1/4 the sampling rate, or 11.025 kHz. If the delay length is 10, then the resonant frequency is 1/10th the sampling rate, or 4.41 kHz. The resonant frequency will always be an integral divisor of the sampling rate. By attaching a "tuning" filter-which is essentially an allpass filter-the delay line can generate resonant frequencies that are non-integral divisors of the sampling rate.3

3 See Smith and Jaffe (1989), pp. 481-494, and Sullivan, C. (1990) for more detailed discussion.


As shown in figure 9.3, there are six parameter nodes by which the behavior of both the resonator and the output modules is controlled. The parameter nodes N1 and N2 control the lengths (non-integral) of the two delay units. Each controls two pertinent coefficients within each delay unit: the length of the delay and one of the coefficients of the tuning filter (the other coefficient remains constant at -1). The parameter nodes P1 and P2 control the feedback of each of the delay units.

The other two parameter nodes shown in figure 9.3 (Ca and Cd) engage the OUT module. The OUT module defines channel placement. This is determined according to two coefficients: (1) the angle defined by the position of the sound in respect to the listener (parameter node Ca) and (2) the "depth" of the sound (parameter node Cd).

Figure 9.4 shows roughly how this is manifested. The listener is an "idealized" listener who is situated at a 90 degree angle from the two speakers; i.e., s/he is situated midway between the two speakers, at a distance equal to that which separates the two speakers. The angle defined by the right speaker is 45 degrees; that of the left speaker is 135 degrees. The angle defined by the sound event shown is approximately 110 degrees. Note that all sounds have angles in the range of 45 to 135 degrees.


Figure 9.4: Angle of Position of Sound Event with Respect to Idealized Listener

The "depth" of a sound is currently implemented simply through amplification

and attenuation of the signal.4 With greater processing resources, this effect could be

improved by using the classical studio technique whereby the depth of a sound projected

within a stereophonic field is manifested with the use of a band-pass or low-pass filter.
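As an illustration of this placement scheme, the angle and depth coefficients can be mapped to a pair of stereo gains. The sketch below uses simple equal-power panning, with depth realized as plain attenuation as in the prototype; it is not the Moore-based algorithm that AREAL actually implements, and the function name is illustrative.

```cpp
#include <cmath>
#include <utility>

// Map an angle in [45, 135] degrees and a "depth" in (0, 1] to a pair of
// stereo gains {left, right}. Equal-power panning; depth is realized as
// plain attenuation. Illustrative sketch only.
std::pair<double, double> placeSound(double angleDeg, double depth) {
    double t = (angleDeg - 45.0) / 90.0;    // 0 = hard right, 1 = hard left
    double halfPi = std::acos(0.0);
    double left  = std::sin(t * halfPi) * depth;
    double right = std::cos(t * halfPi) * depth;
    return {left, right};
}
```

At 90 degrees (the middle of the field) both channels receive the same gain, so the total acoustic power remains roughly constant as the angle sweeps across the field.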

Having discussed the resonator and output modules of the synthesis network of

Orpheus, I turn now to a discussion of the excitor. Figure 9.5 depicts the excitor and the

parameter nodes which control its behavior. The initial signal that defines this excitor

4 cf. Moore (1990) pp. 350-362 for more detailed discussion of the algorithm on which this spatialization implementation is based.


consists of a sequence of noise bursts. Given this, there are three factors which determine

the behavior of this excitor:

1. the constitution of the initial noise burst stream;

2. the relative degree of continuity of the noise burst stream;

3. the amplitude of the final excitation.

Figure 9.5: Excitor Model (parameter nodes B, R, D, S, and L). The distortion module computes y = x - x^3/3 for -1 <= x <= 1, and y = 0 otherwise.

The constitution of the initial noiseburst stream is as follows. The parameter

nodes B and R define the duration of the noiseburst and the rate at which bursts are

generated, respectively. The D parameter node determines the duration of a stream.

"Continuity" of the excitation signal is determined through the combined behavior

of the non-linear filter (the components circumscribed by a dotted box in figure 9.5) and

the distortion module. The coefficient of the non-linear filter is defined by parameter

node S. S has an allowable range of {-1, 1}. For most of this range, the result is a slight filtration of the noiseburst signal; however, as S approaches the range between -.98 and

-1.0, the output of the filter becomes greater than unity. The distortion module,

meanwhile, flattens all signals whose amplitude is greater than 1.0, or less than -1.0. As a

result, any samples entering it that are greater than 1.0 or less than -1.0 are flattened,

while all other samples are scrambled according to the algorithm shown. If a relatively


large number of input samples (to the distortion module) result in getting flattened, the

result is a discontinuity of the signal, as is illustrated in figure 9.6.


Figure 9.6: Behavior of Distortion Module

If the average number of samples which exceed the range {-1, 1} is very large, relative to the number of samples which fall within that range, then the result is an intermittence of

excitor signals-signals that are bursty, exhibiting "rhythmic" behavior.

The amplitude of the final excitation is defined by the value of L, which has a

range of {0, 1}.

Given the description just provided, there are a total of eleven parameter nodes by

which the synthesis algorithm is controllable. This will be relevant in the discussion of

the data model of Orpheus.

9.4 The Data Model

Orpheus' data model is defined as a set of C++ classes. Figure 9.7 shows the

hierarchy of classes which defines the data model.

Indentations show the whole/part relations among classes: an indented class is a component of the class above it. For instance, the CtlConfiguration and CtlPathInTime


classes are constituents of the SndObject class. Similarly, the ParmNodeList and the ParmNodeConnectionList classes are constituents of the CtlConfiguration class.

SpawnFilter
SndObject
    CtlConfiguration
        ParmNodeList
        ParmNodeConnectionList
    CtlPathInTime

Figure 9.7

Each of these classes will now be described, beginning with the lowest level

classes first.

9.4.1 The ParmNodeList

A ParmNodeList defines ranges for each parameter node. As already stated,

parameter nodes determine the control space for the underlying sound synthesis

algorithm. The parameter nodes for this algorithm are: N1, N2, P1, P2, Cd, Ca, B, R, D,

S, and L. Typically, in designing control interfaces within Orpheus, one first defines

ranges by which values are constrained for each parameter node. So, for instance, one might constrain N1 within the range {100, 900},5 N2 within the range {22, 77}, and so on

for other parameter nodes.

One might also define a parameter node as constant, rather than variable within a

range: for instance, one might define P1 as having a constant value of 1.0. When a

parameter node is defined as constant, it is not included in the ParmNodeList. This is

because the ParmNodeList is a publicly exported list which becomes usable within a

CtlConfiguration. A CtlConfiguration is an object through which an aggregation of

parameter nodes is controlled either through direct manipulation using a GUI, or within a

program. Since parameter nodes that are defined with constant values are not subject to

alteration, they are not exported to controlling agents.

9.4.2 The ParmNodeConnectionList

This is a list of ParmNodeConnections. A ParmNodeConnection is a connection

between two parameter nodes such that the alteration of the value of one parameter node

causes the alteration of the value of the other. ParmNodeConnections define weights

5 Ranges for delay lengths N1 and N2 are given in Hz.


that determine the manner in which the alteration of one parameter node effects the other.

For instance, suppose that parameter nodes N1 and N2 are joined by a

ParmNodeConnection which has a weight of 0.5. This means that when one parameter

node is altered by a factor of 1.0, the connected parameter node is altered by a factor .5.

Similarly, if the weight is -2.0, then the alteration of one parameter node by a factor of 1.0

will result in the alteration of its connected node by a factor of -2.0.
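A minimal sketch of this weighted propagation might look as follows. The names are illustrative, not the actual Orpheus classes, and connections are assumed to be symmetric, as figures 9.8a and 9.8b suggest.

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

// Sketch of weighted parameter-node connections: altering one node moves
// each directly connected node by weight * delta. Illustrative names;
// connections are assumed symmetric.
struct ParmNodeGraph {
    std::map<std::string, double> value;
    std::map<std::string, std::vector<std::pair<std::string, double>>> conn;

    void connect(const std::string& a, const std::string& b, double w) {
        conn[a].push_back({b, w});
        conn[b].push_back({a, w});
    }
    void alter(const std::string& node, double delta) {
        value[node] += delta;
        for (const auto& link : conn[node])
            value[link.first] += link.second * delta;
    }
};
```

With a weight of -2.0 between N1 and N2, altering N1 by one unit downward moves N2 two units upward, matching the behavior described for figure 9.8a.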

As an example of how this works, consider figure 9.8. In this example, as in the

others that follow, parameter nodes, along with their ParmNodeConnections, are depicted

within a 2-dimensional space. An alteration of a parameter node value is depicted as a

movement within this two-dimensional space. All such movements are scaled from the

range of the parameter node being altered to the range defined for the x/y plane in which

such an alteration is depicted as a movement. This representation is preserved in one of

the graphical "views" provided within Orpheus, and is discussed in greater detail later in

this chapter.

In figure 9.8, parameter nodes Nl and N2 are joined by a ParmNodeConnection

which has a weight of -2.0. An alteration of parameter node Nl by 1 unit downward

would result in an alteration to N2 in 2 units upward (figure 9.8a). Similarly, an

alteration of N2 by 1 unit will have the same effect on N2 (figure 9.8b). A parameter

node can be connected to more than one other node. 'For instance, in the prior example,

Nl might be connected not only to N2, but to S and B as well. Each such

ParmNodeConnection might have a different weight. Consider for instance the

configuration illustrated in figure 9.9a. With this configuration, an alteration of Nl

(shown once again as a movement within the x/y plane) by 1 unit will have the effect

depicted in figure 9.9b.

Figure 9.8a    Figure 9.8b



Figure 9.9a Figure 9.9b

The motivation behind the use of the ParmNodeConnection is two-fold. First, it

facilitates a real-time graphical interface with which multiple parameter nodes can be

controlled using a single pointing device.6 Second, it allows for the grouping of

conceptually interdependent parameter nodes into a single definable aggregate. This

enables encapsulation whereby an aggregate of interconnected nodes could be represented

as a single data point.

9.4.3 The CtlConfiguration

A CtlConjiguration is a control object that is comprised of a ParmNodeList and a

ParmNodeConnectionList. It defines a control interface to a sound synthesis or music

morphological module. Figure 9.10 depicts a Ct/Conjiguration which is relevant to the

synthesis model defined for Orpheus. First, note that only a subset of the entire set of

parameter nodes is defined for this configuration: these are Nl, N2, Pl, S, B, Cd, R, and

L.

As already discussed, the attachment of connected nodes to any given node allows for the indirect alteration of all attached nodes through the direct alteration of a single node. Within a CtlConfiguration, that single node is called the 'primary node'; all attached nodes are termed 'secondary nodes.' In figure 9.10, and in the figures that follow, a primary node is indicated by being filled in with a dark gray color. In these examples, parameter node N1 is the primary node. It should be noted, however, that any parameter node within a CtlConfiguration can be a primary node.

As demonstrated during the previous discussion on ParmNodeConnections,

alteration of the primary node causes alterations only to those nodes (the secondary

nodes) to which it is attached through a ParmNodeConnection. All other nodes remain

unaffected. For instance, going back to figure 9.10, altering N1 by a single unit would

6 The problem of "high-dimensional control" of synthesis parameters has been addressed in the research of Insook Choi, Robin Bargar, and Camille Goudeseune. See for instance Choi, et. al. ( 1995).


result in the configuration shown in figure 9.11. Further alteration would result in the

configuration shown in figure 9.12.

Figure 9.10: A CtlConfiguration.

Figure 9.11: Change in Structure of Configuration Through Alteration of N1.


Figure 9.12: Further Change in Structure of Configuration Through Alteration of N1.


Note that only those nodes that are secondary to the primary node (in this case N1)

are altered. These secondary nodes are shown with thicker lines in figure 9.12. If the

composer wishes to add other secondary nodes, s/he can do so by defining a

ParmNodeConnection for those nodes. According to this procedure, one could

conceivably interconnect all nodes such that the alteration of any single node would result

in the alteration of all other nodes according to the weights specified in their connections.

Within Orpheus, one can define and interact with any number of such

configurations. Each such configuration constitutes an interface through which the

synthesis algorithm is controlled. As such, each configuration defines a unique interface

to the underlying synthesis algorithm. Through the definition and exploration of different

interfaces, one effectively delineates "pockets" of feature spaces in relation to the larger

space which the synthesis algorithm, in its totality, might define. Through the explicit

definition of CtlConfigurations, aspects of the overall behavior of the synthesis algorithm

are constrained according to particularized control structures. Such a constraint acts as a

performance constraint-that is, it constrains the manner in which the performance of a

control, either through a program or through direct manipulation of graphical objects,

affects the larger synthesis model.

This is not unlike the notion of control available with respect to a musical

instrument. On a cello, for instance, pressing down firmly on the bow, which is placed

near the bridge, and drawing the bow upward upon the fourth string with a finger of the

left hand pressed down upon that same string about three inches from the nut-this

integrated action enacts a constrained performance over the totality of possible effects

that the cello is capable of producing. In a similar fashion, within a CtlConfiguration, a

composer can specify non-constant parameter nodes, and the ranges by which they are

constrained, as well as ParmNodeConnections between nodes, and the weights which

determine those connections. This can be done at any time during the design process.

9.4.4 The CtlPathInTime

A CtlPathInTime constitutes an 'in-time' structure: it defines a temporally unfolded morphology. A CtlPathInTime is a list of values of a primary node, along with clock-times (in ms.) which determine the temporal dimension of those values. Figure 9.13 depicts the temporal structure of the movements of a primary node, N1. The list of values/time-points constitutes a "path" which a particular node follows in time and space.

In this figure, the path has a duration of 390 ms (0.39 seconds) and defines a trajectory for parameter node N1 which is bounded by the values 100 and 1000.


point#   N1 value   time-point (ms)   duration (ms)
1        100        0                 29
2        170        29                28
3        210        57                18
4        150        75                19
5        220        94                29
6        310        123               17
7        450        140               22
8        460        162               27
9        370        189               18
10       320        207               33
11       450        240               29
12       680        269               43
13       950        312               28
14       1040       340               50
15       1000       390               1

Figure 9.13: Temporal Unfolding of a Single Primary Node.

While a particular CtlPathInTime defines a path for a primary node only, all other nodes which have non-zero connections to the primary node (i.e. secondary nodes) are, indirectly, affected by that path. Consequently, defining a CtlPathInTime designates an unfolding in time of multiple synthesis parameters, each following an independent path. In creating a CtlPathInTime, a composer can select any parameter node as the primary node, thus effecting a large number of possible trajectories.

9.4.5 The SndObject

A SndObject represents a model of an individual sound or pattern of sounds. It is

comprised of a CtlConfiguration and a CtlPathInTime. The CtlConfiguration defines a control space vis-a-vis the underlying synthesis algorithm. The CtlPathInTime constitutes an unfolding in time with respect to a particular grouping of parameter nodes.

With a SndObject, one can begin to compose sounds, and other acoustical and musical objects. A SndObject therefore acts as a kind of sound 'prototype,' or template, on the basis of which acoustically realized events might be spawned; it is an inside-time structure that has generative capacities.


A SndObject is a computational object. As such, it exports a set of functions. These functions relate primarily to the spawning of SndObjectInstances; they are described in the following.

Replicate nameOfNewSndObjectInstance

Create a new SndObjectInstance (with the name nameOfNewSndObjectInstance) through replication of the current SndObject. The new one will inherit the structure and behavior of the parent.

Example:

// Instantiate a SndObjectInstance
SndObject sObjInst1

// Replicate, from that SndObjectInstance, a new
// one.
//
sObjInst1 Replicate sObjInst1a

Spawn [command] nameOfNewSndObjectInstance

Create a new SndObjectInstance (with the name nameOfNewSndObjectInstance) through varied replication of the current one. The Spawn function has many "commands." Each command defines some method by which a child morphology is spawned. These reflect various ways in which the ranges of the parameter nodes of the CtlConfiguration of the parent are varied for the child. Each such command has one or more arguments. Currently, there are only three commands implemented, but others are to be added as the software is developed.

The commands currently implemented are as follows:

timeStretch -- stretch the time by some amount.

delayShift -- shift the lengths of both delay lines by some amount.

amplify -- amplify volume by some amount.


In a future implementation of this software, the "commands" aspect will be

greatly enhanced to enable the creation of primitive commands and the

"threading" together of these primitive commands into commands of

arbitrary complexity.

Example:

sObjInst1 Spawn timeStretch 3.5 sObjInst1a
sObjInst1 Spawn delayShift .23 sObjInst1b
sObjInst1 Spawn amplify 2.0 sObjInst1c

SetSpawnFilter values... nameOfFilter

Associate the current SndObject with a "filter" by which SndObjectInstances might be spawned. Once a filter has been so associated, the Spawn function, when invoked for that SndObject, will use the filter structure defined in spawning SndObjectInstances. A filter can be set by specifying filter values; it can also be set by giving the name of a valid filter file, or an already instantiated filter object (see SpawnFilter below).

Example:

// Set the SpawnFilter for a sound instance with
// explicitly defined data
//
sObjInst1 SetSpawnFilter 0 -.25 1 .75 2 .1 6 -.95

// Instantiate a SpawnFilter object; then
// associate that SpawnFilter object with
// a previously instantiated SndObjectInstance
// object.
//
SpawnFilter filter1 0 -.25 1 .75 2 .1 6 -.95
sObjInst1 SetSpawnFilter filter1

// Set the SpawnFilter for a given
// SndObjectInstance from a file called
// 'filter1.filter.'
//
sObjInst1 SetSpawnFilter filter1.filter


Play [command] nameOfFilter

Play the SndObject or SndObjectInstance through the audio hardware. As

is the case for the Spawn function, the Play function has a number of

commands.

Examples:

// Play with timeStretch at 3.5
sObjInst1 Play timeStretch 3.5

// Play with delayShift set to .23
sObjInst1 Play delayShift .23

// Play twice as loud
sObjInst1 Play amplify 2.0

9.4.6 The SndObjectInstance

A SndObjectInstance is the actualization of a SndObject within a realized musical structure. It derives all of its features from a SndObject. As such, the discussion of the class methods for SndObject, including those which deal with spawning, is applicable to the SndObjectInstance.

9.4.7 The SpawnFilter

A SpawnFilter defines a 'filter' which a SndObject uses in spawning SndObjectInstances and which a SndObjectInstance uses in spawning child SndObjectInstances. The SpawnFilter does this by applying mathematical functions against the CtlPathInTime by which the parent SndObject/SndObjectInstance is defined.

The first such function is applied against the list of values for the primary node defined

for the SndObject. The second such function is applied against the time-point values for

each point along the time path.

An example will help to clarify this description. Take a SndObject whose primary node is N1 and which has the CtlPathInTime depicted earlier in figure 9.13. Next, take two

functions: one is applied to the values of the primary node (N1), and the other is applied

to the time-point values. Each function in a SpawnFilter defines a range with respect to

which the function is operative. Two such functions are depicted in figure 9.14 as F1 and

F2 respectively. Function F1 is used to filter the set of parameter node values; function

F2 is used to filter the duration values for each time-point. The algorithm for arriving at

the new offset and range values is as follows:


newOffset = (func_Offset * originalRange) + originalOffset
newRange  = (func_Range * originalRange)


Taking F1 as a filter function which is applied to the original set of N1 values, the following computation is implemented:

originalOffset = 100          originalRange = 940
F1_Offset = .75               F1_Range = .30

newOffset = (.75 * 940) + 100 = 805
newRange  = (.30 * 940)       = 282

The CtlPathInTime for the resulting spawned object has a new structure as is depicted in

figure 9.15.

The new path, contrasted with the original path, is shown graphically in figure 9.16. As can be observed, the range of the N1 values in the new path represents a highly compressed version of that of the original path. Moreover, the offset of the new path is such that all N1 values focus on the upper part of the original path's range. By contrast, the deployment of time-points represents an expansion of those in the original path. In spite of these rather extreme transformation functions, however, the contour of the new path's structure remains similar to that of the original.

To reiterate a point already stated, alteration in a primary node causes alterations

to all nodes (i.e. secondary nodes) to which it is connected. Consequently, values for

secondary nodes will be transformed in a manner that is related to those of the primary

node. The particularity of that relation is determined by the weight according to which

the secondary node is connected to the primary node.

Figure 9.14a: Filter Function F1    Figure 9.14b: Filter Function F2


point#   N1 value   time-point (ms)   duration (ms)
1        805        0                 48
2        826        48                46
3        838        94                31
4        820        125               33
5        841        158               48
6        868        206               30
7        910        236               38
8        913        274               45
9        886        312               32
10       871        357               54
11       910        389               48
12       979        443               69
13       1060       491               46
14       1087       560               80
15       1075       606               1

Figure 9.15: Temporal Unfolding of a Spawned SndObjectInstance


Figure 9.16: Comparison of two different time paths

The discussion thus far has focused primarily on the creation and modification of

SndObjects. I will now discuss the larger data model within which SndObjects are

activated. The two remaining components of the data model are (1) EvtStructure and (2)

OutputModel, as I will now describe.


9.4.8 The EvtStructure

An EvtStructure is a generative structure that encapsulates a process model of possible events. An event constitutes the deployment of a SndObjectInstance at a particular point in time. At any given moment, one or more events can occur simultaneously. The structuring of overlapping events is handled within an OutputModel (discussed in the next subsection of this chapter). The EvtStructure class is depicted in

figure 9.17.

The EvtStructure contains three components: (1) a SndObject, (2) a SpawnFilter object, and (3) an Environment object. The SndObject component serves as a kind of 'prototype' morphology for all events. All events deployed by an EvtStructure are based on this model. Each time an event is generated, a SndObjectInstance is spawned from the prototype SndObject using the EvtStructure's own private SpawnFilter object (figure 9.18).

Since the SndObject and SpawnFilter objects have already been discussed at

length, I will focus the current discussion on the Environment object. Generally speaking,

the Environment component exercises constraints upon other simultaneously unfolding

events. At any given moment in the generation of events, there is a single event that is

considered 'primary.' An event is marked 'primary' because the EvtStructure which

generates it is marked 'primary.' When an EvtStructure is marked primary, it is allowed

to affect the way in which other simultaneous events are generated by exporting

constraints that are exercised within the EvtStructures from which those events are

generated.

EvtStructure
    AveDurFactorBtwnSpawnedEventOnsets
    DensityFactor
        Average: 2
        Range: +/- 2
    LoudnessFactor
        Min: .3  Max: .7
    ParmNodeFactor
        ParmNode1: {min, max}
        ParmNode2: {min, max}
        ...
        ParmNodeN: {min, max}

Figure 9.17: The EvtStructure class



Figure 9.18: An EvtStructure generating an event through the spawning of a SndObjectlnstance


An EvtStructure's Environment constrains simultaneously occurring events in three ways. First, it limits the number of other events, on average, that can occur in overlap with the current event-this is the DensityFactor sub-component. Each such limit is defined with respect to a range of variance. For instance, if the average limit is described to be 2, and the range of variance is +/- 2, then there can be, at most, three other overlapping events (for a total of four events) and, at least, no other events besides the current one (for a total of one event).

Second, the Environment limits amplitude levels of coexisting events. This sub-component is called LoudnessFactor. For each of the simultaneously unfolding events,

the amplitude level is multiplied by LoudnessFactor. LoudnessFactor defines a range of

values with respect to which a multiplicand is computed and applied to the amplitude

levels of another event. Through application of this constraint, a primary EvtStructure

can cause its own events to be foregrounded in relation to other overlapping events.

Third, the Environment limits selected parameter nodes of a concurrent EvtStructure to within a specified range. Consider, for instance, the concurrent existence of two EvtStructures E1 and E2, with E1 as primary. Given this arrangement, the Environment component within E1 would export constraints, in addition to those described above, that alter the ranges of selected parameter nodes which comprise the SndObject object within E2. This Environment sub-component is called ParmNodeFactor.

An example will help to clarify this description. Since E1 is the primary EvtStructure, its Environment can alter how a SndObjectInstance is spawned within E2. In the example shown in figure 9.19, in addition to LoudnessFactor, there are two parameter nodes indicated within ParmNodeFactors, named N1 and N2. N1 is defined as a multiplication factor with the range {.3, .7}, while N2 is defined as a multiplication factor with the range {.9, 1.0}. What this all means is that when E2 generates a particular event, by spawning a SndObjectInstance through invocation of its SpawnFilter, the


multiplication ranges defined within the SpawnFilter for parameter nodes N1 and N2 are set according to the ParmNodeFactor multiplication ranges within E1. This results in the multiplication of the ranges of N1 and N2 (in E2) with the ranges {.3, .7} and {.9, 1.0} respectively. The ranges for N1 and N2 within the newly spawned SndObjectInstance will have, as a consequence, the values {120, 150} and {1350, 2000} respectively.

The amplitude of the resulting SndObjectInstance will be similarly affected, through the export, from E1, of a LoudnessFactor by which the amplitudes of each point along that SndObject's control path are altered. As a consequence, while the original range within E2's SndObject is {0.5, 0.6}, the range within the spawned SndObjectInstance will be {0.53, 0.57}.

N1 = {100, 300}
N2 = {500, 1500}
Loudness = {0.5, 0.6}

Environment:
    LoudnessFactor: {.3, .7}
    ParmNodeFactor:
        N1: {.1, .3}
        N2: {.9, 1.0}

Loudness = {0.53, 0.57}

Figure 9.19

In addition to the three EvtStructure components previously discussed, there is a

variable called AveDurFactorBtwnSpawnedEventOnsets, indicating the duration

between spawned events. It is a multiplicand by which the duration of the last spawned

event is multiplied, the result of which defines the beginning time-point of the next

spawned event. As is the case with all other variables within an EvtStructure,

AveDurFactorBtwnSpawnedEventOnsets defines a range from which, at the moment

during which a new event is being spawned, a specific value is computed. So, for

example, suppose that the event Evt1_1 has a duration of .3 seconds, and the computed AveDurFactorBtwnSpawnedEventOnsets value for the subsequent event (Evt1_2) is 1.2.


That subsequent event then would have a starting time-point of (1.2 * .3 = .36) seconds after the starting time-point of Evt1_1.

9.4.9 The OutputModel

An OutputModel controls the actual output of events. The OutputModel class,

and its components, is depicted in figure 9.20.

OutputModel
    multFactorBtwnEvtOnsets
    numEvents
    Grammar
    EvtStructure1
    EvtStructure2
    ...
    EvtStructureN

Figure 9.20

The OutputModel consists of two components: a Grammar and a list of EvtStructures.

The Grammar defines a simple type-2 grammar. This is a grammar whose rewrite rules

are "context-free."7 An example of such a grammar might be as follows:

X -> a, Y
Y -> .2(b, Y) | .8(b, Z)
Z -> .1(c, Z) | .9(d, X)

In this example, terminal nodes are represented by lower-case letters. Forks in a rewrite rule are indicated by a '|'. The numeric values represent probability factors according to

which one possible path is chosen over another. The above grammar would produce the

following kinds of streams of terminal tokens:

a, b, b, c, d, a ...
a, b, b, b, c, c, c, d, a ...
a, b, b, b, c, d, a, b, b, c, c, c, d, a ...

7 cf. Chomsky (1957) for an explanation.
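A minimal C++ sketch of such a probabilistic rewriting process (the function and its layout are illustrative assumptions, not the Orpheus implementation) might look like this:

```cpp
#include <cstdlib>
#include <string>

// Generate a stream of terminal tokens from the grammar above.
// States X, Y, Z are the non-terminals; a, b, c, d the terminals.
std::string generateTokens(int maxTokens) {
    std::string out;
    char state = 'X';
    while ((int)out.size() < maxTokens) {
        double r = (double)std::rand() / RAND_MAX;  // uniform in [0, 1]
        if (state == 'X') {            // X -> a, Y
            out += 'a'; state = 'Y';
        } else if (state == 'Y') {     // Y -> .2(b, Y) | .8(b, Z)
            out += 'b'; state = (r < 0.2) ? 'Y' : 'Z';
        } else {                       // Z -> .1(c, Z) | .9(d, X)
            if (r < 0.1) out += 'c';
            else { out += 'd'; state = 'X'; }
        }
    }
    return out;
}
```

Each pass through the loop applies one rewrite, so every stream begins with 'a', places at least one 'b' before any 'c' or 'd', and returns to 'a' only after a 'd'.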


where a single 'a' is followed by some arbitrary number of 'b's, followed optionally by any number of 'c's, and finally followed by a single 'd'.8

A Grammar can be used for a number of things: in the selection of EvtStructures

from which actual acoustical and musical events are generated; in the definition of

SpawnFilters according to which constituent SndObjectInstances within each generated

event are spawned from the parent SndObject; in the computation of values that

determine starting time-points for events (the multFactorBtwnEvtOnsets variable

discussed below); and in the determination of primary EvtStructures.

The EvtStructureList is a pool of EvtStructures from which actual events are

generated. Each time that an event is to be generated for a given OutputModel object, a

particular EvtStructure is selected from the list. Selection occurs according to rules

defined within the Grammar. For each named EvtStructure in the pool, there is a value

indicating the probability that, at any particular moment, that EvtStructure might become

a 'primary' EvtStructure.
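The probability-weighted selection of a primary EvtStructure can be sketched as follows (a hypothetical helper, not Orpheus source; the probabilities are assumed to sum to 1):

```cpp
#include <cstdlib>
#include <vector>

// Select an index into the EvtStructureList by cumulative probability:
// a uniform random value in [0, 1] is matched against the running sum
// of the per-EvtStructure probabilities.
int selectPrimary(const std::vector<double>& probabilities) {
    double r = (double)std::rand() / RAND_MAX;
    double cumulative = 0.0;
    for (std::size_t i = 0; i < probabilities.size(); ++i) {
        cumulative += probabilities[i];
        if (r <= cumulative) return (int)i;
    }
    return (int)probabilities.size() - 1;  // guard against rounding error
}
```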

In addition to the two components just described (the Grammar and the EvtStructureList), two variables figure within the definition of an OutputModel. These

are multFactorBtwnEvtOnsets and numEvents.

The variable multFactorBtwnEvtOnsets defines a range {min, max} according to

which the starting time-points for events are computed. When generating an event, a

starting time-point is computed by drawing a value from this range and multiplying that

value with the value which is given by the AveDurFactorBtwnSpawnedEventOnsets for

the currently selected EvtStructure. So, for instance, suppose that the range for

multFactorBtwnEvtOnsets is {0.9, 1.3} and that the value drawn from this range for determining the starting time-point for a particular event is 1.1. Next, consider that the

AveDurFactorBtwnSpawnedEventOnsets value defined for the currently selected

EvtStructure is 1.2. In order to arrive at the precise starting time-point for the event to be

generated, these two values (1.1 and 1.2) are multiplied together. In this example, the result would be 1.32. Thus, the starting time-point for the current event would be computed by multiplying 1.32 by the duration of the previous event and adding that product to the most recent starting time-point.
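A sketch of this computation (the helper names are illustrative, and the uniform draw merely stands in for however Orpheus samples its ranges):

```cpp
#include <cstdlib>

// Draw a value uniformly from a {min, max} range.
double drawFromRange(double min, double max) {
    return min + (max - min) * ((double)std::rand() / RAND_MAX);
}

// The onset factor is the product of a value drawn from the OutputModel's
// multFactorBtwnEvtOnsets range and the selected EvtStructure's
// AveDurFactorBtwnSpawnedEventOnsets value.
double onsetFactor(double multMin, double multMax, double aveDurFactor) {
    return drawFromRange(multMin, multMax) * aveDurFactor;
}
```

With the range {0.9, 1.3}, a draw of 1.1 and a factor of 1.2 yield 1.32, which then scales the duration of the previous event to offset the new onset from the most recent starting time-point.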

The variable numEvents defines a range {min, max} that constrains the number of events generated for a given deployment of an OutputModel object. Each time that an OutputModel is so deployed, a new value giving the number of events for that deployment is drawn from the range defined by numEvents.

8 Holtzman (1980), p. 9.


After an OutputModel has been instantiated and defined according to the above criteria, the next step is to call its generate() method. The generate() method defines a

loop within which EvtStructures are selected and then, from these, events are generated

and scheduled in time. First, a value which defines the number of events to be generated

is determined from the range given by numEvents. Then a loop is iterated in which that

number of events is spawned. This is demonstrated in the following C++ sample code:

void OutputModel::generate() {
    int n = getNumEvents();
    for (int i = 0; i < n; i++) {
        EvtStructure e = selectEvtStructure();
        e.generateEvent();
    }
}

9.5 The Graphical Environment

In this section, I give a very brief overview of the Orpheus graphical interface.

There are three kinds of activities in which a composer will engage while using Orpheus.

These are:

1. Investigating the control space of the synthesis algorithm in order to discover

basic principles of its behavior;

2. Defining and investigating control configurations (using CtlConfiguration

objects) according to which such behaviors might be encapsulated and further

investigated;

3. Defining SndObjects, SpawnFilters, EvtStructures, and OutputModels and

investigating their use in the generation of various kinds of acoustical and

musical organizations.

These activities are enabled through the use of three different graphical interfaces. The

first interface is a 'sliders' window familiar to those who have ever used an analog mixing

console. In this case, however, each slider engages a parameter node controlling a

particular coefficient within the underlying synthesis algorithm. The second interface is

an integrated control interface which graphically visualizes CtlConfigurations through


which the manipulation of groupings of control parameters can be made in real-time. The

third interface is a command window in which various kinds of commands are typed.

Within the command window, one can define SndObjects, SpawnFilters, and GlobalSyntaxes and link them in various ways.

Each of these interfaces will now be described in greater depth.

9.5.1 The Sliders Window

Figure 9.23 is a labeled screen dump of the "main window" of Orpheus. Below

the menu items along the top, there is a set of buttons. The first seven buttons represent non-system-specific functions such as opening a new file, saving a file, etc. The remaining buttons are labeled according to their function within Orpheus.

buttons are labeled according to their function within Orpheus.

Figure 9.23: The Orpheus Main Window, showing a menu bar (File, Edit, View, Options, Interactions, Objects, Help), a toolbar, and buttons labeled Slider Window, Command Window, Play Recently Recorded Path, Save Slider Positions to CtlConfig, Save CtlConfig to Slider Window, and Toggle Sound On/Off.

The first such button (beginning with the eighth button from the left) opens a

window with a large bank of sliders (figure 9.24). Once this window appears, the

synthesis engine is activated, and sound is generated. This bank of sliders is a means for

real-time control of the synthesis algorithm; each parameter node has its own slider. Each

slider has the name of the parameter node which it controls. Typically, one would begin

by adjusting the sliders in order to get some sense of the various behaviors of the synthesis

algorithm they control. At various moments, one might discover a behavior which is of

interest. By pressing the Save Slider Positions to CtlConfiguration button (eleventh


button from the left as shown in figure 9.23) one may create a 'snapshot' of the position of

all of the sliders and store it to a particular CtlConfiguration.


Figure 9.24: A Sliders Window.

9.5.2 The Integrated Control Interface

An integrated control interface displays a CtlConfiguration object in terms of interconnected nodes, similar to that which is depicted in figures 9.10, 9.11, and 9.12. If

one chooses to open an integrated control window which is not already defined, one

clicks on the File/New menu item (or clicks the left-most button on the toolbar). When

creating a new integrated control interface, the window is at first empty. One creates a

control interface iteratively by selecting parameter nodes from a list, defining ranges for

selected parameter nodes, and defining ParmNodeConnections between selected

parameter nodes. During this entire process, one can listen to the results of the design by

selecting a node (that is, by placing the mouse pointer over the node and holding down

the left-most mouse button) and moving it around.

To open a previously created configuration, one clicks on the File/Open menu

item, and selects the desired configuration from the list of configuration files shown.

Figure 9.25 shows a typical integrated control view of a CtlConfiguration. Again, the

reader will note a similarity between what is displayed there and the pictures of

CtlConfiguration objects in figures 9.10, 9.11, and 9.12. In fact, the behavior of the

graphical objects in this window is identical to that described with respect to those

figures. By positioning the mouse pointer over a parameter node, and holding down the

left mouse button, one makes that node the primary node (see discussion in section 9.4

above). Holding the left mouse button down and dragging the mouse around effects


correlated movement of the primary node, along with any secondary nodes to which it is

attached.

A CtlPathInTime is created by hitting Ctrl-R, selecting a primary node by moving

the cursor over the desired node and then holding down the left-most mouse button, and

then dragging the mouse around in a desired fashion. Upon completion of the path,

hitting Ctrl-R will end the recording. To play back the recording, hit the Play Recently

Recorded Path button (the tenth button from the left, as shown in figure 9.23). Each

recording is added to a list of paths for the current control configuration. A list of

previously recorded paths can be displayed, and individual paths can be played from this list,

as well.

The composer creates SndObjects from within an integrated control window by

selecting the Objects/Create Morphology menu item. This causes the most recently

played path of the currently selected configuration to be recorded to a specific named object.

Figure 9.25: Integrated Control Interface Window.

9.5.3 The Command Window

In addition to interaction through direct manipulation, a composer can interact

with sound structures by entering commands within a command window (figure 9.26).

Section 9.5 gives detailed discussion of the commands based on SndObject and

SndObjectInstance objects. In the final implementation of Orpheus, the command

window will eventually support a small Forth-like threaded interpreted language for

building EvtStructures and OutputModels and for realizing their outputs.


SObjInst1 Play

Figure 9.26: Command Window.

9.7 Discussion: Model of Composition

Orpheus is a work in progress. It is an experiment in which compositional

procedure is viewed as a process by which a system is iteratively designed, tested, and

implemented. In its model of compositional procedure, it attempts to override historical

methodologies according to which the musical task environment is frequently understood.

In this attempt it presents two views of compositional procedure and, as such, posits two

models of interaction:

1. composition by instruction

2. composition through direct manipulation

Composition by instruction approaches computer-assisted composition through the

specification of logical structures whose aggregate behavior produces outputs that are

interpretable as musical data. In composition by instruction, a composer defines abstract

entities based on logical and other organizations in order to discover new models for

musical materials and process.

By contrast, composition by direct manipulation approaches computer-assisted

composition through the manipulation, in real-time, of graphical objects rendered on the

computer screen. Composition by direct manipulation engenders a 'performance' not

unlike that engendered by a traditional musical instrument. With the computer, one can,


however, specify interactions that would not be possible with a traditional musical

instrument. One such interaction might be the changing of the ranges of specific parameters while one plays the instrument. This would be tantamount, say, to changing the thickness of the violin G-string while one is playing Bach's famous Air.

Within the graphical environment of Orpheus, the command window facilitates

composition by instruction. Similarly, both the integrated control interface window and

the sliders window support composition through direct manipulation. Given these two

paradigms, a composer might design 'virtual' instruments, extending the notion of

'instrument' to include specific compositional projects. Such a virtual instrument might

constitute a structure for the deployment of patterns of events, using an OutputModel

object. While designing such 'instruments,' the composer might craft, record, and

experiment with real-time performances of those instruments through use of an integrated

control interface.

Orpheus projects a multi-tiered model of compositional procedure without falling

into the traditional hierarchical heuristics which normally accompany such models. This

multi-tiered model can be depicted as shown in figures 9.21a, 9.21b, and 9.21c. Each of

these figures depicts a different mode of interaction.

Figure 9.21a depicts real-time interaction through an integrated control view.

Through observation-both of her/his own bodily movements vis-a-vis the computer

pointing device and of the changing graphical display-the composer correlates action

and result. As s/he engages with the control interface, through manipulation of graphical

elements using a mouse, her/his experience of listening to the acoustical realization of

her/his actions correlates visual depiction of the control configuration with that acoustical

realization. By this means, an imagined correlation between action and result provides a

meaningful context for further action. With experience, s/he learns to 'play' the

instrument projected by the control interface and the underlying synthesis algorithm in

much the same manner that s/he might play a musical instrument, or drive a car, or fly a

hang-glider. 9

This mode of interaction is extended when the composer specifies the structure of

the control interface with which s/he performs embodied actions. By altering the

structure of the CtlConfiguration, one effectively alters the domain in which one might

perform such actions. Within this mode of interaction, the composer observes the

9 This aspect of human/machine interaction is a potentially rich area for exploration which can only be suggested within the context of the current paper, but which deserves careful consideration. See Varela et al. (1996) for background discussion of the notion of 'embodied action' as it applies to cognition. See also Merleau-Ponty (1964).


acoustical realization as a correlation to the manner in which embodied actions are

redirected through alteration of the structure of the control interface. At such a moment,

real-time interaction through direct manipulation reveals information not only about the

underlying synthesis algorithm, but about the structure of the control interface as well.

This mode of interaction is depicted in figure 9.21 b.

Figure 9.21a: The composer performs embodied actions (in real time) upon the integrated control interface; the interface drives the synthesis algorithm, producing an acoustical/musical realization, which the composer observes.

Figure 9.21b: The composer specifies the structure of the integrated control interface; the interface drives the synthesis algorithm, producing an acoustical/musical realization, which the composer observes.


At another level, a composer composes structures from which patterns of sounds

and musical events are generated (EvtStructures and OutputModels). At this level, one

makes correlations between acoustical/musical realizations and the structures by which

those realizations are hypothesized. By changing the hypotheses, and thus the structure of

the pattern-generating algorithms and SndObjects, one begins to build a heuristic

framework with respect to which outputs synthesize musical structures. These

hypotheses can be tested through implementation of a control interface. Currently,

Orpheus supports such an interface through implementation of a text editor within which

one may construct commands and simple program structures.10 The composer interacts

with the pattern-generating algorithms by constructing small programs by which those

algorithms are instantiated and deployed. In this mode of interaction, the composer draws

correlations between the structure of a command or program and the acoustical and

musical output. Such correlations allow the composer to hypothesize, through proxy,

possible relationships between the structure of pattern-generating algorithms and

acoustical outputs. From this, a composer can begin to make an explicit trace of a model

of compositional procedure and thus of interaction.

Figure 9.21c: The composer specifies commands and simple programs that define the structure of pattern-generating algorithms and SndObjects; their deployment produces an acoustical/musical realization, which the composer observes.

10 In a future implementation of Orpheus, a graphical integrated interface view will allow embodied interaction through control of graphical elements using a mouse.


10. Conclusion

The three case studies presented here do not merely constitute efforts toward the

facilitation of computer-assisted production of musical works. Rather, I regard them as

objects of research in and of themselves.

In the first case study, Wave, iterative chaotic systems are deployed within the

framework of non-linear granular synthesis. Through the specification of a large

collection of startup data, entire compositions are generated.

In the second study, resNet, the approach is more circumspect: rather than trying

to generate entire works, only single sounds or, at most, small aggregates of sounds, were

generated. However, the notion of abstraction remained in that such sounds were

generated according to signal processing principles that were only obliquely referential of

historical and empirical methodology.

Finally, with Orpheus, I attempt to integrate different paradigms of compositional

procedure, allowing the composer to experiment at many different levels of acoustical

signal-generation-from the lowest-level sample to the level of an entire work.

Moreover, it provides different interactive situations with respect to which

different musical and compositional 'performances' are engendered.

With all three systems, composition was understood as including the composition

of the very procedures by which one might compose.

Jean-Francois Lyotard uses the term paralogical to describe a discourse which

brackets the epistemological framework with respect to which a particular language

"game" operates. 11 Such a language game is one which circumscribes human activity in

the sciences, the arts, etc. As an epistemological framework, it emphasizes the aspect of

hermeneutic play which manifests itself within a language. Such a discourse, when

directed at the composition of human/computer interaction, engenders a "political

disturbance of the Subject," orienting it toward "an engagement with a materially

different Other" .12 Composers have been among the leading advocates for such a

"paralogical" approach to human/computer interaction. From Hiller's MUSICOMP,

Xenakis' ST program, Koenig's Project 1/2 and SSP, Brun's SAWDUST, and Berg's PILE to, more recently, systems such as MP1,13 Ivory Tower, Manifold Controller,14 and TrikTraks15 (to name only a few), composers have sought ways in which the computer

11 Lyotard (1984).
12 Docherty (1993), p. 13.
13 Tipei (1987).
14 Choi et al. (1995).
15 Chandra (1997).


can be used to problematise the task environment and, as such, bring about an as-yet

unexpected performance. It is this effort to problematise the task environment of music

composition that I seek to extend in my own research as a composer and as a designer of

composition software systems.


Bibliography

The following bibliography contains sources cited within this document; texts which, in a more general way, influenced the research and ideas explicated within this document; and texts which extend the various research projects presented within this document.

Abowd, G. D., 1990. "Agents: Communicating Interactive Processes." in Human-Computer Interaction: INTERACT '90, ed. D. Diaper et al. North-Holland: Elsevier Science Publishers.

Adorno, T.W., 1993. Hegel: Three Studies. Cambridge, Massachusetts: The MIT Press.

Adorno, T. W., 1994. Philosophy of Modern Music , transl. A. G. Mitchell and W. V. Blomster. New York: Continuum.

Adorno, T. W., 1995. Negative Dialectics, transl. E. B. Ashton. New York: Continuum.

Ames, C., 1987. "Automated Composition in Retrospect: 1956-1986." Leonardo 20(2): 169-185.

Ankrah, A., Frohlich, D. M., and Gilbert, G. N., 1990. "Two Ways to Fill a Bath, With and Without Knowing It." in Human-Computer Interaction: INTERACT '90, ed. D. Diaper et al. North-Holland: Elsevier Science Publishers.

Ashby, W. R., 1964. An Introduction to Cybernetics. London: Chapman & Hall.

Babbitt, M., 1972a. "Past and Present Concepts of the Nature and Limits of Music." in Perspectives on Contemporary Music Theory, ed. B. Boretz and E. T. Cone. New York: W. W. Norton & Co.

Babbitt, M., 1972b. "Twelve-Tone Rhythmic Structure and the Electronic Medium." in Perspectives on Contemporary Music Theory, ed. B. Boretz and E. T. Cone. New York: W. W. Norton & Co.

Bai-Lin, H., 1990. Chaos. Teaneck, NJ: World Scientific Publishing.

Bannon, L. J., 1986. "Issues in Design: Some Notes." in User Centered System Design: New Perspectives on Human-Computer Interaction, ed. D. A. Norman and S. W. Draper. Hillsdale, New Jersey: Lawrence Erlbaum Associates.

Beer, S., 1980. "Preface." in H. Maturana and F. Varela, Autopoiesis and Cognition: The Realization of the Living. Dordrecht: Reidel.


Berg, P., Rowe, R., and Theriault, D., 1980. "SSP and Sound Description." Computer Music Journal 4(3): 25-35.

Berg, P., 1987. "PILE: A Language for Sound Synthesis." in Foundations of Computer Music, ed. C. Roads and J. Strawn. Cambridge, Massachusetts: The MIT Press.

Bidlack, R., 1992. "Chaotic Systems as Simple (but Complex) Compositional Algorithms." Computer Music Journal 16(3): 33-47.

Blum, T., 1979. "Herbert Brun: Project Sawdust." Computer Music Journal 3(1): 6-7.

Bobrow, D. G., 1975. "Dimensions of Representation." in Representation and Understanding, ed. D. G. Bobrow and A. Collins. New York: Academic Press, Inc.

Borin, G., De Poli, G., and Sarti, A., 1992. "Algorithms and Structures for Synthesis Using Physical Models." Computer Music Journal 16(4): 30-42.


Bourdieu, P., 1993. The Field of Cultural Production: Essays on Art and Literature, ed. R. Johnson. New York: Columbia University Press.

Brennan, S. E., 1990. "Conversation as Direct Manipulation: An Iconoclastic View." in The Art of Human-Computer Interface Design, ed. B. Laurel. Reading, Massachusetts: Addison-Wesley.

Brun, H., 1969. "Infraudibles." in Music by Computer, ed. H. von Foerster and J. Beauchamp. New York: John Wiley and Sons, Inc., pp. 117-121.

Brun, H., 1970. "Technology and the Composer." in Music and Technology. Paris: La Revue Musicale.

Buxton, W., 1990. "The 'Natural' Language of Interaction: A Perspective on Non-Verbal Dialogues." in The Art of Human-Computer Interface Design, ed. B. Laurel. Reading, Massachusetts: Addison-Wesley.

Cage, J., 1960. 26' 1.1499" For a String Player, score. New York: Henmar Press, Inc.

Cage, J., 1961. Silence. Middletown, Connecticut: Wesleyan University Press.

Cadoz, C., Luciani, A., and Florens, J.-L., 1993. "CORDIS-ANIMA: A Modeling and Simulation System for Sound and Image Synthesis: The General Formalism." Computer Music Journal 17(1): 19-29.


Chandra, A., 1993. "CounterWave: A Program for Controlling Degrees of Independence between Simultaneously Changing Waveforms." Proceedings of the IAKTA/LIST International Workshop on Knowledge Technology in the Arts, pp. 115-134.

Chandra, A., 1997. "Composing Against the Computer." Lecture given at the University of Illinois at Urbana-Champaign, April 16, 1997.

Choi, I., Bargar, R., and Goudeseune, C., 1995. "A manifold interface for a high dimensional control space." Proceedings of the 1995 International Computer Music Conference. San Francisco: Computer Music Association, pp. 385-392.

Choi, I., 1993. Computation and semiotic practice as compositional process. D.M.A. diss., University of Illinois at Urbana-Champaign.

Chomsky, N., 1957. Syntactic Structures. The Hague, Netherlands: Mouton.

Corey, K., 1991. Is This a Thesis Yet?. Dissertation, University of Iowa.

Corey, K., 1992. Music From the Ivory Tower and Elsewhere. Dubuque, Iowa: Curious Music.

Corey, K., 1997. "My Algorithmic Muse." Sonus 17(2).

DeLio, T., 1984. Circumscribing the Open Universe. Lanham, Maryland: University Press of America.

Di Scipio, A., 1990. "Composition by exploration of nonlinear dynamical systems." Proceedings of the 1990 International Computer Music Conference. San Francisco: Computer Music Association, pp. 324-327.

Di Scipio, A., 1994. "Formal Processes of Timbre Composition: Challenging the Dualistic Paradigm of Computer Music, A Study in Composition Theory (II)." Proceedings of the 1994 International Computer Music Conference. San Francisco: Computer Music Association, pp. 202-208.

Dodge, C., and Jerse, T., 1985. Computer Music: Synthesis, Composition, and Performance. New York: Schirmer Books, Inc.

Dreyfus, H. L., 1993. Being-in-the-World: A Commentary on Heidegger's Being and Time, Division I. Cambridge, Massachusetts: The MIT Press.

Eckel, G., and Gonzales-Arroyo, R., 1994. "Musically Salient Control Abstractions for Sound Synthesis." Proceedings of the 1994 International Computer Music Conference. San Francisco: Computer Music Association, pp. 256-259.


Eimert, H., 1959. "What is Electronic Music?" Die Reihe 1. Bryn Mawr, PA: Theodore Presser.

Erikson, T., 1990. "Working with Interface Metaphors." in The Art of Human-Computer Interface Design, ed. B. Laurel. Reading, Massachusetts: Addison-Wesley.

Farrett, P. W., 1993. "Intensional Composer: An Exploration of Creativity Through Musical Composition." Proceedings of the IAKTA/LIST International Workshop on Knowledge Technology in the Arts, pp. 57-78.

Feiten, B., and Spitzer, M., 1994. "A Modular Construction Set for Time-Domain Editors." Proceedings of the 1994 International Computer Music Conference. San Francisco: Computer Music Association, pp. 284-285.

Findlay, J. N., 1958. Hegel: a Re-examination. New York: Collier Books.

Gadamer, H.-G., 1976a. Hegel's Dialectic: Five Hermeneutic Studies, transl. P. C. Smith. New Haven: Yale University Press.

Gadamer, H.-G., 1976b. Philosophical Hermeneutics, trans. & ed. D. Linge. Berkeley: University of California Press.

Gadamer, H.-G., 1992. Truth and Method, transl. J. Weinsheimer, and D. G. Marshall. New York: Crossroad.

Gleick, J., 1987. Chaos: Making a New Science. New York: Viking.

Gove, P. B., ed., 1966. Webster's Third New International Dictionary of the English Language, Unabridged. Springfield, Massachusetts: G. & C. Merriam Co.

Goudeseune, C., and Hamman, M., 1997. "Mapping data and audio using an event-driven audio server for personal computers." Proceedings of the 1997 International Conference on Auditory Display (ICAD).

Hamman, M., 1991. "Mapping Complex Systems Using Granular Synthesis." Proceedings of the 1991 International Computer Music Conference. San Francisco: Computer Music Association, pp. 475-478.

Hamman, M., 1994. "Dynamically Configurable Feedback/Delay Networks: A Virtual Instrument Composition Model." Proceedings of the 1994 International Computer Music Conference. San Francisco: Computer Music Association, pp. 394-397.

Hamman, M., 1994. "Toward a cybernetics of sound synthesis practice and composition." Paper read at the Technology and the Composer Conference at Luxembourg City.


Hamman, M., 1997. "Composition of Data and Process Models: a Paralogical Approach to Human/Computer Interaction." Proceedings of the 1997 International Computer Music Conference. San Francisco: Computer Music Association.

Hamman, M., 1997. "Interaction as Composition: Toward the Paralogical in Computer Music." Sonus 17(2).

Hamman, M., 1998. "Structure as Performance: Cognitive Musicology and the Objectification of Procedure." in Otto Laske: Navigating New Musical Horizons, ed. J. Tabor. Greenwood Press, forthcoming.

Hegel, G. W. F., 1967. The Phenomenology of Mind, trans. J. B. Bailie. New York: Harper & Row.

Hegel, G. W. F., 1969. Science of Logic, transl. A. V. Miller. Atlantic Highlands: Humanities Press International, Inc.

Heidegger, M., 1988. Hegel's Phenomenology of Spirit, transl. Emad, P., and Maly, K. Bloomington: Indiana University Press.

Helmholtz, H. L. F., 1954. On the Sensations of Tone as a Physiological Basis for the Theory of Music, trans. A. J. Ellis. New York: Dover Publications, Inc.

Hiller, L., and Isaacson, L., 1959. Experimental Music. New York: McGraw Hill.

Holtzman, S., 1980. "A Generative Grammar Definition Language for Music." Interface 9.

Hutchins, E. L., Hollan, J. D., and Norman, D. A., 1986. "Direct Manipulation Interfaces." in User Centered System Design: New Perspectives on Human-Computer Interaction, ed. D. A. Norman and S. W. Draper. Hillsdale: Lawrence Erlbaum Associates.

Jaffe, D. A., and Smith, J. O., 1989. "Extensions of the Karplus-Strong Plucked-String Algorithm." in The Music Machine: Selected Readings from Computer Music Journal, ed. C. Roads. Cambridge, Massachusetts: The MIT Press.

Karplus, K., and Strong, A., 1989. "Synthesis of Plucked-String and Drum Timbres." in The Music Machine: Selected Readings from Computer Music Journal, ed. C. Roads. Cambridge, Massachusetts: The MIT Press.

Kauffman, L. H., 1987. "Self-reference and recursive forms." Journal of Social and Biological Structures 10 (1987).


Kaufmann, W., 1966. Hegel: A Reinterpretation. Garden City, New York: Doubleday & Co.

Kellog, W. A., 1990. "Qualitative Artifact Analysis." in Human-Computer Interaction: INTERACT '90, ed. D. Diaper et al. North-Holland: Elsevier Science Publishers.

Koenig, G. M., 1959. "Studio Technique." Die Reihe 1, pp. 52-54. Bryn Mawr, PA: Theodore Presser.

Koenig, G. M., 1965. "The Second Phase of Electronic Music." unpublished text.

Koenig, G. M., 1969. "PROJECT 1." Electronic Music Reports 1(2).

Koenig, G. M., 1970. "PROJECT 2: A programme for musical composition." Electronic Music Reports 1(3).

Koenig, G. M., 1970b. "The Use of Computer Programmes in Creating Music." Music and Technology. Paris: La Revue Musicale.

Koenig, G. M., 1978. "Composition Processes." Lecture delivered at the UNESCO Workshop on Computer Music, Aarhus, Denmark.

Lakoff, G., and Johnson, M., 1980. Metaphors We Live By. Chicago, IL: The University of Chicago Press.

Laske, O., 1980. "On Composition Theory as a Theory of Self-Reference." Alios. La Jolla: Lingua Press.

Laske, O., 1988. "Introduction to Cognitive Musicology." Computer Music Journal 12(1): 43-57.

Laske, O., 1989. "Composition Theory: An Enrichment of Music Theory." Interface 18: 45-49.

Laske, O., 1991. "Toward an Epistemology of Composition." Interface 20: 235-269.

Laske, O., 1993. "A search for a theory of musicality." Languages of Design 1: 209-228.

Laurel, B., 1990. "Interface Agents: Metaphors with Character." in The Art of Human-Computer Interface Design, ed. B. Laurel. Reading, Massachusetts: Addison-Wesley.

Laurel, B., 1993. Computers as Theatre. Reading, Massachusetts: Addison-Wesley.

Leman, M., 1995. Music and Schema Theory. Berlin: Springer-Verlag.

Lyotard, J.-F., 1984. The Postmodern Condition: A Report on Knowledge, trans. Bennington, G., and Massumi, B. Minneapolis: University of Minnesota Press.

Marcuse, H., 1960. Reason and Revolution: Hegel and the Rise of Social Theory. Boston: Beacon Press.

Maturana, H., Lettvin, J., McCulloch, W., and Pitts, W., 1960. "Anatomy and physiology of vision in the frog." Journal of General Physiology 43, pp. 129-175.

Maturana, H., Uribe, G., and Frenk, S., 1968. "A biological theory of relativistic color coding in the primate retina." Arch. Biologia y Med. Exp. Suplemento No. 1, Santiago: University of Chile.

Maturana, H., 1970. "Neurophysiology of cognition." in Cognition: A Multiple View, ed. P. Garvin. New York: Spartan Books.

Maturana, H., 1974. "Cognitive Strategies." Cybernetics ofCybernetics. Urbana, Illinois: University of Illinois Biological Computer Laboratory.

Maturana, H., 1975. "The organization of the living: a theory of the living organization." International Journal of Man-Machine Studies 7, pp. 313-332.

Maturana, H., 1978. "Biology of language: The epistemology of reality." in Psychology and Biology of Language and Thought: Essays in Honor of Eric Lenneberg, ed. G. A. Miller and E. Lenneberg. New York: Academic Press.

Maturana, H., 1980. "Biology of Cognition." in Autopoiesis and Cognition: The Realization of the Living, ed. H. Maturana and F. Varela. Dordrecht: Reidel.

Maturana, H., and Varela, F., 1980. "Autopoiesis: The Organization of the Living." in Autopoiesis and Cognition: The Realization of the Living, ed. H. Maturana and F. Varela. Dordrecht: Reidel.

May, R. M., 1976. "Simple Mathematical Models with Very Complicated Dynamics." Nature 261(6): 459-467.

Merleau-Ponty, M., 1964. The Primacy of Perception, trans. J. E. Edie. Evanston, Illinois: Northwestern University Press.

Moore, F. R., 1990. Elements of Computer Music. Englewood Cliffs, NJ: Prentice Hall.

Morris, R., 1970. "Some Notes on the Phenomenology of Making: The Search for the Motivated." Artforum 8, pp. 62-66.

Morrison, J. D., and Adrien, J.-M., 1993. "MOSAIC: A Framework for Modal Synthesis." Computer Music Journal 17(1), pp. 45-56.

Mure, G. R. G., 1940. An Introduction to Hegel. London: Oxford University Press.

Mure, G. R. G., 1965. The Philosophy of Hegel. London: Oxford University Press.

Nelson, T. H., 1990. "The Right Way to Think about Software Design." in The Art of Human-Computer Interface Design, ed. B. Laurel. Reading, Massachusetts: Addison-Wesley.

Newell, A., 1990. Unified Theories of Cognition. Cambridge, MA: Harvard University Press.

Norman, D. A., 1986. "Cognitive Engineering." in User Centered System Design: New Perspectives on Human-Computer Interaction, ed. D. A. Norman and S. W. Draper. Hillsdale, New Jersey: Lawrence Erlbaum Associates.

Oppenheim, D., 1991. "Towards a Better Software-Design for Supporting Creative Musical Activity (CMA)." Proceedings of the 1991 International Computer Music Conference. San Francisco: Computer Music Association, pp. 380-387.

Parenti, S., 1985. The Relationships Between an Objectivity-Conditioned Language and the Present State of New Music. D.M.A. diss., University of Illinois at Urbana-Champaign.

Payne, S. J., 1990. "Looking HCI in the I." in Human-Computer Interaction - INTERACT '90, ed. D. Diaper et al. North-Holland: Elsevier Science Publishers.

Pressing, J., 1988. "Nonlinear Maps as Generators of Musical Design." Computer Music Journal 12(2): 35-45.

Punch, B., Sullivan, M., and Koehler, R., 1991. "An Algorithmic Approach to Composition based on Dynamic Hierarchical Assembly." Proceedings of the 1991 International Computer Music Conference. San Francisco: Computer Music Association, pp. 45-52.

Rheingold, H., 1990. "An Interview with Don Norman." in The Art of Human-Computer Interface Design, ed. B. Laurel. Reading, Massachusetts: Addison-Wesley.

Risset, J.-C., 1970. "Synthesis of Sound by Computer and Problems Concerning Timbre." Music and Technology. Paris: La Revue Musicale.

Roads, C., 1978. "Automated granular synthesis of sound." Computer Music Journal 2(2): 61-62.

Roads, C., 1985. "Granular synthesis of sound." Foundations of Computer Music. Cambridge, Massachusetts: The MIT Press, pp. 145-159.

Roads, C., 1989. "Active Music Representations." Proceedings of the 1989 International Computer Music Conference. San Francisco: Computer Music Association.

Roads, C., 1991. "Asynchronous Granular Synthesis." Representations of Musical Signals. Cambridge, Massachusetts: MIT Press, pp. 143-186.

Roads, C., 1996. The Computer Music Tutorial. Cambridge, Massachusetts: The MIT Press.

Rodet, X., and Cointe, P., 1989. "FORMES: Composition and Scheduling of Processes." in The Music Machine: Selected Readings from Computer Music Journal, ed. C. Roads. Cambridge, Massachusetts: The MIT Press.

Simon, H. A., 1969. The Sciences of the Artificial. Cambridge, Massachusetts: The MIT Press.

Smith, J., 1985. "An Introduction to Digital Filter Theory." in Digital Audio Signal Processing: An Anthology, ed. J. Strawn. Los Altos, California: William Kaufmann, Inc.

Stace, W. T., 1955. The Philosophy of Hegel: A Systematic Exposition. New York: Dover Publications, Inc.

Stockhausen, K., 1971. "The Concept of Unity in Electronic Music." in Perspectives on Contemporary Music Theory, ed. B. Boretz and E. T. Cone. New York: W. W. Norton & Co.

Stravinsky, I., 1970. Poetics of Music, trans. Knodel, A., and Dahl, I. Cambridge, Massachusetts: Harvard University Press.

Sullivan, C., 1990. "Extending the Karplus-Strong Algorithm to Synthesize Electric Guitar Timbres with Distortion and Feedback." Computer Music Journal 14(3), pp. 26-37.

Sullivan, M. V., 1984. The Performance of Gesture: Musical Gesture, Then, and Now. D.M.A. diss., University of Illinois at Urbana-Champaign.

Tipei, S., 1987. "Maiden Voyages: A Score Produced with MP1." Computer Music Journal 11(2), pp. 49-58.

Tipei, S., 1990. "The Computer, A Composer's Collaborator." Leonardo 22(2), pp. 241-258.

Truax, B., 1987. "Real-time granulation of sampled sound with the DMX-1000." Proceedings of the 1987 International Computer Music Conference. San Francisco: Computer Music Association.

Truax, B., 1988. "Real-time granular synthesis with a digital signal processing computer." Computer Music Journal 12(2): 14-26.

Truax, B., 1990. "Chaotic Non-Linear Systems and Digital Synthesis: an Exploratory Study." Proceedings of the 1990 International Computer Music Conference. San Francisco: Computer Music Association, pp. 100-103.

Uribe, R. B., 1991. Tractatus Paradoxico-Philosophicus: A Philosophical Approach to Education. Urbana, Ill: Ricardo B. Uribe.

Varela, F. J., Thompson, E., and Rosch, E., 1996. The Embodied Mind: Cognitive Science and Human Experience. Cambridge, Massachusetts: The MIT Press.

Vercoe, B., 1988. CSOUND: a Manual for the Audio Processing System and Supporting Programs. The Media Laboratory, M.I.T.

Vertegaal, R., Eaglestone, B., and Clarke, M., 1994. "An Evaluation of Input Devices for Use in the ISEE Human-Synthesizer Interface." Proceedings of the 1994 International Computer Music Conference. San Francisco: Computer Music Association.

Vertegaal, R., and Bonis, E., 1994. "ISEE: An Intuitive Sound Editing Environment." Computer Music Journal 18(2), pp. 21-29.

Von Foerster, H., 1973. "Cybernetics of Epistemology." Proceedings of the 5th Congress of the Deutsche Gesellschaft für Kybernetik. Munich: R. Oldenbourg Verlag.

Von Foerster, H., et al., 1974. The Cybernetics of Cybernetics. Urbana, IL: The Biological Computer Laboratory.

Wessel, D., 1979. "Timbre Space as a Musical Control Structure." Foundations of Computer Music. Cambridge, Massachusetts: The MIT Press, pp. 640-657.

Wiggins, G., Miranda, E., Smaill, A., and Harris, M., 1993. "A Framework for the Evaluation of Music Representation Systems." Computer Music Journal 17(3), pp. 31-42.

Wilson, K., 1974. The Cybernetics of Cognitive Processes. M.S. thesis, University of Illinois at Urbana-Champaign.

Winograd, T. and Flores, F. 1986. Understanding Computers and Cognition. Reading, Massachusetts: Addison-Wesley Publishing Company, Inc.

Wittgenstein, L., 1983. Tractatus Logico-Philosophicus. London: Routledge & Kegan Paul Ltd.

Wolff, C., 1959. "On Form." Die Reihe 1. Bryn Mawr, PA: Theodore Presser.

Woodhouse, J., 1992. "Physical Modeling of Bowed Strings." Computer Music Journal 16(4), pp. 43-56.

Xenakis, I., 1971. Formalized Music: Thought and Mathematics in Composition. Bloomington, Indiana: Indiana University Press.

Zicarelli, D., 1987. "M and Jam Factory." Computer Music Journal 11(4), pp. 13-23.