9th International Conference on New Interfaces for Musical Expression

Proceedings of the 9th International Conference on New Interfaces for Musical Expression. All rights reserved by Carnegie Mellon University. See copyright notices on individual papers regarding specific permissions to copy. No other part of this electronic proceedings may be reproduced in any format without permission from Carnegie Mellon University. To seek permission, please contact: Gloriana St. Clair, Dean of University Libraries, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA (Tel: +1 412 268 2447, Email: gstclair@andrew.cmu.edu). For technical support please contact Causal Productions (info@causalproductions.com).

June 4–6, 2009
Pittsburgh, Pennsylvania, USA
Proceedings of the 9th International Conference on New Interfaces for Musical Expression
Editors: Roger B. Dannenberg and Kristi D. Ries
The 9th International Conference on New Interfaces for Musical Expression
June 4–6, 2009

Pittsburgh, Pennsylvania, USA

Welcome from the Conference Chairs

Conference Organization

NIME 2009 Keynote

NIME 2009 Panel

Sponsors

NIME 2009 Session List

1: Evaluation and Modeling
2: Posters and Demos I
3: Robotics and New Interfaces
4: Electroacoustics
5: Computer Systems
6: Posters and Demos II
7: Haptics and Extended Instruments
8: Design and Graphics
9: Sensing and Conducting
10: Posters and Demos III
11: Control Strategies and Installations
12: Mobile Music
Program 1 of the NIME 2009 Concert Schedule
Program 2 of the NIME 2009 Concert Schedule
Program 3 of the NIME 2009 Concert Schedule
Program 4 of the NIME 2009 Concert Schedule
NIME 2009 Installations
NIME 2009 Table of Contents

Evaluation and Modeling (NIME 2009 Sessions)
9am–11am, Thursday June 4th, 2009

1    From Real to Virtual: A Comparison of Input Devices for Percussion Tasks
     Mike Collicutt, Carmine Casciato, Marcelo M. Wanderley
7    Probabilistic Model of Pianists' Arm Touch Movements
     Aristotelis Hadjakos, Erwin Aitenbichler, Max Mühlhäuser
13   A Quantitative Evaluation of the Differences Between Knobs and Sliders
     Steven Gelineck, Stefania Serafin
19   Evaluation of 3D Haptic Target Rendering to Support Timing in Music Tasks
     Ricardo Pedrosa, Karon E. MacLean
25   Evaluating Interactive Music Systems: An HCI Approach
     William Hsu, Marc Sosnick
Posters and Demos I (NIME 2009 Sessions)
12:15pm–1:45pm and 3:45pm–4:30pm, Thursday June 4th, 2009

29   The Ghetto Bastard: A Portable Noise Instrument
     Neal Spowage
31   The Navi Activity Monitor: Toward Using Kinematic Data to Humanize Computer Music
     Eric Humphrey, Colby Leider
33   Utilizing Tactile Feedback to Guide Movements Between Sounds
     Alexander Müller, Georg Essl
35   An Interface for Live Interactive Sonification
     Sam Ferguson, Kirsty Beilharz
37   Responsive Music Interfaces for Performance
     Alexander Reben, Mat Laibowitz, Joseph A. Paradiso
39   Hands on Stage: A Sound and Image Performance Interface
     Chi-Hsia Lai
41   The Vibrobyte: A Haptic Interface for Co-Located Performance
     Kyle McDonald, Dane Kouttron, Curtis Bahn, Jonas Braasch, Pauline Oliveros
43   Multi-Laser Gestural Interface — Solutions for Cost-Effective and Open Source Controllers
     Meason Wiley, Ajay Kapur
45   Mims: Interactive Multimedia Live Performance System
     Ryo Kanda, Mitsuyo Hashida, Haruhiro Katayose
48   netBody — "Augmented Body and Virtual Body II" with the System, BodySuit, Powered Suit and Second Life — Its Introduction of an Application of the System
     Suguru Goto, Rob Powell
50   Life Game Orchestra as an Interactive Music Composition System Translating Cellular Patterns of Automata into Musical Scales
     Keisuke Ogawa, Yasuo Kuhara
52   Natural Materials on Stage: Custom Controllers for Aesthetic Effect
     John Toenjes
98   The Kalichord: A Physically Modeled Electro-Acoustic Plucked String Instrument
     Daniel Schlessinger, Julius O. Smith III
54   Controlling Live Generative Electronic Music with Deviate
     Sarah Keith
56   SpiralSet: A Sound Toy Utilizing Game Engine Technologies
     Andy Dolphin
58   LUMI: Live Performance Paradigms Utilizing Software Integrated Touch Screen and Pressure Sensitive Button Matrix
     Mingfei Gao, Craig Hanson
60   The SARC EyesWeb Catalog: A Pattern Recognition Toolbox for Musician-Computer Interaction
     Nicholas Gillian, R. Benjamin Knapp, Sile O'Modhrain
62   A 2D Fiducial Tracking Method Based on Topological Region Adjacency and Angle Information
     Hiroki Nishino
Robotics and New Interfaces (NIME 2009 Sessions)
1:45pm–3:45pm, Thursday June 4th, 2009

64   Anthropomorphic Musical Performance Robots at Waseda University: Increasing Understanding of the Nature of Human Musical Interaction
     Jorge Solis, Takeshi Ninomiya, Klaus Petersen, Maasaki Takeuchi, Atsuo Takanishi
70   The Creation of a Multi-Human, Multi-Robot Interactive Jam Session
     Gil Weinberg, Brian Blosser, Trishul Mallikarjuna, Aparna Raman
74   MusicGrip: A Writing Instrument for Music Control
     Nan-Wei Gong, Mat Laibowitz, Joseph A. Paradiso
78   Let Loose with WallBalls, a Collaborative Tabletop Instrument for Tomorrow
     Grant Partridge, Pourang Irani, Gordon Fitzell
82   SORISU: Sound with Numbers
     Hye Ki Min
86   The Tactus: A Tangible, Rhythmic Grid Interface Using Found-Objects
     Yotam Mann, Jeff Lubow, Adrian Freed
90   Real-Time Phase Vocoder Manipulation by Runner's Pace
     Jason A. Hockman, Marcelo M. Wanderley, Ichiro Fujinaga
Electroacoustics (NIME 2009 Sessions)
4:30pm–6pm, Thursday June 4th, 2009

94   A Discussion of Multidimensional Mapping in Nymophone2
     Kristian Nymoen, Alexander Refsum Jensenius
98   The Kalichord: A Physically Modeled Electro-Acoustic Plucked String Instrument
     Daniel Schlessinger, Julius O. Smith III
102  Augmenting Chordophones with Hybrid Percussive Sound Possibilities
     Otso Lähdeoja
106  An Electroacoustically Controlled Vibrating Plate
     Mark Kahrs, David Skulina, Stefan Bilbao, Murray Campbell
110  Don't Forget the Loudspeaker — A History of Hemispherical Speakers at Princeton, Plus a DIY Guide
     Scott Smallwood, Perry R. Cook, Dan Trueman, Lawrence McIntyre
Computer Systems (NIME 2009 Sessions)
9am–11am, Friday June 5th, 2009

116  Features and Future of Open Sound Control Version 1.1 for NIME
     Adrian Freed, Andrew Schmeder
121  A Low-Level Embedded Service Architecture for Rapid DIY Design of Real-Time Musical Instruments
     Andrew Schmeder, Adrian Freed
125  Firmata: Towards Making Microcontrollers Act Like Extensions of the Computer
     Hans-Christoph Steiner
131  Sharing Data in Collaborative, Interactive Performances: The SenseWorld DataNetwork
     Marije A. J. Baalman, Harry C. Smoak, Christopher L. Salter, Joseph Malloch, Marcelo M. Wanderley
135  Challenges and Performance of High-Fidelity Audio Streaming for Interactive Performances
     Nicolas Bouillot, Jeremy R. Cooperstock
141  "Extension du Corps Sonore" — Dancing Viola
     Todor Todoroff, Frédéric Bettens, Loïc Reboursière, Wen-Yang Chu
Posters and Demos II (NIME 2009 Sessions)
12:15pm–1:45pm and 3:45pm–4:30pm, Friday June 5th, 2009

90   Real-Time Phase Vocoder Manipulation by Runner's Pace
     Jason A. Hockman, Marcelo M. Wanderley, Ichiro Fujinaga
82   SORISU: Sound with Numbers
     Hye Ki Min
147  The elBo and footPad: Toward Personalized Hardware for Audio Manipulation
     Colby Leider, Doug Mann, Daniel Plazas, Michael Battaglia, Reid Draper
149  The Midi-AirGuitar: A Serious Musical Controller with a Funny Name
     Langdon Crawford, William David Fastenow
151  An Early Prototype of the Augmented PsychoPhone
     Niels Böttcher, Smilen Dimitrov
153  Catch Your Breath — Musical Biofeedback for Breathing Regulation
     Diana Siwiak, Jonathan Berger, Yao Yang
155  A Wii-Based Gestural Interface for Computer Conducting Systems
     Lijuan Peng, David Gerhard
157  Chess-Based Composition and Improvisation for Non-Musicians
     Dale E. Parson
159  MagNular: Symbolic Control of an External Sound Engine Using an Animated Interface
     Andy Dolphin
161  AUDIO ORIENTEERING — Navigating an Invisible Terrain
     Noah Feehan
163  Developing the Cyclotactor
     Staas de Jong
86   The Tactus: A Tangible, Rhythmic Grid Interface Using Found-Objects
     Yotam Mann, Jeff Lubow, Adrian Freed
207  Designing for Conversational Interaction
     Andrew Johnston, Linda Candy, Ernest Edmonds
165  midOSC: A Gumstix-Based MIDI-to-OSC Converter
     Sébastien Schiesser
169  Parallel Processing System Design with "Propeller" Processor
     Yoichi Nagashima
171  Where Did It All Go Wrong? A Model of Error from the Spectator's Perspective
     A. Cavan Fyans, Michael Gurevich, Paul Stapleton
173  Advanced Techniques for Vertical Tablet Playing — An Overview of Two Years of Practicing the HandSketch 1.x
     Nicolas d'Alessandro, Thierry Dutoit
175  Gyroscope-Based Conducting Gesture Recognition
     Andreas Höfer, Aristotelis Hadjakos, Max Mühlhäuser
Haptics and Extended Instruments (NIME 2009 Sessions)
1:45pm–3:45pm, Friday June 5th, 2009

177  Using Haptics to Assist Performers in Making Gestures to a Musical Instrument
     Edgar Berdahl, Günter Niemeyer, Julius O. Smith III
183  Using Haptic Devices to Interface Directly with Digital Waveguide-Based Musical Instruments
     Edgar Berdahl, Günter Niemeyer, Julius O. Smith III
187  Haptic Carillon — Analysis & Design of the Carillon Mechanism
     Mark Havryliv, Fazel Naghdy, Greg Schiemer, Timothy Hurd
193  The Electrumpet: A Hybrid Electro-Acoustic Instrument
     Hans Leeuw
199  Sensor Technology and the Remaking of Instruments from the Past
     Emmanuelle Gallin, Marc Sirguy
203  Twenty-First Century Piano
     Sarah Nicolls
Design and Graphics (NIME 2009 Sessions)
4:30pm–6pm, Friday June 5th, 2009

207  Designing for Conversational Interaction
     Andrew Johnston, Linda Candy, Ernest Edmonds
213  Designing for Style in New Musical Interactions
     Michael Gurevich, Paul Stapleton, Peter Bennett
218  Re-Designing Principles for Computer Music Controllers: A Case Study of SqueezeVox Maggie
     Perry R. Cook
222  Interfacing Graphic and Musical Elements in Counterlines
     Jaroslaw Kapuscinski, Javier Sanchez
226  FrameWorks 3D: Composition in the Third Dimension
     Richard Polfreman
Sensing and Conducting (NIME 2009 Sessions)
9am–11am, Saturday June 6th, 2009

230  Novel and Forgotten Current-Steering Techniques for Resistive Multitouch, Duotouch, and Polytouch Position Sensing with Pressure
     Adrian Freed
236  A Force-Sensitive Surface for Intimate Control
     Randy Jones, Peter Driessen, Andrew Schloss, George Tzanetakis
242  A Flexible Mapping Editor for Multi-Touch Musical Instruments
     Greg Kellum, Alain Crevoisier
246  Phalanger: Controlling Music Software with Hand Movement Using a Computer Vision and Machine Learning Approach
     Chris Kiefer, Nick Collins, Geraldine Fitzpatrick
250  The UBS Virtual Maestro: An Interactive Conducting System
     Teresa Marrin Nakra, Yuri Ivanov, Paris Smaragdis, Chris Ault
256  The Vocal Augmentation and Manipulation Prosthesis (VAMP): A Conducting-Based Gestural Controller for Vocal Performance
     Elena Jessop
Posters and Demos III (NIME 2009 Sessions)
12:15pm–1:45pm and 3:45pm–4:30pm, Saturday June 6th, 2009

256  The Vocal Augmentation and Manipulation Prosthesis (VAMP): A Conducting-Based Gestural Controller for Vocal Performance
     Elena Jessop
303  Designing Smule's Ocarina: The iPhone's Magic Flute
     Ge Wang
316  The Drummer: A Collaborative Musical Interface with Mobility
     Andrea Bianchi, Woon Seung Yeo
193  The Electrumpet: A Hybrid Electro-Acoustic Instrument
     Hans Leeuw
260  Double Slide Controller
     Tomás Henriques
177  Using Haptics to Assist Performers in Making Gestures to a Musical Instrument
     Edgar Berdahl, Günter Niemeyer, Julius O. Smith III
308  Scratch-Off: A Gesture Based Mobile Music Game with Tactile Feedback
     Nicholas Gillian, Sile O'Modhrain, Georg Essl
213  Designing for Style in New Musical Interactions
     Michael Gurevich, Paul Stapleton, Peter Bennett
218  Re-Designing Principles for Computer Music Controllers: A Case Study of SqueezeVox Maggie
     Perry R. Cook
242  A Flexible Mapping Editor for Multi-Touch Musical Instruments
     Greg Kellum, Alain Crevoisier
236  A Force-Sensitive Surface for Intimate Control
     Randy Jones, Peter Driessen, Andrew Schloss, George Tzanetakis
262  HSP: A Simple and Effective Open-Source Platform for Implementing Haptic Musical Instruments
     Edgar Berdahl, Günter Niemeyer, Julius O. Smith III
246  Phalanger: Controlling Music Software with Hand Movement Using a Computer Vision and Machine Learning Approach
     Chris Kiefer, Nick Collins, Geraldine Fitzpatrick
264  Versum: Audiovisual Composing in 3D
     Tarik Barri
266  Towards a Humane Graphical User Interface for Live Electronic Music
     Jamie Bullock, Lamberto Coccioli
268  YARMI: An Augmented Reality Musical Instrument
     Tomás Laurenzo, Ernesto Rodríguez, Juan Fabrizio Castro
270  SpeedDial: Rapid and On-the-Fly Mapping of Mobile Phone Instruments
     Georg Essl
274  ForTouch: A Wearable Digital Ventriloquized Actor
     Sidney Fels, Robert Pritchard, Allison Lenters
Control Strategies and Installations (NIME 2009 Sessions)
1:45pm–3:45pm, Saturday June 6th, 2009

276  Words, Movement and Timbre
     Alex McLean, Geraint Wiggins
280  A Meta-Instrument for Interactive, On-the-Fly Machine Learning
     Rebecca Fiebrink, Dan Trueman, Perry R. Cook
286  Action and Perception in Interactive Sound Installations: An Ecological Approach
     Jan C. Schacher
290  The Argus Project: Underwater Soundscape Composition with Laser-Controlled Modulation
     Jonathon Kirk, Lee Weisert
293  PlaySoundGround: An Interactive Musical Playground
     Michael St. Clair, Sasha Leitman
297  The Fragmented Orchestra
     Daniel Jones, Tim Hodgson, Jane Grant, John Matthias, Nicholas Outram, Nick Ryan
Mobile Music (NIME 2009 Sessions)
4:30pm–6pm, Saturday June 6th, 2009

303  Designing Smule's Ocarina: The iPhone's Magic Flute
     Ge Wang
308  Scratch-Off: A Gesture Based Mobile Music Game with Tactile Feedback
     Nicholas Gillian, Sile O'Modhrain, Georg Essl
312  ZooZBeat: A Gesture-Based Mobile Music Studio
     Gil Weinberg, Andrew Beck, Mark Godfrey
316  The Drummer: A Collaborative Musical Interface with Mobility
     Andrea Bianchi, Woon Seung Yeo
Program 1 (NIME 2009 Concert Schedule)

320  The Oklo Phenomenon
     Robert Wechsler
321  Anigraphical Etude 9
     David Lieberman
322  Cosmic Strings II
     Min Eui Hong
323  Study No. 1 for PAM and MADI
     Troy Rogers, Steven Kemper, Scott Barton
324  Fue Sho – Electrofusion
     Garth Paine, Michael Atherton
325  Versum – Fluor
     Tarik Barri

Program 2 (NIME 2009 Concert Schedule)

326  Angry Sparrow
     Chikashi Miyama
327  Biomuse Trio
     Eric Lyon, Ben Knapp, Gascia Ouzounian
328  BodyJack
     Suguru Goto
329  Code LiveCode Live, or Livecode Embodied
     Marije A. J. Baalman
Program 3 (NIME 2009 Concert Schedule)

330  MOLITVA — Composition for Voice, Live Electronics, Pointing-At Glove Device and 3-D Setup of Speakers
     Giuseppe Torre, Robert Sazdov, Dorota Konczewska
331  Ben Neill and LEMUR
     Ben Neill, Eric Singer

Program 4 (NIME 2009 Concert Schedule)

332  Performance: Modal Kombat Plays PONG
     David Hindman, Evan Drummond
333  Afflux/Reflux
     Colby Leider
334  PLOrk Beat Science 2.0
     Ge Wang, Rebecca Fiebrink
335  Hands On — A New Work from SLABS Controller and Generative Algorithms
     David Wessel
336  Bioluminescence
     R. Luke DuBois, Lesley Flanigan
NIME 2009 Installations

337  Elemental & Cyrene Reefs
     Ivica Bukvic, Eric Standley
338  Cellphonia: 4'33"
     Scot Gresham-Lancaster, Steve Bull
339  Pendaphonics
     Dan Overholt, Byron Lahey, Anne-Marie Skriver Hansen, Winslow Burleson, Camilla Nørgaard Jensen
340  Sound Lanterns
     Scott Smallwood
341  AANN: Artificial Analog Neural Network
     Phillip Stearns
FrameWorks 3D: Composition in the Third Dimension

Richard Polfreman
University of Southampton
University Road
Southampton, UK
r.polfreman@soton.ac.uk

Abstract

Music composition on computer is a challenging task, involving a range of data types to be managed within a single software tool. A composition typically comprises a complex arrangement of material, with many internal relationships between data in different locations: repetition, inversion, retrograde, reversal and more sophisticated transformations. The creation of such complex artefacts is labour intensive, and current systems typically place a significant cognitive burden on the composer in terms of maintaining a work as a coherent whole. FrameWorks 3D is an attempt to improve support for composition tasks within a Digital Audio Workstation (DAW) style environment via a novel three-dimensional (3D) user-interface. In addition to the standard paradigm of tracks, regions and tape recording analogy, FrameWorks displays hierarchical and transformational information in a single, fully navigable workspace. The implementation combines Java with Max/MSP to create a cross-platform, user-extensible package and will be used to assess the viability of such a tool and to develop the ideas further.

Keywords: Digital Audio Workstation, graphical user-interfaces, 3D graphics, Max/MSP, Java.

1. Introduction

FrameWorks 3D presents a new design for audio and MIDI sequencing user-interfaces. It extends traditional approaches with features to aid the mapping of compositional ideas onto a work, and to facilitate rapid experimentation with musical ideas [1]. While such elements could be included in a (combination of) 2D display(s), FrameWorks adopts a 3D space in order to present complex structural information (hierarchical and relational) while retaining the visibility of the existing notation; this is difficult to achieve effectively in a single 2D space. The approach allows detailed visual exploration of a work in a way which may give the composer new insights.

Once limited to games and scientific/bio-medical visualisation, 3D graphics are becoming pervasive, from Apple's Cover Flow [2] and Microsoft's 3D Flip [3], to Second Life [4]. As 3D representations proliferate, FrameWorks offers one approach to the adoption of this technology for music applications. While 3D has been used in some music systems [5][6], it has yet to be fully exploited in direct manipulation music composition tools.

2. FrameWorks: A Brief History

2.1 Origins

The concept was developed in task analysis research in the late 1990s focusing on music composition, and was first implemented in a 2D prototype in 2001 [7]. The primary concern is to allow rapid experimentation with material and structural ideas within the same interface. This relates to one of Green's Cognitive Dimensions of Notations [8], viscosity, described as the resistance to change of a notation. FrameWorks is a highly fluid design, where local changes to a work can propagate throughout, allowing experimentation to incur a low time-cost.

2.2 Concept

The basic elements are clips¹, which are containers for musical data of a particular type (MIDI, audio, OSC, etc.) and which can be a) hierarchically arranged on tracks and b) connected together by one-to-one, mono-directional relations expressing a connection between two clips (and their descendants). A combination of clips and relations forms a framework. Clips may be empty, and therefore the structure of a work can be developed prior to musical material; alternatively the structure can be built up from materials. Thus composers can work in both top-down and bottom-up modes (or some combination thereof), although a framework itself is a top-down structure.

The relations between clips are processes, which take the material in a source clip, transform it and place the new material in a target clip. These are dynamically maintained at all times, thus any changes to either clips or relations are immediately reflected throughout the framework. Typical relations include identity, transposition, time dilatation, reverse and filtering.

¹ Previously referred to as components, the name has been changed to avoid confusion with the programming concept of component.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers, or to redistribute to lists requires prior specific permission and/or a fee.
NIME09, June 3-6, 2009, Pittsburgh, PA
Copyright remains with the author(s).

Relations could be extended to many-to-one, where data from more than one source are combined to form the result (somewhat similar to side-chaining in studio effects).

Hierarchical and computational connections between music elements are not in themselves new, but have mainly been used in programming-language-based algorithmic composition tools. FrameWorks aims to bring these within the scope of standard DAW software.
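The clip/relation/framework model described above can be sketched in Java (the implementation language of the system). All class and method names here are illustrative inventions, not the actual FrameWorks API; integer lists stand in for real musical data:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical sketch of the clip/relation model.
class Clip {
    final List<Clip> children = new ArrayList<>(); // hierarchical grouping
    List<Integer> data = new ArrayList<>();        // stand-in for audio/MIDI content

    Clip addChild(Clip c) { children.add(c); return c; }
}

// A one-to-one, mono-directional relation: a process that takes the material
// in a source clip, transforms it, and places the result in a target clip.
class Relation {
    final Clip source;
    final Clip target;
    final UnaryOperator<List<Integer>> transform;

    Relation(Clip source, Clip target, UnaryOperator<List<Integer>> transform) {
        this.source = source;
        this.target = target;
        this.transform = transform;
    }

    void apply() { target.data = transform.apply(source.data); }
}

// A framework combines clips and relations; re-applying the relations after
// an edit is what makes local changes propagate throughout the structure.
class Framework {
    final List<Relation> relations = new ArrayList<>();

    void connect(Relation r) {
        relations.add(r);
        r.apply();
    }

    void update() { relations.forEach(Relation::apply); }
}
```

In this sketch a "reverse" relation is simply a transform that returns a reversed copy of the source data; editing the source and calling `update()` refreshes every dependent clip.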

2.3 Initial Prototype: FrameWorks 2D

An initial implementation, written in Java 1.1, lacked clip hierarchies, supported only MIDI data and was only a sketch of the intended system [9]. Figure 1 shows the framework view, where "hanging" from track timelines are several clips, connected by lines representing relations, whose colour indicates which relation is being used.

Figure 1. FrameWorks 2D: framework view.

A basic piano-roll display allowed clip editing, while relation editors specified transformation parameters. For example, time relations chain together an arbitrary number of source segments, each with start and end points, playback speed and direction settings. In figure 2, the entire source is played once forwards and once again in reverse.

Figure 2. FrameWorks 2D: time relation editor.

Informal feedback from composers was positive in terms of being able to create (and recreate) works in a fluid manner, particularly lending itself to process-based music, but the interface was too limited in basic functionality for serious work and formal evaluations, while support for audio was indicated as essential.
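The time relation described around figure 2 chains source segments together. A minimal Java sketch of such a chain, with all names and the duration calculation invented for illustration:

```java
import java.util.List;

// Hypothetical encoding of a time relation: an ordered chain of source
// segments, each with start/end points (seconds), playback speed and direction.
record Segment(double start, double end, double speed, boolean reversed) {
    double duration() { return (end - start) / speed; }
}

class TimeRelation {
    final List<Segment> chain;

    TimeRelation(List<Segment> chain) { this.chain = chain; }

    // Total duration of the material the relation produces.
    double outputDuration() {
        return chain.stream().mapToDouble(Segment::duration).sum();
    }
}
```

The figure 2 example, where the entire source plays once forwards and once in reverse, becomes a chain of two full-length segments differing only in direction.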

3. FrameWorks 3D

FrameWorks 3D is a new implementation written in Java 5, using the Java 3D API [10] and Max/MSP as an audio engine [11]. Hierarchical arrangements of clips are now supported and audio data is used rather than MIDI. Java's MIDI and audio API, Java Sound [12], has been criticised for a number of limitations in terms of latency and jitter [13], and while a number of solutions have been proposed (e.g. [13]), an alternative strategy of using Max/MSP as an audio engine for Java has been adopted here.

3.1 Audio Engine Separation

FrameWorks 3D has been designed so that the audio/MIDI engine, wrapped in an AMSEngine class, can be re-implemented for various audio/MIDI APIs. Earlier versions used an AMSEngine purely for data I/O, i.e. playback and recording of MIDI data, while the data itself was hosted and manipulated in the main FrameWorks code. While this limits the size of the AMSEngine and so simplifies switching to different implementations, such a design leads to frequent large data transfers between the FrameWorks model and the AMSEngine. While a relatively minor issue when both are written in Java and MIDI data is used, with an external engine and audio data this may become a significant overhead. The new implementation expands the role of the engine to include managing the audio (and other) data and providing the processing for relations, thus minimising the data crossing the model/engine boundary. In the case of Max/MSP, this also allows us to use Max patches as relation implementations, leveraging a vast resource of audio processing objects, and permitting very rapid development of new relations, potentially by users themselves.
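The engine separation suggests an interface boundary where data stays engine-side and the model refers to clips by opaque ids. The paper does not publish the AMSEngine signature, so the interface below is a hypothetical sketch, with an in-memory stub standing in for the Max/MSP back-end:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the engine boundary: the FrameWorks model talks only
// to this interface, so different audio/MIDI back-ends can be swapped in.
interface AMSEngine {
    int loadAudio(String path);                              // returns a clip id
    void applyRelation(String relation, int sourceId, int targetId);
    void play(int clipId);
    void stop();
}

// A trivial in-memory stand-in, useful for exercising the model without Max/MSP.
class StubEngine implements AMSEngine {
    private int nextId = 0;
    final List<String> log = new ArrayList<>();

    public int loadAudio(String path) { log.add("load " + path); return nextId++; }
    public void applyRelation(String relation, int s, int t) { log.add(relation + " " + s + "->" + t); }
    public void play(int clipId) { log.add("play " + clipId); }
    public void stop() { log.add("stop"); }
}
```

Because only ids and commands cross this boundary, the large audio buffers never leave the engine, which is the overhead reduction the section describes.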













Figure 3. Internal structure of FrameWorks 3D: the core application (FrameWorks model with a Java 3D & Swing GUI) and the AMSEngine (audio data, processing and audio I/O).


3.2 Max/MSP Integration

Several programming languages can be used to define new objects that can be used freely in Max patches: Max itself (i.e. abstractions), Javascript (js objects), Java (mxj objects) and C (native). In FrameWorks 3D we subvert this role, with our mxj~ class "taking over" the operation of Max from the user, providing a new application user interface and hiding as much of Max as possible. Max patches are loaded and scripted behind the scenes in order to carry out audio operations. The mxj~ object loads Java code and communicates with hidden Max patches to control audio I/O, the real-time clock, etc. The technical details and issues involved are described elsewhere [14].

3.3 FrameWorks 3D GUI

Figure 4 shows the main FrameWorks 3D window. Around the edge of the central 3D framework are various editing and navigation tools: a tree view of the framework structure, clip parameters (editable), and zoom controls.

Figure 4. FrameWorks 3D main window.

Figure 5. FrameWorks 3D: tree view and clip parameters.

Navigation is via the computer keyboard as in many 3D environments, which changes the virtual camera position and orientation and thus the user's viewpoint. The tabbed pane for the 3D view provides three independent views of the framework, to help keep track of the various clips and relations being used. The tree view provides an alternate representation of the framework; selecting a clip in either view selects it in both and displays its parameter settings, where they can be edited (figure 5).

3.4 The Framework

In the 3D space, the x-axis represents time, the y-axis separates one track from another, and the z-axis (vertical) is used to group clips into hierarchical arrangements. Tracks are narrow strips extending along the time axis, from which rectangular clips are suspended. Relations are shown as pipes that connect a source clip to a destination clip. The current playback position is shown as a flat sheet in the y-z plane that moves along the x-axis. A small Head-Up Display (HUD) in the 3D space shows the clock and basic transport controls. Figure 6 shows the same framework viewed from different positions and orientations.

Figure 6. Two views of the same framework structure.
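The axis mapping above can be made concrete with a small sketch. The record name, field names and scale parameters are invented for illustration; only the assignment of time, track and hierarchy depth to x, y and z comes from the text:

```java
// Hypothetical sketch of the axis mapping: x = time, y = track,
// z = depth in the clip hierarchy (children hang below their parents).
record ClipPlacement(double startTime, int track, int depth) {
    // World-space position of the clip's left edge, given per-axis scales.
    double[] position(double unitsPerSecond, double trackSpacing, double depthSpacing) {
        return new double[] {
            startTime * unitsPerSecond, // x-axis: time
            track * trackSpacing,       // y-axis: one strip per track
            -depth * depthSpacing       // z-axis: hierarchy level
        };
    }
}
```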

3.5 Clips

Clips contain audio data and can be arranged in hierarchical groups (Figure 7 below). Only leaf clips hold audio directly, and these display an overview of the sound waveform when loaded. An audio clip is similar to an audio region in standard DAW software; the user can load a sound file and define a segment of that file to be the current data (by Command-dragging the ends of the clip, or by editing clip parameter values). Clips can be played back individually and the framework played as a whole.

Figure 7. An example of hierarchical arrangement of clips.
3.6 Relations

Relations are implemented as plug-ins hosted by the audio/MIDI engine. These are currently in the form of specifically designed Max patches, which provide both the user-interface and the processing algorithm, much like commercial plug-in architectures such as Steinberg's VST. When FrameWorks 3D is launched, the relation plug-in files are scanned and made available to the user.

Figure 8. User interface for the Pitch Shifter relation.

Figure 8 shows a Pitch Shifter relation editor. This applies a constant transposition to the source material using either a time or a frequency domain algorithm. Once the required settings are set, the update framework button applies the new settings to the audio, which will in turn update all dependent audio throughout the framework.

A number of relations have been developed so far, including identity, reverse, filter (biquad), brassage and pitch shifter.
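The scan-at-launch behaviour can be sketched as a simple registry. The file-naming convention, the `.maxpat` extension and the class name are assumptions for illustration; the paper only states that relation plug-in files are scanned at start-up:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of start-up plug-in discovery: each relation plug-in is
// a Max patch file, and every patch found is registered as an available
// relation type.
class RelationRegistry {
    final List<String> available = new ArrayList<>();

    void register(List<String> fileNames) {
        for (String name : fileNames) {
            if (name.endsWith(".maxpat")) {
                // Strip the extension to get the relation name shown to the user.
                available.add(name.substring(0, name.length() - ".maxpat".length()));
            }
        }
    }
}
```

Adding a new relation then amounts to dropping a new patch file into the scanned folder, which is how the design could let users extend the relation set themselves.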

4. Further Work

Current development is focussed on refining the interaction between Java code and Max/MSP, designing additional relations, and including further user-interface features in order to aid user testing. The 3D interface is deliberately sparse at this stage in order to focus on user assessment of the basic concept and gain user input on how additional interface elements might be developed.

As the tool develops we expect to reinstate MIDI data, add automation of clip parameters, and add track parameters and effects, to bring the whole system closer to a DAW-style environment.

In addition, we are looking for further opportunities to exploit 3D user-interface elements within the environment, such as in relation editors.

5. Conclusions

FrameWorks 3D represents a novel approach to sequencing tasks by extension of existing DAW metaphors into a 3D space which features both hierarchical arrangements of content and dynamically maintained relationships between elements within the structure.

An initial 2D prototype showed some promise, and this has now been significantly enhanced with a true 3D implementation. While it is still early in the overall development of the system, we are aiming to disseminate the ideas embodied in the software and gain feedback from composers. A useable demonstrator system will be freely available to users in late 2009.

6. Acknowledgments

FrameWorks 3D development was in part funded by the i10 (www.i10.org.uk) by way of an Enterprise Fellowship.

References

[1] R. Polfreman, "A task analysis of music composition and its application to the development of Modalyser," Organised Sound, vol. 4, pp. 31-43, 1999.

[2] http://www.apple.com/pro/tips/coverflow.html, last accessed 20/01/2009.

[3] http://www.microsoft.com/windows/products/windowsvista/features/details/flip3D.mspx, last accessed 18/01/2009.

[4] http://secondlife.com/whatis, last accessed 18/01/2009.

[5] T. Kunze and H. Taube, "SEE – A Structured Event Editor: Visualizing Compositional Data in Common Music," in Proceedings of the 1996 ICMC, 1996, pp. 63-66.

[6] N. Castagne and C. Cadoz, "L'environnement GENESIS : créer avec les modèles physiques masse-interaction," in Journées d'Informatique Musicale, 9e édition, 2002, pp. 71-82.

[7] R. Polfreman, "Supporting Creative Composition: the FrameWorks Approach," in Les Actes des 8e Journées d'Informatique Musicale, 2001, pp. 99-111.

[8] T. R. G. Green, "Cognitive dimensions of notations," in People and Computers V, A. Sutcliffe and L. Macaulay, Eds. Cambridge: CUP, 1989, pp. 443-460.

[9] R. Polfreman, "FrameWorks – A Structural Composition Tool," in Music without walls? Music without instruments? Proceedings of the International Conference, 2001, CD-ROM.

[10] "Java 3D API Tutorial," http://java.sun.com/developer/onlineTraining/java3d/index.html

[11] M. Puckette, "Max at Seventeen," Computer Music Journal, vol. 26, no. 4, pp. 31-43, 2002.

[12] "Java Sound API Programmer's Guide," http://java.sun.com/javase/6/docs/technotes/guides/sound/index.html

[13] N. Juillerat, S. M. Arisona and S. Schubiger-Banz, "Real-Time, Low Latency Audio Processing in Java," in Proceedings of the International Computer Music Conference, ICMC 2007.

[14] R. Polfreman, "Role-Reversal: Max/MSP as an Audio Engine for Java," in preparation, submitted to ICMC 2009.