Development of a multi-touch expressive iPhone application for the controlling of musical performance

Submitted by

Richard Williams (rjw28)

for the degree of BSc (Hons) Computer Information Systems

of the

University of Bath

2008

Signature of Author . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Richard Williams

Signature of Supervisor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Prof. John ffitch
This dissertation may not be consulted, photocopied or lent to other libraries without the permission of the author for 3 years from the date of submission of the dissertation.

Signed:
Development of a multi-touch expressive iPhone application for the controlling of musical performance

Submitted by: Richard Williams

COPYRIGHT

Attention is drawn to the fact that copyright of this dissertation rests with its author.

The Intellectual Property Rights of the products produced as part of the project belong to the University of Bath (see http://www.bath.ac.uk/ordinances/#intelprop).

This copy of the dissertation has been supplied on condition that anyone who consults it is understood to recognize that its copyright rests with its author and that no quotation from the dissertation and no information derived from it may be published without the prior written consent of the author.

Declaration

This dissertation is submitted to the University of Bath in accordance with the requirements of the degree of Bachelor of Science in the Department of Computer Science. No portion of the work in this dissertation has been submitted in support of an application for any other degree or qualification of this or any other university or institution of learning. Except where specifically acknowledged, it is the work of the author.

Signed:
Abstract

This project is focused on the creation of an iPhone application to control music performance wirelessly. 'iDJ' is an iPhone-based application that allows its users to wirelessly control a musical performance, a DJ set or any soft-synthesis MIDI application running on a networked computer.

It is written in Objective-C using the iPhone SDK framework, which allows the utilisation of the iPhone's capabilities and core functionality.

iDJ uses the OpenSound Control (OSC) protocol to send a variety of values generated by the user's interaction with the UI controls. These are sent using UDP over a network to the server computer the user wishes to control. These OSC messages are then converted to MIDI and assigned, by the user, to whatever they wish to control in their DJ or DAW application.
Acknowledgements

I would like to thank my supervisor, Professor John Fitch, for his help throughout this project. I would also like to thank my friends and family for their continued support, in particular Camille Troillard, for his help with his server application 'OSCulator', which has been used for the server-side processing within this project.
Table of Contents

1 Introduction  12
  1.1 Project Aims  12
  1.2 Key Objectives  13
  1.3 Project Roadmap  13
  1.4 Definitions  14
2 Literature Survey  15
  2.1 Introduction  15
  2.2 Background  15
  2.3 Music Controllers  17
    2.3.1 Introduction  17
    2.3.2 Controller Taxonomy  17
      2.3.2.1 Instrument-Like Controllers  17
      2.3.2.2 Instrument Inspired Controllers  18
      2.3.2.3 Hybrid Controllers  18
      2.3.2.4 Alternate Controllers  19
  2.4 Musical Interaction Typology  19
    2.4.1 Instrument Model Systems  20
    2.4.2 Radio Model Systems  20
    2.4.3 High Level Interactive Music Composition Systems  20
    2.4.4 Implicit Interactive Music Composition Systems  20
  2.5 Where This Project Fits In  21
  2.6 What Makes a Controller Expressive?  22
    2.6.1 Gesture Typology  23
      2.6.1.1 Excitation Gesture  23
      2.6.1.2 Modification Gesture  23
        2.6.1.2.1 Parametric Modification Gesture  23
        2.6.1.2.2 Structural Modification Gesture  23
      2.6.1.3 Selection Gesture  23
    2.6.2 Are Control and Expression the Same?  24
  2.7 The Use of Feedback in Musical Devices  24
    2.7.1 Types Of Feedback  24
      2.7.1.1 Primary/Secondary Feedback  25
      2.7.1.2 Passive/Active Feedback  25
      2.7.1.3 Importance Of Feedback  25
  2.8 Previous Work  27
    2.8.1 New Interfaces for Musical Expression (NIME)  27
    2.8.2 Evaluation of Similar Applications  27
      2.8.2.1 TouchOSC  28
      2.8.2.2 OSCemote  30
      2.8.2.3 iTM MCU  32
  2.9 Mapping  33
    2.9.1 What Is Mapping?  33
    2.9.2 Mapping Typology  35
      2.9.2.1 One-to-One Mapping  35
      2.9.2.2 Divergent Mapping (One-to-Many)  35
      2.9.2.3 Convergent Mapping (Many-to-One)  35
    2.9.3 Importance of Mapping When Designing Music Controller Devices  36
    2.9.4 Mapping iPhone Gestures  37
  2.10 OpenSound Control  38
    2.10.1 OSC Versus MIDI  40
3 Requirements  42
  3.1 Introduction  42
  3.2 Context Of The System  42
  3.3 Information Gathering  42
    3.3.1 Literature Review  42
    3.3.2 Primary Research  43
      3.3.2.1 Survey Analysis  43
  3.4 Functional Requirements  46
    3.4.1 Introduction  46
    3.4.2 Terms Defined  46
    3.4.3 User Interface  46
    3.4.4 Application Requirements  48
    3.4.5 Networking  50
    Hardware & Software Requirements  50
    3.4.6 Client Requirements (Device)  50
    3.4.7 Server Requirements (Server Machine)  51
  3.5 Non-Functional Requirements  51
    3.5.1 Introduction  51
    3.5.2 Terms Defined  51
    3.5.3 Interface Requirements  52
    3.5.4 Application Requirements  52
4 Design  53
  4.1 Introduction  53
  4.2 Design Approach  53
  4.3 Architectural Design Model  54
  4.4 Implications of Software, Tool and Language Choices on Design  55
    4.4.1 iPhone SDK Framework  55
    4.4.2 Objective-C Programming Language  56
    4.4.3 OSC Frameworks For Objective-C  57
      4.4.3.1 WSOSC  57
      4.4.3.2 VVOSC  57
    4.4.4 OSCulator  58
    4.4.5 Ableton Live  59
  4.5 Design Patterns Used by iPhone Applications  60
    4.5.1 Model-View-Controller  60
    4.5.2 Delegation  61
    4.5.3 Target-Action  61
    4.5.4 Managed Memory Model  61
  4.6 User Interface Design  62
    4.6.1 Design Inspirations  62
    4.6.2 Main Controller Interface  64
      4.6.2.1 Draft Design 1  64
      4.6.2.2 Draft Design 2  65
      4.6.2.3 Final Design  66
    4.6.3 Settings Interface  67
    4.6.4 Final Interface Design  68
  4.7 System Design  69
    4.7.1 High Level Design  69
    4.7.2 Low Level Design  71
      4.7.2.1 Class Diagram  71
        4.7.2.1.1 RootViewController  72
        4.7.2.1.2 MainViewController  72
        4.7.2.1.3 FlipsideViewController  74
      4.7.2.2 Initial OSC Message Sending Algorithm Design  75
    4.7.3 Moving From Design To Implementation  76
5 Implementation  77
  5.1 Introduction  77
  5.2 User Interface Implementation  77
    5.2.1 Customisation of UI Controls  78
      5.2.1.1 Equalizer Knob Rotation  79
    5.2.2 Switching Interface View  82
  5.3 Backend Implementation  84
    5.3.1 Final Class Diagram  85
    5.3.2 Handling User Interaction  86
    5.3.3 Loading And Saving From PLIST File  87
    5.3.4 Sending OSC Messages to the Server  88
      5.3.4.1 OSC Algorithm Implementation  88
      5.3.4.2 OSC Implementation Problems  90
6 Testing And Evaluation  91
  6.1 Introduction  91
  6.2 Requirements Validation  91
  6.3 System Testing  98
    6.3.1 Overview  98
    6.3.2 Ongoing Testing  98
    6.3.3 Testing UI Input Control Values  99
    6.3.4 Testing the sending of OSC messages to the server  101
  6.4 Performance Testing  102
  6.5 User Evaluation  104
    6.5.1 User Evaluation Results  105
  6.6 Testing and Evaluation Critique  106
7 Conclusions  108
  7.1 Introduction  108
  7.2 Project Overview  108
  7.3 Project Conclusion  110
  7.4 Future Work  111
8 References  113
9 Appendices  119
  9.1 Requirements Analysis Survey Sample  119
  9.2 VVOSC Tester Application  120
  9.3 UI Input Controls Outputted to the Console  120
  9.4 User Evaluation Feedback Questionnaires  123
  9.5 Project Code  128
Table of Figures

Figure 1 - Project Roadmap  12
Figure 2 - Table of Mobile Phone market statistics  15
Figure 3 - Image of 'Guitar Hero™' game controller  17
Figure 4 - Diagram of the 'Meta Trumpet', by Sukandar Kartadinata  17
Figure 5 - Pioneer DJM-707 2 Channel Performance Mixer  20
Figure 6 - Screenshot of TouchOSC user interface  27
Figure 7 - Screenshot of OSCemote user interface  29
Figure 8 - Screenshot of iTM MCU user interface  31
Figure 9 - Diagram of a Digital Music Instrument (DMI)  33
Figure 10 - Diagram of the 'Three-layer-mapping chain' by Kessous & Arfib, 2003  33
Figure 11 - Photo of user interacting with iPhone using multi-touch gestures  36
Figure 12 - Table of OSC and MIDI comparisons  39
Figure 13 - Survey Question: Do you own an iPhone/iPod Touch?  42
Figure 14 - Survey Question: What Demographic(s) best describe you?  43
Figure 15 - Survey Question: Would you find an iPhone application that allowed you to wirelessly control your musical performance useful?  43
Figure 16 - Survey Question: What would you find most beneficial about this type of application?  43
Figure 17 - Client-Server model of the intended application  53
Figure 18 - Screenshot of OSCulator on Mac OS X  57
Figure 19 - Screenshot from Ableton Live 7 - Main View  58
Figure 20 - Screenshot from Ableton Live - MIDI Mappings table  59
Figure 21 - Image of M-Audio X Session Pro USB Mixer  61
Figure 22 - Headphone Cue Interface Button  62
Figure 23 - Generic On/Off Interface Button  62
Figure 24 - Main Controller Interface Design: Draft 1  63
Figure 25 - Main Controller Interface Design: Draft 2  64
Figure 26 - Main Controller Interface Design: Final Design  65
Figure 27 - Settings Interface Design: Draft Design  66
Figure 28 - Settings Interface Design: Final Design  66
Figure 29 - Final interface design - exploded diagram  67
Figure 30 - High Level System Diagram  68
Figure 31 - UML activity diagram of system  69
Figure 32 - Initial Class Diagram of System  70
Figure 33 - Initial OSC Sending Algorithm Design  74
Figure 34 - Implementing Main Controller Interface using 'Interface Builder'  76
Figure 35 - Implementation Code: modifyCrossFaderSlider method  77
Figure 36 - Modification of crossfader UISlider, using the modifyCrossFaderSlider  77
Figure 37 - Equalizer Knob Implementation  78
Figure 38 - Implementation Code: modifyEqualizer method  79
Figure 39 - Implementation Code: equalizerKnobChanged method  80
Figure 40 - Diagram showing touch input with equalizer knob  80
Figure 41 - Implementation Code: toggleView method  81
Figure 42 - Implementation Code: Rotation Animation code  81
Figure 43 - Diagram showing user interface rotation transition  82
Figure 44 - Final Implemented Class Diagram  84
Figure 45 - Implementation Code: buttonSelected method  85
Figure 46 - Linking UI handler methods to Input Gesture Behaviours within Interface Builder  85
Figure 47 - OSC Algorithm Implementation Diagram  87
Figure 48 - 'Instruments' application performance analyzer for iPhone development  89
Figure 49 - Implementation Code: initialising OSC connection  89
Figure 50 - Screenshot of iPhone Simulator Testing application  97
Figure 51 - Connected iPod Touch device for testing purposes  98
Figure 52 - UI Input Controls Outputted to the Console  99
Figure 53 - Screenshot of OSCulator server acknowledging sent OSC messages from client  100
Figure 54 - Screenshot of CPU usage of iDJ using the 'Instruments' application  101
Figure 55 - Screenshot of maximum CPU usage of iDJ during CPU performance testing  102
Figure 56 - Diagram of device setup for user testing  103
Figure 57 - Photo of participant using iDJ during user evaluation process  104
Table of Tables

Table 1 - User Interface Requirements  46
Table 2 - Application Requirements  48
Table 3 - Networking Requirements  50
Table 4 - Hardware & Software Requirements - Client  50
Table 5 - Hardware & Software Requirements - Server  51
Table 6 - Non Functional - Interface Requirements  52
Table 7 - Non Functional - Application Requirements  52
Table 8 - Requirements Validation  91
1. Introduction

Computers have extended many aspects of the human's musical experience, including the listening, recording, controlling and creation of music (Obrenovic, 2005). This evolution of computer music has brought to light a "plethora of sound synthesis methods available...and inexpensive computer platforms, allowing a large community direct access to real-time computer generated sound" (Wanderley M., Gestural Control of Music, 2001). Furthermore, there is a thriving interest (namely the NIME community) in the design and creation of alternate music control devices that approach key areas such as gesture, expression and mapping, to facilitate the control and creation of music in innovative ways (Blaine, 2005).

In conjunction with this, mobile phones, and the iPhone in particular, have become multi-function and multi-purpose computers, satisfying demand for a single device for communication, media and entertainment (BindApple, 2008). The iPhone's revolutionary gestural touch-screen interface and its wireless capabilities, coupled with the powerful iPhone Software Development Kit, make this device an innovative platform for the implementation of an alternate music controller. Furthermore, the iPhone contains numerous input sensors, such as a camera, accelerometer and microphone, all of which have the potential to be utilised as input controls for the controlling of musical performance in a novel and expressive way.

The iPhone is an emerging platform, which is still currently in its infancy. There are currently no third-party applications available that allow users to utilise their iPhone as a portable controller device for the controlling and manipulating of a musical performance or DJ set. As such, there is a niche in the domain of music controllers that has not been explored, which this project will aim to occupy.
1.1 Project Aims

This project's central aim is to design and implement an expressive, multi-touch gestural application for the iPhone platform that allows musicians and DJs to control the performance of music on their computer, using the iPhone's built-in Wi-Fi capabilities to connect to their laptop or desktop. The application will achieve this by using the OSC protocol to send a variety of values generated by the user's interaction with the UI controls. Users will be able to customise the assignments of the application's interface in order to control various software effects for their soft-instrument, the performance of a live DJ set or parameters within their DAW.
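To make this pipeline concrete, the sketch below is a minimal illustration (plain Python rather than the project's Objective-C; the OSC address '/idj/crossfader' and the server IP address and UDP port are invented for this example) of how a single UI value can be packed into an OSC message, following the OSC 1.0 message layout, and sent to the server over UDP.

    import socket
    import struct

    def osc_string(s: str) -> bytes:
        """Encode an OSC-string: ASCII text, null-terminated, padded to a 4-byte boundary."""
        data = s.encode("ascii") + b"\x00"
        return data + b"\x00" * (-len(data) % 4)

    def osc_message(address: str, value: float) -> bytes:
        """Build one OSC message with a single float32 argument (OSC 1.0 layout:
        address pattern, then the type tag string ",f", then a big-endian float)."""
        return osc_string(address) + osc_string(",f") + struct.pack(">f", value)

    # The crossfader control has produced a value of 0.42 (0.0 = deck A, 1.0 = deck B).
    packet = osc_message("/idj/crossfader", 0.42)      # hypothetical OSC address
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, ("192.168.0.10", 8000))        # assumed server IP and UDP port

On the server side, an application such as OSCulator would receive this packet and map the float value onto a MIDI parameter in the DAW.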

1.2 Key Objectives

• Research extensively all key areas relating to the field of alternate music controllers, to strengthen the relevance and focus of this project.

• Once proficient with Objective-C and the iPhone SDK, build an easy to use, responsive and expressive user interface that is able to take in users' touch gestures and map these touches to corresponding quantitative performance data.

• Utilise the OSC protocol to send performance data generated from user input over a wireless network, to be picked up by a server residing on the user's computer. These OSC messages then need to be converted to MIDI and assigned to the user's choice of musical parameters for them to control.

• Carry out extensive user testing to evaluate the usability and real-world usefulness of the intended application amongst the targeted user demographic.
1.3 Project Roadmap

The diagram shown in Figure 1 presents the intended structure and roadmap of the project, from the initial proposal of the project right up to the final implementation and testing phases. Each separate stage has been presented and discussed in the following report, reflecting the key areas and work that has been undertaken as part of this project. Therefore, this roadmap presents both the structure of the following report and the structure of how the project is to be carried out.

Figure 1 - Project Roadmap
1.4 Definitions

Throughout this project, there are many terms and acronyms used that are specific to the domain of computer music and alternate controllers. This section defines these terms for the ease of the reader.

DAW - Digital Audio Workstation
This term has been used to refer to highly specialized multi-track music software suites, whose purpose is to record, edit, manipulate and playback digital audio.

DJ - Disk Jockey
In this context, the term DJ has been used to describe someone who controls, manipulates and mixes pre-recorded music during a live performance.

iDJ
iDJ is the name of the music-controlling iPhone application that this project has produced.

MIDI - Musical Instrument Digital Interface
MIDI is an industry standard digital music protocol that allows electronic musical instruments and computers to communicate, control and synchronize with one another.

NIME - New Interfaces for Musical Expression
NIME refers to the series of annual conferences that take place to demonstrate new musical interfaces and alternate music controllers, as well as discuss the key concepts in this field. The acronym also refers to the community surrounding the conferences. This is expanded on in section 2.8.1.

OSC - OpenSound Control
OSC is a communication protocol that allows musical performance data to be sent between computers, sound synthesizers and multimedia devices over a network. This is explored in detail in section 2.10.
2 Literature Survey

2.1 Introduction

This project proposes the creation of an expressive, multi-touch application to run on the iPhone SDK framework, with the intention of controlling a music application using the OSC protocol. However, before embarking on the development stage of this project, it is first important to research the key areas of interest and academia associated with it, to gain a greater knowledge of the vast research in this popular area. Furthermore, carrying out extensive literature research will help the creation, accuracy and relevance of my application to this particular field.

This literature survey will analyse some of the fundamental concepts relating to music and alternate controllers, such as mapping, expression and the OpenSound Control protocol. As this project is more related to software development than research, this literature survey will also evaluate the tools and software suites needed to successfully design and implement a music-controlling iPhone application.
2.2 Background

According to Wanderley, "the evolution of computer music has brought to light a plethora of sound synthesis methods available in general and inexpensive computer platforms, allowing a large community direct access to real-time computer-generated sound" (Wanderley M., Gestural Control of Music, 2001). The use of computer-generated or computer-controlled music has become a standard practice in many fields of music performance and composition. With this, Blaine states that over the past decade, a "proliferation of inexpensive controller devices has surfaced to enhance player interaction" with regards to musical controllers and music entertainment devices (Blaine, 2005).

At the same time, mobile phones have become multi-function and multi-purpose devices that are more akin to handheld computers, and the Apple iPhone is no exception. Mark Donovan, senior analyst at comScore, stated that "Smartphones, and the iPhone in particular, are appealing to a new demographic and satisfying demand for a single device for communication and entertainment" (BindApple, 2008). It is only recently that touch-screen technology has become so advanced and begun to move into mainstream use, having previously been limited to "a handful of narrow-use applications with limited functionality such as automated teller machines, gas pumps, museum displays, airline ticket kiosks, and PDAs" (Nichols, 2008).
The iPhone is becoming much more than just a competing mobile phone; it has become a standard platform and device for people to develop interesting and expressive applications in all areas of entertainment and media. The App Store™ (the online store that brings applications for the platform to iPhone users) and the Apple iPhone Software Development Kit™ (SDK) have created a whole new marketplace (iPhone Footprint, 2008), with trends (from the international job site oDesk) showing demand for iPhone application programming increasing by 500% in the period between March 2008 and September 2008 (iPhone Footprint, 2008). Through this marketplace establishment, Apple has become the third leading manufacturer of multi-function mobile handsets worldwide within two years (see Figure 2) (Oliver, 2008).
The iPhone and its SDK have been chosen as the platform for this project's development for the reasons discussed above. Firstly, the iPhone has many technological capabilities that can be utilised in an application. This allows the iPhone to be used as a control device for music, using its Wi-Fi capabilities for synchronization to an external device. Its touch screen technology allows new methods of gestural control and musical expressivity to be explored. Furthermore, the iPhone's commercial success brings many benefits to choosing it as this project's platform. It will be possible to reach a huge global user audience for this project's application relatively easily (through the AppStore™). Additionally, the surge in demand for iPhone applications [1] means there has been an increase in online development support and an online developer community for this platform. This will be a beneficial asset, acting as a source of research and guidance during the implementation stage of this project.
[1] Over 300 million applications have now been downloaded from the AppStore™ by iPhone and iPod Touch users (Sande, 2008).

Figure 2 - Table of Mobile Phone market statistics
2.3 Music Controllers

2.3.1 Introduction

With the huge advances in computing in the 20th century, technology has made its way into the world of music, contributing to the way musicians can control and interact with their musical performance. Computers have extended many aspects of the human's musical experience, including the listening, recording, controlling and creation of music (Obrenovic, 2005). As stated previously, there is a thriving interest (namely the NIME community) in the design and creation of music control devices that take advantage of emerging technologies, as well as key areas such as gesture, expression and mapping, to facilitate the control and creation of music in innovative ways (Blaine, 2005). This section will look at the taxonomy of various types of music controllers, as well as the factors that affect their control and expressivity. Furthermore, this section will look at previous real-world examples of music controllers, to help identify how this project relates to these topics and the current research in these areas.
2.3.2 Controller Taxonomy

To identify how my application relates to previous work in the field of alternative music controllers, it is first necessary to identify the different types of music controllers. The following controller classifications are based on the research carried out by (Wanderley M., Gestural Control of Music, 2001) and (Bongers, 2000).
2.3.2.1 Instrument-Like Controllers

Instrument-like controllers are those where the design of device inputs and instrument characteristics is reproduced based on the features, specifications and characteristics of an existing acoustic instrument (Wanderley M., Gestural Control of Music, 2001). All the main features, and the overall essence of what defines that instrument, are reproduced in the instrument-like controller device. The most common examples in commercial instruments are electronic keyboards (based on acoustic pianos) and electric drum kits (based on acoustic drum kits). These instrument-like controllers are generally much cheaper and much more portable alternatives to the acoustic instruments they are based on (Blaine, 2005). The best example of this is an electronic keyboard compared to a very heavy grand piano, on which it would have based its design. Bongers argues that with many instrument-like controllers there is a trade-off between a decrease in expressiveness and sensitivity, and the enormous range of sounds they can produce (Bongers, 2000).
2.3.2.2 Instrument Inspired Controllers

Instrument inspired controllers are a subdivision of instrument-like controllers. The main difference is that, although these controllers are largely inspired by an existing instrument's design, their creation purpose is "conceived for another use" (Trueman & Cook, 1999). One example is the SuperPolm violin, which was developed by S. Goto, A. Terrier, and P. Pierrot (Pierrot & Terrier, 1997). Whilst it was based on the shape of a violin, its intended purpose was for the control of granular synthesis (Wanderley M., Gestural Control of Music, 2001). A modern commercial example of an instrument inspired controller is the Guitar Hero™ game controller, as shown in Figure 3. It is based on the shape of an electric guitar and, even though it is designed to mimic how a guitar is played, it is fundamentally a video game controller. Elements of its expressivity are based on a real electric guitar, but they are vastly simplified. For example, it contains buttons on the fret-board and a flipper controller instead of guitar strings. The user is not able to create music using the controller, simply using the control buttons on it to interact with the video game. In that sense, it does not contain any of the original expressivity of the electric guitar by which it was inspired.

Figure 3 - Image of 'Guitar Hero™' game controller
2.3.2.3 Hybrid Controllers

Hybrid controllers, often referred to as augmented instruments, are traditional (often acoustic) instruments that have been customised and augmented by the addition of extra input sensors (Wanderley M., Gestural Control of Music, 2001). Bongers views these electronic extensions as the next logical step, as they enable an instrumentalist to use the playing proficiency acquired after many years of training to move the playing of their instrument forward in new directions. Any acoustic instrument can be extended into the realm of augmented instruments (hybrid controllers).

A common commercial example is an electric guitar processed through an effects pedal. However, there are more experimental hybrid controllers, such as the 'Meta Trumpet'. Developed by Sukandar Kartadinata, the 'Meta Trumpet' features extra input sensors, such as a microphone and extra buttons, affecting yaw and pitch (see Figure 4).

Figure 4 - Diagram of the 'Meta Trumpet', by Sukandar Kartadinata

Augmented instruments can combine electronic music with the authenticity of live performance, where the virtuosity of a new instrument and the extension of an existing instrument can be combined and developed.
2.3.2.4 Alternate Controllers

Alternate controllers are vastly different from the aforementioned types of controller devices. This is because their design does not follow that of an established existing instrument (Wanderley M., Gestural Control of Music, 2001).

Alternate controllers are associated with designing instruments in entirely new forms to previous work, often using innovative methods of input. They allow the use of gesture vocabularies other than those of acoustic instrument manipulation, these being restricted only by the technology choices in the controller design. Therefore they allow non-expert performers the use of these devices (Wanderley M., Gestural Control of Music, 2001). It can be argued, however, that like existing instruments, these still require their performers to develop a level of proficiency and mastery to truly take full musical advantage of their capabilities.

A good example of a real-world alternate controller is the 'Hyperscore' graphical computer-assisted composition system. It was designed for users who have limited or no musical training and experience. It takes freehand drawing as its input, "letting users literally sketch their pieces" (Obrenovic, 2005) (Farbood, Pasztor, & Jennings, 2004).

Often, alternate controllers are created that explore implicit interactions as their sources of input (see section 2.4 for an expansion of interaction types). For example, research has been done regarding how music can be created and controlled by implicitly analysing users' everyday interaction with their environment. In other words, looking into how to create or control music as a by-product of the user's everyday interaction (Obrenovic, 2005).
2.4 Musical Interaction Typology

The level and the type of interaction can vary between different types of music controllers and instruments. From a high-level perspective, any interactive controller or device that produces music in real time could be labelled as an instrument (Chong, 1996). However, what classes a controller or an instrument as an instrument depends on the level of interaction between the user and the device.

To further identify where this project fits in, in the field of music controllers, it is important to classify the different types of interaction levels and interactive musical systems. The following classifications are based on research by (Obrenovic, 2005) and (Chong, 1996).
2.4.1 Instrument Model Systems

These systems, which perform live transformations, rely on continuous musical input and involvement from the user. In this sense, they are similar to normal instruments, which require constant user interaction. An example of an instrument model system is a MIDI keyboard (Obrenovic, 2005).
2.4.2 Radio Model Systems

These systems, based on the principles of a radio, operate largely independently from the user. Most of the important interaction (e.g. tuning to a different radio station) is predetermined on a number of discrete choices, and all output is pre-composed. Interaction is measured on metrics such as minutes and hours.
2.4.3 High Level Interactive Music Composition Systems

These systems fit somewhere between the instrument and the radio model. They require constant user interaction, but in a much less active and demanding way, making them a viable solution in addressing the problem of making music composition and performance devices for users who do not have a musical background. Furthermore, they offer high-level musical interaction primitives, such as the large-scale shape of the music piece (Obrenovic, 2005). An example of a high-level interactive music composition system is the 'Hyperscore' graphical computer-assisted composition system, discussed in the publication by (Farbood, Pasztor, & Jennings, 2004).
2.4.4 Implicit Interactive Music Composition Systems

These systems explore how music can be created by implicitly analysing users' everyday interaction with the environment. In other words, the music generated is a by-product of the user's interaction. An innovative and experimental example of an implicit interactive music composition system is 'Sonic City'. Developed by the Viktoria Institute's Future Applications Lab and the Interactive Institute's PLAY Studio from Sweden, 'Sonic City' is a form of interactive music instrument whereby the city is the interface. The system enables users to create real-time personalised music as a by-product of walking around and interacting with different parts of the city (Gaye, Holmquist, & Maze, Sonic City: The Urban Environment as a Musical Interface, 2003).
2.5 Where This Project Fits In

This project is to design an application to run on the iPhone hardware and platform that will allow the user to wirelessly control a musical performance, a DJ set or a MIDI instrument running on a networked computer. The fact that this project's application will run on an iPhone, a device intended for a different purpose altogether, means that this application and its platform are not based on an instrument model and design. As the iPhone is not based on any existing instrument, and it features many innovative input sensors (such as a touch screen, a camera, a microphone and an accelerometer), the hardware platform is more akin to an alternate controller device. According to the Sequencer.de website, "Alternative Midi Controllers are a number of non-standard OSC or MIDI Controllers that can control any parameter in a Synthesizer or Sequencer" (Sequencer.de). This project's platform fits that description more successfully than that of a standard alternate controller.
In terms of this project's interface design, despite this early stage of the project, it will take design elements from physical mixers and controllers. For example, Figure 5 shows the Pioneer DJM-707 mixer, containing different equalizer knobs, a left and right panning fader button and volume sliders. For a more detailed look at the interface design inspiration for this project, please see section 4.6.1. This project's interface, therefore, fits into the instrument-like category, as it will take design inspiration from existing devices such as the one shown in Figure 5. The advantage of using software as an instrument/controller interface is that it is relatively easy to create multiple interfaces in one application, as shown in the TouchOSC iPhone application (shown in 2.8.2.1). The user can simply click on a tabbed menu to use different interfaces. Therefore this project can base different interfaces on different instruments and controllers, as well as experiment with new interfaces that are not based on any existing instrument models.

Figure 5 - Pioneer DJM-707 2 Channel Performance Mixer
In terms of the level of interaction between user and iPhone, this project will fit somewhere between the instrument model system and the high-level interactive music composition system. It will be running on what is, essentially, a mobile phone device. This means that anyone can download it and use it, regardless of their musical skills or experience. However, as the interfaces will be loosely based on previous instrument models, it would require a level of proficiency and skill to master it and use it effectively. A DJ, for example, who already has knowledge of mixing tracks and using the mixing desk interface, would be able to switch to this project's application to achieve the same result relatively simply. However, a novice user would still have to overcome the lack of knowledge associated with mixing songs, and a level of mastery would have to be achieved to do this successfully.
2.6 What Makes a Controller Expressive?

Expression creates ambiguity when related to music instruments and music controllers. Expression can be defined as the "felicitous or vivid indication or depiction of mood or sentiment; the quality or fact of being expressive" (Webster, 2008). In other words, a musical instrument, by itself, is not expressive. It is the musician using the instrument in such an expressive way that this collaboration between the user and the device conveys some form of meaning and feeling. A music control device, however, may not necessarily fit this definition so succinctly compared to a traditional music instrument. As many music control devices separate the input and output layers into separate entities, unlike an instrument, which directly outputs sound from gestural input all in one entity, defining what makes a music control device expressive becomes even more ambiguous (Wanderley M., Gestural Control of Music, 2001).

According to (Poepel, 2005), expression can be viewed as a mechanism where "…performers code expressive intentions using expressive-related cues", involving metrics such as tempo, sound level, timing, intonation, articulation, vibrato, tonality and pausing. These cues are then 'decoded' by the listener, suggesting that musical expression, like language, "depends on a set of conventional signifiers and an understanding of those signifiers shared by both performer and listener." (Dobrian & Koppelman, The E in NIME: Musical Expression with New Computer Interfaces, 2006). Taking into account research carried out, in particular, by Marcelo M. Wanderley and Christopher Dobrian, it seems more likely that there are many factors that determine the expressivity of an instrument or music controller. The way the performer/user is able to interact with the instrument or control device plays an intrinsic part in the expression; however, this is often determined by the device's ability to be controlled by gestures, as well as the mapping scheme that maps input to sound output (for a more detailed look into mapping, see section 2.9). Other key factors to be taken into account are the use of bimanuality (the use or requirement of two hands in a coordinated or cooperative way (Kessous & Arfib, 2003)) and the importance of visual/haptic feedback, both of which are discussed in section 2.7.
2.6.1 Gesture Typology

Studies carried out by (Cadoz & Wanderley, 2000) and (Kessous & Arfib, 2003) have led to the formation of accepted definitions of the different types of gesture. Below is a description of these types:
2.6.1.1 Excitation Gesture

An excitation gesture conveys the energy that will eventually be transformed or present in the sonic result. According to Cadoz et al, this can be either instantaneous or continuous. In its instantaneous form, the sound starts when the excitation gesture has finished. An example of this is a percussive instrument; the sound of a ringing cymbal is created after the gesture of hitting the cymbal with a drumstick has finished. A continuous excitation gesture is when the gesture and the sound can coexist at the same time. For example, when a violin bow is moved across the violin strings, both the gesture and sound coexist; as soon as the violin bow stops moving, both the gesture and the sound end.
2.6.1.2 Modification Gesture

Modification gestures are related to the modification of the instrument's physical properties, introducing a further dimension of expression. These gestures may change the sound in terms of pitch or length, for example, but their energy does not contribute directly to the sonic result. Modification gestures may be either:
2.6.1.2.1 Parametric Modification Gesture

A parametric modification gesture is a "continuous variation of a parameter" (Cadoz & Wanderley, 2000), such as applying vibrato to a guitar or violin string.
2.6.1.2.2 Structural Modification Gesture

A structural modification gesture relates to the change of the instrument's structure, such as the addition or removal of an extra part. For example, using a mute with a trumpet would be a structural modification gesture.
2.6.1.3 Selection Gesture

Selection gestures are very different from the previous typologies, as they neither provide energy to the sonic result nor modify the instrument's physical properties. The user performs a choice from different but equivalent elements of an instrument to be used during a performance. A good example of this is a pianist choosing which keys to press on a piano.
2.6.2 Are Control and Expression the Same?

It is important to make the distinction between control and expression, and whether having control over a sonic result guarantees the presence of expression. Dobrian et al argue that control does not equate to expression, but that it is the medium by which expression is made possible. In other words, control "is a precondition for enabling expression, but is not in and of itself sufficient" (Dobrian & Koppelman, The E in NIME: Musical Expression with New Computer Interfaces, 2006). To have true expression, a control device must have an interface that provides accurate capturing of human gesture, along with the effective, and in many cases complex, mapping of input to the sound output. Simple one-to-one mapping schemes for capturing input data to control sound parameters may be necessary for precise control over the sonic result. This, however, does not necessarily equate to expression. What can be concluded from this argument is that the human element is the intrinsic ingredient for expressivity. Advanced control and gesture capture may enhance the user's possibility for expressivity over the sonic result, but its mere presence does not suffice.

In relation to this project, the application needs to accurately capture user gestures in innovative ways to successfully become an expressive device and allow the user to exhibit expression when using it. Having simple one-to-one mapped controls, such as a HiFi remote control, does not allow for any expression. Use of the iPhone's touch screen and its various input sensors would allow complex mapping from gestural input to sonic output to be developed, allowing the application to be used in much more expressive ways than a standard music controller.
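To illustrate the distinction drawn above, the following sketch (illustrative Python only; the function names, weighting and parameter ranges are assumptions and are not taken from iDJ) contrasts a one-to-one mapping, where one control drives one parameter, with a convergent, many-to-one mapping, where two gesture inputs jointly shape a single parameter.

    def one_to_one(slider: float) -> float:
        """One-to-one mapping: a single control drives a single parameter (e.g. channel volume)."""
        return slider                          # 0.0-1.0 passed straight through

    def convergent(pressure: float, tilt: float) -> float:
        """Convergent (many-to-one) mapping: two gesture inputs jointly shape one parameter
        (e.g. a filter cutoff); the combination, not either input alone, sets the result."""
        blended = 0.7 * pressure + 0.3 * tilt  # arbitrary illustrative weighting
        return max(0.0, min(1.0, blended))     # clamp to the 0.0-1.0 parameter range

    print(one_to_one(0.5))                     # 0.5
    print(convergent(0.8, 0.2))                # roughly 0.62

It is in the convergent case that the scope for expressive nuance grows, because the sonic result depends on how the gestures are combined rather than on any single control value.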

2.7 The Use of Feedback in Musical Devices

2.7.1 Types Of Feedback

Dobrian et al make the clear distinction between visual information, which tells the user what is possible and where controls are, and visual feedback, which tells the player what has happened (Dobrian & Koppelman, The E in NIME: Musical Expression with New Computer Interfaces, 2006). For example, the use of black keys for sharp and flat notes and white keys for root notes is an example of a piano's visual information. However, when a user presses a key on a piano, it depresses, offering visual feedback. Whilst this is part of the fundamental workings of the design of a piano, where the depressed note moves a hammer to hit a metal string to produce a sound, it also acts as visual feedback, allowing the user to know that they have successfully hit that note. A more contemporary example of visual information and feedback can be found on a mixing desk. To allow the user to know how far to turn a control knob, visual information markers will aid the user in their use of the control. When a button or switch is turned on or off on a mixer, an LED light will turn on if the switch has been turned on, and turn off if the switch has been turned off, giving the user awareness of the parameter changes via visual feedback.

There are also other forms of feedback that are often present in musical instruments and controllers. Auditory feedback is the most obvious form of feedback to the user; an instrument is played and creates a sound, and this sound acts as feedback to the user. Other forms of feedback include tactile or haptic feedback. Mobileburn.com defines haptic feedback as "the use of the sense of touch in a user interface design to provide information to an end user", but when referring to mobile devices, "this generally means the use of vibrations…to denote that a touch screen button has been pressed." (Mobile Burn, 2009)

Based on Wanderley and Depalle's research (Wanderley & Depalle, Gestural Control of Sound Synthesis, 2008), feedback can be categorised according to its characteristics, as follows:
2.7.1.1 Primary/Secondary Feedback

Primary feedback encompasses visual, auditory (e.g. piano key noise) and tactile-kinesthetic feedback. An example of primary feedback that the iPhone can utilise is haptic feedback through vibration (Nichols, 2008). Secondary feedback is the sound that the instrument produces (Vertegaal & Eaglestone, 1996).
2.7.1.2 Passive/Active Feedback

Passive feedback relates to feedback provided through the device/controller/instrument's physical characteristics, such as the noise of a button, and active feedback is produced by the controller/instrument/device in response to the user's interaction with it (such as the sound produced by an instrument) (Bongers, 2000).
2.7.1.3 Importance Of Feedback

Visual information and feedback are necessary elements of playing instruments and using musical controllers successfully. Musicians and instrumentalists rely on their instrument or controller's visual information and feedback for reference marks, visualisation of their choice of note/control selection, as well as confirmation of the interaction with the instrument/controller. Feedback is also necessary to allow the user to make mental choices for future interactive gestures and expressivity with an instrument/controller. Unsurprisingly, auditory feedback is intrinsic to musical instruments/controllers, otherwise the user will not know what the sound output is, based on their input with the instrument/controller. However, musical instruments and controllers tend to be very complex and require a level of virtuosity to master them. This would be very difficult to achieve if there were no forms of visual information and feedback given to the user, to allow them to keep track of the parameters, levels and inputs that affect their intended musical output (Wanderley & Depalle, Gestural Control of Sound Synthesis, 2008).

Bongers argues that due to the "decoupling of the sound source and control surface, a lot of feedback from the process controlled" is lost with electronic music instruments (Bongers, 2000). Bongers further points out that touch feedback from the sound source is rarely used; "the feel of a key that plays a synthesized tone will always be the same irrespective of the properties of the sound" (Bongers, 2000). What can be inferred from Bongers' research is that certain aspects of feedback can be harder to achieve in the realm of electronic musical devices. However, the need for feedback in these devices is still as important as in traditional acoustic instruments.
2.8 Previous Work

2.8.1 New Interfaces for Musical Expression (NIME)

There is thriving global interest and research in various areas of alternative musical controllers, musical expression and the innovative use of interface protocols (such as OSC and MIDI). The leading force in this area is the annual NIME (New Interfaces for Musical Expression) conference, where scientific researchers and musicians "…from all over the world gather to share their knowledge and late-breaking work on new musical interface design" (Poepel, 2005), showcasing the latest breakthrough technologies for musical expression and artistic performance. Tina Blaine believes that the NIME community's innovation and advances are far ahead of the curve of commercial alternate controllers, both musical and video game controllers (Blaine, 2005), which puts NIME in a strong position to "raise the bar as to the quality and range of experiences, devices and the expressive capabilities they inspire, particularly as it relates to music creation and education" (Blaine, 2005).

Some of the topics that have been covered by the NIME conferences are: design reports on novel controllers and interfaces for musical expression, musical mapping algorithms and intelligent controllers, interface protocols (e.g. MIDI and OSC) and alternative controllers. The work that has been done by NIME in the fields of musical controllers, innovative use of interface protocols and new mapping strategies has been very relevant and inspirational to this project. Many of the journals and academic sources researched for this project have been closely connected with the NIME community, which makes their work highly relevant and important to this project.
2.8.2 Evaluation of Similar Applications

When beginning this project, there were no implementations similar to the intended application of this project running on the iPhone platform. However, there are now several applications that all do slightly different things, but all fundamentally control music in some way, using the iPhone as their platform. This section is a critique of these different iPhone applications. I have downloaded and, in some cases, purchased these applications from the AppStore™ to test them on an iPhone. This has allowed me to find their strengths and weaknesses, which can be taken into account for the design and development stage of this project. Furthermore, it has given design inspiration for this project, which should result in a more successful application.
2.8.2.1 TouchOSC

Copyright © R.J. Fischer 2008. Version: 1.2

TouchOSC is a premium iPhone / iPod Touch application that allows the user expressive control over their music or DJ application running on their computer. It uses the OpenSound Control protocol to send and receive UDP messages over a Wi-Fi network to achieve the wireless connection to the computer. It does require an application, such as 'OSCulator' (see section 4.4.4), to be running on the computer to act as an intermediate between the iPhone controller application and the music application the user wishes to control. This is so that the OSC messages can be received by the intermediate application and mapped to MIDI messages, which can then be picked up by the music application. The reason for this is that OSC is currently not supported as a protocol in most professional music editing and DJ applications, such as Ableton Live, Apple Logic and M-Audio Pro Tools (hexler.net, 2008).
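The OSC-to-MIDI step performed by the intermediate application can be pictured with a small sketch (illustrative Python, not OSCulator's actual code; the controller number and channel are arbitrary examples): an incoming OSC float in the range 0.0 to 1.0 is scaled to a 7-bit MIDI Control Change value that the DAW can then learn and assign.

    def osc_to_midi_cc(value: float, controller: int = 7, channel: int = 0) -> bytes:
        """Scale an OSC float (0.0-1.0) to a 3-byte MIDI Control Change message."""
        cc_value = max(0, min(127, round(value * 127)))   # 7-bit MIDI data byte
        status = 0xB0 | (channel & 0x0F)                  # 0xB0 = Control Change, channel 1
        return bytes([status, controller & 0x7F, cc_value])

    print(osc_to_midi_cc(0.42).hex())   # 'b00735' -> CC#7 (channel volume) set to 53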

Strengths

• After experimenting with this application, I feel it is, by far, the best existing application to control musical performance from an iPhone. It has many different interfaces to allow control over all aspects of musical performance. The interface shown in Figure 6 is TouchOSC's mixer controller interface, which allows control using sliders, buttons and an XY grid.

• TouchOSC is a very usable application, making it easy to take full advantage of its abilities in a short space of time.

• It utilises the iPhone's accelerometer as a control device, to allow the tilt and movement of the phone to be mapped to a musical element on the controlled application (for example, it can be used to map to the equalizer of the music playing, allowing the EQ levels to be altered by physically moving the phone).

• The application is fast and produces no noticeable latency when using it as a real-time controller.

• TouchOSC has also encompassed an effective use of visual feedback, allowing the user to know when controls are switched on and off, and how far a variable control has been moved.

Figure 6 - Screenshot of TouchOSC user interface
Weaknesses

• Despite being the best current application similar to my project on the market, it is not particularly easy to get working and set up quickly. It requires prior knowledge of OSC and mapping to get it to work effectively with the musical application to be controlled. For example, the user has to set the IP address and UDP port number to connect to the computer running the music DAW (Digital Audio Workstation) application.

• Furthermore, a third-party OSC server application is required, such as OSCulator (section 4.4.4). This receives the OSC data from TouchOSC and converts these messages into MIDI, which then allows them to be picked up in the DAW application.

• It costs £2.39 to purchase. Whilst this is fairly low, I plan to offer a free application to achieve the same result.
Lessons That Can be Learnt From TouchOSC

This has the most user-friendly interface for an application of this type. Its usability and visual feedback are very effective. In terms of interface design, I will use this application as a reference point to achieve a usable application. However, I would look at adding labels and numerical visual feedback to the controls on my interface, to allow the user to know what they have assigned the controls to. This is something that TouchOSC has not implemented, but would make the interface even more usable.
2.8.2.2 OSCemote

Copyright © Joshua Minor 2008. Version: 1.1

OSCemote is a remote control application for the iPhone and iPod Touch that sends and receives OSC messages to and from other programs running on a Wi-Fi networked computer. These can then, in turn, be assigned to MIDI messages to control various DAWs and other music/DJ applications (Minor, 2008).
Strengths

• OSCemote's biggest strength is its ability to respond to multiple touches. If mapped effectively, this makes the application incredibly expressive when these controls are assigned to changes in key, pitch or music effects, for example. Figure 7 shows the application responding to five different finger touches on the iPhone's interface. When researching this application, I mapped this multi-touch grid to an 8-band equalizer in Ableton Live (see section 4.4.5 for a critique of this software), and each separate finger touch could control a separate equaliser band, making this application very expressive in its control.

• Furthermore, OSCemote supports the use of the iPhone's accelerometer as a controlling method, using the tilt of the phone to change the sound of the musical performance.

Figure 7 - Screenshot of OSCemote user interface
Figure 7: Screenshot of OSCemote user interface

Weaknesses

• The user interface is quite elementary. Despite using a combination of graphical and numerical visual feedback, it has not found the right balance between these two elements. The result is that on some of the interface screens, such as the accelerometer controller, it is difficult to respond to the changes made, due to the poor visual feedback.

• It is a premium application, costing £2.99, and not as good value as TouchOSC.

• Compared to TouchOSC, which is a very well rounded application, OSCemote does not combine different control elements on one interface, meaning the user has to switch between different tabs to use different controllers. This can prohibit the user's expression and real-time control over a musical performance, due to the time it takes to switch to the alternative interface.

• Like TouchOSC, it is not straightforward to set up for a user with no prior knowledge of OSC and mapping. It requires the user to set the IP address of the server, and the UDP ports to send and receive OSC messages through.

• A third-party OSC server application (such as OSCulator), acting as an intermediary between the iPhone app and the DAW application, is required to successfully create the connection.

Lessons That Can be Learnt From OSCemote

This application, despite its drawbacks, does have support for multi-touch gestures and use of the phone's accelerometer, both of which work very well. As more than one thing might need to be controlled at a time, support for multi-touch gestures will be incorporated into this project, as well as utilising the accelerometer as a control mechanism, for enhanced expressivity.
2.8.2.3 iTM MCU

Copyright © Nuno Pereira 2008. Version: 1.0.4

iTM MCU is a generic controller application for wirelessly controlling a mixer in a music DAW or DJ application. Like TouchOSC and OSCemote, it uses the OpenSoundControl protocol to send and receive UDP messages to and from the computer, over a wireless network. Unlike the other two applications, it focuses solely on providing an interface that maps to an 8-track mixer, for mixing and mastering an audio track (see Figure 8) (iTouchMidi, 2008).

Strengths

• The major strength that iTM MCU has, which TouchOSC and OSCemote lack, is its ease of use when connecting to a music DAW application on a computer. Unlike the other two, this application comes with an intermediate application that the user runs on the computer they wish to connect to. This automatically creates the connection with the iTM MCU application, meaning the user does not have to set up the server IP address and OSC port numbers to connect through. This is advantageous because the user is not required to adjust any connection settings, making the application much easier to use and quicker to set up, with regards to connectivity.

• The visual feedback element of this application is effective. The interface contains sliders that mimic the look of physical sliders, like those found on a mixing desk. This gives the user a familiar interface, making it easy to understand in a short space of time, and makes it easy to see the variability in the control levels.
Figure 8: Screenshot of iTM MCU user interface

Weaknesses

• The interface is not easy to use, as the buttons are too small with regards to the size of a finger touch. As a result, it is difficult to achieve total accuracy when using this control device, as it is very easy to accidentally press another control button as well as, or instead of, the intended control.

• There is just one interface, shown in Figure 8, designed to control an 8-track mixer as stated previously. Even though it is possible to assign the controls to any aspect of the music in a DAW application, this is not feasible in practice, as its intended use limits alternative expressivity. The lack of alternative interfaces, or of a customisable interface, therefore lets this application down considerably.

• This is the most expensive application of the three, costing £3.49. This is quite expensive considering the application has limited control scope compared to TouchOSC and OSCemote.

Lesson
s that c
an be
learnt

from
iTM MCU


iTM MCU
has the basic functionality that my intended application will include.
However, after testing this application, it has emphasised the importance of
good user interface design. The iPhone has a set
screen size and resolution, and
responds to finger touch. Therefore the visual controls on the application
’s
interface
need to be large enough for the human finger to interact with

-

accurately and effectively.
This will be taken on board when designing th
e user
interface for my application.

2.9 Mapping

2.9.1 What Is Mapping?

When analysing the properties of traditional acoustic instruments, it is easy to notice that the playable interface and the sound source are inherently bound together, connected through physical laws. Within the realm of electronic music instruments and music controllers, however, the interface tends to be completely separate and independent from the sound source. This means that some sort of relationship must be defined between them to allow input to correlate to output. As Hunt et al describe, “the art of connecting these two, traditionally inseparable components of a real-time musical system” is what is known as mapping (Hunt, Wanderley, & Paradis, The importance of parameter mapping in electronic instrument design, 2002). Paradiso et al describe mapping as “matching the capabilities of the human sensorimotor system to the parameter space of the instrument being played, in order to provide the performer with appropriate control of music's four fundamental elements - time, pitch, timbre and amplitude” (Paradiso & O'Modhrain, 2003).

Figure 9, from (Rovan, Wanderley, Dubnov, & Depalle, 1997), shows a diagrammatic representation of a digital music instrument or music controller. The user gives gestural input to interact with the interface, which is interpreted and controlled by the gestural controller. These inputs are then mapped to various parameters within the sound production source. The mapping scheme is the essential intermediate layer that binds the gestural controller and the sound production source together. Without it, inputs would not be assigned to musical parameters, meaning that no sound output would be produced.

Figure 9: Diagram of a Digital Music Instrument (DMI)

A more insightful look at how mapping works between the initial input gestures and the eventual sound output is shown in Figure 10, based on the ‘three-layer mapping chain’ created by (Kessous & Arfib, 2003):

Figure 10: Diagram of the ‘Three-layer mapping chain’ (Kessous & Arfib, 2003)

Within this mapping chain, there are three layers. The first layer is the ‘related to interpretation’ layer. This layer takes in the input gesture and transforms it into related-to-gesture-perception parameters. At this stage, a gesture extraction algorithm takes place, which interprets the quantitative gestural data and transforms it into a more qualitative format, closer to perception.

The second layer in the chain is the ‘related-to-perception’ layer. This layer transforms the first layer's data into related-to-sound-perception data. An example of this is taking the gesture input data from a wind controller and transforming it into MIDI data (Wanderley, Schnell, & Rovan, 1998).

Finally, the third layer, ‘related to instrument definition’, transforms the related-to-sound-perception data into synthesis model data. Variables such as pitch, volume and brightness can then be mapped to additive synthesis (Kessous & Arfib, 2003).
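To make the layering concrete, the sketch below expresses the chain as three small functions applied in sequence. The data types, field names and formulas are invented for this survey rather than taken from Kessous and Arfib's paper; only the ordering of the layers is the point.

    #include <math.h>

    // Invented data types for each stage of the chain.
    typedef struct { float x, y, pressure; }         RawGesture;      // raw gestural input
    typedef struct { float position, intensity; }    GestureFeatures; // after layer 1
    typedef struct { float pitch, loudness; }        SoundPercept;    // after layer 2
    typedef struct { float frequencyHz, amplitude; } SynthParams;     // after layer 3

    // Layer 1, 'related to interpretation': extract perceptual features from raw gesture data.
    static GestureFeatures layer1Interpret(RawGesture g) {
        GestureFeatures f = { g.y, g.pressure };
        return f;
    }

    // Layer 2, 'related to perception': express those features as sound-perception parameters.
    static SoundPercept layer2Perceive(GestureFeatures f) {
        SoundPercept p = { 60.0f + f.position * 24.0f,   // vertical position becomes a pitch range
                           f.intensity };                // touch intensity heard as loudness
        return p;
    }

    // Layer 3, 'related to instrument definition': convert perception data into synthesis parameters.
    static SynthParams layer3Define(SoundPercept p) {
        SynthParams s = { 440.0f * powf(2.0f, (p.pitch - 69.0f) / 12.0f), // MIDI-style pitch to Hz
                          p.loudness };
        return s;
    }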
Mapping can be simple or incredibly complex, depending on which mapping strategies are used. Its complexity can have a direct impact on the expressivity of the digital instrument or controller. This section will go on to explore the different types of mapping strategies, as well as analyse the importance of mapping within the field of digital music instruments and controllers.
2.9.2 Mapping Typology

Based on research by (Rovan, Wanderley, Dubnov, & Depalle, 1997), the following classification of mapping strategies has been recognised within the research area of mapping:

2.9.2.1 One-to-One Mapping

One-to-one mapping is the simplest form of mapping. Each independent gestural output is assigned to one separate musical parameter. These one-to-one mapping schemes tend to be achieved using a MIDI control message, and they also tend to be the least expressive of the three mapping types. As Dobrian et al point out, “simple one-to-one mapping of input control data to a particular sound parameter” is often essential for the performer to have precise control, “but such control is not equal to expression” (Dobrian & Koppelman, The E in NIME: Musical Expression with New Computer Interfaces, 2006).
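As a minimal sketch of this scheme, the function below maps a single slider value to a single MIDI control change. The sendMIDIControlChange helper is hypothetical, standing in for whatever MIDI output route is available; it is not part of any application or API discussed above.

    // Hypothetical MIDI output routine: channel 0-15, controller 0-127, value 0-127.
    extern void sendMIDIControlChange(int channel, int controller, int value);

    // One-to-one: one gestural output (a slider) drives one musical parameter
    // (channel volume, MIDI controller 7).
    static void mapSliderToVolume(float sliderValue /* 0.0 to 1.0 */) {
        int ccValue = (int)(sliderValue * 127.0f);   // rescale to the MIDI range 0-127
        sendMIDIControlChange(0, 7, ccValue);
    }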
2.9.2.2 Divergent Mapping (One-to-Many)

With divergent mapping, one gestural output is assigned to control more than one musical parameter simultaneously. Rovan et al argue that although a divergent mapping scheme may initially provide macro-level expressive control, it may still prove limited when applied on its own, as it is unable to access the internal features of the sound object (Rovan, Wanderley, Dubnov, & Depalle, 1997).
2.9.2.3 Convergent Mapping (Many-to-One)

Convergent mapping is the most complex mapping type, whereby many gestures are coupled to control one musical parameter. Convergent mapping requires some proficiency with the system in order to achieve a level of effective control. Despite being much harder to master than the previous two schemes, it allows for much more expressivity over the digital instrument or controller.
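Again using the hypothetical helper from the one-to-one sketch, a convergent scheme combines several gestural inputs into one parameter. The inputs chosen and the weighting are arbitrary examples, not recommendations from the literature.

    // Many-to-one: two independent gestural inputs are combined into a single
    // brightness parameter (sent here as MIDI controller 74).
    static void mapConvergentBrightness(float touchY, float touchSize /* both 0.0 to 1.0 */) {
        float brightness = 0.7f * touchY + 0.3f * touchSize;   // weighted combination of gestures
        if (brightness > 1.0f) brightness = 1.0f;              // clamp to the valid range
        sendMIDIControlChange(0, 74, (int)(brightness * 127.0f));
    }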
2.9.3 Importance of Mapping When Designing Music Controller Devices

In the area of electronic music instruments and controllers, mapping is arguably one of the most important factors to consider when designing or developing a new instrument or controller. According to Hunt et al, the “role of mapping is vital in the design of new instruments” (Hunt, Wanderley, & Kirk, Towards a Model for Instrumental Mapping in Expert Musical Interaction, 2000). Because the instrument interface and sound source are separated within electronic instruments and controllers (unlike traditional acoustic instruments), the mapping relationship needs to be defined explicitly. For a new instrument that does not follow a predefined mapping model based on a traditional instrument, there is a great deal of freedom with regards to mapping input gestures to output. This gives new instruments huge potential to be used in innovative ways that do not follow a previous instrument model.

Marcelo Wanderley (Wanderley M., Gestural Control of Music, 2001) and (Kessous & Arfib, 2003) argue that the mapping between input parameters and system parameters can define the very essence of an instrument. Kessous et al support this claim, stating that “…in electronic musical instruments, the mapping plays a determining part because it makes possible to define the personality and expressivity of the instrument” (Kessous & Arfib, 2003). This mapping relationship between gestural variables, usually obtained through different input sensors, and sound synthesis variables in a digital musical instrument “changes the way the instrument responds to performer actions” (Hunt, Wanderley, & Paradis, The importance of parameter mapping in electronic instrument design, 2002). Hunt et al also believe that the “psychological and emotional response elicited from the performer is determined to a great degree by the mapping”. Therefore, the way an instrument's interface and sound source are mapped to one another is of intrinsic importance to the character of that instrument. Even if the interface and sound source are kept constant, altering the mapping between the two will produce an instrument with an entirely different character, feel and level of expressivity.

At a basic level, mapping is necessary to connect the gestural input to sound parameters, in order to have control over, or give a performance with, sound. On closer inspection, it is clear that mapping has a direct effect on the behaviour, personality and expressivity of an instrument. The complexity and innovation of the chosen mapping strategy for a particular electronic instrument or controller has a huge impact on whether that instrument is expressive or not. Simple one-to-one mappings provide direct control over musical parameters, but do not necessarily allow expressive control over them. For this to be achieved, a combination of one-to-many, many-to-one and even many-to-many mapping schemes should be used to enhance the instrument's capacity for expressivity. Even though an instrument with a complex mapping scheme may require a high level of proficiency for it to be mastered effectively (like a traditional acoustic instrument), it increases the instrument's capacity for expressive musical performance.
2.9.4 Mapping iPhone Gestures

Focusing on the specifications of the iPhone, which is the intended platform for this project, its touch screen interface has the potential to use a variety of interesting and expressive mapping schemes to control musical parameters. The iPhone's touch screen contains a thin metallic layer. Human contact from a finger disrupts the slight electrical charge flowing through the screen's layer, which the system registers as the touch of a button (Nichols, 2008). The iPhone's user interface is based on the concept of direct manipulation, allowing its users to interact directly, through physical contact, using a combination of multi-touch gestures. This differs greatly from the existing computer model of using a mouse and keyboard.

In Figure 11 (from http://jars.de/tag/spread), we can see someone interacting with the iPhone interface using multiple finger touches. This application is in fact based on the model of a guitar, in which users can press the interface where the strings are shown, to play any guitar chord they wish. The application uses bimanuality, as you would with a traditional guitar: one hand is used to make the chord shapes on the interface, and the other hand's finger is used to ‘strum’ the strings to produce the sound of the chord. This use of multi-touch and bimanuality on a touch screen interface enhances the playability of the instrument (Kessous & Arfib, 2003).

Figure 11: Photo of a user interacting with the iPhone using multi-touch gestures
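As a minimal sketch of how an application of this kind receives such multi-touch input, the iPhone SDK delivers each finger to a UIView subclass through the UIResponder touch methods. The GestureView class and its handleTouchAtPoint: method below are invented for illustration; the view's multipleTouchEnabled property must also be set to YES for more than one touch to be delivered.

    #import <UIKit/UIKit.h>

    @interface GestureView : UIView
    - (void)handleTouchAtPoint:(CGPoint)point;   // hypothetical per-touch handler
    @end

    @implementation GestureView

    // Called by the system when one or more fingers touch down in the view.
    // Each UITouch in the set is an independent finger, so several musical
    // parameters can be driven at the same time.
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        for (UITouch *touch in touches) {
            CGPoint point = [touch locationInView:self];   // position of this finger
            [self handleTouchAtPoint:point];
        }
    }

    // Called as fingers move, allowing continuous control values to be updated.
    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        for (UITouch *touch in touches) {
            [self handleTouchAtPoint:[touch locationInView:self]];
        }
    }

    // Hypothetical handler: in a music controller this is where the touch position
    // would be normalised and sent on as an OSC or MIDI value.
    - (void)handleTouchAtPoint:(CGPoint)point {
        // e.g. map point.x / self.bounds.size.width to a control value here
    }

    @end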
The touch screen interface is the most important input method for the iPhone. However, it has many other sensors that could be utilised as input controls that respond to gesture. For example, the iPhone features an accelerometer, in which an application can know of the device's physical