[ 29 August 2016 ]

NEWS – [cec-c] tools for notation of EA / digital media (David Hirst)

From: David Hirst

Date: Sun, 28 Aug 2016

Subject: Re: [cec-c] tools for notation of EA / digital media

I have used a number of tools and notations for the analyses I have done of ea music. Many are documented on the OREMA Project website.

As a part of my PhD I did a detailed analysis of Smalley's Wind Chimes. A summary can be found here:
http://www.orema.dmu.ac.uk/analyses/david-hirsts-analysis-smalleys-wind-chimes

This analysis used a scheme I developed called the SIAM Framework. SIAM stands for Segregation, Integration, Assimilation and Meaning; it is summarised in the following PDF:
http://www.orema.dmu.ac.uk/sites/default/files/Hirst_Wind_Chimes_Anal.pdf

In this analysis I developed a display that showed a spectrogram of two-minute sections of the work; underneath this I included symbols, text and timing information relating to sound objects. The whole thing was drawn and programmed using Flash, so that you could play the sound file, see the sound objects, and see where they were placed in relation to the time scale, with a cursor that followed the playback. For copyright reasons I can't put the whole animation online, but there are screenshots for the whole piece in the following PDF. You could import them, and a purchased copy of Wind Chimes, into Pierre Couprie's EAnalysis program:
http://www.orema.dmu.ac.uk/sites/default/files/Hirst_Screenshots.pdf

Also included in that PDF are screenshots of a tabulated "reduction" of all the initial information, summarised into half the time frame (another form of representation, a "time span reduction").
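The Flash animation itself cannot be shared, but purely as an illustration of the layout it implemented (a spectrogram over an aligned lane of labelled sound objects), a minimal sketch in Python with librosa and matplotlib might look like the following. The file name, object labels and timings are placeholders, not data from the Wind Chimes analysis, and the playback cursor is left out:

```python
# Sketch only: a spectrogram of a two-minute section with a lane of labelled
# sound objects aligned to the same time axis, echoing the Flash display
# described above. File name, labels and timings are placeholders.
import numpy as np
import librosa
import librosa.display
import matplotlib.pyplot as plt

y, sr = librosa.load("wind_chimes_section.wav", sr=None, duration=120.0)
S = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)

fig, (ax_spec, ax_obj) = plt.subplots(
    2, 1, sharex=True, figsize=(12, 6), gridspec_kw={"height_ratios": [3, 1]})

librosa.display.specshow(S, sr=sr, x_axis="time", y_axis="log", ax=ax_spec)
ax_spec.set_ylabel("Frequency (Hz)")

# Hypothetical sound objects: (start time in s, end time in s, label)
sound_objects = [(3.0, 7.5, "attack-resonance"),
                 (12.0, 19.0, "granular texture"),
                 (25.0, 31.0, "gesture A")]
for start, end, label in sound_objects:
    ax_obj.axvspan(start, end, alpha=0.3)
    ax_obj.text((start + end) / 2.0, 0.5, label, ha="center", va="center")
ax_obj.set_yticks([])
ax_obj.set_ylabel("Sound objects")
ax_obj.set_xlabel("Time (s)")

plt.tight_layout()
plt.show()
```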

I used Pierre Couprie's program and Sonic Visualiser to carry out an analysis of Jonty Harrison's Unsound Objects in the first edition of the eOREMA Journal. There are a number of different representations in that article; for the pictures and text see:
http://www.orema.dmu.ac.uk/eorema/connecting-objects-jonty-harrison’s-unsound-objects

This analysis provoked a further interest in the representation of "activity" in ea music and how this kind of temporal analysis could be automated and represented. The following paper summarises work to explore the "Rhythmogram" representation of sonic events, and work to represent and automate activity and segmentation creation:
tenor2015.tenor-conference.org/papers/28-Hirst-PerceptualModels.pdf
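The paper describes the actual methods; purely to illustrate the general idea of an automated "activity" measure (this is a minimal sketch, not the approach from the paper), one simple proxy is onset density over a sliding window:

```python
# Minimal sketch of an "activity" curve: onset density over a sliding window.
# Illustrates the general idea only; it is not the measure used in the papers
# cited here. "piece.wav" is a placeholder file name.
import numpy as np
import librosa
import matplotlib.pyplot as plt

y, sr = librosa.load("piece.wav", sr=None)
onset_times = librosa.onset.onset_detect(y=y, sr=sr, units="time")

window = 5.0                      # window length in seconds
hop = 1.0                         # step between windows in seconds
duration = len(y) / sr
starts = np.arange(0.0, duration, hop)
activity = np.array([
    np.sum((onset_times >= t) & (onset_times < t + window)) / window
    for t in starts])

plt.plot(starts, activity)
plt.xlabel("Time (s)")
plt.ylabel("Onsets per second")
plt.title("Sonic activity as onset density")
plt.show()
```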

Related papers on this work can be found in the following:

Hirst, D. (2014). The Use of Rhythmograms in the Analysis of Electro-acoustic Music, with Application to Normandeau's Onomatopoeias Cycle. Proceedings of the International Computer Music Conference 2014, Athens, Greece, 14-20 Sept 2014, pp. 248-253.
quod.lib.umich.edu/i/icmc/bbp2372.2014.039/1

Hirst, D. (2014). Determining Sonic Activity in Electroacoustic Music. Harmony: Proceedings of the Australasian Computer Music Conference 2014, hosted by the Faculty of the Victorian College of the Arts (VCA) and the Melbourne Conservatorium of Music (MCM), 9-13 July 2014, pp. 57-60.
acma.asn.au/media/2014/01/ACMC-2014r1.pdf

The Rhythmogram can be used to depict both long-term structures (the whole piece) and short-term, detailed structures of, say, 10 seconds. The pictures have been said to bear some resemblance to the hierarchical diagrams of tonal music by Lerdahl and Jackendoff.

MATLAB and the MIRtoolbox, plus the Auditory Toolbox, were used for the Rhythmograms and automated sound segregation.
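As a rough illustration of the underlying idea only (an onset-strength envelope smoothed at progressively longer time scales and stacked into a scale-time image), a Python sketch might look like the following; it is not a port of that MATLAB/MIRtoolbox code:

```python
# Rough sketch of a rhythmogram-style representation: an onset-strength
# envelope smoothed with Gaussian kernels of increasing width and stacked
# into a scale-time image. Illustrative only; the published figures were
# made with MATLAB, the MIRtoolbox and the Auditory Toolbox.
import numpy as np
import librosa
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter1d

y, sr = librosa.load("piece.wav", sr=None)   # placeholder file name
hop_length = 512
envelope = librosa.onset.onset_strength(y=y, sr=sr, hop_length=hop_length)

# Smooth the envelope at a range of time scales (in seconds).
frame_rate = sr / hop_length
scales = np.geomspace(0.1, 10.0, num=40)     # 0.1 s to 10 s
rhythmogram = np.vstack([
    gaussian_filter1d(envelope, sigma=s * frame_rate) for s in scales])

times = librosa.frames_to_time(np.arange(len(envelope)),
                               sr=sr, hop_length=hop_length)
plt.pcolormesh(times, scales, rhythmogram, shading="auto")
plt.yscale("log")
plt.xlabel("Time (s)")
plt.ylabel("Smoothing scale (s)")
plt.title("Rhythmogram-style scale-time image")
plt.show()
```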

A lot more detail can be found in my book:
Hirst, D. (2008). A Cognitive Framework for the Analysis of Acousmatic Music: Analysing Wind Chimes by Denis Smalley. Saarbrücken: VDM Verlag Dr. Müller Aktiengesellschaft & Co. KG.
It is available through your local amazon.com online store, or just download my PhD from:
minerva-access.unimelb.edu.au/handle/11343/39067

I have some other stuff too, but that is enough for now. 👴🏼

Cheers,
David

Dr David Hirst
Honorary Principal Fellow
Melbourne Conservatorium of Music
University of Melbourne
Parkville, Vic
Australia