Wednesday, August 30, 2006

Week5 - SuperCollider - MIDIIn

Here is a post of my code for accepting MIDI input data from the JV-30 to control SuperCollider code. It is not completely working yet, but it is fairly close. I managed to get some of my code working with the MIDIIn help file. I suspect that my "Connect by Device ID" section is correct, and that there is possibly something simple wrong with the instance of the synth.
MIDIClient.prList;

(MIDIClient.init;) //Initialises MIDI services
(MIDIClient.restart;) //Restart MIDI services
(
//Connect by Device ID (UID)
MIDIIn.connectByUID(
inport: 0,
uid: -382154081);

)
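As a side note, the uid passed to connectByUID can be read straight off the source list (a small sketch, assuming MIDIClient.init has already been run):

//List MIDI sources with their UIDs
MIDIClient.sources.do { |src| [src.uid, src.device, src.name].postln };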

(
//Phasing
SynthDef(
\phasing,

{
//Arguments
arg carFreq = 440,
panPos = 0;

//Variables
var carrier,
env;

//Carrier
carrier = SinOsc.ar(carFreq, 0, 0.5);

//Envelope
env = carrier * EnvGen.kr(Env.perc, 1.0, doneAction:2);

//Output
Out.ar(0, Pan2.ar(env, panPos));
}
).send(s)

)

(//Create instance of synth
b = Synth("phasing");
//Setup MIDI Controller
MIDIIn.control = {
arg uid, //Device ID
chan, //MIDI channel
num, //Controller number
val; //Controller value

//New (scaled) controller value, declared as a variable rather than an argument
var newVal;

//Respond only to controller number 16
if(num == 16, {
//Map the MIDI value range (0-127) to a frequency range (500-1000)
newVal = val.linlin(
inMin: 0,
inMax: 127,
outMin: 500,
outMax: 1000);

//Change frequency
b.set(
\carFreq, newVal );

}
);
});
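For what it's worth, my best guess at the "something simple" is that \phasing frees itself once Env.perc finishes (doneAction: 2), so by the time a controller message arrives, b.set is talking to a node that no longer exists. A sustained variant along these lines (just a sketch, not yet tested with the JV-30) should keep responding:

(
//Sustained version of the synth so that b.set keeps working
SynthDef(
\phasingSus,

{
//Arguments
arg carFreq = 440,
panPos = 0,
gate = 1;

//Variables
var carrier,
env;

//Carrier
carrier = SinOsc.ar(carFreq, 0, 0.5);

//Gated envelope holds the synth open until gate is set to 0
env = carrier * EnvGen.kr(Env.asr(0.01, 1, 0.5), gate, doneAction:2);

//Output
Out.ar(0, Pan2.ar(env, panPos));
}
).send(s)
)

//Create an instance that stays alive
b = Synth("phasingSus");

//Release it when finished
//b.set(\gate, 0);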

Sometimes I think that Christian does not tell us everything, yet when we see his code it always works.

Friday, August 25, 2006

Week5 - Improvisation Project

...there was sound.

1 microphone,
1 beatboxer,
1 MacBook Pro,
1 Powerbook,
1 Juno6,
1 Guitar Amp,
1 Electric Guitar,
1 Amp,
1 Theremin,
2 Monitors working overtime,
Heaps of worms, and most of the group involved.

The challenge for the group is now to get everyone involved, and to discern when, who, and what should be actively performing/improvising.
It was good having my computer back. Now I need to develop some appropriate SuperCollider stuff. My Max application was ok, but not awesome.

Albums that made this blog possible:
Extended play E.P. by Propellerheads

Program Notes From All Presenters

Semester 2 appears to be rolling along quite fast, which is not a bad thing. Information about graduations has been posted out, and I realise that securing a job as soon as possible after I finish my last assignment will be important. Anyway, four people presented in Forum and all four passed out program notes. The session was humorous, given that Tim's piece did not match his program note. At least I got to hear it again post-Forum, for the first time since 2004, and it was cool. Analogue 'musique concrete', in my opinion, works better than the digital variety.

From my seat in the front row, the pieces sounded fairly diverse, but all presenters included a 'musique concrete' work. It was hard to catch the described ternary form of Jake Morris's 'New Surroundings', and I think another listen would be useful to take in the sounds used. Ben Probert's palindromically titled piece 'Vocalacov' was interesting due to its vocal structures, and it displayed a different perspective on the genre. I can't remember a lot about William Revill's piece, but I think it would have been really interesting if it had been created on analogue tape.

SuperCollider can take MIDI input. This was discovered in the lab, when Christian introduced the topic for week 5. A MIDI keyboard can be used as a control source, and as a mechanism for real-time work. The processes seem to be fairly straightforward, but I've found SuperCollider to be challenging over the last few weeks. Hopefully a successful patch will make its way into the 'Stew' during the next week.
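As a rough idea of how simple the entry point is, something like the following (a sketch from memory, not Christian's example) prints incoming notes once the keyboard is connected:

(
MIDIClient.init; //start MIDI services
MIDIIn.connect; //connect inport 0 to the first source (or use connectByUID)
MIDIIn.noteOn = { arg src, chan, num, vel;
["noteOn", num, vel].postln; //note number and velocity from the keyboard
};
)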

Watching TV this week (ABC2), I saw an animation that Ashley Klose did the sound for, although I noticed Peter Combe's name first. In the Sound Design class this week, further discussions were had. It is really important to know when to come in with sound and music. It seems like this is a process that can't really be taught; it is more intuitive and based on experience. Next week we are going into the studio and we might be doing some Foley work.

That is all.

REFERENCES:
Haines, Christian. 2006. MIDI Input. Tutorial presented at the Electronic Music Unit, University of Adelaide, 25 August.


Klose, Ashley. 2006. Sound Design (4). Tutorial presented at the Electronic Music Unit, University of Adelaide, 25 August.

Whittington, Stephen. 2006. Forum Presentations. Presentations presented at the Electronic Music Unit, EMU Space, University of Adelaide, 25 August.

Albums that made this blog possible:

Howl by Black Rebel Motorcycle Club.

Wednesday, August 23, 2006

Week4 - SuperCollider - GUI

Here is my GUI code. It is not completely functional. The code runs and a GUI window comes up; however, I think there is an issue with my SynthDef, as no sound is produced (yet). I have successfully achieved sound with a window when I created a simple SinOsc SynthDef. I think everyone has heard the sound of a sine wave moving up and down, so I won't provide a sound file of that.


(
//Global Variables
~thisPath = (PathName.new(Document.current.path)).pathOnly;

//Granular Synth
SynthDef(
\GranS,

{
//Arguments
arg granDens = 0, //trigger, start at 0
granRate = 1.0, //Values: 0.5, 1.0, 2.0, -1.0
granPos = 0,
granDur = 0.2,
granPan = 0,
granAmp = 0.5,
bufNum = 0;

//Variables
var bufid,
bufNum1,
signal,
trig;

//Buffer SF
bufid = Buffer.read(s, ~thisPath++"Wonder.aiff");
bufNum1 = bufid.bufnum;

//Setup
trig = Impulse.kr(granDens);

//Signal Granulate
signal = TGrains.ar(
numChannels: 2,
trigger: trig,
bufnum: bufNum1, //use the buffer read above rather than the (unused) bufNum argument
rate: granRate,
centerPos: granPos,
dur: granDur,
pan: granPan,
amp: granAmp,
interp: 2);

//Output Signal
Out.ar(0, signal); //TGrains already returns a two-channel signal
}
).send(s);
)

(
//GUI Window

//Variables
var win,
slid,
syn,
sliderData;

//My window
win = SCWindow(
name: "GUI for CC",
bounds: Rect(
left: 600,
top: 750,
width: 400,
height: 200));
syn = Synth(\GranS);

//Setup Window
win.front;
win.view.decorator = FlowLayout(win.view.bounds);
//win.view.background = Color.black;

//Slider for my Window
slid = EZSlider(
window: win,
dimensions: 300 @ 25,
label: "granDens",
controlSpec: ControlSpec(
minval: 0,
maxval: 10,
warp: \lin,
step: 1,
default: 1),
action: { |edz|syn.set(\granDens, edz.value)},
labelWidth: 60,
numberWidth: 40);
)
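For reference, the SinOsc-and-window test mentioned above was along these lines (a sketch reconstructed from memory, so the names and values are approximate):

(
//Simple test synth
SynthDef(\sinTest, { arg freq = 440;
Out.ar(0, Pan2.ar(SinOsc.ar(freq, 0, 0.2), 0));
}).send(s);
)

(
//Window with a single slider controlling the frequency
var win, syn;
syn = Synth(\sinTest);
win = SCWindow("SinOsc test", Rect(600, 750, 400, 100));
win.front;
win.view.decorator = FlowLayout(win.view.bounds);
EZSlider(
window: win,
dimensions: 300 @ 25,
label: "freq",
controlSpec: ControlSpec(100, 2000, \exp),
action: { |ez| syn.set(\freq, ez.value) });
)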

Week4 - Audio Arts - Format Exercise


The short excerpt of a film that I have selected for this exercise comes from The Manchurian Candidate, which rates highly on the conspiracy scale. The excerpt is about a minute and a half long, and I viewed it on a home theatre system with a 5.1 setup. To set the scene, Captain Marco (Denzel Washington) is watching the TV news, which has a story relating to him. This triggers dreams of what "really occurred" in Kuwait (Gulf War).

5.1

The haunting strings lightly build up in the surround speakers. The TV voice merges into the sounds of Kuwait (helicopters, people performing the mind-control operations, guns loading, and the soldiers). The use of the surround speakers is important, as voices and sounds whir around your head. This spatial disorientation is very effective. There are a lot of different noises, including some low-frequency pounding. The 5.1 setup captures the sound as intended in terms of dynamics and frequency response.


Powerbook with Headphones

I played the excerpt on my PB's DVD player and found the sound to be fairly impressive. However, the number of noises going on tended to clutter the stereo mix coming through my headphones. The full power of the background musical elements (strings and low-frequency pounding) is partially lost. The subwoofer on the 5.1 system is never dominant, but is very effective.


Other Formats

I am going to attempt to consider other possible formats for viewing this scene, starting with the iPod (and other portable music/video players). It is a fairly easy task to convert the scene to an mp4 format compatible with an iPod. I would have done this except that my iPod cannot play videos. The visuals/audio would be fairly similar to what is experienced on the laptop, but the compressed format would result in the dynamic range being diminished (although this depends on how compressed the file ends up). The listening environment could be anywhere, but listening will be through headphones or the white earbuds. I think it would be much harder to experience the same level of spatial awareness that can be heard on the 5.1 system, and even the laptop DVD player.
The mobile phone would be a difficult medium for experiencing this scene. There are too many different noises, and the sound would be cluttered and confusing. I have no experience with 16mm tape, but reproducing the required spatial elements would be almost impossible.

Conclusion
I think that all possible formats would be able to effectively convey what is going on, given that the scene contrasts strongly with what preceded and followed it. The spatial element of the scene is almost as important as the visual. Therefore, a 5.1 setup is almost a prerequisite for viewing any modern film.

That scene was scary and disturbing.

Friday, August 18, 2006

The Title of Next Week's Blog Will be Shorter Than this One

Creative Computing results were delivered during week 4, and there was little in my PDF that surprised me. I noted in my previous weekly blog entry that SuperCollider (SC) compositions need a significant amount of time to become truly dynamic creations. One week is sufficient for creating the sounds, and another week for comprehensive structure and development. This leads me to the third week of student presentations in Forum, where Henry presented his SC work. Despite the use of different sound materials, it kind of developed in a similar fashion to my piece. Once the sounds are all stated, they just keep being presented in alternate patterns. Thus, there is this element of unpredictability combined with “I’ve heard this before.”

Again, I am thankful for the contrast between the works of the different presenters. Matthew Mazzone played his electronic and techno pieces, which sounded fairly typical of what I’ve heard of this genre, and may well suit a computer game. Contrasting with this was Daniel Murtagh’s Audio Arts recording. His “heavy metal” song was enjoyable and I rated the vocals highly, although the work needed some more depth.

GUI controls were the focus of SC and emerged as a simpler topic than last week's section on Arrays. Sometime next week I will post the code and a picture of a successful attempt at controlling granulation through a GUI controller.

I think I will leave discussing Sound Design until the midpoint of the semester (in 2 weeks time) and present any comments and issues that are possibly developing at that point. There will be an exercise posted in the next few days.

REFERENCES:
Haines, Christian. 2006. GUI. Tutorial presented at the Electronic Music Unit, University of Adelaide, 17 August.

Klose, Ashley. 2006. Sound Design(3). Tutorial presented at the Electronic Music Unit, University of Adelaide, 17 August.

Whittington, Stephen. 2006. Forum Presentations. Presentations presented at the Electronic Music Unit, EMU Space, University of Adelaide, 17 August.

Albums that made this blog possible:
Cosmo’s Factory by Creedence Clearwater Revival.

Week4 - Improvisation Project

Another session of improvisation and we got some technical issues sorted out, but little sonic achievement. It's awesome that David is bringing his guitar and amplifier to these sessions. It seemed that much of the equipment that we were considering using walked out of the door. Next class, we need to get first dibs on the amplifier that makes Ben's voice do some cool stuff.

As mentioned in last week's blog, I brought a maraca as a substitute for my computer. Unfortunately, I don't wish to do this again. Hopefully next week I won't have to use the word 'unfortunately' again, as I should have my PB back. I'm considering adding the application that I built in Max last year to my improvising pool, whilst I work on something appropriately stimulating in SuperCollider.

I think that once everyone is completely sure of what part they're playing in the improvisation, gold will be found. Albert's inclusion may require some more thinking in relation to how we all will participate. Hopefully, we can begin to get the entire group involved together next week.

Albums that made this blog possible:
...Keep it Hot by The Casanovas

Tuesday, August 15, 2006

Week3 - SuperCollider - Granular Synthesis - Update

Here's my code for week 3. It is a little weird, and still needs changes. The weird thing is that it works, but not with the sound file that I believe I am buffering. I execute the SynthDef section and then the TempoClock section. Both run without errors, but there is no sound. Prior to making the SynthDef error-free, it could not locate the sound file. Therefore, it must now recognise the sound file, but won't granulate it.

The odd part is that after executing the code, I can get the sound file from the TGrains help file to work with my code. Thus, I have not recorded it because it is unnecessary. I had intended, once my code worked properly, to find/create a good sound file to use for granular synthesis.
//IN THE SAND BOX

(
//Global Variables
~thisPath = (PathName.new(Document.current.path)).pathOnly;

//Granular Synth
SynthDef(
\GranS,

{
//Arguments
arg b = 10;

//Variables
var bufid,
bufNum,
buflen,
signal,
tRate,
trig,
dur;

//Buffer SF
bufid = Buffer.read(s, ~thisPath++"105-timpani.wav");
bufNum = bufid.bufnum;
buflen = BufDur.kr(bufNum);

//Setup
tRate = MouseY.kr(8, 120, 1);
trig = Impulse.kr(tRate);
dur = 6 / tRate;

//Signal Granulate
signal = Normalizer.ar(
TGrains.ar(2, trig, b, 1, MouseX.kr(0,BufDur.kr(b)), dur, 0, 0.1, 2), 0.5);

//Output Signal
Out.ar(0, signal);
}
).send(s);
)

(~gran = Synth("GranS");

//Setup Clock
~granTempo = TempoClock.new( tempo: 1.5,
beats: 0,
seconds: Main.elapsedTime.ceil);

//Multi-Dimensional Array
~granPerf = [
Array.series(100, 50, 1), //GrainDensity, (SIZE, START, STEP)
[100, 1.1, 1.2], //GrainRate
[0.25], //GrainPosition
[0.25, 0.5, 1.2] //GrainDuration
];

//Sequence Gran
~granTempo.schedAbs(0,
{ arg beat;

//Performance Parameters
~gran.set(
\granDens, ~granPerf.at(0).at(beat%~granPerf.at(0).size).postln,
\granRate, ~granPerf.at(1).choose.postln,
\granPos, ~granPerf.at(2).at(beat%~granPerf.at(0).size).postln,
\granDur, ~granPerf.at(3).choose.postln
)
}
);
)

Here's the update of my code, which actually works the way I want it to. I changed some of my buffer code. Unfortunately, I had overlooked the most important thing, which was using a mono sound file; I used a stereo file in my original attempts. I thank Martin for his valuable advice and comments.


//IN THE SAND BOX

(
//Global Variables
~thisPath = (PathName.new(Document.current.path)).pathOnly;

//Granular Synth
SynthDef(
\GranS,

{
//Arguments
arg granDens = MouseY.kr(8, 120, 1),
granRate = 1.0,
granDur = 0.2,
vol = 1;

//Variables
var bufid,
bufNum,
buflen,
signal,
tRate,
trig,
dur;

//Buffer SF
bufid = Buffer.read(s, ~thisPath++"Wonder.aiff");
bufNum = bufid.bufnum;
buflen = BufDur.kr(bufNum);

//Setup
tRate = MouseY.kr(8, 120, 1);
trig = Impulse.kr(granDens);
dur = 6 / tRate;

//Signal Granulate
signal = Normalizer.ar(
TGrains.ar(2, trig, bufNum, granRate, MouseX.kr(0, buflen), granDur, 0, 0.5, 2),
0.5);

//Output Signal
Out.ar(0, signal * vol);
}
).send(s);
)

(~gran = Synth("GranS");

//Setup Clock
~granTempo = TempoClock.new( tempo: 1.5,
beats: 0,
seconds: Main.elapsedTime.ceil);

//Multi-Dimensional Array
~granPerf = [
Array.series(100, 50, 1), //GrainDensity, (SIZE, START, STEP)
[100, 1.1, 1.2], //GrainRate
[0.25], //GrainPosition
[0.25, 0.5, 1.2] //GrainDuration
];

//Sequence Gran
~granTempo.schedAbs(0,
{ arg beat;

//Performance Parameters
~gran.set(
\granDens, ~granPerf.at(0).at(beat%~granPerf.at(0).size).postln,
\granRate, ~granPerf.at(1).choose.postln,
\granPos, ~granPerf.at(2).at(beat%~granPerf.at(0).size).postln,
\granDur, ~granPerf.at(3).choose.postln
)
}
);
)
Beware: this sound file is quite loud.
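Since the stereo/mono mix-up was the culprit, a quick check like this (a sketch using the same file) would have saved me some time, because TGrains only granulates mono buffers:

(
//Check the channel count before granulating (TGrains needs a mono buffer)
~thisPath = (PathName.new(Document.current.path)).pathOnly;
~buf = Buffer.read(s, ~thisPath++"Wonder.aiff", action: { |buf|
("channels: " ++ buf.numChannels).postln; //should print 1
});
)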

Sunday, August 13, 2006

Week3 - Audio Arts - Sound/Music Devices

To explore the devices that allow for good sound/music design in a scene, I have selected the penultimate scene (scene 50) of the movie Heat, directed by Michael Mann. This is a tense scene, and I am trying very hard not to give any spoilers.


Robert De Niro and Al Pacino are in a stand-off in the fields of the Los Angeles airport. The scene is rare in that there is initially no music; most of the movie has a variety of music propelling the action. Guns are drawn and it is like a game of cat and mouse. De Niro enters the scene first, and there are the sounds of his movements and breathing. The wind is howling and other night-time noises are heard. The scene is quiet yet very loud. There are big crescendos of sound as airplanes arrive at and depart the airport. I like the contrast in the sounds. Pacino then enters the scene, and the sonic environment is identical to that of De Niro's entrance, which I think is super. There is so much sound written into the drama. The crescendos of the airplanes are critical to it. The scene is reasonably dark, but is intermittently lit up by approaching airplanes. The music finally begins after about two minutes of scene setting. There is no dialogue, and it is a great sound environment. One final plane lights up the scene and powerful gunshots are fired.

I think this scene and much of the movie uses incredible sound and music design. I don't think that I have given too much away. The soundtrack of the movie is fantastic, but I could not locate the piece of music that was used in this scene on the soundtrack.

Saturday, August 12, 2006

Week3 - Improvisation Project

As an improv group, we temporarily moved into Studio 5 to attempt to get our improvisation started. A voice, microphone, amplifier and electric guitar are the first clear successes of our early attempts. This, however, involved only a third of the group, whilst the rest of us tried to make stuff work. So, much of the allocated time was spent listening and figuring out how to get some more instruments sounding. My sonic contribution was limited this week, and possibly will be next week too, whilst my PB is at the shop being 'repaired'. Therefore, I might bring some maracas next week. That may upset the anti-ambient members of the group...

The third of the group that did play produced some interesting sonic material, but I think that at all times we have to be careful not to over-improvise in constructing the improvisation project. When it comes to improvising, I think it is necessary to have an idea of what to do, and what it may sound best with. The way the improvisation unfolds has to be unexpected, whilst still leaving room for something completely new.

REFERENCES:
Whittington, Stephen and Harris, David. 2006. Forum Workshop. Lecture presented at the Electronic Music Unit, EMU Space, University of Adelaide, 10 August.

Friday, August 11, 2006

If This was the First Term of the Year We would be Three Weeks Away from the Mid-Semester Break

A good way to begin the first of my three or four blogs for the week is to start with the presentations at what is commonly known as the Music Technology Forum. First up was me. Last week I stated that I would be flying the flag first for the third-year students; however, I feel that it did not fly as strongly as it could have. My piece was OK for the presentation, but was generally too long. The small amount of preparation I put in prior to presenting made me realise a few things that I had kind of ignored when I finished the work at the end of last semester. I think that to create a stellar and dynamic work in SuperCollider, a significant amount of creative time is required, i.e. more than one week in a busy, stressful period of assignments and exams. For the final electronic composition that I will produce at the end of the year, I want to create a stellar, dynamic work that will be a good testament to my studies at EMU.

I am definitely a fan of Dragos’s DIY attitude in relation to his work using Reason 2.0. I did not get a lot out of his piece, but was thankful for the amount of contrast in the pieces selected for performance by all four contributors. I’m glad Vinny’s piece wasn’t excessively long like some of his others have been (otherwise I would have missed my ride home), and the combination of piano, laptop and visuals was cool. I actually preferred to watch the visuals on his computer screen, as they looked better than what I saw on the projector screen.

The SuperCollider class was fairly challenging. I found the granulation simple enough, but the teaching about Arrays was tricky, and there was a lot to get through. Thus, the class went 20 minutes over. Stay alert for my SuperCollider blog with code and hopefully a sound file. It should appear sometime on Wednesday. Until my computer is fixed, I have to do my code at Uni, so I have to be really focused when I do it.

More discussion was had in Sound Design, and there should be a blog entry discussing devices which allow good sound/music design in a particular movie.

That’s enough for this entry. I’m going to go watch a movie, eat some food, and then make some music. More entries coming soon for this week.

REFERENCES:
Haines, Christian. 2006. In the Sand Box. Tutorial presented at the Electronic Music Unit, University of Adelaide, 10 August.

Klose, Ashley. 2006. Introduction (2). Tutorial presented at the Electronic Music Unit, University of Adelaide, 10 August.

Whittington, Stephen. 2006. Forum Presentations. Presentations presented at the Electronic Music Unit, EMU Space, University of Adelaide, 10 August.

Albums that made this blog possible:
Let it Happen by MXPX.

Tuesday, August 08, 2006

Week2 - SuperCollider - Splice and Dice

Here lies my Splice and Dice code. It is actually quite small compared to other SuperCollider exercises. Not building a SynthDef helps to keep it short.

The sound file that I used was my mixdown from my major project of the previous semester. I think that it sounds interesting, but I could probably do more with it. I tried to put a scheduling clock into my code, but once I executed the code, a warning came up in the post window. I can't remember exactly what the warning was, but I think it was something to do with not being able to clear.

Splice and Dice
(
//Global Variables
~thisPath = (PathName.new(Document.current.path)).pathOnly;

//Buffer Sounds
~sf = BBCutBuffer(~thisPath++"MixDown.wav");)

(
//Setup Clock
~clk = ExternalClock(TempoClock(1.1));
~clk.play;

//Cut
~cutB = CutBuf1(~sf, 0);

//Playback
~bufC = BBCut2([~cutB, CutMixer(0, 1.0, 1.0, {0.5.rand})]).play(~clk);
)

//Mod TempoClock

~clk.tempo_(0.01);
~clk.tempo_(1);
~clk.tempo_(21);

//Free Buffer
~bufC.free;
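On the scheduling idea mentioned above, one approach that seems to work is queuing the tempo changes with SystemClock rather than executing them by hand (a sketch of my own, reusing the values above):

(
//Schedule the tempo changes instead of running each line manually
SystemClock.sched(4.0, { ~clk.tempo_(0.01); nil }); //returning nil stops rescheduling
SystemClock.sched(8.0, { ~clk.tempo_(1); nil });
SystemClock.sched(12.0, { ~clk.tempo_(21); nil });
)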


Sunday, August 06, 2006

Week2 - Audio Arts - Scene Analysis/Breakdown


For the analysis / breakdown of two scenes from a feature film, I have selected The Bourne Supremacy. The lead actor is Matt Damon who plays Jason Bourne.

Scene 1: Dialogue 23:50 – 25:35

A conversation takes place between Ward Abbott and Pamela Landy (CIA Directors) where they are discussing Treadstone, Alexander Conklin, and Jason Bourne for the first time in the movie.
Dialogue: Both characters speak in tense, authoritative tones, with both seeking superiority. In pauses there are breathy/sighing noises from Abbott (who is much older than Landy).
Music: No music for the first 40 seconds.
Foley: Landy instigates the first shuffling of papers, which is the only noise in addition to the dialogue for the first part. Abbott thumps the papers on the table, and there is the noise of Landy opening a manila folder. The motion (noise) of the papers symbolises who is actually more in control.
Atmos: Silence creates the atmosphere. It is tense, and is preparing to give the viewer insights into the main protagonist.

Dialogue: Abbott begins a narrative on Treadstone and Jason Bourne. The picture leaves the conversation and shows Bourne (the object of discussion). This picture is not a recollection, but what is occurring simultaneously.
Music: Music starts as Abbott begins his narrative and continues for the duration of the scene. It is fairly light and sensitive, with some light percussion and strings. A slight rise in tension and the vision returns to the conversation.
Atmos: The sounds of Bourne’s location occur.
Foley: Murmuring of voices.
Removal of passport from wallet.
FX: Water and Bird (gulls) sounds used for further imagery.

Dialogue: Landy shows her cards by stating her belief as to Bourne’s location, which is undermined by vision of Bourne going through customs.
Music: Steadily rising throughout.
Foley: Typing on keypad in customs

The scene is interrupted by a knock on the door and the door opening. Landy leaves, and her footsteps (Foley) are the only additional sound to the music. Abbott is left sitting in the room.

Scene 2: No Dialogue 19:37 – 21:07

The scene is in India and begins with the winching of Bourne’s vehicle out of the river. There is no dialogue at all.

Music: Deep, low-end music, with violins playing a high melodic line. A counter-melody begins whilst the camera is focused on Bourne, who is surveying the scene where his lover, Marie, died.
Foley: Sounds of winching.
Atmos: Voices of people (Indian).
Bird sounds.
Wind ruffling reeds.
FX: Engine sound.

Close up of fire where Bourne is burning Marie’s photos and personal effects.
Music: Music continues, and a piano is introduced. Bourne is grieving. Percussion builds as Bourne prepares to throw the last photo of him and Marie on the fire, but we do not actually see it happen.
Foley: Crackling and burning of fire.
Sound of passport being thrown on fire.

Music keeps going and Bourne is now walking through his house with determination and focus. Bourne is gathering up all his important items.
Foley: Bourne’s footsteps.
Rattling through drawers, removing frames, picking up passports.
Opening drawers, stuffing backpack, picking up gun.
Creaking and closing of door.
These sounds are mostly similar.
Music: Keeps going, and suggests Bourne knows what he is doing, has a mission.

Vision of bus exiting terminal (Bourne is on the bus).
Music: Keeps going.
Atmos: Sound of voices and vehicles.
Initially there is no sound other than the music when the bus is shown. The camera is focused on Bourne at this time.
Vision shows the other bus users.
Music: Slightly diminished.
Atmos: Voices of people in bus (no-one specific).
Foley: Noise of bus itself going along road.

The scene ends with three forceful, yet not overpowering drum hits.

Friday, August 04, 2006

Week2 - Improvisation Project

This is my entry regarding the improvisation project for Forum Workshop. Before splitting into our groups, improvisation was discussed, and my impression of improvisation in relation to group work is of something very constructed. By this I mean that we have 12 weeks to explore musical creativity through processes of group improvisation, so each week we are going to build upon ideas of how to improvise within our groups. Decisions have to be made about what instruments will be used, how particular sources of sound will be prepared, and to what extremes we are prepared to go. I recall that in the workshop there was significant discussion by Stephen about group dynamics, and I find that I am the only third-year in the group. Issue or non-issue? I think I will just incorporate some SuperCollider into the improvisational pool.

REFERENCES:
Whittington, Stephen and Harris, David. 2006. Forum Workshop. Lecture presented at the Electronic Music Unit, EMU Space, University of Adelaide, 3 August.

Albums that made this blog possible:
Another Day Goes By by Scarlet Brow.

Four Music Tech Classes in One Day

I can no longer summarise the week of Music Technology as it unfolds, since all my classes, not including Perspectives, fall on a Thursday. I am happy with this except for the fact that there are classes occurring in the MacLab when I have breaks. I also need to make sure I'm in bed early enough on Wednesday night to be 'refreshed' for all these classes.

First up on Thursday was the new Sound Design class with Ashley Klose, which was mostly a discussion, followed by an outline of devices which allow good sound/music design. I'm definitely keen to discover where this class will take us. We are being given weekly blog exercises, so those will be done and blogged at some later point. It is becoming apparent that I will be doing about four blog entries per week (day). The project for Forum requires “reports about the work in progress in your blog.” Any further comments that I have regarding this project will be in a separate blog entry.

Presentations from students occurred in the second half of Forum. This will be a recurring event for most of this term. After what seemed like the usual technical difficulties/issues, Luke and John (both first-year students) made their presentations. Luke's work was intended to challenge the musical ideals behind 'musique concrete'. However, if he reads this blog entry, I wouldn't mind him answering whether or not he feels he has established the foundations of traditional western music within 'musique concrete'. John's first work, 'Performance Symmetry', was short and was not really what I would consider 'musique concrete', but it was put together well. This was followed by his self-described 'gratuitous' guitar piece made for the Audio Arts project, which managed to keep my attention.

I will be flying the flag next week for the third-year Music Tech students by being the first to present. It is hard to decide what I will present, especially having not yet received any feedback on my projects from last semester. I find it bizarre that when I think about presenting, I feel that playing my SuperCollider work is the safe option. A six-minute work that isn't extremely exciting won't be unexpected. I ask myself whether I want to play one of the rock songs that I recorded, and I do not really know. So, whoever reads this will have to wait until Thursday to see what I have decided.

A quick mention goes to the SuperCollider class, which was about Splicing and Dicing. I will do the required reading, comment further once I've experimented with the exercise, and post it in another blog entry within the next five days or so.

REFERENCES:
Haines, Christian. 2006. Splice and Dice. Tutorial presented at the Electronic Music Unit, University of Adelaide, 3 August.

Klose, Ashley. 2006. Introduction. Tutorial presented at the Electronic Music Unit, University of Adelaide, 3 August.

Whittington, Stephen and Harris, David. 2006. Forum Workshop. Lecture presented at the Electronic Music Unit, EMU Space, University of Adelaide, 3 August.

Whittington, Stephen. 2006. Forum Presentations. Presentations presented at the Electronic Music Unit, EMU Space, University of Adelaide, 3 August.


Albums that made this blog possible:
Friction by Stavesacre.

Wednesday, August 02, 2006

Week1 - SuperCollider Exercise - Scheduling

The process of creating a phasing sequence was fairly simple, which is a good thing. I created a simple SynthDef to begin, and had only one error that needed to be corrected. I think that is pretty good considering I had not touched SuperCollider since the date the major projects were due. The creation of three sound sources and the scheduling of them to play for a short period of time was simple. I have not done anything here that is groundbreaking, but I think I've grasped the concept.


SCHEDULING
(
//Phasing
SynthDef(
\phasing,

{
//Arguments
arg carFreq = 440,
panPos = 0;

//Variables
var carrier,
env;

//Carrier
carrier = SinOsc.ar(carFreq, 0, 0.5);

//Envelope
env = carrier * EnvGen.kr(Env.perc, 1.0, doneAction:2);

//Output
Out.ar(0, Pan2.ar(env, panPos));
}
).send(s)

)

(
//Variables
var d,
e,
f;

//TempoClocks
d = TempoClock(1.5);
e = TempoClock(0.6);
f = TempoClock(3.0);

//Clocks
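//Each scheduled function returns a number (its delta in beats), so the clock keeps rescheduling it until cleared below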
d.schedAbs(1, {Synth(\phasing, [\carFreq, 440, \panPos, 1]); 2.0} );
e.schedAbs(0, {Synth(\phasing, [\carFreq, 447, \panPos, -1]); 1.0} );
f.schedAbs(0, {Synth(\phasing, [\carFreq, 452, \panPos, -0.5]); 3.0} );

//SystemClock
SystemClock.sched(25, {d.clear;
e.clear;
f.clear;});
)

Week1 - Audio Arts - Sans Analysis

Here is my sans analysis of the way I interpreted Vertov's 'Man with a Movie Camera'.

0:00|Man with Movie Camera|
Diegetic Sound|On Screen|Man|Walking|
Diegetic Sound|On Screen|Object|Shifting of Camera|
Diegetic Sound|On Screen|People|Noise of People|
Diegetic Sound|Off Screen|People|Stepping in puddles|

0:05|2 Women with Baskets|
Diegetic Sound|On Screen|Women|Conversation|
Diegetic Sound|On Screen|Background People|Noise|
Diegetic Sound|On Screen|Man|Walking past and any other sounds he makes|
Diegetic Sound|Off Screen|People|Noise of market sellers|

0:09|Tram|
Diegetic Sound|On Screen|Tram|Moving Tram Noise|
Diegetic Sound|On Screen|Other Trams|More distant noise of Trams|
Diegetic Sound|On Screen|People and Traffic|Noise including bicycle bell|

0:16|Women hanging clothes|
Diegetic Sound|On Screen|Hanging Clothes|Associated Noise|
Diegetic Sound|On Screen|People|Associated Noise|
Non-Diegetic Sound|Off Screen|Person|Harmonica playing|

0:18|Man with Eggs|
Diegetic Sound|On Screen|Person|Noise of man, ie. Breathing|
Diegetic Sound|On Screen|Person|Movement of Eggs|
Diegetic Sound|On Screen|People in Background|Noise|

0:21|Street Scene|
Diegetic Sound|On Screen|Tram|Tram Noise|
Diegetic Sound|On Screen|People|Pedestrian Noise|

0:27|Shutters Opening|
Diegetic Sound|On Screen|Shutters|Opening of Shutters, possibly creaking|
Diegetic Sound|Off Screen|Person|Noise of person opening shutters|
Non-Diegetic Sound|Off Screen|Something|Menacing growl from the darkness|

0:29|Teeth Brushing|
Diegetic Sound|On Screen|Kinetic movement of Toothbrush|Brushing of teeth|
Diegetic Sound|Off Screen|Water|Noise of water|

0:31|Man Walking Past Store|
Diegetic Sound|On Screen|Man Walking|Associated Noises|
Diegetic Sound|On Screen|Fence|Movement and strain of moving the fence|
Diegetic Sound|Off Screen|People|People who are about to walk past store|
Diegetic Sound|Off Screen|Residents|Noise of residents above store|

0:35|Letterbox|
Diegetic Sound|On Screen|Poster|Flapping of poster|
Diegetic Sound|On/Off Screen|Wind|Wind Noise|
Diegetic Sound|Off/On Screen|Women|Walking|
Diegetic Sound|On Screen|Letter|Sliding of letter in opening|
Diegetic Sound|Off Screen|Letter|Landing in letterbox|

0:40|Woman Walking Past Store|
Diegetic Sound|On Screen|Women|Walking|
Diegetic Sound|On Screen|Man|Winding handle of shutters|
Diegetic Sound|On Screen|Man|Grunts of exertion|
Diegetic Sound|On Screen|Shutters|Shutters movement upwards|
Diegetic Sound|Off Screen|People|People who are about to walk past store|
Diegetic Sound|Off Screen|Residents|Noise of residents above store|

0:45|Traffic/Pedestrian Officer|
Diegetic Sound|On Screen|Officer & Woman|Conversation|
Diegetic Sound|On Screen|People|Pedestrian Noise|
Diegetic Sound|On Screen|Object|Noise of moving the signal|
Diegetic Sound|Off Screen|Tram|Tram Noise|

0:48|Shutter Up|
Diegetic Sound|On Screen|Object|Shutter going up|
Diegetic Sound|Off Screen|Person|Noise of persons opening the shutter|

0:52|Vehicles on Road/Rally|
Diegetic Sound|On Screen|People and Vehicles|Associated Noise including horns, shouts, and engines|
Diegetic Sound|On Screen|People|Rally noises|

1:06|Fountain|
Diegetic Sound|On Screen|Fountain|Water Noises|
Diegetic Sound|Off Screen|People|Noise of any people hidden behind fountain|

1:08|Close up of Fountain|
Diegetic Sound|On Screen|Fountain|Water Noises|
Diegetic Sound|Off Screen|People|Noise of any people hidden behind fountain|

1:12|Shutters Opening (multiple times)|
Diegetic Sound|On Screen|Object|Noise of shutters opening|
Diegetic Sound|On Screen|Wind|Wind noise through trees|
Diegetic Sound|Off Screen|People|Noise of any people passing the window|

1:20|Sewing Machine|
Diegetic Sound|On Screen|Object|Sewing machine operating, the doll makes no noise|
Diegetic Sound|Off Screen|Room|Other noises from the room|

1:24|Sliding Up of Door|
Diegetic Sound|On Screen|Door|Sliding of door|
Diegetic Sound|Off Screen|Engine|Engine of vehicle|
Diegetic Sound|On Screen|Persons|Grunts and shouts and talking|

1:27|Final Scene (I Don't know what it is)|
Non-Diegetic Sound|Off Screen|Object|Unexpected Gun shots

Albums that made this blog possible:
Escape from Tomorrow Today by The City Lights. I tried this album as a possibility for some non-diegetic mood music, but it did not really fit.