Monday, May 29, 2006

Will It Click?

Audio Arts was a lesson in creative producing this week. David presented a number of songs he had recorded and produced in the past, so I was able to hear a mix of subtle and extreme recording and mixing techniques. These may provide inspiration for my upcoming recording session: if something is not working, I need to be creative and come up with a solution that maximises the quality of the sound. My recording session will actually take place before this blog is posted. One technique that David mentioned in class is double-tracking, which has the potential to enhance the recordings and the overall sound.
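
As a warm-up for the session, here is a minimal SuperCollider sketch of faked double-tracking: the same buffer played twice, with the second copy slightly delayed and detuned. The \doubleTrack name and the offset values are my own guesses, not settings from class, and it assumes a mono buffer is loaded into b as in the code further down.
(
//Double-tracking sketch
SynthDef(
\doubleTrack,
{
//Arguments
arg bufid = 0,
out = 0;

//Variables
var take1,
take2;

//Two "takes" of the same buffer
take1 = PlayBuf.ar(1, bufid, BufRateScale.kr(bufid));
take2 = PlayBuf.ar(1, bufid, BufRateScale.kr(bufid) * 1.005); //Slight detune on the second take
take2 = DelayN.ar(take2, 0.05, 0.02); //Roughly 20 ms between the takes

//Output
Out.ar(out, [take1, take2]); //Spread the two takes left and right
}
).load(s);
)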

No listening experience in Forum, due to a non-appearance from David Harris. This did not bother me too much; I went and wrote some emails and checked out the Life Impact videos. If people are going to vote, vote for Stephen Whittington's video, which features his Science of Music class. On the subject of Stephen Whittington, he was the presenter for what is usually the second hour of Forum. His topic was titled "Ventures in Vocoding and Distributed Music Performance". I always find it fascinating to hear about the various projects that lecturers are involved in, especially when it comes from a creative lecturer who is not afraid to experiment. Vocoding is the "hardware or software implementation of a speech-based compression algorithm." Stephen mentioned a particular vocoder that is part of all the old equipment currently being stored around the EMU. I believe I actually used this vocoder in first year Music Technology. I'm not sure if I used a voice; I think I may have fed two separate signals into it and discovered some sort of sonic result. If I did, the results may be lying about somewhere on a cassette tape. Distributed Music Performance is "any musical performance in which the performers are not in close proximity to one another." Stephen is extending his focus in this area to VoIP (Voice over Internet Protocol). Does this mean the use of a program like Skype?

I wanted to find an image of the vocoder that is located in the EMU, but I couldn't remember what model it was. However, I found a very interesting vocoder (Roland VP-330 MkII) taken from .
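
To make that definition concrete, here is a rough channel-vocoder sketch in SuperCollider. It is nothing like the EMU hardware unit; the \miniVocoder name, the band count and the band spacing are all my own assumptions. Band-pass filters measure the voice's energy per band, and the same bands of a bright carrier are scaled to match.
(
//Channel vocoder sketch
SynthDef(
\miniVocoder,
{
//Arguments
arg out = 0;

//Variables
var carrier,
modulator,
bands;

//UGens
modulator = SoundIn.ar(0); //Voice (or any signal) from the first input
carrier = Saw.ar(110); //Bright carrier for the voice to shape
bands = Mix.fill(16, { arg i;
var freq;
freq = 200 * (1.3 ** i); //Roughly logarithmic band spacing
BPF.ar(carrier, freq, 0.1) * Amplitude.kr(BPF.ar(modulator, freq, 0.1));
});

//Output
Out.ar(out, [bands, bands]);
}
).load(s);
)

//Play
Synth("miniVocoder");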

After missing a week of Creative Computing, we hit routing and building a soundfile player. I am planning to complete my major project in SuperCollider, and I think I am going to need the extra week. I really need to find much more SuperCollider time to advance constructively through the work. There will be a point where everything just clicks, and I believe it is not too far off. More code coming soon:
(
//Global Variables
~thisPath = (PathName.new(Document.current.path)).pathOnly; //The ~ tilde prefix makes an environment ("global") variable

//Buffer sounds
b = Buffer.read(s,~thisPath++"glass_cymbal.wav");
)
(
//Carrier
SynthDef (
\carrierSynth,

{
//Arguments
arg bufid = 0,
bus = 21,
dur = 4,
legato = 1;

//Variables
var soundFile,
env;

//PlaySound
soundFile = PlayBuf.ar(
1,
bufid,
BufRateScale.kr(bufid));

//Envelope
env = soundFile * EnvGen.kr(Env.perc(0, dur*legato), doneAction:2);

//Output
Out.ar(bus, env); //The enveloped soundfile is sent out on bus 21.
}
).store;



//Modulator
SynthDef (
\modulatorSynth,

{
//Arguments
arg out = 0,
in = 21,
filterRq = 0.4, //Argument that can be varied while the synth runs
modFreqStart = 10000, //Argument that can be varied while the synth runs
bus = 22; //Pass the filtered signal on to bus 22 for the fxSynth

//Variables
var audio,
modulator;

//Input Bus
audio = In.ar(in, 1).dup; //Read the mono signal on bus 21 and duplicate it to two channels

//Modulator
modulator = RLPF.ar(
in: audio,
freq: XLine.kr(
start: modFreqStart,
end: 100,
dur: 7),
rq: filterRq
);

//Output
Out.ar(bus, modulator);
}
).store;


//Effect
SynthDef (
\fxSynth,

{
//Arguments
arg out = 0,
in = 22,
bufnum = 0,
delayTime = 3.5; //Argument that can be varied while the synth runs (see the .set sketch below)

//Variables
var audio,
fx;

//Input Bus
audio = In.ar(in, 1).dup; //Read the filtered signal coming in on bus 22

//Effect
fx = CombN.ar(audio,
maxdelaytime: 5.0,
delaytime: delayTime,
decaytime: 5,
mul: 1,
add: audio);

//Output
Out.ar(out, fx); //Final mix to the hardware outputs
}
).store;
)
(
//Both are placed straight after the default group; d, added second, lands before e,
//so the signal flows carrier -> modulator -> effect
e = Synth.after(1, "fxSynth");
d = Synth.after(1, "modulatorSynth");
)
//Pbind Sequencer
( Pbind(
\instrument, "carrierSynth",
\bufid, b.bufnum,
\delayTime, Prand(#[3.5, 0.2, 2.5, 5], 16), //Not an argument of carrierSynth, so ignored here
\filterRq, Pseq(#[0.4, 0.1, 0.9, 0.6], 4) //Likewise ignored; see the .set sketch below
).play;
)
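
One catch I noticed while testing: delayTime and filterRq are not arguments of carrierSynth, so the Pbind values for them are ignored. To actually vary the running synths, something like this sketch with .set seems to be needed:
//Varying the running synths directly
d.set(\filterRq, 0.1); //Tighten the modulator's filter on the fly
e.set(\delayTime, 0.2); //Shorten the comb delay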
Sounds something like this:


REFERENCES:

Grice, David. 2006. Mixing (2). Tutorial presented at the Electronic Music Unit, University of Adelaide, 23 May.


Haines, Christian. 2006. SuperCollider (8). Tutorial presented at the Electronic Music Unit, University of Adelaide, 25 May.

Whittington, Stephen. 2006. Ventures in Vocoding and Distributed Music Performance. Presentation presented at the Electronic Music Unit, EMU Space, University of Adelaide, 25 May.


Albums that made this blog possible:
A Good Tip For A Good Time by Cato Salsa Experience.

Monday, May 22, 2006

I Don't Get It

The listening component of Forum isn't working for me at the moment. I don't get the reasoning behind playing Mr. Bungle. There may be a link with Stockhausen, but it appeared pretty thin, and the commentary/explanation given was minimal. I listened, but did not know what to listen for. What do I take away? I don't always get a lot out of listening to the electronic composers, but I understand their significance to the EMU. Karlheinz Stockhausen is intriguing. I like 'Mikrophonie', but don't get into 'Gesang'. The short-wave radios and national anthem ideas behind 'Hymnen' work, but I think I was irritated by the earlier playing of Mr. Bungle.

Mr. Robert Chalmers, in all his lawyerly smoothness (or some of it), presented a fast but informative discussion of copyright law and its application to music and technology. It was far from comprehensive, as areas of law that came up in the discussion prompted him to indicate that we would not want to go there. He even gave us some websites that he recommended we steer clear of. At least I know where to start searching if I have copyright issues. Copyright appears to be very convoluted indeed. The issue of when copyrighted materials enter the public domain is going to keep popping its head up in decades to come. The discussion prompted a number of hypothetical scenarios, so I guess some students may be testing the waters to see what actually is possible.

Mixing is a very appropriate topic at this time of the semester, and one that not even the nice Mr. David Grice can completely cover in a one-hour class. Another hour probably would not be enough. However, the basics were laid down, and with my recording session coming up in the not-so-distant future, some important information was presented that will help with the recording process, as well as the pre- and post-production stages. A reinforcement of some suggested microphone selections was particularly useful.

Again appearing last is the Creative Computing section of the blog. I'm not in any way undermining the value of this class like I did with the listening class; there was simply no class this week and consequently no materials presented. I still needed to provide a code example once I had a few hours free to work things out... and here it is:
(
//New Synth
SynthDef(
\newSynth,

{
//Arguments
arg carrierFreq = 110, //Frequency of carrier
carrierVol = 1, //Mul
filterRq = 0.4, //RLPF...reciprocal of Q
modFreq = 220, //Frequency of modulator
dur = 3,
legato = 2;

//Variables
var carrier,
env,
modulator,
modfilter;

//UGens
modulator = SinOsc.kr(
freq: modFreq,
phase: 0,
mul: 1
);
carrier = SinOsc.ar(
freq: carrierFreq,
phase: 0,
mul: [carrierVol * modulator, carrierVol * modulator]
);
modfilter = RLPF.ar( //RLPF => resonant low pass filter
in: carrier,
freq: XLine.kr(
start: 8000,
end: 650,
dur: 7),
rq: filterRq
);
env = modfilter * EnvGen.kr(Env.perc(0, dur*legato), doneAction:2);


//Output
Out.ar(0, env);
}
).load(s);
SynthDescLib.global.read //Lets the Pbind below look up the synthdef's arguments
)
//Pbind sequencer
( Pbind(
\instrument, "newSynth",
\carrierFreq, Pfunc({rrand(55, 440)}),
\dur, Pfunc({rrand(0.5, 7)}), //Makes sense to have dur before legato
\legato, Prand(#[0.1, 0.5, 1, 1.5], 8),
\filterRq, Pseq(#[0.4, 0.1, 0.9, 0.6], 2)).play
)


REFERENCES:
Grice, David. 2006. Mixing. Tutorial presented at the Electronic Music Unit, University of Adelaide, 16 May.

Harris, David. 2006. Listening (8). Workshop presented at the Electronic Music Unit, EMU Space, University of Adelaide, 18 May.

Chalmers, Robert. 2006. Law, Music and Technology. Presentation presented at the Electronic Music Unit, EMU Space, University of Adelaide, 18 May.


Albums that made this blog possible:
'Where'd You Go EP' by The Mighty Mighty Bosstones,
'Live from Toronto: Songs in the Key of Eh' by the Mad Caddies.

Monday, May 15, 2006

Shine On


The coveted first position on my blog goes to Audio Arts, where we looked at reverb, which emulates the size of the room a sound is sourced from. Reverb is a tool to colour and bring life to recorded music. Vocals (including backing vocals), drums and keyboards are the prime contenders for reverb enhancement. Reverb is not intended to clutter the mix, but to bring it to life. Pre-delay, the time between the direct sound and the reverb kicking in, is an important parameter. A gated reverb is the usual type used on a drum kit, so that would be a useful plug-in addition to the current ProTools setup.
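
To illustrate pre-delay, here is a minimal SuperCollider sketch: a plain delay sits in front of a crude allpass chain that merely stands in for a real reverb plug-in. The \preDelayVerb name and all the values are my own assumptions.
(
//Pre-delay sketch
SynthDef(
\preDelayVerb,
{
//Arguments
arg in = 0,
out = 0,
preDelay = 0.04, //The gap before the reverb kicks in
mix = 0.3;

//Variables
var dry,
wet;

//UGens
dry = In.ar(in, 1);
wet = DelayN.ar(dry, 0.2, preDelay); //Pre-delay before the tail
4.do({ wet = AllpassN.ar(wet, 0.05, Rand(0.01, 0.05), 2) }); //Stand-in reverb tail

//Output
Out.ar(out, [dry + (wet * mix), dry + (wet * mix)]);
}
).load(s);
)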

Due to the inclusion of code examples, Creative Computing will take last place this week, though in forthcoming weeks it may indeed take first place; the code would have to be rather special. Forum featured the music of Christian Marclay and Pink Floyd. Marclay is a turntable specialist, and the way he scratched works by different artists together is impressive. The piece built on Jimi Hendrix's work was a highlight, because there were moments of recognition, though most of the time the turntabling took much of the semblance away. Again, I enjoyed listening to Pink Floyd. This time it was the 'concept' album 'Wish You Were Here', released in 1975, incidentally the same year Eno released 'Another Green World'. I enjoyed 'Shine On You Crazy Diamond' in its nine parts, and heard a couple of interesting production techniques in the work, but perceived little else related to Music Technology.

Some interesting honours project presentations were given. Seb Tomczak's DIY physical interfaces should produce a fascinating and hopefully practical outcome, whilst it appears Darren Curtis is pursuing a field, Frequency Medicine, that may consume him for a significant number of years. An example of this 'medicine' is the study of binaural beat frequencies.
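
As a quick illustration of a binaural beat (my own example, not from the presentation), slightly detuned sine waves in each ear produce a slow perceived beat:
//Binaural beat sketch: 440 Hz left, 444 Hz right gives a perceived 4 Hz beat
{ SinOsc.ar([440, 444], 0, 0.2) }.play;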

So far my success with SuperCollider iteration techniques has been limited. I have struggled with the 'do' message and the concept of returning a function. However, I've pressed on to using the 'P' pattern classes, which appear to be a fairly simple way of sequencing. My first example takes my AM synthdef, now a stereo L and R synthdef, and drives it with a Pbind. In this example, I have also used a Pfunc to randomly change the carrier frequency (which is an argument of the AM synth). The sound quickly becomes hideous, so I will only present the code.

(
//AM Synth
SynthDef(
\synthAM,

{
//Arguments
arg carrierFreq = 220,
carrierVol = 1;

//Variables
var carrier,
modulator;

//UGens
modulator = SinOsc.kr(
freq: 220,
phase: 0,
mul: 1
);
carrier = SinOsc.ar(
freq: carrierFreq,
phase: 0,
mul: [carrierVol * modulator, carrierVol * modulator]
);

//Output
Out.ar(0, carrier) //carrier is already a two-channel array, so this writes to outputs 0 and 1
}
).load(s);
SynthDescLib.global.read; //So the Pbind below can find the synthdef's arguments
)
(
Pbind(
\instrument, "synthAM",
\carrierFreq, Pfunc({rrand(50, 500)})).play
)

Naming of code, and in particular of synthdefs, is going to become an issue. For example, as soon as my AM synth has its code changed by a UGen like Impulse.ar, it is no longer an AM synth, and a new name has to be determined and applied.
(
//AM Synth (not)
SynthDef(
\notsynthAM,

{
//Arguments
arg carrierFreq = 220,
carrierVol = 1;

//Variables
var carrier,
modulator,
something;
//UGens
modulator = SinOsc.kr(
freq: 220,
phase: 0,
mul: 1
);
carrier = SinOsc.ar(
freq: carrierFreq,
phase: 0,
mul: [carrierVol * modulator, carrierVol * modulator]
);

something = Impulse.ar([carrierFreq, carrier]); //Impulse trains: one at the carrier frequency, others with rates driven by the carrier signal itself
//Output
Out.ar(0, something)
}
).load(s);
SynthDescLib.global.read; //Again, so the Pbind can find the arguments
)
(
Pbind(
\instrument, "notsynthAM",
\carrierFreq, Pfunc({rrand(50, 500)})).play
)

REFERENCES:
Grice, David. 2006. Reverb. Tutorial presented at the Electronic Music Unit, University of Adelaide, 9 May.

Haines, Christian. 2006. SuperCollider (6). Tutorial presented at the Electronic Music Unit, University of Adelaide, 11 May.

Harris, David. 2006. Listening (7). Workshop presented at the Electronic Music Unit, EMU Space, University of Adelaide, 11 May.

Tomczak, Seb and Curtis, Darren. 2006. Honours Presentations. Presentations presented at the Electronic Music Unit, EMU Space, University of Adelaide, 11 May.

Dimery, Robert, ed. 2005. 1001 Albums You Must Hear Before You Die. Sydney: ABC Books.


Albums that made this blog possible:
'Happy Trails' by Quicksilver Messenger Service

Sunday, May 07, 2006

Darwin's Home of Happy Pets

Voiceover recording was our focus in Audio Arts, and I am completely convinced that Pet's Village is 'Darwin's Home of Happy Pets'. As demonstrated, voiceovers are mainly used for advertising. Therefore, they need to be as loud as possible, so that when I am watching TV, I am compelled to turn the advertisements down. The microphone required will always be a condenser, as the circuitry allows for optimum level with less noise. A dead room is the most successful location for the recording, to get an articulate voice. The polar pattern should be set to cardioid or hypercardioid; an omni pattern will pick up too much of the particular space. To achieve the highest level of volume on the voice, a compressor is used. This has similar parameters to a gate plus a limiter. The ratio is usually around 4:1, with the attack as fast as possible; the threshold varies depending on the voice. Being available during the second year class, I put on my best voiceover voice for their benefit and was recorded. The threshold required for my voice was significantly different to Henry's. The compressor brings up the lows and compresses the high peaks, whilst the lost volume is made up by the gain. The main issue is artifacts like breaths being made louder by the compressor.
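
To translate those settings into something concrete, here is a hedged SuperCollider sketch using Compander. The \voCompress name and the numbers are assumptions based on the figures from class, not the actual plug-in settings used.
(
//Voiceover compression sketch
SynthDef(
\voCompress,
{
//Arguments
arg out = 0,
thresh = 0.3, //Varies depending on the voice
makeup = 2; //Gain to make up the squashed peaks

//Variables
var voice;

//UGens
voice = SoundIn.ar(0); //Voice from the first input
voice = Compander.ar(voice, voice,
thresh: thresh,
slopeBelow: 1, //No gating below the threshold
slopeAbove: 0.25, //The 4:1 ratio
clampTime: 0.002, //Attack as fast as possible
relaxTime: 0.1);

//Output
Out.ar(out, [voice, voice] * makeup);
}
).load(s);
)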

The topic for SuperCollider was iteration, but I have so far found it difficult to apply the iteration concepts to my synthdefs to form a sequence. The 'do' message was looked at in a simple form:
do(5, {"boing".postln;}) 
and the result is this:
boing
boing
boing
boing
boing
5
By my logic, I should be able to apply 'do' to my existing function (the part that contains the arguments, variables, UGens and output) and repeat it a number of times, modifying the sound on each repeat to create my sequence. Taking iteration from this simple level (the example shown) to the required sequence still represents a huge step to me, and I have not yet produced a successful solution. A rough sketch of the direction I have in mind is below.
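
Here is a minimal sketch of that direction, assuming the \synthFM synthdef from Week 7 is already loaded; the frequencies and timing are arbitrary:
(
//Iteration sketch: 'do' inside a Routine stepping a synth five times
Routine({
5.do({ arg i;
var note;
note = Synth("synthFM", [\carrierFreq, 110 * (i + 1)]); //Raise the pitch on each pass
0.5.wait; //Step every half second
note.free; //synthFM has no envelope, so free each note by hand
});
}).play;
)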

The listening in Forum mostly took the form of rock music with some more experimental/electronic techniques. The Led Zeppelin and Pink Floyd examples were the highlights. I was most interested in hearing the difference between the Syd Barrett era of Pink Floyd and the non-Syd Barrett era. I have listened extensively to Dark Side of the Moon and Meddle, but had never heard any Pink Floyd with Syd Barrett prior to this class. The VCS3 is used to awesome effect in 'Breathe', and I've found some newer pieces that use a similar style of synthesiser, if not a VCS3. For your listening pleasure, I've loaded 'Zero' by Rhubarb; the synth is heard in an instrumental section in the middle of the song. I am only loading one song due to uploading issues, but will hopefully add another when I can.

REFERENCES
Grice, David. 2006. Voice Recording. Tutorial presented at the Electronic Music Unit, University of Adelaide, 2 May.

Haines, Christian. 2006. SuperCollider (5). Tutorial presented at the Electronic Music Unit, University of Adelaide, 4 May.

Harris, David. 2006. Listening (6). Workshop presented at the Electronic Music Unit, EMU Space, University of Adelaide, 4 May.

Rhubarb, and James, Caleb. 1999. Zero, Rhubarb. Toupee Records. [Sound recording: CD].

Monday, May 01, 2006

Recordings, Marathons, and Synthesis

Anzac Day shortened Week 7 by a day, resulting in no Audio Arts class. However, I did work on some recordings I made the previous week. With the help of Andrew Georg on (grand) piano, I experimented with and practised some microphone techniques, in particular the mid-side technique. I have recorded the piano in the space on a number of previous occasions, and this was the most successful attempt so far. The use of a Neumann U89 for the figure-8 position and a U87 for the omni was an effective matching; I had planned on using a matched pair, but could only find one U89. The brilliance of this technique appears when the figure-8 signal is split into a stereo pair and one copy is inverted: the quality of the sound skyrockets, and combined with the central omni placement it is brilliant. Adding some EQ and slight harmonic effects, I produced this recording:



Curiously, when I was trying to locate the sweet spot of the figure-8, both sides of the microphone produced almost exactly the same sound. Perhaps it was the song selection, but I think I managed to obtain a pretty good sound.
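
To get the idea straight in my head, here is a minimal mid-side decoding sketch in SuperCollider. The \msDecode name, the bus layout and the width control are my own assumptions, not the actual session setup: the figure-8 signal is duplicated, one copy is inverted, and the pair is summed with the omni mid signal.
(
//Mid-side decode sketch
SynthDef(
\msDecode,
{
//Arguments
arg in = 0, //Mid (omni) on this bus, side (figure-8) on the next
out = 0,
width = 1; //Scales the stereo spread

//Variables
var mid,
side;

//UGens
mid = In.ar(in, 1);
side = In.ar(in + 1, 1) * width;

//Output
Out.ar(out, [mid + side, mid - side]); //Right channel holds the inverted copy
}
).load(s);
)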

The sound created by Xenakis in 'Voyage to Andromeda' was not such a good sound, but that judgement is subjective and possibly biased. However, it was done with a computer in the 1980s and commissioned for a kite festival in Japan, so I'm happy to accept it. The voyage it took us on was disturbing, probably one I would want to forget; the journey became monotonous and lost its evocative appeal. My level of acceptance was lower when it came to 'In Flagranti' by Gabrielle Manca. The experimental bottleneck guitar seemed irrelevant to a tech forum. Wouldn't you play it to a student studying guitar, or in a composition forum?

The honours presentation by Seb Tomczak was engaging. The ideas and premises behind Milkcrate were interesting; I particularly liked the notion that each Milkcrate event is like 'running a musical marathon'. The marathon would be a real struggle if it were all based on basic synthesis like the following SuperCollider examples.

 synthAM
(

//AM Synth
SynthDef(
\synthAM,

{
//Arguments
arg carrierFreq = 220,
carrierVol = 1;

//Variables
var carrier,
modulator;

//UGens
modulator = SinOsc.kr(
freq: 220,
phase: 0,
mul: 1
);
carrier = SinOsc.ar(
freq: carrierFreq,
phase: 0,
mul: carrierVol * modulator
);

//Output
Out.ar(0, carrier)
}
).load(s);

)

//Play
b = Synth("synthAM").play
b.set(\carrierFreq, 440);


The code for the 'synthAM' synthdef is heavily based on the example 'Basic AM Synth'. However, I wrote the code out in full and rejoiced when all the brackets matched up with their partners. Running through code errors took some time, but these were mostly to do with commas and semicolons, and most occurred in the UGens section. Just in writing this code, I found that my understanding of arguments and variables, and how to use them effectively, increased substantially. The use of single-character variables is now much clearer. My 'synthFM' example was based on the existing synthdef template, but I had to work it out without an in-class example, and as a result my code-writing abilities are increasing significantly.


 synthFM
(

//FM Synth
SynthDef(
\synthFM,

{
//Arguments
arg carrierFreq = 220,
modFreq = 220,
modIndex = 900;

//Variables
var carrier,
modulator;

//UGens
modulator = SinOsc.ar(
freq: modFreq,
phase: 0,
mul: modIndex
);
carrier = SinOsc.ar(
freq: carrierFreq + modulator,
phase: 0,
mul: 0.4
);

//Output
Out.ar(0, carrier)
}
).load(s);

)

//Play
b = Synth("synthFM").play
b.set(\modIndex, 440);

References

Haines, Christian. 2006. SuperCollider (4). Tutorial presented at the Electronic Music Unit, University of Adelaide, 27 April.

Harris, David. 2006. Listening (5). Workshop presented at the Electronic Music Unit, EMU Space, University of Adelaide, 27 April.

Tomczak, Seb. 2006. Honours Presentation. Presentation presented at the Electronic Music Unit, EMU Space, University of Adelaide, 27 April.