Friday, October 27, 2006

Week12 - Improvisation Project - 19 Days

Every improvisation class is different, and our final session together was no exception. Our sessions are very hit and miss. I think we were close to hitting something again this week, but we may have just missed it. I never really felt anything whilst playing my bass and struggled with the heavier songs we attempted.

Perhaps we never really settled after experimenting with the 7/8 asymmetric time signature that David Harris suggested. It was again disappointing that we did not have the full group together.

The performance will take place in three weeks' time, and hopefully we will produce something impressive. I am trying to decide whether to continue playing on two bass strings or bring a four-string bass with me. Will the full range help to expand my playing, or make me try to do too much?

Albums that made this blog possible:
Ska-Core, the Devil, and More by the Mighty Mighty Bosstones.

Week11 - SuperCollider - Data Technique - 19 Days

I have worked on a patch for this post, and it works. I did not get the control envelopes working, so if I post it, it will not show much development beyond week 10's patch, with one exception: this time I granulate a buffered sound, whereas my previous patch granulated via the sine method. The sound it produces is good and I have overcome a couple of significant hurdles, which is useful for my project. I think I will end up with an interesting SuperCollider project.
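For anyone curious, the buffered approach has roughly this shape. This is a stripped-down sketch of my own rather than the actual patch, and the \bufGrain name and the sound file path are just placeholders:

//Buffer Granulation (sketch)
(
//Load a mono sound file into a buffer (example path only)
b = Buffer.read(s, "sounds/a11wlk01.wav");
)
(
//One grain = a short, enveloped slice of the buffer
SynthDef(
\bufGrain,
{
arg bufnum = 0, rate = 1, pos = 0, dur = 0.1, amp = 0.2, pan = 0;
var env, sig;

//Grain envelope frees the synth when the grain ends
env = EnvGen.kr(Env.sine(dur, amp), doneAction: 2);

//Read the buffer from a normalised start position
sig = PlayBuf.ar(1, bufnum, rate * BufRateScale.kr(bufnum),
startPos: pos * BufFrames.kr(bufnum));

//Output
Out.ar(0, Pan2.ar(sig * env, pan))
}).load(s);
)
(
//Scatter grains across random positions in the buffer
fork{
200.do{
Synth(\bufGrain, [
\bufnum, b,
\pos, 1.0.rand,
\rate, [0.5, 1, 2].choose,
\dur, exprand(0.05, 0.2),
\pan, 1.0.rand2
]);
exprand(0.02, 0.08).wait
}
}
)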



Wednesday, October 25, 2006

My rhyming sucks - 21 Days

The album making this blog happen is a funky early-90s rap record with heaps of rhyming. It kind of inspired me to start some rhyming in this blog, but I think it will be too difficult.

Week 11 began with Audio Arts, and the class was good. It was useful to discuss some of the issues that are already coming up in our projects. Since the class, I have received what I believe is the animatic. It has not enlightened me greatly as to the plot, but the main character is interesting. I think the story is called "The Story of the Recipe Alchemist". There is potential for some interesting sound design. Atmosphere is going to be incredibly important, but at this point it is difficult to think of any areas that may need to be extended in time.

Trooper Saliva rhymes with SuperCollider, but makes no sense. Similarly, the short section of the Creative Computing class looking at score creation did not make much sense; it may be worth going over it with a proper working example in class. The data technique material did make sense, and I got a control envelope to work with my sine granulation. I will probably post my SuperCollider patch tomorrow, because by then I should have something worth showing.
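For the record, this is roughly what I mean by a control envelope over the granulation. It is a cut-down sketch of my own, not my actual patch: a tiny sine-grain SynthDef, plus an Env describing how the grain pitch should move over the whole gesture, read off at the routine's running time with .at. The \sineGrain name and the breakpoint values are arbitrary.

//Sine Granulation with a Control Envelope (sketch)
(
//A single sine grain with a sine-shaped amplitude envelope
SynthDef(
\sineGrain,
{
arg freq = 440, dur = 0.1, amp = 0.1, pan = 0;
var env;
env = EnvGen.kr(Env.sine(dur, amp), doneAction: 2);
Out.ar(0, Pan2.ar(SinOsc.ar(freq) * env, pan))
}).load(s);
)
(
//Control envelope: grain pitch rises then falls over 20 seconds
var pitchEnv = Env([48, 72, 60], [10, 10], \lin);
var time = 0, totalTime = 20, grainDur, wait;

fork{
block{|break|
inf.do{
Synth(\sineGrain, [
//Envelope value at the current time, plus a little scatter
\freq, (pitchEnv.at(time) + 3.0.rand2).midicps,
\dur, grainDur = exprand(0.05, 0.2),
\amp, rrand(-24.0, -12.0).dbamp,
\pan, 1.0.rand2
]);
wait = grainDur * rrand(0.1, 0.5);
time = time + wait;
if (time > totalTime) { break.value };
wait.wait
}
}
}
)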

My group's promising and successful improv session was cut short so we could listen to Luke Harrald's presentation. Luke is an acoustic and electronic music composer. He discussed some of the musical experiences he has had overseas, mostly at IRCAM and NIME (New Interfaces for Musical Expression). Much of the focus was on the new interfaces, so a significant section of the presentation was devoted to some of the work of Adachi Tomomi.
It was good to hear Luke speak,
and there's no more for this week.

REFERENCES:
Haines, Christian. 2006. Data Technique. Tutorial presented at the Electronic Music Unit, University of Adelaide, 19 October.

Harrald, Luke. 2006. Artist Talk. Lecture presented at the Electronic Music Unit, EMU Space, University of Adelaide, 19 October.

Klose, Ashley. 2006. Sound Design (9). Tutorial presented at the Electronic Music Unit, University of Adelaide, 19 October.


Albums that made this blog possible:
Free at Last by DC Talk.

Sunday, October 22, 2006

Week11 - Improvisation Project - 24 Days

This week's improv session was the most successful session I have been involved with so far. We came up with a number of songs in a range of styles that just seemed to work and were enjoyable to play.

Unfortunately, only five of the group were there, so there are still two members who have to be incorporated into the group's dynamic, and that is a challenge. Five people seems to be the critical mass for the success of the group. I hope this can change in our last session before the performances.

I think that all groups performing on the same day/evening is the way to go. It will be hard to set everything up, but that beats having three separate performances, especially if some of them were going to occur before the major project is complete. Hopefully the improvisation performances will be a substitute for the recital. It may be possible to screen some of the movies that the third-year students are working on, along with some selected compositions from across the years.

Albums that made this blog possible:
Second Solution/Prisoner of Society by the Living End.

Wednesday, October 18, 2006

Week10 - SuperCollider - Granular Synthesis (2) - 28 Days

I successfully managed to recreate a sine granulation patch with my own minor modifications, so I have some granular synthesis working. The next step for me was applying it to the Karplus-Strong method of physical modelling... and it almost worked. I will post what I have done (the patch) and at some point work out how to get the pitch to granulate. Until I do, I am left with a sound that is fairly useless.

Here is the revised code that works:
//Karplus Strong Granulated
(
//Noise > Filter > Delay
SynthDef(
\noiseBurst,

{
//Arguments
arg amp = 0.5,
dur = 1,
pan = 0,
midiPitch = 69,
dec = 0.001;

//Variables
var burstEnv,
env,
att = 0,
delayTime,
decayTime = 10,
noise,
del,
panning;

//Delay Time
delayTime = midiPitch.midicps.reciprocal;

//Envelope
burstEnv = EnvGen.kr(
envelope: Env.perc(att, dec));

//Envelope2
env = (EnvGen.kr(Env.sine(dur, amp), doneAction: 2));
//Noise
noise = PinkNoise.ar(burstEnv);

//Delay
del = CombL.ar(
in: noise,
maxdelaytime: delayTime,
delaytime: delayTime,
decaytime: decayTime,
add: noise) * env;

//Spatialisation
panning = Pan2.ar(del, pan);

//Output
Out.ar(0, panning)
}).load(s);

)
(
//Variables
var message,
wait,
thisGrainDur,
time = 0,
totalTime = 20;

//Routine
fork{
block{|break|

inf.do{
message = [ \midiPitch, [40, 52, 64, 76].choose, //rrand only takes (lo, hi); choose from the pitch set instead
\amp, rrand(-18.0, -6.0).dbamp,
\dur, thisGrainDur = exprand(0.1, 0.2),
\pan, 1.0.rand2,
\dec, rrand(0.001, 0.2)
];
//Instance
Synth(\noiseBurst, message);

//Duration and Interval
wait = thisGrainDur * rrand(0.05, 0.5);
time = time + wait;
if (time > totalTime) { break.value};
wait.wait
}
};
}
)

Tuesday, October 17, 2006

Save the Stew - 29 Days

Here is a quick update on Music Technology in week 10:

Most of the Audio Arts class was spent looking at the I/O and track setup in Pro Tools. I really hope that a template or screenshots are provided for us, as this is something I usually struggle with, and when it comes to the project I want as few side issues to deal with as possible. I think I now have a movie in the pipeline, but due to other work I cannot yet get my teeth into it. In the not-too-distant future it would be very useful for each of us to have a one-on-one session of at least 20-30 minutes with Ashley to discuss our projects.

Creating our own custom-built granular synth patches was the focus of Creative Computing. I will hopefully post some code once I can figure it out. I would really like to try to modify some of the other processes that have been examined, but I have not yet been able to build the example patch. On the positive side, I submitted a pre-production plan that I am confident will result in a good-sounding and achievable project.

This post is now complete.

REFERENCES:
Haines, Christian. 2006. Granular Synthesis (2) & Array Sequence. Tutorial presented at the Electronic Music Unit, University of Adelaide, 12 October.

Klose, Ashley. 2006. Sound Design (8). Tutorial presented at the Electronic Music Unit, University of Adelaide, 12 October.


Albums that made this blog possible:
Caught in the Act by Royal Crown Revue.


Friday, October 13, 2006

Week10 - Improvisation Project - 33 Days

Two hours dedicated to our improvisation groups was possibly a good idea: a chance to focus after our sessions with master improvisers. I know that I needed some solid time to spend with the group. It was an interesting session, but so far every session has been interesting; the nature of the task has brought up a range of different scenarios, and each week has been different.

I note from this week's session that we are still unable to achieve solid input from every member. It is a challenge to get everyone instrumented up and stimulated. We performed last week with DJ Tr!p, and with only two weeks of classes remaining, I am unsure what we need to achieve before the end.

I brought my bass guitar specially for our improvisation. I don't usually like bringing it with me because I always feel self-conscious carrying an instrument around the music school. My two-stringed bass guitar (the other two strings were absent) was used throughout the session. Initially I ran it through the ring modulation SuperCollider (SC) patch that I used on David's guitar last week. I had more success on this attempt because I had a sound that could be isolated in the mix. The patch created an unstable sound due to its modulation, so I altered it to take my bass signal straight to the output. I was reasonably impressed with the way SC processes my signal and used this sound for the rest of our jams. Ultimately, a compressor on the signal would have been useful; perhaps I can build an SC patch that includes its own compressor, as sketched below.
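Something along these lines might do the job. This is only a rough sketch under my own assumptions (the \bassRingComp name, the input bus and all the parameter values are placeholders): the live bass comes in on the first input, a simple ring modulator can be faded in or out, and a Compander does the compression.

//Bass Ring Modulation with Compression (sketch)
(
SynthDef(
\bassRingComp,
{
arg modFreq = 220, mix = 0.5, amp = 1;
var in, ring, sig;

//Live bass signal on the first hardware input
in = SoundIn.ar(0);

//Ring modulation
ring = in * SinOsc.ar(modFreq);

//Blend between the dry signal (mix = 0) and the modulated signal (mix = 1)
sig = XFade2.ar(in, ring, mix * 2 - 1);

//Simple compression to tame the level before it reaches the mix
sig = Compander.ar(sig, sig,
thresh: 0.3,
slopeBelow: 1,
slopeAbove: 0.3,
clampTime: 0.01,
relaxTime: 0.1);

//Output
Out.ar(0, (sig * amp).dup)
}).load(s);
)

//Start processing, then switch between straight and modulated sounds
x = Synth(\bassRingComp);
x.set(\mix, 0); //straight bass
x.set(\mix, 1); //full ring modulation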

I was grateful that I could contribute much more to the group with my two-string bass. The two strings did limit my creative output and I tended to repeat myself a lot, so I was always going to struggle to influence the direction when things grew stagnant. On a number of occasions I stopped playing to see if things would change, which they rarely did. I did enjoy some of the interplay I had with Ben, who was fully functional. The introduction of some disco elements was cool, as was the Sunshine of Your Love riff. This reminded me a little of what Derek Pascoe mentioned and demonstrated in his session. However, I don't recall there being any mention of anything that occurred in the three previous Forum sessions.

Every day is a new day, so I'm interested to see where our next Improv session may go and if it will be successful.

Albums that made this blog possible:
Figure 8 by Elliott Smith.

Wednesday, October 11, 2006

Week9 - SuperCollider - Physical Modelling

The Karplus-Strong (KS) method of physical modelling is noise > filter > delay. After I recreated some of the example patches for this method, I thought about the task of how to modify the patch. I managed to create some unique sounds, but no new instrumental sounds.

There are a number of parts and parameters of the KS patch that can be altered, but the effect is often slight. It was mentioned that this method can create percussive sounds as well as the plucked-string sound, but I was never able to create a percussive sound. Anyway, these are some of the things that can be altered within the original patch:
  • Decay Time
  • Pitch
  • Envelope Attack
  • Noise
  • Delay
In the KS method I am still confused about exactly what the filter and delay are. The examples use CombL.ar, which is a delay object, but is it the delay or the filter that the method depends on, or is it both? As far as I can tell, CombL is a delay line with feedback, so it acts as both the delay and the comb-filter resonance. Either way, this is the part of my patch that has the most significant impact on the sound. Increasing the attack of the envelope also has a significant effect.
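To convince myself, I put together a quick comparison (my own sketch, not one of the class examples). The left channel is DelayL, a plain delay with no feedback, which just produces a delayed click; the right channel is CombL, whose recirculating feedback turns the same noise burst into the plucked-string tone.

//DelayL vs CombL (sketch)
(
{
var period, exciter, hold;

//Period of A 440
period = 69.midicps.reciprocal;

//Short noise burst as the exciter
exciter = PinkNoise.ar(EnvGen.kr(Env.perc(0, 0.001)));

//Free the synth after 4 seconds
hold = Line.kr(1, 1, 4, doneAction: 2);

[
DelayL.ar(exciter, period, period), //left: one delayed copy, no pitch
CombL.ar(exciter, period, period, 10) //right: recirculating delay, the plucked string
] * hold
}.play;
)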

The following patch substitutes a Ringz filter for the CombL, thus making the pitch redundant.
//PHYSICAL MODELLING Ringz
(
//Noise > Filter > Delay
SynthDef(
\noiseBurst,

{
//Variables
var burstEnv,
att = 0,
dec = 0.001,
delayTime,
decayTime = 10,
midiPitch = 69, // A = 440
noise,
del;

//Delay Time
delayTime = midiPitch.midicps.reciprocal;

//Envelope
burstEnv = EnvGen.kr(
envelope: Env.perc(att, dec));

//Noise
noise = PinkNoise.ar(burstEnv);

//Delay
del = Ringz.ar(
in: noise,
freq: delayTime, //this is the period (1/frequency), so the nominal pitch has little audible effect
decaytime: decayTime,
add: noise);

//Output
Out.ar(0, [del, del]);
}
).play
)

Tuesday, October 10, 2006

I Am Going To Start A Countdown To The End Of The Degree

Week 9 was not a startling week. It just zoomed by. I am happy for the weeks to go by fast because it means the end of the degree is coming soon. I need to make sure that I get enough work done and am learning enough in these last, fast weeks. The Improv project has already been discussed for the week so I will start with Audio Arts.

Sound Design was an interesting class. The tutorial addressed what to do once you get the vision/movie that you are going to do the sound for. Hopefully I will get a movie soon; the animations mentioned in class sound promising. The deadline is getting closer and closer, so a plan of attack will be essential. Ashley discussed creating a log sheet and a master sound sheet as the way to do this.

The focus of Creative Computing was the Karplus-Strong (KS) method of physical modelling. Recreating the examples was simple enough, but trying to come up with new sounds from it is challenging. In my next blog I will attempt to overcome this challenge and see where the KS method gets me.

Lately, I feel that my blogs have not been interesting enough. They just seem a bit bland and I am not completely sure how I am going to rectify that. Maybe something will happen in week 10 that will unearth some gold from me.

REFERENCES:
Haines, Christian. 2006. Physical Modelling. Tutorial presented at the Electronic Music Unit, University of Adelaide, 5 October.

Klose, Ashley. 2006. Sound Design (7). Tutorial presented at the Electronic Music Unit, University of Adelaide, 5 October.

Albums that made this blog possible:
Armed Love by The (International) Noise Conspiracy.

Friday, October 06, 2006

Week 9 - Improvisation Project

In the lead-up to our group's improvisation performance I produced some work that I was fairly impressed with. Unfortunately, because I was in a class, I did not get to test my SuperCollider patch properly with my team; that would have been useful. Just before the session began, I managed a short test of the patch, which required David to be playing his electric guitar. The results were pleasing, but they were not matched in the performance itself.

Our special guest was DJ Tr!p (Tyson Hopprich). He was interesting, but I only paid attention for short periods of time. One thing I did take away, which is possibly kind of obvious, is to have a pool of material to draw sounds from. I went into the performance with only one functioning process for creating sound, and as our last improvised piece demonstrated, minimal use of the electric guitar made me redundant.

Throughout the performances I mostly had at least one eye fixed on the input meter of my PB to see how much electric guitar signal I had to process. The signal was rarely strong. I think my processing worked from time to time, but I found it very difficult to hear and distinguish what I was doing. My positioning, and that of the monitors and guitar amplifier, was probably not ideal.

The performances themselves were reasonably impressive. The piano (Matt), keyboards (Dragos) and electric guitar (David) as well as what DJ Tr!p was sampling were the most easily distinguished elements of the mix in my opinion. However, for much of the time I was focused on trying to locate myself in the mix.

REFERENCES:
Hopprich, Tyson. 2006. Improvisation Workshop. Lecture presented at the Electronic Music Unit, EMU Space, University of Adelaide, 5 October.

Albums that made this blog possible:
Franz Ferdinand by Franz Ferdinand.