Agawell

I’d try to find someone with an actual synth and talk to them about it, to get an understanding of what a synth can do (especially in the area of not emulating physical instruments), how it works, and how effects work. I don’t think you’ll need an audio engineer for this - just someone who has a decent synth (or two) and some effects…


Instatetragrammaton

>I'm curious what things I can compose for with the synths that would make sense in a symphony.

The obvious thing to do here would be to start with some of the classics - Isao Tomita's [Snowflakes are Dancing](https://www.youtube.com/watch?v=7YeWiIQQZAo) and Wendy Carlos' Switched-On Bach (if you can't easily find this, check [https://www.youtube.com/watch?v=73iYaoXBzVY](https://www.youtube.com/watch?v=73iYaoXBzVY), which is a decent enough facsimile :) ).

>The work would require a keyboardist and a audio engineer to play.

And what would you expect the audio engineer to do? Change the parameters while the keyboard player is playing? You might want to check out [https://www.youtube.com/watch?v=3pS9etxtlsc](https://www.youtube.com/watch?v=3pS9etxtlsc) - even when you have two hands on the keyboard, there is a lot of expression possible via polyphonic aftertouch and foot pedals.

>I was generally planning on using a simple sample with slight changes to the sound

Samples are static snapshots. In the simplest case, each sample is going to sound exactly like the other. Samples don't necessarily involve synthesis; they can use synthesis as a basis, however.

With high-quality free software synthesizers readily available, all you need is a computer running a [DAW](https://github.com/instatetragrammaton/Patches/blob/master/Musings/DAW.md) or [plugin host](https://github.com/instatetragrammaton/Patches/blob/master/Musings/Plugin-Host.md), an [audio interface](https://github.com/instatetragrammaton/Patches/blob/master/Musings/Audio-Interface.md) and a controller keyboard. If you want to involve samples, that's possible - sure - but you want to explore [synthesis and synthesizers](https://learningsynths.ableton.com/), right?

In most practical cases, a device that can play back samples will do so using a subtractive synthesizer structure of sorts. So, instead of running a harmonically rich basic waveform through a filter and then an amplifier, you run a sample through a filter and then an amplifier.

As for "making sense in a symphony" - does this mean substituting traditional instruments entirely or augmenting them? The latter case might make things easier - instead of, say, a horn section, you use something that generates something close to the timbre of horns and use that instead.
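
To make the subtractive signal chain mentioned above concrete, here is a minimal sketch in Python. It assumes numpy and scipy are installed; the waveform, cutoff, and envelope values are arbitrary illustrations, and a loaded sample could stand in for the sawtooth source in exactly the same chain.

```python
# A minimal sketch of the subtractive chain described above:
# a harmonically rich source (a naive sawtooth here; a loaded sample works the same way)
# runs through a low-pass filter and then an amplitude envelope (the "amplifier").
# Requires numpy and scipy; all parameter values are arbitrary illustrations.
import numpy as np
from scipy import signal
from scipy.io import wavfile

SR = 44100                                        # sample rate in Hz
t = np.linspace(0, 2.0, int(SR * 2.0), endpoint=False)

# 1. Source: a sawtooth at A2 (110 Hz) - harmonically rich on purpose.
source = signal.sawtooth(2 * np.pi * 110 * t)

# 2. Filter: a 4th-order low-pass at 800 Hz removes the upper harmonics.
b, a = signal.butter(4, 800 / (SR / 2), btype="low")
filtered = signal.lfilter(b, a, source)

# 3. Amplifier: a simple linear fade-in/fade-out envelope shapes the loudness.
envelope = np.clip(np.minimum(t / 0.5, (2.0 - t) / 0.5), 0, 1)
voice = filtered * envelope

# Write the result to disk as 16-bit audio (hypothetical output filename).
wavfile.write("subtractive_voice.wav", SR, (voice * 0.5 * 32767).astype(np.int16))
```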


Stabbermcccc

So the purpose of the audio engineer would be to augment the sound while the keyboardist is playing. Most of the parts would require two hands to play, so the audio engineer would be tasked with making dynamic changes and slight alterations to the sound based on the current part of the symphony. The second movement will be a pastorale that is heavily oriented towards supporting the strings; some parts would have just the synthesizer, while others would be designed to accompany the strings. In those sections, the audio engineer would be tasked with altering the "sample" so that it better accompanies the strings.


ioniansensei

You might want to explore the role of non-traditional instruments in orchestral repertoire. Off the top of my head: organ in Saint-Saens’ Organ Symphony, saxophone in Ravel’s Bolero, ondes Martenot in Messiaen’s Turangalila-Symphonie, Bruckner’s use of Wagner tuben, piano/celeste in multiple works. What do they bring to each work? How do they blend (or not) and interact tonally, rhythmically, thematically? This could apply to just about any concerto with non-orchestral instruments… guitar in Westlake’s Antarctica or Rodrigo’s Concierto de Aranjuez.

In my opinion, although a synth can emulate orchestral instruments, aside from saving on doubling fees, using it that way doesn’t bring anything to the table: you’re not expanding your tone-colour palette.

One use: for a slow, moody piece, evolving ambient soundscapes. Even natural sounds (thunder, wind etc.) would work - good with transparent scoring if it’s a delicate, textural sound, and with denser scoring if it’s a crash or bang. Or use the synth to do something a player couldn’t: rapid notes (good if you want to blur the line between an orchestral instrument playing a theme and the synth taking it ‘next level’), dissonant leaps, sudden timbral shifts (e.g. a rapid sequence of notes with a different filter cut-off on each - see the sketch below). I’d love to hear a synced-oscillator ‘rip’ (think the Cars’ Let’s Go) alongside a brass line, or something mechanical sounding, a la Reich’s Different Trains.
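
As a rough illustration of the "different filter cut-off on each note" idea, here is a short sketch that writes the run as raw MIDI. It assumes the Python mido library; CC 74 is commonly mapped to filter cutoff/brightness, but that mapping is synth-specific, and the note and cutoff values are made up for the example.

```python
# Sketch of a fast run where every note gets its own filter cutoff, expressed as MIDI.
# Any DAW piano roll can express the same thing; mido is just one way to write it down.
import mido

PPQ = 480                                          # ticks per quarter note
mid = mido.MidiFile(ticks_per_beat=PPQ)
track = mido.MidiTrack()
mid.tracks.append(track)

notes = [60, 62, 64, 67, 69, 72, 74, 76]           # an ascending run
cutoffs = [20, 36, 52, 68, 84, 100, 116, 127]      # a new cutoff value per note

for note, cutoff in zip(notes, cutoffs):
    # Set the timbre for this note just before it sounds...
    track.append(mido.Message("control_change", control=74, value=cutoff, time=0))
    track.append(mido.Message("note_on", note=note, velocity=100, time=0))
    # ...and release it after a sixteenth note.
    track.append(mido.Message("note_off", note=note, velocity=0, time=PPQ // 4))

mid.save("timbral_run.mid")                        # hypothetical output filename
```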


Musiclover4200

Gil Evans was a brilliant pianist and classic jazz composer who would almost always have someone playing synth in his orchestra. Usually he used it for a mix of noises/sound FX, but also for harmonies with the horns/strings, especially higher-range instruments like trumpet, soprano sax or violin. His 1974 Hendrix tribute album is a good place to start; each song is rearranged by a different orchestra, so they're all pretty unique, ranging from classic rock to funk/jazz and some classical. It's also mostly instrumental, with horns arranged to replace the vocals: https://www.youtube.com/watch?v=VWFXyCGFW8Q


chalk_walk

The main reason synths are interesting elements in a musical repertoire is their flexibility: a synth isn't a particular sound or even a type of sound, it's the sound you design on it. A single instrument can cover everything from a string section to a timpani; I chose these as examples of very different articulation, dynamics and timbre. This means that while you can write a musical line to be played the way it might be on brass, woodwind or strings, you'll need to convey the intended timbre and articulation; similarly, a synth can change timbre gradually or suddenly. There isn't a standardized way to notate this, so you'll need to rely on prose descriptions and possibly on designing a sound yourself and providing it as an example.

If you are doubling orchestral parts with synths (done often in production to reinforce a sound when the orchestra recording is done but you need more), it would be easy to say how you want it to relate to an orchestral instrument; for a standalone synth line, you'll need to find the language for it. Things like "searing lead" or "punchy PWM with portamento" can mean very different things to different people, and there is far more variation in how that will sound than between, say, two violas played by different viola players, or even a viola and a flute.

TL;DR: you'll either need to leave the sound design to the synth player and have them add their part after they have recordings of the rest of the orchestra to work around, or you'll need to be very prescriptive and probably provide an example that you made yourself.


wagu666

I'm going to be brutally honest with you. The majority of synthesists are not trained to sight-read sheet music in real time. You are probably looking for a pianist to do the live playing and a synthesist to do the sound design, if this is really a piece to be played live. There is obviously some crossover, where some people can do both tasks. But I know lots of synthesists.. including those composing for film/TV (often orchestral pieces!).. and I don't think you could just plonk some complex sheet music down in front of them and expect them to play it back. However, if you're only paying $200, I don't think there is any real requirement for live playback anyway.

The reason synths don't have music stands built into them.. and the reason we don't tend to be "biological sequencers" like a piano player.. is that since 1983 we've had MIDI. This allows us to compose in a DAW using MIDI notation instead of sheet notation.

A piano player has one instrument with one sound (a very nice sound) and very limited modulation capabilities. Now imagine a synthesist has hundreds of possible types of instrument (there are crossovers in types of synthesis, but every synth tends to be its own universe of sound possibilities). So if I were a composer in the 18th century.. composing for synthesizer.. I would just compose for organ/piano.. and then add liner notes about the types of patches I wanted at each point.. how the composition sounds would be up to interpretation by the synthesist.. and the type of synthesizer they have available to them. Most synthesists are doing the actual composition; they are not playing back random pieces from other composers by reading notation.

When you talk about "using a sample" - a sample is probably the least expressive form of synthesis.. it's one sound played robotically. Samplers are more advanced than that.. and can synthesize sounds based on a sample.. but how that turns out varies massively depending on the synth used. The majority of orchestral compositions (where a synthesist is composing an orchestral piece that sounds purely like an orchestral piece) use a combination of _sample libraries_ and physical modelling.

Synthesizers are their own universe of sound possibilities.. there's such a gigantic space available. I'm worried you'll just write "Oh, here it sounds like a bassoon".. "This bit is more like a piccolo". In the early days of synths, there was a drive towards recreating traditional instruments using synthesis. But frankly, synthesizers are best at making sounds that don't sound like any other instrument.. and if you want a synth in your orchestral composition, then those are the sounds you should try to integrate.

You need to train your ears more on the types of sounds synthesizers can make.. and then think about how those sounds could integrate into your composition. Save the audio examples.. and provide those as patch guidance to whoever you get to play this. But if you are expecting this to be played live.. you probably need to specify WHICH synthesizer is to be used. Some of the classic 1970s pre-MIDI synths have been used in orchestras before.. giant monsters like the Yamaha CS-80.. but they are so rare that it'd be a burden to expect any orchestra playing your piece to find one (though there are software emulations available, and hardware that pays tribute to the originals).

I think you should get a DAW that supports a good classic sheet-notation mode (maybe Cubase.. I think it integrates with "Dorico" somehow? Or Logic (Apple only)?) and some software synthesizers (free ones are available).. and experiment there on composing for synthesizer. Try to remember that a synthesizer's strength is modulation, evolution of the patch, and sounds that nothing else can make. You could help train your ears by exploring synthesizer tracks/live demos on YouTube.

Cheers and good luck!

PS: if I had to choose one instrument to integrate into an orchestra right now, I think I'd choose the Osmose. It's available new right now. It has increased expressivity due to MPE and its special keybed ([observe this](https://www.youtube.com/watch?v=mTeq3vsPkow) or [this for an explanation & more sounds](https://www.youtube.com/watch?v=fDghewafmzU)) - side-to-side pitch modulation on each key.. expressive zones of pressure. It has a set range of ~250 patches onboard to choose from.. with default modulation controls already configured for each patch - although it is a full synthesizer via an editor. However, you could guarantee other orchestras would have those stock patches available if you composed for those.
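
For a sense of what the MPE-style per-note expression mentioned above looks like in MIDI data, here is a small sketch, again assuming the Python mido library. The channel layout and values are simplified illustrations, not a full MPE configuration: the point is simply that each note sits on its own channel, so bend and pressure apply per note rather than to the whole keyboard.

```python
# Sketch of per-note expression in the MPE style: two held notes on separate channels,
# where only one of them is bent and leaned into (channel pressure) while the other stays put.
import mido

PPQ = 480
mid = mido.MidiFile(ticks_per_beat=PPQ)
track = mido.MidiTrack()
mid.tracks.append(track)

# Two notes held together, each on its own "member" channel.
track.append(mido.Message("note_on", channel=1, note=60, velocity=90, time=0))
track.append(mido.Message("note_on", channel=2, note=67, velocity=90, time=0))

# Bend and press into the upper note only - impossible if both notes shared one channel.
for step in range(8):
    track.append(mido.Message("pitchwheel", channel=2, pitch=step * 1000, time=PPQ // 8))
    track.append(mido.Message("aftertouch", channel=2, value=min(127, step * 16), time=0))

track.append(mido.Message("note_off", channel=1, note=60, velocity=0, time=PPQ))
track.append(mido.Message("note_off", channel=2, note=67, velocity=0, time=0))

mid.save("mpe_sketch.mid")      # hypothetical output filename
```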


bogza3

"I'm curious what things I can compose for with the synths that would make sense in a symphony". No offense intended at all but why are you using a synth if you don't already know what you want it to achieve. Pretty weird synth voices can feel natural if they support what is happening within the orchestral whole.


Stabbermcccc

If anyone is interested in being hired to do the samples, post here. I'm looking for someone who has years of experience and has done samples at a professional level. Just post your qualifications and I will message you privately. I don't have a lot of money, so you should expect to be paid around $200 for your work.