So how did you end up a sound designer?
When I moved to Berlin after school, I was playing drums in bands and didn’t really know what to do, to be honest. I started studying acoustics at the Technical University in Berlin, and around 1989, when the Wall came down, the town was swamped with ENG teams and TV reporters coming in, so there was a huge market for ENG assistants, sound guys for these small teams. So I got trained on the SQN and the basic booming stuff and started working there as a side job basically, and it was very nice because you earned good money and could carry on playing in bands.
Then I discovered the film school in Potsdam, close to Berlin, which has a dedicated film sound design class where you study for four years, so I did as many films as I could and made the most of the time.
What really got me into film sound was one key night at a local theatre where they showed Alien 1 and 2 and Eraserhead in one night. I came out of the theatre at 5 or 6 in the morning thinking, “What planet is this? What time is it?” Those were the three films where I thought, “There’s something more than dialogue and music in films to tell a story.”
We’ve heard you have a killer studio – can you describe the setup and how you use it?
It’s Pro Tools HD|Native and I mainly work to a big screen and full HD projection via a Blackmagic DeckLink SDI card. My screen speakers are Meyer Sound EXP Acheron Designers, my sub is a JBL 6312SP and my surrounds are JBL 8350s. I find having the “real” theatrical surrounds very helpful to hear that typical real-world surround speaker character.
On my desk you’ll find an Artist Mix, an Artist Control, an RTW Surround meter and of course my iPad Mini with Spanner to make my life easier. My only outboard units are a dbx 120 subharmonic synth and my Symbolic Sound Kyma system which I use occasionally for special effects that are hard or impossible to achieve with plugins.
The great thing about my place is that I’m in an old factory lot in the centre of Berlin and there are three other freelance sound editors in the same building, so it’s very easy to get a helping hand on short notice or to share ideas. This includes a very good dedicated Foley studio run by Hanse Warns, who has recorded Foley on pretty much all my past projects. It has a 600 m² warehouse connected to the recording studio, which means we can take a real car inside for interior Foley. We also recorded interior gunshots for “The International” in there, to get the real echo and reverb that would fit the scene in the Guggenheim Museum. It’s a great playground for sound.
In the basement we recently built a big heated water-tank so our Foley artist Carsten Richter could do all the water foley for our latest project, by doing the swim moves and splashes with his whole body. Water Foley is often not very convincing when done just in a small tub with your hands, so this was a great step forward for our studio and saves a huge amount of time.
I also have a mobile kit that I use when I need to move into picture editorial’s space to be closer to the director. This happened on “Cloud Atlas” for example, and it was crucial for the whole process and led to new ideas even on the picture side. Alex Berner (the picture editor) would even change picture cuts because we would chat in the evenings and find that certain transitions worked better with the sound that I, or the team, had created. Many elements in the film would not have ended up the way they did had I stayed in my studio working in a bubble. All the small “micro spotting sessions” I had with the director as he passed my door were extremely helpful. I know our world relies more and more on remote connections, Skype etc., but often you can get so much more from a director in even the shortest personal chat in your room than in hour-long Skype meetings.
In-the-box premixing – is this a feature of every job these days?
I always edit in a session that has ambience reverb, FX reverb and low-ender, EQ on all tracks and Spanners across the whole session. I make use of EQ and Reverb to shape sounds while I edit, so it’s definitely a cross between mixing and editing. My room is calibrated to theatrical levels and over the years you learn to prep elements that translate well to the theatre. I also need to provide layout stems to picture editorial all the time so for that it’s important that these stems illustrate what I had in mind before it hits the mixing theatre.
All the elements are fully panned, which often makes pre-mixing in the theatre quicker, especially on temp mixes where most of the time goes into balancing dialogue and music. On films like Ron Howard’s “Rush” this was very important because the mixers wanted a hybrid workflow: pre-mixing in the box, then moving over to the console (MPC) for the final, because that was the interface they were more fluent with. There’s simply no way (and no time) to do zip-pans to picture on a traditional console when ten F1 cars fly by in just four seconds. Most of that was done in the editing process and then adjusted in the mix.
Having said that, of course none of what I bring to the theatre is carved in stone, but it’s good to start with your ideas playing back as they should and not throw raw elements at the mixer to do everything from scratch. Today’s schedules simply do not allow for this.
Printed pre-dubs have become quite rare. Even on the last film which was a native Dolby Atmos Mix we didn’t print pre-mixes. Everything was kept virtual till the very end.
Do you always cut to picture or design stuff offline?
I sometimes have the option to create elements before there’s any picture available. On Cloud Atlas, for example, I got a head-start of four weeks before receiving picture for the first time, which I used to create background elements for the different locations based purely on the script. Sounds for vehicles and guns were also created long before the first turnover. It’s actually quite useful to work this way because you don’t tie your imagination to a cut sequence. It leaves room for experimenting and ideas without the need to “fill your timeline” to picture. Of course, many things don’t work once you see the picture later on, but a ton of ideas are born this way.
You’ve worked on independent German films as well as some high profile Hollywood blockbusters. Do you find you need to change your style and workflow to suit different directors and different films?
Sometimes you have a really straightforward dialogue-driven film, and sometimes you have directors who say at the first spotting session “this has to sound like David Lynch” and you think “how would that work in the context of this film?”. Everyone has a different way of using words for sounds, and you have to learn that language anew for every director and every new film. Some are very specific: “I want this, there”. Others just give you “This has to sound like David Lynch” or “This sequence has to sound visceral”.
Some directors want to be surprised in the theatre and really only want very rough placeholders while they cut picture. Others always want the latest and greatest in the Avid so they know where we stand and can give feedback. This is the reason I always try to keep my tracks in the best possible shape – so I can provide a crash-down mix at any time with all the panning and reverbs in place.
A couple of years back, I tried to convince a director to do his film in mono. Just looking at the film, I thought “this needs to be mono”. It felt like a film that wanted to be mono in a way, but he got cold feet, and it was a bit difficult with the producer and the distributors because they thought “everybody’s doing 5.1, we have to do 5.1”.
So what do your sessions look like in terms of plugins?
The FX sessions are pretty straightforward: there’s a Channelstrip and a Spanner on pretty much every channel.
The Spanner rotation control is great for ambiences, to move elements around. That was a really cool addition, especially in combination with the iPad. To be able to do that intuitively with your hands and not with a mouse is great, and multiple channels in one interface is a key thing. I also like to use it on 5.0 reverb out to narrow down reverbs and have them go surround over time in a scene.
For designey stuff, I do most of that in the FX library application. That often sounds weird to people, but it has a huge plug-in rack, so you can experiment while searching through your own or commercial libraries: try things, run them backwards and forwards, pitch them up and down. In the context of finding the right texture for something, that’s much easier and feels much more organic than laying a sound onto a track and setting up a huge mixer with delays, sends, returns and so on. That takes a huge amount of time and blows up your work session tremendously, probably for just one sound you want to generate, and then you don’t need that huge set-up anymore.
Then the sounds are tweaked further in Pro Tools and layered, so Pro Tools for me is about timing things, aligning them to picture, putting levels in and layering, while the creation of the sound is mostly outside the box – or at least, inside another box.
To what extent are directors recutting in the final mix? And do you think they understand the effect on the soundtrack?
Oh, till the last moment. VFX often come in late, but normally there are stages in between, from a Lego brick in frame to the final vehicle or monster. Of course, sometimes tiny picture changes can mean a big change for sound, but I’ve gotten used to it.
I’ve found that it’s mainly the picture editor who is the first one to tell the director the implications of his last-minute changes. If your picture editor is experienced and wise, he will fight for the sound department because he knows what it means for us. Luckily I’ve worked with a lot of great editors who were always great team players and kept up the communication with me. I don’t expect the director to understand all the technical implications of a small change – that’s not his job – but you need people around him who tell him before promises are made that cannot be kept.
And how does Conformalizer fit into that whole thing?
It’s like a third arm. I always say I feel like an amputee when I don’t have it. Take away my arms and I would probably stop right here – without Conformalizer, without exaggerating. Seriously, we’re at the final mix, finishing in two days, and they’re still giving us new reels all the time. At this point in the mix we’re not talking about just one session with everything in it; we’re talking about dozens of editor sessions, multiple recording sessions, the Atmos recording session, music sessions, two different FX stage sessions, the dialogue session and so on. So if they make one cut, we have to go back and conform a dozen sessions, the console and the reverb unit.
Some people still seem to want to conform manually, but that really doesn’t make any sense to me. I’m happy to fix the transitions, but the actual number crunching I’m more than happy to leave to a machine, because that’s what machines are good at. A computer is bad at fixing transitions because it doesn’t know how things need to sound, but the actual cutting I’m happy to leave to it. It saves a huge amount of time and it’s so simple.
What I like about Justin’s stuff is that you really sense it’s an actual editor, someone doing daily work on films, who designs these things. You really feel the intelligence behind it and the reasons he does things, keeping the interface really simple while making it really powerful at the same time.