Vienna Symphonic Library Forum
Forum Statistics

180,745 users have contributed to 42,140 threads and 254,362 posts.

In the past 24 hours, we have 3 new thread(s), 18 new post(s) and 51 new user(s).

  • Synchron and stage depth

    Hi,

    Something that I'm trying to learn is how to get more stage depth with Synchron libraries.

    With traditional libraries you usually arrange three or four layers of reverb, each one placing your instruments at a particular distance on the virtual stage. Strings, Woodwinds, Brass, Percussion, each one sitting from front to back of the stage.
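
    The layered approach described above can be put in rough numbers. A minimal sketch, assuming hypothetical send levels and predelays (the dB and ms figures are illustrative starting points to experiment from, not values from any preset):

```python
# Sketch: four depth layers, each with its own reverb send level and predelay.
# All numbers are illustrative placeholders, not calibrated values.
layers = {
    "strings":    {"send_db": -18.0, "predelay_ms": 25.0},  # front of stage
    "woodwinds":  {"send_db": -15.0, "predelay_ms": 20.0},
    "brass":      {"send_db": -12.0, "predelay_ms": 15.0},
    "percussion": {"send_db":  -9.0, "predelay_ms": 10.0},  # back of stage
}

def send_gain(send_db: float) -> float:
    """Convert a dB send level to a linear gain factor."""
    return 10.0 ** (send_db / 20.0)

for name, cfg in layers.items():
    print(f"{name:10s} reverb send gain = {send_gain(cfg['send_db']):.3f}")
```

    The pattern is the usual one: sections further back get a hotter reverb send and a shorter predelay.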

    With MIR it's just a matter of moving icons in the desired space. Possibly, I'm starting to think, it's also a matter of not overdoing the Wet signal and the additional smoothing reverb, or part of the positioning information would get lost.

    But how can one do this with Synchron libraries? Microphone choice gives a clear placement on the L-R axis, and on the High-Low one. But what about the Front-Rear axis? How can it be controlled? Is it just a matter of making the distance information contained in the room mics (in particular the Tree) more evident? Could the Close and Distant etc. Mix Presets help, perhaps mixed at the same time for different sections?

    Paolo


  • I wonder if volume alone can be an effective tool to make an instrument sound farther away, without touching any EQ or using a different original Mix Preset.
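
    As a first approximation, the free-field inverse-distance law says volume alone does encode distance: minus 6 dB per doubling. A quick sketch (real halls are not free fields, so treat the numbers as a starting point only):

```python
import math

def distance_gain_db(d_ref: float, d_new: float) -> float:
    """Level change in dB when a source moves from d_ref to d_new metres,
    using the free-field 1/r law (-6 dB per doubling of distance)."""
    return 20.0 * math.log10(d_ref / d_new)

# Pushing a section back from 5 m to 10 m:
print(round(distance_gain_db(5.0, 10.0), 2))  # -6.02
```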

    Paolo


    Still exploring this issue. Considering that the libraries have been recorded at their final positions, I guess the Tree mics shouldn't need different levels. Or maybe I could touch them, for example by leaving the front line untouched and lowering the rear row, if I want an increased distance between them.

    The Close and Mid microphones could be the key to making an instrument sound nearer or farther, while preserving the natural size of the room.

    No idea about the Ambient mics. Maybe they should be treated like the Tree, with their level raised to increase the impression of distance.
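
    One way to keep track of such experiments is as simple mic-mix recipes. The dB offsets below are purely hypothetical starting points following the logic above (lower Close/Mid and raise Tree/Ambient to push back), not values from any VSL preset:

```python
# Hypothetical mic-mix offsets (dB from a neutral mix) for moving an
# instrument nearer or farther while keeping the room's natural size.
recipes = {
    "nearer":  {"close": +3.0, "mid": +1.0, "tree": -2.0, "ambient": -3.0},
    "neutral": {"close":  0.0, "mid":  0.0, "tree":  0.0, "ambient":  0.0},
    "farther": {"close": -6.0, "mid": -2.0, "tree": +1.0, "ambient": +2.0},
}

for name, mix in recipes.items():
    print(name, mix)
```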

    Paolo


    It seems to me that realistic-sounding depth can be a very tricky thing to pull off convincingly, Paolo. Of course a great deal depends on the listening position you set up. If you're at the back of a large auditorium, you won't get much sense of front-back distance differences between players on stage; if you're on the conductor's podium, you'll sense many different distances between you and the players. That much is obvious, but very little else about the depth dimension of the sound field is obvious or intuitive.

    Traditionally, the typical advice on mixing for depth is to use wet-dry differences in reverb; wetter for further back. And now that we have access to a variety of binaural HRTF panners, it's also possible to represent the total angle a large section of players presents to your ears, i.e. the angle between the left extremity and right extremity of the section; this total angle of course being narrower the further you are from the section of players.
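
    That narrowing of the section's total angle is simple geometry. A sketch, assuming the listener sits straight in front of the section's centre:

```python
import math

def subtended_angle_deg(section_width_m: float, distance_m: float) -> float:
    """Total left-to-right angle (in degrees) that a section of players
    presents to a listener centred in front of it."""
    return 2.0 * math.degrees(math.atan((section_width_m / 2.0) / distance_m))

# A 10 m wide string section heard from 5 m, then from 20 m away:
print(round(subtended_angle_deg(10.0, 5.0), 1))   # 90.0
print(round(subtended_angle_deg(10.0, 20.0), 1))  # 28.1
```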

    For Synchron libraries, if you want positions and distances to be substantially different from those captured in the recording, it can get horribly complicated to work out what the various room mics are telling your ears and why, unless one uses only the closest mics for azimuth angle and then artificial reverbs to simulate the stage zone and auditorium. And frankly, since I can't afford to hire an acoustics expert to teach me what's going on in an original Synchron full mix and how best to alter it, I tend to switch off the room mics when I want to change recorded positions and distances. But Synchron libraries don't insist that we use the Synchron Stage ambience mics; these very lovely and carefully constructed libraries have been recorded with a great deal of flexibility still available to the user.

    I've been using only the Decca Tree Main mics, each mono side with its own HRTF panner and reverb to give me the azimuth angles, spread and ambience I want on large string sections. And this doesn't turn out too badly; perhaps because the Decca mics' room tail is faded out fairly quickly during sample editing, or in any case because Decca tails tend to 'open out' very nicely (probably because of the trapezoidal plan of Synchron Stage A), such that there are few if any noticeable contradictions between what the Decca mics are telling my ears and what my artificial reverbs are telling my ears.

    For single instruments it's more a case of using only the closest mics, summed to mono, with HRTF panning and artificial reverb wet/dry ratio for placement, and of course, as you've noted, volume, to finally nail the depth dimension I'm looking for. Thus far I've encountered no significant problems doing it this way.
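
    That chain can be sketched in a few lines. Constant-power panning stands in for the HRTF panner (which this toy code does not model), and a simple linear wet/dry crossfade stands in for the reverb routing:

```python
import math

def place(sample_l: float, sample_r: float, pan: float, wet: float):
    """Sum a stereo close-mic pair to mono, constant-power pan it, and
    split it into dry L/R levels plus a reverb send.
    pan is -1 (hard left) to +1 (hard right); wet is 0 to 1."""
    mono = 0.5 * (sample_l + sample_r)       # close mics summed to mono
    theta = (pan + 1.0) * math.pi / 4.0      # map pan to 0..pi/2
    dry_l = mono * math.cos(theta) * (1.0 - wet)
    dry_r = mono * math.sin(theta) * (1.0 - wet)
    send = mono * wet                        # feed to the artificial reverb
    return dry_l, dry_r, send

# Centre pan, 30 % wet: equal dry levels left and right, plus a send.
dl, dr, s = place(1.0, 1.0, 0.0, 0.3)
print(round(dl, 3), round(dr, 3), round(s, 3))  # 0.495 0.495 0.3
```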

    Do please keep us updated as you progress with your experiments. I for one am certainly keen to try any different approaches that show promise and don't demand that I have to think hard about acoustics when there's music to be made, Lol.


  • last edited

    @PaoloT said:

    But how can one do this with Synchron libraries? Microphone choice gives a clear placement on the L-R axis, and on the High-Low one. But what about the Front-Rear axis? How can it be controlled? Is it just a matter of making the distance information contained in the room mics (in particular the Tree) more evident? Could the Close and Distant etc. Mix Presets help, perhaps mixed at the same time for different sections?

    paging @Dietz for ideas on this...

    I don't have an absolute answer about this, but can only offer my own thoughts FWIW. I think this is a difficult task with Synchron libraries. It's a clear argument for favoring VI and Synchronized libs if you intend to do non-standard things like that.

    The whole point of Synchron is to use the room they were recorded in.  You can use the close mics and affect things more, but then you're losing the room.  If you bring in the room there will be lots of ER and distance information coming from those mics.

    In addition to what you and Macker have said about things like width and volume, I would add that the proximity effect can add a feeling of something being closer or further away. In MirPro that is the "distance" parameter. There are other freeware FX you could mess around with, to see if they will basically EQ the sound a certain way to make it sound closer (warmer) or further away. I don't have much experience with them, so I'm just throwing them out there for you to try out and let us know.

    https://www.tokyodawn.net/proximity/

    https://www.auburnsounds.com/products/Panagement.html

    Try subtle settings. The nice thing about these is that you can more directly affect the proximity aspect without messing with ERs and width.

     MirPro of course is also worth a try with very subtle settings.

    And there is always just using an EQ to make a sound seem further away.
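
    To make that concrete: the usual EQ move for distance is a gentle high-shelf cut, since high frequencies fade faster over distance than lows. A sketch, where the corner frequency and the dB-per-10 m rate are guesses to experiment from, not calibrated values:

```python
def shelf_cut_db(extra_distance_m: float, db_per_10m: float = -1.5) -> float:
    """Suggested high-shelf gain (dB), applied above roughly 6 kHz, for a
    source pushed back by extra_distance_m. The rate is an assumption."""
    return db_per_10m * (extra_distance_m / 10.0)

for d in (5.0, 10.0, 20.0):
    print(f"push back {d:>4.0f} m -> shelf gain {shelf_cut_db(d):+.2f} dB above ~6 kHz")
```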

    I would not mess with ERs at all to achieve anything related to changing stage depth, unless you use only the close mics in Synchron. Ironically, though, ERs probably have the biggest impact on how far back something sounds. So, in a way, with Synchron you're probably barking up the wrong tree messing with them... but just my thoughts, I have no experience trying this.

    I would be very interested to hear what Dietz would have to say about any of this.


  • dewdman, you speak of these technical matters authoritatively, as if from your own knowledge, understanding and experience. I'm all for learning from someone else's understanding and experience, but you only hint and imply that you have these things, without actually imparting any of it. I'm sure I'm not alone in wanting to benefit from some actual elucidation from you.

    You mention the "proximity effect." Please explain how this is relevant to Synchron libraries.

    You identify two plugins (both of which I have, and have trialed fairly extensively). You say you "don't have much experience with them", but also say, as if you do have understanding and experience of them: "Try subtle settings. Nice thing about these is that you can more directly affect the proximity aspect without messing with ERs and width". Does that mean, for example, don't use the "Proximity" plugin's main fader to render a calibrated difference of distance, but use it subtly instead? And do you mean that Proximity's "Width" parameter should be disengaged, and why? Please elucidate by giving us the benefit of your own understanding of these two plugins.

    You seem to be concerned about ER. Please explain the role of ER in Synchron libraries and why it's of concern in this context.

    You say, "And there is always just using EQ to make things seem further away." What principle does this involve, and how does that principle translate into making actual EQ settings to affect distance? Please elucidate.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • last edited

    @Dewdman42 said:

    [...] I would be very interested to hear what Dietz would have to say about any of this.

    This is my personal opinion as a sound engineer and music producer, not an "official" VSL statement: Synchron Instruments were made to be used in situ, in the specific positions in that very specific hall where they were recorded. For free placement on a virtual stage use Vienna Instruments.


    /Dietz - Vienna Symphonic Library
  • Still keenly anticipating your own elucidations of your remarks, dewdman.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • last edited

    @Helmholtz said:

    Still keenly anticipating your own elucidations of your remarks, dewdman.
    I have already stated that I have no experience trying any of these things with Synchron, and I am certainly not an "authority" on it. I was just throwing out some ideas to try along with the rest of you. I tend to agree with Dietz here about how Synchron libraries should be used on their own, without messing with their placement. I have always felt that way about Synchron, which is why I am fully invested in VI and Synchronized instruments and am still just experimenting with a few Synchron libraries.

    I do feel there is some magic captured in some of the Synchron instruments, such as Synchron Brass, which I don't have yet, and Elite Strings, which I do have but have not had a chance to do much with. My eventual goal is to have the entire Synchron series and use it together as a cohesive sound, as an alternative to the VI sound if and when Synchron hall is what I'm after. But I still have years of playing around with the VI series ahead of me, and up until now I have been focusing on using it together with MirPro to achieve many wonderful and flexible results. That is why I chose a couple of years ago to go all-in on the VI series and MirPro.

    The Synchron series is a different approach, and in some ways an easier and more immediate sonic result, with less mixing fussing around in MirPro, etc., and eventually I am sure I will utilize it on its own that way too. But I feel that mixing the Synchron series with MirPro is not really the way to go, other than perhaps using MirPro and Synchron hall to blend VI series instruments in with the Synchron series as is. That is a future project for me, as I am not fully invested in the Synchron series yet. I only meant to throw out experimental suggestions regarding distance, since Paolo asked and seems interested in experimenting.

  • dewdman, I notice you still haven't elucidated on any of your advice which appears to have been aimed at helping Paolo with his specific objectives as stated in his thread here. So do you stand by your advice or withdraw it? If the former, that would leave a bit of a mess to be cleared up, sorting the wheat from the chaff as well as the outright nonsense; but someone else will have to take care of that, right?

    I notice also you've gone way off topic.

    I do hope, for Paolo's sake and for others who might be interested in what Paolo is trying to achieve here, that this thread can now proceed without being sidetracked any further.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • I’m not sure what you’re referring to now. I can’t think of anything further to add.

  • dewdman, your answer means the advice you gave to Paolo no longer stands. Good. Now perhaps this thread might proceed without you sidetracking it any further.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • last edited

    In general, if you go back and change one of your previous posts to contain new information, I will probably not go back to read it. It's one thing to update a previous post to fix grammar or spelling mistakes, but if you change what you said, or add to it, it's already history, and I do not make a habit of re-reading threads over and over to find whatever new questions you have placed in an old post.

    I see now that you have added some specific questions to an older post...so I will try to address some of them.

    There is no effort on my part to sidetrack or derail this thread at all, nor have I said anything off topic. Your questions seem to have a combative tone towards me, so I may pick and choose what I respond to in that case.

    @Another User said:

    You say, "And there is always just using EQ to make things seem further away." What principle does this involve, and how does that principle translate into making actual EQ settings to affect distance? Please elucidate.

    Frequency content changes over distance differently across the spectrum. That is what I meant by "proximity effect" earlier. By the same token, you can use EQ to change the spectrum, which can have an impact on perceived distance. Here is a video on YouTube I quickly googled for; you can find many more on this topic.




  • Props to Helmholtz.

    But oh dear, it seems we have some cleaning up to do now. 


  • No worries Macker. I've got this - but of course do chime in if something takes your fancy. I slipped up once, and we have a googled answer as a result. But as it turns out, it helps with the job at hand, Lol.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • dewdman, oh hey I was beginning to miss your speciality. But now we have a lovely bit of projection here from the one whose posts change with the wind - oh yes I've seen you do it to cover your backside time and time again. Give me one example of me editing one of my posts that constitutes downright dishonesty and cheating, and I will apologise. That's your trick, sunshine, not mine.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • Right oh, some more engineering 101, to counter the harm of dewdman's fake news.

    Proximity effect. You really should have googled that one, dewdman. It refers to a property of most microphones whereby very close sources tend to elicit an enhanced response at lower frequencies, compared to the mic's frequency response to the same source at much greater distances. It could perhaps be more properly called the "close proximity" effect, since it becomes noticeable within about a metre or so between source and mic.

    I'm willing to be wrong but I very much doubt if any of VSL's sample recordings have ever deliberately invoked the proximity effect. Otherwise mixing with VSL sample libraries would include the ever-present chore of EQing out the proximity effect on instruments that are to be placed farther back than right in your face!

    The "Proximity" plugin that I assume you found in your random googling (as if Paolo can't google for himself) has a facility for EQing in or out the actual proximity effect of a typical microphone, but this of course has nothing to do with making a distinction between 10 metres and 15 metres, for example.

    And you say this is the same as the other EQ-distance effect that I carefully questioned, i.e. the HF absorption of air (now you can google ISO 9613-1), which plays little if any significant part in distinguishing differences between player positions on stage.
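
    To put a rough number on that: taking an approximate air-absorption coefficient of 0.1 dB per metre at 10 kHz (an assumed, illustrative value; the exact figure depends on temperature, humidity and pressure, and ISO 9613-1 gives the proper calculation), the extra HF loss between 10 m and 15 m comes out at only about half a decibel:

```python
# Rough comparison of high-frequency air absorption at two stage distances.
# The 0.1 dB/m coefficient at 10 kHz is an approximate value assumed for
# illustration; ISO 9613-1 specifies the exact calculation.
ALPHA_10KHZ_DB_PER_M = 0.1

def air_loss_db(distance_m: float, alpha: float = ALPHA_10KHZ_DB_PER_M) -> float:
    """HF attenuation in dB over distance_m metres at the assumed coefficient."""
    return alpha * distance_m

extra = air_loss_db(15.0) - air_loss_db(10.0)
print(round(extra, 2))  # 0.5
```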

    Tut tut. A big bit of noob confusion there by you, methinks. And yet you still make it sound as if you know what you're talking about, and that can cause untold mischief and confusion amongst non-technical music makers who all too often just accept what they're told to do about the technical stuff. They need to be protected from fake news merchants out just to make a name for themselves by any means, fair or foul.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • ER content. You haven't explained how this "places the instruments on stage". Let's hear your understanding of this first.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • And no, dewdman, I've learnt the hard way. I'm not trusting you by giving you info up front. You ask to borrow my watch, then turn round and tell me the time. It's an old management consultancy trick from the days when being a management consultant was, for some, a licence to print money and win fame and fortune. And these 101s do come at a big price, dewdman, as you will discover.


    "By all means use some common sense but don't let it enslave you." ~ Dobi (60kg Cane da pastore Maremmano-Abruzzese)
  • Just for good measure, without meaning anyone personally: Can everyone please stick to a kind, benevolent and light tone here? Thank you.


    /Dietz - Vienna Symphonic Library