The vast majority of interfaces aren’t going to get below 3ms at any buffer setting, on any computer, because of their drivers and the limitations inherent in USB. A faster computer may allow you to run more plugins at the lowest buffer before glitching, but it won’t reduce the latency itself.
@@modelcitizen1977 Well, those numbers don't always mean the same physical latency, but in my case, on all my setups, 5-10ms wasn't OK even for very slow music.
In the end, latency is dictated more than anything else by the drivers for your audio interface. An interface with lousy drivers is going to put up poor RTL numbers even if it’s hooked up to the fastest computer on earth.
I played the pipe organ in Fulton MO in middle school, and I don't remember any latency even though some of the biggest pipes were ~100' away. I was young though, maybe it was horrid, idk.
Almost a great video, but then you made the sample rate mistake haha: a higher sample rate does not really = lower latency, since a 128-sample buffer @ 96 kHz should bring the same performance as 64 samples @ 48 kHz. 48,000 samples = 1000 ms, so 1 sample = 1000/48,000 ms, and 1000/48,000 × 64 ≈ 1.333 ms; 96,000 samples = 1000 ms, so 1 sample = 1000/96,000 ms, and 1000/96,000 × 128 ≈ 1.333 ms. Fascinating, isn't it?
I would love to hear some thoughts on the new Mac M2 mini computers… the reason I say this is that I'm hearing a lot of people say these machines have pretty much solved latency problems. I have been looking to buy another machine in the near future, but I REALLY wish I could find something more in depth on this.
Interesting! I'm on an M2 Mac. Still, I prefer my UA Apollo's direct monitoring to take care of my signal. When I play through a plugin running natively in Logic on my Mac, I just get this furry feeling when I touch the strings, which I don't like.
When I hear of someone claiming to detect 3 ms of latency (delay), I automatically suspect that this is the common scenario of the manufacturer exaggerating the interface specs. I've owned interfaces claiming 3 ms latency when in fact it was over 6 ms at the same sampling rate.
Seems in the discussion of latency the focus is almost always on latency while recording audio. I hope someone will correct me if I'm wrong, but for musicians recording software instruments the problem of latency is still very real and obviously does not benefit from a direct monitoring capability. The problem, of course, is that the sound from MIDI-driven VSTs isn't even created until the MIDI input gets to the VST, and the VST, using computer resources, needs to essentially create the waveform. Studio One allows you to set buffers differently for playback of recorded tracks, which at least in theory helps reduce the load on the CPU and allows smaller buffers to be used on your actively recording tracks. Not sure if any other DAWs do this or how much difference it makes.
The upside with software instruments is that there’s no input buffer since you’re not using the interface’s A/D, only D/A. So the real world numbers are always faster than the reported RTL, because no round trip has been made and MIDI input is always instantaneous and unaffected by audio buffers.
Help, I've bought a Focusrite 2i4. It plays great using MixPad, but I record all my backing at once (reggae backing from my keyboard), and it's studio quality. When I try to add the rhythm guitar, which as you know is like sharp percussion, to another track while listening to the backing, it's way lagged behind. Too noticeable to use!!! And it says five milliseconds!! With reggae it's a timing nightmare. I've tried buffer sizes, upped the sample rate to over 88k (which is actually the best), and set the buffer at 16, which my computer handled great. What else can I do? Otherwise the expensive Focusrite and software are a waste of money, and I'm no better off than using the audio in and out on the computer itself. Any ideas please, thank you, kaiwen
IMO companies should stop trying to cut corners. The truth is, when recording, the more analog you go the better. I used to have all kinds of problems tracking at home. Now I track using a console, outboard EQ and compressor. No latency. Use the computer as a tape machine. Interface makers should start selling a 2- or 4-channel strip with an all-analog preamp, like a Neve, plus a Pultec EQ and a couple of compressors like an 1176 and an LA-2A, all analog, so we record on the way in the right way! Why try to reinvent the wheel? Make a small console that also has all the digital advantages of technology.
Audio interfaces have evolved immensely over the last few years, raising the question, “does latency still matter?” You’ve heard our take; what’s yours? Let us know in the comments, and watch more Studio & Recording Lessons here 👉 th-cam.com/play/PLlczpwSXEOybJLExI9WQwRA7ARVxPCPEG.html
How many ms should I have to not notice latency?
Direct monitoring is great, but what if I need to overdub over some recorded stuff that's going to the headphones with some latency? Theoretically, the new recording will be offset from the original material, which was played back with latency. Right? Thanks for the great video anyway!
The simple answer is yes. It always matters. But in most situations 6ms actual RTL is sufficient. If you monitor using plugins that have latencies, then the total RTL will have to be taken into account. That is when extremely fast audio interfaces such as the Presonus Quantum 2626 (the fastest audio interface in history) come in handy. I can record at 1.6 ms RTL without any artifacts.
The difference between 1.6 ms RTL and 3.5 ms is imperceptible and workable in the vast majority of situations (provided that there is no added latency on top of the system latency). However, for very picky singers this difference is enough to throw them off, in the sense that the monitoring of their own voices appears unnaturally far away in their headphones compared to a live situation without any monitoring. 1.6ms RTL is fast enough to satisfy these types of picky singers.
@@thatchinaboi1 I'm just jumping into this a year later, having just recorded six songs at home. Style is singer-songwriter, with a drum emulation, rhythm guitar track, lead, and then usually two background vocal tracks and sometimes a lead guitar track. The songs are very vocals-focused. What I haven't heard much about is the effects, if any, of buffer size and sample rate on the quality of the recording. After three songs I bought a Presonus Quantum 2626 and a Mac Studio. Depending on settings, I can still have latency recording a vocal track, for example. I've seen and read about reducing latency by changing settings, but not much on what usual settings are and whether there's an impact on sound quality. Is there a difference in fidelity between 64 and 128 or 256? These are things that still seem to elude me. Would love a book addressing the subject so I can refer back to it.
@@gbarge4 Your system should be able to handle a 64-sample buffer or lower depending on the recording sample rate. Doubling the sample rate will reduce the latency by around half.
If you are having trouble achieving extremely low latency without any artifacts then I suggest you bounce down all the tracks into audio before recording the vocals.
In terms of sound quality, as long as there are no audible artifacts in the monitoring, you should be fine and can go as low as your system can handle. Regardless of your sample buffer and sample rate, the final audio output should be artifact free and bit perfect.
This is why I will be a Sweetwater customer for life. They try to provide us professional musicians with the help we actually need to be successful rather than try to sell us products like a lot of other retailers.
Totally on point. I receive better service all around from Sweetwater than all other audio gear companies combined. I’ll be a lifelong customer because of that… plus they send you candy.
@@Subtronik The candy is awesome! And I'm not even sure why 🤔 But I don't care why, it's candy 😋
Exactly :) Hats off
For me, a "digital" music composer who works in the box most of the time, latency is a big issue. Not only does it have to be manageable (12ms RTL max), but the number of tracks and virtual instruments loaded in my projects is massive. And, by definition, I have to monitor through plugins, as they're the instruments I'm working with. I really have to squeeze a lot from my DAW. No Direct Monitoring would help me.
So I found out throughout the years who REALLY makes great interfaces and great drivers.
After multiple FireWire and USB interfaces, I settled on an RME PCIe card - plugged directly into the motherboard, no intermediate protocol like USB or Thunderbolt - it goes straight into the bowels of the computer.
Insanely small latencies. Drivers stable as f***. Maxed-out projects with a 128 buffer size, giving me ~8ms RTL - no other solution could give me such results. And they support their products for decades - they still release drivers for HDSP PCI cards from like 15 years ago. Insane dedication to their customers and products.
I will (most probably) never ditch RME.
Couldn't agree more. Running an HDSP 9632 all the way from Win XP, through Vista, 7, 8, 8.1 and now 10, and it still works flawlessly. And to underline their seriousness, new drivers for that card came out in late 2021! Will always recommend RME, no matter what the purpose is.
@@svendtveskg5719 Yup, my previous older PC, which I keep as a backup machine, has an RME HDSP 9632 in it - indeed, it works flawlessly, similarly to the newer version in terms of latency. If it wasn't for motherboard makers ditching the legacy PCI format, I would still use this card in my new machine and be super happy about it.
BTW - if you're on Windows, new drivers for 9632 came out a month ago. Gotta love RME :)
@@sejtaridiss same
So what would you consider today for PC?
@@dpixvid Depends on what kind of music you want to make. Tell me, I'll try to help ;)
I really appreciate the information about analog “in the air” latency. It’s easy to forget that latency isn’t just a digital issue, and it’s nice to hear some numbers for it.
While this is technically true, you have probably noticed that playing guitar through headphones with a 12ms delay line is a lot more frustrating than playing through a tube amp 12 feet away from you. In the latter case all the psychoacoustic signals are still in place to give your auditory cortex a “picture” of the room, from which it determines that there was not actually any latency at the source despite the time it takes the sound to travel through the air. So in a real-world application, moving a sound source further away isn’t really anything like the effect produced by buffer-induced latency.
@@daddyzhoam Snake oil. Delay is delay. There's no magical delay lol.
As a performing live streamer, latency is critical. I need to hear what my audience hears, because I'm mixing as I perform, and that's only possible when the round trip latency is under 10ms.
Quick on the whip
What you put in is what comes out, so put in good sounds and you have nothing to worry about. It's not streaming lag they're talking about here; it's the latency internal to the audio interface.
@@BurninSven1 You may have misunderstood him man. He was talking about his round trip latency. He’s monitoring through his interface outputs to hear his performance from his DAW that’s also sent out to a livestream. RTL does matter with that kind of setup
Yeah, not sure what the video is trying to answer. Of course latency matters in the case of monitoring or doing manual punch in/outs, etc. Also, as a drummer, 3ms is max round-trip for most. As a guitarist, 6ms. Not sure how you're able to handle 10ms.
What's the best interface for really low latency but without breaking the bank? I hear the RTL on the Focusrite Scarletts is well above 10ms.
"A higher sample rate gives you lower latency." I think this needs to be qualified. If you fix the buffer size in terms of number of samples, this is true-as long as your computer can keep up with the faster flow of information required by the higher rate. But the higher rate makes it more likely that you'll need to increase the buffer size, esp. if you are doing any nontrivial signal processing (or recording multiple signals simultaneously). Basically, with a higher rate, you're asking the computer to do more work per unit time, and this has costs that can end up actually penalizing latency. -Tom
Exactly. If your computer can keep up with 88.2, it can also keep up with half the buffer size at 44.1.
It's a silly claim.
Tom, is there a reference text with a good explanation of these tradeoffs? For example, Mike Senior's recording and mixing books. Maybe something is right under my nose, but it seems nuanced explanations aren't there. One subject I don't yet understand is sound quality impacts, if any. This was all made more complicated while listening to a Bill Schee interview a few months ago, where he very fleetingly mentioned getting the fidelity he needs at certain sample rates. Or did I dream that?
Well, think of it like this: the buffer gives the computer more time to process the audio signal. If the buffer size is 64, the latency will be twice that of 32, but using a 32-sample buffer can introduce crackles and distortion in the signal and lead to instability.
Increasing the sample rate means getting more sample points from the input signal for a particular time interval (1 second). The more sample points you have, the more detailed the input signal will be. But since there are more sample points, there is more data, which increases the load on the CPU, RAM and maybe the SSD. Therefore, if you have a high-performance CPU, you can set the sample rate to 192 kHz and the buffer to 64 to get very low latency. Most of the time, 96 kHz with a 64-sample buffer is the sweet spot.
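To put rough numbers on that, here is a minimal sketch of the buffer-only math being described (a simplified model: it ignores converter and driver overhead, which real interfaces always add on top, and the rates/buffer sizes are just example values):

# One-way buffer latency is simply buffer_size / sample_rate.
# Real-world round-trip figures will be higher because of A/D-D/A
# converters and driver overhead, which this sketch ignores.

def buffer_latency_ms(buffer_size: int, sample_rate: int) -> float:
    """Latency of a single buffer, in milliseconds."""
    return buffer_size / sample_rate * 1000

for sample_rate in (44100, 48000, 96000, 192000):
    for buffer_size in (32, 64, 128):
        ms = buffer_latency_ms(buffer_size, sample_rate)
        print(f"{buffer_size:>4} samples @ {sample_rate:>6} Hz -> {ms:5.2f} ms")

# At a fixed buffer size, doubling the sample rate halves the buffer
# latency, but the CPU also gets half as much time per buffer, which
# is the trade-off Tom describes above.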
I’m an eDrummer. Latency absolutely matters to me.
Which interface do you prefer?
The latency that I deal with the most occurs when sending midi from a DAW to a hardware synth/drum machine. The recorded audio signal from my hardware can be as much as 40ms late, and I think it has to do with midi rather than audio latency.
At roughly 8:40 mark Mitch says: "there may be a proprietary driver which lowers latency....and you might want to investigate that." Apart from buying the interface, maybe even multiple interfaces, and doing extensive testing in your environment, how the heck do you investigate latency of a "proprietary driver?"
Thanks for the great video! Logic has a great button for this that disables everything causing latency! If it's hiding the Control Bar can be customized to make it visible. The button looks like a tiny clock and is called Low Latency Monitoring Mode.
When recording electronic drums through a VST plug-in into a DAW, latency is the biggest hurdle I've encountered.
Let me save you some headaches.
It’s not possible.
Record audio and midi using onboard VDrum sounds and apply the midi to your vst after the recorded take. Make sure your midi take lines up with the audio.
I've started recording my E drums with Ableton to get a roundtrip latency of under 5ms. I use an EDrumin module with SD 3, and it seems to work great.
Another option for SD3 is to record the MIDI in the tracker in standalone mode and import the MIDI into your DAW after.
@@PuRe_AdDicT I do it all the time, without any issues. How is it not possible?
My Presonus Quantum running Superior Drummer at 0.9ms latency says you’re wrong
thank you sweetwater! this is awesome! can you please do a video on audio interfaces for recording drum kits of 8-12 inputs or larger? I find all of these great videos and info on like 2-4 input audio interfaces from low budget to high end, but would be great if you could take a look at chaining them together and the latency and different options
My god... the best guide on the topic I have ever heard. Thanks so much for this.
Avid HD Native PCIe on a Sweetwater creation station 450v5 running 4 Digidesign 192's 16x8x8. Works great even when tracking on all 64 inputs! Sweetwater 4 life!
Great substantive video ... as always, "Master Gallagher" presents most valuable materials ... thank you!!
RME..Totalmix…the best. Period. As far as latency goes. Drivers Rock Solid!
Agreed. I am not even using TB with my.. Just the USB on the RME UFX+ on Mac mini. Great. Sold off my Apollo gear.
One of, if not the best overview of latency in recording. If graphics were added, for example, of the 'round trip' of the signal, even newbies might understand. Great that you reminded us of acoustic latency as well! Bravo!
This is one of the best sweetwater videos to date
As a software piano player live and in the studio (quite often there is no real piano, especially on the road), I'm still in trouble with sloppy transport of MIDI information from keyboard actions to the plugin. This is a big problem if you love to play with rhythmic precision. So for me every single millisecond is important in the chain of digital instances. ROLAND is far better than everything else, but there is still a lot of room for improvement. Unfortunately, I own a grand piano, two Fender Rhodes, a Wurlitzer A-200... these guys tell us where it should be. It is still NOT there, by far, even in 2023. I would love it if companies improved the gear they put on the market not by accident but with knowledge and consciousness.
Don't forget your fancy cue system, like a Hear Back or Furman, also has A/D and D/A conversion happening, piling onto your RTL for the artist.
I got so much good information from this video. Must watch again. Great video, thanks 🤘
Apollo Solo for vocals. You can monitor your voice with compression and reverb, but print the dry signal, and latency is lower than you can get with most computer setups.
Very helpful. I did not know about the sample rate difference! It was interesting there was no mention of interfaces that have been around a long time, like UA. In addition to my normal Focusrite Scarlett 18i20, I have a UA Satellite and an Apollo Duo for their preamps and for comfort reverb (bought specifically for tracking vocals), which have DSP FX integrated. I noticed somewhat recently, however, that even UA seems to be slowly moving away from those as well and offering regular software plugins. The UA hardware plug-ins are very expensive if you can't take advantage of the bundle deals! But still cheaper than the equivalent outboard gear.
While I've had ups and downs with Sweetwater, this is a superb video.
Well explained... Thank you!
Defined like a Pro Mitch!!!!
This is why I still use multitrack recorders like the Roland VS-2480, so I won't have to flip out when Pro Tools decides to give me problems recording only two tracks! Yeah, it happened, and this is even after power-testing it to see if it could handle many tracks, even at 88.2kHz!
Wish my LCD didn't die on my VS-2480.. Can't use it as it's completely faded out... :(
Great info! Latency really has gotten so much better. Studio One also has a low latency monitoring option in the DAW to really reduce latency. It basically splits out the work to the CPU differently so it handles requests more efficiently. It's great for tracking guitars through amps sims, etc.
Would it help those of us working on older setups? Running a 2012 MacBook Pro, i5 processor, 16GB, 256GB SSD, only working on vocals.
You've given me some things to try, as I suffer perceived high latency in the MIDI that passes through a Behringer audio interface from an Arturia KeyLab Essential 88. I've been suspecting the latency is associated with the USB port on my Windows 10 PC, but that's just a hunch. I see in benchmarks of some USB-attached storage devices that latency can vary from one USB port to another (where perhaps the two ports are tied to different root hubs at the driver level).
The MIDI latency is negligible (around 1 ms) unless you are daisy-chaining many MIDI devices together (the delay accumulates). The latency you are hearing is the delay in your computer as it synthesizes the audio samples via whatever virtual instruments you are using and buffers the samples out to your audio interface for D/A conversion. The actual USB overhead is negligible in modern systems. That is, it's not your USB port that's the problem.
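For reference, here is the back-of-the-envelope math behind that ~1 ms figure for classic 5-pin DIN MIDI (a sketch; USB and other MIDI transports have different timing):

# Classic DIN MIDI runs at 31,250 baud with 10 bits per byte
# (1 start bit + 8 data bits + 1 stop bit). A typical note-on
# message is 3 bytes (status, note number, velocity).

BAUD_RATE = 31250
BITS_PER_BYTE = 10
MESSAGE_BYTES = 3

ms_per_note_on = MESSAGE_BYTES * BITS_PER_BYTE / BAUD_RATE * 1000
print(f"~{ms_per_note_on:.2f} ms per note-on over DIN MIDI")  # ~0.96 ms
# Daisy-chaining devices adds some delay per hop, which is why long
# chains can add up, as noted above.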
@@slartibartfast1268 yes, virtual instruments suffer the latency. I will soon test this by using a more powerful PC.
Really informative video Mitch thanks!!
Regarding the guitar/cabinet example.
Vibrations (P- and S-waves) from the audio equipment reach you much faster through the floor than through the air.
So you (your body) are actually getting feedback much faster than 10ms when standing next to a loud cabinet.
Sound travels through solids at around 6000m/s, roughly 17 times faster, which means when you strike a string, your feet and then your entire body would feel it after ~1ms, while your ears would pick up the air pressure 9ms later. Playing next to a stack is a completely different experience because of these P/S vibrations.
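To put rough numbers on that, here is a quick sketch using the figures above (assuming roughly 3 m from the cabinet and ~6000 m/s through the floor; real materials vary a lot):

# Approximate first-arrival times through the floor vs. through the air
# for a player standing a few metres from a loud cabinet.

SPEED_IN_AIR_M_S = 343     # speed of sound in air at ~20 C
SPEED_IN_SOLID_M_S = 6000  # rough figure for dense solids, as above
DISTANCE_M = 3.0           # assumed distance from the cabinet

air_ms = DISTANCE_M / SPEED_IN_AIR_M_S * 1000
floor_ms = DISTANCE_M / SPEED_IN_SOLID_M_S * 1000

print(f"Through the air:   {air_ms:.1f} ms")   # ~8.7 ms
print(f"Through the floor: {floor_ms:.2f} ms") # ~0.5 ms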
I have a Presonus Quantum which uses Thunderbolt. I can track with pretty much no noticeable latency.
Quantum is the bomb. Still the fastest out there for raw RTL
Yes, he is 100% right. 7ms is about where I can notice it. I was just talking about this lately, and he said the same as me.
I watched at 1.5x speed and every word was enunciated well enough for this to actually be understandable lol. Nice
Excellent video.
I use a digital mixer (MR18) to get around these issues. I just wish it had higher sample rates.
Would have loved to hear Mitch's take on using a digital mixer as a studio audio interface.
This is super helpful. Well done Mitch!
Excellent tutorial on Latency. Thank you, Mitch. Love your teaching style :)
Round Trip Latency (RTL) will always matter if: you process your input through native plug-ins (playing guitar or singing), or you play native (aren’t they all?) soft synths with a MIDI keyboard.
The question is, “Is it still a problem?” Unbelievably and sadly, the answer is still yes. Anyone can hear the result of 5-6ms RTL if they’re monitoring their own voice through a native processor. The resulting comb-filtered sound makes my teeth rattle…and I always have to switch to direct monitoring.
BUT that’s because what your DAW reads in latency usually isn’t full Round Trip Latency, it’s INTERNAL latency. You’ll think it’s 2.2-10ms but it’s really more like 22.2-30ms which is VERY noticeable.
In 20 years, I’m baffled at how this issue hasn’t been addressed, I’m starting to think it never will.
Fantastic as always mate. Regards from London
@sweetwater 10:40 Mitch mentioned a few 'workarounds' to direct monitoring. I'm not unfamiliar with such techniques, particularly utilizing an aux send to hear FX. However, I'd like to ask: are there any particular kinds of plugins that essentially will not work using this workaround?
Hi, thanks for your interest! Good question - really, any effects plug-in will work using the aux send method, but it will probably yield best results with time-based effects like reverb. If you’re using direct monitoring and also tracking through something like a compressor on an aux send, you might run into an unwanted “echo” effect and/or phasing issues caused by the dry and processed signals going into your headphones almost simultaneously.
Hope this helps a bit - feel free to contact me directly with any further questions, and thanks again!
Caleb Lowrey, Sweetwater Sales Engineer, (800) 222-4700 ext. 1620, caleb_lowrey@sweetwater.com
Sweetwater, I love you. Someone helped themselves to my guitar amp and my bday is tomorrow. Just saying.
Happy Birthday!
Great teacher. I like when people can explain things that way.
Well done as always. The most complete explanation of latency yet.
This was great; really well done. One thing I was hoping to learn a bit more about was ASIO. Is ASIO still a thing?
One-at-a-time explanation! Just awesome!
Good information all around. One thing I need to try: taking my guitar plugins out of their channel strips and running them on an aux in Logic Pro X.
I use Windows 10 (i5-4690K CPU) and my DAW is Propellerhead Reason version 11, with a Yamaha MG10XU mixer. The Yamaha Steinberg USB drivers do a pretty damn good job; you can customize the audio quality and the buffer, which goes from 2048 samples all the way down to 32 samples, and you can go from 8ms down to 2.5ms, which is the fastest. This version of Reason supports VST plugins, so I use NeuralDSP plugins with it. It's a fantastic setup for just playing at home and doing some basic recording. I connect my guitar either directly to the interface or connect my Line 6 HD Pro X balanced output to the input of the interface.
Nice info thanks. I’m using Reason 10 with Arturia MiniFuse 4 on Windows 10 i5 PC. Latency is low and acceptable. Good to know a mixer like your MG10XU is usable.
Excellent video Mitch & Sweetwater
High five Mitch Gallagher ! You are The BEST.
Great, spot-on explanation of the subject. Thanks bunches.
Idk, the sample rate can affect latency. Most of them aren't really that much different (input and output around 40-50 each, according to Guitar Rig), except for the highest one, which suddenly drops to basically 0... As soon as I tried that, it suddenly felt like 3x easier to play my guitar. You have no idea, it felt Sooooo GOOD...
I couldn't have explained it better, Mitch. Great video. Also, the Revolution audio interface looks awesome.
The Line6 Toneport UX2, 8 or KB37 was waaaaaaaaaaaaaaaaay ahead of its time.
The UX2 was my first "decent" interface. It was pretty horrible, to be fair, but the way you could monitor through the effects on it with no discernible latency was indeed far ahead of its time. Imagine if they did something like that now but with Helix as the base software.
I started my recording with a Line 6 Toneport in the early 2000s that I still have, but I retired it years ago after I upgraded from Win XP to a Win 10 laptop.
It was way ahead of its time, with some great sounds using the Gearbox FX. 😊
The thing that Mitch doesn't emphasize enough is trying to track with plugins, either just monitoring through them or printing the effect. Plugins can greatly impact your latency. This can especially be noticed when you are overdubbing and your project is already adding latency due to plugin processing on currently active tracks. This is why something like PT HD, Metric Halo, or UAD low-latency options work well.
Mitch is a wealth of knowledge.
Here is a neat trick from the live world that may also help! I have yet to see a live digital mixer that can perform under 3 milliseconds roundtrip to hardwired headphone amplifier so any singer that says they can perceive 3 milliseconds isn't perceiving latency so much as a phase misalignment that can be fixed with a simple Phase flip. So sometimes the signal resonating in a singers skull will be out of phase with the signal coming back to their ears in the IEMs or headphones. So if the total latency is under 4 milliseconds it is most likely a phase/polarity issue. So next time you have this issue try flipping the phase on the microphone the singer is using and it will solve the issue 9 times out of 10. Also I'm talking hardwired headphones, I'm not talking about wireless IEM systems which add some delay as well depending on multiple factors such as if they are digital or not and what UHF or radio frequency they use. Then just reverse the phase again once you are done recording if it is an issue.
underrated comment
WHAT AN INCREDIBLE INFORMATIVE VIDEO!
Wow, I had no idea increasing your sample rate decreases your latency. I have a cheap Focusrite USB interface for my Windows computer, and raising my sample rate to 96kHz with 64 samples reduced my round trip latency to 4ms. Wow! I exclusively record on my MacBook with a UAD Apollo, so I've never had any issues with recording, but sometimes I play 100% in the box in Ableton on my Windows computer. Having a 4ms round trip with this cheap interface is amazing.
Buffer size divided by sample rate x 1000 x 2 (for round trip) = your theoretical RTL from the buffers alone.
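Spelled out as a quick sketch (keep in mind this is the theoretical, buffer-only round trip; converters and drivers always add a bit more, so measured RTL comes out higher):

# Theoretical round-trip latency from the buffer setting alone:
# one buffer on the way in plus one buffer on the way out.

def theoretical_rtl_ms(buffer_size: int, sample_rate: int) -> float:
    return buffer_size / sample_rate * 1000 * 2

print(theoretical_rtl_ms(64, 48000))   # ~2.67 ms
print(theoretical_rtl_ms(64, 96000))   # ~1.33 ms
print(theoretical_rtl_ms(256, 44100))  # ~11.61 ms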
Great piece, and in under 14 minutes. For most settings, a musician could go by this and be set every time. For one of my setups, I am still looking for a solution. I use a Zoom L-12 mixer as an audio interface, send it to a laptop with Presonus and use that with various VSTs on instruments, vocals, keyboards, and automation to change VST settings *during* each song. Then, back to the Zoom L-12 and out by stereo to monitors and speakers. It sounds like a recording studio but for live performances. However: latency is still a problem when using several VSTs. This solution works... most of the time. But I have to skip some songs using this solution, and skip some CPU-intensive VSTs. The best a Zoom L-12 can do is between 13ms and 18ms. This is worse when more automation comes in during some songs. Conclusion: Looking for a 12+ channel audio interface that can handle the round trip back to itself for final 2-channel mixing and out to the speakers. Have yet to find it. Anyone have an idea? Obviously, I am not looking for just a low-latency audio interface, but a bi-directional one. The EVO 16 doesn't work, and neither does the Focusrite Scarlett 18i20.
The "aux send" tracking tip is one I haven't actually heard of before, I'll have to try it. Thanks Mitch!
I don't even know what that means. I was confused
@@najiahgriffiths4216 You'll understand it soon. ☺️ We're always learning.
I'm fairly satisfied now that I can run my tracking template on an M2 @ 64 buffer / 48k. It's still not as good as a DSP chain, but the kids don't seem to care; they're used to it. Autotune is of course the elephant in the room, since it's always on when cutting in my world.
Thanks for the video ! Note to self 7:20
Does latency still matter? Yes.
Are there substantially more ways to address/minimize this latency than ever before? Yes. Latency is always something to consider, but with modern technology it is a pretty easily addressable issue that shouldn't be a problem for you.
That's a great summary of the video.
Unless you have a live set in Ableton with about 70 tracks each with a plugin on them
IIRC, USB polling is a contributor to interface latency, and Thunderbolt doesn't have this issue (or at least not as much).
nice video with good information and perspective
If you need a doctor, watch a video about going to a doctor. If you'd like to look better, simply cut your hair yourself. If you need lower latency, simply look at the driver. Good to have videos like this, because it explains everything for 100% success.
I always had a problem with it until I quit using a Windows computer in my studio. I switched to a Macbook and NO MORE PROBLEM.
I monitor drums through an analog rack (API), so my headphones are in phase with the mics/drums. It makes a difference.
It is great that modern DAWs can have very low latency. The old problem with latency was that there was a perceived delay that the musicians could hear. A good solution was to monitor the performance directly from the analog console while injecting insert sends from the analog console to the recorder. This technique gives you the ultra-low latency of a pure analog console. With a good DAW and a good analog console, the sonics would be fine as long as there was no "bleeding" from a previously recorded track into the new track being recorded. When there is bleed, phase filtering can compromise the sound tremendously, even though there is no perceived latency between the old and the new track. Preventing bleed was the solution. I started out recording to tape, and bleed wasn't as big of a problem as it was once digital processing entered the picture. Correcting for track misalignment down to a small number of samples is possible but DAW-dependent, and may not be practical or necessary.
For a live performer, latency truly matters. It's so frustrating searching for an audio interface when manufacturers don't show these readings.
Not sure if I agree about higher sample rates. I thought the clicks and artifacts come from the output buffer running dry before the system can refill it. So while we measure buffers in samples, I would have thought that to the system it's really the milliseconds that count. A CPU and USB controller don't really care about latency or samples as such; they care about how much time they have to work with a certain amount of data.
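To put rough numbers on that argument, here is a minimal sketch of the time window the system has to deliver each buffer. It is only the buffer-size/sample-rate arithmetic; driver and converter safety offsets, which real interfaces add on top, are deliberately ignored here:

```python
# Minimal sketch: the per-buffer deadline the CPU/USB controller has to meet,
# expressed in milliseconds, for a few common buffer/sample-rate combinations.

def buffer_deadline_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Time available to produce one buffer before the output runs dry."""
    return buffer_samples / sample_rate_hz * 1000.0

if __name__ == "__main__":
    for rate in (44_100, 48_000, 96_000):
        for buf in (64, 128, 256):
            print(f"{buf:>4} samples @ {rate} Hz -> {buffer_deadline_ms(buf, rate):.2f} ms deadline")
```

Doubling the sample rate while keeping the same buffer size halves that deadline, which is exactly why the CPU has to work harder at 96 kHz to avoid clicks.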
Great explanation, Mitch.
How does the Roland Octa-Capture hold up in 2024 against RME, Audient, and UA Apollo interfaces, conversion-wise? Do I need to change my interface? I am a weekend music hobbyist.
Does the video have anything to do with UA’s announcement 😬
Thanks very much.
Very informative 👍
Great video!
Assuming that most people will work at 44.1 kHz/24-bit or 48 kHz/24-bit, with a buffer size between 128 and 512, "producing on the fly": what are the best audio interface options in terms of RTL at those settings?
For eliminating latency with MIDI devices, do not use your DAW as the MIDI router! Get a quality hardware-based MIDI interface (I love the MioXL) and run all your devices through that. The computer should just be another node/device on that network. Otherwise you get the latency both in and out of the computer, and given the layers of drivers and software, it can add up. Besides, this is a super cool way to have your studio, because you can jam and write "DAWless", then just lock your DAW to the timing clock when it comes time to record.
Can you detail this a little more? Most MIDI latency I experience is due to slow CPUs in old 80s synths I have. I have never noticed perceptible latency from having the DAW make the MIDI connections as opposed to another program, nor have the audio buffer settings in the DAW ever had any effect on MIDI latency with hardware synths. And what clock would the DAW be syncing to?
Great video Mitch!!!!
Great stuff Mitch :)
I have a great rig: an SQ5 mixer, a Sennheiser Digital 6000, and RCF TT speakers, so I start with a total of 7 ms of latency. People say you can't notice it, but if I stand 4 meters from my speakers, that adds roughly another 10 ms of acoustic delay. This calculation drives me a bit crazy given how expensive the gear is; next to an old analog rig, it can feel like analog comes out ahead either way. Is analog equipment the winner in the end?
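For what it's worth, the acoustic part of that math roughly checks out: assuming sound travels at about 343 m/s at room temperature, 4 meters works out to around 11-12 ms on top of the electronic chain. A minimal sketch of that back-of-the-envelope calculation:

```python
# Back-of-the-envelope acoustic delay between speaker and listener,
# assuming ~343 m/s speed of sound at room temperature (varies with temperature).

SPEED_OF_SOUND_M_S = 343.0

def acoustic_delay_ms(distance_m: float) -> float:
    """Delay added purely by the air path, independent of any electronics."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000.0

if __name__ == "__main__":
    for d in (1.0, 2.0, 4.0):
        print(f"{d} m -> {acoustic_delay_ms(d):.1f} ms")
```

Note that an all-analog rig pays the same air-path delay; only the electronic portion differs between the setups.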
Great video! This has been the subject at my job recently as we consider whether we need to go Avid for a new room. One thing we can't seem to get around is preserving auto input monitoring. We still need to do punches, and we don't love the idea of hearing both the live input and the recorded track together before and after the punch. As far as I can tell, HDX is really the only solution to this. Although maybe Apogee has it figured out with their new "dual path" hardware integration with Logic?
Great video.
I have a Steinberg UR12, but I cannot lower the latency. I applied all the things you suggested, but there is still annoying latency.
Superb Mitch - been using the 2nd mixer/software option with interfaces for a long time - just recently got a chance to use the SSL 12 with a good spec new PC and wow! First time I've been able to monitor direct in Cubase - brave new world (for me anyway)... Cheers 😎
I was wondering about the latency on the new Kemper Profiler Player. Prob just gonna have to try it out for myself.
You need good computer processing. Most interfaces have a buffer setting. You can get it down really low, just under 2 milliseconds, but you can't be running a lot of VSTs at the same time. That's why it's good to record first with your buffer set low, so you get next to no latency; once recording is done, crank the buffer up. Latency won't matter when mixing down, and you get your computing power back to turn on all your other VSTs.
Yep, record dry with low latency. Mix with a massive buffer.
The vast majority of interfaces aren’t going to get below 3ms at any buffer setting, on any computer, because of their drivers and the limitations inherent in USB. A faster computer may allow you to run more plugins at the lowest buffer before glitching, but it won’t speed up latency.
@@daddyzhoam even 5-10ms is fine. That’s less than a tenth of an eye blink.
@@modelcitizen1977 Try playing e-drums triggering a sample library with 10 ms of latency and get back to me.
@@modelcitizen1977 Well, those numbers don't always mean the same physical latency, but in my case, on all my setups, 5-10 ms wasn't OK even for very slow music.
In the end, latency is dictated more than anything else by the drivers for your audio interface. An interface with lousy drivers is going to put up poor RTL numbers even if it’s hooked up to the fastest computer on earth.
I played the pipe organ in Fulton, MO in middle school, and I don't remember any latency, even though some of those biggest pipes were ~100' away. I was young though; maybe it was horrid, idk.
Almost a great video, but then you made the sample rate mistake haha: a higher sample rate does not really equal lower latency, because a 128-sample buffer at 96 kHz should give the same performance as a 64-sample buffer at 48 kHz. At 48,000 samples per second, one sample lasts 1000/48000 ms, so 64 samples = 1000/48000 × 64 ≈ 1.333 ms; at 96,000 samples per second, 128 samples = 1000/96000 × 128 ≈ 1.333 ms. Fascinating, isn't it?
This man is great!
Sweetwater should have an option on their search filters to look for units with a direct monitoring feature.
I would love to hear some thoughts on the new Mac M2 mini computers. The reason I say this is that I'm hearing a vast number of people saying these machines have pretty well solved latency problems. I have been looking to buy another machine in the near future, but I REALLY wish I could find something more in-depth on this.
I use Apple Silicon and it does help. At higher sample rates, there are still going to be issues, especially when using many tracks with many plugins.
Interesting! I'm on an M2 Mac. Still, I prefer my UA Apollo's direct monitoring to take care of my signal. When I play through a plugin in Logic, with the plugins running on my Mac, I just get this furry feeling when I touch the strings, which I don't like.
When I hear of someone claiming to detect 3 ms of latency (delay), I automatically suspect the common scenario of the manufacturer exaggerating the interface specs. I've owned interfaces claiming 3 ms latency when in fact it was over 6 ms at the same sampling rate.
It seems the discussion of latency almost always focuses on latency while recording audio. I hope someone will correct me if I'm wrong, but for musicians recording software instruments the problem of latency is still very real, and it obviously does not benefit from direct monitoring. The problem, of course, is that the sound from MIDI-driven VSTs isn't even created until the MIDI input reaches the VST, and the VST then has to generate the waveform using computer resources. Studio One allows you to set buffers differently for playback of recorded tracks, which at least in theory reduces the load on the CPU and allows smaller buffers to be used on your actively recording tracks. Not sure if any other DAWs do this or how much difference it makes.
The upside with software instruments is that there's no input conversion: you're not using the interface's A/D, only the D/A. So the real-world numbers are faster than the reported RTL, because no round trip is being made, and the MIDI input itself adds essentially no conversion latency.
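As a rough illustration of that point, here is a minimal sketch comparing a reported round-trip latency with the output-only path a software instrument takes. The input/output safety-offset figures are hypothetical examples, not numbers for any particular interface or driver, and the DAW's internal MIDI scheduling is not modeled:

```python
# Minimal sketch: round-trip latency vs. the output-only path a software
# instrument takes. Safety offsets below are hypothetical example values.

def one_way_output_ms(buffer_samples: int, sample_rate_hz: int, output_offset_samples: int) -> float:
    """Output buffer plus D/A-side safety offset only (software instrument path)."""
    return (buffer_samples + output_offset_samples) / sample_rate_hz * 1000.0

def round_trip_ms(buffer_samples: int, sample_rate_hz: int,
                  input_offset_samples: int, output_offset_samples: int) -> float:
    """Input and output buffers plus both converter/driver safety offsets."""
    return (2 * buffer_samples + input_offset_samples + output_offset_samples) / sample_rate_hz * 1000.0

if __name__ == "__main__":
    buf, rate = 64, 48_000
    in_off, out_off = 40, 40  # hypothetical safety offsets in samples
    print(f"Reported RTL:        {round_trip_ms(buf, rate, in_off, out_off):.2f} ms")
    print(f"Output-only path:    {one_way_output_ms(buf, rate, out_off):.2f} ms")
```

Under these assumed numbers the output-only path is roughly half the round trip, which is why a software instrument can feel tighter than the interface's published RTL suggests.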
Great review!!!
Help! I've bought a Focusrite 2i4. It plays great using MixPad, BUT I record all my backing at once, as the backing comes from my keyboard, and it's studio quality. But when I try to add the rhythm guitar (which, as you know, is like a sharp percussion part) to another track while listening to the backing, it's way lagged behind. Too noticeable to use! And it says five milliseconds! With reggae it's a timing nightmare. I've tried changing the buffer size and upped the sample rate to over 88 kHz, which is actually the best, with the buffer at 16, which my computer handled great. What else can I do? Otherwise the expensive Focusrite and software are a waste of money, and I'm no better off than using the audio in and out on the computer itself. Any ideas please? Thank you, Kaiwen
Cool video. Thanks
IMO companies should stop trying to cut corners. The truth is, when recording, the more analog you go the better. I used to have all kinds of problems tracking at home. Now I track using a console, outboard EQ, and a compressor. No latency. I use the computer as a tape machine. Manufacturers should sell a 2- or 4-channel strip with an all-analog preamp, like a Neve, a Pultec-style EQ, and a couple of compressors like an 1176 and an LA-2A, all analog, so we record the right way on the way in. Why try to reinvent the wheel? Make a small console that also has all the digital advantages of the technology.
Totally agree with you