MIDI delay recording

John Daminato
Posts: 103
Joined: Sun Jan 19, 2003 6:59 pm
Location: chicago
Contact:

Post by John Daminato » Sat Dec 02, 2006 4:26 pm

That's an interesting find. One, you can easily get 10,000rpm SATA drives now, and pretty cheap, so I wouldn't tell people to be buying 7,200s. :wink: Two, musicians/keyboardists will in most cases be using some kind of MIDI controller, so how can they not select it as the input?
"Everybody is right in some way"
http://WWW.JOHNDAMINATO.COM http://www.myspace.com/daminato
i920 radiator cooled, hyper-threaded, 7 gigs 1333 RAM, Windows 7, RME Fireface, UAD, bla bla

Hepha Luemp
Posts: 56
Joined: Wed Mar 01, 2006 9:57 am
Location: Oslo, Norway

Post by Hepha Luemp » Tue Dec 05, 2006 4:09 pm

Well, while we're all waiting for a response regarding MIDI recording, here's a little something about audio recording.

Here is a very, very simple test. You can do it as a thought experiment if you'd like, but it works the same in reality.

Open Live
Turn on the metronome click.
Find a microphone and plug it in.
Route the input from the mic into two audio tracks.
Set the monitor to "off" for one of the audio tracks, and leave it on "auto" for the other.
Enable the tracks for recording.
Turn off the green monitor button (so you don't get any feedback).

Go to session view, hit play, make sure there's a click sound in your speakers.

Hold your mic up to one speaker.

Start recording.
Record for a while.
Take the tracks out of recording.

Now - what do you think - if you start one of the clips, will it play in sync with the click?
Will both of them?
Or maybe just the one with monitor off?


If your recorded click does not play in sync with your metronome click - I ask:

In what kind of situation would one want this recording to NOT be in sync?
:?:
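
If you'd rather measure the result than trust your ears: here's a rough sketch, assuming you export each recorded clip as a WAV starting at bar 1 and have Python with numpy and soundfile installed. The file names and tempo are just placeholders - this has nothing to do with Live itself, it's only a way to compare the clips against the grid.

Code:
# Rough sketch (not Live code): estimate how far the recorded clicks sit from
# the metronome grid. File names and tempo below are made up.
import numpy as np
import soundfile as sf

TEMPO_BPM = 120.0                 # whatever tempo the test set uses
BEAT_S = 60.0 / TEMPO_BPM         # seconds per metronome click

def click_offsets(path, threshold=0.2):
    """Return each detected click's offset (in ms) from the nearest beat."""
    audio, sr = sf.read(path)
    if audio.ndim > 1:            # fold stereo to mono
        audio = audio.mean(axis=1)
    loud = np.abs(audio) > threshold * np.abs(audio).max()
    # onsets = samples where the signal first crosses the threshold
    onsets = np.flatnonzero(loud & ~np.roll(loud, 1))
    times = onsets / sr
    nearest_beat = np.round(times / BEAT_S) * BEAT_S
    return (times - nearest_beat) * 1000.0

for name in ("clip_monitor_auto.wav", "clip_monitor_off.wav"):
    offsets = click_offsets(name)
    print(f"{name}: mean offset {offsets.mean():+.1f} ms")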

Hepha Luemp
- Just call me Hepha

murphf
Posts: 54
Joined: Thu Sep 02, 2004 9:37 am

Post by murphf » Tue Dec 12, 2006 9:47 am

I have to concur. I would like the MIDI timing to be as I heard it when I played it. This should be the default.

On a side note, will some kind of time stamping be added to Live, in the DirectMusic format or the format of the Emagic AMT? That would tighten up timing a lot. The Emagic code may become public soon, as Apple, the new owner, no longer supports it.

Pasha
Posts: 3328
Joined: Tue Dec 27, 2005 12:45 pm
Location: Lost Island
Contact:

Post by Pasha » Wed Dec 13, 2006 6:25 am

Joining the flow.
MIDI should be played back as it was recorded. It's the DAW that should take care of compensation; notes should never be shifted.
Personally I quantize MIDI and switch from Operator to Simpler, to external MIDI devices, to SampleTank. When I record MIDI I want to be sure that it plays in sync with whatever sound generator I attach to it. I hadn't noticed this flaw, but it shouldn't happen when quantize is on, correct?

What about audio? Is the behaviour the same?

- Best
- Pasha
Mac Studio M1
Live 12 Suite,Zebra ,Valhalla Plugins, MIDI Guitar (2+3),Guitar, Bass, VG99, GP10, JV1010 and some controllers
______________________________________
Music : http://alonetone.com/pasha

yannxou
Posts: 148
Joined: Fri Feb 27, 2004 8:39 pm
Location: Barcelona

Post by yannxou » Wed Dec 13, 2006 2:29 pm

Chris J wrote: Not sure I understand: what I hear is what I play, what I play is what I hear. Then when I play back I hear that it's not what I played.
This great sentence shows both the problem and the solution. :wink:

I also hope it's fixed soon.
Live 9 Suite / MaxForLive / MacBook Black 2.4Ghz / 4Gb RAM / OSX 10.7.5 / Motu828 mk1 / MicroKONTROL / Midisport 4x4 / Korg Kaoss Pad II / Nocturn / Wiimote.

iain.morland
Posts: 111
Joined: Thu Dec 08, 2005 9:57 pm
Contact:

Post by iain.morland » Sat Dec 16, 2006 2:21 pm

Bump.

What's happening with this? Has it really broken again after the fix in Live 5?

If so, that's a deal-breaker for me and I won't be upgrading to 6. :evil:

tylenol
Posts: 564
Joined: Wed Aug 16, 2006 2:31 am
Location: Baltimore, MD
Contact:

Post by tylenol » Sat Dec 16, 2006 5:18 pm

iain.morland wrote:Bump.

What's happening with this? Has it really broken again after the fix in Live 5?

If so, that's a deal-breaker for me and I won't be upgrading to 6. :evil:
Well, I tested it in 5.2 and 6 and got exactly the same behavior. But no one who is upset in this thread responded, so I don't even know whether it's the correct behavior or not (though as far as I can tell, and I wasn't here for the original debate, it is the correct behavior in both).

popslut
Posts: 1056
Joined: Sun Oct 22, 2006 4:58 pm

Post by popslut » Sat Dec 16, 2006 5:42 pm

You're never going to get Ableton to address this as a bug because it isn't a bug. They've designed it to do this on purpose.

The monitor on/auto/off issue took me a while to work out, but I've got it fixed in my head now and it can be made to make sense if you approach it from a programmer's point of view.

From a musician's point of view it is the most insane scheme anyone ever cooked up.

The basic issue is that Live doesn't apply latency compensation when the channel is set to monitoring 'on' or 'auto'.

What Ableton have concluded is that a human playing a synth with 30ms latency will automatically compensate for this latency by playing consistently 30ms early. They assume that the human is capable of latency compensation. Anyone who has ever tried this will know how impossible this is.

Likewise, they assume that a singer monitoring through the software will compensate for the latency by singing consistently 30ms early.

If this were possible, their on/auto/off scheme would make perfect sense. Any audio or midi recorded with monitoring set to 'on' or 'auto' - according to Ableton's theory - will have been compensated for in the singing or playing - in order for it to sound correct whilst recording - and so will not require compensating for again.

This is obviously ridiculous to anyone with any experience of recording.

I've read [in this thread, stated by Amaury] that this is the only way it can be done due to the constraints of the laws of the space-time continuum, but anyone who has ever used Cubase, Nuendo, Logic, Sonar, Reason, or any other piece of audio software available to mankind knows this is nonsense. They all manage to do it seamlessly - Ableton is the only sequencer that doesn't automatically compensate for latency in the background.

I want my audio software to compensate invisibly for any latency. I want my software to "know" that if I hit 'that' key as 'that' kick drum sounds from the speakers, I want 'that' bass note to be placed beside 'that' kick drum.

This doesn't require a rewrite of the laws of physics - it simply requires that, as the note is played during recording, 30ms [or whatever the latency amount is] be subtracted from the "real" note position. It really is not rocket science.
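
Just to spell out the idea (a toy sketch, not anything from Live's code - the latency figure would come from whatever the audio driver reports):

Code:
# Toy sketch of the idea only - not Live's implementation.
REPORTED_LATENCY_S = 0.030   # e.g. 30ms, whatever the buffer settings give you

def place_recorded_note(raw_timestamp_s, latency_s=REPORTED_LATENCY_S):
    """Write the note where the player heard it, not where it arrived."""
    return raw_timestamp_s - latency_s

# A note played against the backing track and timestamped at 1.000s
# gets written at 0.970s, beside the kick drum the player was following.
print(place_recorded_note(1.000))   # -> 0.97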

When I record vocals I use direct monitoring through my soundcard. No singer likes performing with latency in their headphones.

Occasionally, I'll forget to set the monitoring to "off" on the Ableton channel and only realise when I play back my recorded vocal and discover it is late by the total latency amount.

Can anyone at Ableton please explain to me in which circumstance this would be a desirable outcome?

While you're there, can you also explain to me why my external MIDI equipment responds with the same latency as my VSTis?

I'm using Ableton as a MIDI sequencer - my Kurzweil K2000 is connected to my analogue mixer and goes nowhere near my soundcard. Yet I find that MIDI data that passes through Ableton Live is subject to the same latency as audio. WHY???

I can think of no technical reason why this should be the case - every other midi/audio sequencer on the planet manages to pass midi data through its recording engine without adding audio latency delay. Why not Ableton Live?


I must say - Live is an absolutely amazing piece of gear, but this is one aspect which needs changing really soon. It keeps being reported as a bug because, to most musicians, this behaviour is so counter-intuitive and illogical that it is difficult to believe anyone could have thought it was a desirable way to operate.
Last edited by popslut on Sat Dec 16, 2006 5:46 pm, edited 1 time in total.

iain.morland
Posts: 111
Joined: Thu Dec 08, 2005 9:57 pm
Contact:

Post by iain.morland » Sat Dec 16, 2006 5:43 pm

tylenol: Ah - I see.

I must confess I got confused when trying, after the previous discussion about this, to identify and articulate exactly what was happening wrongly - so I left it to the other tech-savvy forum members to report and discuss.

But I did get the impression that in 5.2.2 there was a fix (and wasn't it possible to change it back through editing some kind of txt file? or perhaps I'm misremembering...)

popslut: your excellent post is precisely the kind of exposition I didn't feel able to offer myself. Thanks.

So can someone confirm once and for all whether this IS fixed in 5.2.2 and IS broken in 6, or are they actually the same?

tylenol
Posts: 564
Joined: Wed Aug 16, 2006 2:31 am
Location: Baltimore, MD
Contact:

Post by tylenol » Sat Dec 16, 2006 7:37 pm

popslut wrote: What Ableton have concluded is that a human playing a synth with 30ms latency will automatically compensate for this latency by playing consistently 30ms early. They assume that the human is capable of latency compensation. Anyone who has ever tried this will know how impossible this is.
Actually, this isn't true at all -- piano players do it all the time. I agree that many musicians who are not piano players may not have this training (though my main training is in piano so I don't really know how hard it is to learn for e.g. a snare player). I ran across this interesting article (google cache of a pdf) while trying to get the figures for mechanical latency of a piano. It seems that there are even experimental studies on it:

"The most important aspect of this is the fact that we can subconsciously adjust our performance to compensate for such different feedback conditions. During experiments with delayed feedback, subjects clearly altered their behavior according to the characteristics of each trial, forcing the researchers to introduce control trials between each pair of trials (Aschersleben and Prinz 1997; Mates and Aschersleben 2000). In piano performance, the time elapsed between pressing a key and the corresponding note onset is around 100ms for piano notes and around 30ms for staccato, forte notes (Askenfelt and Jansson 1990). Even if we assume that the pianist expects the note onset to happen somewhere in the middle of the course of the key, it is very likely that latencies will be different for different dynamic levels. Still, pianists have no problem dealing with such different latencies; since voices in pieces for the piano usually have dynamics that change continuously, the performer has the opportunity to adjust himself to the corresponding changes in latency."
popslut wrote: I've read [in this thread, stated by Amaury] that this is the only way it can be done due to the constraints of the laws of the space-time continuum, but anyone who has ever used Cubase, Nuendo, Logic, Sonar, Reason, or any other piece of audio software available to mankind knows this is nonsense. They all manage to do it seamlessly - Ableton is the only sequencer that doesn't automatically compensate for latency in the background.
Actually, all Amaury said about what is and isn't possible is that you can't compensate for latency during monitoring (due to the laws of the space-time continuum). Which is absolutely true, as far as I can tell: Live would have to know 30ms in advance that you were going to hit a key if it wanted a softsynth with 30ms latency to start its audio exactly when you hit the note.

What it does do is ensure (and this seems to be what bothers people) that the MIDI note-on message that is recorded coincides with the real start of the audio (via automatic compensation that happens in the background). The alternative, I suppose, would be to have delay compensation arrange it so that the note-on corresponds to the key press -- what this would mean, though, is that the actual recorded track will be earlier than the monitored version by the amount of latency, i.e. you won't record what you monitored. Since monitor=off basically produces this effect, it is easy to hear what it will sound like, and in my quick tests the timing sounded both off and early. But then I am a piano player. This behavior (in my quick tests) seems to be exactly the same in 5.2.
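
Maybe a toy example makes the two placements clearer (made-up numbers, not Live's internals):

Code:
# Toy comparison of the two schemes described above - made-up numbers only.
SOFTSYNTH_LATENCY_S = 0.030   # plug-in/buffer latency while monitoring

key_press = 1.000                                # when the player hit the key
audio_heard = key_press + SOFTSYNTH_LATENCY_S    # when the monitored sound arrived

# Scheme A (what Live appears to do with monitoring on/auto):
# the recorded note-on lines up with the audio the player actually heard.
note_on_scheme_a = audio_heard                   # 1.030

# Scheme B (what several people here are asking for):
# the recorded note-on lines up with the key press, so playback comes out
# 30ms earlier than what was monitored while recording.
note_on_scheme_b = key_press                     # 1.000

print(note_on_scheme_a, note_on_scheme_b)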

Pasha
Posts: 3328
Joined: Tue Dec 27, 2005 12:45 pm
Location: Lost Island
Contact:

Post by Pasha » Sun Dec 17, 2006 7:58 am

I read through the whole thread. It might be that I'm dumb.... :oops:
My understanding so far is:

1) As long as you record quantized MIDI this phenomenon should not occur with either AUTO or OFF monitoring (see the little sketch after this list).
2) If you record with AUTO monitoring (as you should, otherwise you can't hear what you play, even on external gear) Live records with no compensation, so in a bar (4/4) the 3rd 1/8 note is not in the right place (but you hear it fine when it's played back). It's early by whatever the audio buffer and latency settings dictate, and your brain compensates.
3) If you record with OFF monitoring (as you can when you monitor through a mixer) Live records with compensation, so in a bar (4/4) the 3rd 1/8 note is in the right place.
The same should happen with audio as well.
4) When creating MIDI clips with the step editor we write the note-on messages in the right place on the MIDI timeline, and this is played back accordingly in real time, although not at the same time but after a new loop start. This confuses me a little. Shouldn't those notes we draw be played out of sync (early)?
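
About point 1, here is a tiny sketch of why record quantization would hide a constant offset (made-up numbers, not anything from Live):

Code:
# Why a constant latency offset disappears under record quantization.
# Made-up numbers, not Live's internals.
GRID_S = 0.125                 # 1/16-note grid at 120 bpm
LATENCY_S = 0.030              # constant offset added to every recorded note

def quantize(t, grid=GRID_S):
    return round(t / grid) * grid

played = [0.0, 0.25, 0.5, 0.75]                 # intended 1/8-note positions
recorded = [t + LATENCY_S for t in played]      # every note lands 30ms off
print([quantize(t) for t in recorded])          # -> [0.0, 0.25, 0.5, 0.75]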

So far I hadn't noticed it, because I record with quantize on (I hear a little flanging effect when playing audio & MIDI together if I perform the bass/kick test, but this happens in Cubase as well, so I didn't care) and in the end everything appears in sync to my ears, even if I switch MIDI source. It doesn't matter whether it's external gear, Operator, Impulse, Simpler or a VST.
I'm afraid a fix would probably mess up all our previous recordings, because every Project would have to be preprocessed to put the note-on messages in the right place (though where is the right place, if you record with no quantization and the latency depends on the VST and audio settings?). Audio could be even worse. I don't know Live's internals, but I guess the programmers know where to find a suitable timeline for the conversion.
Please help me understand, I'm a little confused....

- Best
- Pasha
:?
Mac Studio M1
Live 12 Suite,Zebra ,Valhalla Plugins, MIDI Guitar (2+3),Guitar, Bass, VG99, GP10, JV1010 and some controllers
______________________________________
Music : http://alonetone.com/pasha

popslut
Posts: 1056
Joined: Sun Oct 22, 2006 4:58 pm

Post by popslut » Sun Dec 17, 2006 1:35 pm

tylenol wrote: [...]

All very interesting but still I've yet to meet many musicians who are willing - let alone able - to perform consistently 30ms [or whatever] early just to overcome the shortcomings of a recording system.

As I stated, EVERY other DAW compensates for latency in the background - invisibly and with great success. In all the time I used Nuendo/Logic/Cubase I never once had to move my audio manually to compensate for system latency. The software always just "knew" where I wanted it and got it right 100% of the time.

This issue only became apparent to me when I started using Ableton Live.

As long as I remember to switch monitoring to "off" when I record audio [when using direct monitoring via my mixer] I have no problem with audio or VSTis.

I do have problems when using my external midi gear however.

If I switch monitoring to "off" in this instance I cannot hear what I'm playing because Live doesn't pass the midi data through to the external synth. If I switch monitoring to "on/auto" I can hear what I'm playing, but late by the latency amount.

NO sequencer I've ever used adds latency to external MIDI signals. Why Ableton?

So - and assuming I'm not technically adept enough to be able to play consistently 30ms ahead of the beat - everything I play is positioned 30ms later than I want it.

I can understand that for some this arrangement works fine and seems totally logical, but for others, myself included, it makes Ableton a pain in the arse to use and leads to much head scratching and apologising over the talkback.
tylenol wrote: ...you can't compensate for latency during monitoring (due to the laws of the space-time continuum).
Obviously.


So maybe the perfect solution would be to include in the "preferences" an option to "Automatically compensate for latency when monitoring is set to on/auto" so that all three monitoring modes behave like "off" mode with respect to latency compensation.
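
In pseudo-code the request really is that simple (hypothetical option name, and only a sketch of the behaviour as described in this thread - obviously not Live's actual code):

Code:
# Sketch of the requested preference - hypothetical, not Live's code.
ALWAYS_COMPENSATE = True   # the proposed Preferences option

def recorded_position(event_time_s, monitor_mode, latency_s):
    if monitor_mode == "off" or ALWAYS_COMPENSATE:
        # behave like monitor "off": pull the recorded event back by the latency
        return event_time_s - latency_s
    # current "on"/"auto" behaviour (as described in this thread):
    # leave the event where it landed, i.e. late by the latency amount
    return event_time_s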

There will always be latency whilst monitoring but as a seasoned DAW user I'm used to this and have developed strategies to overcome it; Direct Monitoring being one of them.

However, I'm sure I'm not alone in wishing that Live had an option to automatically compensate for it in all three monitoring modes. The fact that this "feature" keeps cropping up as a bug report would seem to bear that out.

iain.morland
Posts: 111
Joined: Thu Dec 08, 2005 9:57 pm
Contact:

Post by iain.morland » Sun Dec 17, 2006 1:54 pm

I think the reason why this is causing so much head-scratching is not that the way Live works is totally illogical, just that it's totally unconventional in relation to other DAWs. :roll:

Hepha Luemp
Posts: 56
Joined: Wed Mar 01, 2006 9:57 am
Location: Oslo, Norway

Post by Hepha Luemp » Sun Dec 17, 2006 2:07 pm

iain.morland wrote:I think the reason why this is causing so much head-scratching is not that the way Live works is totally illogical, just that it's totally unconventional in relation to other DAWs. :roll:
You mean, unconventional, because if I make an audio recording with the monitor set to auto or on - like if I am recording guitar and listening through a software guitar amp - then I will always have to apply some negative delay afterwards to get it in sync.

And the same with MIDI tracks, always!!

You find that just unconventional?????

:roll:

Well, I guess with practice one really doesn't need to hear what one's doing - it's just a matter of practice.

Speaking about unconventional:

Here's the link to The Famous Fifteen Minute Deaf Mix Story
http://recforums.prosoundweb.com/index.php/t/4307/0/

This guy gets into a situation where he claims he's so good at mixing, he can do it without monitors and without track sheets!
He can just look at the meters, and then figure out what kind of track it is.

So, he has to prove it!

And yes, there is a link in the thread to the audio also!!!
Sounds pretty good, actually!!
8)


But, seriously, Ableton: I really like your software, but it is a bit of a hassle to have to move everything I record just to get it where it would be if your software behaved like a good recorder/DAW should...

Hepha Luemp
- Just call me Hepha

Pasha
Posts: 3328
Joined: Tue Dec 27, 2005 12:45 pm
Location: Lost Island
Contact:

Post by Pasha » Sun Dec 17, 2006 6:29 pm

Hepha Luemp wrote: [...] But, seriously, Ableton: I really like your software, but it is a bit of a hassle to have to move everything I record just to get it where it would be if your software behaved like a good recorder/DAW should...
I do agree - it should be fixed - but after a year and a half of usage I haven't run into all those problems, even though I can confirm all the tests suggested in the forum. Probably because I record with quantize on, and thus Live compensates only during playback. I do agree that, for the benefit of expression, non-quantized recordings should not be compensated, but then what is Groove Quantize all about? I thought you have to record with quantize on and after that let Groove Quantize do the trick... probably I'm missing something.
IMHO Live was created to perform live (sorry for the pun) and then evolved into a DAW. The method used during recording is probably more suitable for session playing/recording, while as a DAW it needs to be improved.

- Best
- Pasha
Mac Studio M1
Live 12 Suite,Zebra ,Valhalla Plugins, MIDI Guitar (2+3),Guitar, Bass, VG99, GP10, JV1010 and some controllers
______________________________________
Music : http://alonetone.com/pasha

Locked