Why is the Algorithm detecting only every second beat? #19
Comments
It depends on the audio that is being tracked - sometimes the beat detection can be prone to tapping 'on the off-beat' as an error, or to tracking at a slower tempo than expected. There are also many cases where the "correct" rate to track is subjective, so it is hard to have a right answer in that case...
OK, I get that, but I don't think it's tapping on the off-beat or anything like that, as the tempo estimation is correct. For example, it says the tempo is 120 BPM but only triggers the beat event at 60 BPM.
It's hard to say without seeing the specific example, but the algorithm's tempo estimation and beat location parts are actually separate. The beat detection is based upon a sense of 'forward momentum' which is guided by the tempo, so it is possible for these two things to not be identical. But I do expect them to be the same most of the time...
That's interesting. If you want, I can give you the Visual Studio project (or just the executable and the source code so you don't have to compile) so you can try it out yourself.
I encountered the same problem with the latest BTrack source and openFrameworks. The BPM gets estimated correctly, but only every second beat is detected. This is always the case. The relevant code seems to be in BTrack.cpp, in the if statement at line 241; the code in there is supposed to be run halfway between beats.
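For readers unfamiliar with the structure being discussed, the following is a rough sketch of the kind of counter-based scheduling described above, where one countdown fires the beat itself and a second countdown, reached roughly halfway between beats, triggers the prediction of the next beat. It is not a copy of the code at the line referenced; the type and all names are illustrative.

```cpp
// Illustrative sketch only - not BTrack's actual code. One counter fires the
// beat; a second counter, which hits zero roughly half a beat period after a
// beat, triggers the prediction of where the next beat should fall.
struct BeatScheduler
{
    long toNextPrediction = 0;  // counts down to the "halfway between beats" point
    long toNextBeat       = 0;  // counts down to the next beat

    // called once per onset detection function (ODF) sample
    bool tick (long beatPeriod /* in ODF samples */)
    {
        bool beatDue = false;

        if (--toNextPrediction <= 0)
        {
            // we are roughly halfway between beats: decide where the next
            // beat should land and when the next prediction should happen
            toNextBeat       = beatPeriod / 2;  // placeholder prediction
            toNextPrediction = beatPeriod;      // halfway point of the following beat
        }

        if (toNextBeat > 0 && --toNextBeat == 0)
            beatDue = true;  // a beat should be reported on this frame

        return beatDue;
    }
};
```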
It's hard to comment fully without seeing your code or an audio example, but it's important to say that beat tracking is a subjective task - your perception that the beat should be tracked twice as fast as the algorithm is doing it is just that... a subjective opinion. This is something we struggle with a lot in the field algorithmically - what is the 'correct' answer? A lot of the time humans don't agree on this, and so some of our algorithmic performance measures try to take this into account.

I would also add that in BTrack, the tempo and beat estimation phases are separate processes. The beat detection phase operates on its own momentum, and the tempo provided to it is more of a 'guide'. It's hard to explain this without some graphs and equations.
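To make the 'forward momentum' idea a little more concrete, here is a minimal sketch of the cumulative-score technique used by this family of beat trackers. It is a general illustration rather than BTrack's actual implementation; the window bounds, the weighting spread (0.2), and alpha are all illustrative assumptions.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// General cumulative-score sketch (not BTrack's code): each new onset
// detection function (ODF) sample is combined with the best recent score,
// where "recent" means roughly one beat period ago. The tempo only guides
// where the tracker looks back; the beats follow the score's own momentum.
double updateCumulativeScore (std::vector<double>& cumulativeScore,
                              double odfSample,
                              int beatPeriod,       // tempo guide, in ODF samples
                              double alpha = 0.9)   // weighting of past momentum
{
    int n = static_cast<int> (cumulativeScore.size());

    // search a window centred roughly one beat period in the past
    int start = std::max (0, n - 2 * beatPeriod);
    int end   = std::max (0, n - beatPeriod / 2);

    double best = 0.0;
    for (int i = start; i < end; ++i)
    {
        // log-Gaussian weighting: favour scores close to one beat period back
        double delta  = std::log (static_cast<double> (n - i) / beatPeriod);
        double weight = std::exp (-0.5 * (delta / 0.2) * (delta / 0.2));
        best = std::max (best, weight * cumulativeScore[i]);
    }

    double score = (1.0 - alpha) * odfSample + alpha * best;
    cumulativeScore.push_back (score);
    return score;
}
```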
If you're happy to provide an example of how you are using the library, I can try to see if there is a code issue to resolve.
One final thing to add - I just remembered that 15 years ago when I wrote this, we calculated the tempo specifically from the time between the beats output by the algorithm, rather than using the 'estimated tempo' provided by the algorithm. I can see this is a bit unclear, but that 'estimated tempo' is just a guide to the beat tracking stage. If you take the time difference between beats, you can calculate the (actual) tempo.
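A minimal sketch of that suggestion: derive the tempo actually being tracked from the time between consecutive reported beats, rather than from the estimated tempo. It assumes the BTrack interface includes processAudioFrame() and beatDueInCurrentFrame(), as referred to in this thread; the surrounding function and variable names are illustrative.

```cpp
#include "BTrack.h"

// Called once per hop-sized block of audio. Each time a beat is reported,
// the tempo actually being followed is recomputed from the time elapsed
// since the previous beat.
void processHop (BTrack& b, double* hopBuffer, int hopSize, double sampleRate,
                 long& samplesSinceLastBeat, double& trackedBpm)
{
    b.processAudioFrame (hopBuffer);
    samplesSinceLastBeat += hopSize;

    if (b.beatDueInCurrentFrame())
    {
        // time between this beat and the previous one, in seconds
        double interBeatSeconds = samplesSinceLastBeat / sampleRate;

        if (interBeatSeconds > 0.0)
            trackedBpm = 60.0 / interBeatSeconds;  // the tempo the beats actually follow

        samplesSinceLastBeat = 0;
    }
}
```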
Thank you for the quick and detailed response! I'll try to put a minimal C++ example together within the next few weeks.
Just to add to this: while using BTrack in a hobby project (a music visualizer) I also noticed this apparently strange behavior - the detected BPM (beats per minute) matched my expectation, but beats were only reported at half that rate. I made a half-assed attempt at understanding the algorithm and bodged in a fix, which seems to work as far as I can tell. Looking back on it now, my gut instinct was to replace the hard-coded sample rate, but I'm not sure that is the proper solution. I understand your argument that the 'correct' beat rate is subjective, but I think that from an API perspective 'beats per minute' and the rate of the reported beats should be consistent with each other.

Thanks for this awesome library ❤️
@adamstark I didn't see all the replies here. As you asked for an example: my project is public on GitHub - check out the LiveBeat repo if you want to see how I implemented it :)
The onset detection function only consumes one hop of the passed-in frame, right? So I'm guessing one is supposed to feed the frame in hop-sized chunks - typically twice per frame, i.e. &frame[0], &frame[hop]?
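For clarity, the following is a sketch of what that question is suggesting, under the assumption (not confirmed in this thread) that each call to processAudioFrame() consumes hopSize samples; the two-argument constructor (hop size, frame size) is taken from the library's documented interface as I understand it. Whether this usage is correct is exactly what is being asked, so treat it as an illustration of the question rather than documented usage.

```cpp
#include "BTrack.h"

int main()
{
    const int hopSize   = 512;
    const int frameSize = 1024;
    BTrack b (hopSize, frameSize);

    double frame[frameSize] = { 0.0 };  // one frame-sized block of audio from the host

    // Feed the frame in hop-sized chunks - typically twice per frame -
    // assuming each call consumes hopSize samples.
    b.processAudioFrame (&frame[0]);
    b.processAudioFrame (&frame[hopSize]);

    return 0;
}
```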
Hi all, thanks for your thoughts - I'll try and clear this up :)

Firstly, just to re-iterate: the BTrack algorithm is really two separate sub-systems...

A) Tempo estimation phase: reports a BPM value in a "tempo octave" of 80-160 BPM.

B) Beat tracking phase: uses the BPM from the tempo estimation phase as a guide, but is otherwise an entirely separate process based on its own momentum and analysis of the incoming data (the onset detection function).

What I'm getting at here is that the BPM value reported by the tempo estimation phase and the rate at which the beat tracker actually reports beats are not guaranteed to be the same thing. In general, it should be made more obvious whether you are getting the "tempo estimation phase" BPM value or the answer to "what tempo is the beat tracker actually tracking beats at?".

@marcuseckert - you make a good point, and I've noticed some ambiguity in the comments around my use of "hop size" and "frame size". To answer your question I'll need to take a deeper look, but that could possibly be the cause of some of the issues here as well.

@CodingGhost - I'll take a look at your approach as I do this work to see whether it matches what the algorithm is expecting or not.

@Ge-lx - the sample rate thing is also something that should be addressed. Working now as a professional audio software developer, I would never do something like hard-code the sample rate. But in research - where I was in 2008-09 when creating BTrack - this is the kind of simplification that gets made, because almost all the test databases are at 44.1 kHz and I was mainly interested in getting good results (rather than in the algorithm working in every use case). Obviously I should fix this.

So, I have some work to do - but that's fun. I'll do my best to do this quickly, but I'm also working on multiple product releases at the moment, so let's see... :)

As a final thought: in my other repositories on GitHub, I'm trying to provide the best algorithm possible for the thing that the repo is trying to do. But this one I'm trying to keep as a bit of a "preserved record" of my work during my PhD. I want people to be able to read the paper and my thesis and for the code to match their expectations about how things work. So if I seem a bit slow to change this repo, that's a little bit by design, so that it matches the written documents out there that I can't change.

Thanks for your patience and I'll report back 😄
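As a side note on the "tempo octave" mentioned above: folding a tempo into the 80-160 BPM range means that a track whose beats are actually being followed at 60 BPM can legitimately be reported as 120 BPM. The following is not BTrack's code, just a tiny sketch of that folding idea.

```cpp
// Fold any BPM into a single "tempo octave" [low, high) by doubling or
// halving. Purely illustrative - not taken from BTrack.
double foldIntoTempoOctave (double bpm, double low = 80.0, double high = 160.0)
{
    if (bpm <= 0.0)
        return bpm;                  // guard against nonsense input

    while (bpm < low)   bpm *= 2.0;  // too slow: report the double-time octave
    while (bpm >= high) bpm /= 2.0;  // too fast: report the half-time octave
    return bpm;
}

// foldIntoTempoOctave (60.0)  -> 120.0  (why 60 BPM beats and a 120 BPM readout can coexist)
// foldIntoTempoOctave (240.0) -> 120.0
```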
The detection works fine and the tempo also gets measured correctly, but the beat event only gets triggered at half the tempo. Is this intentional?
Btw, could you review my PR?