This is a topic in which pitfalls abound, but I think I’ve just about sorted out the various ways to get an accurate idea of a video file’s framerate. I’m using RealBasic for this, but the calls should be pretty generic in any language that talks to QuickTime.
If you poke around a bit, you’ll find that the Movie.Timescale property is “sometimes” the number you’re looking for. In particular, video files generated from Apple applications (FCP, Compressor) seem to have their timescale set to the framerate. However, many other files will have their timescale set to the sample rate of the audio track, or perhaps to something entirely different.
If you can use the timescale, go for it. It’s much faster than the alternative. Note that it may come back as an integer – so 2997 instead of 29.97.
What I’m doing is testing whether the timescale makes reasonable sense in the context of video. I figure I’m not likely to have a video with a framerate higher than 60fps, or lower than 10fps. If the timescale falls outside those bounds, I fall back to an alternative method.
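The sanity check above, combined with the integer-encoding wrinkle (2997 meaning 29.97), might be sketched like this in Python. This is just the arithmetic; the bounds (10–60fps) and the divide-by-100 heuristic are assumptions from the reasoning above, not anything QuickTime guarantees:

```python
def framerate_from_timescale(timescale, low=10.0, high=60.0):
    """Interpret a movie timescale as a framerate, if plausible.

    Some encoders store the framerate directly in the timescale,
    sometimes as an integer with two implied decimal places
    (2997 for 29.97). Returns the framerate, or None if the
    timescale can't plausibly be one.
    """
    rate = float(timescale)
    # A value like 2997 is likely a framerate scaled by 100.
    if rate > high and low <= rate / 100.0 <= high:
        rate /= 100.0
    if low <= rate <= high:
        return rate
    return None  # fall back to measuring frame durations instead
```

Anything that returns `None` here (a 44100 audio sample rate, QuickTime’s common default timescale of 600) gets handed off to the frame-duration method below.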
By using the GetNextInterestingTime QuickTime call – or, in RealBasic, NextInterestingVideoTimeMBS (requires the MonkeyBread plugin) – you can measure the time between frames. NextInterestingVideoTimeMBS returns the current time in the movie and the duration between two “interesting times” (which usually equate to frames). The number you’re most interested in is the duration. By calling NextInterestingVideoTimeMBS a few times, you can verify that the duration of each frame is the same.
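The sampling loop might look like the sketch below. Here `next_interesting_time` is a hypothetical stand-in for the GetNextInterestingTime / NextInterestingVideoTimeMBS call, and five samples is an arbitrary choice:

```python
def measure_frame_duration(next_interesting_time, start=0, samples=5):
    """Step through a few "interesting times" (usually frames) and
    return the per-frame duration in timescale units, or None if it
    isn't constant across the samples.

    `next_interesting_time` is a hypothetical stand-in for the
    QuickTime call: given a movie time, it returns a tuple of
    (time_of_next_frame, duration_of_that_frame).
    """
    durations = []
    time = start
    for _ in range(samples):
        time, duration = next_interesting_time(time)
        durations.append(duration)
    # A constant duration means a single measurement is trustworthy.
    if len(set(durations)) == 1:
        return durations[0]
    return None  # durations vary -- one number won't describe this movie
```

Returning `None` for varying durations is a conservative choice; a variable-framerate file simply doesn’t have a single frames-per-second number.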
Once you’ve got your duration number, you can divide your Timescale by the duration to get the number of frames per second. You can even reset the Timescale to your new value (multiplied by 100) and then use the TimeDuration call to get the length of the movie in seconds.
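The arithmetic in that last step is just unit cancellation: the timescale is units per second, the frame duration is units per frame, and TimeDuration is the whole movie in units. A minimal sketch (plain division, not the actual QuickTime calls):

```python
def fps_from_duration(timescale, frame_duration):
    """(units/second) / (units/frame) = frames/second."""
    return timescale / frame_duration

def movie_length_seconds(time_duration, timescale):
    """TimeDuration is in timescale units; dividing by the
    timescale (units/second) yields seconds."""
    return time_duration / timescale
```

For example, a movie with timescale 600 and a measured frame duration of 20 units works out to 30fps, and a TimeDuration of 18000 units is 30 seconds of video.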
Seems to work pretty reliably on the videos I’ve thrown at it. Anybody have better methods?