To those of you who find timecode perplexing… you might find this history of timecode useful. Kindly written by our chief tech Martin Pedder (ex BBC Broadcasting House) of Boomerang Sounds, drawing on his own experience and various sources.
Black and white American TV ran at 30 fps and was synced to the 60 Hz mains. The movie industry worked at 24 fps, and cinema projectors were also referenced to the mains frequency. There is a 1:2 ratio between 30 fps and 60 Hz and a 2:5 ratio between 24 fps and 60 Hz, so conversion was easy. In the days before crystal-locked sync generators, a synchronous mains motor with the right number of teeth on the cog wheels kept everything running at the right speed.
Then some ‘genius’ decided that the only way to get the colour information into an American TV picture without interfering with the sound carrier was to slow it down a fraction: by a factor of 1000/1001, about 0.1%. The change was small enough that black and white TVs would still display the pictures without modification. 30 fps − 0.1% = 29.97 fps. If they had continued at 30 fps, some TVs would have had problems with the sound and needed a small modification. The decision to avoid annoying some B&W American TV owners in 1953 is still haunting us six decades later.
When it came to syncing picture and sound equipment, there were two ways to operate. You could run everything at 29.97 fps, or carry on working at 30 fps but use a timecode that dropped the occasional frame number to compensate. Neither was a perfect or simple solution, and we are stuck with both. Many music studios preferred working at 30 fps and suffered the drop frame, as it kept the relationship between frames and tempo at simple ratios. The 0.1% pitch difference is imperceptible to many people, but some can hear it clearly.
This meant that an “hour of timecode” at a nominal frame rate of 29.97 frame/s was longer than an hour of wall-clock time by 3.6 seconds, leading to an error of almost a minute and a half over a day, because the timecode was calculated as if the frame rate were exactly 30 frame/s.
To correct this, drop-frame SMPTE timecode was invented. In spite of what the name implies, no video frames are dropped (skipped) in drop-frame timecode. What’s actually being dropped are some of the timecode “labels”. In order to make an hour of timecode match an hour on the clock, drop-frame timecode drops frame numbers 0 and 1 of the first second of every minute, except when the number of minutes is divisible by ten (i.e. when minutes mod 10 equals zero). This achieves an “easy-to-track” drop rate of 18 frames each ten minutes (ten minutes being 18,000 frames at 30 frame/s) and almost perfectly compensates for the difference in rate, leaving a residual timing error of roughly 86.4 milliseconds per day, an error of only 1.0 ppm.
That is, drop frame TC drops 2 frames every minute, except every tenth minute, achieving 30×0.999 = 29.97 frame/s. The error is the difference between 0.999 and 1/1.001 = 0.999000999000999….
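The arithmetic above is easy to check for yourself. A quick sketch in plain Python (no timecode libraries assumed), using exact fractions so no rounding creeps in:

```python
from fractions import Fraction

# NTSC colour runs at exactly 30000/1001 fps, usually written 29.97.
fps_ntsc = Fraction(30000, 1001)

# An "hour of timecode" counted as if the rate were exactly 30 fps
# is 108,000 frames; played at 29.97 fps it outlasts a wall-clock hour.
excess_per_hour = Fraction(108000) / fps_ntsc - 3600
print(float(excess_per_hour))        # 3.6 seconds per hour
print(float(excess_per_hour * 24))   # 86.4 seconds (~1.5 minutes) per day

# Drop-frame keeps 108,000 - 54*2 = 107,892 labels per timecode hour,
# while 29.97 fps actually delivers 107,892.107892... frames per hour.
labels_per_hour = 108000 - 54 * 2
residual_frames = fps_ntsc * 3600 - labels_per_hour
residual_per_day = residual_frames / fps_ntsc * 24
print(float(residual_per_day))       # 0.0864 s, i.e. 86.4 ms per day
```

The residual of 0.0036 s per 3600 s hour is where the 1.0 ppm figure comes from.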
For example, this is the sequence around a minute boundary where frame numbers are dropped: 01:08:59;28, 01:08:59;29, 01:09:00;02, 01:09:00;03. For each tenth minute no numbers are dropped: 01:09:59;28, 01:09:59;29, 01:10:00;00, 01:10:00;01.
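The numbering rule can be sketched in code. Here is a minimal Python function (a hypothetical helper, not any particular product’s API) that turns a running frame count at 29.97 fps into its drop-frame label:

```python
def frames_to_dropframe(frame: int) -> str:
    """Map a 0-based frame count at 29.97 fps to a drop-frame label.

    Labels ;00 and ;01 are skipped in the first second of every minute,
    except minutes divisible by ten.
    """
    fps = 30                          # nominal (label) rate
    per_10min = 10 * 60 * fps - 9 * 2  # 17,982 labels survive each 10 min
    per_min = 60 * fps - 2             # 1,798 labels in a "dropped" minute
    d, m = divmod(frame, per_10min)
    # Add back the skipped labels so plain base-30 arithmetic below
    # lands on the right label.
    frame += 18 * d
    if m > 2:
        frame += 2 * ((m - 2) // per_min)
    ff = frame % fps
    ss = (frame // fps) % 60
    mm = (frame // (fps * 60)) % 60
    hh = frame // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"
```

For instance, `frames_to_dropframe(1800)` returns `"00:01:00;02"` (labels ;00 and ;01 of minute one are skipped), while `frames_to_dropframe(17982)` returns `"00:10:00;00"` because the tenth minute keeps all its labels.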
While non-drop time code is displayed with colons separating the digit pairs—”HH:MM:SS:FF”—drop frame is usually represented with a semi-colon (;) or period (.) as the divider between all the digit pairs—”HH;MM;SS;FF”, “HH.MM.SS.FF”—or just between the seconds and frames—”HH:MM:SS;FF” or “HH:MM:SS.FF”. The period is usually used on VTRs and other devices that don’t have the ability to display a semi-colon.
Drop frame timecode is typically abbreviated as DF and non-drop as NDF.
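Since the separator is the only visual cue, software can infer drop-frame from it. A small sketch of that convention as described above (real products vary, and this helper is illustrative only):

```python
import re

# HH MM SS FF with any of the conventional separators; the captured
# group is the separator before the frames field.
TC_RE = re.compile(r"\d{2}[:;.]\d{2}[:;.]\d{2}([:;.])\d{2}$")

def is_dropframe(tc: str) -> bool:
    """Infer drop-frame from the separator before the frames field:
    ';' or '.' conventionally signal drop-frame, ':' means non-drop."""
    m = TC_RE.match(tc)
    if not m:
        raise ValueError(f"not a timecode: {tc!r}")
    return m.group(1) in ";."
```

So `is_dropframe("01:00:00;00")` is `True`, `is_dropframe("01:00:00:00")` is `False`, and the all-period VTR style `"10.20.30.15"` also reads as drop-frame.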
What has changed?
Crucially, the way that movie makers work in post production has changed. With video processing and animation being such a major part of so many movies, it makes sense to shoot at the same rate at which the post work will happen.
Film was shot at 24 fps. It was converted to 30 fps video but played at 29.97 fps, as that is the speed at which NTSC TV works; i.e. all the post was done at the wrong speed and pitch. The layback happened and the final film was then played at the original speed.
Now film is shot referenced to 23.976 fps, so the conversion to 29.97 fps is a simple 4:5 ratio. Video imaging and processing can then be used at the shoot. Blu-ray and HD TVs can play at the same 23.976 fps at which it was originally shot. The film print may be played slightly fast at 24 fps, but I bet modern projectors can run at the 23.976 speed too. So everything is good, except an hour isn’t an hour.
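Written as exact fractions, the 4:5 relationship is easy to verify; a quick check in plain Python:

```python
from fractions import Fraction

film = Fraction(24000, 1001)   # "23.976" fps
ntsc = Fraction(30000, 1001)   # "29.97" fps

# Both are the whole-number rates pulled down by exactly 1000/1001 (~0.1%).
assert film == 24 * Fraction(1000, 1001)
assert ntsc == 30 * Fraction(1000, 1001)

# So 2:3 pulldown, which lays 4 film frames across 5 video frames,
# is an exact 4:5 ratio with no speed or pitch change at all.
assert ntsc / film == Fraction(5, 4)
```

Because the 1001 denominators cancel, the conversion is exact; the speed fudge only appears if either end insists on a whole-number rate.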
Any old synchronisers that do not offer 23.976 fps will have to run at 24 fps. If they are reading code they just display the timecode numbers they receive, so they should sync OK to 23.976 code. What they can’t do is generate 23.976 code, so even devices as recent as the Universal Slave Driver are not as “universal” as they need to be.
In the UK
We synced our black and white TVs to 25 fps, which is half of our line frequency of 50 Hz. Fortunately, with PAL we did not have that problem when colour TV was launched, and we continued to work at 25 fps. So: no pull-ups or pull-downs, no drop frames, no hours that aren’t actually an hour and, most importantly, no confusion for post engineers and sound-to-picture people working on UK projects. It is a shame that Hollywood didn’t see HDTV, Blu-ray and digital broadcasting as the opportunity to ditch their silly and confusing system.
(Taken from Apple’s Logic Pro 9 User Manual)
Note: In drop frame formats, certain frames are left out. (This follows a regular pattern.) To distinguish between formats, those without dropped frames are sometimes referred to as “nd” or “non drop.”
- 24 fps: Film, high definition video
- 25 fps: PAL video/television broadcasts
- 30 fps (drop frame): NTSC video/television broadcast; rarely used
- 30 fps: High definition video; early black-and-white NTSC video; older rate that is rarely used today
- 29.97 fps (drop frame): NTSC video/television broadcasts
- 29.97 fps: Standard definition NTSC
- 23.976 fps: 24 fps running at 99.9%, which facilitates easier transfer of film to NTSC video
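The “point” rates in the list above are shorthand; the exact values are rationals with denominator 1001. A small reference sketch in Python (the labels are the common informal names, not drawn from any standard document):

```python
from fractions import Fraction

# Exact values behind the nominal rates listed above.
EXACT_RATES = {
    "24":     Fraction(24),
    "23.976": Fraction(24000, 1001),
    "25":     Fraction(25),
    "29.97":  Fraction(30000, 1001),
    "30":     Fraction(30),
}

for label, rate in EXACT_RATES.items():
    print(f"{label:>7} fps is exactly {rate} = {float(rate):.6f}")
```

Keeping the rates as fractions rather than decimals is why 23.976 → 29.97 conversion can be lossless while “an hour isn’t an hour”.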