I remember being taught about SMPTE at university and being told the story of people putting gaffa tape over the first digit when working on TV, as the 10 hours would confuse some visiting clients.
Nice video Christian! Speaking of time... It would be interesting to see your approach to finding the right tempo and managing the 'out of sync' hitpoints when composing to picture :)
Yep, I have to be honest and say that video totally blew my mind with confusion. I always wondered why it started at 10 hours in. I've had one opportunity to write a score for a short film, and I can match the timecode to the frame rate accordingly to progress. But the whole history from analogue to digital is a bit confusing, so more vids on leaders and time sync in general would be great. Lol, maybe I should stop watching historical war documentaries and watch more about timecode sync. Geronimo 🤣
In the days of U-Matic SP and Betacam, starting your program material 10 minutes into a tape avoided hang-ups with pre-roll, as the machines would not engage to record before 00:00:00:00 (hr:min:sec:fr).
Brilliantly presented nerdery as usual! As much as I love the idea of investing (career changing) my life in/to what I love - this being music and the fabrication of which by any and every technological means necessary - is it or can it be a viable investment for A) a passionate team's life and times? B) a third-party of capital stakeholders? And C) the needs and expectations of spouses and families of said passionatas? Naturally, the revenue stream is dependent on the nature and marketability of the work, but so many other variables are at play. Perhaps Christian could put together a "The Business Of Music" (or better titled) video (series?) which could outline the anecdotal pitfalls and successes he has personally had, leading to where he is now. Or perhaps less personal and more textbook with his own curriculum? Love your work Sir. I'd attend that class, just saying.
Fascinating as always Christian. Notice you've got the new SSL control surface there (basically an Avid Artist Mix, isn't it?). Definitely a better choice than the Icon anyway.
UK and EU broadcasters require a programme to start at 10:00:00:00. In the US, the spec is to start at 01:00:00;00; I have no idea why. I almost made the booboo when laying off master tapes when I first started making US programming after years of UK work (back in the old linear days). Thankfully, I read the spec document just in time.
I don’t know how much this video will be loved but suffice to say: I loved it. My first feature was shot on film and, I guess, like you, that’s given me a respect for time and its measurements. *Always remember being given a chase scene to score that crossed a reel change and had to figure out how to stop the music for the reel change and get back in again whilst not letting the drama of the chase down musically. As an aside - reels were always delivered to theatres unwound and had to be rewound before they could be played. This meant that in the early days of film, the easiest place for exhibitors and distributors to splice in promos for coming attractions was at the end of the last reel. These promos would “trail” the main feature, which is where the term “trailer” comes from. Of course, they figured out quickly that nobody was staying to watch the trailers after the main feature ended so moved them to the front of the screening, but the name stuck. **My favourite is Academy Leader. Do you have a favourite? 😀
Excellent video! Thanks, Christian! A little tangential, but I see you have some serious screen real estate these days. What display are you using, and what resolution are you running it at?
This was very helpful, I was just trying to figure out why DaVinci Resolve starts at 1h... Now, let's say I don't use it for TV, it's kinda useless and I can change it to start at zero if I only produce for YouTube for example, right? And thanks so much :)
25 fps = 1/2 of the 50 Hz European AC.
30 fps = 1/2 of the 60 Hz US AC.
TVs, and all editing & broadcasting machines at the TV stations, are plugged into the AC socket. Simple solution to get a reference frequency. 🧐
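A quick sketch of that relationship (Python, illustrative): the mains frequency sets the field rate, and two interlaced fields make one full frame.

```python
MAINS_HZ = {"UK/Europe": 50, "USA": 60}

for region, hz in MAINS_HZ.items():
    # interlaced TV draws one field per mains cycle; two fields = one frame
    print(f"{region}: {hz} Hz mains -> {hz} fields/s -> {hz // 2} fps")

# UK/Europe: 50 Hz mains -> 50 fields/s -> 25 fps
# USA: 60 Hz mains -> 60 fields/s -> 30 fps
```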
I was always told 10 rather than 0 avoided problems with pre-roll, as the machines, if parked before zero, would struggle, attempting to spin all the way back round to zero. Also that, historically, in the live TV days recording started at 10 am.
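The pre-roll problem is easy to demonstrate with a little timecode arithmetic. A minimal sketch (Python, assuming 25 fps non-drop): timecode wraps modulo 24 hours, so a pre-roll from 00:00:00:00 lands on a value the machine reads as nearly a day later, while a pre-roll from 10:00:00:00 is harmless.

```python
FPS = 25  # PAL non-drop, as in the video

def tc_to_frames(h, m, s, f):
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(n):
    n %= 24 * 3600 * FPS                 # timecode wraps at 24:00:00:00
    f, n = n % FPS, n // FPS
    s, n = n % 60, n // 60
    m, h = n % 60, n // 60
    return f"{h:02}:{m:02}:{s:02}:{f:02}"

preroll = 10 * FPS                       # a 10-second pre-roll
print(frames_to_tc(tc_to_frames(10, 0, 0, 0) - preroll))  # 09:59:50:00 -- fine
print(frames_to_tc(tc_to_frames(0, 0, 0, 0) - preroll))   # 23:59:50:00 -- wrapped!
```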
The shift from 30 to 29.97 FPS was due to the "colour subcarrier", which was the solution for making colour television compatible with older b/w TVs. Check th-cam.com/video/ykjyNeuQROU/w-d-xo.html where they explain the colour subcarrier. Check www.sciencedirect.com/topics/computer-science/colour-subcarrier for more details. (Hm? Not sure if my previous comment was removed somehow, hence this one, which is a copy)
OK, so this is weird but I've been wondering for a while about the Spitfire coffee (tea?) mug full of what I assume are pencils? Is that just a lot of pencils? Colored pencils?
Late comment but just thought... Re the splicing of 22-minute reels and not having music at the beginning, doesn't that mean every film ever made would have to have no music playing at the 22-minute, 44-minute, 66-minute, 88-minute marks etc? Thanks Christian, great video
More timecode facts: if you watch any 80s BBC video rushes with burnt-in timecode, e.g. Doctor Who, it blew my mind when it was pointed out that the hours and minutes of the timecode were simply the ACTUAL time of recording!!
Why Does TV Timecode Start At 10 HOURS???... In the UK, that is. Here in the Netherlands it can be 01:00:00:00, 00:00:00:00 or even (and very old school by now...) 00:02:00:00.
God, Christian, I'm meant to be getting on with my assignment at Thinkspace but instead I'm watching you talking about timecode (super interesting by the way). Can you make a video on how to stop procrastinating while composing haha!
It's a bit confusing: often people in the industry say 30fps but really they mean 29.97 (similarly with 24 and 23.98). They are not the same though. I'm not sure about the iPhone, but you can get MediaInfo freeware that will analyse the file and tell you the frame rate... or any video editor software will tell you the source frame rate.
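For the curious, those "point-something" rates are exact ratios: the NTSC-family rates are the nominal rate times 1000/1001, so "23.98" is really 23.976... A two-line check (Python):

```python
for nominal in (24, 30, 60):
    print(f"{nominal} -> {nominal * 1000 / 1001:.5f} fps")
# 24 -> 23.97602 fps   (usually quoted as 23.98 or 23.976)
# 30 -> 29.97003 fps
# 60 -> 59.94006 fps
```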
BTW Christian, this nerdery might interest you too - maybe you even know what's going on here: have you noticed that on HBO the voice sound has a super short slap delay added on the mid-low frequencies? Many movies and movie trailers do the same. But this is not audible on Netflix. I can only understand this as a sound-effect thing since it's on all HBO programs but (I believe) none of Netflix's content. Why, and what? Many years ago I thought it was an interesting effect. Nowadays I mostly get annoyed by it.
@@SOUNTH11composingdesign thanks, I usually ask for MP4 but wasn't sure if that format added an extra empty frame at the front of the file or not, like MP3s do.
Having worked in both film projection and video post, I've seen a lot of variance from the agreed-upon standards, which made things more confusing. At our theater we had a platter system and never cut more than 1 frame off the head of the reel for identification purposes, but we'd often get prints from other theaters where 3, 4, even 5 frames were cut off, creating a huge pop in the audio when screened. And when I worked in post, only our broadcast masters started at Hour 10. The archival masters that we used always started at Hour 1 for both film and television. We also processed telecine transfers of film reels for language dubbing, and while the on-screen BITC always had the hours matching the reels (reel 2 starts at hour 2), the VITC & LTC timecode on the actual tapes started at Hour 1 regardless of which reel it was.
We used to be able (late 90s) to ask our distributor for 'trash reels' to test our projector. We asked them for one and we got reel 2 of Raiders, so we asked them again... Essentially, after about two years we had two working prints of Raiders, a New Hope and somehow almost an entire Indy 3. They were basically just giving away film.
My timecode started at 100 hours, after it says 99:59:59 in this video: th-cam.com/video/HpH9xEpcsI8/w-d-xo.html
Of course, reel 24 starts at 24:00:00 and reel 75 at 75:00:00
This video is way underrated. Informative narration, smooth flow of information and clear communication. Thank you for this video, what a legend!
"they work with feet and frames...that's why we call it footage" MIND BLOWN! Thanks for this fantastic educational content Christian!
This was way, WAY more educational than I thought it would be. I was expecting a bit of trivia, this was an actual lesson in Film and Television 101. Sincerely, thank you for this video!
Love your work as always Christian!
Former projectionist and current dubbing engineer here:
Film reels were around 20 mins because the light was generated by running current through two carbon rods. After 20-25 mins or so they would have burnt down and would no longer be touching, so you would lose the light which necessitated changeovers between two projectors for the next reel.
Then high wattage xenon lamps were developed which could run all day so many cinemas switched to the platter/payout module system you mentioned, or ran three reels per projector and just did one changeover.
Also the leaders would contain the audio for the last few seconds of the previous reel, so in theory you wouldn’t lose the audio at the beginning. But we always removed the leader (along with the first frame of picture) and replaced it with one of our own. So by the time a print had made its way around a bunch of cinemas, god knows how many frames had been removed at each end! To be honest I’ve run many old classics with so many frames missing because a proj wanted to cut out a key frame from a classic scene to keep!
And whilst I’d love to say I have, I can confirm that I’ve never spliced a single frame of pornography into a children’s film, tempting though it was...
Great video Christian.
The reason for 25 frames per second (FPS) for UK TV is that the raster scan of the Cathode Ray Tube (CRT) was driven proportional to the frequency of the electrical supply, which is 50 Hz in the UK. The USA power supply is 60 Hz, hence the frame rate in the USA is 30 FPS.
My first job was as a theatre projectionist. We had a tower system though, where you spliced the reels together onto one large reel on a torque motor, which would feed through the single projector and back onto a second large reel - the solution before platters came along. We regularly had a chap who would come and certify our Dolby status with some fancy boxes, and IIRC the fallback order at that time (late 90s) was Dolby Digital (little squares of digital data), then DTS, then Dolby SR, then I think back to analogue. It was one of the first things I did which gave me a very attuned concept of latency, which at that time, with my work in studios being mostly analogue, was critical.
This is as always a brilliant explanation which has also taught me something at the same time.
Christian yet again going above and beyond helping out the next generation of media composers. Your reply to my comment on the recent video was a top-tier explanation, but THIS is just absolutely fantastic. I'm exceptionally grateful you took the time to produce it. Cheers matey
Back in the day, when recording on analogue multitrack I used to stripe (record) SMPTE time code to the last track of the tape (track 16 in my case). This was an audio representation of the time code and when played back a synchroniser then allowed an Atari ST1040 running Cubase to follow the tape machine (essentially the computer was slaved off the tape). This meant all the midi could be controlled via the PC while the tape was used for analogue sounds (drums, bass, guitars, vocals etc.)
It was a pain in the neck!
I hot-rodded my Tascam 244 to bypass the dbx on track 4 to enable striping. Also with Atari and Cubase. I totally agree, a pain in the neck!
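For anyone wondering what "striping SMPTE" actually put on tape: LTC is an audio-rate signal using biphase-mark encoding, sketched below (Python; `biphase_mark` is a hypothetical helper, not a full LTC generator - a real LTC frame is an 80-bit word carrying BCD-coded hh:mm:ss:ff, user bits and a fixed sync word).

```python
def biphase_mark(bits, samples_per_bit=20, level=0.8):
    """Biphase-mark encode a bit stream, LTC-style: the signal flips at
    every bit-cell boundary, and flips again mid-cell for a '1'. This is
    why LTC stays readable at odd tape speeds, and even in reverse."""
    out, state = [], 1.0
    half = samples_per_bit // 2
    for b in bits:
        state = -state                   # transition at every cell start
        out += [state * level] * half
        if b:
            state = -state               # extra mid-cell transition for a 1
        out += [state * level] * (samples_per_bit - half)
    return out

samples = biphase_mark([1, 0, 1, 1, 0, 0, 1, 0])  # arbitrary demo bits
```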
You are perhaps the only person who can convey information understandably.
Thanks Christian! 😄
true!
Great explanation. Worked in recording studios in the mid 90s and early 2000s. Rarely worked with video; we had a master clock all SMPTE was tied to, ran 29.97 drop frame for some reason (the head engineer's decision), and the SMPTE was always set to the actual time of day (if the tape started recording at 10:18 am the SMPTE time was 10:18, if the tape started recording at 3:15 pm the SMPTE time was 15:15). One of my skills was getting 2 or 3 tape machines to lock together, or Pro Tools and a tape machine to lock up. Had a nightmare session where there was an A800, a 2248 and a Pro Tools rig to lock up and for some reason the SMPTE was different on all three; took me a while, but with two Timeline Lynxes I was able to get it all to work.
The one-frame-off demo reminded me of hearing two tape machines flam when locked up. Haven't heard that in ages.
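That flam is no surprise once you put numbers on it: one frame is tens of milliseconds, well beyond the tiny offsets ears forgive (Python, illustrative):

```python
for fps in (24, 25, 29.97, 30):
    print(f"1 frame at {fps} fps = {1000 / fps:.1f} ms")
# 1 frame at 24 fps = 41.7 ms
# 1 frame at 25 fps = 40.0 ms
# 1 frame at 29.97 fps = 33.4 ms
# 1 frame at 30 fps = 33.3 ms
```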
Thank you so much for creating this content Christian. Always extremely useful and educational. Shout out from South Korea!
Wow! This video was great! Thank You.
There is another interesting thing connected to this topic: the delay between the video and the audio when you play them back from your DAW. There is a different elapsed time before each reaches your eyes and ears. The video is a 'bigger bunch of data' and runs through several conversions before it appears on your monitor. And think about video compression: the system needs to decode the encoded video from its container, which takes time and therefore adds delay to your video playback. And this video thing could be more complex...
The audio is nearly the same but a 'smaller bunch of data', and its delay depends more on the plugins and the delay-compensation process of the DAW that you use than on the compression. Going more deeply into digital circuits, there are a few buffers in the audio and video hardware, which add more delay. And another interesting thing with the audio: if you sit far from your speakers, for example 3.4 m / 11 feet, it gives ~10 ms of delay. To compensate for these things, advanced DAWs have an option to set the delay between video and audio; in Pro Tools, this is "Video Sync Offset...". And this is one of the reasons we use leaders and sync pops today.
(Sadly there are pro companies in the industry who totally disregard this essential point...)
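The ~10 ms figure checks out: sound travels at roughly 343 m/s in air, so listening distance alone adds real latency on top of the decode and buffer delays described above (Python, illustrative):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def acoustic_delay_ms(distance_m):
    """Time for sound to travel from speaker to listener."""
    return distance_m / SPEED_OF_SOUND * 1000.0

print(f"{acoustic_delay_ms(3.4):.1f} ms")  # ~9.9 ms at 3.4 m / 11 ft
# the kind of value you'd fold into Pro Tools' "Video Sync Offset..."
```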
First time on your channel. I thought I heard you say Poirot, but figured I was hearing things until I checked out more of your work. My wife and I are currently watching all Poirot episodes, one per night (...again - this is at least the second or third time going through them all). I will definitely be listening more carefully tonight! This was very educational.
Didn’t think I would watch it till the end, but then it was over too soon :). I found this very informative! Thank you!
Great bunch of knowledge, here, Christian. I kind of miss the days when synchronization (especially for music composition and production) was a much more involved and complicated process than opening a video file in a DAW session. Back when you needed rooms filled with mag machines, 3/4" video decks, time code on tape decks, and synchronizers, it was MUCH easier to charge a lot more for recording studio time. ;) Engineers and studios that had this together could charge a premium. I had Adams Smith equipment in my studio (first with an MCI JH-24, and then a Sony APR-24), and you really needed to crack some manuals to get that all running - and things really got fun when you had to sync 3/4" video with 24 track and an early DAW (or Synclavier). There were no YouTube videos to show you how to do it. ;)
BTW, in my part of the world, it was pretty common for sessions working on TV/radio commercials to start at one hour in.
I first heard of SMPTE around 1980 when Whistle Test played "Baby Snakes" by Frank Zappa (on their long lamented NYE pick of the year) where he rhymes SMPTE with "empty" in his lyric. I recall being intrigued by this, and now over 40 years later I have a concise explanation. Thanks Christian!
"Need to remove any possibility of confusion", and then came DropFrame.... 😂
Fascinating insight into a professional musician workflow. Thanks so much, Christian. I enjoy all of your content.
I am looking forward to learning more.
Christian, thank you, as always, for the diversity of knowledge you spread out on your channel.
Brilliantly explained, edited and researched! Thanks Christian!!!
I used DVCAM and Betacam for around 10 years. The time code usually started at 0, and the footage would start 1 minute in. As a sound editor, I would "punch in" (synced to Pro Tools via machine control) at 55 seconds, giving a 5-second preroll and a 5-second lead-in, so that if the tape got mangled, the picture could still be recovered.
Now, with non-linear encoding, everything can start at 0 and end at the last usable frame.
Fascinating stuff as always. Thank you. Can't wait to see the longer process to come.
That was great! Fascinating stuff. Also you inspired me with that massive tv screen in your setup now. Just bought one and having it shipped. Love how you have it setup with pro tools and the film and logic. So clean 👌🏻
Fascinating! Footage ... of course!!
Wonderful explanation to help crossover information to understand why....many thanks 🙏
nice touch with the beep at the end.. as always great content
Thank you Christian for this video on explaining why SMPTE starts at 10 hours. - Justice Constantine
As usual you have allowed so many the opportunity to learn from your years of experience. Well Done!!!
The 10 hours was a new one for me. I might add the importance of open communication with post or the people in charge of providing you your files. No more Betamax or VHS :)
P.S. I’m dating myself but do you remember using the Knudsen click book to figure out your timings?
Thanks for that bit of info. I was gonna wait 10 hours before I posted.
Any way, Really looking forward to the series breakdown vids!
When I grew up, a major problem was Star Trek episodes being butchered to make time for more "commercial interruptions". Then, as Trek became culturally significant, a market for cut frames developed, accelerating the process. By the 1990s, some stations' reels had whole scenes missing and the episode plots started to make no sense as vital dialogue from famous scenes went missing.
The broadcaster where I grew up wisely broadcast and marketed their episodes as "uncut" in the early 1970s and charged premium rates to advertisers for the 4 minutes(?) of available commercial space. Broadcasters now can show 12 minutes of commercials an hour... maybe 16 in late night. All that "time" has to be chopped out of vintage programming. Now they can digitally speed up a vintage TV show to compensate, using the same technology as the speed-up or slow-down of movies to fit the broadcast windows.
Slight correction at 7:59, (or 10:07:51:20 as it were): A changeover happens by a manually triggered shutter on each machine, synchronously blocking the light source on one machine and unblocking it on the other. Carbon arcs and xenons often do not achieve a steady arc on the first strike.
Fantastic stuff!
TV used to be terrible for not fading up the start of a TV programme and missing first notes etc. It happens less now with digital playout etc.
The use of 48 kHz is easily explained too: it was adopted because it divides into a whole number of samples per frame at 24, 25 and 30 FPS.
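The divisibility point above is easy to verify: at 48 kHz every common frame rate gets a whole number of samples per frame, which 44.1 kHz can't manage at 24 fps (Python):

```python
for rate in (48_000, 44_100):
    for fps in (24, 25, 30):
        print(f"{rate} Hz at {fps} fps -> {rate / fps:g} samples/frame")
# 48000 Hz: 2000, 1920 and 1600 samples/frame -- all whole numbers
# 44100 Hz: 1837.5 at 24 fps -- not a whole number of samples
```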
When I first came to Canada, the sync on the cable TV was often so far out that I couldn't look directly at the screen, I had to use my peripheral vision. I'd been spoiled by the BBC and ITV who were almost never noticeably out of sync. Thankfully digital improved that (but replaced it by delayed shadows caused by blocked data compression).
Timecode was first developed as a means of editing video. In essence, it's an electronic sprocket hole - the analog being film's physical ones. Since film has sprockets, unless a frame is dropped from a scene in the edit bay, physical sprocket holes allow for tight synchronization of sound and picture. The problem with timecode is that it's based on frame rate, of which there is more than one... 30FPS, 29.97, 25, and 24, along with the variants of drop frame and non-drop frame. The window burn on the screen can be taken from a longitudinal timecode track, but in video is most often derived from the VITC (vertical interval time code) that's part of every video frame. I've always seen TC in music start at 1:00:00:00 so as to allow additional sections at the beginning of a song to be added without passing over 'midnight' (0:00:00:00) and thus creating one of the biggest technical kerfuffles ever known. Love your videos, Christian!! Thanks for this one... brings back a lot of memories of working on the audio portion of TV productions... network specials and the like.
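Since drop frame comes up above, here is a minimal sketch of the bookkeeping (Python, illustrative). No frames are ever dropped - frame *numbers* 00 and 01 are skipped at the start of every minute except each tenth minute, which keeps the timecode honest against the wall clock.

```python
def df_tc_to_frames(h, m, s, f):
    """Convert 29.97 drop-frame timecode to a true frame count."""
    minutes = h * 60 + m
    skipped = 2 * (minutes - minutes // 10)   # frame labels skipped so far
    return ((h * 3600 + m * 60 + s) * 30 + f) - skipped

frames = df_tc_to_frames(1, 0, 0, 0)          # one hour of DF timecode
print(frames, frames / (30 * 1000 / 1001))    # 107892 frames, ~3600.00 s
```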
Knew about leaders and 2-pop and such from my days in commercial music and shortform IMAX scores. But the 10 hour TV thing is new to me. Makes good sense.
Great video, but totally obscured by the fact that I now NEED a monitor like that in front of my keyboard! I've been using 2 monitors side by side but now I want that!
Christian, does the new monitor mean you are now on a new Mac Pro? Time for an update to this video???
th-cam.com/video/EouLohgmvrI/w-d-xo.html
It looks like it could be an LG CX
I worked for 20 years in audio post, much of it for network broadcast television here in the US, and we always started at 1 hour... pre-roll audio calibration tones and video calibration charts came before that. Also a little trivia: here in the USA, 30fps was used for black and white television due to the fact that AC electricity alternates at 60 Hz here. I believe 29.97 time code originated to synchronize satellite communications, later to become the standard for color TV... and then drop frame was added to make it conform better to real time. Don't get me started with pull-down and pull-up for working between film and video... drove me crazy. I believe your system is set to 25fps due to the electricity alternating @ 50 Hz...
Amazing info thanks.
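To put numbers on drop frame making timecode "conform better to real time" (Python, illustrative): at 29.97 fps there are fewer real frames in an hour than the 108,000 that 30 fps non-drop timecode labels as an hour, and the drop-frame scheme skips almost exactly that surplus of labels.

```python
ACTUAL_FPS = 30 * 1000 / 1001             # 29.97002997...
real_frames_per_hour = ACTUAL_FPS * 3600  # ~107892.1

surplus_labels = 108_000 - real_frames_per_hour
print(f"non-drop drifts {surplus_labels / ACTUAL_FPS:.1f} s per hour")  # ~3.6 s

skipped = 2 * (60 - 6)                    # 2 labels/min, except every 10th minute
print(skipped)                            # 108 -- matches the surplus almost exactly
```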
There are/were strict standards on the number of cycles the electricity generators had to produce in a given time for that 50 Hz. My brain is fuzzy on the details. As well as televisions, there used to be electric clocks that used the 50 Hz for timing as well as power. Until they switched to the modern rectangular-pin plugs in the late 60s (or was it early 70s?), there was often a special outlet on mantelpieces for people's electric clocks.
29.97 was done to include color: originally it was 30 fps, but to maintain backwards compatibility they had to fudge some details.
Stand-up Maths has a video on it.
I love this channel, you and Guy. I admire your success despite growing up with all the adversity and social mobility challenges of having parents who were national treasures lol
Thanks for that, absolutely fascinating. I didn't even know about feet+frames! I'll look forward to the film scoring behind the scenes as I have an experimental score to write myself this year on an independent feature 🎛️🛠️🎹
Wondering why DaVinci Resolve starts at 1 hour led me here - thanks for the amazing video.
(My srt file didn't work because of the time code)
My takeaway from this is that it's only needed for professional productions and analog media.
(I also can't figure out why Resolve is doing this, because the exported video files don't seem to have any leaders - only the subtitles)
And then to confuse matters, trails for live tx mix it up and start at 20:00:00:00 whilst live VT pictures start at N0:09:59:00
Fun topic.
The difference between 30 and 29.97 is actually pretty crucial.
If there's a mismatch between the fps of the video and the decoding device, you'll get a stuttering image when the camera is panning.
As you'll see, YouTube doesn't deal with this well. Or at all, actually, it seems.
no. what you're seeing is a symptom of poor frame-rate conversion, a failure of motion-interpolation. the difference between 29.97 & 30 is less than a tenth of a percent.
to play 24 fps material in the 29.97/30fps world, a technique called 2:3 pulldown is used to play every other frame three times instead of twice.
to play 24fps material in a 25 frame world, very often the practice is to varispeed it by 4% & pitch-shift the audio. as a fan of the 'x files' back in the day, I had noticed glitching in the music bed, & one day got the chance to ask the guy who actually dubbed the show (he was backstage at a pj harvey gig... long story.) it's also why you might see different run-times for the same cut of a movie.
the shortcomings of youtube, especially a few years ago, are something else again. frame-rate is actually a lot less important & it's left to the embedded media player in your browser to deal with, switching rates & resolutions almost seamlessly. but that image stabiliser thing..... my eyes!
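A minimal sketch of the 2:3 cadence described above (Python, illustrative): four film frames spread across ten video fields, i.e. five 29.97/30 video frames.

```python
def pulldown_2_3(film_frames):
    """24 fps film -> 60i fields: frames alternately fill 2 fields, then 3."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)
    return fields

print(pulldown_2_3(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D'] -- 10 fields = 5 frames
```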
@@duncan-rmi Actually, in the UK (and probably elsewhere where 25 fps is used for television), films are indeed just shown at 25 fps, and many a time they don't even use a 4% pitch shift. No need. All it does mean, as you pointed out, is that 24fps material's run times are 4% shorter than the "actual" times when shown this way. You can often tell if you have an instrument that is nominally in tune to 440 Hz: it sounds slightly flat if you play along with the music in the film. And if you listen carefully, you can also hear the (very) slight "chipmunk" effect on people's speaking voices.
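The size of that effect is easy to quantify (Python, illustrative): the 24-to-25 speed-up raises pitch by about 0.7 semitones - enough that a properly tuned instrument sounds flat against the film - and trims run times by 4%.

```python
import math

speed = 25 / 24                                         # PAL speed-up
print(f"+{12 * math.log2(speed):.2f} semitones")        # +0.71
print(f"A440 plays back at {440 * speed:.1f} Hz")       # 458.3 Hz
print(f"a 100-minute film runs {100 / speed:.1f} min")  # 96.0 min
```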
@@duncan-rmi Many thanks Duncan!
As I learned it, even the very small difference between 29.97 and 30 fps has an impact and causes stutter on panning images. I might be poorly informed?
But the main problem is probably, as you say, the media players themselves.
Compare watching a Netflix movie/series in Google Chrome or Safari and the massive stutter you'll get, and then watch it on an Apple TV that runs smooth.
And if so, why Google Chrome has such poor conversion of YouTube videos is quite a mystery.
Anyway thanks for your comment.
@@cornerliston oh, I could talk all day about this.... how long have you got? 😂
I worked for viacom for many years, & one of the things I had to deal with a lot was 'localisation' of programming that originated in far-off NTSC land.
a lot of people will think that this just means standards-conversion, or changing 29.97 (59.94 fields per second) into 25 (50 fields per second), & that is indeed the part where the sorts of things you mention can creep in. poor motion interpolation is especially obvious on sideways motion like pans & tracks.
it gets far far worse when you attempt a field-based estimation of the motion (in order to change the FPS) on material that's been created as whole frames, like animations or film- the machinery sees the two "fields" of the original material as identical & so the interpolation is jerky; it simply has less data to work with.
many operators don't know that the machine can base its interpolation on frames instead of fields (which latter mode you would use with material that originated as video rather than frames of film). add to that the fact that most animators still work at 12FPS because they always have, because it's less drawing... less rendering....
localisation is also often going to mean replacing principal dialogue with a new language, re-editing scenes for regulatory reasons, changing the music beds because of publishing rights in different territories (there's a song that's central to the plot of the very first episode of 'spongebob squarepants' that prevents the episode airing in its original form outside the US). I discovered that many of nickelodeon's animated titles were being converted to 29.97FPS (from 12 or 24) *before* any of the audio work was done. ideally you'd re-do this stage against a fresh 24FPS copy of the image for overseas sales.... but it wasn't possible, I was told, to rework the workflow so that we got the right audio on the best possible transfers from the film originals (which would've stayed at 24FPS if I'd got my way). too expensive. 🤨 & also they'd lost all the paperwork because they didn't think they'd ever be recutting 'rugrats', yadda yadda.
what tended to happen in the early days of internet video services is that we would get a request from itunes' UK operation for some (say) nickelodeon content that started out as 12FPS, got converted to 29.97 for the domestic US tv market, then badly converted for tv use in 25FPS television abroad, by operators making the mistakes described above... so we got complaints from itunes-UK about stuff that we'd got away with, even though we knew it looked crap on telly, because itunes was a newcomer to streaming video & wanted to make the best impression.
that's how, as an engineering manager for viacom, I got to digging around into these processes.
fast-forward a few years from that, & I'm working for a VoD outfit a bit like netflix, in madrid. there, I inherited more of the same. the CTO was convinced, even though this was an online operation feeding tablets, phones & smart tvs, that everything should be converted to 25FPS because we were "in a 25FPS territory". we aren't, any more. these days even a normal telly doesn't care what frame rate you chuck at it, within reason.
*there is absolutely no need to change the frame rate from its original for streaming services*
the problem is that a lot of the localisation is already done against bad copies that were sold to tv networks; these were the versions that were sent to the streaming & VoD operators, often as badly encoded files.
so, in short, the problem is that the content has been messed about with already, & it isn't viable- economically- to rework it all against the original image sequences. far too much re-editing would have to be done... colour grading, audio work... only the very high ticket material gets this treatment. disney are among the big studios whose pride in their own product is exemplary in a technical sense. viacom were way behind when I left, ten years ago, & it was one of the reasons I couldn't stay.
@@duncan-rmi Again, many thanks! Good to have answers from people with tons of experience in certain topics.
Although I still find this issue a bit unsolved, isn't it? In technical terms, I mean.
If the media device handles the conversion, then it's probably just that: we need better conversion software?
Since YouTube has this issue on all content and Google owns that technology alongside Chrome, they should be able to solve this pretty easily?
But compare with another big company - you mention iTunes.
Apple TV watched on an Apple TV box works great. But when watching on a computer, their software seems to do the job not so well, to put it kindly.
Why can't even the companies that develop their own technology be consistent across several media?
Thanks for sharing, got into problems using a DAW that can't set the offset to 10h. Spent a lot of time trying to get the first piano drop film in sync time and frames. Repainted the cabin inside instead. At every 10th plank I think of you.
Hey Christian!
I recently worked with a small production, and the guy thought burning the timecode in was a waste of time, and straight out refused to do it. He later complained that he was struggling to sync the music up in his project and was asking for a whole heap of unnecessary shit, causing headaches for the both of us because of his refusal to take my advice in the first place.
I'm trying to think how to handle this better, should I have refused to work without the timecode and risk losing the job? Tried to educate him more on why it should be done?
It reflected a level of unprofessionalism and arrogance I hadn't come across before. It was a simple task, but it meant him making another export of the movie, which he didn't want to do.
I didn't pull the "I told you so" card, I don't think I needed to, but the whole situation was a complete pain.
What would you have done here?
I think you must always expect varying degrees of proficiency in productions. Sometimes you'll have a less experienced assistant editor who will always send things differently, etc. Be safe in the knowledge that professionals would expect leaders, BITC and split audio tracks; you're not being a diva. You then return the favour by giving the director/producer/editor what they need to do their work with your work. It's a collaborative process, and if people refuse to collaborate on a technical level then they're not good collaborators. Worth avoiding if you can in future.
@@TheCrowHillCo Thanks Christian! - It was very frustrating, and I shuddered at the talk of "we're shooting the second film in the summer". If he comes back, I'll likely turn it down. I love my job, and people making things unnecessarily difficult makes you love it just a little bit less at the time!
The 10-hour offset is used in Britain, and in Canada only the CBC uses the same offset. Everything else I have worked on from other Canadian broadcasters, including all US-based productions, uses a 1-hour offset. In fact, I believe the SMPTE standard is the 1-hour offset, not the 10-hour one. Interestingly enough, SMPTE timecode was developed by Leo O'Donnell while he was working for the National Film Board of Canada in the 1960s.
That was entirely fascinating. 🙏
This was really refreshing mate. All the best
On the subject of the physical film medium (I'm just assuming standard vertical 35mm, because that's the normal delivery medium that's relatively easily available), there's also the fun with the current digital audio format of choice (mainly Dolby Digital), where the digital audio data is encoded in the space between the sprocket holes. This means the overpriced Dolby reader head and processing box has to decode that digital signal in its DSP architecture and output some sensible signal.
Did I also mention that the Dolby head is about half a foot behind the lens, for added fun times? And then, for backup purposes, there's of course the analogue soundtrack on the film, which is read and compared in case any serious digital data is missing (quite common, as you often have to remove frames that have suffered heavy use and splicing).
We also have to remember that picture film (unlike 35mm audio film or tape) is not pulled through linearly but intermittently, which, as an aside, also puts a lot of stress on the film due to the massive rate of change of momentum.
another lovely vid, thanks Christian :)
We still have 2 reel projectors at my job. We mainly use digital now (DCPs) though.
Wednesday school, check :) Thank you, sir. Loved it! So useful!
This was amazing. More!
"Time doesn't travel in two directions..." *Tenet would like to know your location*
I remember being taught about SMPTE at university and being told the story of people putting gaffa over the first number when working on TV, as the 10 hours would confuse some visiting clients.
Nice break down. Best regards from across the pond.
Brilliant video Christian! Would be really interested in hearing some insights on your work for Fresh Meat. The music in that show was amazing!!
Very interesting film info here !!!
Thank you once again Christian
A wealth of information thanks!
Nice video Christian! Speaking of time... It would be interesting to see your approach to finding the right tempo and managing the """out of sync""" hitpoints when composing to picture :)
Yep, I have to be honest and say that video totally blew my mind with confusion. I always wondered why it started 10 hours in. I've had one opportunity to write a score for a short film, and I could match the timecode to the frame rate to progress. But the whole history from analogue to digital is a bit confusing. So more vids on leaders and time sync in general would be great. Lol, maybe I should stop watching historical war documentaries and watch more about timecode sync. Jeronimo 🤣
In the days of U-Matic SP and Betacam, starting your program material 10 minutes into a tape avoided hang-ups with pre-roll, as the machines would not engage to record before 00:00:00:00 (hr:min:sec:fr).
Brilliantly presented nerdery as usual! As much as I love the idea of investing (career-changing) my life into what I love, namely music and the fabrication of it by any and every technological means necessary, is it or can it be a viable investment for A) a passionate team's life and times, B) a third party of capital stakeholders, and C) the needs and expectations of the spouses and families of said passionatas? Naturally, the revenue stream depends on the nature and marketability of the work, but so many other variables are at play. Perhaps Christian could put together a "The Business of Music" (or better-titled) video (series?) outlining the anecdotal pitfalls and successes he has personally had, leading to where he is now. Or perhaps less personal and more textbook, with his own curriculum?
Love your work Sir. I'd attend that class, Just saying.
SMPTE timecode is in this format: *Hours : Minutes : Seconds : Frames*
Here, 1400:00:00:00 is read as *1400 hours 0 minutes 0 seconds 0 frames*
1400 hours?! That only exists on studio.youtube.com !!!
Timestamps are only allowed up to 99:59:59
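For the curious, the arithmetic behind that format is simple. Here's a minimal sketch (non-drop timecode with an integer frame rate only; the function name and the 25FPS default are just my assumptions for illustration):

```python
def to_timecode(frame_count: int, fps: int = 25) -> str:
    """Render a frame count as non-drop HH:MM:SS:FF timecode."""
    ff = frame_count % fps
    ss = (frame_count // fps) % 60
    mm = (frame_count // (fps * 60)) % 60
    hh = frame_count // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(to_timecode(900000))   # 10:00:00:00 -- a UK-style programme start
```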
Fascinating as always Christian. Notice you've got the new SSL control surface there (basically an Avid Artist Mix, isn't it?). Definitely a better choice than the Icon anyway.
UK and EU broadcasters require a programme start at 10:00:00:00. In the US, the spec is to start at 01:00:00;00; I have no idea why. I almost made the booboo when laying off master tapes when I first started making US programming after years of UK work (back in the old linear days). Thankfully, I read the spec document just in time.
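As an aside, the semicolon in 01:00:00;00 is the conventional marker for drop-frame timecode. Here's a hedged sketch of the commonly published drop-frame renumbering at 29.97FPS (the function name and layout are mine, not from any spec document):

```python
def to_dropframe(fc: int) -> str:
    """29.97 drop-frame: skip frame numbers 00 and 01 each minute,
    except every tenth minute, so timecode tracks wall-clock time."""
    tens, rem = divmod(fc, 17982)        # 17982 frames per 10 minutes
    fc += 18 * tens                      # 2 frames x 9 dropped minutes
    if rem > 1:
        fc += 2 * ((rem - 2) // 1798)    # 1798 frames per dropped minute
    return (f"{fc // 108000:02d}:{fc // 1800 % 60:02d}:"
            f"{fc // 30 % 60:02d};{fc % 30:02d}")

print(to_dropframe(1800))   # 00:01:00;02 -- ;00 and ;01 were skipped
```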
Lovin the thumbnail!!
Amazing stuff, thank you!
loving the new monitor! :)
This was so helpful. Thank you!
I don’t know how much this video will be loved but suffice to say: I loved it. My first feature was shot on film and, I guess, like you, that’s given me a respect for time and its measurements. *I always remember being given a chase scene to score that crossed a reel change, and having to figure out how to stop the music for the reel change and get back in again whilst not letting the drama of the chase down musically.
As an aside - reels were always delivered to theatres unwound and had to be rewound before they could be played. This meant that in the early days of film, the easiest place for exhibitors and distributors to splice in promos for coming attractions was at the end of the last reel. These promos would “trail” the main feature which is where the term “trailer” comes from. Of course, they figured out quickly that nobody was staying to watch the trailers after the main feature ended so moved them to the front of the screening, but the name stuck.
**My favourite is Academy Leader. Do you have a favourite? 😀
Excellent video! Thanks, Christian! A little tangential, but I see you have some serious screen real estate these days. What display are you using, and what resolution are you running it at?
This was very helpful, I was just trying to figure out why DaVinci Resolve starts at 1h... Now, let's say I don't use it for TV: it's kinda useless, and I can change it to start at zero if I only produce for YouTube, for example, right? And thanks so much :)
25fps = 1/2 of the 50Hz European AC mains.
30fps = 1/2 of the 60Hz US AC mains.
TVs, and all the editing & broadcasting machines at the TV stations, are plugged into the AC supply.
A simple solution to get a reference frequency.
🧐
And, in old-school interlaced video, each frame is made up of two fields. One per cycle of the mains.
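A trivial sketch of that arithmetic, for completeness (nothing here beyond the numbers in the comments above):

```python
# Two interlaced fields make one frame; one field per mains cycle.
for region, mains_hz in {"UK/EU": 50, "US": 60}.items():
    print(f"{region}: {mains_hz} fields/s -> {mains_hz // 2} frames/s")
```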
I was always told 10 rather than 0 avoided problems with pre-roll: the machines, if parked before zero, would struggle, attempting to spin all the way back to zero. Also, historically, in the live TV days, recording started at 10 am.
Oh, and in the early VT days we started line-up at 09:57:00:00 so we could cut the ends of the tape off as it got ragged.
This was a fantastic video. Thank you! Why in Pro Tools is the standard in film to begin at 1 hour? Would 10 be too many?
Has SMPTE changed to accommodate 4K, 120FPS, etc.? Is there a new timecoding method in the offing?
This was fantastic
Very interesting, thanks!
Yay‼️ Nerdy tidbits 🎯
The shift from 30 to 29.97 FPS was due to the "colour subcarrier", which was a solution for making colour television compatible with older b/w TVs.
Check th-cam.com/video/ykjyNeuQROU/w-d-xo.html where they explain the color subcarrier.
Check www.sciencedirect.com/topics/computer-science/colour-subcarrier for more details.
(Hm? Not sure if my previous comment was removed somehow, hence this one, which is a copy)
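For anyone who wants the actual numbers rather than the links: the published NTSC colour figures reduce to a one-liner (the sketch below just does the arithmetic):

```python
# Colour NTSC pulled the line rate down from 15750 Hz by 1000/1001
# to keep the colour subcarrier clear of the sound carrier; at 525
# lines per frame that drags the frame rate from 30 to ~29.97.
line_rate = 15750 * 1000 / 1001       # Hz
print(line_rate / 525)                # 29.97002997...
print(30 * 1000 / 1001)               # the usual shorthand, same value
```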
OK, so this is weird but I've been wondering for a while about the Spitfire coffee (tea?) mug full of what I assume are pencils? Is that just a lot of pencils? Colored pencils?
I used to stay down in London a lot. There’s a chain of hotels (Firmdale) that give you a pencil every time you stay.
Late comment, but just a thought... Re the splicing of 22-minute reels and not having music at the beginning: doesn't that mean every film ever made would have to have no music playing at around 22 minutes, 44 minutes, 66 minutes, 88 minutes, etc.?
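For context on where those ~22 minutes come from: 35mm 4-perf film runs 16 frames to the foot, so a standard 2000ft reel at 24FPS lasts just over 22 minutes. A quick sanity check, assuming those standard figures:

```python
frames = 2000 * 16        # 2000 ft reel, 16 frames per foot on 35mm
seconds = frames / 24     # projected at 24 fps
print(seconds / 60)       # ~22.2 minutes per reel
```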
Thanks Christian, Great video
More Timecode facts: if you watch any 80s BBC video rushes with burnt in timecode e.g. Doctor Who, it blew my mind when it was pointed out that the hours and minutes of the timecode were simply the ACTUAL time of recording!!
Why Does TV Timecode Start At 10 HOURS???... In the UK, that is. Here in the Netherlands it can be 01:00:00:00, 00:00:00:00, or even (very old school by now) 00:02:00:00.
that is one looooong iMessage you got there, Christian
Your channel has been an incredible resource for myself, as I transition to doing more film/TV work. Thank you Christian and team!
Love the CGI examples! Haha 🤣
hidden roast
@@christophpawlowskimusic213 Instantly laughed, by far the two worst CGI examples going haha 😂
God Christian, I'm meant to be getting on with my assignment at Thinkspace, but instead I'm watching you talking about timecode (super interesting, by the way). Can you make a video on how to stop procrastinating while composing haha!
Bravo!
"... a visual cue..." (cleans spectacles) :-)
Does anyone have more info about 29.97 and 30fps? Are they the same? Does my iPhone shoot at 30fps or 29.97fps?
It's a bit confusing: people in the industry often say 30fps when they really mean 29.97 (and similarly 24 when they mean 23.98). They are not the same, though. I'm not sure about the iPhone, but you can get the MediaInfo freeware, which will analyse the file and tell you the frame rate... or any video editor will tell you the source frame rate.
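To put a number on the difference between the two rates (plain arithmetic, nothing vendor-specific):

```python
# Frames produced in one hour of wall-clock time at each rate.
nominal = 30 * 3600               # 108000 frames at exactly 30 fps
actual = 30000 / 1001 * 3600      # ~107892.1 frames at 29.97
print(nominal - actual)           # ~108 frames fewer per hour
print((nominal - actual) / 30)    # ~3.6 s of timecode drift per hour
```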
Zappa fans know what SMPTE stands for!
Baybeeee snaaaaaakes!
Really interesting video! But our eyes do not see at “roughly 24 frames per second”. They are way more sensitive than that.
Our eyes do not only see at 24fps! Otherwise you couldn't tell the difference between a 60Hz monitor and a higher-refresh-rate one, which you can.
Mine do.
It was a compromise between the rate at which we perceive frames as "moving" and the cost of film.
You do only see at 24 fps but you feel interactions at a far higher rate.
Can I ask what Monitor you’re using there? And where’s the iMac? 😊
BTW Christian, this nerdery might interest you too; maybe you even know what's going on here:
Have you noticed that on HBO the dialogue has a super short slap delay added in the mid-low frequencies? Many movies and movie trailers do the same thing.
But it's not audible on Netflix.
I can only understand it as a deliberate sound-effect choice, since it's on all HBO programmes but (I believe) none of Netflix's content.
Why, and what is it?
Many years ago I thought it was an interesting effect. Nowadays I mostly just get annoyed by it.
Simple question, but what video file format do you usually ask for to put in your DAW?
MP4 or MOV.
@@SOUNTH11composingdesign Thanks, I usually ask for MP4 but wasn't sure if that format adds an extra empty frame at the front of the file or not, like MP3s do.
@@JuanAMatos-zx4ub No format adds empty frames. That would be bad.
@@SOUNTH11composingdesign Thanks, just wanted to make sure. MP3s add a tiny amount of empty space at the front; that's why I asked for clarification.
@@JuanAMatos-zx4ub Never tried this.
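If you ever want to check whether a delivered file really starts at zero, FFmpeg's ffprobe can report the container's start_time. A small sketch, assuming ffprobe is installed and using a hypothetical filename:

```python
import subprocess

def container_start_time(path: str) -> str:
    """Ask ffprobe for the container's start_time ('0.000000' = no offset)."""
    result = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "format=start_time",
         "-of", "default=noprint_wrappers=1:nokey=1",
         path],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

print(container_start_time("clip.mp4"))   # hypothetical file
```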
th-cam.com/video/6tx6JTUeQ0Q/w-d-xo.html
The 2 all-time best examples of CGI there.
Ha ha that was Robbie’s choice!