Well done, John! Because I WANT TO SEE image artifacts and TV flaws, I just built a 267-line, 15 FPS progressive video system to show what Philo Farnsworth *might* have seen in his lab in the late '20s and early '30s. Ultimately, this project will add an actual *image dissector* tube camera section! WARNING! I make really bad videos!!! th-cam.com/video/zcUuoyxSJLY/w-d-xo.html
@Frank Olsen Hi Frank... you obviously don't know how to communicate with people, so any effort to interpret the nonsense you wrote would be a complete waste of my valuable time. Goodbye and I hope you have a shitty day :)
@@FilmmakerIQ - Thank you kindly sir. I appreciate the ride on your shirt tail and the class and kindness you have shown to me and my channel here and in the past. I promise not to abuse your good nature. I got another hundred subscribers too. Did I say Thank You? Again and again! Your videos are ABSOLUTELY THE BEST ON THE SUBJECT BAR NONE! I suffer severe envy at the quality of your productions. Have a great day! Now I have to go and keep making something GREAT!
@Frank Olsen The amount of insults here was rather unnecessary. If you do have valuable information and you want to correct someone then at least tone it down with the language. As for the information, thank you.
There are several other reasons the Twilight Zone "videotape episodes" look strange. One is where they were shot: CBS Television City has smaller soundstages and very different lighting than MGM, where the film episodes were produced, and many scenes had to be compressed to fit onto the CBS-TC soundstages, giving them a more claustrophobic look. Also, the cameras used were RCA TK-11s based on 3" Image Orthicon tubes; CBS later used the Marconi MK-VI camera, which used 4.5" Image Orthicon tubes and thus had higher resolution. Lastly, most of those shows were done pretty much "live to tape," with the TD selecting shots on the fly, as the freedom of editing in film did not yet exist for videotape.
Further to my previous comment, I remember back in the days of camcorders (1980s) I took some video of a thunderstorm, and caught some lightning. When playing it back I could see the lightning at normal speed, but when I tried to pause it, it wasn't there. When you pause a VHS it only displays one field, and the lightning was on the opposite field. It was at that point I realized that 50i does actually record 50 different images per second.
Thank you for explaining this! Too many people think 60fps is a new invention, but pretty much anything that was live or taped and wasn't shot on film was shown at that high frame rate until digital cameras blew up in the 2000s. Sadly, the Internet has done a terrible job archiving old recorded content and so most old TV clips on TH-cam are in 30fps. There are even a lot of newscasts and talk shows that broadcast at 60fps but upload clips online at 30fps, with Jimmy Kimmel being one exception
Yeah. Years ago I was confused with the "30fps" nonsense. My analog video captures looked so choppy. Now I capture at the proper 59.94 and I don't throw away fields.
Daniel - I think that's one of the most accurate recreations of the 480i experience I've seen. And I'm saying that as someone who would have watched a video like that very closely to dissect it when it was released (because that's when I started my career and would have wanted to make something just like that). Only thing I would say is it's a tad soft, but that might have been the deinterlacer.
@@FilmmakerIQ sorry, I moved my channel from a personal channel to a brand channel, so my reply got deleted in the process. So here is me remaking the reply: Here is how I preserve interlaced content for TH-cam: th-cam.com/video/m_cIjk5IiPo/w-d-xo.html [Filmmaker IQ's reply] Yeah. It's probably soft because it used a LaserDisc source. Also, the upscaler was the one built in to Sony Movie Studio, so it's probably not perfect. Finally, here I compare how 480i content looks natively on a CRT to how it looks with various deinterlacers (and I got a better upscaler, nnedi3_rpow2(factor = 4)): th-cam.com/video/talRdXXNzyI/w-d-xo.html
I actually work in a part of the industry where 60i is still heavily used - live events and presentations. Simple reason: 1080i carries further and more stably over SDI than 60p due to bandwidth, and the runs to projectors and LED displays are usually very long. And you can't cut live video to 30p, because it's just enough less smooth for people to notice the difference.
Thanks! It’s great to hear someone who knows what they’re talking about. Before I became a technical writer I worked in and taught video production. People insisting that 1080p means 1080 pixels reminds me of people who think that the term “pulldown” has something to do with a mathematical conversion. I’ve read/heard people say the frame rate was “pulled down”! Even people who have lots of experience can be operating with a poor understanding of the technology. I once had a protracted “discussion” with a TV engineer turned pro video equipment sales guy who thought component video was digital.
60i is not really effectively 30p. It's more of a 60p with half the resolution, which is what they originally wanted. That gives you the illusion of 30 full frames a second, but each of the two 1/60 pictures is a new frame.
Great video on interlacing, good to see. You and Technology Connections have presented the most info for this topic, which has sadly been lacking on youtube.
I've been watching your videos for about three years and I admire and love the way you present often complex information in a clear, easily understandable and entertaining way! As a teacher, I can only hope my lessons are as good as yours, congratulations!
Thank you John. I don't know how often you hear this, but I really appreciate the amount of research that goes into each episode you deliver. Your channel educates me in a way that many do not. I love to learn, but appreciate it most when the information is presented clearly and concisely. Bottom line, Filmmaker IQ makes me smarter !!
I tried to explain combing to a student a while back and I had to look on the internet to find an example. Although I don't see it anymore, I always prefer a monitor to a TV so I can see the fields if it ever does come up again. I'm going to show this to my editing classes from now on. I already use the 24fps video. Thanks John!
I'm an old dude, and I remembered that I first saw 'combing' artifacts when some SD tv shows were transferred to DVD. Babylon 5 comes to mind. I might have had the DVD player going to the Trinitron in progressive connection, too. Anyway, thanks for the knowledge!
Such a great video. People's lack of understanding of interlacing has also niggled me. In particular, watching mismatched field-order footage in news reports, shot on video by bystanders, is to me like nails down a blackboard.
The main benefit of the PAL system (Phase Alternating Line) was that reducing the UHF signal bandwidth lessened the atmospheric effects on the broadcast and gave better colour stability. Indeed, my college tutor (for electronics, radio and TV theory) said that the NTSC system was unkindly dubbed 'Never Twice The Same Colour'.
This is why any video enthusiast has to keep a good CRT around to enjoy "legacy" content properly, which will never look right on flat panels. A few of Sony's OLED professional monitors were able to do a trick to represent "true interlace", it doesn't look too bad but it's still not the same.
Interlacing was a trend but that stupid comment with 195 likes really put me over the edge haha! Now y'all have to wait for me to get to the part where I'm talking about how you can't see 144 frames per second..
Just Lurrrrrve your work. Was going to say "Ignore these Heathens" but then we would not have got this Brilliant piece from you had you done so. So roll on the idiots!!!!! Cheers & Thanks@@FilmmakerIQ
@@pbthevlogs5561 was going to comment the same thing.. plus the great reference to The Dunning-Kruger Effect, I spit out coffee when he said that, sooo true, but funny as hell
A shoutout to the Editors and Motion Graphic Artists for this video! 17+ minute video and there's a review process. I understand how much credit you truly deserve!
A good presentation. Interlacing is still in common use for broadcasting, especially for “Freesat” and “Freeview” on this side of the pond. In this case it’s 1920x1080i @25 fps. No doubt it keeps the bandwidth down to reasonable levels for transmission overground or via the satellites, weather conditions etc. Interlacing was also used for HDV, the format used by tape cameras such as an older Canon XH-A1 which I still have; it was then possible to record about 1 hour on a miniDV tape cassette at 1440x1080i @25 fps (with the 1440 pixels being anamorphic, stretched to 1920 in post-production). All ended up being progressive when burnt onto DVD or BD.
For anyone going "TL;DR": 480/60i is not the same as 480/30p or 240/30p. It's zero full frames per second. It's 60 half-frames per second. Those 60 half-frames are not drawn from a hypothetical 30 full frames; each half-frame is a new image. So it's like 60fps if you turned off half the scanlines each frame. That's it. You can probably compare it to 240/60p on a frame-by-frame level, but there is much more detail in two combined fields of 480i than in two static frames of 240p, as the second field draws the other half of the screen. However, those two combined fields will look slightly misaligned if there is action in them, because the second field was sampled a sixtieth of a second later.
@@gamecubeplayer but the original content will never have 60 frames per second, not unless the frames are run twice as fast as they're supposed to play. The video will always be 30 new frames per second, displaying the same image across two half-frames, whereas true 60i is 60 new half-frames per second. I wonder about the 1080i, because wouldn't that require the 480 to be upscaled non-1:1, as it would need to be a 540 base image to be 1:1?
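To make the distinction in this thread concrete, here is a toy sketch in plain Python (the scene model and 8-line "frames" are my own invented illustration, not anything from the video): in true 60i the two fields are sampled 1/60 s apart, so weaving them produces combing on motion, while 30p carried in 60i weaves back into a clean frame.

```python
# Toy model: a "frame" is a list of 8 scanlines; each scanline just
# records the horizontal position of a moving object when sampled.

def scene(t):
    """Full frame at time t (in 1/60 s units): object sits at x = t."""
    return [t for _line in range(8)]

def field(frame, parity):
    """Extract every other scanline (parity 0 = even lines, 1 = odd)."""
    return frame[parity::2]

def weave(even, odd):
    """Interleave two fields back into one full-height frame."""
    out = []
    for e, o in zip(even, odd):
        out += [e, o]
    return out

# True 60i: the two fields are sampled 1/60 s apart -> combing on motion.
interlaced = weave(field(scene(0), 0), field(scene(1), 1))

# 30p-in-60i: both fields come from the same instant -> clean weave.
progressive = weave(field(scene(0), 0), field(scene(0), 1))

combing = sum(1 for a, b in zip(interlaced, interlaced[1:]) if a != b)
print(interlaced)   # [0, 1, 0, 1, 0, 1, 0, 1] - adjacent lines disagree
print(combing)      # 7
print(progressive)  # [0, 0, 0, 0, 0, 0, 0, 0] - no combing
```

The takeaway matches the thread: 60i carries 60 distinct temporal samples per second, while 30p-in-60i carries only 30, even though both arrive as 60 fields.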
Everyone take notes. He is right in every single way. There is not a single thing you can argue about this. I'm currently working on a video on how to transfer Old Tapes Properly and in a segment I write pretty much Verbatim what you discussed here. It gives me faith knowing that people like you are passing on this knowledge.
Good video. Most TV studio cameras are 1080i60 because ATSC's 1080 stream was standardized at 60i for bandwidth reasons. Several people have told me that it's the same as 540p60 while some argue it's like 1080p30. Well, no, it's not. 1080i60 is notably sharper than 540p60 and the motion is notably smoother than 1080p30. That said, there's really no reason modern displays can't be tricked into showing interlaced video correctly, with 120Hz refresh rates being fairly common. Display field-1 at full brightness, reduce brightness next refresh, display field-2 at full brightness while field-1 is off, display field-2 at reduced brightness. It would be a fairly simple algorithm to implement in the TV's image processor but I suppose the manufacturers don't want to sacrifice the perceived brightness.
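The 120Hz trick the comment above proposes can be sketched as a refresh schedule: each 1/60 s field gets two 1/120 s refreshes, first at full brightness and then dimmed to mimic CRT phosphor decay. This is a hypothetical illustration of the commenter's idea, not any shipping TV algorithm, and the 0.4 dim factor is an arbitrary assumption.

```python
def schedule(n_fields):
    """Build a 120 Hz refresh plan for 60i input: each incoming field
    is shown for two refreshes, full-bright then dimmed, so only one
    field is ever lit at full brightness at a time."""
    plan = []
    for f in range(n_fields):
        plan.append((f, 1.0))   # full-brightness refresh of field f
        plan.append((f, 0.4))   # dimmed "persistence" refresh (arbitrary factor)
    return plan

plan = schedule(2)
print(plan)  # [(0, 1.0), (0, 0.4), (1, 1.0), (1, 0.4)]
print(len(schedule(60)))  # 120 refreshes per second for 60 fields
```

As the commenter notes, the cost of this scheme is perceived brightness, since every other refresh is deliberately dimmed.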
I had this discussion on Twitter with Tech Connections and came to two conclusions about why modern displays cannot reproduce the look of a CRT... First of all, modern TVs have very regulated grid-like patterns. Analog TV has lines but less defined horizontal information. The arrangement of color is also different. The other big reason is CRTs are additive while modern designs based on LCD are subtractive. Being additive, the lines in a field will bloom into each other. In a subtractive space the lines will still bloom but are actively cut by the black alternate field. Now you might be able to build an OLED with a similar pixel layout to a CRT... but the question then remains: why bother...
Nice dub at 4:34. I've had to do one or two (or six or seven) of these myself on my own show, which is always a fun little challenge... And you did it perfectly! I would not have caught it if I wasn't looking at your face when you said it. Anyway, terrific video!
10:53 so no one is talking about the magic of how you aligned each field of the 60i CRT screen with each frame of the video: no rolling scan lines, no half image.
If my camera was at 59.94fps and the CRT was also at 59.94fps there would be no rolling scan line or half image because they are both in sync. There might be a frame tear as the frames may not match at the same starting point (you see it more in the NES demo)
@@FilmmakerIQ oh my. you remembered to say 59.94 ❤️ I dunno why… but out here when I try at 25 fps I still get around ⅓Hz or so beating of the scan lines. Maybe my television set's timing circuit is neither exactly 25 fps nor 24 fps nor 29.97 fps. Huh 😳
You can convert old 60i footage to 60p by just taking each field and doubling each line. The resolution is lower, but you preserve the original framerate without combing. Alternatively you can dump every other field and get 30p. But I never understood the logic of combining the fields; that's what gives you the awful combed look.
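A minimal sketch of the line-doubling ("bob") conversion the comment above describes, using hypothetical scanline labels in plain Python:

```python
def bob(field_lines):
    """Bob deinterlacing: line-double one field into a full-height frame.
    Each field becomes its own output frame, so 60 fields/s yields 60
    frames/s at half the vertical resolution, with no combing."""
    frame = []
    for line in field_lines:
        frame += [line, line]   # repeat each scanline to fill the gap
    return frame

even_field = ["a0", "a2", "a4"]  # lines 0, 2, 4 of a 6-line frame
print(bob(even_field))  # ['a0', 'a0', 'a2', 'a2', 'a4', 'a4']
```

Dropping every other field before bobbing would give the 30p alternative mentioned above, again at half vertical resolution.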
The only time the combing works is if the original content was 30p and shown in 60i using a naive algorithm. But yeah, I have never understood the deinterlacing obsession with converting 60i to 30p; it just produces too many artifacts and in the best case destroys the temporal resolution of the interlaced format.
Converting 60i to 60p isn't a problem. Modern codecs deal with it flawlessly. Converting 60i to 30p is still a huge problem even for the most advanced codecs.
"You can convert old 60i footage to 60p by just taking each field and doubling each line" that will not produce a good result since your video will move up or down a line (or half a line at your lower effective resolution) every frame. There's no perfect way to convert 60i to 60p since when using a CRT the interpolation was done by your eyes and your brain. Good algorithms try to reconstruct the missing lines by using the neighbouring lines of the same field as well as the same line on neighbouring fields with motion interpolation to hopefully be able to retain full resolution when there's little or no motion while avoiding combing artifacts during high motion. Converting 60i to 30p is just converting 60i to 60p and then converting 60p to 30p.
I love the fact that any old VHS tape can be deinterlaced to 60fps. Unfortunately, not everyone does that to their tapes when digitizing them. I dunno, it just doesn't feel right when I see a video that came from a tape and wasn't deinterlaced into 60 progressive frames, just left at 30.
I REALLY wish that (great) videos like this would mention an important alternative: when you're converting interlaced video to a digital format, you don't have to use the standard de-interlacing methods that reduce the effective framerate by half. There are a number of great algorithms that convert, for example, 60i to 60p, by "filling in" the missing lines of each frame with intelligent interpolations based on previous/following frames and so forth. To almost anyone's eye it looks much closer to the original, since it preserves all the temporal information that is otherwise discarded, with almost no downside. I have used QTGMC, which might be challenging for a non-tech person, but provides great results. It drives me nuts that most converters and video editing software don't have a simple way to achieve this. Just think of all the video footage that has been degraded for no good reason.
Since perfect deinterlacing is impossible (the two fields usually are not the same sample in time), I think bob deinterlacing is the only effective way to "deinterlace" something, since you still get the correct frame rate and it is generally quite faithful to the original interlaced source.
Well, bob is the most efficient way, since it barely uses processing power to fill in the missing information in the fields. It just line-doubles the fields and presents them at the same field rate.
@@phazonlord0098 It's definitely a compute-intensive thing to deinterlace in the way I'm suggesting. Google for "videolan wiki deinterlacing" and read through the methods available -- those are some of the real-time options, some of which are claimed to be superior to bobbing (haven't tried them myself). For conversion, QTGMC is slow but works great. It's a sophisticated algorithm: google "QTGMC Avisynth wiki" and see section 3.1.2. I think if you compare with bob, you will never look back. The Bob method can cause horizontal lines in the scene to "bob" up and down as the fields alternate (hence the name, I take it.) QTGMC starts with a bob, but goes much further.
The players and viewers do that too, so (unless you are manually guiding the process) leaving it as-is is best. Let the DVD player or TV convert to 480p; since the algorithm is applied when playing, it will get better over time as techniques improve, and you can override it if it's not looking good. The VLC software has several different deinterlace algorithms including an "auto" meta-algorithm to choose the best one based on content and hardware specifics.
@@CaseyConnor yeah, bob is just the most basic and efficient one if you want a good progressive transfer without too much hassle. I also hear great things about Yadif in VLC. It seems most algorithms start from bob deinterlacing, but instead of line-doubling the fields they try to interpolate, or analyse parts of the picture that don't change between fields. Very processing-intensive stuff, but the results are amazing.
Thanks for another very interesting video, I had not heard of Flicker fusion threshold before, and now I have some fun nighttime reading. 30ish years ago I was in electronics school, and the reason we were told we did 50Hz in Europe while it was 60Hz in the US was because of the en.wikipedia.org/wiki/Mains_hum. I guess flicker fusion and main hum combined to help pick that frame rate. Thanks again for the video. Brian
DaVinci Resolve (and I'm sure many other editing programs) will deinterlace 50i footage (yes, I live in Europe) to 50fps. And if you flick through frame by frame you can see that each frame is different. A man walking will be in a different position on every one of the 50 frames. So it's completely true that 50i contains 50 different images per second. The fact that these are interlaced makes no difference. When you watch 50i (or 60i) TV you are seeing motion equivalent to 50 or 60 fps. I think John's videos are great.
A funny anecdote is that Rod Serling purposefully chose some of the worst scripts for the videotape episodes, so that it might be easier for him to convince the network to go back to shooting on film. And Rod Serling's vocal delivery of the introduction in The Lateness Of The Hour also sounds completely rushed and phoned-in. The big exception is The Night Of The Meek. It's one of the most beloved Twilight Zone episodes, and the DVD commentary even says that maybe shooting on tape helped with the sincerity and simplicity of such a low-key, charming and heartwarming Christmas episode. Though one can still think the episode would be even better if it had been shot on film. Videotape was simply so severely limiting, simple and cheap in all aspects, the frame rate being only one of the problems. I will add that while 24FPS is part of the traditional cinematic look, there is obviously much more that makes a video look cinematic. Many youtubers (not you) think that 24FPS is enough to make a video cinematic, while many 30FPS videos are actually pretty cinematic. On another matter, I'll address one of your points from another of your videos, where you say that comparing 24FPS and 60FPS side by side is not right. I agree, and I think it's because when one frame rate is blatantly higher than another, watching them side by side can give the illusion that the lower frame rate is choppier than it actually is. The video below, comparing the same animation in 24FPS, 12FPS and 8FPS, is a good example. Watching the 12FPS animation right after 24FPS is really jarring, but the exact same 12FPS animation looks perfectly fine when watched right after the 8FPS animation. th-cam.com/video/0r3d2eMw8Ws/w-d-xo.html Also, higher frame rates in animation truly follow the law of immensely diminishing returns, as these video comparisons below show very well.
th-cam.com/video/o2WBBgqV21s/w-d-xo.html th-cam.com/video/RdGwTIhEsIU/w-d-xo.html th-cam.com/video/-ZnxYYABLww/w-d-xo.html We don't really need higher frame rates in animation. Anyone who thinks otherwise really needs to look at this awesome and buttery-smooth animated chase scene from Richard Williams' The Thief And The Cobbler (that film has plenty of examples of GOAT animation). th-cam.com/video/Usf5vtaYDI0/w-d-xo.html And art is very subjective. Higher frame rates aren't objectively better or worse; the only objective thing about them is that they have a higher number of frames per second! Though Richard Williams, in his book The Animator's Survival Kit, was always a huge defender of animating at 24FPS as much as possible (Williams was obsessed with animation being as smooth as possible), he also says that there are many animators who think 12FPS can in many instances actually look better than 24FPS, and he mentions Art Babbitt, one of Williams' own biggest mentors, as one example, even though Williams still doesn't agree with Babbitt's view. I also highly recommend this video below. th-cam.com/video/YtYpif-dLjI/w-d-xo.html Cheers! I wish you the best!
NTSC and PAL are color norms, but the number of lines and the frame/field rate were defined in the black-and-white era. Hence, compatible color. The original norms (and the basis of the color systems) are still called CCIR B/G for 625/50 and CCIR M for 525/60, and there are more of them, some obsolete, with different specifications for resolution, video bandwidth, audio carrier, and modulation type (positive/negative, AM or FM).
I do the opposite. Look at the video title and see that it could be controversial, and head right for the comments. Although, this time I actually watched the video first. 📺🎦 😁
@@my3dviews well it does on cgi pictures, i give you that. but in the real world, it's time for you to take off those 3D glasses & accept the fact that we've been lied to.
It's funny how you mentioned those specific Twilight Zone episodes because they reminded me so much of earlier Doctor Who episodes, which would explain so much.
alright, 3 minutes 35 secs into the video and I am all with John on this - I've found even people who work for a living as filmmakers seem to not understand that TV programs deliver 60fps (see, F P S) and that creates that smooth look, not 30fps. I am not American, so I can't really say if true 30fps is that common in TV (shot in progressive 30p and broadcast as 60i with frames doubled) - I've heard that some shows were or are shot that way. But whenever I hear "that TV look", it refers to 60fps (broadcast interlaced) and not 30fps. 30fps is actually almost as jerky as 24p
Combing artifacts are not inherent to deinterlacing interlaced content, that only happens if you merge the two interlaced fields (half-frames) into one progressive frame. Inherently, however, _some_ visual compromise is always involved in interlacing. For example, it is also possible to display each half-frame as its own frame, which gives you smooth 60fps motion, but half the vertical resolution and the scene “bobbing” very slightly up and down every frame, or to discard all the odd or even frames, which gives you 30fps motion and half the vertical resolution, but no combing or bobbing. There are also algorithms that try to achieve 60fps with full resolution, but since they have to guess, filling in missing information, they can make mistakes etc.
Ok; I was going to write a comment about how I was so glad you made this video because so many people nowadays fail to understand that 30p is NOT THE SAME as 30i and doesn't look like live TV (which looks the way it does because of the 60 interlaced fields per second), but then, after you spent all that time explaining it... You told people they could shoot in 30p if they wanted the NTSC look. wat. 30p is only 6 frames away from 24p and looks very similar to 24p... As you had PREVIOUSLY mentioned, they're going to need 60 fields or 60 progressive frames per second (or greater) to get that live look. I don't understand why you would say that right after explaining why that ISN'T the case.
True progressive 29.97 has been around for about 20+ years - so it has become a look that is associated with NTSC video. 30 fps doesn't look similar at all to 24fps - those 6 frames per second represent 25% increase in speed. Go through any broadcaster's TH-cam channel and you will see it's all 30fps: th-cam.com/channels/eY0bbntWzzVIaj2z3QigXg.html
Another excellent video, and I'm glad that you're prepared to address controversy. Just my personal opinion, but the switch from 24p to 60p makes the image look much more realistic. However, we need to remember that this is an art form - is realistic always better? To take an extreme example, I'm sure we've all seen the trailers for the new CGI Lion King. They undoubtedly look more _realistic_ than the cell animation. Do they look _better?_
You're starting to get into apples and oranges territory there. Yes CGI Lion King looks better given a certain set of criteria. Yes hand drawn Lion King looks better given another set of criteria. But neither will look better at 60FPS :P
I'm willing to bet, however, that if television started with 60p, that the industry would not have gone to 24p. Personally, I'll take the 60, but I'd take the sappy drama out 😄
Think of 60i as 60p with half the vertical resolution; now everybody is happy. But there is a reason why some people complain about interlaced material: they use computer players that often use the wrong deinterlacing settings, which reveal the nasty combing that everybody is talking about. I capture analog video for customers, and in recent years I was forced to deinterlace the videos for them the right way, using the high-quality command-line QTGMC program, so that when they play the files there is no risk of them seeing combing in case they use the wrong player or a player with the wrong settings.
It is and it isn't 60p, but it's close... which is why you have to run it through your software. Be careful trying to "oversimplify things" unnecessarily :)
@@FilmmakerIQ One field at a time is displayed progressively by modern flat panels, and that's the reason for combing artifacts. CRT TVs don't display the whole field at once like flat panels do, and that's why they don't exhibit the combing effect. That's what I meant by 60p with half the resolution (a field's resolution).
Yes, you don't need to explain what interlacing is in a comment on a video that explains what interlacing is. It's rude. Thing is, 60i can also be 30p at full resolution, with combing obviously. Saying 60i is 60p at half rez is only one way to interpret 60i.
100% agreed. I think a major problem has been poor software deinterlacing to create 60p from 60i. The default blending of 60i to 30p creates ghosting, then "bob" deinterlacing gets 60p but halves the resolution. Since the "Yadif" deinterlacer has been introduced in software like VLC and Handbrake, it has transformed how I see 60i sources.
Well, if you look at the tip of the metronome at 60i, it almost disappears at its fastest speed, whereas at 60p the tip stays well visible with only some blur.
I really don't notice the difference when watching. If I slow down I see that 60p has more pictures than 24 or 30. 720p looks great to me. 1080 and 4k aren't really necessary. For me the only real difference is the size of the files and the time it takes me to download, upload, and edit. I do believe that the quality differences are there, but I just don't notice them.
Watching this at 7 in the morning before work and can't stop! This perfectly answered the question I had since I was 8 and felt there was something weird about that new BBC series (which later I realized was "filmed" with U-matic). Thank you! P.S. this vid inspired me to experiment to see if we can create a 60fps feeling in video games and real-time animation using fields, and cut the processing power in half.
yes, you can, but it won't look good on a progressive screen. If you want a throwback to the early days of PC gaming: what they used to do was render only every other line of video in order to save processing power and file size. Pretty much exactly what the Nintendo did, but not on a CRT screen.
I’m still fascinated that many TV broadcasters went with 720p instead of 1080i, mainly citing sports. Now we see the results, and of course 1080i is better. I’m able to watch Formula 1 on 2 separate networks, one at 720p, the other at 1080i, and the difference is staggering. The jump from 1080i to 1080p is also a great one, and I’m not even talking about 2160p, but that’s another story
On my DVD copy of Super Mario Bros, there is ONE scene (the climax, as the Dinosaur world and the real world start combining) where all of a sudden it's an interlaced image with terrible combing. It's really odd because the entire rest of the film is fine (well... it's Super Mario Bros, but you know what I mean).
Hello, I would like some help. I hope what I am about to say will make sense... I have a Canon 7D Mk II and an Atomos Shogun Flame. Via HDMI the 7D sends a 1080 50i or 50p signal; when I change the settings on the 1080 50i, I can get it to 25p and it gives a 2:2 pulldown. And when delivering, I need to deliver in MXF XDCAM 50 1080i 25. I hope that makes sense. Now my question: when I shoot in 25p, edit in Final Cut as 25i and export it as 25i, will I lose any quality? What impact will it have on the footage? Please assist?
I like the more depth you got into in this one, just as I liked the original. I really like shooting 60p I like that smooth "soap opera" effect coupled with the higher resolution for that "extra crispy" image.
Let me ask you something. I've actually supported interlacing since I discovered 50i is literally 50 vertical half-frames. Now I'm thinking: in the world of film, anamorphic lenses were (or are) used, which mostly have less detail horizontally than vertically. My point is: if it's possible to convert 50i to 50p with half the vertical information, why don't we keep doing that, if films do something similar? On the other side, there's an ongoing war over image quality: resolution versus framerate.
With regards to film, anamorphic doesn't reduce the detail horizontally; it actually increases it, squeezing more detail horizontally than a regular lens onto the same size film area. Then, when projected, the extra detail is spread out to create a wider image. In the digital space it's more about effective use of the sensor, but it's still incorrect to say detail is lost in the horizontal sense; detail is added with anamorphic. Next, the flaw in your statement is equating visual resolution with temporal resolution. They are not the same. So why don't we see more 50i to 50p conversion? Well, you do these days at least... it's called line doubling or bob deinterlacing. The problem with this type of deinterlacing is that the source is generally going to be quite low resolution to begin with, so you're going to make things worse.
Sadly you didn't mention deinterlacing. Of course it's "just software hocus-pocus", but it does work pretty well. There are a ton of different algorithms implemented in e.g. VLC player; I've found Yadif 2x to work the best. To activate it, right click on the video->Video->Deinterlacing->On/Automatic. Then right click again->Video->Deinterlacing mode to set the algorithm. You can alter those permanently in the settings, CTRL+P->Video. Of course it won't be as sharp as the original, but give it a try! Also, if the video is stored as progressive despite really being interlaced (e.g. if you digitised an old video and set the export to progressive) it won't work. There are example pictures (no videos, sadly) on the VLC deinterlacing page: wiki.videolan.org/Deinterlacing/#Examples
Here is a video of me comparing deinterlacing methods for TH-cam uploads: th-cam.com/video/talRdXXNzyI/w-d-xo.html I designed this video specifically to generate interlacing artifacts. Or actually, I found videos with interlacing artifacts and compiled them.
John, could you help me (and maybe several others on the web)? Do you happen to know how to convert 60i footage that was exported as 30p back to something usable? Basically the file is progressive, but the footage in it is interlaced, with combing. Do you happen to know how I can get it back to interlaced?
Well, I would throw the footage on a 60i timeline in Adobe Premiere and it should convert it back. You may want to explore Handbrake as a free option, though what you're asking isn't something common. Handbrake has a decomb option that might solve your issue without going back to interlaced.
Historical note... In the UK, we used to use 405 lines for B+W TV (VHF as it was called) and then switched to 625 lines (UHF) with the advent of colour in the '60s... Also, looking forward to your explanation as to why 29.97 (ish!) and other similar frame rates are required. Are there REALLY people still watching broadcast TV on CRTs? At least here in PAL and SECAM land we don't have to indulge in that strange pulldown nonsense to show movies on television. We (well, the TV channels) just show 24fps movies at 25fps. Which is why the movies on TV are always 4% shorter than the timing listed by IMDB. And yes, it does pitch audio up by 4%, too.
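The 4% figure is easy to verify with a quick back-of-envelope check (my own numbers, not from the comment — the semitone conversion assumes the standard 12·log2 formula):

```python
import math

film_fps, pal_fps = 24, 25
speedup = pal_fps / film_fps          # 25/24 ≈ 1.0417, i.e. ~4.2% faster
runtime = 120 * film_fps / pal_fps    # a 120-minute film plays in 115.2 min
pitch = 12 * math.log2(speedup)       # audio rises ~0.71 of a semitone
print(round(speedup, 4), runtime, round(pitch, 2))
```

That ~0.7 semitone shift is why some PAL DVD releases pitch-correct the audio after the speed-up.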
Yes... I have CRTs at home. Plus there are huge libraries of content. And pulldown is superior to the PAL solution: we don't mess with the speed of playback. 3:2 pulldown is simple and completely non-destructive.
Interlacing is honestly underrated. Even John Carmack, godfather of "getting gaming technology right" and ambassador for higher framerates, has pushed the idea of interlacing. He also floated the idea of updating every 3rd, 4th or 10th line, to get 3x, 4x or 10x the framerate instead of just 2x. Personally I'm really excited for that and I hope someone implements it someday soon. Imagine gaming at 1 kHz...
Saying it's effectively 30, as if it looks like actual 30, shows an empty spot between keyboard and chair that may look like a human, but no, it's not even a shadow.
Saying it's effectively 30 comes from the post side, because in video editing you can't actually separate the fields. So when you're editing you have to treat it like 30 FPS. The problem is we're so far removed from using 60i these days that people who never experienced 60i don't know what it looks like.
Thanks. Good history/future lesson. Probably I don't really need to know this, but one of the joys of TH-cam is learning stuff you have no real need of----right now, but might somewhere down the road.
You only see combing when viewing interlaced video on a non-interlaced display, because the software tries to combine the separate interlaced fields into full frames, blending them together.
@@FilmmakerIQ Yeah, but that's because it's shot interlaced; each field is going to be 1/60th in time at half the res anyway. I'm saying if you have interlaced footage on, say, a DV tape, you can convert your 60i footage to 60p in software, and it's more accurate when watching on things like TH-cam or Netflix. This is something I do for people when I convert VHS to MP4. If I were to remove one field to get rid of the combing effect, I'd get half the res; if I were to blend the 2 fields together, I'd get a weird motion blur over the whole thing; but if each field is converted to a frame at 60 FPS rather than 30, I preserve the motion and get some "perceived" resolution back because the frames are flashing at 60 FPS. You can try this in VLC: set the deinterlacing mode to linear on some old interlaced footage. Didn't mean to come off as arrogant, love the channel, just my 2 cents.
@@FilmmakerIQ Sir, which frames per second do you prefer for animated CG shows? 60fps or 24fps? I personally prefer 48fps, the middle ground. What is your professional recommendation?
@@FilmmakerIQ You need a 24hz or 120/144hz capable display to properly enjoy 24fps video..! It just looks horrible on 60hz progressive scan displays, which are still the most common nowadays. 120hz is the sweet spot here for enjoying both 30/60fps TH-cam vids _and_ 24fps movies.
Nope, 3:2 pulldown is not the culprit the internet thinks it is. I thought you could step up to 144Hz and get smooth motion, but the 144Hz monitor looks WAY worse than 60Hz. Sample-and-hold is the culprit; people have been watching 3:2 pulldown for years on CRTs.
This isn't directly related, but I have been able to view interlaced recordings of some older games on PS2 and such, and going through the recording in raw interlaced format "frame by frame", it's interesting to see what tricks games can do when they run at 60 (or 50 for PAL) FPS but output in interlaced format. For example, it looks like Ratchet And Clank 3 overlays a semi-transparent layer (with a bit weirder colors) of the next frame in the current interlaced output refresh, so when movement happens and the next interlaced refresh occurs, it should help to complement the picture in motion. I suspect this was done more to help deinterlacing algorithms on flatscreens, as technically it should help the TV to structure a full frame better, depending on the method of deinterlacing. These layers aren't noticeable in motion or in deinterlaced progressive video capture, so I would say it was a fairly effective and interesting technique that at least didn't damage the picture. But to tie back to the original video: yes, even games occasionally were 60FPS with interlaced outputs. Another strength of games was that technically you could do 30 full frames with 60i, if the game was set to output in such a fashion and had a stable framerate. As extra trivia, even older game systems used another trick to draw 60 full frames "progressive" on CRT TVs, by only drawing the same lines every refresh, instead of alternating lines every refresh. This eliminated the inherent interlaced picture flicker, but also worked as a performance saving, as games could render at half the resolution (320x240 instead of 640x480). This is these days known as "240p", as it draws 240 lines of progressive video but it's only delivered in an interlaced format. I believe back in the day, some folks who worked in the industry called it by different names; I think Nintendo called it "double streak" or something.
And yeah, interlaced can be troublesome with modern TVs, but as you might guess, 240p can be a huge issue too, as the vast majority of TVs and converters interpret 240p as 480i, scrambling the line order and deinterlacing it, which can result in excess blur with movement, but all sorts of different artifacts and issues can crop up as well. In the worst case scenario, the device simply won't draw the video at all, be it an analog or digital 240p signal. Which is sad, because technically 240p should be much easier to deal with than 480i, as the proper way of handling it is to "line double" the signal: just duplicate the existing line information into the missing line information in each refresh, and you'll get clean 480p. If you interpret 240p as 480i and line double it, you not only get flicker, but also bad vertical shaking of the screen.
Be careful with your conclusions, because some of it might be artifacts of the capturing process. Looking quickly at Wikipedia, the PS2 basically did support 240p, but only through the component cable. If it was captured by other means, it might have been resampled into 480i, which might explain the ghosting effects you're talking about, especially if the capture isn't exactly in sync with the source.
@@FilmmakerIQ I'll admit that the issue with recording could be a possibility, but I wouldn't outright rule anything out yet. I noticed this by using one of those EZCap devices some time ago, and the device I specifically have is very finicky with the drivers for some reason. I only got it working through Virtualdub, which captures raw AVI footage. Everything else looks very crisp, and the video files are large, so there's nothing else odd about it, and it's not video compression at that point yet. Either way, I was just testing things around, and found this interesting effect, especially when going frame by frame. Could be intentional or unintentional. But no, it's definitely not related to 240p! PS2 had only a handful of 240p games (mostly ports of really old games and collections, a couple of native PS2 games too), and will output all 240p PS1 games at such resolution, but the game I brought as an example is definitely 480i native. 240p also works over all video cable types and signal standards (because it's nearly the same as 480i, just with a different line order), but component video happens to be the least compatible one. It has something to do with the processing pipeline of digital TVs and processors being different for different inputs, such as composite/S-video/RGB and component. SCART was commonly used for composite and RGB, occasionally S-video, so the way 240p is handled is typically the same through SCART inputs and composite video. However, a lot of TVs handle 240p differently over component compared to their composite ports, or sometimes might not support it at all. It's a very troublesome resolution these days.
If I understand it right, you could technically make 60p out of 60i easily just by duplicating lines: odd lines for frame 1, even lines for frame 2, and so on in the next frame. There would be no artifacts on progressive screens. Or am I wrong?
Not exactly. Each 60i field is only half the vertical resolution. So if you make a field into a frame, you then need to interpolate the in between horizontal lines (either even or odd) on each frame. You cannot just double up the lines or you end up with half the resolution on each frame. If there is no motion, you can use the other field for the in between lines, but that doesn't work if anything moves or if the frame is being panned.
Yes, just remember the even fields contain the topmost line, and the odd fields contain the bottommost line. This ensures the spatial data is centered correctly. It is called bob deinterlacing, because you bob each frame half a line up and down and then scale 2x.
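As a toy sketch of the interpolation idea from this exchange (my own code, assuming a NumPy array per field): place one field's lines at the correct parity in a full-height frame, then fill the missing rows by averaging the neighbouring lines:

```python
import numpy as np

def field_to_frame(field, top_field=True):
    """Expand one field to a full-height frame: put its lines on the
    even (top field) or odd (bottom field) rows, then fill the missing
    rows by averaging the neighbours (plain repetition at the edges)."""
    h, w = field.shape
    frame = np.zeros((h * 2, w), dtype=float)
    offset = 0 if top_field else 1
    frame[offset::2] = field
    for r in range(1 - offset, h * 2, 2):  # the rows the field is missing
        above = frame[r - 1] if r > 0 else frame[r + 1]
        below = frame[r + 1] if r < h * 2 - 1 else frame[r - 1]
        frame[r] = (above + below) / 2
    return frame

top = np.array([[0.0], [2.0]])                 # a 2-line top field
print(field_to_frame(top).ravel().tolist())    # [0.0, 1.0, 2.0, 2.0]
```

Respecting the field's half-line offset (the `offset` parity here) is exactly the "bob half a line up and down" point — get it wrong and the picture judders vertically at field rate.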
The line count is always vertical in TV: how many lines, counting vertically, does a screen have? It's really the only dimension you control precisely on an analog CRT. In film, they started counting pixels horizontally when digital film scanning began.
If you ever try to convert an interlaced DVD into an MP4 file using Handbrake, it will attempt to deinterlace the video. Doing so will make the video look more jerky, because it was running at 48 to 60 fields per second and was brought down to 24 to 30 fps. The solution is to change the output frame rate from 30 fps max to 60 fps max, as well as turning on decomb and changing the preset to EEDI2 Bob. That is the best way I've found to remove interlacing AND keep the video looking like it's 60 fps. In actuality, it converts the video to a variable frame rate. Anyway, I'm sorry there are so many people out there who think they're right about interlacing and progressive video. The p in 1080p never was short for "pixel".
On a related matter, are there any plans on a video about the Vidfire system that is used (at least here in the UK) to restore old shows to their original look? I'd be interested to learn more about that. In connection to this I think your viewers might also be interested to learn how the BBC have been able to restore colour to shows where only black and white backup copies survive (not colourisation but extrapolating the actual original colour info). It's fascinating and incredibly ingenious!
Stupid question: why don't we show TV shows nowadays at 60 frames (fields?) per second instead of 30, so you don't see comb artifacts? Or do we do that already? Do you then see the fields alternating, just like on your TV with the metronome in slow motion?
We actually watch most TV shows (except for LIVE Shows and Events) at 24. But if we're sticking to 30 content... it's because 60 progressive frames takes TWICE as much bandwidth as 60 interlaced fields. Most stuff like Talk Shows might be done at 30 PsF and then broken into fields and sent as 60i. When it comes to TV, it's better to be able to broadcast two channels for the same price at 60i than to only broadcast one channel at 60p (1080 resolution)
Most TV shows were never videoed at 60fps, but filmed at 24fps. The 60fps content was limited to live broadcasts (talk shows, sport) and low budget shows. All the popular stuff that people might rewatch today (Star Trek, X-Files, etc.), that was all 24fps. As for the combing artifacts, you should never see them on an actual modern TV, as your TV will automatically deinterlace the video stream and convert it to 60fps. The combing artifacts only become visible when the video isn't deinterlaced properly and instead the 60i video gets played as if it were 30p. This happened a lot when watching or editing a video on a computer, especially in the early days when there weren't good/fast deinterlacers around. These days it can still happen by accident. As for playing back the fields individually like a CRT would, that's possible in theory, but it wouldn't look good, as the image would be missing half the lines and thus be half the brightness of a progressive image. It's just easier to deinterlace the image than to try to boost the brightness of an LCD to emulate a CRT.
Broadcast TV engineer here. We are legally as well as technically limited to a certain amount of bandwidth. Everything we broadcast is limited to a 6MHz channel per FCC specs. Increasing the frame rate would require more bandwidth... or more compression. And you wouldn't like how that looks. Studies have found generally that "30" fps is more than enough, especially when so much of what we air is produced in "24" anyway... including LIVE shows. I just ran the Billboard Music Awards which appeared to be in "24," or "faux film" as it's sometimes called. It tends to make a show look more classy. Hell, one of our more popular commercials these days was shot on an iPhone. In cinema, you can gobble up all the bandwidth you need... assuming your storage card can handle it.
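A rough uncompressed-rate comparison makes the bandwidth point concrete (my own back-of-envelope numbers, assuming 8-bit 4:2:2 sampling at an average of 16 bits per pixel; broadcast compression changes the absolute numbers but not the 2:1 ratio):

```python
w, h, bpp = 1920, 1080, 16     # 8-bit 4:2:2 averages 16 bits per pixel
p60 = w * h * bpp * 60         # 1080p60: 60 full frames/s ~= 1.99 Gbit/s raw
i60 = w * (h // 2) * bpp * 60  # 1080i60: 60 half-height fields ~= 1.0 Gbit/s raw
print(p60 / 1e9, i60 / 1e9)    # interlacing halves the raw rate
```

Same field rate, half the lines per field: that factor of two is what has to be squeezed into a fixed 6 MHz channel.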
Thanks, y'all, for the explanation. So it's mostly a bandwidth concern. I captured my old VHS "home movie" holiday tapes back in the day, and I would throw away half of the fields to get rid of combing. Capturing at 60 Hz (or should I say 60 fields/sec) wasn't possible. Certain professional documentaries containing old (live) TV material had those awful combing (or ghosting) problems too. Looked very unprofessional.
@@FilmmakerIQ My 2 cents: Disney- and FOX-owned channels (among others) broadcast 720p60 content. It's said that the rationale for choosing 720p60 (over 1080i60) was that they considered it was better suited for sports content, which has a lot of motion (think ESPN, FOX Sports). en.wikipedia.org/wiki/High-definition_television_in_the_United_States
Is 60i on a digital progressive screen effectively 30p, since it just combines every 2 fields into 1 frame when deinterlacing, while on a CRT it will look a lot smoother and have that 60Hz soap opera effect?
This is answered in the video. I'm happy to answer any questions from the video. Though what I think you're asking is: is 60i effectively 30p? Depends on how you convert it. It can be, but it can also be converted to 60p. The reason we say 60i is effectively 30 is because in the old days you always counted 2 fields as one frame. There's no simple way to cut by the field.
@@FilmmakerIQ In PAL regions some Blurays of TV shows are 50i but I believe they are effectively 25p but 50i is used as Bluray doesn't support 25p. There is no soap opera effect as it merges every 2 fields into a frame and it was originally shot in 25p. However I've watched 60i Blurays and they had the smooth motion soap opera effect and I'm not sure why.
@@ShaneJMcEntee 50i is the same case as 60i, in that 50i is the same experience as 50p. However, when you deinterlace you can opt to combine fields into 25 frames per second. If the original show was shot on film in Europe it would be 25p, but if it was shot in studio on video cameras it'll be 50i. An example of this is Monty Python's Flying Circus, where you can clearly see the different motion between the film clips and the video portions done in studio.
Wait a minute--there is a discrepancy in your discussion. It has to do with transferring film to tv. It wasn't a 3-2 drop down, it was, and still is 4 frames plus one repeated frame. Watch a movie, filmed at a full 24 fps. Take a look at it frame by frame. You will see 4 different frames in a sequence followed by a repetition of the 4th frame. No one notices that repetition because it happens so quickly. I realized it when I was editing a film to video conversion.
That's only when converting from 24fps to 30fps. TV until recently (as in the last 20 years) has never been 30fps. And even the 30fps of today is really broadcast over an interlaced stream of 60i. 3:2 pulldown with interlacing takes 24fps to 60i. See the demonstration at 12:40.
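The 3:2 cadence itself is tiny to sketch (my own illustration): alternate film frames contribute three fields, then two, so 24 film frames become exactly 60 fields each second with no frame blended or dropped:

```python
def three_two_pulldown(frames):
    """24p -> 60i field cadence: frames alternately yield 3 and 2 fields."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

# Four film frames become ten fields; scaled up, 24 frames -> 60 fields.
print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Reversing the cadence (inverse telecine) just discards the repeated fields, which is why pulldown is non-destructive.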
The real problem with those few videotaped TZ episodes was less the 60Hz video than the use of image orthicons, which created a "black halo" or "black aura" around any bright object in the scene.
Excellent until the very end. 60i, 60p, 50i, and 50p all have the smooth motion we associate with traditional television. But 25p is essentially indistinguishable from 24p, and 30p is very close. For what it's worth I experimented with this using variable speed film projectors in the early 1970s. (I'm an even older fogey than you are.) For more on interlacing see my post here: dgarygrady.com/2015/01/22/interlaced-video/
@@FilmmakerIQ I don't think we disagree. My point was that in terms of motion characteristics 30p looks more like 24/25p than like 60i or 60p. The difference between 30p and 24p is subtle but visible if you look for it. In North America some filmed television commercials and music videos are shot at 30 fps to take advantage of the slightly smoother motion while not getting the soap opera effect. By the way, Douglas Trumbull developed a large format high frame rate system (60 fps as I recall) for use in amusement park rides because it looked more like reality, but when he tried to produce a narrative project he found people disliked the soap opera look. Again, audiences like 24 fps, just as you've been saying.
Well done, John! Because I WANT TO SEE image artifacts and TV flaws, I just built a 267 line, 15FPS progressive video system to show what Philo Farnsworth *might* have seen in his lab in the late 20s and early 30s. Ultimately, this project will add an actual *image dissector* tube camera section! WARNING! I make really bad videos!!! th-cam.com/video/zcUuoyxSJLY/w-d-xo.html
THAT. IS. FRIGGIN. COOL!!!!
@@FilmmakerIQ Oh it is and just WAIT until the Image Dissector camera is finished :)
Further to my previous comment, I remember back in the days of camcorders (1980s) I took some video of a thunderstorm, and caught some lightning. When playing it back I could see the lightning at normal speed, but when I tried to pause it, it wasn't there. When you pause a VHS it only displays one field, and the lightning was on the opposite field. It was at that point I realized that 50i does actually record 50 different images per second.
Thank you for explaining this! Too many people think 60fps is a new invention, but pretty much anything that was live or taped and wasn't shot on film was shown at that high frame rate until digital cameras blew up in the 2000s. Sadly, the Internet has done a terrible job archiving old recorded content and so most old TV clips on TH-cam are in 30fps. There are even a lot of newscasts and talk shows that broadcast at 60fps but upload clips online at 30fps, with Jimmy Kimmel being one exception
Ironically Soap Operas are the worst culprits. They broadcast 60p and put stuttery 24p on TH-cam and then use 30p for their network site.
Yeah. Years ago I was confused with the "30fps" nonsense. My analog video captures looked so choppy.
Now I capture at the proper 59.94 and I don't throw away fields.
Daniel - I think that's one of the most accurate recreations of the 480i experience I've seen. And I'm saying that as someone who would have watched a video like that very closely to dissect it when it was released (because that's when I started my career and would have wanted to make something just like that). The only thing I would say is it's a tad soft, but that might be the deinterlacer.
@@FilmmakerIQ sorry, I moved my channel from a personal channel to a brand channel, so my reply got deleted in the process. So here is me remaking the reply:
Here is how I preserve interlaced content for TH-cam: th-cam.com/video/m_cIjk5IiPo/w-d-xo.html
[Filmmaker IQ's reply]
Yeah. It's probably soft because it used a LaserDisc source. Also, the upscaler was the one built in to Sony Movie Studio, so it's probably not perfect.
Finally, here I compare how 480i content looks natively on a CRT to how it looks with various deinterlacers (and I got a better upscaler, nnedi3_rpow2(factor = 4)): th-cam.com/video/talRdXXNzyI/w-d-xo.html
A shoutout to Technology Connections! He really does have great videos. Great sense of humour too!
@@zebunker NO U!
I actually work in a part of the industry where 60i is still heavily used - live events and presentations. Simple reason: 1080i carries further and more stably over SDI than 60p due to bandwidth, and the runs to projectors and LED displays are usually very long. And you can't cut live video to 30p, because it's just enough less smooth for people to notice the difference.
Time to invest in fiber optic converters and 3G SDI!
Of course 1080i is still better than 1080p; the lower bandwidth signal will always be more stable.
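The nominal SDI link rates back up the bandwidth point (standard figures as I recall them from the SMPTE specs — worth double-checking before relying on them):

```python
# Nominal serial rates: HD-SDI (SMPTE 292M) carries 1080i60 / 720p60,
# while 1080p60 needs 3G-SDI (SMPTE 424M) -- exactly double the rate.
hd_sdi_bps = 1.485e9
three_g_bps = 2.970e9
print(three_g_bps / hd_sdi_bps)   # the factor-of-two interlacing saves
```

Halving the per-link rate is what lets 1080i survive the long cable runs mentioned above.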
Thanks! It’s great to hear someone who knows what they’re talking about. Before I became a technical writer I worked in and taught video production. People insisting that 1080p means 1080 pixels reminds me of people who think that the term “pulldown” has something to do with a mathematical conversion. I’ve read/heard people say the frame rate was “pulled down”! Even people who have lots of experience can be operating with a poor understanding of the technology. I once had a protracted “discussion” with a TV engineer turned pro video equipment sales guy who thought component video was digital.
60i is not really effectively 30p. It's more like 60p with half the resolution, which is what they originally wanted. That gives you the illusion of 30 full frames a second, but each of the two 1/60th pictures is a new image.
Great video on interlacing, good to see. You and Technology Connections have presented the most info for this topic, which has sadly been lacking on youtube.
I've been watching your videos for about three years and I admire and love the way you present often complex information in a clear, easily understandable and entertaining way! As a teacher, I can only hope my lessons are as good as yours, congratulations!
Thank you John. I don't know how often you hear this, but I really appreciate the amount of research that goes in on each episode you deliver. Your channel educates me in a way that many do not. I love to learn, but appreciate it most when the information is presented clear and concise. Bottom line, Filmmaker IQ makes me smarter !!
My Sony Handycam has an option to shoot in black and white and 60i. Good on them for showing the current generation what it was like in the '50s.
Shouts to Technology Connections. Alec deserves the praise
@@zebunker What do you have against the guy that's two comments I've seen you respond to.
I'm getting an Internet Comment Etiquette vibe from John reading the comments.
*I love it*
5:15 If the p in a resolution really meant pixel, then interlaced wouldn't make sense, because 1080i would also be pixels.
I tried to explain combing to a student a while back and I had to look on the internet to find an example. Although I don't see it anymore I always prefer a monitor rather than a TV so I can see the fields if it ever does come up again. I'm going to show this to my editing classes from now on. I already use the 24fps video. Thanks John!
I'm an old dude, and I remembered that I first saw 'combing' artifacts when some SD tv shows were transferred to DVD. Babylon 5 comes to mind. I might have had the DVD player going to the Trinitron in progressive connection, too. Anyway, thanks for the knowledge!
Such a great video. People's lack of understanding of interlacing has also niggled me. In particular, watching mismatched field order footage, on news reports using video shot by bystanders, is to me like nails down a blackboard.
The main benefit of the PAL system (Phase Alternating Line) was that reducing the UHF signal bandwidth lessened the atmospheric effects on the broadcast and gave better colour stability. Indeed, my college tutor (for electronics, radio and TV theory) said that the NTSC system was unkindly dubbed 'Never Twice The Same Colour'.
This is why any video enthusiast has to keep a good CRT around to enjoy "legacy" content properly, which will never look right on flat panels. A few of Sony's OLED professional monitors were able to do a trick to represent "true interlace", it doesn't look too bad but it's still not the same.
I keep CRTs around cause I'm cheap!
"False Claim"
now we need at least 195 likes and we get a new Filmmaker IQ video!
Interlacing was a trend but that stupid comment with 195 likes really put me over the edge haha! Now y'all have to wait for me to get to the part where I'm talking about how you can't see 144 frames per second..
Filmmaker IQ Cue Chief Inspector Dreyfus Twitch...
Just Lurrrrrve your work. Was going to say "Ignore these Heathens" but then we would not have got this Brilliant piece from you had you done so. So roll on the idiots!!!!! Cheers & Thanks @@FilmmakerIQ
@@pbthevlogs5561 Was going to comment the same thing... plus the great reference to the Dunning-Kruger effect. I spit out coffee when he said that, sooo true, but funny as hell.
@@FilmmakerIQ I'm looking forward to this one. But the eye doesn't see in fps, so I'm really excited for your arguments ;)
A shoutout to the Editors and Motion Graphic Artists for this video! 17+ minute video and there's a review process. I understand how much credit you truly deserve!
You're looking at the editor and Motion Graphics artist
A good presentation. Interlacing is still in common use for broadcasting, especially for “Freesat” and “Freeview” on this side of the pond. In this case it’s 1920x1080i @25 fps. No doubt it keeps the bandwidth down to reasonable levels for transmission overground or via the satellites, weather conditions etc.
Interlacing was also used for HDV, which was used for tape cameras such as the older Canon XH A1, which I still have; it was then possible to record about 1 hour on a miniDV tape cassette at 1440x1080i @25 fps (with the 1440 being anamorphic, stretched to 1920 in post production). All ended up progressive when burnt onto DVD or BD.
Thank you for your time and effort in making and posting these videos. They are greatly appreciated!
For anyone going "TL;DR": 480/60i is not the same as 480/30p or 240/30p. It's zero full frames per second. It's 60 half-frames per second. Those 60 half frames are not drawn from the hypothetical 30 full frames, each half-frame is a new image. So it's like 60fps if you turned off half the scanlines each frame. That's it.
You can probably compare it to 240/60p on a frame-by-frame level, but there is much more detail in two combined frames of 480i than in two static frames of 480p as it's drawing the second half of the screen in the second frame. However those two combined frames will look slightly misaligned if there is action in them because the second frame is refreshed.
Add to that that each field also sits in a slightly different physical space, which is why you can get vibrating diagonal lines.
if the original video is 480p30 (or 1080p30) then it can be losslessly converted to 480i60 (or 1080i60) & back to 480p30 (or 1080p30)
@@gamecubeplayer But the original content will never have 60 frames per second, not unless the frames are played twice as fast as they're supposed to. The video will always be 30 new frames per second, displaying the same image across two half-frames, whereas true 60i is 60 new half-frames per second. I wonder about the 1080i, because wouldn't that require the 480 to be upscaled non-1:1, as it would need to be a 540-base image to be 1:1?
@@tipsy634yes, you have to upscale 480i (& 576i) non-1:1 to 1080i
@@gamecubeplayer It would first need to be deinterlaced with something like QTGMC to avoid magnifying the low vertical resolution.
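The lossless 30p ↔ 60i round trip mentioned earlier in this thread is easy to demonstrate (my own toy sketch with NumPy): split a progressive frame into its two fields, then weave them back together:

```python
import numpy as np

def to_fields(frame):
    """Split a progressive frame into top (even rows) and bottom (odd rows) fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Re-interleave two fields into one progressive frame."""
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2], frame[1::2] = top, bottom
    return frame

frame = np.arange(8).reshape(4, 2)           # one 30p frame (toy-sized)
top, bottom = to_fields(frame)
print((weave(top, bottom) == frame).all())   # no information lost
```

Because both fields came from the same instant, weaving reconstructs the original exactly — the combing only appears when the two fields were captured at different times.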
Everyone take notes. He is right in every single way. There is not a single thing you can argue about this.
I'm currently working on a video on how to transfer Old Tapes Properly and in a segment I write pretty much Verbatim what you discussed here. It gives me faith knowing that people like you are passing on this knowledge.
I'd be interested in your video. I also work on 60i-60p conversions.
To @@TriforceofShadows and to anyone else who was curios to see if I ever made that video...
th-cam.com/video/UWh1CWUO1Ok/w-d-xo.html
Would choosing 1080i with a 60fps PS4 game give me a 60 fps look on an LCD, please?
By look I mean smoothness.
Good video. Most TV studio cameras are 1080i60 because ATSC's 1080 stream was standardized at 60i for bandwidth reasons. Several people have told me that it's the same as 540p60 while some argue it's like 1080p30. Well, no, it's not. 1080i60 is notably sharper than 540p60 and the motion is notably smoother than 1080p30.
That said, there's really no reason modern displays can't be tricked into showing interlaced video correctly, with 120Hz refresh rates being fairly common. Display field-1 at full brightness, reduce brightness next refresh, display field-2 at full brightness while field-1 is off, display field-2 at reduced brightness. It would be a fairly simple algorithm to implement in the TV's image processor but I suppose the manufacturers don't want to sacrifice the perceived brightness.
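As a sketch of that presentation schedule (my own illustration of the commenter's idea; the 0.4 decay factor is an arbitrary stand-in for phosphor fall-off):

```python
def crt_like_schedule(fields):
    """On a 120 Hz panel, show each 60i field for two refreshes:
    full brightness, then dimmed -- so fields fade out instead of
    being woven together by a deinterlacer."""
    refreshes = []
    for field in fields:
        refreshes.append((field, 1.0))   # refresh n: field at full brightness
        refreshes.append((field, 0.4))   # refresh n+1: same field, decaying
    return refreshes                     # 60 fields/s -> 120 refreshes/s

print(crt_like_schedule(["top", "bottom"]))
```

The trade-off the comment identifies falls straight out of the numbers: each field is only at full brightness for half its refreshes, so average luminance drops.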
I had this discussion on Twitter with Technology Connections and came to two conclusions about why modern displays cannot reproduce the look of a CRT... First of all, modern TVs have very regulated, grid-like pixel patterns. Analog TV has lines but less defined horizontal information. The arrangement of color is also different. The other big reason is that CRTs are additive while modern designs based on LCD are subtractive. Being additive, the lines in a field will bloom into each other. In a subtractive space the lines will still bloom, but they are actively cut by the black alternate field. Now you might be able to build an OLED with a similar pixel layout to a CRT... but the question then remains: why bother...
@@FilmmakerIQ Good points.
PAL-M, earlier used only in Brazil, operated at 60Hz. So what you say applied to PAL-M too.
Nice dub at 4:34. I've had to do one or two (or six or seven) of these myself on my own show, which is always a fun little challenge... And you did it perfectly! I would not have caught it if I wasn't looking at your face when you said it.
Anyway, terrific video!
10:53 So is no one talking about the magic of how you aligned each field of the 60i CRT screen with each frame of the video? No rolling scan lines, no half image.
If my camera was at 59.94fps and the CRT was also at 59.94fps there would be no rolling scan line or half image because they are both in sync. There might be a frame tear as the frames may not match at the same starting point (you see it more in the NES demo)
@@FilmmakerIQ oh my. you remembered to say 59.94 ❤️
I dunno why… but out here when I try at 25 fps I still get around ⅓Hz or so of beating in the scan lines. Maybe my television set's timing circuit is neither exactly 25 fps nor 24 fps nor 29.97 fps. Huh 😳
My cameras have a scan-matching feature for the shutter speed to finely tune and match a screen on camera. I most likely used that as well.
You can convert old 60i footage to 60p by just taking each field and doubling each line. The resolution is lower but you preserve the original framerate without combing. Alternatively you can dump every other frame and get 30p.
But I never understood the logic of combining the fields; that's what gives you the awful combed look.
The only time combining works is if the original content was 30p and shown as 60i using the naive algorithm. But yeah, I have never understood the deinterlacing obsession with converting 60i to 30p; it produces too many artifacts and in the best case destroys the temporal resolution of the interlaced format.
Or use some interpolation algorithm like QTGMC on AviSynth.
Converting 60i to 60p isn't a problem. Modern codecs deal with it flawlessly. Converting 60i to 30p is still a huge problem even for the most advanced codecs.
"You can convert old 60i footage to 60p by just taking each field and doubling each line" that will not produce a good result since your video will move up or down a line (or half a line at your lower effective resolution) every frame. There's no perfect way to convert 60i to 60p since when using a CRT the interpolation was done by your eyes and your brain. Good algorithms try to reconstruct the missing lines by using the neighbouring lines of the same field as well as the same line on neighbouring fields with motion interpolation to hopefully be able to retain full resolution when there's little or no motion while avoiding combing artifacts during high motion. Converting 60i to 30p is just converting 60i to 60p and then converting 60p to 30p.
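The line-doubling ("bob") approach being debated here can be sketched in a few lines of Python (a toy illustration I'm adding for clarity, with frames as plain lists of scanlines, not production deinterlacing):

```python
def bob_deinterlace(frame):
    """Split one interlaced frame (a list of scanlines) into two
    progressive frames by repeating each field's lines.  The full
    field rate is preserved, but vertical detail is halved and the
    image 'bobs' vertically by one line between output frames."""
    top = frame[0::2]     # field 1: even-numbered lines
    bottom = frame[1::2]  # field 2: odd-numbered lines
    double = lambda field: [line for l in field for line in (l, l)]
    return double(top), double(bottom)
```

Smarter algorithms like Yadif or QTGMC effectively start from this and then interpolate the missing lines using neighbouring lines and motion analysis, as described above.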
I love the fact that any old VHS tape can be deinterlaced to 60fps. Unfortunately, not everyone does that to their tapes when digitizing them. Idk, it doesn't feel right when I see a video that came from a tape and wasn't deinterlaced into 60 progressive frames, just left at 30.
I REALLY wish that (great) videos like this would mention an important alternative: when you're converting interlaced video to a digital format, you don't have to use the standard de-interlacing methods that reduce the effective framerate by half. There are a number of great algorithms that convert, for example, 60i to 60p, by "filling in" the missing lines of each frame with intelligent interpolations based on previous/following frames and so forth. To almost anyone's eye it looks much closer to the original, since it preserves all the temporal information that is otherwise discarded, with almost no downside. I have used QTGMC, which might be challenging for a non-tech person, but provides great results. It drives me nuts that most converters and video editing software don't have a simple way to achieve this. Just think of all the video footage that has been degraded for no good reason.
Since true deinterlacing is impossible, as the two fields usually are not the same sample in time, I think bob deinterlacing is the only effective way to "deinterlace" something, since you still get the correct frame rate and generally it is quite faithful to the original interlaced source.
Well, bob is the most efficient way, since it barely uses any processing power to fill in the missing information in the fields. It just line-doubles the fields and presents them at the same frame rate.
@@phazonlord0098 It's definitely a compute-intensive thing to deinterlace in the way I'm suggesting. Google for "videolan wiki deinterlacing" and read through the methods available -- those are some of the real-time options, some of which are claimed to be superior to bobbing (haven't tried them myself). For conversion, QTGMC is slow but works great. It's a sophisticated algorithm: google "QTGMC Avisynth wiki" and see section 3.1.2. I think if you compare with bob, you will never look back. The Bob method can cause horizontal lines in the scene to "bob" up and down as the fields alternate (hence the name, I take it.) QTGMC starts with a bob, but goes much further.
The players and viewers do that too, so (unless you are manually guiding the process) leaving it as-is is best. Let the DVD player or TV convert to 480p, and as the algorithm is applied when playing, it will get better over time as techniques improve, and you can override if it's not looking good.
The VLC software has several different deinterlace algorithms including an "auto" meta-algorithm to chose the best one based on content and hardware specifics.
@@CaseyConnor yeah, bob is just the most basic and efficient one if you want a good progressive transfer without too much hassle. I also hear great things about Yadif on VLC. It seems most algorithms start from bob deinterlacing, but instead of line-doubling the fields they try to interpolate, or analyse the parts of the picture that don't change between fields. Very processing-intensive stuff, but the results are amazing.
Thanks for another very interesting video, I had not heard of Flicker fusion threshold before, and now I have some fun nighttime reading.
30ish years ago I was in electronics school, and the reason we were told we used 50Hz in Europe while it was 60Hz in the US was mains hum (en.wikipedia.org/wiki/Mains_hum). I guess flicker fusion and mains hum combined to help pick that frame rate.
Thanks again for the video.
Brian
50i IS 50p. Some people forget TVs “bob” the fields
Yup, at least the experience is. Spatially it's not quite, but people get way too hung up on that.
Happy to see shout outs to Technology Connections and Slow Mo Guys!
DaVinci Resolve (and I'm sure many other editing programs) will deinterlace 50i footage (yes, I live in Europe) to 50fps. And if you flick through frame by frame you can see that each frame is different. A man walking will be in a different position on every one of the 50 frames. So it's completely true that 50i contains 50 different images per second. The fact that these are interlaced makes no difference. When you watch 50i (or 60i) TV you are seeing motion equivalent to 50 or 60 fps. I think John's videos are great.
the good news is since this video has come out I have had NO ONE tell me that 60i isn't 60 different images. That line of argument is dead.
I was born in interlace, molded by it, I didn't see progressive until I was a man!
love your show man learned a lot from you keep it up
always so much to learn here. Thank you so much :)
A funny anecdote is that Rod Serling purposefully chose some of the worst scripts for the videotape episodes, so that it would maybe be easier for him to convince the network to go back to shooting on film. And Rod Serling's vocal delivery of the introduction in The Lateness Of The Hour also seems completely rushed and phoned-in.
The big exception is The Night Of The Meek. It's one of the most beloved Twilight Zone episodes, and the DVD commentary even says that maybe shooting on tape helped with the sincerity and simplicity of such a low-key, charming and heartwarming Christmas episode. Though one can still think that the episode would be even better if it was shot on film. Videotape was simply so severely limiting, simple and cheap in all aspects, the frame rate being only one of the problems.
I will add that while 24FPS is part of the traditional cinematic look, there is obviously much more that also makes a video look cinematic. Many youtubers (not you) think that 24FPS is enough to make a video cinematic, while many 30FPS videos are actually pretty cinematic.
On another matter, I will talk about one of your points from one of your other videos, where you say that comparing 24FPS and 60FPS side by side is not right. I agree, and I think that's because when one frame rate is blatantly higher than another, watching them side by side can give the illusion that the lower frame rate is choppier than it actually is. The video below, comparing the same animation at 24FPS, 12FPS and 8FPS, is a good example. Watching the 12FPS animation right after 24FPS is really jarring. But the exact same 12FPS animation looks perfectly fine when watched right after the 8FPS animation.
th-cam.com/video/0r3d2eMw8Ws/w-d-xo.html
Also, higher frame rates in animation truly follow the law of immensely diminishing returns, as these video comparisons below show very well.
th-cam.com/video/o2WBBgqV21s/w-d-xo.html
th-cam.com/video/RdGwTIhEsIU/w-d-xo.html
th-cam.com/video/-ZnxYYABLww/w-d-xo.html
We don't really need higher frame-rates in animation. Anyone who thinks otherwise really needs to look at this awesome and buttery smooth animation chase scene from Richard Williams' The Thief And The Cobbler (that film has plenty of examples of GOAT animation).
th-cam.com/video/Usf5vtaYDI0/w-d-xo.html
And art is very subjective. Higher frame rates aren't objectively better or worse; the only objective thing about them is that they have a higher number of frames per second! Though Richard Williams, in his book The Animator's Survival Kit, was always a huge defender of animating at 24FPS as much as possible (Williams was always obsessed with animation being as smooth as possible), he also says that there are many animators who actually think that 12FPS in many instances can look better than 24FPS. He mentions Art Babbit, one of Richard Williams' own biggest mentors, as one such example, even though Williams still doesn't agree with Babbit's view.
I also highly recommend this video below.
th-cam.com/video/YtYpif-dLjI/w-d-xo.html
Cheers! I wish you the best!
NTSC and PAL are color norms, but the number of lines and frame/field rate were defined in the B&W era. Hence, compatible color. The original norms (and the basis of the color systems) are still called CCIR B/G for 625/50 and CCIR M for 525/60, and there are more of them, some obsolete, with different specifications for resolution, video bandwidth, audio carrier, and modulation type (positive/negative, AM or FM).
Never read the comments. NEVER!
No Kidding... wait...
WHAT AM I DOING WITH MY LIFE?????
I do the opposite. Look at the video title and see that it could be controversial, and head right for the comments. Although, this time I actually watched the video first. 📺🎦 😁
@@my3dviews Arent you the same guy who believe the earth to be a spinning gyrating pear?? though, i could be mistaken .
@@imodium438 Let me guess. You're a flat Earther. No the Earth is not a gyrating pear, but a near perfect spherical planet that rotates once per day.
@@my3dviews well it does on cgi pictures, i give you that. but in the real world, it's time for you to take off those 3D glasses & accept the fact that we've been lied to.
It's funny how you mentioned those specific Twilight Zone episodes because they reminded me so much of earlier Doctor Who episodes, which would explain so much.
alright, 3 minutes 35 secs into the video and I am all with John on this - I've found even people who work as filmmakers for a living seem not to understand that TV programs deliver 60fps (see, F P S), and that's what creates that smooth look, not 30fps. I am not American, so I can't really say if true 30fps is that common on TV (shot in progressive 30p and broadcast as 60i with frames doubled) - I've heard that some shows were or are shot that way. But whenever I hear "that TV look", it refers to 60fps (broadcast interlaced) and not 30fps. 30fps is actually almost as jerky as 24p.
Combing artifacts are not inherent to deinterlacing interlaced content, that only happens if you merge the two interlaced fields (half-frames) into one progressive frame. Inherently, however, _some_ visual compromise is always involved in interlacing. For example, it is also possible to display each half-frame as its own frame, which gives you smooth 60fps motion, but half the vertical resolution and the scene “bobbing” very slightly up and down every frame, or to discard all the odd or even frames, which gives you 30fps motion and half the vertical resolution, but no combing or bobbing. There are also algorithms that try to achieve 60fps with full resolution, but since they have to guess, filling in missing information, they can make mistakes etc.
12:10 The Japanese release of the NES, called the Famicom in Japan, was in 1983;
the NES in the US and other countries was released in 1985.
Ok; I was going to write a comment about how I was so glad you made this video because so many people nowadays fail to understand that 30p is NOT THE SAME as 30i and doesn't look like live TV (which looks the way it does because of the 60 interlaced fields per second), but then, after you spent all that time explaining it... You told people they could shoot in 30p if they wanted the NTSC look.
wat.
30p is only 6 frames away from 24p and looks very similar to 24p...
As you had PREVIOUSLY mentioned, they're going to need 60 fields or 60 progressive frames per second (or greater) to get that live look.
I don't understand why you would say that right after explaining why that ISN'T the case.
True progressive 29.97 has been around for 20+ years - so it has become a look that is associated with NTSC video. 30fps doesn't look similar at all to 24fps - those 6 extra frames per second represent a 25% increase in speed.
Go through any broadcaster's TH-cam channel and you will see it's all 30fps: th-cam.com/channels/eY0bbntWzzVIaj2z3QigXg.html
Love your sense of humour John! I do like your "soap opera mode" that you showed in this video
for real, the "They must be delusional!" line he dropped at 9:34 had me dying
Another excellent video, and I'm glad that you're prepared to address controversy. Just my personal opinion, but the switch from 24p to 60p makes the image look much more realistic. However, we need to remember that this is an art form - is realistic always better? To take an extreme example, I'm sure we've all seen the trailers for the new CGI Lion King. They undoubtedly look more _realistic_ than the cell animation. Do they look _better?_
You're starting to get into apples and oranges territory there. Yes CGI Lion King looks better given a certain set of criteria. Yes hand drawn Lion King looks better given another set of criteria.
But neither will look better at 60FPS :P
@@FilmmakerIQ they'd look more _realistic_ though ;P
Actually might look less realistic... High frame rates have this weird phenomenon of revealing every imperfection
@@FilmmakerIQDon't get me wrong; I totally agree they look worse!! But reality _is_ imperfect ;)
I'm willing to bet, however, that if television started with 60p, that the industry would not have gone to 24p.
Personally, I'll take the 60, but I'd take the sappy drama out 😄
Think of 60i as 60p with half the vertical resolution - now everybody is happy. But there is a reason some people complain about interlaced material: they use computer players that often use the wrong deinterlacing settings, which reveal the nasty combing everybody is talking about. I capture analog video for customers, and in recent years I have been forced to deinterlace the videos for them the right way, using the high-quality command-line QTGMC program, so that when they play the files there is no risk of them seeing combing in case they use the wrong player or a player with the wrong settings.
It is and it isn't 60p, but it's close... - which is why you have to run it through your software. Be careful trying to "oversimplify things" unnecessarily :)
@@FilmmakerIQ One field is displayed progressively by modern flat panels, and that's the reason for combing artifacts. CRT TVs don't display the whole field at once like flat panels do, and that's why they don't exhibit the combing effect. That's what I meant by 60p with half the resolution (a field's resolution).
Yes, you don't need to explain what interlacing is in a comment on a video that explains what interlacing is. It's rude.
Thing is, 60i can also be 30p at full resolution, with combing obviously.
Saying 60i is 60p but at half res is only one way to interpret 60i.
@@FilmmakerIQ I didn't mean to be rude, I was just expressing my opinion on the matter I deal with almost daily, Analog video capturing.
That's okay. It's just that the video almost point by point says exactly what you said.
:)
4:47 Was that an audio desync or a really smooth audio edit to fix a mistake when reading the number 94?
Masterful presentation, well done!
9:10 So that's what Windows and DaVinci Resolve were doing to my Canon Vixia's footage. I assure you it looks 60fps!
100% agreed. I think a major problem has been poor software deinterlacing to create 60p from 60i. The default blending of 60i to 30p creates ghosting, then "bob" deinterlacing gets 60p but halves the resolution. Since the "Yadif" deinterlacer has been introduced in software like VLC and Handbrake, it has transformed how I see 60i sources.
Extremely well done! The metronome demo was excellent and conclusive.
Well, if you look at the tip of the metronome at 60i, it almost disappears at its fastest speed, whereas at 60p the tip stays well visible with only some blur.
No it doesn't disappear... it blurs exactly the same way as 60p version does.
I really don't notice the difference when watching. If I slow down I see that 60p has more pictures than 24 or 30. 720p looks great to me. 1080 and 4k aren't really necessary. For me the only real difference is the size of the files and the time it takes me to download, upload, and edit. I do believe that the quality differences are there, but I just don't notice them.
Watching this at 7 in the morning before work and can't stop! This perfectly answered the question I had since I was 8 and felt there was something weird about that new BBC series (which later I realized was "filmed" with U-matic). Thank you!
P.S. this vid inspired me to experiment to see if we can create a 60fps feeling in video games and real-time animation using fields, and cut the processing power in half.
yes you can, but it won't look good on a progressive screen. If you want a throwback to the early days of PC gaming: what they used to do was render only every other line of video in order to save processing power and file size. Pretty much exactly what the Nintendo did, but not on a CRT screen.
I’m still fascinated that many TV broadcasters went with 720p instead of 1080i, mainly arguing about sports. Now we see the results, and of course 1080i is better. I’m able to watch Formula 1 on 2 separate networks, one at 720p, the other at 1080i, and the difference is staggering. The jump from 1080i to 1080p is also a great one, and I’m not even talking about 2160p, but that’s another story.
On my DVD copy of Super Mario Bros, there is ONE scene (the climax, as the Dinosaur world and the real world start combining) where, all of a sudden, it's an interlaced image with terrible combing. It's really odd because the entire film is otherwise fine (well... it's Super Mario Bros, but you know what I mean).
Woah, how did u get 60 and 24 fps in the same video???
Please share
It's easy, just put a 24fps video stream in a 60fps timeline. The math works out so that each pair of 24fps frames gets 3 and then 2 frames of the 60fps video.
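That 3-then-2 cadence is easy to verify with a few lines of Python (a toy sketch of the cadence only, not of how an NLE actually resamples; the function name is my own):

```python
def pulldown_32(film_frames):
    """Map 24 fps film frames onto a 60 fps timeline:
    frames alternately repeat 3 times and 2 times (3:2 pulldown)."""
    out = []
    for i, frame in enumerate(film_frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

# One second of film (24 frames) exactly fills one second of 60fps video.
assert len(pulldown_32(range(24))) == 60
```

Because every film frame survives intact (just repeated), the process is completely reversible, unlike the PAL speed-up.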
Hello, I would like some help. I hope what I am about to say will make sense...
I have a Canon 7D Mk II and an Atomos Shogun Flame... via HDMI the 7D sends a 1080 50i and 50p signal; when I change the setting on the 1080 50i I can get it to 25p, and it gives a 2:2 pulldown.
And when delivering, I need to deliver in MXF XDCAM 50 1080i 25i, I hope that makes sense.
Now my question is: when I shoot in 25p, edit in Final Cut as 25i, and export as 25i, will I lose any quality? What impact will it have on the footage? Please assist?
I like the more depth you got into in this one, just as I liked the original.
I really like shooting 60p I like that smooth "soap opera" effect coupled with the higher resolution for that "extra crispy" image.
i mind the soap opera effect when it's the motion smoothing built into TVs (24 fps > 30 fps > 60 fps), since that stutters so much, but regular 60 fps is great
In defense of the comment section: Interlacing isn't easy to understand and most people who think they do, don't.
That's unfortunately very true!
Let me ask you something. I actually support interlacing since I discovered 50i is literally 50 vertical half-frames. Now I'm thinking: in the world of film, anamorphic lenses were or are used, which (mostly) have less detail on the horizontal side than the vertical. My point is: if it's possible to convert 50i to 50p with vertically half the information, why don't we keep doing that, if films do it too?? On the other side, it's an ongoing war over image quality: resolution versus framerate.
With regards to film, anamorphic doesn't reduce the detail horizontally, it actually increases it... squeezing more detail horizontally than a regular lens does onto the same size of film area. Then, when projected, the extra detail is spread out to create a wider image. In the digital space it's more about effective use of the sensor, but it's still incorrect to say detail is lost in the horizontal sense. Detail is added with anamorphic.
Next the flaw in your statement is equating visual resolution with temporal resolution. They are not the same.
So why don't we see more 50i to 50p conversion? Well, you do these days at least... it's called line doubling or bob deinterlacing. The problem with this type of deinterlacing is that generally the source is going to be quite low resolution to begin with, so you're going to make things worse.
Sadly you didn't mention deinterlacing. Of course it's "just software hocus-pocus", but it does work pretty well. There are a ton of different algorithms implemented in e.g. VLC player; I've found Yadif 2 to work the best. To activate it, right click on the video->video->deinterlacing->on/automatic. Then right click again->video->deinterlacing mode to set the algorithm. You can set those permanently in the settings: CTRL+P->video. Of course it won't be as sharp as the original, but give it a try! Also, if the video is stored as progressive despite really being interlaced (e.g. if you digitised an old video and set the export to progressive), it won't work.
There are example pictures (no videos sadly) on the vlc deinterlacing page: wiki.videolan.org/Deinterlacing/#Examples
Here is a video of me comparing deinterlacing methods for TH-cam uploads: th-cam.com/video/talRdXXNzyI/w-d-xo.html I designed this video specifically to generate interlacing artifacts. Or actually, finding videos with interlacing artifacts and compiling them.
Your tongue in cheek sarcasm aimed at stupid people is infinitely funnier than your "special fx" gag.
John, could you help me (and maybe several others on the web)? Do you happen to know how to convert 60i footage that was exported as 30p back to whatever you want?? Basically the video is progressive, but the footage is interlaced with combing baked in. Do you happen to know how I get it back to interlaced??
well, I would throw the footage on a 60i timeline in Adobe Premiere and it should convert it back. You may want to explore Handbrake as a free option, though what you're asking isn't something common. Handbrake has a decomb option that might solve your issue without going back to interlaced.
Historical note... In the UK, we used to use 405 lines for B+W TV, (VHF as it was called) and then switched to 625 lines (UHF) with the advent of colour in the 60's...
Also, looking forward to your explanation as to why 29.97 (ish!) and other similar frame rates are required. Are there REALLY people still watching broadcast TV on CRTs?
At least here in PAL and SECAM land we don't have to indulge in that strange pull down nonsense to show movies on television. We (well, the TV channels) just show 24fps movies at 25fps. Which is why the movies on TV are always 4% shorter than the timing listed by IMDB. And yes, it does pitch audio up by 4%, too.
yes... I have CRTs at home. Plus there are huge libraries of content.
and pulldown is superior to the PAL solution. We don't mess with the speed of playback. 3:2 pulldown is simple and completely non-destructive.
Somebody thinks the P stands for Pixel??? Glad I never got that student. Of course I guess I should just count myself lucky so far.
He insisted that since that is the way he and his cohorts used it, "pixel" is the correct meaning of "p" regardless of its original use.
Interlacing is honestly underrated. Even John Carmack, godfather of "getting gaming technology right" and ambassador for higher framerates, has pushed the idea of interlacing. He also floated the idea of doing every 3rd or every 4th or 10th line, to get 3x or 4x or 10x the framerate instead of just 2x. Personally I'm really excited for that and I hope someone implements it someday soon. Imagine gaming at 1 kHz.....
Saying it's effectively 30, as if it looks like actual 30, shows an empty spot between keyboard and chair that may look like a human, but no, it's not even a shadow.
Saying it's effectively 30 comes from the post side, because in video editing you can't actually separate the fields. So when you're editing you have to treat it like 30 FPS. Problem is, we're so far removed from using 60i these days that people who never experienced 60i don't know what it looks like.
The switch from 60 to 24 FPS looks like a slide show
Video Toaster Mexican guy here!!! I made 4-frame captures or renders, interlaced - they become great looping moving backgrounds.
Thanks. Good history/future lesson. Probably I don't really need to know this, but one of the joys of TH-cam is learning stuff you have no real need of----right now, but might somewhere down the road.
You only see combing when viewing interlaced video on a non-interlaced display, because the software tries to combine the separate interlaced fields, blending them together.
That is the default operation when feeding an interlaced video stream into a TV or NLE.
@@FilmmakerIQ Modern TV's or older CRT TV's?
@@ikannunaplays progressive screens. CRT TVs are interlaced.
watching this video on my ViewSonic 17GS CRT at 800x600i@180hz :3
Is there any way to convert 1080i60 to progressive 1080 60fps? I don't want to lose quality or the 60fps.
No. Only through spatial interpolation, which potentially loses quality. But then again, maybe it's something you won't notice...
I'm still using a CRT with interlaced resolutions @120i to 160i
If anyone can please answer. For low light sports, should I use 24p or 60i? If the answer is 60i should I use 1/60 or 1/125 shutter speed?
60i is typically shot at 1/60. You can shoot whatever you like.
60/50p is good for getting rid of the combing effect when viewing interlaced footage. That's how Netflix should handle it
But then your resolution gets cut in half...
@@FilmmakerIQ Yeah, but that's because it's shot interlaced; each field is going to be 1/60th in time at half the res anyway. I'm saying if you have interlaced footage on, say, a DV tape, you can convert your 60i footage to 60p in software, and it's more accurate when watching on things like TH-cam or Netflix. This is something I do for people when I convert VHS to MP4. If I were to remove one field to get rid of the combing effect, I'd get half the res; if I were to blend the two fields together, I'd get a weird motion blur over the whole thing. But if each field is converted to a frame at 60 FPS rather than 30, I preserve the motion and get some "perceived" resolution back because the frames are flashing at 60 FPS. You can try this in VLC: set the deinterlacing mode to linear on some old interlaced footage. Didn't mean to come off as arrogant, love the channel, just my 2 cents.
@@TVperson1 thing is though 240 60p is a really hard sell. Most people would take the 480 30p even if it came along with some deinterlacing blur.
@@FilmmakerIQ 480 up convert? 😏
@@TVperson1 480 blend two fields into one frame. No upconversion needed ;)
i just like 60fps. I know that makes me a heathen but i like it.
I like the 60 FPS too for some things, just not live action narrative cinema
@@FilmmakerIQ sir, which frame rate do you prefer for animated CG shows? 60fps or 24fps? I personally prefer 48fps, the middle ground. What is your professional recommendation?
24. No one makes animated CG shows in 60 fps or even 48.
@@FilmmakerIQ You need a 24hz or 120/144hz capable display to properly enjoy 24fps video..! It just looks horrible on 60hz progressive scan displays, which are still the most common nowadays.
120hz is the sweet spot here for enjoying both 30/60fps TH-cam vids _and_ 24fps movies.
Nope, 3:2 pulldown is not the culprit the internet thinks it is. I thought you could step up to 144hz and get smooth motion, but the 144hz monitor looks WAY worse than 60hz. Sample-and-hold is the culprit; people have been watching 3:2 pulldown for years on CRTs.
This isn't directly related, but I have been able to view interlaced recordings of some older games on PS2 and such, and going through the recording in raw interlaced format "frame by frame", it's interesting to see what tricks games can do when they run at 60 (or 50 for PAL) FPS but output in interlaced format.
For example, it looks like Ratchet And Clank 3 overlays a semi-transparent layer (with somewhat weirder colors) of the next frame in the current interlaced output refresh, so when movement happens and the next interlaced refresh occurs, it should help complement the picture in motion. I suspect this was done more to help deinterlacing algorithms on flatscreens, as technically it should help the TV structure a full frame better, depending on the method of deinterlacing.
These layers aren't noticeable in motion or deinterlaced progressive video capture, so I would say it was a fairly effective and interesting technique that at least didn't damage the picture.
But to tie back to the original video, yes, even games occasionally were 60FPS with interlaced outputs. Another strength of games was also that technically you could do 30 full frames with 60i, if the game was set to output in such a fashion and had a stable framerate.
As extra trivia, even older game systems used another trick to draw 60 full "progressive" frames on CRT TVs, by drawing the same lines every refresh instead of alternating lines every refresh. This eliminated the inherent interlaced picture flicker, but also worked as a performance saving, as games could render at half the resolution (320x240 instead of 640x480). This is known these days as "240p", as it draws 240 lines of progressive video, but it's only delivered in an interlaced format. I believe back in the day, some folks who worked in the industry called it by different names; I think Nintendo called it "double streak" or something.
And yeah, interlaced can be troublesome with modern TVs, but as you might guess, 240p can be a huge issue too, as the vast majority of TVs and converters interpret 240p as 480i, scrambling the line order and deinterlacing it. This can result in excess blur with movement, and all sorts of other artifacts and issues can crop up as well. In the worst-case scenario, the device simply won't draw the video at all, be it an analog or digital 240p signal. Which is sad, because technically 240p should be much easier to deal with than 480i: the proper way of handling it is to "line double" the signal, just copying each existing line into the missing line in each refresh, and you get clean 480p. If you interpret 240p as 480i and line double it, you not only get flicker, but also bad vertical shaking of the screen.
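To make the "line double" idea concrete, here's a minimal sketch (my own illustration, not anything a real scaler runs) using a NumPy array as a stand-in for a 240-line progressive frame:

```python
import numpy as np

def line_double(frame_240p):
    """Line-double a 240-line progressive frame to 480p by
    repeating each scanline, the 'proper' way to handle 240p."""
    return np.repeat(frame_240p, 2, axis=0)

# A toy 240p "frame": 240 lines x 320 luma samples
frame = np.arange(240 * 320).reshape(240, 320)
doubled = line_double(frame)
assert doubled.shape == (480, 320)
# Each pair of output lines is identical: no flicker, no shaking.
assert (doubled[0] == doubled[1]).all()
```

Treating the same 240p signal as 480i instead would interleave two *different* moments in time into those line pairs, which is where the blur and shaking come from.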
Be careful with your conclusions, because some of it might be artifacts of the capturing process. From a quick look at Wikipedia, the PS2 did basically support 240p, but only through the component cable. If it was captured by other means, it might have been resampled into 480i, which might explain the ghosting effects you're talking about, especially if the capture isn't exactly in sync with the source.
@@FilmmakerIQ
I'll admit that an issue with the recording could be a possibility, but I wouldn't outright rule anything out yet. I noticed this using one of those EZCap devices some time ago, and the device I have is very finicky with its drivers for some reason. I only got it working through VirtualDub, which captures raw AVI footage. Everything else looks very crisp, and the video files are large, so there's nothing else odd about it, and it's not video compression at that point yet. Either way, I was just experimenting and found this interesting effect, especially when going frame by frame. Could be intentional or unintentional.
But no, it's definitely not related to 240p! PS2 had only a handful of 240p games (mostly ports of really old games and collections, a couple of native PS2 games too), and will output all 240p PS1 games at such resolution, but the game I brought as an example is definitely 480i native. 240p also works over all video cable types and signal standards (because it's nearly the same as 480i, just with a different line order), but component video happens to be the least compatible one. It has something to do with the processing pipeline of digital TVs and processors being different for different inputs, such as composite/S-video/RGB and component. SCART was commonly used for composite and RGB, occasionally S-video, so the way 240p is handled is typically the same through SCART inputs and composite video. However, a lot of TVs handle 240p differently over component compared to their composite ports, or sometimes might not support it at all.
It's a very troublesome resolution these days.
@@FilmmakerIQ would choosing 1080i from ps4 settings " with 60fps ps4 game" give me 60 fps look on lcd in terms of smoothness,please?
So is it like shooting with two 30p cameras and syncing them into a 60i video?
at 13:59 the color correction changes
12:14 It's a bit later than 1982... But seeing as that's my only complaint with the video then it must be great!
I knew I couldn't have been the only one to notice that haha
Can you just tell me which one is better for gaming? I've got two options, i or p, and I'm trying to do fast-paced aircraft fighting simulation.
Why don't you just try both and see which one you like better?
@@FilmmakerIQ Because my setup is cursed enough and I really don't want to make it that extra bit more.
Love this vid, nice one Jon
If I understand it right, you could technically make 60p out of 60i easily just by duplicating lines. Odd lines for frame 1, even lines for frame 2, and so on in the next frame. There would be no artifacts on progressive screens. Or am I wrong?
Not exactly. Each 60i field is only half the vertical resolution. So if you make a field into a frame, you then need to interpolate the in between horizontal lines (either even or odd) on each frame. You cannot just double up the lines or you end up with half the resolution on each frame. If there is no motion, you can use the other field for the in between lines, but that doesn't work if anything moves or if the frame is being panned.
Correcting the former comment... yes exactly. 1080 60i footage can easily be converted to 540 60p in the way you're describing
Yes, just remember that the even fields only carry the top line of each line pair, and the odd fields only carry the bottom line. Accounting for that ensures the spatial data is centered correctly. It's called bob deinterlacing, because you bob each frame half a line up and down and then scale 2x.
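A rough sketch of the "bob" idea described above, with NumPy arrays standing in for frames. This is simplified for illustration: it repeats lines rather than shifting each field half a line and interpolating, which real bob deinterlacers do to keep the image centered:

```python
import numpy as np

def bob_deinterlace(interlaced_frame):
    """Split one interlaced frame into its two fields and scale each
    field 2x vertically by line repetition. Turns one 60i frame into
    two 60p frames at half the vertical resolution (simplified 'bob';
    the half-line offset correction is omitted here)."""
    top_field = interlaced_frame[0::2]     # even lines
    bottom_field = interlaced_frame[1::2]  # odd lines
    frame_a = np.repeat(top_field, 2, axis=0)
    frame_b = np.repeat(bottom_field, 2, axis=0)
    return frame_a, frame_b

frame = np.arange(1080 * 4).reshape(1080, 4)  # toy 1080i frame
a, b = bob_deinterlace(frame)
assert a.shape == frame.shape and b.shape == frame.shape
```

This matches the earlier replies: you get 60 full-size frames per second, but each one only carries 540 lines of real detail.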
Dude you’re a freaking genius. I don’t care what anyone says
How does it have 1080 vertical lines? Shouldn't that be horizontal, or 1920 vertical lines instead?
The line count in TV is always vertical: how many lines does a screen have, counting vertically? It's really the only dimension you control precisely on an analog CRT. In film, they started counting pixels horizontally when digital film scanning began.
If you ever try to convert an interlaced DVD into an MP4 file using HandBrake, it will attempt to deinterlace the video. Doing so will make the video look more jerky, because it was running at 48 to 60 fps and was brought down to 24 to 30 fps. The solution is to change the output frame rate from 30 fps max to 60 fps max, as well as turning on decomb and changing the preset to EEDI2 Bob. That's the best way I've found to remove interlacing AND keep the video looking like it's 60 fps. In actuality, it converts the video to a variable frame rate.
Anyway, I'm sorry there are so many people out there who think they're right about interlacing and progressive video. The "p" in 1080p never stood for "pixel".
On a related matter, are there any plans on a video about the Vidfire system that is used (at least here in the UK) to restore old shows to their original look? I'd be interested to learn more about that.
In connection to this I think your viewers might also be interested to learn how the BBC have been able to restore colour to shows where only black and white backup copies survive (not colourisation but extrapolating the actual original colour info). It's fascinating and incredibly ingenious!
Technology Connections did an interesting bit about color restoration, referencing some of the color interference patterns the process generated.
This was awesome!!!
Stupid question: why don't we show TV shows nowadays at 60 frames (fields?) per second instead of 30, so you don't see comb artifacts? Or do we do that already? Do you see the fields alternating then, just like on your TV with the metronome in slow motion?
We actually watch most TV shows (except for LIVE Shows and Events) at 24.
But if we're sticking to 30 content... it's because 60 progressive frames take TWICE as much bandwidth as 60 interlaced fields. Most stuff like talk shows might be done at 30 PsF and then broken into fields and sent as 60i. When it comes to TV, it's better to be able to broadcast two channels for the same price at 60i than only one channel at 60p (at 1080 resolution).
Most TV shows were never shot on video at 60fps, but filmed at 24fps. The 60fps content was limited to live broadcasts (talk shows, sports) and low-budget shows. All the popular stuff that people might rewatch today (Star Trek, The X-Files, etc.) was 24fps. As for the combing artifacts, you should never see them on an actual modern TV, as your TV will automatically deinterlace the video stream and convert it to 60fps. The combing artifacts only become visible when the video isn't deinterlaced properly and the 60i video gets played as if it were 30p. This happened a lot when watching or editing video on a computer, especially in the early days when there weren't good/fast deinterlacers around. These days it can still happen by accident. As for playing back the fields individually like a CRT would, that's possible in theory, but it wouldn't look good: the image would be missing half the lines and thus be half the brightness of a progressive image. It's just easier to deinterlace the image than to try to boost the brightness of an LCD to emulate a CRT.
Broadcast TV engineer here. We are legally as well as technically limited to a certain amount of bandwidth. Everything we broadcast is limited to a 6MHz channel per FCC specs. Increasing the frame rate would require more bandwidth... or more compression. And you wouldn't like how that looks. Studies have found generally that "30" fps is more than enough, especially when so much of what we air is produced in "24" anyway... including LIVE shows. I just ran the Billboard Music Awards which appeared to be in "24," or "faux film" as it's sometimes called. It tends to make a show look more classy.
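The bandwidth point from the replies above can be shown with back-of-the-envelope arithmetic. The figures below are raw, uncompressed rates for illustration only (real broadcast streams are heavily compressed into that 6MHz channel), and the 4:2:2 10-bit assumption is mine:

```python
def raw_mbps(width, height, fps, bits_per_pixel=20):
    """Raw video data rate in Mbit/s, ignoring blanking and
    compression. 20 bits/pixel assumes 10-bit 4:2:2 sampling."""
    return width * height * fps * bits_per_pixel / 1e6

# 1080i60: 60 fields per second, each field is 540 lines tall
interlaced = raw_mbps(1920, 540, 60)
# 1080p60: 60 full 1080-line frames per second
progressive = raw_mbps(1920, 1080, 60)

assert progressive == 2 * interlaced  # 60p carries twice the raw data of 60i
```

Compression narrows the gap in practice, but the raw pixel count is why 60i lets a broadcaster fit more channels into the same spectrum than 60p would.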
Hell, one of our more popular commercials these days was shot on an iPhone.
In cinema, you can gobble up all the bandwidth you need... assuming your storage card can handle it.
Thanks, y'all, for the explanation. So it's mostly a bandwidth concern. I captured my old VHS home-movie holiday tapes back in the day, and I would throw away half of the fields to get rid of combing. Capturing at 60 Hz (or should I say 60 fields/sec) wasn't possible. Certain professional documentaries containing old (live) TV material had those awful combing (or ghosting) problems too. Looked very unprofessional.
@@FilmmakerIQ My 2 cents: Disney- and FOX-owned channels (among others) broadcast 720p60 content. It's said that the rationale for choosing 720p60 (over 1080i60) was that they considered it was better suited for sports content, which has a lot of motion (think ESPN, FOX Sports).
en.wikipedia.org/wiki/High-definition_television_in_the_United_States
Is 60i on a digital progressive screen effectively 30p, since it just combines every 2 fields into 1 frame when deinterlacing, while on a CRT it will look a lot smoother and have that 60Hz soap opera effect?
This is answered in the video. I'm happy to answer any questions from the video.
Though what I think you're asking is: is 60i effectively 30p? Depends on how you convert it; it can be, but it can also be converted to 60p. The reason we say 60i is effectively 30 is because in the old days you always counted 2 fields as one frame. There's no simple way to cut by the field.
@@FilmmakerIQ In PAL regions, some Blu-rays of TV shows are 50i, but I believe they are effectively 25p; 50i is used because Blu-ray doesn't support 25p. There is no soap opera effect, as it merges every 2 fields into a frame and it was originally shot in 25p. However, I've watched 60i Blu-rays that had the smooth-motion soap opera effect, and I'm not sure why.
@@ShaneJMcEntee 50i is the same case as 60i in that 50i is the same experience as 50p. However when you deinterlace you can opt to combine Fields into 25 frames per second.
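Merging every 2 fields back into one frame, as described in this exchange, is "weave" deinterlacing. A toy sketch with NumPy (my illustration, assuming the top field carries the even lines); it's perfect for 25p-in-50i material and combs only when there's motion between the fields:

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave two half-height fields back into one full frame.
    Lossless for content shot progressive (e.g. 25p carried in 50i),
    but produces comb artifacts if anything moved between fields."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # even lines from the top field
    frame[1::2] = bottom_field  # odd lines from the bottom field
    return frame

top = np.zeros((540, 4))
bottom = np.ones((540, 4))
frame = weave(top, bottom)
assert frame.shape == (1080, 4)
```

This is why Monty Python-style mixed material is tricky: the film inserts weave cleanly into 25p, while the studio video portions are true 50i and need bob-style handling instead.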
If the original show was shot on film in Europe, it would be 25p, but if it was shot in a studio on video cameras, it'll be 50i. An example of this is Monty Python's Flying Circus, where you can clearly see the different motion between the film clips and the video portions done in studio.
@@FilmmakerIQ thanks
Wait a minute, there is a discrepancy in your discussion. It has to do with transferring film to TV. It wasn't a 3:2 pulldown; it was, and still is, 4 frames plus one repeated frame. Watch a movie filmed at a full 24 fps and take a look at it frame by frame. You will see 4 different frames in a sequence, followed by a repetition of the 4th frame. No one notices that repetition because it happens so quickly. I realized it when I was editing a film-to-video conversion.
That's only when converting from 24fps to 30fps. TV until recently (as in the last 20 years) was never 30fps. And even the 30fps of today is really broadcast as an interlaced 60i stream.
3:2 pulldown with interlacing is 24fps to 60i. See the demonstration at 12:40.
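The 3:2 cadence can be sketched as a simple field-repetition pattern. This is illustrative only (real telecine alternates top and bottom fields rather than repeating whole frames), but it shows why 4 film frames map exactly onto 10 fields of 60i:

```python
def three_two_pulldown(film_frames):
    """Expand 24fps film frames into 60i fields using a 3-2 cadence:
    even-indexed frames span 3 fields, odd-indexed frames span 2.
    4 frames -> 10 fields, so 24 frames -> 60 fields per second."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

fields = three_two_pulldown(["A", "B", "C", "D"])
print("".join(fields))  # AAABBCCCDD: 10 fields from 4 film frames
```

Frame-by-frame on a 30fps transfer, those shared fields are what show up as the "repeated 4th frame" and as the combed frames where two film frames meet.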
lol, someone actually thought the p was for pixels?
The real problem with those few videotaped TZ episodes was less the 60Hz video than the use of image orthicons, which created a "black halo" or "black aura" around any bright object in the scene.
The 60Hz was a problem too.
Excellent until the very end. 60i, 60p, 50i, and 50p all have the smooth motion we associate with traditional television. But 25p is essentially indistinguishable from 24p, and 30p is very close. For what it's worth I experimented with this using variable speed film projectors in the early 1970s. (I'm an even older fogey than you are.) For more on interlacing see my post here: dgarygrady.com/2015/01/22/interlaced-video/
Agreed that 25 is indistinguishable from 24, but there is a study saying you have to get it up to 26 for the difference to be just barely noticeable.
@@FilmmakerIQ I don't think we disagree. My point was that in terms of motion characteristics, 30p looks more like 24/25p than like 60i or 60p. The difference between 30p and 24p is subtle but visible if you look for it. In North America, some filmed television commercials and music videos are shot at 30 fps to take advantage of the slightly smoother motion while not getting the soap opera effect. By the way, Douglas Trumbull developed a large-format, high-frame-rate system (60 fps as I recall) for use in amusement park rides because it looked more like reality, but when he tried to produce a narrative project he found people disliked the soap opera look. Again, audiences like 24 fps, just as you've been saying.
@@DGaryGrady Yep, we don't disagree :) Every time they've tried pushing higher frame rates in movies, it's failed.