*CORRECTIONS, UPDATES and CALL FOR EXPERTS*

Did you work with bluescreen on film with optical printers? Were you involved in the model filming for Star Trek: The Next Generation? Are you familiar with the processes involved in television CSO? I would love to speak to some experts to confirm my research. If you would like to help, please contact me via my website shiveringcactus.net (my email’s spam folder is particularly aggressive at the moment, so this is the best way)

Inevitably, despite trying to do loads of research I've missed some elements. And I thought I'd summarise them here:

1) TEL-E-SIN-EE, not TEL-E-SEEN - thank you @TheHandOfFear and @jpofgwynedd3878

2) Looks like TVs in the 1950s did not have 576 lines - that came later. "Actually at the time of the coronation the UK used 405 lines (377 visible, ~188 per field), when the BBC started television on an experimental basis they used 30 (yes thirty) lines, moved to 405 in 1936, and started 625 lines in 1964." - @laurdy, also @6ECF01

3) The first laser recorder was developed internally at Pixar in 1984 - @priyeshpv

4) Black-and-white TVs did not use mask holes apparently. I'd summarised two bits of info, although the workarounds by the Telecine department are accurate, but thank you @im.thatoneguy

5) The film perforations are not correct. They aren't really meant to be, just illustrative, but point taken - @jmalmsten and @computationalcinematographer

Thank you for all the helpful corrections - if TH-cam allowed editing a video or even editing the audio I'd replace these within the video.
It's a very nice video, but my eye started twitching when you said CRTs had a resolution of 720x576 or 720x480. That's a modern, digital definition of the SDTV norms. Analog CRTs do not have an inherent resolution: while the vertical resolution was somewhat fixed and standardized at 625 lines (about 576 visible) or 525 lines (about 480 visible), there is no fixed standard for horizontal resolution. Each line is drawn by analog components, and the amount of horizontal detail is determined by the quality of the device capturing or creating the image, the bandwidth of the signal and the display. The picture can have more horizontal detail than would fit into 720 samples, or less, but it isn't bound by 720.
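To put rough numbers on that point, here's a back-of-envelope sketch. The ~5 MHz luma bandwidth and 52 µs active line are typical PAL figures assumed for illustration, not taken from the comment above:

```python
# How much horizontal detail fits in an analog scanline, and where the
# digital 720 figure comes from. Assumed, typical PAL values throughout.
bandwidth_hz = 5.0e6      # broadcast luma bandwidth (varies by system)
active_line_s = 52e-6     # visible portion of one scanline

cycles = bandwidth_hz * active_line_s   # 260 full light/dark cycles
max_details = 2 * cycles                # ~520 resolvable "pixels" worth

# ITU-R BT.601 later chose a 13.5 MHz sampling rate, giving ~702 active
# samples per line, padded out to the familiar 720.
bt601_samples = 13.5e6 * active_line_s
print(max_details, bt601_samples)       # 520.0 702.0
```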
It’s something I completely blanked on, assuming TV resolution was always what it ended up being. What’s even more frustrating is that it’s there on screen when I show the VT Old Boys site!
It's already been pointed out, but at 4:11 you're incorrect about the number of "dots" of horizontal resolution in analogue NTSC/PAL/SECAM. That figure is just what was decided on as "good enough" when we started moving to digital; the signal is actually continuous. The number of lines is correct and distinct. You could argue that, per the Nyquist-Shannon sampling theorem, there is a maximum number of "pixels" you can resolve across a line, which would depend on the frequency response.

You COULD (sort of) argue there were distinct pixels, but only on color displays, and even then the signal wasn't discrete. Various manufacturers used different numbers of R/G/B phosphor dots on the display; I would imagine they more or less copied one another to optimize costs. Trinitron was famous for changing how the dots were arranged, and it was certainly superior for computer displays to the triangle configuration most televisions used. It is still (strangely) used today on some LCD television sets, but not monitors.

8:48 - "Aren't all computers digital?" Well, actually not. Analogue computers still exist. Analogue means it MIMICS the process of something else: you can mimic water flow in a wire, for example, but instead of water molecules you have electrons. This is why it's called "current", and voltage is very analogous to pressure. Analogue computers aren't in widespread use today.

Those are all the mistakes I could spot, all two of them, and they are pretty trivial and nitpicky.
I'm just imagining the FBI asking John Whitney Sr. why he bought an anti-aircraft gun controller and he replies "It's for my job. You see, I'm an animator."
"Okay, well. We were just wondering if you wanted to buy some surplus 40mms." "Oh! I've never seen anything that shoots in 40mm. I'll buy whatever you've got."
Filmmakers/Prop Houses get an FFL (Federal Firearms License) from the ATF.

Despite California being known as a deep blue state with heavy gun control, it has more guns, and more machine guns (including transferable ones) per capita, than any other state. That's largely attributed to the film industry and scientists.
I used to write drivers for outputting computer graphics to film cameras back in the late 80s ... we'd run off a few frames to test overnight (it might take a couple of hours to create a single frame of film), pull the film off the film camera in the morning, nip into the nearest town for lunch and get the local Boots to process them (obviously they'd come back with loads of stickers on them), and then we'd check the slides that afternoon, make any changes and run new tests overnight.

The image was built up by having colour filters and a very tightly calibrated beam intensity, then slowly scanning the beam across a monochrome tube (no shadow mask) at *VARIABLE* speed, slowing or pausing the beam for brighter areas - in that way we could get 4000 lines of resolution.
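A toy sketch of that variable-speed idea: exposure on film is roughly beam power times dwell time, so brightness can be encoded as how long the beam lingers on each spot. All the constants here are invented for illustration; the real driver would have been far more involved:

```python
# Variable-speed exposure: hold beam power constant and modulate how long
# the spot dwells on each position. Constants are illustrative only.
BEAM_POWER = 1.0    # calibrated beam intensity (arbitrary units)
BASE_DWELL = 1e-6   # seconds spent even on a fully black spot
SCALE = 1e-5        # extra seconds per unit of target brightness

def dwell_time(brightness):
    """Seconds to linger on a spot; brightness in 0.0-1.0."""
    return BASE_DWELL + SCALE * brightness / BEAM_POWER

scanline = [0.0, 0.2, 0.9, 1.0, 0.5]
print(sum(dwell_time(b) for b in scanline))   # brighter lines scan slower
```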
That is wild. There was so much more hands-on use of physics and math needed to achieve results than today. And dependence upon physical properties like the emulsion of film. Nowadays, solo CGI artists just complain about the graphics card.
@@ShiveringCactus … slide film is (was?) very different from normal photo film in how it reacts to light, but is (was?) also very much more expensive, so we would use normal photo film for everyday testing, knowing that it wouldn't look right. Given that we also knew how 'wrong' it should look, that was fine for most testing, and we only moved on to proper slide film for final testing. So the images that we got back from Boots must have confused the absolute crap out of them, as they looked nothing like a photo that you'd ever take.

This was mainly for putting presentation graphics out to slides rather than doing actual rendering of animations, but we could do rendering to cine film too if required (using a big 'Mickey Mouse'-looking bulk film back) …

I also did development on what were at the time hugely expensive full-colour Hitachi A4 dye-sublimation printers … our kit was all custom but based on the original Acorn RISC processors. That was the days of 100MB Rodime SCSI hard discs, SyQuest 44MB cartridge drives for removable storage, and, if you wanted to go really mad, the big Panasonic 5.25” magneto-optical external SCSI drives (again, we had to write our own drivers for those), which I think held something like 256MB on a double-sided cartridge that you had to manually eject and flip over to access the other side … how things have changed!!!
BTW, this explains the glow issue mentioned in the video. If the image was displayed statically, the film camera could just have taken a shot of the frame with a sub-one-second exposure. There would have been no afterglow as the computer would just have shown a single frame once on the screen and the shutter could close instantly after the frame was done.

But slowly painting the picture with a modulated beam and keeping the shutter open for an extended length of time would present that issue. Now, instead of the current pixel, you also get the afterglow of the previous pixels and the background glow of the tube itself. You can somewhat compensate for the former (only pixels that are not black will have an afterglow, so you can simply reduce each pixel's brightness by the expected amount of afterglow), but the latter is just there constantly. So the longer it takes to draw the frame, the more background glow builds up everywhere. And even worse, the glow level depends on the time to draw an image, so it may be different for different frames.

Now a slit shutter makes sense, as it would only allow light from the region the electron beam currently is drawing in to the film. This eliminates most of the background glow from long exposure.
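A minimal model of that compensation idea, assuming a simple exponential phosphor decay (both constants are invented for illustration):

```python
import math

# Compensating for afterglow: before drawing each pixel, estimate how much
# light earlier pixels are still emitting and drive only the remainder.
TAU = 1e-4          # assumed phosphor decay time constant, seconds
PIXEL_TIME = 1e-4   # assumed time the beam spends on each pixel

def compensated_drive(targets):
    out, glow = [], 0.0
    for target in targets:
        glow *= math.exp(-PIXEL_TIME / TAU)  # earlier light fades a little
        drive = max(0.0, target - glow)      # subtract predicted afterglow
        out.append(drive)
        glow += drive                        # what we just drew glows too
    return out

print(compensated_drive([0.5, 0.5, 0.5, 0.0]))
```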
In about 1980 I was 14 and I got to visit my friend’s dad’s work. It was Triple I. I actually saw animators working on Tron. They literally had converted a closet to capture the animation onto film. It was painted black on the inside and the door was light tight. It had a film camera (probably a Mitchell) pointed down at the monitor. So, I actually laid eyes on the III system.
I studied 3d animation (and game design) in college, and still learned a LOT in this video. TBF, the tech that went into printing CGI on film is very much obsolete, but it still is a fascinating topic to learn about, and it explains a lot about the origin of industry terms and concepts we are taught.
I was visiting ILM (when they were just an unmarked building in a San Rafael industrial park) while they were filming the "Genesis Device" explosion sequence for "Star Trek II: The Wrath of Khan" 1982. In a small room was a color monitor slowly drawing each frame (more than a minute each if I recall correctly) from top to bottom. Close by aimed at the monitor was a (35mm I believe) film camera, that would automatically click off a film frame as each graphics frame completed. Then the entire process would start again. Pretty time consuming back then!
@@ShiveringCactus The fun part is that I couldn't resist sticking my hand in-between the monitor and the camera, so if you watch the sequence very carefully there is one frame with a hand in it. Just kidding. But the thought did occur to me at the time ...
@@ArneChristianRosenfeldt Who said anything about 1080i? It was actually more like 500 lines, but various techniques were used to make it look better. Here's an article describing the process -- it's that Genesis Demo rendering and filming that I described watching in person that day. Reading the article again now, I see the time per frame was much longer than I remembered after all these years, though apparently I walked in while the shorter duration frames were being drawn! - ohiostate.pressbooks.pub/app/uploads/sites/45/2020/01/StarTrekII_GenesisDemo.pdf
One small note. "The size of the phosphor dots was just bigger".

That's not really true. Black and white televisions didn't need masks so there are no dots and larger black and white TVs are just softer. The entire screen was coated in phosphor material, not little individual dots of phosphor.

Color TVs usually had different numbers of mask holes depending on their size. For instance my little 5" travel TV that I had in my car for long road trips had very very few mask holes and very few dots. My large 32" trinitron standard definition TV had way more mask holes.

For 1953 before color television, the coronation undoubtedly would not have had to deal with moire from electron beam masks.
That’s really interesting. I found so little information online about the black-and-white TVs. What I did find was colour TV descriptions and that seemed to line up with the information on the old BBC websites. I know they over-powered the beam to reduce flicker as it scanned the alternate lines, and I tried to put that in context.
@@ShiveringCactus The Technology Connections series is really excellent as a deep but accessible guide to analog TV, and I learned quite a bit from it: th-cam.com/play/PLv0jwu7G_DFUGEfwEl0uWduXGcRbT7Ran.html

The main takeaway is the reminder that television is analog, so there is no horizontal "resolution": there's a continuous wave of infinite "detail" to the electron beam (assuming you count noise as "detail"). The lines are the only unit with discrete counts. So horizontally analog has infinite resolution, while vertically there are discrete scanline counts. Since analog television doesn't have a "resolution", the color dots could theoretically also be infinitely small (ignoring the physics of refraction from the color mask). So you could have an extremely fine color mask, down to your refraction limit. On a large TV that could be 4k "resolution" worth of masked color dots.

However, most large-format "TVs" of the analog era were projectors with three separate black-and-white CRTs behind filters. These had convergence and focus issues in lining up each color, but no horizontal moire, because there were zero dots: just lines, from three black-and-white screens of "infinite" horizontal detail layered on top of each other.

For something like a telecine that exposed each color channel separately, there also wouldn't be a color mask, so it would have infinite analog detail horizontally, but discrete lines vertically. So you could get moire from the scanlines but not from a color mask.
@@ShiveringCactus B&W CRTs don't have a shadow mask at all. Instead, they have a uniform coating of phosphor across the screen. Basically, what determines their resolution is how sharp the electron beam is focused and how fast it sweeps across the screen. I'm no expert, but I'm guessing they used a B&W CRT because the technology existed back then to make an extremely high-res one, at least compared to the best color CRTs of the time.
In the early 90s, to get animation to tape we had a frame buffer, about the size of a modern desktop, and all it did was hold a single 24-bit 640x480 image and generate a component video signal. Then we'd write scripts that would load an image into the buffer, roll a Betacam deck back 5 seconds, pre-roll, record that single frame to tape, load a new frame into the buffer, rinse and repeat. Getting field-rendered frames was a whole other issue early on. Took around 10 minutes to get a single second of animation to tape.
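The arithmetic behind that ten minutes checks out if you assume NTSC and roughly twenty seconds of deck work per frame. The 5 s rollback is from the comment; the other figures below are guesses:

```python
# Single-frame insert editing to tape: every frame pays the full cost of
# a rollback, a preroll and the edit itself. Figures are rough guesses.
FPS = 30              # NTSC frame rate
ROLLBACK_S = 5        # roll the deck back five seconds
PREROLL_S = 5         # run back up to speed before the edit point
EDIT_AND_LOAD_S = 10  # record the frame, then load the next image

per_frame = ROLLBACK_S + PREROLL_S + EDIT_AND_LOAD_S
print(FPS * per_frame / 60, "minutes per second of animation")   # 10.0
```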
So this is 921,600 bytes, roughly 1 MB. The Amiga 1000 had 1 MB in 1985. Did the framebuffer interleave banks? I think the EGA card set up the address on 4 banks at once and then output-enabled them in a fast sequence. Maybe with discrete DACs it makes sense to really use 24 bit instead of 8 bit. The sequencer in the EGA card needs a latch per plane. With custom hardware it might be easier to have two address busses: while we cycle through output enable on banks 0..3, we set the address on banks 4..7. 8 banks times 3 bytes is a lot of modules.
@@ShiveringCactus Ya, and that 10 minutes per second didn't even count render time. Splitting the render up onto multiple computers, running around with Jaz drives. It all got better when Avids, Media 100s, and CoSA (early After Effects) came around in the mid 90s.
The 'frame buffer' still exists today in computer game graphics. It now generally means the location in video memory where the final composite image of the scene is built up, as the shaders each render out their fragments.
@@akaHarvesteR I always have to look this up: there can be multiple fragments from the same triangle per pixel, depending on various multisampling parameters and OpenGL state, and there will be at least one fragment produced for every pixel area covered by the primitive being rasterized. For 4k displays at 144 Hz there is about one fragment per pixel, or even fewer when you consider the path-tracing part. This language is outdated; maybe it makes sense when you render offline for cinema. At least it should be clear: fragments exist only in the GPU, while the framebuffer contains pixels. On the N64, for example, fragments have greater range and precision in their RGB values throughout the GPU pipeline, while the framebuffer pixels are 15-bit with a dither pattern.
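For the N64 example, the distinction looks roughly like this. A sketch only: the 5-5-5 bit packing matches the 15-bit format described above, but everything else (including the omitted dithering) is simplified:

```python
# Fragments vs pixels: fragments carry high-precision RGB through the
# pipeline; the framebuffer stores quantized pixels. 15-bit 5-5-5 packing
# per the N64 description above; dithering omitted for brevity.
def fragment_to_pixel(rgb):
    """Quantize a float RGB fragment (each channel 0.0-1.0) to 15 bits."""
    r, g, b = (min(31, int(c * 31 + 0.5)) for c in rgb)
    return (r << 10) | (g << 5) | b

print(hex(fragment_to_pixel((1.0, 0.5, 0.25))))   # 0x7e08
```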
Extremely good explanation and animation. This makes me feel like I'm back home with my dad (big time TV producer and sometimes filmmaker) listening to him explain the tech he used to me. Thanks so much.
I'm shocked that there's a filmmaking topic like this I DON'T ALREADY KNOW and understand... but I have always wondered HOW (in practice) this was done. Wow, thanks!
It’s strange that it’s just a hidden part of the process. And from posts in the comments here I’ve learned it is still being used, especially for IMAX.
This is such high-quality work, you should make a separate channel for video essays like this that appeal to an audience bigger than the After Effects tutorial people!!
I loved your documentary. At the time TH-cam showed me your video, I had my script written, but the stories I told were really similar to yours, which forced a rewrite. 😆😪
This was a great video. When you were talking about the BBC and Doctor Who I'd hoped you'd cover the pioneering effects work they did in the 70s with CSO and yellowscreens, but I understand this was about film.
Massive congratulations on an informative and entertaining video, ShiveringCactus! You answered so many questions I've had for years about early CG. Just spectacular. I don't understand how you only have ~6K subscribers, but I'm proud to add one more :^)
Wow, thank you. I make videos I’m interested in and there’s a lot of competition in After Effects tutorials. But I’ve really enjoyed this departure and it seems to have resonated. Hope I keep you entertained.
Triple-I's system was more than your run-of-the-mill PDP-10. It was a one-of-a-kind PDP-10 implemented in emitter-coupled logic, called the Foonly F1. Completed in 1978, it had an operating speed of 4.5 million instructions per second (MIPS). It would generate images of 6000 by 4000 pixels, rendering them onto the PFR, and onto the film recorder. The F1 had a disk capacity just large enough to hold two complete frames: the frame being rendered, and the frame being printed. The rendering software was called TRANEW, and it was written in Fortran-IV.
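That two-frame disk limit makes sense once you do the byte math. The 24-bit depth below is an assumption for illustration; Triple-I's actual per-channel precision may have differed:

```python
# Why the F1's disk held only two frames: one 6000x4000 image is enormous
# by late-70s standards. 24 bits per pixel is assumed here.
width, height, bytes_per_pixel = 6000, 4000, 3
frame_bytes = width * height * bytes_per_pixel        # 72,000,000 bytes
print(2 * frame_bytes / 2**20, "MiB for two frames")  # ~137 MiB
```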
Well, it did create a strong reaction, so he’s been dumped for my new VFX History video: Why is it blue in bluescreen and how does color film work? th-cam.com/video/Yhf-QQMjHwA/w-d-xo.html
Really great deep-dive! One other thing I would add is that an exact measure of film resolution is a bit hard to pin down, it's usually expressed in continually decreasing contrast for increasing line density (what an MTF plot shows). On that note, over 100 cy/mm is easily attainable by many films and that translates to ~5000 lines along the short axis of a 35mm still frame. But, for some other stocks, 120, 150, even 200 can be hit in color. In black and white things can go higher. Lastly, if you use microfilms, you can attain 800+ cy/mm (Adox CMS 20 II, Fuji HR-21, etc). So yeah, very variable! One other fun thing is how film recorders advanced. I think some others have mentioned IMAX filmouts use a CRT based system that they've kept around to achieve that resolution, but there are other systems that could achieve that or better. The AGFA Alto from 1992 could output 16k images onto frames of film (8k max for 35mm), it was used similarly as you mention in the video: a high quality storage medium due to the lack of storage for something so high resolution, or merely as the output format. And there were more examples from around this time, too!
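The ~5000 figure follows directly from the units: a cycle (line pair) is two "lines", and the short axis of a 35mm still frame is 24 mm.

```python
# Converting MTF-style cycles/mm into "lines" across the frame.
cycles_per_mm = 100   # resolving power at usable contrast
short_axis_mm = 24    # short side of a 24x36 mm still frame
print(2 * cycles_per_mm * short_axis_mm)   # 4800, i.e. roughly 5000 lines
```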
A related early piece of tech was the digital frame store. 1970s computers could render film-quality effects and ray-traced graphics, but not in real time, so they had to commit a clip to film to see what full-motion sequences would look like. Then came digital frame stores in the 1980s. They were refrigerator-sized racks, stacked with a few GBs of RAM, and capable of playing short clips of full HD-quality video. They used a SCSI interface similar to hard drives of the era. And they supported 1035i analog HDTV output (the original MUSE HDTV standard from the early 80s was 1035i; 1080 didn't become the standard until the early 90s). In the early 90s Sony came out with reel-to-reel helical-scan magnetic tape capable of real-time uncompressed HDTV, but it was very expensive and RAM prices were falling, so digital frame stores were still commonly used in the 1990s.
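Some rough math on why "a few GBs" only bought short clips. The sampling assumptions below are mine, not from the comment:

```python
# Uncompressed HD eats RAM fast. Assume 1920x1035 active picture, 8-bit
# 4:2:2 (2 bytes/pixel on average), 30 frames per second.
bytes_per_frame = 1920 * 1035 * 2          # ~3.97 MB per frame
bytes_per_second = bytes_per_frame * 30    # ~119 MB/s
print(4e9 / bytes_per_second, "seconds of video in 4 GB")   # ~33 s
```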
Very interesting! FYI, back in the late 1980s a tiny animation studio called AVA used three color-wheel camera systems to record digitally painted animation to film. First was a Polaroid freeze-frame unit modified to fit a Double M 35mm camera (video resolution at best; it was used for most new animation in the BUGS VS. DAFFY Warners TV special). Second was a 20' dark tunnel with a Sony Trinitron monitor at one end and the same Double M camera at the other (for various projects including THE GOOD SAMARITAN for Omni Productions). Last was a Matrox tri-color recorder originally used by Pixar, also fitted to the Double M camera (tests for Fern Gully, music videos, etc.). AVA closed in the early 1990s. All future digital animation projects I worked on were direct to video, save for a Disney Australia show recorded on film from saved digital files.
That reminds me of how I took pictures of my favourite Commodore 64 games with my dad's Nikon F301 off the monitor onto slide film, then projected the slides onto the side of the neighbour's house. 11-year-old me felt so avant-garde.
@@ShiveringCactus Sweet. I took a class that covered similar ground to your video. There was some stuff in it I found really interesting. It's fun to imagine how they experimented with effects at the time as well.
I’ve never touched one physically, but I’ve seen a telecine in action. My ex worked in a post-production house in the early aughts. I remember their telecine being something of a relic at that time but still very much in use, like a COBOL mainframe the govt definitely wants to scrap but a better alternative was never identified for that specific issue. Or in this case, never would be, as I think that process officially died with the HD switch. It was massive (they had to reinforce the floor under it), temperamental, always under some form of maintenance despite an unlimited maintenance budget, and was basically the bane of users because oftentimes you only got one shot at it and if you didn’t capture properly, you were just flat screwed. Can’t say the name of the company but man, they had two of everything that ever recorded anything to film. But only one telecine. They weren’t stupid, after all.
Actually at the time of the coronation the UK used 405 lines (377 visible, ~188 per field), when the BBC started television on an experimental basis they used 30 (yes thirty) lines, moved to 405 in 1936, and started 625 lines in 1964. We even had sets that deflected the electron beam electrostatically.
NTSC was 480 lines plus 45 "reserve" lines for early TVs to be able to catch up with each field, so that any working TV could display a complete interlaced picture at all times. PAL was 576 lines plus 49 (some of which were used from the 70s to the 00s for Teletext broadcasting). BBC2 used PAL from 1967; before then it was a largely PAL-compatible B&W system. The 405-line picture was a prewar standard that lasted until 1984, although BBC2, Channel 4 and later channels never used the 405-line system.
Film is one of the things I don't want to get back to. :) I started in this industry doing model work, matte paintings, cel animation and assembling comps on an optical printer. We started building our first own film printer and a custom frame buffer (fridge-sized) back in 1983. It took a year from there to build it and write the control software around our existing PDP-10 and (if I remember correctly) an IRIS 1000 with an IBM HDD rack (also fridge-sized). In the end this allowed us to store AND output elements at up to 2000 lines, all from the same room/facility. :)
One thing I’ve learned while researching these topics is that there was never a point when the technology was “done”. Every movie seems to have improved with its own iteration of each technique.
Great video! - Being so nerdy, there are always going to be a few things one can nitpick; here's mine: you showed 8-perf film running horizontally, however a motion picture camera uses 4-perf running vertically. Also, arguably 4-perf film has a much higher resolution, given there are 10k scanners out there. Btw, I'm telecine-transferring a 16mm copy of the Queen's coronation at 4k this evening! Next gripe - all that interlaced footage! Trivial comments, well described, just passing on the information for your knowledge.
I went back and forth over depicting film strips accurately. I'd created the reusable content and was sort of aware it wasn't accurate, but it allowed me to illustrate the idea without having to get into vertical depictions and explanations. But I'll force myself not to do so next time. One thing I struggled with was finding an image of a film strip. Let me know if you have a source. And the interlacing. Tell me about it. Unfortunately that was baked into the clips I used and nothing I tried got rid of it. Amazing coincidence that you've been working on the Coronation. Is it a personal copy or for a larger audience?
@@ShiveringCactus The interlacing is such a baked-in curse in the world of archive footage. It's understandable, yet ironic, that the earliest clips of video being burnt to film carry such an artefact! Look up "35mm with optical sound" and you get lots of 4-perforation images that clearly show film running vertically - and show the amount of image that had to be given up for the soundtrack. Formats like Super 35mm and Super 16mm acquire the image without wasting that space and also re-centre the optical axis of the lens. Film is fascinating stuff, truly tactile imagery. The Queen's Coronation is just for my father's archive channel; not a great copy, but the only one of this point of view that I can find. Once again - GREAT VIDEO you made, I was only pointing those things out for the sake of correctness as it seems you are very serious about the facts. Cheers!
Thank you. It’s always a difficult balancing act. I left out several things, like the titles of the Black Hole (which was really the same story as the other uses) and I avoided talking about Film Scanning, which was also impressive but I found quite confusing.
8:04 The British didn't use the 625-line system at that time; they used a 405-line system, so the resolution would be about 360 lines, or halved it would be about 180 lines.
I am 40, and I remember very well when the ArriLaser came onto the market and allowed automatic, self-calibrating, high-speed exposure of CG graphics for movies on film, blowing the competition off the market.
In the very early days of television, the frame rate was indeed locked to the AC power line, but before long sync signals were embedded into the video so that it was no longer locked to the power. An early television engineer demonstrated this with TV reception in an airplane. When the US developed color television, the frame rate was slowed slightly to prevent undesired interaction with the color subcarrier. The vertical scan rate remained close to the AC power frequency to reduce the annoyance of hum bars and flickering when scenes were illuminated by certain AC lighting. In the US, at least, the television networks were locked to stable atomic time bases and their video could be used as a frequency standard, whereas the power line would experience short-term frequency variations in response to changing load.
Amazing video!! There's a little error, though. The first laser recorder was not the Arrilaser in the late 90s but came way earlier, the first being a laser recorder developed internally at Pixar in 1984. Later on, research for the commercially available Kodak Cineon "Lightning I" began in 1989, and it came out somewhere around 1993/1994; its successor, the "Lightning II" recorder, came a few years later (but even that before the Arrilaser). As a kid, I used to pester every company on the globe to learn what is needed to create digital effects and who makes the hard- and software, so I ended up getting brochures from Kodak/Cineon as well - that's why I still remember that :)
Laser printers are old, as is the HeNe laser. What took them so long? Sensitivity to the color red? HeNe can be tuned to green! I suggest a ring laser with a Faraday isolator and a Pockels cell to let out 0 to 1% of the light. So we only need moderate voltage to drive the cell and also avoid problems with the piezo effect. Power to the tube could vary inversely to keep a steady temperature (compensating for the small loss of power through to the film). And to have a darker black: just never let the laser go out completely.
It’s weird that you describe The Coronation as the introduction of telecines. The German pioneering television service of the 1930s already used this principle. That also has the nice side effect that some of its broadcasts have survived. The technique was clearly devised decades before the events you describe.
What you describe is a vidicon, and vidicons were almost never used in TV; they were used in space exploration, light amplifiers and security cameras until CCDs and later CMOS rendered them obsolete by the mid 80s. TV cameras used iconoscopes and later image orthicons. The iconoscope used a mica plate with silver dots on one side, covered with a low-work-function metal or semi-metal, usually cesium. These emitted electrons, and a sweeping electron beam on the other side acted as the other plate of a capacitor, measuring the charge of the dots. The image orthicon worked in a similar way but used a second plate and secondary emission to enhance the signal-to-noise ratio.
Why multiple frames? I thought that film was always faster than the computer and you would expose as soon as a single frame was ready. I could understand that for stability a whole sequence would be stored digitally on diagonal track tape.
Great video! Filming CRTs seems to be a permanent struggle even in the modern age. I read somewhere that before the movie Being There (1979) there was no safe way to film a TV set in a scene without the common artifacts (rolling bars, stuttering, moire patterns...). Cinematographers would use optical tricks to fake an image, like in 2001 (1968), or just ignore the problem, like in Network (1976).
7:23 back in 1953 the TVs were monochrome. The phosphor coating had no dot pitch, but was continuous. What you DID have was scanlines. Those would have been visible on film. Overexposing the tube would have made those disappear. Maybe also doable by adjusting the focus voltage on the tube. But yes, it made the image blurrier. Also there was no Moire pattern as neither the film nor the tube had regular grids. This only became an issue with later color systems when using CCDs and dot pitched tubes.
In the making of documentary of _The Frighteners,_ they show that after their CGI and VFX work is done, they have this machine that prints the rendered/composited images to film for it to be processed afterwards.
Interesting. It covers the part of CG-to-film history before I was involved in it. I worked for (and developed the electronics at) a company in the mid 80s that built a 35mm camera that, in conjunction with the Dunn 635 (hi-res monitor system), was used by Pixar to make Red's Dream and Tin Toy... ahh, the memories.
Excellent video! It definitely takes me back to my childhood, watching those '80s CGI animations, including NASA's state-of-the-art 3D animations of the planets, using images relayed back by the Voyager probes.👏👏👏 👍
16:38 This is the first time I've heard anyone put the essential qualities of 1980s CGI to words. He explains the dull pastel highlights and indistinct edges well.
My hand to God: I was literally wondering how they got computer-edited footage back into a film print when this video appeared in my recommendations.
Amazing how technology progresses over 40 or so years. Now a gaming computer can render 3D graphics at 100fps or more at 4K resolution, direct to the screen. And even crazier is how some streaming productions are building digital mattes using LED walls and Unreal Engine 5. I honestly never thought I'd see the day when a video game engine made for FPS games, where you frag your buddies with a railgun, would become a tool for multi-million-dollar productions.
13:43 Similar to the Disney film: TRON was also an instruction on the PDP-10, one of the 64 logical test-and-modify instructions. The TRON mnemonic means _Test accumulator, _Right half immediate, set selected bits to _One, and skip unless all selected bits are zero (the skip phrase being coded as "N" in the mnemonic).
Telecine is usually pronounced « Telesinny ». 6:22 I started hand animation in 1988/89 and computer animation in 1989/90, and the only way to cycle frames was to render one frame and send an edit command to tape for that one frame only. The video buffer was then cleared and rendering continued. A 30-second sequence would often take a week to print to tape. After learning what the Quantel Paintbox could do, I lost patience and became a video editor instead. 🤣 Worked with the very first version of After Effects.
I worked for MATRIX (originally ImaPRO) in the 90s, just after their takeover by Agfa. Their CRT-based exposure units (film recorders) were used by Pixar for Tin Toy and Andre & Wally B. Agfa-MATRIX then went on to produce large-format laser X-ray printers.
I worked at Matrix Instruments from July 1985 till July 1988, primarily writing device drivers for the Imapro QVP render board (rebadged as the MVP, it was a complete 68000 CPU computer that was on an ISA bus card and it rendered a vector graphic language called SCODL) which would feed pixel data to the Imapro Film Recorders that Matrix had an exclusive license to resell in the U.S. and Japan. When I started working for Matrix, we could render 2k images onto 35mm color film; by the time I last worked for them (as a contractor) in 1990, they were approaching 16k frames. In 1989, Agfa Corp. bought Matrix and renamed the firm Agfa Matrix Instruments. Matrix film recorders were used by Pixar, among others, to render on film the CGI images they were creating (Agfa Matrix is mentioned in the credits of 'Tin Toy').
Fond memories of working for DEC in Reading, many many years ago, during the 11/70 and DEC-10 era. I knew they had been pioneers in the early days of computer gaming with the PDP-1, but hadn't realised that they were also pioneers in the early days of movie special effects. The VT100 was just being introduced when I was there, replacing the VT52. DEC was a great company to work for, and it was sad to see it disappear into Compaq and then HP...
In the late 1980s, I had a summer job working on 3D visualization software for scientists at NCAR in Boulder, Colorado. I did most of my programming on VT100 terminals and their 1980s descendants, which could only display text. And we had things like Tektronix storage-tube terminals that could display wireframe graphics. But to view high-resolution color images like what my software produced, we'd have the minicomputers we were using output to expensive "graphics computers" from Chromatics or Raster Technologies that had capabilities like what would come to personal computers in the next decade: big 1024x768 or 1280x1024 CRTs with 8-bit or 24-bit color. And to record to film, we used something called a DICOMED that was basically a camera mounted to a CRT that it could photograph. It all felt very cutting-edge and high-tech. Over the coming years, we got DEC graphics workstations that made the high-res color a bit more accessible.
I had a still-image film recorder in the early 90s. It was basically a little CRT inside, with a Polaroid camera mounted on the outside pointed at the CRT, and some software/hardware connected to the computer.
The film printer for Tron used a CRT light beam printing directly, frame by frame, onto a portion of 65mm film. They used three black-and-white films, one for each color filter channel, like the Technicolor three-strip process; laser printers only arrived in the mid 90s. But worse than that, they had no floppy disc, so they rendered each frame on the fly, and some people typed the code instructions for the animation by hand.
You have to copy film for cinema anyway, so I don't see a problem with a separate cheap b/w film for the master. Still no justification for not scripting the whole sequence. Somehow, today with Blender, correct exposure is much more of a problem than back in the day with only directional lights.
The Genesis Torpedo demonstration in "Star Trek II - The Wrath of Khan" was even a multi-exposure, because rendering ALL the elements for one sequence was just not feasible. So they did one take, and then another one, and so on, until all the elements needed had been layered into a single complete shot. It was a creative way to bypass the technical limitations of the rendering machines to get this complex shot. And believe it or not, the CGI for "The Last Starfighter" could have looked EVEN BETTER, but they couldn't manage it in the small timeslot of this movie's production. So if they'd had a few months more, the CGI would have looked even more realistic and detailed. Omnibus were also responsible for the CGI super-reflective shots of the drone ship in "Flight of the Navigator".
Did they use a ray tracer? Occlusion makes it really hard to render a scene where not everything is in memory. I’d rather swap tiles of the frame buffer and z buffer than mess with the analog stuff.
@@ArneChristianRosenfeldt I'm not sure what they used. Back then, so many renderers were very proprietary and every house had their own style. But they sure would have had more precise light and shadows and reflections on the shiny surfaces, I think. The movement would probably have had some more motion blur too.
So.. I had the Mitchell/Acme 35mm camera used by Evans and Sutherland in the 1970s to record their early animations. I say had, as it was stolen in 1998. Sigh. Long story how I came by it.
Thank you. There’s very, very little information about this. As far as I can tell, images were plotted out on 2D graph paper and each point’s coordinates were then typed into an array. I couldn’t get a confirmed source though, so had to skip going into detail on that.
I have no idea what prompted me to ask the question, but once it came into my mind, I was really stuck for an answer. Hopefully anyone else wondering the same will now find this video.
I love The Last Starfighter. I spent ages tracking it down as an adult, when DVDs were just starting to take off and ordering things from the other side of the world was more of an ordeal.
I used to work for a digital effects creator for TV, and it was all hardware-driven. CEL Electronics was one of the bigger companies. I even did some effects processing as part of uni.
The misunderstood ways the perforations are shown bug me more than I really should let them. :)

In short: with vertically fed films, the perforations, along with the film gate opening, determine the vertical size of the image exposed to film. Perforations also make it possible to precisely position each frame of the film as it's held still, both for exposure in the camera and for projection in a projector. For normal 35mm cinema productions, 4 perforations per frame was and still is the norm.

The issue with the visualisations here in the video is that, in the running horse example at the start, the "film" has way, way too many perforations per frame, suggesting a film size I have never heard of in any cinematographic system using vertically fed film. And the perforations are moving slowly downwards, suggesting a pulldown that is not synced to the framerate of the film. This should make the image of the horse move down with it, but it doesn't, breaking any illusion that we are watching a real projection.

Also, the still from Star Wars there, in the part talking about film resolution, is a frame of VistaVision. VistaVision has the frame travelling horizontally through the camera, and with that they could double the horizontal width of the exposed frame. This was necessary on effects-heavy films like Star Wars, as they could get footage on regular 35mm film but with a grain size closer to 65mm film. This results in a much cleaner final image when the composite is squeezed down to the final 35mm scope prints. VistaVision uses a film width of 8 perfs per frame, so, using the equivalent pixels-per-grain ratio as with regular 4-perf vertically fed 35mm film, the resolution of the frame shown there would be 7200x5400.

But this video did FINALLY give me a number for the "high resolution" screens used in making films like Tron. Everywhere I looked, it always said just "high resolution screens". Although, I do wonder about that number in some shots of Tron where clear aliasing can be seen on fine diagonal lines on the BluRay I have, artifacts suggesting a screen resolution lower than 2K. But other CGI vendors were involved in that film that likely didn't have access to that 5.4K screen.

But it's a very comprehensive video, so my nitpicks are only about the visualisations of the physical films shown.
Thank you - I've made a pinned post to note these corrections. I struggled to find an image of a regular film strip and decided to use my own to illustrate the concepts. If you have a source, please share and I'll use it in future.

The source I used for the film grain was officially the two papers by Gary Demos:

Demos, G. (2005) My personal history in the early explorations of computer graphics, Computer Graphics and Computer Animation: A Retrospective Overview. Available at: ohiostate.pressbooks.pub/app/... (Accessed: 09 February 2024).

Demos, G., Brown, M.D. and Weinberg, R.A. (2012) Digital Scene Simulation: The Synergy of Computer Technology and Human Creativity, Digital Scene Simulation. Available at: ohiostate.pressbooks.pub/app/... (Accessed: 11 February 2024).

(I think it might be in both, I can't remember.)

I posted this video on the r/vfx sub-Reddit and one interesting thing they mentioned was that the film emulsion was very forgiving of single pixels and would add some blurring. Maybe the Blu-ray was cleaned up and highlighted these more, or maybe not. In the Cinefex interviews with Triple-I they were always claiming the 5k resolution. I was pleased to find a second source though, in case it was all hype.
A minor correction: although TV refresh rates generally match the AC line frequency, TVs do not actually use the AC frequency for anything - it would be incredibly impractical, as any variation in phase between the supply at the transmitter and the receiver would make the picture incredibly unstable. TVs generate their vertical refresh using a PLL oscillator that locks to the vertical sync pulse of the signal.
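A toy version of that loop, assuming a simple first-order PLL (real TV sync circuits also handle noise, flywheel behaviour and so on; period and gain values are illustrative):

```python
# First-order PLL: a free-running vertical oscillator nudged toward each
# incoming sync pulse until it tracks the broadcast's timing.
NOMINAL_PERIOD = 1 / 50.0   # free-running vertical period (PAL), seconds
GAIN = 0.1                  # fraction of phase error corrected per field

def track(sync_times):
    period, t, ticks = NOMINAL_PERIOD, 0.0, []
    for sync in sync_times:
        error = sync - (t + period)   # are we ticking early or late?
        period += GAIN * error        # pull the oscillator toward the sync
        t += period
        ticks.append(t)
    return ticks

# Lock onto a source running slightly fast (19.9 ms fields):
print(track([0.0199 * n for n in range(1, 6)]))
```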
"If you didn't see The Last Starfighter you didn't miss much." EXCUSE ME!?!?!?!? It's a cult classic and I love it. In fact I'm planning on 3D-printing a Gunstar as my next multi-color project.
By the time of the coronation the resolution was not 576 lines at all. The system used was 405 lines, which in modern terms is 376i; which makes the recording about 188 lines.
Thank You!! Someone finally did a video explaining it! 🙏I had to research it, back in 2008. When I'd ask back then, people would just shrug or talk about something unrelated.
Very good - a couple of rough approximations here and there, but nice. Picky: 'telecine' was pronounced 'telly-sinny', and 'moire' is 'mwarhray'. Telecine is now just about extinct. Happy days, Ol.
There is a video on here somewhere which shows a film, I think Toy Story, being recorded onto film using a high-definition monitor, line by line, because it couldn't get enough power to expose the entire frame in one go.
CGI is also, oddly, a term that used to be used in website development. People used to store "CGI scripts" - scripts that were needed to interface with things only visible on the server, such as databases - in the "cgi-bin" folder. It is completely unrelated to computer graphics, and it makes going through old forum posts and StackOverflow posts a bit... odd.
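For anyone who hasn't met that older meaning: a classic Common Gateway Interface script is just a program the web server executes, with request details passed in environment variables and the HTTP response written to stdout. A minimal example (the fallback path is a made-up placeholder):

```python
#!/usr/bin/env python3
# A minimal CGI (Common Gateway Interface) script of the kind that lived
# in cgi-bin. No computer graphics involved.
import os

print("Content-Type: text/plain")   # HTTP headers first...
print()                             # ...then a blank line ends the headers
print("Hello from", os.environ.get("SCRIPT_NAME", "/cgi-bin/hello.py"))
```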
When I started working we used to print on silver film and paper. The printers didn't use ink but a laser to do the job. We printed 70-100MB uncompressed pictures on 24x36 or medium format with a DOS station lol; the printer was washing-machine sized. We used to print on paper too, same system but much, much bigger (room-sized), with an Alpha station on Unix for the software part.
4:33 The number of resolvable details for a film stock is not equal to the number of silver halide crystals on a frame of film. For one, neither developed black-and-white nor colour negative contains any silver halide crystals: they contain filamentary structures of metallic silver or dye clouds, not AgX crystals, as those are fixed away after processing. Of the developed metallic silver grains, most are smaller than the wavelengths of visible light. The typical resolving power quoted for colour negative (6 microns, or 80 lp/mm) is the point where contrast at those spatial frequencies falls to about 20% of scene contrast. The reason resolving power covers a larger area than the size of the grains is that, when the film is exposed, light scatters through each sublayer, producing a larger point spread function than the original area of incident light. Halation is another parameter, and in development the use of development-inhibitor and accelerator-releasing couplers affects resolution too: the inhibitors sharpen edges and the accelerators soften them, given the diffusion radius of oxidised colour developer and what these couplers release.
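The figures quoted there are self-consistent: a 6 micron resolvable detail corresponds to about 83 line pairs per mm, matching the 80 lp/mm number.

```python
# Cross-checking the resolving-power figures for colour negative.
detail_mm = 0.006            # 6 micron smallest resolvable detail
print(1 / (2 * detail_mm))   # ~83 line pairs per mm, i.e. roughly 80 lp/mm
print(2 * 80 * 24)           # ~3840 "lines" across a 24 mm frame height
```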
Have always enjoyed the ShivCac tuts... and while the content here is fabulous, it's the AE AI PS etc, etc mastery here that gets me...excited... which is saying a lot cos not too much does these days... I mean, besides good hair days... Thank you! ps I loved The Last Starfighter too...
@@ShiveringCactus Motivated? What is this obscurity that you speak of? And can one drink it with a chaser? That said, it is you that keeps I for the motivationalism... well, you and home grown...er...olives. Yes, olives... that works.
This is a bit of a confusing topic, because in those days film was brilliant in its quality in pixels/silver grains, while CGI and television were not at all. Of course everybody tried to improve this process, but on the film side there was nothing to improve, so you depended on the CGI side. Then you show the subject of the 24/25 frames a second the BBC had to deal with, and how they forced (by brute force) a 25-frames-per-second film camera. Well, that was no big deal; the frame rate of a camera is a simple mechanical resolution (and revolution). They already needed this kind of solution for other problems, like NTSC material from the USA (30 fps), so it was in their DNA to solve these issues up front. CGI was far behind this problem of quality because the computer resolution was too slow, too little. I animated a lot of graphics on the Amiga computer to style my TV programmes on local TV in Amsterdam. On a TV set this worked out well, and I never felt the need to push that to film in a higher resolution. Nowadays, in the HD environment, you wish you could take this Amiga technique into HD as well, but it isn't there. And that is exactly what bothers me: this search for perfection in the cinema world doesn't fit the need for completing earlier successes. Let somebody reinvent the Amiga as a graphical machine with a 4K resolution. Then we'd have a solution that fits film.
Really, you were involved with the model filming? I’d love to know more, would you drop me a line using the contact form on my website: shiveringcactus.net/contact
That South Park-looking guy though... Very funny how I was wondering how they did that, and then I found this video. It's only been 2 days since this was uploaded.
*CORRECTIONS, UPDATES and CALL FOR EXPERTS*
Did you work with bluescreen on film with optical printers? Were you involved in the model filming for Star Trek: The Next Generation? Are you familiar with the processes involved in television CSO? I would love to speak to some experts to confirm my research. If you would like to help, please contact me via my website shiveringcactus.net (my email’s spam folder is particularly aggressive at the moment, so this is the best way)
Inevitably, despite trying to do loads of research I've missed some elements. And I thought I'd summarise them here:
1) TEL-E-SIN-EE, not TEL-E-SEEN - thank you @TheHandOfFear and @jpofgwynedd3878
2) Looks like TVs in the 1950s did not have 576 lines - that came later. "Actually at the time of the coronation the UK used 405 lines (377 visible, ~188 per field), when the BBC started television on an experimental basis they used 30 (yes thirty) lines, moved to 405 in 1936, and started 625 lines in 1964." - @laurdy, also @6ECF01
3) The first laser recorder was developed internally at Pixar in 1984 - @priyeshpv
4) Black-and-white TVs did not use mask holes apparently. I'd summarised two bits of info, although the workarounds by the Telecine department are accurate, but thank you @im.thatoneguy
5) The film perforations are not correct. They aren't really meant to be, just illustrative, but point taken - @jmalmsten and @computationalcinematographer
Thank you for all the helpful corrections - if TH-cam allowed editing a video or even editing the audio I'd replace these within the video.
Good job publishing corrections and updates! Appreciated ;)
It's a very nice video but my eye started twitching when you said CRTs had a resolution of 720x576 or 720x480. This is a modern, digital definition of SDTV norms. Analog CRTs do not have an inherit resolution and while the vertical resolution was somewhat fixed and standardized with 625 lines (about 576 visible) or 525 line (about 480 visible) there is no fixed standard for horizontal resolution. It's a line drawn by analog components and the amout of horizontal details is determined by the quality of the device capturing or creating the image, the bandwith of the signal and the display. The picture can have more horizontal detail than could fit into 720 or less detail but it's not bound by 720.
It’s something I completely blanked on, assuming TV resolution was always what it ended up being. What’s even more frustrating is that it’s there in screen when I show the VT Old Boys site!
1 *Telecine
It's already been pointed out but at 4:11 You're incorrect about the number of "dots" in horizontal resolution in analogue NSTC/PAL/SECAM. That's what was decided on as "a good enough" resolution when we started moving to digital. It's actually continuous. The number of lines is correct and distinct. You could argue that due to Shannon's Law, there is a maximum number of "pixels" you can get across I guess which would depend on frequency response.
You COULD (sort of) argue there were distinct pixels, but only on color displays, but the signal wasn't distinct. Various manufacturers would have different amounts of R/G/B phosphor dots on the display. I would image they more or less copied one another to optimize costs. Trinitron was famous for changing how the dots were arranged and it was certainly superior for computer displays than the triangle configuration most televisions used, and is still (strangely) used today on some LCD television sets, but not monitors.
8:48 - "Aren't all computers digital" - well, actually not. Analogue computers still exist. Analogue means it MIMICS the process of something else. You can mimic water flow in a wire for example but instead of having water molecules, you have electrons. This is why it's called "current", voltage is very analogous to pressure. Analogue computers aren't in widespread use today.
Those are all the mistakes I could spot, all two of them, and they are pretty trivial and nitpicky.
I'm just imagining the FBI asking John Whitney Sr. why he bought an anti-aircraft gun controller and he replies "It's for my job. You see, I'm an animator."
😂 - love that idea.
“You mean like a cartoonist?”
“No, I going to point this electron beam at a film strip”
“?”
“Yes, celluloid is highly combustible”
This is the greatest out of context comment ever.
"Okay, well. We were just wondering if you wanted to buy some surplus 40mms."
"Oh! I've never seen anything that shoots in 40mm. I'll buy whatever you've got."
Filmmakers/Prop Houses get an FFL (Federal Firearms License) from the ATF.
Despite California being known as a deep blue state with heavy gun control, they have more guns and more machine guns including transferable ones per capita than any other state. That's largely attributed to the film industry and scientists.
I used to write drivers for outputting computer graphics to film cameras back in the late 80's ... we'd run off a few frames to test overnight (it might take a couple of hours to create a single frame of film), pull the film off the film camera in the morning, nip into the nearest town for lunch and get the local boots to process them (obviously they'd come back with loads of stickers on them) and then we'd check the slides that afternoon, make any changes and then run new tests overnight.
The image was built up by having colour filters and a very tightly calibrated intensity of beam and then slowly scanning the beam across a monochrome tube (no shadow mask) at *VARIABLE* speed, slowing or pausing the beam for brighter areas - in that way we could get 4000 lines of resolution.
That is wild. There was so much more hands-on use of physics and math needed to achieve results than today. And dependence upon physical properties like the emulsion of film. Nowadays, solo CGI artists just complain about the graphics card.
I love that you used a high street pharmacy and they’d be checking the quality!
@@ShiveringCactus … slide film is (was?) very different to normal photo film in how it reacts to light but is (was?) also very much more expensive so we would use normal photo film for everyday testing knowing that it wouldn’t look right but given that we also knew how ‘wrong’ it should look then it was fine for most testing, only moving on to proper slide film for final testing, so the images that we got back from Boots must have confused the absolute crap out of them as they looked nothing like a photo that you’d ever take.
This was mainly for putting presentation graphics out to slides rather than doing actual rendering of animations but we could do rendering to cine film too if required (using a big ‘mickey mouse’ looking bulk film back …
I also did development on what were at the time hugely expensive full colour Hitachi A4 dye-sublimation printers as well … our kit was all custom but based on the original Acorn RISC processors - that was the days of using 100MB Rodime SCSI hard discs, Syquest 44MB cartridge drives for removable storage and if you wanted to go really mad then there were the big Panasonic 5.25” magneto optical external SCSI drives (again which we had to write our own drivers for) which I think held something like 256MB on a double sided cartridge that you had to manually eject and flip over to access the other side … how things have changed!!!
BTW, this explains the glow issue mentioned in the video. If the image was displayed statically, the film camera could just have taken a shot of the frame with a sub-one-second exposure. There would have been no afterglow as the computer would just have shown a single frame once on the screen and the shutter could close instantly after the frame was done.
But slowly painting the picture with a modulated beam and keeping the shutter open for an extended length of time would present that issue. Now, instead of the current pixel, you also get the afterglow of the previous pixels and the background glow of the tube itself. You can somewhat compensate for the former (only pixels that are not black will have an afterglow, so you can simply reduce each pixel's brightness by the expected amount of afterglow), but the latter is just there constantly. So the longer it takes to draw the frame, the more background glow builds up everywhere. And even worse, the glow level depends on the time to draw an image, so it may be different for different frames.
Now a slit shutter makes sense, as it would only allow light from the region the electron beam currently is drawing in to the film. This eliminates most of the background glow from long exposure.
In about 1980 I was 14 and I got to visit my friend’s dad’s work. It was Triple I. I actually saw animators working on Tron. They literally had converted a closet to capture the animation onto film. It was painted black on the inside and the door was light tight. It had a film camera (probably a Mitchell) pointed down at the monitor. So, I actually laid eyes on the III system.
Oh my goodness, how incredible. Did I get close with the representation?
@@ShiveringCactus yeah I think so. It was a long time ago and I was just a kid so it was all amazing to me.
Basically how Disney had been doing it decades earlier with hand painted animation.
Was it a stop motion altered version of the Mitchell similar to the one they used for Pingu?
woah
I thought I knew a lot about early CGI and VFX but most of this information is new to me. Excellent.
Thank you. I had a similar experience when I started looking into it.
I studied 3d animation (and game design) in college, and still learned a LOT in this video.
TBF, the tech that went into printing CGI on film is very much obsolete, but it still is a fascinating topic to learn about, and it explains a lot about the origin of industry terms and concepts we are taught.
I was visiting ILM (when they were just an unmarked building in a San Rafael industrial park) while they were filming the "Genesis Device" explosion sequence for "Star Trek II: The Wrath of Khan" 1982. In a small room was a color monitor slowly drawing each frame (more than a minute each if I recall correctly) from top to bottom. Close by aimed at the monitor was a (35mm I believe) film camera, that would automatically click off a film frame as each graphics frame completed. Then the entire process would start again. Pretty time consuming back then!
Oh, am I so envious that you had that experience.
@@ShiveringCactus The fun part is that I couldn't resist sticking my hand in-between the monitor and the camera, so if you watch the sequence very carefully there is one frame with a hand in it. Just kidding. But the thought did occur to me at the time ...
😂 You had me for a second there.
Ah, a 1080i color CRT in 1982 ?
@@ArneChristianRosenfeldt Who said anything about 1080i? It was actually more like 500 lines, but various techniques were used to make it look better. Here's an article describing the process -- it's that Genesis Demo rendering and filming that I described watching in person that day. Reading the article again now, I see the time per frame was much longer than I remembered after all these years, though apparently I walked in while the shorter duration frames were being drawn! - ohiostate.pressbooks.pub/app/uploads/sites/45/2020/01/StarTrekII_GenesisDemo.pdf
I can't explain how much "up-my-alley" this video was. Thank you!
Glad you enjoyed it
One small note. "The size of the phosphor dots was just bigger".
That's not really true. Black and white televisions didn't need masks so there are no dots and larger black and white TVs are just softer. The entire screen was coated in phosphor material, not little individual dots of phosphor.
Color TVs usually had different numbers of mask holes depending on their size. For instance my little 5" travel TV that I had in my car for long road trips had very very few mask holes and very few dots. My large 32" trinitron standard definition TV had way more mask holes.
In 1953, before color television, the coronation undoubtedly would not have had to deal with moire from electron-beam masks.
That’s really interesting. I found so little information online about the black-and-white TVs. What I did find was colour TV descriptions and that seemed to line up with the information on the old BBC websites. I know they over-powered the beam to reduce flicker as it scanned the alternate lines, and I tried to put that in context.
@@ShiveringCactus The Technology Connections video is really excellent for a deep but accessible guide to analog TV that I learned quite a bit from:
th-cam.com/play/PLv0jwu7G_DFUGEfwEl0uWduXGcRbT7Ran.html
The main takeaway is the reminder that television is analog, so there is no horizontal "resolution": there's a continuous wave of infinite "detail" to the electron beam (assuming you count noise as "detail"). The lines are the only unit with discrete counts. So horizontally analog has infinite resolution, while vertically there are discrete scanline counts. Since analog television doesn't have a "resolution", the color dots could theoretically also be infinitely small (ignoring the physics of diffraction from the color mask). So you could have an extremely fine electron-beam color mask down to your diffraction limit. On a large TV that could be 4K "resolution" worth of masked color dots.
However, most large-format "TVs" of the analog era were projectors with three separate black-and-white CRTs behind color filters. These had convergence and focus issues in lining up each color, but no horizontal moire, because there were zero dots, just lines from three black-and-white screens with "infinite" horizontal detail layered on top of each other.
For something like a telecine that exposed each color channel separately, there also wouldn't be a color mask, so it would have infinite analog detail horizontally but discrete lines vertically. So you could get moire from the scanlines but not from a color mask.
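As a rough worked example of that bandwidth limit (numbers assumed for a PAL-style signal, not taken from the video):

# Nyquist back-of-the-envelope: a channel of bandwidth B carries at
# most 2*B independent samples per second, so the finest horizontal
# detail is 2 * bandwidth * active line time.
BANDWIDTH_HZ = 5.0e6    # assumed luma bandwidth of the chain
ACTIVE_LINE_S = 52e-6   # assumed visible duration of one scanline

max_detail = 2 * BANDWIDTH_HZ * ACTIVE_LINE_S
print(f"~{max_detail:.0f} resolvable horizontal 'pixels'")  # ~520

A sharper camera and a wider-bandwidth chain push that number up, a soft one pulls it down, which is exactly why 720 is a digital sampling choice rather than a property of the analog signal.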
Standard-def CRTs certainly were, but I'm sure the ones used for film scanning used electron beams that were much more tightly-focused.
@@ShiveringCactus B&W CRTs don't have a shadow mask at all. Instead, they have a uniform coating of phosphor across the screen. Basically, what determines their resolution is how sharp the electron beam is focused and how fast it sweeps across the screen. I'm no expert, but I'm guessing they used a B&W CRT because the technology existed back then to make an extremely high-res one, at least compared to the best color CRTs of the time.
@@im.thatoneguy And remember, they're NOT PIXELS!!!!
In the early 90s, to get animation to tape we had a frame buffer, about the size of a modern desktop, and all it did was hold a single 24-bit 640x480 image and generate a component video signal. Then we'd write scripts that would load an image into the buffer, roll a Betacam deck back 5 seconds, pre-roll, record that single frame to tape, load a new frame into the buffer, rinse and repeat. Getting field-rendered frames was a whole other issue early on. It took around 10 minutes to get a single second of animation to tape.
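For anyone curious what those scripts amounted to, here is a sketch of the loop in Python; the deck-control calls are hypothetical stand-ins for whatever serial deck protocol was actually used:

# Single-frame layoff loop, as described above. 'buffer' and 'deck'
# are hypothetical objects wrapping the frame buffer and the Betacam
# deck's remote-control protocol.
def layoff(frame_paths, buffer, deck):
    for path in frame_paths:
        buffer.load(path)          # put one 640x480 24-bit image on air
        deck.rewind_seconds(5)     # roll back for pre-roll
        deck.play()                # get the tape up to speed
        deck.record_one_frame()    # insert-edit exactly one frame
        deck.stop()

# At roughly 20-25 seconds of deck shuttling per frame, one second of
# animation (25-30 frames) takes about the 10 minutes remembered above.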
And these days, people complain if they can’t get more than 60 frames per second in their PC game!
So that's 640 x 480 x 3 bytes = 921,600 bytes, just under 1 MB; the Amiga 1000 had 1 MB in 1985. Did the framebuffer interleave banks? I think the EGA card set up the address on four banks at once and then output-enabled them in a fast sequence. Maybe with discrete DACs it makes sense to really use 24 bits instead of 8. The sequencer in the EGA card needs a latch per plane. With custom hardware it might be easier to have two address busses: while we cycle through output-enable on banks 0..3, we set the address on banks 4..7. Eight banks times three bytes is a lot of memory modules.
@@ShiveringCactus Ya, and that 10 minutes per second didn't even count render time. Splitting the render up onto multiple computers, running around with Jaz drives.
It all got better when Avids, Media 100's, and CoSA ( early After Effects ) came around in the mid 90's.
The 'frame buffer' still exists today in computer game graphics. It now generally means the location in video memory where the final composite image of the scene is built up, as the shaders each render out their fragments.
@@akaHarvesteR I always have to look this up: there can be multiple fragments from the same triangle per pixel, depending on various multisampling parameters and OpenGL state. At least one fragment is produced for every pixel area covered by the primitive being rasterized.
For 4K displays at 144 Hz there is essentially one fragment per pixel, or even fewer when you consider the path-tracing part. This language feels outdated; maybe it makes sense when you render offline for cinema.
At least this much should be clear: fragments exist only in the GPU, and the framebuffer contains pixels. On the N64, for example, fragments have greater range and precision in their RGB values throughout the GPU pipeline, while the stored pixels are 15-bit with a dither pattern.
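As a tiny illustration of that last point, here is a sketch (assumed values, not actual N64 hardware behaviour) of quantising a higher-precision fragment colour down to a 15-bit pixel with an ordered dither:

# Fragment -> pixel: quantise float RGB to 5 bits per channel,
# offsetting by a 2x2 ordered-dither threshold to hide banding.
BAYER_2X2 = [[0.0, 0.5],
             [0.75, 0.25]]  # ordered-dither thresholds in [0, 1)

def to_rgb555(fragment_rgb, x, y):
    """fragment_rgb: floats in [0, 1]; returns 5-bit ints per channel."""
    d = BAYER_2X2[y % 2][x % 2]
    return tuple(min(31, int(c * 31 + d)) for c in fragment_rgb)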
Sometimes youtube actually gives you something you wanna watch. Great video
Thank you.
Extremely good explanation and animation. This makes me feel like I'm back home with my dad (big time TV producer and sometimes filmmaker) listening to him explain the tech he used to me. Thanks so much.
I'm shocked that there's a filmmaking topic like this I DON'T ALREADY KNOW and understand... but I have always wondered HOW (in practice) this was done. Wow, thanks!
It’s strange that it’s just a hidden part of the process. And from posts in the comments here I’ve learned it is still being used, especially for IMAX.
This is such high quality work, you should make a separate channel for video essays like this that appeal to an audience bigger than the after effects tutorial people!!
Thank you. That’s a point, but… how to separate the current vids? argh. Wasn’t quite expecting this one to take off the way it has done
Thanks for linking my documentary ☺️
I loved your documentary. At the time TH-cam showed me your video, I had my script written, but the stories I told were really similar to yours, which forced a rewrite. 😆😪
@@ShiveringCactus Ah ah, glad you got to stumble on it!
This was a great video. When you were talking about the BBC and Doctor Who I'd hoped you'd cover the pioneering effects work they did in the 70s with CSO and yellowscreens, but I understand this was about film.
Thank you. All that’s coming in a future video
@@ShiveringCactus can't wait! I subscribed
This is really interesting!
I wish more films could capture that old CGI feel, it's so unique in its look.
There’s been some really interesting comments on this thread. One thing I’ve learned since is that the film emulsion helped blend the edges of the cgi
@@ShiveringCactus That's interesting! I didn't know that.
Massive congratulations on an informative and entertaining video, ShiveringCactus! You answered so many questions I've had for years about early CG. Just spectacular. I don't understand how you only have ~6K subscribers, but I'm proud to add one more :^)
Wow, thank you. I make videos I’m interested in and there’s a lot of competition in After Effects tutorials. But I’ve really enjoyed this departure and it seems to have resonated. Hope I keep you entertained.
Triple-I's system was more than your run of the mill PDP-10. It was a one of a kind PDP-10 implemented in emitter coupled logic, called the Foonly F1. Completed in 1978, it had an operating speed of 4.5 million instructions per second (MIPS). It would generate images of 6000 by 4000 pixels, rendering them onto the PFR, and onto the film recorder. The F1 had a disk capacity just large enough to hold two complete frames: the frame being rendered, and the frame being printed.
The rendering software was called TRANEW, and it was written in Fortran-IV.
Aw hell naw not the South Park avatar.
Well, it did create a strong reaction, so he’s been dumped for my new VFX History video: Why is it blue in bluescreen and how does color film work?
th-cam.com/video/Yhf-QQMjHwA/w-d-xo.html
It’s awful
Great video! love the series. Educational and interesting. Thank you for investing so much time in and effort.
Thank you for your kind words and your own amazing tutorials Eran. I’ve been learning from you for 20 years!
Really great deep-dive! One other thing I would add is that an exact measure of film resolution is a bit hard to pin down; it's usually expressed as continually decreasing contrast with increasing line density (which is what an MTF plot shows). On that note, over 100 cy/mm is easily attainable by many films, and that translates to ~5000 lines along the short axis of a 35mm still frame. But for some other stocks, 120, 150, even 200 cy/mm can be hit in color. In black and white things can go higher. Lastly, if you use microfilms, you can attain 800+ cy/mm (Adox CMS 20 II, Fuji HR-21, etc).
So yeah, very variable!
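For anyone wanting to check the arithmetic, the cy/mm-to-lines conversion above is just this (24mm short axis assumed for a full-frame 35mm still):

# One cycle is a line pair, so lines = 2 * cycles/mm * frame size.
FRAME_SHORT_AXIS_MM = 24

def lines_resolved(cycles_per_mm: float) -> float:
    return 2 * cycles_per_mm * FRAME_SHORT_AXIS_MM

print(lines_resolved(100))  # 4800, i.e. the ~5000 figure quoted above
print(lines_resolved(200))  # 9600 for the sharper colour stocks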
One other fun thing is how film recorders advanced. I think some others have mentioned that IMAX filmouts use a CRT-based system they've kept around to achieve that resolution, but there were other systems that could achieve that or better. The AGFA Alto from 1992 could output 16K images onto frames of film (8K max for 35mm). It was used similarly to what you mention in the video: as a high-quality storage medium, given the lack of storage for something so high-resolution, or merely as the output format. And there were more examples from around this time, too!
no longer will i be kept awake at night with these questions, thanks for this video 👍
Any time!
A related early piece of tech was the digital frame store. 1970s computers could render film-quality effects and ray-traced graphics, but not in real time, so they had to commit a clip to film to see what full-motion sequences would look like. Then came digital frame stores in the 1980s. They were refrigerator-sized racks, stacked with a few GBs of RAM, and capable of playing short clips of full-HD-quality video. They used a SCSI interface similar to hard drives of the era. And they supported 1035i analog HDTV output (the original MUSE HDTV standard from the early 80s was 1035i; 1080 didn't become the standard until the early 90s).
In the early 90s Sony came out with reel-to-reel helical scan magnetic tape capable of real-time uncompressed HDTV, but it was very expensive and RAM prices were falling, so digital frame stores were still commonly used in the 1990s.
LEGENDARY STATUS information here!!!! THANK YOU !!!!!!
Very interesting! FYI, back in the late 1980s a tiny animation studio called AVA used three color-wheel camera systems to record digitally painted animation to film. First was a Polaroid freeze-frame modified to fit a Double M 35mm camera (video resolution at best; it was used for most new animation in the BUGS VS. DAFFY Warners TV special). Second was a 20' dark tunnel with a Sony Trinitron monitor at one end and the same Double M camera at the other (for various projects including THE GOOD SAMARITAN for Omni Productions). Last was a Matrox tri-color recorder originally used by Pixar, also fitted to the Double M camera (tests for FernGully, music videos, etc.). AVA closed in the early 1990s. All future digital animation projects I worked on were direct to video, save for a Disney Australia show recorded on film from saved digital files.
That reminds me of how I took pictures of my favoured Commodore 64 games with my dad's Nikon F301 off the monitor onto slide film, then projecting the slides onto the side of the neighbours house. 11 year old me felt so avant garde.
The Last Starfighter was an Amazing movie and is still a great movie today!
Someone grew up watching Connections as a kid. That intro.
Just a hair too young for Connections unfortunately, but I’m sure I picked up that style from others who had watched Connections.
I sometimes feel that all the TH-camrs that do this kind of video picked up on James Burke :)
This was an amazing video. I felt I was back in college when I saw this. Great job!
Thank you. I felt like I was too, getting all those references correct!
@@ShiveringCactus Sweet. I took a class that was similar to your video. There was some stuff in it that I found really interesting. It's fun to imagine how they experimented with effects at the time as well.
I’ve never touched one physically, but I’ve seen a telecine in action. My ex worked in a post-production house in the early aughts. I remember their telecine being something of a relic at that time but still very much in use, like a COBOL mainframe the govt definitely wants to scrap but a better alternative was never identified for that specific issue. Or in this case, never would be, as I think that process officially died with the HD switch.
It was massive (they had to reinforce the floor under it), temperamental, always under some form of maintenance despite an unlimited maintenance budget, and was basically the bane of users because oftentimes you only got one shot at it and if you didn’t capture properly, you were just flat screwed.
Can’t say the name of the company but man, they had two of everything that ever recorded anything to film. But only one telecine. They weren’t stupid, after all.
Wow, I knew telecines were big, but needing the floor reinforced! I wonder if that's why the BBC had them in the basement of Lime Grove.
Actually at the time of the coronation the UK used 405 lines (377 visible, ~188 per field), when the BBC started television on an experimental basis they used 30 (yes thirty) lines, moved to 405 in 1936, and started 625 lines in 1964. We even had sets that deflected the electron beam electrostatically.
Wow - I spent ages on TV resolution during research and didn't come across that. Thanks for sharing.
I'm loving this series. You've really gone above and beyond.
As much as I enjoy making the tutorials, I’m getting something special from the research and animations.
NTSC was 480 lines plus 45 "reserve" for early TVs to be able to catch up with each field, so that any working TV could display a complete interlaced picture at all times. PAL was 576 lines plus 49 (some of which were used from the 70s to the 00s for Teletext broadcasting). BBC2 used PAL from 1967; before then it was a largely PAL-compatible B&W system. The 405-line picture was a prewar standard that lasted until 1984, although BBC2, Channel 4 and later channels never used the 405-line system.
@@anonUK ORTF had an 819-line system back then.
Film is one of the things I don't want to get back to. :) I started in this industry doing model work, matte paintings, cel animation and assembling comps on an optical printer.
We started building our own first film printer and a custom frame buffer (fridge-sized) back in 1983. It took a year from there to build it and write the control software around our existing PDP-10 and (if I remember correctly) an IRIS 1000 with an IBM HDD rack (also fridge-sized). In the end this allowed us to store AND output elements at up to 2000 lines, all from the same room/facility. :)
One thing I’ve learned while researching these topics is that there was never a point when the technology was “done”. Every movie seems to have improved with its own iteration of each technique.
Glad to have found this channel, great content! Thanks.
I hope I keep you entertained
Great video! Being so nerdy, there's always going to be a few things one can nitpick, so here's mine: you showed 8-perf film running horizontally; however, motion picture cameras use 4-perf vertical. Also, arguably 4-perf film has a much higher resolution, given there are 10K scanners out there. Btw, I'm telecine-transferring a 16mm copy of the Queen's coronation at 4K this evening! Next gripe: all that interlaced footage! Trivial comments, well described, just passing on the information for your knowledge.
I went back and forth over depicting film strips accurately. I'd created the reusable content and was sort of aware it wasn't accurate, but it allowed me to illustrate the point without having to get into vertical depictions and explanations. But I'll force myself not to next time. One thing I struggled with was finding an image of a film strip. Let me know if you have a source.
And the interlacing. Tell me about it. Unfortunately that was baked into the clips I used and nothing I tried got rid of them.
Amazing coincidence that you've been working on the Coronation. Is it a personal copy or for a larger audience?
@@ShiveringCactus The interlacing is such a baked-in curse in the world of archive footage. It's understandable, yet ironic, that the earliest clips of video being burnt to film have such an artefact! Look up "35mm with optical sound" and you get lots of 4-perforation images that clearly show film running vertically, and show the amount of image that had to be given up for the soundtrack. Formats like Super 35mm and Super 16mm acquire the image without wasting that space and also re-center the optical axis of the lens. Film is fascinating stuff, truly tactile imagery. The Queen's Coronation is just for my father's archive channel; not a great copy, but the only one of this point of view that I can find. Once again - GREAT VIDEO you made, I was only pointing those things out for the sake of correctness, as it seems you are very serious about the facts. Cheers!
Cramming all that into 20 minutes is an amazing feat.
Thank you. It’s always a difficult balancing act. I left out several things, like the titles of the Black Hole (which was really the same story as the other uses) and I avoided talking about Film Scanning, which was also impressive but I found quite confusing.
Wow, awesome video! I can't wait for the next one!
Thank you. I’ve only just started the research, hope you can be patient. 😉
8:04 The British didn't use the 625-line system at that time; they used a 405-line system, so the resolution would be about 360 lines, or halved it would be about 180 lines.
Thank you - I've made a pinned post to note these corrections.
I am 40, and I remember very well when the ArriLaser got on the market and allowed automatic, self calibrating, high speed exposure of CG graphics for movies on film, blowing the competition off the market.
What a great video, I've been intrigued by this branch of technology for quite a while!
Thank you
In the very early days of television, the frame rate was indeed locked to the AC power line, but before long sync signals were embedded into the video so that it was no longer locked to the power. An early television engineer demonstrated this with TV reception in an airplane. When the US developed color television, the frame rate was slowed slightly to prevent undesired interaction with the color subcarrier. The vertical scan rate remained close to the AC power frequency to reduce the annoyance of hum bars and flickering when scenes were illuminated by certain AC lighting. In the US, at least, the television networks were locked to stable atomic time bases and their video could be used as a frequency standard, whereas the power line would experience short-term frequency variations in response to changing load.
Amazing video!! There's a little error, though. The first laser recorder was not the ArriLaser in the late 90s; it came way earlier, the first being a laser recorder developed internally at Pixar in 1984. Later on, research for the commercially available Kodak Cineon "Lightning I" began in 1989, it came out somewhere around 1993/1994, and its successor, the "Lightning II" recorder, came a few years later (but even that before the ArriLaser). As a kid, I used to pester any company on the globe to learn what was needed to create digital effects and who made the hard- and software, so I ended up getting brochures from Kodak/Cineon as well - that's why I still remember that :)
I had no idea. I’ve popped a pinned corrections post to the comments, and mentioned this, thank you.
@@ShiveringCactus You're welcome!
Laser printers are old, as is the HeNe laser. What took them so long? Sensitivity to the color red? HeNe can be tuned to green!
I suggest a ring laser with a Faraday isolator and a Pockels cell to let out 0 to 1% of the light. So we only need moderate voltage to drive the cell and also don't have problems with the piezo effect. Power to the tube could vary inversely to keep a steady temperature (against the small loss of power through to the film), and to have a darker black, just never let the laser go out completely.
Try to find Cinefex magazine:
8 - Tron
17 - Last Starfighter
18 - Star Trek 2
37 - Star Trek Next Generation
It’s weird that you describe The Coronation as the introduction to telecines. The pioneering German television of the 1930s already used this principle, which has the nice side effect that some of its broadcasts have survived. It was clearly devised decades before the events you describe.
What you describe is a vidicon; vidicons were almost never used on TV. They were used in space exploration, light amplifiers and security cameras until CCD and later CMOS rendered them obsolete by the mid 80s. TV cameras used iconoscopes and later image orthicons. The iconoscope used a mica plate with silver dots on one side, covered with a low-work-function metal or semi-metal, usually cesium. They emitted electrons, and a sweeping electron beam on the other side acted as the other plate of a capacitor, measuring the charge of the dots. The image orthicon worked in a similar way but used a second plate and secondary emission to enhance the signal-to-noise ratio.
A room-sized GPU to generate a few frames, to be re-recorded by a regular camera! Awesome video
"GPU" in that sense is not correct.
Why multiple frames? I thought that film was always faster than the computer and you would expose as soon as a single frame was ready. I could understand that for stability a whole sequence would be stored digitally on diagonal track tape.
Great video! Filming CRTs seems to be a permanent struggle even in the modern age. I read somewhere that before the movie Being There (1979) there was no safe way to film a TV set in a scene without the common artifacts (rolling bars, stuttering, moire patterns...). Cinematographers would use optical tricks to fake an image, like in 2001 (1968), or just ignore the problem, like in Network (1976).
Wow, I knew there was a technique to sync up TVs for film, but I had no idea it was so late.
A few simplifications at the start of the video when talking about CRTs and such, but still very good!
7:23 Back in 1953 TVs were monochrome. The phosphor coating had no dot pitch but was continuous. What you DID have was scanlines. Those would have been visible on film. Overexposing the tube would have made those disappear. Maybe it was also doable by adjusting the focus voltage on the tube. But yes, it made the image blurrier. Also, there was no moire pattern, as neither the film nor the tube had regular grids. This only became an issue with later color systems, when using CCDs and dot-pitched tubes.
In the making of documentary of _The Frighteners,_ they show that after their CGI and VFX work is done, they have this machine that prints the rendered/composited images to film for it to be processed afterwards.
Interesting. It covers the part of CG-to-film from before I was involved in it. I worked for (and developed the electronics at) a company in the mid 80s that built a 35mm camera which, in conjunction with a Dunn 635 (hi-res monitor system), was used by Pixar to make Red's Dream and Tin Toy. Ahh, the memories.
This was a great video with super-helpful visuals! Thank you! :D
Thank you.
Excellent video! It definitely takes me back to my childhood, watching those '80s CGI animations, including NASA's state-of-the-art 3D animations of the planets, using images relayed back by the Voyager probes.👏👏👏 👍
Glad you enjoyed it!
16:38 This is the first time I've heard anyone put the essential qualities of 1980s CGI to words. He explains the dull pastel highlights and indistinct edges well.
Steve’s great, isn’t he. He was so generous with his time and we had a great conversation about the VFX industry.
My hand to God. I was literally wondering how they get computer-edited footage back into a film print when this video appeared in my recommendations.
TH-cam is spooky that way
Amazing how technology progresses over 40 or so years. Now a gaming computer can render 3D graphics at 100fps or more at 4K resolution, direct to the screen. And even crazier is how some streaming productions are building digital mattes using LED walls and Unreal Engine 5. I honestly never thought I'd see the day when a video game engine made for FPS games, where you frag your buddies with a railgun, would become a tool for multi-million dollar productions.
13:43 Similar to the Disney film: TRON comes from the TRON instruction of the PDP-10, one of the 64 logical test-and-modify instructions. The TRON mnemonic means _T_est accumulator, _R_ight half immediate, set selected bits to _O_ne, and skip unless all selected bits are zero (the skip phrase being coded as "_N_" in the mnemonic).
That was excellent, well done!
Thank you kindly!
Telecine is usually pronounced « Telesinny ». 6:22 I started hand animation in 1988/89 and computer animation in 1989/90, and the only way to cycle frames was to render one frame and send an edit command to tape for that one frame only. The video buffer was then deleted and rendering continued. A 30-second sequence would often take a week to print to tape. After learning what the Quantel Paintbox could do I lost patience and became a video editor instead. 🤣 Worked with the very first version of After Effects.
Came here to see if anyone had commented on the Telecine pronunciation, only found you! It jarred a bit.
Dude, you got me just with the short answer at the beginning.
I worked for MATRIX (originally ImaPRO) in the 90s, just after their takeover by Agfa. Their CRT based exposure units (film recorders) were used by PIXAR for Tin Toy and Andre & Wally B. Agfa-MATRIX then went on to produce large format laser X-Ray printers.
This was such a hidden field to me; it's been great to hear from so many people in the industry. Thank you for posting.
I worked at Matrix Instruments from July 1985 till July 1988, primarily writing device drivers for the Imapro QVP render board (rebadged as the MVP, it was a complete 68000 CPU computer that was on an ISA bus card and it rendered a vector graphic language called SCODL) which would feed pixel data to the Imapro Film Recorders that Matrix had an exclusive license to resell in the U.S. and Japan. When I started working for Matrix, we could render 2k images onto 35mm color film; by the time I last worked for them (as a contractor) in 1990, they were approaching 16k frames. In 1989, Agfa Corp. bought Matrix and renamed the firm Agfa Matrix Instruments. Matrix film recorders were used by Pixar, among others, to render on film the CGI images they were creating (Agfa Matrix is mentioned in the credits of 'Tin Toy').
Wow, this was amazing and I have subbed!
Thank you, hope I keep you entertained.
Fond memories of working for DEC in Reading, many many years ago, during the 11/70 and DEC-10 era.
I knew they had been pioneers in the early days of computer gaming with the PDP-1, but hadn't realised that they were also pioneers in the early days of movie special effects. The VT100 was just being introduced when I was there, replacing the VT52. DEC was a great company to work for, and it was sad to see it disappear into Compaq and then HP...
Did I get the look of the 3D model right? I only had two images to work from!
Top shelf content
Thank you!
In the late 1980s, I had a summer job working on 3D visualization software for scientists at NCAR in Boulder, Colorado. I did most of my programming on VT100 terminals and their 1980s descendants, which could only display text. And we had things like Tektronix storage-tube terminals that could display wireframe graphics. But to view high-resolution color images like what my software produced, we'd have the minicomputers we were using output to expensive "graphics computers" from Chromatics or Raster Technologies that had capabilities like what would come to personal computers in the next decade: big 1024x768 or 1280x1024 CRTs with 8-bit or 24-bit color. And to record to film, we used something called a DICOMED that was basically a camera mounted to a CRT that it could photograph. It all felt very cutting-edge and high-tech. Over the coming years, we got DEC graphics workstations that made the high-res color a bit more accessible.
I had a still-image film recorder in the early 90s. It was basically a little CRT inside with a Polaroid camera mounted on the outside, pointed at the CRT, and some software/hardware connected to the computer.
The film printer for Tron used a CRT light beam, directly printing frame by frame onto a portion of 65mm film. They used three black-and-white films, one for each color filter channel, like the Technicolor three-strip process; laser printers were only made in the mid 90s. But worse than that, they had no floppy discs, so they rendered each frame on the fly, and some people typed the code instructions for the animation in by hand.
You have to copy film for cinema anyway, so I don't see a problem with separate cheap b/w film for the master. Still no justification for not scripting the whole sequence. Somehow today, with Blender, correct exposure is much more of a problem than back in the day with only directional lights.
Subtitles seem to stop at 16:27. Any chance of having it updated?
I'm not sure what went wrong there, but I've reuploaded the file and I think that's sorted the issue.
@@ShiveringCactus 👍 thanks
The Genesis Torpedo demonstration in "Star Trek II - The Wrath of Khan" was even a multi-exposure, because rendering ALL the elements for one sequence was just not feasible. So they did one take, and then another one, and so on, with all the elements needed, resulting in a single complete shot with all the elements, all done by multi-exposure. It was a creative way to bypass the technical limitations of the rendering machines to get this complex shot. And believe it or not, the CGI for "The Last Starfighter" could have looked EVEN BETTER, but they couldn't do it in the small timeslot of this movie's production. So if they'd had a few months more, the CGI would have looked even more realistic and detailed. Omnibus were also responsible for the CGI super-reflective shots of the drone ship in "Flight of the Navigator".
Did they use a ray tracer? Occlusion makes it really hard to render a scene where not everything is in memory. I’d rather swap tiles of the frame buffer and z buffer than mess with the analog stuff.
@@ArneChristianRosenfeldt i'm not sure what they used. Back then so much renderers were very proprietary and every house had their own style. But they sure would have had more precise light and shadows and reflections of the shiny surfaces i think. Movement would have probably have some more motion blur too.
@@KRAFTWERK2K6 For motion blur, a raytracer needs to expose the frame buffer (now called an accumulation buffer) with multiple sub-frame passes.
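For what it's worth, the accumulation-buffer approach is easy to sketch in Python (render_at is a hypothetical renderer callback returning an image that supports + and /, e.g. a NumPy array):

# Accumulation-buffer motion blur: render the scene at several
# sub-frame times across the shutter interval and average them.
def motion_blurred_frame(render_at, t_open, t_close, samples=8):
    accum = None
    for i in range(samples):
        t = t_open + (t_close - t_open) * (i + 0.5) / samples
        img = render_at(t)  # one sharp render at a sub-frame time
        accum = img if accum is None else accum + img
    return accum / samples  # mean exposure over the shutter interval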
So... I had the Mitchell/Acme 35mm camera used by Evans & Sutherland in the 1970s to record their early animations. I say "had", as it was stolen in 1998. Sigh. Long story how I came by it.
Absolutely loved this!!
Thank you!
0:00 My first guess would be taking photos of a screen....
Subscribed! glad yt showed me this 😄I'm left wondering about the design and programming side of the images, like Fonda's head and the space ships.
Thank you. There’s very, very little information about this. As far as I can tell, images were plotted out on 2D graph paper and each point’s coordinates were then typed into an array. I couldn’t get a confirmed source though, so had to skip going into detail on that.
That's weird I thought this exact same thing a while ago. Couldn't find much info about the film transfer method used back then. Thanks!
I have no idea what prompted me to ask the question, but once it came into my mind, I was really stuck for an answer. Hopefully anyone else wondering the same will now find this video.
Something I've always wondered! Especially being into computers through all these years and wondering what system would have that kind of resolution.
It’s amazing, isn’t it. I’m still not sure how Triple-I made the monitors, but they’ve always claimed those resolutions.
I love The Last Starfighter. I spent ages tracking it down as an adult, when DVDs were just starting to take off and ordering things from the other side of the world was more of an ordeal.
I used to work for a digital effects creator for TV and it was all hardware-driven. CEL Electronics was one of the bigger companies. I even did some effects processing as part of uni.
The inaccurate way the perforations are shown bugs me more than I really should let it. :)
In short: with vertically fed film, the perforations, along with the film gate opening, determine the vertical size of the image exposed to film. Perforations also make it possible to precisely position each frame of the film as it's held still, both for exposure in the camera and for projection in a projector. For normal 35mm cinema productions, 4 perfs per frame were and still are the norm.
The issue with the visualizations in the video is that in the running horse example at the start, the "film" has way, way too many perforations per frame, suggesting a film size I have never heard of in any cinematographic system using vertically fed film. And the perforations are moving slowly downwards, suggesting a pulldown that is not synced to the framerate of the film. This should make the image of the horse move down with it, but it doesn't, breaking any illusion that we are watching a real projection.
Also, the still from Star Wars shown while talking about film resolution is a frame of VistaVision. VistaVision has the frame travelling horizontally through the camera, and with that they could double the horizontal width of the exposed frame. This was necessary on effects-heavy films like Star Wars, as they could get footage on regular 35mm film but with a grain size closer to that of 65mm film. This results in a much cleaner final image once the final composite is squeezed down to the final 35mm scope prints.
VistaVision there uses a frame width of 8 perfs, so, using the same pixels-per-grain ratio as with regular 4-perf vertically fed 35mm film, the resolution of the frame shown there would be 7200x5400.
But this video FINALLY gave me a number for the "high resolution" screens used in making films like Tron. Everywhere I looked, it always said just "high resolution screens". Although I do wonder about that number in some shots of Tron, where clear aliasing can be seen on fine diagonal lines on the Blu-ray I have, artifacts suggesting a screen resolution lower than 2K. But other CGI vendors were involved in that film that likely didn't have access to that 5.4K screen.
But it's a very comprehensive video, so my nitpicks are only about the visualisations of the physical film shown.
Thank you - I've made a pinned post to note these corrections. I struggled with getting an image of a regular film strip and decided to use my own to illustrate the concepts. If you have a source, please share and I'll use it in future.
The source I used for the film grain was officially the two papers by Gary Demos:
Demos, G. (2005) My personal history in the early explorations of computer graphics, Computer Graphics and Computer Animation A Retrospective Overview. Available at: ohiostate.pressbooks.pub/app/... (Accessed: 09 February 2024).
Demos, G., Brown, M.D. and Weinberg, R.A. (2012) Digital Scene Simulation: The Synergy of Computer Technology and Human Creativity, Digital Scene Simulation. Available at: ohiostate.pressbooks.pub/app/... (Accessed: 11 February 2024).
(I think it might be in both, I can't remember.)
I posted this video on the r/vfx subreddit, and one interesting thing they mentioned was that the film emulsion was very forgiving of single pixels and would add some blurring. Maybe the Blu-ray was cleaned up and this highlighted them more, or maybe not.
In the Cinefex interviews with Triple-I they were always claiming the 5k resolution. I was pleased to find a second source though - in case it was all hype.
What a real G move to put the legit TL;DR at the front in spite of the algorithm
Gamble seems to have paid off
You should do a follow-up video about how early desktop computer graphics were done using Amigas and the PAR (Personal Animation Recorder) cards.
Thanks for the idea. I know almost nothing about that era, but then I didn’t know anything about Film Recorders / Printers until I looked into it.
Only standard definition and not many colors.
Fascinating!
Thank you
A minor correction: although TV refresh rates generally match the AC line frequency, TVs do not actually use the AC frequency for anything; it would be incredibly impractical, as any variation in phase between the supply at the transmitter and receiver would make the picture incredibly unstable. TVs generate their vertical refresh using a PLL oscillator that locks to the vertical sync pulse of the signal.
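A toy model of that locking behaviour in Python (all constants made up; a simple first-order loop like this settles to a small constant offset, which is realistic for such circuits):

# Vertical oscillator nudged toward the broadcast sync pulses.
SYNC_PERIOD = 1 / 50.0   # broadcast field rate, PAL-style
LOOP_GAIN = 0.2          # assumed fraction of phase error corrected

period = 1 / 50.8        # free-running oscillator, slightly fast
error = 0.002            # initial phase misalignment, seconds

for field in range(20):
    error += period - SYNC_PERIOD  # drift from the frequency offset
    error -= LOOP_GAIN * error     # each sync pulse pulls it back
    print(f"field {field:2d}: error {error * 1e3:+.3f} ms")
# The error settles to a small constant offset instead of rolling,
# regardless of what the local mains phase is doing.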
If you didn't see The Last Starfighter, you didn't miss much.
EXCUSE ME!?!?!?!? It's a cult classic and I love it. In fact I'm planning on 3d printing a gunstar as my next multi-color project.
Hopefully I redeemed myself a few seconds after making that comment.
@@ShiveringCactus lol, yeah. It's interesting how some of the shots still hold up while others look like Atari 2600.
By the time of the coronation the resolution was not 576 lines at all. The system used was 405 lines, which in modern terms is 376i, which makes the recording 188 lines.
BTW, it was also filmed on film; this is why there are colour pictures too.
Thank You!! Someone finally did a video explaining it! 🙏I had to research it, back in 2008. When I'd ask back then, people would just shrug or talk about something unrelated.
You’re welcome. When I first wondered about it, I found all sorts about scanning film into a computer but nothing about getting it out.
This is awesome!
Thank you
Very good - a couple of rough approximations here and there, but nice. Picky: 'telecine' was pronounced 'telly-sinny', and 'moire' is 'mwarhray'. Telecine is now just about extinct. Happy days, Ol.
Thank you. That Telecine pronunciation will haunt me for years to come. If I could update the video without losing the analytics I would.
THANK YOU 🙏
There is a video on here somewhere which shows a film, I think Toy Story, being recorded onto film using a high-definition monitor, line by line, because it couldn't get enough power to expose the entire frame in one go.
I wonder if one could use a CRT vectorscope as a film recorder
CGI is also, oddly, a term that used to be used in website development. People used to store "CGI scripts" (scripts needed to interface with things only visible on the server, such as databases) in the "cgi-bin" folder. There it stood for Common Gateway Interface, completely unrelated to computer graphics, and it makes going through old forum and StackOverflow posts a bit... odd.
That’s true. I’d forgotten all about server-sides scripts.
Very cool video :)
Thank you
@@ShiveringCactus you're welcome :)
I'm just here for the ship in the thumbnail.
When I started working we used to print on silver film and paper. The printers didn't use ink but lasers to do the job. We printed 70-100MB uncompressed pictures on 24x36 or medium format with a DOS station, lol; the printer was washing-machine-sized. We used to print on paper too, same system but much, much bigger (room-sized), with an Alpha station on Unix for the software part.
One thing I couldn’t find an explanation for was how Triple-I used monitors for printing. Did you ever use monitors in that way?
@@ShiveringCactus No, I never saw anything that old.
First time watching the channel, great content! Have to say, though, the animated host is distracting. I'd show it a lot less.
4:33 The number of resolvable details for a film stock is not equal to the number of silver halide crystals on a frame of film. For one, neither developed black-and-white nor colour negative contains any silver halide crystals: they contain either filamentary structures of metallic silver or dye clouds, not AgX crystals, as those are fixed away after processing. Of the developed metallic silver grains, most are smaller than the wavelengths of visible light. The typical resolving power quoted for colour negative (6 microns, or 80 lp/mm) is the point where contrast at those spatial frequencies reaches about 20% of scene contrast. The reason the resolving power is spread over a larger area than the size of the grains is that when the film is exposed, light scatters through each sublayer, producing a larger point spread function than the original area of incident light. Halation is another parameter, and in development the use of development-inhibitor and accelerator-releasing couplers also impacts resolution, with the inhibitors sharpening edges and the accelerators softening them, given the diffusion radius of oxidised colour developer and what these couplers release.
Have always enjoyed the ShivCac tuts... and while the content here is fabulous, it's the AE AI PS etc, etc mastery here that gets me...excited... which is saying a lot cos not too much does these days... I mean, besides good hair days... Thank you! ps I loved The Last Starfighter too...
Thank you. Comments like this really help keep me motivated!
@@ShiveringCactus Motivated? What is this obscurity that you speak of? And can one drink it with a chaser? That said, it is you that keeps I for the motivationalism... well, you and home grown...er...olives. Yes, olives... that works.
This is a bit of a confusing topic, because in those days film was brilliantly good in its quality of pixels/silver grains, and CGI and television were not at all. Of course everybody tried to improve this process, but on the film side there was nothing to improve, so it all depended on the CGI side. Then you show the 24/25 frames per second issue the BBC had to deal with, which they solved (by brute force) with a 25 frames per second film camera. Well, that was no big deal; the frame rate of a camera is a simple mechanical matter, and they already needed solutions like this for other problems, like NTSC material from the USA (30 fps), so it was in their DNA to solve these issues up front. CGI was far behind this level of quality because computer resolution was too slow, too little. I animated a lot of graphics on the Amiga computer to style my TV programmes on local TV in Amsterdam. On a TV set this worked out well, and I never felt the need to push that to film at a higher resolution. Nowadays, in the HD environment, you wish you could bring that Amiga technique into HD as well, but it isn't there. And that is exactly what bothers me: this search for perfection in the cinema world doesn't fit the need to complete earlier successes. Let somebody reinvent the Amiga as a graphical machine with a 4K resolution; then we'd have a solution that fits film.
We used UV mattes on Next Gen?!?!? I should have paid more attention, but they didn't let me out much back then.
Really, you were involved with the model filming? I’d love to know more, would you drop me a line using the contact form on my website: shiveringcactus.net/contact
A simple explanation... and an enjoyable video 👍🏼👍🏼
This may be the first time I've seen any TH-camr use the word "comprise" properly
😀 thanks. But according to the comments I got plenty of other words wrong
That south park looking guy though...
Very funny how I was wondering how they did that, and so I found this video. It's only been 2 days since this was uploaded
Thank you.
I’m sort of stuck with the character, as it’s easier than a camera, lights etc… 😂
@@ShiveringCactus I can maybe design you a character if you want. 👍