The new DUNE movie was shot digital (on the ARRI ALEXA LF and Mini LF), then was transferred to 35mm film, and then was scanned back to digital. All that to create the most accurate emulation possible, reducing the digital sharpness, and elevating softness.
@@ThePooper3000 That would be the "Film-Out" method. This is where the movie is printed out to film using a film printer and has been used as long as there have been digital film editing software. Originally done to print scenes with digital effects, it extended to printing whole film reel masters from it's computer film editing software. Sadly this method is not used as much anymore thanks to digital cinema, but some still use it.
@@victorseastrom3455 Because they got a better result doing it this way. They tried shooting on film when doing tests on set and they didn't get the result they wanted but still wanted some of the characteristics of shooting on film so they went with the third option which is this digital to film method. It's just as valid an option as any other.
@@victorseastrom3455 The DoP of Dune was Greig Fraser, one of the best out there. He’s shot with both film and digital. On Roger Deakin’s podcast he talks about testing both film and digital for dune, but they decided to go with the Alexa LF and the the film print/re-scan in post, to give a look that they felt sat in the middle of the two mediums. On a personal note, I think they made a very good choice. I think it’s one of my favourite looking films to have ever been shot.
I guess part of the reason why everything is shot in 4k these days is because its safer and offers more options in post; its far easier to remove detail from, and soften a high resolution image than doing the inverse.
I guess that's not the point of the video. the thing is people are obsessed with gear thinking it will make their work better, but only skill and dedication can do it.
We shouldn't confuse resolution with sharpness here. A digital and extremely sharp look is certainly a stylistic choice, and not always desirable. But as stated in the Video, high resolution doesn't necessarily lead to a very sharp image. Low resolution on the other hand, (especially once one starts to see individual pixels) looks extremely digital to me, and should be treated as a stylistic choice as well. Really high resolutions like 8k give a pretty neutral representation of what is captured by the lens. The level of sharpness and detail in the image can then be freely controlled by adding diffusion, choosing a softer lens etc. Considering this, I'd say higher resolution actually can lead to a more organic looking digital image.
this honestly put into words what I was thinking, while I will agree 8K isn't always needed, the video mostly talks about sharpness, and the two while being partially linked, are not the same. I also would like to add as a viewer (though I also work with cameras typically for photography but I want to try my hand at cinematography) when something is delivered at a lower resolution than the display device or if the display device has too low of a resolution for the size, especially noticeable on stuff like 1080p projectors in home setups (and 480 line CRTs but that's obsolete), you can see the pixels and the loss of detail in an undesirable way, but when you deliver the content at the appropriate resolution for the screen size and distance it looks more natural on the screen and the obviousness that its artificial disappears, then combine that with scanned film or diffusion filters/any other way to soften the image without lowering the resolution its self and you can have a soft image without harsh pixels that can distract from the viewing experience. The video goes on about resolution when sharpness is the main thing here,
This. Thank you. High resolution does not mean the end result has to be tack sharp. There are numerous ways to soften a shot if that is the goal but doing it via lower digital resolution is not generally going to be the best choice. And then there is the whole issue of color space and HDR that could be more important to presentation that resolution. This whole video was nonsense.
Honestly, high resolution is just simply peace of mind knowing I can crop or punch in any way I want in post. You can film further from your subject and then crop closer in post without losing detail for the perfect framing.
Surely your DOP/cinematographer should be in charge of getting the perfect framing on set. It's lazy to say "I won't move closer, I'll just crop it in post and reframe then?"
Human eyes are a lot more sensitive to contrast than resolution. A 2K presentation in HDR will look sharper to our eye than 4K SDR. Or scenes with a high dynamic range. 4K scenes that are very flat in range will seem less appealing and dimensional. But 4K is more than enough. I can really enjoy a supercrisp 4K presentation like The Revenant but the softer more romantic look also has it's charm. It's a creative decision. 8K is a waste of space, only usefull to have more reframing possibilities in editing but ridiculous as an output resolution.
@@berlin03030 I bought an LG B7V 65 inch OLED 4 years ago for about 2200 euros. Very happy with it. Blacks are perfect. The max brightness is 750 nits for 10% of the screen but it works very well. More recent OLED's do have higher max brightness and thus a bit better HDR. But I wartch in a darkened room so it's not a big issue for me. OLED screen have been getting cheaper the last couple of years. A similar TV now is about 1500 euros so considerably cheaper. QD-OLED might drive the price down.
To add to your point, dynamic range IS also a creative decision. A lack of dynamic range in a shot doesn't necessitate it being less appealing or dimensional. Shooting a scene in higher dynamic range where it diminishes the artistic intention is just as silly, of course I imagine it's way easier to "correct" this in post.
@@TinLeadHammer I think we all know the difference between pixels and the actual detail that is resolved. My point was for a theoretical perfect 2K and 4K presentation. A lot of UHD discs don't actually resolve 4K detail, certainly not those 'real' 35mm scans. But that was not my point.
Maybe for the average viewer with a TV, 8K might be ridiculous. But for people with big projection screens, the added resolution might bring the home experience closer to the cinema experience.
Resolution when it comes to digital cinema is always misunderstood. Acquisition of 8K has nothing to do with pushing resolution higher just "because". 8K acquisition generates a much better supersampling and it generally softens the image, not sharpen it. You mentioned nothing of this in the video? Supersampling algorithms are usually using sharpness when rescaling, but many choose to not oversharpen while supersampling down in order for the sharpness to become natural instead. Shooting 8K with this supersampling moves a digital camera closer to an IMAX resolve. 8K supersampled down to 4K removes the digital problems of the bayer sensor, you get pure pixels instead of divided between red/green/blue. The digital noise is also lowered. The point is that you use optical elements to set the character for the image, instead of letting the sensor block your work. A problem is also that you chose the new Matrix movie that is so overly produced in color grading and mismatching shutter speeds that it's a very bad representation of digital cinema. Even though the point is that resolution is a creative choice, we still have standardized 4K now, new TVs aren't sub-4K in resolution and HDR will in a couple of years also be standardized. There's no point in learning a workflow that does not end with a 4K HDR master at the output because that's gonna be the standard for any project soon. It's a rip the band-aid situation and people need to understand how digital resolution practically works, not just that it's a choice. For instance, take something like the Red Raptor and use vintage glass on it. The supersampling will take care of the digitalness of the image, and without post-sharpening, the image will resolve everything about the vintage glass. Apply proper post-workflows, maybe even post-grain processes and if everyone knows the tech knows the post-workflow, it can reach 4K resolution without those bad high-resolution looks that people complain about. It's a joint effort between the art department, makeup, cinematographer, director, image pipeline personnel, colorist, and VFX. I've done hundreds of projects since digital cinema first matured and every time there are people who don't know what they're doing, i.e they're incompetent, the digital image becomes trash. Shooting high-resolution digital cinema requires people who understand both the technology AND the art of creating images. The problem right now is that there are either people who knows the art or people who knows the tech. Back in the days of 35mm, that was not the case. Back then, cinematographers understood the technology of 35mm, they knew how to handle it. But today, many cinematographers have no knowledge, or interest to learn the knowledge of digital cinema, which leads to incompetent handling of the technology.
And that my friends is why you don't choose your DI from the camera department. Hiring a proper DI is the way to establish the best baseline for the image.
@@namesurname624 Started out on set as a camera crew up to focus puller, moved to editing, post production and VFX, now I'm writing and directing. So I've been through most segments of production, from tech to art and that's why I mean tech and art aren't opposites, tech informs the art, the art demands the tech.
You’ve illustrated 100% of my feelings towards today’s filming trends. I really hate that netflixy feel, these projects lack so much of substance that they try as hard as they can to fill them with over good looking cast, over sharp images, overuse of catchy music, injustified camera movements, and so on… everything feel so “plastic” it’s absolutely awful. It’s terrible to see that things are more and more driven by money making, and not only in cinema, but all sort of art mediums, and in fact in almost every aspect of our lives today, everything gets industrialised… this + the fact that today’s children are educated by this type of content i’m worried about what future artists they will become…
i think the craze for high resolution dictated by mainstream audiences that are already accustomed to clean sharp images (ai processed smartphone pictures, high end clean look commercials), therefore many companies demand things to be shot at a high resolution and a clean color tone. high resolution can be great when handled properly, such as mindhunter shot on 6k and 8k.
@@mubaraksuleiman5227 I believe that diversity of medium has to be preserved. Filmmakers must keep the choice of medium on which they want their stories to be told…
love this point you made; just because technology can do something doesn’t mean its always right. We have all these incredible cameras out there with 8k capabilities but DP's such as myself slap on a black pro-mist and add a load of film grain to give it the same look that 16mm would give off anyways.
True!! Thats the funny part.. at this point if youre not shooting a film like Matrix, actual "8k output" would result in an image so clinical that you would have to slap 20 filters to add character to It 🤣🤣
It is ironic we do this, BUT...I'd say it's still cheaper (and faster maybe?) to go this convoluted route than to do it with the equipment that does it by default especially since everything is distributed digital these days anyway.
Digital grain isn’t very good tho. Look at films like 3 from hell that used the old footage from the previous shot on film installment to match the grain structure. That works only due to the care and attention used with that add on. Than compare to others like joker which look so unnatural that it takes away from the presentation due to the uniformity real grain doesn’t have. It’s better to retain the natural look of the camera you are using than to add so many layers and effects that it’s like you’re watching when the killer calls. It is easy and a lot more cost effective but when in the movie business has those words ever yielded a movie worth talking about.
This video was really encouraging to me. I don't really have enough money to get a high res camera, I always thought I couldn't make my own films because of this limitation. This told me I don't need to worry about that. My job is just to make art.
Great points made here. I'd love to point out how intriguing it is that The Matrix is the visual centerpiece used to frame the discussion. I agree with many of the points made and I'd also argue that The Matrix is one of the few counterexamples in which high-res and ultra-sharp visuals actually serve the art and further the intention on the film and story. Makes this video even more wonderful and complex, well done!
Those HYPER crisp images in the new Matrix felt very intentional. Especially since the movie is SO concerned with capturing faces. You see ever pore, scar, and wrinkle. It's quite striking.
The next step in this conversation with regards to action films but The Matrix more specifically is BLOCKING. After watching Resurrections, I went back and looked at a few scenes from the trilogy and found myself enjoying and following the action than the new installment. Why? I think because of the size of the camera. I believe the trilogy was shot on a large 35mm camera (my guess is a Panavision or Arri). They’re bigger, heavier, more tactile. It’s not like a nimble little Red Komodo or the like that you can just whip around set on a whim. The result was I found the new movie to have actions scenes that were either unintelligible in terms of a storytelling (I’m thinking specifically of the Merovingian fight in Resurrections) as opposed to the trilogy (take your pic; they’re all extremely clear). There is something we’re beginning to lose in terms of craft at the altar of digital clarity, because more and more we’re enamored not by the data we’ve captured for post, but the flippant nature of our blocking with such powerful little tools at our disposal.
@@_mixedsignals while I see what you are saying there are a number of films shot with heavy film cameras that had very rapid movement like the Bourne sequels. Yet those offer more clarity the Taken which also shot using similarly heavy cameras, and the same “shaky cam” aesthetic. It’s just now with the easier barriers means people have quicker means to opt for that quick blocking. Without ever needing to learn to work within “the box” of a weighty camera rig. And this happening all over again with the ronin 4d, I have seen so many people now thinking everything needs to a one take tracking shot just because they can. So I think the age old adage “just because we can doesn’t mean me should.” Applies here. It the case of the matrix I have heard Lana loved the small rigs because she felt liberated enough to use a more free flowing work flow and add set ups on the fly. Where as reloaded and revolutions used a stricter approach which involved copious amounts of takes. (30-50 per set up.) This burned out Bill Pope so much that he hasn’t worked with Lana or Lilly again. So in the pursuit of effective well crafted shots that tell the story it’s always important to remember how many people are involved and try to find a way to capture the film without burning out your crew members. For me personally it all comes down to prep, getting a grip on the story and then the logistics and gear to capture that story while ensuring the actors (hopefully) get enough time to play around a bit and be ready for any “happy accidents”. “We’ll fix it in post.” Has and always will be a rather large red flag, (of course there are numerous examples of post saving films but using as a crutch is never a great place to start.)
Steve yedlin, ASC did an amazing deep dive into the difference between technical resolution, and the perceived resolution to the human eye - I think it would interest a lot of people
@@Ljm488 The "resolution" of the human eye is a pretty complex topic, but it's better described in terms of pixel density than total resolution; and even then it depends on viewing distance. For instance, a 4K smartphone is wasteful overkill, while on a large tv, 4K is more meaningful - but if you're far enough away, it could be 720p and you wouldn't be able to notice.
@@AxTechs in terms of pixels on a monitor or computer not really. That’s what I meant, how the eye views an image in relation to an image. And Steve Yeldin did say the exact thing I’m saying in his video so I’m just relaying that message
I feel like comparing the old Matrix with the newer one would have been better. Having two totally different films with a drastically different look and feel doesn't quite work.
You can also see this same comparison with the most recent and previous. Representing the matrix with more abstract photography is more fitting in my opinion. The world is abstract, thus if the photography is abstract it just helps to make the world feel more erie. Which is what the original achieved very well as apposed to the new adaptation.
I disagree here, while I have other problems with the new Matrix fillm, I think this hyper realistic and artificial look adds to the digital world of the matrix which is quite artificial. Maybe it would've looked to different but I could imagine they could've shot the "real world" scenes on actual Film to further seperate both worlds. In the end it is a tool you use to convey your message: Do you paint with big or small brushes?
I think you're missing out on one crucial consideration when access to 4K and 8K cameras is becoming more and more universal: the longevity of high resolution master for different output solutions across a wide range of platforms for years to come
Diffusion filters on an 8K capture and 1080p are two very different things. The filter isn't "bringing the fidelity back down" to be in line with a low res look. Using a small sensor/small film format is a way different look from low digital res, too.
You can remember a great movie when it's almost a textural feel, the lighting mood & music, pacing of the camera ... everything feels like an organic vision of someone or a small team of people collaborating. So many movies today feel focused on the ip or a story that might go viral online that they forget that there's so much else to consider ... such as not making something automatically extremely high-res and sanitised.
I love recording in 8k or 4k to set 1080p as final output for a simple reason: it’s easier to crop to zoom while editing, this helps a lot to get “fake camera zoom” to focus on a specific part of the subject (like the faces). It gives a bit more freedom on low budget projects, framing the scene is easier since some parts could be cropped and discarded later. Recording directly in 1080p requires more precise shots, making “digital zoom in editing” impossible.
Speak for yourself. When I meet a new person, I also focus on the details of their face, including blemishes. Thus, I prefer high resolution movies because it feels real, not dreamy. I just don’t wanna be an spectator, I wanna be immersed.
YOU answered your own question the lower the res. The blockier the image, we associate that with cheap and crappy, which most low res cameras are (now don't say arris only use 1080p n 2k since recently, there 1080ps look so good do to the shape of there noise and etc, but I'm not talking about 1080p, even though it doesn't look as crisp) I'm talking about lower resolutions show noise more, are blockier and typically stink. Period. We also like to be able to punch in images and zoom during post, it's a workflw thing and makes it slot easier in pot production when footage is higher res.
The higher the resolution of the sensor and recording format, the more organic and less pixelated the image will be. Choosing a softer lens and/or filming on S16mm film (which will limit the effective resolution) are valid artistic choices, which will not negatively affect the technical quality of the image and won't cause it to look "digital". Over-sampling (shooting in a higher resolution than the finishing resolution will end up being), can reduce the appearance of digital noise, and improve image quality. Giving the advice to shoot in lower resolutions is truly a terrible advice. Shooting or even finishing in less than 4k is going to limit distribution options, will sometimes even limit which festivals the project be exhibited in, and will cause it to appear more digital "videoy" and pixelated when watched on a high resolution display. You can soften the image with the right choices of lenses and filters, and can also do it in post production in a more controlled way, reaching whichever look you desire.
Great video! I think there was a time in the digital world where resolution actually mattered. Because the more resolution you had, the more natural/organic the image looked and the less you were reminded that you're looking at something digital. And I think it was very important for technology to move over that threshold where the pixels don't distract anymore. For me that threshold is in most scenarios 1080p/2K. It's a very neutral way of presenting footage. I think 4K is still kinda within the boundaries of being neutral. Anything above 4K and below 1080p for me is a stylised choice and it doesn't affect quality, but style. While I think it's important to use new technology and advance, I think it's important that we accept that there WAS a time when more resolution meant better image quality, but that we are now past that. Now resolution doesn't mean the same anymore and neither does quality. Technologically I think we are at a sweet spot now for the way images are viewed/used today. There might be workflow improvements for filmmakers that helps them tell the stories (high resolution is one of them), but It won't affect the technical quality of the image we see in the end. Maybe it's really a sweet spot most of us should embrace and use the chance to focus more on the storytelling and less on technology. In the end that's what good technology does, it makes you forget that you're using it.
Kind of like when I was so happy to get a birthday card done on a dot matrix computer printer when I was 9 in 1991...I was like "wow, the future is here, cool dots..." but later I learned that the drive for better quality was to make it look so that a document made on a computer should not look like it was 'made on a computer'. I kind of get the nostalgic feel for big pixels, simple polygons/teapots, a black void background, and music with oscillator/FM synthesizers...technology just doesn't seem so futuristic anymore...
Wow, a lot of terrific comments below - and love the video - in my mind spot on. I once sang in a choir where the chap beside me was so over perfect in his words, it was downright annoying singing beside him - it was unnatural and unpleasant. Digital is done the same thing in a way - so over the top it is not much fun anymore. No one looks at the tiny pores in a face, or the insane detail in the Hobbit (which was a bomb in comparison to LOTR). We are absorbed in the story, like the great movies. Yes Ben Hur was shot in 70mm for that detail but it was for an enormous wide screen that needed it. The story was king and a person was so absorbed in the story, they didn't look at insane resolution. I have both Blueray and standard 1080 editions and quite honestly, when I got to watching the movie, i forgot about the detail and got entrenched in the story. Oh, that was 1080? Hmmm, never knew. Ya, this video is excellent. Well done.
You did not convince me. Recording in a higher resolution is always better if there are no tradeoffs. The image being too sharp should not be a concern, since it can always be blurred in post production. In the real world there are obviously trade-offs to be made considering storage and editing, so 2K or 4K are close to the sweet spot, but that's just for today. In the future, higher resolutions will be used more until we get to the point where human eyes won't be able to tell the difference.
I totally agree resolution isn't needed for every movie it brings out the flaws of the production, for example I watched a movie recently and because it was super high resolution I could see the actors wearing fake moustaches and beards it took me out of the story. Film grain brings out detail too in a movies that's why older movies look so good in 4K UHD disc because of the grain there is so much detail.
I think your example of the downsides of high resolution is very true but I disagree with your statement about grain. Grain doesn't increase detail in any way because it's just a random pattern. The image might appear sharper but that's a different story.
@@maxkern4419 This is not something I made up about film grain, some Filmmakers have said it that filming on older cameras that weren't digital they way to bring out detail was how on it was filmed on old film stock which was very grainy but it bought out the detail.
The grain acts like a dithering pattern, allowing you to perceive details that would normally be below the noise floor. The same trick is used in audio mastering. A 7bit piano recording sounds way more realistic with a spoonful of dithering noise than without.
@@bigogleOh really, but when I watch behind the scenes of movies the filmmakers talks about how the film grain brings out details, am I misinterpreting it.
before watching this video my opinion is: A sharper image is not important. High resolution is very nice and I like it but the sharpness of an image is something different! After watching the video: Yeah my opinion hasn't changed. The sharpness is an artistic thing. I actually put a slight blur ontop most of my irl shot videos, just because I like the look of old films. But I still keep it on a higher resolution. Blurry images look different on different resolutions! In his video it sounds like the higher resolution are sharp everytime. Look at movies like Jurassic Park. The blu ray has a greater resolution but the image itself is still blurry. I hope someone gets what I trying to say... So I agree that a low resolution can achieve the look you want but where I disagree is that I think a high res video can still result in the same artistic feeling as a low res.
3:10 16mm can be delivered in 4k. Go watch Evil Dead 1981 on 4k Blu-ray. Also you need a certain amount of resolution or things look terrible. Case in point 28 Days Later 2002 looks terrible today. The only thing that really matters is having a sufficient resolution for the viewing device. Since 28 Days Later was captured with a 480i digital camera the movie is mostly ruined for future use because most devices are too high of resolution. One of the beauty of film is you just can keep scanning it. Even 16mm can be scanned at 8k would benefit from a higher scan theoretically if you had a 8k tv and a native 8k source since you wouldn't need any upscaling.
If you scan 16mm film at 4K you aren't getting a more detailed image. You are merely getting more detailed grain. It's similar to exporting a 1080 HD project in 8K: it can work, but there isn't more detail in the exported project.
Great points! I would add one thing: lens matters! A lot of "digital look" comes not from resolution itself but from very sharp modern glass that is used with a camera. I prefer to shoot with vintage lenses, made in early 70s and even with my 4K camera I end up with nice, filmic soft look that I adore. 4K gives me some comfort in post-production (for example for zooming in slightly in post), soft lenses give me that vintage, organic vibe.
very true. another thing I notice with film cameras are the focus pullers. some of the charm I loved from yesteryear (including MANY blockbusters) is when the focus pullers couldn't track so perfectly. there was an organic feel about it. of course with digital cinema camera they still use manual focus but we have a lot of computer aids to help. plus I miss matte lines in VFX work ha ha. when I was a kid watching some old special effect laden films, when I saw those matte lines I knew I was in for some magic. like a pavlovian response mechanism ha ha
This reminds me of the components run on PCs in the 2000s-early 2010s, when you'd buy the XXX CPU, however much RAM you needed and the YYY GPU, to find six months later that your CPU was late gen and couldn't support the games you wanted to play. I still see it on some games, Call of duty and the likes, where each new iteration offers more or less the gameplay ("if it ain't broke...") but needing more and more CPU GPU and diskspace to process all the new textures for a "photoreal" game. I think it's good videogames have moved from that with the rise of indy studios, and no one, nowadays bats an eyes at an "unrealistic looking" game, rather focusing on the design and the mechanics. It seems that digital cinematography is on the same slope. So, Netflix requires you to use a camera equipped with a 4K sensor (due to, if I'm not mistaken, some folks on the Netflix board being also the Arri board ?), but really it's an artificial constraint.4K for the sake of 4K, or for being able to brag about your content being 4K. Like you said, some cinematographers still use filters to sotfen it. Or even digital filters to give it a "retro" look. The question of lighting and composition also comes to mind. We grew up on VHS and DVDs, and the great movies were no less impressive. Die Hard remains an amazing piece of action movie with some of the best cinematography ever, whether I put my DVD or my Blu-Ray in the player. Or Michael Mann's Miami Vice (the movie) shot on HD cameras. And it's brilliant and beautiful and makes for an amazing film. On the other hand, some recent netflix movies look like shit. Or, you know, every other netflix movies. This discussion reminds me of late 40s-early 50s technicolor movies. So West Side story (everyone go check out the Spielberg remake BTW, it's amazing), Duel in the sun, the red shoes, she wore a yellow ribbon. The goal was never fidelity, it was heightened hollywood reality. And it works beautifully. So yeah, thanks a lot for making this point and sorry for the long post. Love your work.
Very interesting point of view. In fact when I was watching new movies (filmed in very high res) I was feeling that there were no magic or a lost of "special taste" I didn't think about this before and it makes sense for me too, my attention was scattered in details rather than the story of the film. But as you said in some projects is good filming in high res and for others not. Very good video
Totally agree with this. The obsession with high resolution, over such things as dynamic range, colour depth, bit rates, codecs, and even subjective ‘character’, can actually be quite depressing. Many aren’t aware that the camera with the highest number of cinematography awards to its name, the Arri Alexa, doesn’t even shoot true 4K. For me, the only real consideration when it comes to resolution is what the final production is intended to be viewed on. Modern cameras and glass can be far too clinical and overly sharp for me. I rarely shoot without diffusion and often use vintage lenses to offset the contemporary look. Great video mate 👍🏻
👍🏾 Tommy Rowe, GREAT to see you here, mate!! I purchased your awesome DJI Pocket LUTs awhile back, and luv 'em. Great point about the Arri, too! Hey, wishing you and your family a Fantastic Year, brother.
honestly im glad to see consumer equipment starting to focus on HDRI etc. I thought wed be forever caught in the "resolution wars".. and Thank God the 3D fad is over. it'll never sell until they can pull it off without glasses. corniest fad ever, felt like I was back in the 50's
I like to think of this as being similar to cameras that shoot 60fps and 120fps, or even more. Just because the camera is capable of doing it doesn't mean everything should be shot in 60fps or 120fps, it should only be done when necessary and when it brings something to the story or the artists vision.
I've been considering this for some time and couldn't agree more. I recently watched a film about the camera setup of various youtubers. Most everyone was shooting in 4K. The segment of the video that appealed most to me, however, was shot in FHD. It looked more cinematic, in my opinion. Since then, I've been experimenting with a 12 year old Canon rebel camera and have been really impressed by what can be achieved with light, lenses and patience. Lower resolution has a definite role in certain situations.
That's how I see it too. And I love seeing grain, even if it's just grain from a digital sensor. Which you often don't see in 1080, especially when its compressed (youtube, Netflix etc.) I heard the new ARRI Alexa cameras have even texture settings for the sensor
It's so great that this video exists amidst a scarce few on the same topic, yet it should be noted that there is an inherent downplay on the photochemical choice of filmmaking. Kodak 35mm or 70mm film, provide immense resolution with the inherent components that you present in your video, and are far more superior choices as far as depth of colour, dynamic range, and overall softness is concerned. Needless to say that 65mm IMAX film is an immaculate medium for capturing images, although not used in commercial terms due to financial reasons. Kodak has a few papers on their website in which they compare digital sensors against 35mm film that yield some amazing results. I think it's extremely important that not only you talk about these current trends in the industry, as I was talking about this exact same thing because The Matrix really befuddled me with its cinematography in that regard. Its is extremely important to not forget how sophisticated the human eye is, but also how our perception of what constitutes images cinematic can be atrophied over time, simply by new technology from companies whose sole goal is to produce this new technology only for financial gain.
You're missing 2 important factors here. One is that movies are filmed in high res and then "diffused" because upscaling ruins the quality. It's better to have a "diffused", "vintage lens filmed" in 8k then downscaled to a 4k bluray and/or 1080p TV than having a 720p movie upscaled by deeplearning algorithms or stretched by the monitor. Other factor is that scenes filmed at 8k/12k can be edited in post, with zoom-ins, for example, without losing quality if the final product is sold in 4k/1080p.
Completely agree. 'Upscaling' with film is far different than Digital. That's why we can re-release old movies in Blu-ray with minimal intervention compared to if a film was recorded in 480p and then upscaled to 1440p. I completely get that the standard resolution that we are shooting/watching at is getting near it's necessary limits, but you're locked into the quality that you shot at a LOT more in digital than you are in film.
@@pilebunker420 exactly. You can go back to original film and scan it at a higher resolution and a lot of quality will be retained. Where as you can't with digital. I feel like this channel is trying to apply film logic to digital.
Great video! I also feel that with the most modern cameras pushing 8k+ resolution, the clients/bookings will start to see it as a standard in the gear you bringing, further widening the gap between low-normal budget filmmaking and the $10,000+ cameras shooting in 8k. A sad state, but hopefully we can bring attention to it with videos like this and not let that happen! :)
I'm one of those pople that say: always future proof your resolution. If you want a "softer look" go for softer lenses, not less resolution. That way, in future, on 8K monitors people will still see smooth image instead of smooth squares.
@@km099 I remember when 720p was deemed "unnecessary for most people" because "nobody would ever need this much resolution". how do you know people wont have 10 feet wide screens at home in future? I already do for my 4K projector.
@@bqgin I'm not convinced. Unlike you, I never heard people complain about 720p. The step up from SD to HD was significant and very noticeable to consumers. But resolutions like 6K or 8K? Not so much. I'm talking about final delivery of course, not necessarily the recording. There's simply a point of diminishing returns when it comes to resolution. Stuff like a better dynamic range are way more important, just look at the new 4.6K Alexa 35. Just ask your friends if they can tell whether their local movie theater uses a 2K or 4K digital projector without looking it up.
@@km099 As an indie filmmaker I purchased a 4K 140inch short throw projector for special effects. Let me tell you, the difference between 2K and 4K is immediately noticable. So is on my friends 6K tv I don't know exact dimensions but its alot smaller than my projector screen. And it's extremely apparent with cgi. Maybe you should wear glasses if you can't see the difference? I'm not being mean I genuinely don't know how one can not notice the difference. I get on a phone or a computer screen cause they're small but on a TV and moreso in cinema it is visible even nowadays. Who knows what technology can bring us later?
@@km099 Regular VHS quality was the dogs breakfast compared to 480p double scan. 720p looked gorgeous. 1080p not much better, and 4K is like killing a bathtub spider with an AR-15.
I watched this on a 14" Daewoo CRT I bought for $1 at the thrift shop with a $10 HDMI>RCA converter for my Chromecast. Needless to say, it spoke to my soul. I love watching TH-cam on an ActualTube.
This reminds me of what happened with audio -- basically sample rates climbed and climbed, but after a few years of that it became apparent that 44.1 or 48k were sufficient for pretty much everything. I personally like 1920 x 1080 most of the time, it's easy enough to upscale with no loss in quality.
A very valid point. I have thought of this before too. The extreme level of detail and sharpness can be in your face and rather distracting. Taking the focus from what is actually being conveyed to the things that are used to convey the story -yes, I am saying this even though I am a big proponent of ultra high image quality.
1:44: The "p" stands for "progressive" in this case, not pixels. This means that each frame has all the pixels updated at once, as opposed to "interlaced" where each frame only has every other line updated at once as TV often worked in the past.
it's a little like in todays' music production. state of the art microphones and plug ins and so on make it possible, to make music without any noise whatsoever (unlike in the past, where compressors, microphones and for the very least tapes and vinyls added some (or more) noise in the background). And what do we do? We use saturation, added vinyl crackling and other ambient recordings or "analog noises" to give our peace back some naturalism.
I like that Roger Deakins is such an outlier when it comes to this. He creates some of the most beautiful images in modern cinema but he likes having the cleanest image he possibly can. I remember in interviews he’s said that he doesn’t like seeing any blemishes from the lens or camera used at all. But he fully acknowledges that it’s his preference and will shoot anamorphic or use a stylized technique if it helps tell the story better.
roger has a great lighting skill, paired with a knowledgable DI Colorist, i think thats why his movies dont look "boring", there is always something eye catching in the frame be it the shadows or the color tones.
@@HBarnill True, I meant more in my interpretation of his sentiments he might be willing to do something like that if he and the director agreed it helped tell the story. He did employ that very stylized look for flashbacks in The Assassination of Jessy James by the Coward Robert Ford. But yes he definitely has a clear preference for clean, spherical images.
Great video. Funnily enough, as soon as I saw this video title, I wondered how long before Geff Boyle Cooke Optics video would appear. It's all about the feel that you want for the production or photography.
The 4 nonsensical "obsessions" of new and amateur videographers: - the highest resolution possible - extremely wide open aperture shots - high FPS - anamorphic lenses None of the above is nessesary or makes you automatically a good "film maker". Yet they all want to have it, often without really knowing what all these things really are and how they effect their footage or creating a mood. But they obsess, discuss and fight over it as if there was no tomorrow. In the end, we have to remember Star Wars Episode 1 was shot on a full HD cinema cam. Jaws was shot on film and you can get both in 4k Remastered and never know on which original medium or resolution it was first shot. Also interpolation software these days is INSANELY good for enlagening formats. And I personally also think that a too high resolution movie loses its magic. Also I can't enjoy a lot of remastered movies because they add so much artificial sharpness and correct every little flaw out of the image that all the character of the movie gets lost. Let's get back to Jaws or Apocalypse Now: The remastered versions look stunning and feel as if they could have been shot in the early 2010s (just by judging the image style), but they look nothing like the originals that I love so much. They corrected out the yellow tin from Jaws to make the white balance spot on perfect and released it in 4k with so much sharpness that it looses all it's charme because of it.
I've seen lesser and lesser projects with anamorphic lenses, so idk what you're talking about in that last point. Plus, they actually work against the whole 8k prores crispy netflix thing because they lack the sharpness and even have distortion. Knowing when and where to use them is very important. Take the rise of Marvel for example. So many of the shots in these Marvel movies are shot on spherical, and colour graded like an Amazon commercial.
"They corrected out the yellow tin from Jaws to make the white balance spot on perfect and released it in 4k with so much sharpness that it looses all it's charme because of it." That's just like your opinion man. Also anamorphic lenses are fine, there's a reason they are still used today.
@@zr_1234 You are missing the point. I didn't say anamorphic was bad in any way, I am saying that amateurs who should more focus developing skill before obsessing over gear they don't understand and waste a lot of money on gear, which they could have put into a project to actually have something in their portfolio. And yes, that is indeed my subjective opinion - just as it is yours to disagree.
High resolution is only great for cropping without losing detail but most professional movie productions don't need to crop out 50% of the shot. High resolution is best for non professional filmmakers
Cropping, digital zooming, reducing for sharpening when you miss focus slightly, reducing for better color (on cheaper cameras), stabilization in post, there are tons of reasons a low budget film maker may need those extra pixels. When you are a "one man band" I don't care how good you are, you will mess up time-to-time, and that extra resolution can be a life saver.
I think the flexibility to “crop in in post” is a strong argument for capturing in higher resolution. The movement from a small zoom push or pan faked in the editing can be a huge asset. And it gets a lot easier to crop out a Starbucks cup that found its way into Westeros than it is to rotoscope. For photography cameras (vs cinema) the ability to “punch in” to an APS-C shooting mode allows much faster reframes, which can keep the scene hot and finish the scene sooner and more cheaply. Oversharp images are definitely a thing. But capture in high resolution isn’t to blame.
I recently shot some Super 8 when working on a short film for an organization. I did not know if this would work as I had not touched this camera in 40 years and also I had misgivings about how it would look in the shift to the 4k footage (edited in 1080 though) that makes up the bulk of the film. The Super 8 experiment gave me just enough usable footage to open and close the final film nicely. Inevitably, a friend who saw it praised what she thought were the use of "vintage effects."
You can actually shoot 4k or whatever high resolution you want, but make it look less sharp in post production. Creativity shifts, we are allowed to change our minds. You may think the story requires a less sharp image, but when shot, you discover the medium you or your audience would view your work through, may desperately condemn your work, because you chose to go for less and not fully harness the power you are given. It's better you have it and not use it, as opposed to you not having it and getting phased out or ignored by the people your work of art was to appeal to. 😁
I agree In one sense But You pointed out the major issue without going into it! “When you focus on someone” You’re not bouncing around all over the place like most films are with the handheld look in 4/8K Handheld look is fine for its purpose (personal perspective), but when the only steady shots are super wide landscape/drone shots it screws up and makes you look everywhere but the actors eyes!
The topic has merit of discussion but it's odd to compare a movie intentionally made to look vintage like Jackie with a movie where the vast majority of the time is spent in a simulation designed to look artificial like Matrix 4.
It's perfect for this point because the 16mm in Jackie helps to create a feeling of the early 60's and the detail of Matrix helps contribute to the artificial world feel. The medium as a tool point
@@davidcunningham9282 It wasn't 16 and it wasn't presented as ruggedly as Jackie though. Larrain very much uses film texture to time stamp his movies. He did it perfectly in Spencer this year
@@JonPlarr that's not the point i'm making. i'm just disagreeing with the notion that a matrix movie should look more digital. in fact after rewatching it recently i'd forgotten there's a decent amount of grain
@@davidcunningham9282 I got you. Yeah I have no problem with a futuristic movie still being shot on film. Probably would be my choice. I think set design and color is probably more effective in conveying a feeling in that case than medium anyway
I disagree. I think the advancements in image quality with sensors and lenses has greatly benefitted movie making. The artistic looks of 16mm film and certain aesthetics make movies like Midsommar and The Lighthouse greatly stand out. With Midsommar being shot in 8K with a super clean clinical look that enhanced the visual aspects, where as The Lighthouse being shot on low-sensitivity film and WWII-era lenses gives it a very authentic feel for its time and setting.
This is exactly what he spoke about, that having it available is great, but that many films use the highest resolution when it doesn't really suit that particular movie. It works for some, not for others.
just to oppose your essay, high resolution is a good thing. Back then, 1080p is high end high resolution. additionally, film is high resolution and costly. so you see, we are still below the cost of shooting with film. it "only" becomes not a good thing if one thinks having high resolution makes a product/production/output better. while 8k video is currently dumb, it might matter 10 years from now. another thing i currently hate is fixation on 24p. while it is true it is a matter of taste especially of the older audiences and creators, we should move on and move it to at minimum of 30 fps. 120fps should be supported by youtube by now, it matters with plenty action or plenty camera movement.
shoot 4k and deliver full hd ... gives you freedom and remember ... you can always scale down, but seldomly you can scale up with video ... have a great day and your point is taken ... great video !
I have to fully disagree with you. Super crisp sharpness CAN be incredibly visceral as an experience if its shot artfully. KNIVES OUT has an incredibly sharp image, deliberately exposed and shot (both on 35mm and digital in almost equal measure) to push a super-sharp and very crunchy aesthetic and it makes the film not only look gorgeous, romantic and engaging, but also creates a sense of immediacy as well. 4K scans of VistaVision films have the same effect for me. THE TROUBLE WITH HARRY is breathtaking for instance. I think it really is a matter of personal preference and while there is an aesthetic appropriateness for soft images vs sharp images, sharp images alone are not a problem. You keep using THE MATRIX RESURRECTIONS as an example and yet its actually one of the most atmospheric, rich and textured films I've seen on Digital in a very long time. Especially in the bland wasteland of Marvel and Netflix films. There's no superiority of sharpness or softness inherently within the aesthetic or process. Just how its shot and how its displayed.
Holy crap, yes. Thank you. Every time I hear someone talk about what does or does not make something "look cinematic" I throw up a bit. All of these things are just tools that are used as a means to an end. They can be used skillfully, or they can...not. I thought The Matrix Resurrections looked cinematic as hell. If anyone tries to tell me that looked "too video-y", I am going to tell them they are out of their damn mind. And Rian Johnson? I mean come on. That stuff is going to be stunning. It doesn't matter if it's shot digitally, or at which resolution, or frame rate. If the people know what they are doing, it will look good. I'm willing to bet that few people are criticizing Christopher Nolan in this way for lugging around gigantic IMAX cameras around his shoots. That's he's "too focused on resolution" or whatever. And I'll bet the reason why is because he shoots on film. In that case, it's fine. I think the thing that bothers me most when people talk about what is cinematic, is that so much of it is simply what they've been told is cinematic. And what they were told is cinematic was based on the technological limitations of film-making during the first part of the 20th century. If those same attitudes were applied to other forms of art there'd be two or three "real" painters, and nothing newer than Tchaikovsky would be considered "real" music. Sorry for the rant. I don't mean to direct it at you. I mean, we are in agreement. I just fucking hate gatekeepers, man. Whether it's cinema, video games, whatever. And I can get a bit riled up about it.
I completely disagree with you. Every single movie from say the 90s shot on the film looks more romantic and engaging but it's also related to who the movies used to be lit. Nowadays nobody seems to have a clue what they are doing with the light. And don't get me even started on modern post processing.
@@dash3dot I dunno man. That's a pretty broad statement, and I'd be willing to say objectively not true. There was an awful lot of dreck produced in the 90s, the same as there is an awful lot of dreck produced every decade. And something looking "romantic" or "engaging" is totally subjective. You might prefer a certain look, but that's just a personal preference. It's what you've become acclimated to, or learned from a young age that "this is what cinema looks like", or whatever. And that is totally fine. People will like what they like, and I am all about encouraging them to explore that. But that does not objectively mean film is a better medium than digital formats for making motion pictures. It's just a tool. One that can be used well, or not. The same as any other tool. Shooting on film isn't going to save you if the DP stinks, or if the editor is having a lousy month. You pointed that out yourself, the ability to control light is of utmost importance in a visual medium. Put an iPhone in the hands of a skilled and creative director, with a shit-hot DP, and they will be able to make something that is extremely engaging and beautiful.
@@NicholosRichter Agreed - I can't see someone watching GOOD WILL HUNTING or BEVERLY HILLS COP 3 or SHE'S ALL THAT or a plethora of other flat, piece-meal, blandly filmed movies of the 90's and think that just because they're shot on celluloid that they have some kind of 'romanticism' to them. If anything they've GAINED romanticism just because they look a certain way that's distinctly different from digital, but in context they were nothing to be wowed by. If anything, those films could have been shot on Digital and honestly beyond a much heightened sharpness to edge lines they wouldn't look all that different to something filmed today. If anything its the difference of LED lighting vs tungsten lighting that would make the differences seem more pronounced. It's always been down to the DOP, not the medium. And no matter the medium, there will always be DOPs with ideas that just won't work at all. I am kinda disturbed, personally, by this trend of people thinking that because film (through duplication artefacts) is "Soft" and not as sharp as digital, that this is somehow better.
How did your preference in anyway in contrast with his views in this video? He never said super crisp sharpness can't be visceral or artful, he merely stated that it is a part of the artistic equation, meaning you can CHOOSE to use it to your advantage to manipulate the image's artistic expression. It is just not something that is inherently and automatically superior.
More resolution will ALWAYS look better, even if your source material is shot with 8mm film. Having that scanned and shown at highest possible resolution will make it feel more analog when you get rid of as much of those pixels and that aliasing as possible. And shooting digital with objectively bad lenses for artistic reasons will also look better the higher resolution you have because it captures all those minute details of the lens characteristics better. Except for cost I really don't see no reason whatsoever to go for as high resolution as possible.
You’ve made excellent points, but you didn’t factor compression. Streaming platforms typically give more bit rate to higher resolutions. So much so, a project shot in 1080 but uploaded in 4K would look better than a 1080 upload. My approach is to shoot at a minimum of 4K and soften the image with filters or lenses for the right vibe. I think it’s also important to know the difference between resolution, sharpness and detail. As it’s not always gives the same result.
Me : "I don't want to see Matrix images before I see the film. But I really want to watch the new IDC video. But I don't want to get spoiled. But I want to watch the IDC video now. But..." **brain disconnects**
While I agree with pretty much everything said here, I simply don't like images below 720p. They're too blocky for my tastes. If you're going to give me a less perfect image, please use a filter or a lower mm film camera.
I actually remember reading an interview with Robert Elswit where he said lighting Velvet Buzzsaw was extremely difficult because Netflix insisted on 4K cameras but that meant he needed double the lighting fixtures he needed and needed completely different lenses. And on Billy Lynns Halftime Walk and Gemini Man which shot at 120 frames a second, the reason both films look so visually flat is because they needed even more lighting then with traditional digital or film cameras because having so many lighting fixtures was the only way to get a usable image. So unfortunately there’s an element of truth to some 4K or 8K films having an unnatural look to them.
I think GEMINI MAN especially is only aesthetically useful when you pair it with 3D. On it's own, it is very flat. But the depth the 3D brings to the high framerate, there's nothing else quite like it out there.
You can just shoot at hight res and than export it at a lower one, so you'll have more flexibility in post... (for example crop-in, stabilization, etc.)
Is it really a resolution problem though? Modern native digital 4K movies can look amazing and "cinematic", just look at Roger Deakins work. Film scans can get up to 8K (or even more?) and also look great. But displaying 1080p on 4K display is just a shittier experience, no amount of artsy justification will change that. I prefer the older film look, but that is achievable with high resolution and looks better than targeting 1080p. The bad thing about Matrix: Resurrections isn't the high resolution, it's mostly bland cinematography with majorly handheld shots, with bad shutter speeds that add motion blur. That's why I think it looks like a TV show and not a big budget, thought out, "real" cinema.
I have to disagree with the point about the extras in the background being "more attractive" these days, for a number of reasons. The "modern" shots you used as the example of attractive extras, we're of 1) Amanda Seyfried, who I highly doubt was an extra in the movie you referenced, and 2) a male model extra in the foreground of the shot. You seem to confuse "focus" with "resolution" in all of these shots, or at least complain that the attractive models in the background are only visible because of the increased resolution of digital as opposed to the director of photography choosing a wider depth of field for the shot, a choice he could have made in a film or digital production. All of the reason anyone ever cites for why analog is better than digital (even in the classic argument of records vs CDs) always comes down to some pseudo-scientific bull like "warmth" or "softness" when we all know that anything you want to do to digital to emulate the look of film can be done in a computer. A movie filmed on 16mm film can never be as sharp as one filmed in a 8k, but you can always apply grain, blur, post effects etc, to make 8k footage more gritty. It's about flexibility.
It's more like having a long piece of wire or wooden board that needs to be cut to a certain length, and you need to guess where to cut it. It's better to cut the wood or wire so it ends up too long, even if off by a large amount, then even a bit too short because if it's too long you can always cut more off...it's a lot harder to put it back on to make it longer again. Same goes for digitizing magnetic audio tapes...you want the azimuth to be at the most sizzlingly brightest sound to be captured...there is no setting on the azimuth screw that's going to result in a sound that's brighter than what was originally recorded, anything but the correct setting will produce a duller, muddier sounding transfer. If the tape is screechy or annoying you can run a low pass filter in digital audio editing before you burn the final CD.
@@brentfisher902 yup, not to mention people will complain about modern technology making everything look too digital, as they watch movies filmed on 60mm IMAX film stock on their 13-inch 1080p MacBook air.
I agree with this video definitely, the biggest reason I upgraded to 4k wasn't the high fidelity but rather the other benefits like lowlight (smaller pixels = less noticeable noise), zooming in and out (I use it in nearly every single shot), more data to work with allowing higher bitrates and then manipulate the image to something more than you used to be capable of... many other things really...
actually larger sensor pixels can get better low light. noise is always a problem but theyre getting there. never thought id see the ISO's were seeing now.
@@flipnap2112 I was talking about video pixels not sensor pixels. The denser the pixel count in a video will produce cleaner video. But yes, bigger sensor = bigger pixels = better lowlight
Achieving a less sharp image is something I am completely on board with. Achieving it by using low resolutions I am absolutely not on board with. Its different to have a soft organic image and its different to have a low res blocky aliased pictured. Low resolution always looks worse. I usually like soft images with high resolution.
I think it generally depends on the project. One thing that is usually true is that it's easier to degrade a nice image than improving a bad one. But, I generally agree that not everything needs to be overly sharp.
What does that even mean? If you degrade a good image by say putting a promist filter over the sensor, how does that mean it started as a good image? Does that mean using sharp glass in front of softer 35mm film makes 35mm film a bad image which is being improved with sharp glass?
Owning a 4k disc for me is just about owning the best version of a movie. Imo when you have huge landscape scenes like in 1917 or Hostiles or Dunkirk, 4k is better. Kubrick's 2001 looks great with HDR. Some 4k discs improve the sound quality. Sometimes I just want a cool looking steelbook, lol. But for the most part the best version of a movie is the original director approved blu ray where the colors aren't touched and the overall feel of the movie is as intended.
This is very interesting to me, as I feel similarly when it comes to video game resolutions. Let me explain: I own a 4K monitor on which I play my games, and my PS5 usually is capable of outputting 4K images, which results in overly sharp images; it doesn't look cinematic to me. The previous generation (PS4 Pro) wasn't fully capable of outputting true 4K images; they were usually sub 4K, in the 1800p area, and to me that looked more cinematic. The edges of the objects looked less jagged and very slightly blurry, which looked more natural.
Lower framerate(~33 fps from my own tests but YMMV) will make it look "cinematic" in addition to the art style. I disagree with your opinion because many more photorealistic games benefit from higher resolutions cleaning up slightly underexposed areas in addition to helping smooth out aliasing. Lower resolutions will look MORE jagged because they will have more aliasing(unless it has a smooth artstyle with good anti-aliasing). If your games look too sharp try lowering the sharpness on the monitor itself.
Great video! I have similar feelings about these digital cameras. Although being able to record at high resolutions with great dynamic range, perhaps the visuals became too perfect where it’s not pleasing on the eyes. Reminds me of friends I know who work in audio production where they record a digital recording onto analog tape to remove the “digital-ness” in the original where the tape adds analog hiss and other imperfections. Or another example would be adding grain or even a simple vignette onto digital photos. What was originally an imperfection of the analog format, then replaced with perfectly designed sensors and lenses but returns back for that nostalgic and intimate look.
Disagree. I think an image should hold as many details as possible. I see so many production flaws in hi-res and while that's annoying it will result in an evolution of moving pictures, there will be better props, better costumes, better photography, better scenes, better lights. I kinda stopped watching films that aren't UHD and rewatching my old favs in UHD is mindbending. I don't see why in the age of MUBI and AppleTv or Netflix anyone should watch pixelated pictures
i feel that your right with resolution being a tool. but practically, a resolution that is slightly higher than what you want your output to be is best as you are able to adjust the frame and crop, and have more space for post production. this is what i do with my personal projects, as i dont want to return from a shoot and realise that i don't like the shot, and have to redo the shoot. of course, this is all up to the preference of the producer, this is just my opinion.
And there's the Hara-Kiri problem too...sometimes one of the actors fails catastrophically before you get another go at busting more frames in his backside.
@in depth cine a great way to put this issue to bed is to put some classic films shot on Super16, Super8, and the various popular 35mm stock and perf variations all together in a 4K scan and let's all see that the reality is. when we drop in broadcast TV from the 80s/90s...people will begin to understand why there is still a significant market for the OG BMPCC As always, thank you for your insightful, well presented discussions. #TH-camFilmSchool essentials!
Great video! I will go more to the lower resolution side of filmmaking every time! I shoot digital, but I also shoot a lot of film0which I prefer. All I hear people say these days are “is it in 8K?” I do not like at all the hyper-sharpness of video, unless it is in sports or live events. I love the subdued look of film/video, & I will keep SHOOTING for that, Cheers!
I watched a comparison vid where the guy shot a 3 minute mini. he used everything from 720 to 8k. pretty much nobody could tell the difference, and when they could it was because of the color packing.
Totally agree! I've had to "soften" at times to make the feel, feel right. BUT, being able to shoot 8K gives you flexibility in zoom. YOu can always soften in post.
Great points there. I think this obsessions with high res gets perpetuated by the marketing machines of the tech companies and as a selling point for movies or shows way too much. On the other hand, I'm happy to see there are also other trends where people are going back to analog cameras whose results feel more authentic to many (including myself).
For me the reasons that I shoot in 4K or higher and deliver usually in 1080 is because of VFX and the flexibility it gives me in post. Anyone who’s tried to rotoscope or key soft footage anything in less than 4K knows how artifacts become really obvious and masks/mattes can look distractingly harsh, so something like a matrix movie being shot high-res and with little diffusion makes sense (maybe soften in post). Shooting at a higher res than delivered also gives me a lot of flexibility with cropping and reframing shots without losing too much detail (you can get a wide, medium, and a close with one camera set up with negligible downsides). In some cases I’ve even been able to enhance dolly zooms by having the footage zoom in post as well. It also makes things like stabilizing shots or doing digital pans and tilts easier.
I’m old enough to remember 35mm still cameras were for amateurs. Professionals used medium format because the images were so detailed with very little grain you could blow it up to a 40x60 and have it look great. If we did not want that sharpness we would put on soft filters. You can soften an image to take off the edge of the sharpness but you can’t make something that is soft sharp. If we want grain we can always add it in post. Resolution is a great thing. I had a favorite lens on my Hasselblad that was from the early 60’s. It was great for portraits because it was not super sharp but that does take away the greatness of the sharpness a medium format. One thing we can learn from the lessons of the past is that we never learn from the past.
The new DUNE movie was shot digital (on the ARRI ALEXA LF and Mini LF), then was transferred to 35mm film, and then was scanned back to digital. All that to create the most accurate emulation possible, reducing the digital sharpness, and elevating softness.
I didn't even know that you can transfer digital footage to analog film to begin with. Does the process have some sort of special name?
@@ThePooper3000 That would be the "Film-Out" method. This is where the movie is printed out to film using a film printer and has been used as long as there have been digital film editing software. Originally done to print scenes with digital effects, it extended to printing whole film reel masters from it's computer film editing software. Sadly this method is not used as much anymore thanks to digital cinema, but some still use it.
@@victorseastrom3455 Because they got a better result doing it this way. They tried shooting on film when doing tests on set and they didn't get the result they wanted but still wanted some of the characteristics of shooting on film so they went with the third option which is this digital to film method. It's just as valid an option as any other.
@@victorseastrom3455 The DoP of Dune was Greig Fraser, one of the best out there. He’s shot with both film and digital. On Roger Deakin’s podcast he talks about testing both film and digital for dune, but they decided to go with the Alexa LF and the the film print/re-scan in post, to give a look that they felt sat in the middle of the two mediums. On a personal note, I think they made a very good choice. I think it’s one of my favourite looking films to have ever been shot.
@@victorseastrom3455 digital is cheaper to shoot on and much more convenient to work with
I guess part of the reason why everything is shot in 4k these days is because its safer and offers more options in post; its far easier to remove detail from, and soften a high resolution image than doing the inverse.
I agree, it’s even (sometimes) detrimental to the use of VFX…
Agreed. I thought its applications for ease of production were rather obvious
I guess that's not the point of the video. the thing is people are obsessed with gear thinking it will make their work better, but only skill and dedication can do it.
4k is okay but now they are pushing specs on another level 6k/8k.
exactly
We shouldn't confuse resolution with sharpness here. A digital and extremely sharp look is certainly a stylistic choice, and not always desirable. But as stated in the Video, high resolution doesn't necessarily lead to a very sharp image.
Low resolution on the other hand, (especially once one starts to see individual pixels) looks extremely digital to me, and should be treated as a stylistic choice as well.
Really high resolutions like 8k give a pretty neutral representation of what is captured by the lens. The level of sharpness and detail in the image can then be freely controlled by adding diffusion, choosing a softer lens etc.
Considering this, I'd say higher resolution actually can lead to a more organic looking digital image.
this honestly put into words what I was thinking, while I will agree 8K isn't always needed, the video mostly talks about sharpness, and the two while being partially linked, are not the same. I also would like to add as a viewer (though I also work with cameras typically for photography but I want to try my hand at cinematography) when something is delivered at a lower resolution than the display device or if the display device has too low of a resolution for the size, especially noticeable on stuff like 1080p projectors in home setups (and 480 line CRTs but that's obsolete), you can see the pixels and the loss of detail in an undesirable way, but when you deliver the content at the appropriate resolution for the screen size and distance it looks more natural on the screen and the obviousness that its artificial disappears, then combine that with scanned film or diffusion filters/any other way to soften the image without lowering the resolution its self and you can have a soft image without harsh pixels that can distract from the viewing experience. The video goes on about resolution when sharpness is the main thing here,
Why use a 8k footage using diffusion / other filters instead of just recording in 2k or 1080p? Saving costs and time...
@@eduardo.chaves Because the Result is NOT the same. Sure 2k will save cost and time, but it will be a different result.
This was my thought as well, along with bitrate being a greater limiting factor in this era of streaming video.
This. Thank you. High resolution does not mean the end result has to be tack sharp. There are numerous ways to soften a shot if that is the goal but doing it via lower digital resolution is not generally going to be the best choice. And then there is the whole issue of color space and HDR that could be more important to presentation that resolution. This whole video was nonsense.
Honestly, high resolution is just simply peace of mind knowing I can crop or punch in any way I want in post. You can film further from your subject and then crop closer in post without losing detail for the perfect framing.
Surely your DOP/cinematographer should be in charge of getting the perfect framing on set. It's lazy to say "I won't move closer, I'll just crop it in post and reframe then?"
Human eyes are a lot more sensitive to contrast than resolution. A 2K presentation in HDR will look sharper to our eye than 4K SDR. Or scenes with a high dynamic range. 4K scenes that are very flat in range will seem less appealing and dimensional. But 4K is more than enough. I can really enjoy a supercrisp 4K presentation like The Revenant but the softer more romantic look also has it's charm. It's a creative decision. 8K is a waste of space, only usefull to have more reframing possibilities in editing but ridiculous as an output resolution.
@@berlin03030 I bought an LG B7V 65 inch OLED 4 years ago for about 2200 euros. Very happy with it. Blacks are perfect. The max brightness is 750 nits for 10% of the screen but it works very well. More recent OLED's do have higher max brightness and thus a bit better HDR. But I wartch in a darkened room so it's not a big issue for me. OLED screen have been getting cheaper the last couple of years. A similar TV now is about 1500 euros so considerably cheaper. QD-OLED might drive the price down.
To add to your point, dynamic range IS also a creative decision. A lack of dynamic range in a shot doesn't necessitate it being less appealing or dimensional. Shooting a scene in higher dynamic range where it diminishes the artistic intention is just as silly, of course I imagine it's way easier to "correct" this in post.
Depends on the image size. 4K is not resolution, it is simply a number of pixels.
@@TinLeadHammer I think we all know the difference between pixels and the actual detail that is resolved. My point was for a theoretical perfect 2K and 4K presentation. A lot of UHD discs don't actually resolve 4K detail, certainly not those 'real' 35mm scans. But that was not my point.
Maybe for the average viewer with a TV, 8K might be ridiculous. But for people with big projection screens, the added resolution might bring the home experience closer to the cinema experience.
Resolution when it comes to digital cinema is always misunderstood. Acquisition of 8K has nothing to do with pushing resolution higher just "because". 8K acquisition generates a much better supersampling and it generally softens the image, not sharpen it. You mentioned nothing of this in the video? Supersampling algorithms are usually using sharpness when rescaling, but many choose to not oversharpen while supersampling down in order for the sharpness to become natural instead. Shooting 8K with this supersampling moves a digital camera closer to an IMAX resolve. 8K supersampled down to 4K removes the digital problems of the bayer sensor, you get pure pixels instead of divided between red/green/blue. The digital noise is also lowered.
The point is that you use optical elements to set the character for the image, instead of letting the sensor block your work. A problem is also that you chose the new Matrix movie that is so overly produced in color grading and mismatching shutter speeds that it's a very bad representation of digital cinema. Even though the point is that resolution is a creative choice, we still have standardized 4K now, new TVs aren't sub-4K in resolution and HDR will in a couple of years also be standardized. There's no point in learning a workflow that does not end with a 4K HDR master at the output because that's gonna be the standard for any project soon. It's a rip the band-aid situation and people need to understand how digital resolution practically works, not just that it's a choice.
For instance, take something like the Red Raptor and use vintage glass on it. The supersampling will take care of the digitalness of the image, and without post-sharpening, the image will resolve everything about the vintage glass. Apply proper post-workflows, maybe even post-grain processes and if everyone knows the tech knows the post-workflow, it can reach 4K resolution without those bad high-resolution looks that people complain about. It's a joint effort between the art department, makeup, cinematographer, director, image pipeline personnel, colorist, and VFX.
I've done hundreds of projects since digital cinema first matured and every time there are people who don't know what they're doing, i.e they're incompetent, the digital image becomes trash. Shooting high-resolution digital cinema requires people who understand both the technology AND the art of creating images. The problem right now is that there are either people who knows the art or people who knows the tech. Back in the days of 35mm, that was not the case. Back then, cinematographers understood the technology of 35mm, they knew how to handle it. But today, many cinematographers have no knowledge, or interest to learn the knowledge of digital cinema, which leads to incompetent handling of the technology.
And that my friends is why you don't choose your DI from the camera department. Hiring a proper DI is the way to establish the best baseline for the image.
What's your job?
@@namesurname624 Started out on set as a camera crew up to focus puller, moved to editing, post production and VFX, now I'm writing and directing. So I've been through most segments of production, from tech to art and that's why I mean tech and art aren't opposites, tech informs the art, the art demands the tech.
hence the word Proper
Do you have a website? I’d love to connect and view your work
You’ve illustrated 100% of my feelings towards today’s filming trends. I really hate that netflixy feel, these projects lack so much of substance that they try as hard as they can to fill them with over good looking cast, over sharp images, overuse of catchy music, injustified camera movements, and so on… everything feel so “plastic” it’s absolutely awful.
It’s terrible to see that things are more and more driven by money making, and not only in cinema, but all sort of art mediums, and in fact in almost every aspect of our lives today, everything gets industrialised… this + the fact that today’s children are educated by this type of content i’m worried about what future artists they will become…
Alright mate calm down
I absolutely agree!
i think the craze for high resolution dictated by mainstream audiences that are already accustomed to clean sharp images (ai processed smartphone pictures, high end clean look commercials), therefore many companies demand things to be shot at a high resolution and a clean color tone.
high resolution can be great when handled properly, such as mindhunter shot on 6k and 8k.
Loool bro evolve man, the fact is just we have better cinema and art than ever. Embrace the technologies man and don’t get left behind
@@mubaraksuleiman5227 I believe that diversity of medium has to be preserved.
Filmmakers must keep the choice of medium on which they want their stories to be told…
I've come to realize that NOT having too many options in post can lead to an easier and better edit.
love this point you made; just because technology can do something doesn’t mean its always right. We have all these incredible cameras out there with 8k capabilities but DP's such as myself slap on a black pro-mist and add a load of film grain to give it the same look that 16mm would give off anyways.
True!! Thats the funny part.. at this point if youre not shooting a film like Matrix, actual "8k output" would result in an image so clinical that you would have to slap 20 filters to add character to It 🤣🤣
It is ironic we do this, BUT...I'd say it's still cheaper (and faster maybe?) to go this convoluted route than to do it with the equipment that does it by default especially since everything is distributed digital these days anyway.
@@heavysystemsinc. you’re totally right it still is cheaper and in many ways less risky
I still think high resolution looks amazing though. If it takes a more skilled dp to work with higher resolutions then so be it.
Digital grain isn’t very good tho.
Look at films like 3 from hell that used the old footage from the previous shot on film installment to match the grain structure.
That works only due to the care and attention used with that add on. Than compare to others like joker which look so unnatural that it takes away from the presentation due to the uniformity real grain doesn’t have.
It’s better to retain the natural look of the camera you are using than to add so many layers and effects that it’s like you’re watching when the killer calls.
It is easy and a lot more cost effective but when in the movie business has those words ever yielded a movie worth talking about.
This video was really encouraging to me. I don't really have enough money to get a high res camera, I always thought I couldn't make my own films because of this limitation. This told me I don't need to worry about that. My job is just to make art.
I got an old bmpcc og and I don't care what other people say, I'm a filmmaker now.
I love how you keep it simple with the editing yet it's an amazing way of delivering the topics, happy new year!
Happy new year!
4K HDR should be the common and final resolution for consumer devices.
Great points made here. I'd love to point out how intriguing it is that The Matrix is the visual centerpiece used to frame the discussion. I agree with many of the points made and I'd also argue that The Matrix is one of the few counterexamples in which high-res and ultra-sharp visuals actually serve the art and further the intention on the film and story. Makes this video even more wonderful and complex, well done!
Those HYPER crisp images in the new Matrix felt very intentional. Especially since the movie is SO concerned with capturing faces. You see ever pore, scar, and wrinkle. It's quite striking.
The next step in this conversation with regards to action films but The Matrix more specifically is BLOCKING. After watching Resurrections, I went back and looked at a few scenes from the trilogy and found myself enjoying and following the action than the new installment. Why? I think because of the size of the camera. I believe the trilogy was shot on a large 35mm camera (my guess is a Panavision or Arri). They’re bigger, heavier, more tactile. It’s not like a nimble little Red Komodo or the like that you can just whip around set on a whim. The result was I found the new movie to have actions scenes that were either unintelligible in terms of a storytelling (I’m thinking specifically of the Merovingian fight in Resurrections) as opposed to the trilogy (take your pic; they’re all extremely clear). There is something we’re beginning to lose in terms of craft at the altar of digital clarity, because more and more we’re enamored not by the data we’ve captured for post, but the flippant nature of our blocking with such powerful little tools at our disposal.
@@_mixedsignals while I see what you are saying there are a number of films shot with heavy film cameras that had very rapid movement like the Bourne sequels. Yet those offer more clarity the Taken which also shot using similarly heavy cameras, and the same “shaky cam” aesthetic.
It’s just now with the easier barriers means people have quicker means to opt for that quick blocking. Without ever needing to learn to work within “the box” of a weighty camera rig. And this happening all over again with the ronin 4d, I have seen so many people now thinking everything needs to a one take tracking shot just because they can.
So I think the age old adage “just because we can doesn’t mean me should.” Applies here.
It the case of the matrix I have heard Lana loved the small rigs because she felt liberated enough to use a more free flowing work flow and add set ups on the fly. Where as reloaded and revolutions used a stricter approach which involved copious amounts of takes. (30-50 per set up.)
This burned out Bill Pope so much that he hasn’t worked with Lana or Lilly again.
So in the pursuit of effective well crafted shots that tell the story it’s always important to remember how many people are involved and try to find a way to capture the film without burning out your crew members.
For me personally it all comes down to prep, getting a grip on the story and then the logistics and gear to capture that story while ensuring the actors (hopefully) get enough time to play around a bit and be ready for any “happy accidents”.
“We’ll fix it in post.” Has and always will be a rather large red flag, (of course there are numerous examples of post saving films but using as a crutch is never a great place to start.)
It’s also very VFX heavy and having low-res, soft footage makes it far more difficult for visual effects artists to effectively do their job.
Couldn’t agree more! It really annoys me when clients like Netflix only want their project shot in 4K limiting what cameras you can use.
Netflix is simply serving what the consumers demand and the consumers want 4k because TV manufacturers have been banging on about it for 10 years now.
Its shot in 4k for future proofing reasons.
Steve yedlin, ASC did an amazing deep dive into the difference between technical resolution, and the perceived resolution to the human eye - I think it would interest a lot of people
when footage is graded to hell and back, you can literally achieve the result you want.
Human I can’t see beyond 3k I believe.
@@Ljm488 The "resolution" of the human eye is a pretty complex topic, but it's better described in terms of pixel density than total resolution; and even then it depends on viewing distance. For instance, a 4K smartphone is wasteful overkill, while on a large tv, 4K is more meaningful - but if you're far enough away, it could be 720p and you wouldn't be able to notice.
@@Ljm488 ppi matters a lot more than resolution, the eye can see well beyond 8k in the right circumstances
@@AxTechs in terms of pixels on a monitor or computer not really. That’s what I meant, how the eye views an image in relation to an image. And Steve Yeldin did say the exact thing I’m saying in his video so I’m just relaying that message
Higher resolutions like 8k would benefit more a screen big as those in movie theaters especially imax.
I feel like comparing the old Matrix with the newer one would have been better. Having two totally different films with a drastically different look and feel doesn't quite work.
You can also see this same comparison with the most recent and previous. Representing the matrix with more abstract photography is more fitting in my opinion. The world is abstract, thus if the photography is abstract it just helps to make the world feel more erie. Which is what the original achieved very well as apposed to the new adaptation.
I disagree here, while I have other problems with the new Matrix fillm, I think this hyper realistic and artificial look adds to the digital world of the matrix which is quite artificial. Maybe it would've looked to different but I could imagine they could've shot the "real world" scenes on actual Film to further seperate both worlds. In the end it is a tool you use to convey your message: Do you paint with big or small brushes?
You can easily make 1080p from 4k, or remove sharpness on post, or crop. So it complitely reasonable to shoot highest resolution possible.
I think you're missing out on one crucial consideration when access to 4K and 8K cameras is becoming more and more universal: the longevity of high resolution master for different output solutions across a wide range of platforms for years to come
Diffusion filters on an 8K capture and 1080p are two very different things. The filter isn't "bringing the fidelity back down" to be in line with a low res look. Using a small sensor/small film format is a way different look from low digital res, too.
You can remember a great movie when it's almost a textural feel, the lighting mood & music, pacing of the camera ... everything feels like an organic vision of someone or a small team of people collaborating. So many movies today feel focused on the ip or a story that might go viral online that they forget that there's so much else to consider ... such as not making something automatically extremely high-res and sanitised.
Just recently re-watched French Connection 1 and that really does have an organic / textural feel to it.
I love recording in 8k or 4k to set 1080p as final output for a simple reason: it’s easier to crop to zoom while editing, this helps a lot to get “fake camera zoom” to focus on a specific part of the subject (like the faces).
It gives a bit more freedom on low budget projects, framing the scene is easier since some parts could be cropped and discarded later.
Recording directly in 1080p requires more precise shots, making “digital zoom in editing” impossible.
Speak for yourself. When I meet a new person, I also focus on the details of their face, including blemishes. Thus, I prefer high resolution movies because it feels real, not dreamy. I just don’t wanna be an spectator, I wanna be immersed.
YOU answered your own question the lower the res. The blockier the image, we associate that with cheap and crappy, which most low res cameras are (now don't say arris only use 1080p n 2k since recently, there 1080ps look so good do to the shape of there noise and etc, but I'm not talking about 1080p, even though it doesn't look as crisp) I'm talking about lower resolutions show noise more, are blockier and typically stink. Period. We also like to be able to punch in images and zoom during post, it's a workflw thing and makes it slot easier in pot production when footage is higher res.
The higher the resolution of the sensor and recording format, the more organic and less pixelated the image will be. Choosing a softer lens and/or filming on S16mm film (which will limit the effective resolution) are valid artistic choices, which will not negatively affect the technical quality of the image and won't cause it to look "digital". Over-sampling (shooting in a higher resolution than the finishing resolution will end up being), can reduce the appearance of digital noise, and improve image quality. Giving the advice to shoot in lower resolutions is truly a terrible advice. Shooting or even finishing in less than 4k is going to limit distribution options, will sometimes even limit which festivals the project be exhibited in, and will cause it to appear more digital "videoy" and pixelated when watched on a high resolution display. You can soften the image with the right choices of lenses and filters, and can also do it in post production in a more controlled way, reaching whichever look you desire.
Great video! I think there was a time in the digital world where resolution actually mattered. Because the more resolution you had, the more natural/organic the image looked and the less you were reminded that you're looking at something digital. And I think it was very important for technology to move over that threshold where the pixels don't distract anymore. For me that threshold is in most scenarios 1080p/2K. It's a very neutral way of presenting footage. I think 4K is still kinda within the boundaries of being neutral. Anything above 4K and below 1080p for me is a stylised choice and it doesn't affect quality, but style.
While I think it's important to use new technology and advance, I think it's important that we accept that there WAS a time when more resolution meant better image quality, but that we are now past that. Now resolution doesn't mean the same anymore and neither does quality.
Technologically I think we are at a sweet spot now for the way images are viewed/used today. There might be workflow improvements for filmmakers that helps them tell the stories (high resolution is one of them), but It won't affect the technical quality of the image we see in the end.
Maybe it's really a sweet spot most of us should embrace and use the chance to focus more on the storytelling and less on technology. In the end that's what good technology does, it makes you forget that you're using it.
Depends on the size of the screen and how close you are to it.
Kind of like when I was so happy to get a birthday card done on a dot matrix computer printer when I was 9 in 1991...I was like "wow, the future is here, cool dots..." but later I learned that the drive for better quality was to make it look so that a document made on a computer should not look like it was 'made on a computer'. I kind of get the nostalgic feel for big pixels, simple polygons/teapots, a black void background, and music with oscillator/FM synthesizers...technology just doesn't seem so futuristic anymore...
Please do a video explanation of how composition, framing etc is used in cinema (especially on old classics which used the academy ratio).
Wow, a lot of terrific comments below - and love the video - in my mind spot on. I once sang in a choir where the chap beside me was so over perfect in his words, it was downright annoying singing beside him - it was unnatural and unpleasant. Digital is done the same thing in a way - so over the top it is not much fun anymore. No one looks at the tiny pores in a face, or the insane detail in the Hobbit (which was a bomb in comparison to LOTR).
We are absorbed in the story, like the great movies. Yes Ben Hur was shot in 70mm for that detail but it was for an enormous wide screen that needed it. The story was king and a person was so absorbed in the story, they didn't look at insane resolution. I have both Blueray and standard 1080 editions and quite honestly, when I got to watching the movie, i forgot about the detail and got entrenched in the story. Oh, that was 1080? Hmmm, never knew.
Ya, this video is excellent. Well done.
You seem to have confused resolution with sharpness
You did not convince me. Recording in a higher resolution is always better if there are no tradeoffs. The image being too sharp should not be a concern, since it can always be blurred in post production. In the real world there are obviously trade-offs to be made considering storage and editing, so 2K or 4K are close to the sweet spot, but that's just for today. In the future, higher resolutions will be used more until we get to the point where human eyes won't be able to tell the difference.
I totally agree resolution isn't needed for every movie it brings out the flaws of the production, for example I watched a movie recently and because it was super high resolution I could see the actors wearing fake moustaches and beards it took me out of the story. Film grain brings out detail too in a movies that's why older movies look so good in 4K UHD disc because of the grain there is so much detail.
I think your example of the downsides of high resolution is very true but I disagree with your statement about grain. Grain doesn't increase detail in any way because it's just a random pattern. The image might appear sharper but that's a different story.
@@maxkern4419 This is not something I made up about film grain, some Filmmakers have said it that filming on older cameras that weren't digital they way to bring out detail was how on it was filmed on old film stock which was very grainy but it bought out the detail.
@@Capeau oh really, then I must of misunderstood it.
The grain acts like a dithering pattern, allowing you to perceive details that would normally be below the noise floor. The same trick is used in audio mastering. A 7bit piano recording sounds way more realistic with a spoonful of dithering noise than without.
@@bigogleOh really, but when I watch behind the scenes of movies the filmmakers talks about how the film grain brings out details, am I misinterpreting it.
before watching this video my opinion is:
A sharper image is not important. High resolution is very nice and I like it but the sharpness of an image is something different!
After watching the video:
Yeah my opinion hasn't changed. The sharpness is an artistic thing. I actually put a slight blur ontop most of my irl shot videos, just because I like the look of old films. But I still keep it on a higher resolution. Blurry images look different on different resolutions! In his video it sounds like the higher resolution are sharp everytime.
Look at movies like Jurassic Park. The blu ray has a greater resolution but the image itself is still blurry. I hope someone gets what I trying to say...
So I agree that a low resolution can achieve the look you want but where I disagree is that I think a high res video can still result in the same artistic feeling as a low res.
3:10 16mm can be delivered in 4k. Go watch Evil Dead 1981 on 4k Blu-ray.
Also you need a certain amount of resolution or things look terrible. Case in point 28 Days Later 2002 looks terrible today. The only thing that really matters is having a sufficient resolution for the viewing device. Since 28 Days Later was captured with a 480i digital camera the movie is mostly ruined for future use because most devices are too high of resolution. One of the beauty of film is you just can keep scanning it. Even 16mm can be scanned at 8k would benefit from a higher scan theoretically if you had a 8k tv and a native 8k source since you wouldn't need any upscaling.
If you scan 16mm film at 4K you aren't getting a more detailed image. You are merely getting more detailed grain. It's similar to exporting a 1080 HD project in 8K: it can work, but there isn't more detail in the exported project.
Great points! I would add one thing: lens matters! A lot of "digital look" comes not from resolution itself but from very sharp modern glass that is used with a camera. I prefer to shoot with vintage lenses, made in early 70s and even with my 4K camera I end up with nice, filmic soft look that I adore. 4K gives me some comfort in post-production (for example for zooming in slightly in post), soft lenses give me that vintage, organic vibe.
very true. another thing I notice with film cameras are the focus pullers. some of the charm I loved from yesteryear (including MANY blockbusters) is when the focus pullers couldn't track so perfectly. there was an organic feel about it. of course with digital cinema camera they still use manual focus but we have a lot of computer aids to help. plus I miss matte lines in VFX work ha ha. when I was a kid watching some old special effect laden films, when I saw those matte lines I knew I was in for some magic. like a pavlovian response mechanism ha ha
This reminds me of the components run on PCs in the 2000s-early 2010s, when you'd buy the XXX CPU, however much RAM you needed and the YYY GPU, to find six months later that your CPU was late gen and couldn't support the games you wanted to play. I still see it on some games, Call of duty and the likes, where each new iteration offers more or less the gameplay ("if it ain't broke...") but needing more and more CPU GPU and diskspace to process all the new textures for a "photoreal" game. I think it's good videogames have moved from that with the rise of indy studios, and no one, nowadays bats an eyes at an "unrealistic looking" game, rather focusing on the design and the mechanics.
It seems that digital cinematography is on the same slope. So, Netflix requires you to use a camera equipped with a 4K sensor (due to, if I'm not mistaken, some folks on the Netflix board being also the Arri board ?), but really it's an artificial constraint.4K for the sake of 4K, or for being able to brag about your content being 4K. Like you said, some cinematographers still use filters to sotfen it. Or even digital filters to give it a "retro" look.
The question of lighting and composition also comes to mind. We grew up on VHS and DVDs, and the great movies were no less impressive. Die Hard remains an amazing piece of action movie with some of the best cinematography ever, whether I put my DVD or my Blu-Ray in the player. Or Michael Mann's Miami Vice (the movie) shot on HD cameras. And it's brilliant and beautiful and makes for an amazing film. On the other hand, some recent netflix movies look like shit. Or, you know, every other netflix movies.
This discussion reminds me of late 40s-early 50s technicolor movies. So West Side story (everyone go check out the Spielberg remake BTW, it's amazing), Duel in the sun, the red shoes, she wore a yellow ribbon. The goal was never fidelity, it was heightened hollywood reality. And it works beautifully.
So yeah, thanks a lot for making this point and sorry for the long post. Love your work.
Very interesting point of view. In fact when I was watching new movies (filmed in very high res) I was feeling that there were no magic or a lost of "special taste" I didn't think about this before and it makes sense for me too, my attention was scattered in details rather than the story of the film. But as you said in some projects is good filming in high res and for others not. Very good video
Totally agree with this. The obsession with high resolution, over such things as dynamic range, colour depth, bit rates, codecs, and even subjective ‘character’, can actually be quite depressing.
Many aren’t aware that the camera with the highest number of cinematography awards to its name, the Arri Alexa, doesn’t even shoot true 4K. For me, the only real consideration when it comes to resolution is what the final production is intended to be viewed on.
Modern cameras and glass can be far too clinical and overly sharp for me. I rarely shoot without diffusion and often use vintage lenses to offset the contemporary look.
Great video mate 👍🏻
👍🏾 Tommy Rowe, GREAT to see you here, mate!! I purchased your awesome DJI Pocket LUTs awhile back, and luv 'em. Great point about the Arri, too! Hey, wishing you and your family a Fantastic Year, brother.
@@FilmSpook hey dude! 👋
Ah that’s awesome. Pleased to hear you like them and they’re working well for you! Thanks very much mate, right back at ya 🙌
@@RoweFilms 😎 Very Welcome, Appreciated!
honestly im glad to see consumer equipment starting to focus on HDRI etc. I thought wed be forever caught in the "resolution wars".. and Thank God the 3D fad is over. it'll never sell until they can pull it off without glasses. corniest fad ever, felt like I was back in the 50's
Color detail can matter. A 4k 422 image will have the same color detail as a 2k 444 image. Resolution is a side effect, but tonality is important
I like to think of this as being similar to cameras that shoot 60fps and 120fps, or even more. Just because the camera is capable of doing it doesn't mean everything should be shot in 60fps or 120fps, it should only be done when necessary and when it brings something to the story or the artists vision.
Shhhh, don’t tell that to gamers.
@@Skrenja unlike movies, games benefit tremendously from high fps. Imagine playing games in sub 60 fps in 2022 lol.
I've been considering this for some time and couldn't agree more. I recently watched a film about the camera setup of various youtubers. Most everyone was shooting in 4K. The segment of the video that appealed most to me, however, was shot in FHD. It looked more cinematic, in my opinion. Since then, I've been experimenting with a 12 year old Canon rebel camera and have been really impressed by what can be achieved with light, lenses and patience. Lower resolution has a definite role in certain situations.
4K resolution can still be soft, depending on the camera and lens used
That's how I see it too. And I love seeing grain, even if it's just grain from a digital sensor. Which you often don't see in 1080, especially when its compressed (youtube, Netflix etc.)
I heard the new ARRI Alexa cameras have even texture settings for the sensor
I understand. You made good points. It comes down to the filmmakers choice of style.
And what their vision is.
It's so great that this video exists amidst a scarce few on the same topic, yet it should be noted that there is an inherent downplay on the photochemical choice of filmmaking. Kodak 35mm or 70mm film, provide immense resolution with the inherent components that you present in your video, and are far more superior choices as far as depth of colour, dynamic range, and overall softness is concerned. Needless to say that 65mm IMAX film is an immaculate medium for capturing images, although not used in commercial terms due to financial reasons. Kodak has a few papers on their website in which they compare digital sensors against 35mm film that yield some amazing results. I think it's extremely important that not only you talk about these current trends in the industry, as I was talking about this exact same thing because The Matrix really befuddled me with its cinematography in that regard. Its is extremely important to not forget how sophisticated the human eye is, but also how our perception of what constitutes images cinematic can be atrophied over time, simply by new technology from companies whose sole goal is to produce this new technology only for financial gain.
Resolution, or larger negative size?
I actually like to overshoot the resolution on set and then bring it down in the editing render, which helps a lot with effects and color grading.
You're missing 2 important factors here. One is that movies are filmed in high res and then "diffused" because upscaling ruins the quality. It's better to have a "diffused", "vintage lens filmed" in 8k then downscaled to a 4k bluray and/or 1080p TV than having a 720p movie upscaled by deeplearning algorithms or stretched by the monitor. Other factor is that scenes filmed at 8k/12k can be edited in post, with zoom-ins, for example, without losing quality if the final product is sold in 4k/1080p.
It all depends on what you want to do and how you want the end product to look like
Completely agree. 'Upscaling' with film is far different than Digital. That's why we can re-release old movies in Blu-ray with minimal intervention compared to if a film was recorded in 480p and then upscaled to 1440p. I completely get that the standard resolution that we are shooting/watching at is getting near it's necessary limits, but you're locked into the quality that you shot at a LOT more in digital than you are in film.
@@Obie. isn't film just scanned with higher res output?
@@pilebunker420 afaik yes, though the amount of meaningful detail is still limited by the gauge of the film
@@pilebunker420 exactly. You can go back to original film and scan it at a higher resolution and a lot of quality will be retained. Where as you can't with digital. I feel like this channel is trying to apply film logic to digital.
Great video!
I also feel that with the most modern cameras pushing 8k+ resolution, the clients/bookings will start to see it as a standard in the gear you bringing, further widening the gap between low-normal budget filmmaking and the $10,000+ cameras shooting in 8k.
A sad state, but hopefully we can bring attention to it with videos like this and not let that happen! :)
Can you do a cinematography style video on Linus Sandgren?
Portrait of a Lady on Fire was shot digitally and looks incredible.
Films from the 70s remastered to real 4K look incredible.
A very interesting and a bit divisive topic…
really loving the ultra wide video format, and a great video, keep it up good sir.
I'm one of those pople that say: always future proof your resolution. If you want a "softer look" go for softer lenses, not less resolution. That way, in future, on 8K monitors people will still see smooth image instead of smooth squares.
People might be able to tell the difference between 8K and 4K on a billboard sized screen, but at home? I don't think so
@@km099 I remember when 720p was deemed "unnecessary for most people" because "nobody would ever need this much resolution". how do you know people wont have 10 feet wide screens at home in future? I already do for my 4K projector.
@@bqgin I'm not convinced. Unlike you, I never heard people complain about 720p. The step up from SD to HD was significant and very noticeable to consumers. But resolutions like 6K or 8K? Not so much.
I'm talking about final delivery of course, not necessarily the recording.
There's simply a point of diminishing returns when it comes to resolution. Stuff like a better dynamic range are way more important, just look at the new 4.6K Alexa 35.
Just ask your friends if they can tell whether their local movie theater uses a 2K or 4K digital projector without looking it up.
@@km099 As an indie filmmaker I purchased a 4K 140inch short throw projector for special effects. Let me tell you, the difference between 2K and 4K is immediately noticable. So is on my friends 6K tv I don't know exact dimensions but its alot smaller than my projector screen. And it's extremely apparent with cgi.
Maybe you should wear glasses if you can't see the difference? I'm not being mean I genuinely don't know how one can not notice the difference. I get on a phone or a computer screen cause they're small but on a TV and moreso in cinema it is visible even nowadays. Who knows what technology can bring us later?
@@km099 Regular VHS quality was the dogs breakfast compared to 480p double scan. 720p looked gorgeous. 1080p not much better, and 4K is like killing a bathtub spider with an AR-15.
I watched this on a 14" Daewoo CRT I bought for $1 at the thrift shop with a $10 HDMI>RCA converter for my Chromecast. Needless to say, it spoke to my soul. I love watching TH-cam on an ActualTube.
This reminds me of what happened with audio -- basically sample rates climbed and climbed, but after a few years of that it became apparent that 44.1 or 48k were sufficient for pretty much everything. I personally like 1920 x 1080 most of the time, it's easy enough to upscale with no loss in quality.
True. manufacturers were pimping 96K, and even 192k audio, and mp3 came and crashed the whole party. 44.1 is enough for 99.99% people.
@@sanaksanandan When was the last time you heard someone played anything in 44.1k?
A very valid point. I have thought of this before too. The extreme level of detail and sharpness can be in your face and rather distracting. Taking the focus from what is actually being conveyed to the things that are used to convey the story -yes, I am saying this even though I am a big proponent of ultra high image quality.
1:44: The "p" stands for "progressive" in this case, not pixels. This means that each frame has all the pixels updated at once, as opposed to "interlaced" where each frame only has every other line updated at once as TV often worked in the past.
Just another video content creator making videos about a subject they don't grasp. Nothing new.
Yup and he seems to conflate sharpness with resolution
it's a little like in todays' music production. state of the art microphones and plug ins and so on make it possible, to make music without any noise whatsoever (unlike in the past, where compressors, microphones and for the very least tapes and vinyls added some (or more) noise in the background). And what do we do? We use saturation, added vinyl crackling and other ambient recordings or "analog noises" to give our peace back some naturalism.
I like that Roger Deakins is such an outlier when it comes to this. He creates some of the most beautiful images in modern cinema but he likes having the cleanest image he possibly can. I remember in interviews he’s said that he doesn’t like seeing any blemishes from the lens or camera used at all. But he fully acknowledges that it’s his preference and will shoot anamorphic or use a stylized technique if it helps tell the story better.
Yes as long as it's on an Arri sensor.
Deakins has never once shot on anamorphic.
roger has a great lighting skill, paired with a knowledgable DI Colorist, i think thats why his movies dont look "boring", there is always something eye catching in the frame be it the shadows or the color tones.
@@HBarnill True, I meant more in my interpretation of his sentiments he might be willing to do something like that if he and the director agreed it helped tell the story. He did employ that very stylized look for flashbacks in The Assassination of Jessy James by the Coward Robert Ford. But yes he definitely has a clear preference for clean, spherical images.
Great video. Funnily enough, as soon as I saw this video title, I wondered how long before Geff Boyle Cooke Optics video would appear. It's all about the feel that you want for the production or photography.
The 4 nonsensical "obsessions" of new and amateur videographers:
- the highest resolution possible
- extremely wide open aperture shots
- high FPS
- anamorphic lenses
None of the above is nessesary or makes you automatically a good "film maker". Yet they all want to have it, often without really knowing what all these things really are and how they effect their footage or creating a mood. But they obsess, discuss and fight over it as if there was no tomorrow. In the end, we have to remember Star Wars Episode 1 was shot on a full HD cinema cam. Jaws was shot on film and you can get both in 4k Remastered and never know on which original medium or resolution it was first shot. Also interpolation software these days is INSANELY good for enlagening formats. And I personally also think that a too high resolution movie loses its magic. Also I can't enjoy a lot of remastered movies because they add so much artificial sharpness and correct every little flaw out of the image that all the character of the movie gets lost.
Let's get back to Jaws or Apocalypse Now:
The remastered versions look stunning and feel as if they could have been shot in the early 2010s (just by judging the image style), but they look nothing like the originals that I love so much. They corrected out the yellow tin from Jaws to make the white balance spot on perfect and released it in 4k with so much sharpness that it looses all it's charme because of it.
"bro it's not even External rec 8K ProRes RAW high dynamic Range, your film will suck"
I've seen lesser and lesser projects with anamorphic lenses, so idk what you're talking about in that last point. Plus, they actually work against the whole 8k prores crispy netflix thing because they lack the sharpness and even have distortion. Knowing when and where to use them is very important. Take the rise of Marvel for example. So many of the shots in these Marvel movies are shot on spherical, and colour graded like an Amazon commercial.
"They corrected out the yellow tin from Jaws to make the white balance spot on perfect and released it in 4k with so much sharpness that it looses all it's charme because of it." That's just like your opinion man.
Also anamorphic lenses are fine, there's a reason they are still used today.
@@zr_1234 You are missing the point. I didn't say anamorphic was bad in any way, I am saying that amateurs who should more focus developing skill before obsessing over gear they don't understand and waste a lot of money on gear, which they could have put into a project to actually have something in their portfolio.
And yes, that is indeed my subjective opinion - just as it is yours to disagree.
agree with everything but the anamorphic point.
This is a long video that can be summed up as "Artistic choice"
High resolution is only great for cropping without losing detail but most professional movie productions don't need to crop out 50% of the shot. High resolution is best for non professional filmmakers
Cropping, digital zooming, reducing for sharpening when you miss focus slightly, reducing for better color (on cheaper cameras), stabilization in post, there are tons of reasons a low budget film maker may need those extra pixels.
When you are a "one man band" I don't care how good you are, you will mess up time-to-time, and that extra resolution can be a life saver.
I think the flexibility to “crop in in post” is a strong argument for capturing in higher resolution. The movement from a small zoom push or pan faked in the editing can be a huge asset. And it gets a lot easier to crop out a Starbucks cup that found its way into Westeros than it is to rotoscope. For photography cameras (vs cinema) the ability to “punch in” to an APS-C shooting mode allows much faster reframes, which can keep the scene hot and finish the scene sooner and more cheaply.
Oversharp images are definitely a thing. But capture in high resolution isn’t to blame.
I recently shot some Super 8 when working on a short film for an organization. I did not know if this would work as I had not touched this camera in 40 years and also I had misgivings about how it would look in the shift to the 4k footage (edited in 1080 though) that makes up the bulk of the film. The Super 8 experiment gave me just enough usable footage to open and close the final film nicely. Inevitably, a friend who saw it praised what she thought were the use of "vintage effects."
Never thought of it that way, but I absolutely see your point
You can actually shoot 4k or whatever high resolution you want, but make it look less sharp in post production. Creativity shifts, we are allowed to change our minds. You may think the story requires a less sharp image, but when shot, you discover the medium you or your audience would view your work through, may desperately condemn your work, because you chose to go for less and not fully harness the power you are given. It's better you have it and not use it, as opposed to you not having it and getting phased out or ignored by the people your work of art was to appeal to. 😁
Just like having a Naval Ship Gun when obeying the 2nd Amendment...it's better to have it and not need it, then need it and not have it.
I agree In one sense
But
You pointed out the major issue without going into it!
“When you focus on someone”
You’re not bouncing around all over the place like most films are with the handheld look in 4/8K
Handheld look is fine for its purpose (personal perspective), but when the only steady shots are super wide landscape/drone shots it screws up and makes you look everywhere but the actors eyes!
The topic has merit of discussion but it's odd to compare a movie intentionally made to look vintage like Jackie with a movie where the vast majority of the time is spent in a simulation designed to look artificial like Matrix 4.
It's perfect for this point because the 16mm in Jackie helps to create a feeling of the early 60's and the detail of Matrix helps contribute to the artificial world feel. The medium as a tool point
the original matrix was done on film so that argument doesn't hold up
@@davidcunningham9282 It wasn't 16 and it wasn't presented as ruggedly as Jackie though. Larrain very much uses film texture to time stamp his movies. He did it perfectly in Spencer this year
@@JonPlarr that's not the point i'm making. i'm just disagreeing with the notion that a matrix movie should look more digital. in fact after rewatching it recently i'd forgotten there's a decent amount of grain
@@davidcunningham9282 I got you. Yeah I have no problem with a futuristic movie still being shot on film. Probably would be my choice. I think set design and color is probably more effective in conveying a feeling in that case than medium anyway
The reason why we keep going up from 1080,2k,4k,now 8k is because of porn. In film school, this is literally the first thing they told us.
I disagree. I think the advancements in image quality with sensors and lenses has greatly benefitted movie making. The artistic looks of 16mm film and certain aesthetics make movies like Midsommar and The Lighthouse greatly stand out. With Midsommar being shot in 8K with a super clean clinical look that enhanced the visual aspects, where as The Lighthouse being shot on low-sensitivity film and WWII-era lenses gives it a very authentic feel for its time and setting.
Thats literally the point he made at the end.....
This is exactly what he spoke about, that having it available is great, but that many films use the highest resolution when it doesn't really suit that particular movie. It works for some, not for others.
just to oppose your essay, high resolution is a good thing. Back then, 1080p is high end high resolution. additionally, film is high resolution and costly. so you see, we are still below the cost of shooting with film. it "only" becomes not a good thing if one thinks having high resolution makes a product/production/output better.
while 8k video is currently dumb, it might matter 10 years from now. another thing i currently hate is fixation on 24p. while it is true it is a matter of taste especially of the older audiences and creators, we should move on and move it to at minimum of 30 fps. 120fps should be supported by youtube by now, it matters with plenty action or plenty camera movement.
i only watch films on my apple smartwatch so 4k is ok I guess
The resolution on the smartwatch is too good, really ruins the experience, I only watch movies on my ipod nano.
shoot 4k and deliver full hd ... gives you freedom and remember ... you can always scale down, but seldomly you can scale up with video ... have a great day and your point is taken ... great video !
I have to fully disagree with you. Super crisp sharpness CAN be incredibly visceral as an experience if its shot artfully.
KNIVES OUT has an incredibly sharp image, deliberately exposed and shot (both on 35mm and digital in almost equal measure) to push a super-sharp and very crunchy aesthetic and it makes the film not only look gorgeous, romantic and engaging, but also creates a sense of immediacy as well.
4K scans of VistaVision films have the same effect for me. THE TROUBLE WITH HARRY is breathtaking for instance.
I think it really is a matter of personal preference and while there is an aesthetic appropriateness for soft images vs sharp images, sharp images alone are not a problem. You keep using THE MATRIX RESURRECTIONS as an example and yet its actually one of the most atmospheric, rich and textured films I've seen on Digital in a very long time. Especially in the bland wasteland of Marvel and Netflix films.
There's no superiority of sharpness or softness inherently within the aesthetic or process. Just how its shot and how its displayed.
Holy crap, yes. Thank you. Every time I hear someone talk about what does or does not make something "look cinematic" I throw up a bit. All of these things are just tools that are used as a means to an end. They can be used skillfully, or they can...not. I thought The Matrix Resurrections looked cinematic as hell. If anyone tries to tell me that looked "too video-y", I am going to tell them they are out of their damn mind. And Rian Johnson? I mean come on. That stuff is going to be stunning. It doesn't matter if it's shot digitally, or at which resolution, or frame rate. If the people know what they are doing, it will look good.
I'm willing to bet that few people are criticizing Christopher Nolan in this way for lugging around gigantic IMAX cameras around his shoots. That's he's "too focused on resolution" or whatever. And I'll bet the reason why is because he shoots on film. In that case, it's fine.
I think the thing that bothers me most when people talk about what is cinematic, is that so much of it is simply what they've been told is cinematic. And what they were told is cinematic was based on the technological limitations of film-making during the first part of the 20th century. If those same attitudes were applied to other forms of art there'd be two or three "real" painters, and nothing newer than Tchaikovsky would be considered "real" music.
Sorry for the rant. I don't mean to direct it at you. I mean, we are in agreement. I just fucking hate gatekeepers, man. Whether it's cinema, video games, whatever. And I can get a bit riled up about it.
I completely disagree with you. Every single movie from say the 90s shot on the film looks more romantic and engaging but it's also related to who the movies used to be lit. Nowadays nobody seems to have a clue what they are doing with the light. And don't get me even started on modern post processing.
@@dash3dot I dunno man. That's a pretty broad statement, and I'd be willing to say objectively not true. There was an awful lot of dreck produced in the 90s, the same as there is an awful lot of dreck produced every decade. And something looking "romantic" or "engaging" is totally subjective. You might prefer a certain look, but that's just a personal preference. It's what you've become acclimated to, or learned from a young age that "this is what cinema looks like", or whatever. And that is totally fine. People will like what they like, and I am all about encouraging them to explore that.
But that does not objectively mean film is a better medium than digital formats for making motion pictures. It's just a tool. One that can be used well, or not. The same as any other tool. Shooting on film isn't going to save you if the DP stinks, or if the editor is having a lousy month. You pointed that out yourself, the ability to control light is of utmost importance in a visual medium. Put an iPhone in the hands of a skilled and creative director, with a shit-hot DP, and they will be able to make something that is extremely engaging and beautiful.
@@NicholosRichter Agreed - I can't see someone watching GOOD WILL HUNTING or BEVERLY HILLS COP 3 or SHE'S ALL THAT or a plethora of other flat, piece-meal, blandly filmed movies of the 90's and think that just because they're shot on celluloid that they have some kind of 'romanticism' to them.
If anything they've GAINED romanticism just because they look a certain way that's distinctly different from digital, but in context they were nothing to be wowed by. If anything, those films could have been shot on Digital and honestly beyond a much heightened sharpness to edge lines they wouldn't look all that different to something filmed today. If anything its the difference of LED lighting vs tungsten lighting that would make the differences seem more pronounced.
It's always been down to the DOP, not the medium. And no matter the medium, there will always be DOPs with ideas that just won't work at all.
I am kinda disturbed, personally, by this trend of people thinking that because film (through duplication artefacts) is "Soft" and not as sharp as digital, that this is somehow better.
How did your preference in anyway in contrast with his views in this video? He never said super crisp sharpness can't be visceral or artful, he merely stated that it is a part of the artistic equation, meaning you can CHOOSE to use it to your advantage to manipulate the image's artistic expression. It is just not something that is inherently and automatically superior.
More resolution will ALWAYS look better, even if your source material is shot with 8mm film. Having that scanned and shown at highest possible resolution will make it feel more analog when you get rid of as much of those pixels and that aliasing as possible. And shooting digital with objectively bad lenses for artistic reasons will also look better the higher resolution you have because it captures all those minute details of the lens characteristics better. Except for cost I really don't see no reason whatsoever to go for as high resolution as possible.
You’ve made excellent points, but you didn’t factor compression. Streaming platforms typically give more bit rate to higher resolutions. So much so, a project shot in 1080 but uploaded in 4K would look better than a 1080 upload. My approach is to shoot at a minimum of 4K and soften the image with filters or lenses for the right vibe.
I think it’s also important to know the difference between resolution, sharpness and detail. As it’s not always gives the same result.
Me : "I don't want to see Matrix images before I see the film. But I really want to watch the new IDC video. But I don't want to get spoiled. But I want to watch the IDC video now. But..." **brain disconnects**
While I agree with pretty much everything said here, I simply don't like images below 720p. They're too blocky for my tastes. If you're going to give me a less perfect image, please use a filter or a lower mm film camera.
yeah, I agree. This is the thing where I disagree with the message of the video. You can have a blurry image but a high resolution.
if you're using RED CAMERA any band, you should be using Resolution 2K and 4K It's working No Blur almost 100% Frame rate Per Second (FPS)
I actually remember reading an interview with Robert Elswit where he said lighting Velvet Buzzsaw was extremely difficult because Netflix insisted on 4K cameras but that meant he needed double the lighting fixtures he needed and needed completely different lenses.
And on Billy Lynns Halftime Walk and Gemini Man which shot at 120 frames a second, the reason both films look so visually flat is because they needed even more lighting then with traditional digital or film cameras because having so many lighting fixtures was the only way to get a usable image.
So unfortunately there’s an element of truth to some 4K or 8K films having an unnatural look to them.
Great point!
I think GEMINI MAN especially is only aesthetically useful when you pair it with 3D. On it's own, it is very flat. But the depth the 3D brings to the high framerate, there's nothing else quite like it out there.
@@iansmart4158that movie looks like video game and not in a good way.
filmmakers should just lie to netflix. because netflix lies anyway. stream is not true 4k. netflix ceo's/producers would not know the difference.
You can just shoot at hight res and than export it at a lower one, so you'll have more flexibility in post...
(for example crop-in, stabilization, etc.)
Is it really a resolution problem though? Modern native digital 4K movies can look amazing and "cinematic", just look at Roger Deakins work. Film scans can get up to 8K (or even more?) and also look great. But displaying 1080p on 4K display is just a shittier experience, no amount of artsy justification will change that. I prefer the older film look, but that is achievable with high resolution and looks better than targeting 1080p.
The bad thing about Matrix: Resurrections isn't the high resolution, it's mostly bland cinematography with majorly handheld shots, with bad shutter speeds that add motion blur. That's why I think it looks like a TV show and not a big budget, thought out, "real" cinema.
I love that you gave the Fran 8k a nod 😆 Could have been the greatest camera of all time
I have to disagree with the point about the extras in the background being "more attractive" these days, for a number of reasons.
The "modern" shots you used as the example of attractive extras, we're of 1) Amanda Seyfried, who I highly doubt was an extra in the movie you referenced, and 2) a male model extra in the foreground of the shot.
You seem to confuse "focus" with "resolution" in all of these shots, or at least complain that the attractive models in the background are only visible because of the increased resolution of digital as opposed to the director of photography choosing a wider depth of field for the shot, a choice he could have made in a film or digital production.
All of the reason anyone ever cites for why analog is better than digital (even in the classic argument of records vs CDs) always comes down to some pseudo-scientific bull like "warmth" or "softness" when we all know that anything you want to do to digital to emulate the look of film can be done in a computer.
A movie filmed on 16mm film can never be as sharp as one filmed in a 8k, but you can always apply grain, blur, post effects etc, to make 8k footage more gritty.
It's about flexibility.
It's more like having a long piece of wire or wooden board that needs to be cut to a certain length, and you need to guess where to cut it. It's better to cut the wood or wire so it ends up too long, even if off by a large amount, then even a bit too short because if it's too long you can always cut more off...it's a lot harder to put it back on to make it longer again. Same goes for digitizing magnetic audio tapes...you want the azimuth to be at the most sizzlingly brightest sound to be captured...there is no setting on the azimuth screw that's going to result in a sound that's brighter than what was originally recorded, anything but the correct setting will produce a duller, muddier sounding transfer. If the tape is screechy or annoying you can run a low pass filter in digital audio editing before you burn the final CD.
@@brentfisher902 yup, not to mention people will complain about modern technology making everything look too digital, as they watch movies filmed on 60mm IMAX film stock on their 13-inch 1080p MacBook air.
I agree with this video definitely, the biggest reason I upgraded to 4k wasn't the high fidelity but rather the other benefits like lowlight (smaller pixels = less noticeable noise), zooming in and out (I use it in nearly every single shot), more data to work with allowing higher bitrates and then manipulate the image to something more than you used to be capable of... many other things really...
actually larger sensor pixels can get better low light. noise is always a problem but theyre getting there. never thought id see the ISO's were seeing now.
@@flipnap2112 I was talking about video pixels not sensor pixels. The denser the pixel count in a video will produce cleaner video. But yes, bigger sensor = bigger pixels = better lowlight
Achieving a less sharp image is something I am completely on board with. Achieving it by using low resolutions I am absolutely not on board with. Its different to have a soft organic image and its different to have a low res blocky aliased pictured. Low resolution always looks worse. I usually like soft images with high resolution.
I think it generally depends on the project. One thing that is usually true is that it's easier to degrade a nice image than improving a bad one. But, I generally agree that not everything needs to be overly sharp.
What does that even mean? If you degrade a good image by say putting a promist filter over the sensor, how does that mean it started as a good image? Does that mean using sharp glass in front of softer 35mm film makes 35mm film a bad image which is being improved with sharp glass?
Owning a 4k disc for me is just about owning the best version of a movie. Imo when you have huge landscape scenes like in 1917 or Hostiles or Dunkirk, 4k is better. Kubrick's 2001 looks great with HDR. Some 4k discs improve the sound quality. Sometimes I just want a cool looking steelbook, lol. But for the most part the best version of a movie is the original director approved blu ray where the colors aren't touched and the overall feel of the movie is as intended.
you know the uhd blurays sometimes aren’t even 4k, just 1080p with a higher bitrate
@@AdventVFX Ooohff don’t even start…
This is very interesting to me, as I feel similarly when it comes to video game resolutions. Let me explain: I own a 4K monitor on which I play my games, and my PS5 usually is capable of outputting 4K images, which results in overly sharp images; it doesn't look cinematic to me. The previous generation (PS4 Pro) wasn't fully capable of outputting true 4K images; they were usually sub 4K, in the 1800p area, and to me that looked more cinematic. The edges of the objects looked less jagged and very slightly blurry, which looked more natural.
Lower framerate(~33 fps from my own tests but YMMV) will make it look "cinematic" in addition to the art style. I disagree with your opinion because many more photorealistic games benefit from higher resolutions cleaning up slightly underexposed areas in addition to helping smooth out aliasing. Lower resolutions will look MORE jagged because they will have more aliasing(unless it has a smooth artstyle with good anti-aliasing).
If your games look too sharp try lowering the sharpness on the monitor itself.
2k, thats the best one I swear
Great video! I have similar feelings about these digital cameras. Although being able to record at high resolutions with great dynamic range, perhaps the visuals became too perfect where it’s not pleasing on the eyes. Reminds me of friends I know who work in audio production where they record a digital recording onto analog tape to remove the “digital-ness” in the original where the tape adds analog hiss and other imperfections. Or another example would be adding grain or even a simple vignette onto digital photos. What was originally an imperfection of the analog format, then replaced with perfectly designed sensors and lenses but returns back for that nostalgic and intimate look.
Disagree. I think an image should hold as many details as possible. I see so many production flaws in hi-res and while that's annoying it will result in an evolution of moving pictures, there will be better props, better costumes, better photography, better scenes, better lights. I kinda stopped watching films that aren't UHD and rewatching my old favs in UHD is mindbending. I don't see why in the age of MUBI and AppleTv or Netflix anyone should watch pixelated pictures
I got into it with some upscale nerds recently about this. Thanks for the video!
i feel that your right with resolution being a tool. but practically, a resolution that is slightly higher than what you want your output to be is best as you are able to adjust the frame and crop, and have more space for post production. this is what i do with my personal projects, as i dont want to return from a shoot and realise that i don't like the shot, and have to redo the shoot.
of course, this is all up to the preference of the producer, this is just my opinion.
And there's the Hara-Kiri problem too...sometimes one of the actors fails catastrophically before you get another go at busting more frames in his backside.
@in depth cine a great way to put this issue to bed is to put some classic films shot on Super16, Super8, and the various popular 35mm stock and perf variations all together in a 4K scan and let's all see that the reality is. when we drop in broadcast TV from the 80s/90s...people will begin to understand why there is still a significant market for the OG BMPCC
As always, thank you for your insightful, well presented discussions. #TH-camFilmSchool essentials!
Great video! I will go more to the lower resolution side of filmmaking every time! I shoot digital, but I also shoot a lot of film0which I prefer. All I hear people say these days are “is it in 8K?” I do not like at all the hyper-sharpness of video, unless it is in sports or live events. I love the subdued look of film/video, & I will keep SHOOTING for that, Cheers!
I watched a comparison vid where the guy shot a 3 minute mini. he used everything from 720 to 8k. pretty much nobody could tell the difference, and when they could it was because of the color packing.
Totally agree! I've had to "soften" at times to make the feel, feel right. BUT, being able to shoot 8K gives you flexibility in zoom. YOu can always soften in post.
The whole video had one thing to say: resolution is an artistic choice. I think it became a little repetitive as it went on.
Great points there. I think this obsessions with high res gets perpetuated by the marketing machines of the tech companies and as a selling point for movies or shows way too much. On the other hand, I'm happy to see there are also other trends where people are going back to analog cameras whose results feel more authentic to many (including myself).
For me the reasons that I shoot in 4K or higher and deliver usually in 1080 is because of VFX and the flexibility it gives me in post.
Anyone who’s tried to rotoscope or key soft footage anything in less than 4K knows how artifacts become really obvious and masks/mattes can look distractingly harsh, so something like a matrix movie being shot high-res and with little diffusion makes sense (maybe soften in post).
Shooting at a higher res than delivered also gives me a lot of flexibility with cropping and reframing shots without losing too much detail (you can get a wide, medium, and a close with one camera set up with negligible downsides). In some cases I’ve even been able to enhance dolly zooms by having the footage zoom in post as well. It also makes things like stabilizing shots or doing digital pans and tilts easier.
I’m old enough to remember 35mm still cameras were for amateurs. Professionals used medium format because the images were so detailed with very little grain you could blow it up to a 40x60 and have it look great. If we did not want that sharpness we would put on soft filters. You can soften an image to take off the edge of the sharpness but you can’t make something that is soft sharp. If we want grain we can always add it in post. Resolution is a great thing. I had a favorite lens on my Hasselblad that was from the early 60’s. It was great for portraits because it was not super sharp but that does take away the greatness of the sharpness a medium format. One thing we can learn from the lessons of the past is that we never learn from the past.