Dave, I have only been practicing photography for about 4 years, but the comparison between FF/APS-C and the effect crop factors have on lenses is the most misrepresented topic on TH-cam. I find your videos on the FF/APS-C comparisons to be the most accurate.
Thanks, I've been getting them for a few weeks now and I keep removing them, just bots and probably can't do much about them ... But I guess it classes as more views 🤣
But he is completely wrong, ISO is calculated per inch. In order to set the "same settings", ISO800 on APS-C is the same setting as ISO1800 on Full Frame; the formula is to multiply ISO by (crop factor)**2, e.g. ISO125 on APS-C is ISO280 on full frame to get the same light capture, noise level, etc., etc. - i.e. the same settings. This is like saying a large bucket holds the same water per square inch as a small bucket (which is what ISO measures) - ok, so? An ISO number is pointless without knowing the sensor size and applying the relevant calculation to it.
@@Thirsty_Fox Not exactly, you get the same exposure at ISO1600 in photo and ISO12,800 in video on many Sony bodies. On your iPhone, you would've noticed you get crazy low ISOs like 32 to get the same exposure as much higher ISOs on APS-C or Full Frame. You also have different and conflicting standards like SOS and REI. But as a general rule, if you want to match exposures between sensor sizes, multiplying ISO by the crop factor squared gets you pretty close. But as I said, since most cameras are ISO invariant, what you should really be doing is shooting at the native ISO, changing your exposure in post editing and then comparing the final images, if your purpose is to compare different systems or sensor sizes.
Totally agree with you re: sensor sizes vs. equivalent exposure. If you were not correct, sensor size would have to be figured into the "Sunny 16" rule; however, on a bright day, the exposure for any size format would be 1/ISO at f/16... No need to figure in the sensor size... Where the difference in aperture between different sensor sizes originates is the difference in depth of field. Simply stated: framing any image identically between sensor sizes using the same focal length lens will require you to shoot from a closer distance with the larger sensor or from a longer distance using the smaller sensor. If you shoot from a closer distance using the same f/stop and same focal length, the DOF will be narrower than shooting from a farther distance with the same parameters. Actually, shooting from the SAME DISTANCE using the SAME FOCAL LENGTH will result in a narrower DOF for the smaller sensor and a wider DOF for the larger sensor...
There is a difference hidden in the ISO amplification set by the manufacturer. The f-stops of lenses are misleading: they work great with the exposure triangle - hence you get the same exposure regardless of your lens or camera sensor - but there is a caveat: the f-stop by itself doesn't tell you how much light passes through, not without the focal length anyway. The real measurement of a lens's "lightness" is the entrance pupil, and when you start counting that, everything starts to become clearer. A full frame 50mm f/2 lens has a 25mm entrance pupil - this is the size of the hole, filtered by the FoV cone - exactly the amount of light the lens passes through. For u43 with the same FoV and landscape compression you need a 25mm lens at f/2 - but now your entrance pupil is only 12.5mm, a 4 times smaller lens-pipeline area. Same FoV, same landscape compression, same exposure. That's why ISO 100 on u43 is similar to ISO 400 on FF in terms of noise - it has to be amplified by 2EV to get the same exposure. In-camera it says the same ISO here and there, but they mean different amplification.
But entrance pupil size is proportional to the f-stop anyway, given that the f-stop is a ratio of focal length to pupil size - to keep the same aperture at longer focal lengths you have to widen the entrance pupil. Yes, the full frame lens has a larger entrance pupil so lets more light through, but that light is spreading over a full frame sensor, whereas the M4/3 lens lets less light through but doesn't spread it out as far - thus the amount of light that falls on each unit area of the sensor remains the same.
But entrance pupil size isn't the true measurement of light passing through the lens, as that can depend on how much glass is in the lens, the types of glass elements used and any coatings on the optics. The true measurement of light passing through is T-stops (light transmission), which is the measurement used on all cine lenses, so any 2 lenses with the same T-stop value will put the same density of light onto the sensor, and M4/3 cine lenses have the same values as full frame cine lenses. DxOMark even give the light transmission values for all lenses they test in their database and the values are always roughly the same regardless of sensor size.
The difference in noise performance between sensors comes down to the fact that larger photosites create a better signal to noise ratio so have less interference.
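A rough numeric sketch of that exchange, using the 50mm f/2 vs 25mm f/2 example from the comment above and assuming an idealised lossless lens (no T-stop losses) and nominal sensor dimensions:

```python
# Sketch: same f-stop means the same light per unit of sensor area, even though
# the larger format's entrance pupil admits more total light.

def entrance_pupil_mm(focal_length_mm, f_number):
    # f-number is defined as focal length / entrance pupil diameter
    return focal_length_mm / f_number

ff_pupil = entrance_pupil_mm(50, 2.0)    # full frame 50mm f/2 -> 25.0 mm
mft_pupil = entrance_pupil_mm(25, 2.0)   # M4/3 25mm f/2 (same FoV) -> 12.5 mm

# Pupil AREA scales with diameter squared, so the FF lens admits ~4x the total light
print((ff_pupil / mft_pupil) ** 2)       # 4.0

# But the sensor area is also ~4x larger (36x24mm vs ~17.3x13mm), so the light
# landing on each square millimetre - and therefore the exposure - is nearly identical
ff_area, mft_area = 36.0 * 24.0, 17.3 * 13.0
print((ff_pupil ** 2 / ff_area) / (mft_pupil ** 2 / mft_area))   # ~1.04
```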
@@DaveMcKeegan DxOMark is a good one, it immediately shows you that an f/0.95 on APS-C has the same light transmission as an f/1.4 on Full Frame. This is mathematically sound and can be tested experimentally too. When you apply a crop factor, e.g. 23mm on APS-C is 35mm on Full Frame to get the same field of view, you MUST apply the calculation to the entire exposure triangle, e.g. a 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. If you forget to apply the calculation to ISO, sure, you may get the same exposure but drastically different images when it comes to depth of field and noise levels. This isn't about sensor quality, as this has been demonstrated time and time again across different systems - look at Tony Northrup's videos.
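For anyone wanting to reproduce that arithmetic, here is the conversion being described, as a small sketch (1.5 is the usual APS-C crop factor; whether the ISO step belongs in the conversion at all is what the rest of this thread debates):

```python
# A sketch of the "full equivalence" conversion described above: scale focal length
# and f-number by the crop factor, and ISO by the crop factor squared.

def ff_equivalent(focal_mm, f_number, iso, crop_factor=1.5):
    return (round(focal_mm * crop_factor, 1),    # matches field of view
            round(f_number * crop_factor, 1),    # matches depth of field
            round(iso * crop_factor ** 2))       # matches total light / noise (the disputed step)

print(ff_equivalent(23, 2.8, 400))   # (34.5, 4.2, 900) ~ "35mm f/4.2 ISO 900"
```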
What I figured out is that it's not entirely the sensor's fault. Given sensors from the same generation, the low light performance is about the same, but it depends which lenses are used. If you have a 400mm f4 on FF you would need a 200mm f2 on MFT to get the same performance, but you will get the depth of field of a 200mm lens, which can be an advantage or a disadvantage depending on what you want to achieve.
From my understanding, crop sensors get a bad rep because people don't have the proper lenses for them, and those lenses either aren't available, are very expensive, or are impossible to make. Let's say you want the same field of view and same depth of field as a 50mm f1.2 on FF, but on MFT: you would need a 25mm f0.6, which would be expensive and almost impossible to manufacture (f0.5 is the physical limit for any lens). Most people don't have such lenses, and that's the reason people think their FF body has better low light capabilities than their crop body: they put their 50mm f1.8 on FF and see better results, because on the crop body it wasn't effectively an f1.8, it was an f3.6 wide open.
In conclusion, if you want to get the best results on a crop body you need very good, very fast glass, which is mostly unobtainable, and if available, more expensive, which defeats the purpose of having a crop body if you just want to save money. Shooting on a crop sensor can be an advantage if you want more depth of field: a 200mm f2.8 on MFT would give you the same depth of field as a 400mm f5.6 on FF, so you can achieve more depth of field with a smaller lens. In most cases less depth of field is beneficial, so most people will get FF, because achieving the same look on a crop body would require special lenses which are mostly not available.
In a nutshell, almost all these videos that compare "FF vs APS-C" on this topic don't talk about the most important point. Thank you for your comment, I was about to say the same.
No, you don't. It's a crop sensor; it's no different than if you took the picture on a full frame camera, then ignored all the outer pixels. The noise does not change simply because the sensor is bigger or smaller.
Dave, I started photography as a keen hobby about a year ago, and so much of what I have listened to on TH-cam gave me the impression that I was somehow at a disadvantage, with my Sony APS-C mirrorless camera, when compared to a Full Frame, with respect to exposure.
What you say in this video does make sense, but I am somewhat shocked that everything I have believed, as described in the previous paragraph, was such a myth. i.e. fundamentally the only key differences between different sensor sizes are Field of View, Depth of Field and Minimum focusing distance, when using similar lenses at identical settings of aperture, shutter speed and ISO.
This is such a gamechanger for me, so I really do not need a FULL FRAME camera body/sensor, unless I needed the increased Field of View and reduced Depth of Field that the FULL FRAME camera body/sensor will enable over an APS-C sensor/camera body. These virtues are things that are rarely needed at the level of photography I am at now. I was hell bent, prior to this video, on upgrading to a FULL FRAME DSLR - one of the Canon 5D's - whenever I had the money. But I must thank you cos you have saved me a whole lot of money. A whole lot, I mean a whole lot of money, and I can enjoy what I have already invested in. Thanks a million.
What shocks me is how come nobody else has shared this in such a simple, believable manner. If only all the knowledge in the world was discussed with such simplicity, clarity and conviction. I owe you a huge debt of gratitude for delivering me from the din of so-called experts who have touted the Full Frame mantra, to the enrichment of the camera and lens businesses, with people buying more than they need. Today is a very good day for me. Thanks.
I'm glad you found this helpful. Image noise is another factor to consider though: the same settings will create the same exposure, however larger sensors are less susceptible to noise when shooting at higher ISOs. So if you're finding your images are currently quite noisy then full frame could still be a consideration, but it wouldn't make your images any brighter.
@@DaveMcKeegan You are a Godsend, the world needs more people like you. Absolutely agree, yes, with a caveat: assuming that larger sensor sizes result in larger pixel sizes, yes, the noise is likely to be reduced in a larger sensor. But in a practical sense this is only a bother for those shooting in low light or at higher ISOs. For the vast majority of us - and I make the assumption that there are far more non-professionals with digital SLRs, mirrorless cameras or mobile phone cameras - as long as we stick within sensible ISOs, e.g. no more than about ISO 3000, with modern cameras we should be ok. Of course there are other comparative factors like the quality and technology of the sensor (backlighting, etc), and any in-camera processing that is done, but for those shooting RAW, as I do, you have much more flexibility to tailor your noise reduction to taste in a photo editor. But with all these huge megapixel counts on modern cameras from the last 10 years (typically over 12 megapixels), and as in the example in your video where several full frame cameras of similar sensor size have different pixel densities (and therefore different pixel sizes) - the question is: do the lower pixel densities, such as the 12 megapixel Sony full frame cameras, reduce noise in low light (high ISO) situations? Topic for another video response from you. Hint hint.
Here are some of my other videos which should hopefully answer your other questions:
Sensor size vs noise: th-cam.com/video/IhQMJhIP4ik/w-d-xo.html
Sensor size vs depth of field: th-cam.com/video/v_rTNxOIdoA/w-d-xo.html
It's not a brighter exposure, but it is more light. Same photon flux density over a larger area equals more photons. More photons means a better signal to noise ratio.
As demonstrated here, the common region has exactly the same light. Collecting more of the picture outside of that region does not improve the signal inside it. The relevant difference here is that the walls separating photosites may require less relative area with larger photosites.
The comparison has to be done on the same field of view. So compare a 50mm f/2 on APS-C with an 85mm f/2 on a full frame => same field. Then it's obvious that the 85mm gives way more photons to the sensor. It's a physical matter of front lens size:
50mm f/2 gives a 25mm equivalent lens diameter
85mm f/2 gives a 42.5mm equivalent lens diameter
So obviously way more light comes into the 85mm... Then of course if you compare two things which are not comparable, you get confused. Comparing 2 photos taken with APS-C and FF of different fields, you get exactly the same amount of light on the same field, but you've lost all the light around it, not gathered by the cropped sensor.
If you project the same image of an object at the same distance with the same absolute aperture, then the light entering the camera is the same but is distributed onto a larger sensor. The intensity is actually lower by the crop factor to the power of two.
I knew this from both my experience and theory (which is not complicated at all), but I encountered so many claims that bigger sensors are better in low light that I started doubting myself. And here you are with another great explanation, as always. Thanks! From the professional point of view, it's only about the noise, right?
Larger sensors are better in low light because they capture more total light, so have a better signal/noise ratio and so have less noise in the image - but the larger sensor doesn't produce a brighter exposure, because the same amount of light falls over each part of the sensor.
Same with the larger pixels of a lower resolution sensor - the large pixel captures more light so produces less noise than lots of little pixels, but the total light capture is the same so the brightness of the image remains the same.
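A rough numeric picture of that point, as a toy sketch with a made-up photon flux (it only illustrates the ratios, not real sensor counts):

```python
# Two hypothetical sensors of the same (full frame) area at the same exposure:
# the 12MP one gets twice the photons per pixel, but the total over the sensor
# (which is what the exposure reflects) is identical.

sensor_area_mm2 = 36 * 24          # same physical area for both sensors
photons_per_mm2 = 1_000_000        # made-up flux for some fixed exposure

for megapixels in (24, 12):
    pixels = megapixels * 1_000_000
    per_pixel = photons_per_mm2 * sensor_area_mm2 / pixels
    total = per_pixel * pixels     # always photons_per_mm2 * sensor_area_mm2
    print(f"{megapixels}MP: {per_pixel:.0f} photons/pixel, {total:.0f} total")
```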
@@DaveMcKeegan I hope I can memorize that this time, thanks. You know, Dave, I have a topic for an unrelated, but very interesting video, I think: How on Earth can lenses have f-number smaller than 1? I can't wrap my head around lenses having f/0.95, let alone f/0.7.
The short answer to your question is that the f-number is a ratio of focal length to entrance pupil size - so if both are the same size you get an aperture of f/1.0, and if the entrance pupil is wider than the focal length then you get an aperture ratio of less than 1.
But I'll definitely consider making this into a more in-depth video
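Written as arithmetic, for anyone who finds numbers easier (just the ratio described above, with illustrative pupil diameters):

```python
# f-number = focal length / entrance pupil diameter, so a value below 1 simply
# means the entrance pupil is wider than the focal length.

def f_number(focal_length_mm, pupil_diameter_mm):
    return focal_length_mm / pupil_diameter_mm

print(round(f_number(50, 50.0), 2))   # 1.0  -> f/1.0
print(round(f_number(50, 52.6), 2))   # 0.95 -> f/0.95
print(round(f_number(50, 71.4), 2))   # 0.7  -> f/0.7
```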
@@DaveMcKeegan Why, oh why did I think it's a ratio of the aperture diameter to the barrel of the lens diameter?.. So, it is absolutely possible. I also realize now that constant aperture zooms actually change the aperture hole size all the time, that's cool. Thank you for educating. I heard so many times what an f-number is, but didn't really understand what I heard.
Great explanation. But what about 50mm on Full Frame vs 35mm on APS-C, with the same aperture? In a real use scenario people won't crop full frame in post. They'll just get a tighter lens.
But... of the four pictures shown at 1:37, the lower right one IS in fact darker than the top left one... Your argument with cropping into the full frame or this water example doesn't even hold up. A full frame equivalent can of course "hold more water" because those bottles are placed in a much bigger area and each bottle by itself WILL collect more water. And if it doesn't matter which resolution you run on the same sized sensor, then why do smartphone manufacturers, for example, NEED to use pixel binning in order to get usable and bright pictures out of those 64 or 108MP sensors and downscale them to 12 or 16MP? It kinda doesn't make any sense if this was the case. Same for example with the iPhone 12. The 12 Pro Max has a 1/1.7" sensor with 12MP at f/1.6 and needs noticeably less time to capture the same picture / amount of light than the other iPhone 12, which uses a much smaller 1/2.55" sensor with the same 12MP and also f/1.6.
The centres of the 4 images are equal in exposure. The lower left appears darker because its wider field of view is seeing the darker corners of the room, but if you crop them to the same framing then they are equal.
Smartphones run pixel binning for 2 reasons. The first is down to marketing, because being able to advertise a camera as physically being 48MP is more appealing to people than saying it takes 12MP images. The second is because pixel binning downsamples noise: smartphone sensors are so small that even a 12MP sensor would suffer from noise very quickly, but by shooting at 48MP and downsampling you filter the noise down, which results in cleaner images - then if the phone detects you are in sufficient light it doesn't need to downsample, thus allowing you a much higher resolution.
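The downsampling point can be shown with a toy simulation (plain NumPy on an idealised flat patch with made-up noise; real phone pipelines are far more sophisticated):

```python
# Toy example: averaging 2x2 blocks of noisy pixels ("binning" 4 pixels into 1)
# keeps the signal level but cuts random noise roughly in half (sqrt of 4).

import numpy as np

rng = np.random.default_rng(0)
signal = 100.0                                          # "true" brightness of a flat patch
high_res = signal + rng.normal(0, 10, (2000, 2000))     # made-up noise sigma of 10

# Average each 2x2 block -> a quarter of the pixel count
binned = high_res.reshape(1000, 2, 1000, 2).mean(axis=(1, 3))

print(round(high_res.mean()), round(high_res.std(), 1))  # ~100, ~10.0
print(round(binned.mean()), round(binned.std(), 1))      # ~100, ~5.0 (same brightness, half the noise)
```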
I took a photo of 2 deer under a gloomy canopy; they were so grainy on my Pentax K-r they were unusable. Now, getting back into photography, I was eyeing up an EOS 200 or a used D5, one crop and one full frame. The full frame is supposed to still be good at around ISO 2000 while the EOS 200 is good at around half that. My question is: at the same zoomed image size and same pixel count, would I have more chance of low grain on the full frame that can use double the ISO?
That would depend on what your sensor is geared towards, if it's geared towards hypersensitivity, as in the a7S, then a full-frame sensor would do a better job than an equivalent-sensitivity APS-C or 4/3 sensor due to lower pixel density and having larger pixels and more light gathering area as a result.
Great explanation Dave, but I think I might be missing something? Isn't the ISO rating of any sensor decided by the manufacturer to provide a standard level of luminance? i.e. you'd never know if the smaller sensor is collecting less light, because they will rate, say, ISO 100 to be the same! What you're getting instead is more noise, or gain added to the circuit. You can think of it as: larger photosites don't produce more light... they produce less noise!!! One of the reasons larger photosites collect more light (less noise) is because they have fewer borders around each individual site. Using your tray/shot glass example... it's the little diamond shapes between each shot glass (assuming you've not used square shot glasses)... that add up to be significant. Why not compare the noise levels between your 24MP A7iii and the 12MP A7Siii to see if larger photosites can collect more light with exactly the same settings?
The ISO ratings are set out by an industry standard called ISO 12232 - I've got a copy of it here you can download and read through if you're bored :D drive.google.com/drive/folders/17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6?usp=sharing
But basically it lays out how any camera must be put into a standardised, controlled situation - pre-determined lighting, shot at an 18% grey card - to measure the exposure, which likely determines the base ISO; the ISO speed ratings are then that base ISO signal being amplified. It'll be like the fuel economy measurements of cars: real world usage will likely vary from the lab figures, and that will vary from camera to camera, but there is certainly no correlation between pixel size and exposure level.
@@DaveMcKeegan Not boring at all... I know that document well... 😂 The interesting sections of that document are actually 6, 7 and 8 (the bit you have to pay for)... they describe the method of determining the Exposure Index and ISO ratings for any digital sensor. The key thing here is the ISO definition of what is acceptable quality... particularly when we're looking at the top end rather than base. Obviously, we need a standardised, controlled situation to ascertain the exact point at which the manufacturer can determine the ISO in order to provide the correct illumination in our image. As well as trying to rate sensors at their point of maximum dynamic range (base ISO), we also need to rate exposure prior to any application of gamma curves or compression... because these can also affect the impression of brightness or acceptable noise. You're absolutely correct to say there's no correlation between pixel size and exposure level... but my point was, you're measuring exposure level with ISO, which we agree has been standardised to match, as above. What if we measure acceptable exposure with noise levels? Clearly, at ISO 100 we're not going to see much of a difference... let's try ISO 64,000 or ISO 128,000. At this point, there is a significant difference in the signal to noise levels between the 24MP A7iii and the 12MP A7Siii. Where has this extra signal come from?
It's not so much where the signal is coming from but rather where the noise is coming from. With fewer pixels there is less circuitry, and this in turn means less interference being mixed into the signal.
So it's not that lower resolution sensors capture more light but rather that they introduce less noise into the same signal - you can then amplify the signal further before the noise level reaches a particular threshold.
@@DaveMcKeegan Thank you for taking the time to reply Dave... I love these discussions because they make me stop and think again. 👍 I believe we're talking about the same thing? I know where the noise is coming from... it's the electronic amplification or gain applied to the signal to achieve the desired ISO rating. We differ because you say the loss of signal is due to extra circuitry... I think it's because we're using larger photosites (buckets) to collect the photons in!!! Maybe we'll have to disagree, but I'm grateful because you've given me an idea for another video... maybe we should do a collab??? 🤔
I think we're on roughly the same page - each pixel will be larger and collect more light individually, but there are fewer of them, so the total signal collected will end up being the same, while the noise interference will differ, which in turn produces a different signal to noise ratio. Certainly something we could potentially collaborate on
Thank you, this is the explanation that I needed to hear. I have low light issues with my A6500, so I will look for a big aperture lens. I'm not sure that I need to change the body to a full frame sensor.
How do you measure exposure? The FF image is clearly different in the middle, but I can't tell if it's brighter or if it's just because of the colours. A 100% zoom comparison would have been better.
There can be some differences in how cameras process different colours, plus the samples were shot in JPEG. Here is a folder with 2 other samples - these ones are the original raw files plus 2 JPEGs created from the raw files, in which I cropped the FF down to the size of the APS-C for a direct comparison. drive.google.com/drive/folders/17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6?usp=sharing
I figured that was the case. The amount of light focused onto the pixel is determined by T-stops, no? (Pretty much f-stops.) A wider frame collects more light over more pixels at a ratio of 1:1, therefore no brighter exposure; each pixel gets the same light. (A larger sensor may have more vignetting.)
No, the f2.8 is the same across all sensors, however the background blur changes: since the sensor is a smaller area, a smaller vision area needs coverage, creating something called compression, which is why full frame lenses need to be larger than those for four thirds sensors, which are 2 times smaller in area, hence the 2 times crop factor. But a micro four thirds lens has a higher compression rate compared to full frame glass - the light is concentrated into a smaller circle. It's like moving a projector closer to the wall: you get a smaller image and it's sharper; when you move the projector further away, the image gets bigger but it also becomes more blurry. Lenses work in the same sense, except they do not lose sharpness like that, nor brightness for that matter.
Yes, absolutely it does, it is just basic mathematics. That's how you convert between them: multiply focal length by crop factor, multiply aperture by crop factor, multiply ISO by crop factor ** 2 - people forget to apply the crop factor calculation to ISO and then are surprised why the exposure looks the same, doh! - 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900 in every way, exposure, noise level, depth of field, etc.
As you said Dave, the full frame lens will cover more area than the APS-C sensor, but the exposure brightness has to do simply with energy - that is to say, the amount of light over area. The centre part of the full frame sensor will not be brighter if you put an APS-C sensor in its place instead. You'll change your field of view with the smaller sensor, but it'll be equally as bright. However, since the photosites are smaller - assuming a same-generation APS-C sensor as the full frame one - they will be saturated with fewer photons each, having a lower count of photons needed to go from total black to total white. This would give you fewer discrete steps between the two extremes, resulting in a worse signal to noise ratio in the smaller sensor, assuming of course everything else in the electronics to be equal.
You are talking about noise performance, which is irrelevant to exposure; it's simply how clean the image is. As you say, the pixels are smaller and therefore they "fill" faster with light than a larger pixel would, which gives less information to go by, hence the noise. Which is why 6 megapixel Super 35 sensors in video cameras have just as good noise performance as a full frame sensor despite being APS-C sized sensors.
Then a simple question: if the light gathering is equivalent, why do larger sensors give more information in the shadows? You can shoot bushes in the shade on Micro Four Thirds and on full frame, and you'll see that on full frame there will be information in the shadows, while on MFT it's mush.
@@magicdmode I think the light gathering is equivalent; the difference is in pixel size, pixel count and the sensor's dynamic range. The light gathering is equivalent - the problem is that the image is projected onto a smaller area with smaller pixels; the GH5s is proof that MFT can do it. Another issue is that making optics for MFT of equivalent quality to full frame is practically impossible - an optically perfect lens will always be big. And that's why everyone went to full frame, because the optics ran up against size limits. And if you crammed in the technology from the C70, the dynamic range would be practically the same as APS-C. It's not about the size, it's about the optics. Another question is how full frame optics pair up with speedboosters...
Thanks so much, this finally explains something that never made any sense to me. One thing I still don't understand though: in theory, shouldn't a crop-sensor-designed lens that doesn't have light spilling off the sides of the sensor accomplish the same thing as a speed booster, by narrowing the light down so it all hits the sensor, allowing more light to hit the sensor? In that situation I would think a crop sensor lens should outperform a full frame lens on a crop sensor camera in terms of how much light is captured.
Glad to have a clear explanation that would have helped me a lot when I was getting into photography :) I think that the noise performance you bring up only briefly at the end deserves more consideration though - you can effectively get a brighter exposure by bumping the ISO, but if you are limited by noise then FF is effectively brighter.
Yes, this video would blow up if it had more on what you gain with a larger sensor, so people could make decisions between the f-numbers and the sensor sizes. I still don't know.
Actually no, you get the same noise on APS-C and Full Frame, you are just forgetting to apply the crop factor to ISO. A 23mm F1.4 lens on APS-C is equivalent to a 35mm F2.1 lens on Full Frame, but if you're shooting at ISO400 on APS-C, you MUST apply the calculation to ISO too; most people apply it to focal length and aperture but forget to apply it to the ISO - that's a fundamental mistake. To get the same light captured (and noise, etc), ISO400 on APS-C becomes ISO900 on Full Frame (ISO * crop factor**2).
Nah, this is wrong - you even basically admitted it when you said that the noise level was going to be different between the two. ISO 125 on the APS-C camera is not the same as ISO 125 on the full frame camera. ISO 100 on both cameras will indeed give you the same exposure, but that's because it's a fake number - to get the same noise you'd have to bump the full frame camera to ~ISO 150, and that's the true equivalent on full frame. At that point you have the same amount of noise, but a higher exposure on the full frame - just because the ISO is the same on both cameras does not mean it performs the same.
I guess the other way to consider it is this: ISO is standardised by exposure, but one could argue it should be standardised by noise levels. Consider that if I wanted to make a photo at really high ISOs, I'd be limited by the smaller sensor's ISO filling my image with noise, where a larger sensor would not (all other things equal). Again, it's all a fake number, but standardising ISO by noise makes more sense because then we'd be able to compare the usable performance of cameras by their ISO. As it currently stands, ISO 3200 is unusable on some cameras and still exceedingly clean on others.
The TL;DR argument here is speedboosters (which you also mentioned). If exposure on a full frame vs an APS-C sensor is the same, then a speedbooster is literally creating light out of nowhere. You can add a stop, or so, of light to your APS-C sensor just by putting the speedbooster between a full frame lens and the sensor, so now you've magically made an f2 lens into an f1.4 lens with the same viewing angle as it would have on full frame. If a smaller sensor did not need more light for the same performance, you've just brought in additional light out of nowhere - which is impossible.
Exposure and noise are 2 completely different things - exposure is measuring luminance whilst noise is interference. I never claimed different sensor sizes perform the same in terms of noise (the exact opposite in fact, I said larger photosites give you a cleaner signal and less noise).
While a larger sensor captures a greater volume of light, that's over a larger space, so the amount of light still remains the same per square unit of sensor space and so you still get the same exposure. ISO isn't a direct measurement of exposure but rather a measure of amplification, but changing it to a measurement of noise wouldn't work, as you don't change the ISO setting in order to add more noise into your image. Noise performance differs wildly between cameras - modern APS-C cameras have noise performance similar to old full frames.
As for speedboosters, they don't get light from nowhere, they act as a funnel and take the extra light from a full frame lens and pack it into a smaller space; that increases the volume per unit of space and thus increases the exposure.
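The focal reducer arithmetic behind that last paragraph, as a quick sketch (0.71x is the typical reduction factor of common speedboosters, and the optics are treated as lossless, which real glass isn't):

```python
# A speedbooster squeezes the full frame lens's image circle onto the smaller
# sensor: shrinking the image by 0.71x packs the same light into roughly half
# the area (about one stop brighter) and shortens the effective focal length.

import math

def with_speedbooster(focal_mm, f_number, reduction=0.71):
    eff_focal = focal_mm * reduction               # 50mm behaves like ~35mm
    eff_fstop = f_number * reduction               # f/2 behaves like ~f/1.4
    gain_stops = math.log2(1 / reduction ** 2)     # light concentration, in stops
    return round(eff_focal, 1), round(eff_fstop, 2), round(gain_stops, 2)

print(with_speedbooster(50, 2.0))   # (35.5, 1.42, 0.99)
```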
@@DaveMcKeegan When it comes to ISO, noise is inversely proportional to exposure. ISO is standardised so you can use a light meter to know your exposure, so of course if you set the ISO the same you'll get the same, when the actual result is that the gain has already been pumped more on the smaller sensor to achieve the same result (hence introducing noise, and why noise is inversely proportional to exposure when gained by ISO).
The best analogue example is with sound recording. You can increase the quality of a sound recording by either making the sound louder or moving the microphone closer. Alternatively you could pump the gain/volume, which does the same thing but introduces noise. What you're saying is like comparing two audio interfaces and saying that, with no gain, both produce the same volume, while completely ignoring the fact that (in this hypothetical example) the less optimised interface is already introducing noise because it has a base gain to compensate for its inefficiency.
But that's an argument of noise performance and the quality of the equipment. If you have a sensor with backside illumination and gapless microlenses then it's able to capture more light than a sensor of the same size / resolution without them, but on equal terms - 2 sensors with the same architecture, having larger pixels captures more light per individual pixel but the light per unit area remains the same and having a larger sensor captures more total light but over a larger area, so again the light per unit area is the same.
Yeah I don’t understand the insistence of some people to not treat noise level as a quantitative metric. Noise can be measured and equalized to compare exposure. But we tend to not do that and just handwave noise level as some qualitative metric of how good a sensor is intrinsically or something. The worse sensor is just boosting its signal to get an equivalent exposure at the cost of more noise.
I never really conflated sensor size with exposure, simply low light performance. This harkens back to the days of film. If you shoot in daylight or studio light, slow film is fine; otherwise, you need a big flash and fast film (and the inherent problems both of these cause).
Great video, underrated by the YT algorithm, however most people will need more practical answers for their choices, or more on what the trade-offs and benefits of a larger sensor are in comparison. For example, I have an old Lumix LX7 with a 1/1.7-inch sensor, 12.1 megapixels but an excellent f/1.4, and I usually shoot low light party videos. Would I benefit from a larger sensor BUT darker aperture, for example f/2.8 but a 1-inch sensor (Sony RX100 VII), or would I have to go for f/1.4 and APS-C (Sony e10l with the Sigma 16 lens) to see a difference? I mean, for example, better quality for crops with still good low light video performance.
Good video, agreed on most of it, just one thing about the pixel pitch. Larger photosites do capture more light, and need less amplification to get to (for example) ISO 100, as ISO is not directly translatable to amplification but is more of a target for how bright the end image should be. Sounds off, so let me try to explain.
Take 2 pixels: a large one with a size of 1, and a small pixel which is 1/4th the size of the large pixel (0.25). Keep in mind that the large pixel could also be on an APS-C sized sensor, therefore making this purely about pixel size, not sensor size. If you throw the correct amount of light at both pixels, the large pixel will collect a number of photons (X); because the smaller pixel is 1/4th of that size, it will also collect 1/4th of the photons.
Here is where the pre-amplification step comes in, which is decided by pixel pitch and maybe some other factors in the sensor (maybe this sensor lets less light through due to inefficient filters etc.), to get to a common number. The small pixel's charge would have to be amplified more than the large pixel's, due to it capturing less light - the amplification being roughly four times that of the larger pixel. Also, if you had 4 small pixels you'd still need to amplify the signal more, because every individual pixel needs to get to that common number. When you get to that common number you can start to amplify to get close to the ISO standard. Therefore comparing shots by just ISO value is kind of incorrect, technically speaking.
HOWEVER, this is just about the size of the pixels, which does not have ANYTHING to do with sensor size. The only way to see this amplification in your images is the difference in "noise", which could be caused by many things, but a large factor would be the amplification due to pixel pitch.
Forgot to mention: thanks for trying to spread correct information around in a way a lot of people can understand. It's a good thing for the community as a whole!
Thank you for such extensive input, however if I'm understanding your statements right, you are suggesting that a pixel 1/4 the size of another pixel would need to be amplified to the same level as the larger pixel? However, if that's the case - given that there would be 4x more of the smaller pixels on an equal size sensor, and the total amount of light hitting the sensor remains the same - then your total photons detected would be the same for the entire sensor. So surely if the pre-amps were amplifying each individual photosite up to match a larger pixel, the total sensor signal would become 4x larger and so would actually cause a brighter exposure?
@@DaveMcKeegan Not exactly, it wouldn't cause an exposure that is 4 times as bright. You would have 4 times the amplification, yes, but the pixels on an individual level will have the same exposure as the bigger pixel with the "normal" amplification. This is because an individual pixel does not care about what the pixels around it capture (putting Bayer array colour processing aside).
I'm struggling to find a way to express the idea, but I think a way to explain it would be with water in a pool, which would be quite similar to the shot glasses and basket, in a way. However, this time it's just about contents and not size, as computers don't care about how big a pixel is. Let's take a pool that has 100 litres of water in it. It can hold more than 100 litres, but it would become too deep. It can also hold less than 100 litres, but at that point it becomes harder to swim in. If you divide the water of that pool into 4 smaller pools you get 25 litres each, which is too little water to swim in. The total amount of water is still the same, but individually the pools do not hold enough water. If you were to add in the "amplification" they would all be at 100 litres. In total there would be 400 litres, but individually you can swim just as well in all of them.
This would work the same for image data (putting the Bayer array aside, again): per pixel there would be less of a charge to work with if it was not amplified, resulting in a darker image for your digital interpreter (editor, camera itself, whatever reads and shows the image) to work with.
Also, as I might not have explained it well enough (which is probably the case), here is some food for thought: if the only thing that matters in an exposure is the total amount of light on the sensor, why are there some pixels that are blown out while others are not? I hope I was able to get my idea across; if not, I'm always happy to try again! Thanks again for not writing me off and reading my explanation, and have a good day!
@@ToniTechBoi I'm not entirely sure the swimming pool analogy works. If you had, say, a 10x10 pool with 100 litres, then the water would come up to a certain level in the pool. If you then replaced the 10x10 pool with 4 5x5 pools arranged in a square and shared the water out between them, each pool would only have 25 litres, but it would still be the same total volume in the same total area, and so each pool's water level would be the same as the original 10x10 pool.
Blown highlights occur when a pixel overflows with light, but you still then have to factor in that the smaller pixel is looking at a smaller part of the scene. So sticking with the swimming pools: if you put 80 litres of water into a 100 litre capacity pool then the pool is 4/5 full so doesn't overflow (highlights wouldn't be blown); if you tried putting 80 litres into a 25 litre pool then it would overflow, but you'd have to keep the area the same, so you'd again need 4x 25 litre pools to cover the area of the water source, and then you'd end up with each pool having 20 litres of water, so each is still 4/5 full.
If smaller pixels needed to be amplified to produce the same exposure as larger pixels, then manufacturers could simply not include the pre-amp step and allow smaller-pixel cameras to record the original signal, which would basically mean having a much lower native ISO.
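The pool numbers above, written out as a sketch (this assumes a pixel's full-well capacity scales with its area, which is only roughly true of real sensors):

```python
# One big pixel vs a quarter-sized pixel under the same illumination:
# each small pixel catches 1/4 of the photons but can also only hold about 1/4
# as many, so every pixel ends up filled to the same fraction of its capacity.

big_full_well = 100_000            # made-up electron capacity of the big pixel
photons_on_big = 80_000            # an exposure that fills it 4/5 of the way

small_full_well = big_full_well / 4
photons_per_small = photons_on_big / 4

print(photons_on_big / big_full_well)        # 0.8 -> big pixel is 4/5 full
print(photons_per_small / small_full_well)   # 0.8 -> each small pixel is also 4/5 full
```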
@@DaveMcKeegan First off, I excluded the size of the pools, as that is not important to what is happening to the individual voltage outputs. What I tried to explain is that the voltage output of that individual pixel is lower than the bigger pixel's. Sure, the 4 pixels combined may have the same voltage, however, you need to display them individually, therefore the voltage level needs to be amplified.
Edit: I just thought about it - if you include the size of the pools again and compare water levels, then you're only seeing whether the amount of "water" captured measures up to its area, but the amount of water in that individual pool is still less, therefore it needs to be amplified. I do agree the pool analogy was a really bad one, as sizes in pools are kinda... well... necessary.
For the second point, what I meant to reach towards is that a single pixel may be overblown, but the rest of the pixels are not affected by the voltage output of that individual pixel.
On the third point, it may not be a pre-amp step, which would be impractical to add as just another step; it could also be achieved by the amplification that is already going on for ISO. That would also explain why they don't show the lower ISO values - they would become something more akin to extended ISO values, I believe. Kind of the same reason some cameras have dual native ISO - they probably have their native ISO already set higher up. It may be the case, may also not be; this is the first time I've actually thought about this one. I think I included the pre-amp step to split the system up, making it simpler to go through, can't remember when though...
Thanks for the video. I think to compare the amount of light we need to match the focal length (mechanically) from APS-C to Full Frame. If we consider brightness as the amount of light (I see it sometimes confused with / defined as light perception), then the left photo still has more light, due to more light information being picked up from the sides. In this context it is still completely right to assume that a LARGER SENSOR gives you more LIGHT (I won't dare say BRIGHTER). Also, I have barely ever heard anyone saying that a LARGE sensor gives a BRIGHTER EXPOSURE; instead it's usually taught that it gives more LIGHT.
I've come across quite a few people who thought it gave a brighter exposure - I suspect they have fallen into the misconception of thinking that because larger apertures and slower shutter speeds mean more light gathered and a brighter exposure, a larger sensor gathering more light must mean the same thing.
You can debate different types of equivalents all day. Only thing that matters is that pictures look better on a full frame or APSC than all the smaller stuff at the end of the day.
I would like to start by saying that I think your content is great and you are exceptional at explaining things. Rusty is awesome as well. Thank you! Now, I feel that I need some help with this one. I agree with everything you've said; however, at the 7 minute mark, you have the A7III and a6400 images displayed next to each other showing roughly the same image. In the A7III image, the bricks appear brighter than in the a6400 image. Is this simply my old eyes playing games with me or something along those lines, or is there some other explanation or something I'm not understanding? You and your channel have really helped me take my photography to the next level. Thank you and keep up the great content!
Thank you for the kind words. In regards to your point about the test images, those were a bit of a schoolboy error on my part, as I shot the images in JPEG straight in camera, which baked the picture profiles into the images, and there can be variations in how the cameras process different colours.
To clear it up, here are some different test shots done in raw (the original raw files plus JPEG versions which were created on the computer from the raw files, with the FF cropped down to APS-C), which you're free to download and inspect. drive.google.com/folderview?id=17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6
I have been told that ISO is not part of exposure and that it is applied after the shot has been taken, and therefore there is no exposure triangle. So this would mean that all cameras have the same base ISO and that some sensors are better at dealing with signal to noise ratio than others. Have I been misinformed?
I would say that's wrong. Firstly, cameras don't all have the same base ISO, because Nikon's base is usually 64 rather than 100, although cameras do certainly differ between each other as to how well they handle noise.
In certain cameras the ISO is never baked into the file but rather just kept as a side note so it can be freely changed later; however, with most cameras the ISO is baked into the file, so if you overdo your ISO in camera and blow out the highlights then they are permanently gone - this would make ISO part of the exposure triangle.
@@DaveMcKeegan Ok. Pal2tech and The Angry Photographer claim ISO has nothing to do with exposure. Many others agree and say that the exposure triangle is just a way to explain how exposure works. Anyway, it's not a big deal.
ISO doesn't change the exposure of the sensor but does impact the exposure of an image. You'd never have someone line up a correct exposure at ISO 100, then ramp it up by 5 stops and argue that it's not over-exposed because the sensor is correctly exposed 🤣
E.g. the reason why the a7S line of cameras uses only a 12MP sensor, but the pixels are relatively large compared to those on, say, the a7III/a9 with their similar 24MP - as I believe the sensors in the a7III, a9 and a7S cameras are all the same size - and it delivers great low light performance. Correct me if I am wrong Dave.
The sensors are all the same size across the A7's/A9/A1 - so for any given exposure, all of them will collect near enough the same amount of light as each other.
Where the fewer, larger pixels of the S come in is that each individual pixel collects more light and produces less noise, but in having fewer of those pixels the total amount of light gathered across the sensor remains the same and thus the exposures don't change.
Nobody said it gives you a Brighter Exposure, but an overall larger amount of light - that is exactly the reason for cleaner low light images. So for the same clean image with a small sensor you must double the exposure and reduce the ISO, so nothing is wrong or sensational.
I beg to differ, I've had multiple people in the past try to argue that larger sensors actually give you a brighter photo. There are even comments on this very video trying to argue that very fact.
23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. If you forget to apply the crop factor calculation to any one of focal length, aperture and ISO, you will get a drastically different image, it could be different depth of field, or different amounts of noise, or a different field of view. If you want to compare "same settings" you need to apply the calculation to all 3.
I shoot APS-C. If I had a nickel for every person that I ran into that said "Oh, I shoot full frame cuz it lets in more light" I wouldn't have to work any more. On one occasion, the guy was so adamant, I whipped out my vintage Toshiba linear light meter and asked him to point to the sensor size input. He was like "huh?". Exactly!! Exposure is measured per unit area, NOT total. He left unconvinced, but I sure felt better. Why is this concept so hard for people to understand?!!! Great explanation Dave! If my dog was looking over my shoulder, she would have been licking and nibbling my ear the whole time LOL!
Those people are completely correct. Of course the bigger sensor captures more light in total if the light intensity is the same per area unit. (The exposure is the same, because that's just how the ISO has been determined.)
Indeed, per unit area, that's right! - larger area, more light, doh! :) - but it's irrelevant, you just need to apply a calculation: 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900 on full frame - as long as you compare like for like you'll get the same results
Thank you for this. What you are saying is what I thought. And professional photographers are saying that a full frame sensor makes a brighter exposure than apsc.
Intensity and exposure are the same, but the globally exposed area is bigger, so the perceived brightness is higher. The exposure is the same, yes, but the larger surface allows more photons from the subject to be collected - more information, less noise. That's why smartphones are far more noisy. Look at DxOMark values: the noise varies directly with sensor dimensions for the same exposure. Large format film cameras are used partially for this reason. Exposure is not everything.
I think this notion arises from the fact that a lower resolution sensor of the SAME size has more light falling on any given pixel. For some reason people misconstrue this as meaning the exposure ends up being different when in fact it does not. The only time this situation produces a 'different' exposure is under extremely low-light situations, and then you have to ignore the fact you also have a lower resolution image as a result. But it is why my 11MP full-frame Canon 1Ds is better for infrared and astrophotography than my 17MP APS-C Canon T100.
It does give you a brighter exposure if you adjust for framing (as in use equivalent focal lengths but the same aperture (not accounting for lens specific light transmission)). Less noise and a brighter exposure are the same exact thing
If you are still happy with the pixel count you have left, that is. This is why many feel the A7R4 and now the A1 are probably Sony's best 'crop' cameras, since if you put them in crop mode you still get 26MP and 21MP without an AA filter. The A7M3 talked about in this example would only have about 10MP left in crop mode. But for the rest you are correct...
This is a second post - could you please assist us noobies? I am sure now that exposure is not affected by sensor size - got that. But I now have to throw out all that I have previously assumed. I am sure that Field of View is influenced by sensor size, no contest about that. The questions are though:
1. Does sensor size influence Depth of Field in any way, if we compare the cropped area of a larger sensor's image with the full image taken on a smaller sensor (assuming focal length, aperture, shutter speed and ISO remain identical)? My initial thoughts are: no - if the image capture area is matched, the Depth of Field should remain identical, i.e. the only parameters changing Depth of Field would be Focal Length and Aperture, and in a strange roundabout way the smaller sensor has an advantage when blurry backgrounds are preferred, cos it has a longer effective focal length, from the manner in which it increases effective focal length on similar lenses. Just my thoughts as a non-expert; happy to be corrected if I am wrong.
2. Does sensor size influence Minimum focusing distance, from the perspective of the focus within the cropped area of the image taken with the larger sensor that matches the image taken with the smaller sensor? From your video, my answer would also be no - the minimum focusing distance of a larger sensor is longer only because, in order to take a photo of a larger field of view with a similar level of focus across the whole image, it would take a longer distance from the subject. But if one was only concerned about focus within the cropped portion of a larger sensor's image, the focusing distance would be identical to that of the smaller sensor.
One way to think of this is: if one has a Full Frame sensor, and an APS-C lens is mounted, and the camera switches to APS-C mode (i.e. not using the whole sensor to capture the image, but only the APS-C portion of the sensor), all aspects of photography would be 100% identical to that of an APS-C sensor body. But seriously, could it be so simple?
Dear Dave, please review these assertions and conclusions and kindly respond, and make a video or a set of videos on TH-cam addressing how sensor size affects (or does not affect) Depth of Field and Minimum focusing distance. I think these would be very popular videos that many people would like to understand much better. The advantage you have is that you have the gear to test, compare and prove the thinking. Some of us have only one camera, and we do not trust the manufacturers, so we are relying on "free" thinkers like you who have the gear to test these assertions and enlighten us further. I look forward to your response, and these videos.
This video is so wrong on every level. 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. ISO125 on APS-C is ISO280 on Full Frame - ISO * (crop factor**2). You cannot apply the crop factor calculation to just focal length and then ignore aperture and ISO, it doesn't work like that! Once you apply the crop factor calculation to all 3, you'll see an identical image. The image he is showing either has a different field of view, depth of field, or amount of noise. This isn't about ISO performance, it is just that ISO125 on APS-C is identical to ISO280 on FF, in every sense of the word - same amplification, same light gathered, same noise, etc. You have to set an equivalent ISO to get a similar image, or you're not comparing "same settings".
Well... I disagree. And Dave is getting more attention with his debunking of flat earthers. I wish him the best, he is a very nice person, and what a lovely dog, isn't it!
Anyone who has ever used a handheld light meter before gets this stuff. I have several analog meters in addition to a now rather old Minolta IV digital light meter, and not a single one of them asks what format of film or sensor I am using. Hmmmmm....
How do you know the exposure is the same? Have you compared the histograms and waveforms, or are you telling that by eye? 5:40. You do not have the same amount of light spread over the larger area. More light hits the FF sensor compared to the APS-C sensor. Since the diameter of the entrance pupil is the same in both cases when you used the FF lens, most of the light hits the FF sensor (some doesn't, since the iris is circular and the sensor is a rectangle). More light missed the APS-C sensor, hitting the area around the sensor (light is not magically bent inside the lens if you connect it to an APS-C body). Also, comparing different crops makes no sense; the crop should be the same, so you should use different focal lengths for each sensor size to match the framing. My guess is you had multi metering mode on both cameras, so the exposure is somewhat an average of the whole picture, which is different for different crops... it seems like the whole method used here is all wrong. And I shoot MFT and APS-C, so I'm not a FF fanboy.
I didn't say there was the same light over a larger area, I said a larger sensor captures more total light but because it's over a larger area then the amount of light falling on each unit area of the sensor remains the same and thus the exposures are the same. I also shot at different focal lengths to create the same framing (35mm on FF Vs 23mm on APS-C) - but then focal length doesn't change exposure otherwise a constant aperture zoom lens would change exposure as you zoom in. And metering modes didn't come into play because I shot fully manual and dialed in the exact same settings to both cameras
The relevant factor at hand for purposes of exposure is the intensity of light per given area, not the total amount. The FF sensor can capture a larger area of the projected circle from the lens, but the intensity per mm remains the same. The result is no different to cropping the FF image in post to match the image from the APS-C camera. A shorter focal length (at a given aperture diameter in mm) does yield greater intensity of light. This is because light radiates from many angles, and a broader field of view better captures the scattered light (the whole reason the aperture value is expressed as a quotient, as we can use it to gauge exposure across different focal lengths). Once past the lens, however, the light is relatively uniform. Sensor size cannot alter the intensity of the light. The benefits gained from full frame cameras are an easing of design restrictions in lenses, as well as signal to noise ratio and dynamic range. Smaller, more densely packed sensors require sharper lenses to utilise properly, and are more easily diffraction limited at small apertures.
@@DaveMcKeegan Why would it? - you are getting a totally different image in terms of depth of field there. Try to compare identical images, you will see that 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. This isn't about ISO performance, it is just ISO125 on APS-C is identical to ISO280 on FF, in every sense of the word, same amplification, same light gathered, same noise, etc - you have to set an equivalent ISO to get a similar image, or you're not comparing "same settings".
@@DigiDriftZone The noise is cleaner when comparing similar resolutions because the pixel density of the full frame is lower. If you take DPReview's image comparison tool and compare an APS-C like the A6700 to a full frame like the A7RV, you see the noise levels look the same at the same ISO levels, because the pixel densities are about the same.
Take an A7RV in crop mode alongside an A6700, using the same lens on both with the same settings, and you'll get the same looking results - same exposure and same noise.
@@DaveMcKeegan Surprisingly pixel density doesn’t seem to matter. Take an A7S III and A7 IV, vastly different pixel density, but the noise is similar, it looks slightly different but in terms of signal to noise, they are almost identical. As for same settings, 23mm f/2.8 at ISO400 on APS-C is 35mm f/4.2 ISO900 on full frame. This will give you the same field of view, depth of field, exposure, noise, etc. The images you are comparing are largely different images, nobody can look at them and say that’s an identical image, the above will give you an actual (near enough) identical image, try it. OK, so we’re in agreement, your A7R V when cropped to APS-C size has 1/3 less image, meaning 1/3 less light, correct? - If your exposure stays the same (which of course it does), that means you’ve just lost 1/3 of your captured light by simply only using your 2/3 of your full frame sensor. If you used the whole sensor, you’d get 1.5x more light captured. This applies across the board. If you switch from a 23mm f1.8 lens to a 35mm f1.8, the exposure is the same, but the amount of light the 35mm lens captures is 1.5x greater. In all of this, I think the biggest mistake is you're just forgetting to apply the crop factor calculation to the ISO (it is ISO * cropfactor**2).
The FF image is clearly a bit more exposed than the APS-C image. Also, a larger signal-to-noise ratio is functionally equivalent to a larger signal. All signals of light are interpreted by the camera, not given to us as some absolute value reading. ISO settings are used to manipulate how that signal-to-noise ratio is interpreted. ISO values are not absolutes like shutter speed; they are relative to the sensor. One sensor at ISO 100 isn't the same as another sensor at ISO 100. A sensor with a better signal-to-noise ratio will have a brighter image at ISO 100 because the camera can afford to interpret the signal as a brighter image, since the proportion of noise is lower.
Here are some downloadable RAW files - a full frame, an APS-C & a full frame in APS-C crop mode. All settings identical and all exposures are equal: drive.google.com/folderview?id=17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6
And if signal to noise ratios were a factor, then that would mean lower resolution full frames like the A7SIII would expose brighter than higher resolutions like the A7RIV
@@DaveMcKeegan The A7SIII and A7RIV are both FF sensors. A single pixel on the S has a better signal-to-noise ratio than a single pixel on the R. That is why ISO noise shows up faster on the R when it is cranked up and zoomed in, but an equivalent region of the sensor will have a more similar signal-to-noise ratio between the two. The R will have more area dedicated to pixel boundaries that are not collecting light, and the two sensors can differ in other ways for performance.
There is a trade-off between higher pixel count and larger photosites: the amount of distinct pieces of information versus the amount of confidence in each piece of information. The S camera has fewer pieces of distinct information - fewer pixels. The R camera has less confidence in any particular piece of information - a worse signal-to-noise ratio per pixel. At full image size, the greater number of distinct pieces of information helps to offset the relative lack of confidence per pixel, but when you crop in to get similar pixel density, this lack of confidence shows up as greater noise.
But all of this is separate from the issue of better signal-to-noise being functionally identical to a brighter image - we've created a new variable by significantly increasing the pixel count. If you have two sensors identical in all ways, except sensor A has better signal/noise capture than sensor B, then a camera maker will (or at least a smart camera maker should) make sensor A produce a brighter image than sensor B at the same ISO. This is because ISO specs are determined relative to the sensor, and sensor A can label "ISO 100" as a more amplified image signal without having more noise than sensor B.
You're saying the R & S create equal exposures because despite different sized pixels, an equal portion of the sensors both see the same amount of light ... Which means an APS-C sized portion of a full frame must expose the same as an APS-C because they are seeing the same area of scene.
@@DaveMcKeegan A simpler way to see that a larger signal-to-noise ratio is functionally equivalent to a larger signal is to do some simple maths. Assume higher numbers mean a brighter pixel. I'll admit I'm not doing any real signal-processing maths - things could be squared or square-rooted in real signal processing to increase or decrease the significance - but there will definitely be an improvement by some delta value. Assume that a noise value at or below 5 is great, at or below 10 is okay, and above 15 is bad. One sensor gets a signal/noise ratio of 10/1 and another gets a signal/noise ratio of 25/5. The first sensor captures less signal than the second, but the first sensor can amplify its 10/1 to a signal of 50/5, creating a brighter image while still maintaining the same amount of noise as the second sensor.
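The same arithmetic in a couple of lines (the signal and noise values are the commenter's illustrative numbers, not measurements) - amplifying signal and noise together preserves the ratio:

```python
# Illustrative numbers from the comment above, not measurements.
signal_a, noise_a = 10, 1    # sensor with the better signal-to-noise ratio
signal_b, noise_b = 25, 5    # sensor that captured more raw signal

gain = 5
print(signal_a * gain, noise_a * gain)          # 50 5 - brighter, noise still only at 5
print(signal_a / noise_a, signal_b / noise_b)   # 10.0 vs 5.0 - the ratio is what amplification preserves
```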
@@GoodFun_HaveLuck But signal-to-noise ratio has nothing to do with exposure. The SIII has a better signal-to-noise ratio than an RIV because each unit area of pixels captures the same amount of light but in fewer pixels, which produces less noise and so a better S/N ratio - but the exposure remains the same because the total amount of light captured is still the same.
The same rule applies comparing APS-C to full frame. The A7RIV has slightly smaller pixels than an A6600, as its APS-C crop mode is 26MP vs the A6600's 24MP. But if you shoot an A7RIV in crop mode alongside an A6600, with the same lens & settings, they will produce the same exposure. If you then switch the A7RIV into full frame mode, the centre Super 35 section still continues to expose the same as before - it's just now seeing more of the scene. So the total light capture increases, but the light captured in the middle doesn't change, so the exposure doesn't change. Meaning a full frame exposes the same as an APS-C.
Dave, "just" perfect once again! There might be an additional thing that usually generates some discussion: bokeh. On FF, using the same aperture, you get a more blurred background. Do you think you could explain this, in that clear way you always do?
@@DaveMcKeegan Fun fact: I did understand what you explained there, because I had seen your video a while ago - I just didn't remember it was yours. Thank you for making these. I really appreciate getting it from someone with this clear view of photography and the capability to explain it simply and accurately. Cheers, Dave.
I find that a big sensor makes life a lot easier when it gets dark. But yes, it uses the whole sensor to capture that light, meaning it has a slightly easier time in the dark. But images are always weighted to be in balance. The big difference is in just how late you can still go out and capture a decent image without a tripod ;)
I dunno mate, using a brighter lens might be a better idea than going out and buying a large format sensor camera just to shoot an hour or two longer outside. I never had issues in the arctic nights with my f1.2 lenses on me APS-C body - clearly you have a skill issue, not a gear issue.
I have never heard anyone claim that the exposure is relative to the crop factor, so this whole video seems like a huge strawman fallacy. What people claim is that the smaller sensors capture less light given the same F-number, which is absolutely correct. If you want (approximately) the same field of view, the same depth of field, and the same amount of image noise then you have to use the equivalent focal length, the equivalent F-number, and the equivalent ISO, while keeping everything else the same.
You are the first person that has properly explained this topic. I haven’t been able to understand this since I started shooting in 2013. Thank you so much for breaking this down. I just found your channel and I’ve been binge watching. I have learned so much from you and I look forward to future topics. Have a wonderful day!!!
But it's completely wrong... ISO is calculated per inch, in order to set the "same settings", ISO800 on APS-C is the same setting as ISO1800 on Full Frame, the formula is multiply ISO by (crop factor)**2, e.g. ISO125 on APS-C is ISO280 in full frame to get the same light capture, noise level, etc ,etc - i.e. same settings.
6:22 - to me it does not correlate with your words, and it's not just that the A7III picture is flatter. I bet if you went one f-stop further it would look the same.
Thanks for mythbusting 👍 Would you consider a video on dog photography / editing? 🙏 I'm struggling to get good action photos of a black, woolly dog with a long snout (standard 🐩)
I can understand the "logic" behind the erroneous arguments, though I am glad they are wrong, because it would have completely changed my understanding of light if they weren't. Anyway, since I announced to people I know that I intended to go full frame, I have been bombarded with opinions on why it's not better than a crop. I never said nor thought it was - I just want some wider lenses than I can currently get - but no matter how much I explain, the crop fans (of whom I am technically one, owning a crop camera) insist on telling me I am a fool. Meanwhile, FF fans defend their choice in answer to the crop fans, and between these two groups of very opinionated people the only conclusion I can reach is that people do talk a lot of bum gravy when they love something.
These were shot in jpeg in camera so have slightly different image processing. If you compare these raw files you can see there is no difference: drive.google.com/folderview?id=17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6
I guess the exposure is controlled by the manufacturer - in effect, whatever each manufacturer considers a proper exposure. With my Nikon D300, D700 and D800, the same settings in the same light give me the same result, straight out of the camera.
So the A7RIV and the A7SIII get the exact same picture? Don't think so - there is clearly a difference. And your tray versus shot glass analogy: the shot glasses don't cover the same area because they have walls, and so do photosites, which explains why smaller pixels aren't a great idea in most cases. Clearly the A7SIII and A7RIV have the same "exposure", but for the same ISO setting you get a different signal-to-noise ratio - which is a better signal. What you are ignoring is that ISO isn't a measure of light to signal, but a scale that is fitted to it. The topic is exactly the same as "ISO isn't the same on every camera". The fact that people use full frame instead of 1" proves that there is a difference. Photon statistics exist and so does readout noise.
Of course there are differences in the picture - you get less noise from having fewer pixels because there is less circuitry on the sensor to create noise, but it doesn't create a brighter exposure. ISO is a measurement of signal amplification, but shooting at base ISO you have no amplification - it's just the raw signal level straight off the sensor. But all of that relates to signal-to-noise ratio, not overall signal, so the argument still stands: there might be a few photons' difference, but there is no truly measurable difference in actual luminance with the exact same settings. Noise performance is a big factor in why people favour larger sensors, but there are other factors too - larger sensors generally make creating shallow depths of field easier, and diffraction doesn't show up as much with larger photosites.
@@DaveMcKeegan Almost as if using the same settings gives you the same result? Odd, huh? That is exactly what an ISO rating in a digital camera is doing: giving you the same result. Same light reaches the sensor, same result.
But let's think of it this way. You shoot for an image with the same angle of view and the same depth of field, and to avoid differences between sensors and processors, you use the same camera but just crop the centre. Shoot 50mm f/2 and 25mm f/1.4 (you would really need T-stops for transmission, but assume it is perfect and keep the f-stop measurement for DoF) and use the same exposure duration, because motion blur needs to be the same as well. To get the same results, you end up using ISO 400 on the FF 50mm f/2 and ISO 200 on the cropped (so about a 2x crop, but same aspect ratio, so it wouldn't be MFT) 25mm f/1.4. But remember how we used the same sensor for this? Almost as if a bigger sensor calls ISO 400 what a smaller sensor calls ISO 200 to get the same result. Because we are seeing the same picture, the same light hit the area of the image. This is total light, not light per area or light per pixel. We took the same image, but one is smaller and lower res; the other is bigger but has more gain ("base ISO" is never a solid 100, it is just what the engineers tune the DAC to do in 0010 operation).
The point in your video isn't that different sensor sizes change exposure, but that ISO is normalised across sensor sizes, pixel sizes and resolutions. If you go back to the days of analogue, where ISO is a rated speed of the medium and not a gain setting, you get the expected behaviour that most people learn from a digital camera. However, there is a striking difference: in the analogue world, dynamic range has no hard borders, and a larger negative can hold more detail, more "resolution" and more dynamic range if you enlarge it, simply because the size and density of the silver grains are physically limited.
You keep making the point that the signal-to-noise ratio is different but the signal itself isn't. This is contradictory. Signal-to-noise ratio is what we think of as dynamic range, therefore a better signal-to-noise ratio gives you a better image -> "more signal". As long as you don't clip the signal and have the bit depth to bring back an image by developing the raw data (or a negative you print), a better signal-to-noise ratio means a better signal in the end.
Photon statistics is complex physics, but you can't beat physics with mathematics. So bigger pixels and less resolution is better than smaller pixels but more of them. You can't get the same result by mathematically averaging pixels; the signal-to-noise ratio will be worse (ignoring readout speed and storage). That is why you see cameras like the A7SIII - in reality it does make a difference. But why do we see phone cameras with 108MP? Because there is a lot more processing being done in a phone than in a camera (that might change in the next few years), and with learning-based methods and temporal averaging, more data beats cleaner data - a surprising and unexpected result in information theory, but reality proves otherwise. iPhones still only have 12MP though, and they are still said to have the best camera found in a phone.
Your comparison is flawed. You must compare the same field of view. So on an APS-C camera you should put a 35mm lens, and a 50mm lens on a full frame. Then you can compare to your heart's content. I do a lot of photography in dimly lit settings such as theatres, and APS-C cameras will never compare to the results you get from a full frame camera.
Around 7 minutes into the video I compared 35mm on FF to 23mm on APS-C - although these were shot in jpeg in camera, so the picture profiles have given a difference to the blue in the wall. However, here are some raw files with the same setups you can download and inspect: drive.google.com/folderview?id=17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6
The exposures themselves are near enough identical, but as I then stressed near the end of the video, a larger sensor with larger photosites will give you less noise. There can be other factors such as the light transmission of the lens and sensor architecture (like whether it is backside illuminated etc.), but for equal sensors with equal lenses, the size of the sensor doesn't change the luminance of a shot.
Yeah, this is the thing. The gathered light depends on the lens, not the sensor - that is agreed. But if a Milky Way shot requires a 14mm view, you need to compare the light gathered by a 9mm on APS-C and a 14mm on FF. At the same shutter speed and aperture, the 14mm lens captures more light, as per physics. So while sensor size itself does not matter, it forces you to use lenses with inherently lower light-gathering capability to take the same shot. See e.g. the equations on the clarkvision site. So choice of sensor indirectly matters, besides better noise performance, imho. A 9mm f/2.8 has an entrance pupil of about 3.2mm, an area of roughly 8.1 sq. mm; a 14mm f/2.8 has an entrance pupil of 5mm, an area of roughly 19.6 sq. mm. So when you choose the FF solution for the shot you gain roughly 2.4x the light-gathering capability, because you can use a lens with a larger aperture area. As for comparison, it is hard to do unless you know how ISO was defined for the two cameras.
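A quick check of that entrance-pupil arithmetic - pupil diameter is focal length divided by f-number, and area is pi times the radius squared:

```python
import math

def pupil_area_mm2(focal_mm, f_number):
    """Entrance pupil area from focal length and f-number."""
    diameter = focal_mm / f_number
    return math.pi * (diameter / 2) ** 2

apsc = pupil_area_mm2(9, 2.8)    # ~8.1 mm^2
ff = pupil_area_mm2(14, 2.8)     # ~19.6 mm^2
print(f"9mm f/2.8:  {apsc:.1f} mm^2")
print(f"14mm f/2.8: {ff:.1f} mm^2")
print(f"Ratio: {ff / apsc:.2f}x")  # ~2.42x, i.e. (14/9)^2
```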
But while the APS-C lens has less light passing through, it's spreading it over a smaller area, so the amount of light falling on each square unit of sensor remains the same - thus giving the same exposure. The noise in an image is down to the signal-to-noise ratio per photosite.
Thanks for the reply! I am a bit confused how that fits with your first test, where you show that the same physical lens on APS-C and FF gives the exact same exposure. So the light gathered should have nothing to do with sensor size? For the same area (after crop) the pictures look equally exposed?
The first test was to show that with the same lens, the exposure stays the same regardless of sensor size, because the smaller sensor is capturing less total light but it's also looking at a smaller part of the scene. In essence it is exactly like taking a shot with a FF setup and then cropping in - you end up with a narrower angle of view and less resolution, but the exposure doesn't change. The second experiment then added an APS-C lens with an equivalent field of view into the mix, which shows the exposure doesn't change between using FF & APS-C lenses, even on the same sensor.
@@DaveMcKeegan Don't think so - they are just pointing out the images you're comparing are different images, i.e. different depth of field, different focal length, different noise levels, etc. You forgot to apply the crop factor calculation to ISO: 23mm f/2.8 at ISO400 on APS-C is equivalent to 35mm f/4.2 at ISO900 on Full Frame; you will see a very similar image - same depth of field, same field of view, same noise level, same exposure, same everything. You can of course achieve the same results across systems, with some systems having some unique lenses and advantages, but you have to apply the crop factor calculation to all 3: focal length, aperture and ISO. You can't selectively apply it to just 1 or 2, or you'll get a totally different image.
Sony should really work on their dog exposure detection - it's way too sticky to that human face in the video... Wonder if that works, btw - do the new Sonys also expose for animal faces, and if so, can you register them as you can with human faces to prioritise them? Have to test that...
@@DigiDriftZone I hope you are just joking, since an external meter reading (EV) is film/sensor size independent. My statement was meant to be sarcastic to prove the point, and I have never done any such calculation in my 45-year journey. 🙂
@@bfs5113 I’m not, contact the manufacturer, they assume 35mm for historic reasons, you need to apply your own calculations if that’s not what you’re using.
It's because of calibration. The sensor includes micro-lenses that further focus the light onto the individual photosites. So if there's an 8MP (or whatever the original ones were) full frame sensor from decades ago, it will have more light hitting an individual photosite, because it's being channelled into a single spot. They're calibrated so that the settings don't deviate too far from what an external light meter says, for the sake of everybody's sanity. But if you took a much lower resolution sensor with larger microlenses, you would otherwise see a difference. We don't ever get that setup, because it would be dumb. So they get calibrated so that the exposure matches when necessary. In practice though, there's never that much difference within a generation or two, so you wouldn't see much of a difference without comparing a first-gen FF against a current-gen cropped sensor - in which case you still wouldn't see it, due to the calibration process eliminating any significant deviation that might occur.
Very well explained, and frankly it's dumb that so few people understand this - in fact it's embarrassing when lens manufacturers have to write articles to explain this concept because of all the false information out there.
2:00am in the morning .. Nice and Simple Dave. Brilliant. Waiting for a response from Tony N. "several subscribers have contacted me saying, Dave McKeegan has just said," ..😂 Who cares, Rusty is Back.. 😁😁 Informative video again Dave thanks !!
Tony N has never claimed that exposure is relative to crop factor. In fact, I've never heard anyone ever claim that, so this whole video seems like a strawman correction.
@@marcus3d No, but he did claim that 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. You cannot apply the crop factor calculation to just focal length and then ignore aperture and ISO; it doesn't work like that. Once you apply the crop factor calculation to all 3, you'll see a near identical image. The images he is showing either have a different field of view, different depth of field, or a different amount of noise, so it's not the "same settings". Those settings are useless without applying the crop factor. It's like saying I am going at 60 without specifying whether that's miles or kilometres per hour.
A lot of people have been told that an f2.8 lens on APS-C is like an f4 on full frame. That's only correct for depth of field. An f1.8 lens is as good in low light regardless of whether it's on APS-C or full frame.
That's the answer to all my problems - all I wanted to hear in months. So my f1.8 lens on APS-C is as bright as an f1.8 on full frame, so the two different sensors get the same light, right? But what about when you use a speed booster? They claim one more stop of light. Does that mean that using the same f1.8 lens with a speed booster on a crop sensor means my crop sensor receives more light than the full frame one, as well as giving the same depth of field?
While the exposure settings will be the same, the end effect will not. APS-C will have more noise at the same ISO than a full frame sensor with the same megapixels, so to get the same image quality you'll need a lower ISO - which is why it makes sense to apply the crop factor to focal length and aperture rather than just focal length.
This is very true, and what Christ K said is also true: at any given ISO the APS-C will have more noise than a full frame. However, with all these Sony cameras being incredibly good at low light imagery, said noise really only becomes a factor when shooting at high ISOs. If you can't get a relatively clean image with a Sony APS-C camera/lens, that says more about your technique than the limitations of the gear.
@@GeraldBertramPhotography I don't think anyone's arguing that it's not possible to get great pictures with crop sensors, but the conversions can still be very useful. I regularly use FF, APSC, as well as 1 inch sensor cameras, so knowing roughly what to expect from image quality with quick conversions can be very useful. If it's golden hour and I have to decide between grabbing my RX10 or lugging out my a7iii+200-600 to shoot some birds I need to know that the noise on the smaller sensor will quickly spike as the light leaves. If I just look uncritically at the f/4 of the RX10 at 600mm vs the f/6.3 of the 200-600 then it looks like the RX10 is the better option for low light, which is obviously not the case
A speedbooster takes a full frame lens and redirects the excess light that would otherwise miss an APS-C sensor. This reduces the crop factor of the sensor (so rather than being a 1.5-1.6x crop it might only be 1.2-1.3x with a speedbooster) - it also means the density of light hitting the sensor is higher, so you get a brighter exposure. It effectively lets your sensor receive 1 stop more light than it usually would. I have another video here which explains it properly: th-cam.com/video/e852D0-Bsf4/w-d-xo.html
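As a rough sketch of what a focal reducer does to the numbers - the 0.71x factor below is an assumption for a common speedbooster design, and exact values vary by model:

```python
# Rough arithmetic for a typical 0.71x focal reducer ("speedbooster") on APS-C.
# The 0.71x factor is an assumption; exact values vary by model.
import math

reducer = 0.71
sensor_crop = 1.5

focal_ff_lens = 50      # full frame lens focal length in mm
f_number = 2.0

effective_focal = focal_ff_lens * reducer      # ~35.5mm
effective_f = f_number * reducer               # ~f/1.4
effective_crop = sensor_crop * reducer         # ~1.06x overall "crop"
gain_stops = math.log2(1 / reducer ** 2)       # ~1 stop more light landing on the sensor

print(round(effective_focal, 1), round(effective_f, 2),
      round(effective_crop, 2), round(gain_stops, 2))
```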
Dave I am only practicing photography for about 4 years, the comparison between FF/APSC and the effect crop factors has with lenses is the most misrepresented topic on TH-cam. I find your videos on the ff/apsc comparisons to be the most accurate
Thank you for saying so, I do try :)
@@DaveMcKeegan You're getting spammed, Dave - look at the thread
Thanks, I've been getting them for a few weeks now and I keep removing them, just bots and probably can't do much about them ... But I guess it classes as more viewsn🤣
But he is completely wrong, ISO is calculated per inch, in order to set the "same settings", ISO800 on APS-C is the same setting as ISO1800 on Full Frame, the formula is multiply ISO by (crop factor)**2, e.g. ISO125 on APS-C is ISO280 in full frame to get the same light capture, noise level, etc ,etc - i.e. same settings. This is like saying a large bucket holds the same water per square inch as a small bucket (what ISO measures), ok, so? - ISO number is pointless without knowing the sensor size and applying the relevant calculation to it.
@@Thirsty_Fox Not exactly, you get the same exposure at ISO1600 in photo and ISO12,800 in video on many Sony bodies. On your iPhone, you would've noticed you get crazy low ISOs like 32 to get the same exposure as much higher ISOs on APS-C or Full Frame. You also have different and conflicting standards like SOS and ERI. But as a general rule, if you want to match exposures between sensor sizes it's multiply ISO by crop-factor squared to get pretty close.
But as I said, as most cameras are ISO invariant, what you should be doing really is shooting at the native, changing your exposure in post editing and then comparing the final images if your purpose is to compare different systems or sensor sizes.
Totally agree with you re: sensor sizes vs. equivalent exposure. If you were not correct, sensor size would have to be figured into the rule of "Sunny 16" however, on a bright day, the exposure for any size format would be 1/ISO at f/16... No need to figure in the sensor size...
Where the difference in aperture between different sensor sizes originates is the difference in depth of field. Simply stated: framing any image identically between sensor sizes using the same focal length lens will require you to shoot from a closer distance with the larger sensor or from a longer distance using a smaller sensor.
If you shoot from a closer distance using the same f/stop and same focal length, the DOF will be more narrow than shooting from a farther distance with the same parameters.
Actually, shooting from the SAME DISTANCE using the SAME FOCAL LENGTH will result in a more narrow DOF for the smaller size sensor and a wider DOF for the larger sensor...
There is a difference hidden in ISO amplification set by manufacturer.
F-stop of lenses are misleading, they work great with exposure triangle - hence you have same exposure regardless of your lens or camera sensor, but there is a caviat - f-stop itself doesnt tell you how much light passes through, not without focal lenght anyway.
Real mesurment of lens "lightness" is entrance pupil, and when You start counting that, evrything starts to be more clear.
Full frame 50mm F2 lens has 25mm entrance pupil - this is a size of a hole, filtered by a FoV cone - exactly the amount od light lens is passing through.
For u43 with same FoV, and landscape compression you need 25mm lens with F2 - but now, your entrance pupil is only 12,5mm - 4 times smaller area of lens pipeline.
Same FoV, same landscape compression, same exposure.
Thats why, iso100 on u43 is similar to iso400 on FF in terms of noise - it has to be amplificated by 2EV to get same exposure. In camera here and there it says same iso, but they mean different amplification.
But entrance pupil size is tied to the f-stop anyway, given that the f-number is the ratio of focal length to pupil diameter - to keep the same aperture at longer focal lengths you have to widen the entrance pupil.
Yes the full frame lens has a larger entrance pupil so lets more light through, but that light is spreading over a full frame sensor, whereas the M4/3 lens lets less light through but doesn't spread it out as far - thus the amount of light that falls on each unit area of the sensor remains the same.
But entrance pupil size isn't the true measurement of light passing through the lens as that can depend on how much glass is in the lens, the types of glass elements used and any coatings on the optics
The true measurement of light passing through is T-stops (light transmission), which is the measurement used on all cine lenses - so any 2 lenses with the same T-stop value will put the same density of light onto the sensor, and M4/3 cine lenses have the same values as full frame cine lenses.
DxOMark even give the light transmission values for all lenses they test in their database and the values are always roughly the same regardless of sensor size
The difference in noise performance between sensors comes down to the fact that larger photosites create a better signal to noise ratio so have less interference
@@DaveMcKeegan DxOMark is a good one - it immediately shows you that an f/0.95 on APS-C has the same light transmission as an f/1.4 on Full Frame. This is mathematically sound and can be tested experimentally too. When you apply a crop factor, e.g. 23mm on APS-C is 35mm on Full Frame to get the same field of view, you MUST apply a calculation to the entire exposure triangle, e.g. 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. If you forget to apply the calculation to ISO, sure, you may get the same exposure but drastically different images when it comes to depth of field and noise levels. This isn't about sensor quality, as this has been demonstrated time and time again across different systems - look at Tony Northrup's videos.
What I figured out is that it's not entirely the sensor's fault. Given sensors from the same generation, the low light performance is about the same, but it depends which lenses are used. If you have a 400mm f4 on FF you would need a 200mm f2 on MFT to get the same performance, but you will get the depth of field of a 200mm lens, which can be an advantage or a disadvantage depending on what you want to achieve.
From my understanding, crop sensors get a bad rep because people don't have the proper lenses for them, and the lenses aren't available, are very expensive, or are impossible to make. Let's say you want the same field of view and same depth of field as a 50mm f1.2 on FF but on MFT - you would need a 25mm f0.6, which would be expensive and almost impossible to manufacture (f0.5 is the physical limit for any lens). Most people don't have such lenses, and that's the reason people think their FF body has better low light capabilities than their crop body: they put their 50mm f1.8 on FF and see better results, because on the crop body it wasn't behaving like an f1.8 - in full-frame-equivalent terms it was an f3.6 wide open.
In conclusion, if you want the best results on a crop body you need very good, very fast glass, which is mostly unobtainable and, if available, more expensive - which defeats the purpose of having a crop body if you just want to save money. Shooting on a crop sensor can be an advantage if you want more depth of field: a 200mm f2.8 on MFT would give you the same depth of field as a 400mm f5.6 on FF, so you can achieve more depth of field with a smaller lens. In most cases less depth of field is beneficial, so most people will get FF, because achieving the same look on a crop body would require special lenses which are mostly not available.
In a nutshell, almost all these videos that compare "FF vs APS-C" on this topic don't talk about the most important point.
Thank you for your comment, I was about to say the same.
You get the same exposure but NOT THE SAME IMAGE. You have 4x as much noise on a sensor half the size. That's exponential noise.
No, you don't. It's a crop sensor; it's no different than if you took the picture on a full frame camera, then ignored all the outer pixels. The noise does not change simply because the sensor is bigger or smaller.
Straight to the point, no annoying intro or product placement, great video!
_If a lightbeam travels through a lens and there's no sensor to catch it, is it still light?_ :-)
Yes 🙂
@Kenj But of course! (According to my wife)
Dave, I started photography as a keen hobby about a year ago, and so much of what I have listened to on TH-cam gave me the impression that I was somehow at a disadvantage, with my Sony APS-C mirrorless camera, when compared to a Full Frame, with respect to exposure.
What you say in this video does make sense, but I am somewhat shocked that everything I have believed, as described in the previous paragraph, was such a myth - i.e. fundamentally the only key differences between different sensor sizes are field of view, depth of field and minimum focusing distance, when using similar lenses at identical settings of aperture, shutter speed and ISO.
This is such a gamechanger for me, so I really do not need a FULL FRAME camera body/sensor unless I needed the increased field of view and reduced depth of field that the FULL FRAME camera body/sensor will enable over an APS-C sensor/camera body. These virtues are things that are rarely needed at the level of photography I am at now.
I was hell bent, prior to this video, on upgrading to a FULL FRAME DSLR - one of the Canon 5D's, whenever I had the money. But I must thank you cos you have saved me a whole lot of money. A whole lot, I mean a whole lot of money, and I can enjoy what I have already invested in. Thanks a million.
What shocks me is how come nobody else has shared this in such a simple, believable manner. If only all the knowledge in the world was discussed with such simplicity, clarity and conviction. I owe you a huge debt of gratitude for delivering me from the din of so-called experts who have touted the Full Frame mantra, to the enrichment of the camera and lens businesses, with people buying more than they need.
Today is a very good day for me. Thanks.
I'm glad you found this helpful.
Image noise is another factor to consider though - the same settings will create the same exposure, however larger sensors are less susceptible to noise when shooting at higher ISOs. So if you're finding your images are currently quite noisy then full frame could still be a consideration, but it wouldn't make your images any brighter.
@@DaveMcKeegan You are a godsend - the world needs more people like you. Absolutely agree, yes, with a caveat: assuming that larger sensor sizes result in larger pixel sizes, the noise is likely to be reduced on a larger sensor. But in a practical sense this is only a bother for those shooting in low light or at higher ISOs. For the vast majority of us (and I assume there are far more non-professionals with digital SLRs, mirrorless cameras or mobile phone cameras), as long as we stick within sensible ISOs, e.g. no more than about ISO 3000, with modern cameras we should be ok.
Of course there are other comparative factors like the quality and technology of the sensor (backside illumination, etc.), and any in-camera processing that is done, but for those shooting RAW, as I do, you have much more flexibility to tailor your noise reduction to taste in a photo editor.
But with all these huge megapixel counts on modern cameras from the last 10 years (typically over 12 megapixels) - as in the example in your video where several full frame cameras of similar sensor size have different pixel densities (and therefore different pixel sizes) - the question is: do the lower pixel densities, such as the 12 megapixel Sony full frame cameras, reduce noise in low light (high ISO) situations? Topic for another video response from you. Hint hint.
Here are some of my other videos which should hopefully answer your other questions
Sensor size Vs noise
th-cam.com/video/IhQMJhIP4ik/w-d-xo.html
Sensor size Vs depth of field
th-cam.com/video/v_rTNxOIdoA/w-d-xo.html
@@DaveMcKeegan Highly appreciated. Thanks again and have a lovely evening.
Intensity vs total photon collection. Relative vs absolute quantity of light.
It all begins with the diameter of the lens entrance pupil.
Nothing new since film cameras.
It's not a brighter exposure, but it is more light. Same photon flux density over a larger area equals more photons. More photons means a better signal to noise ratio.
As demonstrated here, the common region has exactly the same light. Collecting more of the picture outside of that region does not improve the signal inside it. The relevant difference here is that the walls separating photosites may take up less relative area with larger photosites.
@@0LoneTech He/she was referring to signal-to-noise ratio which is absolutely true and exactly what this dude was referring to at 9:28
The comparison has to be done on the same field of view. So compare a 50mm f/2 on APS-C with an 85mm f/2 on a full frame => same field of view.
Then it's obvious that the 85mm gives way more photons to the sensor. It's a physical matter of front lens size.
A 50mm f/2 gives a 25mm entrance pupil diameter
An 85mm f/2 gives a 42.5mm entrance pupil diameter
So obviously way more light comes into the 85mm...
Then of course if you compare two things which are not comparable, you get confused.
Comparing 2 photos taken with APS-C and FF with different fields of view, you get exactly the same amount of light on the common field, but you've lost all the light around it that wasn't gathered by the cropped sensor.
You are completely wrong
If you project the same image of an object at the same distance with the same absolute aperture, then the light entering the camera is the same but is distributed onto a larger sensor. The intensity is actually lower by the crop factor to the power of two.
I knew this from both experience and theory (which is not complicated at all), but I encountered so many claims that bigger sensors are better in low light that I started doubting myself.
And here you are with another great explanation, as always. Thanks!
From the professional point of view, it's only about the noise, right.
Larger sensors are better in low light because they capture more total light, so have better signal/noise ratio and so have less noise in the image - but the larger sensor doesn't produce a brighter exposure because the same amount of light falls over each part of the sensor
Same with larger pixels of a lower resolution sensor - the large pixel captures more light so produces less noise than lots of little pixels, but the total light capture is the same so the brightness of the image remains the same
@@DaveMcKeegan I hope I can memorize that this time, thanks.
You know, Dave, I have a topic for an unrelated, but very interesting video, I think: How on Earth can lenses have f-number smaller than 1? I can't wrap my head around lenses having f/0.95, let alone f/0.7.
The short answer to your question is that the f-number is a ratio of focal length to entrance pupil size - so if both are the same size you get an aperture of f1.0, and if the entrance pupil is wider than the focal length then you get an aperture ratio of less than 1.
But I'll definitely consider making this into a more in-depth video
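To put a number on that ratio, here's a minimal sketch - the 52.6mm pupil is just an illustrative value showing how a sub-1.0 f-number falls out when the entrance pupil is wider than the focal length:

```python
def f_number(focal_mm, pupil_diameter_mm):
    """f-number = focal length / entrance pupil diameter."""
    return focal_mm / pupil_diameter_mm

print(f_number(50, 25))     # 2.0   -> a 50mm f/2
print(f_number(50, 50))     # 1.0   -> pupil as wide as the focal length
print(f_number(50, 52.6))   # ~0.95 -> pupil wider than the focal length
```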
@@DaveMcKeegan Why, oh why did I think it's a ratio of the aperture diameter to the barrel of the lens diameter?.. So, it is absolutely possible. I also realize now that constant aperture zooms actually change the aperture hole size all the time, that's cool.
Thank you for educating. I heard so many times what an f-number is, but didn't really understand what I heard.
@@DaveMcKeegan Very enlightening! Thanks!
Great explanation. But what about 50mm on full frame vs 35mm on APS-C, with the same aperture? In a real use scenario people won't crop the full frame in post - they'll just get a tighter lens.
But... of the four picture shown at 1:37, the lower right one IS in fact darker than the top left one...
Your argument with cropping into the full frame or this water example doesn't even hold up. A full frame equivalent can of course "hold more water" because those bottles are placed in a much bigger area and each bottle by itself WILL collect more water.
And if it doesn't matter which resolution you run on the same sized sensor, then why do smartphone manufacturers NEED to use pixel binning in order to get usable and bright pictures out of, for example, 64 or 108MP sensors and downscale them to 12 or 16MP? It kinda doesn't make any sense if this was the case.
Same, for example, with the iPhone 12. The 12 Pro Max has a 1/1.7" sensor with 12MP at f/1.6 and needs noticeably less time to capture the same picture / amount of light than the other iPhone 12, which uses a much smaller 1/2.55" sensor with the same 12MP and also f/1.6.
The centres of the 4 images are equal in exposure.
The lower left appears darker because its wider field of view is seeing the darker corners of the room, but if you crop them to the same framing then they are equal.
Smartphones run pixel binning for 2 reasons.
The first is down to marketing - being able to advertise a camera as physically being 48MP is more appealing to people than saying it takes 12MP images.
The second is because pixel binning downsamples noise. Smartphone sensors are so small that even a 12MP sensor would suffer from noise very quickly, but by shooting at 48MP and downsampling you filter the noise down, which results in cleaner images. Then, if the phone detects you are in sufficient light, it doesn't need to downsample, thus allowing you much higher resolution.
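A minimal sketch of why the binning step helps - simulated noise only, and it assumes numpy is available; averaging each 2x2 block of a noisy high-resolution readout roughly halves the random noise at the cost of resolution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated flat patch from a high-resolution sensor: true signal 100, random noise sigma 10.
high_res = 100 + rng.normal(0, 10, size=(1000, 1000))

# 2x2 binning: average each block of four pixels (quarter the resolution).
binned = high_res.reshape(500, 2, 500, 2).mean(axis=(1, 3))

print(f"Noise before binning: {high_res.std():.2f}")  # ~10
print(f"Noise after binning:  {binned.std():.2f}")    # ~5, i.e. reduced by ~sqrt(4)
```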
I took a photo of 2 deer under a gloomy canopy; they were so grainy on my Pentax K-r they were unusable. Now, getting back into photography, I was eyeing up an EOS 200 or a used D5 - one crop and one full. The full frame is supposed to still be good at around ISO 2000 while the EOS 200 is good to around half that. My question is: at the same zoomed image size and same pixel count, would I have more chance of low grain on the full frame that can use double the ISO?
That would depend on what your sensor is geared towards. If it's geared towards hypersensitivity, as in the a7S, then a full-frame sensor would do a better job than an equivalent-sensitivity APS-C or 4/3 sensor, due to lower pixel density and having larger pixels and more light gathering area as a result.
Great explanation Dave, but I think I might be missing something?
Isn't the ISO rating of any sensor decided by the manufacturer to provide a standard level of luminance? I.e. you'd never know if the smaller sensor is collecting less light, because they will rate, say, ISO 100 to be the same! What you're getting instead is more noise or gain added to the circuit.
You can think of it as larger photosites don't produce more light... they produce less noise!!!
One of the reasons larger photosites collect more light (less noise) is because they have fewer borders around each individual site. Using your tray/shot glass example... it's the little diamond shapes between each shot glass (assuming you've not used square shot glasses)... that add up to be significant.
Why not compare the noise levels between your 24MP A7III and the 12MP A7SIII to see if larger photosites can collect more light with exactly the same settings?
The ISO ratings are set out by an industry standard called ISO 12232 - I've got a copy of it here you can download and read through if you're bored :D
drive.google.com/drive/folders/17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6?usp=sharing
But basically it lays out how any camera must be put into a standardised, controlled situation - pre-determined lighting and a shot of an 18% grey card to measure the exposure - which likely determines base ISO; the ISO speed ratings are then that base ISO signal being amplified.
It'll be like the fuel economy measurements of cars - real world usage will likely vary from the lab figures, and that will vary from camera to camera - but there is certainly no correlation between pixel size and exposure level.
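For reference, the two rating methods most often quoted from that standard reduce to simple constants divided by a measured exposure - a rough sketch of the commonly cited forms (not a substitute for the standard itself; the example exposures are hypothetical):

```python
def iso_saturation_based(h_sat_lux_seconds):
    """Saturation-based speed: S = 78 / H_sat, where H_sat is the exposure (lux*s)
    that just saturates the sensor. Commonly cited form from ISO 12232."""
    return 78 / h_sat_lux_seconds

def iso_sos(h_sos_lux_seconds):
    """Standard Output Sensitivity: S = 10 / H_sos, where H_sos is the exposure (lux*s)
    that produces the standard mid-grey output level."""
    return 10 / h_sos_lux_seconds

# Hypothetical measured exposures, just to show the shape of the calculation:
print(round(iso_saturation_based(0.39)))  # -> 200
print(round(iso_sos(0.05)))               # -> 200
```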
@@DaveMcKeegan Not boring at all... I know that document well... 😂
The interesting sections of that document are actually 6, 7 and 8 (the bit you have to pay for)... they describe the method of determining the Exposure Index and ISO ratings for any digital sensor. The key thing here is the ISO definition of what is acceptable quality... particularly when we're looking at the top end rather than the base. Obviously, we need a standardised, controlled situation to ascertain the exact point at which the manufacturer can determine the ISO in order to provide the correct illumination in our image. As well as trying to rate sensors at their point of maximum dynamic range (base ISO), we also need to rate exposure prior to any application of gamma curves or compression... because these can also affect the impression of brightness or acceptable noise.
You're absolutely correct to say there's no correlation between pixel size and exposure level... but my point was, you're measuring exposure level with ISO, which, as we agree, has been standardised to match as above.
What if we measure acceptable exposure with noise levels? Clearly, at ISO 100 we're not going to see much of a difference... let's try ISO 64,000 or ISO 128,000?
At this point, there is a significant difference in the signal to noise levels between the 24MP A7iii and 12MP A7Siii. Where has this extra signal come from?
It's not so much a question of where the signal is coming from, but rather where the noise is coming from.
With fewer pixels there is less circuitry, which in turn means less interference being mixed into the signal.
So it's not that lower resolution sensors capture more light, but rather that they introduce less noise into the same signal - you can then amplify the signal further before the noise level reaches a particular threshold.
@@DaveMcKeegan Thank you for taking the time to reply Dave... I love these discussions because they stop and make me think again. 👍
I believe we're talking about the same thing? I know where the noise is coming from... it's the electronic amplification or gain applied to the signal to achieve the desired ISO rating. We differ because you say the loss of signal is due to extra circuitry... I think it's because we're using larger photosites (buckets) to collect the photons in!!!
Maybe we'll have to disagree, but I'm grateful because you've given me an idea for another video... maybe we should do a collab ??? 🤔
I think we're on roughly the same page - each pixel will be larger and collect more light individually, but there are fewer of them, so the total signal collected will end up being the same; the noise interference will differ, which in turn produces a different signal-to-noise ratio.
Certainly something we could potentially collaborate on
Thank you, the explanation that I needed to hear. I have low light issues with my A6500; I will look for a big aperture lens. I'm not sure that I need to change the body to a full frame sensor.
How do you measure exposure? The FF image is clearly different in the middle, but I can't tell if it's brighter or if it's just the colours. A 100% zoom comparison would have been better.
There can be some differences in how cameras process different colours, plus the samples were shot in jpeg. Here is a folder with 2 other samples - these ones are original raw files plus 2 jpegs created from the raw files, in which I cropped the FF down to the size of the APS-C for a direct comparison.
drive.google.com/drive/folders/17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6?usp=sharing
Very enlightening! Thanks!
But what's the exposure on the equivalent 23 mm lens? Wouldn't it be 1/30 f1.8 ISO 320?
I figured that was the case.
The amount of light focused onto the pixel is determined by T-stops, no? (Pretty much f-stops.)
A wider frame collects more light over more pixels at a ratio of 1:1
Therefore no brighter exposure - each pixel gets the same light. (A larger sensor may have more vignetting.)
For the lens comparison: does the f2.8 become f4.2 FF equivalent on crop sensor due to crop factor?
No, the f2.8 is the same across all sensors; however, the background blur changes. Since the sensor has a smaller area, a smaller image area needs coverage, creating something called compression - which is why full frame lenses need to be larger than those for Four Thirds sensors, which are 2 times smaller (hence the 2x crop factor).
But a micro four thirds lens has a higher compression rate compared to full frame glass - the light is concentrated into a smaller circle. It's like moving a projector closer to the wall: you get a smaller image and it's sharper; when you move the projector further away, the image gets bigger but it also becomes more blurry. Lenses work in the same sense, except they don't lose sharpness like that, nor brightness for that matter.
Yes, absolutely it does - it is just basic mathematics. That's how you convert between them: multiply focal length by crop factor, multiply aperture by crop factor, multiply ISO by crop factor ** 2. People forget to apply the crop factor calculation to ISO and then are surprised that the exposure looks the same, doh! 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900 in every way - exposure, noise level, depth of field, etc.
As you said Dave, the full frame lens will cover more area than the APS-C sensor, but the exposure brightness simply has to do with energy - that is to say, the amount of light over area. The centre part of the full frame sensor will not be brighter if you put an APS-C sensor in its place instead. You'll change your field of view with the smaller sensor, but it'll be equally as bright. However, since the photosites are smaller (assuming a same-generation APS-C sensor as the full frame one), they will be saturated with fewer photons each, having a lower count of photons needed to go from total black to total white. This gives you fewer discrete steps between the two extremes, resulting in a worse signal-to-noise ratio in the smaller sensor, assuming of course everything else in the electronics is equal.
You are talking about noise performance, which is irrelevant to exposure - it's simply how clean the image is. As you say, the pixels are smaller and therefore they "fill" faster with light than a larger pixel would, which gives less information to go by, hence the noise.
Which is why 6 megapixel Super 35 sensors in videocams have just as good noise performance as a full frame sensor, despite being APS-sized sensors.
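The photon-counting intuition behind these last few comments can be sketched with shot noise alone - per-pixel SNR is roughly the square root of the photons collected; the full-well numbers below are hypothetical:

```python
import math

# Hypothetical photon counts, just to illustrate shot-noise scaling (SNR ~ sqrt(N)).
large_pixel_photons = 40_000   # a big photosite near saturation
small_pixel_photons = 10_000   # a photosite with a quarter of the area, same illuminance

print(f"Large pixel SNR: {math.sqrt(large_pixel_photons):.0f}")  # ~200
print(f"Small pixel SNR: {math.sqrt(small_pixel_photons):.0f}")  # ~100
# Four small pixels together still collect 40,000 photons, so the total signal
# (and the exposure) is unchanged - only the per-pixel confidence differs.
```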
Then a simple question - if the light gathering is equivalent, why do bigger sensors give more information in the shadows? You can shoot bushes in the shade on Micro Four Thirds and on full frame, and you'll see that on full frame there is information in the shadows, while on Micro Four Thirds it's mush.
Because the light gathering of APS-C is not equivalent to FF; the author of the video is mistaken.
@@magicdmode I think the light gathering is equivalent; the difference is in pixel size, pixel count and the dynamic range of the sensor. The light gathering is equivalent - the problem is that the image is projected onto a smaller area with smaller pixels; the GH5S is proof that Micro Four Thirds can do it. Another matter is that making optics for Micro Four Thirds of quality equivalent to full frame is practically impossible - an optically perfect lens will always be big. And that's why people moved to full frame, because the optics ran up against size limits. And if you put the technology from the C70 into it, the dynamic range would be practically the same as on APS. It's not about the size but about the optics. Another question is how full frame optics pairs up - speedboosters...
Thanks so much, this finally explains something that never made any sense to me. One thing I still don't understand though - in theory, shouldn't a lens designed for a crop sensor, which doesn't have light spilling off the side of the sensor, accomplish the same thing as a speed booster by narrowing the light down so it all hits the sensor, allowing more light to hit the sensor? In that situation I would think a crop sensor lens should outperform a full frame lens on a crop sensor camera in terms of how much light is captured.
Glad to have a clear explanation that would have helped me a lot when I was getting into photography :)
I think that the noise performance you bring up only briefly at the end deserves more consideration though - you can effectively get a brighter exposure by bumping the ISO, but if you are limited by noise then FF is effectively brighter.
Yes, this video would blow up if it had more on what you gain with a larger sensor, so people could make decisions between the f-numbers and the sensor sizes. I still don't know.
Actually no, you get the same noise on APS-C and Full Frame, you are just forgetting to apply the crop factor to ISO. A 23mm F1.4 lens on APS-C is equivalent to a 35mm F2.1 lens on Full Frame, but if you're shooting at ISO400 on APS-C, you MUST apply the calculation to ISO too. Most people apply it to focal length and aperture but forget to apply it to the ISO - that's a fundamental mistake. To get the same light captured (and noise, etc.), ISO400 on APS-C becomes ISO900 on Full Frame (ISO*CropFactor**2).
Great video as usual Dave, detailed yet concise !
Thank you Jonathan :)
Nah, this is wrong - you even basically admitted it when you said that the noise level was going to be different between the two. The ISO 125 on the APS-C camera is not the same as the ISO 125 on the full frame camera. The ISO 100 on both cameras will indeed give you the same exposure, but that's because it's a fake number - to get the same noise you'd have to bump the full frame camera to ISO 150~ and that's the true equivalent on full frame. At that point you have the same amount of noise, but the higher exposure on the full frame - just because ISO is the same on both cameras does not mean it performs the same.
I guess the other way to consider it is this: ISO is standardised by exposure, but one could argue it should be standardised by noise levels. Consider that if I wanted to make a photo at really high ISOs, I'd be limited by the smaller sensor's ISO filling my image with noise, where a larger sensor would not (all other things equal). Again it's all a fake number, but standardising ISO by noise makes more sense because then we'd be able to compare usable performance between cameras by their ISO. As it currently stands, ISO 3200 is unusable on some cameras and still exceedingly clean on others.
The TL;DR argument here is speedboosters (which you also mentioned). If exposure on a full frame vs an APS-C sensor is the same, then a speedbooster is literally creating light out of nowhere. You can add a stop or so of light to your APS-C sensor just by putting the speedbooster between a full frame lens and the sensor, so now you've magically made an f2 lens into an f1.4 lens with the same viewing angle as it would have on full frame. If a smaller sensor did not need more light for the same performance, you've just brought in additional light out of nowhere - which is impossible.
Exposure and noise are 2 completely different things - exposure is measuring luminance whilst noise is interference
I never claimed different sensor sizes perform the same in terms of noise (the exact opposite in fact, I said larger photosites give you a cleaner signal and less noise)
While a larger sensor captures a greater volume of light, that's over a larger space so the amount of light still remains the same per square unit of sensor space and so you still get the same exposure.
ISO isn't a direct measurement of exposure but rather a measure of amplification, but changing it to a measurement of noise wouldn't work as you don't change the ISO setting in order to add more noise into your image.
Noise performance differs wildly between cameras, like modern aps-c cameras have noise performance similar to old full frame.
As for speedboosters, they don't get light from nowhere - they act as a funnel, taking the extra light from a full frame lens and packing it into a smaller space. That increases the amount of light per unit area and thus increases the exposure.
@@DaveMcKeegan When it comes to ISO, noise is inversely proportional to exposure. ISO is standardised so you can use a light meter to know your exposure, so of course if you set the ISO the same you'll get the same result - when in reality the gain has already been pumped more on the smaller sensor to achieve that result (hence introducing noise, and why noise is inversely proportional to exposure when it is gained up via ISO).
The best analogue example is with sound recording. You can increase the quality of a sound recording by either making the sound louder or moving the microphone closer. Alternatively you could pump the gain/volume which does the same thing but introduces noise.
What you're saying is like comparing two audio interfaces and saying that, with no gain, both produce the same volume, while completely ignoring the fact that (in this hypothetical example) the less optimised interface is already introducing noise because it has a base gain to compensate for its inefficiency.
But that's an argument of noise performance and the quality of the equipment.
If you have a sensor with backside illumination and gapless microlenses then it's able to capture more light than a sensor of the same size / resolution without them. But on equal terms - 2 sensors with the same architecture - having larger pixels captures more light per individual pixel but the light per unit area remains the same, and having a larger sensor captures more total light but over a larger area, so again the light per unit area is the same.
Excellent comment. 100% agree.
Yeah I don’t understand the insistence of some people to not treat noise level as a quantitative metric. Noise can be measured and equalized to compare exposure. But we tend to not do that and just handwave noise level as some qualitative metric of how good a sensor is intrinsically or something. The worse sensor is just boosting its signal to get an equivalent exposure at the cost of more noise.
Thank you so much for this detailed information! Always love learning more of this technical photography stuff!
Finally you gave me some closure for my APS-C camera. I often wondered if I made a mistake by not choosing full frame
what this video does not mention is that same exposure does not mean the same image quality.
I never really conflated sensor size with exposure, simply low light performance. This harks back to the days of film. If you shoot in daylight or studio light, slow film is fine; otherwise you need a big flash and fast film (and the inherent problems both of these cause).
Great video, underrated by the YT algorithm. However, most people will need more practical answers for their choices, or more on the trade-offs and benefits of a larger sensor in comparison. For example, I have an old Lumix LX7 with a 1/1.7-inch 12.1 megapixel sensor but an excellent f/1.4 lens, and I usually shoot low light party videos. Will I benefit from a larger sensor but darker aperture, for example f/2.8 with a 1-inch sensor (Sony RX100 VII), or would I have to go for f/1.4 and APS-C (Sony e10l with the Sigma 16mm lens) to see a difference? I mean, for example, better quality for crops while still keeping good low light video performance.
Good video, agreed on most of it, just one thing
about the pixel pitch.
Larger photosites do capture more light, and need less amplification to get to (for example) ISO 100.
ISO is not directly translatable to amplification, but is more of a target for how bright the end image should be.
Sounds off, so let me try to explain.
Take 2 pixels: a large one with a size of 1, and a small pixel which is 1/4 the size of the large pixel (0.25).
Keep in mind that the large pixel could also be on an APS-C sized sensor, making this purely about pixel size, not sensor size.
If you throw the correct amount of light at both pixels, the large pixel would collect a certain number of photons (X);
because the smaller pixel is 1/4 of that size, it will only collect 1/4 of the photons.
Here is where the pre-amplification step comes in, which is decided by pixel pitch and maybe some other factors in the sensor (maybe this sensor lets less light through due to inefficient filters etc.), to get to a common number.
The small pixel's charge would have to be amplified more than the large pixel's due to it capturing less light - roughly four times the amplification of the larger pixel. Also, if you had 4 small pixels you'd still need to amplify the signal more, because every individual pixel needs to get to that common number.
When you get to that common number you can start to amplify to get close to the ISO standard; therefore comparing shots by just ISO value is, technically speaking, kind of incorrect.
HOWEVER, this is just about the size of the pixels, which does not have ANYTHING to do with sensor size.
The only way to see this amplification in your images is the difference in "noise", which could be caused by many things, but a large factor would be the amplification due to pixel pitch.
Forgot to mention: thanks for trying to spread correct information around in a way a lot of people can understand. It's a good thing for the community as a whole!
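If I'm following that reasoning correctly, a toy model of it would look something like this (made-up numbers, not real sensor characteristics): each pixel's charge scales with its area, and the smaller pixel needs roughly 4x the gain to reach the same per-pixel target level.

```python
# Toy model of the argument above - illustrative numbers only.
photons_per_unit_area = 400        # photons landing per unit of pixel area

large_pixel_area = 1.0
small_pixel_area = 0.25            # 1/4 the area of the large pixel

large_signal = photons_per_unit_area * large_pixel_area   # 400 photons
small_signal = photons_per_unit_area * small_pixel_area   # 100 photons

target = large_signal              # common per-pixel target level
gain_large = target / large_signal # 1x
gain_small = target / small_signal # 4x

print(f"large pixel: {large_signal:.0f} photons, gain {gain_large:.0f}x")
print(f"small pixel: {small_signal:.0f} photons, gain {gain_small:.0f}x")
# Per unit of sensor area the collected light is identical; the extra gain on the
# small pixel is about per-pixel signal level, not about total light or exposure.
```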
Thank you for such extensive input. However, if I'm understanding your statements right, you are suggesting that a pixel 1/4 the size of another pixel would need to be amplified to the same level as the larger pixel?
However, if that's the case - given that there would be 4x as many smaller pixels on an equal size sensor and the total amount of light hitting the sensor remains the same, the total photons detected would be the same for the entire sensor - so surely if the pre-amps were amplifying each individual photosite up to match a larger pixel, the total sensor signal would become 4x larger and would actually cause a brighter exposure?
@@DaveMcKeegan Not exactly, it wouldn't cause an exposure that is 4 times as bright.
You would have 4 times the amplification, yes, but the pixels on an individual level will have the same exposure as the bigger pixel with the "normal" amplification.
This is because an individual pixel does not care about what the pixels around it capture (Putting bayer array color processing aside).
I'm struggling to find a way to express the idea, but I think a way to explain it would be with water in a pool, which would be quite similar to the shot glasses and basket, in a way. However, this time it's just about contents and not size, as computers don't care about how big a pixel is.
Let's take a pool that has 100 litres of water in it. It can hold more than 100 litres, but it would become too deep. It can also hold less than 100 litres, but at that point it becomes harder to swim in.
If you divide the water of that pool into 4 smaller pools, you get 25 litres each, which is too little water to swim in. The total amount of water is still the same, but individually the pools do not hold enough water. If you add in the "amplification", they would all be at 100 litres. In total there would be 400 litres, but individually you could swim just as well in all of them.
This would work the same for image data (putting bayer array aside, again).
Per pixel there would be less of a charge to work with if it was not amplified, resulting in a darker image for your digital interpreter (editor, the camera itself, whatever reads and shows the image) to work with.
Also, as I might not have explained it well enough (which is probably the case), here is some food for thought:
If the only thing that matters in an exposure is the total amount of light on the sensor, why are there some pixels that are blown out while others are not?
I hope I was able to get my idea across, if not I'm always happy to try again!
Thanks again for not writing me off and reading my explanation and have a good day!
@@ToniTechBoi I'm not entirely sure the swimming pool analogy works. If you had, say, a 10x10 pool with 100 litres, the water would come up to a certain level in the pool. If you then replaced the 10x10 pool with 4 5x5 pools arranged in a square and shared the water out between them, each pool would only have 25 litres, but it would still be the same total volume in the same total area, and so each pool's water level would be the same as the original 10x10 pool.
Blown highlights occur when a pixel overflows with light, but you still have to factor in that the smaller pixel is looking at a smaller part of the scene. Sticking with the swimming pools: if you put 80 litres of water into a 100 litre capacity pool, the pool is 4/5 full so it doesn't overflow (highlights wouldn't be blown). If you tried putting 80 litres into a 25 litre pool then it would overflow, but you'd have to keep the area the same, so you'd again need 4x 25 litre pools to cover the area of the water source - then you'd end up with each pool holding 20 litres of water, so each is still 4/5 full.
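Putting numbers on the pool version (my own figures, same logic as the comment above): splitting one pool into four keeps the water level - the per-area quantity - unchanged.

```python
# Worked numbers for the pool analogy (illustrative only).
total_water = 100.0                 # litres falling on the whole area
big_pool_area = 10 * 10             # arbitrary area units

big_level = total_water / big_pool_area          # "exposure" of the single pool

small_pool_area = 5 * 5
water_per_small_pool = total_water / 4           # the water is shared evenly
small_level = water_per_small_pool / small_pool_area

print(f"one 10x10 pool : level = {big_level:.2f}")
print(f"four 5x5 pools : level = {small_level:.2f} each")
# Each small pool holds 1/4 of the water over 1/4 of the area, so the level
# (light per unit area, i.e. exposure) is the same.
```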
If smaller pixels needed to be amplified to produce the same exposure as larger pixels, then manufacturers could simply not include the pre-amp step and allow smaller pixel cameras to record the original signal, which would basically mean having a much lower native ISO
@@DaveMcKeegan First off, I excluded the size of the pools, as that is not important to what is happening to the individual voltage outputs. What I tried to explain is that the voltage output of an individual small pixel is lower than that of the bigger pixel. Sure, the 4 pixels combined may have the same voltage; however, you need to display them individually, therefore the voltage level needs to be amplified.
Edit: I just thought about it - if you include the size of the pools again and compare water levels, then you're only seeing whether the amount of "water" captured measures up to its area, but the amount of water in that individual pool is still less, therefore it needs to be amplified. I do agree the pool analogy was a really bad one, as sizes in pools are kinda... well... necessary.
For the second point, what I was reaching towards is that a single pixel may be blown out, but the rest of the pixels are not affected by the voltage output of that individual pixel.
On the third point, it may not be a separate pre-amp step - adding another step would be impractical - as it could also be achieved by the amplification that is already going on for ISO. That would also explain why they don't show the lower ISO values; they would become something more like extended ISO values, I believe. It's kind of the same reason some cameras have dual native ISO - they probably have their native ISO already set higher up. It may be the case, may also not be; this is the first time I've actually thought about this one.
I think I included the pre-amp step to split the system up making it simpler to go through, can't remember when though....
You should also consider typical SNR for small and big sensors and how ISO affects SNR.
This was something I never gave a thought.
Thanks for the video. I think to compare the amount of light we need to match the focal length (mechanically) from APS-C to Full Frame. If we consider brightness as the amount of light (which is sometimes confused with / defined as light perception), then the left photo still has more light because it picks up more light information from the sides. In this context it is still completely right to assume that a LARGER SENSOR gives you more LIGHT (I won't dare say BRIGHTER). Also, I have barely ever heard anyone say that a LARGE sensor gives a BRIGHTER EXPOSURE; instead it's usually thought that it gives more LIGHT.
I've come across quite a few people who thought it gave a brighter exposure - I suspect they have fallen into the misconception of thinking that larger apertures and slower shutter speeds mean more light gathered and a brighter exposure, so a larger sensor gathering more light must mean the same thing
You can debate different types of equivalents all day. Only thing that matters is that pictures look better on a full frame or APSC than all the smaller stuff at the end of the day.
The absolute best explanation I’ve seen on this topic, thank you so much!
And the dog steals the show again :-)
Does the diameter of the lens matter? Does a bigger piece of glass at the front allow it to catch more light and then compress it onto the sensor?
I would like to start by saying that I think your content is great and you are exceptional at explaining things. Rusty is awesome as well. Thank you! Now, I feel that I need some help with this one. I agree with everything you've said; however, at the 7 minute mark, you have the A7III and a6400 images displayed next to each other showing roughly the same image. In the A7III image, the bricks appear brighter than in the a6400 image. Is this simply my old eyes playing games with me or something along those lines, or is there some other explanation or something I'm not understanding?
You and your channel have really helped me take my photography to the next level. Thank you and keep up the great content!
Thank you for the kind words. In regards to your point about the test images, those were a bit of a schoolboy error on my part, as I shot the images in jpeg straight in camera, which baked the picture profiles into the images - and there can be variations in how the cameras process different colours
To clear it up here are some different test shots done in raw (the original raw files plus jpeg versions which were done on the computer from the raw files and the FF cropped down to APS-C) which you're free to download and inspect.
drive.google.com/folderview?id=17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6
@@DaveMcKeegan Thank you very much for setting that straight. I look forward to the next video.
I have been told that ISO is not part of exposure and that it is applied after the shot has been taken, and therefore there is no exposure triangle. So this would mean that all cameras have the same base ISO and that some sensors are better at dealing with signal to noise ratio than others. Have I been misinformed?
I would say that's wrong; firstly, cameras don't all have the same base ISO - Nikon's base is often 64 rather than 100.
Although cameras do certainly differ between each other as to how well they handle noise
In certain cameras the ISO is never baked into the file but rather just kept as a side note so it can be freely changed later; however, with most cameras the ISO is baked into the file, so if you overdo your ISO in camera and blow out the highlights then they are permanently gone - which would make ISO part of the exposure triangle.
@@DaveMcKeegan ok. Pal2tech and the angry photographer claim iso has nothing to do with exposure. Many others agree and say that the exposure triangle is just a way to explain how exposure works.
Anyway it’s not a big deal.
ISO doesn't change the exposure of the sensor but does impact the exposure of an image.
You'd never line up a correct exposure at ISO 100, then ramp it up by 5 stops and argue that's not overexposed because the sensor is correctly exposed 🤣
E.g. the reason why the a7s line of cameras uses only a 12mp sensor, but the pixels are relatively large compared to the ones on, say, the a7iii/a9 with their similar 24mp - I believe the sensor in the a7iii, a9 and a7s cameras is the same size - yet it delivers great low light performance.
Correct me if I am wrong Dave.
The sensors are all the same size across the A7's/A9/A1 - so for any given exposure, all of them will collect near enough the same amount of light as each other
Where the fewer, larger pixels of the S come in is that each individual pixel collects more light and produces less noise, but in having fewer of those pixels the total amount of light gathered across the sensor remains the same and thus the exposures don't change
@@DaveMcKeegan I forgot to mention the clearer image quality at higher ISO, mainly due to the fewer but larger pixels. As you mentioned in the video.
Nobody said it gives you a brighter exposure, just an overall larger amount of light
That is exactly the reason for cleaner low light images, so for the same clean image with a small sensor you must double the exposure and reduce the ISO - so nothing is wrong or sensational
I beg to differ, I've had multiple people in the past try to argue that larger sensors actually give you a brighter photo
There are even comments on this very video trying to argue that very fact
23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. If you forget to apply the crop factor calculation to any one of focal length, aperture and ISO, you will get a drastically different image - it could be a different depth of field, different amounts of noise, or a different field of view. If you want to compare "same settings" you need to apply the calculation to all 3.
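For anyone who wants to check those equivalence numbers, here's a small sketch of the calculation being described (a 1.5x crop factor is assumed; the helper function is purely illustrative):

```python
def full_frame_equivalent(focal_mm, f_stop, iso, crop=1.5):
    """Scale focal length and f-number by the crop factor, and ISO by its square."""
    return focal_mm * crop, f_stop * crop, iso * crop ** 2

focal, fstop, iso = full_frame_equivalent(23, 2.8, 400)
print(f"23mm f/2.8 ISO400 on APS-C ~ {focal:.1f}mm f/{fstop:.1f} ISO{iso:.0f} on full frame")
# -> 34.5mm f/4.2 ISO900, i.e. roughly the 35mm f/4.2 ISO900 quoted above.
```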
9:48 Has anyone a link to the video? I can't find it.
Good explanation. The light intensity never changes.
I shoot APS-C. If I had a nickel for every person that I ran into that said "Oh, I shoot full frame cuz it lets in more light" I wouldn't have to work any more. On one occasion, the guy was so adamant, I whipped out my vintage Toshiba linear light meter and asked him to point to the sensor size input. He was like "huh?". Exactly!! Exposure is measured per unit area, NOT total. He left unconvinced, but I sure felt better. Why is this concept so hard for people to understand?!!! Great explanation Dave! If my dog was looking over my shoulder, she would have been licking and nibbling my ear the whole time LOL!
Those people are completely correct. Of course the bigger sensor captures more light in total if the light intensity is the same per area unit. (The exposure is the same, because that's just how the ISO has been determined.)
Indeed, per unit area, that's right! - larger area, more light, doh! :) - but it's irrelevant, you just need to apply a calculation: 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900 on full frame - as long as you compare like for like you'll get the same results
Thank you for this. What you are saying is what I thought. And professional photographers are saying that a full frame sensor makes a brighter exposure than apsc.
Oh... He's a good boy!
Intensity and exposure are the same, but the overall exposed area is bigger, so perceived brightness is higher.
The exposure is the same, yes, but the larger surface collects more photons from the subject - more information, less noise.
That's why smartphones are far more noisy.
Look at DxOMark values: noise varies directly with sensor size for the same exposure.
Large format film cameras are used partially for this reason.
Exposure is not all.
I think this notion arises from the fact that a lower resolution sensor of the SAME size has more light falling on any given pixel. For some reason people misconstrue this as meaning the exposure ends up being different when in fact it does not. The only time this situation produces a 'different' exposure is under extremely low-light situations, and then you have to ignore the fact you also have a lower resolution image as a result.
But it is why my 11MP full-frame Canon 1Ds is better for infrared and astrophotography than my 17MP APS-C Canon T100.
It does give you a brighter exposure if you adjust for framing (as in use equivalent focal lengths but the same aperture (not accounting for lens specific light transmission)). Less noise and a brighter exposure are the same exact thing
Yes if you mean absolute aperture (iris diameter). No if you mean relative aperture (f/ stop).
@@kburke1965 that is literally the same thing
So if you put a Sony APS-C 70-350 on a full frame body, that would be a good alternative to those super expensive and huge sports lenses?
If you are still happy with the pixel count you have left. This is why many feel the A7R4 and now the A1 are probably Sony's best 'crop' cameras, since if you put them in crop mode you still get 26MP and 21MP without an AA filter. The A7M3 discussed in this example would only have about 10MP left in crop mode. But for the rest you are correct....
As Daniel says, yes you can, but you'll lose resolution in doing so - which then becomes a question of whether that's worth it to you or not
This is a second post - could you please assist us noobies. I am now sure that exposure is not affected by sensor size - got that.
But I now have to throw out all that I previously assumed.
I am sure that Field of View is influenced by sensor size. No contest about that.
The questions, though, are:
1. Does sensor size influence Depth of Field in any way, if we compare the cropped area of a larger sensor's image with the full image taken on a smaller sensor (assuming focal length, aperture, shutter speed and ISO remain identical)? My initial thought is no - if the image capture area is matched, the Depth of Field should remain identical, i.e. the only parameters changing Depth of Field would be focal length and aperture; and in a strange roundabout way, the smaller sensor has an advantage when blurry backgrounds are preferred because it has a longer effective focal length, from the way it increases the effective focal length of similar lenses. Just my thoughts as a non-expert, happy to be corrected if I am wrong (a rough calculation is sketched at the end of this comment).
2. Does sensor size influence minimum focusing distance, from the perspective of the focus within the cropped area of the image taken with the larger sensor that matches the image taken with the smaller sensor? From your video, my answer would also be no - the minimum focusing distance of a larger sensor is longer only because, in order to take a photo of a larger field of view with a similar level of focus on the whole image, you would need a longer distance from the subject to get a similar accuracy of focus across the whole image. But if one was only concerned about focus within the cropped portion of the larger sensor's image, the focusing distance would be identical to that of the smaller sensor.
One way to think of this is: if one has a Full Frame sensor, an APS-C lens is mounted, and the camera switches to APS-C mode (i.e. not using the whole sensor to capture the image, but only the APS-C portion of the sensor), all aspects of photography would be 100% identical to those of an APS-C sensor body.
But seriously could it be so simple.
Dear Dave, please review these assertions and conclusions, kindly respond, and make a video or a set of videos on TH-cam addressing how sensor size affects (or does not affect) Depth of Field and minimum focusing distance. I think these would be very popular videos that would help many people understand this much better.
The advantage you have is that you have the gear to test, compare and prove the thinking. Some of us have only one camera, and we do not trust the manufacturers, so we are relying on "free" thinkers like you who have the gear to test these assertions and enlighten us further.
I look forward to your response, and these videos.
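On question 1, a rough way to put numbers on the reasoning (a sketch only, using the standard hyperfocal-distance approximation; the 50mm f/2.8 lens, 3 m subject distance, 1.5x crop and 0.030 mm full-frame circle of confusion are assumed illustrative values, not anything from the video):

```python
def total_dof_mm(focal_mm, f_number, distance_mm, coc_mm):
    """Approximate total depth of field from the standard hyperfocal-distance formulas."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = distance_mm * (h - focal_mm) / (h + distance_mm - 2 * focal_mm)
    far = distance_mm * (h - focal_mm) / (h - distance_mm) if distance_mm < h else float("inf")
    return far - near

coc_ff, crop = 0.030, 1.5
print("APS-C body, whole image      :", round(total_dof_mm(50, 2.8, 3000, coc_ff / crop)), "mm")
print("FF body, image viewed whole  :", round(total_dof_mm(50, 2.8, 3000, coc_ff)), "mm")
print("FF body, cropped and enlarged:", round(total_dof_mm(50, 2.8, 3000, coc_ff / crop)), "mm")
# Cropping the FF image and enlarging it shrinks the acceptable circle of confusion
# by the crop factor, which lands on exactly the same depth of field as the APS-C body.
```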
This video is so wrong on every level. 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. ISO125 on APS-C is ISO280 on Full Frame - ISO * (crop-factor**2). You cannot apply the crop factor calculation to just focal length and then ignore aperture and ISO, it doesn't work like that! Once you apply the crop factor calculation to all 3, you'll see an identical image. The images he is showing either have a different field of view, depth of field, or amount of noise.
This isn't about ISO performance, it is just that ISO125 on APS-C is identical to ISO280 on FF in every sense of the word - same amplification, same light gathered, same noise, etc. You have to set an equivalent ISO to get a similar image, or you're not comparing "same settings".
This video should receive more views..........
Well... I disagree. And Dave is getting more attention with his debunking of flat earthers. I wish him the best, he is a very nice person - and what a lovely dog, isn't he!
Anyone who has ever used a hand held light meter before gets this stuff. I have several analog meters in addition to a now rather old Minolta IV digital light meter, and not a single one of them asks what format of film or sensor I am using. Hmmmmm....
How do you know the exposure is the same? Have you compared the histograms and waveforms, or are you telling that by eye?
5:40. You do not have the same amount of light spread over the larger area. More light hits the FF sensor compared to the APS-C sensor.
Since the diameter of the entrance pupil is the same in both cases when you use the FF lens, most of the light hits the FF sensor (some doesn't, since the iris is circular and the sensor is a rectangle). More light misses the APS-C sensor, hitting the area around the sensor (light is not magically bent inside the lens if you connect it to an APS-C body).
Also, comparing different crops makes no sense - the crop should be the same, so you should use different focal lengths for each sensor size to match the framing. My guess is you had multi metering mode on both cameras, so the exposure is roughly an average of the whole picture, which is different for different crops ... seems like the whole method used here is wrong.
And I shoot MFT and APS-C, so not a FF fanboy.
I didn't say there was the same light over a larger area; I said a larger sensor captures more total light, but because it's spread over a larger area, the amount of light falling on each unit area of the sensor remains the same and thus the exposures are the same.
I also shot at different focal lengths to create the same framing (35mm on FF Vs 23mm on APS-C) - but then focal length doesn't change exposure otherwise a constant aperture zoom lens would change exposure as you zoom in.
And metering modes didn't come into play because I shot fully manual and dialed in the exact same settings to both cameras
The relevant factor at hand for purposes of exposure is the intensity of light per given area, not the total amount. The FF sensor can capture a larger area of the circle projected by the lens, but the intensity per mm² remains the same. The result is no different to cropping the FF image in post to match the image from the APS-C camera.
A shorter focal length (at a given aperture diameter in mm) does yield greater intensity of light. This is because light radiates from many angles, and a broader field of view better captures the scattered light (the whole reason the aperture value is expressed as a quotient, as we can use it to gauge exposure across different focal lengths). Once past the lens, however, the light is relatively uniform. Sensor size cannot alter the intensity of the light.
The benefits gained from full frame cameras are an easing of design restrictions in lenses as well as better signal-to-noise ratios and dynamic range. Smaller, more densely packed sensors require sharper lenses to be properly utilized, and are more easily diffraction limited at small apertures.
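A small sketch of the "intensity per area" idea described above (my own illustrative figures; the two focal lengths are taken from the comparison earlier in the thread): the longer lens has a bigger entrance pupil, but the per-area illuminance depends only on the f-number.

```python
import math

def relative_sensor_illuminance(f_number):
    # Image-plane illuminance is, to first order, proportional to 1/N^2 -
    # which is why the f-number works as an exposure value across focal lengths.
    return 1.0 / f_number ** 2

def entrance_pupil_area_mm2(focal_mm, f_number):
    d = focal_mm / f_number            # entrance pupil diameter
    return math.pi * (d / 2) ** 2

for focal in (23, 35):
    print(f"{focal}mm f/2.8: pupil area = {entrance_pupil_area_mm2(focal, 2.8):5.1f} mm^2,"
          f" relative illuminance = {relative_sensor_illuminance(2.8):.3f}")
# The longer lens passes more total light through a bigger pupil, but spreads it
# over a proportionally larger image circle, so the intensity per mm^2 is equal.
```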
@@DaveMcKeegan Why would it? - you are getting a totally different image in terms of depth of field there. Try to compare identical images, you will see that 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. This isn't about ISO performance, it is just ISO125 on APS-C is identical to ISO280 on FF, in every sense of the word, same amplification, same light gathered, same noise, etc - you have to set an equivalent ISO to get a similar image, or you're not comparing "same settings".
@@DigiDriftZone The full frame image is cleaner when comparing similar resolutions because its pixel density is lower
If you take DPReview's image comparison tool and compare an APS-C camera like the A6700 to a full frame like the A7RV, you see the noise levels look the same at the same ISO levels because the pixel densities are about the same
Take an A7RV in crop mode alongside an A6700, using the same lens on both with the same settings and you'll get the same looking results - same exposure and same noise
@@DaveMcKeegan Surprisingly pixel density doesn’t seem to matter. Take an A7S III and A7 IV, vastly different pixel density, but the noise is similar, it looks slightly different but in terms of signal to noise, they are almost identical.
As for same settings, 23mm f/2.8 at ISO400 on APS-C is 35mm f/4.2 ISO900 on full frame. This will give you the same field of view, depth of field, exposure, noise, etc. The images you are comparing are largely different images, nobody can look at them and say that’s an identical image, the above will give you an actual (near enough) identical image, try it.
OK, so we're in agreement: your A7R V when cropped to APS-C size uses less than half the sensor area, meaning less than half the light, correct? If your exposure stays the same (which of course it does), that means you've just thrown away over half of your captured light simply by only using the APS-C portion of your full frame sensor. If you used the whole sensor, you'd capture about 2.25x more light.
This applies across the board. If you switch from a 23mm f1.8 lens to a 35mm f1.8, the exposure is the same, but the amount of light the 35mm lens captures is about 2.3x greater.
In all of this, I think the biggest mistake is you're just forgetting to apply the crop factor calculation to the ISO (it is ISO * cropfactor**2).
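To make those numbers concrete (my own arithmetic, assuming a 1.5x crop factor): total light scales with the square of the crop factor, which is also where the ISO x crop-factor-squared rule comes from.

```python
crop = 1.5

sensor_area_ratio = crop ** 2              # FF area vs the APS-C crop of the same frame
light_lost_by_cropping = 1 - 1 / sensor_area_ratio

pupil_area_ratio = (35 / 23) ** 2          # same f-number, pupil diameter = focal / N

print(f"FF gathers {sensor_area_ratio:.2f}x the total light of its APS-C crop")
print(f"cropping therefore discards {light_lost_by_cropping:.0%} of the captured light")
print(f"35mm f/1.8 passes {pupil_area_ratio:.2f}x the total light of 23mm f/1.8")
# The exposure (light per unit area) is unchanged in both cases.
```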
The FF image is clearly a bit more exposed than the APSC image.
Also, a larger signal/noise ratio is functionally equivalent to a larger signal. All signals of light are interpreted by the camera, not given to us as some absolute value reading. ISO settings are used to manipulate how that signal/noise ratio is interpreted. ISO values are not absolutes like shutter speed. They are relative to the sensor. One sensor at ISO 100 isn’t the same as another sensor at ISO 100. A sensor with a better signal/noise ratio will have a brighter image at ISO 100 because the camera can afford to interpret the signal as a brighter image because the proportion of noise is lower.
Here are some downloadable RAW files - a full frame, an APS-C & a full frame in APS-C crop mode
All settings identical and all exposures are equal
drive.google.com/folderview?id=17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6
And if signal to noise ratios were a factor then that would mean lower resolution full frames like the A7SIII would expose brighter than higher resolutions like the A7RIV
@@DaveMcKeegan a7SIII and A7rIV are both FF sensors. A single pixel on the S had a better signal/noise ratio than a single pixel on the R. That is why ISO noise shows up faster on R when it is cranked up and zoomed in. But an equivalent region of the sensor will have more similar signal/noise ratio between the two sensors. The R will have more area dedicated to pixel boundaries that are not collecting light, and the two sensors can be different in other ways for performance.
There is a trade-off between higher pixel count and larger photosites: the number of distinct pieces of information versus the amount of confidence in each piece of information. The S camera has fewer pieces of distinct information - fewer pixels. The R camera has less confidence in any particular piece of information - a worse signal/noise ratio per pixel. At full image size, the greater number of pieces of distinct information helps to offset the relative lack of confidence per pixel. But when you crop in to get similar pixel density, this lack of confidence shows up as greater noise.
But all of this is separate to the issue of better signal/noise being functionally identical to brighter image. We have created a new variable by significantly increasing the pixel count. If you have two identical sensors in all ways, except sensor A has a better signal/noise capture than sensor B, then a camera maker will (or at least a smart camera maker should) make sensor A produce a brighter image than sensor B at the same ISO. This is because ISO specs are determined relative to the sensor and sensor A can label “ISO 100” to be a higher amplified image signal without having more noise relative to image B.
You're saying the R & S create equal exposures because despite different sized pixels, an equal portion of the sensors both see the same amount of light ... Which means an APS-C sized portion of a full frame must expose the same as an APS-C because they are seeing the same area of scene.
@@DaveMcKeegan
A simpler way to look at larger signal/noise being functionally equivalent to a larger signal is by just doing some simple math. Assume higher numbers mean a brighter pixel. I will admit that I am not doing any real signal processing math. Things could be squared or square-rooted in real signal processing math to increase or decrease the significance, but it definitely will be an improvement of some delta value.
Assume that a noise value at or below 5 is great, at or below 10 is okay, and above 15 is bad.
One sensor gets a signal/noise ratio of 10/1 and another gets a signal/noise ratio of 25/5. The first sensor captures less signal than the second. But the first sensor can amplify the 10/1 to a signal of 50/5, creating a brighter image while still maintaining the same amount of noise as the second sensor.
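If it helps, here is the same toy arithmetic written out (keeping the made-up units and values from the comment above; nothing here is real sensor data):

```python
# Toy numbers from the comment above - arbitrary units, not real measurements.
def amplify(signal, noise, gain):
    # Analogue gain scales signal and noise together, so the ratio is preserved.
    return signal * gain, noise * gain

s1, n1 = 10, 1     # sensor one: less raw signal but a better ratio (10:1)
s2, n2 = 25, 5     # sensor two: more raw signal but a worse ratio (5:1)

s1_amp, n1_amp = amplify(s1, n1, 5)
print(f"sensor one amplified 5x: signal {s1_amp}, noise {n1_amp} (ratio {s1_amp / n1_amp:.0f}:1)")
print(f"sensor two unamplified : signal {s2}, noise {n2} (ratio {s2 / n2:.0f}:1)")
# Sensor one can be pushed to a brighter output (50 vs 25) while carrying the same
# absolute noise (5) - which is the SNR-headroom point being made above.
```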
@@GoodFun_HaveLuck But signal to noise ratio has nothing to do with exposure.
The SIII has a better signal to noise ratio than an RIV because each unit area of pixels captures the same amount of light but in fewer pixels, which produces less noise and so a higher signal to noise ratio - but the exposure remains the same because the total amount of light captured is still the same.
The same rule applies comparing APS-C to full frame.
The A7RIV has slightly smaller pixels than an A6600, as its APS-C crop mode is 26MP vs the A6600's 24MP.
But if you shoot an A7RIV in crop mode alongside an A6600, with the same lens & settings - they will produce the same exposure.
If you then switch the A7RIV into full frame mode, the centre super35 section still continues to expose the same as before, it's just now seeing more of the scene - so the total light capture increases but the light captured from the middle doesn't change so the exposure doesn't change.
Meaning a full frame exposes the same as an APS-C
Dave, “just” perfect once again! There might be an additional thing that usually gets some discussion: bokeh. On FF, using the same aperture, you get a more blurred background. Do you think you could explain this, in that clear way you always do?
I think this should cover what you're looking for 😊
th-cam.com/video/v_rTNxOIdoA/w-d-xo.html
@@DaveMcKeegan fun fact: I did understand what you explained there, because I had seen your video a while ago. I just didn’t remember it was yours.
Thank you for making these. I really appreciate getting it from someone with this clear view of photography and capability to explain it simple and accurately.
Cheers, Dave.
Something feels different. Photos on the FF feel better.
I find that a big sensor makes life a lot easier when it gets dark.
But yes, it uses the whole sensor to capture that light... meaning it has a little easier time in the dark. But images are always weighted to be in balance. The big difference is in just how late you can still go out and capture a decent image without a tripod ;)
/facepalm :)
Tf?
I dunno mate, using a brighter lens might be a better idea than going out and buying a large format sensor camera just to shoot an hour or two longer outside.
I never had issues in the arctic nights with my f1.2 lenses on my APS-C body - clearly you have a skill issue, not a gear issue.
Very informative! ^^
the a7III footage looked brighter to me.
I have never heard anyone claim that the exposure is relative to the crop factor, so this whole video seems like a huge strawman fallacy.
What people claim is that the smaller sensors capture less light given the same F-number, which is absolutely correct. If you want (approximately) the same field of view, the same depth of field, and the same amount of image noise then you have to use the equivalent focal length, the equivalent F-number, and the equivalent ISO, while keeping everything else the same.
You are the first person that has properly explained this topic. I haven’t been able to understand this since I started shooting in 2013. Thank you so much for breaking this down. I just found your channel and I’ve been binge watching. I have learned so much from you and I look forward to future topics. Have a wonderful day!!!
But it's completely wrong... ISO is calculated per unit of area, so in order to set the "same settings", ISO800 on APS-C is the same setting as ISO1800 on Full Frame; the formula is multiply ISO by (crop factor)**2, e.g. ISO125 on APS-C is ISO280 on full frame to get the same light capture, noise level, etc, etc - i.e. the same settings.
Then why is a bigger sensor still better in low light?
If it did, light meters would come with a FF/ crop sensor mode. They do not. And never will.
I always try to explain this topic via optics diagrams. But people don't seem to get it. I'll just send them this vid.
6:22 To me it does not match your words, and it's not just that the A7III picture is flatter. I bet if you went one f-stop more on it, it would look the same.
2:05 UK sunlight, not always traveling at the same speed...
Light travels at different speeds through different materials 😉
@@DaveMcKeegan you mean the UK rain?
That makes sense!
Thanks for mythbusting 👍
Would you consider a video on dog photography / editing? 🙏
I'm struggling to get good action photos of a black, woolly dog with a long snout (standard 🐩)
Certainly something I'll consider when I clear out my current plan of videos
I can understand the "logic" behind the erroneous arguments, though I am glad they are wrong because it would have completely changed my understanding of light if they weren't.
Anyway, since I announced to people I know that I intended to go Full Frame I have been bombarded with opinions on why it's not better than a crop. I never said nor thought it was, I just want some wider lenses than I can currently get but no matter how much I explain, the crop fans (who I am technically one of, owning a crop camera) insist on telling me I am a fool.
Meanwhile, FF fans defend their choice in answer to the crop fans and between these two groups of very opinionated people the only conclusion I can reach is people do talk a lot of bum gravy when they love something.
6:26 My eye tells me the left image has a higher exposure. It's brighter on my monitor.
These were shot in jpeg in camera so have slightly different image processing
If you compare these raw files you can see there is no difference
drive.google.com/folderview?id=17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6
I guess the exposure is controlled by the manufacturer - in tech, what every manufacturer considers a proper exposure. With my Nikon D300, D700 and D800, the same settings and the same light give me the same result, straight out of the cam.
So the a7rIV and the a7s3 get the exact same picture? Don't think so. There is clearly a difference.
And your tray versus shot glass analogy: the shot glasses don't cover the same area as they have walls, and so do photosites. Which explains why smaller pixels aren't a great idea in most cases.
Clearly the a7s3 and a7r4 have the same "exposure", but for the same ISO setting you get a different signal to noise ratio - which is a better signal.
What you are ignoring is that ISO isn't a measure of light to signal, but a scale that is fitted to it.
The topic is exactly the same, as ISO isn't the same on every camera... the fact that people use full frame instead of 1" proves that there is a difference.
Photon statistics exist and so does readout noise.
Of course there are differences in the picture - you get less noise from having fewer pixels because there is less circuitry on the sensor to create noise, but it doesn't create a brighter exposure.
ISO is a measurement of signal amplification, but shooting at base ISO you have no added amplification - it's just the raw signal level straight off the sensor
But all of that relates to signal to noise ratio, not overall signal - so the argument still stands: there might be a few photons' difference, but there is no truly measurable difference in actual luminance with the exact same settings
Noise performance is a big factor as to why people favour larger sensors, but then there are other factors too - larger sensors generally make creating shallow depths of field easier, and diffraction doesn't show up as much with larger photosites.
@@DaveMcKeegan Almost as if using the same settings gives you the same result? Odd, huh?
That is exactly what an ISO rating in a digital camera is doing: giving you the same result. Same light reaches the sensor, same result.
But let's think of it this way.
You shoot for an image with the same angle of view and the same depth of field, and to avoid differences between sensors and processors, you use the same camera but just crop the centre.
Shoot 50mm f/2 and 25mm f/1.4 (you would need T-stops for transmission, but assume it is perfect and keep the f-stop measurement for depth of field) and use the same exposure duration, because motion blur needs to be the same as well or it is a motion picture.
To get the same results, you end up using ISO 400 on the FF 50mm f/2 and ISO 200 on the cropped (so about a 2x crop, but the same aspect ratio so it wouldn't be MFT) 25mm f/1.4.
But remember how we used the same sensor for this?
Almost as if a bigger sensor calls ISO 400 what a smaller sensor calls ISO 200 to get the same result. Because we are seeing the same picture, the same light hit the area of the image - this is total light, not light per area or light per pixel. We took the same image, but one is smaller and lower res; the other is bigger but has more gain ("base ISO" is never a solid 100, it is just what the engineers tune the DAC to do in 0010 operation).
The point in your video isn't that different sensor sizes change exposure, but that ISO is normalized across sensor sizes, pixel sizes and resolutions.
If you go back to the days of analog, where ISO is a rated speed of the medium and not a gain setting, you get the expected behavior that most people learn from a digital camera. However, there is a striking difference: in the analog world, dynamic range has no hard borders, and a larger negative can hold more detail, more "resolution" and more dynamic range if you enlarge it, simply because the size of the silver grains and their density are physically limited.
You keep making the point that the signal to noise ratio is different but the signal itself isn't. This is contradictory. Signal to noise ratio is what we think of as dynamic range, therefore a better signal to noise ratio gives you a better image -> "more signal". As long as you don't clip the signal and have the bit depth to bring back an image by developing the "raw" data (or a negative you print), a better signal to noise ratio means a better signal in the end.
Photon statistics is complex physics, but you can't beat physics with mathematics. So bigger pixels and less resolution are better than smaller pixels and more of them; you can't get the same result by mathematically averaging pixels, the signal to noise ratio will be worse (ignoring readout speed and storage). That is why you see cameras like the a7s3 - in reality it does make a difference. But why do we see phone cameras with 108MP? Because there is a lot more processing being done in a phone than in a camera (which might change in the next few years), and with learning-based methods and temporal averaging... more data beats cleaner data. Which is a surprising and unexpected result in information theory, but reality proves you wrong. iPhones still only have 13MP though, and they are still said to be the best camera found in a phone.
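A sketch of the scenario described above (same sensor, 2x centre crop, arbitrary units, transmission losses ignored - my own numbers, not anything from the video), just to put figures on the ISO-normalisation point:

```python
# Arbitrary units; a 2x centre crop of the same sensor, as in the comment above.
def relative_illuminance(f_number):
    return 1.0 / f_number ** 2           # light per unit area on the sensor

full_area, crop_area = 1.0, 1.0 / 4      # a 2x linear crop has 1/4 of the area

ff_illum = relative_illuminance(2.0)     # 50mm f/2 on the full sensor
crop_illum = relative_illuminance(1.4)   # 25mm f/1.4 on the cropped centre

# ISO needed for the same output brightness scales inversely with illuminance.
iso_ff = 400
iso_crop = iso_ff * ff_illum / crop_illum

print(f"crop needs roughly ISO {iso_crop:.0f} where the full sensor needs ISO {iso_ff}")
print(f"total light: full {ff_illum * full_area:.2f} vs crop {crop_illum * crop_area:.2f}")
# Roughly the ISO 200 vs ISO 400 pairing above - same output brightness, but the
# full-sensor shot collected about twice the total light at these settings, and
# the normalisation happens in what each configuration calls "ISO".
```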
Haven’t watched the video. But I say yes.
Your comparison is flawed. You must compare the same field of view. So, on an APS-C camera you should put a 35mm lens and a 50mm lens on a full frame. Then you can compare to your heart's content. I do a lot of photography in dimly lit settings such as theaters, and APS-C cameras will never compare to the results you get from a full frame camera.
Around 7 minutes into the video I compared 35mm on FF to 23mm on APS-C - although these were shot in jpeg in camera, so the picture profiles have introduced a difference in the blue of the wall.
However here are some raw files with the same setups you can download and inspect
drive.google.com/folderview?id=17ABxsllQZQ_piM1W0dR8y6AM2FqrxcI6
The exposures themselves are near enough identical but as I then stressed near the end of the video, a larger sensor with larger photosites will give you less noise.
There can be other factors, such as the light transmission of the lens and the sensor architecture, like whether it's backside illuminated etc
But for equal sensors with equal lenses the size of the sensor doesn't change the luminance of a shot
Yeah, this is the thing. The gathered light depends on the lens, not the sensor. That is agreed. But if a Milky Way shot requires a 14mm view, you need to compare the light gathered by a 9mm on APS-C and a 14mm on FF. At the same shutter speed and aperture, the 14mm lens captures more light as per physics. So while sensor size itself does not matter, it forces you to use lenses with inherently lower light gathering capabilities to take the same shot. See e.g. the equations on the clarkvision site. So the choice of sensor indirectly matters, besides better noise performance imho.
9mm f/2.8 has an entrance pupil area of roughly 8 sq. mm.
14mm f/2.8 has an entrance pupil area of roughly 20 sq. mm.
So when you choose the FF solution for the shot you gain roughly 2.4x the light gathering capability, because you can use a lens with a larger aperture area.
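For reference, the arithmetic behind those figures (entrance pupil diameter approximated as focal length divided by f-number; the 1.5x-crop pairing of 9mm and 14mm is taken from the comment above):

```python
import math

def entrance_pupil_area_mm2(focal_mm, f_number):
    d = focal_mm / f_number            # approximate entrance pupil diameter
    return math.pi * (d / 2) ** 2

apsc = entrance_pupil_area_mm2(9, 2.8)     # ~14mm-equivalent field of view on 1.5x crop
ff = entrance_pupil_area_mm2(14, 2.8)

print(f"9mm f/2.8 : {apsc:.1f} mm^2")
print(f"14mm f/2.8: {ff:.1f} mm^2  ({ff / apsc:.1f}x the light-gathering area)")
# Roughly the crop factor squared, which is where the total-light advantage comes from.
```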
As for comparison it is hard to do unless you know how ISO was defined for the two cameras.
But while the APS-C lens has less light passing through, it's projecting it over a smaller area, so the amount of light falling on each square unit of sensor remains the same - thus giving the same exposure.
The noise in an image is down to the signal to noise ratio per photosite
Thanks for the reply! I am a bit confused how that fits with your first test, where you show that the same physical lens on APS-C and FF gives the exact same exposure. So the light gathered should have nothing to do with sensor size? For the same area (after crop) the pictures look equally exposed?
The first test was to show that with the same lens, the exposure stays the same regardless of sensor size because the smaller sensor is capturing less total light but it's also looking at a smaller part of the scene. In essence it is exactly like taking a shot with a FF setup and then cropping in - you end up with a narrower angle of view and less resolution but the exposure doesn't change.
Then the second experiment added an APS-C lens with an equivalent field of view into the mix, which shows the exposure doesn't change between using FF & APS-C lenses even on the same sensor
Literally nobody said that. You're attacking a straw man.
There are people commenting on this very video arguing that it does
@@DaveMcKeegan Don't think so, they are just pointing out the images you're comparing are different images, i.e. different depth of field, different focal length, different noise levels, etc - you forgot to apply the crop factor calculation to ISO: 23mm f/2.8 at ISO400 on APS-C is equivalent to 35mm f/4.2 at ISO900 on Full Frame, you will see a very similar image, same depth of field, same field of view, same noise level, same exposure, same everything.
You can of course achieve the same results across systems, with some systems having some unique lenses and advantages, but you have to apply the crop factor calculation to all 3: focal length, aperture and ISO. You can't selectively apply it to just 1 or 2, or you'll get a totally different image.
Sony should really work on their dog exposure detection - it's way too sticky on that human face in the video... Wondering btw, do the new Sonys also expose for animal faces, and if so can you register them as you can with human faces to prioritise them? Have to test that...
It would be a marketing disaster if they started advertising cameras as having "Rusty autofocus" :D
life is too short....
Does an external meter ask which camera you use?
No, it uses 35mm as a reference; you need to apply your own calculations if your sensor is either smaller or larger
@@DigiDriftZone I hope you are just joking, since an external meter reading (EV) is film/sensor size independent. My statement was meant to be sarcastic to prove the point, and I have never done any such calculation in my 45 year journey. 🙂
@@bfs5113 I’m not, contact the manufacturer, they assume 35mm for historic reasons, you need to apply your own calculations if that’s not what you’re using.
It's because of calibration. The sensor includes micro-lenses that further focus the light onto the individual photosites. So, if there's an 8mp (or whatever the original ones were) full frame sensor from decades ago, it will have more light hitting an individual photosite, because it's being channeled into a single spot. They're calibrated so that the settings don't deviate too far from what an external light meter says, for the sake of everybody's sanity.
But if you took a much lower resolution sensor with larger microlenses, you would otherwise see a difference. But we don't ever get that setup, because it would be dumb. So they get calibrated so that the exposure matches when necessary. In practice though, there's never that much difference within a generation or two, and so you wouldn't see much of a difference without comparing a first gen FF against a current gen cropped sensor - in which case you still wouldn't see it, due to the calibration process eliminating any significant deviation that might occur.
Further proof is that a light meter does NOT have settings for different sensor sizes.
A very good point Lonnie 😉
Why would it? It measures light per square inch. Larger surface area, more total light, but the light per square inch never changes.
What a man... finally you answered it with proof!! Keep it up 👍👍
Very well explained, and frankly it's dumb that so few people understand this. In fact it's embarrassing when lens manufacturers have to write articles to explain this concept because of all the false information out there.
Great video as always. I hope rusty’s merch is selling well 👏👏
2:00am in the morning .. Nice and Simple Dave. Brilliant.
Waiting for a response from Tony N. "several subscribers have contacted me saying, Dave McKeegan has just said," ..😂
Who cares, Rusty is Back.. 😁😁
Informative video again Dave thanks !!
Tony N has never claimed that exposure is relative to crop factor. In fact, I've never heard anyone ever claim that, so this whole video seems like a strawman correction.
@@marcus3d No, but he did claim that 23mm f/2.8 ISO400 on APS-C is equivalent to 35mm f/4.2 ISO900. You cannot apply the crop factor calculation to just focal length and then ignore aperture and ISO, it doesn't work like that. Once you apply the crop factor calculation to all 3, you'll see a near identical image. The images he is showing either have a different field of view, different depth of field, or different amount of noise, so it's not the "same settings". Those settings are useless without applying the crop factor. It is like saying I am going at 60 speed without specifying if it's miles or kilometres.
@@DigiDriftZone Yes, and Tony N is absolutely correct in that claim.
I have never seen a sensor size setting on a light meter
Why would you, it's per square inch. But larger surface area, more light captured.
A lot of people have been told that a 2.8 lens on APS-C is like an f4 on full frame. That's only correct for depth of field.
A 1.8 lens is as good in low light regardless of whether it's on APS-C or full frame.
That's the answer to all my problems - all I wanted to hear for months. So my 1.8 lens on APS-C is as bright as a 1.8 on full frame, so the two different sensors get the same light, right? But what about when you use a speed booster? They claim one more stop of light. Does that mean that using the same 1.8 lens with a speed booster on a crop sensor makes my crop sensor receive more light than the full frame one, as well as giving the same depth of field?
While the exposure settings will be the same the end effect will not. APSC will have more noise at the same ISO than a full frame sensor with the same megapixels, so to get the same image quality you’ll need a lower ISO, which is why it makes sense to apply the crop factor to focal length and aperture rather than just focal length
This is very true, and what Christ K also said is true: at any given ISO the APS-C will have more noise than a full frame. However, with all these Sony cameras being incredibly good at low light imagery, said noise really only becomes a factor when shooting at high ISOs. If you can't get a relatively clean image with a Sony APS-C camera/lens, that says more about your technique than the limitations of the gear.
@@GeraldBertramPhotography I don't think anyone's arguing that it's not possible to get great pictures with crop sensors, but the conversions can still be very useful. I regularly use FF, APSC, as well as 1 inch sensor cameras, so knowing roughly what to expect from image quality with quick conversions can be very useful.
If it's golden hour and I have to decide between grabbing my RX10 or lugging out my a7iii+200-600 to shoot some birds I need to know that the noise on the smaller sensor will quickly spike as the light leaves. If I just look uncritically at the f/4 of the RX10 at 600mm vs the f/6.3 of the 200-600 then it looks like the RX10 is the better option for low light, which is obviously not the case
A speedbooster takes a full frame lens and redirects the excess light that would otherwise miss an APS-C sensor. This reduces the crop factor of the sensor (so rather than being 1.5-1.6x crop it might only be a 1.2-1.3x if a speedbooster is used) - it also means the density of light hitting the sensor is higher so you get a brighter exposure - it effectively lets your sensor receive 1 stop more light than it usually would.
I have another video here which explains it properly.
th-cam.com/video/e852D0-Bsf4/w-d-xo.html
Finally someone with some common sense
Literally nobody claims that smaller sensors produce lower exposure.