How much exposure time do I need? Use my free SNR app!

  • Published Jun 7, 2024
  • I made a free SNR app that can help you figure out how much integration time you need. It's simple to use! Check it out here: deepskydetail.shinyapps.io/Ca...
    #astronomy #astrophotography #neuralnetwork #gimp #rstudio #photography #machinelearning #free #celestron #zwo #siril
    Try the tool I made! Download it from GitHub here:
    github.com/deepskydetail/Astr...
    For those interested, here is the gear I use (or equivalent; Orion has discontinued some of their products, so the equivalent skywatcher products are listed). Most of the links are Amazon Affiliate links. Thanks!
    Skywatcher HEQ5 Mount amzn.to/3NVrVQi
    Skywatcher 80mm F7.5 Doublet amzn.to/47mbfZi
    ZWO EFW mini filter wheel amzn.to/48OxhoN
    Optolong LRGB Filter Set (1.25") amzn.to/47r23Tw
    Orion 0.8x Reducer amzn.to/47s64a4
    USB RJ45 Cable for Controlling Mount amzn.to/4aPlIzu
    Raspberry Pi to control mount amzn.to/48mTBWH
    R Pi case with touch screen amzn.to/3HvdNKl
    ZWO Autofocuser amzn.to/3Skg0hX

Comments • 103

  • @deepskydetail
    @deepskydetail 1 year ago +5

    Here is the URL to the SNR planning app: deepskydetail.shinyapps.io/Calculate_SNR/

  • @anata5127
    @anata5127 1 year ago +1

    Superb! I will try this. Thanks for your efforts.

  • @ByquPL
    @ByquPL 1 year ago +3

    I subscribed to your channel faster than one can say "WOW man, this is superb!"

  • @Graeme_Lastname
    @Graeme_Lastname 1 year ago +1

    I have just recently "discovered" your channel. I'm very interested in this and all that goes with it. Looking forward to the next one. Thanks m8. 😃 🇦🇺 🖖

  • @AntonioPena1
    @AntonioPena1 1 year ago +1

    James, after watching your video I'm definitely looking to use your application. Thanks for reviving Craig's work. Good job!

  • @viewintospace
    @viewintospace 1 year ago +1

    Thanks a lot! Amazing video and amazing app!!! Really cool stuff!

  • @marvinwhisman3333
    @marvinwhisman3333 4 months ago

    Great video. Thanks so much for your work and contributions to the community.

    • @deepskydetail
      @deepskydetail 4 months ago

      Thank you! Glad to do it!

  • @stefan_astro
    @stefan_astro 1 year ago +3

    Really cool app! Thank you for creating it. I will try it out myself as well!

  • @yougoattube
    @yougoattube 1 year ago +2

    Hats off to you, sir! I've made a small donation to help the cause along, and to encourage others who might want to contribute to the astrophotography/astronomy community. Many thanks! PS - I also subscribed to your channel.

  • @TheBillyclash
    @TheBillyclash 1 year ago +1

    Bravo, you rock!

  • @monkeypuzzlefarm
    @monkeypuzzlefarm 1 year ago +3

    Fantastic video! I can't believe how quickly you whipped up that app!

    • @deepskydetail
      @deepskydetail 1 year ago

      Thanks! This app wasn't too difficult because the math was already in the original spreadsheet. It was just a matter of turning it into simple code. Plus, the R packages that I use to make the application make coding everything pretty simple.

  • @andiheimann
    @andiheimann 1 year ago +1

    awesome work!

  • @yervantparnagian5999
    @yervantparnagian5999 1 month ago +1

    Very interesting. Thank you for your work. I suspect that, seeing as most everyone uses some type of filter, the hours of integration/exposures will go up exponentially.

  • @HyperX07
    @HyperX07 1 month ago +1

    Thank you, bro, for your hard work.

  • @mohicanspap
    @mohicanspap 1 year ago +4

    I hope this tool gets more developed over time. Such a great thing to have especially for beginners or even professionals. Also with the camera I have I don’t use bias frames but dark flats. Hopefully there will be an option for those in the future. Great work!!!

    • @deepskydetail
      @deepskydetail 1 year ago +2

      You should be able to just take some bias frames to get the calculations. I only use dark flats too, but taking a few bias frames is easy and the numbers should be valid for a while.

    • @mohicanspap
      @mohicanspap 1 year ago

      @deepskydetail Good to know! Thanks for getting back 😃

  • @Thommy78
    @Thommy78 1 year ago

    Great work, awesome video. Thank you very much!

  • @DrillingDataSystems
    @DrillingDataSystems 3 months ago +1

    Great work! Thanks..

  • @Ronbo765
    @Ronbo765 1 year ago +1

    Thank you!

    • @deepskydetail
      @deepskydetail 1 year ago

      You're welcome! And thanks for watching!

  • @mikehardy8247
    @mikehardy8247 7 months ago +1

    This should prove very useful to a newbie, who probably oversamples.

  • @Nabby13
    @Nabby13 1 year ago +1

    Well done! One question: how is quantum efficiency of a camera taken into account? The shot noise originates from Poisson distribution of photons. Are you assuming that the quantum efficiency is something like 85% and then taking that into account?

    • @deepskydetail
      @deepskydetail 1 year ago +2

      I don't think quantum efficiency is needed to get the signal. Why? Because the total signal is being calculated from the light frame itself. If QE goes up for a camera, then you'll see a brighter image in your light frame (i.e., the ADU will be higher). In other words, the app will adjust the signal based on the two samples you took from the light frame.

  • @pipercherokee8598
    @pipercherokee8598 1 year ago

    Excellent work. One thing I can't quite get my head around: no matter what numbers I plug in based on my subs and biases, the app always suggests something in the order of hundreds or even thousands of hours needed, even to get an SNR of 75 or so. I haven't seen anything with my frames to suggest they are overly noisy; what might I be missing? I am also using Siril stats to get the values. I am happy to take 10 or even 20 hours' worth over a few nights, but not hundreds :). Thanks again, this will be very valuable once I get it figured out!

    • @deepskydetail
      @deepskydetail 1 year ago

      Thanks, do you mind sharing the numbers you're putting into the app?

    • @pipercherokee8598
      @pipercherokee8598 1 year ago +1

      @deepskydetail So sorry I missed your reply here. I actually did get it figured out (though I don't remember what the solution was at the moment). Much appreciated.

  • @DKelly350
    @DKelly350 1 year ago +1

    Thanks, this sounds like a great tool. Do you know how I would relate camera gain (I assume you are using an Astro-camera) to ISO on a one-shot-color camera?

    • @deepskydetail
      @deepskydetail 1 year ago

      I'm going to have a video on it. It will probably come out today or tomorrow.

  • @chrisschomburg4936
    @chrisschomburg4936 1 year ago +1

    Nice tool and a nicely designed video. Well done. Maybe you want to check your exposures. The graph for your camera showed 1.9e read noise, not 0.9e.

    • @deepskydetail
      @deepskydetail 1 year ago +1

      Thanks, but don't I have 2 for my read noise in the video? 0.9 is the gain value in electrons per ADU.

    • @chrisschomburg4936
      @chrisschomburg4936 1 year ago +1

      @deepskydetail Aaaah, got it. 😅

  • @samwarfelphotos
    @samwarfelphotos 1 year ago

    This is super cool! I use a Nikon D5300 to image; where can I find those camera stats for it, since I don't use an astro cam that gives them to you on its website?

    • @deepskydetail
      @deepskydetail 1 year ago +1

      You should check out this video I made for DSLRs: th-cam.com/video/oBpAkHim7dY/w-d-xo.html
      It takes a bit of calculation to get the RN and gain for DSLRs but it is doable.

    • @samwarfelphotos
      @samwarfelphotos 1 year ago

      @deepskydetail Thank you!

  • @xtctaz
    @xtctaz 9 months ago

    Very cool. Is there a way to skip bias frames? I have a DWARFLAB dwarf2 smart telescope, and normally most users of this scope do not use bias or flats.

    • @deepskydetail
      @deepskydetail 9 months ago

      I think you'd need bias for accurate results. But they're really easy to take. The dwarf2 might have a function to let you do that (but I'm not sure). If not, I'd email the developers to get that functionality :)

  • @pcboreland1
    @pcboreland1 1 year ago +1

    Fantastic work! What do you mean by bias signal? I do not take bias frames with a CMOS camera. There are darks, lights, flats and flat darks. Please say more about getting the value needed to enter into your bias input.

    • @bbasiaga
      @bbasiaga 1 year ago +3

      A bias frame is the same ISO as your flats/darks, with the shortest possible exposure. It contains basically only read noise. It's an alternative to a flat dark for calibrating noise out of your flats.

    • @deepskydetail
      @deepskydetail 1 year ago

      Yes, I would take a very short exposure with the same ISO as your dark and light to get the bias signal.

    • @pcboreland1
      @pcboreland1 1 year ago

      @deepskydetail Thanks. I have been testing your very nicely built app; some parameters, like read noise, have little to no impact on the curves. Neither does the bias signal. I know you based this on a spreadsheet's set of equations; what are your thoughts on this?

    • @deepskydetail
      @deepskydetail 1 year ago

      @pcboreland1 I've gotten similar questions on this. Basically, read noise should affect the curve, but it might not have a huge effect (especially if you have a lot of light pollution).
      The bias signal should have an effect, but the dark signal shouldn't. Basically, the skyglow noise estimate takes into account the dark signal (skyglow - dark), and the total dark signal also takes into account the dark signal (i.e., dark - bias). So, in the end, those two things cancel each other out in the math, and the dark signal is kind of redundant. But they might eventually be relevant if I ever include more information in the app.
      The biggest impact is probably going to come from the target signal and the skyglow.
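
      As a rough illustration of the bookkeeping described above (a simplified sketch with made-up numbers, not the app's actual code), the dark term is subtracted in one place and added back in another, so it drops out of the combined noise estimate:

        # Mean ADU values from Siril/PixInsight statistics, plus camera gain (e-/ADU)
        # and read noise (e-). Shot, skyglow and dark noise variances add in quadrature.
        sub_snr <- function(dso_sky_adu, sky_adu, dark_adu, bias_adu, gain, read_noise) {
          target_e <- (dso_sky_adu - sky_adu) * gain   # DSO signal
          sky_e    <- (sky_adu - dark_adu) * gain      # skyglow (skyglow - dark)
          dark_e   <- (dark_adu - bias_adu) * gain     # dark (dark - bias)
          # sky_e + dark_e == (sky_adu - bias_adu) * gain, so dark_adu cancels out.
          target_e / sqrt(target_e + sky_e + dark_e + read_noise^2)
        }
        sub_snr(dso_sky_adu = 1500, sky_adu = 1200, dark_adu = 510,
                bias_adu = 500, gain = 1.0, read_noise = 2)   # ~9.5 per sub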

    • @dimitriospetikopoulos2420
      @dimitriospetikopoulos2420 11 days ago

      I was wondering if any of this applies to OSC DSLR cameras

  • @galactus012345
    @galactus012345 1 year ago +1

    Thank you. Very interesting tool; I am using it. We can see from the graph that it can go beyond 100.
    So it means that at some point adding frames is no longer increasing the SNR. What to think about people adding 100H?

    • @deepskydetail
      @deepskydetail 1 year ago

      Thanks! I'm not sure exactly what the question is. What do you mean by "adding 100H"? Thanks :)

    • @--Adrian--
      @--Adrian-- 1 year ago

      No, gathering more exposure time will always increase the SNR

    • @galactus012345
      @galactus012345 1 year ago

      @deepskydetail Sorry for the typo, I meant 100 subs. Isn't an SNR of 100 a limit? Are we increasing the quality by going over SNR 100?

    • @deepskydetail
      @deepskydetail 1 year ago

      @galactus012345 That's a good question, and I should have been clearer in the video. You can theoretically have infinite SNR if you have infinite integration time. It's a ratio, not a percentage, so it doesn't stop at 100.
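
      For reference, the standard relationship (assuming uncorrelated noise between subs) is that stacked SNR grows with the square root of the number of subs, so it passes 100 and keeps climbing, just ever more slowly:

        # Stacked SNR for N equal subs: SNR_stack = SNR_sub * sqrt(N)
        snr_sub <- 2.5                        # hypothetical per-sub SNR
        n_subs  <- c(100, 400, 1600, 6400)
        data.frame(n_subs, snr_stack = snr_sub * sqrt(n_subs))   # 25, 50, 100, 200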

  • @Groeko
    @Groeko 1 year ago

    How do I calculate the needed exposures per filter for mono cameras? If I use L-RGB filters, do I simply divide the number by 4?

    • @deepskydetail
      @deepskydetail 1 year ago +1

      I think most of your data will be from the L filter, so I would use that and be a bit conservative in the estimate. If you do want to include the RGB, I think you'd want to use a weighted mean depending on the % of frames that will be L vs RGB.
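
      One way to read the weighted-mean suggestion (hypothetical numbers, not from the video): weight each filter's estimate from the app by the fraction of frames you plan to shoot through that filter.

        # Hypothetical per-filter exposure estimates (hours) blended by planned frame mix.
        hours_per_filter <- c(L = 8, R = 20, G = 22, B = 24)
        frame_fraction   <- c(L = 0.55, R = 0.15, G = 0.15, B = 0.15)
        sum(hours_per_filter * frame_fraction)   # ~14.3 hours as a rough blended plan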

  • @pcboreland1
    @pcboreland1 1 year ago +1

    Hi! Changing the camera read noise value or changing the dark signal value appears to have no impact on the SNR curve. Is this expected behavior? In fact I can set both values to zero!

    • @deepskydetail
      @deepskydetail 1 year ago

      Hi, I just checked, and:
      (1) Changing the read noise value does change the SNR curve for me. Try increasing the read noise to something like 500 to see if it changes. It did for me.
      (2) After double-checking, it is normal that changing the dark signal value does not change the curve. Basically, the skyglow noise estimate takes into account the dark signal (skyglow - dark), and the total dark signal also takes into account the dark signal (i.e., dark - bias). So, in the end, those two things cancel each other out in the math, and the dark signal is kind of redundant. But they might eventually be relevant if I ever include more information in the app.

  • @user-lp6ri9dn4j
    @user-lp6ri9dn4j 2 months ago +1

    Thank you for the videos and apps! It is most appreciated. What makes me very sad is to see the exposure time I need to get an SNR of 50 under Bortle 5 skies (mag 20): 144 hours when I use 60s frames. But the exposure time depends a lot on how I frame the selection when I get the statistics. I used a small galaxy, NGC 3184, and when I only framed part of the galaxy I got an exposure time of 27 hours; when I framed more than the galaxy, I got an exposure time of 144 hours! So what do you think regarding how to obtain statistics from light frames? I used Siril, by the way.

    • @deepskydetail
      @deepskydetail 2 months ago +1

      Thanks! With regard to your questions, you really need to get a good sample of the sky background for things to give you an accurate measurement. Take the statistics of the sky background as close to the galaxy as you can, making sure you don't include the galaxy in it. Also, the part of the galaxy you take a sample of matters. The resulting SNR in the app is only for the sample that you took. So, if you took a sample from the outer arms of the galaxy, you'll need more imaging time to get the same SNR compared to the brighter core. Hope that makes sense.
      Also make sure you're using an "average" sub frame: one that isn't your best, but not your worst either. These things are important! I hope that answered the question.

    • @user-lp6ri9dn4j
      @user-lp6ri9dn4j 2 months ago +1

      Thanks for your reply! I have tried to do as you described, and I find that the results make more sense. 23.45 hours under Bortle 5 skies for an SNR of 50 makes more sense to me. With the 10 hours of integration time I've got, I have to use binning to achieve a fairly acceptable result. Thank you again! Question: is it possible to measure the SNR directly in a stacked FITS file? Sometimes I wish I could do that to be able to compare stacks with different exposure times (60s vs 180s), with the same total time for both.

    • @deepskydetail
      @deepskydetail 2 months ago +1

      @@user-lp6ri9dn4j So, I think different stacking software will attempt to measure SNR of a stacked image. But, I'm not sure how well they work, and different software produce different results. I've measured the SNR in a few different videos myself (see my Bortle analysis, Ha filter analysis or SNR and stacking analysis videos). I'm thinking of making an app that can do that with help from Siril by using registered sub frames. But, as far as I know, you need to measure the SNR as you stack.

    • @user-lp6ri9dn4j
      @user-lp6ri9dn4j 2 months ago

      Thank you, @deepskydetail!

  • @felipeleon_astrofotos
    @felipeleon_astrofotos 1 year ago +1

    Hello, @deepskydetail. I've been studying this stuff recently from the articles of Craig Stark, and have also been using the spreadsheet (the same one that appears at 1:40 of your video). I have a doubt; I hope you can help me see where I'm wrong. If I enter into the spreadsheet the very same data you show in your Excel file (the data loaded automatically when opening the web app), the target SNR for a single image is 0.794, and then, to get to an SNR of 5, I'd need 40 images (this last calculation is also made by the spreadsheet itself in the Stacking sheet). The problem is that this is not the value that the web app gives (it says I only need 33 subs). Where is the difference coming from?
    Thanks in advance. By the way, the web app is really amazing and helpful.

    • @deepskydetail
      @deepskydetail 1 year ago +1

      Thanks for looking into this!! Just fixed it based on your comment! I double-checked, and it seems like there was a mistake in the code when calculating the signal in electrons. It was throwing off the SNR of a single sub exposure a bit, which of course has compounding effects!

    • @felipeleon_astrofotos
      @felipeleon_astrofotos 1 year ago +1

      @deepskydetail Hey! Thanks for considering my comment. And as I said, it's a really helpful tool. Thanks for your work! (BTW, now the numbers match)

  • @Xanthus723
    @Xanthus723 1 year ago

    I must be doing something wrong. I'm using PixInsight to grab the ADU: I make a preview of the faint and dark areas and then select the Statistics tool, like you've got there in Siril.
    Problem is, the numbers seem way off. I have the 16-bit selection of the dropdown selected, since that's what my 2600MM is. I tried normalized and 14-bit, since those numbers were closer to yours. Neither worked.
    The outputs I got were as follows: Input 1: 3078, 2: 1909, 3: 12, 4: 10.9. The camera stuff: 1: 0.21, 2: 1.5. Desired SNR: 90. Exposure duration: 90.
    Halp!

    • @deepskydetail
      @deepskydetail 1 year ago

      It looks to me like the ADU values for the light frame might be off (maybe too big of a difference between them?). What is the target you are shooting, and what is the light pollution in your area?

    • @Xanthus723
      @Xanthus723 1 year ago

      @deepskydetail m101 and bottle 4

    • @Xanthus723
      @Xanthus723 1 year ago

      Bortle*

    • @deepskydetail
      @deepskydetail 1 year ago

      I'm having a bit of trouble figuring out which numbers go where. Is this correct?
      DSO + Skyglow Signal: 3078
      Skyglow background: 1909
      Dark Signal: 12
      Bias Signal: 10.9
      Camera Gain: 21
      Camera's Read Noise: 1.5
      If I put those numbers in, I get 9.7 hours to get an SNR of 90. But I think the dark and bias signal values may be off. Are those numbers in ADU? I would expect those to be a bit larger. And the camera gain should be lower (probably around 1.0 or so if you are at unity gain).
      Also, the difference between the DSO+Skyglow and Skyglow numbers seems a bit too large. I'd double-check those and make sure you're getting an output in ADU and not some other unit.

  • @andrewoler1
    @andrewoler1 2 months ago

    I'm trying to get an estimate from my subs, but when I load them into Siril, I get separate numbers for red, green, and blue. Which color am I supposed to use? They're unbalanced, so I get different estimates for each: the estimates are 27 hr, 5 hr, and 8.5 hr of exposure using the red, green and blue mean values, respectively. Any ideas on the best way to handle this? Thanks!

    • @deepskydetail
      @deepskydetail 2 months ago

      Good question. Are you using RGB filters with a monochrome camera or a one shot color camera? If the former, I would use the RGB compositing tool, do a quick color balance and then use the RGB tab to find the DSO and sky background values. Then put that into the app and multiply the sub length by three. If you're using a one shot color, then use the RGB tab. Hopefully, I understood the question.

    • @andrewoler1
      @andrewoler1 2 months ago

      @deepskydetail Thanks for the reply. I'm using a one shot color camera. I save them as FITS files using SharpCap, then load them into Siril. By default it loads them as black and white, but when I get statistics it breaks them down by R, G, B separately. If I debayer upon opening, it shows tabs for R, G, B, and RGB, but even with the RGB tab, when I get statistics on a selection, it splits them into R, G, B channels.

    • @andrewoler1
      @andrewoler1 2 months ago

      @deepskydetail I do notice there's a box in the top right corner of the statistics window that says "Per CFA channel", which is checked, but I'm unable to uncheck it. Maybe that would convert them to a single number if I could uncheck it. Not sure why it won't let me uncheck the box…

    • @andrewoler1
      @andrewoler1 2 months ago +1

      Okay I figured it out. I can load an image and NOT debayer it, then get stats on a selection and it will let me uncheck “Per CFA channel” and report a single value as in your video.

    • @deepskydetail
      @deepskydetail 2 months ago

      @andrewoler1 Glad you figured it out! :)

  • @PeterK6502
    @PeterK6502 2 months ago

    I tried the link to calculate the SNR, but the "Camera Gain (e-/ADU)" does not make any sense to me. e-/ADU stands for "electrons per Analog-to-Digital Unit," which implies more sensitivity for lower values. The lowest read noise for my sensor is at gain 450, where the e-/ADU value is near 0 according to the specifications of my camera. But when I use 0 in your app, then I need infinite exposures according to your app. This does not make any sense to me because that does not match with my observations.

    • @deepskydetail
      @deepskydetail 2 months ago

      1) Camera gain is not read noise. e-/ADU basically describes how many electrons are sent to the processor each time a photon hits your sensor. If it is 0, then you won't record any photons (as measured electrons). Higher numbers mean more sensitivity.
      2) People generally use unity gain, or close to it. It just means that if a photon hits your sensor, it will record it as one electron. Higher gain values generally mean more shot noise* [struck through: ", which is a separate metric you put in as "Camera's Read Noise (Electrons)""].

    • @PeterK6502
      @PeterK6502 2 months ago

      @deepskydetail You misunderstood me. I know that camera gain is not read noise, but I have a camera with the IMX585 sensor, and according to the specs, when a high gain of 450 is selected, the e-/ADU is near 0. According to your explanation and app, it will NOT record any photons; however, this setting is MAX gain, and therefore it records the maximum number of photons possible. When I select a gain of 0, the e-/ADU is about 12. According to your explanation, this is the most sensitive setting; however, this is the LEAST sensitive setting. Your explanation only makes sense if you talk about ADU/e- and not e-/ADU.
      Higher gain gives lower read noise for the IMX585 (not higher). This sensor does have a read noise of 12 when gain 0 is used, read noise of 4 for unity gain, and read noise of 1 for gain 450. Therefore, you have the lowest read noise when the highest gain is selected.
      Therefore, I use a maximum gain of 450 for very faint objects, to have the most sensitivity (e-/ADU near 0) with the least read noise.
      See specs: astronomy-imaging-camera.com/product/asi585mc/

    • @PeterK6502
      @PeterK6502 2 months ago

      @deepskydetail I try to reply, but YouTube's bot removes my reply instantly. I will try to reply in several posts.

    • @deepskydetail
      @deepskydetail 2 months ago

      Oh, I see what you're saying now. I explained it poorly, and have updated my previous answer a bit. A better explanation is that e-/ADU is how many electrons represent 1 ADU. If it is 0.5, then each intensity step (Analog Digital Unit) represents 0.5 electrons. When gain increases, fewer recorded electrons make the image brighter. When e-/ADU is below one, you aren't actually improving SNR because the signal is being artificially amplified. In the extreme case when e-/ADU is zero, then your camera will be saturated with a zero second exposure (i.e., you'll get a completely white image).
      Example: If e-/ADU is 0.5, then if two photons are captured and converted perfectly to electrons, the ADU would increase by 4. But the true signal (2) is unchanged.
      With that said, you shouldn't use the read noise to guide your gain settings. A good rule of thumb is to use unity gain. Why? Read noise isn't as important as shot noise and dynamic range. Let's say your e-/ADU is still 0.5 and, like before, you capture two photons. Higher gain is not magic: the signal is still 2. But the shot noise (not the read noise) has increased. Instead of being sqrt(2) if you had unity gain (1 e-/ADU), you have a shot noise of sqrt(4)=2.
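
      A tiny illustration of the conversion described above (only the e-/ADU bookkeeping, with made-up numbers): the same two captured electrons map to larger ADU values as e-/ADU drops, without any extra signal being recorded.

        # ADU recorded for a fixed number of electrons at different gain settings.
        electrons <- 2
        e_per_adu <- c(2, 1, 0.5, 0.25)      # 1 e-/ADU is "unity gain"
        data.frame(e_per_adu, adu = electrons / e_per_adu)   # 1, 2, 4, 8 ADU; still 2 e-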

    • @PeterK6502
      @PeterK6502 2 months ago

      @deepskydetail
      The reason everyone uses "unity gain" is because everyone uses unity gain.
      This, in my opinion, is more of a copy behavior than an actual good reason.
      Because I work with a Dobsonian on an equatorial platform, I want to minimize the length of sub-frames to minimize tracking errors.
      That's why I've done some research into "deep sky lucky imaging," where they work with sub-frame lengths as short as 4 seconds.
      An explanation can be found here (in French): th-cam.com/video/0lKCB0q0jV0/w-d-xo.html
      The section about gain settings starts at 19:30. So, the gain is set slightly below the maximum value there to minimize read noise (in other words, not unity gain).
      This is exactly what I do as well; I set the gain to almost the maximum value and take many short exposures.
      I came to your page precisely because I want to know how short I can make my exposures without read noise becoming the dominant factor.
      But I could not extract this information with your application.
      Currently, I take a few thousand sub-exposures of 15 seconds each per object, but I want to reduce the sub-exposure to below 10 seconds.
      Your previous explanation about the conversion process is correct; higher gain settings above unity gain do not make the camera more sensitive, but they do reduce the read noise.

  • @dadwhitsett
    @dadwhitsett 1 year ago

    According to this, then, you will significantly shorten the total integration time by using short-duration light frames, correct?? I calculate that if I use 36 seconds per light frame I can reach SNR > 100 after 7.9 hours, but with a 180-second duration it would take 40 hours???? Really?

    • @deepskydetail
      @deepskydetail 1 year ago

      Thanks for the comment. What numbers are you putting into the app, and how are you arriving at such different results? The app tries to estimate noise and target signal on an electron-per-second basis. I haven't seen that sort of discrepancy with what I have tested.
      For example, in my light-polluted area, to get an SNR of 30 on M78 with 300-second exposure frames, I would need about 15.5 hours of data (300s x 186 exposures = 15.5 hours). When I use the **slider** to estimate the SNR with 60-second exposures, it tells me I still need 15.55 hours of exposure time (60s x 933 exposures). If I move the slider up to 5x, I still get 15.83 hours of exposure time (1500s x 38 exposures).
      If you could comment on the parameters you're using to get both results, I'd appreciate it. Thanks!
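
      A rough way to sanity-check this (a simplified model with made-up per-second electron rates, not the app's code): once a sub is long enough that sky noise dominates read noise, the total hours needed barely change with sub length.

        # Total hours to reach a target stacked SNR for a given sub length (seconds).
        hours_needed <- function(t_sub, target_rate, sky_rate, read_noise, snr_goal) {
          sig    <- target_rate * t_sub
          noise  <- sqrt(sig + sky_rate * t_sub + read_noise^2)
          n_subs <- (snr_goal / (sig / noise))^2
          n_subs * t_sub / 3600
        }
        sapply(c(36, 180, 300), hours_needed,
               target_rate = 0.5, sky_rate = 5, read_noise = 1.5, snr_goal = 30)
        # ~5.6, 5.5, 5.5 hours: nearly identical, nothing like a 7.9 h vs 40 h gap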

  • @the.Potocky
    @the.Potocky 1 year ago

    You will need 30753 exposures to get an SNR > 100.1 at 180 seconds (1537.65 hours) 0.570814666845778
    What am I doing wrong?
    Lights: 513 and 508
    Bias: 501.8
    Dark: 502.5
    ZWO ASI6200MM Pro
    Gain: 100
    Exp: 180 sec
    😢😢😢😢 PLS HELP 🤦‍♂

    • @deepskydetail
      @deepskydetail 1 year ago

      1/2
      OK, let me see if I understand this:
      DSO + SkyGlow Signal: 513
      SkyGlow Background: 508
      Dark Signal: 502.5
      Bias Signal: 501.8
      Camera Gain (e-/ADU): 0.25 (at 100 regular gain, this is what I have for the ZWO 6200mm)
      Camera's Read Noise (Electrons): 1.5 (again, at 100 regular gain)
      Desired SNR: 100
      I get that you will need 1600 hours. That might be correct for 100 SNR. Notice that your DSO + Skyglow signal is only 5 more than your Skyglow background signal.
      That means the part of the image you are using to calculate it is really faint. Or you have a lot of light pollution.

    • @deepskydetail
      @deepskydetail 1 year ago

      2/2
      Which target are you trying to image, and what part of the target did you select? Also, what is your light pollution like?

    • @the.Potocky
      @the.Potocky 1 year ago

      @deepskydetail Hi! That's right. The photo we use for the reading was taken in Bortle 2, high in the mountains!

    • @the.Potocky
      @the.Potocky 1 year ago

      @deepskydetail The Spaghetti Nebula, Bortle 2.
      I have a mono camera; which filter is the best to take a photo with?
      Thank you!

    • @deepskydetail
      @deepskydetail 1 year ago

      Do you mind sharing the telescope and FOV you used? I think Ha and OIII would be the go-to filters for that target.
      The Spaghetti Nebula is pretty faint to begin with. If you are using a faint part of the nebula for your input, that might be the reason, especially if you aren't using filters at all. Maybe use a brighter part? Also, you're in Bortle 2, so you might want to use longer exposures. Noise will hardly go up, but your signal should improve (try 300s or even 600s if your mount can handle it).
      It should definitely be possible in Bortle 2, though! Nico Carver could do it in pretty light-polluted skies (www.nebulaphotos.com/sharpless/sh2-240/ ). Try a light frame using an Ha filter and see what you get. Let me know what the results are :)

  • @sHuRuLuNi
    @sHuRuLuNi 1 year ago

    who takes 15s subs ...

    • @michalringes4386
      @michalringes4386 1 year ago +2

      Pro advice here: do the shortest possible subs that are sky-limited and take as many as possible, discarding any that are not perfect (guiding errors, worse seeing and such); it will improve your photos dramatically. You only want to go to longer subs for extremely faint objects where there is almost no signal at all in a single sub. What matters is total integration time, and when your subs are sky-limited you won't gain much by doing longer subs, but you can lose a lot: it can be pretty difficult to get perfect guiding for a longer time, and your subs will be more blurry if the seeing is not super stable. With shorter subs you can be very selective about what you use at the end; discarding a 30s sub is much easier than a 10-minute one.
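
      The "sky-limited" idea above is often turned into a rule of thumb (not from this video, and the exact factor is a matter of taste): make each sub long enough that the accumulated sky signal swamps the read-noise variance, e.g. at least ~10 × RN² electrons of sky per pixel per sub.

        # Shortest sub (seconds) meeting a common sky-limited rule of thumb:
        # sky_rate * t >= k * read_noise^2, with k around 5-10.
        min_sky_limited_sub <- function(sky_rate_e_per_s, read_noise_e, k = 10) {
          k * read_noise_e^2 / sky_rate_e_per_s
        }
        min_sky_limited_sub(sky_rate_e_per_s = 3, read_noise_e = 1.5)   # ~7.5 s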

  • @mrpoopybutthole666
    @mrpoopybutthole666 1 year ago +1

    This is awesome! Can it be used with a DSLR, like a Canon 60D? There is no gain, only ISO.

    • @deepskydetail
      @deepskydetail 1 year ago +3

      The short answer is yes. The long answer is you will have to do some calculations to figure it out. Below are some quick (although not the most accurate) calculations:
      For the gain, you need to take two flats. Get the average value of your flats in Siril in terms of ADU. Add the two means together (i.e., Mean of Flat 1 + Mean of Flat 2). Then find the variance of the difference between the flats (you'll probably need pixel math to do that), and divide that variance into the added means. So the formula will be:
      (Mean Flat1 + Mean Flat2) / Var(Flat1 - Flat2)
      For the read noise, you'll need a bias frame. Find the standard deviation of it, and that should give you your read noise.
      If I can figure out how to do it effectively in Siril, I might make a video on it.
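
      A minimal R sketch of those two calculations (the photon-transfer-style estimate described above). Loading the frames is left out; the matrices could come from, e.g., FITSio::readFITS()$imDat, and the helper names here are made up. Note that the bias-frame standard deviation gives read noise in ADU; multiplying by the gain (e-/ADU) converts it to the electrons the app asks for.

        # Gain (e-/ADU) from two flats and read noise from a bias frame,
        # with frames supplied as numeric matrices of raw ADU values.
        estimate_gain <- function(flat1, flat2) {
          # (Mean Flat1 + Mean Flat2) / Var(Flat1 - Flat2), as in the reply above
          (mean(flat1) + mean(flat2)) / var(as.vector(flat1 - flat2))
        }

        estimate_read_noise <- function(bias, gain_e_per_adu) {
          sd(as.vector(bias)) * gain_e_per_adu   # SD of bias in ADU, converted to e-
        }

        # Usage sketch:
        # g  <- estimate_gain(flat1, flat2)
        # rn <- estimate_read_noise(bias, g)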

    • @yougoattube
      @yougoattube 1 year ago +1

      Can't speak for other DSLR/Siril users, but it would be great if there's a way to get the data needed, and even greater if someone made a video of it!

  • @astroattorney
    @astroattorney 1 year ago

    SharpCap Smart Histogram Brain