Corrections, comments, and clarifications:

1) First comment - go check out reddit.com/r/trytryagain! I made a community post a while back and got a lot of support for the idea, and this was the most popular name by far, so here it is! And I'm REALLY looking forward to seeing more projects iterating along!

2) A bunch of people have asked if the camera had lens stabilization on, and I regret to inform you that you're totally right. In retrospect, this was something I'd wondered if I messed up, but I didn't check until actually writing this comment - it wouldn't have helped me while I was stabilizing, but now it makes me feel dumb lol... Speaks to the need for proper testing, and for having a checklist more exhaustive than you think you need... Granted, I was also wiping dew off the lenses every 2 minutes for the first hour or so of the total eclipse, and still at a relatively high frequency after that, so I was certainly adding jitter manually...

3) I've had a few people ask for just the timelapse, so I'll throw that up as a standalone 4K clip - hopefully the compression doesn't hit it too hard!
Hey, nice video, and impressive that you made that whole moon tracking system!

I'm sure you've thought of this, but some things make me worry that you missed something.

At 9:40 you mention the muted grey region where different exposures are overlapping. Then, at 10:30, they were "added" together in the most literal sense of the word.

The thing is that adding images like these directly will give a grey region, but it doesn't have to if the blending is done properly. I was curious, so I made a comparison. It shows a muted grey similar to your situation, and it disappears completely with correction. (Removed the imgur link because YT keeps deleting my comment. Let me know if you want me to send it somewhere else.)

The key to high dynamic range is indeed in the transfer function and the logarithmic perception of the human eye, although I'm not sure what you meant exactly by connecting two straight lines.

I understood that you artificially amplified the dark footage. But if that went well, you shouldn't have had to worry about the dark footage being brighter than the other.

The approach that worked for me (see source code) is to apply a log to the estimated real-life "light per second" (I think it's called radiosity?). So:

1. Divide each footage's brightness by its exposure time.
2. Average the result, but take a pixel from just one of the images if it was under/over exposed in the other.
3. Apply a logarithmic curve (like human perception), making the result an image with details in both highlights and shadows.

(The source code in the Imgur post is slightly different, because it multiplies the dark image instead of dividing the bright.)

I hope this helps!
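A minimal NumPy sketch of the three numbered steps in this comment (not the commenter's actual source code; the function name, the 0..1 normalization, and the clipping thresholds are all assumptions made for illustration):

```python
import numpy as np

def merge_brackets(bright, dark, t_bright, t_dark, lo=0.02, hi=0.98):
    """Merge two exposures of the same scene (pixel values normalized 0..1).

    Follows the commenter's recipe: divide by exposure time to estimate
    scene "light per second", average where both frames are well exposed,
    fall back to the usable frame where one is clipped, then apply a log curve.
    """
    # 1. Divide each frame's brightness by its exposure time.
    r_bright = bright / t_bright
    r_dark = dark / t_dark

    # 2. Average, but use only the valid frame where the other is clipped.
    bright_ok = (bright > lo) & (bright < hi)
    dark_ok = (dark > lo) & (dark < hi)
    radiance = np.where(bright_ok & dark_ok, (r_bright + r_dark) / 2,
                        np.where(bright_ok, r_bright, r_dark))

    # 3. Logarithmic tone curve, rescaled back to 0..1 for display.
    out = np.log1p(radiance)
    return out / out.max()
```

With the shorter exposure passed as `dark` and its (shorter) exposure time as `t_dark`, highlights clipped in the long exposure are filled in from the short one, and the grey overlap region never appears because everything is blended in linear "light per second" space before the curve is applied.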
The fact that you did the entire processing by hand, started making the video, and then "tried, tried again" and just wrote the code anyway is why you are one of the great channels of YouTube.
I think a lot of photographers get a harsh reality check the first time they go out to photograph the moon. Don't even get me started on night time-lapses and condensation!
Yeah, the first time I tried I thought it would be relatively simple. I used film and was hoping that reciprocity failure would give me an advantage. Still haven't gotten a good photo.
Man, no kidding! I remember being younger and using an off-axis guider trying to take pictures of galaxies and stuff and hoping like hell the ridiculously long film exposures would turn out something... lol

Doing things digitally is obviously easier because you can see it right away, do image stacking, and don't have to expose as long, buuuut that does NOT mean that it's easy! 😂

People able to do high-end photography using a Schmidt telescope (not a Schmidt-Cassegrain, but an actual Schmidt telescope made for photography) are still absolutely incredible to me; that is an incredible art right there! Sticking a film blank in and hoping everything works out sort of a deal LOL
As a southern Arizonan, I'm glad I don't have to deal with condensation, but even in the desert there are clouds from time to time... And always at the worst time
Just spitballing, but do you think you could use an IR lamp / LED to keep the lens hot enough to drive off the moisture? I know cameras are sensitive to certain IR wavelengths but maybe there's something that wouldn't wash it out?
As a guy with a DSLR, I can confirm.

First of all, the autofocus refuses to work half the time. (And not at all if it's the stars you're trying to take a photo of.)

Then the auto exposure decides the moon is a white disc, and nothing more.

So finally, with enough zooming, manual focus, and manual exposure adjustment, it's still hard to take a proper photo without a tripod. And when you finally get the correct exposure and framing, the camera sees ONLY the moon, and not all the beautiful clouds around it.

And after getting a perfect photo, I'm all happy and upload it to Facebook, and I get 2 likes. Because nobody even starts to think I took the photo. Even a selfie of my ugly face in low lighting would get 10x more likes.

And if I ever tell a normal guy about it, all they say is "it looks exactly like the photos you get from Google Images." Of course it does. It's the same subject...

And star photography is crazy all on its own...
You don't post videos often but every time you do, it's worth the wait. Thank you for all your hard work, both your results and the effort you put into them are fascinating.
"There can be a perception that everything always works... when that is literally never the case." That's so true, no matter what you apply it to. I'm currently doing an internship in a molecular biology lab and I gotta tell you, the number of times we've gone through the full procedure from getting a gene sequence, to putting it into a plasmid, to getting that into a bacterium, etc. etc., just to find out at the end that the whole process had failed and we had to adjust something... it really boggles the mind that we've learned as much as we have in just a couple thousand years. Stuff goes wrong and that's just how it is. Great video as always
People are broken clocks that only tell the right time twice a day. The reason we made that progress is because we have a couple billion of those broken clocks making mistakes all the time and occasionally getting things right.
I love seeing the failures and hope you continue to highlight them on your channel. A similar channel that comes to mind is Cody's Lab. I think it's far more satisfying to see the entire process with all its challenges and failures. Not knowing if or how you will come to a conclusion can be a lot more thought-provoking than a clean-cut science channel that gives you all the facts like it's a simple process with no surprises.
A thousand times dimmer would actually be only about 1/30th of the pixel value, because the camera's gamma is going to be at least 2. Which is to say, pixel value is brightness raised to one over gamma. So at one thousandth of the brightness, the pixels would not be all zero.
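The arithmetic here checks out; a quick sanity check, assuming an idealized pure power-law transfer function (real cameras use more complex curves, e.g. the piecewise sRGB function, so this is only an approximation):

```python
def encoded_value(scene_brightness, gamma=2.0):
    """Pixel value (0..1 scale) under a pure power-law transfer function:
    v = b ** (1 / gamma)."""
    return scene_brightness ** (1.0 / gamma)

# A scene 1000x dimmer is encoded at roughly 1/32 of full scale,
# not 1/1000 - so the dim frames are far from all-zero.
print(encoded_value(1 / 1000, gamma=2.0))            # ~0.0316
print(round(1 / encoded_value(1 / 1000, gamma=2.0)))  # → 32
```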
It always did strike me as strange that in videos of a lunar eclipse, the moon would suddenly be brighter when it's fully in the earth's shadow. But I never stopped and thought about it; your explanation that cameras (and our eyes) automatically adjust to make it more visible makes sense. I applaud the effort that you put into this video. I don't personally know enough / have enough experience with image stabilization to have implemented what you did (or it would take me way longer). That being said, there will always be a trade-off between quantity and quality, and I think there are diminishing returns if we go too far on the quality side. You said that you are inclined to improve the quality of your videos given the higher number of subscribers you have, and so these videos take longer. I think it would be okay for you to scale it back a bit to ultimately share more knowledge. But these are just some thoughts from someone who barely has any experience making videos! If you think your process is fine, then keep at it and I'll keep enjoying the videos :)
Personally I love the long unplanned tangent projects you end up doing en route to a finished product; it's part of what makes this channel amazing. Tbh, I'm shocked you managed to do all of this in the month since the eclipse.
Throughout this video I knew that you were about to bang your head against that "perfectionist" perspective that you have. You did display the classic "devil in the details" issue that plagues a great experimenter. I applaud your honesty regarding how much personal time gets 'trashed' as one struggles to reach perfection. We are so fortunate to have access to your work.
i think our vision also benefits from the fact that we stitch a lot of separate images together to form our "view", so to speak. our eyes jump around extremely fast and our pupil contracts and dilates during all of those saccades, so that we have a more or less even contrast ratio across our entire visual field.
Last one posting eclipse content …. False, I still haven’t gotten motivated to post my (finally developed) film from that night. Haha. Awesome work, as always.
What a spectacular video. In my opinion, the numerous "failures" actually made it more interesting than just having a perfect video to watch, and really showcased just how much time, research, effort and planning went into this endeavor.
Great work! I make a hyper-local calendar for my area and wrote some custom code to plot the phase of the moon on it... in generating that curve (using data from elsewhere) I noticed a very slight wobble in the curve; I thought it was a drawing bug at first, but it turned out to be due to the same effect you described: the vantage point of the observer on earth changing as the earth rotated. Neat stuff!
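For comparison, a mean-period phase model (a crude, hypothetical sketch - not this commenter's code) is perfectly smooth by construction, which is exactly why the wobble stood out: it only appears when the phase is computed from actual ephemeris positions that include the observer's changing vantage point (e.g. via an ephemeris library such as Skyfield).

```python
from datetime import datetime, timezone

SYNODIC_MONTH = 29.530588  # mean days between successive new moons
# A known new moon (2000-01-06 18:14 UTC) used as the reference epoch.
EPOCH = datetime(2000, 1, 6, 18, 14, tzinfo=timezone.utc)

def phase_fraction(when):
    """Approximate lunar phase: 0.0 = new moon, 0.5 = full moon,
    approaching 1.0 = new again. Mean-period model only - no wobble."""
    days = (when - EPOCH).total_seconds() / 86400.0
    return (days % SYNODIC_MONTH) / SYNODIC_MONTH
```

Plotting this curve against one derived from real ephemeris data makes the parallax wobble visible as the small difference between the two.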
15:15 we love and appreciate you, brother. Take your time and put your heart and passion into it and it will always be an enjoyable experience together.
Did you have image stabilisation turned on on each camera? If so they’d be doing so somewhat differently, so any common vibrations would be reacted to differently. Though it’s still pretty likely that your system just wasn’t rigid enough to prevent differential vibrations. Next time, try a solid carbon fibre plate with support trusses, with the cameras directly bolted onto it. Hope you have some ideas for the pending powers of 2
Dude! Thank you so much for making this video. This is one of those things I’ve had in the back of my head for so long but never freed up enough RAM to sit down and think about, let alone do the experiments you did. Epic as always!
AMAZING! One of the few channels on youtube that provides QUALITY content despite the constant pressure to produce flashy viral crap for the algorithm! Immediately joined your subreddit :))
I'm from Germany and was "watching" and photographing the eclipse as well. The problem here was that the moon slowly "dipped into" a dense stream of clouds, came back out underneath it, and then set behind the horizon while turning red, which was cool in itself. And it was a very nice morning nonetheless, because I was standing on a little hill, seeing the disappearing moon in the west and the sunrise in the east, where I took the opportunity to take some gorgeous pictures of meadow/flower silhouettes against the colorful morning sky. A morning to remember! Thank you a lot for reminding me a month later. The time you took to unfold the beauty of your immense effort, firstly, was well spent and, secondly, brought me back to that little hill.
Been through all that myself: 437 three-image brackets that needed to be manually aligned, as I did not have a capable enough mount at the time, all for a 30-second 4K video clip. A few minutes of extra planning could have saved me hours in post. Very jealous of the software!
When I've done manual stabilization, I find it most efficient to place the beginning, then the end, then the middle, then the middle of each of those halves, and so on. You can move quickly to the next and previous keyframe. And, yes. Tedious.
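That bisection ordering can be generated mechanically. A small sketch (the function name is invented for illustration) producing the order in which to place keyframes over a clip of n frames:

```python
def keyframe_order(n):
    """Order in which to place keyframes on frames 0..n-1:
    endpoints first, then repeatedly the midpoint of each remaining gap,
    breadth-first, so error is halved at every pass."""
    order = [0, n - 1]
    intervals = [(0, n - 1)]  # gaps still needing a midpoint keyframe
    while intervals:
        a, b = intervals.pop(0)
        if b - a < 2:
            continue  # no frame strictly between a and b
        m = (a + b) // 2
        order.append(m)
        intervals.append((a, m))
        intervals.append((m, b))
    return order

print(keyframe_order(9))  # → [0, 8, 4, 2, 6, 1, 3, 5, 7]
```

The advantage over left-to-right placement is that after the first few keyframes, every remaining frame is already bracketed by two nearby anchors, so each new keyframe only needs a small correction.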
Huh, in the animation you made explaining the movement of the umbra and the moon (3:00), the stars are very subtly spelling the word "SUBSCRIBE". Clever little Easter egg. What else have you hidden in your other videos? I hope the subconscious message will make even more people decide to subscribe, because I can't wait for the 2¹⁶ special. All in all keep it up, entertaining and educational content!
Lunar eclipses are really pretty - I'd argue even more so than solar eclipses. Your work to capture that beauty in a viewable-at-any-time form is fascinating. Great video!
I remember sitting in my balcony watching this moon while drinking some tea; I am so happy that you've not only recorded it but decided to piece it all together in such a smooth way for us. I really appreciate your try it again attitude and high standards. I love making projects, albeit smaller in scale and less professional, but the question is always there: why? Why do we record beautiful phenomena & make cool things through all this effort? I'm not a functionalist, so I think the answer is simpler. It's to call out to a fellow human being, "hey, check out this cool thing," inviting them to collectively enjoy and appreciate something that moves us in our souls. Thank you for your work, looking forward to the next eclipse 😊
Some advice on bracketing: you can actually capture the whole necessary dynamic range on just one camera using bracketing. On a Canon DSLR with Magic Lantern installed you can use the "Advanced bracket" function, which allows you to take up to 12 shots with 0.5-8 EV increments (varying Tv, Av, and ISO) in one stack. Also, for image alignment, take a look at panorama stitching tools like "hugin" itself or the toolkit it uses. Maybe it will be helpful to ease the process for the next attempts.
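For reference, each EV step is a factor of two in exposure, so the shutter times for a bracket can be sketched as below (a simplified illustration holding aperture and ISO fixed - Magic Lantern's advanced bracket can vary Tv, Av, and ISO together, which this does not model):

```python
def bracket_shutters(base_seconds, step_ev, count):
    """Shutter times for a bracket centered on base_seconds.
    Each +1 EV doubles the exposure time (aperture and ISO held fixed)."""
    half = count // 2
    return [base_seconds * 2 ** (step_ev * (i - half)) for i in range(count)]

# A 3-shot bracket around 1/125 s in 2 EV steps:
print(bracket_shutters(1 / 125, 2, 3))  # roughly 1/500 s, 1/125 s, 1/31 s
```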
That time lapse looks fantastic. I was lucky to have a lunar eclipse last November and then the one you photographed this past May visible from my home and I was able to vastly improve the quality of my images with my tracking mount.
Just to be clear, as part of your audience: you showing all the failures is definitely the highlight of your channel. That isn't to say the end results aren't cool - they are - but there are a million science youtube channels with cool end results. Your failure and iteration are wonderful, and the main other channel I remember doing something similar (and thus enjoy for similar reasons) is Shane over at Stuff Made Here. Both of you are gems.
Ah yes, memory usage... brings back memories of when I tried to code something in C++ and it basically ended up growing by about 1 MB per second in my memory ^) Took me so long to figure out where to garbage collect and how to manage memory. Ended up with some memory-managing code that offloads some parts of the RAM onto the SSD, since they won't be accessed for a long time due to the tree structure. My heart goes out to everyone out there who has to fight with memory usage... Really cool videos btw, love the timelapse ^^
Your mention of the difficulties in making bread reminds me of my own journey in espresso. One aspect is that taste and smell are quite difficult to share in writing or pictures or sounds! You have to experience them yourself, and then you have to train your senses. This limits the amount of experience you can gain by simply communicating with others. So trial and error is the rule in such fields!
“Consistently OK”? I’m waiting for Vib and Biff to get api access to match wins and losses so they can publish an ELO and we can find out how OK we all are 😂
4:45 Technically yes and technically no. Have you heard that the brightest point of the shadow of a spherical object is in its center? This phenomenon is called the Poisson spot; there is a Veritasium video explaining it, called "The Brightest Part of a Shadow is in the Middle".
In his video he specifies, 6 minutes in, that it has to be coherent light and a perfect sphere. I honestly have no clue if the Earth would qualify as spherical enough, because I don't know if the relevant length scale for surface roughness is the wavelength of light or the aspect ratio of the roughness to the size of the whole object, but either way, the sun here isn't a coherent point source, so the moon does keep getting dimmer the whole time. I actually tried to film the penumbra, which would have made this a little clearer, but was foiled by clouds!
@@AlphaPhoenixChannel I don't understand that kind of stuff very well. Even when not in the best conditions, I think some of the phenomenon would still happen, just too dim to be noticeable. I just brought it up as a fun fact :D
Clarification: You were NOT filming the moon lit by a scattered ring of *sunset*. The moon is lit by sunset on one side, sunrise on the other, and - given when it was filmed - two areas of polar twilight.
Even though I'm something of a YouTube connoisseur, I've never seen one of your videos... and this was a great one to start with. I noticed how difficult it was to photograph the eclipse (of course I was just using my phone camera), and ended up with some wild pics when I tried to take any. This definitely helped explain some of it for me. Also, don't think we didn't see that hidden "subscribe" message written in the stars!
I try to leave comments on videos this good even when I don't have anything particularly useful to say, just because this is an excellent channel and I hope it keeps reaching more people in the future!
I think a good add-on to the message at the end of this video is the following adage that I know, but have somehow not taken to heart thus far: "The difference between practice and theory is greater in practice than in theory." I would say to keep this in mind when making plans, but as given by the adage itself, that won't exactly fix things.
I love the idea of the tryAgain reddit! Learning from other people's finished projects is cool, but seeing their iterations is much, much more useful, with a completely different set of skills being transferred. I definitely learned more from this than from a video without all those issues.
May I suggest that you upscale your footage and upload it as 4K? Even though it will be fake 4K and you won't really get any extra detail, the compression that YouTube uses for 4K video is much more generous. I don't even have a 4K monitor, but videos watched in 4K (especially dark videos with a lot of banding, like this one) look obviously and significantly better.
@@ChillGuy511 In normal YouTube it doesn't give me an option to watch in 1440p, for instance (premium or not; I just mentioned Premium because Vanced is generally used to avoid ads and enable background play without paying for it).
A point you didn't bring up which is very important in the difference between the eye and cameras: even without the correction made at the brain level, the receptors in the eye have a far better dynamic range than camera sensors - almost 50% better - which helps in simultaneously seeing very bright areas and very dark ones.
Hi! Loved the video, thanks for the amazing work :) For once, you're talking about something that I actually understand, so that's really refreshing. Something I would point out is that the blue pixels (see 6:43) aren't just underexposed; in Camera RAW / LR / LR CC, when you turn on this view, the blue pixels indicate shadow clipping. That means that in the areas of the sensor corresponding to those pixels, the camera did not collect any data, with all color channels being at 0. Which raises the question: at 6:49, why is there a somewhat round area where no data was recorded, as opposed to the slightly more chaotic (and expectedly noisy) pattern outside that circle? It seems a bit too distinct to just be optical artifacts, no? I'm confused now. Anyway, thanks for the video!
I’ve seen that before and I’m not entirely sure. It’s either a weird lens vignette that’s so faint it only matters in extreme dark, or it’s a weird sensor thing where the periphery is less sensitive. The angular edges of the pattern really confuse me because I assumed the iris has more leaves than that…
This reminds me of the last demo CD I recorded at a "state of the art" studio. We had to go through EVERY song, and replace EVERY snare drum strike with a sampled one because the tech had failed to turn on one of the snare mics. El....o.....El :( :(:(........ I wasn't charged for the additional mixdown time but my brain still hurts from adjusting each strike to retain the continuity of the music. Gotta love technology....in a sort of "love hate" kinda way. Great video! Love your style and informative personality!
Another thing to consider is that a lot of people who shoot solar eclipses remove their filters for just the duration of totality - because there's no danger to their sensor then - and put the filters back on for the rest of the eclipse. So for those, the sensor is actually getting a hugely greater share of the light during the total portion of the eclipse.
I work in biotechnology where like 95% of all experiments go wrong. I really like that you show the failure since this is the first step to success. Thanks for the great content!
Trying to stabilize sooooooooooooooo many frames by hand... OMG, you have my respect, bro! BTW I'm still using the barn door tracking mount that I learned from your channel many years ago. I think that was the first video I saw from you, lol; I've been following you ever since. Thanks for your work!
I learned the "don't have image stabilization on in a tripod timelapse" the hard way too, heheh. Also, if you get tired of wiping dew off your lenses, you can get lens heaters; they wrap around the outside edge of the lens and you can set it to be just warm enough to keep dew/frost from forming. If you don't want or have something to plug into, you can use those air-activated hand-warmers velcroed to the side of the main lens instead.
When you were talking about merging the two projects to get something to work, it made me think of Keras; there's a _lot_ of machine learning stuff out there doing that kind of thing.
13:00 Did you use fixed lenses without optical image stabilization, and cameras that don't have a floating sensor for stabilization? That, and yeah, I can imagine the stiffness of the frame and your mounting solution could play a significant role with such zoomed-in footage.
I have never been to reddit in my life, but sir - that is literally the most useful handle you could have ever put out there. Try try again. How we finally got there. Brilliant.
I believe our retinas have dynamic sensitivity to light levels. That's why we can see details in both the dark region and the brightly lit, otherwise blown-out region at the same time. It takes a small amount of time for our rods and cones to adjust but the individual rods and cones adapt very quickly. This is also what causes the after-image to stick around when we look elsewhere after looking at a high-contrast region. Effectively you'd need to take enough stages of bracketed photos and process all the data, compressing every pixel using every bracket into whatever your luminosity space is.
Alternatively, you could use a camera which already does this for you. The power of computational photography in today's highest-end smartphones has superseded a lot of previously necessary techniques you might apply to manually-taken images from a normal SLR. Believe me, I know how hard it is to take pictures of the moon with anything else in shot. Yet a Galaxy S22 can take a better picture of the moon than my SLR.
Lots of photos use similar techniques. Photos showing clouds and city streets at the same time. Photos showing glowing nebula gases and relatively bright stars at the same time. Photos showing the stars at night and a night landscape at the same time. Many photographers, I think, can now show scenes much better and more dramatically than we can see them with our eyes, or with plain telescopes/binoculars. My opinions.
Are the play buttons still coming?

It's possible to see faint stars while looking at the moon because of the cone distribution in your eye. The center of vision mainly sees detail and color; the peripheral FOV essentially just sees brightness. It's also closer to a difference camera than a snapshot camera. And if you want to go a bit further, you will see that afterimage is caused by triggering too many potentials, which leads to a reduced firing frequency, which is then amplified by your brain. So you lose dynamic range after a short while (seconds) and gain afterimage for like half a minute.

And it shouldn't be a limitation that requires multiple cameras; it's just a limitation of your bracketing options. You could, for example, use ND filters and keep exposure time constant, and if you motorize a filter wheel and sync it with your trigger, it will give you whatever dynamic range you want.

Digital stills cameras usually capture their raw frames in linear, while JPG uses sRGB, so 2.2 gamma. There are more than 255 levels in raw frames - often 12 or 14 bits of information, 16 on higher-end cameras. Astro cams might be worth a consideration because you get more controls. Screens are usually SDR, which is like 5-6 stops at best, and even "HDR" screens don't do that much more. There is a limit to what you can even post to YouTube.

Using the color page tracker in Resolve is really the worst tool for this job. And the "cloud" refers to a cloud of points in pseudo-3D space... What you are looking for is PIPP, the Planetary Imaging Pre Processor. It would have easily aligned and stabilized your frames and even given you statistics along the way. The astrophotography community has a lot of tools, and they are all math nerds. If you go further, to actual astrophysicists, they encounter exactly the same problems but at larger scales and with more precision. There are Python packages available to do all of this programmatically, and it's open source.
Thanks for the explanation of the image processing details. We do cross-correlation between synthetic aperture radar (SAR) images to be sure they are coregistered for SAR interferometry or InSAR. SAR images have speckle that adds noise. No clouds though.
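The cross-correlation trick generalizes well beyond SAR. As a rough illustration (not this commenter's pipeline - real coregistration typically uses normalized or phase correlation with sub-pixel refinement), a global integer shift between two frames can be recovered with a few lines of NumPy:

```python
import numpy as np

def find_shift(ref, img):
    """Estimate the integer (dy, dx) circular shift such that
    np.roll(ref, (dy, dx), axis=(0, 1)) best matches img, by locating
    the peak of the FFT-based cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint wrap around to negative shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

For noisy data like speckled SAR images (or dim moon frames), the correlation peak broadens but usually stays in the right place, which is why the technique is so robust for coregistration.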
Awesome work. Love the way you accounted for the vast exposure difference needed for a lunar eclipse. I have a big collection of poorly aligned time-lapses I figure I will refine sometime in the future. Hopefully it won't come down to a custom program to align each one. :)
Thank you so much for making videos of your failures. I watch video game challenge videos, and they ask questions like "Can You Beat X Under Y Conditions?". The vast majority of videos are either "Yes", or "Technically no but here's the closest you can get", but I've noticed that a lot of these youtubers eventually say that if the challenge fails, they scrap the video and just make something else. This ends up reducing the amount of content in the world, takes away opportunities to learn about that new scenario, prevents them from getting feedback on alternate methods, causes duplicate work when other people unknowingly try the same impossible task who also don't post their results, and creates biases in perception about what things are possible and how hard certain things are. I'd really rather watch the video where things don't turn out well instead of having no video at all. I've heard that something similar happens in science where people don't write or publish studies when the experiment "fails", causing lost knowledge and duplicate work. People have created journals just for publishing negative results to try to help with this problem.
Shadow apparently moving the wrong direction is due to Earth's axial rotation, not orbital motion. It makes the same apparent motion as the sun, since it is in exactly the opposite direction of the sun. At dusk, the shadow points eastward, at dawn it points westward.
Try turning off all sensor stabilization, and use manual focus, manual ISO, and manual frame rate settings. Focus breathing is the lens jittering you were referring to.
Nice video bro :D
@@FelixHdez bro you’ve watched 10 seconds
@@vintyprod lol
@@vintyprod mmmmmmaybe
Hey, nice video and impressive that you made that whole moon tracking system!
I'm sure you've thought of this, but some things make me worry that you missed something.
At 9:40 You mention the muted grey region where different exposures are overlapping. Then, at 10:30, they were "added" together in the most literal sense of the word.
The thing is that adding images like these directly will give a grey region, but it doesn't have to if the blending is done properly. I was curious so made a comparison. It shows a muted grey similar to your situation and it disappears completely with correction. (removed imgur link because YT keeps deleting my comment. lmk if you want me to send it somewhere else)
The key to high dynamic range is indeed in the transfer function and logarithmic perception of the human eye, although I'm not sure what you meant exactly by connecting two straight lines.
I understood you did artificially amplify the dark footage. But if that went well, you shouldn't have had to worry about the dark footage being brighter than the other.
The approach that worked for me (see source code) is to apply a log to the estimated real-life "light per second" (I think it's called radiosity?). So:
1. Divide each footage's brightness by exposure times.
2. Average the result, but take a pixel from just one of the images if it was under/over exposed in the other
3. Apply a logarithmic curve (like human perception, making the result an image with details in both highlights and shadows,.
(The source code in the Imgur post is slightly different, because it multiplies the dark image instead of dividing the bright)
I hope this helps!
The fact that you did the entire processing by hand, started making the video, and then "tried tried again" and just wrote the code anyways is why you are one of the great channels of TH-cam.
I think a lot of photographers get a harsh reality check the first time they go out to photograph the moon. Don't even get me started on night time-lapses and condensation!
Yea, the first time i tried I though it would be relatively simple. I used film and was hoping that reciprocity failure would give me an advantage. Still havent gotten a good photo
Man, no kidding! I remember being younger and using an off-axis guide trying to take pictures of galaxies and stuff and hoping like hell the ridiculously long film exposures would turn out something...lol
Doing things digitally is obviously easier because you can see it right away, do image stacking and don't have to expose is long buuuutt that does NOT mean that it's easy! 😂
People able to do a high-end photography using a Schmidt telescope (not a Schmidt-Cas..but just an actual Schmidt telescope made for photography) is still absolutely incredible to me, that is an incredible art right there! Sticking a film blank in and hoping everything works out sort of a deal LOL
As a southern Arizonan, I'm glad I don't have to deal with condensation, but even in the desert there are clouds from time to time... And always at the worst time
Just spitballing, but do you think you could use an IR lamp / LED to keep the lens hot enough to drive off the moisture? I know cameras are sensitive to certain IR wavelengths but maybe there's something that wouldn't wash it out?
As a guy with a DSLR, I can confirm.
First of all, the auto focus refuses to work half the time. (And not at all if it's the stars you're trying to take a photo of)
Then the auto exposure decides the moon is a white disc, and nothing more.
So finally, with enough zooming, manual focus, and manual exposure adjustment, it's still too hard to take a proper photo without a tripod.
And when you finally get the correct exposure and framing, camera sees ONLY the moon, and not all the beautiful clouds around it.
And after getting a perfect photo, I'm all happy and upload it to Facebook, and I get 2 likes. Because nobody even starts to think I took the photo. Even a selfie of my ugly face in low lighting would get 10x more likes.
And if I ever tell a normal guy about it,
All they say is "it looks exactly like the photos you get from Google Images." Of course it does. It's the same subject...
And with star photography,
It's a crazy shit all on its own...
You don't post videos often but every time you do, it's worth the wait. Thank you for all your hard work, both your results and the effort you put into them are fascinating.
I appreciated that you included the famous constellation Sub Scribica
Elated whenever you post. Always something really neat. Looking forward to watching.
"there can be a perception that everything always works... when that is literally, never the case". that's so true, no matter what you apply it to. I'm currently doing an internship in a molecular biology lab and I gotta tell you, the number of times we've gone through the full procedure from getting a gene sequence, to putting it into a plasmid, to getting that into a bacteria, etc etc, just to find out at the end that the whole process had failed and we had to adjust something, it really mind boggles you that we've learned as much as we have in just a couple thousand years. Stuff goes wrong and that's just how it is. Great video as always
People are broken clocks that only tell the right time twice a day. The reason we made that progress is because we have a couple billion of those broken clocks making mistakes all the time and occasionally getting things right.
I love seeing the failures and hope you continue to highlight them on your channel. A similar channel that comes to mind is Cody's Lab. I think it's far more satisfying to see the entire process with all its challenges and failures. Not knowing if or how you will come to a conclusion can be a lot more thought-provoking than just a clean-cut science channel that gives you all the facts like it's a simple process with no surprises.
A thousand times dimmer would actually be only about 1/30th of the pixel value, because the camera's gamma is going to be at least 2. Which is to say, pixel value is brightness raised to one over gamma. So at one thousandth of the brightness, the pixels would not be all zero.
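That back-of-the-envelope number is easy to check, assuming the simple transfer model the commenter describes:

```python
# Rough camera transfer model: pixel = brightness ** (1 / gamma).
# A scene 1000x dimmer with gamma = 2 lands at sqrt(1/1000) of
# full scale, i.e. around 1/32 of the pixel value, not 1/1000.
gamma = 2.0
pixel_ratio = (1 / 1000) ** (1 / gamma)
print(round(1 / pixel_ratio))  # prints 32
```

With a more typical display gamma of 2.2 the factor comes out near 1/23, so "about 1/30" is the right ballpark either way.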
It always did strike me as strange that in videos of a lunar eclipse, the moon would suddenly be brighter when it's fully in the earth's shadow.
But I never stopped and thought about it, your explanation that cameras (and our eyes) automatically adjust to make it more visible makes sense.
I applaud the effort that you put into this video. I don't personally know enough / have enough experience with image stabilization to have implemented what you did (or it would take me way longer).
That being said, there will always be a trade-off between quantity and quality, and I think there are diminishing returns if we go too far on the quality side. You said that you are inclined to improve the quality of your videos given the higher number of subscribers you have, and so these videos take longer. I think it would be okay for you to scale it back a bit to ultimately share more knowledge.
But these are just some thoughts from someone who barely has any experience making videos! If you think your process is fine, then keep at it and I'll keep enjoying the videos :)
I remember with the November eclipse I could see the dark side clearly when the shadow began going over the moon
This really just goes to show how incredible the dynamic range of the human eye is
Personally I love the long unplanned tangent projects you end up doing en route to a finished product; it's part of what makes this channel amazing. Tbh, I'm shocked you managed to do all of this in a month since the eclipse
@1:02: "It's more counterintuitive than you might expect."
Throughout this video I knew that you were about to bang your head against that "perfectionist" perspective that you have. You did display the classic "devil in the details" issue that plagues a great experimenter. I applaud your honesty regarding how much personal time gets 'trashed' as one struggles to reach perfection. We are so fortunate to have access to your work.
i think our vision also benefits from the fact that we stitch a lot of separate images together to form our "view", so to speak. our eyes jump around extremely fast and our pupil contracts and dilates during all of those saccades, so that we have a more or less even contrast ratio across our entire visual field.
I was watching another one of your videos as this one came up, looking forward to this one!
Last one posting eclipse content …. False, I still haven’t gotten motivated to post my (finally developed) film from that night. Haha. Awesome work, as always.
What a spectacular video. In my opinion, the numerous "failures" actually made it more interesting than just having a perfect video to watch, and really showcased just how much time, research, effort and planning went into this endeavor.
Great work! I make a hyper-local calendar for my area and wrote some custom code to plot the phase of the moon on it... in generating that curve (using data from elsewhere) I noticed a very slight wobble in the curve; I thought it was a drawing bug at first, but it turned out to be due to the same effect you described: the vantage point of the observer on earth changing as the earth rotated. Neat stuff!
15:15 we love and appreciate you, brother. Take your time and put your heart and passion into it and it will always be an enjoyable experience together.
Did you have image stabilisation turned on on each camera? If so they’d be doing so somewhat differently, so any common vibrations would be reacted to differently. Though it’s still pretty likely that your system just wasn’t rigid enough to prevent differential vibrations. Next time, try a solid carbon fibre plate with support trusses, with the cameras directly bolted onto it.
Hope you have some ideas for the pending powers of 2
Image stab. should be OFF! It just introduces a problem you cannot compensate for afterwards.
The result is awesome but it is made sooo much better by you explaining all the work that went into it! Awesome video as always!
Dude! Thank you so much for making this video. This is one of those things I’ve had in the back of my head for so long but never freed up enough RAM to sit down and think about, let alone do the experiments you did. Epic as always!
I love your "not good enough" mindset. Makes your videos worth the wait :)
this will always be one of my favorite channels of all time.
Your channel's motto almost verifies itself every time, it's a good motto
AMAZING! One of the few channels on youtube that provides QUALITY content despite the constant pressure to produce flashy viral crap for the algorithm!
Immediately joined your subreddit :))
Thanks for sharing the struggle. This is fun to watch. I recently started a simple circuit board, and every day it gets further from completion.
I'm from Germany and was "watching" and photographing the eclipse as well. The problem here was, the moon slowly "dipped into" a dense stream of clouds, came back again underneath it, and then set behind the horizon while turning red, which was cool in itself. And it was a very nice morning nonetheless, because I was standing on a little hill, seeing the disappearing moon in the west and the sunrise in the east, where I took the opportunity to take some gorgeous pictures of meadow/flower silhouettes against the colorful morning sky. A morning to remember!
Thank you a lot for reminding me a month later. The time you took to unfold the beauty of your immense effort was, firstly, well spent and, secondly, brought me back to that little hill.
I'm rewatching this and I just noticed what the star field at 3:00 spells out
Well done! Very subtle
This is just unheard of, I love it!
The amount of work you've put into this is amazing.
The huge amount of effort you put into this video is appreciated - well done!
I love how deep you go into your projects! I can't get enough of this ultra nerdy scientific content! Thank you for your uploads and incredible work!
Been through all that myself: 437 three-image brackets that needed to be manually aligned, as I did not have a capable enough mount at the time, all for a 30-second 4K video clip. A few minutes of extra planning could have saved me hours in post. Very jealous of the software!
Thanks for showing the learning process together with the finished cake!
When I've done manual stabilization, I find it most efficient to place the beginning, then the end, then the middle, then going for the middle of each of those, etc. You can move quickly to the next and previous key frame. And, yes. Tedious.
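That begin/end/middle/bisect ordering can be generated programmatically; a hypothetical sketch of the workflow described above, not a built-in feature of any particular editor:

```python
def keyframe_order(n_frames):
    """Yield frame indices in the order described above: both
    endpoints first, then repeatedly the midpoint of each span
    that has not yet been subdivided."""
    if n_frames < 1:
        return []
    order = [0]
    if n_frames > 1:
        order.append(n_frames - 1)
    spans = [(0, n_frames - 1)]
    while spans:
        next_spans = []
        for a, b in spans:
            if b - a > 1:
                m = (a + b) // 2       # place a keyframe at the midpoint
                order.append(m)
                next_spans += [(a, m), (m, b)]
        spans = next_spans
    return order

print(keyframe_order(5))  # prints [0, 4, 2, 1, 3]
```

The appeal of this ordering is that each new keyframe starts close to an already-placed neighbor, so the tracking error you are correcting stays small at every step.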
Huh, in the animation you made explaining the movement of the umbra and the moon (3:00), the stars are very subtly spelling the word "SUBSCRIBE".
Clever little Easter egg. What else have you hidden in your other videos? I hope the subconscious message will make even more people decide to subscribe, because I can't wait for the 2¹⁶ special.
All in all keep it up, entertaining and educational content!
Subliminal YouTube game right here.
I enjoyed everything about the video. The astronomy, the photography, the genuine attitude, the invitation to an awesome sub. Thank you!
Lunar eclipses are really pretty, I'd argue even more so than solar eclipses. Your work to capture that beauty into a viewable-at-any-time form is fascinating. Great video!
19 minutes went by so quickly! Awesome as always!
SUBSCRIBE, it was written in the stars! 3:06
I remember sitting in my balcony watching this moon while drinking some tea; I am so happy that you've not only recorded it but decided to piece it all together in such a smooth way for us. I really appreciate your try it again attitude and high standards. I love making projects, albeit smaller in scale and less professional, but the question is always there: why? Why do we record beautiful phenomena & make cool things through all this effort? I'm not a functionalist, so I think the answer is simpler. It's to call out to a fellow human being, "hey, check out this cool thing," inviting them to collectively enjoy and appreciate something that moves us in our souls. Thank you for your work, looking forward to the next eclipse 😊
Some advice on bracketing: you can actually capture the whole necessary dynamic range on just one camera using bracketing. On a Canon DSLR with Magic Lantern installed, you can use the "Advanced bracket" function, which allows you to take up to 12 shots with 0.5-8 EV increments (varying Tv, Av, and ISO) in one stack.
Also, for image alignment, take a look at panorama stitching tools like "hugin" itself or the toolkit it uses. Maybe it will be helpful to ease the process for the next attempts.
hold up
do the stars stay 'Subscribe' at 3:03
i'm impressed honestly
yes
That time lapse looks fantastic. I was lucky to have a lunar eclipse last November and then the one you photographed this past May visible from my home and I was able to vastly improve the quality of my images with my tracking mount.
You're one of my favorite youtubers. Your videos are informative and to the point. Thanks so much for uploading
Just to be clear, as part of your audience: you showing all the failures is definitely the highlight of your channel. That isn't to say the end results aren't cool, they are, but there are a million science youtube channels with cool end results. Your failure and iteration are wonderful, and the main other channel I remember doing similar (and thus enjoy for similar reasons) is Shane over at Stuff Made Here. Both you and him are gems.
Ah yes, memory usage... brings back memories of when I tried to code something in C++ and it basically ended up just growing by about 1MB per second in memory :,^)
Took me so long to figure out where to garbage collect and how to manage memory. Ended up with some memory-managing code that offloads some parts of the RAM onto the SSD, since it won't be accessed for a long time due to the tree structure.
My heart goes out to everyone out there who has to fight with memory usage...
Really cool videos btw, love the timelapses ^^
Dude. I absolutely loved it. All your effort is very appreciated. Thanks man.
"So instead, I did it by hand."
Wow! That is definitely a labor of love!
Your mention of the difficulties in making bread reminds me of my own journey in espresso.
One aspect is that taste and smell are quite difficult to share in writing or pictures or sounds! You have to experience them yourself, and then you have to train your senses.
This limits the amount of experience you can gain by simply communicating with others. So trial and error is the rule in such fields!
200k subscribers gained in the past year? Amazing. Consistently good videos, consistently okay combat player.
“Consistently OK”? I’m waiting for Vib and Biff to get api access to match wins and losses so they can publish an ELO and we can find out how OK we all are 😂
@@AlphaPhoenixChannel I'm still better 😁
That constellation in the animation at 2:55!! :D Great easter egg.
Too bad I'm already subscribed :-)
4:45 Technically yes and technically no. Have you heard that the brightest point of the shadow of a spherical object is in its center?
This phenomenon is called the Poisson spot; there is a Veritasium video explaining it.
Video name: "The Brightest Part of a Shadow is in the Middle"
In his video he specifies, 6 minutes in, that it has to be coherent light and a perfect sphere. I honestly have no clue if the earth would qualify as spherical enough, because I don't know if the relevant length scale for surface roughness is the wavelength of light or the aspect ratio of the roughness to the size of the whole object, but either way, the sun here isn't a coherent point source, so the moon does keep getting dimmer the whole time. I actually tried to film the penumbra, which would have made this a little clearer, but was foiled by clouds!
@@AlphaPhoenixChannel I don't understand that kind of stuff very well.
Even when not in the best conditions, I think some of the phenomenon would still happen, just too dim to be noticeable.
I just brought it up as a fun fact :D
Clarification: You were NOT filming the moon lit by a scattered ring of *sunset*. The moon is lit by sunset on one side, sunrise on the other, and, given when it was filmed, two areas of polar twilight.
Even though I'm something of a YouTube connoisseur, I've never seen one of your videos... and this was a great one to start with. I noticed how difficult it was to photograph the eclipse (of course I was just using my phone camera), and ended up with some wild pics when I tried to take any. This definitely helped explain some of it for me. Also, don't think we didn't see that hidden "subscribe" message written in the stars!
Thank you for your timelapse footage. You have some mind blowing footage for sure🌌
Glad to see that the subreddit already has one post and it's only been a few hours
How is every video you post so good
I try to leave comments on videos this good even when I don't have anything particularly useful to say, just because this is an excellent channel and I hope it keeps reaching more people in the future!
I think a good add-on to the message at the end of this video is the following adage that I know, but have somehow not taken to heart thus far:
"The difference between practice and theory is greater in practice than in theory."
I would say to keep this in mind when making plans, but as given by the adage itself, that won't exactly fix things.
I love the idea of the tryAgain reddit! Learning from other peoples finished projects is cool, seeing their iterations is much much more useful with a completely different set of skills being transported. I definitely learned more from this than from a video without all those issues.
May I suggest that you upscale your footage and upload as 4K? Even though it will be fake 4K and you won't really get any extra detail, the compression that YouTube uses for 4K video is much more generous. I don't even have a 4K monitor, but videos in 4K (especially dark videos with a lot of banding like this) look obviously and significantly better.
Wait... So you say it's better to watch YouTube videos in the highest quality available even though the display doesn't support it?
@@ChillGuy511 yes
@@ChillGuy511 one of the reasons I use YouTube Vanced on my phone even though I pay for premium. So it doesn't cap at 1080 (my phone's resolution)
@@TwskiTV oh that's interesting! Thanks! I guess you have to increase the resolution each time in normal YouTube premium?
@@ChillGuy511 In normal YouTube it doesn't give me an option to watch in 1440p, for instance (premium or not; I just mentioned premium because Vanced is generally used to avoid ads and to enable background play without paying for it)
I saw the message in the stars. Nice touch
Your heroic work is appreciated!
This frame processing is fantastic.
A point you didn't bring up which is very important in the difference between the eye and cameras: even without the correction made at the brain level, the receptors in the eye have a far better dynamic range than camera sensors, almost 50% better, which helps in simultaneously seeing very bright areas and very dark ones
Wow, incredible dedication... Love your final footage :D
Whenever Alpha Phoenix posts something you know it's going to be good
Your content is so good!
If I wasn’t already subscribed, I definitely would’ve after that little easter egg in the stars in the umbra part.
Mirror slap can/will add motion blur to the image. Having the cameras on a double-ended diving board will magnify it as well.
Great video! Thank you!
Respect to you sir for your dedication! Thank you for the upload :)
16:23 damn this is super cool. I do code for a living but stuff like this really excites me
Hi! Loved the video, thanks for the amazing work :)
For once, you're talking about something that I actually understand, so that's really refreshing. Something I would point out is that the blue pixels (see 6:43) aren't just underexposed; in Camera RAW / LR / LR CC, when you turn on this view, the blue pixels indicate shadow clipping. That means that in the areas of the sensor corresponding to those pixels, the camera did not collect any data, with all color channels at 0. Which raises the question: at 6:49, why is there a somewhat round area where no data was recorded, as opposed to the slightly more chaotic (and expectedly noisy) pattern outside that circle? It seems a bit too distinct to just be optical artifacts, no? I'm confused now. Anyway, thanks for the video!
I’ve seen that before and I’m not entirely sure. It’s either a weird lens vignette that’s so faint it only matters in extreme dark, or it’s a weird sensor thing where the periphery is less sensitive. The angular edges of the pattern really confuse me because I assumed the iris has more leaves than that…
internal reflections of the lens
This reminds me of the last demo CD I recorded at a "state of the art" studio. We had to go through EVERY song, and replace EVERY snare drum strike with a sampled one because the tech had failed to turn on one of the snare mics. El....o.....El
:( :(:(........
I wasn't charged for the additional mixdown time but my brain still hurts from adjusting each strike to retain the continuity of the music.
Gotta love technology....in a sort of "love hate" kinda way.
Great video!
Love your style and informative personality!
Another thing to consider is that a lot of people who do solar eclipses, remove their filters for just the duration of the full eclipse...because there's no danger to their sensor...and then put the filters back on for the rest of the eclipse.
So for those, the sensor is actually getting a huge percentage greater of the light during the full portion of the eclipse.
Nice silver a6000. It's the camera I started with, and still own, and the camera that sparked my love for photography.
I work in biotechnology where like 95% of all experiments go wrong. I really like that you show the failure since this is the first step to success. Thanks for the great content!
Trying to stabilize sooooooooooooooo many frames by hand... OMG you have my respect bro! BTW I'm still using the barn door tracking mount that I learned from your channel many years ago; I think that is the first video of yours that I saw, lol. I've been following you ever since. Thanks for your work!
This was fantastic. Excellent work.
I learned the "don't have image stabilization on in a tripod timelapse" the hard way too, heheh.
Also, if you get tired of wiping dew off your lenses, you can get lens heaters; they wrap around the outside edge of the lens and you can set it to be just warm enough to keep dew/frost from forming. If you don't want or have something to plug into, you can use those air-activated hand-warmers velcroed to the side of the main lens instead.
when you were talking about merging the two projects to get something to work, it made me think of Keras; there's a _lot_ of machine learning stuff out there made of that kind of stuff
Love your channel and the failures. Great video as always.
That subscribe constellation though💯
13:00 Did you use fixed lenses w/o optical image stabilization and cameras that don't have a floating sensor for the stabilization? That and yeah I can imagine the stiffness of the frame and your mounting solution could play a significant role with such zoomed in footage.
I have never been to reddit in my life, but sir - that is literally the most useful handle you could have ever put out there. Try try again. How we finally got there. Brilliant.
I believe our retinas have dynamic sensitivity to light levels. That's why we can see details in both the dark region and the brightly lit, otherwise blown-out region at the same time. It takes a small amount of time for our rods and cones to adjust but the individual rods and cones adapt very quickly. This is also what causes the after-image to stick around when we look elsewhere after looking at a high-contrast region. Effectively you'd need to take enough stages of bracketed photos and process all the data, compressing every pixel using every bracket into whatever your luminosity space is.
Alternatively, you could use a camera which already does this for you. The power of computational photography in today's highest-end smartphones has superseded a lot of previously necessary techniques you might apply to manually-taken images from a normal SLR. Believe me, I know how hard it is to take pictures of the moon with anything else in shot. Yet a Galaxy S22 can take a better picture of the moon than my SLR.
Lots of photos use similar techniques. Photos showing clouds and city streets at the same time. Photos showing glowing nebula gases and relatively bright stars at the same time. Photos showing the stars at night and a night landscape at the same time. Many photographers, I think, can now show scenes much better and more dramatically than we can see them with our eyes, or with plain telescopes/binoculars. My opinions.
do not know why but looking at the starry background makes me want to subscribe :) nice one sir
Are the play buttons still coming?
It's possible to see faint stars while looking at the moon because of the cone distribution in your eye. The center of vision mainly sees detail and color; the peripheral FOV essentially just sees brightness. It's also closer to a difference camera than a snapshot camera. And if you want to go a bit further, you will see that afterimage is caused by triggering too many potentials, which leads to a reduced firing frequency, which is then amplified by your brain. So you lose dynamic range after a short while (seconds) and gain afterimage for like half a minute.
And it shouldn't be a limitation that requires multiple cameras; it's just a limitation of your bracketing options. You could, for example, use ND filters and keep exposure time constant, and if you motorize a filter wheel and sync it with your trigger, it will give you whatever dynamic range you want. Digital still cameras usually capture their raw frames in linear, while JPEG uses sRGB, so 2.2 gamma. There are more than 255 levels in raw frames: often 12 or 14 bits of information, 16 on higher-end cameras. Astro cams might be worth a consideration because you get more controls.
Screens are usually SDR, which is like 5-6 stops at best, and even "HDR" screens don't do that much more. There is a limit to what you can even post to YouTube.
Using the color page tracker in Resolve is really the worst tool for this job, and the "cloud" refers to a cloud of points in pseudo-3D space... What you are looking for is PIPP, the Planetary Imaging Pre Processor. It would have easily aligned and stabilized your frames and even given you statistics along the way. The astrophotography community does have a lot of tools, and they are all math nerds. If you go further to the actual astrophysicists, they encounter exactly the same problems but at larger scales and higher precision. There are Python packages available to do all of this programmatically, and it's open source.
Thanks for the explanation of the image processing details. We do cross-correlation between synthetic aperture radar (SAR) images to be sure they are coregistered for SAR interferometry or InSAR. SAR images have speckle that adds noise. No clouds though.
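For anyone curious, the cross-correlation coregistration idea mentioned here can be toyed with using FFT-based phase correlation in numpy; a minimal sketch for integer shifts, nothing like production InSAR tooling:

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the integer (row, col) shift to apply to image b
    (via np.roll) so that it aligns with image a, using phase
    correlation. Toy illustration of cross-correlation alignment."""
    # Cross-power spectrum; normalizing away the magnitude leaves
    # only phase, which peaks sharply at the true offset.
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    F /= np.abs(F) + 1e-12
    corr = np.fft.ifft2(F).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Offsets past half the image size wrap around to negative shifts.
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))
```

For a pure circular shift the correlation surface is essentially a delta function, which is why the peak location recovers the offset exactly; real imagery with noise and rotation needs the more careful tooling the commenters mention.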
Awesome work. Love the way you accounted for the vast exposure difference needed for a lunar eclipse. I have a big collection of poorly aligned time-lapses I figure I will refine sometime in the future. Hopefully it won't come down to a custom program to align each one. :)
graphic stars, I'm already subscribed, thank you
Awesome! Really impressive as well.
Thank you so much for making videos of your failures. I watch video game challenge videos, and they ask questions like "Can You Beat X Under Y Conditions?". The vast majority of videos are either "Yes", or "Technically no but here's the closest you can get", but I've noticed that a lot of these youtubers eventually say that if the challenge fails, they scrap the video and just make something else. This ends up reducing the amount of content in the world, takes away opportunities to learn about that new scenario, prevents them from getting feedback on alternate methods, causes duplicate work when other people unknowingly try the same impossible task who also don't post their results, and creates biases in perception about what things are possible and how hard certain things are. I'd really rather watch the video where things don't turn out well instead of having no video at all.
I've heard that something similar happens in science where people don't write or publish studies when the experiment "fails", causing lost knowledge and duplicate work. People have created journals just for publishing negative results to try to help with this problem.
Shadow apparently moving the wrong direction is due to Earth's axial rotation, not orbital motion. It makes the same apparent motion as the sun, since it is in exactly the opposite direction of the sun. At dusk, the shadow points eastward, at dawn it points westward.
the "subscribe constellation" in the background of the diagram made my day
Good on you for doing it right, thanks for another amazing video!
Try turning off all sensor stabilization, and use manual focus, manual ISO, and manual frame rate settings. Focus breathing is the lens jittering you were referring to