What a privilege it is to have access to such beautiful visualisations and digestible explanations for free. You sir are a very gifted communicator! Thank you for your videos!
My thoughts exactly!
Same here!
Agreed!
If it's free, you are the commodity.
Don't usually comment, but I had to pause after 7 minutes in and say thanks! Those animations explaining spatial coherence are so great. This is an amazing resource, and all the work you put into it is appreciated!
The way you explain all those principles behind methods like this is just so great. I did many experiments with light during my time at university, but questions like "how can you make this light interfere exactly", or why an optical setup was just as it was, often were left unanswered.
This explanation and its connection to spatial coherence is super intuitive and revealing.
Again, a super good video. I think this now is my #1 favourite channel on YouTube.
About five times through this video I paused it thinking "Oh, that's fascinating! I definitely should like the video to show how much I appreciate it!" only to find I had already liked the video. Wonderful stuff - and your own enthusiasm for this topic shines through, as well as the high coherence of transmission of ideas. Very enlightening.
From watching all your videos, I've noticed that you take your viewers' doubts as feedback and address them in subsequent videos. Glad for that. Thank you.
Experiments are the only way to find out how things really work.
The "crude" measurement (63nm) was astonishing! Given the complexity of the machine and software, it would not surprise me to learn that your first value was better. Thank you for making this.
The software is actually not all that complex. What you measure is a movie where every frame represents a well-known delay. Each pixel contains a linear autocorrelation, i.e. the white light interference trace as a function of delay which looks like a pulse. Then, the software determines the exact position of that pulse within the "movie axis" (delay axis), and these peak positions for every pixel directly give you the relative height map. The rest is scaling and calibration. It's actually a nice little exercise for a grad student to code that up in Matlab or something like that.
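To make that pipeline concrete, here is a minimal sketch in Python of the crude version described above (the array names, the 5-frame smoothing and the synthetic shape check are my own assumptions; a real instrument fits the fringes far more carefully):

```python
import numpy as np

def height_map_from_stack(frames, step_nm):
    """Crude WLI reconstruction: frames is an (n_frames, rows, cols) array,
    one camera image per piezo step of step_nm nanometres."""
    # Remove the per-pixel DC background, keeping only the fringe oscillation
    ac = frames.astype(float) - frames.mean(axis=0, keepdims=True)
    # Very rough fringe envelope: squared signal smoothed along the scan axis
    env = ac ** 2
    kernel = np.ones(5) / 5.0
    env = np.apply_along_axis(lambda t: np.convolve(t, kernel, mode="same"), 0, env)
    # The frame index where the envelope peaks encodes the local surface height
    peak_index = env.argmax(axis=0)
    height = peak_index * step_nm
    return height - height.mean()          # relative height map

# Shape check with random data (real input would be a recorded frame stack)
frames = np.random.default_rng(0).random((200, 64, 64))
print(height_map_from_stack(frames, step_nm=40.0).shape)   # (64, 64)
```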
I doubt it. Remember that he guesstimated the average wavelength; no doubt the Zygo device has a better way to determine the average wavelength, like correlating the width of the pattern with the (known) displacement of the ocular.
@@RealNovgorod Remember this machine is 30 years old. That means the software was running on an 80486 or, more likely, an early RISC processor like an early-model SPARC.
@@TheEvertw So? It's simple arithmetic - at most it may use a Hilbert transform or FFT to precisely measure the relative pulse position, but a crude implementation with FIR filters or some simple thresholding would also work. FFT is very optimized and can run on very old machines without problems as long as the signal length is reasonable. That's what made ancient modem-era speech compression codecs work. Maybe the machine could use dedicated coprocessor hardware for that, but I doubt it.
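For the Hilbert-transform route mentioned above, a rough sketch of how one might localize the fringe pulse with sub-frame precision (purely illustrative; the function name and the synthetic trace are made up, and this is not necessarily how the Zygo software does it):

```python
import numpy as np
from scipy.signal import hilbert

def pulse_position(trace):
    """Estimate the sub-sample position of the white-light fringe 'pulse'
    in a single pixel's intensity trace along the scan (delay) axis."""
    ac = trace - trace.mean()
    envelope = np.abs(hilbert(ac))          # analytic-signal magnitude
    k = int(np.argmax(envelope))
    if 0 < k < len(envelope) - 1:
        # Parabolic interpolation of the three points around the maximum
        y0, y1, y2 = envelope[k - 1], envelope[k], envelope[k + 1]
        k += 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    return k   # in frame units; multiply by the scan step to get nanometres

# Synthetic trace: a cosine carrier under a Gaussian envelope centred at 123.4
z = np.arange(300.0)
trace = 1.0 + np.exp(-((z - 123.4) / 12.0) ** 2) * np.cos(0.8 * z)
print(round(pulse_position(trace), 2))   # should land close to 123.4
```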
Just a note: the step height is affected by the phase change on reflection (dissimilar materials: glass and chromium), thus the step is actually a bit higher (I would expect 10 nm more than what was measured)... More details can be found in Zygo webinars - nowadays they can correct for this (in the software, if the material is known).
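For a feel of the size of that correction: a reflection phase difference Δφ between the two materials shows up as an apparent height offset of about Δφ·λ̄/(4π), since the light crosses the step twice. A back-of-the-envelope sketch with an illustrative, made-up phase difference (not a measured value for chromium):

```python
import math

wavelength_nm = 600.0   # assumed mean wavelength of the source
delta_phi = 0.2         # illustrative phase-change-on-reflection difference (rad)

height_error_nm = delta_phi * wavelength_nm / (4.0 * math.pi)
print(f"apparent step-height offset ~ {height_error_nm:.1f} nm")   # ~9.5 nm
```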
I nearly forgot:
So much gratitude for having real subtitles! 🙏
Much appreciated.
New Optics Video to start the week! I cant ask for more!
I love WLI, and low coherence interferometry in general. Absolute position information down to the sub-nanometer scale.
Oh, OK, so that's how you take advantage of frequency diversity. I'm used to radar imaging where our spectral bandwidth relates directly to resolution. This summing of the reference and target signal must somehow equate to our "mixing" (which is a multiplication) of our target and reference signals. Just a wonderful presentation of all of the physics involved... well done. Thanks for taking the time to put this together.
Radar is a form of light with a very long wavelength: on the order of 10 cm instead of hundreds of nanometres.
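On the summing-vs-mixing point: a square-law detector turns the sum of the two fields into exactly that product term, because |E_ref + E_target|² contains 2·E_ref·E_target. A quick numerical check with toy real-valued fields (frequencies and amplitudes are arbitrary):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000)
E_ref = np.cos(2 * np.pi * 50 * t)                 # reference field (toy numbers)
E_tgt = 0.7 * np.cos(2 * np.pi * 50 * t + 1.2)     # target field with a phase offset

intensity = (E_ref + E_tgt) ** 2                   # what a square-law detector sees
cross_term = 2 * E_ref * E_tgt                     # the "mixing" product hidden inside

# The detected intensity is exactly the two self-terms plus the cross term
print(np.allclose(intensity, E_ref**2 + E_tgt**2 + cross_term))   # True
```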
This was cool, exciting and educational!
Fantastic video!! Thank you so much for such a beautiful presentation. I love the special effects at 24:14 😁
I applaud you for your description and your ability to get the white light fringes like you did. When we did this experiment in a college lab we had the use of large stable optical tables, very nice optics, nice micrometers, etc. and we still had a large amount of fiddling to do in order to see the fringes. I remember the feeling of great success when I actually got to see those white light interference patterns.
I guess the complication in any environment is people walking around causing vibrations. In this particular setup, the trick to finding the fringes is to first add a bit more tilt and then start searching with the translation stage. This increases your chances of seeing the pattern pass by.
Fascinating video. I love how well you keep the subject accessible even when talking about such advanced concepts
I am so glad that YouTube exists, otherwise I would never have found out about fascinating stuff like this!
Even though I already know how WLI works, I enjoy your videos. Thanks
I just wanted to thank you again for approaching every subject as an opportunity to teach, it makes your videos such a pleasure to partake in.
What a gratifying feeling to intuitively know where you are going and be able to correctly predict how the experiment would work but only because I've been following your videos and apparently absorbing knowledge! Thank you.
It's always such a treat when you put out a new video! Thank you very much!
Super late to the party, but wow! What a video! You really outdid yourself with this one, I'm just in awe of how much information and demonstration were shown. I had a general idea about how WLI worked, but now feel like I have enough detail to actually build or troubleshoot one. Great work, thanks as always for sharing your knowledge! ❤
Thanks Zack, you can actually buy these instruments (the hardware) on eBay, sometimes for relatively little money. However, as I also said in the video, building the software is the real challenge. In order to get to 0.1 nm resolution, you need to establish the relation between scanning position and phase to within a milliradian. So it's not simply about finding a maximum of the interference intensity for a pixel, but rather fitting the data to an expected interference curve. In fact, I don't even know how they do it exactly. But it's a very nice (though also niche) technique with fairly limited applicability.
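To illustrate what "fitting the data to an expected interference curve" could look like in practice (one plausible approach, not Zygo's actual algorithm; the model and numbers below are invented), a least-squares fit of a Gaussian-envelope cosine to a single pixel trace recovers the envelope centre and fringe phase together:

```python
import numpy as np
from scipy.optimize import curve_fit

def fringe_model(z, z0, amp, width, k, phi, offset):
    """Expected white-light interference trace for one pixel:
    a cosine carrier under a Gaussian coherence envelope centred at z0."""
    return offset + amp * np.exp(-((z - z0) / width) ** 2) * np.cos(k * z + phi)

# Synthetic pixel trace (in a real instrument this comes from the camera)
z = np.arange(0.0, 300.0)                       # scan position in frame units
true = fringe_model(z, 150.3, 1.0, 15.0, 0.8, 0.4, 2.0)
noisy = true + 0.02 * np.random.default_rng(1).standard_normal(z.size)

p0 = [z[np.argmax(noisy)], 1.0, 15.0, 0.8, 0.0, noisy.mean()]   # rough initial guess
popt, _ = curve_fit(fringe_model, z, noisy, p0=p0)
print(f"envelope centre ~ {popt[0]:.3f} frames")   # should come out close to 150.3
```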
Your visualizations of complex phenomena are unrivalled (and yes, I follow the other greats here).
Your explanation and illustration are amazing. A subject can hardly be explained more thoroughly than you have done here. You touched all the senses we have. Thank you, and keep making more videos like this.
I did white light interferometry on silicon wafers in an earlier job and it was one of the coolest things to do.. 😊 Glad to see that you managed to go all the way up to explaining a Mirau configuration.. This is for sure one of the most informative videos on WLI on YouTube.
Maybe can you also make a video on confocal microscopy and perhaps compare the two techniques? (WLI and Confocal)
I’m so glad I found this channel.
Can’t get enough of your excellent videos.
"Bored"? You are wild. I will NEVER get bored by whatever you teach us! Awesome video.
So good! It's great that you strike such a good balance between accessibility and depth in your presentation.
Thanks Manuel. I just looked at your channel; again, you are doing pretty well with the subscriber numbers!
Thank you, one of my favorite channels on YouTube. What took me days to understand in uni, watching your videos I'm able to grasp in a matter of minutes.
I hope you continue to make videos. They hold an amazing educational value!
Best wishes.
What an excellent video! Do I understand correctly that you want to use broad-spectrum light so you get a thin "slice" of your sample? Because this principle would still work with a laser, but then you cannot discriminate between 1/2 wavelength and 1.5 wavelengths, whereas with a broad spectrum, the more wavelengths away you are, the more it gets filtered out?
Yes correct, only with a short coherence length will you be able to identify and relate the correct fringes easily.
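The rule of thumb behind this: the coherence length is on the order of λ̄²/Δλ, so a broadband source confines the fringes to a few microns around zero path difference. With illustrative numbers (not taken from the video):

```python
mean_wavelength_nm = 600.0    # assumed centre wavelength of the white-ish source
bandwidth_nm = 100.0          # assumed spectral width

coherence_length_um = mean_wavelength_nm**2 / bandwidth_nm / 1000.0
print(f"coherence length ~ {coherence_length_um:.1f} um")   # ~3.6 um
```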
Great content sir, Zygo is one of the best manufacturers of CSI technology. Great thanks from the Zygo Represent team in Vietnam.
Makes the 2u work of EUV photolithography even more impressive. As always, love your content. Please keep more coming.
Fascinating, thank you! I have used an FTIR spectrometer and IR microscope for years, but didn’t realize you could use the same interferometry principle for surface height measurements!
Not gonna lie, I have struggled with some of the explanations in the past - but adding the Manim charts that really intuitively show the Fourier breakdowns and the interference as the broadband signal shifts is really, really helpful. Very very cool.
So fascinating, and as wonderfully presented as ever. The visuals combined with narration were especially clear and illuminating in this one. Thank you again!
Really cool to see you cover WLI !!!
Doggamn, this was a FASCINATING video from beginning to end, and I absolutely loved the footage you got with the commercial device. Can only imagine how much more accuracy modern high resolution digital cameras and better processing could add to that. =)
Always a pleasure to catch one of your videos.
Interesting. Awesome video, if you get bored of this unit I'll gladly buy it. Need one of these or an equivalent for a personal project / home lab :D
Magnificent video, Jeroen! 🥰 The simulation and explanation of white light were very instructive. It happened to come at a good moment for me. And congratulations on this beautiful instrument!
Thanks Nobby, and a big thank-you to you as well for your support with the debugging!
I don't know you, but anyone who helps someone with debugging has my deepest respect. Thank you 🤗
Another amazing demonstration Jeroen! The Zygo Newview 100 is a seriously cool piece of hardware!
Thank you for making these videos. I work building optical encoders, and while there are some very clever optical engineers and physicists in our department, I never get the chance to learn how these things work. And having never studied optics formally, the academic explanations go over my head.
What a wonderful visualisation of the phenomenon. Thank you for making this!
I am relieved to learn in the first half of this video that this technique is in fact not as simple as I thought, considering I once tried to fully understand it in an afternoon.
I love your videos. I didn't want this one to end
Your channel made the understanding of light very intuitive for me, thanks!
Man, this content is outstanding. Thank you for sharing.
You always have the coolest ideas to show, thanks a ton for sharing!
Thank you for bringing this interesting topic and instrument to the public.
I was just wondering about the effect of illumination bandwidth, and then you said the wider the bandwidth the better the resolution! That makes sense, but I would not have guessed it on my own. It definitely feels like a Fourier transform thing -- having more width in the spectral domain would allow sharper waves in time or space domain.
Hi Ben, the resolution limitations are actually mostly in the accuracy of the scanning with the piezo and in being able to relate the scanning position to the interference "phase" in each pixel. To get to the 0.1 nm resolution that the instrument has, you have to be able to measure the phase with an accuracy better than a milliwavelength. And you cannot do this by just determining where the fringe intensity is at its maximum using a 25 frames per second frame rate. So yes, there is some frequency analysis present in the software. In the case of narrowband radiation, it would actually be easier to measure the phase very accurately, but it would be very hard to identify which fringe is which. The latter is much easier with broadband radiation, because there aren't that many to choose from.
@AppliedScience
Another gorgeous video!!! Many thanks
White light interferometry can be used to measure distances up to almost a km with an accuracy of 70 micrometers like they used to do to measure base lines for geodesy. It's a very powerful technique.
Can you tell me how this works with a coherence length of only a few microns?
@@HuygensOptics It's called the Väisälä interference comparator. You basically use a setup like the Mirau objective, not quite the same but a similar principle. In order to see an interference pattern, the distance between the "sample" surface (also a mirror in this setup) and the reference mirror has to be twice the distance between the sample and the half mirror, to within the coherence length, i.e. a few microns. They used quartz rods of about 1 m in length whose absolute length was known to a few tens of nanometers(!) and with this setup they could "double" the length of the rod. This length could then be doubled, etc., up to about 1 km. Yrjö Väisälä developed this technique at my former home institute, Tuorla Observatory, Finland. There is still an 80 m long tunnel in the bedrock and a 300 m test track there. I should also mention that these measurements were extremely difficult due to the unstable air, but during a few days a year with exceptional weather it could be done. Great channel by the way, the best on the internet in optics!
@@HuygensOptics It's called the Väisälä interference comparator. It is similar to the Mirau setup you showed. In order to see an interference pattern at all, the distance between the "sample" (a mirror in the Väisälä setup) and the reference mirror has to be twice the distance between the half mirror and the sample, to within the coherence length. This allows you to "double" any known distance, which can then be doubled, etc. Still the most accurate (although very difficult to perform) method to measure distances of ~1 km "in the field". Great channel by the way, the best on the internet about optics!
@@KN-vz8dj Interesting, thank you for the additional info. I'll certainly look into it!
@@HuygensOptics I hope you do a video on how it works. I imagine the possibilities in the vacuum of space would exceed 1 km
Oh wow. This is sooo good! 🤗
I even think I understand CSI now. 😲
Brilliantly explained, wonderful visuals, perfect boundary conditions (if you know what i mean 😉)
Thanks so, soo much 🙏
Fabulous as always
The graphics explaining how the interference pattern arises and the sums of the two reflected beams are phenomenal.
Glad I left it in; initially I thought it was maybe too obvious and long and did not want to use it in the video.
Oh, your channel is SO interesting! I haven't watched the video yet, but I just know it's gonna be good. Keeping it for dessert.
thanks for the lecture on CSI, much appreciated.
Another amazing and educating video! Thanks for all the work you've put into this, I always love watching your stuff. Groeten van een amateurtelescoopbouwer uit Rijswijk!
Whoa, thanks a lot for this video! This is incredible; your video couldn't be more pertinent to me timewise. I am having trouble at work telling what the dimensions and materials of micron-sized parts are, and this could actually be a solution.
Plus, as I am only an intern, this is so clear and well presented that I can actually understand it and think of ways to apply it to solve my problem!
Thanks a lot Huygens Optics for sharing your knowledge,
Great video, as usual!
Awesome video! You could also look into optical coherence tomography (OCT), which uses roughly the same principle to "look into" certain materials such as skin or your retina.
I don't have one ;-). Also, building one myself would not be feasible technically, especially the software.
This is an absolutely fantastic video. Thank you!
What a fascinating method, very clever and your video made the explanation very approachable! Really neat that "white" light (otherwise often ignored in high precision optical methods) can be exploited by taking advantage of the very characteristics that make it less useful in most other techniques.
13:08 This reminds me of the property that extremely short light pulses necessarily have a broader frequency spectrum, which places upper limits on how closely (in frequency) you can pack two optical channels in something like a fibre optic cable while increasing their modulation rate; eventually, if you modulate the light fast enough, the two adjacent channels (which have plenty of separation at lower modulation rates) end up smearing over each other.
I seem to recall a previous video I've watched (not sure if one of yours or another creator) that mentioned ultrashort laser pulses and how they're effectively "broadband" and contain all frequencies, which is related to how in audio, they'll often use the impulse response of an audio signal to characterize a given device or environment, because the resulting waveform captures the effect of the device under test across all (relevant) frequencies.
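That time-bandwidth intuition is easy to check numerically: the shorter a Gaussian pulse, the wider its spectrum. A small sketch (arbitrary units, my own toy example):

```python
import numpy as np

def spectral_width(pulse_duration):
    """RMS spectral width of a Gaussian pulse of the given RMS duration,
    computed numerically via the FFT (arbitrary units)."""
    t = np.linspace(-50.0, 50.0, 4096)
    pulse = np.exp(-t**2 / (2 * pulse_duration**2))
    spectrum = np.abs(np.fft.fftshift(np.fft.fft(pulse))) ** 2
    f = np.fft.fftshift(np.fft.fftfreq(t.size, d=t[1] - t[0]))
    mean_f = np.sum(f * spectrum) / np.sum(spectrum)
    return np.sqrt(np.sum((f - mean_f) ** 2 * spectrum) / np.sum(spectrum))

# Halving the pulse duration doubles the spectral width
print(spectral_width(2.0), spectral_width(1.0))
```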
Thanks, another great lesson!
Astonishing explanation! Thank you very much!
Typically well explained, thanks again. This is a level above the Foucault test I used on my 6" parabolic mirror. But it works well enough.
cheers from sunny Vienna, Scott
I wish my undergraduate optics lectures had been so easy to understand!
thank you for your excellent work
Thanks for the wonderful explanation.
Amazing work as always!
That’s a beautiful instrument. The signal processing math one can do on an array of pixel values is awesome, and a great way to learn math and programming. I built an optical position measurement instrument using a sensor which had one straight line of 1024 pixels, to show where a shadow falls on the array, and used a breadboard hobby chip called Teensy. It’s good for beginners learning how to program, and not slow like the original Arduino learning board.
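For anyone curious what that kind of line-sensor processing can look like, here is a minimal sketch (hypothetical, not the commenter's actual firmware) of locating a shadow edge on a 1024-pixel array with sub-pixel precision:

```python
import numpy as np

def shadow_edge_position(line):
    """Sub-pixel position of a dark-to-bright edge in a 1-D sensor readout."""
    grad = np.abs(np.diff(line.astype(float)))   # the edge shows up as a gradient peak
    k = int(np.argmax(grad))
    lo, hi = max(k - 3, 0), min(k + 4, grad.size)
    idx = np.arange(lo, hi)
    window = grad[lo:hi]
    return float(np.sum(idx * window) / np.sum(window))   # centroid of the gradient peak

# Synthetic readout: dark pixels up to ~413, bright after, plus a little noise
x = np.arange(1024)
line = np.where(x < 412.7, 50.0, 800.0) + np.random.default_rng(2).normal(0, 5, 1024)
print(round(shadow_edge_position(line), 1))   # close to 412
```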
Really amazing explanation.
I love your description of shifting the intensity by a certain number of wavelengths (integer or non-integer). It called to mind the Fourier transform for me.
The interference pattern reminds me of the wavelets used in the wavelet transform.
Absolutely amazing, thank you for the great content.
@HuygensOptics I wonder, what is the condition that a certain pinhole will result in (spatially) coherent light? I would imagine something like this: As seen from every emission point of the light source (the LED), the phase difference between light passing the top and the bottom of the pinhole has to be smaller than some value related to the wavelength? So a smaller pinhole or a pinhole placed further away does increase the degree of spatial coherence?
Another thought: if you do spectroscopy of light emitted by an incoherent light source (even if it is monochromatic, e.g. a sodium lamp, in which every atom is on its own and its emission phase probably has no correlation to the emission phase of another atom), would you need a similar setup with a pinhole or slit at a certain distance to even be able to do spectroscopy, e.g. via diffraction with a grating? I remember, when doing spectroscopy during physics lab courses, there always was some kind of slit, and I would imagine it is there exactly for this reason (which was never taught or even mentioned, surprisingly).
I actually discussed this in a previous video. Everything is about the pinhole or slit having the correct area of coherence to achieve spatial coherence. This is the link directly to the chapter in that video: th-cam.com/video/nba4ztLBEh0/w-d-xo.html
Thank you! Oh wow; actually this simulation now helped me so much understanding the concept of area of coherence!
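For the first question, the usual rule of thumb (from the van Cittert-Zernike theorem) is that the transverse coherence width at a distance z from a source of diameter d is of order λ·z/d, so a smaller pinhole or a larger distance does indeed increase spatial coherence. With illustrative numbers (my own assumptions, not values from the video):

```python
wavelength_m = 600e-9        # assumed mean wavelength
pinhole_diameter_m = 50e-6   # assumed pinhole size
distance_m = 0.2             # assumed distance from the pinhole

# Order-of-magnitude transverse coherence width at the optics
coherence_width_m = wavelength_m * distance_m / pinhole_diameter_m
print(f"coherence width ~ {coherence_width_m * 1e3:.1f} mm")   # ~2.4 mm
```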
I love how far you've come with your animations and overall visualisations. I worry, however, that Robin Renzetti will have to discard his granite surface plates after seeing this video.
Thanks. There are large differences in the surface quality of granite, depending on the origin / composition, but also because of the method that they were polished. This particular sample was of "tombstone" quality (for graves). The stones you find in mechanical workshops generally don't show multi micron-size height differences.
Awesome! Never seen this before and it's fascinating, please keep educating!... Cheers.
You're amazing Huygens.
15:33 Isn't intensity always time-averaged (integrated) by definition? Because it's impossible to measure those frequencies directly: I ~ <E(t)^2>
Yes in any practical situation, the value of the intensity is a time average over many wave cycles.
Great work!
Fantastic lesson. Thank you.
Excellent video as always, very clearly explained. It might be interesting to use this microscope to examine a metal surface that has been scraped flat.
On the nanometer level, it will likely look like the Himalayas
@@HuygensOptics Thank you for the response. Indeed, I suppose it would only show the difference between the idea of precision in optics vs machining. I still think it could be interesting, although maybe only with a larger viewed area than achievable even at only 2.5x, at which point using this microscope feels like "using a cannon to kill a fly."
I was waiting for it :) Thank you!
When the two combine, it's like it's doing an autocorrelation with a different offset version of itself. The brighter parts are where the parts were most alike even though they were offset.
This is awesome! I just built a low coherence interferometer myself, and the part showing the phase offsets and how they relate to the intensity distribution was excellent. If you're interested, see the paper "Absolute optical ranging using low coherence interferometry" by Danielson and Boisrobert; I thought the Fourier treatment was clever.
Be careful when measuring two dissimilar materials, such as metal-glass interfaces, as a phase error is introduced. I'll direct you to the paper "Offset of coherent envelope position due to phase change on reflection." Thanks for the cool video(s)!
That is indeed correct, there will be phase shift differences with different reflective metals (including chromium). I decided to leave this out to not further complicate the explanation about how to calculate height differences.
Excellent!
Amazing video & technology.
Amazing how just with some relatively simple components anyone with the know-how could make a device capable of sub-micron measurements. But that software processing the images in the Zygo instrument is something else. The GUI reminds me of the UNIX workstations of the nineties; I wouldn't be surprised if there was one hiding somewhere in the system ;-)
Correct, it is ported from UNIX to Windows, which is clear from the way you have to close windows or move them around.
Another awesome video! I'd love to play around with data from something like that machine!
Beautiful explanation, thank you! I'm not an expert, and I'd like to share that I've been wanting to build a simple and very cheap Michelson interferometer just to see light interfere with itself. Maybe you can help by answering a few questions about your video:
At minute 6:40 you show the aluminum stuck to the LED light. Can the aluminum pinhole be a few cm away from several LEDs? Will this work the same?
At minute 6:54 you clearly state that you are using an achromatic lens, but at minute 6:22 the example shows a simple chromatic lens drawn with two red lines. And just to get this idea out of my mind... wouldn't an apochromatic lens be the best?
Can I use a simple chromatic lens from a cheap magnifying glass to interfere light?
Can I use simple and cheap mirrors to interfere light?
Can I use a simple piece of glass as the beam splitter, or should I buy one on Amazon?
I hope my questions are not too simple or even dumb... In any case, thank you in advance for any feedback you wish to share!
Brian
Separating the pinhole and LED by distance will make the light source extremely dim. To preserve spatial coherence, you should use high-quality mirrors and lenses. If the wavefront is not flat, you cannot do any good measurements. What helps is using an aperture with your lens: this will allow you to achieve better wavefront quality over the area, but of course that area will be much smaller...
@@HuygensOptics Thank you very much for replying. I'll do my best to buy high-quality stuff, not that easy of course. And what about the magnifying glass: which of those three do you recommend? Am I right that the last one is even better than the one used in this video?
I just realized that interferometry patterns are really just wavelets
Is this notion taken advantage of sometimes?
In the simulation at 5:30 - I've actually seen LED light emit particles like this; while looking up really close, you could see a halo of particles just like that shooting out. It was very cool to see, and I wondered why I was seeing it and if it was some kind of refraction effect from my eyes, but it didn't seem to clear up.
This channel is so damn cool
Modern René Descartes. Respect to you, Sir.
How do you ensure that the reference mirror is completely flat?
I tested it before I used it. Flatness is approx 1/6th of a wave in the visible area.
Hmmm, does the maximum fringe spacing we can attain in this setup have a particular meaning?
Brilliant!!! I like your 10,000 ft approach to this topic. Though I think that explaining the fringes would have been easier if you described constructive/destructive interference of the visible spectrum and how that relates to the optical path difference between the part under test, the reference, and the focal plane of the objective. Also, it should be added that too much tilt can cause ray trace errors and slope acceptance errors as well. Try to spread the peak fringe (center fringe) across the entire FOV (a.k.a. nulling), then take the measurement. You can also play around with the aperture stop to get returns deeper into the bottom of the cavities. Another FYI: the peak wavelength of the halogen lamp is the industry standard for high-end optical profilometers (white light), so the peak wavelength should be between 540-550 nm. Great job on this video.
Is the fuzziness of the interference lines due to the Heisenberg uncertainty principle? Is it possible to make the lines sharper?
That's an interesting question! The lines' fuzziness comes from the interference fringe being of sinusoidal shape, which you cannot get rid of when you have a single peak in your input light source. However, if you add a second or third light source at double the mean frequency, you get a faster-oscillating fringe which adds to the existing fringe. If you add the right phase shift, this can make the lines sharper.
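A quick numerical illustration of that idea, assuming one could simply add a phase-matched second harmonic to the fringe pattern (a toy model, not a real source design):

```python
import numpy as np

x = np.linspace(0.0, 4 * np.pi, 1000)

single = 1.0 + np.cos(x)                                 # ordinary sinusoidal fringe
with_harmonic = 1.5 + np.cos(x) + 0.5 * np.cos(2 * x)    # add a phase-matched 2nd harmonic

def fringe_width(pattern):
    # Fraction of the pattern brighter than half of its peak value
    return np.count_nonzero(pattern > 0.5 * pattern.max()) / pattern.size

# The bright fringes become narrower with the harmonic added
print(fringe_width(single), fringe_width(with_harmonic))   # second value is smaller
```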
I love your channel!
Amazing!
At 27:29 we can see a second, weaker scan of an interference pattern. Why is that?
edit: awesome video as always! I particularly liked the explanation of temporal coherence and the small note that the arms in a white light interferometer need to be exactly the same length.
Hmm so you can't use lasers in this application because the interference fringes will just appear everywhere and scanning is not possible?
oh wait... would there even be fringes?
You would indeed see a lot of fringes, but you would not be able to identify which was the central one. So you cannot correlate z-position and fringe pattern.
@GeoffryGifari
At 27:18, after the first interference pattern has left the screen, a second, weaker interference pattern is visible (right before you cut to the software showing the results). What is the cause of this?
Good observation! These are due to internal reflections in the optical system which can also create interference. They are no problem for the measurement since they are ignored by the software because of their very low contrast.
@@HuygensOptics Interesting, thank you
Thank you so much for sharing your knowledge! :D