Smartphone Camera Lens Design: A Patent Study
- Published Dec 26, 2024
- I dissected a recently issued patent for a 6-element smartphone camera lens. I learned as much about mobile phone cameras in general as about the patent itself.
I always find some kind of typo. There is one at 3:28. The "r" in the sag equation is actually a "y".
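For reference, the standard even-asphere sag equation used in lens patents and in OpticStudio, written here with y as the radial coordinate per the correction above, is:

```latex
z(y) = \frac{c\,y^{2}}{1 + \sqrt{1 - (1 + k)\,c^{2}y^{2}}} + \sum_{i=2}^{N} A_{2i}\,y^{2i}
```

where c is the base curvature (the reciprocal of the radius), k is the conic constant, and the A_2i are the even-order aspheric coefficients.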
OpticStudio is a product of Zemax (www.zemax.com)
Please consider clicking on "LIKE". Your positive feedback is the only encouragement that I receive to continue creating this kind of helpful material.
Here's the playlist with all of my optical design videos: • Optics and Optical Design
This video took over a month to research and produce. It sent me deep into the literature. Here are the sources that I consulted in order to make this video:
1. The patent reference is:
Chen, Chun-Shan, Tsung-Han Tsai, and Ming-Ta Chou. "Optical image lens system." U.S. Patent No. 11,561,375. Jan. 24, 2023.
2. A very detailed overview of the issues involved in designing mobile phone lenses:
Blahnik, Vladan, and Oliver Schindelbeck. "Smartphone imaging technology and its applications." Advanced Optical Technologies 10.3 (2021): 145-232.
3. Integration of a lens with the focal plane array sensor is done while considering both the optical resolution and the detector resolution, as described in:
Kasunic, Keith J. Optical systems engineering. McGraw-Hill Education, 2011.
4. Sasián includes a bite-sized chapter on miniature lens design. Wow! A must-read:
Sasián, José. Introduction to lens design. Cambridge University Press, 2019.
5. The subtle matter of how aspheric lenses influence Petzval curvature, even though the aspheric terms themselves don't contribute to it, is discussed on this blog:
Kats Ikeda, www.pencilofra...
6. For OpticStudio users, there is a really good tutorial on the Zemax Knowledgebase that takes you through the set-up and some of the analysis of a mobile phone lens:
support.zemax....
I don't ask for support for developing this content. There is no sponsor. No Patreon. I don't make viewers sit through ads (although YouTube does). So, please click "Like" whenever appropriate. It's difficult to measure the success of this channel. My only metric of success is that the videos help people to do what they do better. How do you measure that? The only signal for success comes from your "Like".
Photo credits (in order of appearance):
1. H. Raab (User:Vesta), CC BY-SA 4.0 (creativecommon...), via Wikimedia Commons
2. By Filya1 - Own work, CC BY-SA 3.0, commons.wikime...
3. Photo by Colin M.L. Burnett CC BY-SA 3.0, commons.wikime...
4. Photo by Colin M.L. Burnett CC BY-SA 3.0, commons.wikime...
#lensdesign
#opticaldesign
It is a privilege watching you, a real optical expert, walk through this design, bringing your insight and experience so clearly to its analysis. Thank you.
Wow, thank you! What a generous compliment.
Thanks for the great video! Regarding your question: The RI in modern sensors is corrected by a 17x17 array of values (luma shading). Those shading values are measured after assembly of lens and sensor with a white scene target and (usually) written to the sensor or some other available memory in the module. Later those values can be used to correct for the RI digitally. The same is done for R/G and B/G values to correct for 'color' shading (depending on wavelength and angle the transmission changes as well).
Thanks for the description!
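A minimal sketch of how such a shading grid might be applied digitally (the 17x17 grid size comes from the comment above; the interpolation scheme and the synthetic falloff data are illustrative assumptions, not the sensor vendor's actual pipeline):

```python
import numpy as np

def upsample_bilinear(grid, out_h, out_w):
    """Bilinearly interpolate a small calibration grid up to sensor resolution."""
    gh, gw = grid.shape
    ys = np.linspace(0.0, gh - 1.0, out_h)     # output rows in grid coordinates
    xs = np.linspace(0.0, gw - 1.0, out_w)     # output cols in grid coordinates
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, gh - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, gw - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = grid[np.ix_(y0, x0)] * (1 - wx) + grid[np.ix_(y0, x1)] * wx
    bot = grid[np.ix_(y1, x0)] * (1 - wx) + grid[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# Calibration: a white target is imaged and each grid cell stores the inverse
# of the measured luma falloff (a synthetic cos^4-like falloff is used here).
v = np.linspace(-1.0, 1.0, 17)
r2 = v[:, None]**2 + v[None, :]**2
luma_gain_17x17 = 1.0 / np.cos(np.arctan(0.7 * np.sqrt(r2)))**4

# Runtime: upscale the stored grid and multiply it into the raw frame
# (small stand-in resolution for the demo).
h, w = 300, 400
gain_map = upsample_bilinear(luma_gain_17x17, h, w)
raw = np.ones((h, w))                  # stand-in for a captured frame
corrected = raw * gain_map             # luma shading correction
```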
Thank you for sharing the insights; it is wonderful to see the real-time simulation. Please keep doing this.
Fantastic, thank you
Awesome review, lots of information. Thank you
This is amazing. I had seen a similar patent from Apple and wanted to read it someday. A video is a much better way to get an introduction. I had also wondered what kinds of optical CAD software are used, and that's explained here too!
First of all, thank you for your video; it’s very beginner-friendly. Recently, I tried to replicate the lens design based on your video and the patent mentioned in it. All the surface parameters match the table exactly, and I kept five decimal places in the simulation, just like you did. However, I noticed that my layout is somewhat different from the one in the patent and your layout. Specifically, lenses 1 and 2 are very close to each other, with their edges overlapping, which results in the edge light not converging on the image plane. The shapes of the other lenses appear to be consistent. I carefully checked the parameters for the four surfaces of lenses 1 and 2 and didn’t find any issues. Additionally, I used EFLY to check the focal length of each lens, and they are essentially the same as in the table. I personally believe that the shape of an aspherical lens is determined by its radius, thickness, and aspherical coefficients. Since these data match the patent data, why do the results differ? Could you please advise if there’s something I might have overlooked that’s causing my results to differ?
The first thing I would check are the clear apertures on those four surfaces. After I optimized, the first lens element had a front-side clear semi-diameter of 0.724mm and a back-side of 0.792mm. If all of the surface geometry is entered without error, then the next thing I would check is how the field and the aperture are defined. I used Real Image Height for the field and set the max field to 2.801mm. I used Image Space F/# for the aperture, and I set it to 2.35. Hope this helps.
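One way to sanity-check the overlap described above is to evaluate the even-asphere sag at the clear semi-diameter of each surface and see whether the edge air gap goes negative. A minimal sketch follows; the curvatures, coefficients, and airspace below are made-up placeholders, so substitute the patent's table values:

```python
import numpy as np

def even_asphere_sag(y, c, k, coeffs):
    """Sag z(y) of an even asphere: conic base plus even polynomial terms.
    c: curvature (1/radius); k: conic constant;
    coeffs: {order: coefficient} for the even aspheric terms."""
    base = c * y**2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c**2 * y**2))
    return base + sum(a * y**n for n, a in coeffs.items())

# Placeholder values for illustration only -- substitute the patent's table
# entries for the lens 1 rear surface and the lens 2 front surface.
y_edge = 0.79          # clear semi-diameter where the edges meet, mm
z1 = even_asphere_sag(y_edge, 1 / -3.0, 0.0, {4: 1.0e-2})   # lens 1 rear sag
z2 = even_asphere_sag(y_edge, 1 / 4.5, 0.0, {4: -5.0e-3})   # lens 2 front sag
t_air = 0.05           # center airspace between the two surfaces, mm
edge_gap = t_air - z1 + z2   # separation measured at the edge
print(f"edge air gap = {edge_gap:.4f} mm" + ("  <-- OVERLAP" if edge_gap < 0 else ""))
```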
Great videos.
Thanks for producing them!!
Great video as usual, thanks a ton for taking the time to make these deep dives!
Regarding how the falloff in relative illumination is corrected in software, one thing that is done for some applications is to create an inverse map of the falloff (often from calibration of individual lenses during manufacturing) and then multiplicatively scale the pixel readings to reach a flat field. Sure, this decreases the SNR of the outer portions of the image, but given the use is often far from the noise floor, it does not become terribly noticeable for the end user.
Thank you for the informative reply to my end-of-video question.
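As a sketch of that inverse-map approach (the white-frame calibration data here is synthetic; real maps come from per-module factory calibration as described above):

```python
import numpy as np

# Calibration: image a uniform white target, then store the multiplicative
# map that would flatten it.
v = np.linspace(-1.0, 1.0, 480)
r = np.sqrt(v[:, None]**2 + v[None, :]**2)
white = np.cos(np.arctan(0.7 * r))**4        # synthetic relative-illumination falloff
gain = white.max() / white                   # inverse map, normalized to the peak

# Runtime: multiplicatively scale the pixel readings to reach a flat field.
# The same gain multiplies the noise, so SNR drops toward the corners
# exactly as the comment above points out.
raw = white + np.random.default_rng(0).normal(0.0, 0.01, white.shape)
flat = raw * gain
```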
Great video, thanks for posting.
Thank you for this video!
What is your opinion of Leica Co. working with Xiaomi?
Thanks a lot for this tutorial!!! Is it possible to download a Zemax file of this design? Thank you.
I don't have a good way to share files, but you can find a lot of good Zemax files, including some similar to this one, at www.lens-designs.com
Can we just have only one lens and one plastic filter that corect the light?
Now try and tolerance this thing. So hard to get the plastic molders to commit to anything.
But they got it. Sharpness is good enough.
I wonder why plastic is not used for photographic lenses in larger formats.
Small range of refractive indices, high coefficient of thermal expansion, and poor optical properties.
I have seen plastic lenses on AliExpress; they have poor quality but a low price.
It is. Canon has gotten quite good at moulding elements at that size, so their cheap lenses tend to include some very extreme plastic aspherics (e.g., the RF 28mm f/2.8).
Thanks for the wonderful video, but I am still a little confused. The magnitude of the distortion should change following the curve of the lens shape, but how can you get a single number describing the distortion when lenses 5 and 6 are not spherical? The distortion might change from barrel to pincushion depending on how the chief ray is refracted across the curve. Can you help me understand this more deeply? I am a college student trying to research distortion. Thank you a lot.
This is a really good question. Wavefront aberration coefficients can be computed analytically through fourth order aspheric. But for higher order aspherics, ray tracing is exclusively used to understand the final image locations. A fourth order aspheric will cause a small departure from a spherical surface, unlike the higher order terms in the surfaces used here. As you noted, the distortion is hybrid, meaning that there is a change in sign moving from the center to the image edge, and ray tracing, rather than a single number such as a Seidel coefficient, is the only way to look at it. Zemax, and all other programs, do compute a table of Seidel coefficients. But when higher order aberrations dominate, and high order aspherics are used, I really don't know what meaning, if any, they have. I'm sure they are meaningful, and maybe someone can help us out here.
By the way, you can get hybrid distortion without using aspherics. Some lenses balance out the third-order distortion, leaving higher-order field-dependent magnifications that can result in a wavy distortion-versus-field plot. Bear in mind that a distortion plot is not the result of only the third-order polynomial term, but of all polynomial terms that describe a displacement in the chief ray.
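To make the hybrid behavior concrete, here is a toy numeric sketch (the polynomial coefficients are invented for illustration; the 2.801 mm max field is taken from the reply earlier in this thread): a third-order barrel term balanced against a higher-order term of opposite sign gives a distortion curve that changes sign between center and edge.

```python
import numpy as np

# Distortion(%) = 100 * (h_real - h_ref) / h_ref.  The third-order (barrel)
# contribution goes as h^2 in percent, the next order as h^4; opposite signs
# produce a "hybrid" curve that crosses zero inside the field.
h = np.linspace(0.0, 2.801, 8)         # real image height, mm
hn = h / h.max()
dist_pct = -1.5 * hn**2 + 2.0 * hn**4  # invented coefficients, percent units

for field, d in zip(h, dist_pct):
    print(f"field {field:5.3f} mm: distortion {d:+.3f} %")
```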
@stephenremillard1 Thanks for the great explanation. I did some experiments with a Cooke triplet lens structure and realized that the distortion's magnitude is really hybrid through the lenses, but in another lens structure it doesn't really work that way. When the system had a significant amount of barrel distortion, I added another lens that also has barrel distortion, and the final result was that the distortion decreased but the image got smaller. This is quite interesting; the image is really compressed to expand the FOV with lower distortion. But I still don't know how it works.
Interesting, thank you for sharing.
I loved it. Since I don't own any sophisticated device, telescope, etc., it was fun to learn more about my phone.
An unrelated question: can you tell me about absorption in the lens? My knowledge of optics is limited, but I am fascinated by this field. I would appreciate it if you could guide me and tell me more about absorption in lenses.
Also, I don't know if this is something common in the market, but it would be interesting if I could filter a specific band or frequency out of my camera. I know there are different filters in the scientific world (low-pass, high-pass, or band-pass), but I am dreaming of something that I can play with or make at home.
Also, you mentioned the glass layer that filters the infrared for the CMOS sensor. Wouldn't it be nice if we could remove that glass layer and also filter out the visible range? Then we could end up with an infrared camera in our phone. (Yes, I know there are dongles and gadgets that act as infrared cameras for phones, but when you make something at home it is more fun and enjoyable.)
A final question: is there software for simulating optics for free? I don't have much money to buy expensive software for a hobby.
I have seen an add-on in FreeCAD, but it is not that advanced, and the simulation consumes a lot of RAM.
These plastics are engineered for very low absorption across the visible spectrum. I see from the datasheet (jp.mitsuichemicals.com/en/special/apel/lineup/) that the APL series of plastics have transmissivity at d-light of 91% through 3 mm of material - assuming I'm reading it correctly. So, that would be about 99% transmitted through each lens element in this patent. I don’t have any information about IR and UV for these materials. It seems that replacing the IR filter with an IR window (iriss.com/articles/what-type-of-lens-materials-are-used-in-infrared-inspection-windows/#:~:text=The%20most%20common%20materials%20used,used%20infrared%20window%20optic%20materials) would certainly make for an interesting IR camera.
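As a back-of-envelope check of that per-element number (treating the datasheet transmissivity as if it scaled with path length via the Beer-Lambert law, and assuming a 0.5 mm element thickness; both are simplifications, since the datasheet figure likely includes surface reflection losses):

```python
# T(t) = T0 ** (t / t0): Beer-Lambert scaling of transmittance with thickness.
T0, t0 = 0.91, 3.0      # datasheet: 91% through 3 mm of APL plastic
t = 0.5                 # assumed element center thickness, mm
print(f"per-element transmission ~ {T0 ** (t / t0):.3f}")   # ~0.984, i.e. roughly 98-99%
```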
There are some budget options for optical design software. Here are two. There is ATMOS-ATM which can be purchased for $400 from www.astro-physics.com/software/. If you can use non-sequential ray tracing, you can request the demo version of FRED at photonengr.com/fred-software/. It comes with a perpetual license. The limitation on the free demo is that you can’t save files or write scripts. The number of rays is also limited. In other words, it's exactly what it is called. A demo. But it’s free.
@@stephenremillard1 Thank you so much for the information.
As for the camera, yes; I also had an eye on the market, but when I make something myself, it has a thousand times more value to me than buying something pretty off the shelf.
As for FRED, I know about it; my professor had a commercial version of it. But I was wondering about something that I could use commercially. In the CAD and graphics world there are many free and open-source programs, and I was hoping to find something similar for optics.
Filtering specific frequencies is easily done with a filter in front of the lens, either a screw-in type for a camera or one that clips onto a smartphone. You can get a few cheap UV filters (which are generally just plain glass these days) and play with coatings on them at home without risking damage to any expensive optics. Mass-produced filters that block or pass specific frequencies are also widely available for astrophotography enthusiasts.