You're the future of education, respect.
Can't express how grateful I am for this wonderful content! Thank you for all the effort!
Outstanding explanation of the SRC paper. Very intuitive and visually explained. As always, second to none.
I read this in your book some months ago. This explanation helped me round out the idea. Thank you!
Thanks for such a great explanation
Thank you for this magnificent explanation.
Regarding the code: I've tried running and modifying the Python code to reproduce the images from the book. I was almost successful, except for the picture with the mustache @11:05. I'm unable to get the algorithm to converge, even after increasing the maximum number of iterations and reducing `eps`. Technically it converges, but the resulting vector is not sparse, so the reconstructed image is a blend of many faces. Has anybody else tried the code?
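In case it helps anyone debugging the same "converges but not sparse" issue: a common culprit is un-normalized dictionary columns, which can make an L1 solver return a dense blend. Below is a minimal, self-contained sketch (not the book's actual code) using a hand-rolled ISTA solver on synthetic data; the function name `ista` and all parameter values are my own choices for illustration.

```python
import numpy as np

def ista(A, b, lam=0.1, iters=1000):
    """Iterative soft-thresholding (ISTA) for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L with L = ||A||_2^2 (Lipschitz constant)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - step * A.T @ (A @ x - b)        # gradient step on the smooth term
        x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft threshold
    return x

# Toy demo: recover a 3-sparse coefficient vector from 30 measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 100))
A /= np.linalg.norm(A, axis=0)                  # unit-norm columns (often the missing step)
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = ista(A, b)
support = np.flatnonzero(np.abs(x_hat) > 0.5)   # recovered support: should be {5, 40, 77}
```

If skipping the column normalization in your own pipeline, the solver effectively applies a different penalty to each face image, which can produce exactly the dense "blend of many faces" described.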
Thank you so much for this fantastic introduction to the concept. I was lost before this video, but now at least I have some footing.
The code links on the website don't work; they just redirect to the corresponding video. Please help me with that.
I don't understand why it's important to downsample. If you did the exact same thing but kept the images at full resolution, you'd have a system that's overdetermined instead of underdetermined (assuming the number of pixels exceeds the number of people times the images per person), but you could still enforce sparsity via the solver, e.g. with matching pursuit or something similar. Wouldn't that do better, since it has full-resolution atoms? Or is the downsampling helping in some other way?
For that matter, why not do a separate solve per person, instead of solving across all people at once, and then find the person with the best match as a second step?
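For what it's worth, the "separate solve per person" idea can be sketched in a few lines: fit each person's training images to the test image by least squares and pick the class with the smallest residual. This is a toy illustration on synthetic data, not the SRC algorithm from the paper (SRC's joint sparse solve makes the classes compete for coefficients, which is part of what gives it robustness to occlusion); all names and dimensions below are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
m, per_class, classes = 50, 8, 3
# Each "person" spans a low-dimensional subspace of pixel space.
bases = [rng.standard_normal((m, per_class)) for _ in range(classes)]
y = bases[1] @ rng.standard_normal(per_class)   # test face drawn from person 1

def classify_by_residual(y, bases):
    """Per-class least squares: project y onto each person's training span,
    return the class with the smallest reconstruction residual."""
    residuals = []
    for B in bases:
        coef, *_ = np.linalg.lstsq(B, y, rcond=None)
        residuals.append(np.linalg.norm(y - B @ coef))
    return int(np.argmin(residuals)), residuals

pred, res = classify_by_residual(y, bases)      # pred should be 1 here
```

On clean, aligned faces this works; the question is whether it degrades faster than the joint sparse solve under occlusion and corruption, which is the regime the paper emphasizes.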
It amuses me how he insists (from 4:38) that the image MUST be downsampled. The original article humbly explains: “The only reason for resizing the images is to be able to run all the experiments within the memory size of Matlab on a typical PC. The algorithm relies on linear programming and is scalable in the image size.” Furthermore, Section 4.2 states that the algorithm works regardless of whether the system is over- or underdetermined.
4:44 I guess downsampling is important to ensure that the height of the vector (the number of pixels) stays below the width of the entire matrix (the number of training images). A high-resolution image would give a very tall vector, which would require an impractically large data set to keep the system underdetermined.
Very good explanations sir. Thank you.
Thank you Professor
Disturbed by how robust this is. I'm concerned about companies selling this tech to police and governments, letting them identify people who are unhappy and protest while assuming that a mask or glasses will keep them safe.
Is there any use for this that isn't dystopian? Honest question. Most of my engineering/science cases have been reasonable, and nothing close to the @1:50 bottom-row noise case. I get that noisy signal detection and restoration is important, but this seems extreme.
If you're "feeling so disturbed," please don't smear this man. He is a brilliant educator, and this is very helpful to many! I would hate to see another high-quality educator go elsewhere, only to have to find another one in an airport bathroom.
I've been reading your book and online material and watching your presentations, which are great! Thank you for all you do! I do have a question I keep wondering about: how could one construct a data collection policy that would provide the well-labelled data required for these kinds of algorithms? I know deep learning has its place, but when I was working on mass data sets for keystroke identification in the past, we had massive data collection plans to support well-structured data sets for classical machine learning techniques. Obviously, labelled data is a challenge in all supervised learning, but SVD-derived techniques seem especially sensitive to pre-structured data collection. How can proper feature engineering, e.g. backed by a robust data collection policy, carry over to real-world application?
This is an "educator," not a "corporation"! Not everything revolves around some bizarre, super-secret data collection policy for "public" information. A good example here is detecting forgeries, if you don't mind my opinion, as I'm sure you're not talking about "data regression." Regards.
@@Jibs-HappyDesigns-990 It is nice to have responses to YouTube comments, thank you. The comment wasn't about a cybersecurity data policy but about how to collect data in a planned, consistent way: a data collection policy. See, this is exactly the problem; educators are not really preparing data scientists for the field that industry is rapidly advancing. I understand this is academic, but I've worked in academia for a decade and now also receive the product of academic professional preparation in the mass of junior professionals entering their careers. My question was centered in that research, which academia actually shares with industry (though industry is worse at formalizing data collection). I read Dr. Brunton's book and looked at his online material. I'm a huge fan and very appreciative of what he is doing! My question and comments were about what is missing in academic professional preparation, and what I see in academia's response to the market need for data scientists: surging graduates, most of whom can't actually do the application work. So I noted the point Dr. Brunton made in his book about the Eigenface technique with PCA (and most of these ML models) requiring the data to be meticulously aligned on the eyes, for example. This is a larger challenge than most recent graduates understand or can cope with. PCA/SVD are great, but without data alignment their lower-dimensional inflexibility cannot cope, and recent graduates retreat to the neural-network black boxes that I believe Drs. Brunton and Kutz are both trying to move past. I agree with them and am trying to help (and am a little tired of having to train DS graduates to actually do what their universities say they can do when they accredit them with degrees in Data Science).
I can tell you that colleagues and even my boss would rather hire mathematicians or engineers to do data science work, because they believe they are better prepared for actual data science application. And IMHO it's odd to consider data science a lofty university topic when the data is being generated en masse in commercial applications, not in universities, which has directly caused universities to surge graduates through often hodge-podge programs smashing statistics, computer science, and mathematics together without sufficient training in software engineering, networking, IT, or cybersecurity. Maybe that's my point: what use is a data scientist who doesn't understand how to collect and use data? It would be one thing if I were only commenting on the failure of data scientists to turn their models into software accessible to users or systems through proper interfaces; that point I would concede. But graduates are also missing that understanding of MLOps, which is literally being invented by industry as I write this. We have a long way to go, and defending academic progress by calling it education, in this case, won't suffice.
@@mraarone Thank you for the very helpful reply! Ah, I see: this is usually the "trade secret" of the course. The funny thing is that this changes every 3-6 months due to new NNs, mathematical systems, and new technologies, so it can become a genuine twister. European countries are very good at keeping "principles in data science" organized.
Good luck finding what you need! In the meantime, here is a link to someone who is very helpful with finding the needed testing data for basic NNs. NNs are probably what you're looking for to filter those grounds from that coffee, so you could be looking at another learning curve. Hope this helps ;)
th-cam.com/channels/R1-GEpyOPzT2AO4D_eifdw.html
Love the content and the explanations! 👏
Wait, does this mean that if CSI has a HUGE database (a library of faces) and good computational power, they could actually recover my face from low-definition security cameras? In theory, at least.
Love this series, but at the end of the talk I noticed that you haven't completely erased your clear board. There are still a few letters ghosting in the upper left.
What does how well he cleans his board have to do with anything here? Please don't scare him away! He's a wonderful educator!
Thank you so much sir.
Yes, MATLAB! Nice. Thanks for the opportunity to examine this project! My mustache man won't have highlights!
The flow fields remind me of a fellow's theory that dark matter could just be instrument correlations, or something like that. I forget now. Wow, this will be great!
Can you please help us code the FFT method to solve a difference equation and simulate it in MATLAB?
That is, solving a difference equation using the FFT in MATLAB.
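Not MATLAB, but here's the idea in Python, which translates almost line-for-line to MATLAB's `fft`/`ifft`: for a constant-coefficient difference equation with periodic boundary conditions, the DFT turns the shift y[n-1] into multiplication by exp(-2*pi*i*k/N), so the equation becomes a pointwise division in the frequency domain. The equation y[n] - a*y[n-1] = x[n] and the values below are just an illustrative example, not from the lecture.

```python
import numpy as np

# Solve y[n] - a*y[n-1] = x[n] with periodic boundary conditions.
# In the frequency domain: (1 - a*exp(-2j*pi*k/N)) * Y[k] = X[k].
N, a = 64, 0.5
rng = np.random.default_rng(0)
x = rng.standard_normal(N)                       # arbitrary forcing signal

k = np.arange(N)
H = 1.0 / (1.0 - a * np.exp(-2j * np.pi * k / N))  # inverse of the difference operator
y = np.fft.ifft(np.fft.fft(x) * H).real            # real input gives a real output

# Verify against the recurrence itself (np.roll gives the circular shift y[n-1]).
check = y - a * np.roll(y, 1)                      # should reproduce x
```

Note the division is safe here because |1 - a*exp(-i*theta)| >= 1 - |a| > 0 for |a| < 1; for a difference operator with a zero on the unit circle, the frequency-domain division would blow up and this approach needs care.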