My mind is blown with every video you post.
Thank you, that's very humbling! A high standard to maintain. Regardless, I hope you continue to find something useful.
When you said you were making a video, you weren't joking. That is so cool, Kent!
Thank you Aaron!
Kent by now I should no longer feel as amazed and humbled by the range and depth of technical knowledge you bring to your presentations and yet you have done it again here. Wow!
Charles you're the most productive and skilled engineer I've met. That makes your compliment one to strive to live up to over a lifetime. Thank you
Wow, as a photographer and machinist, this is very impressive imagery. I was completely lost in the explanation, except for the 50mm Schneider lens... you have some great Manfrotto C-stands and grip gear... and of course your South Bend lathe... best wishes from an old guy in Florida, Paul
At my place of employment, we use these cameras to troubleshoot and diagnose issues with extremely high-speed machines. I was amazed the first time I saw it used almost 15 years ago now. I never thought of it being used on a lathe or mill, though.
I am just blown away! I am a technical person, but the depth of knowledge you go into is absolutely mind-blowing! I cannot imagine how much time went into the research for what you've demonstrated here!
Thank you Vasi! I have experience with computer vision for measuring plant health and with FPGAs as application accelerators. I had started an FPGA-based electronic leadscrew before I had LinuxCNC, so much of the work was already done. Wishing you all the best!
Really cool! I'm reminded of Applied Science's video of a record playing inside an SEM; a big effort to create a unique view of a difficult-to-capture process.
Thank you! Is this the video you were remembering?
th-cam.com/video/GuCdsyCWmt8/w-d-xo.html
@@kentvandervelden Thats the one!
This is awesome! I want… no need… to see more cuts!
Thank you! I wasn't thinking when I started cutting and later realized I had showered the electronics with small bits of steel. I do need to tidy up and make it all more robust. Finding a more affordable camera, in case of breakage or oil, is equally important. It would be a sad day to damage this camera.
That cutting at 5:42 made me smile 😁 That looks pretty cool!
Thanks! It's a neat effect. I would like to make a more robust setup and a practical demonstration.
this deserves a lot more views!
Very Cool! Nice job.
Thank you :)
Wow, that is really rather excellent. I'm afraid the software is beyond me now. The last time I did any VHDL was back in 2002, and too many cells have perished since then.
Thank you Murray! Fantastic that you worked with VHDL at all. I started when I needed to accelerate software. VHDL takes a different way of thinking that continues to be hard for me to get used to.
Amazing job!
But I can't believe you went through all this trouble to show just 30 sec of cutting footage :D
Other youtubers could build their whole channel around this gimmick!
Have you subscribed to my second channel? Just kidding :) Must respect people's time. Hopefully people don't see ads the first few days too. Thank you
What a unique combination of skills
Metal work
FPGA
CNC
to name a few.
That's impressive
I appreciate your message a lot, thank you.
Great as always!
Thank you, glad you found the video interesting
So cool! Would love to have something similar, but the camera requirements (trigger a video frame on signal) put it way out of reach. The camera you used is 2749 EUR. HOWEVER, what if we set it up the other way around? The spindle already has RPM control. "All" that's needed is to sync it so the spindle speed is an exact multiple of the camera FPS. Cameras that can do 15, 24, 25, or 30 FPS would allow 900, 1440, 1500, and 1800 RPM and multiples. Requirements: 1) the camera must have low frame-timing jitter, but my guess is that most already do, and 2) a much more advanced spindle controller that can not only hold frequency but also 3) allow fine adjustment for the initial sync and 4) compensate for momentary load/cutting-induced slowdowns by slightly speeding up afterward. For lower-power spindles there are open-source, moddable controllers (VESC6 or ODrive) that should work. Plus, for a code jedi, advanced spindle control is a pathway to many CNC abilities... (I have some things planned in this space)
I love your ingenuity! What you describe could work, but it will be challenging. You'll likely leave the camera in freerunning mode, and hope the camera triggers itself consistently. Trigger commands over USB would be too variable. Not sure how the result will change if the camera is sending a video stream instead of frames. Try to find a global shutter camera, which unfortunately drives up the price. Most cameras are rolling shutter. For quick and dirty checks, a rolling shutter is probably fine. A global shutter would be necessary for measurements.
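If it helps to sanity-check speeds, the lock condition is just that the spindle completes a whole number of revolutions per camera frame. A minimal sketch in plain Python, with a made-up RPM ceiling purely for illustration:

```python
# Candidate spindle speeds that keep the part phase-locked to a free-running
# camera: an integer number of revolutions must elapse per frame.
# The 4000 RPM ceiling is an assumption for illustration, not a real limit.

def locked_rpms(fps: float, max_rpm: float = 4000.0) -> list[float]:
    """RPMs at which a whole number of revolutions elapse per frame."""
    base = fps * 60.0          # one revolution per frame
    rpms = []
    k = 1
    while k * base <= max_rpm:
        rpms.append(k * base)  # k revolutions per frame
        k += 1
    return rpms

for fps in (15, 24, 25, 30):
    print(fps, locked_rpms(fps))
# 15 -> [900, 1800, 2700, 3600], 24 -> [1440, 2880],
# 25 -> [1500, 3000],            30 -> [1800, 3600]
```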
I love this. But the film was cut at the beginning of the video... you have to go to 3:57 to see it work. And it is glorious. Really great work. And what it took you to do it, with all that FPGA stuff. Goodness, that was just awful to have to do all of that work, but the results are worth it, even if it had no practical use. Look at it on the screen. It looks so cool. Who cares if it has a practical use? But you can see that clearly this is an amazing invention. I would count this as an invention, and I don't even believe in those. They are discoveries, but this one is, I think, the single coolest thing I have ever seen on YouTube about CNC machines. It is so creative, and amazingly useful. You can see the part sitting perfectly still while it is rotating. And magnification lenses could be added if needed to get way down into the details. Congratulations on this "invention". I think you created something completely new, with amazing ramifications.
I'm very grateful for all your support! There's been good interest in this project but mostly as a novelty instead of a tool. I would need to show more examples. There's an update to this project that adds tool position to the display. And the demo is at the beginning of the video.
th-cam.com/video/Bqa2grG0XtA/w-d-xo.html
Wow, a real DIY motion camera project 🥳 That is very well thought out 🤔 This could be adapted for many other uses, such as testing tires for rotation defects 🤓 Thank you 👌🥠☕
Your comment is a great example of the benefit of sharing project videos. My Dad was a mechanic and used an ancient tube-based tire balancer. I could probably reuse the balancer's motor and replace the electronics with a magnet-attached accelerometer that triggers the camera. If you're interested, I've continued to work on the lathe example and plan to have the next update in a week. Thank you!
I want to see this thread cutting. Or knurling.
Those are great suggestions! It'll be a little while; I would like to reduce the hardware to a board and protect the camera. Thank you
Did you consider something simpler, like a magnet, a high-speed Hall effect sensor, and a FET driving the shutter? Impressive results; it would be very helpful to see how the tool engages with the stock to identify issues.
The existing hardware allowed for a similar setup: attach the camera trigger to one 7i76e output and add one line to the HAL file. This has nearly the same effect as what you describe, but your approach would not have the 1ms jitter (big plus). Both are fast to set up, but with the drawback of needing to use velocity and time to estimate when to trigger the camera with a non-zero phase offset (or a means to rotate the Hall effect sensor). If the spindle rotation is constant, all is good. My more complicated approach remains stable with changes in acceleration. If using timing to implement the phase offset, a handy way to do it is by setting the time delay on the camera, which delays taking the image by a programmed amount following a trigger. Also, the images are delivered from the camera with a microsecond-resolution timestamp piggybacked, which is nice for estimating velocity, especially considering USB latency. Thank you for asking. It's all very fascinating!
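For what it's worth, the timing-based phase offset works out to simple arithmetic. A rough sketch in plain Python rather than the actual HAL/FPGA code; the names and numbers are only illustrative:

```python
# Estimate spindle speed, then program the camera's trigger delay so the
# exposure lands at the desired angle after the once-per-rev pulse.
# Illustrative sketch only; CPR of 10,000 matches a 2,500-line encoder in 4x.

def trigger_delay_us(rpm: float, phase_offset_deg: float) -> float:
    """Delay (microseconds) after the once-per-rev pulse for a given phase."""
    rev_period_us = 60.0 / rpm * 1_000_000.0              # one revolution, in us
    return (phase_offset_deg % 360.0) / 360.0 * rev_period_us

def rpm_from_timestamps(t0_us: float, t1_us: float, counts: int, cpr: int = 10_000) -> float:
    """Estimate RPM from two image timestamps and the encoder counts between them."""
    revs = counts / cpr
    minutes = (t1_us - t0_us) / 60e6
    return revs / minutes

print(trigger_delay_us(1200.0, 90.0))   # 12500.0 us: a quarter turn at 1200 RPM
```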
@@kentvandervelden great explanation, cheers
Pretty cool man!
Thank you! I've been working on it some more and will post an update in a couple of weeks.
As always, cool!
Wow, thank you :)
That's pretty cool; you definitely have that dialed in with the DE10-Nano. What is your current exposure time on the camera? I do practically the very same thing, but using a custom component in LinuxCNC, and it's a much different application. You are right about the delay caused by the camera trigger coming from the LinuxCNC signal back to the Mesa card, especially running a 1ms servo-thread, but it's not as bad as it sounds, and in some cases it doesn't matter at all. As long as the delay stack-up is consistent, it's just realized as a slightly later trigger, which you have to offset back anyway. I'd be curious to replicate this with my own LinuxCNC setup, but I don't have a CNC'd lathe, just a mill. I would think that my setup would be less consistent than yours, but I would be curious to see how it works.
My component employs circumferential positioning. A circumference is set as a pin, which gives it a length from the index-input. Then a length is set in the component to trigger the output, which in this case is the camera, and the camera controls its own strobe. I make my own LED strobes, and these are not on constantly because that consumes a lot of power and heat. For what I do and the power of my strobe, I expose at about 250us. I have had a strobe that was powerful enough to expose at 100us.
A screenshot from an oscilloscope monitoring the time difference between the Mesa card vs. the hardware is pretty striking. The on-camera effect is amplified at larger radii. I was going to perform data collection with the Mesa card, which seemed cool, but it just is not suitable. I started with a strobe, no camera, but the flashes are rough on the operator. I'll try again if I do the remote control project. I have a small amount of jitter remaining from the 1us FPGA freq. and maybe encoder belt flex. Thank you
Sorry, I thought you were commenting on the newer video. Please check th-cam.com/video/Bqa2grG0XtA/w-d-xo.html
@@kentvandervelden I checked the 2nd video. That's pretty much perfect on-time triggering. Now that I think of it, the machine I have the cameras attached to has a roller driven by a ClearPath servo, and that roller is coupled to the encoder used for positioning. It's there for simulation, but it's actually perfect for testing the triggering accuracy. I expect my setup to be pretty bad compared to what you have because the image processing, LinuxCNC, and everything else is done on the same small PC. For my use, high accuracy is not necessary because I'm not even taking a picture of the same -exact- object at every trigger, though I think this would be a good test nonetheless.
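In plain Python, the circumferential-positioning trigger described above boils down to roughly the following. This is only a sketch of the idea, not the actual LinuxCNC component; the names and numbers are made up:

```python
# Convert the encoder count since the index pulse into a length along the
# circumference and fire the output once per revolution when it passes a set
# distance. Illustration only; the real version runs as a LinuxCNC component.

def make_circumferential_trigger(circumference_mm: float, trigger_at_mm: float, cpr: int):
    """Return a function that, given counts-since-index, says whether to fire."""
    fired = False

    def update(counts_since_index: int) -> bool:
        nonlocal fired
        pos_mm = (counts_since_index % cpr) / cpr * circumference_mm
        if pos_mm < trigger_at_mm:
            fired = False              # back before the trigger point: re-arm
            return False
        if not fired:
            fired = True
            return True                # pulse the camera/strobe output
        return False

    return update

trig = make_circumferential_trigger(circumference_mm=150.0, trigger_at_mm=40.0, cpr=10_000)
print([trig(c) for c in (0, 1000, 2666, 2667, 5000)])  # fires once, just past 40 mm
```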
That was really cool! What does the trigger output control? Since you said you picked that light because it is flicker-free, I would guess it controls the shutter on the camera. Would it be practical to switch a bank of LEDs as well, or instead? Electronic shutters may have changed the situation, but it used to always be easier to control the light precisely than the shutter.
Secondly, what kind of quadrature decoder are you using, 1x or 4x?
Thank you, especially since I know you'll understand the electronics! The trigger controls the readout from the camera's sensor buffer; there's no mechanical shutter. (It's a full-frame sensor and the entire frame is read out at once. Not sure how they implement that; I guess each photosite has two accumulators that are swapped per frame.) The capture speed is limited by USB3 bandwidth: 30 FPS at 12 MP, 8-bit Bayer/mono, but reduce the captured region and the FPS increases proportionally. The light selection helps avoid overall brightness flicker in the image. With the timestamp the camera adds to each image, and electrical mains with a pretty consistent frequency, it may be possible to crudely correct for flicker. Easiest is to use a DC source. (I changed from a Canon 80D to a 90D last year, and I see flicker in the videos, especially when scrubbing. Need to remember to set the exposure time higher.) I've considered pulsing IR LEDs. A strobe is distracting, but something outside the range of visible light might be OK. It's harder to find a sensitive camera with decent resolution that's affordable outside the visible range. I'm reading the encoder in 4x mode: 2,500 LPR x 4 = 10,000 CPR. The encoder has a response frequency of 200kHz, but I don't remember whether that or mechanical integrity limits the maximum rotational speed. Decrease the LPR and speed is limited by mechanical integrity.
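For anyone wondering what 4x decoding means in practice: every edge on either encoder channel is counted, so 2,500 lines become 10,000 counts per revolution. A tiny plain-Python illustration (the real decoder runs in the FPGA fabric, not Python):

```python
# 4x quadrature decoding via a transition lookup table.
# Index is (previous AB state << 2) | current AB state; 0 marks no change or
# an invalid double-step, +/-1 marks a valid edge in either direction.
QUAD_DELTA = [0, +1, -1, 0,
              -1, 0, 0, +1,
              +1, 0, 0, -1,
              0, -1, +1, 0]

def quadrature_4x(samples):
    """Count edges from a sequence of (A, B) samples, 4 counts per encoder line."""
    count = 0
    prev = samples[0][0] << 1 | samples[0][1]
    for a, b in samples[1:]:
        curr = a << 1 | b
        count += QUAD_DELTA[prev << 2 | curr]
        prev = curr
    return count

# One full quadrature cycle (one encoder line) in the forward direction:
print(quadrature_4x([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]))  # 4
```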
Cool, how expensive is that camera?
With the lens... about four times the cost of a Canon 90D. Maybe $4.7k total new with the lens. I was able to borrow it from another project, and I worry about dropping it on the concrete floor. The camera is bandwidth-limited by USB3: reduce the ROI, send less data per frame, and the frame rate can increase a lot. The software API is nice and makes changing models easy.
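The ROI-versus-frame-rate trade-off is just bandwidth arithmetic. A back-of-the-envelope sketch; the 360 MB/s link figure and the 4000x3000 geometry are assumptions for illustration, not this camera's actual specs:

```python
# Frame rate scales inversely with ROI size at a fixed link throughput.
# 360 MB/s is roughly what 12 MP x 8-bit x 30 FPS implies, used here as an
# assumed ceiling; real USB3 throughput varies by host and camera.

def max_fps(width: int, height: int, bytes_per_px: float, link_mb_per_s: float = 360.0) -> float:
    """Approximate achievable frame rate for a given ROI at a fixed throughput."""
    frame_bytes = width * height * bytes_per_px
    return link_mb_per_s * 1e6 / frame_bytes

print(round(max_fps(4000, 3000, 1)))   # ~30 FPS at full 12 MP, 8-bit
print(round(max_fps(2000, 1500, 1)))   # ~120 FPS at a quarter of the area
```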
Wow !
Thank you
Frankly ridiculous goal, but a most excellent job and excellent results. Lots of practical purposes for sure.
This is very much a duck; nothing flashy on top but definitely a lot going on underneath!
In 2022: Try to be more like a duck. I'm going to keep this in mind. A good standard. Thank you
Probably could use PIO on a Pi Pico instead of an FPGA.
👍
Thank you :)
I can see someone getting too comfortable and confusing the spinning part for a stationary one. An accident waiting to happen.
I would pay big bucks for a plug and play product to accomplish this.
Drop me an email about what you need. Maybe I have something that would work. Thanks
Hey man! I saw this video and thought to myself "Surely this can be done cheaper? And without an FPGA?" So I tried to use a Raspberry Pi global shutter camera. That ... didn't end up panning out, so I spent some $$$$ and ended up with this: th-cam.com/video/AW_NqhAaRco/w-d-xo.html
Part Deux: th-cam.com/video/Bqa2grG0XtA/w-d-xo.html