Dobot Magician with Pixy2 Vision Sensor

  • Published 7 Jun 2024
  • Walkthrough of project where I used a Pixy2 camera and an Arduino to make a machine vision based pick and place routine for the Dobot Magician Robotic arm.
    Written article including links to several sources:
    uptimefab.com/2019/10/02/how-t...
    Arduino program:
    github.com/Robin61/Arduino-Pi...
    STL file of Pixy2 holder: www.thingiverse.com/thing:389...

Comments • 102

  • @LegacyMicro 3 years ago +1

    I'm 1/4 way through this video and giving it a thumbs up! Great job! Thanks!

  • @coppelltvrepair 4 years ago +1

    Sincere congratulations, excellent video!

  • @sermadreda399 2 years ago

    Man, such a good video, just what I needed. Thank you for sharing.

    • @uptimefab7412 2 years ago +1

      Thanks, great to see it was useful for you!

  • @khetimachineryindia 4 years ago

    Amazing video... everything well explained... thanks for your help...

    • @khetimachineryindia 4 years ago

      If I use an ordinary arm, how much error will there be in object picking? Did you face this during your project?

  • @user-gd8kt4ef2n 3 years ago

    Wow... Amazing video!

  • @chrisdovale1255 2 years ago

    Good job! Great project!

  • @taoufikbfd1514 2 years ago

    Very good work, thank you.

  • @realjack88 4 years ago +1

    A great project! Thanks for posting and sharing. I noticed the picker and suction cup are slightly off-center before picking up the objects. Could that be improved, or is it due to the intrinsic accuracy of the system?

    • @uptimefab7412 4 years ago +3

      Hi Dan, thanks for your feedback! I also noticed the gripper and vacuum cup were not always perfectly centered above the parts. For taller parts this might be due to the perspective of the camera. The camera finds the top of the part, while the robot arm goes to the bottom of the part. This is because the calibration procedure is done with flat colored dots on paper, while the parts have a certain height. If a part is taller, it is closer to the camera and therefore appears to be further outwards than it actually is, if that makes sense. I am not sure if this has any significant effect since the parts are not very tall, but this should be easy to verify by doing the calibration at the same height as the parts used for the pick and place routine. Sorry for the long explanation... still there...?
      An option to deal with variation is to average out several measurements to eliminate "noise". Sometimes you can see the boxes around a part jump around a bit in the video feed.
      If I end up checking these theories, I'll post the results.
      Let me know if you have any other suggestions.
      Cheers,
      Robin
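
The perspective effect described in this reply can be sketched numerically. A minimal Python illustration (not the project's Arduino code; the camera height, part height, and function name are assumptions): the top of a tall part is closer to the camera, so it projects outward from the camera axis, and scaling by (H - h)/H recovers the position of the part's base.

```python
def correct_for_height(x_app, y_app, cam_height, part_height, cx=0.0, cy=0.0):
    """Pull an apparent (x, y) position back toward the camera axis.

    The top of a part of height `part_height` is closer to a camera
    mounted `cam_height` above the table, so it projects outward by a
    factor cam_height / (cam_height - part_height). Applying the
    inverse factor recovers the position of the part's base.
    (cx, cy) is the point on the table directly below the camera.
    """
    scale = (cam_height - part_height) / cam_height
    return (cx + (x_app - cx) * scale, cy + (y_app - cy) * scale)

# A 20 mm tall part seen by a camera 200 mm above the table, apparently
# 100 mm out from the camera axis: its base is actually at 90 mm.
x, y = correct_for_height(100.0, 0.0, cam_height=200.0, part_height=20.0)
```

Calibrating with dots at the same height as the parts, as suggested in the reply, would bake this correction into the calibration itself.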

    • @realjack88 4 years ago

      Thanks for the response. Also for calibration, could you use the dobot arm button to move the arm freely to the desired positions instead of the jog buttons on your controller?

    • @uptimefab7412 4 years ago +2

      @@realjack88 This should be possible. The Dobot communication protocol documentation describes some commands for reading the position after the handheld teach button was pressed. This would be a nice feature indeed. For now I have to make do with the previously stored positions in the EEPROM, so going to these stored positions (which should be close) already eliminates some of the button pressing needed to get to the various locations. Your suggestion might be a good idea for a future project, some more programming challenges for a rainy day!

  • @101digitalguru7 4 years ago +1

    Good job thank u

  • @chickenz4604 3 years ago

    Very nice project! Thank you for your video! I have a question about the camera: is it possible to use a static normal webcam (one with a USB port) mounted above, for the pick and place application? I also have a Dobot and plan to integrate it with a standard camera, but I'm not sure what the process is. Maybe I will write the image processing algorithm in Matlab. Another option is to buy the vision kit directly from Dobot. Need your help, thank you.

    • @uptimefab7412 3 years ago

      Hi, using a standard USB camera might be possible, but it will be much harder to integrate into your project. The Pixy has built-in logic to give you the coordinates of all signatures found on screen. For a generic USB camera you would have to program all of this yourself, including an algorithm to find the parts in the image. I just received a review unit of the Dobot vision kit, but have not used it yet. From what I have seen, however, it has very powerful software, with shape and text recognition, measurement options, etcetera. It is quite expensive though. The Pixy is only around 60 euros (far less advanced features, of course, but nice to start with). The Pixy can be connected over USB to a Raspberry Pi or maybe even a PC. Hope this helps,
      Robin

  • @borair77 2 years ago

    Great project. Can you share with us how you managed to make a new and longer cable for the pixy cam?

    • @uptimefab7412 2 years ago

      Hi, of course. I believe it is somewhere in one of the videos as well, but I just got some flat cable and connectors with the same number of pins. Then just crimp the connector onto the flat cable with some flat pliers. That's it! Make sure to check your connections with a multimeter to see if all pins make contact.

  • @jach1969 1 year ago

    Good job! Please, which pressure model do you use in the end effector?

    • @uptimefab7412 1 year ago

      Hi, thank you! Could you rephrase your question? I'm not sure I understand it. Do you mean what vacuum level? If so, I don't know, since there is no gauge on the pump.

  • @urieltorreslopez5657 2 years ago

    The best of all the PIXY2 projects.

  • @santiagorinconmartinez6324 8 months ago

    Hi, thanks for sharing. What are the commands only to move the steppers?

    • @uptimefab7412 8 months ago

      Hi, I am not sure if there are any. From what I recall from the reference document by Dobot, there are only move commands for the robot as a whole, but again, not sure.

  • @101digitalguru7 4 years ago +1

    I love you

  • @KKMaity 3 years ago

    Can it recognize the angle of an object with respect to other objects, or a straight line, or the angle between two straight lines?

    • @uptimefab7412 3 years ago

      It can detect lines including the angle of the line in line tracking mode, but I am not sure about angle between lines or lines and objects. Based on what I could find I think it is not possible.

  • @edwarddurden2626 1 year ago

    Awesome work 👍!! By any chance would this work with dobot mg400?

    • @uptimefab7412 1 year ago +1

      Good question; I am not sure if there are any Arduino libraries for the MG400. Sorry for the late response, I missed this question yesterday. Poor internet at the moment at my vacation address, so I can't do much research.

  • @user-py8gk1ct9f 2 months ago

    Hello, thank you for your tutorial. May I ask why the LCD screen doesn't light up when I upload the program?

    • @uptimefab7412 2 months ago

      This could have many possible causes, but it might be good to start checking if the pin out of the program is the same as the physical connections. Did you try running the display with a simple example program? This could eliminate hardware or wiring issues as a cause.

    • @user-py8gk1ct9f 2 months ago

      @@uptimefab7412 Thank you for your reply. I have solved this problem now. Can I ask you about visual calibration again? On my side it now displays: "The calibration mode fails to start".

  • @mrwonk 3 years ago

    Did you ever work out the feed-back issue to be able to determine position?

    • @uptimefab7412 3 years ago

      No, unfortunately I still don't have a solution for this. My programming skills are not at the level where I could write my own libraries. I think it is very unfortunate that this is not just included in the standard Arduino library as supplied by Dobot. Working with position feedback is incredibly useful. (If anyone reading this has a solution, please post it here).
      Robin

    • @mrwonk 3 years ago

      @@uptimefab7412 It is my understanding, the only way to get feedback is to either use a servo that includes a feedback wire for returning the PWM position, or having your own hardware attached to detect the position and feed it back (like a servo without a motor). I was hoping you had a different trick that I hadn't thought of.

    • @uptimefab7412 3 years ago +1

      Hi David, adding additional encoders is probably possible, but indeed a method using the existing built-in encoders would be a much better option. Sorry I can't help you further at this time. I just hope they update the Arduino libraries to easily access all functions.

  • @Ib.m86 2 years ago

    Where did you get the vacuum gripper?

    • @uptimefab7412 2 years ago

      Hi, it was part of the set that I purchased, so it is the standard vacuum cup for the Magician.

  • @Looki2000 1 year ago

    You could store the position of each bottle cap and then move them all at once.

    • @uptimefab7412 1 year ago

      Hi, you are absolutely right. For this example that would be faster and eliminate some of the issues I had with the delays. I did intentionally have the camera reassess the position of the samples, so you can add new ones while the robot is running. This does make it a bit slower though.

    • @Looki2000 1 year ago

      @@uptimefab7412 If you want to add more objects while the robot is running, you could take another photo after moving all the objects registered in the first photo, to check whether more objects have been added.
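
The batch-by-batch approach suggested here can be sketched as a simple loop (a hypothetical Python illustration, not the project's Arduino code; `capture` and `move_object` are stand-ins for the camera snapshot and the robot's pick-and-place routine):

```python
def run_batches(capture, move_object):
    """Photograph once, move every object registered in that photo,
    then photograph again to catch objects added while the robot was
    working. Stops when a photo comes back empty."""
    moved = 0
    while True:
        objects = capture()   # one "photo": a list of object positions
        if not objects:
            return moved
        for pos in objects:
            move_object(pos)  # pick and place this object
            moved += 1
```

This is faster than re-scanning before every single pick, while still allowing new objects to be added between batches.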

  • @faridafandibinnorzaiskanda1066 11 months ago

    Can I change the program to control it using the shield, without the Pixy2?

    • @uptimefab7412 11 months ago

      Yes, this should be perfectly possible. Just remove any library or code related to the pixy.

  • @agrimechatronic4114 1 year ago

    Hello. First of all, thanks a lot for these great tutorials. I need help: I am doing object detection and have one question. What if two objects are in the same window, i.e. two blue things? Which one will the motors go to? Can we add a filter for them, for example: go to the nearest one, wait x seconds, then go to the second one? I ask because I will burn the detected objects, so I need some kind of filter. Can you help me? Thanks a lot.

    • @uptimefab7412 1 year ago

      Hi, thanks for your interest in this project. The Pixy will place all objects it finds into an array and assign a number to each. The number is maintained as long as the part is in the image. For me it did not matter which number was picked, so I just took the first of the found signatures and picked it up. If you want to pick the nearest signature, I would go through the array and, for each signature, compute the distance between your reference location (for example the center of the frame) and that signature by trigonometry, take the absolute value, and keep the smallest one. Can't provide the exact code, but that would be the approach. For ignoring older objects you could maybe create another array which indicates status. Sorry I can't provide full code for this, but I hope this helps at least a bit.
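
The nearest-signature approach described in this reply can be sketched as follows (a Python illustration, not actual Pixy2 Arduino code; the frame size reflects the Pixy2's 316x208 block-detection resolution, and the function and variable names are assumptions):

```python
import math

def nearest_block(blocks, ref=(158, 104)):
    """Return the index of the detected block closest to a reference
    point, or None if no blocks were detected.

    `blocks` is a list of (x, y) block centers as reported by the
    camera; `ref` defaults to roughly the center of a 316x208 frame.
    """
    if not blocks:
        return None
    return min(range(len(blocks)),
               key=lambda i: math.hypot(blocks[i][0] - ref[0],
                                        blocks[i][1] - ref[1]))

# Of three detected blocks, the one at index 1 sits nearest the center.
blocks = [(50, 60), (160, 100), (300, 200)]
```

On an Arduino the same idea would be a plain loop over the Pixy block array, tracking the smallest distance seen so far.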

    • @agrimechatronic4114 1 year ago

      @@uptimefab7412 Thank you. It doesn't matter to me either which one is the nearest; it's OK if it does the burning one by one. I just don't want to crash my program. Can you help me?

    • @uptimefab7412 1 year ago

      Hi, the issue you are describing is mostly related to general programming of arrays and math. I am sorry, but I can't help with generic programming. If you have any specific questions regarding the Pixy code and why I used it, please let me know.

  • @kongson14 1 year ago

    Hi sir, do you think I can control the Dobot via an Arduino Uno? If so, what's the difference between the Mega 2560 and the Uno?

    • @uptimefab7412 1 year ago

      Hi, the difference between the Mega and the Uno is the number of IO connections. For this project with a Pixy the Mega is needed because both the Pixy and the Dobot need a serial connection. The Mega has several hardware serial ports, the Uno only one.

    • @kongson14 1 year ago

      @@uptimefab7412 Thank you for answering. So can I use the demo code and just change the serial to 1, and it should all work? My ultimate goal is just to use buttons to control the Magician.

    • @uptimefab7412 1 year ago

      Sorry, maybe I did not explain properly. You cannot use an Uno because you need 2 serial connections for this project. The Uno only has 1. This is the main reason I use the Mega.

  • @BeeRich33 3 years ago

    Any idea how small items can get before resolution becomes a problem? I might need to look for 2mm circles.

    • @uptimefab7412 3 years ago

      I did some testing on smaller color signatures. For 2 mm diameter parts you would need to have the camera quite close (5-10 cm) for it to appear large enough to be identified. It might be possible with enough light and the camera close to the parts, but failure is also an option.

    • @BeeRich33 3 years ago

      @@uptimefab7412 I was reading up earlier, and I thought a controlled environment with a black background and sufficient light that doesn't change. I wasn't sure if this was tested or not. All this is new to me. As for proximity, I'll have 90mm clearance.

    • @uptimefab7412 3 years ago

      @@BeeRich33 My guess is that it will be difficult at that distance for 2 mm parts. Indeed this depends on your particular lighting condition. It might be worth a try, especially if you can get closer with the camera, but that is up to you of course, if you are willing to take the chance.

    • @BeeRich33 3 years ago

      @@uptimefab7412 Lighting and background are indeed controllable. The camera can be 2 cm away if need be. If the camera cannot focus, getting closer to make the image larger is irrelevant. If the lighting is proper, would you guestimate how close this camera could get? Cheers

    • @uptimefab7412 3 years ago

      Sorry for the late response. I have had the camera pretty close (~5 cm) to objects and they were still relatively sharp, even with fixed focus; the small sensor helps here. 2 cm might be too close, but around 5 cm should work. Given the low cost of the camera it might be worth a try.

  • @Eckmuhl29 1 year ago

    What a great job!
    Maybe you can help me here, guys. I'm trying to use a Pixy2 cam with an FMU over its I2C port. In PixyMon the camera works, but the FMU gets no response from the camera. I checked the driver, the wires, and the port, and all of that seems OK. I set up the address and the baud rate with what I found in the driver code (the default address and 400k) and uploaded the firmware again. I have no more ideas and can't find anything on forums 😅

    • @uptimefab7412 1 year ago +1

      Thanks! I am not familiar with what an FMU is. The only thing I have connected the Pixy to is an Arduino, which works fine. Sorry I probably can't be of much help here.

    • @Eckmuhl29 1 year ago

      @@uptimefab7412 No problem thank you

  • @devarsh2932 1 year ago

    Hi sir, I need help with this project. After setting the coordinates it works, but when we turn off the power and turn it on again, the Dobot stops working: the data inside the Mega is erased, and we need to provide the coordinates again. Please give us a solution to this issue.

    • @uptimefab7412 1 year ago

      Hi, it would be best to isolate the issue here. Try to make a simple program that only writes a variable to the EEPROM in the loop section (not continuously, or you will wear out the EEPROM), and also place commands in the setup section to read it from memory and print it to the serial monitor. Change the variable to any random number, run the program, reset the Arduino, and see if it prints the newly set value. If not, the issue is with the Arduino. Commands for storing and reading variables to and from the EEPROM can be found in the program I posted on GitHub or from other sources online. It should work, since this is a basic function of any Arduino; if not, try a different Arduino. Hope this helps.

  • @shubhamchavan7875 1 year ago

    I think there are some mistakes in the code. Even after pressing the up or down key on the LCD keypad shield, no changes happen to the Dobot's position; instead, only the value of the axis increases or decreases. I'm an undergrad student trying to build this project, so it would be great if you could help me out.

    • @uptimefab7412 1 year ago

      Hi, it seems there is a communication problem between the Arduino and the Dobot. There is no feedback loop, so the Arduino does not know whether the arm actually moved. First make sure the baud rate in the Arduino code is the same as the setting for the robot. Also, for troubleshooting this issue I would go back to the absolute basics. Download the demo from dobot.cc under the download section for the Dobot Magician. It contains the libraries and a sample Arduino program called "dobot_test_code.ino", which should make some simple moves. Does that program work for you? If not, first try to get it running with this sample program. As mentioned, check the serial speed, but also cables, connections, etcetera. I hope this helps.

    • @shubhamchavan7875 1 year ago

      Hi @UptimeFab, thanks for your reply, it really helped me a lot to understand the code better.
      The Dobot arm is moving properly now, but I'm facing issues setting up the auto pick and place function, and I'm really confused about all the other settings. So it would mean a lot if you could provide a more detailed version of the setup process.

    • @uptimefab7412 1 year ago

      Good to see you were able to get the arm moving. The most important thing in the setup process is calibration of the camera. Make sure to print the calibration sheet (PDF in the article) and set the green dots to signature #7 in the PixyMon software. This needs to be done because the Arduino program will search for signature #7 in the calibration procedure. Before running the calibration you need to go to each calibration dot and save its coordinates in the Arduino menu. Calibration point A is the lower left, point B is the lower right, and point C is the top left. When you have done this you can run the calibration cycle. When that completes successfully (finds the 3 dots) the system is calibrated and ready for use. I want to avoid writing a very long answer that risks not addressing your questions, so if any more specific items are unclear, please let me know and I will try to answer them. Hope this helps, Robin
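
The three-dot calibration described here effectively builds a mapping from camera pixels to robot coordinates. A minimal Python sketch under the simplifying assumption that the camera and robot axes are aligned (the actual Arduino code may solve this differently; all names and numbers here are hypothetical):

```python
def make_pixel_to_robot(cam_a, cam_b, cam_c, rob_a, rob_b, rob_c):
    """Build a pixel -> robot transform from three calibration dots:
    A = lower left, B = lower right (fixes the x scale),
    C = top left (fixes the y scale). Assumes the camera and robot
    axes are aligned, so x and y can be scaled independently."""
    sx = (rob_b[0] - rob_a[0]) / (cam_b[0] - cam_a[0])
    sy = (rob_c[1] - rob_a[1]) / (cam_c[1] - cam_a[1])

    def to_robot(px, py):
        return (rob_a[0] + (px - cam_a[0]) * sx,
                rob_a[1] + (py - cam_a[1]) * sy)
    return to_robot

# Dots seen at these pixel positions, taught at these robot coordinates:
to_robot = make_pixel_to_robot((20, 180), (300, 180), (20, 20),
                               (100, 50), (240, 50), (100, 190))
```

Note that pixel y typically increases downward while the robot's y increases upward; the sign of `sy` absorbs that flip automatically.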

  • @user-ue8le8js3q 11 months ago

    How did you home the arm?

    • @uptimefab7412 11 months ago +1

      There is a button on the back of the base. If you long-press the button it will start the homing sequence.

    • @user-ue8le8js3q 11 months ago

      @@uptimefab7412 Oh my goodness, I've been banging my head against the wall trying to figure this out. Thank you so much. I've been trying to hard-code a homing function using get/setHomeCmd as in the API description and communication protocol. Guess I can avoid that mess now. Thanks again 👍

    • @uptimefab7412 11 months ago

      Yeah, from what I recall there is not a simple homing instruction for Arduino code, so I just resorted to using the button. Not very sophisticated but it works:)

  • @rohitmangale4256 1 year ago

    Hello Robin, I am trying the above calibration settings and after completing it when I try to start the pick and place it shows me xnan and ynan, and the dobot doesn't move. Can you please help me figure out why that happens? Thanks!

    • @uptimefab7412 1 year ago

      Hi, because the arm does not move in auto mode, I suspect no signature is found or the calibration points are not set. You could try a couple of checks. Move to each calibration point in the calibration menu by pressing the goto... button for each point. If it moves to the correct point (A lower left, B lower right, C upper left) then that is OK. Also, during the auto cycle check the serial monitor. It provides a lot of information on whether a signature was found, where it is, what the robot is going to do, etc. Does it show a signature found when it is in the camera position in the auto cycle? If not, the signatures need to be retrained. If this doesn't help, please let me know exactly what the last message is that you get from the serial monitor.

    • @rohitmangale4256 1 year ago

      @@uptimefab7412 I tried what you said, but my calibration failed (drive.google.com/drive/folders/14tnDAGaEJLk0Gs43DRI-0rrpuGdww0WY?usp=sharing). As you suggested, I have attached screenshots of the serial monitor. Can you please guide me through the next steps? Thanks!

    • @uptimefab7412 1 year ago

      Hi, from the serial monitor output you sent, it seems like one or more calibration dots cannot be found. Did you teach the green calibration dots in PixyMon as signature #7? After doing this, please check in the PixyMon window whether you can see all the green calibration dots and whether they are identified as signature #7. If they are found by the PixyMon software, the calibration routine with the Arduino should also work.

    • @rohitmangale4256 1 year ago

      @@uptimefab7412 Hey, do we select start_calib first, or after setting the calibration points? Is there a way we can call you?

    • @uptimefab7412 1 year ago

      Hi, first move the arm to each one of the calibration dots and store each location. Only after this is done you should run the calibration routine. Sorry I don't provide personal support by phone. This would take too much time and does not benefit other people. Here everyone can read the discussion. For me this is just a hobby to create something and share it with other makers, so I hope that makes sense. Let me know if you run into any other specific problems.

  • @CallousCoder 3 years ago

    Very cool! And I think this dude is Dutch 😉
    But colour detection is trivial to do in OpenCV, and then you don't need a Pixy. I thought it actually also did shape recognition; now I learned it doesn't :) So I won't purchase one and will stick with OpenCV and dlib, I guess.

    • @uptimefab7412 3 years ago +1

      Hi, OpenCV looks like a nice solution indeed for more advanced stuff. This was my first DIY vision project, and the Pixy is great for that: easy to get something working with an Arduino, and just fun to play around with. Shape recognition would have been nice though. Might have a look at OpenCV in the future if time allows, thanks for the tip!
      Robin

    • @CallousCoder 3 years ago

      @@uptimefab7412 Time is always the biggest limiting factor, isn't it? :) This past week I put two videos online with face detection, and one has face recognition; I walk through the code after the demo. The videos are called "AI has Better taste than wonen" and "Wanna see My Horny Box?" - yes, the crudeness is part of my character :)
      I write it in C++, but if you are comfortable in Python, you can use OpenCV very easily from Python. Most examples online are for the Python bindings. Python was not an option for the Pis I ended up using.

    • @uptimefab7412 3 years ago +1

      Hi, watched part of it; those AI videos look nice. It seems you are quite a bit further along with coding than I am. I just do this for the hobby. Are you sure the name of your channel is a good idea? When I type in "callous" I get very nasty pictures of feet. This might scare people off when they search for your channel later.

    • @CallousCoder 3 years ago

      @@uptimefab7412 "Callous" also means "hardened and coarse". But Maurice, the one on the little circuit board, already said that too :)

    • @CallousCoder 3 years ago

      @@uptimefab7412 Programming is something you learn by doing it a lot, and by making even more mistakes. I was really impressed by your video! That calibration is not trivial!

  • @mubashir-alam 3 years ago

    Sir, what is the price of the Dobot Magician?

    • @uptimefab7412 3 years ago +1

      Hi, around 1400 euros, price may vary between different countries.

  • @joaopaulosoares8512 7 months ago

    How do you make it apply suction at the right moment? Is it air?

    • @uptimefab7412 7 months ago

      Hi, I am using delays. Off the top of my head, around 1 second, to ensure there is vacuum. Also a similar delay when releasing the part.

  • @ikeyang 3 years ago

    The music is too loud

    • @uptimefab7412 3 years ago

      Thanks for the heads up, I will make sure to check audio levels for the next upload.

  • @uniquerobotics2019 1 year ago

    Hi Robin,
    What a wonderful integration, this is so good. I have a Pixy2 and a Dobot Magician. I tried to integrate them the same way you did, but I get a lot of errors and other issues. It would be really helpful if you could share your email address and help me troubleshoot. Thanks a lot in advance.

    • @uptimefab7412 1 year ago +1

      Hi, thanks for your interest in this project. Please describe the specific issues in this comment section or on the website, so everyone can benefit from the discussion. There are a lot of discussions on issues already available under the website articles, both for the Arduino project and the later one with the touchscreen; please check those as well to see if you experience anything similar.

    • @uniquerobotics2019 1 year ago

      @@uptimefab7412 Great. I have checked the website and went through the discussions in the comment section. Interestingly, I saw similar issues faced by others as well, and have troubleshot most of them. The main issue I am facing now is that the Dobot Magician is not activating the vacuum pump to pick up the block during pick and place. Even though there is a slight offset in the center point, it picks and places very well, except for the suction part. I tried increasing the delay and decreasing the velocity and acceleration of the arm as well; it only works sometimes. The second issue is the center point of the object. I have calibrated the A_xy, B_xy and C_xy coordinates (using the same object) correctly, but it still doesn't reach the center of the object (the object I am using here is one of the colored cubes that come with the Dobot conveyor belt).

    • @uptimefab7412 1 year ago

      Have you tried disconnecting the USB cable from the Dobot when you are running the Arduino program? Probably best to restart the Dobot and Arduino after disconnecting. Another thing to experiment with is changing the baud rate for communication. However, the baud rate that is in the sketch is the one I used and should work. What you could also try is to make a very simple sketch with a move command, followed by a vacuum command to open, another move and a close vacuum command. Put some delays in between and see if this simple sketch also has issues. That is all I can think of at the moment.

    • @uniquerobotics2019 1 year ago

      @@uptimefab7412 Yup, the cable was always disconnected from the Dobot; I only use it when homing is required, and then restart everything. The baud rate is 115200, I didn't change anything. I tried the manual vacuum command and it works fine; the issue appears only when running the pick and place program 🤔. Anyway, I will keep trying to troubleshoot, and if I get any positive news I will definitely report it here :). What about the positioning of the Dobot to reach the center of the object? How can I make it more accurate?

    • @uptimefab7412 1 year ago

      Hi, I hope you can get it working; I am not sure why it would not function with only the automated routine. I have not looked into improving positional accuracy. Sometimes it is indeed a bit off center, but for me this was just a proof of concept, and for that purpose I deemed it OK. It might be improved with a different calibration routine and some more effort though. I think some of the error comes from not compensating for lens distortion.