RoboCup Rescue Line - Evacuation Zone - France 2023

  • Published 1 Oct 2024
  • We, team RoboLectro, used a Raspberry Pi, a camera, and an ML model to detect the victims and safe zones.
    For more details, email me at aarravanil@gmail.com

Comments • 20

  • @loky2187 · 3 months ago

    Sir, can you share your source code for this line-follower robot? I am currently working on a similar project, and the source code would be really helpful to me.

  • @pepememo1122 · 6 months ago

    Could you share the CAD files for the robot?

  • @BLADE-yf5mc · 8 months ago

    What camera did you use? Also, did you use TensorFlow or OpenCV to detect the balls? My team is using OpenCV; I'm just wondering if we should switch to TensorFlow.

    • @aarravoltics4592 · 8 months ago

      We used a Raspberry Pi Camera Module V2 (because it is inexpensive and has good resolution).
      We used TensorFlow Lite for the ML model.
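
      A minimal sketch of such a pipeline, assuming a TensorFlow Lite detection model exported as "victim_model.tflite" (hypothetical file name; the input size and output tensor layout depend on how your model is trained and exported) and the Pi camera exposed as a standard V4L2 device:

        # Sketch: TFLite inference on Raspberry Pi camera frames.
        # "victim_model.tflite" is a placeholder name; the output layout
        # assumes a typical SSD-style export (boxes, classes, scores, count).
        import cv2
        import numpy as np
        from tflite_runtime.interpreter import Interpreter

        interpreter = Interpreter(model_path="victim_model.tflite")
        interpreter.allocate_tensors()
        inp = interpreter.get_input_details()[0]
        outs = interpreter.get_output_details()
        in_h, in_w = inp["shape"][1], inp["shape"][2]

        cap = cv2.VideoCapture(0)  # Pi camera as /dev/video0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Resize to the model's input size and add a batch dimension.
            resized = cv2.resize(frame, (in_w, in_h))
            data = np.expand_dims(resized, axis=0).astype(np.uint8)
            interpreter.set_tensor(inp["index"], data)
            interpreter.invoke()
            boxes = interpreter.get_tensor(outs[0]["index"])[0]
            scores = interpreter.get_tensor(outs[2]["index"])[0]
            # Act on detections above a confidence threshold here,
            # e.g. compute a ball's horizontal offset for steering.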

    • @BLADE-yf5mc · 8 months ago

      Also, why did you use two omni wheels? The robot won't have omnidirectional movement.

    • @BLADE-yf5mc · 8 months ago

      My team is considering using the ESP32-CAM and an Arduino Mega. Do you think we'll run into any problems?

    • @aarravoltics4592 · 8 months ago

      @BLADE-yf5mc We used omnidirectional wheels because we wanted the robot's centre of rotation to be at the front. So we put normal wheels at the front and omni wheels at the back: the back wheels still contribute to traction (especially on the ramp) but can roll freely sideways, which keeps the centre of rotation at the front.
      But I would recommend using four identical wheels with good traction in the forward-backward direction that can still slide sideways when turning,
      e.g. a 3D-printed wheel hub with a TPU tyre.

    • @aarravoltics4592 · 8 months ago

      @BLADE-yf5mc You can use the ESP32-CAM, but from our testing we found that its video-processing capability is very limited (the frame rate is very low). So if you plan to implement ML, the ESP32-CAM is not a suitable choice.
      It's best to use a Raspberry Pi with a Pi camera, then connect the Pi to your Arduino over serial.
      Even though we used a Pi, we got only about 2.5 frames per second...
      You can try different models and choose the one with the highest frame rate and accuracy.
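
      A minimal sketch of the Pi-to-Arduino serial link described above, assuming the Arduino enumerates as /dev/ttyACM0 (the port name varies) and that both sides agree on a simple one-line text protocol at 115200 baud; the message format is an illustration, not the team's actual protocol:

        # Sketch: sending detections from the Pi to the Arduino over serial.
        import serial

        link = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

        def send_detection(label, x_offset):
            # e.g. "VICTIM,-12" -> the Arduino parses the label and a
            # steering offset in pixels.
            link.write(f"{label},{x_offset}\n".encode())

        send_detection("VICTIM", -12)
        ack = link.readline().decode().strip()  # optional reply from the Arduino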

  • @BLADE-yf5mc · 6 months ago

    What colour sensors did you use, sir?

    • @aarravoltics4592 · 6 months ago

      QRE1113 IR sensors (SparkFun breakout) for line following
      TCS34725 colour sensor for the green patch

    • @BLADE-yf5mc · 6 months ago

      How did you get the reflective tape working? I'm using the same colour sensors but can't get reliable readings on the reflective tape. What integration time did you use for the colour sensors?

    • @aarravoltics4592 · 6 months ago

      @BLADE-yf5mc We used the IR sensors themselves to detect the silver strip, but it wasn't very accurate. I advise you to use a separate sensor for the silver strip.
      We used a 2.4 ms integration time.
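
      A minimal sketch of reading the TCS34725 at the 2.4 ms integration time mentioned above, assuming the sensor is wired to the Pi's I2C bus and read with the Adafruit CircuitPython driver (the original setup may read it from the Arduino instead, and the green-patch thresholds below are placeholders to tune on the actual field):

        # Sketch: TCS34725 green-patch check with a 2.4 ms integration time.
        import board
        import adafruit_tcs34725

        i2c = board.I2C()
        sensor = adafruit_tcs34725.TCS34725(i2c)
        sensor.integration_time = 2.4  # ms, the shortest setting
        sensor.gain = 4

        r, g, b, clear = sensor.color_raw
        # Placeholder ratio test: "green" if the green channel clearly
        # dominates both red and blue.
        if g > 1.3 * r and g > 1.3 * b:
            print("green patch detected")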

    • @BLADE-yf5mc · 6 months ago

      @aarravoltics4592 So, two colour sensors for the green patch and another one for the silver strip?

    • @aarravoltics4592 · 6 months ago

      @BLADE-yf5mc Yes, that's right. You can mount the sensor at a slight angle so that it detects the silver better.