We, team RoboLectro, used a Raspberry Pi, a camera, and an ML model to detect the victims and safe zones. For more details, email me at aarravanil@gmail.com
Sir, can you share your source code for this line-follower robot? I am currently working on this project, and the source code would be really helpful to me.
Could you share the CAD file for the robot?
What camera did you use? Also, did you use TensorFlow or OpenCV to detect the balls? My team is using OpenCV; I'm just wondering if we should switch to TensorFlow.
We used a Raspberry Pi Camera Module V2 (it was inexpensive and has good resolution).
We used TensorFlow Lite for the ML model.
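In case it helps anyone reading: once a TensorFlow Lite detector gives you a bounding box, one simple way to act on it is to steer toward the box's horizontal offset from the frame centre. This is just an illustrative Python sketch, not the team's actual code; the frame width and dead-zone values are placeholders you'd tune yourself.

```python
def steer_from_box(box, frame_width=640, dead_zone=0.1):
    """Map a detected bounding box to a steering command.

    box: (xmin, ymin, xmax, ymax) in pixels, as a typical TFLite object
    detector returns after scaling to the frame size.
    Returns 'LEFT', 'RIGHT', or 'FORWARD'.
    """
    x_center = (box[0] + box[2]) / 2.0
    # Offset of the box centre from the frame centre, normalised to [-1, 1]
    offset = (x_center - frame_width / 2.0) / (frame_width / 2.0)
    if offset < -dead_zone:
        return 'LEFT'
    if offset > dead_zone:
        return 'RIGHT'
    return 'FORWARD'
```

The dead zone stops the robot from twitching left/right when the target is already roughly centred.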
Also, why did you use two omni wheels? The robot won't have omnidirectional movement.
My team is considering using the ESP32-CAM and an Arduino Mega. Do you think we'll run into any problems?
@@BLADE-yf5mc We used omni wheels because we wanted the robot's centre of rotation to be at the front. So we put normal wheels in the front and omni wheels in the back: the back wheels still contribute to traction (especially on the ramp), but they can slide sideways freely, which keeps the centre of rotation at the front.
But I would recommend using four identical wheels with good traction in the front-back direction that can still slide sideways when turning.
E.g. a 3D-printed wheel hub with a TPU tyre.
@@BLADE-yf5mc You can use the ESP32-CAM, but from our testing we found its video-processing capability is very limited (the video frame rate is very low). So if you plan to implement ML, the ESP32-CAM is not a suitable choice.
It's best to use a Raspberry Pi with a Pi camera; you can then connect the Pi to your Arduino over serial.
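For the Pi-to-Arduino serial link, a simple newline-terminated CSV message is easy to parse on the Arduino side with `Serial.readStringUntil('\n')`. Here's a hypothetical Python sketch of the framing; the `V,x,y` format is purely an assumption for illustration, not the team's actual protocol.

```python
def encode_detection(label, x, y):
    # Build one newline-terminated CSV line, e.g. "V,320,240\n".
    # The single-letter label and field order are illustrative only.
    return f"{label},{int(x)},{int(y)}\n"

def decode_detection(line):
    # Inverse of encode_detection: "V,320,240\n" -> ("V", 320, 240).
    label, x, y = line.strip().split(",")
    return label, int(x), int(y)
```

On the Pi you'd send this with pyserial, e.g. `ser.write(encode_detection('V', cx, cy).encode())` after opening the port with `serial.Serial('/dev/ttyACM0', 115200)` (the port name depends on your setup).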
Even though we used a Pi, we only got about 2.5 frames per second...
You can try different models and choose the one with the best balance of frame rate and accuracy.
What colour sensors did you use, sir?
QRE1113 IR sensors by SparkFun for line following
TCS34725 colour sensor for the green patch
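For reference, a green-patch check on raw TCS34725 readings can be as simple as comparing the normalised channels. The thresholds below are placeholders you'd tune on the actual mat, not the team's values.

```python
def is_green_patch(r, g, b, clear, min_clear=50, green_ratio=0.40):
    """Classify one raw TCS34725 reading as a green patch.

    r, g, b, clear: raw channel counts from the sensor.
    A reading counts as green when there is enough light (clear channel)
    and the green channel dominates the RGB sum. Thresholds are
    illustrative and need tuning on the real course.
    """
    if clear < min_clear:  # too dark to trust the colour
        return False
    total = r + g + b
    if total == 0:
        return False
    return g / total > green_ratio and g > r and g > b
```

Normalising by the RGB sum makes the check less sensitive to overall brightness than raw channel thresholds would be.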
How did you get the reflective tape working? I'm using the same colour sensors but I'm unable to get reliable readings from the reflective tape. What integration time did you use for the colour sensors?
@@BLADE-yf5mc We used the IR sensors themselves to detect the silver strip, but it wasn't very accurate. I'd advise you to use another sensor for the silver strip.
We used a 2.4 ms integration time.
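As a rough illustration of detecting the silver strip with the line-follow IR array: reflective tape bounces back more IR than even the white mat, so one common approach (a sketch of the general idea, not necessarily what this team did) is to flag silver when every sensor reads well above a calibrated white level.

```python
def detect_silver(readings, white_level=800, margin=100):
    """Flag the silver strip from an array of IR reflectance readings.

    readings: analog values scaled so that higher = more reflected IR
    (check your wiring -- some QRE1113 breakouts invert this).
    white_level: typical reading over the plain white mat (calibrate!).
    Silver tape reflects more than white paper, so we require every
    sensor to read noticeably above white. Thresholds are illustrative.
    """
    return all(r > white_level + margin for r in readings)
```

Requiring all sensors to agree cuts down on false positives from specular glints on a single sensor, though as noted above a dedicated sensor is still more reliable.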
@@aarravoltics4592 So two colour sensors for the green patch and another one for the silver strip?
@@BLADE-yf5mc Yes, that's right. You can mount the sensor at a slight angle so that it detects the silver better.