
Giang Nguyen
Joined 12 May 2011
Everything about Software engineering, Machine learning and Robotics
ROS2 Autonomous Car with Machine Learning: P1 URDF modelling and Gazebo Simulation
This is part 1 of 4 of the ROS2 Autonomous Car with Machine Learning series.
In this part, I use ROS2 to model a 4-wheel differential drive robot in URDF, then simulate it in a custom race-track Gazebo world. I use teleop_twist_joy to control the robot in Gazebo.
GitHub repository: github.com/andynguyencoding/skid_steer_robot
Documentation: docs.google.com/document/d/1016Zxn6dG75GtNGNyriwe28Fx69vgWufZ9b3zFASQIQ/edit?usp=sharing
References:
Previous project I made with Raspberry Pi and Machine Learning: th-cam.com/video/uuslVHPjWS8/w-d-xo.html
Articulated inertial macro file: github.com/joshnewans/articubot_one/blob/d5aa5e9bc9039073c0d8fd7fe426e170be79c087/description/inertial_macros.xacro
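The drive setup described above (Twist commands from teleop_twist_joy driving four wheels as two sides) comes down to one small kinematic mapping. The sketch below is illustrative only, not code from the skid_steer_robot repo; the function name and the wheel_separation/wheel_radius values are assumptions:

```python
# Skid-steer (4-wheel differential drive) kinematics sketch.
# Maps a ROS-style Twist command (v, omega) to left/right wheel
# angular velocities -- the same mapping diff-drive controllers use.
# Illustrative values; real ones come from the robot's URDF.

def twist_to_wheel_speeds(v, omega, wheel_separation, wheel_radius):
    """Return (left, right) wheel angular velocities in rad/s.

    v:     forward velocity in m/s (Twist linear.x)
    omega: yaw rate in rad/s      (Twist angular.z)
    """
    v_left = v - omega * wheel_separation / 2.0
    v_right = v + omega * wheel_separation / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

# Pure rotation: wheels spin in opposite directions at equal speed.
left, right = twist_to_wheel_speeds(0.0, 1.0, wheel_separation=0.3, wheel_radius=0.05)
```

On a skid-steer base the same speed is applied to both wheels on each side, which is why four wheels can be driven as a two-wheel differential pair.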
Views: 2,519
Videos
Laser welding machine demo
28 views · several months ago
In this project, I built an automatic mechanism that works with a laser welding machine to produce small sink baskets. #servo #PLC #Ladder
Demo 3 - Advanced 2
36 views · 2 months ago
In this demo, we show the switch from lidar data to a camera sensor for determining sensor coverage.
Demo 2 Planning 2, 3 and Advanced 1
30 views · 2 months ago
- Planning 2: when a new artifact is found, the robot's behaviour switches to “InspectArtifact” (see the planning 3 demo). You can see the robot approach the artifact, stop at a distance, and face it.
- Planning 3: the robot's behaviour is monitored in the bottom-right terminator window.
- Advanced 1: all artifacts are marked in RViz while the robot e...
Demo 1 - Frontier exploration + one artifact type detection
36 views · 2 months ago
In this demonstration, the robot explores the cave using the frontier exploration algorithm. It also targets one specific artifact type, in this case the Mushroom.
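Frontier exploration, as used above, boils down to finding free cells that border unknown space and sending the robot toward them. A minimal detection sketch, assuming a ROS-style occupancy grid (-1 = unknown, 0 = free, 100 = occupied); the function name and example grid are illustrative, not taken from the demo code:

```python
# Frontier detection on an occupancy grid: a frontier cell is a
# free cell with at least one unknown 4-neighbour. The exploration
# loop then navigates to the nearest (or largest) frontier cluster.

def find_frontiers(grid):
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:          # only free cells qualify
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1:
                    frontiers.append((r, c))
                    break                # one unknown neighbour is enough
    return frontiers

grid = [
    [0,   0, -1],
    [0, 100, -1],
    [0,   0,  0],
]
frontiers = find_frontiers(grid)
```

Exploration terminates naturally when this list comes back empty: every reachable free cell then borders only known space.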
Space Robotics Assignment 2
32 views · 4 months ago
Path planning using Dijkstra's and the A* algorithm; showing connectivity in a PRM map.
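For context, the A* planner mentioned above can be sketched on a small grid; Dijkstra's algorithm is the same search with the heuristic set to zero. This is an illustrative sketch, not the assignment code:

```python
# A* on a 4-connected grid of 0 = free, 1 = blocked.
# Returns the length (in steps) of the shortest path, or None.
import heapq

def astar(grid, start, goal):
    def h(cell):  # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start)]    # (f = g + h, g, cell)
    best = {start: 0}                    # cheapest known cost to each cell
    while open_set:
        f, g, cur = heapq.heappop(open_set)
        if cur == goal:
            return g
        if g > best.get(cur, float("inf")):
            continue                     # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < best.get(nxt, float("inf"))):
                best[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt))
    return None

grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
steps = astar(grid, (0, 0), (2, 0))  # forced around the wall in row 1
```

In a PRM the same search runs over sampled roadmap vertices and edges instead of grid cells, with straight-line distance as the heuristic.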
Speech Command Recognition with Deep Learning on Raspberry Pi using TensorFlow
90 views · 8 months ago
Speech command recognition using deep learning on a Raspberry Pi. The model is trained on the HuggingFace speech commands dataset. A quantized model is created with the TensorFlow Lite library. Librosa is used to extract spectrogram audio features. #raspberrypi #machinelearning #deeplearning #tensorflow #Tensorflow-lite #audioprocessing #speechrecognition #asr #voicerecognition
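The feature-extraction step (framing, windowing, DFT magnitudes) can be sketched without librosa. This pure-stdlib sketch mirrors what a spectrogram computation does at a high level; it is not the video's actual librosa pipeline, and the frame/hop sizes are assumptions:

```python
# Spectrogram-style features: slide a window over the signal, apply a
# Hann window, and keep DFT bin magnitudes. Libraries like librosa do
# this with an optimized FFT; the naive DFT here is for illustration.
import cmath
import math

def spectrogram(signal, frame_len=64, hop=32):
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        # Hann window reduces spectral leakage at the frame edges
        windowed = [x * 0.5 * (1 - math.cos(2 * math.pi * n / (frame_len - 1)))
                    for n, x in enumerate(frame)]
        # magnitudes of the non-redundant DFT bins (0 .. frame_len//2)
        bins = []
        for k in range(frame_len // 2 + 1):
            s = sum(x * cmath.exp(-2j * math.pi * k * n / frame_len)
                    for n, x in enumerate(windowed))
            bins.append(abs(s))
        frames.append(bins)
    return frames  # shape: (num_frames, frame_len // 2 + 1)

# Test tone: exactly 8 cycles per 64-sample frame -> energy in bin 8.
sig = [math.sin(2 * math.pi * 8 * t / 64) for t in range(256)]
spec = spectrogram(sig)
```

The resulting 2-D time-frequency array is what gets fed to the classifier, typically after a mel-scale reduction and log compression.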
Self-driving car with Lidar, RaspberryPi and Machine Learning
6K views · 10 months ago
Self-driving car with Lidar, Raspberry Pi and Machine Learning. In this video I walk through the process of making a self-driving car with machine learning and real-time lidar data on a Raspberry Pi. Documentation: docs.google.com/document/d/1gwkeCLW_RRi6tINub1Y3P05DJqnUuMJ5te2nC33V53c/edit?usp=sharing Source code: github.com/andynguyencoding/lidar_robotcar References: Murtaza's video on joystick contr...
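The steering decision that the trained model learns can be approximated by a simple rule-based baseline: steer toward the most open lidar direction. This is a hypothetical stand-in for illustration, not the video's ML model; the beam count, field of view, and sign convention are assumptions:

```python
# "Follow the gap" baseline for lidar steering: pick the beam with the
# largest range reading and convert its index to a steering angle.
# A learned model replaces this rule with a mapping trained on
# recorded (scan, steering) pairs.

def steer_from_scan(ranges, fov_deg=180.0):
    """ranges: lidar distances in metres, evenly spaced over fov_deg,
    index 0 = leftmost beam. Returns a steering angle in degrees
    (negative = left, positive = right)."""
    widest = max(range(len(ranges)), key=lambda i: ranges[i])
    step = fov_deg / (len(ranges) - 1)   # angular spacing between beams
    return -fov_deg / 2 + widest * step

# Obstacle on the left, open space to the right: steer right.
angle = steer_from_scan([0.4, 0.5, 2.0, 3.5, 3.0])
```

A baseline like this is also handy for generating labelled training data or sanity-checking a learned policy on the same scans.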
3 Steps to Create a Vietnamese Chatbot like ChatGPT Locally with Ollama, HuggingFace
1.9K views · 11 months ago
3 steps to create a Vietnamese chatbot like ChatGPT on a personal computer using Ollama, HuggingFace and Docker. Documentation: docs.google.com/document/d/1bT74s_f0-RtDr74mOusvv3-kWxM2ycrgpEP6Vn-hmsw/edit?usp=sharing HuggingFace vinallama: huggingface.co/vilm/vinallama-7b-chat-GGUF #chatgpt #huggingface #vietnam #ollama #vietnamese #llm #chatbot
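For readers following along in English: the custom-model step typically uses an Ollama Modelfile that points at the GGUF file downloaded from HuggingFace. A sketch under stated assumptions (the filename, temperature, and system prompt below are illustrative, not the exact file from the video):

```
# Modelfile -- illustrative sketch; the GGUF filename and values
# are assumptions, adjust to the file you actually downloaded
FROM ./vinallama-7b-chat_q5_0.gguf
PARAMETER temperature 0.7
SYSTEM "Bạn là một trợ lý AI hữu ích, trả lời bằng tiếng Việt."
```

Saved next to the downloaded GGUF file, it is registered and run with `ollama create vinallama -f Modelfile` followed by `ollama run vinallama`.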
DIY Laser CNC Mach3
375 views · 1 year ago
DIY laser CNC with Mach3 control. Inkscape is used to generate the G-code; the axes are driven by stepper motors.
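The Inkscape-to-G-code step produces output along these lines. Below is a tiny generator sketch for a single polyline; the G0/G1 motion commands are standard, but the M3/M5 laser on/off codes and the S255 power value are assumptions that depend on the controller setup:

```python
# Generate G-code to trace a polyline with a laser head:
# rapid-move to the start with the laser off, switch it on,
# then feed-rate moves through the remaining points.

def gcode_for_polyline(points, feed=600):
    lines = ["G21", "G90"]                   # millimetres, absolute positioning
    x0, y0 = points[0]
    lines.append(f"G0 X{x0:.2f} Y{y0:.2f}")  # rapid move, laser off
    lines.append("M3 S255")                  # laser on (assumed M-code/power)
    for x, y in points[1:]:
        lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")  # cutting move
    lines.append("M5")                       # laser off
    return "\n".join(lines)

program = gcode_for_polyline([(0, 0), (10, 0), (10, 10)])
```

Inkscape's G-code plugins do essentially this after flattening each SVG path into short line segments.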
Raspberrypi based robot car with LEDs control at front and back
25 views · 5 years ago
This is a demonstration of a robot car powered by a Raspberry Pi, the coolest computer I have ever used!
Billiards at StarHub Singapore
58 views · 12 years ago
Solo match, Giang vs Cong Anh: Giang concedes a foul under rule 8 with the ball at the pocket mouth, Cong Anh plays a delicate safety to hide the ball, and this is the deciding three-cushion shot.
Hi, I really like the way you designed your autonomous car. I recently bought a Pi 5 and I'm trying to make a similar robot, but I'm running into some issues regarding the power needed to keep the Pi 5 working safely. Do you think it's best to run the motor driver directly from the Raspberry Pi, or is it better to have a low-level controller handle motor control (an Arduino connected to the Pi via I2C)? How did you get away with using 18650 batteries without a BMS circuit? My first battery-powered robot didn't have one and its batteries died from over-discharge. What are the specs of your power bank? (How many watts can it deliver? Does it accept the PD protocol? How many mAh? Does it have quick charge?) I tried running a Pi Zero 2 W from a 1500 mAh power bank but it struggles quite a lot, and in online forums I saw that fast-charging power banks can be really dangerous for the Pi.
@@JorgeJesúsVergelMayorga Hi, your question is a very good one; power is always one of the key constraints when designing any robotics or mechatronics system. I am no expert in this field, but from my experience: 1) always read the electronics specs for the power requirements of your equipment, and 2) it is always good practice to separate the power circuit of the controlling device, such as the Raspberry Pi, from that of the actuators, such as the motors. In short, I chose to power my Raspberry Pi with a power bank that meets the RPi spec of 5 V / 3 A. I chose a 10,000 mAh power bank over a LiPo battery for longer run time and a safer discharge rate. For the motors, 18650 or LiPo cells are fine as long as they are on an independent circuit, used only for the actuators. Hope it helps. Cheers.
This is great!
@@aurora2621 Thank you 😀
amazing!
Glad you like it!
Hello Giang, could I get your email to contact you, or your Zalo would work too?
Hi Justin2Nguyen, thanks for watching the video. My email: hoang.g.nguyen71424@gmail.com
Hello, I've just released a new video in which I rebuild the robot with ROS2 Humble. Check out the video here: th-cam.com/video/xXajON74BDw/w-d-xo.html
Is that ROS?
@@hongsethya4932 Hi, it's not ROS, but yes, great idea. I'm also thinking of making an updated version with ROS and sensor fusion of camera and lidar. I will share it too when it's ready.
Your video is amazing; thank you so much for sharing your knowledge. Could you share links to where we can buy these components, from the car frame to the lidar? I am eager to embark on a similar project to gain knowledge and understanding.
Hi, you can check out the component list section in the documentation: docs.google.com/document/d/1gwkeCLW_RRi6tINub1Y3P05DJqnUuMJ5te2nC33V53c/edit?usp=sharing There are links for the main components, such as the Raspberry Pi and the lidar, for your reference. By the way, it is better to source the components from local dealers for a better price.
Diode lasers have a very short life. Don't waste it on simple demos. Congratulations.
How can I download from that HuggingFace link with the model files? I clicked the link but don't know how to download them.
Very useful. Could you share more about customizing a model on Ollama?
I want to train the model so it answers with what I teach it; how do I do that? Do you have any video tutorials on training a model?
Yes, this is entirely possible. In practice, many organizations choose this approach because training a new model from scratch is expensive. The short answer is "LLM fine-tuning". I am working on a similar project, and when the time comes I will probably make a tutorial video.
@@andyhgnguyen Hoping the video comes soon.
Please provide that PDF so we can get ideas from it, and also provide the code in the video description. Your videos are amazing. I am planning to do a similar project but can't find the right guide, so please share your PDF once again.
Great to see someone with the same passion. I put links to the source code and documentation in the video description. Feel free to explore and change them to suit your needs. Cheers.
@@andyhgnguyen Hey, I can't find the document in the description; I can only find two links to different videos. I am doing a similar project and it would be a great help if I could read the PDF. Thanks in advance.
@@parjanyadalal3014 Hi, you can find the source code here: github.com/andynguyencoding/lidar_robotcar. Hope it helps you in some way.
What is the minimum configuration required to install it?
You need Ubuntu 18 or later, with 8 GB of RAM for a 3B model, 16 GB for a 7B model, and 32 GB for a 13B model. The CPU should have at least 4 cores; a GPU is not required, but Ollama can optimize inference on a GPU, so having one is better. In the video I used an Asus TUF Gaming F15 laptop with a Core i7, 16 GB of RAM, and an Nvidia RTX 3050 GPU.
Any git repository or documentation?
Yes, I am putting it all together and will release a git repo. It should be ready soon.
The video with explanation is here if you want to take a look th-cam.com/video/uuslVHPjWS8/w-d-xo.html
I want to do this project. Can you explain it briefly?
Yeah, sure. I intend to make a video explaining the details soon. I will let you know when it's ready.
Hello, here is the video with the explanation: th-cam.com/video/uuslVHPjWS8/w-d-xo.html