![Kai Nakamura](/img/default-banner.jpg)
Kai Nakamura
Joined Feb 20, 2021
VEX IQ - Sensing Color
This video shows how to sense color using the VEX optical sensor. It covers setting up the sensor and how to use it, along with using if statements to make decisions in your code.
VEX IQ Tutorial Playlist: th-cam.com/play/PL49dF5loXCunQO-Fnx2JYFAsxjAtfytil.html
0:00 Intro
0:13 Optical Sensor Setup
0:40 Red Light, Green Light
1:59 Adding More Colors
2:50 If Statements
4:13 Conclusion
VEXcode IQ App: www.vexrobotics.com/vexcode/install/iq
VEX IQ Building Instructions: www.vexrobotics.com/iq/downloads/build-instructions
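The "red light, green light" decision logic described above can be sketched in plain Python. VEXcode IQ itself uses blocks (or its own Python API); `choose_action` and the color strings here are hypothetical stand-ins, just to illustrate the if / else-if structure the video builds:

```python
# Sketch of the if-statement logic from the video, in plain Python.
# choose_action() stands in for the decision blocks; it is not the
# real VEXcode IQ optical sensor API.

def choose_action(color):
    """Return a drive command based on the detected color."""
    if color == "red":
        return "stop"    # red light: stop the robot
    elif color == "green":
        return "drive"   # green light: drive forward
    elif color == "blue":
        return "turn"    # an extra color, as added later in the video
    else:
        return "wait"    # unrecognized color: do nothing

print(choose_action("red"))    # -> stop
print(choose_action("green"))  # -> drive
```

The chain of `elif` branches mirrors stacking more if blocks as more colors are added.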
95 views
Videos
VEX IQ - Sensing Touch
64 views · 14 days ago
This video shows how to sense touch using the VEX bumper sensor. It covers how to set up the bumper sensor, how to use the "wait until" block, and how to create a simple obstacle avoidance program. VEX IQ Tutorial Playlist: th-cam.com/play/PL49dF5loXCunQO-Fnx2JYFAsxjAtfytil.html 0:00 Intro 0:14 Attaching the Bumper Sensor 0:31 Devices Menu 1:15 How the Bumper Sensor Works 1:46 Bumper Sensor Set...
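The wait-until-then-avoid pattern this description mentions can be simulated in a few lines of Python. The `Bumper` class below is a made-up stand-in for the VEX bumper sensor, not the real API:

```python
# Minimal simulation of the "wait until" pattern: drive until the bumper
# is pressed, then back up and turn. Bumper is a hypothetical stub that
# reports "pressed" after a set number of polls.

class Bumper:
    def __init__(self, press_at_step):
        self.step = 0
        self.press_at_step = press_at_step

    def pressed(self):
        self.step += 1
        return self.step >= self.press_at_step

def avoid_obstacle(bumper):
    log = []
    while not bumper.pressed():   # "wait until" the bumper is pressed
        log.append("drive")
    log.append("reverse")         # then back away from the obstacle
    log.append("turn")            # and turn to avoid it
    return log

print(avoid_obstacle(Bumper(press_at_step=3)))
```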
VEX IQ - Loops
44 views · 14 days ago
This video shows how to use loops to repeat blocks of code in your programs. It covers what loops are, when to use them, the differences between the repeat block and forever block, and how to create complex loops. VEX IQ Tutorial Playlist: th-cam.com/play/PL49dF5loXCunQO-Fnx2JYFAsxjAtfytil.html 0:00 Intro 0:14 Why Loops? 1:04 Repeat Block 1:43 Forever Block 1:58 Loops Inside Loops 2:20 Conclusi...
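The repeat-versus-nested-loops idea translates directly to ordinary Python. The `repeat` helper below is a hypothetical analogue of the repeat block (the forever block would be a `while True:` loop):

```python
# Repeat block: run an action a fixed number of times.
def repeat(n, action):
    for _ in range(n):
        action()

beeps = []
repeat(3, lambda: beeps.append("beep"))  # "repeat 3" -> three beeps

# Loops inside loops: 2 outer iterations x 3 inner = 6 total actions.
count = 0
for _ in range(2):
    for _ in range(3):
        count += 1

print(beeps, count)
```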
VEX IQ - Controlling Motors
97 views · 14 days ago
This video shows how to control individual motors on your robot. It covers how to set up the motors, how to program them, and how to move them with the controller. VEX IQ Tutorial Playlist: th-cam.com/play/PL49dF5loXCunQO-Fnx2JYFAsxjAtfytil.html 0:00 Intro 0:12 Motor Setup 1:02 Motor Blocks 1:34 Spin for Degrees 2:22 Using the Controller VEXcode IQ App: www.vexrobotics.com/vexcode/install/iq VE...
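One way to think about "spin for degrees" is as a simple angle/speed/time relationship. A toy sketch with invented numbers, not the VEX motor API:

```python
# How long a motor must run to sweep a given angle at a given speed
# (degrees per second). Hypothetical helper for illustration only.

def spin_time(degrees, speed_dps):
    """Seconds needed to rotate `degrees` at `speed_dps` deg/s."""
    return degrees / speed_dps

print(spin_time(90, 45))   # a 90-degree spin at 45 deg/s takes 2.0 s
```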
VEX IQ - Driving and Turning
66 views · 14 days ago
This video shows how to program your robot to drive and turn. It covers how to use the basic driving and turning blocks, the differences between waiting and non-waiting blocks, and how to change the speed of the robot. VEX IQ Tutorial Playlist: th-cam.com/play/PL49dF5loXCunQO-Fnx2JYFAsxjAtfytil.html 0:00 Intro 0:14 Drivetrain Setup 1:00 Driving Blocks 1:45 Turning Blocks 2:15 Multiple Movements...
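The waiting versus non-waiting distinction can be illustrated with a simulated event log. `drive` here is a hypothetical stand-in, not a VEXcode block: a waiting call finishes its movement before returning, while a non-waiting call starts the movement and returns immediately.

```python
# Waiting vs. non-waiting blocks, simulated with an event log.

log = []

def drive(distance, wait=True):
    log.append(f"start drive {distance}")
    if wait:
        log.append(f"finish drive {distance}")  # blocks until done

drive(100, wait=True)    # next block runs only after driving finishes
drive(200, wait=False)   # returns immediately; robot is still moving
log.append("turn")       # overlaps with the non-waiting drive

print(log)
```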
VEX IQ - Getting Started
144 views · 14 days ago
This video covers everything you need to know to get started with VEX IQ robotics. We'll cover navigating around the code editor, pairing up the robot and controller, creating your first program, and finally driving around the robot. VEX IQ Tutorial Playlist: th-cam.com/play/PL49dF5loXCunQO-Fnx2JYFAsxjAtfytil.html 0:00 Intro 0:14 VEXcode IQ Editor 0:45 Pairing the Brain 1:35 First Program 2:29 ...
How to Make an Autonomous Mapping Robot Using SLAM
12K views · 1 month ago
This video explains the basics of SLAM (Simultaneous Localization and Mapping), how a LIDAR sensor works, frontier exploration, pathfinding, pure pursuit, obstacle avoidance, and Monte Carlo localization. This project was part of my RBE 3002 class, more details can be found at: kainakamura.com/project/rbe3002. 0:00 What is SLAM? 0:44 Implementing SLAM 1:44 Frontier Exploration 2:31 Pathfinding ...
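The frontier exploration step mentioned here has a compact core: on an occupancy grid, a frontier is a free cell touching unknown space, and the robot drives toward frontiers until none remain. A minimal sketch of that idea (not the actual ROS/gmapping code from the project):

```python
# Frontier detection on an occupancy grid:
#   0 = free, 1 = wall, -1 = unknown.
# A frontier cell is a free cell with at least one unknown neighbor.

def frontier_cells(grid):
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:
                continue  # only free cells can be frontiers
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1:
                    frontiers.append((r, c))
                    break
    return frontiers

grid = [
    [0, 0, -1],
    [0, 1, -1],
    [0, 0,  0],
]
print(frontier_cells(grid))  # free cells bordering unknown space
```

When `frontier_cells` returns an empty list, every reachable cell has been observed and the map is complete.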
Squidbox: A DIY Bluetooth MIDI Controller
1K views · 1 month ago
Squidbox is a low-cost Bluetooth MIDI controller. It provides an intuitive and interactive way to learn about scales and chords while having fun making music. It runs on an ESP32 and features a 2-axis joystick, a small OLED screen, a knob, and multiple buttons. This project was made by Kai Nakamura, Evan Carmody, Dennis Garvey, Karish Gupta, and Niko Tan. More details can be found at kainakamur...
Developing a Pick-and-Place Robotic Arm
6K views · 8 months ago
Development of a 4-DOF serial robotic manipulator designed for pick and place tasks. We discuss robot kinematics, camera calibration, transforming 2D pixel coordinates into 3D world coordinates, object detection and classification, and object localization challenges. This project was made by Kai Nakamura, Owen Sullivan, and Evan Carmody for RBE 3001. More details and a full report can be found ...
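One piece of the 2D-pixel-to-3D-world step can be sketched with a pinhole camera model: given the intrinsics and a known depth to the table plane, a pixel back-projects to a 3D point in camera coordinates. The numbers below are invented for illustration; the project's actual calibration is more involved and is described in the linked report:

```python
# Back-project a pixel (u, v) at a known depth using pinhole intrinsics
# (fx, fy = focal lengths in pixels; cx, cy = principal point).

def pixel_to_world(u, v, fx, fy, cx, cy, depth):
    """Return the 3D camera-frame point for pixel (u, v) at `depth`."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel 100 px right of the principal point, camera 0.5 m above the table:
print(pixel_to_world(420, 240, fx=600, fy=600, cx=320, cy=240, depth=0.5))
```

For a downward-facing camera over a flat table, the depth is constant and known, which is what makes this single-camera localization possible.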
Is 1400€ low-cost? I found way better, full-aluminium, 6-DOF robotic arms for less than 10% of the price.
You explained it better in just 5mins what my college was not able to teach me in one semester.
What application are you using
It's called VEXcode IQ, you can install it here: www.vexrobotics.com/vexcode/install/iq There's also a browser version, but I highly recommend installing the app instead: codeiq.vex.com/
Can I know what model of LIDAR sensor you used?
It's an LDS-01 on top of a TurtleBot3: emanual.robotis.com/docs/en/platform/turtlebot3/appendix_lds_01/
How do you display the data in a 3d simulation?
All of the 3D plots were made in MATLAB, mostly using the plot3 and quiver3 functions. Full code can be found here: github.com/KaiNakamura/RBE3001
@@kaihnakamura thank you
❤❤❤❤
Dude amazing work
Can you make a video about mapping
cool
Hi Mr. Kai Nakamura, could you please tell me where you got the walls for the arena? Are they custom, or did you buy them from an online store? I have a project that needs such walls. Thank you in advance.
I just contacted my professor and they told me the walls are custom-made from thin rectangular plywood sheets and 3D-printed parts. The curved walls are made by laser-cutting a zig-zag pattern into the plywood, allowing it to bend. If you're interested in creating your own, I could probably contact the lab staff and get the files needed to reproduce them. Hope this helps!
This is the best overview of SLAM that I've seen yet. Excellent presentation and just enough detail to get people started. I'd love to see a deep dive on getting ROS set up on a small rover like this.
Me too, I also want to go through the Nav2 package from scratch.
Me too! Although it’s meaningless because it’s the first one. Data needs context. 😅
one video on mapping please
Super! That is a nice easy explanation of SLAM. Well done!
♥♥♥
insanely easy to understand! thank you sir
I’m interested in creating an outdoor rover. In your opinion, would lidar work for accurate positioning in an outdoor environment, like a yard?
It's hard to say for sure if LIDAR would work well in an outdoor environment for your needs. One thing to keep in mind about the robot I used is that it only makes LIDAR scans parallel to the ground. This worked fine for my needs because the only obstacles were the walls. But if the environment were full of obstacles shorter than the LIDAR sensor, then the robot would be unable to detect them. There are 3D LIDAR sensors that allow you to create point clouds in 3D, but these can be quite expensive. Some SLAM robots use a 2D LIDAR scanner in combination with a stereo camera to achieve the same effect. The robot in the video is a TurtleBot3, but I know the TurtleBot4 uses the LIDAR-and-camera approach. Hope this helps! turtlebot.github.io/turtlebot4-user-manual/overview/features.html
@@kaihnakamura I really appreciate your thoughts and input. I was considering using GPS, but realized I would need RTK to make it accurate enough and that’s a bit more expensive than I was hoping for.
Sometime in the future, would it be able to detect changes in the environment, where an obstruction has been added or a wall moved, so that it can update its map?
Currently, the robot is using the gmapping ROS package to map the environment which only works for static environments. However, there are other SLAM implementations in ROS that allow for dynamic maps such as slam_toolbox. gmapping: wiki.ros.org/gmapping slam_toolbox: github.com/SteveMacenski/slam_toolbox#lifelong-mapping
Cool, I gotta say this is a very well made video. I'm surprised at how small the channel is and I hope to see it grow.
Thank you so much! :)
Wow, really outstanding results! How many people were on this project, and how long did it take you to accomplish it? Are you planning to release the code for this project? Again, thank you for sharing!
Three people over the course of a seven week term. Unfortunately, I cannot share the code in its entirety due to academic policy (students next year would just be able to copy it). But I could share code snippets or point toward resources with some of the algorithms I used if you’re interested. Thank you so much for your interest in this project! :)
Can you share with me as well?
@@kaihnakamura Yeah, it would be awesome! If you could share info about the hardest points, not in code, but maybe as theory, methodology, or something similar. It could help others, and I think it will be interesting material for your blog!
All that fancy work and you couldn't bother to get the buttons on straight... that's a B-
Woah, this is really cool
This device is so cool! I hope WPI will allow the students and professor to continue to develop this and potentially market it perhaps as a fundraiser for the Institute. I know someone who wants to buy one already!
This is great 😃👍
Can you please explain how to design the algorithm for a computer-vision-based pick-and-place robotic hand? I'm struggling with my project.
There's no one right way to create a computer vision pipeline, but there are certainly some things that help a whole lot. One of the more difficult steps is getting the color thresholding just right, since it can vary a lot with ambient light levels and will inevitably pick up some unwanted pixels. To combat this, I highly recommend using the MATLAB imfindcircles function to clean up unwanted pixels. In my experience it works far better than alternatives like erosion. Best of luck with your project!
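For readers without MATLAB, the underlying idea of cleaning a thresholded mask, keeping only connected regions large enough to be a real object, can be sketched in plain Python. This is a generic small-blob filter, not imfindcircles (which additionally requires the blobs to be roughly circular):

```python
# Remove speckle from a binary mask: flood-fill each connected region
# and keep only regions with at least `min_size` pixels.

def remove_small_blobs(mask, min_size):
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # flood fill to collect one connected blob
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(blob) >= min_size:   # keep only big-enough blobs
                    for y, x in blob:
                        out[y][x] = 1
    return out

mask = [
    [1, 1, 0, 1],   # the lone pixel at the top right is noise
    [1, 1, 0, 0],
    [0, 0, 0, 0],
]
print(remove_small_blobs(mask, min_size=3))
```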
Can you do more of these videos? How did you guys design the hand, was it via SolidWorks? I'm working with a robot hand but I'm having trouble with it.
The arm we used is the OpenMANIPULATOR-X. More details about it can be found in our full report here: kainakamura.com/project/rbe3001
You guys use MATLAB, damn that's cool.
How can I get in touch with you? I wanted to know how you guys came up with the program, and did you use Java?
We used MATLAB to control the arm, you can look at the source code here: github.com/KaiNakamura/RBE3001
Hi can you help me out please
Wonder if you could use the new GPT API to understand where objects are and how to manipulate them.
wish i had friends like you
How are the servos not blocking each other with no timed millis? I still don't understand how, in the code, the servos know where to go or when.
The Dynamixel servos are more advanced than normal servos. They each have their own processor and can receive a command to move their joint to a specific position, which they then do without additional help from the main computer.
Very nice video! The reference frames shown @3:46 are left-handed, not right-handed. Was this intentional?
Good catch! No, this was not intentional; the x and y axes should be switched in each frame.
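A quick sanity check for handedness: in a right-handed frame, x cross y equals z; in a left-handed frame it equals minus z. A tiny Python illustration:

```python
# Check frame handedness with a cross product (plain-Python 3-vectors).

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

x, y, z = (1, 0, 0), (0, 1, 0), (0, 0, 1)
print(cross(x, y) == z)   # True: x cross y = z, so the frame is right-handed
print(cross(y, x))        # (0, 0, -1): swapping x and y flips the handedness
```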
@@kaihnakamura Got it. Anyway, really good video. With your permission, I'd like to use this video for when I'll teach RBE 3001 in C term.
@@kid-a Absolutely, feel free to use this video!
Can you send a link to any paper on this that you have cited or written?
Yes! I've got some more details and a full report on my website at kainakamura.com/project/rbe3001
Thanks @@kaihnakamura
this is so cool
For the forward kinematics, I thought the matrices were exponentials of those 4x4 transformation matrices with [omega, velocity]. Love this video btw, awesome presentation.
The ball juggler
Brother, this is awesome! What areas should I study to get into this?
probably trigonometry, linear algebra and Python programming