Bringing Generative AI to Life with NVIDIA Jetson
- Published Oct 6, 2024
- Learn about generative AI at the edge with this NVIDIA Jetson webinar.
Unlock the full potential of edge computing with accelerated inference on NVIDIA Jetson Orin devices.
Experience next-generation applications emerging in robotics and computer vision by deploying foundational LLMs and vision transformers into real-world embedded systems at the edge.
Explore optimization techniques and practical tips for building interactive AI agents with advanced reasoning and cognitive abilities.
Visit the NVIDIA Jetson Generative AI Lab to get started: www.jetson-ai-l...
Join the NVIDIA Developer Program: nvda.ws/3OhiXfl
Read and subscribe to the NVIDIA Technical Blog: nvda.ws/3XHae9F
Jetson, Gen AI, generative AI, edge computing, large language models, LLM, Vision Transformer
Big kudos to Dustin for his brilliant communicative skills! You're special, man!
What a fantastic overview of what's possible! This tracks a lot of components I've been working on and I can't wait to merge in some of these modules.
that's incredible! amazing presentation thank you
This is incredible. Absolutely futuristic.
I have a Jetson, yet when LLMs hit the market I stopped playing around with it overnight, as LLMs changed the way AI is thought of. I would love a new version of Jetson that runs a smaller LLM locally!! JetBot version 2? I can't wait!
Awesome job Dusty!
Would be far more impressive if JetPack 6 supported the Jetson Nano platform as well.
Nice post. With regards, Adv. T. E. Barat Bushan, Senior Advocate, Member of MHAA, Chennai
This is awesome! thanks for the information and great examples!!!!
A great talk.
This is amazing; I think it's really wonderful, and if we could talk to videos too this could be of great help.
A video object detection comparison between the NVIDIA T4 16GB (320 Tensor Cores) and the Orin 64GB (64 Tensor Cores) would be really good, as the T4 is also a low-power-consumption GPU.
Does JetPack 6 support the older Xavier platform?
Amazing, great video, love the demo. Can I also run these demos on my Jetson Orin?
Wow, that's impressive!
What would it even mean to use an LLM with motion sensors? That's pretty interesting - like could you feed it IMU data and tell the LLM that you wanted it to be alerted if it was dropped or if someone was running vs walking with the device?
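The idea floated above could be sketched out: preprocess IMU readings into a text summary and pose the classification as a question for an LLM to answer. Everything here is hypothetical — the function name, the threshold-free summary, and the prompt wording are illustrative, not part of any NVIDIA API.

```python
def imu_to_prompt(samples):
    """Turn accelerometer samples into a question an LLM could answer.

    samples: list of (ax, ay, az) readings in g.
    Returns a prompt string summarizing the motion data.
    """
    # Peak acceleration magnitude across all axes; a drop or impact
    # typically shows up as a large spike here.
    peak = max(abs(a) for s in samples for a in s)
    readings = "; ".join(
        f"({ax:.2f}, {ay:.2f}, {az:.2f})" for ax, ay, az in samples
    )
    return (
        "Accelerometer readings (g): " + readings + ". "
        f"Peak magnitude {peak:.2f} g. "
        "Was the device dropped, carried while running, "
        "or carried while walking?"
    )

# The resulting string would be sent to a locally hosted LLM on the Jetson.
prompt = imu_to_prompt([(0.1, 0.0, 1.0), (3.2, -2.8, 4.1)])
print(prompt)
```

Whether an off-the-shelf LLM can classify raw motion text reliably is an open question; in practice one would likely pair it with a small classical classifier and let the LLM handle the alerting logic.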
Thanks for the video, great developments! Is it possible to run those on the Xavier NX? Has anyone tried?
Will all the examples he showed run on Jetson Orin Nano Developer Kit?
😍
Cool, I'm interested in autonomous virtual humans 🤔
How in the hell do you expect to do any real development without conda support? I picked one up hoping for that; this is depressing.
There’s always virtualenv.
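For the reply above: Python's standard-library `venv` module gives virtualenv-style isolation without conda. A minimal sketch (the directory name is arbitrary):

```python
import os
import tempfile
import venv

# Create an isolated environment using only the standard library --
# no conda required. with_pip=True would also bootstrap pip into it.
target = os.path.join(tempfile.mkdtemp(), "demo-env")
venv.create(target, with_pip=False)

# On Linux the environment's interpreter lands in bin/python3;
# activate it in a shell with: source <target>/bin/activate
print(os.path.exists(os.path.join(target, "bin", "python3")))
```

This does not replace conda's binary package management, but for isolating Python dependencies per project it covers most day-to-day development needs.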