Local AI Coding in VS Code: Installing Llama 3 with continue.dev & Ollama
- Published Sep 7, 2024
- Want to take your VS Code experience to the next level with AI-powered coding assistance? In this step-by-step tutorial, discover how to supercharge Visual Studio Code with the incredible Llama 3 AI model using the game-changing continue.dev extension and Ollama.
Resources mentioned in the episode:
🤖 continue.dev
🤖 github.com/con...
🤖 ollama.com
Learn how to:
- Install the continue.dev extension in VS Code
- Download and set up the powerful Llama 3 AI model with Ollama
- Configure continue.dev to work seamlessly with Llama 3
- Leverage Llama 3's AI capabilities for code completion, explanations, and refactoring
- Boost your coding productivity and efficiency with this unbeatable setup
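The download-and-test part of the steps above boils down to a couple of terminal commands. A minimal sketch, assuming Ollama is already installed from ollama.com and that you want the default `llama3` model tag:

```shell
# Pull the Llama 3 model to your machine (the default 8B variant is several GB)
ollama pull llama3

# Sanity-check the model from the terminal before wiring up VS Code
ollama run llama3 "Write a hello world in JavaScript"
```

Once `ollama run` responds, the continue.dev extension can be pointed at the same local model.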
Whether you're a beginner looking to enhance your coding experience or a seasoned developer seeking to optimize your workflow, this video will guide you through the process of integrating Llama 3 AI into your VS Code environment using the intuitive continue.dev extension.
Don't miss out on this opportunity to revolutionize the way you code. Watch now and unlock the full potential of AI-assisted programming in Visual Studio Code!
#vscode #Llama3 #ContinueDev #AICoding #CodeWithAI #Ollama #Productivity
This is awesome. Currently using Codeium, but I'll install this later and give it a go.
@@codelinx let me know how it goes 💪🤖
Nice clip Jan, thanks, it's working like a charm.
The hardest part was getting rid of the Amazon AWS stuff in VS Code that kept installing Amazon Q and stealing the code completion ^^.
I'm glad I didn't have AWS connected to VS Code in that case :D Glad it worked well for you!
Doesn't work for me. I can do it from the command line, but the Continue plugin doesn't seem to work at all. I did all the configuration and it responds with nothing.
@@scornwell100 did you check the issues listed in their GitHub repo? github.com/continuedev/continue
You can also join their Discord server to get more detailed help: discord.gg/vapESyrFmJ
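For anyone else stuck at this step: one common cause is a model entry that doesn't match the tag Ollama actually pulled. A sketch of the relevant part of Continue's config.json (the exact schema may differ between extension versions, so treat the field names as an assumption and check their docs):

```json
{
  "models": [
    {
      "title": "Llama 3",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```

The `"model"` value should match the tag shown by `ollama list` exactly.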
simple and great explanation ! thank you
@@Ilan-Aviv you bet, glad the tutorial was useful! What are you building with AI?
@@iamjankoch Building a code assistant agent to work inside VSC editor, to help me with other projects.
Actually, I'll ask for your advice:
I have a trading bot written in Node.js and React. The project was written by another programmer, and I'm struggling with some parts of its development.
I'd like a useful AI assistant to help me find bugs and understand the app structure.
The app runs a server with a browser client and has about 130 files.
I tried OpenAI GPT, but that's too many files for it and it keeps losing the context, on top of the other issues it has.
I came to the conclusion that the best approach is a local LLM running on my machine.
If you have any recommendations for the right AI assistant you would use, I'd appreciate your advice. 🙏
Dope. I'm going on an RV trip for two weeks so will have spotty service but can still ship 🚢
Sounds awesome, enjoy the trip!
Great explanation
@@tanercoder1915 thank you!
Great video my friend. Worked like a charm, thank you very much!
Glad to hear that, happy coding!
Great video! Really helped me set things up!!
@@caiohrgm22 glad to hear that!!!
Thank you so much!
@@2ru2pacFan glad you enjoyed the tutorial!
Is 8 GB of RAM sufficient? I have enough storage, but when I try to use this after installing, it just doesn't work. It keeps loading.
I run it on 16 GB. The processor and GPU are quite important for Ollama as well.
16 GB of RAM is recommended: github.com/open-webui/open-webui/discussions/736#
I run it with 11GB of VRAM from command line and it seems fine, but inside VSCode I can't get it to respond, it throws errors that the stream is not readable.
Thank you! I was looking for this. What are the specs of your Mac?
Glad you enjoyed the video! It’s a 2023 MacBook Pro with M2 Pro and 16 GB RAM
@@iamjankoch Nice, thank you for the fast answer. I was wondering if I'd need an M2/M3 Max with a lot of RAM to load Llama 3 on a MacBook.
@@PedrinbeepOriginal Not really. Granted, I don't do much other heavy work when I'm coding but it runs super smooth with the 16GB. The only time I wish I had more RAM is when doing video editing lol
luv u bro
@@user-13853jxjdd glad you enjoyed the video!
Thanks a lot! =)
You’re welcome!