Free and Private GitHub Copilot Clone for VS Code Using Ollama and Continue

  • Published on Jun 2, 2024
  • Install a Private AI Coding Assistant in VS Code for Free Using Ollama and Continue
    In this video, we'll see how you can use Ollama and Continue to run a private GitHub Copilot clone locally for free using open source large language models (LLMs) such as codellama and deepseek-coder.
    This lets you try out different models, and even use uncensored models.
    Don't send your private data to OpenAI's ChatGPT or GitHub's Copilot; keep it private on your PC or Mac.
    👍 Please like if you found this video helpful, and subscribe to stay updated with my latest tutorials. 🔔
    ❤️ You can support this channel by buying me a ☕: buymeacoffee.com/codesfinance
    🔖 Chapters:
    00:00 Intro
    00:22 Install Ollama and Models
    01:34 Install Continue
    05:00 Chat
    05:31 Edit and Refactor
    06:21 Fix Bugs
    07:12 Other Uses
    07:26 Outro
    🍺 Homebrew installation commands:
    brew install ollama
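    Once Ollama is installed, you can pull the models mentioned in the video ahead of time (model names as listed on ollama.com; a sketch, not the video's exact commands):
    ollama pull codellama
    ollama pull deepseek-coder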
    🔗 Video links:
    Ollama: ollama.com/
    Continue: continue.dev/
    🐍 More Vincent Codes Finance:
    - ✍🏻 Blog: vincent.codes.finance
    - 🐦 X: / codesfinance
    - 🧵 Threads: www.threads.net/@codesfinance
    - 😺 GitHub: github.com/Vincent-Codes-Finance
    - 📘 Facebook: / 61559283113665
    - 👨‍💼 LinkedIn: / vincent-codes-finance
    - 🎓 Academic website: www.vincentgregoire.com/
    #chatgpt #llm #largelanguagemodels #ollama #copilot #githubcopilot #python #javascript #programming #code #gpt #opensource #opensourceai #llama2 #mistral #bigdata #research #researchtips #vscode #professor #datascience #dataanalytics #dataanalysis #uncensored #private
  • Science & Technology

Comments • 15

  • @VincentCodesFinance
    @VincentCodesFinance  1 month ago

    👍 Please like if you found this video helpful, and subscribe to stay updated with my latest tutorials. 🔔
    ❤ You can support this channel by buying me a ☕: buymeacoffee.com/codesfinance

  • @wrOngplan3t
    @wrOngplan3t 26 days ago

    This was pretty cool! I'm just barely a hobby coder, some Arduino or Processing Java stuff once in a blue moon lol. I've barely scratched the surface with this extension. I've yet to see how much better this is than just copy-pasting to/from an LLM, but it certainly has potential! Seems pretty helpful, although you also have to be skeptical of its outputs and claims (same as ever); sometimes it seems to hallucinate a bit, or assume some stuff that's not quite right. But it's alright as long as one is aware of it.
    Using Linux Mint 21.3 and VSCodium. Already had Ollama present. I found the Continue add-on, and thanks to your config.json editing I got it working. So far with CodeGemma.
    Great video, thanks!

    • @VincentCodesFinance
      @VincentCodesFinance  25 days ago +1

      Thanks! I guess the main benefit of having it directly in VS Code is convenience: it lets you stay within VS Code, is integrated with the UI, and the buttons and slash commands will automatically wrap prompts around your code. In terms of output quality, it will mostly depend on the model you use.

  • @user-dr7yi2fj6x
    @user-dr7yi2fj6x 1 month ago

    Thanks for the awesome video! Great info :)

  • @ak-if9wg
    @ak-if9wg 2 months ago

    very good info

  • @dogan1318
    @dogan1318 20 days ago

    Hi, can we set the Ollama provider URL? I want to use the Ollama that I serve on my own server.

    • @VincentCodesFinance
      @VincentCodesFinance  20 days ago

      Yes! You can specify the URL with the apiBase parameter. See docs.continue.dev/reference/Model%20Providers/ollama for an example.
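      For reference, here's a rough sketch of what that model entry might look like in config.json, assuming a server reachable at http://your-server:11434 (placeholder address; check the Continue docs above for the exact fields):
      {
        "models": [
          {
            "title": "Code Llama (remote)",
            "provider": "ollama",
            "model": "codellama",
            "apiBase": "http://your-server:11434"
          }
        ]
      }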

    • @dogan1318
      @dogan1318 19 days ago

      @@VincentCodesFinance Thank you 🙏

  • @vishnusureshperumbavoor
    @vishnusureshperumbavoor 1 month ago

    Bruh, I moved it to the right side & after closing it's not visible now.

    • @VincentCodesFinance
      @VincentCodesFinance  1 month ago +1

      If you moved it to the right, it's now in your secondary side bar. If it was closed, you can reopen it from the menu: View -> Appearance -> Secondary Side Bar (Cmd+Option+B on Mac).

    • @vishnusureshperumbavoor
      @vishnusureshperumbavoor 1 month ago

      @@VincentCodesFinance oh tnx bruh 🙏🙏💪

  • @vertigoz
    @vertigoz 1 month ago

    On macOS I can't even find the config.json :(

    • @VincentCodesFinance
      @VincentCodesFinance  1 month ago +1

      By default it will be created in "~/.continue/config.json", but you can open it directly by clicking on the gear icon in the Continue extension sidebar.
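      For a local setup like in the video, a minimal entry along these lines should be enough (a sketch; it assumes you've already pulled codellama with Ollama, and that Continue falls back to the local Ollama default when apiBase is omitted):
      {
        "models": [
          {
            "title": "Code Llama",
            "provider": "ollama",
            "model": "codellama"
          }
        ]
      }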