meremortaldev
Streaming LLM responses vs blocking LLM responses with Ollama and Node.js
Live streamed on September 12, 2024.
We're using Ollama and Node.js to look at the code-level differences between streaming LLM responses and blocking on synchronous responses. Best of all, since it's Ollama, the LLMs we look at, like Llama 3.1 and Tiny Llama, are running locally.
Get Ollama here: ollama.com
Ollama API docs are here: github.com/ollama/ollama/blob/main/docs/api.md
Views: 78
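The code-level difference the video covers can be sketched roughly as follows. This is a minimal sketch, not the video's exact script: it assumes Node.js 18+ (built-in `fetch`), Ollama serving on its default port 11434, and the `llama3.1` model already pulled. Per the Ollama API docs, `/api/generate` returns a single JSON object when `stream` is `false` and newline-delimited JSON (NDJSON) chunks when it is `true`.

```javascript
// Sketch only: assumes Node.js 18+, Ollama on http://localhost:11434,
// and `ollama pull llama3.1` already run.
const OLLAMA_URL = "http://localhost:11434/api/generate";

// Pull the `response` text out of an NDJSON buffer. For simplicity this
// assumes chunk boundaries fall on newlines; production code should
// buffer partial lines across chunks.
function extractTokens(ndjsonText) {
  return ndjsonText
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line).response);
}

// Blocking: one request, one JSON body, nothing to show until it's all done.
async function generateBlocking(prompt) {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    body: JSON.stringify({ model: "llama3.1", prompt, stream: false }),
  });
  const data = await res.json();
  return data.response; // the full completion in a single field
}

// Streaming: print each token as its NDJSON chunk arrives.
async function generateStreaming(prompt) {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    body: JSON.stringify({ model: "llama3.1", prompt, stream: true }),
  });
  const decoder = new TextDecoder();
  for await (const chunk of res.body) {
    for (const token of extractTokens(decoder.decode(chunk))) {
      process.stdout.write(token);
    }
  }
}

// Usage (needs a running Ollama server):
// generateStreaming("Why is the sky blue?");
```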

Videos

Talking to LLMs locally with Ollama with Node.js
108 views · 2 months ago
Live streamed on September 11, 2024. Running LLMs locally is amazing, and Ollama makes it possible. Even better, Ollama isn't just a CLI; it also comes with a local REST API server. In the video, we look at running Llama 3.1 and Tiny Llama via the CLI. Then we get into calling LLMs from a Node.js script. Get Ollama here: ollama.com
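A rough sketch of what calling a local LLM from a Node.js script can look like (assumptions: Node.js 18+ for the built-in `fetch`, Ollama serving on its default port 11434, and a model already pulled): the `/api/chat` endpoint accepts a list of messages and, with `stream: false`, returns the assistant's reply in one JSON body.

```javascript
// Sketch only: assumes Node.js 18+ and Ollama on http://localhost:11434
// with the `llama3.1` model pulled (`ollama pull llama3.1`).
async function ask(prompt) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    body: JSON.stringify({
      model: "llama3.1",
      messages: [{ role: "user", content: prompt }],
      stream: false, // block until the whole reply is ready
    }),
  });
  const data = await res.json();
  return data.message.content; // assistant reply text
}

// Usage (needs a running Ollama server):
// ask("Why is the sky blue?").then(console.log);
```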
Using EmbedJS for AI RAG systems in Node.js
94 views · 2 months ago
Live streamed on September 10, 2024. Here are a few resources I created while I was playing around with EmbedJS for RAG: RAG in Node.js with EmbedJS - Blog: www.ashryan.io/rag-in-node-js-with-embedjs/ - Repo: github.com/ashryanbeats/embedjs-rag Batch loading resources in EmbedJS for RAG - Blog: www.ashryan.io/batch-loading-resources-in-embedjs-for-rag/ - Repo: github.com/ashryanbeats/embedjs-ra...
Making a React based math game with Anthropic's Claude 3.5 Sonnet LLM
43 views · 2 months ago
Live streamed on September 9, 2024. You can try the deployed game on Val Town here: ashryanio-multiplicationgameapp.web.val.run

Comments

  • @ipis07
    1 month ago

    Thanks for this video! Had a hard time getting the streaming response.

  • @e4rohan
    2 months ago

    love this thanks for the video :D

  • @shadyShift
    2 months ago

    Awesome !! Following for more content

    • @meremortaldev
      2 months ago

      Thanks @shadyShift!