Llama 3 in Ollama VS LM Studio - Which is Faster at Translating Subtitles in Subtitle Edit?

  • Published on Jan 14, 2025

Comments • 11

  • @AIFusion-official
    @AIFusion-official 7 months ago +6

    Go into Advanced Configuration, then GPU Settings, and make sure GPU offload is set to max.

    • @Lift.Life0
      @Lift.Life0 6 months ago

      I followed all your steps, but when I try to translate to Spanish I get the following error:
      "Ollama (local LLM)" requires a web server running locally!
      {"error":"CreateFile C:\\Users\\jgcar\\.ollama\\models\\blobs\\sha256-00e1317cbf74d901080d7100f57580ba8dd8de57203072dc6f668324ba545f29: The system cannot find the file specified."}
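That `CreateFile ... blobs\sha256-...` error usually means a model blob is missing from Ollama's local store, so re-downloading the model normally fixes it. A minimal sketch, assuming Ollama's default port (11434) and the `llama3` model tag from the video; substitute whichever model Subtitle Edit is configured to use:

```shell
# Sketch: recovering from a missing Ollama model blob.
# Assumptions: default Ollama port 11434 and the "llama3" model tag.
OLLAMA_URL="http://localhost:11434"

# 1) Confirm the Ollama server is running (guarded so this is safe to
#    run on machines without curl or Ollama installed).
command -v curl >/dev/null && curl -s "$OLLAMA_URL/api/version" || true

# 2) Re-pull the model: a blob missing under ~/.ollama/models/blobs is
#    restored by downloading the model again.
command -v ollama >/dev/null && ollama pull llama3 || true
```

If re-pulling alone doesn't help, `ollama rm llama3` followed by `ollama pull llama3` forces a clean download.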

  • @rallisf1
    @rallisf1 8 months ago +6

    It seems to me like LM Studio uses your CPU, which is slow for AI, while Ollama uses your GPU. Check their settings.
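One way to check that claim on the Ollama side is its own process listing (`ollama ps` prints a PROCESSOR column showing where inference runs); the fallback string below is just for machines without Ollama installed:

```shell
# "ollama ps" lists loaded models with a PROCESSOR column such as
# "100% GPU" or "100% CPU", which shows where inference actually runs.
# Guarded so the snippet is safe on machines without Ollama installed.
PROC_INFO="$(command -v ollama >/dev/null && ollama ps || echo "ollama not installed")"
echo "$PROC_INFO"
```

LM Studio has no equivalent one-liner; its GPU offload setting lives in the GUI, under the GPU settings mentioned earlier in the thread.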

  • @ProyectoHermenautas
    @ProyectoHermenautas 8 months ago +4

    Hey David, it seems like you haven't activated the local server in LM Studio yet. You can do this by clicking the double-arrow icon on the left-hand side and finding the "Start Server" button there.
    Thank you so much for your ongoing contributions to those of us using Subtitle Edit!

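On the LM Studio side of the thread: once "Start Server" is clicked, LM Studio exposes an OpenAI-compatible HTTP API, by default on port 1234 (changeable in the server tab). A quick probe to confirm Subtitle Edit can reach it:

```shell
# Probe the LM Studio local server at its default base URL. A JSON list
# of loaded models means the server is reachable; a connection error
# means "Start Server" was never clicked. Guarded so the snippet is
# safe to run on machines without curl or LM Studio.
LMSTUDIO_URL="http://localhost:1234/v1"
command -v curl >/dev/null && curl -s "$LMSTUDIO_URL/models" || true
```

Subtitle Edit's LM Studio translation settings should point at the same base URL.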
  • @mustaphahatif4226
    @mustaphahatif4226 6 months ago

    Very nice tool, thank you.
    Good explanation.

  • @dayantargaryen4129
    @dayantargaryen4129 8 months ago

    When I translate with llama 3, these characters appear: "<br />". Why?

    • @Dop3Dawg
      @Dop3Dawg 8 months ago

      The <br> tag inserts a single line break (BR = line break).

  • @HyperUpscale
    @HyperUpscale 8 months ago +1

    Without watching the video - Ollama wins in every aspect

    • @thesuperfluousone2537
      @thesuperfluousone2537 7 months ago +1

      Is this comment genuine or sarcastic? Is Ollama secure, and can it read from files? Because LM Studio can, more or less, but it is indeed very slow.