I did all your steps, and when I try to translate to Spanish, it gives me the following error:
"Ollama (local LLM)" requires a web server running locally!
{"error":"CreateFile C:\\Users\\jgcar\\.ollama\\models\\blobs\\sha256-00e1317cbf74d901080d7100f57580ba8dd8de57203072dc6f668324ba545f29: The system cannot find the file specified."}
Hey David, it seems like you haven't activated the local server in LM Studio yet. You can do this by clicking the double-arrow icon on the left-hand side and finding the "Start Server" button there. Thank you so much for your ongoing contributions to those of us using Subtitle Edit!
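If you want to confirm that the local server is actually listening before retrying the translation in Subtitle Edit, here is a minimal sketch (assumptions: Ollama's default port 11434 and LM Studio's default port 1234; adjust if you changed them):

```python
import urllib.request
import urllib.error

def server_is_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server responds at the given URL."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # nothing listening (connection refused / timeout)

# Default local endpoints for each tool (assumed, not verified here)
print("Ollama running:   ", server_is_up("http://localhost:11434"))
print("LM Studio running:", server_is_up("http://localhost:1234/v1/models"))
```

If both print False, the translation request has nothing to talk to; start the server first, then retry.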
Go into Advanced Configuration, then GPU Settings, and make sure GPU offload is set to max.
It seems to me like LM Studio uses your CPU, which is slow for AI, while Ollama uses your GPU. Check their settings.
Very nice tool, thank you!
Good explanation.
When I translate with Llama 3, these characters appear: "". Why?
The <br> tag inserts a single line break. BR = break.
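If those tags are leaking into your translated subtitles as visible text, they can be converted back into real line breaks. A minimal sketch (hypothetical helper, not Subtitle Edit's own code):

```python
import re

def br_to_newline(text: str) -> str:
    """Replace HTML-style <br> tags (any casing, optional slash) with newlines."""
    return re.sub(r"<br\s*/?>", "\n", text, flags=re.IGNORECASE)

# A two-line subtitle encoded with a <br> tag
print(br_to_newline("First line<br>Second line"))
```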
Without watching the video - Ollama wins in every aspect
Is this comment genuine or sarcastic? Is Ollama secure and able to read from files? Because LM Studio is, more or less, but it is indeed very slow.