Easy Setup! Self-host Mixtral-8x7B across devices with a 2M inference app

  • Published on Jan 1, 2024
  • Run the open-source large language model 'Mixtral-8x7B' locally. This MoE model is released under the Apache 2.0 license and can be easily deployed on a wide range of devices with WasmEdge: whether it's a laptop or an edge device, you can get it running with just a few command lines (see the command sketch below). The fully portable inference app that runs this model is only 2 MB! Don't believe it? Try it out: www.secondstate.io/articles/m...
    Try it out on your Mac with a single command: www.secondstate.io/run-llm/
  • Science & Technology
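
The commands below are a minimal sketch of what the WasmEdge-based setup typically looks like, based on Second State's published LlamaEdge tutorials. The exact model file, download URLs, and prompt-template flag are assumptions here and may differ from the linked article, so treat the article above as the authoritative source.

    # Install WasmEdge with the GGML (llama.cpp-based) plugin for WASI-NN
    curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugin wasi_nn-ggml

    # Download a quantized Mixtral-8x7B GGUF model
    # (example file and quantization level; pick the one that fits your RAM)
    curl -LO https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf

    # Download the ~2 MB portable chat app (a Wasm binary);
    # assumed release location, per the LlamaEdge project
    curl -LO https://github.com/LlamaEdge/LlamaEdge/releases/latest/download/llama-chat.wasm

    # Run the model locally: --nn-preload maps the GGUF file into the Wasm app,
    # and -p selects the prompt template (mistral-instruct for Mixtral-Instruct)
    wasmedge --dir .:. --nn-preload default:GGML:AUTO:mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf llama-chat.wasm -p mistral-instruct

The same llama-chat.wasm binary runs unchanged on macOS, Linux, and edge devices, because the hardware-specific inference is handled by the WasmEdge runtime underneath, which is what keeps the app itself so small and portable.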
