OpenLLM: Operating LLMs in production

Published 15 Nov 2024

Tim Liu is Head of Product at BentoML.

Learn about OpenLLM's open platform for operating LLMs in production. See how to easily experiment with different models and configurations. Tackle the challenges of building production-ready LLM applications -- including cost, latency, compute, data privacy, scalability, evaluation, and the ecosystem of developer tools -- and learn how to run open-source LLMs with ease (see the sketch below).

This lecture was originally delivered at “From Toy To Production: Building LLM-Powered Systems that Work in the Real World,” an event in New York City dedicated to scaling LLM-powered systems from experimental stages to real-world production environments.

Get updates from Arize on future events: arize.com/comm...
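As a companion to the talk's theme of running open-source LLMs with ease, here is a minimal sketch of querying an already-running OpenLLM deployment through its OpenAI-compatible API. The host, port, API key, and model id below are illustrative assumptions, not values from the lecture; substitute whatever your own OpenLLM server exposes.

```python
from openai import OpenAI

# Assumed local OpenLLM endpoint. OpenLLM serves an OpenAI-compatible API,
# but the base URL and port here are placeholders for illustration.
client = OpenAI(
    base_url="http://localhost:3000/v1",
    api_key="not-needed-for-local-servers",
)

# Hypothetical open-source model id -- use whichever model your server is running.
response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[
        {"role": "user", "content": "Give one tip for running LLMs in production."}
    ],
)
print(response.choices[0].message.content)
```

Because the endpoint is OpenAI-compatible, the same client code can be pointed at a local experiment or a production deployment by changing only the base URL, which is part of what makes swapping models and configurations straightforward.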
