Introduction to LlamaIndex v0.10

  • Premiered Jul 21, 2024
  • This video gives a comprehensive overview of LlamaIndex v0.10, the largest release to our Python package to date.
    We cover the following topics:
    1. Creating a `llama-index-core` package and splitting integrations/templates into separate packages
    2. LlamaHub Revamp
    3. ServiceContext Deprecation
    Throughout these sections, we'll cover a migration guide, usage examples, and a contributing guide.
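    As a quick taste of the ServiceContext deprecation covered below: v0.10 replaces per-object service contexts with a global `Settings` object. A minimal sketch of the new pattern (assuming `llama-index-core` and the OpenAI LLM integration package are installed; the model name is just an example):
    ```
    from llama_index.core import Settings
    from llama_index.llms.openai import OpenAI

    # Configure defaults once on the global Settings object instead of
    # passing a ServiceContext into every index and query engine.
    Settings.llm = OpenAI(model="gpt-3.5-turbo")
    Settings.chunk_size = 512
    ```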
    If you have questions, come hop in our Discord.
    Timeline:
    00:00-02:30 Intro
    02:30-09:30 Package Refactor Overview
    09:30-13:17 Package Refactor Usage Example
    13:17-16:00 Package Refactor Migration Guide
    16:00-18:17 LlamaHub
    18:17-23:28 Deprecate ServiceContext
    23:28 Contributing to LlamaIndex v0.10
    Additional Notes:
    - If you're installing in a virtual environment and running into installation/import issues, try deleting the virtual environment and starting fresh. Alternatively, force a clean reinstall:
    ```
    pip uninstall llama-index
    pip install llama-index --upgrade --no-cache-dir --force-reinstall
    ```
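    - With the package split, integrations now ship as separate pip packages alongside `llama-index-core`. A sketch of a fresh v0.10 install (integration package names follow the `llama-index-<type>-<name>` convention; check PyPI for the exact ones you need):
    ```
    pip install llama-index-core
    # integrations are installed individually, for example:
    pip install llama-index-llms-openai
    pip install llama-index-embeddings-huggingface
    ```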

Comments • 12

  • @gonzariosm
    @gonzariosm 4 months ago

    amazing changes :) and brilliant explanation 🎉

  • @unclecode
    @unclecode 4 months ago

    Kudos! Living without that annoying ServiceContext ;) especially combined with the global Settings, shrinks the codebase by up to 10x. Thanks for the video! It was a headache figuring out these new things without this intro.

  • @dayanemarcos1923
    @dayanemarcos1923 5 months ago +5

    Hello, are there plans for deeper DSPy integration for optimized prompt capabilities?

  • @user-kp7ms7jl3n
    @user-kp7ms7jl3n 5 months ago

    Thanks, very good. 😀

  • @ai-cowboy
    @ai-cowboy 5 months ago

    What's the process for contributing to READMEs? Is it the same as for packages and readers?

  • @bertobertoberto3
    @bertobertoberto3 5 months ago +2

    This seems like a mostly structural change, so all the demo videos should still work, just with the updated import paths, right?

    • @LlamaIndex
      @LlamaIndex 5 months ago +1

      yep exactly!

  • @ravvbike
    @ravvbike 5 months ago

    Hello, can you create a video showing the main differences between LlamaIndex and LangChain? When to use which, and so on. Thanks, amazing work!

    • @jordanmoyo434
      @jordanmoyo434 4 months ago

      Choosing between LlamaIndex and LangChain:
      When choosing between LlamaIndex and LangChain, consider the following factors:
      - Use case: LlamaIndex focuses on connecting LLMs to your own data (ingestion, indexing, and retrieval for RAG), while LangChain is a broader orchestration framework for chains, agents, and tool use. If your application centers on querying your documents, LlamaIndex may be more suitable; if you need general-purpose LLM pipelines and agent workflows, LangChain could be a better fit.
      - Resource availability: Assess the expertise and setup effort you can commit; each framework has its own abstractions and learning curve.
      - Community and ecosystem: Evaluate the documentation, community forums, and developer resources around each tool, since these affect ease of use and troubleshooting.
      Ultimately, the choice between LlamaIndex and LangChain depends on your specific needs, priorities, and available resources. The two also interoperate, so it may be worth experimenting with both before deciding.

  • @Pingu_astrocat21
    @Pingu_astrocat21 5 months ago

    ```
    from llama_index.embeddings.huggingface import HuggingFaceEmbedding
    from llama_index.core import Settings

    Settings.embed_model = HuggingFaceEmbedding(
        model_name="BAAI/bge-small-en-v1.5"
    )
    ```
    I tried to change to the new version. Can anyone help me figure out how to fix this? It's not working.
    It throws an AttributeError: module 'torch._subclasses' has no attribute 'functional_tensor'