Generative AI Serverless - Apply Guardrail, Bedrock Converse API, RAG - Chat with your document!

  • Premiered Sep 8, 2024
  • In this video, I am going to show you how to build a serverless GenAI RAG solution that implements a document-chat feature using the Amazon Bedrock Converse API and Lambda. I will also apply one of the newest features introduced in July 2024, ApplyGuardrail, so that we have control over the input prompt as well as the response returned to the calling app/consumer.
    Guardrails are a much-needed feature of Amazon Bedrock that let you control the content flowing through a Generative AI solution.
    The 'Chat with Document' feature supported by Amazon Bedrock is a form of RAG: it lets you have a contextual conversation and ask questions based on the data in a document, augmented by an LLM for Generative AI.
    RAG, which stands for Retrieval-Augmented Generation, is becoming increasingly popular in the world of Generative AI. It allows organizations to overcome the limitations of LLMs and use contextual data in their Generative AI solutions.
    I will use the recently released Anthropic Sonnet foundation model and invoke it via the Amazon Bedrock Converse API using Lambda.
    Lambda development: • How to develop AWS Lam...
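The document-chat flow described above can be sketched with boto3's `bedrock-runtime` client: the PDF is attached as a `document` content block next to the user's question, and a guardrail is applied to both the prompt and the model's answer via `guardrailConfig`. The guardrail ID/version and the document bytes below are placeholders; the model ID is the public Bedrock ID for Anthropic Claude 3.5 Sonnet, which I assume is the "recently released Sonnet" model the video refers to.

```python
MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"
GUARDRAIL_ID = "your-guardrail-id"   # placeholder -- use your guardrail's ID
GUARDRAIL_VERSION = "1"              # placeholder -- use your guardrail's version

def build_converse_request(doc_bytes, doc_name, question):
    """Assemble the kwargs for the Bedrock Converse API: the document is
    attached as a content block alongside the user's question, and the
    guardrail is applied to both the prompt and the model's response."""
    return {
        "modelId": MODEL_ID,
        "messages": [{
            "role": "user",
            "content": [
                {"document": {"name": doc_name, "format": "pdf",
                              "source": {"bytes": doc_bytes}}},
                {"text": question},
            ],
        }],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
        "guardrailConfig": {
            "guardrailIdentifier": GUARDRAIL_ID,
            "guardrailVersion": GUARDRAIL_VERSION,
        },
    }

def chat_with_document(doc_bytes, doc_name, question):
    """Invoke Converse, e.g. from inside a Lambda handler."""
    import boto3  # imported here so the request builder works without the SDK
    client = boto3.client("bedrock-runtime")
    response = client.converse(**build_converse_request(doc_bytes, doc_name, question))
    # The answer text is in the first content block of the output message.
    return response["output"]["message"]["content"][0]["text"]
```

In a Lambda, `chat_with_document` would be called from the handler with the uploaded document's bytes; the Converse API handles the prompt assembly, so no raw model-specific payload is needed.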
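The July 2024 ApplyGuardrail API mentioned above can also be called on its own, without any model invocation, to screen text before or after a call. A minimal sketch, assuming a placeholder guardrail ID and version:

```python
def apply_guardrail_to_text(text, source="INPUT",
                            guardrail_id="your-guardrail-id",  # placeholder
                            guardrail_version="1"):            # placeholder
    """Run a piece of text through a guardrail on its own: source is
    "INPUT" for user prompts, "OUTPUT" for model responses."""
    import boto3  # imported here so the helper below stays usable without the SDK
    client = boto3.client("bedrock-runtime")
    return client.apply_guardrail(
        guardrailIdentifier=guardrail_id,
        guardrailVersion=guardrail_version,
        source=source,
        content=[{"text": {"text": text}}],
    )

def guardrail_intervened(response):
    """True when the guardrail blocked or masked the content."""
    return response.get("action") == "GUARDRAIL_INTERVENED"
```

This lets the Lambda reject a prompt early (or sanitize a response) before returning anything to the calling app/consumer.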

Comments • 2

  • @srinivs1 • months ago

    Can you please share the GitHub code for the above example?

    • @CloudWithGirish • months ago

      Please send me an email - girish@cloudwithgirish.com. I will send you a link to the git repo.