Generative AI Serverless - Build a Travel Policy Assistant using Bedrock RAG KB, Lambda and API

  • Published on Sep 16, 2024
  • In this video, I am going to show you how to build a serverless Generative AI Retrieval Augmented Generation (RAG) solution that implements a single-document knowledge base using Amazon Bedrock, Lambda, and an API.
    With this solution, I will create a travel policy assistant that provides contextual responses grounded in the travel policy document written by the HR team. The policy document is stored in an S3 bucket and covers items such as the class of service allowed, expense limits for domestic versus international travel, and other policy details.
    'Chat with your document' is the latest Generative AI capability Amazon has added to Bedrock's already feature-rich set of GenAI, Knowledge Base, and RAG features.
    RAG, which stands for Retrieval Augmented Generation, is becoming increasingly popular in the world of Generative AI. It lets organizations work around the limitations of large language models (LLMs) and bring their own contextual data into Generative AI solutions.
    Amazon Bedrock is a fully managed service that offers a variety of foundation models, such as Anthropic Claude, AI21 Jurassic-2, Stability AI, Amazon Titan, and others.
    I will use the recently released Anthropic Claude 3 Sonnet foundation model and invoke it via Amazon Bedrock from Lambda, exposed through an API (minimal Python sketches of these calls follow below).
    Build Lambda function locally: • How to develop AWS Lam...
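The 'chat with your document' flow maps to Bedrock's RetrieveAndGenerate API with an external S3 source, so a single document can be queried without provisioning a knowledge base or vector store first. Below is a minimal Python/boto3 sketch of that call; the bucket name, document key, and region are hypothetical placeholders and the video's actual code may differ:

```python
import boto3

# Bedrock Agent Runtime exposes the RetrieveAndGenerate API
bedrock_agent = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Hypothetical S3 location of the HR travel policy document
POLICY_DOC_URI = "s3://hr-policy-docs/travel-policy.pdf"

# Anthropic Claude 3 Sonnet foundation model ARN (example region)
MODEL_ARN = (
    "arn:aws:bedrock:us-east-1::foundation-model/"
    "anthropic.claude-3-sonnet-20240229-v1:0"
)

def ask_travel_policy(question: str) -> str:
    """Answer a question grounded in the single travel policy document."""
    response = bedrock_agent.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            # EXTERNAL_SOURCES lets Bedrock chat with one document directly,
            # without building a vector index
            "type": "EXTERNAL_SOURCES",
            "externalSourcesConfiguration": {
                "modelArn": MODEL_ARN,
                "sources": [
                    {
                        "sourceType": "S3",
                        "s3Location": {"uri": POLICY_DOC_URI},
                    }
                ],
            },
        },
    )
    return response["output"]["text"]

if __name__ == "__main__":
    print(ask_travel_policy("What class of service is allowed for international flights?"))
```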
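To expose the assistant over an API, the Lambda function can read the question from the incoming request and return the generated answer as JSON. The handler below is a sketch assuming an API Gateway proxy-style event with a JSON body like {"question": "..."}; the field names, S3 URI, and model ARN are assumptions for illustration, not necessarily what the video uses:

```python
import json
import boto3

bedrock_agent = boto3.client("bedrock-agent-runtime")

# Hypothetical placeholders; in practice these would likely come from environment variables
POLICY_DOC_URI = "s3://hr-policy-docs/travel-policy.pdf"
MODEL_ARN = (
    "arn:aws:bedrock:us-east-1::foundation-model/"
    "anthropic.claude-3-sonnet-20240229-v1:0"
)

def lambda_handler(event, context):
    # Proxy integrations deliver the request body as a JSON string
    body = json.loads(event.get("body") or "{}")
    question = body.get("question", "")
    if not question:
        return {"statusCode": 400, "body": json.dumps({"error": "question is required"})}

    # Same single-document RetrieveAndGenerate call as in the previous sketch
    response = bedrock_agent.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "EXTERNAL_SOURCES",
            "externalSourcesConfiguration": {
                "modelArn": MODEL_ARN,
                "sources": [
                    {"sourceType": "S3", "s3Location": {"uri": POLICY_DOC_URI}}
                ],
            },
        },
    )

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"answer": response["output"]["text"]}),
    }
```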
