Building an Interactive Chatbot With Streamlit, LangChain, and Bedrock

This article walks you through creating a chatbot with a low-code Streamlit frontend, LangChain for conversation management, and an Amazon Bedrock LLM for generating responses.

By Karan Bansal · Oct. 16, 24 · Tutorial

In the ever-evolving landscape of AI, chatbots have become indispensable tools for enhancing user engagement and streamlining information delivery. This article will walk you through the process of building an interactive chatbot using Streamlit for the front end, LangChain for orchestrating interactions, and Anthropic’s Claude Model powered by Amazon Bedrock as the Large Language Model (LLM) backend. We'll dive into the code snippets for both the backend and front end and explain the key components that make this chatbot work.

Core Components

  • Streamlit frontend: Streamlit's intuitive interface lets us create a low-code, user-friendly chat interface with minimal effort. We'll explore how the code sets up the chat window, handles user input, and displays the chatbot's responses.
  • LangChain orchestration: LangChain manages the conversation flow and memory, ensuring the chatbot maintains context and provides relevant responses. We'll discuss how LangChain's ConversationSummaryBufferMemory and ConversationChain are integrated; a short standalone sketch of the memory class follows this list.
  • Bedrock/Claude LLM backend: The true magic lies in the LLM backend. We'll look at how to leverage Amazon Bedrock's Claude foundation model to generate intelligent and contextually aware responses.
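
Before walking through the full application, here is a minimal, standalone sketch of how ConversationSummaryBufferMemory behaves on its own. It is illustrative only: it assumes the backend module shown later in this article is saved as chatbot_backend.py and that AWS credentials with Bedrock model access are configured.

Python

# Illustrative sketch: ConversationSummaryBufferMemory in isolation.
# Assumes chatbot_backend.py (shown later) is importable and AWS credentials are set up.
from langchain.memory import ConversationSummaryBufferMemory
from chatbot_backend import demo_chatbot

llm = demo_chatbot()
memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=100)

# Store a couple of exchanges in memory.
memory.save_context({"input": "Hi, I'm planning a trip to Japan."},
                    {"output": "Great! I can help you build an itinerary."})
memory.save_context({"input": "I have five days and I love food."},
                    {"output": "Tokyo and Osaka are a good fit for that."})

# Recent turns are kept verbatim; once the buffer exceeds max_token_limit,
# older turns are condensed into a running summary generated by the LLM.
print(memory.load_memory_variables({})["history"])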

Chatbot Architecture

Conceptual Walkthrough of the Architecture

  1. User interaction: The user initiates the conversation by typing a message into the chat interface created by Streamlit. This message can be a question, a request, or any other form of input the user wishes to provide.
  2. Input capture and processing: Streamlit's chat input component captures the user's message and passes it on to the LangChain framework for further processing.
  3. Contextualization with LangChain memory: LangChain plays a crucial role in maintaining the context of the conversation. It combines the user's latest input with the relevant conversation history stored in its memory. This ensures that the chatbot has the necessary information to generate a meaningful and contextually appropriate response.
  4. Leveraging the LLM: The combined context is then sent to the Bedrock/Claude LLM. This powerful language model uses its vast knowledge and understanding of language to analyze the context and generate a response that addresses the user's input in an informative way.
  5. Response retrieval: LangChain receives the generated response from the LLM and prepares it for presentation to the user.
  6. Response display: Finally, Streamlit takes the chatbot's response and displays it in the chat window, making it appear as if the chatbot is engaging in a natural conversation with the user. This creates an intuitive and user-friendly experience, encouraging further interaction.
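
The six steps above collapse into one request/response cycle per user message. The following condensed, illustrative view maps the step numbers onto code; it assumes chain is the ConversationChain constructed in the backend section below.

Python

# Condensed view of one chat turn (step numbers refer to the walkthrough above).
def handle_turn(user_message, chain):
    # Steps 1-2: Streamlit captures the message and hands it to LangChain.
    # Step 3: the chain prepends the summarized history held in its memory.
    # Step 4: the combined context is sent to the Claude model on Amazon Bedrock.
    result = chain.invoke({"input": user_message})
    # Steps 5-6: the generated response is returned and rendered in the chat window.
    return result["response"]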

Code Snippets

Frontend (Streamlit)

Python
 
# 1 Import the Streamlit UI library and the chatbot backend module
import streamlit
import chatbot_backend as demo  # the backend file shown below, saved as chatbot_backend.py

# 2 Set the title for the chatbot
streamlit.title("Hi, this is your chatbot")

# 3 Add LangChain memory to the session cache (Session State)
if 'memory' not in streamlit.session_state:
    streamlit.session_state.memory = demo.demo_memory()

# 4 Add the UI chat history to the session cache (Session State)
if 'chat_history' not in streamlit.session_state:
    streamlit.session_state.chat_history = []

# 5 Re-render the chat history on every rerun
for message in streamlit.session_state.chat_history:
    with streamlit.chat_message(message["role"]):
        streamlit.markdown(message["text"])

# 6 Render the chatbot input box
input_text = streamlit.chat_input("Powered by Bedrock")

if input_text:
    # Display the user's message and record it in the UI history
    with streamlit.chat_message("user"):
        streamlit.markdown(input_text)
    streamlit.session_state.chat_history.append({"role": "user", "text": input_text})

    # Send the message (plus conversation memory) to the backend and display the reply
    chat_response = demo.demo_conversation(input_text=input_text,
                                           memory=streamlit.session_state.memory)
    with streamlit.chat_message("assistant"):
        streamlit.markdown(chat_response)
    streamlit.session_state.chat_history.append({"role": "assistant", "text": chat_response})
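
If the frontend above is saved as, say, chatbot_frontend.py (the file name is arbitrary) and the backend below as chatbot_backend.py, the app can be started locally with streamlit run chatbot_frontend.py and opened at the local URL Streamlit prints.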


Backend (LangChain and LLM)

Python
 
import time
import boto3
from langchain.chains import ConversationChain
from langchain.memory import ConversationSummaryBufferMemory
from langchain_aws import ChatBedrock

# 2a Function for invoking the model - client connection with Bedrock using region/profile and model_id
def demo_chatbot():
    boto3_session = boto3.Session(
        # Credentials are picked up from your environment or AWS profile;
        # alternatively, pass aws_access_key_id / aws_secret_access_key here.
        region_name='us-east-1'
    )
    llm = ChatBedrock(
        model_id="anthropic.claude-3-sonnet-20240229-v1:0",
        client=boto3_session.client('bedrock-runtime'),
        model_kwargs={
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 4096,  # Claude 3 Sonnet caps output at 4,096 tokens
            "temperature": 0.3,
            "top_p": 0.3,
            "stop_sequences": ["\n\nHuman:"]
        }
    )
    return llm

# 3 Function for ConversationSummaryBufferMemory (LLM and max token limit)
def demo_memory():
    llm_data = demo_chatbot()
    memory = ConversationSummaryBufferMemory(llm=llm_data, max_token_limit=20000)
    return memory

# 4 Function for the conversation chain - input text + memory
def demo_conversation(input_text, memory):
    llm_chain_data = demo_chatbot()

    # Initialize ConversationChain with the LLM and the shared memory;
    # the chain saves each exchange into the memory automatically.
    llm_conversation = ConversationChain(llm=llm_chain_data, memory=memory, verbose=True)

    # Invoke the chain and time the LLM call
    llm_start_time = time.time()
    chat_reply = llm_conversation.invoke({"input": input_text})
    llm_elapsed_time = time.time() - llm_start_time
    print(f"LLM call took {llm_elapsed_time:.2f} seconds")

    return chat_reply.get('response', 'No Response')
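
As a quick sanity check before wiring up the UI, the backend can be exercised on its own. The snippet below is a minimal sketch: it assumes the code above is saved as chatbot_backend.py (matching the frontend import) and that AWS credentials with access to the Claude 3 Sonnet model on Bedrock are configured; the prompts are arbitrary.

Python

# Streamlit-free smoke test for the backend module.
from chatbot_backend import demo_memory, demo_conversation

memory = demo_memory()
print(demo_conversation(input_text="What is Amazon Bedrock?", memory=memory))

# The shared memory lets the second turn refer back to the first.
print(demo_conversation(input_text="Summarize that answer in one sentence.", memory=memory))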


Conclusion

We've explored the fundamental building blocks of an interactive chatbot powered by Streamlit, LangChain, and a powerful LLM backend. This foundation opens doors to endless possibilities, from customer support automation to personalized learning experiences. Feel free to experiment, enhance, and deploy this chatbot for your specific needs and use cases.


Opinions expressed by DZone contributors are their own.
