Kritim Yantra
Apr 23, 2025
Have you ever chatted with a bot that forgets everything the moment you hit "send"?
It’s like talking to a goldfish. 🐟
But what if you want your chatbot to remember your name, your last question, or the topic you're discussing?
That’s where LangChain Memory comes in. In this blog, we’ll break down:
- What LangChain is
- Why LLMs forget by default
- The types of memory LangChain offers
- How to build a chatbot that actually remembers you
Let’s get into it 👇
What is LangChain?
LangChain is an open-source framework that makes it easier to build apps using LLMs (like ChatGPT or Claude).
You can think of it as a set of tools to:
✅ Connect your chatbot to custom data (like PDFs, websites)
✅ Make it interactive (use buttons, search, filters)
✅ Add memory and logic to conversations
LangChain works with models from OpenAI, Anthropic, Cohere, HuggingFace, and more.
Why Do Chatbots Need Memory?
By default, LLMs don’t remember your past messages unless you resend them with every request.
That leads to exchanges like this:
User: What is Python?
Assistant: Python is a programming language...
User: And who created it?
Assistant: I don’t know what “it” means.
That’s annoying, right?
With Memory, your chatbot can remember previous messages so that future replies make sense in context.
Think of it like a chat with a real person—they remember what you said before.
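To make the problem concrete, here’s a minimal sketch using the raw OpenAI Python client (the model name is just a placeholder): the follow-up question only works because we resend the earlier turns ourselves.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# First turn: the model sees only this one message
history = [{"role": "user", "content": "What is Python?"}]
reply = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=history,
)
answer = reply.choices[0].message.content

# Second turn: we must append and resend the whole exchange,
# or the model has no idea what "it" refers to
history.append({"role": "assistant", "content": answer})
history.append({"role": "user", "content": "And who created it?"})
reply = client.chat.completions.create(model="gpt-3.5-turbo", messages=history)
print(reply.choices[0].message.content)
```

LangChain’s memory classes exist to automate exactly this bookkeeping.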
LangChain offers a few types of memory:
- ConversationBufferMemory – stores the full chat history verbatim
- ConversationSummaryMemory – keeps a running, compressed summary of the chat
- ConversationBufferWindowMemory – keeps only the last k exchanges
- VectorStoreRetrieverMemory – retrieves relevant past messages from a vector store
In this blog, we’ll use the simplest one: ConversationBufferMemory.
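For reference, here’s how the simpler types are instantiated—a quick sketch, where the k value is arbitrary:

```python
from langchain.memory import (
    ConversationBufferMemory,
    ConversationBufferWindowMemory,
)

buffer_memory = ConversationBufferMemory()           # full history, verbatim
window_memory = ConversationBufferWindowMemory(k=3)  # only the last 3 exchanges
# ConversationSummaryMemory(llm=...) needs an LLM to write its summaries;
# VectorStoreRetrieverMemory(retriever=...) needs a vector-store retriever.
```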
Let’s build a memory-enabled chatbot using LangChain + OpenAI.
Install the packages:
```bash
pip install openai langchain python-dotenv
```
Create a .env file with your OpenAI key:
OPENAI_API_KEY=your_openai_api_key
```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from dotenv import load_dotenv

load_dotenv()  # loads OPENAI_API_KEY from .env

# Load the model
llm = ChatOpenAI(temperature=0.7)

# Add memory
memory = ConversationBufferMemory()

# Create the conversation chain
conversation = ConversationChain(
    llm=llm,
    memory=memory,
    verbose=True,
)

# Chat loop
while True:
    user_input = input("You: ")
    if user_input.lower() in ["exit", "quit"]:
        break
    response = conversation.predict(input=user_input)
    print(f"Bot: {response}")
```
Run the script and try it out:
You: My name is Ajay.
Bot: Nice to meet you, Ajay!
You: What is the capital of France?
Bot: The capital of France is Paris.
You: And what's my name?
Bot: Your name is Ajay.
Amazing! The bot remembers your name from earlier in the chat. 🧠✅
How Does the Memory Work?
The memory stores both your messages and the bot’s replies. Every time you send a new message, the model also receives the conversation so far:
History:
User: My name is Ajay.
Bot: Nice to meet you, Ajay!
User: What is the capital of France?
Current Input:
And what's my name?
This is how the model stays in context, just like a human would.
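You can peek at this stored history yourself. A small sketch (the example exchange is made up):

```python
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory()

# save_context records one user/bot exchange
memory.save_context(
    {"input": "My name is Ajay."},
    {"output": "Nice to meet you, Ajay!"},
)

# load_memory_variables returns what gets injected into the prompt
print(memory.load_memory_variables({}))
# {'history': 'Human: My name is Ajay.\nAI: Nice to meet you, Ajay!'}
```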
ConversationSummaryMemory
To keep memory without hitting token limits, use:
```python
from langchain.memory import ConversationSummaryMemory

memory = ConversationSummaryMemory(llm=llm)
```
It keeps a compressed summary of your chats—great for long-form interactions.
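It drops straight into the same ConversationChain from before—only the memory object changes. A sketch (the sample inputs are illustrative):

```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationSummaryMemory

llm = ChatOpenAI(temperature=0.7)

# The same LLM is also used to write the running summary
memory = ConversationSummaryMemory(llm=llm)
conversation = ConversationChain(llm=llm, memory=memory, verbose=True)

conversation.predict(input="My name is Ajay and I'm learning LangChain.")
conversation.predict(input="Today I built my first memory-enabled chatbot.")

# Instead of the raw transcript, the memory holds a compressed summary
print(memory.buffer)
```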
When Do You Actually Need Memory?

| Use Case | Memory Needed? |
|---|---|
| Customer support bot | ✅ Yes |
| Educational tutor | ✅ Yes |
| Q&A from a PDF | ❌ Not required (stateless) |
| AI diary assistant | ✅ Definitely |
Memory = human-like = more useful assistant.
Now that you’ve learned how to add memory, here’s what you can explore next:
- ConversationBufferWindowMemory, to cap how much history you keep
- VectorStoreRetrieverMemory, for recalling relevant messages from long histories
- Connecting your chatbot to custom data like PDFs and websites
In this blog, you learned:
- What LangChain is and why it’s useful
- Why LLMs forget by default
- How to build a chatbot with ConversationBufferMemory
- How ConversationSummaryMemory keeps long chats within token limits
And now you’re one step closer to building your own personal AI assistant!