Your First LangChain App
Overview
In this lesson, you’ll build your first LangChain-powered chatbot using OpenAI’s GPT models. This basic chatbot takes user input and responds like a conversational assistant. You’ll learn how to connect an LLM, construct prompts, and tie everything together with LangChain’s `LLMChain`.
Learning Objectives
- Use `ChatOpenAI` from LangChain to access OpenAI’s GPT models
- Use `PromptTemplate` to dynamically construct prompts
- Build a working chatbot with user input and model output
- Understand the flow of a simple `LLMChain`
Section 1: Project Setup Recap
Before starting, make sure you’ve:
- Installed the required packages: `langchain`, `openai`, `python-dotenv`
- Set up your `.env` file with `OPENAI_API_KEY=your-openai-api-key-here` (you can sanity-check this with the snippet below)
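If you want to confirm the key is actually being picked up before writing any chatbot code, a quick check like this works (a minimal sketch; it only verifies that `python-dotenv` can see `OPENAI_API_KEY` and sends nothing to OpenAI):

```python
import os
from dotenv import load_dotenv

# Read variables from the .env file into the process environment
load_dotenv()

# True means the key was found; no request is made to OpenAI here
print("OPENAI_API_KEY loaded:", os.getenv("OPENAI_API_KEY") is not None)
```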
Section 2: Code – Building Your First Chatbot
Create a file called `simple_chatbot.py` and add the following code:
```python
import os
from dotenv import load_dotenv
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Load API key from .env file
load_dotenv()

# Initialize the LLM
llm = ChatOpenAI(temperature=0.7)

# Define the prompt template
prompt = PromptTemplate(
    input_variables=["user_input"],
    template="You are a helpful assistant. Answer the following:\n{user_input}"
)

# Create the LLM chain
chat_chain = LLMChain(llm=llm, prompt=prompt)

# Run in a loop to simulate a chatbot
while True:
    user_input = input("User: ")
    if user_input.lower() in ["exit", "quit"]:
        print("Goodbye!")
        break
    response = chat_chain.run(user_input=user_input)
    print("Bot:", response)
```
Section 3: How It Works
- `ChatOpenAI`: connects to an OpenAI chat model (e.g., GPT-3.5 or GPT-4)
- `PromptTemplate`: dynamically inserts the user’s input into a formatted prompt (see the sketch after this list)
- `LLMChain`: combines the prompt and the model into a single response-generation step
- The `while True` loop keeps the interaction going until the user exits
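To see what `PromptTemplate` produces before any model call, you can format it by hand. This is a small sketch that reuses the same template from `simple_chatbot.py`; the question text is just an example:

```python
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(
    input_variables=["user_input"],
    template="You are a helpful assistant. Answer the following:\n{user_input}"
)

# format() only fills in the placeholder; no LLM is called here
filled = prompt.format(user_input="What is the capital of France?")
print(filled)
# You are a helpful assistant. Answer the following:
# What is the capital of France?
```

The `LLMChain` then sends this filled-in prompt to `ChatOpenAI` and returns the model’s text as the response.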
Section 4: Run the Chatbot
In your terminal:

```bash
python simple_chatbot.py
```
Sample Output:
```
User: What is the capital of France?
Bot: The capital of France is Paris.
```
Section 5: Tips for Improvement
- Set `temperature=0.0` for more deterministic, factual responses
- Add logging or conversation memory (coming in later lessons)
- Wrap the chatbot into a function or API for deployment (see the sketch after this list)
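As a starting point for the last tip, here is a minimal sketch of wrapping the chain in a plain function. The function name `ask_bot` is just an illustration; the chain setup is the same as in `simple_chatbot.py`:

```python
import os
from dotenv import load_dotenv
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

load_dotenv()

prompt = PromptTemplate(
    input_variables=["user_input"],
    template="You are a helpful assistant. Answer the following:\n{user_input}"
)
chat_chain = LLMChain(llm=ChatOpenAI(temperature=0.0), prompt=prompt)

def ask_bot(question: str) -> str:
    """Send one question through the chain and return the model's reply."""
    return chat_chain.run(user_input=question)

# Example usage: a script, web route, or test can now call ask_bot() directly
if __name__ == "__main__":
    print(ask_bot("What is the capital of France?"))
```

Keeping the chain behind a single function makes it easier to expose later through a web framework without touching the prompt logic.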
Mini Task
Modify the prompt so the bot answers like a teacher explaining to a 10-year-old:

```python
template="You are a teacher. Explain to a 10-year-old:\n{user_input}"
```
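Dropping that template into the existing script only changes the `PromptTemplate` definition; everything else stays the same. A sketch of the modified prompt:

```python
from langchain.prompts import PromptTemplate

# Same input variable as before; only the instruction text changes
prompt = PromptTemplate(
    input_variables=["user_input"],
    template="You are a teacher. Explain to a 10-year-old:\n{user_input}"
)
```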
Quick Recap Quiz
- What does `PromptTemplate` do in LangChain?
- What is the role of `LLMChain`?
- How would you stop the chatbot loop?