Chat Conversations with Gemini
Maintain context over multiple messages in a conversation using Gemini's chat functionality.
TypeScript

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);
const model = genAI.getGenerativeModel({ model: "gemini-2.0-flash" });

// Start a chat session seeded with prior history.
// Each history entry's `parts` is an array of Part objects.
const chat = model.startChat({
  history: [
    {
      role: "user",
      parts: [{ text: "Hello, Gemini." }],
    },
    {
      role: "model",
      parts: [{ text: "Hi there! How can I assist you today?" }],
    },
  ],
});

// Continue the conversation; each call carries the accumulated history
const result1 = await chat.sendMessage("I have 3 cats. How many paws are in total?");
console.log(result1.response.text());

const result2 = await chat.sendMessage("Can you translate that answer into French?");
console.log(result2.response.text());
```
Python

```python
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])

# Initialize the model
model = genai.GenerativeModel("gemini-2.0-flash")

# Start a chat session seeded with prior history
chat = model.start_chat(history=[
    {"role": "user", "parts": ["Hello, Gemini."]},
    {"role": "model", "parts": ["Hi there! How can I assist you today?"]},
])

# Continue the conversation; the session carries the accumulated history
response1 = chat.send_message("I have 3 cats. How many paws are in total?")
print(response1.text)

response2 = chat.send_message("Can you translate that answer into French?")
print(response2.text)
```
Chat Configuration Options
- `history`: Array of previous messages
- `context`: Additional context for the conversation
- `examples`: Example interactions to guide the model
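If the SDK version you are using accepts only `history` when starting a chat, one way to supply context and examples is to fold them into the history as seed turns. Below is a sketch of that pattern; `build_history` is a hypothetical helper, not part of the SDK:

```python
def build_history(context=None, examples=None):
    """Fold a context string and (user, model) example pairs into a
    chat history list. Hypothetical helper illustrating the pattern;
    not part of the Gemini SDK."""
    history = []
    if context:
        # Present the context as an opening user turn the model acknowledges
        history.append({"role": "user", "parts": [context]})
        history.append({"role": "model", "parts": ["Understood."]})
    for user_text, model_text in (examples or []):
        history.append({"role": "user", "parts": [user_text]})
        history.append({"role": "model", "parts": [model_text]})
    return history

seed = build_history(
    context="Answer in one short sentence.",
    examples=[("Hello", "Hi!")],
)
```

The resulting list can be passed directly as the `history` argument when starting the chat.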
Best Practices for Chat
- Keep conversation history concise
- Provide clear context when needed
- Use examples to guide model behavior
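Keeping the history concise can be as simple as trimming it to the most recent messages before each request. A minimal sketch, assuming history entries are dicts like those above (`trim_history` is a hypothetical helper):

```python
def trim_history(history, max_messages=6):
    """Keep only the most recent messages so the context stays small.
    max_messages counts individual messages, not user/model pairs."""
    return history[-max_messages:]

history = [
    {"role": "user", "parts": ["Hello, Gemini."]},
    {"role": "model", "parts": ["Hi there! How can I assist you today?"]},
    {"role": "user", "parts": ["I have 3 cats."]},
    {"role": "model", "parts": ["Cats make great companions."]},
    {"role": "user", "parts": ["How many paws is that in total?"]},
    {"role": "model", "parts": ["3 cats x 4 paws = 12 paws."]},
]

# Keep only the last 4 messages
recent = trim_history(history, max_messages=4)
```

More elaborate strategies (summarizing older turns, keeping the first message as pinned context) follow the same shape: transform the list before passing it as `history`.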