Examples and Use Cases

Memoer is designed to enhance your LLM applications by providing robust memory management. Here are some common use cases and examples.

Chatbot with Contextual Memory

Create a chatbot that remembers previous interactions:

import { memoer, ConversationStrategy } from "memoer";
import { OpenAI } from "openai";

// Initialize Memoer memory
const memory = memoer.createMemory({
  id: "support-bot",
  systemMessage: {
    role: "system",
    content: "You are a helpful customer support assistant."
  },
  managers: {
    conversation: {
      strategy: ConversationStrategy.SLIDING_WINDOW,
      slidingWindowSize: 10
    }
  }
});

// Initialize the OpenAI client
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY
});

// Handle a new message
async function handleMessage(userMessage: string) {
  // Add the user message to memory
  memory.conversation.add({
    role: "user",
    content: userMessage
  });

  // Get optimized context from memory
  const context = await memory.conversation.getContext();

  // Generate a response from the LLM
  const completion = await openai.chat.completions.create({
    model: "gpt-4",
    messages: context
  });

  const assistantMessage = completion.choices[0].message;

  // Add the assistant response to memory
  memory.conversation.add(assistantMessage);

  return assistantMessage.content;
}
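To picture what `ConversationStrategy.SLIDING_WINDOW` is doing here, the following is a minimal, self-contained sketch of the idea (illustrative only, not Memoer's actual internals): the system message stays pinned, and only the most recent `slidingWindowSize` messages are kept in the context.

```typescript
// Illustrative sketch of a sliding-window strategy (not Memoer internals).
type Message = { role: "system" | "user" | "assistant"; content: string };

// Keep the system message pinned, plus only the last `windowSize` turns.
function slidingWindowContext(
  systemMessage: Message,
  history: Message[],
  windowSize: number
): Message[] {
  return [systemMessage, ...history.slice(-windowSize)];
}
```

With `slidingWindowSize: 10`, a 50-message history would be trimmed down to the system message plus the 10 most recent messages, keeping the prompt size bounded no matter how long the chat runs.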

Multi-Turn Question Answering

Implement a system that can answer follow-up questions:

import { memoer } from "memoer";

// Initialize memory with the default conversation strategy
const memory = memoer.createMemory({
  id: "qa-system",
  systemMessage: {
    role: "system",
    content:
      "You are a question answering system that provides accurate and helpful information."
  }
});

// Example conversation flow
async function demonstrateQA() {
  // First question
  memory.conversation.add({
    role: "user",
    content: "What are the benefits of using TypeScript?"
  });

  // Simulated assistant response
  memory.conversation.add({
    role: "assistant",
    content:
      "TypeScript offers several benefits including static typing, better IDE support, improved code quality through early error detection, and enhanced code maintainability for large projects."
  });

  // Follow-up question
  memory.conversation.add({
    role: "user",
    content: "How does it compare to JavaScript?"
  });

  // Get context for the LLM
  const context = await memory.conversation.getContext();
  console.log("Context for LLM:", context);

  // The LLM now has the context of the previous question and answer,
  // allowing it to understand that "it" refers to TypeScript.
}
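The value of that context is easy to see if you flatten it into a single prompt string. The sketch below is illustrative (it assumes messages are plain role/content pairs, which may differ from Memoer's internal shape): because the earlier turns precede the follow-up, the pronoun "it" is unambiguous to the model.

```typescript
// Sketch: flatten a message array into one prompt string (illustrative only).
type Msg = { role: string; content: string };

function flattenContext(messages: Msg[]): string {
  return messages.map((m) => `${m.role}: ${m.content}`).join("\n");
}

const qaContext: Msg[] = [
  { role: "user", content: "What are the benefits of using TypeScript?" },
  { role: "assistant", content: "TypeScript offers static typing and better tooling." },
  { role: "user", content: "How does it compare to JavaScript?" }
];

// The earlier mention of TypeScript appears before the follow-up question,
// so the model can resolve what "it" refers to.
const prompt = flattenContext(qaContext);
```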

Long-Running Support Sessions

For extended customer support scenarios:

import { memoer, ConversationStrategy } from "memoer";

// Initialize memory with the summarization strategy
const memory = memoer.createMemory({
  id: "support-session-123",
  systemMessage: {
    role: "system",
    content:
      "You are a technical support specialist helping with software issues."
  },
  managers: {
    conversation: {
      strategy: ConversationStrategy.SUMMARY,
      summaryInterval: 10
    }
  }
});

// Example of a long support session
async function longSupportSession() {
  // Initial problem description
  memory.conversation.add({
    role: "user",
    content:
      "I'm having trouble installing your software. It gets to 80% and then shows an error code E-1234."
  });

  // Multiple back-and-forth messages...
  // (20+ messages exchanged)

  // Later in the conversation, the context will contain a summary
  // of the earlier troubleshooting steps plus the most recent messages.
  const context = await memory.conversation.getContext();

  // The LLM can still reference earlier troubleshooting attempts
  // without exceeding token limits.
}
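As a rough mental model (hypothetical, not Memoer's actual algorithm), a summary strategy collapses everything older than the most recent `summaryInterval` messages into a single synthetic summary message. In practice the summary would be written by an LLM; the sketch below simply joins the older contents to show the shape of the resulting context.

```typescript
// Illustrative sketch of a summary strategy (not Memoer internals).
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Once the history exceeds `interval`, collapse everything except the most
// recent `interval` messages into one summary message. A real implementation
// would generate the summary with an LLM instead of concatenating text.
function summarizedContext(history: ChatMessage[], interval: number): ChatMessage[] {
  if (history.length <= interval) return history;
  const older = history.slice(0, history.length - interval);
  const recent = history.slice(-interval);
  const summary: ChatMessage = {
    role: "system",
    content: `Summary of earlier conversation: ${older.map((m) => m.content).join(" ")}`
  };
  return [summary, ...recent];
}
```

This is why a 25-message support session still fits in a bounded prompt: the model sees one summary message plus the 10 most recent turns, rather than all 25 messages verbatim.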

These examples demonstrate how Memoer can enhance your applications with efficient memory management while keeping your code clean and maintainable.