Getting Started with Memoer

Let's discover Memoer - a powerful memory management system for Large Language Models (LLMs).

What is Memoer?

Memoer is a specialized memory management system designed for LLMs that provides:

  • Easy-to-use abstraction for memory management and optimization
  • Native compatibility with Vercel's AI SDK
  • State-of-the-art (SOTA) memory management techniques

Installation

Get started by installing Memoer:

npm install memoer
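
Memoer is distributed as a regular npm package, so it should also install with other Node package managers if you prefer them:

```shell
# Equivalent installs from the npm registry
pnpm add memoer
yarn add memoer
```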

Basic Usage

Here's a simple example of how to use Memoer:

import { memoer, MemoryConfig, ConversationStrategy } from "memoer";

// Create a new memory configuration
const memoryConfig: MemoryConfig = {
  id: "conversation-1",
  systemMessage: {
    role: "system",
    content: "You are a helpful assistant."
  },
  managers: {
    conversation: {
      // Optional: configure the conversation strategy
      strategy: ConversationStrategy.SLIDING_WINDOW,
      slidingWindowSize: 10 // Number of messages to keep in context
    }
  }
};

// Initialize the memory
const memory = memoer.createMemory(memoryConfig);

// Add a message to the conversation
memory.conversation.add({
  role: "user",
  content: "Hello, how are you today?"
});

// Get the optimized conversation context (shaped by the configured strategy)
const context = await memory.conversation.getContext();

// Get the full conversation history, regardless of strategy
const fullHistory = await memory.conversation.getFullContext();
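
To build intuition for what `SLIDING_WINDOW` with `slidingWindowSize: 10` implies, here is a minimal, self-contained sketch of a sliding-window strategy. This is an illustration only, not Memoer's internal implementation; the `Message` type and `slidingWindowContext` helper are hypothetical names introduced for this example:

```typescript
// Hypothetical types/helpers for illustration — not Memoer's internals.
type Message = { role: "system" | "user" | "assistant"; content: string };

// A sliding-window strategy keeps the system message plus the N most
// recent conversation messages.
function slidingWindowContext(
  systemMessage: Message,
  history: Message[],
  windowSize: number
): Message[] {
  return [systemMessage, ...history.slice(-windowSize)];
}

// With a window of 10 and 25 messages of history, only the last 10
// messages (plus the system message) are kept in context.
const system: Message = { role: "system", content: "You are a helpful assistant." };
const history: Message[] = Array.from({ length: 25 }, (_, i) => ({
  role: i % 2 === 0 ? "user" : "assistant",
  content: `message ${i + 1}`,
}));

const context = slidingWindowContext(system, history, 10);
console.log(context.length);     // 11: system message + 10 most recent
console.log(context[1].content); // "message 16"
```

In this model, `getContext()` corresponds to the windowed view while `getFullContext()` corresponds to the untrimmed `history` array.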

Key Features

  • Easy Integration: Designed to be simple to implement in any LLM application
  • Vercel AI SDK Support: Built directly on top of Vercel's AI SDK, a widely used unified interface for LLM providers
  • SOTA Memory Management: Advanced memory techniques to optimize context management for LLMs
  • Flexible Strategies: Configure different memory strategies like sliding windows to suit your needs

Next Steps

Explore the documentation to learn more about Memoer's capabilities and how to integrate it into your LLM applications.