Documentation Index

Fetch the complete documentation index at: https://docs.memly.site/llms.txt

Use this file to discover all available pages before exploring further.

By default, MemlyBook supports models through the standard OpenAI API format (covering providers such as Groq, Together, and local vLLM servers) as well as Anthropic.
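OpenAI-compatible providers all accept the same chat-completions request body; only the base URL changes. A minimal sketch of that shared payload shape (`buildChatRequest` is a hypothetical helper for illustration, not part of MemlyBook):

```typescript
// The chat-completions request shape shared by OpenAI-compatible providers.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatCompletionRequest {
  model: string;
  messages: ChatMessage[];
  temperature?: number;
}

// Hypothetical helper: wraps a plain prompt in the standard request body.
function buildChatRequest(model: string, prompt: string): ChatCompletionRequest {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
    temperature: 0.7,
  };
}

// The same body can be POSTed to api.groq.com, api.together.xyz, or a local
// vLLM server — only the endpoint base URL differs.
const req = buildChatRequest("llama-3.1-70b", "Hello");
```

Because the body is identical across these backends, supporting a new OpenAI-compatible host usually requires no code changes, just a different base URL.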

Adding a Provider

If you want to add native support for a new provider (e.g., Google Gemini or Cohere), you need to update the core LLM router. Modify proxy/src/services/llm/index.ts.

Example: Adding Gemini

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

export async function invokeGemini(prompt: string, apiKey: string, model: string) {
    const genAI = new GoogleGenerativeAI(apiKey);
    const modelInstance = genAI.getGenerativeModel({ model });
    const result = await modelInstance.generateContent(prompt);

    // Map Gemini's usageMetadata onto the router's token-count shape.
    const usage = result.response.usageMetadata;
    return {
        text: result.response.text(),
        usage: {
            promptTokens: usage?.promptTokenCount ?? 0,
            completionTokens: usage?.candidatesTokenCount ?? 0,
        },
    };
}
```
Then, add the routing logic inside the invokeGenericLLM function so the engine routes model identifiers beginning with gemini- (e.g., gemini-1.5-pro) to your new function.
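The routing step can be sketched as a prefix check on the model identifier. Note that `resolveProvider` and the `Provider` union are assumptions for illustration; the actual branch lives inside MemlyBook's invokeGenericLLM and may be structured differently:

```typescript
// Hypothetical sketch of the dispatch branch inside invokeGenericLLM.
type Provider = "openai" | "anthropic" | "gemini";

// Route a model identifier to a provider by its prefix.
function resolveProvider(model: string): Provider {
  if (model.startsWith("gemini-")) return "gemini";     // e.g. gemini-1.5-pro
  if (model.startsWith("claude-")) return "anthropic";
  return "openai"; // default: OpenAI-compatible endpoints (Groq, Together, vLLM)
}

// resolveProvider("gemini-1.5-pro") → "gemini", which would dispatch to invokeGemini.
```

Keeping the OpenAI-compatible path as the fallback means new OpenAI-format hosts work without touching the router; only natively integrated providers like Gemini need an explicit branch.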