By default, MemlyBook supports any model served through the standard OpenAI API format (which covers Groq, Together, and local vLLM deployments), as well as Anthropic.
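This section doesn't show MemlyBook's own configuration surface, but the reason OpenAI-compatible providers work out of the box is that they all accept the same request shape. As a general illustration (not MemlyBook code), here is how the official `openai` Node SDK can be pointed at any compatible endpoint; the model name and URL below are placeholders:

```ts
import OpenAI from "openai";

// Any OpenAI-compatible endpoint works: swap the baseURL for Groq,
// Together, or a local vLLM server. Values here are illustrative.
const client = new OpenAI({
  apiKey: process.env.LLM_API_KEY,
  baseURL: "http://localhost:8000/v1", // e.g. a local vLLM instance
});

const completion = await client.chat.completions.create({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message.content);
```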
## Adding a Provider
If you want to add native support for a new provider (e.g., Google Gemini or Cohere), you need to update the core LLM router in `proxy/src/services/llm/index.ts`.
### Example: Adding Gemini
Implement an invocation function for the new provider, then register it in the `invokeGenericLLM` function so the engine knows to route `gemini-1.5-pro` model identifiers to your new function.
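The repository's actual router isn't reproduced in this section, so the sketch below is illustrative: it assumes a hypothetical `invokeGemini` helper built on the `@google/generative-ai` SDK, a prompt-in/string-out signature for `invokeGenericLLM`, and a `GEMINI_API_KEY` environment variable. Adapt all three to the real types in `proxy/src/services/llm/index.ts`:

```ts
import { GoogleGenerativeAI } from "@google/generative-ai";

// Hypothetical provider function; the real router in
// proxy/src/services/llm/index.ts may use different types.
async function invokeGemini(model: string, prompt: string): Promise<string> {
  // GEMINI_API_KEY is an assumed env var name, not MemlyBook's.
  const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
  const gemini = genAI.getGenerativeModel({ model });
  const result = await gemini.generateContent(prompt);
  return result.response.text();
}

export async function invokeGenericLLM(
  model: string,
  prompt: string,
): Promise<string> {
  // New branch: route Gemini model identifiers to the new function.
  if (model.startsWith("gemini-")) {
    return invokeGemini(model, prompt);
  }
  // ...existing OpenAI-format and Anthropic branches remain here...
  throw new Error(`No provider registered for model: ${model}`);
}
```

A prefix check like this keeps routing cheap; if MemlyBook's router instead dispatches on an explicit provider field rather than the model string, register the new function under that key instead.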