Overview
Exposing API endpoints that let AI models reference your website content is crucial for building intelligent features such as chatbots, search assistants, and content recommendations. These endpoints provide clean, structured access to your content in formats optimized for large language models (LLMs), enabling AI-powered interactions with your documentation, blog posts, and other web content.
How LLM Endpoints Work
LLM endpoints transform your content into AI-readable formats, typically markdown or plain text optimized for LLM processing. Here's how they work:
Content Export Endpoints
Quick Setup: Create llm.md endpoints to export pages as AI-ready markdown.
```ts
import { createMarkdownExportHandler } from '@kit/notion/api';

// notionConfig comes from your project's Notion configuration;
// here the handler exports content from the 'posts' collection.
export const GET = createMarkdownExportHandler(notionConfig, 'posts');
```
What it does:
- Takes slug from URL → fetches content → converts to markdown → returns clean text
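As a rough illustration of that flow, a hand-written version of such a route handler might look like the sketch below. This assumes a Next.js App Router route; the helper functions are placeholders for your own content loading and conversion logic, not internals of @kit/notion/api, which wires this up for you.

```ts
// Illustrative only: what the generated handler does conceptually.
type Page = { title: string; body: string };

// Placeholder: look the page up in your content source (e.g. Notion).
async function fetchPageBySlug(slug: string): Promise<Page | null> {
  return { title: slug, body: `Content for ${slug}` };
}

// Placeholder: flatten the fetched page into plain markdown.
function toMarkdown(page: Page): string {
  return `# ${page.title}\n\n${page.body}`;
}

export async function GET(
  _request: Request,
  { params }: { params: { slug: string } },
) {
  // 1. Take the slug from the URL and fetch the matching content
  const page = await fetchPageBySlug(params.slug);

  if (!page) {
    return new Response('Not found', { status: 404 });
  }

  // 2. Convert it to markdown and return it as clean text
  return new Response(toMarkdown(page), {
    headers: { 'Content-Type': 'text/markdown; charset=utf-8' },
  });
}
```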
Benefits for AI Integration
These endpoints enable powerful AI features:
- Context-aware chatbots that reference your specific documentation (see the sketch after this list)
- Smart search that understands content relationships
- Content recommendations based on user queries
- Automated content summarization
- Cross-referencing between related pages
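For example, a context-aware chatbot can fetch the exported markdown and prepend it to the model prompt as grounding context. A minimal sketch, assuming an llm.md route shaped like the one above (the URL pattern and function name are illustrative):

```ts
// Build a documentation-grounded prompt from a page's AI-ready markdown.
async function buildGroundedPrompt(
  baseUrl: string,
  slug: string,
  question: string,
): Promise<string> {
  // Assumed URL shape for the markdown export endpoint
  const res = await fetch(`${baseUrl}/posts/${slug}/llm.md`);

  if (!res.ok) {
    throw new Error(`Failed to load context for "${slug}": ${res.status}`);
  }

  const context = await res.text();

  return [
    'Answer the question using only the documentation below.',
    '--- DOCUMENTATION ---',
    context,
    '--- QUESTION ---',
    question,
  ].join('\n\n');
}
```

The resulting string can be passed to whichever LLM client your chatbot uses; the endpoint simply guarantees the context is clean markdown rather than rendered HTML.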
AskAI Component Integration
This LLM endpoint infrastructure enables the implementation of AI-powered components like the AskAI feature. The AskAI component uses these endpoints to:
- Fetch relevant content based on user queries
- Provide contextual answers using your documentation
- Maintain consistency with your content structure
- Offer interactive help directly in your documentation pages
The AskAI component will be covered in detail in the next documentation section.