
LLMs

Learn how to expose content for AI models.

Overview

Exposing API endpoints for AI models to reference your website content is crucial for building intelligent features like chatbots, search assistants, and content recommendations. These endpoints provide clean, structured access to your content in formats optimized for large language models (LLMs), enabling AI-powered interactions with your documentation, blog posts, and other web content.

How LLM Endpoints Work

LLM endpoints transform your content into AI-readable formats, typically markdown or plain text optimized for LLM processing. Here's how they work:

Content Export Endpoints

Quick Setup: Create llm.md endpoints to export pages as AI-ready markdown.

/blog/[slug]/llm.md/route.ts
import { createMarkdownExportHandler } from '@kit/notion/api';
// Import your Notion configuration; adjust the path to wherever it lives in your project.
import { notionConfig } from '~/config/notion.config';
 
export const GET = createMarkdownExportHandler(notionConfig, 'posts');

What it does:

  • Takes slug from URL → fetches content → converts to markdown → returns clean text
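The flow above can be sketched as a plain handler. This is a minimal illustration of what a markdown export endpoint does internally, not the actual `createMarkdownExportHandler` implementation: the `Block` type, the in-memory `posts` store, and `fetchPostBySlug` are simplified stand-ins for the real Notion fetch.

```typescript
// Simplified block model standing in for Notion's content blocks.
type Block =
  | { type: 'heading'; level: number; text: string }
  | { type: 'paragraph'; text: string }
  | { type: 'bullet'; text: string };

// Hypothetical in-memory store; the real handler fetches from Notion.
const posts: Record<string, Block[]> = {
  'hello-world': [
    { type: 'heading', level: 1, text: 'Hello World' },
    { type: 'paragraph', text: 'A first post.' },
    { type: 'bullet', text: 'One takeaway.' },
  ],
};

function fetchPostBySlug(slug: string): Block[] | undefined {
  return posts[slug];
}

// Convert structured blocks into clean, LLM-ready markdown.
function blocksToMarkdown(blocks: Block[]): string {
  return blocks
    .map((b) => {
      switch (b.type) {
        case 'heading':
          return `${'#'.repeat(b.level)} ${b.text}`;
        case 'paragraph':
          return b.text;
        case 'bullet':
          return `- ${b.text}`;
      }
    })
    .join('\n\n');
}

// Route-style handler: slug in, plain markdown text out (404 if unknown).
function handleLlmExport(slug: string): { status: number; body: string } {
  const blocks = fetchPostBySlug(slug);
  if (!blocks) return { status: 404, body: 'Not found' };
  return { status: 200, body: blocksToMarkdown(blocks) };
}
```

Because the output is plain markdown rather than HTML, an LLM can consume it directly without any scraping or cleanup step.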

Benefits for AI Integration

These endpoints enable powerful AI features:

  • Context-aware chatbots that reference your specific documentation
  • Smart search that understands content relationships
  • Content recommendations based on user queries
  • Automated content summarization
  • Cross-referencing between related pages

AskAI Component Integration

This LLM endpoint infrastructure underpins AI-powered components such as the AskAI feature. The AskAI component uses these endpoints to:

  1. Fetch relevant content based on user queries
  2. Provide contextual answers using your documentation
  3. Maintain consistency with your content structure
  4. Offer interactive help directly in your documentation pages
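The first two steps can be sketched as follows. This is an illustrative sketch, not the AskAI component's actual internals: `buildPrompt` and `askAboutPost` are hypothetical helper names, and the endpoint path assumes the `/blog/[slug]/llm.md` route shown earlier.

```typescript
// Build a grounded prompt: the exported markdown becomes the only
// context the model is allowed to answer from.
function buildPrompt(docMarkdown: string, question: string): string {
  return [
    'Answer using only the documentation below.',
    'If the answer is not in the documentation, say so.',
    '',
    '--- DOCUMENTATION ---',
    docMarkdown,
    '--- END DOCUMENTATION ---',
    '',
    `Question: ${question}`,
  ].join('\n');
}

// Fetch the AI-ready markdown for a post and assemble the prompt.
// The returned prompt would then be sent to your LLM provider.
async function askAboutPost(slug: string, question: string): Promise<string> {
  const res = await fetch(`/blog/${slug}/llm.md`);
  if (!res.ok) throw new Error(`No exported content for "${slug}"`);
  const markdown = await res.text();
  return buildPrompt(markdown, question);
}
```

Grounding answers in the endpoint's markdown is what keeps responses consistent with your content structure: the model sees exactly what the export produced, not a stale or scraped copy.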

The AskAI component will be covered in detail in the next documentation section.


Last updated on 10/17/2025