Generate AI-ready content files for LLM consumption using the Nuxt LLMs module.
EletroDS uses the nuxt-llms module to generate optimized content files that can be consumed by Large Language Models (LLMs). These files provide structured documentation in formats that are easy for AI models to parse and understand.
The following endpoints are automatically generated and available:
| Endpoint | Description |
|---|---|
| `/llms.txt` | A compact text representation of all documentation |
| `/llms-full.txt` | Complete documentation content in text format |
- Production: https://eletro.design/llms.txt and https://eletro.design/llms-full.txt
- Local development: http://localhost:3000/llms.txt and http://localhost:3000/llms-full.txt
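Both files live at fixed paths under the configured domain, so the URLs follow directly from the site's base. A small helper like the one below (illustrative only; the module serves these routes itself) shows the pattern:

```ts
// Build the two nuxt-llms endpoint URLs for a given base domain.
// This helper is a sketch for client code; it is not part of nuxt-llms.
function llmsEndpoints(base: string): { compact: string; full: string } {
  const origin = base.replace(/\/+$/, '') // drop trailing slashes
  return {
    compact: `${origin}/llms.txt`,
    full: `${origin}/llms-full.txt`,
  }
}

const prod = llmsEndpoints('https://eletro.design/')
console.log(prod.compact) // https://eletro.design/llms.txt
console.log(prod.full)    // https://eletro.design/llms-full.txt
```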
The LLMs module is configured in `nuxt.config.ts`:

```ts
export default defineNuxtConfig({
  modules: ['nuxt-llms'],
  llms: {
    domain: 'https://eletro.design/',
    title: 'EletroDS - Design System',
    description: 'A complete design system for creating cohesive, accessible, and scalable experiences within Mercado Eletrônico.',
    full: {
      title: 'EletroDS - Full Documentation',
      description: 'Complete documentation for EletroDS Vue 3, the modern design system for Mercado Eletrônico.'
    },
    sections: [
      {
        title: 'Getting Started',
        contentCollection: 'docs',
        contentFilters: [
          { field: 'path', operator: 'LIKE', value: '/getting-started%' }
        ]
      },
      {
        title: 'Components',
        contentCollection: 'docs',
        contentFilters: [
          { field: 'path', operator: 'LIKE', value: '/components%' }
        ]
      }
    ]
  }
})
```
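The `contentFilters` above use SQL-style `LIKE` matching, so a pattern like `/components%` selects every document whose path starts with that prefix. A rough sketch of the semantics (illustrative; not the module's internals):

```ts
// Illustrative SQL-style LIKE matcher: '%' matches any sequence of
// characters, '_' matches any single character. Sketches the semantics
// of contentFilters; nuxt-llms performs this matching itself.
function likeMatch(value: string, pattern: string): boolean {
  const escaped = pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
  const re = new RegExp('^' + escaped.replace(/%/g, '.*').replace(/_/g, '.') + '$')
  return re.test(value)
}

console.log(likeMatch('/components/me-button', '/components%'))    // true
console.log(likeMatch('/getting-started/install', '/components%')) // false
```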
You can use these endpoints to build custom AI integrations:
```ts
// Fetch documentation for AI processing
const response = await fetch('https://eletro.design/llms.txt')
const documentation = await response.text()

// Use with your AI model
const aiResponse = await yourAIModel.generate({
  context: documentation,
  prompt: 'How do I use the MeButton component?'
})
```
The LLMs files can be used as a knowledge base for RAG implementations:
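A minimal ingestion sketch, assuming you fetch `/llms-full.txt`, split it into overlapping chunks, and embed each chunk with your own provider (the chunk size, overlap, and `embed` function below are illustrative choices, not module settings):

```ts
// Split a documentation dump into overlapping chunks for embedding.
// chunkSize/overlap are illustrative defaults; tune for your embedder.
function chunkText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const chunks: string[] = []
  const step = chunkSize - overlap
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize))
    if (start + chunkSize >= text.length) break
  }
  return chunks
}

// Fetch, chunk, and embed; `embed` is whatever your provider exposes.
async function ingestDocs(url: string, embed: (c: string) => Promise<number[]>) {
  const text = await (await fetch(url)).text()
  const chunks = chunkText(text)
  return Promise.all(
    chunks.map(async (chunk) => ({ chunk, vector: await embed(chunk) }))
  )
}
```

The resulting chunk/vector pairs can then be loaded into any vector store and retrieved at query time as context for the model.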
The table below compares the two integration options:

| Feature | MCP Server | LLMs Files |
|---|---|---|
| Real-time access | ✅ Yes | ❌ No (static files) |
| IDE integration | ✅ Native support | ❌ Requires custom setup |
| Programmatic access | ✅ Via MCP protocol | ✅ Via HTTP |
| Offline usage | ❌ Requires connection | ✅ Can be cached |
| Best for | IDE assistants | Custom AI integrations |