Centralize documentation for seamless LLM integration.
- Add support for more LLM services (Gemini, OpenAI)
- Split services into file processing/storage and data retrieval
- Add MCP (Model Context Protocol) support for external integrations
- Add RAG for searching specific Confluence pages (query directly via RAG, or ingest into the database first?)
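One way the multi-service goal could be structured is a small provider registry that `ragChain.js` calls through a single interface, so Gemini and OpenAI backends can plug in alongside the existing Anthropic service. This is a hypothetical sketch: `createProvider` and the registry keys are illustrative names, not existing code.

```javascript
// Hypothetical provider registry: each entry maps a service name to a
// generate(prompt) function. The stubs below stand in for real clients
// (e.g. anthropicService.js); future services register here.
const providers = {
  anthropic: (prompt) => `anthropic:${prompt}`, // stub for the real Anthropic call
  // gemini: (prompt) => ...,   // future
  // openai: (prompt) => ...,   // future
};

// Resolve a provider by name, failing loudly on unknown services so a
// misconfigured LLM_PROVIDER env var is caught at startup.
function createProvider(name) {
  const generate = providers[name];
  if (!generate) throw new Error(`Unknown LLM provider: ${name}`);
  return { name, generate };
}
```

The chain code would then depend only on the `{ name, generate }` shape, keeping service-specific details out of the RAG logic.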
```
docs-lm/
├── src/
│   ├── main.js                 # Main entry point (TODO)
│   ├── chains/
│   │   └── ragChain.js         # RAG chain implementation
│   ├── demos/
│   │   └── demo.js             # Demo script for testing
│   ├── embeddings/
│   │   └── embeddingService.js # HuggingFace embedding service
│   ├── llm/
│   │   └── anthropicService.js # Anthropic Claude LLM service
│   ├── loaders/
│   │   └── documentLoader.js   # Markdown document loading utilities
│   └── vectorStore/
│       └── chromaStore.js      # ChromaDB vector storage (memory or persistent)
├── data/
│   ├── docs/                   # Documentation files (markdown)
│   └── testing/                # Test data
├── package.json                # Project configuration
├── yarn.lock                   # Dependency lock file
└── README.md                   # Project documentation
```
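As a rough sketch of how the pieces above fit together at query time (loader → embeddings → vector store → chain), here is a toy, self-contained version of the retrieval path. The hash-based `embed` is a deterministic stand-in for the HuggingFace embedding service, and `MemoryVectorStore` mimics what `chromaStore.js` does in memory mode with ChromaDB; none of these names exist in the codebase.

```javascript
// Toy embedding: character-bucket counts. Deterministic but NOT semantic;
// the real project would call the HuggingFace embedding service instead.
function embed(text) {
  const v = new Array(8).fill(0);
  for (const ch of text.toLowerCase()) v[ch.charCodeAt(0) % 8] += 1;
  return v;
}

// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] ** 2;
    nb += b[i] ** 2;
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Minimal in-memory vector store: add documents, then query for the
// top-k most similar chunks (the context a RAG chain feeds the LLM).
class MemoryVectorStore {
  constructor() {
    this.entries = [];
  }
  add(id, text) {
    this.entries.push({ id, text, vector: embed(text) });
  }
  query(text, k = 2) {
    const q = embed(text);
    return this.entries
      .map((e) => ({ ...e, score: cosine(q, e.vector) }))
      .sort((a, b) => b.score - a.score)
      .slice(0, k);
  }
}
```

In the real pipeline, `documentLoader.js` would supply the chunks, the embedding service would replace `embed`, and `ragChain.js` would format the top-k results into the prompt sent to the LLM service.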