Building with LLMs
Use LLMs to assist in integrating Courier into your application.
You can use large language models (LLMs) to assist in integrating Courier into your application. We provide a set of tools to support LLM-assisted integration, such as when you're working in an AI-assisted editor like Cursor, VS Code with Copilot, or Windsurf.
Plain text docs
Every page on our docs site is accessible as a plain text file by appending a `.md` extension. For example, this page is accessible as `working-with-llms.md`. Our plain text pages are useful to feed to an LLM when building your integration.
We also host `/llms.txt` and `/llms-full.txt` files, which instruct AI tools and agents how to retrieve the plain text versions of our pages.
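As a rough sketch of how a tool might consume these plain text pages, the snippet below derives the `.md` URL from a docs page path. The docs host used here is an assumption for illustration; substitute the actual Courier docs domain.

```python
import urllib.request

# Assumed docs host for illustration -- adjust to the actual Courier docs domain.
DOCS_BASE = "https://www.courier.com/docs"

def plain_text_url(page_path: str) -> str:
    """Append the .md extension to a docs page path to get its plain text URL."""
    return f"{DOCS_BASE}/{page_path.strip('/')}.md"

# The page described here:
print(plain_text_url("working-with-llms"))
# -> https://www.courier.com/docs/working-with-llms.md

# The index files for AI tools live at the docs root (network call left commented out):
# llms_index = urllib.request.urlopen(f"{DOCS_BASE}/llms.txt").read().decode()
```

Feeding the fetched `.md` text directly into an LLM's context is usually enough; the `/llms.txt` index is useful when an agent needs to discover which pages exist before fetching them.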
Courier Model Context Protocol (MCP) Server
We provide a Mintlify-hosted MCP server that exposes Courier’s documentation and API reference to LLMs and AI agents via the Model Context Protocol.
You can install and use the Courier MCP server locally to aid in building your Courier integration.
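As a hypothetical example of wiring the server into an MCP-capable client, the configuration below assumes the server is exposed at the docs URL plus `/mcp`, a common convention for Mintlify-hosted MCP servers. The server name and endpoint are illustrative; check Courier's docs for the exact values.

```json
{
  "mcpServers": {
    "courier-docs": {
      "url": "https://www.courier.com/docs/mcp"
    }
  }
}
```

Most MCP clients (Cursor, Claude Desktop, and similar) accept a configuration block of this shape, after which the client can query Courier's documentation tools directly.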
This MCP server gives AI tools and agents access to our comprehensive documentation, API reference, and code examples to help with Courier integration tasks.
Future MCP Server Enhancements
We’re actively working on a more robust MCP server that will provide enhanced capabilities for AI-assisted development with Courier. Our planned features include:
- Direct API access for sending notifications and managing users
- Real-time notification status and delivery tracking
- Template management and content creation assistance
- User preference and audience management
- Automation workflow design and testing
If you’re interested in these enhanced MCP capabilities or have specific use cases you’d like to see supported, we’d love to hear from you. Send us an email at devrel@courier.com with your feedback and requirements.