Robots Txt Parser
Parses and analyzes the robots.txt file of a given website, providing insights into crawler access rules.
Try it
curl -G https://robots-txt-parser.apimesh.xyz/check --data-urlencode "url=https://example.com"
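For comparison, the same kind of rule analysis can be done locally with Python's standard library. This is a minimal offline sketch, not the service's implementation; the sample rules are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt content (hypothetical, for illustration only).
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

# Parse the rules and check crawler access, the kind of
# insight the /check endpoint returns for a live site.
parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("MyBot", "https://example.com/"))           # True
print(parser.can_fetch("MyBot", "https://example.com/private/x"))  # False
```

The endpoint does this fetch-and-parse step for you, so you don't have to download and parse each site's robots.txt yourself.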
Use cases
- A developer verifies that their crawler respects a site's directives before crawling large domains.
- SEO teams analyze robots.txt rules across multiple client websites efficiently.
- Content managers confirm that sensitive URLs are properly disallowed in robots.txt files.
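For the multi-site use cases, requests can be built programmatically. The sketch below only constructs the request URLs (the endpoint and `url` parameter come from the "Try it" example; the response schema is not shown here, so no response handling is assumed):

```python
from urllib.parse import urlencode

# Endpoint from the "Try it" example above.
ENDPOINT = "https://robots-txt-parser.apimesh.xyz/check"

def build_check_url(site: str) -> str:
    # URL-encode the target site as the `url` query parameter,
    # mirroring curl's --data-urlencode behaviour.
    return f"{ENDPOINT}?{urlencode({'url': site})}"

# Hypothetical client site list for a batch check.
for site in ["https://example.com", "https://example.org"]:
    print(build_check_url(site))
```

Each generated URL can then be fetched with any HTTP client, one call per site.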
Payment methods
Every endpoint supports three payment methods:
- x402 — pay per call with USDC on Base. No signup needed.
- MPP — Stripe Machine Payments Protocol. Cards + stablecoins.
- API key — traditional auth. Sign up and buy credits.
MCP integration
npx @mbeato/apimesh-mcp-server
Adds all APIMesh tools to Claude, Cursor, Windsurf, or any MCP client.
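For clients configured via a JSON settings file (such as Claude Desktop), a typical entry might look like the sketch below. The server name `apimesh` is an arbitrary label chosen here; only the `npx` command and package name come from the line above:

```json
{
  "mcpServers": {
    "apimesh": {
      "command": "npx",
      "args": ["@mbeato/apimesh-mcp-server"]
    }
  }
}
```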