
A robots.txt file, which crawlers read before they start fetching pages, tells them which parts of a site they may crawl, which parts they should skip, and any extra rules (such as a crawl rate) to apply.
This tool lets you choose a default crawl policy, add explicit allow and disallow paths, set an optional crawl delay, and append sitemap or host directives without writing the file by hand.
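For example, a generated file for a site that blocks a couple of private folders, sets a crawl delay, and points to its sitemap might look like this (the paths, domain, and delay value are illustrative placeholders):

  User-agent: *
  Allow: /public/
  Disallow: /admin/
  Disallow: /tmp/
  Crawl-delay: 10

  Sitemap: https://www.example.com/sitemap.xml
  Host: www.example.com

The User-agent: * group carries the default crawl policy, while the Sitemap and Host lines are appended at the end of the file.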
Robots.txt is a set of crawl directives, not an access-control system: compliant crawlers honor it voluntarily, so sensitive content should still be protected with authentication or other server-side restrictions.
Different crawlers support different directives: the Host directive is not universally recognized (historically it was a Yandex extension) and Crawl-delay is ignored by some bots, including Googlebot, so always validate your final rules against the crawlers you care about.
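If you need different rules for different bots, directives can be grouped under separate User-agent lines; a minimal sketch (the bot name and paths are illustrative):

  User-agent: Googlebot
  Disallow: /search/

  User-agent: *
  Disallow: /private/
  Crawl-delay: 5

A crawler follows the most specific group that matches its user agent, so in this sketch Googlebot obeys only its own group and never reaches the Crawl-delay line it would ignore anyway.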
