Robots.txt Generator – Build Your Robots.txt File Visually

Create a properly formatted robots.txt file for your website using a simple visual editor. Add rules for specific user agents, allow or disallow paths, include your sitemap URL, and set crawl delays, all without writing a single line by hand.

Rule Editor

Rule Group #1
Global Settings

Generated robots.txt

Configure your rules and click "Generate robots.txt" to see the output here.

How to use: Save the generated content as robots.txt and upload it to the root directory of your website (e.g., https://yoursite.com/robots.txt).

Why Use the Tools Oasis Robots.txt Generator?

Tools Oasis provides a visual, mistake-free way to create robots.txt files. No need to remember syntax or worry about formatting errors that could accidentally block search engines from your important pages.

  • Visual Editor: Add, remove, and configure rules with clicks instead of manual text editing.
  • Multiple User Agents: Create separate rules for Googlebot, Bingbot, and any other crawler.
  • Sitemap Support: Include your XML sitemap URLs so search engines can find all your pages.
  • 100% Private: Everything runs in your browser; no data is ever stored or sent anywhere.
  • Instant Download: Download the generated robots.txt file directly, ready to upload to your server.

Understanding Robots.txt Directives

A robots.txt file is a plain text file placed at the root of your website that tells search engine crawlers which pages or sections of the site they may and may not access. Here are the key directives:

  • User-Agent: Specifies which crawler the rules apply to. Use * for all bots, or target specific ones like Googlebot.
  • Disallow: Tells the crawler not to access a specific path. Example: Disallow: /admin/
  • Allow: Overrides a Disallow rule for a specific path within a blocked directory. Example: Allow: /admin/public/
  • Sitemap: Points crawlers to your XML sitemap for better page discovery.
  • Crawl-Delay: Requests that the crawler wait a specified number of seconds between requests (supported by Bing and Yandex, ignored by Google).
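Putting these directives together, a typical generated file might look like the sketch below (the paths and sitemap URL are placeholders; substitute your own):

```txt
# Rules for all crawlers
User-agent: *
Allow: /admin/public/
Disallow: /admin/

# Bing and Yandex honor Crawl-Delay; Google ignores it
User-agent: Bingbot
Crawl-delay: 10

Sitemap: https://yoursite.com/sitemap.xml
```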

Related Tools

Frequently Asked Questions

What is a robots.txt file?

A robots.txt file is a plain text file placed at the root of your website (e.g., yoursite.com/robots.txt) that tells search engine crawlers which pages they can and cannot access. It's a key part of technical SEO and helps you control how search engines discover and index your site.

Where do I upload the robots.txt file?

The robots.txt file must be placed in the root directory of your website, so it's accessible at https://yourdomain.com/robots.txt. Upload it via FTP, your hosting file manager, or your CMS settings. It must be at the root β€” placing it in a subdirectory won't work.

Will robots.txt prevent pages from appearing in Google?

Not exactly. Disallow tells crawlers not to crawl a page, but Google may still index the URL if it finds links to it. To fully prevent indexing, you should also use a <meta name="robots" content="noindex"> tag. Use our Meta Tag Generator to create one.
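For example, to keep a page out of the index, place a tag like this in the page's <head>. Note that a crawler must be able to fetch the page to see the tag, so don't also block the page in robots.txt if you rely on noindex:

```html
<!-- Inside the page's <head>; the page must remain crawlable
     (not blocked by robots.txt) for crawlers to read this tag -->
<meta name="robots" content="noindex">
```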

Does Google respect the Crawl-Delay directive?

No. Googlebot ignores Crawl-Delay; the directive is honored by Bing, Yandex, and some other crawlers. Google sets its crawl rate automatically, and if Googlebot is crawling your site too aggressively you can report the problem through Google Search Console.

Can a robots.txt file hurt my SEO?

Yes, if misconfigured. A broad Disallow: / rule will block all crawlers from your entire site, effectively removing it from search results. Always double-check your rules before uploading. This tool helps prevent common mistakes by providing a visual interface.
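One way to double-check a generated file before uploading is Python's built-in urllib.robotparser. The sketch below (rule contents and URLs are illustrative) verifies that a Disallow rule blocks what you expect and nothing more. Note that Python's parser applies rules in file order rather than Google's longest-match precedence, which is why the Allow line comes first here:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; paste your generated rules instead
rules = """
User-agent: *
Allow: /admin/public/
Disallow: /admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The blocked directory should be disallowed for a generic crawler...
print(parser.can_fetch("*", "https://yoursite.com/admin/settings"))     # False
# ...while the Allow override and ordinary pages stay crawlable
print(parser.can_fetch("*", "https://yoursite.com/admin/public/help"))  # True
print(parser.can_fetch("*", "https://yoursite.com/blog/post"))          # True
```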

Is my data saved anywhere?

No. All processing happens entirely in your browser using JavaScript. Tools Oasis never stores, collects, or transmits any information you enter; your rules stay completely private on your device.

More Questions About Robots.txt

Can I block Google from crawling specific pages?

Yes. Add a rule group targeting "Googlebot" as the user agent and add Disallow directives for the paths you want to block. However, note that Disallow prevents crawling but Google may still index the URL if it finds links to it. For full de-indexing, also use a noindex meta tag.
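As a sketch, a rule group that blocks Googlebot from two hypothetical sections while leaving other crawlers unrestricted would look like:

```txt
User-agent: Googlebot
Disallow: /drafts/
Disallow: /internal/

# All other crawlers: no restrictions
User-agent: *
Disallow:
```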

Does robots.txt affect my SEO?

Yes, significantly. A properly configured robots.txt helps search engines focus on your important pages and avoid wasting crawl budget on irrelevant sections. However, a misconfigured file (such as one that accidentally disallows your entire site) can remove your pages from search results entirely.