
What Is a Robots.txt File? How Does Robots.txt Help in SEO?
A robots.txt file tells search engines (like Google) and AI bots (like ChatGPT's GPTBot) which pages of your website they can visit (crawl) and which pages they should not.
For example, if there's a page on your site that you don't want Google to crawl, you can block it in the robots.txt file using Disallow. And if there's a page you do want search engines to crawl, you can use Allow.
You can check the robots.txt file of any website by typing this in your browser:
https://www.example.com/robots.txt
(Just replace example.com with your website address.)
How to Use Robots.txt on Different Platforms
Every website’s robots.txt file can be different. It also depends on the platform you’re using:
On WordPress, you can change or create your own robots.txt file.
On Shopify, you can’t edit it directly because Shopify creates a default one for you.
Basic Format of robots.txt
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourwebsite.com/sitemap.xml
Explanation
User-agent: the name of the bot (e.g., Googlebot, PerplexityBot, GPTBot). Using * applies the rules to all bots.
Disallow: a URL path you don't want crawled (e.g., /wp-admin/).
Allow: a URL path you do want crawled (e.g., /shop/).
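If you want to test how rules like these behave, Python's standard-library urllib.robotparser can check a path against a rule set. A minimal sketch (note: Python's parser honors the first matching rule in file order, so the more specific Allow line is listed before the Disallow it overrides; Google instead picks the most specific match regardless of order):

```python
from urllib.robotparser import RobotFileParser

# Rules from the basic format above, with the specific Allow first
# because Python's parser applies the first rule that matches.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/wp-admin/"))                # False: blocked
print(rp.can_fetch("*", "/wp-admin/admin-ajax.php"))  # True: explicitly allowed
print(rp.can_fetch("*", "/shop/"))                    # True: no rule matches
```

The same parser can also read a live file with rp.set_url("https://example.com/robots.txt") followed by rp.read().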
Sometimes, the robots.txt file has this line:
Allow: /wp-admin/admin-ajax.php
WordPress uses this file (admin-ajax.php) to handle background tasks (AJAX). It's normal for it to be requested a lot, especially on busy websites.
If you think it's being requested too often, check your rate-limiting settings or ask your hosting provider for help.
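One quick way to see how often admin-ajax.php is being hit is to count its entries in your web server's access log. This sketch uses a tiny sample log so it runs anywhere; on a real server you would point grep at your actual access log (the path varies by host):

```shell
# Build a small sample access log (stand-in for your real server log)
printf '%s\n' \
  'GET /wp-admin/admin-ajax.php 200' \
  'GET /shop/ 200' \
  'GET /wp-admin/admin-ajax.php 200' > sample-access.log

# Count how many requests hit admin-ajax.php
grep -c "admin-ajax.php" sample-access.log
# prints 2
```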
Does Robots.txt Help in SEO?
The robots.txt file does not directly help with ranking. Instead, it tells search engines which pages should not be crawled, especially pages that don't provide any SEO value (like admin pages, thank-you pages, or duplicate content).
By blocking unimportant pages, you save your crawl budget — meaning search engines can focus on the most important pages of your website. This helps make sure your key pages get crawled and indexed faster.
So, while robots.txt doesn’t boost SEO directly, it supports a healthy and efficient crawling process.
A Few Important Things to Remember
1. robots.txt only gives suggestions — not all bots follow it.
2. Even if a page is blocked, it might still appear in search results if other websites link to it.
3. To truly keep a page out of search results, use a noindex meta tag or password protection.
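For the meta-tag route, the tag goes in the page's head section (an illustrative snippet, not tied to any specific site):

```html
<head>
  <!-- Lets bots crawl the page but tells them not to show it in results -->
  <meta name="robots" content="noindex">
</head>
```

Note that for the noindex tag to work, the page must not be blocked in robots.txt, since a bot has to crawl the page to see the tag.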
I’m Aman Jain, a Digital Marketing professional focused on SEO and online marketing strategies to boost brand visibility and growth.