Shopify Robots.txt Guide (2025): The Ultimate SEO Download 🚀
Welcome, Shopify store owner! You've built a beautiful store, but how do you make sure Google sees all your best products and ignores the boring stuff? The answer lies in a powerful little file called robots.txt.
[Image: Shopify Robots.txt]
While Shopify’s default `robots.txt` is good, a customized version can give you a serious competitive advantage. This guide is built from years of e-commerce SEO experience. We'll provide a downloadable, optimized `robots.txt` file and explain everything in simple, human-friendly language.
What is a Robots.txt File? 🤔
Think of `robots.txt` as a set of rules you give to search engine bots (like Googlebot). It's like putting a "Staff Only" sign on certain doors of your store. You want Google to explore your product aisles (your product pages) but stay out of the stockroom (your admin area) and checkout lanes.
By guiding these bots, you help Google focus its energy on indexing the pages that actually make you money, which can lead to better rankings and more traffic.
The Best Robots.txt for Shopify: Optimized for 2025
Shopify's default file is a great start. We've taken it and added a few crucial lines for even better SEO performance. This version blocks more types of duplicate content and provides clearer instructions to search engines.
Simply copy the code below to use in your Shopify store.
```
# Shopify Optimized Robots.txt - 2025 by [Your Name/Website]
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /account
Disallow: /checkouts
Disallow: /checkout
Disallow: /apps/
Disallow: /search
Disallow: /collections/*+*
Disallow: /collections/*sort_by*
Disallow: /*?*preview_theme_id*

# Allow important policy pages
Allow: /policies/

# Sitemap location using Shopify's Liquid variable
Sitemap: {{ shop.url }}/sitemap.xml
```
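Before deploying rules like these, you can sanity-check them with Python's built-in `urllib.robotparser`. One caveat: the standard-library parser does plain prefix matching and does not understand Googlebot-style `*` wildcards, so this sketch tests only the prefix rules, and the store URL is a placeholder:

```python
from urllib import robotparser

# A subset of the rules above (wildcard rules omitted: the stdlib
# parser does prefix matching, not Googlebot-style glob patterns).
rules = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /search
Allow: /policies/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

base = "https://example-store.myshopify.com"  # placeholder domain
print(rp.can_fetch("*", base + "/admin"))                   # blocked
print(rp.can_fetch("*", base + "/products/running-shoe"))   # crawlable
print(rp.can_fetch("*", base + "/policies/refund-policy"))  # explicitly allowed
```

This is handy as a quick regression check whenever you edit the file: if a rule you add suddenly makes a product URL return `False`, you've caught the mistake before Google does.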
New Lines and Enhancements Explained:
- `Disallow: /checkouts`: Shopify uses the plural `/checkouts` for its checkout process. We've added this and kept the singular `/checkout` for extra safety.
- `Disallow: /apps/`: Blocks pages generated by some of your installed apps that have no SEO value.
- `Disallow: /search`: Prevents Google from indexing your internal search result pages, which are classic examples of low-quality, duplicate content.
- `Disallow: /collections/*+*`: An effective way to block filtered collection pages. When a customer filters a collection (e.g., by color or size), Shopify adds a "+" to the URL, so this single line blocks all of them.
- `Disallow: /*?*preview_theme_id*`: Blocks Google from indexing your unpublished theme preview URLs.
- `Sitemap: {{ shop.url }}/sitemap.xml`: A smart upgrade. Instead of manually typing your domain, this uses Shopify's Liquid code to automatically insert your store's URL, so there's no risk of a typo.
- Logic Cleanup: We removed the unnecessary `Allow: /sitemap.xml` rule. The `Sitemap:` directive at the bottom is the correct way to declare your sitemap.
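To see exactly which URLs the wildcard rules catch, you can translate a robots.txt path pattern into a regular expression the way Google's robots.txt spec treats `*` (match any sequence of characters). The helper and the sample URLs below are illustrative, not part of Shopify:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Check a path against a robots.txt pattern, treating * as
    'any sequence of characters' and anchoring at the path start."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.match(regex, path) is not None

# Filtered collection pages get a "+" in the URL, so the rule catches them:
print(robots_pattern_matches("/collections/*+*", "/collections/shoes+red"))  # True
# A plain collection page has no "+", so it stays crawlable:
print(robots_pattern_matches("/collections/*+*", "/collections/shoes"))      # False
# Theme preview URLs carry the preview_theme_id query parameter:
print(robots_pattern_matches("/*?*preview_theme_id*",
                             "/products/shoe?preview_theme_id=12345"))       # True
```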
Your `robots.txt` file is for blocking bots from crawling whole sections of your site. If you just want to hide a single product or page from search results, it's better to use a noindex tag. You can do this in Shopify by going to the product/page editor, scrolling to "Search engine listing," clicking "Edit," and checking the "Hide from search engines" box.
How to Edit Your `robots.txt.liquid` File in Shopify
Editing this file is simple and takes just a few clicks. Follow these steps:
- From your Shopify Admin dashboard, go to Online Store > Themes.
- Find your current theme, click the three-dots button (...), and select Edit code.
- In the file search bar on the left, type robots and open the `robots.txt.liquid` file. If your theme doesn't have this file yet, go to the Templates folder, click Add a new template, and choose robots to create it.
- Delete all the existing code in the file and paste the optimized code you copied from this guide.
- Click Save in the top right corner. You're done!
Frequently Asked Questions (FAQs) 🙋
Q: Is it safe for me to edit this file?
A: Yes, it's safe if you follow the instructions. If you ever feel you've made a mistake, delete the `robots.txt.liquid` template file itself (not just the code inside it) and Shopify will automatically serve its default `robots.txt` again. Note that leaving the file in place but empty produces an empty robots.txt, which is not what you want.
Q: What's the main benefit of this custom file over the Shopify default?
A: The main benefit is better crawl budget optimization. This custom file blocks more types of low-value and duplicate pages (like search results and app pages), forcing Google to spend its limited time crawling and indexing the pages that actually matter—your products and categories.
Q: Should I block AI bots like Google-Extended or ChatGPT-User?
A: For most stores, the answer is no. The future of search involves AI-generated answers and overviews. Blocking these bots might prevent your products from appearing in these new types of results. We recommend allowing them unless you have a specific reason not to.
Q: My traffic dropped after I changed the file! What should I do?
A: Don't worry! This is very rare but easy to fix. The most likely cause is an accidental typo, like adding `Disallow: /products`, which blocks every product page. Go back to the code editor (Online Store > Themes > Edit code > robots.txt.liquid) and either correct the typo or delete the `robots.txt.liquid` template file entirely, which restores Shopify's default rules. Google will pick up the corrected file on its next crawl.
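If you want an automated guard against exactly this kind of typo, a few lines of Python can scan a draft before you save it and flag Disallow rules that would block whole revenue sections. The prefix list is a hypothetical example, not an official Shopify check:

```python
# Paths that would block entire revenue-generating sections of a store.
# This list is illustrative; adjust it to your own store's structure.
DANGEROUS_PATHS = {"/", "/products", "/collections", "/pages", "/blogs"}

def risky_rules(robots_txt: str) -> list[str]:
    """Return Disallow lines whose path would block a whole money section."""
    risky = []
    for line in robots_txt.splitlines():
        line = line.strip()
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path in DANGEROUS_PATHS:
                risky.append(line)
    return risky

draft = """User-agent: *
Disallow: /admin
Disallow: /products
"""
print(risky_rules(draft))  # ['Disallow: /products']
```

Run this on your draft before pasting it into `robots.txt.liquid`; an empty list means none of the flagged prefixes slipped in.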