robots.txt for Shopify: Edit, Customize, and Block AI Bots
Shopify's robots.txt was locked for years. A lot of guides still say you can't edit it. That stopped being true in mid-2021. Since then, every Shopify store can fully customize robots.txt through a Liquid template called robots.txt.liquid. Here is how to do it without breaking Shopify's defaults, plus copy-paste rules for blocking AI training crawlers.
Step 1: Open the Theme Code Editor
- Shopify admin → Online Store → Themes
- Find your live theme and click Edit code (or the three-dot menu → Edit code on newer admin layouts)
- In the left sidebar under "Templates", click Add a new template
- In the "Create a new template for" dropdown, choose robots.txt
Shopify will create templates/robots.txt.liquid prefilled with the default content.
If robots.txt.liquid already exists in your theme, open that file directly instead of creating a new one.
Important: Back up the existing file before editing. Copy its full contents into a text file somewhere you will find again. If you break the defaults, you restore from this backup.
Step 2: Understand the Default Template
When you create robots.txt.liquid, Shopify prefills it with something like this:
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
This loop prints Shopify's default rules — blocking /admin, /cart, /checkout, /orders, and various search and collection filter URLs. Keep this loop. It is what prevents search engines from indexing your checkout flow and internal URLs.
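To see what those defaults buy you, here is a quick check with Python's built-in robots.txt parser. The sample text is an abridged sketch of Shopify's default output (the real file has many more rules), and the product path is a made-up example:

```python
from urllib import robotparser

# Abridged sketch of Shopify's default robots.txt output
sample = """User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /orders
"""

rp = robotparser.RobotFileParser()
rp.parse(sample.splitlines())

print(rp.can_fetch("*", "/admin"))                # False: blocked by the defaults
print(rp.can_fetch("*", "/products/blue-shirt"))  # True: product pages stay crawlable
```

The defaults block internal URLs while leaving storefront pages open, which is exactly the behavior the loop preserves.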
Step 3: Add Custom Rules On Top
The safe pattern is: keep the default loop, then append your custom rules after it.
Block all known AI training crawlers
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
User-agent: GPTBot
Disallow: /
User-agent: ChatGPT-User
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: Claude-Web
Disallow: /
User-agent: Google-Extended
Disallow: /
User-agent: PerplexityBot
Disallow: /
User-agent: Perplexity-User
Disallow: /
User-agent: Bytespider
Disallow: /
User-agent: CCBot
Disallow: /
User-agent: FacebookBot
Disallow: /
User-agent: Applebot-Extended
Disallow: /
Save the file. Shopify deploys it immediately. Visit https://yourstore.com/robots.txt in a browser and you should see the default Shopify rules followed by your AI bot blocks.
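You can sanity-check the resulting file offline with Python's stdlib parser before trusting it in production. The string below is a slice of the finished output (Shopify's catch-all group plus a few of the AI-bot blocks); the product path is a placeholder:

```python
from urllib import robotparser

# A slice of the finished robots.txt: defaults plus AI-bot blocks
rendered = """User-agent: *
Disallow: /admin

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rendered.splitlines())

for bot in ["GPTBot", "ClaudeBot", "PerplexityBot"]:
    print(bot, rp.can_fetch(bot, "/products/blue-shirt"))  # all False: fully blocked

print("Googlebot", rp.can_fetch("Googlebot", "/products/blue-shirt"))  # True
```

Googlebot is not named in any of the new groups, so it falls through to the `User-agent: *` group and keeps crawling the storefront as before.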
Block only a specific path (like a private collection)
If you want to keep crawlers out of one specific URL path (say a /private-collection), add a rule inside the default group loop:
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /collections/private-collection' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
This adds your custom Disallow to the User-agent: * group that Shopify already generates, instead of creating a parallel group that competing crawlers might interpret differently.
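The rendered result behaves like this (checked with Python's stdlib parser; the second collection handle is a made-up example):

```python
from urllib import robotparser

# Rendered output after the edit: the custom Disallow sits inside the * group
rendered = """User-agent: *
Disallow: /admin
Disallow: /collections/private-collection
"""

rp = robotparser.RobotFileParser()
rp.parse(rendered.splitlines())

print(rp.can_fetch("*", "/collections/private-collection"))  # False: hidden
print(rp.can_fetch("*", "/collections/summer-sale"))         # True: other collections unaffected
```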
Remove a specific default Disallow rule
Sometimes you actually want Google to crawl something Shopify blocks by default (for example, a filtered collection URL that drives real traffic). Use a conditional inside the loop:
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and rule.value contains '/collections/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
This skips any Disallow rule matching /collections/. Be careful — Shopify blocks those paths for a reason (filter parameter bloat in search results), so only do this if you know you want it.
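The filtering logic of that `unless` can be sketched in Python to show which rules survive. The rule list here is hypothetical, mirroring the `rule.directive` and `rule.value` properties Shopify exposes:

```python
# Hypothetical default rules as (directive, value) pairs
rules = [
    ("Disallow", "/admin"),
    ("Disallow", "/cart"),
    ("Disallow", "/collections/all+vendor"),  # made-up filter-URL example
    ("Sitemap", "https://yourstore.com/sitemap.xml"),
]

# Same condition as the Liquid `unless`: drop Disallow rules touching /collections/
kept = [(d, v) for d, v in rules if not (d == "Disallow" and "/collections/" in v)]

for directive, value in kept:
    print(f"{directive}: {value}")
```

Only the collection Disallow is dropped; the admin and cart blocks and the Sitemap line pass through untouched.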
Check your Shopify store's AI readiness
The AI Readiness Checker scans your robots.txt, llms.txt, structured data, and more.
Run AI Readiness Check

Step 4: Verify It Actually Works
- Open https://yourstore.com/robots.txt in a browser. You should see Shopify's default rules plus your new additions.
- Confirm the defaults are still there. If Disallow: /admin, Disallow: /cart, and Disallow: /checkout are missing, something is wrong with your loop; restore from backup and try again.
- Run the AI Readiness Checker against your store URL. It will tell you exactly which AI bots are blocked.
- In Shopify admin, go to Online Store → Preferences → check that your sitemap URL is set correctly (Shopify generates this automatically, you cannot edit it).
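If you want to automate the second check in the list above, a small helper like this works. It is a sketch: in practice you would fetch https://yourstore.com/robots.txt and pass the response body in, and the sample strings here are stand-ins:

```python
# Critical default rules that must survive any edit
REQUIRED_DEFAULTS = ("Disallow: /admin", "Disallow: /cart", "Disallow: /checkout")

def defaults_intact(robots_txt: str) -> bool:
    """True if every critical Shopify default rule is still present."""
    return all(rule in robots_txt for rule in REQUIRED_DEFAULTS)

# Stand-in samples; in practice, fetch your live /robots.txt
good = "User-agent: *\nDisallow: /admin\nDisallow: /cart\nDisallow: /checkout\n"
broken = "User-agent: *\nDisallow: /\n"

print(defaults_intact(good))    # True
print(defaults_intact(broken))  # False
```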
Things Shopify Merchants Ask
Does blocking AI crawlers hurt my Google Shopping listings?
No. Google Shopping uses your product feed, not robots.txt AI rules. Google-Extended controls Gemini training data only — it has zero effect on product search, shopping ads, or organic rankings.
What about Shopify's own app crawlers?
Shopify runs internal crawlers for features like search suggestions and app integrations. These are not subject to your robots.txt edits because they access your store through internal APIs. Adding rules for unknown user-agents does not affect them.
Can I edit robots.txt from the Shopify admin without touching theme code?
No, not directly. Shopify intentionally puts robots.txt editing behind the theme code editor. There is no UI panel for it. If you are not comfortable editing Liquid templates, there are apps in the Shopify App Store that provide a wrapper around robots.txt.liquid, but they ultimately do the same thing.
My theme does not have a robots.txt.liquid file. Is that a problem?
No. Shopify generates a default robots.txt on the fly when the template file does not exist. You only need to create the template when you want to customize the rules.
Rule of thumb: Add, do not replace. The Shopify default template exists because Shopify knows which internal URLs should not appear in search results. Your custom rules go on top of that, not instead of it.
What About llms.txt on Shopify?
Shopify does not generate an llms.txt file automatically, and there is no Liquid template for it yet. If you want one, the workaround is to create it as a page (not a template), set the URL handle to llms, then use a redirect to serve it at /llms.txt. It is not ideal: the content will be delivered as HTML wrapped in your theme's layout rather than as plain text. For most Shopify stores, it is fine to skip llms.txt entirely and rely on structured data instead.
See our llms.txt examples guide for templates if you do decide to host one on a subdomain or custom route.
Generate a robots.txt with AI bot presets
Use the generator to build your custom rules, then paste them after the Shopify default loop.
Open Robots.txt Generator

Frequently Asked Questions
Can I actually edit robots.txt on Shopify?
Yes, since mid-2021. Shopify exposes robots.txt as a Liquid template called robots.txt.liquid. You edit it from the theme code editor under Online Store > Themes > Edit Code. Before 2021 the file was locked, which is why many older guides tell you it cannot be changed. It can.
Will editing robots.txt.liquid break my Shopify SEO?
Only if you accidentally disallow important paths. Shopify ships a sensible default robots.txt that blocks checkout, cart, and account pages from search engines. The safe pattern is to keep the default output intact and add your custom rules on top of it, using the Liquid include syntax Shopify provides. Never replace the file with a blank one, and never use Disallow: / without knowing exactly what you are doing.
Does blocking AI crawlers on Shopify affect my Google rankings?
No. Crawlers like GPTBot, ClaudeBot, Bytespider, and Google-Extended are separate from Googlebot. Google-Extended only controls whether your content is used to train Gemini, not how your store ranks in Google Search. Blocking these AI bots has zero impact on your Google Shopping or Google Search visibility.