Shopify automatically generates a default robots.txt file for every store, which tells search engines which parts of your site they can and cannot crawl. The default configuration works well for most stores, but if you need more control, Shopify lets you fully customize the file through the robots.txt.liquid template. You can view your store's current file at https://example.com/robots.txt (replace example.com with your store's domain).

What is robots.txt and Where is it Located?

  • The robots.txt file is a set of instructions for search engine crawlers, specifying which URLs they can and cannot access on your site; a minimal example follows this list.
  • The robots.txt.liquid template is not included in themes by default, but you can add it to your theme's Templates directory and customize the generated rules from there.
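
For illustration, a robots.txt file is plain text made up of user-agent groups, rules, and optional sitemap lines; the paths below are generic examples rather than Shopify's exact defaults:

User-agent: *
Disallow: /cart
Disallow: /checkout

Sitemap: https://example.com/sitemap.xml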

How to Add or Edit the robots.txt.liquid Template

Access Your Theme’s Code Editor

  • In Shopify admin, go to Online Store > Themes.
  • Find your current theme, click ... (Actions), and select Edit code.

Add the robots.txt.liquid Template

  • In the left sidebar, under Templates, click Add a new template.
  • Choose robots.txt from the dropdown menu.
  • Click Create template.

Edit the Template

  • The robots.txt.liquid template supports only a small set of Liquid objects: robots, group, rule, user_agent, and sitemap.
  • You can use Liquid code to output the default rules, add new rules, or override existing ones; the default template is shown in the sketch below.
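
For reference, the template Shopify generates typically looks like the following: it loops over the default groups and outputs each group's user agent, rules, and sitemap (check the file created in your theme for the exact contents):

{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}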

Save Your Changes

  • After making your edits, click Save. The file at /robots.txt on your site will now reflect your customizations.

Customizing Your robots.txt Rules

You can perform several types of customizations:

  • Add a New Rule to an Existing Group:
    For example, to block all crawlers from accessing pages with a ?q= parameter, add a Liquid condition to the default output (see the Examples section below).
  • Remove a Default Rule:
    If you want to allow crawlers to access a page that Shopify blocks by default (e.g., /policies/), modify the Liquid loop to skip that rule (also shown below).
  • Add Custom Rules:
    You can add rules outside the default groups, such as blocking or allowing specific crawlers or listing extra sitemap URLs. These are written as plain text at the end of the template, after the Liquid loop.

Examples

Block a Directory

Following the structure of Shopify's default template, this adds an extra Disallow rule for /blog/ to the catch-all (User-agent: *) group while keeping all of the default rules:

{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /blog/' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
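
Block Pages with a Query Parameter

The same pattern covers the ?q= case mentioned earlier; this is the condition to place inside the loop above, with the URL pattern given as an illustrative assumption:

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?q=*' }}
  {%- endif -%}

Remove a Default Rule

A sketch of skipping one of Shopify's default rules inside the same loop, here the default Disallow for /policies/, using the rule.directive and rule.value properties:

{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

The remaining examples are plain-text rules; add them at the end of robots.txt.liquid, outside the Liquid loop that outputs the default groups.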

Block a Specific Crawler

For example, to block OpenAI's GPTBot crawler from the entire store:

User-agent: GPTBot
Disallow: /

Allow a Specific Crawler

User-agent: discobot
Allow: /

Add a Sitemap

Sitemap: https://yourstore.com/sitemap.xml

Considerations and Best Practices

  • Use Liquid Objects:
    It's strongly recommended to build on the Liquid objects rather than replacing the entire file with static rules, because Shopify updates its default rules over time to follow SEO best practices, and a static file won't pick up those updates.
  • Back Up Your Customizations:
    Keep a copy of the template before making changes so you can revert if needed.
  • Be Cautious:
    Incorrect rules can prevent important pages from being indexed, potentially harming your SEO and traffic. If unsure, consult an SEO expert.

Pros and Cons of Customizing robots.txt

Pros:

  • Control which pages are crawled by search engines.
  • Optimize your site’s crawl budget by excluding thin or duplicate content.
  • Block unwanted bots or crawlers.

Cons:

  • Requires technical knowledge; mistakes can harm SEO.
  • Removing default rules may unintentionally expose sensitive or low-value pages to search engines.

Restoring the Default robots.txt

If you want to revert to Shopify’s default robots.txt, simply delete the robots.txt.liquid template from your theme. This will remove all customizations and restore the original behavior.

By following these steps, you can safely and effectively customize your Shopify store’s robots.txt file to suit your SEO and privacy needs.
