In the world of e-commerce, optimizing how search engines crawl your website is essential to ensuring its visibility. Shopify, one of the most popular e-commerce platforms, lets online store owners optimize crawling through the correct configuration of the robots.txt file.
That's why below we explain in detail how to create and edit the robots.txt file in Shopify.
What are Robots.txt files used for?
Robots.txt files are used to give search engines specific instructions on how they should crawl and access a website. Below we list some of their most important functions, followed by a sample file that illustrates them. Read carefully!
- Control access to image files. One of the main functions of the robots.txt file is to prevent search engines from crawling the image files on your page and showing them in image search results. By restricting access to these images, you force users to visit your page in order to view them, which can lead to more traffic and potential conversions for your business. However, it's important to note that the robots.txt file doesn't prevent other websites from copying and sharing your image links.
- Control access to web pages. This allows you to block search engine access to pages that you consider add no value to your strategy. By preventing search engines from crawling and displaying these pages, you can save resources on your server and focus crawl activity on the pages that matter most. Keep in mind that users will still be able to access these pages if they have a direct link to them.
- Block access to resource files. The robots.txt file can also be used to block access to other files, such as scripts and stylesheets, that are not critical to the functioning of your site. This helps reduce the load on your servers and improve your site's loading speed. Use this feature with caution: blocking resource files that search engines need to render your pages can make it harder for them to understand your content and can negatively affect how your page is evaluated.
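As an illustration, a minimal robots.txt might combine these three functions. This is a sketch only; the paths below are hypothetical examples, not Shopify defaults:

```txt
# Applies to all crawlers
User-agent: *

# 1. Keep image files out of image search results (hypothetical path)
Disallow: /media/images/

# 2. Block a low-value page (hypothetical path)
Disallow: /internal-landing-page

# 3. Block a non-critical resource file (use with caution)
Disallow: /assets/legacy-script.js
```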
Steps to follow to examine the Robots.txt file
Using the robots.txt Tester tool, you can check which URLs are being restricted and which are not. Follow these steps to perform the check:
- Run the tool: Open your web browser and search for "robots.txt tester." Make sure you use a reputable and trusted tool.
- Enter the URL of the page to check: At the bottom of the tool page, you will find a field where you can enter the URL of the page you want to check. Enter the full URL, including the protocol (http:// or https://).
- Select the appropriate User-Agent: A User-Agent is a text string that identifies the crawler or browser that is accessing your website. The robots.txt Tester tool will let you select the user-agent you want to use for the test.
- Hit the "Test" button: Once you have entered the URL and selected the User-Agent, simply hit the "Test" button to start the check.
- Check the status of the "Test" button: After the tool has performed the check, the status of the "Test" button will change. If the URL is blocked by robots.txt, you will see "Blocked"; if the URL is not restricted, you will see "Accepted".
Using the robots.txt Tester tool, you can examine which URLs on your website are blocked or allowed by the robots.txt file. This will help you verify that search engines are accessing exactly the parts of your website you intend. Don't hesitate to use this tool to improve your website's performance!
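For instance, given the hypothetical rule below, the tester would report the first URL as blocked and the second as accepted:

```txt
User-agent: *
Disallow: /checkout

# https://example-store.com/checkout          -> Blocked
# https://example-store.com/products/t-shirt  -> Accepted
```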
How to edit the Robots.txt file in Shopify
If you're looking to edit the robots.txt.liquid file, we recommend working with a Shopify expert or someone who has experience in code editing and SEO.
You can use Liquid to add or remove directives in the robots.txt.liquid template. This allows Shopify to automatically keep the file up to date in the future. For a complete guide on how to edit this file, you can check out the Shopify developer page "Customizing robots.txt.liquid."
Before editing the robots.txt.liquid file, remove any existing customizations made through a third-party service such as Cloudflare. Here are the steps to follow:
- In your Shopify admin, click "Settings" and then "Apps and sales channels."
- From the "Apps and sales channels" page, click "Online store".
- Click on "Open sales channel".
- Click the options button (...) and then click "Edit code".
- Select "Add a new template" and choose "robots".
- Click "Create Template" .
- Make your desired changes to the default template. If you need more information about Liquid variables and common use cases, you can check out the Shopify developer page "Customizing robots.txt.liquid" .
- Save your changes to the robots.txt.liquid file in your published theme.
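The following is a minimal sketch of such an edit, based on the default template pattern shown in Shopify's "Customizing robots.txt.liquid" documentation. It keeps all of Shopify's default rules and adds one extra Disallow directive for all crawlers; the blocked path (/*?q=*, internal search URLs) is only an example:

```liquid
{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Example addition: block internal search result URLs for all crawlers {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?q=*' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Because the default groups, rules, and sitemap are still rendered from the robots object rather than hard-coded, Shopify can continue to update them automatically.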
It is important to note that the changes take effect immediately, but crawlers do not always react instantly. You can test your changes using Google's "robots.txt Tester".
Advantages of creating a Robots.txt file in Shopify
A good robots.txt file in Shopify will help improve search engine rankings by indicating which parts of your site should be crawled and which should not. This is especially effective if you have pages or sections that you don't want search engines to index.
By customizing Shopify's robots.txt.liquid file, you can specify which paths or directories should not be crawled, which can be useful for protecting sensitive information or avoiding duplicate content.
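For example, the rendered robots.txt of a store might end up containing rules like these (the paths are hypothetical):

```txt
User-agent: *
# Hypothetical: filtered collection URLs that duplicate the main collection page
Disallow: /collections/*?sort_by=*
# Hypothetical: a page with sensitive internal information
Disallow: /pages/wholesale-price-list
```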
Robots.txt FAQ in Shopify
How to interpret the Robots.txt file?
Interpreting the robots.txt file involves understanding that the "Disallow" and "Allow" directives indicate which parts of the website may or may not be crawled by search engines. The "User-agent" line specifies which crawler the rules that follow apply to, and comments and the "Sitemap" directive can provide additional information. Correctly interpreting the robots.txt file is very important for controlling how web crawlers access and index your site.
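As a short annotated example (with hypothetical paths):

```txt
# Lines starting with "#" are comments
User-agent: *              # these rules apply to all crawlers
Disallow: /checkout        # do not crawl anything under /checkout
Allow: /checkout/thanks    # ...except this specific page
Sitemap: https://example-store.com/sitemap.xml
```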