Robots.txt file – managing the default file
What We Do by Default
Every site hosted on Staq comes with a pre-configured robots.txt file designed to help protect sensitive areas of your WordPress site while still allowing essential assets (like CSS and JS) to be crawled.
Here’s the default content:
# Default WPStaq robots.txt for WordPress hosting
# WARNING: This file is automatically managed by WPStaq and will be overwritten on updates.
# To customize: Remove the "Default WPStaq robots.txt" comment above to make this file user-managed.
User-agent: *
Disallow: /calendar/action*
Disallow: /events/action*
Disallow: /cdn-cgi*
Allow: /*.css
Allow: /*.js
Disallow: /*?
Crawl-delay: 10
Why This Matters
- Blocks dynamic calendar and event actions from being crawled.
- Allows static assets like CSS and JS files to be crawled by search engines.
- Disallows parameter-based URLs (e.g., ?ref=, ?utm=) to avoid wasting crawl budget.
- Sets a crawl delay of 10 seconds to reduce load on your site from aggressive bots.
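To see how these wildcard rules behave in practice, here is a minimal sketch (not Staq code) that converts Google-style robots.txt patterns, where * matches any run of characters and a pattern matches as a prefix of the URL path, into regular expressions and checks a few example paths:

```python
import re

# Rough sketch of Google-style robots.txt wildcard matching:
# "*" matches any sequence of characters, and a pattern without a
# trailing "*" still matches any path that starts with it.
def pattern_to_regex(pattern: str) -> re.Pattern:
    # Escape regex metacharacters, then turn the escaped "*" back into ".*".
    escaped = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + escaped)

def matches(pattern: str, path: str) -> bool:
    return pattern_to_regex(pattern).match(path) is not None

# Checking rules from the default file above:
print(matches("/*?", "/page?ref=abc"))           # True  - parameter URL, blocked
print(matches("/*?", "/about/"))                 # False - no query string
print(matches("/calendar/action*", "/calendar/action?ics=1"))  # True
print(matches("/*.css", "/wp-content/themes/foo/style.css"))   # True
```

This is only an illustration of the matching convention; real crawlers apply their own precedence rules between Allow and Disallow (typically the longest matching rule wins).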
How to Override the robots.txt File
If you want to customize the robots.txt rules to suit your SEO strategy or specific site setup, you can easily take control of the file.
Step 1: Connect via sFTP
To access the file directly, connect to your website’s server using sFTP.
👉 Follow our sFTP guide here:
How to Access Your Site via sFTP
Step 2: Locate the File
Once connected:
- Navigate to the root directory of your WordPress site (inside the wordpress folder).
- Look for the file named robots.txt.
Step 3: Take Ownership
By default, the file is managed by WPStaq and will be overwritten during system updates.
To make it user-managed:
- Edit the file using a text/code editor (via sFTP).
- Delete the comment # Default WPStaq robots.txt for WordPress hosting at the top of the file.
Once that line is removed, Staq will stop overwriting the file during updates.
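For example, a user-managed copy of the default file, with the WPStaq comments removed for clarity, would look like this (the rules themselves are unchanged):

```
User-agent: *
Disallow: /calendar/action*
Disallow: /events/action*
Disallow: /cdn-cgi*
Allow: /*.css
Allow: /*.js
Disallow: /*?
Crawl-delay: 10
```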
Customizing Your Rules
After taking control of the file, you can add, remove, or adjust rules to match your SEO strategy.
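As an illustration only (adjust to your own site), a customized file might block the WordPress admin area while keeping admin-ajax.php crawlable, a common WordPress pattern, and point crawlers at a sitemap. The sitemap URL below is a placeholder:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Allow: /*.css
Allow: /*.js
Disallow: /*?

# Placeholder URL - replace with your site's actual sitemap
Sitemap: https://example.com/sitemap.xml
```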
About the Crawl-delay Rule
By default, Staq includes a Crawl-delay: 10 directive in your robots.txt file. This rule helps reduce server load from aggressive bots that don’t pace their requests.
Why We Add Crawl-delay
- Protects server resources – prevents bots from overwhelming your PHP workers with too many simultaneous requests.
- Controls bandwidth usage – reduces spikes that could eat into your monthly bandwidth.
- Safe for SEO – major search engines like Google ignore Crawl-delay, so it won’t slow down your indexing.
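Once you have taken ownership of the file, you can lower the delay or scope it to specific crawlers that honor the directive (Google ignores it either way). A sketch, with AhrefsBot used purely as an example of a bot that documents support for Crawl-delay:

```
# Slow down one specific bot (example bot name)
User-agent: AhrefsBot
Crawl-delay: 30

# A shorter delay for everyone else
User-agent: *
Crawl-delay: 5
```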
FAQ
- Does this affect Google? – No. Google and most major search engines ignore the Crawl-delay directive.
- What if I want to manage my own robots.txt file? – Connect via sFTP as described above and delete the comment # Default WPStaq robots.txt for WordPress hosting at the top of the file. Once that line is removed, Staq will stop overwriting the file during updates.
- Can I change the Crawl-delay? – Sure. Connect via sFTP, remove the comment # Default WPStaq robots.txt for WordPress hosting at the top of the file, and then set your own Crawl-delay value.
Need some help?
We all do sometimes. Please reach out to our support team by opening a support ticket, and we’ll respond quickly.