
A New Squarespace SEO Problem Emerges [Fix Inside]

A tl;dr for those who want to skim: Squarespace just added a new page to your site for every tag you’ve ever used. This influx of typically hundreds of low-quality pages into the index is having a negative effect on every Squarespace site we’ve looked at. If you keep scrolling, we have a fix.

Squarespace SEO isn’t Great.

It felt good to get that out.

We’ve recently been digging into the technical issues of the Squarespace platform. Our short-term goal is to find every technical SEO issue on Squarespace and a way to fix each one for our clients. Long term, we’re hoping that raising these issues in the support forums and with Squarespace Help will push them to solve these on their end. So far… it’s been interesting, to say the least. The platform has sizable technical issues, and if you search the help forums, Squarespace’s responses tend to brush them off as meaningless or cast blame elsewhere.

That’s a lot of Squarespace Tag Pages

With Game of Thrones returning… this isn’t the kind of wall you want to see appear in your Search Console.

While researching the effects of Squarespace’s incorrect handling of its existing tag pages (a post for another day), we noticed new tag pages being indexed by Google. I’d crawled the same sites earlier that day and hadn’t seen those pages in the crawl. I quickly realized that these outcast tag pages are newcomers to the Squarespace sitemap, but they don’t appear anywhere on the site itself.



These look very much like standard WordPress tag pages, with a few key differences. First, these tag pages inherit the same page title and meta description as the main blog page. Second, they aren’t linked from anywhere in the templates I reviewed. Third, there is no way to hide them from search results.

Why is this an issue?

In the sites we checked, almost every one saw its number of indexed pages double or triple. You might be thinking, “More pages equals more opportunity for Google clicks!” Unfortunately, that couldn’t be further from the truth in 2019. The name of the game in SEO right now is tightly controlling the number of pages being indexed. You want to make sure that only content with an awesome user experience is included in search results. These tag pages are not an example of a great user experience. (They’re typically a bummer to click on!)

You want the ratio of quality pages to pages indexed to be as close to 1:1 as possible.

Did this affect rankings?

Sure looks like it. I’d noticed a handful of Squarespace projects taking dips in late March, even sites that I hadn’t touched in months, sites that are generally steady as a rock. This looked very similar to the infamous Yoast bug from last year, when a massive number of image attachment pages were indexed, causing similar index bloat and ranking drops.

Well, great… How do we fix it?

From what I can see, there is no Squarespace setting to handle tag archive pages. The best-case scenario would be the ability both to choose whether they’re included in search results and to choose whether they’re included in the sitemap. We can’t modify the sitemap, but we can control the indexation.

Add Google Tag Manager to your Site

Google Tag Manager is a great way to handle simple tags like your Google Analytics tracking script or a Facebook Pixel, but it is far more powerful than that. With a bit of JavaScript, you can use it to modify pretty much anything in the code of your site, which makes it a super useful tool for getting around platform limitations. Note that this will only work on Business plans, as you need to use Code Injection.
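As a purely hypothetical illustration of that flexibility (not part of the fix below), a Custom HTML tag could inject a canonical link pointing at the current page’s URL with its query string stripped:

<script>
  // Hypothetical illustration: add a canonical link for the current URL,
  // minus any query string
  var canonical = document.createElement('link');
  canonical.rel = 'canonical';
  canonical.href = location.origin + location.pathname;
  document.querySelector('head').appendChild(canonical);
</script>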

First, you’ll need to go to https://tagmanager.google.com and create an account, then select a web container. Google will then give you two snippets like the ones below.
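If you haven’t seen them before, the two snippets will look roughly like this, with your own container ID in place of GTM-XXXXXXX:

<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXXX');</script>
<!-- End Google Tag Manager -->

<!-- Google Tag Manager (noscript) -->
<noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-XXXXXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<!-- End Google Tag Manager (noscript) -->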

Code Injection

Squarespace doesn’t let you inject code directly after the opening <body> tag, but injecting the second snippet into the footer instead works great.

Next We Need to Create the Tag

This will be a simple, but very powerful, tag. It creates a meta tag named “robots” with the noindex directive. As soon as search engines see this tag, they remove the page from their results. You will definitely want to test and make sure it’s only being injected on pages you want to hide. Tag Manager has a great preview option that shows you which tags are or aren’t firing as you browse, which is very handy.

Select “New Tag”, then “Tag Configuration”, then “Custom HTML”, and paste the code snippet below.

Here is the code snippet:

<script>
  // Append <meta name="robots" content="noindex"> to the page's <head>
  document.querySelector('head').appendChild(Object.assign(document.createElement('meta'), {name: 'robots', content: 'noindex'}));
</script>

Then you need to create a trigger for the tag:
1. Click “Choose a trigger to make this tag fire…”
2. Click the “+” sign for a new trigger.
3. Click “Choose a trigger type to begin setup…”
4. Select Window Loaded, then “Some Window Loaded Events”.
5. Set the condition to Page Path contains /tag/.
6. Don’t forget to name the trigger (I used “tag pages”).

Then you simply make sure the trigger is associated with your noindex tag and hit “Preview”. This lets you click through the main pages of your site and confirm that the noindex tag shows up under “Tags That Did Not Fire”. To test it on a tag page, search Google for “site:yourdomain.com tag” to see a list of the tag pages that have been indexed. Click on one of those and you should see that the noindex tag has fired.

You can then use a browser extension like “Open SEO Stats” to confirm that the noindex tag is working properly. Pro tip: make sure you don’t have ad blocking turned on in your browser, as it can block Tag Manager from working.
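If you’d rather verify without an extension, a quick check in your browser’s developer console on a tag page works too (a minimal sketch):

// Paste into the DevTools console on a tag page; expect "noindex"
var robots = document.querySelector('meta[name="robots"]');
console.log(robots ? robots.content : 'no robots meta tag found');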

You can now publish the container. It’s best to add a description of what you’ve done when publishing, allowing you to easily look back at a later date.

Finally, you can submit a tag page to Search Console to make sure Google sees the noindex as well.


Once Tag Pages Are Noindexed

Now we can use the Google Search Console URL removal tool. It’s a lesser-known fact, but you can remove entire subdirectories (folders) of pages with this tool.
1. Go to the tool and select your property: https://www.google.com/webmasters/tools/url-removal
2. Find your tag folder, for most it will either be domain.com/blog/tag/ or domain.com/journal/tag/
3. Hit “Temporarily hide”, put that folder URL into the box, then hit “Continue”.
4. In the drop-down, select “Clear cache and temporarily hide all URLs beginning with…”
5. Click “Submit Request”

This tool is extremely quick. The requested URLs will be out of the index within minutes.

Mission Accomplished

Awesome. We learned about a FUN new Squarespace feature, added Google Tag Manager to our sites, and were able to clean up their little mess.
Between this fix and the tip we included in the video on fixing the built-in site, you should see a much healthier number of pages indexed for your site. If it were legal, I’d bet money that your rankings will improve after making these changes.

If this fix is too technical to handle on your own, contact us to inquire about pricing and availability. We’re building a package of technical SEO fixes that apply to all Squarespace sites.


Huge thanks to A Fist Full of Bolts for letting us use their site as an example.


15 thoughts on “A New Squarespace SEO Problem Emerges [Fix Inside]”

  1. Can I just say how completely grateful I am for people like you who have the passion and ability to master the confusing world of SEO and related technical issues?! Thank you for sharing such thorough, honest, and helpful information!

  2. Thanks for the info, all done on my site.

Out of interest, how long do you expect it to take to resolve?

My fix has been applied for roughly two weeks, and Search Console is still showing an extra 400+ tag pages.

  3. This is not a fix. It just creates a potentially bigger problem.

    Your site is still telling Google they are submitted, and the noindex bounces the bot, causing red errors for every page in the console instead of the amber ones.

    Serious issue now. I have emailed the author of this site but got no response. He needs to remove this article before he trashes more people’s sites with this garbage!

    1. Submitting pages in the sitemap that are marked noindex isn’t, in any way, a potentially larger problem.

      This is exactly what should be done if you’re in a situation where you have a large number of indexed pages that you want removed from the index.

      Google must crawl the pages, and then see the noindex tag, in order to remove them (unless you do a manual temporary hide in Search Console of the entire directory).

      This is how Yoast handled its attachment page bloat issues (creating a sitemap of all affected pages and keeping it submitted until all pages are deindexed).

    1. Fortunately, Squarespace listened to the backlash from this issue and came out with a feature to noindex tag pages.
      They should have had the feature all along, and they should have notified their customers that potentially hundreds or thousands of new indexable pages were being added to their sites.

    1. Definitely.

      If you’ve already done the Google Tag Manager fix, you should click the checkbox for noindexing all tag archive pages and either leave the GTM tag alone… or remove the trigger.
