I love Squarespace. That may sound strange given the title of this post, but it is true. Out of all the website builders out there — Site 123, Ucraft, Webflow, Webnode, Wix etc. — Squarespace is my favorite one to build a site on.
Alas, it just sucks when it comes to search engine optimization. And it breaks my heart.
While it has come a long way since its inception, there are still a few (but major) areas where Squarespace falls short, and there don’t seem to be any plans to address them any time soon.
Let’s explore what they are:
1. Canonicalization

When there are two or more versions of a site, it’s not ideal for search engines, websites, or their users. This faux pas tends to emerge when canonicalization isn’t established.
For example, the URL of a page can exist in many forms:

http://example.com/page
http://www.example.com/page
https://example.com/page
https://www.example.com/page/
Whether there’s a WWW or a trailing slash, all URLs should redirect to a single version and nothing more. Unfortunately, this isn’t the case with Squarespace. A case can be made for relying on the rel=canonical tag instead (as Squarespace does), but sadly, many SEO tools don’t recognize these tags and will continually alert site owners to duplicate content issues when in reality there are none.
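For reference, a canonical tag is a single line placed in a page’s head. A hypothetical example, using example.com as a placeholder domain:

```html
<!-- Placed in the <head> of every duplicate version of the page -->
<link rel="canonical" href="https://www.example.com/page/" />
```

Every variant of the page — non-WWW, trailing slash, and so on — should carry a tag pointing at the single preferred URL.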
At the very least, it would be great for Squarespace to allow users the option to decide which version they want canonicalized, as opposed to a default choice.
In this instance, while it won’t cause major visibility issues, the lack of canonical control leads to frustration for analysts and webmasters alike.
2. Robots.txt & Sitemap.xml
Robots.txt is a file which is used to restrict and control the behavior of all types of search engine robots and other third-party crawlers.
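As an illustration, a typical hand-edited robots.txt might look like the following (the paths and domain are placeholders, not Squarespace’s actual defaults):

```
User-agent: *
Disallow: /search/
Disallow: /config/

Sitemap: https://www.example.com/sitemap.xml
```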
Sitemap.xml is a file which lists the pages of a website for search engine robots to discover and crawl.
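A minimal sitemap.xml, again with placeholder URLs and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/page/</loc>
  </url>
</urlset>
```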
Both files help search engines understand and process a website with greater ease and accuracy. Common sense suggests that a website owner should have full control over what they do and do not want search engines to crawl and index.
Regrettably, not with Squarespace.
It is not possible to edit your site’s robots.txt file, nor your sitemap.xml. This is a tragedy, considering the sheer number of scenarios where modifications are necessary. Both files are automatically generated, and while there is no clear explanation as to why, this quickly becomes problematic for those of us who want to optimize our sites for full visibility and crawl budget.
3. Structured Data
Structured data is simply a standardized format of providing information about a page and classifying content to help search engines better understand what’s on the page. Schema is a collaborative guideline for structured data and is used by many website owners to structure their website information for search engines.
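To illustrate, structured data is commonly added as a JSON-LD block in a page’s markup; here is a minimal Article example (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2021-01-01"
}
</script>
```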
Unfortunately, Squarespace hinders the use of structured data. This is due to pre-existing schema markup that is inserted into the code of every site, whether you have opted for it or not. This in itself is not an issue; however, the built-in markup does not make use of the full range of features and attributes available.
If you generate schema markup and add it to your site, Squarespace will not replace the pre-existing schema with your own; it will only add to it, which tends to make things more convoluted (especially if it’s the same schema type!).
In most cases, when using Squarespace, it’s best not to add any structured data at all in order to avoid complications.
While we hold out hope that Squarespace will eventually evolve to the point where these obstacles are resolved and full control is native to the platform, for now we have to turn to alternative content management systems and live to optimize another day.
Was this helpful? Let us know!
Sebastian Hovv is the CEO & founder of SEO 101 and the author of the book, The Little White Book of SEO. With over 13 years of experience in SEO, he has become an expert in the field and a contributor to the world of online marketing education through the online courses and learning material SEO 101 provides.