In this guide, we'll explain common terms you may see from Google about your site's SEO.
Note: Google Search Console is an advanced third-party service. Squarespace can support you with the verification process, but we're unable to offer general help on using Google Search Console. For more help, visit Google's documentation.
Mobile-first indexing enabled
If Google enables mobile-first indexing for your site, it means it's found that your site is well-designed for viewing on mobile devices, and it'll use the mobile site when indexing and ranking your site. Most visitors now access Google from a mobile device, so Google prioritizes the version of your site that works best for the largest number of potential visitors.
If you received an email about mobile-first indexing, it's good news, and you don't need to do anything or create a separate mobile site. Squarespace sites have built-in responsive design, so they're made to display beautifully across a range of devices. Even if a customer finds your site in search results on a computer, they'll see a version that's optimized for their larger screen.
Mobile Usability issue
Google Search Console sends Mobile Usability issue alerts when it identifies elements on your site that aren't optimized for mobile. For example, "Clickable elements too close together" or "Text too small to read."
We're unable to troubleshoot or investigate the source of these alerts, as they come from a third-party page analysis tool. To ensure your site is mobile-friendly, review our guides on responsive design and mobile style:
- How will my site appear on mobile devices?
- Tips for keeping your site mobile-friendly
- Responsive design
Parallel tracking
Parallel tracking is a Google Ads feature for advertisers who use click measurements.
Since Squarespace isn't a click measurement provider, we aren't able to help with parallel tracking. If you've seen a decrease in site traffic since October 30, 2018, when Google began requiring parallel tracking, we recommend reviewing Google's documentation about auto-tagging and working with your click measurement provider to fix the issue.
robots.txt error or index coverage issue
A robots.txt file tells a search engine which pages on your site it shouldn't crawl. All Squarespace sites use the same robots.txt file and Squarespace users can't access or edit the file. This helps us follow SEO best practices and keep your site Google-friendly.
When verifying your site with Google Search Console, you'll see a message showing that some URLs on your site are restricted by robots.txt. This is completely normal, and you can ignore the message. We ask Google not to crawl these pages because they're for internal use only or display duplicate content. For example, /config/ is your Admin login page, and /api/ blocks our Analytics tracking cookie.
We also ask Google not to crawl certain pages that organize content existing elsewhere on your site. You can view the complete list of excluded pages in the robots.txt file.
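For context, a robots.txt file is plain text made up of simple crawl rules. As an illustrative sketch (not Squarespace's actual file), a robots.txt that blocks the pages mentioned above might look like this:

```
# Illustrative sketch only — not Squarespace's actual robots.txt
User-agent: *
Disallow: /config/
Disallow: /api/
```

Each `Disallow` line tells crawlers that match `User-agent: *` (all of them) to skip that path.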
Structured data warnings
Structured data warnings mean Google is looking for specific markup code on your site and can't find it. For example, you may see the following warning for each product on your Squarespace Commerce site:
"Either 'offers', 'review', or 'aggregateRating' should be specified."
If you're seeing these warnings, your website and content will continue to be indexed by Google and will still be eligible for features like rich snippets.
Our engineers are always making improvements to our system and are currently monitoring the fields that Google requires. In the meantime, use Google’s Data Highlighter to tag information that should display in rich snippets.
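For reference, the "offers" property Google looks for is part of schema.org Product markup. A minimal, hypothetical JSON-LD snippet that would satisfy the warning might look like this (illustrative only; Squarespace generates its own markup automatically, and the product details here are placeholders):

```
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "offers": {
    "@type": "Offer",
    "price": "10.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```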
HTML improvements
HTML improvement errors don’t prevent visitors or search engines from accessing or indexing your site. Since Squarespace is designed to help you build any kind of site, you may see HTML improvement suggestions that don’t apply to you. You can ignore these.
To ensure your site is optimized, visit Increasing your site’s visibility to search engines. To learn more about HTML Improvements in Google Search Console, visit Google’s documentation.
URL crawl errors
The most common URL crawl error is "404 page could not be found." This can happen if you've removed a page or changed its URL slug without setting a 301 or 302 redirect. To fix this, set a 301 or 302 redirect for that page, then ask Google to index your site. You might need to wait some time for Google to process the request, then crawl and index the page. If the error still appears after a few weeks, or if you can't set up a redirect, contact us for help.
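On Squarespace, redirects are set in the URL Mappings panel. As a hedged example, assuming you renamed a hypothetical page from /old-page to /new-page, a permanent (301) redirect rule would look like:

```
/old-page -> /new-page 301
```

The slug on the left is the old URL visitors or Google may still request; the slug on the right is where they're sent instead.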
A page on your site can also return a soft 404 error if Google misinterprets text on that page as error messaging. If you experience this issue on an otherwise good page, check the page's content for phrases like:
- no longer available
- item not available
- not in stock
- does not exist
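To illustrate, scanning a page's text for these trigger phrases can be sketched as a small script. This is a hypothetical helper for your own review process, not a Squarespace or Google tool:

```python
# Phrases Google may misread as error messaging (from the list above).
SOFT_404_PHRASES = [
    "no longer available",
    "item not available",
    "not in stock",
    "does not exist",
]


def find_soft_404_phrases(page_text: str) -> list[str]:
    """Return any soft-404 trigger phrases found in the page text."""
    text = page_text.lower()
    return [phrase for phrase in SOFT_404_PHRASES if phrase in text]


print(find_soft_404_phrases("This item is no longer available."))
# → ['no longer available']
```

If the helper flags a phrase on a page that should stay indexed, consider rewording that page's copy.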
To learn more about crawl errors, visit Google's documentation.
Deceptive site warning
Google automatically flags sites for what it considers deceptive behavior, such as malware or spam. We can't control when a site is flagged or see why it happened.
To remove the warning, contact Google directly. There are a few steps that can help:
- Verify your site with Google Search Console.
- Remove any content from your site that Google might consider malicious. Visit Google's documentation to see what the different warnings mean and examples of malicious content.
- Follow Google's steps for requesting a review of your site.
Other third-party SEO tools
While we can offer some insight into understanding Google SEO emails and console errors, we're unable to troubleshoot or investigate results from other third-party page analysis tools like Semrush, Moz, or Ahrefs. While these tools can provide SEO suggestions and best practices, they're built to optimize search rankings for a site you designed and coded yourself. Their results can be misleading for Squarespace sites, which are built on a CMS platform.