XML sitemaps and robots.txt for Wix SEO
Module 2·Lesson 6 of 10·16 min read

XML sitemaps and robots.txt on Wix: what you need to know

Wix auto-generates your sitemap and robots.txt, but they are not always optimal. This lesson covers what Wix includes by default, how to customise them, and common mistakes that block important pages from Google.

What you will learn in this Wix SEO lesson

  • How Wix generates and updates the XML sitemap
  • Checking that your sitemap is accessible and accurate
  • What robots.txt does and Wix's default settings
  • How to exclude specific pages from your sitemap
  • Submitting your sitemap in Google Search Console

Your sitemap and robots.txt are the two files that directly communicate with search engines about what to crawl and index. Wix manages both automatically, but knowing what they contain and how to influence them is essential for advanced SEO.

Your Wix XML Sitemap

Wix automatically generates a sitemap at yourdomain.com/sitemap.xml and updates it whenever you add, remove, or publish pages. The sitemap includes all public pages, blog posts, and product pages (if you have a Wix Store). It is automatically split into subsitemaps if you have a large number of pages.
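If you want to inspect the sitemap programmatically rather than eyeballing it in a browser, the URLs can be pulled out with a few lines of standard-library Python. This is a minimal sketch: the XML below is an illustrative sample, not a real Wix sitemap, and a large Wix site may instead serve a sitemap index that links to sub-sitemaps (the same `<loc>` extraction works on those too).

```python
# Minimal sketch: extract every <loc> URL from a sitemap document.
# The sample XML is illustrative, not an actual Wix sitemap.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sample_sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/</loc></url>
  <url><loc>https://yourdomain.com/about</loc></url>
  <url><loc>https://yourdomain.com/blog/first-post</loc></url>
</urlset>"""

def sitemap_urls(xml_text: str) -> list[str]:
    """Return every <loc> value in a <urlset> or <sitemapindex> document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

print(sitemap_urls(sample_sitemap))
# ['https://yourdomain.com/', 'https://yourdomain.com/about',
#  'https://yourdomain.com/blog/first-post']
```

On a live site you would fetch yourdomain.com/sitemap.xml first and pass the response body to `sitemap_urls`.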

How to check and customise your Wix sitemap

  1. Visit yourdomain.com/sitemap.xml in a browser to see your sitemap
  2. Check that all your important pages are listed and that none that should be private are included
  3. To exclude a page from the sitemap, go to the page in the Wix editor, click Settings > SEO (Google), and toggle "Hide page from search results"
  4. Submit the sitemap URL in Google Search Console
  5. Check the Sitemaps report in GSC to see how many URLs were submitted and indexed
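Step 2 above is a simple set comparison, and scripting it makes the check repeatable. This sketch diffs the pages you expect against what the sitemap lists; both URL sets are placeholders for illustration.

```python
# Sketch of the "important pages present?" check as a set difference.
# Both URL sets below are placeholder data.
expected = {
    "https://yourdomain.com/",
    "https://yourdomain.com/services",
    "https://yourdomain.com/contact",
}
in_sitemap = {
    "https://yourdomain.com/",
    "https://yourdomain.com/contact",
    "https://yourdomain.com/thank-you",  # likely a candidate for hiding
}

missing = expected - in_sitemap      # important pages the sitemap omits
unexpected = in_sitemap - expected   # candidates for "Hide page from search results"

print("Missing:", sorted(missing))
print("Unexpected:", sorted(unexpected))
```

Anything in `missing` is worth checking for an accidental "hide from search results" toggle; anything in `unexpected` is worth hiding or removing.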

Your Wix robots.txt

Wix hosts your robots.txt at yourdomain.com/robots.txt. Unlike most SEO platforms, Wix does not allow you to directly edit this file. The default Wix robots.txt allows all bots to crawl all public pages, which is correct for most sites.
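Even though the file itself is not editable, you can still verify programmatically what it permits using the standard library's robots.txt parser. The file contents below mimic a permissive default (allow everything, plus a Sitemap directive); they are an illustration, not Wix's exact file.

```python
# Sketch: check crawler access against a robots.txt with urllib.robotparser.
# The robots.txt contents are a mock of a permissive default, not Wix's file.
from urllib import robotparser

robots_txt = """User-agent: *
Disallow:

Sitemap: https://yourdomain.com/sitemap.xml"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://yourdomain.com/services"))  # True
print(rp.site_maps())  # ['https://yourdomain.com/sitemap.xml']
```

`site_maps()` (Python 3.8+) also gives you a quick way to confirm the Sitemap directive is present, which step 6 of the guide below asks you to check by eye.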

Important

Because you cannot edit robots.txt on Wix, you cannot block specific bots or directories. If you need fine-grained control over crawler access, use the noindex meta tag on individual pages rather than robots.txt directives.
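Since per-page control on Wix comes from the noindex meta tag rather than robots.txt, it can help to confirm the tag is actually in a page's rendered HTML. This sketch scans markup for it with the standard-library HTML parser; the sample HTML is illustrative.

```python
# Sketch: detect a <meta name="robots" content="noindex"> tag in page HTML.
# The sample markup is illustrative.
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
finder = NoindexFinder()
finder.feed(html)
print(finder.noindex)  # True
```

Run the same check against the page source of anything you have hidden in the Wix editor to confirm the setting took effect.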


Complete How-To Guide: Checking and Optimising Your Wix Sitemap and Robots.txt

This guide walks you through verifying your Wix sitemap is accurate, checking your robots.txt for issues, and making sure Google can find and index every important page on your site.

Follow these steps to check and optimise your Wix sitemap and robots.txt

  1. Open your Wix sitemap by navigating to yoursite.com/sitemap.xml in a browser and verify it loads without errors
  2. Check that all your important pages (homepage, service pages, location pages, blog posts) are listed in the sitemap, noting any that are missing
  3. If a page is missing from your sitemap, check whether it has been accidentally set to Hide from search engines in the Wix editor's SEO (Google) panel
  4. To exclude unwanted pages from the sitemap (such as thank-you pages or test pages), go to the Wix editor, open the page, click SEO (Google) and toggle Hide from search engines to on
  5. Open your robots.txt file by navigating to yoursite.com/robots.txt in a browser and check that it does not block any important page paths
  6. Verify the robots.txt includes a Sitemap directive pointing to your sitemap URL (Wix adds this automatically, but confirm it is present)
  7. Log in to Google Search Console, go to Sitemaps and check that the status of your submitted sitemap shows as Success
  8. If you have not submitted your sitemap yet, enter yoursite.com/sitemap.xml in the Add a new sitemap field and click Submit
  9. Check the Coverage report in GSC for pages with Crawled but not indexed or Excluded by noindex status to identify pages being blocked unintentionally
  10. For Wix Blog sites, check that tag pages and empty category pages are not bloating your sitemap with thin content; if they are, set them to noindex
  11. After making any changes to page visibility settings, wait 24 hours, then re-check your sitemap to confirm the changes are reflected
  12. Use Google's URL Inspection tool to test individual pages that you want indexed, confirming they are allowed by robots.txt and included in the sitemap
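The final step boils down to two conditions per page: it is listed in the sitemap, and robots.txt allows it to be crawled. A self-contained sketch of that combined check, using placeholder data (on a live site you would fetch both files first):

```python
# Sketch of the step-12 check: a page is a good indexing candidate when it
# appears in the sitemap AND robots.txt allows crawling it. Sample data only.
from urllib import robotparser

sitemap_pages = {
    "https://yourdomain.com/",
    "https://yourdomain.com/services",
}

rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])  # mock of a permissive robots.txt

def indexable(url: str) -> bool:
    """True when the URL is listed in the sitemap and crawlable by Googlebot."""
    return url in sitemap_pages and rp.can_fetch("Googlebot", url)

print(indexable("https://yourdomain.com/services"))  # True
print(indexable("https://yourdomain.com/hidden"))    # False (not in sitemap)
```

This is a pre-filter, not a substitute for the URL Inspection tool, which also reports Google's actual index status for the page.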

Final Checkpoint

Your sitemap should contain only the pages you want Google to index, robots.txt should not block any important content, and the GSC Coverage report should show zero unexpected excluded pages. If your sitemap shows more URLs than you expect, identify and noindex the unnecessary pages.



No part of this Wix SEO Course content may be reproduced, copied, or distributed without the written consent of Michael Andrews.

This lesson on XML sitemaps and robots.txt on Wix, what you need to know is part of Module 2: How to Set Up Your Wix Site for Maximum SEO in The Most Comprehensive Complete Wix SEO Course in the World (2026 Edition). It covers Wix SEO optimization (US) and optimisation (UK) strategies applicable to businesses in the United Kingdom, United States, Australia, Canada, New Zealand, Ireland and worldwide. Created by Michael Andrews, the UK's No.1 Wix SEO Expert with 14 years of hands-on experience, 750+ completed Wix SEO projects and 425+ verified five-star reviews. This is lesson 13 of 561 in the most affordable, most comprehensive Wix SEO training programme available in 2026.