Technical SEO Wiki Entry
robots.txt
A plain-text file served at the site root (/robots.txt) that tells crawlers which URLs they may request.
robots.txt uses User-agent groups with Disallow and Allow directives to control which paths a crawler may fetch. It controls crawling, not indexing: a URL blocked in robots.txt can still appear in search results if other pages link to it. To keep a page out of the index, use a noindex directive instead, and note that noindex only works if the page remains crawlable. Sitemaps are typically declared with the Sitemap: directive. Wix exposes robots.txt editing through the SEO Tools dashboard, with guardrails to help prevent accidentally blocking critical resources.
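To see how these directives interact, here is a minimal sketch using Python's standard-library robots.txt parser. The file contents, domain, and paths below are hypothetical examples, not taken from any real site:

```python
# Minimal sketch: checking robots.txt rules with Python's standard library.
# The robots.txt contents and all paths below are hypothetical examples.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Allow: /private/public-page
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The Allow rule carves an exception out of the broader Disallow.
print(rp.can_fetch("*", "/private/secret"))       # False: disallowed
print(rp.can_fetch("*", "/private/public-page"))  # True: explicitly allowed
print(rp.can_fetch("*", "/blog/post"))            # True: no rule matches
```

A crawler that honors the protocol would skip /private/secret but still fetch the allowed exception; remember that this only restricts crawling, not indexing.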
Want to apply this in practice?
The Complete Wix SEO Course covers this topic and 67 others in step-by-step lessons designed for real Wix sites.
Explore the course