Wix AI bot blocking: controlling AI crawler access via the Robots.txt Editor

Module 16: Wix Native SEO Tools & AI Visibility | Lesson 192 of 687 | 28 min read

By Michael Andrews, Wix SEO Expert UK

Wix gives you native control over which AI crawlers can access your site through the built-in Robots.txt Editor. With the explosion of AI crawlers in 2026, managing this access has become a critical part of your SEO strategy. This lesson walks you through Wix's specific tools for blocking or allowing AI bots, monitoring their activity, and making strategic decisions about AI access to your content.

The Wix Robots.txt Editor for AI Bot Management

Accessing and configuring the editor

In the Wix dashboard, the Robots.txt Editor sits alongside the other native SEO tools (at the time of writing, under Marketing & SEO → SEO Tools, though Wix reorganises its menus from time to time). Changes you save there are published straight to yourdomain.com/robots.txt, so review every directive before saving: a stray "User-agent: *" followed by "Disallow: /" would shut out search engine crawlers as well as AI bots.

Which AI Bots to Block and Which to Allow

Not all AI crawlers serve the same purpose. Some crawl exclusively for model training (taking your content to improve their AI), while others crawl to provide real-time answers that cite your website. Blocking training-only crawlers protects your content from being absorbed into AI models. Allowing retrieval crawlers means your business appears when potential customers ask AI tools for recommendations.

# Block AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Bytespider
Disallow: /

# Allow AI retrieval crawlers (so your content appears in AI answers)
User-agent: ChatGPT-User
Allow: /

User-agent: PerplexityBot
Allow: /
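Before publishing a rule set like the one above, you can sanity-check it locally with Python's standard-library robots.txt parser. This is a quick offline check, not a Wix feature; the sample below covers only a subset of the crawlers listed in this lesson:

```python
from urllib import robotparser

# A subset of this lesson's training-vs-retrieval rules, held as a
# string so we can test it before publishing via the Wix editor.
RULES = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: ChatGPT-User
Allow: /

User-agent: PerplexityBot
Allow: /
"""

def is_allowed(agent: str, path: str) -> bool:
    """Return True if `agent` may fetch `path` under RULES."""
    rp = robotparser.RobotFileParser()
    rp.parse(RULES.splitlines())
    return rp.can_fetch(agent, path)

print(is_allowed("GPTBot", "/services/"))        # training crawler: False
print(is_allowed("ChatGPT-User", "/services/"))  # retrieval crawler: True
```

If the two print statements do not come out as blocked-then-allowed, a directive is mistyped and worth fixing before it goes live.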

Monitoring AI Bot Activity After Changes

After configuring your robots.txt, monitor the Wix Bot Traffic dashboards to verify that blocked crawlers have stopped visiting and allowed crawlers continue. It may take 1-2 weeks for all AI crawlers to respect new robots.txt directives, as they revisit the file on their own schedule. Some crawlers are better at respecting robots.txt than others, and a small number may ignore it entirely.
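Alongside the dashboards, you can spot-check the published file itself by running its text through a parser and reporting each crawler's status. A minimal sketch, assuming you have pasted in the contents of your live robots.txt (the sample rules and crawler list below are illustrative):

```python
from urllib import robotparser

# Paste the contents of https://yourdomain.com/robots.txt here;
# this sample covers only a few of the lesson's crawlers.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: PerplexityBot
Allow: /
"""

# Crawlers worth auditing; extend the list as new AI bots appear.
AI_CRAWLERS = ["GPTBot", "Google-Extended", "PerplexityBot", "CCBot"]

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

status = {bot: ("allowed" if rp.can_fetch(bot, "/") else "blocked")
          for bot in AI_CRAWLERS}
for bot, state in status.items():
    print(f"{bot}: {state}")
```

Note that CCBot reports "allowed" here even though it never appears in the file: any crawler you forget to list falls through to the default, which permits crawling. That is exactly the kind of gap a periodic audit like this catches.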

Important Limitation: Robots.txt is a voluntary protocol. Well-known AI companies generally respect it, but smaller or less scrupulous crawlers may ignore your directives. There is no technical enforcement mechanism beyond server-level IP blocking, which Wix does not support. However, major AI companies (OpenAI, Anthropic, Google) have publicly committed to respecting robots.txt.

Selective Blocking: Protecting Specific Content

Instead of blocking AI crawlers site-wide, you can protect only specific directories or page types. For example, if you have a proprietary research section but want your service pages discoverable by AI, you can allow AI crawlers on most of your site while blocking specific paths.

User-agent: GPTBot
Disallow: /blog/research/
Disallow: /premium-content/
Allow: /services/
Allow: /about/
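Path-level rules are easier to get wrong than site-wide blocks, so it is worth confirming they behave as intended before publishing. A quick local check of the group above, using Python's standard-library parser (the paths are the lesson's examples; swap in your own):

```python
from urllib import robotparser

# The selective-blocking group from this lesson, verified locally
# before saving it in the Wix Robots.txt Editor.
SELECTIVE_RULES = """\
User-agent: GPTBot
Disallow: /blog/research/
Disallow: /premium-content/
Allow: /services/
Allow: /about/
"""

rp = robotparser.RobotFileParser()
rp.parse(SELECTIVE_RULES.splitlines())

print(rp.can_fetch("GPTBot", "/services/wix-seo"))    # True  - discoverable
print(rp.can_fetch("GPTBot", "/blog/research/2026"))  # False - protected
print(rp.can_fetch("GPTBot", "/contact/"))            # True  - not disallowed
```

The third check is the one that surprises people: paths you never mention stay crawlable, so a selective strategy only protects the directories you explicitly disallow.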

Strategic Recommendation: Review your AI bot blocking strategy every quarter. The AI landscape is evolving rapidly, and what makes sense today may need adjustment as new crawlers emerge and AI search tools become more important for driving business. Check the Wix Bot Traffic dashboards monthly to stay informed about which AI systems are accessing your content.


This lesson on Wix AI bot blocking: controlling AI crawler access via the Robots.txt Editor is part of Module 16: Wix Native SEO Tools & AI Visibility in The Most Comprehensive Complete Wix SEO Course in the World (2026 Edition). Created by Michael Andrews, the UK's No.1 Wix SEO Expert with 14 years of hands-on experience, 750+ completed Wix SEO projects and 425+ verified five-star reviews.