
How to Block AI Bots on IIS: Complete 2026 Guide

Internet Information Services (IIS) is Microsoft's web server, widely used for ASP.NET, .NET Core, PHP, and static site hosting on Windows Server. Bot blocking in IIS uses web.config — the XML configuration file that controls IIS behaviour per-site and per-directory. This guide covers URL Rewrite rules for User-Agent blocking, HTTP response headers, robots.txt, rate limiting, and Application Request Routing.

Prerequisites — URL Rewrite module

IIS does not include the URL Rewrite module by default. Install it before adding rewrite rules: without it, a <rewrite> section in web.config causes HTTP 500.19 (a configuration error for the unrecognized section).

Install URL Rewrite module

# Option 1: Web Platform Installer (GUI)
# Open IIS Manager → Web Platform Installer → search "URL Rewrite" → Install

# Option 2: Direct download
# https://www.iis.net/downloads/microsoft/url-rewrite
# Run the MSI installer

# Option 3: PowerShell (requires Web Platform Installer CLI)
webpicmd /Install /Products:"UrlRewrite2"

# Verify installation
Get-WebConfigurationProperty -pspath 'MACHINE/WEBROOT' -filter "system.webServer/rewrite" -name "." 2>&1
IIS Express (development): IIS Express ships with the URL Rewrite module built in, so the same rules in web.config work for local development and are picked up automatically; no separate install is needed.

URL Rewrite rules for UA blocking

The URL Rewrite module matches request properties (URL, headers, server variables) and applies actions. For bot blocking, match the HTTP_USER_AGENT server variable and abort or return 403.

web.config — AbortRequest (fastest, closes connection)

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Block AI Bots" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add
              input="{HTTP_USER_AGENT}"
              pattern="(GPTBot|ClaudeBot|anthropic-ai|CCBot|Google-Extended|AhrefsBot|Bytespider|Amazonbot|Diffbot|FacebookBot|cohere-ai|PerplexityBot|YouBot)"
              ignoreCase="true"
            />
          </conditions>
          <action type="AbortRequest" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

web.config — CustomResponse 403 (preferred)

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Block AI Bots" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add
              input="{HTTP_USER_AGENT}"
              pattern="(GPTBot|ClaudeBot|anthropic-ai|CCBot|Google-Extended|AhrefsBot|Bytespider|Amazonbot|Diffbot|FacebookBot|cohere-ai|PerplexityBot|YouBot)"
              ignoreCase="true"
            />
          </conditions>
          <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Access denied" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
AbortRequest vs CustomResponse: AbortRequest immediately closes the TCP connection — most efficient, no response body. Some HTTP clients may log connection errors. CustomResponse sends a proper HTTP 403 — cleaner signal for crawlers that respect HTTP status codes. For bot blocking, CustomResponse with statusCode="403" is generally preferred.
stopProcessing="true": Always include this on bot-blocking rules. Without it, IIS continues evaluating subsequent rewrite rules after matching — the request may still be processed. stopProcessing="true" halts the rewrite pipeline immediately when the rule matches.
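Before deploying, you can sanity-check the blocklist regex outside IIS. The sketch below uses grep -Ei, which mirrors the rule's case-insensitive regex matching; the User-Agent strings are illustrative examples, not captured traffic:

```shell
# The same blocklist pattern used in the rewrite rule's condition
PATTERN='(GPTBot|ClaudeBot|anthropic-ai|CCBot|Google-Extended|AhrefsBot|Bytespider|Amazonbot|Diffbot|FacebookBot|cohere-ai|PerplexityBot|YouBot)'

check_ua() {
  # Prints "blocked" when the UA matches the pattern, "allowed" otherwise
  if printf '%s' "$1" | grep -Eiq "$PATTERN"; then
    echo "blocked"
  else
    echo "allowed"
  fi
}

check_ua "Mozilla/5.0 (compatible; GPTBot/1.2; +https://openai.com/gptbot)"          # blocked
check_ua "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0 Safari/537.36"      # allowed
check_ua "mozilla/5.0 compatible; claudebot/1.0"                                     # blocked (case-insensitive)
```

If a new bot name is added to the web.config pattern, re-run the same check with that name to confirm the alternation is well-formed before restarting the site.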

URL Rewrite condition flags

ignoreCase="true": Case-insensitive pattern matching
negate="true": Inverts the match (the condition is true when the pattern does NOT match)
matchType="Pattern" (default): Regex match against the input
matchType="IsFile": True if the input resolves to an existing file
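As an illustration of negate in practice, the condition below flips the logic into an allow-list: any request whose User-Agent does NOT match the pattern is rejected. The browser pattern here is a hypothetical example; an allow-list this strict will also block legitimate tools, so treat it as a sketch of the attribute rather than a recommended policy:

```xml
<rule name="Allow-list browsers only" stopProcessing="true">
  <match url=".*" />
  <conditions>
    <!-- negate="true": the condition is true when the UA does NOT match the pattern -->
    <add
      input="{HTTP_USER_AGENT}"
      pattern="(Chrome|Firefox|Safari|Edg)"
      ignoreCase="true"
      negate="true"
    />
  </conditions>
  <action type="CustomResponse" statusCode="403" statusReason="Forbidden" />
</rule>
```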

X-Robots-Tag via customHeaders

Add custom HTTP response headers in web.config under system.webServer/httpProtocol/customHeaders. No additional modules required — this is built into IIS:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="X-Robots-Tag" value="noai, noimageai" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>

This applies X-Robots-Tag to all responses from the site, including static files, ASP.NET responses, and error pages.

Remove existing X-Robots-Tag before adding (prevent duplicates)

<httpProtocol>
  <customHeaders>
    <remove name="X-Robots-Tag" />
    <add name="X-Robots-Tag" value="noai, noimageai" />
  </customHeaders>
</httpProtocol>
Inheritance: customHeaders in a site-level web.config apply to the entire site. Headers in a subdirectory web.config apply to that directory only. Child configs inherit from parent unless overridden with <remove>.
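Building on that inheritance model, a <location> element in the site-level web.config can scope a stricter header to one path without maintaining a second config file. The /images path and the header split below are illustrative assumptions, not a required layout:

```xml
<configuration>
  <!-- Site-wide default -->
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="X-Robots-Tag" value="noai" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>

  <!-- /images only: remove the inherited header, then add the stricter value -->
  <location path="images">
    <system.webServer>
      <httpProtocol>
        <customHeaders>
          <remove name="X-Robots-Tag" />
          <add name="X-Robots-Tag" value="noai, noimageai" />
        </customHeaders>
      </httpProtocol>
    </system.webServer>
  </location>
</configuration>
```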

robots.txt as a static file

Place robots.txt in the site's root directory (the physical path configured in IIS Manager). IIS serves static files from the root by default — no additional configuration needed.

User-agent: *
Allow: /

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: Diffbot
Disallow: /

User-agent: FacebookBot
Disallow: /

User-agent: cohere-ai
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: YouBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
MIME type for robots.txt: IIS refuses to serve static files whose extension has no registered MIME type (HTTP 404.3). The .txt mapping is present in the default configuration, but hardened setups sometimes clear the staticContent list; if robots.txt unexpectedly returns 404, re-add the mapping explicitly in web.config:
<system.webServer>
  <staticContent>
    <mimeMap fileExtension=".txt" mimeType="text/plain" />
  </staticContent>
</system.webServer>

Dynamic IP Restrictions

Dynamic IP Restrictions provides rate limiting and concurrent-request limiting at the IIS level, with no URL Rewrite dependency. On IIS 8 and later (Windows Server 2012+) it is built in as part of the IP and Domain Restrictions role service; on IIS 7.x it is a free extension from Microsoft:

<system.webServer>
  <security>
    <dynamicIpSecurity
      denyAction="Forbidden"
      enableLoggingOnDenial="true">
      <!-- Block IPs making more than 50 requests in 1 second -->
      <denyByRequestRate
        enabled="true"
        maxRequests="50"
        requestIntervalInMilliseconds="1000" />
      <!-- Block IPs with more than 20 concurrent requests -->
      <denyConcurrentRequests
        enabled="true"
        maxConcurrentRequests="20" />
    </dynamicIpSecurity>
  </security>
</system.webServer>
Enable Dynamic IP Restrictions: Not enabled by default, even on Windows Server. On Windows Server 2012 and later, add the IP and Domain Restrictions role service (Install-WindowsFeature Web-IP-Security). On IIS 7.x, install the Dynamic IP Restrictions extension via Web Platform Installer or the IIS website.

Application Request Routing (ARR)

If IIS is acting as a reverse proxy with ARR (Application Request Routing), add the bot-blocking rule to the rewrite rules on the proxy server itself (in the site-level web.config, or as global rules in applicationHost.config) so blocked bots never reach the backend. URL Rewrite evaluates rules top to bottom, so the blocking rule must come before the proxy rule:

<!-- web.config on the ARR proxy server -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Block AI bots at the proxy level (before forwarding to backend) -->
        <rule name="Block AI Bots at Proxy" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add
              input="{HTTP_USER_AGENT}"
              pattern="(GPTBot|ClaudeBot|anthropic-ai|CCBot|Google-Extended|AhrefsBot|Bytespider|Amazonbot|Diffbot|FacebookBot|cohere-ai|PerplexityBot|YouBot)"
              ignoreCase="true"
            />
          </conditions>
          <action type="CustomResponse" statusCode="403" statusReason="Forbidden" />
        </rule>

        <!-- ARR reverse proxy rule (only runs if bot blocking didn't match) -->
        <rule name="ReverseProxy" stopProcessing="true">
          <match url="(.*)" />
          <action type="Rewrite" url="http://backend-server/{R:1}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
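One related detail when fronting a farm with ARR: the backend only ever sees the proxy's IP, so analyzing bot traffic in backend logs requires forwarding the original client address. ARR can do this via its "Preserve client IP" proxy setting; the sketch below shows the manual equivalent with a server variable. Note that HTTP_X_FORWARDED_FOR must first be unlocked under allowedServerVariables, which typically has to happen at the server (applicationHost.config) level because the section is locked by default:

```xml
<!-- On the ARR proxy: unlock the variable, then set it in the proxy rule -->
<rewrite>
  <allowedServerVariables>
    <add name="HTTP_X_FORWARDED_FOR" />
  </allowedServerVariables>
  <rules>
    <rule name="ReverseProxy" stopProcessing="true">
      <match url="(.*)" />
      <serverVariables>
        <!-- Pass the real client IP through to the backend -->
        <set name="HTTP_X_FORWARDED_FOR" value="{REMOTE_ADDR}" />
      </serverVariables>
      <action type="Rewrite" url="http://backend-server/{R:1}" />
    </rule>
  </rules>
</rewrite>
```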

Full web.config example

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>

    <!-- Block AI bots by User-Agent (requires URL Rewrite module) -->
    <rewrite>
      <rules>
        <rule name="Block AI Bots" stopProcessing="true">
          <match url=".*" />
          <conditions logicalGrouping="MatchAny">
            <add
              input="{HTTP_USER_AGENT}"
              pattern="(GPTBot|ClaudeBot|anthropic-ai|CCBot|Google-Extended|AhrefsBot|Bytespider|Amazonbot|Diffbot|FacebookBot|cohere-ai|PerplexityBot|YouBot)"
              ignoreCase="true"
            />
          </conditions>
          <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="AI crawlers are not permitted" />
        </rule>
      </rules>
    </rewrite>

    <!-- X-Robots-Tag and security headers -->
    <httpProtocol>
      <customHeaders>
        <remove name="X-Powered-By" />
        <add name="X-Robots-Tag" value="noai, noimageai" />
        <add name="X-Content-Type-Options" value="nosniff" />
        <add name="X-Frame-Options" value="SAMEORIGIN" />
        <add name="Referrer-Policy" value="strict-origin-when-cross-origin" />
      </customHeaders>
    </httpProtocol>

    <!-- MIME types (ensure robots.txt and .txt files are served) -->
    <staticContent>
      <mimeMap fileExtension=".txt" mimeType="text/plain" />
    </staticContent>

    <!-- Rate limiting (requires Dynamic IP Restrictions extension) -->
    <security>
      <dynamicIpSecurity denyAction="Forbidden" enableLoggingOnDenial="true">
        <denyByRequestRate enabled="true" maxRequests="100" requestIntervalInMilliseconds="10000" />
        <denyConcurrentRequests enabled="true" maxConcurrentRequests="30" />
      </dynamicIpSecurity>
    </security>

    <!-- Default document -->
    <defaultDocument>
      <files>
        <clear />
        <add value="index.html" />
        <add value="index.aspx" />
      </files>
    </defaultDocument>

    <!-- Custom error pages -->
    <httpErrors errorMode="Custom" existingResponse="Replace">
      <remove statusCode="403" />
      <!-- responseMode="File" takes a file path relative to the site root, not a URL -->
      <error statusCode="403" path="errors\403.html" responseMode="File" />
    </httpErrors>

  </system.webServer>
</configuration>

IIS Manager (GUI alternative)

All of the above can also be configured through IIS Manager without editing web.config directly:

Add X-Robots-Tag via GUI

  1. Open IIS Manager → select your site
  2. Double-click HTTP Response Headers
  3. Click Add in the Actions panel
  4. Name: X-Robots-Tag, Value: noai, noimageai
  5. Click OK — change takes effect immediately

Add URL Rewrite rule via GUI

  1. Open IIS Manager → select your site
  2. Double-click URL Rewrite (requires module installed)
  3. Click Add Rule(s)… → Blank Rule
  4. Name: Block AI Bots
  5. Match URL: Pattern = .*, Using = Regular Expressions
  6. Conditions: Add → Input: {HTTP_USER_AGENT}, Pattern: bot regex, Ignore case: checked
  7. Action type: Custom Response, Status code: 403
  8. Apply → Back to Rules → rule is now active
GUI changes write to web.config: All changes made in IIS Manager are saved to web.config (or applicationHost.config for server-level settings). You can copy the resulting XML to other servers for consistent deployments.

FAQ

How do I block AI bots by User-Agent in IIS?

Use the URL Rewrite module (must be installed separately) in web.config. Add a rule with a condition matching {HTTP_USER_AGENT} against a regex of bot names, then set action to CustomResponse with statusCode="403". Include stopProcessing="true" on the rule.

Do I need to install anything extra on IIS to block bots?

Yes — the URL Rewrite module (free, from Microsoft) is required for <rewrite> rules in web.config. Without it, IIS returns a 500 error if the rewrite section is present. Download from the IIS website or install via Web Platform Installer.

How do I add X-Robots-Tag in IIS?

Add it under system.webServer/httpProtocol/customHeaders in web.config: <add name="X-Robots-Tag" value="noai, noimageai" />. No additional modules required — this is built into IIS. Use <remove> before <add> to prevent duplicate headers.

Where do I place robots.txt on an IIS site?

In the site's root physical directory. IIS serves static files from the root by default. If robots.txt returns 404, add its MIME type: <mimeMap fileExtension=".txt" mimeType="text/plain" /> under system.webServer/staticContent.

What is the difference between AbortRequest and CustomResponse?

AbortRequest closes the TCP connection immediately — most efficient, no response sent. CustomResponse sends a proper HTTP response with your chosen status code. For bot blocking, CustomResponse with 403 is generally preferred as it sends a clean signal that bots that respect status codes can act on.

Can I use web.config rules without the URL Rewrite module?

No. The <rewrite> section requires the URL Rewrite module. Without it, IIS returns a 500 error. Install via Web Platform Installer or the IIS download page before adding rewrite rules.
