
How to Block AI Bots on Vue.js: Complete 2026 Guide

Vue.js SPAs have a quirk that changes the blocking picture entirely: bots that don't run JavaScript see only an empty <div id="app"></div> shell, so your content is invisible to them. But the same quirk cuts the other way: noai meta tags injected via useHead() never reach those bots either. The reliable approach is the server layer: nginx, Express, or your CDN's response-header config.

8 min read·Updated April 2026·Vue 3 / Vite / @vueuse/head

Vue SPA vs Nuxt.js: This guide is for standalone Vue SPAs (Vite + Vue 3, no SSR). If you're using Nuxt.js, your pages are server-rendered and bots see full HTML — see the Nuxt.js guide instead. For Vue SPAs: your content is invisible to non-JS bots by default, but your noai meta tags are too. Use server-level headers.

Methods overview

Method | When to use
robots.txt in Vite public/ | Always — first thing to add
noai meta tag via @vueuse/head useHead() | As a belt-and-suspenders addition (not sufficient alone)
X-Robots-Tag response header | Best approach for SPAs — set in nginx/Express
nginx hard block (403) | Self-hosted or VPS deployments
Express server middleware block | When serving Vue dist/ via Express
Netlify / Vercel headers config | Netlify or Vercel static deployments

1. robots.txt in Vite public/

Vite copies everything in public/ directly to dist/ on build. Put your robots.txt here — it becomes /robots.txt in production without any route configuration.

public/robots.txt (Vite project root → public/)
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: OAI-SearchBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: meta-externalagent
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: xAI-Bot
Disallow: /

User-agent: DeepSeekBot
Disallow: /

User-agent: MistralBot
Disallow: /

User-agent: Diffbot
Disallow: /

User-agent: cohere-ai
Disallow: /

User-agent: AI2Bot
Disallow: /

User-agent: Ai2Bot-Dolma
Disallow: /

User-agent: YouBot
Disallow: /

User-agent: DuckAssistBot
Disallow: /

User-agent: omgili
Disallow: /

User-agent: omgilibot
Disallow: /

User-agent: webzio-extended
Disallow: /

User-agent: gemini-deep-research
Disallow: /

User-agent: *
Allow: /

Vue CLI (webpack) projects: Same pattern — put robots.txt in the public/ folder at the project root. Vue CLI also copies public/ to dist/ on build.

2. noai meta tag via @vueuse/head useHead()

Add noai meta tags using the useHead() composable from @vueuse/head (or its maintained successor, @unhead/vue) — note that useHead() is provided by these libraries, not by Vue itself. This works only for bots that execute JavaScript, and most AI training crawlers do not. Use this in addition to server-level headers, not instead of them.

App.vue — global noai (Vue 3 + @vueuse/head)

src/App.vue
<script setup>
import { useHead } from '@vueuse/head'

useHead({
  meta: [
    {
      name: 'robots',
      content: 'noai, noimageai',
    },
  ],
})
</script>

<template>
  <router-view />
</template>

Per-route noai in a page component

src/views/ProductPage.vue
<script setup>
import { useHead } from '@vueuse/head'

useHead({
  title: 'My Product — Shop',
  meta: [
    { name: 'description', content: 'Product description here.' },
    { name: 'robots', content: 'noai, noimageai' },
  ],
})
</script>

<template>
  <main>
    <!-- product content -->
  </main>
</template>

Inject noai directly in index.html (no JS required)

For maximum compatibility — including bots that don't run JS — inject the meta tag directly into the static index.html template. This is served in the raw HTML shell before any JavaScript runs.

index.html (Vite project root)
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />

    <!-- Block AI training bots — delivered before JS executes -->
    <meta name="robots" content="noai, noimageai" />

    <title>My App</title>
  </head>
  <body>
    <div id="app"></div>
    <script type="module" src="/src/main.ts"></script>
  </body>
</html>

index.html meta tag caveat: This tag applies globally to every route since the SPA serves the same index.html for all URLs. If you want to allow AI crawlers on some routes, you cannot use this approach — use per-route useHead() calls instead. For most apps, global noai from index.html is the right call.

3. X-Robots-Tag response header

The X-Robots-Tag HTTP response header delivers noai signals at the server level — before any JavaScript runs, and for every response including the HTML shell, API responses, and static assets. This is the most reliable method for Vue SPAs.

nginx — add to your SPA server block

/etc/nginx/sites-available/my-vue-app
server {
    listen 80;
    server_name myapp.com www.myapp.com;
    root /var/www/my-vue-app/dist;
    index index.html;

    # noai header for all responses
    add_header X-Robots-Tag "noai, noimageai" always;

    # SPA fallback — all routes serve index.html
    location / {
        try_files $uri $uri/ /index.html;
    }

    # Cache static assets (but still send X-Robots-Tag)
    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff2)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
        add_header X-Robots-Tag "noai, noimageai" always;
    }
}

Express — serve Vue dist/ with header

server.js
import express from 'express';
import path from 'path';
import { fileURLToPath } from 'url';

const __dirname = path.dirname(fileURLToPath(import.meta.url));
const app = express();
const PORT = process.env.PORT || 3000;
const DIST = path.join(__dirname, 'dist');

// Set X-Robots-Tag on all responses
app.use((req, res, next) => {
  res.setHeader('X-Robots-Tag', 'noai, noimageai');
  next();
});

// Serve Vue dist/
app.use(express.static(DIST));

// SPA fallback
app.get('*', (req, res) => {
  res.sendFile(path.join(DIST, 'index.html'));
});

app.listen(PORT, () => console.log(`Server on ${PORT}`));

4. nginx hard block (403)

Stop AI bots before your Vue app is even served. This catches bots that ignore robots.txt (Bytespider and Diffbot are known to). Add a map block to your nginx config and return 403 for matched user agents.

/etc/nginx/nginx.conf (http block) + site config
# In /etc/nginx/nginx.conf — inside http { } block:
map $http_user_agent $blocked_ai_bot {
    default                 0;
    "~*GPTBot"              1;
    "~*ChatGPT-User"        1;
    "~*OAI-SearchBot"       1;
    "~*ClaudeBot"           1;
    "~*anthropic-ai"        1;
    "~*Google-Extended"     1;
    "~*Bytespider"          1;
    "~*CCBot"               1;
    "~*PerplexityBot"       1;
    "~*meta-externalagent"  1;
    "~*Amazonbot"           1;
    "~*Applebot-Extended"   1;
    "~*xAI-Bot"             1;
    "~*DeepSeekBot"         1;
    "~*MistralBot"          1;
    "~*Diffbot"             1;
    "~*cohere-ai"           1;
    "~*AI2Bot"              1;
    "~*Ai2Bot-Dolma"        1;
    "~*YouBot"              1;
    "~*DuckAssistBot"       1;
    "~*omgili"              1;
    "~*omgilibot"           1;
    "~*webzio-extended"     1;
    "~*gemini-deep-research" 1;
}

# In your site's server { } block:
server {
    listen 80;
    server_name myapp.com;
    root /var/www/my-vue-app/dist;
    index index.html;

    add_header X-Robots-Tag "noai, noimageai" always;

    # Block matched AI bots
    if ($blocked_ai_bot) {
        return 403;
    }

    location / {
        try_files $uri $uri/ /index.html;
    }
}

5. Express server middleware block

If you're serving your Vue SPA via an Express/Node.js server (common for SSR-adjacent setups or custom server logic), add bot-blocking middleware before the static file handler.

server.js — add before express.static()
import express from 'express';
import path from 'path';
import { fileURLToPath } from 'url';

const __dirname = path.dirname(fileURLToPath(import.meta.url));
const app = express();
const DIST = path.join(__dirname, 'dist');

const AI_BOT_PATTERN = /GPTBot|ChatGPT-User|OAI-SearchBot|ClaudeBot|anthropic-ai|Google-Extended|Bytespider|CCBot|PerplexityBot|meta-externalagent|Amazonbot|Applebot-Extended|xAI-Bot|DeepSeekBot|MistralBot|Diffbot|cohere-ai|AI2Bot|Ai2Bot-Dolma|YouBot|DuckAssistBot|omgili|omgilibot|webzio-extended|gemini-deep-research/i;

// Block AI bots — runs before static files and SPA fallback
app.use((req, res, next) => {
  const ua = req.headers['user-agent'] ?? '';

  // Always allow robots.txt through
  if (req.path === '/robots.txt') return next();

  if (AI_BOT_PATTERN.test(ua)) {
    return res.status(403).send('Forbidden');
  }

  res.setHeader('X-Robots-Tag', 'noai, noimageai');
  next();
});

app.use(express.static(DIST));

app.get('*', (req, res) => {
  res.sendFile(path.join(DIST, 'index.html'));
});

app.listen(process.env.PORT ?? 3000);

6. Netlify / Vercel headers config

For static Vue deployments on Netlify or Vercel, configure response headers in their platform config files. This adds X-Robots-Tag to every response at the edge — no server process needed.

Netlify — netlify.toml

netlify.toml (project root)
[build]
  command = "npm run build"
  publish = "dist"

[[headers]]
  for = "/*"
  [headers.values]
    X-Robots-Tag = "noai, noimageai"

# SPA redirects — all routes to index.html
[[redirects]]
  from = "/*"
  to = "/index.html"
  status = 200

Netlify — _headers file (alternative)

public/_headers
/*
  X-Robots-Tag: noai, noimageai

Vercel — vercel.json

vercel.json (project root)
{
  "headers": [
    {
      "source": "/(.*)",
      "headers": [
        {
          "key": "X-Robots-Tag",
          "value": "noai, noimageai"
        }
      ]
    }
  ],
  "rewrites": [
    {
      "source": "/((?!api/.*).*)",
      "destination": "/index.html"
    }
  ]
}

Hard blocking on Netlify/Vercel: Neither platform supports user-agent conditions in their headers config. For hard 403 blocking on these platforms, use Cloudflare WAF in front of your Netlify/Vercel deployment — proxy your custom domain through Cloudflare and add a WAF custom rule (free plan: 5 rules).
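For reference, Cloudflare WAF custom rules are written in Cloudflare's rules language against the http.user_agent field. A sketch of a blocking expression (abridged bot list; pair it with the Block action in the dashboard) might look like:

```
(http.user_agent contains "GPTBot")
or (http.user_agent contains "ClaudeBot")
or (http.user_agent contains "Bytespider")
or (http.user_agent contains "CCBot")
```

Because free-plan rules are limited, folding all bots into one expression like this keeps you within the quota.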

Vue SPA vs SSR: what bots actually see

Scenario | Bot sees
Vue SPA — no JS execution | <div id="app"></div> shell only
Vue SPA — useHead() only | Empty shell (JS required)
Vue SPA — X-Robots-Tag header | Response header on every request
Nuxt.js SSR | Full rendered HTML
Vue + Vite SSG (vite-ssg) | Pre-rendered HTML per route

AI bots to block on Vue applications

Bot | Operator
GPTBot | OpenAI
ChatGPT-User | OpenAI
OAI-SearchBot | OpenAI
ClaudeBot | Anthropic
anthropic-ai | Anthropic
Google-Extended | Google
Bytespider | ByteDance
CCBot | Common Crawl
PerplexityBot | Perplexity
meta-externalagent | Meta
Amazonbot | Amazon
Applebot-Extended | Apple
xAI-Bot | xAI
DeepSeekBot | DeepSeek
MistralBot | Mistral
Diffbot | Diffbot
cohere-ai | Cohere
AI2Bot | Allen Institute
Ai2Bot-Dolma | Allen Institute
YouBot | You.com
DuckAssistBot | DuckDuckGo
omgili | Webz.io
omgilibot | Webz.io
webzio-extended | Webz.io
gemini-deep-research | Google

FAQ

Do AI bots see my Vue SPA content if they don't run JavaScript?

No. Vue SPAs serve an empty <div id="app"></div> shell as the initial HTML. Bots that don't execute JS see only your index.html skeleton — not your rendered content, product listings, or blog posts. This makes Vue SPAs paradoxically private for non-JS bots, but your useHead() noai tags are also invisible.

Where do I put robots.txt in a Vue.js Vite project?

In public/robots.txt at the project root. Vite copies everything in public/ directly to dist/ without processing — no route needed. Do not put it in src/.

Does @vueuse/head noai meta tag work for AI bots?

Only for bots that execute JavaScript. Most AI training crawlers (CCBot, Bytespider, GPTBot) do not run JS. Use the index.html meta tag or X-Robots-Tag response headers for reliable delivery.

How do I block AI bots on a Vue SPA deployed to Netlify or Vercel?

Netlify: add netlify.toml with [[headers]] targeting /*. Vercel: add vercel.json with a headers array. For hard 403 blocking on both platforms, proxy through Cloudflare WAF.

Can I use vue-router navigation guards to block AI bots?

No. Navigation guards run after the JS bundle loads client-side. AI training crawlers typically don't execute JavaScript and never reach your router. Use nginx, Express, or Cloudflare WAF for effective blocking.

What's the difference between Vue SPA and Nuxt.js for AI bot blocking?

Nuxt.js renders HTML server-side so bots see full content including meta tags. Vue SPA serves an empty HTML shell — non-JS bots see nothing. For AI bot blocking, use server-level X-Robots-Tag headers in both cases. For Nuxt.js, see the Nuxt.js guide.
