How to Block AI Bots on Laravel
Laravel gives you five independent layers to control AI crawlers: a static public/robots.txt, a dynamic robots.txt route, a noai meta tag in your Blade layout, a BlockAiBots middleware for hard 403 responses, and server-level rules in .htaccess or nginx. Covers Laravel 10 and Laravel 11+.
Quick fix — create public/robots.txt
Drop this file in public/ (Laravel's webroot). No route, no controller, no Artisan command needed.
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
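If you prefer the command line, the quick-fix file can be created in one step. A minimal sketch, assuming you run it from the Laravel project root (the mkdir is only there to make the snippet self-contained; public/ already exists in a real app):

```shell
# Create public/robots.txt from the Laravel project root.
mkdir -p public   # no-op in a real Laravel app, where public/ already exists

cat > public/robots.txt <<'EOF'
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
EOF

# Confirm the file landed where the web server will serve it
cat public/robots.txt
```

Because the file is plain text in the webroot, no cache clearing or deployment step is needed beyond uploading it.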
All Methods
public/robots.txt (Recommended)
Easy · All environments
public/robots.txt
Laravel's webroot is public/. A robots.txt file here is served at /robots.txt automatically by Apache/nginx with no route, no controller, no config required.
Works on shared hosting, Forge, Vapor, Envoyer — everywhere. Plain text, no Blade syntax.
Dynamic robots.txt route
Easy · All environments
routes/web.php
A Route::get('robots.txt', ...) closure that returns a text/plain Response. Useful for environment-based rules (block all on staging) or pulling bot lists from config.
Add to routes/web.php outside any middleware groups. Set Cache-Control headers to avoid re-fetching on every crawl.
Blade layout — noai meta tag
Easy · All environments
resources/views/layouts/app.blade.php
Add <meta name="robots" content="noai, noimageai"> to the master Blade layout. Applies to every page that extends this layout without further configuration.
Define the section with @section / @show in the layout if you need per-page overrides.
BlockAiBots middleware — hard blocking
Easy · All environments
app/Http/Middleware/BlockAiBots.php
Laravel middleware that checks User-Agent and returns 403 before the controller runs. Register globally in Kernel.php (Laravel 10) or bootstrap/app.php (Laravel 11+).
Most complete protection — bots receive 403 and no page HTML is generated. Works on Vapor too.
.htaccess — Apache server-level block
Easy · Apache only
public/.htaccess
Add RewriteCond rules to match AI bot user agents and return a 403 before PHP is invoked. Faster than middleware — zero PHP overhead for blocked bots.
Requires mod_rewrite (standard on all Apache setups). Does not work on nginx or Vapor.
nginx — server-level block
Intermediate · nginx only
nginx server block config
Add an if block matching AI bot user agents in the nginx server config to return 403 before Laravel handles the request. Zero PHP overhead.
Requires access to nginx config files (VPS/Forge). Not available on shared hosting or Vapor.
Method 1: public/robots.txt
Laravel's webroot is the public/ directory — it's what Apache or nginx points to as the document root. A file at public/robots.txt is served as yourdomain.com/robots.txt with zero application code involved. Plain text only — no Blade directives, no PHP tags.
User-agent: *
Allow: /

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: OAI-SearchBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: meta-externalagent
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: xAI-Bot
Disallow: /

User-agent: DeepSeekBot
Disallow: /

User-agent: MistralBot
Disallow: /

User-agent: Diffbot
Disallow: /

User-agent: cohere-ai
Disallow: /

User-agent: AI2Bot
Disallow: /

User-agent: Ai2Bot-Dolma
Disallow: /

User-agent: YouBot
Disallow: /

User-agent: DuckAssistBot
Disallow: /

User-agent: omgili
Disallow: /

User-agent: omgilibot
Disallow: /

User-agent: webzio-extended
Disallow: /

User-agent: gemini-deep-research
Disallow: /
Method 2: Dynamic robots.txt Route
A route closure in routes/web.php that returns a text/plain response. The advantage over a static file: you can use app()->environment() to block everything on staging, or pull the bot list from a config file.
// routes/web.php
Route::get('robots.txt', function () {
    $bots = [
        'GPTBot', 'ChatGPT-User', 'OAI-SearchBot',
        'ClaudeBot', 'anthropic-ai', 'Google-Extended',
        'Bytespider', 'CCBot', 'PerplexityBot',
        'meta-externalagent', 'Amazonbot', 'Applebot-Extended',
        'xAI-Bot', 'DeepSeekBot', 'MistralBot', 'Diffbot',
        'cohere-ai', 'AI2Bot', 'Ai2Bot-Dolma', 'YouBot',
        'DuckAssistBot', 'omgili', 'omgilibot',
        'webzio-extended', 'gemini-deep-research',
    ];

    if (! app()->isProduction()) {
        // Block all crawlers on non-production environments
        $lines = ['User-agent: *', 'Disallow: /', ''];
    } else {
        $lines = ['User-agent: *', 'Allow: /', ''];

        foreach ($bots as $bot) {
            $lines[] = "User-agent: {$bot}";
            $lines[] = 'Disallow: /';
            $lines[] = '';
        }
    }

    $lines[] = 'Sitemap: ' . url('/sitemap.xml');

    return response(implode("\n", $lines), 200)
        ->header('Content-Type', 'text/plain')
        ->header('Cache-Control', 'public, max-age=86400');
});

Static file takes precedence
If public/robots.txt exists, Apache and nginx will serve it directly without touching Laravel. The route above only runs if there is no static public/robots.txt. Use one or the other.
Method 3: noai Meta Tag in Blade Layout
Add the noai and noimageai meta tags to your master Blade layout. In a standard Laravel app this is resources/views/layouts/app.blade.php. In apps using Laravel Livewire Volt or Jetstream it may be resources/views/components/layouts/app.blade.php.
{{-- resources/views/layouts/app.blade.php (excerpt) --}}
<head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>{{ config('app.name') }}</title>

    {{-- Block AI training crawlers; child views can override this section --}}
    @section('head')
        <meta name="robots" content="noai, noimageai">
    @show

    @vite(['resources/css/app.css', 'resources/js/app.js'])
</head>

Child views can override the head section with @section('head'), or keep the layout's tags by including @parent:
{{-- resources/views/blog/show.blade.php — allow AI to see this page --}}
@extends('layouts.app')

@section('head')
    <meta name="robots" content="index, follow">
@endsection

Method 4: BlockAiBots Middleware
Laravel middleware runs before any controller logic. Create a BlockAiBots middleware that checks the incoming user agent and returns a 403 response immediately for matched bots. No page HTML is generated — the bot receives nothing useful.
# Generate the middleware file
php artisan make:middleware BlockAiBots
<?php

// app/Http/Middleware/BlockAiBots.php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;
use Symfony\Component\HttpFoundation\Response;

class BlockAiBots
{
    private const BLOCKED_UAS = '/GPTBot|ChatGPT-User|OAI-SearchBot|ClaudeBot|anthropic-ai|Google-Extended|Bytespider|CCBot|PerplexityBot|meta-externalagent|Amazonbot|Applebot-Extended|xAI-Bot|DeepSeekBot|MistralBot|Diffbot|cohere-ai|AI2Bot|Ai2Bot-Dolma|YouBot|DuckAssistBot|omgili|omgilibot|webzio-extended|gemini-deep-research/i';

    public function handle(Request $request, Closure $next): Response
    {
        $ua = $request->userAgent() ?? '';

        if (preg_match(self::BLOCKED_UAS, $ua)) {
            return response('Forbidden', 403);
        }

        return $next($request);
    }
}

Laravel 10 — register in Kernel.php:
// app/Http/Kernel.php
protected $middlewareGroups = [
    'web' => [
        // ... existing middleware ...
        \App\Http\Middleware\BlockAiBots::class,
    ],
];

Laravel 11+ — register in bootstrap/app.php:
// bootstrap/app.php
use App\Http\Middleware\BlockAiBots;

return Application::configure(basePath: dirname(__DIR__))
    ->withRouting(/* ... */)
    ->withMiddleware(function (Middleware $middleware) {
        $middleware->appendToGroup('web', BlockAiBots::class);
    })
    ->create();

To scope blocking to specific routes instead, use Route::middleware(BlockAiBots::class)->group(...) in routes/web.php, or add ->middleware(BlockAiBots::class) to individual route definitions.

Method 5: .htaccess (Apache)
Laravel ships with a public/.htaccess file for Apache. Add user agent conditions to block AI bots before PHP is invoked — zero Laravel overhead for blocked requests:
# public/.htaccess — add BEFORE the existing RewriteEngine block
<IfModule mod_rewrite.c>
    RewriteEngine On

    # Block AI training crawlers
    RewriteCond %{HTTP_USER_AGENT} (GPTBot|ClaudeBot|anthropic-ai|CCBot|Bytespider|Google-Extended|PerplexityBot|Diffbot|DeepSeekBot|MistralBot|cohere-ai|meta-externalagent|Amazonbot|Applebot-Extended|xAI-Bot|AI2Bot|YouBot|DuckAssistBot|omgili|omgilibot|webzio-extended|gemini-deep-research|OAI-SearchBot|ChatGPT-User) [NC]
    RewriteRule ^ - [F,L]

    # ... existing Laravel .htaccess rules below ...
</IfModule>

Add before the Laravel rewrite rules
The bot-blocking RewriteRule must appear before the RewriteRule ^ index.php [L] line in Laravel's default .htaccess. The [F] flag returns 403 Forbidden; [L] stops further rule processing.
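These user-agent patterns are all plain case-insensitive substring alternations, so you can sanity-check a pattern locally with grep before deploying. A small sketch with an abbreviated bot list; the sample User-Agent strings are illustrative, not exact header values:

```shell
# Three sample User-Agent values: two AI crawlers, one ordinary browser.
# Real headers include version numbers and info URLs; these are illustrative.
printf '%s\n' \
  'Mozilla/5.0 (compatible; GPTBot/1.2; +https://openai.com/gptbot)' \
  'Mozilla/5.0 (compatible; claudebot/1.0)' \
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/124.0' \
  > user-agents.txt

# -i makes the match case-insensitive, like the /i regex flag in the
# middleware and the [NC] flag in .htaccess. Prints the two crawler lines.
grep -iE 'GPTBot|ClaudeBot|CCBot|Google-Extended' user-agents.txt
```

Note that lowercase "claudebot" still matches, which is why the case-insensitive flag matters: bots do not always send their canonical capitalization.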
Method 6: nginx Config
On VPS deployments via Laravel Forge or manual nginx setup, add a user agent check to your nginx server block. This is the most efficient method — requests are rejected before reaching PHP-FPM:
# nginx server block (e.g. /etc/nginx/sites-available/yourdomain.conf)
server {
    listen 80;
    server_name yourdomain.com;
    root /var/www/html/public;
    index index.php;

    # Block AI training crawlers
    if ($http_user_agent ~* "(GPTBot|ClaudeBot|anthropic-ai|CCBot|Bytespider|Google-Extended|PerplexityBot|Diffbot|DeepSeekBot|MistralBot|cohere-ai|meta-externalagent|Amazonbot|Applebot-Extended|xAI-Bot|AI2Bot|YouBot|DuckAssistBot|omgili|webzio-extended|gemini-deep-research|OAI-SearchBot|ChatGPT-User)") {
        return 403;
    }

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php/php8.3-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
    }
}

On Laravel Forge: go to your site → Nginx Configuration, add the if ($http_user_agent ...) block inside the server block, then click Save. Forge will reload nginx automatically.
Laravel Vapor (Serverless)
On Vapor, requests go through AWS API Gateway → Lambda. You have no nginx or .htaccess access. Use:
- public/robots.txt — Vapor serves static assets from S3/CloudFront
- BlockAiBots middleware — runs in Lambda, works normally
- CloudFront Functions — add a CF function to the Vapor CloudFront distribution for edge-level blocking
- Cloudflare in front of Vapor — a WAF custom rule blocks requests before they hit AWS
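For the Cloudflare option, a WAF custom rule with a Block action can filter on the same user agents. A sketch of the rule expression in Cloudflare's Rules language, assuming an abbreviated bot list (extend it with the agents you care about; lower() is used because the contains operator is case-sensitive):

```
(lower(http.user_agent) contains "gptbot") or
(lower(http.user_agent) contains "claudebot") or
(lower(http.user_agent) contains "ccbot") or
(lower(http.user_agent) contains "bytespider")
```

Paste the expression into Security → WAF → Custom rules with the action set to Block; it evaluates at Cloudflare's edge, so blocked requests never reach AWS at all.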
AI Bots to Block
25 user agents covering AI training crawlers and AI search bots. The robots.txt and middleware patterns above include all of them.
Frequently Asked Questions
Where do I put robots.txt in a Laravel application?
Place robots.txt in the public/ directory — Laravel's webroot. This is the only directory served directly by Apache or nginx, so public/robots.txt becomes yourdomain.com/robots.txt automatically with no route or controller required. Alternatively, create a dynamic route in routes/web.php that returns a text/plain response, which lets you generate rules programmatically or differ by environment.
How do I create middleware to block AI bots in Laravel?
Run php artisan make:middleware BlockAiBots to create the middleware file at app/Http/Middleware/BlockAiBots.php. In the handle() method, check $request->userAgent() against a regex of AI bot names and return response('Forbidden', 403) for matches. In Laravel 10 and earlier, register it in app/Http/Kernel.php under $middlewareGroups['web']. In Laravel 11+, register it in bootstrap/app.php using $middleware->appendToGroup('web', BlockAiBots::class). Apply it globally or to specific route groups.
What's the difference between blocking AI bots in .htaccess vs middleware?
.htaccess (Apache) and nginx rules block at the web server layer — before PHP or Laravel is invoked at all. This is faster and uses zero PHP resources. Laravel middleware blocks after the request reaches PHP but before the controller runs. For high-traffic sites, server-level blocking (.htaccess or nginx) is more efficient. For shared hosting where nginx config is inaccessible, or when you want to log blocked requests through Laravel, middleware is more appropriate.
How do I add noai meta tags globally in Laravel Blade?
Edit your master Blade layout file — typically resources/views/layouts/app.blade.php or resources/views/components/layouts/app.blade.php if using Volt/Livewire components. Add <meta name="robots" content="noai, noimageai" /> inside the <head> section. This applies to every page that extends or uses this layout. For per-page control in Blade, define the tag inside a @section('head') ... @show block in the layout so child views can override it, or extend it with @parent.
Does blocking AI bots work differently on Laravel Vapor (serverless)?
Yes. On Laravel Vapor (AWS Lambda), you don't have access to nginx or .htaccess — the request goes through AWS API Gateway and then hits your Lambda function. The static public/robots.txt works fine (Vapor serves it). Laravel middleware works normally. For hard network-edge blocking on Vapor, use CloudFront (AWS CDN) functions or Cloudflare in front of your Vapor domain. The middleware approach is the most portable option across Vapor, Forge, Envoyer, and shared hosting.
Should I use a Laravel package for robots.txt management?
For simple AI bot blocking, a static public/robots.txt requires no package at all. Packages like spatie/laravel-robots-middleware or imliam/laravel-robots-txt add value when you need: environment-based rules (block everything on staging), per-route robots directives, or integration with a CMS that manages crawl rules. For the specific use case of blocking AI training bots, a plain static robots.txt is the simplest and most reliable approach.