
How to Block AI Bots on Beego (Go): Complete 2026 Guide

Beego is a full-featured Go MVC framework with its own ORM, template engine, and session management. Unlike Gin, Echo, and Chi, Beego uses InsertFilter() with named execution points rather than a linear middleware chain. Bot blocking belongs at web.BeforeRouter — the earliest point, before routing occurs.

InsertFilter — not a middleware chain

Beego filters use web.InsertFilter(pattern, point, func) — no next() call, no wrapper function, no return value. Execution points, in order: BeforeStatic → BeforeRouter → BeforeExec → AfterExec → FinishRouter. Use BeforeRouter for bot blocking — it runs before routing, so blocked requests never hit the controller.
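Conceptually, Beego calls the filters registered at a point one after another and stops as soon as one of them aborts — there is no next() to invoke. A simplified, stdlib-only sketch of that dispatch loop (the ctx type, aborted flag, and runFilters helper are illustrative stand-ins, not Beego internals):

```go
package main

import "fmt"

// ctx is a stand-in for Beego's *context.Context.
type ctx struct{ aborted bool }

// Abort mimics ctx.Abort: in Beego it writes the status and body;
// here we just record that the request was stopped.
func (c *ctx) Abort(status int, body string) {
	c.aborted = true
	fmt.Printf("wrote %d %q\n", status, body)
}

type filterFunc func(*ctx)

// runFilters mimics the BeforeRouter dispatch: call each filter in order,
// and stop (skipping routing and the controller) once one has aborted.
func runFilters(c *ctx, filters []filterFunc) bool {
	for _, f := range filters {
		f(c)
		if c.aborted {
			return false // request blocked; controller never runs
		}
	}
	return true
}

func main() {
	logOnly := func(c *ctx) { fmt.Println("logging filter ran") }
	blockBots := func(c *ctx) { c.Abort(403, "Forbidden") }

	ok := runFilters(&ctx{}, []filterFunc{logOnly, blockBots})
	fmt.Println("controller runs:", ok) // false — the abort short-circuits
}
```

The same shape explains why a Beego filter simply returns to continue: returning without aborting is the "pass" signal.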

Protection layers

1. robots.txt: static/ directory auto-served by Beego, or an explicit /robots.txt route registered before filters
2. noai meta tag: Beego template with a {{.robots}} ViewData variable, set in a base controller or filter
3. X-Robots-Tag header: ctx.ResponseWriter.Header().Set("X-Robots-Tag", "noai, noimageai") in a filter after the bot check
4. Hard 403 block: ctx.Abort(403, "Forbidden") writes the response and stops controller execution

Layer 1: robots.txt

Place robots.txt in your static/ directory. Beego serves it at /static/robots.txt by default. To serve it at /robots.txt (required for crawlers), add an explicit route or configure StaticDir:

# static/robots.txt

User-agent: *
Allow: /

User-agent: GPTBot
User-agent: ClaudeBot
User-agent: anthropic-ai
User-agent: Google-Extended
User-agent: CCBot
User-agent: Bytespider
User-agent: Applebot-Extended
User-agent: PerplexityBot
User-agent: Diffbot
User-agent: cohere-ai
User-agent: FacebookBot
User-agent: omgili
User-agent: omgilibot
User-agent: Amazonbot
User-agent: DeepSeekBot
User-agent: MistralBot
User-agent: xAI-Bot
User-agent: AI2Bot
Disallow: /

Option A — serve at /robots.txt via explicit route

// main.go — register before filters
web.Get("/robots.txt", func(ctx *context.Context) {
    ctx.Output.Header("Content-Type", "text/plain")
    http.ServeFile(ctx.ResponseWriter, ctx.Request, "static/robots.txt")
})

Option B — static mapping in app.conf

# conf/app.conf
StaticDir = static:static  # serves ./static/ at /static/*
# or via code:
# web.SetStaticPath("/static", "static")

Note that this mapping serves the file at /static/robots.txt, not /robots.txt. Crawlers only request /robots.txt, so pair it with the explicit route from Option A.

Layer 2: noai meta tag

Set a robots variable in your base controller's Prepare() method. All controllers that embed the base controller inherit it:

controllers/base.go

type BaseController struct {
    beego.Controller
}

func (b *BaseController) Prepare() {
    // Default robots value for all pages
    b.Data["robots"] = "noai, noimageai"
}

views/layouts/main.tpl (Beego template)

<meta name="robots" content="{{.robots}}">

Controller override

type BlogController struct {
    BaseController
}

func (c *BlogController) Get() {
    c.Data["robots"] = "index, follow"  // override for this route
    c.TplName = "blog.tpl"
}
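Beego's template engine builds on Go's html/template, so the {{.robots}} placeholder resolves against the controller's Data map at render time. A minimal stdlib sketch of what the layout line does (renderMeta is an illustrative helper, not a Beego API):

```go
package main

import (
	"fmt"
	"html/template"
	"strings"
)

// Same meta line as in views/layouts/main.tpl.
var metaTpl = template.Must(template.New("meta").Parse(
	`<meta name="robots" content="{{.robots}}">`))

// renderMeta fills {{.robots}} the way Beego fills it from c.Data["robots"].
func renderMeta(robots string) string {
	var sb strings.Builder
	_ = metaTpl.Execute(&sb, map[string]string{"robots": robots})
	return sb.String()
}

func main() {
	// Default set in BaseController.Prepare()
	fmt.Println(renderMeta("noai, noimageai"))
	// Per-route override, as in BlogController.Get()
	fmt.Println(renderMeta("index, follow"))
}
```

Because html/template escapes attribute values, a robots string containing quotes or angle brackets cannot break out of the content attribute.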

Layers 3 & 4: InsertFilter

filters/ai_bot_blocker.go

package filters

import (
	"strings"

	"github.com/beego/beego/v2/server/web/context"
)

var aiBotPatterns = []string{
	"gptbot", "chatgpt-user", "oai-searchbot",
	"claudebot", "anthropic-ai", "claude-web",
	"google-extended", "ccbot", "bytespider",
	"applebot-extended", "perplexitybot", "diffbot",
	"cohere-ai", "facebookbot", "meta-externalagent",
	"omgili", "omgilibot", "amazonbot",
	"deepseekbot", "mistralbot", "xai-bot", "ai2bot",
}

var exemptPaths = map[string]bool{
	"/robots.txt":  true,
	"/sitemap.xml": true,
	"/favicon.ico": true,
}

// AiBotFilter is a Beego filter function — no next() call, no return value.
// Register with web.InsertFilter("*", web.BeforeRouter, filters.AiBotFilter)
func AiBotFilter(ctx *context.Context) {
	// Always pass through exempt paths
	if exemptPaths[ctx.Request.URL.Path] {
		return
	}

	ua := strings.ToLower(ctx.Request.Header.Get("User-Agent"))

	for _, pattern := range aiBotPatterns {
		if strings.Contains(ua, pattern) {
			// Layer 4: hard 403 block
			// ctx.Abort() writes status + body and stops further execution
			ctx.Abort(403, "Forbidden")
			return
		}
	}

	// Layer 3: set X-Robots-Tag for legitimate requests
	ctx.ResponseWriter.Header().Set("X-Robots-Tag", "noai, noimageai")
}
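The matching logic inside the filter is plain case-insensitive substring search, so it can be pulled into a standalone helper and exercised without Beego at all. A minimal sketch (isBot is a hypothetical helper name, and the pattern list is abridged from aiBotPatterns above):

```go
package main

import (
	"fmt"
	"strings"
)

// Abridged copy of aiBotPatterns from filters/ai_bot_blocker.go.
var aiBotPatterns = []string{
	"gptbot", "claudebot", "ccbot", "bytespider", "perplexitybot",
}

// isBot reports whether the User-Agent matches any known AI crawler pattern.
// Lowercasing once up front makes every comparison case-insensitive.
func isBot(userAgent string) bool {
	ua := strings.ToLower(userAgent)
	for _, pattern := range aiBotPatterns {
		if strings.Contains(ua, pattern) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isBot("Mozilla/5.0 (compatible; GPTBot/1.1; +https://openai.com/gptbot)")) // true
	fmt.Println(isBot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))                        // false
}
```

Keeping the matcher as a pure function like this also makes the filter trivial to unit-test.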

Key points

  • No next() call: Beego filter functions have no next() parameter — unlike Gin, Echo, and Chi. A filter either calls ctx.Abort() to stop, or simply returns to continue to the next filter/controller.
  • Blocking: ctx.Abort(403, "Forbidden") writes the HTTP 403 status and body, and sets an internal flag that prevents the controller action from running. Beego checks this flag after each filter at the BeforeRouter point.
  • Reading User-Agent: ctx.Request.Header.Get("User-Agent") works because ctx.Request is the standard *http.Request.
  • Writing response headers: use ctx.ResponseWriter.Header().Set(); ctx.ResponseWriter is Beego's response writer, which wraps http.ResponseWriter.

Registering the filter

// main.go
package main

import (
	"yourapp/controllers"
	"yourapp/filters"

	beego "github.com/beego/beego/v2/server/web"
)

func main() {
	// Global filter — runs on every request before routing
	beego.InsertFilter("*", beego.BeforeRouter, filters.AiBotFilter)

	// Your routes
	beego.Router("/", &controllers.HomeController{})
	beego.Router("/api/data", &controllers.ApiController{})

	beego.Run()
}

BeforeRouter vs BeforeExec

BeforeRouter runs before Beego attempts to match the URL to a controller. This means blocked requests never consume routing resources, and filters run even for unmatched paths (404s). BeforeExec runs after routing — the route is already matched, and Beego has allocated a controller instance. For bot blocking, always use BeforeRouter to reject at the earliest possible point.

Route-scoped filter

Apply the filter only to a path prefix using a glob pattern:

// Apply only to /api/* — public marketing pages unaffected
beego.InsertFilter("/api/*", beego.BeforeRouter, filters.AiBotFilter)

// Multiple patterns — both API and admin routes
beego.InsertFilter("/api/*", beego.BeforeRouter, filters.AiBotFilter)
beego.InsertFilter("/admin/*", beego.BeforeRouter, filters.AiBotFilter)

Comparison: Beego vs Gin vs Echo vs Chi

Beego — InsertFilter (no next())

func AiBotFilter(ctx *context.Context) {
    ua := strings.ToLower(ctx.Request.Header.Get("User-Agent"))
    if isBot(ua) {
        ctx.Abort(403, "Forbidden")  // no next() needed
        return
    }
    ctx.ResponseWriter.Header().Set("X-Robots-Tag", "noai, noimageai")
}
// Register: beego.InsertFilter("*", beego.BeforeRouter, AiBotFilter)

Gin — chain model with c.Next()/c.Abort()

func AiBotBlocker() gin.HandlerFunc {
    return func(c *gin.Context) {
        if isBot(c.Request.Header.Get("User-Agent")) {
            c.AbortWithStatus(403)
            return
        }
        c.Next()
        c.Header("X-Robots-Tag", "noai, noimageai")
    }
}
// Register: r.Use(AiBotBlocker())

Echo — wrapper pattern

func AiBotBlocker(next echo.HandlerFunc) echo.HandlerFunc {
    return func(c echo.Context) error {
        if isBot(c.Request().Header.Get("User-Agent")) {
            return echo.NewHTTPError(403, "Forbidden")
        }
        err := next(c)
        c.Response().Header().Set("X-Robots-Tag", "noai, noimageai")
        return err
    }
}
// Register: e.Use(AiBotBlocker)

Chi — pure net/http (standard middleware, no Chi-specific API)

func AiBotBlocker(next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        if isBot(r.Header.Get("User-Agent")) {
            http.Error(w, "Forbidden", 403)
            return
        }
        w.Header().Set("X-Robots-Tag", "noai, noimageai")
        next.ServeHTTP(w, r)
    })
}
// Register: r.Use(AiBotBlocker)

Beego is the only Go framework here that does not use a next() call. Its filter system is simpler to write but less composable — filters cannot wrap the response the way Gin and Echo middleware can.

Verification

# Should return 403 (blocked AI bot)
curl -I -A "GPTBot" http://localhost:8080/

# Should return 200 (regular browser)
curl -I -A "Mozilla/5.0" http://localhost:8080/

# robots.txt must always return 200
curl -I -A "GPTBot" http://localhost:8080/robots.txt

# Check X-Robots-Tag on legitimate request
curl -si -A "Mozilla/5.0" http://localhost:8080/ | grep -i x-robots

Default Beego port is 8080. Expected: GPTBot → 403. Mozilla/5.0 → 200 with X-Robots-Tag: noai, noimageai. robots.txt → 200 for any user agent.
