
    Robots.txt: What It Is, Why It Matters, and How to Write One

    SiteSupport Team · February 15, 2025 · Last updated February 15, 2025 · 6 min read
    Tags: SEO, robots.txt, crawling, search engines
    The robots.txt file is a simple text file that tells search engine crawlers which parts of your website they can and cannot access. Despite its simplicity, mistakes in robots.txt can have devastating SEO consequences.

    How robots.txt Works

    When a search engine bot arrives at your site, the first thing it checks is /robots.txt. This file contains directives that tell the bot:
    Which pages to crawl
    Which pages to skip
    Where to find your sitemap
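This decision process can be simulated with Python's standard `urllib.robotparser` module; the site URL below is a made-up example, and a real crawler would fetch the live `/robots.txt` first:

```python
from urllib.robotparser import RobotFileParser

# Simulate a crawler applying robots.txt rules
# (example.com is hypothetical).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))       # True
```

In a real crawler, `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` fetches the live file instead of parsing inline lines.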

    Basic Syntax

    ```plaintext
    User-agent: *
    Allow: /
    Disallow: /admin/
    Disallow: /private/

    Sitemap: https://yoursite.com/sitemap.xml
    ```

    Key Directives

    User-agent: Which bot the rules apply to (* means all)
    Allow: Explicitly permit crawling of a path
    Disallow: Block crawling of a path
    Sitemap: Location of your XML sitemap

    What to Block

    Admin panels (/admin/, /wp-admin/)
    User account pages (/account/, /profile/)
    Search results pages (/search?)
    API endpoints (/api/)
    Duplicate content paths
    Staging or development paths
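Putting the list above together, a robots.txt might look like the following (the paths are illustrative; adjust them to your site's actual structure):

```plaintext
User-agent: *
Disallow: /admin/
Disallow: /wp-admin/
Disallow: /account/
Disallow: /profile/
Disallow: /search
Disallow: /api/
Disallow: /staging/

Sitemap: https://yoursite.com/sitemap.xml
```

Note that `Disallow: /search` matches both `/search` and parameterized URLs like `/search?q=term`, since robots.txt rules match by path prefix.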

    What NOT to Block

    CSS and JavaScript files (Google needs these to render pages)
    Images you want in Google Images
    Any page you want indexed
    Your sitemap

    Common Mistakes

    1. Blocking CSS/JS: this prevents Google from rendering your pages
    2. Using robots.txt for security: the file is publicly readable, not a security tool
    3. Blocking your entire site: a single Disallow: / blocks everything
    4. Forgetting about subdomains: each subdomain needs its own robots.txt
    5. Not including a sitemap reference: always add your sitemap URL
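Mistake #3 is easy to demonstrate with Python's standard `urllib.robotparser` (example.com is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# A single "Disallow: /" blocks every URL on the site.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

print(rp.can_fetch("*", "https://example.com/"))          # False
print(rp.can_fetch("*", "https://example.com/any/page"))  # False
```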

    Testing Your robots.txt

    Use Google Search Console's robots.txt report (which replaced the old robots.txt Tester) to verify that Google can fetch and parse your file as expected before deploying changes.
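Alongside Search Console, you can sanity-check a candidate file locally before uploading it. This sketch uses Python's standard `urllib.robotparser`, with hypothetical URLs standing in for pages you care about:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical pre-deploy check: parse the candidate file and assert
# the URLs you care about before uploading it.
candidate = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(candidate.splitlines())

# Pages that must stay crawlable
for url in ["https://yoursite.com/", "https://yoursite.com/blog/"]:
    assert rp.can_fetch("Googlebot", url), f"unexpectedly blocked: {url}"

# Paths that must stay blocked
assert not rp.can_fetch("Googlebot", "https://yoursite.com/admin/")

print("robots.txt checks passed")
```

Running a script like this in CI catches an accidental `Disallow: /` before it ever reaches production.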

    Free Tool

    Use our Robots.txt Generator to create a properly formatted robots.txt file with sitemap references and custom crawl rules. No signup required.

    About the author

    SiteSupport Team

    Cross-functional team of product specialists and support operators publishing practical guidance on AI support, SEO, and knowledge-base workflows.

