---
title: Robots.txt file
linkTitle: Robots.txt
description: Hugo can generate a customized robots.txt in the same way as any other template.
categories: [templates]
keywords: [robots,search engines]
menu:
  docs:
    parent: templates
weight: 230
aliases: [/extras/robots-txt/]
---
To generate a robots.txt file from a template, change the [site configuration]:

{{< code-toggle file=hugo >}}
enableRobotsTXT = true
{{< /code-toggle >}}

By default, Hugo generates robots.txt using an [internal template][internal].
```text
User-agent: *
```

Search engines that honor the Robots Exclusion Protocol will interpret this as permission to crawl everything on the site.

## robots.txt template lookup order

You may override the internal template with a custom template. Hugo selects the template using this lookup order:

1. `/layouts/robots.txt`
2. `/themes/<THEME>/layouts/robots.txt`

## robots.txt template example

{{< code file=layouts/robots.txt >}}
User-agent: *
{{ range .Pages }}
Disallow: {{ .RelPermalink }}
{{ end }}
{{< /code >}}

This template creates a robots.txt file with a `Disallow` directive for each page on the site. Search engines that honor the Robots Exclusion Protocol will not crawl any page on the site.
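Disallowing everything is rarely the goal, so a custom template would typically filter the pages it lists. As a hypothetical variation (the `noindex` front matter parameter is an assumption for illustration, not a Hugo built-in), a template could emit a `Disallow` directive only for pages that set that flag:

```go-html-template
User-agent: *
{{ range .Pages }}{{ if .Params.noindex }}
Disallow: {{ .RelPermalink }}
{{ end }}{{ end }}
```

Pages without the flag are omitted from the file and remain crawlable.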
{{% note %}}
To create a robots.txt file without using a template:

1. Set `enableRobotsTXT` to `false` in the site configuration.
2. Create a robots.txt file in the `static` directory.

Remember that Hugo copies everything in the [static directory][static] to the root of `publishDir` (typically `public`) when you build your site.

[static]: /getting-started/directory-structure/
{{% /note %}}
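As a sketch of the static-file approach above (the `Disallow` rule and paths are examples only, run from a Hugo project root):

```shell
# Place a hand-written robots.txt in static/; Hugo copies it
# verbatim to the root of publishDir at build time.
mkdir -p static
cat > static/robots.txt <<'EOF'
User-agent: *
Disallow: /admin/
EOF
```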

[site configuration]: /getting-started/configuration/
[internal]: https://github.com/gohugoio/hugo/blob/master/tpl/tplimpl/embedded/templates/_default/robots.txt