---
lastmod: 2016-01-06
date: 2015-12-08
menu:
main:
parent: extras
next: /community/mailing-list
prev: /extras/urls
title: Custom robots.txt
2015-12-08 21:13:09 +00:00
weight: 120
---
2016-01-07 04:11:02 +00:00
Hugo can generated a customized [robots.txt ](http://www.robotstxt.org/ ) in the
[same way as any other templates ]({{< ref "templates/go-templates.md" >}} ).
2015-12-08 21:13:09 +00:00
2016-01-07 04:11:02 +00:00
By default, it generates a robots.txt, which allows everything, with the following content:
2015-12-08 21:13:09 +00:00
2016-01-07 04:11:02 +00:00
```http
User-agent: *
```

To disable it, just set the `disableRobotsTXT` option to `true` on the [command line]({{< ref "commands/hugo.md" >}}) or in the [configuration file]({{< ref "overview/configuration.md" >}}).
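
For example, assuming a TOML configuration file (`config.toml`) and the option name documented above, the setting might look like this:

```toml
# config.toml (hypothetical snippet; option name as documented above)
baseurl = "http://example.org/"
disableRobotsTXT = true
```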
Hugo will use the `robots.txt` template according to the following list, in descending order of precedence:

* /layouts/robots.txt
* /themes/`THEME`/layouts/robots.txt

An example of a robots.txt layout is:

```http
User-agent: *

{{range .Data.Pages}}
Disallow: {{.RelPermalink}}{{end}}
```

This template disallows all the pages of the site, creating one `Disallow` entry for each page.
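
For instance, on a site with two hypothetical pages at `/about/` and `/post/first-post/`, the rendered robots.txt would look roughly like this:

```http
# Hypothetical output for a site with pages /about/ and /post/first-post/
User-agent: *

Disallow: /about/
Disallow: /post/first-post/
```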