---
title: Robots.txt File
linktitle: Robots.txt
description: Hugo can generate a customized robots.txt in the same way as any other template.
date: 2017-02-01
publishdate: 2017-02-01
lastmod: 2017-02-01
weight: 165
sections_weight: 165
draft: false
toc: false
---
To create your robots.txt as a template, first set the `enableRobotsTXT` value to `true` in your configuration file. By default, this option generates a robots.txt with the following content, which tells search engines that they are allowed to crawl everything:
```
User-agent: *
```
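For example, if your site is configured with TOML, enabling the option might look like the following (a minimal sketch assuming a `config.toml` at the project root; adjust accordingly if you use YAML or JSON configuration):

```toml
# config.toml
# Tell Hugo to generate robots.txt from a template at build time
enableRobotsTXT = true
```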
## Robots.txt Template Lookup Order
The lookup order for the `robots.txt` template is as follows:

1. `/layouts/robots.txt`
2. `/themes/<THEME>/layouts/robots.txt`
{{% note %}}
If you do not want Hugo to create a default `robots.txt` or leverage the `robots.txt` template, you can hand code your own and place the file in `static`. Remember that everything in the `static` directory is copied over as-is when Hugo builds your site.
{{% /note %}}
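For instance, a hand-coded file at `static/robots.txt` could look like the following (a sketch; the `/admin/` path is only an illustrative placeholder):

```
User-agent: *
Disallow: /admin/
```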
## Robots.txt Template Example
The following is an example `robots.txt` layout:
{{< code file="layouts/robots.txt" download="robots.txt" >}}
User-agent: *
{{ range .Data.Pages }}
Disallow: {{ .RelPermalink }}
{{ end }}
{{< /code >}}
This template disallows all the pages of the site by creating one `Disallow` entry for each page.
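If you only want to keep certain content out of search engines rather than every page, the same approach can be narrowed. The sketch below assumes a hypothetical section named `private`; only pages in that section receive a `Disallow` entry:

```
User-agent: *
{{ range where .Data.Pages "Section" "private" }}
Disallow: {{ .RelPermalink }}
{{ end }}
```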