---
title: Robots.txt File
linktitle: Robots.txt
description: Hugo can generate a customized robots.txt in the same way as any other template.
date: 2017-02-01
publishdate: 2017-02-01
lastmod: 2017-02-01
categories: [templates]
keywords: [robots, search engines]
menu:
  docs:
    parent: "templates"
    weight: 165
weight: 165
sections_weight: 165
draft: false
aliases: [/extras/robots-txt/]
toc: false
---

To create your robots.txt as a template, first set `enableRobotsTXT` to `true` in your configuration file. By default, this option generates a robots.txt with the following content, which tells search engines that they are allowed to crawl everything:

```
User-agent: *
```
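Enabling the option is a one-line change in your site configuration. For example, assuming a TOML configuration file named `config.toml`:

```toml
# config.toml
# Tell Hugo to generate robots.txt from a template
enableRobotsTXT = true
```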

## Robots.txt Template Lookup Order

The lookup order for the robots.txt template is as follows:

1. `/layouts/robots.txt`
2. `/themes/<THEME>/layouts/robots.txt`

{{% note %}} If you do not want Hugo to create a default robots.txt or leverage the robots.txt template, you can hand-code your own and place the file in the `static` directory. Remember that everything in the `static` directory is copied over as-is when Hugo builds your site. {{% /note %}}

## Robots.txt Template Example

The following is an example robots.txt layout:

{{< code file="layouts/robots.txt" download="robots.txt" >}}
User-agent: *

{{range .Data.Pages}}
Disallow: {{.RelPermalink}}
{{end}}
{{< /code >}}

This template disallows all the pages of the site by creating one Disallow entry for each page.
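As an illustration, for a hypothetical site containing only the pages `/about/` and `/blog/first-post/` (these paths are assumptions, not part of the original example), the rendered robots.txt would look something like:

```
User-agent: *

Disallow: /about/

Disallow: /blog/first-post/
```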