---
title: Robots.txt File
linktitle: Robots.txt
description: Hugo can generate a customized robots.txt in the same way as any other template.
date: 2017-02-01
publishdate: 2017-02-01
lastmod: 2017-02-01
categories: [templates]
keywords: [robots,search engines]
menu:
  docs:
    parent: "templates"
    weight: 165
weight: 165
sections_weight: 165
draft: false
aliases: [/extras/robots-txt/]
toc: false
---

To create your robots.txt as a template, first set the `enableRobotsTXT` value to `true` in your [configuration file][config]. By default, this option generates a robots.txt with the following content, which tells search engines that they are allowed to crawl everything:

```
User-agent: *
```
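For example, a minimal configuration sketch (assuming a TOML `config.toml`; the same key works in YAML or JSON configuration files):

```
# config.toml
enableRobotsTXT = true
```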
## Robots.txt Template Lookup Order

The [lookup order][lookup] for the `robots.txt` template is as follows:

* `/layouts/robots.txt`
* `/themes/<THEME>/layouts/robots.txt`
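For example, with a hypothetical theme named `mytheme`, a template at the project level shadows the theme's copy (a sketch of the relevant directory layout):

```
.
├── layouts/
│   └── robots.txt           <-- used if present
└── themes/
    └── mytheme/
        └── layouts/
            └── robots.txt   <-- fallback
```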
{{% note %}}
If you do not want Hugo to create a default `robots.txt` or leverage the `robots.txt` template, you can hand code your own and place the file in `static`. Remember that everything in the [static directory](/getting-started/directory-structure/) is copied over as-is when Hugo builds your site.
{{% /note %}}
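For instance, a hand-maintained file could live at `static/robots.txt`; the rules below are purely illustrative and assume you want to block crawling of a hypothetical `/admin/` section:

```
User-agent: *
Disallow: /admin/
```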
## Robots.txt Template Example

The following is an example `robots.txt` layout:

{{< code file="layouts/robots.txt" download="robots.txt" >}}
User-agent: *

{{range .Data.Pages}}
Disallow: {{.RelPermalink}}
{{end}}
{{< /code >}}

This template disallows all the pages of the site by creating one `Disallow` entry for each page.
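For a hypothetical site containing only the pages `/about/` and `/post/first/`, the generated `public/robots.txt` would contain roughly the following (the exact blank lines depend on the newlines around the template actions):

```
User-agent: *

Disallow: /about/

Disallow: /post/first/
```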
[config]: /getting-started/configuration/
[lookup]: /templates/lookup-order/
[robots]: http://www.robotstxt.org/