
Inject config in robots.txt context

This allows using `base_url` in robots.txt, for example to reference a
sitemap.
branch: index-subcmd
Greizgh, 6 years ago
commit fec58054b4
4 changed files with 7 additions and 3 deletions:

1. components/site/src/lib.rs (+3, -1)
2. components/site/tests/site.rs (+1, -0)
3. docs/content/documentation/templates/robots.md (+2, -2)
4. test_site/templates/robots.txt (+1, -0)

components/site/src/lib.rs (+3, -1)

```diff
@@ -670,9 +670,11 @@ impl Site {
     /// Renders robots.txt
     pub fn render_robots(&self) -> Result<()> {
         ensure_directory_exists(&self.output_path)?;
+        let mut context = Context::new();
+        context.insert("config", &self.config);
         create_file(
             &self.output_path.join("robots.txt"),
-            &render_template("robots.txt", &self.tera, &Context::new(), &self.config.theme)?,
+            &render_template("robots.txt", &self.tera, &context, &self.config.theme)?,
         )
     }
```
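As a minimal standalone sketch of what this change enables (not part of the commit; assumes the tera 1.x API, and `SiteConfig` is a hypothetical stand-in for Gutenberg's `Config` struct), inserting a serializable value under the `config` key makes its fields resolvable from the template:

```rust
// Sketch only: demonstrates the Tera pattern the patch uses.
// `SiteConfig` is a hypothetical stand-in for Gutenberg's Config.
use serde::Serialize;
use tera::{Context, Tera};

#[derive(Serialize)]
struct SiteConfig {
    base_url: String,
}

fn main() -> tera::Result<()> {
    let config = SiteConfig {
        base_url: "https://example.com".to_string(),
    };

    // Same pattern as the patch: insert the config under the "config" key.
    let mut context = Context::new();
    context.insert("config", &config);

    // With an empty Context::new() (the old behaviour), `config.base_url`
    // would fail to resolve; with the populated context it renders fine.
    let rendered =
        Tera::one_off("Sitemap: {{ config.base_url }}/sitemap.xml", &context, false)?;
    assert_eq!(rendered, "Sitemap: https://example.com/sitemap.xml");
    Ok(())
}
```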



components/site/tests/site.rs (+1, -0)

```diff
@@ -168,6 +168,7 @@ fn can_build_site_without_live_reload() {
 
     // robots.txt has been rendered from the template
     assert!(file_contains!(public, "robots.txt", "User-agent: gutenberg"));
+    assert!(file_contains!(public, "robots.txt", "Sitemap: https://replace-this-with-your-url.com/sitemap.xml"));
 }
 
 #[test]
```
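`file_contains!` is a helper macro from the repo's test suite; a plain-function equivalent (a sketch, not the actual macro) would read the file from the output directory and check for a substring:

```rust
// Hypothetical stand-in for the `file_contains!` test helper,
// not the repository's actual macro.
use std::fs;
use std::path::Path;

fn file_contains(public: &Path, file: &str, needle: &str) -> bool {
    fs::read_to_string(public.join(file))
        .map(|contents| contents.contains(needle))
        .unwrap_or(false)
}
```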


docs/content/documentation/templates/robots.md (+2, -2)

````diff
@@ -6,8 +6,8 @@ weight = 70
 Gutenberg will look for a `robots.txt` file in the `templates` directory or
 use the built-in one.
 
-Robots.txt is the simplest of all templates: it doesn't take any variables
-and the default is what most site want.
+Robots.txt is the simplest of all templates: it only gets the config
+and the default is what most site want:
 
 ```jinja2
 User-agent: *
````


test_site/templates/robots.txt (+1, -0)

```diff
@@ -1,2 +1,3 @@
 User-agent: gutenberg
 Allow: /
+Sitemap: {{config.base_url}}/sitemap.xml
```
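With the test site's `base_url` set to `https://replace-this-with-your-url.com` (the value the integration test above asserts), this template renders to:

```txt
User-agent: gutenberg
Allow: /
Sitemap: https://replace-this-with-your-url.com/sitemap.xml
```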
