Hugo Robots.txt template
User Request: Please provide a Hugo Robots.txt template in the Go language, along with explanations for each step. Avoid using personal words or sentences at the beginning and end of your reply.
The following Go program serves a robots.txt for a Hugo site, with explanations for each step:
```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"strings"
)

func main() {
	// Serve the generated robots.txt at /robots.txt.
	http.HandleFunc("/robots.txt", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "text/plain")
		fmt.Fprint(w, generateRobotsTxt(r.Host))
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}

// generateRobotsTxt builds the robots.txt body for the given host.
func generateRobotsTxt(host string) string {
	var sb strings.Builder
	sb.WriteString("User-agent: *\n")
	sb.WriteString("Disallow: /admin/\n")
	sb.WriteString("Disallow: /private/\n")
	sb.WriteString("Disallow: /secret/\n")
	// r.Host carries no scheme, so prepend one to form an absolute sitemap URL.
	sb.WriteString(fmt.Sprintf("Sitemap: https://%s/sitemap.xml\n", host))
	return sb.String()
}
```
Explanation:
- The main function is the entry point of the program. It registers an HTTP handler and starts the server.
- http.HandleFunc registers a handler for the "/robots.txt" path; the provided function is called for every request to that path.
- Inside the handler, the "Content-Type" header is set to "text/plain" to indicate that the response body is plain text.
- fmt.Fprint writes the generated robots.txt content to the response writer.
- The generateRobotsTxt function takes the request host as a parameter and builds the content of the robots.txt file.
- The example disallows access to specific directories such as "/admin/", "/private/", and "/secret/" for all user-agents.
- The "Sitemap" directive tells crawlers where the sitemap lives; its URL is derived from the host parameter.
This template can serve as a starting point for generating a robots.txt file for a Hugo website with Go.
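Note that Hugo can also generate robots.txt itself, without a separate Go server: set enableRobotsTXT = true in the site configuration and Hugo renders layouts/robots.txt as a Go template at build time. A minimal sketch (the disallowed paths are illustrative, matching the example above):

```
User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow: /secret/
Sitemap: {{ "sitemap.xml" | absURL }}
```

Here absURL is a built-in Hugo template function that resolves the path against the site's baseURL, so the sitemap line stays correct across environments.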