#75 ✓resolved
Kieran P

Create a dynamic robots.txt file for search engine crawlers

Reported by Kieran P | October 6th, 2008 @ 10:42 AM | in 1.2

Evaluate updating public/robots.txt whenever a basket is created, with Disallow entries for things like forms, logins, etc.

We'll want to add any new links that a bot shouldn't follow (anything that triggers a login redirect, for example).
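Stripped of the Rails plumbing, the generation step amounts to building Disallow lines per basket. A minimal sketch, assuming hypothetical basket names and paths (not the actual Kete routes):

```ruby
# Build robots.txt content from a list of basket names.
# The disallowed paths below are illustrative examples, not
# the real Kete URL layout.
def robots_txt_for(baskets)
  lines = ['User-agent: *']
  baskets.each do |basket|
    # Disallow anything that triggers a login redirect: forms, logins, etc.
    lines << "Disallow: /#{basket}/account/login"
    lines << "Disallow: /#{basket}/forms"
  end
  lines.join("\n") + "\n"
end

puts robots_txt_for(%w[site about_kete])
```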

We'll want a rake task to update robots.txt for existing baskets (for sites that already exist)
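Such a task might look like the following sketch. The task name, basket list, and disallowed paths are assumptions; in the app the baskets would come from the Basket model and the file would be public/robots.txt:

```ruby
require 'rake'
include Rake::DSL # pull the task/namespace DSL into this standalone script

# Hypothetical rake task to regenerate robots.txt for existing baskets.
namespace :robots do
  desc 'Rewrite robots.txt with Disallow rules for every basket'
  task :update do
    baskets = %w[site about_kete] # stand-in for Basket.all
    lines = ['User-agent: *']
    baskets.each { |b| lines << "Disallow: /#{b}/account/login" }
    File.write('robots.txt', lines.join("\n") + "\n")
  end
end

Rake::Task['robots:update'].invoke
```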

See the current public/robots.txt

Make it a Rails-based file like uptime.txt: generate the output, then cache the entire page (page caching, not fragment caching).

Add a route to it in routes.rb
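A routes.rb entry along these lines could map the request to a controller action (Rails 2-era syntax; the controller and action names are assumptions):

```ruby
# config/routes.rb -- serve /robots.txt from a controller so the
# response can be generated dynamically and page-cached.
map.robots 'robots.txt', :controller => 'robots', :action => 'show', :format => 'txt'
```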

Add an observer/after_filter on the basket model/controller to expire the cache when a basket is created or deleted.
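Stripped of the Rails observer machinery, the expiry logic is: cache the rendered file once, and invalidate it on any basket change so the next request regenerates it. A sketch with hypothetical class and method names:

```ruby
# Memoize the rendered robots.txt and expire it when baskets change.
# In the app this would be a sweeper/observer hooked into the Basket
# model's callbacks; this class only illustrates the idea.
class RobotsCache
  def initialize(&generator)
    @generator = generator
    @cached = nil
  end

  # Serve the cached copy, generating it on the first request.
  def fetch
    @cached ||= @generator.call
  end

  # Called from the observer when a basket is created or deleted.
  def expire!
    @cached = nil
  end
end

cache = RobotsCache.new { "User-agent: *\nDisallow: /site/account/login\n" }
cache.fetch  # generates and stores the content
cache.expire! # basket changed: the next fetch regenerates
```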

Comments and changes to this ticket

  • Kieran P

    Kieran P October 15th, 2008 @ 01:35 PM

    • State changed from “new” to “resolved”
    • Tag set to baskets, robots, seo

    Master branch now contains code to dynamically generate robots.txt when requested, and caches it until it's changed (when a basket is created, edited, or removed).


Kete was developed by Horowhenua Library Trust and Katipo Communications Ltd. to build a digital library of Horowhenua material.
