Letting the Robots Through
I guess this is a feature request if you don't already have something that will do this.
I'm working on a site for a cigar company that requires visitors to verify that they are 21 or older. I have to block access to every page of the site until the visitor enters their birthdate and submits the form, at which point I set a session to allow access to the rest of the site. Of course, we want Google and the other search engines to be able to crawl the site. I know there's a browser helper as well as a nifty Google Analytics helper. Do you have something built in that would detect spiders and such so I could let them through?
I'm sure I could assemble a regex to match common search-engine strings against $_SERVER['HTTP_USER_AGENT'], but I thought it would be way cool to find out you've already got something like this ready to go.
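The hand-rolled approach described above could be sketched like this. The bot list here is illustrative only, not exhaustive, and the function name is made up for the example:

```php
<?php
// Sketch: detect common crawlers by matching the User-Agent header.
// The alternation list below is a small, illustrative sample; real
// crawler detection needs a maintained list.
function is_search_bot($user_agent)
{
    $bots = 'googlebot|msnbot|bingbot|slurp|baiduspider|yandex';
    return (bool) preg_match('/(' . $bots . ')/i', $user_agent);
}

// Example: let crawlers past the age gate.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (is_search_bot($ua)) {
    // skip the birthdate form
}
```

Note that user agents are trivially spoofable, so this only keeps honest crawlers honest; it doesn't prove the visitor is really a search engine.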
Comments
http://codeigniter.com/user_guide/libraries/user_agent.html
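The User Agent class linked above exposes an is_robot() method, which covers exactly this case. A minimal sketch of using it inside a controller to wave crawlers past the age gate — the 'verify' route and the 'age_verified' session key are assumptions for the example, not part of the library:

```php
// Controller context assumed; session library must be loaded.
$this->load->library('user_agent');

// 'age_verified' and 'verify' are hypothetical names for this sketch.
if ( ! $this->agent->is_robot() && ! $this->session->userdata('age_verified'))
{
    redirect('verify');
}
```

Putting this in a base controller (or a post_controller_constructor hook) would apply the check site-wide instead of repeating it per page.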