How To Use Robots & Spammers To Run Your Cron Jobs
By Angsuman Chakraborty, Gaea News Network | Monday, July 24, 2006
Many blogging packages and other web applications require some kind of cron and/or asynchronous execution facility. Meanwhile, spammers and web robots are a fact of life for any website today. Most web hosting providers, especially on shared hosting, do not offer cron. Let's see how we can leverage spammers and web robots (like MSN bot) to get our periodic or asynchronous jobs done.
Using robots.txt
You can use an .htaccess rewrite rule to serve your robots.txt from a PHP file (or any other server-side scripting language like JSP or ASP). Then, when an unfamiliar (and less useful, or useless) robot requests it, you can trigger your cron job. The robot absorbs the delay (who cares?) and your job gets done, without bothering your users with unexpected slowdowns.
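A minimal sketch of the idea, assuming Apache with mod_rewrite enabled. First, the .htaccess rule that hands robots.txt requests to a script:

```apache
# Serve robots.txt from a PHP script instead of a static file
RewriteEngine On
RewriteRule ^robots\.txt$ robots.php [L]
```

Then a hypothetical robots.php that emits normal robots.txt content, flushes it to the bot, and only afterwards runs the pending jobs. Note that run_pending_jobs() is a placeholder for whatever your software actually exposes (e.g. including WordPress's wp-cron.php), not a real function:

```php
<?php
// robots.php -- hypothetical handler (sketch, not a drop-in file).
// Serves robots.txt content, then piggybacks pending jobs on the
// crawler's request so human visitors never feel the delay.
header('Content-Type: text/plain');

echo "User-agent: *\n";
echo "Disallow: /wp-admin/\n";

// Send the response to the bot before doing the slow work.
flush();

// Placeholder: run whatever periodic jobs your application has queued.
run_pending_jobs();
```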
Using spammers
Many spammers are easily caught by simple .htaccess rules. Instead of denying them access, you can redirect them to a PHP file that triggers your cron jobs or background processes. As an added benefit, this slows down their rate of spamming.
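A sketch of such a rule, again assuming Apache with mod_rewrite. The user-agent strings shown (EmailSiphon, ExtractorPro) are well-known scraper bots used here as examples; cron-trigger.php is a hypothetical name for the script that runs your background jobs:

```apache
# Instead of denying known spam bots, rewrite their requests to the
# script that runs our pending jobs -- they pay the delay, not our users.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (EmailSiphon|ExtractorPro) [NC]
RewriteRule .* cron-trigger.php [L]
```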
Caveat: If you have a brand new blog then you may not get much robot juice or spam attack.
Credit: The idea of using robots.txt was conceived (as far as I know) by Brian Layman, and others are enthusiastically following it with their own variants on the wp-hackers mailing list.