One of the websites I maintain is the wiki for http://auckland.thursdaynightcurry.com . As is common these days (I found out), there are a few bots out there that will add URLs to wikis, blogs and other sites to try and direct hits to various places (and improve their search engine ranking).
After cleaning up the first few defacements I put in a simple config change to stop this happening in the future.
All I do is require a username and password to be typed in when somebody posts to the wiki; the dumb trick is that the username and password are in plain sight. This means people have no problems, but bots are stopped.
What I did

1. Allow .htaccess files to work in the wiki area.
```
<Directory /var/www/curry>
    Options +ExecCGI
    AllowOverride Limit AuthConfig
</Directory>
```

2. Create a .htaccess file that ONLY stops POSTs and that tells people the username and password.
```
AuthUserFile /var/wherever/passwd
AuthName "Username is curry, password is curry"
AuthType Basic
<Limit POST>
require user curry
</Limit>
```

3. Create a file with the username and password.
```
htpasswd -nb curry curry > passwd
```

4. That's it.
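Steps 2 and 3 can be sketched as a small shell script. This is only an illustration: the wiki directory and the passwd path here are placeholders, not the real layout of the site.

```shell
#!/bin/sh
# Sketch of steps 2 and 3: write the POST-only auth rules into the
# wiki directory. WIKI_DIR and the AuthUserFile path are illustrative
# placeholders.
WIKI_DIR=.

cat > "$WIKI_DIR/.htaccess" <<'EOF'
AuthUserFile /var/wherever/passwd
AuthName "Username is curry, password is curry"
AuthType Basic
<Limit POST>
require user curry
</Limit>
EOF

# The passwd file itself is created with htpasswd as in step 3, e.g.:
#   htpasswd -nb curry curry > /var/wherever/passwd
```

Once Apache has been reloaded, a plain GET of a wiki page should succeed with no prompt, an unauthenticated POST should come back with a 401 (showing the username and password in the auth realm), and a POST sent with those credentials should go through.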
Comments

Obviously the method is pretty simple and isn't going to stop people who are manually spamming or defacing the site. But compared to making all my contributors create their own usernames and passwords, or running some other sort of anti-spam filter, it's pretty good. Let me know if you have any comments; thanks to Leon Strong and Ed Murphy for some pointers when I was getting it to work.