Apache: deny requests by user_agent (block unwanted crawlers/scrapers)
This blocks clients we do not want visiting the site, implemented with the rewrite module. First check the access log to see which user agents are hitting the server:
tail /usr/local/apache2/logs/test.com-access_20150121_log
Add the following to the rewrite rule section:
RewriteCond %{HTTP_USER_AGENT} ^.*curl.* [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*chrome.* [NC]
RewriteRule .* - [F]
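In context, these rules sit inside the virtual host block. A minimal sketch of how that section might look (the DocumentRoot is an assumption for illustration; only the rewrite lines come from this note):

```apache
<VirtualHost *:80>
    DocumentRoot "/data/www"    # assumed path, adjust to your setup
    ServerName www.test.com
    <IfModule mod_rewrite.c>
        RewriteEngine on
        # [NC] = case-insensitive match, [OR] = either condition suffices,
        # [F] = answer with 403 Forbidden
        RewriteCond %{HTTP_USER_AGENT} ^.*curl.* [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} ^.*chrome.* [NC]
        RewriteRule .* - [F]
    </IfModule>
</VirtualHost>
```

Without the [OR] flag the two conditions would both have to match the same request, so neither curl nor Chrome alone would ever be blocked.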
Save and exit.
Test:
apachectl -t
apachectl restart
curl -x127.0.0.1:80 www.test.com/data/forum.php (curl's default User-Agent contains "curl", so this returns 403 Forbidden)
curl -x192.168.11.160:80 www.test.com/data/forum.php (403 Forbidden)
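Why curl is blocked even without -A: its default User-Agent string (e.g. "curl/7.29.0") contains "curl". A quick local sketch of the case-insensitive match, using grep -Ei to approximate the [NC] regex conditions (the UA strings below are examples, not taken from this server's log):

```shell
# Approximate the two RewriteCond patterns with one case-insensitive regex.
ua_blocked() {
    printf '%s\n' "$1" | grep -Eiq 'curl|chrome'
}

for ua in "curl/7.29.0" "Mozilla/5.0 Chrome/39.0" "Baiduspider/2.0"; do
    if ua_blocked "$ua"; then
        echo "$ua -> 403"
    else
        echo "$ua -> 200"
    fi
done
```

Running this prints 403 for the first two user agents and 200 for the last one, matching the curl tests above.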
vi /usr/local/apache2/conf/extra/httpd-vhosts.conf (comment out the curl condition:)
# RewriteCond %{HTTP_USER_AGENT} ^.*curl.*
apachectl -t
apachectl restart
curl -x192.168.11.160:80 www.test.com/adsjkf(404)
curl -x192.168.11.160:80 www.test.com/forum.php -I(200)
Browser test: open www.test.com (a Chrome browser is still denied, since its User-Agent contains "Chrome")
curl -A "sdfsadsdfa" -x192.168.11.160:80 www.test.com/forum.php -I (200, this User-Agent matches neither pattern)
curl -A "sdfsadchromesdfa" -x192.168.11.160:80 www.test.com/forum.php -I (403, this User-Agent contains "chrome")
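With the curl condition commented out, only the chrome pattern decides the outcome. A small sketch of that decision (status_for is a hypothetical helper; the statuses follow from the one remaining rule):

```shell
# Only the chrome RewriteCond is active now; curl-like UAs pass through.
status_for() {
    if printf '%s\n' "$1" | grep -Eiq 'chrome'; then
        echo 403
    else
        echo 200
    fi
}

status_for "sdfsadsdfa"        # prints 200
status_for "sdfsadchromesdfa"  # prints 403 (contains "chrome")
```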