
How to Analyze Search Engine Spider Logs
A search engine spider log file is a powerful but underused resource for webmasters. Analyzing it shows how each search engine crawls your site's content, and lets you review spider behavior over a given period of time.
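Before looking at the IP tables below, here is a minimal sketch of what such an analysis can look like in practice: a short Python script that scans a combined-format access log (the Apache/Nginx default) and tallies requests per spider. The log path and the bot list are assumptions; adjust them for your own server.

```python
import re
from collections import Counter

# Minimal sketch: tally spider activity from a combined-format access log.
# LOG_PATH and BOTS are assumptions - point them at your own server's log
# and the crawlers you care about.
LOG_PATH = "access.log"
BOTS = ["Googlebot", "bingbot", "Baiduspider", "YandexBot", "DuckAssistBot"]

# Combined log format:
# ip - - [time] "METHOD /path HTTP/x.y" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

hits = Counter()   # total requests per spider
pages = Counter()  # (spider, path) pairs, to see what each one fetches

with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        m = LINE_RE.match(line)
        if not m:
            continue  # skip lines that are not in combined format
        ua = m.group("ua")
        for bot in BOTS:
            if bot in ua:
                hits[bot] += 1
                pages[(bot, m.group("path"))] += 1
                break

for bot, n in hits.most_common():
    print(f"{bot}: {n} requests")
for (bot, path), n in pages.most_common(10):
    print(f"  {bot} -> {path}: {n}")
```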
IP Address (59) | Server Name | Country |
---|---|---|
20.53.78.236 | ? | AU |
51.104.167.87 | ? | IE |
52.143.95.204 | ? | US |
52.190.37.160 | ? | US |
52.154.169.50 | ? | US |
52.224.20.190 | ? | US |
52.154.170.113 | ? | US |
52.149.58.69 | ? | US |
52.224.16.221 | ? | US |
52.149.28.18 | ? | US |
20.204.246.254 | ? | IN |
52.146.63.80 | ? | US |
52.149.58.139 | ? | US |
52.224.19.152 | ? | US |
20.50.48.192 | ? | NL |
20.50.50.121 | ? | NL |
20.50.50.130 | ? | NL |
20.50.50.145 | ? | NL |
52.224.21.53 | ? | US |
52.224.21.4 | ? | US |
104.43.54.127 | ? | SG |
52.143.243.117 | ? | US |
52.154.170.209 | ? | US |
51.104.160.177 | ? | IE |
40.114.182.45 | ? | NL |
40.119.232.50 | ? | ? |
52.224.21.20 | ? | ? |
20.207.72.21 | ? | IN |
20.204.242.101 | ? | IN |
52.149.61.51 | ? | US |
52.143.247.235 | ? | US |
40.119.232.251 | ? | SG |
40.119.232.215 | ? | SG |
51.104.167.61 | ? | IE |
20.50.49.25 | ? | NL |
51.138.90.206 | ? | NL |
40.114.182.172 | ? | NL |
51.104.167.19 | ? | IE |
52.154.170.96 | ? | US |
52.154.171.150 | ? | US |
52.224.21.61 | ? | US |
52.149.28.83 | ? | US |
52.224.20.227 | ? | US |
52.224.21.51 | ? | US |
51.138.90.161 | ? | NL |
20.207.72.110 | ? | IN |
52.154.170.229 | ? | US |
40.114.182.153 | ? | NL |
52.149.60.38 | ? | US |
52.154.170.122 | ? | US |
51.104.164.189 | ? | IE |
191.235.201.214 | ? | BR |
52.154.172.2 | ? | US |
52.224.16.229 | ? | US |
52.224.20.249 | ? | US |
20.50.48.159 | ? | NL |
52.154.169.200 | ? | US |
52.149.30.45 | ? | US |
52.154.170.28 | ? | US |
IP Address (12) | Server Name | Country |
---|---|---|
51.8.71.117 | 51.8.71.117 | US |
4.209.224.56 | 4.209.224.56 | IE |
20.3.1.178 | 20.3.1.178 | US |
108.141.83.74 | 108.141.83.74 | NL |
172.169.17.165 | 172.169.17.165 | US |
4.213.46.14 | 4.213.46.14 | IN |
20.49.136.28 | 20.49.136.28 | GB |
4.228.76.163 | 4.228.76.163 | BR |
20.12.141.99 | 20.12.141.99 | US |
51.120.48.122 | 51.120.48.122 | NO |
40.80.242.63 | 40.80.242.63 | CA |
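A User-Agent string is trivial to fake, so it is worth cross-checking the source IP. Many of the addresses above have no reverse DNS record (the "?" in the Server Name column), in which case the operator's published IP list is the only reference; for crawlers that do publish PTR records, such as Googlebot (*.googlebot.com) and Bingbot (*.search.msn.com), the standard check is a reverse lookup followed by a confirming forward lookup. A minimal sketch, with an illustrative IP and hostname suffix:

```python
import socket

# Minimal sketch: verify that an IP claiming to be a crawler really belongs
# to the operator. Reverse-resolve the IP, check the domain suffix, then
# forward-resolve the hostname and make sure it maps back to the same IP.
def verify_crawler_ip(ip: str, expected_suffix: str) -> bool:
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS (PTR record)
    except socket.herror:
        return False  # no PTR record at all, as for many IPs in the tables
    if not host.endswith(expected_suffix):
        return False  # PTR exists but points at the wrong domain
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # confirming lookup
    except socket.gaierror:
        return False
    return ip in forward_ips

# Illustrative check: an IP from the tables above against Bingbot's domain;
# addresses without a PTR record simply return False.
print(verify_crawler_ip("20.53.78.236", ".search.msn.com"))
```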
In general, do not block this spider. Search engine crawlers power search engines and are an effective way for users to discover your site; in fact, blocking them can significantly reduce a site's organic traffic.
You can block DuckAssistBot, or restrict its access, by setting user-agent rules in your site's robots.txt file. We recommend installing the Spider Analyser plugin to check whether the bot actually follows these rules.
```
# robots.txt
# The following rule will, in most cases, block this agent
User-agent: DuckAssistBot
Disallow: /
```
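To confirm what a given robots.txt actually permits for this user agent, you can also evaluate the rules locally with Python's standard-library robotparser before relying on the bot itself to comply. A minimal sketch, where https://example.com is a placeholder for your own domain:

```python
from urllib.robotparser import RobotFileParser

# Minimal sketch: check locally what the live robots.txt allows for
# DuckAssistBot. "https://example.com" is a placeholder domain.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in ("https://example.com/", "https://example.com/some/page"):
    print(url, "->", rp.can_fetch("DuckAssistBot", url))
# With the "Disallow: /" rule above in place, both lines print False.
```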
You do not need to do this by hand: our WordPress plugin Spider Analyser can block unwanted spiders and crawlers for you.