I'm using Scrapy to crawl job postings from a number of sites. If a page matches my criteria, I store a link to it in a database. That part works fine. I also wrote a script that walks through every link in the database and pings the URL; if it returns a 404, the link is deleted. The problem is that some sites return 403 errors when I run this delete check. Strangely, they all allow the scraping itself but block the check. Here is the script I use for the delete check:
from pymongo import MongoClient
import requests
import urllib3
from operator import itemgetter
import random
import time

client = MongoClient("path-to-mongo")
db = client["mongoDB"]
col = db['mongoCollection']

openings = list(col.find())
sorted_openings = sorted(openings, key=itemgetter('Company'))

del_counter = 0

user_agents = ["Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322)",
               "Mozilla/5.0 CK={} (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko",
               "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36",
               "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
               "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.157 Safari/537.36",
               "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322)",
               "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)",
               "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/605.1.15 (KHTML, like Gecko)",
               "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/605.1.15 (KHTML, like Gecko)",
               "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_4) AppleWebKit/605.1.15 (KHTML, like Gecko)",
               "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1 Safari/605.1.15",
               "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/603.3.8 (KHTML, like Gecko)"]

headers = {"User-Agent": user_agents[random.randint(0, 11)]}

counter = 0
del_counter = 0
passed_counter = 0
deleted_links = []
passed_links = []
forbidden = []

for item in sorted_openings:
    try:
        if requests.get(item['Link'], allow_redirects=False, verify=False, headers=headers).status_code == 200:
            print(str(requests.get(item['Link'])) + ' ' + item['Link'])
            counter += 1
            print(counter)
        elif requests.get(item['Link'], allow_redirects=False, verify=False, headers=headers).status_code == 304:
            print(requests.get(item['Link']))
            counter += 1
            print(counter)
        elif requests.get(item['Link'], allow_redirects=False, verify=False, headers=headers).status_code == 403:
            forbidden.append(item['Link'])
            print(requests.get(item['Link']))
            counter += 1
            print(counter)
        else:
            db.openings.remove(item)
            deleted_links.append(item['Link'])
            del_counter += 1
            counter += 1
            print('Deleted ' + item['Link'])
            print(counter)
    except:
        passed_links.append(item['Link'])
        passed_counter += 1
        counter += 1
        print('Passed link ' + item['Link'])
        print(counter)
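One likely cause of the 403s: sites that tolerate a crawler may still filter requests whose headers don't look like a real browser's, and the script above sends nothing but a `User-Agent`. Below is a minimal sketch of a `requests.Session` with fuller browser-like headers; the header values are illustrative examples and `browser_session` is a helper introduced here, not guaranteed to unblock any particular site:

```python
import random
import requests

# Two of the user-agent strings from the question, as examples.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1 Safari/605.1.15",
]

def browser_session():
    """Build a Session whose default headers resemble a browser request.

    A Session also reuses connections and keeps cookies across requests,
    which makes the checker look less like a one-shot bot.
    """
    session = requests.Session()
    session.headers.update({
        "User-Agent": random.choice(USER_AGENTS),
        "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
        "Accept-Language": "en-US,en;q=0.5",
        "Connection": "keep-alive",
    })
    return session
```

Usage: create one session before the loop (`session = browser_session()`) and call `session.get(item['Link'], allow_redirects=False, timeout=10)` instead of `requests.get(...)`, so every request carries the same browser-like headers.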
You are sending a fresh request in every condition branch, so each link can be fetched up to four times. Send one request, store the response in a variable, and then check that variable's status code against each condition.
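The single-request loop suggested above can be sketched as follows. The MongoDB calls are omitted so the sketch is self-contained; `classify` and `check_links` are hypothetical helpers introduced here, and 403 responses are bucketed separately rather than deleted, matching the intent of the original script:

```python
import requests

def classify(status_code):
    """Map an HTTP status code to an action for a stored link."""
    if status_code == 403:
        return "forbidden"  # blocked by the site: log it, don't delete
    if status_code in (200, 304):
        return "keep"       # the posting still exists
    return "delete"         # e.g. 404: the posting is gone

def check_links(links, session=None):
    """Issue ONE request per link and bucket each link by the result."""
    session = session or requests.Session()
    buckets = {"keep": [], "delete": [], "forbidden": [], "passed": []}
    for url in links:
        try:
            # One request, stored once, checked against every condition.
            resp = session.get(url, allow_redirects=False, timeout=10)
        except requests.RequestException:
            buckets["passed"].append(url)  # network error: skip, retry later
            continue
        buckets[classify(resp.status_code)].append(url)
    return buckets
```

In the original script each link would then be deleted from the collection only when it lands in the `"delete"` bucket; besides quartering the traffic, this also guarantees that the status code you branch on is the one you actually printed and counted.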