A handy scraper helper: copyheaders

2018-12-23  真夜猫

When writing a scraper, copying the request headers out of the browser's dev tools normally means wrapping every line in quotes by hand, which is tedious. Here is a small trick that saves you from adding the quotes one by one. Let's go straight to the code:

from copyheaders import headers_raw_to_dict
headers = b'''
    :authority:c.y.qq.com
    :method:GET
    :path:/soso/fcgi-bin/client_search_cp?ct=24&qqmusic_ver=1298&new_json=1&remoteplace=txt.yqq.center&searchid=46360413927906065&t=0&aggr=1&cr=1&catZhida=1&lossless=0&flag_qc=0&p=1&n=20&w=%E6%98%8E%E5%A4%A9%E4%BD%A0%E5%A5%BD&g_tk=5381&jsonpCallback=MusicJsonCallback7934911028613236&loginUin=0&hostUin=0&format=jsonp&inCharset=utf8&outCharset=utf-8&notice=0&platform=yqq&needNewCode=0
    :scheme:https
    accept:*/*
    accept-encoding:gzip, deflate, sdch, br
    accept-language:zh-CN,zh;q=0.8
    cookie:cuid=6852877350; pgv_pvi=6596119552; RK=xB5dmM0g81; tvfe_boss_uuid=622f2b2912bb7f83; o_cookie=2353184487; ts_refer=www.baidu.com/link; ptcz=410ebd7ac68d0a114d731d573a83ff7f6572ed57fa43d90ad9ab90c7205751d8; pt2gguin=o2353184487; pgv_si=s6436702208; yplayer_open=1; yq_index=0; qqmusic_fromtag=66; yqq_stat=0; pgv_info=ssid=s4116171870; ts_last=y.qq.com/portal/search.html; pgv_pvid=2839864484; ts_uid=2016409769; player_exist=1
    referer:https://y.qq.com/portal/search.html
    user-agent:Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.104 Safari/537.36 Core/1.53.4549.400 QQBrowser/9.7.12900.400
    '''
headers = headers_raw_to_dict(headers)
print(headers)
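One caveat worth noting (my own observation, not something from the library's docs): headers copied from Chrome include HTTP/2 pseudo-headers such as `:authority` and `:path`, which are not valid header names for an ordinary HTTP/1.1 request. It is safer to drop them before passing the dict to a client like `requests`. A minimal sketch, using a small hypothetical header dict for illustration:

```python
# A sample of what headers_raw_to_dict might return: real header
# names mixed with HTTP/2 pseudo-headers (keys starting with ':').
headers = {
    ":authority": "c.y.qq.com",
    ":method": "GET",
    "accept": "*/*",
    "user-agent": "Mozilla/5.0",
}

# Keep only the regular headers; pseudo-headers are filtered out.
clean = {k: v for k, v in headers.items() if not k.startswith(":")}
print(clean)  # only 'accept' and 'user-agent' remain
```

The `clean` dict can then be passed as the `headers=` argument to `requests.get` without the server rejecting the request over malformed header names.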

And that's it: the headers are ready to use. Convenient, isn't it? Go ahead and give it a try.
