Python Crawler in Practice: How to Crawl Gaode (AMap) Region Data - 创新互联
In this post we share a hands-on example of crawling Gaode (AMap) region data with a Python crawler. Since many readers may not be familiar with this topic yet, the article is offered as a reference; hopefully you will come away with something useful after reading it. Let's take a look.
The code:
import requests
import json


def weatherlist(url1, url2, headers, proxies):
    # Fetch the city list and parse it as JSON.
    response = requests.get(url=url1, headers=headers, proxies=proxies).content.decode('utf-8')
    city_data = json.loads(response)
    # Cities are grouped by their initial letter under data.cityByLetter.
    for cities in city_data["data"]["cityByLetter"].values():
        for city in cities:
            adcode = city["adcode"]
            name = city["name"]
            # Query the weather service for this city's adcode.
            full_url = url2 + adcode
            response = requests.get(url=full_url, headers=headers, proxies=proxies).content.decode('utf-8')
            weather_data = json.loads(response)
            print(weather_data)
            try:
                if weather_data["data"]["data"]:
                    for block in weather_data["data"]["data"]:
                        for day in block['forecast_data']:
                            weather_name = day['weather_name']
                            temp_min = day['min_temp']
                            temp_max = day['max_temp']
                            # Append one line per forecast day: city, weather, max temp, min temp.
                            with open('weather_list.txt', 'a', encoding='utf-8') as fp:
                                fp.write("城市:" + name + " 天气: " + weather_name +
                                         " 最高气温: " + temp_max + " 最低气温: " + temp_min + '\n')
            except (KeyError, TypeError):
                # '空' means "empty": the response did not contain forecast data.
                print('空')


if __name__ == '__main__':
    # NOTE: the two endpoint URLs were garbled when this article was republished.
    # Judging by the fields the code reads, they point to AMap's city-list service and
    # its weather service queried by adcode; the values below are reconstructed assumptions.
    url1 = 'https://www.amap.com/service/cityList'         # city-list endpoint (assumed)
    url2 = 'https://www.amap.com/service/weather?adcode='  # weather endpoint, adcode appended (assumed)
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.97 Safari/537.36',
        'Cookie': 'BIDUPSID=F6BBCD59FE2A646812DB8DAE641A0BE5; PSTM=1573713375; BAIDUID=F6BBCD59FE2A6468D0329C1E2F60212F:FG=1; BD_UPN=12314353; BDORZ=B490B5EBF6F3CD402E515D22BCDA1598; H_PS_PSSID=1452_21098_29568_29221_26350; delPer=0; BD_CK_SAM=1; PSINO=2; H_PS_645EC=50d5uY51q2qJG%2BVlK7rlPmCgY73TcN9qKRz4sPKuBII1GIkIx4QkChitGd4; BDSVRTM=209'
    }
    # Example proxy from the original post; replace with a working proxy or pass proxies=None.
    proxies = {'http': '124.113.217.5:9999', 'https': ''}
    weatherlist(url1, url2, headers, proxies)
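The parsing above relies on two response shapes that the article never shows: the city-list JSON groups cities by initial letter under data.cityByLetter, and each weather response carries a forecast_data list per entry under data.data. The sketch below is only an assumption reconstructed from the fields the code reads (the field names come from the code; all values are made-up placeholders) and walks the same nested structure as weatherlist() does.

# Assumed shapes of the two responses, inferred from the fields read in weatherlist().
# All values are illustrative placeholders, not real API output.
city_list_response = {
    "data": {
        "cityByLetter": {
            "B": [{"adcode": "110000", "name": "北京"}],
            "S": [{"adcode": "310000", "name": "上海"}],
        }
    }
}

weather_response = {
    "data": {
        "data": [
            {
                "forecast_data": [
                    {"weather_name": "晴", "min_temp": "3", "max_temp": "12"},
                    {"weather_name": "多云", "min_temp": "5", "max_temp": "14"},
                ]
            }
        ]
    }
}

# Walking the structures the same way the crawler does:
for cities in city_list_response["data"]["cityByLetter"].values():
    for city in cities:
        print(city["adcode"], city["name"])

for block in weather_response["data"]["data"]:
    for day in block["forecast_data"]:
        print(day["weather_name"], day["max_temp"], day["min_temp"])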
That is all of "Python Crawler in Practice: How to Crawl Gaode (AMap) Region Data". Thanks for reading! Hopefully you now have a basic understanding and the content shared here is helpful. To learn more, follow the 创新互联 industry news channel.
Source: http://myzitong.com/article/dphdgi.html