Python: the third method
import urllib2
import cookielib
url = "http://www.baidu.com/"
print 'third'
cj = cookielib.CookieJar()  # cookies sent by the server are stored here
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))  # opener that records cookies
urllib2.install_opener(opener)  # make it the default for urlopen
response3 = urllib2.urlopen(url)
print response3.getcode()  # HTTP status code, e.g. 200
print cj  # the cookies collected during the request
print response3.read()
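For readers on Python 3, the same cookie-handling approach still works: urllib2 and cookielib were merged into urllib.request and http.cookiejar. A minimal sketch follows (the network call is commented out so the snippet runs offline):

```python
import urllib.request
import http.cookiejar

url = "http://www.baidu.com/"

cj = http.cookiejar.CookieJar()  # cookies received from the server land here
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(cj))  # opener that records cookies
urllib.request.install_opener(opener)        # make it the default for urlopen

# Uncomment to actually fetch the page:
# response3 = urllib.request.urlopen(url)
# print(response3.getcode())
# print(cj)
# print(response3.read())

print(len(cj))  # no request has been made yet, so the jar holds 0 cookies
```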
2016-11-01
Python 2.7.12: the second method
————————————————————————————————
import urllib2
url = "http://www.baidu.com/"
print 'second'
request = urllib2.Request(url)
request.add_header('user-agent', 'Mozilla/5.0')  # present ourselves as a browser
response2 = urllib2.urlopen(request)
print response2.getcode()  # HTTP status code
print len(response2.read())  # size of the page body in bytes
————————————————————————————————
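On Python 3 the second method translates almost line for line: urllib2.Request becomes urllib.request.Request. A sketch, with the fetch itself commented out so it runs without network access:

```python
import urllib.request

url = "http://www.baidu.com/"

request = urllib.request.Request(url)
request.add_header('User-Agent', 'Mozilla/5.0')  # present ourselves as a browser

# Uncomment to perform the request:
# response2 = urllib.request.urlopen(request)
# print(response2.getcode())
# print(len(response2.read()))

print(request.get_header('User-agent'))  # header keys are stored capitalized
```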
2016-11-01
Python 2.7.12: the first method
————————————————————————————————
import urllib2
url = "http://www.baidu.com/"
print 'first'
response1 = urllib2.urlopen(url)  # simplest form: fetch the URL directly
print response1.getcode()  # HTTP status code
print len(response1.read())  # size of the page body in bytes
————————————————————————————————
2016-11-01
Latest answer / 宇娃
find_all is a method provided by BeautifulSoup. To install it from cmd: C:\Python27\Scripts>pip install Beautifulsoup
2016-11-01
# Add a few things
import codecs

def output_html(self):
    # Python 2's built-in open() does not accept an encoding argument;
    # use codecs.open (or io.open) to write UTF-8 text
    fount = codecs.open("output.html", "w", encoding='utf-8')
    fount.write("<meta charset='utf-8'>")
2016-10-31
My output is this:
C:\Python27\python.exe D:/pycharm/xiexie/baike_spider/spider_main.py
craw 1 : None
craw failed
Process finished with exit code 0
Why?
2016-10-27
Latest answer / 慕粉4289539
After running it I get this:
C:\Python27\python.exe D:/pycharm/xiexie/baike_spider/spider_main.py
craw 1 : None
craw failed
Process finished with exit code 0
2016-10-27