For Mac users: just remove the trailing .encode('utf8') from
fout.write("<td>%s</td>" % data['title'].encode('utf8'))
fout.write("<td>%s</td>" % data['summary'].encode('utf8'))
and the garbled output disappears. Mac (and Linux) default to UTF-8, while Windows defaults to GBK, which is why the instructor added the encode step in the first place.
PS: this pit is deep, it cost me 3 hours.
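A minimal cross-platform sketch, assuming the course's HtmlOutputer writes an output.html file from a list of dicts with 'title' and 'summary' keys (the file name and structure are assumptions): in Python 3, pass an explicit encoding to open() and write plain str, so no per-field encode is needed on either Windows or Mac.

def output_html(datas):
    # encoding='utf-8' fixes the file encoding explicitly, so the platform
    # default (GBK on Windows, UTF-8 on Mac/Linux) no longer matters
    with open('output.html', 'w', encoding='utf-8') as fout:
        fout.write("<html><head><meta charset='utf-8'></head><body><table>")
        for data in datas:
            fout.write("<tr>")
            fout.write("<td>%s</td>" % data['title'])
            fout.write("<td>%s</td>" % data['summary'])
            fout.write("</tr>")
        fout.write("</table></body></html>")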
2017-02-07
Python3
print("正则匹配")
link_node=soup.find('a',href=re.compile(r'ill'))
print(link_node.name,link_node['href'],link_node.get_text())
print("正则匹配")
link_node=soup.find('a',href=re.compile(r'ill'))
print(link_node.name,link_node['href'],link_node.get_text())
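This snippet and the two below assume a soup object already built from the course's sample page. A minimal self-contained setup, assuming the standard "three sisters" sample from the Beautiful Soup docs is what the course uses:

import re
from bs4 import BeautifulSoup

# Sample document assumed from the bs4 docs; the course page may differ slightly
html_doc = """
<html><body>
<p class="title"><b>The Dormouse's story</b></p>
<a href="http://example.com/elsie" class="sister" id="link1">Elsie</a>
<a href="http://example.com/lacie" class="sister" id="link2">Lacie</a>
<a href="http://example.com/tillie" class="sister" id="link3">Tillie</a>
</body></html>
"""
soup = BeautifulSoup(html_doc, 'html.parser')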
2017-02-06
Python3
print("获取lacie")
link_node=soup.find('a',href="http://example.com/tillie")
print(link_node.name,link_node['href'],link_node.get_text())
print("获取lacie")
link_node=soup.find('a',href="http://example.com/tillie")
print(link_node.name,link_node['href'],link_node.get_text())
2017-02-06
Python3
print("获取所有连接")
links =soup.find_all('a')
for link in links:
print (link.name,link['href'],link.get_text())
print("获取所有连接")
links =soup.find_all('a')
for link in links:
print (link.name,link['href'],link.get_text())
2017-02-06
'HtmlOutputer' object has no attribute 'datas' ... help, I'm about to lose my mind.
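That error usually means the constructor never creates the datas list, so collect_data (or output_html) touches a missing attribute. A sketch of the skeleton the course's HtmlOutputer is expected to have (method names assumed from the error message and the course structure):

class HtmlOutputer(object):
    def __init__(self):
        # self.datas must be created here, before collect_data is ever called
        self.datas = []

    def collect_data(self, data):
        if data is None:
            return
        self.datas.append(data)

A typo such as self.data = [] in __init__ while collect_data appends to self.datas produces the same error, so check the spelling on both sides.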
2017-02-04