ImportError: cannot import name 'HTTPClientFactory' from 'twisted.web.client'


Contents
    • The Problem
    • Solution
    • References

The Problem

While building a Scrapy spider to scrape fund data, the spider itself was essentially finished, but running it during testing raised the following error:

ImportError: cannot import name 'HTTPClientFactory' from 'twisted.web.client' (unknown location)
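A quick way to probe whether a given name can actually be imported, before digging into the traceback, is a small stdlib-only check. This is a hedged sketch; `can_import` is a hypothetical helper, not part of Scrapy or Twisted:

```python
import importlib
import importlib.util


def can_import(module, attr=None):
    """Return True if `module` is importable and, if given, exposes `attr`."""
    try:
        spec = importlib.util.find_spec(module)
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. "twisted") is itself missing.
        return False
    if spec is None:
        return False  # module not installed / not on sys.path
    if attr is None:
        return True
    return hasattr(importlib.import_module(module), attr)


# The failing import from the traceback above would be checked as:
# can_import("twisted.web.client", "HTTPClientFactory")
```

If this returns False while Twisted appears installed, the installation is likely broken or the Scrapy/Twisted versions are mismatched.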

Solution

A quick Google search suggested the cause was a proxy, but disabling the proxy did not help, and repeated attempts made no difference.

Upgrading Scrapy ultimately resolved the problem.

(The Douban PyPI mirror is used below to speed up the download.)

pip install -U Scrapy -i http://pypi.douban.com/simple --trusted-host pypi.douban.com

The versions after the upgrade, for reference:

pip install -U Scrapy -i http://pypi.douban.com/simple --trusted-host pypi.douban.com

Defaulting to user installation because normal site-packages is not writeable
Looking in indexes: http://pypi.douban.com/simple
Requirement already satisfied: Scrapy in  (2.6.1)
Requirement already satisfied: protego>=0.1.15 in  (from Scrapy) (0.2.1)
Requirement already satisfied: itemadapter>=0.1.0 in  (from Scrapy) (0.6.0)
Requirement already satisfied: setuptools in d:\program files\python38\lib\site-packages (from Scrapy) (49.2.1)
Requirement already satisfied: cryptography>=2.0 in  (from Scrapy) (37.0.2)
Requirement already satisfied: pyOpenSSL>=16.2.0 in  (from Scrapy) (22.0.0)
Requirement already satisfied: queuelib>=1.4.2 in  (from Scrapy) (1.6.2)
Requirement already satisfied: service-identity>=16.0.0 in  (from Scrapy) (21.1.0)
Requirement already satisfied: itemloaders>=1.0.1 in  (from Scrapy) (1.0.4)
Requirement already satisfied: tldextract in  (from Scrapy) (3.3.0)
Requirement already satisfied: PyDispatcher>=2.0.5 in  (from Scrapy) (2.0.5)
Requirement already satisfied: Twisted>=17.9.0 in  (from Scrapy) (22.4.0)
Requirement already satisfied: cssselect>=0.9.1 in  (from Scrapy) (1.1.0)
Requirement already satisfied: parsel>=1.5.0 in  (from Scrapy) (1.6.0)
Requirement already satisfied: w3lib>=1.17.0 in  (from Scrapy) (1.22.0)
Requirement already satisfied: zope.interface>=4.1.3 in  (from Scrapy) (5.4.0)
Requirement already satisfied: lxml>=3.5.0 in  (from Scrapy) (4.8.0)
Requirement already satisfied: cffi>=1.12 in  (from cryptography>=2.0->Scrapy) (1.15.0)
Requirement already satisfied: pycparser in  (from cffi>=1.12->cryptography>=2.0->Scrapy) (2.21)      
Requirement already satisfied: jmespath>=0.9.5 in  (from itemloaders>=1.0.1->Scrapy) (1.0.0)
Requirement already satisfied: six>=1.6.0 in  (from parsel>=1.5.0->Scrapy) (1.15.0)
Requirement already satisfied: attrs>=19.1.0 in  (from service-identity>=16.0.0->Scrapy) (21.4.0)
Requirement already satisfied: pyasn1 in  (from service-identity>=16.0.0->Scrapy) (0.4.8)
Requirement already satisfied: pyasn1-modules in  (from service-identity>=16.0.0->Scrapy) (0.2.8)     
Requirement already satisfied: Automat>=0.8.0 in  (from Twisted>=17.9.0->Scrapy) (20.2.0)
Requirement already satisfied: constantly>=15.1 in  (from Twisted>=17.9.0->Scrapy) (15.1.0)
Requirement already satisfied: hyperlink>=17.1.1 in  (from Twisted>=17.9.0->Scrapy) (21.0.0)
Requirement already satisfied: twisted-iocpsupport<2,>=1.0.2 in  (from Twisted>=17.9.0->Scrapy) (1.0.2)
Requirement already satisfied: typing-extensions>=3.6.5 in  (from Twisted>=17.9.0->Scrapy) (4.2.0)    
Requirement already satisfied: incremental>=21.3.0 in  (from Twisted>=17.9.0->Scrapy) (21.3.0)        
Requirement already satisfied: idna>=2.5 in  (from hyperlink>=17.1.1->Twisted>=17.9.0->Scrapy) (2.10) 
Requirement already satisfied: filelock>=3.0.8 in  (from tldextract->Scrapy) (3.7.0)
Requirement already satisfied: requests-file>=1.4 in  (from tldextract->Scrapy) (1.5.1)
Requirement already satisfied: requests>=2.1.0 in  (from tldextract->Scrapy) (2.25.1)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in  (from requests>=2.1.0->tldextract->Scrapy) (1.25.11)
Requirement already satisfied: chardet<5,>=3.0.2 in  (from requests>=2.1.0->tldextract->Scrapy) (4.0.0)
Requirement already satisfied: certifi>=2017.4.17 in  (from requests>=2.1.0->tldextract->Scrapy) (2020.12.5)
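Rather than scanning pip's output by eye, installed versions can also be checked programmatically with the standard library (Python 3.8+). A minimal sketch; `installed_version` is a hypothetical helper name, not part of pip:

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(package):
    """Return the installed version string of `package`, or None if absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None


# After the upgrade above, installed_version("Scrapy") would report 2.6.1
# and installed_version("Twisted") the release pulled in with it (22.4.0).
```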

The problem was resolved.

The spider then crawled the data successfully and stored it in a MySQL database.

References


Original post: http://outofmemory.cn/langs/942712.html
