Problems installing Scrapy on Ubuntu

3 votes
2 answers
6792 views
Asked 2025-04-18 12:32

I recently started using Linux and want to use Scrapy.

jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ uname -a
Linux jeremy-Lenovo-G580 3.5.0-52-generic #79~precise1-Ubuntu SMP Fri Jul 4 21:03:49 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux

For that I installed Python 2.7.

$ python -V
Python 2.7.3

Then I installed pip (with sudo easy_install pip) and used it to install Scrapy 0.24.

sudo pip install scrapy

Scrapy worked fine at first, and I got through the tutorial at http://doc.scrapy.org/en/latest/intro/tutorial.html without trouble. However, every time I ran Scrapy it complained that it couldn't find service_identity, so I installed that with pip (I didn't keep the output from that install, unless it landed in a log somewhere). For some reason (the service_identity install, maybe?), Scrapy then broke:

$ scrapy -V
Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 122, in execute
    cmds = _get_commands_dict(settings, inproject)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 46, in _get_commands_dict
    cmds = _get_commands_from_module('scrapy.commands', inproject)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 29, in _get_commands_from_module
    for cmd in _iter_command_classes(module):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 20, in _iter_command_classes
    for module in walk_modules(module_name):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py", line 68, in walk_modules
    submod = import_module(fullpath)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/bench.py", line 3, in <module>
    from scrapy.tests.mockserver import MockServer
  File "/usr/local/lib/python2.7/dist-packages/scrapy/tests/mockserver.py", line 6, in <module>
    from twisted.internet import reactor, defer, ssl
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/ssl.py", line 223, in <module>
    from twisted.internet._sslverify import (
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/_sslverify.py", line 184, in <module>
    verifyHostname, VerificationError = _selectVerifyImplementation()
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/_sslverify.py", line 159, in _selectVerifyImplementation
    from service_identity import VerificationError
  File "/usr/local/lib/python2.7/dist-packages/service_identity/__init__.py", line 11, in <module>
    from . import pyopenssl
  File "/usr/local/lib/python2.7/dist-packages/service_identity/pyopenssl.py", line 12, in <module>
    from pyasn1_modules.rfc2459 import GeneralNames
  File "/usr/local/lib/python2.7/dist-packages/pyasn1_modules/rfc2459.py", line 72, in <module>
    class AttributeValue(univ.Any): pass
AttributeError: 'module' object has no attribute 'Any'
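Looking at the bottom of that traceback, pyasn1_modules is failing to find univ.Any inside pyasn1, so my guess (and it is only a guess) is that an older pyasn1 is shadowing whatever pip pulled in alongside service_identity. Something along these lines should show which copy is actually being imported and upgrade it:

python -c "import pyasn1; print pyasn1.__file__"   # which pyasn1 does Python actually load?
sudo pip install --upgrade pyasn1 pyasn1-modules   # assumption: a newer pyasn1 provides univ.Any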

In any case, following the advice at http://doc.scrapy.org/en/latest/topics/ubuntu.html#topics-ubuntu, I tried uninstalling Scrapy first (with sudo pip uninstall scrapy) and then reinstalling it:

sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 627220E7
echo 'deb http://archive.scrapy.org/ubuntu scrapy main' | sudo tee /etc/apt/sources.list.d/scrapy.list
sudo apt-get update && sudo apt-get install scrapy-0.24
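With both the Ubuntu package and the pip-installed copy potentially in play, it is probably also worth checking which Scrapy ends up first on the PATH and which copy Python imports (just a sanity check, assuming scrapy exposes __version__ the way I expect):

which -a scrapy                                                         # every scrapy launcher on the PATH, in order
python -c "import scrapy; print scrapy.__file__, scrapy.__version__"   # the copy Python actually imports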

I also tried installing and uninstalling Scrapy with pip a few more times, as well as easy_install scrapy, and then tried upgrading:

 sudo pip install -U scrapy

This seems to be the same thing as

sudo pip install --upgrade scrapy

(when I first installed Scrapy, I discovered that an old version was being run even though I had installed a newer one; Scrapy only worked once I removed the old copy, so I suspect an upgrade might fix things again).

$sudo pip install -U scrapy

Requirement already up-to-date: scrapy in /usr/local/lib/python2.7/dist-packages
Requirement already up-to-date: Twisted>=10.0.0 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: w3lib>=1.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: queuelib in /usr/local/lib/python2.7/dist-packages (from scrapy)
Downloading/unpacking lxml from https://pypi.python.org/packages/source/l/lxml/lxml-3.3.5.tar.gz#md5=88c75f4c73fc8f59c9ebb17495044f2f (from scrapy)
  Downloading lxml-3.3.5.tar.gz (3.5MB): 3.5MB downloaded
  Running setup.py (path:/tmp/pip_build_root/lxml/setup.py) egg_info for package lxml
    /usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
      warnings.warn(msg)
    Building lxml version 3.3.5.
    Building without Cython.
    ERROR: /bin/sh: 1: xslt-config: not found

    ** make sure the development packages of libxml2 and libxslt are installed **

    Using build configuration of libxslt

    warning: no previously-included files found matching '*.py'
Downloading/unpacking pyOpenSSL from https://pypi.python.org/packages/source/p/pyOpenSSL/pyOpenSSL-0.14.tar.gz#md5=8579ff3a1d858858acfba5f046a4ddf7 (from scrapy)
  Downloading pyOpenSSL-0.14.tar.gz (128kB): 128kB downloaded
  Running setup.py (path:/tmp/pip_build_root/pyOpenSSL/setup.py) egg_info for package pyOpenSSL

    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    no previously-included directories found matching 'doc/_build'
Requirement already up-to-date: cssselect>=0.9 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: six>=1.5.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Downloading/unpacking zope.interface>=3.6.0 from https://pypi.python.org/packages/source/z/zope.interface/zope.interface-4.1.1.tar.gz#md5=edcd5f719c5eb2e18894c4d06e29b6c6 (from Twisted>=10.0.0->scrapy)
  Downloading zope.interface-4.1.1.tar.gz (864kB): 864kB downloaded
  Running setup.py (path:/tmp/pip_build_root/zope.interface/setup.py) egg_info for package zope.interface

    warning: no previously-included files matching '*.dll' found anywhere in distribution
    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    warning: no previously-included files matching '*.pyo' found anywhere in distribution
    warning: no previously-included files matching '*.so' found anywhere in distribution
Downloading/unpacking cryptography>=0.2.1 (from pyOpenSSL->scrapy)
  Downloading cryptography-0.5.1.tar.gz (319kB): 319kB downloaded
  Running setup.py (path:/tmp/pip_build_root/cryptography/setup.py) egg_info for package cryptography

    Installed /tmp/pip_build_root/cryptography/cffi-0.8.6-py2.7-linux-x86_64.egg
    Searching for pycparser
    Reading http://pypi.python.org/simple/pycparser/
    Best match: pycparser 2.10
    Downloading https://pypi.python.org/packages/source/p/pycparser/pycparser-2.10.tar.gz#md5=d87aed98c8a9f386aa56d365fe4d515f
    Processing pycparser-2.10.tar.gz
    Running pycparser-2.10/setup.py -q bdist_egg --dist-dir /tmp/easy_install-LtjYh9/pycparser-2.10/egg-dist-tmp-1dc4kT
    zip_safe flag not set; analyzing archive contents...

    Installed /tmp/pip_build_root/cryptography/pycparser-2.10-py2.7.egg

    building '_Cryptography_cffi_684bb40axf342507b' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.o
    gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.o -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.so
    building '_Cryptography_cffi_8f86901cxc1767c5a' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.o
    gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.o -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.so
    building '_Cryptography_cffi_79a5b0a3x3a8a382' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.o
    gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.o -lcrypto -lssl -o /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.so
    no previously-included directories found matching 'docs/_build'
    warning: no previously-included files matching '*' found under directory 'vectors'
Downloading/unpacking setuptools from https://pypi.python.org/packages/3.4/s/setuptools/setuptools-5.4.1-py2.py3-none-any.whl#md5=5b7b07029ad2285d1cbf809a8ceaea08 (from zope.interface>=3.6.0->Twisted>=10.0.0->scrapy)
  Downloading setuptools-5.4.1-py2.py3-none-any.whl (528kB): 528kB downloaded
Downloading/unpacking cffi>=0.8 (from cryptography>=0.2.1->pyOpenSSL->scrapy)
  Downloading cffi-0.8.6.tar.gz (196kB): 196kB downloaded
  Running setup.py (path:/tmp/pip_build_root/cffi/setup.py) egg_info for package cffi

Downloading/unpacking pycparser (from cffi>=0.8->cryptography>=0.2.1->pyOpenSSL->scrapy)
  Downloading pycparser-2.10.tar.gz (206kB): 206kB downloaded
  Running setup.py (path:/tmp/pip_build_root/pycparser/setup.py) egg_info for package pycparser

Installing collected packages: lxml, pyOpenSSL, zope.interface, cryptography, setuptools, cffi, pycparser
  Found existing installation: lxml 2.3.2
    Uninstalling lxml:
      Successfully uninstalled lxml
  Running setup.py install for lxml
    /usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
      warnings.warn(msg)
    Building lxml version 3.3.5.
    Building without Cython.
    ERROR: /bin/sh: 1: xslt-config: not found

    ** make sure the development packages of libxml2 and libxslt are installed **

    Using build configuration of libxslt
    building 'lxml.etree' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/tmp/pip_build_root/lxml/src/lxml/includes -I/usr/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
    In file included from src/lxml/lxml.etree.c:346:0:
    /tmp/pip_build_root/lxml/src/lxml/includes/etree_defs.h:9:31: fatal error: libxml/xmlversion.h: No such file or directory
    compilation terminated.
    error: command 'gcc' failed with exit status 1
    Complete output from command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip_build_root/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-Iog1QC-record/install-record.txt --single-version-externally-managed --compile:
    /usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
      warnings.warn(msg)
    Building lxml version 3.3.5.
    Building without Cython.
    ERROR: /bin/sh: 1: xslt-config: not found

    ** make sure the development packages of libxml2 and libxslt are installed **

    Using build configuration of libxslt
    running install
    running build
    running build_py
    creating build
    creating build/lib.linux-x86_64-2.7
    creating build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/builder.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/sax.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/__init__.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/pyclasslookup.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/ElementInclude.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/_elementpath.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/cssselect.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/doctestcompare.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/usedoctest.py -> build/lib.linux-x86_64-2.7/lxml
    creating build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/__init__.py -> build/lib.linux-x86_64-2.7/lxml/includes
    creating build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/html5parser.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/_diffcommand.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/builder.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/defs.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/clean.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/__init__.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/ElementSoup.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/diff.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/_setmixin.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/soupparser.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/formfill.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/_html5builder.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/usedoctest.py -> build/lib.linux-x86_64-2.7/lxml/html
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron
    copying src/lxml/isoschematron/__init__.py -> build/lib.linux-x86_64-2.7/lxml/isoschematron
    copying src/lxml/lxml.etree.h -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/lxml.etree_api.h -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/includes/xpath.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/relaxng.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xmlparser.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xinclude.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xslt.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/c14n.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/dtdvalid.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/schematron.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/htmlparser.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/config.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xmlerror.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/uri.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/etreepublic.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/tree.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xmlschema.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/etree_defs.h -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/lxml-version.h -> build/lib.linux-x86_64-2.7/lxml/includes
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/rng
    copying src/lxml/isoschematron/resources/rng/iso-schematron.rng -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/rng
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
    copying src/lxml/isoschematron/resources/xsl/RNG2Schtrn.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
    copying src/lxml/isoschematron/resources/xsl/XSD2Schtrn.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_dsdl_include.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_svrl_for_xslt1.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_abstract_expand.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_skeleton_for_xslt1.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_message.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/readme.txt -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    running build_ext
    building 'lxml.etree' extension
    creating build/temp.linux-x86_64-2.7
    creating build/temp.linux-x86_64-2.7/src
    creating build/temp.linux-x86_64-2.7/src/lxml
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/tmp/pip_build_root/lxml/src/lxml/includes -I/usr/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
    In file included from src/lxml/lxml.etree.c:346:0:
    /tmp/pip_build_root/lxml/src/lxml/includes/etree_defs.h:9:31: fatal error: libxml/xmlversion.h: No such file or directory
    compilation terminated.
    error: command 'gcc' failed with exit status 1

----------------------------------------
  Rolling back uninstall of lxml
Cleaning up...
Command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip_build_root/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-Iog1QC-record/install-record.txt --single-version-externally-managed --compile failed with error code 1 in /tmp/pip_build_root/lxml
Storing debug log for failure in /home/jeremy/.pip/pip.log

One thing that strikes me as odd is that pip won't run on its own, but sudo pip will (see the check I sketch after the output below).

jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ pip install scrapy
bash: /usr/bin/pip: No such file or directory
jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ sudo pip install scrapy
Requirement already satisfied (use --upgrade to upgrade): scrapy in /usr/local/lib/python2.7/dist-packages
Requirement already satisfied (use --upgrade to upgrade): Twisted>=10.0.0 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): queuelib in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): lxml in /usr/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in /usr/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): zope.interface>=3.6.0 in /usr/lib/python2.7/dist-packages (from Twisted>=10.0.0->scrapy)
Cleaning up...
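My guess, based on bash pointing at /usr/bin/pip while pip apparently works for root, is that the shell has cached (hashed) a stale location for pip from before easy_install put the new one somewhere else; clearing that cache and checking where pip really lives is the next thing I plan to try:

type -a pip                # every location the shell knows about for pip
hash -r                    # drop bash's cached command paths
ls -l /usr/local/bin/pip   # assumption: easy_install put pip here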

So my plan is to try installing the development packages for libxml2 and libxslt, but at this point I'm not feeling optimistic. I seem to have hit a dead end, and I don't understand what broke the Scrapy setup that was working at first... Any help would be appreciated; my brown hairs are turning grey, the grey ones white, and the white ones are about to catch fire.

Maybe I should just run Python and Scrapy in the Windows VM I have set up (which, incidentally, was fairly painless), but that would defeat the point of switching to Linux (I wanted to be closer to the source of a lot of the projects I'm interested in, and to enjoy open source).

Well, I just tried

sudo apt-get install libxml2-dev
sudo apt-get install libxslt1-dev 
sudo apt-get install python2.7-dev

but Scrapy still dies horribly with AttributeError: 'module' object has no attribute 'Any'.
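For what it's worth, I assume those -dev packages only provide the headers, so the lxml build that failed earlier still has to be re-run before the upgrade can complete; something like this is next on my list (untested):

sudo pip install --upgrade lxml   # retry the build now that the libxml2/libxslt headers are present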

2 Answers

1

I ran into this problem on Ubuntu 14.04. I made sure I had installed all of the dependencies required by service_identity 16.0.0 (you can check the list here). It turned out I had pyasn1 and pyasn1-modules (installed via pip), but I had overlooked the attrs package. Uninstalling and then reinstalling service_identity pulled that package in as well:

yes | pip uninstall service_identity; pip install service_identity

Simply upgrading service_identity with pip's upgrade option, however, did not fix the problem.
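A quick way to confirm afterwards that everything service_identity needs actually imports (note that the attrs package imports as attr; this check is my addition, not part of the original fix):

python -c "import attr, pyasn1, pyasn1_modules, service_identity; print 'ok'"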

1

I reinstalled Ubuntu, which is somewhat drastic, but it worked well. This time I installed Ubuntu 14.04 LTS instead of the 12.04 LTS I had before. The only minor hitch after the install was that

$sudo apt-get install python-pip
$sudo pip install scrapy
...
twisted/runner/portmap.c:10:20: fatal error: Python.h: No such file or directory

but

 sudo apt-get install build-essential python-dev

fixed that. Here's hoping Scrapy keeps working for more than an hour this time. Was it worth two days of my time?
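For reference, installing the build prerequisites before running pip should avoid the Python.h error on a fresh 14.04 in the first place. The libxml2/libxslt and libffi/libssl dev packages below are my additions, anticipating the native extensions that Scrapy's dependencies build; they were not part of the steps above:

sudo apt-get install build-essential python-dev libxml2-dev libxslt1-dev libffi-dev libssl-dev
sudo apt-get install python-pip
sudo pip install scrapy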
