The docs say:
Scrapy will look for configuration parameters in ini-style scrapy.cfg files in standard locations: /etc/scrapy.cfg
I placed scrapy.cfg at /etc/scrapy.cfg and then tried to run
scrapy crawl <spider_name>
from the project root directory, tutorial.
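For context, a minimal sketch of what a scrapy.cfg normally contains: its [settings] section tells Scrapy which Python module holds the project settings (the module name tutorial.settings here matches the project in the question; adjust it to your own project name).

```ini
# scrapy.cfg -- minimal sketch of a Scrapy project config
[settings]
default = tutorial.settings
```

Because scrapy.cfg only points at the settings module, the module itself (tutorial/settings.py) must still be importable from wherever you run the scrapy command.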
I get this error:
Traceback (most recent call last):
  File "/home/user/.local/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/home/user/.local/lib/python2.7/site-packages/scrapy/cmdline.py", line 110, in execute
    settings = get_project_settings()
  File "/home/user/.local/lib/python2.7/site-packages/scrapy/utils/project.py", line 68, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "/home/user/.local/lib/python2.7/site-packages/scrapy/settings/__init__.py", line 292, in setmodule
    module = import_module(module)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named tutorial.settings
What am I doing wrong?

Try creating tutorial/settings.py. Your traceback contains this complaint: ImportError: No module named tutorial.settings, so that file is probably required for the project to run.
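A minimal sketch of what tutorial/settings.py could contain to satisfy that import (the values below are the usual defaults Scrapy's startproject template generates, assuming the project is named tutorial):

```python
# tutorial/settings.py -- minimal sketch of a Scrapy settings module
BOT_NAME = "tutorial"

# Where Scrapy looks for spider classes in this project
SPIDER_MODULES = ["tutorial.spiders"]
NEWSPIDER_MODULE = "tutorial.spiders"
```

Note that the tutorial directory also needs an __init__.py file (it can be empty) so that tutorial.settings is importable as a package module. Running scrapy startproject tutorial generates this whole layout for you.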