Packaging Scrapy as part of a larger program with PyInstaller


I am trying to use PyInstaller to bundle Scrapy as part of a larger program.
When I run the source code everything works as expected, but when run from the executable it throws:

[scrapy.utils.log] INFO: Scrapy 2.3.0 started (bot: NGL)
[scrapy.utils.log] INFO: Versions: lxml 4.5.1.0, libxml2 2.9.10, cssselect 1.1.0, parsel 1.6.0, w3lib 1.22.0, Twisted 20.3.0, Python 3.8.5 (default, Jul 27 2020, 08:42:51) - [GCC 10.1.0], pyOpenSSL 19.1.0 (OpenSSL 1.1.1g  21 Apr 2020), cryptography 3.0, Platform Linux-5.8.3-arch1-1-x86_64-with-glibc2.4
Traceback (most recent call last):
  File "scrapy/spiderloader.py", line 76, in load
KeyError: 'ngl'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "cli.py", line 9, in <module>
  File "ui.py", line 159, in main
  File "ui.py", line 147, in start
  File "ui.py", line 67, in print
  File "NGL/spiders/NGL.py", line 127, in main
  File "scrapy/crawler.py", line 191, in crawl
  File "scrapy/crawler.py", line 224, in create_crawler
  File "scrapy/crawler.py", line 228, in _create_crawler
  File "scrapy/spiderloader.py", line 78, in load
KeyError: 'Spider not found: ngl'

Unfortunately, I have no idea how to actually debug a PyInstaller bundle :/

Here is the directory tree:

.
├── LICENSE
├── main
│   ├── checkers.py
│   ├── cli.py
│   ├── decorators.py
│   ├── fixers.py
│   ├── licences.py
│   ├── NGL
│   │   ├── __init__.py
│   │   ├── items.py
│   │   ├── middlewares.py
│   │   ├── pipelines.py
│   │   └── spiders
│   │       ├── __init__.py
│   │       └── NGL.py
│   ├── scrapy.cfg
│   ├── terminal_tools.py
│   ├── text_tools.py
│   └── ui.py

There is no settings.py in NGL, because I inject the settings straight into the crawler in /main/NGL/spiders/NGL.py:

import os
import shutil

from scrapy.crawler import CrawlerProcess


def main(url, save_path):
    folder = save_path
    if os.path.exists(folder):
        shutil.rmtree(folder)
    process = CrawlerProcess(
        settings={
            "LOG_ENABLED": True,
            "LOG_FORMAT": "[%(name)s] %(levelname)s: %(message)s",
            "LOG_LEVEL": "INGO",
            "BOT_NAME": "NGL",
            "SPIDER_MODULES": ["NGL.spiders"],
            "NEWSPIDER_MODULE": "NGL.spiders",
            "IMAGES_STORE": folder,
            "ROBOTSTXT_OBEY": True,
            "ITEM_PIPELINES": {
                "NGL.pipelines.DownloadPipeline": 300,
                "NGL.pipelines.CleanerPipeline": 600,
            },
            "IMAGES_URLS_FIELD": "Image Url",
            "FEEDS": {
                f"{folder}/data.csv": {
                    "format": "csv",
                    "encoding": "utf8",
                    "fields": [
                        "Inventory number",
                        "Full title",
                        "Date made",
                        "Artist",
                        "Artist dates",
                        "Medium and support",
                        "Dimensions",
                        "Overview",
                        "In-Depth",
                        "Copywright",
                        "Image Url",
                        "Artwork Url",
                    ],
                }
            },
        },
    )
    process.crawl("ngl", start_urls=[url])
    process.start()
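
A note on the failure mode: process.crawl("ngl", ...) makes Scrapy resolve the spider by name through its SpiderLoader, which discovers spiders at runtime by walking the packages listed in SPIDER_MODULES; that dynamic discovery is exactly the part that tends to break inside a frozen bundle. A minimal sketch of an alternative that sidesteps the name lookup by passing the spider class itself (the class name NGLSpider is my assumption; the real name isn't shown in this post):

# Hypothetical: assumes the spider class defined in NGL/spiders/NGL.py
# is named NGLSpider -- the real class name isn't shown in this post.
from NGL.spiders.NGL import NGLSpider

process.crawl(NGLSpider, start_urls=[url])
process.start()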

And the command used for packaging:

pyinstaller main/cli.py  --clean --onefile --name NGA_linux -p main:main/NGL:main/NGL/spiders
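
For reference, a more debuggable variant of the same build (standard PyInstaller options, nothing specific to this project): dropping --onefile keeps the bundle as a plain folder whose contents can be inspected, and --debug imports makes the frozen interpreter trace every import at startup:

pyinstaller main/cli.py --clean --debug imports --name NGA_linux -p main:main/NGL:main/NGL/spiders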

1 Answer

<-编辑->
这解决了KeyError: 'Spider not found: ngl'错误,但如果您尝试将编译后的包移动到项目文件夹之外的任何位置,就会出现新的错误:(
一切都按预期开始,但随后会引发此问题:

Traceback (most recent call last):
  File "scrapy/utils/defer.py", line 55, in mustbe_deferred
  File "scrapy/core/spidermw.py", line 60, in process_spider_input
  File "scrapy/core/scraper.py", line 152, in call_spider
  File "scrapy/utils/misc.py", line 218, in warn_on_generator_with_return_value
  File "scrapy/utils/misc.py", line 203, in is_generator_with_return_value
  File "inspect.py", line 985, in getsource
  File "inspect.py", line 967, in getsourcelines
  File "inspect.py", line 798, in findsource
OSError: could not get source code
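
This OSError comes from Scrapy's warn_on_generator_with_return_value check, which calls inspect.getsource() on the spider callback; inside a PyInstaller bundle the .py source is usually not available, so getsource() fails. A commonly cited workaround (my sketch, not part of the original answer) is to stub that check out before starting the crawl:

# Sketch of a known workaround: replace Scrapy's source-inspecting
# sanity check with a no-op, since inspect.getsource() cannot work
# in a frozen bundle. Patch both modules, because scrapy.core.scraper
# imports the function by name at import time.
import scrapy.utils.misc
import scrapy.core.scraper

def _warn_on_generator_with_return_value_stub(spider, callable):
    pass

scrapy.utils.misc.warn_on_generator_with_return_value = _warn_on_generator_with_return_value_stub
scrapy.core.scraper.warn_on_generator_with_return_value = _warn_on_generator_with_return_value_stub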

<;-编辑-2->;
我能够缩小这个问题的范围。 我需要在同一位置编译包和spider,并保留dir结构,如下所示:

.
├── NGA_linux   # <- the PyInstaller bundle
└── NGL
    └── spiders
        └── NGL.py   # <- the spider

With this, everything works, but does anyone know how to get rid of this "extra" file?

<;-编辑-3->;
在OSX下编译仍然会破坏它
有人知道如何解决这个问题吗?
<;->

I found the answer!
I'm not sure exactly why it solves the problem, but it does.
Funnily enough, the only useful information turned up on a couple of Chinese blogs:

https://iamting93.github.io/2019/08/31/python/linux%E4%B8%8B%E5%88%A9%E7%94%A8pyinstaller%E6%89%93%E5%8C%85scrapy/ (for those who can read Chinese)
which in turn references this post:
https://blog.csdn.net/u010600274/article/details/99345367
It provides a good base example for the .spec file.

All you have to do is add ('.','.') to datas=[].
This copies the whole project into the root of the final bundle. Not very elegant, but it works!
As far as I can tell, Scrapy loads some files dynamically and gets very upset about the directory structure inside the final bundle.
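
If copying the entire project with ('.','.') feels too heavy-handed, a narrower datas list might work as well (untested on my side, so treat it as an assumption): ship only the files Scrapy actually introspects, i.e. the project marker and the spider package with its .py sources:

# In the .spec file -- each tuple is (source path, destination inside the bundle)
datas=[
    ('scrapy.cfg', '.'),   # project marker Scrapy looks for
    ('NGL', 'NGL'),        # the NGL package, .py sources included
],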

In summary:

  1. Cleaned up the project tree (not sure if this actually matters):
.
├── cli.py
├── LICENSE
├── NGL_linux.spec
├── scrapy.cfg
├── main
│   ├── __init__.py
│   ├── checkers.py
│   ├── decorators.py
│   ├── fixers.py
│   ├── licences.py
│   ├── terminal_tools.py
│   ├── text_tools.py
│   └── ui.py
└── NGL
    ├── __init__.py
    ├── items.py
    ├── middlewares.py
    ├── pipelines.py
    └── spiders
        ├── __init__.py
        └── NGL.py
 
  2. Generated NGL_linux.spec with:
pyi-makespec cli.py --onefile -n NGL_linux -p main:NGL:NGL/spiders --windowed
  3. Set datas to ('.','.') in the PyInstaller .spec file:
# NGL_linux.spec
# -*- mode: python ; coding: utf-8 -*-

block_cipher = None


a = Analysis(['cli.py'],
             pathex=['main', 'NGL', 'NGL/spiders', {absolute project path here}],
             binaries=[],
             datas=[('.','.')],
             hiddenimports=[],
             hookspath=[],
             runtime_hooks=[],
             excludes=[],
             win_no_prefer_redirects=False,
             win_private_assemblies=False,
             cipher=block_cipher,
             noarchive=False)
pyz = PYZ(a.pure, a.zipped_data,
             cipher=block_cipher)
exe = EXE(pyz,
          a.scripts,
          a.binaries,
          a.zipfiles,
          a.datas,
          [],
          name='NGL_linux',
          debug=False,
          bootloader_ignore_signals=False,
          strip=False,
          upx=True,
          upx_exclude=[],
          runtime_tmpdir=None,
          console=True )
  4. Ran pyinstaller NGL_linux.spec --clean to build the bundle

And now everything works.
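
One last detail worth knowing if the bundle is run from a different directory: with --onefile, everything listed in datas is unpacked at startup into a temporary folder that PyInstaller exposes as sys._MEIPASS, not next to the executable. A small guard at the top of cli.py (my addition, not part of the original answer) makes relative paths like scrapy.cfg resolve against that folder:

import os
import sys

# When frozen by PyInstaller, chdir into the unpack directory so that
# relative lookups (scrapy.cfg, NGL/...) find the files shipped via datas.
if getattr(sys, 'frozen', False):
    os.chdir(sys._MEIPASS)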
