No Sphinx tokenizer found for languages fr, cs

Published 2024-04-27 05:28:04


I use Sphinx for my documentation, and I want spell checking to work for French.

Here is what I have done so far:

  • Installing the sphinxcontrib-spelling extension:
    sudo pip install sphinxcontrib-spelling
  • Installing the French dictionary:
    sudo apt-get install myspell-fr-fr
  • Adding the extension and language to the configuration file:
    extensions = ["sphinxcontrib.spelling"]
    spelling_lang = 'fr'
  • Adding the spelling builder:

builder = ["html", "pdf", "spelling"],
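Put together, the configuration steps above amount to a conf.py fragment like the following. This is only a sketch of the settings named above, not a complete conf.py:

```python
# conf.py -- spelling setup described in the steps above
extensions = ["sphinxcontrib.spelling"]

# Language for the spell checker. This requires both a dictionary
# (e.g. myspell-fr-fr) and a matching PyEnchant tokenizer.
spelling_lang = "fr"
```

The spelling builder can then be invoked directly with `sphinx-build -b spelling sourcedir outdir`.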

This is the traceback I get when I run Sphinx:

Exception occurred:
  File "/usr/lib/python2.7/dist-packages/sphinx/cmdline.py", line 188, in main
    warningiserror, tags)
  File "/usr/lib/python2.7/dist-packages/sphinx/application.py", line 134, in __init__
    self._init_builder(buildername)
  File "/usr/lib/python2.7/dist-packages/sphinx/application.py", line 194, in _init_builder
    self.builder = builderclass(self)
  File "/usr/lib/python2.7/dist-packages/sphinx/builders/__init__.py", line 57, in __init__
    self.init()
  File "/usr/lib/pymodules/python2.7/sphinxcontrib/spelling.py", line 253, in init
    filters=filters,
  File "/usr/lib/pymodules/python2.7/sphinxcontrib/spelling.py", line 181, in __init__
    self.tokenizer = get_tokenizer(lang, filters)
  File "/usr/lib/python2.7/dist-packages/enchant/tokenize/__init__.py", line 186, in get_tokenizer
    raise TokenizerNotFoundError(msg)
TokenizerNotFoundError: No tokenizer found for language 'fr'

Any help is welcome :-)


2 answers

I got the same error, and it appears to have nothing to do with a missing dictionary.

PyEnchant simply does not ship a French tokenizer; it only has an English one. As stated in the Extending enchant.tokenize documentation:

The author would be very grateful for tokenization routines for languages other than English which can be incorporated back into the main PyEnchant distribution.

A tokenizer for the given language has to be added to PyEnchant.
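For context, a PyEnchant tokenizer is essentially a callable that yields (word, offset) pairs for the spell checker to look up. The sketch below is only an illustration of that interface, not the actual en.py code:

```python
import re

def tokenize(text):
    # Simplified sketch of a PyEnchant-style tokenizer: yield each
    # word together with its character offset in the input string.
    # The pattern also accepts accented letters, which a plain ASCII
    # word pattern would split apart for French text.
    for match in re.finditer(r"[A-Za-zÀ-ÿ]+(?:'[A-Za-zÀ-ÿ]+)*", text):
        yield (match.group(0), match.start())

print(list(tokenize("Bonjour le monde")))
# → [('Bonjour', 0), ('le', 8), ('monde', 11)]
```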

Quick and dirty solution

Clone the pyenchant repo and cd into it:

$ git clone git@github.com:rfk/pyenchant.git
$ cd pyenchant

Go to the directory where the tokenizers are defined:

$ cd enchant/tokenize

Copy the existing en.py tokenizer to the language code you want to use (I was missing cs; you can try fr):

$ cp en.py cs.py
$ cp en.py fr.py

Install the package from the modified code:

$ cd ../..  # first return to the dir with `setup.py`
$ pip install -e .

Now it will work.

A better solution is to review the copied tokenizer, adapt the parts that do not fit your language, and contribute the result back to pyenchant.
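Copying en.py to fr.py works because the lookup resolves the language code to a module name under enchant.tokenize and uses that module's `tokenize` class. The sketch below is a hypothetical, simplified version of that lookup, written for illustration; it is not PyEnchant's actual implementation:

```python
import importlib

def find_tokenizer(lang):
    # Hypothetical simplified lookup: try the full locale code first
    # (e.g. "fr_FR"), then the bare language code ("fr"), importing
    # enchant.tokenize.<code> and returning its `tokenize` attribute.
    for code in (lang, lang.split("_")[0]):
        try:
            mod = importlib.import_module("enchant.tokenize." + code)
            return mod.tokenize
        except ImportError:
            continue
    raise LookupError("No tokenizer found for language %r" % lang)
```

This is why adding a file named fr.py (or cs.py) next to en.py is enough to make the `TokenizerNotFoundError` go away.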
