Spark Networks DevOps Coding Challenge
Detailed description of the sparkd Python project
https://gitlab.com/askainet/spark-networks-code-challenge
Requirements
- python = 3.4
Installation
From the public PyPI repository:
pip install sparkd
Demo
docker build . -t sparkd
docker run -it -e RETRIES=5 sparkd demo.sh
Usage
usage: sparkd [-h] [--version] [-l LOGFILE]
[--command-logfile COMMAND_LOGFILE] [-n NAME] [-v] [-d]
[-r RETRIES] [-i RETRY_INTERVAL] [-c CHECK_INTERVAL]
[command] [arguments [arguments ...]]
Run any command as a daemon and supervise it
positional arguments:
command The command to run as a daemon to supervise
arguments Arguments to the command
optional arguments:
-h, --help show this help message and exit
--version Show version
-l LOGFILE, --logfile LOGFILE
Set the logfile for the Sparkd supervisor
--command-logfile COMMAND_LOGFILE
Set the logfile for the command to supervise
-n NAME, --name NAME Set the name of this Sparkd instance
-v, --verbose Enable verbose logging
-d, --debug Enable debug logging
-r RETRIES, --retries RETRIES
Number of retries to restart the process
-i RETRY_INTERVAL, --retry-interval RETRY_INTERVAL
Seconds to wait between retries to restart the process
-c CHECK_INTERVAL, --check-interval CHECK_INTERVAL
Seconds to wait between checking process status
Development
Tests
Some basic tests were added, covering only the Process class, simply to demonstrate how to use unittest in Python.
make test
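A test module in that style looks roughly like this; the Process class and its API here are hypothetical stand-ins, not sparkd's actual interface:

```python
import unittest

class Process:
    """Hypothetical stand-in for a supervised-process wrapper."""
    def __init__(self, command):
        self.command = command
        self.running = False

    def start(self):
        self.running = True

    def stop(self):
        self.running = False

class TestProcess(unittest.TestCase):
    def test_start_sets_running(self):
        proc = Process(["sleep", "60"])
        proc.start()
        self.assertTrue(proc.running)

    def test_stop_clears_running(self):
        proc = Process(["sleep", "60"])
        proc.start()
        proc.stop()
        self.assertFalse(proc.running)

if __name__ == "__main__":
    unittest.main()
```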
Linting
Linting is done with pylint to keep the code clean and healthy.
make lint
Publishing to PyPI
Package and publish to the PyPI repository.
make dist
Install from source
make install
Documentation
The documentation is built with Sphinx from the docstrings in the code.
The Read the Docs integration is configured to automatically build and upload the documentation to https://sparkd.readthedocs.io/en/latest/ on every update to the master branch.
To build the documentation locally, run:
make docs
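Since the pages are generated from docstrings, Sphinx only renders what the docstrings contain. A sketch of a reST-style docstring that Sphinx's autodoc would pick up (the function itself is a hypothetical example, not part of sparkd):

```python
def restart_process(retries, retry_interval):
    """Restart a supervised process, retrying on failure.

    :param retries: Maximum number of restart attempts.
    :param retry_interval: Seconds to wait between attempts.
    :returns: True if a restart attempt is allowed within the retry budget.
    """
    return retries > 0  # placeholder body for illustration
```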