splunk-hec-stream
A Python streaming logging handler for Splunk HEC
This handler is not an event sender. The handler itself does not forward anything to a Splunk HEC endpoint.
This means the log source does not have to care about buffering, transformation, or retries. Those responsibilities belong to a log forwarding service (such as AWS Kinesis Firehose, Fluentd, or Logstash).
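To illustrate the idea (this is a simplified sketch, not the library's actual implementation; the class name and fields below are hypothetical), such a handler only formats each log record as an HEC-shaped JSON line and writes it to a stream, leaving delivery to the forwarder:

```python
import json
import logging
import sys
import time

class HECJsonStdoutHandler(logging.StreamHandler):
    """Sketch only: emit each record as a Splunk HEC-shaped JSON line.

    Hypothetical class for illustration; SplunkHECStreamHandler in the
    real package may differ in names and fields.
    """

    def __init__(self, index, source, host, sourcetype):
        super().__init__(stream=sys.stdout)
        self.meta = {"index": index, "source": source,
                     "host": host, "sourcetype": sourcetype}

    def emit(self, record):
        line = dict(self.meta)
        line["time"] = time.time()
        # Dict messages become the event payload directly;
        # everything else is wrapped as {"message": ...}.
        line["event"] = record.msg if isinstance(record.msg, dict) \
            else {"message": record.getMessage()}
        self.stream.write(json.dumps(line) + "\n")

logging.basicConfig(level=logging.INFO, handlers=[
    HECJsonStdoutHandler("main", "demo-source", "demo-host", "_json")])
logging.info({"key1": "value1"})  # writes one JSON line to stdout
```

Because the handler only writes lines, any process that captures stdout (a container runtime, a log agent, a Lambda runtime) can pick them up and forward them.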
Use cases
- Forwarding logs from AWS Lambda functions to Splunk
- Sending events to a Splunk HEC endpoint via CloudWatch Logs and AWS Kinesis Firehose
- Use with log collectors such as Fluentd and Logstash
- Reading events from log files to be processed by a log collector
How to install
pip3 install splunk-hec-stream
Example
import logging
import json
from datetime import datetime
from splunk_hec_stream.logging import SplunkHECStreamHandler

logging.basicConfig(level=logging.INFO,
                    handlers=[SplunkHECStreamHandler("main", "splunk-logger-test",
                                                     "aws:lambda", "_json")])

logging.info({"key1": "value1"})
logging.info("test")
logging.info('''test
ln''')
logging.info(json.dumps({"key1": "value1"}))
# You can overwrite logged time by _time extra key (that must be float)
logging.info({"key": "value"}, extra={'_time': datetime.utcnow().timestamp()})
This example code writes the following logs to standard output:
{"loggingHandler":"SplunkHECStreamHandler","time":1557301830.617483,"host":"aws:lambda","index":"main","source":"splunk-logger-test","sourcetype":"_json","event":{"key1":"value1"}}
{"loggingHandler":"SplunkHECStreamHandler","time":1557301830.617758,"host":"aws:lambda","index":"main","source":"splunk-logger-test","sourcetype":"_json","event":{"message":"test"}}
{"loggingHandler":"SplunkHECStreamHandler","time":1557301830.617904,"host":"aws:lambda","index":"main","source":"splunk-logger-test","sourcetype":"_json","event":{"message":"test\nln"}}
{"loggingHandler":"SplunkHECStreamHandler","time":1557301830.618075,"host":"aws:lambda","index":"main","source":"splunk-logger-test","sourcetype":"_json","event":{"message":"{\"key1\": \"value1\"}"}}
{"loggingHandler":"SplunkHECStreamHandler","time":1557269430.618213,"host":"aws:lambda","index":"main","source":"splunk-logger-test","sourcetype":"_json","event":{"key":"value"}}
By forwarding these JSON lines to a Splunk HEC endpoint, Splunk can read and store them as events.
Use with AWS Lambda
This package is useful for forwarding logs from AWS Lambda to Splunk:
- The Lambda function puts logs into CloudWatch,
- a subscription filter forwards them to Firehose,
- and Firehose forwards them to Splunk.
How
- Create a Lambda layer that contains this library
- Configure Kinesis Firehose to send events to the Splunk HEC endpoint
  /contrib/aws_firehose_splunk_hec_stream_processor.py
  can be used as the event processor Lambda.
- Configure a CloudWatch Logs subscription filter to send the filtered events to the Firehose stream
The loggingHandler key in the JSON can be used to filter which logs are forwarded to the Splunk HEC endpoint.
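As a hedged illustration of that filtering step (this is not the code of the contrib processor; the helper name below is hypothetical), a processor could keep only the JSON lines emitted by this handler by checking the loggingHandler key:

```python
import json

def keep_hec_stream_lines(lines):
    """Keep only JSON lines emitted by SplunkHECStreamHandler.

    Hypothetical helper for illustration; the actual processor in
    /contrib/aws_firehose_splunk_hec_stream_processor.py may differ.
    """
    kept = []
    for line in lines:
        try:
            record = json.loads(line)
        except ValueError:
            continue  # skip non-JSON noise from the log stream
        if record.get("loggingHandler") == "SplunkHECStreamHandler":
            kept.append(record)
    return kept

lines = [
    '{"loggingHandler": "SplunkHECStreamHandler", "event": {"key1": "value1"}}',
    'plain text line from another logger',
]
print(len(keep_hec_stream_lines(lines)))  # → 1
```

Lines that are not valid JSON, or that come from other handlers, are dropped before forwarding to Splunk.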
Terraform
The /contrib/terraform
directory contains Terraform modules for the forwarding system described above.
provider "aws" {}

variable "python_lib_path" {
  default = "/usr/local/lib/python3.7/site-packages/splunk_hec_stream"
}

module "handler_layer" {
  source     = "github.com/shuichiro-makigaki/splunk_hec_stream//contrib/terraform/aws_lambda_layer"
  layer_name = "splunk_hec_stream_handler"
  lib_path   = var.python_lib_path
}

module "firehose_processor" {
  source                  = "github.com/shuichiro-makigaki/splunk_hec_stream//contrib/terraform/aws_firehose"
  lib_path                = var.python_lib_path
  hec_endpoint            = "https://example.com"
  hec_token               = "XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
  layer_arn               = module.handler_layer.arn
  s3_delivery_bucket_name = "XXXXXXXX"
}
The variable python_lib_path
should be replaced with the path on your system.