Elasticsearch and Kibana Data Export in Practice
The following two export questions come from the Elastic Chinese community.

Question 1: How do I export query results from Kibana?

Question 2: Elasticsearch data export. Can Elasticsearch export data the way a database does? Or can I find where it stores data on disk, copy that out, and move it onto another ES server or convert it into the format I need?

In real-world business work, everyone runs into import and export problems sooner or later.
Note: using version 7.X or above is recommended; lower versions do not support this.
Export with es2csv:

es2csv -u 192.168.1.1:9200 -q '{"_source":{"excludes":["*gxn","*kex","vperxs","lpix"]},"query":{"term":{"this_topic":{"value":41}}}}' -r -i sogou_topic -o ~/export.csv
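A note on setup, in case it is needed: es2csv is a Python-based CLI, so on a machine with Python and pip it is typically installed from PyPI (a minimal sketch, assuming network access to PyPI; not part of the original write-up):

pip install es2csv
es2csv --help    # verify the install and list the supported options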
elasticdump supports:
exporting an index, search results, aliases, or templates as JSON;
exporting an index as gzip;
splitting a large export into smaller files;
copying data between indices, within one cluster or across clusters.
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=query.json \
  --searchBody='{"query":{"term":{"username": "admin"}}}'

As shown above, this exports the search results to a json file. For more import and export options, see the elasticdump introduction on github.
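To illustrate the other capabilities listed above, here is a short hedged sketch (not from the original post; it assumes elasticdump is installed globally via npm and that both clusters are reachable, with placeholder hostnames):

npm install elasticdump -g

# export an index's mapping and data into separate JSON files
elasticdump --input=http://localhost:9200/my_index --output=my_index_mapping.json --type=mapping
elasticdump --input=http://localhost:9200/my_index --output=my_index_data.json --type=data

# copy an index directly from one cluster to another
elasticdump --input=http://source.es.com:9200/my_index --output=http://target.es.com:9200/my_index --type=data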
Export with Logstash. Step 1: install the logstash-output-csv plugin.

D:\logstash-6.5.4\bin>logstash-plugin.bat install logstash-output-csv
Validating logstash-output-csv
Installing logstash-output-csv
Installation successful

Step 2: configure the conf file.
Input: the ES address, the index, and the query to run;
Output: the csv output path and the list of fields to write.
input {
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "company_infos"
    query => '
    {
      "query": {
        "match_all": {}
      }
    }
    '
  }
}

output {
  csv {
    # elastic field names to export
    fields => ["no", "name", "age", "company_name", "department", "sex"]
    # path where the output csv file is written
    path => "D:\logstash-6.5.4\export\csv-export.csv"
  }
}

Step 3: execute the export.
D:\logstash-6.5.4\bin>logstash -f ../config/logstash_ouput_csv.conf
Sending Logstash logs to D:/2.es_install/logstash-6.5.4/logs which is now configured via log4j2.properties
[2019-08-03T23:45:00,914][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-08-03T23:45:00,934][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.4"}
[2019-08-03T23:45:03,473][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-08-03T23:45:04,241][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x34b305d3 sleep>"}
[2019-08-03T23:45:04,307][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-08-03T23:45:04,740][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-08-03T23:45:05,610][INFO ][logstash.outputs.csv ] Opening file {:path=>"D:/logstash-6.5.4/export/csv-export.csv"}
[2019-08-03T23:45:07,558][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x34b305d3 run>"}

Reference: https://medium.com/@shaonshaonty/export-data-from-elasticsearch-to-csv-caaef3a19b69
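Beyond these ready-made tools, the same export can also be scripted directly against the Elasticsearch API. Below is a minimal sketch (not from the original post) that uses the official Python client's scan helper, which wraps the scroll API, to stream all documents of an index into a CSV file; the host, index name, and field list are placeholder assumptions reusing the values from the Logstash example above.

# Minimal sketch: stream an index into a CSV file with the official Python client.
# Assumptions: elasticsearch-py is installed, the cluster is reachable at 127.0.0.1:9200,
# and the index and field names below are placeholders -- adjust them to your data.
import csv

from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan

es = Elasticsearch(["http://127.0.0.1:9200"])
fields = ["no", "name", "age", "company_name", "department", "sex"]

with open("csv-export.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    # scan() pages through the scroll API, so it handles arbitrarily large result sets
    for hit in scan(es, index="company_infos", query={"query": {"match_all": {}}}):
        writer.writerow(hit["_source"])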