I can't get Logstash to load data into my Elasticsearch node on localhost... I want Logstash to read a CSV file and load that data into Elasticsearch, but nothing works: I can only see the data I added to Elasticsearch manually, and Logstash appears to do nothing.
My Logstash configuration is:
input {
  file {
    path => [ "C:\Users\Michele\Downloads\logstash-1.5.3\logstash-1.5.3\Users\*.csv" ]
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["timestamp", "impianto", "tipo_misura", "valore", "unita_misura"]
    separator => ","
  }
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    cluster => "elasticsearch"
    node_name => "NCC-1701-A"
    index => "myindex"
    index_type => "pompe"
    workers => 1
  }
}
My CSV file is:
2015-08-03T18:46:00,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:46:10,Abbiategrasso,Pressione gruppo 1,44.4,m
2015-08-03T18:46:20,Abbiategrasso,Pressione gruppo 1,66.6,m
2015-08-03T18:46:30,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:46:40,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:46:50,Abbiategrasso,Pressione gruppo 1,77.7,m
2015-08-03T18:47:00,Abbiategrasso,Pressione gruppo 1,11.1,m
2015-08-03T18:47:10,Abbiategrasso,Pressione gruppo 1,44.4,m
2015-08-03T18:47:20,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:47:30,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:47:40,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:47:50,Abbiategrasso,Pressione gruppo 1,66.6,m
2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:48:10,Abbiategrasso,Pressione gruppo 1,77.7,m
2015-08-03T18:48:20,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:48:30,Abbiategrasso,Pressione gruppo 1,88.8,m
2015-08-03T18:48:40,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:48:50,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:49:00,Abbiategrasso,Pressione gruppo 1,55.5,m
Nothing new ever appears in the "myindex" index, and I don't know why...
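A quick way to check whether any documents have actually landed in the index is to query the count endpoint (this assumes a default Elasticsearch listening on localhost:9200; substitute your own index name):

```
curl "localhost:9200/myindex/_count?pretty"
```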
Update: after changing the file path in the Logstash configuration from "*.csv" (which never picked up any file) to "abbiategrasso.csv", the --debug output when Logstash tries to index is:
failed action with response of 400, dropping action: ["index", {:_id=>nil,
:_index=>"abbiategrasso", :_type=>"pompe", :_routing=>nil}, #<LogStash::Event
@data={"message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"],
"@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP",
"path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv",
"timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso",
"tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"},
[... repeated per-field accessor dump trimmed ...]>]
{:level=>:warn, :file=>"/Users/Michele/Downloads/logstash-1.5.3/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.5-java/lib/logstash/outputs/elasticsearch.rb", :line=>"531", :method=>"submit"}
Logstash cannot upload the data...
If you previously indexed data by hand with Marvel/Sense, first delete the existing data by issuing a DELETE against the index. On the next run, Logstash will be able to create a fresh index with a mapping that matches its documents!
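For example, with a default local setup, something like the following drops the old index so Logstash can recreate it (a sketch using the index name from my config; adjust it to yours, and note this removes every document in that index):

```
curl -XDELETE "localhost:9200/abbiategrasso"
```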
Also, for the file path in the Logstash configuration, don't use "*.csv" — it doesn't work... it does not pick up every file in the folder; it looks for a file literally named like that :(
My configuration is now:
input {
  file {
    path => [ "C:\Users\Michele\Downloads\logstash-1.5.3\logstash-1.5.3\Users\abbiategrasso4.csv" ]
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["timestamp", "impianto", "tipo_misura", "valore", "unita_misura"]
    separator => ","
  }
  mutate {
    convert => { "valore" => "float" }
  }
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    cluster => "elasticsearch"
    node_name => "NCC-1701-A"
    index => "abbiategrasso"
    document_type => "pompe"
    workers => 1
  }
  stdout { codec => rubydebug }
}
It works now — on to Kibana :)
When you're troubleshooting, I would start by printing to stdout:
output { stdout { codec => rubydebug } }
That way you can see whether the problem is on the Logstash side or in the interface to Elasticsearch.
I also find it useful to set the sincedb path when using the file input. It may be that a previous troubleshooting run already read the data and updated the sincedb, so the file is never re-read. While troubleshooting, add the following line inside the file {} block. Don't forget to remove it when you go into real use, though, or you will re-ingest data you didn't intend to:
sincedb_path => "/dev/null"
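Putting both tips together, a minimal troubleshooting pipeline might look like this (a sketch, not the asker's exact setup: the path is a placeholder, and on Windows "NUL" is commonly used in place of "/dev/null"):

```
input {
  file {
    path => [ "C:/path/to/your.csv" ]   # placeholder path; forward slashes are safest on Windows
    start_position => "beginning"
    sincedb_path => "/dev/null"         # troubleshooting only: never remember the read position
  }
}
output {
  stdout { codec => rubydebug }
}
```

Once events print correctly to stdout, swap in (or add back) the elasticsearch output.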