
Logstash output does not recognize columns in Kibana

I am trying to get my .CSV file into Kibana for visualization. It feels like I am close to getting it working, but I cannot figure out how to get the output right.

In Kibana, my CSV file shows up as:

message: News,[email protected],10.10.10.10 

It looks like my CSV output ends up in a single field called message. I want three separate fields instead: Name, Email, and IP. I have tried many CSV files and different configurations, but with no success so far.

CSV file:

Name,Email,IP 
Auto,[email protected],10.0.0.196 
News,[email protected],10.10.10.10 
nieuwsbrieven,[email protected],10.10.10.10 

Logstash conf file:

input {
  file {
    path => "C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv"
    start_position => beginning
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Date","Open","High"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv_index"
  }
  stdout {}
}
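
Note that this conf reads the CSV directly with a file input, while the filebeat.yml below ships the same file to Logstash on port 5044; events arriving from Filebeat only pass through the csv filter if the pipeline listens with a beats input (the Logstash log below does show a Beats server starting on 5044). A minimal sketch of such an input block, assuming the default Beats port:

input {
  # Accept events shipped by Filebeat's output.logstash
  beats {
    port => 5044
  }
}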

filebeat.yml:

filebeat.prospectors:
- input_type: log
  paths:
    - C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv

output.elasticsearch:
  hosts: ["localhost:9200"]
  template.name: "testttt"
  template.overwrite: true

output.logstash:
  hosts: ["localhost:5044"]

Logstash CMD output:

[2017-10-12T13:53:52,682][INFO ][logstash.pipeline  ] Pipeline main started 
[2017-10-12T13:53:52,690][INFO ][org.logstash.beats.Server] Starting server on port: 5044 
[2017-10-12T13:53:53,003][INFO ][logstash.agent   ] Successfully started Logstash API endpoint {:port=>9600} 
{ 
    "@timestamp" => 2017-10-12T11:53:53.659Z, 
     "offset" => 15, 
     "@version" => "1", 
    "input_type" => "log", 
      "beat" => { 
      "name" => "DESKTOP-VEQHHVT", 
     "hostname" => "DESKTOP-VEQHHVT", 
     "version" => "5.6.2" 
    }, 
      "host" => "DESKTOP-VEQHHVT", 
     "source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv", 
     "message" => "Name,Email,IP", 
      "type" => "log", 
      "tags" => [ 
     [0] "beats_input_codec_plain_applied" 
    ] 
} 
{ 
    "@timestamp" => 2017-10-12T11:53:53.659Z, 
     "offset" => 44, 
     "@version" => "1", 
    "input_type" => "log", 
      "beat" => { 
      "name" => "DESKTOP-VEQHHVT", 
     "hostname" => "DESKTOP-VEQHHVT", 
     "version" => "5.6.2" 
    }, 
      "host" => "DESKTOP-VEQHHVT", 
     "source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv", 
     "message" => "Auto,[email protected],10.0.0.196", 
      "type" => "log", 
      "tags" => [ 
     [0] "beats_input_codec_plain_applied" 
    ] 
} 
{ 
    "@timestamp" => 2017-10-12T11:53:53.659Z, 
     "offset" => 77, 
     "@version" => "1", 
      "beat" => { 
      "name" => "DESKTOP-VEQHHVT", 
     "hostname" => "DESKTOP-VEQHHVT", 
     "version" => "5.6.2" 
    }, 
    "input_type" => "log", 
      "host" => "DESKTOP-VEQHHVT", 
     "source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv", 
     "message" => "News,[email protected],10.10.10.10", 
      "type" => "log", 
      "tags" => [ 
     [0] "beats_input_codec_plain_applied" 
    ]
}

My CSV columns/rows all end up in the message field.

Curl command output (curl localhost:9200/_cat/indices?v):

yellow open filebeat-2017.10.12 ux6-ByOERj-2XEBojkxhXg 5 1   3   0  13.3kb   13.3kb 

Elasticsearch terminal output:

[2017-10-12T13:53:11,763][INFO ][o.e.n.Node    ] [] initializing ... 
[2017-10-12T13:53:11,919][INFO ][o.e.e.NodeEnvironment ] [Zs6ZAuy] using [1] data paths, mounts [[(C:)]], net usable_space [1.9tb], net total_space [1.9tb], spins? [unknown], types [NTFS] 
[2017-10-12T13:53:11,920][INFO ][o.e.e.NodeEnvironment ] [Zs6ZAuy] heap size [1.9gb], compressed ordinary object pointers [true] 
[2017-10-12T13:53:12,126][INFO ][o.e.n.Node    ] node name [Zs6ZAuy] derived from node ID [Zs6ZAuyyR2auGVnPoD9gRw]; set [node.name] to override 
[2017-10-12T13:53:12,128][INFO ][o.e.n.Node    ] version[5.6.2], pid[3384], build[57e20f3/2017-09-23T13:16:45.703Z], OS[Windows 10/10.0/amd64], JVM[Oracle Corporation/Java HotSpot(TM) 64-Bit Server VM/1.8.0_144/25.144-b01] 
[2017-10-12T13:53:12,128][INFO ][o.e.n.Node    ] JVM arguments [-Xms2g, -Xmx2g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -Djdk.io.permissionsUseCanonicalPath=true, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Dlog4j.skipJansi=true, -XX:+HeapDumpOnOutOfMemoryError, -Delasticsearch, -Des.path.home=C:\ELK-Stack\elasticsearch\elasticsearch-5.6.2] 
[2017-10-12T13:53:13,550][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [aggs-matrix-stats] 
[2017-10-12T13:53:13,616][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [ingest-common] 
[2017-10-12T13:53:13,722][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [lang-expression] 
[2017-10-12T13:53:13,798][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [lang-groovy] 
[2017-10-12T13:53:13,886][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [lang-mustache] 
[2017-10-12T13:53:13,988][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [lang-painless] 
[2017-10-12T13:53:14,059][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [parent-join] 
[2017-10-12T13:53:14,154][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [percolator] 
[2017-10-12T13:53:14,223][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [reindex] 
[2017-10-12T13:53:14,289][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [transport-netty3] 
[2017-10-12T13:53:14,360][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] loaded module [transport-netty4] 
[2017-10-12T13:53:14,448][INFO ][o.e.p.PluginsService  ] [Zs6ZAuy] no plugins loaded 
[2017-10-12T13:53:18,328][INFO ][o.e.d.DiscoveryModule ] [Zs6ZAuy] using discovery type [zen] 
[2017-10-12T13:53:19,204][INFO ][o.e.n.Node    ] initialized 
[2017-10-12T13:53:19,204][INFO ][o.e.n.Node    ] [Zs6ZAuy] starting ... 
[2017-10-12T13:53:20,071][INFO ][o.e.t.TransportService ] [Zs6ZAuy] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}, {[::1]:9300} 
[2017-10-12T13:53:23,130][INFO ][o.e.c.s.ClusterService ] [Zs6ZAuy] new_master {Zs6ZAuy}{Zs6ZAuyyR2auGVnPoD9gRw}{jBwTE7rUS4i_Ugh6k6DAMg}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined) 
[2017-10-12T13:53:23,883][INFO ][o.e.g.GatewayService  ] [Zs6ZAuy] recovered [5] indices into cluster_state 
[2017-10-12T13:53:25,962][INFO ][o.e.c.r.a.AllocationService] [Zs6ZAuy] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[.kibana][0]] ...]). 
[2017-10-12T13:53:25,981][INFO ][o.e.h.n.Netty4HttpServerTransport] [Zs6ZAuy] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}, {[::1]:9200} 
[2017-10-12T13:53:25,986][INFO ][o.e.n.Node    ] [Zs6ZAuy] started 
[2017-10-12T13:53:59,245][INFO ][o.e.c.m.MetaDataCreateIndexService] [Zs6ZAuy] [filebeat-2017.10.12] creating index, cause [auto(bulk api)], templates [filebeat, testttt], shards [5]/[1], mappings [_default_] 
[2017-10-12T13:53:59,721][INFO ][o.e.c.m.MetaDataMappingService] [Zs6ZAuy] [filebeat-2017.10.12/ux6-ByOERj-2XEBojkxhXg] create_mapping [doc] 

Filebeat output:

C:\ELK-Stack\filebeat>filebeat -e -c filebeat.yml -d "publish" 
2017/10/12 11:53:53.632142 beat.go:297: INFO Home path: [C:\ELK-Stack\filebeat] Config path: [C:\ELK-Stack\filebeat] Data path: [C:\ELK-Stack\filebeat\data] Logs path: [C:\ELK-Stack\filebeat\logs] 
2017/10/12 11:53:53.632142 beat.go:192: INFO Setup Beat: filebeat; Version: 5.6.2 
2017/10/12 11:53:53.634143 publish.go:228: WARN Support for loading more than one output is deprecated and will not be supported in version 6.0. 
2017/10/12 11:53:53.635144 output.go:258: INFO Loading template enabled. Reading template file: C:\ELK-Stack\filebeat\filebeat.template.json 
2017/10/12 11:53:53.636144 output.go:269: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: C:\ELK-Stack\filebeat\filebeat.template-es2x.json 
2017/10/12 11:53:53.637143 output.go:281: INFO Loading template enabled for Elasticsearch 6.x. Reading template file: C:\ELK-Stack\filebeat\filebeat.template-es6x.json 
2017/10/12 11:53:53.638144 client.go:128: INFO Elasticsearch url: http://localhost:9200 
2017/10/12 11:53:53.639143 outputs.go:108: INFO Activated elasticsearch as output plugin. 
2017/10/12 11:53:53.639143 logstash.go:90: INFO Max Retries set to: 3 
2017/10/12 11:53:53.640143 outputs.go:108: INFO Activated logstash as output plugin. 
2017/10/12 11:53:53.640143 publish.go:243: DBG Create output worker 
2017/10/12 11:53:53.641143 publish.go:243: DBG Create output worker 
2017/10/12 11:53:53.641143 publish.go:285: DBG No output is defined to store the topology. The server fields might not be filled. 
2017/10/12 11:53:53.642144 publish.go:300: INFO Publisher name: DESKTOP-VEQHHVT 
2017/10/12 11:53:53.634143 metrics.go:23: INFO Metrics logging every 30s 
2017/10/12 11:53:53.646143 async.go:63: INFO Flush Interval set to: 1s 
2017/10/12 11:53:53.647142 async.go:64: INFO Max Bulk Size set to: 50 
2017/10/12 11:53:53.647142 async.go:72: DBG create bulk processing worker (interval=1s, bulk size=50) 
2017/10/12 11:53:53.648144 async.go:63: INFO Flush Interval set to: 1s 
2017/10/12 11:53:53.648144 async.go:64: INFO Max Bulk Size set to: 2048 
2017/10/12 11:53:53.649144 async.go:72: DBG create bulk processing worker (interval=1s, bulk size=2048) 
2017/10/12 11:53:53.649144 beat.go:233: INFO filebeat start running. 
2017/10/12 11:53:53.650144 registrar.go:68: INFO No registry file found under: C:\ELK-Stack\filebeat\data\registry. Creating a new registry file. 
2017/10/12 11:53:53.652144 registrar.go:106: INFO Loading registrar data from C:\ELK-Stack\filebeat\data\registry 
2017/10/12 11:53:53.654145 registrar.go:123: INFO States Loaded from registrar: 0 
2017/10/12 11:53:53.655145 crawler.go:38: INFO Loading Prospectors: 1 
2017/10/12 11:53:53.655145 prospector_log.go:65: INFO Prospector with previous states loaded: 0 
2017/10/12 11:53:53.656144 prospector.go:124: INFO Starting prospector of type: log; id: 11034545279404679229 
2017/10/12 11:53:53.656144 crawler.go:58: INFO Loading and starting Prospectors completed. Enabled prospectors: 1 
2017/10/12 11:53:53.655145 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s 
2017/10/12 11:53:53.655145 registrar.go:236: INFO Starting Registrar 
2017/10/12 11:53:53.657144 log.go:91: INFO Harvester started for file: C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv 
2017/10/12 11:53:53.655145 sync.go:41: INFO Start sending events to output 
2017/10/12 11:53:58.682432 client.go:214: DBG Publish: { 
    "@timestamp": "2017-10-12T11:53:53.659Z", 
    "beat": { 
    "hostname": "DESKTOP-VEQHHVT", 
    "name": "DESKTOP-VEQHHVT", 
    "version": "5.6.2" 
    }, 
    "input_type": "log", 
    "message": "Name,Email,IP", 
    "offset": 15, 
    "source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv", 
    "type": "log" 
} 
2017/10/12 11:53:58.685434 client.go:214: DBG Publish: { 
    "@timestamp": "2017-10-12T11:53:53.659Z", 
    "beat": { 
    "hostname": "DESKTOP-VEQHHVT", 
    "name": "DESKTOP-VEQHHVT", 
    "version": "5.6.2" 
    }, 
    "input_type": "log", 
    "message": "Auto,[email protected],10.0.0.196", 
    "offset": 44, 
    "source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv", 
    "type": "log" 
} 
2017/10/12 11:53:58.685434 client.go:214: DBG Publish: { 
    "@timestamp": "2017-10-12T11:53:53.659Z", 
    "beat": { 
    "hostname": "DESKTOP-VEQHHVT", 
    "name": "DESKTOP-VEQHHVT", 
    "version": "5.6.2" 
    }, 
    "input_type": "log", 
    "message": "News,[email protected],10.10.10.10", 
    "offset": 77, 
    "source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv", 
    "type": "log" 
} 
2017/10/12 11:53:58.686434 output.go:109: DBG output worker: publish 3 events 
2017/10/12 11:53:58.686434 output.go:109: DBG output worker: publish 3 events 
2017/10/12 11:53:58.738437 client.go:667: INFO Connected to Elasticsearch version 5.6.2 
2017/10/12 11:53:58.748436 output.go:317: INFO Trying to load template for client: http://localhost:9200 
2017/10/12 11:53:58.890446 output.go:324: INFO Existing template will be overwritten, as overwrite is enabled. 
2017/10/12 11:53:59.154461 client.go:592: INFO Elasticsearch template with name 'testttt' loaded 
2017/10/12 11:54:00.020510 sync.go:70: DBG Events sent: 4 

Kibana output:

@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:Auto,[email protected],10.0.0.196 offset:44 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9CwA _type:doc _index:filebeat-2017.10.12 _score:1 
@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:News,[email protected],10.10.10.10 offset:77 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9CwB _type:doc _index:filebeat-2017.10.12 _score:1 
@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:Name,Email,IP offset:15 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9Cv_ _type:doc _index:filebeat-2017.10.12 _score:1 

If the CSV import into ES is a one-time task, you don't need Filebeat.
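
For such a one-off import, Logstash alone is enough: the file input from the conf above reads test.csv directly, so a single pipeline run indexes the whole file. A sketch, assuming the conf is saved as test.conf and Logstash sits next to the other components under C:\ELK-Stack:

C:\ELK-Stack\logstash> bin\logstash.bat -f test.conf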

Answer


You are specifying the wrong column names in the csv filter, and the column names should be given without quotes ("). I tried this and it works for me. Check whether it works for you as well. My Logstash config file:
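
It presumably centered on a csv filter with corrected column names; a minimal sketch that matches the actual header of test.csv, not necessarily the answerer's exact file (quoted strings are the usual form for the columns array):

filter {
  csv {
    separator => ","
    # Must match the CSV header: Name,Email,IP
    columns => ["Name","Email","IP"]
  }
}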


Thanks for your answer. Sorry, that was a mistake from another example. When I run your script, the problem remains; I still see the wrong output in Kibana. An entry in Discover shows that it does not recognize my columns, and the rest of the CSV file behaves the same: the message field contains all the data <\t Automatisch, auto @ newsuk, 10.0.0.196>. I want to get the three values into separate fields so that I can use them for visualization.
Thanks for your time.


Which version of ES and Logstash are you using?


Please post your ES documents after running Logstash.
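
To verify what actually landed in Elasticsearch after a Logstash run, a search against the index named in the conf would show the stored documents and their fields; for example:

curl "localhost:9200/csv_index/_search?pretty"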