Filebeat 5.0 output to Kafka with multiple topics

I have Filebeat 5.0 installed on my app server and have 3 Filebeat prospectors. Each prospector points to a different log path and outputs to a single Kafka topic called myapp_applog, and everything works fine.

My Filebeat output configuration to a single topic - working

output.kafka:
    # initial brokers for reading cluster metadata
    hosts: ["broker.1.ip.address:9092", "broker.2.ip.address:9092", "broker.3.ip.address:9092"]

    # message topic selection + partitioning
    topic: 'myapp_applog'
    partition.round_robin:
        reachable_only: false

    required_acks: 1
    compression: gzip
    max_message_bytes: 1000000

What I want to do is send each of the log files to a separate topic based on a condition; see the documentation on topics. I have tried to do this, but no data was sent to any of the topics. Does anyone know why my condition does not match, or whether it is even correct? I cannot seem to find an example of how to use the topics condition properly.

Here is my Kafka output configuration for multiple topics.

Not working

output.kafka:
    # initial brokers for reading cluster metadata
    hosts: ["broker.1.ip.address:9092", "broker.2.ip.address:9092", "broker.3.ip.address:9092"]

    # message topic selection + partitioning
    topics:
        - topic: 'myapp_applog'
          when:
              equals:
                  document_type: applog_myappapi
        - topic: 'myapp_applog_stats'
          when:
              equals:
                  document_type: applog_myappapi_stats
        - topic: 'myapp_elblog'
          when:
              equals:
                  document_type: elblog_myappapi
    partition.round_robin:
        reachable_only: false

    required_acks: 1
    compression: gzip
    max_message_bytes: 1000000
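
For comparison, here is a minimal sketch (not from the original post) of the same routing with the conditions keyed on the type field: in Filebeat 5 the document_type prospector option populates the event's type field, so a condition on a field literally named document_type has nothing to match against. The first answer below relies on the same type field.

output.kafka:
    # initial brokers for reading cluster metadata
    hosts: ["broker.1.ip.address:9092", "broker.2.ip.address:9092", "broker.3.ip.address:9092"]

    topics:
        # match on the event's type field, which document_type populates
        - topic: 'myapp_applog'
          when:
              equals:
                  type: applog_myappapi
        - topic: 'myapp_applog_stats'
          when:
              equals:
                  type: applog_myappapi_stats
        - topic: 'myapp_elblog'
          when:
              equals:
                  type: elblog_myappapi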

Here is the full filebeat.yml configuration file.

################### Filebeat Configuration Example ######################### 
############################# Filebeat ###################################### 
filebeat.prospectors:
    # App logs - prospector
    - input_type: log
      paths:
          - /myapp/logs/myapp.log
      exclude_lines: [".+? INFO[^*].+", ".+? DEBUG[^*].+"]
      exclude_files: [".gz$", ".tmp"]
      fields:
          api: myappapi
          environment: STG
      ignore_older: 24h
      document_type: applog_myappapi
      scan_frequency: 1s

      # Multiline on timestamp, YYYY-MM-DD
      # https://www.elastic.co/guide/en/beats/filebeat/master/multiline-examples.html
      multiline:
          pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
          negate: true
          match: after
          max_lines: 500
          timeout: 5s
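      # For example (hypothetical lines, not from the original logs), a stack
      # trace such as:
      #     2016-12-03 10:15:01 ERROR MyAppApi - request failed
      #         at com.myapp.api.Handler.handle(Handler.java:42)
      # is merged into the event that begins with the timestamped line, because
      # the continuation lines do not start with YYYY-MM-DD (negate: true,
      # match: after).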

    # Server Stats - prospector
    - input_type: log
      paths:
          - /myapp/logs/serverstats.log

      # Exclude messages with log level
      exclude_lines: [".+? ERROR[^*].+", ".+? DEBUG[^*].+"]
      exclude_files: [".gz$", ".tmp"]
      fields:
          api: myappapi
          environment: STG
      ignore_older: 24h
      document_type: applog_myappapi_stats
      scan_frequency: 1s

    # ELB prospector
    - input_type: log
      paths:
          - /var/log/httpd/elasticbeanstalk-access_log
      document_type: elblog_myappapi
      fields:
          api: myappapi
          environment: STG
      exclude_lines: [".+? INFO[^*].+", ".+? DEBUG[^*].+"]
      exclude_files: [".gz$", ".tmp"]
      ignore_older: 24h

      # With 0s, scanning is done as often as possible. Default: 10s
      scan_frequency: 1s
registry_file: /var/lib/filebeat/registry 

############################# Output ########################################## 
# Configure what outputs to use when sending the data collected by the beat. 
# Multiple outputs may be used. 
#----------------------------- Kafka output -------------------------------- 

output.kafka:
    # initial brokers for reading cluster metadata
    hosts: ["broker.1.ip.address:9092", "broker.2.ip.address:9092", "broker.3.ip.address:9092"]

    # message topic selection + partitioning
    topics:
        - topic: 'myapp_applog'
          when:
              equals:
                  document_type: applog_myappapi
        - topic: 'myapp_applog_stats'
          when:
              equals:
                  document_type: applog_myappapi_stats
        - topic: 'myapp_elblog'
          when:
              equals:
                  document_type: elblog_myappapi
    partition.round_robin:
        reachable_only: false

    required_acks: 1
    compression: gzip
    max_message_bytes: 1000000

############################# Logging ######################################### 

# There are three options for the log output: syslog, file, stderr.
# Under Windows systems, the log files are per default sent to the file output,
# under all other systems per default to syslog.
logging:

    # Send all logging output to syslog. On Windows default is false, otherwise
    # default is true.
    to_syslog: true

    # Write all logging output to files. Beats automatically rotate files if
    # the rotateeverybytes limit is reached.
    to_files: true

    # To enable logging to files, the to_files option has to be set to true
    files:
        # The directory where the log files will be written to.
        path: /var/log/

        # The name of the files where the logs are written to.
        name: filebeats.log

        # Configure log file size limit. If the limit is reached, the log file
        # will be automatically rotated.
        rotateeverybytes: 10485760 # = 10MB

        # Number of rotated log files to keep. Oldest files will be deleted first.
        keepfiles: 7

    # Enable debug output for selected components. To enable all selectors use ["*"]
    # Other available selectors are beat, publish, service
    # Multiple selectors can be chained.
    #selectors: ["*"]

    # Sets log level. The default log level is error.
    # Available log levels are: critical, error, warning, info, debug
    level: info

Answer

I ran into the same problem and handled it by defining the output as:

topics:
    - topic: '%{[type]}'
use_type: true
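
For context, this fragment belongs under output.kafka. An equivalent minimal sketch using the format-string form of the topic option (the broker list is assumed from the question) could be:

output.kafka:
    hosts: ["broker.1.ip.address:9092", "broker.2.ip.address:9092", "broker.3.ip.address:9092"]
    # '%{[type]}' expands per event to the value set by document_type
    topic: '%{[type]}'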

and in each input, just set document_type to your Kafka topic:

- input_type: log
  paths:
      - /path/to/log/file
  document_type: "your kafka topic 1"

- input_type: log
  paths:
      - /path/to/other/log/file
  document_type: "your other kafka topic 1"

Answer

Input:

- type: log
  fields:
      kafka_topic: "my_topic_1"

- type: log
  fields:
      kafka_topic: "my_topic_2"

Output:

output.kafka: 
    hosts: ["mybroker:9092"] 
    topic: '%{[fields.kafka_topic]}' 

The above example shows two log inputs and two Kafka output topics.
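
Applied to the three logs from the question, a minimal sketch of the same approach (paths, broker list, and topic names taken from the question; under Filebeat 5 the inputs are declared as prospectors with input_type) might look like:

filebeat.prospectors:
    - input_type: log
      paths:
          - /myapp/logs/myapp.log
      fields:
          kafka_topic: myapp_applog

    - input_type: log
      paths:
          - /myapp/logs/serverstats.log
      fields:
          kafka_topic: myapp_applog_stats

    - input_type: log
      paths:
          - /var/log/httpd/elasticbeanstalk-access_log
      fields:
          kafka_topic: myapp_elblog

output.kafka:
    hosts: ["broker.1.ip.address:9092", "broker.2.ip.address:9092", "broker.3.ip.address:9092"]
    # each event goes to the topic stored in its fields.kafka_topic value
    topic: '%{[fields.kafka_topic]}'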
