
logstash parsing timestamp halfday am/pm

New to logstash; really nice. I'm attempting to parse a CSV file that contains a timestamp, and I'd like to parse that timestamp and use it as the @timestamp field.

Sample of my logstash config

input { 
    stdin {} 
} 

filter { 
    # filter the input by csv (i.e. comma-separated-value) 
    csv { 
     columns => [ 
      "Job ID", 
      "Server Name", 
      "Status Code", 
      "Job Type", 
      "Client Name", 
      "Start Time", 
      "End Time" 
     ] 
    } 
    # parse the start time to create a real date 
    date { 
     # Examples of times in this log file 
     # "May 29, 2015 10:00:01 PM" 
     # "May 9, 2015 4:47:23 AM" 
     match => [ "End Time", 
        "MMM dd, YYYY HH:mm:ss aa", 
        "MMM d, YYYY HH:mm:ss aa" ] 
    } 
} 

# send the output to stdout, using the rubydebug codec 
# rubydebug uses the Ruby Awesome Print library 
output { 
    stdout { codec => rubydebug } 
} 

Sample of my input

108628,anmuswcnbu01,1,Backup,anmuswcrfax01.na.jnj.com,"May 29, 2015 10:00:01 PM","May 30, 2015 6:21:29 AM" 
108629,anmuswcnbu01,1,Backup,anmuswcapps01.na.jnj.com,"May 29, 2015 10:00:01 PM","May 9, 2015 10:51:39 pm" 
108630,anmuswcnbu01,1,Backup,anmuswcapps03.na.jnj.com,"May 29, 2015 10:00:01 PM","May 29, 2015 9:31:19 PM" 

Sample of my output

Logstash startup completed 
{ 
     "message" => [ 
     [0] "108628,anmuswcnbu01,1,Backup,anmuswcrfax01.na.jnj.com,\"May 29, 2015 10:00:01 PM\",\"May 30, 2015 6:21:29 AM\"\r" 
    ], 
     "@version" => "1", 
    "@timestamp" => "2015-05-30T06:21:29.000Z", 
      "host" => "ip-172-31-34-14", 
     "Job ID" => "108628", 
    "Server Name" => "anmuswcnbu01", 
    "Status Code" => "1", 
     "Job Type" => "Backup", 
    "Client Name" => "anmuswcrfax01.na.jnj.com", 
    "Start Time" => "May 29, 2015 10:00:01 PM", 
     "End Time" => "May 30, 2015 6:21:29 AM" 
} 
{ 
     "message" => [ 
     [0] "108629,anmuswcnbu01,1,Backup,anmuswcapps01.na.jnj.com,\"May 29, 2015 10:00:01 PM\",\"May 9, 2015 10:51:39 pm\"\r" 
    ], 
     "@version" => "1", 
    "@timestamp" => "2015-05-09T10:51:39.000Z", 
      "host" => "ip-172-31-34-14", 
     "Job ID" => "108629", 
    "Server Name" => "anmuswcnbu01", 
    "Status Code" => "1", 
     "Job Type" => "Backup", 
    "Client Name" => "anmuswcapps01.na.jnj.com", 
    "Start Time" => "May 29, 2015 10:00:01 PM", 
     "End Time" => "May 9, 2015 10:51:39 pm" 
} 
{ 
     "message" => [ 
     [0] "108630,anmuswcnbu01,1,Backup,anmuswcapps03.na.jnj.com,\"May 29, 2015 10:00:01 PM\",\"May 29, 2015 9:31:19 PM\"\r" 
    ], 
     "@version" => "1", 
    "@timestamp" => "2015-05-29T09:31:19.000Z", 
      "host" => "ip-172-31-34-14", 
     "Job ID" => "108630", 
    "Server Name" => "anmuswcnbu01", 
    "Status Code" => "1", 
     "Job Type" => "Backup", 
    "Client Name" => "anmuswcapps03.na.jnj.com", 
    "Start Time" => "May 29, 2015 10:00:01 PM", 
     "End Time" => "May 29, 2015 9:31:19 PM" 
} 
Logstash shutdown completed 

For example, in that last (3rd) row of data, instead of:

"@timestamp" => "2015-05-29T09:31:19.000Z", 

I feel like I should be getting:

"@timestamp" => "2015-05-29T21:31:19.000Z", 

As best I can tell, the date filter seems to be ignoring my "halfday" syntax:

match => [ "End Time", 
        "MMM dd, YYYY HH:mm:ss aa", 
        "MMM d, YYYY HH:mm:ss aa" ] 

I'm new to logstash, so I'm wondering: am I doing something wrong?

-Chad

Answer


The date filter uses a Joda-Time-compatible format.

Quoting part of the Joda symbol table:

Symbol Meaning                     Presentation Examples 
------ -------                     ------------ -------- 
a      halfday of day              text         PM 
K      hour of halfday (0~11)      number       0 
h      clockhour of halfday (1~12) number       12 

H      hour of day (0~23)          number       0 
k      clockhour of day (1~24)     number       24 
m      minute of hour              number       30 
s      second of minute            number       55 
S      fraction of second          number       978 

It's easy to overlook, but the half-day hours in your case are KK rather than HH. H is hour of day (0~23), so the parser takes the number as a 24-hour value and the AM/PM marker has no effect, which is why "9:31:19 PM" came through as 09:31:19 instead of 21:31:19. K is hour of halfday (0~11), which combines with the a marker to give the correct hour.
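
For reference, a minimal sketch of the corrected date filter, keeping the "End Time" field and both patterns from your config and changing only the hour symbol:

    date { 
     # KK (hour of halfday, 0~11) pairs with the aa (AM/PM) marker, 
     # so "May 29, 2015 9:31:19 PM" now parses as 21:31:19 
     match => [ "End Time", 
        "MMM dd, YYYY KK:mm:ss aa", 
        "MMM d, YYYY KK:mm:ss aa" ] 
    } 

With that change the third row's @timestamp should come out as "2015-05-29T21:31:19.000Z", the value you expected. (If the log ever prints 12 for noon or midnight, hh, the 1~12 clockhour of halfday, is the variant that accepts it.)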


Brilliant! That worked perfectly. Thanks for the kind and quick reply. –