I am using Logstash to parse log entries from an input log file.
LogLine:
TID: [0] [] [2016-05-30 23:02:02,602] INFO {org.wso2.carbon.registry.core.jdbc.EmbeddedRegistryService} - Configured Registry in 572ms {org.wso2.carbon.registry.core.jdbc.EmbeddedRegistryService}
Grok Pattern:
TID:%{SPACE}\[%{INT:SourceSystemId}\]%{SPACE}\[%{DATA:ProcessName}\]%{SPACE}\[%{TIMESTAMP_ISO8601:TimeStamp}\]%{SPACE}%{LOGLEVEL:MessageType}%{SPACE}{%{JAVACLASS:MessageTitle}}%{SPACE}-%{SPACE}%{GREEDYDATA:Message}
The grok pattern is working fine. Now I want to send the output of this parsing, in a transformed form, to my REST service.
Expected Output:
{
  "MessageId": "654656",
  "TimeStamp": "2001-12-31T12:00:00",
  "CorrelationId": "986565",
  "Severity": "NORMAL",
  "MessageType": "INFO",
  "MessageTitle": "TestTittle",
  "Message": "Sample Message",
  "MessageDetail": {
    "SourceSystemId": "65656",
    "ServerIP": "192.168.1.1",
    "HostName": "wedev.101",
    "ProcessId": "986",
    "ProcessName": "JAVA",
    "ThreadId": "65656",
    "MessageComponentName": "TestComponent"
  }
}
Problem Statement:
I want the JSON message sent to my REST-based service to be in the format shown above. Is it possible in Logstash to add some hard-coded values and combine them with the values I get from parsing the logs?
Following is my logstash-conf file:
input {
  file {
    path => "C:\WSO2Environment\wso2esb-4.8.1\repository\logs\wso2carbon.log"
    type => "wso2"
    codec => multiline {
      charset => "UTF-8"
      multiline_tag => "multiline"
      negate => true
      pattern => "^%{YEAR}\s%{MONTH}\s%{MONTHDAY}\s%{TIME}:\d{3}\s%{LOGLEVEL}"
      what => "previous"
    }
  }
}

filter {
  if [type] == "wso2" {
    grok {
      match => [ "message", "TID:%{SPACE}\[%{INT:SourceSystemId}\]%{SPACE}\[%{DATA:ProcessName}\]%{SPACE}\[%{TIMESTAMP_ISO8601:TimeStamp}\]%{SPACE}%{LOGLEVEL:MessageType}%{SPACE}{%{JAVACLASS:MessageTitle}}%{SPACE}-%{SPACE}%{GREEDYDATA:Message}" ]
      add_tag => [ "grokked" ]
    }
    if !( "_grokparsefailure" in [tags] ) {
      date {
        # TimeStamp is the field captured by the grok pattern above
        match => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
        add_tag => [ "dated" ]
      }
    }
  }
  if ( "multiline" in [tags] ) {
    grok {
      match => [ "message", "Service:(?<log_service>\s[\w]+)[.\W]*Operation:(?<log_operation>\s[\w]+)" ]
      add_tag => [ "servicedetails" ]
      tag_on_failure => [ "noservicedetails" ]
    }
  }
}

output {
  # stdout { }
  http {
    url => "http://localhost:8087/messages"
    http_method => "post"
    format => "json"
  }
}
Note:
I still have to configure the multiline format, so please ignore that part in my logstash configuration file.
Answer:
To add fields to an event, possibly including data parsed from the event, you will probably want to use the add_field functionality that most Logstash filters implement. The easiest way to do this is to add a mutate filter with whatever add_field entries you need.
mutate {
  add_field => {
    "foo_%{somefield}" => "Hello world, from %{host}"
  }
}
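To get closer to your expected output, the same add_field mechanism can mix hard-coded values with fields captured by your grok pattern: constants are plain strings, parsed fields are referenced with the %{fieldname} sprintf syntax, and nested fields such as the MessageDetail object can be created with Logstash's [outer][inner] bracket notation. The following is only a rough sketch, assuming the field names from your grok pattern; the literal values (Severity, CorrelationId, MessageComponentName) are placeholders, and %{host} is just the host field the file input adds, as in the example above:

filter {
  mutate {
    add_field => {
      # hard-coded top-level values (placeholders - set whatever constants you need)
      "Severity"      => "NORMAL"
      "CorrelationId" => "986565"

      # nested MessageDetail object, mixing parsed fields and constants
      "[MessageDetail][SourceSystemId]" => "%{SourceSystemId}"
      "[MessageDetail][ProcessName]"    => "%{ProcessName}"
      "[MessageDetail][HostName]"       => "%{host}"
      "[MessageDetail][MessageComponentName]" => "TestComponent"
    }
  }
}

Note that grok still leaves the original top-level copies (SourceSystemId, ProcessName, message, and so on) on the event; mutate's remove_field option can drop those, and rename can handle any remaining naming differences. Also keep in mind that the http output with format => "json" serializes the whole event, so Logstash's own metadata fields (@timestamp, @version, tags) will appear in the JSON as well unless you remove them.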