Question

I need to use grok expressions in ELK (Logstash) to filter the message data.

I have gone through the article on filtering with grok expressions in ELK (Logstash).

I have the ELK stack configured using Docker Compose, and Filebeat running in a Docker container on the client server.
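For context, the docker-compose file is roughly along these lines (trimmed to the relevant services; the image versions, volume paths, and ports here are placeholders, not my exact file):

version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0   # version is a placeholder
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:7.17.0             # version is a placeholder
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline          # holds the pipeline config shown below
    ports:
      - "5044:5044"                                               # beats input from Filebeat
    depends_on:
      - elasticsearch
  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.0                 # version is a placeholder
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch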

I have the following Logstash pipeline configured with grok filters.

input { beats { port => 5044 } }

filter {
  if [path] == "/var/log/apache2/access.log" {
    grok { match => { "message" => "%{IP:client_ip} | %{DATA:syslog_timestamp} | %{WORD:method} | %{DATA:unknow} | %{DATA:xyz} | %{DATA:code} | %{DATA:unno} | %{DATA:byte} | %{DATA:port} | %{GREEDYDATA:syslog_message}" } }
  }
  if [path] == "/var/log/apache2/error.log" {
    grok { match => { "message" => "%{DATA:timestamp} | %{DATA:Loglevel} | %{DATA:requet} | %{DATA:url} | %{GREEDYDATA:syslog_message}" } }
  }
  if [path] == "/var/log/apache2/request.log" {
    grok { match => { "message" => "%{DATA:timestamp} +0000%{DATA:unknow} [0]%{DATA:method} <-%{DATA:code} %{DATA:httpcode} %{GREEDYDATA:message} %{GREEDYDATA:responsetime}" } }
  }
}

output { elasticsearch { hosts => "elasticsearch:9200" } }
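For reference, the access.log entries are pipe-delimited; a made-up line in the shape the first grok pattern expects looks like this (field values are illustrative, not taken from my actual logs):

10.0.0.1 | 2021-03-04T10:15:30Z | GET | - | /index.html | 200 | - | 1024 | 443 | "Mozilla/5.0 (X11; Linux x86_64)"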

My filebeat.yml is as follows.

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

processors:
  - add_cloud_metadata: ~

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - "/var/log/apache2/*.log"
    exclude_files: ['.gz$']
    json.message_key: log

  - type: log
    enabled: true
    paths:
      - "/var/log/aem/*.log"
    exclude_files: ['.gz$']
    json.message_key: log
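Filebeat is pointed at the Logstash beats input on port 5044; the output section (not reproduced above) looks roughly like this, with the hostname as a placeholder:

output.logstash:
  hosts: ["logstash:5044"]   # placeholder host; matches the beats input port in the Logstash config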

When I bring up the ELK stack and start Filebeat, I am not able to see the tags in the Discover section of the Kibana dashboard.

Can you please advise on this?

Thank you in advance!
