By: vinny

Logstash - NGINX and HTTPD logs

November 25, 2014 4.2k views

Hi Mitchell -
Your guide on getting ELK set up was perfect. Worked like a charm.
What do I do if I want to import previous log files (e.g., maillog-date or maillog.processed.x.gz)?

Also, I would like to import NGINX or HTTPD logs. Is there a specific formula that I need to use to import and parse them?

Thanks!

2 Answers

Hey!

I'm not @manicas, but hopefully I can help! First of all, I'd suggest you take a look at the second part of his series, Adding Logstash Filters To Improve Centralized Logging. It specifically covers Nginx and Apache logs.

Generally, there are two main pieces to getting a log from the monitored server to the ELK server. Using Nginx as an example, you need to add a new entry to the "files" section of /etc/logstash-forwarder:

    ,
    {
      "paths": [
        "/var/log/nginx/access.log"
      ],
      "fields": { "type": "nginx-access" }
    }

This tells the logstash-forwarder to send /var/log/nginx/access.log to the ELK instance.

On the other end, you'd need to create a filter telling Logstash how to parse the file. The tutorial gives an example.
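For reference, a minimal filter sketch for the "nginx-access" type set above. Nginx's default combined log format is compatible with the built-in COMBINEDAPACHELOG grok pattern; the file name 11-nginx.conf is just an example, drop it wherever your Logstash config lives (e.g., /etc/logstash/conf.d/):

    # 11-nginx.conf (hypothetical file name) -- parse nginx access logs
    filter {
      if [type] == "nginx-access" {
        grok {
          match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
      }
    }

Restart Logstash after adding the filter so it gets picked up.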

This site can be very useful when writing filters: https://grokdebug.herokuapp.com/
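As for the first part of the question (importing existing files like maillog.processed.x.gz): one approach is to bypass the forwarder and pipe the decompressed files into a one-off Logstash run with a stdin input. A sketch, where import.conf is a hypothetical throwaway config:

    # import.conf (hypothetical) -- read events from stdin instead of a file
    input { stdin { type => "syslog" } }
    output { elasticsearch { host => "localhost" } }

You could then feed the old logs through it, e.g. `zcat maillog.processed.*.gz | /opt/logstash/bin/logstash -f import.conf` (adjust the Logstash path and the output settings for your install).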

by Mitchell Anicas
One way to increase the effectiveness of your Logstash setup is to collect important application logs and structure the log data by employing filters. This guide is a sequel to the [How To Use Logstash and Kibana To Centralize Logs On Ubuntu 14.04](https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-and-visualize-logs-on-ubuntu-14-04) tutorial, and focuses primarily on adding filters for various common application logs.
  • I tried as you said, but it didn't work. I only see my syslogs :) and no nginx logs in Kibana. #sad

    {
      "network": {
        "servers": [ "erasedip:5000" ],
        "timeout": 15,
        "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
      },
      "files": [
        {
          "paths": [ "/var/log/syslog", "/var/log/auth.log" ],
          "fields": { "type": "syslog" }
        },
        {
          "paths": [ "/var/log/nginx/access.log" ],
          "fields": { "type": "nginx-access" }
        }
      ]
    }
    
  • @esmyl911 Are you seeing anything out of the ordinary in logstash-forwarder's own logs? Did you create the Logstash filter as well? Check out this tutorial for the details.

    by Mitchell Anicas
    One way to increase the effectiveness of your Logstash setup is to collect important application logs and structure the log data by employing filters. This guide is a sequel to the [How To Use Logstash and Kibana To Centralize Logs On Ubuntu 14.04](https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-and-visualize-logs-on-ubuntu-14-04) tutorial, and focuses primarily on adding filters for various common application logs.
  • @asb Hey, sorry! I had to restart Logstash on the server and logstash-forwarder on my nginx host. It worked perfectly after that. Thanks!! But there is still another issue: right now I am unable to save my dashboard. It seems like this bug is fixed in the latest update?

    https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-and-visualize-logs-on-ubuntu-14-04

    The bug is fixed in Kibana 4.x.

    So are you guys going to update the doc?

    by Mitchell Anicas
    In this tutorial, we will go over the installation of Logstash 1.4.2 and Kibana 3, and how to configure them to gather and visualize the syslogs of our systems in a centralized location. Logstash is an open source tool for collecting, parsing, and storing logs for future use. Kibana 3 is a web interface that can be used to search and view the logs that Logstash has indexed. Elasticsearch, Logstash, and Kibana, when used together, are known as an ELK stack.