Question

Logstash - NGINX and HTTPD logs

  • Posted on November 25, 2014
  • Asked by vinny

Hi Mitchell - Your guide on getting ELK setup was perfect. Worked like a charm. What do I do if I want to import previous log files (e.g., maillog-date or maillog.processed.x.gz)?

Also, I would like to import NGINX or HTTPD logs. is there a specific formula that I need to use to import and parse them?

Thanks!



Hey!

I’m not @manicas, but hopefully I can help! First of all, I’d suggest you take a look at the second part of his series: Adding Logstash Filters To Improve Centralized Logging. It specifically covers Nginx and Apache logs.

Generally, there are two main pieces to getting a log from the server being monitored to the ELK server. Using Nginx as an example, you need to add a new entry to the “files” section of /etc/logstash-forwarder (the leading comma separates it from the previous entry in the list):

    ,
    {
      "paths": [
        "/var/log/nginx/access.log"
      ],
      "fields": { "type": "nginx-access" }
    }

This tells logstash-forwarder to send /var/log/nginx/access.log to the ELK instance.

On the other end, you’d need to create a filter telling Logstash how to parse the file. The tutorial gives an example.
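As a minimal sketch (not the exact filter from the tutorial — that one defines its own NGINXACCESS pattern), Nginx’s default “combined” access log format also matches the stock COMBINEDAPACHELOG grok pattern that ships with Logstash, so a filter conditional on the “nginx-access” type set above could look something like:

    filter {
      if [type] == "nginx-access" {
        grok {
          # Nginx's default combined log format matches the
          # built-in COMBINEDAPACHELOG pattern
          match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
        date {
          # Use the timestamp from the log line itself,
          # not the time the event was ingested
          match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
        }
      }
    }

If you’ve customized Nginx’s log_format, you’d need to adjust the grok pattern to match.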

This site can be very useful when writing grok filters: https://grokdebug.herokuapp.com/