How can I insert my nginx logs into MySQL?

I want to insert my log file into a database like MySQL. This is an example of the log file that I have:

195.xx.x.x - - [13/Apr/2017:09:60:xx +0200] "POST /userx/index.php?m=contacts&xxxx…
192.xx.x.x - - [13/Apr/2017:09:45:xx +0200] "POST /userx/index.php?m=customer&xxxx…
197.xx.x.x - - [13/Apr/2017:09:10:xx +0200] "POST /userx/index.php?m=meeting&xxxx…
197.xx.x.x - - [13/Apr/2017:09:20:xx +0200] "POST /userx/index.php?m=dashboard&xxxx…



There are a few ways you could do this, using a programming language (PHP, for example) or bash, though I'd really question whether it's practical.

For example, let's say your log file is around 1,000 lines right now. If you want to insert each line as an individual entry, you're looking at 1,000 inserts. You'd then need to flush the log file out, using logrotate or in-app code, so that you're not continuously inserting the same data.
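One way to avoid re-inserting the same lines, short of flushing the file, is to remember how far you read on the previous run. Here's a minimal sketch in Python; the log and state-file paths are hypothetical, and the rotation check simply starts over if the file shrank:

```python
import os

def read_new_lines(log_path, state_path):
    """Read only lines appended since the last run, tracking a byte offset
    in a small state file so repeated runs don't re-insert old entries."""
    offset = 0
    if os.path.exists(state_path):
        with open(state_path) as fh:
            offset = int(fh.read().strip() or 0)

    with open(log_path) as fh:
        fh.seek(0, os.SEEK_END)
        size = fh.tell()
        if size < offset:
            # File shrank: it was rotated or truncated, so start from the top.
            offset = 0
        fh.seek(offset)
        lines = fh.readlines()
        new_offset = fh.tell()

    with open(state_path, "w") as fh:
        fh.write(str(new_offset))
    return lines
```

Each run then only hands you the new lines to insert, and a rotated file is picked up from the beginning.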

If you want to do this live, or nearly live, and the machine you're running this on is in production and accepting numerous connections, you're going to be doing a lot of file reads (the logs), file processing (again, the logs), and writes to the database server (since each entry would be an insert).

Whether in bash or a programming language, you simply read the file into memory, loop over the lines (with a for or foreach statement), and within the loop build your query or queries. Once every line has been read and the queries are built, you execute them, which performs the actual inserts.
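The read-loop-build-execute steps above can be sketched in Python. This is only an illustration: the regex assumes nginx's default combined log format, and the `access_log` table and its columns are hypothetical names you'd replace with your own schema:

```python
import re

# Matches the start of an nginx combined-format line:
# IP, two dash fields, bracketed timestamp, then "METHOD PATH ..." in quotes.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*"'
)

def parse_line(line):
    """Return a dict of fields from one access-log line, or None on no match."""
    m = LOG_PATTERN.search(line)
    return m.groupdict() if m else None

def build_insert(fields):
    """Build a parameterized INSERT (table/column names are hypothetical)."""
    sql = ("INSERT INTO access_log (ip, logged_at, method, path) "
           "VALUES (%s, %s, %s, %s)")
    params = (fields["ip"], fields["time"], fields["method"], fields["path"])
    return sql, params

sample = ('195.0.0.1 - - [13/Apr/2017:09:45:00 +0200] '
          '"POST /userx/index.php?m=contacts HTTP/1.1" 200 512')
fields = parse_line(sample)
sql, params = build_insert(fields)
# Execute with a MySQL driver of your choice (e.g. PyMySQL or
# mysql-connector-python): cursor.execute(sql, params), or
# cursor.executemany(sql, many_params) to batch the inserts.
```

Using parameterized queries and batching the execution (rather than one round trip per line) keeps the database load down, which matters given the volume concerns above.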

Whether this is practical depends on what you really need to do with the data. Unless you have an absolute need for it to be in a database, you may well be better off setting up logrotate and simply scanning the files as needed.
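"Scanning the files as needed" can itself be a few lines, since rotated logs are usually just gzipped files alongside the live one. A sketch, assuming the standard /var/log/nginx layout and a plain substring filter:

```python
import glob
import gzip

def scan_logs(log_dir, needle):
    """Return all lines containing `needle` from the live access log and
    any gzipped rotations sitting next to it (access.log, access.log.*.gz)."""
    matches = []
    for path in sorted(glob.glob(log_dir + "/access.log*")):
        # gzip.open in text mode handles rotated .gz files transparently.
        opener = gzip.open if path.endswith(".gz") else open
        with opener(path, "rt") as fh:
            matches.extend(line for line in fh if needle in line)
    return matches

# e.g. scan_logs("/var/log/nginx", "m=contacts")
```

For ad-hoc questions ("which IPs hit m=contacts last week?"), this kind of scan is often all you need, with no database involved.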