There are a few ways you could do this -- a programming language (PHP, for example) or bash -- though I'd really question whether it's practical.
For example, say your log file is around 1,000 lines right now. If you want to insert each line as an individual entry, you're looking at 1,000 inserts. You'd then need to flush the log file using logrotate or in-app code so that you're not continuously inserting the same data.
If you want to do this live or nearly live, and the machine you're running this on is production and accepting numerous connections, you're going to be doing a lot of file reads (the logs), file processing (again, the logs), and writes to the database server (since each entry would be an individual insert).
With bash or a programming language, you can simply read the file into memory, loop over the data (with a foreach statement, for example), and build your query (or queries) inside the loop. Once every line has been read and the queries are built, you execute them, which performs the actual inserts.
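As a rough bash sketch of that read-loop-build pattern: the file name, table name, and column below are all placeholders for your own setup, and the script only prints the statements rather than executing them -- you'd pipe the output to your database client to actually run the inserts.

```shell
#!/usr/bin/env bash
# Sketch: read a log file line by line and build one INSERT per line.
# "access.log" and the "logs(entry)" table/column are assumptions.

logfile="access.log"

# Create a small sample log so the example is self-contained.
printf '%s\n' "GET /index.html 200" "GET /missing 404" > "$logfile"

while IFS= read -r line; do
    # Double up single quotes so the line is safe inside the SQL literal.
    escaped=${line//\'/\'\'}
    echo "INSERT INTO logs (entry) VALUES ('$escaped');"
done < "$logfile"

# Piping the output to your client would perform the actual inserts, e.g.:
#   ./build_inserts.sh | mysql -u someuser -p somedb
```

Note this is one round trip per line; for a big file you'd batch multiple rows per INSERT, or skip the loop entirely with a bulk-load facility if your database has one.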
Whether this is practical depends on what you really need to do with the data. Unless you have an absolute need for it to be in a database, you may well be better off setting up logrotate and simply scanning the files as needed.
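For the scan-as-needed alternative, a single grep across the live log and its rotated copies is often all you need; the directory, file names, and the "ERROR" pattern below are placeholders:

```shell
# Create a sample current log and one rotated copy so this is runnable.
mkdir -p logs_demo
printf 'INFO start\nERROR disk full\n' > logs_demo/app.log
printf 'ERROR timeout\nINFO done\n'   > logs_demo/app.log.1

# Search both files in one pass; -h suppresses the filename prefix
# so the output is just the matching lines.
grep -h "ERROR" logs_demo/app.log logs_demo/app.log.1

# If logrotate compresses old logs (app.log.2.gz, ...), zgrep
# searches them without decompressing by hand:
#   zgrep -h "ERROR" logs_demo/app.log.*.gz
```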