I run a PHP script that reads data rows from a file, analyses them and inserts them one by one into a local MySQL database:
$mysqli = new mysqli($db_host, $db_user, $db_password, $db_db);
if ($mysqli->connect_errno) {
    echo "Failed to connect to MySQL: (" . $mysqli->connect_errno . ") " . $mysqli->connect_error;
} else {
    /* As long as there is data in the file */
    while (...) {
        ... // analyse each row (contained in an object $data)

        /* Write it to the database table. */
        $mysqli->query($data->getInsertQuery($db_table));
    }
}
I have 40 million data rows. The first couple of million rows were inserted very quickly, but in the last 6 hours only two million were inserted (I'm now at 30 million), and the inserts seem to get slower and slower (so far, no index has been defined!).
I was wondering if there is a more efficient way of writing the data to the table. If possible, I'd prefer a solution without extra (temporary) files.
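Would something along these lines make sense? This is only a rough sketch of what I mean by batching the inserts into multi-row INSERT statements inside a transaction; the $rows iterator, getValuesTuple() and the column names (col_a, col_b) are placeholders, not my actual code:

// Sketch: flush rows in batches instead of one INSERT per row.
// $rows stands for whatever yields my analysed $data objects;
// getValuesTuple() is assumed to return an already-escaped "(...)" tuple.
function insertBatched(mysqli $mysqli, iterable $rows, string $db_table, int $batchSize = 1000): void
{
    $mysqli->autocommit(false);   // commit once per batch instead of per row
    $batch = [];

    foreach ($rows as $data) {
        $batch[] = $data->getValuesTuple();   // e.g. "('foo', 42)"

        if (count($batch) >= $batchSize) {
            $mysqli->query("INSERT INTO `$db_table` (col_a, col_b) VALUES " . implode(',', $batch));
            $mysqli->commit();
            $batch = [];
        }
    }

    if ($batch) {   // flush whatever is left over
        $mysqli->query("INSERT INTO `$db_table` (col_a, col_b) VALUES " . implode(',', $batch));
        $mysqli->commit();
    }
}

Is this kind of batching the right direction, or is there a better approach?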