
I run a PHP script that reads data rows from a file, analyses them and inserts them one by one into a local MySQL database:

$mysqli = new mysqli($db_host, $db_user, $db_password, $db_db);
if ($mysqli->connect_errno) {
  echo "Failed to connect to MySQL: (" . $mysqli->connect_errno . ") " . $mysqli->connect_error;
} else {
  /* As long as there is data in the file */
  while(...) {
    ... // analyse each row (contained in an object $data)
    /* Write it to the database table. */
    $mysqli->query($data->getInsertQuery($db_table));
  }
}

I have 40 million data rows. The first couple of million rows were inserted very fast, but in the last 6 hours only two million were inserted (I'm now at 30 million), and it seems to be getting slower and slower (so far, no index has been defined!).

I was wondering if there is a more efficient way of writing the data to the table. If possible, I'd prefer a solution without extra (temporary) files.

R_User
  • Have you considered using [`LOAD DATA INFILE`](http://dev.mysql.com/doc/refman/5.1/en/load-data.html)? What type of analysis must be performed on each row? – George Cummins May 21 '13 at 14:57
  • Sounds more like a MySQL optimization issue than PHP. You can also try to reduce the # of inserts by grouping multiple inserts into 1 extended insert, e.g. `INSERT INTO \`table_name\` (\`col1\`,\`col2\`) VALUES ('value1', 'value2'), ('value3', 'value4'), ('value5',....)` – Populus May 21 '13 at 14:58
  • Not sure if this applies in MySQL as it does in Oracle, but a `COMMIT` every few hundred or thousand rows might help. The millions of uncommitted rows could be slowing things down. – Ed Gibbs May 21 '13 at 15:06
  • Are you using InnoDB, and if yes - did you configure your MySQL instance at all? I'm asking because I'm doing 100mil rows inserts every now and then and it doesn't slow down as much as it does for you. Sounds like you're running on default settings and with autocommit on (each query is in its own transaction). – N.B. May 21 '13 at 15:24
  • I'd give the last 2 comments and the comment by STT LCU a chance to be accepted as an answer. Those 3 answers really add new content which has not been added to the mentioned "duplicate". – John Garreth May 22 '13 at 10:05
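
Putting the last few comments together, here is a minimal sketch of what batched, multi-row INSERTs with explicit commits could look like in the script above. `rowToTuple()`, `$data_file` and the column names are hypothetical placeholders for the analysis step and schema from the question:

$mysqli = new mysqli($db_host, $db_user, $db_password, $db_db);
$mysqli->autocommit(false);               // one transaction per batch instead of per row

$fh        = fopen($data_file, 'r');
$batch     = array();
$batchSize = 1000;

while (($line = fgets($fh)) !== false) {
  /* Analyse one row and turn it into an already-escaped "('v1','v2')" tuple. */
  $batch[] = rowToTuple($line);

  if (count($batch) >= $batchSize) {
    /* One extended INSERT for the whole batch, then commit it. */
    $mysqli->query("INSERT INTO `$db_table` (`col1`, `col2`) VALUES " . implode(',', $batch));
    $mysqli->commit();
    $batch = array();
  }
}

if ($batch) {
  /* Flush whatever is left over. */
  $mysqli->query("INSERT INTO `$db_table` (`col1`, `col2`) VALUES " . implode(',', $batch));
  $mysqli->commit();
}
fclose($fh);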

3 Answers


It will be more efficient to first translate your file into an SQL one (i.e. simply change your script to write the statements into a file) and then load it using the mysql command line, like this:

mysql -uuser -p dbname < file.sql

For such a large import, this will save you quite a bit of the overhead that comes from using PHP. Just remember to stream the data into the file one query at a time ;)
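
A rough sketch of the writing side, assuming the same read-and-analyse loop as in the question (`$data_file`, `analyseRow()` and the output file name are hypothetical placeholders; `getInsertQuery()` is the method from the question and is assumed to return the statement without a trailing semicolon):

$in  = fopen($data_file, 'r');
$out = fopen('inserts.sql', 'w');

while (($line = fgets($in)) !== false) {
  /* Analyse the row as before, but write the statement out instead of executing it. */
  $data = analyseRow($line);
  fwrite($out, $data->getInsertQuery($db_table) . ";\n");
}

fclose($in);
fclose($out);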

Tymoteusz Paul

It's possible to pre-generate the SQL INSERT statements, store them in a file, and then import the data into MySQL:

mysql --default-character-set=utf8 --user=your_user -p your_db < tbl.sql
Valery Viktorovsky

You can use prepared statements to speed it up a bit:

See http://devzone.zend.com/239/ext-mysqli-part-i_overview-and-prepared-statements/, which I found by googling for "stackoverflow mysqli prepared statements".
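
For example, something along these lines (a sketch only: `$data_file`, `analyseRow()`, the `getCol1()`/`getCol2()` accessors and the column names are hypothetical and need to be adapted to the actual row object). Preparing the INSERT once means MySQL parses it a single time, and only the parameter values travel per row:

$stmt = $mysqli->prepare("INSERT INTO `$db_table` (`col1`, `col2`) VALUES (?, ?)");
$fh   = fopen($data_file, 'r');

while (($line = fgets($fh)) !== false) {
  $data = analyseRow($line);
  /* bind_param() needs variables, so copy the values first. */
  $col1 = $data->getCol1();
  $col2 = $data->getCol2();
  $stmt->bind_param('ss', $col1, $col2);   // 'ss' = two string parameters
  $stmt->execute();
}

$stmt->close();
fclose($fh);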

Adder