Speeding up large numbers of mysql updates and inserts


Some useful links:

From the MySQL documentation:

"Speed of INSERT Statements" says:

  • If you are inserting many rows from the same client at the same time, use INSERT statements with multiple VALUES lists to insert several rows at a time. This is considerably faster (many times faster in some cases) than using separate single-row INSERT statements. If you are adding data to a nonempty table, you can tune the bulk_insert_buffer_size variable to make data insertion even faster.

  • If multiple clients are inserting a lot of rows, you can get higher speed by using the INSERT DELAYED statement.

  • For a MyISAM table, you can use concurrent inserts to add rows at the same time that SELECT statements are running, if there are no deleted rows in the middle of the data file.

  • When loading a table from a text file, use LOAD DATA INFILE. This is usually 20 times faster than using INSERT statements (see the PHP sketch after this list).

  • With some extra work, it is possible to make LOAD DATA INFILE run even faster for a MyISAM table when the table has many indexes.
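
As an illustration of the LOAD DATA INFILE point above, here is a minimal PDO sketch. The table import_table, its columns, and the CSV path are made-up examples, and LOCAL INFILE has to be enabled both on the server (local_infile=ON) and in the client connection:

    <?php
    // Sketch: bulk-load a CSV file with LOAD DATA LOCAL INFILE through PDO.
    // Assumed: a table import_table (sku, price) and a CSV file at /tmp/rows.csv.
    $pdo = new PDO(
        'mysql:host=localhost;dbname=test;charset=utf8mb4',
        'user',
        'password',
        [
            PDO::MYSQL_ATTR_LOCAL_INFILE => true,   // allow LOCAL INFILE from this client
            PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
        ]
    );

    $loaded = $pdo->exec(
        "LOAD DATA LOCAL INFILE '/tmp/rows.csv'
         INTO TABLE import_table
         FIELDS TERMINATED BY ','
         LINES TERMINATED BY '\\n'
         (sku, price)"
    );
    echo "Loaded $loaded rows\n";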


If you're not already doing so, use prepared statements (via mysqli, PDO, or another DB library that supports them). Reusing the same prepared statement and simply changing the parameter values will speed things up, since the MySQL server only has to parse the query once.
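
For example, a minimal PDO sketch of reusing one prepared statement; the prices table, its columns, and the sample data are made up for illustration, and emulated prepares are turned off so the statement is genuinely prepared once on the server:

    <?php
    // Sketch: prepare the INSERT once, execute it many times with different values.
    // Assumed: a table prices (sku, price) and $rows as an array of [sku, price] pairs.
    $pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'password', [
        PDO::ATTR_ERRMODE          => PDO::ERRMODE_EXCEPTION,
        PDO::ATTR_EMULATE_PREPARES => false,   // use native server-side prepared statements
    ]);

    $rows = [['A1', 9.99], ['B2', 4.50], ['C3', 12.00]];

    $stmt = $pdo->prepare('INSERT INTO prices (sku, price) VALUES (?, ?)');
    foreach ($rows as [$sku, $price]) {
        $stmt->execute([$sku, $price]);        // only the parameters change on each call
    }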

INSERTs can be batched by providing multiple sets of VALUES - a single query that inserts many rows is much faster than an equivalent number of individual queries each inserting one row.
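
A minimal sketch of such a batched INSERT, reusing the hypothetical prices table and the $pdo connection from the previous sketch; the placeholder list is generated to match the batch size:

    // Sketch: insert a whole batch of rows with a single multi-VALUES statement.
    $batch = [['A1', 9.99], ['B2', 4.50], ['C3', 12.00]];

    // One "(?, ?)" group per row, and a flat list of values to bind to them.
    $placeholders = implode(', ', array_fill(0, count($batch), '(?, ?)'));
    $params = [];
    foreach ($batch as $row) {
        $params[] = $row[0];
        $params[] = $row[1];
    }

    $stmt = $pdo->prepare("INSERT INTO prices (sku, price) VALUES $placeholders");
    $stmt->execute($params);   // one round trip for the whole batch

For very large batches it is worth chunking (for example, a few hundred rows per statement) so each query stays below the server's max_allowed_packet limit.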


What you suggest is correct. Try to reduce the number of queries you send to the server, since this saves the overhead of multiple round trips.

Of course, if the initial ranged SELECT query returns too much data, the bottleneck may end up on the PHP side.