
Import Multiple .sql dump files into mysql database from shell


cat *.sql | mysql? Do you need them in any specific order?
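If the order does matter, you can rely on the shell expanding *.sql alphabetically or list the files explicitly. A minimal sketch, assuming a hypothetical database named mydb and dump files prefixed so they sort correctly:

# Glob expansion is alphabetical, so numeric prefixes give a predictable order
cat 01_schema.sql 02_data.sql 03_grants.sql | mysql -u root -p mydb

# When order doesn't matter, just pipe everything at once
cat *.sql | mysql -u root -p mydb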

If you have too many to handle this way, then try something like:

find . -name '*.sql' | awk '{ print "source",$0 }' | mysql --batch

This also gets around some problems with passing script input through a pipeline, though you shouldn't have any problems with pipeline processing under Linux. The nice thing about this approach is that the mysql utility reads each file itself instead of having it read from stdin.
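As a sketch of the same idea with the connection options and database spelled out (the user and database name below are placeholders, not part of the original answer), adding sort also pins down the import order:

find . -name '*.sql' | sort | awk '{ print "source",$0 }' | mysql --batch -u root -p your_database

Each dump is still opened by the mysql client via its source command rather than being streamed through the pipe.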


One-liner that reads in all .sql files and imports them:

for SQL in *.sql; do DB=${SQL/\.sql/}; echo importing $DB; mysql $DB < $SQL; done

The only trick is the bash substring replacement to strip out the .sql to get the database name.
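To make that expansion explicit (the file name below is just an illustration):

SQL=customers.sql
DB=${SQL/\.sql/}   # substring replacement -> "customers"
DB=${SQL%.sql}     # suffix removal gives the same result and only matches at the end of the name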


There is a superb little script at http://kedar.nitty-witty.com/blog/mydumpsplitter-extract-tables-from-mysql-dump-shell-script which will take a huge mysqldump file and split it into a single file for each table. Then you can run this very simple script to load the database from those files:

for i in *.sql
do
  echo "file=$i"
  mysql -u admin_privileged_user --password=whatever your_database_here < $i
done

mydumpsplitter even works on .gz files, but it is much, much slower than gunzipping first, then running it on the uncompressed file.
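If you take the gunzip-first route, the decompression step is simply the following (the file name is a placeholder, and the exact mydumpsplitter invocation depends on the version of the script, so it is not shown here):

gunzip < huge_dump.sql.gz > huge_dump.sql

Then run mydumpsplitter on huge_dump.sql.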

I say huge, but I guess everything is relative. It took about 6-8 minutes to split a 2000-table, 200MB dump file for me.