Tips for optimizing an sqlite database with over a gig of data in it? [closed]


Have you read the SQLite Optimization FAQ (a few years old, but still seems useful)?

I don't think 1 GB is particularly large, even for SQLite. It can certainly handle much larger databases stably.


An update: we've had the most success with more intelligent ordering of data at import time, aggressive indexing (which more than doubled the size of the db file), tuning Windows XP settings to make it behave more like a server than a desktop, changing the order of joins (you can't rely solely on the optimizer), and measuring progress with a performance test suite (just a harness for running queries and taking measurements).
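
For concreteness, a minimal sketch of such a harness in Python with the built-in sqlite3 module; the database path, table names, and queries below are hypothetical placeholders:

    import sqlite3
    import time

    # Hypothetical database path and queries -- substitute your own.
    DB_PATH = "big.db"
    QUERIES = [
        "SELECT COUNT(*) FROM orders WHERE customer_id = 42",
        "SELECT o.id FROM orders o JOIN customers c ON o.customer_id = c.id LIMIT 100",
    ]

    def time_query(conn, sql, runs=5):
        """Run a query several times and keep the best wall-clock time."""
        best = float("inf")
        for _ in range(runs):
            start = time.perf_counter()
            conn.execute(sql).fetchall()  # fetch all rows so the query actually runs
            best = min(best, time.perf_counter() - start)
        return best

    conn = sqlite3.connect(DB_PATH)
    for sql in QUERIES:
        print(f"{time_query(conn, sql):8.4f}s  {sql}")
    conn.close()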

The experience has been fun, and the end users are happy and able to work with the data.


If you are doing large imports of data, I found the most efficient way was to (see the code sketch after this list):

  1. Order your inserted data, preferably in primary key sequence
  2. Use prepared statements (doh)
  3. Drop any indexes
  4. Insert the data in bulk, wrapped in transactions (say 10,000 records at a time)
  5. Add your indexes back
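
A sketch of that import pattern using Python's built-in sqlite3 module; the table, index name, and data source are made up for illustration:

    import itertools
    import sqlite3

    def generate_rows():
        """Placeholder data source; yields rows already sorted by primary key (step 1)."""
        for i in range(100_000):
            yield (i, i % 1000, i * 0.5)

    conn = sqlite3.connect("big.db")
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS orders "
                "(id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")

    # Step 3: drop the index before the bulk load (index name is hypothetical).
    cur.execute("DROP INDEX IF EXISTS idx_orders_customer")

    # Steps 2 and 4: a prepared statement via '?' placeholders, executed in
    # batches of 10,000 rows, each batch wrapped in its own transaction.
    BATCH = 10_000
    rows = generate_rows()
    while True:
        batch = list(itertools.islice(rows, BATCH))
        if not batch:
            break
        with conn:  # commits on success, rolls back on error
            cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", batch)

    # Step 5: add the index back after the load.
    cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer_id)")
    conn.close()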

Also remember that SQLite does not yet optimize the OR operator in the WHERE clause well. You may be able to tweak things by applying De Morgan's laws to the WHERE clause to get it to use ANDs instead.
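
A toy illustration of that rewrite against an in-memory database (table and columns are made up): by De Morgan's laws, NOT (A OR B) is equivalent to NOT A AND NOT B, so the two queries below return the same rows.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (a INTEGER, b INTEGER)")
    conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, 2), (1, 3), (4, 2), (4, 5)])

    # NOT (a = 1 OR b = 2)  is equivalent, by De Morgan's laws, to
    # a <> 1 AND b <> 2
    with_or  = conn.execute("SELECT * FROM t WHERE NOT (a = 1 OR b = 2)").fetchall()
    with_and = conn.execute("SELECT * FROM t WHERE a <> 1 AND b <> 2").fetchall()
    assert with_or == with_and  # both return [(4, 5)]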