Autocomplete getting data from a huge table



If you're doing autocomplete, I'm assuming that you're looking for matches based on a prefix. The standard data structure for prefix-based lookups is a trie.

If you are unable to get adequate performance from PostgreSQL using an index and a prefix-based lookup (some string%), you can periodically run a full query over all 2 million rows and build a trie, keeping it in memory in parallel to the database.

The worst-case lookup time in a trie is O(m), where m is the length of your prefix, so once built it will answer autocomplete queries very quickly.
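A minimal sketch of the idea, assuming the 2 million values have already been pulled out of the database into a list of strings (class and method names here are hypothetical, not from any particular library):

```python
class TrieNode:
    """One node of the trie: a dict of child characters plus an end-of-word flag."""
    def __init__(self):
        self.children = {}
        self.is_word = False


class Trie:
    def __init__(self, words=()):
        self.root = TrieNode()
        for w in words:
            self.insert(w)

    def insert(self, word):
        # Walk/create one node per character.
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def complete(self, prefix):
        # Descend the prefix in O(m) steps, m = len(prefix).
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        # Collect every complete word in the subtree below the prefix.
        results, stack = [], [(node, prefix)]
        while stack:
            n, word = stack.pop()
            if n.is_word:
                results.append(word)
            for ch, child in n.children.items():
                stack.append((child, word + ch))
        return sorted(results)


trie = Trie(["post", "postgres", "postgresql", "python"])
print(trie.complete("post"))  # ['post', 'postgres', 'postgresql']
```

Rebuilding this structure periodically (say, on a timer or after bulk loads) trades a little staleness for lookups that never touch the database.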


You could add an index on the field being searched.

Also, if it's avoidable, don't use open-ended wildcards like %some string%; they really hurt performance, because a plain index can't be used for them. If possible, use some string%.
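A sketch of what this could look like in PostgreSQL (the table and column names are hypothetical). Note that for a btree index to serve LIKE 'prefix%' queries under a non-C locale, the index should be created with the text_pattern_ops operator class:

```sql
-- text_pattern_ops lets a btree index serve anchored LIKE 'prefix%'
-- queries regardless of the database's collation settings.
CREATE INDEX items_name_prefix_idx
    ON items (name text_pattern_ops);

-- Anchored prefix search: can use the index above.
SELECT name FROM items WHERE name LIKE 'some string%';

-- Open-ended search: a plain btree index can't help here,
-- so this forces a sequential scan.
-- SELECT name FROM items WHERE name LIKE '%some string%';
```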


If you can afford the extra insert/update time, maybe you can use the pg_trgm extension.

That link includes benchmarks on a table of 2 million records, showing the improvement in the best-case scenario.
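A minimal sketch of setting this up (table and column names are hypothetical). A GIN trigram index, unlike a btree, can also accelerate open-ended %substring% searches, at the cost of slower inserts/updates and a larger index:

```sql
-- Enable the extension (once per database).
CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- Trigram index on the searched column.
CREATE INDEX items_name_trgm_idx
    ON items USING gin (name gin_trgm_ops);

-- Both anchored and open-ended LIKE/ILIKE searches can now
-- use the trigram index.
SELECT name FROM items WHERE name ILIKE '%some string%';
```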