When to use Json over key/value tables in postgres for billions of rows

You can think about this in terms of query complexity. If the table has an index on a column stored alongside the JSON document (say, user_id), a simple index scan can fetch the whole JSON string very quickly.
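As a minimal sketch of that layout (the names mytable, user_id, and jsonvalue are placeholders, chosen to match the index example further down):

create table mytable (
    user_id   bigint primary key,  -- the primary key gives you a btree index for free
    jsonvalue jsonb not null       -- the whole json document; plain json works with ->> as well
);

-- fetching the whole document is a single index scan on user_id
select jsonvalue from mytable where user_id = 42;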

You then have to dissect the document on the client side, or you can pass it to functions inside Postgres, e.g. if you only want to extract the data for specific values.
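For example, Postgres's built-in ->> operator extracts a single field as text, so the extraction can happen server side rather than in the client (reusing the hypothetical table from above):

-- extract a single field server side instead of shipping the whole document
select jsonvalue ->> 'bookclub_id' as bookclub_id
from mytable
where user_id = 42;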

One of the most important features of Postgres when dealing with JSON is functional (expression) indexes. In contrast to a "normal" index, which indexes the value of a column directly, a functional index applies a function or expression to one or more column values and indexes the result. Postgres provides the ->> operator to extract a field of a JSON document as text; suppose you want the users that have bookclub_id = 1. You can create an index like

create index idx_bookclub_id on mytable ((jsonvalue ->> 'bookclub_id'));

Afterwards, queries like

select * from mytable where jsonvalue ->> 'bookclub_id' = '1';

are lightning fast.
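Note that the where clause has to use exactly the same expression as the index definition, otherwise the planner cannot match them (and since ->> returns text, the comparison is against '1', not 1). You can check that the index is actually used with explain:

explain select * from mytable where jsonvalue ->> 'bookclub_id' = '1';
-- the plan should show an index scan (or bitmap index scan) on idx_bookclub_id
-- once the table is large enough that the index beats a sequential scan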