
Join all PostgreSQL tables and make a Python dictionary


Why don't you create a Postgres function instead of a script?

Here is some advice that could help you avoid the memory error:

  • You can use a WITH clause (CTE), which makes better use of your memory.
  • You can create some physical tables for storing the information of different groups of tables in your database. These physical tables will avoid using a great amount of memory. After that, all you have to do is join those physical tables. You can create a function for it.
  • You can create a Data Warehouse by denormalizing the tables you need.
  • Last but not least: make sure you are using indexes appropriately.
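As a minimal sketch of the WITH-clause idea: the point is to let the database pre-join the tables server-side so only the joined rows travel to Python. The snippet below uses an in-memory SQLite database as a stand-in for PostgreSQL (the table and column names are invented for illustration), but the same WITH syntax works in Postgres via psycopg2:

```python
import sqlite3

# Stand-in for PostgreSQL: an in-memory SQLite database with two tiny
# tables sharing a user_id key. Table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE profiles (user_id INTEGER, name TEXT);
    CREATE TABLE orders   (user_id INTEGER, total REAL);
    INSERT INTO profiles VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO orders   VALUES (1, 9.5), (2, 3.0);
""")

# The WITH clause pre-joins the tables inside the database, so Python
# only ever sees the already-joined rows.
query = """
    WITH joined AS (
        SELECT p.user_id, p.name, o.total
        FROM profiles p
        JOIN orders o ON o.user_id = p.user_id
    )
    SELECT user_id, name, total FROM joined;
"""
raw_dict = {row[0]: {"name": row[1], "total": row[2]}
            for row in conn.execute(query)}
# raw_dict == {1: {'name': 'alice', 'total': 9.5},
#              2: {'name': 'bob',   'total': 3.0}}
```

The same pattern applies to the physical-table suggestion: materialize the join once, then query only the materialized table.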


I'm not certain this will help, but you can try pd.concat:

raw_dict = pd.concat([d.set_index('USER_ID') for d in df_arr], axis=1)

Or, to get a bit more distinction:

raw_dict = pd.concat([d.set_index('USER_ID') for d in df_arr], axis=1, keys=sql_tables)
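Here is a self-contained sketch of what that line produces, with toy DataFrames standing in for the real df_arr (the table and column names are invented for illustration):

```python
import pandas as pd

# Toy frames standing in for the per-table DataFrames in df_arr.
df_arr = [
    pd.DataFrame({"USER_ID": [1, 2], "name": ["alice", "bob"]}),
    pd.DataFrame({"USER_ID": [1, 2], "total": [9.5, 3.0]}),
]
sql_tables = ["profiles", "orders"]

# Align on USER_ID and concatenate column-wise; `keys` adds a top-level
# label per source table, so columns become a MultiIndex like
# ('profiles', 'name') and ('orders', 'total').
wide = pd.concat([d.set_index("USER_ID") for d in df_arr],
                 axis=1, keys=sql_tables)

# One way to turn the result into a plain dictionary keyed by USER_ID:
raw_dict = wide.to_dict(orient="index")
# raw_dict[1] == {('profiles', 'name'): 'alice', ('orders', 'total'): 9.5}
```

Note that pd.concat with axis=1 aligns rows by index, so users missing from one table end up with NaN in that table's columns rather than being dropped (an outer join, in SQL terms).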

If this isn't helpful, let me know and I'll delete it.