How to update selected rows with values from a CSV file in Postgres?
COPY the file to a temporary staging table and update the actual table from there. Like:
CREATE TEMP TABLE tmp_x (id int, apple text, banana text); -- but see below
COPY tmp_x FROM '/absolute/path/to/file' (FORMAT csv);
UPDATE tbl
SET    banana = tmp_x.banana
FROM   tmp_x
WHERE  tbl.id = tmp_x.id;
DROP TABLE tmp_x; -- else it is dropped at end of session automatically
If the imported table matches the table to be updated exactly, this may be convenient:
CREATE TEMP TABLE tmp_x AS SELECT * FROM tbl LIMIT 0;
Creates an empty temporary table matching the structure of the existing table, without constraints.
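The UPDATE ... FROM step is effectively a keyed join between the target table and the staging table. A minimal Python model of its semantics (plain dicts standing in for tbl and tmp_x, with illustrative values; no database involved):

```python
# Model tbl and tmp_x as {id: banana} mappings (simplified to one column).
tbl = {1: "old-1", 2: "old-2", 3: "old-3"}
tmp_x = {2: "new-2", 3: "new-3", 99: "ignored"}  # id 99 matches no target row

# UPDATE tbl SET banana = tmp_x.banana FROM tmp_x WHERE tbl.id = tmp_x.id
for row_id, banana in tmp_x.items():
    if row_id in tbl:  # rows without a match in tbl are left untouched
        tbl[row_id] = banana

print(tbl)  # {1: 'old-1', 2: 'new-2', 3: 'new-3'}
```

Note that staging rows with no matching id are simply ignored: this is an UPDATE, not an upsert.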
Privileges
SQL COPY requires superuser privileges for this. The manual:

COPY naming a file or command is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
The psql meta-command \copy works for any db role. The manual:

Performs a frontend (client) copy. This is an operation that runs an SQL COPY command, but instead of the server reading or writing the specified file, psql reads or writes the file and routes the data between the server and the local file system. This means that file accessibility and privileges are those of the local user, not the server, and no SQL superuser privileges are required.
The scope of temporary tables is limited to a single session of a single role, so the above has to be executed in the same psql session:
CREATE TEMP TABLE ...;
\copy tmp_x FROM '/absolute/path/to/file' (FORMAT csv);
UPDATE ...;
If you are scripting this in bash, be sure to wrap it all in a single psql call. Like:
echo 'CREATE TEMP TABLE tmp_x ...; \copy tmp_x FROM ...; UPDATE ...;' | psql
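The same single-session requirement can be met from a script by piping the whole statement sequence into one psql process. A sketch in Python (the file path and table names are the hypothetical ones from above; the psql invocation is guarded so the snippet only runs it when executed directly):

```python
import subprocess

# One script, one psql process: the temp table created by the first
# statement is still visible to the \copy and UPDATE that follow.
script = """
CREATE TEMP TABLE tmp_x (id int, apple text, banana text);
\\copy tmp_x FROM '/absolute/path/to/file' (FORMAT csv)
UPDATE tbl SET banana = tmp_x.banana FROM tmp_x WHERE tbl.id = tmp_x.id;
DROP TABLE tmp_x;
"""

def run_in_one_session(sql_script: str) -> None:
    # Feeding the script on stdin of a single psql call keeps every
    # statement in the same session (and the temp table alive).
    subprocess.run(["psql"], input=sql_script, text=True, check=True)

if __name__ == "__main__":
    run_in_one_session(script)
```

Note that \copy takes the rest of its line as arguments, so it carries no trailing semicolon here, while the plain SQL statements do.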
Normally, you need the meta-command \\ to switch between psql meta-commands and SQL commands in psql, but \copy is an exception to this rule. The manual again:
Special parsing rules apply to the \copy meta-command. Unlike most other meta-commands, the entire remainder of the line is always taken to be the arguments of \copy, and neither variable interpolation nor backquote expansion are performed in the arguments.
Big tables
If the imported table is big, it may pay to increase temp_buffers temporarily for the session (first thing in the session):
SET temp_buffers = '500MB'; -- example value
Add an index to the temporary table:
CREATE INDEX tmp_x_id_idx ON tmp_x(id);
And run ANALYZE manually, since temporary tables are not covered by autovacuum / auto-analyze:
ANALYZE tmp_x;
You can try the code below, written in Python. The input file is the CSV file whose contents you want to update into the table. Each row is split on commas, so for each row, row[0] is the value in the first column, row[1] the value in the second column, and so on.
import csv
import os

import psycopg2

conn = None
try:
    conn = psycopg2.connect("host=localhost dbname=prodmealsdb user=postgres password=blank")
    cur = conn.cursor()
    filepath = '/path/to/your/data_to_be_updated.csv'
    ext = os.path.splitext(filepath)[-1].lower()
    if ext == '.csv':
        with open(filepath) as csvfile:
            next(csvfile)  # skip the header row
            readCSV = csv.reader(csvfile, delimiter=',')
            for row in readCSV:
                print(row[3], row[5])
                cur.execute("UPDATE your_table SET column_to_be_updated = %s WHERE id = %s",
                            (row[5], row[3]))
        conn.commit()  # commit once, after all updates
        cur.close()
except (Exception, psycopg2.DatabaseError) as error:
    print(error)
finally:
    if conn is not None:
        conn.close()
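One UPDATE per row also means one server round trip per row. For larger files, psycopg2 ships a real helper, psycopg2.extras.execute_batch, that groups many statements per round trip. A sketch assuming the same hypothetical table and 6-column CSV layout; the parsing is demonstrated on an inline sample so that part runs without a database:

```python
import csv
import io

def rows_to_params(csvfile):
    """Yield (new_value, id) tuples in the order the UPDATE placeholders expect."""
    reader = csv.reader(csvfile, delimiter=',')
    next(reader)  # skip the header row
    for row in reader:
        yield (row[5], row[3])

# Inline sample with the same 6-column shape as the hypothetical CSV.
sample = io.StringIO(
    "c0,c1,c2,id,c4,val\n"
    "x,x,x,1,x,alpha\n"
    "x,x,x,2,x,beta\n"
)
params = list(rows_to_params(sample))
print(params)  # [('alpha', '1'), ('beta', '2')]

if __name__ == "__main__":
    # Only this part needs a running Postgres and the psycopg2 package.
    import psycopg2
    from psycopg2.extras import execute_batch
    with psycopg2.connect("host=localhost dbname=prodmealsdb user=postgres password=blank") as conn:
        with conn.cursor() as cur, open('/path/to/your/data_to_be_updated.csv') as f:
            execute_batch(
                cur,
                "UPDATE your_table SET column_to_be_updated = %s WHERE id = %s",
                rows_to_params(f),
            )  # the connection context manager commits on clean exit
```

For very large files, though, the staged COPY approach from the first answer stays the faster option, since it moves the data in bulk and does the matching in one set-based UPDATE.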