How to copy from CSV file to PostgreSQL table with headers in CSV file?

postgresql



This worked; the first row of the CSV contained the column names.

COPY wheat FROM 'wheat_crop_data.csv' DELIMITER ';' CSV HEADER;
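Note that COPY resolves the file path on the server. If the server process cannot read the file, psql's client-side \copy meta-command is a common alternative; a sketch with the same file and table names (the database name mydb is made up here):

```shell
# \copy reads the file with the client's permissions, relative to the
# client's working directory, and streams it to the server over STDIN.
psql -d mydb -c "\copy wheat FROM 'wheat_crop_data.csv' DELIMITER ';' CSV HEADER"
```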


With the Python library pandas, you can easily create column names and infer data types from a CSV file.

from sqlalchemy import create_engine
import pandas as pd

engine = create_engine('postgresql://user:pass@localhost/db_name')
df = pd.read_csv('/path/to/csv_file')
df.to_sql('pandas_db', engine)

The if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql('pandas_db', engine, if_exists='replace'). This works for additional input file types as well, docs here and here.
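A minimal runnable sketch of the same pattern; an in-memory SQLite engine stands in for the PostgreSQL connection string, and the table and column names are made up for illustration:

```python
import io

import pandas as pd
from sqlalchemy import create_engine

# In-memory SQLite stands in for 'postgresql://user:pass@localhost/db_name'.
engine = create_engine('sqlite://')

# A small semicolon-delimited CSV with a header row; read_csv takes the
# column names from the first line and infers the dtypes.
csv_data = io.StringIO("region;yield\nnorth;4.2\nsouth;3.8\n")
df = pd.read_csv(csv_data, sep=';')

# The first write creates the table; if_exists='replace' drops and recreates it.
df.to_sql('wheat', engine, index=False, if_exists='replace')

# if_exists='append' adds the same two rows again, doubling the row count.
df.to_sql('wheat', engine, index=False, if_exists='append')

print(pd.read_sql('SELECT COUNT(*) AS n FROM wheat', engine)['n'][0])  # 4
```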


Alternative: from the terminal, without server-side file permissions

The PostgreSQL COPY documentation, in its Notes section, says:

The path will be interpreted relative to the working directory of the server process (normally the cluster's data directory), not the client's working directory.

So, in general, a file path in COPY causes problems when using psql or any other client, even against a local server. And if you publish a COPY command for other users, e.g. in a GitHub README, your readers will hit the same problem.

The only way to express a relative path with the client's permissions is to use STDIN:

When STDIN or STDOUT is specified, data is transmitted via the connection between the client and the server.

as in this example:

psql -h remotehost -d remote_mydb -U myuser -c \
  "copy mytable (column1, column2) from STDIN with delimiter as ','" \
  < ./relative_path/file.csv