Pandas writing dataframe to other postgresql schema
Update: as of pandas 0.15, writing to a different schema is supported. You can then use the schema
keyword argument:
df.to_sql('test', engine, schema='a_schema')
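If you want to try the schema keyword without a PostgreSQL server at hand, here is a minimal self-contained sketch using SQLite, where an attached database stands in for a schema. The `a_schema` name and the single-connection pool setup are illustrative assumptions, not part of the original answer:

```python
import pandas as pd
from sqlalchemy import create_engine
from sqlalchemy.pool import StaticPool

# One shared in-memory SQLite connection, so the ATTACHed database
# (playing the role of a PostgreSQL schema) stays visible to pandas.
engine = create_engine(
    "sqlite://",
    poolclass=StaticPool,
    connect_args={"check_same_thread": False},
)
with engine.connect() as conn:
    # An attached SQLite database can be addressed like a schema.
    conn.exec_driver_sql("ATTACH DATABASE ':memory:' AS a_schema")

df = pd.DataFrame({"x": [1, 2, 3]})
df.to_sql("test", engine, schema="a_schema", index=False)

out = pd.read_sql("SELECT * FROM a_schema.test", con=engine)
print(out["x"].tolist())
```

Against a real PostgreSQL database you would skip the ATTACH step entirely and just pass the existing schema name, e.g. `df.to_sql('test', engine, schema='a_schema')`.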
Writing to a different schema is not yet supported by the read_sql
and to_sql
functions (but an enhancement request has already been filed: https://github.com/pydata/pandas/issues/7441).
However, you can work around this for now by using the object interface with PandasSQLAlchemy
and providing a custom MetaData
object:
meta = sqlalchemy.MetaData(engine, schema='a_schema')
meta.reflect()
pdsql = pd.io.sql.PandasSQLAlchemy(engine, meta=meta)
pdsql.to_sql(df, 'test')
Beware! This interface (PandasSQLAlchemy) is not yet really public and will still undergo changes in the next version of pandas, but this is how you can do it for pandas 0.14.
Update: PandasSQLAlchemy is renamed to SQLDatabase in pandas 0.15.
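On recent pandas versions the same workaround might look like the sketch below. Note that SQLDatabase is still a private interface and accepts the schema name directly, and that this sketch again substitutes SQLite (with an attached database as the schema) for PostgreSQL, so `a_schema` and the pool setup are assumptions for demonstration only:

```python
import pandas as pd
from sqlalchemy import create_engine
from sqlalchemy.pool import StaticPool

# One shared in-memory SQLite connection; the ATTACHed database
# stands in for a PostgreSQL schema.
engine = create_engine("sqlite://", poolclass=StaticPool,
                       connect_args={"check_same_thread": False})
with engine.connect() as conn:
    conn.exec_driver_sql("ATTACH DATABASE ':memory:' AS a_schema")

# SQLDatabase (the renamed PandasSQLAlchemy) takes the schema directly,
# instead of a custom MetaData object.
pdsql = pd.io.sql.SQLDatabase(engine, schema="a_schema")

df = pd.DataFrame({"x": [1, 2]})
pdsql.to_sql(df, "test", index=False)

out = pd.read_sql("SELECT * FROM a_schema.test", con=engine)
```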
Solved, thanks to joris' answer. The code was also improved, following joris' comment, by passing around a SQLAlchemy engine instead of connection objects.
import pandas as pd
import sqlalchemy
from sqlalchemy import create_engine

engine = create_engine(r'postgresql://some:user@host/db')
meta = sqlalchemy.MetaData(engine, schema='a_schema')
meta.reflect()
pdsql = pd.io.sql.PandasSQLAlchemy(engine, meta=meta)

df = pd.read_sql("SELECT * FROM xxx", con=engine)
pdsql.to_sql(df, 'test')