Execute SQL from file in SQLAlchemy
I was able to run .sql schema files using pure SQLAlchemy and some string manipulations. It surely isn't an elegant approach, but it works.
```python
from sqlalchemy import text

# Open the .sql file
with open('file.sql', 'r') as sql_file:
    # Build up one statement at a time
    sql_command = ''
    # Iterate over all lines in the sql file
    for line in sql_file:
        # Ignore commented and empty lines
        if not line.startswith('--') and line.strip('\n'):
            # Append the line to the command string
            sql_command += line.strip('\n')
            # If the command string ends with ';', it is a full statement
            if sql_command.endswith(';'):
                # Try to execute the statement and commit it
                try:
                    session.execute(text(sql_command))
                    session.commit()
                # Roll back on error so the session stays usable
                except Exception:
                    print('Oops')
                    session.rollback()
                # Finally, clear the command string
                finally:
                    sql_command = ''
```
It iterates over all lines in a .sql file, ignoring commented lines. It then concatenates the lines that form a full statement and tries to execute it. You just need a file handle and a session object.
You can do it with SQLAlchemy and psycopg2.
```python
import sqlalchemy

engine = sqlalchemy.create_engine(db_url)
with open(path) as file:
    escaped_sql = sqlalchemy.text(file.read())
    # engine.execute() was removed in SQLAlchemy 2.0;
    # execute on a connection and commit explicitly instead
    with engine.connect() as connection:
        connection.execute(escaped_sql)
        connection.commit()
```
Unfortunately I'm not aware of a good general answer for this. Some DBAPIs (psycopg2, for instance) support executing many statements at a time. If the files aren't huge, you can just load them into a string and execute them on a connection. For others, I would try to use a command-line client for that database and pipe the data into it using the subprocess module.
If those approaches aren't acceptable, then you'll have to go ahead and implement a small SQL parser that can split the file into separate statements. This is really tricky to get 100% correct, because you have to account for dialect-specific literal escaping rules, the character set in use, and any database configuration options that affect literal parsing (e.g. PostgreSQL's standard_conforming_strings).
If you only need to get this 99.9% correct, then some regexp magic should get you most of the way there.
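For completeness, a minimal sketch of that "99.9%" splitter; `naive_split_statements` is a made-up helper, and it knowingly breaks on semicolons or `--` sequences inside string literals:

```python
import re

def naive_split_statements(sql_script):
    """Roughly split a SQL script into statements.

    Strips '--' line comments, then splits on semicolons. This is NOT
    dialect-aware: a ';' or '--' inside a string literal will confuse it,
    so it is only suitable for simple schema files.
    """
    # Drop everything from '--' to the end of each line.
    no_comments = re.sub(r'--[^\n]*', '', sql_script)
    # Split on semicolons and discard empty fragments.
    return [s.strip() for s in no_comments.split(';') if s.strip()]
```

Each returned fragment can then be passed to `session.execute(text(...))` one at a time.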