Switching Django project from sqlite3 backend to postgresql fails when loading datadump
The problem is simply that you're getting the content types defined twice - once when you run syncdb, and once from the exported data you're trying to import. Since you may well have other items in your database that depend on the original content type definitions, I would recommend keeping those.
So, after running syncdb, run manage.py dbshell and in your database execute TRUNCATE django_content_type; to remove all the newly-defined content types. Then you shouldn't get any conflicts - on that part of the process, in any case.
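For concreteness, the sequence might look like the session below (the database prompt is illustrative). Note that on PostgreSQL a plain TRUNCATE is usually rejected because auth_permission has a foreign key to django_content_type, so you'll likely need CASCADE, which empties that referencing table as well:

    $ python manage.py syncdb
    $ python manage.py dbshell
    mydb=# TRUNCATE django_content_type CASCADE;
    mydb=# \q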
There is a big discussion about this on Django ticket 7052. The right way now is to use the --natural parameter, for example:

./manage.py dumpdata --natural --format=xml --indent=2 > fixture.xml
In order for --natural to work with your models, they must implement natural_key and get_by_natural_key, as described in the Django documentation on natural keys.
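A minimal sketch of that pattern, essentially the example from those docs, using a hypothetical Person model keyed on (first_name, last_name):

    from django.db import models

    class PersonManager(models.Manager):
        def get_by_natural_key(self, first_name, last_name):
            # loaddata uses this to resolve a serialized natural key
            # back to a row, instead of relying on primary keys.
            return self.get(first_name=first_name, last_name=last_name)

    class Person(models.Model):
        first_name = models.CharField(max_length=100)
        last_name = models.CharField(max_length=100)

        objects = PersonManager()

        class Meta:
            # The natural key must uniquely identify a row.
            unique_together = (('first_name', 'last_name'),)

        def natural_key(self):
            # dumpdata --natural serializes references to this object
            # as (first_name, last_name) rather than as its pk.
            return (self.first_name, self.last_name)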
Having said that, you might still need to edit the data before importing it with ./manage.py loaddata. For instance, if your applications have changed, syncdb will populate the table django_content_type, and you might want to delete the respective entries from the XML file before loading it.
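For orientation, the contenttypes entries to remove look roughly like this in a serialized XML fixture (pk and field values are illustrative):

    <object pk="1" model="contenttypes.contenttype">
      <field type="CharField" name="name">log entry</field>
      <field type="CharField" name="app_label">admin</field>
      <field type="CharField" name="model">logentry</field>
    </object>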
This worked for me. You probably want to ensure the server is stopped first, so that nothing written during the migration gets lost. Dump it:
$ python manage.py dumpdata --exclude auth.permission --exclude contenttypes --natural > db.json
Make sure your models don't have signal handlers (e.g. on post_save) or anything else that creates objects as a side effect of saving. If they do, comment them out temporarily, as sketched below.
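If you'd rather not comment the handlers out, temporarily disconnecting them works too. A minimal sketch, assuming a hypothetical create_profile receiver wired to User's post_save in myapp/signals.py:

    from django.contrib.auth.models import User
    from django.db.models.signals import post_save

    from myapp.signals import create_profile  # hypothetical receiver

    # Detach the handler so loaddata's saves don't create duplicate
    # related objects; reconnect it (or simply restart the process)
    # once the import has finished.
    post_save.disconnect(create_profile, sender=User)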
Edit settings.py to point to the new database.
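For reference, a PostgreSQL DATABASES block for a syncdb-era Django project might look like this; the database name and credentials are placeholders:

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',  # psycopg2 backend
            'NAME': 'mydb',        # placeholder database name
            'USER': 'myuser',      # placeholder credentials
            'PASSWORD': 'secret',
            'HOST': 'localhost',
            'PORT': '5432',
        }
    }

Then set up the new database: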
$ python manage.py syncdb
$ python manage.py migrate
Load the data:
./manage.py loaddata db.json