Django-compressor: how to write to S3, read from CloudFront?


I wrote a wrapper storage backend around the S3BotoStorage backend provided by django-storages/boto.

myapp/storage_backends.py:

import urlparse

from django.conf import settings
from storages.backends.s3boto import S3BotoStorage


def domain(url):
    return urlparse.urlparse(url).hostname


class MediaFilesStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        kwargs['bucket'] = settings.MEDIA_FILES_BUCKET
        kwargs['custom_domain'] = domain(settings.MEDIA_URL)
        super(MediaFilesStorage, self).__init__(*args, **kwargs)


class StaticFilesStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        kwargs['bucket'] = settings.STATIC_FILES_BUCKET
        kwargs['custom_domain'] = domain(settings.STATIC_URL)
        super(StaticFilesStorage, self).__init__(*args, **kwargs)

Where my settings.py file has...

STATIC_FILES_BUCKET = "myappstatic"MEDIA_FILES_BUCKET = "myappmedia"STATIC_URL = "http://XXXXXXXX.cloudfront.net/"MEDIA_URL = "http://XXXXXXXX.cloudfront.net/"DEFAULT_FILE_STORAGE = 'myapp.storage_backends.MediaFilesStorage'COMPRESS_STORAGE = STATICFILES_STORAGE = 'myapp.storage_backends.StaticFilesStorage'


I made a few different changes to settings.py:

AWS_S3_CUSTOM_DOMAIN = 'XXXXXXX.cloudfront.net'  # important: no "http://"
AWS_S3_SECURE_URLS = True  # the default, but must be set to False if using an alias on CloudFront
COMPRESS_STORAGE = 'example_app.storage.CachedS3BotoStorage'  # from the compressor docs (linked below)
STATICFILES_STORAGE = 'example_app.storage.CachedS3BotoStorage'

Compressor Docs

The solution above saved the files locally as well as uploading them to s3, which let me compress the files offline. If you aren't gzipping, the above ought to work for serving compressed files from CloudFront.
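For reference, the offline compression piece just relies on django-compressor's offline settings; something along these lines in settings.py (the exact values are only an example):

COMPRESS_ENABLED = True   # force compression on even when DEBUG = True, if desired
COMPRESS_OFFLINE = True   # pre-generate compressed files with: python manage.py compress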

Adding gzip adds a wrinkle:

settings.py

AWS_IS_GZIPPED = True

This, though, resulted in an error whenever a compressible file (css and js, according to storages) was pushed to s3 during collectstatic:

AttributeError: 'cStringIO.StringO' object has no attribute 'name'

This was due to some bizarre error having to do with the compression of the css/js files that I don't understand. I need these files locally, unzipped, and not on s3, so I could avoid the problem altogether by tweaking the storage subclass referenced above (and provided in the compressor docs).

new storage.py

from os.path import splitext

from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage


class StaticToS3Storage(S3BotoStorage):

    def __init__(self, *args, **kwargs):
        super(StaticToS3Storage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class('compressor.storage.CompressorFileStorage')()

    def save(self, name, content):
        ext = splitext(name)[1]
        parent_dir = name.split('/')[0]
        if ext in ['.css', '.js'] and not parent_dir == 'admin':
            self.local_storage._save(name, content)
        else:
            filename = super(StaticToS3Storage, self).save(name, content)
            return filename

This then saved all .css and .js files locally (excluding the admin files, which I serve uncompressed from CloudFront) while pushing the rest of the files to s3 (and not bothering to save them locally, though one could easily add the self.local_storage._save line).
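If you did want local copies of the non-css/js files as well, the save() method in the StaticToS3Storage class above could be extended; a rough sketch of that variant:

    def save(self, name, content):
        ext = splitext(name)[1]
        parent_dir = name.split('/')[0]
        if ext in ['.css', '.js'] and not parent_dir == 'admin':
            # source css/js stay local only, for compressor to pick up
            return self.local_storage._save(name, content)
        # everything else goes to s3, plus a local copy
        name = super(StaticToS3Storage, self).save(name, content)
        self.local_storage._save(name, content)
        return name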

But when I run compress, I want my compressed .js and .css files to get pushed to s3, so I create another subclass for compressor to use:

class CachedS3BotoStorage(S3BotoStorage):
    """
    django-compressor uses this class to gzip the compressed files and send them to s3;
    these files are then saved locally, which ensures that they only create fresh copies
    when they need to.
    """
    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class('compressor.storage.CompressorFileStorage')()

    def save(self, filename, content):
        filename = super(CachedS3BotoStorage, self).save(filename, content)
        self.local_storage._save(filename, content)
        return filename

Finally, given these new subclasses, I need to update a few settings:

COMPRESS_STORAGE = 'example_app.storage.CachedS3BotoStorage'  # from the compressor docs (linked above)
STATICFILES_STORAGE = 'example_app.storage.StaticToS3Storage'

And that is all I have to say about that.


It seems the problem was actually fixed upstream in Django: https://github.com/django/django/commit/5c954136eaef3d98d532368deec4c19cf892f664

The problematic _get_size method could probably be patched locally to work around it for older versions of Django.
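For older Django versions, a local monkeypatch along these lines might work. This is only a sketch of the idea, not the upstream patch itself; it assumes the failure is File._get_size trying to read self.file.name on an in-memory buffer, and that File still defines _set_size for its size property:

import os

from django.core.files.base import File

_original_get_size = File._get_size

def _patched_get_size(self):
    try:
        return _original_get_size(self)
    except AttributeError:
        # in-memory buffers (e.g. cStringIO) have no .name, so measure the
        # size by seeking to the end of the file-like object instead
        pos = self.file.tell()
        self.file.seek(0, os.SEEK_END)
        size = self.file.tell()
        self.file.seek(pos)
        return size

# the `size` property captured the original getter at class definition time,
# so rebind the property as well as the method
File._get_size = _patched_get_size
File.size = property(_patched_get_size, File._set_size)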

EDIT: Have a look at https://github.com/jezdez/django_compressor/issues/100 for an actual workaround.