Serving Rails assets from the cloud
Recently I have had to build several high-performance websites. They were simple sites, but they were heavy on images and we were expecting large amounts of traffic. I was using Heroku to host them, so I could easily scale up to a large number of dynos, but why have all of the extra HTTP requests for assets hit my Heroku app too?

So I knew I was going to offload all of my images, CSS, fonts, and JavaScript onto S3, and maybe even put CloudFront in front of it to speed things up further, but managing:

  1. copying all of those assets to s3
  2. being sure that every time I change an asset I upload it to S3
  3. managing the urls to the assets

was going to be a pain in the ass.

Luckily, with the help of Rails 3.2's asset pipeline and a little gem called 'asset_sync', everything can be made much simpler.

You start by adding the gem to your Gemfile:

gem 'asset_sync'

and run bundle install.

Next we need to set the asset host in config/environments/production.rb:

config.action_controller.asset_host = "//#{ENV['FOG_DIRECTORY']}"
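For context, here is roughly how that line sits inside the production config block. This is an illustrative sketch (the app name is hypothetical), not required boilerplate:

```ruby
# config/environments/production.rb
MyApp::Application.configure do
  # Protocol-relative URL: the browser picks http or https to match the page.
  # This assumes FOG_DIRECTORY (your bucket name) is also a valid hostname;
  # otherwise point at "//#{ENV['FOG_DIRECTORY']}.s3.amazonaws.com" instead.
  config.action_controller.asset_host = "//#{ENV['FOG_DIRECTORY']}"
end
```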

Then we need to set some config variables on Heroku for the gem to use. You will need your S3 bucket name, AWS access key, and AWS secret key. Optionally, you can specify that you want the assets gzipped and which AWS region your bucket is located in.

heroku config:add AWS_ACCESS_KEY_ID=xxxx
heroku config:add AWS_SECRET_ACCESS_KEY=xxxx
heroku config:add FOG_DIRECTORY=xxxx
heroku config:add FOG_PROVIDER=AWS
# and optionally:
heroku config:add FOG_REGION=us-west-1
heroku config:add ASSET_SYNC_GZIP_COMPRESSION=true
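If you prefer keeping these settings in code rather than having the gem read them straight from the environment, asset_sync also accepts a configure block in an initializer. A minimal sketch, assuming the option names from the gem's README:

```ruby
# config/initializers/asset_sync.rb
AssetSync.configure do |config|
  config.fog_provider          = 'AWS'
  config.fog_directory         = ENV['FOG_DIRECTORY']
  config.aws_access_key_id     = ENV['AWS_ACCESS_KEY_ID']
  config.aws_secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
  config.fog_region            = ENV['FOG_REGION']  # optional
  config.gzip_compression      = true               # optional
end
```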

If you are only going to use this in production, that is all you need, but if you are running your app in any other environment (e.g. staging) you will need to run:

heroku labs:enable user-env-compile -a myapp

Without this, your config variables are not available during slug compilation, so everything will be compiled as if it were production, which may cause some issues.

Now the next time you push to Heroku your assets will be uploaded to S3, and all of your links/sources will point to the S3 URLs (provided you used the Rails view helpers).
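For example, with the asset host set, the standard view helpers generate bucket URLs automatically (the filenames here are hypothetical):

```erb
<%= stylesheet_link_tag "application" %>
<%= image_tag "logo.png" %>
<%# both now resolve to //your-bucket/assets/... with fingerprinted filenames %>
```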