Automatic Static File Versioning System Upgrade

Sun, Feb 05, 2012 at 4:15PM

Today, I'm one step closer to finishing the new skinning system for my custom web site platform.

Most of the time, a web site developer uploads a new version of a file such as a stylesheet via FTP under the same filename, so any users who have already downloaded the file keep seeing the old version.  Opening a new browser window or refreshing usually loads the new version, but you can't expect users to refresh, so for a period of time existing users see the site broken or missing new JavaScript functionality, which can be minor or serious depending on the changes.  Out of laziness, most developers simply ignore this problem, but I made it a goal to address it.
http://www.carlosring.com/stylesheets/zsystem.css

Common Solutions

Renaming your static files every time they change avoids the problem, but it creates extra manual effort: you have to remember to update the HTML includes everywhere they occur each time you make a change, and you have to be consistent with your naming conventions so you don't lose track of which file is the original.  It also leaves old copies of the files on the file system that may stay online forever, creating a mess.  Laziness and human error will still cause this to fail sometimes through typos and forgetting.
http://www.carlosring.com/stylesheets/zsystem2.css
http://www.carlosring.com/stylesheets/zsystem3.css
http://www.carlosring.com/stylesheets/zsystem4.css
etc
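The renaming step itself can be automated by deriving the new filename from a hash of the file's contents, so the name changes exactly when the content does.  This is a generic Python sketch of that idea, not part of my system, and `hashed_name` is a name I made up for illustration:

```python
import hashlib
import pathlib

# Illustrative sketch: derive a version-stamped filename from a hash
# of the file's contents, so the name changes only when the content does.
def hashed_name(path):
    p = pathlib.Path(path)
    digest = hashlib.md5(p.read_bytes()).hexdigest()[:8]
    return f"{p.stem}.{digest}{p.suffix}"
```

Even automated, you still have to update every HTML include to point at the new name, which is exactly the manual burden described above.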

You could turn the static requests into dynamic requests that change the HTTP caching headers, or disable caching globally, but by disabling browser caching you make the web site load more slowly on every request.  This approach really only makes sense when you can't control the entire source code, such as with a third-party app on Rackspace Cloud Sites.
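The header-based workaround looks roughly like this; a minimal Python sketch, where `no_cache_headers` is an illustrative helper, not anything from my platform:

```python
# Illustrative sketch: headers a dynamic handler would send to force
# the browser to revalidate the file on every single request.
def no_cache_headers():
    return [
        ("Cache-Control", "no-cache, no-store, must-revalidate"),
        ("Pragma", "no-cache"),  # HTTP/1.0 fallback
        ("Expires", "0"),        # treat any cached copy as already stale
    ]
```

Every request then hits the server again, which is exactly the slowdown described above.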

My Solution

My new system solves the problem.  It stores a snapshot of the file system in RAM and in the database, and updates it when certain events occur where a file might have changed, so there is no per-request performance loss.  Additionally, by using rewrite rules, the request is still a static file request, which can be cached by the web server and served more quickly than going through PHP or CFML.  Because all new and old versions link to the same file, there are no extra copies cluttering the file system, and the developer can keep working on the same file without worrying about which version is newest.

I just have to replace my CSS and JS includes
<link rel="stylesheet" href="/stylesheets/style.css" type="text/css" />

with a function that outputs the new version's URL automatically.
#request.zos.skin.includeCSS("/stylesheets/style.css")#

Examples of the versioned URLs:
http://www.carlosring.com/stylesheets/style.css?zversion=159-159
http://www.carlosring.com/z/stylesheets/zOS.css?zversion=151-1529
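Building the versioned URL is then trivial; here is a hypothetical Python equivalent of what the include function emits (the real function is CFML):

```python
def versioned_url(path, version):
    # The path part never changes, so the web server still maps the
    # request to the same static file; only the query string carries
    # the version token used for cache busting.
    return f"{path}?zversion={version}"

print(versioned_url("/stylesheets/style.css", "159-159"))
# -> /stylesheets/style.css?zversion=159-159
```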

Because the original URL stays the same and only the query string changes, the developer can tell where the file lives on the file system without having to know how this system works.

Works with .html, .js and .css files

This same system is set up to detect changes to .html files and store backups of every change.  Whether you use FTP or the browser to make skin changes, the system detects them and stores copies.  Performance is very high because all of the compiled skin information is kept in RAM for future requests, and compiled skins become ColdFusion components stored in RAM, which even eliminates the overhead of a file include that most existing CFML and PHP frameworks still have.  Even with hundreds of includes, everything loads at the speed of memory instead of the disk.

My web sites (and most sites on the internet) can't safely apply aggressive caching when they run the risk of some users getting a broken experience after changes, but as I transition to the new URLs for all the CSS / JS files, I'll be able to mark those files as permanently cacheable so the browser may not make an HTTP request at all.  This is what Rackspace does to save money, but because they apply it to everyone by default, it actually causes trouble for many dynamic web sites: a CMS may update files without changing their names, and the images become permanently stuck in the cache.
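With versioned URLs in place, the aggressive caching described above amounts to sending headers like these (a sketch; the one-year max-age is a common convention, not a value from my system):

```python
# Illustrative sketch: headers that become safe once every change
# produces a brand-new versioned URL, so a stale cached copy is
# simply never requested again.
def long_cache_headers(max_age=31536000):  # one year, in seconds
    return [("Cache-Control", f"public, max-age={max_age}")]
```

The key point is that permanent caching is only safe because the URL itself changes whenever the content does.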

This feature improves network performance AND rendering performance

Since most of a site's load time goes to these extra static files and to compiling the CSS and JavaScript, having a cached version spares the browser from recompiling them, which saves not just network time but CPU time too.  This makes a web site feel more responsive on a phone or tablet, since those devices have slower CPUs.  Most new browsers now compile JavaScript instead of re-interpreting it constantly, which is part of how JavaScript got faster in the last few years.

