Automatic Static File Versioning System Upgrade
Today, I'm one step closer to finishing the new skinning system for my custom web site platform.
Renaming your static files every time they change avoids the problem, but it creates extra manual effort: you have to remember to update the HTML includes everywhere they occur, and you have to stay consistent with your naming convention so you don't lose track of which file is the actual original. It also litters the file system with old copies that may stay online forever, creating a mess. Laziness and human error, typos and simple forgetting, will still cause this approach to fail sometimes.
You could turn the static requests into dynamic requests that change the HTTP caching headers, or disable caching globally, but by disabling browser caching you have made the web site load more slowly on every request. That approach only makes sense when you can't control the entire source code, for example with a third-party app on Rackspace Cloud Sites.
My new system solves the problem. It stores a snapshot of the file system in RAM and in the database, and updates it when events occur that could have changed a file, so there is effectively zero performance loss. Additionally, by using rewrite rules, the request is still a static file request that the web server can cache and serve more quickly than PHP or CFML could. Because all new and old versions actually link to the same file, there are no extra copies creating a mess. The developer can work on the same file forever without worrying about which version is the newest one.
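The core idea can be sketched in a few lines of Python (the helper name, the modification-time versioning, and the `?v=` query parameter are my assumptions about the general technique, not the actual CFML implementation):

```python
import os

# In-memory snapshot of file versions. In the real system this would be
# refreshed only when an event signals that a file may have changed,
# so no disk access happens on a normal request.
_version_cache = {}

def versioned_url(url_path, doc_root="/var/www"):
    """Return the static URL with a version query string appended."""
    if url_path not in _version_cache:
        # Cache miss: read the file's modification time once, then reuse it.
        full_path = os.path.join(doc_root, url_path.lstrip("/"))
        _version_cache[url_path] = int(os.path.getmtime(full_path))
    return f"{url_path}?v={_version_cache[url_path]}"
```

Because only the query string changes, the web server still resolves the request to the same physical file, and a cached lookup costs nothing at request time.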
I just have to replace my CSS and JS includes
<link rel="stylesheet" href="/stylesheets/style.css" type="text/css" />
with a function that outputs the new version's URL automatically.
Example of the versioned urls:
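Assuming a timestamp-based version value and a `?v=` parameter name (both illustrative, not necessarily the exact format my system emits), the rewritten include looks like this:

```html
<link rel="stylesheet" href="/stylesheets/style.css?v=20120414163205" type="text/css" />
```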
Because the original URL stays the same and only the query string changes, the developer can tell where the file exists on the filesystem without having to know how this system works.
Works with .html, .js and .css files
This same system is set up to detect changes to .html files and store a backup of every change. This means that whether you use FTP or the browser to make skin changes, the system will detect them and keep copies. Performance is very high because all of the compiled skin information is stored in RAM for future requests, and compiled skins become ColdFusion components that are also stored in RAM, which even eliminates the overhead of a file include that most existing CFML and PHP frameworks still have. Even with hundreds of includes, everything loads at the speed of memory instead of the disk.
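The change-detection-and-backup step can be sketched like this in Python (the hash comparison and the timestamped backup-file layout are my assumptions about the general technique, not the actual implementation):

```python
import hashlib
import os
import shutil
import time

# Last known content hash per file, kept in memory between checks.
_known_hashes = {}

def backup_if_changed(path, backup_dir):
    """If the file's content differs from the last known version,
    store a timestamped backup copy and return True; otherwise False."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if _known_hashes.get(path) == digest:
        return False  # unchanged, nothing to do
    _known_hashes[path] = digest
    os.makedirs(backup_dir, exist_ok=True)
    stamp = time.strftime("%Y%m%d%H%M%S")
    backup_name = f"{os.path.basename(path)}.{stamp}"
    shutil.copy2(path, os.path.join(backup_dir, backup_name))
    return True
```

Running this check from whatever event fires on an FTP upload or a browser-side save means every edit path produces a backup, no matter which tool made the change.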
My web sites (and most sites on the internet) can't apply aggressive caching safely, because some users risk a broken experience whenever changes are made. But as I transition all the CSS/JS files to the new URLs, I'll be able to mark those files as permanently cacheable, so the browser potentially doesn't make an HTTP request at all. This is what Rackspace does to save money, but because they apply it to everyone by default, it actually causes trouble for many dynamic web sites: a CMS may update files without changing their names, and images can end up permanently stuck in the cache.
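One common way to mark versioned static files as effectively permanent is a far-future `Cache-Control` header; as an illustration, an Apache-style config fragment (assuming mod_headers is available) might look like this:

```apache
# Safe only because every change produces a new versioned URL,
# so stale copies are never requested again.
<FilesMatch "\.(css|js)$">
    Header set Cache-Control "public, max-age=31536000, immutable"
</FilesMatch>
```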
This feature improves both network performance and rendering performance.