Jetendo real estate application features rewritten to improve performance and code maintenance

Sun, Feb 10, 2013 at 6:15PM

I spent the weekend rewriting large sections of the real estate application.

Start-up & cache flush speed-ups

There is a lot of data that loads when the server starts up or when I need to reset the cache.  It used to do XML parsing for this every time, which was adding several seconds to the load time.  I rewrote it so that it caches a native object to disk and loads that object instead of re-parsing the XML.  This dropped the load time from over 14 seconds to under 2 seconds.  Additionally, I made the process use multiple threads, which shaved off another second.
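
Roughly, the approach looks something like the sketch below (the file paths and the buildConfigStruct() helper are placeholders for illustration, not the real code):

<cfscript>
// Sketch: cache the parsed XML as a serialized native struct on disk so
// start-up and cache flushes can skip the expensive XML parsing step.
cachePath = expandPath("/cache/siteConfig.bin");
xmlPath = expandPath("/config/siteConfig.xml");
if (fileExists(cachePath)) {
	// Fast path: load the previously serialized struct instead of parsing XML.
	config = evaluate(fileRead(cachePath));
} else {
	// Slow path: parse the XML once, convert it, then cache the result.
	xmlDoc = xmlParse(fileRead(xmlPath));
	config = buildConfigStruct(xmlDoc); // placeholder conversion function
	fileWrite(cachePath, serialize(config));
}
</cfscript>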

The skin caching system was also taking about 1 second during start-up/cache flush, and I was able to move it into a separate thread so that it no longer slows down the request.
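
A minimal sketch of running that work in a background thread (buildSkinCache() is a placeholder name):

<cfscript>
// Sketch: rebuild the skin cache in a separate thread so the current
// request is not blocked while the cache is regenerated.
thread action="run" name="rebuildSkinCache" {
	application.skinCache = buildSkinCache(); // placeholder cache builder
}
// The request continues immediately; the thread finishes on its own.
</cfscript>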

These changes mostly make it easier for developers to do their work.

Scope access and Railo 4 debugging

The Railo 4 debugger is able to tell you when you don't refer to variables with an explicit scope.  If you run a large batch job with debugging on, Railo can take many times longer while it tracks thousands of implicit variable lookups.  I rewrote all the listing import and display scripts to use explicit scopes so that they perform better, especially when debugging is enabled.
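
For example, instead of letting Railo search each scope for an unscoped variable name, the rewritten scripts reference the scope directly (listingService and getListings() below are placeholder names):

<cfscript>
// Sketch: explicit scoping so Railo doesn't have to search (and, when
// debugging is on, log) multiple scopes for every variable access.
variables.listings = application.listingService.getListings(); // placeholder service
for (variables.i = 1; variables.i <= arrayLen(variables.listings); variables.i++) {
	variables.listing = variables.listings[variables.i];
	// ...work with variables.listing here instead of an unscoped "listing"...
}
</cfscript>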

Faster listing detail pages

On listing detail pages, there are many small property features describing the property.  The application used to process these dynamically on every request.  Now it caches them all in the database when the listings are first imported.  This makes the listing detail pages load up to 3 times faster thanks to significantly less CPU overhead.  I also rewrote this code to be more object oriented and implemented some basic tests.  The average load time for these scripts is now only 8ms, down from around 20ms.  This should improve our scalability since this is our most frequently used page.  It took a full day's work to make this change across the 10 different IDX vendor scripts.
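
The general idea, as a rough sketch (buildFeatureSummary() and listingDAO are placeholder names, not the actual code):

<cfscript>
// Sketch: build the feature summary once at import time and store it with
// the listing, so the detail page can output it without reprocessing.
listing.featureSummaryHTML = buildFeatureSummary(listing); // placeholder formatter
listingDAO.save(listing); // placeholder persistence call that writes the cached value
// The detail page then just outputs the cached column:
// writeOutput(listing.featureSummaryHTML);
</cfscript>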

Expired listing image deletion re-written

The old image delete scripts were unreliable because of concurrent access and the delay between processing listing data and processing images.  Sometimes a listing would go inactive for a few days, and by the time it came back on the market its images had already been deleted, because of how the RETS connector & RETS servers work.  The new script processes the files on disk instead of depending on the file import process.  It checks for images that have had no matching listing in the database for at least 30 days and deletes them.  Our server was getting close to running out of space due to over 300 GB of image files.  This new script should cut that space usage at least in half and minimize waste, while retaining images long enough to be sure we don't delete ones that are still needed.
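
A simplified sketch of the new cleanup logic (the directory path, file naming convention, and listingExists() helper are assumptions for illustration, and the file's last-modified date stands in for the 30-day check here):

<cfscript>
// Sketch: walk the image directory and delete files that have no matching
// listing in the database and haven't been touched for at least 30 days.
imageDir = expandPath("/listing-images/");
files = directoryList(imageDir, false, "query");
for (row = 1; row <= files.recordCount; row++) {
	fileName = files.name[row];
	listingId = listFirst(fileName, "_"); // assumes names like 123456_1.jpg
	if (!listingExists(listingId) && dateDiff("d", files.dateLastModified[row], now()) >= 30) {
		fileDelete(imageDir & fileName);
	}
}
</cfscript>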

Image file names that don't match the listing id

Some realtor associations use a separate media table for images, which causes the images to have a different ID from the MLS listing.  This leads to running extra queries or having to pass more data around to find and display the images, and it has also made it difficult to delete images when they expire.  For the associations that do this, we now rename the images to use the listing ID instead, which should solve the related complexity and problems.
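
A rough sketch of the rename step (all of the variable names below are placeholders):

<cfscript>
// Sketch: rename a media-table image so the file name is based on the MLS
// listing id rather than the separate media record id.
oldPath = imageDir & mediaId & ".jpg";                        // named after the media record id
newPath = imageDir & listingId & "_" & photoNumber & ".jpg";  // named after the listing id
if (fileExists(oldPath) && !fileExists(newPath)) {
	fileMove(oldPath, newPath);
}
</cfscript>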

Conclusion

This work was quite difficult to get done - it took over 20 hours, I think.  Several bugs occurred along the way, and some listing images were deleted that shouldn't have been.  Within a few hours or days, everything should be back to normal.

Our real estate web application is even better!

