Unifying cached content for different websites
Posted by Max Robbins on October 17th, 2011

Many of our customers have multiple addresses for referencing their sites. Sites often have subsets for dealers, agents, or SEO purposes. While the majority of the content is the same, there are small differences that make unique configurations worthwhile for optimizing the individual domains.
The issue is that this leads to a lot of identical cached content eating up memory.
In a dynamic cache like aiScaler, where all objects reside in memory, this can require an unnecessary amount of additional RAM.
The solution:
We have enabled aiScaler to identify common cached elements using our regular expression technology. All HTTP requests that match these patterns become a single memory object shared across the sites.
You can still get as granular as you need with inclusions and exclusions on a domain basis.
The bottom line is that you can now maximize the efficiency of common shared objects in a dynamic environment with some straightforward, easy-to-use configuration.
WARNING IT GETS GEEKIER BELOW THIS LINE
You might have a setup where a number of different sites are cached via aiScaler, for example a.com, b.com and c.com. Some of the content on these sites is identical, and you’d like to cache (store) only a single copy of it, effectively sharing it across the different sites.
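Without any special configuration, each site keeps its own copy, because the cache signature is prefixed with the website’s hostname by default. As a rough sketch (the one-day TTL is just illustrative), a plain per-site setup would look like this and would end up caching three separate copies of /css/main.css:
website a.com
pattern /css simple 1d
website b.com
pattern /css simple 1d
website c.com
pattern /css simple 1d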
For example, let’s assume that you want to share URLs that contain the /css and /images prefixes. To accomplish this, we simply tell aiScaler to use a different signature prefix – instead of matching the website’s hostname, we configure a different prefix via the pattern-level sig_hostname setting.
For example:
website a.com
pattern /css simple 1d
sig_hostname css_content
website b.com
pattern /css simple 1d
sig_hostname css_content
website c.com
pattern /css simple 1d
sig_hostname css_content
Now, when a request is made for a.com/css/main.css, b.com/css/main.css or c.com/css/main.css, the same cached response will be returned, saving the overhead of having to maintain three different copies of the same content.
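The /images prefix mentioned earlier can be shared in exactly the same way. As a sketch (the shared label images_content and the one-day TTL are just illustrative), each website section would carry an additional pattern entry alongside the /css one:
website a.com
pattern /css simple 1d
sig_hostname css_content
pattern /images simple 1d
sig_hostname images_content
The same two pattern entries would be repeated under b.com and c.com.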
As usual, make sure that the URLs are cacheable and are not website-specific in any way.