Cache Issues
Client-side (browser) caching must be controlled by the developer. If it is ignored, you will likely run into problems, especially with dynamic web sites and applications.
Too Much Caching
- stale static content pages will be shown to users
- stale versions of dynamic web pages will be shown to users
- new stylesheets and scripts will not be loaded with the page, so layout and JavaScript bugs appear
- PDF or other binary file downloads may fail
- PDF or other binary inline displays may fail
Too Little Caching
- unnecessary page loads on every request, even when the content has not changed
- slower performance for the client moving from page to page
- increased server load under high traffic
GET requests
The same page request can be cached by the client, or by the client's proxy server, even if you do your part to tell it not to, because some organizations have very aggressive caching policies to control bandwidth. One way to prevent this is to always change the URL, for example by appending a timestamp to the end of it. You can do this through PHP or JavaScript.
OLD URL: http://www.example.com/products/list/
NEW URL: http://www.example.com/products/list/?ts=<?=time()?>
On output, the NEW URL might look something like this:
http://www.example.com/products/list/?ts=1258776186
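A minimal PHP sketch of the timestamp approach (the path and link text are just placeholders):

<?php
// Append the current Unix timestamp as a cache-busting parameter.
// The value changes every second, so the browser and any proxy
// see a new URL and cannot serve a stale cached copy.
$url = 'http://www.example.com/products/list/?ts=' . time();
?>
<a href="<?php echo htmlspecialchars($url); ?>">Product list</a>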
It is also common to do this but to encode the entire GET string, for example with base64 encoding. You can pass any variables as usual; just add the timestamp as shown above. The resulting cryptic string looks less inviting for a user to tamper with than a readable series of variables and values. Pass the encoded string as the value of some generic variable name, such as 'do'. It might turn out like this:
http://www.example.com/products/list/?do=dHM9MTI1ODc3NjQwOQ==
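A hedged sketch of how such a URL could be produced and read back in PHP; the 'do' parameter name and the receiving-page code are just illustrative:

<?php
// Build the query string (here only the timestamp), then base64-encode it.
$query   = http_build_query(array('ts' => time()));  // "ts=1258776409"
$encoded = base64_encode($query);                     // "dHM9MTI1ODc3NjQwOQ=="
$url = 'http://www.example.com/products/list/?do=' . urlencode($encoded);

// On the receiving page, reverse the process to recover the variables.
parse_str(base64_decode($_GET['do']), $vars);
echo $vars['ts']; // the original timestamp

Note that base64 is only an encoding, not encryption; it hides nothing from a determined user, it just looks less inviting.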
session_cache_limiter()
PHP has a function named session_cache_limiter(). This allows you to easily control caching of pages that use sessions. Some claim that it does not work on its own, and that other settings or tags are necessary to accomplish the intended behavior.
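A minimal sketch of its use; the limiter must be set before session_start() is called:

<?php
// 'nocache' asks PHP to send headers that discourage any caching of
// this page. Other accepted values are 'public', 'private', and
// 'private_no_expire'.
session_cache_limiter('nocache');
session_start();

// Called with no argument, the function returns the current limiter.
echo session_cache_limiter(); // nocache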
Cache-Control
Pragma
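If session_cache_limiter() does not behave as expected in your setup, the usual fallback is to send these two headers yourself. A hedged sketch, using one common set of Cache-Control directives (not the only valid choice):

<?php
// These calls must come before any output is sent to the browser.
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Pragma: no-cache'); // for HTTP/1.0 clients and proxies
?>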