Cache Issues

Client-side (browser) caching must be controlled by the developer. If you ignore it, you will likely run into problems, especially with dynamic web sites and applications.

Too Much Caching

Too Little Caching

GET requests

The same page request can be cached by the client, or by the client's proxy server, even if you do your part to tell it not to. This is because some organizations enforce very aggressive caching policies to control bandwidth. One way to prevent this is to always change the URL, which can be done by appending a timestamp to the end of the URL. You can do this with PHP or JavaScript.

OLD URL:
http://www.example.com/products/list/

NEW URL:
http://www.example.com/products/list/?ts=<?=time()?>

...the NEW URL might look something like this on output:
http://www.example.com/products/list/?ts=1258776186
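
Here is a minimal PHP sketch of the idea; the cache_bust() helper name is just for illustration. It appends a timestamp parameter whether or not the URL already carries a query string:

<?php
// Append a timestamp so every request URL is unique to caches.
function cache_bust($url)
{
    $sep = (strpos($url, '?') === false) ? '?' : '&';
    return $url . $sep . 'ts=' . time();
}

echo cache_bust('http://www.example.com/products/list/');
// e.g. http://www.example.com/products/list/?ts=1258776186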

It is also common to go a step further and encode the entire GET string, for example with base64. You can pass any variables as usual; just add the timestamp as shown above. The resulting cryptic string looks less inviting for a user to tamper with than a readable series of variable names and values. Just pass the encoded string as the value of some generic variable name, such as 'do'. It might turn out like this:

http://www.example.com/products/list/?do=dHM9MTI1ODc3NjQwOQ==
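
One way to build and unpack such a URL in PHP is sketched below; the variable names and the extra 'page' parameter are illustrative only:

<?php
// Build the query string, then pack it all into a single base64 'do' parameter.
$query = http_build_query(array('ts' => time(), 'page' => 2));
$url   = 'http://www.example.com/products/list/?do=' . urlencode(base64_encode($query));

// On the receiving page, decode it back into an array of variables.
parse_str(base64_decode($_GET['do']), $params);
// $params['ts'] holds the timestamp, $params['page'] holds 2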

session_cache_limiter()

PHP has a function named session_cache_limiter(). It lets you easily control the cache-related headers PHP sends with a session. Some report that it does not work on its own, and that additional headers or meta tags are necessary to get the intended behavior.
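
For example, assuming PHP's default session handling, the call must come before session_start():

<?php
// session_cache_limiter() only takes effect if called before session_start().
// 'nocache' tells browsers and proxies not to cache; other accepted values
// are 'public', 'private', and 'private_no_expire'.
session_cache_limiter('nocache');
session_start();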

Cache-Control

Cache-Control is the HTTP/1.1 response header that tells browsers and intermediate proxies whether, and for how long, a response may be cached.
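
In PHP you can send it with header(), before any output. One reasonable combination for fully dynamic pages, though not the only one:

<?php
// Must be called before any output is sent to the browser.
header('Cache-Control: no-store, no-cache, must-revalidate');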

Pragma

Pragma: no-cache is the older HTTP/1.0 directive. Cache-Control largely supersedes it, but it is still commonly sent alongside Cache-Control for the benefit of older clients and proxies.
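
Again sent with header(), before any output:

<?php
// HTTP/1.0 fallback; send together with Cache-Control.
header('Pragma: no-cache');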
