Also: I don't like systems that just check the age of the cache; I'd prefer a more elaborate system that, using a single simple DB query, checks whether the input data has actually changed before regenerating the file. That seems far more useful to me. But, of course, you have to hand-code that query for each page.
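A minimal sketch of that idea, assuming a PDO connection and an `articles` table with an `updated_at` column (the table, column, path, and `render_article()` helper are all illustrative, not from the article):

```php
<?php
// Sketch only: regenerate the cached page only when the source
// data is newer than the cached copy, using one cheap DB query.
$cacheFile = '/var/cache/pages/article_42.html';

$stmt = $pdo->prepare('SELECT MAX(updated_at) FROM articles WHERE id = ?');
$stmt->execute([42]);
$lastChanged = strtotime($stmt->fetchColumn());

if (!file_exists($cacheFile) || filemtime($cacheFile) < $lastChanged) {
    $html = render_article(42);          // hypothetical page renderer
    file_put_contents($cacheFile, $html);
}
readfile($cacheFile);
```

The query is the whole trick: one indexed `MAX(updated_at)` lookup is usually far cheaper than re-rendering the page, but it does have to be written per page.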
http://www.oracle.com/technetwork/articles/dsl/white-php-par...
BTW, their demonstration code, where they hit filemtime constantly to see whether a file is outdated, is also a bad recommendation, since there is OS overhead for every stat call, especially exaggerated if you use a NAS. Instead, use the time of day as a zero-load way to check whether something is out of date, and only update during times of lower load.
> if (file_exists($file) && (filemtime($file) + $timeout) > time()) {
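A rough sketch of the time-of-day alternative (the window, paths, and `regenerate_pages()` helper are assumptions for illustration, not from the article): instead of stat()ing the cache file on every request, compare the clock against a fixed off-peak rebuild time, with a single shared stamp file rather than a per-page filemtime call.

```php
<?php
// Sketch only: decide freshness from the clock, not from per-file
// stat calls. Pages are considered stale once today's 04:00
// low-traffic window has passed; one stamp file records the last
// rebuild for all pages.
$now       = time();
$rebuildAt = strtotime('today 04:00');
$stamp     = '/var/cache/pages/.stamp';
$generated = (int) @file_get_contents($stamp);

if ($now >= $rebuildAt && $generated < $rebuildAt) {
    regenerate_pages();                 // hypothetical bulk rebuild
    file_put_contents($stamp, $now);
}
```

This trades freshness granularity for load: every request during the day serves the same cached copy, and the expensive work happens once, off-peak.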
It would seem (at least on the surface) preferable (faster and simpler) to just serve static pages outright, with a cron job to overwrite (i.e., freshen) them periodically. That skips testing for the file's existence, the hash, time(), and the comparison.
While all those (trivial) operations are no doubt lightning fast, that code is already serving static pages, so the cache seems to have no real benefit (to me).
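That approach could look something like this (a sketch; the paths and the render script are illustrative). Writing to a temp file and then `mv`-ing it into place keeps the rename atomic, so the web server never serves a half-written page:

```
# crontab entry: refresh the static page every 10 minutes; the web
# server just serves the file directly, with no freshness checks.
*/10 * * * * php /var/www/render_index.php > /tmp/index.html.new && mv /tmp/index.html.new /var/www/static/index.html
```

PHP never runs on the request path at all here; the only cost per hit is the web server reading a file it was going to read anyway.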