On the subject of performance, putting the files on the server's file system (in the html directory) is a performance boost, because the request doesn't have to pass through Domino's database-security mechanisms and can be served fairly directly by the HTTP task. I believe this is the fastest option compared to pages, JavaScript libraries and file resources (we did some performance tests on this a few years ago), from a performance perspective that is. But as Tomas mentions, the files don't replicate, you need file access to the server (or an agent that detaches the files), and they are a pain to update. This is perhaps the best solution for "static" framework files (Ext, Dojo, YUI, Prototype et al).
If you combine gzip, server-location and cache-headers, you have come a long way. If you then reduce the number of external (JS) files per request (e.g. use one or a few large files instead of many), you can boost performance even more. And if you want to take it further you can always configure a CDN for all the files, which will make the pages load even faster (because of the limit on the number of simultaneous requests to a single domain, among other things). I used to compact/pack my larger JS files before going into production, but now I often just gzip the content instead (if I can).
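The combine-and-gzip idea can be sketched in a few lines (an illustration only, nothing Domino-specific; the file names and contents are made up):

```python
import gzip

# Three hypothetical JS files that would otherwise be three separate requests.
js_files = {
    "app.js":   b"function init() { console.log('init'); }\n" * 50,
    "nav.js":   b"function nav(target) { location.href = target; }\n" * 50,
    "forms.js": b"function validate(f) { return f.checkValidity(); }\n" * 50,
}

# Combine into a single file: one request instead of three.
combined = b"".join(js_files.values())

# Gzip the combined content: far fewer bytes over the wire.
compressed = gzip.compress(combined, compresslevel=9)

print(f"requests: {len(js_files)} -> 1")
print(f"bytes:    {len(combined)} -> {len(compressed)}")
```

Repetitive script content like this compresses extremely well, which is why gzip alone often buys you more than packing/minifying.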
Domino server-side cached content (using CacheValid) is also a very fast performer (a cached ?ReadForm, for example), and I use it often on public websites. The main drawback is that it's a "blunt" tool when it comes to configuration, and you can't really use it effectively on anything other than public sites: the same content is served regardless of who is accessing the page (the page is computed for the first user who accesses it). Server-side cached content combined with custom cache-headers is very fast, because the content is waiting in the server's memory ready to be served to you, instead of being computed/rendered for each request.
DAP works in a similar way to the server-side mechanism, but with cache-headers, and is far more intelligent (nice work, Tomas); it also reduces the amount of data transferred over the wire (80% on one of our servers). DAP combined with proper cache-headers, and perhaps a CDN, should be very efficient and enough for most web solutions, I think. Another thing that does wonders for performance is moving all JS references from the head to the body of a page (this lets browsers load the JS files more concurrently instead of in sequence). This might not be the pragmatist's way… but speed is always nice, isn't it :-)
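The head-to-body move can be illustrated with a small rewrite sketch (a naive regex approach, good enough for an illustration but no substitute for a real HTML parser):

```python
import re

def move_scripts_to_body(html):
    """Move external <script src=...> tags so they sit just before </body>."""
    # Find external script tags (naive: assumes simple, well-formed markup).
    scripts = re.findall(r'<script [^>]*src=[^>]*></script>', html)
    for tag in scripts:
        html = html.replace(tag, "")
    # Re-insert them at the end of the body, preserving their order.
    return html.replace("</body>", "".join(scripts) + "</body>")

page = ("<html><head><title>t</title>"
        '<script type="text/javascript" src="/html/app.js"></script>'
        "</head><body><p>Hello</p></body></html>")

print(move_scripts_to_body(page))
```

With the script reference at the end of the body, the browser can render the page content before (and while) the script downloads.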
The thing I really like about being able to cache content in a smart way is the effect it has on the server (the noticeable reduction in CPU- and memory utilization), combined with the fact that users actually feel a performance boost.
We often focus on CSS and JS files, and try to optimize those in every way we can, but it's equally important to optimize how we serve images (cache all images, for example). Images tend to be both large and numerous (compare the size of the images with the size of the JS files, and remember that each image normally equals one request).
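One simple way to make sure images (and other static files) get cached is a per-file-type policy; a hypothetical policy table (the lifetimes here are made up, tune them to taste) could look like:

```python
import os

# Hypothetical cache lifetimes per extension, in seconds.
CACHE_POLICY = {
    ".gif": 604800, ".jpg": 604800, ".png": 604800,  # images: one week
    ".js": 86400, ".css": 86400,                     # scripts/styles: one day
}

def cache_control_header(path):
    """Return a Cache-Control header value for a given resource path."""
    ext = os.path.splitext(path)[1].lower()
    max_age = CACHE_POLICY.get(ext)
    if max_age is None:
        return "no-cache"            # dynamic content: always revalidate
    return f"public, max-age={max_age}"

print(cache_control_header("/html/images/logo.png"))   # public, max-age=604800
print(cache_control_header("/site.nsf/home?ReadForm")) # no-cache
```

A repeat visitor then makes zero requests for the images, which usually removes far more requests than any JS optimization does.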
Sorry if this is a bit off topic… (but performance is always a fun topic)