
Script Libraries or File Resources


hakuseki - Post date: 2008-10-14 09:32

I have experienced some issues with Script Libraries in Domino: a size limitation can make the script inaccessible.

I first saw this when trying to add version 1.6 of prototype.js. The script wouldn't compile, and after googling a bit I found that there's a limit on how much code between { } Domino can handle... and this API has a lot ;)

So I had to move them to File Resources instead...

When we ported our application to "web 2.0" and removed old R5 Domino code, we used TeamStudio Configurator to search and replace. My experience was that the elements added as resources in the HTTP Header and JS Header were not found, so we had to convert them manually.

That's another reason to add all references manually to the HTTP Header with a link or script tag instead of using Resources.

Anyone else with experience on this topic?


Tomas Nielsen - Post date: 2008-10-14 09:41

When I have the luxury of file access to the server, I put the static script libraries in a "script" directory under the html directory. But that is rare, and it does not work with replication.

But most of the time I use pages to hold JavaScript code. Then I can apply HTTP headers as I please to cache them correctly.
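
For illustration (the database path, page name, and version here are made up), such a page is then referenced like any other script URL; a version parameter makes it safe to combine with far-future cache headers:

    // Sketch only: reference a Domino page that holds JS and is served
    // with a JavaScript content type. The v parameter is just a
    // cache-buster so long cache lifetimes never pin users to stale code.
    var JS_VERSION = "3"; // bump whenever the page content changes
    document.write('<script type="text/javascript" src="/app.nsf/scripts.js?OpenPage&v='
        + JS_VERSION + '"><\/script>');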


hakuseki - Post date: 2008-10-14 09:47

I was thinking of deploying to the server instead of keeping the files within the database, but we have limited access to the external servers' directories for our application...

But everything is doable in Domino - just a small agent to deploy the stuff ;)

 


Fredrik Stöckel - Post date: 2008-10-14 21:48

I'm also a fan of using pages as containers for JS. Simple, computable, directly editable, and so on...


Joacim Boive - Post date: 2008-12-05 21:53

I always use a Script Library if possible; when the size restrictions get in the way, my second choice is a File Resource.

Back in the day I used a Page as well, for the reasons you mention, but I don't think that's the optimal solution performance-wise. Also, since a while back that argument no longer holds, because you can use the Domino Developer plugin for Aptana to directly access both your JS and CSS:

http://www.jeffgilfelt.com/DominoAptana/

http://www.aptana.com/studio

Have a look at it if you haven't already; it simply rocks!

 

If you need to compute variables, I use a SCRIPT tag on the form/page to hold the variables, and keep the rest of the code in the external lib.
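
Something like this, just to illustrate the split (all the names here are made up):

    // 1) Inline SCRIPT tag on the form/page - holds only the values
    //    that Domino computes per request:
    var pageContext = {
        userName: "CN=Jane Doe/O=Acme", // e.g. from @UserName
        dbPath: "/apps/app.nsf"         // e.g. from @WebDbName
    };

    // 2) External, cacheable library - code only, it just reads the
    //    inline values:
    function viewUrl(viewName) {
        return pageContext.dbPath + "/" + viewName + "?ReadViewEntries";
    }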

 

/J


Fredrik Stöckel - Post date: 2008-12-06 23:08

I totally agree, Aptana is a fantastic IDE for JavaScript, and the Aptana plug-in for Domino works great. It has actually become my everyday tool. The option to fine-tune pages with custom HTTP headers, set the desired content type, and embed views (JS loaders) are some of the reasons why I still use pages as containers for JavaScript and CSS resources (and because it's very easy to modify a page quickly without launching an external tool; I'm lazy, can't help it).

On the subject of performance, putting the files on the server (in the html directory) is a boost, because the request doesn't have to pass through the Domino database security mechanisms and can be served fairly directly by the HTTP task. I believe this is the fastest way compared to pages, script libraries and file resources (we did some performance tests on this a few years ago), from a performance perspective that is. But as Tomas mentions, it doesn't replicate, you need file access to the server (or an agent to detach the files), and it is a pain to update. This is perhaps the best solution for "static" framework files (Ext, Dojo, YUI, Prototype et al).

If you combine gzip + server location + cache headers, you have come a long way. If you then reduce the number of external (JS) files per request (e.g. use one or a few large files instead of many), you can boost performance even more. And if you want to take it further, you can always configure a CDN for all the files; this will make the pages load even faster (because of the limit on the number of simultaneous requests to a single domain, among other things). I used to compact/pack my larger JS files before going into production, but now I often just gzip the content instead (if I can).

Domino server-side cached content (using the $CacheValid field) is also a very, very fast performer (a cached ?ReadForm, for example), and I use it often on public websites. The main drawback is that it's a very "blunt" tool when it comes to configuration, and you can't really use it effectively on anything other than public sites: the same content is served regardless of who is accessing the page (the page is computed for the first user who accesses it). Server-side cached content combined with custom cache headers is very fast; the content is waiting in the memory of the server, ready to be served to you, instead of being computed/rendered for each request.

DAP works in a similar way to the server-side mechanism, but with cache headers, and it is way more intelligent (nice work, Tomas); it also reduces the amount of data transferred over the wire (80% on one of our servers). DAP combined with proper cache headers, and perhaps a CDN, should be very efficient and enough for most web solutions, I think. Another thing that does wonders when it comes to performance is moving all JS references from the head to the body of a page (this allows the browsers to load the JS files more concurrently instead of in sequence). This might not be the pragmatist's way… but speed is always nice, isn't it :-)
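
As a sketch (the paths are examples only), instead of plain tags in the head you can inject the tags from the bottom of the body:

    // Sketch: inject the script tags from the end of the page so the
    // browser can render the content first and fetch the libraries
    // without blocking in the head.
    function loadScript(src) {
        var s = document.createElement("script");
        s.type = "text/javascript";
        s.src = src;
        document.getElementsByTagName("body")[0].appendChild(s);
    }

    loadScript("/scripts/prototype.js");
    loadScript("/app.nsf/app.js");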

The thing I really like about being able to cache content in a smart way is the effect it has on the server (the noticeable reduction in CPU and memory utilization), combined with the fact that the users actually feel a performance boost.

We often focus on CSS and JS files and try to optimize those in every way we can, but it's equally important to optimize how we serve images (cache all images, for example). Images tend to be large and numerous (compare the size of the images with the size of the JS files, and remember that each image normally equals one request).

Sorry if this is a bit off topic… (but performance is always a fun topic)



