Topic awaiting preservation: Old optimizer tricks (Page 1 of 1)

 
Ray Norrish
Nervous Wreck (II) Inmate

From:
Insane since: Sep 2004

posted 10-29-2004 01:07

Well.. it's quiet here, so why not ask some questions..
An old trick from my Amiga days for code optimisation was unrolling loops. Of course, when I unroll large loops in JS, I run the risk of a humungous script size, which defeats the purpose in terms of download time. This leads me to a persistent thought: why can't I find any inline decompressors for JS? Nothing fancy.. a simple compressor would do.. that way, unrolled loops would compress nicely and, we would hope, result in faster execution.
This would, of course, also apply to precalc blocks for sine and other mathematical data, which could be decompressed at run time.

What are your thoughts, guys, and has anyone experimented with these ideas?
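The unrolling and precalc tricks described above can be sketched like this ( a hypothetical illustration, not code from the thread; all names are made up ):

```javascript
// Precalc: compute a sine table once, instead of calling
// Math.sin() inside the inner loop of an effect.
var SIN = [];
for (var d = 0; d < 360; d++) {
  SIN[d] = Math.sin(d * Math.PI / 180);
}

// Rolled loop: one bounds check and increment per iteration.
function sumRolled(data) {
  var total = 0;
  for (var i = 0; i < data.length; i++) {
    total += data[i];
  }
  return total;
}

// Unrolled by 4: fewer loop-control operations per element, at the
// cost of larger source code -- the size/speed trade-off in question.
function sumUnrolled(data) {
  var total = 0, i = 0, n = data.length - 3;
  for (; i < n; i += 4) {
    total += data[i] + data[i + 1] + data[i + 2] + data[i + 3];
  }
  for (; i < data.length; i++) { // handle the remainder
    total += data[i];
  }
  return total;
}
```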

poi
Paranoid (IV) Inmate

From: France
Insane since: Jun 2002

posted 10-29-2004 15:36

There are some JavaScript compressors, one of the most famous being JavaScript Compression at http://www.dithered.com. Most of the real JavaScript compressors ( not the crap that trims the lines, removes the comments and pretends to compress the code effectively ) I've seen were based on the same method and had more or less the same constraints.

For 3D TOMB II, I coded a JS packer based on that technique and managed to obtain a ratio of 67.2% of the original size, which downsized the JS code from 4429 to 2976 bytes.

But from what you said, I think the most appropriate method would be to generate the unrolled loops rather than compressing/decompressing them.

For more info about web site optimization you can also read Speed Up Your Site by Andrew B. King. But honestly, I doubt you'll learn anything you don't already know.
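The token-substitution method these packers share can be sketched roughly like this ( a hypothetical illustration, not poi's actual PHP packer; it assumes the token characters and the '|' separator never occur in the source or the dictionary ):

```javascript
// Pack: replace each dictionary string with a single unused character.
// The token list and dictionary travel with the packed data.
function pack(src, dict) {
  var tokens = '';
  for (var i = 0; i < dict.length; i++) {
    var t = String.fromCharCode(161 + i); // chars unlikely to appear in code
    tokens += t;
    src = src.split(dict[i]).join(t);
  }
  // Stored form: tokens | dictionary entries | packed source
  return tokens + '|' + dict.join('|') + '|' + src;
}

// Unpack: substitute the dictionary strings back, last token first.
function unpack(packed) {
  var parts = packed.split('|');
  var tokens = parts[0];
  var out = parts[parts.length - 1];
  for (var i = tokens.length - 1; i >= 0; i--) {
    out = out.split(tokens.charAt(i)).join(parts[1 + i]);
  }
  return out;
}
```

The saving comes entirely from repeated substrings, which is why poi's advice below about arranging code into identical patterns matters so much for this kind of packer.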



(Edited by poi on 10-29-2004 15:46)

Ray Norrish
Nervous Wreck (II) Inmate

From:
Insane since: Sep 2004

posted 10-30-2004 21:12

Ok, but unrolling loops and precalcing etc. take time and extra code. My idea is that a generic compressor - one optimised for JS - could be written in anything (I would use Delphi, say) and compress the entire script into one large array of data. A small decompression routine could then decompress the whole block into a fresh window or frame and jump into the expanded code.
I'm sure this would be a lot more economical for size (size-limited compos?) and would allow all of the above: unrolled loops, larger precalc tables etc.
The compressor could perform all of the usual "fake" tricks, like renaming variables and stripping spaces, tabs etc. I did something like this many years ago for an old scripting language called ARexx, using all of the above procedures.
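The "decompress then run" bootstrap described here might look something like this ( a hypothetical sketch; trivial run-length coding stands in for a real JS-optimised compressor, and the function names are invented ):

```javascript
// Run-length encode a string into an array of [count, char] pairs,
// e.g. "aaab" -> [[3,"a"],[1,"b"]]. A real packer would do far better.
function packAsArray(src) {
  var out = [];
  for (var i = 0; i < src.length; ) {
    var c = src.charAt(i), n = 0;
    while (i < src.length && src.charAt(i) === c) { n++; i++; }
    out.push([n, c]);
  }
  return out;
}

// Expand the array back into the original source text.
function unpackArray(data) {
  var src = '';
  for (var i = 0; i < data.length; i++) {
    for (var j = 0; j < data[i][0]; j++) src += data[i][1];
  }
  return src;
}

// The stub shipped to the browser: the real script lives in the
// packed array; the stub expands it and jumps in via eval().
var packed = packAsArray('1+1+1+1+1+1+1+1');
var result = eval(unpackArray(packed)); // the expanded code runs here
```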

poi
Paranoid (IV) Inmate

From: France
Insane since: Jun 2002

posted 10-30-2004 23:14

I did my JS packer in PHP. Basically it uses the method from dithered.com ( though I found it somewhere else first ) and provides 3 different types of tokens, allowing a maximum of 91, 95 and 127 replaced strings respectively. I do the "fake" tricks by hand beforehand.

To improve the ratio of that sort of packer you can arrange your code to increase the number of similar patterns. For instance, if the initialisation of an effect contains a for( i=0; i<80; i++ ), it's quite likely that you'll need a similar loop in the main function. Using exactly the same code in the main function increases the probability that it gets packed, and thus stored only once and replaced everywhere by a short token. If you play with the DOM and style properties, chances are you'll have many repeated patterns of code. Try to arrange them in a way that maximizes their size.

If you have some large chunks of code that are exactly the same, a normal packer would work, but if there are subtle variations it'll be almost useless. Actually, I doubt the effort of making a clever packer ( like an LZW or Huffman one ) is worth it compared with the ease of generating the unrolled loops and other precalc.

Ray Norrish
Nervous Wreck (II) Inmate

From:
Insane since: Sep 2004

posted 10-31-2004 00:46

I've just been looking at whether it is possible to implement the gzip or deflate content-encoding capability. I use this for server-side page delivery with DB apps, and most generated pages come out at around 10% of their original size.
To do this, I have tried to force the HTTP-EQUIV content-encoding to 'deflate' etc.
Some headers can be changed, but I can't get changing the content-encoding header client-side to work... maybe it's not possible, like some other headers.. but it is very tempting to try to get this working, as my test file goes from 7K down to <1K

poi
Paranoid (IV) Inmate

From: France
Insane since: Jun 2002

posted 10-31-2004 02:06

I've never heard of a way to touch the HTTP headers ( except for cookies ) on the client side, except with an XMLHttpRequest object, where it's possible to send/receive an HTTP request and indeed use different content-encoding methods. But that new object has only recently been supported by the 4 major browsers.

On the other hand, coding an LZW or Huffman packer in JS is possible but damn tricky. Since JavaScript is plain text, the browsers choke on certain characters ( CR, LF, DEL, ESC, NUL, ... ), so even the storage method is problematic, and working with binary data spanning several bytes while avoiding the choking characters can quickly become a pain in the ass. But if only for the "you're not supposed to do that" factor, it's worth trying to implement such a packer/unpacker in JS.
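The LZW scheme mentioned here can in fact be sketched in JS if the codes are kept as an array of numbers, which sidesteps the character-storage problem ( a hypothetical sketch; a real inline packer would still have to serialise these codes into a range of safe characters to avoid CR, LF, NUL and friends ):

```javascript
// LZW compress a string into an array of numeric codes.
function lzwCompress(src) {
  var dict = {}, next = 256, out = [], w = '';
  for (var i = 0; i < 256; i++) dict[String.fromCharCode(i)] = i;
  for (var j = 0; j < src.length; j++) {
    var c = src.charAt(j);
    if (dict.hasOwnProperty(w + c)) {
      w += c;                      // grow the current match
    } else {
      out.push(dict[w]);           // emit code for the longest match
      dict[w + c] = next++;        // learn the new phrase
      w = c;
    }
  }
  if (w !== '') out.push(dict[w]);
  return out;
}

// Rebuild the original string from the code array.
function lzwDecompress(codes) {
  var dict = {}, next = 256, out, w, entry;
  for (var i = 0; i < 256; i++) dict[i] = String.fromCharCode(i);
  w = dict[codes[0]];
  out = w;
  for (var j = 1; j < codes.length; j++) {
    var k = codes[j];
    // k === next is the classic "phrase + its first char" special case
    entry = dict.hasOwnProperty(k) ? dict[k] : w + w.charAt(0);
    out += entry;
    dict[next++] = w + entry.charAt(0);
    w = entry;
  }
  return out;
}
```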

bsilby
Obsessive-Compulsive (I) Inmate

From: Christchurch
Insane since: Sep 2004

posted 10-31-2004 06:15

I've been using the w3compiler to shrink my JS. It's really useful: removing whitespace reduces the file size by about 7%. It also has settings for variable renaming and object remapping, and switching all of these on reduces file size by over 20% in some cases. I don't enable those settings, however, since they caused a problem with a couple of my games.

Brent.

BRENT SILBY
www.def-logic.com
Neo-Arcade Videogames
