UPDATE: what I wrote here is very childish thinking. Please ignore
it. Every Byte Matters™.
I've been secretly working on UglifyJS v2, and I'll release some nice code very soonish. Here's just a thought I've been pondering.
A quick test against my 650K DynarchLIB reveals an
interesting observation: the lion's share of the compression
comes from name mangling (i.e. converting local names to single characters)
and whitespace removal. Everything else is, well, quite insignificant.
“Compression”, meaning those complicated AST transformations, such as
replacing if/else with the ternary operator or removing block
brackets, saves about 5.5K on my file. That's not bad, but after
gzip
the net savings drop to about 500 bytes.
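To illustrate what such a transformation does (my own made-up example, not from UglifyJS's code), here is the kind of before/after rewrite the squeezer performs; both functions behave identically, only the source text gets shorter:

```javascript
// Hypothetical input, before squeezing:
function clamp(x, max) {
  if (x > max) {
    return max;
  } else {
    return x;
  }
}

// What a squeezer conceptually produces: the if/else becomes a
// ternary expression and the block brackets disappear.
function clampSqueezed(x, max) { return x > max ? max : x }

console.log(clamp(5, 3), clampSqueezed(5, 3)); // 3 3
```

The point is that gzip compresses away most of the redundancy anyway, so the textual saving from such rewrites largely evaporates after transfer encoding.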
For those 500 bytes (5.5K before gzip), UglifyJS spends 2 long seconds
just on squeezing the code (all the other steps combined, like
parsing, mangling, lifting variables and generating code, take far less than
that).
Is it worth it? I'm not so sure... In any case, I do plan to make v2's
compression as good as v1's, but I couldn't help noticing this:
if I had 100,000 unique visitors each day (quite an astronomical
figure for me), and if that script were served to each of them (no caching
involved), then shaving 500 bytes off the file would save me about 1.5G/month,
which is about 0.015% of my monthly bandwidth (which I pay for anyway). I'm
not sure it's worth the trouble.
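The back-of-the-envelope math above, spelled out (the visitor count and per-file saving are the hypothetical figures from the text):

```javascript
// Hypothetical figures: 100,000 unique visitors/day, each
// downloading the script fresh, 500 bytes saved per download.
const visitorsPerDay = 100000;
const bytesSaved = 500;
const daysPerMonth = 30;

const bytesPerMonth = visitorsPerDay * bytesSaved * daysPerMonth;
const gbPerMonth = bytesPerMonth / 1e9;

console.log(gbPerMonth + " GB/month"); // 1.5 GB/month
```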
Update: rather than “is it worth it?”, perhaps the question
should be “how could I make it better?” Every byte matters after
all, but it seems to me that spending 2 seconds of CPU time for a net saving of 500 bytes,
which is 0.68% of the gzipped file size, means we're doing something wrong...