Sep
5
2012

UglifyJS v2 news

Update (Sep 18): I posted more news. The compressor now beats UglifyJS v1 on jQuery and on the other libraries I tried. Also, the full jQuery test suite passes after minification.

First, a big THANK YOU! to everybody who has donated so far! I've been silent since I started the funding campaign, but my motto is “work, not talk”. I have indeed been hard at work, and I'm happy to announce that UglifyJS v2 now supports generating source maps, has a pretty exhaustive compressor (it beats v1 in some of my tests) and comes with a minimal CLI utility.

I humbly remind you that the pledgie is still open and still far from the target. If you like what I'm doing, please consider contributing a few bucks!


UPDATE (Sep 07): Thanks to significant contributions from the jQuery Foundation and the Dojo Foundation, and from many individuals who heard the buzz, we are a lot closer to the target! Thank you people!

Update (Sep 11): Telerik matched the donations from jQuery and Dojo, providing more than enough to reach the campaign target! Thank you guys! I've been working on it and I'll post some updates and new code soon! What follows is old news.

Next, some information about the current state of affairs.

[ read more... ]


Aug
22
2012

JavaScript minification — is it worth it?

UPDATE: what I wrote here is very childish thinking. Please ignore it. Every Byte Matters™.

I've been secretly working on UglifyJS v2. I'll release some nice code very soonish. Here's just a thought I've been pondering.

A quick test against my 650K DynarchLIB reveals an interesting observation. It seems that the lion's share of the compression comes from name mangling (i.e. converting local names to single characters) and whitespace removal. Everything else is, well, quite insignificant.
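
To illustrate (a hand-written sketch, not actual UglifyJS output), this is the kind of saving those two steps account for:

    // Before: readable source with descriptive local names.
    function distance(pointA, pointB) {
        var dx = pointA.x - pointB.x;
        var dy = pointA.y - pointB.y;
        return Math.sqrt(dx * dx + dy * dy);
    }

    // After mangling local names and stripping whitespace
    // (the top-level name stays, locals become one letter):
    function distance(a,b){var c=a.x-b.x;var d=a.y-b.y;return Math.sqrt(c*c+d*d)}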

“Compression”, meaning those complicated AST transformations such as replacing if statements with the ternary operator or removing block brackets, saves about 5.5K on my file. That's not bad, but after gzip the net savings drop to about 500 bytes.
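
Again a hand-written sketch rather than real UglifyJS output, just to show the flavor of AST rewrite I mean:

    // Before: an if/else with block brackets.
    if (err) {
        callback(err);
    } else {
        callback(null, data);
    }

    // After "compression": brackets dropped, if/else folded into a ternary.
    err?callback(err):callback(null,data);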

For those 500 bytes (5.5K before gzip), UglifyJS spends 2 long seconds just on squeezing the code (all the other steps combined, like parsing, mangling, lifting variables and generating code, take way less than 2 seconds).

Is it worth it? I'm not so sure... In any case, I do plan to make v2 compression as good as v1's, but I couldn't help noticing this fact: if I had 100,000 unique visitors each day (and that's quite an astronomical figure to me), and if that script were served to each of them (no cache involved), then 500 fewer bytes in the file would save me about 1.5G/month, which is about 0.015% of my monthly bandwidth (which I pay for anyway). I'm not very sure it's worth the trouble.
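
A quick back-of-the-envelope check of that figure, using the numbers above:

    // 100,000 uncached downloads/day, 500 bytes saved per download, ~30 days/month:
    var savedBytesPerMonth = 100000 * 500 * 30;
    // = 1,500,000,000 bytes, i.e. roughly the 1.5G/month quoted above.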

Update: rather than “is it worth it?”, perhaps the question should be “how could I make it better?” Every byte matters, after all, but it seems to me that spending 2 seconds for a net saving of 500 bytes, which is 0.68% of the gzipped file size, means we're doing something wrong...


Aug
11
2012

Why Lisp?

As this new website is written in Lisp, I thought a first post in praise of Lisp would be in order. If you're a Lisper, you already know all this ;-) and if you're not, I hope it inspires you.

In early 2010 I joined the IT team of a small research center in Italy. Our manager at the time was a smart guy, a programmer himself, an Emacs user and a Haskell lover. We started a big project and, although the initial plan was to develop it in PHP (there were already a few PHP devs there), the manager agreed that I could use whatever language would bring us to the objective faster. He would have preferred Haskell, but I didn't know it. I had some 10 years of experience with JavaScript, though not on the server side (NodeJS was in its early stages in those days), 8 years with Perl and only a few months with Common Lisp. I picked Lisp and I still congratulate myself for that choice.

[ read more... ]

