Thursday, February 12, 2009

JavaScript performance testing

I've been horribly disappointed with IE's performance on some of my JavaScript, to the point that I've pretty much decided to simply hide the troublesome functions from IE users. That's nearly 85% of our audience who won't get to experience the work I've put into some of this stuff. I figured I needed some actual figures on performance, so I took the worst offender and turned it into a test case. And while I was at it, I made a second test case so I could compare jQuery 1.2.6 to jQuery 1.3.1.

First of all, you need to understand that I'm terribly abusing the plugin in question. I'm using treeview to do the online version of our organizational chart. There's a demo showing how it works with large lists. That demo has 290 total list items, 53 of which are expandable. Our org chart has 403 list items, 81 of which are expandable. It's large enough that I had to edit the default images to extend all the way down the fully expanded list.

I added a crude counter to the plugin; here's the code. The lines involving the Date(), getTime(), and alert() functions are my additions (lines 114 to 127 or so). It's far from perfect, but it should be equally imperfect in all browsers and therefore free of bias. Whenever the expand-all or collapse-all function is triggered, it grabs the time early in that process, grabs the time again near the end, computes the difference (in milliseconds), and alerts the value. In each browser I expanded the list, recorded the value, collapsed the list, recorded that value, and repeated the process until I had 15 measurements each for expand and collapse.
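In case it helps anyone replicate this, the instrumentation boils down to something like the sketch below. This isn't treeview's actual code, just the general shape of my additions, with the variable names invented for illustration.

    // Rough shape of the timing additions (illustrative, not the plugin's
    // real code). Grab a timestamp when the toggle starts...
    var startTime = new Date().getTime();

    // ...let treeview do its expand-all / collapse-all work here...

    // ...then grab a second timestamp and report the elapsed milliseconds.
    var endTime = new Date().getTime();
    alert(endTime - startTime);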

The Data

We'll do the Mac browsers first. I'm running a 20-inch iMac with a 2.0 GHz Intel Core Duo processor and 4 gigs of RAM. I ran these tests under "normal" conditions, so I had other applications going and, in the case of Firefox, several other tabs open.

All figures are in milliseconds.

Firefox 3.0.6

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1         838      1178         506      1006
  2         840      1184         514      1024
  3         847      1183         511       999
  4         826      1188         493       998
  5         827      1188         482      1011
  6         821      1184         513       998
  7         839      1187         480      1010
  8         835      1205         510      1003
  9         827      1186         493      1009
 10         827      1190         484      1008
 11         825      1189         508      1005
 12         839      1182         484      1002
 13         824      1185         492      1000
 14         816      1182         512       997
 15         845      1190         491       994
Min         816      1178         480       994
Max         847      1205         514      1024
Range        31        27          34        30
Median      827      1186         493      1003
Average     831.73   1186.73      498.2    1004.26

The most surprising thing here is that jQuery 1.2.6 is actually a bit faster than 1.3.1. As we'll see, that's not true in every browser, and 1.3.2 is supposed to be released soon anyway. Also, collapsing takes longer than expanding; I'll leave explaining that to someone who knows more about jQuery DOM traversal than I do.

Safari 3.2.1

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1         262       161         311       147
  2         256       159         311       146
  3         267       164         326       149
  4         269       163         316       151
  5         264       167         314       148
  6         269       164         311       148
  7         266       169         311       146
  8         267       165         319       149
  9         270       163         312       149
 10         270       164         314       149
 11         267       165         310       149
 12         269       164         316       151
 13         292       164         318       148
 14         267       166         318       152
 15         267       166         317       150
Min         256       159         310       146
Max         292       169         326       152
Range        36        10          16         6
Median      267       164         314       149
Average     268.13    164.27      314.9     148.8

At first I thought the performance boost for 1.3.1 here was pretty small, but the fact that all the numbers involved are tiny threw me off. If you add up the median figures (267 + 164 = 431ms for 1.3.1 vs. 314 + 149 = 463ms for 1.2.6), you'll find a difference of about 7.5%. That's nothing compared to the relative advantage Safari already has over most of the other browsers, but it's still a decent speed boost for the new jQuery release.

Safari's js engine is awesome. :)

Can anyone explain why in Safari collapse is quicker than expand, but in Firefox it's the other way around?

Opera 9.63

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1         692       698         875       691
  2         690       709         944       711
  3         737       721         904       692
  4         667       709         875       692
  5         711       715         872       684
  6         703       730         869       692
  7         715       697         864       741
  8         703       695         929       688
  9         704       692         905       652
 10         692       696         910       687
 11         719       705         891       674
 12         680       704         885       707
 13         731       697         926       701
 14         695       713         857       692
 15         682       696         918       690
Min         667       692         857       652
Max         737       730         944       741
Range        70        38          87        89
Median      703       698         891       692
Average     701.4     705.13      894.93    692.93

Opera seems slow compared to Safari, but it's on par with Firefox, and more than tolerable. We also see a decent speed boost with 1.3.1 here.

Windows Browsers

I'm running Windows XP via virtualization with 412 megs of RAM. These tests were run with no other applications open, so more "ideal" conditions than the Mac browsers got. Then again, the Mac browsers get a ton more RAM and aren't running in a virtual machine. I tested both Firefox 2 and Firefox 3 because a decent portion of our Firefox-using audience hasn't upgraded yet.

Firefox 2.0.0.20

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1        1018      1169         629       740
  2        1069      1109         629       719
  3        1019      1129         609       739
  4        1019      1129         649       729
  5        1019      1139         609       749
  6        1019      1159         639       749
  7         989      1129         619       749
  8        1049      1189         629       730
  9        1019      1149         619       739
 10         989      1129         649       739
 11        1029      1128         630       759
 12        1009      1129         629       750
 13        1029      1129         640       759
 14         999      1189         629       719
 15         969      1119         639       769
(Summary stats still to come; see the note at the end of the post.)

Here we're starting to get slower than I'd like. A full second for anything to happen is a bit much. Then again, 403 list items in 81 lists is a lot to munch on. Once again we lose a little speed with the new jQuery release. Hmm.

Firefox 3.0.6

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1         431       657         450       618
  2         449       688         467       650
  3         461       648         424       634
  4         434       623         475       638
  5         464       657         443       667
  6         472       708         456       637
  7         558       632         446       632
  8         493       650         472       632
  9         481       695         460       646
 10         436       649         477       643
 11         457       634         449       622
 12         442       649         464       614
 13         472       648         483       628
 14         462       659         468       651
 15         458       625         440       640
(Summary stats still to come.)

Here we go! Come on Firefox users, update already!

Safari 3.2.1

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1         253       141         303       130
  2         326       171         300       133
  3         277       148         324       129
  4         257       160         327       134
  5         268       147         313       135
  6         288       143         312       134
  7         278       147         283       129
  8         269       150         297       130
  9         244       151         355       132
 10         271       171         332       130
 11         264       163         367       133
 12         281       158         332       135
 13         284       154         360       148
 14         281       142         317       133
 15         293       147         307       129
(Summary stats still to come.)

Safari continues to be awesome, even on Windows. :)

Opera 9.63

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1         470       399         949       370
  2         480       410         989       380
  3         469       429         949       390
  4         480       409         929       409
  5         480       419         939       400
  6         499       399         939       390
  7         490       419         939       370
  8         580       409         929       390
  9         469       400         939       380
 10         519       400         939       400
 11         460       400         959       399
 12         470       430         949       420
 13         479       410         958       380
 14         563       399        1009       400
 15         480       399         989       420
(Summary stats still to come.)

With Opera on Windows, we see some pretty solid speed boosts with the 1.3.1 release of jQuery.

Chrome 1.0.154.48

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1         229        88         512       111
  2         203        77         598       112
  3         191       119         546       133
  4         195        88         607       115
  5         158        79         607       109
  6         208       106         640       105
  7         201        84         526       103
  8         188        86         674       103
  9         204        95         545       105
 10         244       158         631       106
 11         360       116         549       133
 12         327       145         635       112
 13         307       135         581       109
 14         305        61         616       110
 15         178        81         531       114
(Summary stats still to come.)

Chrome also sees some good boosts from 1.3.1 and manages to out-awesome even Safari. When you look at the range as a percentage, though, it's all over the place. Sometimes the alert box would include a checkbox saying something to the effect of "prevent this site from producing alert boxes," and I think those alerts tended to take longer to generate. Maybe Chrome's doing other stuff behind the scenes. I'd need to do more specific research to figure this one out.
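If I rerun these tests, one way to take alert() out of the equation would be to accumulate the timings and read them out at the end. A minimal sketch; the recordTiming function and the window-title trick are my own invention, not anything in the plugin:

    // Collect each measurement instead of alerting it (illustrative sketch).
    var timings = [];

    function recordTiming(ms) {
        timings.push(ms);
        // Show a running tally somewhere non-blocking, e.g. the window title.
        document.title = timings.length + " runs, last: " + ms + "ms";
    }

    // In the plugin, alert(endTime - startTime) would become:
    // recordTiming(endTime - startTime);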

IE 7.0.5730.13

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1        2990      1718        2538      1248
  2        2897      1738        2587      1219
  3        2867      1832        2557      1249
  4        2917      1718        2557      1248
  5        3016      1668        2577      1279
  6        2877      1708        2568      1249
  7        2877      1678        2567      1208
  8        2847      1678        2557      1228
  9        2907      1698        2548      1258
 10        2897      1709        2707      1229
 11        2877      1728        2548      1229
 12        2916      1738        2708      1239
 13        2897      1689        2558      1389
 14        2888      1688        2568      1229
 15        2848      1708        2568      1239
(Summary stats still to come.)

Ugh! I'm in browser hell! What's up with this? 2.5 to 3 seconds to expand the list? Come on!

IE6

I don't know the exact version number here because of the way the Multiple IE installer works.

jQuery:       1.3.1                 1.2.6
Test      Expand   Collapse     Expand   Collapse
  1       10779      4935       11078      3899
  2       12577      5165       12617      4005
  3       13060      5774       13167      4356
  4       14085      5964       14785      4715
  5       15034      6314       15684      5105
  6       16123      7032       16876      5534
  7       17272      7152       19643      6453
  8       19549      8021       21740      7366
  9       19859      8231       20642      6842
 10       21121      8611       21947      7453
 11       23080      9082       22726      7801
 12       23249      9402       23925      8241
 13       24961      9892       36235      9700
 14       24770      9959       28810      9364
 15       25953     10349       29422      9849
(Summary stats still to come.)

No, I was wrong before. This is browser hell. Notice how each test takes about a full second longer than the one before it? I think that means we've got memory leaks. I closed and relaunched the browser between test cases to avoid skewing the results between jQuery versions.
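To put a rough number on that drift, you can difference successive runs; the average of those differences telescopes down to (last - first) / (runs - 1). A quick sketch, assuming the timings array from the earlier snippet:

    // Average per-run slowdown across a series of timings (sketch).
    function averageDrift(timings) {
        var total = 0;
        for (var i = 1; i < timings.length; i++) {
            total += timings[i] - timings[i - 1];
        }
        return total / (timings.length - 1);
    }

For the IE6 1.3.1 expand column that works out to (25953 - 10779) / 14, or roughly 1084ms gained per run.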

I'll complete the number crunching and do a bit more in-depth analysis tomorrow. It's 5 o'clock and I'm heading home for the day.

Thursday, February 05, 2009

Content! Third-hand content, but still, content!

One of the people I encountered in all the discussion on higher ed in the web design industry is Kurt Schmidt. (ImTheSchmidt.com, isn't that one of the greatest domain names ever?)

Since then, I've been following his blog semi-daily. Earlier today he posted Make Your Site Faster - Or Else!, which is his take on info from Geeking with Greg. I started to reply to Kurt, but about the time I started my 3rd paragraph I figured I should just turn it into a blog post here. See what I mean about blogs and peer review?

The importance of speed and performance on the user experience is something that I'm currently struggling with. In the redesign I'm working on, I've made extensive use of jQuery. Everything runs super quick on all the Mac browsers. Chrome and Safari on Windows also blaze. Firefox on Windows is noticeably slower than the rest, but still tolerable. Who does that leave to drool on his shirt and crap his pants at the browser testing party?

Both IE6 and IE7 are intolerably slow. I've considered simply filtering out IE6 from all the js goodness. Since I've designed for progressive enhancement, everything will still function, just not the same way. But that still leaves IE7 users with a clunky and ungraceful experience. As much as I hate IE, I won't kid myself about its market share.

The best solution from the users' point of view would be to test each jQuery implementation on the site separately and selectively enable/disable for IE based on performance in context. Some of the stuff I'm doing on the redesign works fine in IE. I know because I've been doing it on the current site for more than a year.
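For what it's worth, that kind of per-feature gating could be as simple as the sketch below. This is my own generic illustration, not code from the redesign; the selector and options are hypothetical, and jQuery 1.2/1.3 still ship the $.browser flags even though object detection is usually the preferred approach.

    // Hypothetical example: skip a known-slow enhancement in old IE only.
    // $.browser.msie and $.browser.version exist in jQuery 1.2/1.3.
    var oldIE = $.browser.msie && parseInt($.browser.version, 10) < 8;

    if (!oldIE) {
        // Made-up selector and options, purely for illustration.
        $("#orgchart ul").treeview({ collapsed: true });
    }
    // Old IE still gets the plain nested list, thanks to progressive enhancement.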

I also need to dig deeper into my jQuery-fu and make sure my calls are as efficient as possible. But some of it is straight-up plugin functionality, and I'm not too keen on rewriting other people's plugins to increase efficiency. The next version of Firefox will probably incorporate ideas currently only available in Chrome's V8 js engine, making its tolerable performance that much better. But that will probably just widen the gap between IE users and everyone else even more!
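As an example of the low-hanging fruit I mean (a generic illustration with made-up selectors, not code from the site): caching a selection instead of re-querying the DOM for every operation.

    // Slower: runs the same selector three times.
    $("#orgchart .expandable").addClass("open");
    $("#orgchart .expandable").show();
    $("#orgchart .expandable").trigger("expand");

    // Faster: query once, cache the result, and chain.
    var $expandable = $("#orgchart .expandable");
    $expandable.addClass("open").show().trigger("expand");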

The newest version of jQuery (which I haven't updated to yet, oops) also has a revamped selector engine that should boost performance across the board. But unless those changes impact performance in IE a hell of a lot more than everyone else, the gap will still be huge. It's the gap in the user experience that worries me.

What little testing I've done so far would seem to indicate that IE6 is bad at, say, 2000 milliseconds. IE7 is barely any better and a lot less consistent, with speeds ranging from 1500-2300ms. Firefox is around 700ms. Opera is 300-400ms. Safari is 200-300ms. Chrome is about 150ms (10 times better than IE7's best run!). With those numbers, even if Sizzle improves performance by 20% across the board, Chrome gains 30ms and IE7 gains at least 300ms. Maybe the gap will get smaller. But still, 1.2 seconds waiting for all the js to trigger is unacceptable from a user-centered outlook. And why should I as a designer, and the 20% or so of our users who don't run IE, be punished simply because IE can't get its crap together? These sorts of situations are the only part of my job I hate. I'll stop there before this turns into a rant.

Well, now that I know the new version of jQuery is out, it's pointless to flap my jaw any more about this stuff until I've updated and retested. I've got work to do.

I've been neglecting all my imaginary readers

Things have been crazy. Professionally, academically, personally: you name it. Chaos defines me, even more than usual. I've had plenty of blog-worthy thoughts in the past month, but this is the first chance I've had to sit down and actually blog, you know, as a verb.

There was an explosion of discussion over at A List Apart on higher ed and web design. And by explosion, I mean I posted a lot and the discussion was meaningful to me. I'm sure there are articles with a hell of a lot more discussion.

The articles in question are Elevate Web Design at the University Level by Leslie Jensen-Inman and Brighter Horizons for Web Education by Aarron Walter. Both are worth checking out if you haven't already.

I've tried, and failed, to articulate what my heart tells me about this issue. I've posted in the discussions for both of those articles. I've posted to the University Web Developers group on Ning. I've posted on various blogs where the discussion has spilled over. I've carried on conversations via email with a few folks. The closest I've come to getting it right is probably this comment in response to Leslie's article.

I'm trying to make the case in favor of a new type of "digital apprenticeship" in the industry. I think a formal education can work in our field. No, I know it can. My bachelor's was a great model and actually follows a lot of the suggestions that people have come up with in these ongoing discussions. But the shelf life of that program as I experienced it was limited to about 5 years; a change in leadership is all it took to bring it back to business as usual (in other words, the kind of troubled program Leslie was targeting with her research). I don't think our industry can afford to wait a generation for the top level of campus politics to embrace the educational philosophies required to produce marketable graduates. For the next 20 years we'll see a lot of hit-or-miss programs, and some campuses will be more progressive in that respect than others. There are probably half a dozen or so solid programs in existence right now that have the proper institutional support to remain viable and stable. But I don't think students should be required to seek out places like MIT to find a decent web design program. And while I'm sure less prestigious and well-known campuses are capable of fielding the sort of program I'm talking about, we're still faced with the problem of how students find out about them. I mean, the only reason I found the program at TTU is that I started out there as an engineering student.

So while I appreciate the efforts that are being made to entice higher ed to get with the program, I don't think that's a sphere where we have any real influence. A lack of solid educational material hasn't been a problem in this industry for at least a decade. The fact that most academics will pay no mind to such material until it's formalized in a book or an accepted peer-reviewed journal is a failing on their part (within the context of our industry), not on ours. In my mind, the spirit of peer review is alive and well on prominent blogs, and the transparent nature of the comment-and-debate system removes a lot of the politics that can infect trade journals. Movements like open courseware seem to indicate that some academics understand this and are forging new educational models. But until that kind of thinking can emerge from the underground, it only serves as a further means of exclusion.

We can't make higher ed listen to us. And we can't force HR departments to stop requiring a degree for a given position. But I don't think we have to. Rather than waiting for the old schools of thought to change, we may be able to maneuver around their momentum and bureaucracy by building a better model ourselves. After a couple of hiring cycles of employees with graphic design or computer science degrees who can't address the full range of skills a web position requires, HR might learn to value portfolios and experience a bit more, at least for these positions. And if universities start noticing problems with retention and with the ability of their graduates to compete in the market, they'll come to us for a solution.

Right now I feel like we're trying to cook a more enticing meal for a man who doesn't even realize he's hungry. But he's not just hungry, he's starving. Things like the Opera Web Standards Curriculum are great. But in the mind of most academics, it's foistware. We're pitching a solution to a problem that isn't even on the radar yet.

But I have a bit more hope now than I did a few weeks ago. Leslie's article on ALA was largely preaching to the choir, but she's an academic herself (and a fellow Tennessean). The research that led her to produce the ALA article is also the sort of research that academics may be willing to pay attention to. In fact, she caught the attention of the Chronicle of Higher Education. Maybe academia is more willing to listen than I've assumed. Still, I'll believe it when I see it.