OZONE Asylum
DHTML/Javascript
IE JS/DOM performance degradation weirdness
Wow, a month has passed since I originally posted this. :) OK, I've made some progress in this area. From what I have found, DOM (and possibly overall JS engine?) performance under IE (and perhaps Safari to an extent) is affected by a growing number of JavaScript objects - but not under Firefox. It does not appear to be solely DOM references (though assigning event handlers can contribute some "weight") - it is mostly just the data inside the objects.

I'm creating objects which contain DOM references, arrays and other data, and as more objects are created, DOM performance (eg. looping through an array returned by getElementsByTagName()) appears to take a hit. For each new "page" of data in the earlier-described project, I've found that I have to destroy the JavaScript objects for the current page. I'm creating a fixed number of objects (eg. 100 items means 100 objects) per page, and performance appears to be related to the number of active objects. Eg. putting the following arbitrary assignment code in the constructor of each "item" object contributes heavily to the slowdown:

[code]for (var i = 0; i < 100; i++) { this['random' + i] = 'random' + i; }[/code]

Obviously creating these property name/value pairs takes some time, but there's more to it than that. As I'm showing 100 items per page, this equates to 10,000 property assignments. As new pages are loaded, the time to create each new page appears to degrade exponentially. Whether I assign these properties to JS objects or to DOM nodes referenced by the objects, the effect appears to be the same. The type or "weight" of the data may not really matter: assigning a long string to each object seems to have a similar effect.

The obvious downside here is that previously-viewed pages can't simply be swapped out (ie. hide page 2, show page 1); the data must be re-fetched from the API and the related objects re-created.
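For reference, here is a minimal, DOM-free sketch of the kind of isolated test case described above (plain JS, no browser required). The Item constructor mirrors the arbitrary-assignment snippet from the post; the page/item counts and timing output are assumptions for illustration, not measurements from the original project:

```javascript
// Item constructor with 100 arbitrary string properties, as in the post.
function Item(index) {
  for (var i = 0; i < 100; i++) {
    this['random' + i] = 'random' + i;
  }
  this.index = index;
}

// Retain every page, simulating the "never destroyed" scenario.
var allPages = [];

function createPage(itemCount) {
  var page = [];
  for (var n = 0; n < itemCount; n++) {
    page.push(new Item(n));
  }
  return page;
}

// 10 pages x 100 items x 100 properties = 10,000 assignments per page.
for (var p = 0; p < 10; p++) {
  var start = Date.now();
  allPages.push(createPage(100));
  console.log('page ' + p + ': ' + (Date.now() - start) + ' ms');
}
```

On a modern engine the timings will likely stay flat; the point of the sketch is only to give a harness that could be run in the era's IE to see whether page-creation time grows with the number of retained objects.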
The key point from all of this: [i]If I delete the objects for the currently-active page before moving to the next one (at which point 100 new objects are created, etc.), the performance / render time is consistent and does not degrade.[/i] I have not had time to build an isolated test case, but it may be worthwhile. I would be interested in knowing what part of the browser/JS engine is "bottlenecking" (object look-ups?), and whether there are JS coding-style techniques that can be used to avoid the problem. [small](Edited by [url=http://www.ozoneasylum.com/user/2276]Scott[/url] on 04-17-2006 18:22)[/small]
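The workaround in the key point above could be sketched like this (again DOM-free and hypothetical - the destroyPage/showPage names and the idea of nulling out entries are illustrative assumptions, not code from the project):

```javascript
// Same Item constructor as described in the post.
function Item(index) {
  for (var i = 0; i < 100; i++) {
    this['random' + i] = 'random' + i;
  }
  this.index = index;
}

var currentPage = null;

// Break references so the objects (and any DOM references or event
// handlers they hold) become collectable before the next page is built.
function destroyPage(page) {
  for (var n = 0; n < page.length; n++) {
    page[n] = null;
  }
  page.length = 0;
}

function showPage(itemCount) {
  if (currentPage) {
    destroyPage(currentPage); // release old page first
  }
  currentPage = [];
  for (var n = 0; n < itemCount; n++) {
    currentPage.push(new Item(n));
  }
}

showPage(100);
showPage(100); // live object count stays ~100 instead of accumulating
```

The design choice is simply to keep the number of live objects constant per page view; in real code the destroy step would also null out any event handlers attached to DOM nodes, which mattered for the circular-reference leaks in IE of that era.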