My test code consists of the following:
var limit = 1000000;

function testi() {
    var n = 0;
    for (var i = 0; i < limit; i++)
        n = i;
}

function testd() {
    var n = 0;
    for (var i = limit; i > 0; i--)
        n = i;
}

time1 = new Date().valueOf();
testi();
time2 = new Date().valueOf();
document.write("time incrementing (" + limit + " times) = " + (time2 - time1) + "<br>");

time1 = new Date().valueOf();
testd();
time2 = new Date().valueOf();
document.write("time decrementing (" + limit + " times) = " + (time2 - time1) + "<br>");

Fairly straightforward. I can change the limit value to see how the two loops compare at different sizes. I compared Firefox 2 and IE 7, and I found the following:
- At a million loops in FF and IE, decrementing is usually about twice as fast as incrementing (~220ms vs ~440ms). But how important is a ~200ms baseline difference when you are running a loop a million times? Presumably, whatever you are doing inside that loop will eclipse the difference (see the sketch after this list).*
- If we decrease the loop to 100,000, the difference in IE becomes ~20ms decrementing vs ~40ms incrementing. And Firefox, surprisingly, shows the same results.
- At 10,000 loops, Firefox wavers between 0 and 10 ms for either incrementing or decrementing. As does IE.
- At 1,000 loops, we are in 0ms land for both IE and FF.
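To make the point about loop-body cost concrete, here is a small sketch. This is my own illustration, not part of the original benchmark; the function name and the Math.sqrt work inside the loop are arbitrary stand-ins for "real work":

// Sketch: the incrementing loop from above, but with actual work in the body.
// At a million iterations, the cost of the body (Math.sqrt here, or anything
// non-trivial) dwarfs the ~200ms gap between counting up and counting down.
function testiWithWork() {
    var n = 0;
    for (var i = 0; i < limit; i++)
        n += Math.sqrt(i); // the body, not the counter direction, dominates
    return n;
}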
One problem I have found with decrementing loops is that they don't behave the same as incrementing loops: if you are looping through an array, you end up going through it backwards. That is sometimes a problem, for example if you are executing a series of functions stored in an array and you want to ensure that the first one in is executed first. A simple way to fix this is to compute the index by subtracting your decrementing counter from your limit. But does putting this simple bit of math into the loop erode any performance gains we made by decrementing in the first place? To find out, I substituted the n = i; assignment in the decrementing loop function with an n = limit - i; assignment (see the sketch after the list below). I found that:
- In Firefox (at 1 million loops), the cost of decrementing rose to ~550ms vs ~440ms for incrementing, erasing any gains and then some. In IE, the time for decrementing likewise climbed past the time for incrementing.
- Below 1 million loops, decrementing lost all of its advantage (at best being equal) when the compensating math was implemented.
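To make the compensation concrete, here is a small sketch of the idea. This is my own illustration rather than code from the test above, and the task array and its contents are hypothetical; it shows how subtracting the decrementing counter from the length lets the loop still visit the array in its original order:

var tasks = [
    function () { document.write("first<br>"); },
    function () { document.write("second<br>"); },
    function () { document.write("third<br>"); }
];

var len = tasks.length;
for (var i = len; i > 0; i--) {
    // len - i maps i = len..1 back to 0..len-1, so tasks[0] still runs first
    // even though the counter itself is decrementing.
    tasks[len - i]();
}

In the benchmark above, this is the same change as swapping n = i; for n = limit - i; inside testd().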
*Strangely, in Internet Explorer only, if I reverse the order of the tests and run the decrementing function first (at a million loops), decrementing takes ~210ms and incrementing takes anywhere from 900ms to over 2000ms. This is a big difference, and I haven't been able to figure out the source of the disparity.