Why JavaScript's Native Filter Method is Slow
Whenever I see that a native function is available, I assume it will be fast, at least faster than my hand-rolled version. It is a common assumption that is rarely tested.
I initially set out to test various ways of filtering a list with an arbitrary predicate function. I tried two approaches; the first is a standard for loop:
function filter(array, fn) {
  var results = [];
  var item;
  for (var i = 0, len = array.length; i < len; i++) {
    item = array[i];
    if (fn(item)) results.push(item);
  }
  return results;
}
// Usage:
var list = [1,2,3,...,500];
var predicate = function(n){ ... };
var matching = filter(list, predicate);
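To make that usage runnable end to end, here is a self-contained version with hypothetical data and predicate standing in for the elided ones above (the values are mine, not the article's):

```javascript
function filter(array, fn) {
  var results = [];
  var item;
  for (var i = 0, len = array.length; i < len; i++) {
    item = array[i];
    if (fn(item)) results.push(item);
  }
  return results;
}

// Hypothetical stand-ins for the elided list and predicate.
var list = [1, 2, 3, 4, 5, 6];
var predicate = function (n) { return n % 2 === 0; };
var matching = filter(list, predicate); // [2, 4, 6]
```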
This is pretty simple: you just iterate over the array and push any matching item onto your results. The second approach used the JavaScript 1.6 filter function:
var matching = list.filter(predicate);
If you take a moment and check out the actual tests on jsPerf.com, you will see how wrong my assumption was. On average, the for loop I defined above was around twice as fast as calling Array.prototype.filter.
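A rough micro-benchmark along these lines can be sketched as follows (jsPerf's methodology is more rigorous than this, and the exact ratio varies by engine and machine; the iteration counts here are arbitrary):

```javascript
// Hand-rolled filter, repeated here so the sketch is self-contained.
function filter(array, fn) {
  var results = [];
  for (var i = 0, len = array.length; i < len; i++) {
    if (fn(array[i])) results.push(array[i]);
  }
  return results;
}

var list = [];
for (var i = 0; i < 500; i++) list.push(i);
var isEven = function (n) { return n % 2 === 0; };

// Time many repetitions of each approach and print the elapsed wall time.
function time(label, fn) {
  var start = Date.now();
  for (var j = 0; j < 10000; j++) fn();
  console.log(label + ': ' + (Date.now() - start) + 'ms');
}

time('for loop', function () { filter(list, isEven); });
time('native filter', function () { list.filter(isEven); });
```

Both approaches produce the same results on a dense array; only the timings differ.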
Apples to Apples
If you read the documentation for filter, you will see why it's so much slower.
- It ignores deleted values and gaps in the array
- It optionally sets the execution context of the predicate function
- It guards against the predicate function mutating the array mid-iteration (the range of elements processed is fixed before the first call)
This is a robust implementation; however, it incurs a lot of overhead for edge cases that you can probably avoid. If your data is guaranteed not to have deleted values, and your predicate function is well behaved, these precautions just serve to slow your filtering down.
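To make that overhead concrete, here is a sketch of a filter that performs those extra checks, modeled loosely on the MDN polyfill for Array.prototype.filter (simplified, and not any engine's actual implementation):

```javascript
function specFilter(array, fn, thisArg) {
  var results = [];
  var len = array.length >>> 0; // coerce length to a uint32, as the spec does
  for (var i = 0; i < len; i++) {
    // The `in` check skips holes in sparse arrays -- an extra
    // property lookup on every single iteration.
    if (i in array) {
      var item = array[i]; // read the element before invoking the predicate
      // Optionally set the predicate's execution context via thisArg,
      // and pass the index and array as the spec requires.
      if (fn.call(thisArg, item, i, array)) results.push(item);
    }
  }
  return results;
}

// On a sparse array, the hole at index 1 is skipped, matching the
// native behavior; the bare for loop above would visit it as undefined.
var holes = specFilter([1, , 3], function () { return true; }); // [1, 3]
```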
The moral of the story here is twofold. First, test your assumptions from time to time. Sites like jsPerf make this really easy. Second, when in doubt, read whatever available documentation you can find. Mozilla's JavaScript documentation is a great place to start.