Counting the number of non-NaN elements in a numpy ndarray in Python

How do I count the number of non-NaN elements in a numpy ndarray in Python?


np.count_nonzero(~np.isnan(data))

~ inverts the boolean array returned by np.isnan.

np.count_nonzero counts values that are not 0/False. .sum() should give the same result, but using count_nonzero is arguably clearer.
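
For instance, on a small array (a minimal sketch; the array values are just illustrative):

import numpy as np

data = np.array([1.0, np.nan, 3.0, np.nan, 5.0])
mask = ~np.isnan(data)           # True where the element is not NaN

print(np.count_nonzero(mask))    # 3
print(mask.sum())                # 3 -- same count, since True counts as 1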

Testing speed:

In [23]: data = np.random.random((10000,10000))
In [24]: data[[np.random.random_integers(0,10000, 100)],:][:, [np.random.random_integers(0,99, 100)]] = np.nan
In [25]: %timeit data.size - np.count_nonzero(np.isnan(data))
1 loops, best of 3: 309 ms per loop
In [26]: %timeit np.count_nonzero(~np.isnan(data))
1 loops, best of 3: 345 ms per loop
In [27]: %timeit data.size - np.isnan(data).sum()
1 loops, best of 3: 339 ms per loop

data.size - np.count_nonzero(np.isnan(data)) seems to be marginally the fastest here; other data might give different relative speeds.
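
If you need this in several places, the counting logic can be wrapped in a small helper. A minimal sketch (the name count_non_nan is purely illustrative, not part of NumPy):

import numpy as np

def count_non_nan(arr):
    """Return the number of elements in arr that are not NaN."""
    return arr.size - np.count_nonzero(np.isnan(arr))

data = np.array([[1.0, np.nan], [2.0, 3.0]])
print(count_non_nan(data))  # 3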


Quick-to-write alternative

Even though it is not the fastest choice, if performance is not an issue you can use:

sum(~np.isnan(data))

Performance:

In [7]: %timeit data.size - np.count_nonzero(np.isnan(data))
10 loops, best of 3: 67.5 ms per loop
In [8]: %timeit sum(~np.isnan(data))
10 loops, best of 3: 154 ms per loop
In [9]: %timeit np.sum(~np.isnan(data))
10 loops, best of 3: 140 ms per loop
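
Note that for a 2-D array, Python's built-in sum iterates over the first axis and returns per-column counts rather than a single total, whereas np.sum reduces the whole array to one scalar. A small sketch to illustrate (the array values are just an example):

import numpy as np

data = np.array([[1.0, np.nan, 3.0],
                 [np.nan, 5.0, 6.0]])

print(sum(~np.isnan(data)))     # array([1, 1, 2]) -- per-column counts
print(np.sum(~np.isnan(data)))  # 4 -- total count of non-NaN elements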


Another, somewhat slower alternative is to do it via boolean indexing.

np.isnan(data)[np.isnan(data) == False].size

In [30]: %timeit np.isnan(data)[np.isnan(data) == False].size
1 loops, best of 3: 498 ms per loop

The double use of np.isnan(data) and the == comparison might be a bit of overkill, so I posted this answer only for completeness.
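
If you prefer the indexing route, a slightly leaner variant is to mask the data itself and take the size of the result. A minimal sketch that gives the same count while computing the mask only once:

import numpy as np

data = np.array([1.0, np.nan, 3.0, np.nan])
mask = ~np.isnan(data)     # compute the NaN mask once
print(data[mask].size)     # 2 -- number of non-NaN elements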