Compound assignment operators in Python's Numpy library
Posted by Leonard on Programmers, published 2012-06-12T16:34:47Z
Tags: python
The "vectorizing" of fancy indexing by Python's numpy library sometimes gives unexpected results. For example:
import numpy

a = numpy.zeros((1000, 4), dtype='uint32')
b = numpy.zeros((1000, 4), dtype='uint32')

# random row and column indices, repeats allowed
i = numpy.random.random_integers(0, 999, 1000)
j = numpy.random.random_integers(0, 3, 1000)

a[i, j] += 1                    # vectorized fancy-indexed increment

for k in xrange(1000):          # explicit loop over the same index pairs
    b[i[k], j[k]] += 1
This gives different results in the arrays 'a' and 'b': an index pair (i, j) ends up as 1 in 'a' no matter how many times it repeats, whereas 'b' counts the repeats. This is easily verified as follows:
>>> numpy.sum(a)
883
>>> numpy.sum(b)
1000
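
The discrepancy comes from how the augmented assignment is evaluated: a[i, j] += 1 behaves roughly like a[i, j] = a[i, j] + 1, so all the old values are read first, incremented, and then written back, and duplicate index pairs simply overwrite each other with the same value. A minimal illustration on a toy array (the names x and idx are just for this example):

import numpy

x = numpy.zeros(3, dtype='uint32')
idx = numpy.array([0, 0, 1])    # index 0 is repeated

x[idx] += 1                     # both reads of x[0] see the old value 0
print(x)                        # [1 1 0], not [2 1 0]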
It is also notable that the fancy-indexing version is almost two orders of magnitude faster than the for loop. My question is: is there an efficient way for numpy to compute the repeat counts that the for loop computes in the example above?
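
For reference, a minimal sketch of two approaches that are often suggested for this kind of scatter-add, assuming a NumPy recent enough (1.8+) to provide numpy.add.at; neither is from the original post:

import numpy

a = numpy.zeros((1000, 4), dtype='uint32')
i = numpy.random.randint(0, 1000, 1000)   # row indices, possibly repeated
j = numpy.random.randint(0, 4, 1000)      # column indices

# numpy.add.at performs unbuffered in-place addition, so repeated
# (i, j) pairs each contribute their own increment.
numpy.add.at(a, (i, j), 1)

# The same counts can also be obtained with bincount on the flattened
# (row, column) index, which is often faster in practice.
counts = numpy.bincount(i * 4 + j, minlength=1000 * 4).reshape(1000, 4)

assert numpy.array_equal(a, counts)       # both match the loop result
print(a.sum())                            # 1000, every repeat counted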