Compute eigenvectors of image in python



Just a quick note: there are several tools for fitting a Gaussian to an image. The only one I can think of off the top of my head is scikits.learn (now scikit-learn), which isn't completely image-oriented, but I know there are others.

Calculating the eigenvectors of the covariance matrix exactly as you had in mind is computationally expensive: you have to associate each pixel of the image (or a large-ish random sample of pixels) with an x, y point.

Basically, you do something like:

    import numpy as np

    # grid is your image data, here...
    grid = np.random.random((10, 10))

    nrows, ncols = grid.shape
    i, j = np.mgrid[:nrows, :ncols]
    # One (row, col, intensity) variable per row; np.cov expects
    # variables in rows, so don't transpose here.
    coords = np.vstack((i.reshape(-1), j.reshape(-1), grid.reshape(-1)))
    cov = np.cov(coords)
    eigvals, eigvecs = np.linalg.eigh(cov)
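Note that `np.linalg.eigh` returns eigenvalues in ascending order, so the dominant direction is the *last* column of `eigvecs`. A quick standalone sketch of pulling that out (same setup as the snippet above):

```python
import numpy as np

# Same setup as above: stack (row, col, intensity) per pixel.
grid = np.random.random((10, 10))
nrows, ncols = grid.shape
i, j = np.mgrid[:nrows, :ncols]
coords = np.vstack((i.reshape(-1), j.reshape(-1), grid.reshape(-1)))

cov = np.cov(coords)                    # 3x3 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)

# eigh sorts eigenvalues in ascending order, so the principal
# axis is the last column of eigvecs (a unit vector).
principal = eigvecs[:, -1]
```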

You can instead make use of the fact that it's a regularly-sampled image and compute its moments (or "inertial axes"). This will be considerably faster for large images.

As a quick example (I'm reusing part of one of my previous answers, in case you find it useful):

    import numpy as np
    import matplotlib.pyplot as plt

    def main():
        data = generate_data()
        xbar, ybar, cov = inertial_axis(data)

        fig, ax = plt.subplots()
        ax.imshow(data)
        plot_bars(xbar, ybar, cov, ax)
        plt.show()

    def generate_data():
        data = np.zeros((200, 200), dtype=float)
        cov = np.array([[200, 100], [100, 200]])
        ij = np.random.multivariate_normal((100, 100), cov, int(1e5))
        for i, j in ij:
            data[int(i), int(j)] += 1
        return data

    def raw_moment(data, iord, jord):
        nrows, ncols = data.shape
        y, x = np.mgrid[:nrows, :ncols]
        data = data * x**iord * y**jord
        return data.sum()

    def inertial_axis(data):
        """Calculate the x-mean, y-mean, and cov matrix of an image."""
        data_sum = data.sum()
        m10 = raw_moment(data, 1, 0)
        m01 = raw_moment(data, 0, 1)
        x_bar = m10 / data_sum
        y_bar = m01 / data_sum
        u11 = (raw_moment(data, 1, 1) - x_bar * m01) / data_sum
        u20 = (raw_moment(data, 2, 0) - x_bar * m10) / data_sum
        u02 = (raw_moment(data, 0, 2) - y_bar * m01) / data_sum
        cov = np.array([[u20, u11], [u11, u02]])
        return x_bar, y_bar, cov

    def plot_bars(x_bar, y_bar, cov, ax):
        """Plot bars with a length of 2 stddev along the principal axes."""
        def make_lines(eigvals, eigvecs, mean, i):
            """Make lines a length of 2 stddev."""
            std = np.sqrt(eigvals[i])
            vec = 2 * std * eigvecs[:, i] / np.hypot(*eigvecs[:, i])
            x, y = np.vstack((mean - vec, mean, mean + vec)).T
            return x, y

        mean = np.array([x_bar, y_bar])
        eigvals, eigvecs = np.linalg.eigh(cov)
        ax.plot(*make_lines(eigvals, eigvecs, mean, 0), marker='o', color='white')
        ax.plot(*make_lines(eigvals, eigvecs, mean, -1), marker='o', color='red')
        ax.axis('image')

    if __name__ == '__main__':
        main()
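As a quick sanity check on the moment formulas: for a single bright pixel, the first-order moments should recover exactly that pixel's coordinates. A minimal standalone restatement of `raw_moment` so the check runs on its own:

```python
import numpy as np

def raw_moment(data, iord, jord):
    """Raw image moment: sum over pixels of value * x**iord * y**jord."""
    nrows, ncols = data.shape
    y, x = np.mgrid[:nrows, :ncols]
    return (data * x**iord * y**jord).sum()

data = np.zeros((5, 5))
data[3, 2] = 1.0  # one bright pixel at row 3 (y) and column 2 (x)

m00 = data.sum()
x_bar = raw_moment(data, 1, 0) / m00  # 2.0 (the column)
y_bar = raw_moment(data, 0, 1) / m00  # 3.0 (the row)
```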

(Resulting plot: the generated image with 2-stddev bars drawn along its principal axes.)


Fitting a Gaussian robustly can be tricky. There was a fun article on this topic in the IEEE Signal Processing Magazine:

Hongwei Guo, "A Simple Algorithm for Fitting a Gaussian Function," IEEE Signal Processing Magazine, September 2011, pp. 134-137.

I give an implementation of the 1D case here:

http://scipy-central.org/item/28/2/fitting-a-gaussian-to-noisy-data-points

(Scroll down to see the resulting fits)
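For reference, the core idea in Guo's article is to fit a parabola to log(y) using iteratively reweighted least squares, which suppresses the noise amplification that a naive log-transform suffers. A rough sketch of that idea, assuming strictly positive data (the function name and iteration count are my own, not from the article):

```python
import numpy as np

def fit_gaussian_1d(x, y, n_iter=5):
    """Fit y ~ A * exp(-(x - mu)**2 / (2 * sigma**2)).

    Sketch of the log-parabola approach: log y = a + b*x + c*x**2,
    solved by weighted least squares and reweighted with the current
    fit on each pass. Assumes y > 0.
    """
    w = y.copy()
    X = np.column_stack((np.ones_like(x), x, x**2))
    logy = np.log(y)
    for _ in range(n_iter):
        W = w**2
        # Weighted normal equations for the log-parabola coefficients.
        a, b, c = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * logy))
        mu = -b / (2 * c)
        sigma2 = -1.0 / (2 * c)
        A = np.exp(a - b**2 / (4 * c))
        # Reweight with the current model to damp noisy small values.
        w = A * np.exp(-(x - mu)**2 / (2 * sigma2))
    return A, mu, np.sqrt(sigma2)
```

On noiseless data this recovers the parameters essentially exactly; the reweighting only starts to matter once the tails are noisy.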


Did you try Principal Component Analysis (PCA)? Maybe the MDP package could do the job with minimal effort.
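For what it's worth, the core of PCA here is the same eigendecomposition as in the other answers; a minimal NumPy sketch of what an MDP (or scikit-learn) PCA would compute on a cloud of points, with made-up sample data standing in for pixel coordinates:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for pixel coordinates: points from an anisotropic Gaussian.
pts = rng.multivariate_normal([0.0, 0.0], [[4.0, 1.0], [1.0, 1.0]], size=10000)

# Center the data and form the sample covariance matrix.
centered = pts - pts.mean(axis=0)
cov = centered.T @ centered / (len(pts) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)

principal_axis = eigvecs[:, -1]   # unit vector of greatest variance
explained = eigvals[::-1]         # component variances, largest first
```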