Out-of-memory algorithms for addressing large arrays

Posted by reve_etrange on Stack Overflow
Published on 2010-05-21T06:05:52Z

I am trying to deal with a very large dataset. I have k = ~4200 matrices (of varying sizes) which must be compared combinatorially, skipping non-unique and self comparisons. Each of the k(k-1)/2 comparisons produces a matrix, which must be indexed against its parents (i.e. I must be able to find out where it came from). The convenient way to do this is to (triangularly) fill a k-by-k cell array with the result of each comparison. The results are ~100 x ~100 matrices, on average. Using single-precision floats, the whole thing works out to roughly 400 GB.
I need to 1) generate the cell array, or pieces of it, without trying to hold the whole thing in memory, and 2) access its elements (and their elements) in the same fashion. My attempts so far have been inefficient due to reliance on MATLAB's eval() as well as save and clear calls occurring inside loops.
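One possibility worth noting: MATLAB releases from R2011b on include matfile, which reads and writes individual variables in a v7.3 MAT-file without loading the rest of the file into memory. A minimal sketch of the row-at-a-time storage described above, assuming data, k, and compare() as in the question (the file name results.mat and the H<i> variable names are made up):

    %# Sketch only: requires R2011b+ and a v7.3 MAT-file for partial I/O.
    m = matfile('results.mat', 'Writable', true);
    for i = 1:k
        row = cell(1, k-i);                %# results for j = i+1 .. k
        for j = i+1:k
            row{j-i} = compare(data{i}, data{j});
        end
        m.(sprintf('H%d', i)) = row;       %# one variable per row; only this row is in RAM
        clear row
    end
    %# Later, fetch just row i without loading the other rows:
    row = m.(sprintf('H%d', i));

Dynamic variable names on the matfile object (m.(name)) avoid eval() entirely; note that cell-array variables are still read and written whole, so one-row-per-variable keeps each transfer small.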

for i=1:k
    cur_var = ['H' int2str(i)];
    %# if i == 1; save('FileName'); end; %# If using a single MAT file and need to create it.
    eval([cur_var ' = cell(1,k-i);']); %# Row i of the triangle: one cell per j > i.
    for j=i+1:k
        %# Cell j-i of H<i> holds the comparison of data{i} and data{j}.
        eval([cur_var '{j-i} = compare(data{i},data{j});']);
    end
    save(cur_var,cur_var); %# Add '-append' when using a single MAT file.
    clear(cur_var);
end
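The eval() calls can be dropped entirely by keeping the variable name fixed and saving through a struct, with one MAT file per row to bound memory use. A hedged sketch, assuming data, k, and compare() as in the question (the H%04d.mat file names are made up):

    %# Same row-at-a-time strategy, no eval(): one MAT file per row i.
    for i = 1:k
        row = cell(1, k-i);              %# results for j = i+1 .. k
        for j = i+1:k
            row{j-i} = compare(data{i}, data{j});
        end
        s.row = row;                     %# wrap in a struct so the saved name is fixed
        save(sprintf('H%04d.mat', i), '-struct', 's');
        clear row s
    end
    %# Later, to fetch the comparison of matrices i and j (i < j):
    %# s = load(sprintf('H%04d.mat', i), 'row');  result = s.row{j-i};

save('-struct', ...) writes the struct's fields as top-level variables, so every file contains a variable named row regardless of i, and load() can retrieve it by name.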

The other thing I have done is to perform the split whenever mod((i+j-1)/2, max(factor(k*(k-1)/2))) == 0. This divides the results into the largest possible number of same-size pieces, which seems logical. The indexing is a little more complicated, but not too bad, because a linear index can be used.
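For the triangular layout, a pair (i,j) with i &lt; j maps to a linear index directly. One common row-by-row convention (a sketch of that mapping, not necessarily the exact split described above):

    %# Map (i,j), i < j, onto 1 .. k*(k-1)/2, walking the strict upper
    %# triangle row by row.
    pair2lin = @(i,j,k) (i-1)*(2*k-i)/2 + (j-i);
    %# e.g. with k = 4: (1,2)->1, (1,3)->2, (1,4)->3, (2,3)->4, (3,4)->6

Row i starts after the (i-1) previous rows, which together hold (i-1)*(2*k-i)/2 entries; adding the offset j-i within the row gives the index.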

Does anyone know/see a better way?
