What changes when your input is giga/terabyte sized?

Posted by Wang on Stack Overflow
Published on 2010-06-10

I just took my first baby step into real scientific computing today, when I was shown a data set whose smallest file is 48,000 fields by 1,600 rows (haplotypes for several people, for chromosome 22). And this is considered tiny.
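For scale, a quick back-of-envelope calculation (assuming one byte per field, which is a guess on my part):

```python
# Rough memory footprint of the "tiny" chromosome 22 file.
fields, rows = 48_000, 1_600
cells = fields * rows                              # 76.8 million cells
print(f"{cells / 1e6:.0f} MB at 1 byte per cell")  # ~77 MB
print(f"{cells * 8 / 1e9:.2f} GB as float64")      # ~0.61 GB
```

So even the smallest file is awkward to handle naively, and the full data set is orders of magnitude bigger.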

I write Python, so I've spent the last few hours reading about HDF5, NumPy, and PyTables, but I still feel like I'm not really grokking what a terabyte-sized data set actually means for me as a programmer.
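From the docs I've skimmed so far, the whole point of HDF5/PyTables seems to be keeping the array on disk and reading it one slice at a time. Here is a minimal sketch of what I think that looks like; the file name chr22.h5 and the node /haplotypes are invented for illustration:

```python
import tables  # PyTables

CHUNK = 1_000  # rows per slice; tune to how much RAM you want to spend

with tables.open_file("chr22.h5", mode="r") as h5:
    haplos = h5.root.haplotypes              # just a handle; nothing read yet
    col_sums = 0
    for start in range(0, haplos.shape[0], CHUNK):
        block = haplos[start:start + CHUNK]  # only this slice hits memory
        col_sums = col_sums + block.sum(axis=0)  # stand-in for real work
```

If I've got this right, the key shift is that indexing the node triggers disk I/O for just that slice, so peak memory is bounded by CHUNK rather than by the file size.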

For example, someone pointed out that with larger data sets, it becomes impossible to read the whole thing into memory, not because the machine has insufficient RAM, but because the architecture has insufficient address space: a 32-bit pointer can only reach 4 GB, no matter how much RAM is installed. It blew my mind.
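If I've understood correctly, one standard workaround is to let the OS page data in on demand instead of read()-ing the whole file. A sketch with numpy.memmap, where the file name and dtype are my assumptions:

```python
import numpy as np

# Map a raw binary file of one-byte genotype codes without loading it.
data = np.memmap("haplotypes.bin", dtype=np.uint8, mode="r")

# Stream over it in fixed-size windows; the OS faults pages in as needed.
STEP = 10_000_000
total = 0
for start in range(0, data.shape[0], STEP):
    total += int(data[start:start + STEP].sum())
print(total)
```

Even the mapping itself consumes address space, though, which is exactly why a 32-bit process tops out around 4 GB regardless of how much RAM the machine has.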

What other assumptions have I been relying on in the classroom that just don't work with input this big? What kinds of things do I need to start doing or thinking about differently? (This doesn't have to be Python specific.)


Tags: python, large-data-volumes