Any efficient way to read data from a large binary file?

Posted by limi on Stack Overflow
Published on 2009-08-17T12:40:41Z Indexed on 2010/05/25 11:31 UTC


Hi,

I need to handle tens of gigabytes of data in one binary file. Each record in the data file is variable length.

So the file is like:

<len1><data1><len2><data2>..........<lenN><dataN>
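A minimal sketch of parsing this kind of stream, assuming each length prefix is a 4-byte little-endian unsigned int (the question does not say what size the `<len>` field actually is):

```python
import struct
from io import BytesIO

def iter_records(f):
    """Yield each variable-length record from a <len><data> stream.

    Assumes a 4-byte little-endian unsigned length prefix; adjust the
    format string if the real file uses a different prefix.
    """
    header = struct.Struct("<I")
    while True:
        raw = f.read(header.size)
        if len(raw) < header.size:
            break  # clean end of file (or truncated header)
        (length,) = header.unpack(raw)
        yield f.read(length)

# Usage with a small in-memory stream of two records
buf = BytesIO(struct.pack("<I3s", 3, b"abc") + struct.pack("<I2s", 2, b"xy"))
records = list(iter_records(buf))
```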

The data contains integers, pointers, double values, and so on.

I found Python cannot handle this situation well. There is no problem if I read the whole file into memory; that part is fast. But the struct package seems to perform poorly: it spends almost all of its time unpacking the bytes.
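One common way to cut the unpacking overhead is to precompile the format with `struct.Struct` and scan the file through `mmap` with `unpack_from`, so no per-record header slice is copied. This is a sketch, not the asker's code, and it again assumes a 4-byte little-endian length prefix:

```python
import mmap
import os
import struct
import tempfile

def iter_records_mmap(path):
    """Scan <len><data> records in place via mmap.

    Precompiling struct.Struct and calling unpack_from with an offset
    avoids allocating a bytes object per header. Assumes 4-byte
    little-endian length prefixes (an assumption, not from the question).
    """
    header = struct.Struct("<I")
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            offset, end = 0, len(mm)
            while offset + header.size <= end:
                (length,) = header.unpack_from(mm, offset)
                offset += header.size
                yield mm[offset:offset + length]
                offset += length

# Usage: write two sample records to a temp file, then scan them back.
payloads = [b"abc", b"wxyz"]
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    for p in payloads:
        f.write(struct.pack("<I", len(p)) + p)
records = list(iter_records_mmap(path))
os.remove(path)
```

If the record payloads themselves have a fixed layout, batching them through a single precompiled `struct.Struct` (or `numpy.frombuffer`) is usually much faster than calling `struct.unpack` once per field.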

Any help is appreciated.

Thanks.

© Stack Overflow or respective owner
