Any strategies for assessing the trade-off between CPU loss and memory gain from compression of data
Posted by indiehacker on Stack Overflow
Published on 2010-04-06T05:22:48Z
Tags: google-app-engine | google-datastore
Are very large TextProperties a burden? Should they be compressed?
Say I have information stored in two attributes of type TextProperty in my datastore entities. The strings are always 65,000 characters long and contain long runs of repeating integers, for example:
entity.pixel_idx = 0,0,0,0,0,0,0,0,0,1,1,1,1,1,1,1,1,1,1,1,5,5,5,5,5,5,5,5,5,5,5,5....etc.
entity.pixel_color = 2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,1,1,1,1,1,1,1,1,1,1,1,1,...etc.
These strings could be stored far more compactly by run-length encoding them, recording each integer together with the length of its run ('0,8' for '0,0,0,0,0,0,0,0'), but then it takes time and CPU to compress and decompress.
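To make the idea concrete, here is a rough sketch of the run-length encoding I have in mind (the function names are just illustrative; I use ':' between value and run length so the pairs stay unambiguous next to the ',' separators):

    from itertools import groupby

    def rle_encode(s):
        # '0,0,0,1,1' -> '0:3,1:2'
        return ','.join('%s:%d' % (value, len(list(run)))
                        for value, run in groupby(s.split(',')))

    def rle_decode(s):
        # '0:3,1:2' -> '0,0,0,1,1'
        parts = []
        for pair in s.split(','):
            value, count = pair.split(':')
            parts.extend([value] * int(count))
        return ','.join(parts)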
Any general ideas?
Are there tricks for testing and comparing different approaches to this problem?
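For testing, I imagine timing a round trip with the standard timeit and zlib modules on a synthetic sample, something like the following (the sample string is made-up data in roughly the same shape and size as mine):

    import timeit
    import zlib

    # Synthetic sample: ~32,000 comma-separated values, ~64,000 characters.
    raw = ','.join((['0'] * 9 + ['1'] * 11 + ['5'] * 12) * 1000)

    packed = zlib.compress(raw.encode('utf-8'), 6)
    print('raw bytes:        %d' % len(raw))
    print('compressed bytes: %d' % len(packed))

    # Time 100 runs of each direction of the round trip.
    t_comp = timeit.timeit(lambda: zlib.compress(raw.encode('utf-8'), 6), number=100)
    t_decomp = timeit.timeit(lambda: zlib.decompress(packed), number=100)
    print('compress:   %.4f s per 100 runs' % t_comp)
    print('decompress: %.4f s per 100 runs' % t_decomp)

(If compression wins, I gather the compressed bytes would go in a db.BlobProperty rather than a TextProperty, since they are no longer valid text.)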