Fastest method of merging the two: dicts vs lists

Posted by tipu on Stack Overflow
Published on 2010-05-17T12:46:48Z

Filed under: python | efficiency

I'm doing some indexing; memory is plentiful, but CPU time is the bottleneck. I have one huge dictionary and a smaller dictionary that I'm merging into the bigger one:

big_dict = {"the" : {"1" : 1, "2" : 1, "3" : 1, "4" : 1, "5" : 1}}
smaller_dict = {"the" : {"6" : 1, "7" : 1}}
#after merging
resulting_dict = {"the" : {"1" : 1, "2" : 1, "3" : 1, "4" : 1, "5" : 1, "6" : 1, "7" : 1}}

My question is about the values in both dicts: should I use a dict (as displayed above) or a list (as displayed below), given that my priority is to spend memory freely in order to save CPU?

For clarification, using a list would look like:

big_dict = {"the" : [1, 2, 3, 4, 5]}
smaller_dict = {"the" : [6,7]}
#after merging
resulting_dict = {"the" : [1, 2, 3, 4, 5, 6, 7]}

Side note: the reason I'm using a dict nested in a dict rather than a set nested in a dict is that json.dumps can't serialize a set; as far as the JSON library is concerned, a set isn't key/value pairs, it's just {"a", "series", "of", "keys"}.
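
(For what it's worth, the only workaround I know of is to convert each set to a list right before serializing; a rough sketch, assuming the inner values are document IDs stored as strings:)

import json

index = {"the": set(["1", "2", "3", "4", "5"])}
# json.dumps raises TypeError on sets but handles lists fine,
# so convert each inner set to a (sorted) list just before dumping.
serializable = dict((term, sorted(ids)) for term, ids in index.items())
print(json.dumps(serializable))  # {"the": ["1", "2", "3", "4", "5"]}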

Also, after choosing between a dict and a list, how would I go about implementing the most CPU-efficient way of merging them?
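
In case it matters, this is roughly how I'd compare the two approaches (a sketch using timeit; the sizes are arbitrary, and number is kept at 1 so later runs aren't merging keys that are already present):

import timeit

setup_dict = ('big = dict((str(i), 1) for i in range(100000)); '
              'small = dict((str(i), 1) for i in range(100000, 100200))')
setup_list = ('big = [str(i) for i in range(100000)]; '
              'small = [str(i) for i in range(100000, 100200)]')

# Each repeat rebuilds big/small via setup, then times a single merge.
dict_time = min(timeit.repeat('big.update(small)', setup=setup_dict, repeat=5, number=1))
list_time = min(timeit.repeat('big.extend(small)', setup=setup_list, repeat=5, number=1))
print("dict.update: %f" % dict_time)
print("list.extend: %f" % list_time)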

I appreciate the help.

© Stack Overflow or respective owner
