speeding up parsing of files

Posted by user248237 on Stack Overflow
Published on 2010-06-05T18:35:41Z
The following function parses a CSV file into a list of dictionaries, where each element of the list is a dictionary whose values are indexed by the file's header (assumed to be the first line).

This function is very slow, taking about 6 seconds on a relatively small file (fewer than 30,000 lines).

How can I speed it up?

def csv2dictlist_raw(filename, delimiter='\t'):
    with open(filename) as f:
        # the first line is the header; its fields become the dict keys
        header_fields = f.readline().strip().split(delimiter)
        dictlist = []
        # convert each data line to a dictionary keyed by the header
        for line in f:
            values = map(tryEval, line.strip().split(delimiter))
            dictline = dict(zip(header_fields, values))
            dictlist.append(dictline)
    return (dictlist, header_fields)
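For comparison, one common speedup is to let the standard-library csv module do the splitting and dict construction, which runs in optimized code, and to defer per-cell type conversion (if tryEval wraps eval, calling it on every cell is likely the dominant cost). A minimal sketch, assuming plain string values are acceptable and conversion is done later only for the columns that need it:

```python
import csv

def csv2dictlist_fast(filename, delimiter='\t'):
    # csv.DictReader reads the header line itself and yields one dict per row;
    # values are left as strings, so convert only the columns that need it
    with open(filename, newline='') as f:
        reader = csv.DictReader(f, delimiter=delimiter)
        rows = list(reader)
        return rows, reader.fieldnames
```

If numeric columns are needed, converting them in a second pass with float() or ast.literal_eval on just those columns is typically much cheaper than trying to eval every cell.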

thanks.

© Stack Overflow or respective owner
