python - Improve loop efficiency
I have 21,000 rows that I'm pulling from a sqlite3 database, and I need to put each one into a dictionary appended to an array. The loop takes 1.531 seconds, and I'm trying to minimize that as much as I can. What's a faster way I can do this? (I am using Python 2.7.3.)
    results = cursor.fetchall()
    array = []
    for row in results:
        a = {}
        b = row[11]
        c = str(row[8])
        d = str(row[9]) + " ? " + str(row[10])
        e = row[3]
        f = row[4]
        key = e + " - " + f
        rowtoadd = {
            'aaa': row[3],
            'bbb': row[0],
            'ccc': row[1],
            'ddd': row[12],
            'eee': row[2],
            'fff': row[9],
            'ggg': row[1],
            'hhh': str(row[6]) + " (" + str(row[5]) + ")",
            'iii': e,
            'jjj': d,
            'kkk': f
        }
        array.append(rowtoadd)
Is there a way to directly map the cursor's output to a list of dictionaries with custom keys?
Is there a more Pythonic way to speed this up?
Leave the concatenations to SQLite instead, and use sqlite3.Row as the row factory, which gives you an object that can be used both as a tuple and as a dictionary.
Have the query name the keys by using column aliases:
    cursor.execute('''
        SELECT col3 AS aaa, col9 AS bbb,
               col6 || ' (' || col5 || ')' AS hhh,
               col9 || ' ? ' || col10 AS jjj
        FROM some_table
    ''')
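For reference, here is a minimal sketch of how the two pieces fit together; the connection name conn, the database file name, and the table/column names are placeholders carried over from the query above:

    import sqlite3

    conn = sqlite3.connect('some.db')   # placeholder database file
    conn.row_factory = sqlite3.Row      # every fetched row becomes a sqlite3.Row
    cursor = conn.cursor()

    cursor.execute('''
        SELECT col3 AS aaa, col9 AS bbb,
               col6 || ' (' || col5 || ')' AS hhh,
               col9 || ' ? ' || col10 AS jjj
        FROM some_table
    ''')

    for row in cursor:
        print row['hhh']   # access by alias, like a dictionary
        print row[0]       # or by position, like a tuple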
Then use that result directly instead of building a new list. If you must have a list of these, you can use:
array = cursor.fetchall()
or
array = list(cursor)
But you are better off just iterating over the results as you need access to the rows.
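If some downstream code really does require plain dictionaries with those keys, a sqlite3.Row can be converted with dict(), since it supports keys(); a short sketch, again assuming the aliased query above:

    # only if a list of real dicts is required; otherwise iterate the cursor directly
    array = [dict(row) for row in cursor]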