My list contains 100k records, and I'm sending each record to async functions via a for loop, which isn't working efficiently. Is there a more efficient way to send all the records to the async function than this?
import asyncio

import pandas as pd

df = pd.read_csv('myfile.csv')
to_list = df['val'].tolist()
data = []

async def url(val):
    ...  # do something here

async def ftp_url(val):
    result = await url(val)
    # merge url() results with ftp_url() results (final_results)
    return final_results

async def main():
    tasks = [ftp_url(val) for val in to_list]  # how can I change this loop to handle huge data volumes (100k)?
    my_results = await asyncio.gather(*tasks)
    data.extend(my_results)
    return data

if __name__ == "__main__":
    data = asyncio.run(main())  # run the event loop
    write_to_csv = pd.DataFrame(data)
    write_to_csv.to_csv('my_final_output.csv')
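For context, one common pattern at this scale (not part of the question, just a minimal sketch) is to cap concurrency with an asyncio.Semaphore so that only a bounded number of coroutines do I/O at once. The CONCURRENCY value below is an assumed tuning knob, and the sketch reuses the ftp_url() and to_list names defined above:

import asyncio

CONCURRENCY = 500  # assumed limit; tune to your workload and server capacity

async def bounded_ftp_url(sem, val):
    # the semaphore ensures at most CONCURRENCY ftp_url() calls run at once,
    # so 100k tasks don't all hit the network simultaneously
    async with sem:
        return await ftp_url(val)

async def main():
    sem = asyncio.Semaphore(CONCURRENCY)
    tasks = [bounded_ftp_url(sem, val) for val in to_list]
    return await asyncio.gather(*tasks)

Creating 100k task objects up front is usually fine memory-wise; it is the unbounded concurrent I/O that tends to cause trouble.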
question from:
https://stackoverflow.com/questions/65933623/how-to-iterate-through-huge-data-volumeslist-via-for-loop-asyncio