I've got a curly one (at least for me) for you.
I'm pretty new to Go, but I need to process some .csv files into a database, which would seem to be pretty easy if the csv files were consistent. However, they're not.
The PostgreSQL database table will have fields for all possible .csv columns, with the non-existent ones left blank.
Each file says, in its header line, which columns it contains.
To be exact: starttime, endtime, b1, c1, f3, and so on, where one file may have only b1 populated, while another may have six of these columns.
Is there a way of handling this in Go, or should I be looking at some other way to do this?
I'm not sure where to even start here. Should I somehow be building a dynamic struct to hold the data from each file, or would I make a struct holding all possible fields and just populate the ones a given file has?
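To show the sort of thing I mean, here's a rough, untested sketch of the second idea: read the header with encoding/csv and use it to decide which struct field each value goes into. The struct and column names are just the examples from above, not the real schema.

```go
package main

import (
	"encoding/csv"
	"fmt"
	"os"
)

// Record holds every column that could appear in any of the files;
// columns missing from a given file just stay at their zero value.
// (Field names here are only the examples from the question.)
type Record struct {
	StartTime string
	EndTime   string
	B1        string
	C1        string
	F3        string
}

func readFile(path string) ([]Record, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	r := csv.NewReader(f)

	// The first line tells us which columns this particular file contains.
	header, err := r.Read()
	if err != nil {
		return nil, err
	}

	var records []Record
	for {
		row, err := r.Read()
		if err != nil {
			break // io.EOF or a parse error; real code should distinguish them
		}

		var rec Record
		// Use the header to decide which struct field each value belongs to.
		for i, col := range header {
			switch col {
			case "starttime":
				rec.StartTime = row[i]
			case "endtime":
				rec.EndTime = row[i]
			case "b1":
				rec.B1 = row[i]
			case "c1":
				rec.C1 = row[i]
			case "f3":
				rec.F3 = row[i]
			}
		}
		records = append(records, rec)
	}
	return records, nil
}

func main() {
	recs, err := readFile("data.csv")
	if err != nil {
		fmt.Println("read failed:", err)
		return
	}
	fmt.Printf("read %d records\n", len(recs))
}
```

Is something along those lines reasonable, or is there a more idiomatic way?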
Any suggestions or pointers please?
Sorry, I should have mentioned another wrinkle, which I think unfortunately rules out the COPY command.
There may be "replacement" rows in the files, so each row needs to be versioned: if a later version of a row (for a given timestamp) arrives, its version number needs to be raised (the version number itself won't be in the csv file).
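For the versioning part, this is roughly what I was picturing on the insert side: give a row version 1 if its key hasn't been seen before, otherwise max(existing version) + 1. Again untested, and the table name, key column, and connection string are all placeholders.

```go
package main

import (
	"database/sql"
	"log"

	_ "github.com/lib/pq"
)

// insertVersioned writes one row, assigning version 1 the first time a key
// appears and max(existing version)+1 on later appearances. The table and
// column names ("readings", "starttime", "b1", "version") are invented for
// illustration; the real key is whatever identifies "the same row".
func insertVersioned(db *sql.DB, starttime, b1 string) error {
	_, err := db.Exec(`
		INSERT INTO readings (starttime, b1, version)
		VALUES ($1, $2,
			COALESCE((SELECT MAX(version) FROM readings WHERE starttime = $1), 0) + 1)`,
		starttime, b1)
	return err
}

func main() {
	// Placeholder connection string.
	db, err := sql.Open("postgres", "postgres://user:pass@localhost/mydb?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	if err := insertVersioned(db, "2021-02-05 10:00", "42"); err != nil {
		log.Fatal(err)
	}
}
```

Would doing the version lookup in SQL like this be sensible, or should that logic live in the Go code instead?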
question from:
https://stackoverflow.com/questions/66057524/import-csv-data-into-a-database-in-golang-but-with-a-variable-number-of-fields