I have a wide CSV file of about 350 MB, and I want to load it into a SQL database and properly model the data to make it easier to use for analysis. I see two options:
- Split the data into tables with Python first, then load them into SQL.
- Or load the whole file into the database as a single staging table, then split it using SQL.
What would be the standard approach? Or how should I choose?
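For reference, here's a minimal sketch of what I mean by the second option, using SQLite and made-up column names (`order_id`, `customer`, etc.) standing in for my real file:

```python
import csv
import io
import sqlite3

# Hypothetical sample standing in for the real 350 MB CSV.
raw = io.StringIO(
    "order_id,customer,product,qty\n"
    "1,alice,apple,3\n"
    "2,bob,banana,5\n"
)

conn = sqlite3.connect(":memory:")

# Step 1: load the file as-is into one wide staging table.
conn.execute(
    "CREATE TABLE staging (order_id INTEGER, customer TEXT, product TEXT, qty INTEGER)"
)
reader = csv.reader(raw)
next(reader)  # skip the header row
conn.executemany("INSERT INTO staging VALUES (?, ?, ?, ?)", reader)

# Step 2: split it into normalized tables entirely in SQL.
conn.execute("CREATE TABLE customers AS SELECT DISTINCT customer AS name FROM staging")
conn.execute(
    """
    CREATE TABLE orders AS
    SELECT s.order_id, c.rowid AS customer_id, s.product, s.qty
    FROM staging s
    JOIN customers c ON c.name = s.customer
    """
)

print(conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # prints 2
```

The first option would do the same splitting with something like pandas before any database is involved.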