How do I export a large table into 50 smaller CSV files of 100,000 records each?
- by Eddie
I am trying to export one field from a very large table (around 5,000,000 records, for example) to CSV, but not all at once: I want 100,000 records in each .csv file created, with no record duplicated across files. How can I do this, please?
I tried:
SELECT field_name
FROM table_name
WHERE certain_conditions_are_met
INTO OUTFILE '/tmp/name_of_export_file_for_first_100000_records.csv'
LINES TERMINATED BY '\n'
LIMIT 0, 100000
That gives the first 100,000 records, but nothing I do exports the other 4,900,000 records into 49 more files. How do I specify the other 49 filenames?
For example, I tried the following, but the SQL syntax is wrong:
SELECT field_name
FROM table_name
WHERE certain_conditions_are_met
INTO OUTFILE '/home/user/Eddie/name_of_export_file_for_first_100000_records.csv'
LINES TERMINATED BY '\n'
LIMIT 0, 100000
INTO OUTFILE '/home/user/Eddie/name_of_export_file_for_second_100000_records.csv'
LINES TERMINATED BY '\n'
LIMIT 100001, 200000
That did not create the second file.
What am I doing wrong, please, and is there a better way to do this? Should the LIMIT 0, 100000 go before the first INTO OUTFILE clause, and should I then repeat the entire command from SELECT for the second 100,000 records, and so on? A rough sketch of what I mean follows.
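Here is a rough sketch of what I mean by repeating the command, with placeholder names (chunk_01.csv, chunk_02.csv, and some_unique_key are just stand-ins for my real paths and column). I am assuming LIMIT takes an offset followed by a row count, and that an ORDER BY on a unique column is needed so the chunks line up without overlap:

-- First 100,000 rows (offset 0), placeholder file name
SELECT field_name
FROM table_name
WHERE certain_conditions_are_met
ORDER BY some_unique_key
LIMIT 0, 100000
INTO OUTFILE '/home/user/Eddie/chunk_01.csv'
LINES TERMINATED BY '\n';

-- Next 100,000 rows (offset 100,000), next placeholder file name
SELECT field_name
FROM table_name
WHERE certain_conditions_are_met
ORDER BY some_unique_key
LIMIT 100000, 100000
INTO OUTFILE '/home/user/Eddie/chunk_02.csv'
LINES TERMINATED BY '\n';

-- ...and so on, adding 100,000 to the offset each time, up to the 50th file.

Is running 50 statements like that the right approach, or is there a better way?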
Thanks for any help.
Eddie