I have a working project, but I want to tune it a bit.
I use: C:\xampp\mysql\bin\mysql --force -u User -p database1 < C:\something\import.sql
The import.sql file contains INSERT statements. The file is about 30 MB in size and grows a little every day. Here is a data example (very basic):
id | DATE   | Test
1  | 1-1-14 | y
2  | 1-2-14 | y
A few entries are added for each date. Yesterday the file had 49999 lines; today it has 50002 lines, and I really only need the 3 new rows of that file! My batch "errors" on the other 49999 lines because they are duplicates (hence the --force).
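For context, here is a minimal sketch of what such an import.sql might look like; the table name results and the column types are my assumptions, not details from the actual file:

    CREATE TABLE IF NOT EXISTS results (
        id   INT PRIMARY KEY,
        date DATE NOT NULL,
        test CHAR(1) NOT NULL
    );

    -- Each daily export repeats all earlier rows, so most of these INSERTs
    -- hit the primary key and are only skipped because of --force.
    INSERT INTO results (id, date, test) VALUES (1, '2014-01-01', 'y');
    INSERT INTO results (id, date, test) VALUES (2, '2014-01-02', 'y');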
Is there any way to speed it up?
It is not possible to give SQL speed advice without knowing the SQL; there are all kinds of complicating factors. Consider editing your question to show more details.
The fastest way to bulk-load data into a MySQL server is LOAD DATA INFILE. That command reads data directly into a table from a flat file, such as a CSV or tab-separated values file. It does not work with files of SQL commands.
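As a rough sketch of how that could look here, assuming the daily data were exported as a CSV at C:/something/import.csv with dates already in YYYY-MM-DD format (the path, the table name results, and the column list are assumptions):

    -- IGNORE skips rows that would violate the primary key,
    -- much like --force skips the failing INSERTs today.
    LOAD DATA LOCAL INFILE 'C:/something/import.csv'
    IGNORE INTO TABLE results
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\r\n'
    (id, date, test);

The IGNORE keyword makes duplicate-key rows be skipped rather than aborting the load, which matches the behaviour --force gives the current batch.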