Welcome to ShenZhenJia Knowledge Sharing Community for programmer and developer-Open, Learning and Share
I have a PHP script that parses XML files and generates a large SQL file that looks something like this:

INSERT IGNORE INTO table(field1,field2,field3...)
VALUES ("value1","value2",int1...),
("value1","value2",int1)...etc

The file ends up over 20GB (I've also tested with a 2.5GB file, which fails too).

I've tried commands like:

mysql -u root -p table_name < /var/www/bigfile.sql

This works on smaller files, say around 50MB, but it doesn't work with a larger file.

I tried:

mysql> source /var/www/bigfile.sql

I also tried mysqlimport, but it won't even process my file properly.

I keep getting an error that says

ERROR 2006 (HY000): MySQL server has gone away

It happens roughly 30 seconds after execution starts.

I set max_allowed_packet to 4GB, but when I verify it with SHOW VARIABLES it only shows 1GB.

Is there a way to do this without wasting another 10 hours?


1 Answer

Try splitting the file into multiple smaller INSERT statements. ERROR 2006 here usually means a single statement exceeded max_allowed_packet, and that variable has a hard maximum of 1GB (which is why your 4GB setting reads back as 1GB), so a multi-gigabyte single INSERT can never fit in one packet.
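As a concrete sketch of that splitting step, here is a hypothetical Python helper. It assumes the dump follows the layout shown in the question: a header line ending in `VALUES`, then one `(...)` row tuple per line, comma-separated, with `;` at the very end.

```python
# Hypothetical sketch: re-chunk one huge multi-row INSERT into many smaller ones.
# Assumes the dump matches the layout in the question: a header line ending in
# "VALUES", then one "(...)" row tuple per line, comma-separated, ";" at the end.

CHUNK = 10_000  # rows per INSERT; keeps each statement well under max_allowed_packet

def split_insert(src_path, dst_path, chunk=CHUNK):
    with open(src_path) as src, open(dst_path, "w") as dst:
        header = src.readline().rstrip("\n")   # "INSERT IGNORE INTO t(...) VALUES"
        rows = 0
        for line in src:
            tup = line.strip().rstrip(",;")    # normalize "(...)," and "(...);" to "(...)"
            if not tup:
                continue
            if rows % chunk == 0:
                if rows:
                    dst.write(";\n")           # terminate the previous statement
                dst.write(header + "\n")
            else:
                dst.write(",\n")
            dst.write(tup)
            rows += 1
        if rows:
            dst.write(";\n")                   # terminate the final statement
```

Each emitted statement then stays comfortably under max_allowed_packet, and the result can be loaded with the same `mysql -u root -p db_name < file.sql` command as before.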

