I am trying to insert millions of records into a table that has more than 20 indexes.

In the last run it took more than 4 hours per 100,000 rows, and the query was cancelled after 3½ days...

Do you have any suggestions about how to speed this up?

(I suspect the many indexes are the cause. If you also think so, how can I automatically drop the indexes before the operation and then recreate the same indexes afterwards?)

Extra info:

  • The space used by the indexes is about 4 times the space used by the data alone
  • The inserts are wrapped in a transaction per 100,000 rows.

Update on status:

The accepted answer helped me make it much faster.



1 Answer

You can disable and re-enable the indexes. Note that disabling them can have unwanted side effects (such as letting duplicate values slip past primary key or unique indexes), which will only surface when the indexes are re-enabled.

-- Disable the index
ALTER INDEX [IXYourIndex] ON YourTable DISABLE;
GO

-- Enable (rebuild) the index
ALTER INDEX [IXYourIndex] ON YourTable REBUILD;
GO
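
With more than 20 indexes, doing this one index at a time is tedious. Below is a minimal sketch of how you might disable every nonclustered index on the table in one batch, assuming SQL Server (suggested by the GO separators above) and a placeholder table name dbo.YourTable; the clustered index is deliberately skipped, because disabling it would make the table unreadable for the inserts.

-- Sketch: build and run ALTER INDEX ... DISABLE for every nonclustered
-- index on the target table (dbo.YourTable is a placeholder name).
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'ALTER INDEX ' + QUOTENAME(i.name)
             + N' ON dbo.YourTable DISABLE;' + CHAR(13)
FROM sys.indexes AS i
WHERE i.object_id = OBJECT_ID(N'dbo.YourTable')
  AND i.type_desc = N'NONCLUSTERED'   -- leave the clustered index enabled
  AND i.is_disabled = 0;

EXEC sys.sp_executesql @sql;

-- After the bulk insert, generate the same statements with REBUILD
-- instead of DISABLE and run them to bring the indexes back.

Rebuilding after the load also re-enables any primary key or unique constraints those indexes back, which is where the duplicate-key problems mentioned above would show up.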
