Welcome to ShenZhenJia Knowledge Sharing Community for programmer and developer-Open, Learning and Share
I have created an app that, given enough data, fails to complete, with "The transaction log for database 'tempdb' is full due to 'ACTIVE_TRANSACTION'." and "Cannot find table 0."

The Stored Procedure that the report uses does not explicitly reference "tempdb", so it must be something SQL Server manages on its own.
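That is correct: SQL Server uses tempdb implicitly for sorts, hash joins, temp tables, and the row-version store, so a procedure never has to name tempdb to fill its log. If the error recurs, one way to watch the log fill while the procedure runs is to poll a DMV (a sketch, assuming SQL Server 2012 or later and `VIEW SERVER STATE` permission):

```sql
-- Poll tempdb's transaction-log usage while the report runs.
SELECT total_log_size_in_bytes / 1048576.0 AS log_size_mb,
       used_log_space_in_percent
FROM tempdb.sys.dm_db_log_space_usage;
```

If `used_log_space_in_percent` climbs steadily toward 100 during the report, the long-running transaction in the procedure is the likely culprit.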

Anyway, I ran the "Resource Contention" analysis in Visual Studio 2013 via Analyze > Performance and Diagnostics.

When it finished, the "Concurrency Profiling Report" showed 30,790 total contentions, with "Handle2" and "Multiple Handles 1" accounting for over 99% of the "Most Contended Resources", and "_CorExeMain" (Thread ID 4936) as the "Most Contended Thread".

This is all interesting enough, I guess, but now that I know this, what can I do about it?

Is 30,790 an excessive number of total contentions? It sounds like it, but I don't know. Assuming it is, this information doesn't really seem to tell me anything of value: what can I do to improve the situation? How can I programmatically prevent or limit resource and/or thread contention?

There were no errors in the report generated, and the six messages were of the "for information only" variety. There was one Warning:

Warning 1 DA0022: # Gen 1 Collections / # Gen 2 Collections = 2.52; There is a relatively high rate of Gen 2 garbage collections occurring. If, by design, most of your program's data structures are allocated and persisted for a long time, this is not ordinarily a problem. However, if this behavior is unintended, your app may be pinning objects. If you are not certain, you can gather .NET memory allocation data and object lifetime information to understand the pattern of memory allocation your application uses.

...but the persisting of data structures for "a long time" is, indeed, by design.

UPDATE

I then ran the ".NET Memory Allocation" report, and here too I'm not sure what to make of it:

[Screenshot: .NET Memory Allocation report]

Is 124 million bytes excessive? Is there anything untoward about the functions allocating the most memory, or which types have the most memory allocated?

I also don't understand why the red vertical line moves about after the report has been generated; it was first around "616", then it moved to 0 (as seen in the screenshot above), and now it is around 120.

UPDATE 2

I see now (after running the final performance check (Instrumentation)) that the vertical red line is just a lemming - it follows the cursor wherever you drag it. This has some purpose, I guess...



1 Answer

What ended up working was a two-pronged attack:

0) The database guru refactored the Stored Procedure to be more efficient.

1) I reworked the data-reading code, breaking it into two parts: I determined the first half of the data to retrieve and fetched it, then, after a short pause, made a second pass to retrieve the rest. In short, what used to be this:

. . .
ReadData(_unit, _monthBegin, _monthEnd, _beginYearStr, _endYearStr);
. . .

...is now this:

. . .
if (bigHonkinAmountOfData) // big honkin' amount of data
{
    string monthBegin1 = _monthBegin;
    string monthEnd1 = GetEndMonthOfFirstHalf(_monthBegin, _monthEnd, _beginYearStr, _endYearStr);
    string monthBegin2 = GetBeginMonthOfSecondHalf(_monthBegin, _monthEnd, _beginYearStr, _endYearStr);
    string monthEnd2 = _monthEnd;

    string yearBegin1 = _beginYearStr;
    string yearEnd1 = GetEndYearOfFirstHalf(_monthBegin, _monthEnd, _beginYearStr, _endYearStr);
    string yearBegin2 = GetBeginYearOfSecondHalf(_monthBegin, _monthEnd, _beginYearStr, _endYearStr);
    string yearEnd2 = _endYearStr;

    ReadData(_unit, monthBegin1, monthEnd1, yearBegin1, yearEnd1);
    Thread.Sleep(10000);
    ReadData(_unit, monthBegin2, monthEnd2, yearBegin2, yearEnd2);
}
else // a "normal" Unit (not an overly large amount of data to be processed)
{
    ReadData(_unit, _monthBegin, _monthEnd, _beginYearStr, _endYearStr);
}
. . .
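The `GetEndMonthOfFirstHalf`/`GetBeginMonthOfSecondHalf` helpers are not shown in the answer. As a minimal sketch of what the split might look like, here is one plausible implementation using `DateTime` arithmetic instead of parallel month/year strings (the class name, method name, and signature are assumptions, not the original code):

```csharp
using System;

static class DateRangeSplitter
{
    // Hypothetical stand-in for GetEndMonthOfFirstHalf / GetBeginMonthOfSecondHalf:
    // split an inclusive month range at its midpoint, so the caller can issue
    // two smaller ReadData passes instead of one big one.
    public static (DateTime firstHalfEnd, DateTime secondHalfBegin) Split(
        DateTime begin, DateTime end)
    {
        // Number of whole months between the two endpoints.
        int totalMonths = (end.Year - begin.Year) * 12 + (end.Month - begin.Month);
        DateTime firstHalfEnd = begin.AddMonths(totalMonths / 2);
        return (firstHalfEnd, firstHalfEnd.AddMonths(1));
    }
}
```

For example, splitting January through December 2014 yields a first pass ending in June and a second pass beginning in July, halving the amount of data each `ReadData` call (and thus each tempdb transaction) has to handle.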

It now successfully runs to completion without any exceptions. I don't know whether the problem was entirely database-related, or whether the Excel Interop activity was also a factor, but I can now generate Excel files of 1,341 KB with this operation.

