Out of memory exception
Moderator: SourceGear
We upgraded to 2.0 on Friday, and Vault appeared to be humming along fine until today. This is from the sgvault.log file:
Critical error downloading file! Exception of type System.OutOfMemoryException was thrown.
This exception was thrown several times for several users. I also get the same exception in a SOAP context when trying to open the Admin Tool. It opens, but no repository-specific info is displayed.
In addition, I also see the following error in the logs:
The SqlCommand is currently Open, Fetching.
and then,
VaultServiceBase.VaultResponseCheckIn returned: Failure.
This error occurred when one of our users was merely checking back in a file.
The system we're running on has 4 GB of RAM and plenty of disk space. Looking in Task Manager, I see ASP.NET taking about 1 GB (which seems like a bit much) and SQL Server about 1.7 GB (which is fine).
Any ideas?
Thanks,
Joel
No error messages in the system or application logs. Could this be a network-related issue? Although, I'd expect a network drop to show up as such.
jclausius wrote: joel:
can you examine your windows event viewer ( in both the system and application ) logs? please post any event errors related to .net.
thx,
joel:
while we're waiting to see if anything was logged in the event viewer, i thought of one other thing that you might want to check.
vault is configured to cache the last "X" trees in order to improve performance when a client refreshes. a cached tree saves a trip to the database, as the tree delta is computed lightning fast in ram.
if you examine your vault.config, you will see two xml elements which affect "X". <CachingStrategy> and <TreeManagerSize>.
if your TreeManagerSize is -1, then the CachingStrategy and total physical memory of your machine is used to control how many trees are cached. what do you have for these values?
if your TreeManagerSize is > 0, then this is the total number of trees cached by the server.
if it turns out the event viewer says it is recycling the asp.net process due to memory restrictions, i would suggest tuning TreeManagerSize yourself. the setting you use depends on how "out of date" your users become over the course of a couple of days.
in an environment of 50 people, i would figure people are usually about 250 versions out of date, but this setting varies from site to site.
note, any change to vault.config will require you to reload the settings within vault ( just use iisreset.exe ).
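For reference, the relevant fragment of vault.config would look something like the following. The element names come from the posts above; the values shown are only illustrative, not recommendations:

```xml
<!-- fragment of vault.config (illustrative values only) -->

<!-- -1 = let the server size the tree cache from CachingStrategy
     and total physical RAM; any value > 0 is an explicit count
     of trees to keep cached in the asp.net process -->
<TreeManagerSize>-1</TreeManagerSize>

<!-- weighting applied when TreeManagerSize is -1 -->
<CachingStrategy>1</CachingStrategy>
```

After editing, run iisreset.exe so the service picks up the new values, as noted above.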
Jeff Clausius
SourceGear
Yes, they were on the server.
Caching strategy is set to 1, and TreeManagerSize is set to -1. So, when ASP.NET gets recycled, the app log should show it, or the system log? And what's the message, or is it pretty obvious?
BTW, what are the following settings, and how would you suggest tuning them?
EnableFileCaching=0
DBBufferSize=256
SaveFullTreeInterval=2000
TreeDeltaCompressionThreshold=1000
GetFileCheckBatchSize=256
Thanks,
Joel
jclausius wrote: i didn't clarify... those event viewer logs were on the server, correct?
When ASP.NET is recycled (which can happen when the .NET process allocates 1 GB of memory or 60% of memory), you will usually find an entry in the Windows Event Viewer indicating "aspnet_wp.exe (PID: ABCD) was recycled because memory consumption exceeded ..."
If you start at the top, and just give the list an overview, you should see some mention of .net, asp.net or aspnet_wp.exe.
EnableFileCaching=0 -> assigned for a future version of vault.
-----------
DBBufferSizeKB=256 -> size in KB of the buffer used to stream blob (file) data in/out of sql server.
the value of this setting really depends on the size of your average "change" (for checkin modifications), and the size of the overall file (for new file additions).
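As a rough illustration of what a DBBufferSize-style setting trades off, here is a minimal chunked-copy sketch in Python. This is not Vault's actual code and the names are hypothetical; it just shows why buffer size matters relative to typical change size:

```python
def stream_blob(reader, writer, buffer_kb=256):
    """Copy blob (file) data in fixed-size chunks, the way a
    DBBufferSize-style setting would: a larger buffer means fewer
    read calls per file, but more memory held per transfer."""
    chunk = buffer_kb * 1024
    total = 0
    while True:
        data = reader.read(chunk)
        if not data:          # end of blob
            break
        writer.write(data)
        total += len(data)
    return total
```

If the average checkin delta is much smaller than the buffer, most transfers finish in a single read; if files routinely dwarf the buffer, each transfer loops many times.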
-----------
SaveFullTreeInterval=2000 -> the interval at which a full tree is stored in the database. to conserve space, every version of your source tree is stored in a "diff" format. however, if only diffs had been stored since the beginning of your use, it could take a very long time to build the tree at a recent version. so, for performance reasons, a "full" tree is saved every 2000 tree modifications.
setting this value too low will result in a bigger database. setting this value too high will result in longer times to compute a version of your tree.
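The snapshot-plus-delta trade-off described above can be sketched as a toy model in Python (trees reduced to dicts, all names hypothetical; not Vault's implementation):

```python
def reconstruct(snapshots, deltas, version, full_interval=2000):
    """Rebuild the tree at `version`: start from the nearest earlier
    full snapshot and replay deltas forward. A lower full_interval
    stores more snapshots (bigger database) but replays fewer deltas
    per rebuild; a higher one does the opposite."""
    base = (version // full_interval) * full_interval
    tree = dict(snapshots[base])          # nearest full tree at or below version
    for v in range(base + 1, version + 1):
        tree.update(deltas[v])            # apply each stored diff in order
    return tree
```

The cost of a rebuild is proportional to `version - base`, which is bounded by `full_interval` — exactly the space-versus-time dial the setting controls.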
-----------
TreeDeltaCompressionThreshold=1000 -> the number of entries in the tree delta sent to a client that will trigger the server to zip up the entire tree and send it as a compressed tree instead.
setting this value too low will result in longer waits for the client to retrieve a tree refresh (because the server is compressing the information). setting this value too high may result in bigger network transfer times due to the size of the tree delta being sent to the client.
-----------
GetFileCheckBatchSize=256 -> number of files per batch used when querying sql server during any get of a label.
setting this value too low will result in slightly longer "get" times during file transfers. setting this value too high will result in more memory consumption/processing within sql server when checking the validity of a file before a get.
-----------
hth
Jeff Clausius
SourceGear
OutOfMemory Exception +1
Hi
I think we are experiencing the same problem - this morning several of our clients received OutOfMemory exceptions, and others received "Object reference not set to an instance of an object" after a long period of no response from our server.
Like Joel, we're running 4 GB of RAM on the server, and similarly SQL Server is eating about 1.7 GB of RAM + 1.7 GB virtual; the ASP.NET worker process was at roughly 1 GB RAM + 1 GB virtual.
At 11:39 our ASP.NET reported in the event log that it was recycling the application "because it was suspected to be in a deadlocked state. It did not send any responses for pending requests in the last 180 seconds".
Since this recycle the server seems to be responding fine again. Our TreeManagerSize is also -1 and our CachingStrategy is 1. I think you were suggesting altering the TreeManagerSize value, but I'm not sure I understood how to estimate a suitable new value.
If changing this value is the right fix, could you please help me work out a suitable starting value, and what sort of things we should be looking for to decide whether it's right?
We are running Vault 2.0.0.0.
thanks for your help
Jim
Jim,
TreeManagerSize set to -1 tells the Vault service to guess how many trees to hold based on available memory. Unfortunately, it can guess too high, as you have seen. The TreeManager keeps these trees in cache so that when an out-of-date client connects, it can cheaply compare the current tree against one of the cached trees. If a client connects and the last tree it downloaded is no longer in the TreeManager cache, a rather expensive database call must be made to bring that client up to date with the latest tree.
Setting TreeManagerSize to a number (let's say 100) tells Vault to throw away any trees that are more than 100 versions out of date. If you can safely say that most of the people who connect will not be more than 100 versions out of date, then you will have no problems. If your tree goes through fewer than 100 versions per day, then 100 is a good value for TreeManagerSize.
As a reference point, on a machine with 4 gig RAM, TreeManagerSize set to -1, and CachingStrategy set to 1, the web service will try to keep 545 trees in memory. So 545 is obviously too many.
In summary, set TreeManagerSize to more than the typical number of tree versions you see in a day, but less than 545. I would start at 222 and see if that gives acceptable performance and memory consumption.
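The eviction policy described above amounts to a version-window cache; a minimal sketch in Python (illustrative only, not Vault's implementation):

```python
class TreeCache:
    """Keep cached trees for the most recent `size` versions;
    anything older is evicted, and a miss stands in for the
    expensive database rebuild mentioned above."""

    def __init__(self, size):
        self.size = size
        self.trees = {}            # version number -> tree
        self.head = 0              # highest version seen

    def add(self, version, tree):
        self.trees[version] = tree
        self.head = max(self.head, version)
        cutoff = self.head - self.size
        # drop trees more than `size` versions out of date
        for v in [v for v in self.trees if v < cutoff]:
            del self.trees[v]

    def get(self, version):
        # None here means the server would fall back to the database
        return self.trees.get(version)
```

With `size=222` as suggested, any client whose last refresh is within 222 versions of the head gets the fast in-memory delta; only stragglers trigger the database path.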