Server unavailable for transaction end - Transaction failed

elyna
Posts: 4
Joined: Thu Nov 11, 2004 9:40 am

Server unavailable for transaction end - Transaction failed

Post by elyna » Thu Nov 11, 2004 9:50 am

Our VSS --> Vault import has been running pretty smoothly, but I just ran across an issue with a larger branch of the VSS tree.

We are following suggestions found in this forum for our conversion, archiving branches of our VSS tree off to a separate VSS database for import into Vault. It had been working fine until we ran into a branch containing a few ~50 MB files.

The IIS maximum upload size is set to almost 100 MB on the Vault server.
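(For reference, here is a sketch of how that limit is typically raised for an ASP.NET application such as the Vault server, assuming it was done via the httpRuntime element in web.config; the values are illustrative, and maxRequestLength is in KB:)

```xml
<!-- web.config on the Vault server's web application.
     maxRequestLength is in KB, so ~100 MB is roughly 102400.
     executionTimeout (seconds) is often raised alongside it
     so large uploads aren't cut off mid-transfer. -->
<configuration>
  <system.web>
    <httpRuntime maxRequestLength="102400" executionTimeout="3600" />
  </system.web>
</configuration>
```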

The log says:
Server unavailable for transaction end
Transaction Failed

I can email the full text of the log upon request. Thanks!

jeremy_sg
Posts: 1821
Joined: Thu Dec 18, 2003 11:39 am
Location: Sourcegear

Post by jeremy_sg » Thu Nov 11, 2004 11:06 am

The most common causes of large files killing the import are memory-related. Some steps you can take to reduce memory consumption:

1. Run the import and the web server on two different machines.
2. In the Vault GUI client on the machine that is running the import, select Use Chunked Encoding in Tools->Options->Network Settings.
If these suggestions aren't helpful, email me your import log file and your Vault server log file.
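(The memory argument behind suggestion 2 can be illustrated with a small sketch; Python here purely for illustration, since the Vault client itself is a .NET application. Chunked encoding lets the client stream a file in fixed-size pieces instead of buffering the whole thing in memory before sending it:)

```python
def read_whole(path):
    # Buffered upload: the entire file (e.g. a ~50 MB archive)
    # sits in memory at once before any of it is sent.
    with open(path, "rb") as f:
        return f.read()

def read_in_chunks(path, chunk_size=64 * 1024):
    # Chunked upload: peak memory use stays bounded by chunk_size,
    # no matter how large the file is.
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk
```

Either approach sends the same bytes; the difference is only in how much of the file must be resident in memory at one time.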

Thanks!

elyna
Posts: 4
Joined: Thu Nov 11, 2004 9:40 am

Post by elyna » Thu Nov 11, 2004 1:04 pm

Thanks for the advice, I will try that.

What effect do large files (~50 MB-100 MB) have on the performance of the Vault server? We're debating removing the large files from source control and placing them, unversioned, on a file share somewhere; your input on the matter may sway our decision one way or the other.

Thanks again, Jeremy.

jeremy_sg
Posts: 1821
Joined: Thu Dec 18, 2003 11:39 am
Location: Sourcegear

Post by jeremy_sg » Thu Nov 11, 2004 4:58 pm

Honestly (just me speaking, not SourceGear), really large files aren't worth it. There are exceptions, but if the file is just a zip of material that's already in the repository, a zip of executables generated by compiling parts of the repository, or a set of test files, then it probably doesn't belong.

An exception could be made for a zip file of a library that needs to be compiled in order for your product to compile. We have several third party libraries that we keep as a zip file in our Vault repository. The automated build will download the zip, unzip it and then compile it. It's a good way to make sure that everyone is building off the same version.
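(That build step might look something like the following sketch; Python with made-up paths, purely for illustration. Fetching the zip from the Vault repository would be done by the Vault command-line client and is not shown, and "make" stands in for whatever build command the library actually uses:)

```python
import subprocess
import zipfile
from pathlib import Path

def unpack_library(zip_path, dest_dir):
    """Unzip a vendored third-party library archive into the build
    tree and return the extracted member names."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest)
        return sorted(zf.namelist())

def build_library(src_dir):
    """Compile the unpacked library. 'make' is a placeholder for the
    library's real build command."""
    subprocess.run(["make", "-C", str(src_dir)], check=True)
```

Because every build unpacks the same archive from the repository, everyone compiles against the same version of the library.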

To answer your specific question: large files have no real impact on Vault's speed, but they do tend to trigger flaky network timeouts and session timeouts more often, because they take longer to upload, download, and write into the database. Also, as you've seen, they consume large chunks of memory when uploading.
