Vault client performance problems with lots of changed files

This forum is now locked, since Gold Support is no longer offered.

Moderator: SourceGear

stevek2
Posts: 70
Joined: Wed Jun 23, 2004 5:53 pm


Post by stevek2 » Fri Nov 10, 2006 5:10 pm

Running Vault 3.5.1 on client & server.

I have two large projects, each over 10,000 files, call them $/a and $/b. I did a 'Get Latest' on $/b, then imported (by copying into c:\b) many thousands of changed files. At this point the Vault client becomes extremely slow, maxing the CPU for over 10 seconds for a "Show Differences" on a trivial file with no changes (this slowness happens on files in both $/a and $/b). I then exit Vault, and manually rename c:\b to c:\b-temp, rerun Vault, and do a new 'Get Latest' on $/b. Now the performance is fine again -- a trivial Show Differences is instantaneous. I exit Vault, rename c:\b to c:\b-orig, and rename c:\b-temp back to c:\b. Start Vault again, and the performance is horrible again.

I double-checked that my Search pane is set to "Don't Search", and I verified that this performance hit only affects my client, not the server (other developers' clients do not see a performance decrease when I change all these files on my local machine).

Now, my Pending Change Set tab has thousands of entries, so I might expect some slowdown when working in this tab, or maybe even when working in the $/b project. But I don't expect that trivial operations in completely different projects/directories should become so slow.

Thanks,
Steve

lbauer
Posts: 9736
Joined: Tue Dec 16, 2003 1:25 pm
Location: SourceGear

Post by lbauer » Mon Nov 13, 2006 3:35 pm

Vault stores a baseline copy of each file that has been retrieved to the working directory. If the timestamp of a file in the working directory changes, Vault compares that file with its baseline to determine the file's status.

If you copy thousands of files into the working directory, the Vault client must perform this check on every one, which takes a while and slows down other client-side operations, such as diff.

When you did the Get Latest on c:\b, Vault automatically updated the baselines, so the client could do a diff quickly.
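A rough sketch of this kind of status check, in Python, may help make the cost model concrete. This is not Vault's actual implementation; the function names and the cached-mtime scheme are illustrative assumptions. The cheap path is a timestamp comparison; only when the timestamp has moved does the client fall back to an expensive content comparison against the stored baseline:

```python
import filecmp
import os

def needs_status_check(working_path, cached_mtime):
    """Cheap path: if the timestamp is unchanged, trust the cached status."""
    return os.path.getmtime(working_path) != cached_mtime

def file_status(working_path, baseline_path, cached_mtime):
    """Compare against the stored baseline only when the timestamp moved."""
    if not needs_status_check(working_path, cached_mtime):
        return "unmodified"
    # Timestamp changed: fall back to a byte-for-byte comparison with
    # the baseline copy (the expensive step, repeated per changed file).
    same = filecmp.cmp(working_path, baseline_path, shallow=False)
    return "unmodified" if same else "edited"
```

Copying thousands of files into the working directory touches thousands of timestamps at once, so every one of them falls into the expensive branch on the next scan.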

That explains why the client becomes slow, but may not address the specific problem you're having. Let us know what problem you're trying to solve and we can offer some suggestions.
Linda Bauer
SourceGear
Technical Support Manager

stevek2
Posts: 70
Joined: Wed Jun 23, 2004 5:53 pm

Post by stevek2 » Mon Nov 13, 2006 4:59 pm

Sure, I understand that if there are 10,000 changed files, it's going to take some time to update the PendingChangeSet tab. But once it's updated, my understanding was that Vault posted a ChangeNotification on those directories, so Windows would notify it when any of the files changed. So unless there's a notification, no work should be required. I'm not modifying any files in the $/b directory tree. I'm merely doing a simple Get or Diff or Properties on a file somewhere else -- in $/a -- and when I do this, Vault seems to re-read many, if not all, of the files in $/b. I have run Sysinternals' filemon utility, and whenever I do a simple operation on a file in $/a, I can see Vault reading thousands of files in the $/b working directory.
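The notification-driven model Steve describes can be sketched as a small status cache: recompute status only for files flagged by a change notification, so unrelated operations never trigger a full rescan. This is a hypothetical design sketch in Python, not Vault's code; the `compute_status` callback and `on_change_notification` hook are assumptions standing in for the baseline comparison and the OS watcher (e.g. ReadDirectoryChangesW on Windows):

```python
class StatusCache:
    """Cache per-file statuses; recompute only files flagged by change
    notifications, instead of rescanning the whole working directory."""

    def __init__(self, compute_status):
        self._compute = compute_status   # expensive baseline comparison
        self._cache = {}                 # path -> last known status
        self._dirty = set()              # paths flagged by notifications

    def on_change_notification(self, path):
        # Called by an OS file watcher when a file in a watched
        # directory tree changes; just mark it dirty.
        self._dirty.add(path)

    def status(self, path):
        # Rescan only if the path is dirty or has never been scanned.
        if path in self._dirty or path not in self._cache:
            self._cache[path] = self._compute(path)
            self._dirty.discard(path)
        return self._cache[path]
```

Under this model, an operation on a file in $/a would never re-read files under $/b unless notifications had flagged them, which is the behavior Steve expected.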

This 'refresh' behavior also occurs whenever Vault regains the window focus, which is also odd.

You had asked what I was trying to do -- well, I just imported (by copying) a large number of changed files into $/b, and I'm trying to use Vault to show diffs, etc., and eventually merge these changes into my source tree. I suppose I could manually copy only a few changed files at a time, but that's awfully tedious.

stevek2
Posts: 70
Joined: Wed Jun 23, 2004 5:53 pm

Post by stevek2 » Wed Nov 15, 2006 4:58 pm

Any update on this???
