"Get Latest Version" randomly fails, but deletes f
This workaround definitely doesn't work. Here's what I just did, and the results...
-- From my machine --
I created a new folder on my disk called "test_vault_folder", and inside it created a load of folders and sub-folders, with text files at every level, containing "original contents".
I added the folder to Vault and committed.
-- From my colleagues machine --
I hit F5 in Vault to see the new folder and did Get Latest Version on it.
I modified all text files to read "new contents" at all levels.
I removed the "test_vault_folder" from vault from and committed.
I re-added the "test_vault_folder" folder and committed.
-- Back on my machine --
I right-clicked the parent folder of "test_vault_folder" and hit Get Latest Version. I selected the "Do not remove" option for deleted files, and hit OK.
-- The results --
When Vault had finished, not a single file on my disk had the new contents. All were old, and 4 of the 8 files were now listed in Vault as being modified by me.
This illustrates both of the major bugs I've mentioned, and it seems to happen every time; it's not intermittent.
I've attached the log file from my machine, cleared out just before I started.
Attachment: VaultGUIClient.zip (120.17 KiB)
I've reproduced this here, but I don't think it is a bug. Here's why:
When you delete a folder and add a new one of the same name in the same location, they are different entities in Vault, with different IDs and different histories.
This means the working folder that you are using is no longer valid, because it points to an entity in Vault that no longer exists (or at least is not valid until it becomes undeleted).
As such, the client has to start over in determining the state of the files in that working folder. It sees a folder of the same name and files of the same name, so it tries to see if the file it has on disk has the same contents as any version in that folder. If it does, it knows which version the file corresponds to. If it can't find matching contents, the file is correctly considered edited. Since the baseline for the file disappeared when the folder was deleted, there is no way to determine that someone else actually checked in a new version by deleting and re-adding it.
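Conceptually, the check is something like the sketch below. This is only an illustration of the matching idea, not the actual client code, and every name in it is made up for the example:

using System;
using System.IO;
using System.Security.Cryptography;

// Illustrative only: a toy version of "does the file on disk match any
// version the client still knows about?". All names here are hypothetical.
static class BaselineCheck
{
    // Hash the working file so its contents can be compared cheaply.
    static string HashFile(string path)
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
        {
            return BitConverter.ToString(md5.ComputeHash(stream));
        }
    }

    // knownVersionHashes: hashes of the versions of this file the client
    // still has a record of. After a delete and re-add of the folder, that
    // record is gone, so nothing matches and the file is reported as edited.
    static string ClassifyWorkingFile(string path, string[] knownVersionHashes)
    {
        string workingHash = HashFile(path);
        foreach (string versionHash in knownVersionHashes)
        {
            if (versionHash == workingHash)
                return "Matches a known version";
        }
        return "Edited (no matching baseline)";
    }
}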
Note this behavior will probably still be there even after we fix the other problem, where unmodified files are deleted on the first Get when the "perform repository deletions locally" option is on after a delete and an add of the same name.
I'd like to have more details on why you need to delete entire folders and re-add them as part of your dev process. Since this has the effect of removing all history for those folders every time someone checks them in, and also doesn't allow two people to work on the same file at the same time, it seems like it subverts some of the main reasons to have source control.
The reason we're deleting folders is that our software creates tons of new sub-folders all over the place, so removing the parent folder and adding it again is the only easy way to get all the new contents in (since the Detect New Files thing doesn't work with folders). If deleting folders isn't recommended, how would you recommend we do this?
As for your suggestion that this isn't a bug, I strongly disagree. I understand why it happens from a technical side, but you should be considering it from the user's point of view. Why would it ever make sense to overwrite new files with old files just because someone removed and re-added the folder?
link wrote: The reason we're deleting folders is that our software creates tons of new sub-folders all over the place, so removing the parent folder and adding it again is the only easy way to get all the new contents in (since the Detect New Files thing doesn't work with folders). If deleting folders isn't recommended, how would you recommend we do this?
But the problem is that it isn't really removing a folder and re-adding it - it is removing a folder and adding a new, different folder which happens to have the same name as the old one. The files on another user's machine are not really new versions; they are Unknown files with the same name.
link wrote: As for your suggestion that this isn't a bug, I strongly disagree. I understand why it happens from a technical side, but you should be considering it from the user's point of view. Why would it ever make sense to overwrite new files with old files just because someone removed and re-added the folder?
Yes this is a technical reason, but it is unfortunately just the way Vault and other source control systems work. Once deleted, there is no link to later additions of the same name. In fact, you might look at the properties of the folder where all the deletions are happening. There will be lots and lots of deleted folders with the same name stuck there.
If you constantly have a lot of unknown folders that need to be added to your project, you might need to create a script or a special program that searches your hard drive for folders that don't exist in Vault. Our client API would allow you to do this.
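For the disk-scanning half of that, plain .NET is enough. A minimal sketch (the root path is just a placeholder) that lists every sub-folder you would then want to look up in the repository:

using System;
using System.IO;

class LocalFolderScan
{
    static void Main()
    {
        // Placeholder: substitute the root of your working folder.
        string workingRoot = @"C:\work\project";

        // Walk every sub-folder on disk. Each one is a candidate to check
        // against the repository tree via the client API (see the example
        // later in this thread).
        foreach (string folder in Directory.GetDirectories(workingRoot, "*", SearchOption.AllDirectories))
        {
            Console.WriteLine(folder);
        }
    }
}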
This isn't a great solution (to have to develop and run a separate program). However, I wonder - if these are program-generated files, how critical is it that they be under source control? Often, only the code that generates things (DLLs, tests, etc.) is put under source control, since the things they produce are easily reproduced by running the programs.
Hi Dan,
Thanks for the quick reply.
dan wrote: But the problem is that it isn't really removing a folder and re-adding it - it is removing a folder and adding a new, different folder which happens to have the same name as the old one. The files on another user's machine are not really new versions; they are Unknown files with the same name.
I do understand that. It is rather misleading in our scenario though, so maybe it's something you could consider addressing in the future? Or at least, when the user deletes a folder, give them a dialog telling them this happens if they re-add the folder later?
dan wrote: If you constantly have a lot of unknown folders that need to be added to your project, you might need to create a script or a special program that searches your hard drive for folders that don't exist in Vault. Our client API would allow you to do this.
This sounds like a possibility. Do you have anything similar that could be easily adapted? Or could you provide an example of how we can easily get the repository tree with your API? (Preferably something that uses the local cached version rather than hitting the server, but anything is better than nothing.)
dan wrote: This isn't a great solution (to have to develop and run a separate program).
I agree. However, if the Detect New Files option is extended to cover folders in the near future, it may be acceptable. Do you have an ETA for this functionality?
dan wrote: However, I wonder - if these are program-generated files, how critical is it that they be under source control? Often, only the code that generates things (DLLs, tests, etc.) is put under source control, since the things they produce are easily reproduced by running the programs.
I probably described this badly. The folders are added as we add new actions to our test scripts, so they are important, and not like DLLs etc. that are built from source.
Actually, there is a new feature coming up in 3.5 where the GUI tree will show folders on the file system that don't exist in the repository as "ghosted" folders (displayed with a different icon), with a command to add them to the repository.
So, it should get much easier to do this when 3.5 is released, currently targeted in May.
If you want to go the API route, download the Client API from the downloads page of our main website, and take a look at the command line client code, which is in there. It is fairly easy to traverse the repository tree - just search for the function call:
ClientInstance.TreeCache.Repository.Root.FindFolderRecursive(strFolderName);
and it will return a node in the tree you can manually walk, or you can simply call this function repeatedly to see whether a local folder is in the tree at the location you think it should be.
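As a rough sketch of that approach (not tested code - the only Vault call in it is the FindFolderRecursive call named above, and the assumptions that it takes a repository path and returns null when nothing is found come from this thread rather than the API reference):

using System;
using System.IO;

class MissingFolderCheck
{
    // clientInstance stands in for the logged-in ClientInstance object from
    // the Client API / command line client code; "dynamic" is used so the
    // sketch doesn't have to name the exact type.
    static void ReportMissingFolders(dynamic clientInstance, string workingRoot, string repositoryRoot)
    {
        foreach (string localPath in Directory.GetDirectories(workingRoot, "*", SearchOption.AllDirectories))
        {
            // Map the local path onto a repository path, e.g.
            // C:\work\proj\sub -> $/proj/sub. The mapping is an assumption
            // for this example; adjust it to your working folder layout.
            string repoPath = repositoryRoot + "/" +
                localPath.Substring(workingRoot.Length + 1).Replace('\\', '/');

            // Look the folder up in the cached tree, as described above.
            var node = clientInstance.TreeCache.Repository.Root.FindFolderRecursive(repoPath);

            // Assumption: no matching node means the folder is not under
            // source control yet, so it is a candidate to add.
            if (node == null)
                Console.WriteLine("Not in repository: " + localPath);
        }
    }
}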
dan wrote: Actually, there is a new feature coming up in 3.5 where the GUI tree will show folders on the file system that don't exist in the repository as "ghosted" folders (displayed with a different icon), with a command to add them to the repository.
I can't find this option. I've added files into a new folder in our structure, and can't find it anywhere - in the tree, under Detect New Files to Add, etc.
How do I find and add all of the files and new folders within our source folder easily?
In Vault 3.5, this option is in the GUI Client under Tools->Options->General->Show non-version controlled Files/Folders ghosted in folder tree/file list.
This should be on by default, but if not, check it.
Also, you need to have a working folder set for this to work.
Linda Bauer
SourceGear
Technical Support Manager