Obliterate via running stored procedure directly?

This forum is now locked, since Gold Support is no longer offered.

Moderator: SourceGear

CCS
Posts: 23
Joined: Thu Oct 13, 2005 3:46 pm

Obliterate via running stored procedure directly?

Post by CCS » Wed Feb 08, 2006 4:13 pm

After a few unsuccessful attempts (timeouts) to obliterate large amounts of data (approximately 2.5-3 GB) we came across this post:
jeremy_sg wrote: ---Edited by jeremy_sg to remove potentially dangerous advice.---

Does this mean that SourceGear has finally changed its position on this problem and now recommends running the stored procedure directly? I recall that just recently you strongly opposed it, without suggesting how to resolve the problem or sufficiently explaining why the SP spobliteratefsobject shouldn't be run directly.

Please confirm.

Also, VSS had a database integrity checker – does SourceGear have something similar for Vault? We’ve got nothing to lose: obliterate just doesn’t work when executed through the command line tool, so we want to try running the SP directly. If something goes wrong we always have backups. But the real question is: what if the SP completes but leaves the DB in an inconsistent state? How can we check for that?

ian_sg
Posts: 787
Joined: Wed May 04, 2005 10:55 am
Location: SourceGear
Contact:

Post by ian_sg » Thu Feb 09, 2006 8:48 am

Unfortunately there are a number of dangers in running this stored procedure by itself, so we do not support that as a general-case workaround for obliterate issues.

Please email me using the button below if you need assistance with a specific obliterate.
Ian Olsen
SourceGear

CCS
Posts: 23
Joined: Thu Oct 13, 2005 3:46 pm

Post by CCS » Thu Feb 09, 2006 12:16 pm

Since this info might be of interest to someone else...
Could you please elaborate a bit more on "a number of dangers in running this stored procedure by itself"?
I understand that the client opens a transaction prior to running the SP, but the SP appears to be self-sufficient, so why can't the same be done while running it in Query Analyzer? This is why I initially asked for clarification: there appears to be a straightforward one-to-one relationship between obliterating ONE folder in Vault and ONE call to the SP.
So in the simple case where folders are obliterated one by one, can the SP be used safely, and if not, why?
Also, what about an integrity checker of some sort? Do you maybe have a SQL script for that?

ian_sg
Posts: 787
Joined: Wed May 04, 2005 10:55 am
Location: SourceGear
Contact:

Post by ian_sg » Thu Feb 09, 2006 1:11 pm

There is no integrity checking utility.

Running the stored procedure alone is fraught with peril for the following reasons:

1. You have to be fairly SQL-savvy to come up with the right parameters. We have customers who run the entire spectrum when it comes to SQL Server knowledge, and we have to consider all of them when we say "this is supported." With obliterate, the stakes are high. It would be quite easy to obliterate the wrong thing with a typo or a misunderstanding. A simple, common mistake can mean accidentally deleting data that cannot be recovered.

2. The stored procedure relies on a database transaction to roll back changes it has already made in several known circumstances. For example, you can't obliterate the trunk of any branches, at any point in history, if there exists any branch that has not also previously been obliterated. You sometimes don't find out if this is the case until you're well into the obliteration process. You're not running inside a transaction when you run the sproc alone. There is a huge matrix of possibilities here, and given the stakes (irrecoverable data loss) we're not prepared to provide a SQL script that initiates its own transaction and exhaustively considers all the possibilities.

3. We haven't exhaustively tested it. This makes it quite likely that we don't know everything that could possibly go wrong, particularly when other users are attempting to use the repository while the obliterate is going on. Where obliterate is concerned, we're unwilling to compromise when it comes to support for untested (or undertested) scenarios.

Let me know if I can further clarify any of this. :)
Ian Olsen
SourceGear

CCS
Posts: 23
Joined: Thu Oct 13, 2005 3:46 pm

Post by CCS » Thu Feb 09, 2006 1:19 pm

Thanks for the clarifications, but I still have a few more questions.

1) Regarding incorrect parameters: I thought the best way to get them right is to run Profiler and simply capture the call issued by the client (or the command line tool). That way everything is auto-generated, so what can possibly go wrong with just copying the command and running it directly?
2) I didn't really get this point. Are you talking about nested transactions? We were planning to run the SP within a SQL transaction, so it would look something like
BEGIN TRAN
EXEC spobliteratefsobject

then manually COMMIT TRAN if things look good (this is where an integrity checker would help a lot!)
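Spelled out a bit more, the plan is a sketch like the following. This is only an illustration: the @objectid parameter name and value are placeholders of mine, since the real parameter list would have to be captured with Profiler as described above; the only parameter confirmed in this thread is @reallyobliterate.

```sql
-- Sketch only: wrap the call in an explicit transaction so partial
-- changes can be backed out if the procedure fails mid-run.
BEGIN TRAN

EXEC spobliteratefsobject
    @objectid = 1234,        -- placeholder: id of the folder to obliterate
    @reallyobliterate = 1

IF @@ERROR <> 0
BEGIN
    ROLLBACK TRAN            -- something failed: undo the partial changes
END
-- otherwise inspect the results first, then manually:
-- COMMIT TRAN
```

Note that holding a transaction of this size open for hours will keep locks on the affected tables, which is part of the risk being discussed.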

ian_sg
Posts: 787
Joined: Wed May 04, 2005 10:55 am
Location: SourceGear
Contact:

Post by ian_sg » Thu Feb 09, 2006 1:40 pm

I'm going to map the two explicit questions you asked to the two implicit questions I think you're really asking:

1) Is it possible to obliterate using the stored procedure?

2) Do you recommend this as a workaround for people who've had trouble obliterating with the normal UI?

The answer to question 1 is yes. The answer to question 2 is no, for all the reasons I gave.

If you'd like to get more specific, with respect to your particular issue(s), please feel free to send me an email.
Ian Olsen
SourceGear

Tri
Posts: 288
Joined: Wed Dec 22, 2004 11:10 am

Post by Tri » Fri Feb 10, 2006 7:41 pm

I too am very frustrated by the Obliterate process. It takes a long time to show the list, it takes even longer to perform the obliteration ... and then it returns an error message like "the file abc cannot be obliterated because it has been branched" (I don't remember the exact message). And the whole process has to be started over again.

From the GUI point of view, it would make sense to display ONLY the items which are actually "obliterable." It doesn't make any sense to have to be a SQL Server specialist to do an obliteration.

CCS
Posts: 23
Joined: Thu Oct 13, 2005 3:46 pm

Problems, Problems

Post by CCS » Mon Feb 20, 2006 12:11 pm

Well, we ran spobliteratefsobject with @reallyobliterate = 1 on Friday at 11 PM, and as of Monday at 10 AM the SP is still running: 59 hours already.
And our SQL Server is a 4-CPU 64-bit Itanium with 32 GB of RAM, etc. :(

The repository seems to be functional, so despite the initial plan of killing the process if it didn't complete by Monday morning, we're now thinking of letting it run (since we're desperate to get rid of about 3 GB of wasted space). But we'd really like to know two things:
1) Is there any way to assess the progress? We looked at the SP, and as we suspected you use cursors with temp tables, so can we maybe check the #queue temp table to see the progress?
2) Is it dangerous to use the repository while obliterate is running?
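For what it's worth, the only progress indicators we can see from our side are the generic SQL Server session counters, which aren't Vault-specific at all; the SPID below is a placeholder taken from sp_who2:

```sql
-- Generic SQL Server monitoring, not Vault-specific. First find the SPID
-- running spobliteratefsobject:
EXEC sp_who2

-- Then watch that session's counters; if cpu and physical_io keep climbing
-- between runs, the procedure is still doing work rather than sitting blocked.
SELECT spid, status, blocked, cpu, physical_io
FROM master..sysprocesses
WHERE spid = 53              -- placeholder SPID taken from sp_who2 output

-- Show the last batch the session submitted:
DBCC INPUTBUFFER(53)         -- same placeholder SPID
```

This only tells us the session is alive and consuming resources; it says nothing about how far through the obliterate it actually is.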

Please advise, since this isn't a situation we'd like to be in.
We'd really appreciate a prompt response.
Thanks!

lbauer
Posts: 9736
Joined: Tue Dec 16, 2003 1:25 pm
Location: SourceGear

Post by lbauer » Mon Feb 20, 2006 1:45 pm

Don't stop the process, unless you just want to give up on it and restore a backup. Stopping obliterate can leave the database in a half-baked state.

I need to consult with our developers on the other questions.
Linda Bauer
SourceGear
Technical Support Manager

CCS
Posts: 23
Joined: Thu Oct 13, 2005 3:46 pm

over 61 hours & still running

Post by CCS » Mon Feb 20, 2006 2:03 pm

lbauer wrote:I need to consult with our developers on the other questions.
Please do; the first part was relatively obvious, and we realize that the process most likely can't be safely stopped in the middle.

But also (based on some responses here from people who stopped it after 190 hours or so), we can't let this query run forever. Even though we have a powerful SQL Server, the query is still using about 25% of its resources, so...

lbauer
Posts: 9736
Joined: Tue Dec 16, 2003 1:25 pm
Location: SourceGear

Post by lbauer » Mon Feb 20, 2006 2:54 pm

1) Is there any way to assess the progress? Can we maybe check the #queue temp table to see the progress?
Not that we know of.
2) Is it dangerous to use the repository while obliterate is running?
The main danger is that the obliterate operation fails and you have to restore from backup, losing any changes made since the backup.

As Ian stated earlier in this thread, we don't recommend obliterating with a stored procedure. It's just not something you want to do with a production database.
Linda Bauer
SourceGear
Technical Support Manager

CCS
Posts: 23
Joined: Thu Oct 13, 2005 3:46 pm

Post by CCS » Mon Feb 20, 2006 3:19 pm

lbauer wrote: As Ian stated earlier in this thread, we don't recommend obliterating with a stored procedure. It's just not something you want to do with a production database.
This is just ridiculous – you know better than I do that
a) performance of Vault degrades as the number of deleted but not destroyed files/folders increases, and
b) obliterate does NOT work with large volumes even when executed via the command line tool (let alone the Admin tool, which attempts to display a dialog box listing all sub-folders to be obliterated and WILL time out if the folder selected for obliteration is large enough).

So we were left with no choice. As I stated earlier, about 3 GB of space is permanently wasted on deleted folders (about 70,000 subfolders contained in 3 large folders, to be precise).
We've been asking SourceGear for ANY suggestions for 6 months now, until we finally gave up and decided to use the SP (based on the fact that there is indeed a one-to-one mapping between obliterating a folder through the command line tool and a call to the SP spobliteratefsobject).
If you read through the explanations above, you quickly realize that they are not sufficient and certainly do not provide the insight we were hoping for: the tool calls the same SP and NOTHING else. You recommend using the tool but not the SP, while there is nothing else involved. The only real difference is that in one case your client initiates the request by executing the SP through a web services call (so IIS gets into the picture), whereas we executed it directly from Query Analyzer and so removed two extra layers – and yet you do not recommend it, so where's the logic?

I’m sorry for such a negative post, but unfortunately in our situation there’s not much to be positive about.

lbauer
Posts: 9736
Joined: Tue Dec 16, 2003 1:25 pm
Location: SourceGear

Post by lbauer » Mon Feb 20, 2006 3:57 pm

We know that Vault obliterate performance is not what it should be, particularly with large amounts of data. Improving performance and the interface for Obliterate is a top priority for Vault 4.0.
a) performance of Vault degrades as the number of deleted but not destroyed files/folders increases
Deleted items shouldn't have an impact on the Vault server itself. If you are seeing degradation in performance it could be due to other factors. We have Vault users with databases of 50+ GB, so while the 3GB of extra data is undesirable, it shouldn't be a performance killer unless you're running out of disk space, etc.

I don't want to minimize the frustration users have had with Obliterate. But running the stored procedure is risky. At best, it ties up Vault for days and at worst, it can trash your database. We not only can't recommend it, we are unable to support it. If something goes wrong, the only recourse is to restore a backup.

In your case, let the stored procedure run, since you're already into the process.

For other users:
-- I know it means waiting several months, but Vault 4.0 will have a better Obliterate.
-- Another suggestion that can be tedious but certainly not as risky -- undelete the deleted items in your database. Then delete one project or folder and obliterate that; delete another project, obliterate, etc. Obliterating in smaller chunks may work for some users.
Linda Bauer
SourceGear
Technical Support Manager

CCS
Posts: 23
Joined: Thu Oct 13, 2005 3:46 pm

dammit!

Post by CCS » Tue Feb 21, 2006 3:44 pm

The query has been running for almost 87 hours, and now it seems to be trapped in some sort of infinite loop: it's been doing a select for hours. And we're only trying to obliterate 1.3 GB of data in this first attempt (one folder out of 3).
It almost looks like there might be a bug in it, but without an understanding of the DB schema we can't say where it might be...
So is there at least a query that would let us count the number of files in a specific folder (including deleted ones)? We could then run this query to compare the number of files in the folders we're trying to obliterate before and after the operation, and we could also run it periodically to assess the progress (if any).
Please advise.

lbauer
Posts: 9736
Joined: Tue Dec 16, 2003 1:25 pm
Location: SourceGear

Post by lbauer » Tue Feb 21, 2006 5:26 pm

In the Vault GUI Client, folder properties (select a folder, right click and select Properties) will show you the number of files in a folder, plus the number of any deleted files. Does that help?
Linda Bauer
SourceGear
Technical Support Manager

Locked