I have a script that I'm running at a number of sites. It runs for hours and basically trims data from a database that has grown too large. The problem I'm running into is that these sites keep their log files on a drive with only ~50-100 GB free (yeah, only). The log file grows, and in about 3-4 hours a 50 MB log file turns into a 100 GB log file.
I can stop the script and shrink the log file, no problem... but this isn't optimal as I would prefer the script be able to run overnight without waking up every few hours to check on it.
Right now I have the recovery model set to Simple for this. That seems to slow down the growth of the log file a bit, but it still grows pretty large. I've also tried restricting the log file's growth... but when the log file hits that cap, the query errors out and stops.
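For what it's worth, one thing I'm considering is breaking the trim into small batches so each batch commits on its own, instead of one giant delete holding the log open the whole time. A rough sketch of what I mean (table and column names here are made up, not my actual schema):

```sql
-- Hypothetical batched-delete loop. Each DELETE TOP (...) is its own
-- transaction, so under the Simple recovery model the committed log
-- space can be reused instead of the file growing without bound.
DECLARE @rows INT;
SET @rows = 1;  -- separate SET so this also runs on SQL Server 2005

WHILE @rows > 0
BEGIN
    -- dbo.EventLog / CreatedDate are placeholder names
    DELETE TOP (10000)
    FROM dbo.EventLog
    WHERE CreatedDate < DATEADD(MONTH, -6, GETDATE());

    SET @rows = @@ROWCOUNT;

    -- Under Simple recovery, a checkpoint marks the committed
    -- portion of the log as reusable.
    CHECKPOINT;
END
```

No idea if that's the right approach or if the batch size matters much, so I'd appreciate a sanity check from anyone who actually knows SQL Server.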
Anyone have any ideas? I've Googled around for a bit, but I must admit I'm no DBA, or even someone particularly versed in the ways of Microsoft SQL Server. This is on Microsoft SQL Server 2005/2008.