SQL: Transaction Log Files growing like cancer

Grundlterror Registered User regular
edited May 2011 in Help / Advice Forum
Hey guys.

I have this script that I'm running at a number of sites. It's a script that runs for hours and basically trims data from a database that has grown too large. The problem I'm running into is that these sites have their log files on a drive with only ~50-100 GB free (yeah, only). The log files grow, and in about 3-4 hours a 50 MB log file turns into a 100 GB log file.

I can stop the script and shrink the log file, no problem... but this isn't optimal as I would prefer the script be able to run overnight without waking up every few hours to check on it.

Right now I have the recovery model set to simple for doing this. This seems to slow down the growth of the log file a bit, but it still grows pretty large. I've tried to restrict the growth... but when the log file size hits the restriction the query stops.

Anyone have any ideas? I've looked around on google for a bit, but I must admit I'm no DBA or even someone who is particularly versed in the ways of Microsoft SQL Server. This is on Microsoft SQL Server 2005/2008.

Thanks!


Posts

  • clam2000 Seattle Registered User regular
    edited May 2011
    There are a couple of meta-questions worth thinking about that should help clarify the situation:

    How quickly is the data churning in the main database?
    How much do you care if recent transactions are lost due to a crash?
    How is the database getting backed up?

    Basically, you shouldn't be logging more than there is actual data changing in the database. If you are, then you need to change the script.
    Otherwise, you need to figure out how critical it is to recover these logs. If you're never going to use them, just have the script pipe its output to /dev/null and be done with it.

  • schuss Registered User regular
    edited May 2011
    Yeah, sounds like you're logging too much.

  • Grundlterror Registered User regular
    edited May 2011
    This is basically a one-time script. It gets run at the site once and that's it. I make backups immediately before I run the script in case anything goes wrong. This is all done while the site is dark (they aren't using the DB).

    The script does make temporary tables that are gigantic. I don't think there is much that I can do about that. What I've been doing now is running the script one delete query at a time, one year at a time, for this particular site that has a DB over 180 GB. It would just be much more convenient to run this script overnight without having to wake up every hour to check on it and run the next query after shrinking the log file.

  • schuss Registered User regular
    edited May 2011
    Can't you turn off logging on the server for the duration of the job?

  • Grundlterror Registered User regular
    edited May 2011
    I don't think it's possible to turn off the transaction logging with sql server, is it?

  • schuss Registered User regular
    edited May 2011
    From a quick google it looks like that's confined to higher-level DBs (I work in a large enterprise, so our log files would be many gigs per hour if it was on for everything). As it sounds like you already have it split into smaller delete batches, could you set up a script to run through 5000 deletes, then shrink the log file, then 5000 more, etc.?
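    A rough sketch of that batching idea in T-SQL. The table name OldData, the date column CreatedDate, the cutoff date, and the logical log file name MyDB_log are all placeholders — substitute your own:

    ```sql
    -- Delete in batches of 5000, shrinking the log between batches.
    -- DELETE TOP (n) requires SQL Server 2005 or later.
    DECLARE @rows INT;
    SET @rows = 1;

    WHILE @rows > 0
    BEGIN
        DELETE TOP (5000) FROM OldData
        WHERE  CreatedDate < '2009-01-01';   -- placeholder cutoff

        SET @rows = @@ROWCOUNT;

        CHECKPOINT;                          -- flush dirty pages so Simple recovery can truncate the log
        DBCC SHRINKFILE (MyDB_log, 100);     -- shrink the log back to ~100 MB between batches
    END
    ```

    Shrinking inside the loop keeps the file small on disk, at the cost of the file repeatedly growing and shrinking; with the CHECKPOINT in place, the shrink may not even be necessary since the log space gets reused rather than extended.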

  • Grundlterror Registered User regular
    edited May 2011
    I was thinking of doing it like that. I've been advancing the end date three months at a time with this one very intense query. So, I can set the query up like this:

    set @enddate = '2008-09-30'

    delete query

    shrink

    set @enddate = '2008-12-30'

    delete query

    shrink

    etc

  • Walrus von Zeppelin Registered User
    edited May 2011
    Uh, you can just use a while loop and increment the date instead of typing it out for each 3-month window like in your post.
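    A minimal sketch of that loop in T-SQL, folding in the set/delete/shrink steps from the post above — the delete statement, final cutoff date, and log file name are placeholders:

    ```sql
    DECLARE @enddate DATETIME;
    SET @enddate = '2008-09-30';             -- first 3-month window

    WHILE @enddate <= '2010-12-31'           -- placeholder final cutoff for this site
    BEGIN
        DELETE FROM OldData                  -- placeholder for the real delete query
        WHERE  CreatedDate < @enddate;

        CHECKPOINT;
        DBCC SHRINKFILE (MyDB_log);          -- the "shrink" step between windows

        SET @enddate = DATEADD(month, 3, @enddate);  -- '2008-09-30' -> '2008-12-30' -> ...
    END
    ```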


    Disclaimer: If you were just doing that for demonstration purposes then my bad.

  • Grundlterror Registered User regular
    edited May 2011
    I was not just doing it for demonstration purposes, thanks for the tip. Not sure why it didn't occur to me to try it that way!

  • vonPoonBurGer Registered User
    edited May 2011
    Right now I have the recovery model set to simple for doing this. This seems to slow down the growth of the log file a bit, but it still grows pretty large.
    You cannot operate a SQL Server DB without transaction logs, because they hold dirty pages (i.e. pages that have been changed but have yet to be written to their respective datafiles). Under the Simple recovery model, transaction logs are truncated when a checkpoint occurs, i.e. when dirty pages are written to disk and SQL Server knows it no longer needs those transaction logs. Checkpoints can only write out pages for committed transactions, though, so if your transaction logs are getting too large it probably means you're performing too many write operations without a commit. Break that delete into batches and make sure you're committing between each batch. Automatic checkpoints will be able to write the dirty pages to disk, and SQL Server will be able to truncate the transaction logs for reuse rather than growing them indefinitely.
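    A sketch of the batch-and-commit pattern described above. In T-SQL each standalone statement auto-commits, so the key is that each batch is its own small transaction rather than one giant delete; table and column names are placeholders:

    ```sql
    -- Each DELETE TOP batch auto-commits on its own, so its log space
    -- can be truncated at the next checkpoint instead of accumulating
    -- for the whole multi-hour run.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (10000) FROM OldData
        WHERE  CreatedDate < '2009-01-01';   -- placeholder cutoff

        IF @@ROWCOUNT = 0 BREAK;             -- nothing left in this range

        CHECKPOINT;                          -- force dirty pages out so the log can be reused
    END
    ```

    With this shape the log only ever needs enough space for one batch's worth of changes, which is why it stops growing indefinitely.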

  • Grundlterror Registered User regular
    edited May 2011
    Thanks a lot guys, the script worked like a charm. Still not 100% clear on the transaction log files but I know a lot more than I did a few days ago.
