Read a SQL Server transaction log

SQL Server transaction logs contain records describing changes made to a database. They store enough information to recover the database to a specific point in time, and to replay or undo individual changes. But how do you see what’s in them, find a specific transaction, see what has happened, and revert changes, such as recovering accidentally deleted records?

Seeing what is stored in an online transaction log, or in a transaction log backup, is not so simple

Opening LDF and TRN files in a binary editor shows unintelligible records, so these clearly cannot be read directly. For instance, this is an excerpt from an LDF file:

Opening LDF and TRN files in a binary editor

Use fn_dblog

fn_dblog is an undocumented SQL Server function that reads the active portion of an online transaction log

Let’s look at the steps you have to take and the way the results are presented

  1. Run the fn_dblog function:

    SELECT * FROM sys.fn_dblog (NULL, NULL);

    Results set returned by fn_dblog function

    As the function returns 129 columns, it is recommended to return only the specific columns you need and, where applicable, to narrow the results down to a specific transaction type

  2. From the results set returned by fn_dblog, find the transactions you want to see

  3. To see transactions for inserted rows, run:

    SELECT [Current LSN],
           Operation,
           Context,
           [Transaction ID],
           [Begin Time]
      FROM sys.fn_dblog (NULL, NULL)
     WHERE Operation IN ('LOP_INSERT_ROWS');

    Transactions for inserted rows

    To see transactions for deleted records, run:

    SELECT [Begin Time],
           [RowLog Contents 1],
           [Transaction Name],
           Operation
      FROM sys.fn_dblog (NULL, NULL)
     WHERE Operation IN ('LOP_DELETE_ROWS');

    Transactions for deleted rows

  4. Find the column that stores the value inserted or deleted – check out the RowLog Contents 0, RowLog Contents 1, RowLog Contents 2, RowLog Contents 3, RowLog Contents 4, Description, and Log Record columns

  5. Row data is stored in different columns for different operation types. To be able to see exactly what you need using the fn_dblog function, you have to know the column content for each transaction type. As there’s no official documentation for this function, this is not so easy

    The inserted and deleted rows are displayed in hexadecimal values. To be able to break them into fields you have to know the format that is used, understand the status bits, know the total number of columns and so on

  6. Convert the binary data into table data, taking into account the table column data types. Note that the conversion mechanisms differ between data types, as shown in the sketch below
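
    For illustration, here is a minimal sketch of such a conversion. It assumes a table whose first column is a fixed-length INT; the 5-byte offset into the row image and the column layout are hypothetical and depend entirely on the table definition:

    -- Grab the raw image of a deleted row from the active log
    -- (assumes at least one LOP_DELETE_ROWS record exists)
    DECLARE @row VARBINARY(8000);

    SELECT TOP (1) @row = [RowLog Contents 0]
      FROM sys.fn_dblog (NULL, NULL)
     WHERE Operation = 'LOP_DELETE_ROWS';

    -- Hypothetical decoding: for a simple table, the fixed-length data portion
    -- starts right after the 4-byte record header, so bytes 5-8 would hold the
    -- first INT column. Integers are stored low byte first, so the bytes are
    -- reversed before casting
    DECLARE @intBytes VARBINARY(4) = SUBSTRING(@row, 5, 4);

    SELECT CAST(SUBSTRING(@intBytes, 4, 1) + SUBSTRING(@intBytes, 3, 1) +
                SUBSTRING(@intBytes, 2, 1) + SUBSTRING(@intBytes, 1, 1) AS INT)
           AS FirstIntColumnValue;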

fn_dblog is a great, powerful, and free function, but it does have a few limitations – reading log records for object structure changes is complex, as it usually involves reconstructing the state of several system tables; only the active portion of an online transaction log is read; and there’s no UPDATE or BLOB reconstruction

As the UPDATE operation is minimally logged in transaction logs, with no old or new values, just what was changed in the record (e.g. SQL Server may log that “G” was changed to “F”, when actually the value “GLOAT” was changed into ”FLOAT”), you have to manually reconstruct the state prior to the update, which involves reconstructing all the intermediary states between the row’s original insertion into the page and the update you are trying to reconstruct
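
To see what this looks like in practice, here is a minimal example, assuming the usual column mapping for minimally logged update records – the old byte fragment in RowLog Contents 0 and the new byte fragment in RowLog Contents 1, covering only the changed part of the row:

SELECT [Current LSN],
       [Transaction ID],
       [Offset in Row],
       [RowLog Contents 0] AS OldFragment,
       [RowLog Contents 1] AS NewFragment
  FROM sys.fn_dblog (NULL, NULL)
 WHERE Operation = 'LOP_MODIFY_ROW';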

When a BLOB is deleted, the deleted BLOB value is not written to the transaction log, so just reading the log record for the BLOB DELETE cannot bring the BLOB back. Only if there is an INSERT log record for the deleted BLOB, and you manage to pair the two, will you be able to recover a deleted BLOB from a transaction log using fn_dblog

Use fn_dump_dblog

To read transaction log backups – native or natively compressed – even without the online database, use the fn_dump_dblog function. Again, this function is undocumented

  1. Run the fn_dump_dblog function on a specific transaction log backup. Note that you have to specify values for all 63 remaining parameters, even if just DEFAULT:

    SELECT *
    FROM fn_dump_dblog
    (NULL,NULL,N'DISK',1,N'E:\ApexSQL\backups\AdventureWorks2012_05222013.trn', 
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT, 
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT, 
    DEFAULT);

    fn_dump_dblog function output

    The same as with fn_dblog, 129 columns are returned, so returning only the specific ones is recommended:

    SELECT [Current LSN],
           Operation,
           Context,
           [Transaction ID],
           [Transaction Name],
           Description
      FROM fn_dump_dblog
    (NULL,NULL,N'DISK',1,N'E:\ApexSQL\backups\AdventureWorks2012_05222013.trn', 
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT, 
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT, 
    DEFAULT);

    Again, you have to decipher hex values to get the information you’re looking for

    Returning specific columns using fn_dump_dblog function

    And you are back to square one, just as with fn_dblog – you need to reconstruct all row values manually, reconstruct entire state chains for UPDATE operations and BLOB values, and so on

    If you don’t want to actually extract transactions from the transaction log backup, but to restore the database to a point in time before a specific operation occurred, you can:

  2. Determine the LSN for this transaction
  3. Convert the LSN into the format used in the WITH STOPBEFOREMARK = ‘<mark_name>’ clause, e.g. 00000070:00000011:0001 should be transformed into 112000000001700001 – see the conversion sketch after this list
  4. Restore the full log backup chain until you reach the time when the transactions occurred. Use the WITH STOPBEFOREMARK = ‘<mark_name>’ clause to specify the referenced transaction LSN:
    RESTORE LOG AdventureWorks2012
    FROM
        DISK = N'E:\ApexSQL\backups\AW2012_05232013.trn'
    WITH
        STOPBEFOREMARK = 'lsn:112000000001700001',
        NORECOVERY;
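
    The LSN conversion described in step 3 can also be scripted. A minimal sketch, assuming the LSN is given in the colon-separated hexadecimal form shown above – each part is converted to decimal, and the second and third parts are left-padded to 10 and 5 digits respectively:

    DECLARE @lsn VARCHAR(30) = '00000070:00000011:0001';
    DECLARE @p1 VARCHAR(8) = PARSENAME(REPLACE(@lsn, ':', '.'), 3),
            @p2 VARCHAR(8) = PARSENAME(REPLACE(@lsn, ':', '.'), 2),
            @p3 VARCHAR(4) = PARSENAME(REPLACE(@lsn, ':', '.'), 1);

    -- 00000070 -> 112, 00000011 -> 0000000017, 0001 -> 00001
    SELECT CAST(CONVERT(INT, CONVERT(VARBINARY(4), '0x' + @p1, 1)) AS VARCHAR(10))
         + RIGHT('0000000000' + CAST(CONVERT(INT, CONVERT(VARBINARY(4), '0x' + @p2, 1)) AS VARCHAR(10)), 10)
         + RIGHT('00000' + CAST(CONVERT(INT, CONVERT(VARBINARY(2), '0x' + @p3, 1)) AS VARCHAR(5)), 5)
           AS StopBeforeMarkLsn;  -- 112000000001700001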

Use DBCC PAGE

Another useful, but again undocumented, command is DBCC PAGE. Use it to read the content of online database files – MDF and LDF. The syntax is:

DBCC PAGE ( {'dbname' | dbid}, filenum, pagenum [, printopt={0|1|2|3} ])

To dump the first page in the AdventureWorks2012 database online transaction log file, use:

SELECT FILE_ID ('AdventureWorks2012_Log') AS 'File ID';
-- to determine the log file ID, which is 2

DBCC PAGE (AdventureWorks2012, 2, 0, 2);

You’ll get:

DBCC execution completed. If DBCC printed error messages, 
contact your system administrator.

By default, the output is not displayed. If you want the output displayed in SQL Server Management Studio, turn on trace flag 3604 first:

DBCC TRACEON (3604, -1)

And then re-run

DBCC PAGE (AdventureWorks2012, 2, 0, 2)

You’ll get a bunch of errors and a bad page header, all of which you can ignore. At the end you’ll get a glorious hexadecimal output from the online LDF file:

Hexadecimal output from the online LDF file

This is not the friendliest presentation of your database data and is basically no different from viewing it in a hex editor (just less convenient), though at least you get access to the online data
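
As a side note, DBCC PAGE also accepts the WITH TABLERESULTS option, which returns the dump as a regular result set instead of printed messages, so trace flag 3604 is not needed – the content, however, is the same raw data:

DBCC PAGE (AdventureWorks2012, 2, 0, 2) WITH TABLERESULTS;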

Use ApexSQL Log

ApexSQL Log is a SQL Server transaction log reader which reads online transaction logs, detached transaction logs, and transaction log backups – both native and natively compressed. When needed, it will also read database backups to get enough data for a successful reconstruction. It can replay data and object changes that have affected a database, including those that occurred before it was installed. Unlike the undocumented and unsupported functions described above, you’ll get perfectly understandable information about what happened, on which object, and what the old and new values were

  1. Start ApexSQL Log
  2. Connect to the database for which you want to read the transaction logs

    Connecting to the database to read the transaction logs from

  3. In the Select data sources step, select the logs you want to read. Make sure they form a full chain. To add transaction log backups and detached LDF files, use the Add button

    Selecting the transaction logs to read from

  4. In the next step, choose the output type – for analysis purposes, select Open results in grid – this will load the audited data into the grid, which is the preferred output for investigation and analysis tasks

  5. Next, use the Filter setup options to narrow down the transactions read using the time range, operation type, table and other available filters

    Filtering the transactions read

  6. Click Finish

    Fully comprehensive results will be shown in the ApexSQL Log grid

    You will be able to see the time the operation began and ended, the operation type, the schema and object name of the object affected, the name of the user who executed the operation, the computer and application used to execute the operation. For UPDATEs, you’ll see the old and the new value of the updated fields

    Fully comprehensive results shown in the ApexSQL Log grid

To avoid hex values, undocumented functions, unclear column content, long queries, complex action steps, incomplete UPDATE and BLOB reconstruction when reading SQL Server transaction logs, use ApexSQL Log. It will read transaction logs for you and present the results “in plain English”. Besides that, undo and redo scripts are just a click away

Downloads

Please download the script(s) associated with this article on our GitHub repository.

Please contact us for any problems or questions with the scripts.

 

May 27, 2013