SQL Server continuously writes transactional information and details on database changes to the database transaction log files. Even though the information in the transaction log files and backups can serve as a solid resource for database auditing, SQL Server itself does not provide a solution that utilizes transaction log files to their full potential, nor does it offer any simple way to explore those files or analyze the information within them in order to perform continuous SQL Server database auditing.
To put the information inside the transaction log files and backups to use, ApexSQL Log, a powerful auditing and recovery tool, not only enables users to explore their transaction log files, but also provides an easy solution for the main auditing requirements:
Continuous and seamless auditing of DML and DDL changes – which ensures that all SQL Server database changes are audited without creating any duplicate entries
Unattended/automated auditing – constant and uninterrupted flow of nightly audit jobs which are pre-scheduled per auditing plans
Audited data being stored in repository tables – which can be created directly inside audited or any specific SQL Server database
Quick and easy access to the audited data either by querying the repository directly or creating auditing reports
In this article, we are going to show how to set up ApexSQL Log to perform continuous nightly auditing of SQL Server database transaction log files/backups and create repository tables that will store all audited data, by creating a repeatable session in ApexSQL Log. Then, we are going to show how to automate the execution of this session on a regular basis by using PowerShell as a wrapper. Finally, we are going to address the reporting capabilities and how to query the repository tables manually, which can translate into the creation of various stored procedures.
Configure ApexSQL Log for continuous auditing – ApexSQL Log is configured to automatically pick up all existing and still-to-be-created transaction log files and audit them into the repository database/tables, and to continue each subsequent auditing job where the previous one finished
Finally, we will look into how to extract the audited information from the repository tables
As mentioned above, we will start by creating and configuring a session in ApexSQL Log.
Once ApexSQL Log is started, a new session is automatically initiated, and the user can always start a fresh one by clicking the ‘New’ button in the main application ribbon.
In the first step of the session wizard, choose a SQL Server from the server drop-down menu, choose between Windows and SQL Server authentication methods and provide the appropriate credentials (if applicable). Finally, choose a database for auditing and click the Next button to proceed.
In the next step of the wizard, Data sources, it is important to ensure that ApexSQL Log looks into the appropriate transaction log files that hold the information needed for auditing. When transaction log backups are created on a regular basis, it is important to ensure that all files are included in the auditing process and that they are automatically added as sources by the application. To do this, we will use the name pattern feature, which fetches all transaction log files from the specified folder(s) and includes all transaction log backups that match the provided pattern – wildcards should be used here to configure the pattern to match the specific environment conditions. To set this up, simply click the ‘Add pattern’ button, provide the folder location and configure the pattern using wildcards – files that match the pattern are shown immediately, which helps ensure the pattern is properly configured. Click OK to close the pattern configuration window, and make sure that the pattern is checked as a data source.

Additionally, the online transaction log should remain checked as another data source, although it can be unchecked/excluded if the user does not want it used due to specific requirements. This will not affect the auditing output as a whole, only the last auditing job: if the online transaction log is not checked, each auditing run will end with the last information from the latest transaction log backup (and will not include later activity residing in the online transaction log file until a new backup is created).

To achieve the best performance, it is recommended to create regular transaction log backups (e.g. on an hourly basis) and to take regular full database backups (e.g. daily/weekly) – this ensures that only the latest backups are used in the auditing process, making it quicker and less performance-intensive than auditing endless chains of transaction logs or a gigantic online transaction log file.
Note: In order for ApexSQL Log to provide full auditing details, it is important to always include the full chain of transaction log backups (all transaction log backups from the latest full database backup onwards) in the auditing job, so some users may consider adjusting their backup creation/retention policies to ensure that the full chain is always available to ApexSQL Log.
Once we proceed through the wizard, the ‘Select output’ step will provide us with various choices for the auditing output. For the purpose of continuous auditing, let’s opt for the ‘Export results’ option.
This brings us to the filter setup in the session wizard. First and foremost, the user should select the ‘Continuous auditing’ filter and choose the location where the tracking file will be stored – this file stores information on where the latest continuous auditing job ended (the LSN value of the last audited transaction) to ensure that the next job continues from the first subsequent LSN when it is initiated.
In addition, users can use various filters to fine-tune their auditing job. A good place to start is to choose which DML/DDL operations to include in the auditing job, then choose which database/system tables will be included in the process, and then apply further filters, including the users filter, transaction filters, SPID, as well as some additional filtering options available in this step of the wizard.
Once the filters have been configured, the final step of the wizard allows us to configure the export output. First, let’s opt for the ‘Export to database’ option, since we are aiming to insert the auditing information into the repository tables. Then, click the ‘Select database’ button, choose the database in which the repository tables will be created and the information stored (the user will need to choose a SQL Server and provide appropriate credentials) and click Connect.
Finally, click the ‘Save’ button and provide a session name to save the session and complete the configuration process.
Now that the session has been created, it can be run from the application GUI whenever required (by clicking the ‘Open’ button in the main ribbon and going through the pre-defined wizard), or it can be used for the automation process, which is what we will show next.
ApexSQL Log fully supports the CLI, and all available GUI features are also available in the CLI through various switches. More on the available CLI commands can be found in this article.
In order to automate the continuous auditing job and perform regular auditing jobs at predefined intervals (and in specific time frames), we can use various tools. For instance, one such automation can be achieved by creating a batch script file that includes the ApexSQL Log CLI switches defining the auditing job, which can then be scheduled via Windows Task Scheduler or a similar tool. Another solution is to use Windows PowerShell to create and schedule the task, which is superior to the first approach since everything is achieved within the PowerShell script and there is no need for additional scheduling tools.
To automate auditing job via PowerShell, we will call the ApexSQL Log session (.axlp) file we’ve created at the end of the previously described wizard.
Here are the commands we will use:
Schtasks.exe /Create – Create a task
/TN – Task name
/SC – Define task frequency (hourly, daily…)
/ST – Task start time
/ET – Task end time
/TR – Path for the task that will be run (with additional CLI commands)
With these commands we can schedule a regular auditing task. Here is an example of a PowerShell script that will create a task “ApexSQL Log continuous auditing” that runs the predefined session (saved as c:\ApexSQL Log\Saved sessions\Continuous auditing.axlp) on an hourly basis every day between 05:00 AM and 11:00 PM.
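A sketch of such a script is shown below, built from the Schtasks.exe switches listed above. The ApexSQL Log install path and console executable name used here are assumptions, not confirmed values – adjust both to match the actual installation, and consult the ApexSQL Log CLI documentation for the exact invocation:

```powershell
# Sketch only: the install path and executable name below are assumptions
$exe     = '"C:\Program Files\ApexSQL\ApexSQL Log\ApexSQLLog.com"'
$session = '"C:\ApexSQL Log\Saved sessions\Continuous auditing.axlp"'

$taskArgs = @(
    '/Create'                                   # create a new scheduled task
    '/TN', 'ApexSQL Log continuous auditing'    # task name
    '/SC', 'HOURLY'                             # run on an hourly basis
    '/ST', '05:00'                              # start time (05:00 AM)
    '/ET', '23:00'                              # end time (11:00 PM)
    '/TR', "$exe $session"                      # command the task will run
)
Schtasks.exe @taskArgs
```

Splatting the switches through an argument array keeps each Schtasks.exe parameter on its own commented line, so the schedule can be adjusted without re-reading a long one-liner.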
Now that we’ve seen how to create and automate continuous auditing, let’s turn our attention to reporting. Since all of the information audited with the example shown above is stored in two SQL Server database tables, the easiest way to extract the information is by querying the tables directly via SQL Server Management Studio or a similar tool. For detailed information on the repository topology, consult the detailed guide in the knowledgebase article “ApexSQL Log continuous auditing repository topography”.

While basic reports such as a daily auditing trail, changes per user or a specific table’s history can be created easily, getting more specific data into the reports can be tricky for users with less SQL Server experience. For those who prefer querying the database directly, a detailed guide on how to work with the repository tables is offered in the article “How to work with the ApexSQL Log continuous auditing repository directly, including querying and reporting”. For those who want to take reporting one step further, here are examples of the most commonly used reports:
Transactions per month – this report shows all transactions grouped by the time frame specified in the @time_frame variable (the @time_frame variable can be changed to adjust the date/time precision, e.g. ‘yyyy-MM-dd-hh’ will show transactions down to the per-hour level)
DECLARE @time_frame AS VARCHAR(50)
SET @time_frame = 'yyyy-MM'

SELECT FORMAT(TRANSACTION_BEGIN, @time_frame) AS "Date/Time"
      ,COUNT(DISTINCT TRANSACTION_ID) AS "Transaction Count"
FROM [dbo].[APEXSQL_LOG_OPERATION]
GROUP BY FORMAT(TRANSACTION_BEGIN, @time_frame)
Transactions per day – report will show all transactions per day
SELECT CONVERT(DATE, TRANSACTION_BEGIN) AS "Date"
      ,COUNT(DISTINCT TRANSACTION_ID) AS "Transaction Count"
FROM [dbo].[APEXSQL_LOG_OPERATION]
GROUP BY CONVERT(DATE, TRANSACTION_BEGIN)
Operation count by type – report will show number of operations by operation type
SELECT OPERATION_TYPE
      ,COUNT(OPERATION_TYPE) AS "Operation count"
FROM [dbo].[APEXSQL_LOG_OPERATION]
GROUP BY OPERATION_TYPE
Operation count by date (and type) – report will show number of specific operations separated per date
SELECT CONVERT(DATE, TRANSACTION_BEGIN) AS "Date"
      ,OPERATION_TYPE
      ,COUNT(OPERATION_TYPE) AS "Operation count"
FROM [dbo].[APEXSQL_LOG_OPERATION]
GROUP BY OPERATION_TYPE
        ,CONVERT(DATE, TRANSACTION_BEGIN)
Operations count by type and date per operation – report will show number of operations per date for specific operation type. To configure the report for specific operation type, simply set the @operation_type variable to a specific operation (e.g. @operation_type = ‘DELETE’ to get the results for delete operations)
DECLARE @operation_type AS VARCHAR(50)
SET @operation_type = 'INSERT'

SELECT CONVERT(DATE, TRANSACTION_BEGIN) AS "Date"
      ,OPERATION_TYPE
      ,COUNT(OPERATION_TYPE) AS "Operation count"
FROM [dbo].[APEXSQL_LOG_OPERATION]
WHERE OPERATION_TYPE = @operation_type
GROUP BY OPERATION_TYPE
        ,CONVERT(DATE, TRANSACTION_BEGIN)
Top 10 transactions count by table – this report shows the top 10 transaction counts for database tables. To configure the time frame, set the @start_date and @end_date variables to specific from/to dates
DECLARE @start_date AS VARCHAR(50)
DECLARE @end_date AS VARCHAR(50)
SET @start_date = '2016-06-01'
SET @end_date = '2016-12-01'

SELECT TOP 10 OBJECT_NAME
      ,COUNT(DISTINCT TRANSACTION_ID) AS "Transaction Count"
FROM [dbo].[APEXSQL_LOG_OPERATION]
WHERE TRANSACTION_BEGIN > @start_date
  AND TRANSACTION_END < @end_date
GROUP BY OBJECT_NAME
ORDER BY "Transaction Count" DESC
Top 10 transactions count by user – this report shows the top 10 transaction counts per user. As with the previous report, to configure the time frame, set the @start_date and @end_date variables to specific from/to dates
DECLARE @start_date AS VARCHAR(50)
DECLARE @end_date AS VARCHAR(50)
SET @start_date = '2016-06-01'
SET @end_date = '2016-12-12'

SELECT TOP 10 USER_NAME
      ,COUNT(DISTINCT TRANSACTION_ID) AS "Transaction Count"
FROM [dbo].[APEXSQL_LOG_OPERATION]
WHERE TRANSACTION_BEGIN > @start_date
  AND TRANSACTION_END < @end_date
GROUP BY USER_NAME
ORDER BY "Transaction Count" DESC
Top 10 longest running transactions – this report shows the longest running transactions with appropriate details (when the transaction occurred, the user that ran it, etc.) and can be further configured by adding more table columns
SELECT DISTINCT TOP 10 TRANSACTION_ID
      ,CAST(TRANSACTION_END - TRANSACTION_BEGIN AS TIME) AS "Transaction_duration"
      ,USER_NAME
      ,TRANSACTION_BEGIN
      ,TRANSACTION_DESCRIPTION
FROM [dbo].[APEXSQL_LOG_OPERATION]
ORDER BY CAST(TRANSACTION_END - TRANSACTION_BEGIN AS TIME) DESC
Top 10 transactions by operation count – this report shows the transactions that included the most operations
SELECT TOP 10 TRANSACTION_ID
      ,MAX(USER_NAME) AS "Username"
      ,MAX(TRANSACTION_BEGIN) AS "Start time"
      ,COUNT(TRANSACTION_ID) AS "Operation count"
FROM [dbo].[APEXSQL_LOG_OPERATION]
GROUP BY TRANSACTION_ID
ORDER BY COUNT(TRANSACTION_ID) DESC
All of the queries shown above can be encapsulated as stored procedures for re-use by using the standard ‘create stored procedure’ template:
CREATE PROCEDURE <Procedure_Name, sysname, ProcedureName>
    -- Add the parameters for the stored procedure here
    <@Param1, sysname, @p1> <Datatype_For_Param1, , int> = <Default_Value_For_Param1, , 0>,
    <@Param2, sysname, @p2> <Datatype_For_Param2, , int> = <Default_Value_For_Param2, , 0>
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Insert statements for procedure here
    SELECT <@Param1, sysname, @p1>, <@Param2, sysname, @p2>
END
GO
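As an illustration, the ‘Transactions per day’ report from above could be wrapped as follows. The procedure name dbo.TransactionsPerDay is arbitrary, chosen for this example only:

```sql
-- Example: wrapping the 'Transactions per day' report in a stored
-- procedure for re-use; the procedure name is arbitrary
CREATE PROCEDURE dbo.TransactionsPerDay
AS
BEGIN
    SET NOCOUNT ON;

    SELECT CONVERT(DATE, TRANSACTION_BEGIN) AS "Date"
          ,COUNT(DISTINCT TRANSACTION_ID) AS "Transaction Count"
    FROM [dbo].[APEXSQL_LOG_OPERATION]
    GROUP BY CONVERT(DATE, TRANSACTION_BEGIN)
END
GO
```

The report can then be re-run at any time with EXEC dbo.TransactionsPerDay.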
Now that we’ve created various reports and saved them as stored procedures, we can use SQL Server Reporting Services to complete the reporting process and create high-end reports based on these stored procedures. The reports can be enhanced in many ways, including adding graphs, lists, charts and many other elements. A guide on how to create such reports can be found here.
Q: Can I run the ApexSQL Log session manually when required (in addition to automated jobs)?
A: Yes – this will allow you to perform immediate auditing when required and will simply update the tracking information so that the automated job can start from where you’ve finished.
Q: I want to perform manual continuous auditing without affecting the already automated auditing – what steps should I take?
A: To continue auditing where the previous process finished without affecting the current (active) tracking file, simply make a copy of it and specify the copied file’s location in the tracking file location field in the date/time filter step of the session wizard.
Q: Can I have multiple separate continuous auditing jobs set up on the same database at once?
A: Yes, just ensure that each job uses its own separate tracking file and that the jobs are not scheduled at the same time (only one auditing job can run at a time).
Please contact us for any problems or questions with the scripts.
November 10, 2016