Tuesday, August 13, 2013

T-SQL Tuesday #45

T-SQL Tuesday
T-SQL Tuesday is a rotating blog series started by Adam Machanic. Each month a different person hosts the series and decides on a topic, and everyone writes a blog post on that topic. This month it is being hosted by Mickey Stuewe, and the topic is Auditing.


In layman's terms, Auditing is keeping track of changes and who made them. There could be many reasons for wanting to do this: legal requirements, a change control process, security, or troubleshooting.

Who Done It?

Once upon a time, I was a Database Administrator and I had a system that was suffering from some random errors that we were unable to pin down. Basically, at seemingly random times various parts of the application would fail. Usually this would involve alerts firing after hours and the ensuing late night fun.

The team would get called together to troubleshoot the problem, and typically a developer would put together a fix to the code to get the application working again. However, we were unable to identify the root cause.

Turns out, one of the developers was making unauthorized changes to the database. For example, modifying a table or stored procedure. This, in turn, would cause the application to break.

What the Deuce?

We did not have the best separation of duties at this organization. The developers were part of the deployment process. As such, one of the developers had gained access to the Application Service Account. So, he would use it from time to time to "fix" things, or slide in some last minute changes that got missed in the deployment steps.

We started to suspect this when it was always the same developer who would volunteer to help us troubleshoot and miraculously come up with the code fix. He got to be the hero, and we were the zeroes.

To Catch a Thief?

We added a DDL Trigger to start logging all changes to the database. In addition to the standard information returned by the EVENTDATA() function, we added a few other things. Since he was using the Application Service Account, we needed a few more details to distinguish the culprit from one of us. So, we added the IP Address and the name of the application that was issuing the DDL commands.

Show Me the Audit

One thing you'll want is a central place to store all of your Audit History. While you could create this table within the database you are auditing, I prefer using a separate database. This way, you can have several databases logging to the same database for easier reporting. Or, you could locate the audit database on a separate server for added security and redundancy.


-- create a separate database for housing all of your audit information
create database DBAAudit;

use DBAAudit;

-- create a table to store the history of DDL changes
create table dbo.DDLHistory (
    DDLID int identity(1, 1) not null,
    DDLDate datetime not null default getdate(),
    LoginName nvarchar(256) not null,
    IPAddress nvarchar(25) not null,
    AppName nvarchar(100) not null,
    DatabaseName nvarchar(256) not null,
    SchemaName nvarchar(256) not null,
    ObjectName nvarchar(256) not null,
    ObjectType nvarchar(50) not null,
    EventType nvarchar(50) not null,
    DDLCommand nvarchar(max) not null,
    DDLXML xml not null
);



Setting up a DDL Trigger is fairly straightforward. All the relevant information is returned by the EVENTDATA() function, which returns various details about the DDL event in an XML format.
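For reference, the XML returned by EVENTDATA() for an ALTER TABLE event looks roughly like this. The server, login, and object names are made up for illustration, and the exact elements vary by event type:

```xml
<EVENT_INSTANCE>
  <EventType>ALTER_TABLE</EventType>
  <PostTime>2013-08-13T22:15:01.123</PostTime>
  <SPID>57</SPID>
  <ServerName>MYSERVER</ServerName>
  <LoginName>MyAppServiceAccount</LoginName>
  <DatabaseName>MyTest</DatabaseName>
  <SchemaName>dbo</SchemaName>
  <ObjectName>Orders</ObjectName>
  <ObjectType>TABLE</ObjectType>
  <TSQLCommand>
    <CommandText>alter table dbo.Orders add Notes nvarchar(max);</CommandText>
  </TSQLCommand>
</EVENT_INSTANCE>
```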

You can add any other code to flesh out your trigger as you see fit.

I like to add the IP Address and the Application Name. These are easy enough with some built-in functions.


-- replace with your own database name
create database MyTest;

-- replace with your own database name
use MyTest;
-- modify trigger to capture the information that is relevant to you
create trigger DDLTracking
on database
for create_table, alter_table, drop_table,
create_procedure, alter_procedure, drop_procedure,
create_function, alter_function, drop_function
as begin

set nocount on;

-- grab the trigger event data
declare @eventdata xml;
set @eventdata = EVENTDATA();

-- grab the ip address, sometimes people use another login, this will help trace to their machine
declare @ipaddress nvarchar(25);

select @ipaddress = client_net_address
from sys.dm_exec_connections
where session_id = @@SPID;

-- log the info in our table
insert into DBAAudit.dbo.DDLHistory
(LoginName, IPAddress, AppName, DatabaseName, SchemaName, ObjectName, ObjectType, EventType, DDLCommand, DDLXML)
values (
    @eventdata.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(256)'),
    @ipaddress,
    APP_NAME(), -- grabs what program the user was using, e.g. management studio
    @eventdata.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(256)'),
    @eventdata.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(256)'),
    @eventdata.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(256)'),
    @eventdata.value('(/EVENT_INSTANCE/ObjectType)[1]', 'nvarchar(50)'),
    @eventdata.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(50)'),
    @eventdata.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)'),
    @eventdata
);

end;


Now, this is a fairly basic DDL Trigger. From here, you can modify it to add any additional information that you may require. As always, tailor any code to your own situation.
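Once the trigger is in place, a quick smoke test confirms everything is being logged. The table name here is just a throwaway placeholder:

```sql
-- issue some throwaway DDL in the audited database
use MyTest;
create table dbo.TriggerSmokeTest (ID int);
drop table dbo.TriggerSmokeTest;

-- both events should now appear in the audit table
select top 10 DDLDate, LoginName, IPAddress, AppName, EventType, DDLCommand
from DBAAudit.dbo.DDLHistory
order by DDLDate desc;
```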

If you are co-mingling information from multiple servers, you may wish to add a column for that. You may also wish to look into locating the Audit table on a remote server.

And There You Have It

A good Audit or Logging System can help you solve all manner of mysteries. It makes troubleshooting a server much easier than trying to divine what happened in the past.

And remember, when you have ruled out the impossible, whatever remains, however improbable, is the answer.

Monday, August 5, 2013

Why is my SQL Agent Job Slowing Down?

The other day, we had a situation where a user had the perception that a particular job was taking longer than normal.

So I opened the Job History and glanced over the run times. Everything seemed fine to me. But I realized I wanted something more precise than "it looks good when I look at it."

msdb to the Rescue

So I started digging into the job history tables in msdb. msdb is one of the system databases. Specifically, it keeps track of all the jobs, their run times, history, errors, outcomes, etc.

In fairly short order, I pulled together a couple of scripts to analyze the job history, and tell me exactly how run times have been changing, or not, over the past month or so.

The Persistence of Time

As it turns out, the run times were stable; the user's perception of time was not.

Overview of all jobs:

[Image: Job Run Times - All Jobs]

Details on one job:

[Image: Job Run Times - One Job]
A Couple of Quick Scripts

The first script gives an overview of all jobs, aggregated by month.


-- Steven Ormrod
-- job run times - all jobs.sql

use msdb;

-- aggregations, omits failures as they skew the data
-- note: run_duration is stored as an HHMMSS integer (e.g. 13005 = 1:30:05)
select j.name, left(h.run_date, 6) as 'year/month',
avg(h.run_duration) as 'avg duration',
max(h.run_duration) as 'max',
min(h.run_duration) as 'min',
cast(stdev(h.run_duration) as numeric(20, 2)) as 'stdev'
from dbo.sysjobhistory h
join dbo.sysjobs j
on h.job_id = j.job_id
where h.step_id = 0
and h.run_status = 1 -- omits failures
group by j.name,
left(h.run_date, 6)
order by j.name,
left(h.run_date, 6) desc;
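One wrinkle worth knowing: sysjobhistory stores run_duration as an HHMMSS-style integer, so a run of 1:30:05 is stored as 13005. Averaging the raw value can therefore mislead once jobs cross an hour or minute boundary. Here is a sketch of the same aggregation with the duration converted to seconds first:

```sql
use msdb;

-- convert the HHMMSS integer into total seconds, then aggregate on that
select j.name, left(h.run_date, 6) as 'year/month',
    avg((h.run_duration / 10000) * 3600
      + (h.run_duration / 100 % 100) * 60
      + (h.run_duration % 100)) as 'avg duration (sec)'
from dbo.sysjobhistory h
join dbo.sysjobs j
    on h.job_id = j.job_id
where h.step_id = 0
    and h.run_status = 1 -- omits failures
group by j.name,
    left(h.run_date, 6)
order by j.name,
    left(h.run_date, 6) desc;
```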


The second script gives you the details for a specific job.


-- Steven Ormrod
-- job run times - one job.sql

use msdb;

declare @strJobName varchar(100);
set @strJobName = 'YourJobName'; -- replace with your own job name

-- details
select j.name, h.run_date, h.run_time, h.run_duration,
--select 'Slow Running Job' as 'name', h.run_date, h.run_time, h.run_duration,
h.run_status, h.message
from dbo.sysjobhistory h
join dbo.sysjobs j
on h.job_id = j.job_id
where j.name = @strJobName
and h.step_id = 0
and h.run_status = 1 -- just the successful runs
order by h.run_date desc,
    h.run_time desc;


Details are Important

The next time one of your users reports that a job is taking longer than usual, gather some basic stats so you can establish the facts before going too far.