
Monday, October 31, 2011

Another way of looking at JOINs


"No one would surrender to the Dread Pirate Westley." - Westly, The Princess Bride
I recently had a call from a colleague asking how to get some information out of the database. Since the query was against an Oracle database, it used the (+) outer join syntax at the predicate level, which has never been what I'd call intuitive, particularly when joining multiple tables together.

For those people not familiar with it, a simple left outer join would read as follows:

select a.col1, b.col2
from tab1 a, tab2 b
where a.col1 = b.col1(+)

The (+) syntax is placed on the side of the predicate where NULLs would be expected in the event of an outer join. So this example indicates a left outer join.
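
For comparison, here is the same left outer join written in the ANSI syntax the rest of this post uses (same hypothetical tab1 and tab2 tables):

select a.col1, b.col2
from tab1 a
left outer join tab2 b
on a.col1 = b.col1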

The query was more complicated than this simple example and I suggested using ANSI standard syntax to build the query instead.

In SQL Server, you need to use the ANSI standard syntax to perform an outer join, since it's the only way outer joins are supported. I suggested to my colleague that he learn it as well, for future benefit.

What is an outer join? An outer join is where you join two tables on a predicate and, if the table being outer joined doesn't match, the row is returned anyway, with NULLs in the result set in any columns from the joined table where a match was not found.

Going back to elementary school math, think of a Venn diagram with two overlapping circles, "A" and "B" (we will call the intersection in the middle "C"):

An Inner Join is only going to return the data shown as "C" - those rows common to both "A" and "B" across the join predicate.

A Left Outer Join will show all rows from "A" as well as "C" (non-matching data from "B" will return as NULL).
A Right Outer Join will show all rows from "B" as well as "C" (non-matching data from "A" will return as NULL).

So, with that explained, on to the syntax. A join written using the ANSI standard looks like this:

SELECT <list of columns>
FROM <TABLE1>
[LEFT|RIGHT] [OUTER|INNER] JOIN <TABLE2>
ON <JOIN CONDITION>
WHERE <EXTRA PREDICATES>

So let's say we have a table with a purchase order header and a table with PO items, and we want a report of all of the POs for a particular vendor. There are two ways we could write this code:

SELECT a.PO_NO, b.PO_ITEM_NO, b.QTY_ORDERED, b.UNIT_PRICE
FROM PO_HDR a, PO_ITEM b
WHERE a.PO_NO = b.PO_NO
AND a.VENDOR = 'ACME'
ORDER BY a.PO_NO, b.PO_ITEM_NO

The same query written using ANSI JOIN syntax would be like this:

SELECT a.PO_NO, b.PO_ITEM_NO, b.QTY_ORDERED, b.UNIT_PRICE
FROM PO_HDR a
JOIN PO_ITEM b
ON a.PO_NO = b.PO_NO
WHERE a.VENDOR = 'ACME'
ORDER BY a.PO_NO, b.PO_ITEM_NO

The two queries would generate the exact same output.

The above example would return only those records in the "C" subset on our Venn Diagram above.

Let's try something different. Let's say we have a table containing all Employees (one of the columns is the Department that the employee belongs to) and a table of all Departments. Suppose you wanted a list of all employees and their departments, including employees not yet assigned to a department (this example courtesy of Wikipedia):

SELECT a.EmployeeID, a.DepartmentID, b.DepartmentID, b.DepartmentName
FROM   employee a
LEFT OUTER JOIN department  b
ON a.DepartmentID = b.DepartmentID;

This would return a list of all employees and their departments. Any employee not assigned a department would show as NULL in columns 3 and 4. If the same query were executed as a Right Outer join, we'd get a list of Departments without Employees assigned and columns 1 and 2 would be null if no employees were found for any given department.
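
If what you actually want is just the employees who have not yet been assigned a department, a small filter on the same outer join keeps only the rows where the department side came back NULL:

SELECT a.EmployeeID, a.DepartmentID
FROM   employee a
LEFT OUTER JOIN department  b
ON a.DepartmentID = b.DepartmentID
WHERE b.DepartmentID IS NULL;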

LEFT and RIGHT, in the case of an outer join, define which table of the join should always return data: the first (LEFT) or the second (RIGHT). The other side returns NULLs where nothing matches.
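
One handy consequence: LEFT and RIGHT are mirror images, so swapping the order of the tables and the direction of the join gives the same result. These two queries against the employee and department tables are equivalent:

SELECT a.EmployeeID, b.DepartmentName
FROM employee a
LEFT OUTER JOIN department b
ON a.DepartmentID = b.DepartmentID;

SELECT a.EmployeeID, b.DepartmentName
FROM department b
RIGHT OUTER JOIN employee a
ON a.DepartmentID = b.DepartmentID;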

The final part I want to talk about is the logical structure of building a query. If you ever find yourself writing a query that joins multiple tables, making sure all the joins are correct can be burdensome.

I find that the easiest way to build a complex query is one table at a time. Write a simple select statement and build it up one table at a time. As you add in more tables, think of the whole query you currently have as the "table" you're joining your next table into, so you can consider your join conditions. This advice is probably more for those who are not deeply familiar with SQL, but I've found it invaluable when analyzing a complex query that is performing badly.

So we could start with a list of employees:

SELECT EmployeeID, EmployeeName
FROM EMPLOYEES

Then add in their departments, including employees not in a department at the moment:

SELECT a.EmployeeID, a.DepartmentID, b.DepartmentID, b.DepartmentName
FROM   employee a
LEFT OUTER JOIN department  b
ON a.DepartmentID = b.DepartmentID;

Then add in the description of the pay group that each employee is in, only listing those that have a pay group:

SELECT a.EmployeeID, a.DepartmentID, b.DepartmentID, b.DepartmentName, 
       a.PayGroup, c.PayGroupDesc
FROM   employee a
LEFT OUTER JOIN department  b
ON a.DepartmentID = b.DepartmentID
INNER JOIN paygroup c
ON a.PayGroup = c.PayGroup

And then finally, add in their physical location description. As a final selection criterion, include only those employees that are not terminated:

SELECT a.EmployeeID, a.DepartmentID, b.DepartmentID, b.DepartmentName, 
       a.PayGroup, c.PayGroupDesc, d.PhysicalLocDesc
FROM   employee a
LEFT OUTER JOIN department  b
ON a.DepartmentID = b.DepartmentID
INNER JOIN paygroup c
ON a.PayGroup = c.PayGroup
INNER JOIN physlocation d
ON a.PhysicalLoc = d.PhysicalLoc
WHERE a.terminated = 'N'

When looking at a complex query this way, it's easy to keep track of what your query is doing, rather than attempting to add every column you need to that select list right at the start and trying to build it all at once. The more tables you're trying to join together, the more likely you are to forget a join condition and end up with a nasty Cartesian join in there somewhere.
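
To show what that mistake looks like in its simplest form, using the tables from above: with the join condition forgotten, every employee row is paired with every department row.

-- Accidental Cartesian join: no join condition between the tables,
-- so this returns (rows in employee) x (rows in department) rows.
SELECT a.EmployeeID, b.DepartmentName
FROM employee a, department b;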

Tuesday, October 18, 2011

Expanding on Chris Shaw's Utility Database


"Well look at this. Appears we got here just in the nick of time. What does that make us?"
"Big damn heroes, Sir"
"Ain't. We. Just" - Mal and Zoe, Firefly - 'Safe'

When I went to SQL Saturday #94 here in Salt Lake City back in September, Chris Shaw (blog|twitter) did a presentation on a Utility Database - a database you as the DBA create to track certain things (disk free space, high use sprocs, a parse of the daily error log etc).

With the gracious permission of Mr. Shaw, I took what he started and modified it for my particular needs, and this posting is about what I've done and how I'm using it.

Chris presented the same topic at the PASS Summit 2011 in Seattle in October. After the Summit, Buck Woody (blog|twitter) tweeted a #sqlchallenge to blog about something you learned at the PASS Summit.

So, I'm cheating a little: what I'm writing about was presented at the Summit, but I actually learned about it a few weeks before.

The Utility Database is a DB you create. In it, there are a few stored procedures to collect some data daily and email it to you every day to alert you to certain things that may be happening in your system. It also keeps a historical log of these alerts. To quote from his slide deck on what this is: "A user defined, created database. Keeps information on your server about your server. Keeps information you want as long as you want it. Presents information the way you want to see it. This is your database. Acts as a repository for you the DBA".

I had been sending myself some emails along these lines every day. I hadn't thought about recording the data for historical purposes or reporting from it in order to predict disk usage trends, etc.

The original scripts associated with this presentation did the following:

  • Created two tables: the Daily Check List table (for today's issues) and a historical table to keep anything not from today
  • Gathered information on the local drives to record free space on each local drive on the server
  • Captured the size of each datafile in each database
  • Read the Error Log, parsed out certain messages that we don't want to see, and recorded the rest
  • Queried the Report Server log to get information on the number of reports executed
  • Queried the index physical stats DMO to report index fragmentation
  • Gathered statistics about the execution of stored procedures and the reads, writes and elapsed times of each
  • Called a CLR procedure to gather processor utilization information
  • Emailed a summary to the DBA

He also went on to set up a basic report in Report Server to email a pretty formatted report to himself each morning with all the information from the Daily Check List.

This did some of what I wanted, didn't record all of what I wanted and was performing extra steps that I just didn't need.

I'd never be able to get a CLR Routine DLL past change management (code unseen from an unknown third party), so that was dropped. Index Maintenance is a scheduled task each month already, so I really didn't need that. The application I'm primarily responsible for doesn't use Report Server and has an extremely limited (single digit, if that) number of stored procedure calls. That also went.

The file information displayed also wasn't showing everything I wanted and I needed the output a little different. So I set out to make my version of this Utility Database.

First, a disclaimer. I'm relatively new to SQL Server and the work I've been doing in it has certainly not been in T-SQL - I'm a DBA, not a developer. I'm not necessarily doing things the most elegant way, but for once daily or once a month reporting, what I have below is working for me.

I've got all of this in one large script for my implementation, but I'll break it down and annotate it for the purposes of this article.

First step: Create the database.


-- This script defines the objects needed to track daily reporting
-- against all databases installed in this instance of SQL Server
--
-- Ideas in this script have been culled from a presentation by
-- Chris Shaw at SQL Saturday 94 in Salt Lake City (@SQLShaw on twitter,
-- http://chrisshaw.wordpress.com) and from a series of articles 
-- on SQLServerCentral.com called SQL Overview.  
-- The last article in the series is linked here and it
-- refers back to all of the others.
-- http://www.sqlservercentral.com/articles/Monitoring/69650/
--
-- The SQL Overview series gave lots of good information, but didn't keep
-- any of it for trending analysis.  Chris Shaw's presentation gave ideas 
-- on how to keep that in line.
-- 

-- First, create the database

CREATE DATABASE [UtilityDB] 
    ON  PRIMARY 
( NAME = N'UtilityDB', 
  FILENAME = N'G:\MSSQL\Data\UtilityDB.mdf' , 
  SIZE = 102400KB , 
  MAXSIZE = UNLIMITED, 
  FILEGROWTH = 102400KB )
    LOG ON 
( NAME = N'UtilityDB_log', 
  FILENAME = N'G:\MSSQL\Data\UtilityDB_log.ldf' , 
  SIZE = 51200KB , 
  MAXSIZE = 2048GB , 
  FILEGROWTH = 51200KB )
GO

Next step was to create the tables that I'd be using for this database. I decided to create clustered indexes on them purely for performance on queries down the road. Over time, if never archived, these tables may grow to a significant size and having a clustered index would hopefully assist in performance since most of the queries are date based.

First couple of tables are the Daily Check List and the Historical Check List tables and the tables for storing disk free space and file growth.


-- Create the tables used in this utility

USE [UtilityDB];

-- This Table contains the daily information emailed to the DBA.
-- It is cleared out first thing each morning and moved to the 
-- cumulative history table

CREATE TABLE [dbo].[DayCheckList](
    [InstanceName] [varchar](50) NOT NULL,
    [EventStatus] [varchar](50) NOT NULL,
    [EventDate] [datetime] NULL,
    [EVENT] [varchar](2000) NOT NULL
) ON [PRIMARY]
GO

CREATE CLUSTERED INDEX IX_DayCheckList
    ON DayCheckList(EventDate)
GO

-- This is the cumulative history table.  It contains the complete
-- history of events captured by this utility in the range of from
-- yesterday back until when data capture started.

CREATE TABLE [dbo].[HistoryCheckList](
    [InstanceName] [varchar](50) NOT NULL,
    [EventStatus] [varchar](50) NOT NULL,
    [EventDate] [datetime] NULL,
    [EVENT] [varchar](2000) NOT NULL
) ON [PRIMARY]
GO

CREATE CLUSTERED INDEX IX_HistoryCheckList
    ON HistoryCheckList(EventDate)
GO

-- This table holds a history of the daily size of your datafiles.

CREATE TABLE [dbo].[FileSizeGrowth](
    [InstanceName] [varchar](50) NOT NULL,
    [DBName] [varchar](50) NOT NULL,
    [FileName] [varchar](50) NOT NULL,
    [PhysicalName] [varchar](200) NULL,
    [SizeInMB] [int] NULL,
    [SizeInGB] [int] NULL,
    [ReadingDate] [datetime]
) ON [PRIMARY]
GO

CREATE CLUSTERED INDEX IX_FileSizeGrowth
    ON FileSizeGrowth(ReadingDate)
GO

-- Not a huge fan of default values myself, but we can place this 
-- here just in case.

ALTER TABLE [dbo].[FileSizeGrowth] 
  ADD  DEFAULT (getdate()) FOR [ReadingDate]
GO

-- This is a work table used when populating the historical FileFreeSpace 
-- table.  It is truncated each day after calculating what is needed

CREATE TABLE [dbo].[FreeSpaceWork](
    [InstanceName] [varchar](50),
    [DBName] [varchar](50),
    [FileName] [varchar](50),
    [PhysicalName] [varchar](200) NULL,
    [TotalSizeInMB] [int] NULL,
    [FreeSizeInMB] [int] NULL
) ON [PRIMARY]
GO

-- This table keeps the daily tracking of the percentage of free space
-- in the data files.  This data will be used in the monthly reporting
-- for size trending.

CREATE TABLE [dbo].[FileFreeSpace](
    [InstanceName] [varchar](50),
    [DBName] [varchar](50),
    [FileName] [varchar](50),
    [PhysicalName] [varchar](200) NULL,
    [TotalSizeInMB] [int] NULL,
    [FreeSizeInMB] [int] NULL,
    [PctFreeSpace] [int],
    [ReadingDate] [datetime]
) ON [PRIMARY]
GO

CREATE CLUSTERED INDEX IX_FileFreeSpace
    ON FileFreeSpace(ReadingDate)
GO

-- Again, not a huge fan of default values myself, but we can place this 
-- here just in case.

ALTER TABLE [dbo].[FileFreeSpace] 
  ADD  DEFAULT (getdate()) FOR [ReadingDate]
GO

One of the things I added to my version of this was an indication in the daily email that a database hadn't been backed up in a certain number of days. For most user databases, one would expect that you'd want a backup at least every 24 hours - but maybe you're happy only backing up your system databases once a week. The database control table below lists, for each database you are going to report on, how far in the past (in days) the most recent backup can be before we get upset and alert about it.

Also, there may be databases on the server that you don't want reported. Maybe they are test, development or training databases and you don't particularly care about them. The ExcludeFromReporting table just contains the database name of any database you never want to see reported. Then I populated the seed data into these tables.


-- This table defines the maximum number of days that can elapse after
-- a database is backed up before it's reported as missing a backup.
-- For example, system databases may be backed up weekly, but others
-- are daily

CREATE TABLE [dbo].[DatabaseBackupControl](
    [DBName] [char](50) NOT NULL,
    [MaxDaysBetweenBackups] [int] NOT NULL
) ON [PRIMARY]

-- We want the data in this table to be unique, so put a unique key here.

CREATE UNIQUE CLUSTERED INDEX IX_DatabaseBackupControl 
    ON DatabaseBackupControl (DBName)
GO

-- This table defines any databases we don't want included in reporting,
-- such as test or development databases.

CREATE TABLE [dbo].[ExcludeFromReporting](
    [DBName] [char](50) NULL
) ON [PRIMARY]

-- We want the data in this table to be unique, so put a unique key here.

CREATE UNIQUE CLUSTERED INDEX IX_ExcludeFromReporting 
    ON ExcludeFromReporting (DBName)

-- Populate initial seed information into control tables.

-- Control settings for checking for missing backups

INSERT INTO DatabaseBackupControl VALUES ('myprod', 1)
INSERT INTO DatabaseBackupControl VALUES ('master', 7)
INSERT INTO DatabaseBackupControl VALUES ('model', 7)
INSERT INTO DatabaseBackupControl VALUES ('msdb', 7)
INSERT INTO DatabaseBackupControl VALUES ('monitoring', 7)
INSERT INTO DatabaseBackupControl VALUES ('ReportServer', 1)

-- Control settings for databases to exclude from checking

INSERT INTO ExcludeFromReporting VALUES ('mydev');
INSERT INTO ExcludeFromReporting VALUES ('mytst');
INSERT INTO ExcludeFromReporting VALUES ('mytrn');
INSERT INTO ExcludeFromReporting VALUES ('tempdb');
INSERT INTO ExcludeFromReporting VALUES ('LiteSpeedLocal');
INSERT INTO ExcludeFromReporting VALUES ('ReportServerTempDB');
INSERT INTO ExcludeFromReporting VALUES ('RSExecutionLog');

Moving along! The next thing I created was a view to make reporting a little easier. I also wanted my report to show me each day any SQL Server Agent jobs that had failed the previous day.


-- In order to simplify some of the reporting, views may be 
-- created that are snapshots of system catalog tables or 
-- views from msdb or master.

-- This view summarizes any SQL Server Agent job that failed yesterday.
-- The view is set up so that it can just be inserted into the daily
-- notification table when the daily procedures runs.

CREATE VIEW [dbo].[YesterdayFailedJobs] as
SELECT @@ServerName as ServerName, 
       'Job Failed' as Event_Status, 
       getdate() as Event_Date, 
       cast(run_date as char(8)) + '/' + cast(run_time as char(6)) + 
            ' Job: '+name + ' failed with ' + message as EventMsg
FROM msdb..sysjobs a
JOIN (SELECT * from msdb..sysjobhistory where sql_severity <> 0) b
ON a.job_id = b.job_id
WHERE b.run_date = cast(convert(char(8), getdate()-1, 112) as int)

OK, that defines all the tables and views. Next are the individual stored procedures that gather the various pieces of data. The first couple gather free space on the local drives and the size of the data files.


-- This stored procedure gathers information about the free space on 
-- the drives attached to the server.

CREATE PROCEDURE [dbo].[GetDriveSpaceFree]
AS
DECLARE @driveSpace TABLE (drive CHAR(2), MBFree int)
INSERT INTO @driveSpace
EXEC sp_executesql N'xp_fixeddrives'
INSERT INTO UtilityDB..DayCheckList
 SELECT @@ServerName, 
           'Drive Space', 
           GETDATE(), 
          'Free space on ' + drive + ' is ' + 
          CONVERT (VARCHAR(20), MBFree/1024) + ' Gigs' 
 FROM @driveSpace
GO

-- This stored procedure gathers information about the growth of your 
-- data files.

CREATE Procedure [dbo].[GetFileSizeGrowth]
AS
insert into FileSizeGrowth
  Select @@ServerName, 
         db_name(database_id),
         name,
         physical_name,
         SizeInMB = (size * 8 / 1024),
         SizeInGB = (size * 8 / 1024 / 1024),
         GETDATE()
  from sys.master_files
GO

CREATE Procedure [dbo].[GetFileFreeSpace]
AS
declare @RETURN_VALUE int
declare @command1 nvarchar(2000)
declare @yesterday datetime
declare @today datetime

set @command1 = 'use ?
    Select 
    @@ServerName, 
    ''?'',
    name,
 physical_name,
 TotalSizeInMB = size/128, 
 FreeSizeInMB = size/128.0 - 
             CAST(FILEPROPERTY(name, ''SpaceUsed'') AS int)/128.0
 from ?.sys.database_files'

Insert Into FreeSpaceWork (InstanceName, DBName, FileName, PhysicalName, 
      TotalSizeInMB, FreeSizeInMB)
 exec @RETURN_VALUE = sp_MSforeachdb @command1 = @command1;

Insert Into FileFreeSpace (InstanceName, DBName, FileName, PhysicalName, 
      TotalSizeInMB, FreeSizeInMB, PctFreeSpace, ReadingDate)
select InstanceName, 
       DBName,
       FileName, 
       PhysicalName, 
       TotalSizeInMB, 
       FreeSizeInMB, 
       FreeSizeInMB * 100 / CASE 
                            WHEN TotalSizeInMB = 0 
                            THEN 1 
                            ELSE TotalSizeinMB 
                            END,
       GETDATE()
from FreeSpaceWork
where DBName not in (select DBName from ExcludeFromReporting);

select @yesterday = dateadd(day, datediff(day, 0, getdate()-1), 0)
select @today = dateadd(day, datediff(day, 0, getdate()), 0)

insert into DayCheckList
select InstanceName, 'Free Space ('+DBName+')', 
       ReadingDate, 
       rtrim(ltrim(cast(PctFreeSpace as Char(3)))) + '% Freespace in ' + 
         PhysicalName
from FileFreeSpace
where ReadingDate between @yesterday and @today;

truncate table FreeSpaceWork;
GO

The next stored procedure reads the SQL Server error log and filters out things we don't want to see before recording the rest in the log. I'm also recording a startup event for any database started in the last 7 days.


-- This stored procedure reads the SQL Server Error Log and parses
-- out information we don't want to see and reports the rest.

Create Procedure GetFilteredErrorLog
AS
DECLARE @Errorlog TABLE (LogDate datetime, 
                         ProcessorInfo VARCHAR (100),
                         ErrorMSG VARCHAR(2000))

INSERT INTO @Errorlog
EXEC sp_executesql N'xp_readerrorlog'

Delete 
FROM @Errorlog 
WHERE ErrorMSG LIKE '%Log was backed up%'
   OR ErrorMSG LIKE '%Setting database option COMPATIBILITY_LEVEL%'
   OR ErrorMSG LIKE '%were backed up%'
   OR ErrorMSG LIKE '%DBCC TRACEON%'
   OR ErrorMSG LIKE '%without errors%'
   OR ErrorMSG LIKE '%\ERRORLOG%'
   OR ErrorMSG LIKE '%Attempting to cycle errorlog%'
   OR ErrorMSG LIKE '%Errorlog has been reinitialized.%' 
   OR ErrorMSG LIKE '%found 0 errors and repaired 0 errors.%'
   OR ErrorMSG LIKE '%This is an informational message%'
   OR ErrorMSG LIKE '%WARNING:%Failed to reserve contiguous memory%'
   OR ErrorMSG LIKE '%The error log has been reinitialized%'
   OR ErrorMSG LIKE '%Setting database option ANSI_WARNINGS%';
 
INSERT INTO UtilityDB..daychecklist
SELECT @@ServerName, 'Error Log', Logdate, SUBSTRING(ErrorMSG, 1, 2000) 
FROM @Errorlog 
WHERE LogDate > DATEADD(dd, -1, GETDATE()) 
 
INSERT INTO UtilityDB..daychecklist
SELECT @@ServerName, 'DB Restart',Logdate,ErrorMSG 
FROM @Errorlog 
WHERE ErrorMSG LIKE '%Starting up database%'
AND LogDate > DATEADD(dd, -7, GETDATE()) 
GO

Next, let's check for failed SQL Server Agent jobs and backups outside of our tolerances.


CREATE PROCEDURE [dbo].[CheckFailedJobs]
AS
declare @Count as int
select @Count=count(*) from YesterdayFailedJobs

if @Count > 0
Begin
  INSERT INTO daychecklist
  SELECT *
  FROM YesterdayFailedJobs
END
GO

-- This stored procedure checks recent backups against the control tables.
-- This query is a UNION of three selects, each checking different things.
-- If a database is found that has a backup and is not defined on the 
-- control table, it is reported (first select)
-- If a database is found where the last backup is outside the recovery 
-- window, it is reported (second select)
-- If a database is found that is not defined in either backup control 
-- table, it is reported (third select)

CREATE PROCEDURE CheckRecentBackups 
AS
insert into DayCheckList
select @@ServerName, 
       'Missing Backup', 
       getdate(), 
       a.database_name + ' last backed up ' + 
       convert(char(20), max(backup_finish_date), 100) + 
       ' and control record is not defined'
from (select * from msdb.dbo.backupset where type <> 'L') a
JOIN sys.databases b
ON a.database_name = b.name
LEFT OUTER JOIN DatabaseBackupControl c
ON a.database_name = c.DBName
where database_name not in (select DBName from ExcludeFromReporting)
and MaxDaysBetweenBackups is NULL
group by database_name, MaxDaysBetweenBackups
UNION
select @@ServerName, 
       'Missing Backup', 
       getdate(), 
       a.database_name+ ' last backed up '+ convert(char(20), 
       max(backup_finish_date), 100) + ' limit is ' + 
       cast(MaxDaysBetweenBackups as char(2)) + ' day(s)'
from (select * from msdb.dbo.backupset where type <> 'L') a
JOIN sys.databases b
ON a.database_name = b.name
LEFT OUTER JOIN DatabaseBackupControl c
ON a.database_name = c.DBName
where database_name not in (select DBName from ExcludeFromReporting)
group by database_name, MaxDaysBetweenBackups
having getdate() > 
    dateadd(day, MaxDaysBetweenBackups, max(backup_finish_date))
UNION
select @@ServerName, 
       'Missing Backup', 
       getdate(), 
       name + 
       ' not defined in either Control Table or Exclude table.   ' +
       'Please investigate.'
from sys.databases
where name not in (select DBName from ExcludeFromReporting)
and name not in (select DBName from DatabaseBackupControl)
GO

Almost there. This next procedure is the one that will query the daily check list, format it nicely into an HTML table and email it to you.


-- This stored procedure will take the contents of the Daily log table
-- and email it formatted as an HTML table to interested parties.

CREATE PROCEDURE [dbo].[SendDailyEmail] 
as
 DECLARE @Subject     varchar (100)
 SET @Subject='SQL Server - Daily Event Log'

declare @Count as int
select @Count=count(*) from DayCheckList

IF @Count > 0 
Begin

DECLARE @tableHTML  NVARCHAR(MAX) ;
SET @tableHTML =
        N'<table border="1">' +
    N'<tr><th>InstanceName</th>' +
    N'<th width="150">EventStatus</th>' +
    N'<th width="180">EventDate</th>' +
    N'<th>Event</th>' +
    N'</tr>' +
    CAST ( ( SELECT td=[InstanceName],''
      ,td=[EventStatus],''
      ,td=[EventDate],''
      ,td=[Event],''

  from DayCheckList
order by InstanceName, EventStatus
      FOR XML PATH('tr'), TYPE 
    ) AS NVARCHAR(MAX) ) +
    N'</table>' ;

EXEC msdb.dbo.sp_send_dbmail 
    @profile_name = 'SQL Server Agent Mail Profile',
    @recipients = 'me@mydomain.com',
    @subject = @Subject,
    @body = @tableHTML,
    @body_format = 'HTML' ;
END
GO

Almost done. This stored procedure will be run daily via a SQL Server Agent job to generate the email:


-- Finally, this is the stored procedure that ties it all together.
-- This procedure should be executed daily as a job

CREATE PROCEDURE [dbo].[DayCheck]
AS
------------------------Maintenance Work---------------------------------
INSERT INTO HistoryCheckList
 SELECT * FROM DayChecklist

IF @@error = 0
 BEGIN
  Delete UtilityDB..DayCheckList
 END
---------------------------Disk Space------------------------------------
EXECUTE GetDriveSpaceFree
EXECUTE GetFileSizeGrowth
EXECUTE GetFileFreeSpace
--------------------------Read Error Log---------------------------------
EXECUTE GetFilteredErrorLog
--------------------------Check Failed Jobs------------------------------
EXECUTE CheckFailedJobs
--------------------------Check Recent Backups---------------------------
EXECUTE CheckRecentBackups
--------------------------Send Daily Email-------------------------------
EXECUTE SendDailyEmail
GO

Finally, set up a SQL Server Agent job to run this last Stored Procedure (DayCheck) at some time that works for you.
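
For reference, a minimal sketch of such a job is below. The job name, schedule name and 6:00 AM start time are just placeholder examples; adjust them to whatever works in your environment.

USE msdb;
GO

-- Create the job and its single step, which runs DayCheck in UtilityDB
EXEC dbo.sp_add_job @job_name = N'UtilityDB Daily Check';
EXEC dbo.sp_add_jobstep @job_name = N'UtilityDB Daily Check',
    @step_name = N'Run DayCheck',
    @subsystem = N'TSQL',
    @database_name = N'UtilityDB',
    @command = N'EXECUTE dbo.DayCheck;';

-- Schedule it daily (freq_type 4 = daily, every 1 day) at 6:00 AM
EXEC dbo.sp_add_jobschedule @job_name = N'UtilityDB Daily Check',
    @name = N'Daily at 6 AM',
    @freq_type = 4,
    @freq_interval = 1,
    @active_start_time = 060000;

-- Attach the job to the local server
EXEC dbo.sp_add_jobserver @job_name = N'UtilityDB Daily Check',
    @server_name = N'(local)';
GO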

So, collecting all this data is wonderful, but what about using some of it to generate a report with trending information for data growth and the like? I created this query, which I run on the first of each month, to show me the growth of the files and the growth of the data within them.

Note the commented lines at the bottom of the query. There are all sorts of things you can do here. You can exclude the databases listed in the ExcludeFromReporting table. You can ignore log files (assuming they have been named *.LDF). You could also pick up only those records where there was growth of some sort, etc.


Use UtilityDB
go
IF EXISTS (SELECT name FROM sys.objects WHERE name = 'MonthlyReport')
    DROP TABLE MonthlyReport;

DECLARE @Today   DATETIME,
        @BDATE   DATETIME,
        @EDATE   DATETIME,
        @MINDATE DATETIME,
        @MAXDATE DATETIME,
        @DAYS    DECIMAL;

-- Today's date, modified to be midnight.
SELECT @Today = dateadd(dd, datediff(dd, 0, getdate())+0, 0)     
-- One Month back from today
SELECT @Today = DATEADD(month, -1, @Today);
-- First of last month
SELECT @BDATE = 
   DATEADD(dd,-(DAY(DATEADD(mm,1,@Today))-1),DATEADD(mm,0,@Today))  
-- First of this month   
SELECT @EDATE = 
   DATEADD(dd,-(DAY(DATEADD(mm,1,@Today))-1),DATEADD(mm,1,@Today))  
-- Calculate days in last month (using @Today, which has already been
-- moved back one month, so the day count matches the reported month)
SELECT @DAYS = 
   DAY(DATEADD(m,1,DATEADD(d,1-DAY(@Today),@Today))-1) 
-- Calculate the first and last dates in the date range
-- and adjust both to be midnight
SELECT @MINDATE = min(ReadingDate) 
  from FileFreeSpace 
  where ReadingDate >= @BDATE and ReadingDate < @EDATE;
SELECT @MAXDATE = max(ReadingDate) 
  from FileFreeSpace 
  where ReadingDate >= @BDATE and ReadingDate < @EDATE;
SELECT @MINDATE = dateadd(dd, datediff(dd, 0, @MINDATE)+0, 0)
SELECT @MAXDATE = dateadd(dd, datediff(dd, 0, @MAXDATE)+0, 0)

select a.InstanceName, a.DBName, a.FileName, a.PhysicalName, 
       a.TotalSizeInMB as StartSize, b.TotalSizeInMB as EndSize,
       (b.TotalSizeInMB - a.TotalSizeInMB) as Growth,
       (a.TotalSizeInMB - a.FreeSizeInMB) as StartUsed,
       (b.TotalSizeInMB - b.FreeSizeInMB) as EndUsed,
       (b.TotalSizeInMB - b.FreeSizeInMB) - 
          (a.TotalSizeInMB - a.FreeSizeInMB) as DiffUsed,
       CAST((((b.TotalSizeInMB - b.FreeSizeInMB) -   
              (a.TotalSizeInMB - a.FreeSizeInMB)) / 
              @Days) as DECIMAL(9,2)) as AvgGrowthDay,
       (b.FreeSizeInMB * 100) / 
       CASE b.TotalSizeInMB 
       WHEN 0 
       THEN 1 
       ELSE b.TotalSizeInMB 
       END as PctFreeEOM
into MonthlyReport
from (select * 
      from FileFreeSpace 
      where dateadd(dd,datediff(dd,0,ReadingDate)+0,0) = @MINDATE) a,
     (select * 
      from FileFreeSpace 
      where dateadd(dd,datediff(dd,0,ReadingDate)+0,0) = @MAXDATE) b
where a.InstanceName = b.InstanceName
and a.DBName = b.DBName
and a.FileName = b.FileName
and a.PhysicalName = b.PhysicalName
and a.DBName not in (select DBName from ExcludeFromReporting)
and upper(a.PhysicalName) not like '%.LDF'
order by a.DBName, a.FileName
GO
-- and DBName not in (select DBName from ExcludeFromReporting)
-- Having ((max(FreeSizeInMB) - min(FreeSizeInMB)) / @Days)
-- and upper(a.PhysicalName) not like '%.LDF'

DECLARE @Subject     varchar (100)
SET @Subject='SQL Server - Monthly File Growth Statistics'

declare @Count as int
select @Count=count(*) from MonthlyReport

IF @Count > 0 
Begin

DECLARE @tableHTML  NVARCHAR(MAX) ;
SET @tableHTML =
        N'<table border="1">' +
    N'<tr><th>InstanceName</th>' +
    N'<th>DBName</th>' +
    N'<th>FileName</th>' +
    N'<th>PhysicalName</th>' +
    N'<th>StartSize</th>' +
    N'<th>EndSize</th>' +
    N'<th>Growth</th>' +
    N'<th>StartUsed</th>' +
    N'<th>EndUsed</th>' +
    N'<th>DiffUsed</th>' +
    N'<th>AvgGrowthDay</th>' +
    N'<th>PctFreeEOM</th>' +
    N'</tr>' +
    CAST ( ( SELECT td=[InstanceName],''
      ,td=[DBName],''
      ,td=[FileName],''
      ,td=[PhysicalName],''
      ,td=[StartSize],''
      ,td=[EndSize],''
      ,td=[Growth],''
      ,td=[StartUsed],''
      ,td=[EndUsed],''
      ,td=[DiffUsed],''
      ,td=[AvgGrowthDay],''
      ,td=[PctFreeEOM],''
  from MonthlyReport
order by DBName, FileName
      FOR XML PATH('tr'), TYPE 
    ) AS NVARCHAR(MAX) ) +
    N'</table>' ;

EXEC msdb.dbo.sp_send_dbmail 
    @profile_name = 'SQL Server Agent Mail Profile',
    @recipients = 'me@mydomain.com',
    @subject = @Subject,
    @body = @tableHTML,
    @body_format = 'HTML' ;
END
GO

IF EXISTS (SELECT name FROM sys.objects WHERE name = 'MonthlyReport')
    DROP TABLE MonthlyReport;


So this is an example of the kind of thing I'm using this sort of data for. Hopefully someone else will find this useful too.

Any suggestions for improvement, please leave comments!

Thursday, October 13, 2011

Things I've learned at PASS Summit 2011


"Wow". "Wow!" "WOW!" - Flt Lt Hendley, Capt Hilts and another American PoW, The Great Escape
Today's not-so-random movie quote pretty much sums up my feelings about the PASS Summit 2011 so far and the SQL Server community in general.

I've been working in SQL Server for a couple of years but I've only just recently discovered PASS and all the goodies that come with it.

I attended SQL Saturday 94 in Salt Lake City last month and got some great information out of that. I joined Twitter and started watching #sqlpass and #sqlhelp. (In fact, I had an issue at midnight my time the other night, tweeted to #sqlhelp and had a couple of good ideas inside 15 minutes.)

Right after SQL Saturday, I started this blog to record assorted technical musings.

This was all leading up to the PASS Summit and I'm amazed at what I've found. In the last few weeks, I signed up for the Big Brothers / Big Sisters program. I was assigned a Big Brother and a group of fellow First Timers. We swapped some emails and shared a little bit of info.

Arriving at the Summit on Monday for my first precon, I sat in and listened while Grant Fritchey and Gail Shaw talked about Execution Plans. I learned a great deal in those 7 hours. I don't know that there's much I can immediately take back to work (since the software I work with is provided by a third-party vendor and there's not a lot you can do to modify the code), but my big takeaway was wanting to learn more about the optimizer and execution plans. One thing that I specifically remember learning was the importance of Referential Integrity constraints in the Optimizer's simplification phase. The app I work with has all the relationships in the business logic and doesn't put the RI constraints on the database - I never knew that this could have an adverse effect on optimization.

Tuesday's precon was with Adam Machanic who talked about Performance Tuning. Lots of good stuff came out of this one, including a wonderful stored procedure called sp_whoisactive. This is going on all my servers the moment I set foot in the door at work next Monday. It slices, it dices and it makes getting active session data out of SQL Server easy.

The Welcome Reception was interesting - or at least the entry was. Ever wanted to feel like you were on the field at the Super Bowl? The First Timers (there were hundreds of us) were brought into the main ballroom to cheering alumni, spotlights, balloons and smoke. Very glitzy. Of course, the MVP on stage, who shall remain nameless, probably disappointed half the crowd by not knowing the name of the 4th Star Trek movie during the Quiz Bowl :)

Official day 1 of the conference, with the smaller, concentrated sessions, had me in a jam. So many things to see and do and only one of me. Thankfully, the audio for each session will be included on the session DVDs, which I have on order - a nice chance to review later. Today I saw a couple of great sessions on I/O subsystems, attended the Lightning Talk sessions (Rob Farley can sing, play the guitar and write a nifty little song - who knew?) and watched Brent Ozar unveil his latest BLITZ! script.

This has all been information overload, but some of the best has been meeting and hanging out with some new people. I've enjoyed the company of a couple of new friends from my first timer's group (Go Team Honey Badgers!) and hope that after the event is over we can continue to keep in touch.

The Midnight DBAs put on a fun party tonight, "Inappropriate Sessions", which was as entertaining as it was inappropriate.

I look forward to the rest of the Summit and seeing what else I can take away from this week - not just new knowledge, but new friends and relationships that hopefully will last a long time.

Saturday, October 8, 2011

Tricks to sending email from SQL Server


"This is the Captain. We have a little problem with our entry sequence, so we may experience some slight turbulence and then... explode." - Captain Malcolm Reynolds, Serenity

I had a requirement recently on SQL Server 2005 to write a trigger that called a stored procedure to send an email when a certain event happened. So I logged into SQL Server Management Studio, coded up my trigger and stored procedure, executed the update statement that would fire the trigger, and got the nice "Mail queued" message in the results window at the bottom.

Next thing was to execute the application that performed the same update. Upon trying it there, it failed with no readily apparent error code. Now, being an old Oracle guy, my grasp of some of the finer points of T-SQL error trapping is not up to par - I'm still learning the fine art of T-SQL coding - so my rudimentary attempt at error trapping wasn't doing what it was supposed to do.

It suddenly hit me that since the application runs under a different user than my ID, it must be a permissions error. So I granted execute on the stored procedure to the application owner and tried again. Same thing.

Next step was to fire up SQL Profiler and trace the execution, and I could see that the failure was occurring at the execution of sp_send_dbmail. The next brilliant idea in the thought process was to log in to SSMS as the application owner and see what happened.

That's when I got this message:

EXECUTE permission denied on object 'sp_send_dbmail', 
database 'msdb', schema 'dbo'.

Did a little research on this error message and was reminded that in order to use sp_send_dbmail, a user needs to be granted execute on msdb.dbo.sp_send_dbmail. To do that, a user needs to be created in msdb with the same name as the user in the target database that we're running against. Also, the user needs to be a member of the DatabaseMailUserRole:

EXEC msdb.dbo.sp_addrolemember @rolename = 'DatabaseMailUserRole', 
   @membername = '' ;
GO

So, set that up and ran it again. Still failed. Ah, the pain. There was a different message this time though:

Profile not valid

A little further investigation and I realized that I needed to grant access to the email profile to the user in msdb.

Getting all that done finally made it work.

The long and short of this rambling story is that in order to use sp_send_dbmail, there are several pieces that need to be set up (a consolidated script sketch follows the list):

  • A user needs to be set up in msdb matching the user in the calling database
  • The msdb user needs to be added to the DatabaseMailUserRole in msdb
  • The user in msdb needs to be granted access to the Mail Profile via the Database Mail Configuration Wizard
  • If called from a Stored Procedure, the user needs to be granted execute on the Stored Procedure (but then, that's a given).
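
Putting those pieces together, here's a minimal setup sketch. The login and user names (AppLogin, AppUser) are hypothetical placeholders, and sysmail_add_principalprofile_sp is the scripted equivalent of granting profile access through the Database Mail Configuration Wizard:

USE msdb;
GO

-- Create a user in msdb matching the calling database's user
-- (AppLogin / AppUser are placeholder names)
CREATE USER AppUser FOR LOGIN AppLogin;
GO

-- Add the msdb user to the DatabaseMailUserRole
EXEC msdb.dbo.sp_addrolemember @rolename = 'DatabaseMailUserRole',
   @membername = 'AppUser';
GO

-- Grant the msdb user access to the mail profile
EXEC msdb.dbo.sysmail_add_principalprofile_sp
   @principal_name = 'AppUser',
   @profile_name = 'SQL Server Agent Mail Profile',
   @is_default = 1;
GO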

References:
Troubleshooting Database Mail: Permission denied on sp_send_dbmail
Troubleshooting Database Mail: Profile not valid