SQL Server Performance Dashboard Reports

SQL Server 2012 Performance Dashboard Reports is one of the most popular SQL Server monitoring solutions for customers and the SQL community. It leverages dynamic management views (DMVs) for monitoring and reporting and is available at no cost. SQL Server Performance Dashboard Reports ships as a set of custom reports in SQL Server Management Studio (SSMS) that run against the instance connected in Object Explorer. When monitoring large enterprise deployments of SQL Server, hosting the Performance Dashboard Reports on a central reporting server provides additional benefits, making it easier for enterprise DBAs to monitor and troubleshoot SQL Server issues. To support hosting the SQL performance dashboard reports on a central SQL Server Reporting Services instance, we have customized SQL Server 2012 Performance Dashboard Reports, added new reports, and uploaded them to the Tiger toolbox GitHub repository for customers and the SQL community. The reports are tested against SQL Server 2012, SQL Server 2014 and SQL Server 2016 target instances, and can be deployed to a SQL Server 2012, SQL Server 2014 or SQL Server 2016 Reporting Services instance.

Following are some of the benefits of hosting the SQL Performance Dashboard Reports on a central SSRS reporting server.

  • Monitoring reports accessible anytime, anywhere using a browser – This removes the dependency on a thick client like SQL Server Management Studio (SSMS) being present on the workstation, allowing DBAs and a DevOps audience to check the health and resource consumption of a SQL Server instance using a web browser from any machine with access to the report server.
  • Scheduling automatic report delivery – SSRS allows scheduled email or file share delivery of reports. DBAs, application owners and database stakeholders can choose a push model whereby performance health reports run against specified SQL Server instances at specified times and are delivered to their mailbox, letting them proactively monitor the overall health of a SQL Server instance and detect anomalies.
  • Performance baselining using report snapshots – SSRS can capture scheduled point-in-time report snapshots at specified intervals, allowing DBAs to establish performance baselines from historical snapshots of the target SQL Server instances.
  • Linked reports for application owners and other stakeholders – In an enterprise environment, most application teams and stakeholders want to see the performance, resource consumption, blocking information and overall health of their SQL Server instance on demand. In such scenarios, DBAs can create linked reports for the target SQL Server instances on the central SSRS server and delegate permission to view them. This lets application teams and developers check the overall health of their own instances, freeing up DBA bandwidth; the DBA only needs to be contacted when an anomaly or problem is detected.

Click here to view the rest of this post.

SQL Server DBCC CHECKDB Overview

Database corruption can cause serious damage to a SQL Server database. If you're an experienced DBA, you probably have safeguards in place to detect it, but over the years I've seen hundreds of SQL Servers with no detection methods at all, and that's a problem. There are a few ways to detect database corruption, but this tip will focus on DBCC CHECKDB.

You may or may not have heard of DBCC (database console commands) statements. These statements are used to perform different operations in your database and can be broken down into four categories: Maintenance, Miscellaneous, Informational, and Validation. I use some of the DBCC statements on a daily basis, but none more than DBCC CHECKDB.
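
To make those categories concrete, here's a quick sampler with one statement from each category (all are documented DBCC commands; 'dbo.YourTable' is a placeholder for one of your own tables):

-- Informational: report transaction log space usage for all databases
DBCC SQLPERF (LOGSPACE);

-- Miscellaneous: display syntax help for a given DBCC statement
DBCC HELP ('CHECKDB');

-- Maintenance: reclaim space from dropped variable-length columns (0 = current database)
DBCC CLEANTABLE (0, 'dbo.YourTable');

-- Validation: check the integrity of a single table
DBCC CHECKTABLE ('dbo.YourTable');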

What is SQL Server DBCC CHECKDB?

DBCC CHECKDB, as described in the Microsoft MSDN Library, checks the logical and physical integrity of all the objects in the specified database by performing the following operations:

  • Runs DBCC CHECKALLOC on the database – Checks consistency of disk space allocation structures for a specified database.
  • Runs DBCC CHECKTABLE on every table and view in the database – Checks the integrity of all the pages and structures that make up the table or indexed view.
  • Runs DBCC CHECKCATALOG on the database – Checks for catalog consistency within the database.
  • Validates the contents of every indexed view in the database.
  • Validates link-level consistency between table metadata and file system directories and files when storing varbinary(max) data in the file system using FILESTREAM.
  • Validates the Service Broker data in the database.

If you’ve ever run DBCC CHECKDB, you know it takes some time for large databases. Now that you know all of the steps involved, you can see why it takes time to complete.

How can SQL Server DBCC CHECKDB help me?

Data corruption is bad. It can cause all sorts of issues within the database, including incorrect data results and failed SQL statements, and in some cases it can take down the entire SQL instance. DBCC CHECKDB warns you of corruption so that you can, hopefully, fix it before it gets too bad.

How do I use SQL Server DBCC CHECKDB?

DBCC CHECKDB is pretty straightforward. There are a few options you can use with the statement, and we’ll go over some of those in the next section, but the basic syntax looks like this:

DBCC CHECKDB ('DatabaseName')

Pretty simple.
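
As a quick taste of the options covered in the next section, here are two commonly used variations (both are documented WITH options; the database name is a placeholder):

-- Suppress informational messages so only errors are reported
DBCC CHECKDB ('DatabaseName') WITH NO_INFOMSGS;

-- Limit checking to physical structures; faster on large databases, but skips logical checks
DBCC CHECKDB ('DatabaseName') WITH PHYSICAL_ONLY;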

Automate SQL Server DBCC CHECKDB

Obviously, you don’t want to log in every morning and run this statement on each database, so you can automate this process using a few different methods:

  • SQL Server Maintenance plans – Maintenance plans are part of SQL Server out of the box (unless you’re running Express Edition). I don’t like using maintenance plans for the most part, but I don’t mind using them for this type of task. In the Maintenance Plan toolbox you’ll need to use the Check Database Integrity task. The only configurable option is whether to include indexes, so it’s not very flexible, but in some cases this is all you need. Again, we’ll talk about other options in the next section.

  • Custom scripts – Custom scripts are usually what I use, and they offer the best flexibility for adding the options you want. My go-to scripts are already created and free to use from Ola Hallengren. He’s done a wonderful job of creating these and sharing them with the world. Thanks Ola! A sketch of calling his integrity check procedure appears below.
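
If you go the Ola Hallengren route, the relevant procedure is dbo.DatabaseIntegrityCheck. A minimal sketch, assuming you've installed his scripts in the master database (check your own installation for the database and parameter values you want):

-- Run CHECKDB against all user databases using Ola Hallengren's procedure
EXECUTE master.dbo.DatabaseIntegrityCheck
@Databases = 'USER_DATABASES',
@CheckCommands = 'CHECKDB';

You can then schedule this in a SQL Server Agent job (or with Task Scheduler on Express, as shown in a later section).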

Click here to view the rest of this post.

SQL Server Index Fragmentation Overview

We’ve all heard about database/index fragmentation (and if you haven’t, continue reading), but what is it? Is it an issue? How do I know if it exists in my database? How do I fix it? Each of these questions could be a tip in itself, but I’ll try to give you an idea of each in this post.

Without going into a lot of detail, SQL Server stores data on 8KB pages. When we insert data into a table, SQL Server will allocate one page to store that data, unless the data inserted is more than 8KB, in which case it would span multiple pages. Each page is assigned to one table. If we create 10 tables then we’ll have 10 different pages.

As you insert data into a table, the data goes to the transaction log file first. The transaction log is a sequential record: as you insert, update, and delete records, the log records these transactions from start to finish. The data file, on the other hand, is not sequential. The log file flushes the data to the data file, creating pages all over the place.

Now that we have an idea of how data is stored, what does this have to do with fragmentation?

There are two types of fragmentation: Internal Fragmentation and External Fragmentation.

SQL Server Internal Fragmentation

SQL Server Internal Fragmentation is caused by pages that have too much free space. Let’s pretend at the beginning of the day we have a table with 40 pages that are 100% full, but by the end of the day we have a table with 50 pages that are only 80% full because of various delete and insert statements throughout the day. This causes an issue because now when we need to read from this table we have to scan 50 pages instead of 40, which may result in a decrease in performance. Let’s see a quick and dirty example.

Let’s say I have the following table with a Primary Key and a non-clustered index on FirstName and LastName:
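
The actual table is shown in the screenshot below; a plausible definition matching that description might look like this (the table and index names here are my assumptions, not the original ones):

-- Hypothetical table: clustered primary key plus a non-clustered index on the names
CREATE TABLE dbo.Person (
ID INT IDENTITY(1,1) NOT NULL CONSTRAINT PK_Person PRIMARY KEY,
FirstName VARCHAR(50) NOT NULL,
LastName VARCHAR(50) NOT NULL
);

CREATE NONCLUSTERED INDEX IX_Person_Name ON dbo.Person (FirstName, LastName);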

IndexFragmentation

I’ll talk about ways to analyze fragmentation later in this tip, but for now we can right click on the index, click Properties, and then Fragmentation to see fragmentation and page fullness. This is a brand new index so it’s at 0% fragmentation.

IndexFragmentation
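
As a preview of the analysis section, the same information is available in T-SQL through the sys.dm_db_index_physical_stats DMV (a standard DMV; the table name below assumes the hypothetical dbo.Person sketched above):

-- Check fragmentation and page fullness for all indexes on a table
SELECT i.name AS IndexName,
ips.avg_fragmentation_in_percent,
ips.avg_page_space_used_in_percent,
ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.Person'), NULL, NULL, 'SAMPLED') AS ips
JOIN sys.indexes AS i ON i.object_id = ips.object_id AND i.index_id = ips.index_id;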

Click here to view the rest of this post.

Creating a Backup plan on SQL Express using Ola Hallengren’s scripts

SQL Express doesn’t include SQL Server Agent, so we can’t schedule jobs the normal way. Follow this post to create a backup plan that backs up all the databases using Windows Task Scheduler.

  • Download CommandExecute – https://ola.hallengren.com/scripts/CommandExecute.sql
  • Download DatabaseBackup – https://ola.hallengren.com/scripts/DatabaseBackup.sql
  • Execute both of these scripts against the target server to create the stored procedures

Backups with Ola 1

  • Open Notepad on the target server and copy in the following code, changing the path to where the backups will be stored and the cleanup time. The cleanup time is specified in hours:
EXECUTE dbo.DatabaseBackup
@Databases = 'USER_DATABASES',
@Directory = 'C:\Backups',
@BackupType = 'FULL',
@Compress = 'Y', 
@CleanupTime = 48
  • Save the file with a .sql extension

Backups with Ola 2

  • Start the Task Scheduler by clicking Start Menu/All Programs/Accessories/System Tools/Task Scheduler
  • Click Create Basic Task to start the Scheduled Task Wizard
  • Type a name for the Task

Backups with Ola 3

  • Choose Daily from the scheduling options

Backups with Ola 4

  • Click Next and specify when to run the task. Set Start Time to a time when the load on the server is low, set the recur every option to 1 day, and click Next

Backups with Ola 5

  • Click Browse. Browse to SQLCMD.exe (C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn\SQLCMD.EXE) and then click Open
  • Type the following content into the Add arguments text box and then click Next
    • -S ServerName -i "c:\SQLScripts\SQLBackups.sql"

Backups with Ola 7

  • Select the checkbox to Open the Advanced Properties for this task and click Finish

Backups with Ola 8

  • Check Run whether user is logged on or not and Run with highest privileges under Security Options, then click OK.

Backups with Ola 9

**Note**

You may have to run the command in a command prompt until it succeeds, then copy it exactly from CMD into the arguments section.
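
For example, a test run from a command prompt might look like this (the instance name and script path are placeholders; -E uses Windows authentication):

SQLCMD.EXE -S .\SQLEXPRESS -E -i "c:\SQLScripts\SQLBackups.sql"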

Capturing Deadlocks with Extended Events

I’ve been noticing a lot of deadlocks on my server. What’s the best way to track down the queries so I can fix the problem?

There are a few different ways to capture deadlock information. You can turn on a trace flag (1222) to write deadlock information to the error log, set up a Profiler trace to capture the deadlock graph, or set up an Extended Events session to capture all sorts of information.
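
For reference, enabling that trace flag globally is a one-liner (documented syntax; the -1 applies it server-wide until the next restart):

-- Write deadlock details to the SQL Server error log
DBCC TRACEON (1222, -1);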

I’m going to focus on setting up an Extended Events session in this post, since Microsoft continues to say Profiler will not be included in future versions. Extended Events are the future, so why not start using them now?

In SSMS, drill down to Management, Extended Events. Right click on Sessions and click New Session Wizard:

Deadlocks with Extended Events 1

Click next on the Introduction screen and give the Session a name. I’m going to name this session Deadlocks:

Deadlocks with Extended Events 2

Click next. On the Choose Template screen you can choose a predefined template (like Profiler) or you can create your own events by choosing “Do not use a template”. For this post, let’s create our own:

Deadlocks with Extended Events 3

Click next and you’ll see hundreds of events (like Profiler). We only want to capture deadlock data so let’s scroll down to the very bottom and choose xml_deadlock_report. Click on the event and click the right arrow to move it into the Selected Events box:

Deadlocks with Extended Events 4

You can choose other events if needed, but for the simplicity of this post I’m just going to use this one. Click next. The Capture Global Fields page allows us to select what fields we want to capture. These are unique to each event selected. For this example, I’ll choose the following fields:

  • callstack
  • client_app_name
  • client_hostname
  • database_id
  • database_name
  • plan_handle
  • process_id
  • sql_text
  • transaction_id
  • transaction_sequence

Deadlocks with Extended Events 5

Click next. On this page you can apply filters if needed. I’ll set up a filter so that I only capture data from the RollTide database. There are hundreds of different filters that can be configured so that you don’t pull back data you don’t need:

Deadlocks with Extended Events 6

Click next to reach the Session Data Storage page. This page allows you to save data to a file or work with only the most recent data. I don’t want to keep thousands upon thousands of events, so I’ll choose “Work with only the most recent data”.

Deadlocks with Extended Events 7

The next page summarizes all the options we have selected. You can also script this session if you need to create it on other servers or save it for later. Click Finish to create the new session.
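
If you do script it out, the generated DDL should look roughly like this (a hedged reconstruction with only a few of the selected actions shown; your scripted version will list everything chosen in the wizard):

CREATE EVENT SESSION [Deadlocks] ON SERVER
ADD EVENT sqlserver.xml_deadlock_report (
ACTION (sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_name, sqlserver.sql_text)
WHERE (sqlserver.database_name = N'RollTide'))
ADD TARGET package0.ring_buffer;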

The last page allows you to start the session immediately and watch live data. For this post, I’ll choose both:

Deadlocks with Extended Events 8

You should see the new session under Extended Events and the Live Data tab should appear:

Deadlocks with Extended Events 9

Deadlocks with Extended Events 10

Once a deadlock occurs, it should appear in the Live Data window:

Deadlocks with Extended Events 11

This view shows all of the fields we selected including the XML report. If you click on the Deadlock tab, you’ll see the graph:

Deadlocks with Extended Events 12

You can also use the following query to see detailed information, including the deadlock graph and event XML:

SELECT
    DATEADD(mi, DATEDIFF(mi, GETUTCDATE(), CURRENT_TIMESTAMP), DeadlockEventXML.value('(event/@timestamp)[1]', 'datetime2')) AS [EventTime],
    DeadlockEventXML.value('(//process[@id[//victim-list/victimProcess[1]/@id]]/@hostname)[1]', 'nvarchar(max)') AS HostName,
    DeadlockEventXML.value('(//process[@id[//victim-list/victimProcess[1]/@id]]/@clientapp)[1]', 'nvarchar(max)') AS ClientApp,
    DB_NAME(DeadlockEventXML.value('(//process[@id[//victim-list/victimProcess[1]/@id]]/@currentdb)[1]', 'nvarchar(max)')) AS [DatabaseName],
    DeadlockEventXML.value('(//process[@id[//victim-list/victimProcess[1]/@id]]/@transactionname)[1]', 'nvarchar(max)') AS VictimTransactionName,
    DeadlockEventXML.value('(//process[@id[//victim-list/victimProcess[1]/@id]]/@isolationlevel)[1]', 'nvarchar(max)') AS IsolationLevel,
    DeadlockEventXML.query('(event/data[@name="xml_report"]/value/deadlock)[1]') AS DeadLockGraph,
    DeadlockEventXML
FROM
(
    SELECT
        XEvent.query('.') AS DeadlockEventXML,
        Data.TargetData
    FROM
    (
        SELECT
            CAST(target_data AS XML) AS TargetData
        FROM sys.dm_xe_session_targets st
        JOIN sys.dm_xe_sessions s ON s.address = st.event_session_address
        WHERE s.name = 'Deadlocks'
            AND st.target_name = 'ring_buffer'
    ) AS Data
    CROSS APPLY TargetData.nodes('RingBufferTarget/event[@name="xml_deadlock_report"]') AS XEventData(XEvent)
) AS DeadlockInfo

Correct SQL Server TempDB Spills in Query Plans Caused by Outdated Statistics

Statistics are an integral part of SQL Server and query performance. In short, the query optimizer uses statistics to create query plans that improve the overall performance of the queries run. Each statistics object is created on a list of one or more table columns and includes a histogram displaying the distribution of values in the first column. The histogram can have up to 200 steps, regardless of the number of rows in the column or index.
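
If you want to see a histogram for yourself, DBCC SHOW_STATISTICS will display it (documented syntax; the table and index names below reference the covering index created later in this post):

-- Display the histogram for the statistics behind a specific index
DBCC SHOW_STATISTICS ('Person.Person', 'IX_Person_EmailPromotion_INCLUDES') WITH HISTOGRAM;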

In this post we’ll take a look at one specific performance issue that you might find in an execution plan of a query. If you’ve ever noticed the following warning, then this post is for you:

1 - TempDB spills

Within the AdventureWorks2014 database, I’ll use the following query for my example:

SELECT BusinessEntityID, FirstName, LastName, EmailPromotion
FROM [AdventureWorks2014].[Person].[Person]
WHERE EmailPromotion > 0
ORDER BY LastName

Looking at this query I can already tell it could benefit from a covering index, so I’ll go ahead and add one:

CREATE NONCLUSTERED INDEX [IX_Person_EmailPromotion_INCLUDES]
ON [Person].[Person] ([EmailPromotion])
INCLUDE ([BusinessEntityID],[FirstName],[LastName])

When the index above was created, its statistics were automatically created and updated. Since then, I’ve added a few thousand rows to the Person table.

Let’s run the query and make sure the “Include Actual Execution Plan” button is selected.

2 - TempDB spills

After the query executes let’s take a look at the execution plan by clicking on the tab in the Results pane:

3 - TempDB spills

These warnings were added to SQL Server Management Studio 2012, so if you’re using an older version you may not see them. The “spill data to TempDB” warning means the query was not granted enough memory to finish the operation and spilled over into TempDB to complete it. Reading from memory is much faster than reading from disk, and that’s exactly the problem here: the query read as much as it could from memory before moving over to the TempDB disk.
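
The full post walks through the fix; given the title, the core of it is refreshing the outdated statistics so the optimizer’s row estimates (and therefore the memory grant) are accurate again. A minimal sketch, assuming the index created above:

-- Rebuild the statistics from all rows so the optimizer sees the newly added data
UPDATE STATISTICS Person.Person IX_Person_EmailPromotion_INCLUDES WITH FULLSCAN;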

Click here to view the rest of this post.

SQL Server 2016 Compare Showplan Option

Just released in SQL Server 2016 CTP 2.4 is a new feature that will allow us to compare execution plans. The main purpose of this feature is to provide a side-by-side comparison of two execution plans. This makes it easier to find similarities and changes on each plan. This is a very nice enhancement that aids in troubleshooting issues such as understanding the impact of rewriting queries or observing how a design change, such as an index, may impact the plan. In the tip below, we’ll go over how it works.

For this post, I’m using the AdventureWorks2014 database and I’m going to use the following query:

SELECT FirstName, LastName, CreditCardID 
FROM Person.Person p 
INNER JOIN Sales.PersonCreditCard c ON p.BusinessEntityID = c.BusinessEntityID
WHERE p.ModifiedDate > '2014-01-01'

Open SQL Server Management Studio (SSMS) 2016 and execute the query with SET STATISTICS IO ON and “Include Actual Execution Plan” enabled.
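
In script form, the statistics setting is just this (standard T-SQL; the execution plan option is toggled with the SSMS toolbar button or Ctrl+M):

-- Report logical/physical reads per table in the Messages tab
SET STATISTICS IO ON;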

1-Compare Showplan SQL Freelancer

Click the Messages tab and you’ll notice that we’re doing 3819 logical reads on the Person table. This is a sign that this query can be tuned. We’d like to get this number as low as possible.

2-Compare Showplan SQL Freelancer

Click the execution plan tab and you’ll see the plan along with some Missing Index Details.

3-Compare Showplan SQL Freelancer

The execution plan is what we’ll focus on in this tip, but I wanted to look at STATISTICS IO also to see if we can improve the number of logical reads.

To be able to compare plans we’ll need to save the execution plan. Right click anywhere in the plan window and select Save Execution Plan As… Choose a name and location and hit Save.

Click here to view the rest of this post.

SQL Server Management Studio 2015

Microsoft SQL Server Management Studio is a product I use throughout the day, every day. I’ve tried using other environments to access and manage SQL Server, but SSMS is what I learned on and what I always go back to. The problem with SSMS was that you couldn’t download it individually; you always had to have a licensed copy of SQL Server or install SQL Server Express with Tools to get it… until now!

Microsoft has finally released a standalone download of SSMS. This release supports SQL Server 2016 through SQL Server 2005. It also provides the greatest level of support when working with Azure.

Some enhancements include:

  • New SSMS Installer – SSMS can now be installed with a lightweight stand-alone web installer.
  • SSMS Updates – Receive notification within SSMS when new updates are available, and choose to install them at your convenience.
  • Enhanced SSMS support for Azure SQL Database – Several fixes and enhancements, including expanded SQL Server Management Objects (SMO) coverage, and an updated Import/Export wizard.

Let’s step through the install:

Download the SSMS-Web-Setup.exe from here.

Start the install, agree to the license terms, and click Install.

SSMS 2015 SQL Freelancer

It should take a few minutes, but once this completes restart your computer and that’s it! Easy install.

SSMS 2015 SQL Freelancer 2

Passing Multiple Values into a Variable

Passing multiple values into a variable is a little more difficult than it should be. In other languages you can use constructs such as lists or arrays, but SQL makes it a bit more complicated. To show you what I mean, let’s look at an example.

First, let’s create a Demo table:

CREATE TABLE [dbo].[Demo](
[ID] [int] NULL,
[Name] [varchar](50) NULL,
[Address] [varchar](50) NULL,
[State] [varchar](50) NULL,
[Zip] [int] NULL
)

Next, populate it with some data:

INSERT INTO [dbo].[Demo]
VALUES (1, 'Brady', '123 Main Street', 'TN', 12345)

INSERT INTO [dbo].[Demo]
VALUES (2, 'Tommy', '124 Main Street', 'TN', 12345)

INSERT INTO [dbo].[Demo]
VALUES (3, 'Jonny', '125 Main Street', 'TN', 12345)

Now that we have some data, let’s try a query using variables. I want to pass multiple ID values through a single variable.

DECLARE @MultipleValue varchar(200)
SET @MultipleValue = '1,2'

SELECT * FROM Demo WHERE ID IN (@MultipleValue)

After running this query, I get 0 results and an error:

Msg 245, Level 16, State 1, Line 24
Conversion failed when converting the varchar value ‘1,2’ to data type int.

Why? I know IDs 1 and 2 are in the table, but SQL treats this variable as one string. So unless I had ‘1,2’ in a single ID column, it returns 0 results.

One way to get around this is to use a UDF (user-defined function). In this function, we’re going to convert the comma separated values (1,2) into a table, then query from that.

CREATE FUNCTION [dbo].[MultipleValues] (@InStr VARCHAR(MAX))
RETURNS @TempTable TABLE
(id INT NOT NULL)
AS
BEGIN
    -- Ensure the string ends with a single trailing comma so every value has a delimiter
    SET @InStr = REPLACE(@InStr + ',', ',,', ',')

    DECLARE @SP INT
    DECLARE @VALUE VARCHAR(1000)

    -- Loop while there is still a comma (and therefore a value) left in the string
    WHILE PATINDEX('%,%', @InStr) <> 0
    BEGIN
        -- Find the next comma and take everything before it as the next value
        SELECT @SP = PATINDEX('%,%', @InStr)
        SELECT @VALUE = LEFT(@InStr, @SP - 1)

        -- Remove the processed value (and its comma) from the front of the string
        SELECT @InStr = STUFF(@InStr, 1, @SP, '')

        INSERT INTO @TempTable (id) VALUES (@VALUE)
    END
    RETURN
END
GO

Now that we have a UDF, let’s use this in the query:

DECLARE @MultipleValue varchar(200)
SET @MultipleValue = '1,2'

SELECT * FROM Demo WHERE ID IN (SELECT * FROM dbo.MultipleValues(@MultipleValue))

Ta da!

We now have two results, IDs 1 and 2:

Passing multiple values into a variable
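
As an aside: if you’re on SQL Server 2016 or later, the built-in STRING_SPLIT function does the same job without a custom UDF (documented syntax; it requires database compatibility level 130 or higher):

DECLARE @MultipleValue varchar(200)
SET @MultipleValue = '1,2'

-- STRING_SPLIT returns one row per delimited value in a column named "value"
SELECT * FROM Demo WHERE ID IN (SELECT value FROM STRING_SPLIT(@MultipleValue, ','))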