Tag Archives: Monitoring

Free Video for Accidental DBAs

Recently, Mike Walsh and John Sterrett teamed up with Embarcadero to give a 60-minute presentation focusing on the skills accidental DBAs must master to become DBAs with successful, long-lasting careers.

I wish I had all this advice when I started as a SQL Server DBA.


All resources mentioned in the video to complete the tasks discussed can be found at http://linchpinpeople.com/where


When will my replica be synchronized?

Recently, I was pulled into a gig to help troubleshoot a Windows failover cluster issue that was hindering an availability group. For a little background: it's a two-node Windows cluster with two replicas, and the availability group is using synchronous-commit mode. Once the cluster issue was resolved, we had a critical database that was behind on synchronization because one of the two replicas (cluster nodes) had been down. This led to the following question: when will my replica be synchronized?

This question was easy to answer with database mirroring: we could pull up the Database Mirroring Monitor. There is no equivalent database replica monitor GUI built into SSMS. Lucky for us, it's not that hard with availability groups; we just have to use Performance Monitor. As you will see below, there is a perfmon counter object called “SQLServer:Database Replica” that comes in handy for us.

When will replica be synchronized?

Above, you can see that this secondary replica, which is back online, is synchronizing: it is receiving log bytes, and the Recovery Queue shown in Performance Monitor is shrinking as the synchronization process catches up. According to MSDN, Recovery Queue is the amount of log records in the log files of the secondary replica that have not yet been redone.
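If you prefer T-SQL to perfmon, the same information is exposed through a DMV. Here is a minimal sketch; the catch-up estimate is my own back-of-the-napkin math, not an official metric:

```sql
-- Check AG redo progress from T-SQL instead of Performance Monitor.
-- Run on the replica you care about; queue sizes are reported in KB.
SELECT  DB_NAME(drs.database_id)            AS database_name,
        drs.synchronization_state_desc,
        drs.log_send_queue_size             AS log_send_queue_kb,
        drs.redo_queue_size                 AS redo_queue_kb,
        drs.redo_rate                       AS redo_rate_kb_per_sec,
        CASE WHEN drs.redo_rate > 0
             THEN drs.redo_queue_size / drs.redo_rate
        END                                 AS est_seconds_to_catch_up  -- rough estimate
FROM    sys.dm_hadr_database_replica_states AS drs
WHERE   drs.is_local = 1;
```

Watching the redo queue trend toward zero here is the T-SQL equivalent of watching the Recovery Queue counter drop in perfmon.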

Workload Tuning: Finding Top Offenders in SQL Server 2012

I have to give credit where credit is due. Microsoft has definitely made performance tuning easier in SQL Server 2012. Performance tuning usually starts with finding your top offenders so you can get the biggest bang for your tuning bucks. For the first time, you can capture a workload using extended events and find top offenders inside of Management Studio without having to write a single line of T-SQL or use any third party tools.
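If you do want to script the capture side yourself, here is a hedged sketch of a minimal extended events session; the session name and file name are my own choices for illustration, not anything SSMS generates:

```sql
-- A minimal workload-capture session: completed batches with their text,
-- written to an event file you can open later in SSMS to find top offenders.
CREATE EVENT SESSION [CaptureWorkload] ON SERVER
ADD EVENT sqlserver.sql_batch_completed
(
    ACTION (sqlserver.sql_text, sqlserver.database_name)
)
ADD TARGET package0.event_file (SET filename = N'CaptureWorkload.xel')
WITH (MAX_DISPATCH_LATENCY = 5 SECONDS);
GO

ALTER EVENT SESSION [CaptureWorkload] ON SERVER STATE = START;
```

Stop the session when the capture window is over, then open the .xel file in SSMS and group/aggregate by duration, reads, or CPU, as shown in the video.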

Below is a five minute demo. It was recorded in 1080p HD so I strongly recommend changing the quality of the video to that setting. I hope you enjoy watching it as much as I did making it. If for any reason the video doesn’t show below you can also watch it here.

What items are in your SQL Server Daily Checklist?

Hello everyone out there on the intertubes.  In my life I have learned that I can save a lot of time and energy by utilizing checklists. The same is true with being a DBA. I manage over 100 instances of SQL Server. When my career started as a production DBA, my head was spinning trying to track down failed jobs, missing backups, and more on a daily basis. It could be done, but it took quite a while and was a manual task. I needed a way to automate this process. This is where Policy-Based Management and Central Management Server come in. Now I can sleep at night and just review my evaluated daily checklist first thing in the morning.

First Annual SQL Rally is in May
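As one hedged example of a checklist item you can automate, the query below looks for SQL Agent jobs that failed in roughly the last day; the 24-hour window is an arbitrary choice of mine:

```sql
-- Daily-checklist item: which Agent jobs failed since yesterday?
-- step_id = 0 is the job outcome row; run_status = 0 means failed.
-- sysjobhistory stores run_date as an integer in YYYYMMDD form.
SELECT  j.name      AS job_name,
        h.run_date,
        h.run_time,
        h.message
FROM    msdb.dbo.sysjobs        AS j
JOIN    msdb.dbo.sysjobhistory  AS h ON h.job_id = j.job_id
WHERE   h.step_id    = 0
  AND   h.run_status = 0
  AND   h.run_date  >= CONVERT(int,
            CONVERT(char(8), DATEADD(DAY, -1, GETDATE()), 112));
```

Run something like this through a Central Management Server group and you get the failed-jobs line of the checklist across every instance at once.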

UPDATE [3/7/2011] : I added SQL Agent Notifications and Alerts as recommended in the comments. If you think an option is missing leave a comment and I will add it.

Help me Help you…

Just like Jerry Maguire I need your help.  I am speaking about Policy-Based Management and Central Management Server at the first annual SQLRally.  I need your input to help my presentation. I have two quick questions. It shouldn’t take you more than two minutes to complete the questions.  I appreciate your help and support.

[Note: Please make sure to click on the vote button for each question.]

What items are included in your Database Administration Daily Checklist?


How do you evaluate your Database Administration Daily Checklist?


Wheeling, WV to Houston, TX – A SQL Saturday #57 Recap

Everyone, if you attended SQL Saturday #57 in Houston, TX, thank the host and the sponsors. Without their dedication and hard work, this event wouldn't have happened. Finally, if you haven't done so, submit a speaker evaluation for the sessions you attended.

Now that the basics are covered, let's move on to the recap.

Why Houston?

Last week I took a trip out to H-Town to speak at my eighth venue in the last twelve months.  Family was a huge motivator for attending SQL Saturday #57. Being a graduate of James E. Taylor High School in Katy, Texas, I was able to double dip and visit my parents and my oldest brother. I flew in on Thursday to surprise my father for his birthday, and I got an extra bonus when I arrived: I found out this was also going to be the same day my nephew was born.

John Sterrett SQL Server 2008 for Developers Presentation

If you build it they will come!


On Saturday, we met up at the church for the first SQL Saturday in Houston. Yes, that sentence is correct. The Houston Area SQL User Group was able to get the Bammel Church of Christ to host the event. I have to be honest: when I found out that a church was going to be used, I was skeptical. Normally these events are at a community college, a university, or a Microsoft office. I wondered whether a church would be a good venue for a SQL Saturday. I was wrong to worry; it wasn't just a good venue, it was a fantastic venue. Kudos to Nancy Hidy Wilson (Blog | Twitter) and the Houston User Group for selecting it.

Connecting (Volunteer & Speakers Dinner)

Normally, when I am selected to speak at a SQL Saturday, I always attend the volunteer and speaker dinner and the after party. It's a great opportunity to network. We met up at the Outback Steakhouse in Tomball. I felt like I was back at summer camp as I hung out with some old friends and also made some new ones. (Thomas LeBlanc, Wes Brown, Ryan Adams, Jen and Sean McCowen, Sri Sridharan, Tim Mitchell and more…)

My favorite quote comes from Sri: “John, you cannot say it's a once-in-a-lifetime opportunity. Your Steelers are always in the Super Bowl.”

Learning (Presentations I attended)

  • Dean Richards – Best Practices for SQL Server on VMWare
    With the economy changing, we all need to find ways to do things cheaper. One of the easiest ways is to virtualize your servers. The concept seems great, but SQL Server can be tricky to manage in a VM. How do you configure the guest memory? How does the host share its CPU power with guests? I learned this and a lot more during Dean's session.
  • Ryan Adams – Manage Your Shop with CMS and Policy Based Management
    Those of you who know me know I am very excited about Central Management Server and Policy-Based Management. I currently use them to evaluate my daily checklist against 100+ instances of SQL Server with 3,000+ databases. I am extremely interested in seeing how other DBAs use them. Ryan is into mirroring and loves the mirroring facets provided with PBM. I am glad he showed this to me.
  • Thomas LeBlanc – Transition from DBA to BI
    Any time I can sit in the front row on a sofa and watch Thomas speak, it's a win-win. This talk was special because I got my learn on and felt like I was at comedy hour. I enjoyed his path from DBA to BI, and I really liked how he was able to use his DBA skills to make a good first impression. He also showed us a great spreadsheet out on the interwebs for building a data dictionary for your facts and dimensions.

Sharing (My Presentations)

  • SQL Server 2008 for Developers (About 30 attendees)
    Every once in a while I submit this topic to see if people are still interested in seeing demos of the new features provided for developers in SQL Server 2008. During the presentation I polled the attendees to see who hadn't migrated to SQL Server 2008 yet. There is still a crowd that wants to know how to implement DATETIMEOFFSET, the T-SQL enhancements, MERGE, table-valued parameters, and Change Data Capture. I spoke right before lunch and had a few people stay extra to go over a bonus demo. I also talked with a few of the attendees later in the day and got some great feedback. If you attended my session, please submit feedback. You can find the slide deck and sample code on my blog.
  • Evaluate your Daily Checklist using CMS and Policy Based Management (5 Attendees)
    My PBM and CMS talk was included in the last session slot. This was my first presentation using VMware Workstation, and I did it for two reasons. One, Brent Ozar made me do it. Two, I wanted to spin up an instance of SQL Server 2000 for this demo and I couldn't do that on Windows 7. Even though the crowd was small, one attendee told me he is going to start using PBM and CMS because of my presentation. That is the greatest compliment I could receive.

    If you attended my session, please submit feedback.  You can find the slide deck and sample code on my blog.
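For anyone who missed the session, here is a small hedged sketch of two of the features mentioned, MERGE and the new date/time types; the table variables and values are invented purely for illustration:

```sql
-- MERGE: upsert rows from a source into a target in one statement.
DECLARE @target TABLE (id int PRIMARY KEY, qty int);
DECLARE @source TABLE (id int PRIMARY KEY, qty int);
INSERT @target VALUES (1, 10), (2, 20);
INSERT @source VALUES (2, 25), (3, 30);

MERGE @target AS t
USING @source AS s ON t.id = s.id
WHEN MATCHED THEN
    UPDATE SET t.qty = s.qty              -- id 2 updated to 25
WHEN NOT MATCHED BY TARGET THEN
    INSERT (id, qty) VALUES (s.id, s.qty); -- id 3 inserted

-- DATETIMEOFFSET: current server time with its time-zone offset preserved.
SELECT SYSDATETIMEOFFSET() AS server_time_with_offset;
```

The full demos from the session are in the sample code on my blog; this is just a taste.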

Bloggers SQL Saturday #57 Posts

Policy-Based Management Myths Series Finale

Welcome to the Policy-Based Management myths finale. Due to some great questions at SQL Saturday #61 in DC, I am focusing on some myths that come up with Policy-Based Management. Today is the final installment; you can catch the other myths below.

You cannot use Policy-Based Management with SQL Server 2008 R2 Express

Just like the two other myths, the answer is true and false. This one really threw me for a loop, mainly because I don't normally use SQL Express. With that said: you cannot create policies in SSMS while connected to a SQL Express instance, nor store them there, but you can evaluate policies against an existing SQL Express instance.

Why can’t I create policies or store them in SQL Express 2008 R2?

When I read the following article on the SQL Server 2008 R2 Express site, the flexibility section made it seem like you could store and create policies with that edition of SQL Server. Unfortunately, you will notice that Policy-Based Management is disabled. That isn't too bad; let's just enable it. Oh snap… You get the friendly error message that states, “Policy-Based Management cannot be enabled on this edition of SQL Server.”


How do I evaluate policies against SQL Express 2008 R2?

So, if I cannot enable Policy-Based Management on SQL Server 2008 R2 Express edition, can I still evaluate policies and apply changes to such an instance of SQL Server? Yes, you can: once you have your policies built, you can use Central Management Server to evaluate them and apply changes.

Below is a screenshot of me evaluating the built-in “Last Successful Backup” policy against an instance of SQL Server 2008 R2 Express. You will see that it fails because, well, I don't use Express, so I don't have the databases backed up.


Note: There will be more to come in a future blog post on setting up Central Management Server and using it to execute Policy-Based Management policies.

How can I build a policy to work against SQL Express 2008 R2?

Connect to another instance of SQL Server (Standard, Enterprise, or Developer) to generate your policies via SSMS, and store the policy on a central instance of SQL Server (not Express) or on a file share.
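As a hedged aside, on the central (non-Express) instance you can see which policies are stored by querying the PBM system views in msdb; this assumes the policies were saved to the server rather than to files:

```sql
-- List the policies stored on this instance and the condition each one checks.
SELECT  p.name  AS policy_name,
        c.name  AS check_condition,
        p.is_enabled
FROM    msdb.dbo.syspolicy_policies   AS p
JOIN    msdb.dbo.syspolicy_conditions AS c
        ON c.condition_id = p.condition_id
ORDER BY p.name;
```

This is handy for confirming a policy actually landed on the central server before you point CMS evaluations at it.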


Policy-Based Management Myths (2 of 3)

A few weekends ago I gave an “Introduction to Policy-Based Management and Central Management Server” presentation at SQL Saturday #61 in DC. Due to some great questions, I am focusing on some myths that come up with Policy-Based Management this week. Check back through the week to catch all three myths. Today we will cover the second myth.

By default a policy using the database facets will run against all databases

The myth is both true and false. With the default database target condition (Every Database), the myth is actually false: as you will see, the system databases are excluded. If you build your own condition for the target, you can make the policy run against all databases.

Let's take another look at the “Auto Shrink Database” policy that comes out of the box with SQL Server 2008. When you evaluate the policy, you will see that the system databases (msdb, master, model, tempdb) are not included.


My first thought was to check the target properties; maybe for some reason the system databases are exempt. Looking at the target below, you will see that every database is supposed to be included. Obviously, this means all databases are included, right? No; sorry, system databases are not included.


How do I evaluate policies against all databases?

Let's create a new condition to use as our target condition. In a nutshell, we need a condition that matches all databases, whether they are system objects or not. If you select the Database facet, you will notice there is an IsSystemObject field. This is our ticket to getting all databases.


Now let's change the target condition to use the condition created above. We will no longer use Every Database as our target condition; instead we select our new System and User Databases target condition.


Now let's evaluate the policy again. You will see all user databases and system databases, as I had expected when using the default target condition.
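For comparison, here is a hedged T-SQL equivalent of what the policy checks: the auto shrink setting for every database, system and user alike (the database_id <= 4 rule for flagging system databases is a common shorthand, not an official definition):

```sql
-- List every database with its auto shrink setting, system databases included.
SELECT  name,
        CASE WHEN database_id <= 4 THEN 'system' ELSE 'user' END AS db_type,
        is_auto_shrink_on
FROM    sys.databases
ORDER BY is_auto_shrink_on DESC, name;
```

Any row with is_auto_shrink_on = 1 is a database the policy would flag once the target condition includes system objects.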


Come back on Monday for the last installment of the Policy-Based Management myths.


Policy-Based Management Myths (1 of 3)

A few weekends ago I gave an “Introduction to Policy-Based Management and Central Management Server” presentation at SQL Saturday #61 in DC. Due to some great questions, I am focusing on some myths that come up with Policy-Based Management this week. Check back through the week to catch all three myths. Today we will cover the first of them.

You cannot execute and apply policies against an instance of SQL Server 2000.

The answer is yes and no. You can execute and apply policies against an instance of SQL Server 2000 as long as the policy checks features supported in SQL Server 2000. This is easy to do if you have a Central Management Server set up. I will walk you through the steps below, assuming a CMS exists.

1. In your Central Management Server, right-click on the node that hosts your instances of SQL Server 2000 and select Evaluate Policies, as shown below.

2. Click on the ellipsis next to the source and select the policy files you would like to execute, or select a server where the policies reside. In this example we will execute the Database Auto Shrink built-in best practice policy. By default, these are located at C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies\DatabaseEngine\1033\. Once you have loaded the Database Auto Shrink policy, click Evaluate and the policy will be evaluated against every server in your CMS group.

3. The following are the results of executing the “Database Auto Shrink” policy against a group of development SQL Server 2000 instances. You will notice that the policy failed, as there are databases that have auto shrink enabled. We can click on the checkboxes and apply the policy to force those databases to be compliant with our policy.


4. Clicking Yes on the next popup screen will apply the policy to modify all selected targets that do not comply. Make sure you want to do this before you apply the policy.


5. After selecting Apply, you should see that you were able to apply the changes to make your SQL Server 2000 databases compliant with the policy.
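Since sys.databases does not exist on SQL Server 2000, here is a hedged sketch of an equivalent manual check that runs on that version, using DATABASEPROPERTYEX against the old sysdatabases table:

```sql
-- SQL Server 2000-compatible check of the auto shrink setting per database.
-- DATABASEPROPERTYEX and master.dbo.sysdatabases both exist on SQL 2000.
SELECT  name,
        DATABASEPROPERTYEX(name, 'IsAutoShrink') AS is_auto_shrink
FROM    master.dbo.sysdatabases
ORDER BY name;
```

This is roughly what the policy evaluation is verifying under the covers when you point it at a 2000 instance.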

Can you show me an example of a policy that will not work on SQL Server 2000?

Database mirroring is not supported in SQL Server 2000, as it was delivered in SQL Server 2005 SP1. If you have a policy that checks to see if mirroring is enabled and you run it against a SQL Server 2000 instance, the policy will fail. In fact, you will receive a “Property ‘IsMirroringEnabled’ does not exist” error.


Check back during the week to catch the next Policy-Based Management myth.

I don’t get the Data and Log File Location Policy.

Recently I have been spending some time evaluating Policy-Based Management. It's a nice tool for monitoring SQL Server and preventing bad things from occurring. At work we have a nice-sized SQL Server farm, so I am always interested in tools that make my life easier.

The first thing I noticed about PBM is that there are several policies out of the box that can be imported via SSMS. The majority of these free policies encode best practices from the SQL Server community. For example, it's well known that you should keep data and transaction logs on separate spindles. Therefore, I was initially excited to see a Data and Log File Location policy that included a check condition for data and log files on separate drives, until I viewed the details of the check condition.


Looking at the image above, you will see four conditions. The first condition is obvious, and on an initial look I would just run with it. The next three are OR conditions, so if any of them is true, the policy succeeds for the database. I fully understand why @DataAndLogFilesOnSeperateDrives is checked and why @IsSystemObject is included. It's valid to exclude system databases, because the system databases and their transaction logs are likely to live on the C: drive.

There are two conditions that are puzzling to me. Are they puzzling to you? Why would there be a condition to exempt databases that are less than 5120 MB in size? Why was the value 5120 selected? Is there some unknown reason why Microsoft didn't select 5121? Seriously, does 5120 have value, or am I missing something? And why would we exempt databases that have a status not equal to normal?

Because I am unsure why @Status and @Size are included in the policy, I decided to review the Database Performance facet attributes just to verify that I understand them correctly. The attributes are shown below. I am still very curious why these attributes are included. I am not saying they are good or bad, because honestly I am not sure.
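Setting the puzzling exemptions aside, here is a hedged manual version of the core check: it flags databases whose data and log files all sit on a single drive letter. It deliberately skips the size and status exemptions, and the drive letter is a rough heuristic, since drive letters do not always map to separate spindles:

```sql
-- Flag databases where the data and log files share one drive letter,
-- ignoring the @Size and @Status exemptions from the built-in condition.
SELECT  DB_NAME(database_id) AS database_name
FROM    sys.master_files
GROUP BY database_id
HAVING  COUNT(DISTINCT LEFT(physical_name, 1)) = 1  -- every file on one drive
   AND  COUNT(DISTINCT type_desc) > 1;              -- has both data and log files
```

Comparing its results to the policy's results is one way to see exactly which databases the 5120 MB and status exemptions are letting slide.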


Ask the Experts

This week I am going to be attending the PASS Member Summit in Seattle. There will be an Ask the Experts section of the vendor expo, and that is exactly what I intend to do. Worst case, it will be a good way to start a conversation. I will follow up here with my findings.

Disk usage monitoring with Data Collector

Today, I am jumping into Adam Machanic's T-SQL Tuesday challenge with the following post.

Recently, I created a build request for a new server to which we would move some SQL Server databases. I put together an estimate of the space needed for data, logs, and backups and included this information in the build document. The request was put in the queue and all was well. Once it popped to the top of the queue, I reviewed the size of the database files using Idera's Space Analyzer and noticed unexpected growth in a few of the data files. Thankfully, the data collector and Management Data Warehouse (MDW) were enabled, which helped us track down the unexpected data growth.


SQL Server 2008 introduces the data collector and the Management Data Warehouse (MDW) into SQL Server Management Studio (SSMS). The Management Data Warehouse combines the data collector, an MDW database, and custom reports. This gives administrators and developers the opportunity to do some proactive monitoring. If you are familiar with the Performance Dashboard, you will be impressed with the new Management Data Warehouse.

How does the data collector help you?

Out of the box, the data collector provides you with the tools to monitor disk usage, server activity, and query statistics. So when you are asked questions like “How much space do we need for database xyz?” or “Why did database xyz grow from size 123 to size 456?”, you can use the data collector reports to help answer them. For example, below is a screenshot of some sample databases.


You can see that the third, fourth, and fifth databases had their current size changed. I want to focus on this because it doesn't represent the data inside the database files: it means the actual database files (mdf, ndf, ldf) are growing, which can cause disk fragmentation. You want to avoid this when possible. Ideally, you want your database graph to be a straight line like the sixth database's. Next we will click on the graph for the fourth database and drill deeper into the analysis.
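If you want to snapshot file sizes yourself alongside the Disk Usage collection set, here is a hedged point-in-time query; scheduling it and logging the results to a table are left as an exercise:

```sql
-- Point-in-time file sizes for every database on the instance.
-- sys.master_files reports size in 8 KB pages, hence the * 8 / 1024.
SELECT  DB_NAME(database_id)   AS database_name,
        name                   AS logical_file_name,
        type_desc,             -- ROWS (data) or LOG
        size * 8 / 1024        AS size_mb
FROM    sys.master_files
ORDER BY database_name, type_desc;
```

Capturing this on a schedule gives you the same kind of growth trend line the Disk Usage report draws, in a form you can query directly.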



Looking at the two images above, you can see that the data size multiplied during a six-hour window between noon and 6 PM on November 30th. We were then able to use this information to determine the cause of the unexpected growth.


I think this is a great tool for troubleshooting unexpected data growth. I look forward to using it as part of the process of estimating future database growth.