Category Archives: SQL Server

All SQL Server posts that don’t fit in other categories

Looking Back: 2016 was Awesome!

With 2017 starting this week I would like to take a moment and reflect on 2016. I have to admit that 2016 was a great year for many different reasons, both personally and professionally. It’s nice just to stop and acknowledge some significant accomplishments that occurred last year!

2016 in Review

I always try to keep my community contributions up to date here on my blog. I would never have gotten to where I am today without others who helped me along the way, so I am always happy to give back. It's been an honor to host another SQL Saturday in Austin for over 250 attendees, serve as a leader for the Austin SQL Server User Group, and continue to grow the High Availability and Disaster Recovery Virtual Chapter to 2,145 members.

Professionally, 2016 was also a great year. I was able to achieve two major goals on my career bucket list. I completed the Microsoft Certified Solutions Expert (MCSE): Data Platform certification (yes, my procrastination took me to the last work day of 2016, but I got it done!). As Tom LaRock says, "They do not hand out certifications like candy. You have to earn them." I was also named a Microsoft Data Platform MVP, an award that goes to only about 100 people in the USA.

In 2016, I also continued to grow as a speaker. Once again, I was blessed with an opportunity to speak at the prestigious PASS Member Summit; I never take getting accepted lightly. This year, even though I was sick and almost had to cancel, I delivered my session and earned my best evaluation score to date (on a scale of 1 to 3). I gave 15 sessions in 2016; my highest-attended session of the year was at the Data Architecture Virtual Chapter with 457 attendees.

My company, Procure SQL LLC, also achieved some great milestones in 2016. I was able to hire an employee, Angela Tidwell, as our Marketing DBA. We procured our very first office as well. Finally, after spending several hours working with an excellent health care broker, I was proud to announce that we set up our very first group health plan at Procure SQL. While this isn't required, it was a huge goal of mine to help make sure that our employees are taken care of as we grow.

Personally, I was able to catch up with some great friends and spend some quality time with my family. We took the family to Galveston, TX for our very first beach trip together. I learned that it was Nina's dad's first time in the Gulf as well. As a bonus, the weather was so nice in December that my brother and I decided to bring our kids back for a weekend. During my birthday weekend, I was able to see my Pirates and the Wheeling Nailers in person in Dallas, both of which are very rare. That weekend I also got to hang out with one of my best friends from college. We even made it on Root Sports Pittsburgh raising the Jolly Roger.

Finally, my loyalty to some friendships was tested in 2016. While I could have focused only on myself, I decided to help my real friends in need. I remember asking myself (yes, I talk to myself), "What would I hope my friends would do for me in my hour of need?" I then tried to do exactly that.

2017 Goals

These days, I am not a huge fan of publicly sharing my goals.  With that said, I wanted to share three goals to help keep me accountable.

Build Two Local Speakers

In the past, I have had the opportunity to help build some new speakers. I cannot think of a more rewarding experience in the SQL Community than helping someone grow. While I have more than two different people in mind, I hope to have at least two new speakers give their very first SQL Server presentations in 2017. Who knows? Maybe they will love it, and it can change their careers like it did for me.

Fill My Office

While I do plan on getting us a fridge, when I say "Fill My Office" I am not just planning on filling the fridge with adult beverages. Our office easily fits one more person, and I hope to find that full-time employee in 2017. I will provide more details later as we get closer to pulling that trigger.

Building a Training Plan


Testing out the new office whiteboard.

Literally, this was the first thing written on our new six-foot whiteboard

With the hiring of Angela last year, one of my goals was to build some high-quality training material that could help someone become a SQL Server Database Administrator (DBA). I want this to be online, on-demand content that helps as many people as possible get a great start as a DBA. I hope to have this implemented in 2017. It will also force me to get out of my shell and focus on building some great video content, which will be a new challenge.


I am a Microsoft Data Platform MVP

My First Microsoft SQL Server MVP Award


Today I achieved a "SQL Someday" moment, and I am excited to share the news: Microsoft has chosen me to receive the Microsoft MVP award. I am completely shocked, as this is my first time receiving this award.

It is a tremendous honor to be given this award. Words cannot describe how humbling it is just to be nominated for this award by peers within the Data Platform community. I am blessed to be part of the community.  I am thankful to work in a career that supports the ability for everyone to connect, share and learn. I look forward to doing the same as a new member of the MVP community.

Who Are Microsoft MVPs?

The following excerpt comes directly from the Microsoft MVP website.

“Microsoft Most Valuable Professionals, or MVPs, are community leaders who’ve demonstrated an exemplary commitment to helping others get the most out of their experience with Microsoft technologies. They share their exceptional passion, real-world knowledge, and technical expertise with the community and with Microsoft.”

How My Journey Started

I will never forget, and will always be thankful to, Adolph Santorine, my boss at the time, who told me, "John, I think you should get involved in our local Association of Information Technology Professionals (AITP) chapter." He shared how it helped him personally and how he thought it could be beneficial for my career. Adolph was right. I learned so much from some very experienced IT leaders. The Greater Wheeling Chapter of AITP is where my community involvement started, and I was lucky to start with a community that has been connecting, sharing, and learning since the 1960s. When I decided to get my local community involved in the SQL Server community, my fellow AITP members had my back and helped make our event a success. From that moment, I was introduced to PASS and the SQL Server community. I have been connecting, sharing, and learning ever since.

How Will Things Change?

They will not change, and to me, that is the greatest part of this award. I still plan to connect, share and learn with as many people as possible. The data platform is evolving.  I plan on being involved in the community and building relationships with individuals who are interested in seeing the Microsoft Data Platform grow and succeed.

Special Thanks

I would like to thank my family, friends, and mentors.  Without their help and support, I would never be the person who I am today or tomorrow.

I would also like to thank everyone I have worked with in my career.  Thank you to everyone at Deloitte, Schedule Star, Orrick Herrington & Sutcliffe, RDX, Dell and Linchpin People. I am truly blessed to have a lot of great mentors and friends.   Thank you for giving me the opportunity to learn, grow, and have a career doing what I love to do.


Finally, you might not know this, but anyone can nominate someone for a Microsoft MVP Award. You don't have to be an MVP or a Microsoft employee. I have personally been nominating people for years and will continue to do so as an MVP. If you know anyone who you think is deserving of the Microsoft MVP Award, nominate them.

Automatic Seeding Very Large Databases in SQL Server 2016

There are a lot of new features in SQL Server 2016, and Availability Groups alone gained quite a few. As the founder of the High Availability and Disaster Recovery Virtual Chapter, I plan on blogging about the new availability group features.

Today, I want to write about automatic seeding. Microsoft did an excellent job of explaining how to enable and monitor automatic seeding, so I want to focus this post on my experience using automatic seeding to seed an existing 600 GB database in a test environment and test my endpoint network throughput.

The initial data synchronization easy button.

When you add a database to an availability group, the replicas must synchronize the data so that the database can be joined on each replica. In the past, this initial data synchronization has been done with mirroring, log shipping, or backups and restores. Personally, I have been a big fan of log shipping for the initial data synchronization of VLDBs, especially when you need more than two replicas. Here is how I added a 60 TB (yes, TB, not GB) database to an availability group that spanned multiple data centers.

Automatic seeding is a feature that has been in Azure SQL Database for a while; it is how the initial data synchronization occurs for geo-replication of Azure SQL Databases. Automatic seeding uses a VDI backup to take a copy-only backup and stream it over the endpoint network to seed the replicas, and then joins the database on each replica. This eliminates the need to manually take full and log backups from the primary replica and restore them to all the secondary replicas.
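As a rough sketch of what that looks like in T-SQL (the availability group, replica, and database names here are placeholders), enabling automatic seeding boils down to setting the seeding mode per replica, granting the AG permission to create databases on each secondary, and then adding the database:

```sql
-- On the primary replica: switch a replica to automatic seeding mode.
ALTER AVAILABILITY GROUP [MyAG]
    MODIFY REPLICA ON 'SQLNODE2'
    WITH (SEEDING_MODE = AUTOMATIC);

-- On each secondary replica: allow the AG to create the seeded database.
ALTER AVAILABILITY GROUP [MyAG] GRANT CREATE ANY DATABASE;

-- Back on the primary: add the database; seeding starts automatically.
ALTER AVAILABILITY GROUP [MyAG] ADD DATABASE [MyBigDatabase];
```

No backup, copy, or restore steps are needed after the ADD DATABASE; SQL Server streams the copy-only backup over the endpoint network on its own.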

Bonus Feature of Automatic Seeding

There is also a bonus use of automatic seeding for DBAs and information technology professionals. Even if you decide not to use automatic seeding, I recommend testing this feature, as it can be a great way to stress your endpoint network and validate its throughput.

Background Information

This availability group has been configured with a separate 10Gbps network dedicated to endpoint traffic. Nothing else is active on the network or the replicas during the time of testing.


I configured the following performance monitor counters.

  • Bytes Received/sec on Secondary replicas
  • Bytes Sent/sec on Primary replica.
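Alongside those perfmon counters, SQL Server 2016 also exposes seeding progress through DMVs. A minimal sketch (column list trimmed for readability) for watching a seeding operation from the primary replica:

```sql
-- Current automatic seeding operations and their state.
SELECT  ag.name AS ag_name,
        db.database_name,
        asd.start_time,
        asd.current_state,
        asd.failure_state_desc
FROM sys.dm_hadr_automatic_seeding AS asd
JOIN sys.availability_groups AS ag
    ON asd.ag_id = ag.group_id
JOIN sys.availability_databases_cluster AS db
    ON asd.ag_db_id = db.group_database_id;

-- Transfer rate and estimated completion time per seeding session.
SELECT  local_database_name,
        transfer_rate_bytes_per_second,
        transferred_size_bytes,
        database_size_bytes,
        estimate_time_complete_utc
FROM sys.dm_hadr_physical_seeding_stats;
```

The second query surfaces much of the same data as the extended events below, but only while seeding is in flight; the extended event session keeps a history you can analyze afterwards.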

I also configured the following extended event session to monitor seeding activity on the primary and secondary replicas. We will focus on the "hadr_physical_seeding_progress" event today and talk about the others in a future blog post.

CREATE EVENT SESSION [AGSeeding] ON SERVER -- session name is arbitrary
ADD EVENT sqlserver.hadr_automatic_seeding_state_transition,
ADD EVENT sqlserver.hadr_automatic_seeding_timeout,
ADD EVENT sqlserver.hadr_db_manager_seeding_request_msg,
ADD EVENT sqlserver.hadr_physical_seeding_backup_state_change,
ADD EVENT sqlserver.hadr_physical_seeding_failure,
ADD EVENT sqlserver.hadr_physical_seeding_forwarder_state_change,
ADD EVENT sqlserver.hadr_physical_seeding_forwarder_target_state_change,
ADD EVENT sqlserver.hadr_physical_seeding_progress,
ADD EVENT sqlserver.hadr_physical_seeding_restore_state_change,
ADD EVENT sqlserver.hadr_physical_seeding_submit_callback
ADD TARGET package0.event_file(SET filename=N'autoseed.xel',max_file_size=(20),max_rollover_files=(4));
GO
ALTER EVENT SESSION [AGSeeding] ON SERVER STATE = START;

The following T-SQL script is then used to read the results once your seeding process has completed. We will talk about the results in the results section below.


DECLARE @XFiles nvarchar(260) = N'autoseed*.xel'; -- path/pattern to the event files

;WITH cXEvent
AS (
     SELECT    object_name AS event
              ,CONVERT(XML, event_data) AS EventXml
     FROM      sys.fn_xe_file_target_read_file(@XFiles, NULL, NULL, NULL)
     WHERE     object_name LIKE 'hadr_physical_seeding_progress')
SELECT
 c1.value('(/event/@timestamp)[1]','datetime') AS time
,c1.value('(/event/@name)[1]','varchar(200)') AS XEventType
,c1.value('(/event/data[@name="database_id"]/value)[1]','int') AS database_id
,c1.value('(/event/data[@name="database_name"]/value)[1]','sysname') AS [database_name]
,c1.value('(/event/data[@name="transfer_rate_bytes_per_second"]/value)[1]','float') AS [transfer_rate_bytes_per_second]
,(c1.value('(/event/data[@name="transfer_rate_bytes_per_second"]/value)[1]','float')*8)/1000000.00 AS [transfer_Mbps]
,c1.value('(/event/data[@name="transferred_size_bytes"]/value)[1]','float') AS [transferred_size_bytes]
,c1.value('(/event/data[@name="database_size_bytes"]/value)[1]','float') AS [database_size_bytes]
,(c1.value('(/event/data[@name="transferred_size_bytes"]/value)[1]','float') / c1.value('(/event/data[@name="database_size_bytes"]/value)[1]','float'))*100.00 AS [PctCompleted]
,c1.value('(/event/data[@name="is_compression_enabled"]/value)[1]','varchar(200)') AS [is_compression_enabled]
,c1.value('(/event/data[@name="total_disk_io_wait_time_ms"]/value)[1]','bigint') AS [total_disk_io_wait_time_ms]
,c1.value('(/event/data[@name="total_network_wait_time_ms"]/value)[1]','int') AS [total_network_wait_time_ms]
,c1.value('(/event/data[@name="role_desc"]/value)[1]','varchar(300)') AS [role_desc]
,c1.value('(/event/data[@name="remote_machine_name"]/value)[1]','varchar(300)') AS [remote_machine_name]
,c1.value('(/event/data[@name="internal_state_desc"]/value)[1]','varchar(300)') AS [internal_state_desc]
,c1.value('(/event/data[@name="failure_code"]/value)[1]','int') AS [failure_code]
,c1.value('(/event/data[@name="failure_message"]/value)[1]','varchar(max)') AS [failure_message]
FROM cXEvent
    CROSS APPLY EventXml.nodes('//event') AS t1(c1)


Results

The 600 GB database took about 66 minutes to seed across the network from the primary replica to the secondary replica. I noticed a consistent 1.4 Gbps of throughput during the seeding process, which makes sense as it is roughly what the storage system can deliver in this environment. (As a sanity check: 600 GB × 8 bits per byte ÷ 1.4 Gbps ≈ 3,400 seconds, or about 57 minutes of pure transfer time, which lines up well with the 66 minutes observed.)

The first thing I would look at for benchmarking throughput for network activity would be the bytes sent per second from the primary replica and bytes received per second on the secondary replicas.

AG Seeding Primary Replica's Bytes Sent per Second

AG Seeding VLDB Primary Replica – Bytes Sent per Second


AG Seed VLDB Secondary Perfmon

AG Seed VLDB Secondary Replica – Bytes Received per Second

I am seeing an average of around 1.4 Gbps. Normally, just looking at bytes sent and bytes received will be good enough to measure your throughput, especially when nothing else is using the dedicated endpoint network. In the field, I usually do not see dedicated networks for endpoint traffic, so I wanted to take this a step further and monitor with some of the new extended events for automatic seeding.

Here is a look at the raw data from the extended event capture showing the progress and throughput of the seeding.

AG Seed Xevent Transfer

The following is a nice Excel graph showing the throughput in Mbps. I added an extra row with zero throughput just to show the rise and fall of network usage as seeding starts and completes.

AG Seeding Excel Graph


My Thoughts

The initial data synchronization process just got a lot easier. I will definitely use this for adding new databases to availability groups. If you can live with your transaction log not being truncated during the seeding process, I strongly encourage you to use automatic seeding.

I did not use trace flag 9567 to enable compression during this test, which is why you saw compression not being enabled in the results. If you have the CPU resources, I recommend you test this as well.
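For reference, trace flag 9567 is enabled globally on the primary replica before seeding starts; compression trades CPU for network time, so test the CPU impact first:

```sql
-- Enable compression for the automatic seeding data stream (primary replica).
DBCC TRACEON (9567, -1);

-- Verify the trace flag is active.
DBCC TRACESTATUS (9567);

-- Turn it back off once seeding completes.
DBCC TRACEOFF (9567, -1);
```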

Reference Links

For more great information on SQL Server subscribe to my blog and follow me on twitter.

Photo Credit: Mike Mozart (Creative Commons)


Converting identifiers into SQL Server table variables in Management Studio

If I had a nickel for every time someone gave me a list of identifiers and asked me to pull a result set for them, I would be a very wealthy man. Over the years this has been an endless request. I used to use several different tools, like Excel, to convert the list of identifiers into SQL statements. I would then use set-based logic to complete my task inside SQL Server.

Being a SQL Server Database Administrator, I like to use one consistent tool for working with queries. One day I noticed an interesting feature inside the search-and-replace functionality in Management Studio: I could use regular expressions to manipulate text inside SQL Server Management Studio. This gave me the ability to stay within my preferred tool for doing SQL Server development work.

If you want to see how you can quickly manipulate the identifiers and convert them into a table variable, check out the three-minute video below.
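To illustrate the idea in text form (the exact regex syntax differs between older and newer SSMS versions, so treat the pattern as a sketch, and `dbo.Orders` in the final comment is a hypothetical table): with a pasted column of numeric IDs, a find pattern like `^(\d+)$` and a replace of `INSERT INTO @Ids VALUES ($1);` turns the raw list into runnable T-SQL:

```sql
-- Before: a raw list of identifiers pasted from the requester.
-- 1001
-- 1002
-- 1003

-- After the regex replace, wrapped with a table variable declaration:
DECLARE @Ids TABLE (Id int PRIMARY KEY);
INSERT INTO @Ids VALUES (1001);
INSERT INTO @Ids VALUES (1002);
INSERT INTO @Ids VALUES (1003);

-- Now the list can be used with set-based logic, for example:
-- SELECT o.* FROM dbo.Orders AS o JOIN @Ids AS i ON o.OrderId = i.Id;
```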

T-SQL Tuesday #50: Automation for LazyDBAs!

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month.

T-SQL Tuesday 50 - Automation


This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month. This month, SQLChow blessed us with a topic I am very passionate about: this month's T-SQL Tuesday topic is automation.

I have a confession to share: I am a lazy DBA. Those who know me won't be shocked by reading this; those who don't know me, trust me, I mean this in a good way. My laziness over the years has actually motivated me to be a better DBA and data professional. I learned early in my career that in order to be productive I must automate. No longer can we do manual daily checklists; we lose hours that could have been spent on tasks that show our value rather than just hold the status quo. Automation ends the cycle of repeating tasks and lets me spend that time doing things that provide value, save the company money, and make us happy.

Early Stages of Automation

Once in my career I was blessed with an opportunity to be a Database Administrator overseeing thousands of production databases. I quickly noticed there was no automated process for a daily checklist. How did we know if a database backup failed due to low disk space? Hopefully, we got an email from the SQL Server Agent; hopefully, someone remembered to set up a SQL Agent notification. I knew this wasn't the answer. One of my first tasks was automating the whole process so we knew which databases passed and failed an automated daily checklist. I was able to leverage Policy-Based Management (PBM) and Central Management Server (CMS) with PowerShell to get this done. Little did I know at the time, but this basic move changed my DBA career. I got to speak at the PASS Summit in 2011 on how I evaluated my automated daily checklist against 1000+ servers during my morning coffee break. As I started to focus on performance, I noticed a better way to pull this information without PBM, so I built an automated framework using PowerShell and CMS to find my failed backups faster. Even today I meet DBAs who don't know you can automate your daily checklist using only native tools built into SQL Server.
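As a minimal sketch of the kind of check that belongs in an automated daily checklist (the 24-hour threshold is an assumption; adjust it to your recovery point objective), this finds databases with no recent full or differential backup:

```sql
-- Databases whose most recent full/differential backup is older than 24 hours.
SELECT  d.name AS database_name,
        MAX(b.backup_finish_date) AS last_backup
FROM sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
    ON b.database_name = d.name
   AND b.type IN ('D', 'I')          -- D = full, I = differential
WHERE d.name <> 'tempdb'
  AND d.state_desc = 'ONLINE'
GROUP BY d.name
HAVING MAX(b.backup_finish_date) IS NULL
    OR MAX(b.backup_finish_date) < DATEADD(HOUR, -24, GETDATE());
```

Run across a CMS server group, a query like this turns the manual "did the backups run?" morning check into a single result set.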

Current Stages of Automation

Today, I am much more focused on performance and proactive monitoring. Learning from my past, I knew I wanted to automate as much as possible, and that didn't change even though my core skills were changing. In the past year I built some nice automated solutions that help me with performance tuning. When I take charge of a new instance, I automate the monitoring of disk latency and wait statistics. Once I have a good automated baseline, I can drill deeper as needed; for example, I can find out which queries are causing my waits. It has gotten to the point where most of the time I can find the root cause of a SQL Server performance problem in ten minutes. The automated benchmark process does the heavy lifting for me, so I can stay as proactive as possible and provide value instead of running processes that should be automated.
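A baseline like that usually starts with periodic snapshots of wait statistics. A minimal sketch (the table name and the short list of benign waits to filter are assumptions; production filter lists are much longer):

```sql
-- One-time setup: a table to hold periodic wait-stats snapshots.
CREATE TABLE dbo.WaitStatsBaseline (
    capture_time        datetime2     NOT NULL DEFAULT SYSDATETIME(),
    wait_type           nvarchar(60)  NOT NULL,
    waiting_tasks_count bigint        NOT NULL,
    wait_time_ms        bigint        NOT NULL,
    signal_wait_time_ms bigint        NOT NULL
);

-- Scheduled (e.g., via SQL Agent every 15 minutes): capture a snapshot,
-- skipping a few well-known benign wait types.
INSERT INTO dbo.WaitStatsBaseline
        (wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms)
SELECT  wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'BROKER_TO_FLUSH',
                        N'LAZYWRITER_SLEEP', N'XE_TIMER_EVENT');
```

Diffing consecutive snapshots shows which waits grew during any window, which is what lets you jump straight to the problem area.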

Future State of Automation

I see more things being automated. More and more parts of the current "Production DBA" role as we know it today will be automated, and this is going to open us up to doing amazing things. One day, an end-to-end performance tuning process will be automated. I look forward to seeing things we thought were impossible become possible and automated. For example, automating server procurement and deployment once fell into this realm; now it's already here, and it's known as "the cloud." I will be honest: I was shocked to see how easy and quick it is to deploy a Windows Azure server.

What are your thoughts about automation? Where do you see it going in the future?

#TSQL2Day – A Day in the Life of A Service DBA

Today I am excited to join in with my #sqlfamily and participate in this month's #tsql2day throwdown. This month Erin Stellato is the kingpin, and her topic is "A Day in the Life." This gives me a great opportunity to explain a day in the life of a DBA who provides DBA services remotely. A typical day has me working on several projects or problems across several clients. One of the things I enjoy about my job is that each day brings a new set of challenges. The following are the highlights of my work day on Wednesday, July 11th.

Backups instead of Coffee…

You might start your morning with a cup of coffee, but I got to start my day working with a new client on a very important issue. We noticed holes in the client's backup maintenance plan, so I worked with them to correct those holes. I know this isn't the sexiest task out there for DBAs, but making sure that you can recover is one of the most important tasks for a DBA.

Configuring A Cluster to improve High Availability.

Implementing a cluster usually gives you higher availability for your databases, but you might not be getting the highest availability possible depending on how you configured your cluster. Today, I reviewed a four-node cluster and found a few flaws that limit its availability. To summarize some of my findings: the cluster was running in node majority instead of node and disk majority, instances of SQL Server were not added on all the nodes, and the preferred owners and possible owners were not configured correctly. I look forward to doing another blog post at a later date going over each of these settings.

Troubleshooting Version Store

Today, I got to work with a teammate to help troubleshoot why the version store was filling the tempdb database and causing slowness in the client's application. This gave me an opportunity to learn more about everything the version store is used for. Noticing that the client used read committed snapshot isolation (RCSI) for their main database, we were able to use the Performance Data Warehouse to find a massive update statement that did far more writes than expected, which caused the growth problems in tempdb.
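When tempdb starts growing from version store usage, a couple of DMVs narrow the culprit down quickly. A sketch of the checks I would start with (run the second query in the database generating the versions):

```sql
-- How much of tempdb is version store right now (in MB)?
SELECT SUM(version_store_reserved_page_count) * 8 / 1024.0 AS version_store_mb
FROM tempdb.sys.dm_db_file_space_usage;

-- Which tables are generating the most row versions?
SELECT TOP (10)
       OBJECT_NAME(p.object_id) AS object_name,
       v.aggregated_record_length_in_bytes
FROM sys.dm_tran_top_version_generators AS v
JOIN sys.partitions AS p
    ON v.rowset_id = p.hobt_id;

-- Long-running snapshot transactions that keep the version store alive.
SELECT TOP (10) transaction_id, transaction_sequence_num, elapsed_time_seconds
FROM sys.dm_tran_active_snapshot_database_transactions
ORDER BY elapsed_time_seconds DESC;
```

Version store space is only reclaimed once the oldest snapshot transaction that might need it finishes, which is why a single long-running transaction under RCSI can balloon tempdb.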

Assessment review with Client

One thing I enjoy about my job is going over assessments with a client. This usually gives me an opportunity to share some knowledge about how the database engine works. I also gave some recommendations they can use to improve their performance and bring their server up to speed with our best practices.

Favorite Part of the Day!

Every day I get to eat lunch, and I try to make sure I have a lunch date with #babybeluga Gregory, my one-year-old son. This is by far my favorite part of the day, and it reminds me why I love working from home.

Well, there you have it. I hope you enjoyed my blog post on A Day in the Life. I look forward to seeing the recap of how everyone else in the SQL community spent their day.

Stress Testing Stored Procedures with Free Tools

I currently have a presentation named Performance Tuning for Pirates. This presentation goes over using several free tools to help you with your performance tuning process. Today, I am happy to share my first video going over these tools.

Using SQLQueryStress to stress test Stored Procedures

SQLQueryStress by Adam Machanic (blog | twitter) is one of my favorite tools for doing a quick stress test against a user object like a stored procedure. The tool can be used to stress test any T-SQL query, but I usually use it for compiled objects like stored procedures. I love it because it allows you to save and load configurations to quickly redo tests, lets you use a dataset for input parameters, and lets you add additional stress by configuring iterations and threads.

Note: It also comes in handy when you need to mock up some stress for demos.

I have also written a tip that provides a step-by-step guide on how to use SQLQueryStress.

Free Training: Performance Tuning with Free Tools!

This week I have two presentations on my schedule. I get to give my Performance Tuning for Pirates presentation twice this week.

Pittsburgh SQL User Group Recap

On Tuesday, I presented my Performance Tuning for Pirates presentation at the Pittsburgh SQL Server User Group with a little tweak that went well. One of my friends who does a lot of tuning has always been interested in presenting, so I had him jump on stage and do the presentation with me. I think it was a great success, and I think we will be seeing some really cool presentations from him in the future.

Performance Virtual Chapter

Today at 2 PM (EST), I am also giving my Performance Tuning for Pirates presentation at the PASS Performance Virtual Chapter. This will be done via LiveMeeting and is free for all PASS members. If you're not a member, I have good news for you: PASS membership is free, so sign up and join in on the fun. Also, if you are not able to make it today, come back to the virtual chapter, as this session should be recorded for replay.

If you are looking for the resources and tools used in the Performance Tuning for Pirates presentation, you can find them here.

24 Hours of PASS – Get your free downloads!

I am speaking at 24 Hours of PASS


Thank you to everyone who attended my #24HOP session on Performance Tuning for Pirates! You can now download my slide deck and T-SQL queries and view my reference links. If you have any questions about the tools presented, feel free to contact me and I will try to help you out or point you in the right direction.


8 steps to moving database users from SQL 2005 to SQL 2008

Have you ever had to complete an upgrade where you had to uninstall SQL Server 2005 and install SQL Server 2008 on the same box? If so, how do you make sure the new install has the same security objects? For me, the answer was to be safe instead of sorry: I script out all the existing server and database security objects so they can be verified and reapplied if needed.

Not only is this a good practice for migrating your servers, but the following post could also be useful for a security audit. I also want to mention that if you are allowed to install tools on the box, you can download SQLPermissions, a free tool provided by Idera, to generate a script to move security objects during a migration.
[Update 1/28/2012 – USER_NAME() was replaced with SCHEMA_NAME() in the Explicit Permissions section]
** Run scripts at your own risk. I don’t assume any responsibility for these scripts on being ran in your environment. As always, I recommend testing in development before running in production 🙂 **


Logins

The very first step to connecting to an instance of SQL Server is to log in, and in order to log in you must have a login. Therefore, our first step will be to copy the logins. I know, you might already be wondering: how do I move the passwords over for SQL-authenticated logins? Microsoft has a KB article (918922) that walks you through this process. Make sure you check out the remarks section before running the code on a production box.
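Once the two helper procedures from KB 918922 (sp_hexadecimal and sp_help_revlogin) are created in master on the old server, generating the transfer script is a single call (the login name below is a placeholder):

```sql
-- On the SQL Server 2005 instance: print CREATE LOGIN statements,
-- including hashed passwords and SIDs, for every login.
EXEC sp_help_revlogin;

-- Or script out a single login by name.
EXEC sp_help_revlogin @login_name = N'AppUser';
```

Because the generated script carries the original SIDs, database users should map back to their logins on the new instance without orphaning.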

Server Level Security

Logins are nice but useless unless they have permission to connect, so we want to make sure these logins can connect. You might also want to make sure a subset of the accounts has access to view the DMVs. These and other server-level permissions can be scripted out with the code below.

SELECT	sp.state_desc,
	principal_name = QUOTENAME(spl.name),
	sp.state_desc + N' ' + sp.permission_name + N' TO ' + CAST(QUOTENAME(spl.name COLLATE DATABASE_DEFAULT) AS nvarchar(256)) AS "T-SQL Script"
FROM sys.server_permissions sp
INNER JOIN sys.server_principals spl ON (sp.grantee_principal_id = spl.principal_id)
WHERE spl.name NOT LIKE '##%' -- skip PBM accounts
AND spl.name NOT IN ('dbo', 'sa', 'public')
ORDER BY sp.permission_name, spl.name

Server Level Roles

Now that we can log in, we might want to take this to the next step: actually performing some actions. If your instance includes third-party application databases, there is a good chance one of them depends on its login having sysadmin access. Yes, in case you were wondering, this is poor security design, but the point is your logins may depend on server-level roles to have access to the database objects. Here is a script you can use to script out server-level role memberships.

SELECT QUOTENAME(sp.name) AS "ServerRoleName",
sp.type_desc AS "RoleDescription",
QUOTENAME(m.name) AS "PrincipalName",
m.type_desc AS "LoginDescription",
'EXEC master..sp_addsrvrolemember @loginame = N''' + m.name + ''', @rolename = N''' + sp.name + '''' AS "T-SQL Script"
FROM sys.server_role_members AS srm
INNER JOIN sys.server_principals sp ON (srm.role_principal_id = sp.principal_id)
INNER JOIN sys.server_principals m ON (srm.member_principal_id = m.principal_id)
WHERE sp.is_disabled = 0
AND m.is_disabled = 0
AND m.name NOT IN ('dbo', 'sa', 'public')

Database Level Security

Now that your logins can connect, you need to make sure they can access the databases you migrated over. Remember those third-party apps that skipped the security design process? They rely on sysadmin access, so database-level security won't be their concern; hopefully your applications are not among them. Just like server-level security, you can also have database-level security. Make sure you run the script below in each database and save your results.

SELECT  dp.state_desc,
QUOTENAME(dpl.name) AS 'principal_name',
dp.state_desc + N' ' + dp.permission_name + N' TO ' + CAST(QUOTENAME(dpl.name COLLATE DATABASE_DEFAULT) AS nvarchar(500)) AS "T-SQL Script"
FROM sys.database_permissions AS dp
INNER JOIN sys.database_principals AS dpl ON (dp.grantee_principal_id = dpl.principal_id)
WHERE dp.major_id = 0
AND dpl.name NOT LIKE '##%' -- excludes PBM accounts
AND dpl.name NOT IN ('dbo', 'sa', 'public')
ORDER BY dp.permission_name ASC, dp.state_desc ASC

Database Level Roles

Just like the SQL Server instance, your database also has roles. Many times I see applications that don't require sysadmin on the instance but do require db_owner access in the database. This is also a bad security design practice, as db_owner can do anything in that database. Hopefully, your application grants object-level explicit permissions as needed. Once again, I recommend pulling this information in case it's needed after your migration.

SELECT QUOTENAME(drole.name) AS "DatabaseRoleName",
QUOTENAME(dp.name) AS "PrincipalName",
'EXEC sp_addrolemember @membername = N''' + dp.name COLLATE DATABASE_DEFAULT + ''', @rolename = N''' + drole.name + '''' AS "T-SQL Script"
FROM sys.database_role_members AS drm
INNER JOIN sys.database_principals drole ON (drm.role_principal_id = drole.principal_id)
INNER JOIN sys.database_principals dp ON (drm.member_principal_id = dp.principal_id)
WHERE dp.name NOT IN ('dbo', 'sa', 'public')

Database Level Explicit Permissions

Finally, we get to the area of security where I hope to see many entries. If someone tasked me with designing the security model for their application, I would make sure stored procedures controlled the data access and that application users could only execute the stored procedures. This way there is very minimal risk of data loss, as the users wouldn't have direct access to the tables; they would only access data through the business logic provided in the database objects. You can pull this information with the code below.

SELECT    dp.state_desc AS "StateDescription" ,
dp.permission_name AS "PermissionName" ,
SCHEMA_NAME(obj.schema_id) AS [Schema Name],
obj.name AS [Object Name],
QUOTENAME(SCHEMA_NAME(obj.schema_id)) + '.' + QUOTENAME(obj.name)
    + CASE WHEN col.column_id IS NULL THEN SPACE(0)
           ELSE '(' + QUOTENAME(col.name COLLATE DATABASE_DEFAULT) + ')' END AS "ObjectName" ,
QUOTENAME(dpl.name COLLATE DATABASE_DEFAULT) AS "UserName" ,
dpl.type_desc AS "UserRoleType" ,
obj.type_desc AS "ObjectType" ,
dp.state_desc + N' ' + dp.permission_name + N' ON '
    + QUOTENAME(SCHEMA_NAME(obj.schema_id)) + '.' + QUOTENAME(obj.name)
    + N' TO ' + QUOTENAME(dpl.name COLLATE DATABASE_DEFAULT) AS "T-SQL Script"
FROM    sys.database_permissions AS dp
INNER JOIN sys.objects AS obj ON ( dp.major_id = obj.[object_id] )
INNER JOIN sys.database_principals AS dpl ON ( dp.grantee_principal_id = dpl.principal_id )
LEFT JOIN sys.columns AS col ON ( col.column_id = dp.minor_id AND col.[object_id] = dp.major_id )
WHERE obj.is_ms_shipped = 0
AND dpl.name NOT IN ( 'dbo', 'sa', 'public' )
ORDER BY    dp.permission_name ASC ,    dp.state_desc ASC

Orphaned users

If logins are not brought over correctly, you may notice they are not mapped correctly to the user accounts that exist in the database (the Logins section above should catch this). The following explanation comes from Books Online.

A database user for which the corresponding SQL Server login is undefined or is incorrectly defined on a server instance cannot log in to the instance. Such a user is said to be an orphaned user of the database on that server instance.

You can find and fix orphaned users by following the instructions provided here.
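For the SQL Server 2005/2008 timeframe this post covers, sp_change_users_login is the usual tool. A sketch of detecting and repairing orphaned SQL-authenticated users (run in each migrated database; the user name below is a placeholder):

```sql
-- Report database users whose SIDs have no matching server login.
EXEC sp_change_users_login @Action = 'Report';

-- Re-link an orphaned user to the login of the same name,
-- creating the login if it does not exist.
EXEC sp_change_users_login @Action = 'Auto_Fix', @UserNamePattern = N'AppUser';
```

Note this only applies to SQL-authenticated users; Windows logins map by SID automatically once the login exists on the new instance.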

Validate User Accounts

Finally, when possible, let's make sure the user accounts work. I know this might be common sense, but you don't want to cut something over without testing and verifying connectivity.
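One quick way to verify access without logging out and back in is impersonation (the login name and the commented table are placeholders; this requires the login to exist and you to have IMPERSONATE permission on it):

```sql
-- Impersonate the migrated login and confirm who we are.
EXECUTE AS LOGIN = N'AppUser';
SELECT SUSER_SNAME() AS current_login;

-- Try touching an object the application needs (hypothetical table):
-- SELECT TOP (1) * FROM dbo.Orders;

-- Switch back to your own security context.
REVERT;
```

If the impersonated session can reach the objects the application uses, the server-level and database-level scripts above did their job.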