Tag Archives: powershell

Import Databases to Azure SQL Database

I recently spoke at a conference and was asked about the easiest way to import databases to Azure SQL Database, so I wanted to share how I do this with DBATools.io. You can use the same code to just export if you need a local copy of an Azure SQL database as well.

Import-Module dbatools -Force
<# Variables #>
$BackupPath = "C:\Demo\AzureSQL\Bacpac" #folder location for backups
$SourceInstance = "sql2019\sql2016"
$DBName = "AdventureWorksLT2012"
$AzureDestInstance = "procuresqlsc.database.windows.net"
$DBNameDest = $DBName

<# bacpac options for export and import #>
$option = New-DbaDacOption -Type Bacpac -Action Export
$option.CommandTimeout = 0

$option2 = New-DbaDacOption -Type Bacpac -Action Publish
$option2.CommandTimeout = 0

<#
The following assumes the Azure SQL Database is empty or does not exist yet.
Azure will create the database by default if it doesn't exist.
#>
# Export the source database to a bacpac file
$bacpac = Export-DbaDacPackage -Type Bacpac -DacOption $option -Path `
$BackupPath -SqlInstance $SourceInstance -Database $DBName

# Publish (import) the bacpac to the Azure SQL Database server
Publish-DbaDacPackage -Type Bacpac -SqlInstance `
$AzureDestInstance -Database $DBNameDest -Path $bacpac.path `
-DacOption $option2 -SqlCredential username

What Is My Performance Tier?

Great question. As of 3/3/2020, if the database does not already exist in Azure SQL Database, it will be created. When it's created, the new database uses the default performance tier, which is General Purpose (Gen5) with 2 vCores.

The default performance tier for a new Azure SQL Database costs $371.87 per month.
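If you want to confirm which tier a new database landed on, you can check it from PowerShell. This is a minimal sketch (not from the original post) using the Az module introduced in the next section; $RGName, $ServerName, and $DBName are assumed to match the variables used in the scripts above and below.

# Minimal sketch: check the service objective of the imported database.
# Assumes the Az module is installed and $RGName, $ServerName, and $DBName are already set.
Import-Module Az.Sql
Get-AzSqlDatabase -ResourceGroupName $RGName -ServerName $ServerName -DatabaseName $DBName |
    Select-Object DatabaseName, SkuName, CurrentServiceObjectiveName, Capacity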

How to Create Cheaper Databases

Great question. You can import databases to Azure SQL Database more cheaply using PowerShell; it is as simple as using the Azure PowerShell module. In the example below I use my existing Azure SQL Database server and end up creating a new database in the “S0” tier.

<#
Installing the Azure PowerShell module requires local admin
Install-Module -Name Az -AllowClobber -Scope AllUsers
#>

Import-Module Az -Force
$RGName = "<Your Existing Resource Group>"
$ServerName = "<Your Azure SQL Database Server>"
$Tenant = "<Add Tenant ID>"
$Subscription = "<Add your Subscription ID>"

Connect-AzAccount -Tenant $Tenant -SubscriptionId $Subscription
$server = Get-AzSqlServer -ResourceGroupName $RGName -ServerName $ServerName

<# $DBName is reused from the dbatools script above #>
$db = New-AzSqlDatabase -ResourceGroupName $RGName `
-ServerName $server.ServerName -DatabaseName $DBName `
-RequestedServiceObjectiveName "S0"

Your Homework Assignment

You now have all the nuts and bolts to generate a script that can migrate all the databases on an instance and import them to Azure SQL Database in one loop, as sketched below.
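Here is a minimal sketch of what that loop could look like, reusing the variables and DacOptions from the scripts above. Treat it as a starting point, not a finished migration script.

# Sketch: export every user database on the source instance and publish it to Azure SQL Database.
# Assumes $SourceInstance, $AzureDestInstance, $BackupPath, $option, and $option2 are set as above.
$databases = Get-DbaDatabase -SqlInstance $SourceInstance -ExcludeSystem

foreach ($db in $databases) {
    # Export each database to its own bacpac file
    $bacpac = Export-DbaDacPackage -Type Bacpac -DacOption $option -Path $BackupPath `
        -SqlInstance $SourceInstance -Database $db.Name

    # Publish (import) the bacpac; Azure creates the database if it does not exist
    Publish-DbaDacPackage -Type Bacpac -SqlInstance $AzureDestInstance `
        -Database $db.Name -Path $bacpac.Path -DacOption $option2 -SqlCredential username
}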

Need Help Moving to Azure?

Let us help you! Schedule a free 30-minute chat to see if we can point you in the right direction. Take advantage of our free resources too.

If you liked this blog post, sign up for our free newsletter!

Speaking at the Pittsburgh PowerShell User Group

Tomorrow, May 15th at 6pm, I will be joining the script club to give a presentation on how DBAs can leverage PowerShell to do their jobs more quickly and automate processes. It's not too late to sign up. If you will be near the North Shore, I highly recommend checking it out. You can also follow the Pittsburgh PowerShell User Group to check out their future meetings.


PowerShell User Group starts in Pittsburgh!

Recently, it has come to my attention that there is a PowerShell User Group being born in Pittsburgh. The first user group meeting is December 13th, and Ed Wilson, also known as TheScriptingGuy, will be their first presenter. There are still a few seats available. If you are interested in developing scripts to automate processes, I recommend you follow this user group.

Recap: #24HOP (24 Hours of PASS) – Day One

I am very happy that the committee behind #24HOP made two decisions. One, they decided to split the 24 hours into two days. This is huge for people in the USA, as we don't have to pull all-nighters. Second, I am very glad day one fell on a Wednesday. Why would I be excited it falls on a Wednesday? I am excited because it is no pants Wednesday. No pants Wednesday means I don't work on Wednesdays, so it's very easy to attend sessions.

Day Two

If you didn’t catch it in the first paragraph there is a day two.  That’s right peeps you can still signup and attend some great sessions.  If you need some help picking a session or two I wish I could attend the ones listed below.

The following is a short review of the sessions I attended on September 15th, 2010.

Gather SQL Server Performance Data with PowerShell

Allen White (Blog | @SQLRunr) showed a very slick way to automate the process of collecting WMI counters and saving them in a database. This alone was very slick, but as icing on the cake, he also showed the crowd how to build reports that work inside of SSMS.

My eyes were opened wide when I saw how easy it is to do WMI and SQL calls with PowerShell. I will definitely check out http://powershell.com in the near future to get my learn on.
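To give a rough idea of the approach, here is my own minimal sketch (not Allen's demo code) that grabs a WMI CPU counter and writes it to a table; the PerfStats database, the dbo.CpuSamples table, and the instance name are all hypothetical.

# Minimal sketch: collect one WMI counter and save it to a SQL Server table.
# The PerfStats database, dbo.CpuSamples table, and instance name are hypothetical.
Import-Module SqlServer   # provides Invoke-Sqlcmd

$cpu = Get-WmiObject -Class Win32_PerfFormattedData_PerfOS_Processor |
    Where-Object { $_.Name -eq '_Total' }

$query = "INSERT INTO dbo.CpuSamples (SampleTime, PercentProcessorTime)
          VALUES (GETDATE(), $($cpu.PercentProcessorTime));"

Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'PerfStats' -Query $query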

It looks like Allen has a great PreCon session lined up for the SQL PASS Member Summit 2010 that will get you well on your way to automating your DBA tasks.

Hardware 201: Selecting and Sizing Database Hardware for OLTP Performance

Glen Berry (Blog | @GlenAllenBerry) ran through tons of statistics behind selecting CPUs, memory, and disks for your new database servers. I have to be honest, quite a bit of this was over my head, but below are a few items that stuck.

  • Optimize your hardware purchases to take advantage of your SQL Server and Windows Server editions
  • Don't go cheap on CPUs. You rarely upgrade the CPU, unlike RAM or disks.
  • The Xeon X5680 and Xeon X7560 were the recommended CPUs
  • SSDs (solid-state drives) are good for random writes (user data files), not sequential writes (log files)
  • 10K drives = 100 IOPS
  • 15K drives = 150 IOPS
  • Make sure High Performance is enabled in the power settings on your servers (see the snippet after this list)
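For the power-settings item, here is a small sketch (not from the session) using the built-in powercfg utility to check and set the plan from PowerShell:

# Show the currently active power plan
powercfg /getactivescheme

# Switch to the built-in High Performance plan (SCHEME_MIN is its alias)
powercfg /setactive SCHEME_MIN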

Identifying Costly Queries

Grant Fritchey (Blog | @GFritchey) showed us several different tools you can leverage to identify costly queries. He showed us how to set up SQL Profiler using stored procedures to lessen the load on your production boxes. Grant also showed me a new tool I hadn't used before: the SQL RML Utilities tool, which can be helpful to show how long a query really took. Grant also showed us server DMVs that can be used to get a real-time understanding of costly queries.
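As a taste of the DMV approach, here is my own minimal sketch (not Grant's demo) that uses dbatools to pull the most expensive statements by CPU from sys.dm_exec_query_stats; the instance name is a placeholder.

# Minimal sketch: top 5 queries by total CPU time from the plan cache DMVs.
# The instance name is a placeholder.
$costlyQuery = @"
SELECT TOP (5)
       qs.total_worker_time,
       qs.execution_count,
       SUBSTRING(st.text, 1, 200) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
"@

Invoke-DbaQuery -SqlInstance 'localhost' -Query $costlyQuery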

For some samples and resources used in the demo, check out his resources blog page.

How to Rock Your Presentations

Douglas McDowell (Web | @douglasmcdowell) delivered the most important session for me. This year I started to focus more on giving back to the community through technical presentations, and I am always looking for tips that will improve my presentations. The following are a few tips I plan to implement in my current presentation schedule.

  • Treat presentations like a development project
  • Storyboard each topic
  • Build an outline
  • Make sure to add RM, WIIFM and KWUC to all presentations.

You can find more in his PowerPoint presentation at http://downloads.solidq.com/DMcDowell/RockPASS_DMcDowell.zip

Conclusion

This was another great day of #24HOP. The best part is it continues today. The sad part is I will miss out on the sessions. If you catch them and have good notes, please add them as a comment or blog them so the unlucky ones can check out the info.