Copy Database from Azure Managed Instance
Background
There is a fair amount of documentation on how to restore a database from a conventional Microsoft SQL Server instance to a cloud-based Azure SQL Managed Instance.
The reverse direction is far less documented.
The short answer is that you cannot restore a native backup from an Azure SQL Managed Instance to a self-hosted instance. In my situation, I needed a backup to send to a vendor who only had the self-hosted kind.
Prerequisites
Install SQLPackage.exe. The download is available on Microsoft’s website.
Process
- Restore a copy of the source database to a temporary copy on the same Managed Instance. This is not strictly necessary, but if the source is a live production database, it is a simple way to get a consistent snapshot to work with.
- Decrypt everything in the source copy.
- Remove any “broken” schema objects, such as views or procedures that reference objects that no longer exist; these can cause the extract to fail.
- Extract
sqlpackage.exe /Action:Extract /ssn:tcp:{sourcesqlserver}.{1234567890ab}.database.windows.net,1433 /sdn:"SourceDB_Copy" /su:{sa} /sp:{password} /tf:"C:\temp\source.dacpac" /p:Storage=File /p:ExtractAllTableData=true
- Create shell database on target self-hosted SQL Server
- Publish
sqlpackage.exe /Action:Publish /tsn:tcp:{targetsqlserver},1433 /tdn:TargetDB /sf:"C:\temp\source.dacpac" /p:AllowIncompatiblePlatform=true /p:Storage=File /p:ExcludeObjectTypes="Assemblies;Logins;RoleMembership;Credentials"
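The first and fifth steps above can be sketched in T-SQL. The storage URL and database names here are placeholders: on a Managed Instance, a user-initiated backup must be COPY_ONLY, and a SAS-based credential for the blob container must already exist.

```sql
-- Step 1: take a consistent, copy-only snapshot of the source database
-- (container URL and names are hypothetical; requires an existing SAS credential).
BACKUP DATABASE SourceDB
    TO URL = 'https://examplestorage.blob.core.windows.net/backups/SourceDB.bak'
    WITH COPY_ONLY;

RESTORE DATABASE SourceDB_Copy
    FROM URL = 'https://examplestorage.blob.core.windows.net/backups/SourceDB.bak';

-- Step 5: create the empty shell database on the target self-hosted instance.
CREATE DATABASE TargetDB;
```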
What’s Actually Happening
We’re using SqlPackage to extract and then publish between the two different platforms, using a .dacpac (data-tier application package) file as the intermediary.
Notes
- ExtractAllTableData is false by default and needs to be set explicitly.
- AllowIncompatiblePlatform lets you ignore the platform mismatch. However, if you are using a feature that does not exist on the target platform, you will need to remediate it prior to the extract step.
- Excluding Assemblies and the security-related object types was intentional. Assemblies, if any exist, need to be migrated manually prior to the publish.
- Ideally, run sqlpackage.exe on the same network as the SQL Server you are interacting with. The utility is very chatty, and depending on the size of the data, it can still take many hours.
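Before publishing, you can check the source copy for user-defined CLR assemblies that the excluded object types will skip, for example:

```sql
-- List user-defined CLR assemblies; the publish step above excludes these,
-- so any rows returned must be recreated on the target by hand.
SELECT name, permission_set_desc
FROM sys.assemblies
WHERE is_user_defined = 1;
```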