Backup and Restore using Azure Blob Storage – Technical Guidance for CSP Partners

Background and Overview

This technical guidance is for CSP (Cloud Solution Provider) partners who need to use Microsoft Azure Blob storage for backing up on-premises SQL Server databases. This document covers how to leverage the CSP APIs to create an integrated backup and restore offering for their customers using Azure Blob storage. The approach used in this document is one example of how to enable such an offering; it is not intended to provide a final solution as-is, nor is it intended to be the only way to implement backup and restore using CSP and Azure Blob storage.

The Microsoft Cloud Solution Provider (CSP) program is a Microsoft initiative through which CSP partners can leverage key resources using an existing resource center that serves as a central hub for the benefits, resources, and tools that enable them. One of the key benefits of the CSP program is the Partner Center Portal, which helps partners manage existing customers' subscriptions or create new ones. Although CSP partners can manage and create customers via the site, Microsoft has also delivered a set of key APIs that let CSP partners integrate their own on-site automated solutions and best practices while leveraging all the benefits and capabilities exposed via the APIs. The CREST API supports partners' ability to create customers without having to log into the Partner Center; it is used to build all the key elements needed to correctly provision a customer in the Partner Center Portal, as well as each unique Azure customer subscription. The Graph API supports Microsoft Azure Active Directory setup and configuration for both CSP partners and their management of individual tenant subscriptions. Finally, the Azure Resource Manager (ARM) REST API is exposed to help CSP partners configure and set up new services in Azure. Although these APIs expose many additional capabilities, our primary focus in this document is technical guidance that helps partners deliver backup and restore services for on-premises SQL Server databases to newly provisioned CSP customers. The samples and information outlined in this document include code snippets that we believe will help both technical leaders and developers understand one way the APIs could be leveraged.

Prerequisites

This document assumes that the HSP (hosting service provider) has already been set up as a CSP partner. The critical components and assumptions this procedure depends on are outlined below.

In this guidance, we use a CSV file as input for customer, server, and database information. Additionally, we use SQL Server Express edition to store the configuration data used for processing. This is the approach taken in this guidance; CSP partners may choose different mechanisms to achieve the same results.

Key Infrastructure Dependencies:

  • Domain Controller exists with all appropriate accounts
    • Domain name identified
    • DC operational
  • The partner has an existing subscription if they want to use a Sandbox; additional details are in Appendix C if needed.
  • CSP partner is a member of the CSP Program and Partner Center Portal access is set up
    • CSP partner is a member of the CSP Program
    • Access to the Partner Center Portal works
    • The administrator who will run the solution is added in the Partner Center Portal via "User Management"
  • The CREST API is enabled in the Partner Center Portal (see Procedure 2.1 for guidance)
  • The HSP has set up an on-premises VM with SQL Express installed and has joined it to the on-premises domain.
    Note: Other database solutions are possible; we standardized on SQL Express.
  • All existing SQL Servers that need to be backed up or restored live within the same domain.
    Note: If multiple domains are involved, it is key to have a trust set up between them so that the same domain account can be used if possible.
  • CSP partner has provisioned or identified an account with the right level of privileges on all SQL Servers.
  • Microsoft Azure PowerShell has been downloaded and installed - https://azure.microsoft.com/en-us/documentation/articles/powershell-install-configure/
  • CSP partner has a list of all SQL Servers within scope of the project and knows which databases belong to which customers

High Level Architecture

The following high-level diagram outlines the key aspects of our end-to-end approach; the components illustrated help identify the key aspects of the guidance throughout this document. Among the key aspects illustrated is how we use the CREST API to provision new tenant subscriptions: as new CSP tenant customers are created in the Partner Center Portal, they also kick off creation of the new subscription. After a new tenant is provisioned, a partner can manage it using the Partner Center Portal. We also show how we create each tenant's storage resources, and the containers within them, by using the ARM REST API. Using the scripts identified in this document, each unique tenant's database is backed up into a single container, and our approach supports a single tenant storage account containing many containers (i.e., 1 storage account to many containers). What is not illustrated is that we also implement a resource group for all created storage resources. Although many more details could be included here, our goal was to keep this high level and show only the key components needed to illustrate one approach to fully automating backup/restore from on-premises to and from Azure.

Existing CSP Services Procedure

Creating a new customer leveraging the CSP Partner Center Portal

  1. Go to https://partnercenter.microsoft.com and sign in to the site using your assigned CSP subscription.
  2. Once signed in, you will see your Welcome Page with the CSP details. From there, you can manually add a new customer, request a relationship, view customers, add new users, view users, and view your organizational profile.
  3. Capture the Microsoft ID in Notepad for later use, then return to the previous screen by selecting "Overview" on the left side.
  4. Now provision the API by selecting "Account Settings".
  5. Create a new app with default settings by selecting Register App.
  6. You will now see the created API information you will need; capture it for later use. Ensure you capture the key, since it is shown only once and cannot be viewed later for security reasons. Note: If you forget to do this, you can always unregister the app and then recreate it, which will give you a new key.
  7. Because you are using your production account, you are able to create an "Integration Sandbox" which can be used for testing. We are using a production environment for this documentation. If a Sandbox approach is needed, please refer to Appendix C for guidance.

How to manually back up and restore a database using Azure Blob Storage

This section shows how to manually back up and restore a single SQL Server 2014 on-premises database to and from Azure Blob Storage.

Backing up a unique customer database to a customer Storage Container

  1. Sign in to the customer's tenant site, either directly or by using the Partner Center Portal to jump there. Then select Browse and select Storage accounts.
  2. Verify whether the existing storage was provisioned using the recommended automation scripts identified in this document; if not, continue to step 3 below. If it exists, capture the details for the resource group, storage account, and container. If you already have these details, skip to step 4.
  3. Click the +Add button, select Create a new storage account, and identify a resource group to create. Ensure you also identify the proper location to put the resource group in.

    - Name: newstoracct1 (lowercase letters and numbers only, 3–24 characters)

    - Resource Group
      Name: newresgrp
    - Location / Affinity Group: West US (same region as where you created the Virtual Network) – If multiple regions are involved, it is recommended that you modify the region as appropriate.

    Click the Create button.

  4. After the new storage account has finished creating (it may take a couple of minutes), click the name of the new storage service (Blobs) displayed under the services section of the tenant's portal.
  5. Click the Containers section in the navigation to view the available containers.
  6. Click the Create a Container link at the top of the page (+ Container).
  7. On the New Container wizard page:

    - Name: dbbackups
    - Access: Private

    Click the Create button.

  8. Click the Storage Account tab in the left navigation area.
  9. Click All Settings right above the services area to view the storage account settings, then select Access Keys.
  10. Record the storage account name, Key1 (access key), and connection string (Key1).
  11. You have now finished the manual backup setup in the tenant's portal.azure.com site.
  12. Log on to the SQL Server.

    Note: In real scenarios, the source database server will be a different machine. For lab purposes, we treat this SQL Server as the on-premises database server.

  13. Download "Adventure Works 2014 Full Database Backup.zip" from https://msftdbprodsamples.codeplex.com/releases/view/125550
  14. Extract the zip file to a local folder on the SQL Server.
  15. Open SQL Server 2014 Management Studio and restore the AdventureWorks2014 database:
    1. Connect to the SQL Server 2014 default instance using your Windows account.
    2. Navigate to the Databases node.
    3. Right-click Databases and select Restore Database.
    4. On the Restore Database wizard General page, select Device and click the Browse (…) button.
    5. Choose File as the backup media type in the Select Backup Devices wizard.
    6. Click the Add button, choose the extracted AdventureWorks2014.bak file, and press OK.
    7. Click OK in the Select Backup Devices wizard.
    8. Click OK in the Restore Database wizard.
    9. Click OK on the success message.
    10. Right-click Databases and select Refresh.
    11. Post condition: Ensure that AdventureWorks2014 is listed as a child node of Databases.
  16. Back up the AdventureWorks2014 database to blob storage from SQL Server:
    1. Click the New Query button in SQL Server Management Studio (Ctrl + N).
    2. Execute the following query from the query window after replacing the variables with values applicable to your environment:

CREATE CREDENTIAL Cred_SQL01
WITH IDENTITY = '<STORAGE_ACCOUNT_NAME>'
,SECRET = '<PRIMARY_ACCESS_KEY>'
GO

Example:

CREATE CREDENTIAL Cred_SQL01
WITH IDENTITY = 'newstoracct1'
,SECRET = 'lhLyxfNa4KibH6grJFbctn1nLg44hJStTQFYInQ3kDRxoVktD00fF7+N0zS39UKaWp0i+sy+3rNXQG9omLDRRg=='
GO

    3. Execute the following query from the query window, after replacing the variables with values applicable to your environment, to back up the database to blob storage:

BACKUP DATABASE [AdventureWorks2014]
TO URL = 'https://<STORAGE_ACCOUNT_NAME>.blob.core.windows.net/dbbackups/AdventureWorks2014.bak'
WITH CREDENTIAL = 'Cred_SQL01'
, STATS = 5;
GO

Example:

BACKUP DATABASE [AdventureWorks2014]
TO URL = 'https://newstoracct1.blob.core.windows.net/dbbackups/adventureworks2014.bak'

WITH CREDENTIAL = 'Cred_SQL01',
STATS = 5;
GO

Results in the tenant site: the new backup blob appears in the dbbackups container.

    4. Post condition: depending on your bandwidth, you will see backup progress messages like the following:

5 percent processed.
10 percent processed.
15 percent processed.
20 percent processed.
25 percent processed.
30 percent processed.
35 percent processed.
40 percent processed.
45 percent processed.
50 percent processed.
55 percent processed.
60 percent processed.
65 percent processed.
70 percent processed.
75 percent processed.
80 percent processed.
85 percent processed.
90 percent processed.
95 percent processed.
100 percent processed.
Processed 24272 pages for database 'AdventureWorks2014', file 'AdventureWorks2014_Data' on file 1.
Processed 3 pages for database 'AdventureWorks2014', file 'AdventureWorks2014_Log' on file 1.
BACKUP DATABASE successfully processed 24275 pages in 5.217 seconds (36.350 MB/sec).

Restore a database from blob storage

Prerequisite: Ensure you have the storage account name, storage URL, and primary access key associated with the existing container.

  1. Log on to the SQL Server.
  2. Open SQL Server 2014 Management Studio.
  3. Click the New Query button in SQL Server Management Studio (Ctrl + N).
  4. Execute the following query from the query window after replacing the variables with values applicable to your environment:

CREATE CREDENTIAL Cred_SQL01
WITH IDENTITY = '<STORAGE_ACCOUNT_NAME>'
,SECRET = '<PRIMARY_ACCESS_KEY>'
GO

  5. Execute the following query from the query window, after replacing the variables with values applicable to your environment, to restore the database from blob storage:

RESTORE DATABASE [AdventureWorks2014]
FROM URL = 'https://<STORAGE_ACCOUNT_NAME>.blob.core.windows.net/dbbackups/AdventureWorks2014.bak'
WITH CREDENTIAL = 'Cred_SQL01'
, STATS = 5;
GO

  6. Right-click Databases and select Refresh.
  7. Post condition: depending on your bandwidth, you will see restore progress messages like the following:

5 percent processed.
10 percent processed.
15 percent processed.
20 percent processed.
25 percent processed.
30 percent processed.
35 percent processed.
40 percent processed.
45 percent processed.
50 percent processed.
55 percent processed.
60 percent processed.
65 percent processed.
70 percent processed.
75 percent processed.
80 percent processed.
85 percent processed.
90 percent processed.
95 percent processed.
100 percent processed.
Processed 24272 pages for database 'AdventureWorks2014', file 'AdventureWorks2014_Data' on file 1.
Processed 3 pages for database 'AdventureWorks2014', file 'AdventureWorks2014_Log' on file 1.
RESTORE DATABASE successfully processed 24275 pages in 13.072 seconds (14.507 MB/sec).

  8. Ensure that AdventureWorks2014 is listed as a child node of Databases.

  9. Once you verify that the database has been restored, you're done.

Automated Approach

As previously mentioned, one of the key benefits of the CSP program is the existing partner portal that helps partners manage existing customers' subscriptions or create new ones. Although CSP partners can manage and create new CSP customers via the site, partners are looking for new, innovative ways to integrate their on-premises backup and restore solutions with customers provisioned using the Partner Center. Microsoft has extended the CSP capabilities by exposing a set of APIs that we use in the code snippets below. The key APIs we use are the CREST API, which helps manage customers and services, and the ARM REST API, which helps provision and manage Azure features and services. Although each of the code snippets below may be leveraged as a starting point for delivering an automated solution, additional work and effort will be needed to pull them together into a single solution for managing customers' backup/restore scenarios.

Scenario: The recommended approach for delivering an automated solution assumes that a partner has many existing customers with various databases currently running in the partner's on-premises environment. Each unique customer's databases may live on a single SQL Server or be spread across several SQL Servers. Any solution also needs to address both dedicated and multitenant deployments, so that has been considered as well. We also assume that CSP partners know which database is associated with each customer they manage. Below we provide guidance using the APIs, with sample code snippets, and outline the key components involved in helping partners deliver their own unique automated solution.

Assumptions: The following assumptions are applicable to the documented approach outlined below.

  • Partner is using SQL Server 2014
  • All SQL Servers critical to this process reside in the same domain or another trusted domain.
  • The partner has identified an Account that has the appropriate privileges on all SQL Servers.

In the steps below, the following preparation work is essential for using this guidance and building a partner's unique automated approach:

  • Server list – The partner needs to identify a list of SQL Servers that are already members of the same domain (per the assumption above).
  • The partner needs to identify all existing or net-new customers that need to be provisioned via the Partner Center.
  • The partner needs to map each database to a customer. This is key later when creating a storage container and backing up to or restoring from Azure.

Initial Requirements

The steps below outline the planned sequence of events. It is important to note that we assume the following preliminary requirements are completed.

  1. Partner has been provisioned in the Partner Center Portal and is able to log on and configure their environment.
  2. Partner has been associated with their customers' subscriptions.
  3. Partner has identified all SQL Servers and databases for each unique customer.
  4. Partner has mapped all databases to their unique customers.
  5. The provided site and API are provisioned, and the required info (i.e., key, etc.) was captured for use.
  6. The server identified to run your scripts has SQL Express 2014 installed.

The code snippets below can be leveraged as a starting point for CSP partners to deliver an automated solution of their own. It is important to note that the snippets are written as a guide only.

Step 1- Creating the Customer CSV Input File

For each of your customers, you must identify the following information to successfully create the customer using the guidance in this document. The information captured here supports creation of a customer.csv file, which is used as input when creating a new customer in Partner Center and setting up the customer's Azure subscription.

CSV Column Name                        | Example 1          | Example 2          | Description
---------------------------------------|--------------------|--------------------|-------------------------------------------
DomainPrefix                           | NEWKKK021233123    | MEDKKK021233123    | Identifies the prefix for the domain used
UserName                               | Philtestuser       | MEDtestuser        | User name
Password                               | pwd@2012           | pwd@2012           | Password
FirstName                              | New                | MED                | First name
LastName                               | testuser           | testuser           | Last name
Email                                  | new@new.com        | med@med.com        | Email address
CompanyName                            | company111         | company111         | Company name
Culture                                | en-US              | en-US              | Language-Country
Language                               | en                 | en                 | Language
Type                                   | organization       | organization       | Can use any string needed
AddressLine1                           | AddressLine1       | AddressLine1       | Address
AddressLine2                           | AddressLine2       | AddressLine2       | Address
City                                   | Redmond            | Redmond            | City
Region                                 | WA                 | WA                 | Region
PostalCode                             | 98052              | 98052              | Zip code
Country                                | US                 | US                 | Country
PhoneNumber                            | 1234567890         | 1234567890         | Phone number
CustomerDatabaseBackupIntervalInHours  | 12                 | 12                 | Interval in hours between backups
BackupType                             | FullDatabaseBackup | IncrementalBackup  | Full or incremental backup supported

File format for identifying customers used in our guidance document: customer.csv

DomainPrefix,UserName,Password,FirstName,LastName,Email,CompanyName,Culture,Language,Type,AddressLine1,AddressLine2,City,Region,PostalCode,Country,PhoneNumber,CustomerDatabaseBackupIntervalInHours,BackupType
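
For illustration, a populated row using the Example 1 values from the table above looks like this:

NEWKKK021233123,Philtestuser,pwd@2012,New,testuser,new@new.com,company111,en-US,en,organization,AddressLine1,AddressLine2,Redmond,WA,98052,US,1234567890,12,FullDatabaseBackup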

Step 2- Creating the Server List Input File

For each of your customers, you must identify which servers in your environment have SQL Server installed and should be included for backups. The information captured here is used to determine what databases are available on each server. In our code snippets we use a file called server.csv as input; the code scans each SQL Server in the list and determines which databases live on it. Here is an example with 3 servers listed.

File Format for identifying servers used in our guidance document: server.csv

ServerName

SQLBOX01

SQLBOX02

SQLBOX03

 

Step 3- Importing a list of Databases from the Server List (input)

For each of your customers, you must determine which databases belong to them, to ensure those databases get backed up and dropped into the right container later. The following code snippet helps partners determine which SQL databases live on each previously identified server.

------Start of Code Snippet----------------------------

using System;
using System.Collections.Generic;
using System.Data.SqlClient;

public class SqlServerProvider
{
  public string ServerConnectionString { get; set; }

  public SqlServerProvider(string serverConnectionString)
  {
    this.ServerConnectionString = serverConnectionString;
  }

  public List<SqlDatabase> GetDatabases(Logger logger)
  {
    List<SqlDatabase> databases = new List<SqlDatabase>();
    SqlConnection sqlConnection = new SqlConnection(ServerConnectionString);

    try
    {
      using (sqlConnection)
      {
        // Enumerate all databases on the instance, normalizing create_date
        // from server local time to UTC.
        SqlCommand command = new SqlCommand(@"SELECT sd.database_id as SqlDatabaseId,
              sd.name as Name,
              owner_sid as OwnerSid,
              osp.name as Owner,
              DATEADD(MINUTE, -DATEPART(TZoffset, SYSDATETIMEOFFSET()), sd.create_date) AS CreateDate
              FROM sys.databases sd
              JOIN sys.database_mirroring dm
              ON sd.database_id = dm.database_id
              LEFT JOIN sys.server_principals osp
              ON sd.owner_sid = osp.sid",
          sqlConnection);
        sqlConnection.Open();
        SqlDataReader reader = command.ExecuteReader();
        if (reader.HasRows)
        {
          while (reader.Read())
          {
            try
            {
              SqlDatabase db = new SqlDatabase();
              db.SqlDatabaseId = Guid.NewGuid();
              // Get the results of each column.
              db.SqlServerDatabaseId = (Int32)reader["SqlDatabaseId"];
              db.Name = (string)reader["Name"];
              db.OwnerSid = (byte[])reader["OwnerSid"];
              db.Owner = (string)reader["Owner"];
              db.CreateDate = new DateTimeOffset((DateTime)reader["CreateDate"]);

              // Skip system databases; only customer databases are backed up.
              if (!db.IsSystem)
              {
                databases.Add(db);
              }
            }
            catch (Exception ex)
            {
              logger.Error(ex);
            }
          }
        }
        else
        {
          logger.Warning("No database found in server {0}", ServerConnectionString);
        }

        reader.Close();
      }
    }
    catch (Exception ex)
    {
      logger.Error(ex);
    }
    return databases;
  }
}

------End of Code Snippet----------------------------

Step 4- Performing a manual DB mapping to the customer list input file

For each of your customers, you must map database ownership to a CSP customer. This ensures the backup and restore automation scripts know which customer Azure subscription to use. There are many ways to map customers to databases for a given server. In our code snippets we leveraged SQL Express for storing all captured information, but a flat mapping file could also be used to support CSP partners' efforts; one possible mapping is as follows:

File = map.csv

CustomerName, CustomerID, ServerID, DatabaseID

Note: Mapping is required. We have found that a SQL database can be used to store the mapping, or you can simply use a flat file like the one mentioned above (i.e., map.csv).
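
For illustration, a map.csv row might look like the following; the GUID values are placeholders for the identifiers your solution generates:

company111,11111111-1111-1111-1111-111111111111,22222222-2222-2222-2222-222222222222,33333333-3333-3333-3333-333333333333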

Step 5- Creating a new customer in the Partner Center Portal

For each of your customers, you must create a customer account and an Azure subscription; this lets you track usage and billing for each customer easily. To create the customer account and provision the Azure subscription, leverage the CREST APIs (see Appendix A or the links). The code snippets that follow create initial customer accounts from a CSV file.

The function in the code snippet below helps partners form the HTTP web request for creating customers. It also adds the HTTP headers required by the CREST API to the request.

------Start of Code Snippet----------------------------

internal static HttpWebRequest CreateCustomerWebRequest(string resellerCid, string sa_Token)
{
  var request = (HttpWebRequest)HttpWebRequest.Create(string.Format("{0}/{1}/customers/create-reseller-customer", baseUrl, resellerCid));
  request.Method = "POST";
  request.ContentType = "application/json";
  AddCommonRequestHeaders(request);
  request.Headers.Add("x-ms-tracking-id", Guid.NewGuid().ToString());
  request.Headers.Add("Authorization", "Bearer " + sa_Token);
  return request;
}
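
// Note: AddCommonRequestHeaders and the baseUrl field are referenced above but
// are not shown in the original snippets. The sketch below is a hypothetical
// stand-in; the endpoint and api-version values are assumptions and should be
// confirmed against the current CREST API reference.
private static readonly string baseUrl = "https://api.cp.microsoft.com";

internal static void AddCommonRequestHeaders(HttpWebRequest request)
{
  // Version header expected by the CREST API.
  request.Headers.Add("api-version", "2015-03-31");
  request.Accept = "application/json";
}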

------End of Code Snippet----------------------------

The following code snippet is used to get the required Authorization Bearer token (File: Reseller.cs):

------Start of Code Snippet----------------------------
/// <summary>
/// Given the reseller domain, clientid and clientsecret of the app, this method helps to retrieve the AD token
/// </summary>
/// <param name="resellerDomain">domain of the reseller including .onmicrosoft.com</param>
/// <param name="clientId">AppId from the azure portal registered for this app</param>
/// <param name="clientSecret">Secret from the azure portal registered for this app</param>
/// <returns>this is the authentication token object that contains access_token, expiration time, can be used to get the authorization token from a resource</returns>
public static dynamic GetADToken(string resellerDomain, string clientId, string clientSecret)
{
  WebRequest request;
  string content;
  CrestApiHelper.GetGetADTokenWebRequest(resellerDomain, clientId, clientSecret, out request, out content);
  using (var writer = new StreamWriter(request.GetRequestStream()))
  {
    writer.Write(content);
  }

  try
  {
    var response = request.GetResponse();
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
      var responseContent = reader.ReadToEnd();
      var adResponse = Json.Decode(responseContent);
      return adResponse;
    }
  }
  catch (WebException webException)
  {
    if (webException.Response != null)
    {
      using (var reader = new StreamReader(webException.Response.GetResponseStream()))
      {
        // The error body is read here so it can be inspected or logged while debugging.
        var responseContent = reader.ReadToEnd();
      }
    }
  }

  return string.Empty;
}

/// <summary>
/// Given the ad token this method retrieves the sales agent token for accessing any of the partner apis
/// </summary>
/// <param name="adToken">this is the access_token we get from AD</param>
/// <returns>the sales agent token object which contains access_token, expiration duration</returns>
private static dynamic GetSA_Token(string adToken)
{
  HttpWebRequest request;
  string content;
  CrestApiHelper.GetSATokenWebRequest(adToken, out request, out content);

  using (var writer = new StreamWriter(request.GetRequestStream()))
  {
    writer.Write(content);
  }

  try
  {
    var response = request.GetResponse();
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
      var responseContent = reader.ReadToEnd();
      return Json.Decode(responseContent);
    }
  }
  catch (WebException webException)
  {
    using (var reader = new StreamReader(webException.Response.GetResponseStream()))
    {
      // The error body is read here so it can be inspected or logged while debugging.
      var responseContent = reader.ReadToEnd();
    }
  }
  return string.Empty;
}

------End of Code Snippet----------------------------

The purpose of the code below is to create the customer in Microsoft Partner Center. There are additional inputs partners need to be aware of to successfully call the code sample below (customer, resellerCid, sa_Token).

Main Method:

------Start of Code Snippet----------------------------

/// <summary>
/// This method is used to create a customer in the Microsoft reseller ecosystem by the reseller
/// </summary>
/// <param name="customer">customer information: domain, admin credentials for the new tenant, address, primary contact info</param>
/// <param name="resellerCid">reseller cid</param>
/// <param name="sa_Token">unexpired access token to access the partner apis</param>
/// <returns>the newly created customer information: all of the above from customer, customer cid, customer microsoft id</returns>
public static dynamic CreateCustomer(dynamic customer, string resellerCid, string sa_Token)
{
  HttpWebRequest request = CrestApiHelper.CreateCustomerWebRequest(resellerCid, sa_Token);
  string content = Json.Encode(customer);
  using (var writer = new StreamWriter(request.GetRequestStream()))
  {
    writer.Write(content);
  }

  try
  {
    var response = request.GetResponse();
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
      var responseContent = reader.ReadToEnd();
      var createdCustomerResponse = Json.Decode(responseContent);
      return createdCustomerResponse;
    }
  }
  catch (WebException webException)
  {
    using (var reader = new StreamReader(webException.Response.GetResponseStream()))
    {
      var responseContent = reader.ReadToEnd();
      return responseContent;
    }
  }
}

------End of Code Snippet----------------------------
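
To tie the pieces together, the sketch below shows one way the calls above could be chained. This is a minimal sketch, not part of the original solution: it assumes the code lives inside the same class as GetADToken and GetSA_Token, that clientId, clientSecret, and resellerCid were captured when the API app was registered in the Partner Center Portal, and that BuildCustomerFromCsvRow is a hypothetical partner-written helper that shapes a customer.csv row into the JSON payload expected by create-reseller-customer.

------Start of Code Snippet----------------------------

// Hypothetical end-to-end flow (see assumptions in the text above).
dynamic adToken = GetADToken("resellerdomain.onmicrosoft.com", clientId, clientSecret);
dynamic saToken = GetSA_Token((string)adToken.access_token);

// BuildCustomerFromCsvRow is a hypothetical helper, not shown here.
dynamic customer = BuildCustomerFromCsvRow(csvRow);
dynamic createdCustomer = CreateCustomer(customer, resellerCid, (string)saToken.access_token);

------End of Code Snippet----------------------------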

Step 6 – Creating a new Azure Subscription for each CSP Customer

Once a new CSP customer is created for a unique tenant in the Partner Center, a new tenant Azure subscription must also be provisioned. The following guidance shows how to create the required Azure subscription by placing an order through the CREST API.

The first code snippet retrieves the subscriptions that may already exist in the Partner Center for a customer.

------Start of Code Snippet----------------------------

/// <summary>
/// This method is to retrieve the subscriptions of a customer bought from the reseller
/// </summary>
/// <param name="customerCid">cid of the customer</param>
/// <param name="resellerCid">cir of the reseller</param>
/// <param name="sa_Token">sales agent token</param>
/// <returns>object that contains all of the subscriptions</returns>
public static dynamic GetSubscriptions(string customerCid, string resellerCid, string sa_Token)
{
  HttpWebRequest request = CrestApiHelper.GetSubscriptionsWebRequest(customerCid, resellerCid, sa_Token);

  try
  {
    var response = request.GetResponse();
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
      var responseContent = reader.ReadToEnd();
      var subscriptionsResponse = Json.Decode(responseContent);
      return subscriptionsResponse;
    }
  }
  catch (WebException webException)
  {
    using (var reader = new StreamReader(webException.Response.GetResponseStream()))
    {
      var responseContent = reader.ReadToEnd();
    }
  }
  return string.Empty;
}

------End of Code Snippet----------------------------

The code snippet below helps CSP partners create a new customer Azure subscription via the CREST API if one does not exist.

------Start of Code Snippet----------------------------

/// <summary>
/// This method is used to place order on behalf of a customer by a reseller
/// </summary>
/// <param name="resellerCid">the cid of the reseller</param>
/// <param name="order">new Order that can contain multiple line items</param>
/// <param name="sa_Token">unexpired access token to call the partner apis</param>
/// <returns>order information that has references to the subscription uris and entitlement uri for the line items</returns>
public static dynamic PlaceOrder(dynamic order, string resellerCid, string sa_Token)
{
  HttpWebRequest request = CrestApiHelper.PlaceOrderWebRequest(resellerCid, sa_Token);
  string content = Json.Encode(order);

  using (var writer = new StreamWriter(request.GetRequestStream()))
  {
    writer.Write(content);
  }

  try
  {
    var response = request.GetResponse();
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
      var responseContent = reader.ReadToEnd();
      var placedOrder = Json.Decode(responseContent);
      return placedOrder;
    }
  }
  catch (WebException webException)
  {
    using (var reader = new StreamReader(webException.Response.GetResponseStream()))
    {
      var responseContent = reader.ReadToEnd();
    }
  }
  return string.Empty;
}

------End of Code Snippet----------------------------

The code snippets below are the methods for forming the web requests for creating an order and for getting the subscriptions for a unique customer.

--Creating a subscription in CSP -------Start of Code Snippet----------------------------

internal static HttpWebRequest PlaceOrderWebRequest(string resellerCid, string sa_Token)
{
  HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(string.Format("{0}/{1}/orders", baseUrl, resellerCid));
  request.Method = "POST";
  request.ContentType = "application/json";
  AddCommonRequestHeaders(request);
  request.Headers.Add("x-ms-tracking-id", Guid.NewGuid().ToString());
  request.Headers.Add("Authorization", "Bearer " + sa_Token);
  return request;
}

------End of Code Snippet----------------------------

--Getting all subscriptions for a customer ------Start of Code Snippet----------------------------

internal static HttpWebRequest GetSubscriptionsWebRequest(string customerCid, string resellerCid, string sa_Token)
{
  HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(string.Format("{0}/{1}/subscriptions?recipient_customer_id={2}", baseUrl, resellerCid, customerCid));
  request.Method = "GET";
  AddCommonRequestHeaders(request);
  request.Headers.Add("x-ms-tracking-id", Guid.NewGuid().ToString());
  request.Headers.Add("Authorization", "Bearer " + sa_Token);
  return request;
}

------End of Code Snippet----------------------------

The code snippet below is the main method; it checks whether an active subscription with the Azure offer ID already exists.

------Start of Code Snippet----------------------------

public Order CreateCustomerOrder(string customerCPID, string offerUri)
{
  string resellerCid = Reseller.GetCid(MicrosoftId, _saAuthorizationToken.AccessToken);
  var customerAuthorizationToken = CustomerProvider.GetCustomer_Token(customerCPID, _adAuthorizationToken);
  var subscriptions = Subscription.GetSubscriptions(customerCPID, resellerCid, _saAuthorizationToken.AccessToken);

  // If an active subscription for this offer already exists, reuse it.
  if (subscriptions != null && subscriptions.total_count > 0)
  {
    foreach (var s in subscriptions.items)
    {
      if (s.state == "Active" && s.offer_uri == offerUri)
      {
        return new Order
        {
          SubscriptionId = s.id,
          SubscriptionUri = ""
        };
      }
    }
  }

  // Otherwise place a new order with a single line item for the offer.
  var order = new
  {
    line_items = new List<dynamic>()
    {
      new
      {
        line_item_number = 0,
        offer_uri = offerUri,
        quantity = 1,
      }
    },
    recipient_customer_id = customerCPID
  };

  var newCustomerPlacedOrder = OrdersProvider.PlaceOrder(order, resellerCid, _saAuthorizationToken.AccessToken);
  var subscription = Subscription.GetSubscriptionByUri(newCustomerPlacedOrder.line_items[0].resulting_subscription_uri, _saAuthorizationToken.AccessToken);

  return new Order
  {
    SubscriptionId = subscription.Id,
    SubscriptionUri = newCustomerPlacedOrder.line_items[0].resulting_subscription_uri
  };
}

------End of Code Snippet----------------------------
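
A call to the method above might look like the following. The offer URI is deliberately left as a placeholder: the real value must be taken from the CREST offer catalog for your market (the Azure-in-CSP subscription offer is commonly listed under offer ID MS-AZR-0145P, but confirm this against the catalog).

------Start of Code Snippet----------------------------

// Hypothetical invocation; replace <AZURE_OFFER_URI> with the Azure offer URI
// from the CREST offer catalog for your market.
Order order = CreateCustomerOrder(customerCPID, "<AZURE_OFFER_URI>");
Console.WriteLine("Provisioned subscription: {0}", order.SubscriptionId);

------End of Code Snippet----------------------------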

Step 7 - Provisioning the new storage account for each customer

The following code snippet outlines how to properly create a new Azure resource group and storage account. Here are details on each core area addressed in the code snippet.

  • Create Resource Group - Resource groups are a new concept in the Azure Preview portal. We think of them as "lifecycle boundaries": when resources share a resource group, their lifecycles (from create, to update, to delete) are managed in an integrated way. Use resource groups to collect and manage all your application resources, link resources across resource groups, share resources across lifecycle boundaries, and view, monitor, and track usage and billing for all the resources your application uses. We use a resource group to manage all SQL resources in each subscription that we back up to or restore from.
  • Create Storage Account – An Azure storage account is a secure account that gives you access to services in Azure Storage. Your storage account provides the unique namespace for your storage resources. In our guidance we focus on the standard storage account type; a single subscription can hold up to 100 uniquely named storage accounts. Below is an example of how a CSP partner can provision storage in an Azure customer's subscription.

Here we identify the main method. It checks whether an Azure resource group exists and creates one if not; it then checks whether a storage account exists for the customer in the resource group and either creates one or simply uses the existing one.

------Start of Code Snippet----------------------------

internal static void CreateResourcesInResourceGroup(AzureCustomerDetail customerDetail, string token, string storageAccountName)
{
  AppSettingDataProvider settingProvider = new AppSettingDataProvider(ServiceConfigrationHelper.GetConnectionString(), ServiceConfigrationHelper.GetEncriptionCertSubjectName());

  string resourceGroupName = settingProvider.GetAppSetting("storageresourceGroupName").AppSettingValue;
  string defaultStorageAccountLocation = settingProvider.GetAppSetting("DefaultStorageAccountLocation").AppSettingValue;
  string storageAccountType = settingProvider.GetAppSetting("storageAccountType").AppSettingValue;
  string storageAccountLabel = settingProvider.GetAppSetting("storageAccountLabel").AppSettingValue;

  ArmApi api = new ArmApi();
  StorageResourceRequest storageRequest = new StorageResourceRequest
  {
    Location = defaultStorageAccountLocation,
    SubscriptionId = customerDetail.CustomerSubscriptionId.ToString(),
    Token = token,
    AccountType = storageAccountType,
    Label = storageAccountLabel,
    Name = storageAccountName,
    ResourceGroupName = resourceGroupName
  };

  // Check whether the resource group already exists and has finished provisioning.
  var resourceGroups = api.GetAllResourceGroup(storageRequest);
  bool resourceGroupExist = false;
  if (resourceGroups != null && resourceGroups.Count > 0)
  {
    foreach (var resourceGroup in resourceGroups)
    {
      if (!string.IsNullOrEmpty(resourceGroup.Name) && string.Compare(resourceGroup.Name, resourceGroupName, false) == 0)
      {
        if (resourceGroup.Properties["provisioningState"] == "Succeeded")
        {
          resourceGroupExist = true;
          break;
        }
      }
    }
  }

  if (!resourceGroupExist)
  {
    api.CreateResourceGroup(storageRequest, resourceGroupName);
  }

  // Ensure the storage resource provider is registered, then reuse an existing
  // storage account if one exists or create a new one.
  api.RegisterStorageResourceProvider(storageRequest);
  var accounts = api.GetAllStorageAccountInResourceGroup(storageRequest);

  if (accounts != null && accounts.Value != null && accounts.Value.Count() > 0)
  {
    var existingAccount = accounts.Value.FirstOrDefault();
    storageRequest.Name = existingAccount.Name;
    storageRequest.Location = existingAccount.Location;
  }
  else
  {
    api.CreateStorageResource(storageRequest);
  }

  // Persist the account name, URL, and key for later use by the backup scripts.
  var account = api.GetStorageAccount(storageRequest);
  var accountkey = api.GetStorageAccountKeys(storageRequest);
  var url = StorageProvider.GetStorageAccountUrl(account.Name, accountkey.Key1);

  CustomerDataProvider customerProvider = new CustomerDataProvider(ServiceConfigrationHelper.GetConnectionString(), ServiceConfigrationHelper.GetEncriptionCertSubjectName());
  var customer = customerProvider.GetCustomer(customerDetail.AzureCustomerDetailId);
  customer.StorageAccountId = account.Id;
  customer.StorageAccountName = account.Name;
  customer.StorageAccountLocation = url;
  customer.StorageAccountKey = accountkey.Key1;
  customerProvider.UpdateCustomer(customer);
}

------End of Code Snippet----------------------------
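
Optionally, before calling CreateStorageResource, the generated storage account name can be validated using the CheckStorageAccountNameAvailability helper defined in the ArmApi class below. A minimal sketch:

------Start of Code Snippet----------------------------

// Optional pre-check: validate the generated storage account name before
// attempting creation, surfacing the ARM-provided reason if it is rejected.
string reason;
bool available = api.CheckStorageAccountNameAvailability(
    storageRequest.SubscriptionId, storageAccountName, token, out reason);
if (!available)
{
  throw new InvalidOperationException(reason);
}

------End of Code Snippet----------------------------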

The following code snippet identifies the helper class which we use for calling the Azure Resource Manager (ARM) REST API.

------Start of Code Snippet----------------------------

using HSPDataProvider.Helper;
using System;
using System.Collections.Generic;
using System.Net.Http;

namespace HSPDataProvider.ARMApi
{
  public class ArmApi
  {
    private static ArmApiUrl apiUrl = null;

    static ArmApi()
    {
      apiUrl = new ArmApiUrl();
    }

    public List<ResourceGroup> GetAllResourceGroup(ArmApiParam request)
    {
      HttpHelper helper = new HttpHelper();
      Uri uri = new Uri(string.Format(apiUrl.GetAllResourceGroupUrl, request.SubscriptionId));
      return helper.GetArmResponseObject<ResourceGroupCollection>(uri, request.Token).Value;
    }

    public void CreateResourceGroup(ArmApiParam request, string resourceGroupName)
    {
      HttpHelper helper = new HttpHelper();
      Uri uri = new Uri(string.Format(apiUrl.CreateResourceGroupUrl, request.SubscriptionId, resourceGroupName));
      var contentBody = new { location = request.Location };
      var content = new ObjectContent(contentBody.GetType(), contentBody, HttpHelper.JsonMediaTypeFormatter);

      HttpResponseMessage message = helper.PutArmRequestObject(uri, request.Token, content);
    }

    public void RegisterStorageResourceProvider(ArmApiParam request)
    {
      HttpHelper helper = new HttpHelper();
      Uri uri = new Uri(string.Format(apiUrl.RegisterResourceProviderWithSubscription, request.SubscriptionId));
      var contentBody = new { };
      var content = new ObjectContent(contentBody.GetType(), contentBody, HttpHelper.JsonMediaTypeFormatter);

      HttpResponseMessage message = helper.PostArmRequestObject<HttpResponseMessage>(uri, request.Token, content);
    }

    public bool CheckStorageAccountNameAvailability(string subscriptionId, string storageAccountName, string token, out string messageText)
    {
      messageText = string.Empty;
      HttpHelper helper = new HttpHelper();
      Uri uri = new Uri(string.Format(apiUrl.CheckStorageAccountNameAvailability, subscriptionId));
      var contentBody = new { name = storageAccountName, type = "Microsoft.Storage/storageAccounts" };
      var content = new ObjectContent(contentBody.GetType(), contentBody, HttpHelper.JsonMediaTypeFormatter);
      StorageAccountAvailability message = helper.PostArmRequestObject<StorageAccountAvailability>(uri, token, content);

      if (!message.nameAvailable)
      {
        if (!string.IsNullOrEmpty(message.reason))
          messageText = message.reason;

        if (!string.IsNullOrEmpty(message.message))
          messageText = messageText + message.message;
      }

      return message.nameAvailable;
    }

    public StorageAccountColection GetAllStorageAccountInResourceGroup(StorageResourceRequest request)
    {
      HttpHelper helper = new HttpHelper();
      Uri uri = new Uri(string.Format(apiUrl.GetAllStorageAccountInResourceGroupUrl, request.SubscriptionId, request.ResourceGroupName));
      var accounts = helper.GetArmResponseObject<StorageAccountColection>(uri, request.Token);
      return accounts;
    }

    public void CreateStorageResource(StorageResourceRequest request)
    {
      HttpHelper helper = new HttpHelper();
      Uri uri = new Uri(string.Format(apiUrl.StorageResourceUrl, request.SubscriptionId, request.ResourceGroupName, request.Name));

      var properties = new { accountType = request.AccountType };
      var contentBody = new { location = request.Location, properties = properties };
      var content = new ObjectContent(contentBody.GetType(), contentBody, HttpHelper.JsonMediaTypeFormatter);

      HttpResponseMessage message = helper.PutArmRequestObject(uri, request.Token, content);
    }

    public StorageAccount GetStorageAccount(StorageResourceRequest request)
    {
      HttpHelper helper = new HttpHelper();
      Uri uri = new Uri(string.Format(apiUrl.StorageResourceUrl, request.SubscriptionId, request.ResourceGroupName, request.Name));
      return helper.GetArmResponseObject<StorageAccount>(uri, request.Token);
    }

    public StorageKey GetStorageAccountKeys(StorageResourceRequest request)
    {
      HttpHelper helper = new HttpHelper();
      Uri uri = new Uri(string.Format(apiUrl.StorageAccountKeyUrl, request.SubscriptionId, request.ResourceGroupName, request.Name));

      var contentBody = new { };
      var content = new ObjectContent(contentBody.GetType(), contentBody, HttpHelper.JsonMediaTypeFormatter);

      StorageKey keys = helper.PostArmRequestObject<StorageKey>(uri, request.Token, content);

      return keys;
    }
  }
}

------End of Code Snippet----------------------------

The following code snippet is the ARM REST API URL helper, which we use for forming the URLs for ARM requests.

------Start of Code Snippet----------------------------

namespace HSPDataProvider.ARMApi
{
  public class ArmApiUrl
  {
    public ArmApiUrl()
    {
      ArmResourceGroupsBaseUrl = CommonConst.ArmBaseUrl + @"subscriptions/{0}/resourcegroups";
      CheckStorageAccountNameAvailability = CommonConst.ArmBaseUrl + "subscriptions/{0}/providers/Microsoft.Storage/checkNameAvailability?api-version=2015-06-15";
      RegisterResourceProviderWithSubscription = CommonConst.ArmBaseUrl + "subscriptions/{0}/providers/Microsoft.Storage/register?api-version=2015-01-01";
      StorageAccountKeyUrl = ArmResourceGroupsBaseUrl + "/{1}/providers/Microsoft.Storage/storageAccounts/{2}/listKeys?api-version=2015-06-15";
      StorageResourceUrl = ArmResourceGroupsBaseUrl + "/{1}/providers/Microsoft.Storage/storageAccounts/{2}?api-version=2015-06-15";
      CreateResourceGroupUrl = ArmResourceGroupsBaseUrl + "/{1}?api-version=2015-01-01";
      GetAllResourceGroupUrl = ArmResourceGroupsBaseUrl + "?api-version=2015-01-01";
      GetAllStorageAccountInResourceGroupUrl = ArmResourceGroupsBaseUrl + "/{1}/providers/Microsoft.Storage/storageAccounts?api-version=2015-06-15";
    }

    public string ArmResourceGroupsBaseUrl { get; set; }
    public string StorageAccountKeyUrl { get; set; }
    public string StorageResourceUrl { get; set; }
    public string CreateResourceGroupUrl { get; set; }
    public string RegisterResourceProviderWithSubscription { get; set; }
    public string GetAllResourceGroupUrl { get; set; }
    public string CheckStorageAccountNameAvailability { get; set; }
    public string GetAllStorageAccountInResourceGroupUrl { get; set; }
  }
}

------End of Code Snippet----------------------------
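
The ArmApiUrl class above depends on a CommonConst.ArmBaseUrl value that is not shown in the original snippets. A minimal sketch, assuming the public Azure Resource Manager endpoint, would be:

------Start of Code Snippet----------------------------

namespace HSPDataProvider.ARMApi
{
  // Assumed constant: the ARM request URLs above are rooted at the public
  // Azure Resource Manager endpoint.
  internal static class CommonConst
  {
    public const string ArmBaseUrl = "https://management.azure.com/";
  }
}

------End of Code Snippet----------------------------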

Step 8 - Provisioning Storage Container for each customer DB

For each of your customers' databases, a storage container is needed to house the backups. The guidance here outlines how this can be provisioned as part of the database backup process. In other words, we create a 1-to-1 mapping where each database gets its own container. We believe this helps keep all database backups organized and simplifies the backup and restore process. Here is an example of how an HSP can provision a storage container in an Azure customer's subscription.

------Start of Code Snippet----------------------------

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

private static string storageConnectionString = "DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}";

public static void CreateStorageBlobContainer(string accountName, string containerName, string accountKey)
{
  CloudStorageAccount storageAccount = CloudStorageAccount.Parse(string.Format(storageConnectionString, accountName, accountKey));

  // Create the blob client object.
  CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

  // Get a reference to the container, and create it if it does not exist,
  // keeping public access off so backups stay private.
  CloudBlobContainer container = blobClient.GetContainerReference(containerName);
  container.CreateIfNotExists(BlobContainerPublicAccessType.Off);
}

------End of Code Snippet----------------------------
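
For example, invoking the method with the storage details recorded earlier (all values below are placeholders):

------Start of Code Snippet----------------------------

// Example invocation; account name, container name, and key are placeholders.
CreateStorageBlobContainer("newstoracct1", "sqlbox01-adventureworks2014", "<PRIMARY_ACCESS_KEY>");

------End of Code Snippet----------------------------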

Step 9 - Backing up a set of databases to a customer Storage Container in Azure

As previously identified in our guidance, our approach ensures a mapping exists between each partner's on-premises hosted databases and a unique CSP Azure tenant's subscription. We now want to set up backups so each of the existing databases for a unique customer can be backed up regardless of where it lives. The snippets below cover each of the key activities required to successfully back up customer databases.

Key Steps:

  1. Set up SQL Server backup for full or incremental backups

------Start of Code Snippet----------------------------

public CustomerDatabaseBackupType GetBackupType(AzureCustomerDetail customer, Models.Server sqlserver, SqlDatabase database, string containerName)
{
  if(customer.BackupType == CustomerDatabaseBackupType.FullDatabaseBackup)
  {
    return CustomerDatabaseBackupType.FullDatabaseBackup;
  }
  ServerDataProvider dataProvider = new ServerDataProvider(hspConfigConnectionstring, hspEncriptionCertSubjectName);
  var history = dataProvider.GetDatabaseBackupHistory(database.SqlDatabaseId);
  if (history == null || history.Count() == 0)
  {
    return CustomerDatabaseBackupType.FullDatabaseBackup;
  }
  else
  {
    return CustomerDatabaseBackupType.IncrementalBackup;
  }
}

------End of Code Snippet----------------------------

  2. Identify key credentials (i.e., Azure subscription and storage container)

------Start of Code Snippet----------------------------

// Note: 'server' below is the SMO Server object for the target SQL instance
// (a class-level field in the original solution, not shown in this excerpt).
private Credential CreateCredentialInSqlServer(AzureCustomerDetail customer, Models.Server sqlserver)
{
  if (server.Credentials.Contains(customer.StorageAccountName))
  {
    Credential credential = server.Credentials[customer.StorageAccountName];
    return credential;
  }
  else
  {
    // Create a SQL Server credential whose identity is the storage account
    // name and whose secret is the storage account key, as required for
    // backing up to and restoring from a URL.
    string credentialName = customer.StorageAccountName;
    Credential credential = new Credential(server, credentialName);
    credential.Create(customer.StorageAccountName, customer.StorageAccountKey);
    return credential;
  }
}

------End of Code Snippet----------------------------
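
The Credential type above, along with the Server, Database, Backup, Restore, and BackupDeviceItem types used in the snippets that follow, comes from SQL Server Management Objects (SMO). A typical set of usings for these snippets would be:

------Start of Code Snippet----------------------------

// SMO namespaces used by the credential, backup, and restore snippets.
using Microsoft.SqlServer.Management.Smo;
using Microsoft.SqlServer.Management.Common;

------End of Code Snippet----------------------------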

  3. Back up the database as a blob to the Azure container

------Start of Code Snippet----------------------------

public bool BackupDatabase(AzureCustomerDetail customer, Models.Server sqlserver, SqlDatabase database, out CustomerDatabaseBackupType type, out string backupUrl)
{
  bool IsBackupDatabaseDone = true;
  type = CustomerDatabaseBackupType.None;
  backupUrl = null;
  server.ConnectionContext.Connect();
  try
  {
    // Ensure the per-database container exists before backing up into it.
    string containerName = GetBlobContainerName(sqlserver, database);
    StorageProvider.CreateStorageBlobContainer(customer.StorageAccountName, containerName, customer.StorageAccountKey);

    Credential credential = CreateCredentialInSqlServer(customer, sqlserver);
    type = GetBackupType(customer, sqlserver, database, containerName);
    backupUrl = string.Format(@"{0}{1}/{2}-{3}{4}.bak", customer.StorageAccountLocation, containerName,
      type.ToString().ToLowerInvariant(), database.Name.ToLower(), DateTime.UtcNow.ToString("s").Replace(":", "-"));

    string databaseName = database.Name;
    Database db = default(Database);
    db = server.Databases[databaseName];

    // Store the current recovery model in a variable.
    int recoverymod;
    recoverymod = (int)db.DatabaseOptions.RecoveryModel;

    // Define a Backup object variable.
    Backup bk = new Backup();

    // Specify the type of backup, the description, the name, and the database to be backed up.
    bk.Action = BackupActionType.Database;
    bk.BackupSetDescription = type.ToString().ToLowerInvariant() + " backup of databaseName:" + databaseName;
    bk.BackupSetName = databaseName + " Backup";
    bk.Database = databaseName;
    bk.CredentialName = credential.Name;

    // Declare a BackupDeviceItem by supplying the backup URL in the constructor; the device type is a URL.
    BackupDeviceItem bdi = default(BackupDeviceItem);
    bdi = new BackupDeviceItem(backupUrl, DeviceType.Url);

    // Add the device to the Backup object.
    bk.Devices.Add(bdi);
    // Set the Incremental property based on the backup type determined above
    // (false for a full database backup, true for an incremental one).
    bk.Incremental = (type == CustomerDatabaseBackupType.IncrementalBackup) ? true : false;

    // Specify that the log must be truncated after the backup is complete.
    bk.LogTruncation = BackupTruncateLogType.Truncate;

    // Run SqlBackup to perform the database backup on the instance of SQL Server.
    bk.SqlBackup(server);
    bk.Devices.Remove(bdi);
  }
  catch (Exception ex)
  {
    IsBackupDatabaseDone = false;
    if (logger != null)
      logger.Error(ex);
  }
  finally
  {
    server.ConnectionContext.Disconnect();
  }

  return IsBackupDatabaseDone;
}

------End of Code Snippet----------------------------
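
GetBlobContainerName is referenced in the method above but is not shown in the original snippets. A minimal sketch, consistent with the one-container-per-database mapping from Step 8 and assuming Models.Server exposes the server name, might be:

------Start of Code Snippet----------------------------

// Hypothetical helper: derive a per-database container name. Azure container
// names must be lowercase, so the combined name is lower-cased.
private string GetBlobContainerName(Models.Server sqlserver, SqlDatabase database)
{
  return string.Format("{0}-{1}", sqlserver.Name, database.Name).ToLowerInvariant();
}

------End of Code Snippet----------------------------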

  4. Set up a backup scheduler – We recommend that a partner set up a sleeper service that wakes periodically to check the configured interval settings and compare the backup history in SQL to the current time; if a backup is due, it kicks off an automated backup to Azure. A minimal scheduling sketch follows the code snippet below.

------Start of Code Snippet----------------------------

private void BackupDatabases()
{
  CustomerDataProvider dataProvider = new CustomerDataProvider(ServiceConfigrationHelper.GetConnectionString(), ServiceConfigrationHelper.GetEncriptionCertSubjectName());
  var customers = dataProvider.GetCustomersWithStorageAccount();
  ServerDataProvider serverDataProvider = new ServerDataProvider(ServiceConfigrationHelper.GetConnectionString(), ServiceConfigrationHelper.GetEncriptionCertSubjectName());

  foreach (var customer in customers)
  {
    var databases = dataProvider.GetAllDatabsesForCustomer(customer.AzureCustomerDetailId);
    if (databases != null && databases.Count() > 0)
    {
      foreach (var db in databases)
      {
        // Back up now if the database has never been backed up, or if the
        // customer's configured interval has elapsed since the last backup.
        var history = serverDataProvider.GetLatestDatabaseBackupHistory(db.SqlDatabaseId);
        if (history == null)
        {
          BackupDatabase(serverDataProvider, customer, db);
        }
        else
        {
          if (history.BackupDate.Add(customer.CustomerDatabaseBackupInterval) < DateTime.UtcNow)
          {
            BackupDatabase(serverDataProvider, customer, db);
          }
        }
      }
    }
  }
}

------End of Code Snippet----------------------------
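
As one example of the sleeper service described above, a simple timer can invoke BackupDatabases on a fixed polling interval, letting the per-customer interval check inside it decide whether any database is actually due. The 15-minute polling value below is an arbitrary choice.

------Start of Code Snippet----------------------------

// Minimal scheduling sketch: poll every 15 minutes; BackupDatabases itself
// compares each customer's configured interval against the backup history.
var backupTimer = new System.Threading.Timer(
    _ => BackupDatabases(), null, TimeSpan.Zero, TimeSpan.FromMinutes(15));

------End of Code Snippet----------------------------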

Step 10 - Restoring a customer's SQL Server DB from a unique customer Storage Container

One of the key events that may be needed is the restore of a previously backed-up database using the solution identified above. Since a restore is not considered a planned event, we have outlined key steps that can be implemented to kick off a restore of a unique customer's database, or of all databases on an existing server.

Scenario:

Since our recommended guidance included a means for capturing all databases on a defined SQL Server that is mapped to one or more customers, a restore of a single database or of all databases on a given server can be supported. Restoring a group of databases for different customers requires repeating the same command options used for one.

The following steps outline the sequence of events required to properly run restores from an existing Azure tenant subscription's storage container. The nature of a restore is to replace the previous on-premises database, and this guidance restores the existing database back to the last known good backup; any information that has not been backed up will be lost. If this is critical to partner tenants, we recommend that an HA solution be implemented separately. It is important to identify that we assume the following preliminary requirements have been met.

Initial Requirements:

  1. Partner has been provisioned in the Partner Center Portal and is able to log on and configure their environment.
  2. Partner has been associated with their customers' subscriptions.
  3. Partner has identified all SQL Servers and databases for each unique customer in the provided files "server.csv" and "customer.csv".
  4. Partner has mapped all databases to their unique customers in the provided file "map.csv".
  5. The provided site and API have been provisioned, and the required details captured for use.
  6. At least a single full backup has completed successfully to blob storage.

Once the initial requirements have been validated the following key steps below can be executed.

Key Steps:

  1. CSP partner identifies a need to restore a database (or databases) or a server due to reported issues.
  2. Identify which customer to restore – in our example we use the following command to execute: Restore-CustomerDatabase -CustomerId <customerguid> -ServerId <serverguid> -DatabaseId <databaseguid>
  3. Restore the blob from the Azure container – The code snippet below is one way a CSP partner can implement restoration from Azure back to a previously backed-up SQL Server on-premises.

------Start of Code Snippet----------------------------

public bool RestoreDatabase(AzureCustomerDetail customer, Models.Server sqlserver, SqlDatabase database, List<SqlDatabaseBackupHistory> backups)
{
  bool isRestoreDone = false;
  server.ConnectionContext.Connect();
  try
  {
    // Relocate the restored data and log files to time-stamped paths so the
    // existing database files are not overwritten in place.
    string newDataFilePath = server.MasterDBLogPath + @"\" + database.Name + DateTime.Now.ToString("s").Replace(":", "-") + ".mdf";
    string newLogFilePath = server.MasterDBLogPath + @"\" + database.Name + DateTime.Now.ToString("s").Replace(":", "-") + ".ldf";

    // The SQL credential holds the storage account identity and key used to
    // read the backup blobs from the customer's container.
    Credential credential = CreateCredentialInSqlServer(customer, sqlserver);

    // The restore chain must begin with a full database backup.
    if (backups.OrderBy(b => b.BackupDate).First().DatabaseBackupType != CustomerDatabaseBackupType.FullDatabaseBackup)
    {
      throw new Exception("No full database backup found");
    }

    int count = backups.Count();
    int index = 1;
    foreach (var bk in backups.OrderBy(b => b.BackupDate))
    {
      Restore restore = new Restore();
      restore.CredentialName = credential.Name;
      restore.Database = database.Name;

      if (index == 1 && bk.DatabaseBackupType != CustomerDatabaseBackupType.FullDatabaseBackup)
      {
        throw new Exception("Restore failed: the first backup in the chain is not a full backup");
      }

      // Only a full backup may replace the existing database.
      if (bk.DatabaseBackupType == CustomerDatabaseBackupType.FullDatabaseBackup)
        restore.ReplaceDatabase = true;

      BackupDeviceItem bdi = new BackupDeviceItem(bk.BackupUrl, DeviceType.Url);

      // Leave the database in the restoring state (NORECOVERY) until the last
      // backup in the chain has been applied.
      restore.NoRecovery = (index != count);

      restore.Devices.Add(bdi);
      restore.RelocateFiles.Add(new RelocateFile(database.Name, newDataFilePath));
      restore.RelocateFiles.Add(new RelocateFile(database.Name + "_Log", newLogFilePath));

      // Drop open connections so the restore can take exclusive access.
      server.KillAllProcesses(database.Name);
      restore.SqlRestore(server);
      restore.Devices.Remove(bdi);
      index++;
    }

    isRestoreDone = true;
  }
  catch (Exception ex)
  {
    if (logger != null)
      logger.Error(ex);
  }
  finally
  {
    server.ConnectionContext.Disconnect();
  }

  return isRestoreDone;
}

------End of Code Snippet----------------------------
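
The following is a minimal sketch (not part of the original solution) showing how the restore method above might be driven from the same data providers used in the backup loop. The lookup methods GetCustomer, GetServer, GetDatabase, and GetDatabaseBackupHistory are assumed names for illustration only; substitute whatever accessors your data layer actually exposes, and replace the GUID placeholders with real values.

------Start of Code Snippet----------------------------

// Hypothetical driver for RestoreDatabase, resolving the IDs passed to the
// Restore-CustomerDatabase command. Provider method names are assumptions.
Guid customerId = new Guid("<customerguid>");   // from -CustomerId
Guid serverId = new Guid("<serverguid>");       // from -ServerId
Guid databaseId = new Guid("<databaseguid>");   // from -DatabaseId

var dataProvider = new CustomerDataProvider(ServiceConfigrationHelper.GetConnectionString(), ServiceConfigrationHelper.GetEncriptionCertSubjectName());
var serverDataProvider = new ServerDataProvider(ServiceConfigrationHelper.GetConnectionString(), ServiceConfigrationHelper.GetEncriptionCertSubjectName());

var customer = dataProvider.GetCustomer(customerId);
var sqlserver = serverDataProvider.GetServer(serverId);
var database = serverDataProvider.GetDatabase(databaseId);

// Replay the backup chain oldest-first; RestoreDatabase recovers the database
// after the final backup in the chain has been applied.
var backups = serverDataProvider.GetDatabaseBackupHistory(database.SqlDatabaseId);
bool restored = RestoreDatabase(customer, sqlserver, database, backups);

------End of Code Snippet----------------------------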

Considerations

Conducting backups or restores using either a manual or an automated approach is enabled by leveraging the CSP APIs identified above. A manual backup or restore is fairly straightforward; however, the automated approach requires additional considerations in the following scenarios. In a dedicated scenario, where a dedicated server exists, all databases are backed up to a unique storage account in a unique customer's subscription, whereas a unique customer may instead be provisioned in a clustered scenario that requires the use of a listener. In addition, Multitenant and Windows Azure Pack (WAP) scenarios increase the complexity of backup and restore, primarily because you must consider how to map each customer's databases to a unique storage container in what may be different subscriptions. WAP may also have a listener configured as part of its HA offering.

Dedicated Considerations for a Unique Customer

What considerations should CSP partners be aware of when provisioning their dedicated clients using CSP? The illustration below shows a single dedicated server with multiple DBs for a single customer that need to be backed up to Azure. If multiple dedicated servers are involved, the process simply repeats for each customer.

The following are key considerations partners should be aware of when delivering a dedicated solution using our guidance.

  • Partners need to be aware that each database identified for a unique customer needs to be associated (i.e. mapped) in the CSV file provided. This ensures that we properly associate each DB to a customer so that the right Azure subscription is used for backups\restores. The format needed is identified in Appendix A below.
  • Each identified SQL Server hosted in a partner's environment will need to be identified in the Server List provided in our solution.
  • Each customer's DB backup interval setting needs to be identified. Note: Only Full and Partial backups will be supported in this version of our guidance.
  • Site-to-site connections are not needed; however, the server running the provided scripts and each SQL box being considered for backup\restore needs to be able to reach the internet.
  • The service account being used needs to be able to back up\restore databases on all identified SQL Servers.
  • Restoring a unique database can be done either automatically using our guidance or manually when needed.
  • If an existing subscription for a customer already exists, it will need to be associated with the partner's subscription.
  • If new CSP tenants are created manually, the partner will need to create an Azure subscription and provision storage.
  • Any customer that has already been created in the Partner Center Portal and has an existing Azure subscription associated but no storage provisioned will be required to set it up.
  • The partner's existing backup strategy can continue to be used in conjunction with Azure for a period of time; however, we recommend that once the backups have been validated and high confidence has been established, it be removed to minimize on-premises storage cost.
  • AlwaysOn Considerations – Standalone implementations of SQL Server cannot use AlwaysOn, since AlwaysOn requires at minimum two SQL Servers set up in a Windows Server Failover Cluster. Where a cluster does exist, the AlwaysOn Availability Groups active secondary capabilities include support for performing backup operations on secondary replicas. Backup operations can put significant strain on I/O and CPU (with backup compression). Offloading backups to a synchronized or synchronizing secondary replica keeps the resources on the server instance that hosts the primary replica free for your tier-1 workloads.
  • Use of contained databases is recommended to ensure everything the database needs can be restored without losing access to the database for any of the accounts included in the backups. Creating contained users enables the user to connect directly to the contained database. This is a very significant feature in high availability and disaster recovery scenarios such as an AlwaysOn solution: if the users are contained users, then in case of failover people can connect to the secondary without creating logins on the instance hosting the secondary. This provides an immediate benefit. A SQL database administrator can use the following Transact-SQL command to enable contained database authentication (a sketch of creating a contained user follows the command):

    sp_configure 'contained database authentication', 1;
    GO
    RECONFIGURE;
    GO
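
As a brief illustration, the following is a minimal sketch (not part of the original guidance) of creating a contained user from C# once contained database authentication is enabled and the database's containment is set to PARTIAL. The server, database, user name, and password are hypothetical placeholders.

------Start of Code Snippet----------------------------

using System.Data.SqlClient;

// Connect directly to the (hypothetical) contained database.
using (var connection = new SqlConnection("Server=SQL01;Database=CustomerDb;Integrated Security=true"))
{
    connection.Open();

    // CREATE USER ... WITH PASSWORD creates a contained user that authenticates
    // against the database itself, so no server-level login needs to be
    // recreated after a restore or an AlwaysOn failover.
    using (var command = new SqlCommand("CREATE USER AppUser WITH PASSWORD = 'P@ssw0rd!2014';", connection))
    {
        command.ExecuteNonQuery();
    }
}

------End of Code Snippet----------------------------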

  • Site-to-Site considerations – SQL Server 2014 does not require a Site-to-Site link to exist in order to successfully back up a set of on-premises databases. The only requirements are the identity (i.e. existing storage account) and secret key identified in the unique tenant subscription. A Site-to-Site link becomes key when CSP partners want to deliver SQL disaster recovery services by extending their existing customers' on-premises environment: building out a new SQL Server in the Azure subscription and then adding it to the existing on-premises cluster enables a third SQL Server in Azure for DR.
  • Performance considerations for large databases:
    • Strategy Considerations
      • How many hours a day do applications have to access the database? If there is a predictable off-peak period, we recommend that you schedule full database backups for that period.
      • How frequently are changes and updates likely to occur? If changes are frequent, consider the following:
        • Under the simple recovery model, consider scheduling differential backups between full database backups. A differential backup captures only the changes since the last full database backup.
        • Under the full recovery model, you should schedule frequent log backups. Scheduling differential backups between full backups can reduce restore time by reducing the number of log backups you have to restore after restoring the data.
      • How much disk space will a full database backup require?
        • Estimate the Size of a Full Database Backup - Before you implement a backup and restore strategy, you should estimate how much disk space a full database backup will use. The backup operation copies the data in the database to the backup file. The backup contains only the actual data in the database and not any unused space. Therefore, the backup is usually smaller than the database itself. You can estimate the size of a full database backup by using the sp_spaceused system stored procedure.
      • Database Backup Blob Files with Active Leases
        • When backing up to or restoring from Windows Azure storage, SQL Server acquires an infinite lease in order to lock exclusive access to the blob. When the backup or restore process completes successfully, the lease is released. If a backup or restore fails, the backup process attempts to clean up any invalid blob. However, if the backup fails due to prolonged or sustained network connectivity failure, the backup process may not be able to gain access to the blob, and the blob may remain orphaned. This means the blob cannot be written to or deleted until the lease is released. If the backup operation fails, it can result in a backup file that is not valid; the backup blob file might also have an active lease, preventing it from being deleted or overwritten. In order to delete or overwrite such blobs, the lease must first be broken (see the sketch after this list). If there are backup failures, we recommend that you clean up leases and delete blobs; you can also perform cleanup periodically as part of storage management tasks.
    • Performance considerations and Tips for Handling Large Database Files:
      • Consider setting backup compression as the default, but be advised that enabling this option significantly increases CPU usage, and the additional CPU consumed by the compression process might adversely impact concurrent operations. Therefore, you might want to create low-priority compressed backups in a session whose CPU usage is limited by the Resource Governor.
      • Consider setting up a Resource Governor which enables you to manage SQL Server workloads and resources by specifying limits on resource consumption by incoming requests. In the Resource Governor context, workload is a set of similarly sized queries or requests that can, and should be, treated as a single entity. This is not a requirement, but the more uniform the resource usage pattern of a workload is, the more benefit you are likely to derive from Resource Governor. Resource limits can be reconfigured in real time with minimal impact on workloads that are executing on the SQL resource.
      • Backing up to Windows Azure Blob Storage is flexible, reliable, and limitless off-site storage: Storing your backups on Windows Azure Blob service can be a convenient, flexible, and easy to access off-site option. Creating off-site storage for your SQL Server backups can be as easy as modifying your existing scripts/jobs. Off-site storage should typically be far enough from the production database location to prevent a single disaster that might impact both the off-site and production database locations. By choosing to geo replicate the Blob storage you have an extra layer of protection in the event of a disaster that could affect the whole region. In addition, backups are available from anywhere and at any time and can easily be accessed for restores.
      • The SQL Server backup operation uses multiple threads to optimize data transfer to Windows Azure Blob storage services. However, performance depends on various factors, such as network bandwidth and the size of the database. If you plan to back up large databases or filegroups from an on-premises SQL Server database, we recommend that you do some throughput testing first. Windows Azure storage SLAs have maximum processing times for blobs that you should take into consideration.
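
Where a failed backup has left an orphaned blob with an active lease, the lease must be broken before the blob can be deleted or overwritten, as noted in the list above. The following is a minimal sketch (not part of the original guidance) using the classic Azure Storage client library (Microsoft.WindowsAzure.Storage); the connection string, container, and blob names are hypothetical placeholders.

------Start of Code Snippet----------------------------

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

var account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=dbbackupstore;AccountKey=<key>");
var client = account.CreateCloudBlobClient();

// SQL Server backup-to-URL writes page blobs into the backup container.
var container = client.GetContainerReference("dbbackup");
var blob = container.GetPageBlobReference("CustomerDb_2015-01-01.bak");

// Break the lease immediately (zero break period), then remove the orphaned blob.
blob.BreakLease(TimeSpan.Zero);
blob.DeleteIfExists();

------End of Code Snippet----------------------------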

Multitenant Considerations

What considerations should partners be aware of when provisioning multitenant clients using CSP? Partners wanting to deliver a multi-tenant solution using the same server will need to ensure all customers have been created using our recommended automation guidance. The additional considerations that should be understood are listed below.

The following are key considerations partners should be aware of when delivering a multi-tenant solution using our automated solution.

  • Partners need to be aware that each database identified for a unique customer needs to be associated (i.e. mapped) in the CSV file provided. This ensures that we properly associate each DB to a customer so that the right Azure subscription is used for backups\restores. The format needed is identified in Appendix A below.
  • Each identified SQL Server hosted in a partner's environment will need to be identified to support the provided guidance.
  • Each customer's DB backup interval and type settings need to be identified. Note: Only Full and Partial backups are addressed in our guidance.
  • Site-to-site connections are not needed; however, the server running the provided scripts and each SQL box being considered for backup\restore needs to be able to reach the internet.
  • The service account being used needs to be able to back up\restore databases on all identified SQL Servers.
  • Restoring a unique database can be done either automatically using our guidance or manually when needed.
  • If an existing subscription for a customer already exists, it will need to be associated with the partner. Once the subscription is associated, the CSP partner needs to ensure that the required components exist and are updated as part of the guidance.
  • If new CSP tenants are created manually, the CSP partner will need to create an Azure subscription and provision storage.
  • If needed, multiple SQL instances can be implemented for each unique customer, up to the maximum allowed per edition. Example: SQL Server Enterprise Edition supports up to 64 instances, whereas other editions support a maximum of 16.
  • The partner's existing backup strategy can continue to be used in conjunction with Azure for a period of time; however, we recommend that once the backups have been validated and high confidence has been established, it be removed to minimize on-premises storage cost.
  • The AlwaysOn, contained database, Site-to-Site, and performance considerations described above under "Dedicated Considerations for a Unique Customer" apply equally to multi-tenant deployments.

High Availability Considerations

What considerations should CSP partners be aware of when provisioning High Availability solutions for their CSP clients? Partners with customers who currently subscribe to High Availability can leverage both a manual and a scripted solution. The illustration below identifies how a unique customer subscribing to HA can leverage the backup\restore scripted solution. The additional considerations that need to be identified are listed below.

The following are key considerations partners should be aware of when delivering a High Availability solution using our guidance.

  • Partners need to be aware that each database identified for a unique customer needs to be associated (i.e. mapped) in the CSV file provided. This ensures that we properly associate each DB to a customer so that the right Azure subscription is used for backups\restores. The format needed is identified in Appendix A below.
  • Each identified SQL Server hosted in a partner's environment will need to be identified in the Server List provided in our solution.
  • Each customer's DB backup interval setting needs to be identified. Note: Only Full and Partial backups will be supported in this version of our guidance.
  • Site-to-site connections are not needed; however, the server running the provided scripts and each SQL box being considered for backup\restore needs to be able to reach the internet.
  • The service account being used needs to be able to back up\restore databases on all identified SQL Servers.
  • Restoring a unique database can be done either automatically using our guidance or manually when needed.
  • If an existing subscription for a customer already exists, it will need to be associated with the partner's subscription.
  • If new CSP tenants are created manually, the partner will need to create an Azure subscription and provision storage.
  • Any customer that has already been created in the Partner Center Portal and has an existing Azure subscription associated but no storage will be required to set it up manually.
  • It is highly recommended that the existing HA solution be fully tested to ensure it is operational prior to integrating it with the Partner Center Portal.
  • Backup and Restore need to be set up using the HA listener.
  • The server that will be running the scripted solution needs to be added to the existing file share to ensure the $Server is able to access the cluster's quorum and backup shares.
  • The partner's existing backup strategy can continue to be used in conjunction with Azure for a period of time; however, we recommend that once the backups have been validated and high confidence has been established, it be removed to minimize on-premises storage cost.
  • The AlwaysOn, contained database, Site-to-Site, and performance considerations described above under "Dedicated Considerations for a Unique Customer" apply equally to High Availability deployments.

WAP Considerations

What considerations should CSP partners be aware of when provisioning Windows Azure Pack (WAP) for their CSP clients? Partners who currently provide a Windows Azure Pack solution to customers can leverage both a manual and a scripted solution. The illustration below identifies how a unique customer subscribing to Distributed WAP can leverage the backup\restore scripted solution. The additional considerations that need to be identified are listed below.

The following are key considerations partners should be aware of when delivering a Windows Azure Pack solution using our guidance.

  1. Partners need to be aware that each database identified for a unique customer needs to be associated (i.e. mapped) in the CSV file provided. This ensures that we properly associate each DB to a customer so that the right Azure subscription is used for backups\restores. The format needed is identified in Appendix A below.
  2. Each identified SQL Server hosted in a partner's environment will need to be identified in the Server List provided in our solution.
  3. Each customer's DB backup interval setting needs to be identified. Note: Only Full and Partial backups will be supported in this version of our guidance.
  4. Site-to-site connections are not needed; however, the server running the provided scripts and each SQL box being considered for backup\restore needs to be able to reach the internet.
  5. The account being used needs to be able to back up\restore databases on all identified SQL Servers.
  6. Restoring a unique database can be done either automatically using our guidance or manually when needed.
  7. If an existing subscription for a customer already exists, it will need to be associated with the partner's subscription.
  8. If new CSP tenants are created manually, the partner will need to create an Azure subscription and provision storage.
  9. Any customer that has already been created in the Partner Center Portal and has an existing Azure subscription associated but no storage will be required to set it up manually.
  10. It is highly recommended that the existing HA solution be fully tested to ensure it is operational prior to integrating it with the Partner Center Portal.
  11. Backup and Restore need to be set up using the HA listener configured for WAP.
  12. The server that will be running the scripted solution needs to be added to the existing file share to ensure the $Server is able to access the cluster's quorum and backup shares.
  13. The partner's existing backup strategy can continue to be used in conjunction with Azure for a period of time; however, we recommend that once the backups have been validated and high confidence has been established, it be removed to minimize on-premises storage cost.
  14. The AlwaysOn, contained database, Site-to-Site, and performance considerations described above under "Dedicated Considerations for a Unique Customer" apply equally to WAP deployments.

Appendix A – Additional References

  • Windows Azure PowerShell – Storage Commands: A blog post walking through a few PowerShell commands for working with Windows Azure Storage. (LINK)
  • Azure Storage: Azure Storage accounts information. (LINK)
  • Partner Network: Microsoft Cloud Solution Provider program. (LINK)
  • Blob Service REST API: The Blob service stores text and binary data as blobs in the cloud and offers three resources: the storage account, containers, and blobs. Within your storage account, containers provide a way to organize sets of blobs. (LINK)
  • AD Graph REST: AD Graph REST. (LINK)
  • Azure AD Graph REST API Reference: Lists the Azure AD entities and types exposed by Graph and the operations that can be performed on them with Graph. (LINK)
  • CREST APIs: Describes the conventions that the CREST API adopts. (LINK)
  • APIs: Automate customer management with our APIs. (LINK)
  • API Reference: Microsoft Partner Center API Reference. (LINK)
  • Graph Explorer: Used to validate Azure AD subscription information. (LINK)

Appendix B – Frequently Asked Questions

  1. How do I back up a customer's SQL Server DB in a unique geography to a local Azure subscription?

The following steps outline how a partner can elect to save a specific set of customer databases to a unique region. The configuration file below includes a "DefaultStorageAccountLocation" parameter that can be set to any of the existing Azure global regions, for example: Set-Setting DefaultStorageAccountLocation "North Europe".

Configuration settings file used during the initial setup (each command with its data, followed by a description):

  • Set-Setting MicrosoftId a6e8dc40-43fc-4461-9aef-1cef6603d7f3 – Microsoft ID
  • Set-Setting DefaultDomain mani100.onmicrosoft.com – Default domain used in the Partner Center Portal
  • Set-Setting AppId 75c681c3-ca80-4b40-8bec-78ec37c5222d – Assigned application ID
  • Set-Setting AppKey RrTTpKC3ZZvV0wWmbUzZ1U5jv6NSc+MxFRlk3pAdIko= – Assigned app key (will be encrypted)
  • Set-Setting storageresourceGroupName DBBackupRG – Default storage resource group name that will be created for new subscriptions
  • Set-Setting DefaultStorageAccountLocation "West US" – Identifies the target region a new customer's subscription will be created with or is set to
  • Set-Setting storageAccountType Standard_LRS – One of the following account types (case-sensitive):
    • Standard_LRS (Standard locally-redundant storage)
    • Standard_ZRS (Standard zone-redundant storage)
    • Standard_GRS (Standard geo-redundant storage)
    • Standard_RAGRS (Standard read-access geo-redundant storage)
    • Premium_LRS (Premium locally-redundant storage)
  • Set-Setting storageAccountLabel "CSP partner's database backup storageaccount" – Name of the storage account
  • Set-Setting storageAccountBlobContainerName DBBackup – Name of the container
  • Set-Setting ARMUser armServiceaccount@mani100.onmicrosoft.com – Name of the service account provisioned with the appropriate rights in the CSP partner's Azure subscription and Partner Center Portal
  • Set-Setting ARMPwd Lion@2012 – Service account password (will be encrypted)
  • Set-Setting armAadClientId 1950a258-227b-4e31-a9cf-717495945fc2 – Assigned Azure client ID

  2. Are the code snippets all that CSP partners need to deliver an automated solution?

    This document provides a strong starting point for helping CSP partners build an automated solution customized to their unique requirements. The details and goals outlined here focus on providing the guidance partners need to leverage the existing APIs available via the Partner Center and Azure.

Appendix C – Using a Sandbox environment

Since we realize that some customers may want to leverage a Sandbox environment for testing, the following guidance is provided.

Below you can see where to set up the Integration Sandbox.

  1. Once setup completes, the status shows as done. Select Done to return to the home screen.

  2. Log into the Sandbox you just created using the account you provisioned.

  3. You have completed setup of the Sandbox environment.

Select API settings and register an app for the Integration Sandbox environment.

  4. Capture all of the details for these API settings.

  5. Now allow the account to access the Graph API (the AD mechanism that allows you to connect to Azure AD). First, verify which users have access; if the account does not exist, create it by selecting User Management as shown.

Below you will see the permissions established for the "armServiceaccount". Select only Admin Agent, which grants the right level of permissions needed to use the guidance scripts in this document.

  6. Now we need to ensure the Partner Center Portal is linked to the CSP partner's Azure subscription. Go to the Azure Management Portal, log in, and select Active Directory from the left column. Then select New as shown below.

  7. Select "Active Directory", then "Directory", then "Custom Create", and finally "Create and manage a Microsoft Azure AD Directory".

  8. Since we are linking the Partner Center Portal, which has accounts and users, to the CSP partner's Windows Azure AD, select "Use existing directory". Then select the toggle box as shown below and click the check mark.

  9. You will then be asked to log into your Partner Center Portal using the credentials you already established.

  10. Next you will see that the organization has been found along with the global user in the Partner Center Portal; select "Continue" to link it.

  11. Next you will see the user has been successfully added; select sign out.

  12. Now log back into the Azure Management Portal and look at the Active Directory section; you should see the new Integration Sandbox domain.

  13. Select it to view the details.

  14. Select Applications to view the published apps.

  15. Select "Applications my company owns" from the drop-down.

  16. Select the application listed for the Partner Center API, view the details that come up, and select Configure.

  17. In the Configuration section, update the Sign-on URL as shown below.

  18. Update the read\write permissions as shown.

  19. Save the settings; you are then ready to continue implementing an automated solution as identified in section 3 above.

Appendix D – Backing up SQL Server 2014 with Encryption

Procedure and guidance for creating an encrypted backup are covered here:
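
While the detailed procedure is covered in the reference above, the following is a minimal sketch (not part of the original guidance) of an encrypted backup to URL using SMO on SQL Server 2014. The server, database, credential, certificate, and URL names are hypothetical placeholders, and the certificate must already exist in the master database.

------Start of Code Snippet----------------------------

using Microsoft.SqlServer.Management.Smo;

var server = new Server("SQL01");
var backup = new Backup
{
    Action = BackupActionType.Database,
    Database = "CustomerDb",
    CredentialName = "AzureBackupCredential" // SQL credential holding the storage account name and key
};
backup.Devices.AddDevice("https://dbbackupstore.blob.core.windows.net/dbbackup/CustomerDb_encrypted.bak", DeviceType.Url);

// Encrypt the backup with AES-256 using a server certificate.
backup.EncryptionOption = new BackupEncryptionOptions(
    BackupEncryptionAlgorithm.Aes256,
    BackupEncryptionCertType.ServerCertificate,
    "BackupEncryptionCert");

backup.SqlBackup(server);

------End of Code Snippet----------------------------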