
SQL Server Database Export/Import operation in MS Azure

 

10 August 2017

The beauty of Azure is that it has multiple ways to do the same thing. Imagine the scenario: you need to copy a database between subscriptions. With the SQL Server Import/Export procedure, you can do that either from the UI or with command-line tools (like SqlPackage).

In the Azure portal UI, you go to the SQL database you want to export, where you have an Export option available (circled in red below). It will export the database to the selected Blob storage (you need to have a storage account created prior to this operation in the same subscription).

In the case of the same subscription but different SQL Server instances (or if your Azure subscriptions belong to the same tenant), you can import this database (the actual *.bacpac file) using the same storage account, as it will be visible to all of them. If these are two completely different subscriptions, but you still need to export/import this database, you may need to create a storage account in the second subscription and import the file from there. To do so, go to the SQL Server instance, and you will see the “Import database” option as highlighted below:

To import the database, you need to select the file (*.bacpac) from the storage account and choose the pricing tier (along with the collation, admin name, and password).

TIP: When you download the database backup file (*.bacpac) from the storage account, it will appear as a zipped archive. No worries about that: a bacpac file really is a zip archive.
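For example, you can rename the file and inspect its contents with standard tooling (a minimal PowerShell sketch; the file name is a placeholder):

Code:

# A .bacpac really is a zip archive, so renaming it is enough to unpack it
Rename-Item .\mydb.bacpac mydb.zip
Expand-Archive .\mydb.zip -DestinationPath .\mydb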

There is nothing wrong with this approach, but the import procedure takes quite a long time (in my experience) even for small databases, and much longer for larger ones. The import process is hidden from the user, and the only message you will receive is that the “Request is submitted to import database”:

You can track it to some extent via the “Operations” tile on the same page below (called “Import/Export history”), but it doesn’t update in real time. The error messages (if you get any) are not that informative either:

If you need to perform the same task quicker, automate it in a script (using PowerShell or the Azure CLI), or add more visibility into the process, the right way to go is the SqlPackage command-line utility.

As stated in the documentation, to export/import using the SqlPackage utility, you use the relevant parameters and properties. The utility is already on your machine if you have installed SQL Server Management Studio (a recent version) or the Visual Studio tools for SQL Server. You can also install it directly from the MS website here.

The command to Export database I have used is quite simple:

Code:

SqlPackage.exe /a:Export /tf:<filename>.bacpac /scs:"Data Source=<server name>.database.windows.net;Initial Catalog=<database name to export>; User id=<user id>; Password=<password>"

TIP: You can find the SqlPackage utility under the following path: “C:\Program Files (x86)\Microsoft SQL Server\<version>\DAC\bin”. If you run the utility from there, the exported file will be dropped into the same directory. The better way is to add SqlPackage.exe to Path in the Environment variables, so you can call it from anywhere on the command line.
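If you prefer the Path approach, here is a minimal PowerShell sketch (it assumes the version folder is 140, i.e. SQL Server 2017; adjust the version segment to match your installation):

Code:

# Append the DAC bin folder to the user-level Path variable
$dacBin = "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin"
$userPath = [Environment]::GetEnvironmentVariable("Path", "User")
[Environment]::SetEnvironmentVariable("Path", "$userPath;$dacBin", "User")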

Import operation is as simple as export. Please find the code below:

Code:

SqlPackage.exe /a:Import /tcs:"Data Source=<destination server name>.database.windows.net;Initial Catalog=<db name>;User Id=<server admin>;Password=<admin password>" /sf:<file name>.bacpac /p:DatabaseEdition=Standard /p:DatabaseServiceObjective=S1

It is worth mentioning that the *.bacpac file will be picked up from the same directory where SqlPackage.exe lies (unless you used the tip above). “User Id” and “Password” should be the SQL Server admin’s credentials in Azure. DatabaseEdition corresponds to the SQL Database editions (the full list of possible values: Basic|Standard|Premium|Default); the same parameters can be found in the documentation mentioned above. DatabaseServiceObjective corresponds to the SQL Database pricing tiers. Possible values are: Basic, S0–S3 (Standard tier), P1/P2/P4/P6/P11/P15 (Premium tier), and PRS1/PRS2/PRS4/PRS6 (IO- and compute-intensive instances, still in preview).
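For example, to import straight into a Premium P2 database, the same command would look like this (same placeholder values as above):

Code:

SqlPackage.exe /a:Import /tcs:"Data Source=<destination server name>.database.windows.net;Initial Catalog=<db name>;User Id=<server admin>;Password=<admin password>" /sf:<file name>.bacpac /p:DatabaseEdition=Premium /p:DatabaseServiceObjective=P2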

As mentioned before, this way is quicker and gives you more control and visibility over the process. The error messages are more informative in this case as well:

So, using these two simple commands, you can export/import a database within the same or across different Azure subscriptions without the overhead, automate the process if necessary, or use SqlPackage as part of a script for other tasks.
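As a starting point for such automation, the two commands can be chained in a small PowerShell script (a minimal sketch; the server and database names are hypothetical, and SqlPackage.exe is assumed to be on Path):

Code:

# Export from the source server, then import into the target one
$bacpac = "mydb.bacpac"
SqlPackage.exe /a:Export /tf:$bacpac /scs:"Data Source=source-server.database.windows.net;Initial Catalog=mydb;User Id=<user>;Password=<password>"
SqlPackage.exe /a:Import /tcs:"Data Source=target-server.database.windows.net;Initial Catalog=mydb;User Id=<user>;Password=<password>" /sf:$bacpac /p:DatabaseEdition=Standard /p:DatabaseServiceObjective=S1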

 

Creating a Service Principal for VSTS endpoint

 

29 June 2017

Imagine a situation where you have to manage multiple Azure subscriptions (that belong to different tenants), not only for your organization but also for your company’s clients. In the software development process, you can host VSTS (Visual Studio Team Services) on your organization’s Azure account and deploy to customers’ subscriptions (which is quite a widespread use case). Subscription management in Azure is designed in such a way that you cannot have two similar products assigned to your account. For instance, you cannot have two Pay-as-you-go subscriptions assigned to your account (or two BizSpark or Enterprise subscriptions, etc.). Resources from different subscriptions can be assigned to you, but you can’t manage them directly from your account, as they belong to different tenants. What if you need to build a delivery pipeline in VSTS and don’t see a target subscription (to deploy your application to) on the list? Well, it could be a problem, but before calling Microsoft’s support, let’s see what we can do about it.

Fortunately, you can create a custom service connection by clicking ‘Manage’ next to the Azure subscriptions drop-down (see the screenshot above). When you go to ‘New Service Endpoint’ -> ‘Azure Resource Manager’, you will see the familiar list of subscriptions assigned to your account:

The dialog suggests: ‘If your subscription is not listed above, or your account is not backed by Azure Active Directory, or to specify an existing Service Principal, use the full version of the endpoint dialog.’ See the screenshot below:

The dialog seems complicated at a glance, with many fields to fill out. Let’s see what we can do.

Let's start filling this form out :)

  • Connection name is the simplest – put any suitable name here
  • Environment – leave the default value (unless you’re in China, Germany, or US Gov)
  • Subscription ID – the ID of the target subscription (the one you deploy to). Go to Azure -> Subscriptions, then select and copy the subscription ID from there. Or, you can use the Azure CLI and enter:
azure login
azure account list

Copy the ID:

  • Subscription Name – sits right next to the ID (see the previous step)
  • Tenant ID – go to Azure -> Active Directory -> Properties blade and copy ‘Directory ID’ from there. Alternatively, as long as you’re already logged in to the Azure CLI, you can use the following command:
azure account show
  • Service Principal Client ID – this one is a bit more complicated. You won’t find this value anywhere; the only way is to create a Service Principal that will be assigned to this endpoint. To do so, we need the Azure CLI (or PowerShell):
azure ad sp create -n <app> -p <password>

Where <app> is an arbitrary app name (it doesn’t really matter) and <password> is the password for the Service Principal (which serves as the Service Principal Key at the same time). This command will create an SP:

The value you need is ‘Service Principal Names’. The last step is to assign the Service Principal to a role. Using the Service Principal Name, execute the following CLI command to assign the SP to the Contributor role:

azure role assignment create --spn 990ffeff-0016-4535-809c-79db18336db4 -o Contributor

Once you are done with this, click the ‘Verify Connection’ link on the form, and once it is verified, you will be able to use this endpoint to connect to the client’s subscription.
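For reference, here is the whole CLI flow from this post collected in one place (classic Azure CLI 1.0 syntax; the placeholders stay placeholders):

azure login
azure account list                          # copy the target Subscription ID and Name
azure account show                          # copy the Tenant ID (Directory ID)
azure ad sp create -n <app> -p <password>   # note the 'Service Principal Names' value
azure role assignment create --spn <service principal name> -o Contributor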

 

What I learned at Microsoft Build 2017 - Part 1

 

17 May 2017

In this topic, I want to share what I learned at MS Build and explain the key takeaways from this event. I won’t be focusing on the keynotes much, as you can watch the relevant videos on Channel 9, but the key message, as Microsoft’s VP Scott Guthrie put it, was: “The success of your solution on the Azure platform is our primary goal!” That means Microsoft keeps concentrating its efforts around Azure services… but hold on, there is much more than that!

It is rare that I am so excited about a conference. I was surprised and even baffled by some of the new releases and news Microsoft had thoroughly prepared to present. Let’s go through everything one by one, so I can make sure I didn’t forget anything important. First and foremost, MS finally released the long-awaited Visual Studio for Mac (release date: 8 May 2017). Developers finally received a familiar instrument for building solutions on the Mac, using .NET Core and Xamarin (or Unity), but the tooling, compared to the same IDE for Windows, still requires some work. Especially disappointing is the absence of Docker support.

Continuing the topic of development tools, Microsoft announced a new Cloud Shell in Azure (a command line) that works with your Azure account right from the browser. Every session is in sync with a $Home directory stored in Azure, which, in turn, makes it possible to access files, VMs, and other artifacts you deal with via the UI. PowerShell is not yet supported but is coming soon, although you can sign up for a private preview here. The Azure app for iOS is also available in the App Store. I gave it five stars, as it is pretty neat and quick (the command line is not yet available there, though), and it was demonstrated at the conference by Corey Sanders (see “Azure Compute – new features and roadmap” on Channel 9).

Infrastructure

According to Scott Hanselman, MS has announced 38 regions/data centers around the globe (and nine countries where disaster recovery is possible because of multiple regions: USA, Canada, UK, France, Germany, India, China, Japan, Australia). With cross-region disaster recovery (automatic DR to another region, i.e. Azure to Azure), we can finally take advantage of multiple regions within the same country. This feature is coming soon.

So, to summarize, we are going to have in-country DR (nine countries, as mentioned above), a multi-instance SLA (99.95%), and a single-instance SLA (99.99%). In comparison to other clouds, Microsoft is far ahead.

The Instance Metadata Service is another new feature on the Azure platform. It exposes an endpoint so you can handle upcoming maintenance events, bootstrap a VM with an identity, and obtain the VM context in a fully programmatic way. It also provides the status of all running instances.
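For example, from inside a VM you can query the endpoint with a plain HTTP call (the 169.254.169.254 address is the service’s fixed non-routable IP; the api-version value below is an assumption and may differ by the time you try it):

curl -H "Metadata:true" "http://169.254.169.254/metadata/instance?api-version=2017-04-02"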

Containers

There were lots of topics at the conference pertaining to containers overall and their interaction with OS components and Azure services in particular. Microsoft Azure Container Service provides Docker tooling and API support along with provisioning of Kubernetes, DC/OS, and Docker Swarm. I have inserted the links for those who don’t know what this is all about. In short, these are all container orchestration, deployment automation, and scaling tools, based on Mesos DC/OS, Kubernetes, or Docker Swarm, that provide the best Docker experience. Apart from this, Microsoft has on-boarded the Deis team to assist on Kubernetes (and on the Helm tool in particular, which simplifies the management of pre-configured Kubernetes resources). See the demo by Corey Sanders (I have provided the most interesting links below).
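As an illustration, provisioning a Kubernetes cluster through Azure Container Service boils down to a couple of Azure CLI 2.0 commands (a sketch; the resource group and cluster names are made up):

az group create --name myResourceGroup --location westus
az acs create --orchestrator-type kubernetes --resource-group myResourceGroup --name myK8sCluster --generate-ssh-keys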

Service Fabric

Another interesting topic is Azure Service Fabric. There is nothing new in Azure-based microservices as such, except the fact that you can now deploy them in containers (surprise-surprise). Azure Service Fabric now offers stateful and stateless services (.NET and Java APIs on Windows Server and Linux) that can be deployed to Azure, Azure Stack, on-premises, to OpenStack, and even to AWS. The application (Music Store) Corey showed in this presentation can be found on GitHub, along with explanations of how to run it in Docker for Windows.

Azure Batch

This service has been around for a while. It offers job scheduling as a service first and foremost, and the other platform offerings become much easier to build with Azure Batch underneath. It allows you to focus on the major questions: what infrastructure you want, when/where, and at what price. Microsoft has announced low-priority Batch VMs where you pay 80% less, and, sure enough, they are deeply integrated with job scheduling (currently available in preview). Another aspect of Batch is the ability to take advantage of rendering capabilities (using compute instances to render images/videos/3D models). A Senior Product Manager from Autodesk shared his experience in this regard.
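A rough sketch of trying the low-priority VMs with the Azure CLI 2.0 (the account, pool, and image values are hypothetical, and the exact flags may shift while the feature is in preview):

az batch account create --name mybatchaccount --resource-group myResourceGroup --location westus
az batch account login --name mybatchaccount --resource-group myResourceGroup
az batch pool create --id mypool --vm-size Standard_A1 --target-low-priority-nodes 4 --image Canonical:UbuntuServer:16.04-LTS --node-agent-sku-id "batch.node.ubuntu 16.04"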

Serverless

The Functions side of the house :) For those who don’t know what this is about, it is an event-based way of programming that doesn’t require managing infrastructure (and has a billing model based on usage/demand/call/execution, etc.). Microsoft’s offering here is Azure Functions (I highly recommend watching the recorded presentation by Eduardo Laureano, PM on Azure Functions; I will talk about all this stuff a bit later). Microsoft announced integration with Visual Studio that day (a lot of stuff was announced on 8 May), which simplifies editing and debugging of the functions. A new feature in this set of technologies is the Azure Functions Runtime. What’s cool about it is that it allows you to install the runtime and use Functions in a container outside the Azure environment. You can download a preview version of it from here. It can run anywhere containers work, or on top of Azure Stack (which is, in turn, Azure functionality outside the Azure environment).

Azure Managed Applications and Service Catalog

Package and seal Azure functionality for other folks to use. It means that an application under development can be packaged and deployed by someone else using the ARM experience, or sold/exposed to a third party, etc. What is interesting about it: the catalog uses an enhanced security model, so an app in the catalog can be exposed or pre-deployed to a certain identity/account (with or without approval) and has a configurable set of parameters required for a successful deployment. Right now it’s in preview (just like many other things in this topic), and the link to the documentation leads to a general Azure website (I was unable to try it myself due to the missing documentation).

Compute

Microsoft has expanded the set of compute instance sizes (by adding six new ones):

  • F – Compute intensive
  • NC – NVIDIA GPUs K80 compute
  • NV – NVIDIA GPUs M60 Visualization
  • H – Fastest CPU IB Connectivity (very high throughput between instances)
  • L – Large SSDs
  • SAP – SAP Large instances

Four more were announced during the conference:

  • ND (P40) – NVIDIA Gear towards deep learning, scale-out computation
  • NCv2 (P100s) – computational SKU (scale-out compute with InfiniBand connectivity)
  • Dv3 (SSD storage, fast CPU) and Ev3 (high memory) – next generation of existing SKUs with nested virtualization (meaning that they’ll be based on Windows 2016 and support VMs inside VMs deployed into the cloud)
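To request one of these SKUs explicitly, you simply pass the size name when creating a VM (a sketch with the Azure CLI 2.0; the names are placeholders, and NC-series availability depends on the region):

az vm create --resource-group myResourceGroup --name myGpuVm --image UbuntuLTS --size Standard_NC6 --generate-ssh-keys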

Azure Logic Apps

Azure Logic Apps is not a new service, but it is obviously not well enough known. It takes advantage of existing BizTalk Servers (which connect to Logic Apps through an adapter) to connect to SaaS services and invoke logic apps. It makes it easier to connect to trading partners using EDI standards and B2B capabilities.

Microsoft PowerApps

This service has been around for a while, and I know people who use it heavily in development. There is nothing new in the way it works (aside from the growing number of different connectors, which is 140 now), but it allows you to build an app very quickly and doesn’t require much mobile development experience, which was perfectly demonstrated by the guys (Jeff Hollan, Eduardo Laureano) on the very last day of the conference. I think you will enjoy the presentation as much as I did. Please watch “How to build serverless business applications with Azure Functions and Logic Apps for PowerApps” for more info on the topic (the link to the Channel 9 video is below).

There is still a lot to tell, so I had to split this topic into two parts. I will publish the second one shortly. Stay tuned :)

 

Mentioned Videos on Channel 9

Azure Compute – new features and roadmap (Corey Sanders)

How to build serverless business applications with Azure Functions and Logic Apps for PowerApps (Jeff Hollan, Eduardo Laureano)