
SQL Server Database Export/Import operation in MS Azure


10 August 2017

The beauty of Azure is that it has multiple ways to do the same thing. Imagine the scenario: you need to copy a database between subscriptions. With the SQL Server Import/Export procedure, you can do that either from the UI or with command-line tools (like SqlPackage).

In the Azure UI, you go to the SQL database you want to export, where the Export option is available (circled in red below). It exports the database to the selected Blob storage (you need a storage account created prior to this operation in the same subscription).

In the case of the same subscription but different SQL Server instances (or if your Azure subscriptions belong to the same tenant), you can import this database (the actual *.bacpac file) using the same storage account, as it is visible to all of them. If the two subscriptions are completely different but you still need to export/import the database, you may need to create a storage account in the second subscription and import from there. To do so, go to the SQL Server instance, where you will see the “Import database” option as highlighted below:

To import a database, you select the file (*.bacpac) from the storage account and choose a pricing tier (along with collation, admin name, and password).

TIP: When you download the database backup file (*.bacpac) from the storage account, it will appear as a zipped archive. No worries: a bacpac file really is a zip archive.

There is nothing wrong with this approach, but the import procedure takes quite a long time (in my experience) even for small databases, and much longer for larger ones. The import process is hidden from the user, and the only message you will receive is “Request is submitted to import database”:

You can track it to some extent by clicking the “Operations” tile on the same page (the “Import/Export history” blade), but it doesn’t update in real time. The error messages (if you get any) are not very informative either:

If you need to perform the same task faster, automate it in a script (using PowerShell or the Azure CLI), or get more visibility into the process, the right way to go is the SqlPackage command-line utility.
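As a rough illustration of the scripted route, here is a minimal Azure CLI sketch. All resource names (the resource group, server, database, and storage URI) are hypothetical placeholders, and the exact `az sql db export` parameters should be checked against the current Azure CLI documentation. The script only prints the command so it is safe to inspect before running for real:

```shell
#!/bin/sh
# Hypothetical names -- replace with your own resource group, server, and database.
RESOURCE_GROUP="MyGroup"
SERVER="my-server"
DATABASE="MyDb"
STORAGE_URI="https://mystorageaccount.blob.core.windows.net/backups/MyDb.bacpac"

# Build the export command; 'az sql db export' writes a .bacpac to Blob storage.
CMD="az sql db export --resource-group $RESOURCE_GROUP --server $SERVER \
  --name $DATABASE --admin-user <admin> --admin-password <password> \
  --storage-key-type StorageAccessKey --storage-key <key> --storage-uri $STORAGE_URI"

# Print the command instead of executing it, so the sketch can be reviewed anywhere.
echo "$CMD"
```

Drop the `echo` (and fill in real credentials) to actually run the export as part of a pipeline.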

As stated in the documentation, to export/import using the SqlPackage utility, you use the relevant parameters and properties. The utility is already on your machine if you have installed SQL Server Management Studio (a recent version) or the Visual Studio tools for SQL Server. You can also install it directly from the MS website here.

The command to Export database I have used is quite simple:


SqlPackage.exe /a:Export /tf:<filename>.bacpac /scs:"Data Source=<server name>.database.windows.net;Initial Catalog=<database name to export>;User Id=<user id>;Password=<password>"

TIP: You can find the SqlPackage utility at “C:\Program Files (x86)\Microsoft SQL Server\<version>\DAC\bin”. If you run the utility from there, the exported file will be dropped into the same directory. A better way is to add SqlPackage.exe to Path in the Environment variables, so you can call it from anywhere on the command line.

The import operation is as simple as the export. Please find the command below:


SqlPackage.exe /a:Import /tcs:"Data Source=<destination server name>.database.windows.net;Initial Catalog=<db name>;User Id=<server admin>;Password=<admin password>" /sf:<file name>.bacpac /p:DatabaseEdition=Standard /p:DatabaseServiceObjective=S1

It is worth mentioning that the *.bacpac will be picked up from the same directory where SqlPackage.exe lies (unless you used the tip above). “User Id” and “Password” are the SQL Server admin’s credentials in Azure. DatabaseEdition corresponds to the SQL database editions (the full list of possible values: Basic|Standard|Premium|Default); the same parameters can be found in the documentation mentioned above. DatabaseServiceObjective corresponds to the SQL database pricing tiers. Possible values are: Basic, S0-S3 (Standard tier), P1/P2/P4/P6/P11/P15 (Premium tier), and PRS1-2/4/6 (IO- and compute-intensive instances, still in preview).
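To make the whole flow concrete, the export and import commands above can be strung together into one small migration script. The server and database names below are hypothetical placeholders; the script only prints both commands so you can review them before running anything:

```shell
#!/bin/sh
# Placeholder values -- substitute your real servers, database, and credentials.
SRC="Data Source=source-server.database.windows.net;Initial Catalog=MyDb;User Id=<user>;Password=<password>"
DST="Data Source=dest-server.database.windows.net;Initial Catalog=MyDb;User Id=<user>;Password=<password>"
BACPAC="MyDb.bacpac"

# Step 1: export the source database to a local .bacpac file.
EXPORT_CMD="SqlPackage.exe /a:Export /tf:$BACPAC /scs:\"$SRC\""

# Step 2: import the .bacpac into the destination server at the Standard S1 tier.
IMPORT_CMD="SqlPackage.exe /a:Import /sf:$BACPAC /tcs:\"$DST\" /p:DatabaseEdition=Standard /p:DatabaseServiceObjective=S1"

# Print the two commands for review instead of executing them.
echo "$EXPORT_CMD"
echo "$IMPORT_CMD"
```

Replace the `echo` lines with direct invocations once the placeholders are filled in; SqlPackage must be on the Path (see the tip above) for the calls to resolve.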

As mentioned before, this way is quicker and gives you more control and visibility over the process. The error messages are more informative in this case as well:

So, using these two simple commands, you can export/import a database within the same or across different Azure subscriptions without the overhead, automate the process if necessary, or use SqlPackage as part of a script for other tasks.


What I learned at Microsoft Build 2017 - Part 1


17 May 2017


In this post, I want to share what I learned at MS Build and the key takeaways from the event. I won’t focus much on the keynotes, as you can watch the relevant videos on Channel 9, but the key message, as Microsoft’s VP Scott Guthrie put it, was: “The success of your solution on the Azure platform is our primary goal!” That means Microsoft keeps concentrating its efforts around Azure services… but hold on, there is much more than that!

It is rare that I am so excited about a conference. I was surprised and even baffled by some of the new releases and news Microsoft had thoroughly prepared for this conference. Let’s go through everything one by one, so I can make sure I didn’t forget anything important. First and foremost, MS finally released the long-awaited Visual Studio for Mac (release date: 8 May 2017). Finally, developers have a familiar instrument for developing solutions on the Mac using .NET Core and Xamarin (or Unity), but the tooling, compared to the same IDE for Windows, still requires some work. Especially disappointing is the absence of Docker support.

Continuing on development instruments, Microsoft announced a new Cloud Shell in Azure: a command line that works with your Azure account right from the browser. Every session is synced with a $Home directory stored in Azure, which in turn gives you access to files, VMs, and other artifacts you deal with via the UI. PowerShell is not yet supported and is coming soon, although you can sign up for a private preview here. The app for iOS is also available in the App Store. I gave it five stars, as it is pretty neat and quick (the command line is not yet available there, though); it was demonstrated at the conference by Corey Sanders (see “Azure Compute – new features and roadmap” on Channel 9).


According to Scott Hanselman, MS has announced 38 regions/data centers around the globe (and nine countries where disaster recovery is possible thanks to multiple regions: USA, Canada, UK, France, Germany, India, China, Japan, Australia). With Cross-Region Disaster Recovery (automatic DR to another region, i.e. Azure to Azure), we can finally take advantage of multiple regions within a country. This feature is coming soon.

So, to summarize, we are going to have In-Country DR (in the nine countries mentioned above), a Multi-Instance SLA (99.95%), and a Single-Instance SLA (99.99%). In comparison to other clouds, Microsoft is far ahead.

The Instance Metadata Service is another new feature on the Azure platform. It exposes an endpoint so you can manage upcoming maintenance events, bootstrap a VM with an identity, and get VM context in a fully programmatic way. It also provides the status of all running instances.
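For the curious, the endpoint lives at a link-local address inside the VM. A minimal sketch follows; the address and api-version are taken from the public Instance Metadata Service documentation, and since the endpoint only answers from within an Azure VM, the command is printed here rather than executed:

```shell
#!/bin/sh
# The IMDS endpoint is link-local: it is only reachable from inside an Azure VM.
IMDS="http://169.254.169.254/metadata/instance?api-version=2017-04-02"

# The Metadata:true header is mandatory -- requests without it are rejected.
CMD="curl -s -H \"Metadata:true\" \"$IMDS\""

# Print the command; run it on an Azure VM to get the instance JSON back.
echo "$CMD"
```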


There were lots of sessions at the conference on containers in general and their interaction with OS components and Azure services in particular. Azure Container Service provides Docker tooling and API support along with provisioning of Kubernetes, DC/OS, and Docker Swarm. I have inserted links for those who don’t know what this is all about. In short, these are container orchestration, deployment automation, and scaling tools, based on Mesos DC/OS, Kubernetes, or Docker Swarm, that provide the best Docker experience. Apart from this, Microsoft has on-boarded the Deis team to assist with Kubernetes (and the Helm tool in particular, which simplifies the management of pre-configured Kubernetes resources). See the demo by Corey Sanders (I have provided the most interesting links below).

Service Fabric

Another interesting topic is Azure Service Fabric. There is nothing new in Azure-based microservices, except the fact that you can now deploy them in containers (surprise, surprise). Azure Service Fabric now supports stateful and stateless services (.NET and Java APIs on Windows Server and Linux) that can be deployed in Azure, Azure Stack, on-premises, in OpenStack, and even AWS. The application (Music Store) shown by Corey in this presentation can be found on GitHub, along with explanations on how to run it in Docker for Windows.

Azure Batch

This service has been around for a while. It offers job scheduling as a service in the first place, and the other platform offerings become very easy with the Azure Batch service. It allows you to focus on the major questions, i.e. what infrastructure you want, when/where, and at what price. Microsoft has announced low-priority Batch VMs, where you pay 80% less; naturally, they are deeply integrated with job scheduling (currently available in preview). Another aspect of Batch is being able to take advantage of rendering capabilities (using compute instances to render images/videos/3D models). A Senior Product Manager from Autodesk shared his experience in this regard.


Now for the Functions side of the house :) For those who don’t know what this is about, it is an event-based way of programming that doesn’t require managing infrastructure (and has a billing model based on usage/demand/calls/executions, etc.). Microsoft’s offering here is Azure Functions (I highly recommend watching the recorded presentation by Eduardo Laureano, PM on Azure Functions; I will talk about all this a bit later). Microsoft announced the integration with Visual Studio on that day (a lot of stuff was announced on 8 May), which simplifies editing and debugging of functions. A new feature in this set of technologies is the Azure Functions Runtime. What’s cool about it is that it allows you to install the runtime and use Functions in a container outside the Azure environment. You can download a preview version of it from here. It can run anywhere containers work, or on top of Azure Stack (which is, in turn, Azure functionality outside the Azure environment).

Azure Managed Applications and Service Catalog

These let you package and seal Azure functionality for other folks to use. It means the application being developed can be packaged and deployed by someone else using the ARM experience, or sold/exposed to a third party, etc. What is interesting is that the catalog uses an enhanced security model, so an app in the catalog can be exposed or pre-deployed to a certain identity/account (with or without approval) and has a configurable set of parameters required for successful deployment. Right now it’s in preview (just like many other things in this post), and the documentation link leads to a general Azure website (I was unable to try it myself due to the missing documentation).


Microsoft has expanded the set of compute instance sizes (by adding six new ones):

  • F – Compute intensive
  • NC – NVIDIA GPUs K80 compute
  • NV – NVIDIA GPUs M60 Visualization
  • H – Fastest CPU IB Connectivity (very high throughput between instances)
  • L – Large SSDs
  • SAP – SAP Large instances

Four more were announced during the conference:

  • ND (P40) – NVIDIA Gear towards deep learning, scale-out computation
  • NCv2 (P100s) – computational SKU (scale-out compute with InfiniBand connectivity)
  • Dv3 (SSD storage, fast CPU) and Ev3 (high memory) – next generation of existing SKUs with nested virtualization (meaning that they’ll be based on Windows 2016 and support VMs inside VMs deployed into the cloud)

Azure Logic Apps

Azure Logic Apps is not a new service, but it is obviously not well known enough. It takes advantage of existing BizTalk Servers (which connect to Logic Apps through an adapter) to connect to SaaS and invoke Logic Apps. It makes it easier to connect to trading partners using EDI standards and B2B capabilities.

Microsoft PowerApps

This service has been around for a while, and I know people who use it heavily in development. There is nothing new in the way it works (aside from the growing number of different adapters, which is 140 now), but it allows you to build an app very quickly and doesn’t require much mobile development experience, which was perfectly demonstrated by the guys (Jeff Hollan, Eduardo Laureano) on the very last day of the conference. I think you will enjoy the presentation as much as I did. Please watch “How to build serverless business applications with Azure Functions and Logic Apps for PowerApps” to get more info on the topic (the link to the Channel 9 video is below).

There is still a lot to tell, so I had to split this topic into two parts. I will publish the second one shortly. Stay tuned :)


Mentioned Videos on Channel 9

Azure Compute – new features and roadmap (Corey Sanders)

How to build serverless business applications with Azure Functions and Logic Apps for PowerApps (Jeff Hollan, Eduardo Laureano)

C# 4.0: An Overview of the New Features


25 January 2011

C# is an evolving language. This article covers the new features added in C# 4.0 (as well as a short description of the innovations in previous versions), which together improve code readability and give you the ability to use LINQ to query dynamic data sources. The examples in this overview show ways to improve the coding model for reading data from various sources, including text files, and how to combine data from a COM-Interop source in LINQ-to-Objects queries.

C# 2.0
•    Generics – a new language feature that lets you create generic algorithms
•    Iterators (the yield keyword) – a mechanism that simplifies writing enumerators (implementations of the IEnumerable interface)
•    Anonymous methods (the delegate keyword) – the ability to initialize delegates with method bodies
•    Partial types (classes) – the ability to split the code of a single class across several files
•    Simplified delegate initialization syntax

C# 3.0
•    The select, from, and where keywords for querying SQL, XML, collections, and so on (Language Integrated Query, or LINQ)
•    Object initializers (initializing an object together with its properties)
•    Lambda expressions (=>)
•    Expression trees (lambda expressions can be represented as a data structure)
•    Anonymous types (var)
•    Extension methods – adding a method to an existing class by applying the this keyword to the first parameter of a static method
•    Automatic properties

Important! C# 3.0 is compatible with C# 2.0 at the level of the generated MSIL code; the language improvements are purely syntactic and are applied at compile time.

Improvements in C# 4.0
Microsoft divided the new features into the following four categories:
•    Named and optional parameters
•    Dynamic binding
•    Covariance and contravariance
•    Improved COM interop

For the examples below, let’s assume the following classes are defined:

public class Person
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class Customer : Person
{
    public int CustomerId { get; set; }
    public void Process() { /* ... */ }
}

public class SalesRep : Person
{
    public int SalesRepId { get; set; }
    public void SellStuff() { /* ... */ }
}

Named and optional parameters
Support for optional parameters lets a method declare default values, so you don’t have to specify them on every call. This is very convenient when you have several overloaded versions of a method. The “ancient” approach looked like this:

public void Process( string data )
{
    Process( data, false );
}

public void Process( string data, bool ignoreWS )
{
    Process( data, ignoreWS, null );
}

public void Process( string data, bool ignoreWS, ArrayList moreData )
{
    // Our code
}
The reason for overloading the Process method this way is to avoid having to include “false, null” in every call to the three-parameter version. Now suppose that in 99% of cases the ‘moreData’ array list will not be provided; from that point of view it feels “wrong” to pass null so many times.
The following three calls are equivalent:

Process( "foo", false, null );
Process( "foo", false );
Process( "foo");

The new approach looks like this:

public void Process( string data, bool ignoreWS = false, ArrayList moreData = null )
{
    // Our code
}

// Note: the string parameter data must always be passed, i.e. it has no default value
Now we have a single Process method instead of three, but the three call forms we looked at above still work (and are equivalent):

ArrayList myArrayList = new ArrayList();
Process( "foo" ); // Valid call
Process( "foo", true ); // Valid call
Process( "foo", false, myArrayList ); // Valid call
Process( "foo", myArrayList ); // Invalid! See the description of named parameters below...

Named parameters
In the previous example we saw that the following call is invalid:

Process( "foo", myArrayList );
Let’s figure out why: if the boolean ignoreWS parameter is optional, why can’t we simply omit it? Readability and maintainability of the code are important reasons, but more importantly, with such an “approach” there would be no way to tell which parameter you are trying to pass: if you have, say, two parameters of the same type, the compiler cannot know which one you mean. Imagine a method with ten optional parameters to which you pass only a single ArrayList. Since ArrayList derives from object, IList, and IEnumerable, it becomes impossible to determine how it should be used. Now the difficulty should be clear. Named parameters, in turn, provide an elegant solution:
ArrayList myArrayList = new ArrayList();
Process( "foo", true ); // Valid syntax; moreData is omitted
Process( "foo", true, myArrayList ); // Also valid syntax
Process( "foo", moreData: myArrayList ); // Valid syntax; ignoreWS is omitted
Process( "foo", moreData: myArrayList, ignoreWS: false ); // Valid, but not recommended...
Keep in mind that if a parameter has a default value, it can be omitted.

Dynamic binding

Perhaps the biggest innovation in C# 4.0 is dynamic binding. The implementation of this feature in C# was inspired by dynamic languages such as Python, Ruby, JavaScript, and Smalltalk. Dynamic binding defers binding (the process of resolving types and members) from compile time to run time. Although C# remains a predominantly statically typed language, variables of type “dynamic” are resolved via late binding. OK, let’s move from words to deeds :) I’m sure you have dealt with code like this:

public object GetCustomer()
{
    Customer cust = new Customer();
    return cust;
}

Customer cust = GetCustomer() as Customer;
if( cust != null )
    cust.FirstName = "foo";

Note that the GetCustomer method returns object.

Материалы взяты с сайта www.codeproject.com