Monday, 26 October 2015

Backup SQL Server

BACKUP IS IMPORTANT. I always bug people about how backup is super important. You should back up everything, even the backups themselves. This sounds somewhat crazy, but backups get corrupted or lost. This is not one of those posts explaining why backup is important (just google it).

Prologue

I have a site with a fair amount of users that runs on an on-premises Microsoft SQL Server 2014. The server is hosted on a VM with NetApp storage behind it that is fault tolerant and also backed up. But I want an additional safety measure: a local DB backup, plus Azure Blob storage for off-site backup. Note: there is a rather big discussion around cloud security, so to protect my backups I also want to encrypt them.

There are quite a few solutions for this on the market, but most of them cost money, or I would have to combine a couple of them to get the same functionality. So I started building my own solution. In the end I want a console app that is fully configurable, which I will put in Windows Task Scheduler to run every night. The full source code can be found on GitHub, or a compiled version can be downloaded from the URL.

Building the app

As always, let's open Visual Studio and create an empty Console App; I'll call mine simply "MSSQLbackup". I have added a separate Settings.cs file just to hold the properties from the config file.
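A minimal sketch of what such a Settings.cs could look like (the property and key names here are my own placeholders, not the ones from the actual project; adjust them to your App.config):

```csharp
// Settings.cs — thin wrapper over App.config <appSettings>.
// Requires a reference to System.Configuration.
using System.Configuration;

public static class Settings
{
    public static string SqlServer => ConfigurationManager.AppSettings["SqlServer"];
    public static string BackupFolder => ConfigurationManager.AppSettings["BackupFolder"];
    public static string ZipPassword => ConfigurationManager.AppSettings["ZipPassword"];
    public static string AzureConnectionString => ConfigurationManager.AppSettings["AzureConnectionString"];
    public static string AzureContainer => ConfigurationManager.AppSettings["AzureContainer"];
}
```

Keeping all configuration behind one static class makes it trivial to later swap App.config for a JSON file or command-line arguments.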

Let's connect to our SQL server. The easiest way I found is to use the Microsoft.SqlServer.Management.Smo reference. This is the official namespace containing the core functions of SQL Server. The reference can be tricky to find: look in your SQL Server installation folder, or if you don't have SQL Server installed, find the NuGet package.
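Connecting with SMO is a one-liner once the reference is in place; a minimal sketch (the server name is a placeholder, and this assumes Windows integrated authentication):

```csharp
using System;
using Microsoft.SqlServer.Management.Smo;

// "." means the default local instance; swap in your server name
// (SMO uses the current Windows credentials by default).
var server = new Server(".");
Console.WriteLine("Connected to: " + server.Name);
```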

We will have a separate file for each database, all grouped under a single day. Basically we will store all backups in one folder, and later on ZIP and encrypt it. Let's create the folder and, in the same batch, go over all included databases on the server, calling a function to back up each one.
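Sketched out, that loop could look like this (assuming the `server` SMO object from the connection step, and a `BackupDatabase` helper, which is a name I made up for illustration):

```csharp
using System;
using System.IO;
using Microsoft.SqlServer.Management.Smo;

// Create a dated folder and back up every non-system database into it.
string folder = Path.Combine(@"C:\Backups", DateTime.Now.ToString("yyyy-MM-dd"));
Directory.CreateDirectory(folder);

foreach (Database db in server.Databases)
{
    if (db.IsSystemObject) continue;   // skip master, model, msdb, tempdb
    BackupDatabase(server, db.Name, folder);
}
```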

Backup DB

Backup functionality is also provided by the SQL SMO library.
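A minimal sketch of such a backup helper using the SMO `Backup` class (method name is my own; error handling omitted):

```csharp
using System.IO;
using Microsoft.SqlServer.Management.Smo;

// Back up one database to a .bak file in the given folder.
static void BackupDatabase(Server server, string dbName, string folder)
{
    var backup = new Backup
    {
        Action = BackupActionType.Database,   // full database backup
        Database = dbName,
        Initialize = true                     // overwrite an existing set in the file
    };
    backup.Devices.AddDevice(Path.Combine(folder, dbName + ".bak"),
                             DeviceType.File);
    backup.SqlBackup(server);                 // runs synchronously
}
```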

ZIP and encryption

Now that we have all the .bak files in a single directory, we want to ZIP and encrypt the whole directory. We don't want to reinvent the wheel here, so we'll use the nifty SharpZipLib library, which I can always recommend as a fast, reliable ZIP solution for C#.
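A sketch of zipping the folder with a password using SharpZipLib's `ZipOutputStream` (the AES key size on the entry switches encryption from classic ZipCrypto to AES; the method name is mine):

```csharp
using System.IO;
using ICSharpCode.SharpZipLib.Zip;

// Zip all .bak files in a folder into one password-protected archive.
static void ZipFolder(string folder, string zipPath, string password)
{
    using (var zip = new ZipOutputStream(File.Create(zipPath)))
    {
        zip.SetLevel(9);               // best compression
        zip.Password = password;       // enables encryption for the entries

        foreach (string file in Directory.GetFiles(folder, "*.bak"))
        {
            var entry = new ZipEntry(Path.GetFileName(file))
            {
                AESKeySize = 256       // AES-256 instead of classic ZipCrypto
            };
            zip.PutNextEntry(entry);
            using (var fs = File.OpenRead(file))
                fs.CopyTo(zip);
            zip.CloseEntry();
        }
    }
}
```

Note that AES-encrypted entries can't be opened by the built-in Windows Explorer extractor; use 7-Zip or similar.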

Upload to Azure

All that is left now is to upload the archive to an Azure Storage account.
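With the WindowsAzure.Storage NuGet package, the upload is a few lines; a sketch (method and container names are placeholders):

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Upload the encrypted archive to an Azure Blob container.
static void UploadToAzure(string zipPath, string connectionString, string containerName)
{
    var account = CloudStorageAccount.Parse(connectionString);
    var client = account.CreateCloudBlobClient();
    var container = client.GetContainerReference(containerName);
    container.CreateIfNotExists();     // first run creates the container

    var blob = container.GetBlockBlobReference(Path.GetFileName(zipPath));
    using (var fs = File.OpenRead(zipPath))
        blob.UploadFromStream(fs);
}
```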

Summary

This solution can be expanded in a lot of directions, for example with different archive targets besides Azure. It has served its purpose in production for ~6 months now. If you have any thoughts for improvements, leave your feedback in the comments.

Saturday, 3 October 2015

Migration from Azure SQL

After almost 2 years we have decided to move our main SQL database from Azure SQL to our data center. The decision was made mostly for security reasons: we started to store sensitive customer data in our SQL database, and it was unacceptable for all that data to be in the cloud; German law also prohibits storing customer information outside Germany. To be fair, we had a pretty good experience with the Azure SQL service. The biggest pro is that you have all of the technology at your disposal without any maintenance or admin work, but the big con is that when some Azure service goes down (and they do!) you are powerless. We had all kinds of failures: Traffic Manager went down, Websites in multiple regions at once, etc. One thing to keep in mind if you are using cloud services for an important project: have an offline, independent backup service, because a lot can happen.

Migrating Azure SQL

Exporting an Azure SQL database is fairly simple, and I prefer to do it using SQL Server Management Studio. I don't need the transaction logs and history, so using a .bacpac is much more convenient for me. If you do need the transaction logs and history, you will have to find another migration method.
  1. Connect to your Azure SQL server
  2. Right click on the database you want to export 
  3. Go to Tasks -> Export Data-tier Application
  4. The wizard will guide you through the rest of the process and create a .bacpac file, which is a full database backup with the structure and all the data
  5. On your second server just reverse the process by using Tasks -> Import Data-tier Application
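The same Export/Import flow can also be scripted with the SqlPackage command-line tool, which is handy if you want to repeat the migration; a sketch with placeholder server, database, and file names:

```shell
# Export the Azure database to a .bacpac (names are placeholders):
SqlPackage.exe /Action:Export ^
  /SourceConnectionString:"Server=tcp:myserver.database.windows.net;Database=MyDb;User ID=myuser;Password=mypassword" ^
  /TargetFile:"C:\Backups\MyDb.bacpac"

# Import it on the on-premises server:
SqlPackage.exe /Action:Import ^
  /TargetConnectionString:"Server=.;Database=MyDb;Integrated Security=true" ^
  /SourceFile:"C:\Backups\MyDb.bacpac"
```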

When it doesn't work

As I stated before, sometimes The Cloud just doesn't work, and it always happens at the worst possible moment. My scenario began when EF Migrations failed to update the database schema but somehow still put the new record in the Migrations table. I needed to roll back or restore an old copy of the database.

For the failed attempts I tried:
  • Azure point in time restore, it failed with a generic message that the action cannot be completed?!
  • The export function on the current database (I needed it to sync the data later) failed in the same manner
  • Exporting using the previously described method got stuck on trying to export database schema
Since it was the middle of a work day with high traffic, I was pretty desperate, and the only thing that did work was copying the database using the old Azure portal.

Wednesday, 30 September 2015

Call URL from Windows Scheduler

When developing web applications, every now and then we have to make a scheduled task that periodically calls some URL. This could also be used for monitoring server uptime, but for that I would recommend a 3rd-party service.

Our task is to call a simple GET URL every day and, if possible, log each call (this can be extended to log the service response). Since there is no direct way of telling Windows Scheduler to call a web service, I have found two feasible solutions: a batch file and a PowerShell script.

Batch file

This is the simplest solution, but it has shortcomings. In short, we need to create a .bat file somewhere on the drive with the command "start {URL}" and tell Scheduler to run the file every day. Let's go over the process.
  1. For starters I would recommend having a folder named "Schedulers" directly in the root of the system drive. This will help you keep your scheduled jobs organized once you have more of them.
  2. In the designated folder create a new empty file: right click -> New -> Text Document, which creates a new .txt document
  3. In the document paste the code below, replacing the URL with your desired address
  4. Now rename the file extension to .bat; if you can't see file extensions, go to View and check "File name extensions"
  5. Run Scheduler by clicking the Windows button and typing Scheduler
  6. In the shown screen click on "Create basic task"
  7. Follow the instructions on the screen; when you get to Action, select "Start a Program" and browse to your bat script.
  8. That's it!!!
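The batch file itself is just the `start` command the steps above describe; the URL here is a placeholder:

```batch
@echo off
rem Replace the URL with your own endpoint.
rem The empty "" is the window title; without it, a quoted URL
rem would be treated as the title instead of the thing to open.
start "" "https://example.com/api/daily-task"
```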
Pros:
  • super simple to implement
  • no code (or coding knowledge) required
Cons:
  • opens browser on every run
  • hard to do any logging (logging the response is impossible)

PowerShell script

This is a far more flexible solution. PowerShell is a great tool that is fairly easy to learn and use, and for this tutorial you don't need any prior knowledge of it. In short, we are going to create a small PowerShell script which will be called by Scheduler every day and will also log all requests.

  1. As in the first part of the tutorial, I encourage you to have a folder named "Schedulers" for all scheduled programs.
  2. Open PowerShell ISE (Windows start -> type PowerShell ISE)
  3. Copy the following code to the file, replacing the URL and log file location for your needs
  4. Save the file in the desired location
  5. Open Scheduler. You can just go to Windows start and type Scheduler
  6. In Scheduler, again go through the same process to create a basic task, until you get to browsing for your script in the Action part.
  7. You have to put "powershell" in the Program/script option and add the arguments: -file "path_to_ps_file"; the interface should look like the image
  8. That's it!!!
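A minimal version of such a script, with placeholder URL and log path, could look like this (it calls the URL and appends a timestamped line to the log whether the call succeeds or fails):

```powershell
# call-url.ps1 — request the URL and append a timestamped line to a log file.
# Replace the URL and log path with your own.
$url = "https://example.com/api/daily-task"
$log = "C:\Schedulers\call-url.log"

try {
    $response = Invoke-WebRequest -Uri $url -UseBasicParsing
    Add-Content $log "$(Get-Date -Format s) OK $($response.StatusCode)"
}
catch {
    Add-Content $log "$(Get-Date -Format s) FAILED $($_.Exception.Message)"
}
```

`-UseBasicParsing` avoids the Internet Explorer engine dependency, which matters when the script runs under a service account.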
Pros:
  • much more flexible solution
  • doesn't open browser
Cons:
  • opens a PowerShell window on execution, but it closes once the script finishes


Monday, 14 September 2015

Download files from ASP.NET WebAPI service

Some time ago I started making all my websites AngularJS and WebAPI powered. This gives me more control than the traditional ASP.NET MVC approach (along with some more problems), but it also provides RESTful API access for other apps.

Below is the simplest controller for downloading files. Of course, I don't suggest using anything like this as-is: add try/catch blocks, and be careful how the file path is supplied to the controller.
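A minimal sketch of such a controller (controller name, route, and file path are placeholders; the path is hard-coded here on purpose, since accepting a raw client-supplied path would be a security hole):

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;

public class DownloadController : ApiController
{
    // GET api/download — streams a file back to the client.
    public HttpResponseMessage Get()
    {
        string path = @"C:\Data\report.pdf";   // placeholder path

        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StreamContent(File.OpenRead(path))
        };
        response.Content.Headers.ContentType =
            new MediaTypeHeaderValue("application/octet-stream");
        response.Content.Headers.ContentDisposition =
            new ContentDispositionHeaderValue("attachment")
            {
                FileName = Path.GetFileName(path)   // name the browser will suggest
            };
        return response;
    }
}
```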


Short explanation:
- the response should return HTTP status 200 to indicate that the request was carried out properly
- the HTTP header field Content-Disposition "suggests" a default file name to the browser when downloading; this is not part of the HTTP 1.1 standard, but it's widely used, and you can read all about it on Stackoverflow