Now Running on Windows Azure


I’ve taken a few days off from my academic research work to relax and explore some personal interests, one of which was .NET development. I’ve been working on a few tasks and projects that you’ll hear about shortly, but one of my plans was to return to regular blogging and update my blog engine to something more up to date. Part of this plan was to move my blog to a cloud hosting solution, preferably Microsoft Windows Azure. My blog was hosted on a dedicated Rackspace server run by a friend, but I thought it was finally time to try something different!

Older readers will remember that I’ve used several blog engines over the past six years, including various versions of Community Server, Graffiti CMS, and finally my own blog engine written with ASP.NET MVC, Behistun. I’ve been running the same Behistun codebase for almost two years, with only a minor upgrade from ASP.NET MVC 1.0 to 2.0. This period was the only calm time in my blog’s existence, as I used to adopt early builds of new technologies and blog engines regularly. Initially, I thought about using an open source blog engine or content management system to save myself from maintaining the code, but as I’ll explain later, I ended up upgrading Behistun and writing everything myself.

These days everyone is talking about cloud computing and moving applications to cloud solutions, and Windows Azure is one of the main options for many developers and companies. Although it’s not really necessary to host a blog in the cloud (at least for now), I thought it would be good practice to move mine to Windows Azure, as I was personally enthusiastic about it.

Orchard

Among the existing blog engines powered by ASP.NET MVC, Orchard seemed the most promising option, and it had the advantage of a custom deployment package for Windows Azure that has served some well-known sites. So I decided to give Orchard a try and move my blog to this content management system.

The biggest challenge was expected to be data migration, and I was lucky to find a good BlogML module for Orchard. Since I had a huge database, I had to write a custom application to export my data into smaller BlogML files and import them into Orchard one by one. This worked well, but when I tried to port my blog theme to Orchard, I ran into difficulties. Orchard has a good theme engine, but it wasn’t compatible with my expectations.
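
I won’t reproduce the full exporter here, but the chunking step can be sketched roughly as follows. This is a hypothetical helper (the `BlogMLSplitter` name and chunk size are mine, not the actual tool’s), assuming the standard BlogML layout of a `<blog>` root with a `<posts>` element containing `<post>` elements:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

static class BlogMLSplitter
{
    // The BlogML 2.0 namespace used by standard exports.
    static readonly XNamespace Ns = "http://www.blogml.com/2006/09/BlogML";

    // Split one large BlogML document into smaller documents of at most
    // chunkSize posts each, so they can be imported one at a time.
    public static List<XDocument> Split(XDocument source, int chunkSize)
    {
        var allPosts = source.Root.Element(Ns + "posts")
                                  .Elements(Ns + "post").ToList();
        var chunks = new List<XDocument>();

        for (int i = 0; i * chunkSize < allPosts.Count; i++)
        {
            // Deep-copy the document shell, then keep only this chunk's posts.
            XDocument chunk = new XDocument(source);
            XElement posts = chunk.Root.Element(Ns + "posts");
            posts.RemoveNodes();
            posts.Add(allPosts.Skip(i * chunkSize).Take(chunkSize));
            chunks.Add(chunk);
        }
        return chunks;
    }
}
```

Each resulting document can then be saved and fed to the import module separately.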

Ignoring this minor issue, I moved on to deploying my Orchard blog to Windows Azure, but this turned out to be a big challenge: not only couldn’t I get Orchard to work the way I needed, I also couldn’t move my static files to the new server.

In the end, I had to give up. I probably could have managed to keep using Orchard, but my main concerns were maintenance and future updates. So I decided to work on Behistun instead and see if I could move it to Windows Azure.

Updating Behistun for Windows Azure

The first step, and the most obvious one, was to update the Behistun codebase to work with Windows Azure. This was easy enough to complete quickly and produce a deployable package that ran successfully on Azure. Fortunately, I had designed Behistun with simplicity as the main goal, which helped during this process.

There were three bigger tasks to complete before I could move my blog to the cloud. First, I had to migrate the static content to Windows Azure Storage. Second, I had to replace all the old references to static content with the new Windows Azure Storage URIs. Third, I had to migrate my SQL Server database, with the updated data, to SQL Azure.

Migrating Static Content

One of the major challenges for an old blog like mine is the amount of static content (e.g., images and downloadable files). Since Azure doesn’t provide a traditional file system for hosted applications, I had to migrate all these files to Windows Azure Storage. I wrote a simple console application for this task: it walks the subfolders of a given path, creates a container on Windows Azure Blob Storage for each folder name, and uploads that folder’s files to it. The code is shown below:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

static void Main(string[] args)
{
    // Tracks containers that have already been uploaded, so the process
    // can resume after a failure without redoing everything.
    string storageFilePath = @"C:\AzureFileUploader\storage.txt";

    Console.Title = "Blog File Uploader to Windows Azure Storage";

    try
    {
        CloudStorageAccount cloudStorageAccount;
        CloudBlobClient blobClient;

        // Local development storage; switch to the commented line (with your
        // account name and key) for the production storage account.
        cloudStorageAccount = CloudStorageAccount.DevelopmentStorageAccount;
        //cloudStorageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=<youraccountname>;AccountKey=<youraccountkey>");

        blobClient = cloudStorageAccount.CreateCloudBlobClient();

        CloudBlobContainer blobContainer;
        BlobContainerPermissions containerPermissions;
        CloudBlob blob;

        // A generous timeout so large files don't fail the upload.
        blobClient.Timeout = new TimeSpan(1, 0, 0);

        string[] alreadyUploaded = File.ReadAllLines(storageFilePath);

        string[] directories = Directory.GetDirectories(@"C:\wwwroot\Storage\Images\Posts");
        foreach (string directoryName in directories)
        {
            DirectoryInfo directoryInfo = new DirectoryInfo(directoryName);
            string containerName = directoryInfo.Name;

            if (!alreadyUploaded.Contains(containerName))
            {
                Console.WriteLine(string.Format("Starting container creation {0}.", containerName));

                // Container names must be lowercase.
                blobContainer = blobClient.GetContainerReference(containerName.ToLower());
                blobContainer.CreateIfNotExist();
                Console.WriteLine(string.Format("Finished container creation {0}.", containerName));

                // Make the blobs publicly readable so they can be served directly.
                containerPermissions = new BlobContainerPermissions();
                containerPermissions.PublicAccess = BlobContainerPublicAccessType.Blob;
                blobContainer.SetPermissions(containerPermissions);

                FileInfo[] files = directoryInfo.GetFiles();
                foreach (FileInfo file in files)
                {
                    string fileName = file.Name;
                    string filePath = file.FullName;

                    blob = blobContainer.GetBlobReference(fileName);

                    Console.WriteLine(string.Format("Starting file upload {0}.", fileName));
                    blob.UploadFile(filePath);
                    Console.WriteLine(string.Format("File upload completed to blob {0} - {1}.", fileName, blob.Uri));
                }

                // Record the finished container so a restart can skip it.
                File.AppendAllLines(storageFilePath, new List<string> { containerName });
                Console.WriteLine("***********************************");
            }
        }
    }
    catch (StorageClientException e)
    {
        Console.WriteLine(string.Format("Storage client error: {0}", e.Message));
    }
    catch (Exception e)
    {
        Console.WriteLine(string.Format("Error: {0}", e.Message));
    }

    Console.ReadLine();
}

There are a few points to note about this code. First, it has two ways to construct the CloudStorageAccount: one for the local testing environment and another, commented out above, ready to be configured for the production Windows Azure Storage account. Second, a custom timeout is set to handle cases where a file may be large. Third, a text file stores the list of containers that have already been processed, so in the case of an exception I can start over without redoing everything. I implemented this feature because a few of my old folder names were exceptional cases: either longer than the maximum allowed for a container name, or containing an underscore (which is not allowed in Windows Azure Storage container names). I renamed those few folders manually to solve the problem, and also uploaded some specific folders manually using Azure Blob Studio.
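
The naming constraints that bit me can be checked up front. Here is a hypothetical helper (not part of the uploader above; the `ContainerNames` name is mine) encoding the documented blob container rules: 3–63 characters, only lowercase letters, digits, and hyphens, with hyphens neither leading, trailing, nor consecutive:

```csharp
using System.Text.RegularExpressions;

static class ContainerNames
{
    // First character must be a letter or digit; every later character is
    // a letter, digit, or a hyphen immediately followed by a letter/digit
    // (so no trailing or doubled hyphens). Total length: 3-63 characters.
    static readonly Regex Valid =
        new Regex(@"^[a-z0-9](?:[a-z0-9]|-(?=[a-z0-9])){2,62}$");

    public static bool IsValid(string name)
    {
        return name != null && Valid.IsMatch(name);
    }
}
```

Running folder names through a check like this before calling GetContainerReference would have flagged my underscore and over-length cases immediately instead of failing mid-upload.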

Output

Replacing References

With the data uploaded, I moved on to replacing the references in the post bodies in my local SQL Server database to reflect the changes. This was very easy indeed using LINQ to SQL!

static void Main(string[] args)
{
    Console.Title = "Blog Image Reference Replacer";

    // DBLoaderDataContext is the LINQ to SQL context generated for the blog database.
    DBLoaderDataContext context = new DBLoaderDataContext();
    var posts = from p in context.Posts select p;

    foreach (Post post in posts)
    {
        // Point the old self-hosted URLs at the new blob storage locations.
        string body = post.Body;
        body = body.Replace("http://nayyeri.net/storage/images/posts", "http://keyvan.blob.core.windows.net");
        body = body.Replace("http://nayyeri.net/storage/downloads", "http://keyvan.blob.core.windows.net/downloads");

        post.Body = body;

        Console.WriteLine("Editing post {0} completed.", post.Title);

        context.SubmitChanges();
    }

    Console.ReadLine();
}

Output

Migrating Data

With my data updated in the local SQL Server instance, I used the SQL Azure Migration Wizard to export it to SQL Azure. This was also an easy step.

SQL Azure Migration Wizard

Custom Domain Name Setting

The worst part of the migration process was trying to point my custom domain name at my Windows Azure hosting domain. Unfortunately, the features currently provided by Windows Azure don’t allow you to point your bare (root) domain name at your Azure address; you need a domain prefix (e.g., www.) to point at it. This was a little hard for me, since I was one of the big advocates of dropping www in the .NET community and had maintained such URLs on this blog for the past six years.

Anyway, I had to give up on this preference in order to use Windows Azure, so I forwarded my main domain to www.nayyeri.net, which points to my Azure address via a CNAME record. I hope the Windows Azure team adds better features in this area very soon.
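
In zone-file terms, the setup looks roughly like this (the cloudapp.net host name below is a placeholder, not my actual hosted service address):

```
; Only a prefixed name (www) can alias the Azure hosted service via CNAME;
; the bare domain has to be forwarded (HTTP redirect) to www instead.
www.nayyeri.net.    IN    CNAME    myservice.cloudapp.net.
```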

Conclusion

My blog is now running on Windows Azure, which means I should expect minimal downtime in the future. These days it’s still early to expect a blog to be hosted in the cloud, but I was passionate about doing this, just like the old days when I kept everything up with the latest changes in technologies and platforms. Unfortunately, being busy with research and courses hasn’t allowed me to do that since I came to the United States.

The migration process ate up my Sunday, but I think it was worth it: I now have an up-to-date blog engine built on current technologies. The downside of an old blog that has passed through several blog engines is that the content follows different patterns, which can be problematic when writing converter and migration tools. I had to handle all these cases either with automated code or manually.

I’m planning to upgrade the Behistun codebase to ASP.NET MVC 3.0 in the near future, add some new features, and deploy it to Windows Azure to take advantage of everything the cloud provides. Contrary to what I had stated before, I don’t think I’m going to share the Behistun source code with the public; over the past year and a half my opinions about certain open source projects have changed, and I’m no longer willing to release an open source blog engine or CMS.



Published at DZone with permission of Keyvan Nayyeri. See the original article here.
