
Custom Continuous Integration Testing Service

Evgeni Kostadinov looks at the steps to create a custom continuous integration testing service, and how it improves his development efforts.


The DevOps Zone is brought to you in partnership with Sonatype Nexus. The Nexus Suite helps scale your DevOps delivery with continuous component intelligence integrated into development tools, including Eclipse, IntelliJ, Jenkins, Bamboo, SonarQube and more. Schedule a demo today

As a QA engineer, my daily responsibility is to keep the Continuous Integration process stable. For this task I've used TFS builds, Jenkins, and some other in-house tools. Nobody required me to develop my own custom Continuous Integration testing service - but I always try to deliver the extra mile of effort.

I decided to use my PowerShell scripting knowledge and try to do all of this myself. The result is around 100 lines of code that do everything expected from a mid-range CI server. Someone more experienced could probably optimize it down to fewer lines - but as it is, it does its job just fine (at least for my case).


Let's first take a look at the typical tasks that every CI service has to take care of: get the latest sources, build the project, deploy it, run the tests, and publish the results.

In my case I've divided these tasks into the sub-steps below. But let me start from the beginning: for one reason or another you may not have access to the company's server (restricted by 'domain knowledge', management, or security decisions), and can only get the project files by a simple copy from a remote server to your PC. My workaround was the SharePoint intranet site and the VCS.

Step 1: getServerFiles

#create and use new web client obj
$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
$file = "C:\Users\user\Desktop\ProjectCore.zip"
$link = ""
$wc.DownloadFile($link, $file)
It is very easy to get used to the power of the shell: it can use any of the .NET classes and combine them with Windows tasks. Here I create a new WebClient object and use it to sign in, navigate, download, and save the Project.zip file on my workstation.

Step 2: unzipFiles

#create new shell COM object
$shell_app = New-Object -com shell.application
$zip_file = $shell_app.NameSpace("C:\Users\user\Desktop\ProjectCore.zip")
$destination = $shell_app.NameSpace("C:\Users\user\Desktop")
#unzip files using the Windows built-in functionality
$destination.CopyHere($zip_file.Items())
Here I use the shell's built-in unzip functionality - I told you the shell is useful. It's as simple as that.

Step 3: copyProjectFiles

Copy-Item C:\Users\user\Desktop\Project.Core C:\ProjectFolder -Force -Recurse
My experience shows this is good practice - keep all project files in a separate folder. I use -Force to copy-and-replace and -Recurse to get all files in all subfolders.

Step 4: copyWebConfig 

Copy-Item C:\ProjectFolder\Web.config C:\Project\Project.Core\Source\Project\Web.config -Force
Sometimes the database connection string will need extra adjustment - in my case, to log in to SQL Server 2008 with a SysAdmin account in order to read and write.
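That adjustment itself can be scripted. A minimal sketch, assuming the connection string sits in a connectionString attribute of the Web.config (the function name and regex below are my own illustration, not part of the original service):

```powershell
# swap the connectionString attribute in a config file
# (a sketch - adapt the regex to your actual Web.config layout)
function Set-ConnectionString {
    param([string]$ConfigPath, [string]$NewConnectionString)
    $config = Get-Content $ConfigPath -Raw
    $config = $config -replace 'connectionString="[^"]*"', ('connectionString="' + $NewConnectionString + '"')
    Set-Content $ConfigPath $config
}
```

For example: Set-ConnectionString 'C:\ProjectFolder\Web.config' 'Data Source=.;Initial Catalog=ProjectDb;User ID=sa;Password=...' - the server, catalog, and credentials here are placeholders.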

Step 5: rebuildProject

#navigate to dir and start msbuild
cd "C:\Windows\Microsoft.NET\Framework64\v4.0.30319"
.\MSBuild.exe C:\Project\Project.Core\Source\Project\Project.csproj
The most important thing in this step is the 'navigation': first to find MSBuild.exe, and then to point it at the .csproj file.

Step 6: startWebServer

#navigate to dir and start the web server
cd "C:\Program Files (x86)\Common Files\microsoft shared\DevServer\11.0"
.\WebDev.WebServer40.EXE /port:6464 /path:"C:\Project\Project.Core\Source\Project"

Here you have several options: deploy your project to IIS - in a company environment you'll most likely need security policies and rights for that - or use this workaround and start the WebDev host. Actually, Visual Studio does the same thing. It is critical to point it at the right project path (the folder containing the .sln/.csproj).
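If you do get the rights to deploy to IIS, that registration can also be scripted; a sketch using appcmd.exe, where the site name and port are examples mirroring the WebDev call above:

```powershell
# register and start the project as an IIS site (requires admin rights;
# the site name, port, and path are examples)
$appcmd = "$env:windir\System32\inetsrv\appcmd.exe"
& $appcmd add site /name:"ProjectCI" /bindings:"http/*:6464:" /physicalPath:"C:\Project\Project.Core\Source\Project"
& $appcmd start site /site.name:"ProjectCI"
```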

Step 7: deleteOldFiles

#clear old files
Remove-Item C:\Users\user\Desktop\Project.Core -Recurse -Force
Remove-Item C:\Users\user\Desktop\ProjectCore.zip -Force
As I said at the beginning of this article, it is a good idea to keep your files and project 'clean'.

Step 8: startTestConsole

cd 'c:\Project\TestProject\packages\NUnit.Runners.2.6.3\tools'
Start-Sleep -seconds 1
$console = Get-ChildItem .\nunit-console-x86.exe
$path = $console.FullName
$new = convert-path $path
$final = Resolve-Path $new
Start-Sleep -seconds 1
& $final /config=Release 'C:\Project\TestProject\TestProjectAcceptance\bin\Debug\TestProjectAcceptance.dll' > 'C:\Project\testsOutput.txt'
I think this is the core of the service - everything revolves around this console. I use the NUnit testing framework in combination with SpecFlow. Along with all the other benefits, I also get the NUnit-Console command line, and the results of this mix are quite impressive. In this code snippet I'm doing some extra path computations, because Windows by default uses relative directory paths and sometimes can't even 'see' its own files. The output of the test run is redirected to a file; later I'll use it to generate reports.

Step 9: copyOutput

#create browser obj and open the converter page (URL left out, as above)
$ie = New-Object -com "InternetExplorer.Application"
$ie.Navigate("")
#$ie.visible = $true
#wait to load
Start-Sleep -s 3
$doc = $ie.Document
#find elements
$radioBtn = $doc.getElementById("pTagbrTag")
$oldInput = $doc.getElementById("oldText")
$newInput = $doc.getElementById("newCode")
$radioBtn.checked = $true
#input test results
$oldInput.value = Get-Content 'C:\Project\testsOutput.txt' | Out-String
#find and click the convert button
$list = $doc.getElementsByTagName('input')
foreach($i in $list) {
    if($i.className -eq "frmbtn") { $i.click() }
}
Start-Sleep -s 3
#save the new file
$newInput.value > 'C:\Project\TestResults.html'
Remove-Variable ie

Step 10: generateReportHtml

#navigate to results.xml dir
cd 'C:\Project\TestProject\packages\NUnit.Runners.2.6.3\tools'
$xml = Get-Content .\TestResult.xml | Out-String
#start and use a browser with the web service to convert .xml to .html
#(URL left out, as above)
$ie = New-Object -com "InternetExplorer.Application"
$ie.Navigate("")
#$ie.visible = $true
Start-Sleep -s 3
$doc = $ie.Document
$textArea = $doc.getElementById("23")
$textArea.value = $xml
#load the input and run the conversion
$load = $doc.getElementById("18")
$load.click()
Start-Sleep -s 3
$radioBtn = $doc.getElementById("191")
$radioBtn.checked = $true
$output = $doc.getElementById("22")
$convertBtn = $doc.getElementById("19")
$convertBtn.click()
Start-Sleep -s 3
$css = Get-Content 'C:\Project\css-styles.txt'
$script = Get-Content 'C:\Project\script-color.txt'
$final = 'C:\Project\TestsResult.html'
$output.innerHTML > $final
#add css styles
Add-Content $final $css
#add js func
Add-Content $final $script
#add test data report to history file
$date = Get-Date
$dateStr = "{0:dd_MM_yyyy HH_mm_ss}" -f $date
$historyFile = 'C:\Project\test-results-history\TestsResult' + $dateStr + '.html'
$finalContent = Get-Content $final
$finalContent > $historyFile
Remove-Variable ie

And as 'sugar', the NUnit runner produces an .xml file with the test execution results. So my work is just to feed it to a corresponding web service that transforms the .xml into an .html file. I then dynamically append .css (styles) and .js to make the result file more 'business readable'.
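A possible alternative, in case the web service is unavailable: .NET can run the .xml-to-.html transform locally, provided you supply an XSLT stylesheet for the NUnit result schema. A sketch (the function name is my own):

```powershell
# local alternative to the online converter: transform an .xml report to .html
# using an XSLT stylesheet you supply
function Convert-XmlToHtml {
    param([string]$XmlPath, [string]$XsltPath, [string]$HtmlPath)
    $xslt = New-Object System.Xml.Xsl.XslCompiledTransform
    $xslt.Load($XsltPath)
    $xslt.Transform($XmlPath, $HtmlPath)
}
```

For example: Convert-XmlToHtml '.\TestResult.xml' 'C:\Project\nunit-to-html.xslt' 'C:\Project\TestsResult.html' - where nunit-to-html.xslt is a hypothetical stylesheet of your own.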

Step 11: sendMail

#SMTP server name
$smtpServer = ""
#Create a mail object
$msg = New-Object Net.Mail.MailMessage
#Create an SMTP server object
$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
#Email structure
$msg.From = "user@company.com"
$msg.ReplyTo = "user@company.com"
$recipient1 = "user2@company.com"
$msg.To.Add($recipient1)
$attachment = "C:\Project\TestResults.html"
$attachment2 = "C:\Project\TestsResult.html"
$msg.Subject = "Continuous integration testing"
$msg.Body = "Hi all, this is the result file from the continuous integration testing of the Project, done at 03 a.m. this morning. This is an auto-generated message - for more details please contact me. Best regards, me"
#Attach the test result files
$msg.Attachments.Add((New-Object Net.Mail.Attachment($attachment)))
$msg.Attachments.Add((New-Object Net.Mail.Attachment($attachment2)))
#Send the email
$smtp.Send($msg)

For the SMTP (Simple Mail Transfer Protocol) server address you can just ask any SysAdmin; it is most likely already used by your company for internal mailing.

And this is it - at least the basic core. Maybe you'll want to spend time considering how to create and use a custom test-results analysis tool. In my experience, the best performance is gained by dividing the steps into two separate .ps1 files: the first for steps 1-7 and the second for steps 8-11. You'll also want to keep a test-results history.

Here is a code snippet for the project files' version check:

#check source folder version
$source = Get-Item C:\Users\user\Desktop\Project.Core\Source\
$currWrite = $source.LastWriteTime
$dateFile = 'C:\Project\lastModifiedSourceFolder.txt'
$currWriteStr = "{0:dd_MM_yyyy HH:mm:ss}" -f [datetime]$currWrite
$lastWriteStr = (Get-Content $dateFile | Out-String).Trim()
#rebuild and retest only when the source folder has changed
if($currWriteStr -ne $lastWriteStr) {
    start powershell 'c:\Project\step7-deleteOldFiles.ps1'
    $currWriteStr > $dateFile
    Write-Host 'call step 1'
    Write-Host $currWriteStr
    Write-Host $lastWriteStr
}

As you've already figured out, this sub-step can be used to control your test execution, or be extended to run smoke packs on every check-in.

And finally, you just need to configure Windows Scheduled Tasks to run your PowerShell .ps1 scripts. It can be simplified to: wake up the PC - log on - start the CI script. And yes, the power of the shell will be very useful here too.
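The scheduling itself is one command; a sketch using schtasks.exe, where the task name and script path are examples (the 03:00 start matches the mail sent each morning):

```powershell
# register a daily 03:00 run of the CI entry script
# (task name and script path are examples)
schtasks /Create /TN "ProjectCI" /SC DAILY /ST 03:00 /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Project\ci-step1.ps1"
```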


devops, qa, acceptance tests, continuous integration, testing
