Creating a Docker Container from a Scala/SBT project in TeamCity for Use with Octopus Deploy

I considered creating a series of blog posts about my journey into Scala titled "How Much I Hate Scala/SBT, Part XX", but I decided not to be that bitter. The language isn't bad; it's the ecosystem around it that sucks. I'm more likely to find the source code for something from a Google search than a Stack Overflow post or even documentation.

So here's where I started: I'll assume your Scala project already has a TeamCity build with an SBT step running compile, and you're ready to move to a Docker container.

The key thing here for me is the version number. I use a custom variable called "Version" that I usually set to something like "1.0.%build.counter%". For my .NET projects I use the AssemblyInfo patcher and that becomes the package version; with Docker containers you can use the image tag for the version. Octopus Deploy needs the version on the tag to work effectively.

If you look at how TeamCity's SBT runner actually runs the command internally, you will see something like the following:

[11:49:38][Step 2/2] Starting: /usr/java/jdk1.8.0_121/bin/java -Dagent.home.dir=/opt/buildagent -Dagent.name=tt-xxx-2004 -Dagent.ownPort=9090 -Dagent.work.dir=/opt/buildagent/work -Dbuild.number=1.0.12 -Dbuild.vcs.number=411695cf560acb5b7e4b2eb837738660acf0e287 -Dbuild.vcs.number.1=411695cf560acb5b7e4b2eb837738660acf0e287 -Dbuild.vcs.number.Ycs_SupplyApiService_YcsSuppioScala1=411695cf560acb5b7e4b2eb837738660acf0e287 -Djava.io.tmpdir=/opt/buildagent/temp/buildTmp -Dsbt.ivy.home=/opt/buildagent/system/sbt_ivy -Dteamcity.agent.cpuBenchmark=627 -Dteamcity.agent.dotnet.agent_url=http://localhost:9090/RPC2 -Dteamcity.agent.dotnet.build_id=1022735 -Dteamcity.auth.password=******* -Dteamcity.auth.userId=TeamCityBuildId=1022735 -Dteamcity.build.changedFiles.file=/opt/buildagent/temp/buildTmp/changedFiles3407499404619574497.txt -Dteamcity.build.checkoutDir=/opt/buildagent/work/36dd69c049b3f712 -Dteamcity.build.id=1022735 -Dteamcity.build.properties.file=/opt/buildagent/temp/buildTmp/teamcity.build2492188537600221102.properties -Dteamcity.build.tempDir=/opt/buildagent/temp/buildTmp -Dteamcity.build.workingDir=/opt/buildagent/work/36dd69c049b3f712 -Dteamcity.buildConfName=DevelopPushDocker -Dteamcity.buildType.id=FFFGGHHH -Dteamcity.configuration.properties.file=/opt/buildagent/temp/buildTmp/teamcity.config6701741486713268575.properties -Dteamcity.projectName=APIService -Dteamcity.runner.properties.file=/opt/buildagent/temp/buildTmp/teamcity.runner3391739853422434247.properties -Dteamcity.tests.recentlyFailedTests.file=/opt/buildagent/temp/buildTmp/testsToRunFirst5160519002097758075.txt -Dteamcity.version=2017.1 (build 46533) -classpath /opt/buildagent/temp/agentTmp/agent-sbt/bin/sbt-launch.jar:/opt/buildagent/temp/agentTmp/agent-sbt/bin/classes: xsbt.boot.Boot < /opt/buildagent/temp/agentTmp/commands5523308191041557049.file
I've highlighted the one I am after (build.number), but you can see that TeamCity passes a lot of data to the SBT runner; it invokes Java directly from the command line rather than just running the SBT command itself.
There is something I am missing though: I need to know the branch name, because we have a convention that if a build is not from the master branch we append "-branchname" to the version. To add this, edit your SBT runner step in TeamCity and add the parameter shown below.
[Screenshot: SBT runner step parameters in TeamCity]
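The screenshot isn't reproduced here, but the gist is to pass TeamCity's branch parameter through to SBT as a JVM system property. A minimal sketch of what goes into the runner's JVM command line parameters (the property name just has to match what the build file reads; treat the exact runner field as an assumption):

-Dteamcity.build.branch=%teamcity.build.branch%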
From this we can use the variable in our Build.Scala file like so. I also add a value for the team name that is used later.
val dockerRegistry = "MyprivateDockerReg.company.com"
val team = "myteam"
val appName = "ycs-supply-api"

var PROJECT_VERSION = Option(System.getProperty("build.number")).getOrElse("0.0.4")

val BRANCH = Option(System.getProperty("teamcity.build.branch")).getOrElse("SNAPSHOT")
if (BRANCH != "master") {
  PROJECT_VERSION = PROJECT_VERSION + "-" + BRANCH.replace("-", "").replace("/", "")
}
Now for Docker: in your SBT plugins file (project/plugins.sbt) make sure you have this line to import the plugin:
addSbtPlugin("se.marcuslonnberg" % "sbt-docker" % "1.4.1")
Then here is what our build.sbt looks like:
import scala.xml.{Elem, Node}

enablePlugins(JavaAppPackaging)

name := "MyAppBin"

dockerfile in docker := {
  val appDir: File = stage.value
  val targetDir = s"/opt/$team"

  new Dockerfile {
    from("java:8-jre-alpine")
    maintainer(s"$team")

    runRaw(s"mkdir -p $targetDir")
    workDir(s"$targetDir")
    copy(appDir, targetDir)
    expose(80)
    env("JAVA_OPTS" -> s"-Dappname=$appName -Dconfig.file=conf/application-qa.conf -Dlog.dir=log/ -Dlogback.configurationFile=conf/logback-test.xml -Xms256m -Xmx256m -server")
    entryPoint(s"bin/${name.value}")
  }
}

imageNames in docker := Seq(
  // SPAPI Sets the latest tag
  ImageName(s"$dockerRegistry/$team/$appName:latest"),
  // SPAPI Sets a name with a tag that contains the project version
  ImageName(s"$dockerRegistry/$team/$appName:${version.value}")
)

mainClass in Compile := Some("Boot")

buildOptions in docker := BuildOptions(
  cache = true,
  removeIntermediateContainers = BuildOptions.Remove.Always,
  pullBaseImage = BuildOptions.Pull.IfMissing
)
This will get us building the container and saving it to the local docker cache. After this we need to push it to our private registry.
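As an aside, a developer can produce the same image locally by running the plugin's task directly (assuming sbt and a local Docker daemon are available):

sbt docker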
There is currently an open issue about sbt-docker here, so I couldn't get docker login to run from SBT; instead I created a separate step in TeamCity to handle the push.
To do this I want to keep as many of my settings as possible in the Build.Scala, so that the local experience stays similar to the build server, but I don't want to parse text in the build step. What we can do instead is have SBT print TeamCity service messages that tell it which settings to use.
Add these two lines in

println(s"##teamcity[setParameter name='DockerContainerCreated' value='$dockerRegistry/$team/$appName:${version.value}']")
println(s"##teamcity[setParameter name='SbtDockerRegistry' value='$dockerRegistry']")</div>
This will make SBT output the service-message format that TeamCity reads to set parameters, which lets us create the next step as a command line step.
[Screenshot: TeamCity picking up the setParameter service messages from SBT]
Next, add these parameters to the build configuration as empty parameters:
[Screenshot: the empty DockerContainerCreated and SbtDockerRegistry parameters in TeamCity]
Then we can create a command line step that does the docker login/push
[Screenshot: command line step pushing the container to the Docker registry with the version tag]
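The step itself is just a docker login followed by pushing the tagged image, using the two parameters SBT populated; the credential parameter names below are placeholders for wherever you keep your registry username and password:

docker login -u %docker.registry.username% -p %docker.registry.password% %SbtDockerRegistry%
docker push %DockerContainerCreated%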
And we are done! You should see your container in the registry now, and if, like us, you are using Octopus Deploy, the container will appear in searches and the version numbers will be correctly associated with the containers.

Private Nuget Servers – VS Team Services Package Management

A while back I set up a Klondike server for hosting our internal NuGet packages. We use it for both internal libraries and Octopus.

Microsoft recently released the Package Management feature for VSTS (formerly known as VSO). The exciting thing about Package Management is that they have hinted they will include support for npm and Bower in future, so you will have a single source for all your package management.

[Screenshot: Package Management in the Visual Studio Marketplace]

After installing it in VSTS you will get a new "Package" option in the top bar.

[Screenshot: the Package hub in VSTS]

From here you can create new feeds. In my case I've decided to use one feed per project, but you could easily create more per project if, for example, you had separate responsibilities and wanted more granular permissions. You can restrict the Publish and Read rights on a feed to users or groups within VSTS, so it's very easy to manage, unlike my hack-around for permissions in my previous post about Klondike.

[Screenshot: creating a new NuGet feed]

Because we use TeamCity, I considered giving the build service its own account in VSTS, since it needs credentials, but in this example I'm just using my own account.

You will need to change the "pick your tool" option to NuGet 2.x to get credentials you can use in the TeamCity steps.

[Screenshot: selecting the NuGet 2.x tooling option for the feed]

Then click "Generate NuGet Credentials" and grab the username and password.

[Screenshots: generating and viewing the NuGet feed credentials]

Next hop over to your TeamCity Server, and edit/add your build configuration.

It's important to note that you will require at least TeamCity 9.1.6 to do this, as it contains a fix for NuGet feed credentials.

First jump into "Build Features" and add a set of NuGet feed credentials with the URL of the feed you got from the VSTS interface.

[Screenshot: adding the NuGet feed credentials build feature in TeamCity]

Then jump over to your build steps and edit/add your NuGet steps. Below is an example of my publish step.

[Screenshot: the NuGet publish step in TeamCity]

The API key I've set to "VSTS", as per the instructions in the VSTS web interface.
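Under the hood, the publish step is effectively just running a NuGet push against the feed, roughly like the command below; the package name and feed URL are illustrative, substitute your own:

nuget.exe push MyLibrary.1.0.12.nupkg -Source https://youraccount.pkgs.visualstudio.com/_packaging/YourFeed/nuget/v2 -ApiKey VSTS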

And we are publishing.

[Screenshot: TeamCity build output publishing to the VSTS feed]

You will see the built packages in the VSTS interface when you are done.

[Screenshot: the packages listed in the VSTS web interface]

Now if you have an Octopus server like us, you will need to add the credentials to it as well, in the external NuGet feeds section.

[Screenshots: adding the VSTS feed as an external NuGet feed in Octopus and browsing its packages]

And its that easy.

One of our concerns about the Klondike server we set up was capacity. We have more than 12 developers and run CI with automatic deployment to the development environment, so we generate a large number of packages daily as developers check in work. Over months and years the server has become quite bloated, though to give it credit I am surprised at how long that took.

Some queries are taking upwards of 15-20 seconds at times, and we have an issue (which I have not confirmed is related) where packages are randomly "not there" after the build log says they have been published successfully.

I am hoping that the VSTS platform will last us longer, and it has the added advantage of granular permissions, which we will be taking advantage of as we grow.

Automated SSRS Report Deployments from Octopus

Recently we started using the SQL Server Data Tools preview in Visual Studio 2015. It's looking good so far: for the first time in a long time I don't have to "match" the Visual Studio version I'm using with the SQL version, and I can use the latest VS to target SQL Server versions from 2008 to 2016, which is great. But it isn't perfect; there are a lot of bugs in the latest version of Data Tools, including some that affect us and that Microsoft is refusing to fix.

One of the sticking points for me has always been deployment. We use Octopus internally and try to run everything through it if we can, to simplify our deployment environment.

SSRS is the big one, so I’m going to go into how we do deployments of that.

We use a tool that ships with SSRS called RS.EXE, a command-line utility for running VB.NET scripts that deploy reports and other objects to SSRS.

Be aware that the methods available differ depending on which API endpoint you call with the "-e" command-line parameter, and out of the box it defaults to the 2005 endpoint, which differs massively from the 2010 one.

A lot of the code I wrote was based on Tom's blog post on the subject from 2011; however, he was using the 2005 endpoint and I have updated the code to use the 2010 endpoint.

The assumption is that you have a folder full of RDL files (and possibly some PNGs and connection objects) to deploy, so the VB script takes parameters for the source folder, the target folder on the SSRS server, and the SSRS server address itself. I call it from a PreDeploy.ps1 file that gets packaged into the NuGet package for Octopus; the line in the pre-deploy script looks like the below.


rs.exe -i Deploy_VBScript.rss -s $SSRS_Server_Address -e Mgmt2010 -v sourcePATH="$sourcePath"   -v targetFolder="$targetFolder"
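For context, the surrounding PreDeploy.ps1 is little more than pulling those three values out of Octopus variables and calling RS.EXE; a rough sketch (the variable names are assumptions, and rs.exe is assumed to be on the PATH of the machine running the Tentacle):

# PreDeploy.ps1 - sketch only
$SSRS_Server_Address = $OctopusParameters["SSRS.ServerAddress"]   # e.g. http://reports/reportserver
$targetFolder        = $OctopusParameters["SSRS.TargetFolder"]    # e.g. /MyProject
$sourcePath          = $PSScriptRoot                              # the script runs from the extracted package, reports sit alongside it

& rs.exe -i "$PSScriptRoot\Deploy_VBScript.rss" -s $SSRS_Server_Address -e Mgmt2010 -v sourcePATH="$sourcePath" -v targetFolder="$targetFolder"
if ($LASTEXITCODE -ne 0) { throw "rs.exe failed with exit code $LASTEXITCODE" }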

Note the "-e Mgmt2010"; this is important for the script to work. Next up is Deploy_VBScript.rss:


Dim definition As [Byte]() = Nothing
Dim warnings As Warning() = Nothing

Public Sub Main()
    ' Create folder for the project if it does not exist
    Try
        Dim descriptionProp As New [Property]
        descriptionProp.Name = "Description"
        descriptionProp.Value = ""
        Dim visibleProp As New [Property]
        visibleProp.Name = "Visible"
        visibleProp.Value = True
        Dim props(1) As [Property]
        props(0) = descriptionProp
        props(1) = visibleProp
        If targetFolder.SubString(1).IndexOf("/") <> -1 Then
            Dim level2 As String = targetFolder.SubString(targetFolder.LastIndexOf("/") + 1)
            Dim level1 As String = targetFolder.SubString(1, targetFolder.LastIndexOf("/") - 1)
            Console.WriteLine(level1)
            Console.WriteLine(level2)
            rs.CreateFolder(level1, "/", props)
            rs.CreateFolder(level2, "/" + level1, props)
        Else
            Console.WriteLine(targetFolder.Replace("/", ""))
            rs.CreateFolder(targetFolder.Replace("/", ""), "/", props)
        End If
    Catch ex As Exception
        If ex.Message.IndexOf("AlreadyExists") > 0 Then
            Console.WriteLine("Folder {0} exists on the server", targetFolder)
        Else
            Throw ex
        End If
    End Try

    Dim Files As String()
    Dim rdlFile As String
    Files = Directory.GetFiles(sourcePath)
    For Each rdlFile In Files
        If rdlFile.EndsWith(".rds") Then
            'TODO Implement handler for RDS files
        End If
        If rdlFile.EndsWith(".rsd") Then
            'TODO Implement handler for RSD files
        End If
        If rdlFile.EndsWith(".png") Then
            Console.WriteLine(String.Format("Deploying PNG file {2} {0} to folder {1}", rdlFile, targetFolder, sourcePath))
            Dim stream As FileStream = File.OpenRead(rdlFile)
            Dim x As Integer = rdlFile.LastIndexOf("\")

            definition = New [Byte](stream.Length) {}
            stream.Read(definition, 0, CInt(stream.Length))
            Dim fileName As String = rdlFile.Substring(x + 1)
            Dim prop As New [Property]
            prop.Name = "MimeType"
            prop.Value = "image/png"
            Dim props(0) As [Property]
            props(0) = prop
            rs.CreateCatalogItem("Resource", fileName, targetFolder, True, definition, props, Nothing)
        End If
        If rdlFile.EndsWith(".rdl") Then
            Console.WriteLine(String.Format("Deploying report {2} {0} to folder {1}", rdlFile, targetFolder, sourcePath))
            Try
                Dim stream As FileStream = File.OpenRead(rdlFile)
                Dim x As Integer = rdlFile.LastIndexOf("\")

                Dim reportName As String = rdlFile.Substring(x + 1).Replace(".rdl", "")

                definition = New [Byte](stream.Length - 1) {}
                stream.Read(definition, 0, CInt(stream.Length))

                rs.CreateCatalogItem("Report", reportName, targetFolder, True, definition, Nothing, warnings)
                If Not (warnings Is Nothing) Then
                    Dim warning As Warning
                    For Each warning In warnings
                        Console.WriteLine(warning.Message)
                    Next warning
                Else
                    Console.WriteLine("Report: {0} published successfully with no warnings", reportName)
                End If
            Catch e As Exception
                Console.WriteLine(e.Message)
            End Try
        End If

    Next
End Sub

I haven't yet added support for RDS and RSD files to this script; if someone wants to finish it off, please share. For now it handles PNGs and RDLs, which are the main things we update.

Now that all sounds too easy, and it was. The next issue I ran into was when deploying: the VS 2015 Data Tools client saves everything in 2016 format, and when I say 2016 format, I mean it changes the xmlns and adds a tag called "ReportParametersLayout".

When you deploy from the VS client it appears to "roll back" these changes before passing the report to the 2010 endpoint, but if you try to deploy the RDL file straight from source it will fail (insert fun).

To work around this I had to write my own "roll back" script in the Octopus pre-deploy PowerShell script, below:

Get-ChildItem $sourcePath -Filter "*.rdl" | `
    Foreach-Object {
        [Xml]$xml = [xml](Get-Content $_.FullName)
        if ($xml.Report.GetAttribute("xmlns") -eq "http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition")
        {
            $xml.Report.SetAttribute("xmlns", "http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition")
        }
        if ($xml.Report.ReportParametersLayout -ne $null)
        {
            $xml.Report.RemoveChild($xml.Report.ReportParametersLayout)
        }
        $xml.Save($sourcePath + '\' + $_.BaseName + '.rdl')
    }

Now things were deploying w… no, wait! there’s more!

Another error I hit was this next one. I mentioned before that MS is refusing to fix some bugs; here's one. We work in Australia, so time zones are a pain in the ass: we have some states on daylight saving, some not, and some that were but have now stopped. Add to this that we work on a multi-tenanted application, so while some companies can rule "we are in Sydney time, end of story", we cannot. Using the .NET functions for time zone conversion is therefore a must, as they take care of all this nonsense for you.

I make a habit of storing all date/time data in UTC; that way, if you need to do a comparison in the database or application layer it's easy, and I convert to the local time zone in the UI based on user preference, business rule, etc. Because of this we have been using System.TimeZoneInfo in the RDL files. As of VS 2012 and up, you get an error from this library in VS; the report will still run fine on SSRS when deployed, it just errors in the editor in VS and won't preview or deploy.
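For reference, the kind of report expression we rely on looks roughly like the one below (the field name and time zone ID are illustrative); this is what the VS editor refuses to preview, even though it runs fine once deployed:

=System.TimeZoneInfo.ConvertTimeBySystemTimeZoneId(Fields!CreatedDateUtc.Value, "AUS Eastern Standard Time")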

We've worked around this by using a CLR function that wraps TimeZoneInfo (example here). If you are reading this, please vote this one up on Connect; converting the time zone in the UI is better.

After this fix, things are running smoothly.

TypeScript Project AppSettings

OK, so there are a few nice things that MS has done with TypeScript, but dev time vs build time vs deploy time don't all work the same, so there are a few things we've done to keep the F5 experience nice while making sure the built and deployed results are the same.

In the below examples we are using

  • Visual Studio 2015
  • TeamCity
  • Octopus Deploy

Firstly, TypeScript in Visual Studio has a nice feature to combine your TypeScript into a single JS file while you're coding. It saves to this file as you save your TS files, so it's nice for making code changes on the fly while debugging, and you get a single file output.

[Screenshot: the "Combine JavaScript output into file" TypeScript project setting]

This will also build when TeamCity runs MSBuild. But don't check this file in: it's compiled TS output and should be treated like the binary output from a C# project.

And you can further minify and compact this using npm (see this post for details).

This isn't perfect though. We shovel our static content to a separate CDN that is shared between environments (one folder per version), and we have environment-specific JavaScript variables that need to be set; this can't be done in a TypeScript file, as they all compile into a single file. I could use npm to generate the TypeScript into two files, but from my tests that conflicts too much with the developers' local setup when using the feature above.

So we pulled the settings into a JS file that is separate from the TypeScript. This obviously causes the TypeScript to break, as the object doesn't exist, so we added a declaration file for it like the below:

declare var AppSetting: {
    url: string;
    baseCredential: string;
    ApiKey: string;
}
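With that declaration in place, the rest of the TypeScript compiles against it as if the object were strongly typed; a trivial, made-up example:

// builds a request URL from the environment-specific settings
function getApiUrl(path: string): string {
    return AppSetting.url + path + "?apiKey=" + AppSetting.ApiKey;
}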

Then drop the actual object in a JavaScript file with the tokens ready for Octopus, with a simple if statement to drop in the developer's local settings when running locally.


var AppSetting;
(function (AppSetting) {
    if (window.location.hostname == "localhost") {
        AppSetting.url = "http://localhost:9129/";
        AppSetting.baseCredential = "XXXVVVWWW";
        AppSetting.ApiKey = "82390189wuiuu";
    } else {
        AppSetting.url = "#{url}";
        AppSetting.baseCredential = "#{baseCredential}";
        AppSetting.ApiKey = "#{ApiKey}";
    }
})(AppSetting || (AppSetting = {}));

This does mean we need an extra HTTP request to pull down the AppSetting JS file for our environment, but it means we can maintain our single CDN for static content/code as we have traditionally done.

Deploying to Telerik Platform from Octopus Server

We have been using Telerik Platform for a while now. While their platform is great, going from TFVC to build to Platform for deployment always involved someone building from their local Visual Studio, which of course carries the risks of:

  1. Manually changing settings from local to dev/test/live and things getting missed
  2. Unchecked-in code going up
  3. Manual Labor

So, since they have a CLI, we decided to try to automate the process. It's a bit weird what we came up with, because we build at deployment time, but it works.

I set up a git repo for this one with an example solution using the Friends app (https://github.com/HostedSolutions/platform-friends-hybrid).

Summary of what we are going to set up with this process:

  1. Package the project into a NuGet package on the TeamCity server and push it to a NuGet server
  2. Pick up the NuGet package with an Octopus server and store variables for dev/test/prod in Octopus
  3. Deploy to Telerik Platform from the Octopus Tentacle
  4. Based on the Octopus environment (dev/test/etc.), use different variables and make the app available to different groups

THE PROJECT

First of all, the project itself. The build server in this instance isn't going to build anything, it's just going to package it, so we simply need to add a nuspec file to the project in Visual Studio, example below.


<?xml version="1.0" encoding="utf-8" ?>
<package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
 <metadata>
 <id>platform-friends-hybrid</id>
 <version>0.0.0</version>
 <title>platform-friends-hybrid deployment package</title>
 <authors>Hosted Solutions Pty Ltd</authors>
 <owners>Hosted Solutions Pty Ltd</owners>
 <requireLicenseAcceptance>false</requireLicenseAcceptance>
 <description>platform-friends-hybrid deployment package</description>
 <summary>platform-friends-hybrid deployment package</summary>
 <releaseNotes />
 <copyright>Copyright © Hosted Solutions Pty Ltd 2015</copyright>
 <language>en-US</language>
 </metadata>
 <files>
 <file src="\**\*.*" />
 <file src=".abproject" />
 <file src=".debug.abproject" />
 <file src=".release.abproject" />
 </files>
</package>

You will note in this example that the files beginning with a "." had to be added individually; this is because they aren't picked up by "*.*".

You also need to manually add a packages.config for NuGet with a reference to OctoPack. This will package up the files for you into the NuGet format Octopus needs.
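If you haven't used OctoPack before, the packages.config is tiny; something like this (the version number shown is just an example, use whatever is current):

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="OctoPack" version="3.0.31" targetFramework="net45" />
</packages>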

This commit on GitHub has the full details (https://github.com/HostedSolutions/platform-friends-hybrid/commit/467c2f06fdb5d123021250902cf0035bf5806790).

Finally we need to change the file /scripts/app/settings.js so that we can token-replace the variables. They need to be in the format #{VariableName} for Octopus; below is an example.


var appSettings = {

everlive: {
 apiKey: '#{EVERLIVE_API_KEY}', // Put your Backend Services API key here
 scheme: 'http'
 },

THE BUILD

To get this to build on your build server you will need to download and install the AppBuilder Visual Studio extension on the build server as well (https://platform.telerik.com/#workspaces, then go to "Getting Started" -> "Downloads" -> "App Builder Hybrid").

[Screenshot: the AppBuilder extension for Visual Studio download]

To make this work, my TeamCity build has 4 steps.

[Screenshot: the four build steps in TeamCity]

These are very similar to what I outlined in this post (https://beerandserversdontmix.com/2015/09/30/tfs-teamcity-nuget-octopus-somewhere/).

The main difference is I’ve had to add an extra build configuration parameter like this

/p:OctoPackNuGetArguments=-NoDefaultExcludes

This makes OctoPack pass an additional argument to NuGet when it does the packaging; without it, NuGet refuses to pick up the files beginning with the ".".
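For context, the OctoPack-related MSBuild arguments end up looking something like the below; the solution name and the other two properties are illustrative of a typical OctoPack setup, not lifted from this particular build:

msbuild platform-friends-hybrid.sln /t:Build /p:RunOctoPack=true /p:OctoPackPackageVersion=%build.number% /p:OctoPackNuGetArguments=-NoDefaultExcludes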

Now this will package up our solution to be built on the deployment server; it won't actually do any building.

THE DEPLOY (BUILD)

I've got a few projects that are odd like this, where I end up pushing from Octopus to a remote environment that then deploys onward. It's not unusual, I think, but still not common. In one of our larger setups we ended up creating a machine specifically for running scripts; in my smaller setup we just drop a Tentacle on the Octopus server.

I'm using an Octopus server with a Tentacle on it in this example.

First we need to get Node.js on the box (https://nodejs.org/download/release/latest-v0.12.x/node-v0.12.7-x86.msi).

The tricky bit here is that Node.js with the AppBuilder CLI runs out of the user profile, so what I have done is set the Tentacle on the box to run as a local user account (if you are in a domain I recommend an AD account); make sure it is a local administrator on the box the Tentacle is installed on.

[Screenshot: changing the user the Octopus Deploy Tentacle service runs as]

Once this is done, log in as this user to install and set up Node.js and the AppBuilder CLI from the command line:

npm install appbuilder -g

[Screenshot: npm install appbuilder -g output]

Now that that is ready, you need to set up a project in Octopus.

[Screenshot: creating the Telerik AppBuilder step in Octopus Deploy]

Make sure you select the Substitute Variables step feature.

[Screenshot: enabling the Substitute Variables step feature in Octopus Deploy]

And you will need to add your JavaScript settings file to the list once this is enabled.

[Screenshot: settings.js added to the substitute variables file list]

Then set up the variables in the variables section.

[Screenshot: adding the Telerik variables in Octopus Deploy]

And for each variable you will probably want to set a different scope per environment.

[Screenshot: scoping a variable per environment in Octopus Deploy]

Then add a step template

I’ve put notes for my original powershell script here (https://github.com/HostedSolutions/platform-friends-hybrid/blob/master/NotesOctopusStepTemplate.ps1)

And a full step template for octopus here (https://github.com/HostedSolutions/platform-friends-hybrid/blob/master/OctopusStepTemplate.json)

Now I'll just walk through a few of the things in the PowerShell and why we are doing them that way.


# Setup Group Command
$GroupCmd = " --group " + $GroupAccessList.Replace(",", " --group ");
# collapse any stray double spaces left after the replace
$GroupCmd = $GroupCmd.Replace("  ", " ").Replace("  ", " ")

The code above gives us the option for various groups to access different deployments. For example, we have Dev and Test: the developers have access to both Dev and Test, but only our testers have access to Test, because we allow the developers to "mess around" with Dev, which may cause false positive results in testing.
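As an example of what that produces, given a (made-up) $GroupAccessList of "Developers,Testers" the snippet builds the repeated --group arguments that get appended to the appbuilder calls further down:

$GroupAccessList = "Developers,Testers"
$GroupCmd = " --group " + $GroupAccessList.Replace(",", " --group ")
# $GroupCmd is now " --group Developers --group Testers"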


$currentstepname = $OctopusParameters["Octopus.Step[0].Package.NuGetPackageId"]

We expect the previous step to be the Telerik NuGet package step; if it is not, this will fail.


CMD /C C:\"Program Files (x86)"\nodejs\npm update -g appbuilder; $LASTEXITCODE

We run an update command to make sure we are on the latest version of AppBuilder; if AppBuilder is not the latest version, the upload will fail.


$AppData = $env:APPDATA

AppBuilder runs from the local user profile, so we need to use the AppData folder path

$JSON = (Get-Content "$parentLocation\.abproject" -Raw) | ConvertFrom-Json; 

# Set values in object
$JSON.ProjectName = $ProjectName; 
$JSON.AppIdentifier = $AppIdentifier; 
$JSON.DisplayName = $DisplayName;
$JSON.BundleVersion = $OctopusReleaseNumber; 

$JSON | ConvertTo-Json | Set-Content "$parentLocation\.abproject"

We modify values in the .abproject file to set things like the version number and the app name (we prefix Dev with "Dev" and Test with "Test"), so that, using the example above where a developer has both Dev and Test on their phone, they know which one is which.

CMD /C $APPDATA\npm\appbuilder dev-telerik-login $TelerikUserName $TelerikPassword; IF ($LASTEXITCODE -ne 0) { Write-Error "Error" }

Logs in to Telerik Platform.

CMD /C $APPDATA\npm\appbuilder appmanager upload android --certificate $AndriodCertID --publish --send-push $GroupCmd;$LASTEXITCODE;IF ($LASTEXITCODE -ne 0) { Write-Error "error"}

Uploads the Android version.

CMD /C $APPDATA\npm\appbuilder appmanager upload ios     --certificate $iOSCertID --provision $iOSProvitionID --publish $SendPushCmd $GroupCmd;$LASTEXITCODE;IF ($LASTEXITCODE -ne 0) { Write-Error "error"}

Uploads the iOS version
We normally set the Group Access list to a variable, so that it can be varied per environment.
[Screenshot: the Telerik group access list variable in Octopus]

So we then end up with steps like this in Octopus:

[Screenshot: the final Octopus Deploy steps for Telerik Platform]

Once deployed to Telerik Platform, our version numbers are in sync with Octopus and TeamCity as well as our source control labels, and we end up with separate apps for Dev, Test, etc. And if you are accessing services, you can token-replace the correctly scoped variable so that the "Test Mobile App" accesses the "Test Web API" and so on.

And there you have it, TFVC -> TeamCity -> Octopus Deploy -> Telerik Platform