
Stream SmartThings Data to Cosmos DB and PowerBI (Part 1)

SmartThings is a great platform. What it doesn't really do, however, is collect data to provide historical information. I set about collecting that data, then storing and reporting on it in Azure.


I've recently added a whole load of home automation functionality using Samsung's SmartThings platform. Part of this is a number of sensors to trigger automation based on motion, temperature, open/close, etc. The SmartThings platform is able to respond to events from these sensors to undertake tasks, and this works fine. What the platform doesn't really do, however, is collect this data to give you historical information. For example, I wanted to collect temperature values from various sensors around the house over time to give me a sense of temperature fluctuations throughout the day and the effectiveness of my heating. While there's no built-in way to collect this data in SmartThings, one of the good things about this platform is that it is easily extensible through its app and smart app interface. Knowing this, I set about looking to collect this data and store and report on it in Azure.

My plan was to use the following tools:

  • Event Hubs to capture the data as a live stream from SmartThings.
  • Stream Analytics to collect the data from the Event Hub and transfer it to long-term storage.
  • Cosmos DB to store the data.
  • Power BI to report on the data.

It should be noted that my approach here is to send the data to Cosmos DB for long-term storage and then have Power BI read and report on that data from Cosmos. It is possible to stream data directly to Power BI for real-time reporting, but as all I am looking for here is historical reporting, I didn't feel this was necessary. If you do want real-time reporting, you would follow the same initial steps to get the data to Event Hub.

This might seem a bit overkill for what we are doing (and I'm sure it is), but it's also a good learning experience for these tools in Azure and a fun project to boot. There are quite a few steps here, so I'm going to break this post into a few parts. In this first part, we will tackle getting the data from SmartThings to Event Hub.

Data Collection

The first thing we need to do is create an Event Hub to which we are going to stream the data. Use the Azure Portal to create an Event Hub in a region closest to you. For the volume of data we are going to get from SmartThings, the basic SKU should be enough.

This will go ahead and create an Event Hub namespace. Once that completes, select the created item in the Azure Portal then go to the Event Hubs tab. We now need to create an actual event hub. Click the + icon and then name your event hub (I generally give it the same name as the namespace we created). You can accept the defaults.

Once this has been created, click on the Shared Access Policies tab and create an access policy for this Event Hub. It will need to allow Send and Listen. Then, copy both the primary key and the connection string for use later. You can call the policy what you want, but remember the name for later.

Create SmartThings App

Now that we have created the Event Hub, we need to get SmartThings to send data to it. This is done by creating a SmartApp. The code we are using for this is based on an article by Ben Crouse from 2015, but I have updated it slightly to fix some issues and set it up in GitHub to make it easy to deploy. To add the SmartApp, you need to use the SmartThings web-based IDE and log in with your SmartThings credentials. Behind the scenes, there are actually multiple regional SmartThings instances. That URL should take you to the right one, but to check, you can go to the Hubs tab and confirm that you can see your SmartThings hub there.

Now that we are logged into the IDE, we can add the app. The easiest way to do this is through GitHub. In the IDE web app, go to the SmartApps tab. Here, you should see a Settings button.

Click on the Settings button. Then, in the window that opens, add the details of the GitHub repository. The owner should be sam-cogan, the name st-event-hub, and the branch master. You can view the content of the repo here, and obviously fork a copy yourself if you wish. Click Save.

Now click on the Update from Repo button in the IDE. You should see the st-event-hub repo in the list. Click on this. This will open a window with three columns; the only content should be in the far-right one, labeled New, and it should look like this:

Check the box and the box that says Publish, then click the Execute Update button. Once this completes, you should now see:

This indicates that the app is installed and ready for use.

App Settings

Before we can start using this solution, we need to add some additional data so that it knows how to connect to the Event Hub. Click on the Edit button to the left of the app name. In the window that opens, expand the Settings section. There should be two settings with no values:

The EventHubURL setting is fairly easy to get; it follows this format:

https://<eventhub namespace name>.servicebus.windows.net/<eventhub name>/messages

So, in my setup, I have a namespace and an Event Hub both called STEventHub, so my URL is:

https://STEventHub.servicebus.windows.net/STEventHub/messages
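The URL pattern above is simple enough to build programmatically. As a small sketch (in Python, using the STEventHub names from my setup as an illustration — substitute your own):

```python
def event_hub_url(namespace, hub_name):
    # The REST endpoint the SmartApp will POST event data to:
    # https://<namespace>.servicebus.windows.net/<hub>/messages
    return f"https://{namespace}.servicebus.windows.net/{hub_name}/messages"

print(event_hub_url("STEventHub", "STEventHub"))
```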
The EventHub Secret is a bit more painful, as you need to generate an Event Hub SAS token. The easiest way to do this is with this PowerShell script created by Marcel Meurer:

[Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null

$URI="<eventhub namespace name>.servicebus.windows.net/<eventhub name>"

$Access_Policy_Name="<access policy name>"

$Access_Policy_Key="<access policy primary key>"

#Token expiry in seconds from now (one year here)
$Expires=([DateTimeOffset]::Now.ToUnixTimeSeconds())+31536000

#Building Token

$SignatureString=[System.Web.HttpUtility]::UrlEncode($URI)+ "`n" + [string]$Expires
$HMAC = New-Object System.Security.Cryptography.HMACSHA256
$HMAC.key = [Text.Encoding]::ASCII.GetBytes($Access_Policy_Key)
$Signature = $HMAC.ComputeHash([Text.Encoding]::ASCII.GetBytes($SignatureString))
$Signature = [Convert]::ToBase64String($Signature)
$SASToken = "SharedAccessSignature sr=" + [System.Web.HttpUtility]::UrlEncode($URI) + "&sig=" + [System.Web.HttpUtility]::UrlEncode($Signature) + "&se=" + $Expires + "&skn=" + $Access_Policy_Name
$SASToken

You'll need to update the parameters as follows:

  • URI: This is the URI of your Event Hub, the same as the one we created for the EventHubURL app setting, but without the /messages part at the end.
  • Access_Policy_Name: This is the name of the access policy you created for your Event Hub earlier.
  • Access_Policy_Key: This is the primary key for the access policy you created.
  • Expires: This is the number of seconds until the token expires. As we want to send data over a long period, this needs to be quite long; in this example, I am using one year.

Run the PowerShell script and it should generate a key that you can add to the app settings and save.
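If you're not on Windows, the same token can be generated in any language, since it's just an HMAC-SHA256 over the URL-encoded URI and expiry time. Here's a rough Python equivalent of the script above (the names passed in at the bottom are hypothetical placeholders):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(uri, policy_name, policy_key, ttl_seconds=31536000):
    """Build an Event Hubs SAS token; mirrors the PowerShell script above."""
    expires = int(time.time()) + ttl_seconds
    # Sign the URL-encoded URI plus a newline plus the expiry timestamp
    string_to_sign = urllib.parse.quote_plus(uri) + "\n" + str(expires)
    signature = base64.b64encode(
        hmac.new(policy_key.encode("ascii"),
                 string_to_sign.encode("ascii"),
                 hashlib.sha256).digest()
    ).decode("ascii")
    return ("SharedAccessSignature sr={}&sig={}&se={}&skn={}"
            .format(urllib.parse.quote_plus(uri),
                    urllib.parse.quote_plus(signature),
                    expires, policy_name))

# Hypothetical names for illustration -- substitute your own values
token = generate_sas_token(
    "steventhub.servicebus.windows.net/steventhub",
    "SendListenPolicy",
    "base64KeyFromThePortal=")
print(token)
```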

Adding the App

Now that we have created and configured the app, we need to add it to our SmartThings instance and authorize it to get data. You'll need to do this through the SmartThings mobile application. Open the application and go to the Automation tab, and then click on the Smart Apps tab. Scroll down and click on Add a SmartApp. In the window that opens, scroll all the way down to My Apps.

In the window that opens, click on the ST-Event-Hub app.

This will open the app and ask you to select which devices you want to send data from. Go through each category and select the devices whose data you wish to send. Then give it a name and select Save.

Confirm Data Transmission

At this point, we should now be sending our selected SmartThings data to our Event Hub. You can check the stats for your Event Hub in the portal to make sure that you are sending the data okay. You can also use Service Bus Explorer to connect directly to your Event Hub and view the data.

If you aren't seeing data arrive, you can look at the logs in SmartThings. To do this, go back to the web IDE and click on the Live Logging tab. You can then filter by the ST-Event-Hub app. One thing to be aware of is that, by default, the logging of failures to send data to the Event Hub is disabled. This is because, for some reason, SmartThings/Groovy treats a 200 response from the Event Hub as an error rather than a success, so we have to disable error logging to prevent large numbers of spurious 200 "errors." To enable it again, go into the SmartApp code and un-comment the highlighted line below:

try {
    httpPost(params) { resp ->
        log.debug "response message ${resp}"
    }
} catch (e) {
    // For some reason SmartThings treats 200 as an error response, so we need to comment this out to avoid errors. Uncomment the line below to debug errors
    //log.error "something went wrong: $e"
}
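Under the hood, that Groovy httpPost is just an HTTP POST to the Event Hub's /messages endpoint with the SAS token in the Authorization header. As a rough illustration of what gets sent (in Python, building but not sending the request; the event payload shape and names here are hypothetical, and the real SmartApp's payload may differ):

```python
import json
import urllib.request

def build_event_request(url, sas_token, event):
    """Build (but do not send) the HTTP POST the SmartApp issues per event."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            # The SAS token generated earlier goes straight into Authorization
            "Authorization": sas_token,
            "Content-Type": "application/atom+xml;type=entry;charset=utf-8",
        },
        method="POST",
    )

# Hypothetical values for illustration only
req = build_event_request(
    "https://steventhub.servicebus.windows.net/steventhub/messages",
    "SharedAccessSignature sr=...",
    {"device": "Kitchen Sensor", "name": "temperature", "value": "21.5"},
)
print(req.get_method())
```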
Getting this all working was probably the hardest part of this project. I had to make a number of tweaks to the code in the SmartThings app, and getting the Event Hub secret in the right format took a good deal of time, so hopefully, this article will save you some of these pitfalls. In Part 2 of this article, we'll look at how we can take the data from Event Hub and get it into Cosmos DB ready to be reported on.



Published at DZone with permission of

Opinions expressed by DZone contributors are their own.
