Brewing Beer With Raspberry Pi: Stream Analytics

The beer brewing exercise continues, using Azure's IoT Hub and SQL database to perform stream analytics on the temperature of the beer.

By Gunnar Peipman · Feb. 10, 16 · Tutorial

When cooling beer, we want to store a history of temperatures for two reasons. First, it gives us valuable historical data for the next cooling sessions. Second, we can query past measurements if we temporarily lose the connection to IoT Hub. In this post we'll do some analysis and then build up a database for our beer cooling solution.

Creating a SQL Azure Database

I’m sure I want to brew Eisbock more than once, and therefore I have more than one cooling session coming. As all these sessions produce data I want to store, I need a way to find out which measurements belong to which cooling session. At this point I want to introduce a new term: batch. In brewing, a batch means a concrete brew in a container.

So, in total, we need two tables:

  • batch – batch key, measuring state, cooling rate, and the device used for measuring.
  • measurement – batch key, timestamp, beer temperature, ambient temperature.

Here is the database diagram. Although we currently expect the ambient temperature to be constant, this doesn’t always hold true, and in the future I also want to consider situations where the ambient temperature is changing (this is important in spring and autumn, when cooling takes longer due to less frost).

Beer cooling solution database diagram

The SQL script to create these two tables is below. This SQL works on SQL Azure too, so just create a database there, copy and paste this script, and run it:

CREATE TABLE [dbo].[batch] (
    [batchkey]    NVARCHAR (15) NOT NULL,
    [deviceid]    NVARCHAR (15) NOT NULL,
    [coolingrate] FLOAT (53)    CONSTRAINT [df_batch_coolingrate] DEFAULT ((0)) NOT NULL,
    [isactive]    BIT           CONSTRAINT [df_batch_isactive] DEFAULT ((0)) NOT NULL,
    CONSTRAINT [pk_batch] PRIMARY KEY CLUSTERED ([batchkey] ASC)
)
GO

CREATE TABLE [dbo].[measurement] (
    [id]          INT           IDENTITY (1, 1) NOT NULL,
    [batchkey]    NVARCHAR (15) NOT NULL,
    [time]        DATETIME      NOT NULL,
    [beertemp]    FLOAT (53)    NOT NULL,
    [ambienttemp] FLOAT (53)    NOT NULL,
    CONSTRAINT [pk_measurement] PRIMARY KEY CLUSTERED ([id] ASC),
    CONSTRAINT [fk_measurement_batch] FOREIGN KEY ([batchkey]) REFERENCES [dbo].[batch] ([batchkey])
)
GO

Now we have a simple SQL Azure database where we can save measurement data.

Getting Data From Azure IoT Hub to the SQL Database

Now comes the most complex part of this show. We have to get the data reported to IoT Hub into the SQL Azure database, and we don’t want to write any code for this. We also don’t want to write every data point to the database: temperature changes are not rapid, and taking averages over some small time window helps us keep the database smaller without losing any important information.

Here’s the full path measurements take from the device to the SQL database.

How data moves from device to SQL database

Stream Analytics is an Azure service for real-time data processing and aggregation. All Stream Analytics queries run in some time window on data that is flowing in. Stream Analytics takes data from inputs, processes it, and then sends it to outputs. In our case, Azure IoT Hub is the input and our SQL database is the output.

Creating a Stream Analytics Job

Now go to the Azure portal and create a new Stream Analytics job:


Adding an Input

When a new Stream Analytics job is created, you are redirected to the main page of the Azure portal. Browse to your newly created Stream Analytics job, open it, and click on the inputs box. This opens an inputs list on the right. Click on the add button.

Give the input a name and select “Data stream” as its type. For the source, select “IoT hub”.

In the “IoT hub” field, insert the subhost name part of your IoT Hub address. If your IoT Hub address is something.azure-devices.net, then the value in the “IoT hub” field must be “something”.

For the shared access policy, I took “service”, and in the shared access key field you have to insert the key from this policy. You can find the policy key in the IoT Hub settings: just open the settings in IoT Hub, select “Shared access policies”, and then “service”. From the policy properties window, copy the value of “Primary key” and paste it here.

When the data is entered and saved, the Azure portal checks whether the new input source can be connected. Just wait a few moments after clicking the save button to see if the input source is okay.

Adding an Output

Now click on the outputs block on the main page of the Stream Analytics job and add a new output. For us, the output will be a SQL database.

In the “Database” field, insert the name of the database you created before.

In the “Server name” field, insert your database server address. For SQL Azure, the address looks like something.database.windows.net. If you host the database yourself, or if you are running SQL Server on an Azure virtual machine, then insert the IP address of your server, because you don’t have any access to the DNS servers used by Azure.

The user name and password should be obvious, so I’ll skip them. The last field is the table; for this field, write “measurement”.

Now we have all the data inserted, and it’s time to save. Click “Create” to save the database as a new output source.

The names you gave to the input and output sources are the ones you will use later when writing Stream Analytics queries, so choose them carefully and make sure they make sense. It makes the queries easier to understand later.
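
To illustrate how these names show up in a query, here is a minimal pass-through sketch (for illustration only, not part of the final solution) using the input and output names from the query later in this article:

-- Minimal pass-through query: copies every incoming event from the
-- IoT Hub input straight to the SQL output, with no aggregation.
SELECT
    *
INTO
    [sqlazurebeeriot]
FROM
    [fromiothub]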

Creating a Stream Analytics Query

As a last thing, we have to add the query that runs on the incoming data flow. As Azure’s new portal doesn’t support testing Stream Analytics queries yet, we have to switch to the old management portal and insert our query there.


Now things get a little bit tricky, because the query must know the input and output formats. On the input side, we have data structured like the measurement objects we send out from the device. On the output side, we have the measurement table we created above. Additionally, every numeric field we select must go through an aggregate function, because the query groups incoming rows over a time window.

Now we have one additional problem. The Stream Analytics query runs on data coming in from the data source, but currently the data we send to Azure IoT Hub carries no information about which batch it belongs to, and there’s no way to make that decision on the Stream Analytics side. We have to add a batch key to the data transfer object we use to send data to Azure IoT Hub.

So we add a new batchkey attribute to the anonymous DTO used in the ReportMeasurement method.

private void ReportMeasurement(DateTime time, double beerTemp, double ambientTemp, double estimate)
{
    // Anonymous DTO serialized to JSON; the batchkey field was added so
    // Stream Analytics can tell which batch a measurement belongs to.
    var beerMeasurement = new
    {
        deviceid = "mydevice",
        batchkey = "eisbock-1",
        timestamp = time,
        beertemp = beerTemp,
        ambienttemp = ambientTemp,
        estimate = estimate
    };

    // JsonConvert comes from Newtonsoft.Json; Message and _deviceClient
    // come from the Azure IoT device client library.
    var messageString = JsonConvert.SerializeObject(beerMeasurement);
    var message = new Message(Encoding.ASCII.GetBytes(messageString));

    _deviceClient.SendEventAsync(message).AsTask().Wait();
}

Now we can go on and write the query that saves data to the SQL Azure database. Here is the Stream Analytics query:

SELECT
    batchkey AS batchkey,
    MAX(CAST([timestamp] AS datetime)) AS time,
    AVG(beertemp) AS beertemp,
    AVG(ambienttemp) AS ambienttemp
INTO
    [sqlazurebeeriot]
FROM
    [fromiothub]
GROUP BY
    batchkey,
    TumblingWindow(minute, 5)

Some notes: we cast the timestamp field to datetime, because otherwise Stream Analytics considers it something that should be cast to float; I’m not sure why that is. As we are interested in temperatures during five-minute time windows (also called tumbling windows), we take the averages of the reported temperatures.

Once a Stream Analytics job is running, we can’t make changes to it. To change something, we have to stop the job and then make our modifications.

Adding the First Batch to the Database

Before we can start gathering data, we need at least one batch to be available in the batch table. Add a new batch with the following data (a sample insert is shown after the list):

  • batchkey: eisbock-1
  • deviceid: mydevice
  • isactive: 1
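
For reference, a minimal T-SQL insert for this row could look like the sketch below; coolingrate is omitted so it falls back to its default of 0.

-- Insert the first batch row; [coolingrate] falls back to its DEFAULT (0).
INSERT INTO [dbo].[batch] ([batchkey], [deviceid], [isactive])
VALUES (N'eisbock-1', N'mydevice', 1);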

After adding this row to the batch table, we are ready to run a test to see whether data gets from IoT Hub to the SQL Azure database.

Testing the Stream Analytics Job

Now run the Stream Analytics job and open your IoT Hub so you can see its dashboard. Also open your database in SQL Server Management Studio and be ready to select from the measurement table (a sample query is shown below). Run the beer cooling solution and see if data starts coming in. In SQL Azure, the data arrives with some delay; how long depends on how wide the time window is that the results are aggregated over.
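
A simple check query like this one (my sketch, assuming the measurement table created earlier) is enough to watch for incoming rows:

-- Show the latest aggregated measurements first.
SELECT TOP (20) [batchkey], [time], [beertemp], [ambienttemp]
FROM [dbo].[measurement]
ORDER BY [time] DESC;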

Data in the SQL database inserted by Stream Analytics

If everything is okay, you should soon see new data in your measurement table. If the time window is five minutes, then you have to wait at least five minutes before any data shows up.

Wrapping Up

This was a long post, full of analysis and service configuration, but we made it, and now our data is flowing from IoT Hub to a SQL Azure database. As we aggregate measurement results over five-minute time windows, we store less data than is coming in, but we still don’t lose much, and for the next batches we have useful historical data to draw on.

Brewing Beer With a Raspberry Pi: Table of Contents

  1. Brewing Beer With a Raspberry Pi: Measuring Temperature
  2. Brewing Beer With a Raspberry Pi: Moving to the ITemperatureClient Interface
  3. Brewing Beer With a Raspberry Pi: Measuring Cooling Rate
  4. Brewing Beer With a Raspberry Pi: Making Cooling Rate Calculations Testable
  5. Brewing Beer With a Raspberry Pi: Reporting Measurements to Azure IoT Hub
  6. Brewing Beer With Raspberry Pi: Stream Analytics
  7. Brewing Beer With Raspberry Pi: Visualizing Sensor Data
  8. Brewing Beer With Raspberry Pi: Building a Universal Windows Application

Published at DZone with permission of Gunnar Peipman, DZone MVB. See the original article here.

