
Using the Flick HAT for Raspberry Pi


The Flick HAT provides an interesting data ingestion option for the Raspberry Pi, letting you use gestures to trigger events.



The Flick HAT does gesture recognition, so you can tap, click, double-click, and move your finger on or above the mat. It's cool for controlling projects or for some random inputs.


Sometimes you want to trigger events with a click on a special touchpad or device mounted somewhere. This could be in a factory, on a door, or at your desk.

For me, it's a small device on my desk that I can use to trigger events. I have the script running every 15 seconds looking for gestures. I could run it in an infinite loop, but a long-running Python process on the RPi could leak some memory, and we are very resource-constrained, so I am trying to keep it a little more minimal: I keep my batch duration at 15 seconds and my run schedule at 15 seconds.
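Those two settings end up on the processor entry in the transformed config.yml. A trimmed sketch of what that can look like (the processor name, and whether you use ExecuteProcess at all, depend on your own flow, so treat the names here as placeholders):

```yaml
Processors:
  - name: ExecuteProcess
    class: org.apache.nifi.processors.standard.ExecuteProcess
    scheduling strategy: TIMER_DRIVEN
    scheduling period: 15 sec
    Properties:
      Batch Duration: 15 sec
```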

Our NiFi Flow

Let's build a simple NiFi flow to receive the JSON data and react to it:


In my RouteOnContent, I just look for the word "center". I have thought of many options for what to trigger: running SQL, doing a backup, and so on.
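For reference, that match can be configured on the processor roughly like this (a sketch of RouteOnContent's properties; `center` is a dynamic property I added, and its value is the keyword to match in the flowfile content):

```
Match Requirement:  content must contain match
Routing Strategy:   Route to each matching Property Name
center:             center
```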

Build the MiniFi Flow in NiFi


Download MiniFi 0.2.0 and the MiniFi 0.2.0 toolkit (you can use a newer version, but make sure you install the same version on the device you are going to move your config.yml to).

minifi-toolkit-0.2.0/bin/config.sh transform minififlick.xml config.yml   

Then SCP that config.yml and the minifi-*.zip to your device.

Unzip (or tar -xvf) it, then you can run it. This requires the Java 8 JDK to be installed on your machine; the Oracle version runs best on the RPi.

Let's Install and Run MiniFi

cd /opt/demo/minifi-0.2.0     
bin/minifi.sh install     
bin/minifi.sh start   

Example Message

{
	"flick": "center",
	"host": "herrflick",
	"ipaddress": "",
	"ts": "2017-08-14 21:19:21",
	"cputemp": 47
}

The important field is "flick", which is the gesture made (click, tap, movement, double-click, etc.). The other fields are things I always like to grab from devices: hostname, IP address, timestamp, and CPU temperature.
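Downstream, a consumer can pull those fields out with the standard json module. A minimal sketch, using the example message above (the values are illustrative):

```python
import json

# Example message as emitted by the MiniFi flow (values are illustrative)
payload = ('{"flick": "center", "host": "herrflick", "ipaddress": "", '
           '"ts": "2017-08-14 21:19:21", "cputemp": 47}')

msg = json.loads(payload)
gesture = msg["flick"]     # the gesture name, used for routing
cpu_temp = msg["cputemp"]  # device CPU temperature in degrees C

print(gesture, cpu_temp)
```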

We could do just about anything we want in the flow based on the trigger: start backups, send system information, anything you want to run on demand.
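A gesture-to-action dispatch could be sketched like this (a hypothetical example; `start_backup` and `send_sysinfo` are placeholder handlers, not part of the flow above):

```python
def start_backup():
    return "backup started"

def send_sysinfo():
    return "sysinfo sent"

# Map gesture names to actions; unknown gestures fall through to a no-op.
ACTIONS = {
    "center": start_backup,
    "FLICK-WE": send_sysinfo,  # a west-to-east flick
}

def handle(gesture):
    action = ACTIONS.get(gesture, lambda: "ignored")
    return action()

result = handle("center")
```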

Our Source Code

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Based on work by: Callum Pritchard, Joachim Hummel
# Project Name: Flick 3D Gesture
# Project Description: Sending Flick 3D Gesture sensor data to MQTT
# Version Number: 0.1
# Date: 15/6/17
# Release State: Alpha testing
# Changes: Created

import time
import colorsys
import os
import json
import sys, socket
import subprocess
import datetime
from time import sleep
from time import gmtime, strftime
import signal
import flicklib
from curses import wrapper

some_value = 5000
flicktxt = ''

#### Initialization
# yyyy-mm-dd hh:mm:ss
currenttime = strftime("%Y-%m-%d %H:%M:%S", gmtime())
external_IP_and_port = ('', 53)  # a.root-servers.net
socket_family = socket.AF_INET
host = os.uname()[1]

def getCPUtemperature():
    res = os.popen('vcgencmd measure_temp').readline()
    return res.replace("temp=", "").replace("'C\n", "")

def IP_address():
    try:
        s = socket.socket(socket_family, socket.SOCK_DGRAM)
        s.connect(external_IP_and_port)
        answer = s.getsockname()
        s.close()
        return answer[0] if answer else None
    except socket.error:
        return None

def message(publisher, value):
    print(value)

@flicklib.move()
def move(x, y, z):
    global xyztxt
    xyztxt = '{:5.3f} {:5.3f} {:5.3f}'.format(x, y, z)

@flicklib.flick()
def flick(start, finish):
    global flicktxt
    flicktxt = 'FLICK-' + start[0].upper() + finish[0].upper()
    message('flick', flicktxt)

@flicklib.airwheel()
def spinny(delta):
    global some_value
    global airwheeltxt
    global flicktxt
    some_value += delta
    if some_value < 0:
        some_value = 0
    if some_value > 10000:
        some_value = 10000
    airwheeltxt = str(some_value / 100)
    flicktxt = airwheeltxt

@flicklib.double_tap()
def doubletap(position):
    global doubletaptxt
    global flicktxt
    doubletaptxt = position
    flicktxt = doubletaptxt

@flicklib.tap()
def tap(position):
    global taptxt
    global flicktxt
    taptxt = position
    flicktxt = taptxt

@flicklib.touch()
def touch(position):
    global touchtxt
    global flicktxt
    touchtxt = position
    flicktxt = touchtxt

def main():
    global xyztxt
    global flicktxt
    global airwheeltxt
    global touchtxt
    global taptxt
    global doubletaptxt
    flickcount = 0
    airwheeltxt = ''
    airwheelcount = 0
    touchtxt = ''
    touchcount = 0
    taptxt = ''
    tapcount = 0
    doubletaptxt = ''
    doubletapcount = 0
    time.sleep(0.1)
    while flickcount < 100:
        if flicktxt != "":
            flickcount += 100
            cpuTemp = int(float(getCPUtemperature()))
            ipaddress = IP_address()
            row = {
                'ts': currenttime,
                'host': host,
                'cputemp': round(cpuTemp, 2),
                'ipaddress': ipaddress,
                'flick': flicktxt
            }
            json_string = json.dumps(row)
            print(json_string)
            sys.exit()
        time.sleep(0.1)
        flickcount += 1

main()





