
How We Made Muletallica

The team at MuleSoft wanted to create an awesome IoT demonstration for a hackathon, so they assembled a host of connected devices into a giant networked rock band.



This post is the second in a three-part series covering the projects that came out of our first internal hackathon of the year, which we had on display at our first Integration of Things Zone at CONNECT 2015. Missed us there? No worries: not only will you get a sense of the cool installations we built below, you’ll also get a front-row seat to how we built them, and with what technologies. So for now, put on your favorite rock star jeans and jump in the mosh pit to learn how the team built Muletallica, an interactive visual and musical experience for our conference attendees that connected devices like the Leap Motion and smart light bulbs with electric guitars, and a bell pepper.

Why We Built It

Muletallica came out of the internal IoT hackathon we had at MuleSoft back in April. It was a team project, built by Federico Amdam, Jesica Fera, Pablo Carballo, and myself, all of us based out of MuleSoft’s Buenos Aires office.

Muletallica was born out of something I had been tinkering with for quite some time: the idea of using technology to create interactive musical installations that could get people –potentially with no musical knowledge or skills whatsoever– to experience the joy of making music in a fun and creative way, with a minimal learning curve.


What we set out to accomplish during the hackathon was to take some of my prior musical experiments and use Mule to integrate them with intelligent lights, making the experience a lot more immersive and engaging. With the added visual feedback the lights provided, the responses to people’s actions became much easier to associate.

Fast forward to a month later and I was in San Francisco at the Connect conference, representing our team. Our marketing team in the meantime had helped design an awesome-looking installation to show Muletallica off, which really made the project stand out. And there I was essentially living the dream, as (at least for a couple of days) my job description involved playing the guitar and telling people about cool tech toys. Anyone who curiously walked up to the stand was invited to join the band and jam with us, and they were always very curious about how it was all accomplished, so I was happy to walk them through all of the interaction design and the underlying architecture.

We used MiLight lights for this, which expose a Python API that isn’t very easy to use. The better-known Philips Hue lights expose a nice API that can easily receive HTTP requests… that would have been too easy, though. We wanted a challenge, a place where we could show off Mule’s power to take an ugly legacy interface and make it usable, so that’s why we went with MiLight instead.

Muletallica, Play By Play


In the video above you can see me play several of the different instruments. Each instrument is linked to a different intelligent light and sets its hue and intensity through Mule messages:

  • The Air Piano: At first, I play a Leap Motion sensor as an air piano that can be played by simply stroking imaginary keys. The beauty of this is that whatever you play, it’ll always be in the right key and adjusted to land right on the beat. Literally anything you play will sound musically good, or at least not painfully off. At the same time, a light flickers once for every note played, with a hue mapped to the note’s pitch.
  • The Guitar Duel: When I play the guitar, the sequence of notes is stored, so that playing the air piano afterwards automatically runs you through that same sequence. This made for a pretty interesting request-response kind of musical conversation between two instruments: the notes were the same, but the free interpretation of their timing was enough to allow for some exciting musical expression. It was also a fun way to interact with members of the audience who were brave enough to accept the challenge of playing back whatever I played. One of the lights was mapped to the guitar and flickered with every note, mapping its hue to note pitch.
  • Adding Beats: When one of a series of printed cards is presented to the webcam on the laptop, the drum pattern changes. Here the computer uses reacTIVision, computer vision software originally built for the Reactable, to recognize the cards. Using it in our setup was a little tribute to the creator of the Reactable, Sergi Jordà, a professor of mine who first inspired me to pursue this ideal of making music creation accessible to everyone. One of the lights flickers in time with the drum beats, mapping the intensity of each beat to luminosity, and changes color whenever the pattern changes. Each change in the drum pattern also triggers a short three-second video.
  • The Wiiii and the WubWub: After changing the drum beat to the most electronic pattern, playing the same Leap Motion sensor as before invokes a dubstep-ish, theremin-like instrument that responds to the height and angle at which you hold your hand above the sensor. It can actually tell which hand you’re holding up and plays a different instrument depending on which it sees. I called one of these instruments “Wiiii” and the other one “WubWub” …I suppose you can easily tell which is which from the video. Every change in these instruments was also manifested through a change in the hue of its corresponding light.


  • The Bell Pepper: Our addition of a music-making vegetable piqued a lot of visitors’ curiosity. It was an actual vegetable wired to a Makey Makey, and it responded with a chord change every time someone touched it (cycling through a sequence of pre-defined chords). Yes, touching the bell pepper involved an –imperceptibly low– electric current passing through your body. Some people seemed a little uneasy about this idea, so I would assure them that the bell pepper we were using was 100% organic, fair-trade, fresh produce with no additives whatsoever, and then proceed to show them the sticker that certified it was in fact organic. One of the lights changed color whenever the bell pepper was touched.
  • The music that could be made with Muletallica was far from anything resembling the sound of Metallica… it could be described as mellow, Pink Floyd-ish, trance-inducing prog rock, or sometimes as full-on twisted, synthetic-sounding dubstep, but certainly never as heavy metal or anything even faintly close to it. We came up with the name as a random pun that we never expected would be taken seriously as a proper name, but people seemed to like it quite a bit, so we went with it… it’s like they say: if you build it, they will come.
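The note-to-light mapping described in the bullets above is easy to sketch in code. The following is an illustrative assumption rather than the actual Muletallica code: fold a MIDI note number into one of twelve pitch classes spread evenly around the hue wheel, and scale MIDI velocity into a luminosity percentage.

```python
def note_to_hue(midi_note):
    """Map a MIDI note number (0-127) to a hue angle in degrees (0-360).

    Notes an octave apart share a pitch class, so they light up with the
    same color; the twelve pitch classes are spread evenly over the wheel.
    """
    pitch_class = midi_note % 12
    return pitch_class * 30  # 360 degrees / 12 pitch classes

def velocity_to_luminosity(velocity):
    """Map MIDI velocity (0-127) to a luminosity percentage (0-100)."""
    return round(velocity / 127 * 100)
```

With a mapping like this, middle C and the C an octave above it produce the same hue, while harder playing produces brighter flickers.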

Muletallica’s Backstage

All of our Integration of Things Zone projects featured MuleSoft products in their internal structure in some way or another. In the case of Muletallica, I must admit that Mule was not the backbone of the project, but it was still an essential bone in its structure.

The backbone was Usine, a not-so-well-known French piece of software that is amazingly versatile and ideal for live performances like this. It shares a certain philosophy with Mule: with it, you also build flows by dragging and dropping atomic components that include all kinds of connectors and transformers. Just like in Anypoint Studio, everything is exposed through a graphical interface, while you can also drop into the code and write it by hand.

Most of the external components involved were connected together through MIDI, a widely accepted standard in musical interfaces. Thanks to the prevalence of that standard, connectivity was not a challenge when connecting Usine to reacTIVision or to Mogees. The lights we used, however, didn’t support MIDI or any other universal standard for that matter, so that’s where we had to truly put our integration developer hats on and solve the puzzle.

We then built a RAML definition that exposed a series of methods for calling our lights. With that in place, it was really easy to build an APIkit project and have it automatically flesh out all of the scaffolding we needed for a neat RESTful wrapper around their ugly API. We then injected a few lines of Python code into Mule; these executed the commands that make up the MiLight API, as well as the commands of a Python MIDI library that allowed us to receive the MIDI messages Usine sent and turn them into Mule messages.
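The mule.py script itself isn’t included in the post, but its job can be sketched. In the sketch below, only the command-code table is grounded in the flows shown later (it mirrors their set-variable values); the bridge address, the UDP port, and the two-byte message format are hypothetical placeholders, not the real MiLight protocol.

```python
import socket
import struct

# Effect codes as set by the set-variable elements in the Mule flows
COMMANDS = {
    0: "gamma",
    1: "flicker",
    2: "intensity",
    3: "wiii",
    4: "wub",
    5: "both",
    6: "directcolor",
}

# Hypothetical address of the MiLight wifi bridge on the local network
BRIDGE = ("192.168.0.100", 8899)

def effect_name(command):
    """Translate the numeric command Mule passes in into its effect name."""
    return COMMANDS.get(int(command), "unknown")

def send_to_bridge(group, command):
    """Push a datagram to the bridge for a given light group.

    The real MiLight byte sequences are device-specific; the two-byte
    message built here is a stand-in, not the actual protocol.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(struct.pack("BB", int(group), int(command)), BRIDGE)
    finally:
        sock.close()
    return effect_name(command)
```

In the actual project this logic ran under Jython inside Mule’s scripting component, with group and command injected as script properties.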

The RAML definition we wrote for wrapping the MiLight API in a REST API:

    #%RAML 0.8
    title: muletallica
    version: 1.0.0
    baseUri: http://server/lights/{group}
    /effects:
      displayName: effects
      /{group}:
        displayName: group
        /gamma:
          displayName: gamma
          put:
            description: change color gamma in a group of lights to any color
            body: 
              application/json:
                example: |
                 {
                   "note": 1
                 }
        /directcolor:
          displayName: direct color
          put:
            description: change color gamma in a group of lights, to predefined colors
            body: 
              application/json:
                example: |
                 {
                   "note": 32,
                   "velocity": 100
                 }
        /intensity:
          displayName: intensity
          put:
            description: change brightness in a group of lights
            body: 
              application/json:
                example: |
                 {
                   "velocity": 1
                 }
        /both:
          displayName: both
          put:
            description: flicker with color and intensity
            body: 
              application/json:
                example: |
                 {
                   "note": 1,
                   "velocity": 1
                 }
        /flicker:
          displayName: flicker
          put:
            description: make a group of lights flicker
            body: 
              application/json:
                example: |
                 {
                   "note": 1
                 }
        /wiii:
          displayName: wiii
          put:
            description: make wiii effect in a group of lights
            body: 
              application/json:
                example: |
                 {
                   "note": 1,
                   "velocity": 1
                 }
        /wub:
          displayName: wub
          put:
            description: make wub effect in a group of lights
            body: 
              application/json:
                example: |
                 {
                   "note": 1,
                   "velocity": 1
                 }
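To give a feel for how the wrapped API is called, here is a hedged Python sketch of a client for these resources. The localhost:8081 address and the /api base path come from the Mule HTTP listener configuration shown further down; the helper names themselves are made up for illustration.

```python
import json
import urllib.request

# Host, port, and base path from the Mule HTTP listener configuration
BASE = "http://localhost:8081/api"

def build_effect_request(effect, group, note=None, velocity=None):
    """Build the URL and JSON body for a PUT to /effects/{group}/{effect}."""
    url = "%s/effects/%s/%s" % (BASE, group, effect)
    body = {k: v for k, v in (("note", note), ("velocity", velocity))
            if v is not None}
    return url, json.dumps(body)

def call_effect(effect, group, **kwargs):
    """Fire the request; requires the Mule app to be running locally."""
    url, body = build_effect_request(effect, group, **kwargs)
    req = urllib.request.Request(
        url,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

For example, call_effect("flicker", 1, note=1) would ask the running Mule app to flicker light group 1.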

The XML of our Mule flows; much of this was automatically generated by APIkit from the file above:

    <?xml version="1.0" encoding="UTF-8"?>
    
    <mule xmlns:tracking="http://www.mulesoft.org/schema/mule/ee/tracking" xmlns:scripting="http://www.mulesoft.org/schema/mule/scripting"
    xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:apikit="http://www.mulesoft.org/schema/mule/apikit" xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns:spring="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mulesoft.org/schema/mule/scripting http://www.mulesoft.org/schema/mule/scripting/current/mule-scripting.xsd
    http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
    http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
    http://www.mulesoft.org/schema/mule/apikit http://www.mulesoft.org/schema/mule/apikit/current/mule-apikit.xsd
    http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
    http://www.mulesoft.org/schema/mule/ee/tracking http://www.mulesoft.org/schema/mule/ee/tracking/current/mule-tracking-ee.xsd" version="EE-3.7.0">
    
    <http:listener-config name="api2-httpListenerConfig" host="localhost" port="8081" doc:name="HTTP Listener Configuration"/>
    <apikit:config name="api2-config" raml="api2.raml" consoleEnabled="true" consolePath="console" doc:name="Router"/>
    
    <apikit:mapping-exception-strategy name="api2-apiKitGlobalExceptionMapping">
    
     <apikit:mapping statusCode="404">
      <apikit:exception value="org.mule.module.apikit.exception.NotFoundException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Resource not found&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
     <apikit:mapping statusCode="405">
      <apikit:exception value="org.mule.module.apikit.exception.MethodNotAllowedException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Method not allowed&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
     <apikit:mapping statusCode="415">
      <apikit:exception value="org.mule.module.apikit.exception.UnsupportedMediaTypeException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Unsupported media type&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
     <apikit:mapping statusCode="406">
      <apikit:exception value="org.mule.module.apikit.exception.NotAcceptableException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Not acceptable&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
     <apikit:mapping statusCode="400">
      <apikit:exception value="org.mule.module.apikit.exception.BadRequestException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Bad request&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
    </apikit:mapping-exception-strategy>
    
    <flow name="api2-main" processingStrategy="non-blocking">
     <http:listener config-ref="api2-httpListenerConfig" path="/api/*" doc:name="HTTP"/>
     <apikit:router config-ref="api2-config" doc:name="APIkit Router"/>
     <exception-strategy ref="api2-apiKitGlobalExceptionMapping" doc:name="Reference Exception Strategy"/>
    </flow>
    
    <sub-flow name="pythonApi">
     <scripting:component doc:name="Python">
      <scripting:script engine="jython" file="/Users/nearnshaw/muletallica/mule.py">
       <property key="group" value="#[group]" />
       <property key="command" value="#[command]" />
      </scripting:script>
     </scripting:component>
    </sub-flow>
    
    <flow name="put:/effects/{group}/gamma:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="0" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
     <set-payload value="&quot;Called Gamma&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/directcolor:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="6" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
     <set-payload value="&quot;Called Change Direct Color&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/both:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="5" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
     <set-payload value="&quot;Called Change Both&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/flicker:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="1" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
      <set-payload value="&quot;Called Flicker&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/intensity:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="2" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
      <set-payload value="&quot;Called Intensity&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/wiii:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="3" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
      <set-payload value="&quot;Called Wiii&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/wub:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="4" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
      <set-payload value="&quot;Called Wub&quot;" doc:name="Set Payload" />
    </flow>
    
    </mule>

This project allowed us to show off Mule’s speed and stability when dealing with a massive stream of simultaneous requests. In music, timing is the single most important thing: the slightest delay renders an interface unusable for musical interaction, which is why music is the ultimate challenge for testing the real-time readiness of a system. We did have a few problems with delays at first, but we soon realized that the bottleneck was actually our wifi signal, not Mule. With that fixed, we got to the point where delays were virtually imperceptible. The music software we were running is pretty heavy on a machine’s resources, and we were running Mule on that same laptop …even then, we didn’t experience any significant delays.

Looking forward, it would be amazing if someone took the time to build a MIDI connector for Mule. With that in place, this entire project could have been built around Mule, controlling even the triggering of musical notes and everything else… I really look forward to doing that some day!



Published at DZone with permission of Ross Mason, DZone MVB. See the original article here.

