Interview: Music Composer on the NetBeans Platform
blue is a music composition environment I started in the fall of 2000. It was actually my very first Java program! At the time, I had started using the music software Csound (http://www.csounds.com) to compose, but found it slow to work with when it came to accomplishing what I was interested in musically. I had the idea to create a simple program that would have a timeline and the ability to scale musical score material in time.
Fast forward many years later: in trying to solve other musical problems, and responding to feedback from the community of users, I've expanded blue's features a great deal. It now includes things like a mixer and effects system, a GUI builder tool for creating synthesizer interfaces, embedded Jython processing of musical scripts, and more. It's been quite satisfying to create a tool that can express my musical interests and to find a community of users who have found value in this program for their own musical work.
- The Orchestra manager shows a BlueSynthBuilder instrument being edited. The "Reson 6" instrument is shown in edit mode. The BSB Object Properties panel shows the properties for the selected knob.
- The Score timeline shows a project using multiple parameter automations. The values automated include things like volume, panning, and a time pointer for a phase-vocoder instrument. All BlueSynthBuilder instruments, Effects, and the Mixer volume sliders can be automated.
- The Score timeline showing the author's composition "Reminiscences". The timeline shows multiple Python SoundObjects in use. The SoundObject Editor shows the editor for the selected SoundObject in the timeline. The SoundObject Properties panel shows different properties for the selected SoundObject.
- The Score timeline showing a Tracker SoundObject being used. The timeline is configured to snap at every 4 beats, and the time bar has been configured to show beat numbers rather than time.
- The Score timeline showing a PianoRoll SoundObject being used. The PianoRoll is unique in that it is microtonal, meaning it can adapt the number of steps per "octave" depending on the values configured in a tuning file. In this screenshot, the scale loaded was a Bohlen-Pierce scale, which has 13 tones per tritave (an octave plus a fifth).
- The blue Mixer is shown docked into the bottom bar and in an open state. The interfaces for user-created Chorus and Reverb effects are shown. The interfaces were created using the same GUI builder tool that is found in the BlueSynthBuilder instrument.
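To illustrate the microtonal idea behind the PianoRoll caption above: in an equal-tempered Bohlen-Pierce scale, each of the 13 steps multiplies the frequency by the 13th root of 3, so the scale repeats at a 3:1 tritave rather than the 2:1 octave. A minimal sketch (not blue's actual code; the class and method names are hypothetical):

```java
// Sketch: equal-tempered steps over a 3:1 "tritave" instead of a 2:1 octave.
// Not blue's implementation; names here are illustrative only.
public class TritaveScale {
    /** Frequency of a scale degree: base * 3^(step / stepsPerTritave). */
    public static double frequency(double baseFreq, int step, int stepsPerTritave) {
        return baseFreq * Math.pow(3.0, (double) step / stepsPerTritave);
    }

    public static void main(String[] args) {
        double base = 220.0;
        for (int i = 0; i <= 13; i++) {
            System.out.printf("step %2d: %8.3f Hz%n", i, frequency(base, i, 13));
        }
        // After 13 steps the frequency has tripled (one full tritave).
    }
}
```

Starting from 220 Hz, step 13 lands on 660 Hz, three times the base frequency.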
It's got a very special appearance. How did that come about?
blue's custom look and feel started off one day when I was using my Palm PDA. I remember thinking that I enjoyed the look of the device with the backlight on, and so I wanted to recreate that kind of look for my program. Later, I modified the color scheme to tone it down in some ways, but I also introduced more colors than white and cyan to highlight secondary and tertiary features. Maybe now it is more like Tron than it is like Palm. :)
Overall, I enjoy the darker look of the application when I'm working on music. I tend to work on music when I have free time, and that is usually only late at night—I've found having a darker screen has been easier on my eyes. Also, if anyone was wondering, yes, blue is my favorite color.
The blue look and feel is encapsulated in a module named "blue-plaf" and is available in the "blue" Mercurial repository (http://bluemusic.hg.sourceforge.net/hgweb/bluemusic/blue). The look and feel is quite hacked up (redoing it properly has been another item on my todo list), but it can be dropped into another application and it should work, as shown below with the CRUD Sample (which can be created from a tutorial found here):
Can you explain how blue's timeline works?
blue has a concept of SoundLayers and SoundObjects. SoundObjects are objects that primarily produce notes and have a start and duration. There are many different types of SoundObjects in blue and each has an editor (viewed in the SoundObject Editor TopComponent when a SoundObject is selected), and a BarRenderer, which is used to draw the content area of the bar on the timeline.
A PolyObject is a special SoundObject. It consists of SoundLayers, which contain SoundObjects. The root timeline is itself just a PolyObject that you can add as many layers to as you like. You can also group individual SoundObjects into their own PolyObjects, and then use the resulting PolyObjects just like any other SoundObject on the timeline. If you double-click a PolyObject, the timeline is then reset with the timeline of the PolyObject you selected. As a result, PolyObjects allow timelines to be embedded within other timelines. If you think about how music is grouped into motives, phrases, sections, and even larger groupings, you can see how PolyObjects might represent these kinds of musical abstractions.
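The SoundObject/PolyObject relationship described above is essentially the composite pattern: a PolyObject is itself a SoundObject, so timelines can nest arbitrarily deep. A minimal sketch, using hypothetical stand-ins rather than blue's real classes:

```java
import java.util.ArrayList;
import java.util.List;

// Composite-pattern sketch of the SoundObject/PolyObject idea described above.
// These are illustrative stand-ins, not blue's actual API.
interface SoundObject {
    double getStart();     // start time in beats
    double getDuration();  // duration in beats
}

// A leaf object: just a start and a duration.
class GenericSound implements SoundObject {
    private final double start, duration;
    GenericSound(double start, double duration) { this.start = start; this.duration = duration; }
    public double getStart() { return start; }
    public double getDuration() { return duration; }
}

// A layer holds a sequence of SoundObjects.
class SoundLayer {
    final List<SoundObject> objects = new ArrayList<>();
}

// A PolyObject is itself a SoundObject, so timelines nest within timelines.
class PolyObject implements SoundObject {
    private final double start;
    final List<SoundLayer> layers = new ArrayList<>();
    PolyObject(double start) { this.start = start; }
    public double getStart() { return start; }
    // The composite's duration spans everything it contains.
    public double getDuration() {
        double end = 0.0;
        for (SoundLayer layer : layers)
            for (SoundObject s : layer.objects)
                end = Math.max(end, s.getStart() + s.getDuration());
        return end;
    }
}
```

Because a PolyObject satisfies the same interface as any other SoundObject, grouping a phrase into a PolyObject and placing it on a parent timeline requires no special handling.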
For the component design, the ScoreTopComponent starts off with a JSplitPane to split between a SoundLayerListPanel on the left and a JScrollPane on the right. The JScrollPane has a ScoreTimeCanvas (the main timeline) as the main viewport's view, a panel with the time bar and tempo editor in the column header, and the corner is used to open an extra panel for modifying properties of the timeline. The JScrollPane has customized JScrollBars that add the ± buttons used to zoom the timeline.
There are a number of other features implemented across a number of classes, but the details of how viewports are synchronized (among other things) may be a bit too technical to discuss here. For those who are interested, the code can be viewed in the blue.ui.core.score package within the blue-ui-core module.
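The layout just described can be reduced to a small Swing skeleton. This is a sketch only; the panel classes are stand-ins for blue's real components:

```java
import javax.swing.JButton;
import javax.swing.JPanel;
import javax.swing.JScrollPane;
import javax.swing.JSplitPane;

// Skeleton of the ScoreTopComponent layout described above: a split pane with
// the layer list on the left and a scroll pane on the right, whose column
// header holds the time bar/tempo editor and whose corner opens a properties
// panel. Class names are illustrative stand-ins for blue's real components.
public class ScoreLayoutSketch {
    public static JSplitPane build() {
        JPanel soundLayerList = new JPanel();   // stands in for SoundLayerListPanel
        JPanel timeCanvas = new JPanel();       // stands in for ScoreTimeCanvas
        JPanel timeBarAndTempo = new JPanel();  // time bar + tempo editor

        JScrollPane scroll = new JScrollPane(timeCanvas);
        scroll.setColumnHeaderView(timeBarAndTempo);
        scroll.setCorner(JScrollPane.UPPER_RIGHT_CORNER,
                new JButton("+"));              // e.g. opens timeline properties

        return new JSplitPane(JSplitPane.HORIZONTAL_SPLIT, soundLayerList, scroll);
    }
}
```

The column header is the key trick here: JScrollPane scrolls it horizontally in sync with the viewport, which is exactly what a time bar above a timeline needs.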
How did blue come to find itself atop the NetBeans Platform?
I first started to be interested in NetBeans IDE around the time 4.1 came out, but didn't really get into using it until the release of 5.0. At that time, I had hand-written Swing components for about 4-5 years (I don't really remember when 5.0 was released), and I found Matisse to be quite nice and began using it here and there. I had looked at the NetBeans Platform as an RCP at that time, but found it to be quite a bit to understand. However, I still kept it on my radar.
Around the time 6.0 or 6.5 came out, I started to reconsider migrating to the NetBeans Platform once again. By this time, I had moved over to using NetBeans IDE full time for blue development and had been using NetBeans IDE more in general—particularly Java Web development and Ruby on Rails. One of the biggest things I found attractive about NetBeans IDE is its windowing system... and the things I read about in the platform development articles I'd seen online made me curious once again to see what the NetBeans Platform offered. I still felt that there was going to be a big learning curve to learn the NetBeans Platform, but the NetBeans Platform tutorials online were really quite helpful, as were the members of the NetBeans Platform mailing list, and there were also many more books available to help me get started.
I think I ultimately spent about 6-8 months migrating blue to using the NetBeans Platform. Granted, it was a busy time in my life and I was working on this only in my spare time, so I think in the end it was a reasonable amount of time. Users have been very positive about the new blue interface and application as a whole, and I think it has been worth spending the time to use the NetBeans Platform.
blue's window layout is quite unexpected for a NetBeans Platform application.
By the time I had started migrating blue to the NetBeans Platform, the application was already some 7-8 years old. The interface I designed for blue in pure Swing was influenced by my experiences in using Flash, looking at other music composition environments (Digital Audio Workstations and Sequencing Programs), and evaluating the different aspects of working with Csound.
Mapping the components from the Swing-based application to the NetBeans Platform was a little tricky in that I couldn't quite get the exact same design of panels as I had in pure Swing. In the end, I tried to think about where most of the components resided physically, and created TopComponents and placed them in the center, left, right, or bottom parts of the main window. I kept some of the dialogs from the old codebase as-is, but I migrated others to be TopComponents so that they could be docked, opened, or dragged out into a dialog as the user wished. In the end, the GUI is different and took a little getting used to after years of using and building the old interface, but I quickly adjusted to the changes and I think there is much greater consistency and usability now. The users have responded very positively to the general polish of the application and to being able to customize their environment. I myself have very much enjoyed being able to dock all of the windows as well as using full-screen mode, especially when I am on my netbook and composing. Excellent!
What features of the NetBeans Platform are you using and what do you find to be most useful?
Currently, I am using only a very small part of the NetBeans Platform. By the time I started to move my code to the NetBeans Platform, the codebase was already some 7 or 8 years old. I took the approach recommended to me on the mailing list and started off small, focusing primarily on migrating my project to using the Windows System API, the Options API, and a few other utility API's like IO and Dialogs.
Having an old codebase, I found that I spent most of my time during migration just reorganizing my UI into TopComponents and working out communications between the components. I also spent time looking at API's that I had developed myself and seeing which ones could be replaced with API's provided by the NetBeans Platform. At this time, the application is still using a number of API's I wrote from the old codebase, but over time I would like to migrate more of the application to use the Nodes and Visual Library API's.
I think migrating a codebase of this size in phases really worked out well. In the first phase, I was able to take advantage of the Window System API, which had a very visible effect on the application and gained a lot in usability. Also, a big part of the migration involved moving the codebase from a monolithic source tree and partitioning it into logical modules. I think there really is a great deal of benefit to working with a codebase with a modular design, and that too is a very positive result of working with the NetBeans Platform.
Please say something about how Jython relates to this application, how you are using it (what the benefits are), and your general opinions on Jython.
I have had a Python SoundObject in blue for quite some time—I think since 2002. For me, it is one of the most important tools in blue when it comes to accomplishing what I want musically. With computer music, we have a lot of tools for what I call Common Practice computer music: PianoRolls, Pattern Editors, and Notation Objects. For computer musicians who are interested in Uncommon Practice music, the ability to use a scripting language opens up a number of ways to express musical ideas that cannot be easily conveyed using those other tools.
In blue, Jython is primarily used to allow users to write scripts that will generate notes. For myself, I use Python scripts to model orchestral composition, creating Performer and PerformerGroup objects that I write in Python. I also write performance functions, usually per-project, to perform different musical material in different ways. Other users have used Python scripts in exploring things like algorithmic composition and genetic algorithms in their work.
A blue project can contain any number of Python objects. The score generated by each Python object is translated and scaled in time by moving and resizing the SoundObject in the timeline. This allows a user who may want to use scripting to create musical material to also take advantage of blue's timeline to organize how the different musical objects will work together.
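The translate-and-scale step can be pictured with a little arithmetic: however a script generates its notes, the resulting block is mapped onto the timeline by stretching it to the SoundObject's duration and shifting it to the SoundObject's start. A sketch under assumed names, not blue's actual processing code:

```java
// Sketch of translating/scaling a generated score block onto the timeline.
// A note here is just (start, duration) in beats, relative to the generated
// block; names are illustrative, not blue's API.
public class ScoreScaling {
    /**
     * Map a note from block-relative time onto the timeline: stretch the
     * block to the SoundObject's duration, then shift to its start time.
     * Returns { timelineStart, timelineDuration }.
     */
    public static double[] place(double noteStart, double noteDur,
                                 double blockDur,
                                 double objStart, double objDur) {
        double scale = objDur / blockDur;
        return new double[] { objStart + noteStart * scale, noteDur * scale };
    }
}
```

For example, a note at beat 1 of a 4-beat block, placed in a SoundObject that starts at beat 10 and lasts 8 beats, ends up starting at beat 12 with its duration doubled.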
One of the things I most appreciate about Jython (and scripting languages on the JVM in general) is that it is embeddable within a Java application. By packaging and embedding the Jython interpreter within the blue application, users can rest assured that the Python scripts they write can be interpreted anywhere that blue is installed. It's an extra assurance that their musical projects will be long-lasting, but they can still take advantage of a full programming language like Python in their work.
Overall, I think that Jython is a fine piece of software and I hope that it will continue to grow and develop for years to come.
Is the application open source and are you looking for code contributions and, if so, in which areas?
Yes, the application is available under the GPL v2 license, and the source code can be viewed from the Mercurial repository on SourceForge at http://bluemusic.hg.sourceforge.net/hgweb/bluemusic/blue/. I am a strong proponent of open source, especially for creative work. In the same way that we can today look at and study musical works by composers of the past (like Josquin and Bach), I would like to imagine that the work composers and other artists are now creating with computers will also be open and available for study in the future. I believe that using open source software for creative work greatly helps in making musical projects available for the years ahead.
I have done most of the development of blue myself, and over the years I've certainly built up a long list of things that I would like to implement. Users have also made wonderful feature requests that I would love to see in the program—but unfortunately, there are only so many hours in the day. It would certainly be nice to have others contributing code!
Beyond new features, there are a number of infrastructural things that would be nice to address. The codebase is many years old, and while the application has been refactored multiple times over its lifetime, there are still some areas of the application that could be much more cleanly implemented. Also, in moving over to the NetBeans Platform, I only really took the first steps. There are a number of components within the application that could probably be better served by migrating to using more of the NetBeans API's.
For internal work, things like modifying the timeline's zooming to use Graphics2D and transforms, implementing a better waveform renderer for audio files, and further enhancing the instrument GUI builder are all things I'd like to see. I'd also love to get help in migrating all of the tables and trees to using the Nodes API, something that I have not yet had the time to do. It would also be nice to get the manual (currently in HTML and PDF, generated from DocBook) integrated into the application as JavaHelp, but this is another thing that I have had to postpone due to lack of time.
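The Graphics2D zooming mentioned here would roughly amount to painting the timeline in musical coordinates through a scaling transform, so that changing the zoom only changes a scale factor instead of recomputing pixel positions by hand. A hypothetical sketch, not code from blue:

```java
import java.awt.Graphics;
import java.awt.Graphics2D;
import java.awt.geom.AffineTransform;
import javax.swing.JPanel;

// Sketch of zooming via Graphics2D transforms, as suggested above.
// Painting happens in "beat" coordinates; the transform maps beats to
// pixels, so zooming only changes the scale. Illustrative, not blue's code.
public class ZoomableTimeline extends JPanel {
    private double pixelsPerBeat = 40.0;

    public void setZoom(double pixelsPerBeat) {
        this.pixelsPerBeat = pixelsPerBeat;
        revalidate();
        repaint();
    }

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        Graphics2D g2 = (Graphics2D) g.create();
        // Scale the x axis from beats to pixels; y stays in pixels.
        g2.transform(AffineTransform.getScaleInstance(pixelsPerBeat, 1.0));
        // Draw in beat units: a 4-beat bar starting at beat 2, at y = 10.
        g2.fillRect(2, 10, 4, 20);
        g2.dispose();
    }
}
```

One caveat with this approach is that stroke widths and text are scaled too, so real code would typically scale only positions and widths, or reset the transform before drawing labels.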
For features, some interesting things I'd love to see are a Notation SoundObject, a separate graphical instrument builder using the Visual Library API, and a Sampler instrument. There's also a sound drawing SoundObject, enhancements to existing SoundObjects, and more I'd love to see moving forward. Maybe someone will find these kinds of things interesting and will take a look at blue's code sometime!
Thanks Steven and happy music making with blue!