Managing Heavy Resources on the NetBeans Platform

If you have a bit of experience with the NetBeans Platform, you know that one of the basic classes in its APIs is DataObject. It models a datum represented by one (or more) files: it offers links to the file system, plus other facilities for being rendered and selected in different kinds of views.

DataObject is a sort of abstract object, so you normally subclass it for your needs. For instance, one of the very first classes that I wrote in blueMarine when I ported it to the NetBeans Platform was PhotoDataObject, which extends DataObject and acts as an adapter to Image I/O.

Even if subclassing is fine for providing alternate implementations, inheritance is evil if you have direct dependencies on subclasses! Instead, DataObject provides a very simple and effective delegation mechanism by means of Lookup, another basic class in the NetBeans Platform APIs. Lookup is a generic container of objects, and every DataObject instance has one: just declare what your objects can do as interfaces ('capabilities') and put their implementations into the Lookup.
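The idea can be sketched even without the NetBeans APIs at hand. The following is a minimal, hypothetical stand-in for Lookup (the real org.openide.util.Lookup is much richer, with listeners, multiple results and dynamic content, and the MetadataProvider shown here is a simplified version, not the actual blueMarine interface):

```java
import java.util.HashMap;
import java.util.Map;

// A capability declared as an interface: 'what the object can do'.
interface MetadataProvider
  {
    String getMetadata();
  }

// Minimal stand-in for the Lookup idea: a type-safe, heterogeneous
// container where capabilities are registered and retrieved by interface.
class SimpleLookup
  {
    private final Map<Class<?>, Object> capabilities = new HashMap<Class<?>, Object>();

    public <T> void put (final Class<T> type, final T implementation)
      {
        capabilities.put(type, implementation);
      }

    public <T> T lookup (final Class<T> type)
      {
        return type.cast(capabilities.get(type)); // null if not registered
      }
  }
```

The point is that client code depends only on the capability interface and on the container, never on a concrete DataObject subclass.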

For instance, the following code is used by the component that manages thumbnails to extract the image that will be scaled down and turned into a thumbnail:

DataObject dataObject = ...
PreviewImageProvider previewImageProvider = dataObject.getLookup().lookup(PreviewImageProvider.class);
EditableImage image = previewImageProvider.getImage();

Similar code retrieves the metadata of an object (e.g. EXIF for a photo):

DataObject dataObject = ...
MetadataProvider metadataProvider = dataObject.getLookup().lookup(MetadataProvider.class);
Object metadata = metadataProvider.getMetadata();

The same code works for photos, PDF files and movies (and could, for instance, work with audio files, where the thumbnail could be the waveform, etc.).

This is a very powerful approach. After the Latest Big Refactoring held in August, blueMarine basically no longer depends on PhotoDataObject (which, in fact, has been turned into a closed API, that is, it's not published to the whole application). This means that blueMarine is able to work with every kind of DataObject you provide - in less than one hour, for instance, I've added to the incubator the support for PDF files (thanks to pdf-renderer by Joshua Marinacci), and I also have a prototype for supporting movies, which is just waiting for Sun to release the new portable codecs announced at JavaOne.

As often happens, pretty good designs might suffer from performance problems. In my case, you have to consider that my data objects are usually large (photos) or very large (movies); load-on-demand is a good idea here (that's why there is an explicit load() method in the previous interfaces). But this just reduces the problem without completely solving it: when you have many photos loaded at the same time, memory runs out very quickly. Which policy can one choose for releasing resources?

Of course, working with Java, the Garbage Collector comes to mind. Apparently, it's just a matter of using weak references, right? You keep only weak references to internal resources, so they will be automatically cleaned up when you no longer need them. You could imagine an asynchronous way of using the APIs like this:

PreviewImageProvider previewImageProvider = ...
EditableImage image = previewImageProvider.getImage();

if (image == null)
  {
    image = previewImageProvider.load(IMAGE);
  }

When you write client code, you just have to remember that resources are not always available, and you may have to ask for them multiple times.
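This "ask again" idiom can be factored into a tiny helper. The following is a sketch under the assumption that getImage() returns null once the resource has been collected, while load() re-reads it from disk (the names and signatures are illustrative, not the actual blueMarine ones):

```java
// Hypothetical, simplified capability: getImage() may return null when the
// underlying resource has been released; load() fetches it again.
interface ImageSource<T>
  {
    T getImage();
    T load();
  }

class Images
  {
    // Encapsulates the retry: use the cached resource if still alive,
    // otherwise trigger a (possibly expensive) reload from disk.
    public static <T> T getOrLoad (final ImageSource<T> source)
      {
        final T image = source.getImage();
        return (image != null) ? image : source.load();
      }
  }
```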

Unfortunately, simple load-on-demand with weak references is not enough. The strong decoupling that you have appreciated by defining different interfaces for different capabilities (PreviewImageProvider and MetadataProvider) means that you will be asking for different capabilities in different parts of the application, which are not aware of each other (also consider that, in order to exploit parallelism, most blueMarine tasks are performed by Master/Workers, and this further fragments the access to resources). It's extremely likely that resources are disposed of between the operations of PreviewImageProvider and MetadataProvider, leading to multiple disk accesses - consider that it's cheaper to load both metadata and the image in a single pass.

The solution that I've found effective so far is a simple utility class that acts in the first part of its life as a strong reference and later as a weak one:

import java.lang.ref.WeakReference;
import org.openide.util.RequestProcessor;

class DisposableResource<T>
  {
    private static final int DELAY = 10 * 1000;

    private WeakReference<T> resource;

    private T keeper;

    public void set (final T object)
      {
        keeper = object;
        resource = (object == null) ? null : new WeakReference<T>(object);

        RequestProcessor.getDefault().post(new Runnable()
          {
            public void run()
              {
                keeper = null; // from now on, only the weak reference survives
              }
          }, DELAY);
      }

    public T get()
      {
        return (resource == null) ? null : resource.get();
      }
  }
RequestProcessor comes from the NetBeans APIs and allows me to execute a task after a delay, by efficiently scheduling it on a thread from an internal pool; so the code above should hopefully be very efficient even with tens of thousands of instances. By the way, the delay is currently very conservative, while a few seconds should suffice - but I still have to run some load tests on it.
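Outside the NetBeans Platform, the same trick can be reproduced with java.util.concurrent. The following is a self-contained sketch (the names are mine, ScheduledExecutorService plays the role of RequestProcessor, and the delay is made configurable so it can be exercised in tests):

```java
import java.lang.ref.WeakReference;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Same idea as above, without NetBeans dependencies: a strong reference is
// kept for a configurable delay, after which only the weak one survives.
class TimedResource<T>
  {
    private static final ScheduledExecutorService SCHEDULER =
        Executors.newSingleThreadScheduledExecutor(runnable ->
          {
            final Thread thread = new Thread(runnable, "TimedResource-reaper");
            thread.setDaemon(true); // don't keep the JVM alive
            return thread;
          });

    private final long delayMillis;
    private volatile WeakReference<T> resource;
    private volatile T keeper; // strong reference, dropped after the delay

    public TimedResource (final long delayMillis)
      {
        this.delayMillis = delayMillis;
      }

    public void set (final T object)
      {
        keeper = object;
        resource = (object == null) ? null : new WeakReference<T>(object);
        SCHEDULER.schedule(() -> keeper = null, delayMillis, TimeUnit.MILLISECONDS);
      }

    public T get()
      {
        final WeakReference<T> reference = resource;
        return (reference == null) ? null : reference.get();
      }
  }
```

Note that set() and get() are intentionally lock-free: in the worst case get() returns null shortly after the delay fires, and the caller simply reloads - which is exactly the contract described above.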

