A collection with billions of entries

Holding a large number of records in memory as objects causes a number of problems. One way around this is to use direct memory, but that is too low-level for most developers. Is there a way to make this approach more friendly?

Limitations of large numbers of objects

  • The overhead per object is between 12 and 16 bytes on 64-bit JVMs. If the object is relatively small, this is significant and can exceed the size of the data itself.
  • The GC pause time increases with the number of objects. Pause times can be around one second per GB of objects.
  • Collections and arrays only support about two billion elements, as they are indexed by int.
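As a rough back-of-the-envelope illustration of the first point (assuming a 16-byte footprint per small object, the top of the range above, and ignoring the references a collection would also need to hold):

```java
public class OverheadEstimate {
    public static void main(String[] args) {
        long bytesPerObject = 16;        // header plus alignment padding on a 64-bit JVM
        long payloadBytes = 1;           // a single byte of actual data per object
        long count = 16_000_000_000L;    // the element count used in the example below

        long totalGB = bytesPerObject * count / 1_000_000_000L;
        long usefulGB = payloadBytes * count / 1_000_000_000L;
        System.out.println("Heap needed for the objects: ~" + totalGB + " GB");
        System.out.println("Useful data: ~" + usefulGB + " GB");
    }
}
```

So roughly 256 GB of heap would carry only 16 GB of useful data, before counting the references to those objects.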

Huge collections

One way to store more data and still follow object-oriented principles is to have wrappers for direct ByteBuffers. These can be tedious to write, but very efficient.

What would be ideal is to have these wrappers generated automatically.
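For illustration, here is a sketch of what such a hand-written wrapper might look like. The class and method names are hypothetical, not the library's actual API; the point is that the data lives in a direct ByteBuffer, outside the heap, so the GC never sees it:

```java
import java.nio.ByteBuffer;

// A hand-written accessor over a direct ByteBuffer -- the kind of
// wrapper the library generates automatically.
class DirectByteStore {
    private final ByteBuffer buffer;

    DirectByteStore(int capacity) {
        // Direct memory: allocated outside the heap, invisible to the GC.
        this.buffer = ByteBuffer.allocateDirect(capacity);
    }

    void setByte(int index, byte b) {
        buffer.put(index, b);
    }

    byte getByte(int index) {
        return buffer.get(index);
    }
}

public class WrapperDemo {
    public static void main(String[] args) {
        DirectByteStore store = new DirectByteStore(1024);
        store.setByte(42, (byte) 7);
        System.out.println(store.getByte(42)); // prints 7
    }
}
```

A real implementation would also need to manage a list of buffers (each ByteBuffer is itself limited to two billion bytes) and map long indexes onto them, which is exactly the tedium worth generating.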

Small JavaBean Example

This is an example of a JavaBean which would have far more overhead than the actual data it contains.
interface MutableByte {
    void setByte(byte b);

    byte getByte();
}

It is also small enough that I can create billions of these on my machine. This example creates a List<MutableByte> with 16 billion elements.

final long length = 16_000_000_000L;
HugeArrayList<MutableByte> hugeList = new HugeArrayBuilder<MutableByte>() {{
    allocationSize = 4 * 1024 * 1024;
    capacity = length;
}}.create();

List<MutableByte> list = hugeList;
assertEquals(0, list.size());

hugeList.setSize(length);

// add a GC to see what the GC times are like.
System.gc();

assertEquals(Integer.MAX_VALUE, list.size());
assertEquals(length, hugeList.longSize());

byte b = 0;
for (MutableByte mb : list)
    mb.setByte(b++);

b = 0;
for (MutableByte mb : list) {
    byte b2 = mb.getByte();
    byte expected = b++;
    if (b2 != expected)
        assertEquals(expected, b2);
}
From start to finish, the heap memory used is as follows (with -verbose:gc):
0 sec - 3100 KB used
[GC 9671K->1520K(370496K), 0.0020330 secs]
[Full GC 1520K->1407K(370496K), 0.0063500 secs]
10 sec - 3885 KB used
20 sec - 4428 KB used
30 sec - 4428 KB used
  ... deleted ...
1380 sec - 4475 KB used
1390 sec - 4476 KB used
1400 sec - 4476 KB used
1410 sec - 4476 KB used
The only GC is the one triggered explicitly; without the System.gc() call, no GC logs appear at all.

The small increase in memory used after 20 sec comes from the logging of memory usage itself.

Conclusion

The library is relatively slow. Each get or set takes about 40 ns, which really adds up over billions of calls. I plan to work on it so it is much faster. ;)
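A quick back-of-the-envelope check, taking the ~40 ns figure at face value, shows why the test above runs for so long:

```java
public class ThroughputEstimate {
    public static void main(String[] args) {
        long elements = 16_000_000_000L;
        long nsPerAccess = 40;   // the per-call figure quoted above

        long secondsPerPass = elements * nsPerAccess / 1_000_000_000L;
        System.out.println("One full pass: ~" + secondsPerPass + " s");
        // The example does one write pass and one read pass.
        System.out.println("Write + read pass: ~" + 2 * secondsPerPass + " s");
    }
}
```

That gives about 640 seconds per pass, or roughly 1280 seconds for the write and read passes together, which lines up with the ~1400 seconds shown in the memory log.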

On the upside, it wouldn't be possible to create 16 billion objects with the memory I have, nor could they be stored in an ArrayList, so being a little slow is still better than not working at all.

 

From http://vanillajava.blogspot.com/2011/08/collection-with-billions-of-entries.html
