Tags: java, hazelcast

Hazelcast map synchronization


I am trying to implement a distributed cache with Hazelcast in my application, using Hazelcast's IMap. The problem I have is that every time I get a value from the map and update it, I need to call put(key, value) again. If my value object has 10 properties and I have to update all 10, then I end up calling put(key, value) 10 times. Something like this:

IMap<Integer, Employee> mapEmployees = hz.getMap("employees");
Employee emp1 = mapEmployees.get(100);
emp1.setAge(30);
mapEmployees.put(100, emp1); // write back so other nodes see the change
emp1.setSex("F");
mapEmployees.put(100, emp1); // and again...
emp1.setSalary(5000);
mapEmployees.put(100, emp1); // ...and again

If I don't do it this way, some other node operating on the same Employee object may update it, and the end result is that the Employee object is out of sync. Is there any way to avoid calling put explicitly multiple times? With a ConcurrentHashMap I don't need to do this, because the map holds a reference to my object, so any change to the object is immediately visible through the map.
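
To illustrate, here is roughly what I observe (assuming Employee also has a getAge() getter, which I have omitted above):

Employee emp1 = mapEmployees.get(100);
emp1.setAge(31); // only my local copy changes
Employee emp2 = mapEmployees.get(100);
System.out.println(emp2.getAge()); // still 30: the map was never updated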


Solution

  • As of version 3.3 you'll want to use an EntryProcessor:

    What you really want to do here is build an EntryProcessor<Integer, Employee> and call it using mapEmployees.executeOnKey(100, new EmployeeUpdateEntryProcessor(new ObjectContainingUpdatedFields(30, "F", 5000)));

    This way, Hazelcast handles locking the map on the key for that Employee object and allows you to run whatever code is in the EntryProcessor's process() method atomically, including updating values in the map.

    So you'd implement EntryProcessor with a custom constructor that takes an object containing all of the properties you want to update; then, in process(), you construct the final Employee object that will end up in the map and call entry.setValue() (see the sketch after the source link below). Don't forget to create a StreamSerializer for the EmployeeUpdateEntryProcessor that can serialize Employee objects, so that you don't get stuck with slow java.io serialization.

    Source: http://docs.hazelcast.org/docs/3.5/manual/html/entryprocessor.html
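
    A minimal sketch of what that could look like, assuming Hazelcast 3.x's AbstractEntryProcessor base class (which also takes care of the backup processor) and a hypothetical EmployeeUpdate holder standing in for the ObjectContainingUpdatedFields above; Employee itself is assumed to be serializable:

    import java.io.Serializable;
    import java.util.Map;
    import com.hazelcast.map.AbstractEntryProcessor;

    // Hypothetical holder for the fields being updated
    // (the ObjectContainingUpdatedFields from the snippet above).
    class EmployeeUpdate implements Serializable {
        final int age;
        final String sex;
        final int salary;

        EmployeeUpdate(int age, String sex, int salary) {
            this.age = age;
            this.sex = sex;
            this.salary = salary;
        }
    }

    class EmployeeUpdateEntryProcessor extends AbstractEntryProcessor<Integer, Employee> {

        private final EmployeeUpdate update;

        EmployeeUpdateEntryProcessor(EmployeeUpdate update) {
            this.update = update;
        }

        @Override
        public Object process(Map.Entry<Integer, Employee> entry) {
            // Runs atomically on the cluster member that owns the key;
            // no other operation can touch this entry while process() executes.
            Employee employee = entry.getValue();
            employee.setAge(update.age);
            employee.setSex(update.sex);
            employee.setSalary(update.salary);
            entry.setValue(employee); // one write back instead of several put() calls
            return null;
        }
    }

    Calling it then collapses the repeated put() calls from the question into a single atomic operation:

    IMap<Integer, Employee> mapEmployees = hz.getMap("employees");
    mapEmployees.executeOnKey(100, new EmployeeUpdateEntryProcessor(new EmployeeUpdate(30, "F", 5000)));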