I'm working on a Play Framework 2 based prototype where one of the requirements is to filter out duplicate POST requests from clients. Since it's a prototype, I'm trying to hack one out quick and dirty. I have the following data structure and code to keep track of duplicates.
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

public class DuplicityService {

    // Maps a response key to the list of client IPs that have already posted it.
    private static final ConcurrentHashMap<String, List<String>> keyIps =
            new ConcurrentHashMap<String, List<String>>();

    public static boolean isDuplicate(String key, String ip) {
        List<String> ips = keyIps.get(key);
        return ips != null && ips.contains(ip);
    }

    public static void remove(String key) {
        keyIps.remove(key);
    }

    public static void add(String key, String ip) {
        // putIfAbsent closes the check-then-act race where two threads each
        // create a fresh list and one silently overwrites the other's entry.
        List<String> ips = keyIps.get(key);
        if (ips == null) {
            List<String> fresh = new CopyOnWriteArrayList<String>();
            ips = keyIps.putIfAbsent(key, fresh);
            if (ips == null) {
                ips = fresh;
            }
        }
        ips.add(ip);
    }
}
I use this in my controllers as follows:
def submitResponse(qkey: String) = CorsAction(parse.json) { req =>
  val json = req.body
  val key = (json \ "Key").as[String]
  ....
  if (DuplicityService.isDuplicate(key, req.remoteAddress)) {
    ...
    BadRequest("Duplicate Response " + key)
  } else {
    ...
    DuplicityService.add(key, req.remoteAddress)
    Ok(json)
    ...
  }
}
And I remove the key from the concurrent hash map in a separate controller method:
def publish(key: String) = Authenticated {
  ...
  DuplicityService.remove(key)
  ...
}
Now the problem: while testing manually on my local machine it works fine, and I can correctly identify duplicate POST requests from the same IP address.
On Heroku, however, it doesn't work; I can make duplicate POST requests from the same client.
I'm running a basic Play 2 instance on Heroku.
Any pointers or help will be much appreciated.
PS: Without using a database, is there a better way to do what I'm attempting?
Thanks
I suspect Heroku creates more than one JVM instance (or restarts your dyno), either flushing out or duplicating your static singleton, so a duplicate recorded in one instance is invisible to the others.
Using Memcached or an equivalent (but probably not a fully fledged DB, because you're essentially just caching) would be my preference.
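Whichever shared store you pick, make the duplicate check and the insert a single atomic operation; memcached's `add` command has exactly those semantics, since it only succeeds when the key is not already present. Here's a rough in-JVM sketch of the same pattern (the `SeenOnce` class and the key format are hypothetical names of mine, not from your code) using `ConcurrentHashMap.putIfAbsent`:

```java
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical helper: an atomic "have we seen this before?" check,
// the same semantics memcached's add command gives you across processes.
public class SeenOnce {
    private final ConcurrentHashMap<String, Boolean> seen =
            new ConcurrentHashMap<String, Boolean>();

    // Returns true the first time a (key, ip) pair is recorded and
    // false on every later call -- i.e. false means duplicate.
    public boolean firstTime(String key, String ip) {
        return seen.putIfAbsent(key + "|" + ip, Boolean.TRUE) == null;
    }

    public static void main(String[] args) {
        SeenOnce s = new SeenOnce();
        System.out.println(s.firstTime("q1", "1.2.3.4")); // true
        System.out.println(s.firstTime("q1", "1.2.3.4")); // false
    }
}
```

In the controller you would then call `firstTime(key, req.remoteAddress)` once, instead of the separate `isDuplicate`/`add` pair, which is itself a race window between two concurrent requests.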
Alternatively, depending on your application's scalability needs, and to avoid the cache becoming a bottleneck, you might design the application to handle duplicate requests in a lazier, more decentralized manner, in the spirit of eventual consistency: http://en.m.wikipedia.org/wiki/Eventual_consistency