java · elasticsearch · spring-data-elasticsearch

How to provide Field type for HashMap of temporal accessors in Spring Data Elasticsearch


Given

private Map<String, ZonedDateTime> timePoints = new HashMap<>();

How can I hint the type of this field to Spring Data?

When the annotation is put directly on the map, the converter tries to parse the key and the value together as if they were a date string.

@Field(type = FieldType.Date)
private Map<String, ZonedDateTime> timePoints = new HashMap<>();

When no field type is provided, the following error appears:

Type .. of property .. is a TemporalAccessor class but has neither a @Field annotation defining the date type nor a registered converter for writing! It will be mapped to a complex object in Elasticsearch!

Solution

  • Putting the annotation on the property like

    @Field(type = FieldType.Date)
    private Map<String, ZonedDateTime> timePoints = new HashMap<>();
    

    cannot work, because a Map is not a temporal type and thus cannot be converted as such.

    If you leave the annotation off, the Map<String, ZonedDateTime> will be interpreted as an object. If, for example, you have

    Map<String, ZonedDateTime> map = new HashMap<>();
    map.put("utc", ZonedDateTime.of(LocalDateTime.now(), ZoneId.of("UTC")));
    map.put("paris", ZonedDateTime.of(LocalDateTime.now(), ZoneId.of("Europe/Paris")));
    

    When this object is then stored with Spring Data Elasticsearch, it will try to create an object (the JSON representation) to be sent to Elasticsearch that looks like this:

    {
      "utc": {
        
      },
      "paris": {
        
      }
    }
    

    The inner objects that should represent the temporals are stored as nested objects and not as some converted value, because it is not possible to add a field type to the values of a map; you see the warnings about this in the logs.

    But using a Map as a property in Elasticsearch is problematic anyway. The keys are interpreted as properties of a sub-object, and it is not possible to define a mapping for these types in the index beforehand, because it is not known what names the properties can have. In my example they were "utc" and "paris", but they could be any String. Each of these values will be added by Elasticsearch as a dynamically mapped field to the index. This can lead to what is called a mapping explosion, which is why Elasticsearch limits the number of fields in an index to a default of 1000. You might rethink the way you store the data in Elasticsearch.
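One way to restructure the data (an illustrative sketch only, not part of the original answer; the class and field names are made up) is to store the entries as a list of objects with fixed field names, so the index mapping stays static no matter which keys appear:

```java
import java.time.ZonedDateTime;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Each map entry becomes an object with the FIXED field names "label" and
// "value". In the entity you would then annotate the list with
// @Field(type = FieldType.Nested) and the value with
// @Field(type = FieldType.Date) so both can be mapped up front.
public class TimePointEntry {
    public final String label;        // e.g. "utc" or "paris"
    public final ZonedDateTime value;

    public TimePointEntry(String label, ZonedDateTime value) {
        this.label = label;
        this.value = value;
    }

    // Turn the dynamic map into a list with a static structure.
    public static List<TimePointEntry> fromMap(Map<String, ZonedDateTime> map) {
        List<TimePointEntry> entries = new ArrayList<>();
        map.forEach((key, time) -> entries.add(new TimePointEntry(key, time)));
        return entries;
    }
}
```

With this shape, "utc" and "paris" become values of the keyword field "label" instead of dynamically mapped property names, so the field count in the index no longer grows with the data.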

    If you want to stick to a Map, you will need to write a custom converter that is able to convert your Map<String, ZonedDateTime> to a Map<String, String> and back.
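The core of such a converter pair could look roughly like this (a sketch of the conversion logic only, using ISO-8601 strings; the class name is made up, and in a real application you would wrap each direction in a Spring Converter and register both through ElasticsearchCustomConversions):

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.HashMap;
import java.util.Map;

// Converts Map<String, ZonedDateTime> to Map<String, String> and back,
// so the values reach Elasticsearch as plain strings instead of
// unconvertible nested objects.
public final class TimePointsConverter {

    private static final DateTimeFormatter FORMAT =
            DateTimeFormatter.ISO_ZONED_DATE_TIME;

    // Writing direction: format each ZonedDateTime as an ISO-8601 string.
    public static Map<String, String> write(Map<String, ZonedDateTime> source) {
        Map<String, String> target = new HashMap<>();
        source.forEach((key, time) -> target.put(key, FORMAT.format(time)));
        return target;
    }

    // Reading direction: parse each string back into a ZonedDateTime.
    public static Map<String, ZonedDateTime> read(Map<String, String> source) {
        Map<String, ZonedDateTime> target = new HashMap<>();
        source.forEach((key, text) ->
                target.put(key, ZonedDateTime.parse(text, FORMAT)));
        return target;
    }
}
```

Note that this still leaves the map keys as dynamically mapped fields in the index; the conversion only fixes the date handling, not the mapping-explosion concern described above.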