Tags: json, postgresql, jpa, jdbc, eclipselink

Are JPA (EclipseLink) custom types possible?


In particular, I am interested in using PostgreSQL's json type.

The core of the problem seems to be that there is no internal mapping in EclipseLink for the json type. So, using a naive approach with:

@Column(name = "json", columnDefinition = "json")
public String getJson() {
    return json;
}

... and trying to insert an object, I get an exception:

Internal Exception: org.postgresql.util.PSQLException: ERROR: column "json" is of type json but expression is of type character varying

Fair enough I suppose.

Looking through the EclipseLink docs, it seems that the applicable customizations (Transformation Mappings, Native Queries, Converters) rely on the data being made up of the supported mappings (numbers, dates, strings, etc.), which makes it quite awkward to work with vendor-specific types.

The main reason this is so frustrating is that the json type in PostgreSQL is expressed the same way as text/varchar and, I believe (at the moment, but not forever), is essentially an alias of that type. The driver is therefore more than capable of transmitting the value; it is only type validation getting in my way.
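
For example, inserting through plain JDBC succeeds as soon as the parameter is cast explicitly, which illustrates that only the type check stands in the way. This is a minimal sketch; the connection details, some_table and json_col are hypothetical:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class JsonCastDemo {
    public static void main(String[] args) throws Exception {
        // placeholder connection details
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
            // the explicit ::json cast lets the server accept a varchar parameter
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO some_table (json_col) VALUES (?::json)")) {
                ps.setString(1, "{\"key\": \"value\"}");
                ps.executeUpdate();
            }
        }
    }
}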

In terms of the solution, I don't mind losing portability (in terms of being database agnostic, by using vendor-specific types); I just want a solution that allows me to use a json type as an attribute on a normal JPA entity and retain all the other behavior it is accustomed to (schema generation, merge, persist, transactional code).


Solution

  • Walking through SO, I've found many questions like this regarding mapping JSON or XML types into Postgres. It looks like nobody has faced the problem of reading from a custom Postgres type, so here is a solution for both reading and writing using the pure JPA type conversion mechanism.

    The Postgres JDBC driver maps all attributes of types unknown to Java into an org.postgresql.util.PGobject object, so it is enough to write a converter for this type. Here is an entity example:

    @Entity
    public class Course extends AbstractEntity {
        @Column(name = "course_mapped", columnDefinition = "json")
        @Convert(converter = CourseMappedConverter.class)
        private CourseMapped courseMapped;  // no idea why you would use a String json field instead of a mapped object
    
        // getters and setters
    }
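
    CourseMapped itself is not shown in the answer; it can be any POJO that Jackson can serialize. A minimal sketch, with purely illustrative fields:

    import java.util.List;

    public class CourseMapped {
        private String name;        // illustrative field
        private List<String> tags;  // illustrative field

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }

        public List<String> getTags() { return tags; }
        public void setTags(List<String> tags) { this.tags = tags; }
    }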
    

    Here is the converter example:

    import java.io.IOException;
    import java.sql.SQLException;

    import javax.persistence.AttributeConverter;
    import javax.persistence.Converter;

    import org.postgresql.util.PGobject;

    import com.fasterxml.jackson.core.JsonProcessingException;
    import com.fasterxml.jackson.databind.ObjectMapper;

    @Converter
    public class CourseMappedConverter implements AttributeConverter<CourseMapped, PGobject> {
        @Override
        public PGobject convertToDatabaseColumn(CourseMapped courseMapped) {
            try {
                PGobject po = new PGobject();
                // tell Postgres to treat the value as json
                po.setType("json");
                // Jackson is already a project dependency; any JSON marshaller would do
                po.setValue((new ObjectMapper()).writeValueAsString(courseMapped));
                return po;
            } catch (JsonProcessingException e) {
                e.printStackTrace();
                return null;
            } catch (SQLException e) {
                e.printStackTrace();
                return null;
            }
        }

        @Override
        public CourseMapped convertToEntityAttribute(PGobject po) {
            try {
                return (new ObjectMapper()).readValue(po.getValue(), CourseMapped.class);
            } catch (IOException e) {
                e.printStackTrace();
                return null;
            }
        }
    }
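
    With the converter registered via @Convert, the entity persists and loads like any other. A rough usage sketch, assuming a configured persistence unit (named "my-pu" here), a courseMapped setter on Course, and a getId() inherited from AbstractEntity:

    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;

    public class CourseJsonDemo {
        public static void main(String[] args) {
            // "my-pu" is a placeholder persistence unit name
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("my-pu");
            EntityManager em = emf.createEntityManager();

            em.getTransaction().begin();
            CourseMapped mapped = new CourseMapped();
            mapped.setName("JPA with the Postgres json type");

            Course course = new Course();
            course.setCourseMapped(mapped);    // setter assumed to exist on Course

            em.persist(course);                // convertToDatabaseColumn writes a PGobject of type "json"
            em.getTransaction().commit();

            // reading goes back through convertToEntityAttribute
            Course reloaded = em.find(Course.class, course.getId());  // getId() assumed from AbstractEntity
            em.close();
            emf.close();
        }
    }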
    

    If you really need to stick to a String JSON representation in your entity, you can make a converter like this for the String type:

    implements AttributeConverter<String, PGobject>
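
    A minimal sketch of that variant (same imports as the converter above; the class name StringJsonConverter is just illustrative) simply wraps and unwraps the raw JSON string:

    @Converter
    public class StringJsonConverter implements AttributeConverter<String, PGobject> {
        @Override
        public PGobject convertToDatabaseColumn(String json) {
            try {
                PGobject po = new PGobject();
                po.setType("json");   // hand the raw string to Postgres as json
                po.setValue(json);
                return po;
            } catch (SQLException e) {
                e.printStackTrace();
                return null;
            }
        }

        @Override
        public String convertToEntityAttribute(PGobject po) {
            // PGobject carries the textual value of the json column
            return po == null ? null : po.getValue();
        }
    }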
    

    Here is a very dirty (though working) proof of concept; it also uses fake object serialization to tell JPA that the object has changed when it has:

    https://github.com/sasa7812/psql-cache-evict-POC