How can I write a generic solution for inserting and selecting any type of Java object to and from a JSON or JSONB PostgreSQL database column?
Additionally, I have to use (inject) my Jackson ObjectMapper, because it's a pre-configured Spring bean.
I tried to solve it with a custom ConverterProvider, and it works fine on the select side, but my converter provider isn't called when I do an insert. Did I miss something?
I checked the jOOQ documentation and some Stack Overflow articles about the Binding solution, but I don't want to specify the exact userType. So I cannot register a Binding, because I cannot declare the exact userType.
My goal is this: I would like to eliminate the need for manual string serialization of Java objects in the repository classes for every JSON and JSONB field (and the need to inject my ObjectMapper into every repository). I don't want to write the following line for each JSONB field; jOOQ should do it under the hood with my JSONConverterProvider:
.set(myJSONBTableField, JSONB.jsonb(objectMapper.writeValueAsString(myJavaObjectToStore)))
Instead, I would like to simply use:
.set(myJSONBTableField, myJavaObjectToStore)
I would like jOOQ to store any type of Java object with the help of my custom ObjectMapper bean whenever the DB column type is JSON or JSONB. My JSONConverterProvider is called in the read phase, but not in the write phase. The JSONConverterProvider has a converter for the reverse (write) direction too, but jOOQ doesn't use it.
The relevant part of my jOOQ configuration:
@Bean
public DefaultConfigurationCustomizer ziffitJooqCustomizer(JSONConverterProvider converterProvider) {
    return configuration -> configuration
        .set(converterProvider)
        .settings()...
My custom converter provider:
@ConditionalOnClass(DSLContext.class)
@Component
public class JSONConverterProvider implements ConverterProvider {

    private final ConverterProvider delegate = new DefaultConverterProvider();
    private final ObjectMapper mapper;

    public JSONConverterProvider(ObjectMapper mapper) {
        this.mapper = mapper;
    }

    @Override
    public <T, U> Converter<T, U> provide(Class<T> dbClass, Class<U> entityClass) {
        if (dbClass == JSON.class) {
            return getJSONConverter(dbClass, entityClass);
        } else if (dbClass == JSONB.class) {
            return getJSONBConverter(dbClass, entityClass);
        } else {
            // Delegate all other type pairs to jOOQ's default
            return delegate.provide(dbClass, entityClass);
        }
    }

    private <T, U> Converter<T, U> getJSONConverter(Class<T> dbClass, Class<U> entityClass) {
        return Converter.ofNullable(dbClass, entityClass,
            t -> {
                try {
                    return mapper.readValue(((JSON) t).data(), entityClass);
                } catch (Exception e) {
                    throw new DataTypeException("JSON deserialization error", e);
                }
            },
            u -> {
                try {
                    return (T) JSON.valueOf(mapper.writeValueAsString(u));
                } catch (Exception e) {
                    throw new DataTypeException("JSON serialization error", e);
                }
            }
        );
    }

    private <T, U> Converter<T, U> getJSONBConverter(Class<T> dbClass, Class<U> entityClass) {
        return Converter.ofNullable(dbClass, entityClass,
            t -> {
                try {
                    return mapper.readValue(((JSONB) t).data(), entityClass);
                } catch (Exception e) {
                    throw new DataTypeException("JSONB deserialization error", e);
                }
            },
            u -> {
                try {
                    return (T) JSONB.valueOf(mapper.writeValueAsString(u));
                } catch (Exception e) {
                    throw new DataTypeException("JSONB serialization error", e);
                }
            }
        );
    }
}
I'm using the read query like this:
dslCtx
.select(myJSONBTableField)
.from(myTable)
.where(...)
.fetchOptionalInto(myClazz);
This works fine and calls my JSONConverterProvider.
But my write query doesn't call it. I would like to write my insert like this:
dslCtx
.insertInto(myTable)
.set(myJSONBTableField, myJavaObjectToStore)
.execute();
Why the ConverterProvider didn't work for you

The ConverterProvider SPI (along with the RecordMapperProvider and RecordUnmapperProvider SPIs) overrides the behaviour of all reflection based auto-data type conversion capabilities on a per-column / member match basis, which mostly affects the Record::from and Record::into methods, and related methods on other types.
Notice that if you project your field without passing a class literal, you would get a Record1<JSONB> record, and you wouldn't expect that to use your JSONConverterProvider. As such, it doesn't affect any of the otherwise type safe methods, such as set() from INSERT or UPDATE, nor select(...).fetch().
Instead, do attach a Converter<JSONB, Object> or better (to avoid problems like this) a Converter<JSONB, MyMarkerInterface> converter to all of your JSONB columns, which should be mapped using your generic JSON conversion logic. I think the problem you had is that you didn't want to attach any specific Java class to your JSONB columns, as the concrete class can depend on the JSONB content. But you don't have to! jOOQ doesn't care what the U type in your Converter<T, U> is. It is only used for type safety, so any MyMarkerInterface type is fine.
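A minimal sketch of what such a generic converter could look like. All names here (MyMarkerInterface, JsonbObjectConverter) are hypothetical, and two details are assumptions beyond the answer itself: Jackson's @JsonTypeInfo is used so the read side can recover the concrete class behind the marker interface, and the ObjectMapper is held in a static field, since a converter attached through code generation must be instantiable without Spring injection (the field could be wired to your pre-configured bean at startup).

```java
import org.jooq.Converter;
import org.jooq.JSONB;
import org.jooq.exception.DataTypeException;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.databind.ObjectMapper;

// Hypothetical marker interface; @JsonTypeInfo embeds the concrete class name
// in the JSON so Jackson can deserialize back to it (an assumption, one of
// several possible strategies).
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS)
interface MyMarkerInterface {}

// Generic converter for all JSONB columns. The static mapper handle is a
// workaround for the no-args instantiation required by the code generator;
// replace it with your Spring-managed ObjectMapper during bootstrap.
public class JsonbObjectConverter implements Converter<JSONB, MyMarkerInterface> {

    static ObjectMapper mapper = new ObjectMapper();

    @Override
    public MyMarkerInterface from(JSONB db) {
        if (db == null) return null;
        try {
            return mapper.readValue(db.data(), MyMarkerInterface.class);
        } catch (Exception e) {
            throw new DataTypeException("JSONB deserialization error", e);
        }
    }

    @Override
    public JSONB to(MyMarkerInterface user) {
        if (user == null) return null;
        try {
            return JSONB.valueOf(mapper.writeValueAsString(user));
        } catch (Exception e) {
            throw new DataTypeException("JSONB serialization error", e);
        }
    }

    @Override
    public Class<JSONB> fromType() {
        return JSONB.class;
    }

    @Override
    public Class<MyMarkerInterface> toType() {
        return MyMarkerInterface.class;
    }
}
```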
With that approach, you can still use the fetchOptionalInto(myClazz) call, which will just implement an unsafe downcast from MyMarkerInterface to MyClazz.
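To attach such a converter to every JSONB column, a forcedType entry in the code generation configuration is the usual mechanism. A sketch, assuming a standalone Converter<JSONB, MyMarkerInterface> implementation exists somewhere on the generator's classpath (the package and class names below are hypothetical placeholders):

```xml
<!-- jOOQ codegen config fragment (e.g. in pom.xml or jooq-config.xml).
     Every column whose data type matches JSONB is generated as
     TableField<R, MyMarkerInterface>, so set(field, value) converts
     on the write path, too. -->
<forcedTypes>
  <forcedType>
    <userType>com.example.MyMarkerInterface</userType>
    <converter>com.example.JsonbObjectConverter</converter>
    <includeTypes>JSONB</includeTypes>
  </forcedType>
</forcedTypes>
```

With the forced type in place, the desired write syntax from the question works as-is, because the generated field's type is the marker interface rather than JSONB.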