Currently I've been defining the tokenizer and analyzer in the settings/mappings at index creation. Would it be possible to just define the tokenizer on a class property attribute and let AutoMap do the work?
An analyzer can be defined on a TextAttribute applied to a string property. A tokenizer is one component of an analyzer, so it doesn't make sense to apply a tokenizer with a mapping attribute outside the context of an analyzer.
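As a minimal sketch (assuming NEST 6.x; the analyzer name "my_ngram_analyzer" is just illustrative), the attribute only references the analyzer by name:

```csharp
using Nest;

public class BlogPost
{
    // References the analyzer by name only; "my_ngram_analyzer" must be
    // defined in the analysis settings of the index this type is mapped into.
    [Text(Analyzer = "my_ngram_analyzer")]
    public string Body { get; set; }
}
```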
A tokenizer has to be defined in the index in which it will be used, so it is supplied at index creation time or when updating the index settings. The important bit is that the analysis settings and mappings in the index in Elasticsearch match what is defined on your POCO in your application. You might implement some logic that retrieves the index settings and mappings on startup, compares them against the analysis settings and mappings defined in the application, and takes some action if they are different.
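A sketch of how the two pieces fit together (again assuming NEST 6.x syntax; the index, analyzer, and tokenizer names are illustrative): the tokenizer and the custom analyzer that uses it are defined in the index settings at creation time, and AutoMap picks up the attribute on the POCO.

```csharp
var settings = new ConnectionSettings(new Uri("http://localhost:9200"));
var client = new ElasticClient(settings);

var createIndexResponse = client.CreateIndex("blog-posts", c => c
    .Settings(s => s
        .Analysis(a => a
            .Tokenizers(t => t
                // tokenizer used by the custom analyzer below
                .NGram("my_ngram_tokenizer", ng => ng
                    .MinGram(3)
                    .MaxGram(4)))
            .Analyzers(an => an
                // the analyzer name referenced by [Text(Analyzer = ...)] on the POCO
                .Custom("my_ngram_analyzer", ca => ca
                    .Tokenizer("my_ngram_tokenizer")
                    .Filters("lowercase")))))
    .Mappings(m => m
        .Map<BlogPost>(mm => mm.AutoMap())));
```

For the startup check described above, the get index settings and get mapping APIs (for example, client.GetIndexSettings(...) and client.GetMapping&lt;BlogPost&gt;(...)) could be used to retrieve what the index actually contains and compare it with what the application expects.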