I have the following sqlmodel base class and table models but I can't manage to make my Decimal
field optional while keeping the constraints.
PS: I know this has already been discussed for pydantic, but that solution does not seem to work with sqlmodel.
Here is what I tried:

1. Constraints passed directly to the sqlmodel Field, which raises ValueError: Unknown constraint max_digits:

class ItemBase(SQLModel):
    # ...
    price: Optional[Decimal] = Field(
        default=None, max_digits=7, decimal_places=4, ge=0, le=100
    )
    # ...

class ItemTable(ItemBase, table=True):
    __tablename__ = "item"
    # ...
2. The same constraints on an Annotated type, which raises TypeError: issubclass() arg 1 must be a class:

class ItemBase(SQLModel):
    # ...
    price: Optional[Annotated[Decimal, Field(
        default=None, max_digits=7, decimal_places=4, ge=0, le=100
    )]] = None
    # ...
3. Dropping the max_digits and decimal_places constraints (that's okay because they really matter mostly on the database side) and defining the column through sa_column:

class ItemBase(SQLModel):
    price: Optional[Decimal] = Field(
        default=None, ge=0, le=100, sa_column=Column(DECIMAL(7, 4)),
        regex=r"^\d+(\.\d+)?$"  # added for the string part to only accept numbers
    )
With that last attempt, the le and ge constraints work properly in Python:
ItemTable(..., price=Decimal(123.45)) # => throws ValidationError
ItemTable(..., price=Decimal(34.12)) # => works fine
But now my problem is that the JSON schema generated for that field drops all my constraints and becomes only:

anyOf: [{type: number}, {type: string}, {type: null}]

This is problematic because I then use the JSON schema to create fake objects (with the jsf library) for my tests, and without the constraints my tests all fail...
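For context, this is roughly how the schema feeds the fakes; a minimal sketch assuming the jsf package's JSF class rather than my exact test setup:

from jsf import JSF

schema = ItemBase.model_json_schema()  # 'price' only gets anyOf: number/string/null here
faker = JSF(schema)
fake_item = faker.generate()           # so the generated 'price' is unconstrained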
Edit with an answer:
Thanks to @Jeremy I figured out that the error stated in point 2 only occurs for table=True models. So I ended up doing the following:
class ItemBase(SQLModel):
# ...
price: Optional[Annotated[Decimal, Field(
multiple_of=0.0001, ge=0, le=100
)]] = None
# ...
class ItemTable(ItemBase, table=True):
    price: Optional[Decimal] = Field(
        default=None, ge=0, le=100, sa_column=Column(DECIMAL(7, 4))
    )
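For reference, a self-contained version of that final setup, plus a quick check that the column definition reaches the generated DDL (the in-memory SQLite engine is only an example, not part of my real code):

from decimal import Decimal
from typing import Annotated, Optional

from sqlalchemy import DECIMAL, Column, create_engine
from sqlmodel import Field, SQLModel

class ItemBase(SQLModel):
    # validation-side constraints live on the non-table base model
    price: Optional[Annotated[Decimal, Field(multiple_of=0.0001, ge=0, le=100)]] = None

class ItemTable(ItemBase, table=True):
    __tablename__ = "item"
    # database-side precision/scale live on the sa_column override
    price: Optional[Decimal] = Field(
        default=None, ge=0, le=100, sa_column=Column(DECIMAL(7, 4))
    )

engine = create_engine("sqlite://", echo=True)
SQLModel.metadata.create_all(engine)  # the emitted CREATE TABLE should show price DECIMAL(7, 4)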
In order to add a pattern constraint to the string form of the decimal JSON schema, I had to subclass pydantic's GenerateJsonSchema:
from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue
from pydantic_core import core_schema

class MySchemaGenerator(GenerateJsonSchema):
def decimal_schema(self, schema: core_schema.DecimalSchema) -> JsonSchemaValue:
json_schema = self.str_schema(core_schema.str_schema(pattern=r"^[-+]?\d+((?:\.\d+)|(?:[eE][-+]?\d+))?$"))
... # the rest of the method is unchanged
ItemBase.model_json_schema(schema_generator=MySchemaGenerator)
"""returns for 'price' property:
{'price': {'anyOf': [{'maximum': 100.0,
'minimum': 0.0,
'multipleOf': 0.0001,
'type': 'number'},
{'pattern': '^[-+]?\\d+((?:\\.\\d+)|(?:[eE][-+]?\\d+))?$',
'type': 'string'},
{'type': 'null'}],
'default': None,
'title': 'Price'}
"""
But this way I can't specify the pattern directly on a field, so all Decimal fields end up with the same pattern. If someone has an idea on how to pass a regex from the field definition down to that decimal_schema method, I would gladly take it!
Try this:

- 0 is equivalent to default = 0
- multiple_of gives you the precision requirement as a divisor
- le gives you the maximum integer value

class ItemBase(SQLModel):
    price: Optional[Annotated[Decimal, Field(0, multiple_of=0.0001, le=1000000)]]
It should produce a schema something like this:
{
"type": [
"object",
"null"
],
"properties": {
"price": {
"type": "number",
"maximum": 1000000,
"multipleOf": 0.0001
}
}
}
This will validate a number of up to 7 digits and, if a decimal value is provided, should validate up to four (4) decimal places, e.g. 456123.1235.
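A minimal, self-contained sketch for checking what your installed pydantic/sqlmodel versions actually emit (class and field names just mirror the question):

from decimal import Decimal
from typing import Annotated, Optional

from sqlmodel import Field, SQLModel

class ItemBase(SQLModel):
    price: Optional[Annotated[Decimal, Field(0, multiple_of=0.0001, le=1000000)]]

# Inspect the 'price' property for maximum/multipleOf before wiring the schema into tests.
print(ItemBase.model_json_schema()["properties"]["price"])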
If that still fails, there are quite a few SO answers indicating the typing-extensions package may need to be upgraded:
pip install --force-reinstall typing-extensions==4.5.0
I haven't dug deeper than that, but I wonder if pydantic is having some difficulty generating the JSON Schema due to conflicting keywords between the declared type and the constraints, since something like max_digits is roughly equivalent to maxLength, which is a constraint for type: string values.