I am trying out semantic search in Elasticsearch, following this tutorial.
When I copy the documents of one index into another index (reindexing) with the following command:
POST _reindex?wait_for_completion=false
{
  "source": {
    "index": "collection"
  },
  "dest": {
    "index": "collection-with-embeddings",
    "pipeline": "text-embeddings"
  }
}
some of the documents are missing from the new index, and I cannot figure out why.
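For example, the gap shows up when comparing the document counts of the source and destination indices:
GET collection/_count
GET collection-with-embeddings/_count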
For context, this is the ingest pipeline:
PUT _ingest/pipeline/text-embeddings
{
  "description": "Text embedding pipeline",
  "processors": [
    {
      "inference": {
        "model_id": "sentence-transformers__msmarco-minilm-l-12-v3",
        "target_field": "text_embedding",
        "field_map": {
          "text": "text_field"
        }
      }
    }
  ],
  "on_failure": [
    {
      "set": {
        "description": "Index document to 'failed-<index>'",
        "field": "_index",
        "value": "failed-{{{_index}}}"
      }
    },
    {
      "set": {
        "description": "Set error message",
        "field": "ingest.failure",
        "value": "{{_ingest.on_failure_message}}"
      }
    }
  ]
}
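As a side note, the pipeline can also be tested on its own with the simulate API before reindexing (the sample document below is made up):
POST _ingest/pipeline/text-embeddings/_simulate
{
  "docs": [
    {
      "_source": {
        "text": "A short test sentence."
      }
    }
  ]
}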
These are the task details:
{
  "completed": true,
  "task": {
    "node": "YgR8udaSSMqClwCGWOBGBw",
    "id": 5946104,
    "type": "transport",
    "action": "indices:data/write/reindex",
    "status": {
      "total": 2414,
      "updated": 1346,
      "created": 1068,
      "deleted": 0,
      "batches": 3,
      "version_conflicts": 0,
      "noops": 0,
      "retries": {
        "bulk": 0,
        "search": 0
      },
      "throttled_millis": 0,
      "requests_per_second": -1.0,
      "throttled_until_millis": 0
    },
    "description": "reindex from [source_index] to [destination_index]",
    "start_time_in_millis": 1680795982705,
    "running_time_in_nanos": 22702121635,
    "cancellable": true,
    "cancelled": false,
    "headers": {}
  },
  "response": {
    "took": 22699,
    "timed_out": false,
    "total": 2414,
    "updated": 1346,
    "created": 1068,
    "deleted": 0,
    "batches": 3,
    "version_conflicts": 0,
    "noops": 0,
    "retries": {
      "bulk": 0,
      "search": 0
    },
    "throttled": "0s",
    "throttled_millis": 0,
    "requests_per_second": -1.0,
    "throttled_until": "0s",
    "throttled_until_millis": 0,
    "failures": []
  }
}
My data is different, but the configuration is similar. Around 75% of the data was not copied.
I am using the sentence-transformers__msmarco-minilm-l-12-v3 model in Elasticsearch.
Any help?
You probably don't have enough processing power for the inference processor, and as a result some documents land in the failed-collection-with-embeddings index, with the reason recorded in the ingest.failure field.
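You can inspect those failure reasons directly, for example (the index name follows the failed-{{{_index}}} pattern of your on_failure handler):
GET failed-collection-with-embeddings/_search
{
  "size": 10,
  "_source": ["ingest.failure"]
}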
What you can do is use smaller batches (by specifying a smaller size in source) or throttle the requests.
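For example, a sketch of your reindex call with a smaller batch size and throttling applied (the values 100 and 50 are just starting points to tune):
POST _reindex?wait_for_completion=false&requests_per_second=50
{
  "source": {
    "index": "collection",
    "size": 100
  },
  "dest": {
    "index": "collection-with-embeddings",
    "pipeline": "text-embeddings"
  }
}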