I'm trying to recover a closed index, but every attempt so far has been unsuccessful.
GET /_cat/indices/
shows:
green open .fess_config.web_config m7yOnyvERje6baMyvi5IHA 2 0 1 0 10kb 10kb
close .suggest_analyzer fjKXYnKHSQOeVsqf_ETTVA
green open .fess_user.group jyu086_5QUKmx9N64EHsRQ 5 0 0 0 1.2kb 1.2kb
GET /_cluster/health?level=indices&pretty
doesn't contain fjKXYnKHSQOeVsqf_ETTVA
in its output at all.
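A closed index is excluded from the health report, but its metadata should still be present in the cluster state. As a sanity check, the cluster state API can be filtered down to just that index; the sketch below only prints the request it would send (the localhost URL is an assumption for a default single-node setup):

```shell
#!/bin/sh
# A closed index drops out of /_cluster/health, but its metadata stays in
# the cluster state; this request would show that ES still tracks the index.
ES_URL="${ES_URL:-http://localhost:9200}"
INDEX=".suggest_analyzer"
STATE_CMD="curl -s ${ES_URL}/_cluster/state/metadata/${INDEX}?pretty"
echo "$STATE_CMD"
# Against a live node, uncomment to actually send it:
# $STATE_CMD
```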
There is also no mention whatsoever¹ of fjKXYnKHSQOeVsqf_ETTVA
in the startup log.
GET /.suggest_analyzer/fjKXYnKHSQOeVsqf_ETTVA/_recovery?human
and
GET /.suggest_analyzer/fjKXYnKHSQOeVsqf_ETTVA/_open
both result in:
{
  "error": {
    "root_cause": [
      {
        "type": "index_closed_exception",
        "reason": "closed",
        "index_uuid": "fjKXYnKHSQOeVsqf_ETTVA",
        "index": ".suggest_analyzer"
      }
    ],
    "type": "index_closed_exception",
    "reason": "closed",
    "index_uuid": "fjKXYnKHSQOeVsqf_ETTVA",
    "index": ".suggest_analyzer"
  },
  "status": 400
}
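Part of the problem here is the HTTP verb: `_open` only accepts POST, so a GET against any path under a closed index just comes back as `index_closed_exception`. A minimal sketch of the correct request, again only printing the command it would run (the host is an assumption for a default local node):

```shell
#!/bin/sh
# _open must be POSTed; GET requests against a closed index fail with
# index_closed_exception regardless of the rest of the path.
ES_URL="${ES_URL:-http://localhost:9200}"
OPEN_CMD="curl -s -XPOST ${ES_URL}/.suggest_analyzer/_open"
echo "$OPEN_CMD"
# Uncomment to send it against a running cluster:
# $OPEN_CMD
```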
GET /_recovery?human
returns that everything is OK, yet there is again no mention of fjKXYnKHSQOeVsqf_ETTVA
in the response.
There are also no *recovery* files on the filesystem:
[root@mmm indices]# ls -lR fjKXYnKHSQOeVsqf_ETTVA
fjKXYnKHSQOeVsqf_ETTVA:
drwxr-xr-x 5 elasticsearch elasticsearch 4096 Mar 19 11:33 0
drwxr-xr-x 5 elasticsearch elasticsearch 4096 Mar 19 11:33 1
drwxr-xr-x 5 elasticsearch elasticsearch 4096 Mar 19 11:33 2
drwxr-xr-x 5 elasticsearch elasticsearch 4096 Mar 19 11:33 3
drwxr-xr-x 5 elasticsearch elasticsearch 4096 Mar 19 11:33 4
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 _state
fjKXYnKHSQOeVsqf_ETTVA/0:
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 index
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 _state
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 translog
fjKXYnKHSQOeVsqf_ETTVA/0/index:
-rw-r--r-- 1 elasticsearch elasticsearch 230 Mar 16 11:03 segments_1
-rw-r--r-- 1 elasticsearch elasticsearch 0 Mar 16 11:03 write.lock
fjKXYnKHSQOeVsqf_ETTVA/0/_state:
-rw-r--r-- 1 elasticsearch elasticsearch 125 Mar 16 11:03 state-0.st
fjKXYnKHSQOeVsqf_ETTVA/0/translog:
-rw-r--r-- 1 elasticsearch elasticsearch 43 Mar 16 11:03 translog-1.tlog
-rw-r--r-- 1 elasticsearch elasticsearch 80 Mar 16 11:03 translog.ckp
fjKXYnKHSQOeVsqf_ETTVA/1:
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 index
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 _state
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 translog
fjKXYnKHSQOeVsqf_ETTVA/1/index:
-rw-r--r-- 1 elasticsearch elasticsearch 230 Mar 16 11:03 segments_1
-rw-r--r-- 1 elasticsearch elasticsearch 0 Mar 16 11:03 write.lock
fjKXYnKHSQOeVsqf_ETTVA/1/_state:
-rw-r--r-- 1 elasticsearch elasticsearch 125 Mar 16 11:03 state-0.st
fjKXYnKHSQOeVsqf_ETTVA/1/translog:
-rw-r--r-- 1 elasticsearch elasticsearch 43 Mar 16 11:03 translog-1.tlog
-rw-r--r-- 1 elasticsearch elasticsearch 80 Mar 16 11:03 translog.ckp
fjKXYnKHSQOeVsqf_ETTVA/2:
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 index
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 _state
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 translog
fjKXYnKHSQOeVsqf_ETTVA/2/index:
-rw-r--r-- 1 elasticsearch elasticsearch 230 Mar 16 11:03 segments_1
-rw-r--r-- 1 elasticsearch elasticsearch 0 Mar 16 11:03 write.lock
fjKXYnKHSQOeVsqf_ETTVA/2/_state:
-rw-r--r-- 1 elasticsearch elasticsearch 125 Mar 16 11:03 state-0.st
fjKXYnKHSQOeVsqf_ETTVA/2/translog:
-rw-r--r-- 1 elasticsearch elasticsearch 43 Mar 16 11:03 translog-1.tlog
-rw-r--r-- 1 elasticsearch elasticsearch 80 Mar 16 11:03 translog.ckp
fjKXYnKHSQOeVsqf_ETTVA/3:
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 index
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 _state
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 translog
fjKXYnKHSQOeVsqf_ETTVA/3/index:
-rw-r--r-- 1 elasticsearch elasticsearch 230 Mar 16 11:03 segments_1
-rw-r--r-- 1 elasticsearch elasticsearch 0 Mar 16 11:03 write.lock
fjKXYnKHSQOeVsqf_ETTVA/3/_state:
-rw-r--r-- 1 elasticsearch elasticsearch 125 Mar 16 11:03 state-0.st
fjKXYnKHSQOeVsqf_ETTVA/3/translog:
-rw-r--r-- 1 elasticsearch elasticsearch 43 Mar 16 11:03 translog-1.tlog
-rw-r--r-- 1 elasticsearch elasticsearch 80 Mar 16 11:03 translog.ckp
fjKXYnKHSQOeVsqf_ETTVA/4:
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 index
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 _state
drwxr-xr-x 2 elasticsearch elasticsearch 4096 Mar 19 11:33 translog
fjKXYnKHSQOeVsqf_ETTVA/4/index:
-rw-r--r-- 1 elasticsearch elasticsearch 230 Mar 16 11:03 segments_1
-rw-r--r-- 1 elasticsearch elasticsearch 0 Mar 16 11:03 write.lock
fjKXYnKHSQOeVsqf_ETTVA/4/_state:
-rw-r--r-- 1 elasticsearch elasticsearch 125 Mar 16 11:03 state-0.st
fjKXYnKHSQOeVsqf_ETTVA/4/translog:
-rw-r--r-- 1 elasticsearch elasticsearch 43 Mar 16 11:03 translog-1.tlog
-rw-r--r-- 1 elasticsearch elasticsearch 80 Mar 16 11:03 translog.ckp
fjKXYnKHSQOeVsqf_ETTVA/_state:
-rw-r--r-- 1 elasticsearch elasticsearch 71958 Mar 19 11:33 state-7.st
[root@mmm indices]#
So how do I force ES to open this index?
¹ The recovery of the index failed twice due to a missing file, but that issue has already been resolved, and now ES won't even attempt to recover/open it.
Solved with:
POST /_all/_open
with an empty JSON body {}:
$ curl -XPOST "http://localhost:9200/_all/_open" -H "Content-Type: application/json" -d "{}"
{
"acknowledged": true,
"shards_acknowledged": true
}
GET /_cat/indices/
green open .fess_config.web_config m7yOnyvERje6baMyvi5IHA 2 0 1 0 10kb 10kb
green open .suggest_analyzer fjKXYnKHSQOeVsqf_ETTVA 5 0 0 0 1.2kb 1.2kb
green open .fess_user.group jyu086_5QUKmx9N64EHsRQ 5 0 0 0 1.2kb 1.2kb
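Note that `POST /_all/_open` reopens every closed index in the cluster at once. After reopening, the per-index health API can block until the index is actually allocated again; the sketch below prints such a request (host and timeout are assumptions):

```shell
#!/bin/sh
# Block until the reopened index reaches at least yellow health, so a
# follow-up _cat/indices check reflects the final state.
ES_URL="${ES_URL:-http://localhost:9200}"
INDEX=".suggest_analyzer"
HEALTH_CMD="curl -s ${ES_URL}/_cluster/health/${INDEX}?wait_for_status=yellow&timeout=30s"
echo "$HEALTH_CMD"
# Run against a live cluster:
# $HEALTH_CMD
```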