Tags: elasticsearch, logstash, kibana, faceted-search, logstash-grok

No data is being parsed + exception in elasticsearch logs



First of all, excuse me if I sound like a total newbie, as I'm not the owner of this service (yet).

We're using the ELK stack (Elasticsearch 1.4.2, Logstash, and Kibana, with a single shard and no replicas) to parse our logs and show charts in Kibana based on some filters.

In the last two weeks we noticed that, for some reason, no new data is being shown on the charts. No changes were made on the server, other than that it is shut down on a daily basis.

The Elasticsearch indices are created on a daily basis.
Logs are ingested via Redis, and indeed I see activity in Redis and new logs coming in all the time.
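
For reference, a quick way to confirm that the daily indices exist and are still receiving documents is the _cat API; the index pattern below matches the logstash-YYYY.MM.DD names visible in the exception further down:

    curl 'localhost:9200/_cat/indices/logstash-*?v'

The docs.count column should keep growing for today's index if new logs are actually being indexed.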

In logstash.log I saw the following exception:

[2015-10-20 04:39:49,462][DEBUG][action.search.type       ] [shard1] [logstash-2015.10.15][2], node[G5aAjTSEQCiX5JXlmSx8ng], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@25f6008c] lastShard [true]
org.elasticsearch.search.SearchParseException: [logstash-2015.10.15][2]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"terms":{"terms_stats":{"value_field":"acked_contacts_","key_field":"customer_name","size":10,"order":"count"},"facet_filter":{"fquery":{"query":{"filtered":{"query":{"bool":{"should":[{"query_string":{"query":"(action:BULK_RECEIVER) AND ( result:SUCCESS or result:FAILURE OR result:Success OR result:Failed)"}}]}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"from":1444711190919,"to":1445315990919}}}]}}}}}}}},"size":0}]]
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:681)
        at org.elasticsearch.search.SearchService.createContext(SearchService.java:537)
        at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:509)
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:264)
        at org.elasticsearch.search.action.SearchServiceTransportAction$5.call(SearchServiceTransportAction.java:231)
        at org.elasticsearch.search.action.SearchServiceTransportAction$5.call(SearchServiceTransportAction.java:228)
        at org.elasticsearch.search.action.SearchServiceTransportAction$23.run(SearchServiceTransportAction.java:559)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.search.facet.FacetPhaseExecutionException: Facet [terms]: failed to find mapping for acked_contacts_

As I said, I'm not really an expert here, but I've compared the mapping of an index from October 1st with the mapping of today's index, and they are different.
Could this be related to one of the shutdowns? I'm not sure why it has changed (no one is doing any work on the server).
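
For reference, this is a sketch of how the mappings can be pulled for comparison; the logstash-2015.10.15 name is taken from the exception above, and the October 1st name follows the same daily pattern:

    curl 'localhost:9200/logstash-2015.10.01/_mapping?pretty' > mapping_old.json
    curl 'localhost:9200/logstash-2015.10.15/_mapping?pretty' > mapping_new.json
    diff mapping_old.json mapping_new.json

    # Check only the field the facet complains about:
    curl 'localhost:9200/logstash-2015.10.15/_mapping/field/acked_contacts_?pretty'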

Assuming this is the problem, is there a way to:

  1. Restore the old mapping, so that future indices are created with the correct one?
  2. Change the mapping of the existing indices (a shutdown is not an issue) and make the data searchable again?

I'm not sure if I've provided all of the required information; let me know if more is needed.
Thanks in advance,

Meny


Solution

  • You're running a query against an index that has no mapping for a field you're referencing (acked_contacts_, per the exception above).

    You can ignore the unmapped fields in Elasticsearch, or perhaps use exists in your Kibana query, as sketched below.
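
    As a sketch, using exists in the Kibana query bar means adding an _exists_ term to the query string; the field name and the other terms below are taken from the facet query in the exception:

        (action:BULK_RECEIVER) AND _exists_:acked_contacts_ AND (result:SUCCESS OR result:FAILURE OR result:Success OR result:Failed)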
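
    If you also want future daily indices to be created with the old mapping again (your question 1), an index template is the usual mechanism. Below is a minimal sketch, assuming the field is numeric (terms_stats requires a numeric value_field); the template name and the long type are assumptions, so copy the field's actual definition from the October 1st mapping:

        curl -XPUT 'localhost:9200/_template/restore_acked_contacts' -d '{
          "template": "logstash-*",
          "mappings": {
            "_default_": {
              "properties": {
                "acked_contacts_": { "type": "long" }
              }
            }
          }
        }'

    For existing indices (your question 2), the mapping of an already-indexed field can't be changed in place in Elasticsearch 1.x, so the affected days would have to be reindexed.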