
Elasticsearch "pattern_replace", replacing whitespaces while analyzing


An analyzer processes a string by first tokenizing it and then applying a series of token filters. You specified the standard tokenizer, which means the input is already split into separate tokens before the pattern_replace filter ever runs; the filter is then applied to each token individually, so by that point there is no whitespace left between tokens for it to remove.
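The order of operations can be sketched in plain Python (a simplified stand-in for the real Elasticsearch analysis chain, not actual client code): the tokenizer runs first, and each filter only ever sees individual tokens.

```python
import re

def analyze(text, tokenizer, filters):
    # Tokenize first, then run each token filter over every token --
    # the same order Elasticsearch applies them in.
    tokens = tokenizer(text)
    for f in filters:
        tokens = [f(t) for t in tokens]
    return tokens

standard = lambda text: text.split()   # rough stand-in for the standard tokenizer
keyword = lambda text: [text]          # keyword tokenizer: one token, the whole input
lower = lambda tok: tok.lower()
remove_ws = lambda tok: re.sub(" ", "", tok)  # the pattern_replace filter

# With the standard tokenizer the whitespace is already gone by the time
# the filter runs, so there is nothing for pattern_replace to do:
print(analyze("Foo Bar", standard, [lower, remove_ws]))  # ['foo', 'bar']

# With the keyword tokenizer the whole string reaches the filter intact:
print(analyze("Foo Bar", keyword, [lower, remove_ws]))   # ['foobar']
```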

Use the keyword tokenizer instead of the standard tokenizer; the rest of the mapping is fine. You can change your mapping as below:

"settings": {
  "index": {
    "analysis": {
      "filter": {
        "whitespace_remove": {
          "type": "pattern_replace",
          "pattern": " ",
          "replacement": ""
        }
      },
      "analyzer": {
        "meliuz_analyzer": {
          "filter": [
            "lowercase",
            "whitespace_remove",
            "nGram"
          ],
          "type": "custom",
          "tokenizer": "keyword"
        }
      }
    }
  }
}
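You can verify the result with the _analyze API (the exact request shape varies by Elasticsearch version; on versions that accept a JSON body, something like the following, where the index name is yours). With the keyword tokenizer, "Foo Bar" should produce n-grams of "foobar" rather than separate n-grams of "foo" and "bar":

{
  "analyzer": "meliuz_analyzer",
  "text": "Foo Bar"
}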