Tokenizers
Tokenizers are used to break a string down into a stream of terms or tokens. A simple tokenizer might split the string up into terms wherever it encounters whitespace or punctuation.
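For example, the whitespace tokenizer splits text only on whitespace, leaving punctuation attached to terms. You can inspect a tokenizer's output directly with the _analyze API; the sketch below assumes the request-body form of that API, which differs in older releases:

POST _analyze
{
  "tokenizer": "whitespace",
  "text": "The quick brown fox."
}

This request would produce the tokens [The, quick, brown, fox.], showing that the whitespace tokenizer does not strip the trailing period.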
Elasticsearch has a number of built-in tokenizers which can be used to build custom analyzers.
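A built-in tokenizer is wired into a custom analyzer through the analysis section of the index settings. The following is a minimal sketch, where the index name my_index and the analyzer name my_analyzer are illustrative; it combines the whitespace tokenizer with the lowercase token filter:

PUT my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": {
          "type": "custom",
          "tokenizer": "whitespace",
          "filter": [ "lowercase" ]
        }
      }
    }
  }
}

Fields mapped to use my_analyzer would then be split on whitespace and lowercased at both index and search time.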