Simple task to test the performance of tokenizers. It simply
creates a token stream for each field of the document and
reads all tokens out of that stream.
Relevant properties: doc.tokenize.log.step.
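A minimal sketch of what "create a token stream per field and read all tokens out of it" looks like with the Lucene analysis API (this is illustrative, not the task's actual implementation; the field name and text are hypothetical):

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class TokenizeFieldExample {
    public static void main(String[] args) throws Exception {
        Analyzer analyzer = new StandardAnalyzer();
        String fieldName = "body";                      // hypothetical field name
        String fieldValue = "Simple task to test performance of tokenizers.";
        long tokenCount = 0;
        // One token stream per field value; read it to exhaustion.
        try (TokenStream stream = analyzer.tokenStream(fieldName, fieldValue)) {
            CharTermAttribute term = stream.addAttribute(CharTermAttribute.class);
            stream.reset();                             // required before incrementToken()
            while (stream.incrementToken()) {           // pull every token out of the stream
                tokenCount++;
            }
            stream.end();
        }
        System.out.println("read " + tokenCount + " tokens from field '" + fieldName + "'");
        analyzer.close();
    }
}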
Field Summary
public static final int DEFAULT_DOC_LOG_STEP
    Default value for property doc.tokenize.log.step, indicating how often an "added N docs / M tokens" message should be logged.