Tokenisierung

In text mining or full-text search, the process of identifying meaningful units within strings (at word boundaries, morphemes, or stems) so that related tokens can be grouped. For example, although "San Francisco" is two words, it can be treated as a single token.
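As an illustration (not part of the original entry), a minimal Python sketch of this idea: tokens are split at word boundaries, but a hypothetical list of multi-word expressions such as "San Francisco" is matched first so each stays a single token.

```python
import re

# Hypothetical list of multi-word expressions to keep as single tokens
MULTI_WORD_TOKENS = ["San Francisco", "New York"]

# Try known multi-word expressions first, then fall back to single words
_PATTERN = re.compile(
    "|".join(re.escape(m) for m in MULTI_WORD_TOKENS) + r"|\w+"
)

def tokenize(text: str) -> list[str]:
    """Split text at word boundaries, keeping multi-word expressions
    such as "San Francisco" as single tokens."""
    return _PATTERN.findall(text)

print(tokenize("I flew to San Francisco last week."))
# ['I', 'flew', 'to', 'San Francisco', 'last', 'week']
```

Real systems typically go further, normalizing tokens to stems or morphemes so that related forms can be grouped for indexing or search.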
