
Package: r-cran-tokenizers (0.3.0-1) [debports]

GNU R fast, consistent tokenization of natural language text

Convert natural language text into tokens. Includes tokenizers for shingled n-grams, skip n-grams, words, word stems, sentences, paragraphs, characters, shingled characters, lines, tweets, Penn Treebank, regular expressions, as well as functions for counting characters, words, and sentences, and a function for splitting longer texts into separate documents, each with the same number of words. The tokenizers have a consistent interface, and the package is built on the 'stringi' and 'Rcpp' packages for fast yet correct tokenization in 'UTF-8'.
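As a brief illustration of the consistent interface described above, here is a minimal R sketch using a few of the listed tokenizers. It assumes the package is installed (for example via this Debian package, or install.packages("tokenizers") from CRAN); the sample text is invented for the example, and each tokenize_* function returns a list with one character vector per input document.

    library(tokenizers)

    text <- "Tokenization splits text into units. Words, sentences, and n-grams are common choices."

    tokenize_words(text)              # word tokens, lower-cased by default
    tokenize_sentences(text)          # one element per sentence
    tokenize_ngrams(text, n = 2)      # shingled bigrams ("tokenization splits", ...)
    count_words(text)                 # number of words per input document
    chunk_text(text, chunk_size = 5)  # split into documents of about 5 words each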

Download r-cran-tokenizers

Download for all available architectures
Architecture             Package Size   Installed Size   Files
hppa (unofficial port)   643.2 kB       858.0 kB         [list of files]