
WordSplitAbs.py Unlicensed work


We have moved to https://codeberg.org/KAbs/WordSplitAbs.py, grab new versions there.

Under the guise of "better security", Micro$oft-owned GitHub has discriminated against users of 1FA passwords while having a commercial interest in the success and wide adoption of the FIDO 1FA specifications and its Windows Hello implementation, which it promotes as a replacement for passwords. This will have dire consequences and is completely unacceptable, read why.

If you don't want to participate in harming yourself, it is recommended to follow suit and migrate away from GitHub and Micro$oft. Here is a list of alternatives and rationales for doing so. If they delete the discussion, there are certain well-known places where you can get a copy of it. Read why you should also leave GitHub.


This is an abstraction layer around Python libraries for splitting words that have been joined without delimiters.

This task is often called word tokenization, but it is a slightly different thing: tokenization is needed when words are naturally not separated (in East Asian languages, for example), while splitting is needed when they are naturally separated but the delimiters have been lost.

Tutorial

from WordSplitAbs import ChosenWordSplitter

s = ChosenWordSplitter()  # A resource-consuming stage: most splitters load a corpus or a semi-preprocessed model here and infer a usable model from it, so you want to call it as rarely as possible.

print(s("wordsplittingisinferenceofconcatenatedwordsboundaries"))  # "word splitting is inference of concatenated words boundaries"
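Because construction is the expensive step, reuse a single splitter instance for all your inputs. A minimal sketch of such reuse, assuming only the ChosenWordSplitter interface shown above:

from WordSplitAbs import ChosenWordSplitter

s = ChosenWordSplitter()  # construct once; the loaded model lives as long as the object

phrases = ["thisisatest", "anotherjoinedphrase"]
for phrase in phrases:
    print(s(phrase))  # reuse the same instance instead of re-creating it per call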

Backends

| Backend | Has default corpus | Deps | Model | Quality | Notes |
|---|---|---|---|---|---|
| instant_segment | | | Unigram + bigram | Recommended | A rewrite of wordsegment in Rust with a large performance boost |
| wordsegment | ✔️ | | Unigram + bigram | Recommended | |
| WordSegmentationDP | | pythonnet + WordSegmentationDP.dll + corpus file | Unigram + Bayes | Recommended | |
| WordSegmentationTM | | pythonnet + WordSegmentationTM.dll + corpus file | Unigram + Bayes | Recommended | |
| SymSpell | | pythonnet + SymSpell.dll + corpus file | Unigram + bigram | Not recommended | Fails to split elementary phrases |
| wordninja | ✔️ | | Unigram order | Not the best quality | |
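For comparison, the recommended wordsegment backend can also be used on its own; WordSplitAbs merely wraps such libraries behind a uniform interface. A minimal sketch using wordsegment's own API (not part of WordSplitAbs):

from wordsegment import load, segment

load()  # loads the bundled unigram/bigram corpus; this is the expensive step
print(segment("wordsplittingisinferenceofconcatenatedwordsboundaries"))
# returns a list of words, e.g. ['word', 'splitting', 'is', 'inference', 'of', 'concatenated', 'words', 'boundaries']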