πŸ¦› Chonkie ✨


The no-nonsense, ultra-light, and lightning-fast chunking library that's ready to CHONK your texts!

Installation β€’ Usage β€’ Chunkers β€’ Integrations β€’ Benchmarks

Tired of making your gazillionth chunker? Sick of the overhead of large libraries? Want to chunk your texts quickly and efficiently? Chonkie the mighty hippo is here to help!

πŸš€ Feature-rich: All the CHONKs you'd ever need
✨ Easy to use: Install, Import, CHONK
⚑ Fast: CHONK at the speed of light! zooooom
πŸͺΆ Light-weight: No bloat, just CHONK
🌏 Wide support: CHONKie integrates with your favorite tokenizer, embedding model and APIs!
πŸ’¬ ️Multilingual: Out-of-the-box support for 5+ language CHONKS (more coming πŸ”œ)
☁️ Cloud-Ready: CHONK locally or in the Chonkie Cloud
πŸ¦› Cute CHONK mascot: psst it's a pygmy hippo btw
❀️ Moto Moto's favorite python library

Chonkie is a chunking library that "just works" ✨

Installation

To install chonkie, run:

pip install chonkie

Chonkie follows the rule of minimum installs. Have a favorite chunker? Read our docs to install only what you need. Don't want to think about it? Simply install all of it (not recommended for production environments):

pip install chonkie[all]
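
If you only need a specific chunker, you can install just its optional extra instead of everything. The extra name below (semantic) is only an illustration; check the docs for the exact extra each chunker needs.

pip install "chonkie[semantic]"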

Usage

Here's a basic example to get you started:

# First import the chunker you want from Chonkie
from chonkie import RecursiveChunker

# Initialize the chunker
chunker = RecursiveChunker()

# Chunk some text
chunks = chunker("Chonkie is the goodest boi! My favorite chunking hippo hehe.")

# Access chunks
for chunk in chunks:
    print(f"Chunk: {chunk.text}")
    print(f"Tokens: {chunk.token_count}")

Check out more usage examples in the docs!

Chunkers

Chonkie provides several chunkers to help you split your text efficiently for RAG applications. Here's a quick overview of the available chunkers:

  • TokenChunker: Splits text into fixed-size token chunks.
  • SentenceChunker: Splits text into chunks based on sentences.
  • RecursiveChunker: Splits text hierarchically using customizable rules to create semantically meaningful chunks.
  • SemanticChunker: Splits text into chunks based on semantic similarity.
  • SDPMChunker: Splits text using a Semantic Double-Pass Merge approach.
  • LateChunker: Embeds the full text first, then splits it so each chunk gets a better embedding.
  • CodeChunker: Splits code into structurally meaningful chunks.

Read more about these methods and the approaches behind them in the docs. Every chunker shares the same callable interface, so swapping strategies is a one-line change; see the sketch below.
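
A minimal sketch of swapping chunkers (the chunk_size and chunk_overlap keywords are assumptions; check each chunker's docs for its exact parameters):

from chonkie import TokenChunker, SentenceChunker

# Fixed-size token chunks with a small overlap (keyword names assumed)
token_chunker = TokenChunker(chunk_size=256, chunk_overlap=32)

# Sentence-based chunks capped at roughly 256 tokens each (keyword name assumed)
sentence_chunker = SentenceChunker(chunk_size=256)

text = "Chonkie is the goodest boi! My favorite chunking hippo hehe."
for chunk in sentence_chunker(text):
    print(f"{chunk.token_count} tokens: {chunk.text}")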

Integrations

Chonkie integrates smoothly with the tools you already use:

  • Tokenizers: Choose from 3 supported tokenizers (like Hugging Face πŸ€— and Tiktoken) or provide your own custom token counting function. Flexibility first!
  • Embedding Models: Seamlessly works with 5 out-of-the-box embedding model providers, including SentenceTransformers, Model2Vec, OpenAI, Cohere, and Jina AI. Bring your favorite embeddings to the CHONK party! (A sketch of both options follows below.)
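
A minimal sketch of both options (the tokenizer_or_token_counter and embedding_model keyword names are assumptions based on the docs, and the model names are just examples; verify against your installed version):

from chonkie import RecursiveChunker, SemanticChunker

# Point a chunker at a named tokenizer (resolved via Hugging Face / Tiktoken)...
chunker = RecursiveChunker(tokenizer_or_token_counter="gpt2")

# ...or hand it your own token-counting callable
word_chunker = RecursiveChunker(tokenizer_or_token_counter=lambda text: len(text.split()))

# Pick the embedding model used to measure semantic similarity (example model name)
semantic_chunker = SemanticChunker(embedding_model="minishlab/potion-base-8M")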

Benchmarks

"I may be smol hippo, but I pack a big punch!" πŸ¦›

Chonkie is not just cute, it's also fast and efficient! Here's how it stacks up against the competition:

Size 📦

  • Default Install: 15MB (vs 80-171MB for alternatives)
  • With Semantic: Still 10x lighter than the closest competition!

Speed ⚡

  • Token Chunking: 33x faster than the slowest alternative
  • Sentence Chunking: Almost 2x faster than competitors
  • Semantic Chunking: Up to 2.5x faster than others

Check out our detailed benchmarks to see how Chonkie races past the competition! πŸƒβ€β™‚οΈπŸ’¨

Contributing

Want to help grow Chonkie? Check out CONTRIBUTING.md to get started! Whether you're fixing bugs, adding features, or improving docs, every contribution helps make Chonkie a better CHONK for everyone.

Remember: No contribution is too small for this tiny hippo! πŸ¦›

Acknowledgements

Chonkie would like to CHONK its way through a special thanks to all the users and contributors who have helped make this library what it is today! Your feedback, issue reports, and improvements have helped make Chonkie the CHONKIEST it can be.

And of course, special thanks to Moto Moto for endorsing Chonkie with his famous quote:

"I like them big, I like them chonkie." ~ Moto Moto

Citation

If you use Chonkie in your research, please cite it as follows:

@software{chonkie2025,
  author = {Minhas, Bhavnick and Nigam, Shreyash},
  title = {Chonkie: A no-nonsense fast, lightweight, and efficient text chunking library},
  year = {2025},
  publisher = {GitHub},
  howpublished = {\url{https://github.com/chonkie-inc/chonkie}},
}