Infini-gram: An Efficient Text Search Engine
Citation
If you find Infini-gram useful in your research, please cite our paper:
@article{Liu2024InfiniGram,
  title={Infini-gram: Scaling Unbounded n-gram Language Models to a Trillion Tokens},
  author={Liu, Jiacheng and Min, Sewon and Zettlemoyer, Luke and Choi, Yejin and Hajishirzi, Hannaneh},
  journal={arXiv preprint arXiv:2401.17377},
  year={2024}
}
Acknowledgements
We would like to thank Zihao Ye for sharing his advice on building and distributing Python packages.