# HomebrewNLP

v1.0.0

A case study of efficient training of large language models using commodity hardware.
Train a model with:

```bash
python3 main.py train --config_path configs/small.yaml
```
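A typical end-to-end setup might look like the sketch below. The repository URL is taken from the citation; the `requirements.txt` filename and the `pip install` step are assumptions and may need adjusting to match the actual repository layout.

```bash
# Hypothetical quickstart sketch -- verify against the repository before use.
git clone https://github.com/HomebrewNLP/HomebrewNLP
cd HomebrewNLP

# Install dependencies (assumed requirements.txt; adjust if the repo differs).
pip install -r requirements.txt

# Launch training with the provided small example configuration.
python3 main.py train --config_path configs/small.yaml
```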
If you use HomebrewNLP in your work, please cite it as:

```bibtex
@misc{nestler2021homebrewnlp,
  title        = {{HomebrewNLP}},
  author       = {Nestler, Lucas and Gill, David},
  year         = {2021},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  doi          = {10.5281/zenodo.5553247},
  howpublished = {\url{https://github.com/HomebrewNLP/HomebrewNLP}}
}
```