tinyrwkv
1.0.0
Currently being rewritten in the `rewrite` branch.
A port of the RWKV-LM family of large language models to the tinygrad framework.
Currently requires tinygrad from git, or just use the nix flake.
Dependencies:
numpy
pydot (only for GRAPH=1)
tinygrad
tokenizers
torch (only for loading pytorch weights)
tqdm
wandb (optional during training)
rust (only for compiling)
clang (only for compiling)
graphviz (only for GRAPH=1)
Run the CLI with `python -m cli`.
It can also be embedded into other projects as a Python package. Additionally, the model can be compiled to portable C code and embedded that way.
usage: tinyrwkv-cli [-h] [--seed SEED] {pre,gen,cht,cmp,bch,ptr,gpt,tra,bpt,wkv,mus} ...
CLI for tinyrwkv
positional arguments:
{pre,gen,cht,cmp,bch,ptr,gpt,tra,bpt,wkv,mus}
pre preprocess either tinyrwkv trained weights or pytorch trained weights into RNN form
gen freeform generation using the RNN mode (requires a preprocessed model using `pre`)
cht chat with a model in RNN mode (requires a preprocessed model using `pre`)
cmp compile an RNN model into C source code and a compiled executable (must be run with CLANG=1)
bch benchmark the rnn mode
ptr preprocess pytorch weights into GPT form for training or inference
gpt freeform generation using the GPT mode (requires a preprocessed model using `ptr`)
tra pretrain or finetune a model (if finetuning the model needs to be preprocessed with `ptr`)
bpt benchmark the gpt mode
wkv benchmark/test each wkv module
mus music generation using the RNN mode (requires a preprocessed model using `pre`)
options:
-h, --help show this help message and exit
--seed SEED seed for random
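The subcommands above compose into a typical workflow: preprocess weights, then generate or compile. A hypothetical sketch follows; the file names and positional arguments are illustrative only (check each subcommand's `--help` for the real flags):

```
# hypothetical paths/arguments, shown only to illustrate the ordering
python -m cli pre model.pth            # preprocess pytorch weights into RNN form
python -m cli gen model_rnn            # freeform generation with the preprocessed model
CLANG=1 python -m cli cmp model_rnn    # compile the RNN model to C (requires CLANG=1)
```

Note that `cmp` must be run with the `CLANG=1` environment variable set, as stated in its help text, and that `gpt`/`tra` expect weights preprocessed with `ptr` rather than `pre`.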
See the LICENSE and NOTICE files.