Andromeda
0.0.11
Welcome to Andromeda, a fast, creative, and reliable language model. Train your own version, run inference, and fine-tune it with simple plug-and-play scripts. Get started in seconds:
python3.11 -m pip install --upgrade andromeda-torch
import torch
from andromeda_torch.configs import Andromeda1Billion
model = Andromeda1Billion()
x = torch.randint(0, 256, (1, 1024)).cuda()
out = model(x) # (1, 1024, 20000)
print(out)
import torch
from andromeda_torch import Tokenizer
from andromeda_torch.configs import Andromeda1Billion

model = Andromeda1Billion()
tokenizer = Tokenizer()

# Encode text to token ids, then add a batch dimension before the forward pass
encoded_text = tokenizer.encode("Hello world!")
tokens = torch.tensor([encoded_text])
out = model(tokens)
print(out)
Set the environment variables:

ENTITY_NAME: Your wandb project name
OUTPUT_DIR: Directory to save the weights (e.g., ./weights)
MASTER_ADDR: Master address for distributed training
MASTER_PORT: Master port for distributed training
RANK: Rank of the node in distributed training
WORLD_SIZE: Number of GPUs

Configure the training:
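The variables above can be exported from a shell before launching a training run. This is a minimal sketch: the project name, address, port, and single-node/single-GPU values below are illustrative assumptions, not defaults from this repo.

```shell
# Weights & Biases project name (assumed example value)
export ENTITY_NAME="my-andromeda-project"

# Where checkpoints are written
export OUTPUT_DIR="./weights"

# Distributed-training rendezvous (single-node example)
export MASTER_ADDR="localhost"
export MASTER_PORT="29500"

# This node's rank and the total number of GPUs
export RANK=0
export WORLD_SIZE=1
```

With these set, a distributed launcher such as torchrun can pick them up when starting the training script.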
For more information, refer to the Training SOP.
Apache License