# TinyGPT

Version 1.0.0
A tiny C++11 GPT-2 inference implementation written from scratch, mainly based on the project picoGPT.
Accompanying blog post: Write a GPT from scratch (TinyGPT)
## Core Classes

- `Tensor`: tensor class with a numpy-like interface.
- `Model`: GPT-2 model implementation, with reference to `gpt2_pico.py`.
- `Tokenizer`: BPE tokenizer with exactly the same logic as GPT-2's `encoder.py`.

## Build and Run

### Get the code

```bash
git clone --recurse-submodules https://github.com/keith2018/TinyGPT.git
```
### Install Intel MKL (Math Kernel Library)

Official website: [Intel®-Optimized Math Library for Numerical Computing on CPUs & GPUs](https://www.intel.com/content/www/us/en/developer/tools/oneapi/onemkl.html)
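TinyGPT lists Intel MKL as a dependency for its numerical routines, so the tensor matrix operations presumably dispatch to MKL BLAS. As a minimal, standalone illustration (not TinyGPT code), the snippet below calls MKL's CBLAS `cblas_sgemm` to multiply two small row-major matrices; the sizes and values are arbitrary:

```cpp
// Illustrative only: a standalone MKL CBLAS call, not TinyGPT code.
// Computes C = A * B for small row-major matrices, the kind of GEMM a
// tensor matmul typically dispatches to.
#include <mkl.h>
#include <cstdio>

int main() {
  const int m = 2, k = 3, n = 2;
  float a[m * k] = {1, 2, 3,
                    4, 5, 6};      // A: 2x3
  float b[k * n] = {7,  8,
                    9, 10,
                    11, 12};       // B: 3x2
  float c[m * n] = {0};            // C: 2x2 result

  cblas_sgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
              m, n, k,
              1.0f, a, k,          // alpha, A, lda
              b, n,                // B, ldb
              0.0f, c, n);         // beta, C, ldc

  std::printf("%g %g\n%g %g\n", c[0], c[1], c[2], c[3]);
  return 0;
}
```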
### Download the GPT-2 model file

```bash
python3 tools/download_gpt2_model.py
```

If it succeeds, you'll see the file `model_file.data` in the directory `assets/gpt2`.
### Build and run the demo

```bash
mkdir build
cmake -B ./build -DCMAKE_BUILD_TYPE=Release
cmake --build ./build --config Release
```
This will generate the executable and copy the assets to the directory `app/bin`; you can then run the demo:
```bash
cd app/bin
./TinyGPT_demo
[DEBUG] TIMER TinyGPT::Model::loadModelGPT2: cost: 800 ms
[DEBUG] TIMER TinyGPT::Encoder::getEncoder: cost: 191 ms
INPUT:Alan Turing theorized that computers would one day become
GPT:the most powerful machines on the planet.
INPUT:exit
```
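Under the hood, the demo's interactive loop follows the standard autoregressive pattern: the tokenizer encodes the prompt, the model produces next-token logits, and the chosen token is appended and fed back until generation stops. The sketch below illustrates that loop with a stub in place of the real model; TinyGPT's actual class and method signatures may differ, and the token ids and stub logits are placeholders.

```cpp
// Conceptual sketch only: a greedy autoregressive decoding loop with a
// stub model. The real demo uses TinyGPT's Model and Tokenizer classes;
// the token ids and the stub logits below are arbitrary placeholders.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

// Stand-in for the model's forward pass: returns logits over the
// GPT-2 vocabulary (50257 tokens) for the next position.
std::vector<float> nextTokenLogits(const std::vector<int>& ids) {
  std::vector<float> logits(50257, 0.0f);
  logits[(ids.back() + 1) % 50257] = 1.0f;  // dummy rule, just for illustration
  return logits;
}

int main() {
  // Placeholder prompt ids; in the demo these would come from the BPE tokenizer.
  std::vector<int> ids = {464, 3290};
  const std::size_t maxNewTokens = 8;

  for (std::size_t i = 0; i < maxNewTokens; ++i) {
    std::vector<float> logits = nextTokenLogits(ids);
    int next = static_cast<int>(
        std::max_element(logits.begin(), logits.end()) - logits.begin());
    ids.push_back(next);  // greedy decoding: take the argmax and continue
  }

  // The demo would decode these ids back to text before printing.
  for (int id : ids) std::cout << id << ' ';
  std::cout << '\n';
  return 0;
}
```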
## Dependencies

- intel-mkl: https://www.intel.com/content/www/us/en/developer/tools/oneapi/onemkl.html
- json11: https://github.com/dropbox/json11
- re2: https://github.com/google/re2
- abseil-cpp: https://github.com/abseil/abseil-cpp

## License

This code is licensed under the MIT License (see LICENSE).