A GPT-2 Telegram chatbot that has been relatively tuned for chatting. Feel free to send me PRs and I'll check out your code! The bot is not 100% accurate all the time (which is why I coded a /retry function).
Since the bot consumes a lot of memory, I programmed it in a round-robin mode. Each input resets a timer tied to your account ID. Once the timer runs out, the bot is free for other users to use. You will be notified when the timer runs out, and other users can see how much time is left and whether the bot is in use.
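The round-robin lock described above can be sketched roughly as follows. This is an illustrative reconstruction, not the bot's actual code; the function names, variable names, and the 60-second cooldown are my own choices:

```python
import time

COOLDOWN_SECONDS = 60          # illustrative value, not the bot's actual setting
current_user = None            # account ID currently holding the bot
lock_expires_at = 0.0          # when the current holder's timer runs out

def try_acquire(user_id, now=None):
    """Return (granted, seconds_left). Each input from the holder resets the timer."""
    global current_user, lock_expires_at
    now = time.time() if now is None else now
    if current_user is None or now >= lock_expires_at or user_id == current_user:
        current_user = user_id
        lock_expires_at = now + COOLDOWN_SECONDS   # reset the timer on every input
        return True, COOLDOWN_SECONDS
    return False, lock_expires_at - now            # someone else holds the bot
```

Each message from the holding user pushes the expiry forward; anyone else gets back how many seconds remain, which is what other users would be shown.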
Brief installation instructions for Ubuntu 20/WSL.
I highly recommend taking a look at the Jupyter notebook (.ipynb) on Google Colab.
Install Python 3.7 (I think 3.6 may also work, but not 3.8):
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt-get update
sudo apt-get install python3.7
Install pip for Python 3.7:
wget https://bootstrap.pypa.io/get-pip.py
python3.7 get-pip.py
Install the requirements inside the bot folder after cloning the repository:
python3.7 -m pip install -r requirements.txt
Note: You really need 16 GB of RAM or an 8 GB video card; otherwise you will wait forever. You can use the GPU functions with at least an 8 GB video card that supports CUDA Toolkit 10.0 and cuDNN for CUDA Toolkit 10. This setup also works on Windows with Python 3.7 and NVIDIA; on Windows you should run the command prompt as administrator when running Python 3.7.
Download the model:
python3.7 download_model.py 1558M
Set your Telegram bot API key in src/GPT2-Learning.py
Replace "BOTKEY" with your Telegram bot token, e.g. "1827396499:AAHifc06oS31oQ9L3TuCiZxD9EIfKPi0oWQ"
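If you would rather not hardcode the token into the source file, one common alternative (a sketch only; the bot does not do this out of the box, and the environment variable name is my own choice) is to read it from the environment:

```python
import os

def load_token(env_var="TELEGRAM_BOT_TOKEN", placeholder="BOTKEY"):
    """Return the bot token from the environment, or None if it is still unset."""
    token = os.environ.get(env_var, placeholder)
    return None if token == placeholder else token
```

You would then export TELEGRAM_BOT_TOKEN before launching start.sh, and keep the real token out of version control.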
Run the bot:
If using the python3 command:
./start.sh
If using the python3.7 command:
python3.7 src/GPT2-Learning.py
You can edit start.sh to match your preferred python command.
start - Basic start command.
help - Show quick help command.
chatbot - Chatbot mode, me+you string edit and open-ended questions.
finish - Default finish sentence command counting words as output length.
learnon - Turn on sentence compilation with Me+you string for conversation.
learnoff - Turn off the sentence compilation.
learnreset - Reset the learned conversation.
retry - Retry the current computation.
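The command list above maps naturally onto a dispatch table. This is only an illustrative sketch of how such routing could look; the handler bodies are stand-ins, not the bot's actual code:

```python
def cmd_start(state, arg):
    return "Basic start command."

def cmd_retry(state, arg):
    # Re-run the last computation; 'state' would hold the previous prompt.
    return "Retrying: " + state.get("last_prompt", "(nothing to retry)")

def cmd_learnon(state, arg):
    state["learn"] = True          # turn on Me+you sentence compilation
    return "Learning mode on."

def cmd_learnoff(state, arg):
    state["learn"] = False
    return "Learning mode off."

# Keys mirror the bot's /commands; handlers here are simplified stand-ins.
COMMANDS = {
    "/start": cmd_start,
    "/retry": cmd_retry,
    "/learnon": cmd_learnon,
    "/learnoff": cmd_learnoff,
}

def handle(state, message):
    cmd, _, arg = message.partition(" ")
    handler = COMMANDS.get(cmd)
    return handler(state, arg) if handler else None
```

In the real bot these handlers would be registered with the Telegram bot API library rather than dispatched by hand.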
Here's an example of the learning chatbot mode with the latest gpt-2 database and top_p set to 0.83:
Me: Hello! How are you today? Are you feeling alright? I hope you are doing well and not getting into any trouble.
Bot: Yeah, I'm fine. Do you want to make something to eat?
Me: I already made some food, but thanks! I had some jalapeno poppers. I made a ton of them last night after I sold a lawnmower to some folks I got for free earlier in the day. So what are your plans for the day?
Bot: Nothing special. Nothing exciting in particular. I just wanna chill out for a bit.
Me: Just a typical day of kicking back and relaxing huh? Are you gonna watch cartoons or any movies in particular?
Bot: If it's not sports I'll probably just watch something on the TV.
Me: Sounds alright. I coded on my chatbot project most of the morning. Was waiting on someone to come buy a projector but they haven't shown up yet unfortunately. What do you do for work if you don't mind me asking?
Bot: I'm a web developer and engineer. It's fun.
Me: Cool! I'm also a coder and an ethical hacker. I mostly know how to code in Python but I've been unemployed for some time now after my spine injury. I wish I could have joined the military when I was younger. Were you ever in the military?
Bot: No. I'm not really too proud of my military career. It ended up taking a huge toll on my back.
Me: What happened to your back if you don't mind me asking?
Bot: First of all, it's better than before.
Me: Well that's good to hear. Did you injure your back in the military or doing something else?
Bot: Nope. It's an old injury.
Me: Ah I see- Well I'll talk to you later. Sorry about your back! I hope we can catch up tomorrow!
Bot: Yes, of course. Thanks again for chatting!
INFO:tensorflow:Restoring parameters from models/1558M/model.ckpt
2020-05-10 14:19:43,794 - tensorflow - INFO - Restoring parameters from models/1558M/model.ckpt
==========
Before splitlines: Nothing special. Nothing exciting in particular. I just wanna chill out for a bit. You: I'll try to keep my spirits up. Just keep in mind that today's the 10th anniversary of the citywide strike
==========
==========
Mode: True
Learn: True
Length: 44
==========
Before regex: Nothing special. Nothing exciting in particular. I just wanna chill out for a bit. You: I'll try to keep my spirits up. Just keep in mind that today's the 10th anniversary of the citywide strike
==========
Output: Nothing special. Nothing exciting in particular. I just wanna chill out for a bit.
==========
Raw_text or Original: You: Hello! How are you today? Are you feeling alright? I hope you are doing well and not getting into any trouble. Me: Yeah, I'm fine. Do you want to make something to eat? You: I already made some food, but thanks! I had some jalapeno poppers. I made a ton of them last night after I sold a lawnmower to some folks I got for free earlier in the day. So what are your plans for the day? Me:
==========
Learning text or Next: You: Hello! How are you today? Are you feeling alright? I hope you are doing well and not getting into any trouble. Me: Yeah, I'm fine. Do you want to make something to eat? You: I already made some food, but thanks! I had some jalapeno poppers. I made a ton of them last night after I sold a lawnmower to some folks I got for free earlier in the day. So what are your plans for the day? Me: Nothing special. Nothing exciting in particular. I just wanna chill out for a bit.
==========
top_p out: 0.8338636363636364
==========
top_p in: 0.83
==========
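The log above shows what the learning mode does: the prompt accumulates alternating "You:"/"Me:" turns, and the model's raw continuation is cut off at the first hallucinated "You:" marker. A rough sketch of that post-processing, reconstructed from the log rather than taken from the bot's actual code:

```python
def trim_reply(raw):
    """Keep only the bot's turn: cut the generation at the next 'You:' marker."""
    return raw.split("You:", 1)[0].strip()

def extend_history(history, user_text, bot_text):
    """Append one exchange in the Me+you format the next prompt is built from."""
    return history + "You: " + user_text + " Me: " + bot_text + " "

# The "Before regex" line from the log, trimmed to the "Output" line.
raw = ("Nothing special. Nothing exciting in particular. I just wanna chill "
       "out for a bit. You: I'll try to keep my spirits up.")
print(trim_reply(raw))
# -> Nothing special. Nothing exciting in particular. I just wanna chill out for a bit.
```

This matches the log: the "Learning text" is simply the previous prompt with the trimmed reply appended after "Me:".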
For a list of grammar scores, see /SCORES.MD.
Tip:
0.77 top_p can sound emotional, confused, and imitative.
0.66 top_p can sound thought-out and literal, but may contain ASCII and cut-off errors.
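For reference, top_p here is standard nucleus sampling: keep the smallest set of tokens whose cumulative probability reaches p, and renormalize before sampling. A minimal NumPy sketch of the idea (not the bot's TensorFlow implementation):

```python
import numpy as np

def top_p_filter(probs, p=0.83):
    """Zero out tokens outside the nucleus of cumulative probability p, renormalize."""
    order = np.argsort(probs)[::-1]               # most likely tokens first
    cumulative = np.cumsum(probs[order])
    cutoff = np.searchsorted(cumulative, p) + 1   # smallest set with cum. prob >= p
    keep = order[:cutoff]
    filtered = np.zeros_like(probs)
    filtered[keep] = probs[keep]
    return filtered / filtered.sum()

probs = np.array([0.5, 0.3, 0.15, 0.05])
filtered = top_p_filter(probs, p=0.83)            # drops only the 0.05 token
```

Lower p values cut the tail harder, which is why 0.66 reads more literal and 0.77 more erratic, as noted above.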