LLM API
1.0.0
The API has three main routes: one creates a chat session for a given username, one sends a message within an existing session, and one returns a session's chat history.
To launch the service, first create a .env file in the project's root directory and add your OPENAI_API_KEY:
vim .env
OPENAI_API_KEY=<key>
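As a rough sketch of how the key is typically consumed at startup, assuming the service loads it with python-dotenv (the actual implementation may differ):

import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
api_key = os.environ["OPENAI_API_KEY"]  # a KeyError here means the .env step was missed

With the key in place, build and run the image: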
docker build -t chat_api_service .
docker run -d -p 8000:8000 chat_api_service
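Once the container is running, a minimal sketch for confirming the service is reachable on the mapped port; any HTTP response, even an error status, means the server is up:

import time
import requests

for attempt in range(30):
    try:
        requests.get("http://127.0.0.1:8000", timeout=1)  # port matches the docker run command above
        print("service is up")
        break
    except requests.exceptions.ConnectionError:
        time.sleep(1)  # container may still be starting; retry
else:
    print("service did not respond within 30 seconds")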
Follow this flow to use the API: create a session, send a message, then fetch the history.
curl "http://127.0.0.1:8000?username=TR"
curl -X PUT http://127.0.0.1:8000/chat/<session_id> -H "Content-Type: application/json" -d '{"payload": "Hello! What is your name?"}'
curl http://127.0.0.1:8000/chat_history/<session_id>
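The same flow from Python, as a minimal sketch using the requests library; the "session_id" response field name is an assumption about the JSON the service returns:

import requests

BASE = "http://127.0.0.1:8000"

# 1. create a session for a user; "session_id" as the field name is an assumption
session_id = requests.get(BASE, params={"username": "TR"}).json()["session_id"]

# 2. send a message to the chat route
reply = requests.put(
    f"{BASE}/chat/{session_id}",
    json={"payload": "Hello! What is your name?"},
)
print(reply.json())

# 3. retrieve the full chat history for the session
history = requests.get(f"{BASE}/chat_history/{session_id}")
print(history.json())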
Alternatively, Postman is a more convenient tool for exercising the API; it is what I used during development.
Launch the Docker container in interactive mode to run the test suite. The Redis and Uvicorn servers must both be running for the tests to execute.
docker run -it --rm -p 8000:8000 --entrypoint bash chat_api_service
Inside the container, run launch.sh as a background process, then invoke the test suite:
./launch.sh &
python -m pytest tests
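For reference, a sketch of what a test against the live server could look like; it assumes Redis and Uvicorn are already up (via ./launch.sh &) and reuses the assumed "session_id" field from the example above:

import requests

BASE = "http://127.0.0.1:8000"

def test_chat_round_trip():
    # create a session; the "session_id" field name is an assumption
    session_id = requests.get(BASE, params={"username": "TR"}).json()["session_id"]

    # send a message and expect a successful reply
    reply = requests.put(
        f"{BASE}/chat/{session_id}",
        json={"payload": "Hello! What is your name?"},
    )
    assert reply.status_code == 200

    # the session's history should now be retrievable
    history = requests.get(f"{BASE}/chat_history/{session_id}")
    assert history.status_code == 200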
Future enhancements to the API include: