When I use CAI with a local model (in this case qwen3 and llama3), it stops after 1-3 turns and shows this error message: [ErrorMessageCai.txt](https://github.com/user-attachments/files/19977898/ErrorMessageCai.txt)

This is the command I run before starting CAI:

```bash
echo -e 'OPENAI_API_KEY="sk-123"\nOLLAMA_API_BASE="http://127.0.0.1:11434"\nSHODAN_API_KEY="XXXXXXX"' > .env
```

The Ollama server keeps running before and after the connection. Here is a short log, just as an example: [cai_log.txt](https://github.com/user-attachments/files/19977953/cai_log.txt)
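For reference, this is a quick sketch of how I check that the Ollama server is actually reachable on the base URL from the `.env` file (assuming the default port 11434 and the standard Ollama HTTP endpoints):

```bash
# The root endpoint should answer with "Ollama is running"
curl -s http://127.0.0.1:11434

# /api/tags lists the locally pulled models; qwen3 and llama3 show up here
curl -s http://127.0.0.1:11434/api/tags
```

Both commands respond normally before and after CAI fails, which is why I think the problem is not the server itself.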