Some local LLM servers (llama-server, etc.) return message.content as a
dict or list instead of a plain string. This caused
`AttributeError: 'dict' object has no attribute 'strip'` on every API call.

Normalize content to a string immediately after receiving the response:

- dict: extract the 'text' or 'content' field, falling back to json.dumps
- list: extract the text parts (OpenAI multimodal content format)
- other: str() conversion

Applied at the single point where response.choices[0].message is read in
the main agent loop, so all downstream .strip()/.startswith()/[:100]
operations work regardless of server implementation.

Closes #759
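The normalization described above could look roughly like the following sketch; the helper name `normalize_content` is illustrative, not necessarily what the patch uses.

```python
import json


def normalize_content(content) -> str:
    """Coerce an OpenAI-style message.content value to a plain string."""
    if isinstance(content, str):
        return content
    if isinstance(content, dict):
        # Prefer an explicit text field; fall back to a JSON dump.
        for key in ("text", "content"):
            value = content.get(key)
            if isinstance(value, str):
                return value
        return json.dumps(content)
    if isinstance(content, list):
        # OpenAI multimodal format: [{"type": "text", "text": "..."}, ...]
        parts = []
        for part in content:
            if isinstance(part, dict) and isinstance(part.get("text"), str):
                parts.append(part["text"])
            elif isinstance(part, str):
                parts.append(part)
        return "".join(parts)
    return str(content)
```

Calling this once on `message.content` right after the response is parsed keeps every downstream string operation server-agnostic.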