turboquant/benchmarks
Alexander Whitestone 90b5eddfa1
docs: Document Ollama perplexity limitation — no logprob support (closes #63)
Ollama lacks a token logprob API, so true perplexity cannot be measured
via the Ollama backend. Added a warning to the run_benchmarks.py docstring
directing users to run_perplexity.py (which drives the llama-perplexity
binary) for real PPL measurement with --logprobs support.
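A minimal sketch of why logprobs are required: perplexity is the exponential of the negative mean per-token log-probability, so a backend that returns only generated text (as Ollama does) cannot supply the inputs. The function below is illustrative and is not part of the repository's scripts.

```python
import math

def perplexity(token_logprobs):
    """Compute PPL = exp(-mean(logprob)) from per-token natural-log
    probabilities. A backend must expose these values; generated text
    alone is not enough."""
    if not token_logprobs:
        raise ValueError("need at least one token logprob")
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# Uniform probability 1/4 per token gives PPL of 4 (up to float error).
print(perplexity([math.log(0.25)] * 8))
```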
2026-04-14 23:23:38 -04:00