
I grappled with that issue for https://github.com/pamelafox/openai-messages-token-helper, as I wanted to be able to use it for a quick token check with SLMs as well. I ended up adding a parameter, "fallback_to_default", for developers to indicate they're okay with assuming the gpt-35 BPE encoding.
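The general pattern is roughly the following (a minimal sketch using tiktoken directly, not the library's actual API; the count_tokens function name is hypothetical, and fallback_to_default mirrors the parameter mentioned above): try to resolve the model's encoding, and only when it's unknown and the caller has opted in, assume gpt-3.5's cl100k_base BPE.

    import tiktoken

    def count_tokens(text: str, model: str, fallback_to_default: bool = False) -> int:
        # Try to look up the tokenizer registered for this model name.
        try:
            encoding = tiktoken.encoding_for_model(model)
        except KeyError:
            # Unknown model (e.g. an SLM tiktoken doesn't know about).
            if not fallback_to_default:
                raise ValueError(
                    f"Unknown model {model!r}; pass fallback_to_default=True "
                    "to assume the gpt-3.5 encoding"
                )
            # Assume gpt-3.5-turbo's BPE as an approximation.
            encoding = tiktoken.get_encoding("cl100k_base")
        return len(encoding.encode(text))

The count will only be approximate for models that don't actually share that vocabulary, which is why making the fallback an explicit opt-in seems like the right trade-off.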

