Calibration of LLM: Can You Trust The Result If The “logprob” is High?


When you use OpenAI’s GPT through the API, you can request `logprobs` for each token in the output; the name stands for log-probabilities. In…
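Since the article is cut off here, a brief sketch of what a logprob encodes may help: it is the natural logarithm of the probability the model assigned to a token, so `exp(logprob)` recovers the probability itself. A logprob near 0 means a probability near 1. The token names and values below are hypothetical illustrations, not taken from the article:

```python
import math

# A logprob is ln(p) for the token the model emitted, so exp() inverts it.
# Hypothetical logprob values for two candidate tokens (not from the article).
logprobs = {"Paris": -0.01, "Lyon": -4.8}

# Convert each logprob back to a plain probability.
probs = {token: math.exp(lp) for token, lp in logprobs.items()}

for token, p in probs.items():
    print(f"{token}: logprob={logprobs[token]:.2f} -> probability={p:.3f}")
# A logprob of -0.01 corresponds to roughly 99% probability,
# while -4.8 corresponds to under 1%.
```

Whether that 99% figure is *calibrated* (i.e., the answer is actually correct about 99% of the time when the model is that confident) is the question the title raises; the raw logprob only tells you the model's internal confidence, not its accuracy.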


