Experiment with a lower temperature with code-gecko
Problem
Our current temperature is 0.2. We never really trialed this against the output of other temperatures to see whether a lower (or higher) temperature leads to a better outcome.
The temperature is used for sampling during response generation. Temperature controls the degree of randomness in token selection. Lower temperatures are good for prompts that require a more deterministic and less open-ended or creative response, while higher temperatures can lead to more diverse or creative results. A temperature of 0 is deterministic, meaning that the highest probability response is always selected.
https://cloud.google.com/vertex-ai/docs/generative-ai/model-reference/code-completion
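The behaviour described above can be illustrated with temperature-scaled softmax sampling. This is a minimal sketch of the general technique, not code-gecko's actual implementation; it treats temperature 0 as greedy (argmax) selection, matching the "deterministic" behaviour in the docs:

```python
import math
import random

def sample_token(logits, temperature, rng=random.Random(0)):
    """Sample a token index from raw logits using temperature scaling.

    temperature == 0 is treated as greedy (argmax) selection.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Divide logits by the temperature, then softmax into probabilities.
    # Lower temperature -> sharper distribution -> less randomness.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.0, 0.5]
sample_token(logits, 0)  # greedy: always picks index 0, the highest logit
```

At temperature 0.1 the scaled logits become [20, 10, 5] and index 0 is picked almost every time; at temperature 2.0 the distribution flattens and the other tokens are sampled much more often.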
Desired Outcome
We test a temperature of 0.1 against the current temperature of 0.2 using the testing framework developed in https://gitlab.com/gitlab-org/modelops/applied-ml/code-suggestions/prompt-library/-/issues/10
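The shape of such a comparison could look like the sketch below. The `generate` stub and the exact-match metric are hypothetical stand-ins; the real harness is the one in the linked issue, and it would call the actual code-gecko completion endpoint and likely use a richer acceptance metric:

```python
def generate(prompt: str, temperature: float) -> str:
    """Hypothetical stand-in for a code-gecko completion call."""
    # In the real experiment this would hit the Vertex AI
    # code-completion endpoint with the given temperature.
    return prompt + " ..."

def compare_temperatures(prompts, references, temps=(0.1, 0.2)):
    """Return exact-match accuracy per temperature over a prompt set."""
    scores = {}
    for t in temps:
        hits = sum(
            generate(p, t) == ref
            for p, ref in zip(prompts, references)
        )
        scores[t] = hits / len(prompts)
    return scores
```

Running the same prompt set at both temperatures and comparing the per-temperature scores would give the side-by-side result this issue is after.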
Separate Idea
If the last suggestion for user X was accepted, raise the temperature on the next one; if not, lower it, perhaps in 0.1 increments.
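That feedback loop could be sketched as follows. The function name, the 0.1 step, and the clamping bounds are all assumptions for illustration, not a decided design:

```python
def adjust_temperature(current: float, last_accepted: bool,
                       step: float = 0.1,
                       low: float = 0.0, high: float = 1.0) -> float:
    """Raise the temperature after an accepted suggestion,
    lower it after a rejected one, clamped to [low, high]."""
    if last_accepted:
        return min(high, round(current + step, 2))
    return max(low, round(current - step, 2))

adjust_temperature(0.2, last_accepted=True)   # accepted -> 0.3
adjust_temperature(0.2, last_accepted=False)  # rejected -> 0.1
```

The per-user current temperature would need to be stored somewhere (e.g. alongside other user preferences) and passed in on each request.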