feat: custom openai temperature #662
base: master
Conversation
Hey, making changes to the temperature makes the model generate garbage (if the temperature is increased). I've tried several temperatures and selected the best balance between quality and quantity. If the temperature is too low, the answers aren't as long, and if it's too high, the answers aren't correct. Hope you understand. I'll be closing this for now, but if more people ask for this feature, we can consider reopening it. Thanks!
Hi @ItzCrazyKns. Thanks for your answer. I understand your point. Yet Perplexity allows specifying the "temperature" in its API parameters, so I thought it would be a great asset to add. Would you reconsider if I added a serious warning (or recommendation) in the frontend, just above the text input? That way users would know they are playing with a dangerous parameter.
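For reference, `temperature` is a standard request parameter of OpenAI-compatible chat completion APIs (Perplexity's included). Below is a minimal sketch of forwarding a user-supplied value on the request side; the endpoint, model name, and environment variable are placeholders, not this project's actual code:

```ts
// Sketch only: forward a user-configurable temperature to an
// OpenAI-compatible chat completions endpoint.
async function chatCompletion(prompt: string, temperature = 0.7) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // placeholder model name
      messages: [{ role: "user", content: prompt }],
      temperature, // 0 = most deterministic, higher values = more varied output
    }),
  });
  return res.json();
}
```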
@ItzCrazyKns Looks like this parameter is definitely something people want to be able to customize: #456. I'd be glad to add an explicit warning like
@VinceOPS, good luck. The repo owner loves closing PRs and issues if he doesn't immediately like them.
Please have some sense; I've seen hundreds of issues where people change random parameters unknowingly and then start filing issues. If you wish for this to be merged, I'll conduct a test with different temperatures and models, see how they react, and then finally merge it. Moreover, I am quite sure it's going to affect the quality of the answers too much.
Adding warnings will most likely not work; people are always "curious" to try out new things.
Please have some sense; this is not a random parameter. The contributor makes it crystal clear that:
If you're so busy or incapable of maintaining this project adequately, you could ask contributors to conduct the 'test' themselves and report back, or to make changes to your liking, instead of disrespectfully dismissing their contribution. The pattern of bad maintenance is very apparent from only a few checks, not hundreds.
I get what you're saying. I never meant to ignore PRs or issues on purpose; I just got caught up with a lot of things. I guess I have been a bad maintainer, and I see why that's frustrating. This is my first open source project, and I didn't know much about managing a community or keeping up with contributions, but I'm learning. I have been taking feedback from the community, but not as much as I should have. Thanks for pointing that out. I'll work on improving that and being more open to input. I'll also make sure to close all the PRs by the end of this week and stay more active on the project moving forward. Extremely sorry for being a bad maintainer; nothing was intentional.
Hey @VinceOPS, you can go ahead and add a warning below the input.
Hi there 👋🏼
And thanks for this project!
I am working on a very specific project which needs answers that are as "stable" as possible.
I noticed that the "temperature" used with all models was set to 0.7 and could not be configured.
I took a naive approach to propose a first iteration. Let me know if you'd like something different.
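A minimal sketch of the idea, assuming the model is instantiated through LangChain's `ChatOpenAI`; the config helper and environment variable name below are hypothetical, not the keys this PR actually uses:

```ts
import { ChatOpenAI } from "@langchain/openai";

// Hypothetical helper: read a configured temperature and fall back to
// the previously hard-coded default of 0.7 when nothing valid is set.
function getConfiguredTemperature(): number {
  const raw = process.env.OPENAI_TEMPERATURE;
  const parsed = raw !== undefined ? Number(raw) : NaN;
  return Number.isFinite(parsed) ? parsed : 0.7;
}

const chatModel = new ChatOpenAI({
  model: "gpt-4o-mini", // placeholder model name
  temperature: getConfiguredTemperature(),
  // the API key is read from the OPENAI_API_KEY environment variable by default
});
```

Setting the value to 0 (or close to it) makes completions as deterministic as the API allows, which matches the "stable answers" use case above, while the 0.7 fallback keeps the current behaviour unchanged.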