feat: custom openai temperature #662

Open · wants to merge 3 commits into master

Conversation

@VinceOPS commented Mar 3, 2025

Hi there 👋🏼
And thanks for this project!

I am working on a very specific project which needs answers as "stable" as possible.
I noticed that the "temperature" used with all models was set to 0.7 and could not be configured.

I took a naive approach to propose a first iteration. Let me know if you'd like something different.
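
For context, here is a minimal sketch of what a configurable temperature could look like, assuming Perplexica builds its OpenAI chat models with LangChain's `ChatOpenAI` (the helper name and the way the value is passed in are illustrative, not this PR's actual diff):

```ts
import { ChatOpenAI } from '@langchain/openai';

// Illustrative helper: the function name and config plumbing are assumptions,
// not Perplexica's actual code. The point is simply passing the configured
// temperature through instead of hard-coding 0.7.
export const loadOpenAIChatModel = (model: string, temperature?: number) =>
  new ChatOpenAI({
    openAIApiKey: process.env.OPENAI_API_KEY,
    modelName: model,
    // Fall back to the current default when nothing is configured.
    temperature: temperature ?? 0.7,
  });
```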

@ItzCrazyKns (Owner)

Hey, changing the temperature makes the model generate garbage (if the temperature is increased). I've tried several temperatures and selected the best balance between quality and quantity: if the temperature is too low, the answers aren't as long, and if it is too high, the answers aren't right. Hope you understand. I'll be closing this for now, but if more people ask for this feature, we can consider reopening it. Thanks!

@ItzCrazyKns closed this Mar 4, 2025

@VinceOPS (Author) commented Mar 5, 2025

Hi @ItzCrazyKns. Thanks for your answer.

I understand your point. Yet Perplexity allows specifying the "temperature" in its API parameters, so I thought it would be a great asset to add.

Would you reconsider if I added a serious warning (or recommendation) in the frontend, just above the text input, so users know they are playing with a dangerous parameter?
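
For reference, a rough sketch of the Perplexity call mentioned above: its chat completions endpoint is OpenAI-compatible and accepts a `temperature` field (the model name below is a placeholder, not code from this PR):

```ts
// Rough sketch only: Perplexity's API is OpenAI-compatible, so `temperature`
// is just another field in the request body. The model name is a placeholder.
async function askPerplexity(question: string, temperature: number) {
  const res = await fetch('https://api.perplexity.ai/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'sonar', // placeholder
      messages: [{ role: 'user', content: question }],
      temperature, // lower values give more stable, deterministic answers
    }),
  });
  return res.json();
}
```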

@VinceOPS (Author) commented Mar 5, 2025

@ItzCrazyKns Looks like this parameter is definitely something people want to be able to customize: #456

I'd be glad to add an explicit warning like:

Do not modify this value unless you fully understand the implications. Doing so may critically affect the user experience of Perplexica, including the relevance and accuracy of the results.
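
Something along these lines on the frontend, for example (purely illustrative; the component name, props, and styling are assumptions, not Perplexica's actual settings UI):

```tsx
// Illustrative sketch: the proposed warning rendered above a temperature input.
const TemperatureSetting = ({
  value,
  onChange,
}: {
  value: number;
  onChange: (value: number) => void;
}) => (
  <div className="flex flex-col space-y-2">
    <p className="text-xs text-yellow-500">
      Do not modify this value unless you fully understand the implications.
      Doing so may critically affect the user experience of Perplexica,
      including the relevance and accuracy of the results.
    </p>
    <input
      type="number"
      min={0}
      max={2}
      step={0.1}
      value={value}
      onChange={(e) => onChange(parseFloat(e.target.value))}
    />
  </div>
);
```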

@realies (Contributor) commented Mar 7, 2025

@VinceOPS, good luck. The repo owner loves closing PRs and issues if he doesn't immediately like them.
And then comms usually stop. Might be better to make a new project like this.

@VinceOPS (Author) commented Mar 7, 2025

> @VinceOPS, good luck. The repo owner loves closing PRs and issues if he doesn't immediately like them. And then comms usually stop. Might be better to make a new project like this.

@realies Well, people will just get used to maintaining their own fork 😅 ... Too bad!

@ItzCrazyKns reopened this Mar 7, 2025

@ItzCrazyKns (Owner)

> @VinceOPS, good luck. The repo owner loves closing PRs and issues if he doesn't immediately like them. And then comms usually stop. Might be better to make a new project like this.

Please have some sense: I've seen hundreds of issues where people change random parameters unknowingly and then start filing issues. If you want this merged, I'll run a test with different temperatures and models to see how they react, and then merge it. Moreover, I am quite sure it's going to affect the quality of the answers too much.

@ItzCrazyKns (Owner)

> Hi @ItzCrazyKns. Thanks for your answer.
>
> I understand your point. Yet Perplexity allows specifying the "temperature" in its API parameters, so I thought it would be a great asset to add.
>
> Would you reconsider if I added a serious warning (or recommendation) in the frontend, just above the text input, so users know they are playing with a dangerous parameter?

Adding warnings will most likely not work; people are always "curious" to try out new things.

@realies (Contributor) commented Mar 7, 2025

Please have some sense, this is not a random parameter. The contributor makes it crystal clear that:

> I am working on a very specific project which needs answers as "stable" as possible.

If you're so busy or incapable of maintaining this project adequately, you could ask contributors to conduct the 'test' themselves and report back, or to make changes to your liking, instead of disrespectfully dismissing their contribution.

The pattern of bad maintenance is very apparent after only a few checks, not hundreds.

@ItzCrazyKns (Owner)

> If you're so busy or incapable of maintaining this project adequately, you could ask contributors to conduct the 'test' themselves and report back, or to make changes to your liking, instead of disrespectfully dismissing their contribution.
>
> The pattern of bad maintenance is very apparent after only a few checks, not hundreds.

I get what you're saying. I never meant to ignore PRs or issues on purpose; I just got caught up with a lot of things. I guess I have been a bad maintainer, and I see why that’s frustrating. This is my first open source project, and I didn’t know much about managing a community or keeping up with contributions, but I’m learning. I have been taking feedback from the community, but not as much as I should have. Thanks for pointing that out. I’ll work on improving that and being more open to input. I’ll also make sure to close all the PRs by the end of this week and stay more active on the project moving forward. Extremely sorry for being a bad maintainer; nothing was intentional.

@ItzCrazyKns (Owner)

Hey @VinceOPS, you can go ahead and add a warning below the input.
