Perplexica still uses llama 3.1 on groq, when it's getting retired #525

Closed
KhazAkar opened this issue Dec 21, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@KhazAkar
Describe the bug
Currently, Perplexica offers 'llama 3.1 70b' as a chat model on Groq, even though that model is being sunset according to Groq's deprecations page: https://console.groq.com/docs/deprecations

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'Settings'
  2. Set up Groq as the cloud API provider
  3. Click on 'Chat model'
  4. See llama 3.1 in the list

Expected behavior
See llama 3.3 instead.
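A fix along the lines requested could map retired Groq model IDs to their suggested replacements before populating the chat-model list. This is a minimal Python sketch of that idea, not Perplexica's actual code (the helper name and the ID mapping are illustrative; the two IDs shown follow Groq's deprecations page naming):

```python
# Hypothetical helper: substitute deprecated Groq model IDs with their
# recommended replacements so the UI never offers a sunset model.

DEPRECATED_REPLACEMENTS = {
    # deprecated ID -> suggested replacement (per Groq's deprecations page)
    "llama-3.1-70b-versatile": "llama-3.3-70b-versatile",
}

def resolve_model(model_id: str) -> str:
    """Return a supported model ID, swapping in the replacement for retired ones."""
    return DEPRECATED_REPLACEMENTS.get(model_id, model_id)
```

For example, `resolve_model("llama-3.1-70b-versatile")` would yield `"llama-3.3-70b-versatile"`, while IDs not in the mapping pass through unchanged.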

Screenshots
(screenshot of the chat model selection, attached in the original issue)

@KhazAkar KhazAkar added the bug Something isn't working label Dec 21, 2024
@KhazAkar KhazAkar changed the title Perplexica still uses llama 3.1 on groq, when it's getting retired in 70b variant Perplexica still uses llama 3.1 on groq, when it's getting retired Dec 21, 2024
@ItzCrazyKns
Owner

Fixed by #523
