Describe the bug
Currently, Perplexica offers 'llama 3.1 70b' as a chat model for Groq, even though that model is being sunset according to the deprecations page: https://console.groq.com/docs/deprecations
To Reproduce
Steps to reproduce the behavior:
1. Go to 'Settings'
2. Set up Groq as the cloud API provider
3. Click on 'Chat model'
4. See llama 3.1
Expected behavior
See llama 3.3 instead.
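A minimal sketch of the kind of change this implies, assuming Perplexica keeps its Groq chat models in a hard-coded list (the file shape and names below are illustrative assumptions, not taken from the repo; only the Groq model IDs come from Groq's docs):

```typescript
// Hypothetical excerpt of a Groq provider model list in Perplexica.
// The interface and variable names are assumptions for illustration;
// the model IDs themselves are Groq's published identifiers.

interface ChatModelEntry {
  displayName: string;
  model: string; // Groq model ID sent to the OpenAI-compatible endpoint
}

const groqChatModels: ChatModelEntry[] = [
  // Deprecated per https://console.groq.com/docs/deprecations:
  // { displayName: 'Llama 3.1 70B', model: 'llama-3.1-70b-versatile' },

  // Suggested replacement:
  { displayName: 'Llama 3.3 70B', model: 'llama-3.3-70b-versatile' },
];

export default groqChatModels;
```

Alternatively, the list could be fetched from Groq's models endpoint at runtime so future deprecations don't require a code change; whether Perplexica supports that today is not confirmed here.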
KhazAkar changed the title from "Perplexica still uses llama 3.1 on groq, when it's getting retired in 70b variant" to "Perplexica still uses llama 3.1 on groq, when it's getting retired" on Dec 21, 2024.