gpuFillInfo not implemented on darwin #5465
Thanks for opening the issue. However, that error shouldn't really be fatal. Is inference working? If not, can you paste the output when running with |
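The maintainer's point is that a missing platform backend should surface as a warning, not stop inference. A minimal Go sketch of that pattern, assuming a hypothetical stub named after the `gpuFillInfo` function from this issue (the sentinel error, its message, and the caller are illustrative, not LocalAI's actual code):

```go
package main

import (
	"errors"
	"fmt"
)

// errNotImplemented is a hypothetical sentinel error mirroring the
// log line reported in this issue.
var errNotImplemented = errors.New("gpuFillInfo not implemented on darwin")

// gpuFillInfo is a sketch of a platform stub: on an unsupported OS it
// reports that GPU info collection is unavailable instead of panicking.
func gpuFillInfo() error {
	return errNotImplemented
}

func main() {
	if err := gpuFillInfo(); err != nil {
		// Treat the missing backend as non-fatal: log a warning and
		// continue, so inference can still proceed on other paths.
		fmt.Println("warning:", err)
	}
}
```

In a real codebase this stub would typically live behind a `//go:build darwin` constraint, with working implementations compiled on platforms that support GPU introspection.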
Nothing seems to be working; I'm not sure if it's related to the above error or something else it's hitting. Here's a clean run. I deleted the configuration and models directories and started again.
|
using |
LocalAI version:
v2.29.0 (fd17a33)
Environment, CPU architecture, OS, and Version:
Darwin Alexiss-Mac-mini.local 24.3.0 Darwin Kernel Version 24.3.0: Thu Jan 2 20:22:58 PST 2025; root:xnu-11215.81.4~3/RELEASE_ARM64_T8132 arm64
Describe the bug
When trying to use chat models, I get an error.
To Reproduce
Load the smolvlm-256m-instruct model.
Expected behavior
No error.
Logs
Additional context
N/A