partial alias/mapping between a container and the model being used #1268
Comments
We do translate if we have the name expanded by shortnames, but if Ollama expands it we don't. @ericcurtin Do you know if there is an easy way to figure out tinyllama=ollama://tinyllama:latest? I guess we could just check if the name has
We could decorate the shortname ourselves as TRANSPORT://model:TAG when we don't see a TRANSPORT or TAG specified, but in some cases doesn't Ollama translate tiny -> tinyllama?
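That decoration step could look roughly like this; a minimal sketch, assuming default transport `ollama` and default tag `latest` (the function name and defaults are hypothetical, not RamaLama's actual code):

```python
def normalize_model_name(name: str,
                         default_transport: str = "ollama",
                         default_tag: str = "latest") -> str:
    """Expand a user-supplied model name into TRANSPORT://model:TAG form.

    Hypothetical helper: adds the transport prefix and tag only when
    the user did not specify them, so fully qualified names pass through.
    """
    if "://" not in name:
        name = f"{default_transport}://{name}"
    transport, _, rest = name.partition("://")
    # Treat anything after a ':' in the remainder as an explicit tag.
    if ":" not in rest:
        rest = f"{rest}:{default_tag}"
    return f"{transport}://{rest}"
```

With this, `tinyllama` becomes `ollama://tinyllama:latest`, while an already qualified name is left untouched. It does not handle Ollama's own alias expansion (e.g. tiny -> tinyllama), which is the open question above.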
I guess:
playing with docker I guess 😄 I think
It wouldn't be that hard to do a reverse lookup if we wanted to.
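Such a reverse lookup could be a simple scan of the shortname table; a sketch, assuming the table is available as a dict (the sample entries and helper name here are illustrative, not the real shortnames data):

```python
# Hypothetical sample of a shortname table mapping aliases to FQNs.
SHORTNAMES = {
    "tinyllama": "ollama://tinyllama:latest",
    "granite": "ollama://granite3.1-dense:latest",
}

def reverse_lookup(fqn: str) -> list[str]:
    """Return every shortname that expands to the given fully qualified name."""
    return [short for short, full in SHORTNAMES.items() if full == fqn]
```

This finds which alias a user typed given the FQN stored on the container, though it only covers names that came from the shortname table, not names Ollama expanded itself.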
The idea behind this is: how do I get the model id from the `model list` command when I have a container running? So no change is expected on the list command, but the annotation/label applied on the container should carry the FQN model name, the one we can get from the `list` command. It's to be able to know which containers are related to a given model or, from a model, which containers are using that model.
I agree with @benoitf here, it would be nice to have this label match the FQN of the pulled model. The question is how easy this is to get with Ollama short names.
Podman does not list shortnames after the fact; it only reports the full name.
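If the label always carried the FQN, mapping a model to its containers would reduce to a label filter on `podman ps`. A sketch of building that command (the label key is the one from this issue; the helper name is hypothetical):

```python
def podman_ps_for_model(model_fqn: str) -> list[str]:
    """Build a `podman ps` command listing containers whose
    ai.ramalama.model label equals the given fully qualified name."""
    return ["podman", "ps", "--filter", f"label=ai.ramalama.model={model_fqn}"]
```

For example, `podman_ps_for_model("ollama://tinyllama:latest")` yields the argument list for `podman ps --filter label=ai.ramalama.model=ollama://tinyllama:latest`, which only works reliably if the label is normalized to the FQN in the first place.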
Issue Description
If I use RamaLama on the host and it starts a container with
ramalama run ollama://tinyllama:latest
I have ai.ramalama.model: ollama://tinyllama:latest, so I know which model is used if I compare with the value of the
ramalama model list
command.

But if I use
ramalama serve tinyllama
the value is ai.ramalama.model: tinyllama, so it contains the value I provided, which makes it difficult/impossible to map the container to the original model.

I would expect to see
ollama://tinyllama:latest
for ai.ramalama.model as well.

In container mode there is an alias being added, but the alias is not the fully qualified model name either.
#1009
Steps to reproduce the issue
Start models using shortnames or a subpart, or use the CLI container.
Describe the results you received
Different format of ai.ramalama.model
Describe the results you expected
I expect to always see one of the names returned by
ramalama list
ramalama info output
0.7.5 / macOS
Upstream Latest Release
Yes
Additional environment details
No response
Additional information
No response