
[script] fix vllm version #7193


Merged
merged 1 commit into main on Mar 6, 2025

Conversation

hiyouga (Owner) commented on Mar 6, 2025

What does this PR do?

Fixes #7192

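For readers arriving from #7192: a fix of this kind typically constrains the allowed vllm version range in the project's dependency declaration. The sketch below is illustrative only; the package name, extras key, file, and version bounds are assumptions, not the verbatim diff of commit 3133557.

```python
# Minimal sketch of a vllm version pin in setup.py, assuming the fix
# constrains the allowed vllm range in an optional-dependency group.
# The package name, version bounds, and extras key are illustrative
# assumptions, not the actual contents of this PR.
from setuptools import setup

setup(
    name="llamafactory",  # assumed package name, for illustration only
    extras_require={
        # Keep vllm inside a known-good range so `pip install .[vllm]`
        # cannot resolve an incompatible release.
        "vllm": ["vllm>=0.4.3,<=0.7.3"],
    },
)
```

With a pin like this in place, `pip install -e .[vllm]` resolves a vllm release inside the declared range instead of silently picking up a breaking update.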

hiyouga merged commit 3133557 into main on Mar 6, 2025
0 of 12 checks passed
hiyouga added the solved (This problem has been already solved) label on Mar 6, 2025
hiyouga deleted the hiyouga/fix branch on March 6, 2025 at 09:14
stephen-nju pushed a commit to stephen-nju/Llmtrain that referenced this pull request on Mar 24, 2025
yoonseok312 pushed a commit to pensieve-ai/LLaMA-Factory-vlm that referenced this pull request on Apr 29, 2025
Labels
solved (This problem has been already solved)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

vllm v0.7.3 updating bug (#7192)
1 participant