
FEAT: support qwen2.5-vl-instruct #2788

Merged
merged 1 commit into xorbitsai:main on Jan 28, 2025

Conversation

qinxuye
Contributor

@qinxuye qinxuye commented Jan 27, 2025

For now, transformers needs to be installed from source code:

pip install git+https://github.com/huggingface/transformers.git

vllm is not supported yet.

(Demo video attached: 2025-01-27.15.35.18.mov)

@XprobeBot XprobeBot added this to the v1.x milestone Jan 27, 2025
@qinxuye qinxuye merged commit efff7f8 into xorbitsai:main Jan 28, 2025
11 of 13 checks passed
@qinxuye qinxuye deleted the feat/qwen2.5-vl-instruct branch January 28, 2025 01:21
@hxujal

hxujal commented Feb 8, 2025

vllm supports it now; can this be updated?

@qinxuye
Contributor Author

qinxuye commented Feb 8, 2025

vllm supports it now; can this be updated?

vllm 0.7.2 supports it. Would you be willing to submit a PR to add it? It shouldn't be hard.

if VLLM_INSTALLED and vllm.__version__ >= "0.6.3":
    VLLM_SUPPORTED_MODELS.append("llama-3.2-vision")
    VLLM_SUPPORTED_VISION_MODEL_LIST.append("llama-3.2-vision-instruct")
    VLLM_SUPPORTED_VISION_MODEL_LIST.append("qwen2-vl-instruct")
    VLLM_SUPPORTED_VISION_MODEL_LIST.append("QvQ-72B-Preview")

Something like this, but gated on >=0.7.2.
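For illustration, the analogous registration might look like the sketch below. The list name mirrors the snippet above; `VLLM_INSTALLED` and `vllm_version` are stand-ins for the project's real detection logic, and the tuple-based comparison is my substitution, since raw string comparison on `vllm.__version__` misorders versions like "0.10.0":

```python
def version_tuple(v: str) -> tuple:
    # "0.7.2" -> (0, 7, 2); avoids the pitfalls of raw string comparison
    return tuple(int(part) for part in v.split("."))

# Stand-ins for VLLM_INSTALLED and vllm.__version__ in the snippet above
VLLM_INSTALLED = True
vllm_version = "0.7.2"

VLLM_SUPPORTED_VISION_MODEL_LIST = []

# Hypothetical analogue of the quoted block, gated on vllm >= 0.7.2
if VLLM_INSTALLED and version_tuple(vllm_version) >= (0, 7, 2):
    VLLM_SUPPORTED_VISION_MODEL_LIST.append("qwen2.5-vl-instruct")

print(VLLM_SUPPORTED_VISION_MODEL_LIST)
```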

@qinxuye
Contributor Author

qinxuye commented Feb 8, 2025

vllm supports it now; can this be updated?

@hxujal Are you working on it? We're releasing this afternoon; if not, I'll take care of it.

@hxujal

hxujal commented Feb 8, 2025

vllm supports it now; can this be updated?

@hxujal Are you working on it? We're releasing this afternoon; if not, I'll take care of it.

Not yet, please go ahead. Thanks!

@elepherai

(Screenshot attached.)
Hi, I pulled the 1.2.2 image, but I can't run qwen2.5-vl-instruct with vllm. Does that mean I need to upgrade vllm to 0.7.2 myself?

@XiaoCC

XiaoCC commented Feb 10, 2025

Hi, I pulled the 1.2.2 image, but I can't run qwen2.5-vl-instruct with vllm. Does that mean I need to upgrade vllm to 0.7.2 myself?

I don't see the vllm option either.

@qinxuye
Contributor Author

qinxuye commented Feb 10, 2025

Hi, I pulled the 1.2.2 image, but I can't run qwen2.5-vl-instruct with vllm. Does that mean I need to upgrade vllm to 0.7.2 myself?

Yes, for now you need to upgrade vllm to 0.7.2 inside the image yourself. We will bump the bundled version in the next release.

@caulijun

After pulling qwen2.5-vl-7b, loading the model fails with:
AttributeError: [address=0.0.0.0:42908, pid=30123] Qwen2TokenizerFast has no attribute tokenizer
I have already upgraded to the latest versions:
transformers==4.48.3
xinference==1.2.2

@ignore1999

After pulling qwen2.5-vl-7b, loading the model fails with: AttributeError: [address=0.0.0.0:42908, pid=30123] Qwen2TokenizerFast has no attribute tokenizer. I have already upgraded to the latest versions: transformers==4.48.3, xinference==1.2.2.

Same issue here.

@qinxuye
Contributor Author

qinxuye commented Feb 10, 2025

Try installing transformers from source:

pip install git+https://github.com/huggingface/transformers.git
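As a quick sanity check before filing the same error, something like the following sketch can flag a too-old transformers release. The cutoff version 4.49.0 is my assumption about when Qwen2.5-VL support first shipped in a tagged release; the 4.48.3 reported above predates it and triggers the AttributeError, hence the source install:

```python
def version_tuple(v: str) -> tuple:
    # "4.48.3" -> (4, 48, 3); good enough for plain release strings
    return tuple(int(part) for part in v.split("."))

def needs_source_install(transformers_version: str) -> bool:
    # Assumption: Qwen2.5-VL first appears in a transformers release
    # >= 4.49.0; before that, install from the git main branch as above.
    return version_tuple(transformers_version) < (4, 49, 0)

print(needs_source_install("4.48.3"))  # the version reported in the comment above
```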
