Patch_level extract #45
Yes, because we have pinned the timm version. At one point, timm underwent a significant update. We always used "strict=True" and never encountered this issue. For the tile-level extractor, you can also follow the instructions here.
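A minimal sketch of checking that pin before building the model (the 0.5.4 version comes from this thread; the assertion itself is only an illustration):

import timm

# the repo pins timm 0.5.4; later releases moved modules around and changed
# the patch-embed interface, which is what breaks strict loading below
assert timm.__version__ == "0.5.4", f"expected timm 0.5.4, got {timm.__version__}"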
I get the same error. The version of timm is 0.5.4, which is installed through pip.
When I try the command "pip install timm-0.5.4.tar", I get an error.
@Jonyyqn I had the same problem once; maybe you can try using the build tool: pip install build
Thx
import torch
import torch.nn as nn
from models.ctran import ctranspath  # adjust this import to where ctranspath is defined in your checkout

model = ctranspath()
model.head = nn.Identity()
td = torch.load(r'./model_weight/CHIEF_CTransPath.pth')
model.load_state_dict(td['model'], strict=True)
When strict=True, there are some keys missing:
RuntimeError: Error(s) in loading state_dict for SwinTransformer:
Missing key(s) in state_dict: "patch_embed.proj.weight", "patch_embed.proj.bias".
Unexpected key(s) in state_dict: "patch_embed.proj.0.weight", "patch_embed.proj.1.weight", "patch_embed.proj.1.bias", "patch_embed.proj.1.running_mean", "patch_embed.proj.1.running_var", "patch_embed.proj.1.num_batches_tracked", "patch_embed.proj.3.weight", "patch_embed.proj.4.weight", "patch_embed.proj.4.bias", "patch_embed.proj.4.running_mean", "patch_embed.proj.4.running_var", "patch_embed.proj.4.num_batches_tracked", "patch_embed.proj.6.weight", "patch_embed.proj.6.bias".
But if I set strict=False, then the model works and prints a [1, 768] output normally.
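Those key names already show the cause: the checkpoint's patch_embed.proj is an nn.Sequential (conv + BatchNorm stem, indices 0/1/3/4/6), while the freshly built model has a plain Conv2d there, i.e. a timm version mismatch. Note that with strict=False those stem weights are simply never loaded and stay randomly initialized. A minimal sketch for diffing the two key sets, reusing model and the checkpoint path from above:

import torch

# compare checkpoint keys against the instantiated model's keys
td = torch.load(r'./model_weight/CHIEF_CTransPath.pth', map_location='cpu')
ckpt_keys = set(td['model'].keys())
model_keys = set(model.state_dict().keys())

print('missing:   ', sorted(model_keys - ckpt_keys))    # in model, not in checkpoint
print('unexpected:', sorted(ckpt_keys - model_keys))    # in checkpoint, not in model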
Btw, from timm.models.layers.helpers import to_2tuple is the outdated import path; if I use the latest timm (from timm.layers import to_2tuple), there is an error in __init__ because the custom stem's signature is missing strict_img_size and output_fmt.
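A common workaround for that __init__ error (a hedged sketch, not the repo's actual code): let the custom stem absorb the extra keyword arguments that newer timm forwards to its embed layer. The layer layout below is inferred from the proj.0/1/3/4/6 checkpoint keys in the error above (two stride-2 conv+BN+ReLU blocks plus a final 1x1 projection); the exact channel widths are an assumption.

import torch.nn as nn
from timm.layers import to_2tuple  # new-style import path

class ConvStem(nn.Module):
    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96,
                 norm_layer=None, flatten=True, **kwargs):
        # **kwargs swallows strict_img_size, output_fmt, and any other new
        # arguments that recent timm versions pass to the patch-embed module
        super().__init__()
        img_size = to_2tuple(img_size)
        patch_size = to_2tuple(patch_size)
        self.img_size = img_size
        self.patch_size = patch_size
        self.grid_size = (img_size[0] // patch_size[0], img_size[1] // patch_size[1])
        self.num_patches = self.grid_size[0] * self.grid_size[1]
        self.flatten = flatten

        # two stride-2 conv+BN+ReLU blocks and a 1x1 projection: this yields
        # exactly the proj.0 / proj.1 / proj.3 / proj.4 / proj.6 parameter keys
        stem = []
        input_dim, output_dim = in_chans, embed_dim // 8
        for _ in range(2):
            stem.append(nn.Conv2d(input_dim, output_dim, kernel_size=3,
                                  stride=2, padding=1, bias=False))
            stem.append(nn.BatchNorm2d(output_dim))
            stem.append(nn.ReLU(inplace=True))
            input_dim = output_dim
            output_dim *= 2
        stem.append(nn.Conv2d(input_dim, embed_dim, kernel_size=1))
        self.proj = nn.Sequential(*stem)
        self.norm = norm_layer(embed_dim) if norm_layer else nn.Identity()

    def forward(self, x):
        x = self.proj(x)                      # B, embed_dim, H/4, W/4
        if self.flatten:
            x = x.flatten(2).transpose(1, 2)  # BCHW -> BNC
        return self.norm(x)

Note that newer timm's Swin may also expect a different output format from the stem, so pinning timm to 0.5.4, as the maintainers suggest above, remains the safer route.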