
[BUG] "Could not detect model type" #96

Open
eliteprox opened this issue Feb 27, 2025 · 1 comment
Labels
bug Something isn't working

Comments


eliteprox commented Feb 27, 2025

Describe the bug

ComfyStream server memory leak after a failed workflow

To Reproduce

After submitting an invalid workflow, usually one where a model is missing, the server does not recover and cannot process further streams.

  1. Download a workflow like sd15_multi_cnet_depthanything_face.json.
  2. Modify the engine name for depthanything to be the wrong name.
  3. Submit the broken workflow to comfystream.
  4. The SD 1.5 models will be loaded, but the workflow will fail to run.
  5. Submit a correct version of the workflow sd15_multi_cnet_depthanything_face.json to ComfyStream UI.
  6. The SD 1.5 model is unable to load (presumably because it is still loaded from the failed workflow start), and the following error occurs:
  File "/workspace/miniconda3/envs/comfystream/lib/python3.11/site-packages/comfy/sd.py", line 892, in load_checkpoint_guess_config
    raise RuntimeError("Could not detect model type of: {}".format(ckpt_path))
RuntimeError: Could not detect model type of: /workspace/ComfyUI/models/checkpoints/dreamshaper_8.safetensors
  • Restarting the Pod via the Runpod UI always works (when I load the workflow for the first time after restarting)
  • Reloading the same workflow (default SD1.5 workflow) sometimes works, sometimes doesn't

Expected behavior

The ComfyStream server should free CUDA memory and system memory, restart the appropriate thread/process queue in EmbeddedComfyClient, and then return a 500 error response.
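The expected recovery path could be sketched roughly as follows. This is a minimal illustration, not the actual comfystream code: `run_workflow` and `free_gpu_memory` are hypothetical names, and the real fix would live inside the EmbeddedComfyClient request path.

```python
# Hedged sketch of the expected behavior: on a failed workflow, release
# GPU/system memory and return an HTTP-style 500 instead of leaving the
# server in a broken state. Names here are illustrative only.
import gc

def free_gpu_memory():
    """Release cached allocations; use CUDA cleanup only if torch is present."""
    gc.collect()
    try:
        import torch
        if torch.cuda.is_available():
            torch.cuda.empty_cache()   # release cached CUDA blocks
            torch.cuda.ipc_collect()   # reclaim inter-process CUDA memory
    except ImportError:
        pass  # CPU-only environment; nothing extra to free

def handle_prompt(run_workflow, workflow):
    """Run a workflow; on failure, clean up and return a (status, body) pair."""
    try:
        result = run_workflow(workflow)
        return 200, result
    except RuntimeError as exc:  # e.g. "Could not detect model type of: ..."
        free_gpu_memory()
        return 500, {"error": str(exc)}
```

With this shape, a second submission of a corrected workflow would start from freed memory rather than hitting the stale checkpoint state described above.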


@eliteprox eliteprox added the bug Something isn't working label Feb 27, 2025
@eliteprox eliteprox changed the title [BUG] Failure loading workflow breaks prompt server [BUG] RuntimeError("Could not detect model type of: {}".format(ckpt_path)) Mar 11, 2025
@eliteprox eliteprox changed the title [BUG] RuntimeError("Could not detect model type of: {}".format(ckpt_path)) [BUG] "Could not detect model type" Mar 11, 2025
eliteprox (Collaborator, Author)
This issue is ready to be closed; I'll give it some time to verify the downstream resolution in staging.
