
feat(server): make ComfyUI log level configurable #150

Open · wants to merge 2 commits into `main` from `add_comfyui_log_level_arg`

Conversation

rickstaa
Collaborator

@rickstaa rickstaa commented Mar 10, 2025

This pull request introduces the `comfyui-log-level` and `comfyui-inference-log-level` arguments to the server, allowing users to adjust the log level and reduce excessive logging.
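For illustration, the two arguments could be wired up roughly as below. This is a hedged sketch using Python's standard `argparse` and `logging` modules; the flag names come from the PR description, but the parser setup, the `"comfy"` logger name, and the helper function are assumptions, not the actual patch.

```python
# Sketch only: argument names follow the PR description; the logger
# name "comfy" and the helper below are illustrative assumptions.
import argparse
import logging

LOG_LEVELS = ["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]

def parse_args(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--comfyui-log-level", choices=LOG_LEVELS, default=None,
        help="Global log level for the embedded ComfyUI logger.")
    parser.add_argument(
        "--comfyui-inference-log-level", choices=LOG_LEVELS, default=None,
        help="Log level applied only while inference is running.")
    return parser.parse_args(argv)

def apply_global_level(level_name):
    # Leave the logger untouched when the flag is not supplied.
    if level_name is not None:
        logging.getLogger("comfy").setLevel(getattr(logging, level_name))
```

With this shape, omitting both flags leaves logging behaviour unchanged, which matches the "without using the CLI arguments" profiling run below.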

Performance Profile

To ensure this does not negatively affect performance, I ran the profiler from #80 with the https://github.com/yondonfu/comfystream/blob/main/workflows/comfystream/sd15-tensorrt-api.json workflow. The results are shown below.

| Configuration | CPU (avg) | RAM (avg) | GPU (avg) | VRAM (avg) |
| --- | --- | --- | --- | --- |
| With changes, without using the CLI arguments | 415.28% | 14752.30MB | 52.93% | 3548.00MB |
| With changes, using the global ComfyUI log CLI argument | 423.68% | 14779.73MB | 55.25% | 3548.00MB |
| With changes, using the inference ComfyUI log CLI argument | 424.22% | 14763.52MB | 54.25% | 3548.00MB |
| Without changes | 412.29% | 14817.95MB | 55.45% | 3548.00MB |

@rickstaa rickstaa marked this pull request as ready for review March 10, 2025 12:48
@eliteprox
Collaborator

There is a `logging-level` parameter that `EmbeddedComfyClient` accepts which could be used to set the verbosity from ComfyUI. I'd encourage using that, but maybe there are other advantages to wrapping a logger around it. See https://github.com/hiddenswitch/ComfyUI/blob/0fbce0a28299384e4f94ab4ff301f7a566dc3461/comfy/cli_args_types.py#L126.

@rickstaa rickstaa force-pushed the add_comfyui_log_level_arg branch 4 times, most recently from 719f82e to 6a0480d, March 10, 2025 18:47
@rickstaa rickstaa requested a review from eliteprox March 10, 2025 19:07
@rickstaa rickstaa marked this pull request as draft March 10, 2025 19:13
@rickstaa
Collaborator Author

rickstaa commented Mar 11, 2025

@eliteprox I took a quick look at your suggestion, but it seems the logging level is not correctly applied in ComfyUI for the `EmbeddedComfyClient`. The log level is set during the import of `EmbeddedComfyClient`, in the `_configure_logging` function.

I couldn't find any references to the `logging_level` argument of `EmbeddedComfyClient` apart from it being returned as the `verbose` value in the configuration. However, this configuration value doesn't appear to be used anywhere in the codebase. As a result, this setting currently seems to apply only to the regular client.
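Since the inference-specific flag implies switching verbosity around inference calls, one plausible shape for that behaviour is a context manager that overrides a logger's level and restores it afterwards. This is a minimal sketch, not the PR's implementation; the logger name `"comfy"` is an assumption.

```python
# Sketch: temporarily override a logger's level for the duration of a
# block (e.g. while inference runs), then restore the previous level.
import contextlib
import logging

@contextlib.contextmanager
def inference_log_level(logger_name, level):
    logger = logging.getLogger(logger_name)
    previous = logger.level
    logger.setLevel(level)
    try:
        yield logger
    finally:
        # Restore the original level even if inference raises.
        logger.setLevel(previous)
```

Because the override is scoped and exception-safe, global logging behaviour is untouched outside the inference call, which keeps the two CLI flags independent of each other.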

@rickstaa rickstaa force-pushed the add_comfyui_log_level_arg branch 5 times, most recently from 171ec81 to 572f80b, March 11, 2025 10:16
@rickstaa rickstaa marked this pull request as ready for review March 11, 2025 10:17
@rickstaa rickstaa force-pushed the add_comfyui_log_level_arg branch 2 times, most recently from 8e6f296 to 6c9def1, March 11, 2025 10:22
This commit introduces the `comfyui-log-level` and `comfyui-inference-log-level`
arguments to the server, allowing users to adjust the log level and reduce
excessive logging.
@rickstaa rickstaa force-pushed the add_comfyui_log_level_arg branch from 6c9def1 to 2a2c314, March 11, 2025 10:23