Memory usage keeps increasing in GitHub Actions CI pipeline #641
More findings: the memory usage seems to be roughly proportional to the number of unit tests run. I was able to lower the limit to 85MB on my computer when running … But …
I did some research about this before, and pytest memory usage does increase with the number of tests you have.
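To make this growth visible test by test, here is a minimal sketch of mine (not code from this repository) of an autouse pytest fixture that logs the process RSS after every test with psutil:

```python
# conftest.py (hypothetical sketch): log the process RSS after each test with psutil
import pytest
from psutil import Process


@pytest.fixture(autouse=True)
def log_rss_after_each_test(request):
    yield  # run the test
    rss_in_mib = Process().memory_info().rss / 1024 / 1024
    print(f"[psutil] RSS after {request.node.nodeid}: {rss_in_mib:.1f} MiB")
```

Running with `pytest -s` keeps those prints from being swallowed by output capture.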
I asked the question directly on … I also tried pympler to investigate memory usage:

```python
#!/usr/bin/env python3
from psutil import Process
from pympler.muppy import get_objects, getsizeof
from pympler.summary import print_, summarize
from pympler.util.stringutils import pp
from pytest import console_main

try:
    console_main()
finally:
    print("[psutil] process RSS memory:", pp(Process().memory_info().rss))
    all_objects = get_objects()
    objs_size_sum = sum(getsizeof(obj) for obj in all_objects)
    print("[pympler/muppy] sum of objects memory size:", pp(objs_size_sum))
    print("[pympler/muppy] biggest objects summary:")
    print_(summarize(all_objects))
```

This was the result of calling …
As can be seen, the total size of Python objects reported by pympler/muppy does not account for all of the process memory usage though: memory allocated by C extensions (Pillow image buffers, for example) does not show up as Python objects. Another possibility could therefore be a memory leak in Pillow.
Still investigating... This time I used tracemalloc to spot the biggest memory allocations:
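For anyone wanting to reproduce this kind of investigation, a minimal use of tracemalloc looks roughly like this (my own sketch; the placement of the profiled code is an assumption, not the exact script used at the time):

```python
# Hypothetical sketch: list the biggest memory allocations with tracemalloc
import tracemalloc

tracemalloc.start()

# ... run the code under investigation here, e.g. generate a PDF ...

snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:10]:
    print(stat)  # file, line number, total size and count of allocations
```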
I also attempted to use …
I tried using scalene, but it was filling … I also tried combining PyTest fixtures with `objgraph`:

```python
import objgraph
import pytest


@pytest.fixture(scope="module", autouse=True)
def track_memory_growth():
    objgraph.show_growth(limit=10, shortnames=False)  # starts with a gc.collect()
```

Output: …

The only non-builtin objects that regularly grow in number are …
Another interesting approach, using the following code in …:

```python
import gc, linecache, tracemalloc

import pytest


# Enabling this check creates an increase in memory usage,
# so we require an opt-in through a CLI argument:
def pytest_addoption(parser):
    parser.addoption("--trace-malloc", action="store_true",
                     help="Trace main memory allocations differences during the whole execution")


@pytest.fixture(scope="session", autouse=True)
def trace_malloc(request):
    if not request.config.getoption("trace_malloc"):
        yield
        return
    capmanager = request.config.pluginmanager.getplugin("capturemanager")
    gc.collect()
    tracemalloc.start()
    snapshot1 = tracemalloc.take_snapshot().filter_traces((
        tracemalloc.Filter(False, linecache.__file__),
        tracemalloc.Filter(False, tracemalloc.__file__),
    ))
    yield
    gc.collect()
    snapshot2 = tracemalloc.take_snapshot().filter_traces((
        tracemalloc.Filter(False, linecache.__file__),
        tracemalloc.Filter(False, tracemalloc.__file__),
    ))
    top_stats = snapshot2.compare_to(snapshot1, "lineno")
    with capmanager.global_and_fixture_disabled():
        print("[ Top 10 differences ]")
        for stat in top_stats[:10]:
            print(stat)
```

Output: …
The lines related to … Again, it seems to point towards PyTest, and notably its …
For reference: …
New lead, regarding …:

```python
import gc

from fontTools.ttLib import TTFont
from psutil import Process


def print_mem_usage(prefix):
    rss_in_mib = Process().memory_info().rss / 1024 / 1024
    print(f"[psutil] {prefix} process memory (RSS) usage: {rss_in_mib:.1f} MiB")


print_mem_usage("Initial")
TTFont("test/fonts/NotoColorEmoji.ttf")
TTFont("test/fonts/NotoColorEmoji.ttf")
gc.collect()
print_mem_usage("Final")
```

Output:

```
[psutil] Initial process memory (RSS) usage: 11.1 MiB
[psutil] Final process memory (RSS) usage: 21.1 MiB
```

This memory "leak" disappears if we pass …
I noticed that just importing the `memory_profiler` module has a noticeable impact on the process memory usage:

```python
from psutil import Process

rss_in_mib = Process().memory_info().rss / 1024 / 1024
print(f"[psutil] Initial process RSS memory usage: {rss_in_mib:.1f} MiB")

import memory_profiler

rss_in_mib = Process().memory_info().rss / 1024 / 1024
print(f"[psutil] Final process RSS memory usage: {rss_in_mib:.1f} MiB")
```

Output: …
Sadly, given the number of unanswered issues on https://github.com/pythonprofilers/memory_profiler, I fear that this library is not actively maintained anymore 😢
Based on the latest measurements on my laptop (with …):

This ratio is consistent with my similar observations last month. My current best lead: I suspect a problem related to the memory usage of inserted images: just the …

Note: I checked whether using an older version of Pillow would have an impact on this memory usage, and going back as far as …

Note: while experimenting, I tested 2 …
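To dig into that lead, something along these lines can be used to watch the process RSS while embedding an image repeatedly; this is only an illustrative sketch of mine (the `image.png` path and the page count are arbitrary assumptions), not the exact measurement protocol behind the numbers above:

```python
# Hypothetical sketch: watch the process RSS while inserting the same image on many pages
from fpdf import FPDF
from psutil import Process


def rss_in_mib():
    return Process().memory_info().rss / 1024 / 1024


pdf = FPDF()
print(f"[psutil] RSS before: {rss_in_mib():.1f} MiB")
for _ in range(100):  # arbitrary page count, just to make the trend visible
    pdf.add_page()
    pdf.image("image.png", x=10, y=10, w=100)  # "image.png" is a placeholder path
print(f"[psutil] RSS after inserting images: {rss_in_mib():.1f} MiB")
pdf.output("out.pdf")
print(f"[psutil] RSS after output(): {rss_in_mib():.1f} MiB")
```

As far as I understand, `fpdf2` caches images by file path, so repeated insertions of the same file mostly exercise that cache rather than re-decoding the image each time.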
I introduced some more memory metrics in #721. Based on this recent test suite execution: https://github.com/PyFPDF/fpdf2/actions/runs/4532166384/jobs/7983211542?pr=703 I dug a little deeper by inserting calls to …
...but it is not that surprising after all, given that …
Note: in PR #703, I changed the behaviour of the …

The RSS memory metric is slightly volatile, and definitely not deterministic, so this is far from perfect / stable.
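As a side note, one alternative I can think of (my own suggestion, not something implemented in those PRs) is to also track tracemalloc's traced-memory peak, which only counts Python-level allocations and is therefore more reproducible than RSS, at the cost of missing memory allocated by C extensions:

```python
# Hypothetical sketch: a more deterministic metric than RSS, using tracemalloc's peak
import tracemalloc

tracemalloc.start()

# ... run the code whose memory usage we want to cap, e.g. a PDF generation ...

current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"[tracemalloc] peak Python allocations: {peak / 1024 / 1024:.1f} MiB")
```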
On September 29th of 2022, I introduced some checks in the unit tests suite to ensure memory usage does not increase: 7fce723. Those checks use `memunit`, which itself uses `memory_profiler`, which in turn uses `psutil` to get the process's current memory usage (see the sketch after the threshold figures below).

Those were the initial memory caps:

- `test_intense_image_rendering()`: 165MB
- `test_share_images_cache()`: …

2 months later, we had to progressively increase those thresholds, as we kept exceeding them during the tests execution in GitHub Actions CI pipelines. Those are the actual values at the time I'm writing this:

- `test_intense_image_rendering()`: 178MB
- `test_share_images_cache()`: …

All 3 of those thresholds have increased by more than 10MB, with 40 extra unit tests added in between those commits.
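For readers not familiar with `memunit`, the check essentially boils down to asserting on the process RSS inside a test; a rough equivalent using `psutil` directly (a simplified sketch of mine, not `memunit`'s actual implementation, with the 178MB figure reused from above purely for illustration) would be:

```python
# Simplified sketch of an RSS-based memory cap, roughly what memunit automates for us
from psutil import Process


def test_intense_image_rendering_with_memory_cap():
    # ... render a PDF with many images here (omitted) ...
    rss_in_mib = Process().memory_info().rss / 1024 / 1024
    assert rss_in_mib < 178, f"RSS grew to {rss_in_mib:.1f} MiB, above the 178 MiB cap"
```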
While I expect some variance in GitHub Actions executions, as we have no control over the execution context (the runners' capabilities can change, and the load on GitHub systems can vary), I worry that this could point to some memory leak in `fpdf2`.

I'm opening this issue to keep track of this subject, and I'd be very happy to get some help from the `fpdf2` user community about this 😊