Memory leaks when image discarded in Python tkinter

First, you definitely do not have a memory leak. If it "collapses" whenever it gets near 500MB and never crosses it, it can't possibly be leaking.


And my guess is that you don't have any problem at all.

When Python's garbage collector cleans things up (which generally happens immediately when you're done with it in CPython), it generally doesn't actually release the memory to the OS. Instead, it keeps it around in case you need it later. This is intentional—unless you're thrashing swap, it's a whole lot faster to reuse memory than to keep freeing and reallocating it.
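You can watch this happen. The sketch below is illustrative only and assumes the third-party psutil package is installed; exact numbers vary by platform and allocator:

    import gc
    import psutil  # third-party: pip install psutil

    def rss_mib():
        # Resident set size of this process, in MiB.
        return psutil.Process().memory_info().rss // 2**20

    print("baseline:", rss_mib())
    big = [bytes(1024) for _ in range(200_000)]  # roughly 200MB of objects
    print("allocated:", rss_mib())
    del big
    gc.collect()
    print("after del:", rss_mib())  # often still well above baseline: the
                                    # memory went back to the allocator,
                                    # not (necessarily) to the OS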

Also, if that 500MB is virtual memory, it's nothing on a modern 64-bit platform. If it's not mapped to physical/resident memory (or is mapped only while the machine is idle, and quickly tossed under pressure), it's not a problem; it's just the OS being generous with resources that are effectively free.

More importantly: What makes you think there's a problem? Is there any actual symptom, or just something in Program Manager/Activity Monitor/top/whatever that scares you? If it's the latter, take a look at the same stats for the other programs. On my Mac, I've got 28 programs currently running using over 400MB of virtual memory, and I'm using 11 out of 16GB, even though less than 3GB is actually wired. If I, say, fire up Logic, that memory will be collected faster than Logic can use it; until then, why should the OS waste effort unmapping memory, especially when it has no way to be sure some process won't come back and ask for the memory it wasn't using?


But if there is a real problem, there are two ways to solve it.


The first trick is to do everything memory-intensive in a child process that you can kill and restart to recover the temporary memory (e.g., by using multiprocessing.Process or concurrent.futures.ProcessPoolExecutor).

This usually makes things slower rather than faster. And it's obviously not easy to do when the temporary memory is mostly things that go right into the GUI, and therefore have to live in the main process.
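If it does fit your situation, here is a minimal sketch using concurrent.futures (crunch is a hypothetical stand-in for your real memory-hungry work; max_tasks_per_child requires Python 3.11+):

    import multiprocessing as mp
    from concurrent.futures import ProcessPoolExecutor

    def crunch(path):
        # Hypothetical stand-in for the memory-hungry work; it runs in a
        # child process, so all of its temporary memory dies with that
        # process.
        with open(path, "rb") as f:
            return len(f.read())

    if __name__ == "__main__":
        # max_tasks_per_child=1 (Python 3.11+) recycles the worker after
        # each task, handing its memory back to the OS; it is incompatible
        # with the "fork" start method, hence the explicit "spawn" context.
        ctx = mp.get_context("spawn")
        with ProcessPoolExecutor(max_workers=1, mp_context=ctx,
                                 max_tasks_per_child=1) as pool:
            print(pool.submit(crunch, "big_image.ppm").result())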


The other option is to figure out where the memory's being used and not keep so many objects around at the same time. Basically, there are two parts to this:

First, release everything you can before the end of each event handler. This means calling close on files, either using del on objects or setting all references to them to None, calling destroy on GUI objects that aren't visible, and, most of all, not storing references to things you don't need. (Do you actually need to keep the PhotoImage around after you use it? If you do, is there any way you can load the images on demand?)
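For the PhotoImage case, a rough sketch of loading on demand might look like this (Viewer and show are illustrative names, not anything from your code):

    import tkinter as tk

    class Viewer:
        def __init__(self, root, paths):
            self.paths = paths
            self.label = tk.Label(root)
            self.label.pack()
            self.current = None  # keep exactly one image alive at a time

        def show(self, index):
            # Rebinding self.current drops the only reference to the old
            # PhotoImage, so CPython can reclaim it immediately.
            self.current = tk.PhotoImage(file=self.paths[index])
            self.label.config(image=self.current)

Note that the usual tkinter gotcha is the reverse (images going blank because nothing holds a reference), so keep exactly one strong reference to whatever image is currently displayed.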

Next, make sure you have no reference cycles. In CPython, garbage is cleaned up immediately as long as there are no cycles—but if there are, they sit around until the cycle checker runs. You can use the gc module to investigate this. One really quick thing to do is try this every so often:

    import gc

    print(gc.get_count())
    gc.collect()
    print(gc.get_count())

If you see huge drops, you've got cycles. You'll have to look inside gc.get_objects() and gc.garbage, or attach callbacks, or just reason about your code to find exactly where the cycles are. For each one, if you don't really need references in both directions, get rid of one; if you do, change one of them into a weakref.
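For example, a parent/child structure is a classic source of cycles; here's a sketch of breaking one with weakref (Node is just an illustration):

    import weakref

    class Node:
        def __init__(self):
            self.children = []
            self.parent = None

    parent = Node()
    child = Node()
    parent.children.append(child)
    # A plain child.parent = parent would create a cycle; a weak
    # reference lets the parent die as soon as nothing else holds it.
    child.parent = weakref.ref(parent)

    assert child.parent() is parent  # call the ref to get the object
    del parent
    # The parent is reclaimed immediately; child.parent() now returns
    # None instead of keeping a dead parent alive.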


Saving 500MB is worthwhile; so is saving 100MB, and so is saving 10MB. Memory is worth its weight in gold, yet many people suggest wasting it. Of course, it's your decision: if you want to waste it on your Mac, go ahead. But it is sad advice, and it leads to very poor software.

Use https://pypi.org/project/memory-profiler/ to track your Python memory allocations. Then use something like this:

    import gc

    x = someRamConsumingObject()  # placeholder for whatever allocates the RAM

    # do the stuff here ...

    # remove the references
    del x
    x = None

    gc.collect()  # try to force the garbage collector to collect
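For reference, a minimal memory-profiler run looks like this (load_data is just an illustration; save the script and run it with python -m memory_profiler to get a line-by-line memory report):

    from memory_profiler import profile

    @profile
    def load_data():
        data = [0] * (10 ** 7)  # allocate roughly 80MB
        del data

    if __name__ == "__main__":
        load_data()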

Philosophical discussions aside, real examples from industrial edge computing give exact reasons why this needs attention. If you run Python in containers, you will soon hit a wall, especially with multiple containers running on the edge under heavy production load.

And even if the edge device has 16GiB, you will hit that wall quickly, especially when using data-analytics tools like Pandas.

Then, my friend, you will learn what garbage-collector hell is, and what it means to not have memory under control.

C++ rocks!!!