
That Python one is not a memory leak; it is just careless use of resources.


I don't consider this correct, since the main computation in that function is done in `f()`, where `a` is inaccessible and therefore cannot be deallocated. This is similar to mallocing `a` and then never freeing it. Obviously, when the program dies, Python will deallocate everything, duh; but that's not a useful definition of memory leak, since even in a leaking C program the OS will clean up the memory after the program dies. While the program is running, though, it keeps leaking memory this way, even in Python.
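A minimal sketch of the pattern being described (the names `f`, `leaky`, and `fixed` are hypothetical stand-ins, not from the original article):

```python
def f(summary):
    # stand-in for the long-running main computation
    return summary * 2

def leaky():
    a = list(range(1_000_000))   # large temporary allocation
    summary = sum(a)
    return f(summary)            # `a` stays referenced for all of f(),
                                 # so CPython cannot reclaim it until we return

def fixed():
    a = list(range(1_000_000))
    summary = sum(a)
    del a                        # drop the last reference; the memory
                                 # can be reclaimed before f() runs
    return f(summary)
```

Both return the same result; the only difference is how long the memory for `a` is held.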


I am sorry, but you are wrong. Your understanding of the concept is not complete yet. Close, but not complete.

"Memory leak" has a specific meaning. A memory leak is memory that has been allocated but can no longer be accessed or deallocated. How useful that memory is does not matter. The Wikipedia article explains it much better than I can: https://en.m.wikipedia.org/wiki/Memory_leak

This is a useful definition of memory leak: some software needs to run for a very long time, and it cannot afford to leak memory.

In languages where the memory is automatically managed it is quite hard to generate a memory leak.

Indeed, it is the original article that simply used the wrong terminology.

If you have more questions, let me know! Happy to help!


I understand this definition; I explained it above in my comment. The problem is that this definition is not useful. It's convenient for detecting memory leaks, since it's easy to check for, but it doesn't help us understand all the problems a program can have with leaking resources.

First and foremost, the problem the original definition points out doesn't exist any more. When a program dies, all the resources it allocated (file descriptors, memory, ports, etc.) will be safely cleaned up by the OS; any modern OS that failed to do so would consider that a bug. So there is no practical reason to sweat the original problem for its own sake. It's still worth sweating as an indication of a problem, namely that your program fails to deallocate resources it allocates. But that failure doesn't matter only at the moment your program dies; it matters for the program's entire lifetime. As I explained above, if your program runs a loop over N items and each iteration only needs a constant amount of memory, then all you really need is O(1) memory, since you can deallocate at the end of each iteration. The problem arises when you fail to do so and use O(N) resources instead. That will make your program break on large inputs when it could have worked fine. What's more crucial is:

(1) Analyzing your program's minimum asymptotic resource need

(2) Observing your program's real asymptotic resource need

(3) Optimizing your program so that (2) gets closer to (1)
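The O(N)-versus-O(1) loop argument above can be sketched like this (the function names and the per-iteration buffer are illustrative, not from the original article):

```python
def process_leaky(n):
    buffers = []
    for i in range(n):
        buf = bytearray(1024)        # O(1) working memory actually needed per step
        buf[0] = i % 256
        buffers.append(buf)          # reference kept: O(n) memory held at once
    return sum(b[0] for b in buffers)

def process_frugal(n):
    total = 0
    for i in range(n):
        buf = bytearray(1024)
        buf[0] = i % 256
        total += buf[0]              # only the running result is kept;
                                     # `buf` is freed each iteration: O(1) memory
    return total
```

Both compute the same answer; the first holds every buffer until the end, while the second releases each one as soon as the iteration is done.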


I just believe we work in very different environments.

We had problems with compilers having memory leaks. The software I write runs for weeks or even months without being restarted. Yeah, the original problem of memory leaks, or resource leaks in general, is still very, very real in some fields.

Now, of course, if you use Python, Go, or JavaScript, you will basically never have a real memory leak. But that is not a good reason to call bad use of resources "a memory leak".

BTW: > First and foremost, the problem the original definition points out doesn't exist any more. When a program dies, all the resources it allocated, file descriptors, memory, ports etc will safely be cleaned by the OS. Any modern OS that doesn't do so will consider this a bug. So there is no practical reason to sweat the original problem.

Memory leaks never concern the OS, like never. When the OS allocates memory to a program, it is the program's responsibility to deallocate it, returning it to the OS.

> Any modern OS that doesn't do so will consider this a bug.

This is true but it is not what we are discussing.

Anyhow, I am just trying to help you understand what people usually mean by "memory leak", because you seem a little confused.

But if you are sure, and you are definitely not confused, and you think it is me who is wrong, I am not going to engage in any further discussion.

Cheers,


In JavaScript, where there's often a single thread running a long-lived process, I think the common case is that you keep allocating stuff through event handlers and never release the references. A single wasteful build-up of memory is not really what I see discussed when people talk about memory leaks in JS. As another commenter said, the classic case is adding UI event handlers and then removing the DOM object without removing the handlers. Doing this over and over will eventually make the app unresponsive, just not immediately.
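The same handler-registry pattern can be sketched in Python (all names here — `handlers`, `Widget`, `mount`, `unmount_*` — are hypothetical; the registry stands in for the browser's event system):

```python
handlers = {}

class Widget:
    def __init__(self, name):
        self.name = name
        self.data = bytearray(1024 * 1024)  # pretend this is a heavy DOM node

def mount(widget):
    # the closure captures `widget`, and the registry captures the closure,
    # so the registry transitively keeps the widget alive
    handlers[widget.name] = lambda event: (widget.name, event)

def unmount_leaky(widget):
    pass  # widget is gone from the UI, but its handler still sits in `handlers`

def unmount_fixed(widget):
    handlers.pop(widget.name, None)  # release the handler, and with it the widget
```

Dropping your own reference to the widget is not enough after `unmount_leaky`: the registry's entry keeps it reachable, which is exactly the repeated build-up described above.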



