#201 – You Can Leak Memory in C#
January 4, 2011
In C#, you don’t need to explicitly free memory allocated by creating objects on the heap. The objects will be automatically garbage collected when no longer referenced. This means that you won’t experience memory leaks due to forgetting to delete an object.
You can, however, still leak memory in C#. This can happen if you create one or more new objects on the heap and refer to them from a variable that never goes out of scope during the application’s lifetime.
public static List<Person> userLog = new List<Person>();

static void RecordUserInfo(Person justLoggedIn)
{
    userLog.Add(justLoggedIn);
}
Here we’re adding to a list of Person objects whenever the RecordUserInfo method is called (presumably when someone logs in). Assuming that we never remove anyone from this list, none of the Person objects added to it will ever get garbage collected, because we’ll continue to reference them indefinitely. We’re leaking memory.
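One way out is simply to remove the reference when it’s no longer needed. Here’s a minimal sketch of that idea, using a hypothetical RecordUserLogout method that isn’t part of the original example: once the list no longer holds the Person, and nothing else does either, the garbage collector is free to reclaim it.

public static List<Person> userLog = new List<Person>();

static void RecordUserInfo(Person justLoggedIn)
{
    userLog.Add(justLoggedIn);
}

static void RecordUserLogout(Person justLoggedOut)
{
    // Once nothing references this Person anymore, the garbage
    // collector can reclaim its memory.
    userLog.Remove(justLoggedOut);
}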
Note: This is a different sort of “leak” than you’ll encounter in unmanaged (e.g. C++) code. In unmanaged code, a “leak” usually means that the application allocates some memory and then loses every reference to it, so it can never reclaim that memory. Managed code doesn’t leak memory in this way, because the garbage collector releases objects that are no longer referenced.

However, it’s still possible for an application to continuously allocate memory while running and never release it. If this isn’t the developer’s intent or required behavior for the application, it’s a bug. It’s also a “leak” in the sense that memory keeps being allocated but never released. It’s not that the application can’t release the memory, but rather that it won’t.

We might pedantically insist that this isn’t technically a memory leak, based on the traditional definition from unmanaged code. But from a user’s point of view, the application is leaking memory: it’s allocating memory that it isn’t using and, worse, doing so in a way that causes its memory footprint to grow in an unbounded fashion. This will eventually lead to some sort of out-of-memory condition. In practice, people refer to both situations as leaks: the traditional unmanaged leak and the more subtle situation where the application just continues to allocate memory (see The best memory leak definition). It’s more useful to adopt this broader definition of “leak” than to say that “C# doesn’t leak memory”, which leads developers to stop thinking about how their applications allocate and release memory.
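To make the unbounded growth concrete, here’s a rough, self-contained sketch (not from the original post; the Person fields and loop counts are made up for illustration) that keeps calling RecordUserInfo and reports the managed heap size after each forced collection. The number keeps climbing, because the static list holds every Person alive.

using System;
using System.Collections.Generic;

class Person
{
    public string Name;
    public byte[] ProfileData = new byte[100000];  // stand-in for per-user data
}

class Program
{
    public static List<Person> userLog = new List<Person>();

    static void RecordUserInfo(Person justLoggedIn)
    {
        userLog.Add(justLoggedIn);
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++)
        {
            for (int j = 0; j < 1000; j++)
            {
                RecordUserInfo(new Person { Name = "User" + j });
            }

            // GC.GetTotalMemory(true) forces a collection before reporting.
            // The Person objects survive every collection, because the static
            // list still references each one of them.
            Console.WriteLine("Managed heap: {0:N0} bytes", GC.GetTotalMemory(true));
        }
    }
}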