Thursday, February 25, 2010

Heap Dump Analysis with Memory Analyzer, Part 2: Shallow Size

In the second part of the blog series dedicated to heap dump analysis with Memory Analyzer (see previous post here) I will take a detailed look at the shallow size of objects in the heap dump. In Memory Analyzer the term shallow size means the size of an object itself, without counting and accumulating the sizes of other objects referenced from it. For example, the shallow size of an instance of java.lang.String will not include the memory needed for the underlying char[].

At first glance this seems like a clear definition and a relatively boring topic to read about. So why did I decide to write about it? Because despite the understandable definition, it is not always straightforward (for the tool developers) to calculate the shallow size, or (for a user) to understand how the size was calculated. The reasons? Different JVM vendors, different pointer sizes (32 / 64 bit), different dump formats, insufficient data in some heap dumps, etc. These factors can lead to small differences in the shallow sizes displayed for objects of the same type, and thus to questions.
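To make the variability concrete, here is a rough shallow-size estimator built with plain reflection. It is a sketch, not MAT's actual algorithm: the 16-byte object header and 8-byte reference size are assumptions (matching a 64-bit JVM without compressed references), and real JVMs add alignment padding, so treat the result as an approximation that illustrates exactly the factors mentioned above.

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;

/**
 * Rough shallow-size estimator: sums declared instance field sizes
 * plus an assumed object header. Real JVMs add alignment padding and
 * may compress references, so this is only an illustrative estimate.
 */
public class ShallowSizeEstimate {

    static final int HEADER_BYTES = 16;   // assumption: 64-bit object header
    static final int REFERENCE_BYTES = 8; // assumption: uncompressed 64-bit refs

    static int primitiveBytes(Class<?> type) {
        if (type == long.class || type == double.class) return 8;
        if (type == int.class  || type == float.class)  return 4;
        if (type == short.class || type == char.class)  return 2;
        return 1; // byte, boolean
    }

    /** Sums field sizes up the class hierarchy, ignoring static fields. */
    public static int estimate(Class<?> clazz) {
        int size = HEADER_BYTES;
        for (Class<?> c = clazz; c != null; c = c.getSuperclass()) {
            for (Field f : c.getDeclaredFields()) {
                if (Modifier.isStatic(f.getModifiers())) continue;
                size += f.getType().isPrimitive()
                        ? primitiveBytes(f.getType())
                        : REFERENCE_BYTES; // only the pointer, not the target
            }
        }
        return size;
    }

    public static void main(String[] args) {
        // The array behind a String is NOT part of the String's shallow
        // size: we count only the reference slot pointing at it.
        System.out.println("String estimate: " + estimate(String.class));
    }
}
```

Changing REFERENCE_BYTES from 8 to 4 shows immediately why the same class can report different shallow sizes on 32-bit and 64-bit VMs.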

Is it really important to know the precise size? Not necessarily. If you got a heap dump from an OutOfMemoryError in your production system, and MAT helps you easily find the leak suspect there – let’s say it is some 500 MB object – then the shallow size of every individual object accumulated in the suspect's size doesn’t really matter. The suspect is clear and you can go on and try to fix the problem.

On the other hand, if you are trying to understand the impact of adding some fields to your “base” classes, then the size of the individual instance can be of interest.

In the rest of the post I will have a look at the information available (or missing) in the different snapshot formats, explain what MAT displays as shallow size in the different cases, and try to answer some of the questions about shallow size that we usually get. If you are interested, read further.

Monday, January 25, 2010

Heap Dump Analysis with Memory Analyzer, Part 1: Heap Dumps

Almost two years have passed since the Memory Analyzer tool (MAT) was published at Eclipse. Since then we have collected a lot of feedback, questions and comments from people using it, and we have also gathered experience using the tool ourselves. Most people find their way to solving memory problems with MAT relatively easily, but I am convinced there are also a lot of unexplored features and concepts within the tool, which can be very handy if properly understood and used. Therefore I decided to start a series of blog posts dedicated to memory analysis (with MAT) - starting from the basics and covering the different topics in detail. I will try to answer some of the questions which pop up most often, give some (hopefully useful) hints, explain the benefit of certain “unpopular” queries, and (please, please, please…) collect your feedback.

As the Memory Analyzer is a tool working with heap dumps, I will start with a detailed look at heap dumps – what they are, which formats MAT can read, what can be found inside, how one can get them, etc… If you are interested in the topic, read further.
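One convenient way to obtain an HPROF heap dump is to trigger it programmatically from inside the running JVM, via the HotSpot diagnostic MBean. This API is HotSpot/OpenJDK specific (it is not available on every VM), and the class name HeapDumper below is just an illustrative wrapper:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import com.sun.management.HotSpotDiagnosticMXBean;

/**
 * Triggers an HPROF heap dump of the running JVM via the
 * HotSpotDiagnostic MBean (HotSpot/OpenJDK specific).
 */
public class HeapDumper {

    public static void dump(String path, boolean liveObjectsOnly) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                server, "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        // 'true' dumps only live (reachable) objects, which forces a GC first;
        // the target file must not already exist.
        bean.dumpHeap(path, liveObjectsOnly);
    }

    public static void main(String[] args) throws Exception {
        dump("example.hprof", true); // the resulting file opens directly in MAT
        System.out.println("dump written");
    }
}
```

Other common routes are the jmap command-line tool and the -XX:+HeapDumpOnOutOfMemoryError VM option; the “Help” of MAT describes them in more detail.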

Thursday, May 7, 2009

Memory Analyzer at JavaOne 2009

The Memory Analyzer tool will be presented for the third year in a row at the JavaOne conference. In the technical session TS-4118 Practical Lessons in Memory Analysis the speakers - Krum Tsvetkov (me) and Andrew Johnson - will show how many of the common memory-related problems can be solved using heap dumps and the Memory Analyzer tool. We are going to show many demos, using real-life heap dumps in different formats - HPROF and IBM dumps. Based on these examples we will try to give some hints which we found useful in our troubleshooting experience, and which we think can be easily applied in practice.

The day after the session we will be available at the Eclipse booth (Thursday, June 4, 12:00 - 14:00) in the pavilion area. If you happen to be at the conference and have questions, criticism, or comments for us, we will be happy to meet you there and get your feedback.

I hope to see many of you at the conference!

Tuesday, November 4, 2008

Java User Group Mannheim

Just a quick note: if you are in or around Mannheim, Germany, on November 25th, you might want to join Markus Kohler's presentation of the Memory Analyzer at the Java User Group Mannheim. The talk is at the University of Mannheim at 7 pm.

Thursday, October 23, 2008

Troubleshooting Memory Problems from IBM Systems with Memory Analyzer

There was one question which popped up at every Memory Analyzer demo we did in the last couple of years – “Is the Memory Analyzer also able to read heap dumps from IBM VMs?” And we always had to give a negative answer. Not any longer!
After some joint efforts it is now possible to analyze memory problems which occurred on IBM VMs with the help of IBM’s DTFJ technology and the Memory Analyzer tool.

Overview

The Diagnostic Tool Framework for Java (DTFJ) is a Java API from IBM used to support the building of Java diagnostics tools. Using one and the same API – DTFJ - tools can read data from different data sources in a unified way. The IBM DTFJ adapter enables MAT to work with system dumps from IBM Virtual Machines for Java version 6, version 5.0 and version 1.4.2.

Setting Up the Tools

In order to analyze IBM system dumps with Memory Analyzer, one needs to install the DTFJ adapter into the Memory Analyzer. The DTFJ adapter as well as installation instructions are available here.

Getting Data for Analysis

Detailed documentation on using DTFJ with MAT is added to the “Help” of MAT once the DTFJ adapter is installed. It describes how dumps can be obtained and pre-processed. Further reading about DTFJ can be found in the “Diagnosis documentation”.

Here I will only roughly present the process:

  • Get a system dump from the Java process
  • Run the jextract tool on the system where the dump was taken. This produces a zip file which can then be analyzed on an arbitrary system
  • Open the jextract-ed file with the Memory Analyzer (which, of course, has the DTFJ adapter installed)

The minimum required versions of the IBM JDKs are:

  • Java 1.4.2 SR12
  • Java 5 SR8a
  • Java 6 SR2

Analyzing the DTFJ Extracted Dumps

The analysis with the Memory Analyzer remains the same as for HPROF dumps. This means that if you already have some experience with the tool, you can simply continue doing the analysis for IBM dumps the same way you did before.

If you are new to the tool and need some “first steps” help, here are some resources:

  • Look at the “Getting Started” part of the help coming with the tool
  • Look at our blogs
  • Watch the recorded Eclipse webinar

If you want to see the Memory Analyzer & DTFJ in action, you can visit the “New & Noteworthy” short session at Eclipse Summit Europe 2008.

Tuesday, May 27, 2008

Automated Heap Dump Analysis: Finding Memory Leaks with One Click

There is a common understanding that a single snapshot of the Java heap is not enough for finding a memory leak. The usual approach is to search for a monotonic increase in the number of objects of some class, either by “online” profiling/monitoring or by comparing a series of snapshots made over time. However, such “live” monitoring is not always possible, and is especially difficult to perform in production systems because of the performance costs of using a profiler, and because some leaks show themselves only rarely, when certain conditions are met.

In this blog post I will try to show that analysis based on a single heap dump can also be an extremely powerful means of finding memory leaks. I will give some tips on how to obtain data suitable for the analysis. I will then describe how to use the automated analysis features of the Memory Analyzer tool, which was contributed several months ago to Eclipse. Automating the analysis greatly reduces the complexity of finding memory problems, and enables even non-experts to handle memory-related issues. All you need to do is provide a good heap dump, and click once to trigger the analysis. The Memory Analyzer will create a report with the leak suspects for you. What this report contains, and how the reported leak suspects are found, is described below.
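For readers who want a mental model of what such a report flags, here is a minimal sketch of a classic leak pattern - a static collection that only grows. The class and method names are made up for illustration; the point is that in a single heap dump the dominator tree would show one object (the list) retaining a disproportionate share of the heap:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * A classic leak pattern of the kind a leak-suspect report flags:
 * a static collection that grows and is never cleaned up.
 * GC root -> static field -> ArrayList -> Object[] -> entries.
 */
public class LeakyRegistry {

    private static final List<byte[]> CACHE = new ArrayList<>();

    /** Each "request" adds an entry that is never removed. */
    static void handleRequest() {
        CACHE.add(new byte[1024]);
    }

    static int size() {
        return CACHE.size();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10_000; i++) handleRequest();
        // In a heap dump of this process, the ArrayList would dominate
        // roughly 10 MB of otherwise-unreachable byte arrays.
        System.out.println("retained entries: " + size());
    }
}
```

A single snapshot is enough here precisely because all the leaked objects are kept alive through one accumulation point.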

Tuesday, May 20, 2008

Blog Post Looks at Eclipse's Memory Consumption

Markus Kohler blogged about Analyzing Memory Consumption of Eclipse:

During my talk on May 7 at the Java User Group Karlsruhe about the Eclipse Memory Analyzer I used the latest build of Eclipse 3.4 to show live, that there's room for improvement regarding the memory consumption of Eclipse.

He goes on to take a closer look at the spell checker and at duplicate strings.

Now you may think that this guy is bashing Eclipse, but that's really not the case.
If you beg enough, I might also take a closer look at Netbeans :]
If nothing else, it shows how relatively simple it is to gain some insights about the memory of your application... :-)
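The “duplicate strings” observation is easy to reproduce in a few lines. Equal strings built at runtime are distinct objects with distinct backing arrays unless they are explicitly interned into the JVM's shared string pool (the class name below is just for illustration):

```java
/**
 * Demonstrates the "duplicate strings" problem: equal strings built at
 * runtime are separate objects in the heap, each with its own backing
 * array, unless explicitly interned into the shared string pool.
 */
public class DuplicateStrings {

    /** Builds the string at runtime, so it is NOT pooled like a literal. */
    static String build() {
        return new StringBuilder("spell").append("checker").toString();
    }

    public static void main(String[] args) {
        String a = build();
        String b = build();

        System.out.println(a.equals(b));              // true  - same characters
        System.out.println(a == b);                   // false - two heap copies
        System.out.println(a.intern() == b.intern()); // true  - one pooled copy
    }
}
```

Multiply such duplicates by the thousands of strings a large application holds, and the savings a tool like Memory Analyzer can point out become substantial.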