Recently I started to reverse engineer a roughly ten-year-old Java program (that means it was written at about the same time I touched Java for the first and last time at university – not out of a dislike of Java, but because other programming languages were more suitable for the problems at hand). Actually I am only reverse engineering the GUI applet (the frontend) of a service. The vendor ceased to exist about 10 years ago, the program was never taken over by anyone else, and the system it is used from needs to be updated. The problem: it runs with JRE 1.3. With Java 5 we get no error messages, but it does not work as it is supposed to. With Java 6 we get a popup about some values being NULL or 0.
So, first step: decompile all classes of the applet. Second step: compile the result for JRE 1.3 and test whether it still works. Third step: modify it to run with Java 6 or 7. Fourth step: be happy.
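The second step, cross-compiling for JRE 1.3, can be sketched roughly like this – assuming a JDK old enough to still accept a 1.3 target (JDK 9 dropped support for targets below 1.6, so this needs JDK 8 or earlier); the paths and directory layout here are made-up placeholders:

```shell
# Hypothetical build command for compiling decompiled sources
# against the 1.3 runtime. Pointing -bootclasspath at the old
# rt.jar keeps javac from silently linking against newer APIs.
javac -source 1.3 -target 1.3 \
      -bootclasspath /opt/jre1.3/lib/rt.jar \
      -d build/classes \
      $(find decompiled/src -name '*.java')
```

Without the `-bootclasspath` flag javac warns that it is compiling old-version bytecode against the current class library, which is exactly the kind of mismatch that produces code that compiles but fails at runtime on the old JRE.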
Well, after decompiling all classes I now have about 1450 source files (~1100 Java source code files; the rest are images, properties files, and maybe other stuff). From initially more than 4000 compile errors I am down to about 600. And those are only the compile errors. Bugs in the code (whether put there by the decompiler or by the programmers who wrote this software) are still to be detected. Unfortunately I don’t know whether I can compile just a subset of the classes for Java 6/7 and let the rest be compiled for Java 1.3, but I have a test environment where I can play around.
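One thing that works in favor of the mixed-version idea: a JVM loads class files up to its own class-file version, so classes targeted at 1.3 (major version 47) load fine on a Java 6 (major 50) or Java 7 (major 51) runtime, just not the other way around. A small sketch for checking which version a given class file was compiled for, by reading the header bytes directly (the class name `ClassVersion` is my own invention, not from the application):

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ClassVersion {
    // A .class file starts with the magic number 0xCAFEBABE,
    // followed by minor and major version as big-endian shorts.
    // Major 47 = Java 1.3, 50 = Java 6, 51 = Java 7.
    public static int majorVersion(DataInputStream in) throws IOException {
        if (in.readInt() != 0xCAFEBABE) {
            throw new IOException("not a class file");
        }
        in.readUnsignedShort();        // minor version (skipped)
        return in.readUnsignedShort(); // major version
    }

    public static void main(String[] args) throws IOException {
        for (String name : args) {
            try (DataInputStream in =
                     new DataInputStream(new FileInputStream(name))) {
                System.out.println(name + ": major " + majorVersion(in));
            }
        }
    }
}
```

Running this over the build output should make it obvious which classes ended up targeted at which runtime, without having to trust the build flags.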
Plan B (searching for a replacement for the application) is already in progress in parallel. We will see which solution is faster.