I often get asked what I work with, and I’ve always had a hard time answering that. The problem usually starts when I introduce the concept of a Java Virtual Machine. So I thought I’d write a few lines on the evolution of computing to sort of explain how what I work with came about.
At the dawn of electrical computing, programming was done by manually closing and opening electrical circuits. This evolved into punching holes into paper cards representing the closed and open circuits. Eventually better methods of providing instructions to the processing unit came along, but at the hardware level the processor still needs to get its instructions as sequences of ones and zeroes. At first, assemblers appeared with simple mnemonics, or textual symbols, that were little more than a handy but direct representation of those sequences of ones and zeroes. Then higher-level programming languages, such as C, appeared that provided means of expressing the meaning of a program, its semantics, in a syntax that is easier to read and write. Other programs, so-called compilers, were responsible for analyzing the meaning of the program and translating it into machine code. As processors became more advanced, so did the compilers, and today it is very hard for a programmer to produce, by hand in assembly, better or more efficient code than the machine code produced by an optimizing compiler. Even with intimate knowledge of the processor architecture, it can be very hard to, for instance, decide how best to schedule instructions to utilize parallel instruction pipelines.
Java is a modern programming language, and represents another leap in abstraction. Its syntax is reminiscent of that of C, one of the most popular and successful imperative programming languages to date. Using a C-like syntax made the transition to Java easier for programmers already familiar with C, and helped drive adoption.
Quite unlike C, however, Java programs are not compiled into machine code for the hardware that will execute them. Java programs are instead translated into platform independent byte code, which another piece of software, the Java Virtual Machine, or JVM, translates into machine code. You can view the Java Virtual Machine as a hardware abstraction layer with its own machine code. The JVM does the final translation from Java byte code to the machine code of the underlying hardware. Since the JVM translates the Java byte code while the program is running, it has more information about the runtime characteristics of the program than a static compiler could ever have. This means there is more information available for advanced optimizations.
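To make the byte code step a bit more concrete, here is a minimal sketch (the class name and method are just illustrative):

    // Hello.java
    public class Hello {
        // A trivial method; the Java compiler turns its body into byte code,
        // not into machine code for any particular processor.
        public static int add(int a, int b) {
            return a + b;
        }

        public static void main(String[] args) {
            System.out.println(add(2, 3));
        }
    }

Compiling this with javac produces a Hello.class file containing byte code. Disassembling it with javap -c shows platform independent instructions such as iload_0, iload_1, iadd and ireturn for the add method. The same .class file can be run on any hardware that has a JVM; it is the JVM that turns those instructions into native machine code.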
At the beginning, Java got a reputation for being very slow. The early JVMs interpreted the Java byte code, translating it one instruction at a time, which was rather inefficient. Today, a high performance JVM like JRockit analyzes the semantics of the Java program while it is executing. It translates the program into machine code using a JIT (Just In Time) compilation strategy, translating method by method into machine code as it is needed. It uses an internal profiler to find out how the application is actually executing, and dynamically optimizes the running program, in effect recompiling methods while the program is running. A modern JVM also has a high performance, adaptive memory management system that efficiently handles the allocation and reclaiming of memory for the developer. As a result, a Java program running in a modern JVM can, for some workloads, outperform a well written C program.
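As a rough illustration (not specific to JRockit; the names and the iteration count are made up for the example), the sketch below shows the kind of code a JIT compiler takes care of on its own:

    // HotLoop.java
    public class HotLoop {

        // A small method that becomes "hot" once it has been called many times.
        // The JVM's profiler notices this and compiles it to optimized machine
        // code, typically inlining it into the calling loop.
        static long square(long x) {
            return x * x;
        }

        public static void main(String[] args) {
            long sum = 0;
            // The first iterations may run in the interpreter; once the method
            // has been called often enough, the loop executes as native code.
            for (long i = 0; i < 1000000; i++) {
                sum += square(i);
            }
            System.out.println(sum);
        }
    }

Nothing in the source code asks for any of this; the JVM decides for itself which methods are worth compiling and re-optimizing while the program runs. On the HotSpot JVM, for instance, running with the -XX:+PrintCompilation flag prints a line each time a method is compiled, so you can watch it happen.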
So, to finally answer the question, I work with JRockit – one of the fastest and most advanced JVMs in the world.