Whether a language is best interpreted or compiled was a hot topic in the early 1970s. The advantages and disadvantages of both approaches were well understood and agreed upon. Reference [1] presents a typical analysis from that time. The basic conclusion was that interpreters were too slow to be widely useful.
As microprocessors and personal computers became popular in the late 1970s, much more CPU time became available. Interpreters once again found favor with some people and applications. References [2] and [4] promoted the use of interpreters and discussed techniques employed in their construction. Since then, many more interpretive language systems have been built, among them the popular BASIC, Perl, etc., which are still in use.
Nevertheless, compilation remained the mainstream technology in language processing, owing to its continuously improving performance and the introduction of tools and environments that aid the program development process. Relatively speaking, we have seen much more progress and advancement in compiler technology than in interpreter technology over the past 15-20 years. This is evidenced by the vast amount of literature on compilers, in contrast to the little on interpreters, during those years.
The popularization of the World Wide Web and the advent of the Java programming language [8] have injected new energy into interpretation technology. Besides its traditional advantages, interpretation has also proved its value in cross-platform executability and security [10]. This development has already motivated dedicated research on interpreters, such as project Rocky being conducted at the University of Washington [5].
On the other hand, making Java run faster is an active and competitive area in which both academia and industry are working these days, in various directions including: