GraalVM for JDK 24 Released with New Features to Enhance Performance and Efficiency
Oracle's GraalVM team has announced the release of GraalVM for JDK 24, introducing a range of new features aimed at boosting performance and improving efficiency for Java developers. As with previous updates, GraalVM ships alongside the Java 24 release, giving users the latest Java Development Kit (JDK) together with GraalVM's compilation capabilities.
The new release brings notable enhancements across several areas, including machine learning-based compiler optimizations, smaller native binaries, expanded support for Java agents, and improved energy efficiency for applications running on GraalVM's Native Image.
GraalVM has introduced an upgraded machine learning-powered optimization called GraalNN, which uses a pre-trained ML model to predict execution patterns at build time and improve the peak performance of native images. Across a series of microservice benchmarks built with Micronaut, Spring, and Quarkus, GraalNN delivered an average performance improvement of approximately 7.9%.
This innovation is aimed at reducing the need for manual profiling and rebuilding, allowing developers to approach optimal peak performance with just a single build, further enhancing GraalVM's reputation for simplifying the development process.
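For context, a conventional profile-guided optimization (PGO) workflow requires an instrumented build, a run of a representative workload, and a second build that consumes the collected profiles. The sketch below contrasts that with the single-build path; it assumes Oracle GraalVM's documented --pgo-instrument and --pgo options, and the application and jar names are placeholders:

    # Traditional PGO: instrument, run a representative workload, rebuild with profiles
    native-image --pgo-instrument -cp app.jar -o myapp com.example.MyApp
    ./myapp                      # exercising the app writes default.iprof
    native-image --pgo=default.iprof -cp app.jar -o myapp com.example.MyApp

    # With ML-based profile prediction, a single build is enough;
    # the pre-trained model infers the missing profile information at build time.
    native-image -cp app.jar -o myapp com.example.MyApp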
"We're seeing major performance gains without requiring users to undergo additional steps," said a GraalVM team representative, in a blog post. "This is a game-changer for developers working on performance-critical applications."
The team is also looking ahead to even greater gains from this work in the next release, GraalVM for JDK 25.
The new release also brings improvements to GraalVM's Native Image functionality, which is known for delivering fast startup times, low memory usage, and compact packaging. A new feature called SkipFlow refines the static analysis that determines which methods and classes are actually needed at run time, reducing the size of generated native images by around 6.35%, with some users also reporting faster build times.
This addition is still experimental, but it marks a step forward in streamlining Native Image builds. The feature is enabled and fine-tuned through additional build flags, offering developers flexibility in managing application size and performance.
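A minimal sketch of opting in follows; the hosted option names are those described for SkipFlow as best recalled and, being experimental, may change, so check the current Native Image documentation:

    # Enable the experimental SkipFlow analysis during a native image build.
    # Experimental options must first be unlocked; flag names may change.
    native-image -H:+UnlockExperimentalVMOptions \
                 -H:+TrackPrimitiveValues -H:+UsePredicates \
                 -cp app.jar -o myapp com.example.MyApp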
For Java developers using agents in their applications, the latest GraalVM release extends support for premain methods to run time. Previously, Java agents could only transform classes at build time; with this new functionality, an agent's premain method can execute while the native image itself is running.
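A premain entry point follows the standard java.lang.instrument convention; the minimal agent below is a generic illustration with placeholder names, not code from the announcement:

    import java.lang.instrument.Instrumentation;

    // A minimal Java agent. The premain method runs before the application's
    // main method; with the new GraalVM support, it can now execute when a
    // native image starts rather than only being applied at build time.
    public class SimpleAgent {
        public static void premain(String agentArgs, Instrumentation inst) {
            System.out.println("SimpleAgent premain, args: " + agentArgs);
        }
    }

Packaged in a jar with a Premain-Class manifest entry and passed with -javaagent, this is the same mechanism that monitoring and instrumentation tools already rely on.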
"Supporting Java agents at runtime is something our community has been asking for, and we're excited to deliver this new capability," the GraalVM team noted. "This opens up new possibilities for developers who rely on agents for monitoring, instrumentation, and other tasks."
This improvement is expected to continue evolving, with further enhancements slated for the upcoming JDK 25 release.
A key area of focus in GraalVM for JDK 24 is its continued work on the Vector API, which lets developers express vector computations, such as the matrix multiplications at the heart of large-scale AI models, in a form the compiler can map to optimized SIMD instructions. In this release, GraalVM supports Vector API optimizations in Native Image, on par with JIT performance.
This improvement is significant for machine learning workloads, where throughput is crucial. The team also showcased a demonstration using the Llama3 Java project, in which GraalVM's optimizations allowed a 1B-parameter Llama 3.2 model to process 52.25 tokens per second on a CPU.
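The Vector API lives in the incubating jdk.incubator.vector module (compiled and run with --add-modules jdk.incubator.vector). The self-contained sketch below shows the kind of kernel that benefits, a vectorized dot product; it is a generic illustration, not code from the Llama3 demo:

    import jdk.incubator.vector.FloatVector;
    import jdk.incubator.vector.VectorOperators;
    import jdk.incubator.vector.VectorSpecies;

    public class DotProduct {
        // Use the widest vector shape the hardware supports.
        static final VectorSpecies<Float> SPECIES = FloatVector.SPECIES_PREFERRED;

        // Dot product of two equal-length arrays; the vector loop compiles to
        // SIMD instructions, now in Native Image as well as under the JIT compiler.
        static float dot(float[] a, float[] b) {
            float sum = 0f;
            int i = 0;
            int upper = SPECIES.loopBound(a.length);
            for (; i < upper; i += SPECIES.length()) {
                FloatVector va = FloatVector.fromArray(SPECIES, a, i);
                FloatVector vb = FloatVector.fromArray(SPECIES, b, i);
                sum += va.mul(vb).reduceLanes(VectorOperators.ADD);
            }
            for (; i < a.length; i++) {   // scalar tail for leftover elements
                sum += a[i] * b[i];
            }
            return sum;
        }
    }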
As demand for more efficient computing grows, GraalVM has made strides in energy efficiency by optimizing Native Image applications to reduce resource consumption, including electricity usage. Initial tests, including one on the Spring PetClinic application, show that native images consistently consume less energy than their JIT counterparts, even under heavy load.
With sustainability in mind, the GraalVM team emphasized that these improvements could lead to lower operational costs, especially for large-scale, resource-intensive applications.
In the realm of security, GraalVM has added support for generating a Software Bill of Materials (SBOM), allowing users to track the dependencies and components of their applications. This is essential for vulnerability scanning and for ensuring compliance with security standards. The new SBOM features also include more detailed metadata, such as class-level information, which can help improve the accuracy of vulnerability scanning.
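In Oracle GraalVM, SBOM generation is requested at build time with the --enable-sbom option; the sketch below is minimal, and the class-level variant on the second line is an assumption about how the new metadata is exposed, so consult the Native Image documentation for the exact option value:

    # Embed an SBOM in the native executable (Oracle GraalVM)
    native-image --enable-sbom -cp app.jar -o myapp com.example.MyApp

    # Class-level metadata: the option value shown here is an assumption
    native-image --enable-sbom=class-level -cp app.jar -o myapp com.example.MyApp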
In addition, GraalVM now supports build reports, providing developers with insights into their build environments, resource usage, and the contents of native executables. This will help teams better understand the impact of their builds and optimize their deployment strategies.
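Build reports are produced with the --emit option on the native-image command line; a minimal sketch with placeholder application names:

    # Generate an HTML build report alongside the native executable
    native-image --emit build-report -cp app.jar -o myapp com.example.MyApp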
As GraalVM continues to evolve, the platform's growing adoption is being recognized by the developer community. According to InfoQ's Java 2024 Trends Report, GraalVM is now considered a mainstream technology, with its Native Image functionality being embraced by the "Early Majority."
GraalVM's support for a wide range of frameworks, including Micronaut, Spring, and Quarkus, as well as its expanding ecosystem, reflects its growing role in modern Java development. The platform is also gaining traction in the world of cloud computing, with AWS and GitHub both integrating GraalVM to improve the performance and security of Java applications.
While GraalVM for JDK 24 offers a wealth of improvements, the team is already looking forward to the next release. Future enhancements, such as continued optimization of the Vector API and expanded runtime support for Java agents, are expected to further cement GraalVM's place as a key tool in the Java ecosystem.
With its robust features for optimizing performance, reducing resource consumption, and ensuring security, GraalVM is poised to remain a central player in the evolution of Java development, especially as AI, machine learning, and efficiency continue to drive innovation.
Posted by John K. Waters on March 26, 2025