Tuning Java Application Performance

There is a lingering perception that Java applications are slower than applications written in other languages. This is a myth: Java applications do not have to be slow, and with the right configuration tuning they can be as fast as applications built with any other programming language. The perception may have had some basis about 20 years ago, when Java was first being used to develop applications. Early Java implementations took a long time for the Java Virtual Machine (JVM) to start, and early Java user interfaces, built on Swing technology, were slow.

Java isn’t slow
hour-glassThis is probably obvious to many of you, but I wanted to write it down just in case. I used to write Java at work, and my Java programs were really slow. I thought (and people sometimes told me) that this was because Java is a slow programming language. This just isn’t true

The key measures of Java application performance are throughput (requests served per second) and response time (i.e., latency). Several benchmarks indicate that Java applications these days have performance similar to other modern application programming technologies, such as Microsoft .NET.

This blog provides insights into simple changes you can make to your Java Virtual Machine (JVM) settings to further boost your Java application performance. Most of these changes do not involve changes to the application code and can therefore be implemented by application operations teams without requiring the involvement of application development teams.


7 Configurations to Enhance the Performance of Your Java Web Applications

  1. Move to the Latest Stable Java Version
  2. Size the Java Heap Memory Correctly
  3. Set the Initial Java Heap Size
  4. Choose the Right Garbage Collection Algorithm
  5. Tune the Garbage Collector
  6. Ensure your Web Container’s Thread Pool is Sized Correctly
  7. Avoid Database Connection Delays

1. Move to the Latest Stable Java Version

Every release of Java has introduced performance improvements. Make sure to work with the latest stable release of Java to take advantage of these improvements. For instance, benchmarks indicate a 5-16% performance improvement when using Java 11 vs. Java 8.

At the same time, newer versions of Java have introduced constructs that address challenges encountered in older versions, and using these newer constructs can give you significant performance benefits. For instance, ConcurrentHashMap, introduced in JDK 1.5, is a higher-performance data structure than the more conventional, fully synchronized Hashtable.
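
As a minimal sketch of the kind of change involved, the example below uses a ConcurrentHashMap as a shared request counter in place of a fully synchronized Hashtable; the class and method names are hypothetical.

import java.util.concurrent.ConcurrentHashMap;

public class RequestCounters {

    // Replaces a legacy java.util.Hashtable, whose every operation
    // synchronizes on a single lock and serializes all callers.
    private final ConcurrentHashMap<String, Long> counters = new ConcurrentHashMap<>();

    public void recordRequest(String endpoint) {
        // merge() performs the read-modify-write atomically,
        // so no external synchronization is needed.
        counters.merge(endpoint, 1L, Long::sum);
    }

    public long countFor(String endpoint) {
        return counters.getOrDefault(endpoint, 0L);
    }
}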

Moving to the latest stable release is also advisable if you are concerned about security. For instance, in current JVM releases, TLS v1.2 is used as the default transport-level security protocol for SSL-based HTTP connections. This means that if you use an earlier version of the JVM, your application may be exposed to security threats that exist with TLS v1.1.

2. Size the Java Heap Memory Correctly

While the JVM does manage memory dynamically, it needs to be configured with sufficient memory to perform its task. Some of the common symptoms that indicate that you may not have configured memory correctly are:

  • Your application works fine when you start it, but over time it becomes slower and slower. A restart often fixes the issue.
  • You are seeing java.lang.OutOfMemoryError errors when running your application. Usually, this error is thrown when the JVM cannot allocate an object because it is out of memory, and no more memory could be made available by the garbage collector.
  • Garbage collections are happening often, and the GC process is taking a lot of CPU cycles.
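
To confirm the last symptom in this list, you can enable GC logging and check how often collections run and how long they take. The flags below are standard JVM options; the log file name and application jar are placeholders.

# Java 9 and newer (unified logging)
java -Xlog:gc*:file=gc.log:time,uptime -jar myapp.jar

# Java 8
java -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:gc.log -jar myapp.jar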

The memory available to the JVM is configured when the Java application is first started, and this cannot be changed dynamically. An application restart is required for any changes to a JVM’s memory setting to be reflected.

The Xmx parameter to the JVM governs the maximum amount of heap memory allocated to the application. This is probably the most important JVM argument. The heap size of a Java application must be set optimally: too small a heap will result in out-of-memory errors and cause your application to stop functioning properly until it is restarted, while a very large heap also has an adverse impact on performance. With an oversized heap, garbage collection may happen less frequently, but when it does happen, a large amount of memory has to be reclaimed in one cycle, which can result in long application pauses.

Make sure to monitor the heap usage of your JVM; if heap usage ever gets close to 90%, it is an indication that you need to tune the memory available to the JVM. Memory shortages can also be caused by memory leaks in the application. To diagnose memory leaks, take a heap dump and analyze the memory used by different objects with a tool such as the Eclipse Memory Analyzer (MAT).
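
For example, a heap dump from a running JVM can be captured with the jmap utility that ships with the JDK and then opened in MAT; the process ID and file name below are placeholders.

# Dump live objects from the JVM with process ID 12345 into a binary .hprof file
jmap -dump:live,format=b,file=heapdump.hprof 12345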

There are 600+ arguments that you can pass to a JVM just around garbage collection and memory. If you include other aspects, the number of JVM arguments will easily cross 1000+. That's way too many options and settings…
7 JVM Arguments of Highly Effective Applications

3. Set the Initial Java Heap Memory Size

The Xms argument to the JVM specifies the initial heap memory size. This means that your JVM will be started with Xms amount of memory and will be able to use a maximum of Xmx amount of memory. For example, the command below starts a JVM with 256 MB of heap memory and allows the heap to grow to a maximum of 2048 MB:

java -Xms256m -Xmx2048m

Note that Xms is only the initial heap memory size; during normal operation of your application, the heap memory actually in use can be lower than Xms, for example when users are inactive or just after a garbage collection has run. You should set Xms and Xmx to the same value for best performance, so that the JVM does not spend time resizing the heap at runtime.
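
If you want to verify the heap limits a running JVM actually received, a small sketch like the one below prints the figures exposed by the Runtime API; the class name is just an example.

public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;

        // maxMemory() roughly corresponds to -Xmx,
        // totalMemory() is the heap currently committed by the JVM,
        // freeMemory() is the unused part of the committed heap.
        System.out.println("Max heap (MB):       " + rt.maxMemory() / mb);
        System.out.println("Committed heap (MB): " + rt.totalMemory() / mb);
        System.out.println("Free heap (MB):      " + rt.freeMemory() / mb);
    }
}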

4. Choose the Right Garbage Collection Algorithm

Modern JVMs provide several options for the garbage collection algorithm to be used. Here are some of the garbage collection options available in JDK 12 and newer:

  • Serial GC
  • Parallel GC
  • Concurrent Mark and Sweep GC
  • G1 GC
  • Shenandoah GC
  • Z GC
  • Epsilon GC

If you don’t specify the GC algorithm explicitly when starting your application, then JVM will choose the default algorithm. Until Java 8, Parallel GC was the default GC algorithm. Since Java 9, G1 GC is the default GC algorithm.

The choice of GC algorithm can play an important role in determining an application's performance. For instance, G1 GC and ZGC are reputed to offer far better performance than the Concurrent Mark and Sweep GC. Note that ZGC is available on Linux from JDK 11 and on Windows from JDK 14 (Windows 10 or Windows Server 2019 is required). To use G1 GC, you have to specify the -XX:+UseG1GC argument for the JVM, as illustrated below.
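
For illustration, the commands below show how a collector can be selected explicitly at startup; the application jar name is a placeholder, and each flag requires a JDK build that ships that collector.

# G1 (the default since Java 9, shown here set explicitly)
java -XX:+UseG1GC -jar myapp.jar

# Parallel (throughput-oriented) collector
java -XX:+UseParallelGC -jar myapp.jar

# ZGC (Linux from JDK 11, Windows from JDK 14; on JDK 11-14 also add
# -XX:+UnlockExperimentalVMOptions because ZGC was experimental until JDK 15)
java -XX:+UseZGC -jar myapp.jar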

For best Java application performance, you need to monitor the JVM, the web container, and the application code.

Get 360° visibility into Java application performance with eG Enterprise.

5. Tune the JVM Garbage Collector

The goal of tuning the JVM's garbage collector is to reduce the time required to perform a full garbage collection cycle, not to minimize the frequency of full garbage collections. Tuning only to make full collections less frequent tends to defer the work until an eventual forced garbage collection cycle that may take several full seconds to complete.

One of the key GC parameters is MaxGCPauseMillis. This value sets a target for the maximum time the JVM may pause the application during GC. It is a soft goal, and the JVM will make its best effort to achieve it. For GC pauses to go unnoticed, this value should be as low as possible; at the same time, setting it too low can cause the garbage collector to trigger too often, which hurts performance. We have found a value close to 500 ms to yield the best throughput without any noticeable performance degradation.

There are other related parameters that can be set to tune the JVM's garbage collector further (a combined example command line follows this list):

  • -XX:ParallelGCThreads: Sets the number of threads used during parallel phases of the garbage collectors. The default value varies with the platform on which the JVM is running.
  • -XX:ConcGCThreads: Number of threads that concurrent garbage collectors will use. The default value varies with the platform on which the JVM is running.
  • -XX:InitiatingHeapOccupancyPercent: Percentage of heap occupancy after which concurrent GC is started. A value of 0 implies that GC runs all the time. The default value is 45.
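
Putting these together, a startup command might look like the sketch below; the values and the application jar are illustrative only and should be validated against your own workload.

java -Xms2048m -Xmx2048m \
     -XX:+UseG1GC \
     -XX:MaxGCPauseMillis=500 \
     -XX:ParallelGCThreads=4 \
     -XX:ConcGCThreads=2 \
     -XX:InitiatingHeapOccupancyPercent=45 \
     -jar myapp.jar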

6. Ensure your Web Container's Thread Pool is Sized Correctly

You might have the most efficient code, but if the web container is not configured correctly, your application will not deliver the performance expected. Web containers, such as Tomcat, JBoss, WebLogic and WebSphere, have limits on the number of threads that can be spawned to process requests. The default thread pool settings of these containers are too low for most production workloads. Make sure that you check the thread pool configuration of your web container and ensure that incoming requests are not left waiting for threads to become available to process them.

While it is important to ensure that there are sufficient threads configured for your web container, providing an unnecessarily large thread pool is also detrimental. When there are more threads than the JVM can handle efficiently, the result is excessive context switching; garbage collection may also be triggered more often, and the overall performance of the application suffers. So remember: bigger is not always better when it comes to Java thread pools!
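
As one concrete illustration, Tomcat's worker thread pool is configured on the Connector element in conf/server.xml (or on a shared Executor); the values below are a starting point for illustration only and should be tuned from measured load.

<!-- conf/server.xml: HTTP connector with an explicit thread pool.
     maxThreads      : maximum worker threads processing requests
     minSpareThreads : idle threads kept ready for request bursts
     acceptCount     : queue length once all worker threads are busy -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           maxThreads="200"
           minSpareThreads="25"
           acceptCount="100" />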

7. Avoid Database Connection Delays

Most production web applications these days use database connection pools. It is well understood that the time taken to establish a database connection can exceed the time taken to execute a query once the connection is established. Hence, with connection pooling, a set of connections to the database is pre-established, and applications borrow connections from the pool, use them, and return them to the pool. By reusing connections for multiple queries, applications avoid repeated connection setup overhead and thereby improve their response times to end users.

Sizing of the connection pool can play an important role in how scalable and performant a Java application is. If the pool is sized too small, there will be little benefit from connection pooling. At the same time, if the connection pool is too big, i.e., there are too many open connections to the database server, those connections impose a higher overhead on the server and thereby slow down its responses. Application operations teams need to collect empirical data from the application running in production and use it to tune the size of the database connection pool for optimal performance.
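
As a sketch of where such sizing is expressed in code, the snippet below configures a pool with HikariCP, a commonly used connection pool library; the library choice, JDBC URL, credentials and pool sizes are illustrative assumptions, not recommendations from this article.

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class PooledDataSourceFactory {

    public static HikariDataSource create() {
        HikariConfig config = new HikariConfig();
        // Connection details are placeholders for illustration only
        config.setJdbcUrl("jdbc:postgresql://db-host:5432/appdb");
        config.setUsername("app_user");
        config.setPassword("change_me");

        // Pool sizing: tune these values from empirical production data,
        // as discussed above; bigger is not automatically better.
        config.setMaximumPoolSize(20);
        config.setMinimumIdle(5);
        config.setConnectionTimeout(30_000); // ms to wait for a free connection

        return new HikariDataSource(config);
    }
}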

Concluding Remarks

Application performance is not just the responsibility of application developers. Application operations teams have an important role to play in making sure that their production applications are available and performing well. In this blog, we have explored seven different configurations that application operations teams can tune to enhance the performance of their Java web applications.

eG Enterprise is an Observability solution for Modern IT. Monitor digital workspaces, web applications, SaaS services, cloud and containers from a single pane of glass.