
Managing JDBC Connections


If application servers like WebSphere are the heart of e-business, then the database is surely the lifeblood. Whether your firm is managing thousands of airline flights a day or simply exposing a library's card catalog to the Web, you must deal with database connectivity.

Java developers have a simple way to manage database connectivity at their disposal: JDBC. It provides a convenient interface to access a variety of vendors' databases via JDBC drivers. Simply plug in a new driver and your application can talk to DB2, Oracle, Sybase, Informix, and others.

The JDBC version 2.0 extensions introduced a feature called connection pooling, providing better performance and scalability. IBM implemented a JDBC 2.0-compliant connection pool in WebSphere 3.5, and continues to enhance it in version 4.0.

Beyond pooling, the WebSphere connection manager offers a number of features that the savvy Web developer can use to improve an application's performance and scalability. This article highlights those features and shows how to use them effectively to build better Web applications. It also describes best practices and potential pitfalls to keep in mind when developing with WebSphere's connection manager.

While the following information can improve your applications, it is meant to aid application design, not replace it. No amount of "tweaking" can make up for a poor design, but a well-designed application with poorly implemented database connectivity logic can be improved. This article addresses the latter case.

The WebSphere connection manager has many facets. At the core, however, it's simply a connection pool. Object pools are nothing new.

The basic reason for a pool is that creation of certain objects is expensive in performance terms (time, memory allocation, etc.). Rather than creating these objects over and over again, it's more efficient to create a pool of already initialized objects that can be allocated to application components, and then returned to the pool when no longer needed. Thus, the hit to performance is only incurred once, when the object is actually created.
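
To make the idea concrete, here is a minimal, generic pool sketch. It is purely illustrative (not WebSphere's implementation, and the class name is hypothetical): objects are created up front, borrowed by callers, and returned when no longer needed.

import java.util.ArrayDeque;
import java.util.Collection;
import java.util.Deque;

// Illustrative object pool: pay the creation cost once, then reuse.
public class SimplePool<T> {
    private final Deque<T> idle = new ArrayDeque<T>();

    public SimplePool(Collection<T> preCreatedObjects) {
        idle.addAll(preCreatedObjects);   // objects are created ahead of time
    }

    public synchronized T borrow() {
        return idle.poll();               // null means the pool is exhausted
    }

    public synchronized void giveBack(T object) {
        idle.push(object);                // make it available to other callers
    }
}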

Anyone doing performance-tuning of JDBC applications knows that creating database connections is an expensive operation. Connection pooling is a natural fit for Web applications that utilize servlets or EJBs that require many simultaneous threads to connect to databases. Figure 1 provides a more visual example. In this diagram, three application components (a servlet, JSP, and session bean) have each received a Connection object from the pool (whose current size is six).

In addition to pooling connections, the WebSphere connection manager monitors the health of a database connection. If a connection has been allocated from the pool but hasn't been used in a specified amount of time (orphan timeout), the connection manager will mark the connection as "stale" and will return it to the pool (preempt the connection).

In the event of a connectivity failure (database error, network connection dropped, etc.), the connection manager will throw a StaleConnectionException, a subclass of SQLException, on the next JDBC operation. This allows application developers to handle failure conditions more flexibly, such as allowing the application to retry the database operation. Further, the connection manager removes all Connection objects from the pool.

WebSphere's connection manager supports a variety of databases, including DB2, Oracle, Sybase, Microsoft SQL Server, and Informix (only available in WebSphere 4.0). SQL Server and Oracle (for two-phase commit operations) require the use of the Merant SequeLink 5.1 JDBC Driver, which ships with WebSphere Application Server, Advanced Edition.

Configuration of the connection manager is simple and painless (see Figures 2 and 3). Each data source represents a connection pool to a specified database, and is linked to a single JDBC driver.

Accessing a data source programmatically requires a JNDI lookup such as:


Context ctx = new InitialContext();
DataSource ds = (DataSource) ctx.lookup("jdbc/MyDataSource");

The string, "jdbc/MyDataSource", represents the JNDI name of the datasource. In version 4.0, you may use a resource reference such as "java:comp/env/jdbc/MyDataSource", as per the J2EE specification. The DataSource object contains the methods getConnection() and getConnection(String username, String password) to allocate a Connection object from the pool.

Simply close the Connection object to return it to the pool. For example:


Connection con = ds.getConnection(username, password);
// ... perform JDBC work ...
con.close();   // returns the Connection to the pool

Now that you understand the basics of the WebSphere connection manager, let's take a look at some of the best practices.

Keep the Connection Life Span Minimized
Ideally, an application should never hold on to a Connection object beyond a single method. Nor should a single thread acquire more than one Connection. The longer one part of an application holds onto a Connection, the longer other parts cannot use that same Connection.
An application designed to acquire more than one Connection per thread and hold them for very long periods of time could easily come to a screeching halt when multiple threads collectively request more Connection objects than the maximum pool size.

The temptation here is to simply increase the size of the pool, but that will impact performance. Each Connection, whether it's in use or not, can cost over two megabytes of JVM memory. Furthermore, a higher pool size means creating more Connection objects which, as previously mentioned, is expensive. Efficient usage of fewer Connection objects is a far better design.

Additionally, when an orphaned Connection times out, the connection manager will reclaim the Connection for the pool, throwing a StaleConnectionException in the offending client on the next JDBC operation. While this is costly for the single thread, it does at least allow the Connection object to be freed. However, inside the bounds of a transaction (e.g., inside an EJB or when using UserTransaction objects), a Connection will never be orphaned. Thus, long-running transactions could keep Connection objects out of commission for a long time, and could lead to deadlock-like conditions.

While keeping the connection life span minimized, a developer should be careful not to compromise functionality. There are times when holding a ResultSet open for an extended operation is necessary to preserve transaction isolation (row locks). Closing a ResultSet (or the Connection or Statement that created it) too early might result in your application making decisions based on data that was in state X when another thread has since modified it to state Y.

From an application development standpoint, the best alternative is to use fewer connections in your pool, but to use them wisely. Once a method has finished its database utilization, it should free the Connection object by calling its close() method.
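
A minimal sketch of that pattern follows; the class, method, table, and column names are hypothetical. The Connection never escapes the method that acquired it, and the finally clause guarantees it goes back to the pool.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import javax.sql.DataSource;

public class AccountDao {
    // The Connection is acquired, used, and closed within a single method.
    public int deactivateAccount(DataSource ds, int accountId) throws SQLException {
        Connection con = ds.getConnection();
        try {
            PreparedStatement ps = con.prepareStatement(
                "update account set active = 0 where id = ?");
            ps.setInt(1, accountId);
            int rows = ps.executeUpdate();
            ps.close();
            return rows;
        } finally {
            con.close();   // returns the Connection to the pool immediately
        }
    }
}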

Cache the Datasource, Not the Connection
To take full advantage of the connection pool and minimize calls to expensive operations, an application developer should cache the datasource object whenever database connectivity will be used often. Caching the Connection instead has the negative effects described above and defeats the purpose of pooling.

Also, be aware that declaring Connection objects as static class variables can have unexpected and undesirable effects. For example:


public class MyJDBCClass {
    static Connection con;   // shared by every instance of MyJDBCClass
    // ...
}

When a Connection is declared static, it's shared by all instances of the declaring class. Therefore, it's possible that two threads might attempt to utilize the same Connection object simultaneously. The results could include unexpected table locking, incorrect ResultSets, and serious data integrity problems.

Also, keep in mind that the container normally creates a single instance of a servlet to serve all requests, so a servlet's instance variables are shared across threads much like statics, even if you don't declare them static. If you decide to cache a Connection, do not place it in a servlet field, static or otherwise.

Caching the datasource means creating a global instance variable in one of your classes. That class will maintain the reference to the datasource object throughout its life. When a member method needs a Connection, it creates one by calling the datasource's getConnection() method.

The advantage of caching the datasource is that you prevent multiple JNDI lookups, another expensive operation. WebSphere provides additional caching in this area, but caching it locally is advantageous.
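
A sketch of this approach, using an illustrative class name and the JNDI name from earlier: the JNDI lookup happens once, and each method draws a short-lived Connection from the cached datasource.

import java.sql.Connection;
import java.sql.SQLException;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.sql.DataSource;

// Cache the DataSource (one JNDI lookup), never the Connection.
public class OrderManager {
    private DataSource ds;   // instance variable holding the cached datasource

    public OrderManager() throws NamingException {
        InitialContext ctx = new InitialContext();
        ds = (DataSource) ctx.lookup("jdbc/MyDataSource");   // looked up once
    }

    public void recordOrder() throws SQLException {
        Connection con = ds.getConnection();   // cheap: allocated from the pool
        try {
            // ... perform JDBC work with con ...
        } finally {
            con.close();                       // return it to the pool
        }
    }
}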

Make Sure to Clean Up!
The WebSphere connection manager does a good job of caring for your connections, but nothing makes up for careless coding. Make sure to clean up the open database resource objects properly.

The rule of thumb when cleaning up JDBC objects is to close them in the reverse order they were created. As an example, most database read operations encompass a Connection, which creates or prepares a Statement (or PreparedStatement), which produces a ResultSet. In this case, close the ResultSet first, the Statement second, the Connection last. The order of the cleanup is important to prevent database-specific annoyances, like running out of cursors.

The best way to ensure proper cleanup is to place all cleanup logic in a finally clause of a try-catch-finally block. The code in a finally clause is guaranteed to run in all cases short of a JVM crash, so a Connection.close() in a finally clause is sure to return the Connection to the pool. Example:


try {
    con = ds.getConnection();
    ps = con.prepareStatement(
        "select * from mytable where mycolumn = ?");
    ps.setString(1, someString);
    rs = ps.executeQuery();
    // do something with the ResultSet
} catch (SQLException ex) {
    // handle the SQLException
} finally {
    // close in reverse order of creation; the null checks guard against
    // failures that occurred before a resource was created
    if (rs != null) try { rs.close(); } catch (SQLException ex) { rs = null; }
    if (ps != null) try { ps.close(); } catch (SQLException ex) { ps = null; }
    if (con != null) try { con.close(); } catch (SQLException ex) { con = null; }
}
One alternative to closing Connection objects in finally clauses is to close them in an object's finalize() method. This approach is not recommended: finalize() is only called prior to garbage collection, which happens asynchronously, and the only reason to postpone closing a Connection this way is that you're caching it in the object, which is itself a poor practice. Cleaning up JDBC objects correctly can prevent many unnecessary headaches.

When Good Connections Go Bad
Occasionally database connectivity breaks unexpectedly. Network cables get pulled, electricity goes out, or sometimes a database server just gets pegged. Although nothing can be done until the database is brought back online, these sorts of outages don't have to permanently bring down your application's functionality.

The WebSphere connection manager provides a convenient manner of handling temporary connectivity outages. Proper handling of the StaleConnectionException allows developers to create more fault-tolerant applications.

The StaleConnectionException is an exception that extends SQLException. This is necessary in order to follow the JDBC specification (most JDBC methods may only throw a SQLException). This is important to note because if an application is to catch a StaleConnectionException, it must be caught before catching the SQLException. The actual package name for the StaleConnectionException in version 3.5 is com.ibm.ejs.cm.portability; in 4.0 it's com.ibm.websphere.ce.cm (the 3.5 package name will work in 4.0 but is deprecated).

Just for clarification, when a Connection is marked "stale," that means that on its next operation it will throw a StaleConnectionException.
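
A minimal sketch of the required catch order is shown below with the 4.0 package name (substitute com.ibm.ejs.cm.portability.StaleConnectionException under 3.5); the class, method, and table names are hypothetical.

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import javax.sql.DataSource;

public class StaleAwareQuery {
    public void touch(DataSource ds) throws SQLException {
        Connection con = ds.getConnection();
        try {
            Statement stmt = con.createStatement();
            stmt.execute("select 1 from mytable");
            stmt.close();
        } catch (com.ibm.websphere.ce.cm.StaleConnectionException sce) {
            // the subclass must be caught first: the connection went stale,
            // so this is the place to schedule a retry with a new Connection
        } catch (SQLException sqle) {
            // any other, more severe database failure
        } finally {
            con.close();
        }
    }
}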

A Connection can "go bad" in any number of situations:

1.   The application attempts to acquire a Connection during a database failure or when the database has not yet been started.
2.   All of the Connection objects in the pool have gone bad due to a database failure. In this case, any new calls to getConnection() will return a stale Connection object, and all Connection objects in use would throw this exception when the application attempts to use them.
3.   The application using the Connection has previously called its close() method, but later tries to use the connection again.
4.   The application acquired a Connection but didn't use it within the orphan timeout period. In this case, the connection manager orphans (preempts) the Connection and the next time the application attempts to use it (or a JDBC resource that it created; e.g. Statement, ResultSet, etc.) a StaleConnectionException will be thrown.

Handling a StaleConnectionException is entirely up to the application designer. For more fault tolerance, one recommended approach is to catch the StaleConnectionException and retry the database operation up to a predefined number of tries. Here's an example:


boolean retry = false;
int retryAttempts = 0;
do {
    try {
        retry = false;
        con = ds.getConnection();
        stmt = con.createStatement();
        rowsUpdated = stmt.executeUpdate(
            "update myTable set myColumn = 'newValue'");   // value is illustrative
    } catch (com.ibm.ejs.cm.portability.StaleConnectionException sce) {
        // stale connection: retry unless the limit has been reached
        if (retryAttempts++ < MAX_RETRIES) retry = true;
    } catch (SQLException sqle) {
        // handle a more severe failure and exit the loop
    } finally {
        cleanupJDBCResources();   // close stmt and con
    }
} while (retry);

Normal execution of this code will run the logic in the loop only once; only in error conditions will the thread retry the operation. The connection manager could throw a StaleConnectionException on the getConnection(), createStatement(), or executeUpdate() calls. In that event, the thread checks whether it has exhausted the maximum number of retries (set prior to execution) and, if not, retries the entire operation, starting by obtaining a new Connection object.

Handling the StaleConnectionException in this manner will allow your Web application to recover from a number of temporary database outages that would normally return an error page to your Internet customer.

Wrapping Up
The practices outlined here are recommended as best practices and, in general, are good advice for building WebSphere applications. Nevertheless, none of these tips will make up for a poor design; they are meant to complement a well-designed application. They're also meant to provide insight into how the connection manager functions. This information should help in future development projects as well as in maintaining your current applications.

Managing JDBC connections can be tricky, but with a good application design and by following these recommendations, you should be well on your way to delivering high-performing and scalable enterprise solutions.

Resources

Cuomo, G., "A Methodology for Production Performance Tuning"
Lauzon, S., and Schommer, P., "Handling Database Failures"
Erickson, D., "Connection Manager Whitepaper"
WebSphere 3.5 Handbook (IBM Redbook)

These articles and whitepapers can be found online at www7b.boulder.ibm.com/wsdd and www.redbooks.ibm.com.

About the Author

Andy McCright is a software engineer in the IBM Software Group. His responsibilities include development of the WebSphere Application Server and customer engagement. Recently, he has been researching Web services and how they will play into the next generation of application servers.

