Apdex (Application Performance Index) is an open standard developed by an alliance of companies that defines a standardized method to report, benchmark, and track application performance.
Enterprises are swimming in IT performance numbers, yet have little insight into how well their applications perform from a business point of view. Raw response time values do not reveal whether users are productive, and a greater number of samples only adds to the confusion. Averaging the samples washes out significant detail about user frustration with slow responses, and time values are not comparable across different applications. There should be a better way to analyze and measure what matters.
Apdex is a numerical measure of user satisfaction with the performance of enterprise applications. It converts many measurements into one number on a uniform scale of 0-to-1 (0 = no users satisfied, 1 = all users satisfied). This metric can be applied to any source of end-user performance measurements. If you have a measurement tool that gathers timing data similar to what a motivated end-user could gather with a stopwatch, then you can use this metric. Apdex fills the gap between timing data and insight by specifying a uniform way to measure and report on the user experience.
The index translates many individual response times, measured at the user-task level, into a single number. A Task is an individual interaction with the system within a larger process. Task response time is defined as the elapsed time between when a user does something (a mouse click, pressing Enter or Return, etc.) and when the system (client, network, servers) responds such that the user can proceed with the process. This is the time during which the human is waiting for the system. These individual waiting periods are what define the “responsiveness” of the application to the user.
How it Works
Performance measurement and reporting tools that support Apdex will conform to a specification developed by the Alliance that will be publicly available. It specifies a process that Apdex-compliant tools and services will implement. A key attribute of the process is simplicity. What follows is a basic overview.
The index is based on three zones of application responsiveness:
- Satisfied: The user is fully productive. This represents the time value (T seconds) below which users are not impeded by application response time.
- Tolerating: The user notices the lag when response times exceed T, but continues the process.
- Frustrated: Response times greater than F seconds (where F = 4T) are unacceptable, and users may abandon the process.
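The three zones can be expressed as a small classification helper. The sketch below is illustrative Python, not part of the Apdex specification itself; the function name is our own, and the frustrated threshold F is taken as 4T, the value the standard derives from the target time:

```python
def classify(response_time: float, t: float) -> str:
    """Classify one response-time sample against the target time T (seconds).

    Assumes the frustrated threshold F is 4 * T, as the Apdex
    specification defines it.
    """
    f = 4 * t
    if response_time <= t:
        return "satisfied"
    if response_time <= f:
        return "tolerating"
    return "frustrated"
```

With T = 3 seconds, a 1-second response is "satisfied", a 6-second response is "tolerating", and a 15-second response falls past F = 12 seconds into "frustrated".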
The Apdex formula is the number of satisfied samples plus half of the tolerating samples plus none of the frustrated samples, divided by all the samples:

Apdex_T = (Satisfied Count + Tolerating Count / 2) / Total Samples
So it is easy to see how this ratio is always directly related to users’ perceptions of satisfactory application responsiveness. To convey the full meaning of the ratio, it is always presented as a decimal value with a subscript representing the target time T. For example, if there are 100 samples with a target time of 3 seconds, where 60 are below 3 seconds, 30 are between 3 and 12 seconds, and the remaining 10 are above 12 seconds, the Apdex is: Apdex_3 = (60 + 30/2) / 100 = 0.75.
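The formula and the worked example can be reproduced in a few lines of illustrative Python (the function and variable names are our own, and the frustrated threshold is again assumed to be 4T):

```python
def apdex(samples, t):
    """Apdex_T = (satisfied + tolerating / 2) / total samples."""
    f = 4 * t
    satisfied = sum(1 for s in samples if s <= t)
    tolerating = sum(1 for s in samples if t < s <= f)
    return (satisfied + tolerating / 2) / len(samples)

# The example from the text: T = 3 s, with 60 samples below 3 s,
# 30 samples between 3 s and 12 s, and 10 samples above 12 s.
samples = [1.0] * 60 + [6.0] * 30 + [15.0] * 10
print(apdex(samples, t=3))  # 0.75
```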
There are many ways to use Apdex. The most important aspect of Apdex is that it will enable companies to look at application performance in a new light that will lead them to new management methods. The following is an example.
One of the most challenging management issues is the need to align IT to the business strategy of the company. Apdex provides a significant foundation for a new business-IT alignment process. Imagine the following simple scenario:
Step 1 – Defining Requirements
A CIO is managing a portfolio of several major business applications from order processing to corporate email. The CIO gets consensus among the business managers on a ranking of the applications by importance to the business. Presumably, order processing will be high and email low. The CIO then investigates how each application is used in order to assign target response time values (T). These values may be negotiated with the managers of each application. Finally, the CIO sets a goal for the lowest acceptable Apdex value across all the applications. The following chart summarizes the three sets of performance requirements: business ranking, response time target, and minimum performance goal.
Step 2 – Benchmarking Performance
Next, the CIO ranks the applications by the Apdex value they deliver during the business day. Of course, it is likely that the rankings will be unaligned and some of the applications will be operating below the Apdex goal as shown below. For example, if email has a high Apdex while the CRM system that supports the customer support center has a significantly lower Apdex, then the applications are out of alignment. This benchmark depicts the current alignment status and serves as a guide to prioritize improvements and track their success.
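The benchmarking step can be sketched in illustrative Python. All numbers below (business ranks, measured Apdex values, and the corporate goal) are hypothetical, chosen only to show how a misalignment like the email/CRM case would surface:

```python
# Hypothetical portfolio: business rank (1 = most important to the
# business) and the Apdex each application delivers during the day.
portfolio = {
    "order processing": {"rank": 1, "apdex": 0.62},
    "crm":              {"rank": 2, "apdex": 0.58},
    "email":            {"rank": 3, "apdex": 0.91},
}
goal = 0.80  # lowest acceptable Apdex, set by the CIO

# Applications ordered by business importance vs. by delivered Apdex.
by_business = sorted(portfolio, key=lambda a: portfolio[a]["rank"])
by_apdex = sorted(portfolio, key=lambda a: portfolio[a]["apdex"],
                  reverse=True)

# Aligned only when the two orderings match; anything under the goal
# is flagged for improvement.
aligned = by_business == by_apdex
below_goal = [a for a in portfolio if portfolio[a]["apdex"] < goal]
```

Here email outranks the more business-critical CRM system on delivered Apdex, so `aligned` is false and both order processing and CRM land on the improvement list.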
Step 3 – Achieving Alignment
Once the needed changes are made to the application delivery systems, the business and Apdex rankings will match, indicating that applications and business needs are aligned. In addition, more of the applications will be achieving Apdex values above the corporate goal. Further work may be needed to improve the lowest-ranked applications.
Step 4 – Fully Meeting the Requirements
The applications and/or their delivery systems can be tuned over time so that all the applications meet the corporate Apdex goal, as shown below. Of course, a real business alignment exercise may be more complex, but using Apdex as a tool for discovery and remediation will be a central part of the strategy. We expect enterprises will use the index in various management approaches customized to their own needs.