Sessions Test
Typically, eG RUM treats user connections to a web site/web application from the same client IP address and via the same browser as a single session.
At any point in time, web service managers should be able to keep track of these sessions and accurately identify those that are currently live and those that are no longer active. They should also be able to rapidly isolate slow and error-prone sessions, so they can pinpoint which user sessions are adversely impacting overall user satisfaction with the target application, and why.
To obtain such useful UX-related metrics, web service managers can periodically run the Sessions test.
This test provides insights into user experiences at both the aggregate level (overview metrics) and the granular level (session-specific details). By capturing and analyzing these metrics, IT teams can proactively optimize application performance, troubleshoot individual issues, and enhance overall user satisfaction.
Target of the test : A web site/web application managed as a Real User Monitor
Agent deploying the test : A remote agent
Outputs of the test : One set of results for the web site/web application being monitored
| Parameter | Description |
|---|---|
| Test Period | How often should the test be executed. |
| Proxy Host | If the eG agent communicates with the RUM collector via a proxy server, specify the IP address/fully-qualified host name of the proxy server here. By default, this is set to none, indicating that the eG agent does not communicate with the collector via a proxy. |
| Proxy Port | If the eG agent communicates with the RUM collector via a proxy server, specify the port at which the proxy server listens for requests from the eG agent. By default, this is set to none, indicating that the eG agent does not communicate with the collector via a proxy. |
| Proxy Username and Proxy Password | If the eG agent communicates with the RUM collector via a proxy server, and if that proxy server requires authentication, then specify valid credentials against Proxy Username and Proxy Password. If no proxy server is used, or if the proxy server in use does not require authentication, then set Proxy Username and Proxy Password to none. |
| Confirm Password | Confirm the Proxy Password by retyping it here. Note: If you Reconfigure the test later to change the values of the Proxy Username, Proxy Password, and Confirm Password parameters, such changes will take effect only after the eG remote agent monitoring the Real User Monitor component is restarted. |
| Include Domain in Descriptor For Page Group | This parameter is applicable only to the Page Groups test. By default, the Page Groups test reports metrics for every page group that is configured for monitoring; accordingly, this flag is set to No by default. Some administrators, however, may want to further group the page groups based on the domain to which they belong. When users complain of sub-par experience with a target web site/application, this grouping helps administrators rapidly pinpoint not just the page groups, but also the domains that are responsible for the performance degradation. For this, set this flag to Yes. Once this is done, the Page Groups test will report metrics for each page group per domain - i.e., the domain name will be the first-level descriptor, and the page group will be the second-level descriptor of the test. |
| Max Top N Sessions in DD | By default, the detailed diagnosis of the Sessions test lists the top 100 browser sessions, in terms of session uptime. Accordingly, this parameter is set to 100 by default. This default cap on the session count conserves eG database space and prevents unwanted strain on the data repository, particularly if the target web site/application sees heavy user activity on a daily basis. If you have a well-sized, well-tuned eG database, you can increase the value of this parameter and have more session records (with high uptime) captured and stored in the eG database. |
| Session Max Inactive Duration | By default, any session that remains idle for 30 minutes continuously will be deemed an 'inactive session'. Accordingly, this parameter is set to 30 by default. If needed, you can increase or decrease this cut-off time for 'inactivity' by changing the value of this parameter. (A sketch of the session lifecycle rules governed by this parameter, Session Max Allowed Time, and Session Max Action Count appears after this table.) |
| Session Max Allowed Time | By default, a single session can stay active for a maximum of 480 minutes (i.e., 8 hours). Once a session violates this threshold, eG will automatically assign a fresh session ID to it and treat it as a 'new session'; the older session, on the other hand, will be counted as an 'inactive session'. Accordingly, the value of this parameter is set to 480 by default. If you want, you can override this default duration by changing the value of this parameter. |
| Session Apdex Cutoff | Use this parameter to configure when a session qualifies as a Poor user experience live session. By default, this parameter is set to 10. This implies that if a minimum of 10% of the page views in a live session record a poor Apdex score, then that entire session will be counted as a Poor user experience live session. For instance, say a live session has 10 page views, 2 of which register a low Apdex score. With the default setting of 10%, the cutoff works out to 1 page view (i.e., 10% of 10 page views). Since 2 page views have registered a low Apdex score, the default cutoff is violated, and this user session will be counted and reported as a Poor user experience live session. If you are more tolerant of sub-par user experience, increase the value of this parameter; if you are less tolerant of user dissatisfaction, decrease it. (An illustrative sketch of how this cutoff, Session Page Load Time Cutoff, and Session Errors Cutoff classify live sessions appears after this table.) |
| Session Page Load Time Cutoff | Use this parameter to configure when a session qualifies as a Slow live session. By default, this parameter is set to 10. This implies that if a minimum of 10% of the page views in a live session load slowly, then that entire session will be counted as a Slow live session. For instance, say a live session has 10 page views, 2 of which load slowly. With the default setting of 10%, the cutoff works out to 1 slow page view (i.e., 10% of 10 page views). Since 2 page views have loaded slowly, the default cutoff is violated, and this user session will be counted and reported as a Slow live session. If you are more tolerant of loading slowness, increase the value of this parameter; if you are less tolerant, decrease it. |
| Session Errors Cutoff | Use this parameter to configure when a session qualifies as an Erroneous live session. By default, this parameter is set to 10. This implies that if a minimum of 10% of the page views in a live session encounter errors, then that entire session will be counted as an Erroneous live session. For instance, say a live session has 10 page views, 2 of which run into JavaScript errors. With the default setting of 10%, the cutoff works out to 1 erroneous page view (i.e., 10% of 10 page views). Since 2 page views are error-prone, the default cutoff is violated, and this user session will be counted and reported as an Erroneous live session. If you are more tolerant of errors, increase the value of this parameter; if you are less tolerant, decrease it. |
| Session Max Action Count | By default, this parameter is set to 500. This implies that if a user performs 500 actions (e.g., page navigations, AJAX calls, etc.) or more in a session, then by default, eG RUM will assign a fresh session ID to that session and treat it as a 'new session'; the older session, on the other hand, will be counted as an 'inactive session'. You can override this default setting by increasing or decreasing the value of this parameter. |
| Session Replay | Session replay is a feature that captures and replays user interactions within an application, showing exactly how users experience it. It records events like clicks, scrolls, and page changes by tracking DOM updates rather than creating a video, thereby saving storage space. The collected data is then replayed in sequence to recreate each session. This helps teams diagnose issues, enhance UX, and support users more effectively. Privacy safeguards and efficient data handling ensure compliance and minimal impact on application performance. To enable the session replay feature, slide the Session Replay slider to the right. Then, click on Click here to Configure Session Replay to set up session replay. To know how to configure Session Replay, refer to Configuring Session Replay. |
| Download Code Snippet + egrum.js | Typically, egrum.js is bundled with the eG RUM collector - i.e., it resides on the same host where the RUM collector is installed. In some environments, for high availability reasons, administrators may host egrum.js at other locations as well - e.g., on the same application server that hosts the target web site/application, on the cloud front, etc. In such a case, if the test configuration is modified, the egrum.js hosted outside of the RUM collector will not be automatically updated with the changes. Under such circumstances, do the following: |
| DD Frequency | Refers to the frequency with which detailed diagnosis measures are to be generated for this test. The default is 1:1. This indicates that, by default, detailed measures will be generated every time this test runs, and also every time the test detects a problem. You can modify this frequency, if you so desire. Also, if you intend to disable the detailed diagnosis capability for this test, you can do so by specifying none against DD Frequency. |
| Detailed Diagnosis | To make diagnosis more efficient and accurate, eG Enterprise embeds an optional detailed diagnostic capability. With this capability, the eG agents can be configured to run detailed, more elaborate tests as and when specific problems are detected. To enable the detailed diagnosis capability of this test for a particular server, choose the On option. To disable the capability, click on the Off option. The option to selectively enable/disable the detailed diagnosis capability will be available only if the following conditions are fulfilled: |
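
The session lifecycle parameters above (Session Max Inactive Duration, Session Max Allowed Time, and Session Max Action Count) work together to decide when eG RUM stops treating user activity as part of the same session. The Python sketch below, referenced from the table, summarizes the documented rules only; the function name, its inputs, and the precedence of the checks are illustrative assumptions and do not represent eG RUM's internal implementation.

```python
def session_disposition(idle_minutes, active_minutes, action_count,
                        max_inactive=30, max_allowed=480, max_actions=500):
    """Illustrative summary of the documented session-lifecycle rules.
    The defaults mirror Session Max Inactive Duration (30 minutes),
    Session Max Allowed Time (480 minutes) and Session Max Action Count (500)."""
    if idle_minutes >= max_inactive:
        # Continuously idle beyond the cut-off: deemed an 'inactive session'.
        return "inactive session"
    if active_minutes > max_allowed or action_count >= max_actions:
        # The session outlived its allowed time or action budget: eG RUM assigns
        # a fresh session ID, and the older session is counted as inactive.
        return "rolled over to a new session"
    return "same live session"


print(session_disposition(idle_minutes=35, active_minutes=120, action_count=40))
# -> inactive session (idle for longer than 30 minutes)
print(session_disposition(idle_minutes=2, active_minutes=500, action_count=40))
# -> rolled over to a new session (active for longer than 480 minutes)
```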
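
The three cutoff parameters (Session Apdex Cutoff, Session Page Load Time Cutoff, and Session Errors Cutoff) all apply the same percentage-based rule to the page views of a live session. The following Python sketch illustrates that rule using the worked example from the table; the function and field names are illustrative assumptions and do not represent eG RUM's internal logic.

```python
def classify_live_session(page_views, apdex_cutoff=10, load_cutoff=10, errors_cutoff=10):
    """Illustrative classification of a live session. Each page view is assumed
    to be a dict with 'low_apdex', 'slow' and 'error' booleans; the cutoffs are
    percentages of the session's page views (defaults mirror the documented 10)."""
    total = len(page_views)
    if total == 0:
        return set()

    # Minimum number of affected page views before the session is flagged,
    # e.g. 10% of 10 page views works out to a cutoff of 1 page view.
    def cutoff_count(cutoff_percent):
        return total * cutoff_percent / 100.0

    flags = set()
    if sum(pv["low_apdex"] for pv in page_views) >= cutoff_count(apdex_cutoff):
        flags.add("Poor user experience live session")
    if sum(pv["slow"] for pv in page_views) >= cutoff_count(load_cutoff):
        flags.add("Slow live session")
    if sum(pv["error"] for pv in page_views) >= cutoff_count(errors_cutoff):
        flags.add("Erroneous live session")
    return flags


# Worked example from the table: 10 page views, 2 of which loaded slowly.
# 10% of 10 page views = 1, and 2 >= 1, so the session is flagged as slow.
views = [{"low_apdex": False, "slow": i < 2, "error": False} for i in range(10)]
print(classify_live_session(views))  # {'Slow live session'}
```

In this sketch, raising a cutoff above 20 would stop the example session (2 affected page views out of 10) from being flagged, which is the sense in which a higher cutoff makes the test more tolerant.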
Measurements made by the test
| Measurement | Description | Measurement Unit | Interpretation |
|---|---|---|---|
| Total live sessions | Indicates the total number of sessions that are currently live. | Number | This is a good indicator of the current session load on the target web site/web application. To know which users' sessions are currently live on the web site/web application, use the detailed diagnosis of this measure. |
| Average session duration | Indicates the average duration for which sessions were alive. | Mins | |
| Average session apdex score | Indicates the average Apdex score across all sessions. | Number | Apdex (Application Performance Index) is an open standard developed by an alliance of companies. It defines a standard method for reporting and comparing the performance of software applications in computing. Its purpose is to convert measurements into insights about user satisfaction, by specifying a uniform way to analyze and report on the degree to which measured performance meets user expectations. The Apdex method converts many measurements into one number on a uniform scale of 0-to-1 (0 = no users satisfied, 1 = all users satisfied). The resulting Apdex score is a numerical measure of user satisfaction with the performance of enterprise applications. This metric can be used to report on any source of end-user performance measurements for which a performance objective has been defined. The Apdex formula is: Apdex_t = (Satisfied Count + (Tolerating Count / 2)) / Total Samples. This is nothing but the number of satisfied samples plus half of the tolerating samples plus none of the frustrated samples, divided by all the samples. A score of 1.0 means all responses were satisfactory. A score of 0.0 means none of the responses were satisfactory. Tolerating responses half-satisfy a user; for example, if all responses are tolerating, then the Apdex score would be 0.50. Ideally, therefore, the value of this measure should be 1.0. A value less than 1.0 indicates that the user experience with the web site/web application has been less than satisfactory. (A worked example of the Apdex computation appears after this table.) |
| Healthy live sessions | Indicates the number of live sessions with a 'healthy' user experience - i.e., sessions that did not experience slowness or errors. | Number | |
| Slow live sessions | Indicates the number of live sessions that experience slowness when loading. | Number | Ideally, the value of this measure should be 0. A high value is a cause for concern, as it means that many sessions are experiencing sluggish page loads. |
| Erroneous live sessions | Indicates the number of live sessions that have encountered errors. | Number | Ideally, the value of this measure should be 0. A high value implies that many sessions are erroneous. |
| Poor user experience live sessions | Indicates the number of live sessions with sub-par user experience. | Number | Ideally, the value of this measure should be 0. A non-zero value implies that one/more users are experiencing slowness or errors when interacting with the target web site/web application. |
| Slow session percentage | Indicates the percentage of sessions that experienced slowness. | Percent | Ideally, the value of this measure should be 0. A value over 50% is a cause for concern, as it means that over half of the sessions are experiencing slowness. Use the detailed diagnosis of this measure to figure out which users' sessions are slow and isolate the root cause of the slowness - is it a slow frontend, a bad network, or a malfunctioning backend? |
| Error session percentage | Indicates the percentage of sessions that encountered JavaScript errors. | Percent | Ideally, the value of this measure should be 0. A value over 50% implies that JavaScript errors are common in many sessions. Use the detailed diagnosis of this measure to figure out which user sessions are erroneous. |
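
To make the Apdex formula described above concrete, here is a minimal worked example in Python; the function name and the sample counts are illustrative only.

```python
def apdex_score(satisfied, tolerating, frustrated):
    """Apdex_t = (satisfied + tolerating / 2) / total samples.
    Returns a value between 0.0 (no users satisfied) and 1.0 (all satisfied)."""
    total = satisfied + tolerating + frustrated
    if total == 0:
        return None  # no samples, so no score can be computed
    return (satisfied + tolerating / 2.0) / total


print(apdex_score(satisfied=7, tolerating=2, frustrated=1))   # (7 + 1) / 10 = 0.8
print(apdex_score(satisfied=0, tolerating=10, frustrated=0))  # all tolerating -> 0.5
```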