
PeopleSoft Application Scalability Analysis

By Citrix Consulting

Citrix Systems, Inc.


Notice

The information in this publication is subject to change without notice.

THIS PUBLICATION IS PROVIDED “AS IS” WITHOUT WARRANTIES OF ANY KIND, EXPRESS OR IMPLIED,

INCLUDING ANY WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR NON-

INFRINGEMENT. CITRIX SYSTEMS, INC. (“CITRIX”), SHALL NOT BE LIABLE FOR TECHNICAL OR EDITORIAL

ERRORS OR OMISSIONS CONTAINED HEREIN, NOR FOR DIRECT, INCIDENTAL, CONSEQUENTIAL OR ANY

OTHER DAMAGES RESULTING FROM THE FURNISHING, PERFORMANCE, OR USE OF THIS PUBLICATION, EVEN

IF CITRIX HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES IN ADVANCE.

This publication contains information protected by copyright. Except for internal distribution, no part of this publication may

be photocopied or reproduced in any form without prior written consent from Citrix.

The exclusive warranty for Citrix products, if any, is stated in the product documentation accompanying such products.

Citrix does not warrant products other than its own.

Product names mentioned herein may be trademarks and/or registered trademarks of their respective companies.

Copyright © 2003 Citrix Systems, Inc., 851 West Cypress Creek Road, Ft. Lauderdale, Florida 33309-2009 U.S.A. All rights reserved.

Version History

Daniel Pinzas November 2002 Version 1.0

Christopher Capute December 2002 Version 1.1

Jo Harder April 2003 Version 1.2


Table of Contents

INTRODUCTION
    Key Findings
    Document Overview

EXECUTIVE SUMMARY
    Objective
    Results Summary
    Approach
    Comments

TESTING METHODS
    Scripted Test
    Real User Test
    Real Users with Task List
    Combination
    Scalability Test Methods Summary
    PeopleSoft Scalability Testing Method

LOADRUNNER AND CITRIX TESTING PROCESS
    Planning the Test
    Creating Vuser Scripts
    Creating the Scenario
    Running the Scenario
    Monitoring the Scenario
    Analyzing Test Results
    Baselining

PERFORMANCE METRICS THRESHOLDS
    User Experience

PEOPLESOFT APPLICATION TEST SCRIPTS
    Environmental Factors
    PeopleSoft User Script
        PeopleSoft Script Narrative

ENVIRONMENT SPECIFICATIONS
    MetaFrame Architecture Overview
        LoadRunner Controller
        PeopleSoft Server
    MetaFrame Hardware Specifications
    Supporting Hardware Specifications
    Software and Platform Specifications
    Citrix® MetaFrame XP™ Presentation Server Configuration
        ICA Connection Configuration
        Internet Explorer 6.0 Configuration

PEOPLESOFT SCALABILITY TEST CASES
    Case 1: PeopleSoft Baseline Test
    Case 2: PeopleSoft Supported Users

SCALABILITY RESULTS AND ANALYSIS
    Results
    Analysis

APPENDIX A: LOADRUNNER
    Vuser Script Generation
    Tips for Working with Citrix Vuser Scripts
        Recording Tips
        Replay Tips
        Debugging Tips

APPENDIX B: PERFORMANCE METRICS
    Percent Processor Time
    Processor Queue Length
    Available and Committed Memory
    Page Faults per Second
    Page Reads per Second
    Free System Page Table Entries
    Percent Usage Page File
    Percent Disk Time
    Current Disk Queue Length

APPENDIX C: TEST DATA


Introduction

PeopleSoft is a widely used Enterprise Resource Planning (ERP) system that can be deployed via Citrix MetaFrame Presentation Server. The purpose of this white paper is to detail the testing performed in the Citrix Consulting test lab to evaluate the application scalability of PeopleSoft Financials 8.4. Citrix sought to evaluate the application performance and functionality of the PeopleSoft Internet Architecture client delivered through Citrix MetaFrame XP ICA sessions.

The testing and evaluation of the PeopleSoft Internet Architecture was intended to accomplish the following:

• Test the application functionality of PeopleSoft Financials 8.4/Microsoft Internet Explorer 6.0 while hosted by Citrix® MetaFrame XP™ Presentation Server, Enterprise Edition

• Determine resource consumption factors that will impact the number of users that can be hosted in a MetaFrame environment on a single server

• Evaluate the integration with the following tools and technologies:
  o PeopleSoft Financials 8.4 with PeopleTools 8.41
  o Windows 2000 Server with Terminal Services and Citrix® MetaFrame XP™ Presentation Server, Enterprise Edition Feature Release 2
  o Mercury Interactive's LoadRunner 7.51
  o Citrix's ICA Client integration with Mercury Interactive's LoadRunner

Key Findings

Primary Scalability Factors. Memory and CPU utilization were the primary constraints observed throughout the test process, with CPU utilization representing the primary factor limiting scalability.

• CPU Utilization. Average CPU utilization remained favorable throughout the testing. However, there were CPU spikes at different tiers during some operations (such as the initial login into the PeopleSoft system), which were magnified as more users were added. It is these spikes of over 70% for extended periods that limited the number of concurrent users/sessions to 30-35 on the system.

• Memory. Memory consumption was found to average 30 MB per user throughout all test scenarios.

• User Limits. Maximum observed server load on the system was 40 users. For the purposes of this test, the more conservative figures of 30 to 35 users reflect tests with minimal transaction errors and good application response times. These are reported as the expected production load limit.

• Processor Speed. The system used contained two Pentium III 1.4GHz processors. Given ever-increasing processor speeds, it is strongly recommended that each IT organization undertake its own testing to confirm scalability in its environment, on the specific server configuration that will be used in production.

• Configuration. The Citrix® MetaFrame XP™ Presentation Servers were configured with the default settings unless otherwise indicated. All testing was performed through a published Internet Explorer application. When launched, the browser immediately defaulted to the home page, which was set as the login page for the PeopleSoft application.

• Subjective Testing. The primary focus of the testing was to evaluate objective factors such as memory and CPU utilization. To ensure functional validity in the testing, the test team performed logins and limited manual application execution to evaluate application response and usability. Manual connections were made to the MetaFrame server throughout the range of testing to ensure the quality of the user experience. The usability of the application was optimal at counts of up to 30 to 35 users.


• Test Procedures. The project team relied heavily on Mercury Interactive’s integration with the Citrix ICA client. This test approach ensures that no process is executed against the server that would artificially reduce the overall load figures. Additionally, since the script execution relied on mouse clicks, typing, and other user interface-based activities, the script represented an extremely close approximation for a production user executing the PeopleSoft application via Citrix® MetaFrame XP™ Presentation Server.
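The memory and CPU figures above can be combined into a rough server-sizing sketch. This is an illustrative calculation only: the 30 MB per session and the 30-35 user CPU limit come from the findings, while the OS/MetaFrame memory reserve is a hypothetical planning value, not from the test data.

```python
def users_by_memory(total_ram_mb, mb_per_user=30, os_reserve_mb=600):
    """Upper bound on sessions that fit in RAM, at ~30 MB per PeopleSoft
    session (the average observed in these tests). os_reserve_mb is a
    hypothetical allowance for the OS and MetaFrame services."""
    return max(0, (total_ram_mb - os_reserve_mb) // mb_per_user)

def supported_users(total_ram_mb, cpu_limited_users=35):
    """Supported load is the tighter of the memory and CPU constraints."""
    return min(users_by_memory(total_ram_mb), cpu_limited_users)

# On the 2 GB test server, memory alone would allow ~48 sessions, so the
# CPU spike behavior (30-35 users) is the binding constraint.
print(supported_users(2048))  # 35
```

This mirrors the finding that the server was CPU-bound before it was memory-bound at the tested configuration.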

Document Overview

This document is broken down into the following major sections:

• Scalability Testing Methods – Discusses the differences between the types of scalability testing methods such as scripted versus real user tests.

• LoadRunner and Citrix Testing Process – Discusses the testing methodology used to perform the scalability tests.

• Benchmarking – Describes the Performance Monitor benchmarks used to set thresholds of acceptable performance.

• Performance Thresholds – Identifies the common thresholds for the performance metrics.

• Custom PeopleSoft Scripts – Discusses the requirements, assumptions, and the process flows behind the custom scripts used in the scalability testing.

• Environment Specifications – Details the MetaFrame XP architecture and configuration, server hardware, software and platform specifications, and all other configurations required to set up the MetaFrame XP testing environment.

• Scalability Test Cases – Discusses the cases that were tested as part of the scalability testing efforts.

• Scalability Results and Analysis – Presents and analyzes the significance of the results obtained from the testing effort.

• Scalability Test Summary – Presents conclusions from the results and analysis data. Additionally, it provides a summary of the test results and procedures that can be reviewed independently of the rest of the document.

• Appendices – The appendices provide supplemental information for the content provided in the main body of the document.


Executive Summary

Objective

The purpose of this test process was to determine the overall server resource consumption on a MetaFrame XP server when running the PeopleSoft Financials 8.4 applications using Microsoft Internet Explorer version 6.0. System resources would be the primary factors in determining the total number of concurrent user sessions that could be supported in a production environment.

Results Summary

The recommended number of concurrent PeopleSoft user sessions is approximately 30-35 on a dual-processor 1.4GHz server with 2 GB RAM. This value was determined based on the extensive testing and resulting data captured during the tests detailed in this document. The tests were performed repeatedly to confirm initial results and ensure reliable data. Further, the project team applied both objective and subjective analysis methods to complete the evaluation. The objective analysis and its counters are detailed in Appendix B: Performance Metrics. The subjective analysis relied upon response time to mouse movements and keystrokes, as well as the time to paint the screen, and was made relative to a stand-alone workstation running local to the MetaFrame XP servers with no outside factors to impact performance. The subjective factors are explained in the User Experience section.

Because application usage patterns will be variable in each environment, it is strongly recommended that each MetaFrame XP environment perform a level of detailed scalability analysis to accurately determine the supported number of concurrent PeopleSoft user sessions. The figures represented here were determined based on the test procedures specified in this document.

Approach

The focus of this testing was on the launch and exercise of the PeopleSoft application screens; the specific workflow is detailed in the PeopleSoft User Script section. The scalability testing applied by Citrix Consulting was intentionally structured with two primary scenarios for evaluation:

• Baseline testing scenario. A Mercury Interactive LoadRunner scenario designed to evaluate the expected server resource consumption of a single user. To further ensure validity, a second user was run concurrently to provide correlating factors. This test was evaluated from system logon through logoff.

• Steady-state scenario. A Mercury Interactive LoadRunner scenario designed to place an increasing number of users on the system performing effectively the same functional tasks concurrently. This was accomplished by loading two additional users every 15 seconds.
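The steady-state ramp described above (two additional users every 15 seconds) can be sketched as a simple schedule generator. This is an illustration of the loading pattern, not the LoadRunner scenario configuration itself.

```python
def ramp_schedule(total_users, batch=2, interval_s=15):
    """Yield (seconds_from_start, active_users) pairs for a stepped
    ramp-up, mirroring the 'two additional users every 15 seconds'
    loading pattern used in the steady-state scenario."""
    t, active = 0, 0
    while active < total_users:
        active = min(active + batch, total_users)
        yield t, active
        t += interval_s

# First three steps of a 40-user ramp:
print(list(ramp_schedule(40))[:3])  # [(0, 2), (15, 4), (30, 6)]
```

A full 40-user ramp under this schedule completes in under five minutes, which is why the document later calls the resulting figures conservative relative to a production login pattern.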

Baseline figures are appropriate for early system and environment planning, as well as providing valid data that can be used for calculating budgets and initial Total Cost of Ownership (TCO) figures. The additional data gathered during the steady-state testing represents conservative figures, since it is expected that an actual production server would not be used in such a rapid, condensed fashion. To further ensure accuracy and relevancy in individual client environments, Citrix strongly recommends that all scalability testing follow a standard process:

• Evaluate the application and/or client components in a proof-of-concept environment (single session and/or single workstation).

• Create a baseline scripted test scenario to evaluate multiple concurrent sessions and the respective impact on system performance.

• Evaluate the planned or existing user environment and generate more complex and customized scripted test scenarios to better evaluate the application’s performance in a simulated production environment.

• Deploy a pilot to a small group of test users and monitor their testing and use of the application.

• Finally, when deploying the application, deploy into a small pre-production environment. Evaluate and monitor the performance in the pre-production environment, then migrate the application into full production. Utilize Citrix Load Balancing to ensure optimal application and server performance by limiting the concurrent sessions/applications on the MetaFrame XP servers.

Test Summary

• CPU. The primary system resource constraint was overall CPU utilization. The tests outlined were performed on servers equipped with two 1.4 GHz processors, so results will vary if more robust resources are used.

• Memory. Memory was a secondary constraint, due in large part to the significant number of users that were run concurrently on each system. At approximately 30 MB RAM per user, a two-processor system would benefit from a minimum of 2 GB of RAM.

• Users. The test scripts and processes used to determine when a server was ‘loaded’ were designed to provide conservative figures. Actual testing with a group of production users may indicate the ability to add additional users, depending on the actual mix of user activities.

• Reduced Think Time. Reducing the think time of the script increased the probability that the application and the script would become unsynchronized, resulting in errors. A more sophisticated script or ‘faster’ server may allow for the reduction in think time.
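As a companion to the CPU observations above, a sustained-spike check of the kind used to judge when a server was 'loaded' might look like the following sketch. The 70% threshold comes from the findings; the run-length cutoff is a hypothetical choice for what counts as an "extended period."

```python
def sustained_spikes(samples, threshold=70.0, min_len=3):
    """Return (start_index, length) for each run where CPU utilization
    stays above threshold for at least min_len consecutive samples.

    samples: CPU utilization readings at a fixed interval, e.g. the
    Performance Monitor '% Processor Time' counter sampled periodically.
    """
    runs, start = [], None
    for i, v in enumerate(samples):
        if v > threshold and start is None:
            start = i                      # spike begins
        elif v <= threshold and start is not None:
            if i - start >= min_len:       # spike lasted long enough
                runs.append((start, i - start))
            start = None
    if start is not None and len(samples) - start >= min_len:
        runs.append((start, len(samples) - start))
    return runs

print(sustained_spikes([50, 80, 85, 90, 60]))  # [(1, 3)]
```

Brief excursions above the threshold are ignored; only sustained runs, which the findings identify as the limiting factor, are reported.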

Comments

The number of users, the intensity of their keystrokes and mouse clicks, and the applications that they are using are the most significant factors affecting the performance of a MetaFrame XP server. This testing effort and all performance data gathered were based on a set of requirements and assumptions determined to accurately simulate 'standard' user activity for the PeopleSoft application.

It is important to note that the Total Concurrent Users (TCU) scalability testing was performed to determine server sizing, which is testing that measures the ability of a single MetaFrame XP server to support a given user load. Per Citrix Consulting methodology, Server Scalability testing is typically performed during the Proof of Concept or Pilot phases. Server Scalability is different from System Scalability testing, which measures the ability of the overall infrastructure (of which MetaFrame XP is one component) to support a given user load. The overall infrastructure includes components such as the network (LAN/WAN/Internet), MetaFrame XP servers, application servers, database servers, authentication servers, file servers, and print servers. Complete system scalability was beyond the scope of this document.

The tests showed that some server resources were not fully utilized even at the suggested maximum user loads. However, this does not mean these resources should be overlooked or further stressed in a production environment. Spare server resources help maintain availability on each server in order to provide redundancy within a farm. For example, if one or several servers were to go off-line, the remaining servers should be able to handle the additional connections. Furthermore, continuously running at or near capacity will increase the likelihood of a server crash.

It is also important to note that this scripted scalability analysis focused primarily on determining a conservative baseline figure for the number of PeopleSoft users that can be successfully supported on a MetaFrame XP server given defined hardware constraints. Again, the MetaFrame XP servers were not analyzed for system scalability; for that type of analysis, the entire picture, including network, application server, and printing performance, should be considered. The scripted scalability test used in this analysis is helpful in obtaining rough estimates when sizing the server. The data and analysis in this document should serve as a guide, and not definitive numbers, for further MetaFrame XP scalability testing for the PeopleSoft application. The results obtained from further testing may differ from this testing effort. Each environment is unique, and the number of concurrent user sessions may vary significantly from one environment to another. Also, each end user will utilize the PeopleSoft application differently, making it difficult for scripted scalability tests to accurately mimic all user activity. Therefore, the final, 'true' scalability of the environment, design, and configuration should be determined by conducting a full pilot with "live" users.


Testing Methods

In a scalability/performance test of Windows 2000 Server with Citrix® MetaFrame XP™ Presentation Server, decisions regarding the test methods were required in order to standardize valid testing of the environment.

There are four primary types of scalability testing that are appropriate to a MetaFrame XP environment:

1. Scripted Test: Automated execution of scripts that mimic user actions without any user intervention.

2. Real Users Test: Actual users execute their daily tasks without any specified order.

3. Real Users with Tasks List: Actual users execute a set of pre-defined tasks.

4. Combination: A combination of two or more of the aforementioned testing methods.

The following sections discuss each method in more detail and compare the advantages and disadvantages of each method.

Scripted Test

For this method, a standard set of scripts is written to control the actions of test users in a way that is similar to typical PeopleSoft users. These scripts are developed to simulate a desired set of predefined actions (workflows), which are based on the user's role and the applications used during a typical user session. Each workflow may contain sub-workflows that dictate the multiple paths users take to complete their daily tasks. These sub-workflows are the basis for the scripts that are generated. Script execution is initiated at set intervals to ensure that steps taken while working in an application are not repeated simultaneously for all virtual users during the test. These intervals ensure more accurate results since the application is able to respond in a more realistic manner.

For the test process detailed in this document, the functional flows for these scripts were developed by Citrix Consulting and are based on test flows created by PeopleSoft for application verification testing.
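The workflow/sub-workflow structure described above can be illustrated with a small data model. The role and task names below are hypothetical placeholders, not taken from the actual Citrix or PeopleSoft test flows.

```python
import random

# Each role maps to a workflow made of alternative sub-workflows (paths);
# a scripted virtual user picks one path per iteration, so different
# virtual users exercise different routes through the application.
WORKFLOWS = {
    "ap_clerk": [
        ["login", "enter_voucher", "save_voucher", "logout"],
        ["login", "search_vouchers", "review_voucher", "logout"],
    ],
}

def pick_path(role, rng=random):
    """Choose one sub-workflow for this iteration of a scripted user."""
    return rng.choice(WORKFLOWS[role])

print(pick_path("ap_clerk") in WORKFLOWS["ap_clerk"])  # True
```

In a real test, each chosen path would be played back through the load tool against the ICA session, with think time between steps.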

Real User Test

The second method for scalability testing is to have users log into the system and perform tasks similar to those of a typical workday. The results obtained from this method are geared toward real-life scenarios. The caveat to using this method is that more variables exist in the test, such as the number of users, their activities, and interruptions. This makes it more difficult to run the same exact test while increasing user load, making system configuration changes, or repeating the test.

When running this type of test, most client environments would benefit from monitoring their systems and capturing the performance counters and data in a database format. Resource Manager for MetaFrame XP is designed to accomplish this, and these figures can provide significant value and accuracy, provided that a large enough sample of data is captured.

Real Users with Task List

The next method for scalability testing is a combination of Scripted Tests and Real User Testing. Real User Testing with task lists includes having real users access the system while executing a written set of tasks in a random order. These tasks are analogous to the workflows defined in the Custom PeopleSoft Scripts section. Developing customer-specific tasks for scalability testing will best represent the different types of users that will access the system on a daily basis. Each user will access the system at a different speed, reflecting a realistic production environment. However, these users will be following a common set of tasks that will help standardize the scalability tests when they need to be re-run with additional users.


This type of test is resource intensive and can be difficult to coordinate. Most corporate environments cannot provide multiple resources for this type of application testing and evaluation.

Combination The final method for scalability testing is a combination of a custom script and real users accessing the test environment. For example, five client computers emulating six users each could be used in conjunction with several real users performing searches and more complex customer transactions. This allows the administrators to load the system to a specific level, and then evaluate the subjective and objective results of the users’ interaction with the MetaFrame XP servers.

Scalability Test Methods Summary Table 1 summarizes the advantages and disadvantages of each scalability test method described above.

Scripted Test
Advantages:
• No variables; completely controlled
• Identical tests can be repeated as many times as needed
• No user time required to run the test
• Tests can be re-run as the environment grows
Disadvantages:
• Takes significant time and tools to create test scripts
• No “user skill levels” incorporated into the test

Real Users Test
Advantages:
• Real-life test
• Allows for different user types and skill levels
Disadvantages:
• Impossible to have two identical tests
• Users’ time is needed to perform the test
• Need users from each ISV’s customer base

Real Users with Task List Test
Advantages:
• Can be as controlled as needed
• Test can be repeated with a high degree of similarity between runs
• Allows for different user types and skill levels
Disadvantages:
• Users’ time is needed to perform the test
• The project team must create a task list for users customized to their role, which can be very complex and time consuming

Combination
Advantages:
• Can emulate most user activities with custom scripts, while live users test actions that were not scripted and gauge acceptable latency
• Can load the system to a specified level, then allow the users to ‘overload’ the system and evaluate the results
Disadvantages:
• Multiple users’ time is needed to perform the test

Table 1 - Scalability Test Methods Summary

PeopleSoft Scalability Testing Method Based on the project requirements, Citrix Consulting decided to use the Scripted Test method. This ensured identical, controlled tests that could be repeated across multiple test runs. The scripts developed for testing were carefully formulated by Citrix Consulting to accurately simulate normal user load on the MetaFrame XP servers.

The primary testing tool used to develop the PeopleSoft scripts was Mercury Interactive’s LoadRunner with the Citrix ICA client integration. For more information regarding Mercury Interactive’s LoadRunner and the partnership between Citrix and Mercury, please refer to the Mercury web site: http://www.svca.mercuryinteractive.com/alliances/alliance_directory/index/citrix.html.


LoadRunner and Citrix Testing Process As with all testing strategies, a clearly defined testing process helps to ensure accurate and repeatable results. The following section is an overview of a six-step process for testing applications under load.

Planning the Test Successful testing requires development of a thorough test plan. A clearly defined test plan ensures that the LoadRunner scenarios developed would accomplish the load testing objectives. Load test planning involves:

• Analyzing the application to determine hardware and software components, the system configuration, and typical usage patterns.

• Defining testing objectives (e.g., maximum user load, application upgrade compatibilities, identifying bottlenecks).

• Planning the LoadRunner implementation. This involves defining the scope of performance measurements, defining virtual user (vUser) workflows, selecting vUsers, and choosing test hardware.

• Define the launch sequence of the test users.

• Monitor the server with one user to establish a baseline. This allows the project team to acquire a proper benchmark (see the Baselining section for additional information).

• After that information has been gathered in a log file, start adding groups of users by spawning test sessions from the LoadRunner Controller. LoadRunner scenarios can be configured to control the speed and timing of user script execution.

• Continue adding users until the test is complete. Throughout the duration of the scripted test, a user should manually log on to the system to measure the User Experience and validate the performance of the system.

• For the initial scalability and performance tests, performance graphs should be monitored. User load increases should be stopped when the system reaches critical thresholds or the scripts fail to respond. These thresholds are explained later in this document; see the Performance Metrics Thresholds section.
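The ramp-up procedure in the steps above can be sketched as a simple control loop. The callback names and the simulated CPU growth below are hypothetical placeholders; in practice the LoadRunner Controller spawns the sessions and Performance Monitor or Resource Manager supplies the counter values.

```python
def ramp_up(spawn_group, read_counters, thresholds, group_size=5, max_groups=20):
    """Add groups of virtual users until a monitored counter crosses its
    threshold, then stop and report the user count reached.

    `spawn_group` launches `group_size` scripted sessions; `read_counters`
    returns the current counter values. Both are stand-in callbacks.
    """
    users = 0
    for _ in range(max_groups):
        spawn_group(group_size)
        users += group_size
        counters = read_counters()
        if any(counters[name] >= limit for name, limit in thresholds.items()):
            break
    return users

# Simulated run: CPU climbs ~9% per group of 5 users; stop at 80%.
load = {"% Processor Time": 0.0}
def fake_spawn(n):
    load["% Processor Time"] += 9.0
def fake_read():
    return dict(load)

print(ramp_up(fake_spawn, fake_read, {"% Processor Time": 80.0}))
```

The same loop structure applies whichever counters from the thresholds table are chosen as stop conditions.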

After reviewing the results of the initial scalability and performance tests, configuration changes can be made to the MetaFrame XP server, such as adding more RAM or faster processors. After these changes, the test should be re-run in identical fashion and the results compared with the previous test to determine whether any performance gains were achieved. If the changes result in performance losses or no improvement, they should be removed.
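A before/after comparison of two identical runs could be automated along these lines. The metric names and values are illustrative assumptions; any counters captured during the tests could be compared the same way.

```python
def compare_runs(before, after, higher_is_better=("users_supported",)):
    """Compare summary metrics from two identical test runs and report
    whether each metric improved, regressed, or was unchanged.
    """
    report = {}
    for metric, old in before.items():
        new = after[metric]
        if new == old:
            report[metric] = "unchanged"
        elif (new > old) == (metric in higher_is_better):
            report[metric] = "improved"
        else:
            report[metric] = "regressed"
    return report

baseline_run = {"users_supported": 40, "avg_logon_seconds": 22.0}
after_ram_upgrade = {"users_supported": 52, "avg_logon_seconds": 18.5}
print(compare_runs(baseline_run, after_ram_upgrade))
```

Flagging regressions explicitly makes the "remove the change if it did not help" rule easy to apply.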

Creating vUser Scripts LoadRunner vUsers emulate actual production users interacting with the system under test conditions. vUser scripts contain the mouse clicks and keyboard entries that each virtual user will perform during a scenario execution. These scripts should emulate what real users typically do with the application in a production environment. This can best be accomplished by creating a detailed functional flow of user activities, breaking the flow down into manageable transactions, and then creating the actual test script.

Creating the Scenario A scenario describes the events that occur in a testing session. A scenario includes defining the client machines that vUsers will run on, the scripts that vUsers will execute, and a specified number of vUsers or vUser groups that run the scenario. Perhaps most importantly, the LoadRunner scenario can be configured to control the rate at which users are introduced into the actual test. This scheduling feature is extremely powerful, and when combined with the other LoadRunner Controller scenario features, provides the test team with an invaluable tool.

Running the Scenario User load is emulated by instructing multiple vUsers to perform tasks simultaneously. Before executing a scenario, configuration and scheduling are defined. These determine how all the load generators and vUsers behave when the scenario is run.

Monitoring the Scenario While the scenario is running, a monitoring tool such as Performance Monitor or Resource Manager for MetaFrame XP should be leveraged to monitor all components of the MetaFrame XP servers. Real-time effects of the user load can be observed using the Resource Manager components in the Management Console.

Analyzing Test Results Throughout execution of a scenario, the monitoring tool records the performance of the system under test at different load levels. This information is later organized into a readable format for archiving, analysis, and reporting.

Baselining To get a better result set from the scalability and performance tests for the MetaFrame XP environment, some performance logging should be completed before users start accessing and testing the server. Doing so helps determine the system resources needed to run the operating system and testing tools alone. This information allows a much better representation of what resources the PeopleSoft application itself requires.
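The baselining idea reduces to subtracting the idle-system baseline from measurements taken under load. The counter names and numbers below are illustrative, not measured values from this test.

```python
def application_cost(loaded, idle):
    """Subtract the idle-system baseline from measurements taken under
    load, isolating the resources consumed by the application itself
    rather than by the operating system and testing tools.
    """
    return {counter: loaded[counter] - idle[counter] for counter in loaded}

# Illustrative numbers: memory in MB, CPU in percent.
idle_baseline = {"memory_mb": 410.0, "cpu_percent": 2.0}
under_load = {"memory_mb": 1890.0, "cpu_percent": 61.0}
print(application_cost(under_load, idle_baseline))
# {'memory_mb': 1480.0, 'cpu_percent': 59.0}
```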


Performance Metrics Thresholds Below is a table of standard thresholds for monitoring system performance. These are general guidelines, used as a basis to help identify points at which performance bottlenecks could be developing.

Counter | General Problem Value | General Cause | Possible Other Causes
Percent Processor Time | >= 70–80% consistently | Processor bottleneck | Disk bottleneck, memory paging/faults, excessive context switches
Processor Queue Length | > 2 | Processor bottleneck | Disk bottleneck, ‘heavy application’, too many users/applications
Memory – Available Bytes | < 30% of RAM [4 GB (4 Proc), 2 GB (2 Proc)] | Memory bottleneck | Large number of users/applications, ‘heavy application’
Memory – Committed Bytes | 85% of page file | Memory bottleneck | -
Memory – Page Reads/sec | > 5 | Memory bottleneck | -
Memory – Free System PTEs | Varies* | Memory bottleneck | -
Page File – Percent Usage | 85% | Disk bottleneck | Memory bottleneck
Physical Disk – % Disk Time | 40% | Disk bottleneck | Memory bottleneck
Physical Disk – Disk Queue Length | >= 2 consistently | Disk bottleneck | Memory bottleneck
Network – Bytes Total/sec | Approaches maximum for the network | Network bottleneck | -
User Experience | ‘Acceptable’ | - | -

Table 3 – Performance Metrics Thresholds
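As a sketch, the thresholds above can be encoded as a table-driven check. Only a subset of the counters is shown, and the sampled values are illustrative.

```python
# A table-driven check against a subset of the thresholds above.
THRESHOLDS = {
    "% Processor Time": lambda v: v >= 70,       # sustained 70-80%+
    "Processor Queue Length": lambda v: v > 2,
    "Memory - Page Reads/sec": lambda v: v > 5,
    "Physical Disk - % Disk Time": lambda v: v >= 40,
}

def flag_bottlenecks(sample):
    """Return the counters in `sample` that exceed their threshold."""
    return [name for name, exceeded in THRESHOLDS.items()
            if name in sample and exceeded(sample[name])]

sample = {"% Processor Time": 83.0, "Processor Queue Length": 1,
          "Memory - Page Reads/sec": 12.0}
print(flag_bottlenecks(sample))
# ['% Processor Time', 'Memory - Page Reads/sec']
```

A check like this is what would stop the user ramp-up once the system reaches critical thresholds.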

User Experience The user experience is a key value used to establish thresholds for the scalability testing, as application responsiveness exposes the net effect of the load/stress on the systems. Unlike other data, the user experience is a subjective measure that cannot be found in Resource Manager or Performance Monitor. The user experience is determined by executing one or more manual sessions during the test to validate how an application user will perceive the performance. The category into which a User Experience measurement falls is determined by the time required to complete the logon process, the keystroke-to-screen update time, and the overall performance of the application.

Application Responsiveness

Excellent: Equivalent to or better than local PC performance.
Acceptable: Screen updates are fluid and there is minimal effect on the user’s workflow.
Poor: Screen updates are noticeably delayed and latency is increased; however, the user is still able to function.
Failure: The session becomes frozen or disconnected, and the user cannot continue his/her tasks.

Table 2 – User Experience Measurements
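Although the document treats this measure as subjective, a rough sketch of mapping one objective input (keystroke-to-screen latency) onto the categories above might look as follows. The millisecond cut-offs are invented for illustration and are not part of the test methodology.

```python
def classify_user_experience(keystroke_latency_ms, session_alive=True):
    """Map a measured keystroke-to-screen-update latency to one of the
    User Experience categories. The session-state check mirrors the
    Failure row; the millisecond cut-offs are illustrative assumptions.
    """
    if not session_alive:
        return "Failure"
    if keystroke_latency_ms < 100:
        return "Excellent"
    if keystroke_latency_ms < 300:
        return "Acceptable"
    return "Poor"

print(classify_user_experience(50))                      # Excellent
print(classify_user_experience(150))                     # Acceptable
print(classify_user_experience(0, session_alive=False))  # Failure
```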


PeopleSoft Application Test Scripts This section details the custom script developed for the PeopleSoft application scalability initiative. The script presented was created to simulate typical tasks to be performed by users. The following sections detail the environmental factors and the process flow of the script.

Environmental Factors The base environment was assumed to be the following:

• The PeopleSoft script will simulate standard users.
• All user actions are automated.
• The PeopleSoft simulation should include the normal tasks of searching for receivables, creating new employees, and creating new departments.
• The scripts do not simulate printing or file copying.
• The user accounts are configured correctly.
• The PeopleSoft and MetaFrame XP servers are optimally configured for the environment.

PeopleSoft User Script The objective of this script is to simulate the usage pattern of a typical PeopleSoft user. The script is broken down into three separate components: vUser_init, Actions, and vUser_end. The vUser_init and vUser_end components are executed only once during a virtual user session and handle the authentication and logoff portions of the script. The Actions component of the script can be configured through the LoadRunner Controller to run a predetermined number of times during the session. Each component of the script has been broken down further into transactions to modularize the script and help track its progress during the testing cycles. This script is depicted in Figure 1 – PeopleSoft User Workflow and discussed in the following narrative:
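The three-part execution model described above can be sketched as follows. The callables merely stand in for the recorded logon, action, and logoff components; the iteration count is what the Controller would configure.

```python
def run_virtual_user(init, action, end, iterations):
    """Execute one virtual-user session in the three-part structure
    described above: the init component once, the action component a
    configurable number of times, then the end component once.
    """
    log = []
    log.append(init())
    for _ in range(iterations):
        log.append(action())
    log.append(end())
    return log

trace = run_virtual_user(
    init=lambda: "logon",
    action=lambda: "workflow",
    end=lambda: "logoff",
    iterations=3,
)
print(trace)  # ['logon', 'workflow', 'workflow', 'workflow', 'logoff']
```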



Figure 1 – PeopleSoft User Workflow


PeopleSoft Script Narrative 1. The virtual user launches an ICA session using pre-configured local users ps01 through ps99.

2. The virtual user launches the PeopleSoft Application and authenticates using PeopleSoft user VP1.

3. The vUser_init component of the script is complete and initializes the Actions component of the script.

4. The virtual user searches for a Product Related Receivable by performing the following steps:

o Click on Set Up Financials/Supply Chain

o Click on Product Related

o Click on Receivables

o Click on Options

o Click on System Functions

o Click on the Search box

o Enter “DM-07”, and hit the Enter key

o After verifying the page and data appear, select the Home hyperlink to exit

5. The virtual user creates a new Asset

o Click on Asset Management

o Click on Owned Assets

o Click on Basic Add

o Click on Search

o Click on Add a New Value hyperlink

o Leave the asset ID as “NEXT” and press the Enter key

o Enter information on the following fields: Description, short description, Asset Status (“Work In Progress”), and Acquisition Code (“Purchased”)

o Click on Save

o Click on Home

6. The virtual user creates a new Customer Contract

o Click on Customer Contracts

o Click on Contracts Home

o Click on My Contracts

o Click on Create Contract

o Enter “1002” in the Sold To field and press Enter

o Enter a dollar amount (e.g., 10000) in the Negotiated field

o Click on Save


o Click on Home

7. The virtual user creates a new Account

o Click on Banking

o Click on Bank Accounts

o Click on My Account Groups

o Click on Create an Account Group

o Enter an AccountGroupId

o Enter information in the Description field

o Search for a Bank and assign it to the Bank Code field

o Search for an Account and assign it to the Account field

o Click on Save

o Click on My Account Groups

o Click on No Account Groups

o Click on Search (the newly added account will display)

o Click on Home

8. The virtual user creates a new Journal Entry

o Click on General Ledger

o Click on Journals

o Click on Journal Entry

o Click on Create Journal Entries

o Click on Add

o Enter information for the Long Description field

o Click on the “Lines” hyperlink

o Search for an Account under the Account field

o Select the “Petty Cash” Account

o Enter a dollar amount (e.g., 10000) in the Amount field

o Click on Save

o Click on OK

o Click on Home

9. The virtual user creates a new Employee

o Click on Set Up Financials/Supply Chain

o Click on Common Definitions


o Click on Employee Data

o Click on Create/Update Personal Data

o Click on Search

o Click on the Add a New Value hyperlink

o Enter an Employee Id for the EmplID field

o Enter information for the Last Name field

o Enter information for the First Name field

o Enter information for the Telephone Number field

o Enter information for the Address 1 field

o Enter “Fort Lauderdale” for the City field

o Enter “Florida” for the State field

o Enter “33300” for the Zip Code field

o Click on Save

o Click on Review Personal Data

o Enter the EmplID of the recently created employee (entered earlier in this step)

o Enter information for the Address2 field

o Click on Save

o Click on Home

10. User creates a new Department

o Click on Set Up Financials/Supply Chain

o Click on Common Definitions

o Click on Design Chartfields

o Click on Define Values

o Click on Chartfield Values

o Click on Department

o Click on the Add a New Value hyperlink

o Enter information for the Department field and click on Add

o Enter information for the Description field

o Enter information for the Short Description field

o Click on Save

o Click on Reports

o Click on Chartfield Reports


o Click on Department

o Click on Search

o Select “ACCOUNT_RPT”

o Click on Run

o Click on OK

o Click on Process Monitor

o Click on Details Hyperlink

o Click on Delete Request

o Click on OK

o Click on Go Back to Department

o Click on Home

11. The action component of the script completes and initializes the vUser_end component of the script.

12. The virtual user logs out of the system.

13. The vUser_end component is complete and the script is finished processing.


Environment Specifications This section describes the test environment and configuration of the PeopleSoft Internet Architecture scalability tests.

MetaFrame XP Architecture Overview This section provides an overview of the interaction between the components in the MetaFrame XP scalability testing environment. The components are identified in Figure 2 – MetaFrame XP Scalability Test Architecture Overview and described in the sections that follow.

Figure 2 – MetaFrame XP Scalability Test Architecture Overview

LoadRunner Controller Mercury LoadRunner enables testing and monitoring of MetaFrame XP-based systems to ensure high performance, scalability, and availability. LoadRunner works by emulating large numbers of ICA Client connections from one or more computers. Sessions are initiated from the LoadRunner Controller and workflow scripts are executed on client machines, simulating real-life scenarios and generating realistic network and resource loads. Since scripts controlling user sessions are executed on the client machines, no additional overhead is placed on the MetaFrame XP servers that might otherwise skew results.

The virtual users are created using a recording technology that captures the ICA traffic between the client and server into a high-level, easy-to-read, maintainable test script. These scripts can be easily modified to represent real users with their own sets of data and replay speeds. By licensing key Citrix technology, LoadRunner’s virtual users generate exactly the same traffic as the ICA client and create the realistic load of live users.

Citrix® MetaFrame XP™ Presentation Servers Citrix® MetaFrame XP™ Presentation Server is a server-based computing solution that enables robust, easily managed, and cost-effective delivery of applications to a great number of client devices. The ICA protocol supports nearly all types of hardware, operating platforms, network connections, and network protocols in general business use. Thus, organizations can deliver a common set of applications to different types of client devices and to users in separate locations with better performance than alternative technologies.


MetaFrame XP servers are organized at the highest level into server farms. A Citrix server farm is a group of Citrix servers managed as a single entity with Citrix’s Independent Management Architecture (IMA) components. Servers in such a farm share connectivity and a single IMA-based data store. MetaFrame XP servers use the data store to centralize configuration information for a server farm. The data store maintains information about the servers, applications, and Citrix administrators in the server farm.

Servers in a MetaFrame XP server farm are further organized into zones. A zone is a logical grouping of MetaFrame XP servers, intended to enhance the performance of the MetaFrame XP environment by allowing geographically related servers to be grouped together. Each zone within a server farm has one server designated as the Zone Data Collector. A Data Collector stores information about the servers and published applications in the farm, gathered from each server within its zone and from all other Zone Data Collectors. This information is in turn used to dynamically load balance users to the least-busy server when they connect to the MetaFrame XP server farm.
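The least-busy selection performed by the Data Collector can be illustrated with a toy picker. The server names other than CCSLABS37 and all load figures are hypothetical; the real load index is computed by the farm from many factors.

```python
def pick_least_busy(server_loads):
    """Return the server with the lowest reported load, mimicking how
    a Zone Data Collector directs an incoming connection to the
    least-busy server in the farm.
    """
    return min(server_loads, key=server_loads.get)

# Hypothetical farm: load expressed as a 0.0-1.0 index.
farm = {"CCSLABS37": 0.72, "CCSLABS38": 0.41, "CCSLABS39": 0.55}
print(pick_least_busy(farm))  # CCSLABS38
```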

PeopleSoft Server PeopleSoft is an Enterprise Resource Planning (ERP) system that enables businesses to help customers, suppliers, and employees to work together collaboratively. A PeopleSoft implementation is based on one or more customized modules that allow companies to focus on productivity and profitability. Most PeopleSoft implementations consist of several modules, such as HR and Financials.

For this test, the PeopleSoft system was accessed with a web client via Internet Explorer 6.0 hosted on the MetaFrame XP servers. Both the PeopleSoft application and the database resided on CCSLABS49.


MetaFrame XP Hardware Specifications The following table outlines the hardware specification for the MetaFrame XP servers that were tested during the MetaFrame XP scalability testing initiative.

Server CCSLABS37

Purpose Citrix® MetaFrame XP™ Presentation Server

Vendor Compaq

Server Model DL360 G2

Processor Type 1.4 GHz Pentium III

Number of Processors 2

RAM 2 GB

Partition Size c: 4 GB

Partition Size d: 12.9 GB

RAID Level 0

NIC Vendor Compaq

NIC Speed 100Mbps Full Duplex

Table 3 – MetaFrame XP Hardware Specifications

Supporting Hardware Specifications Server CCSLABS35 CCSLABS49 SQL01

Purpose LoadRunner Controller PeopleSoft Server SQL Database

Vendor Compaq Compaq Compaq

Server Model DL 360 G2 DL 360 G2 DL 360 G2

Processor Type 1.4 GHz Pentium III 1.4 GHz Pentium III 1.4 GHz Pentium III

Number of Processors 2 2 2

RAM 2 GB 2 GB 1.3 GB

Partition Size c: 16.9 GB 16.9 GB 16.9 GB

Partition Size d: N/A N/A N/A

NIC Vendor Compaq Compaq Compaq

NIC Speed 100Mbps Full Duplex 100Mbps Full Duplex 100Mbps Full Duplex

Table 4 – Support Hardware Specifications


Software and Platform Specifications The following table outlines the software configuration for each component used during the scalability testing effort.

Component Software

LoadRunner Controller • Windows 2000 Advanced Server with SP2 • LoadRunner 7.51 • Citrix ICA Client Integration with LoadRunner • Post-SP2 hotfixes:

Q147222 Q296185 Q300972 Q276471 Q298012 Q301625 Q285156 Q299553 Q302755 Q285851 Q299687 Q303984 Q292435 Q299796

Citrix® MetaFrame XP™ Presentation Server

• Windows 2000 Advanced Server with SP2 • Citrix® MetaFrame XP™ Presentation Server, Enterprise Edition Feature Release 2 • Internet Explorer 6.0.2600 • Office 2000 • Pre-SP3 hotfixes:

Q147222 Q299796 Q318138 Q252795 Q300845 Q318593 Q276471 Q300972 Q319733 Q285156 Q301625 Q320176 Q285851 Q302755 Q320206 Q292435 Q303984 Q321599 Q295688 Q311967 Q323172 Q296185 Q313450 Q326830 Q298012 Q313582 Q326886 Q299553 Q313829 SP2 SRP1 Q299687 Q314147

PeopleSoft Application/Web Server

• Windows 2000 Server with SP2 • PeopleSoft Financials 8.4 • PeopleSoft Edition of BEA Tuxedo version 6.5/BEA Jolt 1.2 • PeopleSoft Internet Architecture 8.18

• BEA WebLogic version 5.1.0 and Server 6.1 (SP1) • SQL Server 2000 • Internet Information Services (IIS) 5.0 • PeopleTools 8.41

• Post-SP2 hotfixes: Q299956 Q311967 Q313450 Q313582 Q313829 Q314147 Q318138 Q319733 Q320176 Q320206 Q321599 Q311401

Table 5 - Environment Software and Platform Specifications


Citrix® MetaFrame XP™ Presentation Server Configuration This section details the steps required to stage the MetaFrame XP servers in preparation for the scalability test. Because of the limited size needed for this particular test environment, a single MetaFrame XP server farm with a single zone was created. The Citrix XML service was configured to use port 8080.

ICA Connection Configuration In order to ensure consistent test results, the following ICA connection settings were configured and validated on the MetaFrame XP servers.

Advanced Settings

• Timeout Setting for Connection = Inherit User Configuration

• Timeout Setting for Disconnection = Inherit User Configuration

• Timeout Setting for Idle = Inherit User Configuration

• Required Encryption = Basic

• AutoLogon = Inherit User Configuration

• Prompt for Password = No

• Initial Program = Inherit Client/User Config

• Only Run Published Apps = Not Selected

• User Profile Overrides = Not Selected

• On a broken or timed out connection = Reset

• Reconnect sessions disconnected = Inherit User Configuration

• Shadowing = Inherit User Configuration

ICA Client Settings

• Client Audio Quality = Medium

Client Settings

• Connect Client Drives at Logon = Not selected / Disabled

• Connect Client Printers at Logon = Not selected / Disabled

• Default to Main Client Printer = Selected / Disabled

• Disable Client Drive Mapping = Selected

• Disable Windows Client Printer Mapping = Selected

• Disable Client LPT Port Mapping = Selected

• Disable Client COM Port Mapping = Selected

• Disable Client Clipboard Mapping = Not Selected

• Disable Client Audio Mapping = Selected
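Validating that each server matches the required connection settings above could be automated along these lines. Reading the actual settings from a server is out of scope here, so a hand-built dictionary stands in, and only a subset of the settings is shown.

```python
# Required values from the list above (a subset, as a dictionary).
REQUIRED = {
    "Required Encryption": "Basic",
    "Disable Client Drive Mapping": "Selected",
    "Disable Client Audio Mapping": "Selected",
    "Disable Client Clipboard Mapping": "Not Selected",
}

def validate_settings(actual):
    """Return the settings that differ from the required test values,
    mapped to an (expected, actual) pair for reporting.
    """
    return {name: (expected, actual.get(name))
            for name, expected in REQUIRED.items()
            if actual.get(name) != expected}

# One deliberately wrong value to show the report format.
actual_settings = dict(REQUIRED, **{"Disable Client Audio Mapping": "Not Selected"})
print(validate_settings(actual_settings))
# {'Disable Client Audio Mapping': ('Selected', 'Not Selected')}
```

Running such a check before each test run guards against configuration drift between otherwise identical tests.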


Internet Explorer 6.0 Configuration

Internet Explorer 6.0 is the web browser used to access the PeopleSoft application. The following configuration was used:

ActiveX

• Download Signed ActiveX controls = Enabled

• Download unsigned ActiveX controls = Enabled

• Initialize and Script ActiveX controls not marked as safe = Enabled

• Run ActiveX controls and plug-ins = Enabled

• Script ActiveX controls marked safe for scripting = Enabled

Downloads

• File download = Enabled

• Font download = Enabled

Microsoft JVM

• Java Permissions = Low Safety

Miscellaneous

• Access Data Sources across domains = Enabled

• Allow META REFRESH = Enabled

• Display Mixed Content = Enabled

• Don’t prompt for client certificate selection when no certificates or only one certificate exists = Enabled

• Drag and drop or copy and paste files = Enabled

• Installation of desktop items = Enabled

• Launching programs and files in an IFRAME = Enabled

• Navigate sub-frames across different domains = Enabled

• Software channel permissions = Low safety

• Submit nonencrypted form data = Enabled

• User data persistence = Enabled

Scripting

• Active scripting = Enabled

• Allow paste operations via script = Enabled

• Scripting of Java applets = Enabled

User Authentication

• Logon = Automatic logon with current user name and password

Note: The Internet Connection Wizard must be run for each test user. In environments where the test team has access to the default user profile and can establish mandatory profiles, this step is not necessary.


PeopleSoft Scalability Test Cases

The scalability tests were designed to answer a series of questions defined by Citrix Consulting to reflect common questions customers have regarding PeopleSoft running on MetaFrame XP servers. The main question the data helps to answer is:

• How many standard PeopleSoft users can be supported on a MetaFrame XP server in steady state1 on a two-processor machine?

The following sections discuss the test cases that were defined and executed to answer the aforementioned question.

Case 1: PeopleSoft Baseline Test

During the initial phase of the scalability testing initiative, a baseline test was completed to establish resource consumption for a single user executing the PeopleSoft script against the MetaFrame XP published application (Microsoft Internet Explorer 6.0 accessing PeopleSoft). An additional user was added to the test to validate the initial baseline figures.

Case 2: PeopleSoft Supported Users

The objective of this test case is to determine the number of users that can be supported on a MetaFrame XP server, given a predetermined hardware configuration, during steady state. The test case was designed to isolate system performance to iterations of just the PeopleSoft script and to load the system so that it simulates a production server during a standard business day. All users are logged into MetaFrame XP, and the script actions begin after all users have initialized their sessions. Throughout the test iterations, an additional connection was made to the PeopleSoft published application, and the actions were performed manually to observe response times. While there is no tangible metric, these response times and overall usability were noted to determine the user experience.

1. Steady state refers to the state in which all users are actively accessing the application and are no longer logging onto the system.


Scalability Results and Analysis

The following section details the results obtained from the PeopleSoft scalability testing performed on Compaq DL360 G2 dual 1.4 GHz processor machines. The scalability testing was performed to determine the maximum number of users that the server could support while the users were performing typical activities as defined by the automated script. The following section analyzes the server resource data and attempts to determine the bottlenecks that limited the number of users a two-processor server can support.

Results

The following table contains the values for each counter, averaged over the steady state of each session interval. Each counter was measured every five seconds during the execution of the tests. The table separates the data into different user session intervals. A detailed summary of the tests can be found in Appendix C: Test Data.

| # of Sessions | Mem: Available GB | Mem: Page Faults/sec | Mem: Pages/sec | Pagefile: % Usage | Disk: % Disk Time | Disk: Queue Length | Proc: Interrupts/sec | Proc: % Proc Time | Proc0: % Proc Time | Proc1: % Proc Time |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 (baseline) | 1.77 | 62.58 | 2.12 | 0.26 | 2.34 | 0.04 | 134.93 | 2.57 | 2.54 | 2.60 |
| 11 | 1.62 | 256.73 | 10.43 | 0.81 | 12.09 | 0.02 | 290.81 | 13.09 | 13.33 | 12.86 |
| 21 | 1.36 | 346.07 | 5.24 | 1.79 | 21.19 | 0.08 | 416.81 | 25.77 | 26.10 | 25.45 |
| 31 | 1.33 | 509.35 | 14.10 | 2.29 | 31.56 | 0.10 | 445.72 | 26.72 | 26.70 | 26.75 |
| 41 | 1.02 | 769.46 | 5.52 | 2.20 | 48.50 | 0.12 | 703.31 | 50.28 | 49.81 | 50.75 |

Table 9 - Maximum Concurrent Load Test Results

The following charts graphically depict the results obtained during testing; the plotted values are those listed in Table 9:

[Chart: Avg Mem: Page Faults/sec by # of Sessions (1, 11, 21, 31, 41)]

[Chart: Avg Mem: Available Gb by # of Sessions]

[Chart: Avg Pagefile: % Usage by # of Sessions]

[Chart: Avg Disk: % Disk Time and Avg Disk: Queue Length by # of Sessions]

[Chart: Avg Proc: Interrupts/sec by # of Sessions]

[Chart: Avg Proc / Proc0 / Proc1: % Proc Time by # of Sessions]

Analysis

Citrix Consulting recommends that the results from the 41-user runs and beyond be considered past the maximum supportable load, as CPU utilization consistently remained over 70% for extended periods. The user experience was rated “Excellent” through 35 user sessions and “Acceptable” up to 40 concurrent user sessions. Upon exceeding 35 user sessions, however, extended periods of CPU utilization over 70% occurred. Based on this, a range of 30-35 users is recommended, since extended periods of Total Processor Time over 70% can lead to system instability.

The user session interval between 35 and 40 users should be the focal point of future scalability testing, as it was between these intervals that the larger processor spikes were observed. The extended periods of Total Processor Time exceeding 70% account for the near doubling of average utilization from 30 to 40 users: 27.21% to 51.91% with the addition of 10 users.
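As a cross-check on the figures above, a simple linear projection of per-session CPU cost can be made from two of the averaged Proc: % Proc Time measurements in Table 9. This sketch is illustrative only: the function name is an assumption, CPU load is not strictly linear, and averaged utilization understates the sustained spikes that drove the 30-35 user recommendation.

```python
# Rough linear projection of the session count at a CPU threshold,
# using two averaged data points from Table 9:
# 31 sessions -> 26.72% CPU, 41 sessions -> 50.28% CPU.

def users_at_threshold(users_a, cpu_a, users_b, cpu_b, threshold=70.0):
    """Interpolate/extrapolate the session count at a given CPU threshold."""
    cpu_per_user = (cpu_b - cpu_a) / (users_b - users_a)  # % CPU per session
    return users_a + (threshold - cpu_a) / cpu_per_user

print(round(users_at_threshold(31, 26.72, 41, 50.28)))
```

Note that this naive average-based projection comes out considerably more optimistic than the observed 30-35 user ceiling, which illustrates why sustained spike behavior, not the average, governed the recommendation.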


Appendix A: LoadRunner

vUser Script Generation

Citrix vUser scripts emulate the Citrix ICA protocol communication between a Citrix client and server. LoadRunner’s VuGen enables easy creation of automated testing scripts by recording all activity during manual interactions with a MetaFrame XP application and creating a corresponding vUser script. The functions within this vUser script emulate the analog movements of the mouse and keyboard in an ICA session. In addition, these functions allow synchronization during the replay of the scripts used in a load test by waiting for bitmap changes, comparing bitmaps, or waiting for specific windows to open.

To provide an interface to the ICA Client APIs, the Citrix ICA Client integration must be installed on the LoadRunner Controller. To install the Citrix ICA Client integration:

1. Extract Citrix_Headless_Client.zip to the LoadRunner Controller.
2. Run setup.exe to install the Citrix ICA Client Integration with LoadRunner.
3. Run “%Program Files%\Citrix\ICA Client\wfica32.exe /setup” to register the necessary objects in the Citrix ICA Client Integration.

Additionally, LoadRunner 7.51 must be updated to include functions that enable emulation of the ICA protocol:

1. Unzip LR_For_Citrix_ICA.zip to the c:\mercury interactive\LoadRunner directory (assuming this is where LoadRunner is installed). Overwrite any files if prompted.
2. Execute c:\mercury interactive\LoadRunner\bin\citrix.bat. This registers all the necessary DLLs with the system.

One of the Citrix support files located in the LoadRunner\bin directory is called CitrixClientImpl.dll. If this file has a time stamp earlier than 09/02/2002, it must be replaced with the version below to support wildcard names for dialog boxes.

The high level steps for recording a vUser script using VuGen are as follows:

• Record the actions using VuGen. Invoke VuGen and create a new vUser script, specifying Citrix as the type. Choose the application to record and set the recording options. Record typical operations in the Citrix session.

• Enhance the vUser script. Enhance the vUser script by inserting transactions, rendezvous points, and control-flow structures into the script.

• Define parameters. Define any parameters for the fixed-values recorded into the vUser script. By substituting fixed-values with parameters, the same business process can be repeated many times using different values.

• Configure the run-time settings. The run-time settings control the vUser behavior during script execution. These settings include the pacing, logging, think time, and connection information.

• Run the vUser script from VuGen. Save and run the vUser script from VuGen to verify that it runs correctly.


Tips for Working with Citrix vUser Scripts

Recording Tips

• When recording a session, make sure to perform the complete business process, starting with the connection and ending with the cleanup. End your session at a point from where you could start the entire process from the beginning.

• Configure the Citrix server to completely close a session. Open the Citrix Connection Configuration dialog box: choose Start > Programs > Citrix > MetaFrame XP > Citrix Connection Configuration. Double-click on the ica-tcp connection name. The Edit Connection dialog box appears. Click the Advanced button. In the bottom section of the dialog box, clear the “inherit user config” check box adjacent to the “On a broken or timed-out connection” list box. Change the entry for this list box to reset.

• Record the connection process into the vUser_init section, and the closing process in the vUser_end section. This will prevent you from performing iterations on the connection process.

• Display settings of 1024 x 768 are recommended on the recording machine. This will allow the Citrix window, which is 800 x 600, to be displayed properly.

• When opening expanded menu options, click explicitly on each option—do not depend on the expanding menu. For example, when navigating Start > Programs > Microsoft Word, be sure to click on the word Programs.

• If window or dialog box names are not consistent, edit the script to use wildcards (*). For example, use ctrx_set_window("Spelling and Grammar:*").

• If you minimize the ICA Client during a recording or playback and subsequently stop the recording or playback, the ICA Client will remember its previous state when you start the next recording or playback. As a result, the ICA Client will appear in the taskbar, but no GUI for the connection will be displayed unless you right-click and select Maximize. Avoid minimizing the ICA Client unnecessarily during recordings and playbacks.
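The wildcard tip above uses shell-style pattern matching: a trailing * matches whatever follows the fixed prefix of the window title. As an illustration of the matching behavior only (using Python's fnmatch module, not the LoadRunner API):

```python
# Shell-style wildcard matching of window titles, analogous to the
# wildcard syntax accepted by ctrx_set_window. Titles are illustrative.
from fnmatch import fnmatch

pattern = "Spelling and Grammar:*"
print(fnmatch("Spelling and Grammar: English (U.S.)", pattern))  # matches
print(fnmatch("Save As", pattern))                               # does not
```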

Replay Tips

• To prevent overloading by multiple vUsers while connecting, set an initialization quota or use a ramp up schedule from the Controller’s scheduler.

• For best results, do not disable think time in the run-time settings. Think time is especially relevant after ctrx_set_window and before ctrx_type functions, which require time to stabilize.

Debugging Tips

• You can view additional replay information in the Extended log, which is enabled from the Log tab of the Run-Time Settings (F4 shortcut key). You can view this information in the Execution Log tab or in the output.txt file in the script’s directory. When running a LoadRunner scenario, each vUser generates its own execution log, located in C:\Program Files\mercury interactive\LoadRunner\scripts\ScriptName\res\log\scriptname_vUserID.log.

• When an error occurs, VuGen saves a snapshot of the screen to the script’s output directory. You can view the bitmap to try to determine why the error occurred.


Appendix B: Performance Metrics

The following section provides summary information about the metrics, both objective and subjective, used to quantify performance during the scalability testing. Additionally, a standard table of performance monitoring thresholds for these metrics is provided in the Performance Metrics Thresholds section.

Percent Processor Time

Percent Processor Time is the percentage of time the processor is busy handling non-idle tasks. When observing processor utilization, quick and sudden spikes are not a huge concern. Simple tasks, such as logging in, will cause the processor utilization to spike. Administrators should look for an upward trend in the total utilization percentage. On an idle MetaFrame XP server, typical processor utilization will hover around 0-10%. As more users connect to the server and begin to work, this percentage should slowly creep upwards on the performance monitor scale. Once the ‘Percent Processor Time’ reaches a sustained value between 70 and 80% or more, users might begin to notice performance degradation in the system.
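Sustained utilization is what matters here, not isolated spikes. As a rough sketch of how an administrator might distinguish the two from 5-second samples (the function name, threshold, and window size are illustrative assumptions, not part of the test methodology):

```python
# Flag sustained high CPU while ignoring momentary spikes, given
# % Processor Time samples taken every 5 seconds (as in this test).

def sustained_over(samples, threshold=70.0, min_consecutive=12):
    """True if samples stay above `threshold` for `min_consecutive`
    consecutive readings (12 x 5s = one full minute)."""
    run = 0
    for value in samples:
        run = run + 1 if value > threshold else 0
        if run >= min_consecutive:
            return True
    return False

spiky = [20] * 50 + [95] + [22] * 50          # one login spike: acceptable
loaded = [65] * 10 + [78] * 15 + [72] * 10    # sustained load: flag it
print(sustained_over(spiky), sustained_over(loaded))
```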

Processor Queue Length

Processor Queue Length is the number of threads waiting to be processed by the processor. Processor queue lengths should be minimal, typically less than two. If the queue length is consistently above two, the processor is generally considered congested.

Available and Committed Memory

Available memory indicates how much RAM is available for system processes, whereas committed memory is how much of the paging file has been reserved for future memory paging in case it is needed. By doing a proper benchmarking test of the system, one will get a better representation of how much memory, available and committed, is being used just by the operating system. By then monitoring the system with one user, and then subsequent numbers of users, an administrator should be better able to extrapolate how much memory a new user’s session will utilize. By having a good estimate of this, one would be better equipped to identify how much memory a MetaFrame XP server will require to serve a certain number of concurrent users.
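The extrapolation described above can be sketched with two readings from the test data in Appendix C (Avg Mem: Available Kb at 1 and 41 sessions). This is an illustrative linear estimate only, and the function name is an assumption; actual per-session memory use varies with workload.

```python
# Estimate per-session memory cost from available-memory readings at
# two different user counts, as described in the paragraph above.
# Input figures are the Avg Mem: Available Kb values from Appendix C.

def kb_per_session(avail_kb_a, users_a, avail_kb_b, users_b):
    """Approximate Kb consumed by each additional user session."""
    return (avail_kb_a - avail_kb_b) / (users_b - users_a)

per_session = kb_per_session(1773581.87, 1, 1020398.63, 41)
print(round(per_session))   # approximate Kb per additional session
```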

Page Faults per Second

Page Faults/sec is the overall rate at which faulted pages are handled by the processor, measured in pages faulted per second. A page fault occurs when a process requires code or data that is not in its working set (its space in physical memory). This counter includes both hard faults (those that require disk access) and soft faults (where the faulted page is found elsewhere in physical memory). Most processors can handle large numbers of soft faults without consequence; hard faults, however, can cause significant delays. This counter displays the difference between the values observed in the last two samples, divided by the duration of the sample interval.
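As a worked illustration of how such a rate counter is derived from raw samples (the function name and figures are hypothetical, not Performance Monitor API calls):

```python
# A rate counter is the difference between the last two observed totals
# divided by the sample interval (5-second samples in this test).

def page_faults_per_sec(prev_total, curr_total, interval_sec=5.0):
    """Rate over one sampling interval."""
    return (curr_total - prev_total) / interval_sec

# 1250 faults accumulated over a 5-second interval:
print(page_faults_per_sec(10_000, 11_250))
```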

Page Reads per Second

Page Reads/sec is the number of times the disk was read to resolve hard page faults. (Hard page faults occur when a process requires code or data that is not in its working set or elsewhere in physical memory, and must be retrieved from disk). This counter was designed as a primary indicator of the kinds of faults that cause system-wide delays. It includes reads to satisfy faults in the file system cache (usually requested by applications) and in non-cached mapped memory files. This counter determines the number of read operations, without regard to the numbers of pages retrieved by each operation.


Free System Page Table Entries

Free System Page Table Entries is the number of page table entries not being used by the system. This counter displays the last observed value only; it is not an average. Windows 2000 can directly address up to 2^32 bytes or 4 gigabytes (GB) of memory address space, regardless of how much physical Random Access Memory (RAM) is installed. By default, 2 GB of this address space is allocated to each process, and 2 GB is allocated to the kernel. While separate 2 GB regions of address space are used for each process in the computer, most of the 2 GB kernel area is global and remains the same no matter which user-mode process is currently active.

The 2 GB of kernel area contains all system data structures and information. Therefore, the 2 GB kernel address space area can impose a limit on the number of system data structures and the amount of kernel information that can be stored on a computer, regardless of how much RAM is installed.

The two types of data that share a portion of this 2 GB address area are Paged Pool allocations, and kernel stack allocations. Paged Pool allocations are memory allocations made by kernel-mode components. Kernel stack allocations are stacks created in the kernel for each thread to use for making system calls. Paged pool allocations are made in the Paged Pool area, and kernel stack allocations are made in the System Page Table Entry (PTE) area.

While these different allocations share the same area, the partition between them is fixed at startup. If the operating system runs out of space in one of those areas, the other area cannot donate space to it, and programs may begin to encounter unexpected errors. Therefore, when you encounter a Windows 2000-based computer that is experiencing unexpected errors or an inability to accept new logins, and the computer does not have some other resource limitation such as Central Processing Unit (CPU) or disk bottlenecks, it is highly likely that the Paged Pool or System PTE areas are dwindling. Because, by default, the System PTE area is sized to be as large as possible on a computer with Terminal Services enabled, the limitation will usually be due to insufficient Paged Pool address space. Fortunately on some computers, the System PTE area can be configured to be smaller, which can alleviate these symptoms and allow more users access to the computer.

The number of Free System PTEs varies depending on the memory configuration for the server. Typically, PTEs only cause bottlenecks on servers with a high number of processors (i.e., 4 or 8 processors).

Percent Usage Page File

By monitoring paging file percent usage, a systems administrator should be capable of establishing certain baselines for system usage. The goal for a MetaFrame XP server is to under-utilize the paging file and over-utilize the memory, because using memory will have a significant speed improvement over a paging file. Hence, the paging file usage percentage should never reach 100%. If it does creep up to a level close to this, chances are that the system is having a severe memory shortage. As the percent usage of the paging file approaches 100%, the responsiveness of the MetaFrame XP server will be severely hindered by the slower speeds of the disks that house the paging file.

With the large amounts of RAM available on a MetaFrame XP server, if scaled correctly, the paging file should rarely approach 100% usage. If it does, the available memory is probably approaching zero. A possible solution to this problem is adding more RAM.

Percent Disk Time

Percent Disk Time is the amount of time that the disk subsystem is busy fulfilling requests to read or write data to or from the disks. This value is the sum of the percent disk time for all the disk drives in the system, and can therefore exceed 100%. For example, if there are 3 disks in the system and the utilization of each disk is 60%, 50% and 0%, the counter reports 110%; the actual average disk time is 110/3, or roughly 37%.
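The 3-disk example above can be expressed directly (the function name is an illustrative assumption):

```python
# The % Disk Time counter sums across all drives, so divide by the
# drive count to recover the real average utilization.

def normalized_disk_time(per_disk_percentages):
    return sum(per_disk_percentages) / len(per_disk_percentages)

# The three-disk example from the text: 60%, 50%, 0% -> counter sums to 110%
print(round(normalized_disk_time([60, 50, 0])))
```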


Current Disk Queue Length

Current Disk Queue Length is the current number of outstanding disk read and write requests. Typically, one should not see any queue lengths when examining the disk subsystem, though it is likely the queue length will occasionally hit one or two. However, when a sustained value greater than two is consistently observed, the disk subsystem is quite possibly the bottleneck.


Appendix C: Test Data

This section details the summary of the session count data captured during testing. Data was rounded to two decimal places. All line items based on five or fewer samples were discarded.

| # of Sessions | CountOf Sessions | Avg Mem: Available Kb | Avg Mem: Committed Kb | Avg Mem: Free Sys Table Entries | Avg Mem: Page Faults/sec | Avg Mem: Page Reads/sec | Avg Mem: Pages/sec | Avg Pagefile: % Usage |
|---|---|---|---|---|---|---|---|---|
| 1 | 47.00 | 1773581.87 | 161313378.04 | 153135.30 | 62.58 | 0.49 | 2.12 | 0.26 |
| 2 | 12.00 | 1726462.67 | 210894506.67 | 151372.58 | 981.95 | 0.60 | 10.02 | 0.63 |
| 5 | 6.00 | 1716681.33 | 246335488.00 | 151242.33 | 1274.73 | 8.42 | 35.11 | 0.43 |
| 7 | 6.00 | 1686102.00 | 291256320.00 | 150079.00 | 1300.54 | 9.01 | 38.56 | 0.56 |
| 9 | 61.00 | 1644065.18 | 331028849.31 | 149585.00 | 261.56 | 2.59 | 8.27 | 0.88 |
| 11 | 222.00 | 1621275.82 | 387257076.47 | 148134.91 | 256.73 | 5.43 | 10.43 | 0.81 |
| 13 | 9.00 | 1576486.22 | 430566058.67 | 147109.44 | 633.05 | 12.06 | 24.54 | 1.14 |
| 21 | 404.00 | 1359519.85 | 694573877.23 | 141827.59 | 346.07 | 4.04 | 5.24 | 1.79 |
| 22 | 22.00 | 1430083.82 | 628547.96 | 141146.73 | 728.36 | 0.20 | 4.87 | 1.52 |
| 31 | 100.00 | 1329620.76 | 819625533.44 | 135639.00 | 509.35 | 9.21 | 14.10 | 2.29 |
| 32 | 36.00 | 1324166.44 | 834406968.89 | 135679.44 | 469.62 | 9.56 | 14.08 | 2.32 |
| 35 | 39.00 | 1203513.74 | 965249.81 | 131906.64 | 593.17 | 2.32 | 7.74 | 2.48 |
| 36 | 48.00 | 1163981.92 | 1008586.15 | 132126.23 | 786.31 | 4.40 | 10.77 | 2.56 |
| 37 | 71.00 | 1150163.44 | 1025734.34 | 131747.75 | 929.58 | 3.67 | 13.49 | 2.60 |
| 38 | 160.00 | 1104398.15 | 1078706.43 | 131729.10 | 552.83 | 2.12 | 4.95 | 2.73 |
| 39 | 7.00 | 997381.14 | 1202595547.43 | 130539.43 | 1529.96 | 2.49 | 18.46 | 2.13 |
| 40 | 322.00 | 1016798.24 | 1193196143.30 | 129600.06 | 628.35 | 0.61 | 2.65 | 2.21 |
| 41 | 99.00 | 1020398.63 | 1194155618.26 | 129594.83 | 769.46 | 1.33 | 5.52 | 2.20 |
| 42 | 69.00 | 1018687.54 | 1197607.89 | 129365.70 | 540.50 | 2.60 | 7.23 | 2.97 |
| 43 | 44.00 | 1027353.45 | 1192873.15 | 129027.80 | 653.31 | 0.96 | 6.90 | 2.97 |

| # of Sessions | Avg Disk: % Disk Time | Avg Disk: Queue Length | Avg Proc: % Proc Time | Avg Proc: Interrupts/sec | Avg Proc0: % Proc Time | Avg Proc1: % Proc Time |
|---|---|---|---|---|---|---|
| 1 | 2.34 | 0.04 | 2.57 | 134.93 | 2.54 | 2.60 |
| 2 | 65.44 | 0.08 | 7.37 | 167.49 | 7.02 | 7.72 |
| 5 | 33.26 | 0.50 | 7.78 | 222.51 | 7.39 | 8.17 |
| 7 | 29.20 | 0.67 | 11.72 | 258.33 | 12.09 | 11.36 |
| 9 | 8.85 | 0.20 | 5.51 | 197.36 | 5.82 | 5.20 |
| 11 | 12.09 | 0.02 | 13.09 | 290.81 | 13.33 | 12.86 |
| 13 | 31.00 | 3.44 | 16.55 | 341.64 | 16.66 | 16.45 |
| 21 | 21.19 | 0.08 | 25.77 | 416.81 | 26.10 | 25.45 |
| 22 | 19.08 | 0.05 | 39.30 | 487.22 | 38.76 | 39.85 |
| 31 | 31.56 | 0.10 | 26.72 | 445.72 | 26.70 | 26.75 |
| 32 | 30.56 | 0.03 | 27.62 | 482.83 | 27.86 | 27.38 |
| 35 | 30.93 | 0.26 | 38.89 | 573.89 | 38.80 | 38.98 |
| 36 | 38.95 | 0.00 | 37.78 | 534.13 | 37.27 | 38.28 |
| 37 | 39.20 | 0.49 | 38.10 | 556.90 | 37.63 | 38.56 |
| 38 | 21.96 | 0.00 | 30.18 | 499.25 | 29.64 | 30.72 |
| 39 | 59.24 | 0.00 | 47.21 | 673.80 | 45.76 | 48.66 |
| 40 | 39.87 | 0.06 | 46.97 | 712.46 | 46.52 | 47.42 |
| 41 | 48.50 | 0.12 | 50.28 | 703.31 | 49.81 | 50.75 |
| 42 | 27.07 | 0.01 | 30.43 | 468.57 | 29.88 | 30.98 |
| 43 | 26.82 | 0.00 | 32.02 | 464.01 | 31.62 | 32.42 |


851 West Cypress Creek Road

Fort Lauderdale, FL 33309 954-267-3000 http://www.citrix.com

Copyright © 2003 Citrix Systems, Inc. All rights reserved. Citrix, WinFrame and ICA are registered trademarks, and MultiWin and MetaFrame are trademarks of Citrix Systems, Inc. All other products and services are trademarks or service marks of their respective companies. Technical specifications and availability are subject to change without prior notice.
