
Performance Testing PeopleSoft Campus Solutions v9.0: A Case Study - Notes

Performance Testing PeopleSoft Campus Solutions v9.0: A Case Study, EDUCAUSE Enterprise Information and Technology Conference, Chicago, May 2009
Presenters: Jody Reeme, Associate Director, Student Enterprise Systems, and Jeffrey Wilson, Business Systems Analyst, Northwestern University

Presentation slides and handout available at


Reeme and Wilson began with an overview of the campus, their Oracle/PeopleSoft implementations, and their hardware, and then defined three kinds of testing (a rough ramp-up sketch follows the list):

  • Performance Testing - transactional speed under specific workload
  • Load testing - create incremental demand and measure
  • Stress testing - system performance under controlled amounts of stress
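
As a rough illustration only (not from the presentation), the difference between load and stress testing can be pictured as a script that ramps up the number of concurrent simulated users in steps and records response times at each step; a load test stops at a realistic peak, while a stress test keeps raising the step until the system degrades. A minimal Python sketch, with a placeholder URL and arbitrary step sizes:

    import time
    import threading
    import urllib.request

    TARGET_URL = "https://example.edu/psp/csprod/signon.html"  # placeholder, not the real URL

    def one_request(timings):
        """One simulated user hitting the target page, recording its response time."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=30) as resp:
                resp.read()
            timings.append(time.monotonic() - start)
        except OSError:
            timings.append(None)  # record failures

    def run_step(concurrent_users):
        """Fire one request per simulated user in parallel and summarize the step."""
        timings = []
        threads = [threading.Thread(target=one_request, args=(timings,))
                   for _ in range(concurrent_users)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        ok = [t for t in timings if t is not None]
        if ok:
            print(f"{concurrent_users:4d} users: {len(ok)} ok, avg {sum(ok) / len(ok):.2f}s")
        else:
            print(f"{concurrent_users:4d} users: all requests failed")

    # Load test: stop at a realistic peak. Stress test: keep raising the step
    # until response times degrade or errors appear.
    for users in (10, 25, 50, 100, 200):  # arbitrary example steps
        run_step(users)
        time.sleep(5)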

They were doing performance testing for "peace of mind" in two areas:

  • Upgrade to Campus Solutions v9 - an aggressive schedule to minimize disruption for users: planning started 11/1/06, with the upgrade going live 8/07, along with new load-balancing hardware and other infrastructure changes
  • Freshman registration - they wanted it to go smoothly. At Northwestern, students write bad things on the "rock", and the IT group doesn't want to be listed on it. About 2,000 freshmen register two days before the start of the fall quarter (each takes about 4 classes), and the load on the system increases throughout the day as students get closed out of classes and search for alternatives. Everyone has a significant interest in things going smoothly, since the initial implementation and the last major upgrade did not go smoothly.

Planning Process included

  • Determine scope
  • Develop timeline
  • Get user input to determine which processes should be tested.
  • Document processes to be scripted in advance
  • Gather hardware specifications for proposed production environment
  • Create system diagrams
  • Allocate staff
  • Engage consultants

Human resources needed included

  • Functional subject matter experts
  • Business analysts
  • Developers
  • Scripting experts
  • Project managers
  • DB administrators
  • Sys administrators

They did not have a test system set up at the beginning; they used the Oracle upgrade lab, and it was an iterative process. Comp srvs testing moved into a new production environment.

Test case requirements gathering

  • Meet with functional users to identify key business processes to be tested
  • Prioritize processes for testing - automated vs. manual
  • Collect statistics on the processes identified (a rough log-analysis sketch follows the list)
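
As one hedged example of collecting statistics (our illustration; the presenters don't describe their exact method), web server access logs can be mined to see which self-service pages get the most traffic, which helps decide which processes are worth scripting. The log filename, format, and the idea that the last URL segment roughly identifies the PeopleSoft component are assumptions:

    from collections import Counter
    from urllib.parse import urlparse

    hits = Counter()
    with open("access.log") as log:      # hypothetical combined-format access log
        for line in log:
            parts = line.split('"')
            if len(parts) < 2:
                continue
            request = parts[1]           # e.g. 'GET /psc/csprod/.../SOME_COMPONENT.GBL HTTP/1.1'
            fields = request.split()
            if len(fields) < 2:
                continue
            path = urlparse(fields[1]).path
            component = path.rsplit("/", 1)[-1]  # last path segment as a rough component name
            hits[component] += 1

    # Print the ten busiest components as candidates for automated scripts.
    for component, count in hits.most_common(10):
        print(f"{count:8d}  {component}")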

Processes tested
Test case scripting was then developed: they created an Excel spreadsheet of steps, expected results, etc. (templates are in the PPT). The scripts were built in LoadRunner, and each was developed separately.

Scripts they created included the items below; note that each script worked differently (a rough sketch of one scripted flow follows the list):

  • Login
  • Class search - simple vs detailed
  • Enrollment - with search vs without
  • Drop classes
  • Degree what-if
  • Enrollment verification
  • Unofficial transcript
  • Student account lookup
  • Random self-serve navigation
  • Logout
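
The actual scripts were LoadRunner Vuser scripts; as a rough stand-in only, the sketch below approximates the shape of one scripted flow (login, class search, enroll, logout) in Python using the third-party requests library, with think times between steps and per-step timing in the spirit of LoadRunner transactions. All URLs, form fields, and credentials are hypothetical placeholders, not the real PeopleSoft component URLs:

    import time
    import random
    import requests

    BASE = "https://example.edu/psp/csprod"  # placeholder portal URL

    def think(low=3, high=10):
        """Pause like a real user reading the page (LoadRunner 'think time')."""
        time.sleep(random.uniform(low, high))

    def timed(name, func, *args, **kwargs):
        """Run one step and report its response time, like a LoadRunner transaction."""
        start = time.monotonic()
        resp = func(*args, **kwargs)
        print(f"{name}: {resp.status_code} in {time.monotonic() - start:.2f}s")
        return resp

    def virtual_user(user_id, password):
        session = requests.Session()  # keeps cookies across the whole flow
        timed("login", session.post, f"{BASE}/signon",
              data={"userid": user_id, "pwd": password})
        think()
        timed("class_search", session.get, f"{BASE}/class_search",
              params={"subject": "CHEM"})
        think()
        timed("enroll", session.post, f"{BASE}/enroll", data={"class_nbr": "12345"})
        think()
        timed("logout", session.get, f"{BASE}/logout")

    if __name__ == "__main__":
        virtual_user("TESTSTUDENT01", "test-password")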

They then addressed two further areas (a parameterization sketch follows):

  • Data - selection and manipulation, and parameterization of the scripts
  • Security - login IDs for new freshmen, current undergraduates, and staff
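
A minimal sketch of what parameterization means here (our illustration; LoadRunner does this with parameter files): rather than hard-coding one student into every script, test credentials and class numbers are pulled from a data file and a different row is handed to each virtual user, so the same data set can be reused run after run. The CSV name and columns are assumptions:

    import csv
    import itertools

    def load_rows(path="test_students.csv"):
        """Each row holds userid, password, class_nbr for one test account (assumed layout)."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def assign_data(num_virtual_users, rows):
        """Cycle through the data so every virtual user gets a row, even if users outnumber rows."""
        pool = itertools.cycle(rows)
        return [next(pool) for _ in range(num_virtual_users)]

    rows = load_rows()
    for i, row in enumerate(assign_data(50, rows)):
        print(f"vuser {i:02d} -> {row['userid']} enrolling in class {row['class_nbr']}")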

The scripting exposed some bad data in the system, which could then be dealt with appropriately, and the scripts allowed them to use the same data over and over again. They also had some manual scripts to make sure all staff members were security-cleared.

Issues in the process included:

  • Ever-changing application and prioritization of scripts
  • Browsers behave differently, so they needed two sets of scripts: one for IE and one for Firefox

Preparing the Testing Environment
They had a production testing environment where they refreshed data; the server setup was 8 web servers and 8 application servers.
They also tested pieces of the student administration application: data manipulation, resetting and testing users' passwords, the LDAP connection, and data conversion (a hedged sketch of that kind of LDAP check follows). They basically did a manual walkthrough of the test cases and then some short automated tests.
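
As a hedged sketch of that kind of pre-test check (the presenters' tooling isn't stated; the ldap3 library, host, DN pattern, and data file here are our assumptions), one can verify that every test account's reset password actually binds against LDAP before the load runs start:

    import csv
    from ldap3 import Server, Connection

    LDAP_HOST = "ldap.example.edu"                          # placeholder host
    DN_TEMPLATE = "uid={user},ou=people,dc=example,dc=edu"  # placeholder DN pattern

    server = Server(LDAP_HOST, use_ssl=True)
    with open("test_students.csv", newline="") as f:        # same hypothetical data file as above
        for row in csv.DictReader(f):
            conn = Connection(server,
                              user=DN_TEMPLATE.format(user=row["userid"]),
                              password=row["password"])
            ok = conn.bind()                                # True only if the reset password works
            print(f"{row['userid']}: {'ok' if ok else 'FAILED'}")
            if ok:
                conn.unbind()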

During the tests they set up a "war room" and started small, ramping up with real-time monitoring. It was very time-consuming.
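
For the real-time monitoring half of that, a minimal sketch (our illustration, with a placeholder URL and interval) is a small poller that hits a lightweight page on a timer during the run and logs status and response time to a file the war room can watch:

    import csv
    import time
    import urllib.request
    from datetime import datetime

    HEALTH_URL = "https://example.edu/psp/csprod/signon.html"  # placeholder page to poll
    INTERVAL_SECONDS = 15

    with open("monitor.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["timestamp", "status", "seconds"])
        while True:  # stop with Ctrl-C when the test run ends
            start = time.monotonic()
            try:
                with urllib.request.urlopen(HEALTH_URL, timeout=30) as resp:
                    status = resp.status
            except OSError:
                status = "error"
            elapsed = time.monotonic() - start
            row = [datetime.now().isoformat(timespec="seconds"), status, f"{elapsed:.2f}"]
            writer.writerow(row)
            out.flush()
            print(*row)
            time.sleep(INTERVAL_SECONDS)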

The results of each test were analyzed and the tests adjusted for the next round.
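
The kind of summary that feeds the next round might look like the sketch below (our illustration): per-transaction average and 95th-percentile response times computed from a results file whose layout (transaction, seconds) is an assumption:

    import csv
    from collections import defaultdict
    from statistics import mean, quantiles

    times = defaultdict(list)
    with open("results.csv", newline="") as f:   # assumed columns: transaction, seconds
        for row in csv.DictReader(f):
            times[row["transaction"]].append(float(row["seconds"]))

    for name, values in sorted(times.items()):
        # quantiles(n=20) returns 19 cut points; index 18 is the 95th percentile.
        p95 = quantiles(values, n=20)[18] if len(values) > 1 else values[0]
        print(f"{name:20s}  n={len(values):5d}  avg={mean(values):.2f}s  p95={p95:.2f}s")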

Lessons Learned

  • You should begin tracking statistics as early as possible
  • You should determine appropriate performance benchmarks
  • You must be clear about who has responsibility for what.
  • You must be clear about the resources needed
  • You should integrate your testing needs into your implementation or upgrade plans
  • Communication and planning are both critically important.

Benefits from performance testing included

  • Identifying application and hardware needs
  • Confidence that the system would work under loads
  • A shakeout of the production environment itself
  • A successful freshman registration period - and they didn't get added to the "rock"

