From Sage 500 to 1000: Performance Testing myths exposed
Richard Bishop
From Sage 500 to 1000
Introduction
Trust IV & Richard Bishop
Project Background
8 myths of (non-functional) software testing
What we did…breaking myths
Conclusion
…………..Performance Testing myths exposed
Introduction
Trust IV
• Founded in 2005
• Testing consultancy, specialising in automated non-functional testing
Richard Bishop, Trust IV Ltd
• IT consultant with 20 years' experience
• Specialising in Microsoft platforms and performance engineering / testing
• HP specialist, UK leader of Vivit (HP software user group in the UK)
Colleagues
• Mixture of consultants and contract resources
• Primarily HP LoadRunner specialists
• On customer sites and working remotely
• Skills in multiple test tools, platforms etc.
Who are we?
Non-Functional Testing
Non-functional, automated testing specialists
What on earth is NFT?
In a nutshell… usability, reliability and scalability.
Compatibility testing
Compliance testing
Security testing
Backup and disaster recovery testing
Load testing
Performance testing
Scalability testing
Stress testing
What on earth is NFT?
usability
reliability
scalability
Project Background
• Sage 500 to Sage 1000 migration
• Concerns re: numbers of concurrent users supported
• Required performance test to validate potential maximum user load
• Test to include a single “user journey”, simulating a requisition process
• Objective: increase user load until system failure
University Hospital Birmingham
8 Myths
1. The vendor/developer has already tested this, so we don't need to
2. NFT not required if functional testing and UAT is OK
3. You can / can't test in a live environment
4. Web applications are easy to test using low-cost / open-source tools
5. Anyone can test…
6. If a test fails, “it must be the test tool / tester's fault”
7. Testing is too expensive / time-consuming
8. If it's slow, we can “throw kit at it”
(of non-functional software testing)
Myth 1: Vendor/developer already tested – we don’t need to…
Myth 2: NFT not required if functional testing and UAT is OK
[Diagram: single-user vs multi-user request paths through the load balancer, IIS servers, forms server and DB server]
Myth 3: You can/can’t test in a live environment
You can’t…
…results can be unreliable…
…and outages can happen
Myth 3: You can/can’t test in a live environment
You can…
…prior to launch
…or with extremely careful planning
…or by mistake
Myth 3: Live environments… a cautionary tale
Myth 4: Anyone can test an application
Source: http://www.pixar.com/short_films/Theatrical-Shorts/Lifted
Myth 5: Web apps are easy to test using low-cost/no-cost tools
Myth 6: If a test fails, “it must be the test tool / tester’s fault”
Myth 7: Testing is too expensive / time-consuming
“The money spent with Trust IV was the best money spent on the whole project”
Myth 8: If it’s slow, we can “throw kit at it”
What we did….
Our standard test approach
POC
Scripting
Low vol. tests
Performance tests
Analysis
POC – January 2013
NOT “just” a web app… we had a “steroid ferret”
Myth: “Can use low-cost / open-source tools to test” → Reality: not a “web-only app”
Myth: “Anyone can test” → Reality: needed specialist skills
Digging deeper… SAGE 1000 uses two communications protocols
Had to convert displayed “human readable” text to legacy formats to allow SAGE 1000 to interpret our simulated user input…
Digging deeper… Complex test data requirements

SUBMIT POST "http://sagetest:80/webclient/jcsp.dll?" "Comms&" "__CS3SessionID2621355832230" IDENTIFIER 136
BODY "MfcISAPICommand=Comms&__CS3SessionID2621355832230\x00\.............. <Some data removed for brevity> ………….."\x00\xed\x00\r\x00F\x00O\x00R\x00C\x00E\x00P\x00S\x00\x00\x00\x00\x00\x01\x04\x00\r\x00\x00\x00\x00\x01\x1d\x00\r\x00"………….."\
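In the captured body, the text “FORCEPS” appears with a null byte before each character, which for plain ASCII is the same as a UTF-16 big-endian encoding. A minimal Python sketch of that conversion (the helper names are ours, for illustration only, not part of the Sage 1000 API):

```python
def to_legacy(text: str) -> bytes:
    # Null byte before each ASCII character, as seen in the captured body
    # (equivalent to UTF-16 big-endian for plain ASCII text)
    return text.encode("utf-16-be")

def as_script_escapes(raw: bytes) -> str:
    # Render bytes in the \xNN style the test script uses,
    # keeping printable characters readable
    return "".join(chr(b) if 0x20 <= b < 0x7F else f"\\x{b:02x}" for b in raw)

print(as_script_escapes(to_legacy("FORCEPS")))  # \x00F\x00O\x00R\x00C\x00E\x00P\x00S
```

This matches the `\x00F\x00O\x00R\x00C\x00E\x00P\x00S` fragment visible in the request above.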
Digging deeper… Correlation
SUBMIT POST PRIMARY "http://atum:80/webclient/jcsp.dll" IDENTIFIER 49
BODY "MfcISAPICommand=Application&__CS3SessionID2121358346714&135532676\x00";
SYNCHRONIZE REQUESTS;

SUBMIT POST http://atum:80/webclient/en-gb/NVPApplicationClient.asp? IDENTIFIER 50
BODY "MfcISAPICommand=Application&__CS3SessionID2121358346714&135532676\x00";
SYNCHRONIZE REQUESTS;
135532676 = 0x08141084 = 0x0A 0x08 0x14 0x10 0x84 0x0D (0x0A / 0x0D are LF / CR framing bytes)
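The correlated session token appears in decimal in one request and as hex bytes framed by LF/CR elsewhere. A quick check of the conversion shown above:

```python
token = 135532676                      # decimal value captured from the response

# The same value in hexadecimal, as seen on the wire
assert f"0x{token:08X}" == "0x08141084"

# Byte form, framed by 0x0A / 0x0D (LF / CR) as in the trace
framed = b"\x0a" + token.to_bytes(4, "big") + b"\x0d"
print(framed.hex(" "))                 # 0a 08 14 10 84 0d
```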
What we did
• Script simulating login and requisition
• 1000 user accounts, 30,000 products
BUT
• Application update in January meant re-coding, added delay
• Initial tests showed problems with scalability… before the requisition step was fully scripted
Our plan…
• Problems @ 20 user load• Blank screens / HTTP 500 errors• Spent time proving test tool
Manual tests with volunteers & network traces to prove simulation equivalent to real users
Iterative testing began…
POC
Scripting
Low vol. tests
Performance tests
Analysis
…modified test approach as we found defects
Initial tests
V. high thread count
> 2300 threads associated with the JCSP process
V. high context-switch rate
> 17,000 switches / sec
Initial Tests: Key observations

Metric                 | System Idle | Manual test | Automated test
CPU utilisation        | < 2%        | < 2%        | < 6%
Available MBytes       | 30.1 GB     | 28.73 GB    | 27.3 GB
JCSP thread count      | 82          | 290         | 371
Total thread count     | 797         | 1932        | 2446
Context switches/sec   | 275         | 2000        | 2500
Processor queue length | < 1         | < 1         | < 1
High thread count and context switching are key issues
Response times > 20 seconds
Initial Tests: User experience
We made recommendations to improve performance. We asked Datel to check:
• Heap size
• Application pool size
• Timeout values
Datel reconfigured the application server:
• Encrypted login credentials within the application
• Altered TCP/IP timeout values and keep-alives
• Set lifetime session limit to 30 minutes
• Registry changes
Next steps: Reconfiguration & re-test
Re-tested, but….
Next Steps: Re-test
Next steps – retest
Despite load balancer problem:
• Response times consistent
• No degradation over time
Re-test: Observations
Further tests
1. 230 users – 1289 logins/hr
2. 250 users – 1383 logins/hr
3. 500 users – 2715 logins/hr
4. 500 users – 5412 logins/hr
Test stats
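As a quick sanity check on the pacing implied by the figures above (assuming logins were spread evenly across users):

```python
# (users, logins per hour) for each of the four test runs
tests = [(230, 1289), (250, 1383), (500, 2715), (500, 5412)]

for users, logins_per_hr in tests:
    # Effective login rate each simulated user sustained
    print(f"{users} users: {logins_per_hr / users:.1f} logins/user/hr")
# 5.6, 5.5 and 5.4 logins/user/hr for the first three runs;
# the final run roughly doubled per-user pacing to 10.8
```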
We noticed large numbers of HTTP 404 errors
Still missed some “asynchronous” traffic
Hadn’t tested complex application flows, due primarily to time constraints
Final report… some caveats
Conclusion
What have we learned?
• You probably need to test
  – Reduced response times for SAGE 1000 login from > 20 seconds (and timeouts) to ≈ 3 s
  – Application worked, just not our particular configuration
• Testing needn’t be expensive
  – Thanks to UHB for the endorsement
• Don’t trust vendors (or developers) to test
RULES AREN’T THERE TO BE BROKEN,
BUT BREAKING MYTHS IS OK……