Yes, performance testing can be done manually. It is one way to execute performance testing, but it does not produce repeatable results, cannot deliver measurable levels of stress on an application, and is hard to organize at scale. Much also depends on the type of performance test being run. In general, though, a tester with a performance viewer can review the active sessions, the number of open database connections, the number of running threads (for Java-based web applications), and the total CPU time and memory in use. Tools such as IBM Tivoli Performance Viewer and WAPT are available as trial versions, and JMeter is an open-source alternative. Typically, the test is done by installing the application on a server, accessing it from several client machines, and running numerous threads; the performance viewer itself is, of course, installed on the server.
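The setup described above, several clients hammering one server while response times are recorded, can be sketched in Python. This is a minimal, hypothetical stand-in: a tiny local HTTP server plays the role of the deployed application, and a thread pool plays the client machines.

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

class _Handler(http.server.BaseHTTPRequestHandler):
    """Stand-in for the application under test."""
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence per-request console noise

# Start the "application" on a free local port.
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

def timed_request(_):
    """One simulated client hit: fetch the page, return elapsed seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

# Simulate many client threads hitting the application concurrently.
with ThreadPoolExecutor(max_workers=20) as pool:
    response_times = list(pool.map(timed_request, range(100)))

server.shutdown()
print(f"requests: {len(response_times)}, "
      f"mean: {mean(response_times) * 1000:.2f} ms, "
      f"max: {max(response_times) * 1000:.2f} ms")
```

In a real test the clients would run on separate machines against the actual deployment, and a proper tool such as JMeter would manage ramp-up, think time, and reporting.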



Some techniques for performing performance testing manually are:

1) If a tester is testing a website, odds are that response times can be cut in half (sometimes more) by performance testing the front end first.

2) Use browser plug-ins or online tools to capture page load times.

3) Ask functional testers and/or user acceptance testers to record their impressions of performance while testing. It may be useful to give them a scale, such as "fast, acceptable, tolerable, annoying, unusable".

4) Have the developers put timers in their unit tests. These won't tell the tester anything about user-observed response times, but they let developers see whether their functions, modules, classes, or objects take more or less time to execute from build to build. The same idea can be applied to resource utilization (such as CPU and memory), depending on the skills and/or tools available to the development team.
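A simple way to add such timers, sketched here in Python with the standard `unittest` module (the `sort_records` function and the test names are hypothetical stand-ins for real code under test):

```python
import io
import time
import unittest

def sort_records(records):
    """Stand-in for a function whose build-to-build timing we want to watch."""
    return sorted(records)

class TimedTestCase(unittest.TestCase):
    """Base class that records how long each test method takes."""
    timings = {}

    def setUp(self):
        self._start = time.perf_counter()

    def tearDown(self):
        elapsed = time.perf_counter() - self._start
        TimedTestCase.timings[self._testMethodName] = elapsed

class TestSorting(TimedTestCase):
    def test_sort_large_input(self):
        data = list(range(50_000, 0, -1))
        self.assertEqual(sort_records(data)[0], 1)

# Run the suite and print the per-test timings for trend tracking.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSorting)
unittest.TextTestRunner(stream=io.StringIO()).run(suite)
for name, secs in TimedTestCase.timings.items():
    print(f"{name}: {secs * 1000:.1f} ms")
```

Persisting these numbers per build (for example, appending them to a CSV keyed by build number) is what turns one-off measurements into a trend developers can act on.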

5) Get an increasing number of co-workers to use the application during a specified period, and ask them to note both the response time (easiest with the browser plug-ins mentioned above) and their impression of the application's performance, using the same scale given to the functional and/or user acceptance testers.

6) Have performance builds made with timestamps strategically written to log files. Evaluate the log files build after build and track the trends.
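The timestamp-to-log-file idea can be sketched as follows. This is an illustrative Python snippet, not a prescribed format: `timed_step`, the log path, and the `load_homepage` step name are all hypothetical, and `time.sleep` stands in for real workload.

```python
import tempfile
import time
from pathlib import Path

# Hypothetical per-build performance log.
LOG = Path(tempfile.gettempdir()) / "perf_build.log"
LOG.unlink(missing_ok=True)

def timed_step(name, fn, log=LOG):
    """Run one step of the workload and append 'timestamp name elapsed_ms'."""
    start = time.perf_counter()
    fn()
    elapsed_ms = (time.perf_counter() - start) * 1000
    with log.open("a") as f:
        f.write(f"{time.strftime('%Y-%m-%d %H:%M:%S')} {name} {elapsed_ms:.2f}\n")
    return elapsed_ms

# Pretend workload from two successive builds.
timed_step("load_homepage", lambda: time.sleep(0.01))
timed_step("load_homepage", lambda: time.sleep(0.02))

# Evaluate the log build after build: extract the trend for one step.
trend = [float(line.split()[-1])
         for line in LOG.read_text().splitlines()
         if " load_homepage " in line]
print("load_homepage trend (ms):", trend)
```

In practice each build would append to (or produce) its own log, and the tester compares the extracted numbers across builds to spot regressions.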

It also depends on what you're trying to achieve from the test.

- If you want to simulate 20 users using a website and get an overall impression of user response time, then this approach is fine.

- If you want to simulate 20 users all executing a piece of code at exactly the same time, then this is unlikely to work unless the code takes a long time to execute.

- If you want precise metrics, then this possibly isn't the best way.
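To see why true simultaneity is hard to achieve by hand, here is a Python sketch: even with a `threading.Barrier` releasing 20 threads at once, the "simultaneous" hits are still spread over a small window, and manual coordination between humans is far coarser than this.

```python
import threading
import time

N = 20
barrier = threading.Barrier(N)
hit_times = []
lock = threading.Lock()

def user():
    # Threads started in a loop begin at slightly different moments;
    # the barrier releases them all together, as close to
    # "the same time" as the scheduler allows.
    barrier.wait()
    now = time.perf_counter()
    with lock:
        hit_times.append(now)

threads = [threading.Thread(target=user) for _ in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()

spread_ms = (max(hit_times) - min(hit_times)) * 1000
print(f"{N} simulated users fired within {spread_ms:.3f} ms of each other")
```

Load tools like JMeter use the same kind of synchronization (its Synchronizing Timer) to approximate simultaneous hits, which is why precise concurrency scenarios are better left to tooling than to manual testers.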



Shabnam works as a QA Engineer at BugRaptors. She is well versed in manual testing, mobile application testing, performance testing, load testing, and web application testing. She has worked on various web and mobile applications and is skilled at carrying out end-to-end testing.


