Reporting Bugs in a More Polite Way

Today I had an argument with two developers at our lunch table. The argument was about bug reporting. I always try to see a bug from the developer's perspective, so my developer friends are keen to discuss QA-related matters with me. They were arguing about polite ways of reporting bugs.

One of our QA team members reported a bug, and in the last line of the bug description she wrote, "This concludes that the system is not stable." I will never agree with that kind of comment in a bug description. The developer told me that this is why the QA department exists: if the system were stable from the start, we would not have to worry about client releases. He is correct. Unfortunately, I don't have a way to advise this team member.

I believe developers work hard to make things work. We need to respect their thinking and effort. But I don't accept careless mistakes in a project. In the end I agreed with my developer friends on this matter. Getting bug fixes done by developers is an art that QA should practice. We need to find bugs, but we need to report them very politely, without hurting developers.

Some Useful Links for Software Test Automation

Links to find resources for automated testing

Test Automation Snake Oil - James Bach

Improving the Maintainability of Automated Test Suites - Cem Kaner, Quality Week '97

Techniques and ideas for automating software testing - Bret Pettichord

Architectures of Test Automation - Cem Kaner

Totally Data-Driven Automated Testing - Keith Zambelich, SQA Testing

Seven Steps to Test Automation Success - Bret Pettichord, June 2001

Test Automation Frameworks - Carl Nagle

Using SilkTest as an Automation Tool

In general I am not a huge fan of prepackaged test automation suites; however, if you are going to use one, I find that, with a few exceptions, Silk is the best one to have.

As for the best approach: use a keyword-driven test methodology.

This goes one step beyond a data-driven approach and truly minimizes the maintenance overhead of the test automation infrastructure.

With data-driven testing you have a test case that is a series of steps fixed in the code, but the data entered at each step is parameterized, so that you can run the same test multiple times with different data.
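To make the distinction concrete, here is a minimal Python sketch of data-driven testing. The `login()` function is a hypothetical stand-in for driving the real application; the point is that the step sequence lives in code while only the data varies per row:

```python
# Minimal sketch of data-driven testing: the steps are hard-coded,
# only the inputs and expected outcome come from each data row.

def login(username, password):
    """Hypothetical system under test: accepts one known credential pair."""
    return username == "ABC123" and password == "Passw0rd"

# Each row is one run of the same test with different data.
test_data = [
    ("ABC123", "Passw0rd", True),   # valid credentials
    ("ABC123", "wrong", False),     # bad password
    ("", "", False),                # empty input
]

def run_data_driven(rows):
    results = []
    for username, password, expected in rows:
        # Same steps every time (enter username, enter password, submit);
        # the data row supplies the values and the expected result.
        results.append(login(username, password) == expected)
    return results

print(run_data_driven(test_data))  # -> [True, True, True]
```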

Keyword-driven testing takes the sequence of steps itself and removes it from the code, the same way that data-driven testing removes the data from the code.

In Silk you do this by controlling the names of the objects in the frame file. Come up with a naming convention so that you can predict the name of any object based on the window it is in, its object class, and the user-understandable name of the object.

You then "teach" your code how to associate keywords with specific objects, and you build a database or spreadsheet format that lists keywords instructing the script which steps to take.

So, for example, take a simple login screen that has a Username field, a Password field, and a Submit button. Following the convention above, the objects would be named:

Login.Username
Login.Password
Login.Submit

One "row" of data in the test data defines one step in the test case. It requires five pieces of information: the screen name, the object class, the name of the object on the screen, the action you wish to take (generally this corresponds to a method available to that class, but it can be something custom too), and the data to apply.

The data for logging into the system would be:


Screen - Login
Class - TextField
ObjectName - Username
Action - SetText
Data - ABC123


Screen - Login
Class - TextField
ObjectName - Password
Action - SetText
Data - "Passw0rd"


Screen - Login
Class - Button
ObjectName - Submit
Action - Click
Data -

The automation code then reads your test data repository to get not only the data to use but also the sequence of steps to follow.
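The keyword-driven loop described above can be sketched as follows. SilkTest itself uses the 4Test language, so this Python version is only an illustrative translation; the dispatcher here just records what it would do, where a real driver would act on live GUI objects:

```python
# Minimal sketch of a keyword-driven executor. Each row carries the five
# fields described above: screen, class, object name, action, and data.
# Both the step sequence and the data live in the table, not in the code.

steps = [
    {"screen": "Login", "cls": "TextField", "obj": "Username",
     "action": "SetText", "data": "ABC123"},
    {"screen": "Login", "cls": "TextField", "obj": "Password",
     "action": "SetText", "data": "Passw0rd"},
    {"screen": "Login", "cls": "Button", "obj": "Submit",
     "action": "Click", "data": None},
]

def execute(steps):
    """Walk the step table and dispatch each action keyword."""
    log = []
    for step in steps:
        # Predictable object name from the naming convention: Screen.Object
        target = f'{step["screen"]}.{step["obj"]}'
        if step["action"] == "SetText":
            log.append(f'SetText {target} = {step["data"]}')
        elif step["action"] == "Click":
            log.append(f'Click {target}')
        else:
            raise ValueError(f'Unknown keyword: {step["action"]}')
    return log

for line in execute(steps):
    print(line)
```

Adding a new test case then means adding rows to the table, not writing new code; only a brand-new action keyword requires touching the dispatcher.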

GUI Testing Checklist

Here are some sites that give comprehensive checklists for GUI standards. I hope this will be helpful to all newbies in the testing world.

Measuring Ad-hoc Testing

One possible way to measure ad-hoc testing is Session-Based Test Management. You define sessions, each of which contains suggestions about what should be tested. A session is like a mission, not like a test case. You then split the sessions between testers and they do their work. After testing is done, each tester prepares a report containing information like this:

Session Name
Tester Name
Start Time
Test Design and Execution
Bug Investigation and Reporting
Session Setup
Test Notes
Test Areas
Bugs list
Issues list

After you have collected the reports from the testers, you can measure their productivity, because you have enough data.

You can also perform test planning: you define all the needed sessions (e.g. 50), estimate how many sessions one tester can execute per day (e.g. 5), analyze what fraction of sessions are perfect (e.g. 70%), and you know how many testers will do the job (e.g. 7).
So it is easy to estimate the overall time:

50 / (5 * 7 * 0.70) ≈ 2 days
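The estimate above is easy to check; here is the same arithmetic as a small Python helper (the parameter names are mine, not part of the SBTM method):

```python
# Sketch of the session-based estimation formula above.
# sessions: total sessions to run; per_day: sessions one tester completes
# per day; testers: team size; perfect: fraction of sessions that are perfect.

def estimate_days(sessions, per_day, testers, perfect):
    # Effective daily throughput is discounted by the perfect-session rate.
    return sessions / (per_day * testers * perfect)

days = estimate_days(sessions=50, per_day=5, testers=7, perfect=0.70)
print(round(days, 2))  # prints 2.04, i.e. about 2 days
```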
