Sunday, April 29, 2007

Preparing for Quality Assurance (QA)

Test cases should be developed as coding progresses. If you wait until the QA phase begins to create your test suite, test cases will be rushed and your team will not have time to fully review the suite of test cases for each requirement. Test case development should begin the day coding starts, and testers should be assigned to create test cases for specific requirements.

Testers should remember to create test
cases for:

**Positive Testing -
These test cases verify that the software works exactly as specified in the requirement.
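
For example, here is a minimal positive test case in Python (pytest style). The calculate_invoice_total() function is a hypothetical stand-in for the feature under test, and the requirement is assumed to be that the total equals the sum of the line item amounts:

```python
def calculate_invoice_total(line_items):
    # Hypothetical stand-in for the feature under test.
    return sum(line_items)

def test_invoice_total_matches_requirement():
    # Positive test: valid input, exactly the behavior the requirement specifies.
    assert calculate_invoice_total([10.00, 25.50, 4.50]) == 40.00
```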

**Negative Testing -
These test cases are used to try to "trick" the software. For example, try entering an invalid date in a date field. Try entering character data in a numeric field. Try entering a date range where the "from" date is later than the "to" date. This also includes entering data that contains apostrophes, as this tends to trip up SQL-based systems that build queries by string concatenation instead of using parameters.
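
A minimal sketch of negative test cases, again in pytest style. The validate_date() function is a hypothetical stand-in for your application's input validation; apostrophe-laden strings like "O'Brien" can be run through the same pattern against text fields:

```python
from datetime import datetime
import pytest

def validate_date(value):
    # Hypothetical stand-in for the application's date validation.
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except ValueError:
        return False

@pytest.mark.parametrize("bad_input", [
    "2007-02-30",   # invalid day (there is no February 30th)
    "not-a-date",   # character data in a date field
    "2007-13-01",   # invalid month
])
def test_rejects_invalid_dates(bad_input):
    # Negative test: the software should reject the "trick" input, not crash.
    assert validate_date(bad_input) is False
```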


**Bounds Testing -
These test cases exercise the bounds of each field. For example, if a field is defined as 50 characters, try entering 60 characters.
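
A sketch of a bounds test, assuming a hypothetical set_description() function that enforces a 50-character field definition. It is also worth checking that the boundary value itself is accepted:

```python
import pytest

MAX_LENGTH = 50  # the field is defined as 50 characters

def set_description(value):
    # Hypothetical stand-in that enforces the field definition.
    if len(value) > MAX_LENGTH:
        raise ValueError("description exceeds 50 characters")
    return value

def test_rejects_sixty_characters_in_fifty_character_field():
    with pytest.raises(ValueError):
        set_description("x" * 60)  # 60 characters into a 50-character field

def test_accepts_exactly_fifty_characters():
    assert len(set_description("x" * 50)) == 50  # the boundary itself is valid
```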

**Relational Testing -
These test cases test parent-child relationships. For example, if you are testing a parent-child feature (e.g. an invoice may have one or more invoice line items), try deleting the parent (the invoice in this example), then ensure that all the child items (the invoice line items) were deleted as well.
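
The sketch below illustrates the idea with an in-memory SQLite database. The invoice/line_item schema and the ON DELETE CASCADE rule are assumptions made for this example, not a prescription for your system:

```python
import sqlite3

def test_deleting_invoice_deletes_its_line_items():
    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per connection
    conn.execute("CREATE TABLE invoice (id INTEGER PRIMARY KEY)")
    conn.execute("""CREATE TABLE line_item (
                        id INTEGER PRIMARY KEY,
                        invoice_id INTEGER REFERENCES invoice(id) ON DELETE CASCADE)""")
    conn.execute("INSERT INTO invoice (id) VALUES (1)")
    conn.execute("INSERT INTO line_item (id, invoice_id) VALUES (10, 1), (11, 1)")

    conn.execute("DELETE FROM invoice WHERE id = 1")  # delete the parent

    # All child line items should be gone along with the parent invoice.
    remaining = conn.execute("SELECT COUNT(*) FROM line_item").fetchone()[0]
    assert remaining == 0
```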

**Performance Testing -
These test cases ensure that the new release will perform as quickly as (or quicker than) the previous release. To test this, prior to the new release, try different actions (add a record, search for a record, update a record, delete a record, etc.) and record your timings in a spreadsheet. Once the new release is in the QA environment, run those same tests and record the new timings; comparing the two sets will tell you whether performance has improved or degraded.
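
One way to capture those timings is a small script that writes to a CSV file you can open in a spreadsheet. The time_action() helper and add_record() placeholder below are hypothetical stand-ins for your application's real actions:

```python
import csv
import time

def time_action(label, action, writer):
    # Time a single action and record it as a row in the CSV file.
    start = time.perf_counter()
    action()
    elapsed = time.perf_counter() - start
    writer.writerow([label, f"{elapsed:.4f}"])

def add_record():
    pass  # placeholder: invoke the application's "add a record" action here

with open("baseline_timings.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["action", "seconds"])
    time_action("add a record", add_record, writer)
    # Repeat for search, update, delete, etc.; rerun the same script against
    # the new release in QA and compare the two CSV files side by side.
```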

**Regression Testing -
Upon a successful test cycle, some
of the test cases above should be
marked as regression test cases so
that they are run in future releases
to ensure that existing features
continue to work as designed.
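
If your team automates its test cases, one convenient way to mark them is with a test-framework tag. The sketch below assumes pytest, with a "regression" marker registered in pytest.ini; running pytest -m regression then executes only the promoted cases:

```python
import pytest

# Assumes a "regression" marker is registered in pytest.ini, e.g.:
#   [pytest]
#   markers = regression: promoted from a successful test cycle
@pytest.mark.regression
def test_invoice_total_still_sums_line_items():
    # A positive case from a past release, promoted to the regression suite.
    assert sum([10.00, 25.50, 4.50]) == 40.00
```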

**Smoke Tests -
Once all the test cases are designed for all the requirements, a small set (10 to 30 test cases) of the positive test cases should be identified as Smoke Tests. These will be run prior to beginning the major testing effort.

If any of these fail, the underlying defects should be fixed immediately, before full testing begins, so that the testing team's time is well spent.
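
Again assuming pytest, smoke tests can be tagged the same way so they can be run on their own before the main effort; the login() function here is a hypothetical stand-in for the application's real login action. Running pytest -m smoke then executes just that small set:

```python
import pytest

def login(username, password):
    # Hypothetical stand-in for the application's real login action.
    return username == "qa_user" and password == "correct_password"

@pytest.mark.smoke
def test_login_smoke():
    # A quick positive check run before the major testing effort begins.
    assert login("qa_user", "correct_password") is True
```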
