Web Testing Checklist about Usability
Navigation
1. Is terminology consistent?
2. Are navigation buttons consistently located?
3. Does navigation lead to the correct/intended destination?
4. Is the flow to destination (page to page) logical?
5. Does the flow within each page follow a top-to-bottom, left-to-right pattern?
6. Is there a logical way to return?
7. Are the business steps within the process clear or mapped?
8. Are navigation standards followed?
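Part of the navigation checks above (correct destinations, a logical way to return) can be automated with a small link-resolution sketch. The snippet below is a minimal illustration in Python using requests and BeautifulSoup; the base URL is a placeholder for the site under test, and HEAD requests are assumed to be supported.

```python
# Minimal navigation-link check: fetch a page and confirm each anchor
# resolves without an HTTP error. BASE_URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

BASE_URL = "https://www.example.com/"  # replace with the site under test

def check_navigation_links(url):
    page = requests.get(url, timeout=10)
    page.raise_for_status()
    soup = BeautifulSoup(page.text, "html.parser")
    broken = []
    for anchor in soup.find_all("a", href=True):
        target = urljoin(url, anchor["href"])
        if not target.startswith("http"):
            continue  # skip mailto:, javascript:, and similar links
        response = requests.head(target, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            broken.append((anchor.get_text(strip=True), target, response.status_code))
    return broken

if __name__ == "__main__":
    for text, link, status in check_navigation_links(BASE_URL):
        print(f"Broken navigation link '{text}' -> {link} ({status})")
```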
Ease of Use
1. Are help facilities provided as appropriate?
2. Are selection options clear?
3. Are ADA standards followed?
4. Is the terminology appropriate to the intended audience?
5. Is scrolling minimized, and are screens resizable?
6. Do menus load first?
7. Do graphics have reasonable load times?
8. Are there multiple user-chosen paths through the site (e.g., search options)?
9. Are messages understandable?
10. Are confirmation messages available as appropriate?
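Item 7 above (reasonable graphic load times) lends itself to a quick timing check. The sketch below is one possible approach; the page URL and the per-image time budget are assumptions to replace with your own service levels.

```python
# Flag images on a page that exceed an assumed load-time budget.
import time
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PAGE_URL = "https://www.example.com/"   # placeholder page under test
MAX_SECONDS = 2.0                       # assumed budget per image

def slow_images(page_url, budget=MAX_SECONDS):
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    offenders = []
    for img in soup.find_all("img", src=True):
        src = urljoin(page_url, img["src"])
        start = time.perf_counter()
        response = requests.get(src, timeout=30)
        elapsed = time.perf_counter() - start
        if elapsed > budget:
            offenders.append((src, round(elapsed, 2), len(response.content)))
    return offenders

if __name__ == "__main__":
    for src, seconds, size in slow_images(PAGE_URL):
        print(f"{src}: {seconds}s, {size} bytes")
```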
Presentation of Information
1. Are fonts consistent within functionality?
2. Are the company display standards followed?
- Logos
- Font size
- Colors
- Scrolling
- Object use
3. Are legal requirements met?
4. Is content sequenced properly?
5. Are web-based colors used?
6. Is there appropriate use of white space?
7. Are tools provided (as needed) in order to access the information?
8. Are attachments provided in a static format?
9. Are spelling and grammar correct?
10. Are alternative presentation options available (for limited browsers or performance issues)?
How to Interpret/Use Information
1. Is terminology appropriate to the intended audience?
2. Are clear instructions provided?
3. Are there help facilities?
4. Are there appropriate external links?
5. Is expanded information provided on services and products? (why and how)
6. Are multiple views/layouts available?
Web Testing Checklist about Compatibility and Portability
Overall
1. Are requirements driven by business needs and not technology?
Audience
1. Has the audience been defined?
2. Is there a process for identifying the audience?
3. Is the process for identifying the audience current?
4. Is the process reviewed periodically?
5. Is there appropriate use of audience segmentation?
6. Is the application compatible with the audience experience level?
7. Where possible, has the audience readiness been ensured?
8. Are text version and/or upgrade links present?
Testing Process
1. Does the testing process include appropriate verifications? (e.g., reviews, inspections and walkthroughs)
2. Is the testing environment compatible with the operating systems of the audience?
3. Do the testing process and environment legitimately simulate the real world?
Operating Systems Environment/Platform
1. Have the operating environments and platforms been defined?
2. Have the most critical platforms been identified?
3. Have audience expectations been properly managed?
4. Have the business users/marketing been adequately prepared for what will be tested?
5. Have sign-offs been obtained?
Risk
1. Has the risk tolerance been assessed to identify the vital few platforms to test?
Hardware
1. Is the test hardware compatible with all screen types, sizes, and resolutions of the audience?
2. Is the test hardware compatible with all means of access (modems, etc.) of the audience?
3. Is the test hardware compatible with all languages of the audience?
4. Is the test hardware compatible with all databases of the audience?
5. Does the test hardware contain the compatible plug-ins and DLLs of the audience?
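One way to exercise the screen type, size, and resolution questions above is to drive the site at several window sizes and review the rendering. The sketch below assumes Selenium with a locally available Chrome/ChromeDriver setup; the URL and the resolution list are assumptions about the audience mix.

```python
# Render the site at several assumed screen resolutions and capture
# a screenshot of each for visual review.
from selenium import webdriver

SITE_URL = "https://www.example.com/"          # placeholder site under test
RESOLUTIONS = [(1920, 1080), (1366, 768), (1024, 768), (800, 600)]  # assumed audience mix

driver = webdriver.Chrome()   # requires a local Chrome/ChromeDriver setup
try:
    for width, height in RESOLUTIONS:
        driver.set_window_size(width, height)
        driver.get(SITE_URL)
        driver.save_screenshot(f"layout_{width}x{height}.png")
finally:
    driver.quit()
```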
General
1. Is the application compatible with standards and conventions of the audience?
2. Is the application compatible with copyright laws and licenses?
Web Testing Checklist about Security (1)
Access Control
1. Is there a defined standard for login names/passwords?
2. Are good aging procedures in place for passwords?
3. Are users locked out after a given number of password failures?
4. Is there a link for help (e.g., forgotten passwords)?
5. Is there a process for password administration?
6. Have authorization levels been defined?
7. Is management sign-off in place for authorizations?
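Item 3 above (lockout after a given number of password failures) can be probed automatically. The sketch below is only an illustration: the login endpoint, form field names, lockout threshold, and the "account locked" message are all assumptions about the application under test, and a dedicated test account should be used.

```python
# Probe account lockout: submit deliberately wrong passwords and check
# that the account locks after the assumed number of failures.
import requests

LOGIN_URL = "https://www.example.com/login"   # placeholder login endpoint
ACCOUNT = "test.user"                          # dedicated test account
LOCKOUT_THRESHOLD = 3                          # assumed lockout policy

def lockout_attempt_number():
    session = requests.Session()
    for attempt in range(1, LOCKOUT_THRESHOLD + 2):
        response = session.post(
            LOGIN_URL,
            data={"username": ACCOUNT, "password": f"wrong-password-{attempt}"},
            timeout=10,
        )
        # Assumed behaviour: the application returns a lockout message
        # once the threshold is exceeded.
        if "account locked" in response.text.lower():
            return attempt
    return None

if __name__ == "__main__":
    locked_at = lockout_attempt_number()
    if locked_at:
        print(f"Account locked after {locked_at} failed attempts")
    else:
        print("Account never locked out -- review the lockout policy")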
Disaster Recovery
1. Have service levels been defined? (e.g., how long should recovery take?)
2. Are fail-over solutions needed?
3. Is there a way to reroute to another server in the event of a site crash?
4. Are executables, data, and content backed up on a defined interval appropriate for the level of risk?
5. Are disaster recovery process & procedures defined in writing? If so, are they current?
6. Have recovery procedures been tested?
7. Are site assets adequately insured?
8. Is a third-party "hot site" available for emergency recovery?
9. Has a Business Contingency Plan been developed to maintain the business while the site is being restored?
10. Have all levels in organization gone through the needed training & drills?
11. Do support notification procedures exist & are they followed?
12. Do support notification procedures support a 24/7 operation?
13. Have criteria been defined to evaluate recovery completion/correctness?
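Item 4 above (backups taken on a defined interval) can be supported by a simple monitoring sketch that confirms the newest backup is not older than the agreed interval. The backup directory and the interval below are assumptions.

```python
# Verify that the most recent backup is newer than the agreed interval.
import os
import time

BACKUP_DIR = "/var/backups/site"        # assumed backup location
MAX_AGE_HOURS = 24                      # assumed backup interval

def newest_backup_age_hours(directory):
    files = [os.path.join(directory, name) for name in os.listdir(directory)]
    files = [path for path in files if os.path.isfile(path)]
    if not files:
        return None
    newest = max(files, key=os.path.getmtime)
    return (time.time() - os.path.getmtime(newest)) / 3600

if __name__ == "__main__":
    age = newest_backup_age_hours(BACKUP_DIR)
    if age is None:
        print("No backup files found")
    elif age > MAX_AGE_HOURS:
        print(f"Newest backup is {age:.1f} hours old -- exceeds the {MAX_AGE_HOURS}h interval")
    else:
        print(f"Newest backup is {age:.1f} hours old -- within the interval")
```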
Firewalls
1. Was the software installed correctly?
2. Are firewalls installed at adequate levels in the organization and architecture? (e.g., corporate data, human resources data, customer transaction files, etc.)
3. Have firewalls been tested? (e.g., to allow & deny access).
4. Is the security administrator aware of known firewall defects?
5. Is there a link to access control?
6. Are firewalls installed in effective locations in the architecture? (e.g., proxy servers, data servers, etc.)
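Item 3 above (testing that firewalls both allow and deny access) can be spot-checked with a simple port probe. The host and the lists of ports expected to be open or blocked are assumptions about your own architecture.

```python
# Spot-check firewall rules: ports expected to be open should connect,
# ports expected to be blocked should not.
import socket

HOST = "www.example.com"                 # placeholder host behind the firewall
EXPECTED_OPEN = [80, 443]                # assumed allowed ports
EXPECTED_BLOCKED = [23, 3306]            # assumed denied ports (telnet, MySQL)

def port_is_open(host, port, timeout=3):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in EXPECTED_OPEN:
        result = "OK (open)" if port_is_open(HOST, port) else "FAIL (should be open)"
        print(f"port {port}: {result}")
    for port in EXPECTED_BLOCKED:
        result = "OK (blocked)" if not port_is_open(HOST, port) else "FAIL (should be blocked)"
        print(f"port {port}: {result}")
```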
Proxy Servers
1. Have undesirable / unauthorized external sites been defined and screened out? (e.g. gaming sites, etc.)
2. Is traffic logged?
3. Is user access defined?
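The screening question above (item 1) can be verified by routing requests through the proxy and confirming that blocked categories are refused while normal traffic passes. The proxy address, sample sites, and the assumption that blocked requests fail or return HTTP 403 are all placeholders for your own environment.

```python
# Confirm the proxy blocks unauthorized site categories and allows normal traffic.
import requests

PROXY = {"http": "http://proxy.example.com:8080",
         "https": "http://proxy.example.com:8080"}              # placeholder proxy
SHOULD_BE_BLOCKED = ["https://www.example-gaming-site.com/"]     # assumed banned category
SHOULD_BE_ALLOWED = ["https://www.example.com/"]

def fetch_status(url):
    try:
        return requests.get(url, proxies=PROXY, timeout=10).status_code
    except requests.RequestException:
        return None   # connection refused or reset by the proxy

if __name__ == "__main__":
    for url in SHOULD_BE_BLOCKED:
        status = fetch_status(url)
        blocked = status is None or status == 403   # assumed blocking behaviour
        print(url, "-> blocked as expected" if blocked else f"-> NOT blocked (HTTP {status})")
    for url in SHOULD_BE_ALLOWED:
        status = fetch_status(url)
        print(url, "-> reachable" if status == 200 else f"-> unexpected result ({status})")
```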
Privacy
1. Is sensitive data restricted from being viewed by unauthorized users?
2. Is proprietary content copyrighted?
3. Is information about company employees limited on the public web site?
4. Is the privacy policy communicated to users and customers?
5. Is there adequate legal support and accountability of privacy practices?
Web Testing Checklist about Security (2)
Data Security
1. Are data inputs adequately filtered?
2. Are data access privileges identified? (e.g., read, write, update and query)
3. Are data access privileges enforced?
4. Have data backup and restore processes been defined?
5. Have data backup and restore processes been tested?
6. Have file permissions been established?
7. Have file permissions been tested?
8. Have sensitive and critical data been allocated to secure locations?
9. Have data archival and retrieval procedures been defined?
10. Have data archival and retrieval procedures been tested?
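Item 1 above (adequate filtering of data inputs) can be smoke-tested by submitting typical injection strings and checking that they come back escaped or rejected. The form endpoint and field name below are assumptions about the application; this is a quick sketch, not a substitute for a proper security scan.

```python
# Smoke-test input filtering: submit common injection payloads and check
# they are not reflected back unescaped.
import requests

FORM_URL = "https://www.example.com/comments"   # placeholder form endpoint
FIELD = "comment"                                # assumed input field name
PAYLOADS = [
    "<script>alert('xss')</script>",
    "' OR '1'='1",
]

if __name__ == "__main__":
    for payload in PAYLOADS:
        response = requests.post(FORM_URL, data={FIELD: payload}, timeout=10)
        if payload in response.text:
            print(f"Payload reflected unescaped: {payload!r}")
        else:
            print(f"Payload filtered or escaped: {payload!r}")
```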
Monitoring
1. Are network monitoring tools in place?
2. Are network monitoring tools working effectively?
3. Do monitors detect
- Network time-outs?
- Network concurrent usage?
- IP spoofing?
4. Is personnel access control monitored?
5. Is personnel internet activity monitored?
- Sites visited
- Transactions created
- Links accessed
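As a lightweight complement to dedicated network monitoring tools, a small polling script can at least record time-outs and availability over time. In the sketch below, the health endpoint, polling interval, and time-out budget are all assumptions.

```python
# Minimal availability monitor: poll an endpoint and record time-outs.
import time
import requests

ENDPOINT = "https://www.example.com/health"   # placeholder health endpoint
INTERVAL_SECONDS = 60                          # assumed polling interval
TIMEOUT_SECONDS = 5                            # assumed time-out budget

def poll_once():
    start = time.perf_counter()
    try:
        requests.get(ENDPOINT, timeout=TIMEOUT_SECONDS)
        return time.perf_counter() - start
    except requests.RequestException:
        return None   # time-out or connection failure

if __name__ == "__main__":
    while True:
        elapsed = poll_once()
        stamp = time.strftime("%Y-%m-%d %H:%M:%S")
        if elapsed is None:
            print(f"{stamp} TIMEOUT or connection failure after {TIMEOUT_SECONDS}s")
        else:
            print(f"{stamp} OK in {elapsed:.2f}s")
        time.sleep(INTERVAL_SECONDS)
```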
Security Administration
1. Have security administration procedures been defined?
2. Is there a way to verify that security administration procedures are followed?
3. Are security audits performed?
4. Is there a person or team responsible for security administration?
5. Are checks & balances in place?
6. Is there an adequate backup for the security administrator?
Encryption
1. Are encryption systems/levels defined?
2. Is there a standard of what is to be encrypted?
3. Are customer systems compatible with the encryption levels and protocols in use?
4. Are encryption techniques being used for secured transactions?
- Secure Sockets Layer (SSL)
- Virtual Private Networks (VPNs)
5. Have the encryption processes and standards been documented?
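For the SSL item above, a standard-library check can confirm that the site actually negotiates TLS and report the protocol version, cipher, and certificate subject. The host below is a placeholder.

```python
# Check that the site negotiates TLS and report the protocol and cipher.
import socket
import ssl

HOST = "www.example.com"   # placeholder host under test
PORT = 443

context = ssl.create_default_context()
with socket.create_connection((HOST, PORT), timeout=10) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("Negotiated protocol:", tls_sock.version())   # e.g. 'TLSv1.3'
        print("Cipher suite:", tls_sock.cipher())
        cert = tls_sock.getpeercert()
        print("Certificate subject:", dict(pair[0] for pair in cert["subject"]))
```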
Viruses
1. Are virus detection tools in place?
2. Are the virus data files kept up to date?
3. Are virus updates scheduled?
4. Is a response procedure for virus attacks in place?
5. Are notifications of updates to virus files obtained from the anti-virus software vendor?
6. Does the security administrator maintain an informational partnership with the anti-virus software vendor?
7. Does the security administrator subscribe to early warning e-mail services? (e.g., www.foo.org or www.bar.net)
8. Has a key contact been defined for the notification of a virus presence?
9. Has an automated response been developed to respond to a virus presence?
10. Is the communication & training of virus prevention and response procedures to users adequate?
Web Testing Checklist about Performance (1)
Tools
1. Has a load testing tool been identified?
2. Is the tool compatible with the environment?
3. Has licensing been identified?
4. Have external and internal support been identified?
5. Have employees been trained?
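Once a load testing tool has been identified (item 1), the scenario is scripted in that tool. Purely as an illustration, here is a minimal Locust user class; any other selected tool would have its own scripting style, and the host, paths, task weights, and wait times below are assumptions.

```python
# Minimal Locust scenario: each simulated user browses the home page and
# a search page with a short think time. Run with:  locust -f this_file.py
from locust import HttpUser, task, between

class SiteVisitor(HttpUser):
    host = "https://www.example.com"   # placeholder site under test
    wait_time = between(1, 5)          # assumed think time in seconds

    @task(3)
    def home_page(self):
        self.client.get("/")

    @task(1)
    def search(self):
        self.client.get("/search", params={"q": "test"})   # assumed search path
```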
Number of Users
1. Have the maximum number of users been identified?
2. Has the complexity of the system been analyzed?
3. Has the user profile been identified?
4. Have user peaks been identified?
5. Have languages been identified (e.g., English, Spanish, French) for global sites?
6. Have session lengths been identified by number of users?
7. Has the number of user configurations been identified?
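To get a first impression of the peak-user questions above without a full tool run, a quick concurrency probe can be useful. The URL and the assumed number of concurrent users below are placeholders; this is not a substitute for a proper load test.

```python
# Quick concurrency probe: issue requests from many threads at once and
# summarise response times. Not a substitute for a load testing tool.
import time
from concurrent.futures import ThreadPoolExecutor
import requests

URL = "https://www.example.com/"   # placeholder site under test
CONCURRENT_USERS = 50              # assumed peak to probe

def one_request(_):
    start = time.perf_counter()
    status = requests.get(URL, timeout=30).status_code
    return status, time.perf_counter() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = list(pool.map(one_request, range(CONCURRENT_USERS)))
    times = sorted(elapsed for _, elapsed in results)
    errors = sum(1 for status, _ in results if status >= 400)
    print(f"median {times[len(times) // 2]:.2f}s, worst {times[-1]:.2f}s, errors {errors}")
```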
Expectations/Requirements
1. Has the response time been identified?
2. Has the client response time been identified?
3. Has the expected vendor response time been identified?
4. Have the maximum and acceptable response times been defined?
5. Has response time been met at the various thresholds?
6. Has the breaking point been identified for capacity planning?
7. Do you know what caused the crash if the application was taken to the breaking point?
8. Has the number of transactions for a given period of time been identified (to locate bottlenecks)?
9. Have availability service levels been defined?
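Maximum and acceptable response times (item 4 above) translate naturally into automated assertions. In this sketch, the URL and both thresholds are assumptions to replace with your own service levels.

```python
# Measure response time and compare it against assumed acceptable and
# maximum thresholds.
import time
import requests

URL = "https://www.example.com/"   # placeholder site under test
ACCEPTABLE_SECONDS = 2.0           # assumed target response time
MAXIMUM_SECONDS = 5.0              # assumed hard limit

def measure(url):
    start = time.perf_counter()
    requests.get(url, timeout=MAXIMUM_SECONDS + 5)
    return time.perf_counter() - start

if __name__ == "__main__":
    elapsed = measure(URL)
    if elapsed <= ACCEPTABLE_SECONDS:
        print(f"{elapsed:.2f}s -- within the acceptable response time")
    elif elapsed <= MAXIMUM_SECONDS:
        print(f"{elapsed:.2f}s -- acceptable time exceeded, still under the maximum")
    else:
        print(f"{elapsed:.2f}s -- exceeds the maximum response time")
```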
Architecture
1. Has the database capacity been identified?
2. Has anticipated growth data been obtained?
3. Is the database self-contained?
4. Is the system architecture defined?
" Tiers
" Servers
" Network
5. Has the anticipated volume for initial test been defined - with allowance for future growth?
6. Has a plan for vertical growth been identified?
7. Have the various environments been created?
8. Has historical experience with the databases and equipment been documented?
9. Has the current system diagram been developed?
10. Is load balancing available?
11. Have the types of programming languages been identified?
12. Can back-end processes be accessed?
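Item 10 (load balancing) can sometimes be observed from the outside by repeating a request and recording which back end answers. The "X-Served-By" header in this sketch is hypothetical; substitute whatever identifier, if any, your infrastructure actually exposes.

```python
# Observe load balancing by repeating a request and counting which
# back-end server answers. "X-Served-By" is a hypothetical header.
import collections
import requests

URL = "https://www.example.com/"   # placeholder site under test
SERVER_HEADER = "X-Served-By"      # hypothetical back-end identifier header

if __name__ == "__main__":
    seen = collections.Counter()
    for _ in range(20):
        response = requests.get(URL, timeout=10)
        seen[response.headers.get(SERVER_HEADER, "unknown")] += 1
    for server, count in seen.items():
        print(f"{server}: {count} responses")
```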