Golden Master (GM)

A physical CD that contains the approved Final Candidate that will be distributed to customers.

Phase: Release

Purpose: The Golden Master is required before a First Article can be created; its ultimate purpose is to ship the software to customers.

Entry Criteria: QA should receive the Final Candidate burned onto a CD.


Activities
In addition to all the testing activities performed during the FC phase, the following additional activities are carried out:
• Test the CD for viruses.
• Test the CD for block/sector quality.
• Verify the CD structure.
• Verify that all files are installed in the proper locations.
• Verify that all files appear with the correct extensions and icons on their respective platforms.


Final Candidate

Final Candidate certification by QA means that the software is usable by customers, under the normal workflow environment, for the features described in the Requirements.

Phase: Final Candidate

Purpose: Ship the software to the customers.

Entry Criteria
• All Showstopper, Alpha, Beta and FC defects thrashed up to the specified number of days prior to the publication of the build must be fixed.
• All defects fixed in the prior build of the FC candidate should have appropriate action taken on them.
• All pre-release customer-reported defects thrashed up to the specified number of days prior to the publication of the build must be fixed.
• The user documentation, except for known issues and issues resolved, must be reviewed and signed off by Legal, PM, R&D and QA.
• Configuration Documents are defined and reviewed by the stakeholders prior to the publication of the build.
• All test plans and DITs must be updated on iport with all the execution results.
• QA must be able to replicate the documented customer workflow on the candidate build.
• All 3rd party licenses are reviewed and approved by Legal.
• QA has verified that all legal requirements have been met.
• Product has final icons, splash screens and other artwork.
• All localization and UI defects should be fixed and localizable resources frozen in the weekly build prior to the FC build.
• The MTBF (Mean Time Between Failures) of the application should be more than the specified number of hours (see the worked example after this list).
• The Final Candidate build delivered to QA should be installable by the final product installation process from CD.
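
As a point of reference (illustrative numbers only, not taken from any project data): MTBF is the total operating time divided by the number of failures observed during that time, so an application that runs for 200 hours of testing and fails 4 times has an MTBF of 200 / 4 = 50 hours.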

Activities
• Acceptance according to the following criteria: successful execution of Build Acceptance tests. Acceptance results shall be published within 1 working day after receiving the Final Candidate build.
• Regress all OpenForQA bugs fixed for the Accepted Final Candidate build after publishing the Acceptance test results.
• Regress all Fixed/NLI defects.
• Execute all the identified workflows.
• Execute one round of structured tests and update the results on iport.
• QA must execute Shortcut Key tests on the candidate build.
• Localized testing must be done on the candidate build.
• Performance testing is to be done on candidate build.

Exit Criteria
• The Activities described above are completed.
• If any new defect is logged, or an existing defect is found to recur, it needs to be approved by the Triage team with justification, or the build cannot be certified as GM.
• Documented Training Tutorials are complete.
• All deliverables, such as the Sales Kit, Printed Manuals, Boxes, BOM, Mercury Setup, QLA and XDK, should be ready for distribution.
• Older validation codes do not work, and the installer has no built-in time bomb.
• The build is published on the product server.


First Article

A physical CD that contains the approved GM that will be distributed to customers.

Phase: Release

Purpose: Ship software to Customers

Entry Criteria
• GM is approved.
• QA requests SCM/WPSI to coordinate the First Article process.

Activities
• Check the Rimage text.
• Perform Installer testing.
• Test the CD for viruses.
• Test the CD for block/sector quality.
• Verify the CD structure.
• Verify that all files are installed in the proper locations.
• Verify that all files appear with the correct extensions and icons on their respective platforms.
• Check for correct splash screen.
• Perform Acceptance testing.


Exit Criteria
• The above activities are completed.
• First Article approved as identical to GM.


EggPlant

EggPlant is a testing tool used to automate user actions on the Graphical User Interface (GUI) of Macintosh-based applications. It interacts with the UI by identifying images of the objects (push button, checkbox, text field, etc.) in a dialog.

EggPlant runs on a Mac OS system called the “Client”, while the Application Under Test (AUT) runs on a separate computer called the “System Under Test” (SUT). EggPlant interacts with the SUT over a network, using a Virtual Network Computing (VNC) server running on the SUT.

EggPlant uses an object-oriented scripting language called SenseTalk. Basic concepts in SenseTalk include values, containers, expressions and control structures. A short illustrative snippet follows.
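
A minimal SenseTalk sketch illustrating these concepts; the image names used below ("FileIcon", "DocumentWindow") are hypothetical and would normally come from the suite's image collection:

    put "ReadMe.txt" into fileName       -- a value stored in a container (variable)
    if fileName ends with ".txt" then    -- an expression inside a control structure
        DoubleClick "FileIcon"           -- find the image "FileIcon" on the SUT's screen and double-click it
        WaitFor 10, "DocumentWindow"     -- wait up to 10 seconds for the window image to appear
        TypeText "Hello from SenseTalk"  -- send keystrokes to the SUT
    else
        LogError "Unexpected file name: " & fileName
    end if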

Automated Testing Methodology in EggPlant
Most of the EggPlant activity takes place in four types of windows:
Remote Screen Window - The Remote Screen appears in an EggPlant window on the client Mac. The user can see and interact with software running on the SUT via the Remote Screen.
Script Editor Windows - The Script Editor lets the user create and edit scripts. Scripts test the software by simulating user actions with a list of scripted steps.
Run Window - The Run window shows varying levels of detail about the currently running script or scripts.
Suite Editor Windows - A Suite is a collection of testing elements. With the Suite Editor, the user can manage these elements, including scripts, images, results, schedules and helpers.

Framework
The commonly used framework comprises the following files:

Frame – Common library containing commonly used functions and images. It also contains the recovery system used by EggPlant.

To learn about automation scripts, please see Automation Scripts


Engineering Complete

Engineering Complete (EC) of a component ensures that each requirement of the component is demonstrable with at least one input.
Pre-EC is the phase in which some of the features within a component have reached EC status.

Phase: Engineering

Purpose: Implement all features as specified in Software Requirements Specification.

Entry Criteria
• The requirements completely and accurately define the functionality that is intended to be developed.
• The code-level design of the feature is a complete and accurate reflection of the functionality defined in the requirements.
• The user interface of the software is an accurate and complete representation of what is specified in the user interface specification (UIS).
• The functionality is implemented on all supported operating systems as well as in all flavors of the application.
• QA has developed the test plan as per the SRS.

Activities
• Once the entry criteria have been met, R&D should communicate to the team, at multiple stages in the development of the component, that requirements have reached an EC status.
• R&D ensures the requirements can be demonstrated at the GUI layer using an SCM-compiled build, thus ensuring there are no integration issues in the software.
• When declaring requirements have reached an EC status, R&D should communicate the following to the team:
• Build number used to evaluate requirements (daily or weekly build).
• Platform used (Mac or Win).
• Requirement numbers that were tested by R&D.
• Once EC is declared by R&D, an EC review meeting should be held involving all the stakeholders, in which the following activities will occur:
• Software is evaluated against each requirement listed in the SRS.
• Software architecture is evaluated against the design documents.
• Once the EC review meeting is completed, a component EC summary should be sent to the component team and the Managers/Directors of QA, PM and R&D.
• Test plans should be reviewed by R&D and PM.
• An initial Risk Analysis should be generated from the customer, developer and testing perspectives.


Exit Criteria
• The Activities described above are completed.
• QA and PM sign-off that there are no missing features.
• Test plans are approved by R&D and PM.
• Installers are provided for all future builds.


Defect Tracking

QA can start testing once SCM starts delivering builds, especially once the builds are delivered on a regular basis. During testing using the strategies described in the ‘Testing Methodology’ section, the tester may discover certain deviations from the intended behavior. Such a deviation is termed a defect. Hence a defect may be defined as “a deviation of a system or component from specified or expected behavior”.

After encountering what appears to be a defect, the first step is to replicate it.

• Repeat the steps leading to the problem. Also try quitting and re-launching, or rebooting, and then repeating the steps.
• Always start from a known state (e.g. launch the application).
• After you have a repeatable sequence of steps that replicate the problem, try to narrow it down.
• Try to eliminate steps that are not required to reproduce the problem.
• Eliminate dependencies on input methods (e.g. keyboard shortcuts vs. mouse actions).
• Replicate it on a second machine.
• Determine if it is platform dependent or a cross platform issue.
• Determine if this is a general problem or specific to this feature.
• Determine if this problem is file dependent or can be replicated in a new file.
If it is not obvious whether it is a bug (ambiguous feature behavior), as opposed to a system failure or a graphics output problem, check the Functional Requirements or Use Cases. If the behavior is still unclear, talk to the QA Manager or Supervisor, other team members, or the developer responsible for the feature.
It is also helpful to determine when the bug was introduced. This can help the developers determine if recent changes made in an associated area created a new bug.


Defect Thrashing Procedure

By the term “Defect Thrashing”, we mean assigning a priority to a defect. Defect thrashing is done by a team of individuals: representatives from R&D, PM and QA prioritize defects on a regular basis depending on the development phase. Product Management always has the authority to override the priority.

Setting the priority of a defect is an automated process. The QA Engineer responsible for the defect thrashing has to answer the following questions:

• What is the impact on customer workflow?
• Does this Bug stop us from Testing?
• What is the probability of the defect occurring in the customer’s workflow?
• What is the origin of the defect?
• What is the impact of the defect on other related modules?


Depending upon the answers provided by the QA Engineer to the above questions, the system automatically assigns a priority to the defect.


Defect Logging

When the defect has been isolated:
• See if it is known. Check the bug database to see if the bug you have found has already been reported.
• If the defect you have found is a duplicate, but the one in the database has a status of Closed, reopen it and enter a comment in the history.
• If the bug you have just found is similar, but not identical, to a bug already in the database, the existing report may need to be modified. Add your comments to that bug.
• If it isn't a duplicate, write it up after isolating it.

Writing up Problem Reports
In general, the Problem Title and Steps to Recreate should be very specific and should not contain editorial comments or opinion. With the exception of Enhancements, these fields should describe “what you did”, “what object you did it to” and “what happened”. The object needs to be referred to by its real name, and correct OS terminology must be used as well. Using the correct terminology will help others find your bug and reduce the number of duplicate bugs.

Given below is the format for logging a clear and concise defect:

/*****************************************************/
Problem Title:

Product name:
Build Tested:
Origin:

O.S. Tested:
O.S. Affected:

Tested in previous version of the product (if applicable):
Affected:

Browser Tested: <>
Browser Affected: <>
(Applicable only to web-related defects)
_______________________________________________________________________
Steps to Recreate:
1.
2.
3.


Result:
Expected Result:

What works vs what does not:

Note:
/*****************************************************/
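
As an illustration only, here is a hypothetical report following this format (all details are invented for the example):

Problem Title: Application hangs when saving a document whose name is longer than 31 characters

Steps to Recreate:
1. Launch the application and create a new document.
2. Choose File > Save.
3. Enter a file name longer than 31 characters and click Save.

Result: The application stops responding and must be force-quit.
Expected Result: The document is saved, or the name is rejected with an appropriate error message.

What works vs what does not: Saving with names of 31 characters or fewer works; the hang occurs only with longer names.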

Description of the main fields is given below:

Problem Title:

This is a one-sentence summary that describes the bug. The summary should be concise and include any special circumstances or exceptions. The Problem Title should be accurate enough that someone associated with the project can understand, and even reproduce, the problem from the Problem Title field alone.

Steps to Recreate:

This is a sequence of steps that describe the problem so that anyone can replicate it. Descriptions should be as concise as possible and should really be no more than 10 steps. The result needs to be written down separately. The steps should describe only the incorrect behavior. There is a tendency to write an example of similar correct behavior first and then the incorrect behavior to help justify the bug; this only confuses and frustrates the developer.

Result:
Description of the incorrect behavior, including specifics such as file errors, stack crawls, asserts and user breaks.

Expected Result:
Description of what the specification defines or (if undefined) your expectations. If the expected behavior is at all in question, it probably needs to be escalated to management for definition.

What works vs what does not:
a) Should contain what works and what does not (mandatory for almost all defects; a few exceptional cases may exist).
b) Special notes describing the defect, which are helpful for R&D in fixing it (optional).
c) Related defect IDs (optional).

Note:
This is additional information that assists the developer in understanding the bug. This could be the version where the bug was introduced, things discovered while narrowing the bug down, circumstances under which the bug doesn't occur, an example of the correct behavior elsewhere in the product, etc.

Proofread your bug reports and try to reproduce the problems following the steps exactly as written.

See Defect Thrashing Procedure and Defect Tracking for more information


Concurrent Versions System (CVS)

• CVS is a version control system used by individual programmers as well as large teams.
• It allows developers to access their code from anywhere with an Internet connection.
• It maintains a history of the changes made to all files in the project directory.
• It gives users an automatic backup of all their work and the ability to roll back to previous versions if need be.
• CVS runs on a central server that all team members can access. Team members simply check out code, and then check it back in when done.

The process of updating a file to the database consists of three steps (a command-line sketch follows the list):
• Get current version of file from database
• Merge any changes between the database version and the local version
• Commit the file back to the database
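
A sketch of these steps using the CVS command line; the file name is hypothetical, and the first two steps are both handled by a single cvs update:

    cvs update myfile.c                            # fetch the current repository version and merge its changes into the local copy
    cvs diff myfile.c                              # optionally review the merged result before committing
    cvs commit -m "Describe the change" myfile.c   # commit the file back to the repository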


Two pieces of software are required to set up CVS:
• Server
• Client
The server handles the database end and the client handles the local side.

Tracking Changes
The standard tool used for tracking changes is “diff”.

Authentication
The user needs to authenticate to the CVS server. The following information needs to be specified:
• CVS server
• Directory on that server which has the CVS files
• Username
• Authentication method
The combination of all of these is sometimes called the “CVSROOT”, as illustrated below.
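
For example, with the pserver authentication method these pieces combine into a single CVSROOT string (the username, server name and repository path below are hypothetical):

    :pserver:jdoe@cvs.example.com:/cvsroot/project

which can then be used to log in and check out a module:

    cvs -d :pserver:jdoe@cvs.example.com:/cvsroot/project login
    cvs -d :pserver:jdoe@cvs.example.com:/cvsroot/project checkout mymodule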


Communicating with various Departments

Effective communication with various departments is very important for developing the weekly testing strategy. Below are some key points that can be discussed on a weekly basis with the following stakeholders.

Research and Development (R&D)
• Effective bug reports.
• Tips for Regression during bug fixing.
• Complex Areas of the code.
• Affected areas after Code Optimization.
• Implementation schedule of features.
• Components with maximum code changes.
• Unit Testing of implemented features.
• Feedback on Test planning.
• Components with high defect rejection.


Product Management (PM)
• Use cases for the features.
• Customer Workflow and frequency of usage.
• Feedback on the Quality of the component.
• Feedback on Usability of the components.
• Feedback on Test planning.


Technical Support
• Type and frequency of queries.
• Issues reported and workarounds given for the components.
• Overall feedback for a Product.
• Integration of a particular feature with other features.
• Expectations of the customer from our Products.
• Third Party XTensions used by the customers.


Customers

Email Communication:

Do’s
• Always start the email with a friendly salutation.
• Run a spell check on the emails before sending them to the customer.
• Re-read the email before sending it to ensure that the email conveys the right message and does not sound too cold or rude.
• Include confidentiality notice in your e-mails.
• Keep the MDA in CC and the customer in the TO list while replying.
• If the customer is not responding, write to the respective MDA.


Don’ts
• Avoid writing emails with unclear subject lines.
• If unsure about the customer’s gender, refrain from using Mr/Mrs.
• Don’t use fancy fonts and background images in the email.
• Don’t forward internal mails/communication to the customer.
• Don’t commit anything regarding bug fixes and feature enhancements.
• Don’t use abbreviations which are not commonly understood.
• Don’t write emails in upper case.


Phone Conversations:

Do’s
• Speak in a slow and clear tone.
• Greet the customer when starting a phone conversation or closing it.


Don’ts
• Don’t interrupt the customer mid-conversation.
• Don’t show anger or resentment in your tone.


Beta Certification

Beta certification by QA means that the software is usable by customers, under normal and high-probability workflow environments, for the features described in the Requirements.

Phase: Beta

Purpose: Present product to external customers for environmental testing including use of intended configuration, hardware, workflow, network etc.

Entry Criteria:
• All Showstopper, Alpha and Beta defects thrashed up to the specified number of days prior to the publication of the build must be fixed.
• Defects that depend on external implementation must have possible fix dates (before the FC date). All exceptions are in escalation with senior management.
• All defects fixed in the prior build of the Beta candidate should have appropriate action taken on them.
• All pre-release customer-reported defects (Showstopper, Alpha and Beta) thrashed up to the specified number of days prior to the publication of the build must be fixed.
• All the Gray Area tasks identified by R&D prior to the Alpha build must be completed before the Beta build is delivered to QA.
• Results of Performance tests are available and all Showstopper, Alpha and Beta defects have been resolved.
• QA must complete one round of site visits for predefined customers.
• QA must be able to replicate the documented customer workflow on the beta build.
• All exceptions must be approved by competent authority.
• UI specification is complete and signed off. Pixel perfect reviews are finished and all identified issues are resolved.
• Customer pre-release documentation (What to test, Known Issues, New features list, test Documents) is complete.
• The help files must be part of the Beta candidate build and must be launched from the application.
• All other documentation is reviewed, and final drafts must be installed by the installers. The structure of the installed build must be as per the configuration document reviewed by the stakeholders.
• All test plans and DITs are reviewed and approved by R&D and PM, and the testware is updated on the shared location.
• All functional requirements from PM and design doc from R&D should be up-to-date and approved.
• The MTBF (Mean Time Between Failures) of the application should be more than the specified number of hours.
• The UI implementation is complete in all respects.
• All localization changes (Application and XTensions) should have been completed.
• File format should be frozen, i.e., any file saved in the Beta build or later can be opened without data loss.

Activities
• Acceptance testing by QA according to the following criteria: successful execution of Build Acceptance tests, i.e., Smoke test and Manual test. Acceptance results shall be published within 1 working day after receiving the Beta build.
• Regress all OpenForQA bugs fixed for the Accepted Beta build after publishing the Acceptance test results.
• Regress all Fixed/NLI Showstopper, Alpha and Beta defects.
• Execute all the identified workflows.
• Execute one round of structured tests and update the results on the shared location.
• Gather, document and react to the feedback from external sources.

Exit Criteria:
• The above testing methods are completed.
• If any Showstopper, Alpha or Beta defect is rejected, it needs to be approved by PM as an exception.
• Beta build distributed to testers, including external customers.


Automation Scripts

The Automation scripts are executed on Weekly/Daily Builds to perform Regression and Functionality testing.

Contents
File Summary: summary of the file – functionality name, creation date and scripter
Script Data Types and Variables
Script Functions: commonly used functions in the script
Positive Test scripts: all positive test cases according to the Test Case Document
Negative Test scripts: all negative test cases according to the Test Case Document
Test plan file: calls all the test cases defined in the script file
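
A minimal SenseTalk skeleton showing this layout; the function, image and document names are hypothetical:

    -- File Summary: SaveDocument tests | Created 01-Jan-2006 | Scripter: A. Tester
    put "Untitled-1" into docName         -- script data

    to verifySaved docTitle               -- commonly used function
        WaitFor 10, "SavedTitleBar"       -- image of the title bar after a successful save
        Log "Document " & docTitle & " saved"
    end verifySaved

    -- Positive test: save under a valid name (per the Test Case Document)
    Click "FileMenu"
    Click "SaveItem"
    TypeText docName & return
    verifySaved docName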

Automation scripts also use pre-created Test Files which include:
• Legacy Documents
• Image files of different formats
• Different Font Files
• MS Word, Excel files etc.

Audience
The automated scripts are created for use by the QA team.

Review:

Purpose
To verify that the test cases are automated as per the pass/fail criteria specified in the Test Case Document.

Who should review the Automation Scripts?
Automation scripts for each feature in the application are to be reviewed by the primary person assigned for manual testing of that feature.

Automation Script Review Checklist
• Verify that all possible verifications listed in the pass/fail criteria are done through the script.
• Verify proper exception handling for all automated test cases.
• Verify that there is no redundant code and functions are used wherever necessary.
• Verify that proper error messages are displayed in the event of a test case failure.

To know about Apple Scripting, please visit Apple Scripting

To know about Eggplant, please visit Eggplant

To know about Silk Test, please refer to Silk Test Q&A

To know about WinRunner, please refer to WinRunner Q&A


Apple Scripting

AppleScript (AS) is a scripting language that allows users to directly control Macintosh applications, including the Mac OS itself. The AppleScript compiler is provided by Mac OS, and Script Editor is used to write and execute AppleScripts.

AppleScript allows users to create sets of written instructions, known as scripts, to automate repetitive tasks, customize applications and control complex workflows. A small example follows.
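
A minimal illustrative script (TextEdit is used here only as an example of a scriptable application):

    -- Create a new TextEdit document and insert some text
    tell application "TextEdit"
        activate
        make new document
        set text of document 1 to "Hello from AppleScript"
    end tell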

Application support for AppleScript is provided by the R&D team wherever required in the product. The role of QA is to test all those features in the application that have been made scriptable.

Framework
The commonly used framework for AppleScript is divided into two parts:
AS suite
Consists of the AppleScripts that provide the test case inputs and steps.
EggPlant suite
Consists of scripts written in EggPlant that verify, at the application’s UI, the job performed by the AS suite.

To know more about Automation Scripts, please refer to Automation Scripts


Alpha Certification

Alpha certification by QA means that the software is testable by customers, under a controlled environment, for the features described in the Requirements.

Phase: Alpha

Purpose: Present a fully functional product to customers for testing.

Entry Criteria
• All Showstopper and Alpha defects thrashed up to the specified number of days prior to the publication of the build must be fixed.
• Defects that depend on external implementation should have possible fix dates.
• All defects fixed in the prior build of the Alpha candidate should have appropriate action taken on them.
• UI Specification is complete and accepted by PM, QA and R&D.
• Customer pre-release documentation, such as What to Test, Known Issues and New Features, is complete.
• Help files should launch from within the application.
• Configuration Documents are defined and reviewed by stakeholders.
• All test plans reviewed and approved by R&D and PM. Final draft of DIT in progress or sent for review.
• UI implementation should be ready for pixel perfect review.
• All localization changes should have been completed.
• All workflows are defined, reviewed and accepted by stakeholders.

Activities
• Acceptance testing according to the following criteria: successful execution of Build Acceptance tests, i.e., Smoke test and Manual test.
• Acceptance results shall be published within 1 working day after receiving the Alpha build.
• Regress all OpenForQA bugs fixed for the Accepted Alpha build after publishing the Acceptance test results.
• Regress all Fixed/NLI Showstoppers and Alpha defects.
• Automated Focus results should not report any new Showstopper or Alpha defects, or result in the reopening of closed ones.
• QA shall execute structured tests on the Alpha candidate build, and the results will be updated on the shared location.
• Execute all the identified workflows.

Exit Criteria
• The Activities described above have been completed.
• If any Showstopper or Alpha defect is rejected, the build cannot be certified as Alpha. However, for a rejected defect of lower priority, the decision to certify the product shall rest with the Triage team.


Adhoc Testing

Adhoc testing can also be termed unstructured testing. It is a form of testing carried out using no recognized test case design technique.

Following are some of the characteristics of Adhoc Testing:

• It involves utilizing strange and random input to determine if the application reacts adversely to it.
• There is no structured testing involved and no test cases are run. It involves various permutations and combinations of different inputs that may affect the functionality of a component.
• It also involves integration of different functionalities which is outside the normal scope of structured testing.
• It proceeds in parallel with structured testing.
• It is done to ascertain that the product as a whole runs successfully under any circumstance.
• It is basically an approach to testing the product from the user’s perspective.
• It is unscripted, unrehearsed, and improvisational.


Acceptance Testing

Acceptance test cases are a subset of the structured test cases, designed to verify that the product can perform basic-level functionality.

Acceptance cases are categorized as R1 cases, where R1 stands for Rotation 1, i.e., test cases to be executed every week when the weekly build is published. These are the critical test cases executed on every weekly build; all Acceptance test cases fall into this category.

The following is the acceptance procedure followed by QA:

• QA receives the SCM-compiled build from R&D.
• QA conducts Acceptance testing on the day the weekly build is published, by executing all the R1 cases.
• If no R1 cases fail, the build is declared “Accepted”. QA will then test further on this Accepted build.
• If one or more R1 cases fail, the build is declared “Rejected”. This information is passed on to R&D and PM along with the failing test cases and their corresponding defect numbers. QA does not carry out any further testing on the rejected build and continues to work on the older build until a new build has passed the Acceptance criteria.
