Thursday, 20 December 2012

Windows Phone Test Checklist

While exploring mobile applications, I got the idea of maintaining a checklist for Android and Windows based applications. I am maintaining the checklist here, and I will update it as I come across more scenarios.

A. Verify Application Tile Images:
1.View the Application list.
2.Verify that the small mobile app tile image is representative of the application.
3.From the Application list, tap and hold the small mobile app tile of your application and select 'pin to start'.
4.Verify that the large mobile tile image on the Start screen is representative of the application.

B. Application Closure:
1.Launch your application.
2.Navigate throughout the application, and then close the application through the device's Back button.

C. Application Responsiveness:
1.Launch your application.
2.Thoroughly test the application features and functionality.
3.Verify that the application does not become unresponsive for more than three seconds.
4.Verify that a progress indicator is displayed if the application performs an operation that causes the device to appear to be unresponsive for more than three seconds.
5.If a progress indicator is displayed, verify that the application provides the user with an option to cancel the operation being performed.

D. Application Responsiveness After Being Closed:
1.Launch your application.
2.Close the application using the Back button, or by selecting the Exit function from the application menu.
3.Launch your application again.
4.Verify that the application launches normally within 5 seconds, and is responsive within 20 seconds of launching.

E. Application Responsiveness After Being Deactivated:
1.Launch your application.
2.De-activate the app. This can be achieved by pressing the "Start" button or by launching another app. (By deactivation we are not closing the app's process but are merely putting the app in the background.)
3.Re-activate the application and verify that it resumes normally within 5 seconds, and is responsive within 20 seconds.
4.If your application includes pause functionality, pause the application.
5.Launch your application again.
6.Verify that the application launches normally within 5 seconds, and is responsive within 20 seconds of launching.
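The launch and responsiveness limits in sections D and E can be checked with a small timing harness like the sketch below; `launch_app` and `wait_until_responsive` are hypothetical hooks standing in for whatever device automation your test setup actually provides:

```python
import time

LAUNCH_LIMIT_S = 5       # app must launch normally within 5 seconds
RESPONSIVE_LIMIT_S = 20  # and be responsive within 20 seconds of launching

def check_launch_timings(launch_app, wait_until_responsive):
    """Measure launch and responsiveness times against the limits above.

    Both arguments are assumed callables supplied by your automation
    harness; they are placeholders, not a real device API.
    """
    start = time.monotonic()
    launch_app()
    launched_in = time.monotonic() - start
    wait_until_responsive()
    responsive_in = time.monotonic() - start
    return launched_in <= LAUNCH_LIMIT_S and responsive_in <= RESPONSIVE_LIMIT_S

# Example with stub hooks standing in for real device automation:
print(check_launch_timings(lambda: time.sleep(0.1), lambda: time.sleep(0.2)))
```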

F. Back Button: Previous Pages:
1.Launch your application.
2.Navigate through the application.
3.Press the Back button.
4.Verify that the application closes the screen that is in focus and returns you to a previous page within the back stack.

G. Back Button: First Screen:
1.Launch your application.
2.Press the Back button.
3.Verify that either the application closes without error, or allows the user to confirm closing the application with a menu or dialog.

H. Back Button: Context Menus and Dialogs:
1.Launch your application.
2.Navigate through the application.
3.Display a context menu or dialog.
4.Tap the Back button.
5.Verify that the context menu or dialog closes and returns you to the screen where the context menu or dialog was opened.

I. Back Button: Games:
1.Launch your application.
2.Begin playing the game.
3.Tap the Back button.
4.Verify that the game pauses.

J. Trial Applications:
1.Launch the trial version of your application.
2.Launch the full version of your application.
3.Compare the performance of the trial and full versions of your application.
4.Verify that the performance of the trial version of your application meets the performance requirements described in the preceding test cases.

K. Verify that Application doesn't affect Phone Calls:
1.Ensure that the phone has a valid cellular connection.
2.Launch your application. Receive an incoming phone call.
3.Verify that the quality of the phone call is not negatively impacted by sounds or vibrations in your application.
4.End the phone call.
5.Verify that the application returns to the foreground and resumes.
6.De-activate the application by tapping the Start button.
7.Verify that you can successfully place a phone call.

L. Verify that Application doesn't affect SMS and MMS Messaging:
1.Ensure that the phone has a valid cellular connection.
2.Ensure that the phone is not in Airplane mode by viewing the phone Settings page.
3.Launch your application. Deactivate the application by tapping the Start button.
4.Verify that an SMS or MMS message can be sent to another phone.
5.Verify that notifications regarding the SMS or MMS messages are displayed on the phone either from within the application, or within 5 seconds after the application is closed.

M. Verify Application Responsiveness With Incoming Phone Calls and Messages:
1.Ensure that the phone has a valid cellular connection.
2.Ensure that the phone is not in Airplane mode by viewing the phone Settings page.
3.Launch your application, then receive an incoming phone call, SMS message, or MMS message.
4.Verify that the application does not stop responding or close unexpectedly when the notification is received.
5.After verifying the above step, tap on the message notification or receive the incoming phone call.
6.If a message was received, verify that the user can return to the application by pressing the Back button.

N. Language Validation:
1.Review the product description of the application and verify that it is localized to the target language.
2.Launch your application.
3.Verify that the UI text of the application is localized to the target language.

Please leave your comments so that I can refine this list.

Wednesday, 12 December 2012

Bug Life Cycle


A bug can be defined as abnormal behavior of the software. No software exists without bugs. The elimination of bugs from the software depends upon the efficiency of the testing done on it. A bug is a specific concern about the quality of the Application Under Test (AUT).

Bug Life Cycle:
In the software development process, a bug has a life cycle, and it should go through that life cycle to be closed. A specific life cycle ensures that the process is standardized. The bug attains different states in the life cycle, which can be shown diagrammatically as follows:

The different states of a bug can be summarized as follows:
1. New
2. Open
3. Assign
4. Test
5. Verified
6. Deferred
7. Reopened
8. Duplicate
9. Rejected and
10. Closed

Description of Various Stages:

1. New: When the bug is posted for the first time, its state will be “NEW”. This means that the bug is not yet approved.

2. Open: After a tester has posted a bug, the tester's lead approves that the bug is genuine and changes the state to “OPEN”.

3. Assign: Once the lead changes the state to “OPEN”, he assigns the bug to the corresponding developer or developer team. The state of the bug is now changed to “ASSIGN”.

4. Test: Once the developer fixes the bug, he has to assign it to the testing team for the next round of testing. Before releasing the software with the bug fixed, he changes the state of the bug to “TEST”. This specifies that the bug has been fixed and released to the testing team.

5. Deferred: A bug changed to the deferred state is expected to be fixed in a future release. There can be many reasons for moving a bug to this state: the priority of the bug may be low, there may be a lack of time before the release, or the bug may not have a major effect on the software.

6. Rejected: If the developer feels that the bug is not genuine, he rejects the bug. Then the state of the bug is changed to “REJECTED”.

7. Duplicate: If the bug is reported twice, or two bugs describe the same issue, then one bug's status is changed to “DUPLICATE”.

8. Verified: Once the bug is fixed and the status is changed to “TEST”, the tester tests the bug. If the bug is not present in the software, he approves that the bug is fixed and changes the status to “VERIFIED”.

9. Reopened: If the bug still exists even after the bug is fixed by the developer, the tester changes the status to “REOPENED”. The bug traverses the life cycle once again.

10. Closed: Once the bug is fixed, it is tested by the tester. If the tester feels that the bug no longer exists in the software, he changes the status of the bug to “CLOSED”. This state means that the bug is fixed, tested and approved.
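The allowed movements between these states can be encoded as a small lookup table. The exact transition set varies by bug tracker, so the edges below are one plausible reading of the stages described above, not a standard:

```python
# Allowed transitions between bug states (illustrative; trackers differ).
TRANSITIONS = {
    "NEW":      {"OPEN", "REJECTED", "DUPLICATE", "DEFERRED"},
    "OPEN":     {"ASSIGN"},
    "ASSIGN":   {"TEST", "DEFERRED", "REJECTED"},
    "TEST":     {"VERIFIED", "REOPENED"},
    "VERIFIED": {"CLOSED"},
    "REOPENED": {"ASSIGN"},   # the bug traverses the life cycle once again
    "DEFERRED": {"ASSIGN"},   # picked up again in a later release
}

def can_move(current, target):
    """Return True if a bug may move directly from `current` to `target`."""
    return target in TRANSITIONS.get(current, set())

print(can_move("TEST", "VERIFIED"))  # True
print(can_move("NEW", "CLOSED"))     # False: a bug cannot skip verification
```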

While defect prevention is much more effective and efficient in reducing the number of defects, most organizations conduct defect discovery and removal instead. Discovering and removing defects is an expensive and inefficient process; it is much more efficient for an organization to conduct activities that prevent defects.
Guidelines on deciding the Severity of Bug:

Indicate the impact each defect has on testing efforts, or on users and administrators of the application under test. This information is used by developers and management as the basis for assigning the priority of work on defects.

A sample guideline for assignment of Priority Levels during the product test phase includes:

1. Critical / Show Stopper — An item that prevents further testing of the product or function under test can be classified as a Critical bug. No workaround is possible for such bugs. Examples include a missing menu option or a missing security permission required to access a function under test.

2. Major / High — A defect that does not function as expected/designed, or causes other functionality to fail to meet requirements, can be classified as a Major bug. A workaround can be provided for such bugs. Examples include inaccurate calculations, the wrong field being updated, etc.

3. Average / Medium — Defects which do not conform to standards and conventions can be classified as Medium bugs. Easy workarounds exist to achieve functionality objectives. Examples include matching visual and text links which lead to different end points.
4. Minor / Low — Cosmetic defects which do not affect the functionality of the system can be classified as Minor bugs.

Thursday, 29 November 2012

API and API Testing


What is API?
An API (Application Programming Interface) is a collection of software functions and procedures, called API calls, that can be executed by other software applications.

What is API Testing?
API testing is mostly used for systems that have a collection of APIs that need to be tested. The system could be system software, application software, or libraries. API testing is different from other testing types because the GUI is rarely involved.

Even though the GUI is not involved, you still need to set up the initial environment, invoke the API with the required set of parameters, and finally analyze the result. Setting up the initial environment becomes complex precisely because there is no GUI; you need some other way to make sure the system is ready for testing. This setup can be divided into test environment setup and application setup. Things like configuring the database or starting the server belong to test environment setup, whereas creating an object before calling a non-static member of a class falls under application-specific setup.

The initial conditions in API testing also include creating the conditions under which the API will be called: an API can be called directly, or it can be called because of some event or in response to some exception.

Test Cases for API Testing:
The test cases for API testing are based on the output of the API.

Return value based on input condition
Relatively simple to test, as the input can be defined and the results validated. Example: it is very easy to write test cases for an API like int add(int a, int b). You can pass different combinations of a and b and validate the results against known values.
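A sketch of that idea in Python, where the `add` function is a stand-in for the real API under test:

```python
def add(a, b):
    # stand-in for the int add(int a, int b) API under test
    return a + b

# input combinations paired with known results, including boundary values
known_results = [(2, 3, 5), (-1, 1, 0), (0, 0, 0), (2**31 - 1, 1, 2**31)]
for a, b, expected in known_results:
    assert add(a, b) == expected
print("all return-value cases passed")
```

In a language with fixed-width integers, the last case would probe overflow behavior at the int boundary; in Python it simply passes.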

Does not return anything
The behavior of the API on the system has to be checked when there is no return value.
Example: a test case for a delete(ListElement) function will probably need to validate the size of the list, or the absence of the element in the list.
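A minimal sketch of validating a void API through its side effects; the `delete` function here is hypothetical, standing in for the real one:

```python
def delete(lst, element):
    # hypothetical API under test: removes the element in place, returns nothing
    lst.remove(element)

items = ["a", "b", "c"]
delete(items, "b")

# no return value to check, so validate the effect on the list instead:
assert "b" not in items   # absence of the element
assert len(items) == 2    # size of the list
print(items)              # ['a', 'c']
```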

Trigger some other API/event/interrupt
If the output of an API triggers some event or raises some interrupt, then those events and interrupt listeners should be tracked. The test suite should call the appropriate API, and assertions should be made on the interrupts and listeners.

Update data structure
This category is similar to APIs that do not return anything. Updating a data structure will have some effect on the system, and that effect should be validated.

Modify certain resources
If an API call modifies some resources, for example updates a database, changes the registry, or kills some processes, then it should be validated by accessing the respective resources.
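As an illustration, assuming the API writes to a SQLite database, the test can validate the call by querying the resource directly afterwards. The `add_user` function is a hypothetical API under test:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB standing in for the real resource
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

def add_user(name):
    # hypothetical API under test: modifies the database, returns nothing
    conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    conn.commit()

add_user("alice")

# validate by accessing the modified resource, not the API's return value
row = conn.execute("SELECT name FROM users WHERE name = ?", ("alice",)).fetchone()
assert row == ("alice",)
print("resource modification verified")
```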


API Testing vs. Unit Testing: What’s the difference?
1. API testing is not unit testing. Unit testing is owned by the development team, while API testing is owned by the QE team. API testing is mostly black box testing, whereas unit testing is essentially white box testing.

2. Both API testing and unit testing target the code level, hence similar tools can be used for both activities. There are several open source tools available for API testing, among them WebInject, JUnit, XMLUnit, HttpUnit, and Ant.

3. API testing process involves testing the methods of .NET, JAVA, J2EE APIs for any valid, invalid, and inappropriate inputs, and also testing the APIs on Application servers.

4. Unit testing activity is owned by the development team; developers are expected to build unit tests for each of their code modules (typically classes, functions, stored procedures, or some other 'atomic' unit of code) and to ensure that each module passes its unit tests before the code is included in a build. API testing, on the other hand, is owned by the QE team, staff other than the authors of the code. API tests are often run after the build is ready, and it is common that the authors of the tests do not have access to the source code; they essentially create black box tests against an API rather than the traditional GUI.

5. Another key difference between API and unit testing lies in test case design. Unit tests are typically designed to verify that each unit in isolation performs as it should; their scope often does not consider the system-level interactions of the various units. API tests, on the other hand, are designed to cover the 'full' functionality of the system, as it will be used by end users. This means that API tests must be far more extensive than unit tests, and take into consideration the sorts of 'scenarios' the API will be used for, which typically involve interactions between several different modules within the application.

API Testing Approach
An approach to test the Product that contains an API.

Step I:
Understand that API Testing is a testing activity that requires some coding and is usually beyond the scope of what developers are expected to do. Testing team should own this activity.

Step II:
Traditional testing techniques such as equivalence classes and boundary analysis are also applicable to API Testing, so even if you are not too comfortable with coding, you can still design good API tests.

Step III:
It is almost impossible to test all the scenarios in which your API may be used. Hence, focus on the most likely scenarios, and also apply techniques like Soap Opera Testing and Forced Error Testing, using different data types and sizes, to maximize test coverage. The main challenges of API testing can be divided into the following categories:
• Parameter Selection
• Parameter combination
• Call sequencing
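The parameter selection and combination challenges can be made concrete with a small sketch: pick representative values for each parameter (by equivalence classes and boundary analysis), then enumerate the combinations. The parameter names and values here are illustrative only:

```python
import itertools

# representative values per parameter, chosen by equivalence classes
# and boundary analysis (names and values are illustrative)
widths  = [0, 1, 1024]
heights = [0, 1, 768]
formats = ["png", "jpg"]

combos = list(itertools.product(widths, heights, formats))
print(len(combos))  # 3 * 3 * 2 = 18 combinations to exercise
for width, height, fmt in combos:
    pass  # call the API under test with (width, height, fmt) and validate
```

Even three small parameters already produce 18 calls, which is why parameter combination, and then call sequencing on top of it, dominates the cost of API test design.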

API Framework
The framework is more or less self-explanatory. The purpose of the config file is to hold all the configurable components and their values for a particular test run. Accordingly, the automated test cases should be represented in a 'parse-able' format in the config file, and the script should be highly 'configurable'. In API testing it is not necessary to test every API in every test run (the number of APIs tested will lessen as testing progresses), so the config file should have sections detailing which APIs are "activated" for the particular run. Based on this, the test cases should be picked up.

Since inserting the automation test case parameters into the config file can be a tedious activity, it should be designed in such a way that the test cases can be left static, with a mechanism for 'activating' and 'deactivating' them.
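One way to realize such a config file is sketched below. The JSON layout, section names, and "active" flag are assumptions for illustration, not a prescribed format:

```python
import json

# A minimal config in the spirit described above: each API section lists
# its test cases plus an "active" flag, so cases can stay static in the
# file while whole sections are switched on and off per run.
CONFIG = json.loads("""
{
  "login_api":  {"active": true,  "cases": ["valid_user", "bad_password"]},
  "report_api": {"active": false, "cases": ["monthly_totals"]}
}
""")

def cases_for_run(config):
    # pick up only the test cases whose API section is activated
    return [c for api in config.values() if api["active"] for c in api["cases"]]

print(cases_for_run(CONFIG))  # ['valid_user', 'bad_password']
```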

Definitions:

Soap Opera Testing:
Soap opera tests exaggerate and complicate scenarios in the way that television soap operas exaggerate and complicate real life.

Forced Error Testing:
Forced error testing is essentially mutation testing: the process of inducing errors or changes in the application to find out how the application behaves. The forced-error test (FET) consists of negative test cases that are designed to force a program into error conditions. A list of all error messages that the program issues should be generated and used as a baseline for developing test cases.
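A tiny sketch of one forced-error case in Python; the `divide` function and its message are stand-ins for the program under test and its baseline list of error messages:

```python
def divide(a, b):
    # stand-in for the program under test
    if b == 0:
        raise ValueError("Division by zero is not allowed")
    return a / b

# baseline of expected error messages, as described above (illustrative)
EXPECTED_ERRORS = ["Division by zero is not allowed"]

def forced_error_case():
    # deliberately force the error condition and check the message
    try:
        divide(1, 0)
    except ValueError as e:
        return str(e) in EXPECTED_ERRORS
    return False  # no error raised: the negative test fails

print(forced_error_case())  # True
```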

Software Functions and Procedures:
Functions and procedures are the foundations of programming. They provide the structure to organize the program into logical units that can manage the various activities needed for a program.

Functions
There are two basic types of functions:

Built-in: these are built into the programming environment and do things such as opening and closing files, printing, writing, and converting variables (e.g., text to numbers, singles to integers, etc.).

Application/user-specific: depending on what the program needs, you can build functions and procedures using built-in functions, procedures, and variables.

Procedures
Procedures are used to perform a series of tasks. They usually include other procedures and functions within the program. Procedures typically do not return a value; they are simply executed and return control to the calling procedure or subroutine. Procedures in Visual Basic are called "Subroutines," often "Sub" for short. In JavaScript, "Functions" are used as procedures (they simply return no or null values to whatever called them).

Source: www.scribd.com/doc/9808382/Introduction-to-API-Testing

Friday, 26 October 2012

iPhone App Test Cases


1. Installation
Test Case: Verify that the application can be installed successfully.
Expected Result: The application should install successfully.

2. Uninstall
Test Case: Verify that the application can be uninstalled successfully.
Expected Result: The user should be able to uninstall the application successfully.

3. Network Test Cases
(i) Verify the behavior of the application when there is a network problem and the user is performing operations that need a data call.
Expected Result: The user should get a proper error message, such as "Network error. Please try after some time."
(ii) Verify that the user is able to establish a data call when the network is back in action.
Expected Result: The user should be able to establish a data call when the network is back in action.

4. Voice Call Handling
(i) Call Accept: Verify that the user can accept a voice call while the application is running, and can resume the application from the same point.
Expected Result: The user should be able to accept a voice call while the application is running and resume the application from the same point.
(ii) Call Rejection: Verify that the user can reject a voice call while the application is running, and can resume the application from the same point.
Expected Result: The user should be able to reject a voice call while the application is running and resume the application from the same point.
(iii) Call Establish: Verify that the user can establish a voice call while an application data call is running in the background.
Expected Result: The user should be able to establish a voice call while an application data call is running in the background.

5. SMS Handling
(i) Verify that the user gets an SMS alert when the application is running.
Expected Result: The user should get an SMS alert when the application is running.
(ii) Verify that the user can resume from the same point after reading the SMS.
Expected Result: The user should be able to resume from the same point after reading the SMS.

6. Unmapped Keys
Test Case: Verify that unmapped keys do not work on any screen of the application.
Expected Result: Unmapped keys should not work on any screen of the application.

7. Application Logo
Test Case: Verify that the application logo with the application name is present in the application manager, and that the user can select it.
Expected Result: The application logo with the application name should be present in the application manager, and the user should be able to select it.

8. Splash
(i) Verify that a splash screen is displayed when the user selects the application logo in the application manager.
Expected Result: The splash screen should be displayed when the user selects the application logo in the application manager.
(ii) Verify that the splash screen does not remain for more than 3 seconds.
Expected Result: The splash screen should not remain for more than 3 seconds.

9. Low Memory
Test Case: Verify that the application displays a proper error message when device memory is low, and exits gracefully from the situation.
Expected Result: The application should display a proper error message when device memory is low, and exit gracefully.

10. Clear Key
Test Case: Verify that the Clear key navigates the user to the previous screen.
Expected Result: The Clear key should navigate the user to the previous screen.

11. End Key
Test Case: Verify that the End key navigates the user to the native OEM screen.
Expected Result: The End key should navigate the user to the native OEM screen.

12. Visual Feedback
Test Case: Verify that there is visual feedback when the response to any action takes more than 3 seconds.
Expected Result: Visual feedback should be given when the response time for any action is more than 3 seconds.

13. Continual Keypad Entry
Test Case: Verify that continual keypad entry does not cause any problem.
Expected Result: Continual keypad entry should not cause any problem in the application.

14. Exit Application
Test Case: Verify that the user is able to exit the application from any point, with every form of exit mode: flap, slider, End key, or the Exit option in the application.
Expected Result: The user should be able to exit from any point, with every form of exit mode: flap, slider, End key, or the Exit option in the application.

15. Charger Effect
Test Case: Verify that, while the application is running, inserting and removing the charger does not cause any problem, and that a proper message is displayed when the charger is inserted into the device.
Expected Result: While the application is running, inserting and removing the charger should not cause any problem, and a proper message should be displayed when the charger is inserted into the device.

16. Low Battery
Test Case: Verify that a proper message is displayed to the user when the application is running and the battery is low.
Expected Result: When the application is running and the battery is low, a proper message should be displayed telling the user that the battery is low.

17. Removal of Battery
Test Case: Verify that removing the battery while an application data call is in progress does not cause an interruption, and that the data call is completed after the battery is inserted back into the device.
Expected Result: Removing the battery while an application data call is in progress should not cause an interruption, and the data call should be completed after the battery is inserted back into the device.

18. Battery Consumption
Test Case: Verify that the application does not consume battery excessively.
Expected Result: The application should not consume battery excessively.

19. Application Start/Restart
Test Case: 1. Find the application icon and select it. 2. Press a button on the device to launch the app. 3. Observe that the application launches within the defined timeline.
Expected Result: The application must not take more than 25 seconds to start.

20. Application Side Effects
Test Case: Make sure that your application does not hamper other applications on the device.
Expected Result: The installed application should not hamper other applications on the device.

21. External Incoming Communication – Infrared
Test Case: The application should gracefully handle an incoming communication made via infrared. [Send a file using infrared (if applicable) to the device while the application is running.]
Expected Result: When the incoming communication reaches the device, the application must respect at least one of the following: a) go into a paused state; after the user exits the communication, the application presents the user with a continue option or continues automatically from the point where it was suspended; b) give a visual or audible notification. The application must not crash or hang.