Thursday 14 August 2014

Estimating effort for test automation



Once we identify which tests need to be automated (see our previous blog on choosing the right tests for automation), the first thing we usually do before initiating automation is estimating the effort involved.

Automation is a time-consuming initiative, sometimes running for several months or even years. It is important to use the right estimation approach to ensure that the automation deliverables are in line with the product development roadmap. We don't wish to promise delivery of scripts in one month when they actually need two team-months, and at the same time we don't wish to allocate four months for a two-month activity, making automation even more expensive.

Let us walk through the steps involved in arriving at an optimal effort estimate for automation.

1. Identify the size & complexity of tests
2. Categorize tests by size & complexity
3. Sampling of tests for individual effort
4. Average automation effort with resource experience levels
5. Effort estimation with team composition

This estimation method assumes the automation framework is already in place. Please read our blog on choosing the right automation framework if you have not done so already. I will cover effort estimation for framework development in the next blog.

1.      Identify the size & complexity of tests

The test cases shortlisted for automation need to be categorized into Simple, Medium and Complex based on their size and complexity.

When it comes to size, the tests can be categorized based on the number of test steps in the test case. Tests with fewer than 10 steps can be considered Simple, tests with 10-20 steps can be categorized as Medium, and tests with more than 20 steps can be marked as Complex. Additional categories can be added based on the nature of the application and the type of tests performed.

An alternate way to categorize the tests is by complexity. Tests with plain web/screen interactions navigating through 2-3 pages can be marked as Simple. Tests that navigate 5-6 pages and involve simple database querying can be considered Medium. Tests that navigate more than 6 pages or include multiple database validations can be considered Complex. Tests that involve multiple systems, such as switching between two or more applications as part of the same test, or tests that combine web and web-service transactions, can also be considered Complex.
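
The size-based rule above can be captured in a small helper. This is only a minimal sketch, and the class, enum and method names are my own:

public class TestCategorizer
{
    public enum Complexity { SIMPLE, MEDIUM, COMPLEX }

    // Size-based rule from this section: fewer than 10 steps -> Simple,
    // 10-20 steps -> Medium, more than 20 steps -> Complex
    public static Complexity bySize(int stepCount)
    {
        if (stepCount < 10)
        {
            return Complexity.SIMPLE;
        }
        if (stepCount <= 20)
        {
            return Complexity.MEDIUM;
        }
        return Complexity.COMPLEX;
    }
}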

2.      Categorize tests by size & complexity

Once each test case is marked as Simple, Medium or Complex based on its size and complexity, identify the total number of test cases under each category for each application / product.

Let us assume we arrive at the following numbers after the study in the above step.

Application   | Total Cases | Simple | Medium | Complex
Application 1 | 200         | 60     | 100    | 40
Application 2 | 200         | 50     | 120    | 30
Application 3 | 200         | 30     | 110    | 60

3.      Sampling of tests for individual effort

After categorizing the test cases, we need to understand the effort required to automate a Simple, a Medium and a Complex test for the given application. There are a few industry-standard test-case-level estimates, such as 1 hour for a Simple test, 2 hours for a Medium test and 4 hours for a Complex test, but the actual effort varies from product to product. Sampling the tests helps in coming up with more accurate estimates.

For the sampling, we pick a few cases (usually 5 cases or 2% of the category, whichever is higher) from each category for the given application and automate them using the framework to understand the average effort.
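
As a quick, hedged illustration of this sampling rule (the class and method names are my own):

public class SampleSize
{
    // Sampling rule from this section: 5 cases or 2% of the category, whichever is higher
    public static int forCategory(int testsInCategory)
    {
        int twoPercent = (int) Math.ceil(testsInCategory * 0.02);
        return Math.max(5, twoPercent);
    }

    public static void main(String[] args)
    {
        System.out.println(forCategory(100)); // 5 (2% of 100 is 2, so the minimum of 5 applies)
        System.out.println(forCategory(400)); // 8 (2% of 400 is 8)
    }
}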

4.      Average automation effort with resource experience levels

The productivity of automation resources varies with their experience level. Estimating the effort of automating a test separately for resources of different experience levels gives more accurate estimates, as automation teams consist of both junior (<= 3 years of experience) and senior (> 3 years of experience) resources.

Once the sample cases are identified in the above step, we need to have them automated by both a junior and a senior resource in the team / organization, and document the average effort for each category of tests. If a junior resource automates 10 simple cases in 15 hours, the average is 1.5 hours per test. A senior resource might automate the same 10 cases in 10 hours, an average of 1 hour per test. We need to identify the average effort for Medium and Complex test cases as well and record them in a simple table like the one below.

Resource        | Average Effort (Hrs) – Simple Test | Average Effort (Hrs) – Medium Test | Average Effort (Hrs) – Complex Test
Junior Engineer | 1.5                                | 3                                  | 7
Senior Engineer | 1                                  | 2                                  | 4


5.      Effort estimation with team composition

In the above step we calculated the effort of automating tests of different complexity by a junior and a senior resource. But the automation team might consist of multiple resources at each experience level, and the average effort per test varies with the team composition. The average effort per test for a team of 1 junior & 2 senior resources would be different from that of a team of 2 junior & 1 senior resources, even though the team size is the same. We therefore need to calculate the average effort per test for the given team composition.

If the team consists of X junior and Y senior resources working in parallel, each junior completes 1 / Jr. Effort tests per hour and each senior completes 1 / Sr. Effort tests per hour, so the team as a whole completes (X / Jr. Effort) + (Y / Sr. Effort) tests per hour. The average team effort needed to complete 1 test is the reciprocal of this rate:

1 / Avg. Team Effort = (X / Jr. Effort) + (Y / Sr. Effort)

Avg. Team Effort = (Jr. Effort * Sr. Effort) / (Y * Jr. Effort + X * Sr. Effort)

From the above table, the team effort for each category of tests with 1 junior and 1 senior resource would be:

Avg. Team Effort for Simple test = (1.5 * 1) / (1.5 + 1) => 1.5 / 2.5 => 0.6 hours

Avg. Team Effort for Medium test = (3 * 2) / (3 + 2) => 6 / 5 => 1.2 hours

Avg. Team Effort for Complex test = (7 * 4) / (7 + 4) => 28 / 11 => 2.55 hours

Resources                      | Average Effort (Hrs) – Simple Test | Average Effort (Hrs) – Medium Test | Average Effort (Hrs) – Complex Test
Avg. Team Effort (1 Jr + 1 Sr) | 0.6                                | 1.2                                | 2.55

Let us assume our automation team consists of 3 junior and 2 senior resources. Here is the effort estimation for this team composition:

Avg. Team Effort for Simple test = (1.5 * 1) / (2 * 1.5 + 3 * 1) => 1.5 / 6 => 0.25 hours

Avg. Team Effort for Medium test = (3 * 2) / (2 * 3 + 3 * 2) => 6 / 12 => 0.5 hours

Avg. Team Effort for Complex test = (7 * 4) / (2 * 7 + 3 * 4) => 28 / 26 => 1.08 hours
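
For reference, here is a minimal Java sketch of this calculation for both team compositions. It is only an illustration, and the class, method and variable names are my own, not part of any framework or tool:

public class TeamEffort
{
    // Average team hours per test for X juniors and Y seniors working in parallel
    public static double avgTeamEffort(int juniors, double jrEffort, int seniors, double srEffort)
    {
        double testsPerHour = juniors / jrEffort + seniors / srEffort;
        return 1.0 / testsPerHour;
    }

    public static void main(String[] args)
    {
        // 1 junior + 1 senior on a Simple test (1.5 hrs and 1 hr per test respectively)
        System.out.println(avgTeamEffort(1, 1.5, 1, 1.0)); // ~0.6 hours
        // 3 juniors + 2 seniors on a Simple test
        System.out.println(avgTeamEffort(3, 1.5, 2, 1.0)); // 0.25 hours
    }
}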

Resources                      | Average Effort (Hrs) – Simple Test | Average Effort (Hrs) – Medium Test | Average Effort (Hrs) – Complex Test
Avg. Team Effort (3 Jr + 2 Sr) | 0.25                               | 0.5                                | 1.08

This is the effort for a single test of each complexity. In Section 2, we listed the test case counts across the different categories for 3 applications. The total team effort to automate these test cases with a team of 3 junior and 2 senior resources would be:

Application   | Effort (Hrs) – Simple Cases | Effort (Hrs) – Medium Cases | Effort (Hrs) – Complex Cases
Application 1 | 60 * 0.25                   | 100 * 0.5                   | 40 * 1.08
Application 2 | 50 * 0.25                   | 120 * 0.5                   | 30 * 1.08
Application 3 | 30 * 0.25                   | 110 * 0.5                   | 60 * 1.08

Which is:

Application   | Effort (Hrs) – Simple Cases | Effort (Hrs) – Medium Cases | Effort (Hrs) – Complex Cases | Total Effort (Team Hours)
Application 1 | 15                          | 50                          | 43.2                         | 108.2
Application 2 | 12.5                        | 60                          | 32.4                         | 104.9
Application 3 | 7.5                         | 55                          | 64.8                         | 127.3

Here is the effort in Team Days (with 8 hours per day):

Application   | Total Effort (Team Days)
Application 1 | 13.5
Application 2 | 13.1
Application 3 | 15.9

So, the total effort to automate Application 1 with a team of 3 junior and 2 senior resources would be about 13.5 team days. Application 2 will need about 13.1 team days and Application 3 will take about 15.9 team days.
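
Putting the pieces together, here is a hedged end-to-end sketch (again with my own class and variable names) that reproduces the Application 1 total from the counts in Section 2 and the averages in Section 4:

public class AutomationEstimate
{
    // Same helper as in the earlier sketch
    public static double avgTeamEffort(int juniors, double jrEffort, int seniors, double srEffort)
    {
        return 1.0 / (juniors / jrEffort + seniors / srEffort);
    }

    public static void main(String[] args)
    {
        // Per-test team effort for 3 juniors + 2 seniors, using the averages from Section 4
        double simple  = avgTeamEffort(3, 1.5, 2, 1.0); // 0.25 hours
        double medium  = avgTeamEffort(3, 3.0, 2, 2.0); // 0.5 hours
        double complex = avgTeamEffort(3, 7.0, 2, 4.0); // ~1.08 hours

        // Application 1 counts from Section 2: 60 Simple, 100 Medium, 40 Complex
        double totalHours = 60 * simple + 100 * medium + 40 * complex;
        double totalDays  = totalHours / 8; // 8 working hours per team day

        // Prints ~108.1 team hours and ~13.5 team days (the table above rounds the
        // Complex effort to 1.08, which gives 108.2)
        System.out.printf("Application 1: %.1f team hours, %.1f team days%n", totalHours, totalDays);
    }
}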

There will be large applications with thousands of tests where automation takes several months. The same approach works for an application of any size; the number of sampled tests just needs to be increased based on the size of the application.

Reusable steps can save some automation time, as the steps can be shared between tests. This is possible when a standard keyword-driven approach is followed. We will discuss this further in upcoming blogs.


In the next blog, we will look at the process of estimating effort for automation framework development, script execution and script maintenance.

By
Automation Mentor
www.automationmentor.in

We provide hands-on training on automation tools and frameworks

Sunday 3 August 2014

Automating Flash applications and Windows controls using Selenium



Overview of Flash applications

If you are looking at web application automation testing, "Selenium" is probably one of the best open source tools available. But sometimes we need workarounds or alternative solutions when using Selenium to test the Flash components in an application.

Requirements sometimes call for Flash or Flex to make the web site look better, but this is a challenge for the automation engineer because Selenium cannot read, interact with or perform actions on Flash objects directly.

Flash is multimedia software for creating graphical applications such as animated images, video players and audio players in a webpage.

Flash applications are embedded into HTML using the <object> and <embed> tags, but the Flash functional methods are not visible in the HTML code. JavaScript is used internally to call the Flash methods.

Flash application testing

Flash feature testing is a kind of white-box testing: the Flash methods are not exposed in the HTML page, so it is difficult for the automation tester to identify elements and perform actions on them.

A sample Flash application is available at http://www.permadi.com/tutorial/flashjscommand




A Flash application can be tested in two ways:


  • The developer provides Flash methods for testing; we can then perform the testing by calling them through JavaScript code.
  • Independent testers who do not have access to the source code can use an image-based tool called Sikuli, which uses images to perform user actions (a short sketch follows this list).
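
As a hedged illustration of the Sikuli option, here is a minimal sketch using Sikuli's Java API (it assumes the sikulixapi library is on the classpath; the image file names are hypothetical screenshots you would capture from the application under test):

import org.sikuli.script.FindFailed;
import org.sikuli.script.Screen;

public class Sikuli_Flash_Example
{
    public static void main(String[] args) throws FindFailed
    {
        Screen screen = new Screen();
        // Click the play button by matching a screenshot of it (hypothetical image file)
        screen.click("play-button.png");
        // Wait up to 10 seconds for the stopped state to appear on the screen
        screen.wait("stopped-state.png", 10);
    }
}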

If the developers can provide a recompiled application that exposes some Flash methods for testing, the example below shows how to automate the sample application using those methods.

Step 1:

Create a FlashWebDriver class which receives the Flash object id and performs user actions on the Flash application using JavaScript internally.

package com.java.selenium;

import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;

public class FlashWebDriver
{
    private final WebDriver webdriver;
    private final String flashObjectId;

    public FlashWebDriver(final WebDriver webdriver, final String flashObjectId)
    {
        this.webdriver = webdriver;
        this.flashObjectId = flashObjectId;
    }

    // Clicks a button inside the Flash movie via the exposed DoFlashClick method
    public String click(final String objectId, final String buttonLabel)
    {
        return callFlashObject("DoFlashClick", objectId, buttonLabel);
    }

    public String click(final String objectId)
    {
        return click(objectId, "");
    }

    // Calls any exposed Flash method by injecting JavaScript into the page
    public String callFlashObject(final String functionName, final String... args)
    {
        final Object result = ((JavascriptExecutor) webdriver).executeScript(
                makeJsFunction(functionName, args), new Object[0]);
        return result != null ? result.toString() : null;
    }

    // Builds the JavaScript call, e.g. "return document.myFlashMovie.Play();"
    private String makeJsFunction(final String functionName, final String... args)
    {
        final StringBuffer functionArgs = new StringBuffer();
        if (args.length > 0)
        {
            for (int i = 0; i < args.length; i++)
            {
                if (i > 0)
                {
                    functionArgs.append(",");
                }
                functionArgs.append(String.format("'%1$s'", args[i]));
            }
        }
        return String.format("return document.%1$s.%2$s(%3$s);",
                flashObjectId, functionName, functionArgs);
    }
}


Step 2:

Write the automation script for testing the Flash application using FlashWebDriver.

package com.java.selenium;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class Flash_Example
{
    public static void main(String[] args) throws Throwable
    {
        WebDriver driver = new FirefoxDriver();
        driver.manage().window().maximize();

        // "myFlashMovie" is the id of the Flash object embedded in the sample page
        FlashWebDriver flashapp = new FlashWebDriver(driver, "myFlashMovie");

        driver.get("http://www.permadi.com/tutorial/flashjscommand/");

        // Call the exposed Flash methods: start the movie, wait, then stop it
        flashapp.callFlashObject("Play");
        Thread.sleep(3000);
        flashapp.callFlashObject("StopPlay");

        driver.quit();
    }
}


Windows controls in Web Applications

Web application testing sometimes involves uploading & downloading various types of documents like .txt, .doc and .docx. A Windows popup appears when the user clicks the upload or download button in the web application.

Functional testing tools like Selenium do not handle Windows-based popups while uploading & downloading documents, so the automation tester can choose "AutoIt" to handle them.

AutoIt

Steps to handle Windows-based popups using AutoIt:

Application URL : http://www.pdfonline.com/convert-pdf/





Step 1:

Write an AutoIt script in the AutoIt script editor, specifying the details of the window popup and the data to be entered.

WinWaitActive("File Upload")
WinActivate("File Upload")
Local $file="C:\"&$CmdLine[1]
WinActivate("File Upload")
ControlSetText("File Upload","","Edit1",$file)
ControlClick("File Upload","","Button1") 


Step 2:

Compile the AutoIt script (.au3 file); an executable (.exe) AutoIt file will be generated.

Step 3:

Now let us invoke the AutoIt exe file from our Selenium code to handle the Windows popup.

package com.java.selenium;

import java.util.concurrent.TimeUnit;
import org.openqa.selenium.By;
import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;

public class Fileupload_Windows
{
    public static void main(String[] args) throws Throwable
    {
        WebDriver driver = new FirefoxDriver();
        driver.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
        driver.manage().window().maximize();
        driver.get("http://www.pdfonline.com/convert-pdf");

        // Click the file-input element to open the Windows "File Upload" dialog
        WebElement uploadInput = driver.findElement(By.cssSelector("input[name*='Filedata']"));
        ((JavascriptExecutor) driver).executeScript("arguments[0].click();", uploadInput);

        // Run the compiled AutoIt executable, passing the file name as a command-line argument
        String[] command = new String[] { "C:/FileUpload.exe", "AutoTest_WordDoc.docx" };
        Runtime.getRuntime().exec(command);
    }
}