Category: Automation Tools

10 steps to become a QTP guru

Would you like to know how to become a QTP guru?
The recipe is simple. You should learn the following:

  1. VBScript
    QTP uses the VBScript language, so strong knowledge of VBScript is a must-have.
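As a taste of what is needed, here is a minimal VBScript sketch of the kind of helper function QTP tests are built from. It runs under Windows Script Host (cscript); the names are illustrative, and in a real QTP test the result would go to Reporter.ReportEvent rather than the console:

```vbscript
' Illustrative VBScript: a tiny verification helper of the kind
' QTP test scripts rely on. All names here are made up for the example.
Function VerifyEqual(expected, actual, message)
    If expected = actual Then
        WScript.Echo "PASS: " & message
        VerifyEqual = True
    Else
        WScript.Echo "FAIL: " & message & _
                     " (expected " & expected & ", got " & actual & ")"
        VerifyEqual = False
    End If
End Function

Dim total
total = 2 + 3
VerifyEqual 5, total, "sum check"   ' prints "PASS: sum check"
```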

  2. Software Testing Automation Frameworks
    To write good automated tests, you should know how to structure them.
    There are different approaches and solutions; there is no silver bullet.

  3. HP QuickTest Professional Tutorial
    This QTP tutorial comes with QTP.
    It’s quite clear and informative. Its main goal is to show and explain the basic concepts of QuickTest Professional. It provides knowledge on:

    • creating a test
    • working with Object Repository (OR)
    • running and analyzing tests
    • verifying tests
    • parameterizing, etc

Sources: ‘\help\QTTutorial.pdf’ or ‘\help\Tutorial.chm’ in the QTP install folder.


  4. HP QuickTest Professional User’s Guide
    Like the tutorial, the User’s Guide comes with the QTP installation.
    This guide is intended for QuickTest Professional users at all levels. It gives deeper knowledge on:

    • working with QTP Object Repositories
    • designing tests
    • enhancing tests (checkpoints, parameterizing, etc)
    • maintaining and debugging tests
    • advanced testing features, etc

Sources: ‘\help\QTUsersGuide.pdf’ or ‘\help\MainUsersGuide.chm’ in the QTP install folder.


  5. COM/DCOM Technologies
    When working with QTP, you will work closely with Microsoft applications such as Excel, Word, and Outlook. So I recommend getting familiar with COM, COM technologies, and the COM objects of these applications.
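For instance, the same COM mechanism QTP relies on is available from plain VBScript via CreateObject. A hedged sketch of driving Excel (assumes Excel is installed; runnable with cscript):

```vbscript
' Automate Excel through its COM object model: write a cell, read it back.
Dim excelApp, workbook, cellValue
Set excelApp = CreateObject("Excel.Application")
excelApp.Visible = False
Set workbook = excelApp.Workbooks.Add()
workbook.Worksheets(1).Cells(1, 1).Value = "QTP test data"
cellValue = workbook.Worksheets(1).Cells(1, 1).Value
WScript.Echo cellValue              ' prints "QTP test data"
workbook.Close False                ' close without saving
excelApp.Quit
Set workbook = Nothing
Set excelApp = Nothing
```

The same CreateObject pattern works for Word ("Word.Application") and Outlook ("Outlook.Application").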

  6. SQL
    SQL is so important in programming that I strongly recommend spending time to learn:

    • RDBMS concepts
    • selecting/updating/deleting data
    • SQL query optimization
    • database administration
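From VBScript (and hence from QTP), SQL is typically executed through ADO. A sketch, assuming a SQL Server ODBC driver and a made-up server, database, and table:

```vbscript
' Run a SELECT through ADO. The connection string, server name, and
' Orders table are illustrative; adjust them to your environment.
Dim conn, rs
Set conn = CreateObject("ADODB.Connection")
conn.Open "Driver={SQL Server};Server=MYSERVER;Database=TestDB;Trusted_Connection=Yes;"
Set rs = conn.Execute("SELECT OrderID, Amount FROM Orders WHERE Amount > 100")
Do Until rs.EOF
    WScript.Echo rs("OrderID") & ": " & rs("Amount")
    rs.MoveNext
Loop
rs.Close
conn.Close
```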


  7. XML
    XML is an extremely popular and useful format. I’m sure you will have to deal with data stored in XML files.
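In VBScript, XML is usually handled through the MSXML COM component. A small sketch (assumes MSXML 6.0, which ships with modern Windows):

```vbscript
' Parse an XML fragment with MSXML and walk its nodes via XPath.
Dim xmlDoc, node
Set xmlDoc = CreateObject("MSXML2.DOMDocument.6.0")
xmlDoc.async = False
xmlDoc.loadXML "<users><user name='Alice'/><user name='Bob'/></users>"
For Each node In xmlDoc.selectNodes("/users/user")
    WScript.Echo node.getAttribute("name")   ' prints Alice, then Bob
Next
```

Use xmlDoc.load "file.xml" instead of loadXML to read data stored in a file.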

  8. HTML, DOM
    Since QuickTest Professional works perfectly with web applications, you should be an expert in related fields – HTML, HTTP, DHTML, DOM, etc. This knowledge will simplify your future QTP scripts and make them more reliable and maintainable.
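Inside QTP itself, DOM knowledge pays off through the .Object property, which exposes a web object's underlying DOM. A hedged sketch that only runs inside QTP (the Browser/Page names are placeholders for entries in an assumed object repository):

```vbscript
' QTP-only sketch: enumerate all links on the page under test via the DOM.
' "MyApp" is an illustrative object-repository name, not a real one.
Dim dom, links, i
Set dom = Browser("MyApp").Page("MyApp").Object
Set links = dom.links
For i = 0 To links.length - 1
    Reporter.ReportEvent micDone, "Link " & i, links(i).href
Next
```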

  9. HP QTP Knowledge Base
    It contains a lot of practical articles about QuickTest Professional.
    There you can find QTP webinars, QTP FAQs, documentation, solutions to common problems, and different ways to improve your QTP automated tests.

  10. Useful sites
    Sure, it’s impossible to know everything about QTP.
    That’s why I recommend searching these two sites for answers to your QTP questions:


Infogain is looking to fill the positions given below.

(1) Sr Developer / Analyst – .Net

Experience: 2- 5 years

Skill Set:

– Minimum of 3 years of experience developing software for RDBMS based n-tier applications.

– At least 1 year of VB 6.0 Experience.

-At least 1 year of VB.NET/C# experience.

– At least 1 year of SQL Server experience (TSQL queries, stored procedures).

– Code maintenance experience.

– Long distance collaboration experience.

– ASP/ASP.NET/HTTP/Javascript experience

– A degree in Computer Sc. or equivalent from a reputed Engineering School such as IIT, IISc, Delhi College of Engineering, DIT, RECs etc.

(2) Sr Developer / Analyst  – Symbian

Experience: 4- 6 years

Skill Set :

C++ , Symbian

-Systems Programming.

-Should have good interpersonal skills.


(3) Engineer/Sr Engineer – Testing

Experience: 2- 5 years

Skill Set:

– 2-5 years of QA experience.

– 2 + years of automation experience.

– Excellent experience on QTP (writing scripts).

– Experience with TestDirector.

– Fair knowledge about XML.

– Fair understanding about SQL (should be able to write SQL queries).

– Load Runner experience is good to have.

(4) Lead – J2ee

Experience: 6 – 8 years

Skill Set:

– Good technical skills in Java, JSP, J2EE, EJB, WebLogic and XML.

– Knowledge of Oracle/SQL.

– Good in application design with exposure on Rational Rose .

– Project Mgmt, Status reporting, client interfacing skills .

– Good to have experience in Product Life Cycle  .

– Knowledge in SDLC .

All positions would be based in Noida. Applicants should be BE/B.Tech/MCA/MBA or equivalent. Excellent verbal and written communication and presentation skills are a must.

Please send your referrals for the above positions to me at . While responding, please mention the relevant skill in the subject line.

Happy job hunting 🙂


Too Good…


Infogain is looking to fill the positions given below at their state-of-the-art facility in Noida.

Open Positions:

1)         Senior Software Engineer/Team Lead – J2ee
      Experience: 4+ years
·    Experience in Java/JSP/Servlets/Struts/EJB/JMS.
·    Experience in Ant is a must.
·    Good knowledge of Weblogic and/ or Websphere.
·    Unix skills are a must.
·    Understanding of Oracle/DB2/Informix is a must.
·    Good to have: Jboss, SQL Server.

2)       Sr. Developer/Lead- C/C++
Experience: 3+ years

    ·    Must have good experience in C++, OOAD.
    ·    Experience in ERD (Enhancement Requirement Document) Analysis.
    ·    Strong Engineering Analysis & debugging skill.
    ·    Previous experience of large-scale application design/development.
    ·    Ability to write well defined unit test cases, functional test cases.

3)         Sr. Developer/ Team Lead – VB.Net
Experience:  4-7 years
·    Windows Service programming experience.
·    Code maintenance experience.
·    SQL Server familiarity (TSQL queries, stored procedures).
·    XML familiarity.
·    Long distance collaboration experience.
·    Knowledge of C#, ASP. Net

4)          Sr. Developer/ Team Lead – Testing   
     Experience:  3 plus years
·    Good QA aptitude and Testing skills.
·    Ability to write and execute test cases.
·    Ability to write concise and accurate defect reports and reproduction steps.
·    Experience in using defect-tracking system.
·    Install/setup skills.
·    Automated and manual testing of web based applications.
·    Experience of using Automated Testing Tools preferably Load runner and/or Quick test professional.
·    Knowledge of various types of Testing (Load/ Performance/ Reliability/ Supportability).
·    Testing and QA of multiple web based applications.
·    Multiple software project experience preferably with a medium/ large consulting firm.
·    Complete life cycle understanding of projects.

5)       Siebel Consultant
 Experience: 3+ years
·    Strong technical skills with the ability to coordinate and ensure the success of Siebel deployments.
·    Experience in Siebel Application Administration and Siebel 7.x. Proficiency in writing SQL and previous experience with Oracle desired.
·    Tuning and performance improvement skills in a Siebel 7.x environment.
·    Experience with Siebel Enterprise Application Integration (EAI) data flows and Enterprise Integration Manager (EIM) and EIM Interface tables, Siebel Server Process Monitor,       Workflow Manager, and Assignment Manager is desired.
·    Must be a self-starter with ability to organize work for others.
·    Ability to work well under pressure and work on multiple tasks simultaneously. Must be available to work flexible hours.
    ·    Excellent problem solving, organizational and analytical skills.

6)      Sr Developer /Lead -SSIS /SSRS/SSAS
     Experience: 4- 8 years
·    Must have strong skills in SQL Server Integration Services, SQL Server Reporting Services and SQL Server Analysis Services.
·    Prior experience in SQL Server Reporting Services is preferred.
·    Should have good exposure in ETL tools, DTS.
·    Should have good interpersonal skills.
·    Those having a B1 visa will be preferred.

7)     Integration Architect
     Experience: 8+ yrs
·    Overall 8+ yrs of IT industry with minimum of 3+ years of demonstrable experience of delivering integration solution.
·    Experience in complex technology solutions in Integration, and ability to architect solutions across entire architectural frameworks.
·    Experience of leading technology streams demonstrating integration infrastructure architecture design.
·    Good understanding of SOA (service oriented architecture), EDA (Event Driven Architecture) and ESB.
·    Experience with Security Design and Implementation, ranging from network to application security.
·    Any one of Integration Products (TIBCO, IBM-MQ, BEA -WLI, Web methods, Vitria).
·    Integration Design Patterns, OOAD.
·    Integration infrastructure design, architecture and deployment.
·    Experience with implementation of security authentication and authorization services.
·    High energy/motivation level with a proven ability to deliver.
·    Strong communication skills both written and oral.
·    Knowledge of any one of Finance, Retail, Manufacturing, Logistics, Life Science domains.

8)      Senior Developer – Oracle
     Experience: 3+ years
·    Experience with Customer Data Integration (CDI).
·    Experience with Oracle Customer Online (OCO).
·    Experience with Data Quality Tools (such as Trillium etc.).
·    Other mandatory Skills:
– Oracle 10i
– Oracle PL/SQL

9)    Project Manager-Genesys
    Experience: 8+ years
·    Strong skills in Genesys.
·    Should have good experience in Routing, CC Pulse.
·    Should have good experience in a multi-channel contact center environment.
·    Knowledge of Genesys suites is required.

10)    Architect – Oracle
    Experience: 8+ years
·    Experience in architecting solutions using Oracle Customer Data Hub.
·    Experience in Data Quality Management using Data Quality Tools (such as Trillium etc.).
·    Experience with Customer Data Integration (CDI).
·    Experience with Oracle Customer Online (OCO).
·    Experience in Oracle Apps OR Siebel OR SAP
·    Other mandatory Skills:
– Oracle 10i
– Oracle PL/SQL

11)    Developer/ Sr. Developer- Tibco
         Experience: 3+ years.
         Should have strong skills in TIBCO: BusinessWorks, Adapters, Hawk, Rendezvous, JMS.

12) Project Managers – Java and .Net technologies.

All positions are based in Noida. Applicants should be BE/B.Tech/MCA/MBA or equivalent. Excellent verbal, written communication and presentation skills are a must.

In case any of you are interested, please e-mail me your CV at

So don’t wait; send across your CVs with your years of experience and the position you are applying for in the subject of your mail.



I am looking for individuals who can be a part of Techieminds. The site is growing in leaps and bounds (see the number of hits in the last 3-4 months), so I would like to have some dedicated users who can help me.
In case you are interested, reply back.


Here are some details about testing metrics.

While testing a product, a test manager or lead has to make many decisions: when to stop testing, when the application is ready for production, how to track testing progress, and how to measure the quality of the product at a certain point in the testing cycle.

Testing metrics can help you make better, more accurate decisions.

Let’s start by defining the term ‘metric’.
A metric is a number that expresses a relationship between two variables. Software metrics are measures used to quantify status or results.

How to track testing progress?
The best way is to have a fixed number of test cases ready before the test execution cycle begins. Testing progress is then measured by the number of test cases executed:

% Completion = (Number of test cases executed)/(Total number of test cases) × 100

Besides testing progress, the following metrics help measure the quality of the product:
% Test cases Passed = (Number of test cases passed)/(Number of test cases executed) × 100

% Test cases Failed = (Number of test cases failed)/(Number of test cases executed) × 100
Note: a test case is Failed when at least one bug is found while executing it; otherwise it is Passed.
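Sticking with this blog's VBScript theme, the formulas can be computed directly from raw counts (the numbers below are made up purely for illustration):

```vbscript
' Compute testing-progress metrics, expressed as percentages.
Function Percent(part, whole)
    Percent = Round(100 * part / whole, 1)
End Function

Dim totalCases, executed, failed
totalCases = 200
executed   = 150
failed     = 12

WScript.Echo "% Completion: " & Percent(executed, totalCases)                ' 75
WScript.Echo "% Test cases Failed: " & Percent(failed, executed)             ' 8
WScript.Echo "% Test cases Passed: " & Percent(executed - failed, executed)  ' 92
```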

How many rounds or cycles of testing should be done?
When to stop testing?

Let’s discuss a few approaches.
Approach 1: This approach requires that you have a fixed number of test cases ready before the test execution cycle. In each testing cycle you execute all test cases. You stop testing when all test cases pass, or when the failure percentage in the latest testing cycle is very low.

Approach 2: Make use of the following metrics.
Mean Time Between Failures (MTBF): the average operational time before the software system fails.
Coverage metrics: the percentage of instructions or paths executed during tests.
Defect density: defects relative to the size of the software, e.g. ‘defects per 1000 lines of code’.
Open bugs and their severity levels.

If code coverage is good, the mean time between failures is quite large, defect density is very low, and not many high-severity bugs are still open, then maybe you should stop testing. ‘Good’, ‘large’, ‘low’, and ‘high’ are subjective terms and depend on the product being tested. Finally, the risk associated with moving the application into production, as well as the risk of not moving forward, must be taken into consideration.

Hope these details help.

Ping back in case you need further details.


I received an e-mail from Snigdha, Gaurav, and Samy asking me about the types of testing one can perform.

Here is the compiled list:

Black box testing – not based on any knowledge of internal design or code. Tests are based on requirements and functionality.

White box testing – based on knowledge of the internal logic of an application’s code. Tests are based on coverage of code statements, branches, paths, conditions.

Unit testing – a unit is the smallest compilable component, typically the work of one programmer. The unit is tested in isolation with the help of stubs or drivers. Typically done by the programmer and not by testers.

Incremental integration testing – continuous testing of an application as new functionality is added; requires that various aspects of an application’s functionality be independent enough to work separately before all parts of the program are completed, or that test drivers be developed as needed; done by programmers or by testers.

Integration testing – testing of combined parts of an application to determine if they function together correctly. The ‘parts’ can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.

Functional testing – black-box testing aimed at validating the functional requirements of an application; this type of testing should be done by testers.

System testing – black-box type testing that is based on overall requirements specifications; covers all combined parts of a system.

End-to-end testing – similar to system testing but involves testing the application in an environment that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems as appropriate. Even the transactions performed mimic end users’ usage of the application.

Sanity testing – typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing systems every 5 minutes, bogging down systems to a crawl, or destroying databases, the software may not be in a ‘sane’ enough condition to warrant further testing in its current state.

Smoke testing – the general (hardware-related) definition of smoke testing is: a safe, harmless procedure of blowing smoke into parts of the sewer and drain lines to detect sources of unwanted leaks and sewer odors.
In relation to software, smoke testing is non-exhaustive testing that ascertains that the most crucial functions of a program work, without bothering with finer details.

Static testing – test activities performed without running the software. Static testing includes code inspections, walkthroughs, and desk checks.

Dynamic testing – test activities that involve running the software are called dynamic testing.

Regression testing – testing of a previously verified program or application following modification (extension or correction) to ensure no new defects have been introduced. Automated testing tools can be especially useful for this type of testing.

Acceptance testing – final testing based on specifications of the end-user or customer, or based on use by end-users/customers over some limited period of time.

Load testing – a test whose objective is to determine the maximum sustainable load the system can handle. Load is varied from a minimum (zero) to the maximum level the system can sustain without running out of resources or having transactions suffer excessive (application-specific) delay.

Stress testing – subjecting a system to an unreasonable load while denying it the resources (e.g., RAM, disk, CPU, interrupts) needed to process that load. The idea is to stress the system to the breaking point in order to find bugs that will make that break potentially harmful. The system is not expected to process the overload without adequate resources, but it should behave (e.g., fail) in a decent manner (e.g., without corrupting or losing data). The load (incoming transaction stream) in stress testing is often deliberately distorted to force the system into resource depletion.


Performance testing – validates that both online response times and batch run times meet the defined performance requirements.

Usability testing – testing for ‘user-friendliness’. Clearly this is subjective and will depend on the targeted end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Programmers and testers are usually not appropriate as usability testers.

Install/uninstall testing – testing of full, partial, or upgrade install/uninstall processes.

Recovery testing – testing how well a system recovers from crashes, hardware failures, or other catastrophic problems.

Security testing – testing how well the system protects against unauthorized internal or external access, willful damage, etc; may require sophisticated testing techniques.

Compatibility testing – testing how well software performs in a particular hardware/software/ operating system/network/etc. environment.

Exploratory testing – often taken to mean a creative, informal software test that is not based on formal test plans or test cases; testers may be learning the software as they test it.

Ad-hoc testing – similar to exploratory testing, but often taken to mean that the testers have significant understanding of the software before testing it.

Monkey testing – testing that runs with no specific test in mind. The ‘monkey’ in this case is the producer of any input data (whether file data or input-device data); for example, pressing keys randomly and checking whether the software fails.

User acceptance testing – determining if software is satisfactory to an end-user or customer.

Comparison testing – comparing software weaknesses and strengths to competing products.

Alpha testing – testing of an application when development is nearing completion; minor design changes may still be made as a result of such testing. Typically done by users within the development team.

Beta testing – testing when development and testing are essentially completed and final bugs and problems need to be found before final release. Typically done by end-users or others, not by programmers or testers.

Mutation testing – a method for determining if a set of test data or test cases is useful, by deliberately introducing various code changes (‘bugs’) and retesting with the original test data/cases to determine if the ‘bugs’ are detected. Proper implementation requires large computational resources.

Cross-browser testing – testing an application with different browsers for usability and compatibility.

Concurrent testing – multi-user testing geared towards determining the effects of accessing the same application code, module, or database records. Identifies and measures the level of locking, deadlocking, and the use of single-threaded code, locking semaphores, etc.

Negative testing – testing the application for failure conditions by supplying improper inputs; for example, entering special characters in a phone-number field.

I hope things are very clear now.

Keep Testing 🙂