India Equity Research

Wednesday, February 17, 2016

USER ACCEPTANCE TESTING (UAT) - Insights


For most organisations, if IT stops, the business stops; and without timely upgrades the business cannot grow, leading to lost opportunities. Whenever an organisation plans rapid growth or extends its services, there is invariably a new or modified IT system behind it. A problem-free IT system is the “acid test” of a significant investment of time, effort, resources and finance.

Whilst the technical testing of IT systems is a highly professional and exhaustive process, testing of business functionality is an entirely different proposition.

Does the system deliver the business functions – does it follow the company’s business rules – does it support government regulations – does it cope with exceptions – is it secure?

The people who have to make these decisions – to accept or reject the new system – are the business users. It is therefore critical to get the business user involved in testing and not rely only on the technicians.

What the business needs is a robust, quality system that serves it as it scales in the future. Quality is never an accident; it is always the result of intelligent effort.

PROBLEM
How often do we see project teams making compromises to meet an immovable deadline? Deadlines can be imposed by business objectives and by regulatory compliance requirements. Whatever the reason, meeting a deadline usually requires a compromise.

All projects share three common parameters:
·         Time
·         Resources (people, funding, tools, hardware, workspace)
·         Deliverables (functionality, quality, assessment, reports)

We can’t control time so we’re left with resources and deliverables. The amount of resources we commit to a project will determine the functionality and quality of deliverables. Functionality by a certain date is what an immovable deadline usually means so we are left with quality. The quality of a deliverable is directly proportional to the thoroughness of testing - but testing takes time.

When do you stop testing?
a) When all the planned tests are finished
b) When you run out of time
c) When you run out of budget

In an ideal world, all projects would allow adequate time for testing. Project teams would plan exhaustive testing for each piece of system functionality and if they ran out of time then they would drop functionality from a release rather than compromise on quality.

With business systems, it’s virtually impossible to test for every possible eventuality. We must therefore ask ourselves what is the most important functionality that must be tested within the available timeframe. The obvious answer is – the business critical functions that the system will deliver and on which the project justification is based.

User acceptance testing should be performed by business users to prove that a new system delivers what they are paying for. Business users have the knowledge and understanding of business requirements that IT testers do not have. They are uniquely placed to accept or reject the new system – after all they have to live with the consequences.

BUSINESS USER
Time and again we see systems failing at great cost for many different reasons, even though they all went through many types of testing by programmers, analysts and designers. Why focus on UAT?

The key is in the word User. Quite simply all other forms of testing are carried out by technicians to ensure that the system works technically, according to their understanding of the business requirements.

UAT alone is carried out by the users and business managers to satisfy themselves that the system will meet their business requirements. Business users know whether a new system will not only work technically, but will actually allow the company to make money from it or to meet its regulatory compliance obligations.

USER ACCEPTANCE TESTING (UAT)

UAT is the allocation of resources by the business to determine whether a new or changed IT system (or other asset) delivers the expected business outcome.

UAT is in essence part of risk management. We cannot allocate time and resources to test everything the system will do. With UAT we must allocate limited resources where they will be most effective.

One of the most commonly accepted models for software testing – regardless of the development methodology – is the V Model. This maps each development phase of a project to a corresponding testing phase: UNIT – INTEGRATION – SYSTEM – ACCEPTANCE.

Acceptance testing is the final checkpoint before the system goes live – and before the client pays. Whilst business requirements start the development project, it is acceptance testing that completes it. When specifying requirements our goal should be: This is what the system must do and this is how we will test it.

User Acceptance Testing: proves to the users that the system works according to their understanding of their own business requirements. They test the primary business functions they asked for and look at the business results to make sure they are correct.

Note: UAT is the final chance for the users to test the end system to their satisfaction before committing to use it in production.

RESPONSIBILITY
·         The business unit which owns the system is responsible for testing business functionality
·         IT is responsible for supporting UAT with resources, infrastructure and personnel to correct errors
·         The testing team is responsible for managing the deployment of resources to provide the best recommendation to management
·         Management is responsible for allocating sufficient resources, skilled personnel, infrastructure, time and budget
·         At the end of the day, management must also make the final decision - to release or delay the system - based upon the testing evidence provided and any identified risks.
·         Often a system will be released with known risks but an informed decision is always better than an uninformed one.


RULES
·         Define testing criteria when defining business requirements
·         UAT testers should have their own reporting line - not be part of the project team
·         Detachment- during development testers can “look but not touch” (review only)
·         Not everyone does testing well
·         We are not good at finding errors in our own work
·         Teamwork is essential
·         It’s almost never right first time
·         A successful test is one that finds errors
·         User acceptance testing is a business, not a technical, activity.
·         UAT is partly a risk management process, a balance of risk against available resources
·         UAT is the business users’ checkpoint; it tests the business outcomes
·         Resource planning and prioritisation are required, not everything can be tested
·         Testers need experience and knowledge of business operations and business rules
·         UAT needs people working together – business users, IT, suppliers and even customers
·         To re-emphasise the final point, the system supplier (in-house or external company) must work with the business users to develop a testing strategy, valid acceptance criteria of the business functionality and a test plan that maximises the potential for a successful system rollout. The business depends on it.


Keep a positive attitude; testing is not about assigning blame. Quality standards (Criteria, process and procedures) should ensure that risks are identified and mitigated at source.

I have not failed. I've just found 10,000 ways that won't work. - Thomas Edison


IMPLEMENTATION ADVISOR/ CONSULTANT

The adviser/consultant’s role is to help organizations determine whether software initiatives:

  • Follow established development methodologies and procedures
  • Meet organizational needs
  • Include adequate security
  • Include adequate management controls.


Key areas to consider are planning, methodology assessments, reports of project results, and post-implementation reviews. This includes review of:

  • Project Management
  • Application Controls
  • Application Security
  • Change Management
  • Data Conversion


Value that can be added by adviser involvement during the implementation process includes:
  • An independent third-party perspective
  • Subject matter knowledge of the systems
  • Activity assessments
  • Knowledge of current-state business processes
  • Knowledge of key application controls
  • Previous experience testing logical access and change management controls
  • Experience in reviewing data conversion
  • Setting up activity criteria
  • Reviewing risks and mitigations
  • Setting up a Go/No-Go matrix to help the organisation take the decision.
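The Go/No-Go matrix mentioned in the last point can be sketched as follows; the criteria, mandatory flags and the 80% threshold are illustrative assumptions, not a prescribed standard:

```python
# Illustrative Go/No-Go matrix: each criterion is marked pass/fail and
# flagged as mandatory. Any failed mandatory criterion forces a No-Go;
# otherwise at least 80% of all criteria must pass (assumed threshold).

def go_no_go(criteria):
    """criteria: list of (name, passed, mandatory) tuples -> 'GO' or 'NO-GO'."""
    for _name, passed, mandatory in criteria:
        if mandatory and not passed:
            return "NO-GO"
    passed_count = sum(1 for _n, passed, _m in criteria if passed)
    return "GO" if passed_count / len(criteria) >= 0.8 else "NO-GO"

decision = go_no_go([
    ("All critical defects resolved", True, True),
    ("Data conversion reconciled",    True, True),
    ("Rollback plan ratified",        True, True),
    ("Reports verified",              True, False),
    ("End-user training completed",   False, False),
])   # -> "GO": all mandatory criteria pass and 4 of 5 (80%) pass overall
```

Recording the matrix explicitly gives management the testing evidence on which to base an informed release-or-delay decision.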


OBJECTIVES AND PRINCIPLES OF SOFTWARE TESTING

OBJECTIVES:
  • To ensure that the application works as specified in the requirements document
  • To provide a bug-free application
  • To achieve customer satisfaction
  • To ensure that error handling is done gracefully in the application: when the user enters incorrect data, the application should display user-friendly messages
  • To establish confidence in the software
  • To evaluate the properties of the software
  • To distinguish between validation testing and defect testing
  • To describe strategies for generating system test cases
  • To understand the essential characteristics of tools used for test automation


PRINCIPLES:
  • Testing must be done by an independent party.
  • Assign the best personnel to the task.
  • Testing should not be planned under the assumption that no errors will be found.
  • Test for invalid and unexpected input conditions as well as valid conditions.
  • Testing is the process of executing software with the intent of finding errors.
  • Keep the software static during testing.
  • Document test cases and test results.
  • Provide expected test results wherever possible.
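The principle of testing invalid and unexpected input conditions as well as valid ones can be illustrated with a minimal sketch (`parse_amount` is a hypothetical function, not from the source):

```python
def parse_amount(text: str) -> float:
    """Parse a monetary amount, rejecting junk instead of guessing."""
    value = float(text)              # raises ValueError for non-numeric text
    if value < 0:
        raise ValueError("amount cannot be negative")
    return round(value, 2)

# Valid condition
assert parse_amount("125.50") == 125.50

# Invalid and unexpected conditions must be tested too
for bad in ["", "abc", "-1"]:
    try:
        parse_amount(bad)
        raise AssertionError(f"{bad!r} should have been rejected")
    except ValueError:
        pass  # expected: the error is raised explicitly, not swallowed
```

Note that the test plan names the expected result for each input in advance, per the last principle above.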

Effective Testing Strategy in Data Migration

The success of the migration project depends on proper planning, strategy, approach and testing techniques.

The following steps are required for successful implementation of the migration program:

§ Analyze the business requirements
§ Prepare the migration test plan/strategy
§ Prepare an environment equivalent to production
§ Identify success and failure conditions, as well as the application interface requirements
§ Prepare an end-to-end migration plan for data verification and validation
§ Ensure compliance with the data acceptance criteria
§ Provide post-production support to eliminate any issues that may occur during system go-live

Pre-requisites before migration
§ The organisation must look for expertise in the migration area to obtain better guidance for the migration process. A data migration project involves specialised skills, tools and resources, but sometimes the resources identified for the project lack the essential knowledge to carry out the migration program.
§ All stakeholders must be informed of the migration project in advance so that they are well aware of the time period of the migration process, the duration for which the old system will not be in use, and the benefits accrued through migration of the legacy system to the new application.
§ Ratify the working condition of the old systems and address the issues found during migration.
§ Ensure that a backup of the old environment or system is taken so that, if the migration fails, the data can be reloaded or migrated again.

During migration
§ Stay in constant communication with all end users and stakeholders while the migration process is in progress.
§ Make no changes to environments during migration trial runs. A backup environment should be available.
§ Risks should be documented with mitigation actions.
§ Comply with the agreed entry and exit criteria.

After migration
§ All failed items should be reviewed and re-migrated, with a root cause analysis (RCA) to establish why they failed to migrate.

§ All stakeholders should be informed of the expected time when the new system will go live.


Common risk factors in data migration projects:

Sl No  Category  Risk Factor
1 Technical Lack of agreement on system of record, data definitions, standards, transformations, conversion methods, etc.
2 Technical Data migration tool(s) not well defined or settled
3 Technical Hardware sizing and data volume are inconsistent
4 Technical Requiring historical data conversion (rather than using legacy systems for historical purposes)
5 Technical Multiple legacy sources of data for single master record loads.
6 Technical Data gaps – no corresponding data record for conversion
7 Technical Different data structures between legacy systems – for example, the target may have structured hierarchies while the source data does not
8 Technical Poor data quality – errors, duplicates, inconsistent use of fields
9 Management Improper estimates of data migration effort and activities
10 Management Client side skills gaps on data migration
11 Management Deferring data corrections until after go-live
12 Management Knowledge of and access to data sources
13 Management Insufficient or incorrect change control processes
14 Management Insufficient participation from key functional project team members (client and consulting)
15 Management Improper management of dependencies between functional areas or modules. 
16 Business Identifying legacy and other data sources
17 Business Cleaning or scrubbing the data
18 Business Ensuring the converted data meets business processing requirements
19 Business Defining sufficient test plans to validate data is properly converted
20 Business Provide sufficiently skilled employees to work the data
21 Business Make as many data corrections as possible in legacy systems before conversion
22 Business Ensure there are sufficient hardware resources available for meaningful data conversion tests

Converted/Migrated Data Acceptance Criteria for UAT

1.     DATA MIGRATION TESTING (DMT) - OBJECTIVE

The objective of any Data Migration Testing is to perform post-migration validation to ensure database structure and data integrity. This includes consistent data formats between the source and the target data, and continued interfacing capability after the data migration.

2.     DMT SCOPE

The scope of DMT is to verify that the COB run results for the pre-identified data set from Legacy are correct as per the processes applied – examples of such processes being calculation of accruals, interest capitalization, etc.
·         Data must be migrated completely and accurately from the source location to the target location and according to the following:
    • Bank requirements
    • Regulatory policies on information controls
    • Security.
·         There must be no dropped or incomplete records, nor any data fields that fail validation or other quality controls in the target environment.

·         The process needs to be done quickly with as short a downtime window as possible.

·         The data should be verified for:
COMPLETENESS: (for every Trial Run before go-live)
       Reconciling totals
       Reconciling sums
       Checksums

CORRECTNESS: (for every Trial Run before go-live)
       On-screen verification on the enquiry screens
       Validation of calculated fields
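A minimal sketch of the completeness checks above – reconciling totals, sums and checksums between the source and target extracts (the field names `account` and `balance` are illustrative assumptions):

```python
import hashlib

def _checksum(rows):
    """Order-independent checksum over the key fields of an extract."""
    blob = "|".join(sorted(f"{r['account']}:{r['balance']}" for r in rows))
    return hashlib.sha256(blob.encode()).hexdigest()

def reconcile(source_rows, target_rows):
    """Return the three completeness checks: counts, control sums, checksums."""
    return {
        # Reconciling totals: same number of records migrated
        "count": len(source_rows) == len(target_rows),
        # Reconciling sums: control total of a numeric field matches
        "sum": sum(r["balance"] for r in source_rows)
               == sum(r["balance"] for r in target_rows),
        # Checksum: hash over key fields matches exactly
        "checksum": _checksum(source_rows) == _checksum(target_rows),
    }

src = [{"account": "A1", "balance": 100.0}, {"account": "A2", "balance": 250.5}]
tgt = [{"account": "A2", "balance": 250.5}, {"account": "A1", "balance": 100.0}]
result = reconcile(src, tgt)   # all three checks pass despite differing row order
```

Any failed check points at a dropped, duplicated or corrupted record, which the correctness checks then locate on screen.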

3.     ENTRY AND EXIT CRITERIA FOR TRIAL RUN

SNO   PARAMETER   ENTRY CRITERIA FOR TRIAL RUN
1     Defects     All CRITICAL and MAJOR defects must be resolved unless agreed by all stakeholders
2     Defects     Not more than 10% of Average-severity defects unresolved, based on the agreed list of defects/bugs

SNO   PARAMETER   EXIT CRITERIA FOR TRIAL RUN
1     Defects     No re-opened CRITICAL or MAJOR defects from the agreed list of defects/bugs
2     Defects     Not more than 10% of Average-severity defects re-opened, based on the agreed list of defects/bugs
3     Reports     All reconciliation reports 100% accurate, with discrepancies accounted for

4.     PROCESS INTEGRITY TESTS (PIT):

PIT is performed on the Trial Run(s) before data is accepted into UAT, based on the agreed criteria.

5.     PIT OBJECTIVE AND APPROACH

Process integrity ensures the successful continuation of business processes in FIS Profile with respect to the business requirements for the converted data. It also helps in identifying inconsistencies and the behaviour of converted data in FIS Profile.
·            The objective of PIT is to verify that the behaviour of data processing post-migration from legacy is consistent with the Bank’s laid-down processes, and to identify any issues that occur in doing so.
·            PIT will be performed after completion of the data integrity tests and closure of the identified defects.
·            There will be two rounds of PIT of one week’s duration each. Each round of PIT should be executed in a subsequent trial run and should start at least 4 weeks before UAT with CD. This allows any defect fixes to be applied to the CD before it is accepted for UAT and also ensures that the process integrity of the data continues into UAT with CD.


Following is the approach by which process integrity will be checked as part of the DM testing phase.
1.    Identification of business processes: includes the processes that are executed as part of EOD/EOM/EOQ/EOY. COB-dependent processes will be validated.
2.    Pre-COB validation: the data set that will be processed during COB will be identified, before execution of COB, for expected changes – e.g., account records that are to be marked as ‘Dormant’ by the COB.
3.    Data verification: before running the COB, the extract provided will be queried and the data set that will be affected by COB will be captured. After COB execution, the extract will be taken from the target system and queried for the already-identified data set to verify the correctness of the data (i.e., the results of the processes identified). This ensures that the migrated data behaves as expected in the target system.
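The pre- and post-COB verification described in the steps above can be sketched as follows; the dormancy rule, threshold and field names are illustrative assumptions, not taken from FIS Profile:

```python
# Pre-COB: predict which accounts COB should mark 'Dormant'; post-COB: verify.
INACTIVITY_DAYS = 180   # assumed dormancy threshold

def expected_dormant(accounts, as_of_day):
    """Query the pre-COB extract for accounts COB should mark Dormant."""
    return {a["id"] for a in accounts
            if as_of_day - a["last_activity_day"] >= INACTIVITY_DAYS}

def verify_post_cob(accounts, expected_ids):
    """Confirm that exactly the predicted accounts carry the Dormant flag."""
    actual = {a["id"] for a in accounts if a["status"] == "Dormant"}
    return actual == expected_ids

pre = [
    {"id": "A1", "last_activity_day": 10,  "status": "Active"},
    {"id": "A2", "last_activity_day": 300, "status": "Active"},
]
expected = expected_dormant(pre, as_of_day=365)   # {"A1"}: inactive 355 days

post = [   # extract taken from the target system after COB execution
    {"id": "A1", "last_activity_day": 10,  "status": "Dormant"},
    {"id": "A2", "last_activity_day": 300, "status": "Active"},
]
ok = verify_post_cob(post, expected)   # True: COB behaved as predicted
```

Capturing the expected result before the COB run, rather than inspecting afterwards, is what makes this a test rather than a review.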

The following processes/transactions will be tested in the DM environment to ensure the converted data is accepted for UAT based on the outcome. This is not a complete list; business modules can add, modify or delete transactions to verify that critical processes are working and can continue in UAT with CD.


Note: testing of migrated data usage with respect to its functionality and business processes will be performed as part of UAT with migrated data.

6.     ENTRY CRITERIA FOR PIT EXECUTION:

SNO   PARAMETER         ENTRY CRITERIA FOR PIT EXECUTION
1     Data Conversion   TR completed and exit criteria met
2     Environment       UAT environment available with the latest code drop and data
3     Execution         SIT completed

7.     ACCEPTANCE CRITERIA FOR UAT WITH CD:

SNO   PARAMETER   ACCEPTANCE CRITERIA
1     Execution   100% of the PIT execution completed
2     Execution   All agreed CRITICAL functions work as expected and have passed
3     Execution   More than 85% of transactions from the agreed list have passed
4     Defects     No open CRITICAL or MAJOR defects without acceptable correction tasks defined (data and functional)
5     Execution   Day End executes without manual intervention
6     Defects     Not more than 10% Average/Minor defects out of the overall defects raised (data-related only, not functional)
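The acceptance criteria in the table above can be expressed as an automated check; the thresholds come from the table, while the structure of the results record is an assumption for illustration:

```python
def uat_acceptance(results):
    """Evaluate converted-data acceptance for UAT against the agreed criteria."""
    checks = [
        results["pit_executed_pct"] == 100,              # 100% of PIT executed
        results["critical_functions_passed"],            # all CRITICAL functions pass
        results["transactions_passed_pct"] > 85,         # >85% of agreed transactions
        results["open_critical_major_without_fix"] == 0, # no unaddressed blockers
        results["day_end_manual_interventions"] == 0,    # Day End runs hands-off
        results["minor_defect_pct"] <= 10,               # <=10% average/minor defects
    ]
    return all(checks)

ok = uat_acceptance({
    "pit_executed_pct": 100,
    "critical_functions_passed": True,
    "transactions_passed_pct": 92.0,
    "open_critical_major_without_fix": 0,
    "day_end_manual_interventions": 0,
    "minor_defect_pct": 7.5,
})   # -> True: every criterion in the table is satisfied
```

A single failing criterion rejects the converted data, which is the point: acceptance is all-or-nothing against the agreed list.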

8.     UAT ENVIRONMENT:

·         Target system capacity utilisation is not to exceed 40% as measured after loading the converted data.
·         Data refresh in the middle of the UAT cycle if there are major data-related fixes which impact functionality or business process.
·         Reduced turnaround time for rollback of the data in case of issues.
·         Any issues identified in transformation and loading can be fixed and the data refreshed in the DM environment for further testing. This reduces issue-fix time and makes testing more effective.
·         A subset of the converted data is to be used in UAT with CD. This will help to troubleshoot and resolve issues more quickly.

9.     RISK AND CHALLENGES

There are also risks and challenges associated with a data migration that must be considered:
·         Migrated data may be part of a larger chain of dependencies, which can increase complexity
·         Ill-defined rules for integrity, controls, security, availability and recoverability can result in an incorrect migration
·         Data may be too distributed to migrate easily

Data conversions are a leading cause of schedule and quality issues in Core Banking Transformations. Research from several sources, including Gartner, confirms that over 80% of data conversion efforts either fail outright or suffer significant cost overruns and delays. As a result, they jeopardize IT projects that depend on them.

10.  GLOSSARY


·         PIT – Process Integrity Testing
·         CD – Converted Data
·         DMT – Data Migration Testing
·         DM – Data Migration
·         TR – Trial Run
·         UAT – User Acceptance Testing
·         COB – Close of Business