<PROJECT NAME>
TEST PLAN
Version <1.0>
<04/15/2016>
VERSION HISTORY
[Provide information on how the development and distribution of the Test Plan will be controlled and tracked. Use the table below to record the version history.]

Version # | Implemented By | Revision Date | Approved By | Approval Date | Reason
1.0 | | <04/15/16> | | | Test Plan Template draft
TABLE OF CONTENTS
1 INTRODUCTION
   1.1 Purpose of the Test Plan Document
2 INTEGRATION TESTING
   2.1 Test Scope: CCDA OUT Testing
   2.2 ITS Results Testing
   2.3 CCDA IN Testing
   2.4 Items to be Tested / Not Tested
   2.5 Test Approach(es)
      2.5.1 Test Approach for CCDA OUT Testing
      2.5.2 Test Approach for ITS Results Testing
      2.5.3 Test Approach for CCDA IN Testing
   2.6 Test Pass / Fail Criteria
      2.6.1 Test Execution Criteria for CCDA OUT Testing
      2.6.2 Test Execution Criteria for ITS Results
   2.7 Test Entry / Exit Criteria
   2.8 Test Defects
   2.9 Test Deliverables
   2.10 Environment Roles and Responsibilities
3 TEST STRATEGY
   3.1 Test Level Responsibility
TEST PLAN APPROVAL
1 INTRODUCTION
1.1 PURPOSE OF THE TEST PLAN DOCUMENT
This Test Plan documents the information required to effectively define the approach to be used for CCDA OUT testing.
The purpose of the Test Plan document is to outline integration testing of the outbound message from the HIE to the end-user system (affiliate) in the form of C32 and CCDA OUT files.
Testing includes interface testing of the SDA against the C32 and CCDA files.
2 INTEGRATION TESTING
2.1 TEST SCOPE: CCDA OUT TESTING
The integration testing scope includes validation of the translation unit as defined in the Design Integration Document (DID) and comparison of the SDA with the C32 and CCDA OUT files generated for the various affiliates.
[Link to Test Plan document on SJH box.]
[Figure: CCDA OUT test data flow. The soapUI tool exchanges a web service request and response with the HIE translation unit (SDA) over the web service link; the output is viewed in the Clinical Viewer as C32/CCDA.]
After the data is ingested into the HIE, the result is consolidated output in the form of an SDA file, which needs to be validated against the C32 and CCDA OUT files generated through soapUI.
The CCDA output file also needs to be validated with the NIST validator (baseline) to ensure there are no parsing issues.
Sections to be validated for CCDA OUT:
1. Patient
2. Allergy
3. History
4. Problems
5. Laboratory
6. Medications
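
As a supplement to the NIST validation, a local pre-check can confirm that the generated CCDA OUT file is well-formed XML, has a CDA ClinicalDocument root, and contains the sections listed above before the file is submitted to the validator. The sketch below is illustrative Python and is not a substitute for the NIST validator; the file name and section titles are assumptions for illustration only.

    # Minimal pre-check of a CCDA OUT file: well-formedness, root element,
    # and presence of expected section titles. Not a replacement for the
    # NIST validator; file name and titles are illustrative placeholders.
    from lxml import etree

    CDA_NS = {"cda": "urn:hl7-org:v3"}
    EXPECTED_SECTIONS = ["Allergies", "Problems", "Medications", "Results"]  # illustrative

    def precheck_ccda(path):
        tree = etree.parse(path)  # raises XMLSyntaxError if the file is not well formed
        issues = []
        if tree.getroot().tag != "{urn:hl7-org:v3}ClinicalDocument":
            issues.append("Unexpected root element: %s" % tree.getroot().tag)
        titles = [t.text or "" for t in tree.findall(".//cda:section/cda:title", CDA_NS)]
        for expected in EXPECTED_SECTIONS:
            if not any(expected.lower() in title.lower() for title in titles):
                issues.append("Section title not found: %s" % expected)
        return issues

    if __name__ == "__main__":
        for problem in precheck_ccda("ccda_out.xml"):
            print("WARN:", problem)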
2.2 ITS RESULTS TESTING
The integration testing scope for ITS results includes validation of the inbound and outbound messages after the message is processed. HL7 messages are triggered using an HL7 simulator, which sends the message to the HIE interface; after processing, the outbound message is compared with the inbound message.
[Figure: ITS results data flow. The HL7 simulator sends the inbound message to the HIE interface translation unit, which produces the outbound message.]
Sections to be validated for ITS results:
1. Validation against the rules defined in the interface mapping.
2. Comparison of the inbound message and the outbound message (a scripted comparison is sketched after this list).
3. Comparison of segment and field values of the HL7 message.
4. Validation of BBK&PTH, LAB, Micro, and ITS messages.
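
A scripted comparison can support items 2 and 3 above. The sketch below is illustrative Python that splits raw HL7 v2 messages on the segment (carriage return) and field ('|') delimiters and lists differing fields; fields that are expected to change in translation are still judged against the mapping rules defined in the DID.

    # Compare an inbound and an outbound HL7 v2 message field by field.
    # Segments are separated by carriage returns, fields by '|'.
    def parse_hl7(message):
        segments = {}
        for line in message.strip().replace("\n", "\r").split("\r"):
            if not line:
                continue
            fields = line.split("|")
            segments.setdefault(fields[0], []).append(fields)
        return segments

    def diff_hl7(inbound, outbound):
        differences = []
        inb, outb = parse_hl7(inbound), parse_hl7(outbound)
        for seg_name, in_segs in inb.items():
            out_segs = outb.get(seg_name, [])
            for i, in_fields in enumerate(in_segs):
                out_fields = out_segs[i] if i < len(out_segs) else []
                for j, value in enumerate(in_fields):
                    out_value = out_fields[j] if j < len(out_fields) else ""
                    if value != out_value:
                        differences.append("%s[%d]-%d: '%s' -> '%s'" % (seg_name, i, j, value, out_value))
        return differences

    # Example usage: differences = diff_hl7(inbound_text, outbound_text)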
2.3 CCDA IN TESTING
The integration testing scope for CCDA IN includes validation of the CCDA IN file and the SDA file after the message is processed.
[Figure: CCDA IN data flow. The Meditech EMR sends the CCDA IN message to the HIE interface translation unit, which produces the SDA message.]
Sections to be validated for CCDA IN results:
1. Comparison to confirm that CCDA IN field values are correctly populated in the SDA file (a scripted comparison is sketched after this list).
2. All records of the CCDA IN file are available in the SDA as per the mapping rules.
3. All required fields of the CCDA IN file have values.
4. Comparison of the HL7 message is documented as per the document template below.
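
Check 1 above can be scripted by reading corresponding values from the CCDA IN file and the SDA file and comparing them. The sketch below is illustrative Python; the XPath expressions and field names are placeholders only, since the authoritative source-to-target paths are defined in the DID mapping, not here.

    # Compare selected CCDA IN values with the values populated in the SDA file.
    # The XPath pairs below are illustrative placeholders; the real mapping
    # comes from the interface mapping defined in the DID.
    from lxml import etree

    CCDA_NS = {"cda": "urn:hl7-org:v3"}

    FIELD_MAP = {
        # logical field: (XPath in CCDA IN, XPath in SDA) -- placeholders
        "patient_given_name": (".//cda:patient/cda:name/cda:given", ".//Patient/Name/GivenName"),
        "patient_family_name": (".//cda:patient/cda:name/cda:family", ".//Patient/Name/FamilyName"),
    }

    def first_text(tree, xpath, ns=None):
        node = tree.find(xpath, ns)
        return (node.text or "").strip() if node is not None else ""

    def compare_ccda_to_sda(ccda_path, sda_path):
        ccda, sda = etree.parse(ccda_path), etree.parse(sda_path)
        mismatches = []
        for field, (ccda_xpath, sda_xpath) in FIELD_MAP.items():
            source = first_text(ccda, ccda_xpath, CCDA_NS)
            target = first_text(sda, sda_xpath)
            if source != target:
                mismatches.append("%s: CCDA='%s' SDA='%s'" % (field, source, target))
        return mismatches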
2.4 ITEMS TO BE TESTED / NOT TESTED
[The section below describes the items/features/functions to be tested that are within the scope of this test plan.]
Item to Test | Test Description | Test Date | Responsibility
Message with required fields | Validation of messages based on the fields defined as mandatory/optional for the interface, to ensure the mapping rules work as defined in the DID. | |
Mapping rules validation | Validation of the mapping between the SDA file and the fields required by the destination system. | |
Translation unit validation | Validation of the translation unit against the XSD (see the sketch after this table). | |
Field-level validation | Validation of fields by comparing the input (CCDA) and output (SDA) files for the respective affiliates; field-level testing to validate message failure based on invalid values. | |
Protocol and alert validation | Validation of scenarios in which a protocol error occurs (TCP/IP, FTP, web service, HTTP), based on the implementation of the interface. | |
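
The XSD-based validation of the translation unit output can be automated once the XSD listed in the entry criteria (section 2.7) is available. The sketch below is illustrative Python; the instance and schema file names are placeholders.

    # Validate a generated message/file against the interface XSD.
    # File names are placeholders; the XSD is an entry-criteria deliverable.
    from lxml import etree

    def validate_against_xsd(instance_path, xsd_path):
        schema = etree.XMLSchema(etree.parse(xsd_path))
        document = etree.parse(instance_path)
        if schema.validate(document):
            return []
        return [str(error) for error in schema.error_log]

    # Example usage: errors = validate_against_xsd("ccda_out.xml", "interface.xsd")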
2.5 TEST APPROACH(ES)
[The section below describes the overall testing approach to be used to test the interface.]
Test case document: The test case document outlines all the fields that need to be tested based on the interface requirements. All test items listed in section 2.4 will be validated against the test case document.
2.5.1 Test Approach for CCDA OUT Testing
The CCDA OUT file and the SDA are compared to ensure that clinical data is received correctly, as per the defined mapping and translation unit. The mapping document outlines the field mappings and the required/optional fields. The section below outlines the approach followed to validate the CCDA OUT comparison.
1. The SDA file is obtained from the Clinical Viewer.
2. The soapUI tool is used to consume the web service and generate the CCDA OUT file for the patient. Detailed steps to create the CCDA OUT file are defined in the attached document (an illustrative scripted alternative is sketched after this list).
3. NIST validation is performed to ensure that no XML format issues persist in the CCDA OUT file.
4. A document describing how to generate the CCDA OUT file using the soapUI tool is attached.
5. CCDA OUT file validation includes HTML validation and XML file validation.
6. The CCDA OUT HTML instance is compared against the SDA to ensure that all information on the web page matches the values defined in the SDA.
7. The CCDA OUT XML instance is compared against the SDA to ensure that the right information flows into the CCDA XML file.
8. The following sections of the SDA are compared with the CCDA OUT file:
   a. Patient
   b. Allergy
   c. History
   d. Problems
   e. Laboratory
   f. Medications
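
Where a scripted alternative to the soapUI steps is useful, the web service call in step 2 can also be issued from a small script. The sketch below is purely illustrative Python: the endpoint URL, SOAPAction, and request envelope are placeholders, and the real values come from the attached soapUI document and the interface DID.

    # Illustrative SOAP call to retrieve a CCDA OUT document for one patient.
    # Endpoint, SOAPAction, and envelope are placeholders, not the real service.
    import requests

    ENDPOINT = "https://hie.example.org/ccda/service"   # placeholder
    SOAP_ACTION = "urn:getCCDADocument"                  # placeholder

    ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
      <soapenv:Body>
        <getCCDADocument>
          <patientId>TEST-PATIENT-001</patientId>
        </getCCDADocument>
      </soapenv:Body>
    </soapenv:Envelope>"""

    response = requests.post(
        ENDPOINT,
        data=ENVELOPE.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": SOAP_ACTION},
        timeout=30,
    )
    response.raise_for_status()
    with open("ccda_out.xml", "wb") as out_file:
        out_file.write(response.content)  # response body would contain/wrap the CCDA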
All issues/defects are documented and tracked in the QuanTM tool.
[Link to Test Case document on QuanTM]
2.5.2 Test Approach for ITS Results Testing
ITS results testing includes validation of the inbound and outbound messages after the message is processed. It validates the HIE interfaces for HL7 message translation. When an HL7 message is triggered from the source system, it is processed in the HIE engine and the output file is sent to the destination system. Mapping rules and the message structure are defined in the translation unit so that the message is processed as per the interface requirement.
The section below outlines the steps to be followed for validation and the validation scope for ITS results.
1. The HL7 message is triggered to the HIE interface using Interface Explorer (an illustrative scripted alternative is sketched after this list).
2. The IP address and port are configured based on the HIE interface.
3. Changes to the HL7 message are incorporated based on the provider.
4. Once the message is triggered through the simulator, it is received by the inbound thread of the HIE and passes through the translation unit.
5. After the message is processed, the output file is generated in the outbound queue.
6. HL7 message validation includes validation of the following:
   a. BBK-PTH-Map
   b. LAB-Map
   c. ITS-MAP
   d. MICRO-MAP
7. All segment and field values are compared between the inbound and outbound messages to verify the output. Results are compared based on the mapping guide. An output result template is attached for reference.
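
Where a scripted trigger is preferred over the Interface Explorer/simulator in steps 1-4, an HL7 v2 message can be pushed to the configured IP and port. The sketch below is illustrative Python and assumes the HIE inbound interface accepts MLLP-framed messages over TCP/IP; the host, port, and sample message are placeholders.

    # Send one HL7 v2 message to the HIE inbound interface over MLLP.
    # Host, port, and the sample result message are illustrative placeholders.
    import socket

    HOST, PORT = "hie-test.example.org", 6661   # placeholders; use the configured interface IP/port
    VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"      # MLLP framing bytes

    message = "\r".join([
        "MSH|^~\\&|SIMULATOR|LAB|HIE|HOSP|20160415120000||ORU^R01|MSG0001|T|2.3",
        "PID|1||123456^^^MRN||DOE^JOHN||19700101|M",
        "OBX|1|NM|GLU^Glucose||95|mg/dL|70-110|N|||F",
    ]).encode("utf-8")

    with socket.create_connection((HOST, PORT), timeout=10) as conn:
        conn.sendall(VT + message + FS + CR)    # MLLP frame: <VT> message <FS><CR>
        ack = conn.recv(4096)                   # interface should return an HL7 ACK
        print(ack.decode("utf-8", errors="replace"))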
All the aforementioned HL7 messages are validated to verify the inbound and outbound message records. The attached document provides detailed steps to validate the HL7 messages.
2.5.3 Test Approach for CCDA IN Testing
Integration testing for CCDA IN includes validation of the CCDA IN file and the SDA file after the message is processed. The CCDA IN file is received by the HIE from the EMR system. The HIE system translates the CCDA IN file, and the SDA is obtained after processing the CCDA IN message.
Testing ensures that clinical data is received correctly, as per the defined mapping and translation unit. The mapping document outlines the field mappings and the required/optional fields. The following sections of the CCDA IN file are compared with the SDA:
a. Patient Information
b. Record Target
c. Participant Encounter
d. Medications
e. Diagnostic Results
f. Result
g. Result Observation
h. Problems
i. Procedures
j. Encounters
k. Immunizations
l. Medication Activity
m. Plan of Care
n. Immunizations Administered
o. Social History
p. Vital Signs
An output result template is attached for reference.
2.6 TEST PASS / FAIL CRITERIA
2.6.1 Test Execution Criteria for CCDA OUT Testing
The following sections of the SDA are compared with the CCDA OUT file, and all values populated in the SDA file must pass through correctly for the test execution to pass:
a. Patient
b. Allergy
c. History
d. Problems
e. Laboratory
f. Medications
[The Test execution summary report provides the execution status and the defects to be tracked.]
[Link to Test summary report document on QuanTM.]
2.6.2 Test Execution Criteria for ITS Results
HL7 message validation test execution criteria include validation of the following:
a. BBK-PTH-Map
b. LAB-Map
c. ITS-MAP
d. MICRO-MAP
2.7 TEST ENTRY / EXIT CRITERIA
[The section below describes the entry criteria used to start testing and the exit criteria used to determine when to stop testing.]
Prerequisites / entry criteria:
Design Integration document
Interface architecture
XSD
File protocol (TCP/IP, FTP, Web service, HTTP)
Build delivery document
Access to process, thread, and process logs to validate message processing
Environment readiness
Exit criteria:
All high-priority bugs have been fixed and closed.
All low-priority bugs have been addressed.
Test case execution completed in QuanTM for all covered scenarios.
2.8 TEST DEFECTS
All defects identified during execution will be tracked in the QuanTM tool. Defects can be logged with reference to a test case or as ad hoc test execution. A Test Lab has been defined in QuanTM for CCDA IN, CCDA OUT, and HL7 Results. All defects will be linked to their respective requirements, and a test execution report can be obtained.
2.9 TEST DELIVERABLES
[The section below describes the deliverables that will result from the testing process.]
All test cases are uploaded to QuanTM; links are provided below:
Test case document link (QuanTM link)
Test summary report link (QuanTM link)

2.10 ENVIRONMENT ROLES AND RESPONSIBILITIES
[Define the roles and responsibilities of the persons who will be responsible for, or interface with, the test environment.]

Role | Staff Member | Responsibilities
Release Manager | Bill Smith | Responsible for overall establishment, coordination and support of the test environment.
Test Manager | Mary Jones | Responsible for advising the Release Manager of environment requirements for planning, establishment and ongoing support.
Project Manager | Cathy Simons | Escalation point for environment issues.

3 TEST STRATEGY
3.1 TEST LEVEL RESPONSIBILITY
[Detail the testing levels expected to be applied and who has primary (P) and secondary (S) responsibility for performing the testing (example below).]

Test Level | Proj Team | Business | External Party
Unit Testing | | |
Integration Testing | | |
Security Testing | | |
Connectivity Testing | | |
User Acceptance Testing | | |
Production Verification Testing | | |
TEST PLAN APPROVAL
[List the individuals whose signatures are required.]
Signature:
Date:
Print Name:
Title:
Role:
Signature:
Date:
Print Name:
Title:
Role:
Signature:
Date:
Print Name:
Title:
Role: