Testing Your RTSM for Your Next Clinical Trial

Regulatory requirements make the documented testing and validation of the electronic systems used in clinical trials critically important to the success of any clinical study. This is because approval authorities will reject clinical data collected by what regulators deem to be substandard means. (For examples, see 21 CFR Part 11 and EU Annex 11.)

There are two categories of project-level RTSM testing* to be considered:

Step I: Vendor validation of system functionality and the confirmation of system compliance with approved user specifications
This validation is performed by the RTSM vendor and should be documented with a written plan, test scripts, and detailed reports. The results of the internal testing, both positive and negative, should confirm the proper functioning of customizations/changes and of items deemed high risk.

Step II: User Acceptance Testing (UAT) which confirms the system does what the study needs it to do
UAT is performed by the CRO and/or Sponsor. User Acceptance Testing is the final step in testing a project-specific RTSM system and should be independent of vendor testing.

Here is an example scenario:

You have completed the user requirements for your clinical project and defined how they will be implemented by your RTSM vendor. Your vendor has built the system and completed their verification testing to confirm it meets user specifications. Your team has also had a chance to join the vendor for an informal walk-through of the system to confirm, in broad terms, that it will support your project.

Now it is time for you to complete your User Acceptance Testing (UAT) to formally confirm that the system, as developed, meets the detailed needs of the study. Robust testing dictates that users do not simply repeat the test steps and test-subject selection the vendor employed. The optimal goal is new tests that challenge the system from the user’s ‘real use’ perspective.

This is where common difficulties may arise.

  • What do you test?
  • How does one go about creating test cases and/or scripts?
  • Should you include both negative and positive testing or just the latter?
  • Can you use the vendor’s verification documents or must you create testing materials of your own?

Vendor vs. User Test Cases

Vendor test scripts/cases can be leveraged as informative tools that show what has and has not been tested during the vendor’s verification testing. The UAT team should review them before finalizing its own test plans and test cases, using the vendor’s scripts to help determine that each of the project’s User Functions and User Rules (defined below) has been tested to your satisfaction.

Frame UAT in terms of functions and users

The project specifications include two kinds of requirements: functions that users can perform (i.e., User Functions, such as Screening a Subject or Receiving a Shipment) and rules for the system to follow (i.e., User Rules, such as checking that a new subject’s age is within a specified range for the study and issuing a warning if it is not).

In a system with a fully validated CORE technology (which includes most mainstream RTSM systems), the vendor has already confirmed that whatever configuration data is entered into the system during project setup will work appropriately.

A good example of this is testing the age range of new subjects when they are added to the system. Perhaps the vendor didn’t explicitly test this rule in their verification testing because they’ve built a system where age range is configurable. During their extensive testing of their core system, they’ve confirmed that the configuration item works regardless of the minimum and maximum ages specified.

Therefore, testing for age restrictions may not be included in the vendor’s verification testing. What most likely will be part of the vendor’s verification testing is a test that confirms the configuration for the current project’s age range has been set correctly, for example, a range of 18-65.

This common restriction on verification testing means you’ll often want to test a User Function using different steps than the vendor did. Doing so lets you confirm that the system responds with the appropriate error messages, that it allows you to complete a task (or prevents completion, when appropriate), and that the correct information is ultimately stored in the system.

 

Let’s drill down a little further:

The User Rule for this example is, “Subjects must be between 18 and 65 years of age.”

The steps to test this rule are to add a new subject, then enter birthdays at each boundary:

  • Out of Range Test 1: Enter a birthday so that tomorrow is the subject’s 18th birthday
    Expected Result – a warning is generated by the system stating that the subject’s age is outside the expected range

  • Out of Range Test 2: Enter a birthday so that today is the subject’s 66th birthday
    Expected Result – a warning is generated by the system stating that the subject’s age is outside the expected range

  • In Range Test 1: Enter a birthday so that today is the subject’s 18th birthday
    Expected Result – subject is added to the system

  • In Range Test 2: Enter a birthday so that tomorrow is the subject’s 66th birthday
    Expected Result – subject is added to the system
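For illustration, the four boundary cases above can be sketched in Python. The age calculation and the 18–65 limits below are hypothetical stand-ins for the RTSM’s configurable rule, not the vendor’s actual implementation:

```python
from datetime import date

# Hypothetical configuration values mirroring the example rule (18-65).
MIN_AGE, MAX_AGE = 18, 65

def age_on(birth_date: date, as_of: date) -> int:
    """Whole years of age on the given date."""
    had_birthday = (as_of.month, as_of.day) >= (birth_date.month, birth_date.day)
    return as_of.year - birth_date.year - (0 if had_birthday else 1)

def age_in_range(birth_date: date, as_of: date) -> bool:
    """True if the subject may be added; False would trigger the warning."""
    return MIN_AGE <= age_on(birth_date, as_of) <= MAX_AGE

today = date(2024, 6, 15)  # fixed "today" so the cases are reproducible

# Out of Range Test 1: tomorrow is the 18th birthday -> still 17 today
assert not age_in_range(date(2006, 6, 16), today)
# Out of Range Test 2: today is the 66th birthday -> 66
assert not age_in_range(date(1958, 6, 15), today)
# In Range Test 1: today is the 18th birthday -> 18
assert age_in_range(date(2006, 6, 15), today)
# In Range Test 2: tomorrow is the 66th birthday -> still 65 today
assert age_in_range(date(1958, 6, 16), today)
```

Note how each test sits exactly one day on either side of a boundary; this is the kind of edge-case selection that distinguishes user testing from simply re-running the vendor’s scripts.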

By reviewing the vendor’s test cases, you’ll be able to see how comprehensively they’ve tested, or not tested, for issues relating to dates and age. This knowledge will help you determine whether you want to conduct additional tests. For instance, you may want to test for proper handling of invalid dates (such as 32-Jan-2000), or a future date (such as 01-Jan-2030), or even so-called garbage text (“&^*^$^@HU”).
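Those additional negative tests can be sketched the same way. The `parse_birth_date` function below is a hypothetical stand-in for the RTSM’s date-entry validation, assuming a DD-Mon-YYYY entry format; the real system’s format and error handling will differ:

```python
from datetime import date, datetime

def parse_birth_date(text: str):
    """Return a date for a valid DD-Mon-YYYY string, else None.

    Hypothetical stand-in for the RTSM's date-entry check.
    """
    try:
        return datetime.strptime(text, "%d-%b-%Y").date()
    except ValueError:
        return None

# Invalid date: day 32 does not exist, so entry should be rejected
assert parse_birth_date("32-Jan-2000") is None
# Garbage text should also be rejected outright
assert parse_birth_date("&^*^$^@HU") is None
# A future date parses cleanly, so it needs its own range check
future = parse_birth_date("01-Jan-2030")
assert future == date(2030, 1, 1)
```

The last case is the instructive one: a future birthday is a perfectly well-formed date, so only an explicit rule (like the age range above) will catch it.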

Please note that while you do not want to repeat the testing done by the vendor, you do want to confirm that every User Function performs to your project’s expectations.

With UAT, your task is to create a set of documents that fulfill the adage: “Say what you’re going to do, do it, and show that you did it”. This is accomplished by creating a Test Plan, executing the plan, and creating a UAT Summary Report describing the results of your testing.
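As a sketch of that plan/execute/report cycle, the hypothetical structures below (illustrative only, not a prescribed document format) record each test case’s expected and actual results and roll them up into a one-line summary:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestCase:
    case_id: str            # e.g. "OOR-1" (hypothetical numbering scheme)
    description: str        # what the tester will do
    expected: str           # "say what you're going to do"
    actual: str = ""        # "do it" - filled in during execution
    passed: Optional[bool] = None

@dataclass
class TestPlan:
    title: str
    cases: List[TestCase] = field(default_factory=list)

    def execute(self, case_id: str, actual: str, passed: bool) -> None:
        """Record the outcome of one executed test case."""
        for case in self.cases:
            if case.case_id == case_id:
                case.actual, case.passed = actual, passed

    def summary(self) -> str:
        """"Show that you did it": one line for the UAT Summary Report."""
        done = [c for c in self.cases if c.passed is not None]
        failed = [c for c in done if not c.passed]
        return f"{len(done)}/{len(self.cases)} executed, {len(failed)} failed"

plan = TestPlan("Age rule UAT", [
    TestCase("OOR-1", "Birthday so tomorrow is the 18th birthday",
             "Out-of-range warning"),
])
plan.execute("OOR-1", "Out-of-range warning shown", passed=True)
print(plan.summary())  # 1/1 executed, 0 failed
```

The point is traceability: every case states its expected result before execution, and the summary is derived from the recorded outcomes rather than written separately.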

 

For more information on UAT, download our detailed Guide.

 

*Project-level testing is separated from the validation of the core system which the vendor alone is responsible for and should be assessed/audited prior to starting development on a specific project.