The pre-production review is the final step in building out a survey project. It occurs before the project builder requests to move their project to production. The process is informal. The focus is on the project's readiness to send out surveys and collect responses.
Why is the review required?
The effort to learn REDCap survey functionality is substantial. The feedback we've heard over the years is that setting up survey projects, especially those that are ambitious in scope or complexity, can be challenging, even for experienced REDCap project builders. The suite of tools builders use to set up survey collection is not always intuitive or easy to navigate. Additionally, while there is a lot of documentation, it has gaps.
Conducting a review is our way of 1) acknowledging that it's hard, even for expert builders, to navigate survey project building activities and get all the details right, and 2) helping builders address this by working with them to identify setup issues that are easy to miss or misunderstand, and that would prevent participants from completing one of the project's surveys or lead to a security, compliance, or protocol violation.
Who should attend the review?
The review is conducted with the project builder who has completed survey training and who is in charge of the survey setup in the project.
What should I do to prepare for the review?
- Complete survey training.
- If this is the first time you've met with us at a drop-in about your survey project, prepare to share the big picture of your survey plans, to help us understand the project's requirements for sending out surveys and collecting responses.
- Test, vet and validate all surveys, survey settings, and survey related features that you've set up in the project.
- If your survey project includes an electronic consent form, test, vet and validate that the setup of your consent form or information sheet meets the IRB's guidance and requirements for using REDCap for consenting.
For most projects, the review wraps up a conversation that has been ongoing over the course of one or more prior drop-in sessions. This is because many survey projects are complex or considerable in scope, and many survey project builders need hands-on help setting up the advanced, less intuitive survey features that best address their project's needs. Examples include: setting up the Survey Queue to ensure only consented participants have access to the study survey, and setting up Automated Survey Invitations to email a participant a survey invitation 30 days after they enrolled in the study. In these situations, where the builder has already been working with the REDCap Team at prior drop-in sessions, many of the items in the list above have already been addressed.
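As an illustration of why testing matters, the 30-day invitation example above would typically be implemented with conditional logic in the Automated Survey Invitation settings. The sketch below assumes a hypothetical date field named [enrollment_date] and a consent form instrument named "consent" (your field and instrument names will differ); it uses REDCap's standard datediff function and form-status notation:

```
// Send the invitation once 30 or more days have passed since enrollment,
// and only if the consent form has been marked Complete (status = 2).
datediff([enrollment_date], 'today', 'd') >= 30 and [consent_complete] = '2'
```

Logic like this is easy to get subtly wrong (for example, reversing the date arguments or referencing a field that doesn't exist), which is why test runs with sample records are part of the preparation checklist above.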
What kinds of things will the REDCap Team be reviewing my project for?
The REDCap Team will be reviewing with you:
- Your project's surveys and their corresponding survey settings.
- How you've set up the project to get your survey links to your participants to meet the requirements of your study.
- You and your team's efforts to test and validate the first two items.
It's similar to a quality control check. The objective is to determine if there is a problem that needs to be addressed before the project is approved for production. A common problem, and an example from real reviews we've conducted: an Information Sheet with the Auto-continue to next survey setting enabled.
We can address some problems, such as the example above, during the review. These scenarios typically offer a learning opportunity for the builder, along with instructions to complete some minimal additional testing prior to submitting the production request.
Some problems mean the project is not yet ready for production. Some examples from real reviews we've conducted include: 1) a consent form that needs significant modifications to meet the IRB's guidance and requirements for using REDCap for consenting research participants, 2) one or more surveys that have not had any testing or validation, and 3) Automated Survey Invitations with invalid logic that was not discovered during test runs.
In these scenarios, we identify training and learning resources for the builder to follow up on, provide guidance about the kind of additional testing that will be needed, and ask the builder to return for another review.
How do I sign up for a review?
Currently, we conduct reviews at our drop-in sessions, and prioritize attendees who have signed up for reviews. Please only sign up for a review after you have tested, vetted and validated all surveys, survey settings, and survey related features that you've set up in the project.