CR Toolkit™ Enables Your Organization to Expertly Work with Study Reports, Consistently and Accurately Every Time
Standard Call Interface
Fully Documented Validation Package & User Manual
The toolkit is compatible with SDTM and ADaM data standards.
- Analysis Data Reviewer’s Guide (ADRG)
- Study Data Reviewer’s Guide (SDRG)
- CDISC SEND, SDTM, and ADaM
- Reduces lines of code and development time by over 50%
- Reduces variability by standardizing statistical operations
- Enables consistency throughout your programming environment
- Streamlines the QC process
- Adds validation
Easy to modify for execution with systems such as EntimICE, and fully compatible with PhUSE and CDISC initiatives.
Why CR Toolkit™?
We believe that data transformed properly into drug information is at the heart of every FDA-approved, life-saving drug... and organized, efficient, accurate data enables those life-saving drugs to gain approval and reach patients faster.
The CR Toolkit™ facilitates preparation of reports that conform to sponsors’ standards, enabling you to assemble and produce consistent reports. It works with any data structure and content and supports CDISC standards (SDTM and ADaM).
Over multiple studies and the long path to submission, many things can change, including protocols, personnel, and reporting requirements. These changes affect your data and, in turn, the reports that carry the information for your submission.
The Risk: Reports in various formats, generated from data stored on local computers, on-premises servers, databases, off-site servers, and in the cloud, must be brought together for analysis.
The Challenge: Reports are processed using non-standardized code and formats from multiple sources, written by different programmers and vendors with varying skill levels and standards.
The Solution: CR Toolkit™ reduces programmer variability, drives overall standardization, and reduces programming and validation efforts.
Manual Reporting vs. CR Toolkit™
Manual Reporting: When code is developed over time by multiple programmers with varying skill levels, standardization becomes difficult.
CR Toolkit™: Macros tested by experienced programmers are used with datasets organized in standard formats.

Manual Reporting: Teams face quality control challenges because tables, listings, and graphs are generated using non-standard code.
CR Toolkit™: Quality control becomes a clear process that enables efficient review and management of the tables, listings, and graphs output by the macro library.

Manual Reporting: Variability is introduced when non-standard code is used by multiple programmers on data in different formats without a unified QC process.
CR Toolkit™: Standardized code and a consistent QC process produce macro outputs with higher quality and fewer errors.

Manual Reporting: Together, these issues cause inconsistency in the validation of outputs and can put deliverables at risk.
CR Toolkit™: Macro standardization and process control combine to yield a simplified, reliable validation process that ensures your deliverables are correct and ready for submission.