Abstract Systematic reviews (SRs) are widely used in evidence-based medicine. Conducting a systematic review requires intensive mental effort, especially during the study screening process. This challenge has motivated the development of intelligent software. This study examined and compared the performance, workload, and user experience of two systematic review tools, Colandr (with artificial intelligence (AI) features) and Covidence (without AI features), through a mixed-method usability study. The results showed that reviewers achieved higher precision in citation screening with Colandr than with Covidence. However, the user experience with Colandr was not optimal due to problems in its user interface design. We therefore suggest that the design and development of AI-enabled SR software emphasize the usability of the interface and apply user-centered design principles.