
Real-time artificial intelligence for detection of upper gastrointestinal cancer by endoscopy: a multicentre, case-control, diagnostic study - 03/12/19

Doi : 10.1016/S1470-2045(19)30637-0 
Huiyan Luo, Prof MD a, Guoliang Xu, Prof MD b, Chaofeng Li, PhD c, Longjun He, MD b, Linna Luo, MD a, Zixian Wang, MD a, Bingzhong Jing, MS c, Yishu Deng, MS c, Ying Jin, MD a, Yin Li, MD b, Bin Li, MS c, Wencheng Tan, MD b, Caisheng He, PhD c, Sharvesh Raj Seeruttun, PhD d, Qiubao Wu, Prof MD f, Jun Huang, MD f, De-wang Huang, Prof MD g, Bin Chen, Prof MD h, Shao-bin Lin, Prof MD i, Qin-ming Chen, Prof MD i, Chu-ming Yuan, Prof MD j, Hai-xin Chen, Prof MD j, Heng-ying Pu, PhD e, Feng Zhou, PhD e, Yun He, PhD e, Rui-hua Xu, Prof MD a, *
a Department of Medical Oncology, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Sun Yat-sen University Cancer Center, Guangzhou, China 
b Department of Endoscopy, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Sun Yat-sen University Cancer Center, Guangzhou, China 
c Artificial Intelligence Laboratory, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Sun Yat-sen University Cancer Center, Guangzhou, China 
d Department of Gastric Surgery, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Sun Yat-sen University Cancer Center, Guangzhou, China 
e Medical Administration Department, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Sun Yat-sen University Cancer Center, Guangzhou, China 
f Department of Endoscopy, Jiangxi Cancer Hospital, Nanchang, China 
g Department of Digestive Internal Medicine, Wuzhou Red Cross Hospital, Wuzhou, China 
h Department of Digestive Internal Medicine, The North Guangdong People’s Hospital, Shaoguan, China 
i Department of Digestive Internal Medicine, Puning People’s Hospital, Puning, China 
j Department of Digestive Internal Medicine, Jieyang People’s Hospital, Jieyang, China 

* Correspondence to: Dr Rui-hua Xu, Department of Medical Oncology, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Sun Yat-sen University Cancer Center, Guangzhou 510060, China

Summary

Background

Upper gastrointestinal cancers (including oesophageal cancer and gastric cancer) are the most common cancers worldwide. Artificial intelligence platforms using deep learning algorithms have made remarkable progress in medical imaging but their application in upper gastrointestinal cancers has been limited. We aimed to develop and validate the Gastrointestinal Artificial Intelligence Diagnostic System (GRAIDS) for the diagnosis of upper gastrointestinal cancers through analysis of imaging data from clinical endoscopies.

Methods

This multicentre, case-control, diagnostic study was done in six hospitals of different tiers (ie, municipal, provincial, and national) in China. The images of consecutive participants, aged 18 years or older, who had not had a previous endoscopy were retrieved from all participating hospitals. All patients with upper gastrointestinal cancer lesions (including oesophageal cancer and gastric cancer) that were histologically proven malignancies were eligible for this study. Only images with standard white light were deemed eligible. The images from Sun Yat-sen University Cancer Center were randomly assigned (8:1:1) to the training and intrinsic verification datasets for developing GRAIDS, and the internal validation dataset for evaluating the performance of GRAIDS. Its diagnostic performance was evaluated using an internal and prospective validation set from Sun Yat-sen University Cancer Center (a national hospital) and additional external validation sets from five primary care hospitals. The performance of GRAIDS was also compared with endoscopists with three degrees of expertise: expert, competent, and trainee. The diagnostic accuracy, sensitivity, specificity, positive predictive value, and negative predictive value of GRAIDS and endoscopists for the identification of cancerous lesions were evaluated by calculating the 95% CIs using the Clopper-Pearson method.
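The exact (Clopper-Pearson) binomial interval named above can be computed from quantiles of the beta distribution. A minimal sketch in Python using `scipy` (the function and the example counts are illustrative, not the study's own code or data):

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided Clopper-Pearson CI for k successes in n trials.

    Lower bound: alpha/2 quantile of Beta(k, n - k + 1);
    upper bound: 1 - alpha/2 quantile of Beta(k + 1, n - k).
    Degenerate cases k = 0 and k = n are handled explicitly.
    """
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# Illustrative only: 942 correct calls out of 1000 images (not study data)
print(clopper_pearson(942, 1000))
```

Unlike the normal approximation, this interval never extends outside [0, 1] and keeps its nominal coverage even for proportions near 0 or 1, which is why it is favoured for diagnostic accuracy measures.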

Findings

1 036 496 endoscopy images from 84 424 individuals were used to develop and test GRAIDS. The diagnostic accuracy in identifying upper gastrointestinal cancers was 0·955 (95% CI 0·952–0·957) in the internal validation set, 0·927 (0·925–0·929) in the prospective set, and ranged from 0·915 (0·913–0·917) to 0·977 (0·977–0·978) in the five external validation sets. GRAIDS achieved diagnostic sensitivity similar to that of the expert endoscopist (0·942 [95% CI 0·924–0·957] vs 0·945 [0·927–0·959]; p=0·692) and superior sensitivity compared with competent (0·858 [0·832–0·880], p<0·0001) and trainee (0·722 [0·691–0·752], p<0·0001) endoscopists. The positive predictive value was 0·814 (95% CI 0·788–0·838) for GRAIDS, 0·932 (0·913–0·948) for the expert endoscopist, 0·974 (0·960–0·984) for the competent endoscopist, and 0·824 (0·795–0·850) for the trainee endoscopist. The negative predictive value was 0·978 (95% CI 0·971–0·984) for GRAIDS, 0·980 (0·974–0·985) for the expert endoscopist, 0·951 (0·942–0·959) for the competent endoscopist, and 0·904 (0·893–0·916) for the trainee endoscopist.
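The accuracy, sensitivity, specificity, and predictive values reported above all follow directly from a 2×2 confusion matrix of test calls against histology. A minimal illustration (the counts are hypothetical, not taken from the study):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic measures from a 2x2 confusion matrix."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv":         tp / (tp + fp),   # positive predictive value
        "npv":         tn / (tn + fn),   # negative predictive value
    }

# Hypothetical counts: 90 true positives, 20 false positives,
# 10 false negatives, 880 true negatives
m = diagnostic_metrics(90, 20, 10, 880)
print(m["sensitivity"], m["ppv"])
```

Note that, as in the abstract's figures, PPV and NPV depend on disease prevalence in the validation set, whereas sensitivity and specificity do not; this is why a model and an endoscopist can have similar sensitivity but different predictive values.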

Interpretation

GRAIDS achieved high diagnostic accuracy in detecting upper gastrointestinal cancers, with sensitivity similar to that of expert endoscopists and superior to that of non-expert endoscopists. This system could assist community-based hospitals in improving their effectiveness in upper gastrointestinal cancer diagnoses.

Funding

The National Key R&D Program of China, the Natural Science Foundation of Guangdong Province, the Science and Technology Program of Guangdong, the Science and Technology Program of Guangzhou, and the Fundamental Research Funds for the Central Universities.

© 2019 Elsevier Ltd. All rights reserved.
Vol 20 - N° 12

P. 1645-1654 - December 2019
