Show simple item record

dc.contributor.author: Ozturk, Saban
dc.contributor.author: Celik, Emin
dc.contributor.author: Cukur, Tolga
dc.date.accessioned: 2024-03-12T19:29:13Z
dc.date.available: 2024-03-12T19:29:13Z
dc.date.issued: 2023
dc.identifier.issn: 0020-0255
dc.identifier.issn: 1872-6291
dc.identifier.uri: https://doi.org/10.1016/j.ins.2023.118938
dc.identifier.uri: https://hdl.handle.net/20.500.12450/2232
dc.description.abstract: The increasing utilization of medical imaging technology with digital storage capabilities has facilitated the compilation of large-scale data repositories. Fast access to image samples with similar appearance to suspected cases in these repositories can help establish a consulting system for healthcare professionals, and improve diagnostic procedures while minimizing processing delays. However, manual querying of large repositories is labor intensive. Content-based image retrieval (CBIR) offers an automated solution based on quantitative assessment of image similarity using image features in a latent space. Since conventional methods based on hand-crafted features typically show poor generalization performance, learning-based CBIR methods have received attention recently. A common framework in this domain involves classifier-guided models that are trained to detect different image classes. Similarity assessments are then performed on the features captured by the intermediate stages of the trained models. While classifier-guided methods are powerful in inter-class discrimination, they are suboptimally sensitive to within-class differences in image features. An alternative framework instead performs task-agnostic training to learn an embedding space that enforces the representational discriminability of images. Within this representational-learning framework, a powerful method is triplet-wise learning that addresses the deficiencies of point-wise and pair-wise learning in characterizing the similarity relationships between image classes. However, the traditional triplet loss enforces separation between only a subset of image samples within the triplet via a manually-set constant margin value, so it can lead to suboptimal segregation of opponent classes and limited generalization performance.
To address these limitations, we introduce a triplet-learning method for automated querying of medical image repositories based on a novel Opponent Class Adaptive Margin (OCAM) loss. To maintain optimally discriminative representations, OCAM considers relationships among all image pairs within the triplet and utilizes an adaptive margin value that is automatically selected per dataset and during the course of training iterations. CBIR performance of OCAM is compared against state-of-the-art loss functions for representational learning on three public databases (gastrointestinal disease, skin lesion, lung disease). On average, OCAM shows an mAP performance of 86.30% on the KVASIR dataset, 70.30% on the ISIC 2019 dataset, and 85.57% on the X-RAY dataset. Comprehensive experiments in each application domain demonstrate the superior performance of OCAM over competing triplet-wise methods (by 1.52%), classifier-guided methods (by 2.29%), and non-triplet representational-learning methods (by 4.56%). [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Elsevier Science Inc [en_US]
dc.relation.ispartof: Information Sciences [en_US]
dc.rights: info:eu-repo/semantics/openAccess [en_US]
dc.subject: CBIR [en_US]
dc.subject: Medical image retrieval [en_US]
dc.subject: Triplet [en_US]
dc.subject: Representational learning [en_US]
dc.subject: Hashing [en_US]
dc.title: Content-based medical image retrieval with opponent class adaptive margin loss [en_US]
dc.type: article [en_US]
dc.department: Amasya Üniversitesi [en_US]
dc.authorid: Çukur, Tolga/0000-0002-2296-851X
dc.authorid: OZTURK, Saban/0000-0003-2371-8173
dc.identifier.volume: 637 [en_US]
dc.relation.publicationcategory: Article - International Peer-Reviewed Journal - Institutional Faculty Member [en_US]
dc.identifier.scopus: 2-s2.0-85153475334 [en_US]
dc.identifier.doi: 10.1016/j.ins.2023.118938
dc.department-temp: [Ozturk, Saban; Celik, Emin; Cukur, Tolga] Bilkent Univ, Dept Elect & Elect Engn, TR-06800 Ankara, Turkiye; [Ozturk, Saban; Celik, Emin; Cukur, Tolga] Bilkent Univ, Natl Magnet Resonance Res Ctr, TR-06800 Ankara, Turkiye; [Ozturk, Saban] Amasya Univ, Dept Elect & Elect Engn, TR-05001 Amasya, Turkiye; [Cukur, Tolga] Bilkent Univ, Sabuncu Brain Res Ctr, Neurosci Program, TR-06800 Ankara, Turkiye [en_US]
dc.identifier.wos: WOS:000990675100001 [en_US]
dc.authorwosid: Çukur, Tolga/Z-5452-2019
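The abstract above contrasts the traditional triplet loss, which separates only a subset of sample pairs via a hand-set constant margin, with a loss that considers all pairs within the triplet and adapts the margin to the data. As a rough illustration of those two ideas only, here is a minimal numpy sketch; this is not the OCAM loss itself, and both the extra pair term and the batch-statistics margin rule below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def triplet_loss_all_pairs(anchor, positive, negative, margin):
    # Standard triplet loss uses only the (anchor, positive) and
    # (anchor, negative) distances; here we also penalize a small
    # (positive, negative) separation, loosely mirroring the idea of
    # involving all pairs within the triplet. Illustrative sketch only.
    d_ap = np.linalg.norm(anchor - positive)
    d_an = np.linalg.norm(anchor - negative)
    d_pn = np.linalg.norm(positive - negative)
    return max(0.0, d_ap - d_an + margin) + max(0.0, d_ap - d_pn + margin)

def adaptive_margin(embeddings, labels):
    # Hypothetical adaptive rule: set the margin to the mean gap between
    # inter-class and intra-class distances in the current batch, so it
    # tracks the data rather than being a hand-tuned constant.
    intra, inter = [], []
    n = len(embeddings)
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(embeddings[i] - embeddings[j])
            (intra if labels[i] == labels[j] else inter).append(d)
    if not intra or not inter:
        return 0.2  # fallback constant when a batch lacks both pair types
    return max(0.0, float(np.mean(inter) - np.mean(intra)))
```

With a constant margin, a triplet that already satisfies the margin contributes zero loss even if the positive and negative lie close to each other; the extra pair term and the data-driven margin are one way to keep such triplets informative.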


Files in this item:

Files  Size  Format  View

There are no files associated with this item.

This item appears in the following collection(s).
