dc.contributor.author | Unal, Yavuz | |
dc.contributor.author | Taspinar, Yavuz Selim | |
dc.contributor.author | Cinar, Ilkay | |
dc.contributor.author | Kursun, Ramazan | |
dc.contributor.author | Koklu, Murat | |
dc.date.accessioned | 2024-03-12T19:28:57Z | |
dc.date.available | 2024-03-12T19:28:57Z | |
dc.date.issued | 2022 | |
dc.identifier.issn | 1936-9751 | |
dc.identifier.issn | 1936-976X | |
dc.identifier.uri | https://doi.org/10.1007/s12161-022-02362-8 | |
dc.identifier.uri | https://hdl.handle.net/20.500.12450/2123 | |
dc.description.abstract | Coffee is an important export product of the tropical countries where it is grown. Separating coffee beans by quality and detecting variety forgery are therefore important tasks worldwide. Currently, manual inspection makes the sorting process inconsistent, time-consuming, and subjective, so automated systems are needed to eliminate these drawbacks. The aim of this study is to classify 3 different coffee bean types from their images through transfer learning with 4 different Convolutional Neural Network-based models: SqueezeNet, Inception V3, VGG16, and VGG19. The dataset used to train the models was created specifically for this study. A total of 1554 coffee bean images of the Espresso, Kenya, and Starbucks Pike Place coffee types were collected with the constructed imaging mechanism, and model training and testing were carried out with these images. The cross-validation method was used to test the models. Classification accuracy, Precision, Recall, and F-1 Score metrics were used for the detailed analysis of the models' performances, and ROC curves were used to analyze their discriminative ability. As a result of the tests, the average classification accuracy was determined as 87.3% for SqueezeNet, 81.4% for Inception V3, 78.2% for VGG16, and 72.5% for VGG19. These results demonstrate that SqueezeNet is the most successful model. This study may contribute to coffee bean separation in the industry. | en_US |
dc.description.sponsorship | Scientific Research Coordinator of Selcuk University [22111002] | en_US |
dc.description.sponsorship | This project was supported by the Scientific Research Coordinator of Selcuk University with the project number 22111002. | en_US |
dc.language.iso | eng | en_US |
dc.publisher | Springer | en_US |
dc.relation.ispartof | Food Analytical Methods | en_US |
dc.rights | info:eu-repo/semantics/closedAccess | en_US |
dc.subject | Coffee beans | en_US |
dc.subject | Deep learning | en_US |
dc.subject | CNN | en_US |
dc.subject | Transfer learning | en_US |
dc.title | Application of Pre-Trained Deep Convolutional Neural Networks for Coffee Beans Species Detection | en_US |
dc.type | article | en_US |
dc.department | Amasya Üniversitesi | en_US |
dc.authorid | KOKLU, Murat/0000-0002-2737-2360 | |
dc.authorid | Taspinar, Yavuz Selim/0000-0002-7278-4241 | |
dc.authorid | KURSUN, Ramazan/0000-0002-6729-1055 | |
dc.authorid | cinar, ilkay/0000-0003-0611-3316 | |
dc.authorid | UNAL, Yavuz/0000-0002-3007-679X | |
dc.identifier.volume | 15 | en_US |
dc.identifier.issue | 12 | en_US |
dc.identifier.startpage | 3232 | en_US |
dc.identifier.endpage | 3243 | en_US |
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institution Faculty Member | en_US |
dc.identifier.scopus | 2-s2.0-85135329785 | en_US |
dc.identifier.doi | 10.1007/s12161-022-02362-8 | |
dc.department-temp | [Unal, Yavuz] Amasya Univ, Dept Comp Engn, Amasya, Turkey; [Taspinar, Yavuz Selim] Selcuk Univ, Doganhisar Vocat Sch, Konya, Turkey; [Cinar, Ilkay; Koklu, Murat] Selcuk Univ, Dept Comp Engn, Konya, Turkey; [Kursun, Ramazan] Selcuk Univ, Guneysinir Vocat Sch, Konya, Turkey | en_US |
dc.identifier.wos | WOS:000835575300001 | en_US |
dc.authorwosid | KOKLU, Murat/Y-7354-2018 | |
dc.authorwosid | Taspinar, Yavuz Selim/AAZ-9537-2021 | |
dc.authorwosid | cinar, ilkay/GLS-2427-2022 | |
dc.authorwosid | KURSUN, Ramazan/ACG-4351-2022 | |
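A minimal sketch of the transfer-learning setup described in the abstract above: fine-tuning an ImageNet-pretrained SqueezeNet to classify 3 coffee bean types. This is not the authors' code; the dataset directory, image preprocessing, and hyperparameters are assumptions for illustration only.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

# Standard ImageNet preprocessing; 227x227 is SqueezeNet's native input size
# (whether the study used the same preprocessing is an assumption).
preprocess = transforms.Compose([
    transforms.Resize((227, 227)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical folder layout with one subdirectory per class:
# coffee_beans/train/{espresso,kenya,pike_place}/
train_set = datasets.ImageFolder("coffee_beans/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Load a pretrained SqueezeNet and replace its 1000-class classifier head
# with a 3-class head, as is typical for transfer learning.
model = models.squeezenet1_1(weights=models.SqueezeNet1_1_Weights.IMAGENET1K_V1)
model.classifier[1] = nn.Conv2d(512, 3, kernel_size=1)
model.num_classes = 3

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Short fine-tuning loop; the epoch count is an assumption.
model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The same recipe applies to the other backbones named in the abstract (Inception V3, VGG16, VGG19) by swapping the model constructor and its classifier layer; evaluation with cross-validation, Precision, Recall, F-1 Score, and ROC curves would be layered on top of this training loop.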