Hand Washing Gesture Recognition Using Synthetic Dataset
| dc.contributor.author | Ozakar, Rustem | |
| dc.contributor.author | Gedikli, Eyup | |
| dc.date.accessioned | 2026-03-26T14:56:50Z | |
| dc.date.available | 2026-03-26T14:56:50Z | |
| dc.date.issued | 2025 | |
| dc.description | Ozakar, Rustem/0000-0002-7724-6848; Gedikli, Eyup/0000-0002-7212-5457 | en_US |
| dc.description.abstract | Hand hygiene is paramount for public health, especially in critical sectors like healthcare and the food industry. Ensuring compliance with recommended hand washing gestures is vital, necessitating autonomous evaluation systems leveraging machine learning techniques. However, the scarcity of comprehensive datasets poses a significant challenge. This study addresses this issue by presenting an open synthetic hand washing dataset, created using 3D computer-generated imagery, comprising 96,000 frames (equivalent to 64 min of footage), encompassing eight gestures performed by four characters in four diverse environments. This synthetic dataset includes RGB images, depth/isolated depth images and hand mask images. Using this dataset, four neural network models, Inception-V3, Yolo-8n, Yolo-8n segmentation and PointNet, were trained for gesture classification. The models were subsequently evaluated on a large real-world hand washing dataset, demonstrating successful classification accuracies of 56.9% for Inception-V3, 76.3% for Yolo-8n and 79.3% for Yolo-8n segmentation. These findings underscore the effectiveness of synthetic data in training machine learning models for hand washing gesture recognition. | en_US |
| dc.identifier.doi | 10.3390/jimaging11070208 | |
| dc.identifier.issn | 2313-433X | |
| dc.identifier.scopus | 2-s2.0-105011613749 | |
| dc.identifier.uri | https://doi.org/10.3390/jimaging11070208 | |
| dc.identifier.uri | https://hdl.handle.net/20.500.14901/2940 | |
| dc.language.iso | en | en_US |
| dc.publisher | MDPI | en_US |
| dc.relation.ispartof | Journal of Imaging | en_US |
| dc.rights | info:eu-repo/semantics/openAccess | en_US |
| dc.subject | Computer Vision | en_US |
| dc.subject | Machine Learning | en_US |
| dc.subject | Hand Washing | en_US |
| dc.subject | Hand Gesture Recognition | en_US |
| dc.subject | Synthetic Dataset | en_US |
| dc.subject | Rendering | en_US |
| dc.title | Hand Washing Gesture Recognition Using Synthetic Dataset | en_US |
| dc.type | Article | en_US |
| dspace.entity.type | Publication | |
| gdc.author.id | Ozakar, Rustem/0000-0002-7724-6848 | |
| gdc.author.id | Gedikli, Eyup/0000-0002-7212-5457 | |
| gdc.author.scopusid | 57190744807 | |
| gdc.author.scopusid | 8507392800 | |
| gdc.author.wosid | Ozakar, Rustem/H-3843-2018 | |
| gdc.author.wosid | Gedikli, Eyup/U-5309-2017 | |
| gdc.description.department | Erzurum Technical University | en_US |
| gdc.description.departmenttemp | [Ozakar, Rustem] Erzurum Tech Univ, Fac Engn & Architecture, Department Comp Engn, TR-25100 Erzurum, Turkiye; [Gedikli, Eyup] Trabzon Univ, Fac Comp & Informat Sci, Department Comp Engn, TR-61300 Trabzon, Turkiye | en_US |
| gdc.description.issue | 7 | en_US |
| gdc.description.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US |
| gdc.description.scopusquality | Q1 | |
| gdc.description.volume | 11 | en_US |
| gdc.description.woscitationindex | Emerging Sources Citation Index | |
| gdc.description.wosquality | Q2 | |
| gdc.identifier.pmid | 40710595 | |
| gdc.identifier.wos | WOS:001553316900001 | |
| gdc.index.type | Scopus | |
| gdc.virtual.author | Özakar, Rüstem | |
| relation.isAuthorOfPublication | 53915913-b510-4a92-a0bf-7dc3350e4810 | |
| relation.isAuthorOfPublication.latestForDiscovery | 53915913-b510-4a92-a0bf-7dc3350e4810 |
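
The abstract describes training image classifiers (Inception-V3, Yolo-8n variants, PointNet) on synthetic RGB frames to recognize eight hand washing gestures, then evaluating them on real footage. As an illustrative sketch only, not the authors' code, the shape of such a classifier can be shown with a minimal PyTorch model: a small stand-in CNN backbone (the `GestureNet` name and architecture are hypothetical) with an 8-class output head, run on a dummy batch of RGB frames.

```python
# Illustrative sketch (not the paper's implementation): an 8-class hand
# washing gesture classifier, mirroring the setup of training on synthetic
# RGB frames and predicting one of eight gestures per frame.
import torch
import torch.nn as nn

NUM_GESTURES = 8  # the synthetic dataset covers eight washing gestures


class GestureNet(nn.Module):
    """Tiny CNN stand-in for the Inception-V3 backbone used in the paper."""

    def __init__(self, num_classes: int = NUM_GESTURES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (N, 32, 1, 1)
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)  # (N, 32)
        return self.classifier(h)        # (N, NUM_GESTURES) logits


model = GestureNet().eval()
frames = torch.randn(4, 3, 224, 224)  # dummy batch of 4 RGB frames
with torch.no_grad():
    logits = model(frames)
preds = logits.argmax(dim=1)  # one gesture index per frame
```

In practice one would swap `GestureNet` for a pretrained Inception-V3 with its final layer replaced by an 8-way head, train on the synthetic frames, and measure accuracy on the real-world evaluation set, as the abstract reports.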
