Hand Washing Gesture Recognition Using Synthetic Dataset


Date

2025

Authors

Ozakar, Rustem
Gedikli, Eyup

Publisher

MDPI

Abstract

Hand hygiene is paramount for public health, especially in critical sectors such as healthcare and the food industry. Ensuring compliance with recommended hand washing gestures is vital, necessitating autonomous evaluation systems that leverage machine learning techniques. However, the scarcity of comprehensive datasets poses a significant challenge. This study addresses the issue by presenting an open synthetic hand washing dataset, created using 3D computer-generated imagery and comprising 96,000 frames (equivalent to 64 minutes of footage) that cover eight gestures performed by four characters in four diverse environments. The synthetic dataset includes RGB images, depth/isolated depth images, and hand mask images. Using this dataset, four neural network models (Inception-V3, Yolo-8n, Yolo-8n segmentation and PointNet) were trained for gesture classification. The models were subsequently evaluated on a large real-world hand washing dataset, achieving classification accuracies of 56.9% for Inception-V3, 76.3% for Yolo-8n and 79.3% for Yolo-8n segmentation. These findings underscore the effectiveness of synthetic data in training machine learning models for hand washing gesture recognition.
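The abstract reports frame-level classification accuracy per model on real-world footage. As a minimal sketch of that metric (not the paper's code; the gesture labels and function names below are hypothetical), overall and per-class accuracy across the eight gesture classes could be computed like this:

```python
from collections import Counter

# Eight hand washing gesture classes, as in the paper's dataset.
# These label names are illustrative placeholders, not the paper's exact labels.
GESTURES = [f"gesture_{i}" for i in range(8)]

def classification_accuracy(y_true, y_pred):
    """Fraction of frames whose predicted gesture matches the ground truth."""
    if len(y_true) != len(y_pred):
        raise ValueError("label lists must have equal length")
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def per_class_accuracy(y_true, y_pred):
    """Accuracy computed separately for each gesture class present in y_true."""
    totals = Counter(y_true)
    correct = Counter(t for t, p in zip(y_true, y_pred) if t == p)
    return {g: correct[g] / totals[g] for g in totals}
```

A per-class breakdown like this is useful alongside the headline numbers, since visually similar washing gestures (e.g. palm-to-palm variants) tend to dominate the confusion.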

Description

ORCID iDs: Ozakar, Rustem/0000-0002-7724-6848; Gedikli, Eyup/0000-0002-7212-5457

Keywords

Computer Vision, Machine Learning, Hand Washing, Hand Gesture Recognition, Synthetic Dataset, Rendering

WoS Q

Q2

Scopus Q

Q1

Source

Journal of Imaging

Volume

11

Issue

7
