Bibliographic Information
Integrating wearable muscle sensing for hand-based interactions in mixed reality = 혼합 현실 손 기반 상호작용을 위한 착용형 근육 센싱 결합
Title / Author: Integrating wearable muscle sensing for hand-based interactions in mixed reality = 혼합 현실 손 기반 상호작용을 위한 착용형 근육 센싱 결합 / Hyung-il Kim.
Publication: [Daejeon : Korea Advanced Institute of Science and Technology, 2023].

Holdings Information

Registration Number: 8041446
Location / Call Number: Academic Cultural Complex, B1 Preservation Stacks / DGCT 23007

Status: Available (not for loan)


Abstract

This dissertation proposes a method for utilizing wearable muscle sensing to recognize users' hand force and touch information in mixed reality (MR) environments and thereby enhance hand-based interactions and collaborations. Although mixed reality technologies, encompassing augmented reality (AR) and virtual reality (VR), have advanced, hand-based interaction with real and virtual 3D content still relies heavily on the visual information provided by head-mounted displays, a limitation that hinders natural and effective interaction. To address this, the dissertation first proposes a comprehensive framework for integrating wearable muscle sensing into MR interactions and collaborations, taking into account the user's posture, the objects being interacted with, and the sensor signals. Next, a system is developed that applies wearable electromyography (EMG) sensors to measure a worker's hand force in an MR remote collaboration system, and user studies evaluate how visualizing the worker's hand force affects the collaborator's task awareness. The visualization of hand force data positively influenced task comprehension, force perception, object weight estimation, and the collaborator's sense of social presence. Finally, the dissertation proposes a system that uses EMG sensors to estimate accurate touch events and touch intensity in MR interactions, enabling precise, force-sensitive touch input with all fingers. In conclusion, the findings show that muscle sensing, used alongside users' hand information, improves MR interactions and collaborations by providing additional force information and accurate hand gesture recognition.


Additional Bibliographic Information

Call Number: DGCT 23007
Physical Description: vi, 68 p. : ill. ; 30 cm
Language: English
General Note: Author's name in Korean: 김형일
Advisor's name in English: Woontack Woo
Advisor's name in Korean: 우운택
Includes appendix
Thesis: Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology : Graduate School of Culture Technology, 2023
Bibliographic Note: References: p. 61-66
Subjects: Augmented reality
Virtual reality
Mixed reality
Human-computer interaction
Wearable computing
List of Figures

Prototype system overview for the sEMG-based Hand Interaction and Collaboration Framework for Mixed Reality

Basic Collaboration in Mixed Reality

System diagram of the collaboration system with shared hand force

Prototype system overview for remote collaboration between a local worker and a remote expert. The remote expert monitors the local worker's behavior through (a-c) a first-person view or (d-f) a third-person view. The local worker's hand force is measured using an EMG armband and can be visualized by (b, e) changing the color of the hand mesh or (c, f) augmenting a gauge beside the local worker's hand.
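At its core, a hand-mesh force visualization like (b, e) maps a normalized force value to a color. A minimal sketch in Python, where the green-to-red gradient and the [0, 1] force range are illustrative assumptions rather than the dissertation's actual mapping:

```python
def force_to_rgb(force: float) -> tuple[int, int, int]:
    """Map a normalized hand force in [0, 1] to an RGB color,
    interpolating from green (relaxed) to red (maximum grip).

    The linear green-to-red mapping is an illustrative assumption,
    not the dissertation's actual color scheme.
    """
    f = min(max(force, 0.0), 1.0)  # clamp to the valid range
    return (int(255 * f), int(255 * (1.0 - f)), 0)

# Example: a half-strength grip tints the hand mesh yellow-ish
print(force_to_rgb(0.5))  # (127, 127, 0)
```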

System diagram of the prototype system.

Data Collection for the validation study

Handheld object and grasping methods used in the validation study. (a) Cylindrical grasp and (b) Spherical grasp

Placement of the cylinders for the user study. The cylinders are marked in alphabetical order. Left: First-person view (FPV), Right: Third-person view (TPV)

Flow chart of overall study procedure

The results of the validation study using (a) cylindrical grasp and (b) spherical grasp. The estimated weight is normalized for each participant using the average output with a 1000 g object.
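The per-participant normalization described in this caption amounts to dividing each estimate by that participant's mean output for the 1000 g reference object. A minimal sketch, with assumed variable names:

```python
import numpy as np

def normalize_estimates(estimates: np.ndarray, reference_outputs: np.ndarray) -> np.ndarray:
    """Normalize one participant's weight estimates by that participant's
    average output for the 1000 g reference object, so a value of 1.0
    corresponds to the reference weight. Variable names are assumptions."""
    return estimates / reference_outputs.mean()

# Example: a participant whose 1000 g reference trials averaged 950
ref = np.array([900.0, 950.0, 1000.0])
print(normalize_estimates(np.array([475.0, 950.0, 1425.0]), ref))  # [0.5 1.0 1.5]
```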

Task performance results (P and V: a significant effect of View and Visualization (Vis), respectively): (a) Ordering Time, (b) Ordering Error

Results of subjective measures on Social Presence (P and V: a significant effect of View and Vis, respectively; I: significant interaction effect between independent variables): (a) Aggregated Social Presence (SP), (b) Co-presence (CP), (c) Attentional Allocation (AA), (d) Perceived Message Understanding (PMU)

Results of subjective measures on user's perception of force and weight (V: a significant effect of Vis): (a) Force Perception, (b) Weight Perception

Results of subjective measures on user's mental effort and likability (V: a significant effect of Vis): (a) Subjective Mental Effort, (b) Likability

User preference between hand force visualizations

(a) We propose a system that integrates mixed reality devices with a forearm-worn commodity sEMG sensor for enhanced touch interaction on known flat surfaces. (b) Our system utilizes a wearable sEMG armband to detect the touch event, the finger used for the touch, and the touch pressure. (c) Combining the accurate touch point and pressure, the proposed system can be used for playing a pressure-sensitive musical instrument.

System diagram of the proposed system

Preprocessing of the EMG signal. (a) raw EMG signal, (b) rectified EMG signal, (c) filtered EMG signal in a single channel for 1000 data frames (5 seconds).
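The caption implies a 200 Hz sampling rate (1000 frames over 5 seconds), consistent with commodity sEMG armbands. A minimal sketch of the raw-to-rectified-to-filtered chain, assuming a 4th-order Butterworth low-pass at 5 Hz (the dissertation's exact filter parameters are not given in this record):

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 200  # Hz; 1000 data frames over 5 seconds, as in the caption

def preprocess_emg(raw: np.ndarray, cutoff_hz: float = 5.0, order: int = 4) -> np.ndarray:
    """(a) raw -> (b) full-wave rectified -> (c) low-pass filtered envelope.

    The 5 Hz cutoff and 4th-order Butterworth filter are illustrative
    assumptions, not values reported in the dissertation.
    """
    rectified = np.abs(raw)                              # (b) rectification
    b, a = butter(order, cutoff_hz, btype="low", fs=FS)  # low-pass design
    return filtfilt(b, a, rectified)                     # (c) smooth envelope

# Example on 5 seconds (1000 frames) of synthetic single-channel EMG
raw = np.random.randn(1000)
envelope = preprocess_emg(raw)
```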

Proposed touch detection pipeline

Configuration of the sEMG armband. Red, green, and blue arrows denote the X, Y, and Z axes of the accelerometer and gyroscope, respectively.

Comparison of validation strategies across different sensor configurations.

Overlaid preprocessed EMG signals of channels 1, 4, 5, and 6 caused by 100 taps by each finger of the right hand (a: P2, b: P5). The vertical line at sample 10 represents the time of the touch event.

Touch Pressure Visualization

Data collection setup. A surface-mounted force-sensitive resistor was used to collect ground truth for touch events and touch pressure.

Confusion matrix for 70/30 Split, using 4-channel EMG data

Confusion matrix for 70/30 Split, using IMU data

Confusion matrix for 70/30 Split, using 8-channel EMG data

Confusion matrix for 70/30 Split validation, using all sensor data

Confusion matrix for Leave-One-User-Out cross-validation, using 4-channel EMG data

Confusion matrix for Leave-One-User-Out cross-validation, using IMU data only.

Confusion matrix for Leave-One-User-Out cross-validation, using 8-channel EMG data only.

Confusion matrix for Leave-One-User-Out cross-validation, using all sensor data
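Leave-one-user-out cross-validation holds out all data from one participant per fold, so the confusion matrices above reflect generalization to unseen users, unlike the 70/30 split. A sketch using scikit-learn's LeaveOneGroupOut, where the feature shape, label set, and classifier are assumptions for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import LeaveOneGroupOut

# X: per-touch feature vectors, y: finger labels, groups: participant IDs.
# The feature shape, label set, and classifier are illustrative assumptions.
X = np.random.randn(300, 16)
y = np.random.randint(0, 5, size=300)   # e.g. one label per finger
groups = np.repeat(np.arange(10), 30)   # e.g. 10 participants x 30 touches

logo = LeaveOneGroupOut()
y_true, y_pred = [], []
for train_idx, test_idx in logo.split(X, y, groups):
    clf = RandomForestClassifier(random_state=0)
    clf.fit(X[train_idx], y[train_idx])   # never sees the held-out user
    y_true.extend(y[test_idx])
    y_pred.extend(clf.predict(X[test_idx]))

# Confusion matrix aggregated over all held-out users
print(confusion_matrix(y_true, y_pred))
```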

Playing Musical Instrument

Pressure-aware Touch User Interface