An experimental apparatus that remotely presents 3D visual, auditory, and haptic information was constructed. Vision was presented by displaying stereo camera images on a 3D monitor; audition was presented by playing synthesized sounds, generated from signals obtained by visual recognition of the manipulated object, through a 4-channel speaker system; and haptics were presented by bilaterally controlling the angles of a 2-DoF plate robot and a joystick, conveying both position and force sensation. Subjects performed remote tasks under various sensory-presentation conditions, including artificial delay times and synthesized sounds, and task efficiency and operability were evaluated. The experimental results showed, among other findings, that when remote operation is performed in three-dimensional space under communication delay, presenting depth, velocity, and acceleration information through audition is an effective way to convey the factors that must be perceived.
An experimental system that presents 3D visual-auditory-haptic information from its slave side to its master side was constructed. Visual information captured by a stereo camera was displayed on a 3D monitor; artificially generated auditory information, based on target information measured by a visual recognition system, was played through a 4-channel speaker system; and haptic information was transmitted via a bilateral controller that established 2-DoF position tracking and an action-reaction law between a plate robot and a joystick. Task efficiency and operability were evaluated through remote-operation tests conducted by human participants under various conditions of delay and generated audio. The results verify that depth, velocity, and acceleration information are worth presenting as audio, especially under communication delay.
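The bilateral controller mentioned above enforces position tracking between master and slave while feeding the reaction force back to the operator. A minimal sketch of such a scheme, assuming a symmetric PD position-exchange law with hypothetical gains and point-mass device models (the paper's actual controller and parameters are not given here), is:

```python
import numpy as np

DT = 0.001           # control period [s] (assumed)
KP, KD = 100.0, 5.0  # PD gains (hypothetical values)
M = 0.5              # effective mass of each device [kg] (assumed)

def bilateral_step(xm, vm, xs, vs, f_op, f_env):
    """One control step for 2-DoF master/slave states (NumPy arrays).

    f_op : force applied by the operator to the master
    f_env: force applied by the environment to the slave
    Returns updated (xm, vm, xs, vs).
    """
    # PD control force enforcing position tracking between the two devices
    f_ctl = KP * (xs - xm) + KD * (vs - vm)
    # Action-reaction law: the devices receive equal and opposite control forces,
    # so the operator feels the reaction transmitted from the slave side
    am = (f_op + f_ctl) / M
    a_s = (f_env - f_ctl) / M
    vm, xm = vm + am * DT, xm + (vm + am * DT) * DT
    vs, xs = vs + a_s * DT, xs + (vs + a_s * DT) * DT
    return xm, vm, xs, vs

# Usage: the operator pushes the master in free space; the slave follows.
xm = vm = xs = vs = np.zeros(2)
for _ in range(2000):
    xm, vm, xs, vs = bilateral_step(xm, vm, xs, vs,
                                    f_op=np.array([1.0, 0.0]),
                                    f_env=np.zeros(2))
```

In free space the slave converges to the master position up to a small steady-state offset set by the PD gains; when the slave contacts an environment, the resulting `f_ctl` pushes back on the master, which is what lets the operator perceive force. Communication delay would be modeled by buffering the states exchanged between the two sides.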