Gesture Control
The introduction of gestures as a supplementary input modality is of increasing interest in human-computer interaction (HCI) design, especially for 3D computer environments. We use gesture control in the VRGeo Demonstrator. Hidden Markov Models, well known from the field of speech recognition, are employed to represent and recognize predefined gestures. Gestures are defined as symbols, such as simple geometric shapes or Roman letters. They are extracted from a stream of three-dimensional optical tracking data, which is resampled, reduced to 2D, and quantized to serve as input to discrete Hidden Markov Models. A set of prerecorded training data is used to learn the parameters of the models, and recognition is achieved by evaluating the trained models on an observed symbol sequence.
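The preprocessing and evaluation steps described above can be sketched roughly as follows. This is a minimal illustration, not the VRGeo implementation: the function names, the fixed resampling length, the projection that simply drops the depth axis, and the 8-symbol direction codebook are all assumptions made for the example.

```python
import numpy as np

def resample(points, n=32):
    """Resample a tracked 3D stroke to n points spaced evenly along its arc length."""
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    dist = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, dist[-1], n)
    return np.stack([np.interp(targets, dist, points[:, k])
                     for k in range(points.shape[1])], axis=1)

def project_to_2d(points3d):
    """Reduce to 2D; here we assume the gesture plane and drop the depth axis."""
    return points3d[:, :2]

def quantize(points2d, n_symbols=8):
    """Quantize each segment direction into one of n_symbols discrete codes."""
    d = np.diff(points2d, axis=0)
    angles = np.arctan2(d[:, 1], d[:, 0])              # in [-pi, pi]
    return ((angles + np.pi) / (2 * np.pi) * n_symbols).astype(int) % n_symbols

def forward_log_likelihood(obs, start, trans, emit):
    """Scaled forward algorithm for a discrete HMM: log P(obs | model)."""
    alpha = start * emit[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]  # predict, then weight by emission
        s = alpha.sum()
        log_p += np.log(s)
        alpha /= s                            # rescale to avoid underflow
    return log_p
```

Recognition then amounts to scoring the quantized symbol sequence with `forward_log_likelihood` under each trained gesture model and picking the model with the highest log-likelihood; the model parameters themselves would come from Baum-Welch training on the prerecorded data.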

Related Thesis

Gesture Control in Virtual Environments

Sven Seele: Fachhochschule Bonn-Rhein-Sieg, 2008