As mobile devices such as smartphones and tablet computers become increasingly powerful, it is important to embed in them the capability to track eye movements and to support gaze-based applications. The papers reviewed below offer several such uses and applications.
A comparison of eye movement filters used in HCI (human-computer interaction) was offered by Oleg Špakov. The study compares various real-time filters designed to denoise eye-movement signals, evaluating the filters' output against two reference signals. Eye movements measured with eye-tracking systems and commonly used for HCI purposes always contain some noise due to imperfections of the measuring technology, so the first step in mapping gaze onto on-screen objects is filtering and smoothing the raw data. The strongest disadvantage of such filtering is the delay it introduces in updating the gaze-point position.
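The smoothing-versus-delay trade-off can be illustrated with a simple exponential moving average smoother. This is only a sketch for illustration, not one of the specific filters from Špakov's study; the function name and parameters are assumptions.

```python
# Hypothetical illustration of smoothing raw gaze samples with an
# exponential moving average (EMA). Not a filter from Spakov's study;
# it only demonstrates the smoothing-vs-delay trade-off.

def ema_filter(samples, alpha=0.5):
    """Smooth a sequence of (x, y) gaze samples.

    Smaller alpha -> stronger noise suppression, but more lag
    before the reported gaze point catches up with the true gaze.
    """
    smoothed = []
    sx, sy = samples[0]
    for x, y in samples:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    return smoothed

# A sudden jump in gaze position (a saccade) reaches the smoothed
# output only gradually -- this is the update delay noted above.
raw = [(0.0, 0.0)] * 5 + [(100.0, 0.0)] * 5
out = ema_filter(raw, alpha=0.5)
```

After the jump at sample 5, the smoothed x-coordinate approaches 100 only asymptotically (50, 75, 87.5, …), which is exactly the gaze-point update delay the study identifies as the filters' main cost.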
Torsten Bierz and Achim Ebert offer GPU-accelerated gesture detection for real-time interaction. They developed a system that facilitates optical tracking, without optical markers, for human gesture-controlled real-time interaction with a software system. The real-time capability is achieved through a GPU-accelerated computation method. For dark rooms and direct sunlight, one option is automated recalibration of the camera to compensate for different exposure conditions.
Location-aware mobile eye tracking was examined by Peter Kiefer, Florian Straub, and Martin Raubal. Context-aware mobile computing strives to offer services on a mobile device that support the user within a context, location being the most common type. Compared to static eye tracking, mobile eye tracking (MET) methods provide a good way to study visual attention with the freedom of movement and variable contexts that characterize natural vision.
Voice activity detection from gaze in video-mediated communication was offered by Michal Hradis, Shahram Eivazi, and Roman Bednarik. Their paper describes estimating the active speaker in multi-party video-mediated (MPVM) communication from the gaze data and measurements of one of the participants. Eye gaze is central for grounding during communication, since gaze signals are vital for collecting and providing information for mutual understanding. In systems supporting MPVM communication, a principal problem is presenting information from a remote location on a limited visualization device.
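The underlying intuition can be sketched in a few lines: listeners tend to look at whoever is currently talking, so the participant whose video tile attracts the most gaze samples in a short window is a plausible guess for the active speaker. This is a hypothetical simplification for illustration only, not the authors' actual estimation model; the function and participant names are assumptions.

```python
from collections import Counter

def estimate_active_speaker(gaze_targets):
    """Guess the active speaker from an observer's gaze.

    gaze_targets: list of participant IDs, one per gaze sample,
    naming the video tile the observer fixated during the window.
    Returns the most-fixated participant, or None for an empty window.
    (Illustrative majority-vote sketch, not the paper's method.)
    """
    if not gaze_targets:
        return None
    counts = Counter(gaze_targets)
    return counts.most_common(1)[0][0]

# Hypothetical 6-sample gaze window over three participants.
window = ["alice", "alice", "bob", "alice", "carol", "alice"]
speaker = estimate_active_speaker(window)
```

A real system would combine such gaze evidence with other measurements and smooth the estimate over time, but the sketch shows why one participant's gaze alone already carries speaker information.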