Review Article
Smartphones versus goggles for video-oculography: current status and future direction
Pouya Barahim Bastani1,2, Shervin Badihian2,3, Vidith Phillips1,4, Hector Rieiro1,2, Jorge Otero-Millan5, Nathan Farrell1,2, Max Parker6, David Newman-Toker1,2,7, Ali Saber Tehrani1
Research in Vestibular Science 2024;23(3):63-70.
DOI: https://doi.org/10.21790/rvs.2024.009
Published online: September 15, 2024

1Department of Neurology, Johns Hopkins University School of Medicine, Baltimore, MD, USA

2Armstrong Institute Center for Diagnostic Excellence, Baltimore, MD, USA

3Neurological Institute, Cleveland Clinic, Cleveland, OH, USA

4Division of Biomedical Informatics & Data Science, Department of Internal Medicine, Johns Hopkins University School of Medicine, Baltimore, MD, USA

5Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, Berkeley, CA, USA

6Department of Neurology, NYU Langone Health, New York, NY, USA

7Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA

Corresponding author: Ali Saber Tehrani, Department of Neurology, Johns Hopkins Hospital, Suite 215, Pathology 2 Building, 600 N. Wolfe Street, Baltimore, MD 21287, USA. E-mail: ali.tehrani@jhmi.edu
Received: May 26, 2024; Revised: July 18, 2024; Accepted: July 31, 2024

© 2024 The Korean Balance Society

This is an open access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Assessment of eye movements is the cornerstone of diagnosing vestibular disorders and differentiating central from peripheral causes of dizziness. Nonetheless, accurate assessment of eye movements is challenging, especially in the emergency department and primary care settings. To overcome this challenge, clinicians objectively measure eye movements using devices like video-oculography (VOG) goggles, which provide a video recording of the eye and quantified eye position traces. However, despite the value of VOG goggles in studying eye movements, barriers such as high prices and the need for dedicated operators have limited their use to subspecialty clinics. Recent advancements in the hardware and software of smartphones have positioned them as potential alternatives to VOG goggles that can reliably record and quantify eye movements. Although currently not as accurate as VOG goggles, smartphones can provide a cheap, widely available tool that can be used in various medical settings and even at home by patients. We review the current state and future directions of the devices that can be used for recording and quantifying eye movements.
Eye movement assessment is valuable in evaluating acute and chronic conditions affecting the extraocular muscles and associated nerves, inner ears, and brain. These conditions range from benign, such as benign paroxysmal positional vertigo (BPPV), to life-threatening, like vestibular strokes [1]. Moreover, patients with dizziness, who account for approximately 5 million annual emergency department (ED) visits (3%–5% of all visits), require eye movement evaluation for accurate diagnosis and management [2,3]. Studies have shown that bedside examination of eye movements with the Head Impulse, Nystagmus, Test of Skew (HINTS) examination battery is superior to magnetic resonance imaging with diffusion-weighted imaging in detecting strokes during the first 24 hours in patients with continuous dizziness and spontaneous nystagmus [4]. BPPV, in turn, is diagnosed by detecting a characteristic nystagmus after positional maneuvers [5]. While these tests have a high diagnostic yield, they require expertise in eye movement examination, detection of subtle abnormalities, and interpretation of the findings. Moreover, studies have shown that not all physicians perform HINTS examinations reliably [6]. Unfortunately, those with adequate expertise are not readily available throughout the United States [7].
Quantitative recording of eye movements addresses some of these challenges, provides the opportunity to detect subtle abnormalities, and can promote equitable access to eye movement examination in emergency rooms and underserved areas [8]. Here, we examine how recent technological advancements have changed the landscape of quantified eye tracking and offer insights into its future direction and impact on patient care.
Video-oculography (VOG) is a method of quantified eye tracking that uses video recordings of the eyes. It is traditionally performed with VOG goggles: a pair of goggles with one or several built-in infrared cameras that enable eye detection and recording of eye movements during various tests [9]. The goggles must be connected to a computer running software that processes the input and produces quantitative data and eye movement traces from the recordings. The clinician can then evaluate the quantified metrics (e.g., the gain of the head impulse test [HIT], nystagmus velocity, or eye position traces), along with the videos of the eyes and, in some cases, videos from the examination room depicting the tests and positional maneuvers, using the same software.
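To make the kind of output described above concrete, the following is a minimal sketch, in Python, of how such software might derive one quantified metric, the slow-phase velocity of nystagmus, from a raw eye position trace. The sampling rate, quick-phase threshold, and synthetic trace are illustrative assumptions, not any vendor's actual algorithm.

```python
# Minimal sketch: estimating nystagmus slow-phase velocity (SPV) from an
# eye-position trace. All numbers here are illustrative assumptions.
import numpy as np

def slow_phase_velocity(eye_deg: np.ndarray, fs: float = 250.0,
                        quick_phase_thresh: float = 100.0) -> float:
    """Median slow-phase velocity (deg/s) of a nystagmus recording.

    eye_deg: horizontal eye position in degrees sampled at fs Hz.
    quick_phase_thresh: velocities above this (deg/s) are treated as
    quick phases (resetting saccades) and excluded.
    """
    velocity = np.gradient(eye_deg) * fs                      # deg/s
    slow = velocity[np.abs(velocity) < quick_phase_thresh]    # drop quick phases
    return float(np.median(slow))                             # robust estimate

# Synthetic right-beating nystagmus: slow drift at -10 deg/s with a
# resetting quick phase every 0.4 s (a sawtooth waveform).
fs = 250.0
t = np.arange(0, 2, 1 / fs)
eye = -10.0 * (t % 0.4)
print(f"estimated SPV: {slow_phase_velocity(eye, fs):.1f} deg/s")  # ~ -10.0
```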
Several VOG goggles are approved by the U.S. Food and Drug Administration (FDA). ICS Impulse (Natus Medical, Inc.) and EyeSeeCam (Interacoustics) are among the most widely used models in research and clinics [10,11]. The price of the goggles depends mainly on the model, the features offered, and the complexity of the analysis provided, with higher-end devices routinely used in clinics and research costing as much as US $40,000. These goggles require a trained technician to calibrate the instrument before each recording and to select the appropriate tests in the software before performing them on the patient. The technician must also have adequate expertise in accurately performing the ocular and vestibular tests to obtain correct measurements and output. VOG goggles can quantify the components of the bedside HINTS and positional eye movement examinations by detecting and measuring the slow-phase velocity of nystagmus, measuring the HIT gain for the different semicircular canals and detecting catch-up saccades, and measuring the vertical misalignment of the eyes during the test of skew [12]. Furthermore, these goggles use infrared cameras at high frame rates (typically about 250 frames per second), enabling them to capture subtle or brief eye movements invisible to the naked eye; this makes them especially valuable for detecting small covert saccades (i.e., catch-up saccades that occur during the HIT itself) [13]. The infrared sensors also allow eye movement recording in the dark, which is essential when visual fixation must be removed (some types of nystagmus are suppressed by fixation) and which generally yields better-quality eye tracking than visible light [14]. These features make VOG goggles a fitting tool for objectively screening for vestibular strokes in acutely dizzy patients in the ED and for evaluating eye movements in the outpatient setting. Studies have shown that quantitative HINTS recorded by VOG goggles can be used to diagnose vestibular strokes [15]. A multicenter randomized controlled clinical trial (Acute Video-oculography for Vertigo in Emergency Rooms for Rapid Triage [AVERT]; NCT02483429) is studying the use of VOG goggles in the initial evaluation of patients with acute vertigo; preliminary results indicate that remote evaluation of VOG recordings by experts is more accurate than ED evaluation [16]. The final results are expected to be published later this year.
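As a rough illustration of the two HIT metrics just described, the sketch below computes a vestibulo-ocular reflex (VOR) gain from paired head and eye velocity traces and flags catch-up saccades as large departures from the ideal compensatory response. The thresholds and the synthetic impulse are assumptions for demonstration, not the goggles' proprietary algorithms.

```python
# Illustrative sketch of HIT quantification; all thresholds are assumptions.
import numpy as np

def hit_gain(eye_vel: np.ndarray, head_vel: np.ndarray,
             head_thresh: float = 50.0) -> float:
    """VOR gain: magnitude of compensatory eye velocity relative to head
    velocity over the samples where the head is actually moving."""
    moving = np.abs(head_vel) > head_thresh
    return float(-np.sum(eye_vel[moving]) / np.sum(head_vel[moving]))

def catch_up_saccade_times(eye_vel: np.ndarray, head_vel: np.ndarray,
                           fs: float = 250.0,
                           thresh: float = 100.0) -> np.ndarray:
    """Times (s) where eye velocity departs sharply from the ideal
    compensatory response (-head velocity): catch-up saccades, whether
    covert (during the impulse) or overt (after it)."""
    residual = eye_vel + head_vel        # zero for a perfect VOR
    return np.flatnonzero(np.abs(residual) > thresh) / fs

# Synthetic impulse: bell-shaped head velocity peaking at 200 deg/s, with a
# hypofunctional VOR responding at only 60% of the ideal gain.
fs = 250.0
t = np.arange(0, 0.3, 1 / fs)
head = 200.0 * np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))
eye = -0.6 * head
print(f"gain = {hit_gain(eye, head):.2f}")   # ~0.60, i.e., reduced VOR
```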
Despite the advantages of VOG goggles in quantifying HINTS battery and their potential use for screening vestibular strokes, the high price of these goggles, the need for a trained technician, and their limited availability in the EDs throughout the country are barriers to their widespread use. On the other hand, smartphones can offer potential solutions to overcome these obstacles and act as reasonable alternatives for recording and quantifying eye movements [17,18].
Smartphones are ubiquitous devices whose cameras and sensors allow them to analyze facial features, detect attention, and track gaze. These capabilities have made smartphones a focus of interest for researchers in the field of eye movement tracking [19]. Multiple studies have investigated how videos captured by smartphones can be used to detect different features of eye movements. Most of the published literature on smartphone eye tracking examines how eye movement videos obtained by a phone can later be analyzed to quantify particular eye movement features [17]. Despite the broad scope and utility of eye tracking in medicine, here we discuss only studies with adult participants that focused on quantifying components of the HINTS examination that could aid in diagnosing stroke in patients with acute vestibular syndrome. The majority of studies on quantifying nystagmus have relied on induced nystagmus in healthy volunteers in a controlled environment [17]. Nonetheless, Kıroğlu and Dağkıran [20] showed that video recordings of nystagmus during Ménière attacks (without quantification) can aid in establishing the diagnosis earlier in this patient group. Studies on quantifying HIT gain, on the other hand, have included both healthy volunteers and patients with vestibular neuritis with abnormal vestibulo-ocular reflex and catch-up saccades [21,22].
Friedrich et al. [23] showed proof of concept for a convolutional neural network model that can quantify eye movement recordings of optokinetic nystagmus. In a similar study, van Bonn et al. [24] showed that a smartphone app can reliably detect nystagmus induced by optokinetic or caloric stimulation in healthy volunteers. Evaluating the HIT, Kuroda et al. [21] mounted a smartphone on a goggle frame and used the phone camera instead of VOG goggles to record the eyes while performing the HIT. Their results indicate that phone videos can detect low HIT gains comparably to VOG goggles and can detect catch-up saccades, although with lower sensitivity.
To provide a comprehensive alternative to VOG goggles in screening for vestibular strokes, our team has developed an eye and head tracking application that utilizes the augmented reality (ARKit) features of the iPhone’s (Apple Inc.) front camera and automatically detects facial features, pupil position, and phone distance from the face [18]. In addition, the app provides time-stamped data on the eye position coordinates and other recorded variables (e.g., head position, lighting), along with graphs depicting the eye and head positions during the recording, similar to those of the VOG goggles [22].
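For illustration only, the snippet below shows how the kind of time-stamped export described above might be handled downstream, deriving an eye-velocity trace from successive positions. The column names and values are hypothetical, not the app's actual data schema.

```python
# Hypothetical rows mimicking a time-stamped eye/head export (assumed schema).
import pandas as pd

df = pd.DataFrame({
    "timestamp_s":  [0.000, 0.033, 0.067, 0.100],   # ~30 fps frame times
    "eye_x_deg":    [0.0, 0.4, 0.9, 1.3],           # horizontal eye position
    "head_yaw_deg": [0.0, -0.5, -1.1, -1.6],        # head pose from face tracking
})
# Derive an eye-velocity trace (deg/s) from successive positions.
df["eye_x_vel"] = df["eye_x_deg"].diff() / df["timestamp_s"].diff()
print(df)
```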
We showed how this app's combined head and eye tracking can be used to measure HIT gain and to detect vestibular hypofunction and catch-up saccades [22]. We also outlined the calibration process used during recording and how the final measurements were obtained from the raw data [25]. Recently, we shared data on the agreement between measurements from our smartphone app and the VOG goggles, which indicated a high correlation between the two in quantifying optokinetic nystagmus in healthy volunteers (Spearman correlation of 0.98 for horizontal and 0.94 for vertical nystagmus). Moreover, we showed that an average calibration paradigm was as accurate as individual calibration for each participant, saving considerable time during recording [26]. Further work is needed to determine the detection thresholds required for optimal output. Nonetheless, other studies on eye tracking have reported similar differences in accuracy between smartphone tracking and VOG goggles [19].
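The agreement analysis referenced above reduces to a rank correlation between paired measurements; a sketch with placeholder numbers (not study data) is shown below.

```python
# Spearman rank correlation between paired SPV measurements; the values
# below are placeholders, not data from the studies cited above.
from scipy.stats import spearmanr

phone_spv   = [4.1, 8.0, 12.3, 15.9, 20.2]   # deg/s, hypothetical
goggles_spv = [4.0, 8.4, 12.6, 11.9, 19.8]   # deg/s, hypothetical

rho, p = spearmanr(phone_spv, goggles_spv)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")   # rho = 0.90 here
```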
While the technical challenges of accurately performing the eye movement tests remain unchanged, smartphone applications make recording eye movements as simple as recording a selfie video, eliminating the need for a separate device and software. Moreover, smartphones are widely available, so even clinicians who see few dizzy patients can obtain quantified eye movement traces without purchasing expensive devices. While the studies our team and research teams around the globe have conducted on smartphone eye tracking show promising results, several factors must be considered when comparing the accuracy of smartphone recordings with VOG goggles. The smartphone cameras tested so far record at frame rates well below the 250 frames per second of VOG goggles; this limits their ability to detect more subtle movements, especially saccadic eye movements. Moreover, the need for ambient lighting in phone recordings, as opposed to the infrared detectors of VOG goggles, limits their use in examinations where visual fixation should be removed for optimal assessment. Furthermore, the goggles are worn on the patient's face, whereas phones must be placed at a certain distance to capture the facial features needed for optimal detection. The goggles are therefore always positioned at a relatively constant angle to the globe, while the angle between the phone camera and the patient's eyes changes as the head moves. This can degrade the quality of the recording, as part of the face or one of the eyes might move out of the frame. Finally, perhaps the most critical obstacle in smartphone eye tracking is the lack of a user-friendly output interface that frames the results in a clinically relevant manner, sparing clinicians the time and effort of reviewing individual videos or uploading them for analysis. Our efforts are currently focused on making our application user-friendly, with outputs that can guide providers' clinical decision-making.
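The frame-rate concern can be made concrete with simple arithmetic: a covert saccade lasting on the order of tens of milliseconds spans many samples at 250 frames per second but only one or two frames at typical phone rates. The durations and rates below are illustrative assumptions.

```python
# Toy arithmetic: how many frames fall within a brief covert saccade at
# goggle-like vs. phone-like frame rates (illustrative numbers).
saccade_ms = 40                                   # assumed saccade duration
for fps in (250, 60, 30):
    frames = saccade_ms / 1000 * fps
    print(f"{fps:>3} fps -> {frames:4.1f} frames within the saccade")
# 250 fps -> 10.0 frames; 60 fps -> 2.4; 30 fps -> 1.2
```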
Diagnostic Decision Support
Table 1 summarizes the comparison between the phone and the goggles. When considering the future of smartphone eye tracking, it is essential to remember that this technology is meant to complement more advanced eye-tracking technologies, not replace them. Therefore, one should not expect a one-to-one match between the capabilities of smartphones and VOG goggles. Expanding access to eye movement quantification in settings that would otherwise not utilize this technology is the core concept behind these applications. One can imagine a future in which VOG goggles are used in subspecialty clinics and the EDs of large hospitals, while phone-based eye recording is used at home, in primary care settings, and in rural EDs that lack the funds or staff for 24-hour VOG goggle recording.
One pathway toward reducing diagnostic errors in distinguishing central from peripheral dizziness could rely on smartphone applications like the one our team has developed. Automating output through embedded or cloud-based algorithms and providing expert opinions based on recorded data are examples of how these applications could be integrated into telehealth systems already in place. Likewise, the ease with which smartphone applications can be disseminated makes them well suited for follow-up. For example, patients can be taught to use their smartphones at home to record their eye movements, providing their clinicians with valuable information that might otherwise be impossible to obtain. Of note, in-home use of cellphone applications can yield diagnostic data in episodic conditions with asymptomatic intervals that might lack objective findings during a clinic visit. Regulatory procedures such as FDA de novo and 510(k) applications are another necessary step. However, payor approval following the FDA regulatory process remains a significant barrier to widespread use.
Artificial Intelligence and Machine Learning Algorithms
With rapid advancements in the technologies embedded in smartphones, in conjunction with artificial intelligence (AI) and machine learning (ML), a future in which cell phones provide ample information for clinical decision-making is no longer a matter of if but of when. Nonetheless, AI and ML algorithms entail their own unique challenges, including validation and privacy. So far, most published studies of eye tracking by ML algorithms have been confined to controlled research environments, highlighting the need for more studies exploring the validity of these tools in the clinical setting [17]. The variety of approaches to quantifying eye movements with ML algorithms also highlights the need for a best-practice guideline for AI in eye tracking, similar to those for AI in medical imaging, to guide future research in this field [27,28]. Moreover, depending on the primary user, the usability of the recording process has to be studied methodically to guide the future development of the technology [29]. The de-identification guidance based on the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, issued by the United States Department of Health and Human Services, includes biometric information and full photographic images of the face in its list of patient identifiers to ensure patient privacy [30]. Although biometric identification based on face and iris recognition is well established [31], several recent studies have shown that gaze tracking patterns [32] and eye movement information (e.g., saccades) [33] can also serve as biometric data. Hence, we face the dual challenge of protecting biometric data that could be exploited to compromise security [34] and protecting private health information. In the face of these challenges, scientists have worked on de-identifying the eye movement data used for training models. Özdel et al. [35] devised a scanpath analysis model, which they made publicly available, that preserves the privacy of individual data; they tested this model on three public datasets with promising results. Moreover, evaluating a convolutional neural network on facial and eye movement data, Seyedi et al. [36] showed that it is possible to achieve good performance metrics while preserving data privacy. Therefore, even though using face and eye data to train and run AI models might pose a privacy threat, effective methods can be developed to mitigate such risks.
As the use of deep learning in the development of eye-tracking algorithms increases, issues such as the interpretability of algorithms, human-computer interaction, and data-quality standards take center stage. Given these challenges, research teams are attempting to provide solutions. Kumar et al. [37] investigated the challenges in the interpretability of algorithms and highlighted the key issues of "explainability beyond events," "performance analysis of spatio-temporal data," and "lack of annotation support for real-world data." Regarding the need for improved human-computer interaction practices with fixed setups such as VOG goggle- or smartphone-enabled tracking, Valtakari et al. [38] provided a deep dive into the factors ("number of gaze sources, use of scripted behavior, desired freedom of movement, data quality, and ease of data analysis") best suited to each type of eye-tracking setup ("head-free vs. head-boxed vs. head-restricted"), as well as a comprehensive guide and decision tree for establishing the required research environment. Furthermore, Adhanom et al. [39] explored the technological limitations in data quality for eye tracking, highlighting that low spatial precision and accuracy, high latency, low sampling rates, and calibration errors in real-world settings pose a significant constraint on the standardization of eye-tracking solutions. Although these issues limit the generalizability of existing solutions, they also point to future directions for research on eye movement disorders using advanced deep-learning solutions.
Overall, smartphone applications that track eye movements have already shown promise as diagnostic tools. With further investment and study in this field, we can expand the use of objective eye movement assessment beyond its current limits.

Funding/Support

None.

Conflicts of Interest

David Newman-Toker, Jorge Otero-Millan, Max Parker, and Nathan Farrell have a provisional patent application regarding the use of smartphones for tracking eye and head position. David Newman-Toker, Ali Saber Tehrani, Jorge Otero-Millan, Hector Rieiro, Pouya Barahim Bastani, Max Parker, and Nathan Farrell have a provisional patent application regarding the use of the EyePhone for recording saccades and smooth pursuit.

Availability of Data and Materials

All data generated or analyzed during this study are included in this published article. Other data may be requested from the corresponding author.

Authors' Contributions

Conceptualization: PBB, SB, DNT, AST; Investigation: PBB, SB, VP, HR, NF, MP, AST; Supervision: JOM, DNT, AST; Writing–Original Draft: All authors; Writing–Review & Editing: All authors.

All authors read and approved the final manuscript.

Table 1.
Video-oculography goggles vs. phone video-oculography

| Variable | Video-oculography goggles | Phone video-oculography |
| --- | --- | --- |
| Hardware price | US $12,000–40,000 per device, plus a laptop for the software | No separate device to purchase; the application integrates recording and analysis |
| Software | Additional costs for extra software features | Subscription plan for the application vs. pay-per-use |
| Availability | Limited availability | Available to any physician/patient with a smartphone |
| Operator | Trained technician needed for operation | User-friendly interface that any physician/patient can use |
| Placement | Mounted on head | Held (or mounted) in front of the patient |
| Illumination | Independent of room illumination | Requires face illumination for optimal performance |
| Fixation | Fixation can be blocked | Cannot block fixation |
| Potential errors | Risk of goggle slippage and erroneous recording | Part of the face or one eye might move out of the frame |
| Results | Specialist needed for interpretation of results | Results can be simplified and reported for patient triage |
| Continued support | Complicated updating and troubleshooting | Easily updated like any other smartphone application |
References

1. Saber Tehrani AS, Kattah JC, Kerber KA, et al. Diagnosing stroke in acute dizziness and vertigo: pitfalls and pearls. Stroke 2018;49:788–795.
2. Jung I, Kim JS. Approach to dizziness in the emergency department. Clin Exp Emerg Med 2015;2:75–88.
3. Saber Tehrani AS, Coughlan D, Hsieh YH, et al. Rising annual costs of dizziness presentations to U.S. emergency departments. Acad Emerg Med 2013;20:689–696.
4. Newman-Toker DE, Kerber KA, Hsieh YH, et al. HINTS outperforms ABCD2 to screen for stroke in acute continuous vertigo and dizziness. Acad Emerg Med 2013;20:986–996.
5. Kim JS, Zee DS. Clinical practice: benign paroxysmal positional vertigo. N Engl J Med 2014;370:1138–1147.
6. Ohle R, Montpellier RA, Marchadier V, et al. Can emergency physicians accurately rule out a central cause of vertigo using the HINTS examination?: a systematic review and meta-analysis. Acad Emerg Med 2020;27:887–896.
7. Xue K, Feng Y, Tam V, Lin CC, De Lott LB, Hamedani AG. Sociodemographic and geographic variation in access to neuro-ophthalmologists in the United States. J Neuroophthalmol 2023;43:149–152.
8. Rizzo JR, Beheshti M, Dai W, Rucker JC. Eye movement recordings: practical applications in neurology. Semin Neurol 2019;39:775–784.
9. DiScenna AO, Das V, Zivotofsky AZ, Seidman SH, Leigh RJ. Evaluation of a video tracking device for measurement of horizontal and vertical eye rotations during locomotion. J Neurosci Methods 1995;58:89–94.
10. Natus. ICS Impulse - Vestibular testing: a new level of clinician confidence [Internet]. Natus; c2024 [cited 2024 May 26]. Available from: https://natus.com/sensory/ics-impulse/.
11. Interacoustics. EyeSeeCam vHIT: vHIT system and goggles [Internet]. Interacoustics; c2024 [cited 2024 May 26]. Available from: https://www.interacoustics.com/us/balance/software/eyeseecam.
12. Newman-Toker DE, Curthoys IS, Halmagyi GM. Diagnosing stroke in acute vertigo: the HINTS family of eye movement tests and the future of the "eye ECG". Semin Neurol 2015;35:506–521.
13. Korda A, Carey JP, Zamaro E, Caversaccio MD, Mantokoudis G. How good are we in evaluating a bedside head impulse test? Ear Hear 2020;41:1747–1751.
14. Hirvonen TP, Juhola M, Aalto H. Suppression of spontaneous nystagmus during different visual fixation conditions. Eur Arch Otorhinolaryngol 2012;269:1759–1762.
15. Newman-Toker DE, Saber Tehrani AS, Mantokoudis G, et al. Quantitative video-oculography to help diagnose stroke in acute vertigo and dizziness: toward an ECG for the eyes. Stroke 2013;44:1158–1161.
16. National Library of Medicine (US). Acute video-oculography for vertigo in emergency rooms for rapid triage (AVERT). ClinicalTrials.gov identifier: NCT02483429. Updated 2024 Jul 30. Accessed 2024 May 2. Available from: https://clinicaltrials.gov/ct2/show/NCT02483429.
17. Noda M, Kuroda T, Nomura A, Ito M, Yoshizaki T, Fushiki H. Smartphone-assisted medical care for vestibular dysfunction as a telehealth strategy for digital therapy beyond COVID-19: scoping review. JMIR Mhealth Uhealth 2023;11:e48638.
18. Greinacher R, Voigt-Antons JN. Accuracy assessment of ARKit 2 based gaze estimation. In: Kurosu M, editor. Human-computer interaction. Proceedings, Part I. Design and user experience: thematic area, HCI 2020; Held as part of the 22nd International Conference, HCII 2020; July 19-24, 2020; Copenhagen, Denmark. Springer-Verlag; 2020. p. 439-449.
19. Valliappan N, Dai N, Steinberg E, et al. Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nat Commun 2020;11:4553.
20. Kıroğlu M, Dağkıran M. The role of mobile phone camera recordings in the diagnosis of Meniere's disease and pathophysiological implications. J Int Adv Otol 2020;16:18–23.
21. Kuroda T, Kuroda K, Fushiki H. Development of a prototype video head impulse test system using an iPhone for screening of peripheral vestibular dysfunction. Digit Biomark 2023;7:150–156.
22. Parker TM, Farrell N, Otero-Millan J, Kheradmand A, McClenney A, Newman-Toker DE. Proof of concept for an "eyePhone" app to measure video head impulses. Digit Biomark 2020;5:1–8.
23. Friedrich MU, Schneider E, Buerklein M, et al. Smartphone video nystagmography using convolutional neural networks: ConVNG. J Neurol 2023;270:2518–2530.
24. van Bonn SM, Behrendt SP, Pawar BL, Schraven SP, Mlynski R, Schuldt T. Smartphone-based nystagmus diagnostics: development of an innovative app for the targeted detection of vertigo. Eur Arch Otorhinolaryngol 2022;279:5565–5571.
25. Parker TM, Badihian S, Hassoon A, et al. Eye and head movement recordings using smartphones for telemedicine applications: measurements of accuracy and precision. Front Neurol 2022;13:789581.
26. Bastani PB, Rieiro H, Badihian S, et al. Quantifying induced nystagmus using a smartphone eye tracking application (EyePhone). J Am Heart Assoc 2024;13:e030927.
27. Varoquaux G, Cheplygina V. Machine learning for medical imaging: methodological failures and recommendations for the future. NPJ Digit Med 2022;5:48.
28. Kędras M, Sobecki J. What is hidden in clear sight and how to find it: a survey of the integration of artificial intelligence and eye tracking. Information 2023;14:624.
29. Zhou L, Bao J, Setiawan IM, Saptono A, Parmanto B. The mHealth App Usability Questionnaire (MAUQ): development and validation study. JMIR Mhealth Uhealth 2019;7:e11500.
30. Office for Civil Rights. Guidance regarding methods for de-identification of protected health information in accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule [Internet]. U.S. Department of Health & Human Services; 2022 [cited 2024 May 26]. Available from: https://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html.
31. Alay N, Al-Baity HH. Deep learning approach for multimodal biometric recognition system based on fusion of iris, face, and finger vein traits. Sensors (Basel) 2020;20:5523.
32. Schröder C, Al-Zaidawi SMK, Prinzler MH, Maneth S, Zachmann G. Robustness of eye movement biometrics against varying stimuli and varying trajectory length. In: Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems (CHI '20); 2020 Apr 25-30; Honolulu, HI, USA. Association for Computing Machinery; 2020.
33. George A, Routray A. A score level fusion method for eye movement biometrics. Pattern Recognit Lett 2016;82:207–215.
34. Katsini C, Abdrabou Y, Raptis GE, Khamis M, Alt F. The role of eye gaze in security and privacy applications: survey and future HCI research directions. In: Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems (CHI '20); 2020 Apr 25-30; Honolulu, HI, USA. Association for Computing Machinery; 2020.
35. Özdel S, Bozkir E, Kasneci E. Privacy-preserving scanpath comparison for pervasive eye tracking. Proc ACM Hum Comput Interact 2024;8(ETRA):231.
36. Seyedi S, Jiang Z, Levey A, Clifford GD. An investigation of privacy preservation in deep learning-based eye-tracking. Biomed Eng Online 2022;21:67.
37. Kumar A, Howlader P, Garcia R, Weiskopf D, Mueller K. Challenges in interpretability of neural networks for eye movement data. In: Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems (CHI '20); 2020 Apr 25-30; Honolulu, HI, USA. Association for Computing Machinery; 2020.
38. Valtakari NV, Hooge IT, Viktorsson C, Nyström P, Falck-Ytter T, Hessels RS. Eye tracking in human interaction: possibilities and limitations. Behav Res Methods 2021;53:1592–1608.
39. Adhanom IB, MacNeilage P, Folmer E. Eye tracking in virtual reality: a broad review of applications and challenges. Virtual Real 2023;27:1481–1505.
