Achieving the Visual Turing Test: Integrated Display and Eye Tracking Technologies
Mantas Žurauskas, Mohamed El-Haddad, Barry Silverstein, Douglas Lanman, Rob Cavin
Electronic Imaging (EI)
Pixels per degree (PPD) alone is not a reliable predictor of the high-resolution experience in VR and AR, because that experience depends not only on PPD but also on display fill factor, pixel arrangement, optical blur, and other factors. This complicates system architecture decisions and design comparisons. Is there a simple way to capture all the contributors and quantitatively match user experience? In this paper, we present a system-level model and metric, the system modulation transfer function (system MTF), to predict perceptual quality across all the key parameter dimensions: pixels per degree (display), pixel shape (display), fill factor (display), optical blur (optics), and image processing (graphics pipeline). The metric is defined in much the same way as the traditional MTF for imaging systems, by examining the image formed from a point source and then performing a Fourier transform over the response function, but with special mathematical treatments. One application of the model is to perceived text quality, where two weight functions depending on text orientation and spatial frequency are incorporated into the model. A perceptual study of text quality across different resolutions was performed to validate the system MTF model.
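The abstract's recipe (point-source response, then a Fourier transform) can be sketched numerically. Below is a minimal 1-D illustration, not the paper's actual model: it assumes a square pixel aperture whose width is set by the pitch and fill factor, Gaussian optical blur of a stated angular width, and takes the normalized magnitude of the FFT of the resulting point response as the system MTF. All parameter names and the specific aperture/blur choices are illustrative assumptions.

```python
import numpy as np

def system_mtf(ppd, fill_factor, blur_sigma_deg, n=512):
    """Toy 1-D system MTF: square pixel aperture (display fill factor)
    convolved with a Gaussian optical blur, then the normalized
    magnitude of the Fourier transform of that point response.
    All modeling choices here are illustrative, not the paper's."""
    # Spatial axis in degrees of visual angle, a few pixels wide.
    span = 4.0 / ppd
    x = np.linspace(-span, span, n)
    dx = x[1] - x[0]
    # Pixel aperture: active width = pitch * sqrt(fill factor) (assumed).
    pitch = 1.0 / ppd
    aperture = (np.abs(x) <= 0.5 * pitch * np.sqrt(fill_factor)).astype(float)
    # Gaussian optical blur kernel.
    blur = np.exp(-0.5 * (x / blur_sigma_deg) ** 2)
    # Point response = aperture convolved with blur.
    psf = np.convolve(aperture, blur, mode="same")
    # MTF = magnitude of the Fourier transform, normalized to 1 at DC.
    mtf = np.abs(np.fft.rfft(psf))
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(n, d=dx)  # spatial frequency in cycles/degree
    return freqs, mtf
```

In this toy form, the combined effect shows up directly: raising PPD or fill factor, or shrinking the blur, all push the MTF curve outward in spatial frequency, which is the kind of joint dependence the abstract argues PPD alone cannot capture.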