Evaluating 2D Flow Visualization Using Eye Tracking
  Hsin-Yang Ho     I-Cheng Yeh     Yu-Chi Lai     Wen-Chieh Lin     Fu-Yin Cherng  
Ho, H.-Y., Yeh, I.-C., Lai, Y.-C., Lin, W.-C., and Cherng, F.-Y., "Evaluating 2D Flow Visualization Using Eye Tracking," Computer Graphics Forum, Vol. 34, No. 3, June 2015. (Accepted by EuroVis 2015)

Flow visualization is recognized as an essential tool in many scientific research fields, and many visualization approaches have been proposed. Several studies have evaluated their effectiveness, but these studies rarely examine performance from the perspective of visual perception. In this paper, we explore how users' visual perception is influenced by different 2D flow visualization methods. We use an eye tracker to analyze users' visual behaviors as they perform free viewing, advection prediction, flow feature detection, and flow feature identification tasks on flow field images generated by different visualization methods. We evaluate the illustration capability of five representative visualization algorithms. Our results show that eye-tracking-based evaluation provides additional insights for quantitatively analyzing the effectiveness of these visualization methods.


@article{10.1111:cgf.12662,
journal = {Computer Graphics Forum},
title = {{Evaluating 2D Flow Visualization Using Eye Tracking}},
author = {Ho, Hsin-Yang and Yeh, I-Cheng and Lai, Yu-Chi and Lin, Wen-Chieh and Cherng, Fu-Yin},
year = {2015},
publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
DOI = {10.1111/cgf.12662}
}


The authors would like to thank Prof. Chen-Chao Tao for providing the eye tracker, Prof. Han-Wei Shen for the UFLIC code, and the anonymous reviewers for their insightful comments. This work was supported in part by the Taiwan Ministry of Science and Technology (MOST) under grants 102-2221-E-009-082-MY3, 101-2628-E-009-021-MY3, and 103-2221-E-011-114-MY2; the UST-UCSD International Center of Excellence in Advanced Bioengineering sponsored by the MOST I-RiCE Program under grant MOST 103-2911-I-009-101-; and the Innovation Center for Big Data and Digital Convergence, Yuan Ze University.