|Title||A new method for comparing scanpaths based on vectors and dimensions|
|Author/s||Richard Dewhurst, Halszka Jarodzka, Kenneth Holmqvist, Tom Foulsham, Marcus Nyström|
|Full-text||Full text is not available in this archive|
|Document type||Conference abstract|
|Conference name||Vision Sciences Society|
|Conference location||Naples, Florida|
|Keywords||Cognition, Eye movements, Scanpaths|
|Research group||Crypto and Security|
|Additional info||We make different sequences of eye movements—or scanpaths—depending on what we are viewing and the current task we are carrying out (e.g. Land, Mennie & Rusted, 1999). In recent years, research efforts have been very informative in identifying commonalities between scanpath pairs, allowing us to quantify, for example, the similarity in eye movement behaviour between experts and novices (Underwood, Humphrey & Foulsham, 2008), or between encoding and recognition of the same image (Foulsham & Underwood, 2008). However, common methods for comparing scanpaths (e.g. 'string-edit', based on Levenshtein, 1966, or 'position measures', see Mannan, Ruddock & Wooding, 1995) fail to capture both the spatial and temporal aspects of scanpaths. Even the newest techniques (e.g. 'ScanMatch', Cristino, Mathôt, Theeuwes & Gilchrist, 2010) are restricted by the fact that they rely on the division of space into Areas of Interest (AOIs), thus limiting the spatial resolution of the similarity metric produced. Here we validate a new algorithm for comparing scanpaths (Jarodzka, Holmqvist & Nyström, 2010) with eye movement data from human observers. Instead of relying on the quantization of space into AOIs, our method represents scanpaths as geometrical vectors, which retain temporal order and spatial position. Scanpaths are then compared across several dimensions—shape, position, length, direction, and duration—and a similarity value is returned for each. Using this new multidimensional approach, our data from two experiments highlight aspects of scanpath similarity which cannot otherwise be quantified: when scanpaths are clearly similar but spatially downscaled, for instance. Moreover, we show how scanpath similarity changes depending on task, using our algorithm in comparison to the most popular alternatives. These data demonstrate that our vector-based, multi-dimensional approach to scanpath comparison is preferable to existing alternatives, and should encourage a shift away from methods rooted in the Levenshtein principle or spatial position alone.|
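The core idea in the abstract—representing scanpaths as sequences of saccade vectors and returning a separate similarity value per dimension—can be illustrated with a minimal sketch. This is not the authors' published implementation: the function names, the simple index-wise pairing of vectors (a real algorithm would align them), and the normalising constants are assumptions for illustration only, covering just two of the five dimensions (length and direction).

```python
# Hypothetical sketch of vector-based scanpath comparison (not the
# authors' published algorithm). Scanpaths are lists of fixations
# (x, y, duration); consecutive fixations define saccade vectors,
# which are compared per dimension without any AOI quantization.
import math

def saccade_vectors(fixations):
    """Turn a list of (x, y, duration) fixations into (dx, dy) saccade vectors."""
    return [(x2 - x1, y2 - y1)
            for (x1, y1, _), (x2, y2, _) in zip(fixations, fixations[1:])]

def similarity(sp1, sp2, screen_diag=1000.0):
    """Return per-dimension similarity in [0, 1]; 1.0 means identical.

    Vectors are paired index-wise here purely for illustration; the
    screen diagonal is an assumed normaliser for length differences.
    """
    v1, v2 = saccade_vectors(sp1), saccade_vectors(sp2)
    n = min(len(v1), len(v2))
    len_diffs, dir_diffs = [], []
    for (dx1, dy1), (dx2, dy2) in zip(v1[:n], v2[:n]):
        # Length dimension: difference in saccade amplitude.
        len_diffs.append(abs(math.hypot(dx1, dy1) - math.hypot(dx2, dy2)))
        # Direction dimension: smallest angle between the two vectors.
        ang = abs(math.atan2(dy1, dx1) - math.atan2(dy2, dx2))
        dir_diffs.append(min(ang, 2 * math.pi - ang))
    return {
        "length": 1 - (sum(len_diffs) / n) / screen_diag,
        "direction": 1 - (sum(dir_diffs) / n) / math.pi,
    }
```

A spatially downscaled copy of a scanpath would score lower on the length dimension but perfectly on direction, which is the kind of dissociation the multidimensional approach makes visible and a single string-edit score cannot.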