Assessing the performance of short multi-item questionnaires in aesthetic evaluation of websites
Eleftherios Papachristos
Behaviour and Information Technology, 2019, vol. 38, issue 5, 469-485
Abstract:
In recent years, website aesthetics has received a fair amount of attention from the HCI community. This has led to the creation of a variety of multi-item questionnaires aimed at capturing users’ aesthetic judgments. Researchers have used these questionnaires in several HCI studies to investigate the relationship between aesthetics and other evaluative constructs such as usability. However, their usefulness as evaluation tools in visual design practice remains underexplored. Lengthy multi-item questionnaires can be particularly problematic in studies where participants must evaluate multiple designs or give responses repeatedly at predefined time intervals. Despite criticism, single-item scales have been used in many past studies in which questionnaire length could be problematic. Another alternative available to practitioners and researchers is the short versions of standardised multi-item questionnaires created for the aesthetic evaluation of websites. In this paper, we present a study comparing the performance of three such condensed aesthetic questionnaires (i.e. the aesthetics scale, AttrakDiff, and VisAWI) during a website redesign project. The short versions of these questionnaires were completed by 187 users in an evaluation of seven alternative website designs. The questionnaires were compared on performance criteria such as reliability, validity, and predictive ability. Data analysis showed that although AttrakDiff’s overall performance was better, a considerable amount of variance in aesthetic judgment could not be accounted for by any of the questionnaires.
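To give a concrete sense of the reliability criterion on which such short scales are typically compared, the sketch below computes Cronbach's alpha (a standard internal-consistency coefficient) for a hypothetical short multi-item questionnaire. The item count and response data are invented for illustration and are not taken from the study's analysis.

import numpy as np

def cronbach_alpha(items):
    # items: (n_respondents x n_items) matrix of questionnaire ratings
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 5 respondents, 4-item short scale, 7-point responses
responses = [
    [6, 5, 6, 7],
    [4, 4, 5, 4],
    [7, 6, 7, 6],
    [3, 4, 3, 4],
    [5, 5, 6, 5],
]
print(round(cronbach_alpha(responses), 2))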
Date: 2019
Downloads: http://hdl.handle.net/10.1080/0144929X.2018.1539521 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:tbitxx:v:38:y:2019:i:5:p:469-485
Ordering information: This journal article can be ordered from http://www.tandfonline.com/pricing/journal/tbit20
DOI: 10.1080/0144929X.2018.1539521
Behaviour and Information Technology is currently edited by Dr Panos P Markopoulos
Bibliographic data for series maintained by Chris Longhurst.