Typically survey pretesting involves separate timelines and research staffs for cognitive and usability testing. In this paper, we make the case that a more comprehensive and less labor-intensive approach to pretesting is to conduct both cognitive and usability testing concurrently. By testing the same questionnaire concurrently with respondents and interviewers (the users in this case), potentially problematic question wording and instrument design can be more efficiently identified in a way that can be used to improve the questionnaire for both the respondent and the interviewer. In 2005 and 2006, the U.S. Census Bureau conducted separate rounds of cognitive and usability testing on an interviewer-administered non-response follow-up questionnaire in preparation for the 2010 Census. The usability testing was conducted in the Census Bureau’s Usability Lab with an early version of the instrument. Later, the Census Bureau’s Cognitive Lab conducted cognitive testing of the instrument. In doing the testing separately, we learned that in addition to usability issues, usability testing also identifies question wording issues, but that usability staff does not have the specialized experience (or sometimes the authority) to make recommendations in that arena. Similarly, while examining question wording, cognitive testing also identifies poor usability features, but the cognitive-testing staff lacks the experience with such testing to be able to recommend improvements in usability features. Based on this observation, in 2008, the Cognitive and Usability Labs at the Census Bureau conducted 40 cognitive and 20 usability interviews concurrently and in conjunction to test the questionnaire and presented results and recommendations from both types of testing together. When testing is conducted concurrently, staff from both labs, representing both specialties, can be at the table at once, creating a more efficient methodology. 
By examining these two case studies, this paper will discuss what can be gained by conducting these studies in concert above and beyond conducting them independently. Examples of the kinds of findings that are possible through this joint research and the synergy from having both research teams involved will be described.
Benefits of Concurrent Cognitive and Usability Testing
Jennifer Childs, Jennifer Romano, Erica Olmsted-Hawala, and Elizabeth Murphy
U.S. Census Bureau
Paper presented at the Questionnaire Evaluation Standards Conference
Bergen, Norway, May 17-20, 2009
Overview
• Background
  – Cognitive and Usability Labs
• Case Study – U.S. 2010 Census
  – Example 1 – Separate Cognitive and Usability Testing
  – Example 2 – Joint Cognitive and Usability Testing
• Conclusions
Description of Pretesting Methods
• Cognitive Testing
  – Focus on respondent's understanding of questions
  – Some focus on navigation
  – Often used to determine final question wording
• Usability Testing
  – Focus on user's ability to complete the survey
    • User = Interviewer
    • User = Respondent
  – Focus on users' interaction with the questionnaire and on effects of visual design
  – Used to improve visual design and navigational controls
U.S. Census Bureau Lab Structure
• Cognitive Lab
  – Psychologists, sociologists, anthropologists, demographers, etc.
  – Based on CASM and Tourangeau's four stages
  – Measures comprehension, accuracy, and ability and willingness to respond
• Usability Lab
  – Psychologists, human factors and usability specialists
  – Based on principles of user-centered design (UCD)
  – Measures accuracy, efficiency (time to complete tasks), and satisfaction
Census Bureau Lab Techniques
• Cognitive Lab
  – Concurrent think aloud
  – Concurrent and/or retrospective probing
  – Retrospective debriefing
  – Emergent and expansive probing
  – Vignettes (hypothetical situations)
  – In-depth ethnographic interviews
• Usability Lab
  – Random assignment (as appropriate)
  – Scenarios (hypothetical situations)
  – Concurrent think aloud
  – Concurrent and/or retrospective probing
  – Eye tracking
  – Satisfaction questionnaire
  – Retrospective debriefing
Case Studies: United States 2010 Census
U.S. 2010 Census
• Approximately 10 questions
  – Names, relationships, ages, sex, race, and Hispanic origin for each household member
• Mailed forms to most households
• Non-response follow-up by personal visit
U.S. Census Nonresponse Followup (NRFU)
• Developed and tested CAPI instrument
  – Usability and cognitive testing independently conducted
  – Example 1
• Turned to PAPI data collection
  – Usability and cognitive testing conducted concurrently
  – Example 2
Example 1: CAPI Instrument Testing
• Usability testing and cognitive testing conducted separately
CAPI Usability Testing Methods
• Early version of software
• 2 rounds
  – Round 1: 6 users (4 English and 2 Spanish speakers)
  – Round 2: 5 users (4 English and 1 Spanish speaker)

CAPI Cognitive Testing Methods
• 4 rounds (2 English and 2 Spanish)
  – English Round 1: 14 interviews, paper script
  – English Round 2: 16 interviews, computer
  – Spanish Rounds 1 and 2: 30 interviews total, paper script
CAPI Usability Findings (1)
• Data Entry
  – The way the cursor moves between entry boxes
• Navigation Issues
  – Next and Back buttons navigated between "questions," not "screens"

CAPI Usability Findings (2)
• Question Text
  – Repeating questions for each household member
  – Unclear whether or not to read examples associated with question text
  – Unclear if verification for sex was acceptable
  – Identified instances of problematic question wording
• Interviewer Tasks
  – Administering the flashcard was difficult for interviewers
  – Difficult fills, e.g., Is this (house/apartment/mobile home)
CAPI Cognitive Findings
• Navigation Issues
  – "Back" and "Next" buttons
• Question Text
  – Repetitive to read each question for each person
  – Reference period unclear
  – Long, complex questions
  – Problematic question wording
• Interviewer Tasks
  – Difficulty administering flashcard
  – Difficult fill, e.g., (house/apartment/mobile home)
Separate Usability and Cognitive Testing Conclusions
• Separate
  – Unique results
  – Common results
  – Areas of expertise and knowledge of relevant research
• Together
  – Fully understand how best to resolve problems
Example 2: Joint Cognitive and Usability PAPI Testing
• Paper-and-pencil instrument
• 40 cognitive interviews
• 20 usability sessions
• Concurrent testing led to development of joint recommendations
Methods
• Cognitive Testing
  – Retrospective probing
  – Comprehension, accuracy, ability to answer questions given personal situation
• Usability Testing
  – Shortened interviewer training
  – Test scenarios
  – Accuracy, satisfaction, and ease of use
Joint Findings and Recommendations
Information Sheet
Information Sheet Findings
• Cognitive Findings
  – Respondents understood and were able to use the Information Sheet for all relevant questions
• Usability Findings
  – Interviewers were successful in administering the Information Sheet and associated questions
Hispanic Origin Question
Hispanic Origin Findings
• Cognitive Findings
  – A lot of difficulty for Hispanic respondents
    • Responding "Latino" or "Spanish" without a country of origin
    • Describing race and origin
    • Unable to respond unassisted
  – With probing by the interviewer, able to provide a code-able answer
• Usability Findings

  Scenario:     Dominican Republic | Colombian | Puerto Rican | Cambodian | Mexican
  Success Rate: 95%                | 95%       | 100%         | 95%       | 100%
Hispanic Origin Discussion and Recommendations
• Respondents had some difficulty, but interviewers in usability testing were able to successfully navigate the question
• Recommendations focused on training
Conclusions from Joint Cognitive and Usability Studies
• Understanding of how:
  – Respondents react to questions
  – Interviewers react to the questionnaire
  – Interviewers react to respondent situations
• Recommendations:
  – Improve the form: question wording, visual design, and/or navigational instructions
  – Improve interviewer training
General Recommendations
• Conduct cognitive and usability testing concurrently with early versions of the questionnaire, with time to change:
  – Question wording
  – Visual design (i.e., format and layout of the questionnaire)
  – Navigational strategies
  – Interviewer training
• Conduct iterative testing
Future Research
• Cognitive and usability testing of American Community Survey Internet data collection instrument
  – Early in development cycle
  – Iterative testing