Monday, September 4 • 11:30 - 12:00
Improving Validity: Asking the Right Questions in Evaluations

David Roberts (RobertsBrown)

A major issue for evaluators is the validity of the answers people give when questioned on topics of interest. Cognitive and psychological research shows that people generally do not search all of their memory or use rational processes to answer the questions we ask. Indeed, what people say in interviews often bears little relation to their actual behaviour, past or future. Instead, implicit cognitive processes throw up a large number of possible answers, most of which never reach conscious awareness. One such answer may emerge into awareness as 'the' answer; sometimes two or three emerge as 'probable' answers for consideration. Generally, we then use a 'best fit' heuristic to choose between the 'probable' answers generated by our implicit processes.

While the cognitive research is relatively clear, there is very little research into how we might apply its lessons to our own evaluation practice. The research does suggest that the closer the interview context is to the context of action, the greater the validity of the responses. Techniques that can recreate the context of action are therefore more likely to generate valid answers.

This interactive session explores the lessons from the research and how those lessons might be applied to improve our research techniques. Participants will take part in a question-answering exercise and then explore the cognitive bases of their responses. We will then discuss the cognitive research to see how it applies to the group's experience. If time allows, the final part of the session will involve participants working together to develop ideas about how to apply the research to developing questions.


Chairs

Squirrel Main

Research and Evaluation Manager, The Ian Potter Foundation
Dr Squirrel Main is The Ian Potter Foundation's first Research and Evaluation Manager and she co-chairs the Philanthropic Evaluation and Data Analysis network. Squirrel completed her Masters at Stanford University in Evaluation and Policy Analysis (with David Fetterman--hello David...

Speakers

David Roberts

Principal Consultant, RobertsBrown
David Roberts is a self-employed consultant with 35 years of wide experience in evaluations using both qualitative and quantitative methods. David has training and experience in Anthropology, Evaluation and Community Development. In 2016, David was awarded the Ros Hurworth Prize...


Monday September 4, 2017 11:30 - 12:00 AEST
Swan Room – first floor