How is it that I can ask a dozen people to watch two job interviews, rate each applicant on a three-point scale (Suitable, Suitable+ and Suitable−), and find a spread across all three ratings for both applicants? Surely there should be greater consistency?
When discussing what people base their assessment on, one of the key factors is the assumptions they make about a person’s intentions and personality, based on single words and phrases and on non-verbal behaviour. These judgements are made with certainty: the person is convinced that they are making a valid assessment of the evidence provided.
People make comments like: ‘I wouldn’t want Gillian on my team. She is too self-centred; you can tell she is only out for herself.’ Of the same candidate, others say she made the best choice she could, given the circumstances.
The ease with which we slide from what someone says to ‘knowing’ what they are thinking is concerning when it comes to selection panel behaviour. Too often I have seen selection reports that base assessments on such judgements rather than on the evidence provided.
The selection panel’s job is to assess who is the most suitable person for the job based on the work-related qualities specified in the job description. Part of that assessment will be a gut feeling about whether the person will fit into the workplace culture. There is a fine line between objectively seeking evidence of capabilities and judging fit. Panels need to be clear about this line and consciously active in suspending judgement, probing answers and articulating prejudices and assumptions.
The likelihood of subjective interpretation overriding objective evidence is minimised if members of a selection panel pay attention to the following:
- Defining the job requirements clearly at the outset. Poorly worded selection criteria and job descriptions reduce clarity of thought and process.
- Choosing selection methods appropriate to the criteria. An interview is not an appropriate method to use for all selection criteria.
- Establishing an assessment standard for the criteria that is appropriate to the level of the job.
- Crafting quality questions that elicit relevant evidence.
- Probing answers to ensure evidence is fully explored and understood.
- Making explicit any prejudices and assumptions that play a role in assessing the evidence.
- Being willing to call panel members on their biases.
- Suspending judgement until all the evidence is in.
- Assessing evidence against the job specifications and assessment standards.