Selection panels favour behaviour-based questions to assess whether applicants meet a selection criterion. This choice rests on evidence that past performance is a good predictor of future performance, and prediction is exactly what a panel is trying to do: identify which applicant will best deliver on the job.
What can reduce the value of such questions for reaching a decision is that only one question is asked and the response is taken at face value. If the question is poorly framed and no probing is used, a panel can find itself with a range of responses from applicants and little to differentiate which one is best for the role.
To illustrate, let’s take a criterion on problem solving. The panel comes up with a question like: Tell us about a time you had to solve a challenging problem that involved dealing with a range of people.
The context of the job is a corporate business development role in IT. The panel wants to find out whether the applicant can liaise with a range of people, win cooperation, persuade people to accept a line of action, and implement a change effectively. In the short term the person in the role will be designing a new management system that is critical to the agency's legal compliance.
One applicant gives an example of a system upgrade that affected 1500 staff and talks about the supplier delays, training challenges, and management pressures involved.
Another applicant talks about a software solution they discussed with a line area and the challenges of matching unrealistic expectations to the reality of what products were available and the costs of modifications.
A third applicant talks about a system crash and the emotional responses of staff to the inconvenience of losing access.
How will a panel judge these responses? While some of the skills spoken about are relevant, none of the examples relates specifically to the short term needs of the job. This raises two questions:
- Has it been made clear to applicants what the immediate needs of the role are? If not, why not?
- Does the question really generate useful information to assess applicants’ ability to deliver results in this role? Probably not.
A more direct question would focus on what the panel really wants to know. For example: During the next twelve months the person in this role will drive the design process for a new management system. This system will enable us to better meet our legal obligations under the Act. Tell us about your experience in getting such projects off the ground. This could be followed up with a range of questions such as:
- What was the purpose of the new system?
- Who was involved?
- How did you go about getting stakeholders on side?
- What was the budget? Did you meet it?
- How did you go about building the specifications?
- How did you go about identifying possible providers?
With the first question – Tell us about a time you had to solve a challenging problem that involved dealing with a range of people – more useful information could be obtained if the panel then probes down into the response given. Panels often don’t do this because they hold to the false belief that all applicants have to be asked identical questions. While each applicant does need to be asked the initial question, each response will be different. Therefore, to fully assess any applicant’s evidence, the panel must drill down into the response with applicant-specific questions.
So the applicant who talks about a software solution they discussed with a line area and the challenges of matching unrealistic expectations to the reality of what products were available and the costs of modifications, could be asked about:
- What the software was designed to do.
- How they handled the unrealistic expectations.
- How they developed the specifications.
- How they identified relevant providers.
- Whether the project came in on time and on budget, and met the specifications.
- Another example of introducing new systems or software.
- Whether they have ever worked on designing a new management system from scratch.
However, because the question is generic, and not targeted to the actual job demands, the information gained may still make a decision difficult.
To avoid this dilemma, panels need to:
- Carefully craft questions that elicit information about the job demands most critical to delivering results.
- Consider gaining more than one example of critical behaviours.
- Drill down into any responses given.