There is abundant career-related research available via academic journals and public reports. While this body of work extends knowledge and understanding, and provides evidence of what works and what doesn’t, it contains some fundamental shortcomings. So long as these shortcomings keep being repeated, our discussions and policies will continue to reflect inaccuracies and short-sightedness.
Academic research papers often include a section on the limitations of the research study. While this is a valuable inclusion, perceptions of limitations can be limited.
There are at least eight shortcomings that need to be addressed.
1. Lack of or poor evaluation
Evaluation of programs and policies is not easy. The Committee for Economic Development of Australia (CEDA) examined a sample of 20 Federal Government programs worth more than $200 billion. They found that ninety-five per cent of these programs were not properly evaluated. State government evaluations show similar results. (p. 10)
Evaluation is not something tacked on after work is completed. CEDA found that evaluation problems started with poor program and policy design, such as unclear objectives or a lack of any definition of success, plus a failure to collect data from the outset. Lack of time and resources for evaluation also impedes action.
CEDA points out that evaluation is a specialised skill. (p. 45)
The Professional Standards for Australian Career Development Practitioners refers to evaluation in four standards:
- Standard 7d concerns evaluating the service provided to clients. To demonstrate this competency, Career Development Practitioners: Understand and apply a range of evaluation strategies; Evaluate cases and/or projects to ensure accountability; Measure and improve client satisfaction; Use evaluation to identify new client services; Provide evidence to assist in service promotion and enhancement.
- Specialised Competency 3 concerns program delivery, and includes evaluation of programs, specifically: Review, evaluate, and revise career development programs.
- Specialised Competency 5 concerns project management, including evaluation, specifically: Ensure that quality deliverables are produced to customers’ expectations.
- Specialised Competency 7 concerns research skills. It includes: Plan, design, manage, and report on research projects; and Critically analyse and interpret data, which imply, but don’t specifically mention, evaluation.
These competencies support the need for skills in evaluating career development programs and services.
2. The unquestioning use of ‘soft’ skills
Research papers may explore relevant concepts by summarising the range of terms used. For example, ‘employability skills’ covers a wide range of terms, yet writers rarely question the accuracy, validity, or usefulness of these terms. In some cases, the term ‘soft’ skills is unquestioningly incorporated into research programs.
The unquestioning use of the term ‘soft’ skills is a significant shortcoming in many research reports, as it has major implications for career advice and for policies concerning skill shortages, training, transition programs, and government programs. The term is gendered, inaccurate, and imprecise, and it ignores the complexity and interrelatedness of skills.
3. The unquestioning acceptance of ‘received wisdom’
Another practice is to accept what other reports say, without seeking to clarify or confirm their ‘evidence’. For example, researchers may refer to how industry bodies and employers attest to a skills shortfall amongst graduates, expecting universities to do better in producing work-ready graduates. Yet where is the research that explores how employers arrive at this view? How do they judge what skills graduates have? Are their expectations realistic? Is it a ploy to avoid bearing the cost of training new employees? Do employers misinterpret students’ failure to articulate their skills as a lack of skills?
4. Relying on people’s self-assessment of skills
Much research is based on people’s own assessment of their career-related skills. For example, research based on students’ self-ratings of their career management skills, specific skills, and the transferability of their skills assumes that students have an accurate and shared understanding of these skills and how they relate to multiple contexts. There is plenty of evidence that students do not necessarily have this knowledge. Research reports don’t make clear whether a shared understanding of skills was established in the first place, overlooking the need for before-and-after testing.
5. Limiting voices that are heard
Senior leaders in organisations, institutions, peak bodies, industry groups, think tanks and consultancies are drawn on to provide their ‘expert commentary’ on research issues. Voices that may not be heard, or even considered, belong to the client groups research seeks to serve. It is not always clear whether the client groups being investigated play any role in informing the research, or whether their perspective is explored.
Much research draws on large corporations, often international ones, while not making clear whether small and medium businesses were included in the sample, and if so, whether their results differ from those of large companies.
6. Ignoring the complexity of skill groupings
Many reports talk about a skill area without delving into the breadth of the skill. Communication is a good example. Communication is a complex and diverse mix of specific skills, some of which are highly specialised. Ignoring this depth means that reports provide over-simplified information that is of little use to people trying to understand the scope and application of their skills.
7. Ignoring the interrelatedness of skills
Many reports focus on specific areas of skill, such as technology-related skills. While this body of work acknowledges that other skills may also be needed, such as communication, problem-solving and teamwork, the interrelatedness of these skills is largely left unexamined.
8. Ignoring Career Development theory and research
A significant flaw in career-related research is that the theories and practices established by the career development profession are largely ignored. Much career-related literature is written by economists, who may not consider much, or any, input beyond their own profession.
Conclusion
When reading research and reports on career-related topics, keep these eight flaws in mind. Seeking additional information to fill these gaps may reveal significant holes in the research record, and the ease with which limited or inaccurate thinking takes hold.