  Investigating educational issues across Europe (ACROSS Base)
Roundtable
Proceedings of ECER 2003 Roundtable on methodological approaches in European projects

Jean-Paul Reeff

New assessment tools for cross-curricular competencies in the domain of problem solving (NATCCC-PS)

It is widely accepted that problem-solving skills constitute a crucial life skill. Problem solving is ranked as an important key qualification by labor market experts (see Binkley, Sternberg, Jones, & Nohara, 1999) as well as in the literature on vocational training and education (Didi, Fay, Kloft, & Vogt, 1993). Recent discussions of lifelong learning also point to problem solving as one of the major competencies to be fostered in a lifelong learning process. Furthermore, problem-solving skills were defined as an important outcome of schooling by OECD experts (OECD, 1997), and are often identified as high-level curricular aims (see, e.g., Svecnik, 1999).

The NATCCC-PS network was created as a small, short-term co-operation network of European researchers to improve the visibility and increase the impact of European research in the field of problem solving, with special emphasis on large-scale international comparative studies. Based on previous work in basic research and on a long-term project on measuring competencies in a vocational training setting, a conceptual framework for measuring problem solving was prepared. This framework was later further developed in the context of the international “Adult Literacy and Lifeskills Survey” (ALL) and now forms the basis for the measurement of adults’ problem-solving competencies in ALL.

One main challenge in measuring problem solving and in assessing vocational competencies can be described as follows: How can contextualized, real-life problems be defined and transformed into test items? The “project approach”, chosen by the NATCCC network as the main measurement tool and further developed in the context of the ALL Survey, uses the different problem-solving phases as the dimension along which the actual test items are generated.

Following Pólya (1945, 1980), the process of problem solving has been frequently described in terms of the following stages:
- Define the goal. 
- Analyze the given situation and construct a mental representation.
- Devise a strategy and plan the steps to be taken.
- Execute the plan, including control and – if necessary – modification of the strategy. 
- Evaluate the result.

These stages correspond to findings from research on vocational training and job analyses within educational research and applied psychology, which have been described as part of the so-called “complete action” approach. Extensive analyses of very different jobs (different professions with varying types of workplaces) indicate that new forms of labor organization require people to perform more complex operations that go “beyond mere routine”. Nowadays, even production workers and office clerks are expected to master complex tasks that call for integrative skills. Complete actions include different steps such as planning, executing and evaluating. The basic structure of the model of complete action is thus fully compatible with the above-mentioned normative process model for problem solving: action steps are similar to problem-solving steps.

The model of complete action has been successfully applied to curriculum development, assessment, and certification reforms in various professions in both Germany and Luxembourg (Hensgen & Blum, 1998; Hensgen & Klieme, 1998). The main idea is that both training tasks and test problems should include all or most elements of a complete action. The project approach uses this complete action model to establish the underlying structure of the problem-solving test. The different action steps define the course of action for an “everyday” project, and one or more tasks or items correspond to each of these action steps. The respondents thus work on the individual tasks that have been identified as steps to be carried out as part of their project. Embedding the individual tasks in an action context yields a high degree of context authenticity. A project, designed as a complete action, encompasses various tasks that can vary in complexity.

The following table provides an overview of the problem-solving steps corresponding to the action steps illustrated above, listing different components and aspects of each problem-solving step.

Table 1.  Problem-solving steps and instantiations
 
Define the Goals
  • Set goals.
  • Recognize which goals are to be reached and specify the essential reasons for the decision.
  • Recognize which goals/wishes are contradictory and which are compatible.
  • Assign priorities to goals/wishes.
Analyze the Situation
  • Select, obtain and evaluate information. 
  • What information is required, what is already available, what is still missing, and what is superfluous?
  • Where and how can you obtain the information?
  • How should you interpret the information?
  • Identify the people (e.g., with what knowledge and skills) who are to be involved in solving the problem.
  • Select the tools to be used.
  • Recognize conditions (e.g. time restrictions) that need to be taken into account.
Plan the Solution
  • Recognize which steps need to be taken. 
  • Decide on the sequence of steps (e.g. items on the agenda).
  • Coordinate work and deadlines.
  • Make a comparative analysis of alternative plans (recognize which plan is suitable for reaching the goals).
  • Adapt the plan to changed conditions.
  • Opt for a plan.
Execute the Plan
  • Carry out the individual steps (e.g., write a letter, fill in a form, make calculations).
Evaluate the Results
  • Assess whether and to what extent the target has been reached.
  • Recognize mistakes.
  • Identify reasons for mistakes.
  • Assess consequences of mistakes.

Concrete projects consisting of different tasks have been developed and used both in a vocational education and training (VET) setting and in large-scale comparative studies. Empirical analyses showed very satisfactory results. Within the ALL field study, the assessment results yielded a single scale for problem-solving skills with four competency levels:

1. Content-related reasoning
2. Evaluating
3. Ordering/Integrating
4. Critical Thinking

Another important result of the ALL field study was that short versions of the projects provided results of similar quality to the longer versions, a finding that may substantially influence assessment in VET settings.
