Transcript of Tues m3 johannes_magenheim
Informatics Systems and Modeling - Case Studies of Expert Interviews
University of Paderborn Computer Science Education Group
University of Siegen Department of Computer Science and E-Learning
Johannes Magenheim (presenter) Leopold Lehner, Wolfgang Nelles, Thomas Rhode, Niclas Schaper, Sigrid Schubert and Peer Stechert
Project team:
Sigrid Schubert, Peer Stechert (Electrical Engineering & Informatics, University of Siegen)
Johannes Magenheim, Leopold Lehner, Thomas Rhode (Informatics, CSE, University of Paderborn)
Niclas Schaper, Wolfgang Nelles (Organizational Psychology, University of Paderborn)

Outline
• Theoretically derived Competence Model
• Objectives and Research Methodology
• Achieved Research Results:
  - Empirically refined Competence Model
  - Differences in Experts’ Views on Scenarios and Competence Components
• Further Research Tasks
2 Johannes Magenheim, University of Paderborn
Theoretical Relations
[Diagram: relations between CS Curricula, Modelling and System Comprehension Competences, and the Informatics System with its aspects System Application, System Development and System Properties]
Theoretically derived Competence Model
CSE: Objectives and Research Methodology
1a. Traditional: System Development
1b. New since 2006: System Comprehension
2. Theoretically derived competence model (4/2008): analysis of international syllabi and curricula
3. Empirically refined competence model (2/2010): 30 expert interviews (Critical Incident Technique), qualitative content analysis (meaning units)
4. Instruments to measure competence (4/2010): development of test items and observation of learner-centred approaches
5. Improving learning environments (since 2011): evaluation of learning environments by competence measurement
6. Competence Level Model (2011): expert rating, development of competence stimuli (authentic and complex)
7. Competence Development Model (2012)
Research Methodology
• 30 Expert Interviews
• 3 Groups of Experts:
  - Experts of Informatics
  - Experts of Didactics of Informatics
  - Expert Informatics Teachers
• Interviews on Use Cases (Critical Incident Technique)
• Content Analysis (Mayring)
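To illustrate how coded meaning units from such a content analysis can be tabulated per expert group, here is a minimal Python sketch. The expert groups and category IDs (e.g. K1.2, K1.3, K4) follow the talk's competence model, but the coded units themselves are invented for illustration:

```python
from collections import Counter

# Hypothetical coded meaning units from the interviews:
# (expert_group, assigned competence category). The data is
# invented; only the group and category names come from the talk.
coded_units = [
    ("informatics", "K1.3"), ("didactics", "K1.2"),
    ("teacher", "K4"), ("teacher", "K4"), ("informatics", "K1.2"),
]

# Tally category frequencies per expert group, as a content
# analysis would tabulate them after the coding step.
by_group = {}
for group, category in coded_units:
    by_group.setdefault(group, Counter())[category] += 1

for group, counts in sorted(by_group.items()):
    print(group, dict(counts))
```

Such a tabulation makes differences between the expert groups visible, e.g. how often teachers mention non-cognitive skills (K4) compared with experts of informatics.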
2 Examples of Use Cases (Scenarios)
Two complex hypothetical scenarios were content-analyzed:
(1) “Merchandise Management System”, which deals especially with system development requirements, and
(2) “Testing of Unknown Software”, which deals in particular with system comprehension requirements.
Scenario “Merchandise Management System”
“You are asked to develop a software-based merchandise management system for a small school kiosk.”
Question 1: “What is your course of action to solve this task? Which software engineering workflows do you have to process?”
Question 2: “Which graphical models would you apply?”
Question 2.1: “Which informatics views are important for this task?”
Question 2.2: “Which complexity would you assign to this task?”
Question 3: “Which cognitive skills are required to develop such a software system?”
Question 4: “Could you imagine a potential pupil’s procedure to solve this problem?”
Question 5: “Which attitudes, social communicative skills and motivational aspects are necessary to solve this problem?”
Empirically refined Competence Model
Example: K4 Non-Cognitive Skills
[Comparison of the theoretically derived and the empirically refined competence model]
Further Research Questions
In which respect do the experts differ in their competence-relevant statements?
How can these different contributions be explained with reference to different expert perspectives, backgrounds and attitudes towards the topic?
Experts of all groups contributed to the refinement of the competence model, and the appropriateness of the theoretically derived categories of the competence model of informatics modelling and system comprehension was confirmed.
In particular, the relevant competence dimension K1 (BASIC COMPETENCIES), with its categories K1.2 (SYSTEM COMPREHENSION) and K1.3 (SYSTEM DEVELOPMENT) and their sub-categories, was confirmed by the experts’ descriptions.
Furthermore, the experts’ answers to questions concerning social competence requirements provided valuable and confirming clues to the fourth dimension, Non-Cognitive Skills.
The closer an expert’s relationship to school practice, the more differentiated the description of non-cognitive skills.
Further Outcomes
• Furthermore, experts of informatics in particular felt uncomfortable with scenarios that covered parts of informatics outside their own research field.
• The expert of informatics expressed not a negative but a positive attitude towards the appropriateness of the scenario for informatics secondary education, in contrast to the expert of didactics and the expert teacher, who were more critical concerning the appropriateness of the scenario.
• We have to be careful about generalizing that experts of informatics are more critical concerning the school-appropriateness of informatics learning contents. Such appraisals might also depend on the personal experiences or other background characteristics of an expert.
Further Outcomes
Further work to do…
• It is necessary to conduct additional empirical research steps to prove the content and criterion validity of the developed competence model. The evaluation of the content validity of the model should be accomplished by an expert rating.
• The different informatics experts have to rate the extracted competence descriptions concerning their relevance, difficulty, representativeness and degree of differentiation.
• The evaluation of the criterion validity of the competence model should be accomplished by developing instruments to measure the different facets of the competence model and the criterion behaviour.
• The resulting correlations between both can be interpreted as indicators of the criterion validity of the competence model.
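The validity check described above, correlating scores from the competence measurement instruments with an external criterion measure, can be sketched in a few lines of Python. The scores below are invented for illustration, and the pearson helper is a plain Pearson correlation, not the project’s actual instrument:

```python
import statistics

# Hypothetical scores for the same learners: results from a
# competence test and an external criterion measure (e.g.
# teacher-assessed performance). Both lists are invented data.
test_scores = [12, 15, 9, 18, 14, 11, 16, 10]
criterion   = [55, 64, 42, 71, 60, 50, 66, 47]

def pearson(xs, ys):
    """Pearson correlation coefficient between two samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A high positive r would be read as one indicator that the
# instrument measures what the criterion behaviour reflects.
r = pearson(test_scores, criterion)
print(f"r = {r:.3f}")
```

In practice such a correlation would be computed per facet of the competence model, and a consistently high coefficient across facets would support the claimed criterion validity.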
Further work to do.......
Thank you
Prof. Dr. J. Magenheim
University of Paderborn, Computer Science Education Group
Fuerstenallee 11, 33102 Paderborn (Germany)
[email protected]
http://ddi.upb.de