1
United Nations Development Programme / Regional Bureau for Arab States (UNDP / RBAS)
THE PROJECT
“Enhancement of Quality Assurance and Institutional Planning in Arab Universities”
Phase I: 1/1/2002 – 30/6/2004 (30 months)
Phase II: Started June 2005 (42 months)
2
PROJECT AIMS
TO WORK, IN VOLUNTARY PARTNERSHIP WITH ARAB UNIVERSITIES:
To introduce, implement, and demonstrate, on a pilot regional scale and in real academic time,
THREE INDEPENDENT INSTRUMENTS OF QUALITY ASSURANCE AND ENHANCEMENT:
A. EVALUATION OF ACADEMIC PROGRAMS: through internal and external (peer) evaluation
B. ADMINISTRATION OF INTERNATIONAL TESTS: for assessing the performance of students of reviewed programs
C. DEVELOPMENT OF STATISTICAL DATABASES: on the programs and resources of universities (programs / student and staff demographics / finances), in accordance with commonly agreed data definitions and specifications
3
FIRST PHASE ACHIEVEMENTS
A. PROGRAM EVALUATION: TWO CONSECUTIVE REVIEW CYCLES OF PROGRAM EVALUATION
First cycle (2002-2003): review of computer science programs in 15 universities
Second cycle (2003-2004): review of business administration programs in 16 universities
B. STUDENT TESTING: TWO PARALLEL CONSECUTIVE CYCLES OF STUDENT TESTING
First cycle (2002-2003): English-based CS + BA tests (788 senior students)
Second cycle (2003-2004): Arabic / French-based CS + BA tests (921 senior students); translation by the UNESCO Beirut Office
C. ONE CYCLE OF DATABASE DEVELOPMENT
Statistical database development in 15 universities in accordance with common data definitions and specifications
4
PARTICIPATING UNIVERSITIES: TWENTY-NINE LEADING ARAB UNIVERSITIES
29 universities = 23 public + 6 private
Moh V, Al-Akhawayn*, AM Saadi (Morocco); USTHB, Oran (Algeria); Helwan, Cairo, Arab Academy* (Egypt); SUST, Khartoum (Sudan); Sanaa, UTS*, Aden (Yemen); Bahrain U (Bahrain); Ajman (UAE); SQU (Oman); Jordan, JUST, Yarmouk, Zarka* (Jordan); IUG, PalPoly, An-najah, Al-Azhar (Palestine); Lebanese U, Jinan* (Lebanon); Damascus, Aleppo (Syria)
FROM 12 COUNTRIES: Oman, Bahrain, UAE, Jordan, Palestine, Syria, Lebanon, Yemen, Sudan, Egypt, Algeria and Morocco
5
PROJECT TEAM
UNDP Country Offices: 12 countries, with a focal point in each office
Project Management Team: Project Manager: Isam Naqib; Regional Coordination Unit: Rima Mulhim (Head); Staff: Rami Dababneh, Basima Shaheen (2003-4); PT Consultant: Ali Yaghi
Academic Team: 92 leading academics = 31 programs x 2 (A+B) + 15 x 2 (C); 29 university coordinators (VP level)
Sponsorship and Strategic Oversight (UNDP/RBAS)
Director: Rima Khalaf Hunaidi / Amat Al-Alim Soswa
Project Advisory Committee
Rima Khalaf Hunaidi (chair), Assia Bensalah Al Aaoui, Adnan Al Amin, Hussein Anis, Victor Bileh, Hasan Al Ebrahim, Ali Fakhro, Zahir Jamal, Marwan Kamal, Amin Mahmood, Safwan Malek Masri, Mohammed Zahi Mogherbi, Mohammed Masheer Al Munajied, Maen Nsour, Isam Naqib, Kamal Al Sha’er
Regional Programme Division (RPD)
Portfolio Management: Zahir Jamal / Nada Al-Nashif (chief); Maen Nsour / Azza Karam (portfolio manager)
Executing Agency: UNOPS; Portfolio Manager: Gilman Rebello / Melissa Esteva
6
PROJECT COMPONENT A
Evaluation of Programs
7
EVALUATION OF PROGRAMS CYCLES OF REVIEW
FIRST CYCLE: COMPUTER SCIENCE (2002-2003): 15 participating universities in 12 countries. Cycle duration: 15 months. Individual reports distributed (Feb 2003)
SECOND CYCLE: BUSINESS ADMINISTRATION (2003-2004): 16 universities in the same countries. Cycle duration: 15 months. Reports distributed (Aug 2004)
REGIONAL OVERVIEW REPORT: Dual report, public distribution (2005). Reports on the state of education in both subjects. Based on the 15 + 16 individual reports
------------------------------------------------------
8
EVALUATION CYCLE - 2
Cycle stages: self evaluation, external evaluation, final reporting
Cycle duration: 15 months. Activities closely timed with the academic calendar. Strict deadlines
Main aspects of evaluation: academic standards; learning opportunities; internal mechanisms for quality assurance and enhancement
9
JUDGMENTS AND OUTPUTS
JUDGMENTS
Academic Standards (G / S / U): Intended learning outcomes (contract with students) (G / S / U); Curricula (G / S / U); Student assessment (G / S / U); Student achievement (G / S / U)
Quality of Learning Opportunities: Teaching and learning (G / S / U); Support for student progression (G / S / U); Learning resources (G / S / U)
Quality Assurance and Enhancement (G / S / U)
OUTPUTS
Individual Reports: Analyses of each aspect. Recommendations for improvement. Judgments (U/S/G)
Overview Report: Regional patterns of strengths and weaknesses. Recommended reforms and strategies. Comparative chart of judgments and indicators
10
PROJECT METHODOLOGIES
1) START FROM AN ADVANCED POINT
Academic Subject Review Method (QAA, UK). Generic: discipline-independent. Tested
2) ADAPT TO SUIT THE REGION’S NEEDS AND PRIORITIES
Additions: Expanded program specification; New key data and information (added annex); Cross-match of curriculum to an external reference (MFT); Amplified reporting (annex)
Benefits: Accommodate variations of systems and terminology; Bring up the detailed developmental conditions with regard to human and physical resources; In-depth view of strengths and weaknesses: mini-judgments on sub-issues; Indicators; Learn from experience: successive implementations (cycles)
3) INVEST IN CAPACITY BUILDING (means and outcome of the process)
Comprehensive handbook (three editions / three languages). Training of UK reviewers. Ownership
11
REVIEW CYCLE STRUCTURE
THREE WORKSHOPS + THREE PROCESSES ( 15 months)
WORKSHOP 1: Training on and discussion of the method. Planning of tasks and schedules
PROCESS 1: Internal (self-) evaluation of the program. Preparation of SE documents. Advisory support
WORKSHOP 2: Review of progress: group discussions, bilateral tutorials
PROCESS 2: First SED drafts; detailed feedback; submission of final SED documents
WORKSHOP 3: Training on external reviewing. Selection of reviewers (39/62; 33/48)
PROCESS 3: External review missions (UK + Arab reviewers). Final individual reports. Overview report
12
HIGHLIGHTS OF FINDINGS
COMPUTER SCIENCE
Academic standards quality gap (1 U, no G). Wide variations between universities
Programming fundamentals: predominant; emphasis on repetitive teaching of many languages. Wasteful. Weak theoretical foundation
Theory and computational mathematics: continuous mathematics; little discrete / computational methods
BUSINESS ADMINISTRATION
Academic standards quality gap (2 G, 4 U). Wide variations between universities
Weakness of quantitative methods / finance
Language of teaching
13
EVALUATION OF PROGRAMS – PHASE II
THIRD CYCLE OF REVIEWS: EDUCATION – STARTED June 2005
– 24 universities. New participation: 35 universities in 14 countries (Qatar, SA)
COMPLETED
– Workshops 1, 2 and 3 (July - Dec 2005)
– All SEDs and related documents submitted, commented on / finalized (Jan 2006)
– Review missions (May 5, 2006); each with 2 UK + 2 Arab reviewers
– Draft reports (June 2006)
– Reviewers’ coordination / editing meeting (June 7, 2006)
– Pending review: Islamic University of Gaza
FINAL REPORTS
– Final individual reports to universities: before Aug 31, 2006
– Overview report for regional distribution: before Dec 31, 2006
FOURTH CYCLE OF REVIEWS: LAW / ENGINEERING (2007-08)
14
PROJECT COMPONENT B
STUDENT TESTING
15
MODEL
THE MAJOR FIELD TEST (MFT) / EDUCATIONAL TESTING SERVICES (ETS) /PRINCETON, USA
Multiple-choice two-hour test. Published detailed curriculum of topics (with weights), updated periodically. Represents core topics of knowledge and skills across a wide cross-section of American education.
ETS provides statistical analysis for each group of test takers, enabling comparison with other groups and with much larger international groups.
16
IMPLEMENTATION
• TWO CYCLES
– Spring 2003: English-based CS and BA (788 senior students from 17 universities)
– Spring 2004: French- and Arabic-based CS and BA (921 senior students from 16 universities). Translation / administration carried out by the UNESCO Beirut Office
– Total: 1808 senior students
• REGIONAL TEST ADMINISTRATION
– Workshop
– ETS Manual + added guidelines
– Independent observers; added security
– Regional scheduling
• TEST OUTCOMES
– Results with analyses sent by ETS to each university and to the project
– Regional analyses by the project. Overview reports being prepared, one on each cycle
– Regional comparison, and comparison with international groups of test-takers (3,029 test-takers from 133 institutions for CS and 24,715 from 359 institutions for BA)
17
TEST RESULTS
ILLUSTRATIVE EXAMPLES
COMPUTER SCIENCE (ENGLISH-BASED)
– Distribution of grades for the Arab group: normal, but grades A and A+ under-populated: total 6% compared with 20% for the international group. Another manifestation of the quality gap.
– Combination with the outcomes of the program review shows clear potential for improvement. Examples: programming and computational theory.
BUSINESS ADMINISTRATION (ENGLISH-BASED)
– Very surprising result: A and B and part of C grades under-populated
– Possible role of teaching language; BA culture. Requires regional consultation.
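The under-population of top grades is, at bottom, a proportion comparison between two groups of test-takers. A minimal sketch with invented grade counts (not the actual ETS data):

```python
# Sketch: share of top grades (A and A+) in two groups of test-takers.
# Grade counts below are invented for illustration; the real comparison
# used the statistical analyses supplied by ETS.

def top_grade_share(grade_counts, top=("A", "A+")):
    """Proportion of test-takers whose grade is in `top`."""
    total = sum(grade_counts.values())
    return sum(grade_counts.get(g, 0) for g in top) / total

arab_group = {"A+": 10, "A": 50, "B": 300, "C": 400, "D": 240}
international = {"A+": 150, "A": 450, "B": 1100, "C": 900, "D": 400}

print(round(top_grade_share(arab_group), 2))      # 0.06
print(round(top_grade_share(international), 2))   # 0.2
```

With these illustrative counts the sketch reproduces the kind of gap reported above (6% vs. 20% of candidates in the top grades).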
18
STUDENT TESTING – 4
Lessons Learned
• Feasibility and value of coordinated regional testing: regional guidelines, use of independent observers, optimum utilisation of resources, regional comparisons, competition
• Importance of university / regional analyses: correlations
• English-based tests: importance of analysing the impact of student language proficiency on performance (especially for BA)
• Arabic / French-based tests: importance of analysing the impact of translation accuracy on performance
• Analyses by ETS showed the cultural bias of the tests was not pronounced
19
STUDENT TESTING – 5
Project Phase II: Administration of MFT – Education
CHALLENGES: Education is a culturally sensitive field. ETS claim: the test covers core knowledge and skills. An Arabic version is needed (21 out of 24 universities)
PILOT STAGE: Produced an Arabic version in consultation with universities. Estimated culturally (contextually) biased questions (25%). Administered the test to a sample of 600 students from 12 universities (June 2006)
PLAN: Analyse results with universities. Weed out culturally biased questions. Add questions by university (25%). Administer the test to all universities in spring 2007
20
PROJECT COMPONENT C
STATISTICAL DATABASE DEVELOPMENT
21
AIM AND INTENDED BENEFITS
AIM: Develop a model for the collection and dissemination of management information to be adopted by Arab universities.
BENEFITS
• Comparability of information (statistical indicators): national, regional, international (HE planning, research, international presence)
• Provide access to reliable, updated HE data and information to all stakeholders (public accountability)
• Strengthening of communication with stakeholders: students, employers, HE ministries, other universities
• Informing funding processes (UK experience)
22
CHALLENGES
ARAB UNIVERSITIES HAVE DIVERSE:
– Academic / organizational structures (governance, faculties, departments)
– Program structures (credits, semesters, degrees)
– Methods of internal data management (e.g. levels of centralization, computerization)
– Definitions of data variables
– Commitment to public access to data (e.g. annual reports, press releases; all selective practices)
23
AS A RESULT
Users (stakeholders) encounter difficulty in accessing, comparing and interpreting HE data – e.g. poor Arab HE data in regional / international reports. Credit: pioneering work of Dr. Subhi Al-Qassem
Needed: a common set of data definitions that:
– Interfaces with, rather than replaces, internal systems / definitions of data
– Accommodates diversity among universities
– Is agreed through regional field application, testing and consultation
– Is adhered to by all (participating) universities
24
METHOD
MODEL
– UK Higher Education Statistics Agency (168 institutions). Statutory. System-independent
– With adaptations (below)
DEFINITIONS (32) UNDER:
– Cost centres (academic / non-academic), subject areas, reporting period, program level, mode of study, staff costs, etc.
DATA SPECIFICATIONS: template tables (14)
ADAPTATIONS (CHALLENGING)
– Cost centres
– Staff rank, non-academic staff
– Language of teaching
– Program-based loads
– Aggregated finances
– Interface tables
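The point of common definitions is that, once student and staff loads are reported against the same agreed cost-centre labels, indicators become directly comparable across universities. A minimal sketch of one such indicator, the student-staff ratio (SSR), using hypothetical record layouts and figures (not the project's actual schema):

```python
# Minimal sketch: computing a comparable student-staff ratio (SSR)
# per cost centre from records that follow common data definitions.
# Field names and FTE figures are hypothetical, for illustration only.

from collections import defaultdict

# FTE counts reported under agreed cost-centre labels
students = [
    {"cost_centre": "Humanities", "fte": 5200},
    {"cost_centre": "Humanities", "fte": 1300},
    {"cost_centre": "Civil Eng.", "fte": 480},
]
staff = [
    {"cost_centre": "Humanities", "fte": 95},
    {"cost_centre": "Civil Eng.", "fte": 30},
]

def ssr_by_cost_centre(students, staff):
    s = defaultdict(float)
    t = defaultdict(float)
    for rec in students:
        s[rec["cost_centre"]] += rec["fte"]
    for rec in staff:
        t[rec["cost_centre"]] += rec["fte"]
    # SSR = total FTE students / total FTE staff, per cost centre
    return {cc: round(s[cc] / t[cc], 1) for cc in s if t[cc] > 0}

print(ssr_by_cost_centre(students, staff))
```

Because every university maps its internal records to the same labels first, the interface tables mentioned above do the translation once, and the indicator computation itself stays identical everywhere.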
25
STUDENT NUMBER AND GROWTH
Student Number in 2001-02 and Growth since Previous Year
[Bar chart: total student numbers (0 – 120,000) for 15 individual universities, with annual growth rates ranging from -6% to 57%]
Total: 485,000; range 1,000 to 106,000 per university
Average: 32,000 (UK: 13,000)
Average growth: 7.4%; range -6% to 57%
Private universities: 22%
Public universities: 6.6%
UK: 5%
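The growth statistics above are simple year-over-year comparisons once enrolment is reported for two consecutive years under common definitions. A sketch with made-up enrolment figures (university names and numbers are invented):

```python
# Sketch: year-over-year enrolment growth per university and the average.
# University names and enrolment numbers are invented for illustration.

enrolment = {
    # university: (previous year, 2001-02)
    "U-A": (30000, 31500),
    "U-B": (4000, 4880),
    "U-C": (50000, 49000),
}

def growth(prev, curr):
    """Year-over-year growth as a percentage."""
    return 100.0 * (curr - prev) / prev

rates = {u: round(growth(p, c), 1) for u, (p, c) in enrolment.items()}
print(rates)
avg = round(sum(rates.values()) / len(rates), 1)
print(avg)
```

The sector averages reported above (private 22%, public 6.6%) are the same computation restricted to each group of universities.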
26
SPECIAL FEATURES
Student Population (2001-2002)
Level: Average / UK
Sub-degree*: 11.3% / 27.2%
First degree: 83.0% / 50.3%
Postgraduate**: 5.7% / 22.5%
Part-time (only in 3/15): 1% / UK: 44%
* Less sub-degree qualifications (vocational, employment-related).
** Intermediate: 4.5%, Doctorate: 1.2%. None in the 3 for-profit private universities.
*** Rigid modes of learning (lifelong, open, etc.)
27
Distribution of Students by Subject Area and Course Level
[Bar chart: proportion of the total number of students by subject area, ranging from 1.3% to 17.9%; series: students at the sub-degree and first degree level, and students at the intermediate and doctorate level]
STUDENTS PER SUBJECT AREA
Education, Business Administration, Law and Social Sciences: 55% Sciences, Engineering & Medicine: 27 % (UK: 41%)
Creative Arts ( none in 10 universities): 1.3%
28
Female % of Students and Staff (Average students: 47%; Average staff: 19%)
[Bar chart: female percentage of students and staff for universities U-1 to U-15, with average lines for each]
29
GENDER OF STUDENTS
• 46% (UK: 56%)
• Range: 27% to 65%
Female Students per University, 2001-02
[Bar chart: female percentage of students (0% – 70%) for 15 individual universities plus the average, with the UK average shown]
30
Age Distribution for All Students
[Bar chart: proportion of the total number of students by age band (under 20, 20-24, 25-29, 30-34, 35-39, 40-44, 45-49, 50 and over), Arab vs UK]
AGE OF STUDENTS
Majority below 25 years
6% aged 30 years or over (UK: 30%)
Compare to the UK (modes of learning)
31
LANGUAGE OF TEACHING - 1
Single language: 3 universities (2 Arabic, 1 English)
Teaching Language for All Fields (pie chart): Arabic 66%, English 18%, Combination 12%, French 4%
32
Social Sciences & Humanities (pie chart): Arabic 76%, Combination 12%, English 8%, French 4%
LANGUAGE OF TEACHING - 2
Science & Engineering (pie chart): Arabic 42%, English 39%, Combination 14%, French 5%
Medical Sciences (pie chart): English 56%, Arabic 33%, French 7%, Combination 4%
33
Student-Staff Ratio (SSR) by Cost Centre
[Bar chart: SSR for 31 cost centres, from 71 (Business & Manag. Studies), 68 (Humanities) and 67 (Info. Technology & Syst. Sciences) at the top down to 9 (General Eng.; Agr. & Forestry) and 5 (Mineral, Metal. & Materials Eng.); average: 34]
34
GENDER OF ACADEMIC STAFF BY RANK
Gender per Academic Rank, 2001-02
[Bar chart: male vs female staff percentages (0% – 25%) for Professor, Associate Professor, Assistant Professor and Lecturer]
20% Professors; 25% Associates; 31% Assistants; 24% Lecturers
Trend: rising (AHDR)
35
Expenditure per Student for Individual Universities, 2001-02
[Bar chart: expenditure per student in euros (0 – 16,000) for 13 individual universities plus the average]
EXPENDITURE PER STUDENT
4 below US$ 1,000
5 between US$ 1,000 – US$ 2,000
4 between US$ 2,000 – US$ 5,500
1 above US$ 13,000
36
END SLIDE