Activities in CoreGRID Work Package 3
University of Münster (UMUE, No. 22)
Fourth plenary meeting, January 2006, University of Westminster, London, UK
THE PVS WORK GROUP AT WWU
Researchers involved in CoreGRID activities:
• Sergei Gorlatch: head of the group, leader of integration activities
• Martin Alt: PhD student; expertise: performance prediction, RMI
• Jan Dünnweber: PhD student; expertise: grid components, Web services
• Jens Müller: PhD student; expertise: interactive systems, grids & games
COOPERATION PROJECT 1: SKELETON OPTIMIZATIONS
• Involved partners: UNIPI & WWU: Marco Aldinucci & Marco Danelutto (UNIPI), Jan Dünnweber & Sergei Gorlatch (WWU)
• Objectives:
– Enhanced performance
– Java implementation (Lithium library)
– Experiments in a grid environment
[Figure: unoptimized interaction: the client calls the servers in turn, so all data passes through it:
 1: A = Server1(f, a);  2: A;  3: B = Server2(g, A);  4: B]
• Results: Three novel optimizations
➜ Optimization 1: reduced communication requirements
➜ Optimization 2: increased degree of parallelism
➜ Optimization 3: dynamic load balancing
• How the optimizations work:
– Transfer of references
– Server-side multithreading
– Counting of active threads
[Figure: optimized interaction: only a reference to the intermediate result A crosses the network, and each server runs several worker threads (Thr1, Thr2, Thr3):
 1: A = Server1(f, a);  2: ref(A);  3: B = Server2(g, ref(A));  4: get(ref(A));  5: A;  6: B]
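The reference-transfer idea in the steps above can be sketched in plain Java. This is an illustrative stand-in, not the Lithium implementation: the shared store, method names, and ID scheme are assumptions made for the example.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.UnaryOperator;

// Sketch of Optimization 1 (reference transfer): a server keeps its
// result and returns only a small reference ID, so the intermediate
// data A never travels through the client.
public class RefTransferDemo {
    // stands in for server-side state reachable by both servers
    static final Map<Integer, int[]> store = new ConcurrentHashMap<>();
    static final AtomicInteger nextId = new AtomicInteger();

    // "Server 1": computes f(a), keeps the result, returns a reference
    static int compute(UnaryOperator<int[]> f, int[] a) {
        int id = nextId.incrementAndGet();
        store.put(id, f.apply(a));
        return id;                  // only the ID crosses the network
    }

    // "Server 2": resolves the reference locally and applies g
    static int[] computeOnRef(UnaryOperator<int[]> g, int refA) {
        return g.apply(store.get(refA));
    }

    public static void main(String[] args) {
        int refA = compute(x -> new int[]{x[0] * 2}, new int[]{21}); // steps 1-2
        int[] b = computeOnRef(x -> new int[]{x[0] + 1}, refA);      // steps 3-6
        System.out.println(b[0]);   // prints 43
    }
}
```

Because only `refA` is sent back, the size of the client's messages is independent of the size of the intermediate result, which is the source of the reduced communication requirements.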
• Publications and presentations:
➜ CMPP workshop presentation in Scotland (2004)
➜ CoreGRID technical report TR-0001 (2005)
➜ Book chapter in the "Grid Computing" book
PROJECT 2: HIGHER-ORDER COMPONENTS (HOCS)
➜ Grid component model developed by the PVS group in Münster
➜ Idea: components pre-packaged with implementation & pre-configured middleware support
➜ Customization of HOCs by transferring mobile code, e.g.,
public interface Worker<E> {
    public E[] compute(E[] input);
}

public interface Master<E> {
    public E[][] split(E[] input, int numWorkers);
    public E[] join(E[][] input);
}
➜ Code parameters can be plugged into the Farm-HOC, which is readily integrated with the target middleware (e.g., WSRF)
➜ Transparency of data distribution, parallelism and configuration
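Using the two interfaces above, the plugging of code parameters into a farm can be sketched locally. `FarmHOCDemo.farm` is an illustrative stand-in for the real, middleware-backed Farm-HOC: it runs the workers sequentially in one JVM, whereas the actual HOC distributes them.

```java
import java.util.Arrays;

// Minimal local sketch of how Master/Worker code parameters plug into
// a farm skeleton (names other than Worker/Master are assumptions).
public class FarmHOCDemo {
    interface Worker<E> { E[] compute(E[] input); }
    interface Master<E> {
        E[][] split(E[] input, int numWorkers);
        E[] join(E[][] input);
    }

    // the farm pattern: split the input, let each worker process its
    // chunk (sequentially here; the real HOC runs them in parallel),
    // then join the partial results
    static <E> E[] farm(Master<E> m, Worker<E> w, E[] input, int n) {
        E[][] chunks = m.split(input, n);
        for (int i = 0; i < chunks.length; i++)
            chunks[i] = w.compute(chunks[i]);
        return m.join(chunks);
    }

    public static void main(String[] args) {
        Master<Integer> master = new Master<Integer>() {
            public Integer[][] split(Integer[] in, int n) {
                Integer[][] parts = new Integer[n][];
                int size = (in.length + n - 1) / n; // ceiling division
                for (int i = 0; i < n; i++)
                    parts[i] = Arrays.copyOfRange(in,
                        Math.min(i * size, in.length),
                        Math.min((i + 1) * size, in.length));
                return parts;
            }
            public Integer[] join(Integer[][] parts) {
                return Arrays.stream(parts)
                             .flatMap(Arrays::stream)
                             .toArray(Integer[]::new);
            }
        };
        Worker<Integer> worker = in -> {
            Integer[] out = new Integer[in.length];
            for (int i = 0; i < in.length; i++) out[i] = in[i] * in[i];
            return out;
        };
        System.out.println(Arrays.toString(
            farm(master, worker, new Integer[]{1, 2, 3, 4}, 2)));
        // prints [1, 4, 9, 16]
    }
}
```

Note that the application code only supplies `split`, `compute`, and `join`; where and how the chunks are processed stays hidden, which is exactly the transparency claimed above.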
HIGHER-ORDER COMPONENTS IN COREGRID
➜ The HOC Web site: http://pvs.uni-muenster.de/pvs/forschung/hoc
➜ Documentation, code, examples, publications, ...
Two publications in the CoreGRID series (Springer Verlag)
➜ Volume 1: Component Models and Systems for Grid Applications
  "HOCs - Higher-Order Components for Grids", pages 157-166
  Martin Alt, Jan Dünnweber, Jens Müller, Sergei Gorlatch
➜ Volume 2: Future Generation Grids
  "From Grid Middleware to Grid Applications: Bridging the Gap with HOCs"
  Sergei Gorlatch, Jan Dünnweber
HOC-SA, THE HOC-SERVICE ARCHITECTURE
➜ The HOC-SA implements a runtime environment for HOCs using Web services
[Figure: HOC-SA architecture: the client uploads its code parameter to the Code Service and calls the Farm-HOC Web service through its WSDL interface; the farm implementation fetches mobile code by ID via a remote class loader (falling back to the local file system for local code), and a scheduler dispatches the Master and Worker units to the worker hosts]
➜ The necessary transfer of code units is addressed using the specially configured Code Service
➜ The HOC-SA was first presented at the International Conference on Services Computing (IEEE, SCC04)
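The core mechanism behind the Code Service, loading a shipped code unit on the HOC side, can be sketched with a custom class loader. All names here are illustrative assumptions, not the actual HOC-SA API; the bytecode store stands in for the Code Service.

```java
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

// Hedged sketch of the Code Service idea: code parameters live in a
// bytecode store and are materialized on the HOC side by a class
// loader that checks the store before delegating to its parent.
public class CodeServiceDemo {
    // stands in for the Code Service's store of mobile code units
    static final Map<String, byte[]> codeStore = new HashMap<>();

    // resolves classes from the store first, so shipped bytecode wins
    // over anything on the local classpath
    static class RemoteClassLoader extends ClassLoader {
        protected Class<?> loadClass(String name, boolean resolve)
                throws ClassNotFoundException {
            Class<?> already = findLoadedClass(name);
            if (already != null) return already;
            byte[] code = codeStore.get(name);
            if (code != null) {
                Class<?> c = defineClass(name, code, 0, code.length);
                if (resolve) resolveClass(c);
                return c;
            }
            return super.loadClass(name, resolve);
        }
    }

    // a sample "code parameter"; its bytecode is shipped via the store
    public static class Doubler implements UnaryOperator<Integer> {
        public Integer apply(Integer x) { return 2 * x; }
    }

    public static void main(String[] args) throws Exception {
        String name = "CodeServiceDemo$Doubler";
        // simulate the upload: read the compiled class into the store
        try (InputStream in = CodeServiceDemo.class
                .getResourceAsStream("/" + name + ".class")) {
            codeStore.put(name, in.readAllBytes());
        }
        // the HOC side: load the mobile code by name and run it
        Class<?> c = new RemoteClassLoader().loadClass(name);
        UnaryOperator<Integer> worker = (UnaryOperator<Integer>)
            c.getDeclaredConstructor().newInstance();
        System.out.println(worker.apply(21)); // prints 42
    }
}
```

The interface type (`UnaryOperator` here) is resolved through the parent loader, so the caller and the shipped class agree on it; only the implementation travels.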
THE HOC-SA IN COREGRID
Three presentations on system architecture & code mobility:
• Jens Müller & Jan Dünnweber:
  From Interactive Applications Towards a Network-Centric OS
  Workshop on Network-Centric OS, Brussels, March 2005
  http://www.cetic.be/internal.php3?id_article=284
  http://www.cordis.lu/ist/grids/operating_systems.htm
• Jan Dünnweber:
  Mobile Code for Component Customisation and Optimisation
  European Grid Technology Days, Brussels, June 1, 2005
  http://www.scai.fraunhofer.de/1451.0.html
• Jan Dünnweber:
  Using Mobile Code in Grid Applications
  Second CoreGRID WP3 Workshop, Barcelona, June 2005
  http://www.coregrid.net/mambo/content/view/187/30
• Code mobility for HOCs is documented in D.PM.02, Deliverable on the CoreGRID GCM, Section 4.4, pp. 10-12
COOPERATION PROJECT 3: HOCS, FRACTAL & PROACTIVE
• Involved partners: WWU & INRIA: Jan Dünnweber & Sergei Gorlatch (WWU); Françoise Baude, Virginie Legrand, Nikos Parlavantzas (INRIA)
• Objectives:
– Simplified development of HOCs
– Code mobility and state for Web services
• Publications and Presentations:
➜ GSTE workshop presentation, France, 2005
➜ CoreGRID integration workshop, Italy, 2005
➜ Proceedings of the ETSI Grid Plugtest
COOPERATION PROJECT 4: COMPONENT ADAPTATIONS
• Involved partners: WWU & UNIPI: Jan Dünnweber & Sergei Gorlatch (WWU); Marco Aldinucci, Sonia Campa, Marco Danelutto (UNIPI)
• Objectives:
– Enhance component flexibility
– Customize component behavior
– Algorithm adaptations as an extension to resource adaptation
• Publications and Presentations:
➜ CoreGRID Technical Report TR-0002
➜ CoreGRID integration workshop, Italy, 2005
COOPERATION PROJECT 5: INTEGRATION OF LEGACY CODE
• Involved partners: WWU & ENS Lyon: Jan Dünnweber & Sergei Gorlatch (WWU); Anne Benoit (École Normale Supérieure de Lyon)
• Objectives:
– Integrate C+MPI with Web services
– Use the eSkel library inside a HOC
[Figure: a grid client sends a request (e.g., an image filter) to the Web service host (Globus); the HOC passes XML-encoded parameters through a TCP gateway to an eSkel skeleton (e.g., a pipeline) on the MPI hosting platform, and the results travel back the same way]
• Publications and Presentations:
➜ HPC-Europa Technical Report: Component-based Grid Programming. A Case Study on Wavelets
➜ International Conference on Parallel Computing, Spain, 2005: Integrating MPI-Skeletons with Web Services
FELLOWSHIP PROGRAM: SCHEDULING HOC-BASED APPLICATIONS
• Fellow: Catalin L. Dumitrescu
• Involved partners: WWU & TU Delft:Jan Dünnweber & Sergei Gorlatch (WWU), Dick Epema (TU Delft)
• Scheduling of HOCs using KOALA
• HOC service sends to KOALA: pattern type & input
• KOALA runner builds RSL, sets the optimal mapping, and submits the application
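To make the RSL step concrete, here is a hedged sketch of building a Globus-style RSL string from attribute-value pairs. The specific attributes, values, and the `RslDemo`/`toRsl` names are illustrative assumptions, not KOALA's actual output.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of emitting an RSL (Resource Specification Language) string
// of the kind a KOALA runner could submit to Globus GRAM.
public class RslDemo {
    // serialize attribute-value pairs as &(key=value)(key=value)...
    static String toRsl(Map<String, String> attrs) {
        StringBuilder sb = new StringBuilder("&");
        for (Map.Entry<String, String> e : attrs.entrySet())
            sb.append('(').append(e.getKey()).append('=')
              .append(e.getValue()).append(')');
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> attrs = new LinkedHashMap<>();
        attrs.put("executable", "/opt/hoc/farm-worker"); // assumed path
        attrs.put("count", "8");                         // parallelism
        attrs.put("arguments", "\"--input part.dat\"");  // assumed args
        System.out.println(toRsl(attrs));
    }
}
```

The mapping chosen by KOALA would show up in such a specification as, e.g., the `count` value and the target resource attributes.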
THE COREGRID GCM
The Uni-Münster team participated in the writing of the first WP 3 Roadmap document. Among other contributions, we created the survey on existing grid component models (state-of-the-art section):
feature 1 2 3 4 5 6 7 8
ProActive yes ongoing yes ongoing yes yes yes
ICENI yes yes yes
Reflex yes yes
QUB yes yes
HOC-SA yes yes yes yes yes yes
Polytope yes yes yes yes yes supported yes
ASSIST yes yes yes yes yes
MALLBA yes yes yes
GAT yes yes yes
SUMMARY
• 2 CoreGRID technical reports
• 2 book chapters in the CoreGRID series
• 5 cooperations with 3 different partners
• 3 implementation projects
• 7 workshop presentations
• 2 conference publications on CoreGRID-related research
• 3 short visits
• 1 fellowship project within the CoreGRID Fellow Program
• Participation in the documentation of the GCM