Computing Research Center, High Energy Accelerator Research Organization (KEK)
Site Status Report
Go Iwai, KEK/CRC, Japan
WLCG Tier-2 Workshop
Dec. 1–4, 2006
Tata Institute of Fundamental Research, Mumbai, India
2006/12/1 KEK/CRC, Japan, Go Iwai
Outline
• 1. Introduction
  – KEK
  – Network in Japan
• 2. Grid Deployment
  – Current Status
  – Strategy
  – Hosted VOs: Belle, ILC, APDG and so on
• 3. Grid Inter-operability
  – NAREGI: NAtional REsearch Grid Initiative
  – Relationship among LCG, NAREGI and KEK
• 4. Summary
KEK Computing Research Center
Introduction
Fig.: Overview of CRC
Fig.: Super Computer System
• Our mission related to computing and networking
  – Providing computing facilities for
    • KEK-B/Belle
    • ILC
    • J-PARC (Proton Synchrotron): K2K, T2K (long-baseline neutrino detection)
    • Accelerator design
    • Applications at the Synchrotron Radiation Facility: material science, life science, etc.
  – Networking
  – Security
  – Support for university groups in the field
• As an Inter-University Research Institute Corporation
HEPnet-J
• Originally, KEK organized the HEP institutes in Japan to provide networking among them
  – We started from 9600 bps DECnet in the early 1980s
  – KEK was one of the first Internet sites in Japan (1983?) and hosted the first web site in Japan (1992)
• The current network infrastructure is SuperSINET, operated by NII (National Institute of Informatics)
• The network will be upgraded to SINET3 in April 2007
• SINET3 will provide multi-layered network services with a 10–40 Gbps backbone
Introduction
HEPnet-J (cont.)
Network Topology Map of SINET/SuperSINET (Feb. 2006)
Introduction

Tab.: Line Speeds
– SINET (44 nodes): 100 Mbps – 1 Gbps
– SuperSINET (32 nodes): 10 Gbps
– International lines:
  • Japan–U.S.A.: 10 Gbps (to N.Y.), 2.4 Gbps (to L.A.)
  • Japan–Singapore: 622 Mbps
  • Japan–Hong Kong: 622 Mbps

Tab.: Number of SINET participating organizations (Feb. 2006)
– National: 81, Public: 51, Private: 273, Junior Colleges: 68, Specialized Training Colleges: 41, Inter-Univ. Res. Inst. Corp.: 14, Others: 182; Total: 710
Strategy on GRID
Grid Deployment
• Deployment at KEK for major groups
  – BELLE: ongoing experiment
  – ILC: near-future target
• University support
  – education and training
  – deployment at smaller centers
  – HEPNET-J VO
Fig.: Overview of the KEK-B accelerator
Fig.: Design of the ILC accelerator/detector
Recent Events
• Nov. 2005: HEP Data Grid Workshop
  – training and cooperation in the Asia-Pacific region
  – Lecturer: Dr. Marco La Rosa
  – a very good exercise for us
• Mar. 2006: First meeting on NAREGI/EGEE interoperability, at CERN
  – launched the Inter-OP project between NAREGI and EGEE
  – more about NAREGI later
• Aug. 2006: Belle workshop on Grid, at Nagoya Univ.
  – to share information within the Belle collaboration
• Sep. 2006: Japan-France Workshop on Grid Computing, at IN2P3/Lyon Univ.
Grid Deployment
Summary of LCG Deployment
• JP-KEK-CRC-01 (Pre-Production System)
  – since Nov. 2005
  – registered to GOC, ready for WLCG (Worldwide LCG)
  – operated by KEK staff
  – Site role:
    • practice for the production system JP-KEK-CRC-02
    • test use among university groups in Japan
  – Resources and components:
    • SL 3.0.5 w/ LCG-2.7 (upgrade to gLite 3.0 is done)
    • CPU: 14, Storage: 1 TB
  – Supported VOs: belle, apdg, dteam and ops
• JP-KEK-CRC-02 (Production System)
  – since early 2006
  – registered to GOC, ready for WLCG
  – operation outsourced to IBM Co., Ltd.
  – Resources and components:
    • SL or SLC w/ LCG-2.7 (upgrade to gLite 3.0 is done)
    • CPU: 48, Storage: 6 TB (not including HPSS)
  – Supported VOs: belle, apdg, atlasj, ilc, dteam and ops
• JP-KEK-CRC-00 (Testbed System)
  – since Jun. 2005
  – a closed environment compared with the other sites: easy to access and configure
  – Resources and components:
    • SL 3.0.5 w/ gLite 3.0 (100% pure)
  – Supported VOs: belle, apdg, atlasj and g4med
Grid Deployment
Other Grid-related Services
• We have our own Grid CA
  – started in Feb. 2006, recognized by LCG
  – accredited by the APGrid PMA
  – # of issued user certificates: 32
  – # of issued host certificates: 77
  – http://gridca.kek.jp/
• VO Membership Service
  – Supported VOs:
    • apdg: the VO for the Asia-Pacific Data Grid
    • belle: the VO for the Belle experiment
    • atlasj: the VO for the ATLAS collaborators in Japan
    • g4med: the VO for Geant4 medical applications
• Local Mirror Service
  – SL, SLC, LCG, gLite
  – Updating with apt-get takes ~30 minutes against the CERN or FNAL repositories, ~3 minutes with the KEK repository
  – http://hepdg.cc.kek.jp/mirror/
• Semi-automatic Installation Service
  – WNs can be installed semi-automatically with PXE (Preboot eXecution Environment) and a kickstart configuration file
  – http://hepdg.cc.kek.jp/install/
• Site Portal
  – http://grid.kek.jp/
Grid Deployment
Fig.: KEK Grid CA Web Repository
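The PXE + kickstart installation service above can be pictured with a minimal kickstart sketch. Everything below is hypothetical: the install-tree URL, repository layout and partitioning are invented for illustration, since the slides do not show the actual KEK configuration file.

```
# ks.cfg: hypothetical kickstart sketch for unattended WN installation
install
url --url http://hepdg.cc.kek.jp/install/sl305/   # hypothetical install tree
lang en_US
keyboard us
clearpart --all --initlabel
autopart
reboot

%packages
@ Base

%post
# Point apt-get at the local KEK mirror (hypothetical repository layout)
echo 'rpm http://hepdg.cc.kek.jp/mirror/ sl3/i386 os updates' >> /etc/apt/sources.list
```

A PXE boot server would then hand each WN a kernel command line pointing at this file, so nodes install identically without manual input.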
Belle VO
• Started using SRB and LCG
  – LCG sites: JP-KEK-CRC-01/02
• Data distribution service using SRB-DSI
  – Belle already has a few PB of data in total, including hundreds of TB of DST and MC
  – bulk file registration helps us: Sregister
  – we do not move any of the data
  – benefits both native SRB users and LCG users
• The VO is supported by KEK
  – Nagoya (JP), Melbourne (AU), Academia Sinica (TW), Krakow (PL), etc.
Grid Deployment
New B Factory Computer System
Grid Deployment
– New B Factory Computer System since March 23, 2006
– History of B Factory Computer System

Tab.: Performance history of the B Factory Computer System
                                      1997– (4 yrs)   2001– (5 yrs)    2006– (6 yrs)
Computing Server (SPECint2000 rate)   ~100 (WS)       ~1,250 (WS+PC)   ~42,500 (PC)
Disk Capacity (TB)                    ~4              ~9               1,000 (1 PB)
Tape Library Capacity (TB)            160             620              3,500 (3.5 PB)
Work Group Server (# of hosts)        3+(9)           11               80 + 16 FS
User Workstation (# of hosts)         25 WS + 68 X    23 WS + 100 PC   128 PC

Moore's Law: x2.0 per 1.5 years, i.e. x~6.3 over 4 years and x~10 over 5 years
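The Moore's-law footnote on the slide can be checked directly: if capacity doubles every 1.5 years, the growth factor over y years is 2^(y/1.5), which gives roughly x6.3 after 4 years and x10 after 5 years.

```python
# Moore's law: capacity doubles every 1.5 years,
# so the growth factor after y years is 2 ** (y / 1.5).
def moore_factor(years: float, doubling_period: float = 1.5) -> float:
    return 2.0 ** (years / doubling_period)

print(round(moore_factor(1.5), 2))  # 2.0  (one doubling period)
print(round(moore_factor(4), 2))    # ~6.35, matching the slide's "x~6.3"
print(round(moore_factor(5), 2))    # ~10.08, matching the slide's "x~10"
```

Note that the actual jump in the table (from ~1,250 to ~42,500 SPECint2000 in 5 years) outpaces this simple extrapolation.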
Belle Grid Deployment Plan
• We are planning a two-phase deployment for the BELLE experiment.
  – Phase 1: BELLE users use a VO in JP-KEK-CRC-02, shared with other VOs
    • JP-KEK-CRC-02 consists of the "Central Computing System", maintained by IBM Corporation
    • Available resources: CPU: 72 processors (Opteron), SE: 200 TB (with HPSS)
  – Phase 2: deployment of JP-KEK-CRC-03 as the BELLE production system
    • JP-KEK-CRC-03 uses a part of the "B Factory Computer System" resources
    • Available resources (maximum estimate): CPU: 2,200 CPUs, SE: 1 PB (disk), 3.5 PB (HSM)
    • This system will be maintained by CRC and NetOne Corporation
Grid Deployment
Belle Grid Deployment Plan (cont.)
• We are planning to federate with Japanese universities.
  – KEK hosts the BELLE experiment and acts as the Tier-0.
  – Universities with reasonable resources: full LCG sites (Tier-1)
  – Universities without resources: UI only
  – The central services such as VOMS, LFC and FTS are provided by KEK.
  – KEK also covers the web information and support services.
  – Grid operation is shared, with 1–2 staff at each full LCG site.
Grid Deployment
Fig.: Preliminary design of the federation: JP-KEK-CRC-02 and JP-KEK-CRC-03 (deployed at Phase 2) at the Tier-0 center, surrounded by university UIs at the Tier-1 sites.
Other VOs
• ILC
  – The initial goal is data sharing among institutes in Japan, Asia and worldwide
    • 1–10 TB in total
• APDG
  – the VO for the DataGrid in the Asia-Pacific region
• Atlas Japan
  – the VO for the ATLAS collaborators in Japan, not for worldwide ATLAS
NAREGI: NAtional REsearch Grid Initiative
• National Research Grid Initiative (NAREGI)
  – Apr. 2003: MEXT funded NAREGI as a 5-year project
  – led by Prof. Ken Miura (NII)
  – development of Grid infrastructure and an application, for promotion of the national economy
  – the target application is nanoscience and technology for new material design
  – Players:
    • computing & networking: NII, AIST, TITECH
    • material scientists: IMS, U. Tokyo, Tohoku U., Kyushu U., KEK, ...
    • companies: Fujitsu, Hitachi, NEC
  – distributed facility: a computing Grid of up to 100 TFLOPS in total
  – extended to 2010 as a part of the National Peta-scale Computing Project
Grid Inter-operability
Fig.: NAREGI testbeds on SuperSINET (as of 2004): Application Testbed, 10 TFLOPS (1,618 CPUs); Software Testbed, 5 TFLOPS (896 CPUs); sites include NII, IMS, Tokyo and Nagoya.
Collaboration with NAREGI
• What we expect from NAREGI
  – better quality
  – easier deployment
  – better support in our native language
• What we need but still seems missing in NAREGI
  – file/replica catalogue and other data-Grid functionalities
    • needs more assessment
    • comes a little late, and earlier is better for us: we need something working today!
  – beta-1 requires the commercial version of PBS
• LCG (LHC Computing Grid) is now based on gLite 3
  – the only middleware available today that satisfies the HEP requirements
  – US people are also developing their own
• Difficulties
  – support: language gaps
  – quality assurance
  – assumes rich manpower
Grid Inter-operability
LCG and NAREGI Inter-operability
• NAREGI has much interest in interoperability because they came late, and they decided to establish it on their side
• First meeting at CERN
  – March 2006
  – NAREGI, LCG and people from KEK
• Second meeting at GGF Tokyo
Grid Inter-operability
KEK Plan on GRID Inter-op
• NAREGI will implement LFC on their middleware
  – We assume job submission between them will be realized soon
  – share the same file/replica catalogue space between LCG and NAREGI
  – move data between them using GridFTP
• NAREGI <--> SRB <--> LCG will also be tried
  – using SRB-DSI
• Assessments will be done for:
  – command-level compatibility (syntax) between NAREGI and LCG
  – job description languages
  – software in experiments
• ILC (International Linear Collider) will be a target
  – interoperability among LCG, OSG and NAREGI will be required
Grid Inter-operability
Summary
• We, the KEK Computing Research Center, are working on the GRID mainly for Belle and ILC
  – Belle has started to use LCG
  – ILC will follow soon
  – university support as well
• 3 LCG sites are in operation at KEK.
• Other Grid-related services are in operation as well.
  – CA, VOMS, mirror, installation and documentation
• Grid interoperability between NAREGI and LCG will be established.
Thank You
K. Amako (1,2), J. Ebihara (3), Y. Iida (1), K. Inami (4), K. Ishikawa (5), M. Kaga (4), S. Kameoka (1,2), S. Kawabata (1), K. Kawagoe (6), A. Kimura (7), Y. Kiyamura (6), M. Matsui (5), K. Murakami (1,2), H. Sakamoto (8), T. Sasaki (1,2), S. Suzuki (1), Y. Watase (1), S. Yashiro (1) and H. Yoshida (9)

(1) High Energy Accelerator Research Organization (KEK)
(2) Japan Science and Technology Agency (JST)
(3) SOUM Co., Ltd.
(4) Nagoya University
(5) IBM Japan Systems Engineering Co., Ltd.
(6) Kobe University
(7) Ashikaga Institute of Technology
(8) ICEPP, University of Tokyo
(9) Naruto University of Education
Backup
ILC VO
• GRID for ILC is sponsored by
  – the GAKUJYUTSU SOUSEI budget (a grant from MEXT)
  – the French-Japan Joint Laboratory Program
• Initial goal
  – a tool to share data of 1–10 TB total size among institutes in Japan, Asia and worldwide
Grid Deployment
APDG VO
• Asia Pacific Data GRID
• Collaboration among Academia Sinica (TW), Center for HEP-Korea, University of Melbourne and KEK
• Regular meetings, workshops and conferences
• We are seeking tighter collaboration with ASGC
  – the GOC in Asia
Grid Deployment
ATLAS JAPAN VO
• A federation among ICEPP, Kobe Univ., Nagoya Univ. and KEK
  – Okayama Univ. and Hiroshima-IT are also potential sites
• VO usage
  – testing inter-connectivity among ICEPP, Kobe Univ. and KEK
  – testing functions of the middleware
  – measuring the performance of data sharing
• The ATLAS RC is hosted by ICEPP, not by us
Grid Deployment