INFORMATION TECHNOLOGY SERVICES University Data Center Project Overview January 11, 2010.
Topics
• Participants
• Project overview
• Current status
• Facility overview
Customer Steering Committee
• Bob Gloyd, Engineering (Co-location)
• Mark Jacaman, TRECS (Co-location)
• Mark McFarland, Libraries (Co-location)
• Robert O’Halloran, IQ (Managed Co-location)
• Joe TenBarge, Liberal Arts (Managed Co-location)
• John Scannell, PETEX (Managed Co-location)
• Dave Watson, PETEX (Managed Co-location)
Key customer contacts
• Executive sponsor: Brad Englert
• Project manager: William Green
• Data Center operations: Gary Henderson
• Customer liaison: John Lovelace
• Managed server contact: Bill Bova/Managed Services contacts
• Communications and committee support: Lisa Wright
Project definition
• The data center project will renovate an existing campus building to construct a state-of-the-art data center for The University of Texas at Austin
• The facility will address immediate needs for a reliable data center capable of supporting enterprise-wide services critical to faculty, staff and students
Project phases
• Construction
  – Actual construction tasks for repurposing the building into the new UT Austin Data Center.
• Migration planning and preparation*
  – Plan movement of ITS and customer systems to the new facility. Identify hardware, services and applications to be moved. Includes development of internal processes and approvals for the actual move.
• Occupancy planning and preparation
  – Plan building and white space network. Plan for operational staff to move to the renovated building. ITS Operations must be moved and operational before moving systems.
*Customer Steering Committee input required
Project phases
• Occupancy
  – Creating the building network in the white space and moving in operational staff.
• Move*
  – The actual movement of hardware, services and applications to the new DC facility. Some windows are fixed (the major move). Only ITS Internal systems move prior to the completion of the redundancy check.
• Communication and project management*
*Customer Steering Committee input required
Move phases
• ITS Internal Only
  – Systems that will allow for redundancy check and move process validation with zero customer impact.
• Redundancy Check
  – Ensures that systems react properly to equipment designed to ensure redundancy. Other systems do not move until this item is completed successfully.
• ITS Development
  – Additional systems with no customer impact.
Move phases
• Co-location customers
• Managed co-location customers
• ITS clustered systems
  – Services with redundancy built in that can be moved with minimal to zero downtime.
• ITS major move
  – December 2010-January 2011. Customer-facing services that will experience downtime, including the mainframe. No other systems will move during this period.
• May 2011 is an available window for moving systems
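The move phases above are strictly ordered: internal-only systems go first, the Redundancy Check gates everything after it, and no other systems move during the major move window. As a rough illustration (hypothetical names, not part of the project's actual tooling), that sequencing could be sketched as:

```python
# Hypothetical sketch of the move-phase ordering described in the slides.
# Phase names mirror the deck; the key rule is that each phase may begin
# only after every earlier phase has completed (so the Redundancy Check
# gates all customer moves).
MOVE_PHASES = [
    "ITS Internal Only",
    "Redundancy Check",
    "ITS Development",
    "Co-location customers",
    "Managed co-location customers",
    "ITS clustered systems",
    "ITS major move",
]

def can_move(phase: str, completed: set) -> bool:
    """Return True if every phase earlier in the sequence is complete."""
    earlier = MOVE_PHASES[:MOVE_PHASES.index(phase)]
    return all(p in completed for p in earlier)
```

For example, `can_move("ITS Development", ...)` is true only once both "ITS Internal Only" and "Redundancy Check" are complete.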
Current status
• Establishing the Customer Steering Committee
• Operating cost information submitted and price currently under discussion
• Revising project schedule based on upcoming changes
• Migration planning for ITS in progress
• Migration planning next for customers
• Change management process under development
• Construction on track
• Move vendor selection planning beginning
Facility benefits
• Leverage central institutional space, financial and staff resources to minimize need for like-kind, college-level investments
• Reclaim space for core academic needs
• Choose hardware and service options that meet departmental needs
• Protect research and departmental data with high levels of physical and information security
• Experience increased uptime and reliability for systems and services
• Support sustainability by using greener computing facilities
Facility features
• Redundant 10-gigabit uplink connection to campus and research networks
• 4,700 sq ft of raised floor space equipped with standard server cabinets, network switches and network connections
• Server co-location for individual or clustered research and administrative systems
• Staff work space and server build room to support system administrators in setting up and supporting systems
Facility features
• Availability 24 hours a day, 365 days a year
• Dedicated remote management network
• Systems monitoring and notification
• Physical and information security
• Consultation and assessment for academic and administrative units that want to move equipment into the new UDC
• Climate control, fire detection and suppression, and power systems