Institute for IPEDS Educators
July 16-17, 2015
Washington, DC
National Center for Education Statistics
Association for Institutional Research
2015 Institute for IPEDS Educators
Contents
Meeting Agenda
Contacts
    2015-16 IPEDS Educators
    National Center for Education Statistics (NCES)
    RTI International (RTI)
    Association for Institutional Research (AIR)
Bios
    2015-16 IPEDS Educators
    National Center for Education Statistics (NCES)
    RTI International (RTI)
    Association for Institutional Research (AIR)
    Additional Speakers
2013-14 IPEDS Training Report
    Introduction
    Oversight of IPEDS Training Program
    Development of Training Curriculum
    IPEDS Workshops for Training of Institutional Personnel
    Online Training Materials
    Communications
NCES PowerPoint Slides
Meeting Agenda
Thursday, July 16
8:00 – 9:00 am: Registration

9:00 – 9:30 am: Welcome
Christopher S. Coogan, Chief of Staff & Deputy Director, AIR
Eric Godin, Director of Education, AIR
Tinsley Smith, Assistant Director of Contracts and Grants, AIR

9:30 – 10:00 am: Group Activity

10:00 – 10:20 am: Break

10:20 am – 12:00 pm: NCES Presentation & Activity
IPEDS Team, NCES:
Richard Reeves, Program Director
Tara Lawley, Acting Team Leader for Operations
Sam Barbett, Team Leader for Data Quality and Dissemination
Moussa Ezzeddine, Survey Director
Gigi Jones, Survey Director
Bao Le, Survey Director
Andrew Mary, Survey Director
Stefanie McDonald, Survey Director

12:00 – 1:30 pm: Lunch

1:30 – 3:30 pm: NCES Presentation & Activity Continued
IPEDS Team, NCES

3:30 – 3:40 pm: IPEDS Help Desk Update
Amy Barmer, RTI

3:40 – 4:00 pm: Break

4:00 – 5:00 pm: Introduction to IT Innovative Solutions
Ruba Hay, IT Innovative Solutions (INOVAS)
Friday, July 17
8:30 – 10:10 am: Overview of AIR’s IPEDS Workshops
IPEDS Data as the Public Face of an Institution
    Donyell Francis, Technical College System of Georgia
    Sandra Kinney, Higher Education Consultant
Keyholder Training
    Kurt Gunnell, Western Governors University
    Yvonne Kirby, Central Connecticut State University
IPEDS Data and Benchmarking: Supporting Decision Making and Institutional Effectiveness
    Erez Lenchner, CUNY LaGuardia Community College
    Joseph Stankovich, Skidmore College
Best Practices for Reporting and Using IPEDS Data to Improve Office Efficiencies
    Amy Ballagh, Georgia Southern University
    Kristina (Cragg) Powers, Bridgepoint Education
IPEDS Finance Training for IR Professionals
    Eric Atchison, Mississippi Institutions of Higher Learning
    Braden Hosch, Stony Brook University

10:10 – 10:30 am: Break

10:30 – 11:30 am: Discussion with NASFAA
Charlotte Etier, Research Analyst

11:30 am – 12:00 pm: Overview of AIR’s IPEDS Online Tutorials
Eric Godin, AIR

12:00 – 1:30 pm: Lunch

1:30 – 2:30 pm: IPEDS Course Overview
Christopher S. Coogan, AIR
Mary Ann Coughlin, Springfield College
Eric Godin, AIR

2:30 – 3:00 pm: Group Activity

3:00 – 4:00 pm: Looking Ahead and Conclusion
Christopher S. Coogan, AIR
Eric Godin, AIR
Tinsley Smith, AIR
Contacts
2015-16 IPEDS Educators
Eric Atchison Institutional Research Analyst Mississippi Institutions of Higher Learning Jackson, MS eatchison@mississippi.edu 228-324-2448
Amy Ballagh Associate Vice President for Student Affairs & Enrollment Management Georgia Southern University Statesboro, GA aballagh@georgiasouthern.edu 912-478-5256
Elizabeth Clune-Kneuer* Director of Institutional Research & Reporting Prince George's Community College Largo, MD cluneea@pgcc.edu 301-546-0721
Mary Ann Coughlin Associate Vice-President for Academic Affairs Springfield College Springfield, MA mcoughlin@springfieldcollege.edu 413-748-3038
Joseph Curtin Assistant Commissioner, Institutional Research & Analysis Utah System of Higher Education Salt Lake City, UT jcurtin@ushe.edu 801-321-7121
* First-year IPEDS Educator
Rebecca Drennen* Director, Institutional Research Berkeley College Woodland Park, NJ rjd@berkeleycollege.edu 973-278-5400 ext. 5753
Jennifer Dunseath Director of Institutional Research Rhode Island School of Design Providence, RI jdunseat@risd.edu 401-454-6386
Nancy Floyd Director of Institutional Research University of South Carolina-Columbia Columbia, SC nfloyd@sc.edu 803-777-1323
Donyell Francis Data Compliance Manager Technical College System of Georgia Smyrna, GA dfrancis@tcsg.edu 404-679-4954
Deborah Geil Data Analyst/System Developer Cleveland State University Cleveland, OH d.geil@csuohio.edu 216-687-4798
Kurt Gunnell Director of Institutional Research Western Governors University Salt Lake City, UT kgunnell@wgu.edu 801-924-4679
Thomas Haakenson Associate Provost California College of the Arts Oakland, CA thaakenson@cca.edu 651-894-2894
Lakia Hairston President P&A Scholars Beauty School Beverly Hills, MI lakiahairston@comcast.net 313-399-0594
Erin Holmes Associate Vice Provost University of Alaska Anchorage Anchorage, AK ejholmes@uaa.alaska.edu 907-786-1544
Braden Hosch Assistant Vice President for Institutional Research, Planning and Effectiveness Stony Brook University Stony Brook, NY Braden.Hosch@stonybrook.edu 631-632-6210
John Ingram* Associate Director of Enrollment Management Analytics Duquesne University Pittsburgh, PA ingramj2333@duq.edu 412-396-5610
Sandra Kinney Higher Education Consultant Alpharetta, GA sandra.kinney@gmail.com 678-4791-9709
Yvonne Kirby Director of Institutional Research and Assessment Central Connecticut State University Colchester, CT ykirby@ccsu.edu 479-601-3256
Michelle Landa Vice President of Academic Affairs Eastern Wyoming College Torrington, WY michelle.landa@ewc.wy.edu 541-881-5583
Erez Lenchner Senior Institutional Researcher CUNY LaGuardia Community College Long Island City, NY elenchner@lagcc.cuny.edu 718-482-6155
Marc LoGrasso Institutional Outcomes Assessment Analyst Bryant and Stratton College Getzville, NY mlograsso@bryantstratton.edu 716-250-7500 ext. 228
Carolyn Mata* Director of Research Georgia Independent College Association Atlanta, GA cmata@georgiacolleges.org 404-233-5433 ext. 23
Kristina Cragg Powers Associate Vice President of Institutional Research Services Bridgepoint Education San Diego, CA kristina.cragg@bridgepointeducation.com 866-475-0317 ext. 2312
Joseph Stankovich Director of Institutional Research Skidmore College Saratoga Springs, NY jstankov@skidmore.edu 518-580-5719
Naomi Stennes-Spidahl Director of Assessment and Institutional Research Viterbo University La Crosse, WI nrstennesspidahl@viterbo.edu 608-796-3481
Jill Triplett* Director Spelman College Douglasville, GA jtriple1@spelman.edu 404-270-5677
Gail Tudor* Director of Institutional Research Husson University Bangor, ME tudorg@husson.edu 207-941-7039
Christopher Vinger Director of Institutional Research New York School of Interior Design New York, NY CVinger@nysid.edu 646-935-5479
Hui-Min Wen Director of Institutional Research and Assessment New College of Florida Sarasota, FL hwen@ncf.edu 941-487-4601
Barbara Wharton Associate Provost, Institutional Research Ohio University-Main Campus Athens, OH whartonb@ohio.edu 614-593-1059
Ta-Tanisha Young Research Analyst Harper College Chicago, IL tyoung@harpercollege.edu 847-925-6121
National Center for Education Statistics (NCES)
Sam Barbett Team Leader for Data Quality and Dissemination samuel.barbett@ed.gov 202-502-7305
• General data questions
• IPEDS data release
• IPEDS files
Moussa Ezzeddine
Survey Director moussa.ezzeddine@ed.gov 202-502-7781
• Institutional Characteristics Header
• Human Resources
• Report Mapping
• Universe Maintenance
Gigi Jones
Survey Director gigi.jones@ed.gov 202-502-7428
• Data Feedback Report
• Graduation Rates
• Outcome Measures
• Admissions
Tara Lawley Acting Team Leader for Operations tara.lawley@ed.gov 202-502-7476
• IPEDS Training
• IPEDS coordinators
• Research and development
• IPEDS data collection
• Institution compliance
Bao Le
Survey Director 202-502-7328 bao.le@ed.gov
• Academic Libraries
• Fall Enrollment
• 12-month Enrollment
• Finance
Andrew Mary
Survey Director andrew.mary@ed.gov 202-502-7337
• Classification of Instructional Programs (CIP)
• Completions
• IPEDS data release
• IPEDS publications
Stefanie McDonald
Survey Director srmcdonald@air.org
• Institutional Characteristics
• Student Financial Aid
Richard Reeves
Program Director richard.reeves@ed.gov 202-502-7436
• IPEDS Program Director
• Postsecondary Institutional Ratings System
RTI International (RTI)
Amy Barmer Research Education Analyst Education Studies Division abarmer@rti.org 919-316-3403
Jamie Isaac
Senior Survey Director Education Studies Division isaac@rti.org 919-541-6342
Janice Kelly-Reid
Director, Survey Operations Program Education Studies Division jrk@rti.org 919-541-6020
Jennifer Pauli Research Education Analyst Education Studies Division jmize@rti.org 919-485-5598
IPEDS Data Collection Help Desk
ipedshelp@rti.org 1-877-225-2568
• IPEDS data submission
IPEDS Data Tools Help Desk
ipedstools@rti.org 1-866-558-0658
• IPEDS data tools
• College Navigator
• IPEDS Data Center
• Trend Generator
• Tables Library
Association for Institutional Research (AIR)
Meg Andraza
Project Coordinator mandraza@airweb.org 850-385-4155 ext. 123
Elaine Cappellino
Assistant Director for Member Services ecappellino@airweb.org 850-385-4155 ext. 109
Christopher S. Coogan Deputy Director & Chief of Staff ccoogan@airweb.org 850-385-4155 ext. 124
Eric Godin
Director of Education egodin@airweb.org 850-385-4155 ext. 108
Darlena Jones
Director of Assessment and Research djones@airweb.org 850-385-4155 ext. 146
Tinsley Smith
Assistant Director of Contracts and Grants tsmith@airweb.org 850-385-4155 ext. 106
Randy L. Swing
Executive Director rswing@airweb.org 850-385-4155 ext. 101
Bios
2015-16 IPEDS Educators
Eric Atchison serves as Institutional Research Analyst and IPEDS State Coordinator for the Board of Trustees of State Institutions of Higher Learning in Mississippi. Previously, Eric was the Research and Information Specialist at Delta State University. As part of the SLDS grant, he developed an interactive data portal that provides public access to longitudinal data for all public universities in Mississippi across a variety of topical areas. He was also involved in the creation of the performance funding allocation model implemented in 2013. Eric has been active in MAIR, SAIR, and AIR since 2008, and his research interests include institutional well-being, student financial aid, and student outcomes. As the IPEDS Coordinator, Eric works with 36 institutions to ensure consistent and accurate data reporting and presents on various IPEDS tools and updates within Mississippi.
Amy Ballagh is Associate Vice President for Student Affairs and Enrollment Management at Georgia Southern University in Statesboro, Georgia. Prior to joining the GSU staff in 2006, she spent seven years at Ogeechee Technical College, where she was a faculty member and the Director of Institutional Research and Evaluation. Her areas of expertise include assessment, strategic planning, financial management, and grant management. During her 15 years as a higher education administrator, she has used IPEDS data to conduct comparisons with peer institutions in order to make informed, strategic decisions on her campus.
Elizabeth Clune-Kneuer is the Director of Institutional Research and Reporting at Prince George’s Community College in Largo, Maryland. Prior to joining the Office of Planning, Assessment, and Institutional Research (OPAIR) staff in May 2015, she spent eight years at St. Mary’s College of Maryland as the Assistant then Associate Director of Institutional Research. Elizabeth is an MdAIR Past President and a current NEAIR Steering Committee member and Membership Chair. In 2009, Elizabeth received an AIR/NCES Fellowship for Institutional Research to support her attendance in the Pennsylvania State University’s Graduate Certificate Program in Institutional Research. In 2010, Elizabeth received a Fellowship to the National Summer Data Policy Institute sponsored by AIR/NSF/NCES.
Mary Ann Coughlin is Associate Vice-President for Academic Affairs at Springfield College in Springfield, Massachusetts. Over her tenure at Springfield College, Mary Ann has served in a variety of positions, including faculty member, President of the Faculty Senate, and her current administrative role in Academic Affairs. Across these positions, she has supervised academic support services and provided leadership for outcomes assessment initiatives, academic progress reviews, and institutional research. Coughlin is a Past President of AIR and was awarded the 2012 AIR Outstanding Service Award. In addition to being well known within AIR for her training and workshops in statistics and for serving as a faculty member for the Foundations Institute and the Statistics Institute, she also contributes significantly to the development of training materials and resources on the use of IPEDS data tools.
Kristina Cragg Powers has nearly 15 years of experience in higher education, including serving as the head of institutional research at two institutions. As Associate Vice President for Institutional Research Services at Bridgepoint Education, she provides leadership and project management for institutional research and data-driven decision-making support for senior-level administrators. She has produced more than 50 national and regional conference presentations on institutional research in the areas of management, analyses, policy, and student success. Her scholarship has been published in numerous higher education journals. In 2014, she was selected to serve as lead author on a project funded by a grant AIR received from the Bill & Melinda Gates Foundation to develop Statements of Aspirational Practice for Institutional Research. The project focused on the changing landscape of higher education, data sources, and data tools, and the work reflects what matters most to future-looking IR offices in order to remain relevant and effective in decision support roles. Dr. Powers was elected Vice President of the California Association for Institutional Research for 2015.
Joseph Curtin is Assistant Commissioner for Institutional Research & Analysis for the Utah System of Higher Education (USHE) and served from 2008 to 2015 as the IPEDS State Coordinator for the eight public colleges and universities in USHE as well as three private not-for-profit universities within the state. Prior to working in the Utah System Office, he spent twelve years as the IPEDS Keyholder at Brigham Young University and six years as the IPEDS Coordinator at Utah Valley State College (now Utah Valley University). With over 25 years of experience working in institutional research at both public and private institutions, Joseph has gained extensive experience using IPEDS data in support of strategic planning efforts and outcome measures.
Rebecca Drennen is Director of Institutional Research at Berkeley College. She is responsible for external reporting, including IPEDS, for both Berkeley College-Woodland Park and Berkeley College-New York. Over her 26 years with the institution she has held numerous positions. Among the projects she has been involved in are reestablishing the Institutional Research Department in 1999; coordinating the centralization of all external reporting, including IPEDS and state reporting; assisting with gainful employment reporting data collection; drafting the College’s record retention policy; assisting with the development of new programs, including the first baccalaureate degree offered at Berkeley; aiding in the compilation of several Middle States Self-Study and Periodic Review Reports; and serving as the PeopleSoft Student Records functional lead for Berkeley. She has been an IPEDS Keyholder for a less-than-two-year career school, a two-year institution, and a four-year institution. Rebecca served as a member of the Steering Committee of the New Jersey Association for Institutional Research (NJAIR) from 2004-2005 through 2006-2007, including two years as Treasurer of the organization. She is a longtime member of AIR and the North East Association for Institutional Research (NEAIR).
Jennifer Dunseath has 25 years of experience in institutional research and is currently Director of IR at the Rhode Island School of Design (RISD). She has held IR positions at the University of Michigan-Flint, Kettering University in Flint, Michigan, and Framingham State University in Framingham, Massachusetts. She has been an IPEDS Keyholder for most of her career and an IPEDS Trainer from 2000 to 2003 and from 2009 to the present. She has also been part of an IPEDS Curriculum Revision Team. In addition, Jennifer has served as a Baldrige Examiner and an Evaluation Team Member for the New England Association of Schools and Colleges. She is a former Chair of the Michigan Association for Institutional Research.
Nancy Floyd is Director of Institutional Research at the University of South Carolina; her responsibilities include submissions to the state Commission on Higher Education for all eight USC campuses. She serves as IPEDS Keyholder for the flagship Columbia campus and four two-year campuses, and as IPEDS Coordinator for three additional four-year system campuses. In her 22 years in higher education, she has worked in institutional effectiveness at Midlands Technical College, managed a test bank for the South Carolina Criminal Justice Academy, served as President of SCAIR and treasurer of the Southern Universities Group (SUG), and taught SAIR preconference workshops in basic statistics for institutional research. Her work in coordinating a multi-campus system’s IPEDS responses and in benchmarking with SUG has sparked an interest in increasing the clarity and usefulness of operational definitions. Other areas of interest include the validity of national survey constructs and the coordination of IT and IR.
Donyell Francis is Data Compliance Manager at the Technical College System of Georgia (TCSG), where he serves as the IPEDS State Coordinator and oversees statewide data collection and reporting. Prior to joining TCSG in 2010, he spent twelve years at Life University, where he was Manager of Information Technology for six years and later Associate Director of Institutional Research and IPEDS Keyholder. Donyell has given several presentations and trainings at colleges throughout Georgia and for professional organizations across the country, demonstrating how to use SQL to pull data and analytical tools such as PivotTables to analyze data. He is conducting workshops with IPEDS Keyholders in his system on how to set up dashboards at their colleges using data downloaded from the IPEDS Data Center.
Deborah Geil brings more than 20 years of diverse higher education expertise to her role as a Data Analyst/Systems Developer in the Institutional Research Department at Cleveland State University. She also teaches computer science and management information systems at Bowling Green State University. Additionally, she is a member of the Instructional Design Team at Northcentral University.
Kurt Gunnell is Director of Institutional Research at Western Governors University (WGU) in Salt Lake City, UT, and is currently the IPEDS Keyholder at WGU. Prior to his current position, he was Associate Director of Institutional Research at the Kansas Board of Regents for 10 years. As part of his duties there, he was an IPEDS State Coordinator and worked with all reporting public and private universities and colleges in Kansas. Additionally, he has held several adjunct faculty positions at Washburn University, Baker University, and Kansas State University. Kurt has been an IPEDS Trainer for eight years. His professional experience has focused on IPEDS reporting at the state level, transfer student analysis, and emerging distance education issues.
Thomas Haakenson is Associate Provost at California College of the Arts (CCA). He oversees all of CCA's accreditation and assessment activities. He has participated in workshops on IPEDS data and benchmarking in support of decision making and institutional effectiveness. He also has presented at various accreditation conferences, including those of the North Central Association of the Higher Learning Commission (HLC) and the National Association of Schools of Art and Design (NASAD).
Lakia Hairston is President of P&A Scholars Beauty Schools in Michigan. She has an undergraduate degree in Business Management and a Master’s degree in Business Administration. Her background is in compliance management, ensuring conformity with local, state, and federal governments as well as accreditation commissions. She is an IPEDS Trainer who brings nine years of experience as an IPEDS Keyholder. She joined the IPEDS team in order to learn every element of IPEDS and to be a helpful resource to her local community of postsecondary schools.
Erin Holmes is Associate Vice Provost for Institutional Research at the University of Alaska Anchorage. Prior to joining UAA in 2013, she spent five years at Black Hills State University in Spearfish, SD as Director of Institutional Research and Assessment. Her areas of expertise include assessment, accreditation, and statistical analysis. Over the course of her career, she has used IPEDS data in institutional accreditation efforts, strategic planning and analysis. Most recently, she has applied cluster analysis techniques to update UAA’s institutional peer lists.
Braden Hosch is Assistant Vice President for Institutional Research, Planning, and Effectiveness at Stony Brook University where he leads all aspects of institutional research, helps shape policy and planning, and oversees institutional effectiveness efforts. Braden conducts large-scale national benchmarking for use at the institution and state-level using the IPEDS Data Center, with a special focus on constructing measures of institutional effectiveness, including financial metrics, and he contributes actively to state, regional, national and international efforts to advance quality and accountability in higher education. Prior to his position at Stony Brook, he served as the Chief Policy and Research Officer for the Connecticut Board of Regents for Higher Education, where he led accountability initiatives, helped shape policy for higher education, oversaw statewide data collections, and led higher education research operations and initiatives for the 17 Connecticut State Colleges and Universities (ConnSCU).
John Ingram is Associate Director of Enrollment Analytics at Duquesne University in Pittsburgh. In this position, he has applied 18 years of institutional research experience to create best practices that improve the IPEDS data submission process at the university. John conducts insight analytics using IPEDS data to guide initiatives in student recruitment, student support, teaching, and academic program reviews. Before working at Duquesne, he was a research analyst and IPEDS Keyholder at Indiana University of Pennsylvania, where he was responsible for reporting key performance funding measures to the state system. At the University of Evansville, John was Coordinator of Institutional Research and a Systems Application Analyst. He has presented at the NEAIR annual conference and, outside higher education, at the X Change, Ernst and Young’s annual digital analytics summit. He is currently authoring position papers exploring back-of-the-envelope answers to data mining questions.
Sandra Kinney is currently an independent higher education consultant. Previously, Sandra was the Vice President for Institutional Research and Planning at the Louisiana Community and Technical College System, Research Manager at the Technical College System of Georgia and an Assistant Director for Institutional Research and Planning at Clayton State University. She has experience in IPEDS as both an IPEDS Keyholder and State Coordinator. Sandra has over 22 years of institutional research experience in both public two-year and four-year colleges and has served on the National Postsecondary Education Cooperative (NPEC) for IPEDS surveys. She is a past president of the Georgia Association of Research, Planning and Quality.
Yvonne Kirby is Director of Institutional Research & Assessment at Central Connecticut State University in New Britain, Connecticut. She coordinates and oversees the office’s Federal, State, and other mandatory reporting and ensures timely and accurate submissions. She is also responsible for coordinating assessment activities on campus and analyzing data to help inform senior leadership decisions. Yvonne’s higher education experience includes serving as an IPEDS Keyholder and using the IPEDS Data Center for benchmarking at the state, regional and national level to help inform and support institutional policy and long-range planning.
Michelle Landa is Vice President of Academic Affairs at Eastern Wyoming College. She has worked in institutional research for over six years in Idaho and Oregon. Previously, Michelle was a faculty member at two community colleges in California. Michelle’s area of expertise is providing information and direction that assists the college in understanding and improving student outcomes and success and evaluating key performance indicators for college-wide quality improvement. She has been an IPEDS Keyholder at both public and private institutions.
Erez Lenchner serves as Senior Institutional Researcher at the City University of New York (CUNY), LaGuardia Community College. Before joining LaGuardia he served as an Institutional Researcher at the Research Foundation of CUNY. His areas of expertise are optimization problems and administrative longitudinal data analysis, with particular attention to the progress and success of community college students. He has also served as an external evaluator for various grant programs (with a total value exceeding $6 million as of January 2013) and has served on Technical Review Panels (TRPs) for NCES and RTI. He uses IPEDS to provide comparative statistical analyses and targeted information for students, programs, and institutional planning.
Marc LoGrasso is Institutional Outcomes Assessment Analyst for Bryant & Stratton College and is the IPEDS Coordinator for the College’s 18 campus locations in four states and its Online Education division, positions he has held since 2010. Beginning in 2010, Marc aligned the efforts of the College’s IT, Registrar, Financial Aid, Business Office, and Human Resource functions. His major current projects include analyzing data to direct and support the College’s next Strategic Plan, an in-depth predictor analysis for both graduation and retention rates, and further streamlining the IPEDS collection and upload processes to reduce the workload of the functions supporting these processes. Marc has also recently been an AIR Publications Peer Reviewer and served on IPEDS Technical Review Panels related to the Finance survey.
Carolyn Mata is Director of Research for the Georgia Independent College Association and serves as coordinator for the 27 independent, not-for-profit colleges and universities in Georgia. A former professor of Sociology with experience in data analysis, Dr. Mata was awarded a fellowship in 2012 to attend the National Data Institute of the National Center for Education Statistics. She serves on the Georgia Governor’s Office of Student Achievement Executive Researcher Committee and assists with the development of a statewide longitudinal data system. In addition, she currently serves as a representative on the Department of Education’s National Postsecondary Education Cooperative and is a frequent participant in Integrated Postsecondary Education Data System (IPEDS) Technical Review Panels.
Joe Stankovich is Director of Institutional Research at Skidmore College in Saratoga Springs, NY. His background is in education, sociology, and applied social research. Entering the field of institutional research upon graduation, he has been active in IR for over 15 years. Joe has been an IPEDS Keyholder and uses the IPEDS reporting tools routinely to answer questions of internal institutional interest and concern. He has used IPEDS data for just about everything, but is particularly interested in the use of these data to supplement knowledge in making efficient and informed decisions.
Naomi Stennes-Spidahl is Director of Assessment and Institutional Research at Viterbo University in La Crosse, Wisconsin. She has been Viterbo’s IPEDS Keyholder for the last five years. She coordinates federal, state, and accreditation reporting, ensures that effective assessment is used to strengthen student learning, and oversees research and analysis for informed decision-making. She brings over twenty years of diverse experience in higher education to the collaborative work of targeted action based on research. She often utilizes the IPEDS Data Center to contextualize findings for Viterbo’s administration and faculty for the purposes of analysis and long-range planning.
Jill Triplett is the Director of Institutional Research, Assessment, and Planning at Spelman College. She has extensive experience in institutional effectiveness, institutional research, management, and accreditation. She has served on several committees with the Southern Association of Colleges and Schools Commission on Colleges, presented at conferences, and consulted with various education entities. Prior to assuming her position at Spelman in 2009, Ms. Triplett served for more than 10 years in the Technical College System of Georgia in both institutional effectiveness and student services. In addition, she was a research associate for the Southern Regional Education Board (SREB). She is a current member of the Georgia Association for Institutional Research, Planning, Assessment, and Quality (GAIRPAQ), the Southern Association for Institutional Research (SAIR), and the Association for Institutional Research (AIR). Ms. Triplett is also a former president of the Southeastern Association of Community College Research (SACCR). She has been an IPEDS Keyholder at several colleges.
Gail Tudor has been a faculty member for 20 years. She is currently a professor and Director of Institutional Research at Husson University in Bangor, ME. She has taught courses in probability and statistics, biostatistics, research design, public health and epidemiology. She serves as the primary resource for designing, conducting and analyzing research studies for the university. In addition she serves as the biostatistical consultant for the Department of Clinical Research at Eastern Maine Medical Center and is an adjunct professor for the School for Community and Population Health at the University of New England and an external graduate faculty member at the University of Maine. She also serves as a biostatistical consultant for the University of North Carolina. Prior to coming to Husson University, Gail was Assistant Director of the Biostatistics Consulting Laboratory, a recharge center within the Department of Biostatistics of the University of North Carolina at Chapel Hill, which provides biostatistical consultation and data analysis services to investigators within the Health Affairs Division.
Christopher Vinger is Director of Institutional Research at the New York School of Interior Design. He has almost 20 years of experience in institutional research, planning, and assessment. He has served in IR roles at a diverse range of institutions including Berkeley College, Austin Community College, South Texas College, and The University of Texas at Brownsville. During much of his career, Christopher has had responsibility for IPEDS, state, accreditation, and other external reporting. He has been an IPEDS Keyholder for over ten years and regularly uses the IPEDS Data Center to develop peer comparisons to support his college’s strategic planning and assessment activities.
Hui-Min Wen is Director of Institutional Research & Assessment at New College of Florida in Sarasota, Florida, and serves as the Florida State University System Board of Governors Data Administrator for the College. She coordinates and ensures timely and accurate submission of federal, state, and other mandatory reports. She is also responsible for assessment activities and studies on campus to support strategic planning and policy decisions. She has been an IPEDS Keyholder for a decade and has conducted comparative analyses with peer institutions for campus planning and the accountability reporting process, using a wide range of IPEDS data.
Barbara Wharton is Associate Provost for Institutional Research at Ohio University in Athens, Ohio. Prior to joining Ohio University in 2012, she served in institutional research and assessment related positions at Otterbein University and The Ohio State University. Her areas of interest and research include student financial wellness, student success, and survey research. She has also focused on information system development and data management throughout her career. In over 15 years in higher education, she has used IPEDS data to conduct peer comparisons to assist in strategic planning and has been a keyholder or coordinator at both a private and a public institution.
Ta-Tanisha Young serves in the Institutional Research Department at Harper College as a Research Analyst. She works closely with various departments in the institution to collect IPEDS data and is specifically responsible for the preparation and submission of IPEDS survey data. Beyond the survey data, she uses IPEDS Analytics to examine a number of statistical items longitudinally and provide peer comparison information. This enables her to provide more predictive data to the institution for the purposes of informed decision making.
National Center for Education Statistics (NCES)
Samuel Barbett is a statistician with the Postsecondary Branch of the Administrative Data Division of NCES. He currently serves as team leader for Data Quality Control for IPEDS. In this role, he directs the information system development activities of IPEDS and develops, implements, and monitors quality control procedures related to data collected from the IPEDS data collection system and disseminated through the various IPEDS analytical systems. Sam joined NCES in 1979 and has been involved with many NCES studies, including Projections of Education Statistics, the National Postsecondary Student Aid Study, and State Higher Education Profiles. He earned a B.S. in Mathematics and Environmental Science from Wilkes University.

Moussa Ezzeddine joined NCES in May 2014, but he has worked on IPEDS since 2008. He currently serves as the IC Header and Human Resources survey director, is in charge of maintaining the IPEDS universe of institutions, and works on other database projects, including the Crosswalk. Moussa worked at IT Innovative Solutions as chief statistician, as well as database administrator, before he joined NCES. His background includes work on a number of IPEDS projects, including the Data Center, the Net Price Calculator, and the Trend Generator. Moussa has a Bachelor’s in Civil Engineering and a Master’s in Statistics.

Gigi Jones joined NCES in April 2014 and is currently serving as survey director for the Admissions, Graduation Rates, and Outcome Measures survey components, as well as helping with Data Integration projects. Prior to coming on board, Gigi worked in a research capacity at several DC higher education associations: the National Association of Student Financial Aid Administrators (NASFAA), the National Association of Independent Colleges and Universities (NAICU), and the American Council on Education (ACE). She received her Ph.D. and M.A. in Education from UCLA’s Graduate School of Education and Information Studies and a B.A. in Cognitive Sciences from the University of California, Irvine.
Tara Lawley joined NCES in 2007 and is currently Acting Team Lead for IPEDS Operations. She also works on IPEDS research and development, serves as the IPEDS coordinator liaison, and supports IPEDS training. Before joining the IPEDS staff, Tara worked for Kaplan Higher Education in institutional research and assessment. Tara received her MA and PhD from the University of Minnesota and her BS in Psychology from the University of Illinois.
Bao Le joined NCES in 2014. She serves as the Survey Director for the IPEDS Finance, 12-Month and Fall Enrollment, and Academic Libraries components and works with the College Affordability and Transparency Explanation Form (CATEF) data collection. Prior to joining IPEDS, Bao worked in NCES’s International Assessment Program, especially focusing on the Program for International Student Assessment (PISA) survey. She received her BS in Biology from the University of Texas and an MS in Public Policy and Management from Carnegie Mellon University.
Andrew Mary joined NCES in July 2002. In addition to being the Survey Director for the IPEDS Completions Survey, Andrew is part of the IPEDS Dissemination team, as well as the publications manager for all IPEDS publications at NCES. He also currently monitors the IPEDS contract as the Contracting Officer’s Representative. Andrew earned his B.S. degree in management science and statistics from the University of Maryland. He also earned a master’s certificate in IT Project Management from George Washington University in 2008.

Stefanie R. McDonald joined NCES in May 2015 (as a contractor from the American Institutes for Research) and is serving as the Survey Director for the IPEDS Institutional Characteristics and Student Financial Aid components. She earned her BS in Engineering from The Ohio State University, an MS in Applied Statistics from DePaul University, and a PhD in Quantitative Methods – Education from the University of California. Before joining IPEDS, Stefanie worked as a Psychometrician/Statistician for two years within the American Institutes for Research. The common thread of her career has been data: collecting, analyzing, and reporting it.
Richard Reeves is the Branch Chief for the Postsecondary Branch within the Administrative Data Division. In this role, Rich leads and directs activities related to IPEDS and other postsecondary education topics. Rich has more than a decade of experience conducting research in postsecondary education and has worked at the National Student Clearinghouse as its inaugural Director of Research, at Cornell University as a Senior Researcher, and at Johns Hopkins University as the Director of Enrollment Research and Technology. Most recently, Rich served as a team lead at the U.S. Energy Information Administration (EIA), where he led a team dedicated to the development and implementation of energy surveys. He earned a B.A. in Psychology from Frostburg State University, an M.A. in Experimental Psychology from Towson University, and an M.S. in Statistics from Cornell University.
RTI International (RTI)
Amy Barmer (formerly Lister) is a Research Education Analyst at RTI International, where she works as part of the data collection team for large-scale postsecondary studies. In this role, she oversees data collection operations including Help Desk activities, respondent notification (via e-mail, mail, and phone), progress and quality control reports, and requests related to data collection. Since 2006, Amy has provided assistance to the IPEDS Help Desk and IPEDS Data Tools Help Desk with survey operations and data preparations. Throughout the data collection process, the IPEDS Help Desk answers questions from institutional users, provides administrative support, documents problems and solutions, conducts prompting calls with institutions that have not completed the data collection(s), helps with data entry problems, and reviews data submitted prior to migration into the IPEDS Data Center.
Jamie Isaac is a Senior Survey Director at RTI International, where he has worked since 2000, and is currently a program manager within the Education Survey Operations Program in RTI’s Education and Workforce Development division. Jamie has managed the IPEDS Help Desk since its inception in July of 2000, working closely with NCES and the community of institutional researchers to assist in the use and refinement of the web collection system. Jamie also has expertise with the IPEDS data tools, and oversees the Data Tools help desk which was launched in March of 2007.
Janice Kelly-Reid is Director of the Education Survey Operations Program in the Education and Workforce Development division at RTI International. She joined RTI in 1985 and has served as Project Director for the IPEDS data collection services since 2000, working closely with NCES staff to achieve the IPEDS objectives. In her role with IPEDS, Janice is responsible for the overall technical, operational, and fiscal aspects of IPEDS data collection operations, as well as conducting the Technical Review Panel meetings and overseeing the data processing and reporting activities. Additionally, she monitors the subcontractors responsible for the IPEDS web-based data collection system and the respondent training, research, and dissemination tasks.

Jennifer Pauli is a Research Education Analyst at RTI International, where she works as part of the data collection team for large-scale postsecondary studies. In this role, she oversees data collection operations including Help Desk activities, Help Desk training, working with respondents, progress and quality control reports, and other requests related to data collection. Since 2009, Jennifer has worked closely with the IPEDS Help Desk and IPEDS Data Tools Help Desk to provide assistance to IPEDS data providers and data users. Throughout the data collection process, the IPEDS Help Desk answers questions from institutional users, provides administrative support, documents problems and solutions, conducts prompting calls with institutions that have not completed the data collection(s), helps with data entry problems, and reviews data submitted prior to migration into the IPEDS Data Center.
Association for Institutional Research (AIR)
Meg Andraza is Project Coordinator. She supports IPEDS Workshops and the IPEDS Trainer cohort by providing logistical support for workshop participants, instructors, and co-hosting organizations. Her responsibilities include facilitating the application and selection processes, registration and scheduling, distribution of electronic and print media, and logistics planning for meetings and travel of program instructors.
Elaine Cappellino is Assistant Director for Member Services. She manages programs and services directly related to AIR membership and membership benefits including Volunteer Management, AIR Scholarships and Member Awards. She also contributes to the Association’s monthly newsletter eAIR and oversees the administration of AIR’s Data and Decisions® Academy. Elaine provides support for various components of AIR’s IPEDS training, specifically related to the cohort of IPEDS Educators and serving as registrar for the online IPEDS Keyholder courses.
Christopher S. Coogan is Deputy Director and Chief of Staff. He also serves as Director of the Association’s Data and Decisions® Academy—an online training program for community college institutional researchers that started with funding from Lumina Foundation for Education. Christopher has oversight for the Association’s contracts and grants, IT staff and infrastructure, and serves on the senior leadership team. In these roles he leads the Association’s online training programs for institutional research officers at two-year institutions, national training programs for IPEDS reporters and data users, grants to support faculty and student research and dissertations, and other AIR membership services.
Eric Godin is Director of Education. He oversees AIR’s range of education programs and services, including IPEDS training resources, webinars, research and dissertation grants, the Data and Decisions® Academy, and pre- and post-Forum workshops. He has primary oversight for AIR’s online IPEDS training resources, including the development and maintenance of online tutorials, online IPEDS Keyholder courses, and the cohort of IPEDS Educators. In addition, he takes a leading role with AIR’s IPEDS Team related to strategic planning and reporting.
Darlena Jones is Director of Assessment and Research. She leads the assessment and evaluation initiatives for AIR member education programming such as the Forum, webinars, and the Data and Decisions® Academy online courses, as well as the association's contract- and grant-funded services such as IPEDS, the National Data Institute, and Research & Dissertation Grants. Darlena provides statistical expertise for IPEDS workshop and program evaluations, including the design of evaluations and pre/post-tests as well as the analysis of data.
Tinsley Smith is Assistant Director of Contracts and Grants. She manages the activities associated with AIR’s Research and Dissertation Grant program and IPEDS Workshops, including the application and selection processes, registration and scheduling, distribution of electronic and print communications, and customer service. She also supports the development of curricula and related support materials for the IPEDS Workshops.

Randy L. Swing is Executive Director. Dr. Swing is a frequent speaker at national and international conferences and author of books and articles on assessment, institutional research, and student success, especially the first-year experience. He serves as a reviewer for the Journal of General Education, The Journal on Excellence in College Teaching, and Innovative Higher Education. Prior to joining AIR, Randy served as Co-Director and Senior Scholar at the Policy Center on the First Year of College and as a fellow in the National Resource Center for The First-Year Experience and Students in Transition at the University of South Carolina. Earlier, he held leadership positions at Appalachian State University in assessment, advising, Upward Bound, and Freshman Seminar.
Additional Speakers
Ruba Hay is a Project Manager at IT Innovative Solutions (INOVAS). Ms. Hay joined INOVAS as a Senior Systems Analyst in 2009, shortly after receiving her Bachelor’s Degree from The University of Tennessee. In that role, she led project teams analyzing new web-based system functions prior to implementation and system upgrades on existing systems. Certified as a Project Management Professional (PMP), Ms. Hay is currently the lead program/project manager for all INOVAS contracts. As Project Manager for the Office of Postsecondary Education (OPE) and National Center for Education Statistics (NCES) contracts – including the IPEDS Data Collection System and Data Tools – Ms. Hay manages relationships, coordinates work, and performs the day-to-day management of all phases of the projects, including requirements gathering, design, development, testing, and deployment. In addition to her project management responsibilities, Ms. Hay provides management oversight for all testing, system documentation, and help desk activities.
Charlotte Etier is a Senior Research Analyst and Grant Manager at the National Association of Student Financial Aid Administrators (NASFAA), where she contributes to research and policy needs. Prior to joining NASFAA in the summer of 2014, she worked as the Dallas Martin Endowment Policy Intern at NASFAA, where she contributed to work related to policy and advocacy efforts. Charlotte began her career in higher education as a graduate assistant in the Financial Aid Office at Central Connecticut State University and as a Financial Aid Officer at UConn. She received a master’s in student development in higher education from Central Connecticut State University and a BS in political science and women’s studies from the University of North Carolina at Charlotte.
2013-14 IPEDS Training Report
(Deliverable 9)
Subcontract 0214104
Prepared by:
Randy L. Swing, Ph.D., Executive Director
Christopher Coogan, Deputy Director & Chief of Staff
Eric Godin, Director
Darlena Jones, Ph.D., Director
Elaine Cappellino, Project Manager
Tinsley Smith, Assistant Director
Meg Andraza, Project Coordinator
Amy Combs, Instructional Designer

ASSOCIATION FOR INSTITUTIONAL RESEARCH
Data and Decisions for Higher Education
Introduction
The Association for Institutional Research (AIR), operating as a subcontractor for RTI
International, supports the National Center for Education Statistics’ (NCES) Integrated
Postsecondary Education Data System (IPEDS) through the development and delivery of face-to-
face workshops and online training materials. This report details AIR’s IPEDS training activities
for work completed between October 1, 2013, and July 31, 2014, as prescribed by Deliverable 9
of the RTI/AIR subcontract, Task Order Number 2-312-0214104-51691L.
Deliverable 9 is listed in the Statement of Work as follows:
“Produce an annual report which evaluates the entire IPEDS training program that includes (1) a
summary of all face-to-face training events held, including data on evaluation by participants; (2)
statistics on the use of online training options; (3) metrics on the effectiveness of training
activities and specific examples of direct benefits of the training to the government; and (4) a
plan for improvements for the next year.”
AIR produced the annual report (Deliverable 9) in two components. The first component is a
narrative report arranged in five sections:
I. Oversight of IPEDS Training Program
II. Development of Training Curriculum
III. IPEDS Workshops for Training of Institutional Personnel
IV. Online Training Materials
V. Communications
The second component contains detailed evidence to support the narrative report. The evidence
is arranged in the same five sections, consistent with the format of the narrative report.
Oversight of IPEDS Training Program
AIR employed a talented and experienced staff to develop and maintain all aspects of the IPEDS
training subcontract. Although AIR relied on a core IPEDS team of eight individuals to
accomplish the statement of work, the expertise of the entire AIR staff was engaged in
significant supporting roles, including technology, finance, meeting planning, and
administration.
In addition to the AIR staff, subject matter experts (SME) and a media production company
contributed to the development of high-quality training materials produced and distributed by
AIR during the 2013-14 contract period.
Key project managers were AIR Executive Director, Randy L. Swing, and Deputy Director and
Chief of Staff, Christopher S. Coogan. Dr. Swing and Mr. Coogan provided senior-level
management and oversight for the AIR staff assigned to this subcontract.
Two project leaders and staff supervisors were assigned to administer the contracted tasks and
related evaluation and reporting activities. Mr. Eric Godin provided leadership for the Train-the-
Trainers meeting, IPEDS Trainer cohort, and development of tutorials and workshop curriculum
and materials. Ms. Tinsley Smith provided leadership for the IPEDS Workshops.
Four additional staff held primary responsibility for portions of the contracted activities. Darlena
Jones, Amy Combs, Elaine Cappellino, and Meg Andraza provided professional services
including designing evaluation instruments and methods, managing communications, meeting
and project management, and printing and shipping.
Biographies of AIR’s core IPEDS staff and a list of additional staff members who supported this
subcontract are included below2.
Core IPEDS Team
Meg Andraza serves as Project Coordinator and supports IPEDS Workshops and the IPEDS
Trainer cohort by providing logistical support for workshop participants, instructors, and co-
hosting organizations. Her responsibilities include facilitating the application and selection
processes, registration and scheduling, distribution of electronic and print media, and logistics
planning for meetings and travel of program instructors and participants. Meg joined AIR in
November 2009 with over 17 years of experience in association management.
Christopher S. Coogan is Deputy Director and Chief of Staff for AIR. He also serves as
Director of the Association’s Data and Decisions® Academy—an online training program for
community college institutional researchers that started with funding from the Lumina
Foundation for Education. Christopher has oversight for the Association’s contracts and grants,
IT staff and infrastructure, and serves on the senior leadership team. In these roles he leads the
Association’s online training programs for institutional research officers at two-year institutions,
2 Full resumes of AIR’s core IPEDS team are included in the Evidence File, Section I, pages 1-10. The Evidence File
will be made available to Institute for IPEDS Educators participants via an electronic link.
national training programs for IPEDS data providers and users, grants to support faculty and
student research and dissertations, and other AIR membership services. Previously, he served as
Associate Director of the University of Florida’s Institute of Higher Education and Assistant
Editor of the Florida Journal of Educational Administration and Practice. Christopher has
Bachelor’s and Master’s degrees in accounting from the University of Florida.
Elaine Cappellino serves as Assistant Director for Member Services. In this role, Elaine
manages programs and services directly related to AIR membership benefits including Volunteer
Management, AIR Scholarships, and Member Awards. She also contributes to the Association’s
monthly newsletter (eAIR) and oversees the administration of AIR’s Data and Decisions®
Academy. Elaine provides support for various components of the IPEDS Trainer cohort. Elaine
holds Bachelor’s and Master’s degrees in education from the Florida State University.
Amy Combs serves as Instructional Designer for AIR. Amy’s primary responsibility is to
manage the development of educational content for the Association’s online IPEDS resources
that are funded by NCES. Prior to joining AIR, Amy served as a Program Coordinator at
Tallahassee Community College Workforce Development where she managed all non-credit
information technology course offerings. She earned a Bachelor’s degree in Management
Information Systems from Florida State University, and a Master’s degree in Educational
Leadership with an Instructional Technology emphasis from the University of West Florida.
Eric Godin serves as Director of Education, overseeing AIR’s range of education programs and
services, including IPEDS training resources, webinars, the Data and Decisions® Academy, and
pre- and post-Forum workshops. Prior to joining AIR, Eric served as Manager of Research
Projects for the Council of Independent Colleges. He earned a Bachelor’s degree from the
University of Richmond and a Master’s degree in Higher Education Administration from the
College of William and Mary. He has also completed the Graduate Certificate in Institutional
Research from Florida State University.
Darlena Jones serves as Director of Assessment and Research. Dr. Jones directs the assessment
and evaluation initiatives for AIR services, including the Forum, webinars, and the Data and
Decisions® Academy online courses, as well as the association's contract and grant funded
services such as IPEDS, the National Data Institute, and the Research and Dissertation Grant
program. Prior to joining AIR, Darlena was the Director of Education and Program Development
at EBI Map-Works, authoring over 100 national benchmarking assessments in student and
academic affairs departments. Darlena received her Doctoral degree in physics at Oklahoma
State University.
Tinsley Smith serves as Assistant Director of Contracts and Grants for AIR. Tinsley manages
the activities associated with AIR’s Research and Dissertation Grant program and IPEDS
Workshops including marketing and communications, attendee application and selection
processes, registration and scheduling, and overall customer service. She also supports the
development of curricula and related support materials for the IPEDS Workshops. Prior to
joining AIR, Tinsley served as Director of Career Placement for the Florida State University
College of Law. She also served as an Assessment Services and Enrollment Management
Consultant with ACT, Inc. She earned a Bachelor’s degree in history from Denison University
and a Master’s degree specializing in Student Counseling and Personnel Services from Kansas
State University.
Randy L. Swing is Executive Director of AIR. Dr. Swing is a frequent speaker at national and
international conferences and author of books and articles on assessment, institutional research,
and student success, especially the first-year experience. He serves as a reviewer for the Journal
of General Education, The Journal on Excellence in College Teaching, and Innovative Higher
Education. Prior to joining AIR, Randy served as Co-Director and Senior Scholar at the Policy
Center on the First Year of College and as a fellow in the National Resource Center for The
First-Year Experience and Students in Transition at the University of South Carolina. Earlier, he
held leadership positions at Appalachian State University in assessment, advising, Upward
Bound, and Freshman Seminar. He holds a Doctoral degree from the University of Georgia.
AIR Support Staff
Finance
• Jason Lewis, Chief Financial Officer
• Cathy Sexton, Finance Manager
• Charles McCumber, Accountant
Meetings
• H.A. Scott, Director of Conference Services
• Laura Harrington, Coordinator of Conference Services
Technology
• Kashif Imran, Chief Information Officer
• Steve Alvarado, Web Coordinator
AIR Contractors/Consultants
• 27 IPEDS Trainers
• B-Clip Productions – video production and multimedia company
• 4 additional content editors
Development of Training Curriculum3
AIR recruited 27 individuals to serve as IPEDS Trainers. These individuals were selected
because of their experience with, and extensive knowledge of, IPEDS data submission and use.
Collectively, they represented all sectors of higher education, allowing the development of
training curriculum that addressed the perspectives of the wide range of institutions in the IPEDS
universe, regardless of control, size, or level of degrees/certificates awarded. The diversity of the
pool of trainers assured that expertise was available for the range of colleges from small
proprietary institutions to large non-profit public and private colleges and universities. This
section highlights IPEDS Trainer roles and characteristics, the annual Train-the-Trainers
meeting, and IPEDS web conferences.
Trainer Roles and Characteristics
The 2013-14 IPEDS Trainers served in a number of roles, including Lead Instructor, Assistant
Trainer, Curriculum Developer, and/or Advisory Group Member4. While all Trainers were
encouraged to serve as Assistant Trainers, the other roles were offered to IPEDS Trainers with
significant experience and demonstrated expertise. Details about each role are provided below:
Lead Instructors – Trainers who previously demonstrated outstanding presentation skills
and thorough knowledge of IPEDS were selected as Lead Instructors for the face-to-face
workshops. Usually, two Lead Instructors were assigned to a full-day workshop. Each
had primary responsibility for conducting either the morning or afternoon session and
each served as an assistant, helping participants with exercises and questions, during the
other half of the day.
Assistant Trainers – Each workshop was staffed with an Assistant Trainer who helped
the Lead Instructors with delivering the training curriculum by assisting with hands-on
exercises, managing onsite logistics, and conducting the evaluation process. Assistant
Trainers provided additional perspectives and were especially critical when workshops
had several participants who were new IPEDS keyholders.
Curriculum Developers – Curriculum Developers assisted AIR in developing and
updating IPEDS Workshop modules. They typically worked in teams of four to create,
maintain, and periodically update all materials for each IPEDS training module, including
PowerPoint presentations, training videos, Instructor Guides, and exercises.
Advisory Group – The IPEDS Trainer Advisory Group included the most senior and
experienced IPEDS Trainers. These individuals also served as Lead Instructors and
Curriculum Developers. The Advisory Group assisted AIR staff with planning for and
presenting at the annual Train-the-Trainers meeting, selection of new and continuing
IPEDS Trainers, and general oversight related to AIR’s IPEDS activities.
3 Although AIR’s 2013-14 subcontract with RTI did not include funding for this section, Development of Training Curricula, funding was provided via a contract extension to AIR/RTI subcontract 0212130-1.
4 A detailed list of AIR’s IPEDS Trainers is included in the Evidence File, Section II, page 1. The list includes Trainers’ names, affiliations, and roles in 2013-14.
IPEDS Trainers were selected from across the United States (see Figure 2.1) which allowed for
more local and regional Instructors at face-to-face workshops. In addition to reducing travel costs
by selecting locally based Trainers, these individuals were also more aware of regional and state
issues, thereby providing greater assistance to workshop participants.
Figure 2.1: IPEDS Trainer Locations
Most Trainers were actively involved in IPEDS reporting for their campus or state during the
time they served as Trainers (see Figure 2.2), which further established them as peer experts.
Forty-eight percent (48%) were IPEDS keyholders, 11% were State Coordinators, 7% served as
proxy keyholders, and an additional 7% served in both keyholder and coordinator roles.
Although one-quarter (26%) of Trainers were not active keyholders or coordinators, those
individuals had previously served in these roles or supervised staff who were.
The cohort included a diverse group of individuals from all sectors of higher education,
including public institutions (37%); private, not-for-profit institutions (26%); private, for-profit
institutions (15%); system offices (15%); and other, which includes Trainers at associations or
those serving as consultants (7%). Over two-thirds of Trainers were from four-year institutions
(67%), followed by 7% at two-year institutions, 4% at less-than two-year institutions, and 22% at
system offices or in the other category. Figure 2.2 illustrates Trainer representation by sector of
higher education.
Figure 2.2: IPEDS Trainer Characteristics
Train-the-Trainers Meeting
The Train-the-Trainers meeting, held annually before the start of the IPEDS data collection
cycle, was held July 15–17, 2013 at NCES offices in Washington, DC. The meeting prepared
IPEDS Trainers for their roles and introduced them to changes to the IPEDS data collection. The
Trainer program encouraged and promoted professional development related to IPEDS
knowledge and presentation skills. The meeting also provided Trainers an opportunity to discuss
questions and concerns related to IPEDS with NCES staff. The meeting was evaluated by
Trainers and given high reviews. The 2013 meeting included three types of sessions: IPEDS
Updates, Trainer Sessions, and Brainstorming Sessions. Descriptions of these sessions, along
with evaluation data, are presented below.
IPEDS Updates. Because IPEDS data collection and data tools are updated regularly,
presentations from NCES staff were a critical component of the meeting. IPEDS survey directors
and staff provided updates to survey components, the data collection cycle, and data tools. These
sessions were important to ensure that Trainers were aware of and accurately interpreted new and
updated data elements. Trainers discussed changes, asked questions, and engaged in meaningful
dialogue with NCES staff. The discussions were productive and allowed Trainers to be more
effective peer educators and resources during Workshops.
In the evaluation:
• 96% of Trainers agreed or strongly agreed that the NCES session on IPEDS updates
improved their knowledge of IPEDS;
• 96% agreed or strongly agreed that the presentation on IPEDS updates was effective in
raising their awareness of changes to IPEDS that will impact the 2013-14 collection
cycle;
• 100% agreed or strongly agreed that the presentation on IPEDS updates was effective in
raising their awareness of changes to IPEDS that will impact the 2014-15 collection
cycle;
• 100% agreed or strongly agreed that they were prepared to discuss changes in IPEDS
with colleagues; and
• 100% agreed or strongly agreed that they were prepared to discuss important concepts
from the individual IPEDS survey components.
In addition, a number of Trainers commented positively on the openness of the discussion with
NCES staff and their willingness to answer questions in a collegial atmosphere5.
Trainer Presentations. The second group of sessions included presentations by IPEDS Trainers.
These presentations, designed to illustrate how IPEDS information can be developed for delivery
at local, regional, or national conferences, provided Trainers an opportunity to share their expert
knowledge with their colleagues. The presentations by Trainers this year focused on the use of
IPEDS data to develop institutional dashboards, important information for data providers
regarding student financial aid, and an explanation of the Common Education Data Standards
(CEDS).
Trainer presentations were also well-received:
• 85% agreed or strongly agreed that the session on creating an IPEDS dashboard increased
their knowledge of IPEDS;
• 81% agreed or strongly agreed that the session on student financial aid increased their
knowledge of IPEDS;
• 93% of Trainers agreed or strongly agreed that the session on CEDS increased their
knowledge on the subject.
Brainstorming Sessions. The third group of sessions included brainstorming with IPEDS
Trainers on better understanding the needs of IPEDS keyholders, identifying resources to help
keyholders be more successful in their roles, and providing suggestions on how AIR might better
facilitate keyholder education. Trainers also appreciated the opportunity to share their own
expertise in addition to hearing from NCES and AIR staff.
In regards to the Brainstorming Sessions:
• 93% of Trainers agreed or strongly agreed that the sessions were effective techniques for
sharing information;
• 65% agreed or strongly agreed that the sessions increased their knowledge of IPEDS; and
• 74% agreed or strongly agreed that the sessions increased their knowledge of new
keyholder issues.
In summary, the evaluations supported the value of convening the Trainers for approximately
two days to prepare them to train others about IPEDS. Because the quality of IPEDS data is
ultimately dependent on IPEDS keyholders submitting data accurately and on time, it is critical
5 Evaluation data (quantitative and qualitative) for the 2013 Train-the-Trainers meeting are included in the Evidence
File, Section II, pages 2-23.
that all IPEDS Trainers are regularly provided the opportunity to learn from NCES staff and
from each other regarding updates to the IPEDS data collection and data tools.
Trainer Web Conferences
IPEDS Trainer web conferences provided ongoing professional development for AIR’s cohort of
IPEDS Trainers. Although not listed in the statement of work, AIR determined that once-a-year
training is insufficient for assuring that Trainers are current and prepared for their roles. The web
conferences focused on in-depth topics about IPEDS data collection and use. The majority of the
web conferences featured representatives from national organizations explaining how they use
IPEDS data to inform policy and decision-making. Trainers reported that they use information
from these web conferences to highlight the importance of accurate and consistent reporting of
IPEDS data. Recordings of these web conferences are also disseminated via the AIR website,
making them available as an additional resource for IPEDS keyholders and others who work
with IPEDS data6. Figure 2.3 lists the nine sessions presented during the 2013-14 collection year.
Figure 2.3: IPEDS Trainer Web Conferences
August 2013 – Behind the Scenes with the IPEDS Help Desk (Jamie Isaac, RTI International)
September 2013 – Sectors of the IPEDS Universe (Donyell Francis, Jack Mahoney, and Kimberly Thompson, IPEDS Trainers)
October 2013 – Utilizing the Delta Cost Project for Institutional Improvement (David Mongold, University of Hawaii System)
November 2013 – A Tale of Two Income Years: Comparing Prior-Prior Year and Prior-Year Through Pell Grant Award (Gigi Jones, National Association of Student Financial Aid Administrators)
January 2014 – Using IPEDS Classification of Instructional Programs (CIP) Codes to Improve Institutional Effectiveness (John Barnshaw, Delaware Cost Study)
February 2014 – University of Minnesota’s Comparison Group Generator (Peter Radcliffe, Daniel Jones-White, and David Peterson, University of Minnesota)
March 2014 – Moving from Data to Wisdom (Darlena Jones, Association for Institutional Research)
April 2014 – The Economic Value of Degrees: A National and State-Level Analysis (Katie Zabak, State Higher Education Executive Officers Association)
May 2014 – IPEDS Data Analysis and Reporting (Scott Ginder, RTI International)
6 IPEDS Trainer Web Conference recordings are available online at:
www.airweb.org/EducationAndEvents/OnlineLearning/Pages/TrainerConferences.aspx
Each web conference was evaluated by participating Trainers. Across all web conferences:
• 97% of Trainers rated the overall experience of the webinar as good or very good;
• 99% rated the presenter’s knowledge of the subject as good or very good; and
• 92% rated the webinar as a good or very good learning experience.
Areas for Improvement
IPEDS Trainers provide a wealth of knowledge to higher education professionals. However, with
the upcoming addition of new IPEDS survey components (Admissions, Libraries, and Outcome
Measures), and the increased focus on finances at for-profit institutions, there is a growing need
to recruit Trainers with experience in specific subject matter areas. As AIR continues to promote,
select, and develop Trainers, individuals with valued expertise will be encouraged to apply and
participate.
Although Trainers are active participants on the IPEDS Listserv, they should also be proactive in
their own training. For example, holding an open web conference where a Trainer demonstrates
the use of an IPEDS data tool could be a helpful resource not just for other IPEDS Trainers but
for the IPEDS data community as well.
IPEDS Workshops for Training of Institutional Personnel
The Association’s face-to-face IPEDS Workshops provided high-quality, interactive training to
IPEDS data providers and users from around the country. AIR produced half-day and full-day
Workshops that were engaging to adult learners by combining lecture-style instruction,
demonstration videos, discussions, and hands-on exercises. In addition, the Workshops provided
opportunities for participants and instructors to develop peer networks of knowledgeable IPEDS
data providers and users. In 2013-14, AIR conducted 27 IPEDS Workshops for 779 individuals.
The following sections provide details on Workshop locations and participants, curriculum and
delivery, evaluations, and areas for improvement.
Workshop Locations and Participants
AIR used two primary methods for arranging and delivering IPEDS Workshops in 2013-14.
Most were arranged with an organization serving as a co-host and attached to an existing event at
which IPEDS data providers and users would be present. Others were free-standing events
designed to cater to a broad spectrum of participants and to occur at optimal times.
Co-hosted Workshops (20). These IPEDS Workshops were usually pre-conference events
aligned with local, regional, or national meetings of higher education organizations. Offering
Workshops in conjunction with already scheduled conferences and events reduced both the time
burden and cost of attendance.
IPEDS Workshops were intentionally marketed to organizations that had not held IPEDS
training sessions since 2010 and to groups that have been historically underrepresented at
training sessions. As a result, eight of these organizations and associations co-hosted Workshops
this past year. For example, training was provided for the first time to the Student Aid
Researchers Association (SARA), a group of higher education researchers and policy analysts
that work for associations and other non-profit organizations based in the Washington, DC area.
The Workshop for SARA is an example of training directed at data users who are often tasked
with explaining IPEDS data to college and university leaders in the sectors they represent.
Freestanding Workshops (7). Because the best time to provide training on certain IPEDS topics
may not align with existing conferences and events, AIR established free-standing Workshops
that were scheduled for optimal effectiveness and geographic diversity. This year, Workshops
were conducted in Atlanta, GA; Phoenix, AZ; Orlando, FL; and Philadelphia, PA in addition to
those held in conjunction with the AIR Forum. These cities provided highly accessible travel
options for drive-in participants and for individuals traveling from remote locations. While
NCES funding was not used to cover costs related to participant travel or accommodations, AIR
leveraged its purchasing power and provided discounted hotel options to help control
the cost burden for participants.
Figure 3.1: IPEDS Workshop Partnerships and Locations
(* indicates first-time co-host, † indicates no Workshop in the last three years)
AIR Hosted: Keyholder Training at AIR Forum
AIR Hosted: Workshop at AIR Forum
AIR Hosted: Keyholder Training #1 (Atlanta, GA)
AIR Hosted: Keyholder Training #2 (Atlanta, GA)
AIR Hosted: Keyholder Training #3 (Orlando, FL)
AIR Hosted: Keyholder Training #4 (Phoenix, AZ)
AIR Hosted: Keyholder Training #5 (Philadelphia, PA)
American Association of Collegiate Registrars and Admissions Officers (AACRAO)
American Association of Cosmetology Schools (AACS)
Assessment Network of New York (ANNY)*
Associated Colleges of the Twin Cities (ACTC)*
Association of Private Sector Colleges and Universities (APSCU)
Connecticut Association for Institutional Research (ConnAIR)†
Higher Education Data Sharing Consortium (HEDS)*
Indiana Association for Institutional Research (INAIR)
IPEDS Coordinator Workshop and State Data Conference
National Higher Education Benchmarking Institute (NHEBI)
New Jersey Association for Institutional Research (NJAIR)
North American Association of Summer Sessions (NAASS)*
North Carolina Association for Institutional Research (NCAIR)
North East Association for Institutional Research (NEAIR)
Pacific Association for Institutional Research (PacAIR)† #1
Pacific Association for Institutional Research (PacAIR)† #2
Rocky Mountain Association of Collegiate Registrars and Admissions Officers (RMACRAO)
Student Aid Researchers Association (SARA)*
Tribal Colleges and Universities (TCU)†
The locations of Workshops and participants are shown in Figure 3.2. Workshops (blue markers)
were strategically located throughout the country to allow individuals (red markers) from all
areas to participate.
Figure 3.2: IPEDS Workshops and Participants
In 2013-14, AIR received 970 requests from individuals to attend an IPEDS Workshop and
accepted 98% (946). Of those accepted, 82% (779) attended a Workshop, with 50% (388)
identifying as an IPEDS keyholder or coordinator7.
Figures 3.3 and 3.4 highlight the control and level of institutions represented by the participants
(data on institutional control and level were not available for all participants), indicating
participation from all areas of higher education, including for-profit and less-than two-year
institutions. The Other category includes participants from state systems, organizations,
companies, or non-classifiable affiliations.
Workshop Curriculum and Delivery
With the assistance of IPEDS Trainers with deep subject matter expertise, AIR updated all
training curriculum for the 2013-14 collection year to incorporate changes to IPEDS data
collection and data tools. An AIR staff member and a team of experienced IPEDS Trainers
reviewed and revised the curriculum for each Workshop module. Each team member had expert
technical knowledge about IPEDS and direct experience teaching Workshops. As such, they
approached curriculum development as content experts and experienced instructors with
firsthand knowledge of promising practices in content development, delivery, and audience
engagement.
Abstracts for each IPEDS Workshop module are presented below8:
IPEDS Data and Benchmarking: Supporting Decision Making and Institutional Effectiveness
(half-day) – This module was designed for participants with little or no experience conducting
benchmarking studies or using the IPEDS Data Center. It introduced the fundamentals of
creating benchmarks to measure institutional effectiveness, provided an overview of the types of
comparison groups that could be constructed using IPEDS data, and demonstrated how the
IPEDS Data Center and IPEDS Data Feedback Reports could be utilized for benchmarking
purposes.
7 More detailed information for each workshop, including the number of requests, accepted requests, attendees, number of keyholders and coordinators, and Lead Instructors is included in the Evidence File, Section III, pages 1-7.
8 PowerPoint handouts for each of the six modules are included in the Evidence File, Section III, pages 8-118.
Figure 3.3: Participant Affiliation - Control (n=730)
Public: 45%; Private, not-for-profit: 32%; Private, for-profit: 14%; Other: 9%
Figure 3.4: Participant Affiliation - Level (n=730)
Four-year: 34%; Two-Year: 26%; Less-than two-year: 7%; Other: 9%
IPEDS Data as the Public Face of an Institution (half-day) – This module raised the level of
awareness among higher education professionals about the importance of accuracy and
consistency in data reported to IPEDS. Examples of real IPEDS data used in the public domain
were incorporated, enabling participants to understand how IPEDS data are used by
governmental and nongovernmental entities.
IPEDS Finance Training for IR Professionals (half-day) – This module was designed
specifically for individuals who had no training or expertise in finance or accounting. The
module presented eight objectives, including an introduction to higher education finance;
differences between the Governmental Accounting Standards Board (GASB) and the Financial
Accounting Standards Board (FASB); key accounting concepts; a review of general purpose
financial statements and how they relate to the IPEDS Finance component; common errors in
completing the IPEDS Finance component; reporting comparison challenges; and where and
how IPEDS Finance data are used.
Keyholders Workshop (full-day) – This module provided participants with a thorough
introduction to the IPEDS data collection cycle and reporting requirements. This workshop was
created specifically for beginning IPEDS keyholders and included an outline of the roles and
responsibilities of a keyholder and the resources available to assist in the IPEDS planning and
reporting processes.
Best Practices for Reporting and Using IPEDS Data to Improve Office Efficiencies (full-day) –
This module was designed for individuals who lead the IPEDS data submission process on their
campuses. Participants learned IR best practices and technical efficiencies in data management
through Excel, examined multiple options for IPEDS submission, and learned how to use
benchmarking data to address key institutional questions and needs.
Workshop Evaluations
To continue improving the quality of IPEDS training for higher education professionals, AIR has
developed an evaluation process that incorporates feedback from participants and instructors.
After each Workshop, participants complete an evaluation which is shared with AIR’s IPEDS
staff and the Lead Instructors for that Workshop. Figure 3.5 illustrates this evaluation cycle, with
items in white indicating historical activities and items in orange indicating new activities piloted
in 2013-14.
These new activities include pre- and post-tests and a follow-up survey. Using the learning
objectives for each module, pre- and post-tests were developed for all modules (except for
Finance) and administered during the Workshop. Follow-up emails were sent to participants
three months after their Workshop, inviting them to participate in a short survey to evaluate their
knowledge retention and the usefulness of the skills taught in the Workshop.
The pre- and post-tests and the follow-up survey provided AIR with more robust data to
determine participant learning gains on specific learning outcomes. These data will help in the
development of an online self-assessment tool to allow future participants to identify which
Workshop(s) and/or online training materials would be the most valuable based on their
individual needs.
Figure 3.5: IPEDS Workshop Delivery Cycle
Curriculum Development & Update → Participant Pre-Test → Workshop Presentation → Participant Post-Test → Workshop Evaluations → Long-Term Follow-Up
Data for the Workshop evaluations, the pre-test/post-test, and the follow-up survey are presented
below.
Overall, evaluations of IPEDS Workshops indicated that9:
• 94% of participants agreed or strongly agreed that the session improved their knowledge
of IPEDS;
• 96% agreed or strongly agreed that the material was presented in a clear and organized
manner; and
• 96% agreed or strongly agreed that the presenter was knowledgeable about the material
covered.
Figures 3.6 and 3.7 highlight the consistently high ratings for the IPEDS Workshop modules.
Figure 3.6 shows ratings of Workshop presenters, and Figure 3.7 shows self-reported learning.
When asked to report if presenters were knowledgeable about the material covered, if the
9 Evaluation data for the Workshops are included in the Evidence File, Section III, pages 119-340.
material was presented in a clear and organized manner, and if examples and illustrations were
used effectively, over 94% of participants agreed or strongly agreed, across all modules.
Reported ratings of learning by module are also very positive, with at least 90% of participants
agreeing or strongly agreeing that the workshop improved their knowledge of IPEDS. The
ratings also indicate that there is room for improvement regarding exercises and discussions
contributing to participants’ learning.
Figure 3.8 details the pre- and post-test results, indicating the average number of correct answers
across the Workshop modules10. In total, 555 participants (71%) completed the tests. Each test
consisted of 10 questions (the pre- and post-test questions were identical). A number of
Workshops offered both Benchmarking and Public Face modules, and the tests for these
Workshops were combined in an effort to save time.
10 Pre and post-test questions and data for each workshop are included in the Evidence File, Section III, Pages 341-
356.
Across all modules, participants showed increases in learning by answering more correct
questions on the post-test, with the largest increase in correct scores in the combined
Benchmarking and Public Face Workshops. Although the Keyholder training showed one of the
smallest gains in correct scores, AIR staff and Trainers believe that the actual gain in knowledge
was not accurately captured by the pre- and post-test. An improved instrument will be developed
before this workshop is presented again.
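The module-level gain is simply the post-test average minus the pre-test average, so the comparison can be reproduced directly from the values shown in Figure 3.8. The short Python sketch below is illustrative only and is not part of AIR's evaluation tooling; the module names and averages are taken from Figure 3.8.

    # Illustrative only: average scores (number correct out of 10) from Figure 3.8.
    scores = {
        "Keyholder": (8.0, 9.1),
        "Best Practices": (7.0, 8.1),
        "Benchmarking": (6.0, 7.5),
        "Public Face": (6.2, 7.7),
        "Public Face and Benchmarking": (6.6, 8.7),
    }

    # Learning gain for each module = post-test average minus pre-test average.
    gains = {module: round(post - pre, 1) for module, (pre, post) in scores.items()}
    for module, gain in sorted(gains.items(), key=lambda item: item[1], reverse=True):
        print(f"{module}: +{gain}")
    # Largest gain: Public Face and Benchmarking (+2.1); smallest: Keyholder and Best Practices (+1.1).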
Figure 3.9 displays the results of the follow-up survey sent three months after the Workshop.
Over 75% of participants agreed or strongly agreed that they better understand the importance of
accurate IPEDS data; are better able to perform IPEDS related tasks; gained knowledge or a skill
related to IPEDS that they used immediately after the Workshop; and gained knowledge or a
skill related to IPEDS that has improved their ability to do their job.
Figure 3.8: IPEDS Workshops Assessment Results (average number of correct answers out of 10)
Keyholder (133): Pre-Test 8.0, Post-Test 9.1
Best Practices (52): Pre-Test 7.0, Post-Test 8.1
Benchmarking (97): Pre-Test 6.0, Post-Test 7.5
Public Face (185): Pre-Test 6.2, Post-Test 7.7
Public Face and Benchmarking (88): Pre-Test 6.6, Post-Test 8.7
Figure 3.9: Workshop Participant Follow-Up Survey (n = 43)
I am better able to perform IPEDS-related tasks. 79%
I better understand the importance of accurate IPEDS data. 82%
I gained knowledge or a skill that has improved my abilities. 79%
I gained knowledge or a skill that I used immediately. 79%
The Workshop increased my comfort level with IPEDS. 85%
Areas for Improvement
The addition of two new pilot activities provides opportunity for growth in AIR’s evaluation of
its Workshops. First, AIR will work to determine if the pre- and post-test questions are
appropriately difficult and focus on the skills that are most important for IPEDS data providers
and users. The pre- and post-test questions will be used to develop a testing bank for AIR’s
online self-assessment. Second, AIR will further examine the long-term benefits of IPEDS
Workshop attendance. Identifying the most useful outcomes will not only enhance marketing of
these educational opportunities, but also indicate what learning gains may be transferable
through online educational resources.
Online Training Materials
Online training provides immediate, on-demand assistance for IPEDS data providers and users
with no travel required. Training materials developed by AIR are accessible through links
embedded in the IPEDS data collection system and from a central website (Figure 4.1). Five
types of online training were provided by AIR: 1) IPEDS survey component tutorials; 2) data
tools tutorials; 3) new keyholders tutorial; 4) annual IPEDS update tutorial; and 5) other IPEDS-
related tutorials. These are discussed in detail below.
Figure 4.1: AIR’s IPEDS Online Tutorials Webpage
IPEDS Survey Component Tutorials
Tutorials for each IPEDS survey component were made available prior to the opening of the data
collection period11. The tutorials supplemented IPEDS survey instructions with animated videos
that helped clarify difficult concepts with images, text, and voice-over instruction. These
tutorials clarified commonly misunderstood concepts such as defining a first-time, full-time
student; identifying the various groups of students reported in the Student Financial Aid
component; demonstrating report mapping in the Institutional Characteristics Header component;
and calculating course and program data for institutions that offer courses through distance
education.
The IPEDS survey component tutorials were accessible through multiple paths. Viewers could
access them through the IPEDS website, the AIR website, and with internet search engines. Most
11 An example screenshot of AIR’s Student Financial Aid webpage is included in the Evidence File, Section IV,
page 1.
importantly, links to the tutorials were embedded directly in the IPEDS data collection system,
providing convenient access to data providers as they completed their data submissions.
Figure 4.2 displays the number of pageviews for each survey component tutorial. The horizontal
axis identifies the survey components as well as the number of tutorials available for each
component12. For example, the Human Resources (HR) component included five tutorials, and
the total number of pageviews for all five tutorials combined was 2,386.
Data were collected for each individual tutorial within each of the surveys noted above. For
example, the tutorial describing how cohorts should be formed and which students should be
included (one of three tutorials developed for the Graduation Rates component) generated 401
pageviews alone. Such data inform AIR’s selection of future topics and show the value of the
videos in clarifying definitions and avoiding confusion which could negatively impact IPEDS
data quality.
Data Tools Tutorials
AIR provided tutorials about accessing and using NCES’ data tools. These tutorials offered
detailed demonstrations of College Navigator, the IPEDS Data Center, Data Feedback Reports,
the College Career and Tables Library, and the Trend Generator.
Figure 4.3 presents the number of pageviews for each data tool tutorial. (Note that no data are
presented for the Data Tools Overview and College Career and Tables Library tutorials because
they were released at the end of this reporting period. Pageview data for those tutorials will be
available in the next reporting period.)
12 A list of all tutorials with learning objectives, length, and pageviews is included in the Evidence File, Section IV,
pages 3-10.
Figure 4.2: IPEDS Survey Component Pageviews
IC-H (4 tutorials): 1,580
IC (6 tutorials): 1,407
C (4 tutorials): 975
E12 (6 tutorials): 1,476
SFA (4 tutorials): 1,594
GR (3 tutorials): 1,248
GR200 (2 tutorials): 521
EF (8 tutorials): 1,454
F (4 tutorials): 969
HR (5 tutorials): 2,386
New Keyholder Tutorial
The New Keyholder tutorial was created specifically for IPEDS keyholders who have been in the
role for fewer than two years. The tutorial was organized in six chapters: introduction; keyholder
responsibilities; getting ready for data submission; using the data collection system;
communications from NCES; and additional resources. The first chapter of the New Keyholder
tutorial received 767 pageviews. Each additional chapter received fewer pageviews, as not all
viewers visited all pages.
Annual IPEDS Update Tutorial
The Annual IPEDS Update tutorial provided information to new and seasoned IPEDS keyholders
regarding changes for the 2014-15 and 2015-16 collection cycles. The tutorial included screen
captures of new survey component questions and information about how data should be collected
and reported for these items. In addition, the tutorial provided information on how NCES makes
changes to IPEDS and why certain changes are implemented. Because this video was released
only recently, no pageview data are available. Pageview data for this tutorial will be available in
the next reporting period.
Additional Tutorials on Topics of Interest
The AIR website also included four tutorials that addressed issues or topics of interest to broad
audiences. First, the IPEDS Overview tutorial provided a broad overview of IPEDS, discussed
which federal agency oversees IPEDS, what data are collected, and how data are used. Second,
the IPEDS Data Release Stages tutorial outlined the differences between collection level,
preliminary, provisional, and final data13. Third, the Net Price Calculator tutorial demonstrated
how to use the NCES net price calculator template. Finally, the IPEDS Community tutorial
provided an overview of four important resources that help IPEDS data providers and users stay
up-to-date on IPEDS changes. These resources include Technical Review Panels (TRPs); the
National Postsecondary Education Cooperative (NPEC); This Week in IPEDS emails; and the
IPEDS Listserv. While released during this reporting cycle, these four tutorials have not been
available long enough to generate quality usage data14. Pageview data for these tutorials will be
available in the next reporting period.
13 A screenshot of the IPEDS Data Release Stages tutorial webpage is included in the Evidence File, Section IV, page 2.
Figure 4.3: IPEDS Data Tools Tutorials (pageviews)
Data Tools Overview: N/A
Data Center: 474
Data Feedback Report: 612
College Navigator: 205
Trend Generator: 430
College Career and Tables Library: N/A
Areas for Improvement
In addition to pageview data, AIR should incorporate a method for viewers to provide both
quantitative and qualitative feedback. AIR has experimented with this in the past, but has not yet
developed a long-term plan for collecting such feedback.
AIR should include dates, or collection years, on videos to identify when tutorials were last
updated. This will provide viewers with greater assurance that the information is up-to-date and
accurate.
AIR should develop a webpage specifically for new IPEDS keyholders focused on providing a
“road map” for accessing IPEDS resources. For example, the page would direct them first to the
IPEDS Overview tutorial to provide a broad understanding of the data collection system before
directing them to more detailed survey component tutorials or data tools videos.
14 The current IPEDS subcontract was awarded later than anticipated, shortening the window for completing contract deliverables.
Communications
Communicating to the higher education community – specifically to IPEDS data providers and
users – about IPEDS training resources was an important component of AIR’s activities.
Through websites, e-newsletters and listservs, partnerships, and targeted emails, AIR provided a
comprehensive suite of marketing and communications to inform the data community about this
year’s IPEDS collection and data tools.
AIR and NCES Websites
The AIR and NCES websites were the primary methods for marketing IPEDS training
resources15. AIR’s website provided links to IPEDS Workshops, including information on
location, date, time, co-host (if applicable), and IPEDS Instructors. Information on submitting a
request to co-host an IPEDS Workshop was also available. This section of AIR’s website also
included links to access AIR’s online tutorials for submitting IPEDS data, using IPEDS data
tools, information for new IPEDS keyholders, updates to the IPEDS data collection cycle, and
other helpful videos.
Links to AIR resources were also available on NCES’ IPEDS website, the Report Your Data
website, and within the IPEDS data collection system. The IPEDS website included links to
AIR’s main training pages. Links within the Report Your Data website and IPEDS data
collection system provided access to online tutorials specific to data being reported. For
example, the Completions survey component form included a direct link to view a video on how
to report distance education data.
e-Newsletters and Listservs
e-Newsletters and listservs were useful tools to broadcast IPEDS training activities, news, and
resources to higher education professionals. These included eAIR (managed by AIR) sent to over
8,000 individuals, as well as This Week in IPEDS and the IPEDS Listserv (both managed by
RTI)16.
AIR provided regular updates to RTI for inclusion in This Week in IPEDS to announce upcoming
IPEDS Workshops and the release of new IPEDS tutorials throughout the year17. Additionally,
IPEDS Trainers were active participants on RTI’s IPEDS Listserv and they provided answers to
technical questions and practical advice for keyholders18.
Partnerships
AIR’s partnerships with co-hosting associations, NCES, RTI, and other organizations provided
additional avenues for communications regarding IPEDS training. Combining the marketing
reach of AIR with those of co-hosting organizations allowed AIR to inform a broad audience
about training opportunities well beyond the institutional research community. A number of
organizations included information about AIR’s training activities on their websites, including:
15 Images of the AIR and NCES websites are included in the Evidence File, Section V, pages 1-7.
16 A sample of an eAIR edition is included in the Evidence File, Section V, page 8.
17 A sample email is included in the Evidence File, Section V, page 9-10.
18 A sample discussion is included in the Evidence File, Section V, page 11.
South Dakota State University; the North Carolina Association for Institutional Research
(NCAIR); and the National Association of Independent Colleges and Universities (NAICU)19. In
addition, many organizations provided details of upcoming IPEDS Workshops and other
resources to their members via their respective listservs.
Targeted AIR Emails
Finally, targeted personal emails were a key component of AIR’s communications strategy20.
From announcing IPEDS Workshops to informing the higher education community about
updated tutorials, personalized and concise emails provided a professional approach to marketing
IPEDS resources and for handling pre- and post-Workshop communications. In addition,
participants’ supervisors received an email after a participant attended a Workshop. This
served as a way to reinforce the availability of training and assistance with IPEDS data reporting
and use.
Areas for Improvement
Attendance at IPEDS Workshops and pageviews on AIR’s website indicate an already successful
strategy for announcing IPEDS training resources. AIR continues to seek ways to improve
communication and outreach regarding IPEDS training. These include exploring the value of
social media communication through sites such as LinkedIn; analyzing email open rates and
click-through rates to better understand the effectiveness of specific subject line words and email
content; developing standard communication packages for partner organizations; and updating
the IPEDS training section of the AIR website so it is more user friendly. Remaining effective in
electronic communication will require continuous updates and improvements.
19 Images of partnership websites are included in the Evidence File, Section V, pages 12-13.
20 Sample emails are included in the Evidence File, Section V, pages 14-22.
IPEDS Institute for Educators | July 2015
Institute for Educators
(fka Train the Trainers)
IPEDS TEAM
NATIONAL CENTER FOR EDUCATION STATISTICS
U.S. DEPARTMENT OF EDUCATION
IPEDS Institute for Educators | July 2015
Agenda
• Introductions
• 2015-16 data collection
– Outcome Measures
– Academic Libraries
• Handy things to know and tips to share
• Interactive activity
• Data release protocol
• IPEDS Data Feedback Report
• IPEDS Website Changes (Vision)
• Questions/comments/other things you would like to discuss
IPEDS Institute for Educators | July 2015
INTRODUCTIONS
Everyone
IPEDS Institute for Educators | July 2015
IPEDS Staff
Sam Barbett
202-502-7305; samuel.barbett@ed.gov
IPEDS Data Quality Team Lead, IPEDS data release,
IPEDS files
Moussa Ezzeddine
202-502-7781; moussa.ezzeddine@ed.gov
Institutional Characteristics Header, Human Resources,
Report Mapping, Universe Maintenance
Gigi Jones
202-502-7428; gigi.jones@ed.gov
Graduation Rates / Outcome Measures, Admissions,
Data Feedback Report, Data Integration
Tara Lawley
202-502-7476; tara.lawley@ed.gov
Acting Operations Team Lead, IPEDS Training, IPEDS
Technical Review Panels, National Postsecondary
Education Cooperative (NPEC)
Bao Le
202-502-7328; bao.le@ed.gov
Academic Libraries, Fall and 12-Month Enrollment,
Finance
Andrew Mary
202-502-7337; andrew.mary@ed.gov
Completions, Classification of Instructional Programs
(CIP), IPEDS data release and publications
Stefanie McDonald
srmcdonald@air.org
Institutional Characteristics, Student Financial Aid
Richard Reeves
202-502-7436; richard.reeves@ed.gov
Postsecondary Branch Chief
Jie Sun
202-502-7742; Jie.sun@ed.gov
IPEDS files
IPEDS Institute for Educators | July 2015
2015-16 DATA COLLECTION
Tara Lawley
IPEDS Institute for Educators | July 2015
2015-16 Data Collection Calendar
Registration – opens Aug 5; includes Registration, Mapping, Institution ID
Fall (6 wks) – opens Sept 2; keyholder close date Oct 14; coordinator close date Oct 28; includes IC-H, IC, C, E12
Winter (9 wks) – opens Dec 9; keyholder close date Feb 10; coordinator close date Feb 24; includes SFA, GR, GR200, ADM, OM
Spring (17 wks) – opens Dec 9; keyholder close date Apr 6; coordinator close date Apr 20; includes EF, F, HR, AL
IPEDS Institute for Educators | July 2015
Prior Year Revision System
• Components will be open for revision during
their regular data collection period
– Example: Revisions to the Completions
component will need to be made during the fall
collection period.
• Net price data will not be revised in the PYR
system
IPEDS Institute for Educators | July 2015
New Collection Website
IPEDS Institute for Educators | July 2015
Adding FAQs to Screens
IPEDS Institute for Educators | July 2015
Changes
• Outcome Measures
• Academic Libraries (Impacts IC Header)
IPEDS Institute for Educators | July 2015
OUTCOME MEASURES
Gigi Jones
IPEDS Institute for Educators | July 2015
OM 4 Cohorts
Four degree/certificate-seeking undergraduate
student cohorts:
– Full-time, first-time
– Part-time, first-time
– Full-time, non-first-time entering
– Part-time, non-first-time entering
IPEDS Institute for Educators | July 2015
OM: Two Points in Time
• Data are collected at 8 years, covering two points in time from when the student entered the cohort:
– Six years out
– Eight years out
• For the 2015-16 collection year, the cohort year is 2007
– Six-year status: August 31, 2013
– Eight-year status: August 31, 2015
IPEDS Institute for Educators | July 2015
Adjusted Cohort
• Revisions
– An institution needs to
make any corrections to
its initial 2007 cohort
(pulled from the 2007
Fall Enrollment Survey)
• Omissions
• Double counting
• Exclusions
– Deceased or
permanently disabled
and unable to return to
school
– Left school to serve in
the armed forces
– Left school to serve in
foreign services
– Left school to serve an
official religious mission
IPEDS Institute for Educators | July 2015
OM at 6 yrs: Award Status Only
• For each of the 4 cohorts, OM will ask
institutions to report if the student:
– Received award
• For 2015-16 collection year, did the student
receive an award as of August 31, 2013?
IPEDS Institute for Educators | July 2015
OM at 8 yrs: Award and Enrollment Status
For each of the 4 adjusted cohorts, OM collects the award and enrollment status 8 years after the cohort enters the institution using the following categories:
– Received award by August 31, 2015
– If the student did not receive award by August 31, 2015, is the student:
• Still enrolled at reporting institution
• Subsequently enrolled at another institution
• Subsequent enrollment status unknown
• Total of students who did not receive an award (calculated)
IPEDS Institute for Educators | July 2015
Pre-loaded Data
*For all four 2007 cohorts:
– Cohort data will be pre-loaded from the Fall 2007
Enrollment survey
**Revisions and exclusions will be preloaded
only for
– Full-time, first-time students from the Graduation
Rates (GR) for only 4-year institutions
– GR collected this data during 2013-14 Data
Collection Year
IPEDS Institute for Educators | July 2015
OM: Award Status at 6-years
Columns: 2007 Cohort | Revisions (through August 31, 2013) | Exclusions (through August 31, 2013) | Adjusted Cohort | Awarded (through August 31, 2013)
First-time entering, Full-time: 2007 Cohort pre-loaded; Adjusted Cohort calculated
First-time entering, Part-time: 2007 Cohort pre-loaded; Adjusted Cohort calculated
Non-first-time entering, Full-time: 2007 Cohort pre-loaded; Adjusted Cohort calculated
Non-first-time entering, Part-time: 2007 Cohort pre-loaded; Adjusted Cohort calculated
IPEDS Institute for Educators | July 2015
OM: Award and Enrollment Status at 8-years
Columns: 2007 Cohort | Revisions (through August 31, 2015) | Exclusions (through August 31, 2015) | Adjusted Cohort | Awarded (through August 31, 2015) | Students who did not receive an award by August 31, 2015 (number still enrolled at institution; number who subsequently enrolled at another institution; number whose subsequent enrollment status is unknown; total number who did not receive an award)
First-time entering, Full-time: 2007 Cohort pre-loaded; Revisions **; Exclusions **; Adjusted Cohort calculated; total number who did not receive an award calculated
First-time entering, Part-time: 2007 Cohort pre-loaded; Adjusted Cohort calculated; total number who did not receive an award calculated
Non-first-time entering, Full-time: 2007 Cohort pre-loaded; Adjusted Cohort calculated; total number who did not receive an award calculated
Non-first-time entering, Part-time: 2007 Cohort pre-loaded; Adjusted Cohort calculated; total number who did not receive an award calculated
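The cells marked calculated follow arithmetically from the reported cells. The sketch below illustrates that arithmetic with invented numbers; treating revisions as a signed net correction (omissions add students, double counting removes them) is an assumption drawn from the Adjusted Cohort slide rather than language from the survey instrument.

    # Illustrative Outcome Measures arithmetic (all counts invented; not official NCES code).
    initial_cohort = 1000   # pre-loaded from the 2007 Fall Enrollment survey
    revisions = 5           # assumed net correction for omissions / double counting
    exclusions = 12         # deceased/disabled, armed forces, foreign service, religious mission

    adjusted_cohort = initial_cohort + revisions - exclusions   # "Calculated" column

    # Reported by the institution for the eight-year point in time:
    awarded = 610
    still_enrolled = 90
    enrolled_elsewhere = 150
    status_unknown = 143

    # "Total number who did not receive an award" is the other calculated cell.
    did_not_receive_award = adjusted_cohort - awarded
    assert did_not_receive_award == still_enrolled + enrolled_elsewhere + status_unknown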
IPEDS Institute for Educators | July 2015
FAQ: What do you think?
• The next series of slides are the FAQs for OM
• As a group exercise, let’s go through each
question one by one and discuss
• A sheet with the answers will be passed out
after this exercise.
• Take notes!
IPEDS Institute for Educators | July 2015
FAQ - General
1. Who is the best institutional representative to complete the Outcome Measures survey?
2. Will race and ethnicity be required in future years?
3. Are institutions required to subscribe to the National Student Clearinghouse (NSC) in order to report the number of students that subsequently enrolled at another institution?
4. Can social media be used to confirm the receipt of an award at subsequent institutions?
IPEDS Institute for Educators | July 2015
FAQ - Terminology
1. What is a “still enrolled” undergraduate
student?
2. What is a “non-first-time” undergraduate
student?
3. Are the Outcome Measures of 6-year and 8-
year the same as Graduation Rates’ 150%
and 200% of normal time?
IPEDS Institute for Educators | July 2015
FAQ – Cohort
1. Is the cohort year for Outcome Measures set
up the same as the Graduation Rates cohort
years, which have two different cohort years
for 4-year and 2-year institutions?
2. What about spring cohorts? Should students
who enroll during the spring be included?
IPEDS Institute for Educators | July 2015
FAQ – Cohort Revisions
1. If an institution classified and reported a
student as first-time, but learns later that the
student should have been classified as non-
first-time after the current collection has
closed, how does the institution make the
correction?
IPEDS Institute for Educators | July 2015
FAQ – Degree Granting/Seeking Status
1. If an institution was not a degree-granting
institution back in 2007, but later became a
degree-granting institution, will that
institution have to complete the Outcome
Measures survey component?
2. Should students be included in Outcome
Measure cohorts if degree-seeking intent is
not explicitly stated?
IPEDS Institute for Educators | July 2015
FAQ – Award
1. Does any award really mean any award?
2. How would a student that transfers from a 4-year institution to a 2-year institution and completes a lower-level degree/certificate be counted? Is that a measure of success for the 4-year institution?
3. If a student earns multiple awards at my institution, do I count the higher award? Which award column would the student’s award be counted in?
IPEDS Institute for Educators | July 2015
FAQ – Award (cont.)
4. If a student transfers in with an award from
another institution, and then earns an award
at my institution, which award do I count?
5. How should I count students who left my
institution and who we know have received an
award at another institution?
6. Does transfer-prep count as an award?
7. Can stackable credentials count as an award?
IPEDS Institute for Educators | July 2015
FAQ – Transfer Students
1. Won’t there be double counting of students
if two institutions are counting an award? For
example, could institutions double-count
awards earned by transfer students because
both the originating institution and transfer
institution report students as having received
an award?
IPEDS Institute for Educators | July 2015
FAQ – Transfer Students
2. What if a degree-seeking student starts out at my institution, transfers to another institution, but then returns to my institution within the eight-year award/enrollment timeframe? Would my institution be double counting this student in different cohorts and in different years?
IPEDS Institute for Educators | July 2015
FAQ – Transfer Students
3. Are first-time or non-first-time students who
transfer to another institution included in the
non-first-time entering cohort of the transfer
institution?
IPEDS Institute for Educators | July 2015
CHANGES TO ACADEMIC LIBRARIES
Bao Le
IPEDS Institute for Educators | July 2015
Reporting AL Expenses to Determine
Eligibility
• Institutions will no longer be required to report
estimates of library expenditures in ICH
– The question in ICH will be a yes/no
– A new screening question will be added to the AL
component in the spring
IPEDS Institute for Educators | July 2015
Reporting E-Books and E-Media
• Institutions will count by titles instead of by
“units” (defined as the number of simultaneous
uses available)
• Report any title (even those in databases) if
they’re in the library catalog or discovery system
• Include digital government documents, theses
and dissertations with e-books
• Include digital graphic materials and cartographic
materials with e-media
IPEDS Institute for Educators | July 2015
Reporting Other Collections
• Databases: Report if there is bibliographic or
discovery access available
• Physical Media: Include microforms and
cartographic materials
IPEDS Institute for Educators | July 2015
Reporting Physical Circulation
• Physical Circulation:
– Report circulation from the general collection only, instead of
from both the general and reserve collections
– Exclude interlibrary lending and borrowing
• Digital/Electronic Circulation:
– Defined more as “usage” rather than circulation;
includes views, downloads, streams
– Added instructions for using COUNTER metrics
from vendors
IPEDS Institute for Educators | July 2015
COUNTER Statistics
• COUNTER (Counting Online Usage of Networked Electronic
Resources) is an international initiative to improve the
reliability of online usage statistics
• Supported by the vendor, intermediary and librarian
communities
– E-books: report usage using COUNTER BR1 statistic (# Successful Title
Requests) or BR2 (# Successful Section Requests)
– E-media: report usage using COUNTER MR1 (# Successful Multimedia
Full Content Requests)
• If COUNTER statistics are not available, report using downloads,
session views, transaction logs, etc.
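As a rough illustration of how a library might roll up vendor usage data before entering it in the AL component, the sketch below sums title-level counts from a hypothetical CSV export of a COUNTER BR1 report; the file name and column heading are assumptions, since actual vendor exports vary.

    import csv

    # Hypothetical BR1 export: one row per e-book title (file name and column name are assumed).
    total_title_requests = 0
    with open("counter_br1_fy2015.csv", newline="") as report:
        for row in csv.DictReader(report):
            total_title_requests += int(row["Successful Title Requests"])

    # The total would be reported as e-book usage; if BR1 is unavailable, the same
    # roll-up could be applied to BR2 section requests or other usage logs.
    print(f"E-book usage (successful title requests): {total_title_requests}")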
IPEDS Institute for Educators | July 2015
Reporting Expenses
• Prior Year:
– Report Salaries and Wages and/or Fringe Benefits
only if paid from the library budget
• 2015-16 collection:
– Report Fringe Benefits only if paid from the library
budget
– Report Salaries and Wages if the funds were
expended from the library budget or from all
other known sources
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE
IPEDS Survey Directors
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: INSTITUTIONAL
CHARACTERISTICS
Stefanie McDonald
IPEDS Institute for Educators | July 2015
Cost of Attendance
• The total amount it will cost a student to attend
college for one year
• Full-time, first-time degree/certificate seeking
undergraduate students
• Computed differently for academic/hybrid
reporters vs. program reporters
IPEDS Institute for Educators | July 2015
Reporting Cost of Attendance
Cost Component             Academic/Hybrid Reporters   Program Reporters
Tuition and required fees  Academic year               Entire program
Books and supplies         Academic year               Entire program
Room and board             Academic year               One month
Other expenses             Academic year               One month
! The COA is for full-time, first-time degree/certificate seeking undergraduate students only.
! Program reporters report COA for the institution’s largest program.
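For an academic or hybrid reporter, the COA reported to IPEDS is the sum of the four academic-year components in the table above. The sketch below shows that sum with invented dollar amounts; how a program reporter's one-month living costs are combined with entire-program charges is governed by the survey instructions and is not shown here.

    # Illustrative academic-year COA (all dollar amounts invented).
    tuition_and_required_fees = 9_500
    books_and_supplies = 1_200
    room_and_board = 10_800
    other_expenses = 2_300

    cost_of_attendance = (
        tuition_and_required_fees
        + books_and_supplies
        + room_and_board
        + other_expenses
    )
    print(f"Cost of attendance (academic year): ${cost_of_attendance:,}")   # $23,800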
IPEDS Institute for Educators | July 2015
Tuition Data
Institutions report tuition data for
undergraduates, graduates, and doctor’s-
professional practice students for all levels
– Academic/hybrid report undergraduate and
graduate tuition for both full-time and part-time
students
– Program reporters report tuition for the 2nd through 6th
largest programs
IPEDS Institute for Educators | July 2015
Uses of Cost of Attendance Data
Cost of attendance is used along with data from
the Student Financial Aid component to
calculate net price.
– College Affordability List
– College Scorecard
– College Navigator
! Important to make sure data are reported correctly so that institution
does not mistakenly end up on a high net price watch list.
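Conceptually, net price pairs the COA reported in IC with the average grant and scholarship aid reported in SFA. The sketch below uses invented numbers and is only an approximation; the published IPEDS calculation applies specific student-group and aid-source rules that are not reproduced here.

    # Illustrative only: net price = cost of attendance minus average grant/scholarship aid
    # for full-time, first-time students (numbers invented).
    cost_of_attendance = 23_800
    average_grant_and_scholarship_aid = 8_650   # federal, state/local, and institutional grants

    average_net_price = cost_of_attendance - average_grant_and_scholarship_aid
    print(f"Average net price: ${average_net_price:,}")   # $15,150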
IPEDS Institute for Educators | July 2015
Tuition Guarantee
When reporting COA, academic and hybrid
reporters are asked to indicate if the tuition
and/or fees are guaranteed and the percentage
increase.
– Institutions should not report the non-guaranteed
rate when they check the tuition guarantee box
– Instead, report the guaranteed rate (e.g.,
institution promises the tuition will not go up
more than 3% or 5%)
IPEDS Institute for Educators | July 2015
On-Campus Housing
Institutions are asked whether all full-time, first-time
degree/certificate-seeking undergraduates must
live on campus.
– Institutions should only check yes if all students
are required to live on campus
– Checking this box and then reporting for other
housing requirements on SFA creates problems
with the net price calculation
! Even if only 1 in 10,000 students is allowed to live off-campus, check No.
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: STUDENT
FINANCIAL AID
Stefanie McDonald
IPEDS Institute for Educators | July 2015
Changes to SFA
New questions on military/veteran/eligible
dependent students receiving benefits were
added for both undergraduate and graduate
students.
! Work closely with your financial aid office and campus representative
who certifies veteran and military benefits.
IPEDS Institute for Educators | July 2015
Reporting Period
Report data for the prior academic year.
– Academic and hybrid reporters: Period of time
generally extending from September to June and
usually equated to 2 semesters or trimesters, 3
quarters, or the period covered by a 4-1-4
calendar system
– Program reporters: Defined by the institution, so
long as the reported data falls within July 1–June
30
IPEDS Institute for Educators | July 2015
Student Cohort
Institutions report on a cohort of students based on their reporter type.
– Academic reporters: Undergraduate students enrolled as of October 15 or as of the institution’s official fall reporting date
– Hybrid reporters: Undergraduate students enrolled at any time within the period of August 1 through October 31
– Program reporters: Undergraduate students enrolled anytime during the academic year, as defined by the institution
IPEDS Institute for Educators | July 2015
Student Groups
• Group 1: All Undergraduates
– Public institutions report on students who paid the
in-district or in-state tuition rate
– Program reporters report on the largest program
IPEDS Institute for Educators | July 2015
Types of Financial Aid to Report
– Federal grants
• Include Title IV and educational assistance funds from other agencies
• Do not include veterans education benefits; however, the institutional aid used in matching for the Post-9/11 Yellow Ribbon Program can be included
– Federal loans to students
• Do not include PLUS loans and other loans not made directly to the student
– State/local government grants, scholarships, waivers
– Institutional grants, scholarships, waivers
– Private grants or scholarships
– Private loans to students
– Other sources of aid known to the institution
IPEDS Institute for Educators | July 2015
Changes to COA in SFA
The SFA component is connected to other IPEDS
components.
– Prior years (3 years) of cost of attendance can now
be edited in SFA component
– However, revised data will not be adjudicated in
time for the College Affordability and
Transparency Center's "Watch List"
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: 12 MONTH
ENROLLMENT
Bao Le
IPEDS Institute for Educators | July 2015
FTE
• Instructional Activity and FTE – enabling an apples-to-apples comparison across all institutions is the priority
– FTE as used in IPEDS, and in DFR statistics, is not meant to describe institutions outside of the IPEDS context
– Institution-reported FTE is best used only when it is methodologically necessary
• Doctor’s-Professional Practice (DPP) – IPEDS requests a reported FTE
IPEDS Institute for Educators | July 2015
Program reporters instructional
activity
• Program reporters are sometimes confused
about what to report for instructional activity
– Institutions should report the total contact hour
activity (or credit hour activity) over the 12-month
period, not the average hours for a student or the
program length.
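In other words, the reported number is a simple total over every student's hours for the 12-month period. A minimal sketch with hypothetical per-student records:

    # Total 12-month contact-hour activity: sum across all students.
    # Report the total, not the per-student average and not the program length.
    contact_hours_by_student = {"S001": 900, "S002": 450, "S003": 720}

    total_contact_hours = sum(contact_hours_by_student.values())
    average_per_student = total_contact_hours / len(contact_hours_by_student)

    print(total_contact_hours)  # 2070  <- report this
    print(average_per_student)  # 690.0 <- do not report this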
IPEDS Institute for Educators | July 2015
Student Level
• Even though Teacher Preparation certificate programs may require a bachelor's degree for admission, they are considered subbaccalaureate undergraduate programs, and students in these programs are undergraduate students.
• Doctor’s-professional practice students (formerly called first-professional) are included with graduate students in the unduplicated count but are not included with the graduate student FTE count
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: COMPLETIONS
Andrew Mary
IPEDS Institute for Educators | July 2015
Completions and completers
• Number of degrees and certificates awarded (completions)
– Reported by program (CIP code) and award level; the race/ethnicity and gender of the student earning the degree or certificate are also reported
• Number of students who completed a degree or award (completers)
– Reported at the total level by race/ethnicity and gender of the student; and by race/ethnicity, gender, and age within consolidated award levels.
IPEDS Institute for Educators | July 2015
Reporting for programs
• Completions data are reported for each
program of study at an institution.
– Programs of study are described using 6-digit
Classification of Instructional Program (CIP) codes.
– Institutions should include all programs of study
offered at their institution – even if the program
did not have any completers in the reporting
period.
IPEDS Institute for Educators | July 2015
Completions
• Only credit awards conferred as the result of
completion of a recognized program of study
should be reported.
– Institutions should not report non-credit awards, such
as informal certificates of completion or merit.
• Award levels in Completions should match the
levels selected on the previous year’s IC Header.
IPEDS Institute for Educators | July 2015
Distance education
• If more than one program is reported under a CIP code by award level, you should respond "YES" to the distance education question if ANY of the programs are offered as a distance education program.
• The distance education checkbox should be answered YES even if the program could also be completed through a traditional offering.
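In code terms, the checkbox is an ANY condition across the programs sharing the CIP code and award level. A minimal sketch with hypothetical program records:

    # Answer YES if ANY program under the CIP code / award level is offered
    # via distance education, even if it can also be completed traditionally.
    programs = [
        {"name": "Campus-only option", "distance_education": False},
        {"name": "Online option", "distance_education": True},
    ]

    answer = "YES" if any(p["distance_education"] for p in programs) else "NO"
    print(answer)  # YES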
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: GRADUATION
RATES AND GRADUATION RATES 200
Gigi Jones
IPEDS Institute for Educators | July 2015
GR Cohort
• All full-time, first-time degree/certificate-seeking
undergraduate students entering the institution either:
– Fall term (AY reporters) or
– 12-month period (program and hybrid reporters).
• For 4-year institutions, the cohort is divided into two subcohorts:
– Students who upon entry are seeking a bachelor’s or equivalent degree
– Students who upon entry are seeking an undergraduate award other than a bachelor’s degree
• Students remain in the cohort even if their status changes after they enter.
IPEDS Institute for Educators | July 2015
Cohort Revisions
• Institutions have the option of revising their preloaded cohort if:
– There are eligible students who were omitted in the past
– Students were included erroneously (e.g., they were not actually first-time or full-time)
– Better information regarding race/ethnicity or gender is available for eligible students
• Cohorts should not be revised for students who have dropped out or transferred out.
• If the initial cohort changes by more than 20%, an institution will need to provide a good edit explanation for the large change.
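The 20% threshold is a simple relative-change check against the preloaded cohort count; a sketch with hypothetical counts:

    # Flag a revised cohort that differs from the preloaded cohort by more than 20%,
    # which will require a good edit explanation in the collection system.
    preloaded_cohort = 500
    revised_cohort = 380

    pct_change = abs(revised_cohort - preloaded_cohort) / preloaded_cohort
    print(f"{pct_change:.0%} change; explanation required: {pct_change > 0.20}")
    # 24% change; explanation required: True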
IPEDS Institute for Educators | July 2015
Status as of 150% of normal time
• Institutions must report the status of the cohort as of 150% of normal time to completion.
– For the bachelor's or equivalent degree-seeking subcohort, also report the length of time it took students to complete their program of study (4 years, 5 years, or 6 years).
– When reporting the status of the bachelor's degree-seeking subcohort of students, report only FTFT students who were seeking a bachelor's or equivalent degree upon entry.
– When reporting the status of the other degree/certificate-seeking subcohort, report only FTFT students who were seeking an undergraduate award other than a bachelor's degree upon entry.
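The 150% point is computed from the normal (catalog) length of the program. A minimal sketch of the arithmetic with hypothetical program lengths:

    # 150% of normal time to completion, in years after entry.
    def time_to_150_pct(normal_time_years: float) -> float:
        return normal_time_years * 1.5

    print(time_to_150_pct(4))     # 6.0 years for a 4-year bachelor's program
    print(time_to_150_pct(2))     # 3.0 years for a 2-year associate's program
    print(time_to_150_pct(0.75))  # 1.125 years for a 9-month certificate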
IPEDS Institute for Educators | July 2015
Status of non-completers
• Institutions must also report the status of non-
completers as of 150% of normal time to
complete their program:
– Transfers-out
– Allowable exclusions from the cohort
– Students still enrolled at the institution
• Allowable exclusions should be permanently
excluded from the cohort, including those who
return to the institution prior to the status date.
IPEDS Institute for Educators | July 2015
GR200 Cohort
• All full-time, first-time degree/certificate-
seeking undergraduate students entering the
institution either:
– The given fall term (AY reporters) or
– 12-month period (program and hybrid reporters).
• At 4-year institutions, GR200 is limited to the
bachelor’s degree-seeking students.
IPEDS Institute for Educators | July 2015
Status as of 200% of normal time
• Institutions must report the status of the
cohort within 151-200% of normal time to
completion.
– While the data reported in the GR component at
150% of normal time are cumulative, the data
reported in the GR200 component should include
only additional students who completed between
151 and 200% of normal time.
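Put another way, because GR is cumulative through 150% and GR200 covers only the additional completers, the GR200 count is the cumulative completers by 200% minus those already reported at 150%. A sketch with hypothetical counts:

    # GR200 reports only students who completed between 151% and 200% of normal time.
    completers_by_150_pct = 230  # already reported (cumulatively) in GR
    completers_by_200_pct = 255  # cumulative through 200% of normal time

    gr200_additional_completers = completers_by_200_pct - completers_by_150_pct
    print(gr200_additional_completers)  # 25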
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: HUMAN
RESOURCES
Moussa Ezzeddine
IPEDS Institute for Educators | July 2015
Reporting Employees
• The HR component is intended to provide a snapshot of the institution's human resources/payroll data at a specific point in the fall.
– Institutions report employees on the payroll of the institution as of November 1.
• Report each employee only once. If an employee could be coded in more than one occupation:
– Code the employee in the occupation that requires the highest level of skill; or
– If there is no measurable difference in skill requirements, code the employee in the occupation in which they spend the most time.
IPEDS Institute for Educators | July 2015
Human Resources
• Institutions should always report data in the order of the displayed screens.
– Data must be entered on each displayed screen. If a screen is not applicable, the institution must enter at least one zero to continue.
• All staff must be reported using the new IPEDS occupational categories, which align with the 2010 SOC codes.
• Institutions should provide additional information on any employees who are difficult to categorize in the context boxes provided, including the “Human Resources Survey Evaluation” screen presented at the end of the survey.
IPEDS Institute for Educators | July 2015
Reporting instructional staff
IPEDS Institute for Educators | July 2015
Reporting by faculty status
• Keyholders should refer to their institution’s
policies to determine whether staff members
have the designation of “faculty”.
– This is not limited to instructional staff, but may
also include such positions as president, provost,
or librarians.
– Any staff without faculty designation should be
reported in the Without Faculty Status column.
IPEDS Institute for Educators | July 2015
Reporting by Contract Length
• Data on full-time instructional staff with
faculty status who are not on tenure track (or
where the institution does not have a tenure
system) are collected for three categories of
employment agreements or contracts:
– Multi-year or Continuing or At-will Contract
– Annual
– Less-than-annual
IPEDS Institute for Educators | July 2015
Salary Outlays
• Collected for full-time, non-medical school, instructional and non-instructional staff
– FT instructional staff should be reported based on the number of months they work during the year (9, 10, 11, or 12 months), not the number of months over which they are paid.
– Include all full-time, non-medical school, instructional staff, both with and without faculty status.
– Any remaining full-time instructional staff (i.e., those whose annual salary covers less than 9 months of work) go in the Balance column.
IPEDS Institute for Educators | July 2015
Salary Outlays
• Salary outlays should include base salaries
only – no supplements, overloads, or bonuses.
• Report total annual salary outlays for staff, not
average salary outlays.
• Report total annual salary outlays for full-time
staff only. Do not include part-time staff in
salary outlay amounts.
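Restated as a calculation: total the base salaries of full-time staff within each months-worked category, and leave out part-time staff, supplements, overloads, and bonuses. A minimal sketch with hypothetical staff records:

    from collections import defaultdict

    # Total base-salary outlays for full-time instructional staff by months worked.
    staff = [
        {"full_time": True,  "months_worked": 9,  "base_salary": 62_000},
        {"full_time": True,  "months_worked": 12, "base_salary": 80_000},
        {"full_time": True,  "months_worked": 9,  "base_salary": 58_000},
        {"full_time": False, "months_worked": 9,  "base_salary": 24_000},  # part-time: excluded
    ]

    outlays_by_months_worked = defaultdict(int)
    for person in staff:
        if person["full_time"]:
            outlays_by_months_worked[person["months_worked"]] += person["base_salary"]

    print(dict(outlays_by_months_worked))  # {9: 120000, 12: 80000}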
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: FALL ENROLLMENT
Bao Le
IPEDS Institute for Educators | July 2015
Reporting Enrollment by Age
• MANDATORY this year
• How to report students whose ages are unknown?
– This is calculated as the difference between the
total enrollment reported in Part A and the total
enrollment by age reported in Part B
– If the calculated value is not correct, revise either Part A or Part B
• What if the institution’s age categories are different
from the IPEDS age categories?
– Must use the IPEDS categories
– Use the student’s date of birth as a reference
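A small sketch of the derivation (the age bands and counts below are illustrative, not the exact IPEDS categories):

    # The "age unknown" value is derived: Part A total enrollment minus the
    # total reported by age in Part B. If it looks wrong, revise Part A or Part B.
    part_a_total_enrollment = 4_250
    part_b_enrollment_by_age = {
        "under 18": 120, "18-19": 1_600, "20-21": 1_450,
        "22-24": 700, "25 and over": 360,
    }

    age_unknown = part_a_total_enrollment - sum(part_b_enrollment_by_age.values())
    print(age_unknown)  # 20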
IPEDS Institute for Educators | July 2015
First-time Degree-/Certificate-seeking
Undergraduate Student
• First-time counts impact other IPEDS surveys:
– Graduation Rates (GR) component
– Outcome Measures (OM) component
– Student Financial Aid (SFA) component
• First-time students are those with NO prior postsecondary experience
(regardless of credit earning)
• Not considered prior postsecondary experience:
– Credit for military service/training from an association such as the American
Council on Education,
– Credit from any non-credit courses, as defined by the institution,
– Credit received via Competency-Based Education,
– Credit received before the student has earned a high school diploma (i.e., AP
or dual enrollment credits), or
– Credit for life experience.
IPEDS Institute for Educators | July 2015
Transfer-in Degree-/Certificate-seeking
Undergraduate Students
• Report students who are new to the reporting
institution in the Fall (includes prior summer new
enrollments)
• How to count students who change degree-seeking
status, or students with multiple transfers? See the
new cohorts guidance…
– Once in a cohort, the student cannot be reassigned to
another cohort
– High school students earning college credit as non-degree-
seeking who graduate and re-enroll as degree-seeking are
first-time
IPEDS Institute for Educators | July 2015
Cohorts Guidance (Full-time Undergrad)
Key
NDS = Non-Degree-Seeking
DS = Degree-Seeking
N/A = Non-Applicable
FTFT = First-time Full-time
GR = Graduation Rates Cohort
EF = Fall Enrollment Cohort
OM = Outcome Measures Cohort
IPEDS Institute for Educators | July 2015
Undergraduate Student Retention
• Tracks the number of first-time degree-/certificate-
seeking undergraduates enrolled in the prior fall who
are still enrolled in the current fall
– 4-year institutions only report retention rate data for
bachelor’s-seeking students
• This may be a small percentage of the students at some
institutions (e.g., schools that primarily award 2-year degrees)
– 2-year and less-than-2-year institutions report data for all
first-time degree/certificate-seeking undergraduates
• Must report: first-time degree/certificate-seeking students from the
prior fall who are still enrolled, plus students who completed their 1-year or
less-than-1-year program in that timeframe
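As a rough sketch of the arithmetic at a 2-year or less-than-2-year institution (hypothetical counts; cohort exclusions are omitted for simplicity):

    # Fall-to-fall retention: students from the prior fall first-time cohort who are
    # still enrolled this fall, plus those who completed a 1-year or shorter program.
    prior_fall_cohort = 800
    still_enrolled_current_fall = 540
    completed_short_program = 60

    retention_rate = (still_enrolled_current_fall + completed_short_program) / prior_fall_cohort
    print(f"{retention_rate:.1%}")  # 75.0%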
IPEDS Institute for Educators | July 2015
Distance Education
• Institutions reporting that they offer distance education
opportunities (courses and/or programs) in IC will be asked to
report enrollment in distance education courses in EF
• Institutions should only report students enrolled in distance
education courses at THEIR institution
• Hybrid courses (using a mixture of face-to-face and distance
education interaction for instruction) are NOT distance
education courses and students enrolled in these courses are
NOT considered “enrolled in some but not all distance education
courses”
IPEDS Institute for Educators | July 2015
Students Enrolled Exclusively in
Distance Education
• Students taking the instructional portions of their programs entirely online are considered enrolled exclusively in distance education courses
• Programs where a majority of instructional portions are taken online but students come to campus for some portion of the program
– If the students come to campus to complete a practicum, residency, or internship, then they are still considered enrolled in exclusively distance education courses
– If the student comes to campus to meet for instructional purposes, then they are not considered enrolled in exclusively distance education courses
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: FINANCE
Bao Le
IPEDS Institute for Educators | July 2015
Finance
• Report on most recent Fiscal Year ending before October 1,
2015
• Where to get help for…
– Definitions and examples: IPEDS glossary and NACUBO FARM document (for members only)
– Reporting scholarships/fellowships and discounts/allowances: IPEDS tip sheet (http://1.usa.gov/1JS7ifm) and AIR video tutorial (http://bit.ly/1JS7njt)
– Reporting in a parent/child relationship: IPEDS tip sheet and AIR video tutorial
– Guidance for Data Users on FASB vs. GASB: IPEDS tip sheet and Delta Cost Project History Documentation (http://1.usa.gov/1JS82kM)
– Allocating Expenses: AIR video tutorial, NACUBO Advisory Report (http://bit.ly/1JS8qQl), and forthcoming IPEDS guidance on the tip sheet page
IPEDS Institute for Educators | July 2015
How Pell Grants Are Treated By The
Different Finance Forms
GASB
– Must treat Pell as a federal grant
– Scholarships: Pell is applied to discounts/allowances (expense)
– Expense: Excess Pell, after application to tuition/fees and auxiliary, is reported as part of Net Grant Aid Expense
– Revenue: Pell is reported as a federal non-operating grant
FASB
– Can treat Pell as a federal grant or as a pass-through. If treated as a pass-through:
– Scholarships: Pell is NOT applied to discounts/allowances (expense)
– Expense: Pell is not reported in expenses
– Revenue: Pell should be reported as Tuition/Fees Revenue or Auxiliary Revenue
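The bullets above can be restated as two different mappings of the same Pell dollars. A minimal sketch with hypothetical amounts (illustration only, not accounting guidance), assuming the FASB institution elects the pass-through treatment:

    # Illustrative placement of $1,000,000 of Pell on the GASB and FASB (pass-through) forms.
    pell = 1_000_000
    applied_to_tuition_fees_and_auxiliary = 850_000
    excess_pell = pell - applied_to_tuition_fees_and_auxiliary

    gasb = {
        "revenue: federal non-operating grants": pell,
        "discounts/allowances (scholarship expense)": applied_to_tuition_fees_and_auxiliary,
        "net grant aid expense (excess Pell)": excess_pell,
    }

    fasb_pass_through = {
        "revenue: tuition/fees or auxiliary": applied_to_tuition_fees_and_auxiliary,
        "discounts/allowances": 0,
        "expenses": 0,
    }

    print(gasb)
    print(fasb_pass_through)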
IPEDS Institute for Educators | July 2015
Other Scholarship/Fellowship
• Federal Direct Student Loans (FDSL) are NOT scholarships, fellowships, or student grant aid
– Are captured as student payments for tuition and fees or auxiliary enterprises.
• Federal veteran education benefits are also not considered scholarship, fellowship, or student grant aid
– Are not included in grant revenue, scholarship expenses, or discounts and allowances
– Captured as student payments
IPEDS Institute for Educators | July 2015
HANDY THINGS TO KNOW
AND TIPS TO SHARE: ACADEMIC
LIBRARIES
Bao Le
IPEDS Institute for Educators | July 2015
Resources and Tips
• IPEDS Academic Libraries Information Center (http://1.usa.gov/1LV4K3i)
– Crosswalks
– Tip sheets
– Q & A for Library Directors
– Links to old survey and data
– Links to other external resources and organizations
• Encourage keyholders to reach out to library staff for help completing AL and HR
IPEDS Institute for Educators | July 2015
For Institutions Sharing Library Resources
• Which relationship should be established?
– Share ALL resources:
• Main/branch reporting if the Unit ID is the same
• Full parent/child (p/c) relationship if the Unit IDs are different but the sector is the same
– Share PARTIAL resources:
• No p/c relationship
IPEDS Institute for Educators | July 2015
Institutions Sharing ALL Library Resources
Main/Branch
– How is the relationship established in the Data Collection System (DCS)? There is no formal way to establish it in the DCS because only one IPEDS Unit ID is involved
– How is Section I: Collections reported? The main library reports combined totals for itself and the branch libraries
– How is Section II: Expenditures reported? The main library reports combined totals for itself and the branch libraries
– How is the number of branch or independent libraries reported in Section II? Report the total number of branch libraries
– How are allocation factors reported? Allocation factors will not be asked for this type of relationship

Parent/Child
– How is the relationship established in the DCS? The Help Desk establishes this relationship in the DCS
– How is Section I: Collections reported? The parent institution reports combined totals for itself and the child institutions
– How is Section II: Expenditures reported? The parent institution reports combined totals for itself and the child institutions
– How is the number of branch or independent libraries reported in Section II? Do not report the child institutions as branch libraries
– How are allocation factors reported? The parent institution reports the allocation factors for its child institutions
IPEDS Institute for Educators | July 2015
Institutions Sharing PARTIAL Library Resources
• Digital/electronic collection:
– Report whatever the library has access to as part of its
collection
– Report at the administrative and not branch level
• Digital/electronic circulation:
– Report only circulation of those shared resources at your
institution. Do not include circulation at other institutions
in consortium
– If this cannot be broken out, use whichever method
is implemented locally to monitor circulation
IPEDS Institute for Educators | July 2015
INTERACTIVE ACTIVITY
Everyone
IPEDS Institute for Educators | July 2015
DATA RELEASE PROTOCOL
Andrew Mary
IPEDS Institute for Educators | July 2015
Data Release Protocol
• 4 Stages:
– Collection
– Preliminary
– Provisional
– Final
• Outlined in IPEDS Resource Center
IPEDS Institute for Educators | July 2015
Data Release Protocol – Collection Level
• After data are locked, they are
1. reviewed by the Help Desk
2. migrated to the Collection Level Data Center (login
available only through the Data Collection System)
• At the collection level, any respondent whose
data have already been migrated can see their
own data, as well as the data for all of the other
institutions that have already been migrated
IPEDS Institute for Educators | July 2015
Data Release Protocol – Preliminary
• Soon* after an IPEDS data collection cycle closes:
– A First Look publication based on preliminary data is
released
– Preliminary data are made publicly available through the
IPEDS Data Center
• Preliminary data have been edited but are subject to
further NCES quality control procedures
• Imputed data for nonresponding institutions are not
included
IPEDS Institute for Educators | July 2015
Data Release Protocol – Provisional
• After all quality control procedures are complete:
– The First Look publication is reissued based on the
provisional data
– Provisional data are made publicly available through
the IPEDS Data Center
• Data have been imputed for non-responding
institutions
IPEDS Institute for Educators | July 2015
Data Release Protocol – Final
• Institutions may submit revisions to data in the
subsequent data collection year.
• After editing of these revised data is complete:
– Final data are made public through the IPEDS Data
Center
– The First Look publication is not reissued
IPEDS Institute for Educators | July 2015
Timing of First Look Publications
and Data Release
• Preliminary Data
– Fall Survey Data (ICH, IC, E12, C): mid- to late-May following the collection
– Winter Survey Data (SFA, GR, GR200, ADM, OM): early- to mid-September following the collection
– Spring Survey Data (HR, EF, F, AL): early- to mid-October following the collection
• Provisional Data
– Approximately 4-6 weeks after the Preliminary data release
IPEDS Institute for Educators | July 2015
DATA FEEDBACK REPORT
Gigi Jones
IPEDS Institute for Educators | July 2015
2015 DFRs
General Timeline
- Currently: Selecting pilot group to review
- Early September: Pilot group reviews their DFRs
- October: Emailed to KH, Coordinators, and CEOs
- DFRs posted to the Data Center
IPEDS Institute for Educators | July 2015
Data Feedback Report Update
• AIR Benchmarking Workshop
• Institutions had until July 17, 2015 to load or
update their comparison groups
• 2015 reports will be emailed again to KH,
Coordinators, and CEOs.
– New this year, NO attachments. Links to the
reports will be provided instead.
• New data = New figures
IPEDS Institute for Educators | July 2015
• Admissions (only for non-open admissions schools)
• Student Enrollment
• Awards
• Charges and Net Price
• Student Financial Aid
• Military Benefits **
• Retention and Graduation Rates
• Finance *
• Staff
• Libraries ***
IPEDS Institute for Educators | July 2015
Data Center: DFR
• Facilitating Customization
• Tree layout
• Headers organize the figures like the reports:
• Admissions (only for non-open admissions schools)
• Student Enrollment
• Awards
• Charges and Net Price
• Student Financial Aid
• Military Benefits
• Retention and Graduation Rates
• Finance
• Staff
• Libraries
IPEDS Institute for Educators | July 2015
IPEDS WEBSITE (VISION)
Tara Lawley
IPEDS Institute for Educators | July 2015
QUESTIONS/COMMENTS/OTHER
THINGS YOU WOULD LIKE TO DISCUSS
Everyone