United States Department of Agriculture OFFICE OF INSPECTOR GENERAL



U.S. Department of Agriculture, Office of the Chief Information Officer, Fiscal Year 2016 Federal Information Security Modernization Act

Audit Report 50501-0012-12

As required by FISMA, OIG reviewed USDA’s ongoing efforts to improve its IT security program and practices during FY 2016.

WHAT OIG FOUND

The Department of Agriculture (USDA) is working to improve its information technology (IT) security posture, but many longstanding weaknesses remain. We continue to find that the Office of the Chief Information Officer (OCIO) has not implemented corrective actions that the Department has committed to in response to prior recommendations from the Office of Inspector General. In FYs 2009 through 2015, OIG made 61 recommendations for improving the overall security of USDA’s systems. Of these, 39 have been closed (i.e., the agreed-upon corrective action has been implemented) and 4 are current and have not yet reached the estimated completion date; however, 18 are overdue for completion. Our testing identified that security weaknesses still exist in 3 of the 39 closed recommendations.

Our testing also identified weaknesses in eight subject areas as defined for review by the Federal Information Security Modernization Act (FISMA): risk management, contractor systems, configuration management, identity and access management, security and privacy training, Information Security Continuous Monitoring (ISCM) program maturity level, incident response program maturity level, and contingency planning.

Due to these identified outstanding recommendations and weaknesses, we continue to report a material weakness in USDA’s IT security that should be included in the Department’s Federal Managers Financial Integrity Act report. Based on these outstanding recommendations, and the findings in this report, OIG concludes that the Department lacks an effective information security program and practices.

We noted that OCIO continues to take positive steps towards improving the Department’s security posture. OCIO released two critical policies this year that, once implemented, should improve IT security within USDA, and also began implementation of the Continuous Diagnostic and Mitigation (CDM) project.

OCIO concurred with the findings in this report and generally agreed with the recommendations.

OBJECTIVE

The objective of this audit was to evaluate the status of USDA’s overall IT security program by assessing the five information security functions defined in the NIST Framework for Improving Critical Infrastructure Cybersecurity.

WHAT OIG REVIEWED

The scope of our audit was Department-wide and included agency, OIG, and contractor IT audit work completed in FY 2016. This audit covered 23 agencies and offices, which operate 310 of the Department’s 316 general support and major application systems in the systems inventory.

WHAT OIG RECOMMENDS

The Department should continue its progress by issuing critical policy and completing actions on the 18 outstanding recommendations from the FYs 2009 through 2015 FISMA audit reports and the 3 new recommendations included in this report.


United States Department of Agriculture

Office of Inspector General

Washington, D.C. 20250

November 9, 2016

The Honorable Shaun Donovan
Director, Office of Management and Budget
725 17th Street, NW
Washington, D.C. 20503

Dear Mr. Donovan:

Enclosed is a copy of our report, U.S. Department of Agriculture, Office of the Chief Information Officer, Fiscal Year 2016 Federal Information Security Modernization Act (Audit Report 50501-0012-12), presenting the results of our audit of the Department of Agriculture’s (USDA) efforts to improve the management and security of its information technology (IT) resources. USDA and its agencies have taken actions to improve the security over their IT resources; however, additional actions are still needed to establish an effective security program.

If you have any questions, please contact me at (202) 720-8001, or have a member of your staff contact Mr. Gil H. Harden, Assistant Inspector General for Audit, at (202) 720-6945.

Sincerely,

Phyllis K. Fong
Inspector General

Enclosure


United States Department of Agriculture

Office of Inspector General

Washington, D.C. 20250

DATE: November 9, 2016

AUDIT NUMBER: 50501-0012-12

TO: Jonathan Alboum
Chief Information Officer
Office of the Chief Information Officer

ATTN: Megen Davis Audit Liaison

FROM: Gil H. Harden
Assistant Inspector General for Audit

SUBJECT: U.S. Department of Agriculture, Office of the Chief Information Officer, Fiscal Year 2016 Federal Information Security Modernization Act

This report presents the results of our audit of the U.S. Department of Agriculture, Office of the Chief Information Officer, Fiscal Year 2016 Federal Information Security Modernization Act (FISMA). The instructions for fiscal year (FY) 2016 FISMA reporting are outlined in the FY 2016 Inspector General Federal Information Security Modernization Act of 2014 Reporting Metrics (IG FISMA Metrics), V1.1.3, dated September 26, 2016. This report contains our responses to the questions contained in these instructions. Your written response is included, in its entirety, as an attachment to the report.

In accordance with Departmental Regulation 1720-1, please furnish a reply within 60 days describing the corrective actions taken or planned, and the timeframes for implementing the recommendations for which management decisions have not been reached. Please note that the regulation requires a management decision to be reached on all recommendations within 6 months of report issuance, and final action to be taken within 1 year of each management decision, to prevent being listed in the Department’s annual Agency Financial Report. Follow your internal agency procedures in forwarding final action correspondence to the Office of the Chief Financial Officer.

We appreciate the courtesies and cooperation extended to us by members of your staff during our audits. This report contains publicly available information and will be posted in its entirety to our website, http://www.usda.gov/oig, in the near future. A copy is being sent to the Director of the Office of Management and Budget.



Table of Contents

Background and Objectives ................................................................................ 1

Section 1: U.S. Department of Agriculture, Office of the Chief Information Officer, Fiscal Year 2016 Federal Information Security Modernization Act ... 4

Findings and Recommendations ......................................................................... 4

Recommendation 1 .................................................................................... 9
Recommendation 2 .................................................................................... 9
Recommendation 3 ...................................................................................10

Scope and Methodology ..................................................................................... 11
Abbreviations ..................................................................................................... 13
Exhibit A: Office of Management and Budget/Department of Homeland Security Reporting Requirements and U.S. Department of Agriculture Office of Inspector General Position ............................................................................ 14
Exhibit B: Sampling Methodology and Projections ........................................ 40
Agency Response ................................................................................................ 47


Background and Objectives 


Background

Improving the overall management and security of information technology (IT) resources needs to be a top priority for the Department of Agriculture (USDA). Technology enhances the ability to share information instantaneously among computers and networks, but it also makes organizations’ networks and IT resources vulnerable to malicious activity and exploitation by internal and external sources. Insiders with malicious intent, recreational and institutional hackers, and attacks by foreign intelligence organizations are a few of the threats to the Department’s critical systems and data.

On December 17, 2002, the President signed the e-Government Act (Public Law 107-347), which includes Title III, Federal Information Security Management Act (FISMA 2002). FISMA 2002 permanently reauthorized the framework established by the Government Information Security Reform Act (GISRA) of 2000, which expired in November 2002. FISMA continued the annual review and reporting requirements introduced in GISRA, and also included new provisions that further strengthened the Federal Government’s data and information systems security, such as requiring the development of minimum control standards for agencies’ systems.

On December 18, 2014, the President signed the Federal Information Security Modernization Act of 2014 (FISMA), which “amended FISMA 2002 to (1) reestablish the oversight authority of the Director of Office of Management and Budget (OMB) with respect to agency security policies and practices, and (2) set forth authority for the Secretary of the Department of Homeland Security (DHS) to administer the implementation of such policies and practices for information systems.”1 According to FISMA, the Secretary must:

develop and oversee implementation of operational directives requiring agencies to implement OMB standards and guidelines for safeguarding Federal information and systems from a known or reasonably suspected information security threat, vulnerability, or risk. It authorizes the Director of OMB to revise or repeal operational directives that are not in accordance with the Director's policies.2

FISMA also “directs the Secretary to consult with, and consider guidance developed by, the National Institute of Standards and Technology (NIST) to ensure that operational directives do not conflict with NIST information security standards.”3

FISMA changes annual reporting requirements by directing that agencies:

submit an annual report regarding major incidents to OMB, DHS, Congress, and the Comptroller General of the Government Accountability Office (GAO). It requires such reports to include: (1) threats and threat actors, vulnerabilities, and impacts; (2) risk

1 Federal Information Security Modernization Act of 2014, Pub. L. No. 113-283, 128 Stat. 3073. 2 Ibid. 3 Ibid.


assessments of affected systems before, and the status of compliance of the systems at the time of, major incidents; (3) detection, response, and remediation actions; (4) the total number of incidents; and (5) a description of the number of individuals affected by, and the information exposed by, major incidents involving a breach of personally identifiable information.4

Further, it “requires OMB to ensure the development of guidance for evaluating the effectiveness of information security programs and practices.”5 NIST was tasked to work with agencies in developing standards as part of its statutory role in providing technical guidance to Federal agencies.

Instructions for Fiscal Year (FY) 2016 FISMA reporting are outlined in the FY 2016 Inspector General Federal Information Security Modernization Act of 2014 Reporting Metrics (IG FISMA Metrics) V1.1.3 (September 26, 2016). OMB, DHS, and the Council of the Inspectors General on Integrity and Efficiency (CIGIE) developed the FY 2016 IG FISMA Metrics in consultation with the Federal Chief Information Officer (CIO) Council. DHS uses the CyberScope website for consolidated reporting.

FISMA requires that the head of each agency shall be responsible for:

· Providing information security protections commensurate with the risk and magnitude of the harm resulting from unauthorized access, use, disclosure, disruption, modification, or destruction of information collected or maintained by or on behalf of the agency and information systems used or operated by an agency or by a contractor of an agency or other organization on behalf of an agency;

· Complying with the requirements of NIST’s related policies, procedures, and standards; · Ensuring that information security management processes are integrated with agency

strategic, operational, and budgetary planning processes; and · Ensuring that senior agency officials provide information security for the information and

information systems that support the operations and assets under their control, including assessing risk, determining the levels of information security, implementing policies to cost-effectively reduce risks, and periodically testing and evaluating security controls.

FISMA requires the Inspector General (IG) to conduct an annual independent evaluation to determine the effectiveness of the information security program and practices of its respective agency. These evaluations (a) test the effectiveness of information security policies, procedures, and practices of a subset of agency information systems, and (b) assess the effectiveness of an agency’s information security policies, procedures, and practices.

As with the FY 2016 CIO FISMA Reporting Metrics, the IG metrics are organized around the five information security functions outlined in the NIST Framework for Improving Critical Infrastructure Cybersecurity (Cybersecurity Framework): Identify, Protect, Detect, Respond, and Recover. The Cybersecurity Framework provides agencies with a common structure for identifying and managing cybersecurity risks across the enterprise and provides IGs with guidance for assessing the maturity of controls to address those risks. In addition to these key performance areas, the IG FISMA Metrics assess the effectiveness of an agency’s information security program. NIST defines security control effectiveness as the extent to which security controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the information system in its operational environment or enforcing/mediating established security policies.6

4 Ibid. 5 Ibid.

For the second year, the IG FISMA Metrics include sections that use maturity levels and maturity models. The purpose of the CIGIE maturity models is to (1) summarize the status of agencies’ information security programs and their maturity on a 5-level scale, (2) provide transparency to agency CIOs, top management officials, and other interested readers of IG FISMA reports regarding what has been accomplished and what still needs to be implemented to improve the information security program, and (3) help ensure consistency across the IGs in their annual FISMA reviews. Within the maturity model context, agencies should perform a risk assessment and identify the optimal maturity level that achieves cost-effective security based on their missions and risks. The IG FISMA Metrics state that Level 4, Managed and Measurable, represents an effective information security program.

The five levels an agency can receive in the maturity model are: (1) Ad hoc, (2) Defined, (3) Consistently Implemented, (4) Managed and Measurable, and (5) Optimized. Agencies are allotted points for each Cybersecurity Framework function area based on their achievement of various levels of maturity.

Objectives

The objective of this audit was to evaluate the status of USDA’s overall IT security program by assessing the five Cybersecurity Framework security functions:

· Identify, which includes questions pertaining to Risk Management and Contractor Systems;

· Protect, which includes questions pertaining to Configuration Management, Identity and Access Management, and Security and Privacy Training;

· Detect, which includes questions pertaining to Information Security Continuous Monitoring;

· Respond, which includes questions pertaining to Incident Response; and

· Recover, which includes questions pertaining to Contingency Planning.

6 NIST SP 800-53 Revision 4, Security and Privacy Controls for Federal Information Systems and Organizations (April 2013).


Section 1:  U.S. Department of Agriculture, Office of the Chief Information Officer, Fiscal Year 2016 Federal Information Security Modernization Act 


Findings and Recommendations

This report constitutes the Office of Inspector General’s (OIG) independent evaluation of USDA’s IT security program and practices, as required by FISMA, and is based on the questions provided by OMB, DHS, and CIGIE (OMB/DHS/CIGIE). These questions are designed to assess the status of the Department’s security posture during FY 2016. The OMB/DHS/CIGIE framework requires OIG to evaluate processes, policies, and procedures that were documented, implemented, and monitored during FY 2016.

The Office of the Chief Information Officer (OCIO) continues to take positive steps towards improving the Department’s security posture. OCIO released two critical policies this year: Secure Communication Systems and Contingency Planning and Disaster Recovery Planning. Once implemented, these policies should improve IT security within USDA. OCIO also began the implementation of the DHS Continuous Diagnostic and Mitigation (CDM) project.7 However, USDA’s FISMA score declined this year. As CIGIE states, “It is expected that agencies will see a drop in their overall FISMA scores this year with the move to integrate the maturity model and the concept of effectiveness into the FISMA scoring methodology.” Because of this new methodology, any historical comparison to past USDA FISMA scores would not be appropriate.

USDA senior management needs to make sure each agency and office understands that how well it implements IT security directly impacts USDA’s overall security posture and FISMA score. For USDA to attain a security posture that is secure and sustainable, all 35 of its agencies and offices must consistently implement Departmental policy based on a standard methodology. The degree to which USDA, as a whole, complies with FISMA and other security guidance is directly correlated to the security posture of each agency and office. When every agency and office is in compliance with USDA’s policies, USDA as a whole will be FISMA compliant and, more importantly, will have a sustainable security posture.

USDA is working to improve its IT security posture, but many longstanding weaknesses remain. We continue to find that OCIO has not implemented corrective actions that the Department has committed to in response to prior OIG recommendations. In FYs 2009 through 2015, OIG made 61 recommendations for improving the overall security of USDA’s systems. Of these, 39 have been closed (i.e., the agreed-upon corrective action has been implemented) and 4 are current and have not yet reached the estimated completion date; however, 18 are overdue for completion. Our testing identified that security weaknesses still exist in 3 of the 39 closed recommendations. Due to these identified outstanding recommendations and weaknesses, we continue to report a material weakness in USDA’s IT security that should be included in the Department’s Federal Managers Financial Integrity Act report. Based on these outstanding recommendations, and the findings in this report, OIG concludes that the Department lacks an effective information security program and practices.

7 The CDM program is a dynamic approach to fortifying the cybersecurity of Government networks and systems. Congress established the CDM program to provide adequate, risk-based, and cost-effective cybersecurity and more efficiently allocate cybersecurity resources. CDM offers commercial off-the-shelf tools, with robust terms for technical modernization as threats change. DHS, in partnership with the General Services Administration, established a Government-wide acquisition vehicle for CDM. The CDM blanket purchase agreement is available to Federal, State, local, and tribal government entities.

We also noted concerns about USDA’s CDM project that we detailed during our proactive assessment of the project.8 Our main issues related to the project’s schedule, awareness, manpower resources, operational resources, and communication. Additional concerns included the overall awareness within USDA of the upcoming changes to business and technology processes, and the need for regular updates to the agencies on the changing DHS CDM scope, schedule, and requirements. OCIO stated that funding has been a constant source of concern for the DHS CDM program. Inadequate funding for the USDA CDM implementation has impacted the purchase of additional tool and integration capabilities. Among the recommendations we made to the Department were to increase leadership visibility of the project and to increase operational CDM participation by establishing specific working groups to assist in completing the project.

Another way OCIO can achieve better results is with an internal governance team. Governance is a set of processes that ensures the effective and efficient use of information technology in enabling an organization to achieve its goals. Effective governance provides direction and oversight, guiding the program effort toward the needed business outcomes, and supplies data and feedback along the way. Without a governance structure, the program will remain in a reactive state, continuously struggling to adapt to changing conditions.

USDA is a large, complex organization that includes 35 separate agencies and offices, most with their own IT infrastructure. Each of USDA’s 35 agencies and offices, including OCIO, needs to be held accountable for implementing the Department’s policies and procedures. Currently, FISMA scores are directly impacted by sampling risk, as the agencies OIG selects for detailed testing come from a non-homogenous universe with differing IT environments. FISMA testing results should be similar based on the agency’s environment and risk tolerance once compliance by all agencies is attained. This should also improve the Department’s overall security posture.

The following paragraphs summarize the key matters discussed in Exhibit A of this report, which contains OIG’s responses to the OMB/DHS/CIGIE questions. These questions were defined on the DHS CyberScope FISMA reporting website.

To address the FISMA metrics, OIG reviewed IT systems in individual agencies, OIG independent contractor audits, annual agency self-assessments, and various OIG audits throughout the year.9 Since the scope of each review and audit differed, we could not use every review or audit to address each question.

8 Audit Report 50501-0010-12, Continuous Diagnostic and Mitigation Program Assessment, June 2016.

We found that the Department has established a risk management program. However, the program has not fully developed an organization-wide risk management strategy that addresses risk from an organizational perspective and has not established a governance structure.10 The Department has issued a guide11 that addresses the six step Risk Management Framework (RMF) process.12 Although improvements have been made, we continue to find authorization issues. For a system to become operational, NIST SP 800-37 requires USDA agencies to follow the RMF process to obtain an authority to operate (ATO) and effectively manage risk for its systems. In order for an ATO to be granted, systems must be categorized, controls identified and implemented, risks assessed, and the final concurrence review examined to proceed with accreditation. We found 59 of 316 operational systems with expired ATOs.13 The Department said these systems were necessary for USDA operations and therefore needed to operate without ATOs for business reasons. We also found 9 of 50 systems reviewed did not have adequately identified or documented interfaces in Cyber Security Assessment and Management (CSAM).14 Due to these conditions, OIG has determined that the Risk Management Program is not effective.

We found that the Department Plan of Action and Milestones (POA&M)15 program has established a process for centrally tracking, maintaining, and reviewing POA&M activities at least quarterly as required by OMB.16 However, our testing identified some areas for improvement. The Department has a total of 1,125 open POA&Ms. Of this total, 226 (20 percent) were delayed and 135 (12 percent) were not started.

9 Agency annual self-assessments are derived from OMB Circular No. A-123, Management's Responsibility for Enterprise Risk Management and Internal Control (July 15, 2016). The Circular requires agency management to annually provide assurances on internal control in Performance and Accountability Reports. During annual assessments, agencies take measures to develop, implement, assess, and report on internal controls, and take action on needed improvements.
10 NIST Special Publication (SP) 800-37 Revision 1, Guide for Applying the Risk Management Framework to Federal Information Systems (February 2010), defines tier 1 of a risk management framework as addressing risk from an organizational perspective with the development of a comprehensive governance structure and organization-wide risk management strategy.
11 USDA Six Step RMF Process Guide, Revision 2.44 (May 2015).
12 RMF is a NIST publication that promulgates a common framework which is intended to improve information security, strengthen risk management, and encourage reciprocity between Federal agencies. NIST SP 800-37 Revision 1, Guide for Applying the Risk Management Framework to Federal Information Systems (February 2010), was developed by the Joint Task Force Transformation Initiative Interagency Working Group. OMB Memorandum (M)-04-25, FY 2004 Reporting Instructions for the Federal Information Security Management Act (August 23, 2004).
13 The total number of operational systems with expired ATOs was generated out of CSAM as of September 30, 2016, while the total number of operational systems was generated out of CSAM as of October 5, 2016 to capture any operational systems that might have been added after our first run.
14 CSAM is a comprehensive system developed by the Department of Justice, which can help in achieving FISMA compliance. CSAM provides a vehicle for the Department, agencies, system owners, and security staffs to: (1) manage their system inventory, interfaces, and related system security threats and risks; (2) enter system security data into a single repository to ensure all system security factors are adequately addressed; (3) prepare annual system security documents, such as security plans, risk analyses, and internal security control assessments; and (4) generate custom and predefined system security status reports to effectively and efficiently monitor each agency’s security posture and FISMA compliance. This includes agency-owned systems or those operated by contractors on the agency’s behalf.
15 A POA&M is a tool that identifies tasks needing to be accomplished to assist agencies in identifying, assessing, prioritizing, and monitoring the progress of efforts to correct security weaknesses found in programs and systems. It details resources required to accomplish the elements of the plan, milestones for meeting the tasks, and scheduled completion dates for the milestones. The goal of a POA&M should be to reduce the risk of the weakness identified.

We found the Department had a policy for information security oversight for systems operated on USDA’s behalf by contractors or other entities, including systems and services residing in the public cloud.17 FISMA requires USDA to maintain a complete inventory of systems operated on the organization’s behalf by contractors or other entities, including cloud systems and services.18 We reviewed 33 operational contractor systems in CSAM and found 8 systems that were not properly documented in the inventory. Specifically, four systems were missing the interconnection agreements required to be included in the inventory, and four other systems were improperly identified as not FISMA-reportable, causing them to be excluded from the inventory. Overall, due to these conditions, OIG has determined that the Contractor Systems Program is not effective.

The Department established and maintains a security configuration management program. However, there are opportunities for improvement. Specifically, we found that the Department established adequate policy and made standard baseline configurations available for all applicable operating systems; however, agencies have not followed the policy or baselines when configuring servers and workstations. At one agency, we found over 30 percent of NIST’s United States Government Configuration Baseline (USGCB) settings for Windows® workstations had deviations without the required documentation.19 This has been an outstanding issue since the FY 2013 FISMA audit.20 In that year, OIG recommended the Department monitor agencies’ workstations for USGCB compliance and servers for NIST baseline compliance, and verify that deviations are documented, approved, and on file with the Department. This recommendation remains open, and OCIO has exceeded its estimated implementation date of September 30, 2014. The opportunities for improvement generally concern properly documenting deviations, not the implementation of configuration management itself. Therefore, OIG has determined that the Configuration Management Program is effective.

The Department has established an identity and access management program that is consistent with FISMA requirements, OMB policy, and applicable NIST guidelines for identifying users and network devices. For example, the Department has developed an account and identity management policy that is compliant with NIST standards and has adequately planned and implemented Personal Identity Verification (PIV) for logical and physical access in accordance

16 OMB M-04-25, FY 2004 Reporting Instructions for the Federal Information Security Management Act (August 23, 2004). 17 NIST SP 800-145, The NIST Definition of Cloud Computing (September 2011), states “Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” 18 FISMA 2002, Public Law 107-347, 116 Statute 2899 (December 17, 2002). 19 Departmental Regulation 3520-002, Configuration Management (August 12, 2014), states, “All deviations from USGCB settings shall be documented and submitted to the USDA Chief Information Security Officer (CISO) and be approved prior to implementation on agency and office production systems.” 20 Audit Report 50501-0004-12, USDA Office of the Chief Information Officer, Fiscal Year 2013 Federal Information Security Management Act, Nov. 2013.

with Government standards.21

However, our testing identified opportunities for improvement. We found that two agencies did not ensure that users were granted access based on need, and agencies did not terminate or deactivate employee accounts when access was no longer required. Specifically, we found 17 separated employees whose user accounts had not been disabled or deactivated after access was no longer needed. In addition, contractor audits noted issues in three systems, and four agencies self-reported identity and access management issues. Departmental policy requires that all user identifications and passwords, or other means of accessing files or using computer resources, shall be permanently disabled immediately upon notice of termination.22 Due to these conditions, OIG has determined that the Identity and Access Management Program is not effective.

The Department has not established a security training program that is consistent with FISMA requirements, OMB policy, and applicable NIST guidelines. We found that the Department had policy23 and procedures24 that met all NIST requirements for annual security awareness training (SAT). However, three of six tested agencies did not have adequate procedures in place. Additionally, in the six reviewed agencies, nearly 6 percent of the individuals reviewed had not completed the annual SAT. More than 53 percent of individuals from five of the six agencies reviewed had not completed the specialized security training required by NIST SP 800-53, and one agency had not identified or tracked users since 2014. Also, we found that 574 of an agency's 1,160 users had active agency computer access accounts but had not set up their training accounts. The 574 users had no training documented and were not identified by the agency or the Department as needing to complete SAT. Therefore, we noted the Department is not accurately tracking all users who are required to take the annual SAT.
Due to these conditions, OIG has determined that the Security and Privacy Training Program is not effective.

During our review, we found that USDA had established a continuous monitoring program. Specifically, we found that the Department has issued a policy, as well as procedures, for continuous monitoring. The Department has also issued the U.S. Department of Agriculture Strategic Plan for Information Security Continuous Monitoring (ISCM) as its enterprise-wide ISCM Program.25 However, we found issues with the Department's implementation of its ISCM Program. Specifically, we found that USDA has not performed an assessment of the skills, knowledge, and resources needed to effectively implement an ISCM program. Also, the Department has not yet defined how it will use automation to produce an accurate inventory of the authorized and unauthorized devices and software on its network and the security

21 The Executive Branch mandate entitled, Homeland Security Presidential Directive 12 (HSPD-12) (August 2004), requires Federal agencies to develop and deploy for all of their employees and contract personnel a PIV credential which is used as a standardized, interoperable card capable of being used as employee identification and allows for both physical and information technology system access.
22 DR 3505-003, Access Control for Information and Information Systems (February 10, 2015).
23 DR 3545-001, Information Security Awareness and Training Policy (October 22, 2013).
24 Departmental Standard Operating Procedures (SOP)-Oversight and Compliance Division (OCD), Information Security Awareness Training Standard Operating Procedures (March 2, 2015).
25 U.S. Department of Agriculture Strategic Plan for Information Security Continuous Monitoring, Version 1.2 (July 2015).


configuration of these devices and software. Overall, OIG rated the level of the Department’s ISCM at the Ad hoc level, the first level in continuous monitoring maturity.26

During our review, we found that USDA does have an incident response and reporting program, but it is missing key elements needed to be fully effective. We reviewed a sample of incidents and found that incidents are being reported, reviewed, and monitored by OCIO. However, we found that the Department’s policy and procedures for meeting the laws, regulations, and standards of a comprehensive incident response program are not up to date and remain in draft. The procedures, which have been in draft since 2011, serve as the foundation for agencies and offices to develop and implement cyber security incident management procedures and plans that comply with Federal and Departmental requirements. Without adequate and up-to-date Departmental policy and procedures, agencies may not have sufficient resources and guidance when implementing their incident response programs. Overall, OIG rated the level of the Department’s Incident Response and Reporting program at the Ad hoc level, the first level in incident response and reporting maturity.

The Department has established an enterprise-wide business continuity/disaster recovery program. However, our testing identified opportunities for improvement. We identified 189 of 316 agency systems27 within the Department that did not have a testing date recorded in CSAM during FY 2016.28 We also found that 69 of 79 sampled systems did not have evidence of testing in at least one of the last 3 years.

The following recommendations are new for FY 2016. Because 22 recommendations from FYs 2009 through 2015 have not been closed, we have not made any repeat recommendations. If the planned corrective actions initiated to close out the FY 2009 through 2015 recommendations are no longer achievable due to budget cuts or other reasons, then OCIO needs to update those corrective action plans and request a change in management decision, in accordance with Departmental guidance.

OCIO concurred with the findings in this report and generally agreed with the recommendations.

Recommendation 1

Implement a governance structure in accordance with the Risk Management Framework, utilizing tools that exist and have been implemented, as well as those under development.

Recommendation 2

Identify all users who need security awareness training, populate the training repository completely with those individuals, and ensure they receive the required training.

26 Ad hoc is the first level of continuous monitoring and means that the ISCM program is not formalized and ISCM activities are performed in a reactive manner. An Ad hoc program does not meet Level 2 requirements for a defined program consistent with Government standards.
27 System inventory report from CSAM was run on September 8, 2016 and then updated on October 19, 2016 to capture contingency plan test dates for the remainder of FY 2016.
28 USDA Contingency Plan Exercise Handbook, Revision 2.0 (October 2014).


Recommendation 3

USDA needs to develop a strategy to attain adequate resources to ensure that the CDM program is effectively implemented, maintained, and funded across the entire Department for the life of the program.


Scope and Methodology 


The scope of our review was Departmentwide and included agency IT audit work completed during FY 2016. Fieldwork for this audit was performed from February 2016 through October 2016. In addition, this report incorporates OIG audits completed during the year. Testing was conducted at offices in the Washington, D.C., and Kansas City, Missouri, areas. Additionally, we included the results of IT control testing and compliance with laws and regulations performed by contract auditors and agency self-assessments. In total, our FY 2016 FISMA audit work covered 23 agencies and offices:

· Agricultural Marketing Service
· Animal and Plant Health Inspection Service
· Agricultural Research Service
· Departmental Management
· Foreign Agricultural Service
· Food and Nutrition Service
· Forest Service
· Farm Service Agency
· Food Safety and Inspection Service
· Grain Inspection, Packers and Stockyards Administration
· National Appeals Division
· National Agricultural Library
· National Agricultural Statistics Service
· National Institute of Food and Agriculture
· Natural Resources Conservation Service
· Office of the Assistant Secretary for Civil Rights
· Office of Communications
· Office of the Chief Economist
· OCIO
· Office of Ethics
· Office of the Chief Financial Officer
· Rural Development
· Risk Management Agency

These agencies and offices operate 310 of the Department’s 316 general support and major application systems.29

29 OMB Circular A-130, Appendix III, Security of Federal Automated Information Resources (November 28, 2000), defines a major application as an application that requires special attention to security due to the risk and magnitude of harm resulting from the loss, misuse, or unauthorized access to or modification of the information in the application. A general support system is defined as an interconnected set of information resources under the same direct management control that share common functionality. It normally includes hardware, software, information, data, applications, communications, and people.


To accomplish our audit objectives, we performed the following procedures:

· Consolidated the results and issues from our prior IT security audit work, including interim audit reports 50501-0012-12 (1) and (2) and the work contractors performed on our behalf. Contractor audit work consisted primarily of audit procedures found in GAO’s Federal Information System Controls Audit Manual (FISCAM).

· Performed detailed testing specific to FISMA requirements at selected agencies, as detailed in this report.

· Gathered the necessary information to address the specific reporting requirements outlined in FY 2016 Inspector General Federal Information Security Modernization Act of 2014 Reporting Metrics (IG FISMA Metrics) V1.1.3 (September 26, 2016). DHS uses the website CyberScope to consolidate the reporting.

· Evaluated the Department’s progress in implementing recommendations to correct material weaknesses identified in prior OIG audit reports.

· Performed statistical sampling for testing where appropriate. Additional sample analysis information is presented in Exhibit B.

We compared test results against NIST controls, OMB/DHS/CIGIE guidance, e-Government Act requirements, and Departmental policies and procedures to determine compliance.

We conducted this audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.


Abbreviations

A&A Assessment and Authorization
ATO Authority to Operate
AV Anti-Virus
BCP Business Continuity Plan
BIA Business Impact Analysis
CDM Continuous Diagnostics and Mitigation
CIGIE Council of the Inspectors General on Integrity and Efficiency
CIO Chief Information Officer
CSAM Cyber Security Assessment Management System
DHS Department of Homeland Security
DLP Data Loss Prevention
DRP Disaster Recovery Plan
EVPN Enterprise Virtual Private Network
FAR Federal Acquisition Regulation
FISMA Federal Information Security Modernization Act
FY Fiscal Year
GAO Government Accountability Office
GISRA Government Information Security Reform Act
HSPD Homeland Security Presidential Directive
HWAM Hardware Asset Management
IG Inspector General
ISCM Information Security Continuous Monitoring
IT Information Technology
NIST National Institute of Standards and Technology
OCIO Office of the Chief Information Officer
OIG Office of Inspector General
OMB Office of Management and Budget
OS Operating System
PII Personally Identifiable Information
PIV Personal Identity Verification
POA&M Plan of Action and Milestones
RMF Risk Management Framework
SAT Security Awareness Training
SWAM Software Asset Management
TIC Trusted Internet Connections
TT&E Test, Training, and Exercise
US-CERT United States Computer Emergency Response Team
USDA United States Department of Agriculture
USGCB United States Government Configuration Baseline


Exhibit A:  Office of Management and Budget/Department of Homeland Security Reporting Requirements and U.S. Department of Agriculture Office of Inspector General Position 


OMB/DHS/CIGIE’s questions are set apart using boldface type in each section. We answered direct questions, in boldface, with either a Met or Not Met, or the appropriate levels as required. The universe of systems and agencies reviewed varied during each audit or review included in each question of this report. As part of FISMA, OIG reviewed: systems and agencies, audit work conducted for OIG by independent public accounting firm contractors, annual agency self-assessments, and various OIG audits conducted throughout the year.30 Since the scope of each review and audit differed, we could not use every review or audit to answer each question. The audit team reviewed all five Cybersecurity Framework security functions and incorporated statistical sampling into three review areas. Each of the three areas was represented by the relevant universe associated with it. The specific sample designs are summarized in Exhibit B.

For Sections 3 and 4 that use the maturity model, we have included the question corresponding to the highest level of maturity achieved.

Section 0: Overall

0.1 Please provide an overall narrative assessment of the agency’s information security program. Please note that OMB will include this information in the publicly available Annual FISMA Report to Congress to provide additional context for the Inspector General’s effectiveness rating of the agency’s information security program. OMB may modify this response to conform with the grammatical and narrative structure of the Annual Report.

The Office of the Chief Information Officer (OCIO) continues to take positive steps toward improving the Department’s security posture. OCIO released two critical policies this year: Secure Communication Systems and Contingency Planning and Disaster Recovery Planning. Once implemented, these policies should improve IT security within USDA. OCIO also began implementation of the DHS Continuous Diagnostics and Mitigation (CDM) project.

Although USDA is working to improve its IT security posture, many longstanding weaknesses remain. We continue to find that OCIO has not implemented corrective actions that the Department has committed to in response to prior OIG recommendations. In FYs 2009 through 2015, OIG made 61 recommendations for improving the overall security of USDA’s systems; 39 have been closed (i.e., the agreed-upon corrective action has been implemented), 4 are current

30 Agency annual self-assessments are required by OMB Circular No. A-123, Management's Responsibility for Enterprise Risk Management and Internal Control (July 15, 2016), which defines management’s responsibility for internal controls in Federal agencies. The Circular requires an agency’s management to annually provide assurances on internal control in its Performance and Accountability Report. During the annual assessment, agencies take measures to develop, implement, assess, and report on internal control, and to take action on needed improvements.


and have not yet reached the estimated completion date; however, 18 are overdue for completion. Our testing identified that security weaknesses still exist in 3 of the 39 closed recommendations. Due to these identified outstanding recommendations and weaknesses, we continue to report a material weakness in USDA’s IT security that should be included in the Department’s Federal Managers Financial Integrity Act report. Based on these outstanding recommendations, and the findings in this report, OIG concludes that the Department lacks an effective information security program and practices.

Section 1: Identify

Risk Management (Identify)

1.1 Has the organization established a risk management program that includes comprehensive agency policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? – Met

The Department has established a risk management program. However, the program lacks an implemented risk management strategy and governance structure.

1.1.1 Identifies and maintains an up-to-date system inventory, including organization- and contractor-operated systems, hosting environments, and systems residing in the public, hybrid, or private cloud. (2016 CIO FISMA Metrics, 1.1; NIST Cybersecurity Framework (CF) ID.AM.1, NIST 800-53: PM-5) – Not Met

Exception noted. The Department does not maintain an up-to-date inventory of systems. We found 32 systems improperly identified.31

1.1.2 Develops a risk management function that is demonstrated through the development, implementation, and maintenance of a comprehensive governance structure and organization-wide risk management strategy as described in NIST SP 800-37, Rev. 1. (NIST SP 800-39) – Not Met

Exception noted. The Department has developed a risk management guide32 and a draft risk management strategy that addresses risk from an organizational perspective, but the governance structure has not been fully implemented.33 As a result, the Department cannot ensure that managing information system-related security risks is consistent with the organization’s mission/business objectives and overall risk strategy.

31 We found 22 contractor systems incorrectly documented in CSAM, 6 cloud systems improperly identified, and 4 systems containing Personally Identifiable Information (PII) that were not notated in CSAM as having PII, totaling 32 systems improperly identified.
32 USDA Six Step RMF Process Guide, Revision 2.44 (May 2015).
33 NIST SP 800-37 Revision 1, Guide for Applying the Risk Management Framework to Federal Information Systems (February 2010), defines Tier 1 of a risk management framework as addressing risk from an organizational perspective with the development of a comprehensive governance structure and organization-wide risk management strategy.


1.1.3 Incorporates mission and business process-related risks into risk-based decisions at the organizational perspective, as described in NIST SP 800-37, Rev. 1. (NIST SP 800-39) – Not Met

Exception noted. As noted in 1.1.2, the Department does have a draft risk management strategy, but the governance structure has not been fully implemented. Therefore, the Department does not incorporate mission and business process-related risks into risk-based decisions at the organizational perspective.

1.1.4 Conducts information system level risk assessments that integrate risk decisions from the organizational and mission/business process perspectives and take into account threats, vulnerabilities, likelihood, impact, and risks from external parties and common control providers. (NIST SP 800-37, Rev. 1, NIST SP 800-39, NIST SP 800-53: RA-3) – Not Met

Exception noted. The Department conducts system-level risk assessments. However, as noted in 1.1.2, the Department has developed a draft risk management strategy, but the governance structure has not been fully implemented. Therefore, the Department does not conduct system-level risk assessments that integrate risk decisions from the organizational and mission/business process perspectives.

1.1.5 Provides timely communication of specific risks at the information system, mission/business, and organization-level to appropriate levels of the organization. – Met

No exception noted. The Department does provide timely communication of specific risks at the information system, mission/business, and organization-level to the appropriate level of the organization.

1.1.6 Performs comprehensive assessments to categorize information systems in accordance with Federal standards and applicable guidance. (FIPS 199, FIPS 200, FISMA, Cybersecurity Sprint, OMB M-16-04, President’s Management Council (PMC) cybersecurity assessments) – Met

No exception noted. We generated a report from CSAM, which identified the impact level for each of the Department’s systems. The report included the impact levels for confidentiality, integrity, and availability,34 which were categorized as high, moderate, or low. We compared the generated report to NIST’s guidance and found no exceptions.35
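The categorization rule being checked here is the Federal "high water mark": a system's overall security categorization is the highest impact level assigned to any of confidentiality, integrity, or availability. A small sketch of that rule (the example inputs are hypothetical, not USDA inventory entries):

```python
# FIPS 200 high-water-mark rule: a system's overall security
# categorization is the highest impact level assigned to any of
# confidentiality, integrity, or availability. Example inputs are
# hypothetical, not actual USDA system categorizations.
LEVELS = {"low": 0, "moderate": 1, "high": 2}

def overall_category(confidentiality, integrity, availability):
    """Return the highest of the three impact levels."""
    return max(confidentiality, integrity, availability,
               key=LEVELS.__getitem__)

print(overall_category("low", "moderate", "low"))        # moderate
print(overall_category("moderate", "moderate", "high"))  # high
```

An auditor's check like the one described reduces to applying this rule to each CSAM record and comparing the result to the recorded overall categorization.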

34 FISMA (44 U.S.C. Section 3542) defines integrity as guarding against improper information modification or destruction, and includes ensuring information on repudiation and authenticity. Confidentiality is defined as preserving authorized restrictions on access and disclosure, including means for protecting personal privacy and proprietary information. Availability is defined as ensuring timely and reliable access to and use of information.
35 NIST SP 800-60, Guide for Mapping Types of Information and Information Systems to Security Categories, Volume 1 (August 2008).


1.1.7 Selects an appropriately tailored set of baseline security controls based on mission/business requirements and policies and develops procedures to employ controls within the information system and its environment of operation. – Met

No exception noted. The correct categorization noted in 1.1.6 led to adequate baseline controls being implemented. The Department’s RMF Process Guide states that once the security categorization of the information system is documented in CSAM, the corresponding set of controls (high, moderate, or low) will automatically be selected for the information system within CSAM.

1.1.8 Implements the tailored set of baseline security controls as described in 1.1.7. – Met

No exception noted. NIST SP 800-53 recommends a set of minimum baseline security controls based on the system’s overall categorization. The lower the category, the fewer controls required. The correct categorization noted in 1.1.6 led to adequate controls being implemented.

1.1.9 Identifies and manages risks with system interconnections, including through authorizing system interconnections, documenting interface characteristics and security requirements, and maintaining interconnection security agreements. (NIST SP 800-53: CA-3) – Not Met

Exception noted. We found 9 of 50 systems’ interfaces had not been adequately identified or documented in CSAM. Although the Department had sufficient published policies, the agencies were not consistently implementing the policies.

In the FY 2012 FISMA report, OIG recommended the Department develop and implement an effective process for making sure interface connections are documented and that Interconnection Agreements accurately reflect all connections to the systems. The Department needs to review interfaces during the annual testing processes. USDA agreed and management decision has been reached, but OCIO has exceeded its estimated completion date of September 30, 2013.

1.1.10 Continuously assesses the security controls, including hybrid and shared controls, using appropriate assessment procedures to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system. – Met

No exception noted. The Department assesses the security controls using appropriate assessment procedures to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system.

1.1.11 Maintains ongoing information system authorizations based on a determination of the risk to organizational operations and assets, individuals, other organizations, and the Nation resulting from the operation of the information system and the decision that this risk is acceptable (OMB M-14-03, NIST Supplemental Guidance on Ongoing Authorization). – Not Met


Exception noted. The Department does not authorize information system operation based on a determination of the risk to organizational operations and assets. We found 59 of 316 operational systems with expired ATOs.

In the FY 2012 FISMA report, OIG recommended the Department verify that all systems have the proper ATO prior to implementation. USDA agreed and management decision has been reached, but OCIO has exceeded its estimated completion date of September 30, 2013.

1.1.12 Security authorization package contains system security plan, security assessment report, and POA&M that are prepared and maintained in accordance with government policies. (SP 800-18, SP 800-37) – Met

No exception noted. The security authorization package contains the system security plan,36 security assessment report, and POA&Ms that are prepared and maintained in accordance with government policies.

1.1.13 POA&Ms are maintained and reviewed to ensure they are effective for correcting security weaknesses. – Not Met

Exception noted. OMB M-04-25 specifies that effective remediation of IT security weaknesses is essential to achieve a mature, sound IT security program and to secure information and systems. The Department has a total of 1,125 open POA&Ms. Of this total, 226 (20 percent) were delayed and 135 (12 percent) were not started.

1.1.14 Centrally tracks, maintains, and independently reviews/validates POA&M activities at least quarterly. (NIST SP 800-53: CA-5; OMB M-04-25) – Met

No material exception noted. We found that the Department POA&M program has established a process for centrally tracking, maintaining, and reviewing POA&M activities at least quarterly. For example, we found, based on our sample,37 that 95 percent of closed POA&Ms are maintained and reviewed to ensure they are effective for correcting security weaknesses.38
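The sample described in footnote 37 follows the standard normal-approximation formula for estimating a proportion, n = z²p(1−p)/d²; with the footnote's stated assumptions (p = 0.18, d = 0.10, z = 1.96 for 95 percent confidence) it reproduces the 57 POA&Ms sampled. An illustrative sketch of that calculation:

```python
# Sample size for estimating a proportion via the normal approximation:
# n = z^2 * p * (1 - p) / d^2, rounded up. The inputs p = 0.18 (assumed
# error rate), d = 0.10 (absolute precision), and z = 1.96 (95 percent
# confidence) are the assumptions stated in footnote 37.
import math

def sample_size(p, d, z=1.96):
    """Minimum sample size for a proportion estimate with precision +/- d."""
    return math.ceil(z * z * p * (1 - p) / (d * d))

print(sample_size(p=0.18, d=0.10))  # 57
```

This is only the textbook formula the footnote's numbers are consistent with, not the auditors' actual sampling tool; a finite-population correction would shrink the required sample further for small universes.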

1.1.15 Prescribes the active involvement of information system owners and common control providers, chief information officers, senior information security officers, authorizing officials, and other roles as applicable in the ongoing management of information-system-related security risks. – Met

36 NIST SP 800-18, Guide for Developing Security Plans for Federal Information Systems (February 2006), requires the SSP as part of the Assessment and Authorization documentation. It provides an overview of the security requirements of the system and describes the controls in place (or planned) for meeting those requirements. The SSP also delineates responsibilities and expected behavior of all individuals who access the system.
37 We based our sample size on an 18 percent error rate and desired absolute precision of +/-10 percent, at the 95 percent confidence level. With these assumptions, we calculated a sample size of 57 POA&Ms for review and selected them by choosing a simple random sample. Additional sample design information is presented in Exhibit B.
38 We are 95 percent confident that between 599 (89 percent) and 670 (99.6 percent) of closed POA&Ms in FY 2016 had remediation actions that sufficiently addressed the identified weaknesses in accordance with Government policies. Additional sample design information is presented in Exhibit B.

No exception noted. The Department’s Risk Management program prescribes the active involvement of information system owners and common control providers, chief information officers, senior information security officers, authorizing officials, and other roles as applicable in the ongoing management of information-system-related security risks.

1.1.16 Implemented an insider threat detection and prevention program, including the development of comprehensive policies, procedures, guidance, and governance structures, in accordance with Executive Order 13587 and the National Insider Threat Policy. (PMC; NIST SP 800-53: PM-12) – Met

No exception noted. The Department implemented an insider threat detection and prevention program, including the development of comprehensive policies, procedures, guidance, and governance structures, in accordance with government policies.

1.1.17 Provide any additional information on the effectiveness (positive or negative) of the organization’s Risk Management program that was not noted in the questions above. Based on all testing performed, is the Risk Management program effective? – Not Effective

Based on the questions answered in this section, OIG has determined that the Risk Management Program is not effective.

Contractor Systems (Identify)

1.2 Has the organization established a program to oversee systems operated on its behalf by contractors or other entities, including other government agencies, managed hosting environments, and systems and services residing in a cloud external to the organization that is inclusive of policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? – Met

We found the Department has established a program to oversee contractor systems; however, the program is not consistently implemented.

1.2.1 Establishes and implements a process to ensure that contracts/statements of work/solicitations for systems and services, include appropriate information security and privacy requirements and material disclosures, FAR clauses, and clauses on protection, detection, and reporting of information. (FAR Case 2007-004, Common Security Configurations, FAR Sections 24.104, 39.101, 39.105, 39.106, 52.239-1; PMC, 2016 CIO Metrics 1.8, NIST 800-53, SA-4 FedRAMP standard contract clauses; Cloud Computing Contract Best Practices) – Met

No exception noted. We found that USDA has implemented a process to ensure that contracts/statements of work/solicitations for systems and services include appropriate information security and privacy requirements and material disclosures, FAR clauses, and clauses on protection, detection, and reporting of information.

1.2.2 Specifies within appropriate agreements how information security performance is measured, reported, and monitored on contractor- or other entity-operated systems. (CIO and CAO Council Best Practices Guide for Acquiring IT as a Service, NIST SP 800-35) – Met

No exception noted. We found that USDA has implemented a process to specify how information security performance was measured, reported, and monitored within its contractor system agreements.

1.2.3 Obtains sufficient assurance that the security controls of systems operated on the organization’s behalf by contractors or other entities and services provided on the organization’s behalf meet FISMA requirements, OMB policy, and applicable NIST guidelines. (NIST SP 800-53: CA-2, SA-9) – Not Met

Exception noted. We found USDA’s inventory of contractor and cloud systems was not completely accurate; the Department therefore does not obtain sufficient assurance that the security controls of systems operated or services provided on its behalf by contractors or other entities meet FISMA requirements, OMB policy, and applicable NIST guidelines. Specifically, we found 4 systems with insufficiently documented interconnections, 4 systems identified as not FISMA-reportable, and 11 systems with expired or invalid ATOs. We also found seven cloud systems that were not correctly identified as contractor systems. Although the Department had sufficient published policies,39 the agencies were not following the guidance.

In the FY 2009 FISMA report, we recommended the Department develop and implement an effective process to ensure system interfaces are accounted for in CSAM. Although this recommendation has reached final action and is closed, we take exception to the recommendation being closed. In the FY 2010 FISMA report, we recommended the Department ensure contractor and non-contractor systems inventory and interfaces are accurate and updates are completed at least annually. Although management decision has been reached, the recommendation was still open and had exceeded USDA’s estimated final action completion date of September 30, 2011.

In the FY 2012 FISMA report, we recommended the Department develop and implement an effective process for making sure interface connections are documented, and that Interconnection Agreements accurately reflect all connections to the systems. The Department needs to review interfaces during the annual testing processes. Although management decision has been reached, the recommendation was still open and had exceeded USDA’s estimated final action completion date of September 30, 2013.

39 USDA DR 3540-003, Security Assessment and Authorization (August 12, 2014).


1.2.4 Provide any additional information on the effectiveness (positive or negative) of the organization’s Contractor Systems Program that was not noted in the questions above. Based on all testing performed, is the Contractor Systems Program effective? – Not Effective

Based on the questions answered in this section, OIG has determined that the Contractor Systems Program is not effective.


Section 2: Protect Configuration Management (Protect)

2.1 Has the organization established a configuration management program that is inclusive of comprehensive agency policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? – Met

USDA has developed a configuration management program. However, it is not consistently implemented across the USDA agencies.

2.1.1 Develops and maintains an up-to-date inventory of the hardware assets (i.e., endpoints, mobile assets, network devices, input/output assets, and SMART/NEST devices) connected to the organization’s network with the detailed information necessary for tracking and reporting. (NIST CF ID.AM-1; 2016 CIO FISMA Metrics 1.5, 3.17; NIST 800-53: CM-8) – Met

No exception noted. NIST SP 800-53 requires the organization to develop and document an inventory of information system components that accurately reflects the current information system, includes all components within the authorization boundary of the information system, and is at the level of granularity deemed necessary for tracking and reporting. The agencies reviewed were able to provide an up-to-date accurate inventory of hardware assets. The Department’s Hardware Asset Management (HWAM) solution of the CDM project will be fully operational in early FY 2017. This solution will provide hardware asset management Departmentwide when implemented.

2.1.2 Develops and maintains an up-to-date inventory of software platforms and applications used within the organization and with the detailed information necessary for tracking and reporting. (NIST 800-53: CM-8, NIST CF ID.AM-2) – Not Met

Exception noted. NIST requires the organization to develop and document an inventory of information system components that accurately reflects the current information system, includes all components within the authorization boundary of the information system, and is at the level of granularity deemed necessary for tracking and reporting. One of the agencies we reviewed was not able to provide an accurate inventory of software applications. The Department’s Software Asset Management (SWAM) solution of the CDM project should be fully operational in late FY 2017. This solution will provide software asset management Departmentwide when implemented.

2.1.3 Implements baseline configurations for IT systems that are developed and maintained in accordance with documented procedures. (NIST SP 800-53: CM-2; NIST CF PR.IP-1) – Met

No exception noted. NIST requires the organization to develop, document, and maintain under configuration control, a current baseline configuration of the information system. The Department has issued policy stating that NIST will be the official baseline configuration guide repository for all operating systems in use at USDA.40


2.1.4 Implements and maintains standard security settings (also referred to as security configuration checklists or hardening guides) for IT systems in accordance with documented procedures. (NIST SP 800-53: CM-6; CIO 2016 FISMA Metrics, 2.3) – Not Met

Exception noted. NIST requires the organization to establish and document configuration settings for information technology products employed within the information system using defined security configuration checklists that reflect the most restrictive mode consistent with operational requirements. We found over 30 percent (119,331 of 392,060) of the USGCB settings on the Windows® workstations at one agency were not compliant with the baseline guides, nor were the deviations sufficiently documented.

In the FY 2013 FISMA report, we recommended that the Department monitor agencies' workstations for USGCB compliance, servers for NIST baseline compliance, and verify that deviations are documented, approved, and on file with the Department. OCIO has exceeded its estimated completion date of September 30, 2014, for final action.
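The baseline-compliance test applied to these workstations can be sketched as follows. This is a hypothetical illustration: the setting names, values, and workstation data below are invented and are not USDA's actual USGCB baseline.

```python
# Hypothetical sketch of a USGCB-style compliance check: compare each
# workstation's reported settings to the baseline and flag deviations
# that lack an approved, documented justification. All names and values
# are invented for illustration.
BASELINE = {
    "MinimumPasswordLength": 12,
    "LockoutThreshold": 5,
    "AuditLogonEvents": "Success,Failure",
}

def check_workstation(actual, baseline=BASELINE, documented=frozenset()):
    """Return (is_compliant, undocumented_deviations) for one workstation."""
    deviations = [k for k, v in baseline.items() if actual.get(k) != v]
    undocumented = [k for k in deviations if k not in documented]
    return (not deviations, undocumented)

workstations = {
    "ws01": {"MinimumPasswordLength": 12, "LockoutThreshold": 5,
             "AuditLogonEvents": "Success,Failure"},
    "ws02": {"MinimumPasswordLength": 8, "LockoutThreshold": 5,
             "AuditLogonEvents": "Success,Failure"},
}
results = {name: check_workstation(s) for name, s in workstations.items()}
noncompliant = [name for name, (ok, _) in results.items() if not ok]
```

A deviation is acceptable only when it appears in the `documented` set, mirroring the requirement that deviations be documented, approved, and on file.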

2.1.5 Assesses configuration change control processes, including processes to manage configuration deviations across the enterprise that are implemented and maintained. (NIST SP 800-53: CM-3, NIST CF PR.IP-3) – Met

No material exception noted. NIST requires the organization to document approved configuration-controlled changes to the system. OIG reviewed the detailed change documentation at two agencies and found no issues with the change control process.

40 DR 3520-002, Configuration Management (August 12, 2014).


2.1.6 Identifies and documents deviations from configuration settings. Acceptable deviations are approved with business justification and risk acceptance. Where appropriate, automated means that enforce and redeploy configuration settings to systems at regularly scheduled intervals are deployed, while evidence of deviations is also maintained. (NIST SP 800-53: CM-6, Center for Internet Security Controls (CIS) 3.7) – Not Met

Exception noted. NIST requires Federal agencies to establish and document mandatory configuration settings for information technology products employed within the information system and to implement the recommended configuration settings.41

OIG found that both of the agencies reviewed did not remediate configuration vulnerabilities. Specifically, we found 861 configuration-related vulnerabilities on 7 websites maintained by the agencies.42 Consequently, the websites are at risk of compromise. In addition, three other agencies self-reported a deficiency with deviations from configuration settings.

2.1.7 Implemented SCAP certified software assessing (scanning) capabilities against all systems on the network to assess both code-based and configuration-based vulnerabilities in accordance with risk management decisions. (NIST SP 800-53: RA-5, SI- 2; CIO 2016 FISMA Metrics 2.2, CIS 4.1) – Not Met

Exception noted. The Department requires all agencies to establish and implement procedures for accomplishing vulnerability scanning of all networks, systems, servers, and desktops for which they have responsibility. This includes performing monthly scans and remediating vulnerabilities found as a result of the scans.43 We found one agency’s scans reported 25 percent fewer vulnerabilities than the Department’s scanners. Additionally, two agencies self-reported that they had an issue with the process for timely remediation of scan results.
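A discrepancy like the one above can be quantified by reconciling the two scanners' result sets, keyed by host and vulnerability identifier. The hosts and CVE identifiers below are invented for illustration.

```python
# Hypothetical sketch of reconciling an agency's scan results against the
# Department's. Each finding is keyed by (host, vulnerability ID); all
# identifiers are invented.
agency_scan = {
    ("hostA", "CVE-2016-0001"),
    ("hostA", "CVE-2016-0002"),
    ("hostB", "CVE-2016-0003"),
}
dept_scan = agency_scan | {("hostB", "CVE-2016-0004")}

# Findings the Department's scanner saw but the agency's did not.
missed_by_agency = dept_scan - agency_scan
undercount_pct = 100 * len(missed_by_agency) / len(dept_scan)
```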

2.1.8 Remediates configuration-related vulnerabilities, including scan findings, in a timely manner as specified in organization policy or standards. (NIST 800-53: CM-4, CM-6, RA-5, SI-2) – Not Met

Exception noted. NIST SP 800-70 requires Federal agencies to establish and document mandatory configuration settings for information technology products employed within the information system and to implement the recommended configuration settings. The Department further requires all agencies to document any vulnerability that cannot be remediated within 30 days. While the agencies reviewed did remediate some configuration-related vulnerabilities, OIG found that both agencies reviewed did not mitigate all vulnerabilities, nor did they document open vulnerabilities that were not mitigated within 30 days. Specifically, at one reviewed agency we found that 391 of 1,108 (over 35 percent) unremediated vulnerabilities over 30 days old were not properly documented. In addition, two agencies self-reported that they had an issue with remediating vulnerabilities in a timely manner.

41 NIST SP 800-70, Revision 3, National Checklist Program for IT Products: Guidelines for Checklist Users and Developers (December 2015) states: all deviations from the settings in the checklist should be documented for future reference. The documentation should include the reason behind each deviation, including the impact of retaining the setting and the impact of deviating from the setting. This documentation helps in managing changes to the checklist over the life cycle of the product being secured.
42 We utilized a commercially available software package designed to thoroughly analyze web applications and web services (websites) for security vulnerabilities.
43 A patch is a small piece of software that is used to correct a problem with a software program or an operating system. Most major software companies will periodically release patches, usually downloadable from the internet, that correct very specific problems or security flaws in their software programs.

2.1.9 Develops and implements a patch management process in accordance with organization policy or standards, including timely and secure installation of software patches. (NIST SP 800-53: CM-3, SI-2, OMB M-16-04, DHS Binding Operational Directive 15-01) – Not Met

Exception noted. NIST SP 800-53 requires the organization to identify and correct system flaws and incorporate flaw remediation (known as vendor patches) into the organizational configuration management process. We found that neither of the agencies reviewed had implemented a process for the timely and secure installation of software patches. Specifically, at one agency we found that over 13 percent (982 of 7,143) of the vulnerabilities were not corrected within 90 days even though a vendor patch was available. As a result, systems that could have been secured by applying the available patch remain at risk of compromise. In addition, two agencies self-reported that they had issues with the patch management process.
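The 90-day patch-timeliness test described above reduces to flagging any open finding that has a vendor patch and has aged past the threshold. The dates and finding IDs below are invented for illustration.

```python
from datetime import date

# Hypothetical sketch of the 90-day patch-timeliness test: a finding is
# overdue if a vendor patch exists and the finding has been open more than
# 90 days as of the review date. Dates and IDs are invented.
def overdue_patches(findings, as_of, max_age_days=90):
    return [f["id"] for f in findings
            if f["patch_available"]
            and (as_of - f["first_seen"]).days > max_age_days]

findings = [
    {"id": "V-1", "first_seen": date(2016, 1, 4), "patch_available": True},
    {"id": "V-2", "first_seen": date(2016, 6, 1), "patch_available": True},
    {"id": "V-3", "first_seen": date(2016, 1, 4), "patch_available": False},
]
late = overdue_patches(findings, as_of=date(2016, 6, 30))
```

Findings without an available patch fall under the separate 30-day documentation requirement rather than this test.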

2.1.10 Provide any additional information on the effectiveness (positive or negative) of the organization’s Configuration Management Program that was not noted in the questions above. Based on all testing performed, is the Configuration Management Program effective? – Effective

Based on the above questions, OIG has determined that the Configuration Management Program is effective. Although there are opportunities for improvement, the issues are generally with properly documenting deviations, not with the implementation of configuration management.

Additional Issue: OIG reviewed computers in use at USDA specifically looking for operating systems (OS) in use past their end-of-support. We found 15 servers at USDA that were using an OS past its end-of-support.44 Devices using an expired OS are more vulnerable to malware, and agency data are at greater risk of unauthorized access.
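An end-of-support check of this kind compares each server's OS against the vendor's published end-of-support date. The hostnames below are invented; the two end-of-support dates shown are the vendor's published ones for those products.

```python
from datetime import date

# Hypothetical sketch of the end-of-support check: flag servers whose OS
# stopped receiving vendor support before the review date. Hostnames are
# invented for illustration.
END_OF_SUPPORT = {
    "Windows Server 2003": date(2015, 7, 14),
    "Windows Server 2012 R2": date(2023, 10, 10),
}

def past_eos(servers, as_of):
    return [host for host, os_name in servers.items()
            if END_OF_SUPPORT[os_name] < as_of]

expired = past_eos(
    {"srv1": "Windows Server 2003", "srv2": "Windows Server 2012 R2"},
    as_of=date(2016, 9, 30),
)
```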

Identity and Access Management (Protect)

2.2 Has the organization established an identity and access management program, including policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? – Met

We found that the Department’s and agencies’ current policy was compliant with NIST SP 800-53.

2.2.1 Ensures that individuals requiring access to organizational information and information systems sign appropriate access agreements, participate in required training prior to being granted access, and recertify access agreements on a predetermined interval. (NIST 800-53: PL-4, PS-6) – Not Met

44 End-of-support refers to the date when a vendor no longer provides automatic fixes, updates, or online technical support.

Exception noted. We found one of two agencies reviewed was not ensuring individuals requiring access to organizational information and information systems were signing appropriate access agreements, participating in required training prior to being granted access, and recertifying access agreements on a predetermined interval. In addition, a contractor review found that for one of five systems reviewed, users were not required to sign appropriate access agreements and participate in required training.

2.2.2 Ensures that all users are only granted access based on least privilege and separation-of-duties principles. – Not Met

Exception noted. We found that both agencies reviewed did not ensure that users were granted access based on need and “separation of duties” principles.45 Additionally, four agencies self-reported deficiencies in this area. A contractor review found that for three of the eight systems reviewed, users were not granted access based on need. As a result, accounts have excessive privileges, which may result in the unauthorized access, misuse, disclosure, disruption, modification, or destruction of information.

2.2.3 Distinguishes hardware assets that have user accounts (e.g., desktops, laptops, servers) from those without user accounts (e.g. networking devices, such as load balancers and intrusion detection/prevention systems, and other input/output devices such as faxes and IP phones). – Met

No exception noted. We found that both agencies reviewed identified hardware assets that have user accounts and non-user accounts.

2.2.4 Implements PIV for physical access in accordance with government policies. (HSPD 12, FIPS 201, OMB M-05-24, OMB M-07-06, OMB M-08-01, OMB M-11-11) – Met

No exception noted. We found that the two agencies reviewed adequately implemented PIV cards for physical access in accordance with government policies.

45 Separation of duties refers to dividing roles and responsibilities so that a single individual cannot subvert a critical process. For example, in financial systems, no single individual should normally be given authority to issue checks. Rather, one person initiates a request for a payment and another authorizes that same payment. In effect, checks and balances need to be designed into both the process as well as the specific, individual duties of personnel who will implement the process. Ensuring that such duties are well defined is the responsibility of management.


2.2.5 Implements PIV or a NIST Level of Assurance (LOA) 4 credential for logical access by all privileged users (system, network, database administrators, and others responsible for system/application control, monitoring, or administration functions). (Cybersecurity Sprint, OMB M-16-04, PMC, 2016 CIO FISMA Metrics 2.5.1) – Met

No exception noted. We found that all agencies reviewed adequately implemented PIV for logical access for all privileged users in accordance with government policies.46

2.2.6 Enforces PIV or a NIST LOA 4 credential for logical access for at least 85% of non-privileged users. (Cybersecurity Sprint, OMB M-16-04, PMC, 2016 CIO FISMA Metrics 2.4.1) – Met

No material exception noted. We found that all agencies reviewed adequately enforced PIV for logical access for at least 85 percent of non-privileged users in accordance with government policies.

2.2.7 Tracks and controls the use of administrative privileges and ensures that these privileges are periodically reviewed and adjusted in accordance with organizationally defined timeframes. (2016 CIO FISMA Metrics 2.9, 2.10; OMB M-16-04, CIS 5.2) – Not Met

Exception noted. We found that both reviewed agencies were not ensuring privileged accounts were periodically reviewed. Additionally, three agencies self-reported deficiencies in this area. Also, a contractor review found two of eight agencies’ systems did not track and control the use of administrative privileges and ensure that these privileges were periodically reviewed. As a result, accounts may have excessive privileges, which may result in the unauthorized access, misuse, disclosure, disruption, modification, or destruction of information.

2.2.8 Ensures that accounts are terminated or deactivated once access is no longer required or after a period of inactivity, according to organizational policy. – Not Met

Exception noted. We found that both reviewed agencies were not ensuring that accounts were terminated or deactivated once access was no longer required. Specifically, we found 17 separated employees whose accounts had not been disabled or deactivated after access was no longer needed. Additionally, two agencies self-reported deficiencies in this area. The agencies were not properly terminating users when access was no longer required, which could result in the unauthorized access, misuse, disclosure, disruption, modification, or destruction of information.

46 The Executive Branch mandate entitled HSPD-12 (August 2004) requires Federal agencies to develop and deploy for all of their employees and contract personnel a PIV credential which is used as a standardized, interoperable card capable of being used as employee identification and allows for both physical and information technology system access.
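The separated-employee check in 2.2.8 reduces to a set intersection between the HR separation list and the accounts still enabled in the directory. The usernames below are invented for illustration.

```python
# Hypothetical sketch of the separated-employee account check: any username
# appearing both on the HR separation list and among still-enabled accounts
# should have been disabled. Usernames are invented.
separated_employees = {"jdoe", "asmith", "bjones"}
enabled_accounts = {"jdoe", "bjones", "mlee"}

not_deactivated = sorted(separated_employees & enabled_accounts)
```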


2.2.9 Identifies, limits, and controls the use of shared accounts. (NIST SP 800-53: AC-2) – Met

No exception noted. OIG found that none of the agencies reviewed used shared accounts.

2.2.10 All users are uniquely identified and authenticated for remote access using Strong Authentication (multi-factor), including PIV. (NIST SP 800-46, Section 4.2, Section 5.1, NIST SP 800-63) – Met

No exception noted. We found that both agencies reviewed have multi-factor authentication properly implemented.47 Also, we found that multi-factor authentication for remote access is required by Departmental policy,48 and the Department’s two-factor authentication solution Enterprise Virtual Private Network (EVPN) meets NIST requirements for remote electronic authentication.49

2.2.11 Protects against and detects unauthorized remote access connections or subversion of authorized remote access connections, including through remote scanning of host devices. (CIS 12.7, 12.8, FY 2016 CIO FISMA metrics 2.17.3, 2.17.4, 3.11, 3.11.1) – Met

No exception noted. We found that both agencies reviewed were capturing and monitoring (or reviewing) logs for remote access and therefore had programs protecting against unauthorized connections or subversion of authorized connections.

2.2.12 Remote access sessions are timed-out after 30 minutes of inactivity, requiring user re-authentication, consistent with OMB M-07-16. – Met

No exception noted. We reviewed two agencies’ remote access session time-out settings and found they were compliant with OMB guidance.50

2.2.13 Enforces a limit of consecutive invalid remote access logon attempts and automatically locks the account or delays the next logon prompt. (NIST 800-53: AC-7) – Met

No exception noted. We reviewed two agencies’ remote access invalid logon attempts settings and found they were compliant with NIST SP 800-53.

47 Multi-factor authentication is using two or more different factors to achieve authentication. Factors include: (i) something you know (e.g., password/PIN); (ii) something you have (e.g., cryptographic identification device, token); or (iii) something you are (e.g., biometric).
48 DR 3640-001, Identity, Credential, and Access Management (ICAM) (December 9, 2011).
49 NIST SP 800-53, Security and Privacy Controls for Federal Information Systems and Organizations, Revision 4 (April 2013).
50 OMB M-07-16, Safeguarding Against and Responding to the Breach of Personally Identifiable Information (May 22, 2007).
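The two settings checks in 2.2.12 and 2.2.13 can be sketched together: a remote session must time out within 30 minutes of inactivity, and consecutive invalid logons must be capped with an automatic lockout. The lockout threshold of 5 below is an assumed example value, not a USDA policy figure.

```python
# Hypothetical sketch of the session-timeout (OMB M-07-16) and invalid-logon
# lockout (NIST SP 800-53 AC-7) checks. The lockout threshold is an assumed
# example value, not taken from USDA policy.
def session_settings_compliant(idle_timeout_minutes, failed_logon_limit,
                               max_idle=30, max_failures=5):
    return (idle_timeout_minutes <= max_idle
            and 1 <= failed_logon_limit <= max_failures)

ok = session_settings_compliant(idle_timeout_minutes=30, failed_logon_limit=3)
bad = session_settings_compliant(idle_timeout_minutes=60, failed_logon_limit=3)
```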


2.2.14 Implements a risk-based approach to ensure that all agency public websites and services are accessible through a secure connection through the use and enforcement of https and strict transport security. (OMB M-15-13) – Not Met

Exception noted. We found that both agencies reviewed did not ensure all agency public websites and services were accessible through a compliant secure connection. However, one agency remediated this issue during the course of this audit.
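An OMB M-15-13 spot check of the kind described here verifies that a public site answers over HTTPS and sends a Strict-Transport-Security (HSTS) header. The sketch below works from canned example responses rather than live fetches.

```python
# Hypothetical sketch of an OMB M-15-13 spot check: given the scheme a
# public site answers on and the response headers it returns, verify HTTPS
# plus an HSTS header with a max-age directive. Responses are canned
# examples, not live network requests.
def m15_13_compliant(scheme, headers):
    hsts = headers.get("Strict-Transport-Security", "")
    return scheme == "https" and "max-age=" in hsts

compliant = m15_13_compliant(
    "https", {"Strict-Transport-Security": "max-age=31536000; includeSubDomains"})
noncompliant = m15_13_compliant("http", {})
```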

2.2.15 Provide any additional information on the effectiveness (positive or negative) of the organization’s Identity and Access Management Program that was not noted in the questions above. Based on all testing performed is the Identity and Access Management Program effective? – Not Effective

Based on the questions answered in this section, OIG has determined that the Identity and Access Management Program is not effective.

Security and Privacy Training (Protect)

2.3 Has the organization established a security and privacy awareness and training program, including comprehensive agency policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? – Not Met

Exception noted. We determined the Department’s and six of the reviewed agencies’ security awareness policies51 met the requirements outlined in NIST SP 800-53. In addition, the Department’s SAT procedures met NIST SP 800-53 requirements.52 However, three of the six agencies we reviewed did not have adequate procedures in place to ensure employees and contractors received adequate SAT.

2.3.1 Develops training material for security and privacy awareness training containing appropriate content for the organization, including anti-phishing, malware defense, social engineering, and insider threat topics. (NIST SP 800-50, 800-53: AR-5, OMB M-15-01, 2016 CIO Metrics, PMC, National Insider Threat Policy (NITP)) – Met

No exception noted. We found that the material for the SAT contained the appropriate content to meet NIST requirements.53

2.3.2 Evaluates the skills of individuals with significant security and privacy responsibilities and provides additional security and privacy training content or implements human capital strategies to close identified gaps. (NIST SP 800-50) – Not Met

Exception noted. The Department’s and all six agencies’ policies for specialized security training were effective and fully developed in accordance with NIST SP 800-53 for FY 2016.54 However, we found three of six agencies did not have documented procedures in place to meet NIST SP 800-50 and NIST SP 800-53 requirements.

51 DR 3545-001, Information Security Awareness and Training Policy (October 22, 2013).
52 Departmental SOP-OCD-018, Information Security Awareness Training (March 2, 2015).
53 NIST SP 800-50, Building an Information Technology Security Awareness and Training Program (October 2003).

2.3.3 Identifies and tracks status of security and privacy awareness training for all information system users (including employees, contractors, and other organization users) requiring security awareness training with appropriate internal processes to detect and correct deficiencies. (NIST 800-53: AT-2) – Not Met

Exception noted. NIST SP 800-53 requires agencies to document and monitor individual information system security training activities and to retain individual training records. We found that the Department does not have an effective process in place to ensure that all users have completed the annual SAT. The Department stated that one agency was at a 93.1 percent completion rate for FY 2016 SAT. However, we found that 574 out of 1,160 users for that agency had active agency computer access accounts but had not set up their AgLearn accounts. The 574 users had no training documented in AgLearn and were not identified by the agency or the Department as needing to complete SAT. All users are required to take SAT. Therefore, we found that the correct completion percentage for that agency was 66.9 percent. Also, during our review of six agencies, we found that 1,264 of 21,377 users (nearly 6 percent) with login privileges had not completed the annual SAT.
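The arithmetic behind these figures is straightforward. The first calculation uses the six-agency tally from the paragraph above; the completion-rate function illustrates why unregistered users must enter the denominator, using hypothetical values rather than the report's.

```python
# Six-agency tally from the finding above: 1,264 of 21,377 users with
# login privileges had not completed the annual SAT.
pct_not_completed = round(100 * 1264 / 21377, 1)  # "nearly 6 percent"

# Why unregistered users must enter the denominator: users with active
# computer accounts but no AgLearn account count as untrained. The values
# passed below are hypothetical, not the report's.
def completion_rate(completed, registered, unregistered):
    return round(100 * completed / (registered + unregistered), 1)

example = completion_rate(completed=900, registered=1000, unregistered=200)
```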

In the FY 2010 FISMA report, OIG recommended that the Department ensure its training repository is completely populated and all required personnel receive the training. USDA reported that this recommendation has reached final action; however, we take exception to the recommendation being closed.

2.3.4 Identifies and tracks status of specialized security and privacy training for all personnel (including employees, contractors, and other organization users) with significant information security and privacy responsibilities requiring specialized training. – Not Met

Exception noted. NIST SP 800-53 requires agencies to provide role-based training. Agencies are required to document and monitor individual information system security training activities and to retain individual training records. OIG reviewed the training content for individuals with significant information security responsibilities of the two sampled agencies. Our testing of 1,104 employees with significant security responsibilities found 590 employees (53.4 percent) from five of six reviewed agencies did not have adequate role-based training to meet NIST requirements. The other agency had not identified or tracked users since 2014.


54 NIST SP 800-53 requires the organization to provide basic security awareness training to all users. Additionally, it requires the organization to identify and provide information system managers, system and network administrators, personnel performing independent verification and validation activities, security control assessors, and other personnel having access to system-level software with role-based specialized security training related to their specific roles and responsibilities. The organization is to determine the appropriate content of security training and the specific requirements of the organization and the information systems to which personnel have authorized access.


2.3.5 Measures the effectiveness of its security and privacy awareness and training programs, including through social engineering and phishing exercises. (PMC, 2016 CIO FISMA Metrics 2.19, NIST SP 800-50, NIST SP 800-55) – Met

No exception noted. The Department’s FY 2016 Information SAT course, which every user is required to take each year, is effective.

2.3.6 Provide any additional information on the effectiveness (positive or negative) of the organization’s Security and Privacy Training Program that was not noted in the questions above. Based on all testing performed, is the Security and Privacy Training Program effective? – Not Effective

Based on the questions answered in this section, OIG has determined that the Security and Privacy Training Program is not effective.

30 AUDIT REPORT 50501-0012-12

Section 3: Detect

3.1.1 ISCM program is not formalized and ISCM activities are performed in a reactive manner resulting in an Ad hoc program that does not meet Level 2 requirements for a defined program consistent with NIST SP 800-53, SP 800-137, OMB M-14-03, and the CIO ISCM CONOPS.

People Domain – Level 1: Ad hoc

3.1.1.1 ISCM stakeholders and their responsibilities have been defined and communicated across the organization. However, stakeholders may not have adequate resources (people, processes, and technology) to effectively implement ISCM activities. – Level 2 (Defined)

OCIO has defined the Information Security Continuous Monitoring (ISCM) stakeholders and their responsibilities within its ISCM Strategic Plan and it has been communicated across the organization.

3.1.1.2 The organization has not performed an assessment of the skills, knowledge, and resources needed to effectively implement an ISCM program. Key personnel do not possess the knowledge, skills, and abilities to successfully implement an effective ISCM program. – Level 1 (Ad hoc)

OCIO has not performed an assessment of the skills, knowledge, and resources needed to effectively implement an ISCM program.

In the FY 2015 FISMA report, OIG recommended the Department perform an assessment of the skills, knowledge, and resources needed to effectively implement an ISCM program. The recommendation is still open, and OCIO has exceeded its final action estimated completion date of September 30, 2016.


3.1.1.3 The organization has defined how ISCM information will be shared with individuals with significant security responsibilities and used to make risk-based decisions. However, ISCM information is not always shared with individuals with significant security responsibilities in a timely manner with which to make risk-based decisions. – Level 2 (Defined)

USDA’s ISCM Strategic Plan outlines how USDA will share ISCM information with individuals with significant security responsibilities and use it to make risk-based decisions.

3.1.1.4 The organization has not defined how it will integrate ISCM activities with organizational risk tolerance, the threat environment, and business/mission requirements. – Level 1 (Ad hoc)

USDA is performing key control assessments based on risk tolerance through its regular Risk Management Framework (RMF) key control testing; however, this testing has not yet been incorporated into ISCM activities.

Processes Domain – Level 1: Ad hoc

3.1.1.5 ISCM processes have not been fully defined and are performed in an Ad hoc, reactive manner for the following areas: ongoing assessments and monitoring of security controls; performing hardware asset management, software asset management, configuration setting management, and common vulnerability management; collecting security related information required for metrics, assessments, and reporting; analyzing ISCM data, reporting findings, and determining the appropriate risk responses; and reviewing and updating the ISCM program. – Level 1 (Ad hoc)

USDA maintains policies that define how ongoing assessments of security controls should be performed, and reviews and updates the ISCM program; however, USDA does not perform configuration setting management and common vulnerability management in its ISCM activities. Additionally, USDA does not analyze ISCM data, report findings, and determine the appropriate risk responses because the Continuous Diagnostics and Mitigation (CDM) tools have not yet been fully implemented.

3.1.1.6 ISCM results vary depending on who performs the activity, when it is performed, and the methods and tools used. – Level 1 (Ad hoc)

OCIO stated that the CDM tools, once implemented, will be used in differing ways according to each agency’s separate processes. Additionally, training for the CDM tools, which would help make ISCM results more consistent, has been limited.

3.1.1.7 The organization has not identified and defined the qualitative and quantitative performance measures that will be used to assess the effectiveness of its ISCM program, achieve situational awareness, and control ongoing risk. – Level 1 (Ad hoc)

The ISCM Strategy Plan states that the Chief Information Officers Council will provide evaluations of the effective and efficient performance of the ISCM program; however, it is not detailed enough in outlining measurement criteria for ISCM performance and therefore cannot be considered fully defined.

3.1.1.8 The organization has not defined its processes for collecting and considering lessons learned to improve ISCM processes. – Level 1 (Ad hoc)

OCIO has not defined its processes for collecting and considering lessons learned to improve ISCM processes. In the FY 2015 FISMA report, OIG recommended the Department define within the ISCM Strategic Plan (or other formal document) USDA’s process for collecting and considering lessons learned to improve ISCM processes. The recommendation is still open, and OCIO has exceeded its final action estimated completion date of September 30, 2016.

Technology Domain – Level 1: Ad hoc

3.2.1.9 The organization has identified and fully defined the ISCM technologies it plans to utilize in the following automation areas. In addition, the organization has developed a plan for implementing ISCM technologies in these areas: patch management, license management, information management, software assurance, vulnerability management, event management, malware detection, asset management, configuration management, network management, and incident management. However, the organization has not fully implemented technology in these automation areas and continues to rely on manual/procedural methods in instances where automation would be more effective. In addition, while automated tools are implemented to support some ISCM activities, the tools may not be interoperable. – Level 2 (defined)

OCIO has identified and defined the ISCM technologies needed in all of the automation areas; however, these tools have not been implemented.

3.1.1.10 The organization has not defined how it will use automation to produce an accurate point-in-time inventory of the authorized and unauthorized devices and software on its network and the security configuration of these devices and software. – Level 1 (Ad hoc)

OCIO has performed pilots for two CDM tools that will be used for hardware and software asset management. These tools are still in the implementation stage, and therefore OCIO does not currently have an automated process to produce an accurate point-in-time inventory of the authorized and unauthorized devices and software on its network.

In the FY 2015 FISMA report, OIG recommended the Department define within the ISCM Strategy or Plan (or other formal document) how USDA will use automation to produce an accurate inventory of authorized and unauthorized devices. The recommendation is still open, and OCIO has exceeded its final action estimated completion date of September 30, 2016.



Section 4: Respond

4.1.1 Incident response program is not formalized and incident response activities are performed in a reactive manner resulting in an Ad hoc program that does not meet Level 2 requirements for a defined program consistent with FISMA (including guidance from NIST SP 800-83, NIST SP 800-61 Rev. 2, NIST SP 800-53, OMB M-16-03, OMB M-16-04, and US-CERT Federal Incident Notification Guidelines). – Level 1 (Ad hoc)

People Domain – Level 1: Ad hoc

4.2.1.1 Incident response team structures/models, stakeholders, and their roles, responsibilities, levels of authority, and dependencies have been fully defined and communicated across the organization, including the designation of a principal security operations center or equivalent organization that is accountable to agency leadership, DHS, and OMB for all incident response activities. However, stakeholders may not have adequate resources (people, processes, and technology) to effectively implement incident response activities. Further, the organization has not verified roles and responsibilities as part of incident response testing. – Level 2 (defined)

The Department’s draft policy and procedures define the roles and responsibilities for incident response teams. OCIO follows these draft procedures; therefore, we consider this level defined. However, the procedures have been in draft since 2011. Without finalized and distributed Departmental procedures, agencies may not have adequate guidance to follow when implementing their internal incident response programs.

4.1.1.2 The organization has not performed an assessment of the skills, knowledge, and resources needed to effectively implement an incident response program. Key personnel do not possess the knowledge, skills, and abilities to successfully implement an effective incident response program. – Level 1 (Ad hoc)

OCIO provided the Standard Operating Procedures for Incident Handling Management and has identified the responsibilities and goals of its organization for an effective incident response program. However, OCIO does not have a plan for closing the gaps and stated that agencies may not have adequate staffing and skills.

4.2.1.3 The organization has defined a common threat vector taxonomy and defined how incident response information will be shared with individuals with significant security responsibilities and other stakeholders, and used to make timely, risk-based decisions. However, the organization does not consistently utilize its threat vector taxonomy, and incident response information is not always shared with individuals with significant security responsibilities and other stakeholders in a timely manner. – Level 2 (defined)

The Department’s draft policy and procedures have been updated to define a common threat vector taxonomy. OCIO follows United States Computer Emergency Readiness Team (US-CERT) guidelines for reporting incidents by threat vector, and this process has been implemented.



4.1.1.4 The organization has not defined how it will integrate incident response activities with organizational risk management, continuous monitoring, continuity of operations, and other mission/business areas, as appropriate. – Level 1 (Ad hoc)

As noted above, the Department does not have CDM tools fully implemented.

Processes Domain – Level 1: Ad hoc

4.2.1.5 Incident response processes have been fully defined for the following areas: incident response planning, incident response training and testing; incident detection and analysis; incident containment, eradication, and recovery; incident coordination, information sharing, and reporting using standard data elements and impact classifications within timeframes established by US-CERT. However, these processes are inconsistently implemented across the organization. – Level 2 (defined)

The Department has drafted procedures for using the standard data elements and impact classifications within time frames established by US-CERT. Effective September 30, 2015, the US-CERT Federal Incident Notification Guidelines established a 1-hour notification time frame for all incidents to improve US-CERT’s ability to understand cybersecurity events affecting the government. OIG reviewed a sample of incidents to determine whether they followed US-CERT requirements and were reported within the 1-hour timeframe. We found that 9 of the 73 incidents we reviewed were not reported within 1 hour.
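As an illustration, the 1-hour test reduces to comparing detection and reporting timestamps. The sketch below uses hypothetical field names, not OCIO's actual incident-tracking schema:

```python
from datetime import datetime, timedelta

ONE_HOUR = timedelta(hours=1)

def late_reports(incidents):
    """Return incidents reported to US-CERT more than 1 hour after detection."""
    return [i for i in incidents
            if i["reported_at"] - i["detected_at"] > ONE_HOUR]

# Hypothetical incident records (field names are illustrative).
incidents = [
    {"id": "INC-001",
     "detected_at": datetime(2016, 3, 1, 9, 0),
     "reported_at": datetime(2016, 3, 1, 9, 45)},   # within the window
    {"id": "INC-002",
     "detected_at": datetime(2016, 3, 1, 9, 0),
     "reported_at": datetime(2016, 3, 1, 11, 30)},  # reported late
]

print([i["id"] for i in late_reports(incidents)])  # ['INC-002']
```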

4.1.1.6 The organization has not fully defined how it will collaborate with DHS and other parties, as appropriate, to provide on-site, technical assistance/surge resources/special capabilities for quickly responding to incidents. – Level 1 (Ad hoc)

The organization has not fully defined how it will collaborate with DHS and other parties, as appropriate, to provide on-site, technical assistance/surge resources/special capabilities for quickly responding to incidents. OCIO collaborates with external entities, but the process has not been fully defined.

4.1.1.7 The organization has not identified and defined the qualitative and quantitative performance measures that will be used to assess the effectiveness of its incident response program, perform trend analysis, achieve situational awareness, and control ongoing risk. – Level 1 (Ad hoc)

The Department has not identified and defined the qualitative and quantitative performance measures that will be used to assess the effectiveness of its incident response program, perform trend analysis, achieve situational awareness, and control ongoing risk. OCIO is currently working to define its performance metrics.

4.2.1.8 The organization has defined its processes for collecting and considering lessons learned and incident data to improve security controls and incident response processes.



However, lessons learned are not consistently captured and shared across the organization and used to make timely improvements to security controls and the incident response program. – Level 2 (defined)

The OCIO procedures include a lessons learned section to be completed when reporting security and personally identifiable information incidents. However, OCIO stated that this information is not provided by the agencies on a consistent basis.

Technology Domain – Level 1: Ad hoc

4.1.1.9 The organization has not identified and defined the incident response technologies needed in one or more of the following areas and relies on manual/procedural methods in instances where automation would be more effective. Use of incident response technologies in the following areas is Ad hoc. – Level 1 (Ad hoc)

- Web application protections, such as web application firewalls
- Event and incident management, such as intrusion detection and prevention tools, and incident tracking and reporting tools
- Aggregation and analysis, such as security information and event management (SIEM) products
- Malware detection, such as anti-virus and anti-spam software technologies
- Information management, such as data loss prevention
- File integrity and endpoint and server security tools

OCIO uses tools to collect and store security information and manually performs analysis. Tripwire55 has also been installed on all OCIO-owned systems. However, OCIO currently does not have an enterprise solution for file integrity or endpoint protection, and does not have an enterprise Data Loss Prevention (DLP) solution.56 OCIO does not have an enterprise platform for Anti-Virus (AV)57 and spam58 protection under its control and relies on agency endpoint and spam protection.

4.2.1.10 The organization has defined how it will meet the defined TIC security controls and ensure that all agency traffic, including mobile and cloud, is routed through defined access points, as appropriate. However, the organization has not ensured that the TIC 2.0 provider and agency-managed capabilities are consistently implemented. – Level 2 (defined)

The Department has defined how it will meet the defined TIC security controls.59 OCIO stated that any endpoint device on Wi-Fi or wired USDA networks will be logged through the TIC collection points. The Department has defined that if devices are on mobile, cloud, or non-USDA networks, traffic does not have to go through TICs.

55 Tripwire is commercial software used by the Department that ensures system integrity and validates that changes made to systems are authorized.
56 Data loss prevention (DLP) refers to technology or software developed to protect against and prevent the potential for data loss or theft.
57 Anti-virus (AV) software is a utility that searches a hard disk for viruses and removes any that are found.
58 Spam is electronic junk mail or junk newsgroup postings.
59 The Trusted Internet Connection’s (TIC) purpose is to optimize and standardize the security of individual external network connections currently in use by the Federal Government, including connections to the internet.

4.2.1.11 The organization has defined how it plans to utilize DHS’ Einstein program for intrusion detection/prevention capabilities for traffic entering and leaving its networks. – Level 2 (defined)

The Department has defined how it uses DHS’ Einstein program.60

4.1.1.12 The organization has not defined how it plans to utilize technology to develop and maintain a baseline of network operations and expected data flows for users and systems. – Level 1 (Ad hoc)

OCIO is in the process of defining how it plans to utilize technology to develop and maintain a baseline of network operations and expected data flows for users and systems. OCIO stated that it is implementing a toolset to accomplish this plan.

Section 5: Recover

Contingency Planning (Recover)

5.1 Has the organization established an enterprise-wide business continuity/disaster recovery program, including policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? – Met

No material exception noted. Although USDA has established a Continuity/Disaster Recovery Program, it is not consistently implemented across the agencies within USDA.

5.1.1 Develops and facilitates recovery testing, training, and exercise (TT&E) programs. (FCD1, NIST SP 800-34, NIST SP 800-53) – Met

No material exception noted. NIST SP 800-53 requires Federal agencies to test the contingency plan to determine its effectiveness and their readiness for execution. OIG found that 3 of the 79 systems reviewed did not have a contingency plan in the required Departmental system. Without a contingency plan, we could not review the TT&E program for those systems.

5.1.2 Incorporates the system’s Business Impact Analysis and Business Process Analysis into analysis and strategy toward development of the organization’s Continuity of Operations Plan, Business Continuity Plan (BCP), and Disaster Recovery Plan (DRP). (NIST SP 800-34) – Not Met

60 DHS’ Einstein detects and blocks cyber-attacks from compromising Federal agencies. Einstein also provides DHS with the situational awareness to use threat information detected in one agency to protect the rest of the government and to help the private sector protect itself.


Exception noted. NIST states that conducting the Business Impact Analysis (BIA) is a key element in a comprehensive information system contingency planning process.61 The Departmental guidance on developing contingency plans requires that a BIA be completed for each new information system.62 OIG found that two of two agencies reviewed had not incorporated the BIA into their contingency plans.

5.1.3 Develops and maintains documented recovery strategies, plans, and procedures at the division, component, and IT infrastructure levels. (NIST SP 800-34) – Met

No material exception noted. OIG found that only one system reviewed did not address key information as required by NIST SP 800-34. Because that agency had not developed a contingency plan, we did not find documented recovery strategies, plans, and procedures for it.

5.1.4 BCP and DRP are in place and ready to be executed upon if necessary. (FCD1, NIST SP 800-34, 2016 CIO FISMA Metrics 5.3, PMC) – Met

No exception noted. NIST SP 800-53 requires the agency to have formal, documented procedures to facilitate the implementation of its contingency planning policy and associated controls. The Department and two of two agencies reviewed had plans in place and ready to be executed.

5.1.5 Tests BCP and DRP for effectiveness and updates plans as necessary. (2016 CIO FISMA Metrics, 5.4) – Not Met

Exception noted. NIST SP 800-53 requires Federal agencies to test contingency plans for information systems, review the contingency plan test results, and initiate corrective actions, if needed. We found that 69 of 79 systems did not have evidence of testing in at least one of the last three years. Additionally, four agencies self-reported deficiencies in this area.

5.1.6 Tests system-specific contingency plans, in accordance with organizationally defined timeframes, to determine the effectiveness of the plans as well as readiness to execute the plans if necessary. (NIST SP 800-53: CP-4) – Not Met

Exception noted. NIST SP 800-53 requires Federal agencies to test contingency plans for information systems, using organization-defined tests. This testing is done to determine the plans’ effectiveness and the organizations’ readiness to execute the plans. We identified 189 of 316 agency systems63 within the Department that did not have a testing date recorded in CSAM during FY 2016.64 Additionally, the A-123 self-assessment noted four agencies with problems regarding system-specific contingency plan testing.

61 NIST SP 800-34 Revision 1, Contingency Planning Guide for Federal Information Systems (May 2010).
62 DR 3571-001, Information System Contingency Planning and Disaster Recovery Planning (June 1, 2016).
63 The system inventory report from CSAM was run on September 8, 2016, and then updated on October 19, 2016, to capture contingency plan test dates for the remainder of the FY.
64 USDA Contingency Plan Exercise Handbook, Revision 2.0 (October 2014).


In the FY 2010 FISMA report, we recommended the Department ensure that all required contingency planning documents are in CSAM and all required fields are properly populated. This should include recovery strategies, plans, and procedures, as well as testing, training, and exercise results. We recommended the Department periodically review CSAM to ensure agency compliance. Although this recommendation has reached final action and is closed, we take exception to the recommendation being closed.

5.1.7 Develops after-action reports that address issues identified during contingency/disaster recovery exercises in order to improve contingency/disaster recovery processes. (FCD1, NIST SP 800-34) – Not Met

Exception noted. NIST SP 800-34 states that all recovery and reconstitution events should be well documented, including actions taken and problems encountered during recovery and reconstitution efforts. An after-action report, with lessons learned, should be documented and updated. Our review found that 22 of 79 systems in the Department did not have after-action reports in CSAM.

5.1.8 Determines alternate processing and storage sites based upon risk assessments that ensure the potential disruption of the organization’s ability to initiate and sustain operations is minimized, and that the sites are not subject to the same physical and/or cybersecurity risks as the primary sites. (FCD1, NIST SP 800-34, NIST SP 800-53: CP-6, CP-7) – Met

No material exception noted. NIST SP 800-53 requires that alternate processing sites be established for information systems in case of a disaster. We found that only 3 of 86 systems reviewed did not meet the NIST requirements for an alternate processing site. Specifically, one system had an alternate processing site in close proximity to the primary site and therefore subject to the same risks. We also found that one agency did not have an alternate processing site for its two systems or its IT infrastructure.

5.1.9 Conducts backups of information at the user- and system-levels and protects the confidentiality, integrity, and availability of backup information at storage sites. (FCD1, NIST SP 800-34, NIST SP 800-53: CP-9, NIST CF, PR.IP-4, NARA guidance on information systems security records) – Not Met

Exception noted. NIST SP 800-34 states that system data should be backed up regularly. OIG found that backups were not adequately conducted for 2 of 10 (20 percent) systems reviewed. This occurred because agencies planned to move the systems to a data center, or because backups were incomplete. If backup and recovery methods and strategies are not implemented, agencies’ systems may not be able to restore operations quickly and effectively following a service disruption.



5.1.10 Contingency planning that considers supply chain threats. – Not Met

Exception noted. We found that 12 of 79 (15.2 percent) contingency plans in our statistical sample of Department systems65 did not document or consider supply chain threats. Based on our sample results, we estimate that 46 systems may not have complied with the requirement to consider supply chain or vendor threats. As noted in 5.1.3, one of two agencies we reviewed did not have a contingency plan, and therefore we could not review supply chain threats.
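For reference, the 46-system estimate follows from a straight projection of the sample exception rate over the sampled universe. This is a plausible reconstruction under that assumption, not OIG's documented worksheet:

```python
# Straight projection of the sample exception rate over the sampled universe.
sample_exceptions = 12   # contingency plans lacking supply chain consideration
sample_size = 79         # systems reviewed
sampled_universe = 302   # systems in the sampled universe

estimate = round(sample_exceptions / sample_size * sampled_universe)
print(estimate)  # 46
```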

5.1.11 Provide any additional information on the effectiveness (positive or negative) of the organization’s Contingency Planning Program that was not noted in the questions above. Based on all testing performed, is the Contingency Planning Program effective? – Not Effective

Based on the questions answered in this section, OIG has determined that the Contingency Planning/Disaster Recovery Program is not effective.

65 We based our sample size on a 35 percent error rate and desired absolute precision of +/- 10 percent, at the 95 percent confidence level. With these assumptions, we calculated a sample size of 69 contingency plans for review and selected them by choosing a simple random sample. Additional sample design information is presented in Exhibit B.


Exhibit B: Sampling Methodology and Projections


Objective:

The objectives of this audit were to evaluate the status of USDA’s overall IT security program by evaluating the five Cybersecurity Framework security function areas:

· Identify, which includes questions pertaining to Risk Management and Contractor systems;

· Protect, which includes questions pertaining to Configuration Management, Identity and Access Management, and Security and Privacy Training;

· Detect, which includes questions pertaining to Information Security Continuous Monitoring;

· Respond, which includes questions pertaining to Incident Response; and

· Recover, which includes questions pertaining to Contingency Planning.

This sample was designed to support OIG’s FY 2016 FISMA audit, whose objective was to evaluate the status of USDA’s overall IT security program based on the overarching criteria above.

FISMA Audit Universes and Sample Designs:

FISMA contains multiple areas pertaining to various aspects of IT security. We incorporated statistical sampling in three FISMA areas. Each of those areas was represented by a different universe. The specific designs are summarized below.

1. Incident Response and Reporting

Universe: The audit universe consisted of 1,709 incidents reported during FY 2016, as of June 7, 2016.66 Each incident had a unique identifier and was categorized based on incident type into 1 of 11 categories.

Sample Design: Each incident category has specific procedures and timelines that must be met by OCIO and the agency. While standards differ among the categories, the standards fall into four common groups: checklist requirements, reporting requirements, timely resolution, and damage containment. Thus, each incident response can be assessed as “pass” or “fail” when compared to the criteria that specifically apply to that incident type. This allowed us to combine incident response performance results (pass or fail) for the mix of incident types.
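The pass/fail scoring described above can be sketched as follows; the field and criterion names are hypothetical, not OCIO's actual checklist items:

```python
# Criteria groups from the text; which groups apply varies by incident category.
CRITERIA = ("checklist", "reporting", "timely_resolution", "damage_containment")

def passes(incident, applicable):
    """An incident passes only if every applicable criterion was met."""
    return all(incident[c] for c in applicable)

# Hypothetical scored incident: the reporting requirement was missed.
incident = {"checklist": True, "reporting": False,
            "timely_resolution": True, "damage_containment": True}

print(passes(incident, ("checklist", "reporting")))          # False
print(passes(incident, ("checklist", "timely_resolution")))  # True
```

Because every incident reduces to a single pass/fail outcome against its own applicable criteria, results can be pooled across the mix of incident types.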

Our audit team wanted to ensure that at least one incident per category was selected in the sample. A couple of the categories (such as External/Removable Media and Impersonation/Spoofing) included very few incidents. Therefore, we isolated those into a census stratum, in which every incident is reviewed. Table 1 below presents the counts of incidents per category in the audit universe and sample.

66 Originally, the universe data consisted of 1,827 incidents, which included OIG’s IT security operations. OIG’s operations, however, are reviewed by the Office of Compliance and Integrity (OCI). OCI performs independent quality assurance and internal control reviews of OIG operations. Therefore, we excluded the 118 OIG incidents from the 1,827 universe total. Hence, the projectable universe totals 1,709 incidents.

Table 1: Incidents Universe and Sample by Category and Stratum


Stratum                    Category  Category                      Number of      Number of
                           Number                                  Incidents in   Incidents in
                                                                   Universe       Sample
I   Census                 1         External/Removable Media      2              2
I                          2         Impersonation/Spoofing        6              6
II  Simple Random Sample   3         Email                         29             1
II                         4         Improper Usage                372            15
II                         5         Improper Usage - Pornography  59             4
II                         6         Loss or Theft of Equipment    288            8
II                         7         Not Assigned - Category 0     154            7
II                         8         Not Assigned - Category 6     365            13
II                         9         Other                         120            5
II                         10        Paper-Based PII               27             3
II                         11        Web                           287            9
TOTAL                                                              1,709          73

The sample size of 65 incidents in the random stratum is based on:

· 95 percent confidence level,
· +/- 10 percent precision in an attribute testing scenario,
· A universe size of 1,701 units in the random stratum, and
· An average expected error rate of 21 percent, based on historical information.

Results: Results are projected to the audit universe of 1,709 incidents. Achieved precision, relative to the universe, is reflected by the confidence interval for a 95 percent confidence level. All projections are made using the normal approximation to the binomial as reflected in standard equations for a simple random sample.67
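Assuming the nine late-report exceptions fell in the random stratum (65 sampled of 1,701 incidents), the simple-random-sample projection can be reproduced approximately as follows. The t critical value is hardcoded as an approximation rather than looked up from a library:

```python
import math

N, n, x = 1701, 65, 9   # random-stratum universe, sample size, exceptions found
t_crit = 1.998          # approx. 97.5th percentile of t with n - 1 = 64 df

p_hat = x / n
total_est = N * p_hat
# Standard error of the estimated total, with finite population correction
se = N * math.sqrt(p_hat * (1 - p_hat) / (n - 1) * (1 - n / N))
lower = total_est - t_crit * se
upper = total_est + t_crit * se

# Close to the reported 236 incidents, 72.020 standard error, and 92-379 interval
print(round(total_est), round(se, 3), round(lower), round(upper))
```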

The audit team tested a variety of criteria: whether the incidents were reported to US-CERT within the required timeframe; whether the proper checklist was completed and, if not, whether it was still accepted by OCIO; whether the Incident Identification Form was completed in its entirety; whether the required incident category checklist was completed; whether incidents were open for over 30 days without a POA&M being created; and whether the incidents were resolved to minimize further damage.

67 Scheaffer, Mendenhall, Ott, Elementary Survey Sampling, Seventh Edition, Duxbury Press, c1990.

In addition to the two criteria using estimation, we developed a projection based on the number of incidents found in our sample with at least one exception. All projections are shown in Table 2 below. The narrative interpretation of the results is presented below the table.

Table 2: Incident Response and Reporting Projections68

Criteria Tested                           Estimate  Standard  95% Confidence     Coefficient    Actual  Precision
                                                    Error     Interval           of Variation   Found
                                                              Lower     Upper
Incidents not reported to US-CERT
within the required 1-hour timeframe      236       72.020    92        379      0.306          9       8%
  as a % of the universe                  14%       4%        5%        22%      0.306
Incidents closed within the
30-day timeframe                          1,604     50.111    1,504     1,704    0.031          69      6%
  as a % of the universe                  94%       3%        88%       100%     0.031
Incidents with at least one issue         288       78.187    132       444      0.272          11      9%
  as a % of the universe                  17%       5%        8%        26%      0.272

68 All percentages used in this table and the tables below are rounded to the nearest whole number.


Based on our sample results, we estimate that:

· The number of incidents not reported to US-CERT within the required 1-hour time frame is 236 (14 percent of the universe). We are 95 percent confident that the true value is between 92 (5 percent) and 379 (22 percent) incidents.

· 1,604 incidents (94 percent) were closed within the 30-day timeframe. We are 95 percent confident that the true value is between 1,504 (88 percent) and 1,704 (100 percent) incidents.

· The number of incidents with at least one issue is 288 (17 percent of the universe). We are 95 percent confident that the true value is between 132 (8 percent) and 444 (26 percent) incidents.

2. Closed POA&Ms

Universe: The audit universe consisted of 673 closed POA&Ms.

Sample Design: We selected a simple random sample of 57 closed POA&Ms for review. We based our sample size on the following factors:

· 95 percent confidence level,
· +/- 10 percent precision in an attribute testing scenario,
· Universe size of 673 units, and
· Average expected error rate of 18 percent based on historical information.
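These assumptions plug into the classical attribute-sampling size formula (normal approximation, without a finite population correction). This is a plausible reconstruction of the calculation, not OIG's documented worksheet:

```python
import math

z = 1.96   # 95 percent confidence
p = 0.18   # average expected error rate
d = 0.10   # desired absolute precision

# n0 = z^2 * p * (1 - p) / d^2, rounded up to a whole sampling unit
n0 = math.ceil(z**2 * p * (1 - p) / d**2)
print(n0)  # 57
```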

Results: Results for all criteria are projected to the audit universe of 673 closed POA&Ms. Achieved precision relative to the audit universe is reported for each criterion. The corresponding lower and upper bounds of the 95 percent confidence interval are also included. All projections are made using the normal approximation to the binomial as reflected in standard equations for a simple random sample.69 This fiscal year, the error rate we found was lower than expected. Hence, we are projecting the number of closed POA&Ms that are error free. Projections are shown in Table 3 below. The narrative interpretation of the results can be found below the table.
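A sketch of this projection, including the logical-upper-bound cap described in footnote 70; the t critical value at 56 degrees of freedom is hardcoded as an approximation:

```python
import math

N, n = 673, 57          # universe and sample of closed POA&Ms
effective = 54          # sampled POA&Ms with effective remediation
errors_found = n - effective
t_crit = 2.003          # approx. 97.5th percentile of t with n - 1 = 56 df

p_hat = effective / n
total_est = N * p_hat
# Standard error of the estimated total, with finite population correction
se = N * math.sqrt(p_hat * (1 - p_hat) / (n - 1) * (1 - n / N))
lower = total_est - t_crit * se
stat_upper = total_est + t_crit * se
# Footnote 70: the statistical bound exceeds the logical maximum, so cap it
# at universe minus observed errors (673 - 3 = 670).
upper = min(round(stat_upper), N - errors_found)

print(round(total_est), round(lower), upper)  # 638 599 670
```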

69 Op. cit., Scheaffer et al.


Table 3: POA&M (Closed) Projections

Criterion: Number of POA&Ms with an effective remediation plan
· Estimate: 638 (95% of the universe)
· Standard error: 19.213 (3%)
· 95% confidence interval: 599 (89%) to 670 (99.6%; see footnote 70)
· Coefficient of variation: 0.030
· Actual found in sample: 54
· Precision: 6%

Based on our sample, we estimate that 638 POA&Ms (95 percent) have an effective remediation plan/action/support for correcting the weakness. We are 95 percent confident that the true value is between 599 POA&Ms (89 percent) and 670 POA&Ms (99.6 percent) in the universe.
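The Table 3 figures can be approximately reproduced with the standard simple-random-sample estimator for a population total (the normal approximation to the binomial with a finite population correction). The sketch below is illustrative and assumes the textbook form of the estimator; it uses the sampled values implied by the table (54 of 57 sampled POA&Ms error free, universe of 673) and applies the logical cap described in footnote 70.

```python
import math

def project_total(universe, sample_size, successes, z=1.96):
    """Project a sample proportion to a population total with a 95%
    confidence interval, using the normal approximation and a finite
    population correction for a simple random sample."""
    p_hat = successes / sample_size
    estimate = universe * p_hat
    # SE of the estimated total: N * sqrt(p(1-p)/(n-1) * (1 - n/N))
    se = universe * math.sqrt(p_hat * (1 - p_hat) / (sample_size - 1)
                              * (1 - sample_size / universe))
    lower = estimate - z * se
    # Cap the statistical upper bound at the logical maximum: the
    # universe minus the errors actually observed (per footnote 70).
    logical_max = universe - (sample_size - successes)  # 673 - 3 = 670
    upper = min(estimate + z * se, logical_max)
    return estimate, se, lower, upper

est, se, lo, hi = project_total(universe=673, sample_size=57, successes=54)
```

Rounded, these values match the table's estimate of 638, standard error of 19.213, and the 599-to-670 interval (the report states the lower bound conservatively as 599).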

3. System / Contingency Planning

Universe: Our universe contained 312 systems.71 Two of the USDA agencies in this universe – the Grain Inspection, Packers and Stockyards Administration (GIPSA) and the National Agricultural Statistics Service (NASS) – contained very few systems for review: 2 and 8 systems, respectively. The audit team reviewed 100 percent of those (i.e., a census review). The remaining 302 systems were combined into a universe of their own. These reportable systems came from a variety of agencies and were documented in CSAM as of June 22, 2016. Each system is to have a contingency plan that contains very specific recovery information in the event of a disaster.

Sample Designs: From the universes described above, we:

· Selected a simple random sample of 69 from the 302 systems used by various USDA agencies. The sample size for our simple random sample was based on:

o 95 percent confidence level,
o +/- 10 percent precision in an attribute testing scenario,
o A universe size of 302 units, and
o An expected error rate of 35 percent, based on historical information.

· Reviewed both GIPSA systems (i.e., a census).

· Reviewed all eight NASS systems (i.e., a census).

70 The statistical upper bound for the 95 percent confidence interval is 676. This number is larger than the universe count; hence, we report the logical upper bound, which is calculated by subtracting the error count from the universe count: 673 POA&Ms - 3 with an error = 670 POA&Ms maximum that could be error free.

71 Originally, the universe data consisted of 314 systems, which included OIG's systems. OIG's operations, however, are reviewed by the Office of Compliance and Integrity (OCI), which performs independent quality assurance and internal control reviews of OIG operations. We therefore excluded the 2 OIG systems from the 314-system total, leaving a projectable universe of 312 systems.


Results: Projections are made to the whole universe of 312 Department systems. Achieved precision relative to the universe is reported for each criterion. The corresponding lower and upper bounds of the 95 percent confidence interval are also included. All projections are made using the normal approximation to the binomial as reflected in standard equations for a simple random sample.72 Projections are shown in Table 4. A narrative interpretation of the results is presented below the table.

Table 4: System / Contingency Planning Projections

Criterion: Systems that had test, training, and exercise programs in their contingency planning programs
· Estimate: 302 (97% of the universe)
· Standard error: 5.397 (2%)
· 95% confidence interval: 291 (93%) to 312 (100%; see footnote 73)
· Coefficient of variation: 0.018
· Actual found in sample: 76
· Precision: 3%

Criterion: Systems that did not have evidence of ongoing testing of system contingency plans
· Estimate: 268 (86%)
· Standard error: 11.324 (4%)
· 95% confidence interval: 246 (79%) to 291 (93%)
· Coefficient of variation: 0.042
· Actual found in sample: 69
· Precision: 7%

Criterion: Systems that did not have evidence of after-action reports
· Estimate: 79 (25%)
· Standard error: 13.861 (4%)
· 95% confidence interval: 52 (17%) to 107 (34%)
· Coefficient of variation: 0.175
· Actual found in sample: 22
· Precision: 9%

Criterion: Systems that had an alternate site in a geographically different area and thus did not pose a risk to the agency
· Estimate: 307 (98%)
· Standard error: 3.844 (1%)
· 95% confidence interval: 299 (96%) to 312 (100%; see footnote 74)
· Coefficient of variation: 0.013
· Actual found in sample: 77
· Precision: 2%

Criterion: System contingency plans that did not consider vendors and the supply chain
· Estimate: 46 (15%)
· Standard error: 11.324 (4%)
· 95% confidence interval: 23 (7%) to 68 (22%)
· Coefficient of variation: 0.247
· Actual found in sample: 12
· Precision: 7%

72 Ibid.
73 The statistical upper bound is 313; the logical upper bound is 312, or 100 percent.
74 The statistical upper bound is 314; the logical upper bound is 312, or 100 percent.

Based on our sample results, we estimate that:

· 302 systems (97 percent) have test, training and exercise programs in their contingency planning programs. We are 95 percent confident that the true value is between 291 (93 percent) and 312 (100 percent) systems.

· 268 systems (86 percent) did not have evidence of ongoing testing of system contingency plans. We are 95 percent confident that the true value is between 246 (79 percent) and 291 (93 percent) systems.

· 79 systems (25 percent) did not have evidence of after-action reports. We are 95 percent confident that the true value is between 52 (17 percent) and 107 (34 percent) systems.

· 307 systems (98 percent) have an alternate site in a geographically different area and thus do not pose a risk to the agency. We are 95 percent confident that the true value is between 299 (96 percent) and 312 (100 percent) systems.

· 46 system contingency plans (15 percent) did not consider vendors and the supply chain. We are 95 percent confident that the true value is between 23 (7 percent) and 68 (22 percent) systems.



Agency Response

USDA’S OFFICE OF THE CHIEF INFORMATION OFFICER’S

RESPONSE TO AUDIT REPORT



United States Department of Agriculture

TO: Phyllis K. Fong

Inspector General Office of Inspector General

FROM: Jonathan Alboum /s/, November 3rd, 2016

Chief Information Officer Office of the Chief Information Officer

SUBJECT: Chief Information Officer Response to Office of Inspector General's Draft Report, Audit #50501-0012-12, Fiscal Year (FY) 2016 Federal Information Security Modernization Act (FISMA)

The Department is pleased to respond to your audit report on the FY 2016 Federal Information Security Modernization Act (FISMA). We concur with the findings in the report. We generally agree with the recommendations in the report and will develop corrective action plans with milestones to address the findings within 60 days. We believe that the implementation of a maturity model is a positive change. We noted, however, that the new scoring methodology does not allow flexibility for the OIG to credit partial achievements in program areas, which artificially lowers the relative scores and does not correctly represent the progress we have made. I would like to express my appreciation for the cooperation and professionalism displayed by your staff and your contract auditors during the course of your audit.

Departmental Management Office of the Chief Information Officer 1400 Independence Avenue S.W. Washington, DC 20250



Learn more about USDA OIG
Visit our website: www.usda.gov/oig/index.htm
Follow us on Twitter: @OIGUSDA

How to Report Suspected Wrongdoing in USDA Programs

Fraud, Waste, and Abuse
File complaint online: www.usda.gov/oig/hotline.htm
Monday–Friday, 9:00 a.m.–3:00 p.m. ET
In Washington, DC 202-690-1622
Outside DC 800-424-9121
TDD (Call Collect) 202-690-1202

Bribes or Gratuities 202-720-7257 (24 hours)

In accordance with Federal civil rights law and U.S. Department of Agriculture (USDA) civil rights regulations and policies, the USDA, its Agencies, offices, and employees, and institutions participating in or administering USDA programs are prohibited from discriminating based on race, color, national origin, religion, sex, gender identity (including gender expression), sexual orientation, disability, age, marital status, family/parental status, income derived from a public assistance program, political beliefs, or reprisal or retaliation for prior civil rights activity, in any program or activity conducted or funded by USDA (not all bases apply to all programs). Remedies and complaint filing deadlines vary by program or incident.

Persons with disabilities who require alternative means of communication for program information (e.g., Braille, large print, audiotape, American Sign Language, etc.) should contact the responsible Agency or USDA's TARGET Center at (202) 720-2600 (voice and TTY) or contact USDA through the Federal Relay Service at (800) 877-8339. Additionally, program information may be made available in languages other than English.

To file a program discrimination complaint, complete the USDA Program Discrimination Complaint Form, AD-3027, found online at How to File a Program Discrimination Complaint and at any USDA office or write a letter addressed to USDA and provide in the letter all of the information requested in the form. To request a copy of the complaint form, call (866) 632-9992. Submit your completed form or letter to USDA by: (1) mail: U.S. Department of Agriculture, Office of the Assistant Secretary for Civil Rights, 1400 Independence Avenue, SW, Washington, D.C. 20250-9410; (2) fax: (202) 690-7442; or (3) email: [email protected].

USDA is an equal opportunity provider, employer, and lender.