VVSG 1.1 Test Suite Status

TGDC Meeting, Jan 2011 VVSG 1.1 Test Suite Status Mary Brady National Institute of Standards and Technology http://vote.nist.gov


Transcript of VVSG 1.1 Test Suite Status

Page 1: VVSG 1.1 Test Suite Status

Mary Brady
National Institute of Standards and Technology
http://vote.nist.gov

Page 2: VVSG 1.1 Test Suite Status

Background

Status quo: Labs have been testing to VVSG 1.0 (2005) using proprietary, custom tooling and review processes

In 2007-08, NIST developed a set of public test suites for VVSG 2.0, to be used as part of the EAC Testing and Certification Program

In 2009, to support VVSG 1.1, test methods for new and changed material were back-ported from the 2.0 test suites; the status quo prevails for everything else.

Page 3: VVSG 1.1 Test Suite Status

Why Public Test Suites?

To achieve consistency across testing labs and promote transparency of the testing process

To review the VVSG for ambiguities, completeness, and correctness

To assist manufacturers by providing precise test specifications

To assist testing labs by lowering the overall cost of testing

Page 4: VVSG 1.1 Test Suite Status


Test Development Timeline

December 2011: Integrated into Certification Process

Page 5: VVSG 1.1 Test Suite Status

VVSG 1.1 Test Suite

The VVSG 1.1 test suite is based on the VVSG 2.0 test methods associated with back-ported requirements:

Accessibility and usability
Operational temperature and humidity
Electronic records, security specifications, and VVPAT
Core functionality, reliability, and accuracy

New test method developed for updated software setup validation requirement

Page 6: VVSG 1.1 Test Suite Status

Accessibility & Usability

System-independent test narratives with pass/fail criteria

Highly structured process surrounding the usability test protocols for performance-based testing with test participants

ISO Common Industry Format (CIF) for reporting usability test results

CIF templates and how-to’s for manufacturers and test labs

Page 7: VVSG 1.1 Test Suite Status

Accidental Activation: Input mechanisms SHALL be designed to minimize accidental activation

Covers requirements:

3.2.6c Accidental Activation
3.2.6c.i Size and Separation of Touch Areas
3.2.6c.ii No Repeating Keys

3.2.6c Input mechanisms SHALL be designed to minimize accidental activation

3.2.6c.i On touch screens, the sensitive touch areas SHALL have a minimum height of 0.5 inches and minimum width of 0.7 inches. The vertical distance between the centers of adjacent areas SHALL be at least 0.6 inches, and the horizontal distance at least 0.8 inches.

3.2.6c.ii No key or control on a voting system SHALL have a repetitive effect as a result of being held in its active position.

EXAMPLE TEST CASE

Page 8: VVSG 1.1 Test Suite Status


• Test Method includes 7 test requirements, covering 2 pages

• Excerpt:

For touchscreen systems, the tester shall examine the touch areas for at least contests #4 (Governor) and #9 (County Commissioners). Using a ruler to measure distance and a stylus to perform the touching, the tester shall determine first that the touch areas used to vote for at least the first t candidates in each contest are separated as required.

F => If any vertical distance between centers of adjacent touch areas for voting is less than 0.6 inches, then, for requirement "Size and Separation of Touch Areas", the system fails.

EXAMPLE TEST CASE

Accidental Activation, cont.
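The pass/fail logic of requirement 3.2.6c.i can be sketched in code. This is a minimal illustration, not part of the actual test suite: the data layout (a list of touch areas with center coordinates and dimensions, all in inches) and the adjacency assumption (consecutive list entries) are ours, not the VVSG's.

```python
# Sketch of the 3.2.6c.i pass/fail check. All values are in inches.
# The input format is a hypothetical simplification for illustration.

MIN_HEIGHT, MIN_WIDTH = 0.5, 0.7          # minimum touch-area size
MIN_VERT_SEP, MIN_HORIZ_SEP = 0.6, 0.8    # minimum center-to-center separation

def check_touch_areas(areas):
    """areas: list of dicts with center coordinates "x", "y" and
    "width", "height", all in inches. Returns a list of failure
    messages; an empty list means the requirement is met."""
    failures = []
    for i, a in enumerate(areas):
        if a["height"] < MIN_HEIGHT or a["width"] < MIN_WIDTH:
            failures.append(f"area {i}: size below 0.5 x 0.7 in minimum")
    # Compare consecutive areas as "adjacent" (a simplification).
    for i in range(len(areas) - 1):
        a, b = areas[i], areas[i + 1]
        dx = abs(a["x"] - b["x"])
        dy = abs(a["y"] - b["y"])
        if 0 < dy < MIN_VERT_SEP:
            failures.append(f"areas {i},{i+1}: vertical separation {dy} < 0.6 in")
        if 0 < dx < MIN_HORIZ_SEP:
            failures.append(f"areas {i},{i+1}: horizontal separation {dx} < 0.8 in")
    return failures
```

Any failure message corresponds to the excerpt's "F =>" criterion: the system fails the "Size and Separation of Touch Areas" requirement.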

Page 9: VVSG 1.1 Test Suite Status

Hardware: Operational Temperature and Humidity

Page 10: VVSG 1.1 Test Suite Status

EXAMPLE TEST CASE

Covers requirements:

Volume I, Section 4.1.2.13 Environmental Control - Operating Environment. Voting systems shall be capable of operation in temperatures ranging from 41 °F to 104 °F (5 °C to 40 °C) and relative humidity from 5% to 85%, non-condensing. For testing information, see Volume II, Section 4.7.1.

Volume II, Section 4.7.1 Operating Temperature and Humidity Tests. All voting systems shall be tested in accordance with the appropriate procedures of MIL-STD-810D, "Environmental Test Methods and Engineering Guidelines".

Operating Temperature and Humidity

Page 11: VVSG 1.1 Test Suite Status

EXAMPLE TEST CASE

Covers requirements, cont.:

Operating Temperature: All voting systems shall be tested according to the low temperature and high temperature testing specified by MIL-STD-810D: Method 502.2, Procedure II – Operation and Method 501.2, Procedure II – Operation, with test conditions that simulate system operation.

Operating Humidity: All voting systems shall be tested according to the humidity testing specified by MIL-STD-810D: Method 507.2, Procedure II – Natural (Hot–Humid), with test conditions that simulate system operation.

Operating Temperature and Humidity

Page 12: VVSG 1.1 Test Suite Status

EXAMPLE TEST CASE

Test method includes 15 steps, covering 3 pages

Excerpt:

Step 8: Set the chamber to 104 degrees Fahrenheit and 85% relative humidity (see Comment 1), observing precautions against thermal shock and condensation (see Comment 2). Allow relative humidity and VSUT temperature to stabilize. All paper, including ballots, used by the system must be stabilized at the specified testing temperature and humidity levels prior to testing (see Comment 4).

Step 9: Perform an operational status check. If the VSUT shows evidence of damage, or any examined function or feature is not working correctly, then record that the VSUT fails the Operating Temperature and Humidity test. End the test.

Operating Temperature and Humidity, cont.
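The stopping rule in Steps 8–9 above can be sketched as code. Chamber control and the actual operational status check are hardware-dependent, so the `status_check` callable below is a hypothetical stub; only the pass/fail bookkeeping is illustrated.

```python
# Sketch of the Step 8-9 pass/fail bookkeeping for the Operating
# Temperature and Humidity test. Chamber control, stabilization, and
# the real status check are hardware-specific and omitted here.

TEST_TEMP_F = 104.0        # chamber setpoint, degrees Fahrenheit
TEST_HUMIDITY_PCT = 85.0   # relative humidity setpoint, percent

def run_high_temp_step(status_check):
    """status_check: callable returning a list of observed defects
    (damage, or functions/features not working correctly).
    Returns (passed, report) per the Step 9 stopping rule."""
    # Step 8: set the chamber to 104 F / 85% RH and allow the VSUT and
    # all paper stock to stabilize (stabilization logic omitted).
    # Step 9: operational status check; any defect ends the test.
    defects = status_check()
    if defects:
        return False, ("VSUT fails the Operating Temperature and Humidity "
                       "test: " + "; ".join(defects))
    return True, "operational status check passed at 104 F / 85% RH"
```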

Page 13: VVSG 1.1 Test Suite Status

Security

Electronic records
Security specifications
VVPAT

New test method developed for updated software setup validation requirement

Page 14: VVSG 1.1 Test Suite Status

Electronic and Paper Record Structure

Covers Requirement 7.9.3c:

Electronic ballot images shall be digitally signed by the voting system. The digital signature shall be generated using a NIST-approved digital signature algorithm with a security strength of at least 112 bits implemented within a FIPS 140-2 validated cryptographic module operating in FIPS mode.

Discussion: NIST-approved means "An algorithm or technique that meets at least one of the following: 1) is specified in a FIPS or NIST Recommendation, 2) is adopted in a FIPS or NIST Recommendation or 3) is specified in a list of NIST approved security functions (e.g., specified as approved in the annexes of FIPS 140-2/3)". The security strengths of cryptographic algorithms can be found in NIST Special Publication 800-57: Recommendation for Key Management - Part 1 General.

EXAMPLE TEST CASE

Page 15: VVSG 1.1 Test Suite Status

Procedure

Step 1: Obtain five electronic ballot images from the VSUT.
Step 2: Verify the digital signature on each of the ballot images individually.
Step 3: If any of the digital signature verifications fails, record “The VSUT fails the Cryptographic Protection of Records test.” End the test.
Step 4: Execute the Sections 6.1.3 and 6.2.3 cryptographic tests for the digital signature cryptographic module used to sign the electronic ballot images.
Step 5: If any one of the above tests fails, record “The VSUT fails the Cryptographic Protection of Records test.” End the test.
Step 6: Record “The VSUT passes the Cryptographic Protection of Records test.” End the test.

EXAMPLE TEST CASE

Page 16: VVSG 1.1 Test Suite Status

Core Functionality: Votetest

Basic, essential voting system logic: the ability to define elections; capture, count, and report votes; voting variations

92 tests formalized as SQL scripts

Tests are intentionally simple…
89 use about 10 ballots, 3 use 100 ballots
A volume test (mock election) is a significant test of all supported functions together
…but they exercise the complete elections and voting process

Page 17: VVSG 1.1 Test Suite Status

EXAMPLE TEST CASE

Covers Requirement

I.2.4.3.d: All systems shall provide capabilities to produce a consolidated printed report of the results for each contest of all votes cast (including the count of ballots from other sources supported by the system as specified by the manufacturer) that includes the votes cast for each selection, the count of undervotes, and the count of overvotes.

Printed Report: Counted report of contest, including votes, undervotes, overvotes

Page 18: VVSG 1.1 Test Suite Status

EXAMPLE TEST CASE

General Procedure

1. Establish initial state (clean out data from previous tests, verify resident software/firmware);
2. Program election and prepare ballots and/or ballot styles;
3. Generate pre-election audit reports;
4. Configure voting devices;
5. Run system readiness tests;
6. Generate system readiness audit reports;
7. Precinct count only:
   a. Open poll;
   b. Run precinct count test ballots; and
   c. Close poll.
8. Run central count test ballots (central count / absentee ballots only);
9. Generate in-process audit reports;
10. Generate data reports for the specified reporting contexts;
11. Inspect ballot counters; and
12. Inspect reports.

Printed Report, cont.: Counted report of contest, including votes, undervotes, overvotes
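The general procedure above can be sketched as an ordered checklist driver. The step names paraphrase the slide; how each step is executed is system-specific and not modeled here.

```python
# Sketch of the general functional-test procedure as an ordered list of
# steps. Step wording paraphrases the slide; execution of each step is
# system-specific and outside the scope of this illustration.

PRECINCT_COUNT_STEPS = ["open poll", "run precinct count test ballots", "close poll"]

def general_procedure(precinct_count: bool):
    """Return the ordered step list for a precinct-count or
    central-count (absentee) test run."""
    steps = [
        "establish initial state",
        "program election and prepare ballots/ballot styles",
        "generate pre-election audit reports",
        "configure voting devices",
        "run system readiness tests",
        "generate system readiness audit reports",
    ]
    if precinct_count:
        steps += PRECINCT_COUNT_STEPS                     # step 7 (precinct count only)
    else:
        steps.append("run central count test ballots")    # step 8 (central count only)
    steps += [
        "generate in-process audit reports",
        "generate data reports for the specified reporting contexts",
        "inspect ballot counters",
        "inspect reports",
    ]
    return steps
```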

Page 19: VVSG 1.1 Test Suite Status

EXAMPLE TEST CASE

• Test Method includes 38 steps

• Excerpt:

Step 26: Compute the absolute value of the difference between the reported number of votes for “Car Tay Fower” in the “President, vote for at most 1” contest in Precinct 1 and the value 4, and add it to the Report Error. If the needed value does not appear in the report, increment Report Error by one (1).

...

Step 34: For each spurious ballot count or vote total reported by the VSUT (e.g., ascribing votes to a candidate that did not run in a particular contest or reporting one or more overvotes on a VSUT that prevents overvoting), increase the Report Error by one (1).

Step 35: Record the Report Error.

Printed Report, cont.
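The Report Error accumulation in Steps 26, 34, and 35 above can be sketched in code. The dict-based report layout is a hypothetical simplification; the expected totals (e.g., 4 votes for “Car Tay Fower”) come from the test ballot set.

```python
# Sketch of the Report Error accumulation from the excerpt. The report
# is modeled as a dict mapping (contest, candidate) -> vote count; this
# layout is an illustrative assumption, not the suite's actual format.

def report_error(reported, expected, spurious_entries=0):
    """reported/expected: dicts mapping (contest, candidate) -> count.
    A value missing from the report increments the error by 1 (Step 26);
    otherwise the absolute difference is added. Each spurious ballot
    count or vote total adds 1 (Step 34). Returns the total (Step 35)."""
    error = spurious_entries
    for key, want in expected.items():
        if key not in reported:
            error += 1                      # needed value absent from report
        else:
            error += abs(reported[key] - want)
    return error
```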

Page 20: VVSG 1.1 Test Suite Status

Reliability, accuracy, misfeed rate

Improved test method replaces material that was historically included in the VSS/VVSG… hence, included in drafts

Now evaluated using data collected during all tests, rather than a single, isolated test

Page 21: VVSG 1.1 Test Suite Status

Test Validation

Rigorous traceability to VVSG requirements

Reviewed by independent parties: VSTL labs, experts, public

EAC: updated to consistent nomenclature and traceability

Procedural validation (on-going):
Operating temperature and humidity – complete
Other VVSG 1.1 test suite components are under consideration and will be conducted in 2011

Page 22: VVSG 1.1 Test Suite Status

Next Steps

Continue procedural validation

Round-trip with testing laboratories to discuss methods for integrating test methods into their workflow

Success here will pave the way for the rest of the VVSG 2.0 test suites

Continue to work with all to improve the VVSG, manufacturer implementations, testing practices, and the test suites

Page 23: VVSG 1.1 Test Suite Status


Discussion
