Other Topics?
• IDV
• VVSG current draft
• Human Factors
• Core Requirements and Testing
IDV
Topics
• IDV Review
• IDV in the EAC VVSG
• IDV Research Issues
• Next Steps
Review: What is IDV?
• IDV = Independent Dual Verification
• Voting systems that create one or more records of ballot choices that are
– Also verified by the voter (along with the voting system’s electronic record)
– Once verified, cannot be changed by the voting system
– Useful in efficient comparisons with the voting system’s electronic record
– Efficiently handled, resistant to damage, accessible
• VVPAT is one example using paper as 2nd record
Why is IDV important?
• 2nd record essential for meaningful audits and recounts
• 2nd record essential because voting systems are computers, which can
– Be very difficult to assess and test for accuracy
– Fail for a myriad of unknown reasons
– Be compromised or attacked
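The "efficient comparison" an independent second record enables can be sketched as a simple audit routine. The record format, ballot IDs, and field names below are hypothetical illustrations, not anything specified by the VVSG:

```python
# Sketch: comparing a voting system's electronic cast vote records
# against independently verified second records during an audit.
# Record format and field names are hypothetical illustrations.

def audit_compare(electronic_records, verified_records):
    """Return ballot IDs whose two records disagree or lack a counterpart."""
    discrepancies = []
    for ballot_id, choices in electronic_records.items():
        second = verified_records.get(ballot_id)
        if second is None:
            discrepancies.append((ballot_id, "missing second record"))
        elif second != choices:
            discrepancies.append((ballot_id, "records disagree"))
    # A second record with no electronic counterpart is also a discrepancy.
    for ballot_id in verified_records:
        if ballot_id not in electronic_records:
            discrepancies.append((ballot_id, "missing electronic record"))
    return discrepancies

electronic = {"b1": {"mayor": "Smith"}, "b2": {"mayor": "Jones"}}
verified = {"b1": {"mayor": "Smith"}, "b2": {"mayor": "Smith"}}
print(audit_compare(electronic, verified))  # flags b2
```

Because the second record cannot be changed by the voting system once verified, any such discrepancy points to a system fault or compromise rather than a voter error.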
EAC VVSG and IDV
• The VVPAT requirements in the EAC VVSG are an instantiation of IDV
• IDV discussion contained in appendix D
• Begins with core IDV definitions (characteristics)
• Lists more specific definitions for
– Split process
– Witness
– Cryptographic
– Op scan
IDV in the marketplace now
• 2+ witness systems available
• 4+ VVPAT systems (25 states now require a verified paper trail)
• Some ballot marking/op scan systems are split-process
• 1+ cryptographic systems available
• None of these systems meets all the IDV definitions, but most fit largely within IDV
IDV Issues
• Usability of multiple representations in comparisons and audits
• Usability for both voters and election officials
• Accessibility of multiple representations
• Interoperability of record formats to facilitate 3rd-party audits, IDV add-ons (e.g., Witness devices)
Core Requirements and Testing
Profiles
• Profile: specialization of a standard for a particular context, with constraints and extensions that are specific to that context. (Glossary def. 2)
• Formalize profiles for “categories”
• Add profiles for supported voting variations and optional functions
• Types of independent dual verification
• Defined in conformance clause (Sec. 4.2)
• Extensible (e.g., this state’s profile)
Compliance points
• Identified, testable requirements
• Getting there
– Extricate compound requirements from one another, make separate compliance points
– Clarify general requirements via sub-requirements that are profile- and/or activity-specific
– Refactor repeating, overlapping requirements
Implementation statement
• Vendor identifies profiles to which the system is believed to conform
• Applicable test cases are identified by profiles
• Certification is to those profiles only
Issues
• Sorting out will take time
• Identification, referencing and indexing of compliance points require a robust document production system
• Versioning of standard
Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…) (section 4.2.2)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work
Definition
Profile: (1) Subset of a standard for a particular constituency that identifies the features, options, parameters, and implementation requirements necessary for meeting a particular set of requirements. (2) Specialization of a standard for a particular context, with constraints and extensions that are specific to that context.
(1) ISO 8632 (Computer Graphics Metafile)
Profiles in implementation statement
• An implementation statement shall identify:
– Exactly 1 profile from group 1
– Exactly 1 profile from group 2
– All applicable profiles from group 3
• Profile group 1: product classes, taxonomy ‘A’
– Precinct count
– Central count
Structure of profiles
• Profiles form a classification hierarchy
• Profiles may subsume other profiles (e.g., paper-based subsumes marksense and punchcard, and “all systems” subsumes everything)
• Profiles may be declared to be mutually exclusive, or not, as appropriate (e.g., paper-based and electronic overlap in the case of marksense)
• Constraint: in all cases, that which conforms to a subprofile also conforms to the subsuming profile
• Anything conforming to a profile of a standard conforms to that standard
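The subsumption constraint can be sketched in code. The dictionary encoding and helper names below are illustrative assumptions (not VVSG notation); the profile names come from the draft's hierarchy:

```python
# Sketch: a profile hierarchy with subsumption. Each entry maps a profile
# to the profiles it directly subsumes. Encoding is an illustration only.

SUBSUMES = {
    "Voting systems": ["Precinct count", "Central count", "DRE",
                       "Marksense", "Punchcard", "IDV"],
    "Paper-based": ["Marksense", "Punchcard"],
    "IDV": ["VVPAT", "Witness", "Split process", "End-to-end (crypto)"],
}

def subsuming_profiles(profile):
    """All profiles that subsume `profile` (transitively), including itself."""
    result = {profile}
    for parent, children in SUBSUMES.items():
        if profile in children:
            result |= subsuming_profiles(parent)
    return result

# The slide's constraint: whatever conforms to a subprofile also
# conforms to every subsuming profile.
def conforms_to(claimed_profile, target_profile):
    return target_profile in subsuming_profiles(claimed_profile)

print(conforms_to("VVPAT", "IDV"))             # True: VVPAT is an IDV type
print(conforms_to("Marksense", "Paper-based")) # True
print(conforms_to("DRE", "Paper-based"))       # False
```

Encoding only direct subsumption and deriving the rest keeps the hierarchy consistent by construction, which is one way a jurisdiction's extension could be checked for conflicts with the standard.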
The hierarchy so far
[Diagram: classification hierarchy rooted at “Voting systems”, with disjoint Precinct count and Central count profiles; disjoint DRE, Marksense, and Punchcard profiles; optional-feature profiles (Straight-party voting, Ranked order voting, Split precincts, and all those optional features); and an IDV branch with disjoint VVPAT, Witness, Split process, and End-to-end (crypto) profiles]
Profiles in compliance points
• Restricts applicability of a compliance point to a precisely specified subset of voting systems
• Requirement 4.4.3.5-1.2 Systems conforming to the DRE profile shall maintain an accurate Cast Vote Record of each ballot cast.
• Requirement 4.4.2-2.13.1 In systems conforming to the Provisional / challenged ballots and DRE profiles, the DRE shall provide the capability to categorize each provisional/challenged ballot.
Profiles in conformity assessment
• Like compliance points, test cases and expert reviews are associated with specific profiles or combinations of profiles
• Only those conformity assessment activities indicated for the profiles claimed in the implementation statement are applicable
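Selecting applicable assessment activities from claimed profiles can be sketched as a simple filter. The test-case IDs and profile names below are hypothetical illustrations:

```python
# Sketch: picking applicable test cases from the profiles claimed in an
# implementation statement. Names are hypothetical illustrations.

TEST_CASES = {
    "TC-ACC-01": {"All systems"},                  # applies to everything
    "TC-DRE-07": {"DRE"},                          # DRE profile only
    "TC-PROV-03": {"DRE", "Provisional ballots"},  # needs both profiles
}

def applicable_tests(claimed_profiles):
    """Test cases whose required profiles are all claimed.
    The 'All systems' profile is treated as always claimed."""
    claimed = set(claimed_profiles) | {"All systems"}
    return sorted(tc for tc, required in TEST_CASES.items()
                  if required <= claimed)

print(applicable_tests({"DRE"}))
# TC-PROV-03 is excluded: the Provisional ballots profile was not claimed.
```

Certification is then scoped to exactly the claimed profiles: a test case tied to an unclaimed profile is simply out of scope rather than failed.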
Profiles and traceability
• Using the profiles mechanism, jurisdictions may formally define their own specializations of the standard, provided that they do not conflict with the standard (that which conforms to a subprofile must conform to the subsuming profile)
• Conformance to the jurisdiction’s profile is well-defined because the conformance clause is included by reference, as are all of the applicable compliance points
Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions (sections 4.3.1.1 and 4.3.4.1.1)
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work
What they are
• Mostly, requirements on the form (not function) of source code
• Some requirements affecting software integrity, implemented as defensive coding practices
– Error checking
– Exception handling
– Prohibit practices that are known risk factors for latent software faults and unverifiable code
– Unresolved overlap with STS…
Motivation
• Started in 1990 VSS, expanded in 2002, expanded more in IEEE P1583 5.3.2b
• TGDC Resolution #29-05, “Ensuring Correctness of Software Code” (part 2)
• Enhance workmanship, security, integrity, testability, and maintainability of applications
2002 VSS / VVSG 1 status
• Mixture of mandatory and optional
• Vendors may substitute “published, reviewed, and industry-accepted coding conventions”
• Incorporated conventions have suffered from rapid obsolescence and limited applicability
• Some mandatory requirements had unintended consequences
Recommended changes
• Expand coding conventions addressing software integrity
– Start with IEEE requirements
– Make defensive coding requirements (error and range checking) more explicit (Sec. 4.3.1.1.2)
– Require structured exception handling
• Clarify length limits (modules vs. callable units)
• Delete the rest; require use of “published, credible” coding conventions
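The two defensive coding practices named above, explicit range checking and structured exception handling, can be sketched briefly. The vote-recording function, exception class, and limits below are hypothetical illustrations, not requirements from the draft:

```python
# Sketch: explicit range checking plus structured exception handling.
# Function, exception class, and limits are hypothetical illustrations.

class VoteRangeError(Exception):
    """Raised when an input falls outside its documented range."""

def record_selection(contest_index, candidate_index,
                     num_contests, num_candidates):
    # Range-check every input before use rather than trusting the caller.
    if not 0 <= contest_index < num_contests:
        raise VoteRangeError(f"contest index {contest_index} out of range")
    if not 0 <= candidate_index < num_candidates:
        raise VoteRangeError(f"candidate index {candidate_index} out of range")
    return (contest_index, candidate_index)

# Structured exception handling: a failure is caught and handled at a
# well-defined point instead of leaving the system in an undefined state.
try:
    record_selection(5, 0, num_contests=3, num_candidates=4)
except VoteRangeError as err:
    print(f"rejected: {err}")
```

(Per Issues (2) below, C lacks structured exception handling, so a C implementation would have to meet the same intent with checked return codes instead.)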
Credible ≈ industry-accepted
• Coding conventions shall be considered credible if at least two different organizations with no ties to the creator of the rules or to the vendor seeking qualification independently decided to adopt them and made active use of them at some time within the three years before qualification was first sought.
Issues (1)
• Definition of “credible” is problematic
• General: prescriptions for how to write code versus open-ended expert review
– Performance requirement “shall have high integrity” is too vague, not measurable
– 2002 VSS, P1583 5.3.2b include prescriptions
– STS favors open-ended expert review
– Compromise: conservative prescriptions followed by STS review
Issues (2)
• Public comment, April 14: “The NASED Technical Committee has previously ruled that assembler code is permitted as long as the code meets all other requirements.” TBD – need to get a copy of that ruling, determine if it covers tabulation code, and if so, why.
• C doesn’t have structured exception handling
• Delete integrity requirements, etc., if “published, credible” replacements are found
Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work
Conformity assessment overview
• Reviews
– Logic verification
– Open-ended security review
– Accessibility, usability reviews
– Design requirement verification
• Tests
– Test protocols
– Performance-based usability testing
– Environmental tests
Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification (sections 4.5.2, 5.1.2, 5.3, & 6.3.1)
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work
What it is
• Formal characterization of software behavior within a carefully restricted scope
• Proof that this behavior conforms to specified assertions (i.e., votes are reported correctly in all cases)
• Complements [falsification] testing
Motivation
• TGDC Resolution #29-05, “Ensuring Correctness of Software Code”
• Higher level of assurance than functional testing alone
• Clarify objectives of source code review
How it works
• Vendor specifies pre- and post-conditions for each callable unit
• Vendor proves assertions regarding tabulation correctness
• Testing authority reviews, checks the math, and issues findings
– Pre- and post-conditions correctly characterize the software
– The assertions are satisfied
Issues
• Training required
• Limited scope = limited assurance; unlimited scope = impracticable
• Overlaps with security reviews
Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols (section 6.4)
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work
What it is
• General test template
• General pass criteria
• Collection of testing scenarios with implementation-specific behavior parameterized or abstracted out
• Functional tests, typical case tests, capacity tests, error case tests
• Performance-based usability tests might be integrated, T.B.D. (Template may differ)
Motivation
• TGDC Resolution #25-05, item 4 (test methods)
• TGDC Resolution #26-05, “Uniform Testing Methods and Procedures”
• Improved reproducibility
Changes from current
• Augments implementation-dependent, white box structural and functional testing
• Raises the baseline level of testing that all systems must pass
• Error rate, MTBF estimated from statistics collected across run of entire test suite
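The last point can be illustrated with simple point estimates; the figures and the estimator below are illustrative assumptions, not the VVSG's actual statistical method:

```python
# Sketch: point estimates of error rate and MTBF from statistics gathered
# across a full test-suite run. Figures and estimators are illustrative.

def error_rate(errors_observed, operations_performed):
    """Fraction of operations (e.g., ballot positions read) in error."""
    return errors_observed / operations_performed

def mtbf_hours(failures_observed, operating_hours):
    """Simple point estimate: total operating time over failures seen."""
    if failures_observed == 0:
        return float("inf")  # zero failures justifies only a lower bound
    return operating_hours / failures_observed

print(error_rate(2, 1_000_000))  # e.g., 2 errors in a million operations
print(mtbf_hours(3, 450.0))      # e.g., 3 failures in 450 hours of testing
```

Pooling statistics across the entire suite, rather than per test, gives the estimates a larger sample and hence tighter confidence, which is the point of the change.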
Issues
• Implementation-dependent white box testing: poor reproducibility, but hardly redundant
• Testing combinations of features requires extensive test suite
• Typical case tests – need input on what is typical
Today’s agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements (sections 4.4.2 and 4.4.3)
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work
Origins
• Mostly derived from 2002 VSS
• Refactored requirements to clarify and reduce redundancy
• Separated election administration concerns (best practices, Sec. 4.6)
• Added precision
Motivation
• TGDC Resolution #25-05, “Precise and Testable Requirements”
Significant changes
• In casting and counting, specified functional requirements for the many voting variations (optional profiles)
• In reporting, significantly revised requirements on the content of reports
• Accuracy of reported totals now defined by logic model
Reporting issues
• Cast, read, and counted: three concepts, not two
• Reporting levels: tabulator, precinct, election district, and jurisdiction (state)
– 2002 VSS unclear on what levels are required
– Generic facility to define arbitrary reporting contexts is not required
• Manual / optional processing of ballots with write-in votes: outside the system, unverifiable. Can these possibly conform to the write-ins profile?
• Define “unofficial”
Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model (section 4.5.1)
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work
What it is
• A formal model of the elections process in both graphical and textual representations
• An identification of the dependencies among activities and objects
• Informative, not normative
Motivation
• TGDC Resolution #33-05, “Glossary and Voting Model”
• Complements glossary to further clarify vocabulary
• Helps to specify intended scope of requirements
Origins
• Review of previous work
• Input from CRT subcommittee
• Reconciliation of conflicting vocabulary and models
• Elaboration as needed
Diagram: Register voters
Input: [Registration database [original]]
Output: [Voter lists]
[Registration database [original]] -> <ForkNode> {
  -> (Register new voters) -> <JoinNode *j>,
  -> (Update voter information) -> <*j>,
  -> (Purge ineligible, inactive, or dead voters) -> <*j>
};
<*j> -> [Registration database [updated]] -> (Generate voter lists) -> [Voter lists].
Issues
• Vocabulary is still evolving
• Will probably never match any state’s processes perfectly
Accessibility, Usability and Privacy Requirements
Accessibility and usability are qualities of a voting system
• Accessibility refers to the degree to which a system is available to, and usable by, individuals with disabilities.
• HAVA also includes alternative language access for voters with limited English proficiency as required by the Voting Rights Act.
• Usability means that voters can cast valid votes as they intended, quickly, without errors, and with confidence that their ballot choices were recorded correctly.
• It also concerns the setup, operation and maintenance of voting equipment by poll workers and election officials.
Voting systems must be usable and accessible for everyone
• The VVSG usability and accessibility section focuses on the voting process, and on voters.
• Future work should focus on other users:
– Election officials
– Poll workers
Language in HAVA guided our work
• That system be “accessible for individuals with disabilities, including non-visual accessibility for the blind and visually impaired, in a manner that provides the same opportunity for access and participation (including privacy and independence) as for other voters.” -- 301 (a)(3)(A)
• At least one voting system “equipped for individuals with disabilities” must be used at each polling place for federal elections held on or after January 1, 2006. -- 301 (a)(3)(B)
• And “provide alternative language accessibility as already required by section 203 of the Voting Rights Act.” -- 301 (a)(4).
Resolutions
• Human factors and privacy rely both on well-designed systems and on the effective deployment of those systems in the polling place (#3-05)
• Human abilities exist on a wide spectrum. Strong universal usability requirements make all voting systems not only more usable, but accessible to more people. (#6-05)
• Ballot design, instructions, and error messages are a critical part of the voting experience. They are of particular importance for people with cognitive disabilities. (#8-05)
• Setting performance, rather than design, standards will encourage innovation to address the complex, interlocking requirements for accessibility, functionality, security, and trust. (#5-05)
Our Practical Approach
• Accessibility requirements were a top priority under HAVA (#2-05)
• Other human factors and privacy requirements cover aspects of accurately capturing the indication of a voter’s choice (#4-05)
• All requirements involving human interaction must ensure that basic usability, accessibility, and privacy are maintained. (#9-05)
• The standards themselves must be usable. Voting system standards should be written in plain language understandable both by test experts and by voting officials who are not experts in human factors or design. (#10-05)
• Voting machines must be available to validate conformance tests and establish performance benchmarks. (#11-05)
Usability should be part of all design stages
• During design and development: evaluate the product’s usability throughout the development process
• Summative testing as part of qualification: evaluate the finished product against usability requirements to measure its success against human performance
• For each election: ensure that the ballot design and use of the system in the polling place continue to meet requirements
Even the best standards have limitations.
A standard should ensure a base level of usability, accessibility, and privacy.
Good design and election procedures support and extend standard requirements
VVSG 2.2.7 requirements maintain or upgrade VSS 2002
• Comprehensive accessibility requirements and recommendations that point the way to future requirements
• First usability requirements for voting systems, upgraded from informative appendix
• New privacy requirements focused on the voter-equipment interface
• Other elements
– Recommendation that vendors present a report of summative usability tests for both general and accessible voting systems
– Work to clarify ambiguous requirements and fill gaps
– Reflect what is readily achievable with current technology
– Some human factors requirements in section on VVPAT
Research is currently under way at NIST
• Usability performance benchmarks
• Plain language guidance for ballots, instructions, error messages
• Guidance for ballot design
• Usability of standards
Courtesy Design for Democracy
VVSG Section 2.2.7
1. Accessibility
1.1 General accessibility
1.2 Visual
1.3 Dexterity
1.4 Mobility
1.5 Hearing
1.6 Speech
1.7 Cognitive
2. Alternate languages
3. Usability
3.1 Usability testing
3.2 Functional
3.3 Cognitive
3.4 Perceptual
3.5 Interaction
4. Privacy
4.1 Voting station configuration
4.2. Anonymity for alternate formats
VVSG current draft
Human Factors – VVSG
• 4 areas:
– Accessibility
– Usability
– Limited English Proficiency
– Privacy
• Based on current state of the art
Human Factors – Setting the stage for VVSG 2007
• Require more advanced accessibility but still within the industry state of the art
– Synchronized audio and video
• Performance measures for usability
VVPAT
• VVPAT not mandatory. These requirements are for States that choose to use VVPAT.
• Requirements address:
– Operations
– Auditability
– Privacy
Independent Dual Verification – Setting the stage for VVSG 2007
• Multiple records of ballots for verification
• Several types currently being studied:
– Split process
– Witness
– End-to-end (cryptographic)
– VVPAT
Wireless
• Control & identify usage
• Encrypt and authenticate
• Function without wireless
• Protect against wireless-based attack
Software Distribution & Setup Validation
• Help ensure correct version of voting software is used
• Help ensure voting software is set up correctly
• Use National Software Reference Library www.nsrl.nist.gov
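Setup validation by hash comparison can be sketched as follows. The reference hash set and file paths here are hypothetical; in practice the known-good hashes would come from a reference source such as the NSRL:

```python
# Sketch: validating installed voting software against a reference set of
# known-good hashes, in the spirit of the National Software Reference
# Library. The reference set and file paths are hypothetical.
import hashlib

def sha256_of(path):
    """SHA-256 digest of a file, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def validate_setup(installed_files, reference_hashes):
    """Return files whose hash is missing from or differs in the reference set."""
    problems = []
    for path in installed_files:
        if reference_hashes.get(path) != sha256_of(path):
            problems.append(path)
    return problems
```

An empty result supports (but does not by itself prove) that the correct, unmodified version of the voting software is installed; a non-empty result means the setup must not be used until the discrepancy is explained.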