
Voluntary Voting System Guidelines

Election Assistance Commission (2007-08-31)

This Page: http://www.copswiki.org/Common/VoluntaryVotingSystemGuidelines
Media Link: http://www.copswiki.org/w119/pub/Common/VoluntaryVotingSystemGuidelines/Final-TGDC-VVSG-08312007.pdf

More Info: Election Integrity, Election2008

Intro

Chapter 1

  • Authored by: The Technical Guidelines Development Committee (TGDC), a committee authorized under the Help America Vote Act (HAVA) of 2002, and researchers at the National Institute of Standards and Technology (NIST), for the Election Assistance Commission (EAC).
  • Used primarily by voting system manufacturers and voting system test labs.
  • The VVSG can be considered essentially a mandatory standard, as the vast majority of states and territories now require that their voting systems conform to the requirements in the VVSG.

Chapter 2 - New and expanded material

  • Device Classes, Requirements Structure, Strict Terminology
  • Usability & Accessibility
    • Usability Metrics - usability, accessibility, privacy
    • Human Factors
      • Voter-Editable Ballot Device (VEBD)
      • Ballot Checking and Correction
      • Notification of Ballot Casting
      • Plain Language
      • Icons and Language
      • Adjustability
      • Choice of font and contrast
      • Legibility
      • Timing, no undue delay, warning for inactivity.
      • Alternative Languages
      • Poll Workers - usability for poll workers
      • End-to-End Accessibility
      • Accessibility of Paper Records - Allow voters to confirm paper records
      • Color Adjustment
      • Synchronized Audio and Video - three modes: audio-only, visual-only, or synchronized audio/video.
  • Software Independence - an undetected error or fault in the voting system’s software is not capable of causing an undetectable change in election results. All voting systems must be software independent in order to conform to the VVSG.
    • It must be possible to audit voting systems to verify that ballots are being recorded correctly.
    • Audits of voting system correctness cannot rely on the software itself being correct.
    • This is a major change from previous versions of the VVSG, because previous versions permitted voting systems that are software dependent, that is, voting systems whose audits must rely on the correctness of the software. One example of a software dependent voting system is the DRE, which is now non-conformant to this version of the VVSG.
    • All voting systems must include a vote-capture device that uses independent voter-verifiable records (IVVR).
      • Voter-verifiable paper records (VVPR)
        • Optical Scanners used with
          • manually marked paper ballots or
          • EBP - Electronic Ballot Printer or
          • Electronically Assisted Ballot Marker
        • VVPAT - Voter-verifiable paper audit trail.
  • Open-Ended Vulnerability Testing (OEVT) - discover architecture, design, and implementation flaws in the system that may not be detected using systematic functional, reliability, and security testing, and that may be exploited to change the outcome of an election, interfere with voters’ ability to cast ballots or have their votes counted during an election, or compromise the secrecy of the vote.
  • Expanded Security Coverage
    • Cryptography
    • Setup Inspection
    • Software Installation
    • Access Control
    • System Integrity Management
    • Communications Security
    • System Event Logging
    • Physical Security
  • Treatment of COTS in Voting System Testing
    • COTS - commercial off-the-shelf software/hardware
  • End-to-End Testing
    • Testing only portions of the system for accuracy and reliability is now prohibited in this version of the VVSG.
  • Reliability - defined as failure rate
  • Expanded Core Requirements Coverage
    • EBMs - Requirements broadened to cover Electronically-assisted Ballot Markers (EBMs) and Electronic Ballot Printers (EBPs).
    • Early voting - Updates to requirements to handle early voting.
    • Optical scanner accuracy - Significant changes to accuracy requirements for optical scanners and handling of marginal marks.
    • Coding conventions - Major revisions to coding conventions and prohibited constructs in languages.
    • QA and CM - Major revisions to Quality Assurance and Configuration Management requirements for manufacturers.
    • Humidity - New operating tests for humidity affecting paper and the voting system.
    • Logic verification - Requirements to show that the logic of the system satisfies certain constraints and correctness.
    • Epollbooks - Requirements on ballot activation involving epollbooks to protect integrity and privacy of ballot activation information and to ensure records on epollbooks do not violate secrecy of the ballot.
    • Common data formats - Requirements dealing with making voting device interfaces and data formats transparent and interchangeable and to use consensus-based, publicly available formats.

Chapter 3: Background

  • NIST - National Institute of Standards and Technology - 1975 NBS Interagency Report, later reprinted as SP 500-30, Effective Use of Computing Technology in Vote-Tallying [NIST75].
  • The 1990 Voting Systems Standard (1990 VSS)
  • The 2002 Voting Systems Standard (2002 VSS)
  • HAVA and VVSG 2005
    • HAVA is a Federal law that, among other things, provides financial aid to the states for the purchase of new voting equipment. In section 301 it also sets forth broad functional standards for voting systems as used in Federal elections.
    • The VVSG is a set of highly detailed technical requirements in support of the broad goals of HAVA. These requirements apply only to voting equipment, not to procedures in the polling place.

Equipment Requirements

  • Voting System:
    • In-person voting
    • Absentee voting
    • Provisional-challenged ballots
    • Review-required ballots
    • Primary elections
      • Closed primaries
      • Open primaries
    • Write-ins
    • Ballot rotation
    • Straight party voting
      • Cross-party endorsement
    • Split precincts
    • N-of-M voting
    • Cumulative voting
    • Ranked order voting

  • Voting Device
    • Absentee voting device
    • Provisional-challenged ballots device
    • Review-required ballots device
    • Primary elections device
      • Closed primaries device
      • Open primaries device
    • Write-ins device
    • Ballot rotation device
    • Straight party voting device
      • Cross-party endorsement device
    • Split precincts device
    • N-of-M voting device
    • Cumulative voting device
    • Ranked order voting device

  • Voting Device Classes
    • audit device
    • electronic device
      • programmed device (subsumes VEBD, tabulator, and activation device)
        • VEBD (Voter-Editable Ballot Device) (subsumes EBM, VEBD-A, VEBD-V and DRE)
          • EBM (Electronically-assisted Ballot Marker) (subsumes EBP)
            • EBP (Electronic Ballot Printer)
          • VEBD-A (Audio VEBD) (subsumes Acc-VS)
          • VEBD-V (Video VEBD) (subsumes Acc-VS)
            • Acc-VS (accessible voting station)
          • DRE (Direct Record Electronic) (subsumes VVPAT)
            • VVPAT (Voter-Verifiable Paper Audit Trail)
    • paper-based device (subsumes MMPB, EBM and optical scanner)
      • MMPB (Manually-Marked Paper Ballot)
      • EBM (Electronically-assisted Ballot Marker) (subsumes EBP)
        • EBP (Electronic Ballot Printer)
      • optical scanner (subsumes MCOS, ECOS, PCOS and CCOS)
        • MCOS (MMPB-Capable Optical Scanner)
        • ECOS (EBM-Capable Optical Scanner)
        • PCOS (Precinct-count optical scanner)
        • CCOS (Central-count optical scanner)

    • Vote-capture device (subsumes IVVR vote-capture device and VEBD)
      • IVVR vote-capture device (subsumes MMPB, EBM, and VVPAT)
        • MMPB (Manually-Marked Paper Ballot)
        • EBM (Electronically-assisted Ballot Marker) (subsumes EBP)
          • EBP (Electronic Ballot Printer)
        • VVPAT (Voter-Verifiable Paper Audit Trail)

  • tabulator (subsumes DRE, EMS, optical scanner, precinct tabulator and central tabulator)
  • EMS (Election Management System)
  • precinct tabulator (subsumes PCOS)
    • PCOS (Precinct-count optical scanner)
  • central tabulator (subsumes CCOS)
    • CCOS (Central-count optical scanner)

  • activation device
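
The subsumption relationships above map naturally onto a type hierarchy. The sketch below is an illustrative reading of the outline, not code from the VVSG: "A subsumes B" is modeled as "B subclasses A", and class names are shorthand for the device classes listed above.

```python
# Sketch of part of the VVSG device-class hierarchy. "Subsumes" is
# modeled as subclassing: if class A subsumes class B, every B is an A.

class ElectronicDevice: pass
class ProgrammedDevice(ElectronicDevice): pass
class PaperBasedDevice: pass

class VEBD(ProgrammedDevice): pass        # Voter-Editable Ballot Device
class EBM(VEBD, PaperBasedDevice): pass   # Electronically-assisted Ballot Marker
class EBP(EBM): pass                      # Electronic Ballot Printer
class DRE(VEBD): pass                     # Direct Record Electronic
class VVPAT(DRE): pass                    # Voter-Verifiable Paper Audit Trail

class Tabulator(ProgrammedDevice): pass
class OpticalScanner(Tabulator, PaperBasedDevice): pass
class PCOS(OpticalScanner): pass          # precinct-count optical scanner
class CCOS(OpticalScanner): pass          # central-count optical scanner

# Per the outline, the IVVR vote-capture class subsumes MMPB, EBM, and
# VVPAT. MMPB is omitted here for brevity (it is not a programmed device).
IVVR_CLASSES = (EBM, VVPAT)

def is_ivvr_vote_capture(device_cls) -> bool:
    """True if the class falls under the IVVR vote-capture umbrella."""
    return issubclass(device_cls, IVVR_CLASSES)

print(is_ivvr_vote_capture(EBP))   # True: EBP is subsumed by EBM
print(is_ivvr_vote_capture(DRE))   # False: a plain DRE has no paper record
```

This makes requirement 2.7.1-B mechanical to check: a paperless DRE is not an IVVR vote-capture device, which is why it is non-conformant under this version, while a DRE with VVPAT is.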


  • 2.4-A Implementation statement - An implementation statement SHALL include:
    1. Full product identification of the voting system, including version number or timestamp;
    2. Separate identification of each device (see below) that is part of the voting system;
    3. Version of VVSG to which conformity assessment is desired;
    4. Classes implemented (see Part 1: 2.5.3 “Classes identified in implementation statement”);
    5. Device capacities and limits (especially those appearing in Part 1: 8.3.1 “Domain of discourse”);
    6. List of languages supported; and
    7. Signed attestation that the foregoing accurately characterizes the system submitted for testing.
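
The seven items of 2.4-A amount to a structured document. A hypothetical implementation statement might be captured as follows; all field names and values are invented for illustration and are not drawn from the VVSG:

```python
# Hypothetical implementation statement covering the seven 2.4-A items.
implementation_statement = {
    "product": {"name": "ExampleVote", "version": "4.2.1"},        # 1. product id + version
    "devices": ["EBM unit", "PCOS unit", "EMS server"],            # 2. each device
    "vvsg_version": "2007 TGDC recommendations",                   # 3. target VVSG
    "classes_implemented": ["EBM", "EBP", "PCOS", "EMS"],          # 4. classes (1: 2.5.3)
    "capacities": {"max_contests": 500, "max_ballot_styles": 300}, # 5. capacities and limits
    "languages": ["English", "Spanish", "Chinese"],                # 6. languages supported
    "attestation_signed": True,                                    # 7. signed attestation
}
assert len(implementation_statement) == 7  # one entry per required item
```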

  • 2.7-A Software independence - Voting systems SHALL be software independent, that is, an undetected error or fault in the voting system’s software SHALL NOT be capable of causing an undetectable change in election results.
  • 2.7.1-A Independent voter-verifiable records may be software independent - software independence MAY be achieved through the use of independent voter-verifiable records or it MAY be achieved through an innovation class submission. (IVVR is the only method available at this time).
  • 2.7.1-B IVVR voting system requires IVVR vote-capture device - In a voting system of the IVVR class, every Vote-capture device SHALL be an IVVR vote-capture device.
  • 2.7.2-A Innovation class submissions follow same procedures as standard - For each distinct innovation class submission, the manufacturer SHALL adhere to the same submission procedures and requirements as for standard submissions.
  • 2.7.2-B Identification of innovativeness - Each distinct innovation class submission SHALL include additional documentation that provides an explanation as to why the voting system and its accompanying devices are innovative and how they differ from voting technology that implements other voting device classes in the VVSG.
  • 2.7.2-C Innovation class submission creates new device class - For each distinct innovation class submission, the manufacturer SHALL request and justify that a new device class be created in the VVSG for each distinct innovative device in the submission.
  • 2.7.2-C.1 Innovative device class submission - For each distinct innovation device class submission included in the voting system, the implementation statement for the voting system SHALL identify the new device classes to be created and where they fit into the device class hierarchy.
  • 2.7.2-C.2 Innovation device class identification of requirements - For each distinct innovation device class submission included in the voting system, the implementation statement for the voting system SHALL identify all requirements that apply to the new class and suggested test methods.

  • 3.2.1.1-A Total completion performance - The system SHALL achieve a Total Completion Score (the proportion of users who successfully cast a ballot (whether or not the ballot contains erroneous votes)) of at least 98% as measured by the VPP (Voting Performance Protocol).
  • 3.2.1.1-B Perfect ballot performance - The system SHALL achieve a Perfect Ballot Index (the ratio of the number of cast ballots containing no erroneous votes to the number of cast ballots containing one or more errors (either a vote for an unintended choice, or a missing vote)) of at least 2.33 as measured by the VPP.
  • 3.2.1.1-C Voter inclusion performance - The system SHALL achieve a voter inclusion index (a measure of both voting accuracy and consistency, based on mean accuracy and the associated standard deviation; accuracy per voter depends on how many “voting opportunities” within each ballot are performed correctly, and a low standard deviation of these individual accuracy scores indicates higher consistency of performance across voters) of at least 0.35 as measured by the VPP.
  • 3.2.1.1-D Usability metrics from the Voting Performance Protocol - The test lab SHALL report the metrics for usability of the voting system, as measured by the VPP.
  • 3.2.1.1-D.1 Effectiveness metrics for usability - The test lab SHALL report all the effectiveness metrics for usability as defined and measured by the VPP.
  • 3.2.1.1-D.2 Voting session time - The test lab SHALL report the Average Voting Session Time (mean time taken per voter to complete the process of activating, filling out, and casting the ballot), as measured by the VPP.
  • 3.2.1.1-D.3 Average voter confidence - The test lab SHALL report the Average Voter Confidence (mean confidence level expressed by the voters that the system successfully recorded their votes), as measured by the VPP.
  • 3.2.1.2-A Usability testing by manufacturer for general population - The manufacturer SHALL conduct summative usability tests on the voting system using individuals who are representative of the general population and SHALL report the test results, using the Common Industry Format, as part of the TDP.
  • 3.2.2-A Notification of effect of overvoting - If the voter selects more than the allowable number of choices within a contest, the voting system SHALL notify the voter of the effect of this action before the ballot is cast and counted.
  • 3.2.2-B Undervoting to be permitted - The voting system SHALL allow the voter, at the voter’s choice, to submit an undervoted ballot without correction.
  • 3.2.2-C Correction of ballot - The voting system SHALL provide the voter the opportunity to correct the ballot for either an undervote or overvote before the ballot is cast and counted.
  • 3.2.2-D Notification of ballot casting - If and only if the voter successfully casts the ballot, then the system SHALL so notify the voter.
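
The 3.2.1.1-A and 3.2.1.1-B benchmarks above are straightforward ratios, and can be made concrete with a small computation. The function names and test-session numbers below are illustrative; the voter inclusion index of 3.2.1.1-C is omitted because its exact formula is defined by the VPP, not by this summary.

```python
def total_completion_score(ballots_cast: int, participants: int) -> float:
    """3.2.1.1-A: proportion of test users who successfully cast a ballot,
    whether or not it contains erroneous votes. Benchmark: at least 0.98."""
    return ballots_cast / participants

def perfect_ballot_index(error_free: int, with_errors: int) -> float:
    """3.2.1.1-B: ratio of cast ballots with no erroneous votes to cast
    ballots with one or more errors. Benchmark: at least 2.33."""
    return error_free / with_errors

# Assumed (invented) test-session data:
participants = 100
ballots_cast = 99                       # one participant failed to cast
error_free = 70
with_errors = ballots_cast - error_free # 29

tcs = total_completion_score(ballots_cast, participants)  # 0.99
pbi = perfect_ballot_index(error_free, with_errors)       # about 2.41

print(f"Total Completion Score: {tcs:.2f} (pass: {tcs >= 0.98})")
print(f"Perfect Ballot Index:   {pbi:.2f} (pass: {pbi >= 2.33})")
```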

3.2.2.1 Editable interfaces
  • 3.2.2.1-A Prevention of overvotes - The VEBD SHALL prevent voters from selecting more than the allowable number of choices for each contest.
  • 3.2.2.1-B Warning of undervotes - The VEBD SHALL provide feedback to the voter, before final casting of the ballot, that identifies specific contests for which the voter has selected fewer than the allowable number of choices (i.e., undervotes).
  • 3.2.2.1-C Independent correction of ballot - The VEBD SHALL provide the voter the opportunity to correct the ballot before it is cast and counted. This correction process SHALL NOT require external assistance. The corrections to be supported include modifying an undervote or overvote, and changing a vote from one candidate to another.
  • 3.2.2.1-D Ballot editing per contest - The VEBD SHALL allow the voter to change a vote within a contest before advancing to the next contest.
  • 3.2.2.1-E Contest navigation - The VEBD SHALL provide navigation controls that allow the voter to advance to the next contest or go back to the previous contest before completing a vote on the contest(s) currently being presented (whether visually or aurally).
  • 3.2.2.1-F Notification of ballot casting failure (DRE) - If the voter takes the appropriate action to cast a ballot, but the system does not accept and record it successfully, including failure to store the ballot image, then the DRE SHALL so notify the voter and provide clear instruction as to the steps the voter should take to cast the ballot.

3.2.2.2 Non-Editable interfaces
  • 3.2.2.2-A Notification of overvoting - The voting system SHALL be capable of providing feedback to the voter that identifies specific contests for which the voter has made more than the allowable number of votes (i.e., overvotes).
  • 3.2.2.2-B Notification of undervoting - The voting system SHALL be capable of providing feedback to the voter that identifies specific contests for which the voter has made fewer than the allowable number of votes (i.e., undervotes). The system SHALL provide a means for an authorized election official to deactivate this capability entirely and by contest.
  • 3.2.2.2-C Notification of blank ballots - The voting system SHALL be capable of notifying the voter that he or she has submitted a paper ballot that is blank on one or both sides. The system SHALL provide a means for an authorized election official to deactivate this capability.
  • 3.2.2.2-D Ballot correction or submission following notification - If the voting system has notified the voter that a potential error condition (such as an overvote, undervote, or blank ballot) exists, the system SHALL then allow the voter to correct the ballot or to submit it as is.
  • 3.2.2.2-E Handling of marginal marks - Paper-based precinct tabulators SHOULD be able to identify a ballot containing marginal marks. When such a ballot is detected, the tabulator SHALL:
    1. Return the ballot to the voter;
    2. Provide feedback to the voter that identifies the specific contests for which a marginal mark was detected; and
    3. Allow the voter either to correct the ballot or to submit the ballot "as is" without correction.
  • 3.2.2.2-F Notification of ballot casting failure (PCOS) - If the voter takes the appropriate action to cast a ballot, but the system does not accept and record it successfully, including failure to read the ballot or to transport it into the ballot box, the PCOS SHALL so notify the voter. (precinct count optical scanner).
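
The notification rules in 3.2.2.2-A through 3.2.2.2-E can be sketched as a single decision routine. The `Contest` model, field names, and message texts below are illustrative assumptions, not definitions from the VVSG:

```python
from dataclasses import dataclass

@dataclass
class Contest:
    name: str
    allowed: int       # maximum number of selections permitted
    marks: int         # clear marks detected by the scanner
    marginal: int = 0  # marginal marks detected (3.2.2.2-E)

def check_ballot(contests, blank_ballot=False, undervote_warnings=True):
    """Collect per-contest voter feedback per 3.2.2.2-A/B/C/E. Whatever is
    reported, the voter may correct the ballot or submit it as is
    (3.2.2.2-D); undervote warnings can be deactivated by an election
    official (3.2.2.2-B), modeled by the undervote_warnings flag."""
    messages = []
    if blank_ballot:
        messages.append("Ballot is blank on one or both sides.")
    for c in contests:
        if c.marginal:
            messages.append(f"{c.name}: marginal mark detected.")
        elif c.marks > c.allowed:
            messages.append(f"{c.name}: overvote ({c.marks} marks, {c.allowed} allowed).")
        elif undervote_warnings and c.marks < c.allowed:
            messages.append(f"{c.name}: undervote ({c.marks} of {c.allowed} allowed).")
    return messages

msgs = check_ballot([
    Contest("Governor", allowed=1, marks=2),
    Contest("School Board", allowed=3, marks=1),
    Contest("Mayor", allowed=1, marks=1, marginal=1),
])
for m in msgs:
    print(m)
```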

3.2.3.1 Privacy at the polls
  • 3.2.3.1-A System support of privacy - The voting system SHALL prevent others from determining the contents of a ballot.
  • 3.2.3.1-A.1 Visual privacy - The ballot, any other visible record containing ballot information, and any input controls SHALL be visible only to the voter during the voting session and ballot submission.
  • 3.2.3.1-A.2 Auditory privacy - During the voting session, the audio interface of the voting system SHALL be audible only to the voter.
  • 3.2.3.1-A.3 Privacy of warnings - The voting system SHALL issue all warnings in a way that preserves the privacy of the voter and the confidentiality of the ballot.
  • 3.2.3.1-A.4 No receipts - The voting system SHALL NOT issue a receipt to the voter that would provide proof to another of how the voter voted.

3.2.3.2 No recording of alternative format usage
  • 3.2.3.2-A No recording of alternative languages - No information SHALL be kept within an electronic CVR that identifies any alternative language feature(s) used by a voter.
  • 3.2.3.2-B No Recording of Accessibility Features - No information SHALL be kept within an electronic CVR that identifies any accessibility feature(s) used by a voter.
3.2.4 Cognitive issues
  • 3.2.4-A Completeness of instructions - The voting station SHALL provide instructions for all its valid operations.
  • 3.2.4-B Availability of assistance from the system - The voting system SHALL provide a means for the voter to get help directly from the system at any time during the voting session.
  • 3.2.4-C Plain Language - Instructional material for the voter SHALL conform to norms and best practices for plain language.
  • 3.2.4-C.1 Clear Warnings - Warnings and alerts issued by the voting system SHOULD clearly state:
    1. The nature of the problem;
    2. Whether the voter has performed or attempted an invalid operation or whether the voting equipment itself has malfunctioned in some way; and
    3. The set of responses available to the voter.
  • 3.2.4-C.2 Context before action - When an instruction is based on a condition, the condition SHOULD be stated first, and then the action to be performed.
  • 3.2.4-C.3 No Jargon - The system SHOULD use familiar, common words and avoid technical or specialized words that voters are not likely to understand.
  • 3.2.4-C.4 Start each instruction on a new line - The system SHOULD start the visual presentation of each new instruction on a new line.
  • 3.2.4-C.5 Use of positive - The system SHOULD issue instructions on the correct way to perform actions, rather than telling voters what not to do.
  • 3.2.4-C.6 Use of imperative voice - The system's instructions SHOULD address the voter directly rather than use passive voice constructions.
  • 3.2.4-C.7 Gender-based pronouns - The system SHOULD avoid the use of gender-based pronouns.
  • 3.2.4-D No bias among choices - Consistent with election law, the voting system SHALL support a process that does not introduce bias for or against any of the contest choices to be presented to the voter. In both visual and aural formats, the choices SHALL be presented in an equivalent manner.
  • 3.2.4-E Ballot design - The voting system SHALL provide the capability to design a ballot with a high level of clarity and comprehensibility.
  • 3.2.4-E.1 Contests split among pages or columns - The voting system SHOULD NOT visually present a single contest spread over two pages or two columns.
  • 3.2.4-E.2 Indicate maximum number of candidates - The ballot SHALL clearly indicate the maximum number of candidates for which one can vote within a single contest.
  • 3.2.4-E.3 Consistent representation of candidate selection - The relationship between the name of a candidate and the mechanism used to vote for that candidate SHALL be consistent throughout the ballot.
  • 3.2.4-E.4 Placement of instructions - The system SHOULD display instructions near to where they are needed.
  • 3.2.4-F Conventional use of color - The use of color by the voting system SHOULD agree with common conventions: (a) green, blue or white is used for general information or as a normal status indicator; (b) amber or yellow is used to indicate warnings or a marginal status; (c) red is used to indicate error conditions or a problem requiring immediate attention.
  • 3.2.4-G Icons and language - When an icon is used to convey information, indicate an action, or prompt a response, it SHALL be accompanied by a corresponding linguistic label.

3.2.5 Perceptual issues
  • 3.2.5-A Screen flicker - No voting system display screen SHALL flicker with a frequency between 2 Hz and 55 Hz.
  • 3.2.5-B Resetting of adjustable aspects at end of session - Any aspect of the voting station that is adjustable by the voter or poll worker, including font size, color, contrast, audio volume, or rate of speech, SHALL automatically reset to a standard default value upon completion of that voter's session. For the Acc-VS, the aspects include synchronized audio/video mode and non-manual input mode.
  • 3.2.5-C Ability to reset to default values - If any aspect of a voting system is adjustable by the voter or poll worker, there SHALL be a mechanism to reset all such aspects to their default values.
  • 3.2.5-D Minimum font size - Voting systems SHALL provide a minimum font size of 3.0mm (measured as the height of a capital letter) for all text intended for voters or poll workers.
  • 3.2.5-E Available font sizes - A voting station that uses an electronic image display SHALL be capable of showing all information in at least two font sizes, (a) 3.0-4.0 mm and (b) 6.3-9.0 mm, under control of the voter. The system SHALL allow the voter to adjust font size throughout the voting session while preserving the current votes.
  • 3.2.5-F Use of sans serif font - Text intended for the voter SHOULD be presented in a sans serif font. (Research has shown that users prefer such fonts.)
  • 3.2.5-G Legibility of paper ballots and verification records - Voting systems using paper ballots or paper verification records SHALL provide features that assist in the reading of such ballots and records by voters with poor reading vision.
  • 3.2.5-G.1 Legibility via font size - The system MAY achieve legibility of paper records by supporting the printing of those records in at least two font sizes, 3.0 - 4.0mm and 6.3 - 9.0mm.
  • 3.2.5-G.2 Legibility via magnification - The system MAY achieve legibility of paper records by supporting magnification of those records. This magnification MAY be done by optical or electronic devices. The manufacturer MAY either: 1) provide the magnifier itself as part of the system, or 2) provide the make and model number of readily available magnifiers that are compatible with the system.
  • 3.2.5-H Contrast Ratio - The minimum figure-to-ground ambient contrast ratio for all text and informational graphics (including icons) intended for voters or poll workers SHALL be 3:1.
  • 3.2.5-I High contrast for electronic displays - The voting station SHALL be capable of showing all information in high contrast either by default or under the control of the voter. The system SHALL allow the voter to adjust contrast throughout the voting session while preserving the current votes. High contrast is a figure-to-ground ambient contrast ratio for text and informational graphics of at least 6:1.
  • 3.2.5-J Accommodation for color blindness - The default color coding SHALL support correct perception by voters with color blindness.
  • 3.2.5-K No reliance solely on color - Color coding SHALL NOT be used as the sole means of conveying information, indicating an action, prompting a response, or distinguishing a visual element.
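
The 3:1 and 6:1 figures in 3.2.5-H and 3.2.5-I are figure-to-ground luminance contrast ratios. Assuming the common relative-luminance formulation used by accessibility standards such as WCAG 2.x (the VVSG defines its own measurement conditions, so this is an approximation), a check might look like:

```python
def _linearize(channel: float) -> float:
    """sRGB channel value (0..1) to linear light, per the WCAG 2.x definition."""
    return channel / 12.92 if channel <= 0.03928 else ((channel + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb) -> float:
    """Relative luminance of an 8-bit-per-channel sRGB color."""
    r, g, b = (_linearize(c / 255) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """(L_lighter + 0.05) / (L_darker + 0.05), so the ratio is order-independent."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

ratio = contrast_ratio((0, 0, 0), (255, 255, 255))  # black text on white: 21.0
print(f"{ratio:.1f}:1  meets 3:1 (3.2.5-H): {ratio >= 3}  "
      f"meets 6:1 high contrast (3.2.5-I): {ratio >= 6}")
```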

3.2.6 Interaction issues
  • 3.2.6-A No page scrolling - Voting systems SHALL NOT require page scrolling by the voter.
  • 3.2.6-B Unambiguous feedback for voter's selection - The voting system SHALL provide unambiguous feedback regarding the voter’s selection, such as displaying a checkmark beside the selected option or conspicuously changing its appearance.
  • 3.2.6-C Accidental Activation - Input mechanisms SHALL be designed to minimize accidental activation.
  • 3.2.6-C.1 Size and separation of touch areas - On touch screens, the sensitive touch areas SHALL have a minimum height of 0.5 inches and minimum width of 0.7 inches. The vertical distance between the centers of adjacent areas SHALL be at least 0.6 inches, and the horizontal distance at least 0.8 inches.
  • 3.2.6-C.2 No repeating keys - No key or control on a voting system SHALL have a repetitive effect as a result of being held in its active position.

3.2.6.1 Timing issues
  • 3.2.6.1-A Maximum initial system response time - The initial system response time of the voting system SHALL be no greater than 0.5 seconds.
  • 3.2.6.1-B Maximum completed system response time for vote confirmation - When the voter performs an action to record a single vote, the completed system response time of the voting system SHALL be no greater than one second in the case of a visual response, and no greater than five seconds in the case of an audio response.
  • 3.2.6.1-C Maximum completed system response time for all operations - The completed system response time of the voting system for visual operations SHALL be no greater than 10 seconds.
  • 3.2.6.1-D System response indicator - If the system has not completed its visual response within one second, it SHALL present to the voter, within 0.5 seconds of the voter's action, some indication that it is preparing its response.
  • 3.2.6.1-E Voter inactivity time - The voting system SHALL detect and warn about lengthy voter inactivity during a voting session. Each system SHALL have a defined and documented voter inactivity time, and that time SHALL be between two and five minutes.
  • 3.2.6.1-F Alert time - Upon expiration of the voter inactivity time, the voting system SHALL issue an alert and provide a means by which the voter may receive additional time. The alert time SHALL be between 20 and 45 seconds. If the voter does not respond to the alert within the alert time, the system SHALL go into an inactive state requiring poll worker intervention.
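
Requirements 3.2.6.1-E and 3.2.6.1-F together describe a two-stage timeout: active session, then an alert window, then an inactive state requiring poll worker intervention. A minimal sketch of that state machine (state names and the particular timeout values are illustrative, chosen within the permitted ranges):

```python
# Two-stage inactivity timeout per 3.2.6.1-E/F:
# ACTIVE --inactivity time--> ALERT --alert time--> INACTIVE.

INACTIVITY_SECONDS = 180  # documented voter inactivity time; must be 120-300 s
ALERT_SECONDS = 30        # alert window; must be 20-45 s

def session_state(seconds_since_last_input: float) -> str:
    """State of the voting session given the idle time so far."""
    if seconds_since_last_input < INACTIVITY_SECONDS:
        return "ACTIVE"
    if seconds_since_last_input < INACTIVITY_SECONDS + ALERT_SECONDS:
        return "ALERT"     # voter may request additional time here
    return "INACTIVE"      # poll worker intervention required

for t in (10, 185, 215):
    print(t, session_state(t))  # ACTIVE, ALERT, INACTIVE
```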

3.2.7 Alternative languages
  • 3.2.7-A General support for alternative languages - The voting system SHALL be capable of presenting the ballot, contest choices, review screens, vote verification records, and voting instructions in any language declared by the manufacturer to be supported by the system.
  • 3.2.7-A.1 Voter control of language - The system SHALL allow the voter to select among the available languages throughout the voting session while preserving the current votes.
  • 3.2.7-A.2 Complete information in alternative language - Information presented to the voter in the typical case of English-literate voters (including instructions, warnings, messages, contest choices, and vote verification information) SHALL also be presented when an alternative language is being used, whether the language is written or spoken.
  • 3.2.7-A.3 Auditability of records for English readers - Any records, including paper ballots and paper verification records, SHALL have sufficient information to support auditing by poll workers and others who can read only English.
  • 3.2.7-A.4 Usability testing by manufacturer for alternative languages - The manufacturer SHALL conduct summative usability tests for each of the system's supported languages, using subjects who are fluent in those languages but not fluent in English and SHALL report the test results, using the Common Industry Format, as part of the TDP.

3.2.8 Usability for poll workers
  • 3.2.8-A Clarity of system messages for poll workers - Messages generated by the system for poll workers in support of the operation, maintenance, or safety of the system SHALL adhere to the requirements for clarity in Part 1: 3.2.4 “Cognitive issues”.

3.2.8.1 Operation
  • 3.2.8.1-A Ease of normal operation - The procedures for system setup, polling, and shutdown, as documented by the manufacturer, SHALL be reasonably easy for the typical poll worker to learn, understand, and perform.
  • 3.2.8.1-B Usability testing by manufacturer for poll workers - The manufacturer SHALL conduct summative usability tests on the voting system using individuals who are representative of the general population and SHALL report the test results, using the Common Industry Format, as part of the TDP. The tasks to be covered in the test SHALL include setup, operation, and shutdown.
  • 3.2.8.1-C Documentation usability - The system SHALL include clear, complete, and detailed instructions and messages for setup, polling, and shutdown.
  • 3.2.8.1-C.1 Poll Workers as target audience - The documentation required for normal system operation SHALL be presented at a level appropriate for non-expert poll workers.
  • 3.2.8.1-C.2 Usability at the polling place - The documentation SHALL be in a format suitable for practical use in the polling place.
  • 3.2.8.1-C.3 Enabling verification of correct operation - The instructions and messages SHALL enable the poll worker to verify that the system
    1. Has been set up correctly (setup);
    2. Is in correct working order to record votes (polling); and
    3. Has been shut down correctly (shutdown).

3.2.8.2 Safety
  • 3.2.8.2-A Safety certification - Equipment associated with the voting system SHALL be certified in accordance with the requirements of UL 60950-1, Information Technology Equipment – Safety – Part 1 [UL05] by a certification organization accredited by the Department of Labor, Occupational Safety and Health Administration’s Nationally Recognized Testing Laboratory program. The certification organization’s scope of accreditation SHALL include UL 60950-1.

3.3 Accessibility requirements

3.3.1 General
  • 3.3.1-A Accessibility throughout the voting session - The Acc-VS SHALL be integrated into the manufacturer’s complete voting system so as to support accessibility for disabled voters throughout the voting session.
  • 3.3.1-A.1 Documentation of Accessibility Procedures - The manufacturer SHALL supply documentation describing 1) recommended procedures that fully implement accessibility for voters with disabilities and 2) how the Acc-VS supports those procedures.
  • 3.3.1-B Complete information in alternative formats - When the provision of accessibility involves an alternative format for ballot presentation, then all information presented to non-disabled voters, including instructions, warnings, error and other messages, and contest choices, SHALL be presented in that alternative format.
  • 3.3.1-C No dependence on personal assistive technology - The support provided to voters with disabilities SHALL be intrinsic to the Accessible Voting Station. It SHALL NOT be necessary for the Accessible Voting Station to be connected to any personal assistive device of the voter in order for the voter to operate it correctly.
  • 3.3.1-D Secondary means of voter identification - If a voting system provides for voter identification or authentication by using biometric measures that require a voter to possess particular biological characteristics, then the system SHALL provide a secondary means that does not depend on those characteristics.
  • 3.3.1-E Accessibility of paper-based vote verification - If the Acc-VS generates a paper record (or some other durable, human-readable record) for the purpose of allowing voters to verify their votes, then the system SHALL provide a means to ensure that the verification record is accessible to all voters with disabilities, as identified in Part 1: 3.3 “Accessibility requirements”.
  • 3.3.1-E.1 Audio readback for paper-based vote verification - If the Acc-VS generates a paper record (or some other durable, human-readable record) for the purpose of allowing voters to verify their votes, then the system SHALL provide a mechanism that can read that record and generate an audio representation of its contents.

3.3.2 Low vision
  • 3.3.2-A Usability testing by manufacturer for voters with low vision - The manufacturer SHALL conduct summative usability tests on the voting system using individuals with low vision and SHALL report the test results, using the Common Industry Format, as part of the TDP.
  • 3.3.2-B Adjustable saturation for color displays - An Accessible Voting Station with a color electronic image display SHALL allow the voter to adjust the color saturation throughout the voting session while preserving the current votes. At least two options SHALL be available: a high and a low saturation presentation.
  • 3.3.2-C Distinctive buttons and controls - Buttons and controls on Accessible Voting Stations SHALL be distinguishable by both shape and color. This applies to buttons and controls implemented either "on-screen" or in hardware. This requirement does not apply to sizeable groups of keys, such as a conventional 4x3 telephone keypad or a full alphabetic keyboard.
  • 3.3.2-D Synchronized audio and video - The voting station SHALL provide synchronized audio output to convey the same information as that which is displayed on the screen. There SHALL be a means by which the voter can disable either the audio or the video output, resulting in a video-only or audio-only presentation, respectively. The system SHALL allow the voter to switch among the three modes (synchronized audio/video, video-only, or audio-only) throughout the voting session while preserving the current votes.

3.3.3 Blindness
  • 3.3.3-A Usability testing by manufacturer for blind voters - The manufacturer SHALL conduct summative usability tests on the voting system using individuals who are blind and SHALL report the test results, using the Common Industry Format, as part of the TDP.
  • 3.3.3-B Audio-tactile interface - The Accessible Voting Station SHALL provide an Audio-Tactile Interface (ATI) that supports the full functionality of the visual ballot interface, as specified in Part 1: 6.2 “Voting Variations”.
  • 3.3.3-B.1 Equivalent functionality of ATI - The ATI of the Accessible Voting Station SHALL provide the same capabilities to vote and cast a ballot as are provided by its visual interface.
  • 3.3.3-B.2 ATI supports repetition - The ATI SHALL allow the voter to have any information provided by the voting system repeated.
  • 3.3.3-B.3 ATI supports pause and resume - The ATI SHALL allow the voter to pause and resume the audio presentation.
  • 3.3.3-B.4 ATI supports transition to next or previous contest - The ATI SHALL allow the voter to skip to the next contest or return to previous contests.
  • 3.3.3-B.5 ATI can skip referendum wording - The ATI SHALL allow the voter to skip over the reading of a referendum so as to be able to vote on it immediately.
  • 3.3.3-C Audio features and characteristics - Voting stations that provide audio presentation of the ballot SHALL do so in a usable way, as detailed in the following sub-requirements.
  • 3.3.3-C.1 Standard connector - The ATI SHALL provide its audio signal through an industry standard connector for private listening using a 3.5mm stereo headphone jack to allow voters to use their own audio assistive devices.
  • 3.3.3-C.2 T-Coil coupling - When a voting system utilizes a telephone style handset or headphone to provide audio information, it SHALL provide a wireless T-Coil coupling for assistive hearing devices so as to provide access to that information for voters with partial hearing. That coupling SHALL achieve at least a category T4 rating as defined by [ANSI01] American National Standard for Methods of Measurement of Compatibility between Wireless Communications Devices and Hearing Aids, ANSI C63.19.
  • 3.3.3-C.3 Sanitized headphone or handset - A sanitized headphone or handset SHALL be made available to each voter.
  • 3.3.3-C.4 Initial volume - The voting system SHALL set the initial volume for each voting session between 40 and 50 dB SPL.
  • 3.3.3-C.5 Range of volume - The audio system SHALL allow the voter to control the volume throughout the voting session while preserving the current votes. The volume SHALL be adjustable from a minimum of 20 dB SPL up to a maximum of 100 dB SPL, in increments no greater than 10 dB.
  • 3.3.3-C.6 Range of frequency - The audio system SHALL be able to reproduce frequencies over the audible speech range of 315 Hz to 10 kHz.
  • 3.3.3-C.7 Intelligible audio - The audio presentation of verbal information SHOULD be readily comprehensible by voters who have normal hearing and are proficient in the language. This includes such characteristics as proper enunciation, normal intonation, appropriate rate of speech, and low background noise. Candidate names SHOULD be pronounced as the candidate intends.
  • 3.3.3-C.8 Control of speed - The audio system SHALL allow the voter to control the rate of speech throughout the voting session while preserving the current votes. The range of speeds supported SHALL include 75% to 200% of the nominal rate.
  • 3.3.3-D Ballot activation - If the voting station supports ballot activation for non-blind voters, then it SHALL also provide features that enable voters who are blind to perform this activation.
  • 3.3.3-E Ballot submission and vote verification - If the voting station supports ballot submission or vote verification for non-blind voters, then it SHALL also provide features that enable voters who are blind to perform these actions.
  • 3.3.3-F Tactile discernability of controls - Mechanically operated controls or keys on an Accessible Voting Station SHALL be tactilely discernible without activating those controls or keys.
  • 3.3.3-G Discernability of key status - The status of all locking or toggle controls or keys (such as the "shift" key) SHALL be visually discernible, and also discernible through either touch or sound.
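
The audio control ranges mandated above (Requirements 3.3.3-C.4, C.5, and C.8: an initial volume between 40 and 50 dB SPL, a 20-100 dB SPL range in steps of at most 10 dB, and a speech rate from 75% to 200% of nominal) can be sketched as a small settings model. This is an illustrative sketch only; the constant and function names are hypothetical, not part of the standard.

```python
# Illustrative model of the VVSG audio-control limits (hypothetical names).
INITIAL_VOLUME_DB = 45   # 3.3.3-C.4: initial volume between 40 and 50 dB SPL
MIN_VOLUME_DB = 20       # 3.3.3-C.5: minimum of the adjustable range
MAX_VOLUME_DB = 100      # 3.3.3-C.5: maximum of the adjustable range
VOLUME_STEP_DB = 10      # 3.3.3-C.5: increments no greater than 10 dB
MIN_RATE = 0.75          # 3.3.3-C.8: 75% of the nominal speech rate
MAX_RATE = 2.00          # 3.3.3-C.8: 200% of the nominal speech rate

def adjust_volume(current_db: int, steps: int) -> int:
    """Move the volume up or down by whole steps, clamped to the allowed range."""
    new_db = current_db + steps * VOLUME_STEP_DB
    return max(MIN_VOLUME_DB, min(MAX_VOLUME_DB, new_db))

def set_rate(requested: float) -> float:
    """Clamp a requested speech rate to the mandated 75%-200% range."""
    return max(MIN_RATE, min(MAX_RATE, requested))
```

Note that both settings must remain adjustable throughout the voting session while preserving the current votes, so a real implementation would apply these changes without resetting ballot state.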

3.3.4 Dexterity
  • 3.3.4-A Usability testing by manufacturer for voters with dexterity disabilities - The manufacturer SHALL conduct summative usability tests on the voting system using individuals lacking fine motor control and SHALL report the test results, using the Common Industry Format, as part of the TDP.
  • 3.3.4-B Support for non-manual input - The Accessible Voting Station SHALL provide a mechanism to enable non-manual input that is functionally equivalent to tactile input. All the functionality of the Accessible Voting Station (e.g., straight party voting, write-in candidates) that is available through the conventional forms of input, such as tactile, SHALL also be available through the non-manual input mechanism.
  • 3.3.4-C Ballot submission and vote verification - If the voting station supports ballot submission or vote verification for non-disabled voters, then it SHALL also provide features that enable voters who lack fine motor control or the use of their hands to perform these actions.
  • 3.3.4-D Manipulability of controls - Keys and controls on the Accessible Voting Station SHALL be operable with one hand and SHALL NOT require tight grasping, pinching, or twisting of the wrist. The force required to activate controls and keys SHALL be no greater than 5 lbs. (22.2 N).
  • 3.3.4-E No dependence on direct bodily contact - The Accessible Voting Station controls SHALL NOT require direct bodily contact or for the body to be part of any electrical circuit.

3.3.5 Mobility
  • 3.3.5-A Clear floor space - The Accessible Voting Station SHALL provide a clear floor space of 30 inches (760 mm) minimum by 48 inches (1220 mm) minimum for a stationary mobility aid. The clear floor space SHALL be level with no slope exceeding 1:48 and positioned for a forward approach or a parallel approach.
  • 3.3.5-B Allowance for assistant - When deployed according to the installation instructions provided by the manufacturer, the voting station SHALL allow adequate room for an assistant to the voter. This includes clearance for entry to and exit from the area of the voting station.
  • 3.3.5-C Visibility of displays and controls - Labels, displays, controls, keys, audio jacks, and any other part of the Accessible Voting Station necessary for the voter to operate the voting system SHALL be easily legible and visible to a voter in a wheelchair with normal eyesight (no worse than 20/40, corrected) who is in an appropriate position and orientation with respect to the Accessible Voting Station.

3.3.5.1 Controls within reach
  • 3.3.5.1-A Forward approach, no obstruction - If the Accessible Voting Station has a forward approach with no forward reach obstruction then the high reach SHALL be 48 inches maximum and the low reach SHALL be 15 inches minimum. See Part 1: Figure 3-1.
  • 3.3.5.1-B Forward approach, with obstruction - If the Accessible Voting Station has a forward approach with a forward reach obstruction, the following sub-requirements SHALL apply (See Part 1: Figure 3-2).
  • 3.3.5.1-B.1 Maximum size of obstruction - The forward obstruction SHALL be no greater than 25 inches in depth, its top no higher than 34 inches and its bottom surface no lower than 27 inches.
  • 3.3.5.1-B.2 Maximum high reach over obstruction - If the obstruction is no more than 20 inches in depth, then the maximum high reach SHALL be 48 inches, otherwise it SHALL be 44 inches.
  • 3.3.5.1-B.3 Toe clearance under obstruction - Space under the obstruction between the finish floor or ground and 9 inches (230 mm) above the finish floor or ground SHALL be considered toe clearance and SHALL comply with the following provisions:
    1. Toe clearance depth SHALL extend 25 inches (635 mm) maximum under the obstruction;
    2. The minimum toe clearance depth under the obstruction SHALL be either 17 inches (430 mm) or the depth required to reach over the obstruction to operate the Accessible Voting Station, whichever is greater; and
    3. Toe clearance width SHALL be 30 inches (760 mm) minimum.
  • 3.3.5.1-B.4 Knee clearance under obstruction - Space under the obstruction between 9 inches (230 mm) and 27 inches (685 mm) above the finish floor or ground SHALL be considered knee clearance and SHALL comply with the following provisions:
    1. Knee clearance depth SHALL extend 25 inches (635 mm) maximum under the obstruction at 9 inches (230 mm) above the finish floor or ground;
    2. The minimum knee clearance depth at 9 inches (230 mm) above the finish floor or ground SHALL be either 11 inches (280 mm) or 6 inches less than the toe clearance, whichever is greater;
    3. Between 9 inches (230 mm) and 27 inches (685 mm) above the finish floor or ground, the knee clearance depth SHALL be permitted to reduce at a rate of 1 inch (25 mm) in depth for each 6 inches (150 mm) in height. (It follows that the minimum knee clearance at 27 inches above the finish floor or ground SHALL be 3 inches less than the minimum knee clearance at 9 inches above the floor.); and
    4. Knee clearance width SHALL be 30 inches (760 mm) minimum.
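
The toe and knee clearance provisions above reduce to simple arithmetic, e.g. the knee clearance depth may shrink by 1 inch for each 6 inches of height between 9 and 27 inches above the floor. The sketch below illustrates those rules; the function names are hypothetical and not part of the standard.

```python
# Illustrative check of the 3.3.5.1-B.3/B.4 clearance rules (hypothetical names).

def min_toe_clearance_depth(reach_depth_in: float) -> float:
    """Minimum toe clearance depth under the obstruction (3.3.5.1-B.3):
    17 inches or the depth required to reach over the obstruction,
    whichever is greater.  The obstruction itself is at most 25 in deep."""
    return max(17.0, min(reach_depth_in, 25.0))

def min_knee_clearance(height_in: float, min_depth_at_9: float) -> float:
    """Minimum knee clearance depth at a given height above the floor
    (3.3.5.1-B.4): between 9 in and 27 in the required depth may reduce
    at a rate of 1 inch per 6 inches of height."""
    assert 9 <= height_in <= 27, "knee clearance is defined from 9 to 27 inches"
    return min_depth_at_9 - (height_in - 9) / 6.0
```

As the standard notes, the minimum knee clearance at 27 inches works out to exactly 3 inches less than the minimum at 9 inches, since (27 - 9) / 6 = 3.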

  • 3.3.5.1-C Parallel approach, no obstruction - If the Accessible Voting Station has a parallel approach with no side reach obstruction then the maximum high reach SHALL be 48 inches and the minimum low reach SHALL be 15 inches. See Part 1: Figure 3-3.
  • 3.3.5.1-D Parallel approach, with obstruction - If the Accessible Voting Station has a parallel approach with a side reach obstruction, the following sub-requirements SHALL apply. See Part 1: Figure 3-4.
  • 3.3.5.1-D.1 Maximum size of obstruction - The side obstruction SHALL be no greater than 24 inches in depth and its top no higher than 34 inches.
  • 3.3.5.1-D.2 Maximum high reach over obstruction - If the obstruction is no more than 10 inches in depth, then the maximum high reach SHALL be 48 inches, otherwise it SHALL be 46 inches.
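
The reach-range rules in 3.3.5.1-A through D amount to a small decision table: 48 inches maximum high reach without an obstruction, dropping to 44 inches for a forward obstruction deeper than 20 inches and to 46 inches for a side obstruction deeper than 10 inches. A sketch (with a hypothetical helper name):

```python
# Illustrative encoding of the 3.3.5.1 maximum-high-reach rules.
def max_high_reach(approach: str, obstruction_depth_in: float = 0.0) -> int:
    """Maximum high reach in inches for the given approach and obstruction
    depth, per Requirements 3.3.5.1-A through 3.3.5.1-D.2."""
    if approach == "forward":
        # 3.3.5.1-A / 3.3.5.1-B.2: 48 in, or 44 in past a 20 in obstruction
        return 48 if obstruction_depth_in <= 20 else 44
    if approach == "parallel":
        # 3.3.5.1-C / 3.3.5.1-D.2: 48 in, or 46 in past a 10 in obstruction
        return 48 if obstruction_depth_in <= 10 else 46
    raise ValueError("approach must be 'forward' or 'parallel'")
```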

3.3.6 Hearing
  • 3.3.6-A Reference to audio requirements - The Accessible Voting Station SHALL incorporate the features listed under Requirement Part 1: 3.3.3-C for voting equipment that provides audio presentation of the ballot.
  • 3.3.6-B Visual redundancy for sound cues - If the voting system provides sound cues as a method to alert the voter, the tone SHALL be accompanied by a visual cue, unless the station is in audio-only mode.
  • 3.3.6-C No electromagnetic interference with hearing devices - No voting equipment SHALL cause electromagnetic interference with assistive hearing devices that would substantially degrade the performance of those devices. The voting equipment, considered as a wireless device, SHALL achieve at least a category T4 rating as defined by [ANSI01] American National Standard for Methods of Measurement of Compatibility between Wireless Communications Devices and Hearing Aids, ANSI C63.19.

3.3.7 Cognition
  • 3.3.7-A General support for cognitive disabilities - The Accessible Voting Station SHOULD provide support to voters with cognitive disabilities.

3.3.8 English proficiency
  • 3.3.8-A Use of ATI - For voters who lack proficiency in reading English, the voting equipment SHALL provide an audio interface for instructions and ballots as described in Part 1: 3.3.3-B.

3.3.9 Speech
  • 3.3.9-A Speech not to be required by equipment - No voting equipment SHALL require voter speech for its operation.

4.2.1 Pollbook audit
  • 4.2.1-A Voting system, support for pollbook audit - The voting system SHALL support a secure pollbook audit that can detect differences in ballot counts between the pollbooks, vote-capture devices, activation devices, and tabulators.
    • The pollbook audit is critical for blocking various threats on voting systems, such as simply inserting additional votes into the voting system. This requirement and its subrequirement are high-level “goal” requirements whose aim is to ensure that the voting system produces records that are adequate and usable by election officials for conducting pollbook audits. This requirement is supported by various other requirements for general reporting and in Part 1:4.3 “Electronic Records”. It can be tested as part of the volume tests discussed in Part 1:7.8 “Reporting” and Part 3:5.3 “Benchmarks”; this type of testing may be useful for assessing the usability of the audit records for typical election environments.
  • 4.2.1-A.1 Records and reports for pollbook audit - Vote-capture devices, activation devices, and tabulators SHALL support production and retention of records and reports that support the pollbook audit.
    • The pollbook audit is only practical when the number of ballots, and of each distinct type of ballot, is available from both the pollbooks and the tabulators.
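
The reconciliation the pollbook audit performs, comparing ballot counts per ballot type between the pollbooks and the tabulators, can be sketched as follows. This is an illustrative sketch under the assumption that counts are available per ballot type from both sources, as the note above requires; it is not the VVSG's own procedure.

```python
from collections import Counter

def pollbook_audit(pollbook_counts: Counter, tabulator_counts: Counter) -> dict:
    """Compare ballot counts per ballot type between pollbooks and tabulators.

    Returns a mapping of ballot type -> discrepancy, where a positive value
    means the tabulators report more ballots than the pollbooks (e.g. the
    vote-insertion threat the note above describes)."""
    diffs = {}
    for ballot_type in set(pollbook_counts) | set(tabulator_counts):
        delta = tabulator_counts[ballot_type] - pollbook_counts[ballot_type]
        if delta != 0:
            diffs[ballot_type] = delta
    return diffs
```

An empty result means the counts reconcile; any nonzero entry is a difference the audit is designed to surface for investigation.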

4.2.2 Hand audit of IVVR record
  • 4.2.2-A IVVR, support for hand audit - The voting system SHALL support a hand audit of IVVRs that can detect differences between the IVVR and the electronic CVR.
    • Hand auditing verifies the reported electronic records; IVVRs offer voters an opportunity to discover attempts to misrecord their votes, and the hand audit ensures that devices that misrecord votes on the electronic record but not the IVVR are very likely to be caught.

      Hand auditing draws on the results from the pollbook audit and the ballot count and vote total. For example, the hand audit cannot detect insertion of identical invalid votes in both paper and electronic records in a VVPAT, but the pollbook audit can detect this since it reconciles the electronic CVR count with the number of voters who cast ballots. Similarly, the hand audit cannot verify that the summary of reported ballots from the tabulator or polling place agrees with the final election result, but this can be checked by the ballot count and vote total audit.

      This requirement and its subrequirement are high-level “goal” requirements whose aim is to ensure that the voting system produces records that are adequate and usable by election officials for conducting audits of IVVR records by hand. It can be tested as part of the volume tests discussed in Part 1: 7.8 “Reporting” and Part 3: 5.3 “Benchmarks”; this type of testing may be useful for assessing the usability of the audit records for manual audits in typical election volumes.

  • 4.2.2-A.1 IVVR, information to support hand auditing - IVVR vote-capture devices and tabulators SHALL provide information to support hand auditing of IVVR.
    • The electronic summary information from the DRE or scanner and the IVVRs must contain sufficient information to carry out the hand audit. Because the hand audit may be carried out at different reporting contexts (for example, a specific tabulator or a whole precinct or polling place may be selected for audit), the voting system must be able to provide reports that support hand auditing at each of the different reporting contexts.
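
The hand-audit comparison described above can be sketched as a per-tabulator, per-contest reconciliation of hand-counted IVVR tallies against the electronic CVR totals. This is an illustrative sketch only; the names and data shapes are hypothetical.

```python
def hand_audit(paper_tallies: dict, electronic_tallies: dict) -> list:
    """For each audited tabulator, flag contests where the hand count of the
    IVVR disagrees with the electronic CVR totals.

    Both arguments map tabulator id -> {contest -> vote count}.  Returns a
    list of (tabulator, contest, paper_count, electronic_count) tuples."""
    discrepancies = []
    for tabulator, contests in paper_tallies.items():
        for contest, paper_count in contests.items():
            cvr_count = electronic_tallies.get(tabulator, {}).get(contest)
            if cvr_count != paper_count:
                discrepancies.append((tabulator, contest, paper_count, cvr_count))
    return discrepancies
```

Because the audit may target different reporting contexts, a real system would produce these tallies at each context (tabulator, precinct, or polling place), as the note above requires.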

4.2.3 Ballot count and vote total audit
  • 4.2.3-A EMS, support for reconciling voting device totals - The EMS SHALL support the reconciliation of the tabulator totals and the final ballot count and vote totals according to the following:
    1. A tabulator whose reported totals are not correctly included in the ballot count and vote total reports, and which is audited, SHALL be detectable;
    2. A difference between the final ballot count and vote totals and the audit records for a tabulator that is audited SHALL be detectable;
    3. The disagreements in records SHALL be detectable even when the election management software is acting in a malicious way; and
    4. The EMS SHALL be able to provide reports that support ballot count and vote total auditing for different reporting contexts.
    • This auditing process, part of the canvassing procedure, is a defense against problematic behavior by the voting device computing the final election ballot count and vote totals. Section 4.3 includes requirements to make this procedure easier to carry out and to add cryptographic protection to the records produced by the voting devices. One complication in making a full voting system support this procedure is the likely mixing of old and new voting devices in a full voting system.

      When the specific reporting context used is the same as for the hand audit, the ballot count and vote totals audit and hand audit together verify that the votes that appear on the IVVR correspond to the votes that are reported in the final election result.

      This requirement and its subrequirement can be tested as part of the volume tests discussed in Part 1 Section 7.8 and Part 3 Section 5.3.

  • 4.2.3-B Records for ballot count/vote total audit - Vote-capture devices, tabulators, and activation devices SHALL produce records that support the ballot count and vote total audit.
    • This auditing step requires that electronic summary records from voting devices can be reconciled with the final election ballot count and vote total reports. The ballot count and vote total records must thus be capable of breaking down totals by voting device as well as by precinct and polling place.

      Sections 4.3 and 4.4 specify content of the IVVR and electronic records, respectively, needed to support this requirement.

4.2.4 Additional behavior to support auditing for accessible IVVR voting systems
  • 4.2.4-A IVVR vote-capture device, observational testing - IVVR vote-capture devices that support assistive technology SHALL support Observational Testing.
    • Voters who are blind, have partial vision, or use non-written languages may not be able to directly verify the IVVR produced by the voting system. This may be because they are using the audio-tactile interface, magnified screen images, or other assistive technology. This raises the possibility that a malicious IVVR vote-capture device could modify these voters’ votes by simply recording the wrong votes on both electronic records and IVVRs. Observational testing provides a defense by using volunteer voters. When observational testing is in use, a malicious IVVR vote-capture device cannot safely assume that a voter using the audio-tactile interface will be unable to check the IVVR record.
  • 4.2.4-B IVVR vote-capture device, authentication for observational testing - The mechanism for authenticating the voter to the accessible IVVR vote-capture device SHALL NOT allow the IVVR vote-capture device to distinguish whether a voter is performing Observational Testing. The poll worker issuing the ballot activation for voters performing Observational Testing SHALL NOT be capable of signaling to the IVVR vote-capture device that it is being tested.
    • Observational testing would not detect attacks if the IVVR vote-capture device were somehow alerted that the voter was carrying out observational testing. Thus, the authentication mechanism must not permit the device to discover this fact.

4.3 Electronic Records

4.3.1 Records produced by voting devices
  • 4.3.1-A All records capable of being exported - The voting system SHALL provide the capability to export its electronic records to files.
    • The exported format for the records must meet the requirements for data export in Part 1: 6.6 “Integratability and Data Export/Interchange”.
  • 4.3.1-B All records capable of being printed - The voting system SHALL provide the ability to produce printed forms of its electronic records.
    1. The printed forms SHALL retain all required information as specified for each record type other than digital signatures;
    2. The printing MAY be done from a different device than the voting device that produces the electronic record; and
    3. It SHALL be possible to print records produced by the central tabulator or EMS on a different device.
    • Printed versions of all records in this chapter are either necessary or extremely helpful to support required auditing steps. Ensuring that the printing can be done from a machine other than the tabulator used to compute the final totals for the election supports the vote total audit, and is a logical consequence of the requirement for a fully open record format.

  • 4.3.1-C Cryptographic protection of records from voting devices - Electronic records SHALL be digitally signed with the Election Signature Key.
    • The digital signatures address the threat that the records might be tampered with in transit or in storage. When combined with the Election Public Key Certificate, the signature also addresses the threat that a legitimate electronic record might be misinterpreted as coming from the wrong voting device or scanner. The use of per-election keys to sign these records addresses the threat that a compromise of a voting device before or after election day might permit production of a false set of records for the election, which could then be reported to the EMS.

      This requirement makes mandatory a similar optional recommendation in [VVSG2005] 7.9.3-d, which applies only to VVPATs; [VVSG2005] contains no requirement that all electronic records be signed.
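
The sign-and-verify flow this requirement mandates can be sketched as follows. Note the hedge: the VVSG requires a true digital signature with the per-election Election Signature Key; this sketch substitutes an HMAC from the Python standard library purely to illustrate how a keyed check detects tampering with a record in transit or storage. All names are hypothetical.

```python
import hashlib
import hmac
import json

def sign_record(record: dict, election_key: bytes) -> str:
    """Attach a keyed digest to an electronic record.  Illustrative only:
    a conforming system would use an asymmetric digital signature with the
    Election Signature Key, not an HMAC."""
    # Canonicalize the record so signing and verification see identical bytes.
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(election_key, payload, hashlib.sha256).hexdigest()

def verify_record(record: dict, signature: str, election_key: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    return hmac.compare_digest(sign_record(record, election_key), signature)
```

Any modification of the record after signing causes verification to fail, which is the property the requirement relies on to detect tampered records.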

4.3.2 Records produced by tabulators
  • 4.3.2-A Tabulator, summary count record - Each tabulator SHALL produce a tabulator Summary Count record including the following:
    1. Device unique identifier from the X.509 certificate;
    2. Time and date of summary record;
    3. The following, both in total and broken down by ballot configuration and precinct:
      1. Number of read ballots;
      2. Number of counted ballots;
      3. Number of rejected electronic CVRs; and
      4. For each N-of-M (including 1-of-M) or cumulative voting contest appearing in any ballot configuration handled by the tabulator:
        1. Number of counted ballots that included that contest, per the definition of K(j,r,t) in Part 1: Table 8-2;
        2. Vote totals for each non-write-in contest choice per the definition of T(c,j,r,t) in Part 1: Table 8-2;
        3. Number of write-in votes;
        4. Number of overvotes per the definition of O(j,r,t) in Part 1: Table 8-2; and
        5. Number of undervotes per the definition of U(j,r,t) in Part 1: Table 8-2.
    • In producing this summary count record, the tabulator shall assume that no provisional or challenged ballots are accepted.
    • The Tabulator Summary Count Record is essentially an estimated summary report from the viewpoint of the individual tabulator, for auditing purposes. Since the eventual disposition of provisional ballots, challenged ballots, and write-in votes is unknown at the close of polls, arbitrary assumptions are made in order to make a summary possible. All provisional and challenged ballots are assumed rejected, and all write-in votes are effectively aliased to a single contest choice that is not one of the choices "on the ballot." The quantities provided for each contest should balance in the sense that

      N × K = sum of non-write-in vote totals (T) + write-ins + overvotes (O) + undervotes (U).

      In addition to the reporting context corresponding to the tabulator itself, reporting contexts corresponding to the different ballot configurations handled by that tabulator are synthesized. These contexts are quite narrow in scope as they include only the ballots of a specific configuration that were counted by a specific tabulator. The tabulator is not required to handle the complexities of reporting contexts that are outside of its scope.

      This record is sufficient to support random audits of paper records. The record will not contain the results of election official review of review-required ballots, so auditors can use this record to verify that the number of these ballots is correct, but will need to do further steps to verify that these ballots were handled correctly. This record can be used to verify a correct result from a system under parallel testing. This record can be used to randomly check electronic totals, when the final results are given broken out by voting system or scanner. When used in the Ballot Count and Vote Total Audit, this record blocks the class of attacks that involves tampering with the EMS computer used to compute the final totals. The tabulator summary could in principle be published for each voting system, along with corrected final totals for each precinct and for absentee ballots, to show how the final election outcomes were computed, though care would have to be taken to avoid violations of voter privacy.

      For auditing, this record must be output in a human-readable format, such as a printed report.

      This requirement clarifies [VVSG2005] I.2.4.3, which describes the vote data summary reports that all voting systems are required to produce. While [VVSG2005] I.2.4.3 applies to voting systems as a whole, this requirement specifically requires that all vote tabulators produce such a report.
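
The balance identity stated above, N × K = sum of non-write-in vote totals (T) + write-ins + overvotes (O) + undervotes (U), is a mechanical check an auditor can run against each contest in the summary count record. A minimal sketch (hypothetical function name):

```python
def contest_balances(n: int, k: int, vote_totals: list, write_ins: int,
                     overvotes: int, undervotes: int) -> bool:
    """Check the balance identity for an N-of-M contest in a tabulator
    Summary Count record:

        N x K == sum(T) + write-ins + overvotes (O) + undervotes (U)

    where K is the number of counted ballots that included the contest and
    vote_totals are the per-choice non-write-in totals T."""
    return n * k == sum(vote_totals) + write_ins + overvotes + undervotes
```

For example, a 1-of-M contest on 100 counted ballots with choice totals of 60 and 30, 5 write-ins, 2 overvotes, and 3 undervotes balances, since 60 + 30 + 5 + 2 + 3 = 100.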

  • 4.3.2-B Tabulator, summary count record handling - The tabulator SHALL handle the summary count record according to the following:
    1. The record SHALL be transmitted to the EMS with the other electronic records;
    2. It SHALL be stored in the election archive, if available; and
    3. It SHALL be stored in the voting system's event log.

  • 4.3.2-C Tabulator, collection of ballot images record - Tabulators SHOULD produce a record of ballot images that includes:
    1. Time and date of creation of complete ballot image record; and
    2. Ballot images recorded in randomized order by the tabulator for the election. For each voted ballot, this includes:
      1. Ballot configuration and counting context;
      2. Whether the ballot is accepted or rejected;
      3. For each contest:
        1. The choice recorded, including undervotes and write-ins; and
        2. Any information collected by the vote-capture device electronically about each write-in;
      4. Information specifying whether the ballot is provisional, and providing a unique identifier for the ballot, as well as the provisional category information required to support Requirement Part 1: 7.7.2-A.6.
    • This record is not required for auditing; however, it is useful.

  • 4.3.2-C.1 DRE, collection of ballot images record - DREs SHALL produce a record of ballot images that includes:
    1. Time and date at poll closing; and
    2. Ballot images recorded in randomized order by the DRE for the election. For each voted ballot, this includes:
      1. Ballot configuration and counting context;
      2. Whether the ballot is accepted or rejected;
      3. For each contest:
        1. The choice recorded, including undervotes and write-ins; and
        2. Any information collected by the vote-capture device electronically about each write-in;
      4. Information specifying whether the ballot is provisional, and providing a unique identifier for the ballot, as well as the provisional category information required to support Requirement Part 1: 7.7.2-A.6.

  • 4.3.2-C.2 Tabulator, collection of cast votes handling - Tabulators that produce the collection of ballot images record SHALL handle the record according to the following:
    1. The record SHALL be transmitted to the EMS with the other electronic records;
    2. It SHALL be stored in the election archive, if available; and
    3. It SHALL be stored in the voting system's event log.

  • 4.3.2-D Tabulator, electronic records event log record handling - The tabulator SHALL digitally sign the event log, transmit the signed event log to an EMS, and retain a record of the transmission.

4.3.3 Records produced by the EMS
  • 4.3.3-A EMS tabulator summary count record - The EMS tabulator Summary Count Record SHALL include:
    1. Unique identifiers for each tabulator contained in the summary;
    2. For tabulators with public keys:
      1. The public key for each tabulator in the summary;
      2. The Election Signature Key certification and closeout record; and
      3. Signed tabulator summary count record.
    3. Summary ballot counts and vote totals by tabulator, precinct, and polling place.
      1. Precinct totals include subtotals from each tabulator used in the precinct.
    • Requirements in Part 1 Section 7.8 ensure that the EMS is capable of producing a report containing this information. This report is required to allow checking of the final ballot counts and vote totals, based on their agreement with local totals, without relying on the correct operation of equipment and execution of procedures at the tabulation center. The goal is to provide cryptographic support for a process that is currently done in a manual, procedural way, which may be subject to undetected error or tampering. This record can be used to detect most problems at the tabulation center. Item 3.1 is needed for cases when a tabulator, such as a DRE, contains votes from multiple precincts. Note: The requirement supports older voting systems to allow for transitional upgrades of fielded equipment.

      This requirement extends [VVSG2005] I.2.4.3; this requirement specifically requires that each tabulation center EMS produce this report.

  • 4.3.3-A.1 Tabulator, report combination for privacy - The EMS SHALL be capable of combining tabulator reports to protect voter privacy in cases when there are tabulators with few votes.

  • 4.3.3-B EMS, precinct summary count records - The EMS SHALL produce a report for each precinct including:
    1. Each tabulator included in the precinct with its unique identifier;
    2. Number of read ballots;
    3. Number of counted ballots;
    4. Number of rejected electronic CVRs; and
    5. For each N-of-M (including 1-of-M) or cumulative voting contest appearing in any ballot configuration handled by the tabulator:
      1. Number of counted ballots that included that contest, per the definition of K(j,r,t) in Part 1: Table 8-2;
      2. Vote totals for each non-write-in contest choice per the definition of T(c,j,r,t) in Part 1: Table 8-2; and
      3. Number of write-in votes.
    • This report supports hand auditing of paper records against the final totals, the ballot count and vote totals audit, and the pollbook audit.
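
The per-precinct quantities listed above can be illustrated with a short sketch. The CVR structure and field names below are hypothetical, invented for this example; only the counting rules (counted ballots per contest, vote totals per non-write-in choice, and write-in counts) follow 4.3.3-B.

```python
from collections import Counter

def precinct_summary(cvrs):
    """Aggregate hypothetical electronic CVRs into the per-contest counts
    named in 4.3.3-B: counted ballots including each contest (K(j,r,t)),
    vote totals per non-write-in choice (T(c,j,r,t)), and write-in counts."""
    ballots_with_contest = Counter()
    vote_totals = Counter()
    write_ins = Counter()
    for cvr in cvrs:
        for contest, choices in cvr["contests"].items():
            ballots_with_contest[contest] += 1
            for choice in choices:
                if choice == "write-in":
                    write_ins[contest] += 1
                else:
                    vote_totals[(contest, choice)] += 1
    return ballots_with_contest, vote_totals, write_ins

# Three example CVRs from one precinct.
cvrs = [
    {"contests": {"Governor": ["Smith"]}},
    {"contests": {"Governor": ["write-in"]}},
    {"contests": {"Governor": ["Smith"], "Mayor": ["Lee"]}},
]
ballots, totals, writeins = precinct_summary(cvrs)
```

A real report would additionally carry tabulator identifiers and read/counted/rejected ballot numbers per items a through d.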

  • 4.3.3-C EMS, precinct adjustment record - The EMS SHALL produce a report showing the changes made to each contest based on the resolution of provisional ballots, challenged ballots, and write-in choices, along with the date and time of the report.
    • This report may be produced more than once during the course of an election as provisional ballots, challenged ballots, and write-in choices are resolved. It can be used to support the pollbook audit, by showing that the number of ballots processed does not exceed the total recorded by the tabulator, as well as the ballot count and vote totals audit. Many jurisdictions resolve provisional and challenged ballots in groups to protect voter privacy.

4.3.4 Digital signature verification
  • 4.3.4-A Tabulator, verify signed records - For each tabulator producing electronic records, the EMS SHALL verify:
    1. The Election Public Key Certificate associated with the record is valid for the current election, using the public key of the tabulator to verify the certificate as specified in Part 1: 5.1 “Cryptography”;
    2. The election ID and timestamp of the record agrees with the current election and the values in the Election Public Key Certificate; and
    3. The digital signature on the record is correct, using the Election Public Key to verify it.
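
The three verification steps above can be sketched as follows. This is a simplified illustration: the record and certificate layouts are invented for the example, and HMAC-SHA256 stands in for the public-key digital signatures the guideline actually requires (see 5.1), so the structure of the checks, not the cryptography, is the point.

```python
import hashlib
import hmac
import json

def sign(key, payload):
    """Stand-in 'signature': HMAC-SHA256 over a canonical JSON encoding.
    A conforming system would use FIPS-approved public-key signatures."""
    data = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(key, data, hashlib.sha256).digest()

def verify_record(record, cert, device_key, election_key, current_election):
    # 1. The Election Public Key Certificate is valid for the current
    #    election, checked with the tabulator's device key.
    if not hmac.compare_digest(cert["signature"], sign(device_key, cert["body"])):
        return False
    if cert["body"]["election_id"] != current_election:
        return False
    # 2. The record's election ID agrees with the certificate (a real
    #    check would also compare the timestamp fields).
    if record["body"]["election_id"] != cert["body"]["election_id"]:
        return False
    # 3. The record's signature verifies under the Election key.
    return hmac.compare_digest(record["signature"], sign(election_key, record["body"]))
```

The ordering matters: the certificate is validated first, because steps 2 and 3 trust the values it carries.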

4.3.5 Ballot counter
  • 4.3.5-A Ballot counter - Tabulators and vote-capture devices SHALL maintain a count of the number of ballots read at all times during a particular test cycle or election.
    • For auditability, the ballot count must be maintained (incremented each time a ballot is read) rather than calculated on demand (by counting the ballots currently in storage). This requirement restates [VVSG2005] I.2.1.8.

  • 4.3.5-B Ballot counter, availability - Tabulators SHALL enable election judges to determine the number of ballots read at all times during a particular test cycle or election without disrupting any operations in progress.
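
A minimal sketch of the maintained counter described above, with hypothetical method names; the essential property is that the count is incremented as each ballot is read, never recomputed from storage, and can be read at any time without disrupting operations.

```python
class BallotCounter:
    """Ballot counter per 4.3.5-A/-B (sketch): a maintained count,
    incremented on each read, never derived from stored ballots."""

    def __init__(self):
        self._count = 0

    def ballot_read(self):
        # Incremented each time a ballot is read.
        self._count += 1

    @property
    def count(self):
        # Non-disruptive read for election judges at any time.
        return self._count
```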

4.4 Independent Voter-Verifiable Records

4.4.1 General requirements

  • 4.4.1-A IVVR vote-capture device, IVVR creation - The IVVR vote-capture device SHALL create an independent voter verifiable record.
  • 4.4.1-A.1 IVVR vote-capture device, IVVR direct verification by voters - IVVR vote-capture devices SHALL create an IVVR that voters can verify without software or programmable devices, excepting assistive technology.
    • The exclusion of software and programmable devices from the voter verification process is necessary for the system to be software independent. It suffices to meet this requirement that most voters can review the record directly. Voters who use some assistive technologies may not be able to directly review the record; for those cases, this requirement allows Observational Testing to be used to determine whether the assistive technology is operating without error or fraud.

  • 4.4.1-A.2 IVVR vote-capture device, IVVR direct review by election officials - IVVR vote-capture devices SHALL create an IVVR that election officials and auditors can review without software or programmable devices.
  • 4.4.1-A.3 IVVR vote-capture device, support for hand auditing - IVVR vote-capture devices SHALL create an IVVR that election officials can use without software or programmable devices to verify that the reported electronic totals are correct.
    • The records must support a hand audit that uses no programmable devices to read or interpret the records. The hand audit may provide a statistical basis for other larger audits or recounts performed using technology (such as OCR).
  • 4.4.1-A.4 IVVR vote-capture device, IVVR use in recounts - IVVR vote-capture devices SHALL create an IVVR that election officials can use to reconstruct the full set of totals from the election.
  • 4.4.1-A.5 IVVR vote-capture device, IVVR durability - IVVR vote-capture devices SHALL create an IVVR that will remain unchanged for a minimum of 22 months, unaffected by power failure, software failure, or other technology failure.
  • 4.4.1-A.6 IVVR vote-capture device, IVVR tamper evidence - IVVR vote-capture devices SHALL create an IVVR that shows evidence of tampering or alteration by the voting system.
    • [Unfortunately, the steps to take if such tampering has occurred is not defined.]
  • 4.4.1-A.7 IVVR vote-capture device, IVVR support for privacy - IVVR vote-capture devices SHALL create an IVVR for which procedures or technology can be used to protect voter privacy.
    • Privacy protection includes a method to separate the order of voters from the order of records or procedural means to ensure that information relating to the order of voters, including time a record is created, can be protected. Privacy also includes other methods to make records hard to identify, normally by having them be indistinguishable from each other.
  • 4.4.1-A.8 IVVR vote-capture device, IVVR public format - IVVR vote-capture devices SHALL create an IVVR in a non-restrictive, publicly-available format, readable without confidential, proprietary, or trade secret information.
  • 4.4.1-A.9 IVVR vote-capture device, IVVR unambiguous interpretation of cast vote - Each IVVR SHALL contain a human-readable summary of the electronic CVR. In addition, all IVVR SHALL contain audit-related information including:
    1. Polling place;
    2. Reporting context;
    3. Ballot configuration;
    4. Date of election; and
    5. Complete summary of voter’s choices.
  • All IVVR contain some human-readable content. In addition, some IVVR may use machine-readable content to make counting or recounting more efficient. For example, PCOS systems place a human-readable representation of the votes beside a machine-readable set of ovals to be marked by a human or a machine.

    The human-readable content of the IVVR must contain all information needed to interpret the cast vote. This is necessary to ensure that hand audits and recounts can be done using only the human-readable parts of the paper records.

    This requirement generalizes [VVSG2005] I.7.9.1-b, I.7.9.1-c and I.7.9.3-h by extending its provisions to include all IVVR.

  • 4.4.1-A.10 IVVR vote-capture device, no codebook required to interpret - The human-readable ballot contest and choice information on the IVVR SHALL NOT require additional information, such as a codebook, lookup table, or other information, to unambiguously determine the voter’s ballot choices.
    • The hand audit of records requires the ability for auditors to verify that the electronic CVR as seen and verified by voters is the same as the electronic CVR that was counted. This requires that the auditor have all information necessary on the IVVR to interpret completely how the contests were voted. If an external codebook or lookup table were needed to interpret the IVVR, there would be no way for the auditor to be certain that the codebook had not changed since the voter used it.

  • 4.4.1-A.11 IVVR vote-capture device, multiple physical media - When a single IVVR spans multiple physical media, each physical piece of media SHALL include polling place, reporting context, ballot configuration, date of election, and number of the media and total number of the media (e.g. page 1 of 4).
  • 4.4.1-A.12 IVVR vote-capture device, IVVR accepted or rejected - The IVVR SHALL be marked as accepted or rejected in the presence of the voter.
    • Unambiguous verification or rejection markings address the threat that the voting device might attempt to accept or reject ballot summaries without the voter’s approval. This requirement extends [VVSG2005] I.7.9.2-b to all IVVR voting systems.
  • 4.4.1-A.13 IVVR vote-capture device, IVVR accepted or rejected for multiple physical media - Each piece of IVVR physical media SHALL be individually accepted or rejected by the voter.
  • 4.4.1-A.14 IVVR vote-capture device, IVVR non-human-readable contents permitted - The IVVR MAY include machine-readable encodings of the electronic CVR and other information that is not human-readable.
  • 4.4.1-A.15 IVVR vote-capture device, IVVR machine-readable part contains same information as human-readable part - If a non-human-readable encoding is used on the IVVR, it SHALL contain the entirety of the human-readable information on the record.
  • 4.4.1-A.16 IVVR vote-capture device, IVVR machine-readable contents may include error correction/detection information - If a non-human-readable encoding is used on the IVVR, the encoding MAY also contain information intended to ensure the correct decoding of the information stored within, including:
    1. Checksums;
    2. Error correcting codes;
    3. Digital signatures; and
    4. Message Authentication Codes.
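
Two of the permitted safeguards, a checksum and a MAC, can be sketched with standard library primitives. The framing (a fixed-size trailer appended to the payload) is an assumption for illustration; an actual encoding would follow the publicly documented format required by 4.4.1-A.17.

```python
import hashlib
import hmac
import zlib

def encode_with_integrity(payload: bytes, mac_key: bytes) -> bytes:
    """Append a CRC32 checksum and an HMAC-SHA256 tag to a
    machine-readable payload, two of the safeguards 4.4.1-A.16 permits."""
    crc = zlib.crc32(payload).to_bytes(4, "big")
    tag = hmac.new(mac_key, payload, hashlib.sha256).digest()
    return payload + crc + tag

def decode_with_integrity(blob: bytes, mac_key: bytes) -> bytes:
    """Recover the payload, rejecting it if either safeguard fails."""
    payload, crc, tag = blob[:-36], blob[-36:-32], blob[-32:]
    if zlib.crc32(payload).to_bytes(4, "big") != crc:
        raise ValueError("checksum mismatch")
    expected = hmac.new(mac_key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("MAC mismatch")
    return payload
```

The checksum catches accidental corruption cheaply; the keyed MAC additionally resists deliberate alteration of the machine-readable part.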

  • 4.4.1-A.17 IVVR vote-capture device, public format for IVVR non-human-readable data - Any non-human-readable information on the IVVR SHALL be presented in a fully disclosed public format.

4.4.2 VVPAT

4.4.2.1 VVPAT components and definitions
  • 4.4.2.1-A VVPAT, definition and components - A VVPAT SHALL consist minimally of the following fundamental components:
    1. A voting device, on which a voter makes selections and prepares to cast a ballot;
    2. A printer that prints a VVPR summary of the voter’s ballot selections, and that allows the voter to compare it with the electronic ballot selections;
    3. A mechanism by which the voter may indicate acceptance or rejection of the VVPR;
    4. Ballot box/cartridge to contain accepted and voided VVPRs; and
    5. A VVPR for each electronic CVR. The VVPR may be printed on a separate sheet for each VVPR (“cut-sheet VVPAT”) or on a continuous paper roll (“paper-roll VVPAT”).

4.4.2.2 VVPAT printer/computer interactions
  • 4.4.2.2-A VVPAT, printer connection to voting system - The VVPAT printer SHALL be physically connected via a standard, publicly documented printer port using a standard communications protocol.
  • 4.4.2.2-B VVPAT, printer able to detect errors - The VVPAT SHALL detect printer errors that may prevent VVPRs from being correctly displayed, printed, or stored, such as depletion of consumables (paper, ink, or toner), paper jams or misfeeds, and memory errors.

  • 4.4.2.2-C VVPAT, error handling specific requirements - If a printer error or malfunction is detected, the VVPAT SHALL:
    1. Present a clear indication to the voter and election officials of the malfunction. This must indicate clearly whether the current voter’s vote has been cast, discarded, or is waiting to be completed;
    2. Suspend voting operations until the problem is resolved;
    3. Allow canceling of the current voter’s electronic CVR by election officials in the case of an unrecoverable error; and
    4. Protect the privacy of the voter while the error is being resolved.

  • 4.4.2.2-C.1 VVPAT, general recovery from misuse or voter error - Voter actions SHALL NOT be capable of causing a discrepancy between the VVPR and its corresponding electronic CVR.

4.4.2.3 Protocol of operation
  • 4.4.2.3-A VVPAT, prints and displays a paper record - The VVPAT SHALL provide capabilities for the voter to print a VVPR and compare with a summary of the voter’s electronic ballot selections prior to the voter casting a ballot.
  • 4.4.2.3-B VVPAT, easy comparison - The VVPAT format and presentation of the VVPR and electronic summaries of ballot selections SHALL be designed to facilitate the voter’s rapid and accurate comparison.
  • 4.4.2.3-C VVPAT, vote acceptance process requirements - When a voter indicates that the VVPR is to be accepted, the VVPAT SHALL:
    1. Immediately print an unambiguous indication that the vote has been accepted, in view of the voter;
    2. Electronically store the CVR as a cast vote; and
    3. Deposit the VVPR into the ballot box or other receptacle.

  • 4.4.2.3-D VVPAT, vote rejection process requirements - When a voter indicates that the VVPR is to be rejected, the VVPAT SHALL:
    1. Immediately print an unambiguous indication that the vote has been rejected, in view of the voter;
    2. Electronically store a record that the VVPR was rejected including the summary of choices; and
    3. Deposit the rejected VVPR into the ballot box or other receptacle.

  • 4.4.2.3-D.1 VVPAT, rejected vote configurable limits per voter - The VVPAT SHALL have the capacity to be configured to limit the number of times a single voter may reject a VVPR without election official intervention. The VVPAT SHALL support limits between zero (any rejected VVPR requires election official intervention) and five, and MAY support an unlimited number of rejections without election official intervention.
    • This requirement permits election officials to configure the VVPAT to limit the number of times a single voter can reject VVPRs before election official intervention is required. This allows equipment to be configured to meet election law of the jurisdiction.

      This addresses the threat that a single voter may reject a large number of VVPRs, thus depleting supplies.

      This also helps to address the threat that a malicious or malfunctioning VVPAT may indicate a different set of voter choices on the screen than it does on paper and in the electronic records. Such an attack can only be detected by the existence of large numbers of rejected VVPRs. Requiring election official intervention each time a voter rejects a VVPR allows election officials to quickly recognize a malfunctioning or malicious machine.

      If the VVPAT is behaving maliciously, it can simply ignore this limit. Voters may notice this and complain, and if the VVPAT is chosen for a hand audit, the auditors will notice a large number of rejected VVPRs and may try to verify whether election officials noticed a large number of problems with the VVPAT.

  • 4.4.2.3-D.2 VVPAT, rejected vote limits per machine - The VVPAT SHALL have the capacity to limit the total number of VVPRs that a machine may reject before election official intervention is required. The VVPAT SHALL permit the setting of no limit, so that no number of total rejected VVPRs requires immediate election official intervention.
    • This requirement supports the procedural defense of taking a VVPAT offline when too many voters complain about its behavior.

  • 4.4.2.3-D.3 VVPAT, rejected vote election official intervention - When a VVPAT reaches a configured limit of rejected VVPRs per voter or per machine, it SHALL do the following:
    1. Remove any indication of the voter’s choices from the screen;
    2. Place the VVPR that has been rejected into the ballot box or other receptacle;
    3. Clearly display that a VVPR has been rejected and indicate the need for election official intervention; and
    4. Suspend normal operations until re-enabled by an authorized election official.
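
The per-voter and per-machine limits of 4.4.2.3-D.1 and -D.2 can be modeled together; the class and method names below are hypothetical. A limit of None models the "no limit" setting, and a per-voter limit of zero means any rejection requires intervention.

```python
class RejectionLimiter:
    """Sketch of configurable rejected-VVPR limits (4.4.2.3-D.1/-D.2).
    None means no limit; record_rejection() returns True when election
    official intervention is required."""

    def __init__(self, per_voter=None, per_machine=None):
        self.per_voter = per_voter
        self.per_machine = per_machine
        self.voter_rejections = 0
        self.machine_rejections = 0

    def new_voter(self):
        # The per-voter count resets for each voting session.
        self.voter_rejections = 0

    def record_rejection(self):
        self.voter_rejections += 1
        self.machine_rejections += 1
        if self.per_voter is not None and self.voter_rejections > self.per_voter:
            return True
        if self.per_machine is not None and self.machine_rejections > self.per_machine:
            return True
        return False
```

When this returns True, the device would carry out the four steps of 4.4.2.3-D.3 (clear the screen, deposit the rejected VVPR, display the condition, and suspend until re-enabled).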

4.4.2.4 Human-readable VVPR contents for VVPAT
  • 4.4.2.4-A VVPAT, machine readability of VVPAT VVPR - The human-readable contents of the VVPAT VVPR SHALL be created in a manner that is machine-readable by optical character recognition.
  • 4.4.2.4-A.1 VVPAT, support for audit of machine-read representations - The VVPAT SHALL include supporting software, hardware, and documentation of procedures to verify the agreement between the machine read content and the content as reviewed directly by an auditor.
  • 4.4.2.4-B VVPAT, paper-roll, required human-readable content per roll - Paper-roll VVPATs SHALL mark paper rolls with the following:
    1. Polling place;
    2. Reporting context;
    3. Date of election;
    4. If multiple paper rolls were produced during this election on this device, the number of the paper roll (e.g., Roll #2); and
    5. A final summary line specifying how many total VVPRs appear on the roll, and how many accepted VVPRs appear on the roll.

  • 4.4.2.4-C VVPAT, paper-roll, information per VVPR - Paper-roll VVPATs SHALL include the following on each VVPR:
    1. Ballot configuration;
    2. Type of voting (e.g., provisional, early, etc.);
    3. Complete summary of voter’s choices;
    4. For each ballot contest:
      1. Contest name (e.g., “Governor”);
      2. Any additional information needed for unambiguous interpretation of the VVPR;
      3. A clear indication, if the contest was undervoted; and
      4. A clear indication, if the choice is a write-in vote.
    5. An unambiguous indication of whether the ballot has been accepted or rejected by the voter.
    • The paper roll and the electronic CVRs, together, must give an auditor all information needed to do a meaningful hand audit or recount. The contents in this requirement ensure that the human-readable parts of the paper rolls are sufficient to recount the election and to audit the device totals.

  • 4.4.2.4-D VVPAT, paper-roll, VVPRs on a single roll - Paper-roll VVPATs SHALL NOT split VVPRs across rolls; each VVPR must be contained in its entirety by the paper roll.
  • 4.4.2.4-E VVPAT, cut-sheet, content requirements per electronic CVR - Cut-sheet VVPATs SHALL include the following on each VVPR:
    1. Polling place;
    2. Reporting context;
    3. Date of election;
    4. Ballot configuration;
    5. Type of voting (e.g., provisional, early, etc.);
    6. Complete summary of voter’s choices;
    7. For each ballot contest:
      1. Contest name (e.g., “Governor”);
      2. Any additional information needed for unambiguous interpretation of the VVPR;
      3. A clear indication, if the contest was undervoted; and
      4. A clear indication, if the choice is a write-in vote.
    8. An unambiguous indication of whether each sheet has been accepted or rejected by the voter.

  • 4.4.2.4-F VVPAT, cut-sheet, VVPR split across sheets - If a cut-sheet VVPAT splits VVPRs across multiple sheets of paper, each sheet SHALL include:
    1. Page number of this sheet and total number of sheets (e.g., page 1 of 4);
    2. Ballot configuration;
    3. Reporting context;
    4. Unambiguous indication that the sheet’s contents have been accepted or rejected by the voter; and
    5. Any correspondence information included to link the VVPR to its corresponding electronic CVR.

  • 4.4.2.4-F.1 VVPAT, cut-sheet, ballot contests not split across sheets - If a cut-sheet VVPAT splits VVPRs across multiple sheets of paper, it SHALL NOT split ballot contests across sheets.
  • 4.4.2.4-F.2 VVPAT, cut-sheet, VVPR sheets verified individually - If a cut-sheet VVPAT splits VVPRs across multiple sheets of paper, the ballot choices on each sheet SHALL be submitted to the voter for verification separately according to the following:
    1. The voter shall be presented a verification screen for the contents of each sheet separately at the same time as the voter is able to verify the contents of the part of the VVPR on the sheet;
    2. When a voter accepts or rejects the contents of a sheet, the votes contained on that sheet and verification screen shall be committed to memory, regardless of the verification of any other sheet by the same voter;
    3. Configurable limits on rejected VVPRs per voter shall count each rejected sheet as a rejected VVPR;
    4. Configurable limits on rejected VVPRs per machine shall not count more than one rejected VVPR per voter; and
    5. When a rejected VVPR requires election official intervention, the VVPAT shall indicate which sheets have been accepted and which rejected.

  • 4.4.2.5-A VVPAT, identification of electronic CVR correspondence - The VVPAT SHALL provide a capability to print information on each VVPR sufficient for auditors to identify from an electronic CVR its corresponding VVPR and from a VVPR its corresponding electronic CVR. It SHALL be possible for election officials to enable or disable this capability.
  • 4.4.2.5-A.1 VVPAT, CVR correspondence identification hidden from voter - Any information on the VVPAT VVPR that identifies the corresponding electronic CVR SHOULD NOT be possible for the voter to read or copy by hand.
    • This requirement addresses the threat that some voters might copy down the correspondence information to prove to some third party how they have voted. If the correspondence information is not possible for voters to copy down by hand, they must use a camera or similar technology to prove how they voted—in which case, the correspondence information makes vote buying no easier than it already was.
  • 4.4.2.5-A.2 VVPAT, CVR correspondence identification viewable to auditors - The VVPAT manufacturer SHALL include a capability for auditors to verify the correspondence between the electronic CVR and VVPR pairs, if the correspondence information is printed on the VVPR.
  • 4.4.2.5-A.3 VVPAT, CVR correspondence identification in reported ballot images - When electronic CVR correspondence identification is printed on the VVPAT VVPR, the correspondence information SHALL be included in the collection of ballot images record sent to the EMS.

4.4.2.6 Paper-roll VVPAT privacy and audit-support
  • 4.4.2.6-A VVPAT, paper-roll, VVPRs secured immediately after vote cast - Paper-roll VVPATs SHALL store the part of the paper roll containing VVPRs in a secure, opaque container, immediately after they are verified.
  • 4.4.2.6-B VVPAT, paper-roll, privacy during printer errors - Procedures for recovery from printer errors on paper-roll VVPATs SHALL NOT expose the contents of previously cast VVPRs.
  • 4.4.2.6-C VVPAT, paper-roll, support tamper-seals and locks - Paper-roll VVPATs SHALL be designed so that, when the rolls are removed from the voting device:
    1. All paper containing VVPRs is contained inside the secure, opaque container;
    2. The container supports being tamper-sealed and locked; and
    3. The container supports being labeled with the device serial number, precinct, and other identifying information to support audits and recounts.
    • Paper-roll VVPATs must support good procedures to protect voter privacy. The supported procedure in this case is immediately locking and tamper-sealing each VVPAT container upon removing it from the voting device. This is consistent with the goal of treating the paper rolls bearing VVPRs like paper ballots, stored in a locked and sealed box.

      If the paper roll cartridge is locked and sealed before the start of voting, and some mechanism in the cartridge prevents extraction of the used paper roll collected inside the cartridge, locking and sealing the cartridge a second time at poll closing would be necessary only for preventing further VVPRs being printed on the paper roll.

  • 4.4.2.6-D VVPAT, paper-roll, mechanism to view spooled records - If a continuous paper spool is used to store VVPRs, the manufacturer SHALL provide a mechanism for an auditor to unspool the paper, view each VVPR in its entirety, and then respool the paper, without modifying the paper in any way or causing the paper to become electrically charged.

4.4.3 PCOS systems
  • 4.4.3-A Optical scanner, optional marking - Optical scanners MAY add markings to each paper ballot, such as:
    1. Unique record identifiers to allow individual matching of paper and electronic CVRs;
    2. Digital signatures; and
    3. Batch information.

  • 4.4.3-A.1 Optical scanner, optional marking restrictions - Optical scanners that add markings to scanned paper ballots SHALL NOT be capable of altering the contents of the human-readable CVR on the ballot. Specifically, optical scanners capable of adding markings to scanned ballots SHALL NOT permit:
    1. Marking in the regions of the ballot that indicate voter choices;
    2. Marking in the regions of the ballot that contain the human-readable description of the marked choice; and
    3. Marking in regions reserved for timing marks.
    • If the scanner could alter the human-readable contents of the ballot, or mark the ballot, after scanning, then the paper records stored by the scanner could no longer be considered voter-verifiable, and the optical scan system would no longer be software independent.

Chapter 5: General Security Requirements

5.1 Cryptography

5.1.1 General cryptographic implementation
  • 5.1.1-A Cryptographic module validation - Cryptographic functionality SHALL be implemented in a FIPS 140-2 validated cryptographic module operating in FIPS mode.
    • Use of validated cryptographic modules ensures that the cryptographic algorithms used are secure and their correct implementation has been validated. Moreover, the module's security requirements have been validated to a specified security level. The current version of FIPS 140 and information about the NIST Cryptographic Module Validation Program are available at: http://csrc.nist.gov/cryptval/. Note that a voting device may use more than one cryptographic module, and quite commonly will use a “software” module for some functions and a “hardware” module for others.

      This requirement is a generalization of [VVSG2005] I.7.5.1-b, which is a cryptographic requirement with a limited scope to the encryption of data across public communication networks. That requirement mandated use of "an encryption standard currently documented and validated for use by an agency of the U.S. government". Use of public communication networks is forbidden in this document except for transmitting unofficial results or communicating with an electronic pollbook.

      This requirement extends and strengthens [VVSG2005] I.7.8.2, which required use of a validated cryptographic module if digital signatures were used in voting systems with independent verification. Use of digital signatures is required in this document, and this requirement mandates the use of a FIPS-validated module.

      This requirement is a generalization of [VVSG2005] I.7.4.6-d, which is a cryptographic requirement with a limited scope. That requirement mandated the use of FIPS 140-2 level 1 or higher validated cryptographic modules if hash functions or digital signatures are used during software validation.

      Lastly, this requirement restates and strengthens [VVSG2005] I.7.9.3-a by requiring all cryptographic functionality be implemented in FIPS validated modules. [VVSG2005] I.7.9.3-a provides an exception when a cryptographic voting system uses cryptographic algorithms that are necessarily different from any algorithms that have approved CMVP implementations.

  • 5.1.1-B Cryptographic strength - Programmed devices that apply cryptographic protection SHALL employ NIST-approved algorithms with a security strength of at least 112 bits to protect sensitive voting information and election records. Message Authentication Codes of 96 bits are conventional in standardized secure communication protocols and acceptable for protecting voting records and systems; however, the key used with such MACs SHALL also have a security strength of at least 112 bits.
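
As a concrete illustration of the stated strengths: HMAC-SHA256 with a 256-bit key (well above the 112-bit floor) truncated to a 96-bit tag, the conventional MAC length the requirement accepts. A conforming system would have to implement this inside a FIPS 140-2 validated module operating in FIPS mode; this sketch only demonstrates the sizes.

```python
import hashlib
import hmac
import secrets

# 256-bit key: security strength comfortably above the 112-bit minimum.
key = secrets.token_bytes(32)

def mac96(key: bytes, record: bytes) -> bytes:
    """HMAC-SHA256 truncated to 96 bits (12 bytes), the conventional
    tag length 5.1.1-B accepts; the key retains >=112-bit strength."""
    return hmac.new(key, record, hashlib.sha256).digest()[:12]

tag = mac96(key, b"signed election record")
assert len(tag) * 8 == 96
```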

5.1.2 Digital signatures for election records
  • 5.1.2-A Digital signature generation requirements - Digital signatures used to sign election records SHALL be generated in an embedded hardware Signature Module (SM).
  • 5.1.2-B Signature Module (SM) - Programmed devices that sign election records SHALL contain a hardware cryptographic module, the Signature Module (SM), that is capable of generating and protecting signature key pairs and generating digital signatures.
    • For the purpose of this requirement a “hardware” cryptographic module means a distinct electronic device, typically a preprogrammed, dedicated microcomputer that holds keying material and performs cryptographic operations. Although today this might typically be a single chip, soldered onto a larger motherboard, it is not the intent of this guideline to preclude higher levels of integration. It is expected that future voting devices may integrate the SM onto the same die as the rest of the voting device, as long as the SM is clearly physically and logically separated on the die from the rest of the voting device so that there is a distinct cryptographic module boundary, and there is no way for the rest of the device to access signature private keys except through the defined cryptographic module interface.

      Signature verification and other cryptographic operations need not be implemented in hardware, but may also be implemented on the embedded signature module if desired.

  • 5.1.2-B.1 Non-replaceable embedded Signature Module (SM) - Signature Modules (SMs) SHALL be integral, permanently attached components of a programmed device.

  • 5.1.2-B.2 Signature module validation level - Signature Modules SHALL be validated under FIPS 140-2 with FIPS 140 level 2 overall security and FIPS 140 level 3 physical security.
    • FIPS 140 level 3 physical security requires tamper resistance.

5.1.3 Key management for signature keys

5.1.3.1 Device Signature Key (DSK)
  • 5.1.3.1-A DSK Generation - Signature Modules SHALL securely generate a permanent DSK in the module, using an integral nondeterministic random bit generator.
    • FIPS 186-3 and NIST Special Publication 800-89 give technical requirements for the generation of secure digital signature keys.
  • 5.1.3.1-B Device Certificate generation - There SHALL be a process or mechanism for generating an X.509 Device Certificate that binds the DSK public key to the unique identification of the programmed device, the certificate’s date of issue, the name of the issuer of the certificate and other relevant permanent information.

  • 5.1.3.1-C Device Certificate storage - Device Certificates SHALL be stored permanently in the SM and be readable on demand by the programmed device.

  • 5.1.3.1-D Device identification placard - A human readable identification placard SHALL be permanently affixed to the external frame of any programmed device containing an SM that states, at a minimum, the same unique identification of the voting device contained in the device certificate.

  • 5.1.3.1-E Device Signature Key protection - Signature Modules and the process for generating DSKs SHALL be implemented so that the private component of DSK is created and exists only inside the protected cryptographic module boundary of the SM, and the key cannot be altered or exported from the SM.

  • 5.1.3.1-F Use of Device Signature Key - Signature Modules SHALL implement and permit only three uses of the DSK:
    1. to sign Election Public Key Certificates;
    2. to sign Election Closeout Records; and
    3. to sign Device Public Key Certificates.

5.1.4 Election Signature Key (ESK)

The purpose of an ESK is to sign election records in the course of an election. A voting device that signs election records generates its own ESKs and maintains only one ESK at a time. The public component of every ESK generated by the embedded signature module is signed by the DSK to create an Election Public Key Certificate, and when an election is closed out, the private component of that election key is destroyed by the SM, which produces an Election Closeout Record attesting to that destruction, signed by the DSK.

In the context of this section, an “election” may be held on a single day, for a single precinct or voting district, with a single ballot style, or it may span a period of days or weeks, and may involve a number of precincts and voting districts and ballot styles, if the voting device is intended to be so used (e.g., in voting centers or for early polling).

The SM is not aware of the context of its use; it simply creates a new ESK when requested by the voting device, signs hashes as requested by the voting device while keeping a count of the number of signatures for the ESK, and finally, when requested by the voting device, destroys the ESK and produces a signed Election Closeout Record stating the number of times the ESK was used. The specific minimum requirements for this are specified below.

However, nothing in this section is intended to preclude the creation of other manufacturer-defined signed records by the SM to support the overall election records and audit strategy for these more complex cases. For example, the SM might implement signed daily subtotals of ESK use, similar to the Election Closeout Record, for use in multi-day elections. Alternatively, the SM might accumulate and output, as part of the closeout process, signed totals by ballot style or some other identifier (which implies that the SM would have to include a way to input ballot style information in its API).

  • 5.1.4-A Election Signature Key (ESK) generation - Signature Modules SHALL internally generate election signature key-pairs (ESK) using an integral nondeterministic random bit generator.
  • 5.1.4-B Election Public Key Certificate - Signature Modules SHALL generate and output an X.509 public key certificate for each ESK generated, binding the public key to the unique identification of the election, the date of issue of the certificate, the identification of the voting device (the issuer of the certificate), and, optionally, other election-relevant information.
  • 5.1.4-C Election counter - Signature Modules SHALL maintain an election counter that keeps a running count of each ESK generated.
  • 5.1.4-D Election Signature Key use counter - Embedded signature modules SHALL maintain a counter of the number of times that an ESK is used.
  • 5.1.4-E Election Key Closeout - Signature Modules SHALL implement a closeout command that causes an Election Key Closeout record to be created and output, and the private component of the ESK to be destroyed.
  • 5.1.4-F Election Key Closeout record - The Election Key Closeout record SHALL be signed by the DSK and contain at least:
    1. The election signature public key (or a message digest of that key);
    2. The ESK number; and
    3. The final value of the ESK use counter.
    • The Election Key Closeout Record provides a signed record attesting to the destruction of the particular ESK and the number of signatures executed with the ESK. The number of signed election records should match the ESK use counter; this should be checked by tally devices, and any discrepancies flagged and investigated. The format of the Election Key Closeout Record is not specified; it might be a signed XML object or it might use another signed format such as the ASN.1 Cryptographic Message Syntax.
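The ESK lifecycle above (generate, count uses, sign, close out, destroy) can be sketched as a small state machine. This is a toy model only: Python's standard library has no asymmetric signing, so HMAC-SHA256 with random keys stands in for the DSK and ESK signatures the VVSG actually requires, and all class, method, and field names are illustrative assumptions, not part of the standard.

```python
import hashlib
import hmac
import os

class SignatureModule:
    """Toy sketch of the SM's ESK lifecycle (5.1.4-A through 5.1.4-F).

    HMAC-SHA256 with random keys stands in for the asymmetric DSK/ESK
    signatures; every name here is illustrative.
    """

    def __init__(self):
        self._dsk = os.urandom(32)   # stand-in for the Device Signature Key
        self._esk = None             # at most one active ESK at a time
        self.esk_number = 0          # 5.1.4-C election counter
        self.use_count = 0           # 5.1.4-D ESK use counter

    def generate_esk(self):
        if self._esk is not None:
            raise RuntimeError("close out the current ESK first")
        self._esk = os.urandom(32)
        self.esk_number += 1
        self.use_count = 0

    def sign_record(self, record: bytes) -> str:
        # Sign a hash of the election record and count the use.
        digest = hashlib.sha256(record).digest()
        self.use_count += 1
        return hmac.new(self._esk, digest, hashlib.sha256).hexdigest()

    def close_out(self) -> dict:
        # 5.1.4-E/F: destroy the private ESK and emit a DSK-signed record
        # stating how many signatures were executed with it.
        body = f"esk:{self.esk_number},uses:{self.use_count}".encode()
        record = {
            "esk_number": self.esk_number,
            "use_count": self.use_count,
            "signature": hmac.new(self._dsk, body, hashlib.sha256).hexdigest(),
        }
        self._esk = None
        return record
```

A tally device can then compare the `use_count` in the closeout record against the number of signed election records it actually received, flagging any discrepancy.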

5.2 Setup Inspection

5.2.1 Voting device software inspection

5.2.1.1 Software identification verification
  • 5.2.1.1-A Voting device software identification - The voting device SHALL be able to identify all software installed on programmed devices of the voting device.

  • 5.2.1.1-B Voting device, software identification verification log - Voting devices SHALL be capable of a software identification verification inspection that records, minimally, the following information to the device’s event log:
    1. Time and date of the inspection;
    2. Information that uniquely identifies the software (such as software name, version, build number, etc.);
    3. Information that identifies the location (such as full path name or memory address); and
    4. Information that uniquely identifies the programmed device that was inspected.

  • 5.2.1.1-B.1 EMS, software identification verification log - EMSs and other programmed devices that identify and authenticate individuals also SHALL record identifying information of the individual and role that performed the inspection.

5.2.1.2 Software integrity verification
  • 5.2.1.2-A Software integrity verification - The voting device SHALL verify the integrity of software installed on programmed devices using cryptographic software reference information from the National Software Reference Library (NSRL), voting device owner, or designated notary repositories.

  • 5.2.1.2-B Voting device, software integrity verification log - Voting devices SHALL be capable of performing a software integrity verification inspection that records, minimally, the following information to the device’s event log:
    1. Time and date of the inspection;
    2. Information that uniquely identifies the software (such as software name, version, build number, etc.);
    3. Information that identifies the software integrity verification technique used;
    4. Results of the software verification, including the cryptographic software reference information used for the verification; and
    5. Information that uniquely identifies the voting device that contained the software that was verified.

  • 5.2.1.2-B.1 EMS, software integrity verification log - EMSs and other programmed devices that identify and authenticate individuals also SHALL record identifying information of the individual and role that performed the inspection.
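The reference-hash comparison that 5.2.1.2-A calls for can be sketched with standard-library hashing. The helper name, the shape of the reference data, and the failure codes are illustrative assumptions; the NSRL and other repositories distribute reference information in their own formats.

```python
import hashlib
from pathlib import Path

def verify_software_integrity(install_dir: str, reference: dict) -> list:
    """Compare SHA-256 digests of installed files against cryptographic
    reference hashes (e.g., from the NSRL or the device owner).

    `reference` maps relative file paths to expected hex digests.
    Returns a list of (path, reason) failures; empty means verified.
    """
    failures = []
    for rel_path, expected in reference.items():
        path = Path(install_dir) / rel_path
        if not path.is_file():
            failures.append((rel_path, "missing"))
            continue
        actual = hashlib.sha256(path.read_bytes()).hexdigest()
        if actual != expected:
            failures.append((rel_path, "hash mismatch"))
    return failures
```

The 5.2.1.2-B log entry would then record the technique used ("SHA-256 against NSRL reference set"), the per-file results, and the reference data consulted.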

5.2.2 Voting device election information inspection

The requirements found in this section provide the ability to inspect contents of storage locations that hold election information for a voting device.

Voting devices can be inspected to determine the contents of storage locations that hold election information. Storage locations can hold election information that changes during an election, such as accumulation registers, or information that does not change. The proper initial and constant values of storage locations used to hold election information can be determined from documentation provided by manufacturers and jurisdictions before a voting device is used during an election.

  • 5.2.2-A Election information value determination - The voting device SHALL be able to determine the values contained in storage locations used to hold election information that changes during the election such as the number of ballots cast or total for a given contest.

  • 5.2.2-B Voting device, election information value inspection log - Voting devices SHALL be capable of performing an election information inspection that records, minimally, the following information to the device’s event log:
    1. Time and date of the inspection;
    2. Information that uniquely identifies the storage location of the information inspected;
    3. The value of each piece of election information; and
    4. Information that uniquely identifies the voting device that was inspected.

  • 5.2.2-B.1 EMS, election information value inspection log - EMSs and programmed devices that identify and authenticate individuals also SHALL record identifying information of the individual and role that performed the inspection.

5.2.3 Voting equipment properties inspection
  • 5.2.3-A Backup power source charge indicator - The voting device SHALL indicate the remaining charge of backup power sources in quarter increments (i.e., full, three-quarters full, half full, quarter full, empty) at a minimum, without the use of software.
  • 5.2.3-B Cabling connectivity indicator - The voting device SHALL indicate the connectivity of cabling attached to the voting device without the use of software.
  • 5.2.3-C Communications operational status indicator - The voting device SHALL indicate the operational status of the communications capability of the voting device.
  • 5.2.3-D Communications on/off indicator - The voting device SHALL indicate when the communications capability of the voting device is on/off without the use of software.
  • 5.2.3-E Consumables remaining indicator - The voting device SHALL indicate the remaining amount of voting device consumables (e.g., ink, paper) in quarter increments (i.e., full, three-quarters full, half full, quarter full, empty) at a minimum.
  • 5.2.3-F Calibration determination of voting device components - The voting device SHALL be able to determine the calibration of voting device components that require calibration.
  • 5.2.3-G Calibration adjustment of voting device components - The voting device SHALL be able to adjust the calibration of voting device components that require calibration.
  • 5.2.3-H Voting device, property inspection log - Voting devices SHALL be capable of performing a device properties inspection that records, minimally, the following information to the device’s event log:
    1. Time and date of the inspection;
    2. A description of the inspections performed;
    3. Results of each inspection; and
    4. Information that uniquely identifies the voting device that was inspected.
  • 5.2.3-H.1 EMS, property inspection log - EMSs and other programmed devices that identify and authenticate individuals also SHALL record identifying information of the individual and role that performed the inspection.

5.3 Software Installation

  • 5.3-A Software installation state restriction - Vote-capture devices SHALL only allow software to be installed while in the pre-voting state.
  • 5.3-B Authentication to install software - Programmed devices SHALL allow only authenticated administrators to install software on voting equipment.
  • 5.3-B.1 Authentication to install software on EMS - The EMS SHALL uniquely authenticate individuals associated with the administrator role before allowing software to be installed on the voting equipment.
  • 5.3-C Authentication to install election-specific software - Programmed devices SHALL only allow authenticated central election officials to install election-specific software and data files on voting equipment.
  • 5.3-C.1 Authentication to install election-specific software on EMS - The EMS SHALL uniquely authenticate individuals associated with the central election official role before allowing election-specific software and data files to be installed on the voting equipment.
  • 5.3-D Software installation procedures usage documentation - Software on programmed devices of the voting system SHALL only be able to be installed using the procedures in the user documentation.
  • 5.3-E Software digital signature verification - A test lab, National Software Reference Library (NSRL), or notary repository digital signature associated with the software SHALL be successfully validated before placing the software on programmed devices of voting systems.
  • 5.3-E.1 Software installation programs digital signature verification - Software installation programs SHALL validate a test lab, National Software Reference Library (NSRL), or notary repository digital signature of the software before installing software on programmed devices of voting systems.
  • 5.3-E.2 Software digital signature verification record - The results of digital signature verifications including who generated the signature SHALL be part of the software installation record.
  • 5.3-F Software installation error alert media - When installation of software fails, software installation programs SHALL provide an externally visible error message identifying the software that has failed to be installed on programmed devices of the voting system.
  • 5.3-G Programmed device, software installation logging - Programmed devices SHALL be able to log, minimally, the following information associated with each piece of software installed to the device’s event log:
    1. The date and time of the installation;
    2. The software’s filename and version;
    3. The location where the software is installed (such as directory path or memory addresses);
    4. If the software was installed successfully or not; and
    5. The digital signature validation results including who generated the signature.
  • 5.3-G.1 EMS, software installation logging - EMSs and other programmed devices that identify and authenticate individuals also SHALL record identifying information of the individual and role performing the software installation.
  • 5.3-H Authentication to access configuration file - Programmed devices SHALL allow only authenticated administrators to access and modify voting device configuration file(s).
  • 5.3-H.1 Authentication to access configuration file on EMS - The EMS SHALL uniquely authenticate individuals associated with the administrator role before allowing them to access and modify voting device configuration files.

  • 5.3-I Authentication to access election–specific configuration file - Programmed devices SHALL allow only authenticated central election officials to access and modify election-specific configuration files.

  • 5.3-I.1 Authentication to access election–specific configuration file on EMS - The EMS SHALL uniquely authenticate individuals associated with the central election official role before allowing them to access and modify voting device configuration files.

  • 5.3-J Programmed device, configuration file access logging - Programmed devices SHALL be able to log, minimally, the following information associated with configuration file accesses:
    1. The date and time of the access;
    2. The configuration file’s filename;
    3. An indication of whether the configuration file was modified; and
    4. The location of the configuration file (such as directory path or memory addresses).

  • 5.3-J.1 EMS, configuration file access logging - EMSs and other Programmed devices that identify and authenticate individuals also SHALL record identifying information of the individual and role accessing the configuration file.
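A minimal sketch of the 5.3-E/5.3-G flow: validate a digital signature over the software before installing, and emit an installation record with the logged fields listed above. HMAC with a shared key stands in for the test lab or NSRL digital signature here, and every function and field name is an assumption for illustration, not the VVSG's API.

```python
import hashlib
import hmac
from datetime import datetime, timezone

def install_software(name, version, payload, signature, signer, signer_key, dest):
    """Validate the signature over `payload` before installing, then
    build the 5.3-G installation log entry. HMAC-SHA256 stands in for
    the asymmetric signature a real system would verify."""
    ok = hmac.compare_digest(
        hmac.new(signer_key, payload, hashlib.sha256).hexdigest(), signature)
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "filename": name,                 # 5.3-G item 2
        "version": version,
        "location": dest,                 # 5.3-G item 3
        "installed": ok,                  # refuse to install on bad signature
        "signature_valid": ok,            # 5.3-G item 5
        "signer": signer,                 # who generated the signature
    }
```

Per 5.3-F, a failed validation would additionally raise an externally visible error message identifying the software that was not installed.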

5.4 Access Control

Access controls support the following security principles in terms of voting systems:

  1. Accountability of actions by identifying and authenticating users;
  2. Confidentiality of casting and storing of votes;
  3. Integrity of event logs, electronic records, and vote reporting; and
  4. Availability of the voting ballot and the ability to cast, store, and report votes.

This requirement extends [VVSG2005] I.7.2.1.2 by requiring controlled access to voting device components and by requiring access control mechanisms.

5.4.1 General access control
  • 5.4.1-A Access control mechanisms - The voting device SHALL provide access control mechanisms designed to permit authorized access to the voting system and to prevent unauthorized access to the voting system.

(extract of Table 5-1: Group or Role - Description)
  • Voter - The voter role is a restricted process in the vote-capture device. It allows the vote-capture device to enter the Activated state for voting activities.
  • Election Judge - The election judge has the ability to open the polls, close the polls, handle fled voters, recover from errors, and generate reports.
  • Poll Worker - The poll worker checks in voters and activates the ballot style.
  • Central Election Official - The central election official loads ballot definition files.
  • Administrator - The administrator updates and configures the voting devices and troubleshoots system problems.

  • 5.4.1-A.1 Voting device access control - The access control mechanisms of the voting device SHALL be capable of identifying and authenticating roles from Part 1: Table 5-1 permitted to perform operations on the voting device.

  • 5.4.1-A.2 EMS access control - The access control mechanisms of the EMS SHALL be capable of identifying and authenticating individuals permitted to perform operations on the EMS.

  • 5.4.1-B Access control for software and files - The voting device SHALL provide controls that permit or deny access to the device’s software and files.

  • 5.4.1-C Access control voting states - The vote-capture device’s access control mechanisms SHALL distinguish at least the following voting states from Part 1: Table 5-2:
    1. Pre-voting;
    2. Activated;
    3. Suspended; and
    4. Post-voting.

  • 5.4.1-D Access control state policies - The vote-capture device SHALL allow the administrator group or role to configure different access control policies available in each voting state.

  • 5.4.1-E Minimum permissions default - The voting device’s default access control permissions SHALL implement the minimum permissions needed for each role or group.

  • 5.4.1-F Privilege escalation prevention - The voting device SHALL prevent a lower-privilege process from modifying a higher-privilege process.

  • 5.4.1-G Privileged operations authorization - The voting device SHALL ensure that an administrator authorizes each privileged operation.

  • 5.4.1-H Software and firmware modification prevention - The voting device SHALL prevent modification to or tampering with software or firmware through any means other than the documented procedure for software upgrade.

5.4.2 Access control identification
  • 5.4.2-A Access control identification - The voting device SHALL identify users, applications, and processes to which access is granted and the specific functions and data to which each entity holds authorized access.

  • 5.4.2-B Role-based access control standard - Voting devices that implement role-based access control SHALL support the recommendations for Core RBAC in the ANSI INCITS 359-2004 American National Standard for Information Technology – Role Based Access Control document.

  • 5.4.2-C Access control roles identification - The voting device SHALL identify, at a minimum, the groups or roles outlined in Part 1: Table 5-1.

  • 5.4.2-D Group member identification - The EMS SHALL individually identify the members within all groups or roles except the voting group.

  • 5.4.2-E Access control configuration - The voting device SHALL allow the administrator group or role to configure the permissions and functionality for each identity, group or role to include account and group/role creation, modification, and deletion. (Includes Table 5-3 that describes Roles and voting states.)
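A minimal sketch of per-state, role-based authorization in the spirit of Core RBAC (ANSI INCITS 359-2004), using the Table 5-1 roles. The specific permissions, operation names, and state names below are illustrative assumptions, not the standard's tables.

```python
# Role/state permission map: which operations a role may perform in a
# given voting state (5.4.1-D: per-state access control policies).
# Roles follow Table 5-1; operations and state names are illustrative.
PERMISSIONS = {
    ("Election Judge", "pre-voting"): {"open_polls"},
    ("Election Judge", "post-voting"): {"close_polls", "generate_reports"},
    ("Poll Worker", "pre-voting"): {"activate_ballot_style"},
    ("Voter", "activated"): {"cast_vote"},
    ("Administrator", "pre-voting"): {"install_software", "configure_device"},
}

def is_authorized(role: str, state: str, operation: str) -> bool:
    """Permit an operation only if the role holds it in the current
    voting state; everything else is denied by default (5.4.1-E)."""
    return operation in PERMISSIONS.get((role, state), set())
```

Note how the default-deny lookup enforces 5.3-A as a side effect: `install_software` is granted only in the pre-voting state, so it is refused once the device is Activated.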

5.4.3 Access control authentication
  • (omitted)
5.4.4 Access control authorization
  • (omitted)
5.5 System Integrity Management

This chapter is a guideline for securely deploying and maintaining voting system electronic devices across all voting system modes. It covers platform security configuration, including network interfaces. In many ways, security of the electronic devices depends on the current voting system state. Perhaps more importantly, the voting system state indicates who requires access to any given device, a factor that significantly influences security measures.

There are some similarities between voting machines and gaming machines. As a method of assuring completeness of requirements, the Nevada Gaming Commission’s [NGC06] technical standards on gaming machines were consulted for applicability.

5.5.1 Electronic devices

5.5.2 Removable media

5.5.3 Backup and recovery

5.5.4 Malicious software protection

5.6 Communication Security

This chapter provides requirements for communications security. The requirements address both the integrity of transmitted information and the protection of the voting system from communications-based threats.

This chapter is organized in three parts. The first set of requirements addresses physical communication components, including the prohibition of radio frequency (RF) capable components. The second set addresses data transmission security related to the encoding and decoding of data packets and the creation of logical paths for transferring data between systems. The third set addresses communication security related to the voting application, including the authentication of communications between voting devices.

Although voting systems can have the capability to communicate with other voting devices, there are key security concerns that must be accounted for both during voting and when election administrators prepare the voting device. This chapter does not address networking issues based on hand-carried electronic media, which are addressed in the System Integrity Management chapter.

5.6.1 Physical communication security
  • 5.6.1-A Prohibiting wireless technology - Electronic devices SHALL not be enabled or installed with any wireless technology (e.g., Wi-Fi, wireless broadband, Bluetooth) except for infrared technology when the signal path is shielded to prevent the escape of the signal and saturation jamming of the signal.
  • 5.6.1-B Restricting dependency on public communication networks - Electronic devices SHALL not use public communication networks (including, but not limited to the Internet and modem usage through public telephone networks), except for electronic devices at polling places that transmit unofficial end of the day results and interface with voter registration databases on election day.

  • 5.6.1-B.1 Air gap for transmitting end of day results on election day - Electronic devices SHALL not be connected to other polling place electronic devices when transmitting end of the day results on election day.

  • 5.6.1-B.2 Air gap for connecting to voter registration databases - Electronic devices that connect to voter registration databases outside a polling place on election day SHALL never be connected to other polling place electronic devices.

  • 5.6.1-C Limiting network interfaces based on voting state - Electronic devices SHALL have the ability to enable or disable physical network interfaces (including modems) based upon the voting system state.

  • 5.6.1-D Preventing traffic from passing through EMSs - EMSs with multiple active network interfaces (including modems) SHALL not act as bridges or routers between networks that permit network traffic to pass through the electronic management systems.

  • 5.6.1-E Implementing unique network identification - Each electronic device SHALL have a unique physical address/identifier for each network interface.
5.6.2 Data transmission security

This section describes security requirements related to the encoding and decoding of data packets, and the creation of logical paths for transferring data between voting systems.

5.6.3 Application communication security

This section describes security requirements related to the communications of the voting application.

5.7 System Event Logging

An event is something that occurs within a voting device and a log is a record of these events that have occurred. Each log entry contains information related to a specific event. Logs are used for error reporting, auditing, troubleshooting problems, optimizing performance, recording the actions of users, and providing data useful for investigating malicious activity.

Event logs are typically divided into two categories: system events and audit records. System events are operational actions performed by voting device components, such as shutting down the voting device, starting a service, usage information, client requests, and other information. Audit records contain security event information such as successful and failed authentication attempts, file accesses, and security policy changes. Other applications and third-party software, such as antivirus software and intrusion detection software, also record audit logs. For the purposes of this chapter, system event logging will be used to include both system and audit logs for voting devices. System event logs are as important to the output of an election as the electronic CVRs and vote totals.

This chapter describes voting device capabilities that perform system event logging to assist in voting device troubleshooting, recording a history of voting device activity, and detecting unauthorized or malicious activity. It also describes the use of log management to protect the confidentiality and integrity of logs while also ensuring their availability. The voting device software, operating system, and/or applications may perform the actual system event logging. There may be multiple logs in use on a single device.

The requirements in this section protect against the following intermediate attack goals:

  • The ability of an attacker to undetectably alter the logs;
  • The ability of an attacker to remove an entry from the log; and
  • The ability of an attacker to create an entry in the log.

This section defines the event logging requirements for voting devices. It outlines the various measures that the manufacturers and the voting device shall provide to ensure the functionality, performance, and security of the voting device event logging. These recommendations apply to the full scope of voting device functionality, including voting, pre- and post-voting activities, and maintenance of the voting device.
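One common way to defeat the first two attack goals listed above (undetectable alteration and removal of entries) is a hash-chained log, sketched here with standard-library hashing. The function names are illustrative; a deployed device would additionally sign the chain head (e.g., with the ESK), which this stdlib-only sketch omits.

```python
import hashlib
import json

def append_entry(log: list, event: dict) -> None:
    """Append an event to a hash-chained log: each entry commits to the
    digest of its predecessor, so altering or deleting an earlier entry
    breaks every later link in the chain."""
    prev = log[-1]["digest"] if log else "0" * 64
    body = json.dumps(event, sort_keys=True)   # canonical serialization
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    log.append({"event": event, "prev": prev, "digest": digest})

def verify_chain(log: list) -> bool:
    """Recompute every link; any tampering yields False."""
    prev = "0" * 64
    for entry in log:
        body = json.dumps(entry["event"], sort_keys=True)
        if entry["prev"] != prev or entry["digest"] != hashlib.sha256(
                (prev + body).encode()).hexdigest():
            return False
        prev = entry["digest"]
    return True
```

The third goal (fabricating new entries) is why the head still needs an external signature or write-once copy: the chain alone only proves internal consistency.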

5.7.1 General system event logging
  • 5.7.1-A Event logging mechanisms requirement - The voting device SHALL provide event logging mechanisms designed to record voting device activities.
  • 5.7.1-B Integrity protection requirement - The voting device SHALL enable file integrity protection for stored log files as part of the default configuration.
  • 5.7.1-C Voter privacy and ballot secrecy requirement - The voting device logs SHALL NOT contain information that, if published, would violate ballot secrecy or voter privacy or that would compromise voting system security in any way.
  • 5.7.1-D Event characteristics logging requirement - The voting device SHALL log at a minimum the following data characteristics for each type of event:
    1. System ID;
    2. Unique event ID and/or type;
    3. Timestamp;
    4. Success or failure of event, if applicable;
    5. User ID triggering the event, if applicable;
    6. Resources requested, if applicable.
  • 5.7.1-D.1 Timekeeping requirement - Timekeeping mechanisms SHALL generate time and date values.
  • 5.7.1-D.2 Time precision requirement - The precision of the timekeeping mechanism SHALL be able to distinguish and properly order all audit records.
  • 5.7.1-D.3 Timestamp data requirement - Timestamps SHALL include date and time, including hours, minutes, and seconds.
  • 5.7.1-D.4 Timestamp compliance requirement - Timestamps SHALL comply with ISO 8601 and provide all four digits of the year and include the time zone.
  • 5.7.1-D.5 Clock set requirement - Voting devices SHALL only allow administrators to set the clock.
  • 5.7.1-D.6 Clock drift requirement - The voting device SHALL limit clock drift to at most 1 minute within a 15 hour period after the clock is set.
  • 5.7.1-E Minimum event logging requirement - The voting device SHALL log at a minimum the system events described in Part 1: Table 5-5 .
  • 5.7.1-E.1 Minimum logging disabling requirement - The voting device SHALL ensure that the minimum event logging in Part 1: Table 5-5 cannot be disabled.
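The minimum 5.7.1-D record characteristics and the ISO 8601 timestamp rule (5.7.1-D.4) can be sketched as follows. The field names and the monotonic sequence counter (one simple way to satisfy the ordering requirement of 5.7.1-D.2) are illustrative assumptions.

```python
from datetime import datetime, timezone
from itertools import count

# Monotonic sequence so that entries with equal timestamps still order
# deterministically (5.7.1-D.2).
_seq = count(1)

def log_event(system_id, event_type, success=None, user_id=None, resource=None):
    """Build one event record with the minimum 5.7.1-D characteristics.
    The timestamp is ISO 8601 with a four-digit year and an explicit
    time zone offset (5.7.1-D.3 and 5.7.1-D.4)."""
    return {
        "system_id": system_id,                              # item 1
        "event_id": next(_seq),                              # item 2
        "event_type": event_type,
        "timestamp": datetime.now(timezone.utc).isoformat(), # item 3
        "success": success,                                  # item 4
        "user_id": user_id,                                  # item 5
        "resource": resource,                                # item 6
    }
```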

(extract of Table 5-5)
System Event: Voting events (applies to: programmed device)
Description - includes:
  1. Opening and closing polls
  2. Casting a vote
  3. Canceling a vote during verification
  4. Fled voters
  5. Success or failure of log and election results exportation
Note: for paper-based devices, these requirements may need to be met procedurally.

5.7.2 System event log management

Log management is the process for generating, transmitting, storing, analyzing, and disposing of log data. Log management primarily involves protecting the integrity of logs while also ensuring their availability. It also ensures that records are stored in sufficient detail for an appropriate period of time.

A log management infrastructure consists of the hardware, software, networks, and media used to generate, transmit, store, and analyze log data. The events outlined in this section may be logged as part of the underlying operating system, the voting device software, or other third party applications.

5.7.3 System event log protection

5.8 Physical Security for Voting Devices

5.8.1 Unauthorized physical access
  • 5.8.1-A Unauthorized physical access requirement - Any unauthorized physical access SHALL leave physical evidence that an unauthorized event has taken place.
  • 5.8.1-B Unauthorized physical access capability requirement - Voting devices SHALL produce an audible and visual alarm if access to a restricted voting device component is gained during the Activated state.

5.8.2 Physical port and access least functionality
  • 5.8.2-A Physical port and access point requirement - The voting device SHALL only have physical ports and access points that are essential to voting operations and to voting device testing and auditing.

5.8.3 Voting device boundary protection
  • 5.8.3-A Physical port shutdown requirement - If a physical connection between voting device components is broken during Activated or Suspended State, the affected voting machine port SHALL be automatically disabled.
  • 5.8.3-B Physical component alarm requirement - The voting device SHALL produce an audible and visual alarm if a connected component is disconnected during the Activated state.
  • 5.8.3-C Physical component event log requirement - An event log entry that identifies the name of the affected device SHALL be generated if a voting device component is disconnected during the Activated state.
  • 5.8.3-D Physical port enablement requirement - Ports disabled during Activated or Suspended State SHALL only be re-enabled by authorized administrators.
5.8.4 Information flow
  • 5.8.4-A Physical port restriction requirement - Voting devices SHALL be designed with the capability to restrict physical access to voting machine ports that accommodate removable media, with the exception of ports used to activate a voting session.
  • 5.8.4-B Physical port tamper evidence requirement - Voting devices SHALL be designed to give a physical indication of tampering or unauthorized access to ports and all other access points, if used as described in the manufacturer's documentation.
  • 5.8.4-C Physical port disabling capability requirement - Voting machines SHALL be designed such that physical ports can be manually disabled by an authorized administrator.

5.8.5 Door cover and panel security
  • 5.8.5-A Door cover and panel security requirement - Access points, such as covers and panels, SHALL be secured by locks, or tamper-evidence or tamper-resistance countermeasures SHALL be implemented, so that system owners can monitor access to voting device components through these points.

5.8.6 Secure ballot box
  • 5.8.6-A Secure ballot box requirement - Ballot boxes SHALL be designed such that any unauthorized physical access results in physical evidence that an unauthorized event has taken place.
    • The goal here is to ensure that poll workers or observers would easily notice if someone has tampered with the ballot box. This requirement can be achieved through locks or seals as a part of tamper evidence and tamper resistance countermeasures described by the use procedures and supplied by the manufacturer.
    • [What should happen if tampering is noticed?? -- not defined here]

5.8.7 Secure physical lock and key

5.8.8 Physical encasing lock

5.8.9 Power supply

6.1 General Design Requirements

  • 6.1-D Paper ballots, separate data from metadata - Paper ballots used by paper-based voting devices SHALL meet the following standards:
    1. Marks that identify the unique ballot style SHALL be outside the area in which votes are recorded, so as to minimize the likelihood that these marks will be mistaken for vote responses and the likelihood that recorded votes will obliterate these marks; and
    2. If alignment marks are used to locate the vote response fields on the ballot, these marks SHALL be outside the area in which votes are recorded, so as to minimize the likelihood that these marks will be mistaken for vote responses and the likelihood that recorded votes will obliterate these marks.
  • 6.1-E Card holder - A frame or fixture for printed ballot cards is optional. However, if such a device is provided, it SHALL:
    1. Position the card properly; and
    2. Hold the ballot card securely in its proper location and orientation for voting.
  • 6.1-F Ballot boxes - Ballot boxes and ballot transfer boxes, which serve as secure containers for the storage and transportation of voted ballots, SHALL:
    1. Provide specific points where ballots are inserted, with all other points on the box constructed in a manner that prevents ballot insertion; and
    2. If needed, contain separate compartments for the segregation of ballots that may require special handling or processing.
  • 6.1-G Vote-capture device activity indicator - Programmed vote-capture devices SHALL include an audible or visible indicator to provide the status of each voting device to election judges. This indicator SHALL:
    1. Indicate whether the device is in polls-opened or polls-closed state; and
    2. Indicate whether a voting session is in progress.
  • 6.1-H Precinct devices operation - Precinct tabulators and vote-capture devices SHALL be designed for operation in any enclosed facility ordinarily used as a polling place.

6.2 Voting Variations

The purpose of this formulaic requirement is to clarify that support for a given voting variation cannot be asserted at the system level unless device-level support is present. It is not necessarily the case that every device in the system would support every voting variation claimed at the system level; e.g., vote-capture devices used for in-person voting may have nothing in common with the vote-capture devices (typically MMPB) used for absentee voting. However, sufficient devices must be present to enable satisfaction of the system-level claim.

6.3 Hardware and Software Performance, General Requirements

6.3.1 Reliability

6.3.1.2 Estimated volume per election

The "typical" volumes described below are the volumes that medium-sized jurisdictions in western states need their equipment to handle in a high-turnout election, as of 2006. A county of 150 000 registered voters will have 120 000 ballots cast in a presidential election. A typical polling place is set up to handle 2000 voters, which works out to 60 polling places in a mid-sized county.
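The county model above reduces to simple arithmetic. A minimal sketch (all figures come from the text; the variable names are illustrative):

```python
# Back-of-the-envelope model of the "typical" mid-sized western county described above.
registered_voters = 150_000
ballots_cast = 120_000                       # high-turnout presidential election
voters_per_polling_place = 2_000

turnout = ballots_cast / registered_voters                   # 0.8
polling_places = ballots_cast // voters_per_polling_place    # 60

print(f"turnout = {turnout:.0%}, polling places = {polling_places}")
```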

Central-count optical scanner: Medium-sized jurisdictions in western states need their central count equipment to scan 120 000 ballots in an election. Depending upon the actual throughput speeds of the scanners, jurisdictions use 2 to 8 machines to handle the volume. "Typical" volume for a single scanner is the maximum tabulation rate that the manufacturer declares for the equipment times 8 hours.

Election Management System: The volume equals the total number of interactions with the vote gathering equipment required by the design configuration of the voting system to collect the election results from all the vote-capture devices.

One constant across these systems is that the Election Management System will interact once with each polling place for each class of equipment. Assuming our "typical" county with 60 polling places, one or more DREs in each polling place, and one or more optical scan devices, that totals 2 × 60 = 120 transactions per election.

The primary difference in the central-count EMS environment is whether the optical scan devices are networked with the EMS or function independently.

In the networked environment, the device will interact with the EMS once per batch (typically around 250 ballots). So, 120 000/250=480 interactions.

In the non-networked environment, the results are handled similarly to the polling place uploads: results are copied off to media and uploaded to the EMS. Since central counting typically occurs over several days – especially in a vote-by-mail environment – the test should include several uploads from each scanner: 2 scanners × 4 days = 8 uploads.

To simplify these different cases to a single benchmark, we use the highest of the volumes (480 transactions), which leads to the lowest failure rate benchmark.
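The three EMS cases above reduce to a single benchmark volume by taking the maximum. A sketch of that arithmetic (figures from the text; names are illustrative):

```python
# EMS interaction volumes for the three configurations described above.
polling_places = 60
equipment_classes = 2            # DREs and optical scanners
ballots = 120_000
batch_size = 250                 # typical networked central-count batch
scanners, count_days = 2, 4      # non-networked central count

polling_place_uploads = equipment_classes * polling_places   # 120
networked_interactions = ballots // batch_size               # 480
non_networked_uploads = scanners * count_days                # 8

# The benchmark uses the highest volume, which yields the lowest failure rate benchmark.
ems_volume = max(polling_place_uploads, networked_interactions, non_networked_uploads)
print(ems_volume)   # 480
```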

Precinct-count optical scanner: Polling place equipment has a maximum number of paper ballots that can be handled before the outtake bins fill up, usually around 2500.

Direct Recording Electronic: A typical ballot takes 3–5 minutes to vote, so the most a single DRE should be expected to handle is 150–200 voters in a 12-hour election day.

Electronically-assisted Ballot Marker: Voting typically takes longer than on a DRE. An individual unit should not be expected to handle more than 70 voters on election day.

Ballot activator: The usage volume of these devices matches the volume for the polling place, which in our assumed county is 2000 voters per polling place. Our assumed county would have 10–14 DREs per polling place with around 20 tokens. Each token would be used about 100 times.

Audit device: No information available.

The estimated volumes are summarized in Part 1: Table 6-1. The estimates for PCOS and CCOS have been generalized to cover precinct tabulator and central tabulator respectively, and a default volume based on the higher of the available estimates has been supplied for other vote-capture devices that may appear in the future. Audit devices are assumed to be comparable to activation devices in the numbers that are deployed.

Table 6-1 Estimated volumes per election by device class

| Device class | Estimated volume per device per election | Estimated volume per election |
| central tabulator | Maximum tabulation rate times 8 hours | 120 000 ballots |
| EMS | 480 transactions | 480 transactions |
| precinct tabulator | 2000 ballots | 120 000 ballots |
| DRE | 200 voting sessions | 120 000 voting sessions |
| EBM | 70 voting sessions | 120 000 voting sessions |
| other vote-capture device | 200 voting sessions | 120 000 voting sessions |
| activation device | 2000 ballot activations | 120 000 ballot activations |
| audit device | 2000 ballots | 120 000 ballots |

6.3.1.3 Manageable failures per election

The term failure is defined in Appendix A. In plain language, failures are equipment breakdowns, including software crashes, such that continued use without service or replacement is worrisome to impossible. Normal, routine occurrences like running out of paper are not considered failures. Misfeeds of ballots into optical scanners are handled by a separate benchmark (Requirement Part 1: 6.3.3-A), so these are not included as failures for the general reliability benchmark.

The following estimates express what failures would be manageable for a mid-sized county in a high-turnout election. Medium-sized counties send troubleshooters out to polling places to replace machines or resolve problems with them.

Any failure that results in all CVRs pertaining to a given ballot becoming unusable, or that makes it impossible to determine whether or not a ballot was cast, is called disenfranchisement. It is unacceptable for even one ballot to become unrecoverable or to end up in an unknown state. For example, an optical scanner that shreds a paper ballot, rendering it unreadable by human or machine, is assessed a disenfranchisement-type failure; so is a DRE that is observed to "freeze" when the voter attempts to cast the ballot, providing no evidence one way or the other as to whether the ballot was cast.

Central-count optical scanner: No more than one machine breakdown per jurisdiction requiring repairs done by the manufacturer or highly trained personnel. Medium-sized jurisdictions plan on having one backup machine for each election.

Election Management System: This is a critical system that must perform in an extremely time-sensitive environment for a mid-sized county over a 3 to 4 hour period on election night. Any failure during the test that requires the manufacturer or highly trained personnel to recover should disqualify the system. Otherwise, as long as the manufacturer's documentation provides usable procedures for recovering from the failures and methods to verify results and recover any potentially missing election results, 1 failure is assessed for each 10 minutes of downtime (minimum 1; no fractional failures are assessed). A total of 3 or more such failures disqualifies the system.

Precinct-count optical scanner: A failure in this class of machine has a negligible impact on the ability of voters to vote in the polling place. No more than 1 of the machines in an election should experience serious failures that would require the manufacturer or highly trained personnel to repair (e.g., will not boot). No more than 5 % of the machines in the election should experience failures that require the attention of a troubleshooter/poll worker (e.g., memory card failure).

Direct Recording Electronic and Electronically-assisted Ballot Marker: No more than 1 % of the machines in an election experience failures that would require the manufacturer or highly trained personnel to repair (e.g., won't boot) and no more than 3 % of the machines in an election experience failures that require the attention of a troubleshooter (e.g., printer jams, recalibration, etc.).

Ballot activator: The media/token should not fail more than 3 % of the time (the county will provide the polling place with more tokens than necessary). No more than 1 of the devices should fail (the device will be replaced by the county troubleshooter).

Audit device: No information available. If comparable to ballot activators, there should be at least 1 spare.

The manageable failure estimates are summarized in Part 1: Table 6-2 . A "user-serviceable" failure is one that can be remedied by a troubleshooter and/or election official using only knowledge found in voting equipment user documentation; a "non-user-serviceable" failure is one that requires the manufacturer or highly trained personnel to repair.

Please note that the failures are relative to the collection of all devices of a given class, so the value 1 in the row for central tabulator means 1 failure among the 2 to 8 central tabulators that are required to count 120 000 ballots in 8 hours, not 1 failure per device.

Table 6-2 Estimated manageable failures per election by device class

| Device class | Failure type | Manageable failures per election |
| voting device (all) | Disenfranchisement | 0 |
| central tabulator | All | 1 |
| EMS | Non-user-serviceable | 0 |
| EMS | User-serviceable (10 minutes) | 2 |
| precinct tabulator | Non-user-serviceable | 1 |
| precinct tabulator | User-serviceable | 5 % of devices = 3 |
| DRE | Non-user-serviceable | 1 % of devices = 6 |
| DRE | User-serviceable | 3 % of devices = 18 |
| EBM | Non-user-serviceable | 1 % of devices = 17 |
| EBM | User-serviceable | 3 % of devices = 51 |
| other vote-capture device | Non-user-serviceable | 1 % of devices = 6 |
| other vote-capture device | User-serviceable | 3 % of devices = 18 |
| activation device | Media/token | 3 % of tokens = 36 |
| activation device | Main unit | 1 |
| audit device | All | 1 |
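The percentage-based counts in Table 6-2 follow from the Table 6-1 volumes: dividing the election volume by the per-device volume gives the device (or token) population, and the stated percentage of that population, rounded down, gives the manageable failure count. A sketch of that arithmetic (names are illustrative; figures come from the two tables):

```python
# Derive the percentage-based counts in Table 6-2 from the Table 6-1 volumes.
ELECTION_VOLUME = 120_000   # ballots / voting sessions / activations per election

def manageable_failures(per_device_volume: int, percent: float) -> int:
    """Failure count = percent of the device (or token) population, rounded down."""
    devices = ELECTION_VOLUME // per_device_volume
    return int(devices * percent / 100)

precinct_user  = manageable_failures(2000, 5)  # 60 tabulators        -> 3
dre_non_user   = manageable_failures(200, 1)   # 600 DREs             -> 6
dre_user       = manageable_failures(200, 3)   # 600 DREs             -> 18
ebm_non_user   = manageable_failures(70, 1)    # ~1714 EBMs           -> 17
ebm_user       = manageable_failures(70, 3)    # ~1714 EBMs           -> 51
token_failures = manageable_failures(100, 3)   # 1200 tokens, ~100 uses each -> 36
```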

6.3.1.4 Derivation of benchmarks

We focus on one class of device and one type of failure at a time, and we assume that each failure is followed by repair or replacement of the affected device. This means that we consider two failures of the same device to be equivalent to one failure each of two different devices of the same class. The sense of "X % of the machines fail" is thus approximated by a simple failure count, which is X/100 times the number of devices. This count must then be related to the total volume processed by the entire group of devices over the course of an election in order to determine the number of failures that would be manageable in an election of that size.

To reduce the likelihood of an unmanageable situation to an acceptably low level, a benchmark is needed such that the probability of occurrence of an unmanageable number of failures for the total volume estimated is "acceptably low." That "acceptably low level" is here defined to be a probability of no more than 1 %, except in the case of disenfranchisement, where the only acceptable probability is 0.

Under the simplifying assumption that failures occur randomly, following a Poisson distribution, the probability of observing n or fewer failures for volume v and failure rate r is the value of the Poisson cumulative distribution function:

P(n, rv) = Σ_{x=0}^{n} e^(−rv) · (rv)^x / x!

Consequently, given v_e (the estimated total volume) and n_e (the maximum manageable number of failures for volume v_e), the desired benchmark rate r_b is found by solving P(n_e, r_b·v_e) = 0.99 for r_b. This sets the benchmark rate such that there remains a 1 % risk that a greater number of failures would occur with marginally conforming devices during an election in which they collectively process volume v_e. In the case of disenfranchisement, that risk is unacceptable; hence the benchmark is simply set to zero.
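The derivation can be reproduced numerically. The sketch below (plain Python; bisection is used in place of a closed-form solve, and the names are illustrative) sums the Poisson CDF and solves P(n_e, r_b·v_e) = 0.99 for r_b. Plugging in the central tabulator figures from Tables 6-1 and 6-2 (n_e = 1 failure, v_e = 120 000 ballots) lands close to the 1.237×10^−6 benchmark tabulated below.

```python
import math

def poisson_cdf(n: int, lam: float) -> float:
    """P(X <= n) for X ~ Poisson(lam), summed term by term."""
    term = math.exp(-lam)   # x = 0 term
    total = term
    for x in range(1, n + 1):
        term *= lam / x     # builds lam^x / x! incrementally
        total += term
    return total

def benchmark_rate(n_e: int, v_e: float, p: float = 0.99) -> float:
    """Solve poisson_cdf(n_e, r * v_e) == p for the benchmark rate r by bisection.

    The CDF decreases as r grows, so r is bracketed between 0 (CDF = 1)
    and a rate large enough that the CDF has fallen below p.
    """
    lo, hi = 0.0, 10.0 * (n_e + 1) / v_e
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if poisson_cdf(n_e, mid * v_e) > p:
            lo = mid    # failures still too unlikely: rate can rise
        else:
            hi = mid
    return (lo + hi) / 2.0

# Central tabulator: 1 manageable failure over 120 000 ballots.
r = benchmark_rate(1, 120_000)
print(f"central tabulator benchmark ≈ {r:.3e}")   # close to 1.237e-06 in Table 6-3
```

The same call with the DRE figures (`benchmark_rate(6, 120_000)`) lands close to the 1.941×10^−5 non-user-serviceable DRE benchmark.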
  • 6.3.1.5-A Failure rate benchmark - All devices SHALL achieve failure rates not exceeding those indicated in Part 1: Table 6-3.

Table 6-3 Failure rate benchmarks by device class

| Device class | Failure type | Unit of volume | Benchmark |
| voting device (all) | Disenfranchisement | – | 0 |
| central tabulator | All | ballot | 1.237×10^−6 |
| EMS | Non-user-serviceable | transaction | 2.093×10^−5 |
| EMS | User-serviceable (10 minutes) | transaction | 9.084×10^−4 |
| precinct tabulator | Non-user-serviceable | ballot | 1.237×10^−6 |
| precinct tabulator | User-serviceable | ballot | 6.860×10^−6 |
| DRE | Non-user-serviceable | voting session | 1.941×10^−5 |
| DRE | User-serviceable | voting session | 8.621×10^−5 |
| EBM | Non-user-serviceable | voting session | 8.013×10^−5 |
| EBM | User-serviceable | voting session | 3.058×10^−4 |
| other vote-capture device | Non-user-serviceable | voting session | 1.941×10^−5 |
| other vote-capture device | User-serviceable | voting session | 8.621×10^−5 |
| activation device | Media/token | ballot activation | 2.027×10^−4 |
| activation device | Main unit | ballot activation | 1.237×10^−6 |
| audit device | All | ballot | 1.237×10^−6 |

  • 6.3.2-B End-to-End accuracy benchmark - All systems SHALL achieve a report total error rate of no more than 8×10^−6 (1/125 000).

  • 6.3.3-A Misfeed rate benchmark - The misfeed rate SHALL NOT exceed 0.002 (1 / 500).

6.3.4 Electromagnetic Compatibility (EMC) immunity

6.3.5 Electromagnetic Compatibility (EMC) emission limits

6.3.6 Other requirements

6.4 Workmanship

6.5 Archival Requirements
  • 6.5.1-A Records last at least 22 months - All systems SHALL maintain the integrity of election management, voting and audit data, including CVRs, during an election and for a period of at least 22 months afterward, in temperatures ranging from 5 °C to 40 °C (41 °F to 104 °F) and relative humidity from 5 % to 85 %, non-condensing.

6.5.3 Period of retention (informative) - This informative section provides extended discussion for Requirement Part 1: 6.5.1-A and Part 1: 6.5.2.

United States Code Title 42, Sections 1974 through 1974e, states that election administrators must preserve for 22 months "all records and paper that came into (their) possession relating to an application, registration, payment of poll tax, or other act requisite to voting." This retention requirement applies to systems that will be used at any time for voting of candidates for federal offices (e.g., Member of Congress, United States Senator, and/or Presidential Elector). Therefore, all systems must provide for maintaining the integrity of voting and audit data during an election and for a period of at least 22 months thereafter.

6.6 Integratability and Data Export/Interchange

Media Form

Title: Voluntary Voting System Guidelines
Publisher: Election Assistance Commission
Pub Date: 2007-08-31
Media Link: http://www.copswiki.org/w119/pub/Common/VoluntaryVotingSystemGuidelines/Final-TGDC-VVSG-08312007.pdf
Keywords: Election Integrity, Election2008
Media Type: PDF
Media Group: Govt Doc