General Discussion and Recommendations

According to the Secretary of State’s files, augmented by CASE inquiries, the Ohio counties using some form of touch screen are:
County   | County Seat | Vendor Used   | System Used
---------|-------------|---------------|--------------
AUGLAIZE | Wapakoneta  | ES&S          | iVotronics
FRANKLIN | Columbus    | DANAHER/ES&S  | Danaher 1242
KNOX     | Mt. Vernon  | MICROVOTE     |
LAKE     | Painesville | TRIAD/SEQUOIA |
MAHONING | Youngstown  | ES&S          | iVotronics
PICKAWAY | Circleville | MICROVOTE     | MicroVote 464
ROSS     | Chillicothe | MICROVOTE     | touchpad
Every one of these systems has a history of serious failures somewhere in the country; Danaher is no exception: we know of the Danaher 1242’s problems because they were mentioned during the Joint Committee on Ballot Security hearings. Worse than all the documented problems are the undocumented problems that have gone unobserved because there was no check in place to catch them.
In Indiana, Johnson County has decided to put its ES&S touch screens away and use a paper voting system this November instead. County Clerk Jill Jackson said using a paper system was "the safest and surest thing to do." Johnson County made this decision after learning that ES&S had installed uncertified software in its voting systems; the county gave ES&S a deadline to get the software certified, and ES&S failed to meet it. (For more on this story, see Kim Alexander's weblog for Aug. 23, 2004: http://www.calvoter.org/news/blog/index.html)
Our recommendations fall into five categories: System Security; Voter/System Interaction; Poll Worker Concerns; System Testing; and Management and Security (Pre-Voting Day, Voting Day, and Post-Voting Day). Some are expensive to implement or require a long lead time, but they are all important and can help. We hope election officials will find useful ideas for their own jurisdictions and, by applying these ideas and understanding the reasons behind them, make their election processes even better.
- System Security (ref. mostly from Brennan)
- Hire
and work closely with a
well-qualified, independent security team to examine the potential for
operational failures of and malicious attacks against the
jurisdiction’s TS voting system. No Board of
Elections has the
qualified experts on staff to guide the employees through and around
the many security concerns; the contractor who provides the
equipment and maintains it is exactly the wrong person to provide
independent security services. Recent scrutiny of the new DRE systems has revealed numerous problems with electronic voting; these systems could easily draw sharp criticism in close elections this November, so a close critical review is imperative.
- The expert security team that is retained must be free
of any
business relationships with any voting system vendors or designers in
order to
perform the necessary analysis and ensure public confidence in that
analysis. The
outside team must also have a proven track record in assessing computer
security in voting systems or comparable technologies.
- The
independent expert security team must be allowed full access to the
hardware/firmware, software code, procedural protocols, design
documentation, and other relevant items associated with the TS voting
system under analysis.
- To ensure such
access, elections officials from
the various states that have purchased the same vendor’s systems should
collectively demand full and complete cooperation from the vendors.
- Elections officials should inform vendors that their level of cooperation will be documented on publicly visible websites for purchasers, including secretaries of state, to review.
- Contract terms may be used to require such
cooperation (and future state purchase contracts should be drafted to
include such requirements). Such alliances of state elections officials
could also be used, where appropriate, to take advantage of economies
of scale in the assessments themselves.
- This may be particularly valuable to avoid
duplicative
assessments of identical voting system technologies used in different
jurisdictions. Indeed, once a full assessment of a given voting system
has been completed and can be shared among all jurisdictions that use
identical technology (i.e.,
hardware and software), elections officials and the independent experts
with whom they contract should be able to focus more exclusively upon
those elements that are unique to their jurisdiction.
(See details in accompanying document: "Performing Voting System
Assessment")
- Voter Interaction with Voting System / Voter Education
- Improve usability.
- Consider hiring a usability expert to review your
system and make recommendations. In addition to the on-screen
instructions and layout, other factors affecting usability include
appropriate lighting and placement of machines.
Example
Arlington County, Virginia, dispatches demo units to each polling place. Make sure poll workers learn procedures for activating demo smart cards so they don’t accidentally use “live” smart cards.
- Review FEC publications on usability.
- Develop a web-accessible sample ballot that shows each screen, including the instruction and ending screens.
Example
Arlington County, Virginia,
includes a
presentation on voting
machines and the voting process on its website.
- Track over-votes and under-votes. Develop Election Day procedures to help determine the nature and cause of under-votes and blank ballots, and whether they reflect genuine abstention or voter confusion (a sketch of such an analysis appears at the end of this section).
- Ask minority language organizations to review ballot
translations.
- If you find a higher percentage of voter error in certain
communities, work with pertinent community groups to educate voters in
those communities.
- Establish procedures for handling a voter who exits before casting a vote.
- Develop procedures that allow you to determine after the
election
which machine the voter used; ensure these procedures also protect the
secrecy of the ballot.
Example
Montgomery County, Maryland, requires the poll workers to
conduct a written tally for each unit.
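Where these procedures produce per-machine data, the comparison can be automated. Below is a minimal sketch in Python, assuming results can be exported as a CSV with machine_id, ballots, and votes columns; those column names are our illustration, so adapt them to whatever your election management system actually exports.

```python
import csv
from collections import defaultdict

def undervote_rates(path):
    """Per-machine under-vote rate: 1 - (votes recorded / ballots cast)."""
    totals = defaultdict(lambda: [0, 0])   # machine_id -> [ballots, votes]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = totals[row["machine_id"]]
            t[0] += int(row["ballots"])
            t[1] += int(row["votes"])
    return {m: 1 - votes / ballots
            for m, (ballots, votes) in totals.items() if ballots}

def flag_outliers(rates, margin=0.05):
    """Machines whose under-vote rate exceeds the county average by margin."""
    county_avg = sum(rates.values()) / len(rates)
    return sorted(m for m, r in rates.items() if r - county_avg > margin)
```

Flagged machines are starting points for investigation (calibration, placement, ballot layout), not proof of a problem.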
- Poll Worker Training and Polling Place Procedures
Poll worker recruiting:
- Recruit local government
employees, particularly those with IT background.
Example
The Washington, DC, Board of
Elections
created a “precinct
technician” position to provide technical support in the polling place;
the technician receives hands-on training on start up and
troubleshooting machine problems.
Poll worker training:
- Create a poll worker position that is dedicated to machine
set
up,
shut down and troubleshooting. Provide supplemental training on
equipment; supplement pay for extra training.
- Require pollworkers to keep a log of Election Day events and problems, including voter complaints, that will help you recreate the events of that day.
Polling Place Setup
- Use uninterruptible power supplies (UPSs) as the machine power source; connect each machine to its own UPS. Daisy-chained machines can become a single point of failure. Have a back-up plan and train pollworkers on how to troubleshoot and report alleged “power failure” problems.
- Angle the machines to protect voter privacy.
- Survey polling places with tech support to check outlets.
Pollworker Accountability:
- Establish checklists to track
pollworker performance on key steps of DRE voting processes.
Example
Montgomery County, MD, has a Precinct Performance Report
(http://www.eac.gov/bp/docs/Precinct%20Performance%20Report.doc) which
tracks such factors as completion of election logs, provisional
ballot accounting forms and compliance with check-in procedures.
- Testing/System Integrity
- Calibration issues: too much wear and tear can impact touch-screen sensitivity.
- Rely on the vendor as little as possible; look for outside IT expertise if it is not available in-house. Have either election staff or independent consultants design and run tests.
- Ensure systematic and consistent testing of each machine.
- Conduct, at a minimum, both acceptance testing and logic and accuracy testing on each system. Logic and accuracy testing should include “incremental testing.”
- Conduct system diagnostics on every machine for every election before you conduct logic and accuracy testing.
- Use separate machines for training and outreach.
- Management and Security
Pre-Election
Day
- Establish a deadline for patches or modifications to prevent unnecessary confusion. (The SOS should do this, but if not, it becomes the county BOEs' responsibility.)
- Build a public-private
partnership to encourage civic participation. Civic groups like
the League of Women Voters, election reform groups, the
chamber of commerce, trade unions, religious groups,
and service clubs can all help, as can the political parties, elected
officials and the county government. Everyone has an interest in
improving the conduct of elections, but isolated and independent
campaigns by these groups are unlikely to be as effective as a
coordinated effort. It should not take much coordination; certainly,
Democrats and Republicans are unlikely to cooperate intensively,
but even a little cooperation can have great impact. (Jones)
- Educate the public about these measures in advance. If we
are
to build
voter confidence, the voting public must understand what the government
is doing to protect their voting rights and to protect the integrity of
the election system. Any campaign will be ineffective if voters believe
that it is empty propaganda, so
it is essential that the measures include a campaign to recruit and
educate members
of the public to serve as observers of and witnesses to every critical
step in the election process. This is why it is so essential that the
campaign listed above be conducted as part of this effort. (Jones)
- Create a time line for election preparation. If you are
introducing a new system, expect to quadruple the amount of time
necessary for preparing
precinct-specific units. Preparation, testing and staging all require
more time.
- Conduct a risk analysis: where are you most vulnerable to problems? At what points is the system (both the administrative system and the machines) most likely to break down? For example, is there an indispensable person? If so, develop a plan for dealing with his/her absence. Develop contingency plans, such as off-site storage of all software and data.
- Cross-train election staff to perform multiple tasks.
- Ensure all software, including patches, is certified.
Example
New York uses bar codes to track
delivery of lever machines in
anticipation of transitioning to DREs.
- Develop sound documentation of all election administration procedures that will allow you to identify the cause of problems after an election. Keep a log of receipt of equipment and software, who performed the programming and testing, and delivery to the staging area or polling place. Retain all paperwork that may be relevant in recreating how a failure might have occurred.
- If the state is the contract holder, develop Memorandums of Understanding (MOUs) with the state election office on authority over system maintenance and modifications, including appropriate lines of communication.
- Develop rules for access to any sensitive equipment.
- Keep a maintenance log for all voting system equipment.
This
log should track who has had access to the machine(s).
- Computers used for ballot definition should be stand-alone
PCs
unconnected to servers or the Internet.
- Machine delivery:
- Conduct risk analysis of the delivery system.
- Develop agreements with each polling place delineating
the
responsible election office and the facility.
- Establish chain of custody.
- Develop checklist for delivery.
- Use bar-coding to ensure proper delivery of all machines to polling places (see the sketch following this list).
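As an illustration of what bar-coded tracking can enforce, the sketch below (in Python, with stage names and a log format that are our own assumptions, not a vendor’s) records each scan and flags machines whose chain of custody is missing a stage:

```python
from datetime import datetime

# Illustrative stages a machine must pass through, in order.
EXPECTED_STAGES = ["warehouse_out", "truck_loaded", "polling_place_in"]

delivery_log = []   # one entry per bar-code scan

def record_scan(serial, stage, handler):
    delivery_log.append({"serial": serial, "stage": stage,
                         "handler": handler, "time": datetime.now()})

def incomplete_deliveries(assigned_serials):
    """Machines whose scans do not cover every expected stage in order."""
    seen = {s: [] for s in assigned_serials}
    for event in delivery_log:
        if event["serial"] in seen:
            seen[event["serial"]].append(event["stage"])
    return [s for s, stages in seen.items() if stages != EXPECTED_STAGES]
```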
- Credentialing
Election Observers:
- Citing relevant provisions in state election law and regulations, the county team should request county officials’ support for non-partisan citizen teams to monitor polls on Election Day.
- If possible, secure letters or other forms of permission/support/authorization for precinct-level teams to carry with them on Election Day.
- If permission is denied, publicize this, appeal to the next level, seek credentials directly from candidates and parties, and continue with plans to train volunteers and place them in precincts on Election Day. (Toolset)
Election
Day / Election Night
- If smart cards are used:
- Control access to the voter smart
cards. Educate pollworkers
and voters to know that the “smart card” is not the ballot and the
voter’s choices are not recorded on the “smart card.” The card merely
directs the voting unit to bring forward the voter’s correct voting
screens.
Example
Montgomery County, MD, directs the pollworker to insert the smart card into the unit on behalf of the voter to ensure that the voter correctly accesses the system.
- Develop a plan to provide Election Day technical support for pollworkers, including a troubleshooting checklist, a call center, and rovers.
- Establish written procedures for handling Election Day
equipment failure.
- Provide for redundant records of results, including paper
printouts.
- Ensure transparency in all aspects of the tabulation
process,
especially in the transport or transmission of results to the central
election office.
- Develop chain of custody for memory cards and machines.
- Reconcile the number of voters with the number of ballots. One of the central accounting measures that can be used to protect the integrity of the ballot box, whether electronic or paper, is the reconciliation of the number of voters who signed into the polling place with the number of ballots counted. If this number is brought forward through the canvass, we gain a very useful check against classic forms of fraud such as ballot box stuffing, pollworker errors such as casting demonstration ballots on real voting machines, voters who have difficulty casting their ballot, and a variety of other problems. The following numbers should be tracked during the election: (Jones)
- Ballots cast (as recorded by the voting machine)
- Ballots voided after a voter attempted to vote and failed or fled
These numbers should add up to the number of signatures in the pollbook, and whenever ballots are voided, a record should be made explaining why.
The above reconciliation will only be possible if a written log is maintained for all ballots voided, since there are other reasons to void a ballot, for example, when a voting machine is incorrectly used for demonstration. It would be appropriate to maintain this log for all precinct worker actions that involve entering voting booths between the time the polls open and the time the polls close.
In addition, the
number of absentee ballots received by the deadline should be
counted (including those cast at satellite polling places during early
voting) and the number of provisional ballots accepted should be
counted. These, added to the number of signatures, give a measure of
the
turnout against which the number of ballots counted can be compared,
not only at the precinct, but at vote collection centers and during the
canvass.
Discrepancies,
for
example, precincts with unusual numbers of voided ballots, should
receive intensive and early attention from auditors. It is useful to
carry the turnout figures forward outside of the election management
system using manual methods in order to act as a check on the integrity
of the electronics. While canvassing an election by hand is horrible
clerical work, doing this one sum manually or at least carrying these
numbers forward outside the election management system should not be
onerous. (Jones #3)
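The arithmetic behind this check is simple enough to state directly. The following Python sketch implements the reconciliation described above; the function names and inputs are ours for illustration, not part of any election management system:

```python
def reconcile_precinct(signatures, ballots_cast, ballots_voided, void_log):
    """Return a list of problems; an empty list means the precinct reconciles."""
    problems = []
    if ballots_cast + ballots_voided != signatures:
        problems.append(f"cast ({ballots_cast}) + voided ({ballots_voided})"
                        f" != pollbook signatures ({signatures})")
    if ballots_voided != len(void_log):
        problems.append("voided ballots lack written log entries")
    return problems

def turnout_ceiling(signatures, absentee_received, provisional_accepted):
    """The figure against which ballots counted in the canvass is compared."""
    return signatures + absentee_received + provisional_accepted
```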
- When the polls close, print and post an additional copy of
the
precinct totals.
- Posting a copy of the precinct totals at the precinct allows any observer to note the totals for any races that interest them and to check them against the official canvass. This allows any interested person to help audit the integrity of the canvassing process and prove to themselves that the county or the election management system has not corrupted the count between the time the polls close and the time the final canvass is published. This measure only works if the final published canvass does not combine absentee, early voting, and provisional ballots, so it is important to break these numbers out. (Jones #4)
- Also, the number of provisional ballots collected at the precinct must be disclosed to any observers present at the closing of the polls so that they can verify that the number of provisional ballots eventually counted does not exceed the number distributed.
- Reconcile the printed record from the precinct with the electronic record.
At the close of the polls, printouts from the precinct are saved, along with compact flash cards and the PEB that was used to close the polls, and all of these are delivered, through a secure chain of custody, to the vote collection center and eventually to the county's central vote tabulation center.
Because of questions about the integrity of computerized vote
tabulation technology, it is appropriate to finish the canvassing
process by comparing the printed record from the precincts with the
electronic records tabulated by the election management system.
In effect, once the
voters are allowed to perform this reconciliation independently, it is
wise for the county to defend itself by performing the same
reconciliation in-house in order to catch and correct any errors before
they are exposed to the public. This procedure is widely used
elsewhere,
including, for example, Dallas County, Texas (where Steve Bolton has
observed it), as well as Iowa, where we are working to write it into
law. (Jones
#5)
- Guard the chain of custody for all election materials.
The chain
of
custody for election materials is strongest when we minimize reliance
on locked doors at the precinct! It is one thing for the law to demand
that there be a lock, but quite another for county officials to examine
the locks to determine if they are genuinely secure or vulnerable
to classical attacks with credit cards, hairpins and other commonplace
tools. If possible, open the polls on the morning of the election. (Not done this way in Florida.)
Use of security
seals is valuable, but only if these seals are not cosmetic. There are
reports from many jurisdictions of the use of custom printed numbered
security seals where the numbers are never checked when the seals are
broken. Such cosmetic procedures are of no use. A seal has value only if its number is recorded when the seal is applied and verified at the time the seal is broken. Polling place, vote collection center, and canvassing center workers should all be informed of this, and election observers should be aware of the need for these checks. (Jones #6)
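A minimal sketch of the seal check, assuming workers keep two simple logs keyed by the sealed location, one written when seals are applied and one when they are broken:

```python
def verify_seals(applied, broken):
    """applied/broken: {seal_location: seal_number}. Returns discrepancies."""
    discrepancies = []
    for location, number in broken.items():
        recorded = applied.get(location)
        if recorded is None:
            discrepancies.append(f"{location}: no record of seal application")
        elif recorded != number:
            discrepancies.append(
                f"{location}: applied #{recorded} but broke #{number}")
    return discrepancies
```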
- Institute a program of rigorous testing.
Direct recording electronic voting systems have been described as black
box voting systems because
observers can do very little to assure themselves that the software and
mechanism inside the voting machine performs correctly. This flaw is
compounded by the fact that the voting system firmware and software in
use today are proprietary and not open to public inspection. The most
difficult part of the voting system to test for correctness is the
touch-screen interface and the firmware behind it. In comparison, the
canvassing procedure is far more open, particularly if the protective
measures outlined above are in place.
The best available defense against the known risks of direct recording electronic voting machines is a rigorous program of testing. Unfortunately, the
pre-election tests conducted on current voting machines are not
sufficiently rigorous. The central problem is that if the machine is
informed that it is undergoing a test, it can be programmed to perform
differently under test than in a regular election. Therefore, an
effective program of testing must include some tests that are
significantly more rigorous than is conventional in pre-election
testing
or even the testing performed by the independent testing authorities
that certify the machines to Federal and State standards. There are
three categories of tests that are particularly important:
a)
Challenge or red-team testing, where knowledgeable technicians and programmers attempt to find and exploit weaknesses in the voting system. Maryland had RABA Technologies of Columbia, Maryland, perform such
such
tests on its voting system (the results are reported in the Trusted
Agent Report
of Jan. 20, 2004). Ohio had weaker tests performed by Compuware Corp.
on four different voting systems, including the iVotronic (the results are
available in Direct Recording Electronic (DRE) Technical Security
Assessment Report of Nov. 21, 2003).
b)
Serious investigation of how the system responds to normal errors,
including how the consequences of these errors show up in the election
results. This is a category of investigation that is important at the
local level because local election workers are the most likely to know
what kinds of errors are normal, and it is the local election workers
who must recognize and correct such errors when they occur. Once you
know what the normal errors are, the training materials may need
adjustment to reflect this, not only with measures to try to reduce the
frequency of these errors, but with specific procedures for dealing
with
their consequences.
Normal errors are the types of errors that a system invites. In the everyday world, for example, locking your keys in the car is a normal error: a dispassionate observer of the driver-automobile-door lock interaction can easily predict that, no matter how carefully you train drivers not to lock their keys in the car, some will accidentally do so. For a
voting system, we know that forgetting to close the polls on one of the
voting machines in the precinct is a normal error, but there are others
that are less well known. The important question for voting system
administrators is, what are the normal errors and for each, how can we
protect ourselves against it?
c)
Parallel testing. Because we know that voting machine software can be
prepared to recognize when it is being tested, the most effective tests
of a voting machine will be tests that are as nearly indistinguishable
from normal polling place operation as is possible. The best proposal
for this involves selecting the machines to be tested at the last
moment, and testing these machines from the minute the polls open to
the
minute the polls close.
Parallel testing, sometimes called parallel monitoring, has been advocated by many people. The California Ad Hoc Touch Screen Task Force recommended parallel testing in their Report of July 1, 2003, and Hans Van Wijk of NEDAP, a voting system vendor based in Holland, presented a paper on parallel testing at the USACM Workshop on Voter-Verifiable Election Systems in Denver on July 28,
2003. This model of testing has been offered by a number of
organizations as an alternative to the use of a voter-verified paper
ballot printer on each voting machine. Parallel testing is strongly
endorsed by the Leadership Conference on Civil Rights, and the state
of California used parallel testing in the March 2, 2004 primary (see
the Parallel Monitoring Program Summary Report prepared by R&G
Associates, Apr. 19, 2004).
In summary,
parallel testing is conducted by picking precincts at random and then
having the testing teams arrive at those precincts at roughly the time
the pollworkers arrive. Each testing team then selects a voting machine
at random from the machines at the selected precinct as the machine to
be tested. This machine is segregated from the other machines at the
precinct, for example, roped off with signs indicating that it is under
test. While the pollworkers open the polls at the other machines, the
testing team opens the polls on the machine being tested, and then, all
day, as the election is conducted on the other machines at the
precinct, the testing team tests the machine under test. As the polls
close, the testing team closes the machine under test, and then prints
out the totals for that machine and verifies them against the test
votes that were cast.
It is important
that parallel testing be conducted in public because the public needs
to know that the county is taking this measure, and at the precinct so
that there is no way that the voting machine can be in any way
specially
prepared for the test. Given both of these conditions, it is reasonable
to recruit members of the public to help. After voters have cast their
ballots, for example, they could be invited to help cast test ballots.
As each test ballot is cast, members of the testing team must note the
votes on that ballot, so that they can compute what the totals should
be
at the end of the day. Any problems voters have with the machine under
test should be noted as well, and of course, voters who helped with the
test by casting test ballots should be rewarded, for example, with
special stickers saying more than the usual "I voted." (Jones #7)
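The end-of-day comparison in a parallel test is itself a small accounting exercise. The sketch below, with data structures of our own invention, tallies the team's hand-kept record of test ballots and reports any race/choice totals that disagree with the machine's printout:

```python
from collections import Counter

def expected_totals(test_ballots):
    """test_ballots: one dict per ballot, mapping race -> choice."""
    totals = Counter()
    for ballot in test_ballots:
        for race, choice in ballot.items():
            totals[(race, choice)] += 1
    return totals

def discrepancies(machine_totals, test_ballots):
    """machine_totals: {(race, choice): count} transcribed from the printout."""
    expected = expected_totals(test_ballots)
    keys = set(expected) | set(machine_totals)
    return {k: (expected.get(k, 0), machine_totals.get(k, 0))
            for k in keys if expected.get(k, 0) != machine_totals.get(k, 0)}
```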
- Involve the county Audit and Management Services
Department.
Counting votes is
an accounting function, just as much so as counting dollars. As with
all accounting, it is subject to both error and the threat of fraud. We
deal with both of these threats in the world of financial accounting by
conducting regular audits, and we know that, if it were not for the
threat of such audits, corporate fraud would be far more common than it
is.
In elections, the
canvassing board conducts a number of self-audits with every election,
and many of the recommendations given here aim to strengthen these.
What
we do not have in our election system in most of the United States is a
system of external audits, where auditors from outside the election
office examine the integrity of the process.
External audits
could, of course, be conducted by the state election office, or by some
national election authority, but we can gain much of the protection of
such an external audit by bringing in auditors from the county's own
audit department. (Jones #8)
- Involve the county E-Gov Department.
Computerized voting
systems contain computers. This is obvious, but what is not obvious is
why local and state governments across the country are not recognizing
this. Data processing or electronic government departments have decades
of experience in questions of security, fault recovery, backup policy,
data communications and related domains, and all of this is
applicable to election systems. (Jones #9)
- Improve incident reporting.
In general, the
process of system certification requires feedback. Thus, for example,
the Federal Aviation Administration requires reports for all incidents
involving airplanes to be sent to the FAA as well as to whoever might
have caused the incident. Without this feedback, the FAA would not have
the information needed to improve their regulations, their testing of
airplanes, or their operating rules. Similarly, in the voting system
domain, the state elections office and the Federal Election Assistance
Commission need to learn about any problems encountered by the
counties so that they can adjust their certification requirements.
Unfortunately,
nationwide, we have a problem with this. The Election Assistance
Commission has no resources to handle incident reporting, and incident
reports sent to most states are filed, never to be examined again. This
must change, but change will require that the counties act,
routinely reporting incidents to the state. If state authorities
continue to ignore such reports, we will need to create a
non-governmental organization to handle them. (Jones #10)
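Counties need not wait for a state-mandated format; even a simple structured log, as in the hypothetical sketch below, makes incident reports aggregatable rather than filed and forgotten. The fields are illustrative:

```python
import csv
from dataclasses import dataclass, fields, asdict

@dataclass
class Incident:
    date: str
    precinct: str
    machine_serial: str
    description: str
    resolution: str

def export_incidents(incidents, path):
    """Write incident records to CSV for routine submission to the state."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=[fld.name for fld in fields(Incident)])
        writer.writeheader()
        for incident in incidents:
            writer.writerow(asdict(incident))
```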
- Track software version usage.
The canvass for an
election should include a record of the firmware and software version
numbers of all electronic systems used in arriving at that canvass.
These should not be taken from records of the version that was supposed
to be installed, but should be taken from the systems themselves.
In fact, even this
is inadequate, since a corrupt piece of software can report any version
number it wants. There are unsolved technical problems involved in
actually determining, to any degree of certainty, what software is
actually running on an arbitrary computer. Therefore, for the time
being, we must accept the report of the system and hope that the
software certification process and
the chain of custody from the certification authority to the voting
machine are both rigorous enough to defend us against misreported
versions. (Jones #11)
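Record-keeping for this is trivial once the reported versions are transcribed; a minimal sketch (the certified version string below is a placeholder, not a real release):

```python
def version_exceptions(reported, certified_version):
    """reported: {machine_serial: version string read from the machine itself}.
    Returns the machines whose reported version differs from the certified one."""
    return {serial: ver for serial, ver in reported.items()
            if ver != certified_version}

# Usage: version_exceptions(reported, certified_version="CERTIFIED-1.0")
```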
-
Track the source of data used in canvassing.
The iVotronic
system offers many ways to extract data for inclusion in the canvass.
There are three internal flash EEPROM memories in the machine, from
which data may be extracted using PEBs, compact flash EEPROM memory
cards or a serial data link. Depending on which extraction path is
used,
it is possible that different data may be extracted!
Only summary data
is extracted to the PEB, but this summary data is committed to
paper immediately on the printer at the precinct, so it provides
valuable protection against loss or corruption of data in the
electronic
transmission paths upward through the canvassing process.
As Steve Bolton of
ES&S has explained, data extracted via the serial port, for example
to a laptop computer, includes all vote image reports and event logs,
but all of this data comes from the first of three flash EEPROM chips
inside the iVotronic computer. This fact is important in the event that
there is any disagreement between these chips.
Data extracted via the compact flash card also includes all vote image
reports and event logs,
but in this case, this data will come from any one of the three flash
EEPROM chips, whichever one the internal firmware judges to be the most
authoritative.
Therefore, in the
event of disagreement between the internal EEPROM chips, data extracted
via the Compact Flash card and data extracted via the serial port may
differ, and these two paths are the only ways to extract detailed
reports from the machine, as opposed to the summary data extracted via
the PEB! Therefore, it is imperative to maintain a record, for each
machine, of any alternate path used for data extraction. It is also
noteworthy that the integrity of the data extracted may vary depending
on the path by which it is extracted.
Ideally, ES&S
(and other electronic voting system vendors) should incorporate data
path and data source tracking into their systems, so that this
information is automatically tracked by the canvassing system, but
until
this is done, manual records are essential. In addition, even when this
is automated, it should be subject to routine testing, and this
requires that manual records be maintained during the closing and
canvassing of precincts that are subject to audit.
In summary, unless
the iVotronic machine indicates a serious error condition because of
a disagreement between the internal EEPROM memories, extraction by the
serial port is an acceptable path, so long as chain of custody issues
are carefully attended to. Extraction of data via the compact flash
card should be acceptable once the problems with the Unity
election management system are solved; these are the subject of a later
section. (Jones #12)
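Until vendors automate data-path tracking, the manual record can take any convenient form. The sketch below shows one possible structure, plus a check for machines whose detailed data was pulled over more than one path, which is where the disagreements described above could surface. The path names follow the discussion; the log format is our illustration:

```python
extraction_log = []   # one entry per data extraction

def record_extraction(machine_serial, path, operator):
    assert path in ("PEB", "serial_port", "compact_flash")
    extraction_log.append({"machine": machine_serial,
                           "path": path, "operator": operator})

def machines_with_mixed_paths():
    """Machines whose detailed data came over more than one path; these are
    the machines where serial-port and compact-flash data could disagree."""
    paths = {}
    for event in extraction_log:
        if event["path"] != "PEB":   # the PEB carries only summary data
            paths.setdefault(event["machine"], set()).add(event["path"])
    return [m for m, p in paths.items() if len(p) > 1]
```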
- Allow only the minimum necessary software on election
computers.
The
Unity election management system makes little or no use of security
technology to protect the integrity of election data, and the data
downloaded from the iVotronic are similarly unsecured. These weaknesses
are documented in the security assessment of the iVotronic system
performed by Compuware for the State of Ohio, available on the web at http://www.sos.state.oh.us/sos/hava/files/compuware.pdf. It is worth noting, for example, that the database used within the Unity election management system is in dBase format, and files in this format can be manipulated using Microsoft Excel. (Jones #13)
Post-Election
- Conduct post-election logic and accuracy testing of
machines.
- If unofficial results are modemed over a phone line, use encryption to protect the data during transmission (see the sketch following this list).
- Conduct a post-election audit to reconcile all records,
especially the number of voters and the number of votes cast.
- Conduct a public post-election “debriefing” to address any
concerns related to the voting system.
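As a sketch of the encryption step, the fragment below uses the third-party Python "cryptography" package (our choice for illustration; any vetted symmetric cipher would serve). The key must be shared out of band, never over the same phone line:

```python
from cryptography.fernet import Fernet

def encrypt_results(key: bytes, results_path: str, out_path: str) -> None:
    """Encrypt an unofficial-results file before it is modemed out."""
    with open(results_path, "rb") as src:
        token = Fernet(key).encrypt(src.read())
    with open(out_path, "wb") as dst:
        dst.write(token)

# One-time setup at the central office (key shared out of band):
# key = Fernet.generate_key()
```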
- Considerations for making TS voting systems
accessible
- Solicit the help of disability organizations in training
poll
workers to assist voters using accessible equipment.
- Place machines in a location where polling place noise
won’t
overwhelm the audio ballot.
- Recruit voters with disabilities and minority language
voters
to serve as pollworkers.
References:
Performing Voting System Assessment
The expert security team that is chosen should include within their scope of work and final recommendations, at a minimum, the analyses listed below.[5]
Each jurisdiction and each voting system will inevitably present unique
concerns that must be assessed by the contracting expert security team.
Indeed, officials should establish that one of the most important
aspects of an expert security team’s preliminary review will be to
identify areas of vulnerability that are unique to the jurisdiction at
issue. In addition, as noted already, elections officials can and
should take advantage of voting system assessments performed in other
jurisdictions on identical hardware and software systems.
a.
Hardware Design Assessment
Potential
vulnerabilities:
Hardware
design flaws can allow an attacker to access the voting system to
change critical settings, install malicious devices, or otherwise
tamper with the voting terminals or tally servers. Examples include
machines or ancillary components without sufficient locks, with exposed
drives, or with other easily accessible hardware components. Such
vulnerabilities could lead to machine malfunctions, miscounted votes,
or erasure of data, were an attacker able to exploit them.
Recommendations:
In
the area of hardware design, a critical assessment tool has been
so-called “red team” exercises, in which a team of analysts attempts to
attack the system under review to identify points of vulnerability.[6]
In addition, the hardware must be studied to identify design flaws that
could allow either access to attackers or mere operational failures.
All devices and casings must be protected against such access. The
independent expert security team should provide a comprehensive
assessment of hardware design flaws or opportunities for improvement.
Among
other
remedial recommendations that have resulted from such hardware design
assessments are: the use of “tamper tape” on vulnerable hardware
components to ensure that attempts to breach those components are
detectable, replacement of certain hardware components with less
vulnerability, and new security procedures to compensate for an
identified hardware design flaw.
b.
Hardware/Firmware
Configuration Assessment
Potential
vulnerabilities: Hardware
or firmware configuration refers to the manner in which different
hardware or firmware components are connected and their operating
settings.[7]
Certain configurations create more potential access points through
which malicious attackers could gain access into the voting system.
Examples include the ability to “boot” a voting terminal or tally
server from a diskette or CD ROM (rather than from an internal hard
drive) and thereby gain access to the software code of that terminal or
server without a password. Such vulnerabilities could allow an attacker
to cause significant damage, from systematically erasing or
misrecording votes as they are cast to complete machine malfunctions.
Recommendations:
“Red team” exercises and other tools should be used to assess the
vulnerability within hardware/firmware configurations in the DRE voting
system. All devices must be checked to ensure that proper locks with
unique keys or passwords are used; network access is not available
through modems, Ethernet ports, or other points between or in hardware
components; and machines can be booted only off a secure drive (as
opposed to a CD ROM or floppy disk).
Among
other
recommendations that are likely to address such concerns are
configuration controls, so that it is not possible to boot off a CD ROM
or floppy disk; the use of user names, passwords, and file access
controls that are unique and inaccessible to potential attackers; and
the use of “tamper tape” to protect the server or voting terminal from
tampering.
c.
Software Design Assessment
Potential
vulnerabilities:
Software design vulnerabilities could involve either good faith flaws
or malicious software code hidden within the voting system. Examples of
good faith design flaws include poor practices, such as including
passwords or encryption keys in lines of easily accessible software,[8]
or simply faulty software code that leads to voting machine
malfunctions on Election Day. Malicious software code could include
instructions to a voting system to count votes erroneously at random or
in specified patterns designed to affect the tallies of a voting
machine or an entire election. Although computer security experts warn
that it is virtually impossible to guarantee that malicious code has
not been introduced into a system, certain basic measures can be taken
to reduce the risk of bad software design substantially, whether of
unintentional or malevolent origins.
Recommendations:
To assess the vulnerability of the system’s software, the independent
expert security team should review source code with particular
attention to authentication, encryption, and the accessibility of
critical files, such as those containing voting records. In short, the
expert security team must assess the extent to which the source code
itself includes unnecessary security risks that could be reduced
through patches, encryption, or other security measures, and whether
the source code follows good engineering practices to reduce the risk
of accidental failures.
In
addition to
security risks, the expert security team should perform extensive tests
of the basic functionality of each aspect of the voting systems,
including the recording and reporting of votes. Such testing is
essential to assure good software quality. Although it is virtually
impossible to guarantee that even an expert will find cleverly written
malicious software code, extensive testing will increase the likelihood
that the product of such code will be detected before Election Day.
Among
other
recommendations that are likely to address software design problems
are: specific updated patches; crypto-signatures (i.e.,
digital “fingerprints”) to ensure that any unintended software code can
be identified more easily; and, in the case of good faith software
design flaws, revisions to software source code to address specific
problems of security or functionality. Note that such revisions must
themselves undergo security assessments, within the constraints of
time, before use on Election Day.
d.
Software Configuration
Assessment
Potential
vulnerabilities:
Software
configuration refers to the ways in which the various software elements
are set up and arranged together to work properly. Flaws in such
configuration can allow unintended access into the software code by an
attacker, or simply expose the software to common dangers, such as
computer viruses. Examples of vulnerable software configurations
include the failure to ensure that anti-virus software programs or
other software “patches” designed to block unauthorized access are in
place and up-to-date throughout the system.[9]
In addition, the software configuration could also expose weak links in
the security of the connections between various software components,
through which an attacker could gain access to the system and affect
the machines’ operation.[10]
Uncertainty about poorly controlled configuration details will make
security assessment much more difficult, if not impossible.
Recommendations:
To assess software configuration problems, the independent expert
security team should analyze the entire voting system to examine how
data flows from one element to another. For example, experts may find
that there is a security vulnerability in the software that moves the
ballot information into the vote capture system to record the vote.
Each separate device or interface between devices (and the software
inside) represents a potential point of attack that must be assessed.
In addition, experts must examine the patches and anti-virus software
used in the servers and the terminals. Further, the expert security
team should study the procedures and mechanism, if any, to upgrade
software in the system. To assess whether improper software upgrades
have occurred, the expert security team must compare the existing code
with the most trusted version of the same. If software upgrades are to
be completed from a remote location, the risks inherent in such
upgrades must be documented and assessed. In any event, software
upgrades and even parameter changes should be carefully controlled and
documented at all times, and the procedures for doing so should be
reviewed as part of the assessment process.
Among
other
recommendations that are likely to address software configuration
problems are: placing digital signatures on software to detect
malicious code, precluding any remote software upgrades as unacceptable
risks, new patches in the operating systems to improve security, and
reconfiguration of certain software elements to eliminate weak links in
the system.
e.
Assessment of Procedures
Potential
vulnerabilities:
The procedures used to handle a voting system can facilitate security
breaches or machine malfunctions or, at the least, fail to stop such
problems. Examples of problems in this area include the absence of
adequate security procedures (e.g.,
using only one encryption key or password for all machines rather than
unique keys or passwords for each machine), poor implementation of
adequate procedures by elections workers, or departures from protocol
caused by unforeseen circumstances on Election Day.[11]
In addition, procedures that are not directly related to security can
produce unnecessary security risks. For example, procedures that allow
last-minute software upgrades to the machines or server can, if not
handled properly, allow uncertified software to be used on Election Day
that bypasses critical security safeguards.[12]
Inadequate procedures for routine auditing, detection, and response to
security incidents can also undermine the effectiveness of other
security measures.
Recommendations:
To
assess both security procedures and election procedures that may have
security implications, the independent experts must study relevant
procedures in place in the jurisdiction, determine whether they are
fully in use, and understand which individuals are trained and
responsible to ensure their proper implementation. In addition, the
expert security team must assess all locks or other security devices to
determine their vulnerability, including such facts as how many keys
have been made that can open a lock and to whom the keys have been
given. Such analyses must address the entire voting system and must
incorporate any changes that occur in procedures on or before Election
Day. The objective is to assess the chain of possession from vendor to
precinct so that no unintended software modifications or hardware
tampering can occur. The same consideration should be given to
assessing procedures used to create the chain of possession of voting
results, from balloting through certification.
Measures
that
are likely to improve security and other procedures include:
replacement of locks and security devices; implementation or
improvement of standard procedures; better training on procedures for
key officials and workers; the use of Tripwire, or a similar software
authentication program, to provide a check of software integrity on the
machines and server; and protocols for use of “tamper tape” and other
protective measures.
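As an illustration of what a Tripwire-style check does, the sketch below hashes each monitored file and compares it against a manifest recorded when the software was installed and certified. It is a simplified stand-in for a real integrity tool; the JSON manifest format is our assumption:

```python
import hashlib
import json

def hash_file(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_manifest(manifest_path):
    """manifest: {file_path: expected_sha256}. Returns mismatched files."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    mismatches = {}
    for path, expected in manifest.items():
        actual = hash_file(path)
        if actual != expected:
            mismatches[path] = (expected, actual)
    return mismatches
```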
f.
Physical Security
Assessment
Potential
vulnerabilities:
Voting
systems must be securely stored and kept physically out of the reach of
potential attackers. Without such physical security precautions, the
finest security checks on voting terminals or servers may be rendered
moot by subsequent attacks on or before Election Day, software or
hardware may be maliciously altered, and machines may be programmed to
miscount or erase votes or simply to malfunction in certain areas or
polling places.
Recommendations:
The assessment of physical security will require different analyses in
different jurisdictions, depending upon the size of the jurisdiction,
the number of machines, the methods of storing and handling the
machines, and other factors. The independent expert security team must
study the entire chain of custody of all of the voting terminals, the
servers, and any other materials related to the use of the DRE voting
systems. The “chain of custody” assessment should also cover the
recording and transmission of voting results, including all
telecommunications or networking facilities utilized. The chain of
custody must not end on Election Day, moreover, in case of the need for
a new election or additional analysis of the systems after the
election.
Among
other
recommendations that are likely to address physical security concerns
are: changes in storage methods for machines and servers, limits on
personnel access to such components, improved security procedures, and
better training of election workers to avoid unnecessary exposure of
voting system components.
3. Implementing Expert Recommendations
Eliminating
unnecessary security risks and restoring public confidence in voting
systems within a jurisdiction requires not just obtaining a risk
assessment but also implementing measures to limit those risks before
Election Day. For this reason, elections officials should commit, prior to hiring an independent expert security team, to implementing all reasonable recommendations within a pre-established timetable and to
provide public explanations (working in concert with the independent
oversight panel) of any decisions not to implement specific
recommendations. Officials should provide public notice of both the
risk assessment process and the plan for implementation of such
recommendations. The independent oversight panel recommended below
would be a valuable asset in this effort.
In
addition,
the independent expert security team should be required to identify a
series of checks that can be performed after the recommendations have
been adopted and implemented that will test whether they have, in fact,
been so implemented. Such tests are critical not only to ensure that
security and operational improvements have been made, but also to
instill public confidence that the independent assessment process was
indeed independent.
4. Developing Security
Training
Any
serious
expert assessment will result in recommended improvements in the
training of elections officials and workers to address security
concerns and operational failures on DRE voting systems. This is true
because experience with DRE machines is still limited in most
jurisdictions, and election worker training often remains limited in
any event. Accordingly, elections officials should develop a
comprehensive security training program for election workers at every
stage in the election process. Although the specifics of each
jurisdiction’s training will differ, all jurisdictions must include
training on the changes implemented in response to the independent
expert security team’s recommendations.
5. Randomized Parallel Testing
Parallel testing is the only procedure available to detect non-routine code bugs or malicious code on DRE systems. In addition to laboratory testing during the certification process, it is essential that DRE systems be tested during real elections,
using so-called parallel testing procedures. Parallel testing is needed
for two separate purposes: (a) to test the myriad parts of the system
that get used during a real election but not in a laboratory testing
situation, and (b) to check for the possible presence of malicious code
or insider manipulation that is designed specifically to avoid
detection in a laboratory or testing situation, but to modify votes
surreptitiously during a real election. Where possible, parallel
testing should be performed in every jurisdiction, for each distinct
kind of DRE system. While experts agree that parallel testing cannot
reveal all forms of malicious code, it can be a critical part of the
kind of comprehensive security measures recommended in this report.
(Brennan)
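One practical detail is making the random selection itself verifiable. A sketch, assuming the seed is produced publicly (for example, by dice rolled before observers) so that anyone can reproduce the draw:

```python
import random

def select_precincts(precinct_ids, count, public_seed):
    """Reproducible draw: the same witnessed seed always yields the same
    selection, so observers can re-run it and confirm the choice."""
    rng = random.Random(public_seed)
    return rng.sample(sorted(precinct_ids), count)

# Example: chosen = select_precincts(all_precinct_ids, 5, public_seed=271828)
```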
[5] Elections officials should consult the procedures described in the publication NIST 800-30, “Risk Management Guide for Information Technology Systems,” and the baseline information categories defined in the NSA Infosec Assessment Methodology. These documents are used by the U.S. Government to define the scope of work for its security assessments.
[6] As described in RABA’s report, “A Red Team exercise is designed to simulate the environment of an actual event, using the same equipment and procedures of the system to be evaluated. Teams are then free to experiment with attack scenarios without penalty.” RABA Technologies LLC, Trusted Agent Report: Diebold AccuVote-TS Voting System, at 16 (Jan. 20, 2004), available at http://mlis.state.md.us/Other/voting_system/trusted_agent_report.pdf. RABA’s red team exercises focused on smart card vulnerabilities, the security of each voting machine terminal and of the server, and the methods used to upload results after an election. Id.
[7] Firmware commonly refers to the coded instructions that are stored permanently in the read-only memory (“ROM”) inside a computer system’s hardware. It is thus easier to change than hardware but harder to change than software stored on a disk. Firmware is often responsible for the behavior of a computer system when it is first switched on. A typical example would be a firmware program that loads an operating system from a hard drive or from a network and then passes control to that operating system once the computer is fully booted.
[8] See, e.g., RABA Technologies LLC, supra note 6, at 16. RABA’s red team exercises revealed that the smart cards’ passwords were actually contained in the source code for the systems, which allowed the team easily to gain access to a card’s contents and thus to vote multiple times.
[9] For example, the RABA investigators who analyzed the Diebold machines to be used in Maryland found that, with the correct phone number of the central server in each local board of elections, they could take control of the entire server from any phone in the world. The vulnerability was the result of a failure to update the so-called “GEMS server” with at least 15 security patches available from Microsoft. Id. at 20-21.
[10] The Seattle Times reports that an internal Diebold email allegedly noted that King County (WA) was “famous” for using uncertified Microsoft Access software to open the GEMS election database. See Keith Ervin, No election snags, director says: Absentee ballots on time, security measures in place, Seattle Times, Oct. 28, 2003, available at http://seattletimes.nwsource.com/html/localnews/2001776406_voting28m.html.
[11] The RABA investigators found that all 32,000 of Maryland’s touch-screen terminals had the same locks and keys, making every machine accessible to anyone with one of the keys. The keys could also be easily reproduced at three local hardware stores. RABA Technologies LLC, supra note 6, at 18. The Washington Post reports that malfunctioning machines were removed for repair and returned to service during Election Day in Fairfax County, Virginia. See Eric M. Weiss & David Cho, Glitches Prompt GOP Suit Over Fairfax Tabulations, Washington Post, Nov. 5, 2003, available at http://www.washingtonpost.com/ac2/wp-dyn/A1397-2003Nov5.
[12] In three central Indiana counties, for example, uncertified firmware was loaded into the voting systems by Election Systems & Software as a result of inadequate procedures. See Rick Dawson and Loni Smith McKown, Voting Machine Company Takes Heat Over Illegal Software, WISH-TV8, March 11, 2004, available at http://www.wishtv.com/Global/story.asp?S=1704709&nav=0Ra7JXq2. In California, the installation of uncertified software occurred on several occasions and led to the Secretary of State’s decertification of DREs. See, e.g., Kim Zetter, E-Voting Undermined By Sloppiness, Wired, Dec. 17, 2003, available at http://www.wired.com/news/evote/0%2C2645%2C61637%2C00.html.