Title:
CRYPTOGRAPHIC SYSTEMS AND METHODS, INCLUDING PRACTICAL HIGH CERTAINTY INTENT VERIFICATION, SUCH AS FOR ENCRYPTED VOTES IN AN ELECTRONIC ELECTION
Document Type and Number:
WIPO Patent Application WO/2005/122049
Kind Code:
A3
Abstract:
Methods and associated systems provide proof of a ballot cast in an election or of user choices under a data structure. The method includes, for example, casting a ballot representing a voter's intended choice associated with a cast ballot, and creating a private, paper receipt (602) that represents the voter's intended choice associated with the cast ballot. The private, paper receipt (602) includes human-readable information to permit the voter to publicly verify that the cast ballot has been included in a ballot tabulation process, and wherein only the voter can discern from the human-readable information on the private, paper receipt (602) what the voter's intended choice was, with respect to the cast ballot.

Inventors:
NEFF ANDREW C (US)
Application Number:
PCT/US2005/020094
Publication Date:
July 24, 2008
Filing Date:
June 07, 2005
Assignee:
DATEGRITY CORP (US)
NEFF ANDREW C (US)
International Classes:
G06K17/00; G07C13/00; G07C13/02
Foreign References:
US20030158775A12003-08-21
US7237717B12007-07-03
US20020158118A12002-10-31
US20060273169A12006-12-07
Attorney, Agent or Firm:
DALEY-WATSON, Christopher, J. (P.O. Box 1247, Seattle, WA, US)
Claims:
CLAIMS
I/We claim:
1. An automated method for permitting a voter to vote in an election, the method comprising:
providing to the voter an electronic ballot, wherein the electronic ballot includes at least two ballot choices;
receiving from the voter a selected ballot choice;
automatically generating a verifiable choice associated with the selected ballot choice, wherein the verifiable choice represents a matrix having at least a two dimensional array of positions, wherein the matrix includes at least two rows or columns along one axis associated with the at least two ballot choices, and a predetermined number of columns or rows in a transverse axis, and wherein each of the positions in the two dimensional array of positions is associated with a secret value hidden from the voter;
printing at least one pledge associated with the selected ballot choice;
prompting the voter to select at least one of the predetermined number of columns or rows associated with the transverse axis;
receiving from the voter a selected one of the predetermined number of columns or rows associated with the transverse axis;
revealing to the voter at least one value in the matrix associated with at least one position corresponding to the selected ballot choice along the one axis and the selected one of the predetermined number of columns or rows associated with the transverse axis, wherein the revealed value is associated with the selected ballot choice and is related to the printed pledge to verify to the voter that the verifiable choice is associated with the selected ballot choice; and
providing the verifiable choice for tally in the election.

2. The method of claim 1 wherein the printing includes printing and obscuring the pledge before receiving from the voter a selected one of the predetermined number of columns or rows associated with the transverse axis.
3. The method of claim 1 wherein the values are binary values, and wherein the selected one of the predetermined number of columns or rows associated with the transverse axis is a predetermined pattern of positions to be revealed.
4. The method of claim 1 wherein automatically generating a verifiable choice associated with the selected ballot choice includes providing a same value for each position in the one axis associated with the selected ballot choice.
5. The method of claim 1 wherein the ballot choices include yes, no, and abstain choices, and wherein the voter's choice is represented by an encoding scheme that differs from an encoding scheme for the voter's unchoices.
6. A computer-readable medium whose contents cause at least one data processing device to perform a method to provide proof of a ballot cast in an electronic election, the method comprising:
receiving an indication corresponding to casting of an electronic ballot to represent an intended choice associated with the cast ballot;
generating a private, paper receipt that represents the intended choice associated with the cast ballot; and
wherein the private, paper receipt includes human-readable information to permit public verification that the cast ballot has been included in a ballot tabulation process, and wherein an ability to discern what the intended choice was with respect to the cast ballot from the human-readable information on the private, paper receipt is restricted.

7. The computer-readable medium of claim 6 wherein the computer-readable medium is a memory for a data processing device.
8. The computer-readable medium of claim 6 wherein the computer-readable medium is a logical node in a computer network receiving the contents.
9. The computer-readable medium of claim 6 wherein the computer-readable medium is a computer-readable disk.
10. The computer-readable medium of claim 6 wherein the computer-readable medium is a data transmission medium carrying a generated data signal containing the contents.
11. The computer-readable medium of claim 6 wherein the generating includes employing error detecting codes to represent with equal parity the intended choice, and to represent with unequal parity any unintended choices.
12. An apparatus for providing electronic voting, comprising:
display means for displaying instructions, information, and ballot choices to a voter;
printer means for producing a voting receipt to the voter; and
voting device means, including user input means, for receiving input from the voter to select at least one of the ballot choices, and wherein the voting device means includes:
means for generating an electronic ballot commitment based on the received ballot choice;
means for producing a pledge commitment based at least in part on the electronic ballot;
means for receiving a challenge from the voter and providing response information; and
means for recording or transmitting an encrypted ballot that at least includes the received ballot choice,
and wherein the printer means produces the voting receipt based on the received ballot choice and provides information to the voter regarding the received ballot choice without providing public information regarding the received ballot choice.
13. The apparatus of claim 12 wherein the means for generating employs cryptographic error detection codes that represent the received ballot choice with one parity, and represent remaining ballot choices with another parity.
14. The apparatus of claim 12 wherein the printer means includes an obscuring shield to visually obscure a printed pledge commitment before receiving from the voter the challenge, and wherein the printer means provides the pledge commitment to the voter after receiving the challenge.
15. The apparatus of claim 12 wherein the printer means includes an obscuring shield to visually obscure a printed unlock code before receiving from the voter the challenge, and wherein the display means displays one or more pledge commitments with one or more voter received ballot choices after the voting device means receives the unlock code.
16. The apparatus of claim 12 wherein the printer means includes an obscuring shield to visually obscure at least one printed pledge commitment before receiving from the voter at least one challenge, wherein the printer means provides the pledge commitment to the voter after receiving the challenge, wherein the printer means prints the receipt with ballot choices and associated pledges for comparison by the voter with the printed pledge commitment, and wherein the printer means erases the printed pledge commitment after voter comparison.
17. The apparatus of claim 12 wherein the voting device means includes a one-way communication channel coupled with the printer means for at least selectively only providing data to, and not receiving data from, the printer means, and wherein the printer means, or a separate device, includes user-input means for receiving a voter input code.

18. The apparatus of claim 12 wherein the voting device means is located remote from a voting poll location, and wherein the voting device means is at least selectively coupled to a computer network.
19. The apparatus of claim 12 wherein the pledge commitment is provided to the voter by either the voting device means or by the printer means.
20. An apparatus for use in electronic voting, the apparatus comprising:
a display device for displaying ballot choices to a voter;
a receipt generator for producing a voting receipt to the voter based on the voter's selection of at least one of the ballot choices;
a voting device having a user input portion to receive the voter's selection of the ballot choices, wherein the voting device is configured to create an encrypted electronic ballot based on the voter selected ballot choices; and
a one-way communication channel coupled among the receipt generator and the voting device, wherein the one-way communication channel at least selectively permits the voting device to provide information to the receipt generator to permit the receipt generator to produce the voting receipt for the voter.
21. The apparatus of claim 20 wherein the one-way communication channel includes:
a wireless transmitter coupled to the voting device, and a wireless receiver coupled to the receipt generator and configured to receive communications from the wireless transmitter;
a removable storage media writer coupled to the voting device and a removable storage media reader coupled to the receipt generator to permit the voter to move a piece of storage media from the removable storage media writer to the removable storage media reader; or
an intermediate switch box for selectively connecting and disconnecting the voting device to the receipt generator.

22. The apparatus of claim 20 wherein the one-way communication channel or the receipt generator includes a user input device to receive voter input at least at a time when the voting device can not receive data from the receipt generator.
23. A machine-readable medium storing a data structure, wherein the data structure is configured for encoding user selected choices associated with the data structure, the data structure comprising:
first and second code words, wherein the first code word corresponds to a user-selected choice and the second code word corresponds to at least one choice not selected by the user;
an encryption function for encrypting the first and second code words;
a partial reveal function for both generating encrypted data for output, and for generating an output parameter regarding the encrypted first and second code words and the user-selected choice; and
a relationship operation between the first and second code words and the partial reveal function, wherein the relationship operation is capable of being automatically implemented, and wherein the relationship operation permits a receipt to be generated for the user that represents to the user the user-selected choice, but which does not provide to others information regarding the user-selected choice.
24. The data structure of claim 23 wherein the first and second code words are encoded using odd and even parity encoding pairs, or using orthogonal group encoding.
25. A method for providing proof of a ballot cast in an election, the method comprising: casting a ballot representing a voter's intended choice associated with a cast ballot; creating a private, paper receipt that represents the voter's intended choice associated with the cast ballot; and wherein the private, paper receipt includes human-readable information to permit the voter to publicly verify that the cast ballot has been included in a ballot tabulation process, and wherein only the voter can discern from the human-readable information on the private, paper receipt what the voter's intended choice was, with respect to the cast ballot.
26. The method of claim 25, further comprising generating, from multiple election authorities, symmetric shares of verifiable choices for use in casting the ballot.
27. The method of claim 25, further comprising generating, from multiple election authorities, multiple challenge strings or challenge string shares for use by voters to verify intended choices with respect to cast ballots.
Description:
CRYPTOGRAPHIC SYSTEMS AND METHODS, INCLUDING PRACTICAL HIGH CERTAINTY INTENT VERIFICATION, SUCH AS FOR ENCRYPTED VOTES IN AN ELECTRONIC ELECTION

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of U.S. Provisional Patent Application Nos. 60/577,566 and 60/579,894, filed June 7 and June 15, 2004 (attorney docket numbers 32462-8011US and 32462-8011US1), respectively, both entitled "Practical High Certainty Intent Verification for Encrypted Votes," and 60/682,792, filed May 18, 2005, entitled "Cryptographic System and Method, Such as For Use in Verifying Intent of a Voter in an Electronic Election" (attorney docket number 32462-8011US2), all by the same inventor and assignee.

BACKGROUND

[0002] Cryptographic election protocols have for many years endeavored to provide a purely information based procedure by which private (i.e. secret) voter choices (i.e. votes) can be publicly aggregated (i.e. tallied) subject to two requirements:

1. Every voter should be able to determine with high certainty that her choice (vote) has been accurately included in the final result, or tally, without any requirement for the voter to trust the behavior, action, or proper functioning of one or more election system components.

2. It should not be possible for any voter, by means of tangible evidence, to convince another individual, or party (technically referred to as the coercer), of the value of her choice (vote).

[0003] Intuitively, these two requirements seem mutually exclusive. Regarding the second criterion, there are technical limits to the degree that it can be achieved at all. For example, if the "other party" includes all other voters in the election, then the value of the voter's choice can be simply deduced by the coercer from the final tally, independent of any help from the voter. Nevertheless, under standard cryptographic assumptions, and reasonable assumptions about the extent of collusion achievable by the coercer, protocols have been proposed that, at least theoretically, successfully address these requirements simultaneously. Each of these schemes has some practical drawbacks, though.

[0004] In "Receipt-Free Secret-Ballot Elections" by, J. Benaloh and D. Tuinstra, the first theoretical framework is described, but certain elements of how it is to be embodied in practice are left unspecified. In particular, each voter must leave the "booth" with a record of information to compare with the public tally. The assumption seems to be that voters will remember this information, but the amount of information is large enough that human memory is not a reasonable data recording device. Further, if one imagines that a receipt type printer is used instead for data recording, it would be important to "undo" the sequence in which information is presented in the booth. This probably means that the receipt printer must cut the receipt into several pieces before delivering it to the voter. Furthermore, the scheme is cumbersome in that it requires the voter to compare a large amount of data with data in the public tally.

[0005] The scheme described in "Secret-Ballot Receipts and Transparent Integrity: Better and less-costly electronic voting at polling places," by D. Chaum, http://www.vreceipt.com/article.pdf, 2002, and elaborated upon in "A Dependability Analysis of the Chaum Digital Voting Scheme," by J. Bryans and P. Ryan, University of Newcastle upon Tyne Technical Report Series CS-TR-809, 2003, addresses several of the issues of the previous schemes described above by using visual cryptography (See: M. Naor and A. Shamir, Visual Cryptography, Advances in Cryptology - Eurocrypt 94, LNCS vol. 950, pp. 1-12, Springer-Verlag, Berlin, 1995). However, it also creates some issues of its own. First, as with the previous scheme, there is still a need to "destroy physical evidence" in order to prevent the threat of coercion. In this scheme, it is one of the two media layers on which the visual cryptography pieces of the scheme are printed or otherwise marked. Also, because multiple layers must be printed and exactly aligned, a printer with special capabilities is required. That is, it is not possible to use many standard and inexpensive printing devices. A third characteristic of the scheme is the fact that fraud can be detected by the voter with a probability of at best 1/2. In principle, this is not a big problem for attended (i.e. poll site) voting, but it does raise some practical concerns: A moderately large chance of undetected fraud means that voters must be able to protest when they detect a fraud event. The protest process may be cumbersome, and could result in a loss of ballot privacy. Moreover, protests may occasionally occur even if the voting device never misbehaves, since some voters are likely to make mistakes and confuse their own error with device misbehavior. Additionally, a 1/2 chance of undetected fraud is insufficient, even in principle, for remote voting applications where election officials are not available to resolve disputes between voter and device. Furthermore, a fourth characteristic of the scheme may present a significant usability problem. The receipt data that must be compared against the public election tally is a set of pixels. In order to handle typical sized ballots, these pixels will be quite small. The assumption that voters will be able to visually compare this data is problematic.

BRIEF DESCRIPTION OF THE DRAWINGS

[0006] Figure 1 is a block diagram of a suitable computing system in which aspects of the invention may be employed.

[0007] Figure 2 is a flow diagram of an example of a data communication protocol performed by a voting computer or device and associated elements.

[0008] Figure 3 is a data flow diagram showing data flow after display of a ballot.

[0009] Figure 4 is a block diagram illustrating a one way communication system for communicating data from a voting device or computer to a printer or other output device for use in voting.

[0010] Figure 5 is a block diagram of an intermediate device between the voting device and printer for selectively disconnecting the printer from the voting device, and which includes a user input portion, such as a keypad.

[0011] Figures 6A and 6B together are a flow diagram illustrating a series of information/instruction display screens and receipt generation under an alternative voting protocol that may employ the devices of Figures 4 or 5.

[0012] The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the claimed invention.

DETAILED DESCRIPTION

[0013] Presented below is a verification scheme that overcomes the drawbacks of the schemes mentioned above, and which provides additional benefits. In particular, this scheme may be employed in an electronic voting context, and is a practical, coercion free, secret vote receipt scheme that does not produce some piece of physical evidence which must be destroyed immediately after each voter casts a ballot. It also provides a way for the voter to detect error or ballot fraud by the voting device with very high probability.

[0014] In particular, a universally verifiable, cryptographic vote casting protocol is described that enables each voter to determine with high certainty via a receipt that her choices (intended votes) have been accurately represented in the input to a public tally. However, since the receipt, in isolation, can represent a choice for any candidate with equal probability, it does not enable vote buying or coercion. The information that the voter uses to convince herself of encrypted ballot integrity includes temporal information that is only available at the time the ballot is cast. As with conventional voting systems, the act of casting takes place in a private environment - i.e. the "poll booth." Under this assumption then, the scheme, in conjunction with a universally verifiable tabulation protocol, provides an end-to-end verifiable, secret vote receipt based election protocol that is coercion free.

[0015] Intrinsically, the protocol is unconditionally secure, although for the sake of usability, the commitment of data is likely to be implemented via a secure one-way hash. The security of such an implementation would then depend on the one-way property of the hash function employed. The scheme requires no more computation or data processing from the voter than that which is performed by a bank customer at a typical ATM. Thus, it is very practical.
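As an illustration of commitment via a secure one-way hash only (the choice of SHA-256 and a random nonce here are assumptions made for the sketch, not details from the patent), a committer can publish a digest first and reveal the underlying data later:

```python
import hashlib
import secrets

def commit(data: bytes) -> tuple[bytes, bytes]:
    """Return (commitment, nonce). The commitment reveals nothing useful
    about `data` as long as the hash behaves as a one-way function."""
    nonce = secrets.token_bytes(16)          # illustrative blinding value
    digest = hashlib.sha256(nonce + data).digest()
    return digest, nonce

def verify(commitment: bytes, nonce: bytes, data: bytes) -> bool:
    """Re-hash the revealed data and nonce and compare to the commitment."""
    return hashlib.sha256(nonce + data).digest() == commitment

# usage: commit first, reveal later, and anyone can check the opening
c, r = commit(b"pledge sequence p_1 ... p_n")
assert verify(c, r, b"pledge sequence p_1 ... p_n")
```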

[0016] Various embodiments of the invention will now be described. The following description provides specific details for a thorough understanding and enabling description of these embodiments. One skilled in the art will understand, however, that the invention may be practiced without many of these details. Additionally, some well-known structures or functions may not be shown or described in detail, so as to avoid unnecessarily obscuring the relevant description of the various embodiments.

[0017] The terminology used in the description presented below is intended to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with a detailed description of certain specific embodiments of the invention. Certain terms may even be emphasized below; however, any terminology intended to be interpreted in any restricted manner will be overtly and specifically defined as such in this Detailed Description section.

Suitable Computing System

[0018] Figure 1 and the following discussion provide a brief, general description of a suitable computing environment in which aspects of the invention can be implemented. Although not required, embodiments of the invention may be implemented as computer-executable instructions, such as routines executed by a general-purpose computer, such as a personal computer or web server. Those skilled in the relevant art will appreciate that aspects of the invention (such as small elections) can be practiced with other computer system configurations, including Internet appliances, hand-held devices, wearable computers, personal digital assistants ("PDAs"), multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, mini computers, cell or mobile phones, set-top boxes, mainframe computers, and the like. Aspects of the invention can be embodied in a special purpose computer or data processor that is specifically programmed, configured or constructed to perform one or more of the computer-executable instructions explained herein. Indeed, the term "computer," as generally used herein, refers to any of the above devices, as well as any data processor.

[0019] Aspects of the invention can also be practiced in distributed computing environments where tasks or modules are performed by remote processing devices, which are linked through a communications network, such as a Local Area Network (LAN), Wide Area Network (WAN), or the Internet. In a distributed computing environment, program modules or sub-routines may be located in both local and remote memory storage devices. Aspects of the invention described herein may be stored or distributed on computer-readable media, including magnetic and optically readable and removable computer disks, stored as firmware in chips, as well as distributed electronically over the Internet or other networks (including wireless networks). Those skilled in the relevant art will recognize that portions of the protocols described herein may reside on a server computer, while corresponding portions reside on client computers. Data structures and transmission of data particular to such protocols are also encompassed within the scope of the invention.

[0020] Unless described otherwise, the construction and operation of the various blocks shown in Figure 1 are of conventional design. As a result, such blocks need not be described in further detail herein, as they will be readily understood by those skilled in the relevant art.

[0021] Referring to Figure 1 , a suitable environment of system 100 includes one or more voter or client computers 102, some of which may include a browser program module 104 that permits the computer to access and exchange data with the Internet, including web sites within the World Wide Web portion 106 of the Internet. Possibly more importantly, embodiments of the invention described below may employ a voting computer or voting device 103 that is stand-alone and not connected to any network. (As explained below, the voter may check a final ballot tally via the voter computer 102 over the internet 106 after voting.) The voter computers 102 and voting device computer 103 may include one or more central processing units or other logic processing circuitry, memory, input devices (e.g., keyboards, microphones, touch screens, and pointing devices), output devices (e.g., display devices, audio speakers, and printers), and storage devices (e.g., fixed, floppy, and optical disk drives, memory/smart card readers/writers), all well known but not shown in Figure 1. As shown in Figure 1 , there are N number of voter computers 102, representing voters 1 , 2, 3 . . . N, and while only one voting device 103 is shown, more than one voting device may be employed at any given poll site.

[0022] A server computer system 108 or "vote collection center," coupled to the Internet or World Wide Web ("Web") 106, performs much or all of the final ballot collection, storing and other processes. (Needless to say, various other electronic devices "in the field" are also likely to be used for transitory ballot collection and storing processes, much as paper ballots are collected in batches at individual poll sites before being transported to a central counting location.) A database 110, coupled to the server computer 108, stores much of the data (including ballots) from the voter computers 102, and the one or more voting device computers 103.

[0023] One or more poll site logistic computers or voting poll computers 112 are personal computers, server computers, mini-computers, or the like, that may be positioned at a public voting location to permit members of the public to electronically vote under the system described herein. Thus, the voter computers 102 may be positioned at individual voter's homes, where one or more poll site logistics and voting device computers 112 and 103 are located publicly or otherwise accessible to voters in a public election. The poll site logistics computers 112 include a printer or other suitable output device (such as a recording drive for recording data on a removable storage medium), and may include a local area network (LAN) having one server computer and several client computers or voter terminals coupled thereto via the LAN.

[0024] Overall, embodiments of the invention may be employed by a stand-alone voting device 103 located within the privacy of a poll booth at a poll site, as well as by remote voter computers 102 located within individual voter's homes. (When performed by the remote voting computers 102, embodiments of the invention still provide private receipts and full verification, and may deter coercion, but may suffer from a third party looking over the shoulder of the voter (thus lacking the privacy of a poll booth), and may lack certain hardware features, such as a printer screen, to indicate to the voter that the voting computer has printed a commitment without the voter seeing it, as described below.) Without network connectivity, the voting devices 103 can locally store electronically cast ballots, which may then be provided to the poll site computer 112 via a wire-to-wireless connection, or physical transport of data storage media to the poll site computer, such as when the polls close. In general (unless described below), the voting devices 103 and voter computers 102 need only provide a one-way transmission of electronic ballots during voting (although the voting computer 102 may need to receive information regarding cast ballots to confirm that a ballot had been properly cast and included in the tally by accessing a public bulletin board, described below). Note also that the term "voter" is generally used herein to refer to any individual or organization that employs some or all of the protocols described herein.

[0025] Under an alternative embodiment, the system 100 may be used in the context of a private election, such as the election of corporate officers or board members. Under this embodiment, the voter computers 102 may be laptops or desktop computers of shareholders, and the voting device 103 and poll site computer 112 can be one or more computers positioned within the company (e.g., in the lobby) performing the election. Thus, shareholders may visit the company to access the voting device 103 to cast their votes.

[0026] One or more authority or organization computers 114 are also coupled to the server computer system 108 via the Internet 106. If a threshold cryptosystem is employed, then the authority computers 114 each hold a key share necessary to decrypt the electronic ballots stored in the database 110. Threshold cryptographic systems require that a subset t of the total number of authorities n (i.e., t<n) agree to decrypt the ballots, to thereby avoid the requirement that all authorities are needed for ballot decryption. In other words, the objective of a threshold cryptosystem is to share a private key, s, among n members of a group such that messages can be decrypted when a substantial subset, T, cooperate - a (t, n) threshold cryptosystem. Protocols are defined to (1) generate keys jointly among the group, and (2) decrypt messages without reconstructing the private key. The authority computers 114 may provide their decryption shares (typically one per Authority per voted ballot) to the server computer system 108 after the voting period ends so that the server computer system may decrypt the ballots and tally the results.
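The patent does not specify which threshold cryptosystem is used; purely to illustrate the (t, n) idea described above, the following is a minimal Shamir-style secret sharing sketch over a prime field. A deployed election system would instead share the election private key inside a threshold encryption scheme (such as threshold ElGamal) so that decryption never reconstructs the key in one place.

```python
import random

P = 2**127 - 1  # a prime field modulus, chosen here only for illustration

def make_shares(secret: int, t: int, n: int):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):  # evaluate the degree t-1 polynomial at x
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of 5 shares suffice
```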

[0027] One or more optional verifier computers 130 may also be provided, similar to the authority computers 114. The verifier computers may receive election transcripts to verify that the election has not been compromised. For example, the verifier computers may receive electronic validity proofs from each of the authority computers. The verifier computers may perform verifications after the election, and need not be connected to the Internet. Indeed, the verifications may be performed by other computers shown or described herein. The authority and verifier computers 114 and 130 will likely deal with very large amounts of numerical or character data over a network (such as the Internet). As a result, while web browsers are shown, such computers may alternatively employ a client/server-type application.

[0028] The server, voting poll, verifier or authority computers may perform voter registration protocols, or separate registration computers may be provided (not shown). Such registration computers may include biometric readers for reading biometric data of registrants, such as fingerprint data, voice fingerprint data, digital picture comparison, and other techniques known by those skilled in the relevant art. Details on a suitable cryptographic protocol may be found in U.S. Patent Application No. 10/484,931 , entitled VERIFIABLE SECRET SHUFFLES AND THEIR APPLICATION TO ELECTRONIC VOTING (attorney docket number 32462- 8002US7), while details on a suitable registration process may be found in U.S. Patent Application No. 09/534,836, entitled METHOD, ARTICLE AND APPARATUS FOR REGISTERING REGISTRANTS, SUCH AS VOTER REGISTRANTS (attorney docket number 32462-8004US), both by the same inventor and assignee.

[0029] The server computer 108 includes a server engine 120, an optional web page management component 122, a database management component 124, as well as other components not shown. The server engine 120 performs, in addition to standard functionality, portions of an electronic voting protocol. The encryption protocol may be stored on the server computer, and portions of such protocol also stored on the client computers, together with appropriate constants. Indeed, the above protocol may be stored and distributed on computer readable media, including magnetic and optically readable and removable computer disks, microcode stored on semiconductor chips (e.g., EEPROM), as well as distributed electronically over the Internet or other networks. Those skilled in the relevant art will recognize that portions of the protocol reside on the server computer, while corresponding portions reside on the client computer. Data structures and transmission of data particular to the above protocol are also encompassed within the present invention.

[0030] The server computer 120 collects electronic ballots or tallies and posts them for external review and access by, for example, the voter computers 102, as in a "public bulletin board" described below. The server computer may furthermore manage communication between various election participants, as well as archived data. In an alternative embodiment, the server engine 120 may perform ballot transmission to authorized voters or poll sites, ballot collection, verifying ballots (e.g., checking digital signatures and passing verification of included proofs of validity in ballots), vote aggregation, ballot decryption and/or vote tabulation. The electronic ballots are then stored and provided to a third party organization conducting the election, such as a municipality, together with tools to shuffle ballots, decrypt the tally and produce election results. Likewise, election audit information, such as shuffle validity proofs and the like, may be stored locally or provided to a municipality or other organization.

[0031] The optional web page component 122 handles creation and display or routing of web pages such as an electronic ballot box web page. For example, voters and users may access the server computer 108 by means of a URL associated therewith, such as http://www.votehere.net, or a URL associated with the election, such as a URL for a municipality. The municipality may host or operate the server computer system 108 directly, or automatically forward such received electronic ballots to a third party vote authorizer who may operate the server computer system. The URL, or any link or address noted herein, can be any resource locator.

[0032] The election scheme and system may use a "bulletin board" where each posting is digitally signed and nothing can be erased. The bulletin board is implemented as a web server. The "ballot box" resides on the bulletin board and holds all of the encrypted ballots. The risk of erasure can be effectively mitigated by writing the web server data to a write-once, read-many (WORM) permanent storage medium or similar device.
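The patent describes the bulletin board as a web server of digitally signed postings from which nothing can be erased, backed by WORM storage. As an illustration of the append-only property alone (the hash-chain construction below and the omission of signatures are assumptions of this sketch, not the patent's design), chaining each posting to everything before it makes later alteration detectable:

```python
import hashlib

class BulletinBoard:
    """Append-only log; each entry binds the hash of everything before it."""
    def __init__(self):
        self.entries = []            # list of (posting, chained_hash)
        self._head = b"\x00" * 32    # genesis value

    def post(self, posting: bytes) -> bytes:
        self._head = hashlib.sha256(self._head + posting).digest()
        self.entries.append((posting, self._head))
        return self._head            # publishable checkpoint

    def audit(self) -> bool:
        """Anyone can recompute the chain and detect tampering."""
        head = b"\x00" * 32
        for posting, recorded in self.entries:
            head = hashlib.sha256(head + posting).digest()
            if head != recorded:
                return False
        return True

bb = BulletinBoard()
bb.post(b"encrypted ballot #1")
bb.post(b"encrypted ballot #2")
assert bb.audit()
```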

[0033] Note that while one embodiment of the invention is described herein as employing the Internet to connect computers, other alternative embodiments are possible. For example, aspects of the invention may be employed by stand-alone computers (like the voting device 103). Aspects of the invention may also be employed by any interconnected data processing machines. Rather than employing a browser, such machines may employ client software for implementing aspects of the methods or protocols described herein. Further, data can always be communicated by physical transport of storage media from one location to another.

Suitable Electronic Voting Model

[0034] Aspects of the invention are described with respect to a simple model, which reflects current poll site voting practice. (Later, a relaxation of this model will be described which reflects a remote, i.e. "vote from home," scenario.) Three primary participants in the model (scheme) are a Voter, V, a Voting Device, D, and a Printer, P. These "participants" communicate information between each other, and all such communication consists of strings of characters from an arbitrary, but pre-established, alphabet. (For example, numbers, letters, and/or icons or other glyphs.)

[0035] A fourth entity is C, the Coercer. C does not explicitly participate in the model, and in practice may not be present at all. However, a property of a protocol under the model that is described in detail below is its coercion resistance; even if V cooperates with C, V can not prove the value of its vote choice to C by way of all the available election data, as long as all steps of the protocol are conducted in an environment that can not be observed - either directly or indirectly - by C. In practice, the voting booth provides this environment.

[0036] Finally, as noted above, there are other entities which are standard elements of all universally verifiable election protocols: a set of Election Authorities, A_1, ..., A_n (where n ≥ 1 is a public election parameter), and a "Bulletin Board," BB, which represents some accepted mechanism for publishing data to all election participants.

[0037] The characteristics of each of the primary participants are as follows:

Voter The Voter, V, is capable of both reading from, and writing to, D. Typically, reading is done via a CRT, or touch-screen display, and writing is done via a keyboard, and/or mouse, and/or touch-screen display. The form of the data to be read and written is very simple - short character strings (typically 2-4 characters long depending on the alphabet chosen). V is also capable of reading from P, but may or may not be capable of writing to P.

Voting Device The Voting Device, D , is capable of writing to both V and P . The act of "writing to V " simply means showing information on the CRT, or touch-screen display for V to view. Any computing device may be employed, including those noted herein (such as the voter computer 102, as noted below). The act of writing an information string, or message, m , to P means that P outputs, or displays, m , on its media. (This media is usually, but not necessarily, a sheet, or tape, of paper). Thus, the model may ignore hardware level communication subtleties that may take place in a physical embodiment of the model.

D is also capable of reading from (i.e. taking input from) V , typically via a keyboard, mouse, or touch-screen display. It is not assumed that D can read from (i.e. receive information from) P , however, whether or not it can is not important to the verification properties of the protocol. That is, V need not be certain that P can not communicate back to D in order to derive desired confidence from successful execution of the protocol.

Further, a bit stream, R, is read only by D. One may assume that C can not distinguish R from a random bit stream. Any full voting system embodiment of the protocol will need to assure this "Random Number Generator" property by a combination of procedures, policy and other audit mechanisms, since in practice it could be possible for C to gain knowledge about, or control, R by some means external to the protocol. However, note that:

1. This property is relevant to any universally verifiable, secret ballot election protocol, regardless of the voter verification protocol employed: If, for example, C can control R, C can simply read V's vote choices from the public election tally by decrypting. So, in such circumstances, whether or not a receipt is present is inconsequential.

2. Whether or not an embodiment is successful at assuring this property does not impact election results integrity. That is, neither V's assurance of encrypted ballot correctness, nor the public assurance of encrypted tally correctness, is impacted by the degree of randomness provided by R.

Printer The Printer, P, is effectively embodied by a generic receipt tape ("cash register") printer, although any other printer, similar device, or other output device may be employed, including those noted herein (not shown in Figure 1). It is capable of "showing" (e.g. printing) information that it receives from D (which can then be read by V).

For vote verification (i.e. V can detect if D has cast a ballot inconsistent with V's intent), assumptions about P are:

1. V can inspect its output (i.e. the printer tape isn't somehow permanently hidden from V).

2. P is not capable of erasing, changing, or otherwise overwriting information that has already been "shown" (i.e. printed) without immediate detection by V .

Both of these properties are nearly unavoidable for most, if not all, standard printing devices and technologies in any reasonable configuration, and thus any such printing device with these properties may be referred to as a "generic printer".

In order to prevent coercion, P is capable of committing a short string, s, of characters to V without revealing any information about the actual value of s. One simple way to achieve this with a standard receipt tape printer is to attach a "shield" that partially obscures the paper at the printer's tape exit. In order to commit s, P can print a line that contains some alignment marks, and s positioned so that the alignment marks are visible to V, but s is not. After some further steps, P can easily reveal s to V by scrolling its tape past the "shield." Many variations on this configuration are possible. In fact, some standard receipt printers have their print head positioned so far from their tape exit that they may support such a commit action without modification.

This last assumption does impose a physical security constraint on the system, though one that is easy to realize in practice. For example, in the configuration suggested above, the voter should be prevented from removing the shield, or pulling the paper tape out prematurely.

Notation

[0038] For standard cryptographic election terms and parameters, the following discussion employs the notation of "A secure and optimally efficient multi-authority election scheme" by R. Cramer, R. Gennaro, B. Schoenmakers, from Advances in Cryptology - EUROCRYPT '97, Lecture Notes in Computer Science, Springer-Verlag, 1997. In order to simplify the presentation, an election below consists of a single question, or issue, Q, which offers n candidates (or choice options) whose identifiers are C_1, ..., C_n, from which each voter can select one, and, for the sake of tabulation, the candidate identifiers receive a publicly known fixed order. This order will be easily derived from the election's official "blank ballot" (sometimes known as "ballot style"), though it does not mean that the candidates must be displayed in this order during the ballot casting process. Abstentions can be handled by adding an explicit "ABSTAIN" candidate. The generalization to multi-question (or multi-issue) ballots, and to questions that allow for voters to choose multiple candidates, will be obvious to those skilled in the relevant art.

[0039] The following summarizes the additional notation used throughout.

M   The message space. This is the set of bit strings, m, that can be encrypted by an encryption function, E. For example, using the standard ElGamal encryption scheme, M = ⟨g⟩, a cyclic subgroup of Z*_p generated by some fixed g ∈ Z*_p.

Ω   The encryption seed space. The space (set) of strings, or elements, that are used to generate encryptions of a message m ∈ M. For example, using the standard ElGamal encryption scheme, Ω = {ω : 0 ≤ ω < q} for some large prime factor q of p−1, and the encryption of a message m by seed ω is the ElGamal pair (g^ω, h^ω·m).

E   The encryption function. In the standard ElGamal scheme, E(m, ω) = (g^ω, h^ω·m).

Y   A unique, fixed, "yes" message, Y ∈ M, which is a publicly known parameter of the scheme (as are g and h, the election encryption parameters). In the standard ElGamal scheme, one choice for Y would be Y = G, for some arbitrary, fixed, and publicly known G ∈ ⟨g⟩.

N   A unique, fixed, "no" message, N ∈ M, which is a publicly known parameter of the scheme. In the standard ElGamal scheme, one choice for N would be N = 1.

T   The encrypted message inversion function. T has the following properties:

T1. ∀ω ∈ Ω, T(E(Y, ω)) = E(N, θ) for some θ ∈ Ω.

T2. ∀ω ∈ Ω, T(E(N, ω)) = E(Y, θ) for some θ ∈ Ω.

T3. ∀ω ∈ Ω, if m ∉ {Y, N} then T(E(m, ω)) = E(m̃, θ) for some θ ∈ Ω and m̃ ∉ {Y, N}.

In the standard ElGamal scheme, T((X, Y)) = (X^−1, G·Y^−1).

Γ   The encoding alphabet. A publicly known, ordered set of characters (i.e. symbols, marks, or glyphs). For example, the standard HEX alphabet is [0, 1, 2, . . . , 9, A, B, . . . , F].

e   The encoding function. A publicly known, 1-1 function from bit strings ({0,1}*) to strings of characters from Γ (Γ*). (Both e and e^−1 need to be "easily computable.") In practice, e is simply taken to be the counting function using lexicographical order. For example, if Γ is the HEX alphabet, e(11010) = 1A.

L   The size of a challenge space (see below). L can be thought of as a "verification security parameter." It is a "moderate sized" positive integer (e.g. 1 < L ≈ 2^10 to 2^15), which is a public election parameter along with quantities such as the encryption moduli, p and g.

Λ   The challenge space. Λ is a publicly known subset of Γ* of size L. Typically, Λ will consist of the first L strings in Γ* taken in lexicographical order. Note that if the size of Γ is between 2^4 and 2^6 (standard HEX, etc.), then the number of characters that are needed to represent an element of Λ will be between 2 and 4 - about the length of a bank ATM PIN.

P : x ∈_R S   Participant P randomly selects an element x from set S. P will typically be V or D. (When P is D, this means that D will choose x by taking bits from R as needed.) Truly random selection is a stronger requirement than often needed in practice. The selection need only be "sufficiently unpredictable."

P1 ←x– P2   Participant P1 "reads" (i.e. gets, or sees) x from participant P2. (In other words, participant P2 "writes" (i.e. shows) x to participant P1.) Typically, P2 is either V or D, and P1 is V, D, or P. Typically, x will be a string over Γ (i.e. x ∈ Γ*). Typical mechanisms for read/write communication are described above.

((x_1;...;x_k))   A string sequence encoding. If the x_i are strings of characters from some alphabet, A, ((x_1;...;x_k)) is a string representing a sequence of individual strings. One way to define such an encoding is to let ((x_1;...;x_k)) = x_1 '+' ... '+' x_k, where '+' is some special separator character or symbol not in A. (It could be a tab, or white space character.) Conceptually, it is useful to think that whenever P shows (i.e. prints) ((x_1;...;x_k)) it does so on a line all by itself - although there are certainly other ways to achieve the goal of setting it off from other characters.
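The notation above can be made concrete with a small numerical sketch. The following Python fragment is not part of the patent; the tiny parameters p = 467, q = 233, the private key x, and the choice G = g^5 are illustrative assumptions. It instantiates E, Y, N, and the inversion function T for textbook ElGamal and checks properties T1 and T2.

```python
# illustrative toy parameters (NOT from the patent, NOT secure sizes)
p, q = 467, 233          # p = 2q + 1, both prime
g = 4                    # generator of the order-q subgroup of Z_p*
x = 57                   # election private key (held jointly in practice)
h = pow(g, x, p)         # public election key
G = pow(g, 5, p)         # an arbitrary fixed element of <g>
Y, N = G, 1              # the public "yes" and "no" messages

def E(m, w):
    """ElGamal encryption of m with seed w: E(m, w) = (g^w, h^w * m)."""
    return (pow(g, w, p), (pow(h, w, p) * m) % p)

def T(ct):
    """Encrypted message inversion: T((X, Y)) = (X^-1, G * Y^-1)."""
    X, Yc = ct
    return (pow(X, -1, p), (G * pow(Yc, -1, p)) % p)

def decrypt(ct):
    a, b = ct
    return (b * pow(pow(a, x, p), -1, p)) % p

w = 91                                   # an encryption seed in [0, q)
assert decrypt(E(Y, w)) == Y
assert decrypt(T(E(Y, w))) == N          # T turns an encrypted Y into an encrypted N
assert decrypt(T(E(N, w))) == Y          # ... and an encrypted N into an encrypted Y
```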

Suitable Protocol

[0040] A suitable protocol may be presented in two stages. First, a communication level view of the protocol is presented that indicates a sequence of data exchanged between participants. With this in mind, it will then be easier to present a mathematical level structure for the protocol.

Communication View

[0041] Figure 2 provides an example of a communication view of data elements exchanged, namely a process 200. Beginning in block 202, the device D displays a ballot whereby:

1. For 1 ≤ i ≤ n, D : p_i ∈_R Λ.

2. V ←((Q; C_1; ...; C_n))– D.

[0042] Under block 204, the voter V selects a candidate, wherein: 1. V selects intended choice, C1 , where l ≤ i ≤ n .

2. D< <<Cl>> — V . (That is, V communicates to D its candidate choice.)

(Note that processes under blocks 202 and 204 are identical to those followed by typical direct recording equipment (DRE).)

[0043] Under block 206, the device D makes a ballot commitment, wherein:

1. For each 1 ≤ j ≤ n, D computes x_j ∈_R Λ.

2. D computes a verifiable ballot commitment, H, "consistent with" {x_j} (as explained below).

3. P ←H– D.

(Effectively, H is V's encrypted voted ballot itself, or, for the sake of convenience, a one-way hash of it. In practice, H would be surrounded by easily identifiable "BEGIN" and "END" strings, much like the strings used to surround PGP encrypted messages under the publicly available encryption applications by PGP Corp. of Palo Alto, CA.)

[0044] Under block 208, the voter V provides "unchoice" challenges to D , wherein:

1. For each 1 ≤ j ≠ i ≤ n, V selects c_j ∈ Λ ∪ {Y} as desired. The c_j are not true challenges, but are available to V in order to prevent coercion (as explained below).

2. D ←((c_1; ...; c_{i−1}; c_{i+1}; ...; c_n))– V. (That is, V communicates the sequence of unchoice challenges to D.)

3. [Optional] For each c_j = Y, D : c_j ∈_R Λ. That is, D "randomly fills in" the c_j that V does not care about. (Alternatively, V may be required to explicitly choose all c_j.)

[0045] Under block 210, the device D and printer P make a pledge commitment, wherein:

1. For each 1 ≤ j ≤ n, D computes p_j according to

p_j = e^{−1}(e(c_j) ⊕ e(x_j)) for j ≠ i

p_i = x_i

where ⊕ is bitwise XOR.

2. P commits the sequence {p_j}, j = 1, ..., n (or a 1-way hash of it) as discussed above. This means that V can tell that D has committed to this particular sequence of p_j on a receipt tape or other printable substrate, but V has no knowledge of the specific values of the p_j. (Again, this property is not necessary for vote verification, only for coercion prevention.)
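To illustrate the pledge computation in block 210, the following sketch (not from the patent) assumes the HEX alphabet of the Notation section and ℓ = 8; ℓ = 8 is chosen only to keep the strings short, smaller than the 2^10 to 2^15 range suggested for L. Because e is the counting function, XORing the bit strings behind two challenge-space strings amounts to XORing their integer indices and re-encoding the result.

```python
# A sketch of the block 210 pledge computation, assuming the HEX alphabet
# and a challenge space of size L = 2**8 (illustrative values only).
ALPHABET = "0123456789ABCDEF"
ELL = 8                      # bits per challenge string
L = 2 ** ELL                 # size of the challenge space
WIDTH = 2                    # ceil(ELL / 4) hex characters per string

def encode(index: int) -> str:
    """e: the counting function, mapping an integer (bit string) to a HEX string."""
    chars = []
    for _ in range(WIDTH):
        chars.append(ALPHABET[index % 16])
        index //= 16
    return "".join(reversed(chars))

def decode(s: str) -> int:
    """e^-1: recover the integer (bit string) behind a HEX string."""
    value = 0
    for ch in s:
        value = value * 16 + ALPHABET.index(ch)
    return value

def pledge(c_j: str, x_j: str) -> str:
    """p_j from c_j and x_j: XOR the underlying bit strings, then re-encode."""
    return encode(decode(c_j) ^ decode(x_j))

# usage: an unchoice challenge "3C" and device randomness "A5"
print(pledge("3C", "A5"))    # -> "99", since 0x3C ^ 0xA5 == 0x99
```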

[0046] Under block 212, the voter V and device D perform a voter challenge, wherein:

1. V : c ∈_R Λ.

2. D ←c– V. (That is, V communicates c to D.)

[0047] Under block 214, the device D, records an encrypted ballot and prints question receipt data, wherein:

1. D sets c_i = c.

2. For 1 ≤ j ≤ n, P ←((C_j; c_j))– D.

3. BB ←Bv– D. That is, D posts V's encrypted voted ballot, Bv (format to be described below), to BB. (In practice, this is likely to be achieved by having D write Bv to a local storage medium so that it can be transported and/or copied to BB soon after the poll site closes.)

[0048] Under block 216, the voter V may perform inspection of the voter receipt, wherein:

V accepts the protocol execution if and only if:

1. V observed H fully printed before continuing to the Voter Challenge step.

2. The printer (receipt) display of c is accurate. That is, P prints c on the "candidate C_i line."

3. (For prevention of coercion only) V is "satisfied" with the printed lines corresponding to C_j for j ≠ i - that is, for those j that V cared to choose a string, c_j, that same string is printed on the C_j line. The values of the data on these lines do not impact the level of certainty that V has as to the correctness of its (cast) encrypted ballot. These lines are present purely to thwart coercion.

[0049] The steps described above are executed in the privacy of the poll booth. In order for V to check that his or her intended choice has been properly included in the public tally, V may also compare the contents of the receipt against the contents of BB . This compare operation is simple - in fact, it can be carried out by inspection. This is because the receipt data can be derived from Bv by a well defined, public computation. V need only check that its receipt data matches the corresponding BB data character for character. Further, the total number of characters to be compared is relatively small, and can be carried out by anyone - V 's chosen proxy, for example - without having to know the value of V 's candidate choice.

Structural Details

[0050] Merely exchanging data as described in the previous subsection does not provide any level of certainty to V that its encrypted ballot has been formed as V intended. Certainty is derived from the connection between the exchanged data and the underlying tabulatable data. This connection can be publicly verified as explained below.

Structural Overview

[0051] At a high level, the connection between receipt data exchanged in the poll booth and tabulatable data that will be input to a verifiable electronic election, such as a shuffle or mix-net electronic election protocol, can be described as follows:

1. The "opened verifiable choice" (the voted ballot - defined below), Bv , that D posts to BB is "verifiably linked" to the ballot commitment, H . That is, it can be publicly verified that Bv is the encrypted choice corresponding to H.

2. A public transformation, M, is applied to Bv, producing three outputs: an ordered sequence {p̃_j}, an ordered sequence {c_j}, and V's encrypted choice, Iv, which is the input to tabulation.

3. If Iv is "well formed" according to a publicly known specification, and if it represents a vote for a candidate, C_j, different from the candidate chosen by V (recall that V chose candidate C_i, so this means j ≠ i), then there is only one value of c = c_i for which the output value p̃_i is equal to the value p_i committed by D in the poll booth. Since V has an authenticated copy of p_i and c_i, if V selected c randomly in the poll booth (which means from the uniform distribution on Λ, but in practice the distribution can be somewhat skewed), the probability that Iv is both "well formed" and undetectably represents a choice for "the wrong candidate" (a candidate different from the one chosen by V) is 1/L.

4. If Iv is not "well formed," then it will, with certainty, produce an invalid decryption at the output of the verifiable mix-net tabulation. The mixers or shufflers, if required, can cooperate to find the specific input, or inputs, that are not well formed. Alternatively, the protocol can be trivially augmented to require that D post a ballot validity proof to BB at the same time that Bv is posted (see, e.g., R. Cramer, R. Gennaro, B. Schoenmakers, "A secure and optimally efficient multi-authority election scheme," Advances in Cryptology - EUROCRYPT '97, Lecture Notes in Computer Science, Springer-Verlag, 1997). This would allow each voter, V, to verify that its Iv is well formed independently and before election tabulation.

[0052] Putting these properties together, as long as D can not predict V's choice of c with probability significantly greater than 1/L, the chance that V's intent has been undetectably miscounted in the public tally is 1/L.
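For example, if ℓ = 10, the challenge space has L = 2^10 = 1024 elements, so a misbehaving device escapes detection with probability 1/1024, i.e. less than 0.1%; at ℓ = 15, that probability falls to 1/32768. (These values of ℓ are drawn from the 2^10 to 2^15 range suggested for L in the Notation section, not from figures stated elsewhere in the specification.)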

Protocol Data Structures

[0053] A version of the protocol is now presented that is simple conceptually; optimizations are discussed below. In this protocol version, L = 2^ℓ for some positive integer ℓ.

[0054] Definition 1. A ballot mark pair (BMP) is an ordered pair of encryptions, φ = (ε_1, ε_2) = (E(m_1, ω_1), E(m_2, ω_2)). A valid BMP is a BMP with the property that both m_1 ∈ {Y, N} and m_2 ∈ {Y, N}.

[0055] Definition 2. Let τ be the "type" function which takes a BMP as argument, takes values in the set {−1, 0, 1}, and is defined by

τ(φ) = 1 if φ is valid and m_1 = m_2
τ(φ) = 0 if φ is valid and m_1 ≠ m_2
τ(φ) = −1 if φ is not valid.

[0056] Definition 3. A candidate mark (CM) is an ordered sequence of ℓ BMPs.

[0057] The notion of "type" may be extended to CMs as follows:

[0058] Definition 4. Let τ be the function which takes a CM, φ = (φ_1, ..., φ_ℓ), as argument, takes values in the set {−1, 0, 1}, and is defined by

τ(φ) = 1 if τ(φ_i) = 1 for all 1 ≤ i ≤ ℓ
τ(φ) = 0 if τ(φ_i) = 0 for all 1 ≤ i ≤ ℓ
τ(φ) = −1 otherwise.

[0059] Definition 5. A CM, φ, is valid if τ(φ) ∈ {0, 1}. It is invalid if τ(φ) = −1 (i.e. otherwise).

[0060] The voter's encrypted choice will be represented by three different, but closely related, data structures. These are a verifiable choice (VC), opened verifiable choice (OVC), and tabulation input (TI).

[0061] Definition 6. A verifiable choice (VC) is an ordered sequence of n CMs, ψ = (φ_1, ..., φ_n). A ballot commitment (BC) is a string H = hash(ψ), where hash is a one-way hash function (possibly the identity function). (As noted above, the ballot in the above example contains only one question - generalization to multi-question ballots being straightforward. In this simplified case, a single VC takes care of an entire ballot. In the case of multiple question ballots, the BC, H, would be computed by applying hash to the full ordered sequence of ψs - one for each ballot question.)

[0062] VCs may be divided into disjoint valid and invalid categories.

[0063] Definition 7. A VC, ψ = (φ_1, ..., φ_n), is valid if and only if

1. All φ_i are valid.

2. There is exactly one index, i, that satisfies τ(φ_i) = 1. (So τ(φ_j) = 0 for all j ≠ i.) In this case, the convenient notation i = τ^{−1}(1) may be adopted.

[0064] Definition 8. With ψ as in Definition 7, ψ may be considered a vote for candidate i.
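For concreteness, the data structures of Definitions 1 through 8 can be modeled in a few lines of Python. The following is only an illustrative sketch: it stands in for each encryption with a plain (message, seed) tuple, and the names BMP, CM, VC, bmp_type, cm_type, and vc_vote are conveniences of the sketch rather than part of the protocol.

```python
# Minimal sketch; encryption is modeled as a plain (message, seed) tuple.
from typing import List, Tuple

Encryption = Tuple[str, int]            # stands in for E(m, omega)
BMP = Tuple[Encryption, Encryption]     # Definition 1
CM = List[BMP]                          # Definition 3: ell BMPs
VC = List[CM]                           # Definition 6: n CMs

def bmp_type(bmp: BMP) -> int:
    """Definition 2: 1 if valid and m1 == m2, 0 if valid and m1 != m2, else -1."""
    (m1, _), (m2, _) = bmp
    if m1 not in ("Y", "N") or m2 not in ("Y", "N"):
        return -1
    return 1 if m1 == m2 else 0

def cm_type(cm: CM) -> int:
    """Definition 4: 1 if every BMP is type 1, 0 if every BMP is type 0, else -1."""
    types = [bmp_type(b) for b in cm]
    if all(t == 1 for t in types):
        return 1
    if all(t == 0 for t in types):
        return 0
    return -1

def vc_vote(vc: VC) -> int:
    """Definitions 7-8: the unique candidate index whose CM has type 1."""
    types = [cm_type(cm) for cm in vc]
    if any(t == -1 for t in types) or types.count(1) != 1:
        raise ValueError("invalid VC")
    return types.index(1)

# A valid VC for candidate 1 (second position), with n = 2 and ell = 2:
vc = [
    [(("N", 1), ("Y", 2)), (("Y", 3), ("N", 4))],   # type-0 CM
    [(("Y", 5), ("Y", 6)), (("N", 7), ("N", 8))],   # type-1 CM
]
print(vc_vote(vc))   # -> 1
```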

[0065] A structure that represents a "partially revealed" VC will now be defined.

[0066] Definition 9. An opened ballot mark pair (OBMP) is a 4-tuple β = (ε, m, ω, e) where ε is a message encryption (i.e., ∃ m̃ ∈ M, ω̃ ∈ Ω with ε = E(m̃, ω̃)), m ∈ M, ω ∈ Ω, and e ∈ {0, 1}.

[0067] Definition 10. Continuing with the notation in the previous definition, there is a publicly computable mapping, U , from OBMPs to BMPs given by:

[0068] Definition 11. An OBMP, β = (ε, m, ω, e), is well formed if m ∈ {Y, N}. Otherwise it is misformed.

[0069] Definition 12. An OBMP, β = (ε, m, ω, e), is valid if

[0070] 1. β is well formed.

[0071] 2. ∃ ω̃ ∈ Ω and m̃ ∈ {Y, N} such that ε = E(m̃, ω̃).

[0072] Equivalently, β is valid if and only if U(β) is valid. However, the definition first given highlights the fact that "well formedness" of β can be determined by inspection, while determining whether β is valid requires knowing the decryption of ε.

[0073] Definition 13. Let β = (ε,m,ω,e) be an OBMP. The bit function C may be defined by

C(β) = e

[0074] Definition 14. Let β = (ε, m, ω, e) be a well formed OBMP. Let P be the bit function given by P(β) = 1 if m = Y, and P(β) = 0 if m = N.

[0075] The next two definitions parallel the definition structure for VC.

[0076] Definition 15. An opened candidate mark (OCM), δ, is an ordered sequence of ℓ OBMPs. It is well formed if all of its OBMPs are well formed.

[0077] Definition 16. An opened verifiable choice (OVC), δ, is an ordered sequence of n OCMs. It is well formed if all of its OCMs are well formed. The OVC is essentially the encrypted voted ballot that is cast for tabulation. (Again, as with the definition for VC, this is true since the entire ballot consists of a single question. In the case of a multi-question ballot, the encrypted voted ballot will actually be an ordered sequence of OVCs. Nevertheless, the OCM may occasionally be referred to as the "encrypted ballot" below because the equivalence of the two structures makes sense in this context.)

[0078] The function U of Definition 10 naturally extends to a function mapping OCMs to CMs and also to a function mapping OVCs to VCs simply by applying it element-wise, since OCMs and CMs are arrays of the same size, and also OVCs and VCs are arrays of the same size. To ease notation, all three of these functions may be denoted by U and distinguished from each other by context.

[0079] As with OBMPs then, OCMs and OVCs may be separated into valid and invalid via:

[0080] Definition 17. An OCM, δ , is valid if and only if the corresponding CM, U(δ) , is valid. (In particular, δ must be well formed.)

[0081] Definition 18. An OVC, δ, is valid if and only if the corresponding VC, U(δ), is valid. (In particular, δ must be well formed.)

[0082] The functions P and C also extend to functions on OCMs and OVCs. The extension requires a bit more to describe, which is provided in the next two definitions.

[0083] Definition 19. Let δ = (β_1, ..., β_ℓ) be an OCM. The function C, with C(δ) ∈ λ, may be defined by

C(δ) = e(C(β_1) | C(β_2) | ... | C(β_ℓ)),

where e is the encoding function and | represents bit concatenation.

[0084] Similarly, for well formed OCMs:

[0085] Definition 20. Let δ = (β_1, ..., β_ℓ) be a well formed OCM. Let P(δ) ∈ λ be defined by P(δ) = e(P(β_1) | P(β_2) | ... | P(β_ℓ)).

[0086] Definition 21. Let δ = (δ_1, ..., δ_n) be an OVC. Let C(δ) ∈ λ^n (that is, C(δ) is an ordered sequence of n strings from λ) be defined by

C(δ) = (C(δ_1), ..., C(δ_n)).

Similarly, if δ is well formed, define P(δ) ∈ λ^n by

P(δ) = (P(δ_1), ..., P(δ_n)).

[0087] Looking at these functions in the inverse direction, one may think of "opening" a CM according to c ∈ λ, or "opening" a VC according to (c_1, ..., c_n) ∈ λ^n, as is highlighted in the following definition.

[0088] Definition 22. For a CM, φ, and c ∈ λ, define δ = O_c(φ) to be the OCM, δ, given by

1. U(δ) = φ

2. C(δ) = c

[0089] For c ∈ λ^n, the extension of O_c to a mapping (opening) of VCs to OVCs should be obvious.

[0090] The importance of P and O_c is seen in the following lemma.

[0091] Lemma 1. Let φ be a CM with τ(φ) = 0, and fix p ∈ λ. Then there is exactly one c ∈ λ with the property that P(O_c(φ)) = p.
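Continuing the same illustrative plaintext model, the next sketch exercises the opening operation and the pledge function, and checks the statement of Lemma 1 on a small type 0 CM. It assumes that the challenge bit e names the coordinate of the BMP that is revealed, and that the pledge bit is 1 exactly when the revealed message is Y; both conventions are assumptions of the sketch rather than requirements imposed by the text.

```python
from itertools import product

def open_bmp(bmp, e):
    """Open one BMP: reveal the coordinate named by the challenge bit e and
    keep the other coordinate as the unopened encryption eps, giving an
    OBMP (eps, m, w, e) in the layout of Definition 9 (assumed convention)."""
    m, w = bmp[e]
    return (bmp[1 - e], m, w, e)

def open_cm(cm, challenge_bits):
    """O_c applied BMP-by-BMP to a CM."""
    return [open_bmp(bmp, e) for bmp, e in zip(cm, challenge_bits)]

def pledge(opened_cm):
    """P: one bit per OBMP, 1 exactly when the revealed message is Y."""
    return tuple(1 if m == "Y" else 0 for (_, m, _, _) in opened_cm)

# A type-0 CM (every BMP holds one Y and one N), with ell = 2:
cm0 = [(("N", 11), ("Y", 12)), (("Y", 21), ("N", 22))]

# Lemma 1 in miniature: each of the 2**ell pledge strings is produced by
# exactly one challenge.
seen = {}
for c in product((0, 1), repeat=len(cm0)):
    p = pledge(open_cm(cm0, c))
    assert p not in seen, "two challenges gave the same pledge"
    seen[p] = c
print(seen)   # four distinct pledges, one challenge each
```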

[0092] The next three definitions build a definition structure for tabulation input that parallels the definition structure for VC and OVC. The first is rather superfluous, but it is introduced regardless in order to be most consistent with the other notation.

[0093] Definition 23. A tabulation input element (TIE), γ, is simply an encrypted message, γ = E(m, ω), for some (unknown) m ∈ M and ω ∈ Ω.

[0094] Definition 24. A TIE, γ = E(m, ω), is valid if m ∈ {Y, N}.

[0095] Definition 25. A tabulation candidate mark (TCM), μ = (γ_1, ..., γ_ℓ), is an ordered sequence of ℓ TIEs.

[0096] Definition 26. A TCM, μ = (γ_1, ..., γ_ℓ), is valid if

1. γ_j is valid for all 1 ≤ j ≤ ℓ.

2. All γ_j are encryptions of the same message, m. So either all γ_j are independent (different ω) encryptions of Y, in which case μ is "type 1" and write τ(μ) = 1, or they are all independent encryptions of N, in which case μ is "type 0" and write τ(μ) = 0. (To be fully consistent with Definition 4, one may also write τ(μ) = -1 if μ is invalid.)

[0097] Definition 27. An encrypted choice (EC), I = (μ_1, ..., μ_n), is an ordered sequence of n TCMs.

[0098] Parallel to Definitions 7 and 8, two definitions follow:

[0099] Definition 28. An EC, I = (μ_1, ..., μ_n), is valid if

1. μ_j is valid for all 1 ≤ j ≤ n.

2. There is exactly one index, i, that satisfies τ(μ_i) = 1. (So τ(μ_j) = 0 for all j ≠ i.) In this case write i = τ_I^{-1}(1).

[00100] Definition 29. With I as in Definition 28, I is a vote for candidate i.

[00101] Therefore, a connecting transformation is:

[00102] Definition 30. There is a publicly computable mapping, W, from well formed OBMPs, β = (ε, m, ω, e), to TIEs given by:

W(β) = ε if m = Y, and W(β) = T(ε) if m = N,

where T is the encrypted message inversion function.

[00103] As with the function U, the function W extends naturally to a function from OCMs to TCMs, and to a function from OVCs to ECs, by applying it coordinate-wise. As was the case with U, all three of these functions may be denoted by W and distinguished from each other by context.

[00104] The importance of the transformation W is seen in the main lemma of this section.

[00105] Lemma 2. Let ψ be a VC and c any element of λ^n.

1. If ψ is valid, then υ = W(O_c(ψ)) is valid and τ_υ^{-1}(1) = τ_ψ^{-1}(1) (i.e., both are votes for the same candidate).

2. If ψ is invalid, then either O_c(ψ) is misformed, or O_c(ψ) is also invalid.

Proof: Follows directly from the properties of T.
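The map W and Lemma 2 can likewise be illustrated in the plaintext model. Here T is stood in for by a function that simply flips the underlying message, mirroring the effect of the encrypted message inversion on a real ciphertext, and the OBMP layout (eps, m, w, e) follows the assumed convention of the previous sketch.

```python
def T(cipher):
    """Stand-in for the encrypted message inversion: flip Y/N, keep the seed."""
    m, w = cipher
    return ("N" if m == "Y" else "Y", w)

def W(obmp):
    """Definition 30: eps if the revealed message is Y, T(eps) if it is N."""
    eps, m, _, _ = obmp
    return eps if m == "Y" else T(eps)

def open_bmp(bmp, e):
    """Reveal coordinate e of a BMP, keep the other coordinate as eps."""
    m, w = bmp[e]
    return (bmp[1 - e], m, w, e)

# A type-1 CM (the chosen candidate) and a type-0 CM, each with ell = 2 BMPs:
cm_choice   = [(("Y", 1), ("Y", 2)), (("N", 3), ("N", 4))]
cm_unchoice = [(("N", 5), ("Y", 6)), (("Y", 7), ("N", 8))]

# Whatever challenge bits are used, W sends the chosen CM to all-Y TIEs and
# every other CM to all-N TIEs, as Lemma 2 asserts.
for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    tcm1 = [W(open_bmp(b, e)) for b, e in zip(cm_choice, bits)]
    tcm0 = [W(open_bmp(b, e)) for b, e in zip(cm_unchoice, bits)]
    assert all(m == "Y" for m, _ in tcm1)
    assert all(m == "N" for m, _ in tcm0)
print("Lemma 2 check passed")
```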

A Protocol From the Data Structures

[00106] A data flow representing a protocol based on the above protocol data structures is shown pictorially in Figure 3. Specifically, the protocol includes the following, which clarify blocks of Figure 2: After receiving the choice C_i from V in step 204, D generates random encryption seeds {ω} and computes a valid VB, VB_V, which is a vote for candidate C_i (V's indicated choice). D can choose the order of Y and N encryptions in the type 0 CMs, φ_j, so that the order is (N, Y) if the corresponding bit of x_j is 0, and is (Y, N) if the corresponding bit of x_j is 1, for all j ≠ i. For the type 1 CM, φ_i, in position i, D can choose (N, N) or (Y, Y) encryption pairs precisely so that for all c ∈ λ, P(O_c(φ_i)) = x_i. In block 206, the value H written to P is exactly the BC (see Definition 6) corresponding to VB_V, BC_V, computed in block 204. After receiving the c from V in block 212, D "opens" VB_V according to c = (c_1, ..., c_n), creating V's OCM (encrypted ballot), OCM_V, which is posted to BB:

OCM_V = O_c(VB_V)

Since both c and p can be computed by anyone from the posted OCM_V, V need only:

1. Compare the value of H printed on its receipt to the value of H computed from the copy of OCM_V posted to BB (the public "ballot box").

2. Compare the printed c and p on its receipt with the corresponding values computed from the copy of OCM_V posted to BB

to complete its verification of encrypted ballot integrity. The verification properties follow from Lemmas 1 and 2.
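The two receipt checks above can be sketched as follows, again under the assumptions of the earlier plaintext sketches (the OBMP layout, the pledge convention, and an assumed placement convention for U). The use of SHA-256 over a JSON encoding for the ballot commitment is also an illustrative choice; a real implementation would hash a canonical encoding of actual ciphertexts.

```python
import hashlib
import json

def U(obmp):
    """Definition 10, in the assumed convention of the earlier sketches: put
    the revealed (m, w) back into slot e, the retained ciphertext eps into
    the other slot."""
    eps, m, w, e = obmp
    pair = [None, None]
    pair[e], pair[1 - e] = (m, w), eps
    return pair

def ballot_commitment(vc) -> str:
    """H = hash(psi); a SHA-256 over a canonical JSON encoding (illustrative)."""
    return hashlib.sha256(json.dumps(vc, sort_keys=True).encode()).hexdigest()

def verify_receipt(receipt, posted_ovc) -> bool:
    """Recompute H, c and p from the publicly posted opened structure and
    compare them with the printed receipt."""
    vc = [[U(b) for b in ocm] for ocm in posted_ovc]                        # for check 1
    c = [[e for (_, _, _, e) in ocm] for ocm in posted_ovc]                 # C(OVC)
    p = [[1 if m == "Y" else 0 for (_, m, _, _) in ocm] for ocm in posted_ovc]  # P(OVC)
    return (receipt["H"] == ballot_commitment(vc)
            and receipt["c"] == c
            and receipt["p"] == p)
```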

[00107] Remark 1. The scheme resists coercion (under the random number generation (RNG) assumption), since all c_j are chosen freely (and hence symmetrically) by V. Only V knows which one was chosen after the commitment of

[00108] Remark 2. It is not necessary to restrict V's choice of c_j (and c) to λ. As a "usability feature," longer "personal pass phrases," Ph_V, could be accepted. The protocol would simply compute the modulus of Ph_V relative to L, or a hash of it into λ. This may encourage greater randomness from the voters in choosing their challenges.

Some Variations and Alternative Embodiments

[00109] Many variations exist for aspects of the above embodiments and protocols. By using one or more of these variations in place of the specific steps described previously, many alternative embodiments can be built.

1. Printer Pledge Commitments on Separate Tape

[00110] The voter/device interaction presented in Figure 2 is similar to the interaction that takes place when voting with typical direct recording equipment (DRE) at poll sites. Besides the usual candidate display and selection, there are three very simple additional steps, the first of which the voter may ignore if desired. They are:

1. (Optional) Voters are given the option to select n-1 arbitrary short strings before ballot commitment.

2. Voters should check to see that the ballot commitment has been completely printed (terminated by a known end string, for example).

3. Voters need to pick one short string, c , and check that it and the corresponding p are printed appropriately. (The time for this to take place is very short which means it will be very easy for voters to perform this check.)

Nevertheless, a variation on the protocol, described below and using a more specialized printer, requires even less from voters.

[00111] Consider a two-headed printer (or two printers), the first of which is capable of erasing or destroying its output, the second being generic. The pledge, x_i = p_i, for candidate C_i can be printed (behind the shield) with print head 1, as a commitment. The voter then needs only to pick c (not the c_j for j ≠ i), with the understanding that D is required to open all CMs with the same value, c. The printer must then print, with print head 2, a single line containing c and then the lines with p_j in place of c_j. V then verifies that the printed value of c matches her choice, and that the value of p_i is the same as the value committed by print head 1. The output of print head 1, combined with the output of print head 2 (the voter's receipt), gives evidence of the voter's choice. But if the output of print head 1 is destroyed, this evidence is eliminated.

[00112] A desirable aspect of this embodiment is that all verification steps that are not part of a generic direct recording equipment (DRE) voting experience can occur after the ballot commitment step - the point that would typically be referred to as ballot "casting." As a matter of convenience, this implies an especially nice consequence: the selection of Voter Unchoice Challenges in block 208 is irrelevant, and the step can be removed.

2. Shared Computation of the Verifiable Choices

[00113] The Voting Device model in the embodiment initially presented assumed that the bit stream used by the Voting Device as its source of randomness could not be distinguished from a random bit stream by the coercer, C. In practice, this amounts to assuming that the coercer has not been able to somehow compromise the Voting Device software or hardware.

[00114] Such compromise is likely to be difficult, and as noted previously, would not limit the power of the protocol to provide each voter with evidence of any attempt by the Voting Device, D , to alter the voter's ballot choices. However, it could potentially undermine the secrecy of some or all voter choices, or it may allow for some form of vote coercion.

[00115] Thus it is desirable to consider an alternative embodiment that does not require the Voting Device, or any other possibly untrustworthy entity, to implement this random bit stream. The most desirable solution should then symmetrically share its implementation among the Election Authorities, A_i. An embodiment with this characteristic can be obtained as follows:

Overview

[00116] Before voting begins, the jurisdiction decides on a maximum number of "blank ballots," N_B, that will be required for the election. (This step is also needed in paper ballot elections. The value of N_B needs to be at least as large as the total number of voters expected to turn out, and in practice needs to be a bit larger to insure against some ballot loss and occasional voter ballot marking errors.) The Election Authorities, A_i, then cooperate to produce a sequence of N_B verifiable choices (VCs), which are the VCs that are used by the Voting Device as in the embodiment first presented. All random choices required to form each VC (i.e., those drawn from the random bit stream) are thus symmetrically shared among the Election Authorities instead of being determined by the (possibly untrustworthy) Voting Device, D.

[00117] The cooperative action of the Election Authorities will now be described in greater detail. For the purpose of this description, refer to a sequence of N_B VCs as a Blank Ballot Stack.

Election Authority Computation Steps

[00118] Much as in shuffle or mix-net tabulation, the Election Authorities, A_i, construct, in sequence, a Blank Ballot Stack. Each Election Authority uses as input to its construction the Blank Ballot Stack that was output by the previous Authority. The first Authority in the sequence uses as input a fixed, publicly known Blank Ballot Stack - that is, the starting point of the computation is always the same.

[00119] The authorities are numbered A_1, ..., A_n according to their order in the computation sequence. Let b_{i-1} be the Blank Ballot Stack output by A_{i-1}, which is then also the Blank Ballot Stack taken as input by A_i. The computation performed by A_i proceeds as follows:

BBS.1. For each Verifiable Choice, ψ(i-1, j), in b_{i-1}, execute the steps:

BBS.1.1 Choose a random permutation, π(i, j), of the n candidate positions. (Recall that n is the number of "candidates," or possible responses to the question on the ballot.)

BBS.1.2 Let (φ(i-1, j, 1), ..., φ(i-1, j, n)) be the representation of ψ(i-1, j) as a sequence of Candidate Marks.

BBS.1.3 For each φ(i-1, j, k), 1 ≤ k ≤ n:

BBS.1.3.1 Let (φ(i-1, j, k, 1), ..., φ(i-1, j, k, ℓ)) be the representation of φ(i-1, j, k) as a sequence of Ballot Mark Pairs (BMPs).

BBS.1.3.2 For each 1 ≤ l ≤ ℓ, choose randomly a bit, b_l ∈ {0, 1}. If b_l = 0, set φ'(i-1, j, k, l) = φ(i-1, j, k, l). Otherwise (i.e., if b_l = 1), set φ'(i-1, j, k, l) = F(φ(i-1, j, k, l)) using the function F defined below.

BBS.1.3.3 Set φ'(i-1, j, k) = (φ'(i-1, j, k, 1), ..., φ'(i-1, j, k, ℓ)).

BBS.1.4 Set φ(i, j, k) = φ'(i-1, j, π(i, j)(k)).

BBS.2. Set ψ(i, j) = (φ(i, j, 1), ..., φ(i, j, n)).

BBS.3. Output b_i = {ψ(i, j)}_j.

[00120] The function F is defined by

F((X_0, Y_0), (X_1, Y_1)) = (T(X_0, Y_0), T(X_1, Y_1))

where, as previously defined, T is the encrypted message inversion function.
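One authority's pass over a Blank Ballot Stack (steps BBS.1 through BBS.3) might look as follows in the same illustrative plaintext model. The exact placement of the permutation and the packaging of the secrets (permutations and flip bits) that must later be sent to the Voting Device are assumptions made to match the reconstructed steps above.

```python
import random

def invert_bmp(bmp):
    """F: apply the message inversion T to both halves of a BMP
    (plaintext stand-in: flip Y/N, keep the seed)."""
    return tuple(("N" if m == "Y" else "Y", w) for (m, w) in bmp)

def authority_pass(stack, rng=random):
    """Transform a stack of VCs: per VC, randomly flip each BMP with F and
    permute the candidate marks.  Returns the new stack and the secrets
    that must be communicated secretly to the Voting Device."""
    new_stack, secrets = [], []
    for vc in stack:                                       # BBS.1
        n = len(vc)
        perm = list(range(n))
        rng.shuffle(perm)                                  # BBS.1.1
        bits = [[rng.randrange(2) for _ in cm] for cm in vc]   # BBS.1.3.2
        flipped = [[invert_bmp(bmp) if b else bmp
                    for bmp, b in zip(cm, row)]
                   for cm, row in zip(vc, bits)]
        new_vc = [flipped[perm[k]] for k in range(n)]      # BBS.1.4 (assumed placement)
        new_stack.append(new_vc)                           # BBS.2
        secrets.append({"perm": perm, "bits": bits})
    return new_stack, secrets                              # BBS.3 plus the secrets
```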

Additional Details

[00121] The final Blank Ballot Stack, b_n, must be communicated to the Voting Device along with all the individual permutations, π(i, j), and the randomly chosen bits, b_l. The π(i, j) and the b_l must be communicated secretly. Ideally, the Voting Device has a secure cryptographic module capable of exporting a public key. If so, the secret communication can easily be implemented using public key encryption. If the Voting Device does not have a secure cryptographic module, other conventional methods for secure communication, such as supervised transport of physical media, can be used. The combination of b_n, all π(i, j), and all b_l is sufficient information for the Voting Device to execute the verification protocol as previously described, using the predetermined Verifiable Choices in b_n instead of Verifiable Choices chosen by the Voting Device. Thus the need for any Voting Device chosen random bits is eliminated.

3. Pledge Commitment on the Voting Device

[00122] An important aspect of the protocol first described above is the pledge commitment presented by the Printer. A purpose of this event is to convince the Voter that a particular string of data that will be shown to the Voter later - the pledge, p_i - is completely independent of the Voter's arbitrarily selected challenge string. Ideally, the Voting Device would simply show p_i before the Voter communicates the challenge string, but this would allow the Voter to correlate the challenge string with p_i, thereby enabling the threat of coercion. Thus it is important that the Voter be unable to see the value of p_i until after communicating the challenge string.

[00123] Under the model above for the Voting Device, D, and the Printer, P, it is not possible to show the Voter the pledge string, p_i, by using the Voting Device's display, or monitor, because of this constraint. However, under at least one alternative embodiment it is possible. For example, by extending the model of the Printer and Voting Device very slightly in two ways, it is possible to use the Voting Device's display to show p_i in an indirect way. The two additional properties are:

Printer: The Printer, P, is capable of direct Voter input. Specifically, it should be possible for the Voter to communicate a string to P without any involvement of the Voting Device. The type of user-input required is very simple - a small numeric, or alpha-numeric keypad is sufficient. Each Voter will only need to enter one short character string (2-4 characters in length) during the vote verification process.

Voting Device: The Voting Device can be physically disconnected from the Printer, either permanently or temporarily. In this case, "physically disconnected" implies that an observer can determine that all communication from the Printer to the Voting Device is prevented. (Communication from the Voting Device to the Printer is allowed; thus communication is one-way, from the Voting Device to the Printer.) An example of such an arrangement is shown in Figure 4, showing a display device 402 coupled with the voting device 103, which has a one-way communication channel 404 with a printing device 406. The printing device in turn has a user-input portion 408. (As is clear from the detailed description herein, while the voting device 103 is shown, the voting computer 102 or other computing device may be employed.)

[00124] There are several methods by which one can observably enforce the communication restriction between the Printer and the Voting Device. A few such methods include:

• If all communication between Printer and Voting Device is done by way of physical transport of media (e.g. a smart card handled by the Voter), this restriction can be enforced simply by way of voter education.

• If the Printer and Voting Device communicate by cable or wire, allow the Voter to physically disconnect the two devices at an appropriate time. This can be made convenient by installing an "on/off' switch box in the communication cable, which itself may have a keypad (thus avoiding the need for a user-input device on the printer 406). Figure 5 shows an example of this alternative as a unit 500, which has a switch 502 and keypad 504. Moving the switch 502 from the "vote" to the "verify" position disconnects the Printer from the Voting Device. Moving the switch 502 to the "finish" position reconnects the Voting Device to the Printer to finish printing the receipt and to allow the voter to confirm his or her choices.

• Install a physical "one-way" communication channel between the Printer and the Voting Device. A simple example might be an infrared link between the devices with only a transmitter at the Voting Device and only a receiver at the Printer.

[00125] Figures 6A and 6B together illustrate steps of this embodiment, and are generally self explanatory based on the detailed description provided herein. As shown, a series of suitable display screens providing information and instructions is presented via the display device 402. A receipt 602 is progressively printed by the printer 406 during the voting process, and an obscuring shield 604 on the printer is provided (as explained above).

[00126] The Voting Device does not know the pledge string(s) at all until it is given the unlock string. The unlock string is used as a decryption key to a (list of) encrypted pledge string(s). Thus the Voting Device is prevented from showing pledges too early.

[00127] Note that in the case of a one question ballot, which represents the simplified ballot discussed above, this embodiment offers little advantage. However, if the number of questions on the ballot is large, this embodiment provides a convenient way for the Voter to issue only one challenge string, instead of one challenge string per ballot question.

4. Authority Selected Challenge Strings

[00128] As previously mentioned, a purpose of the Voter selected challenge string is to convince the Voter that the associated pledge string (as determined by the Verifiable Choice already committed by the Voting Device) does not depend on the value of the challenge string selected. An alternative approach would be to let the Election Authorities symmetrically share the responsibility for selecting the challenge strings.

[00129] It is possible to accomplish this. As in the Shared Computation of the Verifiable Choices section above, prior to the start of voting, each Election Authority would generate a list of a sufficient number of "challenge shares" and make some public commitment (e.g. publish a hash) of their challenge share list. The challenge issued by each Voter is then required to be some symmetric algebraic combination (e.g. XOR) of the Authority challenge shares, and the validity of this requirement with respect to each Voter receipt would be checked as part of election tabulation and audit.
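A minimal sketch of the share combination suggested here: each Election Authority commits to a list of challenge shares, and a voter's challenge is the bitwise XOR of one share from each Authority. The hash-based commitment and the bit-list representation are illustrative assumptions, not prescribed by the text.

```python
import hashlib
from functools import reduce

def commit_share_list(shares) -> str:
    """A simple public commitment to an authority's share list (illustrative)."""
    return hashlib.sha256(repr(shares).encode()).hexdigest()

def combine_shares(shares) -> list:
    """Challenge = bitwise XOR of one share from each authority."""
    return [reduce(lambda a, b: a ^ b, column) for column in zip(*shares)]

# Three authorities, ell = 4 challenge bits:
shares = [[1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]]
print(commit_share_list(shares[0])[:16])   # published before voting
print(combine_shares(shares))              # -> [0, 0, 0, 1]
```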

[00130] In practice, implementing this solution will require some procedural or structural additions to the above system:

1. Communicating to each Voter the Voter's challenge shares with appropriate level of secrecy.

2. Enabling each Voter to perform the proper algebraic combination of challenge shares without error.

Various ways exist to implement the above additions within the above protocols. For example, assuming appropriate safeguards, a central facility would receive the challenge shares from the Election Authorities and prepare scratch-off sheets having several challenge strings. One such sheet would be provided to each voter at the poll, and the voter may scratch off to reveal one of the several preprinted (but obscured) challenge strings that could then be used in voting. Alternatively, the voter would receive two or more scratch-off sheets, and then mentally combine challenge shares from two or more scratched-off sheets to produce a challenge string, thus avoiding the need for the central facility to combine the shares prior to printing of the sheets.

[00131] Overall, while the alternative described in the Shared Computation of the Verifiable Choices section pushes random choices in the ballot formation to the Election Authorities, the alternative described in this section also provides to the Election Authorities any random choices that a voter is required to make (i.e., challenge strings/codes), further anonymizing or insulating the voter from the voting process (which is unavailable in any "show-of-hands" election).

5. Remote Voting

[00132] The above protocol and all of its variations can also be used in the context of a remote, or unattended, voting system - that is, one where the act of voting is not restricted to a supervised poll site. Examples of such systems include Internet voting systems (e.g. the voter computers 102), and unattended kiosk voting systems. A limitation is that since the physical equipment used by the voter cannot logistically be supervised and inspected, it may be possible for Voters to circumvent the constraints that prevent them from knowing the value of their committed pledge, p_i, before selecting and communicating their challenge string.

[00133] Thus, aspects of the invention are perfectly applicable to remote voting systems for the purpose of vote verification - that is, Voters can use the protocol in one of its several embodiments to assure themselves that their ballot has been cast correctly, and demand a valid receipt for it. This receipt does not provide information in the public tally indicating which ballot choices Voters made. However, coercion may not be completely prevented.

[00134] Since all conventional remote voting systems (e.g. vote by mail) are coercible by way of "shoulder surfing," a verifiable, but potentially coercible remote system may be acceptable in practice.

[00135] There are also several ways to minimize the coercion threat in practice. Examples include:

1. Distribute secure hardware, such as a Printer or like device, as noted above. Such hardware could authenticate itself and prove via digital signatures that it was used as specified by the protocol.

2. Simulate the physical commitment of data by way of data encryption. In effect, instead of printing the pledge string behind an obscuring shield, commit it to the Voter by delivering an encryption of the pledge string - after the Voter replies with a challenge, deliver the decryption seed (see the sketch following this list).

3. Use the embodiment described for Authority Selected Challenge Strings.
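A minimal sketch of the second item in the list above: the pledge string is delivered encrypted, and the key (the "decryption seed") is released only after the challenge arrives. The hash-derived keystream cipher here is purely illustrative and is not a recommendation for a production cipher.

```python
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream of the requested length from the key (illustrative)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def commit_pledge(pledge: str):
    """Send the ciphertext now; hold the key back until the challenge arrives."""
    key = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(pledge.encode(), keystream(key, len(pledge))))
    return ct, key

def open_pledge(ct: bytes, key: bytes) -> str:
    """Voter-side decryption once the seed has been delivered."""
    return bytes(a ^ b for a, b in zip(ct, keystream(key, len(ct)))).decode()

ct, key = commit_pledge("1011")
# ... Voter sends the challenge, then the key is released ...
assert open_pledge(ct, key) == "1011"
```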

6. Alternative Ballot Encodings

[00136] The data structures described above constitute only one example of a very broad class of data representations that can support aspects of this invention. Abstractly, some requirements are

1. The data structures should be able to represent two sets of code words, a set of CHOICE code words, and a set of UNCHOICE code words. Consider, for example, the data structures above, and make the trivial identification of Y with "1 bit" and N with "0 bit." Then the CHOICE code words can be identified with the set of binary strings b_0, b_1, ..., b_{2ℓ-1} of length 2ℓ which have even parity when restricted to each of the ℓ bit pairs (b_0, b_1), ..., (b_{2ℓ-2}, b_{2ℓ-1}), and the UNCHOICE code words can be identified with the set of binary strings b_0, b_1, ..., b_{2ℓ-1} of length 2ℓ which have odd parity when restricted to each of the ℓ bit pairs. (A small sketch of this parity classification follows the list.)

2. There should be an encryption function capable of "codeword hiding" in the sense that essentially no information about the specific value of a code word can be determined from an encryption of it. Above, the encryption function used was coordinate-wise ElGamal encryption. The encrypted code words are the Candidate Marks (CMs).

3. There should be a parameterized "partial reveal" function operating on encrypted code words that produces both data suitable for input to a tabulation process, and an output parameter (i.e., pledge value). Again, under the first described process above, the parameterized "partial reveal" function used is O_c(φ), parameterized by c and operating on the CM, φ. The input parameter space is the set of strings of correct length from the encoding alphabet. The output parameter is p = P(O_c(φ)).

4. In order that the receipt not reveal the voter's candidate choice, the relationship between O_c, c, and p must be further constrained as follows:

If there exists an element, c, of the input parameter space, encrypted CHOICE code word, φ, and output parameter p, all with the property that p is the output parameter obtained by applying the parameterized partial reveal function to φ with parameter c, then there is an element, c', of the input parameter space, and encrypted UNCHOICE code word, φ', such that p is also the output parameter obtained by applying the parameterized partial reveal function to φ' with parameter c'. Further, the same should hold with the roles of CHOICE code word and UNCHOICE code word reversed.
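A small sketch of the parity classification in requirement 1, identifying Y with 1 and N with 0. The function name and the None result for mixed-parity words are choices of the sketch, not part of the encoding itself.

```python
def classify(bits):
    """Return 'CHOICE', 'UNCHOICE', or None for a 2*ell-bit word."""
    pairs = [(bits[2 * i], bits[2 * i + 1]) for i in range(len(bits) // 2)]
    parities = [(a + b) % 2 for a, b in pairs]
    if all(p == 0 for p in parities):
        return "CHOICE"      # every bit pair has even parity
    if all(p == 1 for p in parities):
        return "UNCHOICE"    # every bit pair has odd parity
    return None              # mixed parity: not a code word

print(classify([1, 1, 0, 0]))   # CHOICE
print(classify([1, 0, 0, 1]))   # UNCHOICE
print(classify([1, 1, 1, 0]))   # None
```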

Example Encoding Alternatives

[00137] Swap Even/Odd Parity: A trivial example of a data structure alternative would be to use "odd parity" pairs for CHOICE code words and "even parity" pairs for UNCHOICE code words. (This is simply the reverse of the convention presented above.) To be consistent, one must also make the corresponding obvious change to the tabulation convention.

[00138] Orthogonal Group Encoding: A more interesting example of an alternate encoding is based on the structure of finite special orthogonal groups. Consider a subgroup, G, of SO_2(q) of order 4m, where m > 0 is an integer. The elements of this group are 2 x 2 matrices over F_q.

[00139] Let x be a generator of G, G_C = {x^{4i}}, G_U = {x^{4i+2}}, and G_λ = {x^{2i+1}}.

[00140] Use the top row vectors of elements of G_C as CHOICE code words, and the top row vectors of elements of G_U as UNCHOICE code words.

[00141] The encryption of a code word (x, y) is a pair of ElGamal pairs, encrypting the two coordinates with random encryption exponents ω_1 and ω_2 as before.

[00142] The challenge parameter space is identified with the top row vectors of G_λ, modulo scalar multiplication by -1 ∈ F_q, and the set of logarithms base g of the elements of the pledge parameter space is identified with the set of standard inner product values (over F_q), {u·v}, where u is a CHOICE code word and v is an element of the challenge parameter space. (Note that the span of these values is identical with {w·v}, where w is an UNCHOICE code word and v is an element of the challenge parameter space.)

[00143] The partial reveal operation with challenge parameter c = (c_1, c_2) applied to an encrypted code word is the unencrypted value of the corresponding ElGamal pair, which is simply g^{c_1 x + c_2 y} - an element of the pledge parameter space.

[00144] Note that the ElGamal encryption of the pledge value, above, can be computed publicly by modular multiplication. The decrypted value can be revealed by standard decryption proof of validity techniques.

[00145] An advantage of this method is that only one pair of ElGamal pairs is needed to implement very large pledge and challenge spaces. In the encoding first described above, ℓ ElGamal pairs are required to implement a pledge/challenge space of size 2^ℓ. Thus this encoding offers significant practical advantages with respect to data size and computational complexity.

Conclusion

[00146] Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise," "comprising," and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to." As used herein, the terms "connected," "coupled," or any variant thereof, means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words "herein," "above," "below," and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word "or," in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.

[00147] The above detailed description of embodiments of the invention is not intended to be exhaustive or to limit the invention to the precise form disclosed above. While specific embodiments of, and examples for, the invention are described above for illustrative purposes, various equivalent modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternatives or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times.

[00148] The teachings of the invention provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various embodiments described above can be combined to provide further embodiments. While aspects of the invention are employed in electronic voting applications, these aspects may be applied to other applications, including any application requiring verifiability of user input with respect to electronic documents or data elements.

[00149] All of the above patents and applications and other references, including any that may be listed in accompanying filing papers, are incorporated herein by reference, including U.S. Patent Application Nos. 09/535,927 filed March 24, 2000; 09/816,869 filed March 24, 2001; 09/534,836 filed March 24, 2000; 10/484,931 filed March 25, 2002; 09/989,989 filed November 21, 2001; 10/038,752 filed December 31, 2001; 10/081,863 filed February 20, 2002; 10/367,631 filed February 14, 2003; 10/718,035 filed November 20, 2003; and 60/542,807 filed Feb. 6, 2004; and international application PCT/US03/04798 filed February 14, 2003. Aspects of the invention can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the invention.

[00150] These and other changes can be made to the invention in light of the above Detailed Description. While the above description describes certain embodiments of the invention, and describes the best mode contemplated, no matter how detailed the above appears in text, the invention can be practiced in many ways. Details of the systems and protocols described above may vary considerably in their implementation details, while still being encompassed by the invention disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the invention should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the invention with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the invention encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the invention under the claims.

[00151] While certain aspects of the invention are presented below in certain claim forms, the inventors contemplate the various aspects of the invention in any number of claim forms. For example, while only one aspect of the invention is recited as embodied in a computer-readable medium, other aspects may likewise be embodied in a computer-readable medium. Accordingly, the inventors reserve the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the invention.